Jan 22 16:00:38 localhost kernel: Linux version 5.14.0-661.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026
Jan 22 16:00:38 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Jan 22 16:00:38 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 22 16:00:38 localhost kernel: BIOS-provided physical RAM map:
Jan 22 16:00:38 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 22 16:00:38 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 22 16:00:38 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 22 16:00:38 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Jan 22 16:00:38 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Jan 22 16:00:38 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 22 16:00:38 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 22 16:00:38 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Jan 22 16:00:38 localhost kernel: NX (Execute Disable) protection: active
Jan 22 16:00:38 localhost kernel: APIC: Static calls initialized
Jan 22 16:00:38 localhost kernel: SMBIOS 2.8 present.
Jan 22 16:00:38 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Jan 22 16:00:38 localhost kernel: Hypervisor detected: KVM
Jan 22 16:00:38 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 22 16:00:38 localhost kernel: kvm-clock: using sched offset of 3344005882 cycles
Jan 22 16:00:38 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 22 16:00:38 localhost kernel: tsc: Detected 2800.000 MHz processor
Jan 22 16:00:38 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 22 16:00:38 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 22 16:00:38 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Jan 22 16:00:38 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 22 16:00:38 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Jan 22 16:00:38 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Jan 22 16:00:38 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Jan 22 16:00:38 localhost kernel: Using GB pages for direct mapping
Jan 22 16:00:38 localhost kernel: RAMDISK: [mem 0x2d426000-0x32a0afff]
Jan 22 16:00:38 localhost kernel: ACPI: Early table checksum verification disabled
Jan 22 16:00:38 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Jan 22 16:00:38 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 22 16:00:38 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 22 16:00:38 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 22 16:00:38 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Jan 22 16:00:38 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 22 16:00:38 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 22 16:00:38 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Jan 22 16:00:38 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Jan 22 16:00:38 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Jan 22 16:00:38 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Jan 22 16:00:38 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Jan 22 16:00:38 localhost kernel: No NUMA configuration found
Jan 22 16:00:38 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Jan 22 16:00:38 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Jan 22 16:00:38 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Jan 22 16:00:38 localhost kernel: Zone ranges:
Jan 22 16:00:38 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Jan 22 16:00:38 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Jan 22 16:00:38 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Jan 22 16:00:38 localhost kernel:   Device   empty
Jan 22 16:00:38 localhost kernel: Movable zone start for each node
Jan 22 16:00:38 localhost kernel: Early memory node ranges
Jan 22 16:00:38 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Jan 22 16:00:38 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Jan 22 16:00:38 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Jan 22 16:00:38 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Jan 22 16:00:38 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 22 16:00:38 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 22 16:00:38 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Jan 22 16:00:38 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Jan 22 16:00:38 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 22 16:00:38 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 22 16:00:38 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 22 16:00:38 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 22 16:00:38 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 22 16:00:38 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 22 16:00:38 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 22 16:00:38 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 22 16:00:38 localhost kernel: TSC deadline timer available
Jan 22 16:00:38 localhost kernel: CPU topo: Max. logical packages:   8
Jan 22 16:00:38 localhost kernel: CPU topo: Max. logical dies:       8
Jan 22 16:00:38 localhost kernel: CPU topo: Max. dies per package:   1
Jan 22 16:00:38 localhost kernel: CPU topo: Max. threads per core:   1
Jan 22 16:00:38 localhost kernel: CPU topo: Num. cores per package:     1
Jan 22 16:00:38 localhost kernel: CPU topo: Num. threads per package:   1
Jan 22 16:00:38 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Jan 22 16:00:38 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 22 16:00:38 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Jan 22 16:00:38 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Jan 22 16:00:38 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Jan 22 16:00:38 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Jan 22 16:00:38 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Jan 22 16:00:38 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Jan 22 16:00:38 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Jan 22 16:00:38 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Jan 22 16:00:38 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Jan 22 16:00:38 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Jan 22 16:00:38 localhost kernel: Booting paravirtualized kernel on KVM
Jan 22 16:00:38 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 22 16:00:38 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Jan 22 16:00:38 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Jan 22 16:00:38 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Jan 22 16:00:38 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Jan 22 16:00:38 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Jan 22 16:00:38 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 22 16:00:38 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64", will be passed to user space.
Jan 22 16:00:38 localhost kernel: random: crng init done
Jan 22 16:00:38 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 22 16:00:38 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 22 16:00:38 localhost kernel: Fallback order for Node 0: 0 
Jan 22 16:00:38 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Jan 22 16:00:38 localhost kernel: Policy zone: Normal
Jan 22 16:00:38 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 22 16:00:38 localhost kernel: software IO TLB: area num 8.
Jan 22 16:00:38 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Jan 22 16:00:38 localhost kernel: ftrace: allocating 49417 entries in 194 pages
Jan 22 16:00:38 localhost kernel: ftrace: allocated 194 pages with 3 groups
Jan 22 16:00:38 localhost kernel: Dynamic Preempt: voluntary
Jan 22 16:00:38 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 22 16:00:38 localhost kernel: rcu:         RCU event tracing is enabled.
Jan 22 16:00:38 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Jan 22 16:00:38 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Jan 22 16:00:38 localhost kernel:         Rude variant of Tasks RCU enabled.
Jan 22 16:00:38 localhost kernel:         Tracing variant of Tasks RCU enabled.
Jan 22 16:00:38 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 22 16:00:38 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Jan 22 16:00:38 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 22 16:00:38 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 22 16:00:38 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 22 16:00:38 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Jan 22 16:00:38 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 22 16:00:38 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Jan 22 16:00:38 localhost kernel: Console: colour VGA+ 80x25
Jan 22 16:00:38 localhost kernel: printk: console [ttyS0] enabled
Jan 22 16:00:38 localhost kernel: ACPI: Core revision 20230331
Jan 22 16:00:38 localhost kernel: APIC: Switch to symmetric I/O mode setup
Jan 22 16:00:38 localhost kernel: x2apic enabled
Jan 22 16:00:38 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Jan 22 16:00:38 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Jan 22 16:00:38 localhost kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Jan 22 16:00:38 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 22 16:00:38 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jan 22 16:00:38 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jan 22 16:00:38 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 22 16:00:38 localhost kernel: Spectre V2 : Mitigation: Retpolines
Jan 22 16:00:38 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jan 22 16:00:38 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Jan 22 16:00:38 localhost kernel: RETBleed: Mitigation: untrained return thunk
Jan 22 16:00:38 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 22 16:00:38 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 22 16:00:38 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Jan 22 16:00:38 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Jan 22 16:00:38 localhost kernel: x86/bugs: return thunk changed
Jan 22 16:00:38 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Jan 22 16:00:38 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 22 16:00:38 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 22 16:00:38 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 22 16:00:38 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Jan 22 16:00:38 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jan 22 16:00:38 localhost kernel: Freeing SMP alternatives memory: 40K
Jan 22 16:00:38 localhost kernel: pid_max: default: 32768 minimum: 301
Jan 22 16:00:38 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Jan 22 16:00:38 localhost kernel: landlock: Up and running.
Jan 22 16:00:38 localhost kernel: Yama: becoming mindful.
Jan 22 16:00:38 localhost kernel: SELinux:  Initializing.
Jan 22 16:00:38 localhost kernel: LSM support for eBPF active
Jan 22 16:00:38 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 22 16:00:38 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 22 16:00:38 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Jan 22 16:00:38 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jan 22 16:00:38 localhost kernel: ... version:                0
Jan 22 16:00:38 localhost kernel: ... bit width:              48
Jan 22 16:00:38 localhost kernel: ... generic registers:      6
Jan 22 16:00:38 localhost kernel: ... value mask:             0000ffffffffffff
Jan 22 16:00:38 localhost kernel: ... max period:             00007fffffffffff
Jan 22 16:00:38 localhost kernel: ... fixed-purpose events:   0
Jan 22 16:00:38 localhost kernel: ... event mask:             000000000000003f
Jan 22 16:00:38 localhost kernel: signal: max sigframe size: 1776
Jan 22 16:00:38 localhost kernel: rcu: Hierarchical SRCU implementation.
Jan 22 16:00:38 localhost kernel: rcu:         Max phase no-delay instances is 400.
Jan 22 16:00:38 localhost kernel: smp: Bringing up secondary CPUs ...
Jan 22 16:00:38 localhost kernel: smpboot: x86: Booting SMP configuration:
Jan 22 16:00:38 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Jan 22 16:00:38 localhost kernel: smp: Brought up 1 node, 8 CPUs
Jan 22 16:00:38 localhost kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Jan 22 16:00:38 localhost kernel: node 0 deferred pages initialised in 10ms
Jan 22 16:00:38 localhost kernel: Memory: 7763708K/8388068K available (16384K kernel code, 5797K rwdata, 13916K rodata, 4200K init, 7192K bss, 618364K reserved, 0K cma-reserved)
Jan 22 16:00:38 localhost kernel: devtmpfs: initialized
Jan 22 16:00:38 localhost kernel: x86/mm: Memory block size: 128MB
Jan 22 16:00:38 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 22 16:00:38 localhost kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Jan 22 16:00:38 localhost kernel: pinctrl core: initialized pinctrl subsystem
Jan 22 16:00:38 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 22 16:00:38 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Jan 22 16:00:38 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 22 16:00:38 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 22 16:00:38 localhost kernel: audit: initializing netlink subsys (disabled)
Jan 22 16:00:38 localhost kernel: audit: type=2000 audit(1769097636.485:1): state=initialized audit_enabled=0 res=1
Jan 22 16:00:38 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Jan 22 16:00:38 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 22 16:00:38 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 22 16:00:38 localhost kernel: cpuidle: using governor menu
Jan 22 16:00:38 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 22 16:00:38 localhost kernel: PCI: Using configuration type 1 for base access
Jan 22 16:00:38 localhost kernel: PCI: Using configuration type 1 for extended access
Jan 22 16:00:38 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 22 16:00:38 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 22 16:00:38 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 22 16:00:38 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 22 16:00:38 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 22 16:00:38 localhost kernel: Demotion targets for Node 0: null
Jan 22 16:00:38 localhost kernel: cryptd: max_cpu_qlen set to 1000
Jan 22 16:00:38 localhost kernel: ACPI: Added _OSI(Module Device)
Jan 22 16:00:38 localhost kernel: ACPI: Added _OSI(Processor Device)
Jan 22 16:00:38 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 22 16:00:38 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 22 16:00:38 localhost kernel: ACPI: Interpreter enabled
Jan 22 16:00:38 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Jan 22 16:00:38 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Jan 22 16:00:38 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 22 16:00:38 localhost kernel: PCI: Using E820 reservations for host bridge windows
Jan 22 16:00:38 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Jan 22 16:00:38 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 22 16:00:38 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Jan 22 16:00:38 localhost kernel: acpiphp: Slot [3] registered
Jan 22 16:00:38 localhost kernel: acpiphp: Slot [4] registered
Jan 22 16:00:38 localhost kernel: acpiphp: Slot [5] registered
Jan 22 16:00:38 localhost kernel: acpiphp: Slot [6] registered
Jan 22 16:00:38 localhost kernel: acpiphp: Slot [7] registered
Jan 22 16:00:38 localhost kernel: acpiphp: Slot [8] registered
Jan 22 16:00:38 localhost kernel: acpiphp: Slot [9] registered
Jan 22 16:00:38 localhost kernel: acpiphp: Slot [10] registered
Jan 22 16:00:38 localhost kernel: acpiphp: Slot [11] registered
Jan 22 16:00:38 localhost kernel: acpiphp: Slot [12] registered
Jan 22 16:00:38 localhost kernel: acpiphp: Slot [13] registered
Jan 22 16:00:38 localhost kernel: acpiphp: Slot [14] registered
Jan 22 16:00:38 localhost kernel: acpiphp: Slot [15] registered
Jan 22 16:00:38 localhost kernel: acpiphp: Slot [16] registered
Jan 22 16:00:38 localhost kernel: acpiphp: Slot [17] registered
Jan 22 16:00:38 localhost kernel: acpiphp: Slot [18] registered
Jan 22 16:00:38 localhost kernel: acpiphp: Slot [19] registered
Jan 22 16:00:38 localhost kernel: acpiphp: Slot [20] registered
Jan 22 16:00:38 localhost kernel: acpiphp: Slot [21] registered
Jan 22 16:00:38 localhost kernel: acpiphp: Slot [22] registered
Jan 22 16:00:38 localhost kernel: acpiphp: Slot [23] registered
Jan 22 16:00:38 localhost kernel: acpiphp: Slot [24] registered
Jan 22 16:00:38 localhost kernel: acpiphp: Slot [25] registered
Jan 22 16:00:38 localhost kernel: acpiphp: Slot [26] registered
Jan 22 16:00:38 localhost kernel: acpiphp: Slot [27] registered
Jan 22 16:00:38 localhost kernel: acpiphp: Slot [28] registered
Jan 22 16:00:38 localhost kernel: acpiphp: Slot [29] registered
Jan 22 16:00:38 localhost kernel: acpiphp: Slot [30] registered
Jan 22 16:00:38 localhost kernel: acpiphp: Slot [31] registered
Jan 22 16:00:38 localhost kernel: PCI host bridge to bus 0000:00
Jan 22 16:00:38 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Jan 22 16:00:38 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Jan 22 16:00:38 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 22 16:00:38 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 22 16:00:38 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Jan 22 16:00:38 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 22 16:00:38 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Jan 22 16:00:38 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Jan 22 16:00:38 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Jan 22 16:00:38 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Jan 22 16:00:38 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Jan 22 16:00:38 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Jan 22 16:00:38 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Jan 22 16:00:38 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Jan 22 16:00:38 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Jan 22 16:00:38 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Jan 22 16:00:38 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Jan 22 16:00:38 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Jan 22 16:00:38 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Jan 22 16:00:38 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 22 16:00:38 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Jan 22 16:00:38 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Jan 22 16:00:38 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Jan 22 16:00:38 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Jan 22 16:00:38 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 22 16:00:38 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 22 16:00:38 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Jan 22 16:00:38 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Jan 22 16:00:38 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Jan 22 16:00:38 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Jan 22 16:00:38 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Jan 22 16:00:38 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Jan 22 16:00:38 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Jan 22 16:00:38 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Jan 22 16:00:38 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Jan 22 16:00:38 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Jan 22 16:00:38 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Jan 22 16:00:38 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Jan 22 16:00:38 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Jan 22 16:00:38 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Jan 22 16:00:38 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 22 16:00:38 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 22 16:00:38 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 22 16:00:38 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 22 16:00:38 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Jan 22 16:00:38 localhost kernel: iommu: Default domain type: Translated
Jan 22 16:00:38 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 22 16:00:38 localhost kernel: SCSI subsystem initialized
Jan 22 16:00:38 localhost kernel: ACPI: bus type USB registered
Jan 22 16:00:38 localhost kernel: usbcore: registered new interface driver usbfs
Jan 22 16:00:38 localhost kernel: usbcore: registered new interface driver hub
Jan 22 16:00:38 localhost kernel: usbcore: registered new device driver usb
Jan 22 16:00:38 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 22 16:00:38 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Jan 22 16:00:38 localhost kernel: PTP clock support registered
Jan 22 16:00:38 localhost kernel: EDAC MC: Ver: 3.0.0
Jan 22 16:00:38 localhost kernel: NetLabel: Initializing
Jan 22 16:00:38 localhost kernel: NetLabel:  domain hash size = 128
Jan 22 16:00:38 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Jan 22 16:00:38 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Jan 22 16:00:38 localhost kernel: PCI: Using ACPI for IRQ routing
Jan 22 16:00:38 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Jan 22 16:00:38 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Jan 22 16:00:38 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Jan 22 16:00:38 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Jan 22 16:00:38 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Jan 22 16:00:38 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 22 16:00:38 localhost kernel: vgaarb: loaded
Jan 22 16:00:38 localhost kernel: clocksource: Switched to clocksource kvm-clock
Jan 22 16:00:38 localhost kernel: VFS: Disk quotas dquot_6.6.0
Jan 22 16:00:38 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 22 16:00:38 localhost kernel: pnp: PnP ACPI init
Jan 22 16:00:38 localhost kernel: pnp 00:03: [dma 2]
Jan 22 16:00:38 localhost kernel: pnp: PnP ACPI: found 5 devices
Jan 22 16:00:38 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 22 16:00:38 localhost kernel: NET: Registered PF_INET protocol family
Jan 22 16:00:38 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 22 16:00:38 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Jan 22 16:00:38 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 22 16:00:38 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 22 16:00:38 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Jan 22 16:00:38 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Jan 22 16:00:38 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Jan 22 16:00:38 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 22 16:00:38 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 22 16:00:38 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 22 16:00:38 localhost kernel: NET: Registered PF_XDP protocol family
Jan 22 16:00:38 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Jan 22 16:00:38 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Jan 22 16:00:38 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 22 16:00:38 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Jan 22 16:00:38 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Jan 22 16:00:38 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Jan 22 16:00:38 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Jan 22 16:00:38 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Jan 22 16:00:38 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 93352 usecs
Jan 22 16:00:38 localhost kernel: PCI: CLS 0 bytes, default 64
Jan 22 16:00:38 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 22 16:00:38 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Jan 22 16:00:38 localhost kernel: ACPI: bus type thunderbolt registered
Jan 22 16:00:38 localhost kernel: Trying to unpack rootfs image as initramfs...
Jan 22 16:00:38 localhost kernel: Initialise system trusted keyrings
Jan 22 16:00:38 localhost kernel: Key type blacklist registered
Jan 22 16:00:38 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Jan 22 16:00:38 localhost kernel: zbud: loaded
Jan 22 16:00:38 localhost kernel: integrity: Platform Keyring initialized
Jan 22 16:00:38 localhost kernel: integrity: Machine keyring initialized
Jan 22 16:00:38 localhost kernel: Freeing initrd memory: 87956K
Jan 22 16:00:38 localhost kernel: NET: Registered PF_ALG protocol family
Jan 22 16:00:38 localhost kernel: xor: automatically using best checksumming function   avx       
Jan 22 16:00:38 localhost kernel: Key type asymmetric registered
Jan 22 16:00:38 localhost kernel: Asymmetric key parser 'x509' registered
Jan 22 16:00:38 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Jan 22 16:00:38 localhost kernel: io scheduler mq-deadline registered
Jan 22 16:00:38 localhost kernel: io scheduler kyber registered
Jan 22 16:00:38 localhost kernel: io scheduler bfq registered
Jan 22 16:00:38 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Jan 22 16:00:38 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Jan 22 16:00:38 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Jan 22 16:00:38 localhost kernel: ACPI: button: Power Button [PWRF]
Jan 22 16:00:38 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Jan 22 16:00:38 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Jan 22 16:00:38 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Jan 22 16:00:38 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 22 16:00:38 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 22 16:00:38 localhost kernel: Non-volatile memory driver v1.3
Jan 22 16:00:38 localhost kernel: rdac: device handler registered
Jan 22 16:00:38 localhost kernel: hp_sw: device handler registered
Jan 22 16:00:38 localhost kernel: emc: device handler registered
Jan 22 16:00:38 localhost kernel: alua: device handler registered
Jan 22 16:00:38 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Jan 22 16:00:38 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Jan 22 16:00:38 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Jan 22 16:00:38 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Jan 22 16:00:38 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Jan 22 16:00:38 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Jan 22 16:00:38 localhost kernel: usb usb1: Product: UHCI Host Controller
Jan 22 16:00:38 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-661.el9.x86_64 uhci_hcd
Jan 22 16:00:38 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Jan 22 16:00:38 localhost kernel: hub 1-0:1.0: USB hub found
Jan 22 16:00:38 localhost kernel: hub 1-0:1.0: 2 ports detected
Jan 22 16:00:38 localhost kernel: usbcore: registered new interface driver usbserial_generic
Jan 22 16:00:38 localhost kernel: usbserial: USB Serial support registered for generic
Jan 22 16:00:38 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 22 16:00:38 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 22 16:00:38 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 22 16:00:38 localhost kernel: mousedev: PS/2 mouse device common for all mice
Jan 22 16:00:38 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Jan 22 16:00:38 localhost kernel: rtc_cmos 00:04: registered as rtc0
Jan 22 16:00:38 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Jan 22 16:00:38 localhost kernel: rtc_cmos 00:04: setting system clock to 2026-01-22T16:00:37 UTC (1769097637)
Jan 22 16:00:38 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Jan 22 16:00:38 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Jan 22 16:00:38 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Jan 22 16:00:38 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 22 16:00:38 localhost kernel: usbcore: registered new interface driver usbhid
Jan 22 16:00:38 localhost kernel: usbhid: USB HID core driver
Jan 22 16:00:38 localhost kernel: drop_monitor: Initializing network drop monitor service
Jan 22 16:00:38 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Jan 22 16:00:38 localhost kernel: Initializing XFRM netlink socket
Jan 22 16:00:38 localhost kernel: NET: Registered PF_INET6 protocol family
Jan 22 16:00:38 localhost kernel: Segment Routing with IPv6
Jan 22 16:00:38 localhost kernel: NET: Registered PF_PACKET protocol family
Jan 22 16:00:38 localhost kernel: mpls_gso: MPLS GSO support
Jan 22 16:00:38 localhost kernel: IPI shorthand broadcast: enabled
Jan 22 16:00:38 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Jan 22 16:00:38 localhost kernel: AES CTR mode by8 optimization enabled
Jan 22 16:00:38 localhost kernel: sched_clock: Marking stable (1304001380, 144761220)->(1576631969, -127869369)
Jan 22 16:00:38 localhost kernel: registered taskstats version 1
Jan 22 16:00:38 localhost kernel: Loading compiled-in X.509 certificates
Jan 22 16:00:38 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 22 16:00:38 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Jan 22 16:00:38 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Jan 22 16:00:38 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Jan 22 16:00:38 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Jan 22 16:00:38 localhost kernel: Demotion targets for Node 0: null
Jan 22 16:00:38 localhost kernel: page_owner is disabled
Jan 22 16:00:38 localhost kernel: Key type .fscrypt registered
Jan 22 16:00:38 localhost kernel: Key type fscrypt-provisioning registered
Jan 22 16:00:38 localhost kernel: Key type big_key registered
Jan 22 16:00:38 localhost kernel: Key type encrypted registered
Jan 22 16:00:38 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 22 16:00:38 localhost kernel: Loading compiled-in module X.509 certificates
Jan 22 16:00:38 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 22 16:00:38 localhost kernel: ima: Allocated hash algorithm: sha256
Jan 22 16:00:38 localhost kernel: ima: No architecture policies found
Jan 22 16:00:38 localhost kernel: evm: Initialising EVM extended attributes:
Jan 22 16:00:38 localhost kernel: evm: security.selinux
Jan 22 16:00:38 localhost kernel: evm: security.SMACK64 (disabled)
Jan 22 16:00:38 localhost kernel: evm: security.SMACK64EXEC (disabled)
Jan 22 16:00:38 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Jan 22 16:00:38 localhost kernel: evm: security.SMACK64MMAP (disabled)
Jan 22 16:00:38 localhost kernel: evm: security.apparmor (disabled)
Jan 22 16:00:38 localhost kernel: evm: security.ima
Jan 22 16:00:38 localhost kernel: evm: security.capability
Jan 22 16:00:38 localhost kernel: evm: HMAC attrs: 0x1
Jan 22 16:00:38 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Jan 22 16:00:38 localhost kernel: Running certificate verification RSA selftest
Jan 22 16:00:38 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Jan 22 16:00:38 localhost kernel: Running certificate verification ECDSA selftest
Jan 22 16:00:38 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Jan 22 16:00:38 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Jan 22 16:00:38 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Jan 22 16:00:38 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Jan 22 16:00:38 localhost kernel: usb 1-1: Manufacturer: QEMU
Jan 22 16:00:38 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Jan 22 16:00:38 localhost kernel: clk: Disabling unused clocks
Jan 22 16:00:38 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Jan 22 16:00:38 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Jan 22 16:00:38 localhost kernel: Freeing unused decrypted memory: 2028K
Jan 22 16:00:38 localhost kernel: Freeing unused kernel image (initmem) memory: 4200K
Jan 22 16:00:38 localhost kernel: Write protecting the kernel read-only data: 30720k
Jan 22 16:00:38 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 420K
Jan 22 16:00:38 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Jan 22 16:00:38 localhost kernel: Run /init as init process
Jan 22 16:00:38 localhost kernel:   with arguments:
Jan 22 16:00:38 localhost kernel:     /init
Jan 22 16:00:38 localhost kernel:   with environment:
Jan 22 16:00:38 localhost kernel:     HOME=/
Jan 22 16:00:38 localhost kernel:     TERM=linux
Jan 22 16:00:38 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64
Jan 22 16:00:38 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 22 16:00:38 localhost systemd[1]: Detected virtualization kvm.
Jan 22 16:00:38 localhost systemd[1]: Detected architecture x86-64.
Jan 22 16:00:38 localhost systemd[1]: Running in initrd.
Jan 22 16:00:38 localhost systemd[1]: No hostname configured, using default hostname.
Jan 22 16:00:38 localhost systemd[1]: Hostname set to <localhost>.
Jan 22 16:00:38 localhost systemd[1]: Initializing machine ID from VM UUID.
Jan 22 16:00:38 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Jan 22 16:00:38 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Jan 22 16:00:38 localhost systemd[1]: Reached target Local Encrypted Volumes.
Jan 22 16:00:38 localhost systemd[1]: Reached target Initrd /usr File System.
Jan 22 16:00:38 localhost systemd[1]: Reached target Local File Systems.
Jan 22 16:00:38 localhost systemd[1]: Reached target Path Units.
Jan 22 16:00:38 localhost systemd[1]: Reached target Slice Units.
Jan 22 16:00:38 localhost systemd[1]: Reached target Swaps.
Jan 22 16:00:38 localhost systemd[1]: Reached target Timer Units.
Jan 22 16:00:38 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 22 16:00:38 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Jan 22 16:00:38 localhost systemd[1]: Listening on Journal Socket.
Jan 22 16:00:38 localhost systemd[1]: Listening on udev Control Socket.
Jan 22 16:00:38 localhost systemd[1]: Listening on udev Kernel Socket.
Jan 22 16:00:38 localhost systemd[1]: Reached target Socket Units.
Jan 22 16:00:38 localhost systemd[1]: Starting Create List of Static Device Nodes...
Jan 22 16:00:38 localhost systemd[1]: Starting Journal Service...
Jan 22 16:00:38 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 22 16:00:38 localhost systemd[1]: Starting Apply Kernel Variables...
Jan 22 16:00:38 localhost systemd[1]: Starting Create System Users...
Jan 22 16:00:38 localhost systemd[1]: Starting Setup Virtual Console...
Jan 22 16:00:38 localhost systemd[1]: Finished Create List of Static Device Nodes.
Jan 22 16:00:38 localhost systemd[1]: Finished Apply Kernel Variables.
Jan 22 16:00:38 localhost systemd[1]: Finished Create System Users.
Jan 22 16:00:38 localhost systemd-journald[307]: Journal started
Jan 22 16:00:38 localhost systemd-journald[307]: Runtime Journal (/run/log/journal/814341d3bd19425d8185e66e96ccdc81) is 8.0M, max 153.6M, 145.6M free.
Jan 22 16:00:38 localhost systemd-sysusers[312]: Creating group 'users' with GID 100.
Jan 22 16:00:38 localhost systemd-sysusers[312]: Creating group 'dbus' with GID 81.
Jan 22 16:00:38 localhost systemd-sysusers[312]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Jan 22 16:00:38 localhost systemd[1]: Started Journal Service.
Jan 22 16:00:38 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 22 16:00:38 localhost systemd[1]: Starting Create Volatile Files and Directories...
Jan 22 16:00:38 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 22 16:00:38 localhost systemd[1]: Finished Create Volatile Files and Directories.
Jan 22 16:00:38 localhost systemd[1]: Finished Setup Virtual Console.
Jan 22 16:00:38 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Jan 22 16:00:38 localhost systemd[1]: Starting dracut cmdline hook...
Jan 22 16:00:38 localhost dracut-cmdline[326]: dracut-9 dracut-057-102.git20250818.el9
Jan 22 16:00:38 localhost dracut-cmdline[326]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 22 16:00:38 localhost systemd[1]: Finished dracut cmdline hook.
Jan 22 16:00:38 localhost systemd[1]: Starting dracut pre-udev hook...
Jan 22 16:00:38 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 22 16:00:38 localhost kernel: device-mapper: uevent: version 1.0.3
Jan 22 16:00:38 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Jan 22 16:00:39 localhost kernel: RPC: Registered named UNIX socket transport module.
Jan 22 16:00:39 localhost kernel: RPC: Registered udp transport module.
Jan 22 16:00:39 localhost kernel: RPC: Registered tcp transport module.
Jan 22 16:00:39 localhost kernel: RPC: Registered tcp-with-tls transport module.
Jan 22 16:00:39 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Jan 22 16:00:39 localhost rpc.statd[443]: Version 2.5.4 starting
Jan 22 16:00:39 localhost rpc.statd[443]: Initializing NSM state
Jan 22 16:00:39 localhost rpc.idmapd[448]: Setting log level to 0
Jan 22 16:00:39 localhost systemd[1]: Finished dracut pre-udev hook.
Jan 22 16:00:39 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 22 16:00:39 localhost systemd-udevd[461]: Using default interface naming scheme 'rhel-9.0'.
Jan 22 16:00:39 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 22 16:00:39 localhost systemd[1]: Starting dracut pre-trigger hook...
Jan 22 16:00:39 localhost systemd[1]: Finished dracut pre-trigger hook.
Jan 22 16:00:39 localhost systemd[1]: Starting Coldplug All udev Devices...
Jan 22 16:00:39 localhost systemd[1]: Created slice Slice /system/modprobe.
Jan 22 16:00:39 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 22 16:00:39 localhost systemd[1]: Finished Coldplug All udev Devices.
Jan 22 16:00:39 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 22 16:00:39 localhost systemd[1]: Reached target Network.
Jan 22 16:00:39 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 22 16:00:39 localhost systemd[1]: Starting dracut initqueue hook...
Jan 22 16:00:39 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 22 16:00:39 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 22 16:00:39 localhost systemd[1]: Mounting Kernel Configuration File System...
Jan 22 16:00:39 localhost systemd[1]: Mounted Kernel Configuration File System.
Jan 22 16:00:39 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Jan 22 16:00:39 localhost systemd[1]: Reached target System Initialization.
Jan 22 16:00:39 localhost systemd[1]: Reached target Basic System.
Jan 22 16:00:39 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Jan 22 16:00:39 localhost kernel:  vda: vda1
Jan 22 16:00:39 localhost kernel: libata version 3.00 loaded.
Jan 22 16:00:39 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Jan 22 16:00:39 localhost kernel: scsi host0: ata_piix
Jan 22 16:00:39 localhost kernel: scsi host1: ata_piix
Jan 22 16:00:39 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Jan 22 16:00:39 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Jan 22 16:00:39 localhost systemd[1]: Found device /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 22 16:00:39 localhost systemd[1]: Reached target Initrd Root Device.
Jan 22 16:00:39 localhost kernel: ata1: found unknown device (class 0)
Jan 22 16:00:39 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Jan 22 16:00:39 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Jan 22 16:00:39 localhost systemd-udevd[480]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 16:00:39 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Jan 22 16:00:39 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Jan 22 16:00:39 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 22 16:00:39 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Jan 22 16:00:39 localhost systemd[1]: Finished dracut initqueue hook.
Jan 22 16:00:39 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Jan 22 16:00:39 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Jan 22 16:00:39 localhost systemd[1]: Reached target Remote File Systems.
Jan 22 16:00:39 localhost systemd[1]: Starting dracut pre-mount hook...
Jan 22 16:00:39 localhost systemd[1]: Finished dracut pre-mount hook.
Jan 22 16:00:39 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40...
Jan 22 16:00:39 localhost systemd-fsck[555]: /usr/sbin/fsck.xfs: XFS file system.
Jan 22 16:00:39 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 22 16:00:39 localhost systemd[1]: Mounting /sysroot...
Jan 22 16:00:40 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Jan 22 16:00:40 localhost kernel: XFS (vda1): Mounting V5 Filesystem 22ac9141-3960-4912-b20e-19fc8a328d40
Jan 22 16:00:40 localhost kernel: XFS (vda1): Ending clean mount
Jan 22 16:00:40 localhost systemd[1]: Mounted /sysroot.
Jan 22 16:00:40 localhost systemd[1]: Reached target Initrd Root File System.
Jan 22 16:00:40 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Jan 22 16:00:40 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 22 16:00:40 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Jan 22 16:00:40 localhost systemd[1]: Reached target Initrd File Systems.
Jan 22 16:00:40 localhost systemd[1]: Reached target Initrd Default Target.
Jan 22 16:00:40 localhost systemd[1]: Starting dracut mount hook...
Jan 22 16:00:40 localhost systemd[1]: Finished dracut mount hook.
Jan 22 16:00:40 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Jan 22 16:00:40 localhost rpc.idmapd[448]: exiting on signal 15
Jan 22 16:00:40 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Jan 22 16:00:40 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Jan 22 16:00:40 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Jan 22 16:00:40 localhost systemd[1]: Stopped target Network.
Jan 22 16:00:40 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Jan 22 16:00:40 localhost systemd[1]: Stopped target Timer Units.
Jan 22 16:00:40 localhost systemd[1]: dbus.socket: Deactivated successfully.
Jan 22 16:00:40 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Jan 22 16:00:40 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 22 16:00:40 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Jan 22 16:00:40 localhost systemd[1]: Stopped target Initrd Default Target.
Jan 22 16:00:40 localhost systemd[1]: Stopped target Basic System.
Jan 22 16:00:40 localhost systemd[1]: Stopped target Initrd Root Device.
Jan 22 16:00:40 localhost systemd[1]: Stopped target Initrd /usr File System.
Jan 22 16:00:40 localhost systemd[1]: Stopped target Path Units.
Jan 22 16:00:40 localhost systemd[1]: Stopped target Remote File Systems.
Jan 22 16:00:40 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Jan 22 16:00:40 localhost systemd[1]: Stopped target Slice Units.
Jan 22 16:00:40 localhost systemd[1]: Stopped target Socket Units.
Jan 22 16:00:40 localhost systemd[1]: Stopped target System Initialization.
Jan 22 16:00:40 localhost systemd[1]: Stopped target Local File Systems.
Jan 22 16:00:40 localhost systemd[1]: Stopped target Swaps.
Jan 22 16:00:40 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Jan 22 16:00:40 localhost systemd[1]: Stopped dracut mount hook.
Jan 22 16:00:40 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 22 16:00:40 localhost systemd[1]: Stopped dracut pre-mount hook.
Jan 22 16:00:40 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Jan 22 16:00:40 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 22 16:00:40 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Jan 22 16:00:40 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 22 16:00:40 localhost systemd[1]: Stopped dracut initqueue hook.
Jan 22 16:00:40 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 22 16:00:40 localhost systemd[1]: Stopped Apply Kernel Variables.
Jan 22 16:00:40 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 22 16:00:40 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Jan 22 16:00:40 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 22 16:00:40 localhost systemd[1]: Stopped Coldplug All udev Devices.
Jan 22 16:00:40 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 22 16:00:40 localhost systemd[1]: Stopped dracut pre-trigger hook.
Jan 22 16:00:40 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Jan 22 16:00:40 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 22 16:00:40 localhost systemd[1]: Stopped Setup Virtual Console.
Jan 22 16:00:40 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Jan 22 16:00:40 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 22 16:00:40 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 22 16:00:40 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Jan 22 16:00:40 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 22 16:00:40 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Jan 22 16:00:40 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 22 16:00:40 localhost systemd[1]: Closed udev Control Socket.
Jan 22 16:00:40 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 22 16:00:40 localhost systemd[1]: Closed udev Kernel Socket.
Jan 22 16:00:40 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 22 16:00:40 localhost systemd[1]: Stopped dracut pre-udev hook.
Jan 22 16:00:40 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 22 16:00:40 localhost systemd[1]: Stopped dracut cmdline hook.
Jan 22 16:00:40 localhost systemd[1]: Starting Cleanup udev Database...
Jan 22 16:00:40 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 22 16:00:40 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Jan 22 16:00:40 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 22 16:00:40 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Jan 22 16:00:40 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Jan 22 16:00:40 localhost systemd[1]: Stopped Create System Users.
Jan 22 16:00:40 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Jan 22 16:00:40 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Jan 22 16:00:40 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 22 16:00:40 localhost systemd[1]: Finished Cleanup udev Database.
Jan 22 16:00:40 localhost systemd[1]: Reached target Switch Root.
Jan 22 16:00:40 localhost systemd[1]: Starting Switch Root...
Jan 22 16:00:40 localhost systemd[1]: Switching root.
Jan 22 16:00:40 localhost systemd-journald[307]: Journal stopped
Jan 22 16:00:41 localhost systemd-journald[307]: Received SIGTERM from PID 1 (systemd).
Jan 22 16:00:41 localhost kernel: audit: type=1404 audit(1769097640.949:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Jan 22 16:00:41 localhost kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 16:00:41 localhost kernel: SELinux:  policy capability open_perms=1
Jan 22 16:00:41 localhost kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 16:00:41 localhost kernel: SELinux:  policy capability always_check_network=0
Jan 22 16:00:41 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 16:00:41 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 16:00:41 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 16:00:41 localhost kernel: audit: type=1403 audit(1769097641.068:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 22 16:00:41 localhost systemd[1]: Successfully loaded SELinux policy in 121.058ms.
Jan 22 16:00:41 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 26.145ms.
Jan 22 16:00:41 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 22 16:00:41 localhost systemd[1]: Detected virtualization kvm.
Jan 22 16:00:41 localhost systemd[1]: Detected architecture x86-64.
Jan 22 16:00:41 localhost systemd-rc-local-generator[639]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:00:41 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Jan 22 16:00:41 localhost systemd[1]: Stopped Switch Root.
Jan 22 16:00:41 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 22 16:00:41 localhost systemd[1]: Created slice Slice /system/getty.
Jan 22 16:00:41 localhost systemd[1]: Created slice Slice /system/serial-getty.
Jan 22 16:00:41 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Jan 22 16:00:41 localhost systemd[1]: Created slice User and Session Slice.
Jan 22 16:00:41 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Jan 22 16:00:41 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Jan 22 16:00:41 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Jan 22 16:00:41 localhost systemd[1]: Reached target Local Encrypted Volumes.
Jan 22 16:00:41 localhost systemd[1]: Stopped target Switch Root.
Jan 22 16:00:41 localhost systemd[1]: Stopped target Initrd File Systems.
Jan 22 16:00:41 localhost systemd[1]: Stopped target Initrd Root File System.
Jan 22 16:00:41 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Jan 22 16:00:41 localhost systemd[1]: Reached target Path Units.
Jan 22 16:00:41 localhost systemd[1]: Reached target rpc_pipefs.target.
Jan 22 16:00:41 localhost systemd[1]: Reached target Slice Units.
Jan 22 16:00:41 localhost systemd[1]: Reached target Swaps.
Jan 22 16:00:41 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Jan 22 16:00:41 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Jan 22 16:00:41 localhost systemd[1]: Reached target RPC Port Mapper.
Jan 22 16:00:41 localhost systemd[1]: Listening on Process Core Dump Socket.
Jan 22 16:00:41 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Jan 22 16:00:41 localhost systemd[1]: Listening on udev Control Socket.
Jan 22 16:00:41 localhost systemd[1]: Listening on udev Kernel Socket.
Jan 22 16:00:41 localhost systemd[1]: Mounting Huge Pages File System...
Jan 22 16:00:41 localhost systemd[1]: Mounting POSIX Message Queue File System...
Jan 22 16:00:41 localhost systemd[1]: Mounting Kernel Debug File System...
Jan 22 16:00:41 localhost systemd[1]: Mounting Kernel Trace File System...
Jan 22 16:00:41 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 22 16:00:41 localhost systemd[1]: Starting Create List of Static Device Nodes...
Jan 22 16:00:41 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 22 16:00:41 localhost systemd[1]: Starting Load Kernel Module drm...
Jan 22 16:00:41 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Jan 22 16:00:41 localhost systemd[1]: Starting Load Kernel Module fuse...
Jan 22 16:00:41 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Jan 22 16:00:41 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Jan 22 16:00:41 localhost systemd[1]: Stopped File System Check on Root Device.
Jan 22 16:00:41 localhost systemd[1]: Stopped Journal Service.
Jan 22 16:00:41 localhost systemd[1]: Starting Journal Service...
Jan 22 16:00:41 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 22 16:00:41 localhost systemd[1]: Starting Generate network units from Kernel command line...
Jan 22 16:00:41 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 22 16:00:41 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Jan 22 16:00:41 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 22 16:00:41 localhost systemd[1]: Starting Apply Kernel Variables...
Jan 22 16:00:41 localhost kernel: fuse: init (API version 7.37)
Jan 22 16:00:41 localhost systemd[1]: Starting Coldplug All udev Devices...
Jan 22 16:00:41 localhost systemd[1]: Mounted Huge Pages File System.
Jan 22 16:00:41 localhost systemd[1]: Mounted POSIX Message Queue File System.
Jan 22 16:00:41 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Jan 22 16:00:41 localhost systemd[1]: Mounted Kernel Debug File System.
Jan 22 16:00:41 localhost systemd[1]: Mounted Kernel Trace File System.
Jan 22 16:00:41 localhost systemd[1]: Finished Create List of Static Device Nodes.
Jan 22 16:00:41 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 22 16:00:41 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 22 16:00:41 localhost systemd-journald[680]: Journal started
Jan 22 16:00:41 localhost systemd-journald[680]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 22 16:00:41 localhost systemd[1]: Queued start job for default target Multi-User System.
Jan 22 16:00:41 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 22 16:00:41 localhost systemd[1]: Started Journal Service.
Jan 22 16:00:41 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 22 16:00:41 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Jan 22 16:00:41 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 22 16:00:41 localhost systemd[1]: Finished Load Kernel Module fuse.
Jan 22 16:00:41 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Jan 22 16:00:41 localhost systemd[1]: Finished Generate network units from Kernel command line.
Jan 22 16:00:41 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Jan 22 16:00:41 localhost systemd[1]: Finished Apply Kernel Variables.
Jan 22 16:00:41 localhost kernel: ACPI: bus type drm_connector registered
Jan 22 16:00:41 localhost systemd[1]: Mounting FUSE Control File System...
Jan 22 16:00:41 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 22 16:00:41 localhost systemd[1]: Starting Rebuild Hardware Database...
Jan 22 16:00:41 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Jan 22 16:00:41 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 22 16:00:41 localhost systemd[1]: Starting Load/Save OS Random Seed...
Jan 22 16:00:41 localhost systemd[1]: Starting Create System Users...
Jan 22 16:00:41 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 22 16:00:41 localhost systemd[1]: Finished Load Kernel Module drm.
Jan 22 16:00:41 localhost systemd[1]: Mounted FUSE Control File System.
Jan 22 16:00:41 localhost systemd-journald[680]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 22 16:00:41 localhost systemd-journald[680]: Received client request to flush runtime journal.
Jan 22 16:00:41 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Jan 22 16:00:41 localhost systemd[1]: Finished Load/Save OS Random Seed.
Jan 22 16:00:41 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 22 16:00:41 localhost systemd[1]: Finished Create System Users.
Jan 22 16:00:41 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 22 16:00:41 localhost systemd[1]: Finished Coldplug All udev Devices.
Jan 22 16:00:41 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 22 16:00:41 localhost systemd[1]: Reached target Preparation for Local File Systems.
Jan 22 16:00:41 localhost systemd[1]: Reached target Local File Systems.
Jan 22 16:00:41 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Jan 22 16:00:41 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Jan 22 16:00:41 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 22 16:00:41 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Jan 22 16:00:41 localhost systemd[1]: Starting Automatic Boot Loader Update...
Jan 22 16:00:41 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Jan 22 16:00:41 localhost systemd[1]: Starting Create Volatile Files and Directories...
Jan 22 16:00:41 localhost bootctl[697]: Couldn't find EFI system partition, skipping.
Jan 22 16:00:41 localhost systemd[1]: Finished Automatic Boot Loader Update.
Jan 22 16:00:41 localhost systemd[1]: Finished Create Volatile Files and Directories.
Jan 22 16:00:41 localhost systemd[1]: Starting Security Auditing Service...
Jan 22 16:00:41 localhost systemd[1]: Starting RPC Bind...
Jan 22 16:00:41 localhost auditd[701]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Jan 22 16:00:41 localhost auditd[701]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Jan 22 16:00:41 localhost systemd[1]: Starting Rebuild Journal Catalog...
Jan 22 16:00:41 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Jan 22 16:00:41 localhost systemd[1]: Finished Rebuild Journal Catalog.
Jan 22 16:00:41 localhost systemd[1]: Started RPC Bind.
Jan 22 16:00:41 localhost augenrules[708]: /sbin/augenrules: No change
Jan 22 16:00:41 localhost augenrules[723]: No rules
Jan 22 16:00:41 localhost augenrules[723]: enabled 1
Jan 22 16:00:41 localhost augenrules[723]: failure 1
Jan 22 16:00:41 localhost augenrules[723]: pid 701
Jan 22 16:00:41 localhost augenrules[723]: rate_limit 0
Jan 22 16:00:41 localhost augenrules[723]: backlog_limit 8192
Jan 22 16:00:41 localhost augenrules[723]: lost 0
Jan 22 16:00:41 localhost augenrules[723]: backlog 0
Jan 22 16:00:41 localhost augenrules[723]: backlog_wait_time 60000
Jan 22 16:00:41 localhost augenrules[723]: backlog_wait_time_actual 0
Jan 22 16:00:41 localhost systemd[1]: Started Security Auditing Service.
Jan 22 16:00:41 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Jan 22 16:00:41 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Jan 22 16:00:42 localhost systemd[1]: Finished Rebuild Hardware Database.
Jan 22 16:00:42 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 22 16:00:42 localhost systemd[1]: Starting Update is Completed...
Jan 22 16:00:42 localhost systemd[1]: Finished Update is Completed.
Jan 22 16:00:42 localhost systemd-udevd[731]: Using default interface naming scheme 'rhel-9.0'.
Jan 22 16:00:42 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 22 16:00:42 localhost systemd[1]: Reached target System Initialization.
Jan 22 16:00:42 localhost systemd[1]: Started dnf makecache --timer.
Jan 22 16:00:42 localhost systemd[1]: Started Daily rotation of log files.
Jan 22 16:00:42 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Jan 22 16:00:42 localhost systemd[1]: Reached target Timer Units.
Jan 22 16:00:42 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 22 16:00:42 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Jan 22 16:00:42 localhost systemd[1]: Reached target Socket Units.
Jan 22 16:00:42 localhost systemd[1]: Starting D-Bus System Message Bus...
Jan 22 16:00:42 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 22 16:00:42 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 22 16:00:42 localhost systemd-udevd[738]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 16:00:42 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 22 16:00:42 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 22 16:00:42 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Jan 22 16:00:42 localhost systemd[1]: Started D-Bus System Message Bus.
Jan 22 16:00:42 localhost systemd[1]: Reached target Basic System.
Jan 22 16:00:42 localhost dbus-broker-lau[767]: Ready
Jan 22 16:00:42 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Jan 22 16:00:42 localhost systemd[1]: Starting NTP client/server...
Jan 22 16:00:42 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Jan 22 16:00:42 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Jan 22 16:00:42 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Jan 22 16:00:42 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Jan 22 16:00:42 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Jan 22 16:00:42 localhost chronyd[788]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 22 16:00:42 localhost chronyd[788]: Loaded 0 symmetric keys
Jan 22 16:00:42 localhost chronyd[788]: Using right/UTC timezone to obtain leap second data
Jan 22 16:00:42 localhost systemd[1]: Starting IPv4 firewall with iptables...
Jan 22 16:00:42 localhost chronyd[788]: Loaded seccomp filter (level 2)
Jan 22 16:00:42 localhost systemd[1]: Started irqbalance daemon.
Jan 22 16:00:42 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Jan 22 16:00:42 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 22 16:00:42 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 22 16:00:42 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 22 16:00:42 localhost systemd[1]: Reached target sshd-keygen.target.
Jan 22 16:00:42 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Jan 22 16:00:42 localhost systemd[1]: Reached target User and Group Name Lookups.
Jan 22 16:00:42 localhost systemd[1]: Starting User Login Management...
Jan 22 16:00:42 localhost systemd[1]: Started NTP client/server.
Jan 22 16:00:42 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Jan 22 16:00:42 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Jan 22 16:00:42 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Jan 22 16:00:42 localhost systemd-logind[796]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 22 16:00:42 localhost kernel: kvm_amd: TSC scaling supported
Jan 22 16:00:42 localhost kernel: kvm_amd: Nested Virtualization enabled
Jan 22 16:00:42 localhost kernel: kvm_amd: Nested Paging enabled
Jan 22 16:00:42 localhost kernel: kvm_amd: LBR virtualization supported
Jan 22 16:00:42 localhost systemd-logind[796]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 22 16:00:42 localhost kernel: Console: switching to colour dummy device 80x25
Jan 22 16:00:42 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jan 22 16:00:42 localhost kernel: [drm] features: -context_init
Jan 22 16:00:42 localhost kernel: [drm] number of scanouts: 1
Jan 22 16:00:42 localhost kernel: [drm] number of cap sets: 0
Jan 22 16:00:42 localhost systemd-logind[796]: New seat seat0.
Jan 22 16:00:42 localhost systemd[1]: Started User Login Management.
Jan 22 16:00:42 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Jan 22 16:00:42 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Jan 22 16:00:42 localhost kernel: Console: switching to colour frame buffer device 128x48
Jan 22 16:00:42 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Jan 22 16:00:42 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Jan 22 16:00:42 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Jan 22 16:00:42 localhost iptables.init[785]: iptables: Applying firewall rules: [  OK  ]
Jan 22 16:00:42 localhost systemd[1]: Finished IPv4 firewall with iptables.
Jan 22 16:00:43 localhost cloud-init[840]: Cloud-init v. 24.4-8.el9 running 'init-local' at Thu, 22 Jan 2026 16:00:42 +0000. Up 6.73 seconds.
Jan 22 16:00:43 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Jan 22 16:00:43 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Jan 22 16:00:43 localhost systemd[1]: run-cloud\x2dinit-tmp-tmph0hg1cas.mount: Deactivated successfully.
Jan 22 16:00:43 localhost systemd[1]: Starting Hostname Service...
Jan 22 16:00:43 localhost systemd[1]: Started Hostname Service.
Jan 22 16:00:43 np0005592449.novalocal systemd-hostnamed[854]: Hostname set to <np0005592449.novalocal> (static)
Jan 22 16:00:43 np0005592449.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Jan 22 16:00:43 np0005592449.novalocal systemd[1]: Reached target Preparation for Network.
Jan 22 16:00:43 np0005592449.novalocal systemd[1]: Starting Network Manager...
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6031] NetworkManager (version 1.54.3-2.el9) is starting... (boot:da997f0d-f9f6-41ea-b801-9627d95136ee)
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6039] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6142] manager[0x55d84042d000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6198] hostname: hostname: using hostnamed
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6199] hostname: static hostname changed from (none) to "np0005592449.novalocal"
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6203] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6305] manager[0x55d84042d000]: rfkill: Wi-Fi hardware radio set enabled
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6306] manager[0x55d84042d000]: rfkill: WWAN hardware radio set enabled
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6344] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6344] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6345] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6345] manager: Networking is enabled by state file
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6348] settings: Loaded settings plugin: keyfile (internal)
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6357] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6379] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6391] dhcp: init: Using DHCP client 'internal'
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6394] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 22 16:00:43 np0005592449.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6407] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6414] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6422] device (lo): Activation: starting connection 'lo' (30955f9e-8f64-42bf-81b2-a9784deb7a51)
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6431] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6434] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6490] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6494] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6497] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6498] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6500] device (eth0): carrier: link connected
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6503] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6510] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6516] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6520] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6520] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6523] manager: NetworkManager state is now CONNECTING
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6524] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6531] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6534] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 22 16:00:43 np0005592449.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 22 16:00:43 np0005592449.novalocal systemd[1]: Started Network Manager.
Jan 22 16:00:43 np0005592449.novalocal systemd[1]: Reached target Network.
Jan 22 16:00:43 np0005592449.novalocal systemd[1]: Starting Network Manager Wait Online...
Jan 22 16:00:43 np0005592449.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Jan 22 16:00:43 np0005592449.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6748] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6751] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 22 16:00:43 np0005592449.novalocal NetworkManager[858]: <info>  [1769097643.6757] device (lo): Activation: successful, device activated.
Jan 22 16:00:43 np0005592449.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Jan 22 16:00:43 np0005592449.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 22 16:00:43 np0005592449.novalocal systemd[1]: Reached target NFS client services.
Jan 22 16:00:43 np0005592449.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Jan 22 16:00:43 np0005592449.novalocal systemd[1]: Reached target Remote File Systems.
Jan 22 16:00:43 np0005592449.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 22 16:00:46 np0005592449.novalocal NetworkManager[858]: <info>  [1769097646.3668] dhcp4 (eth0): state changed new lease, address=38.102.83.176
Jan 22 16:00:46 np0005592449.novalocal NetworkManager[858]: <info>  [1769097646.3687] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 22 16:00:46 np0005592449.novalocal NetworkManager[858]: <info>  [1769097646.3720] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 16:00:46 np0005592449.novalocal NetworkManager[858]: <info>  [1769097646.3764] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 16:00:46 np0005592449.novalocal NetworkManager[858]: <info>  [1769097646.3769] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 16:00:46 np0005592449.novalocal NetworkManager[858]: <info>  [1769097646.3778] manager: NetworkManager state is now CONNECTED_SITE
Jan 22 16:00:46 np0005592449.novalocal NetworkManager[858]: <info>  [1769097646.3783] device (eth0): Activation: successful, device activated.
Jan 22 16:00:46 np0005592449.novalocal NetworkManager[858]: <info>  [1769097646.3792] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 22 16:00:46 np0005592449.novalocal NetworkManager[858]: <info>  [1769097646.3798] manager: startup complete
Jan 22 16:00:46 np0005592449.novalocal systemd[1]: Finished Network Manager Wait Online.
Jan 22 16:00:46 np0005592449.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Jan 22 16:00:46 np0005592449.novalocal cloud-init[921]: Cloud-init v. 24.4-8.el9 running 'init' at Thu, 22 Jan 2026 16:00:46 +0000. Up 10.46 seconds.
Jan 22 16:00:46 np0005592449.novalocal cloud-init[921]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Jan 22 16:00:46 np0005592449.novalocal cloud-init[921]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 22 16:00:46 np0005592449.novalocal cloud-init[921]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Jan 22 16:00:46 np0005592449.novalocal cloud-init[921]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 22 16:00:46 np0005592449.novalocal cloud-init[921]: ci-info: |  eth0  | True |        38.102.83.176         | 255.255.255.0 | global | fa:16:3e:ae:a1:db |
Jan 22 16:00:46 np0005592449.novalocal cloud-init[921]: ci-info: |  eth0  | True | fe80::f816:3eff:feae:a1db/64 |       .       |  link  | fa:16:3e:ae:a1:db |
Jan 22 16:00:46 np0005592449.novalocal cloud-init[921]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Jan 22 16:00:46 np0005592449.novalocal cloud-init[921]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Jan 22 16:00:46 np0005592449.novalocal cloud-init[921]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 22 16:00:46 np0005592449.novalocal cloud-init[921]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Jan 22 16:00:46 np0005592449.novalocal cloud-init[921]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 22 16:00:46 np0005592449.novalocal cloud-init[921]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Jan 22 16:00:46 np0005592449.novalocal cloud-init[921]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 22 16:00:46 np0005592449.novalocal cloud-init[921]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Jan 22 16:00:46 np0005592449.novalocal cloud-init[921]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Jan 22 16:00:46 np0005592449.novalocal cloud-init[921]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Jan 22 16:00:46 np0005592449.novalocal cloud-init[921]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 22 16:00:46 np0005592449.novalocal cloud-init[921]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Jan 22 16:00:46 np0005592449.novalocal cloud-init[921]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 22 16:00:46 np0005592449.novalocal cloud-init[921]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Jan 22 16:00:46 np0005592449.novalocal cloud-init[921]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 22 16:00:46 np0005592449.novalocal cloud-init[921]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Jan 22 16:00:46 np0005592449.novalocal cloud-init[921]: ci-info: |   3   |    local    |    ::   |    eth0   |   U   |
Jan 22 16:00:46 np0005592449.novalocal cloud-init[921]: ci-info: |   4   |  multicast  |    ::   |    eth0   |   U   |
Jan 22 16:00:46 np0005592449.novalocal cloud-init[921]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 22 16:00:47 np0005592449.novalocal useradd[989]: new group: name=cloud-user, GID=1001
Jan 22 16:00:47 np0005592449.novalocal useradd[989]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Jan 22 16:00:47 np0005592449.novalocal useradd[989]: add 'cloud-user' to group 'adm'
Jan 22 16:00:47 np0005592449.novalocal useradd[989]: add 'cloud-user' to group 'systemd-journal'
Jan 22 16:00:47 np0005592449.novalocal useradd[989]: add 'cloud-user' to shadow group 'adm'
Jan 22 16:00:47 np0005592449.novalocal useradd[989]: add 'cloud-user' to shadow group 'systemd-journal'
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: Generating public/private rsa key pair.
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: The key fingerprint is:
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: SHA256:OkrjCeM+i9T+3XjQDThNoyuXU6QRl0w29rja/Elib0k root@np0005592449.novalocal
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: The key's randomart image is:
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: +---[RSA 3072]----+
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: |       .+*.      |
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: |       .+*+      |
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: |        O...     |
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: |       = +.      |
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: |        S.o      |
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: |  .  . B+. E     |
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: | .o.o =.o=...    |
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: |.oo= +..+.=o.    |
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: |..++=. o...+     |
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: +----[SHA256]-----+
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: Generating public/private ecdsa key pair.
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: The key fingerprint is:
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: SHA256:xbPHxbH9LgbNZMEg8lh8M5rln60v8wj1m+M54LCgwEk root@np0005592449.novalocal
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: The key's randomart image is:
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: +---[ECDSA 256]---+
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: |        ..o .oo  |
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: |         *..=..= |
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: |        . =* o* .|
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: |     E   .o+.*  .|
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: |    o . S . +.+o.|
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: |     +   . o +oo.|
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: |      . . . = +.o|
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: |       .   . ++=+|
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: |              oX=|
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: +----[SHA256]-----+
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: Generating public/private ed25519 key pair.
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: The key fingerprint is:
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: SHA256:L9OFbjxGJK0+G6hBoJVEq1PovSzWfaS+xrLyXhWBLP8 root@np0005592449.novalocal
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: The key's randomart image is:
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: +--[ED25519 256]--+
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: | .o . ..         |
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: | o + o  ..       |
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: |. * o  .. o      |
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: |.=.. .  .+ .     |
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: |+. .. .oS o .    |
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: | .o.o +E * .     |
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: | o +o+..* O      |
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: |....o=.  O .     |
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: |  ++=o. .        |
Jan 22 16:00:47 np0005592449.novalocal cloud-init[921]: +----[SHA256]-----+
Jan 22 16:00:47 np0005592449.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Jan 22 16:00:47 np0005592449.novalocal systemd[1]: Reached target Cloud-config availability.
Jan 22 16:00:47 np0005592449.novalocal systemd[1]: Reached target Network is Online.
Jan 22 16:00:47 np0005592449.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Jan 22 16:00:48 np0005592449.novalocal systemd[1]: Starting Crash recovery kernel arming...
Jan 22 16:00:48 np0005592449.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Jan 22 16:00:48 np0005592449.novalocal systemd[1]: Starting System Logging Service...
Jan 22 16:00:48 np0005592449.novalocal sm-notify[1005]: Version 2.5.4 starting
Jan 22 16:00:48 np0005592449.novalocal systemd[1]: Starting OpenSSH server daemon...
Jan 22 16:00:48 np0005592449.novalocal systemd[1]: Starting Permit User Sessions...
Jan 22 16:00:48 np0005592449.novalocal sshd[1007]: Server listening on 0.0.0.0 port 22.
Jan 22 16:00:48 np0005592449.novalocal sshd[1007]: Server listening on :: port 22.
Jan 22 16:00:48 np0005592449.novalocal systemd[1]: Started OpenSSH server daemon.
Jan 22 16:00:48 np0005592449.novalocal systemd[1]: Started Notify NFS peers of a restart.
Jan 22 16:00:48 np0005592449.novalocal systemd[1]: Finished Permit User Sessions.
Jan 22 16:00:48 np0005592449.novalocal rsyslogd[1006]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1006" x-info="https://www.rsyslog.com"] start
Jan 22 16:00:48 np0005592449.novalocal rsyslogd[1006]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Jan 22 16:00:48 np0005592449.novalocal systemd[1]: Started Command Scheduler.
Jan 22 16:00:48 np0005592449.novalocal systemd[1]: Started Getty on tty1.
Jan 22 16:00:48 np0005592449.novalocal crond[1011]: (CRON) STARTUP (1.5.7)
Jan 22 16:00:48 np0005592449.novalocal crond[1011]: (CRON) INFO (Syslog will be used instead of sendmail.)
Jan 22 16:00:48 np0005592449.novalocal crond[1011]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 22% if used.)
Jan 22 16:00:48 np0005592449.novalocal crond[1011]: (CRON) INFO (running with inotify support)
Jan 22 16:00:48 np0005592449.novalocal systemd[1]: Started Serial Getty on ttyS0.
Jan 22 16:00:48 np0005592449.novalocal systemd[1]: Reached target Login Prompts.
Jan 22 16:00:48 np0005592449.novalocal systemd[1]: Started System Logging Service.
Jan 22 16:00:48 np0005592449.novalocal systemd[1]: Reached target Multi-User System.
Jan 22 16:00:48 np0005592449.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Jan 22 16:00:48 np0005592449.novalocal rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 16:00:48 np0005592449.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Jan 22 16:00:48 np0005592449.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Jan 22 16:00:48 np0005592449.novalocal kdumpctl[1013]: kdump: No kdump initial ramdisk found.
Jan 22 16:00:48 np0005592449.novalocal kdumpctl[1013]: kdump: Rebuilding /boot/initramfs-5.14.0-661.el9.x86_64kdump.img
Jan 22 16:00:48 np0005592449.novalocal cloud-init[1111]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Thu, 22 Jan 2026 16:00:48 +0000. Up 12.04 seconds.
Jan 22 16:00:48 np0005592449.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Jan 22 16:00:48 np0005592449.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Jan 22 16:00:48 np0005592449.novalocal sshd-session[1231]: Unable to negotiate with 38.102.83.114 port 51010: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Jan 22 16:00:48 np0005592449.novalocal sshd-session[1252]: Unable to negotiate with 38.102.83.114 port 51024: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Jan 22 16:00:48 np0005592449.novalocal sshd-session[1263]: Unable to negotiate with 38.102.83.114 port 51038: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Jan 22 16:00:48 np0005592449.novalocal cloud-init[1277]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Thu, 22 Jan 2026 16:00:48 +0000. Up 12.43 seconds.
Jan 22 16:00:48 np0005592449.novalocal sshd-session[1222]: Connection closed by 38.102.83.114 port 51006 [preauth]
Jan 22 16:00:48 np0005592449.novalocal dracut[1283]: dracut-057-102.git20250818.el9
Jan 22 16:00:48 np0005592449.novalocal sshd-session[1281]: Connection reset by 38.102.83.114 port 51048 [preauth]
Jan 22 16:00:48 np0005592449.novalocal sshd-session[1288]: Unable to negotiate with 38.102.83.114 port 51062: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Jan 22 16:00:48 np0005592449.novalocal sshd-session[1243]: Connection closed by 38.102.83.114 port 51016 [preauth]
Jan 22 16:00:48 np0005592449.novalocal cloud-init[1304]: #############################################################
Jan 22 16:00:48 np0005592449.novalocal sshd-session[1303]: Unable to negotiate with 38.102.83.114 port 51076: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Jan 22 16:00:48 np0005592449.novalocal cloud-init[1305]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Jan 22 16:00:48 np0005592449.novalocal cloud-init[1308]: 256 SHA256:xbPHxbH9LgbNZMEg8lh8M5rln60v8wj1m+M54LCgwEk root@np0005592449.novalocal (ECDSA)
Jan 22 16:00:48 np0005592449.novalocal cloud-init[1310]: 256 SHA256:L9OFbjxGJK0+G6hBoJVEq1PovSzWfaS+xrLyXhWBLP8 root@np0005592449.novalocal (ED25519)
Jan 22 16:00:48 np0005592449.novalocal cloud-init[1312]: 3072 SHA256:OkrjCeM+i9T+3XjQDThNoyuXU6QRl0w29rja/Elib0k root@np0005592449.novalocal (RSA)
Jan 22 16:00:48 np0005592449.novalocal cloud-init[1313]: -----END SSH HOST KEY FINGERPRINTS-----
Jan 22 16:00:48 np0005592449.novalocal cloud-init[1314]: #############################################################
Jan 22 16:00:48 np0005592449.novalocal sshd-session[1275]: Connection closed by 38.102.83.114 port 51042 [preauth]
Jan 22 16:00:48 np0005592449.novalocal cloud-init[1277]: Cloud-init v. 24.4-8.el9 finished at Thu, 22 Jan 2026 16:00:48 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 12.60 seconds
Jan 22 16:00:48 np0005592449.novalocal dracut[1286]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-661.el9.x86_64kdump.img 5.14.0-661.el9.x86_64
Jan 22 16:00:48 np0005592449.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Jan 22 16:00:48 np0005592449.novalocal systemd[1]: Reached target Cloud-init target.
Jan 22 16:00:49 np0005592449.novalocal dracut[1286]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Jan 22 16:00:49 np0005592449.novalocal dracut[1286]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Jan 22 16:00:49 np0005592449.novalocal dracut[1286]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Jan 22 16:00:49 np0005592449.novalocal dracut[1286]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 22 16:00:49 np0005592449.novalocal dracut[1286]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 22 16:00:49 np0005592449.novalocal dracut[1286]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 22 16:00:49 np0005592449.novalocal dracut[1286]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 22 16:00:49 np0005592449.novalocal dracut[1286]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 22 16:00:49 np0005592449.novalocal dracut[1286]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 22 16:00:49 np0005592449.novalocal dracut[1286]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 22 16:00:49 np0005592449.novalocal dracut[1286]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 22 16:00:49 np0005592449.novalocal dracut[1286]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 22 16:00:49 np0005592449.novalocal dracut[1286]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 22 16:00:49 np0005592449.novalocal dracut[1286]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 22 16:00:49 np0005592449.novalocal dracut[1286]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Jan 22 16:00:49 np0005592449.novalocal dracut[1286]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Jan 22 16:00:49 np0005592449.novalocal dracut[1286]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 22 16:00:49 np0005592449.novalocal dracut[1286]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 22 16:00:49 np0005592449.novalocal dracut[1286]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 22 16:00:49 np0005592449.novalocal dracut[1286]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 22 16:00:49 np0005592449.novalocal dracut[1286]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 22 16:00:49 np0005592449.novalocal dracut[1286]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 22 16:00:49 np0005592449.novalocal dracut[1286]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 22 16:00:49 np0005592449.novalocal dracut[1286]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 22 16:00:49 np0005592449.novalocal dracut[1286]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 22 16:00:49 np0005592449.novalocal dracut[1286]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 22 16:00:49 np0005592449.novalocal dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 22 16:00:49 np0005592449.novalocal dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 22 16:00:49 np0005592449.novalocal dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: Module 'resume' will not be installed, because it's in the list to be omitted!
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: memstrack is not available
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: memstrack is not available
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 22 16:00:50 np0005592449.novalocal dracut[1286]: *** Including module: systemd ***
Jan 22 16:00:51 np0005592449.novalocal dracut[1286]: *** Including module: fips ***
Jan 22 16:00:51 np0005592449.novalocal chronyd[788]: Selected source 167.160.187.12 (2.centos.pool.ntp.org)
Jan 22 16:00:51 np0005592449.novalocal chronyd[788]: System clock TAI offset set to 37 seconds
Jan 22 16:00:51 np0005592449.novalocal dracut[1286]: *** Including module: systemd-initrd ***
Jan 22 16:00:51 np0005592449.novalocal dracut[1286]: *** Including module: i18n ***
Jan 22 16:00:51 np0005592449.novalocal dracut[1286]: *** Including module: drm ***
Jan 22 16:00:52 np0005592449.novalocal dracut[1286]: *** Including module: prefixdevname ***
Jan 22 16:00:52 np0005592449.novalocal dracut[1286]: *** Including module: kernel-modules ***
Jan 22 16:00:52 np0005592449.novalocal kernel: block vda: the capability attribute has been deprecated.
Jan 22 16:00:53 np0005592449.novalocal dracut[1286]: *** Including module: kernel-modules-extra ***
Jan 22 16:00:53 np0005592449.novalocal dracut[1286]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Jan 22 16:00:53 np0005592449.novalocal dracut[1286]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Jan 22 16:00:53 np0005592449.novalocal dracut[1286]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Jan 22 16:00:53 np0005592449.novalocal dracut[1286]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Jan 22 16:00:53 np0005592449.novalocal dracut[1286]: *** Including module: qemu ***
Jan 22 16:00:53 np0005592449.novalocal dracut[1286]: *** Including module: fstab-sys ***
Jan 22 16:00:53 np0005592449.novalocal dracut[1286]: *** Including module: rootfs-block ***
Jan 22 16:00:53 np0005592449.novalocal dracut[1286]: *** Including module: terminfo ***
Jan 22 16:00:53 np0005592449.novalocal irqbalance[790]: Cannot change IRQ 25 affinity: Operation not permitted
Jan 22 16:00:53 np0005592449.novalocal irqbalance[790]: IRQ 25 affinity is now unmanaged
Jan 22 16:00:53 np0005592449.novalocal irqbalance[790]: Cannot change IRQ 31 affinity: Operation not permitted
Jan 22 16:00:53 np0005592449.novalocal irqbalance[790]: IRQ 31 affinity is now unmanaged
Jan 22 16:00:53 np0005592449.novalocal irqbalance[790]: Cannot change IRQ 28 affinity: Operation not permitted
Jan 22 16:00:53 np0005592449.novalocal irqbalance[790]: IRQ 28 affinity is now unmanaged
Jan 22 16:00:53 np0005592449.novalocal irqbalance[790]: Cannot change IRQ 32 affinity: Operation not permitted
Jan 22 16:00:53 np0005592449.novalocal irqbalance[790]: IRQ 32 affinity is now unmanaged
Jan 22 16:00:53 np0005592449.novalocal irqbalance[790]: Cannot change IRQ 30 affinity: Operation not permitted
Jan 22 16:00:53 np0005592449.novalocal irqbalance[790]: IRQ 30 affinity is now unmanaged
Jan 22 16:00:53 np0005592449.novalocal irqbalance[790]: Cannot change IRQ 29 affinity: Operation not permitted
Jan 22 16:00:53 np0005592449.novalocal irqbalance[790]: IRQ 29 affinity is now unmanaged
Jan 22 16:00:53 np0005592449.novalocal dracut[1286]: *** Including module: udev-rules ***
Jan 22 16:00:54 np0005592449.novalocal dracut[1286]: Skipping udev rule: 91-permissions.rules
Jan 22 16:00:54 np0005592449.novalocal dracut[1286]: Skipping udev rule: 80-drivers-modprobe.rules
Jan 22 16:00:54 np0005592449.novalocal dracut[1286]: *** Including module: virtiofs ***
Jan 22 16:00:54 np0005592449.novalocal dracut[1286]: *** Including module: dracut-systemd ***
Jan 22 16:00:54 np0005592449.novalocal dracut[1286]: *** Including module: usrmount ***
Jan 22 16:00:54 np0005592449.novalocal dracut[1286]: *** Including module: base ***
Jan 22 16:00:54 np0005592449.novalocal dracut[1286]: *** Including module: fs-lib ***
Jan 22 16:00:54 np0005592449.novalocal dracut[1286]: *** Including module: kdumpbase ***
Jan 22 16:00:55 np0005592449.novalocal dracut[1286]: *** Including module: microcode_ctl-fw_dir_override ***
Jan 22 16:00:55 np0005592449.novalocal dracut[1286]:   microcode_ctl module: mangling fw_dir
Jan 22 16:00:55 np0005592449.novalocal dracut[1286]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Jan 22 16:00:55 np0005592449.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Jan 22 16:00:55 np0005592449.novalocal dracut[1286]:     microcode_ctl: configuration "intel" is ignored
Jan 22 16:00:55 np0005592449.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Jan 22 16:00:55 np0005592449.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Jan 22 16:00:55 np0005592449.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Jan 22 16:00:55 np0005592449.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Jan 22 16:00:55 np0005592449.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Jan 22 16:00:55 np0005592449.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Jan 22 16:00:55 np0005592449.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Jan 22 16:00:55 np0005592449.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Jan 22 16:00:55 np0005592449.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Jan 22 16:00:55 np0005592449.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Jan 22 16:00:55 np0005592449.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Jan 22 16:00:55 np0005592449.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Jan 22 16:00:55 np0005592449.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Jan 22 16:00:55 np0005592449.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Jan 22 16:00:55 np0005592449.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Jan 22 16:00:56 np0005592449.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Jan 22 16:00:56 np0005592449.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Jan 22 16:00:56 np0005592449.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Jan 22 16:00:56 np0005592449.novalocal dracut[1286]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Jan 22 16:00:56 np0005592449.novalocal dracut[1286]: *** Including module: openssl ***
Jan 22 16:00:56 np0005592449.novalocal dracut[1286]: *** Including module: shutdown ***
Jan 22 16:00:56 np0005592449.novalocal dracut[1286]: *** Including module: squash ***
Jan 22 16:00:56 np0005592449.novalocal dracut[1286]: *** Including modules done ***
Jan 22 16:00:56 np0005592449.novalocal dracut[1286]: *** Installing kernel module dependencies ***
Jan 22 16:00:56 np0005592449.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 22 16:00:57 np0005592449.novalocal dracut[1286]: *** Installing kernel module dependencies done ***
Jan 22 16:00:57 np0005592449.novalocal dracut[1286]: *** Resolving executable dependencies ***
Jan 22 16:00:59 np0005592449.novalocal dracut[1286]: *** Resolving executable dependencies done ***
Jan 22 16:00:59 np0005592449.novalocal dracut[1286]: *** Generating early-microcode cpio image ***
Jan 22 16:00:59 np0005592449.novalocal dracut[1286]: *** Store current command line parameters ***
Jan 22 16:00:59 np0005592449.novalocal dracut[1286]: Stored kernel commandline:
Jan 22 16:00:59 np0005592449.novalocal dracut[1286]: No dracut internal kernel commandline stored in the initramfs
Jan 22 16:00:59 np0005592449.novalocal dracut[1286]: *** Install squash loader ***
Jan 22 16:01:00 np0005592449.novalocal dracut[1286]: *** Squashing the files inside the initramfs ***
Jan 22 16:01:01 np0005592449.novalocal CROND[4153]: (root) CMD (run-parts /etc/cron.hourly)
Jan 22 16:01:01 np0005592449.novalocal run-parts[4156]: (/etc/cron.hourly) starting 0anacron
Jan 22 16:01:01 np0005592449.novalocal anacron[4164]: Anacron started on 2026-01-22
Jan 22 16:01:01 np0005592449.novalocal anacron[4164]: Will run job `cron.daily' in 38 min.
Jan 22 16:01:01 np0005592449.novalocal anacron[4164]: Will run job `cron.weekly' in 58 min.
Jan 22 16:01:01 np0005592449.novalocal anacron[4164]: Will run job `cron.monthly' in 78 min.
Jan 22 16:01:01 np0005592449.novalocal anacron[4164]: Jobs will be executed sequentially
Jan 22 16:01:01 np0005592449.novalocal run-parts[4166]: (/etc/cron.hourly) finished 0anacron
Jan 22 16:01:01 np0005592449.novalocal CROND[4152]: (root) CMDEND (run-parts /etc/cron.hourly)
Jan 22 16:01:01 np0005592449.novalocal dracut[1286]: *** Squashing the files inside the initramfs done ***
Jan 22 16:01:01 np0005592449.novalocal dracut[1286]: *** Creating image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' ***
Jan 22 16:01:01 np0005592449.novalocal dracut[1286]: *** Hardlinking files ***
Jan 22 16:01:01 np0005592449.novalocal dracut[1286]: Mode:           real
Jan 22 16:01:01 np0005592449.novalocal dracut[1286]: Files:          50
Jan 22 16:01:01 np0005592449.novalocal dracut[1286]: Linked:         0 files
Jan 22 16:01:01 np0005592449.novalocal dracut[1286]: Compared:       0 xattrs
Jan 22 16:01:01 np0005592449.novalocal dracut[1286]: Compared:       0 files
Jan 22 16:01:01 np0005592449.novalocal dracut[1286]: Saved:          0 B
Jan 22 16:01:01 np0005592449.novalocal dracut[1286]: Duration:       0.000369 seconds
Jan 22 16:01:01 np0005592449.novalocal dracut[1286]: *** Hardlinking files done ***
Jan 22 16:01:02 np0005592449.novalocal dracut[1286]: *** Creating initramfs image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' done ***
Jan 22 16:01:02 np0005592449.novalocal kdumpctl[1013]: kdump: kexec: loaded kdump kernel
Jan 22 16:01:02 np0005592449.novalocal kdumpctl[1013]: kdump: Starting kdump: [OK]
Jan 22 16:01:02 np0005592449.novalocal systemd[1]: Finished Crash recovery kernel arming.
Jan 22 16:01:02 np0005592449.novalocal systemd[1]: Startup finished in 1.768s (kernel) + 2.940s (initrd) + 21.799s (userspace) = 26.508s.
Jan 22 16:01:05 np0005592449.novalocal sshd-session[4319]: Accepted publickey for zuul from 38.102.83.114 port 58260 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Jan 22 16:01:05 np0005592449.novalocal systemd[1]: Created slice User Slice of UID 1000.
Jan 22 16:01:05 np0005592449.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Jan 22 16:01:05 np0005592449.novalocal systemd-logind[796]: New session 1 of user zuul.
Jan 22 16:01:05 np0005592449.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Jan 22 16:01:05 np0005592449.novalocal systemd[1]: Starting User Manager for UID 1000...
Jan 22 16:01:05 np0005592449.novalocal systemd[4323]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 16:01:06 np0005592449.novalocal systemd[4323]: Queued start job for default target Main User Target.
Jan 22 16:01:06 np0005592449.novalocal systemd[4323]: Created slice User Application Slice.
Jan 22 16:01:06 np0005592449.novalocal systemd[4323]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 22 16:01:06 np0005592449.novalocal systemd[4323]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 16:01:06 np0005592449.novalocal systemd[4323]: Reached target Paths.
Jan 22 16:01:06 np0005592449.novalocal systemd[4323]: Reached target Timers.
Jan 22 16:01:06 np0005592449.novalocal systemd[4323]: Starting D-Bus User Message Bus Socket...
Jan 22 16:01:06 np0005592449.novalocal systemd[4323]: Starting Create User's Volatile Files and Directories...
Jan 22 16:01:06 np0005592449.novalocal systemd[4323]: Finished Create User's Volatile Files and Directories.
Jan 22 16:01:06 np0005592449.novalocal systemd[4323]: Listening on D-Bus User Message Bus Socket.
Jan 22 16:01:06 np0005592449.novalocal systemd[4323]: Reached target Sockets.
Jan 22 16:01:06 np0005592449.novalocal systemd[4323]: Reached target Basic System.
Jan 22 16:01:06 np0005592449.novalocal systemd[4323]: Reached target Main User Target.
Jan 22 16:01:06 np0005592449.novalocal systemd[4323]: Startup finished in 154ms.
Jan 22 16:01:06 np0005592449.novalocal systemd[1]: Started User Manager for UID 1000.
Jan 22 16:01:06 np0005592449.novalocal systemd[1]: Started Session 1 of User zuul.
Jan 22 16:01:06 np0005592449.novalocal sshd-session[4319]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 16:01:06 np0005592449.novalocal python3[4405]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:01:09 np0005592449.novalocal python3[4433]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:01:13 np0005592449.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 22 16:01:16 np0005592449.novalocal python3[4493]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:01:17 np0005592449.novalocal python3[4533]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Jan 22 16:01:19 np0005592449.novalocal python3[4559]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDNMW1AfmmBc7+YDM/ntK8Bb7InKcwG2MxXA0tDI/9IDfIXJzfMqXa2e6Ez/dwyE92/Rrd5ARch14CVwkda0fIaG5Nc3i4pm9tmz7mWfQwTHa9Ee4qArsxChPC0rZqPRD7pOQScFwu+Her7sugovw3i2mLZxt1J9nuBuej1ZjB4McTup7ctPlcusljUT/HLiZDMH+C7WBENs9sK/bZ/c+dEsPVuirXVShtNLbhGfRTUGD3fhSTi5Qyls5WV1DNDF/IqLHexd9PfjXmV8YvfCSov4Fl1xV9Ev9LCBkWdh81C/cxs4G00ZMIcI1CZwA3mWjNq2xE6ZSBCC1IINHSaqWXQ7FQ8E8tyAOQzr/sYdFiUsV3JouZKBdc8wVKqnjlQ0VoWX3Kz1R1pziDljGUj9fkiGRhgvoH59hvaU79E3tieIvR9ang/7IXq/SVPEdM/MUaartzwLjIjo0ol9ldWv6tSBdw6GIJyH6zhlMZvRkHHHbxtNCQ0HDwfklM3yREWuMU= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:01:19 np0005592449.novalocal python3[4583]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:01:20 np0005592449.novalocal python3[4682]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:01:20 np0005592449.novalocal python3[4753]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769097679.9424093-207-128665715101836/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=8a2dd4918442465e97348e0f7d94c544_id_rsa follow=False checksum=0896161b5d4b16ae30aa6fb8bb4830748467df2f backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:01:21 np0005592449.novalocal python3[4876]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:01:21 np0005592449.novalocal python3[4947]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769097680.964836-240-42213785198092/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=8a2dd4918442465e97348e0f7d94c544_id_rsa.pub follow=False checksum=7ff2005e3d74ac935947962537a855b64937286a backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:01:23 np0005592449.novalocal python3[4995]: ansible-ping Invoked with data=pong
Jan 22 16:01:24 np0005592449.novalocal python3[5019]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:01:25 np0005592449.novalocal python3[5077]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Jan 22 16:01:26 np0005592449.novalocal python3[5109]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:01:27 np0005592449.novalocal python3[5133]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:01:27 np0005592449.novalocal python3[5157]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:01:27 np0005592449.novalocal python3[5181]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:01:28 np0005592449.novalocal python3[5205]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:01:28 np0005592449.novalocal python3[5229]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:01:29 np0005592449.novalocal sudo[5253]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inawvezxaibwooifesorbdqepfuhwxli ; /usr/bin/python3'
Jan 22 16:01:29 np0005592449.novalocal sudo[5253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:01:29 np0005592449.novalocal python3[5255]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:01:29 np0005592449.novalocal sudo[5253]: pam_unix(sudo:session): session closed for user root
Jan 22 16:01:30 np0005592449.novalocal sudo[5331]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdwvrajjvgzxwpdssexqadxracfnenhc ; /usr/bin/python3'
Jan 22 16:01:30 np0005592449.novalocal sudo[5331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:01:30 np0005592449.novalocal python3[5333]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:01:30 np0005592449.novalocal sudo[5331]: pam_unix(sudo:session): session closed for user root
Jan 22 16:01:30 np0005592449.novalocal sudo[5404]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iphwyxnpgexszuynsazzaujlvwqkfffy ; /usr/bin/python3'
Jan 22 16:01:30 np0005592449.novalocal sudo[5404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:01:31 np0005592449.novalocal python3[5406]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769097690.085867-21-63035825127333/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:01:31 np0005592449.novalocal sudo[5404]: pam_unix(sudo:session): session closed for user root
Jan 22 16:01:31 np0005592449.novalocal python3[5454]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:01:32 np0005592449.novalocal python3[5478]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:01:32 np0005592449.novalocal python3[5502]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:01:32 np0005592449.novalocal python3[5526]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:01:32 np0005592449.novalocal python3[5550]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:01:33 np0005592449.novalocal python3[5574]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:01:33 np0005592449.novalocal python3[5598]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:01:33 np0005592449.novalocal python3[5622]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:01:34 np0005592449.novalocal python3[5646]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:01:34 np0005592449.novalocal python3[5670]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:01:34 np0005592449.novalocal python3[5694]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:01:35 np0005592449.novalocal python3[5718]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:01:35 np0005592449.novalocal python3[5742]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:01:35 np0005592449.novalocal python3[5766]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:01:35 np0005592449.novalocal python3[5790]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:01:36 np0005592449.novalocal python3[5814]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:01:36 np0005592449.novalocal python3[5838]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:01:36 np0005592449.novalocal python3[5862]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:01:37 np0005592449.novalocal python3[5886]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:01:37 np0005592449.novalocal python3[5910]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:01:37 np0005592449.novalocal python3[5934]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:01:37 np0005592449.novalocal python3[5958]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:01:38 np0005592449.novalocal python3[5982]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:01:38 np0005592449.novalocal python3[6006]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:01:38 np0005592449.novalocal python3[6030]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:01:39 np0005592449.novalocal python3[6054]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:01:41 np0005592449.novalocal sudo[6078]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdmhrfbtlhpuowdhzztzoxagjfkxqslv ; /usr/bin/python3'
Jan 22 16:01:41 np0005592449.novalocal sudo[6078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:01:42 np0005592449.novalocal python3[6080]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 22 16:01:42 np0005592449.novalocal systemd[1]: Starting Time & Date Service...
Jan 22 16:01:42 np0005592449.novalocal systemd[1]: Started Time & Date Service.
Jan 22 16:01:42 np0005592449.novalocal systemd-timedated[6082]: Changed time zone to 'UTC' (UTC).
Jan 22 16:01:42 np0005592449.novalocal sudo[6078]: pam_unix(sudo:session): session closed for user root
Jan 22 16:01:42 np0005592449.novalocal sudo[6109]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnwevmasqznnrtoppqileqzvjyfntakr ; /usr/bin/python3'
Jan 22 16:01:42 np0005592449.novalocal sudo[6109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:01:42 np0005592449.novalocal python3[6111]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:01:42 np0005592449.novalocal sudo[6109]: pam_unix(sudo:session): session closed for user root
Jan 22 16:01:43 np0005592449.novalocal python3[6187]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:01:43 np0005592449.novalocal python3[6258]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1769097702.8431504-153-119429504035269/source _original_basename=tmppf0pm6r1 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:01:43 np0005592449.novalocal python3[6358]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:01:44 np0005592449.novalocal python3[6429]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769097703.739381-183-59829488390819/source _original_basename=tmpdijekj8z follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:01:44 np0005592449.novalocal sudo[6529]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yurdjouufrxifrzteflqrfccnobbkory ; /usr/bin/python3'
Jan 22 16:01:45 np0005592449.novalocal sudo[6529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:01:45 np0005592449.novalocal python3[6531]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:01:45 np0005592449.novalocal sudo[6529]: pam_unix(sudo:session): session closed for user root
Jan 22 16:01:45 np0005592449.novalocal sudo[6602]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrxrlckvywoqryyaraivpyrqrsljxdtl ; /usr/bin/python3'
Jan 22 16:01:45 np0005592449.novalocal sudo[6602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:01:45 np0005592449.novalocal python3[6604]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769097704.8285675-231-149271157276211/source _original_basename=tmpf4ql267c follow=False checksum=8e0e434468aa50922357fbdb56d8b197f48f0949 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:01:45 np0005592449.novalocal sudo[6602]: pam_unix(sudo:session): session closed for user root
Jan 22 16:01:46 np0005592449.novalocal python3[6652]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:01:46 np0005592449.novalocal python3[6678]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:01:46 np0005592449.novalocal sudo[6756]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-boatpwcdbngpyhpffojpgxmzcfhfzash ; /usr/bin/python3'
Jan 22 16:01:46 np0005592449.novalocal sudo[6756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:01:46 np0005592449.novalocal python3[6758]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:01:46 np0005592449.novalocal sudo[6756]: pam_unix(sudo:session): session closed for user root
Jan 22 16:01:47 np0005592449.novalocal sudo[6829]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajxrbrufhouwzzulxqjujdipyjfjqccm ; /usr/bin/python3'
Jan 22 16:01:47 np0005592449.novalocal sudo[6829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:01:47 np0005592449.novalocal python3[6831]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1769097706.5987787-273-187901802904412/source _original_basename=tmprfa4zyhw follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:01:47 np0005592449.novalocal sudo[6829]: pam_unix(sudo:session): session closed for user root
Jan 22 16:01:47 np0005592449.novalocal sudo[6880]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-outejfaytpontqzvnvafqcvihmuqqkkr ; /usr/bin/python3'
Jan 22 16:01:47 np0005592449.novalocal sudo[6880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:01:47 np0005592449.novalocal python3[6882]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163e3b-3c83-e9cc-d995-00000000001d-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:01:47 np0005592449.novalocal sudo[6880]: pam_unix(sudo:session): session closed for user root
Jan 22 16:01:48 np0005592449.novalocal python3[6910]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163e3b-3c83-e9cc-d995-00000000001e-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Jan 22 16:01:49 np0005592449.novalocal python3[6938]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:02:05 np0005592449.novalocal sudo[6962]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xozxtbneelmueekqdwytesnaeighvlis ; /usr/bin/python3'
Jan 22 16:02:05 np0005592449.novalocal sudo[6962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:02:05 np0005592449.novalocal python3[6964]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:02:05 np0005592449.novalocal sudo[6962]: pam_unix(sudo:session): session closed for user root
Jan 22 16:02:12 np0005592449.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 22 16:02:40 np0005592449.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 22 16:02:40 np0005592449.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Jan 22 16:02:40 np0005592449.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Jan 22 16:02:40 np0005592449.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Jan 22 16:02:40 np0005592449.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Jan 22 16:02:40 np0005592449.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Jan 22 16:02:40 np0005592449.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Jan 22 16:02:40 np0005592449.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Jan 22 16:02:40 np0005592449.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Jan 22 16:02:40 np0005592449.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Jan 22 16:02:40 np0005592449.novalocal NetworkManager[858]: <info>  [1769097760.1677] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 22 16:02:40 np0005592449.novalocal systemd-udevd[6967]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 16:02:40 np0005592449.novalocal NetworkManager[858]: <info>  [1769097760.1865] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 16:02:40 np0005592449.novalocal NetworkManager[858]: <info>  [1769097760.1902] settings: (eth1): created default wired connection 'Wired connection 1'
Jan 22 16:02:40 np0005592449.novalocal NetworkManager[858]: <info>  [1769097760.1908] device (eth1): carrier: link connected
Jan 22 16:02:40 np0005592449.novalocal NetworkManager[858]: <info>  [1769097760.1911] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 22 16:02:40 np0005592449.novalocal NetworkManager[858]: <info>  [1769097760.1919] policy: auto-activating connection 'Wired connection 1' (996dff1a-21f6-3407-9055-9c1cc954befb)
Jan 22 16:02:40 np0005592449.novalocal NetworkManager[858]: <info>  [1769097760.1925] device (eth1): Activation: starting connection 'Wired connection 1' (996dff1a-21f6-3407-9055-9c1cc954befb)
Jan 22 16:02:40 np0005592449.novalocal NetworkManager[858]: <info>  [1769097760.1927] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 16:02:40 np0005592449.novalocal NetworkManager[858]: <info>  [1769097760.1933] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 16:02:40 np0005592449.novalocal NetworkManager[858]: <info>  [1769097760.1940] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 16:02:40 np0005592449.novalocal NetworkManager[858]: <info>  [1769097760.1946] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 22 16:02:41 np0005592449.novalocal python3[6994]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163e3b-3c83-0c91-b230-0000000000fc-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:02:48 np0005592449.novalocal sudo[7072]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryrvswiafrgixhzwszroetbrgaqxwgil ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 22 16:02:48 np0005592449.novalocal sudo[7072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:02:48 np0005592449.novalocal python3[7074]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:02:48 np0005592449.novalocal sudo[7072]: pam_unix(sudo:session): session closed for user root
Jan 22 16:02:48 np0005592449.novalocal sudo[7145]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzjmkprrbhjhlnubcqpukkeejotpggzd ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 22 16:02:48 np0005592449.novalocal sudo[7145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:02:48 np0005592449.novalocal python3[7147]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769097767.8066692-102-130075598034103/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=1db69643f935f50e5d0015acaf7902a006880989 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:02:48 np0005592449.novalocal sudo[7145]: pam_unix(sudo:session): session closed for user root
Jan 22 16:02:49 np0005592449.novalocal sudo[7195]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zboksdyivydedrfhogqniuvvebluxaja ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 22 16:02:49 np0005592449.novalocal sudo[7195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:02:49 np0005592449.novalocal python3[7197]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 16:02:49 np0005592449.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 22 16:02:49 np0005592449.novalocal systemd[1]: Stopped Network Manager Wait Online.
Jan 22 16:02:49 np0005592449.novalocal systemd[1]: Stopping Network Manager Wait Online...
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[858]: <info>  [1769097769.5142] caught SIGTERM, shutting down normally.
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[858]: <info>  [1769097769.5149] dhcp4 (eth0): canceled DHCP transaction
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[858]: <info>  [1769097769.5149] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[858]: <info>  [1769097769.5149] dhcp4 (eth0): state changed no lease
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[858]: <info>  [1769097769.5151] manager: NetworkManager state is now CONNECTING
Jan 22 16:02:49 np0005592449.novalocal systemd[1]: Stopping Network Manager...
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[858]: <info>  [1769097769.5236] dhcp4 (eth1): canceled DHCP transaction
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[858]: <info>  [1769097769.5237] dhcp4 (eth1): state changed no lease
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[858]: <info>  [1769097769.5298] exiting (success)
Jan 22 16:02:49 np0005592449.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 22 16:02:49 np0005592449.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 22 16:02:49 np0005592449.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 22 16:02:49 np0005592449.novalocal systemd[1]: Stopped Network Manager.
Jan 22 16:02:49 np0005592449.novalocal systemd[1]: Starting Network Manager...
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.6205] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:da997f0d-f9f6-41ea-b801-9627d95136ee)
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.6209] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.6276] manager[0x557d9c074000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 22 16:02:49 np0005592449.novalocal systemd[1]: Starting Hostname Service...
Jan 22 16:02:49 np0005592449.novalocal systemd[1]: Started Hostname Service.
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7348] hostname: hostname: using hostnamed
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7349] hostname: static hostname changed from (none) to "np0005592449.novalocal"
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7359] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7367] manager[0x557d9c074000]: rfkill: Wi-Fi hardware radio set enabled
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7368] manager[0x557d9c074000]: rfkill: WWAN hardware radio set enabled
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7425] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7426] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7427] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7428] manager: Networking is enabled by state file
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7432] settings: Loaded settings plugin: keyfile (internal)
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7439] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7491] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7509] dhcp: init: Using DHCP client 'internal'
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7513] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7522] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7531] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7545] device (lo): Activation: starting connection 'lo' (30955f9e-8f64-42bf-81b2-a9784deb7a51)
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7558] device (eth0): carrier: link connected
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7565] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7574] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7575] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7587] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7599] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7610] device (eth1): carrier: link connected
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7619] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7629] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (996dff1a-21f6-3407-9055-9c1cc954befb) (indicated)
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7629] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7637] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7649] device (eth1): Activation: starting connection 'Wired connection 1' (996dff1a-21f6-3407-9055-9c1cc954befb)
Jan 22 16:02:49 np0005592449.novalocal systemd[1]: Started Network Manager.
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7662] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7671] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7676] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7680] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7684] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7689] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7693] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7698] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7703] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7716] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7722] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7745] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7749] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7765] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7770] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 22 16:02:49 np0005592449.novalocal systemd[1]: Starting Network Manager Wait Online...
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7776] device (lo): Activation: successful, device activated.
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7785] dhcp4 (eth0): state changed new lease, address=38.102.83.176
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7791] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7865] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7896] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7897] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7900] manager: NetworkManager state is now CONNECTED_SITE
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7901] device (eth0): Activation: successful, device activated.
Jan 22 16:02:49 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097769.7907] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 22 16:02:49 np0005592449.novalocal sudo[7195]: pam_unix(sudo:session): session closed for user root
Jan 22 16:02:50 np0005592449.novalocal python3[7281]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163e3b-3c83-0c91-b230-0000000000a7-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:02:59 np0005592449.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 22 16:03:19 np0005592449.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 22 16:03:35 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097815.2403] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 22 16:03:35 np0005592449.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 22 16:03:35 np0005592449.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 22 16:03:35 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097815.2803] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 22 16:03:35 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097815.2812] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 22 16:03:35 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097815.2835] device (eth1): Activation: successful, device activated.
Jan 22 16:03:35 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097815.2847] manager: startup complete
Jan 22 16:03:35 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097815.2852] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Jan 22 16:03:35 np0005592449.novalocal NetworkManager[7206]: <warn>  [1769097815.2871] device (eth1): Activation: failed for connection 'Wired connection 1'
Jan 22 16:03:35 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097815.2882] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Jan 22 16:03:35 np0005592449.novalocal systemd[1]: Finished Network Manager Wait Online.
Jan 22 16:03:35 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097815.3047] dhcp4 (eth1): canceled DHCP transaction
Jan 22 16:03:35 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097815.3048] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 22 16:03:35 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097815.3048] dhcp4 (eth1): state changed no lease
Jan 22 16:03:35 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097815.3067] policy: auto-activating connection 'ci-private-network' (f8969003-395c-5487-8caf-079e54e358f5)
Jan 22 16:03:35 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097815.3073] device (eth1): Activation: starting connection 'ci-private-network' (f8969003-395c-5487-8caf-079e54e358f5)
Jan 22 16:03:35 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097815.3074] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 16:03:35 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097815.3079] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 16:03:35 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097815.3089] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 16:03:35 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097815.3102] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 16:03:35 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097815.3145] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 16:03:35 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097815.3149] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 16:03:35 np0005592449.novalocal NetworkManager[7206]: <info>  [1769097815.3159] device (eth1): Activation: successful, device activated.
Jan 22 16:03:45 np0005592449.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 22 16:03:50 np0005592449.novalocal sshd-session[4332]: Received disconnect from 38.102.83.114 port 58260:11: disconnected by user
Jan 22 16:03:50 np0005592449.novalocal sshd-session[4332]: Disconnected from user zuul 38.102.83.114 port 58260
Jan 22 16:03:50 np0005592449.novalocal sshd-session[4319]: pam_unix(sshd:session): session closed for user zuul
Jan 22 16:03:50 np0005592449.novalocal systemd-logind[796]: Session 1 logged out. Waiting for processes to exit.
Jan 22 16:03:56 np0005592449.novalocal sshd-session[7309]: Accepted publickey for zuul from 38.102.83.114 port 48752 ssh2: RSA SHA256:8EnoZWWmxQfrmVWtONiJoMXuv4iNTlOetYCFvLE13as
Jan 22 16:03:56 np0005592449.novalocal systemd-logind[796]: New session 3 of user zuul.
Jan 22 16:03:56 np0005592449.novalocal systemd[1]: Started Session 3 of User zuul.
Jan 22 16:03:56 np0005592449.novalocal sshd-session[7309]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 16:03:57 np0005592449.novalocal sudo[7388]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxggvbdsmunwbeyvjokkkekujqesjxab ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 22 16:03:57 np0005592449.novalocal sudo[7388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:03:57 np0005592449.novalocal python3[7390]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:03:57 np0005592449.novalocal sudo[7388]: pam_unix(sudo:session): session closed for user root
Jan 22 16:03:57 np0005592449.novalocal systemd[4323]: Starting Mark boot as successful...
Jan 22 16:03:57 np0005592449.novalocal systemd[4323]: Finished Mark boot as successful.
Jan 22 16:03:57 np0005592449.novalocal sudo[7462]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apxvuwvpcmeexeybfaultstgldswbffa ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 22 16:03:57 np0005592449.novalocal sudo[7462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:03:57 np0005592449.novalocal python3[7464]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769097836.921484-259-120292531590533/source _original_basename=tmpfv_y7q3y follow=False checksum=3350350d7ad415d76b3755051b544739a9322f8a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:03:57 np0005592449.novalocal sudo[7462]: pam_unix(sudo:session): session closed for user root
Jan 22 16:03:59 np0005592449.novalocal sshd-session[7312]: Connection closed by 38.102.83.114 port 48752
Jan 22 16:03:59 np0005592449.novalocal sshd-session[7309]: pam_unix(sshd:session): session closed for user zuul
Jan 22 16:03:59 np0005592449.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Jan 22 16:03:59 np0005592449.novalocal systemd-logind[796]: Session 3 logged out. Waiting for processes to exit.
Jan 22 16:03:59 np0005592449.novalocal systemd-logind[796]: Removed session 3.
Jan 22 16:04:48 np0005592449.novalocal sshd-session[7489]: Connection reset by authenticating user root 176.120.22.47 port 43412 [preauth]
Jan 22 16:06:14 np0005592449.novalocal sshd-session[7492]: Received disconnect from 45.148.10.141 port 53394:11:  [preauth]
Jan 22 16:06:14 np0005592449.novalocal sshd-session[7492]: Disconnected from authenticating user root 45.148.10.141 port 53394 [preauth]
Jan 22 16:06:57 np0005592449.novalocal systemd[4323]: Created slice User Background Tasks Slice.
Jan 22 16:06:57 np0005592449.novalocal systemd[4323]: Starting Cleanup of User's Temporary Files and Directories...
Jan 22 16:06:57 np0005592449.novalocal systemd[4323]: Finished Cleanup of User's Temporary Files and Directories.
Jan 22 16:11:08 np0005592449.novalocal sshd-session[7499]: Accepted publickey for zuul from 38.102.83.114 port 42710 ssh2: RSA SHA256:8EnoZWWmxQfrmVWtONiJoMXuv4iNTlOetYCFvLE13as
Jan 22 16:11:08 np0005592449.novalocal systemd-logind[796]: New session 4 of user zuul.
Jan 22 16:11:08 np0005592449.novalocal systemd[1]: Started Session 4 of User zuul.
Jan 22 16:11:08 np0005592449.novalocal sshd-session[7499]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 16:11:08 np0005592449.novalocal sudo[7526]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhiwlthztvbtdpeghemgcuazzpiwlbsv ; /usr/bin/python3'
Jan 22 16:11:08 np0005592449.novalocal sudo[7526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:11:08 np0005592449.novalocal python3[7528]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163e3b-3c83-ede4-0b3c-00000000216b-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:11:08 np0005592449.novalocal sudo[7526]: pam_unix(sudo:session): session closed for user root
Jan 22 16:11:09 np0005592449.novalocal sudo[7555]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhyzvprgwfmgjejzzewnswnusqtgafgv ; /usr/bin/python3'
Jan 22 16:11:09 np0005592449.novalocal sudo[7555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:11:09 np0005592449.novalocal python3[7557]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:11:09 np0005592449.novalocal sudo[7555]: pam_unix(sudo:session): session closed for user root
Jan 22 16:11:09 np0005592449.novalocal sudo[7581]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjvftpolahswomezmhqblcifygugnxix ; /usr/bin/python3'
Jan 22 16:11:09 np0005592449.novalocal sudo[7581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:11:09 np0005592449.novalocal python3[7583]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:11:09 np0005592449.novalocal sudo[7581]: pam_unix(sudo:session): session closed for user root
Jan 22 16:11:09 np0005592449.novalocal sudo[7607]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzfmxazjeaokhtgmnraazjmcsveytzdy ; /usr/bin/python3'
Jan 22 16:11:09 np0005592449.novalocal sudo[7607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:11:09 np0005592449.novalocal python3[7609]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:11:09 np0005592449.novalocal sudo[7607]: pam_unix(sudo:session): session closed for user root
Jan 22 16:11:09 np0005592449.novalocal sudo[7633]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozfsglhjkmvwgmnrmemftwkshcfbfsxo ; /usr/bin/python3'
Jan 22 16:11:09 np0005592449.novalocal sudo[7633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:11:09 np0005592449.novalocal python3[7635]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:11:09 np0005592449.novalocal sudo[7633]: pam_unix(sudo:session): session closed for user root
Jan 22 16:11:10 np0005592449.novalocal sudo[7659]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjarujqcdovdhpfnzrnnqbrwgutttszi ; /usr/bin/python3'
Jan 22 16:11:10 np0005592449.novalocal sudo[7659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:11:10 np0005592449.novalocal python3[7661]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:11:10 np0005592449.novalocal sudo[7659]: pam_unix(sudo:session): session closed for user root
Jan 22 16:11:11 np0005592449.novalocal sudo[7737]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wynlduyqckbascdtckdgpwbkgslrjvah ; /usr/bin/python3'
Jan 22 16:11:11 np0005592449.novalocal sudo[7737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:11:11 np0005592449.novalocal python3[7739]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:11:11 np0005592449.novalocal sudo[7737]: pam_unix(sudo:session): session closed for user root
Jan 22 16:11:11 np0005592449.novalocal sudo[7810]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyklxfkkokqgizcdxbzqosleatecumhr ; /usr/bin/python3'
Jan 22 16:11:11 np0005592449.novalocal sudo[7810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:11:11 np0005592449.novalocal python3[7812]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769098270.9243383-497-23906680710009/source _original_basename=tmpubss9u7m follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:11:11 np0005592449.novalocal sudo[7810]: pam_unix(sudo:session): session closed for user root
Jan 22 16:11:12 np0005592449.novalocal sudo[7860]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngologetvsrzmqxgebovkwuyqcdmtuqm ; /usr/bin/python3'
Jan 22 16:11:12 np0005592449.novalocal sudo[7860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:11:12 np0005592449.novalocal python3[7862]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 16:11:12 np0005592449.novalocal systemd[1]: Reloading.
Jan 22 16:11:12 np0005592449.novalocal systemd-rc-local-generator[7881]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:11:12 np0005592449.novalocal sudo[7860]: pam_unix(sudo:session): session closed for user root
Jan 22 16:11:14 np0005592449.novalocal sudo[7916]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnnrrcjmkkpskpuvbzmlrlrjkuuppxyr ; /usr/bin/python3'
Jan 22 16:11:14 np0005592449.novalocal sudo[7916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:11:14 np0005592449.novalocal python3[7918]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Jan 22 16:11:14 np0005592449.novalocal sudo[7916]: pam_unix(sudo:session): session closed for user root
Jan 22 16:11:14 np0005592449.novalocal sudo[7942]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnjtvyojeghbfyjffwfnceuwkhkphudz ; /usr/bin/python3'
Jan 22 16:11:14 np0005592449.novalocal sudo[7942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:11:14 np0005592449.novalocal python3[7944]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:11:14 np0005592449.novalocal sudo[7942]: pam_unix(sudo:session): session closed for user root
Jan 22 16:11:14 np0005592449.novalocal sudo[7970]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgtogejgxpcaltwrooxafbgvzdegkxog ; /usr/bin/python3'
Jan 22 16:11:14 np0005592449.novalocal sudo[7970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:11:14 np0005592449.novalocal python3[7972]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:11:14 np0005592449.novalocal sudo[7970]: pam_unix(sudo:session): session closed for user root
Jan 22 16:11:15 np0005592449.novalocal sudo[7998]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aevxfywbtckndiletwdrrycstcwewimw ; /usr/bin/python3'
Jan 22 16:11:15 np0005592449.novalocal sudo[7998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:11:15 np0005592449.novalocal python3[8000]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:11:15 np0005592449.novalocal sudo[7998]: pam_unix(sudo:session): session closed for user root
Jan 22 16:11:15 np0005592449.novalocal sudo[8026]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwjqlpepkkzrievxfshhtjjwgnowmgqu ; /usr/bin/python3'
Jan 22 16:11:15 np0005592449.novalocal sudo[8026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:11:15 np0005592449.novalocal python3[8028]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:11:15 np0005592449.novalocal sudo[8026]: pam_unix(sudo:session): session closed for user root
Jan 22 16:11:16 np0005592449.novalocal python3[8055]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163e3b-3c83-ede4-0b3c-000000002172-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:11:16 np0005592449.novalocal python3[8085]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 22 16:11:18 np0005592449.novalocal sshd-session[7502]: Connection closed by 38.102.83.114 port 42710
Jan 22 16:11:18 np0005592449.novalocal sshd-session[7499]: pam_unix(sshd:session): session closed for user zuul
Jan 22 16:11:18 np0005592449.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Jan 22 16:11:18 np0005592449.novalocal systemd[1]: session-4.scope: Consumed 3.914s CPU time.
Jan 22 16:11:18 np0005592449.novalocal systemd-logind[796]: Session 4 logged out. Waiting for processes to exit.
Jan 22 16:11:18 np0005592449.novalocal systemd-logind[796]: Removed session 4.
Jan 22 16:11:19 np0005592449.novalocal sshd-session[8090]: Accepted publickey for zuul from 38.102.83.114 port 54896 ssh2: RSA SHA256:8EnoZWWmxQfrmVWtONiJoMXuv4iNTlOetYCFvLE13as
Jan 22 16:11:19 np0005592449.novalocal systemd-logind[796]: New session 5 of user zuul.
Jan 22 16:11:19 np0005592449.novalocal systemd[1]: Started Session 5 of User zuul.
Jan 22 16:11:19 np0005592449.novalocal sshd-session[8090]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 16:11:20 np0005592449.novalocal sudo[8117]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acpbwkefovvaqdkxmwpgatrselknrbcv ; /usr/bin/python3'
Jan 22 16:11:20 np0005592449.novalocal sudo[8117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:11:20 np0005592449.novalocal python3[8119]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 22 16:11:25 np0005592449.novalocal setsebool[8158]: The virt_use_nfs policy boolean was changed to 1 by root
Jan 22 16:11:25 np0005592449.novalocal setsebool[8158]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Jan 22 16:11:36 np0005592449.novalocal kernel: SELinux:  Converting 386 SID table entries...
Jan 22 16:11:36 np0005592449.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 16:11:36 np0005592449.novalocal kernel: SELinux:  policy capability open_perms=1
Jan 22 16:11:36 np0005592449.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 16:11:36 np0005592449.novalocal kernel: SELinux:  policy capability always_check_network=0
Jan 22 16:11:36 np0005592449.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 16:11:36 np0005592449.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 16:11:36 np0005592449.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 16:11:45 np0005592449.novalocal kernel: SELinux:  Converting 389 SID table entries...
Jan 22 16:11:45 np0005592449.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 16:11:45 np0005592449.novalocal kernel: SELinux:  policy capability open_perms=1
Jan 22 16:11:45 np0005592449.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 16:11:45 np0005592449.novalocal kernel: SELinux:  policy capability always_check_network=0
Jan 22 16:11:45 np0005592449.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 16:11:45 np0005592449.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 16:11:45 np0005592449.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 16:12:03 np0005592449.novalocal dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 22 16:12:03 np0005592449.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 16:12:03 np0005592449.novalocal systemd[1]: Starting man-db-cache-update.service...
Jan 22 16:12:03 np0005592449.novalocal systemd[1]: Reloading.
Jan 22 16:12:03 np0005592449.novalocal systemd-rc-local-generator[8933]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:12:03 np0005592449.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 16:12:04 np0005592449.novalocal sudo[8117]: pam_unix(sudo:session): session closed for user root
Jan 22 16:12:08 np0005592449.novalocal python3[13166]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163e3b-3c83-d341-8707-00000000000a-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:12:09 np0005592449.novalocal kernel: evm: overlay not supported
Jan 22 16:12:09 np0005592449.novalocal systemd[4323]: Starting D-Bus User Message Bus...
Jan 22 16:12:09 np0005592449.novalocal dbus-broker-launch[13912]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Jan 22 16:12:09 np0005592449.novalocal dbus-broker-launch[13912]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Jan 22 16:12:09 np0005592449.novalocal systemd[4323]: Started D-Bus User Message Bus.
Jan 22 16:12:09 np0005592449.novalocal dbus-broker-lau[13912]: Ready
Jan 22 16:12:09 np0005592449.novalocal systemd[4323]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 22 16:12:09 np0005592449.novalocal systemd[4323]: Created slice Slice /user.
Jan 22 16:12:09 np0005592449.novalocal systemd[4323]: podman-13893.scope: unit configures an IP firewall, but not running as root.
Jan 22 16:12:09 np0005592449.novalocal systemd[4323]: (This warning is only shown for the first unit using IP firewalling.)
Jan 22 16:12:09 np0005592449.novalocal systemd[4323]: Started podman-13893.scope.
Jan 22 16:12:09 np0005592449.novalocal systemd[4323]: Started podman-pause-4891a413.scope.
Jan 22 16:12:10 np0005592449.novalocal sudo[14026]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmejtjzslzumcjweknrbjtybytvcrhhe ; /usr/bin/python3'
Jan 22 16:12:10 np0005592449.novalocal sudo[14026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:12:10 np0005592449.novalocal python3[14028]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.113:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.113:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:12:10 np0005592449.novalocal python3[14028]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Jan 22 16:12:10 np0005592449.novalocal sudo[14026]: pam_unix(sudo:session): session closed for user root
Jan 22 16:12:10 np0005592449.novalocal sshd-session[8093]: Connection closed by 38.102.83.114 port 54896
Jan 22 16:12:10 np0005592449.novalocal sshd-session[8090]: pam_unix(sshd:session): session closed for user zuul
Jan 22 16:12:10 np0005592449.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Jan 22 16:12:10 np0005592449.novalocal systemd[1]: session-5.scope: Consumed 41.706s CPU time.
Jan 22 16:12:10 np0005592449.novalocal systemd-logind[796]: Session 5 logged out. Waiting for processes to exit.
Jan 22 16:12:10 np0005592449.novalocal systemd-logind[796]: Removed session 5.
Jan 22 16:12:29 np0005592449.novalocal sshd-session[21550]: Connection closed by 38.102.83.217 port 54252 [preauth]
Jan 22 16:12:29 np0005592449.novalocal sshd-session[21554]: Connection closed by 38.102.83.217 port 54256 [preauth]
Jan 22 16:12:29 np0005592449.novalocal sshd-session[21552]: Unable to negotiate with 38.102.83.217 port 54266: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Jan 22 16:12:29 np0005592449.novalocal sshd-session[21556]: Unable to negotiate with 38.102.83.217 port 54282: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Jan 22 16:12:29 np0005592449.novalocal sshd-session[21555]: Unable to negotiate with 38.102.83.217 port 54284: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Jan 22 16:12:33 np0005592449.novalocal irqbalance[790]: Cannot change IRQ 27 affinity: Operation not permitted
Jan 22 16:12:33 np0005592449.novalocal irqbalance[790]: IRQ 27 affinity is now unmanaged
Jan 22 16:12:35 np0005592449.novalocal sshd-session[23191]: Accepted publickey for zuul from 38.102.83.114 port 34400 ssh2: RSA SHA256:8EnoZWWmxQfrmVWtONiJoMXuv4iNTlOetYCFvLE13as
Jan 22 16:12:35 np0005592449.novalocal systemd-logind[796]: New session 6 of user zuul.
Jan 22 16:12:35 np0005592449.novalocal systemd[1]: Started Session 6 of User zuul.
Jan 22 16:12:35 np0005592449.novalocal sshd-session[23191]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 16:12:35 np0005592449.novalocal python3[23306]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEEiuG319vJ8cstCpmfS8IIk33E4dMI8LrlmfgaZAVC0Hi4vWRP0puF1l6cJ1YDj0KOaKiNGFfiTsuigEtCVbcU= zuul@np0005592448.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:12:36 np0005592449.novalocal sudo[23492]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odaxelictvgneckutxauzqnxyoaqpwil ; /usr/bin/python3'
Jan 22 16:12:36 np0005592449.novalocal sudo[23492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:12:36 np0005592449.novalocal python3[23500]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEEiuG319vJ8cstCpmfS8IIk33E4dMI8LrlmfgaZAVC0Hi4vWRP0puF1l6cJ1YDj0KOaKiNGFfiTsuigEtCVbcU= zuul@np0005592448.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:12:36 np0005592449.novalocal sudo[23492]: pam_unix(sudo:session): session closed for user root
Jan 22 16:12:36 np0005592449.novalocal sudo[23807]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjrtmdhgbifsilxsiicemufcccslsdqt ; /usr/bin/python3'
Jan 22 16:12:36 np0005592449.novalocal sudo[23807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:12:37 np0005592449.novalocal python3[23814]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005592449.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jan 22 16:12:37 np0005592449.novalocal useradd[23895]: new group: name=cloud-admin, GID=1002
Jan 22 16:12:37 np0005592449.novalocal useradd[23895]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Jan 22 16:12:37 np0005592449.novalocal sudo[23807]: pam_unix(sudo:session): session closed for user root
Jan 22 16:12:37 np0005592449.novalocal sudo[24018]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbsscazzyhcdyubodrwhmswosrsjxlus ; /usr/bin/python3'
Jan 22 16:12:37 np0005592449.novalocal sudo[24018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:12:37 np0005592449.novalocal python3[24028]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEEiuG319vJ8cstCpmfS8IIk33E4dMI8LrlmfgaZAVC0Hi4vWRP0puF1l6cJ1YDj0KOaKiNGFfiTsuigEtCVbcU= zuul@np0005592448.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:12:37 np0005592449.novalocal sudo[24018]: pam_unix(sudo:session): session closed for user root
Jan 22 16:12:37 np0005592449.novalocal sudo[24296]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjantjcldrfizieqiwkexsnnhmyfpjqv ; /usr/bin/python3'
Jan 22 16:12:37 np0005592449.novalocal sudo[24296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:12:37 np0005592449.novalocal python3[24310]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:12:37 np0005592449.novalocal sudo[24296]: pam_unix(sudo:session): session closed for user root
Jan 22 16:12:38 np0005592449.novalocal sudo[24592]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbpyutldyriqmzronbfnmbkkeiezfqex ; /usr/bin/python3'
Jan 22 16:12:38 np0005592449.novalocal sudo[24592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:12:38 np0005592449.novalocal python3[24606]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769098357.6364968-135-50619455730513/source _original_basename=tmpewsuikp1 follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:12:38 np0005592449.novalocal sudo[24592]: pam_unix(sudo:session): session closed for user root
Jan 22 16:12:38 np0005592449.novalocal sudo[24933]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjicnpxghmotidyqmfetqcvoppupohyv ; /usr/bin/python3'
Jan 22 16:12:38 np0005592449.novalocal sudo[24933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:12:39 np0005592449.novalocal python3[24942]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Jan 22 16:12:39 np0005592449.novalocal systemd[1]: Starting Hostname Service...
Jan 22 16:12:39 np0005592449.novalocal systemd[1]: Started Hostname Service.
Jan 22 16:12:39 np0005592449.novalocal systemd-hostnamed[25057]: Changed pretty hostname to 'compute-0'
Jan 22 16:12:39 compute-0 systemd-hostnamed[25057]: Hostname set to <compute-0> (static)
Jan 22 16:12:39 compute-0 NetworkManager[7206]: <info>  [1769098359.3051] hostname: static hostname changed from "np0005592449.novalocal" to "compute-0"
Jan 22 16:12:39 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 22 16:12:39 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 22 16:12:39 compute-0 sudo[24933]: pam_unix(sudo:session): session closed for user root
Jan 22 16:12:39 compute-0 sshd-session[23249]: Connection closed by 38.102.83.114 port 34400
Jan 22 16:12:39 compute-0 sshd-session[23191]: pam_unix(sshd:session): session closed for user zuul
Jan 22 16:12:39 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Jan 22 16:12:39 compute-0 systemd[1]: session-6.scope: Consumed 2.176s CPU time.
Jan 22 16:12:39 compute-0 systemd-logind[796]: Session 6 logged out. Waiting for processes to exit.
Jan 22 16:12:39 compute-0 systemd-logind[796]: Removed session 6.
Jan 22 16:12:49 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 22 16:12:50 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 16:12:50 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 22 16:12:50 compute-0 systemd[1]: man-db-cache-update.service: Consumed 55.837s CPU time.
Jan 22 16:12:50 compute-0 systemd[1]: run-r3cfd1fbc2fee482e96bd935ce23842b4.service: Deactivated successfully.
Jan 22 16:13:09 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 22 16:13:38 compute-0 sshd-session[29931]: Received disconnect from 45.227.254.170 port 60470:11:  [preauth]
Jan 22 16:13:38 compute-0 sshd-session[29931]: Disconnected from authenticating user root 45.227.254.170 port 60470 [preauth]
Jan 22 16:14:33 compute-0 sshd-session[29935]: error: kex_exchange_identification: read: Connection reset by peer
Jan 22 16:14:33 compute-0 sshd-session[29935]: Connection reset by 167.148.195.117 port 60157
Jan 22 16:15:57 compute-0 systemd[1]: Starting Cleanup of Temporary Directories...
Jan 22 16:15:57 compute-0 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Jan 22 16:15:57 compute-0 systemd[1]: Finished Cleanup of Temporary Directories.
Jan 22 16:15:57 compute-0 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Jan 22 16:16:18 compute-0 sshd-session[29940]: Accepted publickey for zuul from 38.102.83.217 port 40480 ssh2: RSA SHA256:8EnoZWWmxQfrmVWtONiJoMXuv4iNTlOetYCFvLE13as
Jan 22 16:16:18 compute-0 systemd-logind[796]: New session 7 of user zuul.
Jan 22 16:16:18 compute-0 systemd[1]: Started Session 7 of User zuul.
Jan 22 16:16:18 compute-0 sshd-session[29940]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 16:16:19 compute-0 python3[30016]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:16:20 compute-0 sudo[30130]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eejbttvktouadsauacwhkkjdzfqriark ; /usr/bin/python3'
Jan 22 16:16:20 compute-0 sudo[30130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:16:20 compute-0 python3[30132]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:16:20 compute-0 sudo[30130]: pam_unix(sudo:session): session closed for user root
Jan 22 16:16:20 compute-0 sudo[30203]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmuwmlkugizqeynljgfvmmzgeogxlyvz ; /usr/bin/python3'
Jan 22 16:16:20 compute-0 sudo[30203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:16:21 compute-0 python3[30205]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769098580.4046705-33555-174469894159754/source mode=0755 _original_basename=delorean.repo follow=False checksum=0f7c85cc67bf467c48edf98d5acc63e62d808324 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:16:21 compute-0 sudo[30203]: pam_unix(sudo:session): session closed for user root
Jan 22 16:16:21 compute-0 sudo[30229]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkclbuliqveckfcszwtlhnoalymmygys ; /usr/bin/python3'
Jan 22 16:16:21 compute-0 sudo[30229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:16:21 compute-0 python3[30231]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:16:21 compute-0 sudo[30229]: pam_unix(sudo:session): session closed for user root
Jan 22 16:16:21 compute-0 sudo[30302]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfacfevhjtqlftcjgesqirihrtxtwsbc ; /usr/bin/python3'
Jan 22 16:16:21 compute-0 sudo[30302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:16:21 compute-0 python3[30304]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769098580.4046705-33555-174469894159754/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:16:21 compute-0 sudo[30302]: pam_unix(sudo:session): session closed for user root
Jan 22 16:16:21 compute-0 sudo[30328]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymxoyemlftbaneiaihoufsyaxlmyshlm ; /usr/bin/python3'
Jan 22 16:16:21 compute-0 sudo[30328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:16:21 compute-0 python3[30330]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:16:21 compute-0 sudo[30328]: pam_unix(sudo:session): session closed for user root
Jan 22 16:16:22 compute-0 sudo[30401]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eamvuabhggnosnuejtqnqetrmaknufai ; /usr/bin/python3'
Jan 22 16:16:22 compute-0 sudo[30401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:16:22 compute-0 python3[30403]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769098580.4046705-33555-174469894159754/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:16:22 compute-0 sudo[30401]: pam_unix(sudo:session): session closed for user root
Jan 22 16:16:22 compute-0 sudo[30427]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmhlqsymqpbibwrvevnljormxzlasqdj ; /usr/bin/python3'
Jan 22 16:16:22 compute-0 sudo[30427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:16:22 compute-0 python3[30429]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:16:22 compute-0 sudo[30427]: pam_unix(sudo:session): session closed for user root
Jan 22 16:16:22 compute-0 sudo[30500]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmodubmadnjjmhyavqkotehvswqlekwn ; /usr/bin/python3'
Jan 22 16:16:22 compute-0 sudo[30500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:16:22 compute-0 python3[30502]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769098580.4046705-33555-174469894159754/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:16:22 compute-0 sudo[30500]: pam_unix(sudo:session): session closed for user root
Jan 22 16:16:23 compute-0 sudo[30526]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fiysowwelwkyjqzupsrfknllmxspjbnl ; /usr/bin/python3'
Jan 22 16:16:23 compute-0 sudo[30526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:16:23 compute-0 python3[30528]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:16:23 compute-0 sudo[30526]: pam_unix(sudo:session): session closed for user root
Jan 22 16:16:23 compute-0 sudo[30599]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-biirftifcdmcrxiuuscxhbnxxwxzprir ; /usr/bin/python3'
Jan 22 16:16:23 compute-0 sudo[30599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:16:23 compute-0 python3[30601]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769098580.4046705-33555-174469894159754/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:16:23 compute-0 sudo[30599]: pam_unix(sudo:session): session closed for user root
Jan 22 16:16:23 compute-0 sudo[30625]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osjpvyjloceobwexyutmgctvlfnuvkjz ; /usr/bin/python3'
Jan 22 16:16:23 compute-0 sudo[30625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:16:23 compute-0 python3[30627]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:16:23 compute-0 sudo[30625]: pam_unix(sudo:session): session closed for user root
Jan 22 16:16:23 compute-0 sudo[30698]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwogxynxhqcdwzpttkkbqylxbwbcrhgb ; /usr/bin/python3'
Jan 22 16:16:23 compute-0 sudo[30698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:16:24 compute-0 python3[30700]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769098580.4046705-33555-174469894159754/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:16:24 compute-0 sudo[30698]: pam_unix(sudo:session): session closed for user root
Jan 22 16:16:24 compute-0 sudo[30724]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-koexwxcwjzvhcvkqqairkyobprrejaaa ; /usr/bin/python3'
Jan 22 16:16:24 compute-0 sudo[30724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:16:24 compute-0 python3[30726]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:16:24 compute-0 sudo[30724]: pam_unix(sudo:session): session closed for user root
Jan 22 16:16:24 compute-0 sudo[30797]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-octsmlkbonvzjzjqjmtomlfmbjwxabkh ; /usr/bin/python3'
Jan 22 16:16:24 compute-0 sudo[30797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:16:24 compute-0 python3[30799]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769098580.4046705-33555-174469894159754/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=2583a70b3ee76a9837350b0837bc004a8e52405c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:16:24 compute-0 sudo[30797]: pam_unix(sudo:session): session closed for user root
Jan 22 16:16:26 compute-0 sshd-session[30825]: Connection closed by 192.168.122.11 port 55566 [preauth]
Jan 22 16:16:26 compute-0 sshd-session[30824]: Connection closed by 192.168.122.11 port 55550 [preauth]
Jan 22 16:16:26 compute-0 sshd-session[30826]: Unable to negotiate with 192.168.122.11 port 55576: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Jan 22 16:16:26 compute-0 sshd-session[30827]: Unable to negotiate with 192.168.122.11 port 55582: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Jan 22 16:16:26 compute-0 sshd-session[30828]: Unable to negotiate with 192.168.122.11 port 55586: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Jan 22 16:16:36 compute-0 python3[30857]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:20:57 compute-0 sshd-session[30861]: Received disconnect from 45.148.10.151 port 60468:11:  [preauth]
Jan 22 16:20:57 compute-0 sshd-session[30861]: Disconnected from authenticating user root 45.148.10.151 port 60468 [preauth]
Jan 22 16:21:03 compute-0 sshd-session[30863]: Connection reset by 198.235.24.54 port 63480 [preauth]
Jan 22 16:21:36 compute-0 sshd-session[29943]: Received disconnect from 38.102.83.217 port 40480:11: disconnected by user
Jan 22 16:21:36 compute-0 sshd-session[29943]: Disconnected from user zuul 38.102.83.217 port 40480
Jan 22 16:21:36 compute-0 sshd-session[29940]: pam_unix(sshd:session): session closed for user zuul
Jan 22 16:21:36 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Jan 22 16:21:36 compute-0 systemd[1]: session-7.scope: Consumed 4.683s CPU time.
Jan 22 16:21:36 compute-0 systemd-logind[796]: Session 7 logged out. Waiting for processes to exit.
Jan 22 16:21:36 compute-0 systemd-logind[796]: Removed session 7.
Jan 22 16:27:19 compute-0 sshd-session[30867]: Invalid user pi from 112.164.20.69 port 55810
Jan 22 16:27:20 compute-0 sshd-session[30867]: Connection closed by invalid user pi 112.164.20.69 port 55810 [preauth]
Jan 22 16:27:20 compute-0 sshd-session[30869]: Invalid user pi from 112.164.20.69 port 55816
Jan 22 16:27:20 compute-0 sshd-session[30869]: Connection closed by invalid user pi 112.164.20.69 port 55816 [preauth]
Jan 22 16:28:20 compute-0 sshd-session[30873]: Received disconnect from 45.148.10.152 port 38906:11:  [preauth]
Jan 22 16:28:20 compute-0 sshd-session[30873]: Disconnected from authenticating user root 45.148.10.152 port 38906 [preauth]
Jan 22 16:29:13 compute-0 sshd-session[30875]: Accepted publickey for zuul from 192.168.122.30 port 39356 ssh2: ECDSA SHA256:XN6iZwzcCiCbU0l3vCSWxZPd4ElPyx+ZEvhzj+S5SUw
Jan 22 16:29:13 compute-0 systemd-logind[796]: New session 8 of user zuul.
Jan 22 16:29:13 compute-0 systemd[1]: Started Session 8 of User zuul.
Jan 22 16:29:13 compute-0 sshd-session[30875]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 16:29:14 compute-0 python3.9[31028]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:29:15 compute-0 sudo[31207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kthxwpemadxjegugthvwcrnpdcevncqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099354.876444-27-231851335694608/AnsiballZ_command.py'
Jan 22 16:29:15 compute-0 sudo[31207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:29:15 compute-0 python3.9[31209]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:29:23 compute-0 irqbalance[790]: Cannot change IRQ 26 affinity: Operation not permitted
Jan 22 16:29:23 compute-0 irqbalance[790]: IRQ 26 affinity is now unmanaged
Jan 22 16:29:23 compute-0 sudo[31207]: pam_unix(sudo:session): session closed for user root
Jan 22 16:29:24 compute-0 sshd-session[30878]: Connection closed by 192.168.122.30 port 39356
Jan 22 16:29:24 compute-0 sshd-session[30875]: pam_unix(sshd:session): session closed for user zuul
Jan 22 16:29:24 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Jan 22 16:29:24 compute-0 systemd[1]: session-8.scope: Consumed 7.977s CPU time.
Jan 22 16:29:24 compute-0 systemd-logind[796]: Session 8 logged out. Waiting for processes to exit.
Jan 22 16:29:24 compute-0 systemd-logind[796]: Removed session 8.
Jan 22 16:29:29 compute-0 sshd-session[31266]: Accepted publickey for zuul from 192.168.122.30 port 40010 ssh2: ECDSA SHA256:XN6iZwzcCiCbU0l3vCSWxZPd4ElPyx+ZEvhzj+S5SUw
Jan 22 16:29:29 compute-0 systemd-logind[796]: New session 9 of user zuul.
Jan 22 16:29:29 compute-0 systemd[1]: Started Session 9 of User zuul.
Jan 22 16:29:29 compute-0 sshd-session[31266]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 16:29:30 compute-0 python3.9[31419]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:29:31 compute-0 sshd-session[31269]: Connection closed by 192.168.122.30 port 40010
Jan 22 16:29:31 compute-0 sshd-session[31266]: pam_unix(sshd:session): session closed for user zuul
Jan 22 16:29:31 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Jan 22 16:29:31 compute-0 systemd-logind[796]: Session 9 logged out. Waiting for processes to exit.
Jan 22 16:29:31 compute-0 systemd-logind[796]: Removed session 9.
Jan 22 16:29:47 compute-0 sshd-session[31447]: Accepted publickey for zuul from 192.168.122.30 port 49258 ssh2: ECDSA SHA256:XN6iZwzcCiCbU0l3vCSWxZPd4ElPyx+ZEvhzj+S5SUw
Jan 22 16:29:47 compute-0 systemd-logind[796]: New session 10 of user zuul.
Jan 22 16:29:47 compute-0 systemd[1]: Started Session 10 of User zuul.
Jan 22 16:29:47 compute-0 sshd-session[31447]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 16:29:47 compute-0 python3.9[31600]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 22 16:29:49 compute-0 python3.9[31774]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:29:50 compute-0 sudo[31924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hospfkdtwcxgcfiixftxflfilqwyhlzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099389.505196-40-174732128126800/AnsiballZ_command.py'
Jan 22 16:29:50 compute-0 sudo[31924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:29:50 compute-0 python3.9[31926]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:29:50 compute-0 sudo[31924]: pam_unix(sudo:session): session closed for user root
Jan 22 16:29:51 compute-0 sudo[32077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kigorxyedpfriqejrsgirgqptvgjcfov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099390.7544897-52-190286232250421/AnsiballZ_stat.py'
Jan 22 16:29:51 compute-0 sudo[32077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:29:51 compute-0 python3.9[32079]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:29:51 compute-0 sudo[32077]: pam_unix(sudo:session): session closed for user root
Jan 22 16:29:52 compute-0 sudo[32229]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dltpmwwuqrdcoefvwmschiedydasoftx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099391.7074807-60-15035665821778/AnsiballZ_file.py'
Jan 22 16:29:52 compute-0 sudo[32229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:29:52 compute-0 python3.9[32231]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:29:52 compute-0 sudo[32229]: pam_unix(sudo:session): session closed for user root
Jan 22 16:29:52 compute-0 sudo[32381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juedkntzgqqzfvnqywvuzgksbkwcrjnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099392.5592465-68-168338768070238/AnsiballZ_stat.py'
Jan 22 16:29:52 compute-0 sudo[32381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:29:52 compute-0 python3.9[32383]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:29:52 compute-0 sudo[32381]: pam_unix(sudo:session): session closed for user root
Jan 22 16:29:53 compute-0 sudo[32504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psykoiktueumuzmacxgmyfogqvivseuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099392.5592465-68-168338768070238/AnsiballZ_copy.py'
Jan 22 16:29:53 compute-0 sudo[32504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:29:53 compute-0 python3.9[32506]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769099392.5592465-68-168338768070238/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:29:53 compute-0 sudo[32504]: pam_unix(sudo:session): session closed for user root
Jan 22 16:29:54 compute-0 sudo[32656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iteoazbhtqbifvhygzjotiszxkktkenq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099393.9060504-83-135460408197175/AnsiballZ_setup.py'
Jan 22 16:29:54 compute-0 sudo[32656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:29:54 compute-0 python3.9[32658]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:29:54 compute-0 sudo[32656]: pam_unix(sudo:session): session closed for user root
Jan 22 16:29:55 compute-0 sudo[32812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxlesambhnawwowwcanxzmydfokfydrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099394.7695205-91-199208308134154/AnsiballZ_file.py'
Jan 22 16:29:55 compute-0 sudo[32812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:29:55 compute-0 python3.9[32814]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:29:55 compute-0 sudo[32812]: pam_unix(sudo:session): session closed for user root
Jan 22 16:29:55 compute-0 sudo[32964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhkktzlmigwtnkqkrnoarplbuyiwstrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099395.4520588-100-29293235675392/AnsiballZ_file.py'
Jan 22 16:29:55 compute-0 sudo[32964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:29:55 compute-0 python3.9[32966]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:29:55 compute-0 sudo[32964]: pam_unix(sudo:session): session closed for user root
Jan 22 16:29:56 compute-0 python3.9[33116]: ansible-ansible.builtin.service_facts Invoked
Jan 22 16:30:03 compute-0 python3.9[33369]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:30:04 compute-0 python3.9[33519]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:30:05 compute-0 python3.9[33673]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:30:06 compute-0 sudo[33829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxipsgznyfqmqvkcyuusynfxhfyfyqbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099405.9688761-148-269750017797359/AnsiballZ_setup.py'
Jan 22 16:30:06 compute-0 sudo[33829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:30:06 compute-0 python3.9[33831]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 16:30:06 compute-0 sudo[33829]: pam_unix(sudo:session): session closed for user root
Jan 22 16:30:07 compute-0 sudo[33913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrblrvdxghxmeziudmehveppvqelqyfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099405.9688761-148-269750017797359/AnsiballZ_dnf.py'
Jan 22 16:30:07 compute-0 sudo[33913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:30:07 compute-0 python3.9[33915]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 16:30:52 compute-0 systemd[1]: Reloading.
Jan 22 16:30:52 compute-0 systemd-rc-local-generator[34113]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:30:52 compute-0 systemd[1]: Starting dnf makecache...
Jan 22 16:30:52 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Jan 22 16:30:52 compute-0 dnf[34123]: Failed determining last makecache time.
Jan 22 16:30:52 compute-0 dnf[34123]: delorean-openstack-barbican-42b4c41831408a8e323 133 kB/s | 3.0 kB     00:00
Jan 22 16:30:52 compute-0 dnf[34123]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 187 kB/s | 3.0 kB     00:00
Jan 22 16:30:52 compute-0 dnf[34123]: delorean-openstack-cinder-1c00d6490d88e436f26ef 193 kB/s | 3.0 kB     00:00
Jan 22 16:30:53 compute-0 dnf[34123]: delorean-python-stevedore-c4acc5639fd2329372142 194 kB/s | 3.0 kB     00:00
Jan 22 16:30:53 compute-0 dnf[34123]: delorean-python-cloudkitty-tests-tempest-2c80f8 194 kB/s | 3.0 kB     00:00
Jan 22 16:30:53 compute-0 dnf[34123]: delorean-os-refresh-config-9bfc52b5049be2d8de61 181 kB/s | 3.0 kB     00:00
Jan 22 16:30:53 compute-0 systemd[1]: Reloading.
Jan 22 16:30:53 compute-0 dnf[34123]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 193 kB/s | 3.0 kB     00:00
Jan 22 16:30:53 compute-0 dnf[34123]: delorean-python-designate-tests-tempest-347fdbc 203 kB/s | 3.0 kB     00:00
Jan 22 16:30:53 compute-0 dnf[34123]: delorean-openstack-glance-1fd12c29b339f30fe823e 185 kB/s | 3.0 kB     00:00
Jan 22 16:30:53 compute-0 dnf[34123]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 194 kB/s | 3.0 kB     00:00
Jan 22 16:30:53 compute-0 dnf[34123]: delorean-openstack-manila-3c01b7181572c95dac462 189 kB/s | 3.0 kB     00:00
Jan 22 16:30:53 compute-0 systemd-rc-local-generator[34162]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:30:53 compute-0 dnf[34123]: delorean-python-whitebox-neutron-tests-tempest- 183 kB/s | 3.0 kB     00:00
Jan 22 16:30:53 compute-0 dnf[34123]: delorean-openstack-octavia-ba397f07a7331190208c 182 kB/s | 3.0 kB     00:00
Jan 22 16:30:53 compute-0 dnf[34123]: delorean-openstack-watcher-c014f81a8647287f6dcc 195 kB/s | 3.0 kB     00:00
Jan 22 16:30:53 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Jan 22 16:30:53 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Jan 22 16:30:53 compute-0 systemd[1]: Reloading.
Jan 22 16:30:53 compute-0 systemd-rc-local-generator[34210]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:30:53 compute-0 dnf[34123]: delorean-ansible-config_template-5ccaa22121a7ff 9.0 kB/s | 3.0 kB     00:00
Jan 22 16:30:53 compute-0 dnf[34123]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 133 kB/s | 3.0 kB     00:00
Jan 22 16:30:53 compute-0 dnf[34123]: delorean-openstack-swift-dc98a8463506ac520c469a 132 kB/s | 3.0 kB     00:00
Jan 22 16:30:53 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Jan 22 16:30:53 compute-0 dnf[34123]: delorean-python-tempestconf-8515371b7cceebd4282 171 kB/s | 3.0 kB     00:00
Jan 22 16:30:53 compute-0 dnf[34123]: delorean-openstack-heat-ui-013accbfd179753bc3f0 198 kB/s | 3.0 kB     00:00
Jan 22 16:30:53 compute-0 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Jan 22 16:30:53 compute-0 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Jan 22 16:30:54 compute-0 dnf[34123]: CentOS Stream 9 - BaseOS                         18 kB/s | 6.7 kB     00:00
Jan 22 16:30:54 compute-0 dnf[34123]: CentOS Stream 9 - AppStream                      29 kB/s | 6.8 kB     00:00
Jan 22 16:30:54 compute-0 dnf[34123]: CentOS Stream 9 - CRB                            29 kB/s | 6.6 kB     00:00
Jan 22 16:30:54 compute-0 dnf[34123]: CentOS Stream 9 - Extras packages                30 kB/s | 7.3 kB     00:00
Jan 22 16:30:54 compute-0 dnf[34123]: dlrn-antelope-testing                           165 kB/s | 3.0 kB     00:00
Jan 22 16:30:54 compute-0 dnf[34123]: dlrn-antelope-build-deps                        175 kB/s | 3.0 kB     00:00
Jan 22 16:30:55 compute-0 dnf[34123]: centos9-rabbitmq                                125 kB/s | 3.0 kB     00:00
Jan 22 16:30:55 compute-0 dnf[34123]: centos9-storage                                 110 kB/s | 3.0 kB     00:00
Jan 22 16:30:55 compute-0 dnf[34123]: centos9-opstools                                103 kB/s | 3.0 kB     00:00
Jan 22 16:30:55 compute-0 dnf[34123]: NFV SIG OpenvSwitch                             105 kB/s | 3.0 kB     00:00
Jan 22 16:30:55 compute-0 dnf[34123]: repo-setup-centos-appstream                     163 kB/s | 4.4 kB     00:00
Jan 22 16:30:55 compute-0 dnf[34123]: repo-setup-centos-baseos                        125 kB/s | 3.9 kB     00:00
Jan 22 16:30:55 compute-0 dnf[34123]: repo-setup-centos-highavailability              159 kB/s | 3.9 kB     00:00
Jan 22 16:30:55 compute-0 dnf[34123]: repo-setup-centos-powertools                    186 kB/s | 4.3 kB     00:00
Jan 22 16:30:55 compute-0 dnf[34123]: Extra Packages for Enterprise Linux 9 - x86_64  144 kB/s |  27 kB     00:00
Jan 22 16:30:56 compute-0 dnf[34123]: Metadata cache created.
Jan 22 16:30:56 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 22 16:30:56 compute-0 systemd[1]: Finished dnf makecache.
Jan 22 16:30:56 compute-0 systemd[1]: dnf-makecache.service: Consumed 1.789s CPU time.
Jan 22 16:31:59 compute-0 kernel: SELinux:  Converting 2724 SID table entries...
Jan 22 16:31:59 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 16:31:59 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 22 16:31:59 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 16:31:59 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 22 16:31:59 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 16:31:59 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 16:31:59 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 16:31:59 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Jan 22 16:32:00 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 16:32:00 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 22 16:32:00 compute-0 systemd[1]: Reloading.
Jan 22 16:32:00 compute-0 systemd-rc-local-generator[34571]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:32:00 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 16:32:00 compute-0 sudo[33913]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:01 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 16:32:01 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 22 16:32:01 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.182s CPU time.
Jan 22 16:32:01 compute-0 systemd[1]: run-r6a112c661b8c4fcdb66c49c81a136ee8.service: Deactivated successfully.
Jan 22 16:32:01 compute-0 sudo[35488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kapgnkwewszmgowfhfcsgdbtjkuyfmdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099520.9807572-160-46267844647650/AnsiballZ_command.py'
Jan 22 16:32:01 compute-0 sudo[35488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:32:01 compute-0 python3.9[35490]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:32:02 compute-0 sudo[35488]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:03 compute-0 sudo[35769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aljekntmibipdqitubfnqcjdiqwsufiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099522.6956775-168-98052556033274/AnsiballZ_selinux.py'
Jan 22 16:32:03 compute-0 sudo[35769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:32:03 compute-0 python3.9[35771]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 22 16:32:03 compute-0 sudo[35769]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:04 compute-0 sudo[35921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhdutdhbkodijnaddxkxxpkucovirdpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099524.09919-179-53945496639137/AnsiballZ_command.py'
Jan 22 16:32:04 compute-0 sudo[35921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:32:04 compute-0 python3.9[35923]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 22 16:32:05 compute-0 sudo[35921]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:06 compute-0 sudo[36074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjxhiktszlhxaerjuhrggumvkvoiuapb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099525.773653-187-87971587015027/AnsiballZ_file.py'
Jan 22 16:32:06 compute-0 sudo[36074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:32:07 compute-0 python3.9[36076]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:32:07 compute-0 sudo[36074]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:08 compute-0 sudo[36226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxcdwwelcgagwgxhyeisdewupjeyceno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099527.4764488-195-26780626186155/AnsiballZ_mount.py'
Jan 22 16:32:08 compute-0 sudo[36226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:32:08 compute-0 python3.9[36228]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 22 16:32:08 compute-0 sudo[36226]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:09 compute-0 sudo[36378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqftnntwwibhgjueywceawjqztmjkocj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099529.1596587-223-126748367023305/AnsiballZ_file.py'
Jan 22 16:32:09 compute-0 sudo[36378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:32:09 compute-0 python3.9[36380]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:32:09 compute-0 sudo[36378]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:10 compute-0 sudo[36530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djptmkxixnnraevaitiiigxvcxkdfnzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099529.9122467-231-127657393534240/AnsiballZ_stat.py'
Jan 22 16:32:10 compute-0 sudo[36530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:32:10 compute-0 python3.9[36532]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:32:10 compute-0 sudo[36530]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:10 compute-0 sudo[36653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzjzurqeevlyznbrmvxgpswkwkcykanz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099529.9122467-231-127657393534240/AnsiballZ_copy.py'
Jan 22 16:32:10 compute-0 sudo[36653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:32:13 compute-0 python3.9[36655]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099529.9122467-231-127657393534240/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=62f56ebb86b7819c5ce2b2a14a69280df383a076 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:32:13 compute-0 sudo[36653]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:14 compute-0 sudo[36805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdzavkqjzaumbprwjhqnxttijvycqakl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099534.147927-255-25978447371273/AnsiballZ_stat.py'
Jan 22 16:32:14 compute-0 sudo[36805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:32:17 compute-0 python3.9[36807]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:32:17 compute-0 sudo[36805]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:18 compute-0 sudo[36957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbtbgbmvqyvvqkavhhihzzdaaecnwhob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099537.79412-263-272927332324194/AnsiballZ_command.py'
Jan 22 16:32:18 compute-0 sudo[36957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:32:18 compute-0 python3.9[36959]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:32:18 compute-0 sudo[36957]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:19 compute-0 sudo[37110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bngrquspzmnukvzntxaunnethbyffnyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099538.643125-271-165918156145466/AnsiballZ_file.py'
Jan 22 16:32:19 compute-0 sudo[37110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:32:19 compute-0 python3.9[37112]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:32:19 compute-0 sudo[37110]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:20 compute-0 sudo[37262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynwymcvhgrdqpypafpdbimfkwlanicki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099539.6050322-282-270077064675518/AnsiballZ_getent.py'
Jan 22 16:32:20 compute-0 sudo[37262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:32:20 compute-0 python3.9[37264]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 22 16:32:20 compute-0 sudo[37262]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:20 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 16:32:21 compute-0 sudo[37416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzoontvjqnjwfnacpcddnxponwhxdpyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099540.4981608-290-178352824513913/AnsiballZ_group.py'
Jan 22 16:32:21 compute-0 sudo[37416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:32:21 compute-0 python3.9[37418]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 22 16:32:21 compute-0 groupadd[37419]: group added to /etc/group: name=qemu, GID=107
Jan 22 16:32:21 compute-0 groupadd[37419]: group added to /etc/gshadow: name=qemu
Jan 22 16:32:21 compute-0 groupadd[37419]: new group: name=qemu, GID=107
Jan 22 16:32:21 compute-0 sudo[37416]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:22 compute-0 sudo[37574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oppifohnoszvhmjfrnpxpycsudarnlsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099541.5443683-298-47964275893678/AnsiballZ_user.py'
Jan 22 16:32:22 compute-0 sudo[37574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:32:22 compute-0 python3.9[37576]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 22 16:32:22 compute-0 useradd[37578]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Jan 22 16:32:22 compute-0 sudo[37574]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:22 compute-0 sudo[37734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgrlntuezwcighhfiljhzpveaerpiazv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099542.5843756-306-147013304724079/AnsiballZ_getent.py'
Jan 22 16:32:22 compute-0 sudo[37734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:32:23 compute-0 python3.9[37736]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 22 16:32:23 compute-0 sudo[37734]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:23 compute-0 sudo[37887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdqptwomugbwwztyacoutigfijfglecp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099543.2783449-314-255872346483647/AnsiballZ_group.py'
Jan 22 16:32:23 compute-0 sudo[37887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:32:23 compute-0 python3.9[37889]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 22 16:32:23 compute-0 groupadd[37890]: group added to /etc/group: name=hugetlbfs, GID=42477
Jan 22 16:32:23 compute-0 groupadd[37890]: group added to /etc/gshadow: name=hugetlbfs
Jan 22 16:32:23 compute-0 groupadd[37890]: new group: name=hugetlbfs, GID=42477
Jan 22 16:32:23 compute-0 sudo[37887]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:24 compute-0 sudo[38045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckthkxvhrbrnnzdkvlmvjigghnnndvnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099544.057353-323-234406391400190/AnsiballZ_file.py'
Jan 22 16:32:24 compute-0 sudo[38045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:32:24 compute-0 python3.9[38047]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 22 16:32:24 compute-0 sudo[38045]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:25 compute-0 sudo[38197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcerbiprqlhdeijwodlylamdnurmztbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099544.8547635-334-249275471000696/AnsiballZ_dnf.py'
Jan 22 16:32:25 compute-0 sudo[38197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:32:25 compute-0 python3.9[38199]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 16:32:28 compute-0 sudo[38197]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:29 compute-0 sudo[38350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkmccmryypmbsxnbalgprbqnbtfglypq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099548.974551-342-81349822567507/AnsiballZ_file.py'
Jan 22 16:32:29 compute-0 sudo[38350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:32:29 compute-0 python3.9[38352]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:32:29 compute-0 sudo[38350]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:29 compute-0 sudo[38502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xorsiooghjedrnkolnvzaujtcnmqxepw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099549.578764-350-268406584664232/AnsiballZ_stat.py'
Jan 22 16:32:29 compute-0 sudo[38502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:32:30 compute-0 python3.9[38504]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:32:30 compute-0 sudo[38502]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:30 compute-0 sudo[38625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlytorwoyqxxrviawvpdgbcmpjrnsqop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099549.578764-350-268406584664232/AnsiballZ_copy.py'
Jan 22 16:32:30 compute-0 sudo[38625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:32:30 compute-0 python3.9[38627]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769099549.578764-350-268406584664232/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:32:30 compute-0 sudo[38625]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:31 compute-0 sudo[38777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oodpmmurjsshedpabnrvnjyugiueysup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099550.8580647-365-81885084819156/AnsiballZ_systemd.py'
Jan 22 16:32:31 compute-0 sudo[38777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:32:31 compute-0 python3.9[38779]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 16:32:31 compute-0 systemd[1]: Starting Load Kernel Modules...
Jan 22 16:32:31 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 22 16:32:31 compute-0 kernel: Bridge firewalling registered
Jan 22 16:32:31 compute-0 systemd-modules-load[38783]: Inserted module 'br_netfilter'
Jan 22 16:32:31 compute-0 systemd[1]: Finished Load Kernel Modules.
Jan 22 16:32:31 compute-0 sudo[38777]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:32 compute-0 sudo[38937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlevfodjjoezvybjusbygsxmequvudqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099552.0627232-373-207556361755882/AnsiballZ_stat.py'
Jan 22 16:32:32 compute-0 sudo[38937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:32:32 compute-0 python3.9[38939]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:32:32 compute-0 sudo[38937]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:32 compute-0 sudo[39060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rineaqogtqjtnitddrbpbfxknabbuxhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099552.0627232-373-207556361755882/AnsiballZ_copy.py'
Jan 22 16:32:32 compute-0 sudo[39060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:32:33 compute-0 python3.9[39062]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769099552.0627232-373-207556361755882/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:32:33 compute-0 sudo[39060]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:33 compute-0 sudo[39212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ishxfvyinrkjwahwxtigmqldjzgbuxhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099553.4583435-391-74157198330712/AnsiballZ_dnf.py'
Jan 22 16:32:33 compute-0 sudo[39212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:32:33 compute-0 python3.9[39214]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 16:32:38 compute-0 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Jan 22 16:32:38 compute-0 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Jan 22 16:32:39 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 16:32:39 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 22 16:32:39 compute-0 systemd[1]: Reloading.
Jan 22 16:32:39 compute-0 systemd-rc-local-generator[39280]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:32:39 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 16:32:40 compute-0 sudo[39212]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:41 compute-0 python3.9[41271]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:32:42 compute-0 python3.9[42166]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 22 16:32:43 compute-0 python3.9[42867]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:32:43 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 16:32:43 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 22 16:32:43 compute-0 systemd[1]: man-db-cache-update.service: Consumed 5.209s CPU time.
Jan 22 16:32:43 compute-0 systemd[1]: run-rd303eb42f3f24282a09c7860a3190543.service: Deactivated successfully.
Jan 22 16:32:43 compute-0 sudo[43386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhjxallupfnlgnoltafylisoaooiqipn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099563.5935133-430-27343949369939/AnsiballZ_command.py'
Jan 22 16:32:43 compute-0 sudo[43386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:32:44 compute-0 python3.9[43388]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:32:44 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 22 16:32:44 compute-0 systemd[1]: Starting Authorization Manager...
Jan 22 16:32:44 compute-0 polkitd[43605]: Started polkitd version 0.117
Jan 22 16:32:44 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 22 16:32:44 compute-0 polkitd[43605]: Loading rules from directory /etc/polkit-1/rules.d
Jan 22 16:32:44 compute-0 polkitd[43605]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 22 16:32:44 compute-0 polkitd[43605]: Finished loading, compiling and executing 2 rules
Jan 22 16:32:44 compute-0 polkitd[43605]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Jan 22 16:32:44 compute-0 systemd[1]: Started Authorization Manager.
Jan 22 16:32:44 compute-0 sudo[43386]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:45 compute-0 sudo[43773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhezgabxdztlxnvewtfjutenlnnjwaxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099565.04645-439-111263987116730/AnsiballZ_systemd.py'
Jan 22 16:32:45 compute-0 sudo[43773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:32:45 compute-0 python3.9[43775]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:32:45 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 22 16:32:45 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Jan 22 16:32:45 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 22 16:32:45 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 22 16:32:45 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 22 16:32:46 compute-0 sudo[43773]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:46 compute-0 python3.9[43936]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 22 16:32:48 compute-0 sudo[44086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zawpzahtsmewsksoxgxpnwpqcutftoem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099568.5113666-496-265972210235836/AnsiballZ_systemd.py'
Jan 22 16:32:48 compute-0 sudo[44086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:32:49 compute-0 python3.9[44088]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:32:49 compute-0 systemd[1]: Reloading.
Jan 22 16:32:49 compute-0 systemd-rc-local-generator[44119]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:32:49 compute-0 sudo[44086]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:49 compute-0 sudo[44275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vustphptgjeokfitemwqvvfridxgwebt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099569.6592584-496-143550763492384/AnsiballZ_systemd.py'
Jan 22 16:32:49 compute-0 sudo[44275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:32:50 compute-0 python3.9[44277]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:32:50 compute-0 systemd[1]: Reloading.
Jan 22 16:32:50 compute-0 systemd-rc-local-generator[44299]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:32:50 compute-0 sudo[44275]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:51 compute-0 sudo[44463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kuhipifwoqefzhrweaofhisdmpxletkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099570.7945454-512-38689104034689/AnsiballZ_command.py'
Jan 22 16:32:51 compute-0 sudo[44463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:32:51 compute-0 python3.9[44465]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:32:51 compute-0 sudo[44463]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:51 compute-0 sudo[44616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibicggihdoktygjgxycvxedyrtcutafk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099571.537133-520-246414770571129/AnsiballZ_command.py'
Jan 22 16:32:51 compute-0 sudo[44616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:32:52 compute-0 python3.9[44618]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:32:52 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Jan 22 16:32:52 compute-0 sudo[44616]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:52 compute-0 sudo[44769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpltdeckawfoesbshwacoospwblvokwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099572.2399817-528-102209572299780/AnsiballZ_command.py'
Jan 22 16:32:52 compute-0 sudo[44769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:32:52 compute-0 python3.9[44771]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:32:54 compute-0 sudo[44769]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:54 compute-0 sudo[44931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxpokfbaxsxfigvqwggcawsyawglzbak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099574.346627-536-123181804914562/AnsiballZ_command.py'
Jan 22 16:32:54 compute-0 sudo[44931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:32:54 compute-0 python3.9[44933]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:32:54 compute-0 sudo[44931]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:55 compute-0 sudo[45084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksrnshkxhaskzutsqhzpxjitzmlscijd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099575.1089072-544-78352594464799/AnsiballZ_systemd.py'
Jan 22 16:32:55 compute-0 sudo[45084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:32:55 compute-0 python3.9[45086]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 16:32:55 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 22 16:32:55 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Jan 22 16:32:55 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Jan 22 16:32:55 compute-0 systemd[1]: Starting Apply Kernel Variables...
Jan 22 16:32:55 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 22 16:32:55 compute-0 systemd[1]: Finished Apply Kernel Variables.
Jan 22 16:32:55 compute-0 sudo[45084]: pam_unix(sudo:session): session closed for user root
Jan 22 16:32:56 compute-0 sshd-session[31450]: Connection closed by 192.168.122.30 port 49258
Jan 22 16:32:56 compute-0 sshd-session[31447]: pam_unix(sshd:session): session closed for user zuul
Jan 22 16:32:56 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Jan 22 16:32:56 compute-0 systemd[1]: session-10.scope: Consumed 2min 16.929s CPU time.
Jan 22 16:32:56 compute-0 systemd-logind[796]: Session 10 logged out. Waiting for processes to exit.
Jan 22 16:32:56 compute-0 systemd-logind[796]: Removed session 10.
Jan 22 16:33:01 compute-0 sshd-session[45117]: Accepted publickey for zuul from 192.168.122.30 port 49690 ssh2: ECDSA SHA256:XN6iZwzcCiCbU0l3vCSWxZPd4ElPyx+ZEvhzj+S5SUw
Jan 22 16:33:01 compute-0 systemd-logind[796]: New session 11 of user zuul.
Jan 22 16:33:01 compute-0 systemd[1]: Started Session 11 of User zuul.
Jan 22 16:33:01 compute-0 sshd-session[45117]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 16:33:02 compute-0 python3.9[45270]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:33:03 compute-0 python3.9[45424]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:33:05 compute-0 sudo[45578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzmhlhmoslgcezelbyshedyasiknvpoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099584.5402517-45-77170469391813/AnsiballZ_command.py'
Jan 22 16:33:05 compute-0 sudo[45578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:33:05 compute-0 python3.9[45580]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:33:05 compute-0 sudo[45578]: pam_unix(sudo:session): session closed for user root
Jan 22 16:33:06 compute-0 python3.9[45731]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:33:07 compute-0 sudo[45885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-entykxtbthjtkpgebptinnuzoocitcup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099586.6551032-65-85480291990532/AnsiballZ_setup.py'
Jan 22 16:33:07 compute-0 sudo[45885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:33:07 compute-0 python3.9[45887]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 16:33:07 compute-0 sudo[45885]: pam_unix(sudo:session): session closed for user root
Jan 22 16:33:08 compute-0 sudo[45969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flnwldorkukakjgrtnrugqpdeikxztxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099586.6551032-65-85480291990532/AnsiballZ_dnf.py'
Jan 22 16:33:08 compute-0 sudo[45969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:33:08 compute-0 python3.9[45971]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 16:33:09 compute-0 sudo[45969]: pam_unix(sudo:session): session closed for user root
Jan 22 16:33:10 compute-0 sudo[46122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofpegtmnsizsyluntnocxfuwoxqplspb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099589.6973374-77-43786313435126/AnsiballZ_setup.py'
Jan 22 16:33:10 compute-0 sudo[46122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:33:10 compute-0 python3.9[46124]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 16:33:10 compute-0 sudo[46122]: pam_unix(sudo:session): session closed for user root
Jan 22 16:33:11 compute-0 sudo[46293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrldtymnkynuqsfwsdhhwlnimmwiouff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099590.8915448-88-164899271389570/AnsiballZ_file.py'
Jan 22 16:33:11 compute-0 sudo[46293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:33:11 compute-0 python3.9[46295]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:33:11 compute-0 sudo[46293]: pam_unix(sudo:session): session closed for user root
Jan 22 16:33:12 compute-0 sudo[46445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzkzxtuxeixvfnhzqadrrhjidxcstlrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099591.7897887-96-276108673091631/AnsiballZ_command.py'
Jan 22 16:33:12 compute-0 sudo[46445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:33:12 compute-0 python3.9[46447]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:33:12 compute-0 podman[46448]: 2026-01-22 16:33:12.41742447 +0000 UTC m=+0.071806795 system refresh
Jan 22 16:33:12 compute-0 sudo[46445]: pam_unix(sudo:session): session closed for user root
Jan 22 16:33:13 compute-0 sudo[46608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btiwlegowhxghqonwvljczonbddsssei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099592.6375754-104-24446290144879/AnsiballZ_stat.py'
Jan 22 16:33:13 compute-0 sudo[46608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:33:13 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:33:13 compute-0 python3.9[46610]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:33:13 compute-0 sudo[46608]: pam_unix(sudo:session): session closed for user root
Jan 22 16:33:14 compute-0 sudo[46731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqedapxuvtaunjwkopxhccdvzmdeavqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099592.6375754-104-24446290144879/AnsiballZ_copy.py'
Jan 22 16:33:14 compute-0 sudo[46731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:33:14 compute-0 python3.9[46733]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099592.6375754-104-24446290144879/.source.json follow=False _original_basename=podman_network_config.j2 checksum=fe8b8eb509eaf22b02059de6789a3919f433f87c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:33:14 compute-0 sudo[46731]: pam_unix(sudo:session): session closed for user root
Jan 22 16:33:14 compute-0 sudo[46883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvqgedwqsvgocfcvpxrcpirzhfpgqzwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099594.5243146-119-238242889812946/AnsiballZ_stat.py'
Jan 22 16:33:14 compute-0 sudo[46883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:33:15 compute-0 python3.9[46885]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:33:15 compute-0 sudo[46883]: pam_unix(sudo:session): session closed for user root
Jan 22 16:33:15 compute-0 sudo[47006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udtyjqxblfvphtcmupdoqmlukocdquky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099594.5243146-119-238242889812946/AnsiballZ_copy.py'
Jan 22 16:33:15 compute-0 sudo[47006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:33:15 compute-0 python3.9[47008]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769099594.5243146-119-238242889812946/.source.conf follow=False _original_basename=registries.conf.j2 checksum=c2a85b7389d30a5066b1ae0058c9a8ae1bc25688 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:33:15 compute-0 sudo[47006]: pam_unix(sudo:session): session closed for user root
Jan 22 16:33:16 compute-0 sudo[47158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jahhlmydwdnhwfszvcuqxksfpwhjgwva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099595.9685163-135-65154170919353/AnsiballZ_ini_file.py'
Jan 22 16:33:16 compute-0 sudo[47158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:33:16 compute-0 python3.9[47160]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:33:16 compute-0 sudo[47158]: pam_unix(sudo:session): session closed for user root
Jan 22 16:33:17 compute-0 sudo[47310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szkekmcqrjdzbefcsfqumzuweuhmdmci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099596.9067218-135-58954366397197/AnsiballZ_ini_file.py'
Jan 22 16:33:17 compute-0 sudo[47310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:33:17 compute-0 python3.9[47312]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:33:17 compute-0 sudo[47310]: pam_unix(sudo:session): session closed for user root
Jan 22 16:33:17 compute-0 sudo[47462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hibqipbpqeeotxzveoxibdsyrtzhcenv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099597.603478-135-77872058685640/AnsiballZ_ini_file.py'
Jan 22 16:33:17 compute-0 sudo[47462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:33:18 compute-0 python3.9[47464]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:33:18 compute-0 sudo[47462]: pam_unix(sudo:session): session closed for user root
Jan 22 16:33:18 compute-0 sudo[47614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exvuxbdxtyuosuitwxdznvbfjyyfbvmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099598.2927024-135-52166110244300/AnsiballZ_ini_file.py'
Jan 22 16:33:18 compute-0 sudo[47614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:33:18 compute-0 python3.9[47616]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:33:18 compute-0 sudo[47614]: pam_unix(sudo:session): session closed for user root
Jan 22 16:33:19 compute-0 python3.9[47766]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:33:20 compute-0 sudo[47918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzkxoobpccubonoyxyltfphtjhsppppf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099600.1080306-175-249592004872257/AnsiballZ_dnf.py'
Jan 22 16:33:20 compute-0 sudo[47918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:33:20 compute-0 python3.9[47920]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 16:33:21 compute-0 sudo[47918]: pam_unix(sudo:session): session closed for user root
Jan 22 16:33:22 compute-0 sudo[48071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghxzluvuxrjgzmfdgizrgsrxzdupixen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099602.1111166-183-116790723005177/AnsiballZ_dnf.py'
Jan 22 16:33:22 compute-0 sudo[48071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:33:22 compute-0 python3.9[48073]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 16:33:24 compute-0 sudo[48071]: pam_unix(sudo:session): session closed for user root
Jan 22 16:33:25 compute-0 sudo[48231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyucjqghzbaevkschbfiynhbpkzwhqdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099604.7836726-193-81987725570577/AnsiballZ_dnf.py'
Jan 22 16:33:25 compute-0 sudo[48231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:33:25 compute-0 python3.9[48233]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 16:33:26 compute-0 sudo[48231]: pam_unix(sudo:session): session closed for user root
Jan 22 16:33:27 compute-0 sudo[48384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drmhaigepnyqfbkgpyebxqewuhkajthk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099606.7078016-202-68530989130776/AnsiballZ_dnf.py'
Jan 22 16:33:27 compute-0 sudo[48384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:33:27 compute-0 python3.9[48386]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 16:33:28 compute-0 sudo[48384]: pam_unix(sudo:session): session closed for user root
Jan 22 16:33:30 compute-0 sudo[48538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrjupvgpcfsorsvpnlecagsoeyemqgiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099609.7531524-213-110226467936831/AnsiballZ_dnf.py'
Jan 22 16:33:30 compute-0 sudo[48538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:33:30 compute-0 python3.9[48540]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 16:33:32 compute-0 sudo[48538]: pam_unix(sudo:session): session closed for user root
Jan 22 16:33:32 compute-0 sudo[48694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyecjbzimapolneeygrcbljoqyjsphgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099612.4202127-221-195634737023714/AnsiballZ_dnf.py'
Jan 22 16:33:32 compute-0 sudo[48694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:33:32 compute-0 python3.9[48696]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 16:33:35 compute-0 sudo[48694]: pam_unix(sudo:session): session closed for user root
Jan 22 16:33:36 compute-0 sudo[48864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbdzetfroemqurweuxdvgdrbsmdewajt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099615.7215953-230-61717536722650/AnsiballZ_dnf.py'
Jan 22 16:33:36 compute-0 sudo[48864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:33:36 compute-0 python3.9[48866]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 16:33:37 compute-0 sudo[48864]: pam_unix(sudo:session): session closed for user root
Jan 22 16:33:38 compute-0 sudo[49017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayztmqulenmqlvkrkljsrpoipursjbzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099617.676606-239-107430376803865/AnsiballZ_dnf.py'
Jan 22 16:33:38 compute-0 sudo[49017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:33:38 compute-0 python3.9[49019]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 16:33:49 compute-0 sudo[49017]: pam_unix(sudo:session): session closed for user root
Jan 22 16:33:50 compute-0 sudo[49353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xghhbgthahznkcdrnqbsaskrkjhpwhwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099630.1807578-248-126513046508065/AnsiballZ_dnf.py'
Jan 22 16:33:50 compute-0 sudo[49353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:33:50 compute-0 python3.9[49355]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 16:33:51 compute-0 sudo[49353]: pam_unix(sudo:session): session closed for user root
Jan 22 16:33:52 compute-0 sudo[49509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etasylwzrvpthaqjrlaruhcrbwkxwlxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099632.2783635-258-15817233464229/AnsiballZ_dnf.py'
Jan 22 16:33:52 compute-0 sudo[49509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:33:52 compute-0 python3.9[49511]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['device-mapper-multipath'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 16:33:54 compute-0 sudo[49509]: pam_unix(sudo:session): session closed for user root
Jan 22 16:33:55 compute-0 sudo[49666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxaeggbqauvfxkvjifsvvaaglfkrbkwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099634.839354-269-224022566205338/AnsiballZ_file.py'
Jan 22 16:33:55 compute-0 sudo[49666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:33:55 compute-0 python3.9[49668]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:33:55 compute-0 sudo[49666]: pam_unix(sudo:session): session closed for user root
Jan 22 16:33:55 compute-0 sudo[49841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjbhsmtcdlxmphybayrxyuqpebcyjjfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099635.5328293-277-125254570397240/AnsiballZ_stat.py'
Jan 22 16:33:55 compute-0 sudo[49841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:33:56 compute-0 python3.9[49843]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:33:56 compute-0 sudo[49841]: pam_unix(sudo:session): session closed for user root
Jan 22 16:33:56 compute-0 sudo[49964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxywgsvrdkbffwtmtgpjhwezuxrhffdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099635.5328293-277-125254570397240/AnsiballZ_copy.py'
Jan 22 16:33:56 compute-0 sudo[49964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:33:56 compute-0 python3.9[49966]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1769099635.5328293-277-125254570397240/.source.json _original_basename=.biw996nd follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:33:56 compute-0 sudo[49964]: pam_unix(sudo:session): session closed for user root
Jan 22 16:33:57 compute-0 sudo[50116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txnewpwxaiydhwoacsyahodcejbbhvsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099637.1382725-295-1455864224195/AnsiballZ_podman_image.py'
Jan 22 16:33:57 compute-0 sudo[50116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:33:57 compute-0 python3.9[50118]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 22 16:33:57 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:34:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat2153783389-lower\x2dmapped.mount: Deactivated successfully.
Jan 22 16:34:04 compute-0 podman[50131]: 2026-01-22 16:34:04.747320981 +0000 UTC m=+6.734969161 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 22 16:34:04 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:34:04 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:34:04 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:34:04 compute-0 sudo[50116]: pam_unix(sudo:session): session closed for user root
Jan 22 16:34:05 compute-0 sudo[50426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mglvemfcdrbwnogeavfxmfaunsehqwvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099645.3172934-306-186407869821681/AnsiballZ_podman_image.py'
Jan 22 16:34:05 compute-0 sudo[50426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:34:05 compute-0 python3.9[50428]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 22 16:34:05 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:34:16 compute-0 podman[50441]: 2026-01-22 16:34:16.210264862 +0000 UTC m=+10.366469562 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 16:34:16 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:34:16 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:34:16 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:34:16 compute-0 sudo[50426]: pam_unix(sudo:session): session closed for user root
Jan 22 16:34:17 compute-0 sudo[50739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzbaruryxaixwimolqwzjcvjyekthxfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099656.862574-316-21945294646863/AnsiballZ_podman_image.py'
Jan 22 16:34:17 compute-0 sudo[50739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:34:17 compute-0 python3.9[50741]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 22 16:34:17 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:34:37 compute-0 podman[50753]: 2026-01-22 16:34:37.44083764 +0000 UTC m=+19.988627328 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 22 16:34:37 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:34:37 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:34:37 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:34:37 compute-0 sudo[50739]: pam_unix(sudo:session): session closed for user root
Jan 22 16:34:38 compute-0 sudo[51011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyilqgpmaushxmyaorfsllyqrbwlrrms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099678.2002366-327-172594550945797/AnsiballZ_podman_image.py'
Jan 22 16:34:38 compute-0 sudo[51011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:34:38 compute-0 python3.9[51013]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 22 16:34:38 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:34:41 compute-0 podman[51025]: 2026-01-22 16:34:41.785165208 +0000 UTC m=+2.928871288 image pull 806262ad9f61127734555408f71447afe6ceede79cc666e6f523dacd5edec739 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Jan 22 16:34:41 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:34:41 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:34:41 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:34:41 compute-0 sudo[51011]: pam_unix(sudo:session): session closed for user root
Jan 22 16:34:42 compute-0 sudo[51280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lejeatphoabbdqsrlyghkkwvtmhdfbac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099682.1455672-327-74144223565543/AnsiballZ_podman_image.py'
Jan 22 16:34:42 compute-0 sudo[51280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:34:42 compute-0 python3.9[51282]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 22 16:34:43 compute-0 podman[51294]: 2026-01-22 16:34:43.948728995 +0000 UTC m=+1.193610786 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Jan 22 16:34:43 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:34:43 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:34:44 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:34:44 compute-0 sudo[51280]: pam_unix(sudo:session): session closed for user root
Jan 22 16:34:44 compute-0 sshd-session[45120]: Connection closed by 192.168.122.30 port 49690
Jan 22 16:34:44 compute-0 sshd-session[45117]: pam_unix(sshd:session): session closed for user zuul
Jan 22 16:34:44 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Jan 22 16:34:44 compute-0 systemd[1]: session-11.scope: Consumed 1min 49.529s CPU time.
Jan 22 16:34:44 compute-0 systemd-logind[796]: Session 11 logged out. Waiting for processes to exit.
Jan 22 16:34:44 compute-0 systemd-logind[796]: Removed session 11.
Jan 22 16:34:51 compute-0 sshd-session[51437]: Accepted publickey for zuul from 192.168.122.30 port 57198 ssh2: ECDSA SHA256:XN6iZwzcCiCbU0l3vCSWxZPd4ElPyx+ZEvhzj+S5SUw
Jan 22 16:34:51 compute-0 systemd-logind[796]: New session 12 of user zuul.
Jan 22 16:34:51 compute-0 systemd[1]: Started Session 12 of User zuul.
Jan 22 16:34:51 compute-0 sshd-session[51437]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 16:34:53 compute-0 python3.9[51590]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:34:54 compute-0 sudo[51744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-heequwxhptvphjbaarosajufyhvllxur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099693.7712255-31-209500639518184/AnsiballZ_getent.py'
Jan 22 16:34:54 compute-0 sudo[51744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:34:54 compute-0 python3.9[51746]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 22 16:34:54 compute-0 sudo[51744]: pam_unix(sudo:session): session closed for user root
Jan 22 16:34:55 compute-0 sudo[51897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myyuhovbivoyukhrvzyavnawxubiunij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099694.7080374-39-110218055772687/AnsiballZ_group.py'
Jan 22 16:34:55 compute-0 sudo[51897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:34:55 compute-0 python3.9[51899]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 22 16:34:55 compute-0 groupadd[51900]: group added to /etc/group: name=openvswitch, GID=42476
Jan 22 16:34:55 compute-0 groupadd[51900]: group added to /etc/gshadow: name=openvswitch
Jan 22 16:34:55 compute-0 groupadd[51900]: new group: name=openvswitch, GID=42476
Jan 22 16:34:55 compute-0 sudo[51897]: pam_unix(sudo:session): session closed for user root
Jan 22 16:34:56 compute-0 sudo[52055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txnhmkzfclpmkkvrmlgqxbbnnwvyghzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099695.7889225-47-130683187840355/AnsiballZ_user.py'
Jan 22 16:34:56 compute-0 sudo[52055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:34:56 compute-0 python3.9[52057]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 22 16:34:56 compute-0 useradd[52059]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Jan 22 16:34:56 compute-0 useradd[52059]: add 'openvswitch' to group 'hugetlbfs'
Jan 22 16:34:56 compute-0 useradd[52059]: add 'openvswitch' to shadow group 'hugetlbfs'
Jan 22 16:34:56 compute-0 sudo[52055]: pam_unix(sudo:session): session closed for user root
Jan 22 16:34:57 compute-0 sudo[52215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eodaekutibbielmequgphxbtdukyhygu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099696.9567719-57-112948081886857/AnsiballZ_setup.py'
Jan 22 16:34:57 compute-0 sudo[52215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:34:57 compute-0 python3.9[52217]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 16:34:57 compute-0 sudo[52215]: pam_unix(sudo:session): session closed for user root
Jan 22 16:34:58 compute-0 sudo[52299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohgphynvnqmonaiqcxmjynonhiglhisq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099696.9567719-57-112948081886857/AnsiballZ_dnf.py'
Jan 22 16:34:58 compute-0 sudo[52299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:34:58 compute-0 python3.9[52301]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 16:35:00 compute-0 sudo[52299]: pam_unix(sudo:session): session closed for user root
Jan 22 16:35:01 compute-0 sudo[52461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkckzrumppmwoimfpjhpucsvzkkalgfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099700.8605342-71-959280447526/AnsiballZ_dnf.py'
Jan 22 16:35:01 compute-0 sudo[52461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:35:01 compute-0 python3.9[52463]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 16:35:10 compute-0 sshd-session[52469]: Connection reset by authenticating user root 176.120.22.47 port 48284 [preauth]
Jan 22 16:35:13 compute-0 kernel: SELinux:  Converting 2737 SID table entries...
Jan 22 16:35:13 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 16:35:13 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 22 16:35:13 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 16:35:13 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 22 16:35:13 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 16:35:13 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 16:35:13 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 16:35:13 compute-0 groupadd[52490]: group added to /etc/group: name=unbound, GID=994
Jan 22 16:35:13 compute-0 groupadd[52490]: group added to /etc/gshadow: name=unbound
Jan 22 16:35:13 compute-0 groupadd[52490]: new group: name=unbound, GID=994
Jan 22 16:35:13 compute-0 useradd[52497]: new user: name=unbound, UID=993, GID=994, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Jan 22 16:35:13 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Jan 22 16:35:13 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Jan 22 16:35:15 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 16:35:15 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 22 16:35:15 compute-0 systemd[1]: Reloading.
Jan 22 16:35:15 compute-0 systemd-rc-local-generator[52990]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:35:15 compute-0 systemd-sysv-generator[52994]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:35:15 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 16:35:15 compute-0 sudo[52461]: pam_unix(sudo:session): session closed for user root
Jan 22 16:35:15 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 16:35:15 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 22 16:35:15 compute-0 systemd[1]: run-rc209599132d3401b85c731f1153b64a6.service: Deactivated successfully.
Jan 22 16:35:16 compute-0 sudo[53562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqdteqjxnxajcnwiibuzosbixiwazpol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099716.0993161-79-269458117841818/AnsiballZ_systemd.py'
Jan 22 16:35:16 compute-0 sudo[53562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:35:17 compute-0 python3.9[53564]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 22 16:35:18 compute-0 systemd[1]: Reloading.
Jan 22 16:35:18 compute-0 systemd-rc-local-generator[53591]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:35:18 compute-0 systemd-sysv-generator[53595]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:35:18 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Jan 22 16:35:18 compute-0 chown[53607]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Jan 22 16:35:18 compute-0 sshd-session[52480]: Connection reset by authenticating user root 176.120.22.47 port 46974 [preauth]
Jan 22 16:35:18 compute-0 ovs-ctl[53612]: /etc/openvswitch/conf.db does not exist ... (warning).
Jan 22 16:35:18 compute-0 ovs-ctl[53612]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Jan 22 16:35:18 compute-0 ovs-ctl[53612]: Starting ovsdb-server [  OK  ]
Jan 22 16:35:18 compute-0 ovs-vsctl[53662]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Jan 22 16:35:18 compute-0 ovs-vsctl[53678]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"c288e768-a990-4b51-bd88-fd8dddb8c85d\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Jan 22 16:35:18 compute-0 ovs-ctl[53612]: Configuring Open vSwitch system IDs [  OK  ]
Jan 22 16:35:18 compute-0 ovs-vsctl[53687]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Jan 22 16:35:18 compute-0 ovs-ctl[53612]: Enabling remote OVSDB managers [  OK  ]
Jan 22 16:35:18 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Jan 22 16:35:18 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Jan 22 16:35:18 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Jan 22 16:35:18 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Jan 22 16:35:18 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Jan 22 16:35:18 compute-0 ovs-ctl[53731]: Inserting openvswitch module [  OK  ]
Jan 22 16:35:19 compute-0 ovs-ctl[53700]: Starting ovs-vswitchd [  OK  ]
Jan 22 16:35:19 compute-0 ovs-vsctl[53749]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Jan 22 16:35:19 compute-0 ovs-ctl[53700]: Enabling remote OVSDB managers [  OK  ]
Jan 22 16:35:19 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Jan 22 16:35:19 compute-0 systemd[1]: Starting Open vSwitch...
Jan 22 16:35:19 compute-0 systemd[1]: Finished Open vSwitch.
Jan 22 16:35:19 compute-0 sudo[53562]: pam_unix(sudo:session): session closed for user root
Jan 22 16:35:19 compute-0 python3.9[53902]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:35:20 compute-0 sudo[54052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zelpcebvhvtwpgmpnlwoaafywaptsraa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099720.226606-97-75325662940172/AnsiballZ_sefcontext.py'
Jan 22 16:35:20 compute-0 sudo[54052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:35:20 compute-0 python3.9[54054]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 22 16:35:22 compute-0 kernel: SELinux:  Converting 2751 SID table entries...
Jan 22 16:35:22 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 16:35:22 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 22 16:35:22 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 16:35:22 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 22 16:35:22 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 16:35:22 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 16:35:22 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 16:35:22 compute-0 sudo[54052]: pam_unix(sudo:session): session closed for user root
Jan 22 16:35:23 compute-0 python3.9[54209]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:35:23 compute-0 sudo[54365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhhnyncksbqjuazbiisxtefnhwmhpyxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099723.4428024-115-65535503714786/AnsiballZ_dnf.py'
Jan 22 16:35:23 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Jan 22 16:35:23 compute-0 sudo[54365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:35:23 compute-0 sshd-session[53568]: Connection reset by authenticating user root 176.120.22.47 port 47008 [preauth]
Jan 22 16:35:24 compute-0 python3.9[54367]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 16:35:24 compute-0 sshd-session[53661]: Connection reset by authenticating user root 176.120.22.47 port 47020 [preauth]
Jan 22 16:35:25 compute-0 sudo[54365]: pam_unix(sudo:session): session closed for user root
Jan 22 16:35:26 compute-0 sudo[54521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guaulxrkjarhrmeokkzlptllstbqgayc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099725.509968-123-41479803234801/AnsiballZ_command.py'
Jan 22 16:35:26 compute-0 sudo[54521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:35:26 compute-0 python3.9[54523]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:35:26 compute-0 sudo[54521]: pam_unix(sudo:session): session closed for user root
Jan 22 16:35:27 compute-0 sudo[54809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujedlmcvroeetutbcxvngjnodggligkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099727.1247268-131-277441690650332/AnsiballZ_file.py'
Jan 22 16:35:27 compute-0 sudo[54809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:35:27 compute-0 python3.9[54811]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 22 16:35:27 compute-0 sudo[54809]: pam_unix(sudo:session): session closed for user root
Jan 22 16:35:28 compute-0 python3.9[54961]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:35:29 compute-0 sudo[55113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krzjqjggaidndbvzaqzjgeyqbiwjvryn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099729.121048-147-17032707624515/AnsiballZ_dnf.py'
Jan 22 16:35:29 compute-0 sudo[55113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:35:29 compute-0 python3.9[55115]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 16:35:31 compute-0 sshd-session[54370]: Connection reset by authenticating user root 176.120.22.47 port 56472 [preauth]
Jan 22 16:35:32 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 16:35:32 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 22 16:35:32 compute-0 systemd[1]: Reloading.
Jan 22 16:35:32 compute-0 systemd-rc-local-generator[55156]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:35:32 compute-0 systemd-sysv-generator[55160]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:35:32 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 16:35:32 compute-0 sudo[55113]: pam_unix(sudo:session): session closed for user root
Jan 22 16:35:33 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 16:35:33 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 22 16:35:33 compute-0 systemd[1]: run-r8e51e05b15e2481d95b47908048bbe6e.service: Deactivated successfully.
Jan 22 16:35:33 compute-0 sudo[55434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-najeeaatttumsqjrrzmdvkhzkcztkjbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099733.1281602-155-147488556005384/AnsiballZ_systemd.py'
Jan 22 16:35:33 compute-0 sudo[55434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:35:33 compute-0 python3.9[55436]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 16:35:33 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 22 16:35:33 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Jan 22 16:35:33 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Jan 22 16:35:33 compute-0 systemd[1]: Stopping Network Manager...
Jan 22 16:35:33 compute-0 NetworkManager[7206]: <info>  [1769099733.8794] caught SIGTERM, shutting down normally.
Jan 22 16:35:33 compute-0 NetworkManager[7206]: <info>  [1769099733.8810] dhcp4 (eth0): canceled DHCP transaction
Jan 22 16:35:33 compute-0 NetworkManager[7206]: <info>  [1769099733.8810] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 22 16:35:33 compute-0 NetworkManager[7206]: <info>  [1769099733.8810] dhcp4 (eth0): state changed no lease
Jan 22 16:35:33 compute-0 NetworkManager[7206]: <info>  [1769099733.8813] manager: NetworkManager state is now CONNECTED_SITE
Jan 22 16:35:33 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 22 16:35:33 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 22 16:35:33 compute-0 NetworkManager[7206]: <info>  [1769099733.9163] exiting (success)
Jan 22 16:35:33 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 22 16:35:33 compute-0 systemd[1]: Stopped Network Manager.
Jan 22 16:35:33 compute-0 systemd[1]: NetworkManager.service: Consumed 11.967s CPU time, 4.1M memory peak, read 0B from disk, written 35.5K to disk.
Jan 22 16:35:33 compute-0 systemd[1]: Starting Network Manager...
Jan 22 16:35:33 compute-0 NetworkManager[55454]: <info>  [1769099733.9871] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:da997f0d-f9f6-41ea-b801-9627d95136ee)
Jan 22 16:35:33 compute-0 NetworkManager[55454]: <info>  [1769099733.9873] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 22 16:35:33 compute-0 NetworkManager[55454]: <info>  [1769099733.9932] manager[0x564ce1ef3000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 22 16:35:34 compute-0 systemd[1]: Starting Hostname Service...
Jan 22 16:35:34 compute-0 systemd[1]: Started Hostname Service.
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.0939] hostname: hostname: using hostnamed
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.0942] hostname: static hostname changed from (none) to "compute-0"
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.0947] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.0952] manager[0x564ce1ef3000]: rfkill: Wi-Fi hardware radio set enabled
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.0952] manager[0x564ce1ef3000]: rfkill: WWAN hardware radio set enabled
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.0971] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.0979] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.0979] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.0980] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.0980] manager: Networking is enabled by state file
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.0982] settings: Loaded settings plugin: keyfile (internal)
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.0985] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1006] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1015] dhcp: init: Using DHCP client 'internal'
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1017] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1021] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1025] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1030] device (lo): Activation: starting connection 'lo' (30955f9e-8f64-42bf-81b2-a9784deb7a51)
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1035] device (eth0): carrier: link connected
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1038] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1041] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1042] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1046] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1050] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1054] device (eth1): carrier: link connected
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1057] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1060] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (f8969003-395c-5487-8caf-079e54e358f5) (indicated)
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1060] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1064] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1068] device (eth1): Activation: starting connection 'ci-private-network' (f8969003-395c-5487-8caf-079e54e358f5)
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1072] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 22 16:35:34 compute-0 systemd[1]: Started Network Manager.
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1078] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1080] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1081] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1082] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1096] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1100] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1103] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1108] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1119] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1124] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1154] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1180] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1197] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1202] dhcp4 (eth0): state changed new lease, address=38.102.83.176
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1208] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 22 16:35:34 compute-0 systemd[1]: Starting Network Manager Wait Online...
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1221] device (lo): Activation: successful, device activated.
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1241] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 22 16:35:34 compute-0 sudo[55434]: pam_unix(sudo:session): session closed for user root
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1535] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1547] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1549] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1552] manager: NetworkManager state is now CONNECTED_LOCAL
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1555] device (eth1): Activation: successful, device activated.
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1761] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1763] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1767] manager: NetworkManager state is now CONNECTED_SITE
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1770] device (eth0): Activation: successful, device activated.
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1774] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 22 16:35:34 compute-0 NetworkManager[55454]: <info>  [1769099734.1922] manager: startup complete
Jan 22 16:35:34 compute-0 systemd[1]: Finished Network Manager Wait Online.
Jan 22 16:35:34 compute-0 sudo[55660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwogpkpdxdccdvvbbbrbsouticyapsly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099734.3887885-163-45792216711829/AnsiballZ_dnf.py'
Jan 22 16:35:34 compute-0 sudo[55660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:35:35 compute-0 python3.9[55662]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 16:35:39 compute-0 sshd-session[55165]: Connection reset by authenticating user root 176.120.22.47 port 33894 [preauth]
Jan 22 16:35:40 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 16:35:40 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 22 16:35:40 compute-0 systemd[1]: Reloading.
Jan 22 16:35:40 compute-0 systemd-sysv-generator[55718]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:35:40 compute-0 systemd-rc-local-generator[55715]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:35:40 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 16:35:41 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 16:35:41 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 22 16:35:41 compute-0 systemd[1]: run-r05a4fffe251b4a76ae5256194b62124a.service: Deactivated successfully.
Jan 22 16:35:41 compute-0 sudo[55660]: pam_unix(sudo:session): session closed for user root
Jan 22 16:35:42 compute-0 sudo[56122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kicmirtvprythwkiqiwcbcumtwnaajxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099742.0030441-175-248891168172373/AnsiballZ_stat.py'
Jan 22 16:35:42 compute-0 sudo[56122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:35:42 compute-0 python3.9[56124]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:35:42 compute-0 sudo[56122]: pam_unix(sudo:session): session closed for user root
Jan 22 16:35:43 compute-0 sudo[56274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnxeieutpopofkvjkhqkmmahccumeojv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099742.6965954-184-76018914892157/AnsiballZ_ini_file.py'
Jan 22 16:35:43 compute-0 sudo[56274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:35:43 compute-0 python3.9[56276]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:35:43 compute-0 sudo[56274]: pam_unix(sudo:session): session closed for user root
Jan 22 16:35:43 compute-0 sudo[56428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-foefrsthpwwkzujpjyopkazaccuimagc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099743.6255355-194-194307970904619/AnsiballZ_ini_file.py'
Jan 22 16:35:43 compute-0 sudo[56428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:35:44 compute-0 python3.9[56430]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:35:44 compute-0 sudo[56428]: pam_unix(sudo:session): session closed for user root
Jan 22 16:35:44 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 22 16:35:44 compute-0 sudo[56580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avtuhcyexgzcwiymucdzienxiskiyfia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099744.2447567-194-257190861561632/AnsiballZ_ini_file.py'
Jan 22 16:35:44 compute-0 sudo[56580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:35:44 compute-0 python3.9[56582]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:35:44 compute-0 sudo[56580]: pam_unix(sudo:session): session closed for user root
Jan 22 16:35:45 compute-0 sudo[56732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yconnundknmgehrbkmsmxnktfhrrcadj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099745.0198305-209-94052168747507/AnsiballZ_ini_file.py'
Jan 22 16:35:45 compute-0 sudo[56732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:35:45 compute-0 python3.9[56734]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:35:45 compute-0 sudo[56732]: pam_unix(sudo:session): session closed for user root
Jan 22 16:35:45 compute-0 sudo[56884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgmcrwdncqfsvixkegjxuehqsogkbneu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099745.6598387-209-201386806887602/AnsiballZ_ini_file.py'
Jan 22 16:35:45 compute-0 sudo[56884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:35:46 compute-0 python3.9[56886]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:35:46 compute-0 sudo[56884]: pam_unix(sudo:session): session closed for user root
Jan 22 16:35:46 compute-0 sudo[57036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwjtqnnmagviuodlvdsfxbzkgujvlgtx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099746.3023672-224-33390435263804/AnsiballZ_stat.py'
Jan 22 16:35:46 compute-0 sudo[57036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:35:46 compute-0 python3.9[57038]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:35:46 compute-0 sudo[57036]: pam_unix(sudo:session): session closed for user root
Jan 22 16:35:47 compute-0 sudo[57159]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkxrhxegdhazwbfnyqlhowmoborefzob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099746.3023672-224-33390435263804/AnsiballZ_copy.py'
Jan 22 16:35:47 compute-0 sudo[57159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:35:47 compute-0 sshd-session[55682]: Connection reset by authenticating user root 176.120.22.47 port 30410 [preauth]
Jan 22 16:35:47 compute-0 python3.9[57161]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769099746.3023672-224-33390435263804/.source _original_basename=.7f8wzc89 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:35:47 compute-0 sudo[57159]: pam_unix(sudo:session): session closed for user root
Jan 22 16:35:47 compute-0 sudo[57313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvqswhondusnltjqkikcbtwwnoktvssv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099747.6282456-239-166286222285062/AnsiballZ_file.py'
Jan 22 16:35:47 compute-0 sudo[57313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:35:48 compute-0 python3.9[57315]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:35:48 compute-0 sudo[57313]: pam_unix(sudo:session): session closed for user root
Jan 22 16:35:48 compute-0 sudo[57466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arsvdhtzmntbxrahxhbuzzumgvfdqfhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099748.2777574-247-109534830206233/AnsiballZ_edpm_os_net_config_mappings.py'
Jan 22 16:35:48 compute-0 sudo[57466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:35:48 compute-0 python3.9[57468]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Jan 22 16:35:48 compute-0 sudo[57466]: pam_unix(sudo:session): session closed for user root
Jan 22 16:35:49 compute-0 sudo[57619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arxnluqpwtwbxnuddklqmsgdnddjygob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099749.1273904-256-2654874796014/AnsiballZ_file.py'
Jan 22 16:35:49 compute-0 sudo[57619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:35:49 compute-0 python3.9[57621]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:35:49 compute-0 sudo[57619]: pam_unix(sudo:session): session closed for user root
Jan 22 16:35:50 compute-0 sudo[57771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjuioivfomrjanoibafqfjkmxwaanhmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099750.0209093-266-50329739201862/AnsiballZ_stat.py'
Jan 22 16:35:50 compute-0 sudo[57771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:35:50 compute-0 sudo[57771]: pam_unix(sudo:session): session closed for user root
Jan 22 16:35:50 compute-0 sudo[57894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwtqyvvovcrqkvyrsjgkjnppshyjcrjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099750.0209093-266-50329739201862/AnsiballZ_copy.py'
Jan 22 16:35:50 compute-0 sudo[57894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:35:50 compute-0 sudo[57894]: pam_unix(sudo:session): session closed for user root
Jan 22 16:35:51 compute-0 sudo[58046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhizkadqmjjjqrbjnlerwwhbjmcjnpvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099751.1586304-281-101977641252215/AnsiballZ_slurp.py'
Jan 22 16:35:51 compute-0 sudo[58046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:35:51 compute-0 python3.9[58048]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Jan 22 16:35:51 compute-0 sudo[58046]: pam_unix(sudo:session): session closed for user root
Jan 22 16:35:52 compute-0 sudo[58221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuxyzktemfvttbubzhftnpvywgzsizgy ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099751.967499-290-24373084825766/async_wrapper.py j846613817492 300 /home/zuul/.ansible/tmp/ansible-tmp-1769099751.967499-290-24373084825766/AnsiballZ_edpm_os_net_config.py _'
Jan 22 16:35:52 compute-0 sudo[58221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:35:52 compute-0 ansible-async_wrapper.py[58223]: Invoked with j846613817492 300 /home/zuul/.ansible/tmp/ansible-tmp-1769099751.967499-290-24373084825766/AnsiballZ_edpm_os_net_config.py _
Jan 22 16:35:52 compute-0 ansible-async_wrapper.py[58226]: Starting module and watcher
Jan 22 16:35:52 compute-0 ansible-async_wrapper.py[58226]: Start watching 58227 (300)
Jan 22 16:35:52 compute-0 ansible-async_wrapper.py[58227]: Start module (58227)
Jan 22 16:35:52 compute-0 ansible-async_wrapper.py[58223]: Return async_wrapper task started.
Jan 22 16:35:52 compute-0 sudo[58221]: pam_unix(sudo:session): session closed for user root
Jan 22 16:35:53 compute-0 python3.9[58228]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Jan 22 16:35:53 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Jan 22 16:35:53 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Jan 22 16:35:53 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Jan 22 16:35:53 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Jan 22 16:35:53 compute-0 kernel: cfg80211: failed to load regulatory.db
Jan 22 16:35:54 compute-0 sshd-session[57162]: Connection reset by authenticating user root 176.120.22.47 port 30424 [preauth]
Jan 22 16:35:54 compute-0 NetworkManager[55454]: <info>  [1769099754.7751] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58229 uid=0 result="success"
Jan 22 16:35:54 compute-0 NetworkManager[55454]: <info>  [1769099754.7771] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58229 uid=0 result="success"
Jan 22 16:35:54 compute-0 NetworkManager[55454]: <info>  [1769099754.8298] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Jan 22 16:35:54 compute-0 NetworkManager[55454]: <info>  [1769099754.8301] audit: op="connection-add" uuid="469adc80-1a0a-4abb-8284-9bc5049723f0" name="br-ex-br" pid=58229 uid=0 result="success"
Jan 22 16:35:54 compute-0 NetworkManager[55454]: <info>  [1769099754.8315] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Jan 22 16:35:54 compute-0 NetworkManager[55454]: <info>  [1769099754.8317] audit: op="connection-add" uuid="70e500de-d00f-4149-b13f-6d1feb18193b" name="br-ex-port" pid=58229 uid=0 result="success"
Jan 22 16:35:54 compute-0 NetworkManager[55454]: <info>  [1769099754.8327] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Jan 22 16:35:54 compute-0 NetworkManager[55454]: <info>  [1769099754.8329] audit: op="connection-add" uuid="bf779204-8c8f-4813-a0b6-0a4e324c5af2" name="eth1-port" pid=58229 uid=0 result="success"
Jan 22 16:35:54 compute-0 NetworkManager[55454]: <info>  [1769099754.8342] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Jan 22 16:35:54 compute-0 NetworkManager[55454]: <info>  [1769099754.8343] audit: op="connection-add" uuid="b96938fc-6ffe-4df9-aabc-8c8d407bcbb3" name="vlan20-port" pid=58229 uid=0 result="success"
Jan 22 16:35:54 compute-0 NetworkManager[55454]: <info>  [1769099754.8354] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Jan 22 16:35:54 compute-0 NetworkManager[55454]: <info>  [1769099754.8356] audit: op="connection-add" uuid="c052c044-6377-48ec-bef0-78596f3bd5b9" name="vlan21-port" pid=58229 uid=0 result="success"
Jan 22 16:35:54 compute-0 NetworkManager[55454]: <info>  [1769099754.8366] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Jan 22 16:35:54 compute-0 NetworkManager[55454]: <info>  [1769099754.8368] audit: op="connection-add" uuid="20784ef0-494f-4487-a709-c8d70faa3b9a" name="vlan22-port" pid=58229 uid=0 result="success"
Jan 22 16:35:54 compute-0 NetworkManager[55454]: <info>  [1769099754.8385] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="connection.timestamp,connection.autoconnect-priority,ipv4.dhcp-timeout,ipv4.dhcp-client-id,802-3-ethernet.mtu,ipv6.method,ipv6.addr-gen-mode,ipv6.dhcp-timeout" pid=58229 uid=0 result="success"
Jan 22 16:35:54 compute-0 NetworkManager[55454]: <info>  [1769099754.8401] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Jan 22 16:35:54 compute-0 NetworkManager[55454]: <info>  [1769099754.8403] audit: op="connection-add" uuid="ca614780-85f3-451e-976f-1a0e07eebf1c" name="br-ex-if" pid=58229 uid=0 result="success"
Jan 22 16:35:54 compute-0 NetworkManager[55454]: <info>  [1769099754.9808] audit: op="connection-update" uuid="f8969003-395c-5487-8caf-079e54e358f5" name="ci-private-network" args="ovs-external-ids.data,connection.controller,connection.slave-type,connection.port-type,connection.timestamp,connection.master,ipv4.method,ipv4.dns,ipv4.routes,ipv4.never-default,ipv4.addresses,ipv4.routing-rules,ipv6.method,ipv6.addr-gen-mode,ipv6.dns,ipv6.routes,ipv6.routing-rules,ipv6.addresses,ovs-interface.type" pid=58229 uid=0 result="success"
Jan 22 16:35:54 compute-0 NetworkManager[55454]: <info>  [1769099754.9839] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Jan 22 16:35:54 compute-0 NetworkManager[55454]: <info>  [1769099754.9843] audit: op="connection-add" uuid="8ccf8388-751d-4daa-8a3f-0bc0278544ff" name="vlan20-if" pid=58229 uid=0 result="success"
Jan 22 16:35:54 compute-0 NetworkManager[55454]: <info>  [1769099754.9873] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Jan 22 16:35:54 compute-0 NetworkManager[55454]: <info>  [1769099754.9877] audit: op="connection-add" uuid="b44fd19b-f156-4986-be27-77a46b9f594b" name="vlan21-if" pid=58229 uid=0 result="success"
Jan 22 16:35:54 compute-0 NetworkManager[55454]: <info>  [1769099754.9906] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Jan 22 16:35:54 compute-0 NetworkManager[55454]: <info>  [1769099754.9911] audit: op="connection-add" uuid="41dc9db0-08b5-434e-9518-69cd0708331b" name="vlan22-if" pid=58229 uid=0 result="success"
Jan 22 16:35:54 compute-0 NetworkManager[55454]: <info>  [1769099754.9934] audit: op="connection-delete" uuid="996dff1a-21f6-3407-9055-9c1cc954befb" name="Wired connection 1" pid=58229 uid=0 result="success"
Jan 22 16:35:54 compute-0 NetworkManager[55454]: <info>  [1769099754.9958] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 16:35:54 compute-0 NetworkManager[55454]: <warn>  [1769099754.9963] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 22 16:35:54 compute-0 NetworkManager[55454]: <info>  [1769099754.9977] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 16:35:54 compute-0 NetworkManager[55454]: <info>  [1769099754.9986] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (469adc80-1a0a-4abb-8284-9bc5049723f0)
Jan 22 16:35:54 compute-0 NetworkManager[55454]: <info>  [1769099754.9989] audit: op="connection-activate" uuid="469adc80-1a0a-4abb-8284-9bc5049723f0" name="br-ex-br" pid=58229 uid=0 result="success"
Jan 22 16:35:54 compute-0 NetworkManager[55454]: <info>  [1769099754.9993] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 16:35:54 compute-0 NetworkManager[55454]: <warn>  [1769099754.9997] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0009] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0018] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (70e500de-d00f-4149-b13f-6d1feb18193b)
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0022] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <warn>  [1769099755.0026] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0036] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0047] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (bf779204-8c8f-4813-a0b6-0a4e324c5af2)
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0052] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <warn>  [1769099755.0055] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0060] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0063] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (b96938fc-6ffe-4df9-aabc-8c8d407bcbb3)
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0064] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <warn>  [1769099755.0065] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0069] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0071] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (c052c044-6377-48ec-bef0-78596f3bd5b9)
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0072] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <warn>  [1769099755.0073] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0077] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0080] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (20784ef0-494f-4487-a709-c8d70faa3b9a)
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0080] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0082] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0083] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0088] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <warn>  [1769099755.0088] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0090] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0093] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (ca614780-85f3-451e-976f-1a0e07eebf1c)
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0093] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0095] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0097] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0097] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0098] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0105] device (eth1): disconnecting for new activation request.
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0106] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0108] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0109] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0110] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0112] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <warn>  [1769099755.0112] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0114] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0117] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (8ccf8388-751d-4daa-8a3f-0bc0278544ff)
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0118] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0119] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0121] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0121] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0123] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <warn>  [1769099755.0124] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0126] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0129] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (b44fd19b-f156-4986-be27-77a46b9f594b)
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0129] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0131] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0132] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0133] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0135] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <warn>  [1769099755.0136] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0138] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0140] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (41dc9db0-08b5-434e-9518-69cd0708331b)
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0141] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0143] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0144] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0145] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0146] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0156] audit: op="device-reapply" interface="eth0" ifindex=2 args="connection.autoconnect-priority,ipv4.dhcp-timeout,ipv4.dhcp-client-id,802-3-ethernet.mtu,ipv6.method,ipv6.addr-gen-mode" pid=58229 uid=0 result="success"
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0157] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0159] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0161] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0166] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0169] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0172] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0174] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0175] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0179] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0182] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0185] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0186] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0190] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0193] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0195] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0197] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0201] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0204] dhcp4 (eth0): canceled DHCP transaction
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0204] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0204] dhcp4 (eth0): state changed no lease
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0206] dhcp4 (eth0): activation: beginning transaction (no timeout)
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.0215] audit: op="device-reapply" interface="eth1" ifindex=3 pid=58229 uid=0 result="fail" reason="Device is not activated"
Jan 22 16:35:55 compute-0 kernel: ovs-system: entered promiscuous mode
Jan 22 16:35:55 compute-0 kernel: Timeout policy base is empty
Jan 22 16:35:55 compute-0 systemd-udevd[58234]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 16:35:55 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 22 16:35:55 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 22 16:35:55 compute-0 kernel: br-ex: entered promiscuous mode
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1026] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1038] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1044] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 22 16:35:55 compute-0 kernel: vlan20: entered promiscuous mode
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1118] device (eth1): Activation: starting connection 'ci-private-network' (f8969003-395c-5487-8caf-079e54e358f5)
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1121] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1122] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1123] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1124] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1125] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1126] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1130] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1136] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1140] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1143] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1148] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1151] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1155] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1158] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1160] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1163] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1166] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1169] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1172] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1175] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1178] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1180] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1187] device (eth1): state change: config -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1188] device (eth1): released from controller device eth1
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1195] device (eth1): disconnecting for new activation request.
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1196] audit: op="connection-activate" uuid="f8969003-395c-5487-8caf-079e54e358f5" name="ci-private-network" pid=58229 uid=0 result="success"
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1199] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1207] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1208] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1221] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1229] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1234] device (eth1): Activation: starting connection 'ci-private-network' (f8969003-395c-5487-8caf-079e54e358f5)
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1245] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1248] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1251] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1252] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58229 uid=0 result="success"
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1253] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1254] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1259] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1263] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 22 16:35:55 compute-0 kernel: vlan21: entered promiscuous mode
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1277] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1281] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1287] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1291] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Jan 22 16:35:55 compute-0 kernel: vlan22: entered promiscuous mode
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1657] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1668] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1670] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1691] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1702] device (eth1): Activation: successful, device activated.
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1749] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1763] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1788] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 sshd-session[57286]: Connection reset by authenticating user root 176.120.22.47 port 30434 [preauth]
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1798] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1800] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1805] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1813] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.1818] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 22 16:35:55 compute-0 NetworkManager[55454]: <info>  [1769099755.8503] dhcp4 (eth0): state changed new lease, address=38.102.83.176
Jan 22 16:35:56 compute-0 NetworkManager[55454]: <info>  [1769099756.3412] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58229 uid=0 result="success"
Jan 22 16:35:56 compute-0 NetworkManager[55454]: <info>  [1769099756.5098] checkpoint[0x564ce1ec8950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Jan 22 16:35:56 compute-0 NetworkManager[55454]: <info>  [1769099756.5102] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58229 uid=0 result="success"
Jan 22 16:35:56 compute-0 sudo[58568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnlfpchuorelrqixqfppcrcnbowwmmwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099756.0079353-290-18343067624211/AnsiballZ_async_status.py'
Jan 22 16:35:56 compute-0 sudo[58568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:35:56 compute-0 python3.9[58571]: ansible-ansible.legacy.async_status Invoked with jid=j846613817492.58223 mode=status _async_dir=/root/.ansible_async
Jan 22 16:35:56 compute-0 sudo[58568]: pam_unix(sudo:session): session closed for user root
Jan 22 16:35:56 compute-0 NetworkManager[55454]: <info>  [1769099756.7720] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58229 uid=0 result="success"
Jan 22 16:35:56 compute-0 NetworkManager[55454]: <info>  [1769099756.7731] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58229 uid=0 result="success"
Jan 22 16:35:56 compute-0 NetworkManager[55454]: <info>  [1769099756.9727] audit: op="networking-control" arg="global-dns-configuration" pid=58229 uid=0 result="success"
Jan 22 16:35:56 compute-0 NetworkManager[55454]: <info>  [1769099756.9757] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Jan 22 16:35:56 compute-0 NetworkManager[55454]: <info>  [1769099756.9790] audit: op="networking-control" arg="global-dns-configuration" pid=58229 uid=0 result="success"
Jan 22 16:35:56 compute-0 NetworkManager[55454]: <info>  [1769099756.9822] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58229 uid=0 result="success"
Jan 22 16:35:57 compute-0 NetworkManager[55454]: <info>  [1769099757.1178] checkpoint[0x564ce1ec8a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Jan 22 16:35:57 compute-0 NetworkManager[55454]: <info>  [1769099757.1189] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58229 uid=0 result="success"
Jan 22 16:35:57 compute-0 ansible-async_wrapper.py[58227]: Module complete (58227)
Jan 22 16:35:57 compute-0 ansible-async_wrapper.py[58226]: Done in kid B.
Jan 22 16:35:58 compute-0 sshd-session[58578]: Received disconnect from 91.224.92.78 port 61404:11:  [preauth]
Jan 22 16:35:58 compute-0 sshd-session[58578]: Disconnected from authenticating user root 91.224.92.78 port 61404 [preauth]
Jan 22 16:36:00 compute-0 sudo[58676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuymagjsluwsirsiyhamcacsdaronnqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099756.0079353-290-18343067624211/AnsiballZ_async_status.py'
Jan 22 16:36:00 compute-0 sudo[58676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:00 compute-0 python3.9[58678]: ansible-ansible.legacy.async_status Invoked with jid=j846613817492.58223 mode=status _async_dir=/root/.ansible_async
Jan 22 16:36:00 compute-0 sudo[58676]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:00 compute-0 sshd-session[58278]: Connection reset by authenticating user root 176.120.22.47 port 27186 [preauth]
Jan 22 16:36:00 compute-0 sudo[58776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibamzzhnrltfxonsarexwdolrwkjablt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099756.0079353-290-18343067624211/AnsiballZ_async_status.py'
Jan 22 16:36:00 compute-0 sudo[58776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:00 compute-0 python3.9[58778]: ansible-ansible.legacy.async_status Invoked with jid=j846613817492.58223 mode=cleanup _async_dir=/root/.ansible_async
Jan 22 16:36:00 compute-0 sudo[58776]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:01 compute-0 sshd-session[58382]: Connection reset by authenticating user root 176.120.22.47 port 27202 [preauth]
Jan 22 16:36:01 compute-0 sudo[58929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbosmtylasyttibvkzteumboadlxhcbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099760.9265535-317-178239413086734/AnsiballZ_stat.py'
Jan 22 16:36:01 compute-0 sudo[58929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:01 compute-0 python3.9[58931]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:36:01 compute-0 sudo[58929]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:01 compute-0 sshd-session[54368]: Connection reset by 176.120.22.47 port 56466 [preauth]
Jan 22 16:36:01 compute-0 sudo[59054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qulwdkkgwjtqdygsnnpzzwpszhgbloed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099760.9265535-317-178239413086734/AnsiballZ_copy.py'
Jan 22 16:36:01 compute-0 sudo[59054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:01 compute-0 python3.9[59056]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769099760.9265535-317-178239413086734/.source.returncode _original_basename=.x2_qrp9g follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:36:01 compute-0 sudo[59054]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:02 compute-0 sudo[59207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnfbxeacpaprmkizcxbbwpubkjcczujq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099762.1208096-333-248752516584047/AnsiballZ_stat.py'
Jan 22 16:36:02 compute-0 sudo[59207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:02 compute-0 python3.9[59209]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:36:02 compute-0 sudo[59207]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:02 compute-0 sudo[59330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-koqcgzbddtsuizogqxkvyyjqhqktbdpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099762.1208096-333-248752516584047/AnsiballZ_copy.py'
Jan 22 16:36:02 compute-0 sudo[59330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:03 compute-0 python3.9[59332]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769099762.1208096-333-248752516584047/.source.cfg _original_basename=.rceywf8y follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:36:03 compute-0 sudo[59330]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:03 compute-0 sudo[59483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-segzkiqinomclfvkdlfichkgesvgzsur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099763.3733714-348-67581448759883/AnsiballZ_systemd.py'
Jan 22 16:36:03 compute-0 sudo[59483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:04 compute-0 python3.9[59485]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 16:36:04 compute-0 systemd[1]: Reloading Network Manager...
Jan 22 16:36:04 compute-0 NetworkManager[55454]: <info>  [1769099764.0851] audit: op="reload" arg="0" pid=59489 uid=0 result="success"
Jan 22 16:36:04 compute-0 NetworkManager[55454]: <info>  [1769099764.0857] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Jan 22 16:36:04 compute-0 systemd[1]: Reloaded Network Manager.
Jan 22 16:36:04 compute-0 sudo[59483]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:04 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 22 16:36:04 compute-0 sshd-session[51440]: Connection closed by 192.168.122.30 port 57198
Jan 22 16:36:04 compute-0 sshd-session[51437]: pam_unix(sshd:session): session closed for user zuul
Jan 22 16:36:04 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Jan 22 16:36:04 compute-0 systemd[1]: session-12.scope: Consumed 49.978s CPU time.
Jan 22 16:36:04 compute-0 systemd-logind[796]: Session 12 logged out. Waiting for processes to exit.
Jan 22 16:36:04 compute-0 systemd-logind[796]: Removed session 12.
Jan 22 16:36:07 compute-0 sshd-session[58779]: Connection reset by authenticating user root 176.120.22.47 port 42180 [preauth]
Jan 22 16:36:09 compute-0 sshd-session[55120]: Connection reset by 176.120.22.47 port 33888 [preauth]
Jan 22 16:36:10 compute-0 sshd-session[59524]: Accepted publickey for zuul from 192.168.122.30 port 44222 ssh2: ECDSA SHA256:XN6iZwzcCiCbU0l3vCSWxZPd4ElPyx+ZEvhzj+S5SUw
Jan 22 16:36:10 compute-0 systemd-logind[796]: New session 13 of user zuul.
Jan 22 16:36:10 compute-0 systemd[1]: Started Session 13 of User zuul.
Jan 22 16:36:10 compute-0 sshd-session[59524]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 16:36:11 compute-0 python3.9[59677]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:36:13 compute-0 python3.9[59831]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 16:36:14 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 22 16:36:14 compute-0 python3.9[60021]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:36:14 compute-0 sshd-session[59527]: Connection closed by 192.168.122.30 port 44222
Jan 22 16:36:14 compute-0 sshd-session[59524]: pam_unix(sshd:session): session closed for user zuul
Jan 22 16:36:14 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Jan 22 16:36:14 compute-0 systemd[1]: session-13.scope: Consumed 2.652s CPU time.
Jan 22 16:36:14 compute-0 systemd-logind[796]: Session 13 logged out. Waiting for processes to exit.
Jan 22 16:36:14 compute-0 systemd-logind[796]: Removed session 13.
Jan 22 16:36:14 compute-0 sshd-session[59521]: Connection reset by authenticating user root 176.120.22.47 port 42214 [preauth]
Jan 22 16:36:19 compute-0 sshd-session[60048]: Connection reset by authenticating user root 176.120.22.47 port 37130 [preauth]
Jan 22 16:36:20 compute-0 sshd-session[60054]: Accepted publickey for zuul from 192.168.122.30 port 36916 ssh2: ECDSA SHA256:XN6iZwzcCiCbU0l3vCSWxZPd4ElPyx+ZEvhzj+S5SUw
Jan 22 16:36:20 compute-0 systemd-logind[796]: New session 14 of user zuul.
Jan 22 16:36:20 compute-0 systemd[1]: Started Session 14 of User zuul.
Jan 22 16:36:20 compute-0 sshd-session[60054]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 16:36:21 compute-0 python3.9[60207]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:36:22 compute-0 python3.9[60361]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:36:23 compute-0 sudo[60516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udqdyqlaomxywuylubxuymdghngalotd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099782.950197-35-75019187456585/AnsiballZ_setup.py'
Jan 22 16:36:23 compute-0 sudo[60516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:23 compute-0 python3.9[60518]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 16:36:23 compute-0 sudo[60516]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:24 compute-0 sudo[60600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnxxrtcwdtpwyeefzixujgbrhkyqqdna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099782.950197-35-75019187456585/AnsiballZ_dnf.py'
Jan 22 16:36:24 compute-0 sudo[60600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:24 compute-0 python3.9[60602]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 16:36:24 compute-0 sshd-session[60052]: Connection reset by authenticating user root 176.120.22.47 port 37150 [preauth]
Jan 22 16:36:25 compute-0 sudo[60600]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:26 compute-0 sudo[60756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zguztpbekmfmcvkocmwnnmogdpvsevec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099786.1098938-47-252995263274386/AnsiballZ_setup.py'
Jan 22 16:36:26 compute-0 sudo[60756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:26 compute-0 python3.9[60758]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 16:36:26 compute-0 sudo[60756]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:28 compute-0 sudo[60947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hempcjpctphrvwrbqdkkptednifrnycs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099787.499171-58-101098500767949/AnsiballZ_file.py'
Jan 22 16:36:28 compute-0 sudo[60947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:28 compute-0 sshd-session[60604]: Invalid user ubuntu from 176.120.22.47 port 60180
Jan 22 16:36:28 compute-0 python3.9[60949]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:36:28 compute-0 sudo[60947]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:29 compute-0 sudo[61099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jymmmwstytlsrpmddvqrapoufbxazekm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099788.594785-66-78362392968799/AnsiballZ_command.py'
Jan 22 16:36:29 compute-0 sudo[61099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:29 compute-0 python3.9[61101]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:36:29 compute-0 sshd-session[60604]: Connection reset by invalid user ubuntu 176.120.22.47 port 60180 [preauth]
Jan 22 16:36:29 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:36:29 compute-0 sudo[61099]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:29 compute-0 sudo[61264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfsiedsgdccnlhxyhopshcryvvucvwtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099789.5280392-74-147399365384855/AnsiballZ_stat.py'
Jan 22 16:36:29 compute-0 sudo[61264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:30 compute-0 python3.9[61266]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:36:30 compute-0 sudo[61264]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:30 compute-0 sudo[61342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qymyiyebnwzbialkhvjuiboocnyumjvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099789.5280392-74-147399365384855/AnsiballZ_file.py'
Jan 22 16:36:30 compute-0 sudo[61342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:30 compute-0 python3.9[61344]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:36:30 compute-0 sudo[61342]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:31 compute-0 sudo[61494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfxstgvyfvxhrvnhgsojhjnffgtbbnpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099790.8491986-86-190267643056556/AnsiballZ_stat.py'
Jan 22 16:36:31 compute-0 sudo[61494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:31 compute-0 python3.9[61496]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:36:31 compute-0 sudo[61494]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:31 compute-0 sudo[61573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxzqjpavoqhzaebfqizwvukgoslqzatu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099790.8491986-86-190267643056556/AnsiballZ_file.py'
Jan 22 16:36:31 compute-0 sudo[61573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:31 compute-0 python3.9[61575]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:36:31 compute-0 sudo[61573]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:32 compute-0 sudo[61725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brccplgrwdbjldnqdbjowewixitrjiji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099792.1342907-99-186983849778397/AnsiballZ_ini_file.py'
Jan 22 16:36:32 compute-0 sudo[61725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:32 compute-0 python3.9[61727]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:36:32 compute-0 sudo[61725]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:33 compute-0 sudo[61877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jztmsoiobqbjrzubsuiyoltrumhyatku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099792.9984875-99-261734643858566/AnsiballZ_ini_file.py'
Jan 22 16:36:33 compute-0 sudo[61877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:33 compute-0 python3.9[61879]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:36:33 compute-0 sudo[61877]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:34 compute-0 sudo[62029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udblslmvujjcfljtvkltdptbrkpjeklr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099793.8505588-99-274543222197302/AnsiballZ_ini_file.py'
Jan 22 16:36:34 compute-0 sudo[62029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:34 compute-0 python3.9[62031]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:36:34 compute-0 sudo[62029]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:35 compute-0 sudo[62181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exvuhbklifssdczenrisedjhrkoeszwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099794.6888845-99-92444489613030/AnsiballZ_ini_file.py'
Jan 22 16:36:35 compute-0 sudo[62181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:35 compute-0 python3.9[62183]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:36:35 compute-0 sudo[62181]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:35 compute-0 sshd-session[61161]: Connection reset by authenticating user root 176.120.22.47 port 62276 [preauth]
Jan 22 16:36:35 compute-0 sudo[62333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yilcpfgmrzppctaayfgxbkppvannswnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099795.512667-130-118149565462976/AnsiballZ_dnf.py'
Jan 22 16:36:35 compute-0 sudo[62333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:36 compute-0 python3.9[62335]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 16:36:37 compute-0 sudo[62333]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:37 compute-0 sudo[62487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdxkuevwfgxolywhhitpyiioeoyiflwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099797.651741-141-241275398505184/AnsiballZ_setup.py'
Jan 22 16:36:37 compute-0 sudo[62487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:38 compute-0 python3.9[62489]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:36:38 compute-0 sudo[62487]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:38 compute-0 sshd-session[58932]: Connection reset by 176.120.22.47 port 42196 [preauth]
Jan 22 16:36:38 compute-0 sudo[62641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyjtezmkidunrtnrvosfvgunuhqlxfse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099798.4914625-149-32222976380648/AnsiballZ_stat.py'
Jan 22 16:36:38 compute-0 sudo[62641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:39 compute-0 python3.9[62643]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:36:39 compute-0 sudo[62641]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:39 compute-0 sudo[62793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylilqzjiciymznzjdfmrlbafstbvjmmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099799.3453522-158-264433367568931/AnsiballZ_stat.py'
Jan 22 16:36:39 compute-0 sudo[62793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:39 compute-0 python3.9[62795]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:36:39 compute-0 sudo[62793]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:40 compute-0 sudo[62945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymjcspvpbawhyilpqdoaxpyisgzfsdxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099800.2255979-168-253724465793211/AnsiballZ_command.py'
Jan 22 16:36:40 compute-0 sudo[62945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:40 compute-0 python3.9[62947]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:36:40 compute-0 sudo[62945]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:41 compute-0 sudo[63098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgjbiskmalkmpsmkuhvttezcwjijlibb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099801.1005108-178-253352618306899/AnsiballZ_service_facts.py'
Jan 22 16:36:41 compute-0 sudo[63098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:41 compute-0 python3.9[63100]: ansible-service_facts Invoked
Jan 22 16:36:41 compute-0 network[63117]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 22 16:36:41 compute-0 network[63118]: 'network-scripts' will be removed from distribution in near future.
Jan 22 16:36:41 compute-0 network[63119]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 22 16:36:46 compute-0 sudo[63098]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:47 compute-0 sudo[63402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsumzomjunixgfixgtitdssccoydlxvo ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1769099807.1512537-193-168331871065853/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1769099807.1512537-193-168331871065853/args'
Jan 22 16:36:47 compute-0 sudo[63402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:47 compute-0 sudo[63402]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:48 compute-0 sudo[63569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utxbnirkztnfjopcovphjvzgvjpkaxuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099808.0684502-204-25798117074121/AnsiballZ_dnf.py'
Jan 22 16:36:48 compute-0 sudo[63569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:48 compute-0 python3.9[63571]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 16:36:49 compute-0 sudo[63569]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:51 compute-0 sudo[63722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpotqifbtbltnfedsaelgovncklqmxsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099810.3209927-217-209004429597404/AnsiballZ_package_facts.py'
Jan 22 16:36:51 compute-0 sudo[63722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:51 compute-0 python3.9[63724]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 22 16:36:51 compute-0 sudo[63722]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:52 compute-0 sudo[63874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nahrknuzlcdjxdokpiohzlepevfqiadg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099811.9965663-227-268472904542070/AnsiballZ_stat.py'
Jan 22 16:36:52 compute-0 sudo[63874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:52 compute-0 python3.9[63876]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:36:52 compute-0 sudo[63874]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:53 compute-0 sudo[63999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fytjedmhchxzulrexskogopaueoivews ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099811.9965663-227-268472904542070/AnsiballZ_copy.py'
Jan 22 16:36:53 compute-0 sudo[63999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:53 compute-0 python3.9[64001]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769099811.9965663-227-268472904542070/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:36:53 compute-0 sudo[63999]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:53 compute-0 sudo[64153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfjkcfscniocdcjcdaqguitaxdvpjuxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099813.7102642-242-198609268475514/AnsiballZ_stat.py'
Jan 22 16:36:53 compute-0 sudo[64153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:54 compute-0 python3.9[64155]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:36:54 compute-0 sudo[64153]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:54 compute-0 sudo[64278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubsmcyebzssvjvfyndftjkxonhbljljx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099813.7102642-242-198609268475514/AnsiballZ_copy.py'
Jan 22 16:36:54 compute-0 sudo[64278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:54 compute-0 python3.9[64280]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769099813.7102642-242-198609268475514/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:36:54 compute-0 sudo[64278]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:55 compute-0 sudo[64432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oitydfwnbesejdvuabtuhkrxututbdxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099815.39785-263-112415142692243/AnsiballZ_lineinfile.py'
Jan 22 16:36:55 compute-0 sudo[64432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:56 compute-0 python3.9[64434]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:36:56 compute-0 sudo[64432]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:57 compute-0 sudo[64586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oydosrybwvewiiksbefwnwqocxwkirtn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099816.6526031-278-140152252534135/AnsiballZ_setup.py'
Jan 22 16:36:57 compute-0 sudo[64586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:57 compute-0 python3.9[64588]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 16:36:57 compute-0 sudo[64586]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:58 compute-0 sudo[64670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efffgdcqzydutgqcebdslgabcebtfeen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099816.6526031-278-140152252534135/AnsiballZ_systemd.py'
Jan 22 16:36:58 compute-0 sudo[64670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:58 compute-0 python3.9[64672]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:36:58 compute-0 sudo[64670]: pam_unix(sudo:session): session closed for user root
Jan 22 16:36:59 compute-0 sudo[64824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chokuetcbqqqsvbyiosmowywowgebiuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099819.107051-294-79678830479924/AnsiballZ_setup.py'
Jan 22 16:36:59 compute-0 sudo[64824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:36:59 compute-0 python3.9[64826]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 16:37:00 compute-0 sudo[64824]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:00 compute-0 sudo[64908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myqjehelyxtafuyvjimflcywarzarimb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099819.107051-294-79678830479924/AnsiballZ_systemd.py'
Jan 22 16:37:00 compute-0 sudo[64908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:00 compute-0 python3.9[64910]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 16:37:00 compute-0 chronyd[788]: chronyd exiting
Jan 22 16:37:00 compute-0 systemd[1]: Stopping NTP client/server...
Jan 22 16:37:00 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Jan 22 16:37:00 compute-0 systemd[1]: Stopped NTP client/server.
Jan 22 16:37:00 compute-0 systemd[1]: Starting NTP client/server...
Jan 22 16:37:00 compute-0 chronyd[64919]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 22 16:37:00 compute-0 chronyd[64919]: Frequency -26.264 +/- 0.281 ppm read from /var/lib/chrony/drift
Jan 22 16:37:00 compute-0 chronyd[64919]: Loaded seccomp filter (level 2)
Jan 22 16:37:00 compute-0 systemd[1]: Started NTP client/server.
Jan 22 16:37:00 compute-0 sudo[64908]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:01 compute-0 sshd-session[60057]: Connection closed by 192.168.122.30 port 36916
Jan 22 16:37:01 compute-0 sshd-session[60054]: pam_unix(sshd:session): session closed for user zuul
Jan 22 16:37:01 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Jan 22 16:37:01 compute-0 systemd[1]: session-14.scope: Consumed 28.332s CPU time.
Jan 22 16:37:01 compute-0 systemd-logind[796]: Session 14 logged out. Waiting for processes to exit.
Jan 22 16:37:01 compute-0 systemd-logind[796]: Removed session 14.
Jan 22 16:37:06 compute-0 sshd-session[64945]: Accepted publickey for zuul from 192.168.122.30 port 53640 ssh2: ECDSA SHA256:XN6iZwzcCiCbU0l3vCSWxZPd4ElPyx+ZEvhzj+S5SUw
Jan 22 16:37:06 compute-0 systemd-logind[796]: New session 15 of user zuul.
Jan 22 16:37:06 compute-0 systemd[1]: Started Session 15 of User zuul.
Jan 22 16:37:06 compute-0 sshd-session[64945]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 16:37:08 compute-0 python3.9[65098]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:37:09 compute-0 sudo[65252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nonhtwgswshhyyldaeqedjseympgaioy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099828.5135956-28-11571523466180/AnsiballZ_file.py'
Jan 22 16:37:09 compute-0 sudo[65252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:09 compute-0 python3.9[65254]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:37:09 compute-0 sudo[65252]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:10 compute-0 sudo[65427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oevjaqrlwngujbmrbquuovmiktkzqlas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099829.534399-36-136860655570961/AnsiballZ_stat.py'
Jan 22 16:37:10 compute-0 sudo[65427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:10 compute-0 python3.9[65429]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:37:10 compute-0 sudo[65427]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:10 compute-0 sudo[65505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwvmnzqhzglgxwzgpgjxscuwkzxbxlgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099829.534399-36-136860655570961/AnsiballZ_file.py'
Jan 22 16:37:10 compute-0 sudo[65505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:10 compute-0 python3.9[65507]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.ah4vpkrv recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:37:10 compute-0 sudo[65505]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:11 compute-0 sudo[65657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwxijavkookpdnpyzmolpeolnfbgbund ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099831.178242-56-115470193335140/AnsiballZ_stat.py'
Jan 22 16:37:11 compute-0 sudo[65657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:11 compute-0 python3.9[65659]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:37:11 compute-0 sudo[65657]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:12 compute-0 sudo[65780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wumsoybzijillysjeazxrrdzohwhduys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099831.178242-56-115470193335140/AnsiballZ_copy.py'
Jan 22 16:37:12 compute-0 sudo[65780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:12 compute-0 python3.9[65782]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769099831.178242-56-115470193335140/.source _original_basename=.8cqk27q3 follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:37:12 compute-0 sudo[65780]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:13 compute-0 sudo[65932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gieghojfjqfnjsnbwdiwxykpbxuhpnpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099832.8015149-72-232263929475754/AnsiballZ_file.py'
Jan 22 16:37:13 compute-0 sudo[65932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:13 compute-0 python3.9[65934]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:37:13 compute-0 sudo[65932]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:13 compute-0 sudo[66084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agswvrugewdcfakzvuvokeboocumtakv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099833.548683-80-128409180077418/AnsiballZ_stat.py'
Jan 22 16:37:13 compute-0 sudo[66084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:14 compute-0 python3.9[66086]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:37:14 compute-0 sudo[66084]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:14 compute-0 sudo[66207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyicepxslnbrfoptkivmbvhtonhqfvkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099833.548683-80-128409180077418/AnsiballZ_copy.py'
Jan 22 16:37:14 compute-0 sudo[66207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:14 compute-0 python3.9[66209]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769099833.548683-80-128409180077418/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:37:14 compute-0 sudo[66207]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:15 compute-0 sudo[66359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fblzihdbufcuepilndaonewnolkglhqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099834.9345586-80-173351487693223/AnsiballZ_stat.py'
Jan 22 16:37:15 compute-0 sudo[66359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:15 compute-0 python3.9[66361]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:37:15 compute-0 sudo[66359]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:15 compute-0 sudo[66482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybuciaxrgzjibskfuflwwkopcgylkpwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099834.9345586-80-173351487693223/AnsiballZ_copy.py'
Jan 22 16:37:15 compute-0 sudo[66482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:16 compute-0 python3.9[66484]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769099834.9345586-80-173351487693223/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:37:16 compute-0 sudo[66482]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:16 compute-0 sudo[66634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aftnoixbvjqymclmgwjknkslxxlrtbsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099836.274307-109-33448479880550/AnsiballZ_file.py'
Jan 22 16:37:16 compute-0 sudo[66634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:16 compute-0 python3.9[66636]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:37:16 compute-0 sudo[66634]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:17 compute-0 sudo[66786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljysjmhpcthwqxffnilqnzevqujwdcgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099836.927165-117-203973284718861/AnsiballZ_stat.py'
Jan 22 16:37:17 compute-0 sudo[66786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:17 compute-0 python3.9[66788]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:37:17 compute-0 sudo[66786]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:17 compute-0 sudo[66909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxbbqfpqbcmayrdqfaykcrsjedngxvgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099836.927165-117-203973284718861/AnsiballZ_copy.py'
Jan 22 16:37:17 compute-0 sudo[66909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:18 compute-0 python3.9[66911]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099836.927165-117-203973284718861/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:37:18 compute-0 sudo[66909]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:18 compute-0 sudo[67061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbkyzznqbhfuzlyudakufeamqkxafjgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099838.2868004-132-7008951896844/AnsiballZ_stat.py'
Jan 22 16:37:18 compute-0 sudo[67061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:18 compute-0 python3.9[67063]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:37:18 compute-0 sudo[67061]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:19 compute-0 sudo[67184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxqueiyczyxjalsubhzqsmjdpkezhpnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099838.2868004-132-7008951896844/AnsiballZ_copy.py'
Jan 22 16:37:19 compute-0 sudo[67184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:19 compute-0 python3.9[67186]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099838.2868004-132-7008951896844/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:37:19 compute-0 sudo[67184]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:20 compute-0 sudo[67336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulslyxzlhksbebdaxmazkomcqxbstvbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099839.8065314-147-191814834399793/AnsiballZ_systemd.py'
Jan 22 16:37:20 compute-0 sudo[67336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:20 compute-0 python3.9[67338]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:37:20 compute-0 systemd[1]: Reloading.
Jan 22 16:37:20 compute-0 systemd-sysv-generator[67367]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:37:20 compute-0 systemd-rc-local-generator[67361]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:37:21 compute-0 systemd[1]: Reloading.
Jan 22 16:37:21 compute-0 systemd-rc-local-generator[67402]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:37:21 compute-0 systemd-sysv-generator[67406]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:37:21 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Jan 22 16:37:21 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Jan 22 16:37:21 compute-0 sudo[67336]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:21 compute-0 sudo[67563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltujihovpmkhtnzptvewtbtmhhbjqjjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099841.5990598-155-62154184494903/AnsiballZ_stat.py'
Jan 22 16:37:21 compute-0 sudo[67563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:22 compute-0 python3.9[67565]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:37:22 compute-0 sudo[67563]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:22 compute-0 sudo[67686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idqkppslwmkmgexcillvlwgezfqiwfot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099841.5990598-155-62154184494903/AnsiballZ_copy.py'
Jan 22 16:37:22 compute-0 sudo[67686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:22 compute-0 python3.9[67688]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099841.5990598-155-62154184494903/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:37:22 compute-0 sudo[67686]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:23 compute-0 sudo[67838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbozowguvcttmonwaqgnzhhcywcyuaai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099842.8920681-170-87498771354174/AnsiballZ_stat.py'
Jan 22 16:37:23 compute-0 sudo[67838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:23 compute-0 python3.9[67840]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:37:23 compute-0 sudo[67838]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:23 compute-0 sudo[67961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwbpbfrblkqsiraeycvkfpaalolyymye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099842.8920681-170-87498771354174/AnsiballZ_copy.py'
Jan 22 16:37:23 compute-0 sudo[67961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:23 compute-0 python3.9[67963]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099842.8920681-170-87498771354174/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:37:24 compute-0 sudo[67961]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:24 compute-0 sudo[68113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uaoxykbflmmfolhjmmeytiodvnxdbuuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099844.2029507-185-141065576301543/AnsiballZ_systemd.py'
Jan 22 16:37:24 compute-0 sudo[68113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:24 compute-0 python3.9[68115]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:37:24 compute-0 systemd[1]: Reloading.
Jan 22 16:37:24 compute-0 systemd-sysv-generator[68149]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:37:24 compute-0 systemd-rc-local-generator[68145]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:37:25 compute-0 systemd[1]: Reloading.
Jan 22 16:37:25 compute-0 systemd-rc-local-generator[68175]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:37:25 compute-0 systemd-sysv-generator[68183]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:37:25 compute-0 systemd[1]: Starting Create netns directory...
Jan 22 16:37:25 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 22 16:37:25 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 22 16:37:25 compute-0 systemd[1]: Finished Create netns directory.
Jan 22 16:37:25 compute-0 sudo[68113]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:26 compute-0 python3.9[68341]: ansible-ansible.builtin.service_facts Invoked
Jan 22 16:37:26 compute-0 network[68358]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 22 16:37:26 compute-0 network[68359]: 'network-scripts' will be removed from distribution in near future.
Jan 22 16:37:26 compute-0 network[68360]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 22 16:37:31 compute-0 sudo[68620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yywnmehmnhttqdwpuaxfljlvjchbfriq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099851.3037496-201-100162017746749/AnsiballZ_systemd.py'
Jan 22 16:37:31 compute-0 sudo[68620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:31 compute-0 python3.9[68622]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:37:31 compute-0 systemd[1]: Reloading.
Jan 22 16:37:32 compute-0 systemd-rc-local-generator[68651]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:37:32 compute-0 systemd-sysv-generator[68654]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:37:32 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Jan 22 16:37:32 compute-0 iptables.init[68661]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Jan 22 16:37:32 compute-0 iptables.init[68661]: iptables: Flushing firewall rules: [  OK  ]
Jan 22 16:37:32 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Jan 22 16:37:32 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Jan 22 16:37:32 compute-0 sudo[68620]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:33 compute-0 sudo[68857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slgkyacotavfcevksgwkbuivtqubuqtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099852.7092214-201-176418463121152/AnsiballZ_systemd.py'
Jan 22 16:37:33 compute-0 sudo[68857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:33 compute-0 python3.9[68859]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:37:33 compute-0 sudo[68857]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:33 compute-0 sudo[69011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vckeeemdgbeoaujvxpyxpyuutgbjsaxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099853.5908518-217-251627232365655/AnsiballZ_systemd.py'
Jan 22 16:37:33 compute-0 sudo[69011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:34 compute-0 python3.9[69013]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:37:34 compute-0 systemd[1]: Reloading.
Jan 22 16:37:34 compute-0 systemd-rc-local-generator[69043]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:37:34 compute-0 systemd-sysv-generator[69047]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:37:34 compute-0 systemd[1]: Starting Netfilter Tables...
Jan 22 16:37:34 compute-0 systemd[1]: Finished Netfilter Tables.
Jan 22 16:37:34 compute-0 sudo[69011]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:35 compute-0 sudo[69204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kulpgzzwnwmhonmangrouqptnnicqjuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099854.881245-225-6919321467629/AnsiballZ_command.py'
Jan 22 16:37:35 compute-0 sudo[69204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:35 compute-0 python3.9[69206]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:37:35 compute-0 sudo[69204]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:36 compute-0 sudo[69357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggqzohhbhyxakrftfxrmhrhyowmiaagn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099855.9847124-239-50492942583387/AnsiballZ_stat.py'
Jan 22 16:37:36 compute-0 sudo[69357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:36 compute-0 python3.9[69359]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:37:36 compute-0 sudo[69357]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:36 compute-0 sudo[69482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwljsudxuevflnpzwhygaumcdmiawiak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099855.9847124-239-50492942583387/AnsiballZ_copy.py'
Jan 22 16:37:36 compute-0 sudo[69482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:37 compute-0 python3.9[69484]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769099855.9847124-239-50492942583387/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:37:37 compute-0 sudo[69482]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:37 compute-0 sudo[69635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhvgfysaubopwdqxgbxrngjtxfrovjpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099857.234393-254-19380271660275/AnsiballZ_systemd.py'
Jan 22 16:37:37 compute-0 sudo[69635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:37 compute-0 python3.9[69637]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 16:37:37 compute-0 systemd[1]: Reloading OpenSSH server daemon...
Jan 22 16:37:37 compute-0 sshd[1007]: Received SIGHUP; restarting.
Jan 22 16:37:37 compute-0 sshd[1007]: Server listening on 0.0.0.0 port 22.
Jan 22 16:37:37 compute-0 sshd[1007]: Server listening on :: port 22.
Jan 22 16:37:37 compute-0 systemd[1]: Reloaded OpenSSH server daemon.
Jan 22 16:37:37 compute-0 sudo[69635]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:38 compute-0 sudo[69791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptcgtloenagkrqtqjvpfvofsodgwevkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099858.0634642-262-145648968273223/AnsiballZ_file.py'
Jan 22 16:37:38 compute-0 sudo[69791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:38 compute-0 python3.9[69793]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:37:38 compute-0 sudo[69791]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:39 compute-0 sudo[69943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdxgtqcxbxppxinkfnkmncyqxwgguoni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099858.7720172-270-212492287680829/AnsiballZ_stat.py'
Jan 22 16:37:39 compute-0 sudo[69943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:39 compute-0 python3.9[69945]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:37:39 compute-0 sudo[69943]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:39 compute-0 sudo[70066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjewhfaqgkvgppkvhcfoijrqcexnjjci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099858.7720172-270-212492287680829/AnsiballZ_copy.py'
Jan 22 16:37:39 compute-0 sudo[70066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:39 compute-0 python3.9[70068]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099858.7720172-270-212492287680829/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:37:39 compute-0 sudo[70066]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:40 compute-0 sudo[70218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ueqettwbjuwxmuqwajzxlvdyxicdidky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099860.1946218-288-266008558792035/AnsiballZ_timezone.py'
Jan 22 16:37:40 compute-0 sudo[70218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:40 compute-0 python3.9[70220]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 22 16:37:40 compute-0 systemd[1]: Starting Time & Date Service...
Jan 22 16:37:41 compute-0 systemd[1]: Started Time & Date Service.
Jan 22 16:37:41 compute-0 sudo[70218]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:41 compute-0 sudo[70374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzhjtazqgmqzmomguoyvynpbbrtrdmep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099861.315529-297-121905771967418/AnsiballZ_file.py'
Jan 22 16:37:41 compute-0 sudo[70374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:41 compute-0 python3.9[70376]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:37:41 compute-0 sudo[70374]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:42 compute-0 sudo[70526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkdijyeluklvtpzlnkywvhroitwvofnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099862.0045588-305-276941800691890/AnsiballZ_stat.py'
Jan 22 16:37:42 compute-0 sudo[70526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:42 compute-0 python3.9[70528]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:37:42 compute-0 sudo[70526]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:43 compute-0 sudo[70649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yogkllttwlrvuomnpwjmojtdmakkalgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099862.0045588-305-276941800691890/AnsiballZ_copy.py'
Jan 22 16:37:43 compute-0 sudo[70649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:43 compute-0 python3.9[70651]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769099862.0045588-305-276941800691890/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:37:43 compute-0 sudo[70649]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:43 compute-0 sudo[70801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjjysdzxgfhtbcbnskuruapjrydreskg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099863.4810348-320-272206231262787/AnsiballZ_stat.py'
Jan 22 16:37:43 compute-0 sudo[70801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:44 compute-0 python3.9[70803]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:37:44 compute-0 sudo[70801]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:44 compute-0 sudo[70924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrpolqapmmvvlkfeiisydapraelyeinm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099863.4810348-320-272206231262787/AnsiballZ_copy.py'
Jan 22 16:37:44 compute-0 sudo[70924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:44 compute-0 python3.9[70926]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769099863.4810348-320-272206231262787/.source.yaml _original_basename=.myysmivd follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:37:44 compute-0 sudo[70924]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:45 compute-0 sudo[71076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reasgtpuabhsrqvgckkbamiweutbzjwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099864.8262827-335-198171076440831/AnsiballZ_stat.py'
Jan 22 16:37:45 compute-0 sudo[71076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:45 compute-0 python3.9[71078]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:37:45 compute-0 sudo[71076]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:45 compute-0 sudo[71199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqpeffflexzigraazzlkhzcnqhddhdba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099864.8262827-335-198171076440831/AnsiballZ_copy.py'
Jan 22 16:37:45 compute-0 sudo[71199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:46 compute-0 python3.9[71201]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099864.8262827-335-198171076440831/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:37:46 compute-0 sudo[71199]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:46 compute-0 sudo[71351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqplyqvkmqsptikmhnnkhyxjwyaywgoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099866.295769-350-169212978726212/AnsiballZ_command.py'
Jan 22 16:37:46 compute-0 sudo[71351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:46 compute-0 python3.9[71353]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:37:46 compute-0 sudo[71351]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:47 compute-0 sudo[71504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sodwtrskxaetbeoazyzkelnacoprihwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099866.9942887-358-63206400722216/AnsiballZ_command.py'
Jan 22 16:37:47 compute-0 sudo[71504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:47 compute-0 python3.9[71506]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:37:47 compute-0 sudo[71504]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:48 compute-0 sudo[71657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-liuqdsygsfvtoflhoqgducjgibyajbfa ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769099867.6780202-366-236181663958623/AnsiballZ_edpm_nftables_from_files.py'
Jan 22 16:37:48 compute-0 sudo[71657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:48 compute-0 python3[71659]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 22 16:37:48 compute-0 sudo[71657]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:49 compute-0 sudo[71809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjmhzpuaeldtsmekjzbgzgyckgtwtciv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099869.1388323-374-39239835587016/AnsiballZ_stat.py'
Jan 22 16:37:49 compute-0 sudo[71809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:49 compute-0 python3.9[71811]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:37:49 compute-0 sudo[71809]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:50 compute-0 sudo[71932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frkiyatqevagvawektehpqdluqrawkyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099869.1388323-374-39239835587016/AnsiballZ_copy.py'
Jan 22 16:37:50 compute-0 sudo[71932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:50 compute-0 python3.9[71934]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099869.1388323-374-39239835587016/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:37:50 compute-0 sudo[71932]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:51 compute-0 sudo[72084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdehlzmwutmjbnhjcwtwuwdcdwnqsekt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099870.578247-389-149024283918609/AnsiballZ_stat.py'
Jan 22 16:37:51 compute-0 sudo[72084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:51 compute-0 python3.9[72086]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:37:51 compute-0 sudo[72084]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:51 compute-0 sudo[72207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptvmnvzxxrvyhfeperkgexoipewuhsqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099870.578247-389-149024283918609/AnsiballZ_copy.py'
Jan 22 16:37:51 compute-0 sudo[72207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:52 compute-0 python3.9[72209]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099870.578247-389-149024283918609/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:37:52 compute-0 sudo[72207]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:52 compute-0 sudo[72359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-settkksjeefjjfrbcycbejgiywvyjxgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099872.2473607-404-118895265754420/AnsiballZ_stat.py'
Jan 22 16:37:52 compute-0 sudo[72359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:52 compute-0 python3.9[72361]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:37:52 compute-0 sudo[72359]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:53 compute-0 sudo[72482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dggsrcbpmsznjufinsornqtukeetwoil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099872.2473607-404-118895265754420/AnsiballZ_copy.py'
Jan 22 16:37:53 compute-0 sudo[72482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:53 compute-0 python3.9[72484]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099872.2473607-404-118895265754420/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:37:53 compute-0 sudo[72482]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:53 compute-0 sudo[72634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfiavmvciqwhtrfunpmndcmhwawlszkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099873.390168-419-6145017065290/AnsiballZ_stat.py'
Jan 22 16:37:53 compute-0 sudo[72634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:53 compute-0 python3.9[72636]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:37:53 compute-0 sudo[72634]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:54 compute-0 sudo[72757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzmcphyuzhivspxsekbpjwzvcgagsjla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099873.390168-419-6145017065290/AnsiballZ_copy.py'
Jan 22 16:37:54 compute-0 sudo[72757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:54 compute-0 python3.9[72759]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099873.390168-419-6145017065290/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:37:54 compute-0 sudo[72757]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:55 compute-0 sudo[72909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-misezymikygbmgqqbpyatijvvibfdscd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099874.6696582-434-177276711043970/AnsiballZ_stat.py'
Jan 22 16:37:55 compute-0 sudo[72909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:55 compute-0 python3.9[72911]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:37:55 compute-0 sudo[72909]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:55 compute-0 sudo[73032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-paurumpzeshzuvxjnznqewwiistgxftv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099874.6696582-434-177276711043970/AnsiballZ_copy.py'
Jan 22 16:37:55 compute-0 sudo[73032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:56 compute-0 python3.9[73034]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099874.6696582-434-177276711043970/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:37:56 compute-0 sudo[73032]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:56 compute-0 sudo[73184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhdyjfwrjeahbamjrhsrlfxenyqipfvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099876.3069756-449-100914953226685/AnsiballZ_file.py'
Jan 22 16:37:56 compute-0 sudo[73184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:56 compute-0 python3.9[73186]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:37:56 compute-0 sudo[73184]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:57 compute-0 sudo[73336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrwqnampronsswjijschylulixpxpoog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099877.164058-457-172201730799622/AnsiballZ_command.py'
Jan 22 16:37:57 compute-0 sudo[73336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:57 compute-0 python3.9[73338]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:37:57 compute-0 sudo[73336]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:58 compute-0 sudo[73495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zugyanfctaczduxtpcyjqleaqrkrxpbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099878.0063188-465-219350924132473/AnsiballZ_blockinfile.py'
Jan 22 16:37:58 compute-0 sudo[73495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:58 compute-0 python3.9[73497]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:37:58 compute-0 sudo[73495]: pam_unix(sudo:session): session closed for user root
Jan 22 16:37:59 compute-0 sudo[73648]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpyalhrseojokqaeamdyfgzpmvnxmlsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099879.2161336-474-74303399701787/AnsiballZ_file.py'
Jan 22 16:37:59 compute-0 sudo[73648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:37:59 compute-0 python3.9[73650]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:37:59 compute-0 sudo[73648]: pam_unix(sudo:session): session closed for user root
Jan 22 16:38:00 compute-0 sudo[73800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drjohazfxsrymwtcfzuhhcoodgbqnlwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099880.0206647-474-1516918593077/AnsiballZ_file.py'
Jan 22 16:38:00 compute-0 sudo[73800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:38:00 compute-0 python3.9[73802]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:38:00 compute-0 sudo[73800]: pam_unix(sudo:session): session closed for user root
Jan 22 16:38:01 compute-0 sudo[73952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jctvvfvkmejptygubobygthlpwcbgrki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099880.8999386-489-274023589041075/AnsiballZ_mount.py'
Jan 22 16:38:01 compute-0 sudo[73952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:38:01 compute-0 python3.9[73954]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 22 16:38:01 compute-0 sudo[73952]: pam_unix(sudo:session): session closed for user root
Jan 22 16:38:01 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 16:38:01 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 16:38:02 compute-0 sudo[74106]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aasdjwewgcsckwtroczwbcjnqcbeluaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099881.809673-489-202286533694345/AnsiballZ_mount.py'
Jan 22 16:38:02 compute-0 sudo[74106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:38:02 compute-0 python3.9[74108]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 22 16:38:02 compute-0 sudo[74106]: pam_unix(sudo:session): session closed for user root
Jan 22 16:38:02 compute-0 sshd-session[64948]: Connection closed by 192.168.122.30 port 53640
Jan 22 16:38:02 compute-0 sshd-session[64945]: pam_unix(sshd:session): session closed for user zuul
Jan 22 16:38:02 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Jan 22 16:38:02 compute-0 systemd[1]: session-15.scope: Consumed 38.687s CPU time.
Jan 22 16:38:02 compute-0 systemd-logind[796]: Session 15 logged out. Waiting for processes to exit.
Jan 22 16:38:02 compute-0 systemd-logind[796]: Removed session 15.
Jan 22 16:38:09 compute-0 sshd-session[74134]: Accepted publickey for zuul from 192.168.122.30 port 52850 ssh2: ECDSA SHA256:XN6iZwzcCiCbU0l3vCSWxZPd4ElPyx+ZEvhzj+S5SUw
Jan 22 16:38:09 compute-0 systemd-logind[796]: New session 16 of user zuul.
Jan 22 16:38:09 compute-0 systemd[1]: Started Session 16 of User zuul.
Jan 22 16:38:09 compute-0 sshd-session[74134]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 16:38:10 compute-0 sudo[74287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgphlknnefpimvtbakxeqlgqrqmbuvhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099889.629371-16-115278970810647/AnsiballZ_tempfile.py'
Jan 22 16:38:10 compute-0 sudo[74287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:38:10 compute-0 python3.9[74289]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 22 16:38:10 compute-0 sudo[74287]: pam_unix(sudo:session): session closed for user root
Jan 22 16:38:11 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 22 16:38:11 compute-0 sudo[74441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykzmopeuyjxjfonkkeksjdqierrujgcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099890.728311-28-38054997306523/AnsiballZ_stat.py'
Jan 22 16:38:11 compute-0 sudo[74441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:38:11 compute-0 python3.9[74443]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:38:11 compute-0 sudo[74441]: pam_unix(sudo:session): session closed for user root
Jan 22 16:38:12 compute-0 sudo[74593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bauydgbtoevwijzkqwuszsrfvfxxjyfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099891.6561441-38-195994456776750/AnsiballZ_setup.py'
Jan 22 16:38:12 compute-0 sudo[74593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:38:12 compute-0 python3.9[74595]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:38:12 compute-0 sudo[74593]: pam_unix(sudo:session): session closed for user root
Jan 22 16:38:13 compute-0 sudo[74745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khhhnftzvworrlpkjvndmbgvfcppnmzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099893.005792-47-269044761590493/AnsiballZ_blockinfile.py'
Jan 22 16:38:13 compute-0 sudo[74745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:38:13 compute-0 python3.9[74747]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCdsV1wFHvbKek26PJ/WDWSmsb5jWqP3JOejZvj+VVQvN/DuY1wyTB2ILyXe7Ps8M1FxKeETq1zgkHqKuVFMevwUDoqWs2HGtdudQabAGvW1Da4sbSsBnzjkzgOZ1JoHEbfA/GmxtaXn4vl4ADoaHt0izYpQvSQWZLcO8hZ/pDQgF472GBYJSXxcANJbCTaYHZ1JTpNoNZXQSH5tNSEFeqAOYmsOEth9MkmfMk+rNiQnu1DYeesuGwg2pJg8SnE/puA3IJMI5u9MtIHLAmAOk3NDLj2AZDeev6k+AIC4ZMaUQh6jZOhCFx9gUh0ORoh9fuDrhZRX/zfVo3iDf86HtRfVI9pkHPsSioTImgMvPUktMnGuTPTSg0fJ/pmgwclNgQHbYucDkTWSC8jZPGUjNBbTB+unj63RaRlvwYnftD6rtdF0kMWMetkOzsuhE22Oc2UlOKmMfZbq6hxSQiIHeLBplDCvDXXx984rhTMqsa1OYFLPw9X+k5oOogrA2JF7nU=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBeR5NEPzNe+sJVnbccFtxSV7Iojskta34U7z53fetYu
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBA1xzUp3MZBun5CM85rPv/2nq++PPV9rCpyzOVdUPnTjseEcUUtj5AXvB5vT+ibZdMmF4Z2NOERNCY6npvCE+FE=
                                             create=True mode=0644 path=/tmp/ansible.1fq1sg25 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:38:13 compute-0 sudo[74745]: pam_unix(sudo:session): session closed for user root
Jan 22 16:38:14 compute-0 sudo[74897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llndlwcvxodbrgijlyoisgcfcnikcbui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099893.9945824-55-207522957331021/AnsiballZ_command.py'
Jan 22 16:38:14 compute-0 sudo[74897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:38:14 compute-0 python3.9[74899]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.1fq1sg25' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:38:14 compute-0 sudo[74897]: pam_unix(sudo:session): session closed for user root
Jan 22 16:38:15 compute-0 sudo[75051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-euqqmoewhgchlhfspmvftefiibynholw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099894.9210327-63-107000699177276/AnsiballZ_file.py'
Jan 22 16:38:15 compute-0 sudo[75051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:38:15 compute-0 python3.9[75053]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.1fq1sg25 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:38:15 compute-0 sudo[75051]: pam_unix(sudo:session): session closed for user root
Jan 22 16:38:16 compute-0 sshd-session[74137]: Connection closed by 192.168.122.30 port 52850
Jan 22 16:38:16 compute-0 sshd-session[74134]: pam_unix(sshd:session): session closed for user zuul
Jan 22 16:38:16 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Jan 22 16:38:16 compute-0 systemd[1]: session-16.scope: Consumed 4.203s CPU time.
Jan 22 16:38:16 compute-0 systemd-logind[796]: Session 16 logged out. Waiting for processes to exit.
Jan 22 16:38:16 compute-0 systemd-logind[796]: Removed session 16.
Jan 22 16:38:22 compute-0 sshd-session[75078]: Accepted publickey for zuul from 192.168.122.30 port 38478 ssh2: ECDSA SHA256:XN6iZwzcCiCbU0l3vCSWxZPd4ElPyx+ZEvhzj+S5SUw
Jan 22 16:38:22 compute-0 systemd-logind[796]: New session 17 of user zuul.
Jan 22 16:38:22 compute-0 systemd[1]: Started Session 17 of User zuul.
Jan 22 16:38:22 compute-0 sshd-session[75078]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 16:38:23 compute-0 python3.9[75231]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:38:24 compute-0 sudo[75385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsxqkguxuzongzpnezpwsldqdgalrjiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099903.6107514-27-80306058083402/AnsiballZ_systemd.py'
Jan 22 16:38:24 compute-0 sudo[75385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:38:24 compute-0 python3.9[75387]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 22 16:38:24 compute-0 sudo[75385]: pam_unix(sudo:session): session closed for user root
Jan 22 16:38:25 compute-0 sudo[75539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlhgedejavbrqamarqdxmhuhfddwauxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099904.7963097-35-260505988531663/AnsiballZ_systemd.py'
Jan 22 16:38:25 compute-0 sudo[75539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:38:25 compute-0 python3.9[75541]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 16:38:25 compute-0 sudo[75539]: pam_unix(sudo:session): session closed for user root
Jan 22 16:38:26 compute-0 sudo[75692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsjzvifbagfipjsewbwmekbyjlljoezd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099905.8386018-44-250567987813796/AnsiballZ_command.py'
Jan 22 16:38:26 compute-0 sudo[75692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:38:26 compute-0 python3.9[75694]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:38:26 compute-0 sudo[75692]: pam_unix(sudo:session): session closed for user root
Jan 22 16:38:27 compute-0 sudo[75845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncdpltuvhofzhjypjeufrrvqvihmrsew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099906.7620788-52-262092992613666/AnsiballZ_stat.py'
Jan 22 16:38:27 compute-0 sudo[75845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:38:27 compute-0 python3.9[75847]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:38:27 compute-0 sudo[75845]: pam_unix(sudo:session): session closed for user root
Jan 22 16:38:28 compute-0 sudo[75999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjfrsuuaivlsubtvhbrhsgociklffahs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099908.0096598-60-280504759313299/AnsiballZ_command.py'
Jan 22 16:38:28 compute-0 sudo[75999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:38:28 compute-0 python3.9[76001]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:38:28 compute-0 sudo[75999]: pam_unix(sudo:session): session closed for user root
Jan 22 16:38:29 compute-0 sudo[76154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iggegjdbuowqhvthbewlfzpzomaqrnyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099908.8180573-68-134359602793372/AnsiballZ_file.py'
Jan 22 16:38:29 compute-0 sudo[76154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:38:29 compute-0 python3.9[76156]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:38:29 compute-0 sudo[76154]: pam_unix(sudo:session): session closed for user root
Jan 22 16:38:29 compute-0 sshd-session[75081]: Connection closed by 192.168.122.30 port 38478
Jan 22 16:38:29 compute-0 sshd-session[75078]: pam_unix(sshd:session): session closed for user zuul
Jan 22 16:38:29 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Jan 22 16:38:29 compute-0 systemd[1]: session-17.scope: Consumed 5.207s CPU time.
Jan 22 16:38:29 compute-0 systemd-logind[796]: Session 17 logged out. Waiting for processes to exit.
Jan 22 16:38:29 compute-0 systemd-logind[796]: Removed session 17.
Jan 22 16:38:34 compute-0 sshd-session[76181]: Accepted publickey for zuul from 192.168.122.30 port 60076 ssh2: ECDSA SHA256:XN6iZwzcCiCbU0l3vCSWxZPd4ElPyx+ZEvhzj+S5SUw
Jan 22 16:38:34 compute-0 systemd-logind[796]: New session 18 of user zuul.
Jan 22 16:38:34 compute-0 systemd[1]: Started Session 18 of User zuul.
Jan 22 16:38:34 compute-0 sshd-session[76181]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 16:38:36 compute-0 python3.9[76334]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:38:36 compute-0 sudo[76488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjmggbldjelmyjcdlulpzwjaxtrutfaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099916.3927062-29-262639836413103/AnsiballZ_setup.py'
Jan 22 16:38:36 compute-0 sudo[76488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:38:37 compute-0 python3.9[76490]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 16:38:37 compute-0 sudo[76488]: pam_unix(sudo:session): session closed for user root
Jan 22 16:38:37 compute-0 sudo[76572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxyfdlxwceklaxekivlolviryydbmjsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099916.3927062-29-262639836413103/AnsiballZ_dnf.py'
Jan 22 16:38:37 compute-0 sudo[76572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:38:37 compute-0 python3.9[76574]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 16:38:39 compute-0 sudo[76572]: pam_unix(sudo:session): session closed for user root
Jan 22 16:38:40 compute-0 python3.9[76725]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:38:41 compute-0 python3.9[76876]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 22 16:38:42 compute-0 python3.9[77026]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:38:43 compute-0 python3.9[77176]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:38:43 compute-0 sshd-session[76184]: Connection closed by 192.168.122.30 port 60076
Jan 22 16:38:43 compute-0 sshd-session[76181]: pam_unix(sshd:session): session closed for user zuul
Jan 22 16:38:43 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Jan 22 16:38:43 compute-0 systemd[1]: session-18.scope: Consumed 6.469s CPU time.
Jan 22 16:38:43 compute-0 systemd-logind[796]: Session 18 logged out. Waiting for processes to exit.
Jan 22 16:38:43 compute-0 systemd-logind[796]: Removed session 18.
Jan 22 16:38:49 compute-0 sshd-session[77201]: Accepted publickey for zuul from 192.168.122.30 port 56238 ssh2: ECDSA SHA256:XN6iZwzcCiCbU0l3vCSWxZPd4ElPyx+ZEvhzj+S5SUw
Jan 22 16:38:49 compute-0 systemd-logind[796]: New session 19 of user zuul.
Jan 22 16:38:49 compute-0 systemd[1]: Started Session 19 of User zuul.
Jan 22 16:38:49 compute-0 sshd-session[77201]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 16:38:50 compute-0 python3.9[77354]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:38:51 compute-0 sudo[77508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzfcqqrfjoeaiglqvugaelyklklugdwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099931.2281752-45-203915516008010/AnsiballZ_file.py'
Jan 22 16:38:51 compute-0 sudo[77508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:38:51 compute-0 python3.9[77510]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:38:51 compute-0 sudo[77508]: pam_unix(sudo:session): session closed for user root
Jan 22 16:38:52 compute-0 sudo[77660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfdwbzmgqvfxaoatptnwtivtalhcbwhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099932.0802252-45-268461087684595/AnsiballZ_file.py'
Jan 22 16:38:52 compute-0 sudo[77660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:38:52 compute-0 python3.9[77662]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:38:52 compute-0 sudo[77660]: pam_unix(sudo:session): session closed for user root
Jan 22 16:38:53 compute-0 sudo[77812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnlqtvscoimptizxbrgqwtmvtlburqcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099932.7967901-60-262594438639951/AnsiballZ_stat.py'
Jan 22 16:38:53 compute-0 sudo[77812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:38:53 compute-0 python3.9[77814]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:38:53 compute-0 sudo[77812]: pam_unix(sudo:session): session closed for user root
Jan 22 16:38:54 compute-0 sudo[77935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxiibvfaemanhhhxmbjvfeuidqwvwfvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099932.7967901-60-262594438639951/AnsiballZ_copy.py'
Jan 22 16:38:54 compute-0 sudo[77935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:38:54 compute-0 python3.9[77937]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099932.7967901-60-262594438639951/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=db4ade5694a7ced2c07d189be3dc13fe174419d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:38:54 compute-0 sudo[77935]: pam_unix(sudo:session): session closed for user root
Jan 22 16:38:54 compute-0 sudo[78087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxnfwsqnylzrsiptezpvmutujoclmhdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099934.3694441-60-48981761545815/AnsiballZ_stat.py'
Jan 22 16:38:54 compute-0 sudo[78087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:38:54 compute-0 python3.9[78089]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:38:54 compute-0 sudo[78087]: pam_unix(sudo:session): session closed for user root
Jan 22 16:38:55 compute-0 sudo[78210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlcvjjpjoqwrpyrxehrovzllnzlqfnpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099934.3694441-60-48981761545815/AnsiballZ_copy.py'
Jan 22 16:38:55 compute-0 sudo[78210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:38:55 compute-0 python3.9[78212]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099934.3694441-60-48981761545815/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=4ded5b46db3377b111d5a5dfc88cc962030b8d04 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:38:55 compute-0 sudo[78210]: pam_unix(sudo:session): session closed for user root
Jan 22 16:38:56 compute-0 sudo[78362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spymrxmjlxznbmxtwwfqqxatdolwuwyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099935.7057598-60-129367736555570/AnsiballZ_stat.py'
Jan 22 16:38:56 compute-0 sudo[78362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:38:56 compute-0 python3.9[78364]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:38:56 compute-0 sudo[78362]: pam_unix(sudo:session): session closed for user root
Jan 22 16:38:56 compute-0 sudo[78486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjtkksighhuqtqeyyzrusgszgykwhbrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099935.7057598-60-129367736555570/AnsiballZ_copy.py'
Jan 22 16:38:56 compute-0 sudo[78486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:38:56 compute-0 python3.9[78488]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099935.7057598-60-129367736555570/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=0a1e1ca78503608b2be7c28408260d03568eeaf2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:38:56 compute-0 sudo[78486]: pam_unix(sudo:session): session closed for user root
Jan 22 16:38:57 compute-0 sudo[78638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szmqojelquyumvxwlhznpseltqgkkhdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099937.1721897-104-21764272573756/AnsiballZ_file.py'
Jan 22 16:38:57 compute-0 sudo[78638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:38:57 compute-0 python3.9[78640]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:38:57 compute-0 sudo[78638]: pam_unix(sudo:session): session closed for user root
Jan 22 16:38:58 compute-0 sudo[78790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drhfjvfutwrxbszkqhjnftcbrckfdxkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099937.841763-104-126729408783154/AnsiballZ_file.py'
Jan 22 16:38:58 compute-0 sudo[78790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:38:58 compute-0 python3.9[78792]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:38:58 compute-0 sudo[78790]: pam_unix(sudo:session): session closed for user root
Jan 22 16:38:59 compute-0 sudo[78942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhpozujglmlndqoobkeisauaruteojma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099938.618333-119-205105012332652/AnsiballZ_stat.py'
Jan 22 16:38:59 compute-0 sudo[78942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:38:59 compute-0 python3.9[78944]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:38:59 compute-0 sudo[78942]: pam_unix(sudo:session): session closed for user root
Jan 22 16:38:59 compute-0 sudo[79065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfoholajkugehxryjnraiillptjwdkcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099938.618333-119-205105012332652/AnsiballZ_copy.py'
Jan 22 16:38:59 compute-0 sudo[79065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:38:59 compute-0 python3.9[79067]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099938.618333-119-205105012332652/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=8f418ad963f887c3032cf99c6305943c6c44a8f0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:38:59 compute-0 sudo[79065]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:00 compute-0 sudo[79217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kuwkbsatndzodewbbldooleoevggpsfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099939.9735951-119-151120953269784/AnsiballZ_stat.py'
Jan 22 16:39:00 compute-0 sudo[79217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:00 compute-0 python3.9[79219]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:39:00 compute-0 sudo[79217]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:00 compute-0 sudo[79340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpmulzefzpmhapmhodtwodiojteextss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099939.9735951-119-151120953269784/AnsiballZ_copy.py'
Jan 22 16:39:00 compute-0 sudo[79340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:01 compute-0 anacron[4164]: Job `cron.daily' started
Jan 22 16:39:01 compute-0 anacron[4164]: Job `cron.daily' terminated
Jan 22 16:39:01 compute-0 python3.9[79342]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099939.9735951-119-151120953269784/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=fa5df7907408c90d27cb095e6aeff5a1b382daad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:39:01 compute-0 sudo[79340]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:01 compute-0 sudo[79494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vikcuyjnghxlxyuvxhpbavtlxpbdxbns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099941.3854082-119-174356790926954/AnsiballZ_stat.py'
Jan 22 16:39:01 compute-0 sudo[79494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:02 compute-0 python3.9[79496]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:39:02 compute-0 sudo[79494]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:02 compute-0 sudo[79617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzipvvlfntgxnntqyjmvxosxzaxbzkrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099941.3854082-119-174356790926954/AnsiballZ_copy.py'
Jan 22 16:39:02 compute-0 sudo[79617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:02 compute-0 python3.9[79619]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099941.3854082-119-174356790926954/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=f4b5a0b86f5e607ea6d7c3ba0328ced151b1f741 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:39:02 compute-0 sudo[79617]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:03 compute-0 sudo[79769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehygungmezjslsrnylmlhjcdfqbsrple ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099942.9792757-163-225106152936301/AnsiballZ_file.py'
Jan 22 16:39:03 compute-0 sudo[79769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:03 compute-0 python3.9[79771]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:39:03 compute-0 sudo[79769]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:04 compute-0 sudo[79921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfendfcrfifiypxcvgjlkwjezbotwbqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099943.691307-163-71407187498077/AnsiballZ_file.py'
Jan 22 16:39:04 compute-0 sudo[79921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:04 compute-0 python3.9[79923]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:39:04 compute-0 sudo[79921]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:04 compute-0 sudo[80073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eowhrqhexyrhwxqpfsoricfarxfbbptr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099944.449097-178-214746019963088/AnsiballZ_stat.py'
Jan 22 16:39:04 compute-0 sudo[80073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:04 compute-0 python3.9[80075]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:39:04 compute-0 sudo[80073]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:05 compute-0 sudo[80196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yghlhnfsfpboyyuyyyxgfxltumyakigr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099944.449097-178-214746019963088/AnsiballZ_copy.py'
Jan 22 16:39:05 compute-0 sudo[80196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:05 compute-0 python3.9[80198]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099944.449097-178-214746019963088/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=4f357c092600a4625ef4969934727a10a1e20fca backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:39:05 compute-0 sudo[80196]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:06 compute-0 sudo[80348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqnlsdhdbjqhgoxjuptpayafahehfnnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099945.7360418-178-102782957200291/AnsiballZ_stat.py'
Jan 22 16:39:06 compute-0 sudo[80348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:06 compute-0 python3.9[80350]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:39:06 compute-0 sudo[80348]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:06 compute-0 sudo[80471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-physnzygpqliyubcdqseaqwdhlgrlyyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099945.7360418-178-102782957200291/AnsiballZ_copy.py'
Jan 22 16:39:06 compute-0 sudo[80471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:06 compute-0 python3.9[80473]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099945.7360418-178-102782957200291/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=f08ca7dccae28d5d5e7be68f169026caba53d3ec backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:39:06 compute-0 sudo[80471]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:07 compute-0 sudo[80623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ueuxeuudwauyiajqnjfkehzftagsrqkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099946.942091-178-53900796960777/AnsiballZ_stat.py'
Jan 22 16:39:07 compute-0 sudo[80623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:07 compute-0 python3.9[80625]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:39:07 compute-0 sudo[80623]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:07 compute-0 sudo[80746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llzvjuhwpolgmlukhtmbcbsvlqjmtiop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099946.942091-178-53900796960777/AnsiballZ_copy.py'
Jan 22 16:39:07 compute-0 sudo[80746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:08 compute-0 python3.9[80748]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099946.942091-178-53900796960777/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=7741466d710cc6d54f884247e7de7aca629026b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:39:08 compute-0 sudo[80746]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:08 compute-0 sudo[80898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwuyzmeipsadztvtiyqlqntsvuyvaacb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099948.4885728-222-46878925422884/AnsiballZ_file.py'
Jan 22 16:39:08 compute-0 sudo[80898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:09 compute-0 python3.9[80900]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:39:09 compute-0 sudo[80898]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:09 compute-0 sudo[81050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gebvxdxeyypsqsuwupdkxbrpxxxqiuzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099949.2422488-222-235265524022372/AnsiballZ_file.py'
Jan 22 16:39:09 compute-0 sudo[81050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:09 compute-0 python3.9[81052]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:39:09 compute-0 sudo[81050]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:10 compute-0 sudo[81202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swqkijwdffagvrmbxadzqecqkszkaqlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099949.8833716-237-150785104396893/AnsiballZ_stat.py'
Jan 22 16:39:10 compute-0 sudo[81202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:10 compute-0 chronyd[64919]: Selected source 149.56.19.163 (pool.ntp.org)
Jan 22 16:39:10 compute-0 python3.9[81204]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:39:10 compute-0 sudo[81202]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:10 compute-0 sudo[81325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kflhovqtpzozjkkmyavkdtrbacswgefl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099949.8833716-237-150785104396893/AnsiballZ_copy.py'
Jan 22 16:39:10 compute-0 sudo[81325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:10 compute-0 python3.9[81327]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099949.8833716-237-150785104396893/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=6bd90cb21778247429e106db826f6880655a45a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:39:10 compute-0 sudo[81325]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:11 compute-0 sudo[81477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtihszlmrsyepnmsbfgvzmsvuvqowsfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099951.1631463-237-28440637369050/AnsiballZ_stat.py'
Jan 22 16:39:11 compute-0 sudo[81477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:11 compute-0 python3.9[81479]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:39:11 compute-0 sudo[81477]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:12 compute-0 sudo[81600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmnmeytdvegvdwntawerbqdoflwvxslu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099951.1631463-237-28440637369050/AnsiballZ_copy.py'
Jan 22 16:39:12 compute-0 sudo[81600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:12 compute-0 python3.9[81602]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099951.1631463-237-28440637369050/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=f08ca7dccae28d5d5e7be68f169026caba53d3ec backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:39:12 compute-0 sudo[81600]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:12 compute-0 sudo[81752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbhbcuxmcozdvravygvgmremvgljadro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099952.4367065-237-171133301890749/AnsiballZ_stat.py'
Jan 22 16:39:12 compute-0 sudo[81752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:12 compute-0 python3.9[81754]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:39:12 compute-0 sudo[81752]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:13 compute-0 sudo[81875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tipaqvhwgistqqlhxjabfetlrotgognz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099952.4367065-237-171133301890749/AnsiballZ_copy.py'
Jan 22 16:39:13 compute-0 sudo[81875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:13 compute-0 python3.9[81877]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099952.4367065-237-171133301890749/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=4cdbe07d6224e0747728a1b2ae3d7c2ae755e22d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:39:13 compute-0 sudo[81875]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:14 compute-0 sudo[82027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yriwfzikzypdnpcowiasddvqjpbqgmei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099954.4064806-297-46533823201766/AnsiballZ_file.py'
Jan 22 16:39:14 compute-0 sudo[82027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:14 compute-0 python3.9[82029]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:39:14 compute-0 sudo[82027]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:15 compute-0 sudo[82179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqvyrvuizprsizjywrshutnjpuwzhbnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099955.1497977-305-36699258452458/AnsiballZ_stat.py'
Jan 22 16:39:15 compute-0 sudo[82179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:15 compute-0 python3.9[82181]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:39:15 compute-0 sudo[82179]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:16 compute-0 sudo[82302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afjumapyjrpmphhywwpmqdtlgbwfxchu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099955.1497977-305-36699258452458/AnsiballZ_copy.py'
Jan 22 16:39:16 compute-0 sudo[82302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:16 compute-0 python3.9[82304]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099955.1497977-305-36699258452458/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=62f56ebb86b7819c5ce2b2a14a69280df383a076 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:39:16 compute-0 sudo[82302]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:17 compute-0 sudo[82454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtzlapkeqpizwbcnawbuzkulyrpcigyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099956.7505364-321-7986269741708/AnsiballZ_file.py'
Jan 22 16:39:17 compute-0 sudo[82454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:17 compute-0 python3.9[82456]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:39:17 compute-0 sudo[82454]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:17 compute-0 sudo[82606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtoccnrhnjxkmgkeochllnssbqsrnyfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099957.5323818-329-52201258775992/AnsiballZ_stat.py'
Jan 22 16:39:17 compute-0 sudo[82606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:18 compute-0 python3.9[82608]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:39:18 compute-0 sudo[82606]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:18 compute-0 sudo[82729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-seljbrdfzecieiujffljtvvilpzeugbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099957.5323818-329-52201258775992/AnsiballZ_copy.py'
Jan 22 16:39:18 compute-0 sudo[82729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:18 compute-0 python3.9[82731]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099957.5323818-329-52201258775992/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=62f56ebb86b7819c5ce2b2a14a69280df383a076 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:39:18 compute-0 sudo[82729]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:19 compute-0 sudo[82881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkewqfmfohrmukgvydgcelbnsrpeeppk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099958.922377-345-93884300026275/AnsiballZ_file.py'
Jan 22 16:39:19 compute-0 sudo[82881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:19 compute-0 python3.9[82883]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:39:19 compute-0 sudo[82881]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:20 compute-0 sudo[83033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwlalojzsakkocmvkvmviuzzvieyyfvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099959.6836038-353-237226221743140/AnsiballZ_stat.py'
Jan 22 16:39:20 compute-0 sudo[83033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:20 compute-0 python3.9[83035]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:39:20 compute-0 sudo[83033]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:20 compute-0 sudo[83156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osbxolmiyljlncjlaehchifdokzxyxxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099959.6836038-353-237226221743140/AnsiballZ_copy.py'
Jan 22 16:39:20 compute-0 sudo[83156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:21 compute-0 python3.9[83158]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099959.6836038-353-237226221743140/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=62f56ebb86b7819c5ce2b2a14a69280df383a076 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:39:21 compute-0 sudo[83156]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:21 compute-0 sudo[83308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syuobyznmfuzleanmieukxjzilnvsvjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099961.2496557-369-186312458814556/AnsiballZ_file.py'
Jan 22 16:39:21 compute-0 sudo[83308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:21 compute-0 python3.9[83310]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:39:21 compute-0 sudo[83308]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:22 compute-0 sudo[83460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etxgxqxqlhdvdylynnmfbidlebzxgfxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099962.0380428-377-225027509865355/AnsiballZ_stat.py'
Jan 22 16:39:22 compute-0 sudo[83460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:22 compute-0 python3.9[83462]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:39:22 compute-0 sudo[83460]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:22 compute-0 sudo[83583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwqnyrvxyernefdzmaindtcopalpnakk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099962.0380428-377-225027509865355/AnsiballZ_copy.py'
Jan 22 16:39:22 compute-0 sudo[83583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:23 compute-0 python3.9[83585]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099962.0380428-377-225027509865355/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=62f56ebb86b7819c5ce2b2a14a69280df383a076 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:39:23 compute-0 sudo[83583]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:23 compute-0 sudo[83735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hulitpidsoqkyzdbsnnzsjnmwjaqoaoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099963.3907735-393-16988488179788/AnsiballZ_file.py'
Jan 22 16:39:23 compute-0 sudo[83735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:23 compute-0 python3.9[83737]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:39:23 compute-0 sudo[83735]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:24 compute-0 sudo[83887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icqwuvzfyyrasmpygftnwsxcniatcsmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099964.1222503-401-53576991557729/AnsiballZ_stat.py'
Jan 22 16:39:24 compute-0 sudo[83887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:24 compute-0 python3.9[83889]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:39:24 compute-0 sudo[83887]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:24 compute-0 sudo[84010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-modvkeiupdduhnvjqxkkumpgqrwqafwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099964.1222503-401-53576991557729/AnsiballZ_copy.py'
Jan 22 16:39:24 compute-0 sudo[84010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:25 compute-0 python3.9[84012]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099964.1222503-401-53576991557729/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=62f56ebb86b7819c5ce2b2a14a69280df383a076 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:39:25 compute-0 sudo[84010]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:25 compute-0 sudo[84162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bicgbstnvcbtkcsvelmakzqxzhjqogyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099965.4309523-417-149372789160184/AnsiballZ_file.py'
Jan 22 16:39:25 compute-0 sudo[84162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:25 compute-0 python3.9[84164]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:39:25 compute-0 sudo[84162]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:26 compute-0 sudo[84314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwztkjjprilmtiywljwpznlhcpkmnyhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099966.138943-425-199087413060535/AnsiballZ_stat.py'
Jan 22 16:39:26 compute-0 sudo[84314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:26 compute-0 python3.9[84316]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:39:26 compute-0 sudo[84314]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:27 compute-0 sudo[84437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkhhaxplqlpqkddfzxcrshpbdjrajfre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099966.138943-425-199087413060535/AnsiballZ_copy.py'
Jan 22 16:39:27 compute-0 sudo[84437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:27 compute-0 python3.9[84439]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099966.138943-425-199087413060535/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=62f56ebb86b7819c5ce2b2a14a69280df383a076 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:39:27 compute-0 sudo[84437]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:28 compute-0 sudo[84589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wppmyqgifjdhybidcxmnihbhecqpbwjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099967.7686656-441-222987035474286/AnsiballZ_file.py'
Jan 22 16:39:28 compute-0 sudo[84589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:28 compute-0 python3.9[84591]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:39:28 compute-0 sudo[84589]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:28 compute-0 sudo[84741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmgabpdnpzdfwkgszvfvrtmjtxnmbxwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099968.5245829-449-220669974160942/AnsiballZ_stat.py'
Jan 22 16:39:28 compute-0 sudo[84741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:29 compute-0 python3.9[84743]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:39:30 compute-0 sudo[84741]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:30 compute-0 sudo[84864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvfzyfctreficekyftnmurfvehqgvwzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099968.5245829-449-220669974160942/AnsiballZ_copy.py'
Jan 22 16:39:30 compute-0 sudo[84864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:30 compute-0 python3.9[84866]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099968.5245829-449-220669974160942/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=62f56ebb86b7819c5ce2b2a14a69280df383a076 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:39:30 compute-0 sudo[84864]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:31 compute-0 sshd-session[77204]: Connection closed by 192.168.122.30 port 56238
Jan 22 16:39:31 compute-0 sshd-session[77201]: pam_unix(sshd:session): session closed for user zuul
Jan 22 16:39:31 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Jan 22 16:39:31 compute-0 systemd[1]: session-19.scope: Consumed 32.644s CPU time.
Jan 22 16:39:31 compute-0 systemd-logind[796]: Session 19 logged out. Waiting for processes to exit.
Jan 22 16:39:31 compute-0 systemd-logind[796]: Removed session 19.
Jan 22 16:39:36 compute-0 sshd-session[84891]: Accepted publickey for zuul from 192.168.122.30 port 40276 ssh2: ECDSA SHA256:XN6iZwzcCiCbU0l3vCSWxZPd4ElPyx+ZEvhzj+S5SUw
Jan 22 16:39:36 compute-0 systemd-logind[796]: New session 20 of user zuul.
Jan 22 16:39:36 compute-0 systemd[1]: Started Session 20 of User zuul.
Jan 22 16:39:36 compute-0 sshd-session[84891]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 16:39:37 compute-0 python3.9[85044]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:39:38 compute-0 sudo[85198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqaxpwwlvuuayhmdndkzavvgklakmxwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099978.2067564-29-176244274747811/AnsiballZ_file.py'
Jan 22 16:39:38 compute-0 sudo[85198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:38 compute-0 python3.9[85200]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:39:38 compute-0 sudo[85198]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:39 compute-0 sudo[85350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezzqjmgtiyxtpzntglibczuvyxndjevm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099979.1088128-29-190597833809098/AnsiballZ_file.py'
Jan 22 16:39:39 compute-0 sudo[85350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:39 compute-0 python3.9[85352]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:39:39 compute-0 sudo[85350]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:40 compute-0 python3.9[85502]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:39:41 compute-0 sudo[85652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flxlzoetchrkhtsegzxogevhqdlagirg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099980.684811-52-224113450311995/AnsiballZ_seboolean.py'
Jan 22 16:39:41 compute-0 sudo[85652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:41 compute-0 python3.9[85654]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 22 16:39:42 compute-0 sudo[85652]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:43 compute-0 sudo[85808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zobjqoayhhbibnkgbpsyouwptsbmrcrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099983.1848786-62-277748834000086/AnsiballZ_setup.py'
Jan 22 16:39:43 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Jan 22 16:39:43 compute-0 sudo[85808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:43 compute-0 python3.9[85810]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 16:39:44 compute-0 sudo[85808]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:44 compute-0 sudo[85892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjsblowhtnlvmzuapvpjhqtezmdntdoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099983.1848786-62-277748834000086/AnsiballZ_dnf.py'
Jan 22 16:39:44 compute-0 sudo[85892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:44 compute-0 python3.9[85894]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 16:39:46 compute-0 sudo[85892]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:47 compute-0 sudo[86045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pycfqghvvqeprdrplsixcgwmhhnflqin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099986.431595-74-49276188638454/AnsiballZ_systemd.py'
Jan 22 16:39:47 compute-0 sudo[86045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:47 compute-0 python3.9[86047]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 22 16:39:47 compute-0 sudo[86045]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:48 compute-0 sudo[86200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcdwwjezeupdubhzuglgszszgujufwfn ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769099987.6323152-82-163200658036934/AnsiballZ_edpm_nftables_snippet.py'
Jan 22 16:39:48 compute-0 sudo[86200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:48 compute-0 python3[86202]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Jan 22 16:39:48 compute-0 sudo[86200]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:48 compute-0 sudo[86352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jffvnzntqfffzkuistwoprbcmhhohmes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099988.6417294-91-196470957798938/AnsiballZ_file.py'
Jan 22 16:39:48 compute-0 sudo[86352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:49 compute-0 python3.9[86354]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:39:49 compute-0 sudo[86352]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:49 compute-0 sudo[86504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hucusuwghcwadblofklznwpxolrdqznb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099989.3558443-99-188629724106110/AnsiballZ_stat.py'
Jan 22 16:39:49 compute-0 sudo[86504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:50 compute-0 python3.9[86506]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:39:50 compute-0 sudo[86504]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:50 compute-0 sudo[86582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhkgxunvyyvhdwpduribsankbyutfgfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099989.3558443-99-188629724106110/AnsiballZ_file.py'
Jan 22 16:39:50 compute-0 sudo[86582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:50 compute-0 python3.9[86584]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:39:50 compute-0 sudo[86582]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:51 compute-0 sudo[86734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skxymxvyouttehvygwpdvwpgiyoeuwkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099990.7775548-111-218233374590538/AnsiballZ_stat.py'
Jan 22 16:39:51 compute-0 sudo[86734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:51 compute-0 python3.9[86736]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:39:51 compute-0 sudo[86734]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:51 compute-0 sudo[86812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkhjsrchfexawiatnprtuavjohempnuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099990.7775548-111-218233374590538/AnsiballZ_file.py'
Jan 22 16:39:51 compute-0 sudo[86812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:51 compute-0 python3.9[86814]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.60snpdvt recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:39:51 compute-0 sudo[86812]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:52 compute-0 sudo[86964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-artcyamcimxccixnplsvervkwvuanyrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099991.9283226-123-88160948712118/AnsiballZ_stat.py'
Jan 22 16:39:52 compute-0 sudo[86964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:52 compute-0 python3.9[86966]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:39:52 compute-0 sudo[86964]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:52 compute-0 sudo[87042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-feelomemyalwgvdrtrscwrcizvvvpdsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099991.9283226-123-88160948712118/AnsiballZ_file.py'
Jan 22 16:39:52 compute-0 sudo[87042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:52 compute-0 python3.9[87044]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:39:52 compute-0 sudo[87042]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:53 compute-0 sudo[87194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sehqzyvcidjkcrawttusjekcvvcxitrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099993.1503851-136-201422873462336/AnsiballZ_command.py'
Jan 22 16:39:53 compute-0 sudo[87194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:53 compute-0 python3.9[87196]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:39:53 compute-0 sudo[87194]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:54 compute-0 sudo[87347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qephgrlzwancbnebqtsssrinqtjintve ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769099994.0939467-144-127002843356490/AnsiballZ_edpm_nftables_from_files.py'
Jan 22 16:39:54 compute-0 sudo[87347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:54 compute-0 python3[87349]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 22 16:39:54 compute-0 sudo[87347]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:55 compute-0 sudo[87499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luesdpyhqncxixcsyrbbkaksnzkyrajr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099994.8951344-152-244817434228846/AnsiballZ_stat.py'
Jan 22 16:39:55 compute-0 sudo[87499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:55 compute-0 python3.9[87501]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:39:55 compute-0 sudo[87499]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:56 compute-0 sudo[87624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-argtdruqturhuolkarvpqbynsynliafu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099994.8951344-152-244817434228846/AnsiballZ_copy.py'
Jan 22 16:39:56 compute-0 sudo[87624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:56 compute-0 python3.9[87626]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099994.8951344-152-244817434228846/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:39:56 compute-0 sudo[87624]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:56 compute-0 sudo[87776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skssqsmbpfbtdlvzgxixiomfjwansbfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099996.4415083-167-25646696040678/AnsiballZ_stat.py'
Jan 22 16:39:56 compute-0 sudo[87776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:57 compute-0 python3.9[87778]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:39:57 compute-0 sudo[87776]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:57 compute-0 sudo[87901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swxdrydezpyvdgvnibdstseuhrgghtox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099996.4415083-167-25646696040678/AnsiballZ_copy.py'
Jan 22 16:39:57 compute-0 sudo[87901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:57 compute-0 python3.9[87903]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099996.4415083-167-25646696040678/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:39:57 compute-0 sudo[87901]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:58 compute-0 sudo[88053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvdtanyxdnznfdtpguhicjrgcgiqtsca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099997.8563945-182-22548140200668/AnsiballZ_stat.py'
Jan 22 16:39:58 compute-0 sudo[88053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:58 compute-0 python3.9[88055]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:39:58 compute-0 sudo[88053]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:58 compute-0 sudo[88178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nunjzupmrveaikkpxqkvygenazfuzavp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099997.8563945-182-22548140200668/AnsiballZ_copy.py'
Jan 22 16:39:58 compute-0 sudo[88178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:59 compute-0 python3.9[88180]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099997.8563945-182-22548140200668/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:39:59 compute-0 sudo[88178]: pam_unix(sudo:session): session closed for user root
Jan 22 16:39:59 compute-0 sudo[88330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxevdypnmhaakruhncuqmvadvrmtqdve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099999.2919452-197-139885396255939/AnsiballZ_stat.py'
Jan 22 16:39:59 compute-0 sudo[88330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:39:59 compute-0 python3.9[88332]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:39:59 compute-0 sudo[88330]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:00 compute-0 sudo[88455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atpxqjhhpzxmqcibnnywcgbymllpdejz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769099999.2919452-197-139885396255939/AnsiballZ_copy.py'
Jan 22 16:40:00 compute-0 sudo[88455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:00 compute-0 python3.9[88457]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769099999.2919452-197-139885396255939/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:40:00 compute-0 sudo[88455]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:01 compute-0 sudo[88607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjntgcfriqdbafzngkpqqdpxkzjeukyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100000.619412-212-247065463917564/AnsiballZ_stat.py'
Jan 22 16:40:01 compute-0 sudo[88607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:01 compute-0 python3.9[88609]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:40:01 compute-0 sudo[88607]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:01 compute-0 sudo[88732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrieezupwmcwgmypkdkiiqtqzhpnkixb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100000.619412-212-247065463917564/AnsiballZ_copy.py'
Jan 22 16:40:01 compute-0 sudo[88732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:01 compute-0 python3.9[88734]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769100000.619412-212-247065463917564/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:40:01 compute-0 sudo[88732]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:02 compute-0 sudo[88884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xajvskyzmpykzyzbxswuamqfeprwgesr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100002.0288253-227-197486681552961/AnsiballZ_file.py'
Jan 22 16:40:02 compute-0 sudo[88884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:02 compute-0 python3.9[88886]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:40:02 compute-0 sudo[88884]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:03 compute-0 sudo[89036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgfkbcnrkzcdtxsrlqcsplvynlziseqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100002.8326592-235-151040507081802/AnsiballZ_command.py'
Jan 22 16:40:03 compute-0 sudo[89036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:03 compute-0 python3.9[89038]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:40:03 compute-0 sudo[89036]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:04 compute-0 sudo[89191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azutcjibhcepbrzzbiwixxmkjhiafziu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100003.6056716-243-263887607945415/AnsiballZ_blockinfile.py'
Jan 22 16:40:04 compute-0 sudo[89191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:04 compute-0 python3.9[89193]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:40:04 compute-0 sudo[89191]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:05 compute-0 sudo[89343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-movmtxznzjuakdkkjdklebdxmovrlsrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100004.7012677-252-83566744793078/AnsiballZ_command.py'
Jan 22 16:40:05 compute-0 sudo[89343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:05 compute-0 python3.9[89345]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:40:05 compute-0 sudo[89343]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:05 compute-0 sudo[89496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwrfhupqstzoxijkjymkagqicrqvdkjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100005.4554574-260-98757263515704/AnsiballZ_stat.py'
Jan 22 16:40:05 compute-0 sudo[89496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:05 compute-0 python3.9[89498]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:40:05 compute-0 sudo[89496]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:06 compute-0 sudo[89650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efkeurjwdrnwiszpszkvshryjgtgdxwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100006.141375-268-168682052471604/AnsiballZ_command.py'
Jan 22 16:40:06 compute-0 sudo[89650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:06 compute-0 python3.9[89652]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:40:06 compute-0 sudo[89650]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:07 compute-0 sudo[89805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvxzyyswpbjqyhnfhzkkbiienseilzya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100006.890937-276-91109418062590/AnsiballZ_file.py'
Jan 22 16:40:07 compute-0 sudo[89805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:07 compute-0 python3.9[89807]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:40:07 compute-0 sudo[89805]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:08 compute-0 python3.9[89957]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:40:09 compute-0 sudo[90108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdjrlgnirtaqktgykagepjggzbathpks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100009.3064363-316-250545088951100/AnsiballZ_command.py'
Jan 22 16:40:09 compute-0 sudo[90108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:09 compute-0 python3.9[90110]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:0e:0a:8d:1d:08:09" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:40:09 compute-0 ovs-vsctl[90111]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:0e:0a:8d:1d:08:09 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Jan 22 16:40:09 compute-0 sudo[90108]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:10 compute-0 sudo[90261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqjlwvmajhmiwyjkqiqahknvokteigps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100010.1129103-325-143095394029943/AnsiballZ_command.py'
Jan 22 16:40:10 compute-0 sudo[90261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:10 compute-0 python3.9[90263]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:40:10 compute-0 sudo[90261]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:11 compute-0 sudo[90416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlfuadmujqxxxspkovmlhiykzugqhdnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100010.8928196-333-190590283897874/AnsiballZ_command.py'
Jan 22 16:40:11 compute-0 sudo[90416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:11 compute-0 python3.9[90418]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:40:11 compute-0 ovs-vsctl[90419]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Jan 22 16:40:11 compute-0 sudo[90416]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:12 compute-0 python3.9[90569]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:40:12 compute-0 sudo[90721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuckrzveftalgnhmresamiainzvxbdbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100012.5663602-350-11064483804417/AnsiballZ_file.py'
Jan 22 16:40:12 compute-0 sudo[90721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:13 compute-0 python3.9[90723]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:40:13 compute-0 sudo[90721]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:13 compute-0 sudo[90873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tehezrevjuuzbzsgmsjhyabajuvtflys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100013.3254879-358-59615974377411/AnsiballZ_stat.py'
Jan 22 16:40:13 compute-0 sudo[90873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:13 compute-0 python3.9[90875]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:40:13 compute-0 sudo[90873]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:14 compute-0 sudo[90951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlnxrmjiyhaicdibxayttdjqegvfmrbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100013.3254879-358-59615974377411/AnsiballZ_file.py'
Jan 22 16:40:14 compute-0 sudo[90951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:14 compute-0 python3.9[90953]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:40:14 compute-0 sudo[90951]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:14 compute-0 sudo[91103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxstpfqqydsdiiffmugviueaadqybvsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100014.441111-358-183330537566144/AnsiballZ_stat.py'
Jan 22 16:40:14 compute-0 sudo[91103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:14 compute-0 python3.9[91105]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:40:15 compute-0 sudo[91103]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:15 compute-0 sudo[91181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixmpdryyzcsjuytywmdrtzbukrxvjxuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100014.441111-358-183330537566144/AnsiballZ_file.py'
Jan 22 16:40:15 compute-0 sudo[91181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:15 compute-0 python3.9[91183]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:40:15 compute-0 sudo[91181]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:15 compute-0 sudo[91333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzdgsszlvkifgmnkzfqpgmxehqihzsnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100015.6270456-381-114154189572991/AnsiballZ_file.py'
Jan 22 16:40:15 compute-0 sudo[91333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:16 compute-0 python3.9[91335]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:40:16 compute-0 sudo[91333]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:16 compute-0 sudo[91485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqildnkhygpmvlaiirdracpbdkfbyouw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100016.2756927-389-190546890597512/AnsiballZ_stat.py'
Jan 22 16:40:16 compute-0 sudo[91485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:16 compute-0 python3.9[91487]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:40:16 compute-0 sudo[91485]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:17 compute-0 sudo[91563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgcqvvtvwauklilcbrefmztamfsbccta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100016.2756927-389-190546890597512/AnsiballZ_file.py'
Jan 22 16:40:17 compute-0 sudo[91563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:17 compute-0 python3.9[91565]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:40:17 compute-0 sudo[91563]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:17 compute-0 sudo[91715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbtsqpljzsfrtgeprhrzsaijokwirajj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100017.5012925-401-85064555959632/AnsiballZ_stat.py'
Jan 22 16:40:17 compute-0 sudo[91715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:18 compute-0 python3.9[91717]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:40:18 compute-0 sudo[91715]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:18 compute-0 sudo[91793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rldhptdueebuxzdfmklxozefsthggtjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100017.5012925-401-85064555959632/AnsiballZ_file.py'
Jan 22 16:40:18 compute-0 sudo[91793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:18 compute-0 python3.9[91795]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:40:18 compute-0 sudo[91793]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:19 compute-0 sudo[91945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxamlbtdgkgthktjzdbnqkkdraekxqnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100018.7238183-413-95477522999136/AnsiballZ_systemd.py'
Jan 22 16:40:19 compute-0 sudo[91945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:19 compute-0 python3.9[91947]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:40:19 compute-0 systemd[1]: Reloading.
Jan 22 16:40:19 compute-0 systemd-rc-local-generator[91973]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:40:19 compute-0 systemd-sysv-generator[91976]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:40:19 compute-0 sudo[91945]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:20 compute-0 sudo[92134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjarfrujljuvcoqhzgpwidmumwghxhiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100019.7767224-421-239591251522899/AnsiballZ_stat.py'
Jan 22 16:40:20 compute-0 sudo[92134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:20 compute-0 python3.9[92136]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:40:20 compute-0 sudo[92134]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:20 compute-0 sudo[92212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adonrnvfbhjrkmenaadhsvosstapvool ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100019.7767224-421-239591251522899/AnsiballZ_file.py'
Jan 22 16:40:20 compute-0 sudo[92212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:20 compute-0 python3.9[92214]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:40:20 compute-0 sudo[92212]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:21 compute-0 sudo[92364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztarmtwieqnhlueorzeqzbkwpnzjfqxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100021.040356-433-190532120734960/AnsiballZ_stat.py'
Jan 22 16:40:21 compute-0 sudo[92364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:21 compute-0 python3.9[92366]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:40:21 compute-0 sudo[92364]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:21 compute-0 sudo[92442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnjcwnkeafrrewuabwpkbtxnusqhxfza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100021.040356-433-190532120734960/AnsiballZ_file.py'
Jan 22 16:40:21 compute-0 sudo[92442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:22 compute-0 python3.9[92444]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:40:22 compute-0 sudo[92442]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:22 compute-0 sudo[92594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exouoheundufddhemznhjbhhrdgazqsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100022.207586-445-52515008929910/AnsiballZ_systemd.py'
Jan 22 16:40:22 compute-0 sudo[92594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:22 compute-0 python3.9[92596]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:40:22 compute-0 systemd[1]: Reloading.
Jan 22 16:40:22 compute-0 systemd-sysv-generator[92627]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:40:22 compute-0 systemd-rc-local-generator[92624]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:40:23 compute-0 systemd[1]: Starting Create netns directory...
Jan 22 16:40:23 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 22 16:40:23 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 22 16:40:23 compute-0 systemd[1]: Finished Create netns directory.
Jan 22 16:40:23 compute-0 sudo[92594]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:23 compute-0 sudo[92787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tatnmfevhjxfwqxwfghdspzihwzxnxff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100023.3925965-455-3389626643619/AnsiballZ_file.py'
Jan 22 16:40:23 compute-0 sudo[92787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:23 compute-0 python3.9[92789]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:40:23 compute-0 sudo[92787]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:24 compute-0 sudo[92939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cspyivfvagaqvslnippangfxloqsxzqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100024.0148213-463-178550907887573/AnsiballZ_stat.py'
Jan 22 16:40:24 compute-0 sudo[92939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:24 compute-0 python3.9[92941]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:40:24 compute-0 sudo[92939]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:24 compute-0 sudo[93062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cifsggfphkdgiplbpyjhswghpvfacbzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100024.0148213-463-178550907887573/AnsiballZ_copy.py'
Jan 22 16:40:24 compute-0 sudo[93062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:25 compute-0 python3.9[93064]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769100024.0148213-463-178550907887573/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:40:25 compute-0 sudo[93062]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:25 compute-0 sudo[93214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohqdxmbzzhqkofcuxpegshfolxaembkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100025.4643753-480-197900713145691/AnsiballZ_file.py'
Jan 22 16:40:25 compute-0 sudo[93214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:25 compute-0 python3.9[93216]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:40:26 compute-0 sudo[93214]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:26 compute-0 sudo[93366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukuiiveufciuvtprgfadujqsrdwkahdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100026.1909974-488-37477282743627/AnsiballZ_file.py'
Jan 22 16:40:26 compute-0 sudo[93366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:26 compute-0 python3.9[93368]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:40:26 compute-0 sudo[93366]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:27 compute-0 sudo[93518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gomnntdfuyimmcrdtamswgtjftblmdwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100026.8581276-496-260499123247071/AnsiballZ_stat.py'
Jan 22 16:40:27 compute-0 sudo[93518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:27 compute-0 python3.9[93520]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:40:27 compute-0 sudo[93518]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:27 compute-0 sudo[93641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blerdegjvhxzjnagapcwvxoudtbgwdvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100026.8581276-496-260499123247071/AnsiballZ_copy.py'
Jan 22 16:40:27 compute-0 sudo[93641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:27 compute-0 python3.9[93643]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769100026.8581276-496-260499123247071/.source.json _original_basename=.1bfwhoqu follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:40:27 compute-0 sudo[93641]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:28 compute-0 python3.9[93793]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:40:31 compute-0 sudo[94214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwixffqgjmsccnipfkaudutqoqhgepoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100031.1263955-536-250023658726832/AnsiballZ_container_config_data.py'
Jan 22 16:40:31 compute-0 sudo[94214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:31 compute-0 python3.9[94216]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Jan 22 16:40:31 compute-0 sudo[94214]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:32 compute-0 sudo[94366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfcjeyuddikfusjpomczltvmilngmchj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100032.2629256-547-217034177674057/AnsiballZ_container_config_hash.py'
Jan 22 16:40:32 compute-0 sudo[94366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:32 compute-0 python3.9[94368]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 22 16:40:32 compute-0 sudo[94366]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:33 compute-0 sudo[94518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwfukmhayozgxeoynuhmgtfxytbekuvo ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769100033.2153873-557-166849046094794/AnsiballZ_edpm_container_manage.py'
Jan 22 16:40:33 compute-0 sudo[94518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:34 compute-0 python3[94520]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Jan 22 16:40:34 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:40:34 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:40:34 compute-0 podman[94556]: 2026-01-22 16:40:34.33418882 +0000 UTC m=+0.065271363 container create 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller)
Jan 22 16:40:34 compute-0 podman[94556]: 2026-01-22 16:40:34.301730043 +0000 UTC m=+0.032812606 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 22 16:40:34 compute-0 python3[94520]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850 --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 22 16:40:34 compute-0 sudo[94518]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:34 compute-0 sudo[94744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbxmsubupxtcqyatawbqovxpriztlaze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100034.6910386-565-96344163056996/AnsiballZ_stat.py'
Jan 22 16:40:34 compute-0 sudo[94744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:35 compute-0 python3.9[94746]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:40:35 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:40:35 compute-0 sudo[94744]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:35 compute-0 sudo[94898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myywnclbmqtaasivikoestzodebssqjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100035.4445872-574-147026040752779/AnsiballZ_file.py'
Jan 22 16:40:35 compute-0 sudo[94898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:35 compute-0 python3.9[94900]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:40:35 compute-0 sudo[94898]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:36 compute-0 sudo[94974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-togyncjqzswllermknssqkzkgunfxxfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100035.4445872-574-147026040752779/AnsiballZ_stat.py'
Jan 22 16:40:36 compute-0 sudo[94974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:36 compute-0 python3.9[94976]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:40:36 compute-0 sudo[94974]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:37 compute-0 sudo[95125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrttildmnnqhyqlvktjhnnyhofcpstmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100036.4894328-574-192274664164694/AnsiballZ_copy.py'
Jan 22 16:40:37 compute-0 sudo[95125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:37 compute-0 python3.9[95127]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769100036.4894328-574-192274664164694/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:40:37 compute-0 sudo[95125]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:37 compute-0 sudo[95201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyhjlzgtuaziofcmvsiqurolmlwlnqmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100036.4894328-574-192274664164694/AnsiballZ_systemd.py'
Jan 22 16:40:37 compute-0 sudo[95201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:37 compute-0 python3.9[95203]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 16:40:37 compute-0 systemd[1]: Reloading.
Jan 22 16:40:37 compute-0 systemd-sysv-generator[95229]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:40:37 compute-0 systemd-rc-local-generator[95225]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:40:38 compute-0 sudo[95201]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:38 compute-0 sudo[95313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqmcqizybndmlojybntyunicnjnbahbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100036.4894328-574-192274664164694/AnsiballZ_systemd.py'
Jan 22 16:40:38 compute-0 sudo[95313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:38 compute-0 python3.9[95315]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:40:38 compute-0 systemd[1]: Reloading.
Jan 22 16:40:38 compute-0 systemd-sysv-generator[95349]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:40:38 compute-0 systemd-rc-local-generator[95345]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:40:39 compute-0 systemd[1]: Starting ovn_controller container...
Jan 22 16:40:39 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Jan 22 16:40:39 compute-0 systemd[1]: Started libcrun container.
Jan 22 16:40:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b07e7153dabdfb8feca2e9540d1aa9b84aef10e64b8b518f94e5565e6074fea/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 22 16:40:39 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee.
Jan 22 16:40:39 compute-0 podman[95356]: 2026-01-22 16:40:39.213106708 +0000 UTC m=+0.156436641 container init 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 16:40:39 compute-0 ovn_controller[95372]: + sudo -E kolla_set_configs
Jan 22 16:40:39 compute-0 podman[95356]: 2026-01-22 16:40:39.247156596 +0000 UTC m=+0.190486519 container start 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 22 16:40:39 compute-0 edpm-start-podman-container[95356]: ovn_controller
Jan 22 16:40:39 compute-0 systemd[1]: Created slice User Slice of UID 0.
Jan 22 16:40:39 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Jan 22 16:40:39 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Jan 22 16:40:39 compute-0 systemd[1]: Starting User Manager for UID 0...
Jan 22 16:40:39 compute-0 podman[95378]: 2026-01-22 16:40:39.327509262 +0000 UTC m=+0.069831913 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Jan 22 16:40:39 compute-0 systemd[95411]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Jan 22 16:40:39 compute-0 systemd[1]: 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee-1097c77679d08ae.service: Main process exited, code=exited, status=1/FAILURE
Jan 22 16:40:39 compute-0 systemd[1]: 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee-1097c77679d08ae.service: Failed with result 'exit-code'.
Jan 22 16:40:39 compute-0 edpm-start-podman-container[95355]: Creating additional drop-in dependency for "ovn_controller" (3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee)
Jan 22 16:40:39 compute-0 systemd[1]: Reloading.
Jan 22 16:40:39 compute-0 systemd[95411]: Queued start job for default target Main User Target.
Jan 22 16:40:39 compute-0 systemd[95411]: Created slice User Application Slice.
Jan 22 16:40:39 compute-0 systemd[95411]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Jan 22 16:40:39 compute-0 systemd[95411]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 16:40:39 compute-0 systemd[95411]: Reached target Paths.
Jan 22 16:40:39 compute-0 systemd[95411]: Reached target Timers.
Jan 22 16:40:39 compute-0 systemd[95411]: Starting D-Bus User Message Bus Socket...
Jan 22 16:40:39 compute-0 systemd-sysv-generator[95466]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:40:39 compute-0 systemd-rc-local-generator[95463]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:40:39 compute-0 systemd[95411]: Starting Create User's Volatile Files and Directories...
Jan 22 16:40:39 compute-0 systemd[95411]: Finished Create User's Volatile Files and Directories.
Jan 22 16:40:39 compute-0 systemd[95411]: Listening on D-Bus User Message Bus Socket.
Jan 22 16:40:39 compute-0 systemd[95411]: Reached target Sockets.
Jan 22 16:40:39 compute-0 systemd[95411]: Reached target Basic System.
Jan 22 16:40:39 compute-0 systemd[95411]: Reached target Main User Target.
Jan 22 16:40:39 compute-0 systemd[95411]: Startup finished in 136ms.
Jan 22 16:40:39 compute-0 systemd[1]: Started User Manager for UID 0.
Jan 22 16:40:39 compute-0 systemd[1]: Started Session c1 of User root.
Jan 22 16:40:39 compute-0 systemd[1]: Started ovn_controller container.
Jan 22 16:40:39 compute-0 sudo[95313]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:39 compute-0 ovn_controller[95372]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 22 16:40:39 compute-0 ovn_controller[95372]: INFO:__main__:Validating config file
Jan 22 16:40:39 compute-0 ovn_controller[95372]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 22 16:40:39 compute-0 ovn_controller[95372]: INFO:__main__:Writing out command to execute
Jan 22 16:40:39 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Jan 22 16:40:39 compute-0 ovn_controller[95372]: ++ cat /run_command
Jan 22 16:40:39 compute-0 ovn_controller[95372]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 22 16:40:39 compute-0 ovn_controller[95372]: + ARGS=
Jan 22 16:40:39 compute-0 ovn_controller[95372]: + sudo kolla_copy_cacerts
Jan 22 16:40:39 compute-0 systemd[1]: Started Session c2 of User root.
Jan 22 16:40:39 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Jan 22 16:40:39 compute-0 ovn_controller[95372]: + [[ ! -n '' ]]
Jan 22 16:40:39 compute-0 ovn_controller[95372]: + . kolla_extend_start
Jan 22 16:40:39 compute-0 ovn_controller[95372]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 22 16:40:39 compute-0 ovn_controller[95372]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Jan 22 16:40:39 compute-0 ovn_controller[95372]: + umask 0022
Jan 22 16:40:39 compute-0 ovn_controller[95372]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Jan 22 16:40:39 compute-0 ovn_controller[95372]: 2026-01-22T16:40:39Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 22 16:40:39 compute-0 ovn_controller[95372]: 2026-01-22T16:40:39Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 22 16:40:39 compute-0 ovn_controller[95372]: 2026-01-22T16:40:39Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Jan 22 16:40:39 compute-0 ovn_controller[95372]: 2026-01-22T16:40:39Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Jan 22 16:40:39 compute-0 ovn_controller[95372]: 2026-01-22T16:40:39Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 22 16:40:39 compute-0 ovn_controller[95372]: 2026-01-22T16:40:39Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Jan 22 16:40:39 compute-0 NetworkManager[55454]: <info>  [1769100039.7958] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Jan 22 16:40:39 compute-0 NetworkManager[55454]: <info>  [1769100039.7970] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 16:40:39 compute-0 NetworkManager[55454]: <warn>  [1769100039.7973] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 22 16:40:39 compute-0 NetworkManager[55454]: <info>  [1769100039.7983] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Jan 22 16:40:39 compute-0 NetworkManager[55454]: <info>  [1769100039.7992] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Jan 22 16:40:39 compute-0 NetworkManager[55454]: <info>  [1769100039.7997] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 22 16:40:39 compute-0 kernel: br-int: entered promiscuous mode
Jan 22 16:40:39 compute-0 ovn_controller[95372]: 2026-01-22T16:40:39Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 22 16:40:39 compute-0 ovn_controller[95372]: 2026-01-22T16:40:39Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 22 16:40:39 compute-0 ovn_controller[95372]: 2026-01-22T16:40:39Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 22 16:40:39 compute-0 ovn_controller[95372]: 2026-01-22T16:40:39Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Jan 22 16:40:39 compute-0 ovn_controller[95372]: 2026-01-22T16:40:39Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Jan 22 16:40:39 compute-0 ovn_controller[95372]: 2026-01-22T16:40:39Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Jan 22 16:40:39 compute-0 ovn_controller[95372]: 2026-01-22T16:40:39Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 22 16:40:39 compute-0 ovn_controller[95372]: 2026-01-22T16:40:39Z|00014|main|INFO|OVS feature set changed, force recompute.
Jan 22 16:40:39 compute-0 ovn_controller[95372]: 2026-01-22T16:40:39Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 22 16:40:39 compute-0 ovn_controller[95372]: 2026-01-22T16:40:39Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 22 16:40:39 compute-0 ovn_controller[95372]: 2026-01-22T16:40:39Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 22 16:40:39 compute-0 ovn_controller[95372]: 2026-01-22T16:40:39Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Jan 22 16:40:39 compute-0 ovn_controller[95372]: 2026-01-22T16:40:39Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Jan 22 16:40:39 compute-0 ovn_controller[95372]: 2026-01-22T16:40:39Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 22 16:40:39 compute-0 ovn_controller[95372]: 2026-01-22T16:40:39Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 22 16:40:39 compute-0 ovn_controller[95372]: 2026-01-22T16:40:39Z|00022|main|INFO|OVS feature set changed, force recompute.
Jan 22 16:40:39 compute-0 ovn_controller[95372]: 2026-01-22T16:40:39Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Jan 22 16:40:39 compute-0 ovn_controller[95372]: 2026-01-22T16:40:39Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Jan 22 16:40:39 compute-0 ovn_controller[95372]: 2026-01-22T16:40:39Z|00001|statctrl(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 22 16:40:39 compute-0 ovn_controller[95372]: 2026-01-22T16:40:39Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 22 16:40:39 compute-0 ovn_controller[95372]: 2026-01-22T16:40:39Z|00002|rconn(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 22 16:40:39 compute-0 ovn_controller[95372]: 2026-01-22T16:40:39Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 22 16:40:39 compute-0 ovn_controller[95372]: 2026-01-22T16:40:39Z|00003|rconn(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 22 16:40:39 compute-0 ovn_controller[95372]: 2026-01-22T16:40:39Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 22 16:40:39 compute-0 NetworkManager[55454]: <info>  [1769100039.8384] manager: (ovn-1028b9-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Jan 22 16:40:39 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Jan 22 16:40:39 compute-0 NetworkManager[55454]: <info>  [1769100039.8601] device (genev_sys_6081): carrier: link connected
Jan 22 16:40:39 compute-0 NetworkManager[55454]: <info>  [1769100039.8604] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Jan 22 16:40:39 compute-0 systemd-udevd[95509]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 16:40:39 compute-0 systemd-udevd[95513]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 16:40:40 compute-0 python3.9[95641]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 22 16:40:41 compute-0 sudo[95791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yistrvlgvhxdnimpccqxbogtkmeykibi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100041.247999-619-3922187706126/AnsiballZ_stat.py'
Jan 22 16:40:41 compute-0 sudo[95791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:41 compute-0 python3.9[95793]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:40:41 compute-0 sudo[95791]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:42 compute-0 sudo[95914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcfsvsvhlglkkyhpmnengcwigsnjalhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100041.247999-619-3922187706126/AnsiballZ_copy.py'
Jan 22 16:40:42 compute-0 sudo[95914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:42 compute-0 python3.9[95916]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769100041.247999-619-3922187706126/.source.yaml _original_basename=.f3sesl5q follow=False checksum=df7cdf85bcf2c56f607f9a29849ff5e49ace78ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:40:42 compute-0 sudo[95914]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:42 compute-0 sudo[96066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsddnttlokrmztnxnbblcehvctehnaom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100042.665694-634-86084354327031/AnsiballZ_command.py'
Jan 22 16:40:43 compute-0 sudo[96066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:43 compute-0 python3.9[96068]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:40:43 compute-0 ovs-vsctl[96069]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Jan 22 16:40:43 compute-0 sudo[96066]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:43 compute-0 sudo[96219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-peoqcsjbqnmehepplbbeelgkicqsbujz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100043.4270732-642-52124439862632/AnsiballZ_command.py'
Jan 22 16:40:43 compute-0 sudo[96219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:43 compute-0 python3.9[96221]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:40:43 compute-0 ovs-vsctl[96223]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Jan 22 16:40:43 compute-0 sudo[96219]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:44 compute-0 sudo[96374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frnlzyotukqbieueslybaqvqhwzgvyos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100044.3175292-656-54366988542098/AnsiballZ_command.py'
Jan 22 16:40:44 compute-0 sudo[96374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:44 compute-0 python3.9[96376]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:40:44 compute-0 ovs-vsctl[96377]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Jan 22 16:40:44 compute-0 sudo[96374]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:45 compute-0 sshd-session[84894]: Connection closed by 192.168.122.30 port 40276
Jan 22 16:40:45 compute-0 sshd-session[84891]: pam_unix(sshd:session): session closed for user zuul
Jan 22 16:40:45 compute-0 systemd[1]: session-20.scope: Deactivated successfully.
Jan 22 16:40:45 compute-0 systemd-logind[796]: Session 20 logged out. Waiting for processes to exit.
Jan 22 16:40:45 compute-0 systemd[1]: session-20.scope: Consumed 51.188s CPU time.
Jan 22 16:40:45 compute-0 systemd-logind[796]: Removed session 20.
Jan 22 16:40:50 compute-0 systemd[1]: Stopping User Manager for UID 0...
Jan 22 16:40:50 compute-0 systemd[95411]: Activating special unit Exit the Session...
Jan 22 16:40:50 compute-0 systemd[95411]: Stopped target Main User Target.
Jan 22 16:40:50 compute-0 systemd[95411]: Stopped target Basic System.
Jan 22 16:40:50 compute-0 systemd[95411]: Stopped target Paths.
Jan 22 16:40:50 compute-0 systemd[95411]: Stopped target Sockets.
Jan 22 16:40:50 compute-0 systemd[95411]: Stopped target Timers.
Jan 22 16:40:50 compute-0 systemd[95411]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 22 16:40:50 compute-0 systemd[95411]: Closed D-Bus User Message Bus Socket.
Jan 22 16:40:50 compute-0 systemd[95411]: Stopped Create User's Volatile Files and Directories.
Jan 22 16:40:50 compute-0 systemd[95411]: Removed slice User Application Slice.
Jan 22 16:40:50 compute-0 systemd[95411]: Reached target Shutdown.
Jan 22 16:40:50 compute-0 systemd[95411]: Finished Exit the Session.
Jan 22 16:40:50 compute-0 systemd[95411]: Reached target Exit the Session.
Jan 22 16:40:50 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Jan 22 16:40:50 compute-0 systemd[1]: Stopped User Manager for UID 0.
Jan 22 16:40:50 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Jan 22 16:40:50 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Jan 22 16:40:50 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Jan 22 16:40:50 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Jan 22 16:40:50 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Jan 22 16:40:50 compute-0 sshd-session[96404]: Accepted publickey for zuul from 192.168.122.30 port 48722 ssh2: ECDSA SHA256:XN6iZwzcCiCbU0l3vCSWxZPd4ElPyx+ZEvhzj+S5SUw
Jan 22 16:40:50 compute-0 systemd-logind[796]: New session 22 of user zuul.
Jan 22 16:40:50 compute-0 systemd[1]: Started Session 22 of User zuul.
Jan 22 16:40:50 compute-0 sshd-session[96404]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 16:40:51 compute-0 python3.9[96557]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:40:52 compute-0 sudo[96711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpckefrbyifmlmbihojydhmyjuaanuyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100051.9037693-29-72115994727392/AnsiballZ_file.py'
Jan 22 16:40:52 compute-0 sudo[96711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:52 compute-0 python3.9[96713]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:40:52 compute-0 sudo[96711]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:53 compute-0 sudo[96863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdrnwekjmgchkueopgcjqolnyqunuysl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100052.8668103-29-228591226394262/AnsiballZ_file.py'
Jan 22 16:40:53 compute-0 sudo[96863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:53 compute-0 python3.9[96865]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:40:53 compute-0 sudo[96863]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:54 compute-0 sudo[97015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bisdjstlqbqyxvwahgfkwsmdjgwgiqme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100053.6332715-29-217902520554890/AnsiballZ_file.py'
Jan 22 16:40:54 compute-0 sudo[97015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:54 compute-0 python3.9[97017]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:40:54 compute-0 sudo[97015]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:54 compute-0 sudo[97167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etwnwvrracumnmhzxtddqtjcicvmztnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100054.4289691-29-214803774542007/AnsiballZ_file.py'
Jan 22 16:40:54 compute-0 sudo[97167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:55 compute-0 python3.9[97169]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:40:55 compute-0 sudo[97167]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:55 compute-0 sudo[97319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsufvexzroyccemwvpqwklmbvpedgqjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100055.1901414-29-145360785628002/AnsiballZ_file.py'
Jan 22 16:40:55 compute-0 sudo[97319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:55 compute-0 python3.9[97321]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:40:55 compute-0 sudo[97319]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:56 compute-0 python3.9[97471]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:40:57 compute-0 sudo[97621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eotpzdirvsntgktawuarwqphlndtlixl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100056.941568-73-163627679018967/AnsiballZ_seboolean.py'
Jan 22 16:40:57 compute-0 sudo[97621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:40:57 compute-0 python3.9[97623]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 22 16:40:58 compute-0 sudo[97621]: pam_unix(sudo:session): session closed for user root
Jan 22 16:40:59 compute-0 python3.9[97774]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:41:00 compute-0 python3.9[97895]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769100058.6069407-81-98631962992024/.source follow=False _original_basename=haproxy.j2 checksum=a5072e7b19ca96a1f495d94f97f31903737cfd27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:41:00 compute-0 python3.9[98045]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:41:01 compute-0 python3.9[98166]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769100060.2667181-96-193083941349020/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:41:02 compute-0 sudo[98316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sehisvzikidfndskkdtlxiwwlnxflcym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100061.7753708-113-140904518972303/AnsiballZ_setup.py'
Jan 22 16:41:02 compute-0 sudo[98316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:02 compute-0 python3.9[98318]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 16:41:02 compute-0 sudo[98316]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:03 compute-0 sudo[98400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekzdrrasryxmtlfuuczbqwzjaebxuvan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100061.7753708-113-140904518972303/AnsiballZ_dnf.py'
Jan 22 16:41:03 compute-0 sudo[98400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:03 compute-0 python3.9[98402]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 16:41:04 compute-0 sudo[98400]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:05 compute-0 sudo[98553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylfsecbwuxddunvkdaixsvmkabmzzdat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100064.9527023-125-29352078710689/AnsiballZ_systemd.py'
Jan 22 16:41:05 compute-0 sudo[98553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:05 compute-0 python3.9[98555]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 22 16:41:06 compute-0 sudo[98553]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:06 compute-0 python3.9[98708]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:41:07 compute-0 python3.9[98829]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769100066.302098-133-197071753755866/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:41:08 compute-0 python3.9[98979]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:41:08 compute-0 python3.9[99100]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769100067.7418401-133-223349462616243/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:41:10 compute-0 ovn_controller[95372]: 2026-01-22T16:41:10Z|00025|memory|INFO|16000 kB peak resident set size after 30.3 seconds
Jan 22 16:41:10 compute-0 ovn_controller[95372]: 2026-01-22T16:41:10Z|00026|memory|INFO|idl-cells-OVN_Southbound:239 idl-cells-Open_vSwitch:471 ofctrl_desired_flow_usage-KB:5 ofctrl_installed_flow_usage-KB:4 ofctrl_sb_flow_ref_usage-KB:2
Jan 22 16:41:10 compute-0 podman[99224]: 2026-01-22 16:41:10.063620354 +0000 UTC m=+0.125590986 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 16:41:10 compute-0 python3.9[99262]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:41:10 compute-0 python3.9[99397]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769100069.6414633-177-171659930923625/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:41:11 compute-0 python3.9[99547]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:41:12 compute-0 python3.9[99668]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769100070.8716745-177-137680177281217/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:41:12 compute-0 python3.9[99818]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:41:13 compute-0 sudo[99970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emjilmoytgypbuixyghpbjouqzfubewv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100073.1063302-215-38819039198295/AnsiballZ_file.py'
Jan 22 16:41:13 compute-0 sudo[99970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:13 compute-0 python3.9[99972]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:41:13 compute-0 sudo[99970]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:14 compute-0 sudo[100122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsigazszkjwzndgcxsfxbwpdoavdysvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100073.958353-223-99792519097532/AnsiballZ_stat.py'
Jan 22 16:41:14 compute-0 sudo[100122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:14 compute-0 python3.9[100124]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:41:14 compute-0 sudo[100122]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:14 compute-0 sudo[100200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wycgmaiornufsjaksdlfwlkpydosrfvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100073.958353-223-99792519097532/AnsiballZ_file.py'
Jan 22 16:41:14 compute-0 sudo[100200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:15 compute-0 python3.9[100202]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:41:15 compute-0 sudo[100200]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:15 compute-0 sudo[100352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owavylztfgdtifgsabsxnasdjiytfgdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100075.2154984-223-105616421338079/AnsiballZ_stat.py'
Jan 22 16:41:15 compute-0 sudo[100352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:15 compute-0 python3.9[100354]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:41:15 compute-0 sudo[100352]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:15 compute-0 sudo[100430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znosliimubtetumaxshmsuonqrqkstjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100075.2154984-223-105616421338079/AnsiballZ_file.py'
Jan 22 16:41:15 compute-0 sudo[100430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:16 compute-0 python3.9[100432]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:41:16 compute-0 sudo[100430]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:16 compute-0 sudo[100582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpbqhwqyuweqbvpybwlmcsaffpwuglvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100076.3683612-246-75407038542948/AnsiballZ_file.py'
Jan 22 16:41:16 compute-0 sudo[100582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:16 compute-0 python3.9[100584]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:41:16 compute-0 sudo[100582]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:17 compute-0 sudo[100734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdewssxfjobapfwpufcwewybnfjgrxag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100077.0553489-254-18641149325163/AnsiballZ_stat.py'
Jan 22 16:41:17 compute-0 sudo[100734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:17 compute-0 python3.9[100736]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:41:17 compute-0 sudo[100734]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:17 compute-0 sudo[100812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvdepfjmrbkthwpdthypffusudvkyfmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100077.0553489-254-18641149325163/AnsiballZ_file.py'
Jan 22 16:41:17 compute-0 sudo[100812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:18 compute-0 python3.9[100814]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:41:18 compute-0 sudo[100812]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:18 compute-0 sudo[100964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aoonxbzrgkhvgtmliyqwhayhgpoejpbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100078.3264701-266-150732833479695/AnsiballZ_stat.py'
Jan 22 16:41:18 compute-0 sudo[100964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:18 compute-0 python3.9[100966]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:41:18 compute-0 sudo[100964]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:19 compute-0 sudo[101042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szocaqialzzcmjrdxrhezuvlbmjqnyej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100078.3264701-266-150732833479695/AnsiballZ_file.py'
Jan 22 16:41:19 compute-0 sudo[101042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:19 compute-0 python3.9[101044]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:41:19 compute-0 sudo[101042]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:20 compute-0 sudo[101194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxhxymawpeefifewyydinkiflgadzutl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100079.6222494-278-203774982709774/AnsiballZ_systemd.py'
Jan 22 16:41:20 compute-0 sudo[101194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:20 compute-0 python3.9[101196]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:41:20 compute-0 systemd[1]: Reloading.
Jan 22 16:41:20 compute-0 systemd-sysv-generator[101224]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:41:20 compute-0 systemd-rc-local-generator[101218]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:41:20 compute-0 sudo[101194]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:21 compute-0 sudo[101383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afssdlqfdrfmjqvmutfkljvlwvsgfrec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100080.866879-286-5416499929735/AnsiballZ_stat.py'
Jan 22 16:41:21 compute-0 sudo[101383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:21 compute-0 python3.9[101385]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:41:21 compute-0 sudo[101383]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:21 compute-0 sudo[101461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhipybggupsqrsnguoytmepxgrwprusy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100080.866879-286-5416499929735/AnsiballZ_file.py'
Jan 22 16:41:21 compute-0 sudo[101461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:21 compute-0 python3.9[101463]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:41:21 compute-0 sudo[101461]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:22 compute-0 sudo[101613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqpsxdyxuvgzajkpocgcarevwoolouiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100082.0778368-298-239773194303236/AnsiballZ_stat.py'
Jan 22 16:41:22 compute-0 sudo[101613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:22 compute-0 python3.9[101615]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:41:22 compute-0 sudo[101613]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:22 compute-0 sudo[101691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnicjozttheukarubjqqjidyprjsehpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100082.0778368-298-239773194303236/AnsiballZ_file.py'
Jan 22 16:41:22 compute-0 sudo[101691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:23 compute-0 python3.9[101693]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:41:23 compute-0 sudo[101691]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:23 compute-0 sudo[101843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfgnnmpbkuumrfjmjfhxxylqmhpklele ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100083.2929575-310-191505107482976/AnsiballZ_systemd.py'
Jan 22 16:41:23 compute-0 sudo[101843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:23 compute-0 python3.9[101845]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:41:23 compute-0 systemd[1]: Reloading.
Jan 22 16:41:24 compute-0 systemd-rc-local-generator[101873]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:41:24 compute-0 systemd-sysv-generator[101878]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:41:24 compute-0 systemd[1]: Starting Create netns directory...
Jan 22 16:41:24 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 22 16:41:24 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 22 16:41:24 compute-0 systemd[1]: Finished Create netns directory.
Jan 22 16:41:24 compute-0 sudo[101843]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:24 compute-0 sudo[102038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsewrqmrugbcnyvxrtpdvkwfhxdgprtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100084.602535-320-232103305588088/AnsiballZ_file.py'
Jan 22 16:41:24 compute-0 sudo[102038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:25 compute-0 python3.9[102040]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:41:25 compute-0 sudo[102038]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:25 compute-0 sudo[102190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtjmrlkfobtoctloqoczqowiguxznedg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100085.3915415-328-273136542413041/AnsiballZ_stat.py'
Jan 22 16:41:25 compute-0 sudo[102190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:25 compute-0 python3.9[102192]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:41:25 compute-0 sudo[102190]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:26 compute-0 sudo[102313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghhfbwsdpzhkptuwsaqamrgnmbfphmcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100085.3915415-328-273136542413041/AnsiballZ_copy.py'
Jan 22 16:41:26 compute-0 sudo[102313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:26 compute-0 python3.9[102315]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769100085.3915415-328-273136542413041/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:41:26 compute-0 sudo[102313]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:27 compute-0 sudo[102465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxgpvhzgsgfddyiofjrixllkfilnhsci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100086.8872373-345-96682811535028/AnsiballZ_file.py'
Jan 22 16:41:27 compute-0 sudo[102465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:27 compute-0 python3.9[102467]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:41:27 compute-0 sudo[102465]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:28 compute-0 sudo[102617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drylccfhgqyoawxhlfrltjibkbtkzthj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100087.7436464-353-112039647376392/AnsiballZ_file.py'
Jan 22 16:41:28 compute-0 sudo[102617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:28 compute-0 python3.9[102619]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:41:28 compute-0 sudo[102617]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:29 compute-0 sudo[102769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okrllyybzgmwdbotpvmbvdwyuxdlzhue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100088.7277927-361-54900056693829/AnsiballZ_stat.py'
Jan 22 16:41:29 compute-0 sudo[102769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:29 compute-0 python3.9[102771]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:41:29 compute-0 sudo[102769]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:29 compute-0 sudo[102892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcvbqgesvhcdbbwlfqgsymthyioovvkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100088.7277927-361-54900056693829/AnsiballZ_copy.py'
Jan 22 16:41:29 compute-0 sudo[102892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:30 compute-0 python3.9[102894]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769100088.7277927-361-54900056693829/.source.json _original_basename=.h7n_6iny follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:41:30 compute-0 sudo[102892]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:30 compute-0 python3.9[103044]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:41:32 compute-0 sudo[103465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfwiicbighpynoliasmfkmfmchnasvvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100092.4710321-401-78046051438751/AnsiballZ_container_config_data.py'
Jan 22 16:41:32 compute-0 sudo[103465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:33 compute-0 python3.9[103467]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Jan 22 16:41:33 compute-0 sudo[103465]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:34 compute-0 sudo[103617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrppzmxzykcrzrqthaatgpcpmtuftinv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100093.5527058-412-177708247392437/AnsiballZ_container_config_hash.py'
Jan 22 16:41:34 compute-0 sudo[103617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:34 compute-0 python3.9[103619]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 22 16:41:34 compute-0 sudo[103617]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:35 compute-0 sudo[103769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbiqujdwuxnrzcgwdyedtmlhmcojfupv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769100094.624512-422-167445914331673/AnsiballZ_edpm_container_manage.py'
Jan 22 16:41:35 compute-0 sudo[103769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:35 compute-0 python3[103771]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Jan 22 16:41:35 compute-0 podman[103808]: 2026-01-22 16:41:35.648378227 +0000 UTC m=+0.051577013 container create 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 22 16:41:35 compute-0 podman[103808]: 2026-01-22 16:41:35.621772175 +0000 UTC m=+0.024971001 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 16:41:35 compute-0 python3[103771]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 16:41:35 compute-0 sudo[103769]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:36 compute-0 sudo[103996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdomnxouvkdaztejltzdvfcghbguoibo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100095.9857886-430-96227385236671/AnsiballZ_stat.py'
Jan 22 16:41:36 compute-0 sudo[103996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:36 compute-0 python3.9[103998]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:41:36 compute-0 sudo[103996]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:36 compute-0 sudo[104150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxuuburmtarfsdkzlizxfmcxgnefbqrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100096.7378514-439-87310477379738/AnsiballZ_file.py'
Jan 22 16:41:36 compute-0 sudo[104150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:37 compute-0 python3.9[104152]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:41:37 compute-0 sudo[104150]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:37 compute-0 sudo[104226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilvdvfxjthyznpexrjsylkjjmsncxwki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100096.7378514-439-87310477379738/AnsiballZ_stat.py'
Jan 22 16:41:37 compute-0 sudo[104226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:37 compute-0 python3.9[104228]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:41:37 compute-0 sudo[104226]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:38 compute-0 sudo[104377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsxlffktlxusocxilgzezkkhiftwepvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100097.686973-439-7147667135041/AnsiballZ_copy.py'
Jan 22 16:41:38 compute-0 sudo[104377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:38 compute-0 python3.9[104379]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769100097.686973-439-7147667135041/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:41:38 compute-0 sudo[104377]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:38 compute-0 sudo[104453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glghjzorodcxyxzaamlufzcjwttgcqlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100097.686973-439-7147667135041/AnsiballZ_systemd.py'
Jan 22 16:41:38 compute-0 sudo[104453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:38 compute-0 python3.9[104455]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 16:41:38 compute-0 systemd[1]: Reloading.
Jan 22 16:41:38 compute-0 systemd-rc-local-generator[104482]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:41:38 compute-0 systemd-sysv-generator[104485]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:41:39 compute-0 sudo[104453]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:39 compute-0 sudo[104565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-erovvzasahbozuqkqccvqogwsmfhythc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100097.686973-439-7147667135041/AnsiballZ_systemd.py'
Jan 22 16:41:39 compute-0 sudo[104565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:39 compute-0 python3.9[104567]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:41:39 compute-0 systemd[1]: Reloading.
Jan 22 16:41:39 compute-0 systemd-rc-local-generator[104597]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:41:39 compute-0 systemd-sysv-generator[104601]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:41:39 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Jan 22 16:41:39 compute-0 systemd[1]: Started libcrun container.
Jan 22 16:41:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6616be91899fbc11486f913ecceb1b9bd4366461452e34abeb8480b8ecb30e1d/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Jan 22 16:41:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6616be91899fbc11486f913ecceb1b9bd4366461452e34abeb8480b8ecb30e1d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 16:41:40 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd.
Jan 22 16:41:40 compute-0 podman[104608]: 2026-01-22 16:41:40.01745166 +0000 UTC m=+0.134152793 container init 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 22 16:41:40 compute-0 ovn_metadata_agent[104624]: + sudo -E kolla_set_configs
Jan 22 16:41:40 compute-0 podman[104608]: 2026-01-22 16:41:40.049608597 +0000 UTC m=+0.166309730 container start 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 22 16:41:40 compute-0 edpm-start-podman-container[104608]: ovn_metadata_agent
Jan 22 16:41:40 compute-0 podman[104631]: 2026-01-22 16:41:40.10391811 +0000 UTC m=+0.045989598 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true)
Jan 22 16:41:40 compute-0 edpm-start-podman-container[104607]: Creating additional drop-in dependency for "ovn_metadata_agent" (642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd)
Jan 22 16:41:40 compute-0 ovn_metadata_agent[104624]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 22 16:41:40 compute-0 ovn_metadata_agent[104624]: INFO:__main__:Validating config file
Jan 22 16:41:40 compute-0 ovn_metadata_agent[104624]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 22 16:41:40 compute-0 ovn_metadata_agent[104624]: INFO:__main__:Copying service configuration files
Jan 22 16:41:40 compute-0 ovn_metadata_agent[104624]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Jan 22 16:41:40 compute-0 ovn_metadata_agent[104624]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Jan 22 16:41:40 compute-0 ovn_metadata_agent[104624]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Jan 22 16:41:40 compute-0 ovn_metadata_agent[104624]: INFO:__main__:Writing out command to execute
Jan 22 16:41:40 compute-0 systemd[1]: Reloading.
Jan 22 16:41:40 compute-0 ovn_metadata_agent[104624]: INFO:__main__:Setting permission for /var/lib/neutron
Jan 22 16:41:40 compute-0 ovn_metadata_agent[104624]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Jan 22 16:41:40 compute-0 ovn_metadata_agent[104624]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Jan 22 16:41:40 compute-0 ovn_metadata_agent[104624]: INFO:__main__:Setting permission for /var/lib/neutron/external
Jan 22 16:41:40 compute-0 ovn_metadata_agent[104624]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Jan 22 16:41:40 compute-0 ovn_metadata_agent[104624]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Jan 22 16:41:40 compute-0 ovn_metadata_agent[104624]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Jan 22 16:41:40 compute-0 ovn_metadata_agent[104624]: ++ cat /run_command
Jan 22 16:41:40 compute-0 ovn_metadata_agent[104624]: + CMD=neutron-ovn-metadata-agent
Jan 22 16:41:40 compute-0 ovn_metadata_agent[104624]: + ARGS=
Jan 22 16:41:40 compute-0 ovn_metadata_agent[104624]: + sudo kolla_copy_cacerts
Jan 22 16:41:40 compute-0 ovn_metadata_agent[104624]: Running command: 'neutron-ovn-metadata-agent'
Jan 22 16:41:40 compute-0 ovn_metadata_agent[104624]: + [[ ! -n '' ]]
Jan 22 16:41:40 compute-0 ovn_metadata_agent[104624]: + . kolla_extend_start
Jan 22 16:41:40 compute-0 ovn_metadata_agent[104624]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Jan 22 16:41:40 compute-0 ovn_metadata_agent[104624]: + umask 0022
Jan 22 16:41:40 compute-0 ovn_metadata_agent[104624]: + exec neutron-ovn-metadata-agent
Jan 22 16:41:40 compute-0 systemd-sysv-generator[104722]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:41:40 compute-0 systemd-rc-local-generator[104718]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:41:40 compute-0 podman[104665]: 2026-01-22 16:41:40.227459826 +0000 UTC m=+0.099644325 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 16:41:40 compute-0 systemd[1]: Started ovn_metadata_agent container.
Jan 22 16:41:40 compute-0 sudo[104565]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:41 compute-0 python3.9[104884]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.861 104629 INFO neutron.common.config [-] Logging enabled!
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.861 104629 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.861 104629 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.862 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.862 104629 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.862 104629 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.862 104629 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.862 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.862 104629 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.862 104629 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.863 104629 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.863 104629 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.863 104629 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.863 104629 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.863 104629 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.863 104629 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.863 104629 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.863 104629 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.863 104629 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.864 104629 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.864 104629 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.864 104629 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.864 104629 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.864 104629 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.864 104629 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.864 104629 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.864 104629 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.864 104629 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.864 104629 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.865 104629 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.865 104629 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.865 104629 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.865 104629 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.865 104629 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.865 104629 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.865 104629 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.865 104629 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.866 104629 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.866 104629 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.866 104629 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.866 104629 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.866 104629 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.866 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.866 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.866 104629 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.866 104629 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.866 104629 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.866 104629 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.867 104629 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.867 104629 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.867 104629 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.867 104629 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.867 104629 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.867 104629 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.867 104629 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.867 104629 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.867 104629 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.867 104629 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.868 104629 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.868 104629 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.868 104629 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.868 104629 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.868 104629 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.868 104629 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.868 104629 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.868 104629 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.868 104629 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.869 104629 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.869 104629 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.869 104629 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.869 104629 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.869 104629 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.869 104629 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.869 104629 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.869 104629 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.869 104629 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.870 104629 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.870 104629 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.870 104629 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.870 104629 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.870 104629 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.870 104629 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.870 104629 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.870 104629 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.870 104629 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.870 104629 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.871 104629 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.871 104629 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.871 104629 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.871 104629 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.871 104629 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.871 104629 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.871 104629 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.871 104629 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.871 104629 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.871 104629 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.872 104629 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.872 104629 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.872 104629 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.872 104629 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.872 104629 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.872 104629 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.872 104629 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.872 104629 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.872 104629 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.872 104629 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.873 104629 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.873 104629 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.873 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.873 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.873 104629 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.873 104629 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.873 104629 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.873 104629 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.874 104629 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.874 104629 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.874 104629 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.874 104629 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.874 104629 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.874 104629 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.874 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.874 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.874 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.875 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.875 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.875 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.875 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.875 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.875 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.875 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.875 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.876 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.876 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.876 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.876 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.876 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.876 104629 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.876 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.876 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.876 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.877 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.877 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.877 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.877 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.877 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.877 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.877 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.877 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.877 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.878 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.878 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.878 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.878 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.878 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.878 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.878 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.878 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.878 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.878 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.879 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.879 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.879 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.879 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.879 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.879 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.879 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.879 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.879 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.880 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.880 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.880 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.880 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.880 104629 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.880 104629 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.880 104629 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.880 104629 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.880 104629 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.881 104629 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.881 104629 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.881 104629 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.881 104629 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.881 104629 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.881 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.881 104629 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.881 104629 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.881 104629 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.881 104629 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.882 104629 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.882 104629 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.882 104629 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.882 104629 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.882 104629 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.882 104629 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.882 104629 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.882 104629 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.882 104629 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.883 104629 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.883 104629 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.883 104629 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.883 104629 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.883 104629 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.883 104629 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.883 104629 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.883 104629 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.883 104629 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.883 104629 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.884 104629 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.884 104629 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.884 104629 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.884 104629 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.884 104629 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.884 104629 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.884 104629 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.884 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.884 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.885 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.885 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.885 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.885 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.885 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.885 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.885 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.885 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.885 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.885 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.886 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.886 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.886 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.886 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.886 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.886 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.886 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.886 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.886 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.886 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.887 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.887 104629 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.887 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.887 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.887 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.887 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.887 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.887 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.887 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.888 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.888 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.888 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.888 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.888 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.888 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.888 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.888 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.888 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.889 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.889 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.889 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.889 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.889 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.889 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.889 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.889 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.889 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.890 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.890 104629 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.890 104629 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.890 104629 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.890 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.890 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.890 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.890 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.890 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.891 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.891 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.891 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.891 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.891 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.891 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.891 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.891 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.891 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.891 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.892 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.892 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.892 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.892 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.892 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.892 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.892 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.892 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.892 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.893 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.893 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.893 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.893 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.893 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.893 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.893 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.893 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.893 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.894 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.894 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.894 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.894 104629 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.894 104629 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.902 104629 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.902 104629 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.902 104629 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.903 104629 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.903 104629 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.914 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name c288e768-a990-4b51-bd88-fd8dddb8c85d (UUID: c288e768-a990-4b51-bd88-fd8dddb8c85d) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.938 104629 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.939 104629 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.939 104629 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.939 104629 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.941 104629 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.948 104629 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.952 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'c288e768-a990-4b51-bd88-fd8dddb8c85d'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], external_ids={}, name=c288e768-a990-4b51-bd88-fd8dddb8c85d, nb_cfg_timestamp=1769100047825, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.953 104629 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f87fb904130>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.954 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.954 104629 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.954 104629 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.954 104629 INFO oslo_service.service [-] Starting 1 workers
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.958 104629 DEBUG oslo_service.service [-] Started child 104990 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.961 104629 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmplrw2rc2z/privsep.sock']
Jan 22 16:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:41.964 104990 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-162107'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Jan 22 16:41:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:42.000 104990 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Jan 22 16:41:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:42.001 104990 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 22 16:41:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:42.001 104990 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 22 16:41:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:42.005 104990 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 22 16:41:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:42.014 104990 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 22 16:41:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:42.021 104990 INFO eventlet.wsgi.server [-] (104990) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Jan 22 16:41:42 compute-0 sudo[105038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvrdezzmpguzktgwqnzjhgeuhbdyfmnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100101.738335-484-251866294113573/AnsiballZ_stat.py'
Jan 22 16:41:42 compute-0 sudo[105038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:42 compute-0 python3.9[105040]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:41:42 compute-0 sudo[105038]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:42 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Jan 22 16:41:42 compute-0 sudo[105165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhuuhohxdnorcryexcxwaaoavusvnjhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100101.738335-484-251866294113573/AnsiballZ_copy.py'
Jan 22 16:41:42 compute-0 sudo[105165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:42.602 104629 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 22 16:41:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:42.603 104629 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmplrw2rc2z/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 22 16:41:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:42.482 105117 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 22 16:41:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:42.486 105117 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 22 16:41:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:42.488 105117 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Jan 22 16:41:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:42.488 105117 INFO oslo.privsep.daemon [-] privsep daemon running as pid 105117
Jan 22 16:41:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:42.605 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[5e4a6bf9-0e3c-4d9a-b736-2130a46e089c]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 16:41:42 compute-0 python3.9[105167]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769100101.738335-484-251866294113573/.source.yaml _original_basename=.5kqr8s18 follow=False checksum=939d854607467a87ae4b20d0997ee4de0c1f16ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:41:42 compute-0 sudo[105165]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.103 105117 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.103 105117 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.103 105117 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:41:43 compute-0 sshd-session[96407]: Connection closed by 192.168.122.30 port 48722
Jan 22 16:41:43 compute-0 sshd-session[96404]: pam_unix(sshd:session): session closed for user zuul
Jan 22 16:41:43 compute-0 systemd[1]: session-22.scope: Deactivated successfully.
Jan 22 16:41:43 compute-0 systemd[1]: session-22.scope: Consumed 39.406s CPU time.
Jan 22 16:41:43 compute-0 systemd-logind[796]: Session 22 logged out. Waiting for processes to exit.
Jan 22 16:41:43 compute-0 systemd-logind[796]: Removed session 22.
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.637 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[b46b0520-3635-48b5-a643-54a3696b43b9]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.639 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, column=external_ids, values=({'neutron:ovn-metadata-id': '8785f5df-bc50-5c57-b906-2ab72ffbf78e'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.649 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.655 104629 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.655 104629 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.655 104629 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.655 104629 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.655 104629 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.655 104629 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.655 104629 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.655 104629 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.656 104629 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.656 104629 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.656 104629 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.656 104629 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.656 104629 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.656 104629 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.656 104629 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.657 104629 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.657 104629 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.657 104629 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.657 104629 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.657 104629 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.657 104629 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.657 104629 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.658 104629 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.658 104629 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.658 104629 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.658 104629 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.658 104629 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.659 104629 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.659 104629 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.659 104629 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.659 104629 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.659 104629 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.659 104629 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.660 104629 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.660 104629 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.660 104629 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.660 104629 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.660 104629 DEBUG oslo_service.service [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.661 104629 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.661 104629 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.661 104629 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.661 104629 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.661 104629 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.661 104629 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.661 104629 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.662 104629 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.662 104629 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.662 104629 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.662 104629 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.662 104629 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.662 104629 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.662 104629 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.663 104629 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.663 104629 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.663 104629 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.663 104629 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.663 104629 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.663 104629 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.663 104629 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.663 104629 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.663 104629 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.663 104629 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.664 104629 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.664 104629 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.664 104629 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.664 104629 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.664 104629 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.664 104629 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.664 104629 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.664 104629 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.665 104629 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.665 104629 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.665 104629 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.665 104629 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.665 104629 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.665 104629 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.665 104629 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.666 104629 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.666 104629 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.666 104629 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.666 104629 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.666 104629 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.666 104629 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.666 104629 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.666 104629 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.667 104629 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.667 104629 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.667 104629 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.667 104629 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.667 104629 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.667 104629 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.667 104629 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.667 104629 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.668 104629 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.668 104629 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.668 104629 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.668 104629 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.668 104629 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.668 104629 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.668 104629 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.668 104629 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.668 104629 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.669 104629 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.669 104629 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.669 104629 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.669 104629 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.669 104629 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.669 104629 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.669 104629 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.670 104629 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.670 104629 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.670 104629 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.670 104629 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.670 104629 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.670 104629 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.670 104629 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.671 104629 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.671 104629 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.671 104629 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.671 104629 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.671 104629 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.671 104629 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.671 104629 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.671 104629 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.672 104629 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.672 104629 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.672 104629 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.672 104629 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.672 104629 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.672 104629 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.672 104629 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.672 104629 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.673 104629 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.673 104629 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.673 104629 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.673 104629 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.673 104629 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.673 104629 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.673 104629 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.673 104629 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.673 104629 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.674 104629 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.674 104629 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.674 104629 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.674 104629 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.674 104629 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.674 104629 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.674 104629 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.674 104629 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.674 104629 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.675 104629 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.675 104629 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.675 104629 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.675 104629 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.675 104629 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.675 104629 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.675 104629 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.675 104629 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.675 104629 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.675 104629 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.676 104629 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.676 104629 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.676 104629 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.676 104629 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.676 104629 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.676 104629 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.676 104629 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.676 104629 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.676 104629 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.677 104629 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.677 104629 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.677 104629 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.677 104629 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.677 104629 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.677 104629 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.677 104629 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.678 104629 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.678 104629 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.678 104629 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.678 104629 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.678 104629 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.678 104629 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.678 104629 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.679 104629 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.679 104629 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.679 104629 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.679 104629 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.679 104629 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.679 104629 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.679 104629 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.679 104629 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.680 104629 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.680 104629 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.680 104629 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.680 104629 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.680 104629 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.680 104629 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.680 104629 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.680 104629 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.681 104629 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.681 104629 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.681 104629 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.681 104629 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.681 104629 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.681 104629 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.681 104629 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.681 104629 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.681 104629 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.682 104629 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.682 104629 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.682 104629 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.682 104629 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.682 104629 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.682 104629 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.682 104629 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.683 104629 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.683 104629 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.683 104629 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.683 104629 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.683 104629 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.683 104629 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.683 104629 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.683 104629 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.684 104629 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.684 104629 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.684 104629 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.684 104629 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.684 104629 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.684 104629 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.684 104629 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.684 104629 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.685 104629 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.685 104629 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.685 104629 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.685 104629 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.685 104629 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.685 104629 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.685 104629 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.686 104629 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.686 104629 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.686 104629 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.686 104629 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.686 104629 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.686 104629 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.686 104629 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.687 104629 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.687 104629 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.687 104629 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.687 104629 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.687 104629 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.687 104629 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.687 104629 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.688 104629 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.688 104629 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.688 104629 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.688 104629 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.688 104629 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.688 104629 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.688 104629 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.689 104629 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.689 104629 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.689 104629 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.689 104629 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.689 104629 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.689 104629 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.689 104629 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.689 104629 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.690 104629 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.690 104629 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.690 104629 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.690 104629 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.690 104629 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.690 104629 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.690 104629 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.690 104629 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.690 104629 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.691 104629 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.691 104629 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.691 104629 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.691 104629 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.691 104629 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.691 104629 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.691 104629 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.692 104629 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.692 104629 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.692 104629 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.692 104629 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.692 104629 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.692 104629 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.692 104629 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.692 104629 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.693 104629 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.693 104629 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.693 104629 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.693 104629 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.693 104629 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.693 104629 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.693 104629 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.693 104629 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.693 104629 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.694 104629 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.694 104629 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.694 104629 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:41:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:41:43.694 104629 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 22 16:41:49 compute-0 sshd-session[105196]: Accepted publickey for zuul from 192.168.122.30 port 45238 ssh2: ECDSA SHA256:XN6iZwzcCiCbU0l3vCSWxZPd4ElPyx+ZEvhzj+S5SUw
Jan 22 16:41:49 compute-0 systemd-logind[796]: New session 23 of user zuul.
Jan 22 16:41:49 compute-0 systemd[1]: Started Session 23 of User zuul.
Jan 22 16:41:49 compute-0 sshd-session[105196]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 16:41:50 compute-0 python3.9[105349]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:41:51 compute-0 sudo[105503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwdpnqhahwkqtjvnlsuiqbyuthdwffrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100110.9340224-29-256954706800041/AnsiballZ_command.py'
Jan 22 16:41:51 compute-0 sudo[105503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:51 compute-0 python3.9[105505]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:41:51 compute-0 sudo[105503]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:52 compute-0 sudo[105668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxvixhrtwxyfbtzxhkzcvminyasgpdqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100111.9985454-40-269196008780262/AnsiballZ_systemd_service.py'
Jan 22 16:41:52 compute-0 sudo[105668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:52 compute-0 python3.9[105670]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 16:41:52 compute-0 systemd[1]: Reloading.
Jan 22 16:41:52 compute-0 systemd-sysv-generator[105699]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:41:52 compute-0 systemd-rc-local-generator[105696]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:41:53 compute-0 sudo[105668]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:53 compute-0 python3.9[105854]: ansible-ansible.builtin.service_facts Invoked
Jan 22 16:41:53 compute-0 network[105871]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 22 16:41:53 compute-0 network[105872]: 'network-scripts' will be removed from distribution in near future.
Jan 22 16:41:53 compute-0 network[105873]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 22 16:41:58 compute-0 sudo[106132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cutpyspwajmmbqshillbxjmwednazerx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100117.9901872-59-258512051138651/AnsiballZ_systemd_service.py'
Jan 22 16:41:58 compute-0 sudo[106132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:58 compute-0 python3.9[106134]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:41:58 compute-0 sudo[106132]: pam_unix(sudo:session): session closed for user root
Jan 22 16:41:59 compute-0 sudo[106285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oasacrwvpleoilgjqbxidycygrultpbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100118.874559-59-25590226845743/AnsiballZ_systemd_service.py'
Jan 22 16:41:59 compute-0 sudo[106285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:41:59 compute-0 python3.9[106287]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:41:59 compute-0 sudo[106285]: pam_unix(sudo:session): session closed for user root
Jan 22 16:42:00 compute-0 sudo[106438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vslsbouofpfndakvdeneckqgjqccvrrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100119.727619-59-60882510055377/AnsiballZ_systemd_service.py'
Jan 22 16:42:00 compute-0 sudo[106438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:42:00 compute-0 python3.9[106440]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:42:00 compute-0 sudo[106438]: pam_unix(sudo:session): session closed for user root
Jan 22 16:42:00 compute-0 sudo[106591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sumyfzbduyjvvcszfqovfqopeajfwqau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100120.6100724-59-104351767486035/AnsiballZ_systemd_service.py'
Jan 22 16:42:00 compute-0 sudo[106591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:42:01 compute-0 python3.9[106593]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:42:01 compute-0 sudo[106591]: pam_unix(sudo:session): session closed for user root
Jan 22 16:42:01 compute-0 sudo[106744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phrgtpfringspdvaroifegwyofepfkbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100121.4753969-59-393302356778/AnsiballZ_systemd_service.py'
Jan 22 16:42:01 compute-0 sudo[106744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:42:02 compute-0 python3.9[106746]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:42:02 compute-0 sudo[106744]: pam_unix(sudo:session): session closed for user root
Jan 22 16:42:02 compute-0 sudo[106897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbecqtdqlgqdnjijvovuzmnelobsvxgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100122.2897167-59-148757743649535/AnsiballZ_systemd_service.py'
Jan 22 16:42:02 compute-0 sudo[106897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:42:02 compute-0 python3.9[106899]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:42:02 compute-0 sudo[106897]: pam_unix(sudo:session): session closed for user root
Jan 22 16:42:03 compute-0 sudo[107050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbnarecyupggctrrwbqjtytznoehljsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100123.1493385-59-8542198390153/AnsiballZ_systemd_service.py'
Jan 22 16:42:03 compute-0 sudo[107050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:42:03 compute-0 python3.9[107052]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:42:03 compute-0 sudo[107050]: pam_unix(sudo:session): session closed for user root
Jan 22 16:42:04 compute-0 sudo[107203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkswbvcwctkixokmdyiprxmwcgkzqhvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100124.0644379-111-140714521057722/AnsiballZ_file.py'
Jan 22 16:42:04 compute-0 sudo[107203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:42:04 compute-0 python3.9[107205]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:42:04 compute-0 sudo[107203]: pam_unix(sudo:session): session closed for user root
Jan 22 16:42:05 compute-0 sudo[107355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxhiolootbizuimgqpunixbserhjsxhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100124.9684346-111-131133734528707/AnsiballZ_file.py'
Jan 22 16:42:05 compute-0 sudo[107355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:42:05 compute-0 python3.9[107357]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:42:05 compute-0 sudo[107355]: pam_unix(sudo:session): session closed for user root
Jan 22 16:42:05 compute-0 sudo[107507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqkaylgmfnifipfojekznjoddgfyymqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100125.6769984-111-146801509152703/AnsiballZ_file.py'
Jan 22 16:42:05 compute-0 sudo[107507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:42:06 compute-0 python3.9[107509]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:42:06 compute-0 sudo[107507]: pam_unix(sudo:session): session closed for user root
Jan 22 16:42:06 compute-0 sudo[107659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryfcihmvsuvlejoqxlmtrjyufqcveege ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100126.2886987-111-13340521020490/AnsiballZ_file.py'
Jan 22 16:42:06 compute-0 sudo[107659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:42:06 compute-0 python3.9[107661]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:42:06 compute-0 sudo[107659]: pam_unix(sudo:session): session closed for user root
Jan 22 16:42:07 compute-0 sudo[107811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcmugaenweyrfaynxsgswheufpuzhtvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100126.9238346-111-178259800061084/AnsiballZ_file.py'
Jan 22 16:42:07 compute-0 sudo[107811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:42:07 compute-0 python3.9[107813]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:42:07 compute-0 sudo[107811]: pam_unix(sudo:session): session closed for user root
Jan 22 16:42:07 compute-0 sudo[107963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxnjfrvjfmhvtqnfiwrrigrfagnqgzwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100127.5174391-111-95113654861444/AnsiballZ_file.py'
Jan 22 16:42:07 compute-0 sudo[107963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:42:08 compute-0 python3.9[107965]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:42:08 compute-0 sudo[107963]: pam_unix(sudo:session): session closed for user root
Jan 22 16:42:08 compute-0 sudo[108115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmscmytqgtxatyaecvfxzzmhvbcitifh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100128.1514053-111-160394254097048/AnsiballZ_file.py'
Jan 22 16:42:08 compute-0 sudo[108115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:42:08 compute-0 python3.9[108117]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:42:08 compute-0 sudo[108115]: pam_unix(sudo:session): session closed for user root
Jan 22 16:42:08 compute-0 sudo[108267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utlchnmjbxdkaedvstxyxguxebxikahy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100128.7312498-161-226794180618250/AnsiballZ_file.py'
Jan 22 16:42:08 compute-0 sudo[108267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:42:09 compute-0 python3.9[108269]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:42:09 compute-0 sudo[108267]: pam_unix(sudo:session): session closed for user root
Jan 22 16:42:09 compute-0 sudo[108419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glepgzbvwyxepuwsvtpbdgzktdghoumr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100129.3295581-161-45770337546386/AnsiballZ_file.py'
Jan 22 16:42:09 compute-0 sudo[108419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:42:09 compute-0 python3.9[108421]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:42:09 compute-0 sudo[108419]: pam_unix(sudo:session): session closed for user root
Jan 22 16:42:10 compute-0 sudo[108581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brnlpzniobubolhduhfqftkiqadoklrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100129.926215-161-211926764016192/AnsiballZ_file.py'
Jan 22 16:42:10 compute-0 sudo[108581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:42:10 compute-0 podman[108545]: 2026-01-22 16:42:10.308500876 +0000 UTC m=+0.082504659 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 16:42:10 compute-0 python3.9[108587]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:42:10 compute-0 sudo[108581]: pam_unix(sudo:session): session closed for user root
Jan 22 16:42:10 compute-0 sudo[108751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjqsmojvxgrmyvqtgprrjcnvmntflwpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100130.6164129-161-30234102232964/AnsiballZ_file.py'
Jan 22 16:42:10 compute-0 sudo[108751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:42:11 compute-0 podman[108714]: 2026-01-22 16:42:11.009882141 +0000 UTC m=+0.112130770 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 16:42:11 compute-0 python3.9[108762]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:42:11 compute-0 sudo[108751]: pam_unix(sudo:session): session closed for user root
Jan 22 16:42:11 compute-0 sudo[108918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pblphtzzfzadydxnecedltqkddzppijv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100131.2991755-161-101538930328457/AnsiballZ_file.py'
Jan 22 16:42:11 compute-0 sudo[108918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:42:11 compute-0 python3.9[108920]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:42:11 compute-0 sudo[108918]: pam_unix(sudo:session): session closed for user root
Jan 22 16:42:12 compute-0 sudo[109070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvtpplwgviazguarcsxuolejqvjqbjjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100131.8949842-161-145859873488974/AnsiballZ_file.py'
Jan 22 16:42:12 compute-0 sudo[109070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:42:12 compute-0 python3.9[109072]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:42:12 compute-0 sudo[109070]: pam_unix(sudo:session): session closed for user root
Jan 22 16:42:12 compute-0 sudo[109222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olqkaictrmaiqchaalvbdwuyfcyytkwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100132.5492845-161-102985255540478/AnsiballZ_file.py'
Jan 22 16:42:12 compute-0 sudo[109222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:42:13 compute-0 python3.9[109224]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:42:13 compute-0 sudo[109222]: pam_unix(sudo:session): session closed for user root
Jan 22 16:42:13 compute-0 sudo[109374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdvbxcgmsdrnygdkcwhvnvykgcjllopt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100133.3144674-212-152360636987475/AnsiballZ_command.py'
Jan 22 16:42:13 compute-0 sudo[109374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:42:13 compute-0 python3.9[109376]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:42:13 compute-0 sudo[109374]: pam_unix(sudo:session): session closed for user root
Jan 22 16:42:14 compute-0 python3.9[109528]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 22 16:42:15 compute-0 sudo[109678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqewdfqlqedetroqtkyarmiqbjlhzzth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100135.111904-230-137553287474228/AnsiballZ_systemd_service.py'
Jan 22 16:42:15 compute-0 sudo[109678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:42:15 compute-0 python3.9[109680]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 16:42:15 compute-0 systemd[1]: Reloading.
Jan 22 16:42:15 compute-0 systemd-sysv-generator[109709]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:42:15 compute-0 systemd-rc-local-generator[109705]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:42:16 compute-0 sudo[109678]: pam_unix(sudo:session): session closed for user root
Jan 22 16:42:16 compute-0 sudo[109865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fauvtdkscmuiqvtyanbkqoexradctiax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100136.2786362-238-15805129882553/AnsiballZ_command.py'
Jan 22 16:42:16 compute-0 sudo[109865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:42:16 compute-0 python3.9[109867]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:42:16 compute-0 sudo[109865]: pam_unix(sudo:session): session closed for user root
Jan 22 16:42:17 compute-0 sudo[110018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfmavdnlvlixeqneqsdzkkwmwbuqoxdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100136.9667583-238-97433012088394/AnsiballZ_command.py'
Jan 22 16:42:17 compute-0 sudo[110018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:42:17 compute-0 python3.9[110020]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:42:17 compute-0 sudo[110018]: pam_unix(sudo:session): session closed for user root
Jan 22 16:42:17 compute-0 sudo[110171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxhozzcuhrnupkfhmvcqeoniowpgthnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100137.609239-238-116379692454791/AnsiballZ_command.py'
Jan 22 16:42:17 compute-0 sudo[110171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:42:18 compute-0 python3.9[110173]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:42:18 compute-0 sudo[110171]: pam_unix(sudo:session): session closed for user root
Jan 22 16:42:18 compute-0 sudo[110324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dltaxlopdqcnenyqqsjddbnklfuozabs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100138.234949-238-257884184289090/AnsiballZ_command.py'
Jan 22 16:42:18 compute-0 sudo[110324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:42:18 compute-0 python3.9[110326]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:42:18 compute-0 sudo[110324]: pam_unix(sudo:session): session closed for user root
Jan 22 16:42:19 compute-0 sudo[110477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-matunkfxylkdujcdjptkfyjfihyuxgwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100138.919011-238-143663477049562/AnsiballZ_command.py'
Jan 22 16:42:19 compute-0 sudo[110477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:42:19 compute-0 python3.9[110479]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:42:19 compute-0 sudo[110477]: pam_unix(sudo:session): session closed for user root
Jan 22 16:42:20 compute-0 sudo[110630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyffbyxtldjeljqkbgwlartprcomnzdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100139.7279477-238-42411741339521/AnsiballZ_command.py'
Jan 22 16:42:20 compute-0 sudo[110630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:42:20 compute-0 python3.9[110632]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:42:20 compute-0 sudo[110630]: pam_unix(sudo:session): session closed for user root
Jan 22 16:42:20 compute-0 sudo[110783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qeztuyamhnurfujltzfxofwwjmteyxmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100140.4976363-238-276440398830112/AnsiballZ_command.py'
Jan 22 16:42:20 compute-0 sudo[110783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:42:20 compute-0 python3.9[110785]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:42:21 compute-0 sudo[110783]: pam_unix(sudo:session): session closed for user root
Jan 22 16:42:21 compute-0 sudo[110936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crkiypwiuqmcvgnzhkwdrstxlzhnzrdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100141.456181-292-137992353290883/AnsiballZ_getent.py'
Jan 22 16:42:21 compute-0 sudo[110936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:42:22 compute-0 python3.9[110938]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Jan 22 16:42:22 compute-0 sudo[110936]: pam_unix(sudo:session): session closed for user root
Jan 22 16:42:22 compute-0 sudo[111089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svysixgionwabiocnrsbgdlaolmtknrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100142.3672197-300-243729111552364/AnsiballZ_group.py'
Jan 22 16:42:22 compute-0 sudo[111089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:42:23 compute-0 python3.9[111091]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 22 16:42:23 compute-0 groupadd[111092]: group added to /etc/group: name=libvirt, GID=42473
Jan 22 16:42:23 compute-0 groupadd[111092]: group added to /etc/gshadow: name=libvirt
Jan 22 16:42:23 compute-0 groupadd[111092]: new group: name=libvirt, GID=42473
Jan 22 16:42:23 compute-0 sudo[111089]: pam_unix(sudo:session): session closed for user root
Jan 22 16:42:24 compute-0 sudo[111247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwfyizqsbjuxocbetbmqzjrkowckurvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100143.7076554-308-418267453158/AnsiballZ_user.py'
Jan 22 16:42:24 compute-0 sudo[111247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:42:24 compute-0 python3.9[111249]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 22 16:42:24 compute-0 useradd[111251]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Jan 22 16:42:24 compute-0 sudo[111247]: pam_unix(sudo:session): session closed for user root
Jan 22 16:42:25 compute-0 sudo[111407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbyxcgtfznbkhfcnwxtoywrvjxivnrvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100145.0733917-319-244428733281588/AnsiballZ_setup.py'
Jan 22 16:42:25 compute-0 sudo[111407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:42:25 compute-0 python3.9[111409]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 16:42:25 compute-0 sudo[111407]: pam_unix(sudo:session): session closed for user root
Jan 22 16:42:26 compute-0 sudo[111491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnfvvwrtmbfqwcgfbkpscrumppptxtit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100145.0733917-319-244428733281588/AnsiballZ_dnf.py'
Jan 22 16:42:26 compute-0 sudo[111491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:42:26 compute-0 python3.9[111493]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 16:42:41 compute-0 podman[111678]: 2026-01-22 16:42:41.384716352 +0000 UTC m=+0.083308705 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 22 16:42:41 compute-0 podman[111677]: 2026-01-22 16:42:41.413397155 +0000 UTC m=+0.111582108 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 16:42:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:42:41.896 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:42:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:42:41.896 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:42:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:42:41.896 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:42:56 compute-0 kernel: SELinux:  Converting 2764 SID table entries...
Jan 22 16:42:56 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 16:42:56 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 22 16:42:56 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 16:42:56 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 22 16:42:56 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 16:42:56 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 16:42:56 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 16:43:06 compute-0 kernel: SELinux:  Converting 2764 SID table entries...
Jan 22 16:43:06 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 16:43:06 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 22 16:43:06 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 16:43:06 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 22 16:43:06 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 16:43:06 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 16:43:06 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 16:43:12 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Jan 22 16:43:12 compute-0 podman[111742]: 2026-01-22 16:43:12.39200836 +0000 UTC m=+0.077051781 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 16:43:12 compute-0 podman[111741]: 2026-01-22 16:43:12.446462528 +0000 UTC m=+0.140339681 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 16:43:15 compute-0 sshd-session[111787]: Received disconnect from 91.224.92.78 port 13752:11:  [preauth]
Jan 22 16:43:15 compute-0 sshd-session[111787]: Disconnected from authenticating user root 91.224.92.78 port 13752 [preauth]
Jan 22 16:43:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:43:41.897 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:43:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:43:41.898 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:43:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:43:41.898 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:43:43 compute-0 podman[127475]: 2026-01-22 16:43:43.343463223 +0000 UTC m=+0.051673639 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 16:43:43 compute-0 podman[127465]: 2026-01-22 16:43:43.3791001 +0000 UTC m=+0.089709291 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2)
Jan 22 16:43:59 compute-0 kernel: SELinux:  Converting 2765 SID table entries...
Jan 22 16:43:59 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 16:43:59 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 22 16:43:59 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 16:43:59 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 22 16:43:59 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 16:43:59 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 16:43:59 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 16:44:00 compute-0 groupadd[128721]: group added to /etc/group: name=dnsmasq, GID=993
Jan 22 16:44:00 compute-0 groupadd[128721]: group added to /etc/gshadow: name=dnsmasq
Jan 22 16:44:00 compute-0 groupadd[128721]: new group: name=dnsmasq, GID=993
Jan 22 16:44:00 compute-0 useradd[128728]: new user: name=dnsmasq, UID=992, GID=993, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Jan 22 16:44:00 compute-0 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Jan 22 16:44:00 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Jan 22 16:44:00 compute-0 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Jan 22 16:44:01 compute-0 groupadd[128741]: group added to /etc/group: name=clevis, GID=992
Jan 22 16:44:01 compute-0 groupadd[128741]: group added to /etc/gshadow: name=clevis
Jan 22 16:44:01 compute-0 groupadd[128741]: new group: name=clevis, GID=992
Jan 22 16:44:01 compute-0 useradd[128748]: new user: name=clevis, UID=991, GID=992, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Jan 22 16:44:01 compute-0 usermod[128758]: add 'clevis' to group 'tss'
Jan 22 16:44:01 compute-0 usermod[128758]: add 'clevis' to shadow group 'tss'
Jan 22 16:44:08 compute-0 polkitd[43605]: Reloading rules
Jan 22 16:44:08 compute-0 polkitd[43605]: Collecting garbage unconditionally...
Jan 22 16:44:08 compute-0 polkitd[43605]: Loading rules from directory /etc/polkit-1/rules.d
Jan 22 16:44:08 compute-0 polkitd[43605]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 22 16:44:08 compute-0 polkitd[43605]: Finished loading, compiling and executing 3 rules
Jan 22 16:44:08 compute-0 polkitd[43605]: Reloading rules
Jan 22 16:44:08 compute-0 polkitd[43605]: Collecting garbage unconditionally...
Jan 22 16:44:08 compute-0 polkitd[43605]: Loading rules from directory /etc/polkit-1/rules.d
Jan 22 16:44:08 compute-0 polkitd[43605]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 22 16:44:08 compute-0 polkitd[43605]: Finished loading, compiling and executing 3 rules
Jan 22 16:44:13 compute-0 groupadd[128948]: group added to /etc/group: name=ceph, GID=167
Jan 22 16:44:13 compute-0 groupadd[128948]: group added to /etc/gshadow: name=ceph
Jan 22 16:44:13 compute-0 groupadd[128948]: new group: name=ceph, GID=167
Jan 22 16:44:13 compute-0 useradd[128954]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Jan 22 16:44:14 compute-0 podman[128962]: 2026-01-22 16:44:14.406018658 +0000 UTC m=+0.104726444 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 22 16:44:14 compute-0 podman[128961]: 2026-01-22 16:44:14.457359397 +0000 UTC m=+0.156354990 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 16:44:15 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Jan 22 16:44:15 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Jan 22 16:44:15 compute-0 sshd[1007]: Received signal 15; terminating.
Jan 22 16:44:15 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Jan 22 16:44:15 compute-0 systemd[1]: sshd.service: Consumed 2.307s CPU time, read 564.0K from disk, written 8.0K to disk.
Jan 22 16:44:15 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Jan 22 16:44:15 compute-0 systemd[1]: Stopping sshd-keygen.target...
Jan 22 16:44:15 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 22 16:44:15 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 22 16:44:15 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 22 16:44:15 compute-0 systemd[1]: Reached target sshd-keygen.target.
Jan 22 16:44:15 compute-0 systemd[1]: Starting OpenSSH server daemon...
Jan 22 16:44:15 compute-0 sshd[129518]: Server listening on 0.0.0.0 port 22.
Jan 22 16:44:15 compute-0 sshd[129518]: Server listening on :: port 22.
Jan 22 16:44:15 compute-0 systemd[1]: Started OpenSSH server daemon.
Jan 22 16:44:17 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 16:44:17 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 22 16:44:17 compute-0 systemd[1]: Reloading.
Jan 22 16:44:17 compute-0 systemd-rc-local-generator[129772]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:44:17 compute-0 systemd-sysv-generator[129776]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:44:17 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 16:44:20 compute-0 sudo[111491]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:21 compute-0 sudo[134001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kymrxmgzynmhbzqyjsuywjoqobbtxfxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100260.941377-331-61365260145347/AnsiballZ_systemd.py'
Jan 22 16:44:21 compute-0 sudo[134001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:21 compute-0 python3.9[134024]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 22 16:44:21 compute-0 systemd[1]: Reloading.
Jan 22 16:44:21 compute-0 systemd-sysv-generator[134501]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:44:21 compute-0 systemd-rc-local-generator[134496]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:44:22 compute-0 sudo[134001]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:22 compute-0 sudo[135238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-disknuoqboczbvymcpyaluqgslnsibgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100262.2842443-331-170057479836700/AnsiballZ_systemd.py'
Jan 22 16:44:22 compute-0 sudo[135238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:22 compute-0 python3.9[135263]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 22 16:44:22 compute-0 systemd[1]: Reloading.
Jan 22 16:44:22 compute-0 systemd-rc-local-generator[135754]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:44:22 compute-0 systemd-sysv-generator[135757]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:44:23 compute-0 sudo[135238]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:23 compute-0 sudo[136575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dablmtfrvrvvnmmoezlmkwdlxnohlhcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100263.308959-331-10693110969783/AnsiballZ_systemd.py'
Jan 22 16:44:23 compute-0 sudo[136575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:23 compute-0 python3.9[136604]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 22 16:44:23 compute-0 systemd[1]: Reloading.
Jan 22 16:44:24 compute-0 systemd-rc-local-generator[137062]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:44:24 compute-0 systemd-sysv-generator[137068]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:44:24 compute-0 sudo[136575]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:24 compute-0 sudo[137896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcifaxhmijldospfndsocokqrnxobnau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100264.3401582-331-36450921582734/AnsiballZ_systemd.py'
Jan 22 16:44:24 compute-0 sudo[137896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:24 compute-0 python3.9[137922]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 22 16:44:25 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 16:44:25 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 22 16:44:25 compute-0 systemd[1]: man-db-cache-update.service: Consumed 9.925s CPU time.
Jan 22 16:44:25 compute-0 systemd[1]: run-rd18b688228a1476daebf4b6352929df1.service: Deactivated successfully.
Jan 22 16:44:25 compute-0 systemd[1]: Reloading.
Jan 22 16:44:26 compute-0 systemd-rc-local-generator[138904]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:44:26 compute-0 systemd-sysv-generator[138908]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:44:26 compute-0 sudo[137896]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:26 compute-0 sudo[139064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzljywlvfetoleuwduvatghcxuwgmujv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100266.3943315-360-139691707833787/AnsiballZ_systemd.py'
Jan 22 16:44:26 compute-0 sudo[139064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:27 compute-0 python3.9[139066]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 16:44:27 compute-0 systemd[1]: Reloading.
Jan 22 16:44:27 compute-0 systemd-sysv-generator[139097]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:44:27 compute-0 systemd-rc-local-generator[139093]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:44:27 compute-0 sudo[139064]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:27 compute-0 sudo[139254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqujxqhldaoetsvqmbfxjskehvdxmpiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100267.527159-360-100501845890980/AnsiballZ_systemd.py'
Jan 22 16:44:27 compute-0 sudo[139254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:28 compute-0 python3.9[139256]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 16:44:28 compute-0 systemd[1]: Reloading.
Jan 22 16:44:28 compute-0 systemd-sysv-generator[139287]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:44:28 compute-0 systemd-rc-local-generator[139283]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:44:28 compute-0 sudo[139254]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:29 compute-0 sudo[139444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvghblxugvvuwafimpbmdkimcegkkpfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100269.0926342-360-30135406070621/AnsiballZ_systemd.py'
Jan 22 16:44:29 compute-0 sudo[139444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:29 compute-0 python3.9[139446]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 16:44:29 compute-0 systemd[1]: Reloading.
Jan 22 16:44:29 compute-0 systemd-rc-local-generator[139475]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:44:29 compute-0 systemd-sysv-generator[139479]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:44:30 compute-0 sudo[139444]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:30 compute-0 sudo[139634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkstyzpwxnughweghvlnnzpqaapsvwis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100270.2907934-360-252456693633573/AnsiballZ_systemd.py'
Jan 22 16:44:30 compute-0 sudo[139634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:30 compute-0 python3.9[139636]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 16:44:31 compute-0 sudo[139634]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:31 compute-0 sudo[139789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byoodpdvwrsrajbxdvtuxhugfhoskvqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100271.2260764-360-38950764971332/AnsiballZ_systemd.py'
Jan 22 16:44:31 compute-0 sudo[139789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:31 compute-0 python3.9[139791]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 16:44:32 compute-0 systemd[1]: Reloading.
Jan 22 16:44:32 compute-0 systemd-rc-local-generator[139822]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:44:32 compute-0 systemd-sysv-generator[139825]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:44:33 compute-0 sudo[139789]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:33 compute-0 sudo[139979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbvlqmhyjmgwskgkjoocwlxpmtqewgxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100273.339646-396-7697984746279/AnsiballZ_systemd.py'
Jan 22 16:44:33 compute-0 sudo[139979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:33 compute-0 python3.9[139981]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 22 16:44:34 compute-0 systemd[1]: Reloading.
Jan 22 16:44:34 compute-0 systemd-sysv-generator[140012]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:44:34 compute-0 systemd-rc-local-generator[140009]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:44:34 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Jan 22 16:44:34 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Jan 22 16:44:34 compute-0 sudo[139979]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:34 compute-0 sudo[140172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjlkhfymqdrndywwylymhwdkocelnlzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100274.5954967-404-251333561812693/AnsiballZ_systemd.py'
Jan 22 16:44:34 compute-0 sudo[140172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:35 compute-0 python3.9[140174]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 16:44:35 compute-0 sudo[140172]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:35 compute-0 sudo[140327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcpbfhjlyinsogukwbsaiyxzvkoqergn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100275.4240031-404-232347694390981/AnsiballZ_systemd.py'
Jan 22 16:44:35 compute-0 sudo[140327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:36 compute-0 python3.9[140329]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 16:44:36 compute-0 sudo[140327]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:36 compute-0 sudo[140482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xujmymkcpchsgqeiougiitlamrbfwcge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100276.3057597-404-81505009728317/AnsiballZ_systemd.py'
Jan 22 16:44:36 compute-0 sudo[140482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:36 compute-0 python3.9[140484]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 16:44:36 compute-0 sudo[140482]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:37 compute-0 sudo[140637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vorexzyprssyvzmmmwaxfqbtumbpnixi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100277.1583622-404-42780725822330/AnsiballZ_systemd.py'
Jan 22 16:44:37 compute-0 sudo[140637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:37 compute-0 python3.9[140639]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 16:44:37 compute-0 sudo[140637]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:38 compute-0 sudo[140792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfxabuybjjscdngptameahnsgitzjzvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100278.097799-404-199267433286013/AnsiballZ_systemd.py'
Jan 22 16:44:38 compute-0 sudo[140792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:38 compute-0 python3.9[140794]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 16:44:38 compute-0 sudo[140792]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:39 compute-0 sudo[140947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmbekcykcpcvovenampjqgzsyyxnltrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100278.862845-404-266797895889113/AnsiballZ_systemd.py'
Jan 22 16:44:39 compute-0 sudo[140947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:39 compute-0 python3.9[140949]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 16:44:39 compute-0 sudo[140947]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:39 compute-0 sudo[141102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqmovospzajragtllzvsyftigziqzktq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100279.6277235-404-153893284260132/AnsiballZ_systemd.py'
Jan 22 16:44:39 compute-0 sudo[141102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:40 compute-0 python3.9[141104]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 16:44:40 compute-0 sudo[141102]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:40 compute-0 sudo[141257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwpblnoovemnhzqburemzzlkyaujlyzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100280.3739126-404-197996466022357/AnsiballZ_systemd.py'
Jan 22 16:44:40 compute-0 sudo[141257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:40 compute-0 python3.9[141259]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 16:44:40 compute-0 sudo[141257]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:41 compute-0 sudo[141412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwlmlkaxuptrmogwanqgssomtwilqpcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100281.1148465-404-43229161688656/AnsiballZ_systemd.py'
Jan 22 16:44:41 compute-0 sudo[141412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:41 compute-0 python3.9[141414]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 16:44:41 compute-0 sudo[141412]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:44:41.897 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:44:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:44:41.899 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:44:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:44:41.899 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:44:42 compute-0 sudo[141567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afjwpzudylcashyothjumvfnccabtcie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100281.8526368-404-22746763376877/AnsiballZ_systemd.py'
Jan 22 16:44:42 compute-0 sudo[141567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:42 compute-0 python3.9[141569]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 16:44:42 compute-0 sudo[141567]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:43 compute-0 sudo[141722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-coqevtoriwzncbprelqvqlqfmnmbnjjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100282.5662236-404-198579615336912/AnsiballZ_systemd.py'
Jan 22 16:44:43 compute-0 sudo[141722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:43 compute-0 python3.9[141724]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 16:44:43 compute-0 sudo[141722]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:43 compute-0 sudo[141877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqugttmzvqgrngbivtzxoamimwcvanct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100283.5662308-404-31113737453994/AnsiballZ_systemd.py'
Jan 22 16:44:43 compute-0 sudo[141877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:44 compute-0 python3.9[141879]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 16:44:44 compute-0 sudo[141877]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:44 compute-0 sudo[142053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltkvejeushnknbcbrchvsgppebrrtoli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100284.415478-404-193152502166503/AnsiballZ_systemd.py'
Jan 22 16:44:44 compute-0 sudo[142053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:44 compute-0 podman[142007]: 2026-01-22 16:44:44.789499546 +0000 UTC m=+0.063442512 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 16:44:44 compute-0 podman[142006]: 2026-01-22 16:44:44.817297985 +0000 UTC m=+0.090990964 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 16:44:45 compute-0 python3.9[142067]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 16:44:45 compute-0 sudo[142053]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:45 compute-0 sudo[142231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oarwjcctffddkflwrqfgxlnugzskfbou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100285.2699816-404-46375018433496/AnsiballZ_systemd.py'
Jan 22 16:44:45 compute-0 sudo[142231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:45 compute-0 python3.9[142233]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 16:44:45 compute-0 sudo[142231]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:46 compute-0 sudo[142386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqqgsbflarjtserayjgymgrsthnqcija ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100286.3254278-506-59091919947744/AnsiballZ_file.py'
Jan 22 16:44:46 compute-0 sudo[142386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:46 compute-0 python3.9[142388]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:44:46 compute-0 sudo[142386]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:47 compute-0 sudo[142538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfmflotbrrjzzzmpeqjddagykzbapdvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100286.9049876-506-266434253770550/AnsiballZ_file.py'
Jan 22 16:44:47 compute-0 sudo[142538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:48 compute-0 python3.9[142540]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:44:48 compute-0 sudo[142538]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:49 compute-0 sudo[142690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpsjzrhwxsksxfaisqvcmykbtdhlxirz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100288.7661896-506-191473328582868/AnsiballZ_file.py'
Jan 22 16:44:49 compute-0 sudo[142690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:49 compute-0 python3.9[142692]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:44:49 compute-0 sudo[142690]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:49 compute-0 sudo[142842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuikhdcihtpyomzbqlguzcejiathavxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100289.5150003-506-49390820608273/AnsiballZ_file.py'
Jan 22 16:44:49 compute-0 sudo[142842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:50 compute-0 python3.9[142844]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:44:50 compute-0 sudo[142842]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:50 compute-0 sudo[142994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcyonekwkyzbsslfvvmrrxbxveqycmzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100290.3546443-506-181642003465275/AnsiballZ_file.py'
Jan 22 16:44:50 compute-0 sudo[142994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:50 compute-0 python3.9[142996]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:44:50 compute-0 sudo[142994]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:51 compute-0 sudo[143146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bklwwggqfmngxiznmhtaexxaqqpziwov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100291.0457776-506-135094075838169/AnsiballZ_file.py'
Jan 22 16:44:51 compute-0 sudo[143146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:51 compute-0 python3.9[143148]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:44:51 compute-0 sudo[143146]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:52 compute-0 python3.9[143298]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:44:52 compute-0 sudo[143448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evepxsbvtazcvzkqqzfeubhaykxoxpyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100292.5234628-557-227720119408020/AnsiballZ_stat.py'
Jan 22 16:44:52 compute-0 sudo[143448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:53 compute-0 python3.9[143450]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:44:53 compute-0 sudo[143448]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:53 compute-0 sudo[143573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqqypcpzgpomprwaudsgoqvnezxckwxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100292.5234628-557-227720119408020/AnsiballZ_copy.py'
Jan 22 16:44:53 compute-0 sudo[143573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:53 compute-0 python3.9[143575]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769100292.5234628-557-227720119408020/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:44:53 compute-0 sudo[143573]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:54 compute-0 sudo[143725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lahmdjpufmbnsuytmcsofacyabgbvctx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100294.069193-557-126335882140265/AnsiballZ_stat.py'
Jan 22 16:44:54 compute-0 sudo[143725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:54 compute-0 python3.9[143727]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:44:54 compute-0 sudo[143725]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:55 compute-0 sudo[143850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqrffhydhrfbesfryiuzjvccepnasiwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100294.069193-557-126335882140265/AnsiballZ_copy.py'
Jan 22 16:44:55 compute-0 sudo[143850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:55 compute-0 python3.9[143852]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769100294.069193-557-126335882140265/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:44:55 compute-0 sudo[143850]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:55 compute-0 sudo[144002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzofipkasmgipmqbebdkmuedhnpbdteo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100295.420377-557-129366169666308/AnsiballZ_stat.py'
Jan 22 16:44:55 compute-0 sudo[144002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:55 compute-0 python3.9[144004]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:44:55 compute-0 sudo[144002]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:56 compute-0 sudo[144127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjdtpgocidxumwzfldxrsmmgkgfshyjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100295.420377-557-129366169666308/AnsiballZ_copy.py'
Jan 22 16:44:56 compute-0 sudo[144127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:56 compute-0 python3.9[144129]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769100295.420377-557-129366169666308/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:44:56 compute-0 sudo[144127]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:57 compute-0 sudo[144279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhdvbcyhxzqlhruommcukrqvzgqwnuke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100296.7223618-557-30845313943835/AnsiballZ_stat.py'
Jan 22 16:44:57 compute-0 sudo[144279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:57 compute-0 python3.9[144281]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:44:57 compute-0 sudo[144279]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:57 compute-0 sudo[144404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jijqexsnrzbufwhejyudqgwsacwvqlsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100296.7223618-557-30845313943835/AnsiballZ_copy.py'
Jan 22 16:44:57 compute-0 sudo[144404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:57 compute-0 python3.9[144406]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769100296.7223618-557-30845313943835/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:44:57 compute-0 sudo[144404]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:58 compute-0 sudo[144556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owsnnlbyosolrimjrpnbugcbfdkvqhvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100298.1262882-557-142750323949502/AnsiballZ_stat.py'
Jan 22 16:44:58 compute-0 sudo[144556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:58 compute-0 python3.9[144558]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:44:58 compute-0 sudo[144556]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:59 compute-0 sudo[144681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yalxmwbsteozibzsouexmijsmbvewjbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100298.1262882-557-142750323949502/AnsiballZ_copy.py'
Jan 22 16:44:59 compute-0 sudo[144681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:44:59 compute-0 python3.9[144683]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769100298.1262882-557-142750323949502/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:44:59 compute-0 sudo[144681]: pam_unix(sudo:session): session closed for user root
Jan 22 16:44:59 compute-0 sudo[144833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdfvxkhujeiqrbilduhodimmpxrlbkci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100299.6549678-557-214869363553051/AnsiballZ_stat.py'
Jan 22 16:44:59 compute-0 sudo[144833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:00 compute-0 python3.9[144835]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:45:00 compute-0 sudo[144833]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:00 compute-0 sudo[144958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugmrpvzznkafbhwfilnahrqderwojztc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100299.6549678-557-214869363553051/AnsiballZ_copy.py'
Jan 22 16:45:00 compute-0 sudo[144958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:00 compute-0 python3.9[144960]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769100299.6549678-557-214869363553051/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:00 compute-0 sudo[144958]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:01 compute-0 sudo[145110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alcslagxmclwdwmpzxgcntrxszcaeuab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100300.8926613-557-240935400729147/AnsiballZ_stat.py'
Jan 22 16:45:01 compute-0 sudo[145110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:01 compute-0 python3.9[145112]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:45:01 compute-0 sudo[145110]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:01 compute-0 sudo[145233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azavflctiawddsrxliiazyytlmstbrhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100300.8926613-557-240935400729147/AnsiballZ_copy.py'
Jan 22 16:45:01 compute-0 sudo[145233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:01 compute-0 python3.9[145235]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769100300.8926613-557-240935400729147/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:01 compute-0 sudo[145233]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:02 compute-0 sudo[145385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-faqgdrogttauvynrphhcbpovijexvclx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100302.1173806-557-64353938094208/AnsiballZ_stat.py'
Jan 22 16:45:02 compute-0 sudo[145385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:02 compute-0 python3.9[145387]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:45:02 compute-0 sudo[145385]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:03 compute-0 sudo[145510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uooyhrdavmxvqbhayxtvxusbtyliiavr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100302.1173806-557-64353938094208/AnsiballZ_copy.py'
Jan 22 16:45:03 compute-0 sudo[145510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:03 compute-0 python3.9[145512]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769100302.1173806-557-64353938094208/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:03 compute-0 sudo[145510]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:03 compute-0 sudo[145662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chpkooiyiveydafamalhbndkyfqyiztz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100303.5252175-670-203142317515988/AnsiballZ_command.py'
Jan 22 16:45:03 compute-0 sudo[145662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:04 compute-0 python3.9[145664]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Jan 22 16:45:04 compute-0 sudo[145662]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:04 compute-0 sudo[145815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nufltzarmigfcfuzfnybuzwilrjzfawr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100304.2362113-679-170649361918245/AnsiballZ_file.py'
Jan 22 16:45:04 compute-0 sudo[145815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:04 compute-0 python3.9[145817]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:04 compute-0 sudo[145815]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:05 compute-0 sudo[145967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bygtfjzerttxznttitvrnxgsfliioxnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100304.9083517-679-197082726425159/AnsiballZ_file.py'
Jan 22 16:45:05 compute-0 sudo[145967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:05 compute-0 python3.9[145969]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:05 compute-0 sudo[145967]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:05 compute-0 sudo[146119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lztakonneytzrayoknqfceuexykzfiiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100305.5150132-679-141219828096814/AnsiballZ_file.py'
Jan 22 16:45:05 compute-0 sudo[146119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:06 compute-0 python3.9[146121]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:06 compute-0 sudo[146119]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:06 compute-0 sudo[146271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khxcdcjljvkgzqwzyfboctfwuzqyutnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100306.1737213-679-66253556681779/AnsiballZ_file.py'
Jan 22 16:45:06 compute-0 sudo[146271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:06 compute-0 python3.9[146273]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:06 compute-0 sudo[146271]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:07 compute-0 sudo[146423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nevopggvwegwwwfnulbllkohopdacjwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100306.94345-679-193132671840742/AnsiballZ_file.py'
Jan 22 16:45:07 compute-0 sudo[146423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:07 compute-0 python3.9[146425]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:07 compute-0 sudo[146423]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:07 compute-0 sudo[146575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilkuttdvvnygsdpmanxwuavejljqjqba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100307.6410253-679-104967589387753/AnsiballZ_file.py'
Jan 22 16:45:07 compute-0 sudo[146575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:08 compute-0 python3.9[146577]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:08 compute-0 sudo[146575]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:08 compute-0 sudo[146727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gssztcsjrgdxvijexyfqztarncevclqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100308.2747805-679-35768545051213/AnsiballZ_file.py'
Jan 22 16:45:08 compute-0 sudo[146727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:08 compute-0 python3.9[146729]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:08 compute-0 sudo[146727]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:09 compute-0 sudo[146879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icbyekchechhbsqyjgpxkrylwrxbjykg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100308.9462519-679-266144845616229/AnsiballZ_file.py'
Jan 22 16:45:09 compute-0 sudo[146879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:09 compute-0 python3.9[146881]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:09 compute-0 sudo[146879]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:09 compute-0 sudo[147031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gllnzvspjpizgezlhgjldxrazcgosmpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100309.6000493-679-5131761603474/AnsiballZ_file.py'
Jan 22 16:45:09 compute-0 sudo[147031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:10 compute-0 python3.9[147033]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:10 compute-0 sudo[147031]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:10 compute-0 sudo[147183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrisrxxhxbgqximdbmjzlgyacleqlzwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100310.2570171-679-184110205774389/AnsiballZ_file.py'
Jan 22 16:45:10 compute-0 sudo[147183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:10 compute-0 python3.9[147185]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:10 compute-0 sudo[147183]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:11 compute-0 sudo[147335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yulxevlmpmbyotmjugxvdtvxuinmpygi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100310.8637626-679-274362785975973/AnsiballZ_file.py'
Jan 22 16:45:11 compute-0 sudo[147335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:11 compute-0 python3.9[147337]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:11 compute-0 sudo[147335]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:11 compute-0 sudo[147487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sinasxppwnwshlztkirfvpjmxlyimagf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100311.6164331-679-272814816031106/AnsiballZ_file.py'
Jan 22 16:45:11 compute-0 sudo[147487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:12 compute-0 python3.9[147489]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:12 compute-0 sudo[147487]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:12 compute-0 sudo[147639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orkchwyxbmbxcuxndyyaeulyvgfhvpsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100312.273191-679-189392446731831/AnsiballZ_file.py'
Jan 22 16:45:12 compute-0 sudo[147639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:12 compute-0 python3.9[147641]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:12 compute-0 sudo[147639]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:13 compute-0 sudo[147791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdtdarqkwilpyorkmstwnqdbebwjrtif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100312.9644-679-98376433923229/AnsiballZ_file.py'
Jan 22 16:45:13 compute-0 sudo[147791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:13 compute-0 python3.9[147793]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:13 compute-0 sudo[147791]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:14 compute-0 sudo[147943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmvrkgfgvqqwqutrjoceaqlaowfhsgmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100313.7332788-778-117479124097741/AnsiballZ_stat.py'
Jan 22 16:45:14 compute-0 sudo[147943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:14 compute-0 python3.9[147945]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:45:14 compute-0 sudo[147943]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:14 compute-0 sudo[148066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-livusydsjytwyvxbkhdhoukhgzcbiawp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100313.7332788-778-117479124097741/AnsiballZ_copy.py'
Jan 22 16:45:14 compute-0 sudo[148066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:14 compute-0 python3.9[148068]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769100313.7332788-778-117479124097741/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:14 compute-0 sudo[148066]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:15 compute-0 podman[148149]: 2026-01-22 16:45:15.35336431 +0000 UTC m=+0.055479385 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 16:45:15 compute-0 podman[148145]: 2026-01-22 16:45:15.425514352 +0000 UTC m=+0.134584514 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 22 16:45:15 compute-0 sudo[148262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cioxbveppypaconnydjyhvjdyfrkakhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100315.1657555-778-106858357599881/AnsiballZ_stat.py'
Jan 22 16:45:15 compute-0 sudo[148262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:15 compute-0 python3.9[148264]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:45:15 compute-0 sudo[148262]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:16 compute-0 sudo[148385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdicustentofdplzqvtgfqqfuruvjprq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100315.1657555-778-106858357599881/AnsiballZ_copy.py'
Jan 22 16:45:16 compute-0 sudo[148385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:16 compute-0 python3.9[148387]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769100315.1657555-778-106858357599881/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:16 compute-0 sudo[148385]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:16 compute-0 sudo[148537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhzyyayyungvwtcuemspkadopswqbspk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100316.392355-778-82750802531221/AnsiballZ_stat.py'
Jan 22 16:45:16 compute-0 sudo[148537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:16 compute-0 python3.9[148539]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:45:16 compute-0 sudo[148537]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:17 compute-0 sudo[148660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxgmodprsgssiwlsgjosiankowaizaqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100316.392355-778-82750802531221/AnsiballZ_copy.py'
Jan 22 16:45:17 compute-0 sudo[148660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:17 compute-0 python3.9[148662]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769100316.392355-778-82750802531221/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:17 compute-0 sudo[148660]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:17 compute-0 sudo[148812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdyaxwjxndpeihdjupeiuspeanuwyxuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100317.656311-778-108932132767865/AnsiballZ_stat.py'
Jan 22 16:45:17 compute-0 sudo[148812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:18 compute-0 python3.9[148814]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:45:18 compute-0 sudo[148812]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:18 compute-0 sudo[148935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhvusswbqezecvwbkxiwdncloberrjsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100317.656311-778-108932132767865/AnsiballZ_copy.py'
Jan 22 16:45:18 compute-0 sudo[148935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:18 compute-0 python3.9[148937]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769100317.656311-778-108932132767865/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:18 compute-0 sudo[148935]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:19 compute-0 sudo[149087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jptzpgjsxnxjausolosgnvdycjomyttu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100318.907112-778-72721725375940/AnsiballZ_stat.py'
Jan 22 16:45:19 compute-0 sudo[149087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:19 compute-0 python3.9[149089]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:45:19 compute-0 sudo[149087]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:19 compute-0 sudo[149210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrtoungvlfdowtvapqgpmrfpxsvgredx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100318.907112-778-72721725375940/AnsiballZ_copy.py'
Jan 22 16:45:19 compute-0 sudo[149210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:20 compute-0 python3.9[149212]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769100318.907112-778-72721725375940/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:20 compute-0 sudo[149210]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:20 compute-0 sudo[149362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cddgwqhthdzysxuorrpwnsmwjaycknsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100320.2460382-778-152166930122208/AnsiballZ_stat.py'
Jan 22 16:45:20 compute-0 sudo[149362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:20 compute-0 python3.9[149364]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:45:20 compute-0 sudo[149362]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:21 compute-0 sudo[149485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-carmpzjpshlmlgawdfmcuqlnhcsejmmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100320.2460382-778-152166930122208/AnsiballZ_copy.py'
Jan 22 16:45:21 compute-0 sudo[149485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:21 compute-0 python3.9[149487]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769100320.2460382-778-152166930122208/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:21 compute-0 sudo[149485]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:21 compute-0 sudo[149637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgmopmrvtlvhwzyqkdsgesimsizwnkbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100321.5025456-778-194316757214469/AnsiballZ_stat.py'
Jan 22 16:45:21 compute-0 sudo[149637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:22 compute-0 python3.9[149639]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:45:22 compute-0 sudo[149637]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:22 compute-0 sudo[149760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvalguoxctbrzuuyzmacjflqfovubztv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100321.5025456-778-194316757214469/AnsiballZ_copy.py'
Jan 22 16:45:22 compute-0 sudo[149760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:22 compute-0 python3.9[149762]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769100321.5025456-778-194316757214469/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:22 compute-0 sudo[149760]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:23 compute-0 sudo[149912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swwxkbifuuorcntkcydyquopmczywsne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100322.764013-778-237076802438724/AnsiballZ_stat.py'
Jan 22 16:45:23 compute-0 sudo[149912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:23 compute-0 python3.9[149914]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:45:23 compute-0 sudo[149912]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:23 compute-0 sudo[150035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-voaedxrftwdgmsawqhcrtznajzajxetl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100322.764013-778-237076802438724/AnsiballZ_copy.py'
Jan 22 16:45:23 compute-0 sudo[150035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:23 compute-0 python3.9[150037]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769100322.764013-778-237076802438724/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:23 compute-0 sudo[150035]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:24 compute-0 sudo[150187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwzrpkmtxxdbauiituijdslvqiqbbzpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100324.0099776-778-49291787957704/AnsiballZ_stat.py'
Jan 22 16:45:24 compute-0 sudo[150187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:24 compute-0 python3.9[150189]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:45:24 compute-0 sudo[150187]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:24 compute-0 sudo[150310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrfgpfffgpxoeasgemwovwofqcucrvoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100324.0099776-778-49291787957704/AnsiballZ_copy.py'
Jan 22 16:45:24 compute-0 sudo[150310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:25 compute-0 python3.9[150312]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769100324.0099776-778-49291787957704/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:25 compute-0 sudo[150310]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:25 compute-0 sudo[150462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-faegnuqtszcobjextmcejwnpptfeiplr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100325.220357-778-227065199703209/AnsiballZ_stat.py'
Jan 22 16:45:25 compute-0 sudo[150462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:25 compute-0 python3.9[150464]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:45:25 compute-0 sudo[150462]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:26 compute-0 sudo[150585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bavqfpnxgyuphcwetbuklmfwqlcemrob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100325.220357-778-227065199703209/AnsiballZ_copy.py'
Jan 22 16:45:26 compute-0 sudo[150585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:26 compute-0 python3.9[150587]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769100325.220357-778-227065199703209/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:26 compute-0 sudo[150585]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:27 compute-0 sudo[150737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nythkabwbwyfuohbvafqigjcdezuoitq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100326.695289-778-221543737611007/AnsiballZ_stat.py'
Jan 22 16:45:27 compute-0 sudo[150737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:27 compute-0 python3.9[150739]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:45:27 compute-0 sudo[150737]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:27 compute-0 sudo[150860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnqnvmjeayslelfwpnfkcpnkvyerdwju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100326.695289-778-221543737611007/AnsiballZ_copy.py'
Jan 22 16:45:27 compute-0 sudo[150860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:27 compute-0 python3.9[150862]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769100326.695289-778-221543737611007/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:27 compute-0 sudo[150860]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:28 compute-0 sudo[151012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgenrefnzrtbomrqtfeypqohoxzknzfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100328.4419773-778-80266503551396/AnsiballZ_stat.py'
Jan 22 16:45:28 compute-0 sudo[151012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:28 compute-0 python3.9[151014]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:45:28 compute-0 sudo[151012]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:29 compute-0 sudo[151135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqnpimdtzadnoexjcbejxeeonoxlssxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100328.4419773-778-80266503551396/AnsiballZ_copy.py'
Jan 22 16:45:29 compute-0 sudo[151135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:30 compute-0 python3.9[151137]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769100328.4419773-778-80266503551396/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:30 compute-0 sudo[151135]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:30 compute-0 sudo[151287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkrunluuaofktncvxdwqllhbqjgnammh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100330.1632528-778-186083724483212/AnsiballZ_stat.py'
Jan 22 16:45:30 compute-0 sudo[151287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:30 compute-0 python3.9[151289]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:45:30 compute-0 sudo[151287]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:31 compute-0 sudo[151410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrmvulcydocrzwumdnhmgdchamkqjshe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100330.1632528-778-186083724483212/AnsiballZ_copy.py'
Jan 22 16:45:31 compute-0 sudo[151410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:31 compute-0 python3.9[151412]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769100330.1632528-778-186083724483212/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:31 compute-0 sudo[151410]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:31 compute-0 sudo[151562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phrvpkedobcmiyhrjkvautxfoktbeeev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100331.4745574-778-247319016735744/AnsiballZ_stat.py'
Jan 22 16:45:31 compute-0 sudo[151562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:31 compute-0 python3.9[151564]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:45:31 compute-0 sudo[151562]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:32 compute-0 sudo[151685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imhykshsdxvqfrohxokwuvjwqwlzhtgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100331.4745574-778-247319016735744/AnsiballZ_copy.py'
Jan 22 16:45:32 compute-0 sudo[151685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:32 compute-0 python3.9[151687]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769100331.4745574-778-247319016735744/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:32 compute-0 sudo[151685]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:33 compute-0 python3.9[151837]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:45:34 compute-0 sudo[151990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afhdmdcrmpcmagbydelsunscangfsnyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100333.4905505-984-74243584611909/AnsiballZ_seboolean.py'
Jan 22 16:45:34 compute-0 sudo[151990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:34 compute-0 python3.9[151992]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Jan 22 16:45:35 compute-0 sudo[151990]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:36 compute-0 sudo[152147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yytkbnlaafxkrhvmmgxcnpvutqaoumbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100336.0781357-992-34019464776600/AnsiballZ_copy.py'
Jan 22 16:45:36 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Jan 22 16:45:36 compute-0 sudo[152147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:36 compute-0 python3.9[152149]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:36 compute-0 sudo[152147]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:37 compute-0 sudo[152299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqrgprkxbdycqsppeqmeaqhicjdqjhgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100336.8192427-992-85588148053639/AnsiballZ_copy.py'
Jan 22 16:45:37 compute-0 sudo[152299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:37 compute-0 python3.9[152301]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:37 compute-0 sudo[152299]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:37 compute-0 sudo[152451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thflgppahqzurhqqhtimxssqvwpssmbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100337.578686-992-242371971588964/AnsiballZ_copy.py'
Jan 22 16:45:37 compute-0 sudo[152451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:38 compute-0 python3.9[152453]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:38 compute-0 sudo[152451]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:38 compute-0 sudo[152603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyfzeplflahllmhpwrwddvcjuqvlpfzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100338.2695913-992-263184705365690/AnsiballZ_copy.py'
Jan 22 16:45:38 compute-0 sudo[152603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:38 compute-0 python3.9[152605]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:38 compute-0 sudo[152603]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:39 compute-0 sudo[152755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrgjkpgjqikljvbnwfpboreivjadfxpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100338.9424846-992-62549980828448/AnsiballZ_copy.py'
Jan 22 16:45:39 compute-0 sudo[152755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:39 compute-0 python3.9[152757]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:39 compute-0 sudo[152755]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:39 compute-0 sudo[152907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cczwwevebcgscwgalneyxlcsoyoyblni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100339.6664765-1028-90264607591883/AnsiballZ_copy.py'
Jan 22 16:45:39 compute-0 sudo[152907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:40 compute-0 python3.9[152909]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:40 compute-0 sudo[152907]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:40 compute-0 sudo[153059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oencanptbmckprflklnjsjmqpnaeilqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100340.3579545-1028-12493610901589/AnsiballZ_copy.py'
Jan 22 16:45:40 compute-0 sudo[153059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:40 compute-0 python3.9[153061]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:40 compute-0 sudo[153059]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:41 compute-0 sudo[153211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzwrjzdlrwfggulvecdrmrhhxvifxqrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100340.9727733-1028-187076168272859/AnsiballZ_copy.py'
Jan 22 16:45:41 compute-0 sudo[153211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:41 compute-0 python3.9[153213]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:41 compute-0 sudo[153211]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:41 compute-0 sudo[153363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkznnzrabbfswvtrzlzuqfraygtkwfin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100341.5927854-1028-187259839095946/AnsiballZ_copy.py'
Jan 22 16:45:41 compute-0 sudo[153363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:45:41.898 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:45:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:45:41.900 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:45:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:45:41.900 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:45:42 compute-0 python3.9[153365]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:42 compute-0 sudo[153363]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:42 compute-0 sudo[153515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrfchhkkfzgrkyggwcbuqtrttlijoxdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100342.2113576-1028-236122565735337/AnsiballZ_copy.py'
Jan 22 16:45:42 compute-0 sudo[153515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:42 compute-0 python3.9[153517]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:42 compute-0 sudo[153515]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:43 compute-0 sudo[153667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsgwgwcwfmnbaatjpvtrdhtabymhuclz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100343.0073197-1064-204746903564166/AnsiballZ_systemd.py'
Jan 22 16:45:43 compute-0 sudo[153667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:43 compute-0 python3.9[153669]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 16:45:43 compute-0 systemd[1]: Reloading.
Jan 22 16:45:43 compute-0 systemd-rc-local-generator[153696]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:45:43 compute-0 systemd-sysv-generator[153700]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:45:44 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Jan 22 16:45:44 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Jan 22 16:45:44 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Jan 22 16:45:44 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Jan 22 16:45:44 compute-0 systemd[1]: Starting libvirt logging daemon...
Jan 22 16:45:44 compute-0 systemd[1]: Started libvirt logging daemon.
Jan 22 16:45:44 compute-0 sudo[153667]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:44 compute-0 sudo[153859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lredcwyzoszqnmgqvboqaivbqozbxbuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100344.3769388-1064-177072549612792/AnsiballZ_systemd.py'
Jan 22 16:45:44 compute-0 sudo[153859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:45 compute-0 python3.9[153861]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 16:45:45 compute-0 systemd[1]: Reloading.
Jan 22 16:45:45 compute-0 systemd-sysv-generator[153890]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:45:45 compute-0 systemd-rc-local-generator[153882]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:45:45 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Jan 22 16:45:45 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Jan 22 16:45:45 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Jan 22 16:45:45 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Jan 22 16:45:45 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Jan 22 16:45:45 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Jan 22 16:45:45 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Jan 22 16:45:45 compute-0 systemd[1]: Started libvirt nodedev daemon.
Jan 22 16:45:45 compute-0 podman[153903]: 2026-01-22 16:45:45.456565678 +0000 UTC m=+0.070444679 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 22 16:45:45 compute-0 sudo[153859]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:45 compute-0 podman[153942]: 2026-01-22 16:45:45.576478578 +0000 UTC m=+0.090754134 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller)
Jan 22 16:45:45 compute-0 sudo[154119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzqxxnpvpchpojbykuneiotqnfmlyola ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100345.5954697-1064-146192701825237/AnsiballZ_systemd.py'
Jan 22 16:45:45 compute-0 sudo[154119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:46 compute-0 python3.9[154121]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 16:45:46 compute-0 systemd[1]: Reloading.
Jan 22 16:45:46 compute-0 systemd-rc-local-generator[154146]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:45:46 compute-0 systemd-sysv-generator[154151]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:45:46 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Jan 22 16:45:46 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Jan 22 16:45:46 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Jan 22 16:45:46 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Jan 22 16:45:46 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Jan 22 16:45:46 compute-0 systemd[1]: Starting libvirt proxy daemon...
Jan 22 16:45:46 compute-0 systemd[1]: Started libvirt proxy daemon.
Jan 22 16:45:46 compute-0 sudo[154119]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:46 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Jan 22 16:45:46 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Jan 22 16:45:46 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Jan 22 16:45:47 compute-0 sudo[154338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snfzturhyzxkgltqjozxcwzcgjezhuvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100346.7778053-1064-69826460635677/AnsiballZ_systemd.py'
Jan 22 16:45:47 compute-0 sudo[154338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:47 compute-0 python3.9[154340]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 16:45:47 compute-0 systemd[1]: Reloading.
Jan 22 16:45:47 compute-0 systemd-rc-local-generator[154371]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:45:47 compute-0 systemd-sysv-generator[154375]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:45:47 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Jan 22 16:45:47 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Jan 22 16:45:47 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 22 16:45:47 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Jan 22 16:45:47 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Jan 22 16:45:47 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Jan 22 16:45:47 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Jan 22 16:45:47 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Jan 22 16:45:47 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Jan 22 16:45:47 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Jan 22 16:45:47 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Jan 22 16:45:47 compute-0 systemd[1]: Started libvirt QEMU daemon.
Jan 22 16:45:47 compute-0 sudo[154338]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:47 compute-0 setroubleshoot[154158]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 1b99b166-37dd-41b2-a0de-c3098acca135
Jan 22 16:45:47 compute-0 setroubleshoot[154158]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Jan 22 16:45:47 compute-0 setroubleshoot[154158]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 1b99b166-37dd-41b2-a0de-c3098acca135
Jan 22 16:45:47 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 16:45:47 compute-0 setroubleshoot[154158]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Jan 22 16:45:48 compute-0 sudo[154557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guwqhjukiuwkvtbywuvszhnyzytlfmrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100347.9745896-1064-62745822543147/AnsiballZ_systemd.py'
Jan 22 16:45:48 compute-0 sudo[154557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:48 compute-0 python3.9[154559]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 16:45:48 compute-0 systemd[1]: Reloading.
Jan 22 16:45:48 compute-0 systemd-rc-local-generator[154586]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:45:48 compute-0 systemd-sysv-generator[154591]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:45:49 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Jan 22 16:45:49 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Jan 22 16:45:49 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Jan 22 16:45:49 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Jan 22 16:45:49 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Jan 22 16:45:49 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Jan 22 16:45:49 compute-0 systemd[1]: Starting libvirt secret daemon...
Jan 22 16:45:49 compute-0 systemd[1]: Started libvirt secret daemon.
Jan 22 16:45:49 compute-0 sudo[154557]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:49 compute-0 sudo[154769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztifvjygqrfnhbeerasbkolxvnxqofxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100349.5124593-1101-96810213586308/AnsiballZ_file.py'
Jan 22 16:45:49 compute-0 sudo[154769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:50 compute-0 python3.9[154771]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:50 compute-0 sudo[154769]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:50 compute-0 sudo[154921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpjzpszibwffbtlsrovzlclivglrvpex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100350.394397-1109-90597455299844/AnsiballZ_find.py'
Jan 22 16:45:50 compute-0 sudo[154921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:51 compute-0 python3.9[154923]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 22 16:45:51 compute-0 sudo[154921]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:51 compute-0 sudo[155073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okxwepnuvkvhxemuzerncskytuzohcmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100351.4935367-1123-38691195039974/AnsiballZ_stat.py'
Jan 22 16:45:51 compute-0 sudo[155073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:52 compute-0 python3.9[155075]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:45:52 compute-0 sudo[155073]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:52 compute-0 sudo[155196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqtqtfmzolijruftidrwskeyilqwgmfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100351.4935367-1123-38691195039974/AnsiballZ_copy.py'
Jan 22 16:45:52 compute-0 sudo[155196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:52 compute-0 python3.9[155198]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769100351.4935367-1123-38691195039974/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:52 compute-0 sudo[155196]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:53 compute-0 sudo[155348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfvuhpridonoqyaizqptuucozqzbmeuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100352.9684153-1139-44317070345397/AnsiballZ_file.py'
Jan 22 16:45:53 compute-0 sudo[155348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:53 compute-0 python3.9[155350]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:53 compute-0 sudo[155348]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:54 compute-0 sudo[155500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhjgawevjvtmlygszucdjoczavtlqztl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100353.7945545-1147-98434216939448/AnsiballZ_stat.py'
Jan 22 16:45:54 compute-0 sudo[155500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:54 compute-0 python3.9[155502]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:45:54 compute-0 sudo[155500]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:54 compute-0 sudo[155578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmlwtwhhigmldibnsomzawtpuplgyemg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100353.7945545-1147-98434216939448/AnsiballZ_file.py'
Jan 22 16:45:54 compute-0 sudo[155578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:54 compute-0 python3.9[155580]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:54 compute-0 sudo[155578]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:55 compute-0 sudo[155730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxdvemkmaegkgodxpwckjhpcyshwjxvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100354.9308527-1159-139171166231771/AnsiballZ_stat.py'
Jan 22 16:45:55 compute-0 sudo[155730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:55 compute-0 python3.9[155732]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:45:55 compute-0 sudo[155730]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:55 compute-0 sudo[155808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqpwbazybqninorhfckbzywsknvjbbix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100354.9308527-1159-139171166231771/AnsiballZ_file.py'
Jan 22 16:45:55 compute-0 sudo[155808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:55 compute-0 python3.9[155810]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.34b4vego recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:55 compute-0 sudo[155808]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:56 compute-0 sudo[155960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sblzsnuxfgidlgndnngxsqapfnybzvbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100356.075479-1171-103763359634045/AnsiballZ_stat.py'
Jan 22 16:45:56 compute-0 sudo[155960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:56 compute-0 python3.9[155962]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:45:56 compute-0 sudo[155960]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:57 compute-0 sudo[156038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oycwbbfvliofnzyeyeylncauwsjzuaha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100356.075479-1171-103763359634045/AnsiballZ_file.py'
Jan 22 16:45:57 compute-0 sudo[156038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:57 compute-0 python3.9[156040]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:45:57 compute-0 sudo[156038]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:57 compute-0 sudo[156190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdkbkdopqdhmmhhlsrwfbzwinccsqwde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100357.539257-1184-180427478680592/AnsiballZ_command.py'
Jan 22 16:45:57 compute-0 sudo[156190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:57 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Jan 22 16:45:58 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Jan 22 16:45:58 compute-0 python3.9[156192]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:45:58 compute-0 sudo[156190]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:58 compute-0 sudo[156343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twaoaufvjbjpxrjslslcyzyqatpqwsof ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769100358.3509426-1192-259196792465314/AnsiballZ_edpm_nftables_from_files.py'
Jan 22 16:45:58 compute-0 sudo[156343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:59 compute-0 python3[156345]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 22 16:45:59 compute-0 sudo[156343]: pam_unix(sudo:session): session closed for user root
Jan 22 16:45:59 compute-0 sudo[156495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufqoyrudvxrhffoquuuorwlqmfkgbpll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100359.246747-1200-229269729580873/AnsiballZ_stat.py'
Jan 22 16:45:59 compute-0 sudo[156495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:45:59 compute-0 python3.9[156497]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:45:59 compute-0 sudo[156495]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:00 compute-0 sudo[156573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csqyssanhmdeavenopztozkoefrbjuiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100359.246747-1200-229269729580873/AnsiballZ_file.py'
Jan 22 16:46:00 compute-0 sudo[156573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:00 compute-0 python3.9[156575]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:46:00 compute-0 sudo[156573]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:00 compute-0 sudo[156725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpiodpzsghpmtjwxsrbjgskzxptzauti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100360.49525-1212-229572090136313/AnsiballZ_stat.py'
Jan 22 16:46:00 compute-0 sudo[156725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:01 compute-0 python3.9[156727]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:46:01 compute-0 sudo[156725]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:01 compute-0 sudo[156850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lunrkyuhmitnxsoywmuxzfcnyftbfwjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100360.49525-1212-229572090136313/AnsiballZ_copy.py'
Jan 22 16:46:01 compute-0 sudo[156850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:01 compute-0 python3.9[156852]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769100360.49525-1212-229572090136313/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:46:01 compute-0 sudo[156850]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:02 compute-0 sudo[157002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqermguxylcbtrsnlvcqxjnmxkbtbjzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100361.8984232-1227-106118645153814/AnsiballZ_stat.py'
Jan 22 16:46:02 compute-0 sudo[157002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:02 compute-0 python3.9[157004]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:46:02 compute-0 sudo[157002]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:02 compute-0 sudo[157080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrsxuljgfnymklgteddcqousutvpkchw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100361.8984232-1227-106118645153814/AnsiballZ_file.py'
Jan 22 16:46:02 compute-0 sudo[157080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:02 compute-0 python3.9[157082]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:46:02 compute-0 sudo[157080]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:03 compute-0 sudo[157232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyluwahnnqucflnwmtpnkuirozzqghma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100363.1677718-1239-68399577525073/AnsiballZ_stat.py'
Jan 22 16:46:03 compute-0 sudo[157232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:03 compute-0 python3.9[157234]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:46:03 compute-0 sudo[157232]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:04 compute-0 sudo[157310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ouovsjmcxdqqodfztpkyogmsqjjpzjge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100363.1677718-1239-68399577525073/AnsiballZ_file.py'
Jan 22 16:46:04 compute-0 sudo[157310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:04 compute-0 python3.9[157312]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:46:04 compute-0 sudo[157310]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:04 compute-0 sudo[157462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqacwnodrxcrdwgazrrwuyvluyrbumhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100364.4753435-1251-226651736704040/AnsiballZ_stat.py'
Jan 22 16:46:04 compute-0 sudo[157462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:05 compute-0 python3.9[157464]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:46:05 compute-0 sudo[157462]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:05 compute-0 sudo[157587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eauyopxjqzevraoyvrnvwgesqndktfxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100364.4753435-1251-226651736704040/AnsiballZ_copy.py'
Jan 22 16:46:05 compute-0 sudo[157587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:05 compute-0 python3.9[157589]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769100364.4753435-1251-226651736704040/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:46:05 compute-0 sudo[157587]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:06 compute-0 sudo[157739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vptgiiibukyeoxythgpzkitqrnvqnuqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100366.1396763-1266-281177406140191/AnsiballZ_file.py'
Jan 22 16:46:06 compute-0 sudo[157739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:06 compute-0 python3.9[157741]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:46:06 compute-0 sudo[157739]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:07 compute-0 sudo[157891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmsmquvconzzazozcjqhmtgjqipqljmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100366.9884324-1274-104779594447226/AnsiballZ_command.py'
Jan 22 16:46:07 compute-0 sudo[157891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:07 compute-0 python3.9[157893]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:46:07 compute-0 sudo[157891]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:08 compute-0 sudo[158046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khqwydcgfliqdueyzvxhhojpuqvbpniw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100367.7904217-1282-241822129759332/AnsiballZ_blockinfile.py'
Jan 22 16:46:08 compute-0 sudo[158046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:08 compute-0 python3.9[158048]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:46:08 compute-0 sudo[158046]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:09 compute-0 sudo[158198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhhftznvicvpqobgotehnzzecvbvauid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100368.8482606-1291-234318062170261/AnsiballZ_command.py'
Jan 22 16:46:09 compute-0 sudo[158198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:09 compute-0 python3.9[158200]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:46:09 compute-0 sudo[158198]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:10 compute-0 sudo[158351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqkwwdragyfyvbpegprbrqqfhaamtagy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100369.717011-1299-151801695552427/AnsiballZ_stat.py'
Jan 22 16:46:10 compute-0 sudo[158351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:10 compute-0 python3.9[158353]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:46:10 compute-0 sudo[158351]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:10 compute-0 sudo[158505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bufhbozabvskumdlcxjlamufekyfkmrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100370.5475743-1307-215680935642741/AnsiballZ_command.py'
Jan 22 16:46:10 compute-0 sudo[158505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:11 compute-0 python3.9[158507]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:46:11 compute-0 sudo[158505]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:11 compute-0 sudo[158660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-liuxqsxmtkookqfsupzmxzvxqvnfofgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100371.3913805-1315-162960483376929/AnsiballZ_file.py'
Jan 22 16:46:11 compute-0 sudo[158660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:11 compute-0 python3.9[158662]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:46:11 compute-0 sudo[158660]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:12 compute-0 sudo[158812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbutuvugtcclsxdffysulozzvvlkaywl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100372.1590765-1323-174137993041975/AnsiballZ_stat.py'
Jan 22 16:46:12 compute-0 sudo[158812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:12 compute-0 python3.9[158814]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:46:12 compute-0 sudo[158812]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:13 compute-0 sudo[158935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aferudgyfkiuijnbtzvqwwbcdpxlnzgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100372.1590765-1323-174137993041975/AnsiballZ_copy.py'
Jan 22 16:46:13 compute-0 sudo[158935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:13 compute-0 python3.9[158937]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769100372.1590765-1323-174137993041975/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:46:13 compute-0 sudo[158935]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:13 compute-0 sudo[159087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhbwewzfktrtcgswtdnrkibzlqdhpmkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100373.5255132-1338-235382173787268/AnsiballZ_stat.py'
Jan 22 16:46:13 compute-0 sudo[159087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:14 compute-0 python3.9[159089]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:46:14 compute-0 sudo[159087]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:14 compute-0 sudo[159210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlqlsymbrzhwvnfzbfzjbxvmahcqhssz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100373.5255132-1338-235382173787268/AnsiballZ_copy.py'
Jan 22 16:46:14 compute-0 sudo[159210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:14 compute-0 python3.9[159212]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769100373.5255132-1338-235382173787268/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:46:14 compute-0 sudo[159210]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:15 compute-0 sudo[159362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqaqfaklfppcksmjvusiymkieuvilhmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100374.8589838-1353-183936521067781/AnsiballZ_stat.py'
Jan 22 16:46:15 compute-0 sudo[159362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:15 compute-0 python3.9[159364]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:46:15 compute-0 sudo[159362]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:15 compute-0 podman[159460]: 2026-01-22 16:46:15.703672144 +0000 UTC m=+0.061288928 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 22 16:46:15 compute-0 sudo[159517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzjswdsrtvskpkpxprmcxwevzevibjpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100374.8589838-1353-183936521067781/AnsiballZ_copy.py'
Jan 22 16:46:15 compute-0 sudo[159517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:15 compute-0 podman[159459]: 2026-01-22 16:46:15.737745466 +0000 UTC m=+0.095370800 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, 
io.buildah.version=1.41.3, container_name=ovn_controller)
Jan 22 16:46:15 compute-0 python3.9[159530]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769100374.8589838-1353-183936521067781/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:46:15 compute-0 sudo[159517]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:16 compute-0 sudo[159683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgubafasnzmekyqzahbddvfzqhpmpdve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100376.1356819-1368-145966795286837/AnsiballZ_systemd.py'
Jan 22 16:46:16 compute-0 sudo[159683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:16 compute-0 python3.9[159685]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:46:16 compute-0 systemd[1]: Reloading.
Jan 22 16:46:16 compute-0 systemd-rc-local-generator[159713]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:46:16 compute-0 systemd-sysv-generator[159716]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:46:17 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Jan 22 16:46:17 compute-0 sudo[159683]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:17 compute-0 sudo[159874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sivgvdpojvhrlmozopiespqtaqieifth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100377.4372804-1376-88130118512644/AnsiballZ_systemd.py'
Jan 22 16:46:17 compute-0 sudo[159874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:18 compute-0 python3.9[159876]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 22 16:46:18 compute-0 systemd[1]: Reloading.
Jan 22 16:46:18 compute-0 systemd-sysv-generator[159906]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:46:18 compute-0 systemd-rc-local-generator[159901]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:46:18 compute-0 systemd[1]: Reloading.
Jan 22 16:46:18 compute-0 systemd-rc-local-generator[159940]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:46:18 compute-0 systemd-sysv-generator[159945]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:46:18 compute-0 sudo[159874]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:19 compute-0 sshd-session[105199]: Connection closed by 192.168.122.30 port 45238
Jan 22 16:46:19 compute-0 sshd-session[105196]: pam_unix(sshd:session): session closed for user zuul
Jan 22 16:46:19 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Jan 22 16:46:19 compute-0 systemd[1]: session-23.scope: Consumed 3min 24.498s CPU time.
Jan 22 16:46:19 compute-0 systemd-logind[796]: Session 23 logged out. Waiting for processes to exit.
Jan 22 16:46:19 compute-0 systemd-logind[796]: Removed session 23.
Jan 22 16:46:23 compute-0 sshd-session[159973]: Accepted publickey for zuul from 192.168.122.30 port 59076 ssh2: ECDSA SHA256:XN6iZwzcCiCbU0l3vCSWxZPd4ElPyx+ZEvhzj+S5SUw
Jan 22 16:46:23 compute-0 systemd-logind[796]: New session 24 of user zuul.
Jan 22 16:46:24 compute-0 systemd[1]: Started Session 24 of User zuul.
Jan 22 16:46:24 compute-0 sshd-session[159973]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 16:46:25 compute-0 python3.9[160126]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:46:26 compute-0 python3.9[160280]: ansible-ansible.builtin.service_facts Invoked
Jan 22 16:46:26 compute-0 network[160297]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 22 16:46:26 compute-0 network[160298]: 'network-scripts' will be removed from distribution in near future.
Jan 22 16:46:26 compute-0 network[160299]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 22 16:46:30 compute-0 sudo[160568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pedtameujkabcicvqqttbtnbmdyjjlvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100390.1256151-42-232454670907169/AnsiballZ_setup.py'
Jan 22 16:46:30 compute-0 sudo[160568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:30 compute-0 python3.9[160570]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 16:46:31 compute-0 sudo[160568]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:31 compute-0 sudo[160652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azaexyjynmsgshfwozwwbnkbnnbkeqny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100390.1256151-42-232454670907169/AnsiballZ_dnf.py'
Jan 22 16:46:31 compute-0 sudo[160652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:31 compute-0 python3.9[160654]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 16:46:36 compute-0 sudo[160652]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:37 compute-0 sudo[160805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjoiohtawkprfnllifqxyzvdlfyzxlza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100396.9485621-54-62529900433502/AnsiballZ_stat.py'
Jan 22 16:46:37 compute-0 sudo[160805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:37 compute-0 python3.9[160807]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:46:37 compute-0 sudo[160805]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:38 compute-0 sudo[160957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmfhhfkogmykauwchxknxrqosatuqmri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100397.833617-64-19897612193941/AnsiballZ_command.py'
Jan 22 16:46:38 compute-0 sudo[160957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:38 compute-0 python3.9[160959]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:46:38 compute-0 sudo[160957]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:39 compute-0 sudo[161110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bshoqqiihqxpxzsfzjbuexbzostradjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100398.815361-74-226147234709690/AnsiballZ_stat.py'
Jan 22 16:46:39 compute-0 sudo[161110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:39 compute-0 python3.9[161112]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:46:39 compute-0 sudo[161110]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:39 compute-0 sudo[161262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfgmvaohfekjokwviksydirfplxvhpyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100399.5628998-82-122794935753279/AnsiballZ_command.py'
Jan 22 16:46:39 compute-0 sudo[161262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:40 compute-0 python3.9[161264]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:46:40 compute-0 sudo[161262]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:40 compute-0 sudo[161415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcpeqmignwyemmvwpbnjfwcfvxuupcto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100400.2568257-90-219757192845553/AnsiballZ_stat.py'
Jan 22 16:46:40 compute-0 sudo[161415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:40 compute-0 python3.9[161417]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:46:40 compute-0 sudo[161415]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:41 compute-0 sudo[161538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lthlijumeudzzgeinyompwxdgjfuyfas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100400.2568257-90-219757192845553/AnsiballZ_copy.py'
Jan 22 16:46:41 compute-0 sudo[161538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:41 compute-0 python3.9[161540]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769100400.2568257-90-219757192845553/.source.iscsi _original_basename=.svejve7r follow=False checksum=cfebd7a30dc8c039471f187540cea65e964c0331 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:46:41 compute-0 sudo[161538]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:46:41.899 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:46:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:46:41.900 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:46:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:46:41.900 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:46:42 compute-0 sudo[161690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqndbzbvvjwuwsaiwqceonqafbermezz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100401.9947314-105-135494700918107/AnsiballZ_file.py'
Jan 22 16:46:42 compute-0 sudo[161690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:42 compute-0 python3.9[161692]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:46:42 compute-0 sudo[161690]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:43 compute-0 sudo[161842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfifwohrrnmhfkxzyrwmjfbfayuoskah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100402.748504-113-252348030319910/AnsiballZ_lineinfile.py'
Jan 22 16:46:43 compute-0 sudo[161842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:43 compute-0 python3.9[161844]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:46:43 compute-0 sudo[161842]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:44 compute-0 sudo[161994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrlxymeiqzpjqwlyfsplaanqhixxrnjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100403.5692122-122-255308583246654/AnsiballZ_systemd_service.py'
Jan 22 16:46:44 compute-0 sudo[161994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:44 compute-0 python3.9[161996]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:46:44 compute-0 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Jan 22 16:46:44 compute-0 sudo[161994]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:45 compute-0 sudo[162150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpuvlisdnlhnribwbhzzwahqjulirnvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100404.7863455-130-226648425840878/AnsiballZ_systemd_service.py'
Jan 22 16:46:45 compute-0 sudo[162150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:45 compute-0 python3.9[162152]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:46:45 compute-0 systemd[1]: Reloading.
Jan 22 16:46:45 compute-0 systemd-rc-local-generator[162181]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:46:45 compute-0 systemd-sysv-generator[162184]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:46:45 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 22 16:46:45 compute-0 systemd[1]: Starting Open-iSCSI...
Jan 22 16:46:45 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Jan 22 16:46:45 compute-0 systemd[1]: Started Open-iSCSI.
Jan 22 16:46:45 compute-0 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Jan 22 16:46:45 compute-0 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Jan 22 16:46:45 compute-0 sudo[162150]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:45 compute-0 podman[162196]: 2026-01-22 16:46:45.786359486 +0000 UTC m=+0.054122210 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 16:46:45 compute-0 podman[162246]: 2026-01-22 16:46:45.909989761 +0000 UTC m=+0.093613095 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 16:46:46 compute-0 python3.9[162397]: ansible-ansible.builtin.service_facts Invoked
Jan 22 16:46:46 compute-0 network[162414]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 22 16:46:46 compute-0 network[162415]: 'network-scripts' will be removed from distribution in near future.
Jan 22 16:46:46 compute-0 network[162416]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 22 16:46:50 compute-0 sudo[162685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egjzlxgsqbrnyuojkzauwrzpydzkyhzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100410.3035343-153-49405937298962/AnsiballZ_dnf.py'
Jan 22 16:46:50 compute-0 sudo[162685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:50 compute-0 python3.9[162687]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 16:46:53 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 16:46:53 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 22 16:46:53 compute-0 systemd[1]: Reloading.
Jan 22 16:46:53 compute-0 systemd-rc-local-generator[162731]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:46:53 compute-0 systemd-sysv-generator[162735]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:46:53 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 16:46:54 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 16:46:54 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 22 16:46:54 compute-0 systemd[1]: run-rf751cfd698dd434da04d1c4347d78b96.service: Deactivated successfully.
Jan 22 16:46:54 compute-0 sudo[162685]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:54 compute-0 sudo[163001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvjiksclvnfpvqnkbafsvzjpnxvvqnmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100414.4002707-162-51186399082877/AnsiballZ_file.py'
Jan 22 16:46:54 compute-0 sudo[163001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:54 compute-0 python3.9[163003]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 22 16:46:54 compute-0 sudo[163001]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:55 compute-0 sudo[163153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgwigvumkaacjctzkbwevczrrxtirurw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100415.0571847-170-203942501730988/AnsiballZ_modprobe.py'
Jan 22 16:46:55 compute-0 sudo[163153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:55 compute-0 python3.9[163155]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Jan 22 16:46:55 compute-0 sudo[163153]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:56 compute-0 sudo[163309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuhdnusdtsrdjjappzxntdqguhchoulu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100415.8689058-178-58590385064454/AnsiballZ_stat.py'
Jan 22 16:46:56 compute-0 sudo[163309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:56 compute-0 python3.9[163311]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:46:56 compute-0 sudo[163309]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:56 compute-0 sudo[163432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-biwdgfxjdsesvsjokkpsvwujfzpaeoit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100415.8689058-178-58590385064454/AnsiballZ_copy.py'
Jan 22 16:46:56 compute-0 sudo[163432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:57 compute-0 python3.9[163434]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769100415.8689058-178-58590385064454/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:46:57 compute-0 sudo[163432]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:57 compute-0 sudo[163584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hygynjgyvhwknoyrslskrnlcvyilizbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100417.3021429-194-188048343177714/AnsiballZ_lineinfile.py'
Jan 22 16:46:57 compute-0 sudo[163584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:57 compute-0 python3.9[163586]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:46:57 compute-0 sudo[163584]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:58 compute-0 sudo[163736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtdefxxidqfzkaxwyrcytiovsbiyyvni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100418.018456-202-142082167872733/AnsiballZ_systemd.py'
Jan 22 16:46:58 compute-0 sudo[163736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:59 compute-0 python3.9[163738]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 16:46:59 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 22 16:46:59 compute-0 systemd[1]: Stopped Load Kernel Modules.
Jan 22 16:46:59 compute-0 systemd[1]: Stopping Load Kernel Modules...
Jan 22 16:46:59 compute-0 systemd[1]: Starting Load Kernel Modules...
Jan 22 16:46:59 compute-0 systemd[1]: Finished Load Kernel Modules.
Jan 22 16:46:59 compute-0 sudo[163736]: pam_unix(sudo:session): session closed for user root
Jan 22 16:46:59 compute-0 sudo[163892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eiigvxshzkxxbulaxphjzfsytbuvywiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100419.4080894-210-211613945452213/AnsiballZ_command.py'
Jan 22 16:46:59 compute-0 sudo[163892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:46:59 compute-0 python3.9[163894]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:46:59 compute-0 sudo[163892]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:00 compute-0 sudo[164045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmglofoenrxxrcfgkktwjgidsilrxydx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100420.242041-220-65164580965825/AnsiballZ_stat.py'
Jan 22 16:47:00 compute-0 sudo[164045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:00 compute-0 python3.9[164047]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:47:00 compute-0 sudo[164045]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:01 compute-0 sudo[164197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scmdxogcnieequtxxsbcqdjjbuwnqpon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100420.9738772-229-225069965512625/AnsiballZ_stat.py'
Jan 22 16:47:01 compute-0 sudo[164197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:01 compute-0 python3.9[164199]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:47:01 compute-0 sudo[164197]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:01 compute-0 sudo[164320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewmdnxldndogihjbdqojnkywldllnzyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100420.9738772-229-225069965512625/AnsiballZ_copy.py'
Jan 22 16:47:01 compute-0 sudo[164320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:02 compute-0 python3.9[164322]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769100420.9738772-229-225069965512625/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:47:02 compute-0 sudo[164320]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:02 compute-0 sudo[164472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jihmrhagltnymbllvfyjbyjmldmpwzfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100422.4160411-244-136943317201048/AnsiballZ_command.py'
Jan 22 16:47:02 compute-0 sudo[164472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:02 compute-0 python3.9[164474]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:47:03 compute-0 sudo[164472]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:03 compute-0 sudo[164625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmyllwulxthuvrugfhtzrrevxnsqgvwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100423.194645-252-45968188120940/AnsiballZ_lineinfile.py'
Jan 22 16:47:03 compute-0 sudo[164625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:03 compute-0 python3.9[164627]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:47:03 compute-0 sudo[164625]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:04 compute-0 sudo[164777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hujugcxlqhljnlefvdphnjmvafejlgih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100423.9476132-260-208334366715920/AnsiballZ_replace.py'
Jan 22 16:47:04 compute-0 sudo[164777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:04 compute-0 python3.9[164779]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:47:04 compute-0 sudo[164777]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:05 compute-0 sudo[164929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umivgyubbbjhortmpzwcqemqnhbpapph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100424.8233087-268-54713101920947/AnsiballZ_replace.py'
Jan 22 16:47:05 compute-0 sudo[164929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:05 compute-0 python3.9[164931]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:47:05 compute-0 sudo[164929]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:06 compute-0 sudo[165081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqdvotzsvgnupretimbgapcslcrjqeww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100425.5931292-277-229669019881715/AnsiballZ_lineinfile.py'
Jan 22 16:47:06 compute-0 sudo[165081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:06 compute-0 python3.9[165083]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:47:06 compute-0 sudo[165081]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:06 compute-0 sudo[165233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugopkdjavqgdokvypzkobjmkkbslyzod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100426.4922333-277-168300034043040/AnsiballZ_lineinfile.py'
Jan 22 16:47:06 compute-0 sudo[165233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:07 compute-0 python3.9[165235]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:47:07 compute-0 sudo[165233]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:07 compute-0 sudo[165385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zygwfthmsmzrajlujgnxcvmevmusxzjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100427.1926131-277-113333694470430/AnsiballZ_lineinfile.py'
Jan 22 16:47:07 compute-0 sudo[165385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:07 compute-0 python3.9[165387]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:47:07 compute-0 sudo[165385]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:08 compute-0 sudo[165537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pantihuoeejamseijhoxtkhsqpvegpvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100427.909047-277-238235420677996/AnsiballZ_lineinfile.py'
Jan 22 16:47:08 compute-0 sudo[165537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:08 compute-0 python3.9[165539]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:47:08 compute-0 sudo[165537]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:09 compute-0 sudo[165689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrauhyguyhkoinbuulnpweguhlrlpjhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100428.6960857-306-197034485451165/AnsiballZ_stat.py'
Jan 22 16:47:09 compute-0 sudo[165689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:09 compute-0 python3.9[165691]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:47:09 compute-0 sudo[165689]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:09 compute-0 sudo[165843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncgjmqxtlzxxbrdiinnetzxqinsvoxsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100429.453619-314-127582614719374/AnsiballZ_command.py'
Jan 22 16:47:09 compute-0 sudo[165843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:09 compute-0 python3.9[165845]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:47:10 compute-0 sudo[165843]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:10 compute-0 sudo[165996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgjghydbxkgxgfvltqynkrgefdvknlqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100430.2999432-323-76791491075884/AnsiballZ_systemd_service.py'
Jan 22 16:47:10 compute-0 sudo[165996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:11 compute-0 python3.9[165998]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:47:11 compute-0 systemd[1]: Listening on multipathd control socket.
Jan 22 16:47:11 compute-0 sudo[165996]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:11 compute-0 sudo[166152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elzrdgmhhruaheinkblhjkrwcnnzldlw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100431.4623551-331-164709572931391/AnsiballZ_systemd_service.py'
Jan 22 16:47:11 compute-0 sudo[166152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:12 compute-0 python3.9[166154]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:47:12 compute-0 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Jan 22 16:47:12 compute-0 udevadm[166159]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Jan 22 16:47:12 compute-0 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Jan 22 16:47:12 compute-0 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 22 16:47:12 compute-0 multipathd[166163]: --------start up--------
Jan 22 16:47:12 compute-0 multipathd[166163]: read /etc/multipath.conf
Jan 22 16:47:12 compute-0 multipathd[166163]: path checkers start up
Jan 22 16:47:12 compute-0 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 22 16:47:12 compute-0 sudo[166152]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:12 compute-0 sudo[166320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzmteiaotgfpkqaxmdgnhvgytkqlhiyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100432.6473072-343-65284539004666/AnsiballZ_file.py'
Jan 22 16:47:12 compute-0 sudo[166320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:13 compute-0 python3.9[166322]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 22 16:47:13 compute-0 sudo[166320]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:13 compute-0 sudo[166472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqzblinnropxqbmbkdhdkypxuhflrbvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100433.4268813-351-233464166863513/AnsiballZ_modprobe.py'
Jan 22 16:47:13 compute-0 sudo[166472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:13 compute-0 python3.9[166474]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Jan 22 16:47:13 compute-0 kernel: Key type psk registered
Jan 22 16:47:13 compute-0 sudo[166472]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:14 compute-0 sudo[166636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xryphvoveoeffevjkwwlfampmtwnctnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100434.1966486-359-144040204656157/AnsiballZ_stat.py'
Jan 22 16:47:14 compute-0 sudo[166636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:14 compute-0 python3.9[166638]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:47:14 compute-0 sudo[166636]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:15 compute-0 sudo[166759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qejevwbzfdzgzxbyruiorevachmbvdfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100434.1966486-359-144040204656157/AnsiballZ_copy.py'
Jan 22 16:47:15 compute-0 sudo[166759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:15 compute-0 python3.9[166761]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769100434.1966486-359-144040204656157/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:47:15 compute-0 sudo[166759]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:16 compute-0 podman[166886]: 2026-01-22 16:47:16.151561445 +0000 UTC m=+0.070946203 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 16:47:16 compute-0 sudo[166948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpzoyuazdfrjisjayofhguwtwvanmyks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100435.689463-375-8212203896854/AnsiballZ_lineinfile.py'
Jan 22 16:47:16 compute-0 sudo[166948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:16 compute-0 podman[166885]: 2026-01-22 16:47:16.189865159 +0000 UTC m=+0.111282289 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 16:47:16 compute-0 python3.9[166957]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:47:16 compute-0 sudo[166948]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:16 compute-0 sudo[167112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhjxlmyeuvmaezductarjyvaewxbamnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100436.5641937-383-108360312496833/AnsiballZ_systemd.py'
Jan 22 16:47:16 compute-0 sudo[167112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:17 compute-0 python3.9[167114]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 16:47:17 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 22 16:47:17 compute-0 systemd[1]: Stopped Load Kernel Modules.
Jan 22 16:47:17 compute-0 systemd[1]: Stopping Load Kernel Modules...
Jan 22 16:47:17 compute-0 systemd[1]: Starting Load Kernel Modules...
Jan 22 16:47:17 compute-0 systemd[1]: Finished Load Kernel Modules.
Jan 22 16:47:17 compute-0 sudo[167112]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:17 compute-0 sudo[167268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfylejymxxnosdjagmidedwczpwdlpap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100437.4940906-391-239538644424672/AnsiballZ_dnf.py'
Jan 22 16:47:17 compute-0 sudo[167268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:18 compute-0 python3.9[167270]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 16:47:20 compute-0 systemd[1]: Reloading.
Jan 22 16:47:20 compute-0 systemd-sysv-generator[167301]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:47:20 compute-0 systemd-rc-local-generator[167294]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:47:20 compute-0 systemd[1]: Reloading.
Jan 22 16:47:20 compute-0 systemd-rc-local-generator[167340]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:47:20 compute-0 systemd-sysv-generator[167343]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:47:20 compute-0 systemd-logind[796]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 22 16:47:20 compute-0 systemd-logind[796]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 22 16:47:21 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 16:47:21 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 22 16:47:21 compute-0 systemd[1]: Reloading.
Jan 22 16:47:21 compute-0 systemd-rc-local-generator[167431]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:47:21 compute-0 systemd-sysv-generator[167435]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:47:21 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 16:47:21 compute-0 sudo[167268]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:22 compute-0 sudo[168667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnoszncxfvicukaicvctmzttdteystam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100442.2165256-399-142297615218637/AnsiballZ_systemd_service.py'
Jan 22 16:47:22 compute-0 sudo[168667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:22 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 16:47:22 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 22 16:47:22 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.761s CPU time.
Jan 22 16:47:22 compute-0 systemd[1]: run-rf1585f596e5d4dcbb2cb3aee20463e25.service: Deactivated successfully.
Jan 22 16:47:22 compute-0 python3.9[168686]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 16:47:22 compute-0 systemd[1]: Stopping Open-iSCSI...
Jan 22 16:47:22 compute-0 iscsid[162192]: iscsid shutting down.
Jan 22 16:47:22 compute-0 systemd[1]: iscsid.service: Deactivated successfully.
Jan 22 16:47:22 compute-0 systemd[1]: Stopped Open-iSCSI.
Jan 22 16:47:22 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 22 16:47:22 compute-0 systemd[1]: Starting Open-iSCSI...
Jan 22 16:47:22 compute-0 systemd[1]: Started Open-iSCSI.
Jan 22 16:47:22 compute-0 sudo[168667]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:23 compute-0 sudo[168888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iefbwsneqpttxhrnetiokeulvyzmrewy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100443.1384487-407-257424793994896/AnsiballZ_systemd_service.py'
Jan 22 16:47:23 compute-0 sudo[168888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:23 compute-0 python3.9[168890]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 16:47:23 compute-0 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Jan 22 16:47:23 compute-0 multipathd[166163]: exit (signal)
Jan 22 16:47:23 compute-0 multipathd[166163]: --------shut down-------
Jan 22 16:47:23 compute-0 systemd[1]: multipathd.service: Deactivated successfully.
Jan 22 16:47:23 compute-0 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Jan 22 16:47:23 compute-0 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 22 16:47:23 compute-0 multipathd[168896]: --------start up--------
Jan 22 16:47:23 compute-0 multipathd[168896]: read /etc/multipath.conf
Jan 22 16:47:23 compute-0 multipathd[168896]: path checkers start up
Jan 22 16:47:23 compute-0 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 22 16:47:23 compute-0 sudo[168888]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:24 compute-0 python3.9[169054]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:47:25 compute-0 sudo[169208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxroaqbqlpqcxfkcfqfnlgqespshkcju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100445.1397014-425-188146799685028/AnsiballZ_file.py'
Jan 22 16:47:25 compute-0 sudo[169208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:25 compute-0 python3.9[169210]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:47:25 compute-0 sudo[169208]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:26 compute-0 sudo[169360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvxfwnsgzvzopukieqjlbdkdddykclvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100446.0567508-436-110999576216646/AnsiballZ_systemd_service.py'
Jan 22 16:47:26 compute-0 sudo[169360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:26 compute-0 python3.9[169362]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 16:47:26 compute-0 systemd[1]: Reloading.
Jan 22 16:47:26 compute-0 systemd-sysv-generator[169392]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:47:26 compute-0 systemd-rc-local-generator[169387]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:47:26 compute-0 sudo[169360]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:27 compute-0 python3.9[169546]: ansible-ansible.builtin.service_facts Invoked
Jan 22 16:47:27 compute-0 network[169563]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 22 16:47:27 compute-0 network[169564]: 'network-scripts' will be removed from distribution in near future.
Jan 22 16:47:27 compute-0 network[169565]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 22 16:47:33 compute-0 sudo[169835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwcbgdknrkkjfdbazncvviodfbjrwfrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100452.8089948-455-259599189944400/AnsiballZ_systemd_service.py'
Jan 22 16:47:33 compute-0 sudo[169835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:33 compute-0 python3.9[169837]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:47:33 compute-0 sudo[169835]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:34 compute-0 sudo[169988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuesqqnpwvgtvvatmgzznpvnnnojbxow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100453.7278175-455-82156738832367/AnsiballZ_systemd_service.py'
Jan 22 16:47:34 compute-0 sudo[169988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:34 compute-0 python3.9[169990]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:47:34 compute-0 sudo[169988]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:34 compute-0 sudo[170141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvlmakpbwblckinbxqxdsxgacfbjczdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100454.5242965-455-3195617115130/AnsiballZ_systemd_service.py'
Jan 22 16:47:34 compute-0 sudo[170141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:35 compute-0 python3.9[170143]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:47:35 compute-0 sudo[170141]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:35 compute-0 sudo[170294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdiuhkhfjbtpdgguznwxplccvdlbuypz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100455.4514265-455-209327727468418/AnsiballZ_systemd_service.py'
Jan 22 16:47:35 compute-0 sudo[170294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:36 compute-0 python3.9[170296]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:47:36 compute-0 sudo[170294]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:36 compute-0 sudo[170447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqlnylbcokbvycziodebdinajejqprmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100456.3737342-455-276468909540485/AnsiballZ_systemd_service.py'
Jan 22 16:47:36 compute-0 sudo[170447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:37 compute-0 python3.9[170449]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:47:37 compute-0 sudo[170447]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:37 compute-0 sudo[170600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yliafptzvrxwtvwfcmwdvqklhzosjmkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100457.2602823-455-270463508994413/AnsiballZ_systemd_service.py'
Jan 22 16:47:37 compute-0 sudo[170600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:37 compute-0 python3.9[170602]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:47:37 compute-0 sudo[170600]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:38 compute-0 sudo[170753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-relgytzthbbyezpakrvgbnlfeuwctbqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100458.0626557-455-20487330302255/AnsiballZ_systemd_service.py'
Jan 22 16:47:38 compute-0 sudo[170753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:38 compute-0 python3.9[170755]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:47:38 compute-0 sudo[170753]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:39 compute-0 sudo[170906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wetitrbwojaodozfwzqokxddzbxvsncb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100458.8920076-455-101530008500361/AnsiballZ_systemd_service.py'
Jan 22 16:47:39 compute-0 sudo[170906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:39 compute-0 python3.9[170908]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:47:39 compute-0 sudo[170906]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:40 compute-0 sudo[171059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szeecgrhydrpfolinxwqumfouatvsmfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100459.8529663-514-4749983493591/AnsiballZ_file.py'
Jan 22 16:47:40 compute-0 sudo[171059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:40 compute-0 python3.9[171061]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:47:40 compute-0 sudo[171059]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:40 compute-0 sudo[171211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iybmiuorrglyaievdupztmngcvvearhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100460.4690335-514-58981502612882/AnsiballZ_file.py'
Jan 22 16:47:40 compute-0 sudo[171211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:41 compute-0 python3.9[171213]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:47:41 compute-0 sudo[171211]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:41 compute-0 sudo[171363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijjkpgrqjbnqkmbyxzvluvfztcwihxhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100461.2795076-514-225405917514590/AnsiballZ_file.py'
Jan 22 16:47:41 compute-0 sudo[171363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:41 compute-0 python3.9[171365]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:47:41 compute-0 sudo[171363]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:47:41.900 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:47:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:47:41.902 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:47:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:47:41.902 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:47:42 compute-0 sudo[171515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pshjqfzxunsjbtppvpshcuvjteekbahn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100462.002325-514-259666892946119/AnsiballZ_file.py'
Jan 22 16:47:42 compute-0 sudo[171515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:42 compute-0 python3.9[171517]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:47:42 compute-0 sudo[171515]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:42 compute-0 sudo[171667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjwkxwllaucdkxfxtqwtzabrgonwujpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100462.6516063-514-71082641128575/AnsiballZ_file.py'
Jan 22 16:47:42 compute-0 sudo[171667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:43 compute-0 python3.9[171669]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:47:43 compute-0 sudo[171667]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:43 compute-0 sudo[171819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knyvhcmitgxnhwlshovvjqfsimvbsiob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100463.3363183-514-231118152878409/AnsiballZ_file.py'
Jan 22 16:47:43 compute-0 sudo[171819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:43 compute-0 python3.9[171821]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:47:43 compute-0 sudo[171819]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:44 compute-0 sudo[171971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-numbjzbprlyncboaowwzcwxoxqqubexr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100464.0407438-514-233576244843065/AnsiballZ_file.py'
Jan 22 16:47:44 compute-0 sudo[171971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:44 compute-0 python3.9[171973]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:47:44 compute-0 sudo[171971]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:44 compute-0 sudo[172123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwralszzprdtnpakkokwiaaqjnljyemi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100464.658965-514-265698780851666/AnsiballZ_file.py'
Jan 22 16:47:44 compute-0 sudo[172123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:45 compute-0 python3.9[172125]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:47:45 compute-0 sudo[172123]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:45 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Jan 22 16:47:45 compute-0 sudo[172276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsmjmiwqksnorzdipyifskawbpyhjsii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100465.2736077-571-109861810136369/AnsiballZ_file.py'
Jan 22 16:47:45 compute-0 sudo[172276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:45 compute-0 python3.9[172278]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:47:45 compute-0 sudo[172276]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:46 compute-0 podman[172403]: 2026-01-22 16:47:46.301460285 +0000 UTC m=+0.059129139 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 16:47:46 compute-0 sudo[172462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipksadfpvvhkrqgyadcnxzqvxfikappp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100466.002142-571-86160893186013/AnsiballZ_file.py'
Jan 22 16:47:46 compute-0 sudo[172462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:46 compute-0 podman[172402]: 2026-01-22 16:47:46.33048026 +0000 UTC m=+0.087181930 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 16:47:46 compute-0 python3.9[172470]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:47:46 compute-0 sudo[172462]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:46 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 22 16:47:46 compute-0 sudo[172624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njiqbhumtjnwccbjrhlpgemevdfcquvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100466.6577227-571-137150722354886/AnsiballZ_file.py'
Jan 22 16:47:46 compute-0 sudo[172624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:47 compute-0 python3.9[172626]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:47:47 compute-0 sudo[172624]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:47 compute-0 sudo[172776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhplxqedeaufzpqsrykpwndxjkftxzkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100467.3442934-571-255412684450480/AnsiballZ_file.py'
Jan 22 16:47:47 compute-0 sudo[172776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:47 compute-0 python3.9[172778]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:47:47 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Jan 22 16:47:47 compute-0 sudo[172776]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:48 compute-0 sudo[172929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvuxcznvzqretoibnqyuugugflmgretu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100468.016983-571-205265733222706/AnsiballZ_file.py'
Jan 22 16:47:48 compute-0 sudo[172929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:48 compute-0 python3.9[172931]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:47:48 compute-0 sudo[172929]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:49 compute-0 sudo[173081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxvghbqmfrexkqrdbyjqpovpyhzvbcgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100468.794517-571-110585402313876/AnsiballZ_file.py'
Jan 22 16:47:49 compute-0 sudo[173081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:49 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 22 16:47:49 compute-0 python3.9[173083]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:47:49 compute-0 sudo[173081]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:49 compute-0 sudo[173234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrxhacwxhtgzrtwrficrdbybruayfrbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100469.476537-571-44897982408919/AnsiballZ_file.py'
Jan 22 16:47:49 compute-0 sudo[173234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:49 compute-0 python3.9[173236]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:47:49 compute-0 sudo[173234]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:50 compute-0 sudo[173386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nihrswazzwsfqsntedhztdibuedjaylx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100470.1462874-571-242734124782016/AnsiballZ_file.py'
Jan 22 16:47:50 compute-0 sudo[173386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:50 compute-0 python3.9[173388]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:47:50 compute-0 sudo[173386]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:51 compute-0 sudo[173538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjyzkbygyssfizwrwvbvxjghjfgcsgfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100471.08653-629-95277904857874/AnsiballZ_command.py'
Jan 22 16:47:51 compute-0 sudo[173538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:51 compute-0 python3.9[173540]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:47:51 compute-0 sudo[173538]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:52 compute-0 python3.9[173692]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 22 16:47:53 compute-0 sudo[173842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tafzsmjnlfpgofajtaemxmwcrwgzfzyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100472.7717083-647-4081201304884/AnsiballZ_systemd_service.py'
Jan 22 16:47:53 compute-0 sudo[173842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:53 compute-0 python3.9[173844]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 16:47:53 compute-0 systemd[1]: Reloading.
Jan 22 16:47:53 compute-0 systemd-rc-local-generator[173865]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:47:53 compute-0 systemd-sysv-generator[173873]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:47:53 compute-0 sudo[173842]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:54 compute-0 sudo[174028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-voqnvhfsdovpdwfiymlvehvrsjvhpnzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100473.9216769-655-12076279934835/AnsiballZ_command.py'
Jan 22 16:47:54 compute-0 sudo[174028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:54 compute-0 python3.9[174030]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:47:54 compute-0 sudo[174028]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:54 compute-0 sudo[174181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbiadcqdfjapgswaydildaswrhowkfgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100474.5432909-655-199106386060589/AnsiballZ_command.py'
Jan 22 16:47:54 compute-0 sudo[174181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:55 compute-0 python3.9[174183]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:47:55 compute-0 sudo[174181]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:55 compute-0 sudo[174334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acngmvaiijlwtvzpvsnkbusogwkjvruh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100475.2025616-655-85004038821408/AnsiballZ_command.py'
Jan 22 16:47:55 compute-0 sudo[174334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:55 compute-0 python3.9[174336]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:47:55 compute-0 sudo[174334]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:56 compute-0 sudo[174487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxnnzqqruqqozznllqxlkjpzzdhuyoic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100475.8822725-655-126739583459165/AnsiballZ_command.py'
Jan 22 16:47:56 compute-0 sudo[174487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:56 compute-0 python3.9[174489]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:47:56 compute-0 sudo[174487]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:56 compute-0 sudo[174640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxwevxqfvmelaidytjakaoydqdvrbbiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100476.5654576-655-98826751072326/AnsiballZ_command.py'
Jan 22 16:47:56 compute-0 sudo[174640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:57 compute-0 python3.9[174642]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:47:57 compute-0 sudo[174640]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:57 compute-0 sudo[174793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyikiygppumcgezsmmhhxtybchjxtvok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100477.335153-655-120271666821630/AnsiballZ_command.py'
Jan 22 16:47:57 compute-0 sudo[174793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:57 compute-0 python3.9[174795]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:47:57 compute-0 sudo[174793]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:58 compute-0 sudo[174946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zazicbtkpurjtxpjmlaqcnugprbhdvgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100478.1251128-655-39770371384575/AnsiballZ_command.py'
Jan 22 16:47:58 compute-0 sudo[174946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:58 compute-0 python3.9[174948]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:47:58 compute-0 sudo[174946]: pam_unix(sudo:session): session closed for user root
Jan 22 16:47:59 compute-0 sudo[175099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljbzceaafgqwxueddyvdwjrxidrjmlbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100478.8640528-655-204295928341680/AnsiballZ_command.py'
Jan 22 16:47:59 compute-0 sudo[175099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:47:59 compute-0 python3.9[175101]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:47:59 compute-0 sudo[175099]: pam_unix(sudo:session): session closed for user root
Jan 22 16:48:00 compute-0 sudo[175252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqnzdslqzvhgmksczogxfmdenjogpwqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100480.4486969-734-270569445920493/AnsiballZ_file.py'
Jan 22 16:48:00 compute-0 sudo[175252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:48:01 compute-0 python3.9[175254]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:48:01 compute-0 sudo[175252]: pam_unix(sudo:session): session closed for user root
Jan 22 16:48:01 compute-0 sudo[175404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kelyofidkvbvpntvzehokvusdyluwudl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100481.2722764-734-265274611466043/AnsiballZ_file.py'
Jan 22 16:48:01 compute-0 sudo[175404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:48:01 compute-0 python3.9[175406]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:48:01 compute-0 sudo[175404]: pam_unix(sudo:session): session closed for user root
Jan 22 16:48:02 compute-0 sudo[175556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzzxseknzucafzmszxwgtcelqjdqieus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100481.9834497-734-206136521615525/AnsiballZ_file.py'
Jan 22 16:48:02 compute-0 sudo[175556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:48:02 compute-0 python3.9[175558]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:48:02 compute-0 sudo[175556]: pam_unix(sudo:session): session closed for user root
Jan 22 16:48:03 compute-0 sudo[175708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfhvzumgnbfewteixylnzdtictxxgtll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100482.748571-756-29218047120676/AnsiballZ_file.py'
Jan 22 16:48:03 compute-0 sudo[175708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:48:03 compute-0 python3.9[175710]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:48:03 compute-0 sudo[175708]: pam_unix(sudo:session): session closed for user root
Jan 22 16:48:03 compute-0 sudo[175860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwiisstiblislhrxvxzvhjnlvwayadgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100483.456719-756-162387495948540/AnsiballZ_file.py'
Jan 22 16:48:03 compute-0 sudo[175860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:48:03 compute-0 python3.9[175862]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:48:03 compute-0 sudo[175860]: pam_unix(sudo:session): session closed for user root
Jan 22 16:48:04 compute-0 sudo[176012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wostzdlrampvijtghvngawpzjhfrtldl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100484.1055667-756-94664495050939/AnsiballZ_file.py'
Jan 22 16:48:04 compute-0 sudo[176012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:48:04 compute-0 python3.9[176014]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:48:04 compute-0 sudo[176012]: pam_unix(sudo:session): session closed for user root
Jan 22 16:48:05 compute-0 sudo[176164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzdhpqwzftjnjctrfcgktxjqrsugtvij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100484.7662432-756-87906861185213/AnsiballZ_file.py'
Jan 22 16:48:05 compute-0 sudo[176164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:48:05 compute-0 python3.9[176166]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:48:05 compute-0 sudo[176164]: pam_unix(sudo:session): session closed for user root
Jan 22 16:48:05 compute-0 sudo[176316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zphyxkdnlvrtemoddymfuaqlaqaqzmte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100485.4706495-756-195879390235844/AnsiballZ_file.py'
Jan 22 16:48:05 compute-0 sudo[176316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:48:05 compute-0 python3.9[176318]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:48:06 compute-0 sudo[176316]: pam_unix(sudo:session): session closed for user root
Jan 22 16:48:06 compute-0 sudo[176468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szjiderppkqmgzpibnsahqvoabxorkgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100486.1436899-756-174504466598991/AnsiballZ_file.py'
Jan 22 16:48:06 compute-0 sudo[176468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:48:06 compute-0 python3.9[176470]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:48:06 compute-0 sudo[176468]: pam_unix(sudo:session): session closed for user root
Jan 22 16:48:07 compute-0 sudo[176620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqqsbedxqhcrafoawuoawzzmzidstcqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100486.871592-756-225129688574276/AnsiballZ_file.py'
Jan 22 16:48:07 compute-0 sudo[176620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:48:07 compute-0 python3.9[176622]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:48:07 compute-0 sudo[176620]: pam_unix(sudo:session): session closed for user root
Jan 22 16:48:12 compute-0 sudo[176772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nudmjyebgxshiqcutysxdcupoaqktmch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100491.7415967-925-108916027487750/AnsiballZ_getent.py'
Jan 22 16:48:12 compute-0 sudo[176772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:48:12 compute-0 python3.9[176774]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Jan 22 16:48:12 compute-0 sudo[176772]: pam_unix(sudo:session): session closed for user root
Jan 22 16:48:13 compute-0 sudo[176925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqrwrstmlzxpktrnfziaxltadpoinddq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100492.6865556-933-219143986927335/AnsiballZ_group.py'
Jan 22 16:48:13 compute-0 sudo[176925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:48:13 compute-0 python3.9[176927]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 22 16:48:13 compute-0 groupadd[176928]: group added to /etc/group: name=nova, GID=42436
Jan 22 16:48:13 compute-0 groupadd[176928]: group added to /etc/gshadow: name=nova
Jan 22 16:48:13 compute-0 groupadd[176928]: new group: name=nova, GID=42436
Jan 22 16:48:13 compute-0 sudo[176925]: pam_unix(sudo:session): session closed for user root
Jan 22 16:48:14 compute-0 sudo[177083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxnjvmrfsemxwwzzbwryeqolgxsyqdav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100493.6503446-941-83864127766317/AnsiballZ_user.py'
Jan 22 16:48:14 compute-0 sudo[177083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:48:14 compute-0 python3.9[177085]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 22 16:48:14 compute-0 useradd[177087]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Jan 22 16:48:14 compute-0 useradd[177087]: add 'nova' to group 'libvirt'
Jan 22 16:48:14 compute-0 useradd[177087]: add 'nova' to shadow group 'libvirt'
Jan 22 16:48:14 compute-0 sudo[177083]: pam_unix(sudo:session): session closed for user root
Jan 22 16:48:15 compute-0 sshd-session[177118]: Accepted publickey for zuul from 192.168.122.30 port 45434 ssh2: ECDSA SHA256:XN6iZwzcCiCbU0l3vCSWxZPd4ElPyx+ZEvhzj+S5SUw
Jan 22 16:48:15 compute-0 systemd-logind[796]: New session 25 of user zuul.
Jan 22 16:48:15 compute-0 systemd[1]: Started Session 25 of User zuul.
Jan 22 16:48:15 compute-0 sshd-session[177118]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 16:48:15 compute-0 sshd-session[177121]: Received disconnect from 192.168.122.30 port 45434:11: disconnected by user
Jan 22 16:48:15 compute-0 sshd-session[177121]: Disconnected from user zuul 192.168.122.30 port 45434
Jan 22 16:48:15 compute-0 sshd-session[177118]: pam_unix(sshd:session): session closed for user zuul
Jan 22 16:48:15 compute-0 systemd[1]: session-25.scope: Deactivated successfully.
Jan 22 16:48:15 compute-0 systemd-logind[796]: Session 25 logged out. Waiting for processes to exit.
Jan 22 16:48:15 compute-0 systemd-logind[796]: Removed session 25.
Jan 22 16:48:16 compute-0 python3.9[177271]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:48:16 compute-0 podman[177367]: 2026-01-22 16:48:16.916923422 +0000 UTC m=+0.089226589 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 16:48:16 compute-0 podman[177366]: 2026-01-22 16:48:16.962812365 +0000 UTC m=+0.133273533 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 22 16:48:17 compute-0 python3.9[177419]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769100495.8525095-966-53552315922241/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:48:17 compute-0 python3.9[177585]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:48:18 compute-0 python3.9[177661]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:48:19 compute-0 python3.9[177811]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:48:19 compute-0 python3.9[177932]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769100498.5565722-966-167903988527418/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:48:20 compute-0 python3.9[178082]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:48:21 compute-0 python3.9[178203]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769100500.1009917-966-52523399516433/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:48:21 compute-0 python3.9[178353]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:48:22 compute-0 python3.9[178474]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769100501.4281654-966-213188482022206/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:48:23 compute-0 python3.9[178624]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:48:23 compute-0 python3.9[178745]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769100502.6717846-966-95953438627482/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:48:24 compute-0 sudo[178895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrxgaaodkyonchfnvruurganszsndkrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100504.0041358-1049-219037646446765/AnsiballZ_file.py'
Jan 22 16:48:24 compute-0 sudo[178895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:48:24 compute-0 python3.9[178897]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:48:24 compute-0 sudo[178895]: pam_unix(sudo:session): session closed for user root
Jan 22 16:48:25 compute-0 sudo[179047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghbwrishxuuinlmedthkbwxfmgldyisp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100504.7892191-1057-16935443104651/AnsiballZ_copy.py'
Jan 22 16:48:25 compute-0 sudo[179047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:48:25 compute-0 python3.9[179049]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:48:25 compute-0 sudo[179047]: pam_unix(sudo:session): session closed for user root
Jan 22 16:48:25 compute-0 sudo[179199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btajytzujeazyxttlqvtdbpdwedmieys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100505.5072753-1065-227101047423613/AnsiballZ_stat.py'
Jan 22 16:48:25 compute-0 sudo[179199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:48:26 compute-0 python3.9[179201]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:48:26 compute-0 sudo[179199]: pam_unix(sudo:session): session closed for user root
Jan 22 16:48:26 compute-0 sudo[179351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odgqabhivckcbwhwcfaltashpzyfqwwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100506.2956815-1073-97971459078155/AnsiballZ_stat.py'
Jan 22 16:48:26 compute-0 sudo[179351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:48:26 compute-0 python3.9[179353]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:48:26 compute-0 sudo[179351]: pam_unix(sudo:session): session closed for user root
Jan 22 16:48:27 compute-0 sudo[179474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xofidecoshkdhotpmaikptnszeuvnniw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100506.2956815-1073-97971459078155/AnsiballZ_copy.py'
Jan 22 16:48:27 compute-0 sudo[179474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:48:27 compute-0 python3.9[179476]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1769100506.2956815-1073-97971459078155/.source _original_basename=.yvrijg80 follow=False checksum=a8e56b013226e7a969bc949134fb885af868284c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Jan 22 16:48:27 compute-0 sudo[179474]: pam_unix(sudo:session): session closed for user root
Jan 22 16:48:28 compute-0 python3.9[179628]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:48:28 compute-0 python3.9[179780]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:48:29 compute-0 python3.9[179901]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769100508.5270674-1099-281076486972948/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=aff5546b44cf4461a7541a94e4cce1332c9b58b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:48:30 compute-0 python3.9[180051]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:48:32 compute-0 python3.9[180172]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769100510.1475103-1114-4962565405638/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:48:33 compute-0 sudo[180322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btzxeeejfebgtqmtecdfcxjjmjppvipn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100512.5631888-1131-216577717402038/AnsiballZ_container_config_data.py'
Jan 22 16:48:33 compute-0 sudo[180322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:48:33 compute-0 python3.9[180324]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Jan 22 16:48:33 compute-0 sudo[180322]: pam_unix(sudo:session): session closed for user root
Jan 22 16:48:34 compute-0 sudo[180474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnfyirtgnwmnfowzffufjckhnnvavgke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100513.6267455-1142-173770554059618/AnsiballZ_container_config_hash.py'
Jan 22 16:48:34 compute-0 sudo[180474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:48:34 compute-0 python3.9[180476]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 22 16:48:34 compute-0 sudo[180474]: pam_unix(sudo:session): session closed for user root
Jan 22 16:48:35 compute-0 sudo[180626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qncnlnmuykvajlfeufjxpruborgjbgop ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769100514.7605975-1152-204801625193423/AnsiballZ_edpm_container_manage.py'
Jan 22 16:48:35 compute-0 sudo[180626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:48:35 compute-0 python3[180628]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 22 16:48:35 compute-0 podman[180665]: 2026-01-22 16:48:35.843158362 +0000 UTC m=+0.054636007 container create ddb09390195b756475e3299b73380b576d2d95487745a61cccf042418cb6c4bc (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, managed_by=edpm_ansible, container_name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 16:48:35 compute-0 podman[180665]: 2026-01-22 16:48:35.811153239 +0000 UTC m=+0.022630894 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 22 16:48:35 compute-0 python3[180628]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Jan 22 16:48:36 compute-0 sudo[180626]: pam_unix(sudo:session): session closed for user root
Jan 22 16:48:36 compute-0 sudo[180853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wialppvgokdmestmlerhxdzrljuxwsgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100516.2217762-1160-62608160311075/AnsiballZ_stat.py'
Jan 22 16:48:36 compute-0 sudo[180853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:48:36 compute-0 python3.9[180855]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:48:36 compute-0 sudo[180853]: pam_unix(sudo:session): session closed for user root
Jan 22 16:48:37 compute-0 sudo[181007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsembulvreelsirbitjwiaqutlnnlgdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100517.3317547-1172-114568014084313/AnsiballZ_container_config_data.py'
Jan 22 16:48:37 compute-0 sudo[181007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:48:37 compute-0 python3.9[181009]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Jan 22 16:48:37 compute-0 sudo[181007]: pam_unix(sudo:session): session closed for user root
Jan 22 16:48:38 compute-0 sudo[181159]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtoadwjfjdgirzhhnooksakmfhzzwfly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100518.2723968-1183-30851457584813/AnsiballZ_container_config_hash.py'
Jan 22 16:48:38 compute-0 sudo[181159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:48:38 compute-0 python3.9[181161]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 22 16:48:38 compute-0 sudo[181159]: pam_unix(sudo:session): session closed for user root
Jan 22 16:48:39 compute-0 sudo[181311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxiirozqnesumwdzhsgzzwonhgundica ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769100519.1348817-1193-99060848467593/AnsiballZ_edpm_container_manage.py'
Jan 22 16:48:39 compute-0 sudo[181311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:48:39 compute-0 python3[181313]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 22 16:48:39 compute-0 podman[181350]: 2026-01-22 16:48:39.952781651 +0000 UTC m=+0.061929982 container create 7e081cf3e74ca81472f44e092740e74d76aa9d3296e900d7563f5c5fa5075f21 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Jan 22 16:48:39 compute-0 podman[181350]: 2026-01-22 16:48:39.91784956 +0000 UTC m=+0.026997971 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 22 16:48:39 compute-0 python3[181313]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Jan 22 16:48:40 compute-0 sudo[181311]: pam_unix(sudo:session): session closed for user root
Jan 22 16:48:40 compute-0 sudo[181535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kozaxqoryptwxlgjsmmsflcyipglijil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100520.265655-1201-231581254587612/AnsiballZ_stat.py'
Jan 22 16:48:40 compute-0 sudo[181535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:48:40 compute-0 python3.9[181537]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:48:40 compute-0 sudo[181535]: pam_unix(sudo:session): session closed for user root
Jan 22 16:48:41 compute-0 sudo[181689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxjbkdzfgilzeyablhuzmiykejmmsepp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100521.1380143-1210-162022229505020/AnsiballZ_file.py'
Jan 22 16:48:41 compute-0 sudo[181689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:48:41 compute-0 python3.9[181691]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:48:41 compute-0 sudo[181689]: pam_unix(sudo:session): session closed for user root
Jan 22 16:48:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:48:41.902 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:48:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:48:41.903 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:48:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:48:41.903 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:48:42 compute-0 sudo[181840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyisvzymvcullsgbbvkcjqhwnpruceuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100521.739997-1210-53222247406643/AnsiballZ_copy.py'
Jan 22 16:48:42 compute-0 sudo[181840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:48:42 compute-0 python3.9[181842]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769100521.739997-1210-53222247406643/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:48:42 compute-0 sudo[181840]: pam_unix(sudo:session): session closed for user root
Jan 22 16:48:42 compute-0 sudo[181916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eeseqpxylcoqaegxgtresxoxjrdhktmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100521.739997-1210-53222247406643/AnsiballZ_systemd.py'
Jan 22 16:48:42 compute-0 sudo[181916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:48:43 compute-0 python3.9[181918]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 16:48:43 compute-0 systemd[1]: Reloading.
Jan 22 16:48:43 compute-0 systemd-rc-local-generator[181945]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:48:43 compute-0 systemd-sysv-generator[181948]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:48:43 compute-0 sudo[181916]: pam_unix(sudo:session): session closed for user root
Jan 22 16:48:43 compute-0 sudo[182028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yyjfgyhxbcufpnictbwjqlimuxytrjqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100521.739997-1210-53222247406643/AnsiballZ_systemd.py'
Jan 22 16:48:43 compute-0 sudo[182028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:48:43 compute-0 python3.9[182030]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:48:44 compute-0 systemd[1]: Reloading.
Jan 22 16:48:44 compute-0 systemd-rc-local-generator[182054]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:48:44 compute-0 systemd-sysv-generator[182057]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:48:44 compute-0 systemd[1]: Starting nova_compute container...
Jan 22 16:48:44 compute-0 systemd[1]: Started libcrun container.
Jan 22 16:48:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f6a47cb747ad3a09e3d79ab2f56dd1bfc6d52ebb23134d96e66dc5f3202337ba/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 22 16:48:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f6a47cb747ad3a09e3d79ab2f56dd1bfc6d52ebb23134d96e66dc5f3202337ba/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 22 16:48:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f6a47cb747ad3a09e3d79ab2f56dd1bfc6d52ebb23134d96e66dc5f3202337ba/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 22 16:48:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f6a47cb747ad3a09e3d79ab2f56dd1bfc6d52ebb23134d96e66dc5f3202337ba/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 22 16:48:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f6a47cb747ad3a09e3d79ab2f56dd1bfc6d52ebb23134d96e66dc5f3202337ba/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 22 16:48:44 compute-0 podman[182070]: 2026-01-22 16:48:44.374734003 +0000 UTC m=+0.113902826 container init 7e081cf3e74ca81472f44e092740e74d76aa9d3296e900d7563f5c5fa5075f21 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 16:48:44 compute-0 podman[182070]: 2026-01-22 16:48:44.387124863 +0000 UTC m=+0.126293696 container start 7e081cf3e74ca81472f44e092740e74d76aa9d3296e900d7563f5c5fa5075f21 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 22 16:48:44 compute-0 podman[182070]: nova_compute
Jan 22 16:48:44 compute-0 nova_compute[182084]: + sudo -E kolla_set_configs
Jan 22 16:48:44 compute-0 systemd[1]: Started nova_compute container.
Jan 22 16:48:44 compute-0 sudo[182028]: pam_unix(sudo:session): session closed for user root
Jan 22 16:48:44 compute-0 nova_compute[182084]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 22 16:48:44 compute-0 nova_compute[182084]: INFO:__main__:Validating config file
Jan 22 16:48:44 compute-0 nova_compute[182084]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 22 16:48:44 compute-0 nova_compute[182084]: INFO:__main__:Copying service configuration files
Jan 22 16:48:44 compute-0 nova_compute[182084]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 22 16:48:44 compute-0 nova_compute[182084]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 22 16:48:44 compute-0 nova_compute[182084]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 22 16:48:44 compute-0 nova_compute[182084]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 22 16:48:44 compute-0 nova_compute[182084]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 22 16:48:44 compute-0 nova_compute[182084]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 22 16:48:44 compute-0 nova_compute[182084]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 22 16:48:44 compute-0 nova_compute[182084]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 22 16:48:44 compute-0 nova_compute[182084]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 22 16:48:44 compute-0 nova_compute[182084]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 22 16:48:44 compute-0 nova_compute[182084]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 22 16:48:44 compute-0 nova_compute[182084]: INFO:__main__:Deleting /etc/ceph
Jan 22 16:48:44 compute-0 nova_compute[182084]: INFO:__main__:Creating directory /etc/ceph
Jan 22 16:48:44 compute-0 nova_compute[182084]: INFO:__main__:Setting permission for /etc/ceph
Jan 22 16:48:44 compute-0 nova_compute[182084]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 22 16:48:44 compute-0 nova_compute[182084]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 22 16:48:44 compute-0 nova_compute[182084]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 22 16:48:44 compute-0 nova_compute[182084]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 22 16:48:44 compute-0 nova_compute[182084]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 22 16:48:44 compute-0 nova_compute[182084]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 22 16:48:44 compute-0 nova_compute[182084]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 22 16:48:44 compute-0 nova_compute[182084]: INFO:__main__:Writing out command to execute
Jan 22 16:48:44 compute-0 nova_compute[182084]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 22 16:48:44 compute-0 nova_compute[182084]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 22 16:48:44 compute-0 nova_compute[182084]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 22 16:48:44 compute-0 nova_compute[182084]: ++ cat /run_command
Jan 22 16:48:44 compute-0 nova_compute[182084]: + CMD=nova-compute
Jan 22 16:48:44 compute-0 nova_compute[182084]: + ARGS=
Jan 22 16:48:44 compute-0 nova_compute[182084]: + sudo kolla_copy_cacerts
Jan 22 16:48:44 compute-0 nova_compute[182084]: + [[ ! -n '' ]]
Jan 22 16:48:44 compute-0 nova_compute[182084]: + . kolla_extend_start
Jan 22 16:48:44 compute-0 nova_compute[182084]: + echo 'Running command: '\''nova-compute'\'''
Jan 22 16:48:44 compute-0 nova_compute[182084]: Running command: 'nova-compute'
Jan 22 16:48:44 compute-0 nova_compute[182084]: + umask 0022
Jan 22 16:48:44 compute-0 nova_compute[182084]: + exec nova-compute
Jan 22 16:48:45 compute-0 python3.9[182245]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:48:46 compute-0 python3.9[182396]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:48:46 compute-0 nova_compute[182084]: 2026-01-22 16:48:46.391 182088 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 22 16:48:46 compute-0 nova_compute[182084]: 2026-01-22 16:48:46.392 182088 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 22 16:48:46 compute-0 nova_compute[182084]: 2026-01-22 16:48:46.392 182088 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 22 16:48:46 compute-0 nova_compute[182084]: 2026-01-22 16:48:46.392 182088 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Jan 22 16:48:46 compute-0 nova_compute[182084]: 2026-01-22 16:48:46.525 182088 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 16:48:46 compute-0 nova_compute[182084]: 2026-01-22 16:48:46.553 182088 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 16:48:46 compute-0 nova_compute[182084]: 2026-01-22 16:48:46.554 182088 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 22 16:48:46 compute-0 python3.9[182550]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.225 182088 INFO nova.virt.driver [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.346 182088 INFO nova.compute.provider_config [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Jan 22 16:48:47 compute-0 podman[182613]: 2026-01-22 16:48:47.366147452 +0000 UTC m=+0.053382624 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.366 182088 DEBUG oslo_concurrency.lockutils [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.367 182088 DEBUG oslo_concurrency.lockutils [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.367 182088 DEBUG oslo_concurrency.lockutils [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.368 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.368 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.368 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.368 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.368 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.369 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.369 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.369 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.369 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.369 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.370 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.370 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.370 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.370 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.370 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.371 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.371 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.371 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.371 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.371 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.372 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.372 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.372 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.372 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.372 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.372 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.373 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.373 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.373 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.373 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.373 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.373 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.374 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.374 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.374 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] force_config_drive             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.374 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.374 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.374 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.375 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.375 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.375 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.375 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.375 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.376 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.376 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.376 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.376 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.376 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.376 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.376 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.377 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.377 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.377 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.377 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.377 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.377 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.378 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.378 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.378 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.378 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.378 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.378 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.379 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.379 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.379 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.379 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.379 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.379 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.380 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.380 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.380 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.380 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.380 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.380 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.381 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.381 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.381 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.381 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.381 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.381 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.382 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.382 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.382 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.382 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.382 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.382 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.383 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.383 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.383 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.383 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.383 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.384 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.384 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.384 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.384 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.384 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.385 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.385 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.385 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.385 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.385 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.385 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.386 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.386 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.386 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.386 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.386 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.386 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.387 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.387 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.387 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.387 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.387 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.387 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 podman[182599]: 2026-01-22 16:48:47.388059546 +0000 UTC m=+0.074630810 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.388 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.388 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.388 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.388 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.388 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.389 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.389 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.389 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.389 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.389 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.389 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.390 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.390 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.390 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.390 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.390 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.391 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.391 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.391 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.391 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.391 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.391 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.391 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.392 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.392 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.392 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.392 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.392 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.392 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.393 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.393 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.393 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.393 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.393 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.394 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.394 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.394 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.394 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.394 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.395 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.395 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.395 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.395 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.395 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.395 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.396 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.396 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.396 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.396 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.396 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.396 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.396 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.397 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.397 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.397 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.397 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.397 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.397 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.398 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.398 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.398 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.398 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.398 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.398 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.399 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.399 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.399 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.399 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.399 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.399 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.400 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.400 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.400 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.400 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.400 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.400 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.400 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.400 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.401 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.401 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.401 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.401 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.401 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.401 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.402 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.402 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.402 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.402 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.402 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.402 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.403 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.403 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.403 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.403 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.403 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.403 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.404 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.404 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.404 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.404 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.404 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.405 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.405 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.405 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.405 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.405 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.405 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.406 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.406 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.406 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.406 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.406 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.406 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.407 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.407 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.407 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.407 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.407 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.407 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.408 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.408 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.408 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.408 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.408 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.408 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.409 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.409 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.409 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.409 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.409 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.409 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.410 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.410 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.410 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.410 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.410 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.410 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.411 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.411 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.411 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.411 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.411 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.411 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.412 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.412 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.412 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.412 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.412 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.412 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.413 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.413 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.413 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.413 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.413 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.413 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.413 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.413 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.414 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.414 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.414 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.414 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.414 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.414 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.414 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.415 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.415 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.415 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.415 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.415 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.415 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.415 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.416 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.416 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.416 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.416 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.416 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.416 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.416 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.416 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.417 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.417 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.417 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.417 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.417 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.417 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.417 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.418 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.418 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.418 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.418 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.418 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.418 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.418 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.419 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.419 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.419 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.419 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.419 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.419 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.419 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.420 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.420 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.420 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.420 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.420 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.420 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.420 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.421 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.421 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.421 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.421 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.421 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.421 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.422 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.422 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.422 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.422 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.422 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.422 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.422 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.423 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.423 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.423 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.423 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.423 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.423 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.423 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.424 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.424 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.424 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.424 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.424 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.424 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.424 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.424 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.425 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.425 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.425 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.425 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.425 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.425 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.426 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.426 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.426 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.426 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.426 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.426 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.426 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.427 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.427 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.427 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.427 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.427 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.427 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.428 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.428 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.428 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.428 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.428 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.428 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.428 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.428 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.429 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.429 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.429 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.429 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.429 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.429 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.429 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.430 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.430 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.430 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.430 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.430 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.431 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.431 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.431 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.431 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.431 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.431 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.432 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.432 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.432 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.432 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.432 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.432 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.432 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.432 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.433 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.433 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.433 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.433 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.433 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.433 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.433 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.434 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.434 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.434 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.434 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.434 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.434 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.434 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.435 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.435 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.435 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.435 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.435 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.435 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.436 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.436 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.436 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.436 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.436 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.436 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.437 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.437 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.437 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.437 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.437 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.437 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.437 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.438 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.438 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.438 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.438 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.438 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.438 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.438 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.439 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.439 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.439 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.439 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.439 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.439 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.439 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.440 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.440 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.440 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.440 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.440 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.440 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.440 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.441 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.441 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.441 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.441 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.441 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.441 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.441 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.442 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.442 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.442 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.442 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.442 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.443 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.443 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.443 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.443 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.443 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.443 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.443 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.443 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.444 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.444 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.444 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.444 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.444 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.444 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.444 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.445 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.445 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.445 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.445 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.445 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.446 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.446 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.446 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.446 182088 WARNING oslo_config.cfg [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 22 16:48:47 compute-0 nova_compute[182084]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 22 16:48:47 compute-0 nova_compute[182084]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 22 16:48:47 compute-0 nova_compute[182084]: and ``live_migration_inbound_addr`` respectively.
Jan 22 16:48:47 compute-0 nova_compute[182084]: ).  Its value may be silently ignored in the future.
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.446 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.447 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.447 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.447 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.447 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.447 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.447 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.447 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.448 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.448 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.448 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.448 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.448 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.448 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.448 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.449 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.449 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.449 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.449 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.449 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.449 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.449 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.449 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.450 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.450 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.450 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.450 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.450 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.450 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.451 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.451 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.451 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.451 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.451 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.451 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.452 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.452 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.452 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.452 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.452 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.452 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.452 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.453 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.453 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.453 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.453 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.453 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.453 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.453 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.454 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.454 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.454 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.454 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.454 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.454 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.454 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.455 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.455 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.455 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.455 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.455 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.455 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.455 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.455 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.456 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.456 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.456 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.456 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.456 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.456 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.456 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.457 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.457 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.457 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.457 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.457 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.457 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.458 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.458 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.458 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.458 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.458 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.458 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.459 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.459 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.459 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.459 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.459 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.460 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.460 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.460 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.460 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.460 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.460 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.460 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.461 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.461 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.461 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.461 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.461 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.462 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.462 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.462 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.462 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.462 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.462 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.463 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.463 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.463 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.463 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.463 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.464 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.464 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.464 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.464 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.464 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.464 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.464 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.464 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.465 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.465 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.465 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.465 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.465 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.465 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.465 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.466 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.466 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.466 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.466 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.466 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.466 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.466 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.467 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.467 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.467 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.467 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.467 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.467 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.468 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.468 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.468 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.468 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.468 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.468 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.468 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.469 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.469 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.469 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.469 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.469 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.469 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.470 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.470 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.470 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.470 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.470 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.470 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.471 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.471 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.471 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.471 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.471 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.471 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.472 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.472 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.472 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.472 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.472 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.472 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.473 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.473 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.473 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.473 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.473 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.473 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.474 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.474 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.474 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.474 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.474 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.474 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.475 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.475 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.475 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.475 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.475 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.475 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.475 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.475 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.476 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.476 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.476 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.476 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.476 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.476 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.476 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.477 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.477 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.477 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.477 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.477 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.477 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.478 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.478 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.478 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.478 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.478 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.478 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.479 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.479 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.479 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.479 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.479 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.479 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.479 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.479 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.480 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.480 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.480 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.480 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.480 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.480 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.480 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.481 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.481 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.481 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.481 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.481 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.481 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.481 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.481 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.482 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.482 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.482 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.482 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.482 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.482 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.482 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.483 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.483 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.483 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.483 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.483 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.483 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.483 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.484 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.484 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.484 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.484 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.484 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.484 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.485 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.485 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.485 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.485 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.485 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.485 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.485 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.485 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.486 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.486 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.486 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.486 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.486 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.486 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.486 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.487 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.487 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.487 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.487 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.487 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.487 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.487 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.488 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.488 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.488 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.488 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.488 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.488 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.488 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.488 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.489 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.489 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.489 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.489 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.489 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.489 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.489 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.490 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.490 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.490 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.490 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.490 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.490 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.490 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.491 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.491 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.491 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.491 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.491 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.491 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.491 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.492 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.492 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.492 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.492 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.492 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.492 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.492 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.492 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.493 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.493 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.493 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.493 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.493 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.493 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.493 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.494 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.494 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.494 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.494 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.494 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.494 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.494 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.494 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.495 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.495 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.495 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.495 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.495 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.495 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.495 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.496 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.496 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.496 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.496 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.496 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.496 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.496 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.497 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.497 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.497 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.497 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.497 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.497 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.498 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.498 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.498 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.498 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.498 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.498 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.498 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.499 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.499 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.499 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.499 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.499 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.499 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.499 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.499 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.500 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.500 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.500 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.500 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.500 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.500 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.500 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.501 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.501 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.501 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.501 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.501 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.501 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.501 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.501 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.502 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.502 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.502 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.502 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.502 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.502 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.502 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.503 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.503 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.503 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.503 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.503 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.503 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.503 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.503 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.504 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.504 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.504 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.504 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.504 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.504 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.504 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.505 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.505 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.505 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.505 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.505 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.505 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.506 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.506 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.506 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.506 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.506 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.506 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.506 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.507 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.507 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.507 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.507 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.507 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.507 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.507 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.508 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.508 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.508 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.508 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.508 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.508 182088 DEBUG oslo_service.service [None req-0aa7280f-0917-4167-bd4f-c9b0a2976a9b - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.509 182088 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.524 182088 DEBUG nova.virt.libvirt.host [None req-ff184a8a-6dde-4e8b-ac78-598cd2f4ac0b - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.524 182088 DEBUG nova.virt.libvirt.host [None req-ff184a8a-6dde-4e8b-ac78-598cd2f4ac0b - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.525 182088 DEBUG nova.virt.libvirt.host [None req-ff184a8a-6dde-4e8b-ac78-598cd2f4ac0b - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.525 182088 DEBUG nova.virt.libvirt.host [None req-ff184a8a-6dde-4e8b-ac78-598cd2f4ac0b - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Jan 22 16:48:47 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Jan 22 16:48:47 compute-0 systemd[1]: Started libvirt QEMU daemon.
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.583 182088 DEBUG nova.virt.libvirt.host [None req-ff184a8a-6dde-4e8b-ac78-598cd2f4ac0b - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f028c842df0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.585 182088 DEBUG nova.virt.libvirt.host [None req-ff184a8a-6dde-4e8b-ac78-598cd2f4ac0b - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f028c842df0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.586 182088 INFO nova.virt.libvirt.driver [None req-ff184a8a-6dde-4e8b-ac78-598cd2f4ac0b - - - - - -] Connection event '1' reason 'None'
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.608 182088 WARNING nova.virt.libvirt.driver [None req-ff184a8a-6dde-4e8b-ac78-598cd2f4ac0b - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Jan 22 16:48:47 compute-0 nova_compute[182084]: 2026-01-22 16:48:47.609 182088 DEBUG nova.virt.libvirt.volume.mount [None req-ff184a8a-6dde-4e8b-ac78-598cd2f4ac0b - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 22 16:48:47 compute-0 sudo[182799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlhlhainrfrzmvzolvtsjtjtuvfctwdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100527.2224848-1270-17461725132222/AnsiballZ_podman_container.py'
Jan 22 16:48:47 compute-0 sudo[182799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:48:47 compute-0 python3.9[182801]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 22 16:48:48 compute-0 sudo[182799]: pam_unix(sudo:session): session closed for user root
Jan 22 16:48:48 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 16:48:48 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 16:48:48 compute-0 nova_compute[182084]: 2026-01-22 16:48:48.456 182088 INFO nova.virt.libvirt.host [None req-ff184a8a-6dde-4e8b-ac78-598cd2f4ac0b - - - - - -] Libvirt host capabilities <capabilities>
Jan 22 16:48:48 compute-0 nova_compute[182084]: 
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <host>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <uuid>814341d3-bd19-425d-8185-e66e96ccdc81</uuid>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <cpu>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <arch>x86_64</arch>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model>EPYC-Rome-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <vendor>AMD</vendor>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <microcode version='16777317'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <signature family='23' model='49' stepping='0'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <maxphysaddr mode='emulate' bits='40'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature name='x2apic'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature name='tsc-deadline'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature name='osxsave'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature name='hypervisor'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature name='tsc_adjust'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature name='spec-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature name='stibp'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature name='arch-capabilities'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature name='ssbd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature name='cmp_legacy'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature name='topoext'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature name='virt-ssbd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature name='lbrv'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature name='tsc-scale'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature name='vmcb-clean'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature name='pause-filter'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature name='pfthreshold'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature name='svme-addr-chk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature name='rdctl-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature name='skip-l1dfl-vmentry'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature name='mds-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature name='pschange-mc-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <pages unit='KiB' size='4'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <pages unit='KiB' size='2048'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <pages unit='KiB' size='1048576'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </cpu>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <power_management>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <suspend_mem/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <suspend_disk/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <suspend_hybrid/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </power_management>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <iommu support='no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <migration_features>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <live/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <uri_transports>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <uri_transport>tcp</uri_transport>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <uri_transport>rdma</uri_transport>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </uri_transports>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </migration_features>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <topology>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <cells num='1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <cell id='0'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:           <memory unit='KiB'>7864308</memory>
Jan 22 16:48:48 compute-0 nova_compute[182084]:           <pages unit='KiB' size='4'>1966077</pages>
Jan 22 16:48:48 compute-0 nova_compute[182084]:           <pages unit='KiB' size='2048'>0</pages>
Jan 22 16:48:48 compute-0 nova_compute[182084]:           <pages unit='KiB' size='1048576'>0</pages>
Jan 22 16:48:48 compute-0 nova_compute[182084]:           <distances>
Jan 22 16:48:48 compute-0 nova_compute[182084]:             <sibling id='0' value='10'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:           </distances>
Jan 22 16:48:48 compute-0 nova_compute[182084]:           <cpus num='8'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:           </cpus>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         </cell>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </cells>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </topology>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <cache>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </cache>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <secmodel>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model>selinux</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <doi>0</doi>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </secmodel>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <secmodel>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model>dac</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <doi>0</doi>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <baselabel type='kvm'>+107:+107</baselabel>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <baselabel type='qemu'>+107:+107</baselabel>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </secmodel>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   </host>
Jan 22 16:48:48 compute-0 nova_compute[182084]: 
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <guest>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <os_type>hvm</os_type>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <arch name='i686'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <wordsize>32</wordsize>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <domain type='qemu'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <domain type='kvm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </arch>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <features>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <pae/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <nonpae/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <acpi default='on' toggle='yes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <apic default='on' toggle='no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <cpuselection/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <deviceboot/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <disksnapshot default='on' toggle='no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <externalSnapshot/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </features>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   </guest>
Jan 22 16:48:48 compute-0 nova_compute[182084]: 
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <guest>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <os_type>hvm</os_type>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <arch name='x86_64'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <wordsize>64</wordsize>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <domain type='qemu'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <domain type='kvm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </arch>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <features>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <acpi default='on' toggle='yes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <apic default='on' toggle='no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <cpuselection/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <deviceboot/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <disksnapshot default='on' toggle='no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <externalSnapshot/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </features>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   </guest>
Jan 22 16:48:48 compute-0 nova_compute[182084]: 
Jan 22 16:48:48 compute-0 nova_compute[182084]: </capabilities>
Jan 22 16:48:48 compute-0 nova_compute[182084]: 
Jan 22 16:48:48 compute-0 nova_compute[182084]: 2026-01-22 16:48:48.466 182088 DEBUG nova.virt.libvirt.host [None req-ff184a8a-6dde-4e8b-ac78-598cd2f4ac0b - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 22 16:48:48 compute-0 nova_compute[182084]: 2026-01-22 16:48:48.494 182088 DEBUG nova.virt.libvirt.host [None req-ff184a8a-6dde-4e8b-ac78-598cd2f4ac0b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 22 16:48:48 compute-0 nova_compute[182084]: <domainCapabilities>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <path>/usr/libexec/qemu-kvm</path>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <domain>kvm</domain>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <arch>i686</arch>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <vcpu max='240'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <iothreads supported='yes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <os supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <enum name='firmware'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <loader supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='type'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>rom</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>pflash</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='readonly'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>yes</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>no</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='secure'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>no</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </loader>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   </os>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <cpu>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <mode name='host-passthrough' supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='hostPassthroughMigratable'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>on</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>off</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </mode>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <mode name='maximum' supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='maximumMigratable'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>on</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>off</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </mode>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <mode name='host-model' supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <vendor>AMD</vendor>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='x2apic'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='tsc-deadline'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='hypervisor'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='tsc_adjust'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='spec-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='stibp'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='ssbd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='cmp_legacy'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='overflow-recov'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='succor'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='ibrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='amd-ssbd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='virt-ssbd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='lbrv'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='tsc-scale'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='vmcb-clean'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='flushbyasid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='pause-filter'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='pfthreshold'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='svme-addr-chk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='disable' name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </mode>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <mode name='custom' supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Broadwell'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Broadwell-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Broadwell-noTSX'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Broadwell-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Broadwell-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Broadwell-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Broadwell-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cascadelake-Server'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cascadelake-Server-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cascadelake-Server-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cascadelake-Server-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cascadelake-Server-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cascadelake-Server-v5'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='ClearwaterForest'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni-int16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bhi-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bhi-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cmpccxadd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ddpd-u'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='intel-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ipred-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='lam'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='prefetchiti'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rrsba-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sha512'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sm3'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sm4'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='ClearwaterForest-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni-int16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bhi-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bhi-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cmpccxadd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ddpd-u'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='intel-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ipred-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='lam'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='prefetchiti'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rrsba-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sha512'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sm3'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sm4'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cooperlake'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cooperlake-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cooperlake-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Denverton'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mpx'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Denverton-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mpx'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Denverton-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Denverton-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Dhyana-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Genoa'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amd-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='auto-ibrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='stibp-always-on'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Genoa-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amd-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='auto-ibrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='stibp-always-on'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Genoa-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amd-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='auto-ibrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fs-gs-base-ns'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='perfmon-v2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='stibp-always-on'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Milan'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Milan-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Milan-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amd-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='stibp-always-on'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Milan-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amd-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='stibp-always-on'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Rome'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Rome-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Rome-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Rome-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Turin'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amd-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='auto-ibrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vp2intersect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fs-gs-base-ns'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibpb-brtype'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='perfmon-v2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='prefetchi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbpb'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='srso-user-kernel-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='stibp-always-on'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Turin-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amd-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='auto-ibrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vp2intersect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fs-gs-base-ns'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibpb-brtype'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='perfmon-v2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='prefetchi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbpb'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='srso-user-kernel-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='stibp-always-on'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-v5'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='GraniteRapids'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='prefetchiti'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='GraniteRapids-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='prefetchiti'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='GraniteRapids-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx10'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx10-128'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx10-256'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx10-512'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='prefetchiti'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='GraniteRapids-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx10'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx10-128'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx10-256'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx10-512'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='prefetchiti'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Haswell'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Haswell-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Haswell-noTSX'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Haswell-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Haswell-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Haswell-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Haswell-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server-noTSX'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server-v5'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server-v6'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server-v7'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='IvyBridge'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='IvyBridge-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='IvyBridge-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='IvyBridge-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='KnightsMill'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-4fmaps'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-4vnniw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512er'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512pf'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='KnightsMill-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-4fmaps'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-4vnniw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512er'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512pf'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Opteron_G4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fma4'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xop'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Opteron_G4-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fma4'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xop'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Opteron_G5'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fma4'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tbm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xop'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Opteron_G5-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fma4'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tbm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xop'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SapphireRapids'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SapphireRapids-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SapphireRapids-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SapphireRapids-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SapphireRapids-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SierraForest'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cmpccxadd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SierraForest-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cmpccxadd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SierraForest-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bhi-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cmpccxadd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='intel-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ipred-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='lam'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rrsba-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SierraForest-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bhi-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cmpccxadd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='intel-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ipred-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='lam'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rrsba-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Client'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Client-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Client-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Client-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Client-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Client-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Server'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Server-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Server-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Server-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Server-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Server-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Server-v5'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Snowridge'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='core-capability'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mpx'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='split-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Snowridge-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='core-capability'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mpx'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='split-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Snowridge-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='core-capability'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='split-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Snowridge-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='core-capability'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='split-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Snowridge-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='athlon'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='3dnow'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='3dnowext'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='athlon-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='3dnow'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='3dnowext'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='core2duo'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='core2duo-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='coreduo'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='coreduo-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='n270'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='n270-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='phenom'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='3dnow'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='3dnowext'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='phenom-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='3dnow'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='3dnowext'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </mode>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   </cpu>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <memoryBacking supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <enum name='sourceType'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <value>file</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <value>anonymous</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <value>memfd</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   </memoryBacking>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <devices>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <disk supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='diskDevice'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>disk</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>cdrom</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>floppy</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>lun</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='bus'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>ide</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>fdc</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>scsi</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtio</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>usb</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>sata</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='model'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtio</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtio-transitional</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtio-non-transitional</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </disk>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <graphics supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='type'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>vnc</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>egl-headless</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>dbus</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </graphics>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <video supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='modelType'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>vga</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>cirrus</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtio</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>none</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>bochs</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>ramfb</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </video>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <hostdev supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='mode'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>subsystem</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='startupPolicy'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>default</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>mandatory</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>requisite</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>optional</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='subsysType'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>usb</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>pci</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>scsi</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='capsType'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='pciBackend'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </hostdev>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <rng supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='model'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtio</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtio-transitional</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtio-non-transitional</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='backendModel'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>random</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>egd</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>builtin</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </rng>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <filesystem supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='driverType'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>path</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>handle</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtiofs</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </filesystem>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <tpm supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='model'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>tpm-tis</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>tpm-crb</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='backendModel'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>emulator</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>external</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='backendVersion'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>2.0</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </tpm>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <redirdev supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='bus'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>usb</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </redirdev>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <channel supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='type'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>pty</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>unix</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </channel>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <crypto supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='model'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='type'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>qemu</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='backendModel'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>builtin</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </crypto>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <interface supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='backendType'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>default</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>passt</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </interface>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <panic supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='model'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>isa</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>hyperv</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </panic>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <console supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='type'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>null</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>vc</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>pty</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>dev</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>file</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>pipe</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>stdio</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>udp</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>tcp</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>unix</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>qemu-vdagent</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>dbus</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </console>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   </devices>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <features>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <gic supported='no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <vmcoreinfo supported='yes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <genid supported='yes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <backingStoreInput supported='yes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <backup supported='yes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <async-teardown supported='yes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <s390-pv supported='no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <ps2 supported='yes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <tdx supported='no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <sev supported='no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <sgx supported='no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <hyperv supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='features'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>relaxed</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>vapic</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>spinlocks</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>vpindex</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>runtime</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>synic</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>stimer</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>reset</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>vendor_id</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>frequencies</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>reenlightenment</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>tlbflush</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>ipi</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>avic</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>emsr_bitmap</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>xmm_input</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <defaults>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <spinlocks>4095</spinlocks>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <stimer_direct>on</stimer_direct>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <tlbflush_direct>on</tlbflush_direct>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <tlbflush_extended>on</tlbflush_extended>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </defaults>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </hyperv>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <launchSecurity supported='no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   </features>
Jan 22 16:48:48 compute-0 nova_compute[182084]: </domainCapabilities>
Jan 22 16:48:48 compute-0 nova_compute[182084]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 22 16:48:48 compute-0 nova_compute[182084]: 2026-01-22 16:48:48.502 182088 DEBUG nova.virt.libvirt.host [None req-ff184a8a-6dde-4e8b-ac78-598cd2f4ac0b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 22 16:48:48 compute-0 nova_compute[182084]: <domainCapabilities>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <path>/usr/libexec/qemu-kvm</path>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <domain>kvm</domain>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <arch>i686</arch>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <vcpu max='4096'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <iothreads supported='yes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <os supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <enum name='firmware'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <loader supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='type'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>rom</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>pflash</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='readonly'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>yes</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>no</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='secure'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>no</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </loader>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   </os>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <cpu>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <mode name='host-passthrough' supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='hostPassthroughMigratable'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>on</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>off</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </mode>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <mode name='maximum' supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='maximumMigratable'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>on</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>off</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </mode>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <mode name='host-model' supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <vendor>AMD</vendor>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='x2apic'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='tsc-deadline'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='hypervisor'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='tsc_adjust'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='spec-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='stibp'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='ssbd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='cmp_legacy'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='overflow-recov'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='succor'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='ibrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='amd-ssbd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='virt-ssbd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='lbrv'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='tsc-scale'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='vmcb-clean'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='flushbyasid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='pause-filter'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='pfthreshold'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='svme-addr-chk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='disable' name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </mode>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <mode name='custom' supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Broadwell'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Broadwell-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Broadwell-noTSX'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Broadwell-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Broadwell-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Broadwell-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Broadwell-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cascadelake-Server'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cascadelake-Server-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cascadelake-Server-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cascadelake-Server-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cascadelake-Server-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cascadelake-Server-v5'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='ClearwaterForest'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni-int16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bhi-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bhi-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cmpccxadd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ddpd-u'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='intel-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ipred-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='lam'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='prefetchiti'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rrsba-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sha512'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sm3'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sm4'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='ClearwaterForest-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni-int16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bhi-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bhi-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cmpccxadd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ddpd-u'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='intel-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ipred-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='lam'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='prefetchiti'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rrsba-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sha512'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sm3'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sm4'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cooperlake'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cooperlake-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cooperlake-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Denverton'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mpx'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Denverton-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mpx'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Denverton-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Denverton-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Dhyana-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Genoa'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amd-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='auto-ibrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='stibp-always-on'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Genoa-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amd-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='auto-ibrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='stibp-always-on'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Genoa-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amd-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='auto-ibrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fs-gs-base-ns'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='perfmon-v2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='stibp-always-on'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Milan'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Milan-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Milan-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amd-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='stibp-always-on'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Milan-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amd-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='stibp-always-on'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Rome'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Rome-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Rome-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Rome-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Turin'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amd-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='auto-ibrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vp2intersect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fs-gs-base-ns'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibpb-brtype'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='perfmon-v2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='prefetchi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbpb'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='srso-user-kernel-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='stibp-always-on'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Turin-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amd-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='auto-ibrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vp2intersect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fs-gs-base-ns'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibpb-brtype'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='perfmon-v2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='prefetchi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbpb'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='srso-user-kernel-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='stibp-always-on'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-v5'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='GraniteRapids'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='prefetchiti'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='GraniteRapids-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='prefetchiti'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='GraniteRapids-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx10'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx10-128'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx10-256'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx10-512'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='prefetchiti'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='GraniteRapids-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx10'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx10-128'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx10-256'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx10-512'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='prefetchiti'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Haswell'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Haswell-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Haswell-noTSX'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Haswell-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Haswell-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Haswell-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Haswell-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server-noTSX'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server-v5'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server-v6'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server-v7'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='IvyBridge'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='IvyBridge-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='IvyBridge-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='IvyBridge-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='KnightsMill'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-4fmaps'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-4vnniw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512er'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512pf'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='KnightsMill-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-4fmaps'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-4vnniw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512er'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512pf'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Opteron_G4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fma4'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xop'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Opteron_G4-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fma4'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xop'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Opteron_G5'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fma4'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tbm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xop'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Opteron_G5-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fma4'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tbm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xop'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SapphireRapids'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SapphireRapids-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SapphireRapids-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SapphireRapids-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SapphireRapids-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SierraForest'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cmpccxadd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SierraForest-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cmpccxadd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SierraForest-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bhi-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cmpccxadd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='intel-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ipred-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='lam'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rrsba-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SierraForest-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bhi-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cmpccxadd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='intel-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ipred-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='lam'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rrsba-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Client'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Client-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Client-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Client-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Client-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Client-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Server'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Server-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Server-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Server-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Server-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Server-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Server-v5'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Snowridge'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='core-capability'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mpx'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='split-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Snowridge-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='core-capability'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mpx'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='split-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Snowridge-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='core-capability'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='split-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Snowridge-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='core-capability'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='split-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Snowridge-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='athlon'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='3dnow'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='3dnowext'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='athlon-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='3dnow'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='3dnowext'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='core2duo'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='core2duo-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='coreduo'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='coreduo-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='n270'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='n270-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='phenom'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='3dnow'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='3dnowext'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='phenom-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='3dnow'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='3dnowext'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </mode>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   </cpu>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <memoryBacking supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <enum name='sourceType'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <value>file</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <value>anonymous</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <value>memfd</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   </memoryBacking>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <devices>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <disk supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='diskDevice'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>disk</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>cdrom</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>floppy</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>lun</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='bus'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>fdc</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>scsi</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtio</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>usb</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>sata</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='model'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtio</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtio-transitional</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtio-non-transitional</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </disk>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <graphics supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='type'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>vnc</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>egl-headless</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>dbus</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </graphics>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <video supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='modelType'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>vga</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>cirrus</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtio</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>none</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>bochs</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>ramfb</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </video>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <hostdev supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='mode'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>subsystem</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='startupPolicy'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>default</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>mandatory</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>requisite</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>optional</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='subsysType'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>usb</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>pci</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>scsi</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='capsType'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='pciBackend'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </hostdev>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <rng supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='model'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtio</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtio-transitional</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtio-non-transitional</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='backendModel'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>random</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>egd</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>builtin</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </rng>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <filesystem supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='driverType'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>path</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>handle</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtiofs</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </filesystem>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <tpm supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='model'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>tpm-tis</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>tpm-crb</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='backendModel'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>emulator</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>external</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='backendVersion'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>2.0</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </tpm>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <redirdev supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='bus'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>usb</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </redirdev>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <channel supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='type'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>pty</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>unix</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </channel>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <crypto supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='model'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='type'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>qemu</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='backendModel'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>builtin</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </crypto>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <interface supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='backendType'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>default</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>passt</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </interface>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <panic supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='model'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>isa</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>hyperv</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </panic>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <console supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='type'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>null</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>vc</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>pty</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>dev</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>file</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>pipe</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>stdio</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>udp</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>tcp</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>unix</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>qemu-vdagent</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>dbus</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </console>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   </devices>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <features>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <gic supported='no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <vmcoreinfo supported='yes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <genid supported='yes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <backingStoreInput supported='yes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <backup supported='yes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <async-teardown supported='yes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <s390-pv supported='no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <ps2 supported='yes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <tdx supported='no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <sev supported='no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <sgx supported='no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <hyperv supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='features'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>relaxed</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>vapic</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>spinlocks</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>vpindex</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>runtime</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>synic</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>stimer</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>reset</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>vendor_id</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>frequencies</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>reenlightenment</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>tlbflush</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>ipi</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>avic</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>emsr_bitmap</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>xmm_input</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <defaults>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <spinlocks>4095</spinlocks>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <stimer_direct>on</stimer_direct>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <tlbflush_direct>on</tlbflush_direct>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <tlbflush_extended>on</tlbflush_extended>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </defaults>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </hyperv>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <launchSecurity supported='no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   </features>
Jan 22 16:48:48 compute-0 nova_compute[182084]: </domainCapabilities>
Jan 22 16:48:48 compute-0 nova_compute[182084]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 22 16:48:48 compute-0 nova_compute[182084]: 2026-01-22 16:48:48.556 182088 DEBUG nova.virt.libvirt.host [None req-ff184a8a-6dde-4e8b-ac78-598cd2f4ac0b - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 22 16:48:48 compute-0 nova_compute[182084]: 2026-01-22 16:48:48.563 182088 DEBUG nova.virt.libvirt.host [None req-ff184a8a-6dde-4e8b-ac78-598cd2f4ac0b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 22 16:48:48 compute-0 nova_compute[182084]: <domainCapabilities>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <path>/usr/libexec/qemu-kvm</path>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <domain>kvm</domain>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <arch>x86_64</arch>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <vcpu max='240'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <iothreads supported='yes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <os supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <enum name='firmware'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <loader supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='type'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>rom</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>pflash</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='readonly'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>yes</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>no</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='secure'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>no</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </loader>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   </os>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <cpu>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <mode name='host-passthrough' supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='hostPassthroughMigratable'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>on</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>off</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </mode>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <mode name='maximum' supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='maximumMigratable'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>on</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>off</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </mode>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <mode name='host-model' supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <vendor>AMD</vendor>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='x2apic'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='tsc-deadline'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='hypervisor'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='tsc_adjust'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='spec-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='stibp'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='ssbd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='cmp_legacy'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='overflow-recov'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='succor'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='ibrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='amd-ssbd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='virt-ssbd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='lbrv'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='tsc-scale'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='vmcb-clean'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='flushbyasid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='pause-filter'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='pfthreshold'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='svme-addr-chk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='disable' name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </mode>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <mode name='custom' supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Broadwell'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Broadwell-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Broadwell-noTSX'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Broadwell-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Broadwell-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Broadwell-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Broadwell-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cascadelake-Server'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cascadelake-Server-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cascadelake-Server-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cascadelake-Server-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cascadelake-Server-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cascadelake-Server-v5'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='ClearwaterForest'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni-int16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bhi-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bhi-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cmpccxadd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ddpd-u'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='intel-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ipred-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='lam'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='prefetchiti'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rrsba-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sha512'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sm3'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sm4'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='ClearwaterForest-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni-int16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bhi-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bhi-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cmpccxadd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ddpd-u'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='intel-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ipred-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='lam'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='prefetchiti'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rrsba-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sha512'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sm3'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sm4'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cooperlake'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cooperlake-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cooperlake-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Denverton'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mpx'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Denverton-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mpx'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Denverton-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Denverton-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Dhyana-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Genoa'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amd-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='auto-ibrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='stibp-always-on'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Genoa-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amd-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='auto-ibrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='stibp-always-on'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Genoa-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amd-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='auto-ibrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fs-gs-base-ns'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='perfmon-v2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='stibp-always-on'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Milan'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Milan-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Milan-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amd-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='stibp-always-on'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Milan-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amd-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='stibp-always-on'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Rome'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Rome-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Rome-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Rome-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Turin'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amd-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='auto-ibrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vp2intersect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fs-gs-base-ns'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibpb-brtype'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='perfmon-v2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='prefetchi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbpb'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='srso-user-kernel-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='stibp-always-on'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Turin-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amd-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='auto-ibrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vp2intersect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fs-gs-base-ns'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibpb-brtype'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='perfmon-v2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='prefetchi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbpb'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='srso-user-kernel-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='stibp-always-on'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-v5'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='GraniteRapids'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='prefetchiti'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='GraniteRapids-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='prefetchiti'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='GraniteRapids-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx10'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx10-128'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx10-256'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx10-512'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='prefetchiti'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='GraniteRapids-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx10'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx10-128'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx10-256'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx10-512'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='prefetchiti'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Haswell'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Haswell-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Haswell-noTSX'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Haswell-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Haswell-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Haswell-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Haswell-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server-noTSX'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 sudo[182985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frnrhvzbaeypngqmelcljadakfbjchkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100528.3200264-1278-140342983810285/AnsiballZ_systemd.py'
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 sudo[182985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server-v5'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server-v6'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server-v7'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='IvyBridge'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='IvyBridge-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='IvyBridge-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='IvyBridge-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='KnightsMill'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-4fmaps'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-4vnniw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512er'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512pf'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='KnightsMill-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-4fmaps'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-4vnniw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512er'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512pf'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Opteron_G4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fma4'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xop'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Opteron_G4-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fma4'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xop'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Opteron_G5'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fma4'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tbm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xop'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Opteron_G5-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fma4'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tbm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xop'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SapphireRapids'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SapphireRapids-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SapphireRapids-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SapphireRapids-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SapphireRapids-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SierraForest'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cmpccxadd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SierraForest-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cmpccxadd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SierraForest-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bhi-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cmpccxadd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='intel-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ipred-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='lam'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rrsba-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SierraForest-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bhi-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cmpccxadd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='intel-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ipred-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='lam'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rrsba-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Client'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Client-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Client-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Client-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Client-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Client-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Server'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Server-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Server-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Server-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Server-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Server-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Server-v5'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Snowridge'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='core-capability'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mpx'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='split-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Snowridge-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='core-capability'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mpx'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='split-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Snowridge-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='core-capability'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='split-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Snowridge-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='core-capability'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='split-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Snowridge-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='athlon'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='3dnow'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='3dnowext'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='athlon-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='3dnow'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='3dnowext'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='core2duo'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='core2duo-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='coreduo'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='coreduo-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='n270'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='n270-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='phenom'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='3dnow'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='3dnowext'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='phenom-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='3dnow'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='3dnowext'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </mode>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   </cpu>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <memoryBacking supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <enum name='sourceType'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <value>file</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <value>anonymous</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <value>memfd</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   </memoryBacking>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <devices>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <disk supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='diskDevice'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>disk</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>cdrom</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>floppy</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>lun</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='bus'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>ide</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>fdc</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>scsi</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtio</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>usb</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>sata</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='model'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtio</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtio-transitional</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtio-non-transitional</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </disk>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <graphics supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='type'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>vnc</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>egl-headless</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>dbus</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </graphics>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <video supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='modelType'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>vga</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>cirrus</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtio</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>none</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>bochs</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>ramfb</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </video>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <hostdev supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='mode'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>subsystem</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='startupPolicy'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>default</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>mandatory</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>requisite</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>optional</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='subsysType'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>usb</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>pci</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>scsi</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='capsType'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='pciBackend'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </hostdev>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <rng supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='model'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtio</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtio-transitional</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtio-non-transitional</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='backendModel'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>random</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>egd</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>builtin</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </rng>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <filesystem supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='driverType'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>path</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>handle</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtiofs</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </filesystem>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <tpm supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='model'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>tpm-tis</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>tpm-crb</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='backendModel'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>emulator</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>external</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='backendVersion'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>2.0</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </tpm>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <redirdev supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='bus'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>usb</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </redirdev>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <channel supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='type'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>pty</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>unix</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </channel>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <crypto supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='model'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='type'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>qemu</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='backendModel'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>builtin</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </crypto>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <interface supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='backendType'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>default</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>passt</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </interface>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <panic supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='model'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>isa</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>hyperv</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </panic>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <console supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='type'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>null</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>vc</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>pty</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>dev</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>file</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>pipe</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>stdio</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>udp</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>tcp</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>unix</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>qemu-vdagent</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>dbus</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </console>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   </devices>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <features>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <gic supported='no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <vmcoreinfo supported='yes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <genid supported='yes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <backingStoreInput supported='yes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <backup supported='yes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <async-teardown supported='yes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <s390-pv supported='no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <ps2 supported='yes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <tdx supported='no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <sev supported='no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <sgx supported='no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <hyperv supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='features'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>relaxed</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>vapic</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>spinlocks</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>vpindex</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>runtime</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>synic</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>stimer</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>reset</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>vendor_id</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>frequencies</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>reenlightenment</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>tlbflush</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>ipi</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>avic</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>emsr_bitmap</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>xmm_input</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <defaults>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <spinlocks>4095</spinlocks>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <stimer_direct>on</stimer_direct>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <tlbflush_direct>on</tlbflush_direct>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <tlbflush_extended>on</tlbflush_extended>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </defaults>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </hyperv>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <launchSecurity supported='no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   </features>
Jan 22 16:48:48 compute-0 nova_compute[182084]: </domainCapabilities>
Jan 22 16:48:48 compute-0 nova_compute[182084]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 22 16:48:48 compute-0 nova_compute[182084]: 2026-01-22 16:48:48.642 182088 DEBUG nova.virt.libvirt.host [None req-ff184a8a-6dde-4e8b-ac78-598cd2f4ac0b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 22 16:48:48 compute-0 nova_compute[182084]: <domainCapabilities>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <path>/usr/libexec/qemu-kvm</path>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <domain>kvm</domain>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <arch>x86_64</arch>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <vcpu max='4096'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <iothreads supported='yes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <os supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <enum name='firmware'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <value>efi</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <loader supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='type'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>rom</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>pflash</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='readonly'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>yes</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>no</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='secure'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>yes</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>no</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </loader>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   </os>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <cpu>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <mode name='host-passthrough' supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='hostPassthroughMigratable'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>on</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>off</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </mode>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <mode name='maximum' supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='maximumMigratable'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>on</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>off</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </mode>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <mode name='host-model' supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <vendor>AMD</vendor>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='x2apic'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='tsc-deadline'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='hypervisor'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='tsc_adjust'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='spec-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='stibp'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='ssbd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='cmp_legacy'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='overflow-recov'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='succor'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='ibrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='amd-ssbd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='virt-ssbd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='lbrv'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='tsc-scale'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='vmcb-clean'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='flushbyasid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='pause-filter'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='pfthreshold'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='svme-addr-chk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <feature policy='disable' name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </mode>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <mode name='custom' supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Broadwell'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Broadwell-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Broadwell-noTSX'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Broadwell-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Broadwell-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Broadwell-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Broadwell-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cascadelake-Server'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cascadelake-Server-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cascadelake-Server-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cascadelake-Server-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cascadelake-Server-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cascadelake-Server-v5'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='ClearwaterForest'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni-int16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bhi-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bhi-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cmpccxadd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ddpd-u'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='intel-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ipred-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='lam'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='prefetchiti'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rrsba-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sha512'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sm3'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sm4'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='ClearwaterForest-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni-int16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bhi-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bhi-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cmpccxadd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ddpd-u'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='intel-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ipred-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='lam'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='prefetchiti'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rrsba-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sha512'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sm3'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sm4'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cooperlake'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cooperlake-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Cooperlake-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Denverton'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mpx'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Denverton-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mpx'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Denverton-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Denverton-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Dhyana-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Genoa'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amd-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='auto-ibrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='stibp-always-on'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Genoa-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amd-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='auto-ibrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='stibp-always-on'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Genoa-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amd-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='auto-ibrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fs-gs-base-ns'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='perfmon-v2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='stibp-always-on'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Milan'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Milan-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Milan-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amd-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='stibp-always-on'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Milan-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amd-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='stibp-always-on'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Rome'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Rome-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Rome-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Rome-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Turin'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amd-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='auto-ibrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vp2intersect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fs-gs-base-ns'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibpb-brtype'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='perfmon-v2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='prefetchi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbpb'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='srso-user-kernel-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='stibp-always-on'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-Turin-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amd-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='auto-ibrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vp2intersect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fs-gs-base-ns'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibpb-brtype'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='perfmon-v2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='prefetchi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbpb'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='srso-user-kernel-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='stibp-always-on'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='EPYC-v5'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='GraniteRapids'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='prefetchiti'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='GraniteRapids-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='prefetchiti'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='GraniteRapids-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx10'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx10-128'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx10-256'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx10-512'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='prefetchiti'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='GraniteRapids-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx10'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx10-128'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx10-256'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx10-512'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='prefetchiti'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Haswell'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Haswell-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Haswell-noTSX'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Haswell-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Haswell-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Haswell-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Haswell-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server-noTSX'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server-v5'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server-v6'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Icelake-Server-v7'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='IvyBridge'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='IvyBridge-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='IvyBridge-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='IvyBridge-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='KnightsMill'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-4fmaps'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-4vnniw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512er'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512pf'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='KnightsMill-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-4fmaps'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-4vnniw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512er'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512pf'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Opteron_G4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fma4'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xop'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Opteron_G4-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fma4'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xop'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Opteron_G5'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fma4'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tbm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xop'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Opteron_G5-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fma4'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tbm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xop'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SapphireRapids'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SapphireRapids-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SapphireRapids-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SapphireRapids-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SapphireRapids-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='amx-tile'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-bf16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-fp16'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bitalg'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrc'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fzrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='la57'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='taa-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SierraForest'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cmpccxadd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SierraForest-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cmpccxadd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SierraForest-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bhi-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cmpccxadd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='intel-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ipred-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='lam'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rrsba-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='SierraForest-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ifma'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bhi-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cmpccxadd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fbsdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='fsrs'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ibrs-all'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='intel-psfd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ipred-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='lam'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mcdt-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pbrsb-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='psdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rrsba-ctrl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='serialize'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vaes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Client'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Client-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Client-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Client-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Client-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Client-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Server'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Server-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Server-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Server-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='hle'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='rtm'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Server-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Server-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Skylake-Server-v5'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512bw'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512cd'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512dq'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512f'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='avx512vl'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='invpcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pcid'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='pku'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Snowridge'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='core-capability'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mpx'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='split-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Snowridge-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='core-capability'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='mpx'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='split-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Snowridge-v2'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='core-capability'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='split-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Snowridge-v3'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='core-capability'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='split-lock-detect'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='Snowridge-v4'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='cldemote'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='erms'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='gfni'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdir64b'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='movdiri'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='xsaves'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='athlon'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='3dnow'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='3dnowext'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='athlon-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='3dnow'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='3dnowext'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='core2duo'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='core2duo-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='coreduo'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='coreduo-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='n270'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='n270-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='ss'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='phenom'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='3dnow'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='3dnowext'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <blockers model='phenom-v1'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='3dnow'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <feature name='3dnowext'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </blockers>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </mode>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   </cpu>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <memoryBacking supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <enum name='sourceType'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <value>file</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <value>anonymous</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <value>memfd</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   </memoryBacking>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <devices>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <disk supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='diskDevice'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>disk</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>cdrom</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>floppy</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>lun</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='bus'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>fdc</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>scsi</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtio</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>usb</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>sata</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='model'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtio</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtio-transitional</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtio-non-transitional</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </disk>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <graphics supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='type'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>vnc</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>egl-headless</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>dbus</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </graphics>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <video supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='modelType'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>vga</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>cirrus</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtio</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>none</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>bochs</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>ramfb</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </video>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <hostdev supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='mode'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>subsystem</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='startupPolicy'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>default</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>mandatory</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>requisite</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>optional</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='subsysType'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>usb</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>pci</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>scsi</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='capsType'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='pciBackend'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </hostdev>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <rng supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='model'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtio</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtio-transitional</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtio-non-transitional</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='backendModel'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>random</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>egd</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>builtin</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </rng>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <filesystem supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='driverType'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>path</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>handle</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>virtiofs</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </filesystem>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <tpm supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='model'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>tpm-tis</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>tpm-crb</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='backendModel'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>emulator</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>external</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='backendVersion'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>2.0</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </tpm>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <redirdev supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='bus'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>usb</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </redirdev>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <channel supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='type'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>pty</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>unix</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </channel>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <crypto supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='model'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='type'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>qemu</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='backendModel'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>builtin</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </crypto>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <interface supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='backendType'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>default</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>passt</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </interface>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <panic supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='model'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>isa</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>hyperv</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </panic>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <console supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='type'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>null</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>vc</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>pty</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>dev</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>file</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>pipe</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>stdio</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>udp</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>tcp</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>unix</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>qemu-vdagent</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>dbus</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </console>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   </devices>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   <features>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <gic supported='no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <vmcoreinfo supported='yes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <genid supported='yes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <backingStoreInput supported='yes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <backup supported='yes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <async-teardown supported='yes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <s390-pv supported='no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <ps2 supported='yes'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <tdx supported='no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <sev supported='no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <sgx supported='no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <hyperv supported='yes'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <enum name='features'>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>relaxed</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>vapic</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>spinlocks</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>vpindex</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>runtime</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>synic</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>stimer</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>reset</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>vendor_id</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>frequencies</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>reenlightenment</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>tlbflush</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>ipi</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>avic</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>emsr_bitmap</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <value>xmm_input</value>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </enum>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       <defaults>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <spinlocks>4095</spinlocks>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <stimer_direct>on</stimer_direct>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <tlbflush_direct>on</tlbflush_direct>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <tlbflush_extended>on</tlbflush_extended>
Jan 22 16:48:48 compute-0 nova_compute[182084]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 22 16:48:48 compute-0 nova_compute[182084]:       </defaults>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     </hyperv>
Jan 22 16:48:48 compute-0 nova_compute[182084]:     <launchSecurity supported='no'/>
Jan 22 16:48:48 compute-0 nova_compute[182084]:   </features>
Jan 22 16:48:48 compute-0 nova_compute[182084]: </domainCapabilities>
Jan 22 16:48:48 compute-0 nova_compute[182084]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 22 16:48:48 compute-0 nova_compute[182084]: 2026-01-22 16:48:48.749 182088 DEBUG nova.virt.libvirt.host [None req-ff184a8a-6dde-4e8b-ac78-598cd2f4ac0b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 22 16:48:48 compute-0 nova_compute[182084]: 2026-01-22 16:48:48.750 182088 DEBUG nova.virt.libvirt.host [None req-ff184a8a-6dde-4e8b-ac78-598cd2f4ac0b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 22 16:48:48 compute-0 nova_compute[182084]: 2026-01-22 16:48:48.751 182088 DEBUG nova.virt.libvirt.host [None req-ff184a8a-6dde-4e8b-ac78-598cd2f4ac0b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 22 16:48:48 compute-0 nova_compute[182084]: 2026-01-22 16:48:48.756 182088 INFO nova.virt.libvirt.host [None req-ff184a8a-6dde-4e8b-ac78-598cd2f4ac0b - - - - - -] Secure Boot support detected
Jan 22 16:48:48 compute-0 nova_compute[182084]: 2026-01-22 16:48:48.758 182088 INFO nova.virt.libvirt.driver [None req-ff184a8a-6dde-4e8b-ac78-598cd2f4ac0b - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 22 16:48:48 compute-0 nova_compute[182084]: 2026-01-22 16:48:48.759 182088 INFO nova.virt.libvirt.driver [None req-ff184a8a-6dde-4e8b-ac78-598cd2f4ac0b - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 22 16:48:48 compute-0 nova_compute[182084]: 2026-01-22 16:48:48.774 182088 DEBUG nova.virt.libvirt.driver [None req-ff184a8a-6dde-4e8b-ac78-598cd2f4ac0b - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Jan 22 16:48:48 compute-0 nova_compute[182084]: 2026-01-22 16:48:48.827 182088 INFO nova.virt.node [None req-ff184a8a-6dde-4e8b-ac78-598cd2f4ac0b - - - - - -] Determined node identity 2513134c-f67c-4237-84bf-4ebe2450d610 from /var/lib/nova/compute_id
Jan 22 16:48:48 compute-0 nova_compute[182084]: 2026-01-22 16:48:48.860 182088 WARNING nova.compute.manager [None req-ff184a8a-6dde-4e8b-ac78-598cd2f4ac0b - - - - - -] Compute nodes ['2513134c-f67c-4237-84bf-4ebe2450d610'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Jan 22 16:48:48 compute-0 nova_compute[182084]: 2026-01-22 16:48:48.924 182088 INFO nova.compute.manager [None req-ff184a8a-6dde-4e8b-ac78-598cd2f4ac0b - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Jan 22 16:48:48 compute-0 nova_compute[182084]: 2026-01-22 16:48:48.955 182088 WARNING nova.compute.manager [None req-ff184a8a-6dde-4e8b-ac78-598cd2f4ac0b - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Jan 22 16:48:48 compute-0 nova_compute[182084]: 2026-01-22 16:48:48.955 182088 DEBUG oslo_concurrency.lockutils [None req-ff184a8a-6dde-4e8b-ac78-598cd2f4ac0b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:48:48 compute-0 nova_compute[182084]: 2026-01-22 16:48:48.955 182088 DEBUG oslo_concurrency.lockutils [None req-ff184a8a-6dde-4e8b-ac78-598cd2f4ac0b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:48:48 compute-0 nova_compute[182084]: 2026-01-22 16:48:48.955 182088 DEBUG oslo_concurrency.lockutils [None req-ff184a8a-6dde-4e8b-ac78-598cd2f4ac0b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:48:48 compute-0 nova_compute[182084]: 2026-01-22 16:48:48.955 182088 DEBUG nova.compute.resource_tracker [None req-ff184a8a-6dde-4e8b-ac78-598cd2f4ac0b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 16:48:48 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Jan 22 16:48:49 compute-0 systemd[1]: Started libvirt nodedev daemon.
Jan 22 16:48:49 compute-0 python3.9[182987]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 16:48:49 compute-0 systemd[1]: Stopping nova_compute container...
Jan 22 16:48:49 compute-0 virtqemud[182696]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Jan 22 16:48:49 compute-0 virtqemud[182696]: hostname: compute-0
Jan 22 16:48:49 compute-0 virtqemud[182696]: End of file while reading data: Input/output error
Jan 22 16:48:49 compute-0 systemd[1]: libpod-7e081cf3e74ca81472f44e092740e74d76aa9d3296e900d7563f5c5fa5075f21.scope: Deactivated successfully.
Jan 22 16:48:49 compute-0 systemd[1]: libpod-7e081cf3e74ca81472f44e092740e74d76aa9d3296e900d7563f5c5fa5075f21.scope: Consumed 2.606s CPU time.
Jan 22 16:48:49 compute-0 podman[183012]: 2026-01-22 16:48:49.196286843 +0000 UTC m=+0.086507306 container died 7e081cf3e74ca81472f44e092740e74d76aa9d3296e900d7563f5c5fa5075f21 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Jan 22 16:48:49 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7e081cf3e74ca81472f44e092740e74d76aa9d3296e900d7563f5c5fa5075f21-userdata-shm.mount: Deactivated successfully.
Jan 22 16:48:49 compute-0 systemd[1]: var-lib-containers-storage-overlay-f6a47cb747ad3a09e3d79ab2f56dd1bfc6d52ebb23134d96e66dc5f3202337ba-merged.mount: Deactivated successfully.
Jan 22 16:48:49 compute-0 podman[183012]: 2026-01-22 16:48:49.255871611 +0000 UTC m=+0.146092064 container cleanup 7e081cf3e74ca81472f44e092740e74d76aa9d3296e900d7563f5c5fa5075f21 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm)
Jan 22 16:48:49 compute-0 podman[183012]: nova_compute
Jan 22 16:48:49 compute-0 podman[183046]: nova_compute
Jan 22 16:48:49 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Jan 22 16:48:49 compute-0 systemd[1]: Stopped nova_compute container.
Jan 22 16:48:49 compute-0 systemd[1]: Starting nova_compute container...
Jan 22 16:48:49 compute-0 systemd[1]: Started libcrun container.
Jan 22 16:48:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f6a47cb747ad3a09e3d79ab2f56dd1bfc6d52ebb23134d96e66dc5f3202337ba/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 22 16:48:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f6a47cb747ad3a09e3d79ab2f56dd1bfc6d52ebb23134d96e66dc5f3202337ba/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 22 16:48:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f6a47cb747ad3a09e3d79ab2f56dd1bfc6d52ebb23134d96e66dc5f3202337ba/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 22 16:48:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f6a47cb747ad3a09e3d79ab2f56dd1bfc6d52ebb23134d96e66dc5f3202337ba/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 22 16:48:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f6a47cb747ad3a09e3d79ab2f56dd1bfc6d52ebb23134d96e66dc5f3202337ba/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 22 16:48:49 compute-0 podman[183059]: 2026-01-22 16:48:49.487500374 +0000 UTC m=+0.121833408 container init 7e081cf3e74ca81472f44e092740e74d76aa9d3296e900d7563f5c5fa5075f21 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=nova_compute, managed_by=edpm_ansible)
Jan 22 16:48:49 compute-0 podman[183059]: 2026-01-22 16:48:49.494818349 +0000 UTC m=+0.129151353 container start 7e081cf3e74ca81472f44e092740e74d76aa9d3296e900d7563f5c5fa5075f21 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 22 16:48:49 compute-0 podman[183059]: nova_compute
Jan 22 16:48:49 compute-0 nova_compute[183075]: + sudo -E kolla_set_configs
Jan 22 16:48:49 compute-0 systemd[1]: Started nova_compute container.
Jan 22 16:48:49 compute-0 sudo[182985]: pam_unix(sudo:session): session closed for user root
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Validating config file
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Copying service configuration files
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Deleting /etc/ceph
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Creating directory /etc/ceph
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Setting permission for /etc/ceph
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Writing out command to execute
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 22 16:48:49 compute-0 nova_compute[183075]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 22 16:48:49 compute-0 nova_compute[183075]: ++ cat /run_command
Jan 22 16:48:49 compute-0 nova_compute[183075]: + CMD=nova-compute
Jan 22 16:48:49 compute-0 nova_compute[183075]: + ARGS=
Jan 22 16:48:49 compute-0 nova_compute[183075]: + sudo kolla_copy_cacerts
Jan 22 16:48:49 compute-0 nova_compute[183075]: + [[ ! -n '' ]]
Jan 22 16:48:49 compute-0 nova_compute[183075]: + . kolla_extend_start
Jan 22 16:48:49 compute-0 nova_compute[183075]: + echo 'Running command: '\''nova-compute'\'''
Jan 22 16:48:49 compute-0 nova_compute[183075]: Running command: 'nova-compute'
Jan 22 16:48:49 compute-0 nova_compute[183075]: + umask 0022
Jan 22 16:48:49 compute-0 nova_compute[183075]: + exec nova-compute
Jan 22 16:48:50 compute-0 sudo[183236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtocasixyfuxjmibnuxoezmqexolkmgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100529.7702122-1287-168568926644039/AnsiballZ_podman_container.py'
Jan 22 16:48:50 compute-0 sudo[183236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:48:50 compute-0 python3.9[183238]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 22 16:48:50 compute-0 systemd[1]: Started libpod-conmon-ddb09390195b756475e3299b73380b576d2d95487745a61cccf042418cb6c4bc.scope.
Jan 22 16:48:50 compute-0 systemd[1]: Started libcrun container.
Jan 22 16:48:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f3fc7c32769bf1e08f563e2c106c4628a52449baf45e46bd8f0cc14f87ffa33/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Jan 22 16:48:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f3fc7c32769bf1e08f563e2c106c4628a52449baf45e46bd8f0cc14f87ffa33/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Jan 22 16:48:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f3fc7c32769bf1e08f563e2c106c4628a52449baf45e46bd8f0cc14f87ffa33/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 22 16:48:50 compute-0 podman[183264]: 2026-01-22 16:48:50.714422381 +0000 UTC m=+0.172993592 container init ddb09390195b756475e3299b73380b576d2d95487745a61cccf042418cb6c4bc (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Jan 22 16:48:50 compute-0 podman[183264]: 2026-01-22 16:48:50.725674911 +0000 UTC m=+0.184246092 container start ddb09390195b756475e3299b73380b576d2d95487745a61cccf042418cb6c4bc (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 16:48:50 compute-0 python3.9[183238]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Jan 22 16:48:50 compute-0 nova_compute_init[183285]: INFO:nova_statedir:Applying nova statedir ownership
Jan 22 16:48:50 compute-0 nova_compute_init[183285]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Jan 22 16:48:50 compute-0 nova_compute_init[183285]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Jan 22 16:48:50 compute-0 nova_compute_init[183285]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Jan 22 16:48:50 compute-0 nova_compute_init[183285]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Jan 22 16:48:50 compute-0 nova_compute_init[183285]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Jan 22 16:48:50 compute-0 nova_compute_init[183285]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Jan 22 16:48:50 compute-0 nova_compute_init[183285]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Jan 22 16:48:50 compute-0 nova_compute_init[183285]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Jan 22 16:48:50 compute-0 nova_compute_init[183285]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Jan 22 16:48:50 compute-0 nova_compute_init[183285]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Jan 22 16:48:50 compute-0 nova_compute_init[183285]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Jan 22 16:48:50 compute-0 nova_compute_init[183285]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Jan 22 16:48:50 compute-0 nova_compute_init[183285]: INFO:nova_statedir:Nova statedir ownership complete
Jan 22 16:48:50 compute-0 systemd[1]: libpod-ddb09390195b756475e3299b73380b576d2d95487745a61cccf042418cb6c4bc.scope: Deactivated successfully.
Jan 22 16:48:50 compute-0 podman[183299]: 2026-01-22 16:48:50.841413645 +0000 UTC m=+0.027630787 container died ddb09390195b756475e3299b73380b576d2d95487745a61cccf042418cb6c4bc (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=nova_compute_init, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, io.buildah.version=1.41.3)
Jan 22 16:48:50 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ddb09390195b756475e3299b73380b576d2d95487745a61cccf042418cb6c4bc-userdata-shm.mount: Deactivated successfully.
Jan 22 16:48:50 compute-0 systemd[1]: var-lib-containers-storage-overlay-1f3fc7c32769bf1e08f563e2c106c4628a52449baf45e46bd8f0cc14f87ffa33-merged.mount: Deactivated successfully.
Jan 22 16:48:50 compute-0 podman[183299]: 2026-01-22 16:48:50.884711019 +0000 UTC m=+0.070928131 container cleanup ddb09390195b756475e3299b73380b576d2d95487745a61cccf042418cb6c4bc (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=nova_compute_init, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 16:48:50 compute-0 systemd[1]: libpod-conmon-ddb09390195b756475e3299b73380b576d2d95487745a61cccf042418cb6c4bc.scope: Deactivated successfully.
Jan 22 16:48:50 compute-0 sudo[183236]: pam_unix(sudo:session): session closed for user root
Jan 22 16:48:51 compute-0 sshd-session[159976]: Connection closed by 192.168.122.30 port 59076
Jan 22 16:48:51 compute-0 sshd-session[159973]: pam_unix(sshd:session): session closed for user zuul
Jan 22 16:48:51 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Jan 22 16:48:51 compute-0 systemd[1]: session-24.scope: Consumed 1min 42.629s CPU time.
Jan 22 16:48:51 compute-0 systemd-logind[796]: Session 24 logged out. Waiting for processes to exit.
Jan 22 16:48:51 compute-0 systemd-logind[796]: Removed session 24.
Jan 22 16:48:51 compute-0 nova_compute[183075]: 2026-01-22 16:48:51.612 183079 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 22 16:48:51 compute-0 nova_compute[183075]: 2026-01-22 16:48:51.612 183079 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 22 16:48:51 compute-0 nova_compute[183075]: 2026-01-22 16:48:51.612 183079 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 22 16:48:51 compute-0 nova_compute[183075]: 2026-01-22 16:48:51.613 183079 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Jan 22 16:48:51 compute-0 nova_compute[183075]: 2026-01-22 16:48:51.750 183079 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 16:48:51 compute-0 nova_compute[183075]: 2026-01-22 16:48:51.777 183079 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 16:48:51 compute-0 nova_compute[183075]: 2026-01-22 16:48:51.778 183079 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.283 183079 INFO nova.virt.driver [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.377 183079 INFO nova.compute.provider_config [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.390 183079 DEBUG oslo_concurrency.lockutils [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.391 183079 DEBUG oslo_concurrency.lockutils [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.391 183079 DEBUG oslo_concurrency.lockutils [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.391 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.391 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.391 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.391 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.392 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.392 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.392 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.392 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.392 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.392 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.392 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.393 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.393 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.393 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.393 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.393 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.394 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.394 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.394 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.394 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.394 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.394 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.395 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.395 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.395 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.395 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.395 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.395 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.395 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.396 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.396 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.396 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.396 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.396 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.396 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] force_config_drive             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.397 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.397 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.397 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.397 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.397 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.397 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.398 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.398 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.398 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.398 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.398 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.399 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.399 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.399 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.399 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.399 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.399 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.399 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.400 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.400 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.400 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.400 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.400 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.400 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.400 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.401 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.401 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.401 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.401 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.401 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.401 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.402 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.402 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.402 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.402 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.402 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.402 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.403 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.403 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.403 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.403 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.403 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.403 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.404 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.404 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.404 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.404 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.404 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.404 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.404 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.405 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.405 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.405 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.405 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.405 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.405 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.405 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.406 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.406 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.406 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.406 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.406 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.406 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.406 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.407 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.407 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.407 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.407 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.407 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.407 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.407 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.408 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.408 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.408 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.408 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.408 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.408 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.409 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.409 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.409 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.409 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.409 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.409 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.409 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.410 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.410 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.410 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.410 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.410 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.410 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.411 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.411 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.411 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.411 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.411 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.412 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.412 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.412 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.412 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.412 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.412 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.413 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.413 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.413 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.413 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.413 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.413 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.414 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.414 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.414 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.414 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.414 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.414 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.414 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.415 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.415 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.415 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.415 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.415 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.415 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.415 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.416 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.416 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.416 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.416 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.416 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.416 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.416 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.417 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.417 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.417 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.417 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.417 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.417 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.418 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.418 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.418 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.418 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.418 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.418 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.419 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.419 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.419 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.419 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.419 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.419 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.419 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.420 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.420 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.420 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.420 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.420 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.420 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.420 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.421 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.421 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.421 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.421 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.421 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.421 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.422 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.422 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.422 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.422 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.422 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.422 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.422 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.423 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.423 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.423 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.423 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.423 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.423 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.423 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.423 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.424 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.424 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.424 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.424 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.424 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.424 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.424 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.425 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.425 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.425 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.425 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.425 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.425 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.426 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.426 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.426 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.426 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.426 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.426 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.426 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.427 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.427 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.427 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.427 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.427 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.427 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.427 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.428 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.428 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.428 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.428 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.428 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.428 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.428 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.429 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.429 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.429 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.429 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.429 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.429 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.429 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.429 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.430 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.430 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.430 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.430 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.430 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.430 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.430 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.431 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.431 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.431 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.431 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.431 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.431 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.431 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.432 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.432 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.432 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.432 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.432 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.432 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.433 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.433 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.433 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.433 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.433 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.433 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.433 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.434 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.434 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.434 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.434 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.434 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.434 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.434 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.435 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.435 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.435 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.435 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.435 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.435 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.436 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.436 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.436 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.436 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.436 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.436 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.437 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.437 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.437 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.437 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.437 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.437 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.437 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.438 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.438 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.438 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.438 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.438 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.439 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.439 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.439 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.439 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.439 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.439 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.439 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.440 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.440 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.440 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.440 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.440 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.440 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.440 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.441 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.441 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.441 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.441 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.441 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.441 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.441 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.442 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.442 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.442 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.442 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.442 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.442 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.443 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.443 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.443 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.443 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.443 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.443 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.443 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.444 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.444 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.444 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.444 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.444 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.444 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.444 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.445 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.445 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.445 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.445 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.445 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.446 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.446 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.446 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.446 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.446 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.446 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.446 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.447 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.447 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.447 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.447 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.447 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.447 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.447 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.448 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.448 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.448 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.448 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.448 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.448 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.448 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.449 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.449 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.449 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.449 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.449 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.449 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.450 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.450 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.450 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.450 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.450 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.450 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.451 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.451 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.451 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.451 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.451 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.452 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.452 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.452 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.452 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.452 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.452 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.453 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.453 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.453 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.453 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.453 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.453 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.453 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.454 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.454 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.454 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.454 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.454 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.454 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.454 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.455 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.455 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.455 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.455 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.455 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.455 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.455 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.456 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.456 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.456 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.456 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.456 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.456 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.456 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.457 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.457 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.457 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.457 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.457 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.457 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.457 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.458 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.458 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.458 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.458 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.458 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.458 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.458 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.458 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.459 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.459 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.459 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.459 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.459 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.459 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.459 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.460 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.460 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.460 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.460 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.460 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.460 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.460 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.461 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.461 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.461 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.461 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.461 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.461 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.461 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.462 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.462 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.462 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.462 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.462 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.462 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.462 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.463 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.463 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.463 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.463 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.463 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.463 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.464 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.464 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.464 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.464 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.464 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.465 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.465 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.465 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.465 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.465 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.465 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.466 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.466 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.466 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.466 183079 WARNING oslo_config.cfg [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 22 16:48:52 compute-0 nova_compute[183075]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 22 16:48:52 compute-0 nova_compute[183075]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 22 16:48:52 compute-0 nova_compute[183075]: and ``live_migration_inbound_addr`` respectively.
Jan 22 16:48:52 compute-0 nova_compute[183075]: ).  Its value may be silently ignored in the future.
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.467 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.467 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.467 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.467 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.467 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.468 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.468 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.468 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.468 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.468 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.468 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.469 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.469 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.469 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.469 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.470 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.470 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.470 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.470 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.470 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.471 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.471 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.471 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.471 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.471 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.471 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.472 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.472 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.472 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.472 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.472 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.473 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.473 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.473 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.473 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.473 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.474 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.474 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.474 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.474 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.474 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.475 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.475 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.475 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.475 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.475 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.476 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.476 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.476 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.476 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.476 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.477 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.477 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.477 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.477 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.477 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.478 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.478 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.478 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.478 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.478 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.478 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.478 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.479 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.479 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.479 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.479 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.479 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.479 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.479 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.480 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.480 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.480 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.480 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.480 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.480 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.481 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.481 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.481 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.481 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.481 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.481 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.482 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.482 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.482 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.482 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.482 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.483 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.483 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.483 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.483 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.484 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.484 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.484 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.484 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.484 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.485 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.485 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.485 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.485 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.486 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.486 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.486 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.486 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.486 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.487 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.487 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.487 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.487 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.487 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.488 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.488 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.488 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.488 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.488 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.488 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.488 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.489 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.489 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.489 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.489 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.490 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.490 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.490 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.490 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.490 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.490 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.491 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.491 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.491 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.491 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.491 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.491 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.491 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.492 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.492 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.492 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.492 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.492 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.492 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.493 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.493 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.493 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.493 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.493 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.494 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.494 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.494 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.494 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.494 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.495 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.495 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.495 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.495 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.495 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.496 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.496 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.496 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.496 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.496 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.497 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.497 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.497 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.497 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.498 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.498 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.498 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.498 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.498 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.498 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.499 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.499 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.499 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.499 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.499 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.499 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.500 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.500 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.500 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.500 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.500 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.500 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.501 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.501 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.501 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.501 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.501 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.501 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.502 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.502 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.502 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.502 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.502 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.502 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.502 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.503 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.503 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.503 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.503 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.503 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.504 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.504 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.504 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.504 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.504 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.504 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.505 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.505 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.505 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.505 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.505 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.505 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.506 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.506 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.506 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.506 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.506 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.506 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.506 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.506 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.507 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.507 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.507 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.507 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.507 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.507 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.507 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.508 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.508 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.508 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.508 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.508 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.508 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.508 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.509 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.509 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.509 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.509 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.509 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.509 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.509 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.510 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.510 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.510 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.510 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.510 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.510 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.510 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.511 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.511 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.511 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.511 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.511 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.511 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.512 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.512 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.512 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.512 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.512 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.512 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.512 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.513 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.513 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.513 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.513 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.513 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.513 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.513 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.514 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.514 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.514 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.514 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.514 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.514 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.514 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.515 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.515 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.515 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.515 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.515 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.515 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.515 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.516 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.516 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.516 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.516 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.516 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.516 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.516 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.517 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.517 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.517 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.517 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.517 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.517 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.518 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.518 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.518 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.518 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.518 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.518 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.518 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.519 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.519 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.519 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.519 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.519 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.519 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.519 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.520 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.520 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.520 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.520 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.520 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.520 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.520 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.521 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.521 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.521 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.521 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.521 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.521 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.521 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.522 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.522 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.522 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.522 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.522 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.522 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.522 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.523 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.523 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.523 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.523 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.523 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.523 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.524 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.524 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.524 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.524 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.524 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.524 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.524 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.525 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.525 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.525 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.525 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.525 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.525 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.525 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.526 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.526 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.526 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.526 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.526 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.526 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.526 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.527 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.527 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.527 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.527 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.527 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.527 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.527 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.528 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.528 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.528 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.528 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.528 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.528 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.528 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.529 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.529 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.529 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.529 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.529 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.529 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.529 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.530 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.530 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.530 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.530 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.530 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.530 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.530 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.531 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.531 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.531 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.531 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.531 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.531 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.531 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.532 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.532 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.532 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.532 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.532 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.532 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.532 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.533 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.533 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.533 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.533 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.533 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.533 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.533 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.534 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.534 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.534 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.534 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.534 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.534 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.535 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.535 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.535 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.535 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.535 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.535 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.535 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.535 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.536 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.536 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.536 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.536 183079 DEBUG oslo_service.service [None req-e856b250-f22d-4aec-b8b5-a7c535715069 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.537 183079 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.560 183079 INFO nova.virt.node [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Determined node identity 2513134c-f67c-4237-84bf-4ebe2450d610 from /var/lib/nova/compute_id
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.561 183079 DEBUG nova.virt.libvirt.host [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.562 183079 DEBUG nova.virt.libvirt.host [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.562 183079 DEBUG nova.virt.libvirt.host [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.562 183079 DEBUG nova.virt.libvirt.host [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.582 183079 DEBUG nova.virt.libvirt.host [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fcb46635ac0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.588 183079 DEBUG nova.virt.libvirt.host [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fcb46635ac0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.590 183079 INFO nova.virt.libvirt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Connection event '1' reason 'None'
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.598 183079 INFO nova.virt.libvirt.host [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Libvirt host capabilities <capabilities>
Jan 22 16:48:52 compute-0 nova_compute[183075]: 
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <host>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <uuid>814341d3-bd19-425d-8185-e66e96ccdc81</uuid>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <cpu>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <arch>x86_64</arch>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model>EPYC-Rome-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <vendor>AMD</vendor>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <microcode version='16777317'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <signature family='23' model='49' stepping='0'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <maxphysaddr mode='emulate' bits='40'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature name='x2apic'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature name='tsc-deadline'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature name='osxsave'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature name='hypervisor'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature name='tsc_adjust'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature name='spec-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature name='stibp'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature name='arch-capabilities'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature name='ssbd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature name='cmp_legacy'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature name='topoext'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature name='virt-ssbd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature name='lbrv'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature name='tsc-scale'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature name='vmcb-clean'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature name='pause-filter'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature name='pfthreshold'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature name='svme-addr-chk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature name='rdctl-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature name='skip-l1dfl-vmentry'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature name='mds-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature name='pschange-mc-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <pages unit='KiB' size='4'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <pages unit='KiB' size='2048'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <pages unit='KiB' size='1048576'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </cpu>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <power_management>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <suspend_mem/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <suspend_disk/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <suspend_hybrid/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </power_management>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <iommu support='no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <migration_features>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <live/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <uri_transports>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <uri_transport>tcp</uri_transport>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <uri_transport>rdma</uri_transport>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </uri_transports>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </migration_features>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <topology>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <cells num='1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <cell id='0'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:           <memory unit='KiB'>7864308</memory>
Jan 22 16:48:52 compute-0 nova_compute[183075]:           <pages unit='KiB' size='4'>1966077</pages>
Jan 22 16:48:52 compute-0 nova_compute[183075]:           <pages unit='KiB' size='2048'>0</pages>
Jan 22 16:48:52 compute-0 nova_compute[183075]:           <pages unit='KiB' size='1048576'>0</pages>
Jan 22 16:48:52 compute-0 nova_compute[183075]:           <distances>
Jan 22 16:48:52 compute-0 nova_compute[183075]:             <sibling id='0' value='10'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:           </distances>
Jan 22 16:48:52 compute-0 nova_compute[183075]:           <cpus num='8'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:           </cpus>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         </cell>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </cells>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </topology>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <cache>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </cache>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <secmodel>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model>selinux</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <doi>0</doi>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </secmodel>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <secmodel>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model>dac</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <doi>0</doi>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <baselabel type='kvm'>+107:+107</baselabel>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <baselabel type='qemu'>+107:+107</baselabel>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </secmodel>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   </host>
Jan 22 16:48:52 compute-0 nova_compute[183075]: 
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <guest>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <os_type>hvm</os_type>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <arch name='i686'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <wordsize>32</wordsize>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <domain type='qemu'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <domain type='kvm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </arch>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <features>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <pae/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <nonpae/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <acpi default='on' toggle='yes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <apic default='on' toggle='no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <cpuselection/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <deviceboot/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <disksnapshot default='on' toggle='no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <externalSnapshot/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </features>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   </guest>
Jan 22 16:48:52 compute-0 nova_compute[183075]: 
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <guest>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <os_type>hvm</os_type>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <arch name='x86_64'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <wordsize>64</wordsize>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <domain type='qemu'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <domain type='kvm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </arch>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <features>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <acpi default='on' toggle='yes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <apic default='on' toggle='no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <cpuselection/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <deviceboot/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <disksnapshot default='on' toggle='no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <externalSnapshot/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </features>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   </guest>
Jan 22 16:48:52 compute-0 nova_compute[183075]: 
Jan 22 16:48:52 compute-0 nova_compute[183075]: </capabilities>
Jan 22 16:48:52 compute-0 nova_compute[183075]: 
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.609 183079 DEBUG nova.virt.libvirt.host [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.613 183079 DEBUG nova.virt.libvirt.host [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 22 16:48:52 compute-0 nova_compute[183075]: <domainCapabilities>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <path>/usr/libexec/qemu-kvm</path>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <domain>kvm</domain>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <arch>i686</arch>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <vcpu max='240'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <iothreads supported='yes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <os supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <enum name='firmware'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <loader supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='type'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>rom</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>pflash</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='readonly'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>yes</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>no</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='secure'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>no</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </loader>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   </os>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <cpu>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <mode name='host-passthrough' supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='hostPassthroughMigratable'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>on</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>off</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </mode>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <mode name='maximum' supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='maximumMigratable'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>on</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>off</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </mode>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <mode name='host-model' supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <vendor>AMD</vendor>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='x2apic'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='tsc-deadline'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='hypervisor'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='tsc_adjust'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='spec-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='stibp'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='ssbd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='cmp_legacy'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='overflow-recov'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='succor'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='ibrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='amd-ssbd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='virt-ssbd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='lbrv'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='tsc-scale'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='vmcb-clean'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='flushbyasid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='pause-filter'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='pfthreshold'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='svme-addr-chk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='disable' name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </mode>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <mode name='custom' supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Broadwell'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Broadwell-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Broadwell-noTSX'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Broadwell-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Broadwell-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Broadwell-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Broadwell-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cascadelake-Server'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cascadelake-Server-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cascadelake-Server-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cascadelake-Server-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cascadelake-Server-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cascadelake-Server-v5'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='ClearwaterForest'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni-int16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bhi-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bhi-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cmpccxadd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ddpd-u'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='intel-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ipred-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='lam'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='prefetchiti'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rrsba-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sha512'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sm3'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sm4'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='ClearwaterForest-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni-int16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bhi-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bhi-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cmpccxadd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ddpd-u'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='intel-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ipred-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='lam'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='prefetchiti'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rrsba-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sha512'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sm3'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sm4'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cooperlake'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cooperlake-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cooperlake-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Denverton'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mpx'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Denverton-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mpx'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Denverton-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Denverton-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Dhyana-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Genoa'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amd-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='auto-ibrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='stibp-always-on'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Genoa-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amd-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='auto-ibrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='stibp-always-on'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Genoa-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amd-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='auto-ibrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fs-gs-base-ns'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='perfmon-v2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='stibp-always-on'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Milan'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Milan-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Milan-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amd-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='stibp-always-on'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Milan-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amd-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='stibp-always-on'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Rome'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Rome-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Rome-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Rome-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Turin'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amd-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='auto-ibrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vp2intersect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fs-gs-base-ns'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibpb-brtype'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='perfmon-v2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='prefetchi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbpb'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='srso-user-kernel-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='stibp-always-on'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Turin-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amd-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='auto-ibrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vp2intersect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fs-gs-base-ns'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibpb-brtype'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='perfmon-v2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='prefetchi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbpb'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='srso-user-kernel-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='stibp-always-on'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-v5'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='GraniteRapids'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='prefetchiti'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='GraniteRapids-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='prefetchiti'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='GraniteRapids-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx10'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx10-128'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx10-256'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx10-512'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='prefetchiti'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='GraniteRapids-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx10'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx10-128'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx10-256'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx10-512'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='prefetchiti'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Haswell'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Haswell-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Haswell-noTSX'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Haswell-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Haswell-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Haswell-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Haswell-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server-noTSX'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server-v5'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server-v6'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server-v7'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='IvyBridge'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='IvyBridge-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='IvyBridge-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='IvyBridge-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='KnightsMill'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-4fmaps'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-4vnniw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512er'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512pf'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='KnightsMill-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-4fmaps'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-4vnniw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512er'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512pf'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Opteron_G4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fma4'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xop'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Opteron_G4-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fma4'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xop'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Opteron_G5'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fma4'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tbm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xop'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Opteron_G5-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fma4'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tbm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xop'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SapphireRapids'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SapphireRapids-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SapphireRapids-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SapphireRapids-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SapphireRapids-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SierraForest'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cmpccxadd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SierraForest-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cmpccxadd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SierraForest-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bhi-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cmpccxadd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='intel-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ipred-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='lam'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rrsba-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SierraForest-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bhi-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cmpccxadd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='intel-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ipred-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='lam'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rrsba-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Client'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Client-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Client-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Client-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Client-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Client-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Server'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Server-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Server-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Server-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Server-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Server-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Server-v5'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Snowridge'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='core-capability'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mpx'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='split-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Snowridge-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='core-capability'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mpx'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='split-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Snowridge-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='core-capability'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='split-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Snowridge-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='core-capability'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='split-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Snowridge-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='athlon'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='3dnow'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='3dnowext'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='athlon-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='3dnow'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='3dnowext'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='core2duo'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='core2duo-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='coreduo'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='coreduo-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='n270'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='n270-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='phenom'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='3dnow'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='3dnowext'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='phenom-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='3dnow'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='3dnowext'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </mode>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   </cpu>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <memoryBacking supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <enum name='sourceType'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <value>file</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <value>anonymous</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <value>memfd</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   </memoryBacking>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <devices>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <disk supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='diskDevice'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>disk</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>cdrom</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>floppy</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>lun</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='bus'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>ide</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>fdc</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>scsi</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtio</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>usb</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>sata</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='model'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtio</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtio-transitional</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtio-non-transitional</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </disk>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <graphics supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='type'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>vnc</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>egl-headless</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>dbus</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </graphics>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <video supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='modelType'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>vga</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>cirrus</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtio</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>none</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>bochs</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>ramfb</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </video>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <hostdev supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='mode'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>subsystem</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='startupPolicy'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>default</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>mandatory</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>requisite</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>optional</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='subsysType'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>usb</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>pci</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>scsi</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='capsType'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='pciBackend'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </hostdev>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <rng supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='model'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtio</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtio-transitional</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtio-non-transitional</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='backendModel'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>random</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>egd</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>builtin</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </rng>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <filesystem supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='driverType'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>path</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>handle</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtiofs</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </filesystem>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <tpm supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='model'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>tpm-tis</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>tpm-crb</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='backendModel'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>emulator</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>external</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='backendVersion'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>2.0</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </tpm>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <redirdev supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='bus'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>usb</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </redirdev>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <channel supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='type'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>pty</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>unix</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </channel>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <crypto supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='model'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='type'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>qemu</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='backendModel'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>builtin</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </crypto>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <interface supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='backendType'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>default</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>passt</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </interface>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <panic supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='model'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>isa</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>hyperv</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </panic>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <console supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='type'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>null</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>vc</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>pty</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>dev</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>file</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>pipe</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>stdio</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>udp</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>tcp</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>unix</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>qemu-vdagent</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>dbus</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </console>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   </devices>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <features>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <gic supported='no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <vmcoreinfo supported='yes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <genid supported='yes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <backingStoreInput supported='yes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <backup supported='yes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <async-teardown supported='yes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <s390-pv supported='no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <ps2 supported='yes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <tdx supported='no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <sev supported='no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <sgx supported='no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <hyperv supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='features'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>relaxed</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>vapic</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>spinlocks</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>vpindex</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>runtime</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>synic</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>stimer</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>reset</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>vendor_id</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>frequencies</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>reenlightenment</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>tlbflush</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>ipi</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>avic</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>emsr_bitmap</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>xmm_input</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <defaults>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <spinlocks>4095</spinlocks>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <stimer_direct>on</stimer_direct>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <tlbflush_direct>on</tlbflush_direct>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <tlbflush_extended>on</tlbflush_extended>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </defaults>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </hyperv>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <launchSecurity supported='no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   </features>
Jan 22 16:48:52 compute-0 nova_compute[183075]: </domainCapabilities>
Jan 22 16:48:52 compute-0 nova_compute[183075]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.617 183079 DEBUG nova.virt.libvirt.volume.mount [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.620 183079 DEBUG nova.virt.libvirt.host [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 22 16:48:52 compute-0 nova_compute[183075]: <domainCapabilities>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <path>/usr/libexec/qemu-kvm</path>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <domain>kvm</domain>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <arch>i686</arch>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <vcpu max='4096'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <iothreads supported='yes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <os supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <enum name='firmware'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <loader supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='type'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>rom</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>pflash</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='readonly'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>yes</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>no</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='secure'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>no</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </loader>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   </os>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <cpu>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <mode name='host-passthrough' supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='hostPassthroughMigratable'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>on</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>off</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </mode>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <mode name='maximum' supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='maximumMigratable'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>on</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>off</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </mode>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <mode name='host-model' supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <vendor>AMD</vendor>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='x2apic'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='tsc-deadline'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='hypervisor'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='tsc_adjust'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='spec-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='stibp'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='ssbd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='cmp_legacy'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='overflow-recov'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='succor'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='ibrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='amd-ssbd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='virt-ssbd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='lbrv'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='tsc-scale'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='vmcb-clean'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='flushbyasid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='pause-filter'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='pfthreshold'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='svme-addr-chk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='disable' name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </mode>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <mode name='custom' supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Broadwell'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Broadwell-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Broadwell-noTSX'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Broadwell-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Broadwell-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Broadwell-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Broadwell-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cascadelake-Server'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cascadelake-Server-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cascadelake-Server-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cascadelake-Server-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cascadelake-Server-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cascadelake-Server-v5'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='ClearwaterForest'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni-int16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bhi-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bhi-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cmpccxadd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ddpd-u'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='intel-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ipred-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='lam'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='prefetchiti'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rrsba-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sha512'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sm3'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sm4'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='ClearwaterForest-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni-int16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bhi-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bhi-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cmpccxadd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ddpd-u'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='intel-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ipred-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='lam'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='prefetchiti'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rrsba-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sha512'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sm3'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sm4'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cooperlake'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cooperlake-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cooperlake-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Denverton'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mpx'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Denverton-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mpx'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Denverton-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Denverton-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Dhyana-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Genoa'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amd-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='auto-ibrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='stibp-always-on'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Genoa-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amd-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='auto-ibrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='stibp-always-on'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Genoa-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amd-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='auto-ibrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fs-gs-base-ns'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='perfmon-v2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='stibp-always-on'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Milan'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Milan-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Milan-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amd-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='stibp-always-on'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Milan-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amd-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='stibp-always-on'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Rome'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Rome-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Rome-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Rome-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Turin'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amd-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='auto-ibrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vp2intersect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fs-gs-base-ns'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibpb-brtype'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='perfmon-v2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='prefetchi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbpb'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='srso-user-kernel-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='stibp-always-on'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Turin-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amd-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='auto-ibrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vp2intersect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fs-gs-base-ns'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibpb-brtype'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='perfmon-v2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='prefetchi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbpb'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='srso-user-kernel-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='stibp-always-on'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-v5'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='GraniteRapids'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='prefetchiti'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='GraniteRapids-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='prefetchiti'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='GraniteRapids-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx10'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx10-128'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx10-256'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx10-512'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='prefetchiti'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='GraniteRapids-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx10'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx10-128'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx10-256'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx10-512'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='prefetchiti'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Haswell'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Haswell-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Haswell-noTSX'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Haswell-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Haswell-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Haswell-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Haswell-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server-noTSX'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server-v5'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server-v6'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server-v7'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='IvyBridge'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='IvyBridge-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='IvyBridge-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='IvyBridge-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='KnightsMill'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-4fmaps'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-4vnniw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512er'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512pf'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='KnightsMill-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-4fmaps'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-4vnniw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512er'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512pf'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Opteron_G4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fma4'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xop'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Opteron_G4-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fma4'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xop'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Opteron_G5'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fma4'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tbm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xop'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Opteron_G5-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fma4'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tbm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xop'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SapphireRapids'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SapphireRapids-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SapphireRapids-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SapphireRapids-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SapphireRapids-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SierraForest'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cmpccxadd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SierraForest-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cmpccxadd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SierraForest-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bhi-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cmpccxadd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='intel-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ipred-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='lam'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rrsba-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SierraForest-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bhi-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cmpccxadd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='intel-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ipred-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='lam'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rrsba-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Client'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Client-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Client-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Client-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Client-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Client-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Server'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Server-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Server-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Server-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Server-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Server-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Server-v5'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Snowridge'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='core-capability'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mpx'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='split-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Snowridge-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='core-capability'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mpx'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='split-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Snowridge-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='core-capability'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='split-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Snowridge-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='core-capability'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='split-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Snowridge-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='athlon'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='3dnow'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='3dnowext'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='athlon-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='3dnow'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='3dnowext'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='core2duo'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='core2duo-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='coreduo'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='coreduo-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='n270'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='n270-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='phenom'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='3dnow'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='3dnowext'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='phenom-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='3dnow'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='3dnowext'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </mode>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   </cpu>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <memoryBacking supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <enum name='sourceType'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <value>file</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <value>anonymous</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <value>memfd</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   </memoryBacking>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <devices>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <disk supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='diskDevice'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>disk</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>cdrom</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>floppy</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>lun</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='bus'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>fdc</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>scsi</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtio</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>usb</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>sata</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='model'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtio</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtio-transitional</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtio-non-transitional</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </disk>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <graphics supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='type'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>vnc</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>egl-headless</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>dbus</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </graphics>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <video supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='modelType'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>vga</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>cirrus</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtio</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>none</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>bochs</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>ramfb</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </video>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <hostdev supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='mode'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>subsystem</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='startupPolicy'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>default</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>mandatory</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>requisite</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>optional</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='subsysType'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>usb</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>pci</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>scsi</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='capsType'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='pciBackend'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </hostdev>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <rng supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='model'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtio</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtio-transitional</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtio-non-transitional</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='backendModel'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>random</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>egd</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>builtin</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </rng>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <filesystem supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='driverType'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>path</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>handle</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtiofs</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </filesystem>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <tpm supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='model'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>tpm-tis</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>tpm-crb</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='backendModel'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>emulator</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>external</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='backendVersion'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>2.0</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </tpm>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <redirdev supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='bus'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>usb</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </redirdev>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <channel supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='type'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>pty</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>unix</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </channel>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <crypto supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='model'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='type'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>qemu</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='backendModel'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>builtin</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </crypto>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <interface supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='backendType'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>default</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>passt</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </interface>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <panic supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='model'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>isa</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>hyperv</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </panic>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <console supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='type'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>null</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>vc</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>pty</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>dev</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>file</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>pipe</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>stdio</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>udp</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>tcp</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>unix</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>qemu-vdagent</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>dbus</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </console>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   </devices>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <features>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <gic supported='no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <vmcoreinfo supported='yes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <genid supported='yes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <backingStoreInput supported='yes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <backup supported='yes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <async-teardown supported='yes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <s390-pv supported='no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <ps2 supported='yes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <tdx supported='no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <sev supported='no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <sgx supported='no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <hyperv supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='features'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>relaxed</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>vapic</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>spinlocks</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>vpindex</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>runtime</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>synic</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>stimer</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>reset</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>vendor_id</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>frequencies</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>reenlightenment</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>tlbflush</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>ipi</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>avic</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>emsr_bitmap</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>xmm_input</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <defaults>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <spinlocks>4095</spinlocks>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <stimer_direct>on</stimer_direct>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <tlbflush_direct>on</tlbflush_direct>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <tlbflush_extended>on</tlbflush_extended>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </defaults>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </hyperv>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <launchSecurity supported='no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   </features>
Jan 22 16:48:52 compute-0 nova_compute[183075]: </domainCapabilities>
Jan 22 16:48:52 compute-0 nova_compute[183075]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.683 183079 DEBUG nova.virt.libvirt.host [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.687 183079 DEBUG nova.virt.libvirt.host [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 22 16:48:52 compute-0 nova_compute[183075]: <domainCapabilities>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <path>/usr/libexec/qemu-kvm</path>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <domain>kvm</domain>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <arch>x86_64</arch>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <vcpu max='240'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <iothreads supported='yes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <os supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <enum name='firmware'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <loader supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='type'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>rom</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>pflash</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='readonly'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>yes</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>no</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='secure'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>no</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </loader>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   </os>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <cpu>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <mode name='host-passthrough' supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='hostPassthroughMigratable'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>on</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>off</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </mode>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <mode name='maximum' supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='maximumMigratable'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>on</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>off</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </mode>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <mode name='host-model' supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <vendor>AMD</vendor>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='x2apic'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='tsc-deadline'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='hypervisor'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='tsc_adjust'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='spec-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='stibp'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='ssbd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='cmp_legacy'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='overflow-recov'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='succor'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='ibrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='amd-ssbd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='virt-ssbd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='lbrv'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='tsc-scale'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='vmcb-clean'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='flushbyasid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='pause-filter'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='pfthreshold'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='svme-addr-chk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='disable' name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </mode>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <mode name='custom' supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Broadwell'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Broadwell-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Broadwell-noTSX'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Broadwell-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Broadwell-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Broadwell-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Broadwell-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cascadelake-Server'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cascadelake-Server-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cascadelake-Server-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cascadelake-Server-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cascadelake-Server-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cascadelake-Server-v5'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='ClearwaterForest'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni-int16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bhi-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bhi-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cmpccxadd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ddpd-u'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='intel-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ipred-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='lam'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='prefetchiti'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rrsba-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sha512'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sm3'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sm4'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='ClearwaterForest-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni-int16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bhi-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bhi-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cmpccxadd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ddpd-u'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='intel-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ipred-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='lam'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='prefetchiti'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rrsba-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sha512'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sm3'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sm4'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cooperlake'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cooperlake-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cooperlake-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Denverton'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mpx'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Denverton-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mpx'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Denverton-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Denverton-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Dhyana-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Genoa'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amd-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='auto-ibrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='stibp-always-on'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Genoa-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amd-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='auto-ibrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='stibp-always-on'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Genoa-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amd-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='auto-ibrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fs-gs-base-ns'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='perfmon-v2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='stibp-always-on'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Milan'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Milan-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Milan-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amd-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='stibp-always-on'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Milan-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amd-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='stibp-always-on'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Rome'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Rome-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Rome-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Rome-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Turin'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amd-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='auto-ibrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vp2intersect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fs-gs-base-ns'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibpb-brtype'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='perfmon-v2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='prefetchi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbpb'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='srso-user-kernel-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='stibp-always-on'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Turin-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amd-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='auto-ibrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vp2intersect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fs-gs-base-ns'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibpb-brtype'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='perfmon-v2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='prefetchi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbpb'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='srso-user-kernel-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='stibp-always-on'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-v5'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='GraniteRapids'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='prefetchiti'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='GraniteRapids-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='prefetchiti'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='GraniteRapids-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx10'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx10-128'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx10-256'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx10-512'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='prefetchiti'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='GraniteRapids-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx10'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx10-128'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx10-256'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx10-512'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='prefetchiti'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Haswell'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Haswell-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Haswell-noTSX'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Haswell-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Haswell-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Haswell-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Haswell-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server-noTSX'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server-v5'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server-v6'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server-v7'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='IvyBridge'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='IvyBridge-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='IvyBridge-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='IvyBridge-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='KnightsMill'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-4fmaps'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-4vnniw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512er'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512pf'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='KnightsMill-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-4fmaps'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-4vnniw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512er'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512pf'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Opteron_G4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fma4'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xop'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Opteron_G4-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fma4'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xop'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Opteron_G5'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fma4'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tbm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xop'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Opteron_G5-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fma4'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tbm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xop'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SapphireRapids'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SapphireRapids-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SapphireRapids-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SapphireRapids-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SapphireRapids-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SierraForest'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cmpccxadd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SierraForest-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cmpccxadd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SierraForest-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bhi-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cmpccxadd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='intel-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ipred-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='lam'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rrsba-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SierraForest-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bhi-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cmpccxadd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='intel-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ipred-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='lam'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rrsba-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Client'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Client-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Client-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Client-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Client-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Client-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Server'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Server-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Server-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Server-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Server-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Server-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Server-v5'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Snowridge'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='core-capability'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mpx'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='split-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Snowridge-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='core-capability'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mpx'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='split-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Snowridge-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='core-capability'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='split-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Snowridge-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='core-capability'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='split-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Snowridge-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='athlon'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='3dnow'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='3dnowext'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='athlon-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='3dnow'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='3dnowext'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='core2duo'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='core2duo-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='coreduo'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='coreduo-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='n270'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='n270-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='phenom'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='3dnow'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='3dnowext'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='phenom-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='3dnow'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='3dnowext'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </mode>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   </cpu>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <memoryBacking supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <enum name='sourceType'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <value>file</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <value>anonymous</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <value>memfd</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   </memoryBacking>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <devices>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <disk supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='diskDevice'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>disk</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>cdrom</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>floppy</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>lun</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='bus'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>ide</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>fdc</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>scsi</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtio</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>usb</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>sata</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='model'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtio</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtio-transitional</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtio-non-transitional</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </disk>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <graphics supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='type'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>vnc</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>egl-headless</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>dbus</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </graphics>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <video supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='modelType'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>vga</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>cirrus</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtio</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>none</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>bochs</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>ramfb</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </video>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <hostdev supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='mode'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>subsystem</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='startupPolicy'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>default</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>mandatory</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>requisite</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>optional</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='subsysType'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>usb</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>pci</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>scsi</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='capsType'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='pciBackend'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </hostdev>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <rng supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='model'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtio</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtio-transitional</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtio-non-transitional</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='backendModel'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>random</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>egd</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>builtin</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </rng>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <filesystem supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='driverType'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>path</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>handle</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtiofs</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </filesystem>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <tpm supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='model'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>tpm-tis</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>tpm-crb</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='backendModel'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>emulator</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>external</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='backendVersion'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>2.0</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </tpm>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <redirdev supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='bus'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>usb</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </redirdev>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <channel supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='type'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>pty</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>unix</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </channel>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <crypto supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='model'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='type'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>qemu</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='backendModel'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>builtin</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </crypto>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <interface supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='backendType'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>default</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>passt</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </interface>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <panic supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='model'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>isa</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>hyperv</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </panic>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <console supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='type'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>null</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>vc</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>pty</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>dev</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>file</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>pipe</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>stdio</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>udp</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>tcp</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>unix</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>qemu-vdagent</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>dbus</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </console>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   </devices>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <features>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <gic supported='no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <vmcoreinfo supported='yes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <genid supported='yes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <backingStoreInput supported='yes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <backup supported='yes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <async-teardown supported='yes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <s390-pv supported='no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <ps2 supported='yes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <tdx supported='no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <sev supported='no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <sgx supported='no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <hyperv supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='features'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>relaxed</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>vapic</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>spinlocks</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>vpindex</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>runtime</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>synic</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>stimer</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>reset</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>vendor_id</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>frequencies</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>reenlightenment</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>tlbflush</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>ipi</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>avic</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>emsr_bitmap</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>xmm_input</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <defaults>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <spinlocks>4095</spinlocks>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <stimer_direct>on</stimer_direct>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <tlbflush_direct>on</tlbflush_direct>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <tlbflush_extended>on</tlbflush_extended>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </defaults>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </hyperv>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <launchSecurity supported='no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   </features>
Jan 22 16:48:52 compute-0 nova_compute[183075]: </domainCapabilities>
Jan 22 16:48:52 compute-0 nova_compute[183075]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.773 183079 DEBUG nova.virt.libvirt.host [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 22 16:48:52 compute-0 nova_compute[183075]: <domainCapabilities>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <path>/usr/libexec/qemu-kvm</path>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <domain>kvm</domain>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <arch>x86_64</arch>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <vcpu max='4096'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <iothreads supported='yes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <os supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <enum name='firmware'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <value>efi</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <loader supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='type'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>rom</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>pflash</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='readonly'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>yes</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>no</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='secure'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>yes</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>no</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </loader>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   </os>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <cpu>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <mode name='host-passthrough' supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='hostPassthroughMigratable'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>on</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>off</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </mode>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <mode name='maximum' supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='maximumMigratable'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>on</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>off</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </mode>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <mode name='host-model' supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <vendor>AMD</vendor>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='x2apic'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='tsc-deadline'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='hypervisor'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='tsc_adjust'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='spec-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='stibp'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='ssbd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='cmp_legacy'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='overflow-recov'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='succor'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='ibrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='amd-ssbd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='virt-ssbd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='lbrv'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='tsc-scale'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='vmcb-clean'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='flushbyasid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='pause-filter'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='pfthreshold'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='svme-addr-chk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <feature policy='disable' name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </mode>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <mode name='custom' supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Broadwell'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Broadwell-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Broadwell-noTSX'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Broadwell-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Broadwell-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Broadwell-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Broadwell-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cascadelake-Server'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cascadelake-Server-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cascadelake-Server-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cascadelake-Server-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cascadelake-Server-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cascadelake-Server-v5'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='ClearwaterForest'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni-int16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bhi-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bhi-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cmpccxadd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ddpd-u'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='intel-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ipred-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='lam'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='prefetchiti'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rrsba-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sha512'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sm3'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sm4'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='ClearwaterForest-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni-int16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bhi-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bhi-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cmpccxadd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ddpd-u'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='intel-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ipred-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='lam'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='prefetchiti'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rrsba-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sha512'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sm3'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sm4'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cooperlake'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cooperlake-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Cooperlake-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Denverton'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mpx'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Denverton-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mpx'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Denverton-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Denverton-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Dhyana-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Genoa'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amd-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='auto-ibrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='stibp-always-on'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Genoa-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amd-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='auto-ibrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='stibp-always-on'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Genoa-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amd-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='auto-ibrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fs-gs-base-ns'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='perfmon-v2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='stibp-always-on'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Milan'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Milan-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Milan-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amd-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='stibp-always-on'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Milan-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amd-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='stibp-always-on'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Rome'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Rome-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Rome-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Rome-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Turin'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amd-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='auto-ibrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vp2intersect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fs-gs-base-ns'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibpb-brtype'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='perfmon-v2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='prefetchi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbpb'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='srso-user-kernel-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='stibp-always-on'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-Turin-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amd-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='auto-ibrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vp2intersect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fs-gs-base-ns'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibpb-brtype'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='no-nested-data-bp'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='null-sel-clr-base'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='perfmon-v2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='prefetchi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbpb'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='srso-user-kernel-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='stibp-always-on'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='EPYC-v5'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='GraniteRapids'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='prefetchiti'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='GraniteRapids-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='prefetchiti'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='GraniteRapids-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx10'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx10-128'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx10-256'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx10-512'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='prefetchiti'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='GraniteRapids-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx10'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx10-128'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx10-256'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx10-512'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='prefetchiti'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Haswell'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Haswell-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Haswell-noTSX'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Haswell-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Haswell-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Haswell-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Haswell-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server-noTSX'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server-v5'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server-v6'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Icelake-Server-v7'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='IvyBridge'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='IvyBridge-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='IvyBridge-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='IvyBridge-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='KnightsMill'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-4fmaps'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-4vnniw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512er'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512pf'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='KnightsMill-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-4fmaps'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-4vnniw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512er'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512pf'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Opteron_G4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fma4'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xop'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Opteron_G4-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fma4'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xop'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Opteron_G5'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fma4'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tbm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xop'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Opteron_G5-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fma4'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tbm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xop'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SapphireRapids'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SapphireRapids-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SapphireRapids-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SapphireRapids-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SapphireRapids-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='amx-tile'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-bf16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-fp16'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512-vpopcntdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bitalg'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vbmi2'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrc'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fzrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='la57'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='taa-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='tsx-ldtrk'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SierraForest'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cmpccxadd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SierraForest-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cmpccxadd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SierraForest-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bhi-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cmpccxadd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='intel-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ipred-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='lam'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rrsba-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='SierraForest-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ifma'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-ne-convert'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx-vnni-int8'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bhi-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='bus-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cmpccxadd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fbsdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='fsrs'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ibrs-all'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='intel-psfd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ipred-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='lam'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mcdt-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pbrsb-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='psdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rrsba-ctrl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='sbdr-ssdp-no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='serialize'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vaes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='vpclmulqdq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Client'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Client-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Client-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Client-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Client-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Client-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Server'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Server-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Server-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Server-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='hle'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='rtm'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Server-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Server-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Skylake-Server-v5'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512bw'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512cd'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512dq'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512f'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='avx512vl'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='invpcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pcid'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='pku'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Snowridge'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='core-capability'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mpx'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='split-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Snowridge-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='core-capability'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='mpx'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='split-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Snowridge-v2'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='core-capability'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='split-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Snowridge-v3'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='core-capability'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='split-lock-detect'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='Snowridge-v4'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='cldemote'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='erms'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='gfni'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdir64b'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='movdiri'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='xsaves'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='athlon'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='3dnow'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='3dnowext'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='athlon-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='3dnow'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='3dnowext'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='core2duo'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='core2duo-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='coreduo'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='coreduo-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='n270'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='n270-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='ss'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='phenom'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='3dnow'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='3dnowext'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <blockers model='phenom-v1'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='3dnow'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <feature name='3dnowext'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </blockers>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </mode>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   </cpu>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <memoryBacking supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <enum name='sourceType'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <value>file</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <value>anonymous</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <value>memfd</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   </memoryBacking>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <devices>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <disk supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='diskDevice'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>disk</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>cdrom</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>floppy</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>lun</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='bus'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>fdc</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>scsi</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtio</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>usb</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>sata</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='model'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtio</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtio-transitional</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtio-non-transitional</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </disk>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <graphics supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='type'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>vnc</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>egl-headless</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>dbus</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </graphics>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <video supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='modelType'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>vga</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>cirrus</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtio</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>none</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>bochs</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>ramfb</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </video>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <hostdev supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='mode'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>subsystem</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='startupPolicy'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>default</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>mandatory</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>requisite</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>optional</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='subsysType'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>usb</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>pci</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>scsi</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='capsType'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='pciBackend'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </hostdev>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <rng supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='model'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtio</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtio-transitional</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtio-non-transitional</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='backendModel'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>random</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>egd</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>builtin</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </rng>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <filesystem supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='driverType'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>path</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>handle</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>virtiofs</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </filesystem>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <tpm supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='model'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>tpm-tis</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>tpm-crb</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='backendModel'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>emulator</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>external</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='backendVersion'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>2.0</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </tpm>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <redirdev supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='bus'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>usb</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </redirdev>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <channel supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='type'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>pty</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>unix</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </channel>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <crypto supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='model'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='type'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>qemu</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='backendModel'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>builtin</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </crypto>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <interface supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='backendType'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>default</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>passt</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </interface>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <panic supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='model'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>isa</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>hyperv</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </panic>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <console supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='type'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>null</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>vc</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>pty</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>dev</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>file</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>pipe</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>stdio</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>udp</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>tcp</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>unix</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>qemu-vdagent</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>dbus</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </console>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   </devices>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   <features>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <gic supported='no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <vmcoreinfo supported='yes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <genid supported='yes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <backingStoreInput supported='yes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <backup supported='yes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <async-teardown supported='yes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <s390-pv supported='no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <ps2 supported='yes'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <tdx supported='no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <sev supported='no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <sgx supported='no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <hyperv supported='yes'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <enum name='features'>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>relaxed</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>vapic</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>spinlocks</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>vpindex</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>runtime</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>synic</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>stimer</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>reset</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>vendor_id</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>frequencies</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>reenlightenment</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>tlbflush</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>ipi</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>avic</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>emsr_bitmap</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <value>xmm_input</value>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </enum>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       <defaults>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <spinlocks>4095</spinlocks>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <stimer_direct>on</stimer_direct>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <tlbflush_direct>on</tlbflush_direct>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <tlbflush_extended>on</tlbflush_extended>
Jan 22 16:48:52 compute-0 nova_compute[183075]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 22 16:48:52 compute-0 nova_compute[183075]:       </defaults>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     </hyperv>
Jan 22 16:48:52 compute-0 nova_compute[183075]:     <launchSecurity supported='no'/>
Jan 22 16:48:52 compute-0 nova_compute[183075]:   </features>
Jan 22 16:48:52 compute-0 nova_compute[183075]: </domainCapabilities>
Jan 22 16:48:52 compute-0 nova_compute[183075]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.863 183079 DEBUG nova.virt.libvirt.host [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.863 183079 DEBUG nova.virt.libvirt.host [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.863 183079 DEBUG nova.virt.libvirt.host [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.869 183079 INFO nova.virt.libvirt.host [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Secure Boot support detected
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.872 183079 INFO nova.virt.libvirt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.872 183079 INFO nova.virt.libvirt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.883 183079 DEBUG nova.virt.libvirt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.902 183079 INFO nova.virt.node [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Determined node identity 2513134c-f67c-4237-84bf-4ebe2450d610 from /var/lib/nova/compute_id
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.934 183079 WARNING nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Compute nodes ['2513134c-f67c-4237-84bf-4ebe2450d610'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.966 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.985 183079 WARNING nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.985 183079 DEBUG oslo_concurrency.lockutils [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.985 183079 DEBUG oslo_concurrency.lockutils [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.986 183079 DEBUG oslo_concurrency.lockutils [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:48:52 compute-0 nova_compute[183075]: 2026-01-22 16:48:52.986 183079 DEBUG nova.compute.resource_tracker [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 16:48:53 compute-0 nova_compute[183075]: 2026-01-22 16:48:53.158 183079 WARNING nova.virt.libvirt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 16:48:53 compute-0 nova_compute[183075]: 2026-01-22 16:48:53.159 183079 DEBUG nova.compute.resource_tracker [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6154MB free_disk=73.58840942382812GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 16:48:53 compute-0 nova_compute[183075]: 2026-01-22 16:48:53.159 183079 DEBUG oslo_concurrency.lockutils [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:48:53 compute-0 nova_compute[183075]: 2026-01-22 16:48:53.160 183079 DEBUG oslo_concurrency.lockutils [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:48:53 compute-0 nova_compute[183075]: 2026-01-22 16:48:53.175 183079 WARNING nova.compute.resource_tracker [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] No compute node record for compute-0.ctlplane.example.com:2513134c-f67c-4237-84bf-4ebe2450d610: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 2513134c-f67c-4237-84bf-4ebe2450d610 could not be found.
Jan 22 16:48:53 compute-0 nova_compute[183075]: 2026-01-22 16:48:53.193 183079 INFO nova.compute.resource_tracker [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: 2513134c-f67c-4237-84bf-4ebe2450d610
Jan 22 16:48:53 compute-0 nova_compute[183075]: 2026-01-22 16:48:53.248 183079 DEBUG nova.compute.resource_tracker [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 16:48:53 compute-0 nova_compute[183075]: 2026-01-22 16:48:53.249 183079 DEBUG nova.compute.resource_tracker [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 16:48:53 compute-0 rsyslogd[1006]: imjournal from <np0005592449:nova_compute>: begin to drop messages due to rate-limiting
Jan 22 16:48:54 compute-0 nova_compute[183075]: 2026-01-22 16:48:54.125 183079 INFO nova.scheduler.client.report [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [req-c4e41a8b-3174-44cc-a406-66ed816c1319] Created resource provider record via placement API for resource provider with UUID 2513134c-f67c-4237-84bf-4ebe2450d610 and name compute-0.ctlplane.example.com.
Jan 22 16:48:54 compute-0 nova_compute[183075]: 2026-01-22 16:48:54.487 183079 DEBUG nova.virt.libvirt.host [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Jan 22 16:48:54 compute-0 nova_compute[183075]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Jan 22 16:48:54 compute-0 nova_compute[183075]: 2026-01-22 16:48:54.487 183079 INFO nova.virt.libvirt.host [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] kernel doesn't support AMD SEV
Jan 22 16:48:54 compute-0 nova_compute[183075]: 2026-01-22 16:48:54.488 183079 DEBUG nova.compute.provider_tree [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Updating inventory in ProviderTree for provider 2513134c-f67c-4237-84bf-4ebe2450d610 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 16:48:54 compute-0 nova_compute[183075]: 2026-01-22 16:48:54.488 183079 DEBUG nova.virt.libvirt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 16:48:54 compute-0 nova_compute[183075]: 2026-01-22 16:48:54.548 183079 DEBUG nova.scheduler.client.report [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Updated inventory for provider 2513134c-f67c-4237-84bf-4ebe2450d610 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Jan 22 16:48:54 compute-0 nova_compute[183075]: 2026-01-22 16:48:54.548 183079 DEBUG nova.compute.provider_tree [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Updating resource provider 2513134c-f67c-4237-84bf-4ebe2450d610 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 22 16:48:54 compute-0 nova_compute[183075]: 2026-01-22 16:48:54.548 183079 DEBUG nova.compute.provider_tree [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Updating inventory in ProviderTree for provider 2513134c-f67c-4237-84bf-4ebe2450d610 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 16:48:54 compute-0 nova_compute[183075]: 2026-01-22 16:48:54.628 183079 DEBUG nova.compute.provider_tree [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Updating resource provider 2513134c-f67c-4237-84bf-4ebe2450d610 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 22 16:48:54 compute-0 nova_compute[183075]: 2026-01-22 16:48:54.650 183079 DEBUG nova.compute.resource_tracker [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 16:48:54 compute-0 nova_compute[183075]: 2026-01-22 16:48:54.650 183079 DEBUG oslo_concurrency.lockutils [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.490s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:48:54 compute-0 nova_compute[183075]: 2026-01-22 16:48:54.650 183079 DEBUG nova.service [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Jan 22 16:48:54 compute-0 nova_compute[183075]: 2026-01-22 16:48:54.742 183079 DEBUG nova.service [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Jan 22 16:48:54 compute-0 nova_compute[183075]: 2026-01-22 16:48:54.742 183079 DEBUG nova.servicegroup.drivers.db [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Jan 22 16:48:56 compute-0 sshd-session[183375]: Accepted publickey for zuul from 192.168.122.30 port 32852 ssh2: ECDSA SHA256:XN6iZwzcCiCbU0l3vCSWxZPd4ElPyx+ZEvhzj+S5SUw
Jan 22 16:48:56 compute-0 systemd-logind[796]: New session 26 of user zuul.
Jan 22 16:48:56 compute-0 systemd[1]: Started Session 26 of User zuul.
Jan 22 16:48:56 compute-0 sshd-session[183375]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 16:48:57 compute-0 python3.9[183528]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:48:59 compute-0 sudo[183682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwrowccxgtnannybuhdxtkuiqetmqtxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100538.4716582-31-20848374495217/AnsiballZ_systemd_service.py'
Jan 22 16:48:59 compute-0 sudo[183682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:48:59 compute-0 python3.9[183684]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 16:48:59 compute-0 systemd[1]: Reloading.
Jan 22 16:48:59 compute-0 systemd-rc-local-generator[183708]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:48:59 compute-0 systemd-sysv-generator[183714]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:48:59 compute-0 sudo[183682]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:00 compute-0 python3.9[183868]: ansible-ansible.builtin.service_facts Invoked
Jan 22 16:49:00 compute-0 network[183885]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 22 16:49:00 compute-0 network[183886]: 'network-scripts' will be removed from distribution in near future.
Jan 22 16:49:00 compute-0 network[183887]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 22 16:49:04 compute-0 sudo[184157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epmmznbgoigpdddtwvozhugxotezujzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100543.8920252-50-165583940965539/AnsiballZ_systemd_service.py'
Jan 22 16:49:04 compute-0 sudo[184157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:04 compute-0 python3.9[184159]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:49:04 compute-0 sudo[184157]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:05 compute-0 sudo[184310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkvtgefugedfkporpbxqggptsbjensbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100544.8112776-60-176411638604096/AnsiballZ_file.py'
Jan 22 16:49:05 compute-0 sudo[184310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:05 compute-0 python3.9[184312]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:49:05 compute-0 sudo[184310]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:05 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 16:49:05 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 16:49:06 compute-0 sudo[184463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qllklqlwarilvmaptwmscqwhyhhigifb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100545.6693463-68-59446347111315/AnsiballZ_file.py'
Jan 22 16:49:06 compute-0 sudo[184463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:06 compute-0 python3.9[184465]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:49:06 compute-0 sudo[184463]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:06 compute-0 sudo[184615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnjatdpculbqnzktyfokyswexgkddamp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100546.4569693-77-172578885008752/AnsiballZ_command.py'
Jan 22 16:49:06 compute-0 sudo[184615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:07 compute-0 python3.9[184617]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:49:07 compute-0 sudo[184615]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:08 compute-0 python3.9[184769]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 22 16:49:08 compute-0 sudo[184919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hiupnouqjhaakltddqjnvkwofimjsude ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100548.2493494-95-275183045753757/AnsiballZ_systemd_service.py'
Jan 22 16:49:08 compute-0 sudo[184919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:08 compute-0 python3.9[184921]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 16:49:08 compute-0 systemd[1]: Reloading.
Jan 22 16:49:09 compute-0 systemd-sysv-generator[184949]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:49:09 compute-0 systemd-rc-local-generator[184946]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:49:09 compute-0 sudo[184919]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:09 compute-0 sudo[185107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsngodxjcywxxsrrvixyuetpkfejgtxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100549.4159126-103-211814004170376/AnsiballZ_command.py'
Jan 22 16:49:09 compute-0 sudo[185107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:09 compute-0 python3.9[185109]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:49:10 compute-0 sudo[185107]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:10 compute-0 sudo[185260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhzndjyuvjmnprgdxyvgakbqjcuzssca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100550.3210912-112-275138088177045/AnsiballZ_file.py'
Jan 22 16:49:10 compute-0 sudo[185260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:10 compute-0 python3.9[185262]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:49:10 compute-0 sudo[185260]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:11 compute-0 python3.9[185412]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:49:12 compute-0 sudo[185564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyjbiejydlungkqndycpwyucwwydjnpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100551.975259-128-163245650072638/AnsiballZ_group.py'
Jan 22 16:49:12 compute-0 sudo[185564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:12 compute-0 python3.9[185566]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Jan 22 16:49:12 compute-0 sudo[185564]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:13 compute-0 sudo[185716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbrrnebfwcolqovinvrtwubuhbimtajr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100553.0241961-139-82767948702042/AnsiballZ_getent.py'
Jan 22 16:49:13 compute-0 sudo[185716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:13 compute-0 python3.9[185718]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Jan 22 16:49:13 compute-0 sudo[185716]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:14 compute-0 sudo[185869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utpetwwjzlfuwjgmqphhuuyystoomntn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100553.9106162-147-260038979930877/AnsiballZ_group.py'
Jan 22 16:49:14 compute-0 sudo[185869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:14 compute-0 python3.9[185871]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 22 16:49:14 compute-0 groupadd[185872]: group added to /etc/group: name=ceilometer, GID=42405
Jan 22 16:49:14 compute-0 groupadd[185872]: group added to /etc/gshadow: name=ceilometer
Jan 22 16:49:14 compute-0 groupadd[185872]: new group: name=ceilometer, GID=42405
Jan 22 16:49:14 compute-0 sudo[185869]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:15 compute-0 sudo[186027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkvlimgxdtzsjwxhzqnielavdqrtumwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100554.7716618-155-188921884519022/AnsiballZ_user.py'
Jan 22 16:49:15 compute-0 sudo[186027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:15 compute-0 python3.9[186029]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 22 16:49:15 compute-0 useradd[186031]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Jan 22 16:49:15 compute-0 useradd[186031]: add 'ceilometer' to group 'libvirt'
Jan 22 16:49:15 compute-0 useradd[186031]: add 'ceilometer' to shadow group 'libvirt'
Jan 22 16:49:15 compute-0 sudo[186027]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:16 compute-0 python3.9[186187]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:49:17 compute-0 python3.9[186308]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769100556.2641122-181-183780931765379/.source.conf _original_basename=ceilometer.conf follow=False checksum=806b21daa538a66a80669be8bf74c414d178dfbc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:49:17 compute-0 podman[186310]: 2026-01-22 16:49:17.681762301 +0000 UTC m=+0.112660774 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 16:49:17 compute-0 podman[186309]: 2026-01-22 16:49:17.734984955 +0000 UTC m=+0.164478420 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 16:49:18 compute-0 python3.9[186503]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:49:18 compute-0 nova_compute[183075]: 2026-01-22 16:49:18.745 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:49:18 compute-0 nova_compute[183075]: 2026-01-22 16:49:18.785 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:49:19 compute-0 python3.9[186624]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769100557.767574-181-146846334659610/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:49:19 compute-0 python3.9[186774]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:49:20 compute-0 python3.9[186895]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769100559.2769935-181-31384646446667/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:49:21 compute-0 python3.9[187045]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:49:21 compute-0 python3.9[187197]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:49:22 compute-0 python3.9[187349]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:49:23 compute-0 python3.9[187470]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769100562.0686045-240-43648311276049/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:49:23 compute-0 python3.9[187620]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:49:24 compute-0 python3.9[187741]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/openstack_network_exporter.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769100563.372789-240-50429141531703/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=87dede51a10e22722618c1900db75cb764463d91 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:49:25 compute-0 python3.9[187891]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:49:25 compute-0 python3.9[188012]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/firewall.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769100564.7953932-269-194330692057956/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:49:26 compute-0 python3.9[188162]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:49:27 compute-0 python3.9[188283]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769100566.1263876-285-180864664627514/.source.yaml _original_basename=node_exporter.yaml follow=False checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:49:28 compute-0 python3.9[188433]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:49:28 compute-0 python3.9[188554]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769100567.4493518-300-212645352912980/.source.yaml _original_basename=podman_exporter.yaml follow=False checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:49:29 compute-0 python3.9[188704]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:49:30 compute-0 python3.9[188825]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769100568.959021-315-23276220785483/.source.yaml _original_basename=ceilometer_prom_exporter.yaml follow=False checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:49:30 compute-0 sudo[188975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omynjpmtvekhaolpkzpnwgmvhumourkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100570.666689-330-9361089052595/AnsiballZ_file.py'
Jan 22 16:49:30 compute-0 sudo[188975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:31 compute-0 python3.9[188977]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:49:31 compute-0 sudo[188975]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:31 compute-0 sudo[189127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bujqphtyqyzbchozfmnekvfmmnchqnud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100571.4295967-338-278935085030653/AnsiballZ_file.py'
Jan 22 16:49:31 compute-0 sudo[189127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:31 compute-0 python3.9[189129]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:49:31 compute-0 sudo[189127]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:32 compute-0 python3.9[189279]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:49:33 compute-0 python3.9[189431]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:49:34 compute-0 python3.9[189583]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:49:34 compute-0 sudo[189735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xholdrsqojvlbrlrmmeiuhmnnbixfxmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100574.2444615-370-250609778837853/AnsiballZ_file.py'
Jan 22 16:49:34 compute-0 sudo[189735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:34 compute-0 python3.9[189737]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:49:34 compute-0 sudo[189735]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:35 compute-0 sudo[189887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tieifezqvuabbatkhadugdgbopkicnbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100575.111421-378-25415121055324/AnsiballZ_systemd_service.py'
Jan 22 16:49:35 compute-0 sudo[189887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:35 compute-0 python3.9[189889]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:49:35 compute-0 systemd[1]: Reloading.
Jan 22 16:49:35 compute-0 systemd-rc-local-generator[189918]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:49:35 compute-0 systemd-sysv-generator[189921]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:49:36 compute-0 systemd[1]: Listening on Podman API Socket.
Jan 22 16:49:36 compute-0 sudo[189887]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:36 compute-0 sudo[190078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvnuzixptlvyhtjhhynhfkrgmmhdyuyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100576.432298-387-131378183442338/AnsiballZ_stat.py'
Jan 22 16:49:36 compute-0 sudo[190078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:36 compute-0 python3.9[190080]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:49:37 compute-0 sudo[190078]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:37 compute-0 sudo[190201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bovumlnrwiywakoojcxcoxpiyufaqjhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100576.432298-387-131378183442338/AnsiballZ_copy.py'
Jan 22 16:49:37 compute-0 sudo[190201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:37 compute-0 python3.9[190203]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769100576.432298-387-131378183442338/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:49:37 compute-0 sudo[190201]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:37 compute-0 sudo[190277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnodnsymjkiemzshoxdinbccknkmpeyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100576.432298-387-131378183442338/AnsiballZ_stat.py'
Jan 22 16:49:37 compute-0 sudo[190277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:38 compute-0 python3.9[190279]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:49:38 compute-0 sudo[190277]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:38 compute-0 sudo[190400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugzjtlztnkogvnbdmkxzwtocvrhlymwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100576.432298-387-131378183442338/AnsiballZ_copy.py'
Jan 22 16:49:38 compute-0 sudo[190400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:38 compute-0 python3.9[190402]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769100576.432298-387-131378183442338/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:49:38 compute-0 sudo[190400]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:39 compute-0 sudo[190552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnjkqsbosjxkrvaovavfmbszurrkyooe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100579.4730577-419-190997760703936/AnsiballZ_file.py'
Jan 22 16:49:39 compute-0 sudo[190552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:40 compute-0 python3.9[190554]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:49:40 compute-0 sudo[190552]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:40 compute-0 sudo[190704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlqznrvhonxnmkphfdwxftqdxdqpqwqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100580.3227842-427-186844046932398/AnsiballZ_file.py'
Jan 22 16:49:40 compute-0 sudo[190704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:40 compute-0 python3.9[190706]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:49:40 compute-0 sudo[190704]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:41 compute-0 sudo[190856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwovkarndnwxbdakfmxxeybcrtiehmyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100581.2090068-435-235168749193463/AnsiballZ_stat.py'
Jan 22 16:49:41 compute-0 sudo[190856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:41 compute-0 python3.9[190858]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:49:41 compute-0 sudo[190856]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:49:41.903 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:49:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:49:41.904 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:49:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:49:41.904 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:49:42 compute-0 sudo[190979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhjqyrgupxpxjpjickmhidqtsyctbznt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100581.2090068-435-235168749193463/AnsiballZ_copy.py'
Jan 22 16:49:42 compute-0 sudo[190979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:42 compute-0 python3.9[190981]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769100581.2090068-435-235168749193463/.source.json _original_basename=.6pfp82z_ follow=False checksum=ce2b0c83293a970bafffa087afa083dd7c93a79c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:49:42 compute-0 sudo[190979]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:43 compute-0 python3.9[191131]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:49:45 compute-0 sudo[191552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqnudbtyjehzcfolouhgalcjlcepyfgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100585.1298838-475-270118762595243/AnsiballZ_container_config_data.py'
Jan 22 16:49:45 compute-0 sudo[191552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:45 compute-0 python3.9[191554]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_pattern=*.json debug=False
Jan 22 16:49:45 compute-0 sudo[191552]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:46 compute-0 sudo[191704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ronugoqtfhalzhjdqsdljuhqbrglbpan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100586.2390683-486-172585762672928/AnsiballZ_container_config_hash.py'
Jan 22 16:49:46 compute-0 sudo[191704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:46 compute-0 python3.9[191706]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 22 16:49:46 compute-0 sudo[191704]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:47 compute-0 sudo[191878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cepfrfobkhpglgqbvvqjbrkncofklqcp ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769100587.435902-496-147986195251627/AnsiballZ_edpm_container_manage.py'
Jan 22 16:49:47 compute-0 sudo[191878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:48 compute-0 podman[191831]: 2026-01-22 16:49:48.019776316 +0000 UTC m=+0.124934273 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 16:49:48 compute-0 podman[191830]: 2026-01-22 16:49:48.039501464 +0000 UTC m=+0.144515577 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 16:49:48 compute-0 python3[191889]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_id=ceilometer_agent_compute config_overrides={} config_patterns=*.json containers=['ceilometer_agent_compute'] log_base_path=/var/log/containers/stdouts debug=False
Jan 22 16:49:48 compute-0 podman[191941]: 2026-01-22 16:49:48.457254309 +0000 UTC m=+0.070995640 container create 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 22 16:49:48 compute-0 podman[191941]: 2026-01-22 16:49:48.414485925 +0000 UTC m=+0.028227336 image pull 806262ad9f61127734555408f71447afe6ceede79cc666e6f523dacd5edec739 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Jan 22 16:49:48 compute-0 python3[191889]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595 --healthcheck-command /openstack/healthcheck compute --label config_id=ceilometer_agent_compute --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Jan 22 16:49:48 compute-0 sudo[191878]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:49 compute-0 sudo[192128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lflxmdwdryjrgdqemeoxnsdeilkcxjfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100588.83469-504-216160910549225/AnsiballZ_stat.py'
Jan 22 16:49:49 compute-0 sudo[192128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:49 compute-0 python3.9[192130]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:49:49 compute-0 sudo[192128]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:50 compute-0 sudo[192282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nanyeiscslpvkszvssqorpidgezrolhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100589.6244147-513-142199562064471/AnsiballZ_file.py'
Jan 22 16:49:50 compute-0 sudo[192282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:50 compute-0 python3.9[192284]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:49:50 compute-0 sudo[192282]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:50 compute-0 sudo[192358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbdputprkbnfonblzpzfrhkymtudmvpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100589.6244147-513-142199562064471/AnsiballZ_stat.py'
Jan 22 16:49:50 compute-0 sudo[192358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:50 compute-0 python3.9[192360]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:49:50 compute-0 sudo[192358]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:51 compute-0 sudo[192509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgqoskhyslxgxocrasbojkqgcbcwyekv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100590.789663-513-100682655122148/AnsiballZ_copy.py'
Jan 22 16:49:51 compute-0 sudo[192509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:51 compute-0 python3.9[192511]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769100590.789663-513-100682655122148/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:49:51 compute-0 sudo[192509]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:51 compute-0 nova_compute[183075]: 2026-01-22 16:49:51.792 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:49:51 compute-0 nova_compute[183075]: 2026-01-22 16:49:51.793 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:49:51 compute-0 nova_compute[183075]: 2026-01-22 16:49:51.794 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 16:49:51 compute-0 nova_compute[183075]: 2026-01-22 16:49:51.794 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 16:49:51 compute-0 nova_compute[183075]: 2026-01-22 16:49:51.814 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 16:49:51 compute-0 nova_compute[183075]: 2026-01-22 16:49:51.815 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:49:51 compute-0 nova_compute[183075]: 2026-01-22 16:49:51.815 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:49:51 compute-0 nova_compute[183075]: 2026-01-22 16:49:51.816 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:49:51 compute-0 nova_compute[183075]: 2026-01-22 16:49:51.816 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:49:51 compute-0 nova_compute[183075]: 2026-01-22 16:49:51.816 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:49:51 compute-0 nova_compute[183075]: 2026-01-22 16:49:51.817 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:49:51 compute-0 nova_compute[183075]: 2026-01-22 16:49:51.817 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 16:49:51 compute-0 nova_compute[183075]: 2026-01-22 16:49:51.818 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:49:51 compute-0 nova_compute[183075]: 2026-01-22 16:49:51.854 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:49:51 compute-0 nova_compute[183075]: 2026-01-22 16:49:51.854 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:49:51 compute-0 nova_compute[183075]: 2026-01-22 16:49:51.855 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:49:51 compute-0 nova_compute[183075]: 2026-01-22 16:49:51.855 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 16:49:52 compute-0 nova_compute[183075]: 2026-01-22 16:49:52.088 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 16:49:52 compute-0 nova_compute[183075]: 2026-01-22 16:49:52.089 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6163MB free_disk=73.58566665649414GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 16:49:52 compute-0 nova_compute[183075]: 2026-01-22 16:49:52.089 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:49:52 compute-0 nova_compute[183075]: 2026-01-22 16:49:52.090 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:49:52 compute-0 nova_compute[183075]: 2026-01-22 16:49:52.255 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 16:49:52 compute-0 nova_compute[183075]: 2026-01-22 16:49:52.256 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 16:49:52 compute-0 nova_compute[183075]: 2026-01-22 16:49:52.287 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 16:49:52 compute-0 nova_compute[183075]: 2026-01-22 16:49:52.314 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 16:49:52 compute-0 nova_compute[183075]: 2026-01-22 16:49:52.317 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 16:49:52 compute-0 nova_compute[183075]: 2026-01-22 16:49:52.317 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:49:52 compute-0 sudo[192585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvvkdjhfcxczgdpnzgzbwzllfabivovz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100590.789663-513-100682655122148/AnsiballZ_systemd.py'
Jan 22 16:49:52 compute-0 sudo[192585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:52 compute-0 python3.9[192587]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 16:49:52 compute-0 systemd[1]: Reloading.
Jan 22 16:49:52 compute-0 systemd-sysv-generator[192611]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:49:52 compute-0 systemd-rc-local-generator[192606]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:49:52 compute-0 sudo[192585]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:53 compute-0 sudo[192696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijozfjargxyhmpizckjheihpduifzhci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100590.789663-513-100682655122148/AnsiballZ_systemd.py'
Jan 22 16:49:53 compute-0 sudo[192696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:53 compute-0 python3.9[192698]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:49:53 compute-0 systemd[1]: Reloading.
Jan 22 16:49:53 compute-0 systemd-rc-local-generator[192722]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:49:53 compute-0 systemd-sysv-generator[192726]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:49:53 compute-0 systemd[1]: Starting ceilometer_agent_compute container...
Jan 22 16:49:54 compute-0 systemd[1]: Started libcrun container.
Jan 22 16:49:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9da97b860b2c7bf02df40df398611998d08b9d338567b88a46d8b3517f496be/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Jan 22 16:49:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9da97b860b2c7bf02df40df398611998d08b9d338567b88a46d8b3517f496be/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 22 16:49:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9da97b860b2c7bf02df40df398611998d08b9d338567b88a46d8b3517f496be/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Jan 22 16:49:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9da97b860b2c7bf02df40df398611998d08b9d338567b88a46d8b3517f496be/merged/var/lib/kolla/config_files/src supports timestamps until 2038 (0x7fffffff)
Jan 22 16:49:54 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a.
Jan 22 16:49:54 compute-0 podman[192738]: 2026-01-22 16:49:54.112601386 +0000 UTC m=+0.148280808 container init 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 16:49:54 compute-0 ceilometer_agent_compute[192753]: + sudo -E kolla_set_configs
Jan 22 16:49:54 compute-0 ceilometer_agent_compute[192753]: sudo: unable to send audit message: Operation not permitted
Jan 22 16:49:54 compute-0 sudo[192759]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Jan 22 16:49:54 compute-0 sudo[192759]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Jan 22 16:49:54 compute-0 sudo[192759]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Jan 22 16:49:54 compute-0 podman[192738]: 2026-01-22 16:49:54.155213346 +0000 UTC m=+0.190892748 container start 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 16:49:54 compute-0 podman[192738]: ceilometer_agent_compute
Jan 22 16:49:54 compute-0 systemd[1]: Started ceilometer_agent_compute container.
Jan 22 16:49:54 compute-0 sudo[192696]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:54 compute-0 ceilometer_agent_compute[192753]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 22 16:49:54 compute-0 ceilometer_agent_compute[192753]: INFO:__main__:Validating config file
Jan 22 16:49:54 compute-0 ceilometer_agent_compute[192753]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 22 16:49:54 compute-0 ceilometer_agent_compute[192753]: INFO:__main__:Copying service configuration files
Jan 22 16:49:54 compute-0 ceilometer_agent_compute[192753]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Jan 22 16:49:54 compute-0 ceilometer_agent_compute[192753]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Jan 22 16:49:54 compute-0 ceilometer_agent_compute[192753]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Jan 22 16:49:54 compute-0 ceilometer_agent_compute[192753]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Jan 22 16:49:54 compute-0 ceilometer_agent_compute[192753]: INFO:__main__:Copying /var/lib/kolla/config_files/src/polling.yaml to /etc/ceilometer/polling.yaml
Jan 22 16:49:54 compute-0 ceilometer_agent_compute[192753]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Jan 22 16:49:54 compute-0 ceilometer_agent_compute[192753]: INFO:__main__:Copying /var/lib/kolla/config_files/src/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Jan 22 16:49:54 compute-0 ceilometer_agent_compute[192753]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Jan 22 16:49:54 compute-0 ceilometer_agent_compute[192753]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Jan 22 16:49:54 compute-0 ceilometer_agent_compute[192753]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Jan 22 16:49:54 compute-0 ceilometer_agent_compute[192753]: INFO:__main__:Writing out command to execute
Jan 22 16:49:54 compute-0 sudo[192759]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:54 compute-0 ceilometer_agent_compute[192753]: ++ cat /run_command
Jan 22 16:49:54 compute-0 ceilometer_agent_compute[192753]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Jan 22 16:49:54 compute-0 ceilometer_agent_compute[192753]: + ARGS=
Jan 22 16:49:54 compute-0 ceilometer_agent_compute[192753]: + sudo kolla_copy_cacerts
Jan 22 16:49:54 compute-0 ceilometer_agent_compute[192753]: sudo: unable to send audit message: Operation not permitted
Jan 22 16:49:54 compute-0 sudo[192776]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Jan 22 16:49:54 compute-0 sudo[192776]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Jan 22 16:49:54 compute-0 podman[192760]: 2026-01-22 16:49:54.2454833 +0000 UTC m=+0.068346759 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute)
Jan 22 16:49:54 compute-0 sudo[192776]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Jan 22 16:49:54 compute-0 sudo[192776]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:54 compute-0 ceilometer_agent_compute[192753]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Jan 22 16:49:54 compute-0 ceilometer_agent_compute[192753]: + [[ ! -n '' ]]
Jan 22 16:49:54 compute-0 ceilometer_agent_compute[192753]: + . kolla_extend_start
Jan 22 16:49:54 compute-0 ceilometer_agent_compute[192753]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Jan 22 16:49:54 compute-0 ceilometer_agent_compute[192753]: + umask 0022
Jan 22 16:49:54 compute-0 ceilometer_agent_compute[192753]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Jan 22 16:49:54 compute-0 systemd[1]: 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a-374c3b43a6ba2f8a.service: Main process exited, code=exited, status=1/FAILURE
Jan 22 16:49:54 compute-0 systemd[1]: 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a-374c3b43a6ba2f8a.service: Failed with result 'exit-code'.
Jan 22 16:49:55 compute-0 python3.9[192933]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.155 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.156 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.156 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.156 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.156 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.156 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.157 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.157 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.157 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.157 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.157 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.158 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.158 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.158 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.158 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.158 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.159 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.159 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.159 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.159 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.159 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.160 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.160 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.160 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.160 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.160 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.160 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.161 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.161 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.161 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.161 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.161 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.161 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.162 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.162 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.162 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.162 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.162 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.163 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.163 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.163 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.163 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.163 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.164 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.164 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.164 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.164 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.164 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.165 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.165 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.165 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.165 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.165 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.165 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.166 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.166 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.166 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.166 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.166 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.167 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.167 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.167 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.167 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.167 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.168 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.168 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.168 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.168 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.168 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.169 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.169 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.169 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.169 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.169 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.170 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.170 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.170 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.170 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.170 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.171 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.171 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.171 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.171 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.171 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.171 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.172 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.172 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.172 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.172 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.172 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.173 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.173 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.173 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.173 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.173 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.174 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.174 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.174 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.174 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.174 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.175 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.175 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.175 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.175 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.175 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.176 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.176 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.176 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.176 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.176 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.177 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.177 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.177 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.177 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.177 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.178 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.178 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.178 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.178 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.178 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.179 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.179 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.179 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.179 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.179 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.180 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.180 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.180 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.180 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.180 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.181 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.181 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.181 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.181 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.181 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.182 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.182 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.182 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.182 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.183 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.183 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.183 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.183 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.183 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.183 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.183 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.183 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.184 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.184 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.184 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.184 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.184 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.184 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.184 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.184 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.184 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.184 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.184 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.185 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.185 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.185 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.185 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.185 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.185 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.185 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.206 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.209 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.211 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.322 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.404 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.404 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.404 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.404 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.404 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.404 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.405 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.405 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.405 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.405 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.405 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.405 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.405 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.405 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.405 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.405 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.406 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.406 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.406 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.406 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.406 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.406 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.406 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.406 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.406 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.406 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.406 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.407 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.407 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.407 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.407 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.407 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.407 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.407 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.407 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.407 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.407 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.407 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.407 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.408 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.408 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.408 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.408 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.408 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.408 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.408 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.408 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.408 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.408 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.409 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.409 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.409 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.409 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.409 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.409 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.409 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.409 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.409 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.410 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.410 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.410 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.410 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.410 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.410 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.410 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.410 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.410 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.410 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.410 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.411 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.411 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.411 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.411 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.411 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.411 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.411 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.411 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.411 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.411 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.411 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.412 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.412 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.412 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.412 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.412 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.412 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.412 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.412 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.412 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.412 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.412 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.413 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.413 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.413 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.413 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.413 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.413 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.413 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.413 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.413 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.413 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.413 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.414 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.414 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.414 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.414 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.414 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.414 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.414 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.414 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.414 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.414 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.415 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.415 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.415 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.415 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.415 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.415 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.415 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.415 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.416 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.416 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.416 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.416 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.416 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.416 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.416 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.416 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.416 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.416 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.417 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.417 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.417 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.417 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.417 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.417 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.417 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.417 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.417 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.417 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.418 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.418 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.418 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.418 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.418 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.418 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.418 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.418 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.418 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.418 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.419 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.419 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.419 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.419 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.419 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.419 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.419 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.419 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.419 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.419 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.419 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.420 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.420 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.420 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.420 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.420 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.420 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.420 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.420 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.420 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.420 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.421 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.421 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.421 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.421 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.421 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.421 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.421 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.421 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.421 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.422 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.422 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.422 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.422 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.422 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.422 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.422 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.422 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.422 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.423 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.423 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.423 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.423 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.423 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.423 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.423 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.423 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.424 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.424 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.424 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.424 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.424 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.424 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.424 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.424 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.424 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.425 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.425 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.425 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.425 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.425 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.425 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.425 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.425 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.425 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.425 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.425 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.426 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.426 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.430 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.439 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.444 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.444 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.444 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.444 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.444 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.444 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.445 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.445 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.445 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.445 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.445 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.445 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.446 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.446 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.446 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.446 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.446 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.446 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:49:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:49:55 compute-0 sudo[193089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnmpjtnebqjzoyldykriduveykkfpjny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100595.54278-558-162246828044494/AnsiballZ_stat.py'
Jan 22 16:49:55 compute-0 sudo[193089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:56 compute-0 python3.9[193091]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:49:56 compute-0 sudo[193089]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:56 compute-0 sudo[193214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mklwgawpwebcipdwyorbqfwbqvqwmzsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100595.54278-558-162246828044494/AnsiballZ_copy.py'
Jan 22 16:49:56 compute-0 sudo[193214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:56 compute-0 python3.9[193216]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769100595.54278-558-162246828044494/.source.yaml _original_basename=.oqbk0_zq follow=False checksum=63ff485ac16b1b321792d64a574c68deec0ea400 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:49:56 compute-0 sudo[193214]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:57 compute-0 sudo[193366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kowbgysqyiisgmonmpztmtfukmduasva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100596.96468-573-63076119767054/AnsiballZ_stat.py'
Jan 22 16:49:57 compute-0 sudo[193366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:57 compute-0 python3.9[193368]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:49:57 compute-0 sudo[193366]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:57 compute-0 sudo[193489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyhwvfsxnnibbpuairgohktmifleqwdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100596.96468-573-63076119767054/AnsiballZ_copy.py'
Jan 22 16:49:57 compute-0 sudo[193489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:58 compute-0 python3.9[193491]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769100596.96468-573-63076119767054/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:49:58 compute-0 sudo[193489]: pam_unix(sudo:session): session closed for user root
Jan 22 16:49:59 compute-0 sudo[193641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfidxvjbwwpwlueuvgoxipqhiwebrtmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100598.8850224-594-17629318446710/AnsiballZ_file.py'
Jan 22 16:49:59 compute-0 sudo[193641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:49:59 compute-0 python3.9[193643]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:49:59 compute-0 sudo[193641]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:00 compute-0 sudo[193793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsdgobjfxdikrquzzgaovqbriytmhreq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100599.6899433-602-159152218805884/AnsiballZ_file.py'
Jan 22 16:50:00 compute-0 sudo[193793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:00 compute-0 python3.9[193795]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:50:00 compute-0 sudo[193793]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:00 compute-0 sudo[193945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iurvjnacwxgnzrcwdqxbwbwiboscuzur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100600.5568933-610-127455148026701/AnsiballZ_stat.py'
Jan 22 16:50:00 compute-0 sudo[193945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:01 compute-0 python3.9[193947]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:50:01 compute-0 sudo[193945]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:01 compute-0 sudo[194023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmktgynjqpidbjqjzegxmdultqqzplrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100600.5568933-610-127455148026701/AnsiballZ_file.py'
Jan 22 16:50:01 compute-0 sudo[194023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:01 compute-0 python3.9[194025]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.2knhgcf0 recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:50:01 compute-0 sudo[194023]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:02 compute-0 python3.9[194175]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/node_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:50:04 compute-0 sudo[194596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjcwsgtkzaagjuyqdwfiqbijunzxyvps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100604.1993225-647-101227878701210/AnsiballZ_container_config_data.py'
Jan 22 16:50:04 compute-0 sudo[194596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:04 compute-0 python3.9[194598]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/node_exporter config_pattern=*.json debug=False
Jan 22 16:50:04 compute-0 sudo[194596]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:05 compute-0 sudo[194748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mprzjmghlcytnufuznehcktnpgbdyczx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100605.1574879-658-92611309741831/AnsiballZ_container_config_hash.py'
Jan 22 16:50:05 compute-0 sudo[194748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:05 compute-0 python3.9[194750]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 22 16:50:05 compute-0 sudo[194748]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:06 compute-0 sudo[194900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqmtwrldrlnibyyfwluxtegpekumvfnp ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769100606.1166158-668-64356043459632/AnsiballZ_edpm_container_manage.py'
Jan 22 16:50:06 compute-0 sudo[194900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:06 compute-0 python3[194902]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/node_exporter config_id=node_exporter config_overrides={} config_patterns=*.json containers=['node_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 22 16:50:06 compute-0 podman[194936]: 2026-01-22 16:50:06.977444701 +0000 UTC m=+0.078061823 container create 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=node_exporter, container_name=node_exporter)
Jan 22 16:50:06 compute-0 podman[194936]: 2026-01-22 16:50:06.937601151 +0000 UTC m=+0.038218323 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Jan 22 16:50:06 compute-0 python3[194902]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595 --healthcheck-command /openstack/healthcheck node_exporter --label config_id=node_exporter --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Jan 22 16:50:07 compute-0 sudo[194900]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:07 compute-0 sudo[195124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fahpchqyzfjmotapaxdpzxziuynpgdih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100607.367748-676-59819353087361/AnsiballZ_stat.py'
Jan 22 16:50:07 compute-0 sudo[195124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:07 compute-0 python3.9[195126]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:50:07 compute-0 sudo[195124]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:08 compute-0 sudo[195278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifgvjclszitnqgugdscjvcxaspqkdyka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100608.2444835-685-30282130902423/AnsiballZ_file.py'
Jan 22 16:50:08 compute-0 sudo[195278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:08 compute-0 python3.9[195280]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:50:08 compute-0 sudo[195278]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:08 compute-0 sudo[195354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etrxejzoclwbhomvzsgpbewdvykfjzzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100608.2444835-685-30282130902423/AnsiballZ_stat.py'
Jan 22 16:50:08 compute-0 sudo[195354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:09 compute-0 python3.9[195356]: ansible-stat Invoked with path=/etc/systemd/system/edpm_node_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:50:09 compute-0 sudo[195354]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:09 compute-0 sudo[195505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nanryjweewhctedlefmzlbedwifsjnel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100609.2531238-685-44368071866725/AnsiballZ_copy.py'
Jan 22 16:50:09 compute-0 sudo[195505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:09 compute-0 python3.9[195507]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769100609.2531238-685-44368071866725/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:50:10 compute-0 sudo[195505]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:10 compute-0 sudo[195581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvdnzlrdxojbmwwsrnotvlyvczmbpiyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100609.2531238-685-44368071866725/AnsiballZ_systemd.py'
Jan 22 16:50:10 compute-0 sudo[195581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:10 compute-0 python3.9[195583]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 16:50:10 compute-0 systemd[1]: Reloading.
Jan 22 16:50:10 compute-0 systemd-rc-local-generator[195611]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:50:10 compute-0 systemd-sysv-generator[195616]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:50:10 compute-0 sudo[195581]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:11 compute-0 sudo[195693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqlpsdzgpigjuzfjdtwohfaiydgpnpjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100609.2531238-685-44368071866725/AnsiballZ_systemd.py'
Jan 22 16:50:11 compute-0 sudo[195693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:11 compute-0 python3.9[195695]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:50:11 compute-0 systemd[1]: Reloading.
Jan 22 16:50:11 compute-0 systemd-sysv-generator[195730]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:50:11 compute-0 systemd-rc-local-generator[195727]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:50:11 compute-0 systemd[1]: Starting node_exporter container...
Jan 22 16:50:11 compute-0 systemd[1]: Started libcrun container.
Jan 22 16:50:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a8a639d01e0d359e45e6cde66b1d18865caf42b02ab91d5e3cd747641b20a1b/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 22 16:50:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a8a639d01e0d359e45e6cde66b1d18865caf42b02ab91d5e3cd747641b20a1b/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 22 16:50:11 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75.
Jan 22 16:50:11 compute-0 podman[195736]: 2026-01-22 16:50:11.870483034 +0000 UTC m=+0.147093525 container init 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.891Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.891Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.891Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.892Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.892Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.893Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.893Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.893Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.893Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.894Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.894Z caller=node_exporter.go:117 level=info collector=arp
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.894Z caller=node_exporter.go:117 level=info collector=bcache
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.894Z caller=node_exporter.go:117 level=info collector=bonding
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.894Z caller=node_exporter.go:117 level=info collector=btrfs
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.894Z caller=node_exporter.go:117 level=info collector=conntrack
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.894Z caller=node_exporter.go:117 level=info collector=cpu
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.894Z caller=node_exporter.go:117 level=info collector=cpufreq
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.894Z caller=node_exporter.go:117 level=info collector=diskstats
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.894Z caller=node_exporter.go:117 level=info collector=edac
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.894Z caller=node_exporter.go:117 level=info collector=fibrechannel
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.894Z caller=node_exporter.go:117 level=info collector=filefd
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.894Z caller=node_exporter.go:117 level=info collector=filesystem
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.894Z caller=node_exporter.go:117 level=info collector=infiniband
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.894Z caller=node_exporter.go:117 level=info collector=ipvs
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.894Z caller=node_exporter.go:117 level=info collector=loadavg
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.894Z caller=node_exporter.go:117 level=info collector=mdadm
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.894Z caller=node_exporter.go:117 level=info collector=meminfo
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.894Z caller=node_exporter.go:117 level=info collector=netclass
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.894Z caller=node_exporter.go:117 level=info collector=netdev
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.894Z caller=node_exporter.go:117 level=info collector=netstat
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.894Z caller=node_exporter.go:117 level=info collector=nfs
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.894Z caller=node_exporter.go:117 level=info collector=nfsd
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.894Z caller=node_exporter.go:117 level=info collector=nvme
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.894Z caller=node_exporter.go:117 level=info collector=schedstat
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.894Z caller=node_exporter.go:117 level=info collector=sockstat
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.894Z caller=node_exporter.go:117 level=info collector=softnet
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.894Z caller=node_exporter.go:117 level=info collector=systemd
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.894Z caller=node_exporter.go:117 level=info collector=tapestats
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.894Z caller=node_exporter.go:117 level=info collector=udp_queues
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.894Z caller=node_exporter.go:117 level=info collector=vmstat
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.894Z caller=node_exporter.go:117 level=info collector=xfs
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.894Z caller=node_exporter.go:117 level=info collector=zfs
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.896Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Jan 22 16:50:11 compute-0 node_exporter[195752]: ts=2026-01-22T16:50:11.896Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Jan 22 16:50:11 compute-0 podman[195736]: 2026-01-22 16:50:11.905442551 +0000 UTC m=+0.182053062 container start 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 16:50:11 compute-0 podman[195736]: node_exporter
Jan 22 16:50:11 compute-0 systemd[1]: Started node_exporter container.
Jan 22 16:50:11 compute-0 sudo[195693]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:12 compute-0 podman[195761]: 2026-01-22 16:50:12.003097184 +0000 UTC m=+0.078492252 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 16:50:12 compute-0 python3.9[195935]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 22 16:50:13 compute-0 sudo[196085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvfzabhestosclpacecclofstfpepsss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100613.4150445-730-192336008775443/AnsiballZ_stat.py'
Jan 22 16:50:13 compute-0 sudo[196085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:13 compute-0 python3.9[196087]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:50:14 compute-0 sudo[196085]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:14 compute-0 sudo[196210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irpgdwozveikynxkpfvkcdyskpmlyvwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100613.4150445-730-192336008775443/AnsiballZ_copy.py'
Jan 22 16:50:14 compute-0 sudo[196210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:14 compute-0 python3.9[196212]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769100613.4150445-730-192336008775443/.source.yaml _original_basename=.k4h785gi follow=False checksum=276c36708c4b8c8882a7fdefd57777e41d9025e1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:50:14 compute-0 sudo[196210]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:15 compute-0 sudo[196362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udyhdufbvprzjxofzvqkapabwxteijyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100614.7900434-745-63914561006080/AnsiballZ_stat.py'
Jan 22 16:50:15 compute-0 sudo[196362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:15 compute-0 python3.9[196364]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:50:15 compute-0 sudo[196362]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:15 compute-0 sudo[196485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uledehahuotvbqesipkbogeevryitvpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100614.7900434-745-63914561006080/AnsiballZ_copy.py'
Jan 22 16:50:15 compute-0 sudo[196485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:15 compute-0 python3.9[196487]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769100614.7900434-745-63914561006080/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:50:15 compute-0 sudo[196485]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:16 compute-0 sudo[196637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyklvbnjomeikspavuvgeparygkwlcqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100616.5473533-766-115838358578066/AnsiballZ_file.py'
Jan 22 16:50:16 compute-0 sudo[196637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:17 compute-0 python3.9[196639]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:50:17 compute-0 sudo[196637]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:17 compute-0 sudo[196789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zugfyeivaijxcpgvurqnqyuxxzxwwqmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100617.3587933-774-52421346750431/AnsiballZ_file.py'
Jan 22 16:50:17 compute-0 sudo[196789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:17 compute-0 python3.9[196791]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:50:17 compute-0 sudo[196789]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:18 compute-0 podman[196892]: 2026-01-22 16:50:18.372410313 +0000 UTC m=+0.054372906 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 16:50:18 compute-0 podman[196873]: 2026-01-22 16:50:18.391206786 +0000 UTC m=+0.087657753 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 16:50:18 compute-0 sudo[196986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stknscygmoheipcurmopkgpwmgoafdpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100618.165248-782-248283686270077/AnsiballZ_stat.py'
Jan 22 16:50:18 compute-0 sudo[196986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:18 compute-0 python3.9[196988]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:50:18 compute-0 sudo[196986]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:18 compute-0 sudo[197064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iijywquuhnqcefeesfiuyjvawuriicas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100618.165248-782-248283686270077/AnsiballZ_file.py'
Jan 22 16:50:18 compute-0 sudo[197064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:19 compute-0 python3.9[197066]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.rid8rfl1 recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:50:19 compute-0 sudo[197064]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:19 compute-0 python3.9[197216]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/podman_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:50:21 compute-0 sudo[197637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amfobzbexybdochqluyvknayofrobsce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100621.554453-819-56894044256303/AnsiballZ_container_config_data.py'
Jan 22 16:50:21 compute-0 sudo[197637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:22 compute-0 python3.9[197639]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/podman_exporter config_pattern=*.json debug=False
Jan 22 16:50:22 compute-0 sudo[197637]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:22 compute-0 sudo[197789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crzycsbqpxjqfitseypuewrwrnrxbixe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100622.440626-830-204767397125013/AnsiballZ_container_config_hash.py'
Jan 22 16:50:22 compute-0 sudo[197789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:22 compute-0 python3.9[197791]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 22 16:50:22 compute-0 sudo[197789]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:23 compute-0 sudo[197941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rttawtnfwduvaqjntlrrjqyemnwpimah ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769100623.3009846-840-68777015481269/AnsiballZ_edpm_container_manage.py'
Jan 22 16:50:23 compute-0 sudo[197941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:23 compute-0 python3[197943]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/podman_exporter config_id=podman_exporter config_overrides={} config_patterns=*.json containers=['podman_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 22 16:50:24 compute-0 podman[197969]: 2026-01-22 16:50:24.372075981 +0000 UTC m=+0.073917977 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 16:50:24 compute-0 systemd[1]: 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a-374c3b43a6ba2f8a.service: Main process exited, code=exited, status=1/FAILURE
Jan 22 16:50:24 compute-0 systemd[1]: 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a-374c3b43a6ba2f8a.service: Failed with result 'exit-code'.
Jan 22 16:50:25 compute-0 podman[197956]: 2026-01-22 16:50:25.241771469 +0000 UTC m=+1.178774461 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Jan 22 16:50:25 compute-0 podman[198070]: 2026-01-22 16:50:25.366345223 +0000 UTC m=+0.046105575 container create 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=podman_exporter, container_name=podman_exporter, managed_by=edpm_ansible)
Jan 22 16:50:25 compute-0 podman[198070]: 2026-01-22 16:50:25.34193391 +0000 UTC m=+0.021694242 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Jan 22 16:50:25 compute-0 python3[197943]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env CONTAINER_HOST=unix:///run/podman/podman.sock --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595 --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=podman_exporter --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Jan 22 16:50:25 compute-0 sudo[197941]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:25 compute-0 sudo[198258]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghauyilhpeajxkbciwudflqcdcvwdfjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100625.6538932-848-206825929322517/AnsiballZ_stat.py'
Jan 22 16:50:25 compute-0 sudo[198258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:26 compute-0 python3.9[198260]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:50:26 compute-0 sudo[198258]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:26 compute-0 sudo[198412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywrnkhawczbeshbqkgznyrtvazbmsxxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100626.4855337-857-258407213963877/AnsiballZ_file.py'
Jan 22 16:50:26 compute-0 sudo[198412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:26 compute-0 python3.9[198414]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:50:27 compute-0 sudo[198412]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:27 compute-0 sudo[198488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxxucpnnuwqtpogwwbxdnssqszzermmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100626.4855337-857-258407213963877/AnsiballZ_stat.py'
Jan 22 16:50:27 compute-0 sudo[198488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:27 compute-0 python3.9[198490]: ansible-stat Invoked with path=/etc/systemd/system/edpm_podman_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:50:27 compute-0 sudo[198488]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:28 compute-0 sudo[198639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yewotkxrqptkzgsnsxeoogtpwacllxwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100627.5114481-857-166057884157920/AnsiballZ_copy.py'
Jan 22 16:50:28 compute-0 sudo[198639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:28 compute-0 python3.9[198641]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769100627.5114481-857-166057884157920/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:50:28 compute-0 sudo[198639]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:28 compute-0 sudo[198715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odwiyktdltmroqgdeckiisihvbmxhadz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100627.5114481-857-166057884157920/AnsiballZ_systemd.py'
Jan 22 16:50:28 compute-0 sudo[198715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:28 compute-0 python3.9[198717]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 16:50:29 compute-0 systemd[1]: Reloading.
Jan 22 16:50:29 compute-0 systemd-rc-local-generator[198744]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:50:29 compute-0 systemd-sysv-generator[198747]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:50:29 compute-0 sudo[198715]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:29 compute-0 sudo[198825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzfigxrujgsyrinmxvsygnayxuxycvgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100627.5114481-857-166057884157920/AnsiballZ_systemd.py'
Jan 22 16:50:29 compute-0 sudo[198825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:29 compute-0 python3.9[198827]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:50:30 compute-0 systemd[1]: Reloading.
Jan 22 16:50:30 compute-0 systemd-rc-local-generator[198858]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:50:30 compute-0 systemd-sysv-generator[198861]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:50:30 compute-0 systemd[1]: Starting podman_exporter container...
Jan 22 16:50:30 compute-0 systemd[1]: Started libcrun container.
Jan 22 16:50:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/782f6827ea5654c9c7fa7a177368b2697f6293d5e7b5d320f3787f8420b133ba/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 22 16:50:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/782f6827ea5654c9c7fa7a177368b2697f6293d5e7b5d320f3787f8420b133ba/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 22 16:50:30 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69.
Jan 22 16:50:30 compute-0 podman[198868]: 2026-01-22 16:50:30.608597475 +0000 UTC m=+0.169648006 container init 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 16:50:30 compute-0 podman_exporter[198884]: ts=2026-01-22T16:50:30.637Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Jan 22 16:50:30 compute-0 podman_exporter[198884]: ts=2026-01-22T16:50:30.637Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Jan 22 16:50:30 compute-0 podman_exporter[198884]: ts=2026-01-22T16:50:30.637Z caller=handler.go:94 level=info msg="enabled collectors"
Jan 22 16:50:30 compute-0 podman_exporter[198884]: ts=2026-01-22T16:50:30.637Z caller=handler.go:105 level=info collector=container
Jan 22 16:50:30 compute-0 podman[198868]: 2026-01-22 16:50:30.641806911 +0000 UTC m=+0.202857402 container start 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 16:50:30 compute-0 podman[198868]: podman_exporter
Jan 22 16:50:30 compute-0 systemd[1]: Starting Podman API Service...
Jan 22 16:50:30 compute-0 systemd[1]: Started Podman API Service.
Jan 22 16:50:30 compute-0 systemd[1]: Started podman_exporter container.
Jan 22 16:50:30 compute-0 sudo[198825]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:30 compute-0 podman[198895]: time="2026-01-22T16:50:30Z" level=info msg="/usr/bin/podman filtering at log level info"
Jan 22 16:50:30 compute-0 podman[198895]: time="2026-01-22T16:50:30Z" level=info msg="Setting parallel job count to 25"
Jan 22 16:50:30 compute-0 podman[198895]: time="2026-01-22T16:50:30Z" level=info msg="Using sqlite as database backend"
Jan 22 16:50:30 compute-0 podman[198895]: time="2026-01-22T16:50:30Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Jan 22 16:50:30 compute-0 podman[198895]: time="2026-01-22T16:50:30Z" level=info msg="Using systemd socket activation to determine API endpoint"
Jan 22 16:50:30 compute-0 podman[198895]: time="2026-01-22T16:50:30Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Jan 22 16:50:30 compute-0 podman[198895]: @ - - [22/Jan/2026:16:50:30 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Jan 22 16:50:30 compute-0 podman[198895]: time="2026-01-22T16:50:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 22 16:50:30 compute-0 podman[198895]: @ - - [22/Jan/2026:16:50:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 18076 "" "Go-http-client/1.1"
Jan 22 16:50:30 compute-0 podman_exporter[198884]: ts=2026-01-22T16:50:30.735Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Jan 22 16:50:30 compute-0 podman_exporter[198884]: ts=2026-01-22T16:50:30.736Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Jan 22 16:50:30 compute-0 podman_exporter[198884]: ts=2026-01-22T16:50:30.736Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Jan 22 16:50:30 compute-0 podman[198893]: 2026-01-22 16:50:30.741363578 +0000 UTC m=+0.079972166 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 16:50:30 compute-0 systemd[1]: 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69-6c861d9b6de705ac.service: Main process exited, code=exited, status=1/FAILURE
Jan 22 16:50:30 compute-0 systemd[1]: 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69-6c861d9b6de705ac.service: Failed with result 'exit-code'.
Jan 22 16:50:31 compute-0 python3.9[199082]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 22 16:50:32 compute-0 sudo[199232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lilkjfbgsvnfrwszwmmrcurjhpfstzzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100632.1440032-902-167959339569878/AnsiballZ_stat.py'
Jan 22 16:50:32 compute-0 sudo[199232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:32 compute-0 python3.9[199234]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:50:32 compute-0 sudo[199232]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:33 compute-0 sudo[199357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxdbtkzlejivkzvrwplewfhtecrsbedt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100632.1440032-902-167959339569878/AnsiballZ_copy.py'
Jan 22 16:50:33 compute-0 sudo[199357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:33 compute-0 python3.9[199359]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769100632.1440032-902-167959339569878/.source.yaml _original_basename=.2wel3arc follow=False checksum=c6cc04e6395720fa8dc60b562885f8794cc890a7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:50:33 compute-0 sudo[199357]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:34 compute-0 sudo[199509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdexygdedlybkkwbvfjaghvadteutofx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100633.9265766-917-28112610543144/AnsiballZ_stat.py'
Jan 22 16:50:34 compute-0 sudo[199509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:34 compute-0 python3.9[199511]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:50:34 compute-0 sudo[199509]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:34 compute-0 sudo[199632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngcryetsdlwyhggamudxdnvirmhbjbmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100633.9265766-917-28112610543144/AnsiballZ_copy.py'
Jan 22 16:50:34 compute-0 sudo[199632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:35 compute-0 python3.9[199634]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769100633.9265766-917-28112610543144/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:50:35 compute-0 sudo[199632]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:36 compute-0 sudo[199784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kalorcdaeouolhcoghothnqgjolobhqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100635.694384-938-210953706366892/AnsiballZ_file.py'
Jan 22 16:50:36 compute-0 sudo[199784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:36 compute-0 python3.9[199786]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:50:36 compute-0 sudo[199784]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:36 compute-0 sudo[199936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwyapnfnvpoqunsvcegrypefdyfanofs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100636.595349-946-150521522571870/AnsiballZ_file.py'
Jan 22 16:50:36 compute-0 sudo[199936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:37 compute-0 python3.9[199938]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:50:37 compute-0 sudo[199936]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:37 compute-0 sudo[200088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifspahuxszttzdnecxcyurlsgbnzycru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100637.440033-954-233474177101857/AnsiballZ_stat.py'
Jan 22 16:50:37 compute-0 sudo[200088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:38 compute-0 python3.9[200090]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:50:38 compute-0 sudo[200088]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:38 compute-0 sudo[200166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyboxtqrhpcrgidtnhgnnbmycsbbezwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100637.440033-954-233474177101857/AnsiballZ_file.py'
Jan 22 16:50:38 compute-0 sudo[200166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:38 compute-0 python3.9[200168]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.2kttlexw recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:50:38 compute-0 sudo[200166]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:39 compute-0 python3.9[200318]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:50:41 compute-0 sudo[200739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oigazmvkxuoqzfeurmyksrwemafstcdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100641.3859065-991-228628072398469/AnsiballZ_container_config_data.py'
Jan 22 16:50:41 compute-0 sudo[200739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:50:41.905 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:50:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:50:41.906 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:50:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:50:41.906 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:50:42 compute-0 python3.9[200741]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_pattern=*.json debug=False
Jan 22 16:50:42 compute-0 sudo[200739]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:42 compute-0 podman[200766]: 2026-01-22 16:50:42.379255704 +0000 UTC m=+0.074362916 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 16:50:42 compute-0 sudo[200914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cewdfrbnsnwaaemmikuwjozexdioaisg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100642.4071867-1002-221409207353192/AnsiballZ_container_config_hash.py'
Jan 22 16:50:42 compute-0 sudo[200914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:42 compute-0 python3.9[200916]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 22 16:50:43 compute-0 sudo[200914]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:43 compute-0 auditd[701]: Audit daemon rotating log files
Jan 22 16:50:43 compute-0 sudo[201066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfjhcyzaukdfgioajxbssulgflmyrdul ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769100643.454457-1012-28041226368692/AnsiballZ_edpm_container_manage.py'
Jan 22 16:50:43 compute-0 sudo[201066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:44 compute-0 python3[201068]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_id=openstack_network_exporter config_overrides={} config_patterns=*.json containers=['openstack_network_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 22 16:50:46 compute-0 podman[201081]: 2026-01-22 16:50:46.571834125 +0000 UTC m=+2.274786941 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 22 16:50:46 compute-0 podman[201180]: 2026-01-22 16:50:46.690210317 +0000 UTC m=+0.042668896 container create c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-type=git, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, maintainer=Red Hat, Inc.)
Jan 22 16:50:46 compute-0 podman[201180]: 2026-01-22 16:50:46.667839611 +0000 UTC m=+0.020298210 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 22 16:50:46 compute-0 python3[201068]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595 --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=openstack_network_exporter --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 22 16:50:46 compute-0 sudo[201066]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:47 compute-0 sudo[201367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwwfwgaihkphpfclavlpetnohwzymcps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100647.1037118-1020-111861706179974/AnsiballZ_stat.py'
Jan 22 16:50:47 compute-0 sudo[201367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:47 compute-0 python3.9[201369]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:50:47 compute-0 sudo[201367]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:48 compute-0 sudo[201522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uizmgkfhtsrrjsmjbtvdofffuqueyvyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100647.9754832-1029-28019358048181/AnsiballZ_file.py'
Jan 22 16:50:48 compute-0 sudo[201522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:48 compute-0 python3.9[201525]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:50:48 compute-0 sudo[201522]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:48 compute-0 sudo[201625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izifcfqhiqwklmqppwhullesrgznbtmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100647.9754832-1029-28019358048181/AnsiballZ_stat.py'
Jan 22 16:50:48 compute-0 podman[201574]: 2026-01-22 16:50:48.705900997 +0000 UTC m=+0.062169715 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 22 16:50:48 compute-0 sudo[201625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:48 compute-0 podman[201573]: 2026-01-22 16:50:48.811415122 +0000 UTC m=+0.165594522 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Jan 22 16:50:48 compute-0 sshd-session[201519]: Received disconnect from 45.148.10.147 port 39059:11:  [preauth]
Jan 22 16:50:48 compute-0 sshd-session[201519]: Disconnected from authenticating user root 45.148.10.147 port 39059 [preauth]
Jan 22 16:50:48 compute-0 python3.9[201637]: ansible-stat Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:50:48 compute-0 sudo[201625]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:49 compute-0 sudo[201792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cstqdjcggowysmizqdqxepkzjbdrvhfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100648.980208-1029-129463373044357/AnsiballZ_copy.py'
Jan 22 16:50:49 compute-0 sudo[201792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:49 compute-0 python3.9[201794]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769100648.980208-1029-129463373044357/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:50:49 compute-0 sudo[201792]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:49 compute-0 sudo[201868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zczbasfxioqudvaxplfwtfyzrvspcmge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100648.980208-1029-129463373044357/AnsiballZ_systemd.py'
Jan 22 16:50:49 compute-0 sudo[201868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:50 compute-0 python3.9[201870]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 16:50:50 compute-0 systemd[1]: Reloading.
Jan 22 16:50:50 compute-0 systemd-rc-local-generator[201897]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:50:50 compute-0 systemd-sysv-generator[201900]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:50:50 compute-0 sudo[201868]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:50 compute-0 sudo[201979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsnfberlosjazhnytpxppwfpvlxhjmen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100648.980208-1029-129463373044357/AnsiballZ_systemd.py'
Jan 22 16:50:50 compute-0 sudo[201979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:51 compute-0 python3.9[201981]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:50:51 compute-0 systemd[1]: Reloading.
Jan 22 16:50:51 compute-0 systemd-rc-local-generator[202010]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:50:51 compute-0 systemd-sysv-generator[202013]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:50:51 compute-0 systemd[1]: Starting openstack_network_exporter container...
Jan 22 16:50:51 compute-0 systemd[1]: Started libcrun container.
Jan 22 16:50:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a4473d984f6b6268e97bc36199f0078cb2169a95241a144f6edbe275efbe537a/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 22 16:50:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a4473d984f6b6268e97bc36199f0078cb2169a95241a144f6edbe275efbe537a/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 22 16:50:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a4473d984f6b6268e97bc36199f0078cb2169a95241a144f6edbe275efbe537a/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 22 16:50:51 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30.
Jan 22 16:50:51 compute-0 podman[202020]: 2026-01-22 16:50:51.586061825 +0000 UTC m=+0.139778456 container init c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, maintainer=Red Hat, Inc., version=9.6, managed_by=edpm_ansible)
Jan 22 16:50:51 compute-0 openstack_network_exporter[202036]: INFO    16:50:51 main.go:48: registering *bridge.Collector
Jan 22 16:50:51 compute-0 openstack_network_exporter[202036]: INFO    16:50:51 main.go:48: registering *coverage.Collector
Jan 22 16:50:51 compute-0 openstack_network_exporter[202036]: INFO    16:50:51 main.go:48: registering *datapath.Collector
Jan 22 16:50:51 compute-0 openstack_network_exporter[202036]: INFO    16:50:51 main.go:48: registering *iface.Collector
Jan 22 16:50:51 compute-0 openstack_network_exporter[202036]: INFO    16:50:51 main.go:48: registering *memory.Collector
Jan 22 16:50:51 compute-0 openstack_network_exporter[202036]: INFO    16:50:51 main.go:55: *ovnnorthd.Collector not registered, metric set not enabled
Jan 22 16:50:51 compute-0 openstack_network_exporter[202036]: INFO    16:50:51 main.go:48: registering *ovn.Collector
Jan 22 16:50:51 compute-0 openstack_network_exporter[202036]: INFO    16:50:51 main.go:55: *ovsdbserver.Collector not registered, metric set not enabled
Jan 22 16:50:51 compute-0 openstack_network_exporter[202036]: INFO    16:50:51 main.go:48: registering *pmd_perf.Collector
Jan 22 16:50:51 compute-0 openstack_network_exporter[202036]: INFO    16:50:51 main.go:48: registering *pmd_rxq.Collector
Jan 22 16:50:51 compute-0 openstack_network_exporter[202036]: INFO    16:50:51 main.go:48: registering *vswitch.Collector
Jan 22 16:50:51 compute-0 openstack_network_exporter[202036]: NOTICE  16:50:51 main.go:76: listening on https://:9105/metrics
Jan 22 16:50:51 compute-0 podman[202020]: 2026-01-22 16:50:51.6161769 +0000 UTC m=+0.169893511 container start c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, version=9.6, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=)
Jan 22 16:50:51 compute-0 podman[202020]: openstack_network_exporter
Jan 22 16:50:51 compute-0 systemd[1]: Started openstack_network_exporter container.
Jan 22 16:50:51 compute-0 sudo[201979]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:51 compute-0 podman[202046]: 2026-01-22 16:50:51.754390469 +0000 UTC m=+0.126796777 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, distribution-scope=public, version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9-minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, release=1755695350, vcs-type=git, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 22 16:50:52 compute-0 nova_compute[183075]: 2026-01-22 16:50:52.307 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:50:52 compute-0 nova_compute[183075]: 2026-01-22 16:50:52.327 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:50:52 compute-0 nova_compute[183075]: 2026-01-22 16:50:52.327 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:50:52 compute-0 nova_compute[183075]: 2026-01-22 16:50:52.327 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:50:52 compute-0 nova_compute[183075]: 2026-01-22 16:50:52.328 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 16:50:52 compute-0 python3.9[202218]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 22 16:50:52 compute-0 nova_compute[183075]: 2026-01-22 16:50:52.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:50:52 compute-0 nova_compute[183075]: 2026-01-22 16:50:52.787 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 16:50:52 compute-0 nova_compute[183075]: 2026-01-22 16:50:52.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 16:50:52 compute-0 nova_compute[183075]: 2026-01-22 16:50:52.804 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 16:50:52 compute-0 nova_compute[183075]: 2026-01-22 16:50:52.804 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:50:52 compute-0 nova_compute[183075]: 2026-01-22 16:50:52.804 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:50:52 compute-0 nova_compute[183075]: 2026-01-22 16:50:52.824 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:50:52 compute-0 nova_compute[183075]: 2026-01-22 16:50:52.825 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:50:52 compute-0 nova_compute[183075]: 2026-01-22 16:50:52.825 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:50:52 compute-0 nova_compute[183075]: 2026-01-22 16:50:52.825 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 16:50:52 compute-0 nova_compute[183075]: 2026-01-22 16:50:52.974 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 16:50:52 compute-0 nova_compute[183075]: 2026-01-22 16:50:52.975 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5841MB free_disk=73.37396621704102GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 16:50:52 compute-0 nova_compute[183075]: 2026-01-22 16:50:52.975 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:50:52 compute-0 nova_compute[183075]: 2026-01-22 16:50:52.975 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:50:53 compute-0 nova_compute[183075]: 2026-01-22 16:50:53.039 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 16:50:53 compute-0 nova_compute[183075]: 2026-01-22 16:50:53.040 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 16:50:53 compute-0 nova_compute[183075]: 2026-01-22 16:50:53.063 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 16:50:53 compute-0 nova_compute[183075]: 2026-01-22 16:50:53.084 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 16:50:53 compute-0 nova_compute[183075]: 2026-01-22 16:50:53.086 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 16:50:53 compute-0 nova_compute[183075]: 2026-01-22 16:50:53.086 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:50:53 compute-0 sudo[202368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adbkdwnqfgedisamvggwxrdoabyucybj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100653.0118487-1074-139532530839367/AnsiballZ_stat.py'
Jan 22 16:50:53 compute-0 sudo[202368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:53 compute-0 python3.9[202370]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:50:53 compute-0 sudo[202368]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:53 compute-0 sudo[202493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fttiamutebxickintwiwftzvieejlpkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100653.0118487-1074-139532530839367/AnsiballZ_copy.py'
Jan 22 16:50:53 compute-0 sudo[202493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:54 compute-0 nova_compute[183075]: 2026-01-22 16:50:54.073 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:50:54 compute-0 nova_compute[183075]: 2026-01-22 16:50:54.073 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:50:54 compute-0 nova_compute[183075]: 2026-01-22 16:50:54.074 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:50:54 compute-0 python3.9[202495]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769100653.0118487-1074-139532530839367/.source.yaml _original_basename=.9ubtt6vj follow=False checksum=9e0c5408668a02a5e35cf2a735cc772db9fb97d5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:50:54 compute-0 sudo[202493]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:54 compute-0 podman[202619]: 2026-01-22 16:50:54.723054839 +0000 UTC m=+0.055233215 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=3, health_log=, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 16:50:54 compute-0 sudo[202657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efpoadtcgazrzefgjedwggsgpnevqrnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100654.3947158-1089-206356839512304/AnsiballZ_find.py'
Jan 22 16:50:54 compute-0 systemd[1]: 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a-374c3b43a6ba2f8a.service: Main process exited, code=exited, status=1/FAILURE
Jan 22 16:50:54 compute-0 systemd[1]: 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a-374c3b43a6ba2f8a.service: Failed with result 'exit-code'.
Jan 22 16:50:54 compute-0 sudo[202657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:54 compute-0 python3.9[202665]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 22 16:50:54 compute-0 sudo[202657]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:55 compute-0 sudo[202816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmlsevgesykporgjrgrktfpwkthpbpmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100655.4046469-1099-132445759105378/AnsiballZ_podman_container_info.py'
Jan 22 16:50:55 compute-0 sudo[202816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:56 compute-0 python3.9[202818]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Jan 22 16:50:56 compute-0 sudo[202816]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:56 compute-0 sudo[202981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idvxgcdyxasppglnwmqalqrspmfunqjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100656.354631-1107-166206568661433/AnsiballZ_podman_container_exec.py'
Jan 22 16:50:56 compute-0 sudo[202981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:57 compute-0 python3.9[202983]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 22 16:50:57 compute-0 systemd[1]: Started libpod-conmon-3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee.scope.
Jan 22 16:50:57 compute-0 podman[202984]: 2026-01-22 16:50:57.193892693 +0000 UTC m=+0.098237598 container exec 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 16:50:57 compute-0 podman[202984]: 2026-01-22 16:50:57.202896021 +0000 UTC m=+0.107240926 container exec_died 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 16:50:57 compute-0 systemd[1]: libpod-conmon-3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee.scope: Deactivated successfully.
Jan 22 16:50:57 compute-0 sudo[202981]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:57 compute-0 sudo[203167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anfbekfnjqalrcnsejtxnqtifiofvjur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100657.4301348-1115-68242492162527/AnsiballZ_podman_container_exec.py'
Jan 22 16:50:57 compute-0 sudo[203167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:57 compute-0 python3.9[203169]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 22 16:50:58 compute-0 systemd[1]: Started libpod-conmon-3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee.scope.
Jan 22 16:50:58 compute-0 podman[203170]: 2026-01-22 16:50:58.032554964 +0000 UTC m=+0.078373010 container exec 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, config_id=ovn_controller)
Jan 22 16:50:58 compute-0 podman[203170]: 2026-01-22 16:50:58.067893489 +0000 UTC m=+0.113711515 container exec_died 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 22 16:50:58 compute-0 systemd[1]: libpod-conmon-3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee.scope: Deactivated successfully.
Jan 22 16:50:58 compute-0 sudo[203167]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:58 compute-0 sudo[203351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyeefrjufhxerlznctyhzokhlxiwxvea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100658.30833-1123-158054309063591/AnsiballZ_file.py'
Jan 22 16:50:58 compute-0 sudo[203351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:58 compute-0 python3.9[203353]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:50:58 compute-0 sudo[203351]: pam_unix(sudo:session): session closed for user root
Jan 22 16:50:59 compute-0 sudo[203503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwtiervxecljwbiwgfurihjgtgaqgsvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100659.0662973-1132-261055313089547/AnsiballZ_podman_container_info.py'
Jan 22 16:50:59 compute-0 sudo[203503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:50:59 compute-0 python3.9[203505]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Jan 22 16:50:59 compute-0 sudo[203503]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:00 compute-0 sudo[203668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfjjkbosmycejpupubvgsgkeglfgkrnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100659.9021406-1140-93851222735442/AnsiballZ_podman_container_exec.py'
Jan 22 16:51:00 compute-0 sudo[203668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:00 compute-0 python3.9[203670]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 22 16:51:00 compute-0 systemd[1]: Started libpod-conmon-642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd.scope.
Jan 22 16:51:00 compute-0 podman[203671]: 2026-01-22 16:51:00.573116656 +0000 UTC m=+0.090035778 container exec 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 16:51:00 compute-0 podman[203671]: 2026-01-22 16:51:00.607981641 +0000 UTC m=+0.124900733 container exec_died 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 16:51:00 compute-0 systemd[1]: libpod-conmon-642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd.scope: Deactivated successfully.
Jan 22 16:51:00 compute-0 sudo[203668]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:01 compute-0 sudo[203864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzbevrogbqbeiaiinqhoaezkhldmiscn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100660.8682334-1148-177077426586648/AnsiballZ_podman_container_exec.py'
Jan 22 16:51:01 compute-0 sudo[203864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:01 compute-0 podman[203825]: 2026-01-22 16:51:01.230394643 +0000 UTC m=+0.062942353 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 16:51:01 compute-0 python3.9[203870]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 22 16:51:01 compute-0 systemd[1]: Started libpod-conmon-642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd.scope.
Jan 22 16:51:01 compute-0 podman[203878]: 2026-01-22 16:51:01.518933398 +0000 UTC m=+0.104440635 container exec 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 22 16:51:01 compute-0 podman[203878]: 2026-01-22 16:51:01.554206288 +0000 UTC m=+0.139713495 container exec_died 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Jan 22 16:51:01 compute-0 systemd[1]: libpod-conmon-642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd.scope: Deactivated successfully.
Jan 22 16:51:01 compute-0 sudo[203864]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:02 compute-0 sudo[204059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxetgapjbvydsxaupsbdvckhtxcervjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100661.7756894-1156-276077357644624/AnsiballZ_file.py'
Jan 22 16:51:02 compute-0 sudo[204059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:02 compute-0 python3.9[204061]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:51:02 compute-0 sudo[204059]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:02 compute-0 sudo[204211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dilkceqnurafwdeieplvddnlvrtviutt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100662.5606487-1165-169129886187013/AnsiballZ_podman_container_info.py'
Jan 22 16:51:02 compute-0 sudo[204211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:03 compute-0 python3.9[204213]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Jan 22 16:51:03 compute-0 sudo[204211]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:03 compute-0 sudo[204376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfvrqdbdzvhbwuaemxsueudvrjhtryvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100663.434324-1173-261341409009955/AnsiballZ_podman_container_exec.py'
Jan 22 16:51:03 compute-0 sudo[204376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:04 compute-0 python3.9[204378]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 22 16:51:04 compute-0 systemd[1]: Started libpod-conmon-7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a.scope.
Jan 22 16:51:04 compute-0 podman[204379]: 2026-01-22 16:51:04.133620616 +0000 UTC m=+0.091444980 container exec 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 16:51:04 compute-0 podman[204379]: 2026-01-22 16:51:04.168956317 +0000 UTC m=+0.126780631 container exec_died 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Jan 22 16:51:04 compute-0 systemd[1]: libpod-conmon-7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a.scope: Deactivated successfully.
Jan 22 16:51:04 compute-0 sudo[204376]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:04 compute-0 sudo[204560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irzebdmsfdapjdofpocwewqxskfaddxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100664.4362621-1181-54251298566209/AnsiballZ_podman_container_exec.py'
Jan 22 16:51:04 compute-0 sudo[204560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:04 compute-0 python3.9[204562]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 22 16:51:05 compute-0 systemd[1]: Started libpod-conmon-7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a.scope.
Jan 22 16:51:05 compute-0 podman[204563]: 2026-01-22 16:51:05.034926871 +0000 UTC m=+0.077467068 container exec 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute)
Jan 22 16:51:05 compute-0 podman[204563]: 2026-01-22 16:51:05.071051213 +0000 UTC m=+0.113591420 container exec_died 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Jan 22 16:51:05 compute-0 systemd[1]: libpod-conmon-7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a.scope: Deactivated successfully.
Jan 22 16:51:05 compute-0 sudo[204560]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:05 compute-0 sudo[204744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sftraxfqfcwgqseziuyjqtckarthxoho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100665.3146062-1189-133062450163348/AnsiballZ_file.py'
Jan 22 16:51:05 compute-0 sudo[204744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:05 compute-0 python3.9[204746]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:51:05 compute-0 sudo[204744]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:06 compute-0 sudo[204896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsgarnbcqofkwlsxqorettesettuzvdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100666.1105926-1198-176569914082637/AnsiballZ_podman_container_info.py'
Jan 22 16:51:06 compute-0 sudo[204896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:06 compute-0 python3.9[204898]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Jan 22 16:51:06 compute-0 sudo[204896]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:07 compute-0 sudo[205061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qswkyfinxknyydsfbxvptwcplwqzzygu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100666.9696403-1206-207131205328091/AnsiballZ_podman_container_exec.py'
Jan 22 16:51:07 compute-0 sudo[205061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:07 compute-0 python3.9[205063]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 22 16:51:07 compute-0 systemd[1]: Started libpod-conmon-04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75.scope.
Jan 22 16:51:07 compute-0 podman[205064]: 2026-01-22 16:51:07.573066954 +0000 UTC m=+0.071838074 container exec 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 16:51:07 compute-0 podman[205064]: 2026-01-22 16:51:07.637319521 +0000 UTC m=+0.136090621 container exec_died 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 16:51:07 compute-0 systemd[1]: libpod-conmon-04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75.scope: Deactivated successfully.
Jan 22 16:51:07 compute-0 sudo[205061]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:08 compute-0 sudo[205243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibihrohjvsykfmruhnmddkedzybqcamr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100667.9073517-1214-221627902014074/AnsiballZ_podman_container_exec.py'
Jan 22 16:51:08 compute-0 sudo[205243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:08 compute-0 python3.9[205245]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 22 16:51:08 compute-0 systemd[1]: Started libpod-conmon-04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75.scope.
Jan 22 16:51:08 compute-0 podman[205246]: 2026-01-22 16:51:08.575910198 +0000 UTC m=+0.076802661 container exec 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 16:51:08 compute-0 podman[205246]: 2026-01-22 16:51:08.608899559 +0000 UTC m=+0.109792012 container exec_died 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 16:51:08 compute-0 systemd[1]: libpod-conmon-04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75.scope: Deactivated successfully.
Jan 22 16:51:08 compute-0 sudo[205243]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:09 compute-0 sudo[205428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjgqntfvqedqngobhrorqjocwqiqmvmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100668.8667839-1222-68733376809903/AnsiballZ_file.py'
Jan 22 16:51:09 compute-0 sudo[205428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:09 compute-0 python3.9[205430]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:51:09 compute-0 sudo[205428]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:10 compute-0 sudo[205580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlpnaqhhlcbslgeoxbsxomvbacivaylk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100669.6999345-1231-134946488333693/AnsiballZ_podman_container_info.py'
Jan 22 16:51:10 compute-0 sudo[205580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:10 compute-0 python3.9[205582]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Jan 22 16:51:10 compute-0 sudo[205580]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:10 compute-0 sudo[205745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgljcqddhjjxdxpkchxberomsnwakbhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100670.4909172-1239-5396270336232/AnsiballZ_podman_container_exec.py'
Jan 22 16:51:10 compute-0 sudo[205745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:11 compute-0 python3.9[205747]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 22 16:51:11 compute-0 systemd[1]: Started libpod-conmon-266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69.scope.
Jan 22 16:51:11 compute-0 podman[205748]: 2026-01-22 16:51:11.190505973 +0000 UTC m=+0.106105057 container exec 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 16:51:11 compute-0 podman[205748]: 2026-01-22 16:51:11.226248645 +0000 UTC m=+0.141847699 container exec_died 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 16:51:11 compute-0 systemd[1]: libpod-conmon-266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69.scope: Deactivated successfully.
Jan 22 16:51:11 compute-0 sudo[205745]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:11 compute-0 sudo[205928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfnvgnkokanjxbdsdnmrojrcspuaqxnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100671.4822214-1247-244061479325877/AnsiballZ_podman_container_exec.py'
Jan 22 16:51:11 compute-0 sudo[205928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:12 compute-0 python3.9[205930]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 22 16:51:12 compute-0 systemd[1]: Started libpod-conmon-266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69.scope.
Jan 22 16:51:12 compute-0 podman[205931]: 2026-01-22 16:51:12.123668531 +0000 UTC m=+0.094065357 container exec 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 16:51:12 compute-0 podman[205931]: 2026-01-22 16:51:12.158095118 +0000 UTC m=+0.128491844 container exec_died 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 16:51:12 compute-0 systemd[1]: libpod-conmon-266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69.scope: Deactivated successfully.
Jan 22 16:51:12 compute-0 sudo[205928]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:12 compute-0 sudo[206132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btgxndqflbxmsmdeirvvrmnwaotoxneq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100672.3924437-1255-138809060578868/AnsiballZ_file.py'
Jan 22 16:51:12 compute-0 sudo[206132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:12 compute-0 podman[206087]: 2026-01-22 16:51:12.717844345 +0000 UTC m=+0.063796567 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 16:51:12 compute-0 python3.9[206141]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:51:12 compute-0 sudo[206132]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:13 compute-0 sudo[206291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvjgttmfgttafjmvlococuwrbbdcolok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100673.126269-1264-116010066239463/AnsiballZ_podman_container_info.py'
Jan 22 16:51:13 compute-0 sudo[206291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:13 compute-0 python3.9[206293]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Jan 22 16:51:13 compute-0 sudo[206291]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:14 compute-0 sudo[206457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtwicsmgccjumzqbtuwxrfreelhwckbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100673.9509513-1272-34311644677127/AnsiballZ_podman_container_exec.py'
Jan 22 16:51:14 compute-0 sudo[206457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:14 compute-0 python3.9[206459]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 22 16:51:14 compute-0 systemd[1]: Started libpod-conmon-c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30.scope.
Jan 22 16:51:14 compute-0 podman[206460]: 2026-01-22 16:51:14.634324554 +0000 UTC m=+0.082341165 container exec c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, release=1755695350, name=ubi9-minimal, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_id=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41)
Jan 22 16:51:14 compute-0 podman[206460]: 2026-01-22 16:51:14.668007793 +0000 UTC m=+0.116024414 container exec_died c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vcs-type=git, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 22 16:51:14 compute-0 systemd[1]: libpod-conmon-c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30.scope: Deactivated successfully.
Jan 22 16:51:14 compute-0 sudo[206457]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:15 compute-0 sudo[206639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npzgrwssmyuqbaprtxelozbrhjdtfmee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100674.892446-1280-279420643032714/AnsiballZ_podman_container_exec.py'
Jan 22 16:51:15 compute-0 sudo[206639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:15 compute-0 python3.9[206641]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 22 16:51:15 compute-0 systemd[1]: Started libpod-conmon-c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30.scope.
Jan 22 16:51:15 compute-0 podman[206642]: 2026-01-22 16:51:15.434307337 +0000 UTC m=+0.077310075 container exec c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, distribution-scope=public, managed_by=edpm_ansible, container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64)
Jan 22 16:51:15 compute-0 podman[206642]: 2026-01-22 16:51:15.443028562 +0000 UTC m=+0.086031300 container exec_died c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, architecture=x86_64, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, distribution-scope=public, release=1755695350, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Jan 22 16:51:15 compute-0 systemd[1]: libpod-conmon-c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30.scope: Deactivated successfully.
Jan 22 16:51:15 compute-0 sudo[206639]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:15 compute-0 sudo[206823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghllnypvezkhdiqxdcxdohysdzujnxcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100675.665055-1288-162931561645324/AnsiballZ_file.py'
Jan 22 16:51:15 compute-0 sudo[206823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:16 compute-0 python3.9[206825]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:51:16 compute-0 sudo[206823]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:16 compute-0 sudo[206975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okajywindssmemnwjkucooyvxdguvhib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100676.490059-1297-228706671019831/AnsiballZ_file.py'
Jan 22 16:51:16 compute-0 sudo[206975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:17 compute-0 python3.9[206977]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:51:17 compute-0 sudo[206975]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:17 compute-0 sudo[207127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtrzlmdltnxejvjfhttsruaczfkuvxpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100677.28711-1305-40825424688298/AnsiballZ_stat.py'
Jan 22 16:51:17 compute-0 sudo[207127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:17 compute-0 python3.9[207129]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:51:17 compute-0 sudo[207127]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:18 compute-0 sudo[207250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksjxefoeiiqmvmfphfvotwweuixpsegn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100677.28711-1305-40825424688298/AnsiballZ_copy.py'
Jan 22 16:51:18 compute-0 sudo[207250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:18 compute-0 python3.9[207252]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769100677.28711-1305-40825424688298/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:51:18 compute-0 sudo[207250]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:19 compute-0 sudo[207433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pteqbjadkrhvjatemdhniaihngjopnuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100678.666993-1321-183409527939165/AnsiballZ_file.py'
Jan 22 16:51:19 compute-0 sudo[207433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:19 compute-0 podman[207377]: 2026-01-22 16:51:19.043809842 +0000 UTC m=+0.059837255 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 16:51:19 compute-0 podman[207376]: 2026-01-22 16:51:19.078815705 +0000 UTC m=+0.097694981 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 16:51:19 compute-0 python3.9[207440]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:51:19 compute-0 sudo[207433]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:19 compute-0 sudo[207596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emuobbbqcvtbawcwiyzwzanmgayoexvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100679.4321034-1329-240249191801560/AnsiballZ_stat.py'
Jan 22 16:51:19 compute-0 sudo[207596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:20 compute-0 python3.9[207598]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:51:20 compute-0 sudo[207596]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:20 compute-0 sudo[207674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkmjfpxfyfaqhizztqaqpgfleiynznuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100679.4321034-1329-240249191801560/AnsiballZ_file.py'
Jan 22 16:51:20 compute-0 sudo[207674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:21 compute-0 python3.9[207676]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:51:21 compute-0 sudo[207674]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:21 compute-0 sudo[207842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puultqjkpgpmxcaqmmzgdrzcpdzchrpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100681.6744587-1341-29836526911499/AnsiballZ_stat.py'
Jan 22 16:51:21 compute-0 sudo[207842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:22 compute-0 podman[207800]: 2026-01-22 16:51:22.004989765 +0000 UTC m=+0.064786712 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, release=1755695350)
Jan 22 16:51:22 compute-0 python3.9[207848]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:51:22 compute-0 sudo[207842]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:22 compute-0 sudo[207924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmxfwfprmpxfklwicmtpdxmnvywfirob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100681.6744587-1341-29836526911499/AnsiballZ_file.py'
Jan 22 16:51:22 compute-0 sudo[207924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:22 compute-0 python3.9[207926]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.sjef_5ex recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:51:22 compute-0 sudo[207924]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:23 compute-0 sudo[208076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlqhtarwaujrtgspnyyrphzfzvrvumpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100682.8138454-1353-79144976514379/AnsiballZ_stat.py'
Jan 22 16:51:23 compute-0 sudo[208076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:23 compute-0 python3.9[208078]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:51:23 compute-0 sudo[208076]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:23 compute-0 sudo[208154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zddpufkelkmgksmowqpigwcgjreioxeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100682.8138454-1353-79144976514379/AnsiballZ_file.py'
Jan 22 16:51:23 compute-0 sudo[208154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:23 compute-0 python3.9[208156]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:51:23 compute-0 sudo[208154]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:24 compute-0 sudo[208306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpcniooavcdevaipjebtejjkvrcczxtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100684.117335-1366-148210285841838/AnsiballZ_command.py'
Jan 22 16:51:24 compute-0 sudo[208306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:24 compute-0 python3.9[208308]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:51:24 compute-0 sudo[208306]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:25 compute-0 sudo[208470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsethbkxmszzkaotexvceqcqkisuykjn ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769100684.7974138-1374-270614202854392/AnsiballZ_edpm_nftables_from_files.py'
Jan 22 16:51:25 compute-0 sudo[208470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:25 compute-0 podman[208433]: 2026-01-22 16:51:25.27361109 +0000 UTC m=+0.064540136 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 16:51:25 compute-0 python3[208476]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 22 16:51:25 compute-0 sudo[208470]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:26 compute-0 sudo[208631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqmneyysdnfxklidgmgdpudovvmrrnuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100685.7667377-1382-242390745289308/AnsiballZ_stat.py'
Jan 22 16:51:26 compute-0 sudo[208631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:26 compute-0 python3.9[208633]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:51:26 compute-0 sudo[208631]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:26 compute-0 sudo[208709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rerijqvfglhrclejwqzlivpmohaxvlsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100685.7667377-1382-242390745289308/AnsiballZ_file.py'
Jan 22 16:51:26 compute-0 sudo[208709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:26 compute-0 python3.9[208711]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:51:26 compute-0 sudo[208709]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:27 compute-0 sudo[208861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thghqybzededsgtuxpudzxaeturgngfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100686.9359233-1394-151499544513843/AnsiballZ_stat.py'
Jan 22 16:51:27 compute-0 sudo[208861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:27 compute-0 python3.9[208863]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:51:27 compute-0 sudo[208861]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:27 compute-0 sudo[208939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkvzfaridqcdfclnrvbgedoiyqfkepwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100686.9359233-1394-151499544513843/AnsiballZ_file.py'
Jan 22 16:51:27 compute-0 sudo[208939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:28 compute-0 python3.9[208941]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:51:28 compute-0 sudo[208939]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:28 compute-0 sudo[209091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfrrzhzaybxcfhzoxixbtobhujkqannh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100688.2579417-1406-7693579280136/AnsiballZ_stat.py'
Jan 22 16:51:28 compute-0 sudo[209091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:28 compute-0 python3.9[209093]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:51:28 compute-0 sudo[209091]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:29 compute-0 sudo[209169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmjgdjwytizguvrogrynodtwaswhbicx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100688.2579417-1406-7693579280136/AnsiballZ_file.py'
Jan 22 16:51:29 compute-0 sudo[209169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:29 compute-0 python3.9[209171]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:51:29 compute-0 sudo[209169]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:30 compute-0 sudo[209321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-recufhuxluzlwlbercelnadrrypddgwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100689.7800517-1418-260073913220904/AnsiballZ_stat.py'
Jan 22 16:51:30 compute-0 sudo[209321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:30 compute-0 python3.9[209323]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:51:30 compute-0 sudo[209321]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:30 compute-0 sudo[209399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etmougqtwgcrqqnrjobjtdivttulyphi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100689.7800517-1418-260073913220904/AnsiballZ_file.py'
Jan 22 16:51:30 compute-0 sudo[209399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:30 compute-0 python3.9[209401]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:51:30 compute-0 sudo[209399]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:31 compute-0 podman[209478]: 2026-01-22 16:51:31.379548231 +0000 UTC m=+0.089791416 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 16:51:31 compute-0 sudo[209575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-laxywuhvezewcniyfpjywwpjtimuqawn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100691.1235213-1430-159156367059474/AnsiballZ_stat.py'
Jan 22 16:51:31 compute-0 sudo[209575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:31 compute-0 python3.9[209577]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:51:31 compute-0 sudo[209575]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:32 compute-0 sudo[209700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqwljxyqasrjjmqslwjyohjwwpbtwfme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100691.1235213-1430-159156367059474/AnsiballZ_copy.py'
Jan 22 16:51:32 compute-0 sudo[209700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:32 compute-0 python3.9[209702]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769100691.1235213-1430-159156367059474/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:51:32 compute-0 sudo[209700]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:32 compute-0 sudo[209852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qoyouxwcmuxypqsxbrxmrztquimgfewv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100692.5809662-1445-52410194662314/AnsiballZ_file.py'
Jan 22 16:51:32 compute-0 sudo[209852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:33 compute-0 python3.9[209854]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:51:33 compute-0 sudo[209852]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:33 compute-0 sudo[210004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-voooymqbfevwstoxyvavplbhtzhkyyvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100693.3808994-1453-252214233031636/AnsiballZ_command.py'
Jan 22 16:51:33 compute-0 sudo[210004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:33 compute-0 python3.9[210006]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:51:33 compute-0 sudo[210004]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:34 compute-0 sudo[210159]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyjzrtgmnfrqagdmjnhbzmlfuadulcoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100694.1386328-1461-63775275364151/AnsiballZ_blockinfile.py'
Jan 22 16:51:34 compute-0 sudo[210159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:34 compute-0 python3.9[210161]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:51:34 compute-0 sudo[210159]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:35 compute-0 sudo[210311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahviyozdyenkfeubwiwgloduijiqtijk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100695.145267-1470-74967178317778/AnsiballZ_command.py'
Jan 22 16:51:35 compute-0 sudo[210311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:35 compute-0 python3.9[210313]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:51:35 compute-0 sudo[210311]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:36 compute-0 sudo[210464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-romvvpcgtxhqktqxvytbzmslhdzbdwlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100695.8109803-1478-160982292452164/AnsiballZ_stat.py'
Jan 22 16:51:36 compute-0 sudo[210464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:36 compute-0 python3.9[210466]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:51:36 compute-0 sudo[210464]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:36 compute-0 sudo[210618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vaawiyfcgobswwwltljluimhgzrqkswh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100696.5700727-1486-173847915016124/AnsiballZ_command.py'
Jan 22 16:51:36 compute-0 sudo[210618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:37 compute-0 python3.9[210620]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:51:37 compute-0 sudo[210618]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:37 compute-0 sudo[210773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agbxeulryeyzmcttlcaulyfrczpzgkpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769100697.3004682-1494-2422612746366/AnsiballZ_file.py'
Jan 22 16:51:37 compute-0 sudo[210773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 16:51:37 compute-0 python3.9[210775]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:51:37 compute-0 sudo[210773]: pam_unix(sudo:session): session closed for user root
Jan 22 16:51:38 compute-0 sshd-session[183378]: Connection closed by 192.168.122.30 port 32852
Jan 22 16:51:38 compute-0 sshd-session[183375]: pam_unix(sshd:session): session closed for user zuul
Jan 22 16:51:38 compute-0 systemd[1]: session-26.scope: Deactivated successfully.
Jan 22 16:51:38 compute-0 systemd[1]: session-26.scope: Consumed 1min 59.115s CPU time.
Jan 22 16:51:38 compute-0 systemd-logind[796]: Session 26 logged out. Waiting for processes to exit.
Jan 22 16:51:38 compute-0 systemd-logind[796]: Removed session 26.
Jan 22 16:51:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:51:41.906 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:51:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:51:41.908 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:51:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:51:41.908 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:51:43 compute-0 podman[210800]: 2026-01-22 16:51:43.356574758 +0000 UTC m=+0.066515007 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 16:51:49 compute-0 podman[210825]: 2026-01-22 16:51:49.337785061 +0000 UTC m=+0.046837249 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 16:51:49 compute-0 podman[210824]: 2026-01-22 16:51:49.367425756 +0000 UTC m=+0.075265052 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 22 16:51:51 compute-0 nova_compute[183075]: 2026-01-22 16:51:51.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:51:52 compute-0 podman[210870]: 2026-01-22 16:51:52.365940091 +0000 UTC m=+0.073564748 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1755695350, container_name=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal, architecture=x86_64, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41, vcs-type=git)
Jan 22 16:51:52 compute-0 nova_compute[183075]: 2026-01-22 16:51:52.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:51:52 compute-0 nova_compute[183075]: 2026-01-22 16:51:52.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:51:52 compute-0 nova_compute[183075]: 2026-01-22 16:51:52.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:51:52 compute-0 nova_compute[183075]: 2026-01-22 16:51:52.789 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 16:51:52 compute-0 nova_compute[183075]: 2026-01-22 16:51:52.789 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:51:53 compute-0 nova_compute[183075]: 2026-01-22 16:51:53.214 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:51:53 compute-0 nova_compute[183075]: 2026-01-22 16:51:53.215 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:51:53 compute-0 nova_compute[183075]: 2026-01-22 16:51:53.215 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:51:53 compute-0 nova_compute[183075]: 2026-01-22 16:51:53.216 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 16:51:53 compute-0 nova_compute[183075]: 2026-01-22 16:51:53.420 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 16:51:53 compute-0 nova_compute[183075]: 2026-01-22 16:51:53.422 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5947MB free_disk=73.41587448120117GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 16:51:53 compute-0 nova_compute[183075]: 2026-01-22 16:51:53.422 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:51:53 compute-0 nova_compute[183075]: 2026-01-22 16:51:53.423 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:51:53 compute-0 nova_compute[183075]: 2026-01-22 16:51:53.557 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 16:51:53 compute-0 nova_compute[183075]: 2026-01-22 16:51:53.557 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 16:51:53 compute-0 nova_compute[183075]: 2026-01-22 16:51:53.584 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 16:51:53 compute-0 nova_compute[183075]: 2026-01-22 16:51:53.734 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 16:51:53 compute-0 nova_compute[183075]: 2026-01-22 16:51:53.735 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 16:51:53 compute-0 nova_compute[183075]: 2026-01-22 16:51:53.735 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.312s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:51:54 compute-0 nova_compute[183075]: 2026-01-22 16:51:54.734 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:51:54 compute-0 nova_compute[183075]: 2026-01-22 16:51:54.735 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 16:51:54 compute-0 nova_compute[183075]: 2026-01-22 16:51:54.735 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 16:51:55 compute-0 nova_compute[183075]: 2026-01-22 16:51:55.249 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 16:51:55 compute-0 nova_compute[183075]: 2026-01-22 16:51:55.250 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:51:55.441 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:51:55.441 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:51:55.441 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:51:55.442 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:51:55.442 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:51:55.442 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:51:55.442 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:51:55.442 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:51:55.442 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:51:55.442 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:51:55.443 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:51:55.443 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:51:55.443 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:51:55.443 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:51:55.443 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:51:55.443 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:51:55.443 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:51:55.443 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:51:55.443 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:51:55.444 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:51:55.444 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:51:55.444 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:51:55.444 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:51:55.444 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:51:55.444 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:51:55 compute-0 nova_compute[183075]: 2026-01-22 16:51:55.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:51:55 compute-0 nova_compute[183075]: 2026-01-22 16:51:55.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:51:56 compute-0 podman[210892]: 2026-01-22 16:51:56.334928687 +0000 UTC m=+0.050946055 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Jan 22 16:52:02 compute-0 podman[210912]: 2026-01-22 16:52:02.36226744 +0000 UTC m=+0.071373642 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 16:52:02 compute-0 rsyslogd[1006]: imjournal: 1531 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Jan 22 16:52:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:52:04.423 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 16:52:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:52:04.426 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 16:52:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:52:04.427 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 16:52:14 compute-0 podman[210936]: 2026-01-22 16:52:14.364744994 +0000 UTC m=+0.070777541 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 16:52:20 compute-0 podman[210961]: 2026-01-22 16:52:20.358729826 +0000 UTC m=+0.061326702 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 16:52:20 compute-0 podman[210960]: 2026-01-22 16:52:20.370895276 +0000 UTC m=+0.084683156 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, 
config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 22 16:52:23 compute-0 podman[211005]: 2026-01-22 16:52:23.398995622 +0000 UTC m=+0.104379974 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, name=ubi9-minimal, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.openshift.expose-services=, release=1755695350)
Jan 22 16:52:27 compute-0 podman[211027]: 2026-01-22 16:52:27.358096147 +0000 UTC m=+0.068651563 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 16:52:33 compute-0 podman[211047]: 2026-01-22 16:52:33.357165415 +0000 UTC m=+0.066478208 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 16:52:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:52:41.908 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:52:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:52:41.908 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:52:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:52:41.908 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:52:45 compute-0 podman[211073]: 2026-01-22 16:52:45.379645458 +0000 UTC m=+0.090135129 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 16:52:51 compute-0 podman[211100]: 2026-01-22 16:52:51.340992345 +0000 UTC m=+0.049216044 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 16:52:51 compute-0 podman[211099]: 2026-01-22 16:52:51.366441983 +0000 UTC m=+0.077068586 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 16:52:51 compute-0 nova_compute[183075]: 2026-01-22 16:52:51.789 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:52:52 compute-0 nova_compute[183075]: 2026-01-22 16:52:52.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:52:52 compute-0 nova_compute[183075]: 2026-01-22 16:52:52.789 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:52:52 compute-0 nova_compute[183075]: 2026-01-22 16:52:52.789 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:52:52 compute-0 nova_compute[183075]: 2026-01-22 16:52:52.789 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 16:52:52 compute-0 nova_compute[183075]: 2026-01-22 16:52:52.789 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:52:52 compute-0 nova_compute[183075]: 2026-01-22 16:52:52.823 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:52:52 compute-0 nova_compute[183075]: 2026-01-22 16:52:52.823 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:52:52 compute-0 nova_compute[183075]: 2026-01-22 16:52:52.823 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:52:52 compute-0 nova_compute[183075]: 2026-01-22 16:52:52.824 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 16:52:52 compute-0 nova_compute[183075]: 2026-01-22 16:52:52.996 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 16:52:52 compute-0 nova_compute[183075]: 2026-01-22 16:52:52.997 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6002MB free_disk=73.41587448120117GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 16:52:52 compute-0 nova_compute[183075]: 2026-01-22 16:52:52.997 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:52:52 compute-0 nova_compute[183075]: 2026-01-22 16:52:52.997 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:52:53 compute-0 nova_compute[183075]: 2026-01-22 16:52:53.070 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 16:52:53 compute-0 nova_compute[183075]: 2026-01-22 16:52:53.070 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 16:52:53 compute-0 nova_compute[183075]: 2026-01-22 16:52:53.096 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 16:52:53 compute-0 nova_compute[183075]: 2026-01-22 16:52:53.114 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 16:52:53 compute-0 nova_compute[183075]: 2026-01-22 16:52:53.116 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 16:52:53 compute-0 nova_compute[183075]: 2026-01-22 16:52:53.116 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:52:54 compute-0 podman[211143]: 2026-01-22 16:52:54.348054106 +0000 UTC m=+0.059344791 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, container_name=openstack_network_exporter, name=ubi9-minimal, io.openshift.expose-services=, config_id=openstack_network_exporter, release=1755695350, vcs-type=git)
Jan 22 16:52:56 compute-0 nova_compute[183075]: 2026-01-22 16:52:56.111 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:52:56 compute-0 nova_compute[183075]: 2026-01-22 16:52:56.112 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:52:56 compute-0 nova_compute[183075]: 2026-01-22 16:52:56.112 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 16:52:56 compute-0 nova_compute[183075]: 2026-01-22 16:52:56.112 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 16:52:56 compute-0 nova_compute[183075]: 2026-01-22 16:52:56.318 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 16:52:56 compute-0 nova_compute[183075]: 2026-01-22 16:52:56.319 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:52:56 compute-0 nova_compute[183075]: 2026-01-22 16:52:56.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:52:56 compute-0 nova_compute[183075]: 2026-01-22 16:52:56.807 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:52:58 compute-0 podman[211164]: 2026-01-22 16:52:58.355322318 +0000 UTC m=+0.068669915 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 16:53:04 compute-0 podman[211184]: 2026-01-22 16:53:04.362445746 +0000 UTC m=+0.075139696 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 16:53:16 compute-0 podman[211208]: 2026-01-22 16:53:16.368137037 +0000 UTC m=+0.083014476 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 16:53:22 compute-0 podman[211232]: 2026-01-22 16:53:22.340773933 +0000 UTC m=+0.049676533 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 22 16:53:22 compute-0 podman[211231]: 2026-01-22 16:53:22.379492307 +0000 UTC m=+0.091235322 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, config_id=ovn_controller)
Jan 22 16:53:25 compute-0 podman[211274]: 2026-01-22 16:53:25.370963531 +0000 UTC m=+0.088166620 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, managed_by=edpm_ansible, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 22 16:53:29 compute-0 podman[211295]: 2026-01-22 16:53:29.423958671 +0000 UTC m=+0.120102617 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute)
Jan 22 16:53:35 compute-0 podman[211317]: 2026-01-22 16:53:35.342733538 +0000 UTC m=+0.050880633 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 16:53:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:53:41.909 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:53:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:53:41.910 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:53:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:53:41.911 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:53:47 compute-0 podman[211341]: 2026-01-22 16:53:47.396816851 +0000 UTC m=+0.095939324 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 16:53:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:53:49.626 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 16:53:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:53:49.628 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 16:53:51 compute-0 nova_compute[183075]: 2026-01-22 16:53:51.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:53:51 compute-0 nova_compute[183075]: 2026-01-22 16:53:51.789 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:53:51 compute-0 nova_compute[183075]: 2026-01-22 16:53:51.790 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 16:53:51 compute-0 nova_compute[183075]: 2026-01-22 16:53:51.833 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 16:53:51 compute-0 nova_compute[183075]: 2026-01-22 16:53:51.834 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:53:51 compute-0 nova_compute[183075]: 2026-01-22 16:53:51.835 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 16:53:51 compute-0 nova_compute[183075]: 2026-01-22 16:53:51.854 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:53:52 compute-0 nova_compute[183075]: 2026-01-22 16:53:52.862 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:53:52 compute-0 nova_compute[183075]: 2026-01-22 16:53:52.863 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:53:52 compute-0 nova_compute[183075]: 2026-01-22 16:53:52.863 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 16:53:53 compute-0 podman[211366]: 2026-01-22 16:53:53.374565552 +0000 UTC m=+0.074290947 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 16:53:53 compute-0 podman[211365]: 2026-01-22 16:53:53.409473286 +0000 UTC m=+0.106790538 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 16:53:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:53:53.631 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 16:53:53 compute-0 nova_compute[183075]: 2026-01-22 16:53:53.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:53:53 compute-0 nova_compute[183075]: 2026-01-22 16:53:53.789 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:53:53 compute-0 nova_compute[183075]: 2026-01-22 16:53:53.813 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:53:53 compute-0 nova_compute[183075]: 2026-01-22 16:53:53.813 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:53:53 compute-0 nova_compute[183075]: 2026-01-22 16:53:53.814 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:53:53 compute-0 nova_compute[183075]: 2026-01-22 16:53:53.814 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 16:53:53 compute-0 nova_compute[183075]: 2026-01-22 16:53:53.968 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 16:53:53 compute-0 nova_compute[183075]: 2026-01-22 16:53:53.970 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6049MB free_disk=73.41586303710938GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 16:53:53 compute-0 nova_compute[183075]: 2026-01-22 16:53:53.970 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:53:53 compute-0 nova_compute[183075]: 2026-01-22 16:53:53.970 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:53:54 compute-0 nova_compute[183075]: 2026-01-22 16:53:54.205 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 16:53:54 compute-0 nova_compute[183075]: 2026-01-22 16:53:54.205 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 16:53:54 compute-0 nova_compute[183075]: 2026-01-22 16:53:54.261 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Refreshing inventories for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 16:53:54 compute-0 nova_compute[183075]: 2026-01-22 16:53:54.320 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Updating ProviderTree inventory for provider 2513134c-f67c-4237-84bf-4ebe2450d610 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 16:53:54 compute-0 nova_compute[183075]: 2026-01-22 16:53:54.320 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Updating inventory in ProviderTree for provider 2513134c-f67c-4237-84bf-4ebe2450d610 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 16:53:54 compute-0 nova_compute[183075]: 2026-01-22 16:53:54.344 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Refreshing aggregate associations for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 16:53:54 compute-0 nova_compute[183075]: 2026-01-22 16:53:54.371 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Refreshing trait associations for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610, traits: COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SVM,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE42,COMPUTE_NODE,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE4A,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_FMA3,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_AUTO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 16:53:54 compute-0 nova_compute[183075]: 2026-01-22 16:53:54.393 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 16:53:54 compute-0 nova_compute[183075]: 2026-01-22 16:53:54.412 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 16:53:54 compute-0 nova_compute[183075]: 2026-01-22 16:53:54.415 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 16:53:54 compute-0 nova_compute[183075]: 2026-01-22 16:53:54.416 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.446s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:53:55.444 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:53:55.444 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:53:55.445 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:53:55.445 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:53:55.445 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:53:55.445 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:53:55.446 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:53:55.446 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:53:55.446 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:53:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:53:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:53:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:53:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:53:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:53:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:53:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:53:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:53:55.449 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:53:55.449 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:53:55.449 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:53:55.449 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:53:55.450 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:53:55.450 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:53:55.450 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:53:55.450 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:53:56 compute-0 podman[211407]: 2026-01-22 16:53:56.393544736 +0000 UTC m=+0.100391220 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, distribution-scope=public, release=1755695350, vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, version=9.6, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vendor=Red Hat, Inc.)
Jan 22 16:53:56 compute-0 nova_compute[183075]: 2026-01-22 16:53:56.411 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:53:56 compute-0 nova_compute[183075]: 2026-01-22 16:53:56.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:53:56 compute-0 nova_compute[183075]: 2026-01-22 16:53:56.787 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 16:53:56 compute-0 nova_compute[183075]: 2026-01-22 16:53:56.787 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 16:53:56 compute-0 nova_compute[183075]: 2026-01-22 16:53:56.800 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 16:53:57 compute-0 nova_compute[183075]: 2026-01-22 16:53:57.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:53:57 compute-0 nova_compute[183075]: 2026-01-22 16:53:57.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:54:00 compute-0 podman[211428]: 2026-01-22 16:54:00.399081192 +0000 UTC m=+0.106254293 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 16:54:06 compute-0 podman[211448]: 2026-01-22 16:54:06.343538621 +0000 UTC m=+0.052183418 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 16:54:16 compute-0 nova_compute[183075]: 2026-01-22 16:54:16.514 183079 DEBUG oslo_concurrency.lockutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Acquiring lock "243d8a1b-180e-4b78-88fb-2d08cb598b7d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:54:16 compute-0 nova_compute[183075]: 2026-01-22 16:54:16.515 183079 DEBUG oslo_concurrency.lockutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Lock "243d8a1b-180e-4b78-88fb-2d08cb598b7d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:54:16 compute-0 nova_compute[183075]: 2026-01-22 16:54:16.542 183079 DEBUG nova.compute.manager [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 16:54:16 compute-0 nova_compute[183075]: 2026-01-22 16:54:16.667 183079 DEBUG oslo_concurrency.lockutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:54:16 compute-0 nova_compute[183075]: 2026-01-22 16:54:16.668 183079 DEBUG oslo_concurrency.lockutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:54:16 compute-0 nova_compute[183075]: 2026-01-22 16:54:16.676 183079 DEBUG nova.virt.hardware [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 16:54:16 compute-0 nova_compute[183075]: 2026-01-22 16:54:16.677 183079 INFO nova.compute.claims [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Claim successful on node compute-0.ctlplane.example.com
Jan 22 16:54:16 compute-0 nova_compute[183075]: 2026-01-22 16:54:16.774 183079 DEBUG nova.compute.provider_tree [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 16:54:16 compute-0 nova_compute[183075]: 2026-01-22 16:54:16.787 183079 DEBUG nova.scheduler.client.report [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 16:54:16 compute-0 nova_compute[183075]: 2026-01-22 16:54:16.815 183079 DEBUG oslo_concurrency.lockutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:54:16 compute-0 nova_compute[183075]: 2026-01-22 16:54:16.816 183079 DEBUG nova.compute.manager [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 16:54:16 compute-0 nova_compute[183075]: 2026-01-22 16:54:16.863 183079 DEBUG nova.compute.manager [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 16:54:16 compute-0 nova_compute[183075]: 2026-01-22 16:54:16.864 183079 DEBUG nova.network.neutron [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 16:54:16 compute-0 nova_compute[183075]: 2026-01-22 16:54:16.893 183079 INFO nova.virt.libvirt.driver [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 16:54:16 compute-0 nova_compute[183075]: 2026-01-22 16:54:16.917 183079 DEBUG nova.compute.manager [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 16:54:17 compute-0 nova_compute[183075]: 2026-01-22 16:54:17.073 183079 DEBUG nova.compute.manager [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 16:54:17 compute-0 nova_compute[183075]: 2026-01-22 16:54:17.075 183079 DEBUG nova.virt.libvirt.driver [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 16:54:17 compute-0 nova_compute[183075]: 2026-01-22 16:54:17.075 183079 INFO nova.virt.libvirt.driver [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Creating image(s)
Jan 22 16:54:17 compute-0 nova_compute[183075]: 2026-01-22 16:54:17.076 183079 DEBUG oslo_concurrency.lockutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Acquiring lock "/var/lib/nova/instances/243d8a1b-180e-4b78-88fb-2d08cb598b7d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:54:17 compute-0 nova_compute[183075]: 2026-01-22 16:54:17.077 183079 DEBUG oslo_concurrency.lockutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Lock "/var/lib/nova/instances/243d8a1b-180e-4b78-88fb-2d08cb598b7d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:54:17 compute-0 nova_compute[183075]: 2026-01-22 16:54:17.077 183079 DEBUG oslo_concurrency.lockutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Lock "/var/lib/nova/instances/243d8a1b-180e-4b78-88fb-2d08cb598b7d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:54:17 compute-0 nova_compute[183075]: 2026-01-22 16:54:17.078 183079 DEBUG oslo_concurrency.lockutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:54:17 compute-0 nova_compute[183075]: 2026-01-22 16:54:17.079 183079 DEBUG oslo_concurrency.lockutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:54:17 compute-0 nova_compute[183075]: 2026-01-22 16:54:17.493 183079 WARNING oslo_policy.policy [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Jan 22 16:54:17 compute-0 nova_compute[183075]: 2026-01-22 16:54:17.493 183079 WARNING oslo_policy.policy [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Jan 22 16:54:17 compute-0 nova_compute[183075]: 2026-01-22 16:54:17.497 183079 DEBUG nova.policy [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fd904092c61441ffad7349f369ab599f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '73fb0e8a2818489ba429a79567981fbc', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 16:54:18 compute-0 podman[211471]: 2026-01-22 16:54:18.363114031 +0000 UTC m=+0.063400291 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 16:54:18 compute-0 nova_compute[183075]: 2026-01-22 16:54:18.921 183079 DEBUG nova.network.neutron [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Successfully created port: 2eabf26e-55cc-43cb-9ad5-ab8259ca50c0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 16:54:19 compute-0 nova_compute[183075]: 2026-01-22 16:54:19.093 183079 DEBUG oslo_concurrency.processutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 16:54:19 compute-0 nova_compute[183075]: 2026-01-22 16:54:19.146 183079 DEBUG oslo_concurrency.processutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218.part --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 16:54:19 compute-0 nova_compute[183075]: 2026-01-22 16:54:19.147 183079 DEBUG nova.virt.images [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] e1b65bbe-5c14-4552-a5d9-d275c9dd42d3 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Jan 22 16:54:19 compute-0 nova_compute[183075]: 2026-01-22 16:54:19.148 183079 DEBUG nova.privsep.utils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 22 16:54:19 compute-0 nova_compute[183075]: 2026-01-22 16:54:19.149 183079 DEBUG oslo_concurrency.processutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218.part /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 16:54:19 compute-0 nova_compute[183075]: 2026-01-22 16:54:19.348 183079 DEBUG oslo_concurrency.processutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218.part /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218.converted" returned: 0 in 0.200s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 16:54:19 compute-0 nova_compute[183075]: 2026-01-22 16:54:19.359 183079 DEBUG oslo_concurrency.processutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 16:54:19 compute-0 nova_compute[183075]: 2026-01-22 16:54:19.445 183079 DEBUG oslo_concurrency.processutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218.converted --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 16:54:19 compute-0 nova_compute[183075]: 2026-01-22 16:54:19.447 183079 DEBUG oslo_concurrency.lockutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.369s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:54:19 compute-0 nova_compute[183075]: 2026-01-22 16:54:19.473 183079 INFO oslo.privsep.daemon [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmp34r2vfut/privsep.sock']
Jan 22 16:54:20 compute-0 nova_compute[183075]: 2026-01-22 16:54:20.170 183079 INFO oslo.privsep.daemon [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Spawned new privsep daemon via rootwrap
Jan 22 16:54:20 compute-0 nova_compute[183075]: 2026-01-22 16:54:20.036 211515 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 22 16:54:20 compute-0 nova_compute[183075]: 2026-01-22 16:54:20.044 211515 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 22 16:54:20 compute-0 nova_compute[183075]: 2026-01-22 16:54:20.048 211515 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Jan 22 16:54:20 compute-0 nova_compute[183075]: 2026-01-22 16:54:20.048 211515 INFO oslo.privsep.daemon [-] privsep daemon running as pid 211515
Jan 22 16:54:20 compute-0 nova_compute[183075]: 2026-01-22 16:54:20.263 183079 DEBUG oslo_concurrency.processutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 16:54:20 compute-0 nova_compute[183075]: 2026-01-22 16:54:20.318 183079 DEBUG oslo_concurrency.processutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 16:54:20 compute-0 nova_compute[183075]: 2026-01-22 16:54:20.320 183079 DEBUG oslo_concurrency.lockutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:54:20 compute-0 nova_compute[183075]: 2026-01-22 16:54:20.320 183079 DEBUG oslo_concurrency.lockutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:54:20 compute-0 nova_compute[183075]: 2026-01-22 16:54:20.335 183079 DEBUG oslo_concurrency.processutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 16:54:20 compute-0 nova_compute[183075]: 2026-01-22 16:54:20.391 183079 DEBUG oslo_concurrency.processutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 16:54:20 compute-0 nova_compute[183075]: 2026-01-22 16:54:20.392 183079 DEBUG oslo_concurrency.processutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/243d8a1b-180e-4b78-88fb-2d08cb598b7d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 16:54:20 compute-0 nova_compute[183075]: 2026-01-22 16:54:20.436 183079 DEBUG oslo_concurrency.processutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/243d8a1b-180e-4b78-88fb-2d08cb598b7d/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 16:54:20 compute-0 nova_compute[183075]: 2026-01-22 16:54:20.438 183079 DEBUG oslo_concurrency.lockutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:54:20 compute-0 nova_compute[183075]: 2026-01-22 16:54:20.440 183079 DEBUG oslo_concurrency.processutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 16:54:20 compute-0 nova_compute[183075]: 2026-01-22 16:54:20.497 183079 DEBUG oslo_concurrency.processutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 16:54:20 compute-0 nova_compute[183075]: 2026-01-22 16:54:20.499 183079 DEBUG nova.virt.disk.api [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Checking if we can resize image /var/lib/nova/instances/243d8a1b-180e-4b78-88fb-2d08cb598b7d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 16:54:20 compute-0 nova_compute[183075]: 2026-01-22 16:54:20.500 183079 DEBUG oslo_concurrency.processutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/243d8a1b-180e-4b78-88fb-2d08cb598b7d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 16:54:20 compute-0 nova_compute[183075]: 2026-01-22 16:54:20.554 183079 DEBUG oslo_concurrency.processutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/243d8a1b-180e-4b78-88fb-2d08cb598b7d/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 16:54:20 compute-0 nova_compute[183075]: 2026-01-22 16:54:20.556 183079 DEBUG nova.virt.disk.api [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Cannot resize image /var/lib/nova/instances/243d8a1b-180e-4b78-88fb-2d08cb598b7d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 16:54:20 compute-0 nova_compute[183075]: 2026-01-22 16:54:20.556 183079 DEBUG nova.objects.instance [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Lazy-loading 'migration_context' on Instance uuid 243d8a1b-180e-4b78-88fb-2d08cb598b7d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 16:54:20 compute-0 nova_compute[183075]: 2026-01-22 16:54:20.589 183079 DEBUG nova.virt.libvirt.driver [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 16:54:20 compute-0 nova_compute[183075]: 2026-01-22 16:54:20.590 183079 DEBUG nova.virt.libvirt.driver [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Ensure instance console log exists: /var/lib/nova/instances/243d8a1b-180e-4b78-88fb-2d08cb598b7d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 16:54:20 compute-0 nova_compute[183075]: 2026-01-22 16:54:20.591 183079 DEBUG oslo_concurrency.lockutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:54:20 compute-0 nova_compute[183075]: 2026-01-22 16:54:20.592 183079 DEBUG oslo_concurrency.lockutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:54:20 compute-0 nova_compute[183075]: 2026-01-22 16:54:20.592 183079 DEBUG oslo_concurrency.lockutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:54:20 compute-0 nova_compute[183075]: 2026-01-22 16:54:20.974 183079 DEBUG nova.network.neutron [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Successfully updated port: 2eabf26e-55cc-43cb-9ad5-ab8259ca50c0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 16:54:20 compute-0 nova_compute[183075]: 2026-01-22 16:54:20.992 183079 DEBUG oslo_concurrency.lockutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Acquiring lock "refresh_cache-243d8a1b-180e-4b78-88fb-2d08cb598b7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 16:54:20 compute-0 nova_compute[183075]: 2026-01-22 16:54:20.992 183079 DEBUG oslo_concurrency.lockutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Acquired lock "refresh_cache-243d8a1b-180e-4b78-88fb-2d08cb598b7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 16:54:20 compute-0 nova_compute[183075]: 2026-01-22 16:54:20.993 183079 DEBUG nova.network.neutron [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 16:54:21 compute-0 nova_compute[183075]: 2026-01-22 16:54:21.357 183079 DEBUG nova.network.neutron [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 16:54:21 compute-0 nova_compute[183075]: 2026-01-22 16:54:21.500 183079 DEBUG nova.compute.manager [req-92e00d1b-40d0-4cfd-a0e1-c608baf5892e req-0f125052-00f6-43a8-ba24-3529d0742566 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Received event network-changed-2eabf26e-55cc-43cb-9ad5-ab8259ca50c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 16:54:21 compute-0 nova_compute[183075]: 2026-01-22 16:54:21.501 183079 DEBUG nova.compute.manager [req-92e00d1b-40d0-4cfd-a0e1-c608baf5892e req-0f125052-00f6-43a8-ba24-3529d0742566 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Refreshing instance network info cache due to event network-changed-2eabf26e-55cc-43cb-9ad5-ab8259ca50c0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 16:54:21 compute-0 nova_compute[183075]: 2026-01-22 16:54:21.501 183079 DEBUG oslo_concurrency.lockutils [req-92e00d1b-40d0-4cfd-a0e1-c608baf5892e req-0f125052-00f6-43a8-ba24-3529d0742566 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-243d8a1b-180e-4b78-88fb-2d08cb598b7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.404 183079 DEBUG nova.network.neutron [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Updating instance_info_cache with network_info: [{"id": "2eabf26e-55cc-43cb-9ad5-ab8259ca50c0", "address": "fa:16:3e:f9:0c:52", "network": {"id": "1abfbb44-50a3-4820-9c53-e621ffff9719", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1873150826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73fb0e8a2818489ba429a79567981fbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eabf26e-55", "ovs_interfaceid": "2eabf26e-55cc-43cb-9ad5-ab8259ca50c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.427 183079 DEBUG oslo_concurrency.lockutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Releasing lock "refresh_cache-243d8a1b-180e-4b78-88fb-2d08cb598b7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.428 183079 DEBUG nova.compute.manager [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Instance network_info: |[{"id": "2eabf26e-55cc-43cb-9ad5-ab8259ca50c0", "address": "fa:16:3e:f9:0c:52", "network": {"id": "1abfbb44-50a3-4820-9c53-e621ffff9719", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1873150826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73fb0e8a2818489ba429a79567981fbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eabf26e-55", "ovs_interfaceid": "2eabf26e-55cc-43cb-9ad5-ab8259ca50c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.429 183079 DEBUG oslo_concurrency.lockutils [req-92e00d1b-40d0-4cfd-a0e1-c608baf5892e req-0f125052-00f6-43a8-ba24-3529d0742566 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-243d8a1b-180e-4b78-88fb-2d08cb598b7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.429 183079 DEBUG nova.network.neutron [req-92e00d1b-40d0-4cfd-a0e1-c608baf5892e req-0f125052-00f6-43a8-ba24-3529d0742566 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Refreshing network info cache for port 2eabf26e-55cc-43cb-9ad5-ab8259ca50c0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.432 183079 DEBUG nova.virt.libvirt.driver [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Start _get_guest_xml network_info=[{"id": "2eabf26e-55cc-43cb-9ad5-ab8259ca50c0", "address": "fa:16:3e:f9:0c:52", "network": {"id": "1abfbb44-50a3-4820-9c53-e621ffff9719", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1873150826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73fb0e8a2818489ba429a79567981fbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eabf26e-55", "ovs_interfaceid": "2eabf26e-55cc-43cb-9ad5-ab8259ca50c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.440 183079 WARNING nova.virt.libvirt.driver [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.448 183079 DEBUG nova.virt.libvirt.host [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.448 183079 DEBUG nova.virt.libvirt.host [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.453 183079 DEBUG nova.virt.libvirt.host [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.454 183079 DEBUG nova.virt.libvirt.host [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.454 183079 DEBUG nova.virt.libvirt.driver [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.455 183079 DEBUG nova.virt.hardware [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.455 183079 DEBUG nova.virt.hardware [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.455 183079 DEBUG nova.virt.hardware [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.456 183079 DEBUG nova.virt.hardware [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.456 183079 DEBUG nova.virt.hardware [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.456 183079 DEBUG nova.virt.hardware [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.456 183079 DEBUG nova.virt.hardware [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.457 183079 DEBUG nova.virt.hardware [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.457 183079 DEBUG nova.virt.hardware [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.457 183079 DEBUG nova.virt.hardware [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.458 183079 DEBUG nova.virt.hardware [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.462 183079 DEBUG nova.privsep.utils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.463 183079 DEBUG nova.virt.libvirt.vif [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T16:54:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-2117019190',display_name='tempest-TestServerBasicOps-server-2117019190',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testserverbasicops-server-2117019190',id=1,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPbKBuCyR9cLZReK75enUrgE5KJvQeyP50prZDIEvZ5uXx6sRIVFnoPiHx8vYeR8tCZ4GuJvkv4q3s9I1caAJMaIZPnsmsfmHIg0OmJrn8S63wqPWPNKX90mr7GILLt7UQ==',key_name='tempest-TestServerBasicOps-958706236',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='73fb0e8a2818489ba429a79567981fbc',ramdisk_id='',reservation_id='r-5em908ya',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerBasicOps-1562817894',owner_user_name='tempest-TestServerBasicOps-1562817894-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T16:54:16Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fd904092c61441ffad7349f369ab599f',uuid=243d8a1b-180e-4b78-88fb-2d08cb598b7d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2eabf26e-55cc-43cb-9ad5-ab8259ca50c0", "address": "fa:16:3e:f9:0c:52", "network": {"id": "1abfbb44-50a3-4820-9c53-e621ffff9719", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1873150826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73fb0e8a2818489ba429a79567981fbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eabf26e-55", "ovs_interfaceid": "2eabf26e-55cc-43cb-9ad5-ab8259ca50c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.463 183079 DEBUG nova.network.os_vif_util [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Converting VIF {"id": "2eabf26e-55cc-43cb-9ad5-ab8259ca50c0", "address": "fa:16:3e:f9:0c:52", "network": {"id": "1abfbb44-50a3-4820-9c53-e621ffff9719", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1873150826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73fb0e8a2818489ba429a79567981fbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eabf26e-55", "ovs_interfaceid": "2eabf26e-55cc-43cb-9ad5-ab8259ca50c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.464 183079 DEBUG nova.network.os_vif_util [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:0c:52,bridge_name='br-int',has_traffic_filtering=True,id=2eabf26e-55cc-43cb-9ad5-ab8259ca50c0,network=Network(1abfbb44-50a3-4820-9c53-e621ffff9719),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2eabf26e-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.465 183079 DEBUG nova.objects.instance [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Lazy-loading 'pci_devices' on Instance uuid 243d8a1b-180e-4b78-88fb-2d08cb598b7d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.481 183079 DEBUG nova.virt.libvirt.driver [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] End _get_guest_xml xml=<domain type="kvm">
Jan 22 16:54:22 compute-0 nova_compute[183075]:   <uuid>243d8a1b-180e-4b78-88fb-2d08cb598b7d</uuid>
Jan 22 16:54:22 compute-0 nova_compute[183075]:   <name>instance-00000001</name>
Jan 22 16:54:22 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 16:54:22 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 16:54:22 compute-0 nova_compute[183075]:   <metadata>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 16:54:22 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:       <nova:name>tempest-TestServerBasicOps-server-2117019190</nova:name>
Jan 22 16:54:22 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 16:54:22</nova:creationTime>
Jan 22 16:54:22 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 16:54:22 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 16:54:22 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 16:54:22 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 16:54:22 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 16:54:22 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 16:54:22 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 16:54:22 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 16:54:22 compute-0 nova_compute[183075]:         <nova:user uuid="fd904092c61441ffad7349f369ab599f">tempest-TestServerBasicOps-1562817894-project-member</nova:user>
Jan 22 16:54:22 compute-0 nova_compute[183075]:         <nova:project uuid="73fb0e8a2818489ba429a79567981fbc">tempest-TestServerBasicOps-1562817894</nova:project>
Jan 22 16:54:22 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 16:54:22 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 16:54:22 compute-0 nova_compute[183075]:         <nova:port uuid="2eabf26e-55cc-43cb-9ad5-ab8259ca50c0">
Jan 22 16:54:22 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 16:54:22 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 16:54:22 compute-0 nova_compute[183075]:   </metadata>
Jan 22 16:54:22 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <system>
Jan 22 16:54:22 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 16:54:22 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 16:54:22 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 16:54:22 compute-0 nova_compute[183075]:       <entry name="serial">243d8a1b-180e-4b78-88fb-2d08cb598b7d</entry>
Jan 22 16:54:22 compute-0 nova_compute[183075]:       <entry name="uuid">243d8a1b-180e-4b78-88fb-2d08cb598b7d</entry>
Jan 22 16:54:22 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     </system>
Jan 22 16:54:22 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 16:54:22 compute-0 nova_compute[183075]:   <os>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:   </os>
Jan 22 16:54:22 compute-0 nova_compute[183075]:   <features>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <apic/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:   </features>
Jan 22 16:54:22 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:   </clock>
Jan 22 16:54:22 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:   </cpu>
Jan 22 16:54:22 compute-0 nova_compute[183075]:   <devices>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 16:54:22 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/243d8a1b-180e-4b78-88fb-2d08cb598b7d/disk"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     </disk>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <disk type="file" device="cdrom">
Jan 22 16:54:22 compute-0 nova_compute[183075]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/243d8a1b-180e-4b78-88fb-2d08cb598b7d/disk.config"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:       <target dev="sda" bus="sata"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     </disk>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 16:54:22 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:f9:0c:52"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:       <target dev="tap2eabf26e-55"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     </interface>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 16:54:22 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/243d8a1b-180e-4b78-88fb-2d08cb598b7d/console.log" append="off"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     </serial>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <video>
Jan 22 16:54:22 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     </video>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 16:54:22 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     </rng>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 16:54:22 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 16:54:22 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 16:54:22 compute-0 nova_compute[183075]:   </devices>
Jan 22 16:54:22 compute-0 nova_compute[183075]: </domain>
Jan 22 16:54:22 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.483 183079 DEBUG nova.compute.manager [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Preparing to wait for external event network-vif-plugged-2eabf26e-55cc-43cb-9ad5-ab8259ca50c0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.483 183079 DEBUG oslo_concurrency.lockutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Acquiring lock "243d8a1b-180e-4b78-88fb-2d08cb598b7d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.485 183079 DEBUG oslo_concurrency.lockutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Lock "243d8a1b-180e-4b78-88fb-2d08cb598b7d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.485 183079 DEBUG oslo_concurrency.lockutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Lock "243d8a1b-180e-4b78-88fb-2d08cb598b7d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.485 183079 DEBUG nova.virt.libvirt.vif [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T16:54:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-2117019190',display_name='tempest-TestServerBasicOps-server-2117019190',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testserverbasicops-server-2117019190',id=1,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPbKBuCyR9cLZReK75enUrgE5KJvQeyP50prZDIEvZ5uXx6sRIVFnoPiHx8vYeR8tCZ4GuJvkv4q3s9I1caAJMaIZPnsmsfmHIg0OmJrn8S63wqPWPNKX90mr7GILLt7UQ==',key_name='tempest-TestServerBasicOps-958706236',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='73fb0e8a2818489ba429a79567981fbc',ramdisk_id='',reservation_id='r-5em908ya',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerBasicOps-1562817894',owner_user_name='tempest-TestServerBasicOps-1562817894-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T16:54:16Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fd904092c61441ffad7349f369ab599f',uuid=243d8a1b-180e-4b78-88fb-2d08cb598b7d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2eabf26e-55cc-43cb-9ad5-ab8259ca50c0", "address": "fa:16:3e:f9:0c:52", "network": {"id": "1abfbb44-50a3-4820-9c53-e621ffff9719", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1873150826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73fb0e8a2818489ba429a79567981fbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eabf26e-55", "ovs_interfaceid": "2eabf26e-55cc-43cb-9ad5-ab8259ca50c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.486 183079 DEBUG nova.network.os_vif_util [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Converting VIF {"id": "2eabf26e-55cc-43cb-9ad5-ab8259ca50c0", "address": "fa:16:3e:f9:0c:52", "network": {"id": "1abfbb44-50a3-4820-9c53-e621ffff9719", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1873150826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73fb0e8a2818489ba429a79567981fbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eabf26e-55", "ovs_interfaceid": "2eabf26e-55cc-43cb-9ad5-ab8259ca50c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.486 183079 DEBUG nova.network.os_vif_util [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:0c:52,bridge_name='br-int',has_traffic_filtering=True,id=2eabf26e-55cc-43cb-9ad5-ab8259ca50c0,network=Network(1abfbb44-50a3-4820-9c53-e621ffff9719),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2eabf26e-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.486 183079 DEBUG os_vif [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:0c:52,bridge_name='br-int',has_traffic_filtering=True,id=2eabf26e-55cc-43cb-9ad5-ab8259ca50c0,network=Network(1abfbb44-50a3-4820-9c53-e621ffff9719),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2eabf26e-55') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.553 183079 DEBUG ovsdbapp.backend.ovs_idl [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.553 183079 DEBUG ovsdbapp.backend.ovs_idl [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.553 183079 DEBUG ovsdbapp.backend.ovs_idl [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.554 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.554 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [POLLOUT] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.554 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.554 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.556 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.557 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.567 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.567 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.567 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 16:54:22 compute-0 nova_compute[183075]: 2026-01-22 16:54:22.568 183079 INFO oslo.privsep.daemon [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpnz40kegb/privsep.sock']
Jan 22 16:54:23 compute-0 nova_compute[183075]: 2026-01-22 16:54:23.210 183079 INFO oslo.privsep.daemon [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Spawned new privsep daemon via rootwrap
Jan 22 16:54:23 compute-0 nova_compute[183075]: 2026-01-22 16:54:23.112 211536 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 22 16:54:23 compute-0 nova_compute[183075]: 2026-01-22 16:54:23.116 211536 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 22 16:54:23 compute-0 nova_compute[183075]: 2026-01-22 16:54:23.118 211536 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Jan 22 16:54:23 compute-0 nova_compute[183075]: 2026-01-22 16:54:23.118 211536 INFO oslo.privsep.daemon [-] privsep daemon running as pid 211536
Jan 22 16:54:23 compute-0 nova_compute[183075]: 2026-01-22 16:54:23.627 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:23 compute-0 nova_compute[183075]: 2026-01-22 16:54:23.628 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2eabf26e-55, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 16:54:23 compute-0 nova_compute[183075]: 2026-01-22 16:54:23.629 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2eabf26e-55, col_values=(('external_ids', {'iface-id': '2eabf26e-55cc-43cb-9ad5-ab8259ca50c0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f9:0c:52', 'vm-uuid': '243d8a1b-180e-4b78-88fb-2d08cb598b7d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 16:54:23 compute-0 nova_compute[183075]: 2026-01-22 16:54:23.632 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:23 compute-0 NetworkManager[55454]: <info>  [1769100863.6333] manager: (tap2eabf26e-55): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Jan 22 16:54:23 compute-0 nova_compute[183075]: 2026-01-22 16:54:23.636 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 16:54:23 compute-0 nova_compute[183075]: 2026-01-22 16:54:23.642 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:23 compute-0 nova_compute[183075]: 2026-01-22 16:54:23.643 183079 INFO os_vif [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:0c:52,bridge_name='br-int',has_traffic_filtering=True,id=2eabf26e-55cc-43cb-9ad5-ab8259ca50c0,network=Network(1abfbb44-50a3-4820-9c53-e621ffff9719),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2eabf26e-55')
Jan 22 16:54:23 compute-0 nova_compute[183075]: 2026-01-22 16:54:23.708 183079 DEBUG nova.virt.libvirt.driver [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 16:54:23 compute-0 nova_compute[183075]: 2026-01-22 16:54:23.709 183079 DEBUG nova.virt.libvirt.driver [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 16:54:23 compute-0 nova_compute[183075]: 2026-01-22 16:54:23.709 183079 DEBUG nova.virt.libvirt.driver [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] No VIF found with MAC fa:16:3e:f9:0c:52, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 16:54:23 compute-0 nova_compute[183075]: 2026-01-22 16:54:23.710 183079 INFO nova.virt.libvirt.driver [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Using config drive
Jan 22 16:54:24 compute-0 nova_compute[183075]: 2026-01-22 16:54:24.280 183079 DEBUG nova.network.neutron [req-92e00d1b-40d0-4cfd-a0e1-c608baf5892e req-0f125052-00f6-43a8-ba24-3529d0742566 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Updated VIF entry in instance network info cache for port 2eabf26e-55cc-43cb-9ad5-ab8259ca50c0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 16:54:24 compute-0 nova_compute[183075]: 2026-01-22 16:54:24.280 183079 DEBUG nova.network.neutron [req-92e00d1b-40d0-4cfd-a0e1-c608baf5892e req-0f125052-00f6-43a8-ba24-3529d0742566 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Updating instance_info_cache with network_info: [{"id": "2eabf26e-55cc-43cb-9ad5-ab8259ca50c0", "address": "fa:16:3e:f9:0c:52", "network": {"id": "1abfbb44-50a3-4820-9c53-e621ffff9719", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1873150826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73fb0e8a2818489ba429a79567981fbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eabf26e-55", "ovs_interfaceid": "2eabf26e-55cc-43cb-9ad5-ab8259ca50c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 16:54:24 compute-0 nova_compute[183075]: 2026-01-22 16:54:24.315 183079 DEBUG oslo_concurrency.lockutils [req-92e00d1b-40d0-4cfd-a0e1-c608baf5892e req-0f125052-00f6-43a8-ba24-3529d0742566 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-243d8a1b-180e-4b78-88fb-2d08cb598b7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 16:54:24 compute-0 podman[211542]: 2026-01-22 16:54:24.37139516 +0000 UTC m=+0.084904576 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 22 16:54:24 compute-0 podman[211543]: 2026-01-22 16:54:24.378899791 +0000 UTC m=+0.086893447 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 16:54:24 compute-0 nova_compute[183075]: 2026-01-22 16:54:24.524 183079 INFO nova.virt.libvirt.driver [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Creating config drive at /var/lib/nova/instances/243d8a1b-180e-4b78-88fb-2d08cb598b7d/disk.config
Jan 22 16:54:24 compute-0 nova_compute[183075]: 2026-01-22 16:54:24.530 183079 DEBUG oslo_concurrency.processutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/243d8a1b-180e-4b78-88fb-2d08cb598b7d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3698qiia execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 16:54:24 compute-0 nova_compute[183075]: 2026-01-22 16:54:24.651 183079 DEBUG oslo_concurrency.processutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/243d8a1b-180e-4b78-88fb-2d08cb598b7d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3698qiia" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 16:54:24 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Jan 22 16:54:24 compute-0 kernel: tap2eabf26e-55: entered promiscuous mode
Jan 22 16:54:24 compute-0 NetworkManager[55454]: <info>  [1769100864.7194] manager: (tap2eabf26e-55): new Tun device (/org/freedesktop/NetworkManager/Devices/20)
Jan 22 16:54:24 compute-0 nova_compute[183075]: 2026-01-22 16:54:24.748 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:24 compute-0 ovn_controller[95372]: 2026-01-22T16:54:24Z|00027|binding|INFO|Claiming lport 2eabf26e-55cc-43cb-9ad5-ab8259ca50c0 for this chassis.
Jan 22 16:54:24 compute-0 ovn_controller[95372]: 2026-01-22T16:54:24Z|00028|binding|INFO|2eabf26e-55cc-43cb-9ad5-ab8259ca50c0: Claiming fa:16:3e:f9:0c:52 10.100.0.6
Jan 22 16:54:24 compute-0 nova_compute[183075]: 2026-01-22 16:54:24.750 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:24.764 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:0c:52 10.100.0.6'], port_security=['fa:16:3e:f9:0c:52 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '243d8a1b-180e-4b78-88fb-2d08cb598b7d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1abfbb44-50a3-4820-9c53-e621ffff9719', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '73fb0e8a2818489ba429a79567981fbc', 'neutron:revision_number': '2', 'neutron:security_group_ids': '660dc581-3a5d-4927-8f93-cfdf57d0a611 77ddaf26-0aa8-4b5d-8fe9-8005fa222865', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9434a937-0ff9-43b0-a27f-0f94550ab0a0, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=2eabf26e-55cc-43cb-9ad5-ab8259ca50c0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 16:54:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:24.766 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 2eabf26e-55cc-43cb-9ad5-ab8259ca50c0 in datapath 1abfbb44-50a3-4820-9c53-e621ffff9719 bound to our chassis
Jan 22 16:54:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:24.769 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1abfbb44-50a3-4820-9c53-e621ffff9719
Jan 22 16:54:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:24.770 104629 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmp3fziir8z/privsep.sock']
Jan 22 16:54:24 compute-0 systemd-udevd[211605]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 16:54:24 compute-0 NetworkManager[55454]: <info>  [1769100864.7989] device (tap2eabf26e-55): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 16:54:24 compute-0 NetworkManager[55454]: <info>  [1769100864.7993] device (tap2eabf26e-55): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 16:54:24 compute-0 systemd-machined[154382]: New machine qemu-1-instance-00000001.
Jan 22 16:54:24 compute-0 nova_compute[183075]: 2026-01-22 16:54:24.839 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:24 compute-0 ovn_controller[95372]: 2026-01-22T16:54:24Z|00029|binding|INFO|Setting lport 2eabf26e-55cc-43cb-9ad5-ab8259ca50c0 ovn-installed in OVS
Jan 22 16:54:24 compute-0 ovn_controller[95372]: 2026-01-22T16:54:24Z|00030|binding|INFO|Setting lport 2eabf26e-55cc-43cb-9ad5-ab8259ca50c0 up in Southbound
Jan 22 16:54:24 compute-0 nova_compute[183075]: 2026-01-22 16:54:24.846 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:24 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Jan 22 16:54:25 compute-0 nova_compute[183075]: 2026-01-22 16:54:25.185 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769100865.184232, 243d8a1b-180e-4b78-88fb-2d08cb598b7d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 16:54:25 compute-0 nova_compute[183075]: 2026-01-22 16:54:25.185 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] VM Started (Lifecycle Event)
Jan 22 16:54:25 compute-0 nova_compute[183075]: 2026-01-22 16:54:25.223 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 16:54:25 compute-0 nova_compute[183075]: 2026-01-22 16:54:25.226 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769100865.1868367, 243d8a1b-180e-4b78-88fb-2d08cb598b7d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 16:54:25 compute-0 nova_compute[183075]: 2026-01-22 16:54:25.226 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] VM Paused (Lifecycle Event)
Jan 22 16:54:25 compute-0 nova_compute[183075]: 2026-01-22 16:54:25.247 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 16:54:25 compute-0 nova_compute[183075]: 2026-01-22 16:54:25.250 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 16:54:25 compute-0 nova_compute[183075]: 2026-01-22 16:54:25.272 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 16:54:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:25.478 104629 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 22 16:54:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:25.479 104629 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp3fziir8z/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 22 16:54:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:25.358 211630 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 22 16:54:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:25.362 211630 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 22 16:54:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:25.364 211630 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Jan 22 16:54:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:25.364 211630 INFO oslo.privsep.daemon [-] privsep daemon running as pid 211630
Jan 22 16:54:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:25.481 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[33d35369-b30e-49d7-bd98-406e9ba95f5f]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 16:54:25 compute-0 nova_compute[183075]: 2026-01-22 16:54:25.561 183079 DEBUG nova.compute.manager [req-762ed037-6c37-411b-a227-8da0f5422dc3 req-8aad894c-505a-4d6f-95b8-7063290eb50a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Received event network-vif-plugged-2eabf26e-55cc-43cb-9ad5-ab8259ca50c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 16:54:25 compute-0 nova_compute[183075]: 2026-01-22 16:54:25.561 183079 DEBUG oslo_concurrency.lockutils [req-762ed037-6c37-411b-a227-8da0f5422dc3 req-8aad894c-505a-4d6f-95b8-7063290eb50a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "243d8a1b-180e-4b78-88fb-2d08cb598b7d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:54:25 compute-0 nova_compute[183075]: 2026-01-22 16:54:25.562 183079 DEBUG oslo_concurrency.lockutils [req-762ed037-6c37-411b-a227-8da0f5422dc3 req-8aad894c-505a-4d6f-95b8-7063290eb50a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "243d8a1b-180e-4b78-88fb-2d08cb598b7d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:54:25 compute-0 nova_compute[183075]: 2026-01-22 16:54:25.562 183079 DEBUG oslo_concurrency.lockutils [req-762ed037-6c37-411b-a227-8da0f5422dc3 req-8aad894c-505a-4d6f-95b8-7063290eb50a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "243d8a1b-180e-4b78-88fb-2d08cb598b7d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:54:25 compute-0 nova_compute[183075]: 2026-01-22 16:54:25.562 183079 DEBUG nova.compute.manager [req-762ed037-6c37-411b-a227-8da0f5422dc3 req-8aad894c-505a-4d6f-95b8-7063290eb50a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Processing event network-vif-plugged-2eabf26e-55cc-43cb-9ad5-ab8259ca50c0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 16:54:25 compute-0 nova_compute[183075]: 2026-01-22 16:54:25.563 183079 DEBUG nova.compute.manager [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 16:54:25 compute-0 nova_compute[183075]: 2026-01-22 16:54:25.567 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769100865.5674691, 243d8a1b-180e-4b78-88fb-2d08cb598b7d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 16:54:25 compute-0 nova_compute[183075]: 2026-01-22 16:54:25.568 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] VM Resumed (Lifecycle Event)
Jan 22 16:54:25 compute-0 nova_compute[183075]: 2026-01-22 16:54:25.569 183079 DEBUG nova.virt.libvirt.driver [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 16:54:25 compute-0 nova_compute[183075]: 2026-01-22 16:54:25.572 183079 INFO nova.virt.libvirt.driver [-] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Instance spawned successfully.
Jan 22 16:54:25 compute-0 nova_compute[183075]: 2026-01-22 16:54:25.573 183079 DEBUG nova.virt.libvirt.driver [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 16:54:25 compute-0 nova_compute[183075]: 2026-01-22 16:54:25.596 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 16:54:25 compute-0 nova_compute[183075]: 2026-01-22 16:54:25.602 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 16:54:25 compute-0 nova_compute[183075]: 2026-01-22 16:54:25.605 183079 DEBUG nova.virt.libvirt.driver [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 16:54:25 compute-0 nova_compute[183075]: 2026-01-22 16:54:25.606 183079 DEBUG nova.virt.libvirt.driver [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 16:54:25 compute-0 nova_compute[183075]: 2026-01-22 16:54:25.606 183079 DEBUG nova.virt.libvirt.driver [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 16:54:25 compute-0 nova_compute[183075]: 2026-01-22 16:54:25.607 183079 DEBUG nova.virt.libvirt.driver [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 16:54:25 compute-0 nova_compute[183075]: 2026-01-22 16:54:25.607 183079 DEBUG nova.virt.libvirt.driver [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 16:54:25 compute-0 nova_compute[183075]: 2026-01-22 16:54:25.608 183079 DEBUG nova.virt.libvirt.driver [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 16:54:25 compute-0 nova_compute[183075]: 2026-01-22 16:54:25.630 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 16:54:25 compute-0 nova_compute[183075]: 2026-01-22 16:54:25.722 183079 INFO nova.compute.manager [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Took 8.65 seconds to spawn the instance on the hypervisor.
Jan 22 16:54:25 compute-0 nova_compute[183075]: 2026-01-22 16:54:25.724 183079 DEBUG nova.compute.manager [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 16:54:25 compute-0 nova_compute[183075]: 2026-01-22 16:54:25.801 183079 INFO nova.compute.manager [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Took 9.17 seconds to build instance.
Jan 22 16:54:25 compute-0 nova_compute[183075]: 2026-01-22 16:54:25.828 183079 DEBUG oslo_concurrency.lockutils [None req-b3c0152e-12dc-43a4-8c38-2bf6ff4f9959 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Lock "243d8a1b-180e-4b78-88fb-2d08cb598b7d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.314s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:54:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:26.027 211630 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:54:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:26.027 211630 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:54:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:26.028 211630 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:54:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:26.578 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[598ee75d-9fe8-4c71-96f1-1b20c2651d65]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 16:54:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:26.579 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1abfbb44-51 in ovnmeta-1abfbb44-50a3-4820-9c53-e621ffff9719 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 16:54:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:26.581 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1abfbb44-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 16:54:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:26.581 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4f578d0b-927a-47c8-9305-51ecc70c1b79]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 16:54:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:26.583 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3dd1ac30-bebf-4266-9ad7-c42aff8001bd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 16:54:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:26.611 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[26bebbfa-0b7e-40c8-9eff-d84e68ba448f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 16:54:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:26.642 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f0b59abe-641d-4aeb-beb0-72f123708286]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 16:54:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:26.644 104629 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpafmylke6/privsep.sock']
Jan 22 16:54:26 compute-0 podman[211639]: 2026-01-22 16:54:26.759689796 +0000 UTC m=+0.125403245 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_id=openstack_network_exporter, name=ubi9-minimal, distribution-scope=public, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6)
Jan 22 16:54:26 compute-0 nova_compute[183075]: 2026-01-22 16:54:26.847 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:27 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:27.328 104629 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 22 16:54:27 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:27.330 104629 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpafmylke6/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 22 16:54:27 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:27.210 211665 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 22 16:54:27 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:27.216 211665 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 22 16:54:27 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:27.219 211665 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Jan 22 16:54:27 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:27.219 211665 INFO oslo.privsep.daemon [-] privsep daemon running as pid 211665
Jan 22 16:54:27 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:27.333 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[a3d56d28-4c82-456e-a8ec-accc20352c70]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 16:54:27 compute-0 nova_compute[183075]: 2026-01-22 16:54:27.701 183079 DEBUG nova.compute.manager [req-35188070-c3a3-41f8-b927-cbb7393d98d9 req-1eb648fe-aa59-42ca-adae-78bcc572dcae a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Received event network-vif-plugged-2eabf26e-55cc-43cb-9ad5-ab8259ca50c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 16:54:27 compute-0 nova_compute[183075]: 2026-01-22 16:54:27.702 183079 DEBUG oslo_concurrency.lockutils [req-35188070-c3a3-41f8-b927-cbb7393d98d9 req-1eb648fe-aa59-42ca-adae-78bcc572dcae a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "243d8a1b-180e-4b78-88fb-2d08cb598b7d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:54:27 compute-0 nova_compute[183075]: 2026-01-22 16:54:27.702 183079 DEBUG oslo_concurrency.lockutils [req-35188070-c3a3-41f8-b927-cbb7393d98d9 req-1eb648fe-aa59-42ca-adae-78bcc572dcae a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "243d8a1b-180e-4b78-88fb-2d08cb598b7d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:54:27 compute-0 nova_compute[183075]: 2026-01-22 16:54:27.702 183079 DEBUG oslo_concurrency.lockutils [req-35188070-c3a3-41f8-b927-cbb7393d98d9 req-1eb648fe-aa59-42ca-adae-78bcc572dcae a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "243d8a1b-180e-4b78-88fb-2d08cb598b7d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:54:27 compute-0 nova_compute[183075]: 2026-01-22 16:54:27.702 183079 DEBUG nova.compute.manager [req-35188070-c3a3-41f8-b927-cbb7393d98d9 req-1eb648fe-aa59-42ca-adae-78bcc572dcae a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] No waiting events found dispatching network-vif-plugged-2eabf26e-55cc-43cb-9ad5-ab8259ca50c0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 16:54:27 compute-0 nova_compute[183075]: 2026-01-22 16:54:27.705 183079 WARNING nova.compute.manager [req-35188070-c3a3-41f8-b927-cbb7393d98d9 req-1eb648fe-aa59-42ca-adae-78bcc572dcae a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Received unexpected event network-vif-plugged-2eabf26e-55cc-43cb-9ad5-ab8259ca50c0 for instance with vm_state active and task_state None.
Jan 22 16:54:27 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:27.847 211665 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:54:27 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:27.847 211665 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:54:27 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:27.847 211665 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:28.411 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[5fc23b81-3cc2-4427-af0e-128f4ee05f89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:28.438 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2e9fdaea-514e-4715-8e4e-9ab35e2d1193]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 16:54:28 compute-0 NetworkManager[55454]: <info>  [1769100868.4407] manager: (tap1abfbb44-50): new Veth device (/org/freedesktop/NetworkManager/Devices/21)
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:28.477 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[f28f33d3-8bc6-430d-865c-e2585c2ad786]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:28.481 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[5132268e-1492-4423-9585-99d8aad49625]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 16:54:28 compute-0 systemd-udevd[211677]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 16:54:28 compute-0 NetworkManager[55454]: <info>  [1769100868.5092] device (tap1abfbb44-50): carrier: link connected
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:28.517 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[6179c10b-719f-469b-8f2c-63e170169f1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:28.535 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[226e1872-1556-4b00-8e82-36dd30622191]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1abfbb44-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:f7:25'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 323221, 'reachable_time': 16880, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211695, 'error': None, 'target': 'ovnmeta-1abfbb44-50a3-4820-9c53-e621ffff9719', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:28.550 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d1e49645-a760-4f13-9d92-15ae5629e152]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe37:f725'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 323221, 'tstamp': 323221}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211696, 'error': None, 'target': 'ovnmeta-1abfbb44-50a3-4820-9c53-e621ffff9719', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:28.566 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3910b36b-fbbd-4882-925b-3b174a4e266e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1abfbb44-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:f7:25'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 323221, 'reachable_time': 16880, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 211697, 'error': None, 'target': 'ovnmeta-1abfbb44-50a3-4820-9c53-e621ffff9719', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:28.602 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7821f623-49e8-417d-9c06-e26b46febd67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 16:54:28 compute-0 nova_compute[183075]: 2026-01-22 16:54:28.632 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:28.658 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[89378222-eaed-4609-a476-3e7e95ece958]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:28.660 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1abfbb44-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:28.660 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:28.661 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1abfbb44-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 16:54:28 compute-0 kernel: tap1abfbb44-50: entered promiscuous mode
Jan 22 16:54:28 compute-0 nova_compute[183075]: 2026-01-22 16:54:28.664 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:28 compute-0 NetworkManager[55454]: <info>  [1769100868.6686] manager: (tap1abfbb44-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Jan 22 16:54:28 compute-0 nova_compute[183075]: 2026-01-22 16:54:28.669 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:28.670 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1abfbb44-50, col_values=(('external_ids', {'iface-id': 'f2894cc0-7465-405b-a6c6-53604d2fa9dd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 16:54:28 compute-0 ovn_controller[95372]: 2026-01-22T16:54:28Z|00031|binding|INFO|Releasing lport f2894cc0-7465-405b-a6c6-53604d2fa9dd from this chassis (sb_readonly=0)
Jan 22 16:54:28 compute-0 nova_compute[183075]: 2026-01-22 16:54:28.672 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:28 compute-0 nova_compute[183075]: 2026-01-22 16:54:28.699 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:28.700 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1abfbb44-50a3-4820-9c53-e621ffff9719.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1abfbb44-50a3-4820-9c53-e621ffff9719.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:28.701 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8d63bf02-e87f-47e0-b6d9-04aff5fbe29c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:28.705 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]: global
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-1abfbb44-50a3-4820-9c53-e621ffff9719
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/1abfbb44-50a3-4820-9c53-e621ffff9719.pid.haproxy
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]: 
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]: 
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]: 
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 1abfbb44-50a3-4820-9c53-e621ffff9719
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 16:54:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:28.706 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1abfbb44-50a3-4820-9c53-e621ffff9719', 'env', 'PROCESS_TAG=haproxy-1abfbb44-50a3-4820-9c53-e621ffff9719', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1abfbb44-50a3-4820-9c53-e621ffff9719.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 16:54:28 compute-0 NetworkManager[55454]: <info>  [1769100868.9295] manager: (patch-provnet-c63dea44-3fe3-44c3-ba47-76ee1ea0ad94-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/23)
Jan 22 16:54:28 compute-0 NetworkManager[55454]: <info>  [1769100868.9301] device (patch-provnet-c63dea44-3fe3-44c3-ba47-76ee1ea0ad94-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 16:54:28 compute-0 NetworkManager[55454]: <warn>  [1769100868.9304] device (patch-provnet-c63dea44-3fe3-44c3-ba47-76ee1ea0ad94-to-br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 22 16:54:28 compute-0 nova_compute[183075]: 2026-01-22 16:54:28.930 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:28 compute-0 NetworkManager[55454]: <info>  [1769100868.9314] manager: (patch-br-int-to-provnet-c63dea44-3fe3-44c3-ba47-76ee1ea0ad94): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/24)
Jan 22 16:54:28 compute-0 NetworkManager[55454]: <info>  [1769100868.9320] device (patch-br-int-to-provnet-c63dea44-3fe3-44c3-ba47-76ee1ea0ad94)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 16:54:28 compute-0 NetworkManager[55454]: <warn>  [1769100868.9320] device (patch-br-int-to-provnet-c63dea44-3fe3-44c3-ba47-76ee1ea0ad94)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 22 16:54:28 compute-0 NetworkManager[55454]: <info>  [1769100868.9331] manager: (patch-br-int-to-provnet-c63dea44-3fe3-44c3-ba47-76ee1ea0ad94): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Jan 22 16:54:28 compute-0 NetworkManager[55454]: <info>  [1769100868.9339] manager: (patch-provnet-c63dea44-3fe3-44c3-ba47-76ee1ea0ad94-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Jan 22 16:54:28 compute-0 NetworkManager[55454]: <info>  [1769100868.9344] device (patch-provnet-c63dea44-3fe3-44c3-ba47-76ee1ea0ad94-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 22 16:54:28 compute-0 NetworkManager[55454]: <info>  [1769100868.9349] device (patch-br-int-to-provnet-c63dea44-3fe3-44c3-ba47-76ee1ea0ad94)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 22 16:54:28 compute-0 nova_compute[183075]: 2026-01-22 16:54:28.996 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:28 compute-0 ovn_controller[95372]: 2026-01-22T16:54:28Z|00032|binding|INFO|Releasing lport f2894cc0-7465-405b-a6c6-53604d2fa9dd from this chassis (sb_readonly=0)
Jan 22 16:54:29 compute-0 nova_compute[183075]: 2026-01-22 16:54:29.005 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:29 compute-0 podman[211730]: 2026-01-22 16:54:29.123298125 +0000 UTC m=+0.090761085 container create 59cf508400188af3df901f057e843972aad631bb9d2c749de64edd717b64a398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1abfbb44-50a3-4820-9c53-e621ffff9719, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 22 16:54:29 compute-0 podman[211730]: 2026-01-22 16:54:29.073358437 +0000 UTC m=+0.040821377 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 16:54:29 compute-0 systemd[1]: Started libpod-conmon-59cf508400188af3df901f057e843972aad631bb9d2c749de64edd717b64a398.scope.
Jan 22 16:54:29 compute-0 systemd[1]: Started libcrun container.
Jan 22 16:54:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/883bc76817c051bae9cb1a098e66ecce6fe4f981a394576a7d1e8fbf17fdc253/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 16:54:29 compute-0 podman[211730]: 2026-01-22 16:54:29.222501053 +0000 UTC m=+0.189964103 container init 59cf508400188af3df901f057e843972aad631bb9d2c749de64edd717b64a398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1abfbb44-50a3-4820-9c53-e621ffff9719, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 22 16:54:29 compute-0 podman[211730]: 2026-01-22 16:54:29.229417399 +0000 UTC m=+0.196880379 container start 59cf508400188af3df901f057e843972aad631bb9d2c749de64edd717b64a398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1abfbb44-50a3-4820-9c53-e621ffff9719, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 22 16:54:29 compute-0 neutron-haproxy-ovnmeta-1abfbb44-50a3-4820-9c53-e621ffff9719[211745]: [NOTICE]   (211749) : New worker (211751) forked
Jan 22 16:54:29 compute-0 neutron-haproxy-ovnmeta-1abfbb44-50a3-4820-9c53-e621ffff9719[211745]: [NOTICE]   (211749) : Loading success.
Jan 22 16:54:29 compute-0 nova_compute[183075]: 2026-01-22 16:54:29.830 183079 DEBUG nova.compute.manager [req-8b07a06a-312c-405b-a08e-ffc87136b744 req-40ef2e63-4680-4c3d-8bed-ff2c58411680 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Received event network-changed-2eabf26e-55cc-43cb-9ad5-ab8259ca50c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 16:54:29 compute-0 nova_compute[183075]: 2026-01-22 16:54:29.831 183079 DEBUG nova.compute.manager [req-8b07a06a-312c-405b-a08e-ffc87136b744 req-40ef2e63-4680-4c3d-8bed-ff2c58411680 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Refreshing instance network info cache due to event network-changed-2eabf26e-55cc-43cb-9ad5-ab8259ca50c0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 16:54:29 compute-0 nova_compute[183075]: 2026-01-22 16:54:29.832 183079 DEBUG oslo_concurrency.lockutils [req-8b07a06a-312c-405b-a08e-ffc87136b744 req-40ef2e63-4680-4c3d-8bed-ff2c58411680 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-243d8a1b-180e-4b78-88fb-2d08cb598b7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 16:54:29 compute-0 nova_compute[183075]: 2026-01-22 16:54:29.833 183079 DEBUG oslo_concurrency.lockutils [req-8b07a06a-312c-405b-a08e-ffc87136b744 req-40ef2e63-4680-4c3d-8bed-ff2c58411680 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-243d8a1b-180e-4b78-88fb-2d08cb598b7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 16:54:29 compute-0 nova_compute[183075]: 2026-01-22 16:54:29.834 183079 DEBUG nova.network.neutron [req-8b07a06a-312c-405b-a08e-ffc87136b744 req-40ef2e63-4680-4c3d-8bed-ff2c58411680 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Refreshing network info cache for port 2eabf26e-55cc-43cb-9ad5-ab8259ca50c0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 16:54:31 compute-0 podman[211760]: 2026-01-22 16:54:31.37720262 +0000 UTC m=+0.082414873 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 16:54:31 compute-0 nova_compute[183075]: 2026-01-22 16:54:31.850 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:32 compute-0 nova_compute[183075]: 2026-01-22 16:54:32.206 183079 DEBUG nova.network.neutron [req-8b07a06a-312c-405b-a08e-ffc87136b744 req-40ef2e63-4680-4c3d-8bed-ff2c58411680 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Updated VIF entry in instance network info cache for port 2eabf26e-55cc-43cb-9ad5-ab8259ca50c0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 16:54:32 compute-0 nova_compute[183075]: 2026-01-22 16:54:32.207 183079 DEBUG nova.network.neutron [req-8b07a06a-312c-405b-a08e-ffc87136b744 req-40ef2e63-4680-4c3d-8bed-ff2c58411680 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Updating instance_info_cache with network_info: [{"id": "2eabf26e-55cc-43cb-9ad5-ab8259ca50c0", "address": "fa:16:3e:f9:0c:52", "network": {"id": "1abfbb44-50a3-4820-9c53-e621ffff9719", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1873150826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73fb0e8a2818489ba429a79567981fbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eabf26e-55", "ovs_interfaceid": "2eabf26e-55cc-43cb-9ad5-ab8259ca50c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 16:54:32 compute-0 nova_compute[183075]: 2026-01-22 16:54:32.228 183079 DEBUG oslo_concurrency.lockutils [req-8b07a06a-312c-405b-a08e-ffc87136b744 req-40ef2e63-4680-4c3d-8bed-ff2c58411680 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-243d8a1b-180e-4b78-88fb-2d08cb598b7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 16:54:33 compute-0 nova_compute[183075]: 2026-01-22 16:54:33.635 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:36 compute-0 nova_compute[183075]: 2026-01-22 16:54:36.891 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:37 compute-0 podman[211806]: 2026-01-22 16:54:37.37572814 +0000 UTC m=+0.071592738 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 16:54:38 compute-0 ovn_controller[95372]: 2026-01-22T16:54:38Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f9:0c:52 10.100.0.6
Jan 22 16:54:38 compute-0 ovn_controller[95372]: 2026-01-22T16:54:38Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f9:0c:52 10.100.0.6
Jan 22 16:54:38 compute-0 nova_compute[183075]: 2026-01-22 16:54:38.639 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:41 compute-0 nova_compute[183075]: 2026-01-22 16:54:41.895 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:41.910 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:54:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:41.911 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:54:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:41.912 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:54:43 compute-0 nova_compute[183075]: 2026-01-22 16:54:43.642 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:46.151 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 16:54:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:46.153 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /latest/meta-data/public-ipv4 HTTP/1.0
Jan 22 16:54:46 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 16:54:46 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 16:54:46 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 16:54:46 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 16:54:46 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 16:54:46 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 16:54:46 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 1abfbb44-50a3-4820-9c53-e621ffff9719 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 16:54:46 compute-0 nova_compute[183075]: 2026-01-22 16:54:46.898 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:48.333 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 16:54:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:48.333 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /latest/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 151 time: 2.1802666
Jan 22 16:54:48 compute-0 haproxy-metadata-proxy-1abfbb44-50a3-4820-9c53-e621ffff9719[211751]: 10.100.0.6:51350 [22/Jan/2026:16:54:46.150] listener listener/metadata 0/0/0/2183/2183 200 135 - - ---- 1/1/0/0/0 0/0 "GET /latest/meta-data/public-ipv4 HTTP/1.1"
Jan 22 16:54:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:48.448 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 16:54:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:48.450 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: POST /openstack/2013-10-17/password HTTP/1.0
Jan 22 16:54:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 16:54:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 16:54:48 compute-0 ovn_metadata_agent[104624]: Content-Length: 100
Jan 22 16:54:48 compute-0 ovn_metadata_agent[104624]: Content-Type: application/x-www-form-urlencoded
Jan 22 16:54:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 16:54:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 16:54:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 16:54:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 1abfbb44-50a3-4820-9c53-e621ffff9719
Jan 22 16:54:48 compute-0 ovn_metadata_agent[104624]: 
Jan 22 16:54:48 compute-0 ovn_metadata_agent[104624]: testtesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttest __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 16:54:48 compute-0 nova_compute[183075]: 2026-01-22 16:54:48.644 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:48.691 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 16:54:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:48.692 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "POST /openstack/2013-10-17/password HTTP/1.1" status: 200  len: 134 time: 0.2422411
Jan 22 16:54:48 compute-0 haproxy-metadata-proxy-1abfbb44-50a3-4820-9c53-e621ffff9719[211751]: 10.100.0.6:51352 [22/Jan/2026:16:54:48.446] listener listener/metadata 0/0/0/245/245 200 118 - - ---- 1/1/0/0/0 0/0 "POST /openstack/2013-10-17/password HTTP/1.1"
Jan 22 16:54:49 compute-0 podman[211830]: 2026-01-22 16:54:49.369688303 +0000 UTC m=+0.069768372 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 16:54:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:49.951 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 16:54:49 compute-0 nova_compute[183075]: 2026-01-22 16:54:49.952 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:49.953 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 16:54:50 compute-0 nova_compute[183075]: 2026-01-22 16:54:50.820 183079 DEBUG oslo_concurrency.lockutils [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Acquiring lock "243d8a1b-180e-4b78-88fb-2d08cb598b7d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:54:50 compute-0 nova_compute[183075]: 2026-01-22 16:54:50.821 183079 DEBUG oslo_concurrency.lockutils [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Lock "243d8a1b-180e-4b78-88fb-2d08cb598b7d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:54:50 compute-0 nova_compute[183075]: 2026-01-22 16:54:50.821 183079 DEBUG oslo_concurrency.lockutils [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Acquiring lock "243d8a1b-180e-4b78-88fb-2d08cb598b7d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:54:50 compute-0 nova_compute[183075]: 2026-01-22 16:54:50.822 183079 DEBUG oslo_concurrency.lockutils [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Lock "243d8a1b-180e-4b78-88fb-2d08cb598b7d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:54:50 compute-0 nova_compute[183075]: 2026-01-22 16:54:50.822 183079 DEBUG oslo_concurrency.lockutils [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Lock "243d8a1b-180e-4b78-88fb-2d08cb598b7d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:54:50 compute-0 nova_compute[183075]: 2026-01-22 16:54:50.824 183079 INFO nova.compute.manager [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Terminating instance
Jan 22 16:54:50 compute-0 nova_compute[183075]: 2026-01-22 16:54:50.825 183079 DEBUG nova.compute.manager [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 16:54:50 compute-0 kernel: tap2eabf26e-55 (unregistering): left promiscuous mode
Jan 22 16:54:50 compute-0 NetworkManager[55454]: <info>  [1769100890.8545] device (tap2eabf26e-55): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 16:54:50 compute-0 ovn_controller[95372]: 2026-01-22T16:54:50Z|00033|binding|INFO|Releasing lport 2eabf26e-55cc-43cb-9ad5-ab8259ca50c0 from this chassis (sb_readonly=0)
Jan 22 16:54:50 compute-0 ovn_controller[95372]: 2026-01-22T16:54:50Z|00034|binding|INFO|Setting lport 2eabf26e-55cc-43cb-9ad5-ab8259ca50c0 down in Southbound
Jan 22 16:54:50 compute-0 nova_compute[183075]: 2026-01-22 16:54:50.871 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:50 compute-0 ovn_controller[95372]: 2026-01-22T16:54:50Z|00035|binding|INFO|Removing iface tap2eabf26e-55 ovn-installed in OVS
Jan 22 16:54:50 compute-0 nova_compute[183075]: 2026-01-22 16:54:50.875 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:50.880 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:0c:52 10.100.0.6'], port_security=['fa:16:3e:f9:0c:52 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '243d8a1b-180e-4b78-88fb-2d08cb598b7d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1abfbb44-50a3-4820-9c53-e621ffff9719', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '73fb0e8a2818489ba429a79567981fbc', 'neutron:revision_number': '4', 'neutron:security_group_ids': '660dc581-3a5d-4927-8f93-cfdf57d0a611 77ddaf26-0aa8-4b5d-8fe9-8005fa222865', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.172'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9434a937-0ff9-43b0-a27f-0f94550ab0a0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=2eabf26e-55cc-43cb-9ad5-ab8259ca50c0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 16:54:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:50.881 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 2eabf26e-55cc-43cb-9ad5-ab8259ca50c0 in datapath 1abfbb44-50a3-4820-9c53-e621ffff9719 unbound from our chassis
Jan 22 16:54:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:50.883 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1abfbb44-50a3-4820-9c53-e621ffff9719, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 16:54:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:50.884 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[93596acf-ed41-4dcc-9759-1a65deb1f240]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 16:54:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:50.885 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1abfbb44-50a3-4820-9c53-e621ffff9719 namespace which is not needed anymore
Jan 22 16:54:50 compute-0 nova_compute[183075]: 2026-01-22 16:54:50.887 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:50 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Jan 22 16:54:50 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 12.826s CPU time.
Jan 22 16:54:50 compute-0 systemd-machined[154382]: Machine qemu-1-instance-00000001 terminated.
Jan 22 16:54:51 compute-0 neutron-haproxy-ovnmeta-1abfbb44-50a3-4820-9c53-e621ffff9719[211745]: [NOTICE]   (211749) : haproxy version is 2.8.14-c23fe91
Jan 22 16:54:51 compute-0 neutron-haproxy-ovnmeta-1abfbb44-50a3-4820-9c53-e621ffff9719[211745]: [NOTICE]   (211749) : path to executable is /usr/sbin/haproxy
Jan 22 16:54:51 compute-0 neutron-haproxy-ovnmeta-1abfbb44-50a3-4820-9c53-e621ffff9719[211745]: [WARNING]  (211749) : Exiting Master process...
Jan 22 16:54:51 compute-0 neutron-haproxy-ovnmeta-1abfbb44-50a3-4820-9c53-e621ffff9719[211745]: [ALERT]    (211749) : Current worker (211751) exited with code 143 (Terminated)
Jan 22 16:54:51 compute-0 neutron-haproxy-ovnmeta-1abfbb44-50a3-4820-9c53-e621ffff9719[211745]: [WARNING]  (211749) : All workers exited. Exiting... (0)
Jan 22 16:54:51 compute-0 systemd[1]: libpod-59cf508400188af3df901f057e843972aad631bb9d2c749de64edd717b64a398.scope: Deactivated successfully.
Jan 22 16:54:51 compute-0 podman[211880]: 2026-01-22 16:54:51.020317573 +0000 UTC m=+0.046556952 container died 59cf508400188af3df901f057e843972aad631bb9d2c749de64edd717b64a398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1abfbb44-50a3-4820-9c53-e621ffff9719, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 16:54:51 compute-0 nova_compute[183075]: 2026-01-22 16:54:51.051 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:51 compute-0 nova_compute[183075]: 2026-01-22 16:54:51.057 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:51 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-59cf508400188af3df901f057e843972aad631bb9d2c749de64edd717b64a398-userdata-shm.mount: Deactivated successfully.
Jan 22 16:54:51 compute-0 systemd[1]: var-lib-containers-storage-overlay-883bc76817c051bae9cb1a098e66ecce6fe4f981a394576a7d1e8fbf17fdc253-merged.mount: Deactivated successfully.
Jan 22 16:54:51 compute-0 podman[211880]: 2026-01-22 16:54:51.071648456 +0000 UTC m=+0.097887805 container cleanup 59cf508400188af3df901f057e843972aad631bb9d2c749de64edd717b64a398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1abfbb44-50a3-4820-9c53-e621ffff9719, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 16:54:51 compute-0 systemd[1]: libpod-conmon-59cf508400188af3df901f057e843972aad631bb9d2c749de64edd717b64a398.scope: Deactivated successfully.
Jan 22 16:54:51 compute-0 nova_compute[183075]: 2026-01-22 16:54:51.084 183079 INFO nova.virt.libvirt.driver [-] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Instance destroyed successfully.
Jan 22 16:54:51 compute-0 nova_compute[183075]: 2026-01-22 16:54:51.085 183079 DEBUG nova.objects.instance [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Lazy-loading 'resources' on Instance uuid 243d8a1b-180e-4b78-88fb-2d08cb598b7d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 16:54:51 compute-0 nova_compute[183075]: 2026-01-22 16:54:51.102 183079 DEBUG nova.virt.libvirt.vif [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T16:54:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-2117019190',display_name='tempest-TestServerBasicOps-server-2117019190',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testserverbasicops-server-2117019190',id=1,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPbKBuCyR9cLZReK75enUrgE5KJvQeyP50prZDIEvZ5uXx6sRIVFnoPiHx8vYeR8tCZ4GuJvkv4q3s9I1caAJMaIZPnsmsfmHIg0OmJrn8S63wqPWPNKX90mr7GILLt7UQ==',key_name='tempest-TestServerBasicOps-958706236',keypairs=<?>,launch_index=0,launched_at=2026-01-22T16:54:25Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='73fb0e8a2818489ba429a79567981fbc',ramdisk_id='',reservation_id='r-5em908ya',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerBasicOps-1562817894',owner_user_name='tempest-TestServerBasicOps-1562817894-project-member',password_0='testtesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttest',password_1='',password_2='',password_3=''},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T16:54:48Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fd904092c61441ffad7349f369ab599f',uuid=243d8a1b-180e-4b78-88fb-2d08cb598b7d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2eabf26e-55cc-43cb-9ad5-ab8259ca50c0", "address": "fa:16:3e:f9:0c:52", "network": {"id": "1abfbb44-50a3-4820-9c53-e621ffff9719", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1873150826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73fb0e8a2818489ba429a79567981fbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eabf26e-55", "ovs_interfaceid": "2eabf26e-55cc-43cb-9ad5-ab8259ca50c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 16:54:51 compute-0 nova_compute[183075]: 2026-01-22 16:54:51.103 183079 DEBUG nova.network.os_vif_util [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Converting VIF {"id": "2eabf26e-55cc-43cb-9ad5-ab8259ca50c0", "address": "fa:16:3e:f9:0c:52", "network": {"id": "1abfbb44-50a3-4820-9c53-e621ffff9719", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1873150826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73fb0e8a2818489ba429a79567981fbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eabf26e-55", "ovs_interfaceid": "2eabf26e-55cc-43cb-9ad5-ab8259ca50c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 16:54:51 compute-0 nova_compute[183075]: 2026-01-22 16:54:51.104 183079 DEBUG nova.network.os_vif_util [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f9:0c:52,bridge_name='br-int',has_traffic_filtering=True,id=2eabf26e-55cc-43cb-9ad5-ab8259ca50c0,network=Network(1abfbb44-50a3-4820-9c53-e621ffff9719),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2eabf26e-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 16:54:51 compute-0 nova_compute[183075]: 2026-01-22 16:54:51.105 183079 DEBUG os_vif [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f9:0c:52,bridge_name='br-int',has_traffic_filtering=True,id=2eabf26e-55cc-43cb-9ad5-ab8259ca50c0,network=Network(1abfbb44-50a3-4820-9c53-e621ffff9719),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2eabf26e-55') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 16:54:51 compute-0 nova_compute[183075]: 2026-01-22 16:54:51.107 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:51 compute-0 nova_compute[183075]: 2026-01-22 16:54:51.108 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2eabf26e-55, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 16:54:51 compute-0 nova_compute[183075]: 2026-01-22 16:54:51.113 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:51 compute-0 nova_compute[183075]: 2026-01-22 16:54:51.116 183079 INFO os_vif [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f9:0c:52,bridge_name='br-int',has_traffic_filtering=True,id=2eabf26e-55cc-43cb-9ad5-ab8259ca50c0,network=Network(1abfbb44-50a3-4820-9c53-e621ffff9719),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2eabf26e-55')
Jan 22 16:54:51 compute-0 nova_compute[183075]: 2026-01-22 16:54:51.117 183079 INFO nova.virt.libvirt.driver [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Deleting instance files /var/lib/nova/instances/243d8a1b-180e-4b78-88fb-2d08cb598b7d_del
Jan 22 16:54:51 compute-0 nova_compute[183075]: 2026-01-22 16:54:51.118 183079 INFO nova.virt.libvirt.driver [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Deletion of /var/lib/nova/instances/243d8a1b-180e-4b78-88fb-2d08cb598b7d_del complete
Jan 22 16:54:51 compute-0 podman[211926]: 2026-01-22 16:54:51.146541758 +0000 UTC m=+0.046288327 container remove 59cf508400188af3df901f057e843972aad631bb9d2c749de64edd717b64a398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1abfbb44-50a3-4820-9c53-e621ffff9719, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 22 16:54:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:51.155 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[5317b471-fd97-4b2f-b771-b6c7ccbe7fb3]: (4, ('Thu Jan 22 04:54:50 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1abfbb44-50a3-4820-9c53-e621ffff9719 (59cf508400188af3df901f057e843972aad631bb9d2c749de64edd717b64a398)\n59cf508400188af3df901f057e843972aad631bb9d2c749de64edd717b64a398\nThu Jan 22 04:54:51 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1abfbb44-50a3-4820-9c53-e621ffff9719 (59cf508400188af3df901f057e843972aad631bb9d2c749de64edd717b64a398)\n59cf508400188af3df901f057e843972aad631bb9d2c749de64edd717b64a398\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 16:54:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:51.157 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[984bb917-60f4-4902-be89-09a9db8a7a14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 16:54:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:51.159 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1abfbb44-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 16:54:51 compute-0 nova_compute[183075]: 2026-01-22 16:54:51.162 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:51 compute-0 kernel: tap1abfbb44-50: left promiscuous mode
Jan 22 16:54:51 compute-0 nova_compute[183075]: 2026-01-22 16:54:51.187 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:51.190 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d1591b5f-95cb-4656-8171-ca8aaa8aad86]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 16:54:51 compute-0 nova_compute[183075]: 2026-01-22 16:54:51.199 183079 DEBUG nova.virt.libvirt.host [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Jan 22 16:54:51 compute-0 nova_compute[183075]: 2026-01-22 16:54:51.200 183079 INFO nova.virt.libvirt.host [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] UEFI support detected
Jan 22 16:54:51 compute-0 nova_compute[183075]: 2026-01-22 16:54:51.202 183079 INFO nova.compute.manager [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Took 0.38 seconds to destroy the instance on the hypervisor.
Jan 22 16:54:51 compute-0 nova_compute[183075]: 2026-01-22 16:54:51.202 183079 DEBUG oslo.service.loopingcall [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 16:54:51 compute-0 nova_compute[183075]: 2026-01-22 16:54:51.202 183079 DEBUG nova.compute.manager [-] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 16:54:51 compute-0 nova_compute[183075]: 2026-01-22 16:54:51.202 183079 DEBUG nova.network.neutron [-] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 16:54:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:51.205 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b8ed0cd8-e90e-4536-883d-6c7a232dc2b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 16:54:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:51.207 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6cc7f91d-b16f-46c4-8347-cdde99f40739]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 16:54:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:51.226 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f7cc8a42-d1cd-499a-812f-cdc80f083920]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 323210, 'reachable_time': 20764, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211941, 'error': None, 'target': 'ovnmeta-1abfbb44-50a3-4820-9c53-e621ffff9719', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 16:54:51 compute-0 systemd[1]: run-netns-ovnmeta\x2d1abfbb44\x2d50a3\x2d4820\x2d9c53\x2de621ffff9719.mount: Deactivated successfully.
Jan 22 16:54:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:51.238 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1abfbb44-50a3-4820-9c53-e621ffff9719 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 16:54:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:51.239 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[e6b70024-eced-45b1-8060-31cb2994e3fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 16:54:51 compute-0 nova_compute[183075]: 2026-01-22 16:54:51.341 183079 DEBUG nova.compute.manager [req-a31b91d2-7b0d-44bf-ad50-c5b20e8d6af0 req-a1d79c26-02f6-4453-9f87-d672b00183a0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Received event network-vif-unplugged-2eabf26e-55cc-43cb-9ad5-ab8259ca50c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 16:54:51 compute-0 nova_compute[183075]: 2026-01-22 16:54:51.342 183079 DEBUG oslo_concurrency.lockutils [req-a31b91d2-7b0d-44bf-ad50-c5b20e8d6af0 req-a1d79c26-02f6-4453-9f87-d672b00183a0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "243d8a1b-180e-4b78-88fb-2d08cb598b7d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:54:51 compute-0 nova_compute[183075]: 2026-01-22 16:54:51.343 183079 DEBUG oslo_concurrency.lockutils [req-a31b91d2-7b0d-44bf-ad50-c5b20e8d6af0 req-a1d79c26-02f6-4453-9f87-d672b00183a0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "243d8a1b-180e-4b78-88fb-2d08cb598b7d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:54:51 compute-0 nova_compute[183075]: 2026-01-22 16:54:51.343 183079 DEBUG oslo_concurrency.lockutils [req-a31b91d2-7b0d-44bf-ad50-c5b20e8d6af0 req-a1d79c26-02f6-4453-9f87-d672b00183a0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "243d8a1b-180e-4b78-88fb-2d08cb598b7d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:54:51 compute-0 nova_compute[183075]: 2026-01-22 16:54:51.343 183079 DEBUG nova.compute.manager [req-a31b91d2-7b0d-44bf-ad50-c5b20e8d6af0 req-a1d79c26-02f6-4453-9f87-d672b00183a0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] No waiting events found dispatching network-vif-unplugged-2eabf26e-55cc-43cb-9ad5-ab8259ca50c0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 16:54:51 compute-0 nova_compute[183075]: 2026-01-22 16:54:51.343 183079 DEBUG nova.compute.manager [req-a31b91d2-7b0d-44bf-ad50-c5b20e8d6af0 req-a1d79c26-02f6-4453-9f87-d672b00183a0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Received event network-vif-unplugged-2eabf26e-55cc-43cb-9ad5-ab8259ca50c0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 16:54:51 compute-0 nova_compute[183075]: 2026-01-22 16:54:51.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:54:51 compute-0 nova_compute[183075]: 2026-01-22 16:54:51.901 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:54:51.957 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 16:54:53 compute-0 nova_compute[183075]: 2026-01-22 16:54:53.127 183079 DEBUG nova.network.neutron [-] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 16:54:53 compute-0 nova_compute[183075]: 2026-01-22 16:54:53.150 183079 INFO nova.compute.manager [-] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Took 1.95 seconds to deallocate network for instance.
Jan 22 16:54:53 compute-0 nova_compute[183075]: 2026-01-22 16:54:53.206 183079 DEBUG oslo_concurrency.lockutils [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:54:53 compute-0 nova_compute[183075]: 2026-01-22 16:54:53.206 183079 DEBUG oslo_concurrency.lockutils [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:54:53 compute-0 nova_compute[183075]: 2026-01-22 16:54:53.271 183079 DEBUG nova.compute.provider_tree [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Updating inventory in ProviderTree for provider 2513134c-f67c-4237-84bf-4ebe2450d610 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 16:54:53 compute-0 nova_compute[183075]: 2026-01-22 16:54:53.299 183079 ERROR nova.scheduler.client.report [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] [req-50081a4b-9ddc-4305-b03b-a7cda2240760] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 2513134c-f67c-4237-84bf-4ebe2450d610.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-50081a4b-9ddc-4305-b03b-a7cda2240760"}]}
Jan 22 16:54:53 compute-0 nova_compute[183075]: 2026-01-22 16:54:53.317 183079 DEBUG nova.scheduler.client.report [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Refreshing inventories for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 16:54:53 compute-0 nova_compute[183075]: 2026-01-22 16:54:53.333 183079 DEBUG nova.scheduler.client.report [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Updating ProviderTree inventory for provider 2513134c-f67c-4237-84bf-4ebe2450d610 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 16:54:53 compute-0 nova_compute[183075]: 2026-01-22 16:54:53.333 183079 DEBUG nova.compute.provider_tree [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Updating inventory in ProviderTree for provider 2513134c-f67c-4237-84bf-4ebe2450d610 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 16:54:53 compute-0 nova_compute[183075]: 2026-01-22 16:54:53.346 183079 DEBUG nova.scheduler.client.report [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Refreshing aggregate associations for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 16:54:53 compute-0 nova_compute[183075]: 2026-01-22 16:54:53.369 183079 DEBUG nova.scheduler.client.report [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Refreshing trait associations for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610, traits: COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SVM,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE42,COMPUTE_NODE,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE4A,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_FMA3,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_AUTO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 16:54:53 compute-0 nova_compute[183075]: 2026-01-22 16:54:53.424 183079 DEBUG nova.compute.provider_tree [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Updating inventory in ProviderTree for provider 2513134c-f67c-4237-84bf-4ebe2450d610 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 16:54:53 compute-0 nova_compute[183075]: 2026-01-22 16:54:53.453 183079 DEBUG nova.compute.manager [req-7eb4282c-0995-4ac2-8756-eb739b2cbdba req-41548f5b-cd85-419f-9cbd-71722583cf89 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Received event network-vif-plugged-2eabf26e-55cc-43cb-9ad5-ab8259ca50c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 16:54:53 compute-0 nova_compute[183075]: 2026-01-22 16:54:53.454 183079 DEBUG oslo_concurrency.lockutils [req-7eb4282c-0995-4ac2-8756-eb739b2cbdba req-41548f5b-cd85-419f-9cbd-71722583cf89 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "243d8a1b-180e-4b78-88fb-2d08cb598b7d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:54:53 compute-0 nova_compute[183075]: 2026-01-22 16:54:53.455 183079 DEBUG oslo_concurrency.lockutils [req-7eb4282c-0995-4ac2-8756-eb739b2cbdba req-41548f5b-cd85-419f-9cbd-71722583cf89 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "243d8a1b-180e-4b78-88fb-2d08cb598b7d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:54:53 compute-0 nova_compute[183075]: 2026-01-22 16:54:53.455 183079 DEBUG oslo_concurrency.lockutils [req-7eb4282c-0995-4ac2-8756-eb739b2cbdba req-41548f5b-cd85-419f-9cbd-71722583cf89 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "243d8a1b-180e-4b78-88fb-2d08cb598b7d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:54:53 compute-0 nova_compute[183075]: 2026-01-22 16:54:53.456 183079 DEBUG nova.compute.manager [req-7eb4282c-0995-4ac2-8756-eb739b2cbdba req-41548f5b-cd85-419f-9cbd-71722583cf89 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] No waiting events found dispatching network-vif-plugged-2eabf26e-55cc-43cb-9ad5-ab8259ca50c0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 16:54:53 compute-0 nova_compute[183075]: 2026-01-22 16:54:53.456 183079 WARNING nova.compute.manager [req-7eb4282c-0995-4ac2-8756-eb739b2cbdba req-41548f5b-cd85-419f-9cbd-71722583cf89 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Received unexpected event network-vif-plugged-2eabf26e-55cc-43cb-9ad5-ab8259ca50c0 for instance with vm_state deleted and task_state None.
Jan 22 16:54:53 compute-0 nova_compute[183075]: 2026-01-22 16:54:53.456 183079 DEBUG nova.compute.manager [req-7eb4282c-0995-4ac2-8756-eb739b2cbdba req-41548f5b-cd85-419f-9cbd-71722583cf89 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Received event network-vif-deleted-2eabf26e-55cc-43cb-9ad5-ab8259ca50c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 16:54:53 compute-0 nova_compute[183075]: 2026-01-22 16:54:53.483 183079 DEBUG nova.scheduler.client.report [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Updated inventory for provider 2513134c-f67c-4237-84bf-4ebe2450d610 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Jan 22 16:54:53 compute-0 nova_compute[183075]: 2026-01-22 16:54:53.484 183079 DEBUG nova.compute.provider_tree [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Updating resource provider 2513134c-f67c-4237-84bf-4ebe2450d610 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 22 16:54:53 compute-0 nova_compute[183075]: 2026-01-22 16:54:53.484 183079 DEBUG nova.compute.provider_tree [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Updating inventory in ProviderTree for provider 2513134c-f67c-4237-84bf-4ebe2450d610 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 16:54:53 compute-0 nova_compute[183075]: 2026-01-22 16:54:53.513 183079 DEBUG oslo_concurrency.lockutils [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.306s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:54:53 compute-0 nova_compute[183075]: 2026-01-22 16:54:53.534 183079 INFO nova.scheduler.client.report [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Deleted allocations for instance 243d8a1b-180e-4b78-88fb-2d08cb598b7d
Jan 22 16:54:53 compute-0 nova_compute[183075]: 2026-01-22 16:54:53.611 183079 DEBUG oslo_concurrency.lockutils [None req-c2eda8c2-020e-4596-9277-df716e365474 fd904092c61441ffad7349f369ab599f 73fb0e8a2818489ba429a79567981fbc - - default default] Lock "243d8a1b-180e-4b78-88fb-2d08cb598b7d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.789s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:54:53 compute-0 nova_compute[183075]: 2026-01-22 16:54:53.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:54:53 compute-0 nova_compute[183075]: 2026-01-22 16:54:53.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:54:53 compute-0 nova_compute[183075]: 2026-01-22 16:54:53.789 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 16:54:54 compute-0 nova_compute[183075]: 2026-01-22 16:54:54.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:54:55 compute-0 podman[211944]: 2026-01-22 16:54:55.391179107 +0000 UTC m=+0.087339939 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 22 16:54:55 compute-0 podman[211943]: 2026-01-22 16:54:55.409928633 +0000 UTC m=+0.102903634 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 16:54:55 compute-0 nova_compute[183075]: 2026-01-22 16:54:55.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:54:55 compute-0 nova_compute[183075]: 2026-01-22 16:54:55.814 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:54:55 compute-0 nova_compute[183075]: 2026-01-22 16:54:55.814 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:54:55 compute-0 nova_compute[183075]: 2026-01-22 16:54:55.815 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:54:55 compute-0 nova_compute[183075]: 2026-01-22 16:54:55.815 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 16:54:55 compute-0 nova_compute[183075]: 2026-01-22 16:54:55.965 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 16:54:55 compute-0 nova_compute[183075]: 2026-01-22 16:54:55.966 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5764MB free_disk=73.38264846801758GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 16:54:55 compute-0 nova_compute[183075]: 2026-01-22 16:54:55.967 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:54:55 compute-0 nova_compute[183075]: 2026-01-22 16:54:55.967 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:54:56 compute-0 nova_compute[183075]: 2026-01-22 16:54:56.018 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 16:54:56 compute-0 nova_compute[183075]: 2026-01-22 16:54:56.018 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 16:54:56 compute-0 nova_compute[183075]: 2026-01-22 16:54:56.040 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 16:54:56 compute-0 nova_compute[183075]: 2026-01-22 16:54:56.053 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 16:54:56 compute-0 nova_compute[183075]: 2026-01-22 16:54:56.070 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 16:54:56 compute-0 nova_compute[183075]: 2026-01-22 16:54:56.071 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:54:56 compute-0 nova_compute[183075]: 2026-01-22 16:54:56.111 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:56 compute-0 nova_compute[183075]: 2026-01-22 16:54:56.904 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:57 compute-0 nova_compute[183075]: 2026-01-22 16:54:57.072 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:54:57 compute-0 nova_compute[183075]: 2026-01-22 16:54:57.072 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 16:54:57 compute-0 nova_compute[183075]: 2026-01-22 16:54:57.073 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 16:54:57 compute-0 nova_compute[183075]: 2026-01-22 16:54:57.091 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 16:54:57 compute-0 podman[211990]: 2026-01-22 16:54:57.372438101 +0000 UTC m=+0.078046342 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_id=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': 
True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, version=9.6, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 16:54:57 compute-0 nova_compute[183075]: 2026-01-22 16:54:57.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:54:57 compute-0 nova_compute[183075]: 2026-01-22 16:54:57.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:54:58 compute-0 nova_compute[183075]: 2026-01-22 16:54:58.513 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:58 compute-0 nova_compute[183075]: 2026-01-22 16:54:58.653 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:54:58 compute-0 nova_compute[183075]: 2026-01-22 16:54:58.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:55:01 compute-0 nova_compute[183075]: 2026-01-22 16:55:01.114 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:55:01 compute-0 nova_compute[183075]: 2026-01-22 16:55:01.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:55:01 compute-0 nova_compute[183075]: 2026-01-22 16:55:01.906 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:55:02 compute-0 podman[212014]: 2026-01-22 16:55:02.391498678 +0000 UTC m=+0.091810262 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute)
Jan 22 16:55:06 compute-0 nova_compute[183075]: 2026-01-22 16:55:06.082 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769100891.079939, 243d8a1b-180e-4b78-88fb-2d08cb598b7d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 16:55:06 compute-0 nova_compute[183075]: 2026-01-22 16:55:06.082 183079 INFO nova.compute.manager [-] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] VM Stopped (Lifecycle Event)
Jan 22 16:55:06 compute-0 nova_compute[183075]: 2026-01-22 16:55:06.106 183079 DEBUG nova.compute.manager [None req-3fc0afb7-18d5-41ef-a87a-e07ae2237601 - - - - - -] [instance: 243d8a1b-180e-4b78-88fb-2d08cb598b7d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 16:55:06 compute-0 nova_compute[183075]: 2026-01-22 16:55:06.142 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:55:06 compute-0 nova_compute[183075]: 2026-01-22 16:55:06.909 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:55:08 compute-0 podman[212035]: 2026-01-22 16:55:08.367000744 +0000 UTC m=+0.068402807 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 16:55:11 compute-0 nova_compute[183075]: 2026-01-22 16:55:11.146 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:55:11 compute-0 nova_compute[183075]: 2026-01-22 16:55:11.910 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:55:16 compute-0 nova_compute[183075]: 2026-01-22 16:55:16.174 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:55:16 compute-0 nova_compute[183075]: 2026-01-22 16:55:16.913 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:55:20 compute-0 podman[212059]: 2026-01-22 16:55:20.380323354 +0000 UTC m=+0.079449890 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 16:55:21 compute-0 nova_compute[183075]: 2026-01-22 16:55:21.177 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:55:21 compute-0 nova_compute[183075]: 2026-01-22 16:55:21.914 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:55:26 compute-0 nova_compute[183075]: 2026-01-22 16:55:26.222 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:55:26 compute-0 podman[212085]: 2026-01-22 16:55:26.392506142 +0000 UTC m=+0.081578954 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 16:55:26 compute-0 podman[212084]: 2026-01-22 16:55:26.414459677 +0000 UTC m=+0.115368878 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 22 16:55:26 compute-0 nova_compute[183075]: 2026-01-22 16:55:26.916 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:55:28 compute-0 podman[212129]: 2026-01-22 16:55:28.367806015 +0000 UTC m=+0.077297475 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-type=git)
Jan 22 16:55:31 compute-0 nova_compute[183075]: 2026-01-22 16:55:31.225 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:55:31 compute-0 nova_compute[183075]: 2026-01-22 16:55:31.918 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:55:33 compute-0 podman[212150]: 2026-01-22 16:55:33.38336983 +0000 UTC m=+0.080549688 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute)
Jan 22 16:55:36 compute-0 nova_compute[183075]: 2026-01-22 16:55:36.228 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:55:36 compute-0 nova_compute[183075]: 2026-01-22 16:55:36.953 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:55:37 compute-0 ovn_controller[95372]: 2026-01-22T16:55:37Z|00036|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 22 16:55:39 compute-0 podman[212171]: 2026-01-22 16:55:39.379167093 +0000 UTC m=+0.077603733 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 16:55:41 compute-0 nova_compute[183075]: 2026-01-22 16:55:41.230 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:55:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:55:41.910 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:55:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:55:41.911 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:55:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:55:41.911 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:55:41 compute-0 nova_compute[183075]: 2026-01-22 16:55:41.955 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:55:46 compute-0 nova_compute[183075]: 2026-01-22 16:55:46.232 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:55:46 compute-0 nova_compute[183075]: 2026-01-22 16:55:46.991 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:55:51 compute-0 nova_compute[183075]: 2026-01-22 16:55:51.235 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:55:51 compute-0 podman[212195]: 2026-01-22 16:55:51.362058549 +0000 UTC m=+0.064466061 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 16:55:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:55:51.872 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 16:55:51 compute-0 nova_compute[183075]: 2026-01-22 16:55:51.873 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:55:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:55:51.873 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 16:55:51 compute-0 nova_compute[183075]: 2026-01-22 16:55:51.992 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:55:52 compute-0 nova_compute[183075]: 2026-01-22 16:55:52.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:55:53 compute-0 nova_compute[183075]: 2026-01-22 16:55:53.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:55:53 compute-0 nova_compute[183075]: 2026-01-22 16:55:53.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:55:53 compute-0 nova_compute[183075]: 2026-01-22 16:55:53.787 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 16:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:55:55.445 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:55:55.445 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:55:55.445 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:55:55.445 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:55:55.445 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:55:55.445 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:55:55.446 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:55:55.446 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:55:55.446 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:55:55.446 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:55:55.446 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:55:55.446 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:55:55.446 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:55:55.446 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:55:55.446 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:55:55.446 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:55:55.446 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:55:55.446 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:55:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:55:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:55:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:55:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:55:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:55:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:55:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:55:55 compute-0 nova_compute[183075]: 2026-01-22 16:55:55.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:55:56 compute-0 nova_compute[183075]: 2026-01-22 16:55:56.237 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:55:56 compute-0 nova_compute[183075]: 2026-01-22 16:55:56.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:55:56 compute-0 nova_compute[183075]: 2026-01-22 16:55:56.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 16:55:56 compute-0 nova_compute[183075]: 2026-01-22 16:55:56.789 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 16:55:56 compute-0 nova_compute[183075]: 2026-01-22 16:55:56.805 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 16:55:56 compute-0 nova_compute[183075]: 2026-01-22 16:55:56.993 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:55:57 compute-0 podman[212222]: 2026-01-22 16:55:57.373040547 +0000 UTC m=+0.070157575 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 16:55:57 compute-0 podman[212221]: 2026-01-22 16:55:57.403103807 +0000 UTC m=+0.113342767 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 16:55:57 compute-0 nova_compute[183075]: 2026-01-22 16:55:57.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:55:57 compute-0 nova_compute[183075]: 2026-01-22 16:55:57.813 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:55:57 compute-0 nova_compute[183075]: 2026-01-22 16:55:57.814 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:55:57 compute-0 nova_compute[183075]: 2026-01-22 16:55:57.814 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:55:57 compute-0 nova_compute[183075]: 2026-01-22 16:55:57.814 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 16:55:58 compute-0 nova_compute[183075]: 2026-01-22 16:55:58.028 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 16:55:58 compute-0 nova_compute[183075]: 2026-01-22 16:55:58.029 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5789MB free_disk=73.38266372680664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 16:55:58 compute-0 nova_compute[183075]: 2026-01-22 16:55:58.030 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:55:58 compute-0 nova_compute[183075]: 2026-01-22 16:55:58.030 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:55:58 compute-0 nova_compute[183075]: 2026-01-22 16:55:58.089 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 16:55:58 compute-0 nova_compute[183075]: 2026-01-22 16:55:58.089 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 16:55:58 compute-0 nova_compute[183075]: 2026-01-22 16:55:58.120 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 16:55:58 compute-0 nova_compute[183075]: 2026-01-22 16:55:58.134 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 16:55:58 compute-0 nova_compute[183075]: 2026-01-22 16:55:58.136 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 16:55:58 compute-0 nova_compute[183075]: 2026-01-22 16:55:58.136 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:55:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:55:58.875 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 16:55:59 compute-0 nova_compute[183075]: 2026-01-22 16:55:59.136 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:55:59 compute-0 nova_compute[183075]: 2026-01-22 16:55:59.137 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:55:59 compute-0 podman[212266]: 2026-01-22 16:55:59.365329568 +0000 UTC m=+0.069185141 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, version=9.6, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, 
io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Jan 22 16:55:59 compute-0 nova_compute[183075]: 2026-01-22 16:55:59.782 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:56:01 compute-0 nova_compute[183075]: 2026-01-22 16:56:01.240 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:56:02 compute-0 nova_compute[183075]: 2026-01-22 16:56:02.035 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:56:04 compute-0 podman[212287]: 2026-01-22 16:56:04.342586246 +0000 UTC m=+0.057026974 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 16:56:06 compute-0 nova_compute[183075]: 2026-01-22 16:56:06.243 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:56:07 compute-0 nova_compute[183075]: 2026-01-22 16:56:07.038 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:56:10 compute-0 podman[212308]: 2026-01-22 16:56:10.333580078 +0000 UTC m=+0.044970978 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 16:56:11 compute-0 nova_compute[183075]: 2026-01-22 16:56:11.244 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:56:12 compute-0 nova_compute[183075]: 2026-01-22 16:56:12.080 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:56:16 compute-0 nova_compute[183075]: 2026-01-22 16:56:16.246 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:56:17 compute-0 nova_compute[183075]: 2026-01-22 16:56:17.121 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:56:21 compute-0 nova_compute[183075]: 2026-01-22 16:56:21.250 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:56:22 compute-0 nova_compute[183075]: 2026-01-22 16:56:22.122 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:56:22 compute-0 podman[212333]: 2026-01-22 16:56:22.354887385 +0000 UTC m=+0.066349418 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 16:56:26 compute-0 nova_compute[183075]: 2026-01-22 16:56:26.299 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:56:27 compute-0 nova_compute[183075]: 2026-01-22 16:56:27.124 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:56:28 compute-0 podman[212359]: 2026-01-22 16:56:28.403977967 +0000 UTC m=+0.101782332 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 16:56:28 compute-0 podman[212358]: 2026-01-22 16:56:28.444846285 +0000 UTC m=+0.145731760 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 16:56:30 compute-0 podman[212401]: 2026-01-22 16:56:30.33979425 +0000 UTC m=+0.056643141 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, config_id=openstack_network_exporter, container_name=openstack_network_exporter, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, maintainer=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, 
io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 22 16:56:31 compute-0 nova_compute[183075]: 2026-01-22 16:56:31.302 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:56:32 compute-0 nova_compute[183075]: 2026-01-22 16:56:32.172 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:56:35 compute-0 podman[212422]: 2026-01-22 16:56:35.368256414 +0000 UTC m=+0.066600412 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 22 16:56:36 compute-0 nova_compute[183075]: 2026-01-22 16:56:36.304 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:56:37 compute-0 nova_compute[183075]: 2026-01-22 16:56:37.226 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:56:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:56:38.423 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:16:d7 10.100.0.18 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-e75483ea-d78a-4ac4-b8f0-49a78a777c36', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e75483ea-d78a-4ac4-b8f0-49a78a777c36', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b16d50953b774c09b5468dfddb260bfd', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e8bf5872-b2d3-4bd9-90dd-7f2808ceb062, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d42e316c-bf24-468d-bcc3-0cbfbae2c52c) old=Port_Binding(mac=['fa:16:3e:57:16:d7 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-e75483ea-d78a-4ac4-b8f0-49a78a777c36', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e75483ea-d78a-4ac4-b8f0-49a78a777c36', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b16d50953b774c09b5468dfddb260bfd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 16:56:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:56:38.425 104629 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d42e316c-bf24-468d-bcc3-0cbfbae2c52c in datapath e75483ea-d78a-4ac4-b8f0-49a78a777c36 updated
Jan 22 16:56:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:56:38.427 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e75483ea-d78a-4ac4-b8f0-49a78a777c36, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 16:56:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:56:38.428 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c4c1dc00-3d0d-43fe-bb79-1cff7a6a2528]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 16:56:41 compute-0 nova_compute[183075]: 2026-01-22 16:56:41.307 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:56:41 compute-0 podman[212442]: 2026-01-22 16:56:41.382045826 +0000 UTC m=+0.082534839 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 16:56:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:56:41.911 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:56:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:56:41.911 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:56:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:56:41.912 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:56:42 compute-0 nova_compute[183075]: 2026-01-22 16:56:42.228 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:56:46 compute-0 nova_compute[183075]: 2026-01-22 16:56:46.349 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:56:47 compute-0 nova_compute[183075]: 2026-01-22 16:56:47.229 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:56:51 compute-0 nova_compute[183075]: 2026-01-22 16:56:51.351 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:56:52 compute-0 nova_compute[183075]: 2026-01-22 16:56:52.231 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:56:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:56:52.440 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 16:56:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:56:52.441 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 16:56:52 compute-0 nova_compute[183075]: 2026-01-22 16:56:52.441 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:56:53 compute-0 podman[212467]: 2026-01-22 16:56:53.367188547 +0000 UTC m=+0.079807767 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 16:56:53 compute-0 nova_compute[183075]: 2026-01-22 16:56:53.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:56:53 compute-0 nova_compute[183075]: 2026-01-22 16:56:53.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:56:55 compute-0 nova_compute[183075]: 2026-01-22 16:56:55.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:56:55 compute-0 nova_compute[183075]: 2026-01-22 16:56:55.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 16:56:56 compute-0 nova_compute[183075]: 2026-01-22 16:56:56.360 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:56:56 compute-0 nova_compute[183075]: 2026-01-22 16:56:56.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:56:56 compute-0 nova_compute[183075]: 2026-01-22 16:56:56.789 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 16:56:56 compute-0 nova_compute[183075]: 2026-01-22 16:56:56.789 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 16:56:56 compute-0 nova_compute[183075]: 2026-01-22 16:56:56.811 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 16:56:57 compute-0 nova_compute[183075]: 2026-01-22 16:56:57.233 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:56:57 compute-0 nova_compute[183075]: 2026-01-22 16:56:57.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:56:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:56:58.443 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 16:56:59 compute-0 podman[212492]: 2026-01-22 16:56:59.335350696 +0000 UTC m=+0.044550315 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 16:56:59 compute-0 podman[212491]: 2026-01-22 16:56:59.385366613 +0000 UTC m=+0.097632702 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 16:56:59 compute-0 nova_compute[183075]: 2026-01-22 16:56:59.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:56:59 compute-0 nova_compute[183075]: 2026-01-22 16:56:59.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:56:59 compute-0 nova_compute[183075]: 2026-01-22 16:56:59.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:56:59 compute-0 nova_compute[183075]: 2026-01-22 16:56:59.815 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:56:59 compute-0 nova_compute[183075]: 2026-01-22 16:56:59.816 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:56:59 compute-0 nova_compute[183075]: 2026-01-22 16:56:59.816 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:56:59 compute-0 nova_compute[183075]: 2026-01-22 16:56:59.816 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 16:56:59 compute-0 nova_compute[183075]: 2026-01-22 16:56:59.967 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 16:56:59 compute-0 nova_compute[183075]: 2026-01-22 16:56:59.968 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5803MB free_disk=73.38264465332031GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 16:56:59 compute-0 nova_compute[183075]: 2026-01-22 16:56:59.968 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:56:59 compute-0 nova_compute[183075]: 2026-01-22 16:56:59.968 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:57:00 compute-0 nova_compute[183075]: 2026-01-22 16:57:00.045 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 16:57:00 compute-0 nova_compute[183075]: 2026-01-22 16:57:00.046 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 16:57:00 compute-0 nova_compute[183075]: 2026-01-22 16:57:00.078 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 16:57:00 compute-0 nova_compute[183075]: 2026-01-22 16:57:00.100 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 16:57:00 compute-0 nova_compute[183075]: 2026-01-22 16:57:00.103 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 16:57:00 compute-0 nova_compute[183075]: 2026-01-22 16:57:00.103 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:57:01 compute-0 podman[212536]: 2026-01-22 16:57:01.356891169 +0000 UTC m=+0.072816684 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, 
io.buildah.version=1.33.7, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, managed_by=edpm_ansible, container_name=openstack_network_exporter, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container)
Jan 22 16:57:01 compute-0 nova_compute[183075]: 2026-01-22 16:57:01.371 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:57:02 compute-0 nova_compute[183075]: 2026-01-22 16:57:02.099 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:57:02 compute-0 nova_compute[183075]: 2026-01-22 16:57:02.235 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:57:04 compute-0 nova_compute[183075]: 2026-01-22 16:57:04.782 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:57:06 compute-0 podman[212558]: 2026-01-22 16:57:06.359372975 +0000 UTC m=+0.055294286 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 22 16:57:06 compute-0 nova_compute[183075]: 2026-01-22 16:57:06.373 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:57:07 compute-0 nova_compute[183075]: 2026-01-22 16:57:07.238 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:57:11 compute-0 nova_compute[183075]: 2026-01-22 16:57:11.375 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:57:12 compute-0 nova_compute[183075]: 2026-01-22 16:57:12.240 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:57:12 compute-0 podman[212577]: 2026-01-22 16:57:12.369460247 +0000 UTC m=+0.070045312 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 16:57:16 compute-0 nova_compute[183075]: 2026-01-22 16:57:16.377 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:57:17 compute-0 nova_compute[183075]: 2026-01-22 16:57:17.243 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:57:21 compute-0 nova_compute[183075]: 2026-01-22 16:57:21.379 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:57:22 compute-0 nova_compute[183075]: 2026-01-22 16:57:22.246 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:57:24 compute-0 podman[212601]: 2026-01-22 16:57:24.353775929 +0000 UTC m=+0.063477000 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 16:57:26 compute-0 nova_compute[183075]: 2026-01-22 16:57:26.381 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:57:27 compute-0 nova_compute[183075]: 2026-01-22 16:57:27.249 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:57:31 compute-0 podman[212625]: 2026-01-22 16:57:31.176499896 +0000 UTC m=+0.871523961 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 16:57:31 compute-0 podman[212626]: 2026-01-22 16:57:31.179399912 +0000 UTC m=+0.066723310 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 16:57:31 compute-0 nova_compute[183075]: 2026-01-22 16:57:31.386 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:57:32 compute-0 nova_compute[183075]: 2026-01-22 16:57:32.251 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:57:32 compute-0 podman[212669]: 2026-01-22 16:57:32.388174328 +0000 UTC m=+0.088311155 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=openstack_network_exporter, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git)
Jan 22 16:57:36 compute-0 nova_compute[183075]: 2026-01-22 16:57:36.389 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:57:37 compute-0 nova_compute[183075]: 2026-01-22 16:57:37.254 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:57:37 compute-0 podman[212690]: 2026-01-22 16:57:37.372585851 +0000 UTC m=+0.076576929 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true)
Jan 22 16:57:41 compute-0 nova_compute[183075]: 2026-01-22 16:57:41.429 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:57:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:57:41.912 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:57:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:57:41.913 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:57:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:57:41.913 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:57:42 compute-0 nova_compute[183075]: 2026-01-22 16:57:42.257 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:57:43 compute-0 podman[212710]: 2026-01-22 16:57:43.337699863 +0000 UTC m=+0.051688995 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 16:57:46 compute-0 nova_compute[183075]: 2026-01-22 16:57:46.470 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:57:47 compute-0 nova_compute[183075]: 2026-01-22 16:57:47.259 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:57:51 compute-0 nova_compute[183075]: 2026-01-22 16:57:51.473 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:57:52 compute-0 nova_compute[183075]: 2026-01-22 16:57:52.261 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:57:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:57:52.611 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 16:57:52 compute-0 nova_compute[183075]: 2026-01-22 16:57:52.612 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:57:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:57:52.613 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 16:57:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:57:52.614 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 16:57:53 compute-0 nova_compute[183075]: 2026-01-22 16:57:53.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:57:54 compute-0 nova_compute[183075]: 2026-01-22 16:57:54.789 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:57:55 compute-0 podman[212735]: 2026-01-22 16:57:55.366868691 +0000 UTC m=+0.082280988 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 16:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:57:55.446 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:57:55.446 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:57:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:57:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:57:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:57:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:57:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:57:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:57:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:57:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:57:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:57:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:57:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:57:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:57:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:57:55.449 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:57:55.449 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:57:55.449 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:57:55.449 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:57:55.449 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:57:55.449 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:57:55.449 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:57:55.450 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:57:55.450 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:57:55.450 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:57:55 compute-0 nova_compute[183075]: 2026-01-22 16:57:55.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:57:55 compute-0 nova_compute[183075]: 2026-01-22 16:57:55.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 16:57:56 compute-0 nova_compute[183075]: 2026-01-22 16:57:56.475 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:57:57 compute-0 nova_compute[183075]: 2026-01-22 16:57:57.264 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:57:57 compute-0 nova_compute[183075]: 2026-01-22 16:57:57.789 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:57:57 compute-0 nova_compute[183075]: 2026-01-22 16:57:57.790 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 16:57:57 compute-0 nova_compute[183075]: 2026-01-22 16:57:57.790 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 16:57:57 compute-0 nova_compute[183075]: 2026-01-22 16:57:57.812 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 16:57:58 compute-0 nova_compute[183075]: 2026-01-22 16:57:58.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:57:59 compute-0 nova_compute[183075]: 2026-01-22 16:57:59.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:57:59 compute-0 nova_compute[183075]: 2026-01-22 16:57:59.816 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:57:59 compute-0 nova_compute[183075]: 2026-01-22 16:57:59.817 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:57:59 compute-0 nova_compute[183075]: 2026-01-22 16:57:59.817 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:57:59 compute-0 nova_compute[183075]: 2026-01-22 16:57:59.817 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 16:58:00 compute-0 nova_compute[183075]: 2026-01-22 16:58:00.025 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 16:58:00 compute-0 nova_compute[183075]: 2026-01-22 16:58:00.027 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5809MB free_disk=73.38266372680664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 16:58:00 compute-0 nova_compute[183075]: 2026-01-22 16:58:00.027 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:58:00 compute-0 nova_compute[183075]: 2026-01-22 16:58:00.027 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:58:00 compute-0 nova_compute[183075]: 2026-01-22 16:58:00.106 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 16:58:00 compute-0 nova_compute[183075]: 2026-01-22 16:58:00.107 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 16:58:00 compute-0 nova_compute[183075]: 2026-01-22 16:58:00.128 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 16:58:00 compute-0 nova_compute[183075]: 2026-01-22 16:58:00.142 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 16:58:00 compute-0 nova_compute[183075]: 2026-01-22 16:58:00.144 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 16:58:00 compute-0 nova_compute[183075]: 2026-01-22 16:58:00.144 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:58:01 compute-0 podman[212760]: 2026-01-22 16:58:01.360852338 +0000 UTC m=+0.062003846 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 16:58:01 compute-0 podman[212759]: 2026-01-22 16:58:01.425690357 +0000 UTC m=+0.126204238 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 16:58:01 compute-0 nova_compute[183075]: 2026-01-22 16:58:01.478 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:58:02 compute-0 nova_compute[183075]: 2026-01-22 16:58:02.140 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:58:02 compute-0 nova_compute[183075]: 2026-01-22 16:58:02.141 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:58:02 compute-0 nova_compute[183075]: 2026-01-22 16:58:02.141 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:58:02 compute-0 nova_compute[183075]: 2026-01-22 16:58:02.267 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:58:03 compute-0 podman[212805]: 2026-01-22 16:58:03.389883801 +0000 UTC m=+0.101746387 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, name=ubi9-minimal, maintainer=Red Hat, Inc.)
Jan 22 16:58:06 compute-0 nova_compute[183075]: 2026-01-22 16:58:06.484 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:58:07 compute-0 nova_compute[183075]: 2026-01-22 16:58:07.315 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:58:08 compute-0 sshd-session[212828]: Received disconnect from 45.227.254.170 port 58280:11:  [preauth]
Jan 22 16:58:08 compute-0 sshd-session[212828]: Disconnected from authenticating user root 45.227.254.170 port 58280 [preauth]
Jan 22 16:58:08 compute-0 podman[212830]: 2026-01-22 16:58:08.377812734 +0000 UTC m=+0.087459013 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 16:58:11 compute-0 nova_compute[183075]: 2026-01-22 16:58:11.487 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:58:12 compute-0 nova_compute[183075]: 2026-01-22 16:58:12.317 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:58:14 compute-0 podman[212850]: 2026-01-22 16:58:14.376297621 +0000 UTC m=+0.083401488 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 16:58:16 compute-0 nova_compute[183075]: 2026-01-22 16:58:16.526 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:58:17 compute-0 nova_compute[183075]: 2026-01-22 16:58:17.318 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:58:21 compute-0 nova_compute[183075]: 2026-01-22 16:58:21.528 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:58:22 compute-0 nova_compute[183075]: 2026-01-22 16:58:22.320 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:58:26 compute-0 podman[212875]: 2026-01-22 16:58:26.349768308 +0000 UTC m=+0.058595236 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 16:58:26 compute-0 nova_compute[183075]: 2026-01-22 16:58:26.530 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:58:27 compute-0 nova_compute[183075]: 2026-01-22 16:58:27.322 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:58:31 compute-0 nova_compute[183075]: 2026-01-22 16:58:31.545 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:58:32 compute-0 nova_compute[183075]: 2026-01-22 16:58:32.324 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:58:32 compute-0 podman[212901]: 2026-01-22 16:58:32.374758219 +0000 UTC m=+0.076664270 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 22 16:58:32 compute-0 podman[212900]: 2026-01-22 16:58:32.406208302 +0000 UTC m=+0.105940456 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 16:58:34 compute-0 podman[212940]: 2026-01-22 16:58:34.378093089 +0000 UTC m=+0.085778619 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, vcs-type=git, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 22 16:58:36 compute-0 nova_compute[183075]: 2026-01-22 16:58:36.547 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:58:37 compute-0 nova_compute[183075]: 2026-01-22 16:58:37.326 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:58:39 compute-0 podman[212961]: 2026-01-22 16:58:39.391079999 +0000 UTC m=+0.086472017 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 16:58:41 compute-0 nova_compute[183075]: 2026-01-22 16:58:41.592 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:58:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:58:41.913 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:58:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:58:41.914 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:58:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:58:41.915 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:58:42 compute-0 nova_compute[183075]: 2026-01-22 16:58:42.327 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:58:45 compute-0 podman[212981]: 2026-01-22 16:58:45.342939373 +0000 UTC m=+0.055469254 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 16:58:46 compute-0 nova_compute[183075]: 2026-01-22 16:58:46.628 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:58:47 compute-0 nova_compute[183075]: 2026-01-22 16:58:47.329 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:58:51 compute-0 nova_compute[183075]: 2026-01-22 16:58:51.630 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:58:52 compute-0 nova_compute[183075]: 2026-01-22 16:58:52.331 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:58:52 compute-0 nova_compute[183075]: 2026-01-22 16:58:52.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:58:52 compute-0 nova_compute[183075]: 2026-01-22 16:58:52.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 16:58:52 compute-0 nova_compute[183075]: 2026-01-22 16:58:52.814 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 16:58:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:58:53.688 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 16:58:53 compute-0 nova_compute[183075]: 2026-01-22 16:58:53.688 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:58:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:58:53.690 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 16:58:53 compute-0 nova_compute[183075]: 2026-01-22 16:58:53.815 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:58:54 compute-0 nova_compute[183075]: 2026-01-22 16:58:54.789 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:58:56 compute-0 nova_compute[183075]: 2026-01-22 16:58:56.665 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:58:56 compute-0 nova_compute[183075]: 2026-01-22 16:58:56.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:58:56 compute-0 nova_compute[183075]: 2026-01-22 16:58:56.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 16:58:57 compute-0 nova_compute[183075]: 2026-01-22 16:58:57.334 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:58:57 compute-0 podman[213005]: 2026-01-22 16:58:57.375649942 +0000 UTC m=+0.074256567 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 16:58:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:58:57.692 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 16:58:57 compute-0 nova_compute[183075]: 2026-01-22 16:58:57.811 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:58:57 compute-0 nova_compute[183075]: 2026-01-22 16:58:57.811 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 16:58:57 compute-0 nova_compute[183075]: 2026-01-22 16:58:57.812 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 16:58:57 compute-0 nova_compute[183075]: 2026-01-22 16:58:57.825 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 16:58:57 compute-0 nova_compute[183075]: 2026-01-22 16:58:57.826 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:58:57 compute-0 nova_compute[183075]: 2026-01-22 16:58:57.826 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 16:58:58 compute-0 nova_compute[183075]: 2026-01-22 16:58:58.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:59:01 compute-0 anacron[4164]: Job `cron.weekly' started
Jan 22 16:59:01 compute-0 anacron[4164]: Job `cron.weekly' terminated
Jan 22 16:59:01 compute-0 nova_compute[183075]: 2026-01-22 16:59:01.667 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:59:01 compute-0 nova_compute[183075]: 2026-01-22 16:59:01.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:59:01 compute-0 nova_compute[183075]: 2026-01-22 16:59:01.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:59:01 compute-0 nova_compute[183075]: 2026-01-22 16:59:01.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:59:01 compute-0 nova_compute[183075]: 2026-01-22 16:59:01.810 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:59:01 compute-0 nova_compute[183075]: 2026-01-22 16:59:01.811 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:59:01 compute-0 nova_compute[183075]: 2026-01-22 16:59:01.811 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:59:01 compute-0 nova_compute[183075]: 2026-01-22 16:59:01.812 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 16:59:02 compute-0 nova_compute[183075]: 2026-01-22 16:59:02.032 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 16:59:02 compute-0 nova_compute[183075]: 2026-01-22 16:59:02.032 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5809MB free_disk=73.38264083862305GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 16:59:02 compute-0 nova_compute[183075]: 2026-01-22 16:59:02.033 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:59:02 compute-0 nova_compute[183075]: 2026-01-22 16:59:02.033 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:59:02 compute-0 nova_compute[183075]: 2026-01-22 16:59:02.335 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:59:02 compute-0 nova_compute[183075]: 2026-01-22 16:59:02.343 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 16:59:02 compute-0 nova_compute[183075]: 2026-01-22 16:59:02.344 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 16:59:02 compute-0 nova_compute[183075]: 2026-01-22 16:59:02.412 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 16:59:02 compute-0 nova_compute[183075]: 2026-01-22 16:59:02.431 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 16:59:02 compute-0 nova_compute[183075]: 2026-01-22 16:59:02.434 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 16:59:02 compute-0 nova_compute[183075]: 2026-01-22 16:59:02.434 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.401s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:59:03 compute-0 podman[213032]: 2026-01-22 16:59:03.344049131 +0000 UTC m=+0.051855920 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 22 16:59:03 compute-0 podman[213031]: 2026-01-22 16:59:03.401547928 +0000 UTC m=+0.103459222 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 16:59:03 compute-0 nova_compute[183075]: 2026-01-22 16:59:03.435 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:59:04 compute-0 nova_compute[183075]: 2026-01-22 16:59:04.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:59:05 compute-0 podman[213074]: 2026-01-22 16:59:05.347298158 +0000 UTC m=+0.063519126 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1755695350, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 22 16:59:05 compute-0 nova_compute[183075]: 2026-01-22 16:59:05.795 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:59:06 compute-0 nova_compute[183075]: 2026-01-22 16:59:06.669 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:59:07 compute-0 nova_compute[183075]: 2026-01-22 16:59:07.338 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:59:10 compute-0 podman[213095]: 2026-01-22 16:59:10.392496284 +0000 UTC m=+0.101499471 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, tcib_managed=true)
Jan 22 16:59:11 compute-0 nova_compute[183075]: 2026-01-22 16:59:11.671 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:59:12 compute-0 nova_compute[183075]: 2026-01-22 16:59:12.340 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:59:16 compute-0 podman[213116]: 2026-01-22 16:59:16.365162123 +0000 UTC m=+0.071678359 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 16:59:16 compute-0 nova_compute[183075]: 2026-01-22 16:59:16.716 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:59:17 compute-0 nova_compute[183075]: 2026-01-22 16:59:17.341 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:59:18 compute-0 nova_compute[183075]: 2026-01-22 16:59:18.747 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:59:21 compute-0 nova_compute[183075]: 2026-01-22 16:59:21.717 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:59:22 compute-0 nova_compute[183075]: 2026-01-22 16:59:22.343 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:59:26 compute-0 nova_compute[183075]: 2026-01-22 16:59:26.719 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:59:27 compute-0 nova_compute[183075]: 2026-01-22 16:59:27.345 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:59:28 compute-0 podman[213140]: 2026-01-22 16:59:28.345985563 +0000 UTC m=+0.058289238 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 16:59:31 compute-0 nova_compute[183075]: 2026-01-22 16:59:31.722 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:59:32 compute-0 nova_compute[183075]: 2026-01-22 16:59:32.381 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:59:34 compute-0 podman[213166]: 2026-01-22 16:59:34.409232978 +0000 UTC m=+0.104707775 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 16:59:34 compute-0 podman[213165]: 2026-01-22 16:59:34.426454849 +0000 UTC m=+0.127771639 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 16:59:36 compute-0 podman[213209]: 2026-01-22 16:59:36.405225746 +0000 UTC m=+0.111993366 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, managed_by=edpm_ansible, io.openshift.expose-services=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 22 16:59:36 compute-0 nova_compute[183075]: 2026-01-22 16:59:36.725 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:59:37 compute-0 nova_compute[183075]: 2026-01-22 16:59:37.416 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:59:41 compute-0 podman[213231]: 2026-01-22 16:59:41.39781631 +0000 UTC m=+0.102172418 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 16:59:41 compute-0 nova_compute[183075]: 2026-01-22 16:59:41.733 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:59:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:59:41.915 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 16:59:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:59:41.915 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 16:59:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:59:41.915 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 16:59:42 compute-0 nova_compute[183075]: 2026-01-22 16:59:42.464 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:59:46 compute-0 nova_compute[183075]: 2026-01-22 16:59:46.738 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:59:47 compute-0 podman[213254]: 2026-01-22 16:59:47.338327031 +0000 UTC m=+0.051570922 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 16:59:47 compute-0 nova_compute[183075]: 2026-01-22 16:59:47.466 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:59:51 compute-0 nova_compute[183075]: 2026-01-22 16:59:51.741 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:59:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:59:52.271 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5a:16:67 10.100.0.18 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-bab61d81-8962-4825-ba74-ee042fb16882', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bab61d81-8962-4825-ba74-ee042fb16882', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3c82f089f9ae4a02b05a6f979a30d88d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c5d58542-1a8e-4531-9a98-50476806045d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=53459905-c69a-4af0-a98a-3bf254fff9ff) old=Port_Binding(mac=['fa:16:3e:5a:16:67 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-bab61d81-8962-4825-ba74-ee042fb16882', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bab61d81-8962-4825-ba74-ee042fb16882', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3c82f089f9ae4a02b05a6f979a30d88d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 16:59:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:59:52.274 104629 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 53459905-c69a-4af0-a98a-3bf254fff9ff in datapath bab61d81-8962-4825-ba74-ee042fb16882 updated
Jan 22 16:59:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:59:52.276 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bab61d81-8962-4825-ba74-ee042fb16882, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 16:59:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:59:52.278 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[dc61dbca-3cfa-4347-8a25-f3cfbcfe419c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 16:59:52 compute-0 nova_compute[183075]: 2026-01-22 16:59:52.507 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:59:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:59:53.827 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 16:59:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:59:53.829 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 16:59:53 compute-0 nova_compute[183075]: 2026-01-22 16:59:53.829 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:59:54 compute-0 nova_compute[183075]: 2026-01-22 16:59:54.814 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:59:55.446 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:59:55.446 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:59:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:59:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:59:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:59:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:59:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:59:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:59:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:59:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:59:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:59:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:59:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:59:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:59:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:59:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:59:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:59:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:59:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:59:55.449 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:59:55.449 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:59:55.449 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:59:55.449 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:59:55.449 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 16:59:55.449 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 16:59:55 compute-0 nova_compute[183075]: 2026-01-22 16:59:55.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:59:56 compute-0 nova_compute[183075]: 2026-01-22 16:59:56.744 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:59:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 16:59:56.831 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 16:59:57 compute-0 nova_compute[183075]: 2026-01-22 16:59:57.549 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 16:59:58 compute-0 nova_compute[183075]: 2026-01-22 16:59:58.789 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:59:58 compute-0 nova_compute[183075]: 2026-01-22 16:59:58.789 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 16:59:58 compute-0 nova_compute[183075]: 2026-01-22 16:59:58.789 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 16:59:58 compute-0 nova_compute[183075]: 2026-01-22 16:59:58.837 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 16:59:58 compute-0 nova_compute[183075]: 2026-01-22 16:59:58.838 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:59:59 compute-0 podman[213278]: 2026-01-22 16:59:59.385189028 +0000 UTC m=+0.089033511 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 16:59:59 compute-0 nova_compute[183075]: 2026-01-22 16:59:59.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 16:59:59 compute-0 nova_compute[183075]: 2026-01-22 16:59:59.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:00:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:00:00.065 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5a:16:67 10.100.0.18 10.100.0.2 10.100.0.34'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28 10.100.0.34/28', 'neutron:device_id': 'ovnmeta-bab61d81-8962-4825-ba74-ee042fb16882', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bab61d81-8962-4825-ba74-ee042fb16882', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3c82f089f9ae4a02b05a6f979a30d88d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c5d58542-1a8e-4531-9a98-50476806045d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=53459905-c69a-4af0-a98a-3bf254fff9ff) old=Port_Binding(mac=['fa:16:3e:5a:16:67 10.100.0.18 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-bab61d81-8962-4825-ba74-ee042fb16882', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bab61d81-8962-4825-ba74-ee042fb16882', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3c82f089f9ae4a02b05a6f979a30d88d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:00:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:00:00.068 104629 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 53459905-c69a-4af0-a98a-3bf254fff9ff in datapath bab61d81-8962-4825-ba74-ee042fb16882 updated
Jan 22 17:00:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:00:00.071 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bab61d81-8962-4825-ba74-ee042fb16882, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:00:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:00:00.072 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[aba79966-2317-49cf-b7e0-404edfe3b8c7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:00:01 compute-0 nova_compute[183075]: 2026-01-22 17:00:01.747 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:00:01 compute-0 nova_compute[183075]: 2026-01-22 17:00:01.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:00:01 compute-0 nova_compute[183075]: 2026-01-22 17:00:01.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:00:02 compute-0 nova_compute[183075]: 2026-01-22 17:00:02.127 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:00:02 compute-0 nova_compute[183075]: 2026-01-22 17:00:02.128 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:00:02 compute-0 nova_compute[183075]: 2026-01-22 17:00:02.128 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:00:02 compute-0 nova_compute[183075]: 2026-01-22 17:00:02.128 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:00:02 compute-0 nova_compute[183075]: 2026-01-22 17:00:02.319 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:00:02 compute-0 nova_compute[183075]: 2026-01-22 17:00:02.320 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5806MB free_disk=73.38234329223633GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:00:02 compute-0 nova_compute[183075]: 2026-01-22 17:00:02.320 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:00:02 compute-0 nova_compute[183075]: 2026-01-22 17:00:02.321 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:00:02 compute-0 nova_compute[183075]: 2026-01-22 17:00:02.390 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:00:02 compute-0 nova_compute[183075]: 2026-01-22 17:00:02.390 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:00:02 compute-0 nova_compute[183075]: 2026-01-22 17:00:02.406 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Refreshing inventories for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 17:00:02 compute-0 nova_compute[183075]: 2026-01-22 17:00:02.422 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Updating ProviderTree inventory for provider 2513134c-f67c-4237-84bf-4ebe2450d610 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 17:00:02 compute-0 nova_compute[183075]: 2026-01-22 17:00:02.423 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Updating inventory in ProviderTree for provider 2513134c-f67c-4237-84bf-4ebe2450d610 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 17:00:02 compute-0 nova_compute[183075]: 2026-01-22 17:00:02.551 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:00:02 compute-0 nova_compute[183075]: 2026-01-22 17:00:02.684 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Refreshing aggregate associations for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 17:00:02 compute-0 nova_compute[183075]: 2026-01-22 17:00:02.710 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Refreshing trait associations for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610, traits: COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SVM,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE42,COMPUTE_NODE,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE4A,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_FMA3,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_AUTO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 17:00:02 compute-0 nova_compute[183075]: 2026-01-22 17:00:02.732 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:00:02 compute-0 nova_compute[183075]: 2026-01-22 17:00:02.752 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:00:02 compute-0 nova_compute[183075]: 2026-01-22 17:00:02.755 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:00:02 compute-0 nova_compute[183075]: 2026-01-22 17:00:02.755 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.435s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:00:04 compute-0 nova_compute[183075]: 2026-01-22 17:00:04.751 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:00:04 compute-0 nova_compute[183075]: 2026-01-22 17:00:04.752 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:00:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:00:04.971 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5a:16:67 10.100.0.18 10.100.0.2 10.100.0.34 10.100.0.50'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28 10.100.0.34/28 10.100.0.50/28', 'neutron:device_id': 'ovnmeta-bab61d81-8962-4825-ba74-ee042fb16882', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bab61d81-8962-4825-ba74-ee042fb16882', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3c82f089f9ae4a02b05a6f979a30d88d', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c5d58542-1a8e-4531-9a98-50476806045d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=53459905-c69a-4af0-a98a-3bf254fff9ff) old=Port_Binding(mac=['fa:16:3e:5a:16:67 10.100.0.18 10.100.0.2 10.100.0.34'], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28 10.100.0.34/28', 'neutron:device_id': 'ovnmeta-bab61d81-8962-4825-ba74-ee042fb16882', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bab61d81-8962-4825-ba74-ee042fb16882', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3c82f089f9ae4a02b05a6f979a30d88d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:00:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:00:04.973 104629 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 53459905-c69a-4af0-a98a-3bf254fff9ff in datapath bab61d81-8962-4825-ba74-ee042fb16882 updated
Jan 22 17:00:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:00:04.975 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bab61d81-8962-4825-ba74-ee042fb16882, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:00:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:00:04.976 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[cfa91367-51ac-4eb9-8436-423d6f76b4fa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:00:05 compute-0 podman[213304]: 2026-01-22 17:00:05.343459578 +0000 UTC m=+0.054647353 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 17:00:05 compute-0 podman[213303]: 2026-01-22 17:00:05.386528205 +0000 UTC m=+0.098986424 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 17:00:06 compute-0 nova_compute[183075]: 2026-01-22 17:00:06.749 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:00:07 compute-0 podman[213350]: 2026-01-22 17:00:07.380980494 +0000 UTC m=+0.082316743 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, version=9.6, distribution-scope=public, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, vcs-type=git, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.openshift.expose-services=)
Jan 22 17:00:07 compute-0 nova_compute[183075]: 2026-01-22 17:00:07.553 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:00:07 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:00:07.735 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5a:16:67 10.100.0.18 10.100.0.2 10.100.0.34 10.100.0.50 10.100.0.66'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28 10.100.0.34/28 10.100.0.50/28 10.100.0.66/28', 'neutron:device_id': 'ovnmeta-bab61d81-8962-4825-ba74-ee042fb16882', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bab61d81-8962-4825-ba74-ee042fb16882', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3c82f089f9ae4a02b05a6f979a30d88d', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c5d58542-1a8e-4531-9a98-50476806045d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=53459905-c69a-4af0-a98a-3bf254fff9ff) old=Port_Binding(mac=['fa:16:3e:5a:16:67 10.100.0.18 10.100.0.2 10.100.0.34 10.100.0.50'], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28 10.100.0.34/28 10.100.0.50/28', 'neutron:device_id': 'ovnmeta-bab61d81-8962-4825-ba74-ee042fb16882', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bab61d81-8962-4825-ba74-ee042fb16882', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3c82f089f9ae4a02b05a6f979a30d88d', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:00:07 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:00:07.737 104629 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 53459905-c69a-4af0-a98a-3bf254fff9ff in datapath bab61d81-8962-4825-ba74-ee042fb16882 updated
Jan 22 17:00:07 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:00:07.740 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bab61d81-8962-4825-ba74-ee042fb16882, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:00:07 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:00:07.741 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ff531192-56e3-4749-b3a5-aae98ad87e7c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:00:11 compute-0 nova_compute[183075]: 2026-01-22 17:00:11.753 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:00:12 compute-0 podman[213373]: 2026-01-22 17:00:12.393177474 +0000 UTC m=+0.090503229 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 22 17:00:12 compute-0 nova_compute[183075]: 2026-01-22 17:00:12.556 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:00:16 compute-0 nova_compute[183075]: 2026-01-22 17:00:16.756 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:00:17 compute-0 nova_compute[183075]: 2026-01-22 17:00:17.558 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:00:18 compute-0 podman[213394]: 2026-01-22 17:00:18.375462718 +0000 UTC m=+0.075865174 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:00:21 compute-0 nova_compute[183075]: 2026-01-22 17:00:21.759 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:00:22 compute-0 nova_compute[183075]: 2026-01-22 17:00:22.560 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:00:26 compute-0 nova_compute[183075]: 2026-01-22 17:00:26.761 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:00:27 compute-0 nova_compute[183075]: 2026-01-22 17:00:27.563 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:00:30 compute-0 podman[213418]: 2026-01-22 17:00:30.377882723 +0000 UTC m=+0.088080606 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 17:00:31 compute-0 nova_compute[183075]: 2026-01-22 17:00:31.764 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:00:33 compute-0 nova_compute[183075]: 2026-01-22 17:00:33.532 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:00:36 compute-0 podman[213445]: 2026-01-22 17:00:36.383133162 +0000 UTC m=+0.081549444 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:00:36 compute-0 podman[213444]: 2026-01-22 17:00:36.425335596 +0000 UTC m=+0.126559152 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:00:36 compute-0 nova_compute[183075]: 2026-01-22 17:00:36.766 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:00:38 compute-0 podman[213491]: 2026-01-22 17:00:38.38813394 +0000 UTC m=+0.093366435 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, config_id=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, release=1755695350, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public)
Jan 22 17:00:38 compute-0 nova_compute[183075]: 2026-01-22 17:00:38.534 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:00:41 compute-0 nova_compute[183075]: 2026-01-22 17:00:41.768 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:00:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:00:41.915 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:00:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:00:41.916 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:00:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:00:41.916 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:00:43 compute-0 podman[213513]: 2026-01-22 17:00:43.379741036 +0000 UTC m=+0.080913556 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Jan 22 17:00:43 compute-0 nova_compute[183075]: 2026-01-22 17:00:43.536 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:00:46 compute-0 nova_compute[183075]: 2026-01-22 17:00:46.770 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:00:48 compute-0 nova_compute[183075]: 2026-01-22 17:00:48.537 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:00:49 compute-0 podman[213536]: 2026-01-22 17:00:49.37645503 +0000 UTC m=+0.079017457 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:00:51 compute-0 nova_compute[183075]: 2026-01-22 17:00:51.771 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:00:53 compute-0 nova_compute[183075]: 2026-01-22 17:00:53.540 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:00:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:00:54.055 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:00:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:00:54.057 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:00:54 compute-0 nova_compute[183075]: 2026-01-22 17:00:54.098 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:00:54 compute-0 nova_compute[183075]: 2026-01-22 17:00:54.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:00:55 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:00:55.060 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:00:55 compute-0 nova_compute[183075]: 2026-01-22 17:00:55.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:00:56 compute-0 nova_compute[183075]: 2026-01-22 17:00:56.772 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:00:58 compute-0 nova_compute[183075]: 2026-01-22 17:00:58.565 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:00:58 compute-0 nova_compute[183075]: 2026-01-22 17:00:58.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:00:58 compute-0 nova_compute[183075]: 2026-01-22 17:00:58.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:00:58 compute-0 nova_compute[183075]: 2026-01-22 17:00:58.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:00:58 compute-0 nova_compute[183075]: 2026-01-22 17:00:58.817 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 17:00:58 compute-0 nova_compute[183075]: 2026-01-22 17:00:58.817 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:01:01 compute-0 CROND[213562]: (root) CMD (run-parts /etc/cron.hourly)
Jan 22 17:01:01 compute-0 run-parts[213565]: (/etc/cron.hourly) starting 0anacron
Jan 22 17:01:01 compute-0 run-parts[213572]: (/etc/cron.hourly) finished 0anacron
Jan 22 17:01:01 compute-0 CROND[213561]: (root) CMDEND (run-parts /etc/cron.hourly)
Jan 22 17:01:01 compute-0 podman[213566]: 2026-01-22 17:01:01.356820273 +0000 UTC m=+0.061604858 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:01:01 compute-0 nova_compute[183075]: 2026-01-22 17:01:01.774 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:01:01 compute-0 nova_compute[183075]: 2026-01-22 17:01:01.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:01:01 compute-0 nova_compute[183075]: 2026-01-22 17:01:01.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:01:03 compute-0 nova_compute[183075]: 2026-01-22 17:01:03.567 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:01:03 compute-0 nova_compute[183075]: 2026-01-22 17:01:03.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:01:03 compute-0 nova_compute[183075]: 2026-01-22 17:01:03.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:01:03 compute-0 nova_compute[183075]: 2026-01-22 17:01:03.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:01:03 compute-0 nova_compute[183075]: 2026-01-22 17:01:03.817 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:01:03 compute-0 nova_compute[183075]: 2026-01-22 17:01:03.818 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:01:03 compute-0 nova_compute[183075]: 2026-01-22 17:01:03.818 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:01:03 compute-0 nova_compute[183075]: 2026-01-22 17:01:03.818 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:01:03 compute-0 nova_compute[183075]: 2026-01-22 17:01:03.965 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:01:03 compute-0 nova_compute[183075]: 2026-01-22 17:01:03.966 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5811MB free_disk=73.38227462768555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:01:03 compute-0 nova_compute[183075]: 2026-01-22 17:01:03.966 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:01:03 compute-0 nova_compute[183075]: 2026-01-22 17:01:03.966 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:01:04 compute-0 nova_compute[183075]: 2026-01-22 17:01:04.024 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:01:04 compute-0 nova_compute[183075]: 2026-01-22 17:01:04.024 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:01:04 compute-0 nova_compute[183075]: 2026-01-22 17:01:04.055 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:01:04 compute-0 nova_compute[183075]: 2026-01-22 17:01:04.067 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:01:04 compute-0 nova_compute[183075]: 2026-01-22 17:01:04.068 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:01:04 compute-0 nova_compute[183075]: 2026-01-22 17:01:04.068 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:01:05 compute-0 nova_compute[183075]: 2026-01-22 17:01:05.065 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:01:05 compute-0 nova_compute[183075]: 2026-01-22 17:01:05.782 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:01:06 compute-0 nova_compute[183075]: 2026-01-22 17:01:06.802 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:01:07 compute-0 podman[213597]: 2026-01-22 17:01:07.357495812 +0000 UTC m=+0.066235819 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 22 17:01:07 compute-0 podman[213596]: 2026-01-22 17:01:07.435709597 +0000 UTC m=+0.146141198 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 17:01:08 compute-0 nova_compute[183075]: 2026-01-22 17:01:08.570 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:01:09 compute-0 podman[213640]: 2026-01-22 17:01:09.369376003 +0000 UTC m=+0.085797116 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.buildah.version=1.33.7, managed_by=edpm_ansible, config_id=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., release=1755695350, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41)
Jan 22 17:01:11 compute-0 nova_compute[183075]: 2026-01-22 17:01:11.804 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:01:13 compute-0 nova_compute[183075]: 2026-01-22 17:01:13.572 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:01:14 compute-0 podman[213661]: 2026-01-22 17:01:14.371558967 +0000 UTC m=+0.086637308 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 22 17:01:16 compute-0 nova_compute[183075]: 2026-01-22 17:01:16.806 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:01:18 compute-0 nova_compute[183075]: 2026-01-22 17:01:18.575 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:01:20 compute-0 podman[213681]: 2026-01-22 17:01:20.369821622 +0000 UTC m=+0.073264625 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 17:01:21 compute-0 nova_compute[183075]: 2026-01-22 17:01:21.808 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:01:23 compute-0 nova_compute[183075]: 2026-01-22 17:01:23.577 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:01:26 compute-0 nova_compute[183075]: 2026-01-22 17:01:26.810 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:01:28 compute-0 nova_compute[183075]: 2026-01-22 17:01:28.579 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:01:31 compute-0 nova_compute[183075]: 2026-01-22 17:01:31.812 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:01:32 compute-0 podman[213704]: 2026-01-22 17:01:32.370768107 +0000 UTC m=+0.073819819 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:01:33 compute-0 nova_compute[183075]: 2026-01-22 17:01:33.656 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:01:36 compute-0 nova_compute[183075]: 2026-01-22 17:01:36.814 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:01:38 compute-0 podman[213729]: 2026-01-22 17:01:38.416784223 +0000 UTC m=+0.109514632 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 22 17:01:38 compute-0 podman[213728]: 2026-01-22 17:01:38.432451796 +0000 UTC m=+0.141733431 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:01:38 compute-0 nova_compute[183075]: 2026-01-22 17:01:38.659 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:01:40 compute-0 podman[213770]: 2026-01-22 17:01:40.391693857 +0000 UTC m=+0.091189707 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, container_name=openstack_network_exporter, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, distribution-scope=public, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 22 17:01:41 compute-0 nova_compute[183075]: 2026-01-22 17:01:41.816 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:01:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:01:41.916 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:01:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:01:41.917 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:01:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:01:41.917 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:01:43 compute-0 nova_compute[183075]: 2026-01-22 17:01:43.717 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:01:45 compute-0 podman[213791]: 2026-01-22 17:01:45.371397809 +0000 UTC m=+0.076045768 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 17:01:46 compute-0 nova_compute[183075]: 2026-01-22 17:01:46.819 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:01:48 compute-0 nova_compute[183075]: 2026-01-22 17:01:48.718 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:01:51 compute-0 podman[213810]: 2026-01-22 17:01:51.381577649 +0000 UTC m=+0.080465605 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 17:01:51 compute-0 nova_compute[183075]: 2026-01-22 17:01:51.821 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:01:53 compute-0 nova_compute[183075]: 2026-01-22 17:01:53.720 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:01:55.449 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:01:55.450 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:01:55.450 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:01:55.450 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:01:55.451 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:01:55.451 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:01:55.451 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:01:55.451 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:01:55.451 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:01:55.451 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:01:55.452 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:01:55.452 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:01:55.452 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:01:55.452 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:01:55.452 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:01:55.452 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:01:55.452 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:01:55.453 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:01:55.453 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:01:55.453 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:01:55.453 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:01:55.453 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:01:55.454 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:01:55.454 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:01:55.454 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:01:56 compute-0 nova_compute[183075]: 2026-01-22 17:01:56.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:01:56 compute-0 nova_compute[183075]: 2026-01-22 17:01:56.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:01:56 compute-0 nova_compute[183075]: 2026-01-22 17:01:56.823 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:01:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:01:58.040 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:01:58 compute-0 nova_compute[183075]: 2026-01-22 17:01:58.041 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:01:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:01:58.043 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:01:58 compute-0 nova_compute[183075]: 2026-01-22 17:01:58.722 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:01:59 compute-0 nova_compute[183075]: 2026-01-22 17:01:59.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:01:59 compute-0 nova_compute[183075]: 2026-01-22 17:01:59.789 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:01:59 compute-0 nova_compute[183075]: 2026-01-22 17:01:59.789 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:01:59 compute-0 nova_compute[183075]: 2026-01-22 17:01:59.875 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 17:02:00 compute-0 nova_compute[183075]: 2026-01-22 17:02:00.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:02:01 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:02:01.047 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:02:01 compute-0 nova_compute[183075]: 2026-01-22 17:02:01.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:02:01 compute-0 nova_compute[183075]: 2026-01-22 17:02:01.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:02:01 compute-0 nova_compute[183075]: 2026-01-22 17:02:01.825 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:02:03 compute-0 podman[213834]: 2026-01-22 17:02:03.353135322 +0000 UTC m=+0.065086981 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:02:03 compute-0 nova_compute[183075]: 2026-01-22 17:02:03.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:02:03 compute-0 nova_compute[183075]: 2026-01-22 17:02:03.789 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:02:03 compute-0 nova_compute[183075]: 2026-01-22 17:02:03.790 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:02:04 compute-0 nova_compute[183075]: 2026-01-22 17:02:04.785 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:02:04 compute-0 nova_compute[183075]: 2026-01-22 17:02:04.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:02:04 compute-0 nova_compute[183075]: 2026-01-22 17:02:04.816 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:02:04 compute-0 nova_compute[183075]: 2026-01-22 17:02:04.817 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:02:04 compute-0 nova_compute[183075]: 2026-01-22 17:02:04.817 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:02:04 compute-0 nova_compute[183075]: 2026-01-22 17:02:04.817 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:02:05 compute-0 nova_compute[183075]: 2026-01-22 17:02:05.009 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:02:05 compute-0 nova_compute[183075]: 2026-01-22 17:02:05.011 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5811MB free_disk=73.38229370117188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:02:05 compute-0 nova_compute[183075]: 2026-01-22 17:02:05.011 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:02:05 compute-0 nova_compute[183075]: 2026-01-22 17:02:05.012 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:02:05 compute-0 nova_compute[183075]: 2026-01-22 17:02:05.084 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:02:05 compute-0 nova_compute[183075]: 2026-01-22 17:02:05.085 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:02:05 compute-0 nova_compute[183075]: 2026-01-22 17:02:05.112 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:02:05 compute-0 nova_compute[183075]: 2026-01-22 17:02:05.125 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:02:05 compute-0 nova_compute[183075]: 2026-01-22 17:02:05.128 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:02:05 compute-0 nova_compute[183075]: 2026-01-22 17:02:05.128 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:02:06 compute-0 nova_compute[183075]: 2026-01-22 17:02:06.826 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:02:08 compute-0 nova_compute[183075]: 2026-01-22 17:02:08.827 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:02:09 compute-0 podman[213859]: 2026-01-22 17:02:09.370754435 +0000 UTC m=+0.068958583 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 17:02:09 compute-0 podman[213858]: 2026-01-22 17:02:09.448811446 +0000 UTC m=+0.144028095 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 22 17:02:11 compute-0 podman[213900]: 2026-01-22 17:02:11.361371225 +0000 UTC m=+0.075600587 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter)
Jan 22 17:02:11 compute-0 nova_compute[183075]: 2026-01-22 17:02:11.828 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:02:13 compute-0 nova_compute[183075]: 2026-01-22 17:02:13.828 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:02:16 compute-0 podman[213922]: 2026-01-22 17:02:16.38635857 +0000 UTC m=+0.088442415 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:02:16 compute-0 nova_compute[183075]: 2026-01-22 17:02:16.830 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:02:18 compute-0 nova_compute[183075]: 2026-01-22 17:02:18.830 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:02:21 compute-0 nova_compute[183075]: 2026-01-22 17:02:21.832 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:02:22 compute-0 podman[213943]: 2026-01-22 17:02:22.375792602 +0000 UTC m=+0.069769754 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 17:02:23 compute-0 nova_compute[183075]: 2026-01-22 17:02:23.832 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:02:26 compute-0 nova_compute[183075]: 2026-01-22 17:02:26.834 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:02:28 compute-0 nova_compute[183075]: 2026-01-22 17:02:28.834 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:02:31 compute-0 nova_compute[183075]: 2026-01-22 17:02:31.837 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:02:33 compute-0 nova_compute[183075]: 2026-01-22 17:02:33.861 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:02:34 compute-0 podman[213968]: 2026-01-22 17:02:34.395002868 +0000 UTC m=+0.092565763 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 17:02:36 compute-0 nova_compute[183075]: 2026-01-22 17:02:36.838 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:02:38 compute-0 nova_compute[183075]: 2026-01-22 17:02:38.863 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:02:40 compute-0 podman[213994]: 2026-01-22 17:02:40.393531168 +0000 UTC m=+0.097928694 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Jan 22 17:02:40 compute-0 podman[213993]: 2026-01-22 17:02:40.451105431 +0000 UTC m=+0.152424676 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 22 17:02:41 compute-0 nova_compute[183075]: 2026-01-22 17:02:41.840 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:02:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:02:41.917 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:02:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:02:41.918 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:02:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:02:41.919 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:02:42 compute-0 podman[214038]: 2026-01-22 17:02:42.361083744 +0000 UTC m=+0.071484259 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_id=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, container_name=openstack_network_exporter, version=9.6, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 22 17:02:43 compute-0 nova_compute[183075]: 2026-01-22 17:02:43.864 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:02:46 compute-0 nova_compute[183075]: 2026-01-22 17:02:46.842 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:02:47 compute-0 podman[214059]: 2026-01-22 17:02:47.383043857 +0000 UTC m=+0.087559012 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:02:48 compute-0 nova_compute[183075]: 2026-01-22 17:02:48.898 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:02:51 compute-0 nova_compute[183075]: 2026-01-22 17:02:51.844 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:02:53 compute-0 podman[214081]: 2026-01-22 17:02:53.384698091 +0000 UTC m=+0.095127410 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:02:53 compute-0 nova_compute[183075]: 2026-01-22 17:02:53.901 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:02:56 compute-0 nova_compute[183075]: 2026-01-22 17:02:56.845 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:02:57 compute-0 nova_compute[183075]: 2026-01-22 17:02:57.131 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:02:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:02:58.054 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:02:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:02:58.056 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:02:58 compute-0 nova_compute[183075]: 2026-01-22 17:02:58.055 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:02:58 compute-0 nova_compute[183075]: 2026-01-22 17:02:58.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:02:58 compute-0 nova_compute[183075]: 2026-01-22 17:02:58.950 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:02:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:02:59.059 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:03:00 compute-0 nova_compute[183075]: 2026-01-22 17:03:00.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:03:00 compute-0 nova_compute[183075]: 2026-01-22 17:03:00.789 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:03:00 compute-0 nova_compute[183075]: 2026-01-22 17:03:00.789 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:03:00 compute-0 nova_compute[183075]: 2026-01-22 17:03:00.803 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 17:03:00 compute-0 nova_compute[183075]: 2026-01-22 17:03:00.804 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:03:01 compute-0 nova_compute[183075]: 2026-01-22 17:03:01.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:03:01 compute-0 nova_compute[183075]: 2026-01-22 17:03:01.789 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:03:01 compute-0 nova_compute[183075]: 2026-01-22 17:03:01.846 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:03:03 compute-0 nova_compute[183075]: 2026-01-22 17:03:03.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:03:03 compute-0 nova_compute[183075]: 2026-01-22 17:03:03.951 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:03:04 compute-0 nova_compute[183075]: 2026-01-22 17:03:04.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:03:05 compute-0 podman[214106]: 2026-01-22 17:03:05.368835964 +0000 UTC m=+0.067755391 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 17:03:05 compute-0 nova_compute[183075]: 2026-01-22 17:03:05.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:03:06 compute-0 nova_compute[183075]: 2026-01-22 17:03:06.784 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:03:06 compute-0 nova_compute[183075]: 2026-01-22 17:03:06.801 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:03:06 compute-0 nova_compute[183075]: 2026-01-22 17:03:06.838 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:03:06 compute-0 nova_compute[183075]: 2026-01-22 17:03:06.839 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:03:06 compute-0 nova_compute[183075]: 2026-01-22 17:03:06.840 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:03:06 compute-0 nova_compute[183075]: 2026-01-22 17:03:06.840 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:03:06 compute-0 nova_compute[183075]: 2026-01-22 17:03:06.848 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:03:07 compute-0 nova_compute[183075]: 2026-01-22 17:03:07.108 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:03:07 compute-0 nova_compute[183075]: 2026-01-22 17:03:07.110 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5818MB free_disk=73.38227462768555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:03:07 compute-0 nova_compute[183075]: 2026-01-22 17:03:07.110 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:03:07 compute-0 nova_compute[183075]: 2026-01-22 17:03:07.111 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:03:07 compute-0 nova_compute[183075]: 2026-01-22 17:03:07.194 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:03:07 compute-0 nova_compute[183075]: 2026-01-22 17:03:07.195 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:03:07 compute-0 nova_compute[183075]: 2026-01-22 17:03:07.223 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:03:07 compute-0 nova_compute[183075]: 2026-01-22 17:03:07.245 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:03:07 compute-0 nova_compute[183075]: 2026-01-22 17:03:07.247 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:03:07 compute-0 nova_compute[183075]: 2026-01-22 17:03:07.247 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:03:08 compute-0 nova_compute[183075]: 2026-01-22 17:03:08.953 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:03:11 compute-0 podman[214132]: 2026-01-22 17:03:11.398506804 +0000 UTC m=+0.085449096 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 22 17:03:11 compute-0 podman[214131]: 2026-01-22 17:03:11.406619447 +0000 UTC m=+0.112684171 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, managed_by=edpm_ansible)
Jan 22 17:03:11 compute-0 nova_compute[183075]: 2026-01-22 17:03:11.851 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:03:13 compute-0 podman[214174]: 2026-01-22 17:03:13.361135299 +0000 UTC m=+0.074161229 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.buildah.version=1.33.7, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public)
Jan 22 17:03:14 compute-0 nova_compute[183075]: 2026-01-22 17:03:14.006 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:03:16 compute-0 nova_compute[183075]: 2026-01-22 17:03:16.853 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:03:18 compute-0 podman[214195]: 2026-01-22 17:03:18.381706607 +0000 UTC m=+0.084625415 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:03:19 compute-0 nova_compute[183075]: 2026-01-22 17:03:19.009 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:03:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:03:20.993 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:58:b4 192.168.1.2 192.168.2.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.1.2/24 192.168.2.2/24', 'neutron:device_id': 'ovnmeta-55a56a61-c5b6-43b4-b5cb-050d8e3fbc13', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55a56a61-c5b6-43b4-b5cb-050d8e3fbc13', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f756822dc47541368e69c87d9789450e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6089ca6e-61bb-4907-8977-7430da889ae7, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=58290ae7-fd61-4e49-b963-429eb378eb62) old=Port_Binding(mac=['fa:16:3e:15:58:b4 192.168.1.2'], external_ids={'neutron:cidrs': '192.168.1.2/24', 'neutron:device_id': 'ovnmeta-55a56a61-c5b6-43b4-b5cb-050d8e3fbc13', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55a56a61-c5b6-43b4-b5cb-050d8e3fbc13', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f756822dc47541368e69c87d9789450e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:03:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:03:20.995 104629 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 58290ae7-fd61-4e49-b963-429eb378eb62 in datapath 55a56a61-c5b6-43b4-b5cb-050d8e3fbc13 updated
Jan 22 17:03:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:03:20.996 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 55a56a61-c5b6-43b4-b5cb-050d8e3fbc13, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:03:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:03:20.997 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[966e2159-686d-4231-bfe7-65f46ae1b35f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:03:21 compute-0 nova_compute[183075]: 2026-01-22 17:03:21.855 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:03:24 compute-0 nova_compute[183075]: 2026-01-22 17:03:24.011 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:03:24 compute-0 podman[214214]: 2026-01-22 17:03:24.380058673 +0000 UTC m=+0.080797563 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:03:26 compute-0 nova_compute[183075]: 2026-01-22 17:03:26.856 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:03:29 compute-0 nova_compute[183075]: 2026-01-22 17:03:29.012 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:03:31 compute-0 nova_compute[183075]: 2026-01-22 17:03:31.858 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:03:34 compute-0 nova_compute[183075]: 2026-01-22 17:03:34.014 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:03:36 compute-0 podman[214239]: 2026-01-22 17:03:36.372656889 +0000 UTC m=+0.070283958 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 17:03:36 compute-0 nova_compute[183075]: 2026-01-22 17:03:36.859 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:03:39 compute-0 nova_compute[183075]: 2026-01-22 17:03:39.016 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:03:41 compute-0 nova_compute[183075]: 2026-01-22 17:03:41.862 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:03:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:03:41.917 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:03:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:03:41.917 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:03:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:03:41.917 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:03:41 compute-0 podman[214264]: 2026-01-22 17:03:41.969478395 +0000 UTC m=+0.066596721 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 22 17:03:42 compute-0 podman[214263]: 2026-01-22 17:03:42.051233733 +0000 UTC m=+0.151860741 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, tcib_managed=true, 
io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 22 17:03:44 compute-0 nova_compute[183075]: 2026-01-22 17:03:44.017 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:03:44 compute-0 podman[214308]: 2026-01-22 17:03:44.342594334 +0000 UTC m=+0.054670777 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_id=openstack_network_exporter, architecture=x86_64, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, version=9.6)
Jan 22 17:03:46 compute-0 nova_compute[183075]: 2026-01-22 17:03:46.864 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:03:49 compute-0 nova_compute[183075]: 2026-01-22 17:03:49.018 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:03:49 compute-0 podman[214330]: 2026-01-22 17:03:49.359226528 +0000 UTC m=+0.075015142 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:03:51 compute-0 nova_compute[183075]: 2026-01-22 17:03:51.866 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:03:54 compute-0 nova_compute[183075]: 2026-01-22 17:03:54.021 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:03:55 compute-0 podman[214350]: 2026-01-22 17:03:55.379532916 +0000 UTC m=+0.086693637 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:03:55.447 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:03:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:03:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:03:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:03:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:03:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:03:55.449 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:03:55.449 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:03:55.449 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:03:55.449 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:03:55.449 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:03:55.449 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:03:55.450 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:03:55.450 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:03:55.450 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:03:55.450 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:03:55.450 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:03:55.450 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:03:55.451 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:03:55.451 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:03:55.451 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:03:55.451 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:03:55.451 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:03:55.452 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:03:55.452 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:03:56 compute-0 nova_compute[183075]: 2026-01-22 17:03:56.867 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:03:57 compute-0 nova_compute[183075]: 2026-01-22 17:03:57.235 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:03:57 compute-0 nova_compute[183075]: 2026-01-22 17:03:57.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:03:57 compute-0 nova_compute[183075]: 2026-01-22 17:03:57.787 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 17:03:59 compute-0 nova_compute[183075]: 2026-01-22 17:03:59.023 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:03:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:03:59.342 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:03:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:03:59.343 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:03:59 compute-0 nova_compute[183075]: 2026-01-22 17:03:59.344 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:03:59 compute-0 nova_compute[183075]: 2026-01-22 17:03:59.804 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:04:00 compute-0 nova_compute[183075]: 2026-01-22 17:04:00.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:04:00 compute-0 nova_compute[183075]: 2026-01-22 17:04:00.789 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:04:00 compute-0 nova_compute[183075]: 2026-01-22 17:04:00.789 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:04:00 compute-0 nova_compute[183075]: 2026-01-22 17:04:00.807 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 17:04:01 compute-0 nova_compute[183075]: 2026-01-22 17:04:01.868 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:04:02 compute-0 nova_compute[183075]: 2026-01-22 17:04:02.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:04:02 compute-0 nova_compute[183075]: 2026-01-22 17:04:02.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:04:02 compute-0 nova_compute[183075]: 2026-01-22 17:04:02.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:04:04 compute-0 nova_compute[183075]: 2026-01-22 17:04:04.026 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:04:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:04:04.345 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:04:04 compute-0 nova_compute[183075]: 2026-01-22 17:04:04.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:04:04 compute-0 nova_compute[183075]: 2026-01-22 17:04:04.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:04:04 compute-0 nova_compute[183075]: 2026-01-22 17:04:04.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 17:04:04 compute-0 nova_compute[183075]: 2026-01-22 17:04:04.819 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 17:04:05 compute-0 nova_compute[183075]: 2026-01-22 17:04:05.814 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:04:06 compute-0 nova_compute[183075]: 2026-01-22 17:04:06.871 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:04:07 compute-0 podman[214376]: 2026-01-22 17:04:07.391049341 +0000 UTC m=+0.087198290 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 17:04:07 compute-0 nova_compute[183075]: 2026-01-22 17:04:07.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:04:07 compute-0 nova_compute[183075]: 2026-01-22 17:04:07.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:04:07 compute-0 nova_compute[183075]: 2026-01-22 17:04:07.837 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:04:07 compute-0 nova_compute[183075]: 2026-01-22 17:04:07.838 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:04:07 compute-0 nova_compute[183075]: 2026-01-22 17:04:07.838 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:04:07 compute-0 nova_compute[183075]: 2026-01-22 17:04:07.838 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:04:08 compute-0 nova_compute[183075]: 2026-01-22 17:04:08.042 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:04:08 compute-0 nova_compute[183075]: 2026-01-22 17:04:08.044 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5821MB free_disk=73.38229370117188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:04:08 compute-0 nova_compute[183075]: 2026-01-22 17:04:08.044 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:04:08 compute-0 nova_compute[183075]: 2026-01-22 17:04:08.044 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:04:08 compute-0 nova_compute[183075]: 2026-01-22 17:04:08.398 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:04:08 compute-0 nova_compute[183075]: 2026-01-22 17:04:08.399 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:04:08 compute-0 nova_compute[183075]: 2026-01-22 17:04:08.546 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:04:08 compute-0 nova_compute[183075]: 2026-01-22 17:04:08.657 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:04:08 compute-0 nova_compute[183075]: 2026-01-22 17:04:08.659 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:04:08 compute-0 nova_compute[183075]: 2026-01-22 17:04:08.659 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.615s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:04:09 compute-0 nova_compute[183075]: 2026-01-22 17:04:09.028 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:04:11 compute-0 nova_compute[183075]: 2026-01-22 17:04:11.873 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:04:12 compute-0 podman[214401]: 2026-01-22 17:04:12.377063148 +0000 UTC m=+0.076898234 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 17:04:12 compute-0 podman[214400]: 2026-01-22 17:04:12.443442226 +0000 UTC m=+0.147373768 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:04:14 compute-0 nova_compute[183075]: 2026-01-22 17:04:14.030 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:04:15 compute-0 podman[214443]: 2026-01-22 17:04:15.377905316 +0000 UTC m=+0.075421075 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., release=1755695350, distribution-scope=public, io.openshift.expose-services=, config_id=openstack_network_exporter, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 22 17:04:16 compute-0 nova_compute[183075]: 2026-01-22 17:04:16.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:04:16 compute-0 nova_compute[183075]: 2026-01-22 17:04:16.876 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:04:19 compute-0 nova_compute[183075]: 2026-01-22 17:04:19.032 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:04:20 compute-0 podman[214464]: 2026-01-22 17:04:20.3727029 +0000 UTC m=+0.075190499 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 17:04:21 compute-0 nova_compute[183075]: 2026-01-22 17:04:21.878 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:04:24 compute-0 nova_compute[183075]: 2026-01-22 17:04:24.034 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:04:26 compute-0 podman[214484]: 2026-01-22 17:04:26.353472677 +0000 UTC m=+0.063438635 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 17:04:26 compute-0 nova_compute[183075]: 2026-01-22 17:04:26.879 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:04:29 compute-0 nova_compute[183075]: 2026-01-22 17:04:29.037 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:04:31 compute-0 nova_compute[183075]: 2026-01-22 17:04:31.882 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:04:34 compute-0 nova_compute[183075]: 2026-01-22 17:04:34.039 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:04:36 compute-0 nova_compute[183075]: 2026-01-22 17:04:36.884 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:04:38 compute-0 podman[214508]: 2026-01-22 17:04:38.362450867 +0000 UTC m=+0.069089779 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 17:04:39 compute-0 nova_compute[183075]: 2026-01-22 17:04:39.040 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:04:41 compute-0 nova_compute[183075]: 2026-01-22 17:04:41.886 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:04:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:04:41.917 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:04:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:04:41.918 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:04:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:04:41.918 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:04:43 compute-0 podman[214533]: 2026-01-22 17:04:43.35258286 +0000 UTC m=+0.056128965 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 22 17:04:43 compute-0 podman[214532]: 2026-01-22 17:04:43.377469094 +0000 UTC m=+0.088600156 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 17:04:44 compute-0 nova_compute[183075]: 2026-01-22 17:04:44.042 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:04:46 compute-0 podman[214578]: 2026-01-22 17:04:46.336308725 +0000 UTC m=+0.052385989 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 22 17:04:46 compute-0 nova_compute[183075]: 2026-01-22 17:04:46.887 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:04:49 compute-0 nova_compute[183075]: 2026-01-22 17:04:49.044 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:04:51 compute-0 podman[214600]: 2026-01-22 17:04:51.373496886 +0000 UTC m=+0.077244822 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 17:04:51 compute-0 nova_compute[183075]: 2026-01-22 17:04:51.888 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:04:54 compute-0 nova_compute[183075]: 2026-01-22 17:04:54.046 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:04:56 compute-0 nova_compute[183075]: 2026-01-22 17:04:56.812 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:04:56 compute-0 nova_compute[183075]: 2026-01-22 17:04:56.890 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:04:57 compute-0 podman[214620]: 2026-01-22 17:04:57.383237662 +0000 UTC m=+0.094653253 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 17:04:59 compute-0 nova_compute[183075]: 2026-01-22 17:04:59.048 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:05:00 compute-0 nova_compute[183075]: 2026-01-22 17:05:00.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:05:00 compute-0 nova_compute[183075]: 2026-01-22 17:05:00.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:05:00 compute-0 nova_compute[183075]: 2026-01-22 17:05:00.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:05:01 compute-0 nova_compute[183075]: 2026-01-22 17:05:01.892 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:05:04 compute-0 nova_compute[183075]: 2026-01-22 17:05:04.051 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:05:06 compute-0 nova_compute[183075]: 2026-01-22 17:05:06.895 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:05:09 compute-0 nova_compute[183075]: 2026-01-22 17:05:09.054 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:05:09 compute-0 podman[214644]: 2026-01-22 17:05:09.374935255 +0000 UTC m=+0.081838941 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:05:11 compute-0 nova_compute[183075]: 2026-01-22 17:05:11.897 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:05:14 compute-0 nova_compute[183075]: 2026-01-22 17:05:14.058 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:05:14 compute-0 podman[214669]: 2026-01-22 17:05:14.370128229 +0000 UTC m=+0.073952657 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:05:14 compute-0 podman[214668]: 2026-01-22 17:05:14.410063343 +0000 UTC m=+0.114046225 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, 
io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 22 17:05:16 compute-0 nova_compute[183075]: 2026-01-22 17:05:16.899 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:05:17 compute-0 podman[214713]: 2026-01-22 17:05:17.376230254 +0000 UTC m=+0.076885663 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, release=1755695350, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 22 17:05:19 compute-0 nova_compute[183075]: 2026-01-22 17:05:19.095 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:05:21 compute-0 nova_compute[183075]: 2026-01-22 17:05:21.900 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:05:22 compute-0 podman[214733]: 2026-01-22 17:05:22.367733903 +0000 UTC m=+0.075556019 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 17:05:24 compute-0 nova_compute[183075]: 2026-01-22 17:05:24.097 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:05:26 compute-0 sshd-session[214753]: Received disconnect from 45.148.10.152 port 36856:11:  [preauth]
Jan 22 17:05:26 compute-0 sshd-session[214753]: Disconnected from authenticating user root 45.148.10.152 port 36856 [preauth]
Jan 22 17:05:26 compute-0 nova_compute[183075]: 2026-01-22 17:05:26.902 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:05:28 compute-0 podman[214755]: 2026-01-22 17:05:28.406812188 +0000 UTC m=+0.102072775 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 17:05:29 compute-0 nova_compute[183075]: 2026-01-22 17:05:29.099 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:05:31 compute-0 nova_compute[183075]: 2026-01-22 17:05:31.903 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:05:34 compute-0 nova_compute[183075]: 2026-01-22 17:05:34.099 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:05:36 compute-0 nova_compute[183075]: 2026-01-22 17:05:36.906 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:05:39 compute-0 nova_compute[183075]: 2026-01-22 17:05:39.103 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:05:40 compute-0 podman[214780]: 2026-01-22 17:05:40.389138139 +0000 UTC m=+0.088663548 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:05:41 compute-0 nova_compute[183075]: 2026-01-22 17:05:41.907 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:05:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:05:41.918 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:05:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:05:41.918 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:05:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:05:41.919 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:05:44 compute-0 nova_compute[183075]: 2026-01-22 17:05:44.107 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:05:45 compute-0 podman[214805]: 2026-01-22 17:05:45.381113589 +0000 UTC m=+0.076005880 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 17:05:45 compute-0 podman[214804]: 2026-01-22 17:05:45.460004363 +0000 UTC m=+0.160127329 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 22 17:05:46 compute-0 nova_compute[183075]: 2026-01-22 17:05:46.908 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:05:48 compute-0 podman[214848]: 2026-01-22 17:05:48.381686221 +0000 UTC m=+0.083013741 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, build-date=2025-08-20T13:12:41, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc.)
Jan 22 17:05:49 compute-0 nova_compute[183075]: 2026-01-22 17:05:49.109 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:05:51 compute-0 nova_compute[183075]: 2026-01-22 17:05:51.909 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:05:53 compute-0 podman[214870]: 2026-01-22 17:05:53.405679189 +0000 UTC m=+0.097923668 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 17:05:54 compute-0 nova_compute[183075]: 2026-01-22 17:05:54.112 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:05:55.448 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:05:55.449 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:05:55.449 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:05:55.449 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:05:55.450 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:05:55.450 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:05:55.450 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:05:55.450 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:05:55.450 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:05:55.450 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:05:55.450 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:05:55.451 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:05:55.451 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:05:55.451 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:05:55.451 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:05:55.451 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:05:55.451 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:05:55.451 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:05:55.452 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:05:55.452 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:05:55.452 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:05:55.452 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:05:55.452 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:05:55.452 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:05:55.453 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:05:56 compute-0 nova_compute[183075]: 2026-01-22 17:05:56.911 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:05:59 compute-0 nova_compute[183075]: 2026-01-22 17:05:59.113 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:05:59 compute-0 podman[214891]: 2026-01-22 17:05:59.388296594 +0000 UTC m=+0.089687285 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:06:01 compute-0 nova_compute[183075]: 2026-01-22 17:06:01.914 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:04 compute-0 nova_compute[183075]: 2026-01-22 17:06:04.116 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:06 compute-0 nova_compute[183075]: 2026-01-22 17:06:06.915 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:09 compute-0 nova_compute[183075]: 2026-01-22 17:06:09.117 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:11 compute-0 podman[214914]: 2026-01-22 17:06:11.374062136 +0000 UTC m=+0.073275817 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:06:11 compute-0 nova_compute[183075]: 2026-01-22 17:06:11.916 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:14 compute-0 nova_compute[183075]: 2026-01-22 17:06:14.130 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:16 compute-0 podman[214939]: 2026-01-22 17:06:16.390561806 +0000 UTC m=+0.073755170 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 22 17:06:16 compute-0 podman[214938]: 2026-01-22 17:06:16.417448079 +0000 UTC m=+0.119908377 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 17:06:16 compute-0 nova_compute[183075]: 2026-01-22 17:06:16.918 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:19 compute-0 nova_compute[183075]: 2026-01-22 17:06:19.169 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:19 compute-0 podman[214989]: 2026-01-22 17:06:19.349536231 +0000 UTC m=+0.056992241 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image 
that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, architecture=x86_64, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, release=1755695350)
Jan 22 17:06:21 compute-0 nova_compute[183075]: 2026-01-22 17:06:21.921 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:24 compute-0 nova_compute[183075]: 2026-01-22 17:06:24.172 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:24 compute-0 podman[215010]: 2026-01-22 17:06:24.413978832 +0000 UTC m=+0.118661224 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:06:26 compute-0 nova_compute[183075]: 2026-01-22 17:06:26.923 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:29 compute-0 nova_compute[183075]: 2026-01-22 17:06:29.175 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:30 compute-0 podman[215030]: 2026-01-22 17:06:30.407755506 +0000 UTC m=+0.105693625 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:06:31 compute-0 nova_compute[183075]: 2026-01-22 17:06:31.925 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:32 compute-0 nova_compute[183075]: 2026-01-22 17:06:32.572 183079 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 82.72 sec
Jan 22 17:06:32 compute-0 nova_compute[183075]: 2026-01-22 17:06:32.610 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 17:06:32 compute-0 nova_compute[183075]: 2026-01-22 17:06:32.611 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:06:32 compute-0 nova_compute[183075]: 2026-01-22 17:06:32.611 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:06:32 compute-0 nova_compute[183075]: 2026-01-22 17:06:32.612 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:06:32 compute-0 nova_compute[183075]: 2026-01-22 17:06:32.612 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:06:32 compute-0 nova_compute[183075]: 2026-01-22 17:06:32.612 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:06:32 compute-0 nova_compute[183075]: 2026-01-22 17:06:32.612 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:06:32 compute-0 nova_compute[183075]: 2026-01-22 17:06:32.612 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:06:32 compute-0 nova_compute[183075]: 2026-01-22 17:06:32.612 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:06:32 compute-0 nova_compute[183075]: 2026-01-22 17:06:32.669 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:06:32 compute-0 nova_compute[183075]: 2026-01-22 17:06:32.671 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:06:32 compute-0 nova_compute[183075]: 2026-01-22 17:06:32.671 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:06:32 compute-0 nova_compute[183075]: 2026-01-22 17:06:32.672 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:06:32 compute-0 nova_compute[183075]: 2026-01-22 17:06:32.829 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:06:32.829 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:06:32.832 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:06:32 compute-0 nova_compute[183075]: 2026-01-22 17:06:32.891 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:06:32 compute-0 nova_compute[183075]: 2026-01-22 17:06:32.892 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5812MB free_disk=73.3820915222168GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:06:32 compute-0 nova_compute[183075]: 2026-01-22 17:06:32.892 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:06:32 compute-0 nova_compute[183075]: 2026-01-22 17:06:32.893 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:06:32 compute-0 nova_compute[183075]: 2026-01-22 17:06:32.974 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:06:32 compute-0 nova_compute[183075]: 2026-01-22 17:06:32.974 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:06:32 compute-0 nova_compute[183075]: 2026-01-22 17:06:32.995 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Refreshing inventories for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 17:06:33 compute-0 nova_compute[183075]: 2026-01-22 17:06:33.211 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Updating ProviderTree inventory for provider 2513134c-f67c-4237-84bf-4ebe2450d610 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 17:06:33 compute-0 nova_compute[183075]: 2026-01-22 17:06:33.212 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Updating inventory in ProviderTree for provider 2513134c-f67c-4237-84bf-4ebe2450d610 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 17:06:33 compute-0 nova_compute[183075]: 2026-01-22 17:06:33.236 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Refreshing aggregate associations for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 17:06:33 compute-0 nova_compute[183075]: 2026-01-22 17:06:33.269 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Refreshing trait associations for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610, traits: COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SVM,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE42,COMPUTE_NODE,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE4A,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_FMA3,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_AUTO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 17:06:33 compute-0 nova_compute[183075]: 2026-01-22 17:06:33.301 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:06:33 compute-0 nova_compute[183075]: 2026-01-22 17:06:33.327 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:06:33 compute-0 nova_compute[183075]: 2026-01-22 17:06:33.329 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:06:33 compute-0 nova_compute[183075]: 2026-01-22 17:06:33.330 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.437s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:06:34 compute-0 nova_compute[183075]: 2026-01-22 17:06:34.176 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:36 compute-0 nova_compute[183075]: 2026-01-22 17:06:36.927 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:06:37.835 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:06:39 compute-0 nova_compute[183075]: 2026-01-22 17:06:39.234 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:40 compute-0 nova_compute[183075]: 2026-01-22 17:06:40.327 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:06:40 compute-0 nova_compute[183075]: 2026-01-22 17:06:40.328 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:06:40 compute-0 nova_compute[183075]: 2026-01-22 17:06:40.350 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:06:40 compute-0 nova_compute[183075]: 2026-01-22 17:06:40.350 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:06:40 compute-0 nova_compute[183075]: 2026-01-22 17:06:40.351 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:06:40 compute-0 nova_compute[183075]: 2026-01-22 17:06:40.367 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 17:06:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:06:41.919 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:06:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:06:41.920 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:06:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:06:41.920 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:06:41 compute-0 nova_compute[183075]: 2026-01-22 17:06:41.928 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:42 compute-0 podman[215057]: 2026-01-22 17:06:42.061225118 +0000 UTC m=+0.101322020 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 17:06:44 compute-0 nova_compute[183075]: 2026-01-22 17:06:44.384 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:44 compute-0 nova_compute[183075]: 2026-01-22 17:06:44.611 183079 DEBUG oslo_concurrency.lockutils [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Acquiring lock "bd249764-12e4-4e25-9445-dd6e132ca53c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:06:44 compute-0 nova_compute[183075]: 2026-01-22 17:06:44.611 183079 DEBUG oslo_concurrency.lockutils [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "bd249764-12e4-4e25-9445-dd6e132ca53c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:06:44 compute-0 nova_compute[183075]: 2026-01-22 17:06:44.628 183079 DEBUG nova.compute.manager [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:06:44 compute-0 nova_compute[183075]: 2026-01-22 17:06:44.714 183079 DEBUG oslo_concurrency.lockutils [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:06:44 compute-0 nova_compute[183075]: 2026-01-22 17:06:44.715 183079 DEBUG oslo_concurrency.lockutils [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:06:44 compute-0 nova_compute[183075]: 2026-01-22 17:06:44.723 183079 DEBUG nova.virt.hardware [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:06:44 compute-0 nova_compute[183075]: 2026-01-22 17:06:44.723 183079 INFO nova.compute.claims [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:06:44 compute-0 nova_compute[183075]: 2026-01-22 17:06:44.845 183079 DEBUG nova.compute.provider_tree [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:06:44 compute-0 nova_compute[183075]: 2026-01-22 17:06:44.860 183079 DEBUG nova.scheduler.client.report [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:06:44 compute-0 nova_compute[183075]: 2026-01-22 17:06:44.879 183079 DEBUG oslo_concurrency.lockutils [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:06:44 compute-0 nova_compute[183075]: 2026-01-22 17:06:44.880 183079 DEBUG nova.compute.manager [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:06:44 compute-0 nova_compute[183075]: 2026-01-22 17:06:44.923 183079 DEBUG nova.compute.manager [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:06:44 compute-0 nova_compute[183075]: 2026-01-22 17:06:44.923 183079 DEBUG nova.network.neutron [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:06:44 compute-0 nova_compute[183075]: 2026-01-22 17:06:44.947 183079 INFO nova.virt.libvirt.driver [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:06:44 compute-0 nova_compute[183075]: 2026-01-22 17:06:44.977 183079 DEBUG nova.compute.manager [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:06:45 compute-0 nova_compute[183075]: 2026-01-22 17:06:45.074 183079 DEBUG nova.compute.manager [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:06:45 compute-0 nova_compute[183075]: 2026-01-22 17:06:45.075 183079 DEBUG nova.virt.libvirt.driver [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:06:45 compute-0 nova_compute[183075]: 2026-01-22 17:06:45.076 183079 INFO nova.virt.libvirt.driver [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Creating image(s)
Jan 22 17:06:45 compute-0 nova_compute[183075]: 2026-01-22 17:06:45.076 183079 DEBUG oslo_concurrency.lockutils [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Acquiring lock "/var/lib/nova/instances/bd249764-12e4-4e25-9445-dd6e132ca53c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:06:45 compute-0 nova_compute[183075]: 2026-01-22 17:06:45.077 183079 DEBUG oslo_concurrency.lockutils [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "/var/lib/nova/instances/bd249764-12e4-4e25-9445-dd6e132ca53c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:06:45 compute-0 nova_compute[183075]: 2026-01-22 17:06:45.078 183079 DEBUG oslo_concurrency.lockutils [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "/var/lib/nova/instances/bd249764-12e4-4e25-9445-dd6e132ca53c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:06:45 compute-0 nova_compute[183075]: 2026-01-22 17:06:45.095 183079 DEBUG oslo_concurrency.processutils [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:06:45 compute-0 nova_compute[183075]: 2026-01-22 17:06:45.185 183079 DEBUG oslo_concurrency.processutils [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:06:45 compute-0 nova_compute[183075]: 2026-01-22 17:06:45.186 183079 DEBUG oslo_concurrency.lockutils [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:06:45 compute-0 nova_compute[183075]: 2026-01-22 17:06:45.187 183079 DEBUG oslo_concurrency.lockutils [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:06:45 compute-0 nova_compute[183075]: 2026-01-22 17:06:45.200 183079 DEBUG oslo_concurrency.processutils [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:06:45 compute-0 nova_compute[183075]: 2026-01-22 17:06:45.253 183079 DEBUG oslo_concurrency.processutils [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:06:45 compute-0 nova_compute[183075]: 2026-01-22 17:06:45.254 183079 DEBUG oslo_concurrency.processutils [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/bd249764-12e4-4e25-9445-dd6e132ca53c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:06:45 compute-0 nova_compute[183075]: 2026-01-22 17:06:45.569 183079 DEBUG oslo_concurrency.processutils [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/bd249764-12e4-4e25-9445-dd6e132ca53c/disk 1073741824" returned: 0 in 0.315s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:06:45 compute-0 nova_compute[183075]: 2026-01-22 17:06:45.571 183079 DEBUG oslo_concurrency.lockutils [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.384s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:06:45 compute-0 nova_compute[183075]: 2026-01-22 17:06:45.572 183079 DEBUG oslo_concurrency.processutils [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:06:45 compute-0 nova_compute[183075]: 2026-01-22 17:06:45.650 183079 DEBUG oslo_concurrency.processutils [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:06:45 compute-0 nova_compute[183075]: 2026-01-22 17:06:45.651 183079 DEBUG nova.virt.disk.api [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Checking if we can resize image /var/lib/nova/instances/bd249764-12e4-4e25-9445-dd6e132ca53c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:06:45 compute-0 nova_compute[183075]: 2026-01-22 17:06:45.652 183079 DEBUG oslo_concurrency.processutils [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bd249764-12e4-4e25-9445-dd6e132ca53c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:06:45 compute-0 nova_compute[183075]: 2026-01-22 17:06:45.722 183079 DEBUG oslo_concurrency.processutils [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bd249764-12e4-4e25-9445-dd6e132ca53c/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:06:45 compute-0 nova_compute[183075]: 2026-01-22 17:06:45.724 183079 DEBUG nova.virt.disk.api [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Cannot resize image /var/lib/nova/instances/bd249764-12e4-4e25-9445-dd6e132ca53c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:06:45 compute-0 nova_compute[183075]: 2026-01-22 17:06:45.724 183079 DEBUG nova.objects.instance [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lazy-loading 'migration_context' on Instance uuid bd249764-12e4-4e25-9445-dd6e132ca53c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:06:45 compute-0 nova_compute[183075]: 2026-01-22 17:06:45.907 183079 DEBUG nova.virt.libvirt.driver [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:06:45 compute-0 nova_compute[183075]: 2026-01-22 17:06:45.908 183079 DEBUG nova.virt.libvirt.driver [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Ensure instance console log exists: /var/lib/nova/instances/bd249764-12e4-4e25-9445-dd6e132ca53c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:06:45 compute-0 nova_compute[183075]: 2026-01-22 17:06:45.908 183079 DEBUG oslo_concurrency.lockutils [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:06:45 compute-0 nova_compute[183075]: 2026-01-22 17:06:45.908 183079 DEBUG oslo_concurrency.lockutils [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:06:45 compute-0 nova_compute[183075]: 2026-01-22 17:06:45.909 183079 DEBUG oslo_concurrency.lockutils [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:06:46 compute-0 nova_compute[183075]: 2026-01-22 17:06:46.178 183079 DEBUG nova.policy [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c9cdd80799a74efb8ce82cfb5148ac89', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cc642b97aa4e4886902a0d1233877b88', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:06:46 compute-0 nova_compute[183075]: 2026-01-22 17:06:46.930 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:47 compute-0 podman[215097]: 2026-01-22 17:06:47.349862802 +0000 UTC m=+0.053515730 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:06:47 compute-0 podman[215096]: 2026-01-22 17:06:47.379376824 +0000 UTC m=+0.085183619 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 22 17:06:47 compute-0 nova_compute[183075]: 2026-01-22 17:06:47.675 183079 DEBUG nova.network.neutron [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Successfully updated port: 6b397961-0eb5-4ccd-8c0a-f433961cd08a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:06:47 compute-0 nova_compute[183075]: 2026-01-22 17:06:47.695 183079 DEBUG oslo_concurrency.lockutils [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Acquiring lock "refresh_cache-bd249764-12e4-4e25-9445-dd6e132ca53c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:06:47 compute-0 nova_compute[183075]: 2026-01-22 17:06:47.695 183079 DEBUG oslo_concurrency.lockutils [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Acquired lock "refresh_cache-bd249764-12e4-4e25-9445-dd6e132ca53c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:06:47 compute-0 nova_compute[183075]: 2026-01-22 17:06:47.696 183079 DEBUG nova.network.neutron [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:06:47 compute-0 nova_compute[183075]: 2026-01-22 17:06:47.949 183079 DEBUG nova.compute.manager [req-253ad921-7dbe-422a-822b-6aaa063251a1 req-79fbf128-c2f2-4eba-9c0e-dc48a5dca68c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Received event network-changed-6b397961-0eb5-4ccd-8c0a-f433961cd08a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:06:47 compute-0 nova_compute[183075]: 2026-01-22 17:06:47.949 183079 DEBUG nova.compute.manager [req-253ad921-7dbe-422a-822b-6aaa063251a1 req-79fbf128-c2f2-4eba-9c0e-dc48a5dca68c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Refreshing instance network info cache due to event network-changed-6b397961-0eb5-4ccd-8c0a-f433961cd08a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:06:47 compute-0 nova_compute[183075]: 2026-01-22 17:06:47.950 183079 DEBUG oslo_concurrency.lockutils [req-253ad921-7dbe-422a-822b-6aaa063251a1 req-79fbf128-c2f2-4eba-9c0e-dc48a5dca68c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-bd249764-12e4-4e25-9445-dd6e132ca53c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:06:48 compute-0 nova_compute[183075]: 2026-01-22 17:06:48.031 183079 DEBUG nova.network.neutron [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.386 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.405 183079 DEBUG nova.network.neutron [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Updating instance_info_cache with network_info: [{"id": "6b397961-0eb5-4ccd-8c0a-f433961cd08a", "address": "fa:16:3e:6c:74:51", "network": {"id": "ca57ee46-b6e8-4b60-affe-0c1349cb8abe", "bridge": "br-int", "label": "tempest-test-network--418767780", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cc642b97aa4e4886902a0d1233877b88", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b397961-0e", "ovs_interfaceid": "6b397961-0eb5-4ccd-8c0a-f433961cd08a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.440 183079 DEBUG oslo_concurrency.lockutils [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Releasing lock "refresh_cache-bd249764-12e4-4e25-9445-dd6e132ca53c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.441 183079 DEBUG nova.compute.manager [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Instance network_info: |[{"id": "6b397961-0eb5-4ccd-8c0a-f433961cd08a", "address": "fa:16:3e:6c:74:51", "network": {"id": "ca57ee46-b6e8-4b60-affe-0c1349cb8abe", "bridge": "br-int", "label": "tempest-test-network--418767780", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cc642b97aa4e4886902a0d1233877b88", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b397961-0e", "ovs_interfaceid": "6b397961-0eb5-4ccd-8c0a-f433961cd08a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.441 183079 DEBUG oslo_concurrency.lockutils [req-253ad921-7dbe-422a-822b-6aaa063251a1 req-79fbf128-c2f2-4eba-9c0e-dc48a5dca68c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-bd249764-12e4-4e25-9445-dd6e132ca53c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.442 183079 DEBUG nova.network.neutron [req-253ad921-7dbe-422a-822b-6aaa063251a1 req-79fbf128-c2f2-4eba-9c0e-dc48a5dca68c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Refreshing network info cache for port 6b397961-0eb5-4ccd-8c0a-f433961cd08a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.445 183079 DEBUG nova.virt.libvirt.driver [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Start _get_guest_xml network_info=[{"id": "6b397961-0eb5-4ccd-8c0a-f433961cd08a", "address": "fa:16:3e:6c:74:51", "network": {"id": "ca57ee46-b6e8-4b60-affe-0c1349cb8abe", "bridge": "br-int", "label": "tempest-test-network--418767780", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cc642b97aa4e4886902a0d1233877b88", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b397961-0e", "ovs_interfaceid": "6b397961-0eb5-4ccd-8c0a-f433961cd08a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.450 183079 WARNING nova.virt.libvirt.driver [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.462 183079 DEBUG nova.virt.libvirt.host [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.463 183079 DEBUG nova.virt.libvirt.host [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.477 183079 DEBUG nova.virt.libvirt.host [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.478 183079 DEBUG nova.virt.libvirt.host [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.479 183079 DEBUG nova.virt.libvirt.driver [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.480 183079 DEBUG nova.virt.hardware [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.480 183079 DEBUG nova.virt.hardware [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.481 183079 DEBUG nova.virt.hardware [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.482 183079 DEBUG nova.virt.hardware [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.482 183079 DEBUG nova.virt.hardware [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.482 183079 DEBUG nova.virt.hardware [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.483 183079 DEBUG nova.virt.hardware [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.484 183079 DEBUG nova.virt.hardware [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.484 183079 DEBUG nova.virt.hardware [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.485 183079 DEBUG nova.virt.hardware [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.485 183079 DEBUG nova.virt.hardware [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.492 183079 DEBUG nova.virt.libvirt.vif [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:06:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1744620446',display_name='tempest-server-test-1744620446',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1744620446',id=2,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFwDv7A3xWV6IJ9sHJD6IHIGofwjIy3SwtF7ZsSnO9i7yxDOnvofgvCRbmYwkVhe4LG2M7JC1Bh9mcomUiffvFuBa9GwDItaNN685Z4fyZXr+GZx+rbje/8Qtcf+s+bYwA==',key_name='tempest-keypair-test-76774833',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cc642b97aa4e4886902a0d1233877b88',ramdisk_id='',reservation_id='r-o3uwyal5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DefaultSnatToExternal-1301723521',owner_user_name='tempest-DefaultSnatToExternal-1301723521-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:06:45Z,user_data=None,user_id='c9cdd80799a74efb8ce82cfb5148ac89',uuid=bd249764-12e4-4e25-9445-dd6e132ca53c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6b397961-0eb5-4ccd-8c0a-f433961cd08a", "address": "fa:16:3e:6c:74:51", "network": {"id": "ca57ee46-b6e8-4b60-affe-0c1349cb8abe", "bridge": "br-int", "label": "tempest-test-network--418767780", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cc642b97aa4e4886902a0d1233877b88", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b397961-0e", "ovs_interfaceid": "6b397961-0eb5-4ccd-8c0a-f433961cd08a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.493 183079 DEBUG nova.network.os_vif_util [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Converting VIF {"id": "6b397961-0eb5-4ccd-8c0a-f433961cd08a", "address": "fa:16:3e:6c:74:51", "network": {"id": "ca57ee46-b6e8-4b60-affe-0c1349cb8abe", "bridge": "br-int", "label": "tempest-test-network--418767780", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cc642b97aa4e4886902a0d1233877b88", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b397961-0e", "ovs_interfaceid": "6b397961-0eb5-4ccd-8c0a-f433961cd08a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.495 183079 DEBUG nova.network.os_vif_util [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6c:74:51,bridge_name='br-int',has_traffic_filtering=True,id=6b397961-0eb5-4ccd-8c0a-f433961cd08a,network=Network(ca57ee46-b6e8-4b60-affe-0c1349cb8abe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6b397961-0e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.496 183079 DEBUG nova.objects.instance [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lazy-loading 'pci_devices' on Instance uuid bd249764-12e4-4e25-9445-dd6e132ca53c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.521 183079 DEBUG nova.virt.libvirt.driver [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:06:49 compute-0 nova_compute[183075]:   <uuid>bd249764-12e4-4e25-9445-dd6e132ca53c</uuid>
Jan 22 17:06:49 compute-0 nova_compute[183075]:   <name>instance-00000002</name>
Jan 22 17:06:49 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:06:49 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:06:49 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:06:49 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1744620446</nova:name>
Jan 22 17:06:49 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:06:49</nova:creationTime>
Jan 22 17:06:49 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:06:49 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:06:49 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:06:49 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:06:49 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:06:49 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:06:49 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:06:49 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:06:49 compute-0 nova_compute[183075]:         <nova:user uuid="c9cdd80799a74efb8ce82cfb5148ac89">tempest-DefaultSnatToExternal-1301723521-project-member</nova:user>
Jan 22 17:06:49 compute-0 nova_compute[183075]:         <nova:project uuid="cc642b97aa4e4886902a0d1233877b88">tempest-DefaultSnatToExternal-1301723521</nova:project>
Jan 22 17:06:49 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:06:49 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:06:49 compute-0 nova_compute[183075]:         <nova:port uuid="6b397961-0eb5-4ccd-8c0a-f433961cd08a">
Jan 22 17:06:49 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:06:49 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:06:49 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:06:49 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <system>
Jan 22 17:06:49 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:06:49 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:06:49 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:06:49 compute-0 nova_compute[183075]:       <entry name="serial">bd249764-12e4-4e25-9445-dd6e132ca53c</entry>
Jan 22 17:06:49 compute-0 nova_compute[183075]:       <entry name="uuid">bd249764-12e4-4e25-9445-dd6e132ca53c</entry>
Jan 22 17:06:49 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     </system>
Jan 22 17:06:49 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:06:49 compute-0 nova_compute[183075]:   <os>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:   </os>
Jan 22 17:06:49 compute-0 nova_compute[183075]:   <features>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:   </features>
Jan 22 17:06:49 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:06:49 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:06:49 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:06:49 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/bd249764-12e4-4e25-9445-dd6e132ca53c/disk"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:06:49 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:6c:74:51"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:       <target dev="tap6b397961-0e"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:06:49 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/bd249764-12e4-4e25-9445-dd6e132ca53c/console.log" append="off"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <video>
Jan 22 17:06:49 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     </video>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:06:49 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:06:49 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:06:49 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:06:49 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:06:49 compute-0 nova_compute[183075]: </domain>
Jan 22 17:06:49 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.524 183079 DEBUG nova.compute.manager [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Preparing to wait for external event network-vif-plugged-6b397961-0eb5-4ccd-8c0a-f433961cd08a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.525 183079 DEBUG oslo_concurrency.lockutils [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Acquiring lock "bd249764-12e4-4e25-9445-dd6e132ca53c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.525 183079 DEBUG oslo_concurrency.lockutils [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "bd249764-12e4-4e25-9445-dd6e132ca53c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.526 183079 DEBUG oslo_concurrency.lockutils [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "bd249764-12e4-4e25-9445-dd6e132ca53c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.527 183079 DEBUG nova.virt.libvirt.vif [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:06:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1744620446',display_name='tempest-server-test-1744620446',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1744620446',id=2,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFwDv7A3xWV6IJ9sHJD6IHIGofwjIy3SwtF7ZsSnO9i7yxDOnvofgvCRbmYwkVhe4LG2M7JC1Bh9mcomUiffvFuBa9GwDItaNN685Z4fyZXr+GZx+rbje/8Qtcf+s+bYwA==',key_name='tempest-keypair-test-76774833',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cc642b97aa4e4886902a0d1233877b88',ramdisk_id='',reservation_id='r-o3uwyal5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DefaultSnatToExternal-1301723521',owner_user_name='tempest-DefaultSnatToExternal-1301723521-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:06:45Z,user_data=None,user_id='c9cdd80799a74efb8ce82cfb5148ac89',uuid=bd249764-12e4-4e25-9445-dd6e132ca53c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6b397961-0eb5-4ccd-8c0a-f433961cd08a", "address": "fa:16:3e:6c:74:51", "network": {"id": "ca57ee46-b6e8-4b60-affe-0c1349cb8abe", "bridge": "br-int", "label": "tempest-test-network--418767780", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cc642b97aa4e4886902a0d1233877b88", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b397961-0e", "ovs_interfaceid": "6b397961-0eb5-4ccd-8c0a-f433961cd08a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.528 183079 DEBUG nova.network.os_vif_util [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Converting VIF {"id": "6b397961-0eb5-4ccd-8c0a-f433961cd08a", "address": "fa:16:3e:6c:74:51", "network": {"id": "ca57ee46-b6e8-4b60-affe-0c1349cb8abe", "bridge": "br-int", "label": "tempest-test-network--418767780", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cc642b97aa4e4886902a0d1233877b88", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b397961-0e", "ovs_interfaceid": "6b397961-0eb5-4ccd-8c0a-f433961cd08a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.529 183079 DEBUG nova.network.os_vif_util [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6c:74:51,bridge_name='br-int',has_traffic_filtering=True,id=6b397961-0eb5-4ccd-8c0a-f433961cd08a,network=Network(ca57ee46-b6e8-4b60-affe-0c1349cb8abe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6b397961-0e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.530 183079 DEBUG os_vif [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:74:51,bridge_name='br-int',has_traffic_filtering=True,id=6b397961-0eb5-4ccd-8c0a-f433961cd08a,network=Network(ca57ee46-b6e8-4b60-affe-0c1349cb8abe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6b397961-0e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.531 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.532 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.532 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.540 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.541 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6b397961-0e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.542 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6b397961-0e, col_values=(('external_ids', {'iface-id': '6b397961-0eb5-4ccd-8c0a-f433961cd08a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6c:74:51', 'vm-uuid': 'bd249764-12e4-4e25-9445-dd6e132ca53c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.544 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:49 compute-0 NetworkManager[55454]: <info>  [1769101609.5459] manager: (tap6b397961-0e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.548 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.553 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.554 183079 INFO os_vif [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:74:51,bridge_name='br-int',has_traffic_filtering=True,id=6b397961-0eb5-4ccd-8c0a-f433961cd08a,network=Network(ca57ee46-b6e8-4b60-affe-0c1349cb8abe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6b397961-0e')
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.601 183079 DEBUG nova.virt.libvirt.driver [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.601 183079 DEBUG nova.virt.libvirt.driver [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] No VIF found with MAC fa:16:3e:6c:74:51, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:06:49 compute-0 kernel: tap6b397961-0e: entered promiscuous mode
Jan 22 17:06:49 compute-0 NetworkManager[55454]: <info>  [1769101609.6838] manager: (tap6b397961-0e): new Tun device (/org/freedesktop/NetworkManager/Devices/28)
Jan 22 17:06:49 compute-0 ovn_controller[95372]: 2026-01-22T17:06:49Z|00037|binding|INFO|Claiming lport 6b397961-0eb5-4ccd-8c0a-f433961cd08a for this chassis.
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.686 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:49 compute-0 ovn_controller[95372]: 2026-01-22T17:06:49Z|00038|binding|INFO|6b397961-0eb5-4ccd-8c0a-f433961cd08a: Claiming fa:16:3e:6c:74:51 10.100.0.12
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.698 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:49 compute-0 systemd-udevd[215161]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:06:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:06:49.716 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6c:74:51 10.100.0.12'], port_security=['fa:16:3e:6c:74:51 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'bd249764-12e4-4e25-9445-dd6e132ca53c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ca57ee46-b6e8-4b60-affe-0c1349cb8abe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cc642b97aa4e4886902a0d1233877b88', 'neutron:revision_number': '2', 'neutron:security_group_ids': '78ff1eac-eb3f-4f3f-baa4-7b8094682c5f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.245'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ad421178-e3d9-4f12-a6a2-84aabbd8f25c, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=6b397961-0eb5-4ccd-8c0a-f433961cd08a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:06:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:06:49.718 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 6b397961-0eb5-4ccd-8c0a-f433961cd08a in datapath ca57ee46-b6e8-4b60-affe-0c1349cb8abe bound to our chassis
Jan 22 17:06:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:06:49.719 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ca57ee46-b6e8-4b60-affe-0c1349cb8abe
Jan 22 17:06:49 compute-0 NetworkManager[55454]: <info>  [1769101609.7280] device (tap6b397961-0e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:06:49 compute-0 NetworkManager[55454]: <info>  [1769101609.7286] device (tap6b397961-0e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:06:49 compute-0 systemd-machined[154382]: New machine qemu-2-instance-00000002.
Jan 22 17:06:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:06:49.730 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[443b2792-7211-4dc7-97d8-59bbaae5032e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:06:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:06:49.732 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapca57ee46-b1 in ovnmeta-ca57ee46-b6e8-4b60-affe-0c1349cb8abe namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:06:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:06:49.733 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapca57ee46-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:06:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:06:49.733 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[63037a0f-05e1-42a6-923e-23bd765a88bc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:06:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:06:49.734 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a56fdebf-1307-4106-a52b-4e12acbab0dc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:06:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:06:49.746 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[595c3f07-ff3c-4b72-93aa-76cf38f8f851]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:06:49 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000002.
Jan 22 17:06:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:06:49.777 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e3e0e601-969b-49df-b234-db05acb914a6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:06:49 compute-0 ovn_controller[95372]: 2026-01-22T17:06:49Z|00039|binding|INFO|Setting lport 6b397961-0eb5-4ccd-8c0a-f433961cd08a ovn-installed in OVS
Jan 22 17:06:49 compute-0 ovn_controller[95372]: 2026-01-22T17:06:49Z|00040|binding|INFO|Setting lport 6b397961-0eb5-4ccd-8c0a-f433961cd08a up in Southbound
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.780 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:49 compute-0 podman[215148]: 2026-01-22 17:06:49.795689319 +0000 UTC m=+0.101549376 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_id=openstack_network_exporter, io.buildah.version=1.33.7, 
distribution-scope=public, managed_by=edpm_ansible, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6)
Jan 22 17:06:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:06:49.806 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[1bc4f61c-b4bf-48c9-9dff-acfce0d30bdd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:06:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:06:49.812 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[cf34e12d-fad5-4ec2-beec-d31ea188e679]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:06:49 compute-0 NetworkManager[55454]: <info>  [1769101609.8143] manager: (tapca57ee46-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/29)
Jan 22 17:06:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:06:49.848 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[a32ac3fe-306e-4d80-af0a-2ddca15e5a1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:06:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:06:49.851 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[af30dcd1-c6e7-4c39-b047-63c0020602e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:06:49 compute-0 NetworkManager[55454]: <info>  [1769101609.8797] device (tapca57ee46-b0): carrier: link connected
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.886 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:49 compute-0 NetworkManager[55454]: <info>  [1769101609.8877] manager: (patch-br-int-to-provnet-c63dea44-3fe3-44c3-ba47-76ee1ea0ad94): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Jan 22 17:06:49 compute-0 NetworkManager[55454]: <info>  [1769101609.8889] manager: (patch-provnet-c63dea44-3fe3-44c3-ba47-76ee1ea0ad94-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Jan 22 17:06:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:06:49.889 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[ae98fdb2-0885-4632-a02e-2bc7a8748167]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:06:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:06:49.906 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e3209718-c684-4196-8fad-0c14fcb6a7cf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapca57ee46-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:69:1c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 397358, 'reachable_time': 27603, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215207, 'error': None, 'target': 'ovnmeta-ca57ee46-b6e8-4b60-affe-0c1349cb8abe', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.916 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:49 compute-0 nova_compute[183075]: 2026-01-22 17:06:49.921 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:06:49.924 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[49192b09-39cd-46a5-91ec-e515c1db9152]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5c:691c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 397358, 'tstamp': 397358}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215208, 'error': None, 'target': 'ovnmeta-ca57ee46-b6e8-4b60-affe-0c1349cb8abe', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:06:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:06:49.943 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f4e4cd64-f473-4271-97d4-bcd0624ade8a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapca57ee46-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:69:1c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 397358, 'reachable_time': 27603, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215209, 'error': None, 'target': 'ovnmeta-ca57ee46-b6e8-4b60-affe-0c1349cb8abe', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:06:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:06:49.986 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7db74620-7fcf-438b-8ae7-7a380b905912]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:06:50.058 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4f2cce35-96b4-4322-9298-7c4e41266001]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:06:50.060 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapca57ee46-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:06:50.061 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:06:50.061 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapca57ee46-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:06:50 compute-0 kernel: tapca57ee46-b0: entered promiscuous mode
Jan 22 17:06:50 compute-0 NetworkManager[55454]: <info>  [1769101610.0649] manager: (tapca57ee46-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Jan 22 17:06:50 compute-0 nova_compute[183075]: 2026-01-22 17:06:50.064 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:06:50.067 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapca57ee46-b0, col_values=(('external_ids', {'iface-id': '75ce3d88-09ff-4158-a831-34ffcbec4888'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:06:50 compute-0 nova_compute[183075]: 2026-01-22 17:06:50.069 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:50 compute-0 ovn_controller[95372]: 2026-01-22T17:06:50Z|00041|binding|INFO|Releasing lport 75ce3d88-09ff-4158-a831-34ffcbec4888 from this chassis (sb_readonly=0)
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:06:50.070 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ca57ee46-b6e8-4b60-affe-0c1349cb8abe.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ca57ee46-b6e8-4b60-affe-0c1349cb8abe.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:06:50.071 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a5afadf3-a31b-42f9-9349-5f095a061e8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:06:50.072 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-ca57ee46-b6e8-4b60-affe-0c1349cb8abe
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/ca57ee46-b6e8-4b60-affe-0c1349cb8abe.pid.haproxy
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID ca57ee46-b6e8-4b60-affe-0c1349cb8abe
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:06:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:06:50.074 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ca57ee46-b6e8-4b60-affe-0c1349cb8abe', 'env', 'PROCESS_TAG=haproxy-ca57ee46-b6e8-4b60-affe-0c1349cb8abe', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ca57ee46-b6e8-4b60-affe-0c1349cb8abe.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:06:50 compute-0 nova_compute[183075]: 2026-01-22 17:06:50.079 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:50 compute-0 nova_compute[183075]: 2026-01-22 17:06:50.289 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101610.2885072, bd249764-12e4-4e25-9445-dd6e132ca53c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:06:50 compute-0 nova_compute[183075]: 2026-01-22 17:06:50.290 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] VM Started (Lifecycle Event)
Jan 22 17:06:50 compute-0 nova_compute[183075]: 2026-01-22 17:06:50.323 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:06:50 compute-0 nova_compute[183075]: 2026-01-22 17:06:50.328 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101610.289178, bd249764-12e4-4e25-9445-dd6e132ca53c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:06:50 compute-0 nova_compute[183075]: 2026-01-22 17:06:50.329 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] VM Paused (Lifecycle Event)
Jan 22 17:06:50 compute-0 nova_compute[183075]: 2026-01-22 17:06:50.352 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:06:50 compute-0 nova_compute[183075]: 2026-01-22 17:06:50.357 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:06:50 compute-0 nova_compute[183075]: 2026-01-22 17:06:50.381 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:06:50 compute-0 podman[215248]: 2026-01-22 17:06:50.482668113 +0000 UTC m=+0.055739569 container create aaaa7538d3c0886fe2729d55d17736b07529c6f43da9a107883fbd5fe70c8423 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca57ee46-b6e8-4b60-affe-0c1349cb8abe, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:06:50 compute-0 systemd[1]: Started libpod-conmon-aaaa7538d3c0886fe2729d55d17736b07529c6f43da9a107883fbd5fe70c8423.scope.
Jan 22 17:06:50 compute-0 podman[215248]: 2026-01-22 17:06:50.4538942 +0000 UTC m=+0.026965676 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:06:50 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:06:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2244431f15046fcf1df958c192f3e2514255cb947c42b723d6394ed8db21857e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:06:50 compute-0 podman[215248]: 2026-01-22 17:06:50.576018364 +0000 UTC m=+0.149089870 container init aaaa7538d3c0886fe2729d55d17736b07529c6f43da9a107883fbd5fe70c8423 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca57ee46-b6e8-4b60-affe-0c1349cb8abe, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:06:50 compute-0 podman[215248]: 2026-01-22 17:06:50.588565252 +0000 UTC m=+0.161636718 container start aaaa7538d3c0886fe2729d55d17736b07529c6f43da9a107883fbd5fe70c8423 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca57ee46-b6e8-4b60-affe-0c1349cb8abe, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 22 17:06:50 compute-0 neutron-haproxy-ovnmeta-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215264]: [NOTICE]   (215268) : New worker (215270) forked
Jan 22 17:06:50 compute-0 neutron-haproxy-ovnmeta-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215264]: [NOTICE]   (215268) : Loading success.
Jan 22 17:06:52 compute-0 sshd-session[215079]: Connection reset by authenticating user root 176.120.22.47 port 56124 [preauth]
Jan 22 17:06:52 compute-0 nova_compute[183075]: 2026-01-22 17:06:52.631 183079 DEBUG nova.compute.manager [req-2b000321-a08e-4413-b007-3bc314f85bd7 req-77d55e3a-c4e0-49da-bd8b-202423b24b9f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Received event network-vif-plugged-6b397961-0eb5-4ccd-8c0a-f433961cd08a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:06:52 compute-0 nova_compute[183075]: 2026-01-22 17:06:52.632 183079 DEBUG oslo_concurrency.lockutils [req-2b000321-a08e-4413-b007-3bc314f85bd7 req-77d55e3a-c4e0-49da-bd8b-202423b24b9f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "bd249764-12e4-4e25-9445-dd6e132ca53c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:06:52 compute-0 nova_compute[183075]: 2026-01-22 17:06:52.633 183079 DEBUG oslo_concurrency.lockutils [req-2b000321-a08e-4413-b007-3bc314f85bd7 req-77d55e3a-c4e0-49da-bd8b-202423b24b9f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "bd249764-12e4-4e25-9445-dd6e132ca53c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:06:52 compute-0 nova_compute[183075]: 2026-01-22 17:06:52.634 183079 DEBUG oslo_concurrency.lockutils [req-2b000321-a08e-4413-b007-3bc314f85bd7 req-77d55e3a-c4e0-49da-bd8b-202423b24b9f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "bd249764-12e4-4e25-9445-dd6e132ca53c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:06:52 compute-0 nova_compute[183075]: 2026-01-22 17:06:52.634 183079 DEBUG nova.compute.manager [req-2b000321-a08e-4413-b007-3bc314f85bd7 req-77d55e3a-c4e0-49da-bd8b-202423b24b9f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Processing event network-vif-plugged-6b397961-0eb5-4ccd-8c0a-f433961cd08a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:06:52 compute-0 nova_compute[183075]: 2026-01-22 17:06:52.636 183079 DEBUG nova.compute.manager [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:06:52 compute-0 nova_compute[183075]: 2026-01-22 17:06:52.642 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101612.6424644, bd249764-12e4-4e25-9445-dd6e132ca53c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:06:52 compute-0 nova_compute[183075]: 2026-01-22 17:06:52.643 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] VM Resumed (Lifecycle Event)
Jan 22 17:06:52 compute-0 nova_compute[183075]: 2026-01-22 17:06:52.647 183079 DEBUG nova.virt.libvirt.driver [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:06:52 compute-0 nova_compute[183075]: 2026-01-22 17:06:52.653 183079 INFO nova.virt.libvirt.driver [-] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Instance spawned successfully.
Jan 22 17:06:52 compute-0 nova_compute[183075]: 2026-01-22 17:06:52.653 183079 DEBUG nova.virt.libvirt.driver [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:06:52 compute-0 nova_compute[183075]: 2026-01-22 17:06:52.810 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:06:52 compute-0 nova_compute[183075]: 2026-01-22 17:06:52.816 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:06:52 compute-0 nova_compute[183075]: 2026-01-22 17:06:52.834 183079 DEBUG nova.virt.libvirt.driver [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:06:52 compute-0 nova_compute[183075]: 2026-01-22 17:06:52.834 183079 DEBUG nova.virt.libvirt.driver [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:06:52 compute-0 nova_compute[183075]: 2026-01-22 17:06:52.835 183079 DEBUG nova.virt.libvirt.driver [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:06:52 compute-0 nova_compute[183075]: 2026-01-22 17:06:52.836 183079 DEBUG nova.virt.libvirt.driver [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:06:52 compute-0 nova_compute[183075]: 2026-01-22 17:06:52.836 183079 DEBUG nova.virt.libvirt.driver [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:06:52 compute-0 nova_compute[183075]: 2026-01-22 17:06:52.836 183079 DEBUG nova.virt.libvirt.driver [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:06:52 compute-0 nova_compute[183075]: 2026-01-22 17:06:52.842 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:06:52 compute-0 nova_compute[183075]: 2026-01-22 17:06:52.892 183079 INFO nova.compute.manager [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Took 7.82 seconds to spawn the instance on the hypervisor.
Jan 22 17:06:52 compute-0 nova_compute[183075]: 2026-01-22 17:06:52.893 183079 DEBUG nova.compute.manager [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:06:52 compute-0 nova_compute[183075]: 2026-01-22 17:06:52.960 183079 INFO nova.compute.manager [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Took 8.28 seconds to build instance.
Jan 22 17:06:52 compute-0 nova_compute[183075]: 2026-01-22 17:06:52.978 183079 DEBUG oslo_concurrency.lockutils [None req-c59621f2-3ae6-408c-9187-07596a70ec39 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "bd249764-12e4-4e25-9445-dd6e132ca53c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.366s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:06:53 compute-0 nova_compute[183075]: 2026-01-22 17:06:53.354 183079 DEBUG nova.network.neutron [req-253ad921-7dbe-422a-822b-6aaa063251a1 req-79fbf128-c2f2-4eba-9c0e-dc48a5dca68c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Updated VIF entry in instance network info cache for port 6b397961-0eb5-4ccd-8c0a-f433961cd08a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:06:53 compute-0 nova_compute[183075]: 2026-01-22 17:06:53.356 183079 DEBUG nova.network.neutron [req-253ad921-7dbe-422a-822b-6aaa063251a1 req-79fbf128-c2f2-4eba-9c0e-dc48a5dca68c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Updating instance_info_cache with network_info: [{"id": "6b397961-0eb5-4ccd-8c0a-f433961cd08a", "address": "fa:16:3e:6c:74:51", "network": {"id": "ca57ee46-b6e8-4b60-affe-0c1349cb8abe", "bridge": "br-int", "label": "tempest-test-network--418767780", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cc642b97aa4e4886902a0d1233877b88", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b397961-0e", "ovs_interfaceid": "6b397961-0eb5-4ccd-8c0a-f433961cd08a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:06:53 compute-0 nova_compute[183075]: 2026-01-22 17:06:53.380 183079 DEBUG oslo_concurrency.lockutils [req-253ad921-7dbe-422a-822b-6aaa063251a1 req-79fbf128-c2f2-4eba-9c0e-dc48a5dca68c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-bd249764-12e4-4e25-9445-dd6e132ca53c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:06:53 compute-0 nova_compute[183075]: 2026-01-22 17:06:53.422 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:54 compute-0 nova_compute[183075]: 2026-01-22 17:06:54.424 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:54 compute-0 nova_compute[183075]: 2026-01-22 17:06:54.545 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:54 compute-0 nova_compute[183075]: 2026-01-22 17:06:54.750 183079 DEBUG nova.compute.manager [req-16ff4c4f-cd65-4ea9-bdfc-3a9c45029d20 req-adc3bf03-1289-46f7-9b1a-2f415f2556d9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Received event network-vif-plugged-6b397961-0eb5-4ccd-8c0a-f433961cd08a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:06:54 compute-0 nova_compute[183075]: 2026-01-22 17:06:54.752 183079 DEBUG oslo_concurrency.lockutils [req-16ff4c4f-cd65-4ea9-bdfc-3a9c45029d20 req-adc3bf03-1289-46f7-9b1a-2f415f2556d9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "bd249764-12e4-4e25-9445-dd6e132ca53c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:06:54 compute-0 nova_compute[183075]: 2026-01-22 17:06:54.752 183079 DEBUG oslo_concurrency.lockutils [req-16ff4c4f-cd65-4ea9-bdfc-3a9c45029d20 req-adc3bf03-1289-46f7-9b1a-2f415f2556d9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "bd249764-12e4-4e25-9445-dd6e132ca53c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:06:54 compute-0 nova_compute[183075]: 2026-01-22 17:06:54.753 183079 DEBUG oslo_concurrency.lockutils [req-16ff4c4f-cd65-4ea9-bdfc-3a9c45029d20 req-adc3bf03-1289-46f7-9b1a-2f415f2556d9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "bd249764-12e4-4e25-9445-dd6e132ca53c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:06:54 compute-0 nova_compute[183075]: 2026-01-22 17:06:54.754 183079 DEBUG nova.compute.manager [req-16ff4c4f-cd65-4ea9-bdfc-3a9c45029d20 req-adc3bf03-1289-46f7-9b1a-2f415f2556d9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] No waiting events found dispatching network-vif-plugged-6b397961-0eb5-4ccd-8c0a-f433961cd08a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:06:54 compute-0 nova_compute[183075]: 2026-01-22 17:06:54.754 183079 WARNING nova.compute.manager [req-16ff4c4f-cd65-4ea9-bdfc-3a9c45029d20 req-adc3bf03-1289-46f7-9b1a-2f415f2556d9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Received unexpected event network-vif-plugged-6b397961-0eb5-4ccd-8c0a-f433961cd08a for instance with vm_state active and task_state None.
Jan 22 17:06:54 compute-0 nova_compute[183075]: 2026-01-22 17:06:54.758 183079 INFO nova.compute.manager [None req-f49f8b5a-bdd3-43e2-b782-eb63a7a6e6ea c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Get console output
Jan 22 17:06:55 compute-0 podman[215283]: 2026-01-22 17:06:55.42260891 +0000 UTC m=+0.097864950 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:06:57 compute-0 nova_compute[183075]: 2026-01-22 17:06:57.972 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:59 compute-0 nova_compute[183075]: 2026-01-22 17:06:59.427 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:59 compute-0 nova_compute[183075]: 2026-01-22 17:06:59.548 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:06:59 compute-0 nova_compute[183075]: 2026-01-22 17:06:59.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:06:59 compute-0 sshd-session[215281]: Connection reset by authenticating user root 176.120.22.47 port 53654 [preauth]
Jan 22 17:06:59 compute-0 nova_compute[183075]: 2026-01-22 17:06:59.992 183079 INFO nova.compute.manager [None req-92933afc-1250-4f7d-9324-d60331845f19 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Get console output
Jan 22 17:07:00 compute-0 nova_compute[183075]: 2026-01-22 17:07:00.001 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:07:01 compute-0 podman[215308]: 2026-01-22 17:07:01.387505968 +0000 UTC m=+0.091578106 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:07:01 compute-0 nova_compute[183075]: 2026-01-22 17:07:01.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:07:02 compute-0 sshd-session[215279]: Connection reset by authenticating user root 176.120.22.47 port 53642 [preauth]
Jan 22 17:07:03 compute-0 nova_compute[183075]: 2026-01-22 17:07:03.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:07:03 compute-0 nova_compute[183075]: 2026-01-22 17:07:03.789 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:07:03 compute-0 nova_compute[183075]: 2026-01-22 17:07:03.789 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:07:03 compute-0 nova_compute[183075]: 2026-01-22 17:07:03.956 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-bd249764-12e4-4e25-9445-dd6e132ca53c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:07:03 compute-0 nova_compute[183075]: 2026-01-22 17:07:03.956 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-bd249764-12e4-4e25-9445-dd6e132ca53c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:07:03 compute-0 nova_compute[183075]: 2026-01-22 17:07:03.956 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:07:03 compute-0 nova_compute[183075]: 2026-01-22 17:07:03.956 183079 DEBUG nova.objects.instance [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lazy-loading 'info_cache' on Instance uuid bd249764-12e4-4e25-9445-dd6e132ca53c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:07:04 compute-0 nova_compute[183075]: 2026-01-22 17:07:04.488 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:04 compute-0 nova_compute[183075]: 2026-01-22 17:07:04.550 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:05 compute-0 nova_compute[183075]: 2026-01-22 17:07:05.128 183079 INFO nova.compute.manager [None req-a04657e8-d1a4-43d3-89e7-3d393ade120a c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Get console output
Jan 22 17:07:05 compute-0 nova_compute[183075]: 2026-01-22 17:07:05.134 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:07:05 compute-0 ovn_controller[95372]: 2026-01-22T17:07:05Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6c:74:51 10.100.0.12
Jan 22 17:07:05 compute-0 ovn_controller[95372]: 2026-01-22T17:07:05Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6c:74:51 10.100.0.12
Jan 22 17:07:05 compute-0 nova_compute[183075]: 2026-01-22 17:07:05.993 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Updating instance_info_cache with network_info: [{"id": "6b397961-0eb5-4ccd-8c0a-f433961cd08a", "address": "fa:16:3e:6c:74:51", "network": {"id": "ca57ee46-b6e8-4b60-affe-0c1349cb8abe", "bridge": "br-int", "label": "tempest-test-network--418767780", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cc642b97aa4e4886902a0d1233877b88", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b397961-0e", "ovs_interfaceid": "6b397961-0eb5-4ccd-8c0a-f433961cd08a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:07:06 compute-0 nova_compute[183075]: 2026-01-22 17:07:06.014 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-bd249764-12e4-4e25-9445-dd6e132ca53c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:07:06 compute-0 nova_compute[183075]: 2026-01-22 17:07:06.015 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:07:06 compute-0 nova_compute[183075]: 2026-01-22 17:07:06.016 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:07:06 compute-0 nova_compute[183075]: 2026-01-22 17:07:06.568 183079 DEBUG oslo_concurrency.lockutils [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Acquiring lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:06 compute-0 nova_compute[183075]: 2026-01-22 17:07:06.569 183079 DEBUG oslo_concurrency.lockutils [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:06 compute-0 nova_compute[183075]: 2026-01-22 17:07:06.592 183079 DEBUG nova.compute.manager [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:07:06 compute-0 sshd-session[215304]: Connection reset by authenticating user root 176.120.22.47 port 53660 [preauth]
Jan 22 17:07:06 compute-0 nova_compute[183075]: 2026-01-22 17:07:06.669 183079 DEBUG oslo_concurrency.lockutils [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:06 compute-0 nova_compute[183075]: 2026-01-22 17:07:06.670 183079 DEBUG oslo_concurrency.lockutils [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:06 compute-0 nova_compute[183075]: 2026-01-22 17:07:06.683 183079 DEBUG nova.virt.hardware [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:07:06 compute-0 nova_compute[183075]: 2026-01-22 17:07:06.683 183079 INFO nova.compute.claims [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:07:06 compute-0 nova_compute[183075]: 2026-01-22 17:07:06.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:07:06 compute-0 nova_compute[183075]: 2026-01-22 17:07:06.787 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:07:06 compute-0 nova_compute[183075]: 2026-01-22 17:07:06.803 183079 DEBUG nova.compute.provider_tree [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:07:06 compute-0 sshd-session[215305]: Connection reset by authenticating user root 176.120.22.47 port 46290 [preauth]
Jan 22 17:07:06 compute-0 nova_compute[183075]: 2026-01-22 17:07:06.854 183079 DEBUG nova.scheduler.client.report [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:07:06 compute-0 nova_compute[183075]: 2026-01-22 17:07:06.880 183079 DEBUG oslo_concurrency.lockutils [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.210s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:06 compute-0 nova_compute[183075]: 2026-01-22 17:07:06.882 183079 DEBUG nova.compute.manager [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:07:06 compute-0 nova_compute[183075]: 2026-01-22 17:07:06.925 183079 DEBUG nova.compute.manager [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:07:06 compute-0 nova_compute[183075]: 2026-01-22 17:07:06.927 183079 DEBUG nova.network.neutron [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:07:06 compute-0 nova_compute[183075]: 2026-01-22 17:07:06.946 183079 INFO nova.virt.libvirt.driver [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:07:06 compute-0 nova_compute[183075]: 2026-01-22 17:07:06.970 183079 DEBUG nova.compute.manager [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:07:07 compute-0 nova_compute[183075]: 2026-01-22 17:07:07.046 183079 DEBUG nova.compute.manager [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:07:07 compute-0 nova_compute[183075]: 2026-01-22 17:07:07.047 183079 DEBUG nova.virt.libvirt.driver [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:07:07 compute-0 nova_compute[183075]: 2026-01-22 17:07:07.047 183079 INFO nova.virt.libvirt.driver [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Creating image(s)
Jan 22 17:07:07 compute-0 nova_compute[183075]: 2026-01-22 17:07:07.048 183079 DEBUG oslo_concurrency.lockutils [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Acquiring lock "/var/lib/nova/instances/effaddee-27ef-49f6-ac5f-2e3258c8d5d2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:07 compute-0 nova_compute[183075]: 2026-01-22 17:07:07.048 183079 DEBUG oslo_concurrency.lockutils [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "/var/lib/nova/instances/effaddee-27ef-49f6-ac5f-2e3258c8d5d2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:07 compute-0 nova_compute[183075]: 2026-01-22 17:07:07.049 183079 DEBUG oslo_concurrency.lockutils [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "/var/lib/nova/instances/effaddee-27ef-49f6-ac5f-2e3258c8d5d2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:07 compute-0 nova_compute[183075]: 2026-01-22 17:07:07.061 183079 DEBUG oslo_concurrency.processutils [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:07:07 compute-0 nova_compute[183075]: 2026-01-22 17:07:07.127 183079 DEBUG oslo_concurrency.processutils [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:07:07 compute-0 nova_compute[183075]: 2026-01-22 17:07:07.129 183079 DEBUG oslo_concurrency.lockutils [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:07 compute-0 nova_compute[183075]: 2026-01-22 17:07:07.129 183079 DEBUG oslo_concurrency.lockutils [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:07 compute-0 nova_compute[183075]: 2026-01-22 17:07:07.143 183079 DEBUG oslo_concurrency.processutils [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:07:07 compute-0 nova_compute[183075]: 2026-01-22 17:07:07.200 183079 DEBUG oslo_concurrency.processutils [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:07:07 compute-0 nova_compute[183075]: 2026-01-22 17:07:07.202 183079 DEBUG oslo_concurrency.processutils [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/effaddee-27ef-49f6-ac5f-2e3258c8d5d2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:07:07 compute-0 nova_compute[183075]: 2026-01-22 17:07:07.237 183079 DEBUG oslo_concurrency.processutils [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/effaddee-27ef-49f6-ac5f-2e3258c8d5d2/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:07:07 compute-0 nova_compute[183075]: 2026-01-22 17:07:07.239 183079 DEBUG oslo_concurrency.lockutils [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:07 compute-0 nova_compute[183075]: 2026-01-22 17:07:07.240 183079 DEBUG oslo_concurrency.processutils [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:07:07 compute-0 nova_compute[183075]: 2026-01-22 17:07:07.299 183079 DEBUG nova.policy [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:07:07 compute-0 nova_compute[183075]: 2026-01-22 17:07:07.307 183079 DEBUG oslo_concurrency.processutils [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:07:07 compute-0 nova_compute[183075]: 2026-01-22 17:07:07.308 183079 DEBUG nova.virt.disk.api [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Checking if we can resize image /var/lib/nova/instances/effaddee-27ef-49f6-ac5f-2e3258c8d5d2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:07:07 compute-0 nova_compute[183075]: 2026-01-22 17:07:07.308 183079 DEBUG oslo_concurrency.processutils [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/effaddee-27ef-49f6-ac5f-2e3258c8d5d2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:07:07 compute-0 nova_compute[183075]: 2026-01-22 17:07:07.366 183079 DEBUG oslo_concurrency.processutils [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/effaddee-27ef-49f6-ac5f-2e3258c8d5d2/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:07:07 compute-0 nova_compute[183075]: 2026-01-22 17:07:07.368 183079 DEBUG nova.virt.disk.api [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Cannot resize image /var/lib/nova/instances/effaddee-27ef-49f6-ac5f-2e3258c8d5d2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:07:07 compute-0 nova_compute[183075]: 2026-01-22 17:07:07.368 183079 DEBUG nova.objects.instance [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lazy-loading 'migration_context' on Instance uuid effaddee-27ef-49f6-ac5f-2e3258c8d5d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:07:07 compute-0 nova_compute[183075]: 2026-01-22 17:07:07.387 183079 DEBUG nova.virt.libvirt.driver [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:07:07 compute-0 nova_compute[183075]: 2026-01-22 17:07:07.388 183079 DEBUG nova.virt.libvirt.driver [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Ensure instance console log exists: /var/lib/nova/instances/effaddee-27ef-49f6-ac5f-2e3258c8d5d2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:07:07 compute-0 nova_compute[183075]: 2026-01-22 17:07:07.389 183079 DEBUG oslo_concurrency.lockutils [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:07 compute-0 nova_compute[183075]: 2026-01-22 17:07:07.389 183079 DEBUG oslo_concurrency.lockutils [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:07 compute-0 nova_compute[183075]: 2026-01-22 17:07:07.390 183079 DEBUG oslo_concurrency.lockutils [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:07 compute-0 nova_compute[183075]: 2026-01-22 17:07:07.784 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:07:08 compute-0 nova_compute[183075]: 2026-01-22 17:07:08.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:07:08 compute-0 nova_compute[183075]: 2026-01-22 17:07:08.789 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:07:09 compute-0 nova_compute[183075]: 2026-01-22 17:07:09.012 183079 DEBUG nova.network.neutron [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Successfully updated port: 44437e9e-7bcf-4942-83a0-cb6139413a8e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:07:09 compute-0 nova_compute[183075]: 2026-01-22 17:07:09.028 183079 DEBUG oslo_concurrency.lockutils [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Acquiring lock "refresh_cache-effaddee-27ef-49f6-ac5f-2e3258c8d5d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:07:09 compute-0 nova_compute[183075]: 2026-01-22 17:07:09.029 183079 DEBUG oslo_concurrency.lockutils [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Acquired lock "refresh_cache-effaddee-27ef-49f6-ac5f-2e3258c8d5d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:07:09 compute-0 nova_compute[183075]: 2026-01-22 17:07:09.029 183079 DEBUG nova.network.neutron [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:07:09 compute-0 nova_compute[183075]: 2026-01-22 17:07:09.120 183079 DEBUG nova.compute.manager [req-98733be2-b783-493d-b016-a6c12f8a24c6 req-ec7ab120-8e94-4105-9696-f484eb6fea15 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Received event network-changed-44437e9e-7bcf-4942-83a0-cb6139413a8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:07:09 compute-0 nova_compute[183075]: 2026-01-22 17:07:09.121 183079 DEBUG nova.compute.manager [req-98733be2-b783-493d-b016-a6c12f8a24c6 req-ec7ab120-8e94-4105-9696-f484eb6fea15 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Refreshing instance network info cache due to event network-changed-44437e9e-7bcf-4942-83a0-cb6139413a8e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:07:09 compute-0 nova_compute[183075]: 2026-01-22 17:07:09.122 183079 DEBUG oslo_concurrency.lockutils [req-98733be2-b783-493d-b016-a6c12f8a24c6 req-ec7ab120-8e94-4105-9696-f484eb6fea15 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-effaddee-27ef-49f6-ac5f-2e3258c8d5d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:07:09 compute-0 nova_compute[183075]: 2026-01-22 17:07:09.211 183079 DEBUG nova.network.neutron [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:07:09 compute-0 nova_compute[183075]: 2026-01-22 17:07:09.490 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:09 compute-0 nova_compute[183075]: 2026-01-22 17:07:09.552 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.303 183079 INFO nova.compute.manager [None req-8b608dfa-7468-4759-a2a8-d0ff7fe67db0 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Get console output
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.311 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.332 183079 DEBUG nova.network.neutron [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Updating instance_info_cache with network_info: [{"id": "44437e9e-7bcf-4942-83a0-cb6139413a8e", "address": "fa:16:3e:c8:71:80", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44437e9e-7b", "ovs_interfaceid": "44437e9e-7bcf-4942-83a0-cb6139413a8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.355 183079 DEBUG oslo_concurrency.lockutils [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Releasing lock "refresh_cache-effaddee-27ef-49f6-ac5f-2e3258c8d5d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.355 183079 DEBUG nova.compute.manager [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Instance network_info: |[{"id": "44437e9e-7bcf-4942-83a0-cb6139413a8e", "address": "fa:16:3e:c8:71:80", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44437e9e-7b", "ovs_interfaceid": "44437e9e-7bcf-4942-83a0-cb6139413a8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.356 183079 DEBUG oslo_concurrency.lockutils [req-98733be2-b783-493d-b016-a6c12f8a24c6 req-ec7ab120-8e94-4105-9696-f484eb6fea15 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-effaddee-27ef-49f6-ac5f-2e3258c8d5d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.356 183079 DEBUG nova.network.neutron [req-98733be2-b783-493d-b016-a6c12f8a24c6 req-ec7ab120-8e94-4105-9696-f484eb6fea15 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Refreshing network info cache for port 44437e9e-7bcf-4942-83a0-cb6139413a8e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.360 183079 DEBUG nova.virt.libvirt.driver [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Start _get_guest_xml network_info=[{"id": "44437e9e-7bcf-4942-83a0-cb6139413a8e", "address": "fa:16:3e:c8:71:80", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44437e9e-7b", "ovs_interfaceid": "44437e9e-7bcf-4942-83a0-cb6139413a8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.364 183079 WARNING nova.virt.libvirt.driver [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.369 183079 DEBUG nova.virt.libvirt.host [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.370 183079 DEBUG nova.virt.libvirt.host [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.373 183079 DEBUG nova.virt.libvirt.host [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.374 183079 DEBUG nova.virt.libvirt.host [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.374 183079 DEBUG nova.virt.libvirt.driver [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.375 183079 DEBUG nova.virt.hardware [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.375 183079 DEBUG nova.virt.hardware [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.375 183079 DEBUG nova.virt.hardware [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.375 183079 DEBUG nova.virt.hardware [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.376 183079 DEBUG nova.virt.hardware [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.376 183079 DEBUG nova.virt.hardware [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.376 183079 DEBUG nova.virt.hardware [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.376 183079 DEBUG nova.virt.hardware [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.376 183079 DEBUG nova.virt.hardware [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.377 183079 DEBUG nova.virt.hardware [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.377 183079 DEBUG nova.virt.hardware [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.379 183079 DEBUG nova.virt.libvirt.vif [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:07:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1783669850',display_name='tempest-server-test-1783669850',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1783669850',id=3,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCYZfzAO5Ui0C6aNyi2ae5nossybaMPhlAmp9Zj0dL/LunopVV9meBl8qfqrR0u5rqBGUC3w2LWkiMuhmtUWjdFFHsn2/jcRm/feYqTpe58YE0uSBcC8gJ0hMPoRirOIAg==',key_name='tempest-keypair-test-1166356111',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='eb8e9f7a891a4a38af8b01557eddc991',ramdisk_id='',reservation_id='r-89frdmqf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPPortDetailsTest-1812576526',owner_user_name='tempest-FloatingIPPortDetailsTest-1812576526-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:07:07Z,user_data=None,user_id='d7ee6c51c2b8447baefccea20fa16de5',uuid=effaddee-27ef-49f6-ac5f-2e3258c8d5d2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "44437e9e-7bcf-4942-83a0-cb6139413a8e", "address": "fa:16:3e:c8:71:80", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44437e9e-7b", "ovs_interfaceid": "44437e9e-7bcf-4942-83a0-cb6139413a8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.380 183079 DEBUG nova.network.os_vif_util [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Converting VIF {"id": "44437e9e-7bcf-4942-83a0-cb6139413a8e", "address": "fa:16:3e:c8:71:80", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44437e9e-7b", "ovs_interfaceid": "44437e9e-7bcf-4942-83a0-cb6139413a8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.380 183079 DEBUG nova.network.os_vif_util [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c8:71:80,bridge_name='br-int',has_traffic_filtering=True,id=44437e9e-7bcf-4942-83a0-cb6139413a8e,network=Network(473b4e99-4018-4fa7-ab1c-2d3e7944d850),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap44437e9e-7b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.381 183079 DEBUG nova.objects.instance [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lazy-loading 'pci_devices' on Instance uuid effaddee-27ef-49f6-ac5f-2e3258c8d5d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.399 183079 DEBUG nova.virt.libvirt.driver [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:07:10 compute-0 nova_compute[183075]:   <uuid>effaddee-27ef-49f6-ac5f-2e3258c8d5d2</uuid>
Jan 22 17:07:10 compute-0 nova_compute[183075]:   <name>instance-00000003</name>
Jan 22 17:07:10 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:07:10 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:07:10 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:07:10 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1783669850</nova:name>
Jan 22 17:07:10 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:07:10</nova:creationTime>
Jan 22 17:07:10 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:07:10 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:07:10 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:07:10 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:07:10 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:07:10 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:07:10 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:07:10 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:07:10 compute-0 nova_compute[183075]:         <nova:user uuid="d7ee6c51c2b8447baefccea20fa16de5">tempest-FloatingIPPortDetailsTest-1812576526-project-member</nova:user>
Jan 22 17:07:10 compute-0 nova_compute[183075]:         <nova:project uuid="eb8e9f7a891a4a38af8b01557eddc991">tempest-FloatingIPPortDetailsTest-1812576526</nova:project>
Jan 22 17:07:10 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:07:10 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:07:10 compute-0 nova_compute[183075]:         <nova:port uuid="44437e9e-7bcf-4942-83a0-cb6139413a8e">
Jan 22 17:07:10 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:07:10 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:07:10 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:07:10 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <system>
Jan 22 17:07:10 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:07:10 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:07:10 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:07:10 compute-0 nova_compute[183075]:       <entry name="serial">effaddee-27ef-49f6-ac5f-2e3258c8d5d2</entry>
Jan 22 17:07:10 compute-0 nova_compute[183075]:       <entry name="uuid">effaddee-27ef-49f6-ac5f-2e3258c8d5d2</entry>
Jan 22 17:07:10 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     </system>
Jan 22 17:07:10 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:07:10 compute-0 nova_compute[183075]:   <os>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:   </os>
Jan 22 17:07:10 compute-0 nova_compute[183075]:   <features>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:   </features>
Jan 22 17:07:10 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:07:10 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:07:10 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:07:10 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/effaddee-27ef-49f6-ac5f-2e3258c8d5d2/disk"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:07:10 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:c8:71:80"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:       <target dev="tap44437e9e-7b"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:07:10 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/effaddee-27ef-49f6-ac5f-2e3258c8d5d2/console.log" append="off"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <video>
Jan 22 17:07:10 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     </video>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:07:10 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:07:10 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:07:10 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:07:10 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:07:10 compute-0 nova_compute[183075]: </domain>
Jan 22 17:07:10 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.401 183079 DEBUG nova.compute.manager [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Preparing to wait for external event network-vif-plugged-44437e9e-7bcf-4942-83a0-cb6139413a8e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.401 183079 DEBUG oslo_concurrency.lockutils [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Acquiring lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.401 183079 DEBUG oslo_concurrency.lockutils [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.402 183079 DEBUG oslo_concurrency.lockutils [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.402 183079 DEBUG nova.virt.libvirt.vif [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:07:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1783669850',display_name='tempest-server-test-1783669850',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1783669850',id=3,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCYZfzAO5Ui0C6aNyi2ae5nossybaMPhlAmp9Zj0dL/LunopVV9meBl8qfqrR0u5rqBGUC3w2LWkiMuhmtUWjdFFHsn2/jcRm/feYqTpe58YE0uSBcC8gJ0hMPoRirOIAg==',key_name='tempest-keypair-test-1166356111',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='eb8e9f7a891a4a38af8b01557eddc991',ramdisk_id='',reservation_id='r-89frdmqf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_r
ng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPPortDetailsTest-1812576526',owner_user_name='tempest-FloatingIPPortDetailsTest-1812576526-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:07:07Z,user_data=None,user_id='d7ee6c51c2b8447baefccea20fa16de5',uuid=effaddee-27ef-49f6-ac5f-2e3258c8d5d2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "44437e9e-7bcf-4942-83a0-cb6139413a8e", "address": "fa:16:3e:c8:71:80", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44437e9e-7b", "ovs_interfaceid": "44437e9e-7bcf-4942-83a0-cb6139413a8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.403 183079 DEBUG nova.network.os_vif_util [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Converting VIF {"id": "44437e9e-7bcf-4942-83a0-cb6139413a8e", "address": "fa:16:3e:c8:71:80", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44437e9e-7b", "ovs_interfaceid": "44437e9e-7bcf-4942-83a0-cb6139413a8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.403 183079 DEBUG nova.network.os_vif_util [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c8:71:80,bridge_name='br-int',has_traffic_filtering=True,id=44437e9e-7bcf-4942-83a0-cb6139413a8e,network=Network(473b4e99-4018-4fa7-ab1c-2d3e7944d850),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap44437e9e-7b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.403 183079 DEBUG os_vif [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:71:80,bridge_name='br-int',has_traffic_filtering=True,id=44437e9e-7bcf-4942-83a0-cb6139413a8e,network=Network(473b4e99-4018-4fa7-ab1c-2d3e7944d850),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap44437e9e-7b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.404 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.404 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.405 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:07:10 compute-0 sshd-session[215332]: Connection reset by authenticating user root 176.120.22.47 port 46292 [preauth]
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.407 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.407 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap44437e9e-7b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.408 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap44437e9e-7b, col_values=(('external_ids', {'iface-id': '44437e9e-7bcf-4942-83a0-cb6139413a8e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c8:71:80', 'vm-uuid': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.463 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:10 compute-0 NetworkManager[55454]: <info>  [1769101630.4651] manager: (tap44437e9e-7b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/33)
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.467 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.471 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.472 183079 INFO os_vif [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:71:80,bridge_name='br-int',has_traffic_filtering=True,id=44437e9e-7bcf-4942-83a0-cb6139413a8e,network=Network(473b4e99-4018-4fa7-ab1c-2d3e7944d850),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap44437e9e-7b')
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.537 183079 DEBUG nova.virt.libvirt.driver [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.537 183079 DEBUG nova.virt.libvirt.driver [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] No VIF found with MAC fa:16:3e:c8:71:80, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:07:10 compute-0 kernel: tap44437e9e-7b: entered promiscuous mode
Jan 22 17:07:10 compute-0 NetworkManager[55454]: <info>  [1769101630.5786] manager: (tap44437e9e-7b): new Tun device (/org/freedesktop/NetworkManager/Devices/34)
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.580 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:10 compute-0 ovn_controller[95372]: 2026-01-22T17:07:10Z|00042|binding|INFO|Claiming lport 44437e9e-7bcf-4942-83a0-cb6139413a8e for this chassis.
Jan 22 17:07:10 compute-0 ovn_controller[95372]: 2026-01-22T17:07:10Z|00043|binding|INFO|44437e9e-7bcf-4942-83a0-cb6139413a8e: Claiming fa:16:3e:c8:71:80 10.100.0.6
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:10.587 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:71:80 10.100.0.6'], port_security=['fa:16:3e:c8:71:80 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-473b4e99-4018-4fa7-ab1c-2d3e7944d850', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b0c128dc-3c6d-4d32-ac6a-884653522196', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a8cd0033-3766-4772-8688-44d3d0e3250a, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=44437e9e-7bcf-4942-83a0-cb6139413a8e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:10.589 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 44437e9e-7bcf-4942-83a0-cb6139413a8e in datapath 473b4e99-4018-4fa7-ab1c-2d3e7944d850 bound to our chassis
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:10.590 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 473b4e99-4018-4fa7-ab1c-2d3e7944d850
Jan 22 17:07:10 compute-0 ovn_controller[95372]: 2026-01-22T17:07:10Z|00044|binding|INFO|Setting lport 44437e9e-7bcf-4942-83a0-cb6139413a8e ovn-installed in OVS
Jan 22 17:07:10 compute-0 ovn_controller[95372]: 2026-01-22T17:07:10Z|00045|binding|INFO|Setting lport 44437e9e-7bcf-4942-83a0-cb6139413a8e up in Southbound
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.593 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:10.601 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a54eba1f-6bae-431c-9014-adb3bfa65302]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.601 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:10.603 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap473b4e99-41 in ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:10.604 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap473b4e99-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:10.604 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[af2a58b6-1016-461b-ad8a-cc6d412a4d71]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:10.605 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[16cc5591-3870-4377-8b88-3c3ec71f362e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:10 compute-0 systemd-udevd[215388]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:10.617 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[c0f596fb-aec8-4e42-b7db-2c67af9859a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:10 compute-0 systemd-machined[154382]: New machine qemu-3-instance-00000003.
Jan 22 17:07:10 compute-0 NetworkManager[55454]: <info>  [1769101630.6254] device (tap44437e9e-7b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:07:10 compute-0 NetworkManager[55454]: <info>  [1769101630.6263] device (tap44437e9e-7b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:07:10 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000003.
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:10.633 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[07b225a5-1076-4ca4-9716-927f1089e4f1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:10.660 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[fe3c6214-1bf3-4ddc-a462-a01b4ce5c817]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:10 compute-0 NetworkManager[55454]: <info>  [1769101630.6692] manager: (tap473b4e99-40): new Veth device (/org/freedesktop/NetworkManager/Devices/35)
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:10.669 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[87f11a64-8e86-4fb5-ba62-b8e045d49c63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:10.704 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[d3a780d0-5d8a-4818-83ad-732512486fe0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:10.707 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[1e56e6a4-9b5b-4445-b8c6-64687d2be6ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:10 compute-0 NetworkManager[55454]: <info>  [1769101630.7367] device (tap473b4e99-40): carrier: link connected
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:10.743 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[09e46b3b-c831-4926-a261-596a3aeb4196]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:10.763 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ebe9723b-a0f8-4794-bf86-41a3fdd620fb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap473b4e99-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4d:87:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399443, 'reachable_time': 32301, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215421, 'error': None, 'target': 'ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:10.780 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1bcba0b3-6066-48f9-aa01-73e39d360134]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4d:87a4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 399443, 'tstamp': 399443}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215422, 'error': None, 'target': 'ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:10.797 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[65447328-2d26-49d6-b093-da56e9957c20]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap473b4e99-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4d:87:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399443, 'reachable_time': 32301, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215423, 'error': None, 'target': 'ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.820 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.821 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.822 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.822 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:10.838 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ec417501-471b-4dbc-b071-7cb0747d1f55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:10.909 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e6639900-9d9d-41cd-bd72-9084ea693c25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:10.910 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap473b4e99-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:10.911 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:10.911 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap473b4e99-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.914 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:10 compute-0 NetworkManager[55454]: <info>  [1769101630.9148] manager: (tap473b4e99-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Jan 22 17:07:10 compute-0 kernel: tap473b4e99-40: entered promiscuous mode
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.916 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:10.920 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap473b4e99-40, col_values=(('external_ids', {'iface-id': '424ac40e-403e-4504-adbb-47a319b401fd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.921 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:10 compute-0 ovn_controller[95372]: 2026-01-22T17:07:10Z|00046|binding|INFO|Releasing lport 424ac40e-403e-4504-adbb-47a319b401fd from this chassis (sb_readonly=0)
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.922 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:10.924 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/473b4e99-4018-4fa7-ab1c-2d3e7944d850.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/473b4e99-4018-4fa7-ab1c-2d3e7944d850.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:10.935 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[360b46cf-651e-4a86-96e5-c5c5068aaeb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.935 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bd249764-12e4-4e25-9445-dd6e132ca53c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:10.936 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-473b4e99-4018-4fa7-ab1c-2d3e7944d850
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/473b4e99-4018-4fa7-ab1c-2d3e7944d850.pid.haproxy
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 473b4e99-4018-4fa7-ab1c-2d3e7944d850
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:07:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:10.937 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850', 'env', 'PROCESS_TAG=haproxy-473b4e99-4018-4fa7-ab1c-2d3e7944d850', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/473b4e99-4018-4fa7-ab1c-2d3e7944d850.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:07:10 compute-0 nova_compute[183075]: 2026-01-22 17:07:10.953 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.029 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bd249764-12e4-4e25-9445-dd6e132ca53c/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.030 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bd249764-12e4-4e25-9445-dd6e132ca53c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.086 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bd249764-12e4-4e25-9445-dd6e132ca53c/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.093 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/effaddee-27ef-49f6-ac5f-2e3258c8d5d2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.160 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/effaddee-27ef-49f6-ac5f-2e3258c8d5d2/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.169 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/effaddee-27ef-49f6-ac5f-2e3258c8d5d2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.191 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101631.180286, effaddee-27ef-49f6-ac5f-2e3258c8d5d2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.192 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] VM Started (Lifecycle Event)
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.212 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.216 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101631.180831, effaddee-27ef-49f6-ac5f-2e3258c8d5d2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.217 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] VM Paused (Lifecycle Event)
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.231 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.235 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/effaddee-27ef-49f6-ac5f-2e3258c8d5d2/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.235 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.251 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.254 183079 DEBUG nova.compute.manager [req-f44c0777-4b34-43d8-aba9-df217403aeae req-ba6525f5-0ae5-4633-a0af-c7a16a161163 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Received event network-vif-plugged-44437e9e-7bcf-4942-83a0-cb6139413a8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.255 183079 DEBUG oslo_concurrency.lockutils [req-f44c0777-4b34-43d8-aba9-df217403aeae req-ba6525f5-0ae5-4633-a0af-c7a16a161163 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.255 183079 DEBUG oslo_concurrency.lockutils [req-f44c0777-4b34-43d8-aba9-df217403aeae req-ba6525f5-0ae5-4633-a0af-c7a16a161163 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.256 183079 DEBUG oslo_concurrency.lockutils [req-f44c0777-4b34-43d8-aba9-df217403aeae req-ba6525f5-0ae5-4633-a0af-c7a16a161163 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.256 183079 DEBUG nova.compute.manager [req-f44c0777-4b34-43d8-aba9-df217403aeae req-ba6525f5-0ae5-4633-a0af-c7a16a161163 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Processing event network-vif-plugged-44437e9e-7bcf-4942-83a0-cb6139413a8e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.256 183079 DEBUG nova.compute.manager [req-f44c0777-4b34-43d8-aba9-df217403aeae req-ba6525f5-0ae5-4633-a0af-c7a16a161163 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Received event network-vif-plugged-44437e9e-7bcf-4942-83a0-cb6139413a8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.256 183079 DEBUG oslo_concurrency.lockutils [req-f44c0777-4b34-43d8-aba9-df217403aeae req-ba6525f5-0ae5-4633-a0af-c7a16a161163 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.257 183079 DEBUG oslo_concurrency.lockutils [req-f44c0777-4b34-43d8-aba9-df217403aeae req-ba6525f5-0ae5-4633-a0af-c7a16a161163 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.257 183079 DEBUG oslo_concurrency.lockutils [req-f44c0777-4b34-43d8-aba9-df217403aeae req-ba6525f5-0ae5-4633-a0af-c7a16a161163 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.257 183079 DEBUG nova.compute.manager [req-f44c0777-4b34-43d8-aba9-df217403aeae req-ba6525f5-0ae5-4633-a0af-c7a16a161163 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] No waiting events found dispatching network-vif-plugged-44437e9e-7bcf-4942-83a0-cb6139413a8e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.257 183079 WARNING nova.compute.manager [req-f44c0777-4b34-43d8-aba9-df217403aeae req-ba6525f5-0ae5-4633-a0af-c7a16a161163 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Received unexpected event network-vif-plugged-44437e9e-7bcf-4942-83a0-cb6139413a8e for instance with vm_state building and task_state spawning.
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.258 183079 DEBUG nova.compute.manager [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.261 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101631.2615163, effaddee-27ef-49f6-ac5f-2e3258c8d5d2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.262 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] VM Resumed (Lifecycle Event)
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.263 183079 DEBUG nova.virt.libvirt.driver [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.266 183079 INFO nova.virt.libvirt.driver [-] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Instance spawned successfully.
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.267 183079 DEBUG nova.virt.libvirt.driver [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.285 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.288 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.302 183079 DEBUG nova.virt.libvirt.driver [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.303 183079 DEBUG nova.virt.libvirt.driver [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.303 183079 DEBUG nova.virt.libvirt.driver [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.303 183079 DEBUG nova.virt.libvirt.driver [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.304 183079 DEBUG nova.virt.libvirt.driver [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.304 183079 DEBUG nova.virt.libvirt.driver [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.315 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:07:11 compute-0 podman[215475]: 2026-01-22 17:07:11.342201098 +0000 UTC m=+0.053363596 container create d57a73aad38b21a6f0273d0b23e99556d1dbeb69953d3deb8b92141d2bdbe91a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.379 183079 INFO nova.compute.manager [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Took 4.33 seconds to spawn the instance on the hypervisor.
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.381 183079 DEBUG nova.compute.manager [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:07:11 compute-0 systemd[1]: Started libpod-conmon-d57a73aad38b21a6f0273d0b23e99556d1dbeb69953d3deb8b92141d2bdbe91a.scope.
Jan 22 17:07:11 compute-0 podman[215475]: 2026-01-22 17:07:11.313478957 +0000 UTC m=+0.024641475 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:07:11 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:07:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/18797a17e2baf24a32ea7ca8564085167b02b4354113699b0ae9251360e34b34/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:07:11 compute-0 podman[215475]: 2026-01-22 17:07:11.433599547 +0000 UTC m=+0.144762055 container init d57a73aad38b21a6f0273d0b23e99556d1dbeb69953d3deb8b92141d2bdbe91a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 22 17:07:11 compute-0 podman[215475]: 2026-01-22 17:07:11.442823558 +0000 UTC m=+0.153986056 container start d57a73aad38b21a6f0273d0b23e99556d1dbeb69953d3deb8b92141d2bdbe91a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.448 183079 INFO nova.compute.manager [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Took 4.81 seconds to build instance.
Jan 22 17:07:11 compute-0 neutron-haproxy-ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215490]: [NOTICE]   (215495) : New worker (215497) forked
Jan 22 17:07:11 compute-0 neutron-haproxy-ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215490]: [NOTICE]   (215495) : Loading success.
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.469 183079 DEBUG oslo_concurrency.lockutils [None req-b9b5ed06-7874-4bbe-914b-552351a0cd6c d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.901s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.495 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.496 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5582MB free_disk=73.35426330566406GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.496 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.496 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.554 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance bd249764-12e4-4e25-9445-dd6e132ca53c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.555 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance effaddee-27ef-49f6-ac5f-2e3258c8d5d2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.555 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.555 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.613 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.628 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.654 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:07:11 compute-0 nova_compute[183075]: 2026-01-22 17:07:11.654 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:12 compute-0 nova_compute[183075]: 2026-01-22 17:07:12.089 183079 DEBUG nova.network.neutron [req-98733be2-b783-493d-b016-a6c12f8a24c6 req-ec7ab120-8e94-4105-9696-f484eb6fea15 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Updated VIF entry in instance network info cache for port 44437e9e-7bcf-4942-83a0-cb6139413a8e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:07:12 compute-0 nova_compute[183075]: 2026-01-22 17:07:12.090 183079 DEBUG nova.network.neutron [req-98733be2-b783-493d-b016-a6c12f8a24c6 req-ec7ab120-8e94-4105-9696-f484eb6fea15 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Updating instance_info_cache with network_info: [{"id": "44437e9e-7bcf-4942-83a0-cb6139413a8e", "address": "fa:16:3e:c8:71:80", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44437e9e-7b", "ovs_interfaceid": "44437e9e-7bcf-4942-83a0-cb6139413a8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:07:12 compute-0 nova_compute[183075]: 2026-01-22 17:07:12.109 183079 DEBUG oslo_concurrency.lockutils [req-98733be2-b783-493d-b016-a6c12f8a24c6 req-ec7ab120-8e94-4105-9696-f484eb6fea15 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-effaddee-27ef-49f6-ac5f-2e3258c8d5d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:07:12 compute-0 podman[215507]: 2026-01-22 17:07:12.377862749 +0000 UTC m=+0.077149198 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 17:07:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:12.517 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:12.518 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:07:12 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:12 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:12 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:12 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:12 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:12 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:07:12 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ca57ee46-b6e8-4b60-affe-0c1349cb8abe __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:12 compute-0 nova_compute[183075]: 2026-01-22 17:07:12.875 183079 INFO nova.compute.manager [None req-95d31f23-30bd-4aad-acfc-f09d7da07ede d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Get console output
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.120 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.120 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.6026585
Jan 22 17:07:13 compute-0 haproxy-metadata-proxy-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215270]: 10.100.0.12:51068 [22/Jan/2026:17:07:12.516] listener listener/metadata 0/0/0/604/604 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.131 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.133 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ca57ee46-b6e8-4b60-affe-0c1349cb8abe __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.164 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.165 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 167 time: 0.0324628
Jan 22 17:07:13 compute-0 haproxy-metadata-proxy-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215270]: 10.100.0.12:51082 [22/Jan/2026:17:07:13.130] listener listener/metadata 0/0/0/34/34 200 151 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.170 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.171 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ca57ee46-b6e8-4b60-affe-0c1349cb8abe __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.184 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:13 compute-0 haproxy-metadata-proxy-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215270]: 10.100.0.12:51094 [22/Jan/2026:17:07:13.170] listener listener/metadata 0/0/0/14/14 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.185 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0142510
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.195 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.196 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ca57ee46-b6e8-4b60-affe-0c1349cb8abe __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.211 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.212 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0164628
Jan 22 17:07:13 compute-0 haproxy-metadata-proxy-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215270]: 10.100.0.12:51104 [22/Jan/2026:17:07:13.194] listener listener/metadata 0/0/0/18/18 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.221 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.221 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ca57ee46-b6e8-4b60-affe-0c1349cb8abe __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.248 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.249 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0276937
Jan 22 17:07:13 compute-0 haproxy-metadata-proxy-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215270]: 10.100.0.12:51118 [22/Jan/2026:17:07:13.220] listener listener/metadata 0/0/0/29/29 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.256 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.257 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ca57ee46-b6e8-4b60-affe-0c1349cb8abe __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.279 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.279 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0225689
Jan 22 17:07:13 compute-0 haproxy-metadata-proxy-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215270]: 10.100.0.12:51132 [22/Jan/2026:17:07:13.255] listener listener/metadata 0/0/0/23/23 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.286 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.287 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ca57ee46-b6e8-4b60-affe-0c1349cb8abe __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.310 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.311 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0238211
Jan 22 17:07:13 compute-0 haproxy-metadata-proxy-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215270]: 10.100.0.12:51146 [22/Jan/2026:17:07:13.285] listener listener/metadata 0/0/0/25/25 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.318 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.318 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ca57ee46-b6e8-4b60-affe-0c1349cb8abe __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.340 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.341 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 151 time: 0.0223856
Jan 22 17:07:13 compute-0 haproxy-metadata-proxy-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215270]: 10.100.0.12:51152 [22/Jan/2026:17:07:13.317] listener listener/metadata 0/0/0/23/23 200 135 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.347 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.348 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ca57ee46-b6e8-4b60-affe-0c1349cb8abe __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.368 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.369 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0209422
Jan 22 17:07:13 compute-0 haproxy-metadata-proxy-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215270]: 10.100.0.12:51156 [22/Jan/2026:17:07:13.346] listener listener/metadata 0/0/0/22/22 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.374 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.375 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ca57ee46-b6e8-4b60-affe-0c1349cb8abe __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.394 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:13 compute-0 haproxy-metadata-proxy-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215270]: 10.100.0.12:51172 [22/Jan/2026:17:07:13.374] listener listener/metadata 0/0/0/21/21 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.395 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0194550
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.402 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.403 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ca57ee46-b6e8-4b60-affe-0c1349cb8abe __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:13 compute-0 haproxy-metadata-proxy-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215270]: 10.100.0.12:51184 [22/Jan/2026:17:07:13.401] listener listener/metadata 0/0/0/18/18 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.420 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0167899
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.429 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.429 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ca57ee46-b6e8-4b60-affe-0c1349cb8abe __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.448 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.448 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0191996
Jan 22 17:07:13 compute-0 haproxy-metadata-proxy-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215270]: 10.100.0.12:51190 [22/Jan/2026:17:07:13.428] listener listener/metadata 0/0/0/20/20 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.454 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.455 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ca57ee46-b6e8-4b60-affe-0c1349cb8abe __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.470 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:13 compute-0 haproxy-metadata-proxy-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215270]: 10.100.0.12:51204 [22/Jan/2026:17:07:13.453] listener listener/metadata 0/0/0/17/17 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.471 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0162187
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.476 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.477 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ca57ee46-b6e8-4b60-affe-0c1349cb8abe __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.491 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:13 compute-0 haproxy-metadata-proxy-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215270]: 10.100.0.12:51210 [22/Jan/2026:17:07:13.475] listener listener/metadata 0/0/0/16/16 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.491 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0147326
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.496 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.496 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ca57ee46-b6e8-4b60-affe-0c1349cb8abe __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.511 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:13 compute-0 haproxy-metadata-proxy-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215270]: 10.100.0.12:51218 [22/Jan/2026:17:07:13.495] listener listener/metadata 0/0/0/16/16 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.512 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0158777
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.517 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.518 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ca57ee46-b6e8-4b60-affe-0c1349cb8abe __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.541 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:13.541 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0235777
Jan 22 17:07:13 compute-0 haproxy-metadata-proxy-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215270]: 10.100.0.12:51222 [22/Jan/2026:17:07:13.516] listener listener/metadata 0/0/0/25/25 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:07:14 compute-0 sshd-session[215351]: Connection reset by authenticating user root 176.120.22.47 port 46294 [preauth]
Jan 22 17:07:14 compute-0 sshd-session[215352]: Connection reset by authenticating user root 176.120.22.47 port 46302 [preauth]
Jan 22 17:07:14 compute-0 nova_compute[183075]: 2026-01-22 17:07:14.494 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:14 compute-0 nova_compute[183075]: 2026-01-22 17:07:14.650 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:07:15 compute-0 nova_compute[183075]: 2026-01-22 17:07:15.515 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:15 compute-0 nova_compute[183075]: 2026-01-22 17:07:15.610 183079 INFO nova.compute.manager [None req-b30c5082-8d6b-4efa-b439-1342c2f5e25a c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Get console output
Jan 22 17:07:15 compute-0 nova_compute[183075]: 2026-01-22 17:07:15.618 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:07:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:15.923 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:93:d5:42 10.100.0.18 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-58a8796f-cab9-4555-b38b-f7cfc6f7c89d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58a8796f-cab9-4555-b38b-f7cfc6f7c89d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '407e1eda04f54c6f9a53644846eca741', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d2f72c1-5c17-4a03-88a5-7ceb3ec192b9, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3bea80e2-de47-424d-bbb3-ba80d091df18) old=Port_Binding(mac=['fa:16:3e:93:d5:42 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-58a8796f-cab9-4555-b38b-f7cfc6f7c89d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58a8796f-cab9-4555-b38b-f7cfc6f7c89d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '407e1eda04f54c6f9a53644846eca741', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:07:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:15.926 104629 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3bea80e2-de47-424d-bbb3-ba80d091df18 in datapath 58a8796f-cab9-4555-b38b-f7cfc6f7c89d updated
Jan 22 17:07:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:15.930 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 58a8796f-cab9-4555-b38b-f7cfc6f7c89d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:07:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:15.932 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2f033d66-f995-4965-8540-46ed00517c78]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:16 compute-0 sshd-session[215393]: Connection reset by authenticating user root 176.120.22.47 port 60806 [preauth]
Jan 22 17:07:16 compute-0 sshd-session[215492]: Connection reset by authenticating user root 176.120.22.47 port 60820 [preauth]
Jan 22 17:07:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:17.367 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:93:d5:42 10.100.0.18 10.100.0.2 10.100.0.34'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28 10.100.0.34/28', 'neutron:device_id': 'ovnmeta-58a8796f-cab9-4555-b38b-f7cfc6f7c89d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58a8796f-cab9-4555-b38b-f7cfc6f7c89d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '407e1eda04f54c6f9a53644846eca741', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d2f72c1-5c17-4a03-88a5-7ceb3ec192b9, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3bea80e2-de47-424d-bbb3-ba80d091df18) old=Port_Binding(mac=['fa:16:3e:93:d5:42 10.100.0.18 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-58a8796f-cab9-4555-b38b-f7cfc6f7c89d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58a8796f-cab9-4555-b38b-f7cfc6f7c89d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '407e1eda04f54c6f9a53644846eca741', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:07:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:17.371 104629 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3bea80e2-de47-424d-bbb3-ba80d091df18 in datapath 58a8796f-cab9-4555-b38b-f7cfc6f7c89d updated
Jan 22 17:07:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:17.374 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 58a8796f-cab9-4555-b38b-f7cfc6f7c89d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:07:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:17.376 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c9d79ac5-db16-49ad-8459-52332581a065]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:18 compute-0 nova_compute[183075]: 2026-01-22 17:07:18.164 183079 INFO nova.compute.manager [None req-6f6e8a24-4130-4512-a60c-ca61940c302b d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Get console output
Jan 22 17:07:18 compute-0 podman[215538]: 2026-01-22 17:07:18.410460199 +0000 UTC m=+0.102345077 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 17:07:18 compute-0 podman[215537]: 2026-01-22 17:07:18.458407163 +0000 UTC m=+0.150667281 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 17:07:19 compute-0 sshd-session[215532]: Connection reset by authenticating user root 176.120.22.47 port 60834 [preauth]
Jan 22 17:07:19 compute-0 sshd-session[215531]: Connection reset by authenticating user root 176.120.22.47 port 60824 [preauth]
Jan 22 17:07:19 compute-0 nova_compute[183075]: 2026-01-22 17:07:19.502 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:20 compute-0 podman[215585]: 2026-01-22 17:07:20.402388236 +0000 UTC m=+0.102161972 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, config_id=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 22 17:07:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:20.478 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:93:d5:42 10.100.0.18 10.100.0.2 10.100.0.34 10.100.0.50'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28 10.100.0.34/28 10.100.0.50/28', 'neutron:device_id': 'ovnmeta-58a8796f-cab9-4555-b38b-f7cfc6f7c89d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58a8796f-cab9-4555-b38b-f7cfc6f7c89d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '407e1eda04f54c6f9a53644846eca741', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d2f72c1-5c17-4a03-88a5-7ceb3ec192b9, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3bea80e2-de47-424d-bbb3-ba80d091df18) old=Port_Binding(mac=['fa:16:3e:93:d5:42 10.100.0.18 10.100.0.2 10.100.0.34'], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28 10.100.0.34/28', 'neutron:device_id': 'ovnmeta-58a8796f-cab9-4555-b38b-f7cfc6f7c89d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58a8796f-cab9-4555-b38b-f7cfc6f7c89d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '407e1eda04f54c6f9a53644846eca741', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:07:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:20.480 104629 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3bea80e2-de47-424d-bbb3-ba80d091df18 in datapath 58a8796f-cab9-4555-b38b-f7cfc6f7c89d updated
Jan 22 17:07:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:20.481 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 58a8796f-cab9-4555-b38b-f7cfc6f7c89d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:07:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:20.483 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3db680e8-3002-4cea-b635-b6d3f72f04ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:20 compute-0 nova_compute[183075]: 2026-01-22 17:07:20.519 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:21 compute-0 sshd-session[215536]: Connection reset by authenticating user root 176.120.22.47 port 60854 [preauth]
Jan 22 17:07:22 compute-0 nova_compute[183075]: 2026-01-22 17:07:22.130 183079 DEBUG oslo_concurrency.lockutils [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Acquiring lock "b9680bfd-e87f-427c-8f13-2b3a415aca39" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:22 compute-0 nova_compute[183075]: 2026-01-22 17:07:22.130 183079 DEBUG oslo_concurrency.lockutils [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "b9680bfd-e87f-427c-8f13-2b3a415aca39" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:22 compute-0 nova_compute[183075]: 2026-01-22 17:07:22.147 183079 DEBUG nova.compute.manager [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:07:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:22.195 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:93:d5:42 10.100.0.18 10.100.0.2 10.100.0.34 10.100.0.50 10.100.0.66'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28 10.100.0.34/28 10.100.0.50/28 10.100.0.66/28', 'neutron:device_id': 'ovnmeta-58a8796f-cab9-4555-b38b-f7cfc6f7c89d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58a8796f-cab9-4555-b38b-f7cfc6f7c89d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '407e1eda04f54c6f9a53644846eca741', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d2f72c1-5c17-4a03-88a5-7ceb3ec192b9, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3bea80e2-de47-424d-bbb3-ba80d091df18) old=Port_Binding(mac=['fa:16:3e:93:d5:42 10.100.0.18 10.100.0.2 10.100.0.34 10.100.0.50'], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28 10.100.0.34/28 10.100.0.50/28', 'neutron:device_id': 'ovnmeta-58a8796f-cab9-4555-b38b-f7cfc6f7c89d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58a8796f-cab9-4555-b38b-f7cfc6f7c89d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '407e1eda04f54c6f9a53644846eca741', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:07:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:22.196 104629 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3bea80e2-de47-424d-bbb3-ba80d091df18 in datapath 58a8796f-cab9-4555-b38b-f7cfc6f7c89d updated
Jan 22 17:07:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:22.197 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 58a8796f-cab9-4555-b38b-f7cfc6f7c89d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:07:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:22.198 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3756bb82-8a9a-497f-8c3d-661084a1a02e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:22 compute-0 nova_compute[183075]: 2026-01-22 17:07:22.240 183079 DEBUG oslo_concurrency.lockutils [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:22 compute-0 nova_compute[183075]: 2026-01-22 17:07:22.241 183079 DEBUG oslo_concurrency.lockutils [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:22 compute-0 nova_compute[183075]: 2026-01-22 17:07:22.248 183079 DEBUG nova.virt.hardware [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:07:22 compute-0 nova_compute[183075]: 2026-01-22 17:07:22.248 183079 INFO nova.compute.claims [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:07:22 compute-0 nova_compute[183075]: 2026-01-22 17:07:22.433 183079 DEBUG nova.compute.provider_tree [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:07:22 compute-0 nova_compute[183075]: 2026-01-22 17:07:22.453 183079 DEBUG nova.scheduler.client.report [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:07:22 compute-0 nova_compute[183075]: 2026-01-22 17:07:22.486 183079 DEBUG oslo_concurrency.lockutils [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.245s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:22 compute-0 nova_compute[183075]: 2026-01-22 17:07:22.488 183079 DEBUG nova.compute.manager [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:07:22 compute-0 nova_compute[183075]: 2026-01-22 17:07:22.544 183079 DEBUG nova.compute.manager [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:07:22 compute-0 nova_compute[183075]: 2026-01-22 17:07:22.545 183079 DEBUG nova.network.neutron [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:07:22 compute-0 nova_compute[183075]: 2026-01-22 17:07:22.572 183079 INFO nova.virt.libvirt.driver [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:07:22 compute-0 nova_compute[183075]: 2026-01-22 17:07:22.592 183079 DEBUG nova.compute.manager [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:07:22 compute-0 nova_compute[183075]: 2026-01-22 17:07:22.718 183079 DEBUG nova.compute.manager [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:07:22 compute-0 nova_compute[183075]: 2026-01-22 17:07:22.720 183079 DEBUG nova.virt.libvirt.driver [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:07:22 compute-0 nova_compute[183075]: 2026-01-22 17:07:22.721 183079 INFO nova.virt.libvirt.driver [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Creating image(s)
Jan 22 17:07:22 compute-0 nova_compute[183075]: 2026-01-22 17:07:22.722 183079 DEBUG oslo_concurrency.lockutils [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Acquiring lock "/var/lib/nova/instances/b9680bfd-e87f-427c-8f13-2b3a415aca39/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:22 compute-0 nova_compute[183075]: 2026-01-22 17:07:22.722 183079 DEBUG oslo_concurrency.lockutils [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "/var/lib/nova/instances/b9680bfd-e87f-427c-8f13-2b3a415aca39/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:22 compute-0 nova_compute[183075]: 2026-01-22 17:07:22.723 183079 DEBUG oslo_concurrency.lockutils [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "/var/lib/nova/instances/b9680bfd-e87f-427c-8f13-2b3a415aca39/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:22 compute-0 nova_compute[183075]: 2026-01-22 17:07:22.738 183079 DEBUG oslo_concurrency.processutils [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:07:22 compute-0 nova_compute[183075]: 2026-01-22 17:07:22.828 183079 DEBUG oslo_concurrency.processutils [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:07:22 compute-0 nova_compute[183075]: 2026-01-22 17:07:22.829 183079 DEBUG oslo_concurrency.lockutils [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:22 compute-0 nova_compute[183075]: 2026-01-22 17:07:22.830 183079 DEBUG oslo_concurrency.lockutils [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:22 compute-0 nova_compute[183075]: 2026-01-22 17:07:22.844 183079 DEBUG oslo_concurrency.processutils [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:07:22 compute-0 nova_compute[183075]: 2026-01-22 17:07:22.913 183079 DEBUG oslo_concurrency.processutils [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:07:22 compute-0 nova_compute[183075]: 2026-01-22 17:07:22.914 183079 DEBUG oslo_concurrency.processutils [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/b9680bfd-e87f-427c-8f13-2b3a415aca39/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:07:22 compute-0 nova_compute[183075]: 2026-01-22 17:07:22.943 183079 DEBUG oslo_concurrency.processutils [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/b9680bfd-e87f-427c-8f13-2b3a415aca39/disk 1073741824" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:07:22 compute-0 nova_compute[183075]: 2026-01-22 17:07:22.944 183079 DEBUG oslo_concurrency.lockutils [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:22 compute-0 nova_compute[183075]: 2026-01-22 17:07:22.945 183079 DEBUG oslo_concurrency.processutils [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:07:23 compute-0 nova_compute[183075]: 2026-01-22 17:07:23.007 183079 DEBUG oslo_concurrency.processutils [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:07:23 compute-0 nova_compute[183075]: 2026-01-22 17:07:23.009 183079 DEBUG nova.virt.disk.api [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Checking if we can resize image /var/lib/nova/instances/b9680bfd-e87f-427c-8f13-2b3a415aca39/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:07:23 compute-0 nova_compute[183075]: 2026-01-22 17:07:23.009 183079 DEBUG oslo_concurrency.processutils [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9680bfd-e87f-427c-8f13-2b3a415aca39/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:07:23 compute-0 nova_compute[183075]: 2026-01-22 17:07:23.060 183079 DEBUG oslo_concurrency.processutils [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9680bfd-e87f-427c-8f13-2b3a415aca39/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:07:23 compute-0 nova_compute[183075]: 2026-01-22 17:07:23.062 183079 DEBUG nova.virt.disk.api [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Cannot resize image /var/lib/nova/instances/b9680bfd-e87f-427c-8f13-2b3a415aca39/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:07:23 compute-0 nova_compute[183075]: 2026-01-22 17:07:23.062 183079 DEBUG nova.objects.instance [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lazy-loading 'migration_context' on Instance uuid b9680bfd-e87f-427c-8f13-2b3a415aca39 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:07:23 compute-0 nova_compute[183075]: 2026-01-22 17:07:23.082 183079 DEBUG nova.virt.libvirt.driver [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:07:23 compute-0 nova_compute[183075]: 2026-01-22 17:07:23.083 183079 DEBUG nova.virt.libvirt.driver [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Ensure instance console log exists: /var/lib/nova/instances/b9680bfd-e87f-427c-8f13-2b3a415aca39/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:07:23 compute-0 nova_compute[183075]: 2026-01-22 17:07:23.083 183079 DEBUG oslo_concurrency.lockutils [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:23 compute-0 nova_compute[183075]: 2026-01-22 17:07:23.084 183079 DEBUG oslo_concurrency.lockutils [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:23 compute-0 nova_compute[183075]: 2026-01-22 17:07:23.084 183079 DEBUG oslo_concurrency.lockutils [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:23 compute-0 sshd-session[215583]: Connection reset by authenticating user root 176.120.22.47 port 60864 [preauth]
Jan 22 17:07:23 compute-0 nova_compute[183075]: 2026-01-22 17:07:23.182 183079 DEBUG nova.policy [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c9cdd80799a74efb8ce82cfb5148ac89', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cc642b97aa4e4886902a0d1233877b88', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:07:23 compute-0 nova_compute[183075]: 2026-01-22 17:07:23.375 183079 INFO nova.compute.manager [None req-abb672af-2fcf-480a-a36f-e98bb7f933fb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Get console output
Jan 22 17:07:23 compute-0 nova_compute[183075]: 2026-01-22 17:07:23.379 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:07:23 compute-0 nova_compute[183075]: 2026-01-22 17:07:23.922 183079 DEBUG nova.network.neutron [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Successfully updated port: 4b2da161-220a-41f0-b689-1c6e25e8b81a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:07:23 compute-0 nova_compute[183075]: 2026-01-22 17:07:23.935 183079 DEBUG oslo_concurrency.lockutils [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Acquiring lock "refresh_cache-b9680bfd-e87f-427c-8f13-2b3a415aca39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:07:23 compute-0 nova_compute[183075]: 2026-01-22 17:07:23.935 183079 DEBUG oslo_concurrency.lockutils [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Acquired lock "refresh_cache-b9680bfd-e87f-427c-8f13-2b3a415aca39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:07:23 compute-0 nova_compute[183075]: 2026-01-22 17:07:23.936 183079 DEBUG nova.network.neutron [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:07:23 compute-0 ovn_controller[95372]: 2026-01-22T17:07:23Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c8:71:80 10.100.0.6
Jan 22 17:07:23 compute-0 ovn_controller[95372]: 2026-01-22T17:07:23Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c8:71:80 10.100.0.6
Jan 22 17:07:24 compute-0 nova_compute[183075]: 2026-01-22 17:07:24.064 183079 DEBUG nova.compute.manager [req-ae6ae40b-6688-4102-8335-de3540be416c req-5f7f8fc8-99e7-46c3-b591-dec1b9770151 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Received event network-changed-4b2da161-220a-41f0-b689-1c6e25e8b81a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:07:24 compute-0 nova_compute[183075]: 2026-01-22 17:07:24.064 183079 DEBUG nova.compute.manager [req-ae6ae40b-6688-4102-8335-de3540be416c req-5f7f8fc8-99e7-46c3-b591-dec1b9770151 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Refreshing instance network info cache due to event network-changed-4b2da161-220a-41f0-b689-1c6e25e8b81a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:07:24 compute-0 nova_compute[183075]: 2026-01-22 17:07:24.064 183079 DEBUG oslo_concurrency.lockutils [req-ae6ae40b-6688-4102-8335-de3540be416c req-5f7f8fc8-99e7-46c3-b591-dec1b9770151 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-b9680bfd-e87f-427c-8f13-2b3a415aca39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:07:24 compute-0 nova_compute[183075]: 2026-01-22 17:07:24.504 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:24 compute-0 sshd-session[215608]: Connection reset by authenticating user root 176.120.22.47 port 56718 [preauth]
Jan 22 17:07:24 compute-0 nova_compute[183075]: 2026-01-22 17:07:24.754 183079 DEBUG nova.network.neutron [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:07:25 compute-0 nova_compute[183075]: 2026-01-22 17:07:25.522 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.063 183079 DEBUG nova.network.neutron [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Updating instance_info_cache with network_info: [{"id": "4b2da161-220a-41f0-b689-1c6e25e8b81a", "address": "fa:16:3e:22:ea:7f", "network": {"id": "ca57ee46-b6e8-4b60-affe-0c1349cb8abe", "bridge": "br-int", "label": "tempest-test-network--418767780", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cc642b97aa4e4886902a0d1233877b88", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2da161-22", "ovs_interfaceid": "4b2da161-220a-41f0-b689-1c6e25e8b81a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.087 183079 DEBUG oslo_concurrency.lockutils [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Releasing lock "refresh_cache-b9680bfd-e87f-427c-8f13-2b3a415aca39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.087 183079 DEBUG nova.compute.manager [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Instance network_info: |[{"id": "4b2da161-220a-41f0-b689-1c6e25e8b81a", "address": "fa:16:3e:22:ea:7f", "network": {"id": "ca57ee46-b6e8-4b60-affe-0c1349cb8abe", "bridge": "br-int", "label": "tempest-test-network--418767780", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cc642b97aa4e4886902a0d1233877b88", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2da161-22", "ovs_interfaceid": "4b2da161-220a-41f0-b689-1c6e25e8b81a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.087 183079 DEBUG oslo_concurrency.lockutils [req-ae6ae40b-6688-4102-8335-de3540be416c req-5f7f8fc8-99e7-46c3-b591-dec1b9770151 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-b9680bfd-e87f-427c-8f13-2b3a415aca39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.088 183079 DEBUG nova.network.neutron [req-ae6ae40b-6688-4102-8335-de3540be416c req-5f7f8fc8-99e7-46c3-b591-dec1b9770151 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Refreshing network info cache for port 4b2da161-220a-41f0-b689-1c6e25e8b81a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.090 183079 DEBUG nova.virt.libvirt.driver [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Start _get_guest_xml network_info=[{"id": "4b2da161-220a-41f0-b689-1c6e25e8b81a", "address": "fa:16:3e:22:ea:7f", "network": {"id": "ca57ee46-b6e8-4b60-affe-0c1349cb8abe", "bridge": "br-int", "label": "tempest-test-network--418767780", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cc642b97aa4e4886902a0d1233877b88", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2da161-22", "ovs_interfaceid": "4b2da161-220a-41f0-b689-1c6e25e8b81a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.096 183079 WARNING nova.virt.libvirt.driver [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.100 183079 DEBUG nova.virt.libvirt.host [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.100 183079 DEBUG nova.virt.libvirt.host [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.103 183079 DEBUG nova.virt.libvirt.host [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.104 183079 DEBUG nova.virt.libvirt.host [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.104 183079 DEBUG nova.virt.libvirt.driver [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.104 183079 DEBUG nova.virt.hardware [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.104 183079 DEBUG nova.virt.hardware [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.105 183079 DEBUG nova.virt.hardware [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.105 183079 DEBUG nova.virt.hardware [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.105 183079 DEBUG nova.virt.hardware [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.105 183079 DEBUG nova.virt.hardware [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.106 183079 DEBUG nova.virt.hardware [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.106 183079 DEBUG nova.virt.hardware [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.106 183079 DEBUG nova.virt.hardware [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.106 183079 DEBUG nova.virt.hardware [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.106 183079 DEBUG nova.virt.hardware [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.109 183079 DEBUG nova.virt.libvirt.vif [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:07:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-783174209',display_name='tempest-server-test-783174209',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-783174209',id=4,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFwDv7A3xWV6IJ9sHJD6IHIGofwjIy3SwtF7ZsSnO9i7yxDOnvofgvCRbmYwkVhe4LG2M7JC1Bh9mcomUiffvFuBa9GwDItaNN685Z4fyZXr+GZx+rbje/8Qtcf+s+bYwA==',key_name='tempest-keypair-test-76774833',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cc642b97aa4e4886902a0d1233877b88',ramdisk_id='',reservation_id='r-8jpndd40',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DefaultSnatToExternal-1301723521',owner_user_name='tempest-DefaultSnatToExternal-1301723521-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:07:22Z,user_data=None,user_id='c9cdd80799a74efb8ce82cfb5148ac89',uuid=b9680bfd-e87f-427c-8f13-2b3a415aca39,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4b2da161-220a-41f0-b689-1c6e25e8b81a", "address": "fa:16:3e:22:ea:7f", "network": {"id": "ca57ee46-b6e8-4b60-affe-0c1349cb8abe", "bridge": "br-int", "label": "tempest-test-network--418767780", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cc642b97aa4e4886902a0d1233877b88", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2da161-22", "ovs_interfaceid": "4b2da161-220a-41f0-b689-1c6e25e8b81a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.109 183079 DEBUG nova.network.os_vif_util [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Converting VIF {"id": "4b2da161-220a-41f0-b689-1c6e25e8b81a", "address": "fa:16:3e:22:ea:7f", "network": {"id": "ca57ee46-b6e8-4b60-affe-0c1349cb8abe", "bridge": "br-int", "label": "tempest-test-network--418767780", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cc642b97aa4e4886902a0d1233877b88", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2da161-22", "ovs_interfaceid": "4b2da161-220a-41f0-b689-1c6e25e8b81a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.110 183079 DEBUG nova.network.os_vif_util [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:ea:7f,bridge_name='br-int',has_traffic_filtering=True,id=4b2da161-220a-41f0-b689-1c6e25e8b81a,network=Network(ca57ee46-b6e8-4b60-affe-0c1349cb8abe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4b2da161-22') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.111 183079 DEBUG nova.objects.instance [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lazy-loading 'pci_devices' on Instance uuid b9680bfd-e87f-427c-8f13-2b3a415aca39 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.129 183079 DEBUG nova.virt.libvirt.driver [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:07:26 compute-0 nova_compute[183075]:   <uuid>b9680bfd-e87f-427c-8f13-2b3a415aca39</uuid>
Jan 22 17:07:26 compute-0 nova_compute[183075]:   <name>instance-00000004</name>
Jan 22 17:07:26 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:07:26 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:07:26 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:07:26 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-783174209</nova:name>
Jan 22 17:07:26 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:07:26</nova:creationTime>
Jan 22 17:07:26 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:07:26 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:07:26 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:07:26 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:07:26 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:07:26 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:07:26 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:07:26 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:07:26 compute-0 nova_compute[183075]:         <nova:user uuid="c9cdd80799a74efb8ce82cfb5148ac89">tempest-DefaultSnatToExternal-1301723521-project-member</nova:user>
Jan 22 17:07:26 compute-0 nova_compute[183075]:         <nova:project uuid="cc642b97aa4e4886902a0d1233877b88">tempest-DefaultSnatToExternal-1301723521</nova:project>
Jan 22 17:07:26 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:07:26 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:07:26 compute-0 nova_compute[183075]:         <nova:port uuid="4b2da161-220a-41f0-b689-1c6e25e8b81a">
Jan 22 17:07:26 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:07:26 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:07:26 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:07:26 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <system>
Jan 22 17:07:26 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:07:26 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:07:26 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:07:26 compute-0 nova_compute[183075]:       <entry name="serial">b9680bfd-e87f-427c-8f13-2b3a415aca39</entry>
Jan 22 17:07:26 compute-0 nova_compute[183075]:       <entry name="uuid">b9680bfd-e87f-427c-8f13-2b3a415aca39</entry>
Jan 22 17:07:26 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     </system>
Jan 22 17:07:26 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:07:26 compute-0 nova_compute[183075]:   <os>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:   </os>
Jan 22 17:07:26 compute-0 nova_compute[183075]:   <features>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:   </features>
Jan 22 17:07:26 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:07:26 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:07:26 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:07:26 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/b9680bfd-e87f-427c-8f13-2b3a415aca39/disk"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:07:26 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:22:ea:7f"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:       <target dev="tap4b2da161-22"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:07:26 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/b9680bfd-e87f-427c-8f13-2b3a415aca39/console.log" append="off"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <video>
Jan 22 17:07:26 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     </video>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:07:26 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:07:26 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:07:26 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:07:26 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:07:26 compute-0 nova_compute[183075]: </domain>
Jan 22 17:07:26 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.132 183079 DEBUG nova.compute.manager [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Preparing to wait for external event network-vif-plugged-4b2da161-220a-41f0-b689-1c6e25e8b81a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.133 183079 DEBUG oslo_concurrency.lockutils [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Acquiring lock "b9680bfd-e87f-427c-8f13-2b3a415aca39-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.133 183079 DEBUG oslo_concurrency.lockutils [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "b9680bfd-e87f-427c-8f13-2b3a415aca39-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.133 183079 DEBUG oslo_concurrency.lockutils [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "b9680bfd-e87f-427c-8f13-2b3a415aca39-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.134 183079 DEBUG nova.virt.libvirt.vif [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:07:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-783174209',display_name='tempest-server-test-783174209',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-783174209',id=4,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFwDv7A3xWV6IJ9sHJD6IHIGofwjIy3SwtF7ZsSnO9i7yxDOnvofgvCRbmYwkVhe4LG2M7JC1Bh9mcomUiffvFuBa9GwDItaNN685Z4fyZXr+GZx+rbje/8Qtcf+s+bYwA==',key_name='tempest-keypair-test-76774833',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cc642b97aa4e4886902a0d1233877b88',ramdisk_id='',reservation_id='r-8jpndd40',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DefaultSnatToExternal-1301723521',owner_user_name='tempest-DefaultSnatToExternal-1301723521-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:07:22Z,user_data=None,user_id='c9cdd80799a74efb8ce82cfb5148ac89',uuid=b9680bfd-e87f-427c-8f13-2b3a415aca39,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4b2da161-220a-41f0-b689-1c6e25e8b81a", "address": "fa:16:3e:22:ea:7f", "network": {"id": "ca57ee46-b6e8-4b60-affe-0c1349cb8abe", "bridge": "br-int", "label": "tempest-test-network--418767780", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cc642b97aa4e4886902a0d1233877b88", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2da161-22", "ovs_interfaceid": "4b2da161-220a-41f0-b689-1c6e25e8b81a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.134 183079 DEBUG nova.network.os_vif_util [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Converting VIF {"id": "4b2da161-220a-41f0-b689-1c6e25e8b81a", "address": "fa:16:3e:22:ea:7f", "network": {"id": "ca57ee46-b6e8-4b60-affe-0c1349cb8abe", "bridge": "br-int", "label": "tempest-test-network--418767780", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cc642b97aa4e4886902a0d1233877b88", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2da161-22", "ovs_interfaceid": "4b2da161-220a-41f0-b689-1c6e25e8b81a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.135 183079 DEBUG nova.network.os_vif_util [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:ea:7f,bridge_name='br-int',has_traffic_filtering=True,id=4b2da161-220a-41f0-b689-1c6e25e8b81a,network=Network(ca57ee46-b6e8-4b60-affe-0c1349cb8abe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4b2da161-22') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.135 183079 DEBUG os_vif [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:ea:7f,bridge_name='br-int',has_traffic_filtering=True,id=4b2da161-220a-41f0-b689-1c6e25e8b81a,network=Network(ca57ee46-b6e8-4b60-affe-0c1349cb8abe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4b2da161-22') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.136 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.136 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.137 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.142 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.143 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b2da161-22, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.143 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4b2da161-22, col_values=(('external_ids', {'iface-id': '4b2da161-220a-41f0-b689-1c6e25e8b81a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:22:ea:7f', 'vm-uuid': 'b9680bfd-e87f-427c-8f13-2b3a415aca39'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:07:26 compute-0 NetworkManager[55454]: <info>  [1769101646.1457] manager: (tap4b2da161-22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.147 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.151 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.151 183079 INFO os_vif [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:ea:7f,bridge_name='br-int',has_traffic_filtering=True,id=4b2da161-220a-41f0-b689-1c6e25e8b81a,network=Network(ca57ee46-b6e8-4b60-affe-0c1349cb8abe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4b2da161-22')
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.196 183079 DEBUG nova.virt.libvirt.driver [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.196 183079 DEBUG nova.virt.libvirt.driver [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] No VIF found with MAC fa:16:3e:22:ea:7f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:07:26 compute-0 kernel: tap4b2da161-22: entered promiscuous mode
Jan 22 17:07:26 compute-0 NetworkManager[55454]: <info>  [1769101646.2794] manager: (tap4b2da161-22): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Jan 22 17:07:26 compute-0 ovn_controller[95372]: 2026-01-22T17:07:26Z|00047|binding|INFO|Claiming lport 4b2da161-220a-41f0-b689-1c6e25e8b81a for this chassis.
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.282 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:26 compute-0 ovn_controller[95372]: 2026-01-22T17:07:26Z|00048|binding|INFO|4b2da161-220a-41f0-b689-1c6e25e8b81a: Claiming fa:16:3e:22:ea:7f 10.100.0.10
Jan 22 17:07:26 compute-0 ovn_controller[95372]: 2026-01-22T17:07:26Z|00049|binding|INFO|Setting lport 4b2da161-220a-41f0-b689-1c6e25e8b81a ovn-installed in OVS
Jan 22 17:07:26 compute-0 ovn_controller[95372]: 2026-01-22T17:07:26Z|00050|binding|INFO|Setting lport 4b2da161-220a-41f0-b689-1c6e25e8b81a up in Southbound
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.297 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.298 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.300 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:26.295 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:ea:7f 10.100.0.10'], port_security=['fa:16:3e:22:ea:7f 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'b9680bfd-e87f-427c-8f13-2b3a415aca39', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ca57ee46-b6e8-4b60-affe-0c1349cb8abe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cc642b97aa4e4886902a0d1233877b88', 'neutron:revision_number': '2', 'neutron:security_group_ids': '78ff1eac-eb3f-4f3f-baa4-7b8094682c5f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ad421178-e3d9-4f12-a6a2-84aabbd8f25c, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=4b2da161-220a-41f0-b689-1c6e25e8b81a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:07:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:26.296 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 4b2da161-220a-41f0-b689-1c6e25e8b81a in datapath ca57ee46-b6e8-4b60-affe-0c1349cb8abe bound to our chassis
Jan 22 17:07:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:26.298 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ca57ee46-b6e8-4b60-affe-0c1349cb8abe
Jan 22 17:07:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:26.316 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[838747a8-3345-4b69-816b-e9b89f64cfb1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:26 compute-0 systemd-udevd[215674]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:07:26 compute-0 systemd-machined[154382]: New machine qemu-4-instance-00000004.
Jan 22 17:07:26 compute-0 sshd-session[215642]: Connection reset by authenticating user root 176.120.22.47 port 56726 [preauth]
Jan 22 17:07:26 compute-0 NetworkManager[55454]: <info>  [1769101646.3465] device (tap4b2da161-22): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:07:26 compute-0 NetworkManager[55454]: <info>  [1769101646.3473] device (tap4b2da161-22): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:07:26 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-00000004.
Jan 22 17:07:26 compute-0 podman[215653]: 2026-01-22 17:07:26.351388841 +0000 UTC m=+0.076715417 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 17:07:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:26.356 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[199c38a7-21c0-44ae-a551-ab485a4f5741]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:26.363 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[232a473f-033e-427d-92bb-108708800253]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:26.395 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[892aa6d3-b991-4a3e-a76c-1f39665a5afb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:26.413 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8ce88da7-bcf0-437a-8094-aadb5a71ba51]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapca57ee46-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:69:1c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 55, 'rx_bytes': 8920, 'tx_bytes': 6236, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 55, 'rx_bytes': 8920, 'tx_bytes': 6236, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 397358, 'reachable_time': 27603, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215695, 'error': None, 'target': 'ovnmeta-ca57ee46-b6e8-4b60-affe-0c1349cb8abe', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:26.434 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a4f8fdf2-4d4b-46f4-ab72-cc48b4163399]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapca57ee46-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 397371, 'tstamp': 397371}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215696, 'error': None, 'target': 'ovnmeta-ca57ee46-b6e8-4b60-affe-0c1349cb8abe', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapca57ee46-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 397375, 'tstamp': 397375}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215696, 'error': None, 'target': 'ovnmeta-ca57ee46-b6e8-4b60-affe-0c1349cb8abe', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:26.436 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapca57ee46-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.438 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.439 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:26.439 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapca57ee46-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:07:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:26.439 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:07:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:26.440 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapca57ee46-b0, col_values=(('external_ids', {'iface-id': '75ce3d88-09ff-4158-a831-34ffcbec4888'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:07:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:26.440 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.641 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101646.641183, b9680bfd-e87f-427c-8f13-2b3a415aca39 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.643 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] VM Started (Lifecycle Event)
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.673 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.679 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101646.6424406, b9680bfd-e87f-427c-8f13-2b3a415aca39 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.680 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] VM Paused (Lifecycle Event)
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.716 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.721 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.762 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.941 183079 DEBUG nova.compute.manager [req-f18f9e50-f9f2-4a60-9e1b-9e600fa1a110 req-29f54575-7944-4865-85e4-3305a41778f5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Received event network-vif-plugged-4b2da161-220a-41f0-b689-1c6e25e8b81a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.942 183079 DEBUG oslo_concurrency.lockutils [req-f18f9e50-f9f2-4a60-9e1b-9e600fa1a110 req-29f54575-7944-4865-85e4-3305a41778f5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "b9680bfd-e87f-427c-8f13-2b3a415aca39-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.942 183079 DEBUG oslo_concurrency.lockutils [req-f18f9e50-f9f2-4a60-9e1b-9e600fa1a110 req-29f54575-7944-4865-85e4-3305a41778f5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b9680bfd-e87f-427c-8f13-2b3a415aca39-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.943 183079 DEBUG oslo_concurrency.lockutils [req-f18f9e50-f9f2-4a60-9e1b-9e600fa1a110 req-29f54575-7944-4865-85e4-3305a41778f5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b9680bfd-e87f-427c-8f13-2b3a415aca39-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.943 183079 DEBUG nova.compute.manager [req-f18f9e50-f9f2-4a60-9e1b-9e600fa1a110 req-29f54575-7944-4865-85e4-3305a41778f5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Processing event network-vif-plugged-4b2da161-220a-41f0-b689-1c6e25e8b81a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.944 183079 DEBUG nova.compute.manager [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.949 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101646.9494138, b9680bfd-e87f-427c-8f13-2b3a415aca39 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.950 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] VM Resumed (Lifecycle Event)
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.955 183079 DEBUG nova.virt.libvirt.driver [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.959 183079 INFO nova.virt.libvirt.driver [-] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Instance spawned successfully.
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.960 183079 DEBUG nova.virt.libvirt.driver [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.979 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.988 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.996 183079 DEBUG nova.virt.libvirt.driver [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.996 183079 DEBUG nova.virt.libvirt.driver [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.997 183079 DEBUG nova.virt.libvirt.driver [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:07:26 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.998 183079 DEBUG nova.virt.libvirt.driver [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:07:27 compute-0 nova_compute[183075]: 2026-01-22 17:07:26.999 183079 DEBUG nova.virt.libvirt.driver [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:07:27 compute-0 nova_compute[183075]: 2026-01-22 17:07:27.000 183079 DEBUG nova.virt.libvirt.driver [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:07:27 compute-0 nova_compute[183075]: 2026-01-22 17:07:27.031 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:07:27 compute-0 nova_compute[183075]: 2026-01-22 17:07:27.060 183079 INFO nova.compute.manager [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Took 4.34 seconds to spawn the instance on the hypervisor.
Jan 22 17:07:27 compute-0 nova_compute[183075]: 2026-01-22 17:07:27.060 183079 DEBUG nova.compute.manager [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:07:27 compute-0 nova_compute[183075]: 2026-01-22 17:07:27.124 183079 INFO nova.compute.manager [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Took 4.92 seconds to build instance.
Jan 22 17:07:27 compute-0 nova_compute[183075]: 2026-01-22 17:07:27.143 183079 DEBUG oslo_concurrency.lockutils [None req-d981f052-95e8-44b9-91c0-06d707d8310b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "b9680bfd-e87f-427c-8f13-2b3a415aca39" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.013s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:27 compute-0 sshd-session[215704]: Connection reset by authenticating user root 176.120.22.47 port 56736 [preauth]
Jan 22 17:07:27 compute-0 nova_compute[183075]: 2026-01-22 17:07:27.558 183079 DEBUG nova.network.neutron [req-ae6ae40b-6688-4102-8335-de3540be416c req-5f7f8fc8-99e7-46c3-b591-dec1b9770151 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Updated VIF entry in instance network info cache for port 4b2da161-220a-41f0-b689-1c6e25e8b81a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:07:27 compute-0 nova_compute[183075]: 2026-01-22 17:07:27.559 183079 DEBUG nova.network.neutron [req-ae6ae40b-6688-4102-8335-de3540be416c req-5f7f8fc8-99e7-46c3-b591-dec1b9770151 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Updating instance_info_cache with network_info: [{"id": "4b2da161-220a-41f0-b689-1c6e25e8b81a", "address": "fa:16:3e:22:ea:7f", "network": {"id": "ca57ee46-b6e8-4b60-affe-0c1349cb8abe", "bridge": "br-int", "label": "tempest-test-network--418767780", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cc642b97aa4e4886902a0d1233877b88", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2da161-22", "ovs_interfaceid": "4b2da161-220a-41f0-b689-1c6e25e8b81a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:07:27 compute-0 nova_compute[183075]: 2026-01-22 17:07:27.581 183079 DEBUG oslo_concurrency.lockutils [req-ae6ae40b-6688-4102-8335-de3540be416c req-5f7f8fc8-99e7-46c3-b591-dec1b9770151 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-b9680bfd-e87f-427c-8f13-2b3a415aca39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:07:27 compute-0 sshd-session[215535]: error: kex_exchange_identification: read: Connection reset by peer
Jan 22 17:07:27 compute-0 sshd-session[215535]: Connection reset by 176.120.22.47 port 60850
Jan 22 17:07:28 compute-0 nova_compute[183075]: 2026-01-22 17:07:28.546 183079 INFO nova.compute.manager [None req-526ad9dc-d827-42b8-b49d-88faf168b054 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Get console output
Jan 22 17:07:28 compute-0 nova_compute[183075]: 2026-01-22 17:07:28.555 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:07:29 compute-0 nova_compute[183075]: 2026-01-22 17:07:29.023 183079 DEBUG nova.compute.manager [req-275ac11a-0dbf-4c58-aeae-532478c1456d req-1d6d1b69-7dfb-4bc5-a58b-0f716fd947b8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Received event network-vif-plugged-4b2da161-220a-41f0-b689-1c6e25e8b81a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:07:29 compute-0 nova_compute[183075]: 2026-01-22 17:07:29.024 183079 DEBUG oslo_concurrency.lockutils [req-275ac11a-0dbf-4c58-aeae-532478c1456d req-1d6d1b69-7dfb-4bc5-a58b-0f716fd947b8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "b9680bfd-e87f-427c-8f13-2b3a415aca39-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:29 compute-0 nova_compute[183075]: 2026-01-22 17:07:29.024 183079 DEBUG oslo_concurrency.lockutils [req-275ac11a-0dbf-4c58-aeae-532478c1456d req-1d6d1b69-7dfb-4bc5-a58b-0f716fd947b8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b9680bfd-e87f-427c-8f13-2b3a415aca39-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:29 compute-0 nova_compute[183075]: 2026-01-22 17:07:29.024 183079 DEBUG oslo_concurrency.lockutils [req-275ac11a-0dbf-4c58-aeae-532478c1456d req-1d6d1b69-7dfb-4bc5-a58b-0f716fd947b8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b9680bfd-e87f-427c-8f13-2b3a415aca39-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:29 compute-0 nova_compute[183075]: 2026-01-22 17:07:29.025 183079 DEBUG nova.compute.manager [req-275ac11a-0dbf-4c58-aeae-532478c1456d req-1d6d1b69-7dfb-4bc5-a58b-0f716fd947b8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] No waiting events found dispatching network-vif-plugged-4b2da161-220a-41f0-b689-1c6e25e8b81a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:07:29 compute-0 nova_compute[183075]: 2026-01-22 17:07:29.025 183079 WARNING nova.compute.manager [req-275ac11a-0dbf-4c58-aeae-532478c1456d req-1d6d1b69-7dfb-4bc5-a58b-0f716fd947b8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Received unexpected event network-vif-plugged-4b2da161-220a-41f0-b689-1c6e25e8b81a for instance with vm_state active and task_state None.
Jan 22 17:07:29 compute-0 nova_compute[183075]: 2026-01-22 17:07:29.049 183079 INFO nova.compute.manager [None req-4556354d-9e22-460b-9bdc-6a8c3d20891e c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Get console output
Jan 22 17:07:29 compute-0 nova_compute[183075]: 2026-01-22 17:07:29.057 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:07:29 compute-0 nova_compute[183075]: 2026-01-22 17:07:29.553 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:30.212 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:30.217 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:07:30 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:30 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:30 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:30 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:30 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:30 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:07:30 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 473b4e99-4018-4fa7-ab1c-2d3e7944d850 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:31 compute-0 nova_compute[183075]: 2026-01-22 17:07:31.148 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.583 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.584 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 1.3669426
Jan 22 17:07:31 compute-0 haproxy-metadata-proxy-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215497]: 10.100.0.6:58864 [22/Jan/2026:17:07:30.207] listener listener/metadata 0/0/0/1376/1376 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.599 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.600 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 473b4e99-4018-4fa7-ab1c-2d3e7944d850 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.624 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:31 compute-0 haproxy-metadata-proxy-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215497]: 10.100.0.6:58868 [22/Jan/2026:17:07:31.598] listener listener/metadata 0/0/0/26/26 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.624 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0243444
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.634 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.637 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 473b4e99-4018-4fa7-ab1c-2d3e7944d850 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.660 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:31 compute-0 haproxy-metadata-proxy-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215497]: 10.100.0.6:58880 [22/Jan/2026:17:07:31.631] listener listener/metadata 0/0/0/30/30 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.661 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0242858
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.668 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.669 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 473b4e99-4018-4fa7-ab1c-2d3e7944d850 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.688 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.689 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0197899
Jan 22 17:07:31 compute-0 haproxy-metadata-proxy-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215497]: 10.100.0.6:58890 [22/Jan/2026:17:07:31.667] listener listener/metadata 0/0/0/21/21 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.696 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.696 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 473b4e99-4018-4fa7-ab1c-2d3e7944d850 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.715 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.716 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0194309
Jan 22 17:07:31 compute-0 haproxy-metadata-proxy-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215497]: 10.100.0.6:58892 [22/Jan/2026:17:07:31.695] listener listener/metadata 0/0/0/20/20 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.724 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.725 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 473b4e99-4018-4fa7-ab1c-2d3e7944d850 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.745 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.746 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0212042
Jan 22 17:07:31 compute-0 haproxy-metadata-proxy-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215497]: 10.100.0.6:58906 [22/Jan/2026:17:07:31.723] listener listener/metadata 0/0/0/22/22 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.752 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.753 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 473b4e99-4018-4fa7-ab1c-2d3e7944d850 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.776 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:31 compute-0 haproxy-metadata-proxy-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215497]: 10.100.0.6:58920 [22/Jan/2026:17:07:31.751] listener listener/metadata 0/0/0/25/25 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.777 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0247078
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.785 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.786 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 473b4e99-4018-4fa7-ab1c-2d3e7944d850 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.810 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.811 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0248039
Jan 22 17:07:31 compute-0 haproxy-metadata-proxy-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215497]: 10.100.0.6:58932 [22/Jan/2026:17:07:31.784] listener listener/metadata 0/0/0/26/26 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.820 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.822 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 473b4e99-4018-4fa7-ab1c-2d3e7944d850 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.854 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.855 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0336280
Jan 22 17:07:31 compute-0 haproxy-metadata-proxy-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215497]: 10.100.0.6:58944 [22/Jan/2026:17:07:31.819] listener listener/metadata 0/0/0/36/36 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.867 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.869 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 473b4e99-4018-4fa7-ab1c-2d3e7944d850 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.892 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:31 compute-0 haproxy-metadata-proxy-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215497]: 10.100.0.6:58958 [22/Jan/2026:17:07:31.867] listener listener/metadata 0/0/0/26/26 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.893 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0248320
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.902 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.903 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 473b4e99-4018-4fa7-ab1c-2d3e7944d850 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:31 compute-0 haproxy-metadata-proxy-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215497]: 10.100.0.6:58964 [22/Jan/2026:17:07:31.901] listener listener/metadata 0/0/0/23/23 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.925 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0212781
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.943 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.944 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 473b4e99-4018-4fa7-ab1c-2d3e7944d850 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.965 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.966 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0220191
Jan 22 17:07:31 compute-0 haproxy-metadata-proxy-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215497]: 10.100.0.6:58980 [22/Jan/2026:17:07:31.942] listener listener/metadata 0/0/0/23/23 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.973 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.973 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 473b4e99-4018-4fa7-ab1c-2d3e7944d850 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.992 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.993 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0195467
Jan 22 17:07:31 compute-0 haproxy-metadata-proxy-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215497]: 10.100.0.6:58984 [22/Jan/2026:17:07:31.972] listener listener/metadata 0/0/0/20/20 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.998 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:31.999 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:07:31 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 473b4e99-4018-4fa7-ab1c-2d3e7944d850 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:32.017 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:32.018 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0189600
Jan 22 17:07:32 compute-0 haproxy-metadata-proxy-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215497]: 10.100.0.6:59000 [22/Jan/2026:17:07:31.998] listener listener/metadata 0/0/0/19/19 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:07:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:32.029 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:32.030 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:07:32 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:32 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:32 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:32 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:32 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:32 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:07:32 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 473b4e99-4018-4fa7-ab1c-2d3e7944d850 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:32.053 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:32 compute-0 haproxy-metadata-proxy-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215497]: 10.100.0.6:59004 [22/Jan/2026:17:07:32.028] listener listener/metadata 0/0/0/25/25 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:07:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:32.054 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0247185
Jan 22 17:07:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:32.063 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:32.064 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:07:32 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:32 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:32 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:32 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:32 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:32 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:07:32 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 473b4e99-4018-4fa7-ab1c-2d3e7944d850 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:32.091 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:32.091 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0273194
Jan 22 17:07:32 compute-0 haproxy-metadata-proxy-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215497]: 10.100.0.6:59014 [22/Jan/2026:17:07:32.062] listener listener/metadata 0/0/0/29/29 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:07:32 compute-0 podman[215707]: 2026-01-22 17:07:32.400363689 +0000 UTC m=+0.091014431 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:07:33 compute-0 nova_compute[183075]: 2026-01-22 17:07:33.814 183079 INFO nova.compute.manager [None req-8fdbeb3c-1209-4956-b9a2-7f7b31fb1b91 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Get console output
Jan 22 17:07:33 compute-0 nova_compute[183075]: 2026-01-22 17:07:33.823 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:07:34 compute-0 nova_compute[183075]: 2026-01-22 17:07:34.241 183079 INFO nova.compute.manager [None req-1db5e798-234d-4fc9-af93-684b1c0acf1f c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Get console output
Jan 22 17:07:34 compute-0 nova_compute[183075]: 2026-01-22 17:07:34.247 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:07:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:34.249 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:07:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:34.250 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:07:34 compute-0 nova_compute[183075]: 2026-01-22 17:07:34.252 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:34 compute-0 nova_compute[183075]: 2026-01-22 17:07:34.596 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:36 compute-0 nova_compute[183075]: 2026-01-22 17:07:36.152 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:38.253 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:07:39 compute-0 nova_compute[183075]: 2026-01-22 17:07:39.453 183079 INFO nova.compute.manager [None req-390f26f2-738c-42f9-8009-507c205dd80d c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Get console output
Jan 22 17:07:39 compute-0 nova_compute[183075]: 2026-01-22 17:07:39.458 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:07:39 compute-0 nova_compute[183075]: 2026-01-22 17:07:39.639 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:39 compute-0 nova_compute[183075]: 2026-01-22 17:07:39.785 183079 DEBUG oslo_concurrency.lockutils [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Acquiring lock "000b64b8-bcc5-4bbe-9703-8400a83a27d0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:39 compute-0 nova_compute[183075]: 2026-01-22 17:07:39.786 183079 DEBUG oslo_concurrency.lockutils [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "000b64b8-bcc5-4bbe-9703-8400a83a27d0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:39 compute-0 nova_compute[183075]: 2026-01-22 17:07:39.806 183079 DEBUG nova.compute.manager [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:07:39 compute-0 nova_compute[183075]: 2026-01-22 17:07:39.894 183079 DEBUG oslo_concurrency.lockutils [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:39 compute-0 nova_compute[183075]: 2026-01-22 17:07:39.894 183079 DEBUG oslo_concurrency.lockutils [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:39 compute-0 nova_compute[183075]: 2026-01-22 17:07:39.902 183079 DEBUG nova.virt.hardware [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:07:39 compute-0 nova_compute[183075]: 2026-01-22 17:07:39.902 183079 INFO nova.compute.claims [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.076 183079 DEBUG nova.compute.provider_tree [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.091 183079 DEBUG nova.scheduler.client.report [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.111 183079 DEBUG oslo_concurrency.lockutils [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.217s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.112 183079 DEBUG nova.compute.manager [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.159 183079 DEBUG nova.compute.manager [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.159 183079 DEBUG nova.network.neutron [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.180 183079 INFO nova.virt.libvirt.driver [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.196 183079 DEBUG nova.compute.manager [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:07:40 compute-0 ovn_controller[95372]: 2026-01-22T17:07:40Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:22:ea:7f 10.100.0.10
Jan 22 17:07:40 compute-0 ovn_controller[95372]: 2026-01-22T17:07:40Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:22:ea:7f 10.100.0.10
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.300 183079 DEBUG nova.compute.manager [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.301 183079 DEBUG nova.virt.libvirt.driver [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.302 183079 INFO nova.virt.libvirt.driver [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Creating image(s)
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.302 183079 DEBUG oslo_concurrency.lockutils [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Acquiring lock "/var/lib/nova/instances/000b64b8-bcc5-4bbe-9703-8400a83a27d0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.302 183079 DEBUG oslo_concurrency.lockutils [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "/var/lib/nova/instances/000b64b8-bcc5-4bbe-9703-8400a83a27d0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.303 183079 DEBUG oslo_concurrency.lockutils [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "/var/lib/nova/instances/000b64b8-bcc5-4bbe-9703-8400a83a27d0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.317 183079 DEBUG oslo_concurrency.processutils [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.377 183079 DEBUG oslo_concurrency.processutils [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.378 183079 DEBUG oslo_concurrency.lockutils [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.378 183079 DEBUG oslo_concurrency.lockutils [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.388 183079 DEBUG oslo_concurrency.processutils [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.450 183079 DEBUG oslo_concurrency.processutils [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.451 183079 DEBUG oslo_concurrency.processutils [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/000b64b8-bcc5-4bbe-9703-8400a83a27d0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.486 183079 DEBUG oslo_concurrency.processutils [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/000b64b8-bcc5-4bbe-9703-8400a83a27d0/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.487 183079 DEBUG oslo_concurrency.lockutils [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.488 183079 DEBUG oslo_concurrency.processutils [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.567 183079 DEBUG oslo_concurrency.processutils [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.568 183079 DEBUG nova.virt.disk.api [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Checking if we can resize image /var/lib/nova/instances/000b64b8-bcc5-4bbe-9703-8400a83a27d0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.569 183079 DEBUG oslo_concurrency.processutils [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/000b64b8-bcc5-4bbe-9703-8400a83a27d0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.607 183079 DEBUG nova.policy [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.630 183079 DEBUG oslo_concurrency.processutils [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/000b64b8-bcc5-4bbe-9703-8400a83a27d0/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.631 183079 DEBUG nova.virt.disk.api [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Cannot resize image /var/lib/nova/instances/000b64b8-bcc5-4bbe-9703-8400a83a27d0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.632 183079 DEBUG nova.objects.instance [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lazy-loading 'migration_context' on Instance uuid 000b64b8-bcc5-4bbe-9703-8400a83a27d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.654 183079 DEBUG nova.virt.libvirt.driver [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.655 183079 DEBUG nova.virt.libvirt.driver [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Ensure instance console log exists: /var/lib/nova/instances/000b64b8-bcc5-4bbe-9703-8400a83a27d0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.656 183079 DEBUG oslo_concurrency.lockutils [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.657 183079 DEBUG oslo_concurrency.lockutils [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:40 compute-0 nova_compute[183075]: 2026-01-22 17:07:40.657 183079 DEBUG oslo_concurrency.lockutils [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:41 compute-0 nova_compute[183075]: 2026-01-22 17:07:41.155 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:41 compute-0 nova_compute[183075]: 2026-01-22 17:07:41.333 183079 DEBUG nova.network.neutron [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Successfully updated port: c694dca0-bf6e-4e89-a43e-1d10b3b23075 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:07:41 compute-0 nova_compute[183075]: 2026-01-22 17:07:41.356 183079 DEBUG oslo_concurrency.lockutils [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Acquiring lock "refresh_cache-000b64b8-bcc5-4bbe-9703-8400a83a27d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:07:41 compute-0 nova_compute[183075]: 2026-01-22 17:07:41.356 183079 DEBUG oslo_concurrency.lockutils [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Acquired lock "refresh_cache-000b64b8-bcc5-4bbe-9703-8400a83a27d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:07:41 compute-0 nova_compute[183075]: 2026-01-22 17:07:41.357 183079 DEBUG nova.network.neutron [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:07:41 compute-0 nova_compute[183075]: 2026-01-22 17:07:41.440 183079 DEBUG nova.compute.manager [req-a2fdd4f8-8ada-4e56-8a48-c1ba30fe701a req-6a07bb2a-c255-4455-96fd-a871d4f722bc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Received event network-changed-c694dca0-bf6e-4e89-a43e-1d10b3b23075 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:07:41 compute-0 nova_compute[183075]: 2026-01-22 17:07:41.441 183079 DEBUG nova.compute.manager [req-a2fdd4f8-8ada-4e56-8a48-c1ba30fe701a req-6a07bb2a-c255-4455-96fd-a871d4f722bc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Refreshing instance network info cache due to event network-changed-c694dca0-bf6e-4e89-a43e-1d10b3b23075. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:07:41 compute-0 nova_compute[183075]: 2026-01-22 17:07:41.441 183079 DEBUG oslo_concurrency.lockutils [req-a2fdd4f8-8ada-4e56-8a48-c1ba30fe701a req-6a07bb2a-c255-4455-96fd-a871d4f722bc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-000b64b8-bcc5-4bbe-9703-8400a83a27d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:07:41 compute-0 nova_compute[183075]: 2026-01-22 17:07:41.504 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:41 compute-0 nova_compute[183075]: 2026-01-22 17:07:41.643 183079 DEBUG nova.network.neutron [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:07:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:41.920 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:41.921 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:41.921 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:42 compute-0 nova_compute[183075]: 2026-01-22 17:07:42.978 183079 DEBUG nova.network.neutron [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Updating instance_info_cache with network_info: [{"id": "c694dca0-bf6e-4e89-a43e-1d10b3b23075", "address": "fa:16:3e:0a:25:d6", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc694dca0-bf", "ovs_interfaceid": "c694dca0-bf6e-4e89-a43e-1d10b3b23075", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.002 183079 DEBUG oslo_concurrency.lockutils [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Releasing lock "refresh_cache-000b64b8-bcc5-4bbe-9703-8400a83a27d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.005 183079 DEBUG nova.compute.manager [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Instance network_info: |[{"id": "c694dca0-bf6e-4e89-a43e-1d10b3b23075", "address": "fa:16:3e:0a:25:d6", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc694dca0-bf", "ovs_interfaceid": "c694dca0-bf6e-4e89-a43e-1d10b3b23075", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.007 183079 DEBUG oslo_concurrency.lockutils [req-a2fdd4f8-8ada-4e56-8a48-c1ba30fe701a req-6a07bb2a-c255-4455-96fd-a871d4f722bc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-000b64b8-bcc5-4bbe-9703-8400a83a27d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.008 183079 DEBUG nova.network.neutron [req-a2fdd4f8-8ada-4e56-8a48-c1ba30fe701a req-6a07bb2a-c255-4455-96fd-a871d4f722bc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Refreshing network info cache for port c694dca0-bf6e-4e89-a43e-1d10b3b23075 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.011 183079 DEBUG nova.virt.libvirt.driver [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Start _get_guest_xml network_info=[{"id": "c694dca0-bf6e-4e89-a43e-1d10b3b23075", "address": "fa:16:3e:0a:25:d6", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc694dca0-bf", "ovs_interfaceid": "c694dca0-bf6e-4e89-a43e-1d10b3b23075", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.019 183079 WARNING nova.virt.libvirt.driver [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.024 183079 DEBUG nova.virt.libvirt.host [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.024 183079 DEBUG nova.virt.libvirt.host [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.028 183079 DEBUG nova.virt.libvirt.host [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.028 183079 DEBUG nova.virt.libvirt.host [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.029 183079 DEBUG nova.virt.libvirt.driver [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.029 183079 DEBUG nova.virt.hardware [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.029 183079 DEBUG nova.virt.hardware [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.029 183079 DEBUG nova.virt.hardware [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.029 183079 DEBUG nova.virt.hardware [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.030 183079 DEBUG nova.virt.hardware [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.030 183079 DEBUG nova.virt.hardware [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.030 183079 DEBUG nova.virt.hardware [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.030 183079 DEBUG nova.virt.hardware [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.030 183079 DEBUG nova.virt.hardware [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.030 183079 DEBUG nova.virt.hardware [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.031 183079 DEBUG nova.virt.hardware [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.034 183079 DEBUG nova.virt.libvirt.vif [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:07:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1172585183',display_name='tempest-server-test-1172585183',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1172585183',id=5,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCYZfzAO5Ui0C6aNyi2ae5nossybaMPhlAmp9Zj0dL/LunopVV9meBl8qfqrR0u5rqBGUC3w2LWkiMuhmtUWjdFFHsn2/jcRm/feYqTpe58YE0uSBcC8gJ0hMPoRirOIAg==',key_name='tempest-keypair-test-1166356111',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='eb8e9f7a891a4a38af8b01557eddc991',ramdisk_id='',reservation_id='r-68us24bx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPPortDetailsTest-1812576526',owner_user_name='tempest-FloatingIPPortDetailsTest-1812576526-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:07:40Z,user_data=None,user_id='d7ee6c51c2b8447baefccea20fa16de5',uuid=000b64b8-bcc5-4bbe-9703-8400a83a27d0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c694dca0-bf6e-4e89-a43e-1d10b3b23075", "address": "fa:16:3e:0a:25:d6", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc694dca0-bf", "ovs_interfaceid": "c694dca0-bf6e-4e89-a43e-1d10b3b23075", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.034 183079 DEBUG nova.network.os_vif_util [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Converting VIF {"id": "c694dca0-bf6e-4e89-a43e-1d10b3b23075", "address": "fa:16:3e:0a:25:d6", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc694dca0-bf", "ovs_interfaceid": "c694dca0-bf6e-4e89-a43e-1d10b3b23075", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.035 183079 DEBUG nova.network.os_vif_util [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0a:25:d6,bridge_name='br-int',has_traffic_filtering=True,id=c694dca0-bf6e-4e89-a43e-1d10b3b23075,network=Network(473b4e99-4018-4fa7-ab1c-2d3e7944d850),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc694dca0-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.035 183079 DEBUG nova.objects.instance [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lazy-loading 'pci_devices' on Instance uuid 000b64b8-bcc5-4bbe-9703-8400a83a27d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.059 183079 DEBUG nova.virt.libvirt.driver [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:07:43 compute-0 nova_compute[183075]:   <uuid>000b64b8-bcc5-4bbe-9703-8400a83a27d0</uuid>
Jan 22 17:07:43 compute-0 nova_compute[183075]:   <name>instance-00000005</name>
Jan 22 17:07:43 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:07:43 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:07:43 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:07:43 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1172585183</nova:name>
Jan 22 17:07:43 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:07:43</nova:creationTime>
Jan 22 17:07:43 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:07:43 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:07:43 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:07:43 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:07:43 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:07:43 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:07:43 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:07:43 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:07:43 compute-0 nova_compute[183075]:         <nova:user uuid="d7ee6c51c2b8447baefccea20fa16de5">tempest-FloatingIPPortDetailsTest-1812576526-project-member</nova:user>
Jan 22 17:07:43 compute-0 nova_compute[183075]:         <nova:project uuid="eb8e9f7a891a4a38af8b01557eddc991">tempest-FloatingIPPortDetailsTest-1812576526</nova:project>
Jan 22 17:07:43 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:07:43 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:07:43 compute-0 nova_compute[183075]:         <nova:port uuid="c694dca0-bf6e-4e89-a43e-1d10b3b23075">
Jan 22 17:07:43 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:07:43 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:07:43 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:07:43 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <system>
Jan 22 17:07:43 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:07:43 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:07:43 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:07:43 compute-0 nova_compute[183075]:       <entry name="serial">000b64b8-bcc5-4bbe-9703-8400a83a27d0</entry>
Jan 22 17:07:43 compute-0 nova_compute[183075]:       <entry name="uuid">000b64b8-bcc5-4bbe-9703-8400a83a27d0</entry>
Jan 22 17:07:43 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     </system>
Jan 22 17:07:43 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:07:43 compute-0 nova_compute[183075]:   <os>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:   </os>
Jan 22 17:07:43 compute-0 nova_compute[183075]:   <features>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:   </features>
Jan 22 17:07:43 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:07:43 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:07:43 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:07:43 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/000b64b8-bcc5-4bbe-9703-8400a83a27d0/disk"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:07:43 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:0a:25:d6"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:       <target dev="tapc694dca0-bf"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:07:43 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/000b64b8-bcc5-4bbe-9703-8400a83a27d0/console.log" append="off"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <video>
Jan 22 17:07:43 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     </video>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:07:43 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:07:43 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:07:43 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:07:43 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:07:43 compute-0 nova_compute[183075]: </domain>
Jan 22 17:07:43 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.060 183079 DEBUG nova.compute.manager [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Preparing to wait for external event network-vif-plugged-c694dca0-bf6e-4e89-a43e-1d10b3b23075 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.060 183079 DEBUG oslo_concurrency.lockutils [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Acquiring lock "000b64b8-bcc5-4bbe-9703-8400a83a27d0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.060 183079 DEBUG oslo_concurrency.lockutils [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "000b64b8-bcc5-4bbe-9703-8400a83a27d0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.060 183079 DEBUG oslo_concurrency.lockutils [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "000b64b8-bcc5-4bbe-9703-8400a83a27d0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.061 183079 DEBUG nova.virt.libvirt.vif [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:07:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1172585183',display_name='tempest-server-test-1172585183',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1172585183',id=5,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCYZfzAO5Ui0C6aNyi2ae5nossybaMPhlAmp9Zj0dL/LunopVV9meBl8qfqrR0u5rqBGUC3w2LWkiMuhmtUWjdFFHsn2/jcRm/feYqTpe58YE0uSBcC8gJ0hMPoRirOIAg==',key_name='tempest-keypair-test-1166356111',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='eb8e9f7a891a4a38af8b01557eddc991',ramdisk_id='',reservation_id='r-68us24bx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPPortDetailsTest-1812576526',owner_user_name='tempest-FloatingIPPortDetailsTest-1812576526-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:07:40Z,user_data=None,user_id='d7ee6c51c2b8447baefccea20fa16de5',uuid=000b64b8-bcc5-4bbe-9703-8400a83a27d0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c694dca0-bf6e-4e89-a43e-1d10b3b23075", "address": "fa:16:3e:0a:25:d6", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc694dca0-bf", "ovs_interfaceid": "c694dca0-bf6e-4e89-a43e-1d10b3b23075", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.061 183079 DEBUG nova.network.os_vif_util [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Converting VIF {"id": "c694dca0-bf6e-4e89-a43e-1d10b3b23075", "address": "fa:16:3e:0a:25:d6", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc694dca0-bf", "ovs_interfaceid": "c694dca0-bf6e-4e89-a43e-1d10b3b23075", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.062 183079 DEBUG nova.network.os_vif_util [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0a:25:d6,bridge_name='br-int',has_traffic_filtering=True,id=c694dca0-bf6e-4e89-a43e-1d10b3b23075,network=Network(473b4e99-4018-4fa7-ab1c-2d3e7944d850),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc694dca0-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.062 183079 DEBUG os_vif [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:25:d6,bridge_name='br-int',has_traffic_filtering=True,id=c694dca0-bf6e-4e89-a43e-1d10b3b23075,network=Network(473b4e99-4018-4fa7-ab1c-2d3e7944d850),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc694dca0-bf') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.063 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.063 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.063 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.067 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.067 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc694dca0-bf, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.068 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc694dca0-bf, col_values=(('external_ids', {'iface-id': 'c694dca0-bf6e-4e89-a43e-1d10b3b23075', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0a:25:d6', 'vm-uuid': '000b64b8-bcc5-4bbe-9703-8400a83a27d0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.069 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:43 compute-0 NetworkManager[55454]: <info>  [1769101663.0705] manager: (tapc694dca0-bf): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.072 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.080 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.082 183079 INFO os_vif [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:25:d6,bridge_name='br-int',has_traffic_filtering=True,id=c694dca0-bf6e-4e89-a43e-1d10b3b23075,network=Network(473b4e99-4018-4fa7-ab1c-2d3e7944d850),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc694dca0-bf')
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.150 183079 DEBUG nova.virt.libvirt.driver [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.150 183079 DEBUG nova.virt.libvirt.driver [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] No VIF found with MAC fa:16:3e:0a:25:d6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:07:43 compute-0 kernel: tapc694dca0-bf: entered promiscuous mode
Jan 22 17:07:43 compute-0 NetworkManager[55454]: <info>  [1769101663.2205] manager: (tapc694dca0-bf): new Tun device (/org/freedesktop/NetworkManager/Devices/40)
Jan 22 17:07:43 compute-0 ovn_controller[95372]: 2026-01-22T17:07:43Z|00051|binding|INFO|Claiming lport c694dca0-bf6e-4e89-a43e-1d10b3b23075 for this chassis.
Jan 22 17:07:43 compute-0 ovn_controller[95372]: 2026-01-22T17:07:43Z|00052|binding|INFO|c694dca0-bf6e-4e89-a43e-1d10b3b23075: Claiming fa:16:3e:0a:25:d6 10.100.0.4
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.274 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:43 compute-0 systemd-udevd[215789]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:07:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:43.280 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0a:25:d6 10.100.0.4'], port_security=['fa:16:3e:0a:25:d6 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '000b64b8-bcc5-4bbe-9703-8400a83a27d0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-473b4e99-4018-4fa7-ab1c-2d3e7944d850', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b0c128dc-3c6d-4d32-ac6a-884653522196', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a8cd0033-3766-4772-8688-44d3d0e3250a, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=c694dca0-bf6e-4e89-a43e-1d10b3b23075) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:07:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:43.281 104629 INFO neutron.agent.ovn.metadata.agent [-] Port c694dca0-bf6e-4e89-a43e-1d10b3b23075 in datapath 473b4e99-4018-4fa7-ab1c-2d3e7944d850 bound to our chassis
Jan 22 17:07:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:43.283 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 473b4e99-4018-4fa7-ab1c-2d3e7944d850
Jan 22 17:07:43 compute-0 NetworkManager[55454]: <info>  [1769101663.2897] device (tapc694dca0-bf): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:07:43 compute-0 ovn_controller[95372]: 2026-01-22T17:07:43Z|00053|binding|INFO|Setting lport c694dca0-bf6e-4e89-a43e-1d10b3b23075 ovn-installed in OVS
Jan 22 17:07:43 compute-0 ovn_controller[95372]: 2026-01-22T17:07:43Z|00054|binding|INFO|Setting lport c694dca0-bf6e-4e89-a43e-1d10b3b23075 up in Southbound
Jan 22 17:07:43 compute-0 podman[215759]: 2026-01-22 17:07:43.290870269 +0000 UTC m=+0.156617197 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 17:07:43 compute-0 NetworkManager[55454]: <info>  [1769101663.2910] device (tapc694dca0-bf): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.293 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.296 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:43.304 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d3d42c87-5204-4df8-897c-02426977394f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:43 compute-0 systemd-machined[154382]: New machine qemu-5-instance-00000005.
Jan 22 17:07:43 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-00000005.
Jan 22 17:07:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:43.347 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[e2f167a1-1211-4c0c-bf48-47c588241cb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:43.352 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[f83a7b92-4961-41ca-a390-c999d7b2fae8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:43.390 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[6b7b2244-06e9-4ed7-a2dd-52434750af16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:43.415 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[85cbda20-bf9f-491e-8235-83429a55ad32]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap473b4e99-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4d:87:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 54, 'rx_bytes': 8920, 'tx_bytes': 6131, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 54, 'rx_bytes': 8920, 'tx_bytes': 6131, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399443, 'reachable_time': 32301, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215811, 'error': None, 'target': 'ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:43.437 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c747c41e-e232-4f84-a67e-58f800711e9e]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap473b4e99-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 399457, 'tstamp': 399457}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215813, 'error': None, 'target': 'ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap473b4e99-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 399460, 'tstamp': 399460}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215813, 'error': None, 'target': 'ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:43.441 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap473b4e99-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.443 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.444 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:43.445 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap473b4e99-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:07:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:43.446 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:07:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:43.447 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap473b4e99-40, col_values=(('external_ids', {'iface-id': '424ac40e-403e-4504-adbb-47a319b401fd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:07:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:43.448 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.997 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101663.9972193, 000b64b8-bcc5-4bbe-9703-8400a83a27d0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:07:43 compute-0 nova_compute[183075]: 2026-01-22 17:07:43.998 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] VM Started (Lifecycle Event)
Jan 22 17:07:44 compute-0 nova_compute[183075]: 2026-01-22 17:07:44.016 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:07:44 compute-0 nova_compute[183075]: 2026-01-22 17:07:44.021 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101663.9995983, 000b64b8-bcc5-4bbe-9703-8400a83a27d0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:07:44 compute-0 nova_compute[183075]: 2026-01-22 17:07:44.022 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] VM Paused (Lifecycle Event)
Jan 22 17:07:44 compute-0 nova_compute[183075]: 2026-01-22 17:07:44.042 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:07:44 compute-0 nova_compute[183075]: 2026-01-22 17:07:44.045 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:07:44 compute-0 nova_compute[183075]: 2026-01-22 17:07:44.064 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:07:44 compute-0 nova_compute[183075]: 2026-01-22 17:07:44.692 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:44 compute-0 nova_compute[183075]: 2026-01-22 17:07:44.729 183079 INFO nova.compute.manager [None req-dfadb394-406a-4885-b53b-8fe816be2218 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Get console output
Jan 22 17:07:44 compute-0 nova_compute[183075]: 2026-01-22 17:07:44.735 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:07:44 compute-0 nova_compute[183075]: 2026-01-22 17:07:44.883 183079 DEBUG nova.compute.manager [req-3fef6aaf-1e20-4f59-959e-d268e97eabe8 req-162928a3-ebf1-466a-be0b-8b7b4b48262d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Received event network-vif-plugged-c694dca0-bf6e-4e89-a43e-1d10b3b23075 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:07:44 compute-0 nova_compute[183075]: 2026-01-22 17:07:44.883 183079 DEBUG oslo_concurrency.lockutils [req-3fef6aaf-1e20-4f59-959e-d268e97eabe8 req-162928a3-ebf1-466a-be0b-8b7b4b48262d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "000b64b8-bcc5-4bbe-9703-8400a83a27d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:44 compute-0 nova_compute[183075]: 2026-01-22 17:07:44.884 183079 DEBUG oslo_concurrency.lockutils [req-3fef6aaf-1e20-4f59-959e-d268e97eabe8 req-162928a3-ebf1-466a-be0b-8b7b4b48262d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "000b64b8-bcc5-4bbe-9703-8400a83a27d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:44 compute-0 nova_compute[183075]: 2026-01-22 17:07:44.884 183079 DEBUG oslo_concurrency.lockutils [req-3fef6aaf-1e20-4f59-959e-d268e97eabe8 req-162928a3-ebf1-466a-be0b-8b7b4b48262d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "000b64b8-bcc5-4bbe-9703-8400a83a27d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:44 compute-0 nova_compute[183075]: 2026-01-22 17:07:44.884 183079 DEBUG nova.compute.manager [req-3fef6aaf-1e20-4f59-959e-d268e97eabe8 req-162928a3-ebf1-466a-be0b-8b7b4b48262d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Processing event network-vif-plugged-c694dca0-bf6e-4e89-a43e-1d10b3b23075 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:07:44 compute-0 nova_compute[183075]: 2026-01-22 17:07:44.885 183079 DEBUG nova.compute.manager [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:07:44 compute-0 nova_compute[183075]: 2026-01-22 17:07:44.888 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101664.8882923, 000b64b8-bcc5-4bbe-9703-8400a83a27d0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:07:44 compute-0 nova_compute[183075]: 2026-01-22 17:07:44.888 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] VM Resumed (Lifecycle Event)
Jan 22 17:07:44 compute-0 nova_compute[183075]: 2026-01-22 17:07:44.890 183079 DEBUG nova.virt.libvirt.driver [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:07:44 compute-0 nova_compute[183075]: 2026-01-22 17:07:44.893 183079 INFO nova.virt.libvirt.driver [-] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Instance spawned successfully.
Jan 22 17:07:44 compute-0 nova_compute[183075]: 2026-01-22 17:07:44.894 183079 DEBUG nova.virt.libvirt.driver [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:07:44 compute-0 nova_compute[183075]: 2026-01-22 17:07:44.915 183079 DEBUG nova.virt.libvirt.driver [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:07:44 compute-0 nova_compute[183075]: 2026-01-22 17:07:44.916 183079 DEBUG nova.virt.libvirt.driver [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:07:44 compute-0 nova_compute[183075]: 2026-01-22 17:07:44.916 183079 DEBUG nova.virt.libvirt.driver [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:07:44 compute-0 nova_compute[183075]: 2026-01-22 17:07:44.917 183079 DEBUG nova.virt.libvirt.driver [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:07:44 compute-0 nova_compute[183075]: 2026-01-22 17:07:44.917 183079 DEBUG nova.virt.libvirt.driver [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:07:44 compute-0 nova_compute[183075]: 2026-01-22 17:07:44.917 183079 DEBUG nova.virt.libvirt.driver [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:07:44 compute-0 nova_compute[183075]: 2026-01-22 17:07:44.987 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:07:44 compute-0 nova_compute[183075]: 2026-01-22 17:07:44.992 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:07:44 compute-0 nova_compute[183075]: 2026-01-22 17:07:44.996 183079 INFO nova.compute.manager [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Took 4.70 seconds to spawn the instance on the hypervisor.
Jan 22 17:07:44 compute-0 nova_compute[183075]: 2026-01-22 17:07:44.997 183079 DEBUG nova.compute.manager [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:07:45 compute-0 nova_compute[183075]: 2026-01-22 17:07:45.009 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:07:45 compute-0 nova_compute[183075]: 2026-01-22 17:07:45.057 183079 INFO nova.compute.manager [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Took 5.19 seconds to build instance.
Jan 22 17:07:45 compute-0 nova_compute[183075]: 2026-01-22 17:07:45.072 183079 DEBUG oslo_concurrency.lockutils [None req-d306a654-8c24-4ded-b8a8-56626f56a2eb d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "000b64b8-bcc5-4bbe-9703-8400a83a27d0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.286s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:45 compute-0 nova_compute[183075]: 2026-01-22 17:07:45.262 183079 DEBUG nova.network.neutron [req-a2fdd4f8-8ada-4e56-8a48-c1ba30fe701a req-6a07bb2a-c255-4455-96fd-a871d4f722bc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Updated VIF entry in instance network info cache for port c694dca0-bf6e-4e89-a43e-1d10b3b23075. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:07:45 compute-0 nova_compute[183075]: 2026-01-22 17:07:45.262 183079 DEBUG nova.network.neutron [req-a2fdd4f8-8ada-4e56-8a48-c1ba30fe701a req-6a07bb2a-c255-4455-96fd-a871d4f722bc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Updating instance_info_cache with network_info: [{"id": "c694dca0-bf6e-4e89-a43e-1d10b3b23075", "address": "fa:16:3e:0a:25:d6", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc694dca0-bf", "ovs_interfaceid": "c694dca0-bf6e-4e89-a43e-1d10b3b23075", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:07:45 compute-0 nova_compute[183075]: 2026-01-22 17:07:45.278 183079 DEBUG oslo_concurrency.lockutils [req-a2fdd4f8-8ada-4e56-8a48-c1ba30fe701a req-6a07bb2a-c255-4455-96fd-a871d4f722bc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-000b64b8-bcc5-4bbe-9703-8400a83a27d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:07:45 compute-0 nova_compute[183075]: 2026-01-22 17:07:45.587 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:46.537 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:46.538 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:07:46 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:46 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:46 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:46 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:46 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:46 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:07:46 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ca57ee46-b6e8-4b60-affe-0c1349cb8abe __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:46 compute-0 nova_compute[183075]: 2026-01-22 17:07:46.970 183079 INFO nova.compute.manager [None req-19c5caee-09fb-4ebb-b151-8e8231ac77fc d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Get console output
Jan 22 17:07:46 compute-0 nova_compute[183075]: 2026-01-22 17:07:46.978 183079 DEBUG nova.compute.manager [req-9fc7fed3-394e-47fd-8b3f-6af206f64742 req-21b5e3b8-3aef-4429-b0b2-f879b83cbf21 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Received event network-vif-plugged-c694dca0-bf6e-4e89-a43e-1d10b3b23075 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:07:46 compute-0 nova_compute[183075]: 2026-01-22 17:07:46.979 183079 DEBUG oslo_concurrency.lockutils [req-9fc7fed3-394e-47fd-8b3f-6af206f64742 req-21b5e3b8-3aef-4429-b0b2-f879b83cbf21 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "000b64b8-bcc5-4bbe-9703-8400a83a27d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:46 compute-0 nova_compute[183075]: 2026-01-22 17:07:46.979 183079 DEBUG oslo_concurrency.lockutils [req-9fc7fed3-394e-47fd-8b3f-6af206f64742 req-21b5e3b8-3aef-4429-b0b2-f879b83cbf21 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "000b64b8-bcc5-4bbe-9703-8400a83a27d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:46 compute-0 nova_compute[183075]: 2026-01-22 17:07:46.980 183079 DEBUG oslo_concurrency.lockutils [req-9fc7fed3-394e-47fd-8b3f-6af206f64742 req-21b5e3b8-3aef-4429-b0b2-f879b83cbf21 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "000b64b8-bcc5-4bbe-9703-8400a83a27d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:46 compute-0 nova_compute[183075]: 2026-01-22 17:07:46.980 183079 DEBUG nova.compute.manager [req-9fc7fed3-394e-47fd-8b3f-6af206f64742 req-21b5e3b8-3aef-4429-b0b2-f879b83cbf21 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] No waiting events found dispatching network-vif-plugged-c694dca0-bf6e-4e89-a43e-1d10b3b23075 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:07:46 compute-0 nova_compute[183075]: 2026-01-22 17:07:46.980 183079 WARNING nova.compute.manager [req-9fc7fed3-394e-47fd-8b3f-6af206f64742 req-21b5e3b8-3aef-4429-b0b2-f879b83cbf21 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Received unexpected event network-vif-plugged-c694dca0-bf6e-4e89-a43e-1d10b3b23075 for instance with vm_state active and task_state None.
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:47.832 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:47.832 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 1.2944279
Jan 22 17:07:47 compute-0 haproxy-metadata-proxy-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215270]: 10.100.0.10:60726 [22/Jan/2026:17:07:46.536] listener listener/metadata 0/0/0/1296/1296 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:47.843 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:47.844 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ca57ee46-b6e8-4b60-affe-0c1349cb8abe __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:47.870 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:47.871 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 167 time: 0.0269630
Jan 22 17:07:47 compute-0 haproxy-metadata-proxy-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215270]: 10.100.0.10:60728 [22/Jan/2026:17:07:47.842] listener listener/metadata 0/0/0/28/28 200 151 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:47.876 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:47.877 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ca57ee46-b6e8-4b60-affe-0c1349cb8abe __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:47.893 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:47.893 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0161507
Jan 22 17:07:47 compute-0 haproxy-metadata-proxy-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215270]: 10.100.0.10:60732 [22/Jan/2026:17:07:47.876] listener listener/metadata 0/0/0/17/17 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:47.898 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:47.899 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ca57ee46-b6e8-4b60-affe-0c1349cb8abe __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:47.918 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:47.918 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0196526
Jan 22 17:07:47 compute-0 haproxy-metadata-proxy-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215270]: 10.100.0.10:60736 [22/Jan/2026:17:07:47.898] listener listener/metadata 0/0/0/20/20 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:47.924 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:47.925 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ca57ee46-b6e8-4b60-affe-0c1349cb8abe __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:47.945 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:47.946 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0207238
Jan 22 17:07:47 compute-0 haproxy-metadata-proxy-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215270]: 10.100.0.10:60742 [22/Jan/2026:17:07:47.923] listener listener/metadata 0/0/0/22/22 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:47.950 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:47.951 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ca57ee46-b6e8-4b60-affe-0c1349cb8abe __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:47.972 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:47 compute-0 haproxy-metadata-proxy-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215270]: 10.100.0.10:60750 [22/Jan/2026:17:07:47.950] listener listener/metadata 0/0/0/22/22 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:47.973 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0222225
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:47.977 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:47.978 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:07:47 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ca57ee46-b6e8-4b60-affe-0c1349cb8abe __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.000 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.000 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0222840
Jan 22 17:07:48 compute-0 haproxy-metadata-proxy-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215270]: 10.100.0.10:60752 [22/Jan/2026:17:07:47.977] listener listener/metadata 0/0/0/23/23 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.005 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.005 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ca57ee46-b6e8-4b60-affe-0c1349cb8abe __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.025 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.025 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0197308
Jan 22 17:07:48 compute-0 haproxy-metadata-proxy-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215270]: 10.100.0.10:60766 [22/Jan/2026:17:07:48.004] listener listener/metadata 0/0/0/20/20 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.030 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.031 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ca57ee46-b6e8-4b60-affe-0c1349cb8abe __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.051 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.051 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 165 time: 0.0203850
Jan 22 17:07:48 compute-0 haproxy-metadata-proxy-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215270]: 10.100.0.10:60774 [22/Jan/2026:17:07:48.030] listener listener/metadata 0/0/0/21/21 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.056 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.056 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ca57ee46-b6e8-4b60-affe-0c1349cb8abe __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:48 compute-0 nova_compute[183075]: 2026-01-22 17:07:48.069 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.077 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.077 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 165 time: 0.0204821
Jan 22 17:07:48 compute-0 haproxy-metadata-proxy-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215270]: 10.100.0.10:60786 [22/Jan/2026:17:07:48.055] listener listener/metadata 0/0/0/21/21 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.082 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.083 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ca57ee46-b6e8-4b60-affe-0c1349cb8abe __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:48 compute-0 haproxy-metadata-proxy-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215270]: 10.100.0.10:60794 [22/Jan/2026:17:07:48.081] listener listener/metadata 0/0/0/28/28 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.110 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0270848
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.119 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.120 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ca57ee46-b6e8-4b60-affe-0c1349cb8abe __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.139 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.139 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0191212
Jan 22 17:07:48 compute-0 haproxy-metadata-proxy-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215270]: 10.100.0.10:60796 [22/Jan/2026:17:07:48.119] listener listener/metadata 0/0/0/20/20 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.143 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.144 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ca57ee46-b6e8-4b60-affe-0c1349cb8abe __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.162 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.163 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0187793
Jan 22 17:07:48 compute-0 haproxy-metadata-proxy-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215270]: 10.100.0.10:60802 [22/Jan/2026:17:07:48.143] listener listener/metadata 0/0/0/19/19 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.167 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.168 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ca57ee46-b6e8-4b60-affe-0c1349cb8abe __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.187 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:48 compute-0 haproxy-metadata-proxy-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215270]: 10.100.0.10:60808 [22/Jan/2026:17:07:48.167] listener listener/metadata 0/0/0/19/19 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.187 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0192802
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.192 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.193 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ca57ee46-b6e8-4b60-affe-0c1349cb8abe __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.220 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:48 compute-0 haproxy-metadata-proxy-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215270]: 10.100.0.10:60818 [22/Jan/2026:17:07:48.192] listener listener/metadata 0/0/0/29/29 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.222 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 165 time: 0.0289993
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.226 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.227 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ca57ee46-b6e8-4b60-affe-0c1349cb8abe __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.246 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:07:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:48.246 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0196390
Jan 22 17:07:48 compute-0 haproxy-metadata-proxy-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215270]: 10.100.0.10:60822 [22/Jan/2026:17:07:48.226] listener listener/metadata 0/0/0/20/20 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:07:49 compute-0 podman[215822]: 2026-01-22 17:07:49.378028944 +0000 UTC m=+0.082798086 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 17:07:49 compute-0 podman[215821]: 2026-01-22 17:07:49.464449724 +0000 UTC m=+0.164174724 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, 
config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 22 17:07:49 compute-0 nova_compute[183075]: 2026-01-22 17:07:49.694 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:49 compute-0 nova_compute[183075]: 2026-01-22 17:07:49.885 183079 INFO nova.compute.manager [None req-f04be975-62f9-4a36-a271-233e7c6f3d0b c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Get console output
Jan 22 17:07:49 compute-0 nova_compute[183075]: 2026-01-22 17:07:49.893 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:07:51 compute-0 podman[215866]: 2026-01-22 17:07:51.366520742 +0000 UTC m=+0.065764391 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, architecture=x86_64, vcs-type=git, release=1755695350, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image 
Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, version=9.6)
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.452 183079 DEBUG oslo_concurrency.lockutils [None req-2c0bc1ab-490c-468a-a0a5-6ed44b9847e8 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Acquiring lock "b9680bfd-e87f-427c-8f13-2b3a415aca39" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.453 183079 DEBUG oslo_concurrency.lockutils [None req-2c0bc1ab-490c-468a-a0a5-6ed44b9847e8 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "b9680bfd-e87f-427c-8f13-2b3a415aca39" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.453 183079 DEBUG oslo_concurrency.lockutils [None req-2c0bc1ab-490c-468a-a0a5-6ed44b9847e8 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Acquiring lock "b9680bfd-e87f-427c-8f13-2b3a415aca39-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.453 183079 DEBUG oslo_concurrency.lockutils [None req-2c0bc1ab-490c-468a-a0a5-6ed44b9847e8 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "b9680bfd-e87f-427c-8f13-2b3a415aca39-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.453 183079 DEBUG oslo_concurrency.lockutils [None req-2c0bc1ab-490c-468a-a0a5-6ed44b9847e8 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "b9680bfd-e87f-427c-8f13-2b3a415aca39-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.454 183079 INFO nova.compute.manager [None req-2c0bc1ab-490c-468a-a0a5-6ed44b9847e8 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Terminating instance
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.455 183079 DEBUG nova.compute.manager [None req-2c0bc1ab-490c-468a-a0a5-6ed44b9847e8 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:07:51 compute-0 kernel: tap4b2da161-22 (unregistering): left promiscuous mode
Jan 22 17:07:51 compute-0 NetworkManager[55454]: <info>  [1769101671.4805] device (tap4b2da161-22): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.495 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:51 compute-0 ovn_controller[95372]: 2026-01-22T17:07:51Z|00055|binding|INFO|Releasing lport 4b2da161-220a-41f0-b689-1c6e25e8b81a from this chassis (sb_readonly=0)
Jan 22 17:07:51 compute-0 ovn_controller[95372]: 2026-01-22T17:07:51Z|00056|binding|INFO|Setting lport 4b2da161-220a-41f0-b689-1c6e25e8b81a down in Southbound
Jan 22 17:07:51 compute-0 ovn_controller[95372]: 2026-01-22T17:07:51Z|00057|binding|INFO|Removing iface tap4b2da161-22 ovn-installed in OVS
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.500 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:51.507 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:ea:7f 10.100.0.10'], port_security=['fa:16:3e:22:ea:7f 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'b9680bfd-e87f-427c-8f13-2b3a415aca39', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ca57ee46-b6e8-4b60-affe-0c1349cb8abe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cc642b97aa4e4886902a0d1233877b88', 'neutron:revision_number': '4', 'neutron:security_group_ids': '78ff1eac-eb3f-4f3f-baa4-7b8094682c5f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ad421178-e3d9-4f12-a6a2-84aabbd8f25c, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=4b2da161-220a-41f0-b689-1c6e25e8b81a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:07:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:51.509 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 4b2da161-220a-41f0-b689-1c6e25e8b81a in datapath ca57ee46-b6e8-4b60-affe-0c1349cb8abe unbound from our chassis
Jan 22 17:07:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:51.513 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ca57ee46-b6e8-4b60-affe-0c1349cb8abe
Jan 22 17:07:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:51.553 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[75d4dad9-d2dc-48a0-a7de-941bfca2c1f9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.576 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:51 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Deactivated successfully.
Jan 22 17:07:51 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Consumed 12.974s CPU time.
Jan 22 17:07:51 compute-0 systemd-machined[154382]: Machine qemu-4-instance-00000004 terminated.
Jan 22 17:07:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:51.596 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[e352c515-cb98-4e56-ac13-a7317f281e6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:51.599 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[f546b62d-fde4-4354-91b8-215d0e4e62d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:51.641 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[53a7b116-235a-4c7e-8ae9-3555a8426260]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:51.659 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[45552517-2143-42ec-8fd9-946cc6a11632]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapca57ee46-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:69:1c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 202, 'tx_packets': 106, 'rx_bytes': 17308, 'tx_bytes': 12093, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 202, 'tx_packets': 106, 'rx_bytes': 17308, 'tx_bytes': 12093, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 397358, 'reachable_time': 27603, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215899, 'error': None, 'target': 'ovnmeta-ca57ee46-b6e8-4b60-affe-0c1349cb8abe', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:51.676 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[5d8de8e6-477d-49a3-88ab-fb5553d676ed]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapca57ee46-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 397371, 'tstamp': 397371}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215900, 'error': None, 'target': 'ovnmeta-ca57ee46-b6e8-4b60-affe-0c1349cb8abe', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapca57ee46-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 397375, 'tstamp': 397375}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215900, 'error': None, 'target': 'ovnmeta-ca57ee46-b6e8-4b60-affe-0c1349cb8abe', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:51.678 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapca57ee46-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.681 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.685 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:51.686 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapca57ee46-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:07:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:51.686 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:07:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:51.687 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapca57ee46-b0, col_values=(('external_ids', {'iface-id': '75ce3d88-09ff-4158-a831-34ffcbec4888'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:07:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:51.687 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.714 183079 INFO nova.virt.libvirt.driver [-] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Instance destroyed successfully.
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.715 183079 DEBUG nova.objects.instance [None req-2c0bc1ab-490c-468a-a0a5-6ed44b9847e8 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lazy-loading 'resources' on Instance uuid b9680bfd-e87f-427c-8f13-2b3a415aca39 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.722 183079 DEBUG nova.compute.manager [req-385bb82a-14b2-4b17-aa0d-7e673b50d75c req-4a3356a9-a3f3-42b4-841a-6e51a09c10bd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Received event network-vif-unplugged-4b2da161-220a-41f0-b689-1c6e25e8b81a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.723 183079 DEBUG oslo_concurrency.lockutils [req-385bb82a-14b2-4b17-aa0d-7e673b50d75c req-4a3356a9-a3f3-42b4-841a-6e51a09c10bd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "b9680bfd-e87f-427c-8f13-2b3a415aca39-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.723 183079 DEBUG oslo_concurrency.lockutils [req-385bb82a-14b2-4b17-aa0d-7e673b50d75c req-4a3356a9-a3f3-42b4-841a-6e51a09c10bd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b9680bfd-e87f-427c-8f13-2b3a415aca39-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.724 183079 DEBUG oslo_concurrency.lockutils [req-385bb82a-14b2-4b17-aa0d-7e673b50d75c req-4a3356a9-a3f3-42b4-841a-6e51a09c10bd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b9680bfd-e87f-427c-8f13-2b3a415aca39-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.724 183079 DEBUG nova.compute.manager [req-385bb82a-14b2-4b17-aa0d-7e673b50d75c req-4a3356a9-a3f3-42b4-841a-6e51a09c10bd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] No waiting events found dispatching network-vif-unplugged-4b2da161-220a-41f0-b689-1c6e25e8b81a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.724 183079 DEBUG nova.compute.manager [req-385bb82a-14b2-4b17-aa0d-7e673b50d75c req-4a3356a9-a3f3-42b4-841a-6e51a09c10bd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Received event network-vif-unplugged-4b2da161-220a-41f0-b689-1c6e25e8b81a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.739 183079 DEBUG nova.virt.libvirt.vif [None req-2c0bc1ab-490c-468a-a0a5-6ed44b9847e8 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:07:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-783174209',display_name='tempest-server-test-783174209',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-783174209',id=4,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFwDv7A3xWV6IJ9sHJD6IHIGofwjIy3SwtF7ZsSnO9i7yxDOnvofgvCRbmYwkVhe4LG2M7JC1Bh9mcomUiffvFuBa9GwDItaNN685Z4fyZXr+GZx+rbje/8Qtcf+s+bYwA==',key_name='tempest-keypair-test-76774833',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:07:27Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cc642b97aa4e4886902a0d1233877b88',ramdisk_id='',reservation_id='r-8jpndd40',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-DefaultSnatToExternal-1301723521',owner_user_name='tempest-DefaultSnatToExternal-1301723521-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:07:27Z,user_data=None,user_id='c9cdd80799a74efb8ce82cfb5148ac89',uuid=b9680bfd-e87f-427c-8f13-2b3a415aca39,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4b2da161-220a-41f0-b689-1c6e25e8b81a", "address": "fa:16:3e:22:ea:7f", "network": {"id": "ca57ee46-b6e8-4b60-affe-0c1349cb8abe", "bridge": "br-int", "label": "tempest-test-network--418767780", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cc642b97aa4e4886902a0d1233877b88", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2da161-22", "ovs_interfaceid": "4b2da161-220a-41f0-b689-1c6e25e8b81a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.739 183079 DEBUG nova.network.os_vif_util [None req-2c0bc1ab-490c-468a-a0a5-6ed44b9847e8 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Converting VIF {"id": "4b2da161-220a-41f0-b689-1c6e25e8b81a", "address": "fa:16:3e:22:ea:7f", "network": {"id": "ca57ee46-b6e8-4b60-affe-0c1349cb8abe", "bridge": "br-int", "label": "tempest-test-network--418767780", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cc642b97aa4e4886902a0d1233877b88", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2da161-22", "ovs_interfaceid": "4b2da161-220a-41f0-b689-1c6e25e8b81a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.740 183079 DEBUG nova.network.os_vif_util [None req-2c0bc1ab-490c-468a-a0a5-6ed44b9847e8 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:ea:7f,bridge_name='br-int',has_traffic_filtering=True,id=4b2da161-220a-41f0-b689-1c6e25e8b81a,network=Network(ca57ee46-b6e8-4b60-affe-0c1349cb8abe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4b2da161-22') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.741 183079 DEBUG os_vif [None req-2c0bc1ab-490c-468a-a0a5-6ed44b9847e8 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:ea:7f,bridge_name='br-int',has_traffic_filtering=True,id=4b2da161-220a-41f0-b689-1c6e25e8b81a,network=Network(ca57ee46-b6e8-4b60-affe-0c1349cb8abe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4b2da161-22') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.743 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.748 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b2da161-22, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.754 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.757 183079 INFO os_vif [None req-2c0bc1ab-490c-468a-a0a5-6ed44b9847e8 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:ea:7f,bridge_name='br-int',has_traffic_filtering=True,id=4b2da161-220a-41f0-b689-1c6e25e8b81a,network=Network(ca57ee46-b6e8-4b60-affe-0c1349cb8abe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4b2da161-22')
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.758 183079 INFO nova.virt.libvirt.driver [None req-2c0bc1ab-490c-468a-a0a5-6ed44b9847e8 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Deleting instance files /var/lib/nova/instances/b9680bfd-e87f-427c-8f13-2b3a415aca39_del
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.759 183079 INFO nova.virt.libvirt.driver [None req-2c0bc1ab-490c-468a-a0a5-6ed44b9847e8 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Deletion of /var/lib/nova/instances/b9680bfd-e87f-427c-8f13-2b3a415aca39_del complete
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.830 183079 INFO nova.compute.manager [None req-2c0bc1ab-490c-468a-a0a5-6ed44b9847e8 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Took 0.38 seconds to destroy the instance on the hypervisor.
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.831 183079 DEBUG oslo.service.loopingcall [None req-2c0bc1ab-490c-468a-a0a5-6ed44b9847e8 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.832 183079 DEBUG nova.compute.manager [-] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.832 183079 DEBUG nova.network.neutron [-] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.890 183079 DEBUG oslo_concurrency.lockutils [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Acquiring lock "4bb7efdc-59ab-46cd-ae0d-582182c85f5b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.891 183079 DEBUG oslo_concurrency.lockutils [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "4bb7efdc-59ab-46cd-ae0d-582182c85f5b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:51 compute-0 nova_compute[183075]: 2026-01-22 17:07:51.911 183079 DEBUG nova.compute.manager [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.023 183079 DEBUG oslo_concurrency.lockutils [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.024 183079 DEBUG oslo_concurrency.lockutils [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.031 183079 DEBUG nova.virt.hardware [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.031 183079 INFO nova.compute.claims [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.087 183079 INFO nova.compute.manager [None req-fbc02cdf-7b48-43ce-99cd-71968d686038 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Get console output
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.092 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.276 183079 DEBUG nova.compute.provider_tree [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.290 183079 DEBUG nova.scheduler.client.report [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.311 183079 DEBUG oslo_concurrency.lockutils [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.288s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.312 183079 DEBUG nova.compute.manager [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.350 183079 DEBUG nova.compute.manager [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.351 183079 DEBUG nova.network.neutron [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.366 183079 INFO nova.virt.libvirt.driver [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.412 183079 DEBUG nova.compute.manager [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.508 183079 DEBUG nova.compute.manager [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.510 183079 DEBUG nova.virt.libvirt.driver [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.511 183079 INFO nova.virt.libvirt.driver [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Creating image(s)
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.512 183079 DEBUG oslo_concurrency.lockutils [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Acquiring lock "/var/lib/nova/instances/4bb7efdc-59ab-46cd-ae0d-582182c85f5b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.512 183079 DEBUG oslo_concurrency.lockutils [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "/var/lib/nova/instances/4bb7efdc-59ab-46cd-ae0d-582182c85f5b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.513 183079 DEBUG oslo_concurrency.lockutils [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "/var/lib/nova/instances/4bb7efdc-59ab-46cd-ae0d-582182c85f5b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.535 183079 DEBUG oslo_concurrency.processutils [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.608 183079 DEBUG oslo_concurrency.processutils [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.610 183079 DEBUG oslo_concurrency.lockutils [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.611 183079 DEBUG oslo_concurrency.lockutils [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.633 183079 DEBUG oslo_concurrency.processutils [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.696 183079 DEBUG oslo_concurrency.processutils [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.697 183079 DEBUG oslo_concurrency.processutils [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/4bb7efdc-59ab-46cd-ae0d-582182c85f5b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.734 183079 DEBUG oslo_concurrency.processutils [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/4bb7efdc-59ab-46cd-ae0d-582182c85f5b/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.735 183079 DEBUG oslo_concurrency.lockutils [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.735 183079 DEBUG oslo_concurrency.processutils [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.799 183079 DEBUG oslo_concurrency.processutils [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.800 183079 DEBUG nova.virt.disk.api [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Checking if we can resize image /var/lib/nova/instances/4bb7efdc-59ab-46cd-ae0d-582182c85f5b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.800 183079 DEBUG oslo_concurrency.processutils [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4bb7efdc-59ab-46cd-ae0d-582182c85f5b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.886 183079 DEBUG oslo_concurrency.processutils [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4bb7efdc-59ab-46cd-ae0d-582182c85f5b/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.887 183079 DEBUG nova.virt.disk.api [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Cannot resize image /var/lib/nova/instances/4bb7efdc-59ab-46cd-ae0d-582182c85f5b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.888 183079 DEBUG nova.objects.instance [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lazy-loading 'migration_context' on Instance uuid 4bb7efdc-59ab-46cd-ae0d-582182c85f5b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.925 183079 DEBUG nova.virt.libvirt.driver [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.925 183079 DEBUG nova.virt.libvirt.driver [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Ensure instance console log exists: /var/lib/nova/instances/4bb7efdc-59ab-46cd-ae0d-582182c85f5b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.926 183079 DEBUG oslo_concurrency.lockutils [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.926 183079 DEBUG oslo_concurrency.lockutils [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:52 compute-0 nova_compute[183075]: 2026-01-22 17:07:52.926 183079 DEBUG oslo_concurrency.lockutils [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:53 compute-0 nova_compute[183075]: 2026-01-22 17:07:53.567 183079 DEBUG nova.network.neutron [-] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:07:53 compute-0 nova_compute[183075]: 2026-01-22 17:07:53.571 183079 DEBUG nova.network.neutron [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Successfully created port: 65d8ece3-00e3-43f9-8231-6893ea4cf9a4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:07:53 compute-0 nova_compute[183075]: 2026-01-22 17:07:53.585 183079 INFO nova.compute.manager [-] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Took 1.75 seconds to deallocate network for instance.
Jan 22 17:07:53 compute-0 nova_compute[183075]: 2026-01-22 17:07:53.624 183079 DEBUG oslo_concurrency.lockutils [None req-2c0bc1ab-490c-468a-a0a5-6ed44b9847e8 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:53 compute-0 nova_compute[183075]: 2026-01-22 17:07:53.624 183079 DEBUG oslo_concurrency.lockutils [None req-2c0bc1ab-490c-468a-a0a5-6ed44b9847e8 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:53 compute-0 nova_compute[183075]: 2026-01-22 17:07:53.741 183079 DEBUG nova.compute.provider_tree [None req-2c0bc1ab-490c-468a-a0a5-6ed44b9847e8 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:07:53 compute-0 nova_compute[183075]: 2026-01-22 17:07:53.755 183079 DEBUG nova.scheduler.client.report [None req-2c0bc1ab-490c-468a-a0a5-6ed44b9847e8 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:07:53 compute-0 nova_compute[183075]: 2026-01-22 17:07:53.775 183079 DEBUG oslo_concurrency.lockutils [None req-2c0bc1ab-490c-468a-a0a5-6ed44b9847e8 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:53 compute-0 nova_compute[183075]: 2026-01-22 17:07:53.799 183079 INFO nova.scheduler.client.report [None req-2c0bc1ab-490c-468a-a0a5-6ed44b9847e8 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Deleted allocations for instance b9680bfd-e87f-427c-8f13-2b3a415aca39
Jan 22 17:07:53 compute-0 nova_compute[183075]: 2026-01-22 17:07:53.809 183079 DEBUG nova.compute.manager [req-7e1588e1-2b77-4e6f-afae-f50f504a82d9 req-60b6182f-0c3a-41b5-9273-67ab6d73e235 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Received event network-vif-plugged-4b2da161-220a-41f0-b689-1c6e25e8b81a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:07:53 compute-0 nova_compute[183075]: 2026-01-22 17:07:53.809 183079 DEBUG oslo_concurrency.lockutils [req-7e1588e1-2b77-4e6f-afae-f50f504a82d9 req-60b6182f-0c3a-41b5-9273-67ab6d73e235 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "b9680bfd-e87f-427c-8f13-2b3a415aca39-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:53 compute-0 nova_compute[183075]: 2026-01-22 17:07:53.810 183079 DEBUG oslo_concurrency.lockutils [req-7e1588e1-2b77-4e6f-afae-f50f504a82d9 req-60b6182f-0c3a-41b5-9273-67ab6d73e235 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b9680bfd-e87f-427c-8f13-2b3a415aca39-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:53 compute-0 nova_compute[183075]: 2026-01-22 17:07:53.810 183079 DEBUG oslo_concurrency.lockutils [req-7e1588e1-2b77-4e6f-afae-f50f504a82d9 req-60b6182f-0c3a-41b5-9273-67ab6d73e235 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b9680bfd-e87f-427c-8f13-2b3a415aca39-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:53 compute-0 nova_compute[183075]: 2026-01-22 17:07:53.810 183079 DEBUG nova.compute.manager [req-7e1588e1-2b77-4e6f-afae-f50f504a82d9 req-60b6182f-0c3a-41b5-9273-67ab6d73e235 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] No waiting events found dispatching network-vif-plugged-4b2da161-220a-41f0-b689-1c6e25e8b81a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:07:53 compute-0 nova_compute[183075]: 2026-01-22 17:07:53.811 183079 WARNING nova.compute.manager [req-7e1588e1-2b77-4e6f-afae-f50f504a82d9 req-60b6182f-0c3a-41b5-9273-67ab6d73e235 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Received unexpected event network-vif-plugged-4b2da161-220a-41f0-b689-1c6e25e8b81a for instance with vm_state deleted and task_state None.
Jan 22 17:07:53 compute-0 nova_compute[183075]: 2026-01-22 17:07:53.858 183079 DEBUG oslo_concurrency.lockutils [None req-2c0bc1ab-490c-468a-a0a5-6ed44b9847e8 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "b9680bfd-e87f-427c-8f13-2b3a415aca39" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.405s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:54 compute-0 nova_compute[183075]: 2026-01-22 17:07:54.639 183079 DEBUG oslo_concurrency.lockutils [None req-e4de29b1-671b-4a9b-94eb-03974f244909 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Acquiring lock "bd249764-12e4-4e25-9445-dd6e132ca53c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:54 compute-0 nova_compute[183075]: 2026-01-22 17:07:54.640 183079 DEBUG oslo_concurrency.lockutils [None req-e4de29b1-671b-4a9b-94eb-03974f244909 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "bd249764-12e4-4e25-9445-dd6e132ca53c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:54 compute-0 nova_compute[183075]: 2026-01-22 17:07:54.641 183079 DEBUG oslo_concurrency.lockutils [None req-e4de29b1-671b-4a9b-94eb-03974f244909 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Acquiring lock "bd249764-12e4-4e25-9445-dd6e132ca53c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:54 compute-0 nova_compute[183075]: 2026-01-22 17:07:54.642 183079 DEBUG oslo_concurrency.lockutils [None req-e4de29b1-671b-4a9b-94eb-03974f244909 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "bd249764-12e4-4e25-9445-dd6e132ca53c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:54 compute-0 nova_compute[183075]: 2026-01-22 17:07:54.642 183079 DEBUG oslo_concurrency.lockutils [None req-e4de29b1-671b-4a9b-94eb-03974f244909 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "bd249764-12e4-4e25-9445-dd6e132ca53c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:54 compute-0 nova_compute[183075]: 2026-01-22 17:07:54.644 183079 INFO nova.compute.manager [None req-e4de29b1-671b-4a9b-94eb-03974f244909 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Terminating instance
Jan 22 17:07:54 compute-0 nova_compute[183075]: 2026-01-22 17:07:54.646 183079 DEBUG nova.compute.manager [None req-e4de29b1-671b-4a9b-94eb-03974f244909 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:07:54 compute-0 kernel: tap6b397961-0e (unregistering): left promiscuous mode
Jan 22 17:07:54 compute-0 NetworkManager[55454]: <info>  [1769101674.6725] device (tap6b397961-0e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:07:54 compute-0 ovn_controller[95372]: 2026-01-22T17:07:54Z|00058|binding|INFO|Releasing lport 6b397961-0eb5-4ccd-8c0a-f433961cd08a from this chassis (sb_readonly=0)
Jan 22 17:07:54 compute-0 ovn_controller[95372]: 2026-01-22T17:07:54Z|00059|binding|INFO|Setting lport 6b397961-0eb5-4ccd-8c0a-f433961cd08a down in Southbound
Jan 22 17:07:54 compute-0 ovn_controller[95372]: 2026-01-22T17:07:54Z|00060|binding|INFO|Removing iface tap6b397961-0e ovn-installed in OVS
Jan 22 17:07:54 compute-0 nova_compute[183075]: 2026-01-22 17:07:54.711 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:54.721 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6c:74:51 10.100.0.12'], port_security=['fa:16:3e:6c:74:51 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'bd249764-12e4-4e25-9445-dd6e132ca53c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ca57ee46-b6e8-4b60-affe-0c1349cb8abe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cc642b97aa4e4886902a0d1233877b88', 'neutron:revision_number': '4', 'neutron:security_group_ids': '78ff1eac-eb3f-4f3f-baa4-7b8094682c5f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.245', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ad421178-e3d9-4f12-a6a2-84aabbd8f25c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=6b397961-0eb5-4ccd-8c0a-f433961cd08a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:07:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:54.723 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 6b397961-0eb5-4ccd-8c0a-f433961cd08a in datapath ca57ee46-b6e8-4b60-affe-0c1349cb8abe unbound from our chassis
Jan 22 17:07:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:54.725 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ca57ee46-b6e8-4b60-affe-0c1349cb8abe, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:07:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:54.726 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[58786b81-5856-4915-87b2-378631866cfd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:54.727 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ca57ee46-b6e8-4b60-affe-0c1349cb8abe namespace which is not needed anymore
Jan 22 17:07:54 compute-0 nova_compute[183075]: 2026-01-22 17:07:54.732 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:54 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Deactivated successfully.
Jan 22 17:07:54 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Consumed 15.266s CPU time.
Jan 22 17:07:54 compute-0 systemd-machined[154382]: Machine qemu-2-instance-00000002 terminated.
Jan 22 17:07:54 compute-0 nova_compute[183075]: 2026-01-22 17:07:54.803 183079 DEBUG nova.network.neutron [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Successfully updated port: 65d8ece3-00e3-43f9-8231-6893ea4cf9a4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:07:54 compute-0 nova_compute[183075]: 2026-01-22 17:07:54.819 183079 DEBUG oslo_concurrency.lockutils [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Acquiring lock "refresh_cache-4bb7efdc-59ab-46cd-ae0d-582182c85f5b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:07:54 compute-0 nova_compute[183075]: 2026-01-22 17:07:54.819 183079 DEBUG oslo_concurrency.lockutils [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Acquired lock "refresh_cache-4bb7efdc-59ab-46cd-ae0d-582182c85f5b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:07:54 compute-0 nova_compute[183075]: 2026-01-22 17:07:54.820 183079 DEBUG nova.network.neutron [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:07:54 compute-0 neutron-haproxy-ovnmeta-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215264]: [NOTICE]   (215268) : haproxy version is 2.8.14-c23fe91
Jan 22 17:07:54 compute-0 neutron-haproxy-ovnmeta-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215264]: [NOTICE]   (215268) : path to executable is /usr/sbin/haproxy
Jan 22 17:07:54 compute-0 neutron-haproxy-ovnmeta-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215264]: [WARNING]  (215268) : Exiting Master process...
Jan 22 17:07:54 compute-0 neutron-haproxy-ovnmeta-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215264]: [ALERT]    (215268) : Current worker (215270) exited with code 143 (Terminated)
Jan 22 17:07:54 compute-0 neutron-haproxy-ovnmeta-ca57ee46-b6e8-4b60-affe-0c1349cb8abe[215264]: [WARNING]  (215268) : All workers exited. Exiting... (0)
Jan 22 17:07:54 compute-0 systemd[1]: libpod-aaaa7538d3c0886fe2729d55d17736b07529c6f43da9a107883fbd5fe70c8423.scope: Deactivated successfully.
Jan 22 17:07:54 compute-0 podman[215956]: 2026-01-22 17:07:54.895594315 +0000 UTC m=+0.055405850 container died aaaa7538d3c0886fe2729d55d17736b07529c6f43da9a107883fbd5fe70c8423 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca57ee46-b6e8-4b60-affe-0c1349cb8abe, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:07:54 compute-0 nova_compute[183075]: 2026-01-22 17:07:54.902 183079 INFO nova.virt.libvirt.driver [-] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Instance destroyed successfully.
Jan 22 17:07:54 compute-0 nova_compute[183075]: 2026-01-22 17:07:54.903 183079 DEBUG nova.objects.instance [None req-e4de29b1-671b-4a9b-94eb-03974f244909 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lazy-loading 'resources' on Instance uuid bd249764-12e4-4e25-9445-dd6e132ca53c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:07:54 compute-0 nova_compute[183075]: 2026-01-22 17:07:54.909 183079 DEBUG nova.compute.manager [req-2c2ca593-fd5b-4a98-ba1e-9d5b9549cbb0 req-408cc574-f6ef-4ea2-b4f3-eba2455ef5e0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Received event network-changed-65d8ece3-00e3-43f9-8231-6893ea4cf9a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:07:54 compute-0 nova_compute[183075]: 2026-01-22 17:07:54.909 183079 DEBUG nova.compute.manager [req-2c2ca593-fd5b-4a98-ba1e-9d5b9549cbb0 req-408cc574-f6ef-4ea2-b4f3-eba2455ef5e0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Refreshing instance network info cache due to event network-changed-65d8ece3-00e3-43f9-8231-6893ea4cf9a4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:07:54 compute-0 nova_compute[183075]: 2026-01-22 17:07:54.910 183079 DEBUG oslo_concurrency.lockutils [req-2c2ca593-fd5b-4a98-ba1e-9d5b9549cbb0 req-408cc574-f6ef-4ea2-b4f3-eba2455ef5e0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-4bb7efdc-59ab-46cd-ae0d-582182c85f5b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:07:54 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aaaa7538d3c0886fe2729d55d17736b07529c6f43da9a107883fbd5fe70c8423-userdata-shm.mount: Deactivated successfully.
Jan 22 17:07:54 compute-0 systemd[1]: var-lib-containers-storage-overlay-2244431f15046fcf1df958c192f3e2514255cb947c42b723d6394ed8db21857e-merged.mount: Deactivated successfully.
Jan 22 17:07:54 compute-0 nova_compute[183075]: 2026-01-22 17:07:54.925 183079 DEBUG nova.virt.libvirt.vif [None req-e4de29b1-671b-4a9b-94eb-03974f244909 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:06:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1744620446',display_name='tempest-server-test-1744620446',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1744620446',id=2,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFwDv7A3xWV6IJ9sHJD6IHIGofwjIy3SwtF7ZsSnO9i7yxDOnvofgvCRbmYwkVhe4LG2M7JC1Bh9mcomUiffvFuBa9GwDItaNN685Z4fyZXr+GZx+rbje/8Qtcf+s+bYwA==',key_name='tempest-keypair-test-76774833',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:06:52Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cc642b97aa4e4886902a0d1233877b88',ramdisk_id='',reservation_id='r-o3uwyal5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_i
nput_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-DefaultSnatToExternal-1301723521',owner_user_name='tempest-DefaultSnatToExternal-1301723521-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:06:52Z,user_data=None,user_id='c9cdd80799a74efb8ce82cfb5148ac89',uuid=bd249764-12e4-4e25-9445-dd6e132ca53c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6b397961-0eb5-4ccd-8c0a-f433961cd08a", "address": "fa:16:3e:6c:74:51", "network": {"id": "ca57ee46-b6e8-4b60-affe-0c1349cb8abe", "bridge": "br-int", "label": "tempest-test-network--418767780", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cc642b97aa4e4886902a0d1233877b88", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b397961-0e", "ovs_interfaceid": "6b397961-0eb5-4ccd-8c0a-f433961cd08a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:07:54 compute-0 nova_compute[183075]: 2026-01-22 17:07:54.933 183079 DEBUG nova.network.os_vif_util [None req-e4de29b1-671b-4a9b-94eb-03974f244909 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Converting VIF {"id": "6b397961-0eb5-4ccd-8c0a-f433961cd08a", "address": "fa:16:3e:6c:74:51", "network": {"id": "ca57ee46-b6e8-4b60-affe-0c1349cb8abe", "bridge": "br-int", "label": "tempest-test-network--418767780", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cc642b97aa4e4886902a0d1233877b88", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b397961-0e", "ovs_interfaceid": "6b397961-0eb5-4ccd-8c0a-f433961cd08a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:07:54 compute-0 nova_compute[183075]: 2026-01-22 17:07:54.934 183079 DEBUG nova.network.os_vif_util [None req-e4de29b1-671b-4a9b-94eb-03974f244909 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6c:74:51,bridge_name='br-int',has_traffic_filtering=True,id=6b397961-0eb5-4ccd-8c0a-f433961cd08a,network=Network(ca57ee46-b6e8-4b60-affe-0c1349cb8abe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6b397961-0e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:07:54 compute-0 nova_compute[183075]: 2026-01-22 17:07:54.934 183079 DEBUG os_vif [None req-e4de29b1-671b-4a9b-94eb-03974f244909 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6c:74:51,bridge_name='br-int',has_traffic_filtering=True,id=6b397961-0eb5-4ccd-8c0a-f433961cd08a,network=Network(ca57ee46-b6e8-4b60-affe-0c1349cb8abe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6b397961-0e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:07:54 compute-0 nova_compute[183075]: 2026-01-22 17:07:54.937 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:54 compute-0 nova_compute[183075]: 2026-01-22 17:07:54.937 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6b397961-0e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:07:54 compute-0 nova_compute[183075]: 2026-01-22 17:07:54.940 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:54 compute-0 nova_compute[183075]: 2026-01-22 17:07:54.941 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:54 compute-0 podman[215956]: 2026-01-22 17:07:54.943427006 +0000 UTC m=+0.103238541 container cleanup aaaa7538d3c0886fe2729d55d17736b07529c6f43da9a107883fbd5fe70c8423 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca57ee46-b6e8-4b60-affe-0c1349cb8abe, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:07:54 compute-0 nova_compute[183075]: 2026-01-22 17:07:54.944 183079 INFO os_vif [None req-e4de29b1-671b-4a9b-94eb-03974f244909 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6c:74:51,bridge_name='br-int',has_traffic_filtering=True,id=6b397961-0eb5-4ccd-8c0a-f433961cd08a,network=Network(ca57ee46-b6e8-4b60-affe-0c1349cb8abe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6b397961-0e')
Jan 22 17:07:54 compute-0 nova_compute[183075]: 2026-01-22 17:07:54.944 183079 INFO nova.virt.libvirt.driver [None req-e4de29b1-671b-4a9b-94eb-03974f244909 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Deleting instance files /var/lib/nova/instances/bd249764-12e4-4e25-9445-dd6e132ca53c_del
Jan 22 17:07:54 compute-0 nova_compute[183075]: 2026-01-22 17:07:54.945 183079 INFO nova.virt.libvirt.driver [None req-e4de29b1-671b-4a9b-94eb-03974f244909 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Deletion of /var/lib/nova/instances/bd249764-12e4-4e25-9445-dd6e132ca53c_del complete
Jan 22 17:07:54 compute-0 systemd[1]: libpod-conmon-aaaa7538d3c0886fe2729d55d17736b07529c6f43da9a107883fbd5fe70c8423.scope: Deactivated successfully.
Jan 22 17:07:54 compute-0 nova_compute[183075]: 2026-01-22 17:07:54.995 183079 DEBUG nova.network.neutron [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:07:55 compute-0 nova_compute[183075]: 2026-01-22 17:07:55.002 183079 INFO nova.compute.manager [None req-e4de29b1-671b-4a9b-94eb-03974f244909 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Took 0.36 seconds to destroy the instance on the hypervisor.
Jan 22 17:07:55 compute-0 nova_compute[183075]: 2026-01-22 17:07:55.003 183079 DEBUG oslo.service.loopingcall [None req-e4de29b1-671b-4a9b-94eb-03974f244909 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:07:55 compute-0 nova_compute[183075]: 2026-01-22 17:07:55.003 183079 DEBUG nova.compute.manager [-] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:07:55 compute-0 nova_compute[183075]: 2026-01-22 17:07:55.003 183079 DEBUG nova.network.neutron [-] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:07:55 compute-0 podman[215999]: 2026-01-22 17:07:55.041717656 +0000 UTC m=+0.067771423 container remove aaaa7538d3c0886fe2729d55d17736b07529c6f43da9a107883fbd5fe70c8423 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca57ee46-b6e8-4b60-affe-0c1349cb8abe, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 17:07:55 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:55.048 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1c7794ab-2f04-49f4-b090-c1abaa5516f4]: (4, ('Thu Jan 22 05:07:54 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ca57ee46-b6e8-4b60-affe-0c1349cb8abe (aaaa7538d3c0886fe2729d55d17736b07529c6f43da9a107883fbd5fe70c8423)\naaaa7538d3c0886fe2729d55d17736b07529c6f43da9a107883fbd5fe70c8423\nThu Jan 22 05:07:54 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ca57ee46-b6e8-4b60-affe-0c1349cb8abe (aaaa7538d3c0886fe2729d55d17736b07529c6f43da9a107883fbd5fe70c8423)\naaaa7538d3c0886fe2729d55d17736b07529c6f43da9a107883fbd5fe70c8423\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:55 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:55.049 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[90a19d5c-88be-4a51-880d-2a31ef340268]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:55 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:55.050 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapca57ee46-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:07:55 compute-0 nova_compute[183075]: 2026-01-22 17:07:55.053 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:55 compute-0 kernel: tapca57ee46-b0: left promiscuous mode
Jan 22 17:07:55 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:55.059 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ba23c7f3-1396-4ccf-88b8-52f09557efd3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:55 compute-0 nova_compute[183075]: 2026-01-22 17:07:55.067 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:55 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:55.075 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[23303fec-86dd-4ad4-9271-651ed44661e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:55 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:55.076 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d6d8245c-b5a2-43d2-8494-05d16cb99b6b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:55 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:55.088 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[afbee7d2-4d74-4cc0-b12b-20f7fbcab23d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 397350, 'reachable_time': 26454, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216014, 'error': None, 'target': 'ovnmeta-ca57ee46-b6e8-4b60-affe-0c1349cb8abe', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:55 compute-0 systemd[1]: run-netns-ovnmeta\x2dca57ee46\x2db6e8\x2d4b60\x2daffe\x2d0c1349cb8abe.mount: Deactivated successfully.
Jan 22 17:07:55 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:55.093 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ca57ee46-b6e8-4b60-affe-0c1349cb8abe deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:07:55 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:55.093 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[0d883fc3-a318-4620-a580-e9d8564d338f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:55.851 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}3191194fa35e00b1a719798addec8a2fa76929e8b1428fdbb5d770a6e1527cb3" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Jan 22 17:07:55 compute-0 nova_compute[183075]: 2026-01-22 17:07:55.920 183079 DEBUG nova.compute.manager [req-f39a9ab3-af6e-4f02-b7b1-d14f2aa040d3 req-fca170a8-7bf1-491d-806c-7e3ec2b91b9e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Received event network-vif-unplugged-6b397961-0eb5-4ccd-8c0a-f433961cd08a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:07:55 compute-0 nova_compute[183075]: 2026-01-22 17:07:55.921 183079 DEBUG oslo_concurrency.lockutils [req-f39a9ab3-af6e-4f02-b7b1-d14f2aa040d3 req-fca170a8-7bf1-491d-806c-7e3ec2b91b9e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "bd249764-12e4-4e25-9445-dd6e132ca53c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:55 compute-0 nova_compute[183075]: 2026-01-22 17:07:55.921 183079 DEBUG oslo_concurrency.lockutils [req-f39a9ab3-af6e-4f02-b7b1-d14f2aa040d3 req-fca170a8-7bf1-491d-806c-7e3ec2b91b9e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "bd249764-12e4-4e25-9445-dd6e132ca53c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:55 compute-0 nova_compute[183075]: 2026-01-22 17:07:55.922 183079 DEBUG oslo_concurrency.lockutils [req-f39a9ab3-af6e-4f02-b7b1-d14f2aa040d3 req-fca170a8-7bf1-491d-806c-7e3ec2b91b9e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "bd249764-12e4-4e25-9445-dd6e132ca53c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:55 compute-0 nova_compute[183075]: 2026-01-22 17:07:55.922 183079 DEBUG nova.compute.manager [req-f39a9ab3-af6e-4f02-b7b1-d14f2aa040d3 req-fca170a8-7bf1-491d-806c-7e3ec2b91b9e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] No waiting events found dispatching network-vif-unplugged-6b397961-0eb5-4ccd-8c0a-f433961cd08a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:07:55 compute-0 nova_compute[183075]: 2026-01-22 17:07:55.922 183079 DEBUG nova.compute.manager [req-f39a9ab3-af6e-4f02-b7b1-d14f2aa040d3 req-fca170a8-7bf1-491d-806c-7e3ec2b91b9e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Received event network-vif-unplugged-6b397961-0eb5-4ccd-8c0a-f433961cd08a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:07:55 compute-0 nova_compute[183075]: 2026-01-22 17:07:55.922 183079 DEBUG nova.compute.manager [req-f39a9ab3-af6e-4f02-b7b1-d14f2aa040d3 req-fca170a8-7bf1-491d-806c-7e3ec2b91b9e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Received event network-vif-plugged-6b397961-0eb5-4ccd-8c0a-f433961cd08a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:07:55 compute-0 nova_compute[183075]: 2026-01-22 17:07:55.923 183079 DEBUG oslo_concurrency.lockutils [req-f39a9ab3-af6e-4f02-b7b1-d14f2aa040d3 req-fca170a8-7bf1-491d-806c-7e3ec2b91b9e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "bd249764-12e4-4e25-9445-dd6e132ca53c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:55 compute-0 nova_compute[183075]: 2026-01-22 17:07:55.923 183079 DEBUG oslo_concurrency.lockutils [req-f39a9ab3-af6e-4f02-b7b1-d14f2aa040d3 req-fca170a8-7bf1-491d-806c-7e3ec2b91b9e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "bd249764-12e4-4e25-9445-dd6e132ca53c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:55 compute-0 nova_compute[183075]: 2026-01-22 17:07:55.923 183079 DEBUG oslo_concurrency.lockutils [req-f39a9ab3-af6e-4f02-b7b1-d14f2aa040d3 req-fca170a8-7bf1-491d-806c-7e3ec2b91b9e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "bd249764-12e4-4e25-9445-dd6e132ca53c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:55 compute-0 nova_compute[183075]: 2026-01-22 17:07:55.923 183079 DEBUG nova.compute.manager [req-f39a9ab3-af6e-4f02-b7b1-d14f2aa040d3 req-fca170a8-7bf1-491d-806c-7e3ec2b91b9e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] No waiting events found dispatching network-vif-plugged-6b397961-0eb5-4ccd-8c0a-f433961cd08a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:07:55 compute-0 nova_compute[183075]: 2026-01-22 17:07:55.924 183079 WARNING nova.compute.manager [req-f39a9ab3-af6e-4f02-b7b1-d14f2aa040d3 req-fca170a8-7bf1-491d-806c-7e3ec2b91b9e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Received unexpected event network-vif-plugged-6b397961-0eb5-4ccd-8c0a-f433961cd08a for instance with vm_state active and task_state deleting.
Jan 22 17:07:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:55.989 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Thu, 22 Jan 2026 17:07:55 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-9bfbd503-a941-4172-8288-6fdfa75a7b1b x-openstack-request-id: req-9bfbd503-a941-4172-8288-6fdfa75a7b1b _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Jan 22 17:07:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:55.989 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "8d1ce660-7497-440b-8666-00c695d0b4d2", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/8d1ce660-7497-440b-8666-00c695d0b4d2"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/8d1ce660-7497-440b-8666-00c695d0b4d2"}]}, {"id": "c36c4338-67fc-4ac7-9a68-89ed828dd90b", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/c36c4338-67fc-4ac7-9a68-89ed828dd90b"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/c36c4338-67fc-4ac7-9a68-89ed828dd90b"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Jan 22 17:07:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:55.989 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-9bfbd503-a941-4172-8288-6fdfa75a7b1b request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Jan 22 17:07:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:55.991 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/8d1ce660-7497-440b-8666-00c695d0b4d2 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}3191194fa35e00b1a719798addec8a2fa76929e8b1428fdbb5d770a6e1527cb3" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.042 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Thu, 22 Jan 2026 17:07:55 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-d829dc7d-d1ed-4a0c-912b-69a8286c8e91 x-openstack-request-id: req-d829dc7d-d1ed-4a0c-912b-69a8286c8e91 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.042 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "8d1ce660-7497-440b-8666-00c695d0b4d2", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/8d1ce660-7497-440b-8666-00c695d0b4d2"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/8d1ce660-7497-440b-8666-00c695d0b4d2"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.042 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/8d1ce660-7497-440b-8666-00c695d0b4d2 used request id req-d829dc7d-d1ed-4a0c-912b-69a8286c8e91 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.043 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2', 'name': 'tempest-server-test-1783669850', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'hostId': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.046 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '000b64b8-bcc5-4bbe-9703-8400a83a27d0', 'name': 'tempest-server-test-1172585183', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000005', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'hostId': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.046 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.049 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for effaddee-27ef-49f6-ac5f-2e3258c8d5d2 / tap44437e9e-7b inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.050 12 DEBUG ceilometer.compute.pollsters [-] effaddee-27ef-49f6-ac5f-2e3258c8d5d2/network.outgoing.packets volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.052 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 000b64b8-bcc5-4bbe-9703-8400a83a27d0 / tapc694dca0-bf inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.053 12 DEBUG ceilometer.compute.pollsters [-] 000b64b8-bcc5-4bbe-9703-8400a83a27d0/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9bc8a163-cc3a-4941-8b7b-8e9bbf8a4d62', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 120, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 'instance-00000003-effaddee-27ef-49f6-ac5f-2e3258c8d5d2-tap44437e9e-7b', 'timestamp': '2026-01-22T17:07:56.046920', 'resource_metadata': {'display_name': 'tempest-server-test-1783669850', 'name': 'tap44437e9e-7b', 'instance_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c8:71:80', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap44437e9e-7b'}, 'message_id': 'e53af334-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.807466026, 'message_signature': 'b701c77246302cbffe67b9fdfcf38acd17758816edc6292fa4666615b9a10e85'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 'instance-00000005-000b64b8-bcc5-4bbe-9703-8400a83a27d0-tapc694dca0-bf', 'timestamp': '2026-01-22T17:07:56.046920', 'resource_metadata': {'display_name': 'tempest-server-test-1172585183', 'name': 'tapc694dca0-bf', 'instance_id': '000b64b8-bcc5-4bbe-9703-8400a83a27d0', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:25:d6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc694dca0-bf'}, 'message_id': 'e53b6210-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.811254265, 'message_signature': 'c14d4d8e3babc4adf79afb897e039bc71c1451214feca2274f6c3ebb34c16d05'}]}, 'timestamp': '2026-01-22 17:07:56.053559', '_unique_id': '1b19871412ae4ac2b98cc94b7346fd6b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.059 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.062 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.062 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.063 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-1783669850>, <NovaLikeServer: tempest-server-test-1172585183>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1783669850>, <NovaLikeServer: tempest-server-test-1172585183>]
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.063 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.063 12 DEBUG ceilometer.compute.pollsters [-] effaddee-27ef-49f6-ac5f-2e3258c8d5d2/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.063 12 DEBUG ceilometer.compute.pollsters [-] 000b64b8-bcc5-4bbe-9703-8400a83a27d0/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '37954f36-0069-48b0-8e7a-781171a82a51', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 'instance-00000003-effaddee-27ef-49f6-ac5f-2e3258c8d5d2-tap44437e9e-7b', 'timestamp': '2026-01-22T17:07:56.063483', 'resource_metadata': {'display_name': 'tempest-server-test-1783669850', 'name': 'tap44437e9e-7b', 'instance_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c8:71:80', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap44437e9e-7b'}, 'message_id': 'e53cf6a2-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.807466026, 'message_signature': 'c12636dc4c5606f8c5d75395cab1dca17d3783ddaf5c560cec20012c657f9060'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 'instance-00000005-000b64b8-bcc5-4bbe-9703-8400a83a27d0-tapc694dca0-bf', 'timestamp': '2026-01-22T17:07:56.063483', 'resource_metadata': {'display_name': 'tempest-server-test-1172585183', 'name': 'tapc694dca0-bf', 'instance_id': '000b64b8-bcc5-4bbe-9703-8400a83a27d0', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:25:d6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc694dca0-bf'}, 'message_id': 'e53d04b2-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.811254265, 'message_signature': 'b222bac0d0d2d58d0eb3704466a05eee6c728c923e844679cd0597954ae5cdbb'}]}, 'timestamp': '2026-01-22 17:07:56.064230', '_unique_id': '547687f229e94ba290edf7b9fd75feea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.065 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.066 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.080 12 DEBUG ceilometer.compute.pollsters [-] effaddee-27ef-49f6-ac5f-2e3258c8d5d2/disk.device.read.bytes volume: 30059008 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.099 12 DEBUG ceilometer.compute.pollsters [-] 000b64b8-bcc5-4bbe-9703-8400a83a27d0/disk.device.read.bytes volume: 27626496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8b1e2caa-c940-4f5e-9909-c09890a21ef4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30059008, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2-vda', 'timestamp': '2026-01-22T17:07:56.066150', 'resource_metadata': {'display_name': 'tempest-server-test-1783669850', 'name': 'instance-00000003', 'instance_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e53f92ae-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.826706419, 'message_signature': 'e17c3f7e80ae1bfe6e2eb4c9f65f2be53f91a108139f0bd9ff71d54c6794ee9a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 27626496, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 
'000b64b8-bcc5-4bbe-9703-8400a83a27d0-vda', 'timestamp': '2026-01-22T17:07:56.066150', 'resource_metadata': {'display_name': 'tempest-server-test-1172585183', 'name': 'instance-00000005', 'instance_id': '000b64b8-bcc5-4bbe-9703-8400a83a27d0', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e54275fa-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.841456635, 'message_signature': 'b3e2ceae3692c6ba5918e6c08d86024b9f67d73f64631f2a55491e6c289aa5b0'}]}, 'timestamp': '2026-01-22 17:07:56.099911', '_unique_id': 'f3e60d89f0ba415487493caa5077fda5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.100 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.101 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.101 12 DEBUG ceilometer.compute.pollsters [-] effaddee-27ef-49f6-ac5f-2e3258c8d5d2/disk.device.write.requests volume: 325 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.101 12 DEBUG ceilometer.compute.pollsters [-] 000b64b8-bcc5-4bbe-9703-8400a83a27d0/disk.device.write.requests volume: 200 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5b5356b-1cd3-48e6-9542-0032108e006c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 325, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2-vda', 'timestamp': '2026-01-22T17:07:56.101551', 'resource_metadata': {'display_name': 'tempest-server-test-1783669850', 'name': 'instance-00000003', 'instance_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e542c28a-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.826706419, 'message_signature': '087bc01c4e434b3da8ff5b4c35e812c449ee72be58cea229fd208467884070dc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 200, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 
'resource_id': '000b64b8-bcc5-4bbe-9703-8400a83a27d0-vda', 'timestamp': '2026-01-22T17:07:56.101551', 'resource_metadata': {'display_name': 'tempest-server-test-1172585183', 'name': 'instance-00000005', 'instance_id': '000b64b8-bcc5-4bbe-9703-8400a83a27d0', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e542cc26-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.841456635, 'message_signature': '1e10152161b44ac2a0fc720f0f3318dd007c5953e5a69ccd1e85beee8af8d628'}]}, 'timestamp': '2026-01-22 17:07:56.102054', '_unique_id': 'cb778db55a014556a5818ebc38c1cf7e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.102 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.103 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.103 12 DEBUG ceilometer.compute.pollsters [-] effaddee-27ef-49f6-ac5f-2e3258c8d5d2/network.outgoing.bytes volume: 10531 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.103 12 DEBUG ceilometer.compute.pollsters [-] 000b64b8-bcc5-4bbe-9703-8400a83a27d0/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0589d539-4df5-42c9-a519-9c19d02fc138', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10531, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 'instance-00000003-effaddee-27ef-49f6-ac5f-2e3258c8d5d2-tap44437e9e-7b', 'timestamp': '2026-01-22T17:07:56.103310', 'resource_metadata': {'display_name': 'tempest-server-test-1783669850', 'name': 'tap44437e9e-7b', 'instance_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c8:71:80', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap44437e9e-7b'}, 'message_id': 'e5430696-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.807466026, 'message_signature': 'cb6238c79f3cca097adcc3bbc166047bfe6639819ea9aed614f741aaec22e29b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 'instance-00000005-000b64b8-bcc5-4bbe-9703-8400a83a27d0-tapc694dca0-bf', 'timestamp': '2026-01-22T17:07:56.103310', 'resource_metadata': {'display_name': 'tempest-server-test-1172585183', 'name': 'tapc694dca0-bf', 'instance_id': '000b64b8-bcc5-4bbe-9703-8400a83a27d0', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:25:d6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc694dca0-bf'}, 'message_id': 'e543114a-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.811254265, 'message_signature': 'dc439f6253d3fd181eb68c0e5a49bc78192f7bae90615363b79033fa9f72e4a9'}]}, 'timestamp': '2026-01-22 17:07:56.103824', '_unique_id': 'a40375a1fb774f14bdf66ac1744ee069'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.104 12 DEBUG ceilometer.compute.pollsters [-] effaddee-27ef-49f6-ac5f-2e3258c8d5d2/network.incoming.bytes volume: 7395 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 DEBUG ceilometer.compute.pollsters [-] 000b64b8-bcc5-4bbe-9703-8400a83a27d0/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '80faa782-549c-466b-8522-0956e059bd70', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7395, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 'instance-00000003-effaddee-27ef-49f6-ac5f-2e3258c8d5d2-tap44437e9e-7b', 'timestamp': '2026-01-22T17:07:56.104976', 'resource_metadata': {'display_name': 'tempest-server-test-1783669850', 'name': 'tap44437e9e-7b', 'instance_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c8:71:80', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap44437e9e-7b'}, 'message_id': 'e5434750-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.807466026, 'message_signature': '3301d75004d1db350ad1b1aaa0172275cea8f7110ee20579b9cc26c5376be82e'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 
'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 'instance-00000005-000b64b8-bcc5-4bbe-9703-8400a83a27d0-tapc694dca0-bf', 'timestamp': '2026-01-22T17:07:56.104976', 'resource_metadata': {'display_name': 'tempest-server-test-1172585183', 'name': 'tapc694dca0-bf', 'instance_id': '000b64b8-bcc5-4bbe-9703-8400a83a27d0', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:25:d6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc694dca0-bf'}, 'message_id': 'e5434fb6-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.811254265, 'message_signature': '313b1d82fe6c41e71ad99c487df0ffbeef25ac6e97ddb9178d1849ff0ac07557'}]}, 'timestamp': '2026-01-22 17:07:56.105415', '_unique_id': '70d77ec3109e4843addbc2f8a0e8729d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.105 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.106 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.106 12 DEBUG ceilometer.compute.pollsters [-] effaddee-27ef-49f6-ac5f-2e3258c8d5d2/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.106 12 DEBUG ceilometer.compute.pollsters [-] 000b64b8-bcc5-4bbe-9703-8400a83a27d0/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a302a812-0beb-4141-a943-cd4f446a94a4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 'instance-00000003-effaddee-27ef-49f6-ac5f-2e3258c8d5d2-tap44437e9e-7b', 'timestamp': '2026-01-22T17:07:56.106569', 'resource_metadata': {'display_name': 'tempest-server-test-1783669850', 'name': 'tap44437e9e-7b', 'instance_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c8:71:80', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap44437e9e-7b'}, 'message_id': 'e543868e-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.807466026, 'message_signature': 'cf5237c043c35e281739dc5f78aed7920a59ce75604072b27af7ccc36712a6a1'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 'instance-00000005-000b64b8-bcc5-4bbe-9703-8400a83a27d0-tapc694dca0-bf', 'timestamp': '2026-01-22T17:07:56.106569', 'resource_metadata': {'display_name': 'tempest-server-test-1172585183', 'name': 'tapc694dca0-bf', 'instance_id': '000b64b8-bcc5-4bbe-9703-8400a83a27d0', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:25:d6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc694dca0-bf'}, 'message_id': 'e5438efe-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.811254265, 'message_signature': 'b6dbef52092a5493595e85106f3b233b31512e846057f83517f39f51defe6ab7'}]}, 'timestamp': '2026-01-22 17:07:56.107031', '_unique_id': 'be2bee3e040345a2aa5137fad74ad569'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.107 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.108 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.108 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.108 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1783669850>, <NovaLikeServer: tempest-server-test-1172585183>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1783669850>, <NovaLikeServer: tempest-server-test-1172585183>]
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.108 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.114 12 DEBUG ceilometer.compute.pollsters [-] effaddee-27ef-49f6-ac5f-2e3258c8d5d2/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 DEBUG ceilometer.compute.pollsters [-] 000b64b8-bcc5-4bbe-9703-8400a83a27d0/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f2f34597-30fe-4aa3-a58d-5b4999fe0749', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2-vda', 'timestamp': '2026-01-22T17:07:56.108422', 'resource_metadata': {'display_name': 'tempest-server-test-1783669850', 'name': 'instance-00000003', 'instance_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e544b158-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.868945024, 'message_signature': '9e34aa7c355bd659eba62a86842d0ab94741404d068be636d651f6e8f668c22f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 
'000b64b8-bcc5-4bbe-9703-8400a83a27d0-vda', 'timestamp': '2026-01-22T17:07:56.108422', 'resource_metadata': {'display_name': 'tempest-server-test-1172585183', 'name': 'instance-00000005', 'instance_id': '000b64b8-bcc5-4bbe-9703-8400a83a27d0', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e5456f94-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.874865909, 'message_signature': '1666f446ae987728dd9678eb1d9dbfe3c615dd2acab064aa22fb600a5eb916ad'}]}, 'timestamp': '2026-01-22 17:07:56.119336', '_unique_id': 'a27f543681e0468aab1c4633eeabbcda'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.119 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.120 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.133 12 DEBUG ceilometer.compute.pollsters [-] effaddee-27ef-49f6-ac5f-2e3258c8d5d2/memory.usage volume: 42.69140625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.146 12 DEBUG ceilometer.compute.pollsters [-] 000b64b8-bcc5-4bbe-9703-8400a83a27d0/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5134a238-8055-40df-865b-4b150c384cc3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.69140625, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2', 'timestamp': '2026-01-22T17:07:56.120512', 'resource_metadata': {'display_name': 'tempest-server-test-1783669850', 'name': 'instance-00000003', 'instance_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'e547b678-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.893904246, 'message_signature': 'cbf53a89e6a82bb749e4b25c41abe09ec5b387dd0e41765fa309a3f3efb60a5c'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': '000b64b8-bcc5-4bbe-9703-8400a83a27d0', 'timestamp': 
'2026-01-22T17:07:56.120512', 'resource_metadata': {'display_name': 'tempest-server-test-1172585183', 'name': 'instance-00000005', 'instance_id': '000b64b8-bcc5-4bbe-9703-8400a83a27d0', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'e54995a6-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.906578098, 'message_signature': 'a75b00e6da18b36027e23ba7618812320b0ddd4cb1541c672ef154cdb1518fdc'}]}, 'timestamp': '2026-01-22 17:07:56.146552', '_unique_id': '71981fb770144795af1dfabf927d0b1c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.147 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 DEBUG ceilometer.compute.pollsters [-] effaddee-27ef-49f6-ac5f-2e3258c8d5d2/disk.device.read.requests volume: 1115 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 DEBUG ceilometer.compute.pollsters [-] 000b64b8-bcc5-4bbe-9703-8400a83a27d0/disk.device.read.requests volume: 959 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5df8fe14-4f15-4565-bfe2-884fca9bb731', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1115, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2-vda', 'timestamp': '2026-01-22T17:07:56.148063', 'resource_metadata': {'display_name': 'tempest-server-test-1783669850', 'name': 'instance-00000003', 'instance_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e549da84-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.826706419, 'message_signature': '30d63ea41d944bac16542138c6e951c1c2e506c33ccba81f4016977f583909b4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 959, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 
'resource_id': '000b64b8-bcc5-4bbe-9703-8400a83a27d0-vda', 'timestamp': '2026-01-22T17:07:56.148063', 'resource_metadata': {'display_name': 'tempest-server-test-1172585183', 'name': 'instance-00000005', 'instance_id': '000b64b8-bcc5-4bbe-9703-8400a83a27d0', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e549e376-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.841456635, 'message_signature': '597519d4c5a7cb76f1181ed8baf939ac01674c9127b08520e890b63a322d8eb9'}]}, 'timestamp': '2026-01-22 17:07:56.148514', '_unique_id': 'e5fbcb9a31644e7cb6037b89b3642e2b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.148 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.149 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.149 12 DEBUG ceilometer.compute.pollsters [-] effaddee-27ef-49f6-ac5f-2e3258c8d5d2/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.149 12 DEBUG ceilometer.compute.pollsters [-] 000b64b8-bcc5-4bbe-9703-8400a83a27d0/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e5245755-c0c8-46b4-a60d-5bd6b7c8dff3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 'instance-00000003-effaddee-27ef-49f6-ac5f-2e3258c8d5d2-tap44437e9e-7b', 'timestamp': '2026-01-22T17:07:56.149594', 'resource_metadata': {'display_name': 'tempest-server-test-1783669850', 'name': 'tap44437e9e-7b', 'instance_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c8:71:80', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap44437e9e-7b'}, 'message_id': 'e54a1792-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.807466026, 'message_signature': '01babaa946953be24bc99598d9d842148e02391ab9d5dfedfba19bdec4202c29'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 'instance-00000005-000b64b8-bcc5-4bbe-9703-8400a83a27d0-tapc694dca0-bf', 'timestamp': '2026-01-22T17:07:56.149594', 'resource_metadata': {'display_name': 'tempest-server-test-1172585183', 'name': 'tapc694dca0-bf', 'instance_id': '000b64b8-bcc5-4bbe-9703-8400a83a27d0', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:25:d6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc694dca0-bf'}, 'message_id': 'e54a1f9e-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.811254265, 'message_signature': 'b4bd8b5c7088307832d1ef24d327d06c1c092bdc300ed561c4895d28823a4b5c'}]}, 'timestamp': '2026-01-22 17:07:56.150088', '_unique_id': '8605a8361df149bba13fb410b15fc0d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.150 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.151 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.151 12 DEBUG ceilometer.compute.pollsters [-] effaddee-27ef-49f6-ac5f-2e3258c8d5d2/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.151 12 DEBUG ceilometer.compute.pollsters [-] 000b64b8-bcc5-4bbe-9703-8400a83a27d0/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4c18bd7a-7513-4f42-a173-dad2c6fd5be7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 'instance-00000003-effaddee-27ef-49f6-ac5f-2e3258c8d5d2-tap44437e9e-7b', 'timestamp': '2026-01-22T17:07:56.151207', 'resource_metadata': {'display_name': 'tempest-server-test-1783669850', 'name': 'tap44437e9e-7b', 'instance_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c8:71:80', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap44437e9e-7b'}, 'message_id': 'e54a5536-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.807466026, 'message_signature': '00644c6322b4d97e59690e988cfcd99c1eb93d4334af28eba76a5ee05ad58c7a'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 'instance-00000005-000b64b8-bcc5-4bbe-9703-8400a83a27d0-tapc694dca0-bf', 'timestamp': '2026-01-22T17:07:56.151207', 'resource_metadata': {'display_name': 'tempest-server-test-1172585183', 'name': 'tapc694dca0-bf', 'instance_id': '000b64b8-bcc5-4bbe-9703-8400a83a27d0', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:25:d6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc694dca0-bf'}, 'message_id': 'e54a5d38-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.811254265, 'message_signature': '89784e3e02d3faed0d08df7345c046b689a65ab5d7931f5abba56e19c9e5bc7d'}]}, 'timestamp': '2026-01-22 17:07:56.151654', '_unique_id': '4a200ac0cd92499082a451e2e6017bf5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 DEBUG ceilometer.compute.pollsters [-] effaddee-27ef-49f6-ac5f-2e3258c8d5d2/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.152 12 DEBUG ceilometer.compute.pollsters [-] 000b64b8-bcc5-4bbe-9703-8400a83a27d0/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '785345d6-2f4d-4edf-a5c7-539df0ccef80', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 'instance-00000003-effaddee-27ef-49f6-ac5f-2e3258c8d5d2-tap44437e9e-7b', 'timestamp': '2026-01-22T17:07:56.152725', 'resource_metadata': {'display_name': 'tempest-server-test-1783669850', 'name': 'tap44437e9e-7b', 'instance_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c8:71:80', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap44437e9e-7b'}, 'message_id': 'e54a9082-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.807466026, 'message_signature': 'd850f98ca7faeede72bc56541595c2222e98881d24b1223dea1076dfbf737a7e'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 'instance-00000005-000b64b8-bcc5-4bbe-9703-8400a83a27d0-tapc694dca0-bf', 'timestamp': '2026-01-22T17:07:56.152725', 'resource_metadata': {'display_name': 'tempest-server-test-1172585183', 'name': 'tapc694dca0-bf', 'instance_id': '000b64b8-bcc5-4bbe-9703-8400a83a27d0', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:25:d6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc694dca0-bf'}, 'message_id': 'e54a9884-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.811254265, 'message_signature': 'cdd6fa22ee602cfc13c2f0452c26aaba8dd55c8c2570a45b8a6e4ae2c9ac7810'}]}, 'timestamp': '2026-01-22 17:07:56.153148', '_unique_id': 'baa3607d3541412180ee05e2eedf6c03'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.153 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.154 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.154 12 DEBUG ceilometer.compute.pollsters [-] effaddee-27ef-49f6-ac5f-2e3258c8d5d2/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.154 12 DEBUG ceilometer.compute.pollsters [-] 000b64b8-bcc5-4bbe-9703-8400a83a27d0/disk.device.allocation volume: 29171712 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a7e4f72-ab7d-4092-b418-2c58a331c48f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2-vda', 'timestamp': '2026-01-22T17:07:56.154273', 'resource_metadata': {'display_name': 'tempest-server-test-1783669850', 'name': 'instance-00000003', 'instance_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e54accd2-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.868945024, 'message_signature': '4c974343ac7d936a3ecac7b9af093b4879f1dcd676f2fecca75bd0286ec79fc2'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29171712, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 
'000b64b8-bcc5-4bbe-9703-8400a83a27d0-vda', 'timestamp': '2026-01-22T17:07:56.154273', 'resource_metadata': {'display_name': 'tempest-server-test-1172585183', 'name': 'instance-00000005', 'instance_id': '000b64b8-bcc5-4bbe-9703-8400a83a27d0', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e54ad47a-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.874865909, 'message_signature': '7a93cf29d2bf84940fd54dc168ae97592e35d0c4ff6af57fb7c1ba649bfaafeb'}]}, 'timestamp': '2026-01-22 17:07:56.154695', '_unique_id': 'beb4494f88fe43e98f78612c12eed79a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 DEBUG ceilometer.compute.pollsters [-] effaddee-27ef-49f6-ac5f-2e3258c8d5d2/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.155 12 DEBUG ceilometer.compute.pollsters [-] 000b64b8-bcc5-4bbe-9703-8400a83a27d0/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '847556b4-ae90-4185-94b9-efab42b2e67f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 'instance-00000003-effaddee-27ef-49f6-ac5f-2e3258c8d5d2-tap44437e9e-7b', 'timestamp': '2026-01-22T17:07:56.155753', 'resource_metadata': {'display_name': 'tempest-server-test-1783669850', 'name': 'tap44437e9e-7b', 'instance_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c8:71:80', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap44437e9e-7b'}, 'message_id': 'e54b06ac-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.807466026, 'message_signature': '57fc8bfb702c2e84dcca164e058026af26cb544e5347d3d57a74cf5121afb6c1'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 
'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 'instance-00000005-000b64b8-bcc5-4bbe-9703-8400a83a27d0-tapc694dca0-bf', 'timestamp': '2026-01-22T17:07:56.155753', 'resource_metadata': {'display_name': 'tempest-server-test-1172585183', 'name': 'tapc694dca0-bf', 'instance_id': '000b64b8-bcc5-4bbe-9703-8400a83a27d0', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:25:d6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc694dca0-bf'}, 'message_id': 'e54b0eae-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.811254265, 'message_signature': '62a02e57766c6e9b0524bed926e51a65ed308bf5a880036f5299833e005dffd9'}]}, 'timestamp': '2026-01-22 17:07:56.156174', '_unique_id': '12e44267e04c4f7fac8777196f47d129'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.156 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.157 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.157 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.157 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-1783669850>, <NovaLikeServer: tempest-server-test-1172585183>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1783669850>, <NovaLikeServer: tempest-server-test-1172585183>]
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.157 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.157 12 DEBUG ceilometer.compute.pollsters [-] effaddee-27ef-49f6-ac5f-2e3258c8d5d2/disk.device.read.latency volume: 201432259 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.157 12 DEBUG ceilometer.compute.pollsters [-] 000b64b8-bcc5-4bbe-9703-8400a83a27d0/disk.device.read.latency volume: 156837922 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4ca4eaa-26e3-47bf-9b09-2079eed78f76', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 201432259, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2-vda', 'timestamp': '2026-01-22T17:07:56.157542', 'resource_metadata': {'display_name': 'tempest-server-test-1783669850', 'name': 'instance-00000003', 'instance_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e54b4ce8-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.826706419, 'message_signature': 'eeee8770ef6fbc5d328e84033c8e1702b7370306fb21d44ac03aee1a18f900e9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 156837922, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 
'resource_id': '000b64b8-bcc5-4bbe-9703-8400a83a27d0-vda', 'timestamp': '2026-01-22T17:07:56.157542', 'resource_metadata': {'display_name': 'tempest-server-test-1172585183', 'name': 'instance-00000005', 'instance_id': '000b64b8-bcc5-4bbe-9703-8400a83a27d0', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e54b54cc-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.841456635, 'message_signature': '2c4f02a33ebd1265b37bb0e3b5874c0805612bcf3401c68bc42123fb7e10de09'}]}, 'timestamp': '2026-01-22 17:07:56.157962', '_unique_id': '018232c9d0214c28a7b1c5ffb97acb77'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.158 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 DEBUG ceilometer.compute.pollsters [-] effaddee-27ef-49f6-ac5f-2e3258c8d5d2/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 DEBUG ceilometer.compute.pollsters [-] 000b64b8-bcc5-4bbe-9703-8400a83a27d0/disk.device.usage volume: 28246016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84c79715-4834-4b62-8d3a-1ae37079a682', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2-vda', 'timestamp': '2026-01-22T17:07:56.159061', 'resource_metadata': {'display_name': 'tempest-server-test-1783669850', 'name': 'instance-00000003', 'instance_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e54b8802-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.868945024, 'message_signature': '25c72019a641cb61690e85724d16cf2587328b88087b027fc4410e595d9b3085'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 28246016, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 
'000b64b8-bcc5-4bbe-9703-8400a83a27d0-vda', 'timestamp': '2026-01-22T17:07:56.159061', 'resource_metadata': {'display_name': 'tempest-server-test-1172585183', 'name': 'instance-00000005', 'instance_id': '000b64b8-bcc5-4bbe-9703-8400a83a27d0', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e54b8fdc-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.874865909, 'message_signature': 'ab09afe939b56bc5df151dfa28f1aa4158ae8dafce3af397f005c7ec45a0782c'}]}, 'timestamp': '2026-01-22 17:07:56.159484', '_unique_id': '5dabbb9849e74f51a18dd505ea2d1f8c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.159 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.160 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.160 12 DEBUG ceilometer.compute.pollsters [-] effaddee-27ef-49f6-ac5f-2e3258c8d5d2/cpu volume: 10810000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.160 12 DEBUG ceilometer.compute.pollsters [-] 000b64b8-bcc5-4bbe-9703-8400a83a27d0/cpu volume: 10290000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd694940b-ffa5-4edf-8ca5-28eae2015870', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10810000000, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2', 'timestamp': '2026-01-22T17:07:56.160569', 'resource_metadata': {'display_name': 'tempest-server-test-1783669850', 'name': 'instance-00000003', 'instance_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'e54bc38a-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.893904246, 'message_signature': '133ccf2b2423c03ce1e4b99d7331ce81e1471300342033e6a601ae815347b60e'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10290000000, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': '000b64b8-bcc5-4bbe-9703-8400a83a27d0', 
'timestamp': '2026-01-22T17:07:56.160569', 'resource_metadata': {'display_name': 'tempest-server-test-1172585183', 'name': 'instance-00000005', 'instance_id': '000b64b8-bcc5-4bbe-9703-8400a83a27d0', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'e54bcb3c-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.906578098, 'message_signature': '3353b4362ef7661375d3132a413b9283933311048f29f878ad0c967f70c2badf'}]}, 'timestamp': '2026-01-22 17:07:56.160995', '_unique_id': '063db656207148519a2dffdb7074b2be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.161 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.162 12 DEBUG ceilometer.compute.pollsters [-] effaddee-27ef-49f6-ac5f-2e3258c8d5d2/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.162 12 DEBUG ceilometer.compute.pollsters [-] 000b64b8-bcc5-4bbe-9703-8400a83a27d0/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fc014869-eadf-419f-9e3d-6960eb70fd6d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 'instance-00000003-effaddee-27ef-49f6-ac5f-2e3258c8d5d2-tap44437e9e-7b', 'timestamp': '2026-01-22T17:07:56.162059', 'resource_metadata': {'display_name': 'tempest-server-test-1783669850', 'name': 'tap44437e9e-7b', 'instance_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c8:71:80', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap44437e9e-7b'}, 'message_id': 'e54bfcf6-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.807466026, 'message_signature': '776166d72f29ddb8cf0cf94d04b9691e7f351e22bc7b2533c7930c728f15410f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 'instance-00000005-000b64b8-bcc5-4bbe-9703-8400a83a27d0-tapc694dca0-bf', 'timestamp': '2026-01-22T17:07:56.162059', 'resource_metadata': {'display_name': 'tempest-server-test-1172585183', 'name': 'tapc694dca0-bf', 'instance_id': '000b64b8-bcc5-4bbe-9703-8400a83a27d0', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:25:d6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc694dca0-bf'}, 'message_id': 'e54c09d0-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.811254265, 'message_signature': 'fe3b32faed8b0fcbd96d7fa8623cf2fc553cb34bcd106012aa46c5a536817381'}]}, 'timestamp': '2026-01-22 17:07:56.162601', '_unique_id': '2a5f534eafae463f837ade281bd11c89'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 DEBUG ceilometer.compute.pollsters [-] effaddee-27ef-49f6-ac5f-2e3258c8d5d2/disk.device.write.bytes volume: 73011200 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.163 12 DEBUG ceilometer.compute.pollsters [-] 000b64b8-bcc5-4bbe-9703-8400a83a27d0/disk.device.write.bytes volume: 25628672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9b2cfad6-b29c-4113-a8b5-92c339ab54c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73011200, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2-vda', 'timestamp': '2026-01-22T17:07:56.163645', 'resource_metadata': {'display_name': 'tempest-server-test-1783669850', 'name': 'instance-00000003', 'instance_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e54c3ad6-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.826706419, 'message_signature': '4a49103e92484265c06b9a7e5fe2b16ed45d3f52bb1d5fa59712201d36c70de8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 25628672, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': '000b64b8-bcc5-4bbe-9703-8400a83a27d0-vda', 'timestamp': '2026-01-22T17:07:56.163645', 'resource_metadata': {'display_name': 'tempest-server-test-1172585183', 'name': 'instance-00000005', 'instance_id': '000b64b8-bcc5-4bbe-9703-8400a83a27d0', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e54c4242-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.841456635, 'message_signature': '4dbf456dd9295daa9f7ddc5db10d543c1f3edabd14b2f38e188fc3a2d68da24c'}]}, 'timestamp': '2026-01-22 17:07:56.164040', '_unique_id': '932113eaae944117bf8b2349da32d47c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.164 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.165 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.165 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.165 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1783669850>, <NovaLikeServer: tempest-server-test-1172585183>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1783669850>, <NovaLikeServer: tempest-server-test-1172585183>]
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.165 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.165 12 DEBUG ceilometer.compute.pollsters [-] effaddee-27ef-49f6-ac5f-2e3258c8d5d2/disk.device.write.latency volume: 5568919355 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.165 12 DEBUG ceilometer.compute.pollsters [-] 000b64b8-bcc5-4bbe-9703-8400a83a27d0/disk.device.write.latency volume: 3882922950 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a1dd1fe-e498-4f52-b0e8-f5e1c9a7609b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5568919355, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2-vda', 'timestamp': '2026-01-22T17:07:56.165360', 'resource_metadata': {'display_name': 'tempest-server-test-1783669850', 'name': 'instance-00000003', 'instance_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e54c7da2-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.826706419, 'message_signature': 'c4337e7bdd2cc16d9a52cc10828504046f97780f87ed7601ed735b2a5657d6a1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3882922950, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_name': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_name': None, 'resource_id': '000b64b8-bcc5-4bbe-9703-8400a83a27d0-vda', 'timestamp': '2026-01-22T17:07:56.165360', 'resource_metadata': {'display_name': 'tempest-server-test-1172585183', 'name': 'instance-00000005', 'instance_id': '000b64b8-bcc5-4bbe-9703-8400a83a27d0', 'instance_type': 'm1.nano', 'host': 'ef513d9552fa7431b193721e9a208d93d0420a41a95c71b59a237c50', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e54c864e-f7b4-11f0-9e69-fa163eaea1db', 'monotonic_time': 4039.841456635, 'message_signature': 'dc5ced8cdad2c0084893fe5f1833329bee47afbafe324fba418b5e7b777dc49c'}]}, 'timestamp': '2026-01-22 17:07:56.165785', '_unique_id': '7a630a7f2f884d91a24305f1acb8c961'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:07:56 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:07:56.166 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.488 183079 DEBUG nova.network.neutron [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Updating instance_info_cache with network_info: [{"id": "65d8ece3-00e3-43f9-8231-6893ea4cf9a4", "address": "fa:16:3e:eb:3e:ac", "network": {"id": "9fe870b5-173a-4c8a-b406-6fedb3ddcc4e", "bridge": "br-int", "label": "tempest-test-network--1353019071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "572446eea54c4f0baab88bf4419b4082", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65d8ece3-00", "ovs_interfaceid": "65d8ece3-00e3-43f9-8231-6893ea4cf9a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.506 183079 DEBUG oslo_concurrency.lockutils [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Releasing lock "refresh_cache-4bb7efdc-59ab-46cd-ae0d-582182c85f5b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.506 183079 DEBUG nova.compute.manager [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Instance network_info: |[{"id": "65d8ece3-00e3-43f9-8231-6893ea4cf9a4", "address": "fa:16:3e:eb:3e:ac", "network": {"id": "9fe870b5-173a-4c8a-b406-6fedb3ddcc4e", "bridge": "br-int", "label": "tempest-test-network--1353019071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "572446eea54c4f0baab88bf4419b4082", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65d8ece3-00", "ovs_interfaceid": "65d8ece3-00e3-43f9-8231-6893ea4cf9a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.507 183079 DEBUG oslo_concurrency.lockutils [req-2c2ca593-fd5b-4a98-ba1e-9d5b9549cbb0 req-408cc574-f6ef-4ea2-b4f3-eba2455ef5e0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-4bb7efdc-59ab-46cd-ae0d-582182c85f5b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.507 183079 DEBUG nova.network.neutron [req-2c2ca593-fd5b-4a98-ba1e-9d5b9549cbb0 req-408cc574-f6ef-4ea2-b4f3-eba2455ef5e0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Refreshing network info cache for port 65d8ece3-00e3-43f9-8231-6893ea4cf9a4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.509 183079 DEBUG nova.virt.libvirt.driver [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Start _get_guest_xml network_info=[{"id": "65d8ece3-00e3-43f9-8231-6893ea4cf9a4", "address": "fa:16:3e:eb:3e:ac", "network": {"id": "9fe870b5-173a-4c8a-b406-6fedb3ddcc4e", "bridge": "br-int", "label": "tempest-test-network--1353019071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "572446eea54c4f0baab88bf4419b4082", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65d8ece3-00", "ovs_interfaceid": "65d8ece3-00e3-43f9-8231-6893ea4cf9a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.513 183079 WARNING nova.virt.libvirt.driver [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.517 183079 DEBUG nova.virt.libvirt.host [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.517 183079 DEBUG nova.virt.libvirt.host [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.523 183079 DEBUG nova.virt.libvirt.host [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.524 183079 DEBUG nova.virt.libvirt.host [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.524 183079 DEBUG nova.virt.libvirt.driver [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.524 183079 DEBUG nova.virt.hardware [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.524 183079 DEBUG nova.virt.hardware [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.525 183079 DEBUG nova.virt.hardware [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.525 183079 DEBUG nova.virt.hardware [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.525 183079 DEBUG nova.virt.hardware [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.525 183079 DEBUG nova.virt.hardware [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.525 183079 DEBUG nova.virt.hardware [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.526 183079 DEBUG nova.virt.hardware [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.526 183079 DEBUG nova.virt.hardware [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.526 183079 DEBUG nova.virt.hardware [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.526 183079 DEBUG nova.virt.hardware [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.529 183079 DEBUG nova.virt.libvirt.vif [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:07:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1697070694',display_name='tempest-server-test-1697070694',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1697070694',id=6,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOzuS5c+/c3nchlEjzQZcF6sVTY+If+Ronj929KhDV7B+UoVTOX4QPWtIie7F0q5o+yMVIvrndYwVXNXP5sf1WYo75cdEmtnNs2NDMwzYXACOAw67rfC/ZLmfMKPakZxTQ==',key_name='tempest-keypair-test-694600506',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6884ab5c00114ca19f253d0c91e2706f',ramdisk_id='',reservation_id='r-rlpd8jek',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpTestCasesAdmin-1567259425',owner_user_name='tempest-FloatingIpTestCasesAdmin-1567259425-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:07:52Z,user_data=None,user_id='7554977cf766467891ad30986750ca88',uuid=4bb7efdc-59ab-46cd-ae0d-582182c85f5b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "65d8ece3-00e3-43f9-8231-6893ea4cf9a4", "address": "fa:16:3e:eb:3e:ac", "network": {"id": "9fe870b5-173a-4c8a-b406-6fedb3ddcc4e", "bridge": "br-int", "label": "tempest-test-network--1353019071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "572446eea54c4f0baab88bf4419b4082", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65d8ece3-00", "ovs_interfaceid": "65d8ece3-00e3-43f9-8231-6893ea4cf9a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.529 183079 DEBUG nova.network.os_vif_util [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Converting VIF {"id": "65d8ece3-00e3-43f9-8231-6893ea4cf9a4", "address": "fa:16:3e:eb:3e:ac", "network": {"id": "9fe870b5-173a-4c8a-b406-6fedb3ddcc4e", "bridge": "br-int", "label": "tempest-test-network--1353019071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "572446eea54c4f0baab88bf4419b4082", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65d8ece3-00", "ovs_interfaceid": "65d8ece3-00e3-43f9-8231-6893ea4cf9a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.530 183079 DEBUG nova.network.os_vif_util [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:3e:ac,bridge_name='br-int',has_traffic_filtering=True,id=65d8ece3-00e3-43f9-8231-6893ea4cf9a4,network=Network(9fe870b5-173a-4c8a-b406-6fedb3ddcc4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap65d8ece3-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.530 183079 DEBUG nova.objects.instance [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lazy-loading 'pci_devices' on Instance uuid 4bb7efdc-59ab-46cd-ae0d-582182c85f5b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.543 183079 DEBUG nova.virt.libvirt.driver [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:07:56 compute-0 nova_compute[183075]:   <uuid>4bb7efdc-59ab-46cd-ae0d-582182c85f5b</uuid>
Jan 22 17:07:56 compute-0 nova_compute[183075]:   <name>instance-00000006</name>
Jan 22 17:07:56 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:07:56 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:07:56 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:07:56 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1697070694</nova:name>
Jan 22 17:07:56 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:07:56</nova:creationTime>
Jan 22 17:07:56 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:07:56 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:07:56 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:07:56 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:07:56 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:07:56 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:07:56 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:07:56 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:07:56 compute-0 nova_compute[183075]:         <nova:user uuid="7554977cf766467891ad30986750ca88">tempest-FloatingIpTestCasesAdmin-1567259425-project-admin</nova:user>
Jan 22 17:07:56 compute-0 nova_compute[183075]:         <nova:project uuid="6884ab5c00114ca19f253d0c91e2706f">tempest-FloatingIpTestCasesAdmin-1567259425</nova:project>
Jan 22 17:07:56 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:07:56 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:07:56 compute-0 nova_compute[183075]:         <nova:port uuid="65d8ece3-00e3-43f9-8231-6893ea4cf9a4">
Jan 22 17:07:56 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:07:56 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:07:56 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:07:56 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <system>
Jan 22 17:07:56 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:07:56 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:07:56 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:07:56 compute-0 nova_compute[183075]:       <entry name="serial">4bb7efdc-59ab-46cd-ae0d-582182c85f5b</entry>
Jan 22 17:07:56 compute-0 nova_compute[183075]:       <entry name="uuid">4bb7efdc-59ab-46cd-ae0d-582182c85f5b</entry>
Jan 22 17:07:56 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     </system>
Jan 22 17:07:56 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:07:56 compute-0 nova_compute[183075]:   <os>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:   </os>
Jan 22 17:07:56 compute-0 nova_compute[183075]:   <features>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:   </features>
Jan 22 17:07:56 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:07:56 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:07:56 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:07:56 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/4bb7efdc-59ab-46cd-ae0d-582182c85f5b/disk"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:07:56 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:eb:3e:ac"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:       <target dev="tap65d8ece3-00"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:07:56 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/4bb7efdc-59ab-46cd-ae0d-582182c85f5b/console.log" append="off"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <video>
Jan 22 17:07:56 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     </video>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:07:56 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:07:56 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:07:56 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:07:56 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:07:56 compute-0 nova_compute[183075]: </domain>
Jan 22 17:07:56 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.543 183079 DEBUG nova.compute.manager [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Preparing to wait for external event network-vif-plugged-65d8ece3-00e3-43f9-8231-6893ea4cf9a4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.544 183079 DEBUG oslo_concurrency.lockutils [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Acquiring lock "4bb7efdc-59ab-46cd-ae0d-582182c85f5b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.544 183079 DEBUG oslo_concurrency.lockutils [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "4bb7efdc-59ab-46cd-ae0d-582182c85f5b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.544 183079 DEBUG oslo_concurrency.lockutils [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "4bb7efdc-59ab-46cd-ae0d-582182c85f5b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.545 183079 DEBUG nova.virt.libvirt.vif [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:07:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1697070694',display_name='tempest-server-test-1697070694',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1697070694',id=6,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOzuS5c+/c3nchlEjzQZcF6sVTY+If+Ronj929KhDV7B+UoVTOX4QPWtIie7F0q5o+yMVIvrndYwVXNXP5sf1WYo75cdEmtnNs2NDMwzYXACOAw67rfC/ZLmfMKPakZxTQ==',key_name='tempest-keypair-test-694600506',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6884ab5c00114ca19f253d0c91e2706f',ramdisk_id='',reservation_id='r-rlpd8jek',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpTestCasesAdmin-1567259425',owner_user_name='tempest-FloatingIpTestCasesAdmin-1567259425-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:07:52Z,user_data=None,user_id='7554977cf766467891ad30986750ca88',uuid=4bb7efdc-59ab-46cd-ae0d-582182c85f5b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "65d8ece3-00e3-43f9-8231-6893ea4cf9a4", "address": "fa:16:3e:eb:3e:ac", "network": {"id": "9fe870b5-173a-4c8a-b406-6fedb3ddcc4e", "bridge": "br-int", "label": "tempest-test-network--1353019071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "572446eea54c4f0baab88bf4419b4082", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65d8ece3-00", "ovs_interfaceid": "65d8ece3-00e3-43f9-8231-6893ea4cf9a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.545 183079 DEBUG nova.network.os_vif_util [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Converting VIF {"id": "65d8ece3-00e3-43f9-8231-6893ea4cf9a4", "address": "fa:16:3e:eb:3e:ac", "network": {"id": "9fe870b5-173a-4c8a-b406-6fedb3ddcc4e", "bridge": "br-int", "label": "tempest-test-network--1353019071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "572446eea54c4f0baab88bf4419b4082", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65d8ece3-00", "ovs_interfaceid": "65d8ece3-00e3-43f9-8231-6893ea4cf9a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.546 183079 DEBUG nova.network.os_vif_util [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:3e:ac,bridge_name='br-int',has_traffic_filtering=True,id=65d8ece3-00e3-43f9-8231-6893ea4cf9a4,network=Network(9fe870b5-173a-4c8a-b406-6fedb3ddcc4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap65d8ece3-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.546 183079 DEBUG os_vif [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:3e:ac,bridge_name='br-int',has_traffic_filtering=True,id=65d8ece3-00e3-43f9-8231-6893ea4cf9a4,network=Network(9fe870b5-173a-4c8a-b406-6fedb3ddcc4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap65d8ece3-00') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.547 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.547 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.548 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.551 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.551 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap65d8ece3-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.551 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap65d8ece3-00, col_values=(('external_ids', {'iface-id': '65d8ece3-00e3-43f9-8231-6893ea4cf9a4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:eb:3e:ac', 'vm-uuid': '4bb7efdc-59ab-46cd-ae0d-582182c85f5b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.554 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:56 compute-0 NetworkManager[55454]: <info>  [1769101676.5564] manager: (tap65d8ece3-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.557 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.559 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.560 183079 INFO os_vif [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:3e:ac,bridge_name='br-int',has_traffic_filtering=True,id=65d8ece3-00e3-43f9-8231-6893ea4cf9a4,network=Network(9fe870b5-173a-4c8a-b406-6fedb3ddcc4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap65d8ece3-00')
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.621 183079 DEBUG nova.virt.libvirt.driver [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.623 183079 DEBUG nova.virt.libvirt.driver [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] No VIF found with MAC fa:16:3e:eb:3e:ac, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:07:56 compute-0 podman[216031]: 2026-01-22 17:07:56.681905626 +0000 UTC m=+0.075647759 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:07:56 compute-0 kernel: tap65d8ece3-00: entered promiscuous mode
Jan 22 17:07:56 compute-0 NetworkManager[55454]: <info>  [1769101676.6867] manager: (tap65d8ece3-00): new Tun device (/org/freedesktop/NetworkManager/Devices/42)
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.686 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:56 compute-0 systemd-udevd[215936]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:07:56 compute-0 ovn_controller[95372]: 2026-01-22T17:07:56Z|00061|binding|INFO|Claiming lport 65d8ece3-00e3-43f9-8231-6893ea4cf9a4 for this chassis.
Jan 22 17:07:56 compute-0 ovn_controller[95372]: 2026-01-22T17:07:56Z|00062|binding|INFO|65d8ece3-00e3-43f9-8231-6893ea4cf9a4: Claiming fa:16:3e:eb:3e:ac 10.100.0.8
Jan 22 17:07:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:56.696 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:eb:3e:ac 10.100.0.8'], port_security=['fa:16:3e:eb:3e:ac 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '4bb7efdc-59ab-46cd-ae0d-582182c85f5b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6884ab5c00114ca19f253d0c91e2706f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ced556d8-3a2b-4ec1-a804-8cbb50ada768', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4f9feb03-8564-422d-a49d-142dd411b92f, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=65d8ece3-00e3-43f9-8231-6893ea4cf9a4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:07:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:56.698 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 65d8ece3-00e3-43f9-8231-6893ea4cf9a4 in datapath 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e bound to our chassis
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.701 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:56.702 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e
Jan 22 17:07:56 compute-0 ovn_controller[95372]: 2026-01-22T17:07:56Z|00063|binding|INFO|Setting lport 65d8ece3-00e3-43f9-8231-6893ea4cf9a4 ovn-installed in OVS
Jan 22 17:07:56 compute-0 ovn_controller[95372]: 2026-01-22T17:07:56Z|00064|binding|INFO|Setting lport 65d8ece3-00e3-43f9-8231-6893ea4cf9a4 up in Southbound
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.704 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:56 compute-0 NetworkManager[55454]: <info>  [1769101676.7085] device (tap65d8ece3-00): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:07:56 compute-0 NetworkManager[55454]: <info>  [1769101676.7093] device (tap65d8ece3-00): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:07:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:56.718 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f639292d-eedc-42cd-bb8a-bb10503f8a69]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:56.719 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9fe870b5-11 in ovnmeta-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:07:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:56.721 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9fe870b5-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:07:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:56.721 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[512fb276-1e0c-43e0-8dbe-4d1a896b3126]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:56.723 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[731c4e78-23eb-4b5e-8ce6-372285d7297c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:56 compute-0 systemd-machined[154382]: New machine qemu-6-instance-00000006.
Jan 22 17:07:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:56.735 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[0676f1ee-4a7f-4421-b699-ae94f15f990b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:56 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-00000006.
Jan 22 17:07:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:56.763 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7d594dfb-5816-4762-b1be-13b563083087]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:56.796 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[58213233-cb41-4314-b5e4-0fdfff17362f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:56 compute-0 NetworkManager[55454]: <info>  [1769101676.8045] manager: (tap9fe870b5-10): new Veth device (/org/freedesktop/NetworkManager/Devices/43)
Jan 22 17:07:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:56.803 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e5ec05d7-89a3-49c4-9c0a-b203f285d78a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:56.835 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[3a69abdf-24d3-43f2-9fd9-3078f2983b5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:56.838 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[3cea37bd-9088-477e-822c-3f56d04719e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:56 compute-0 NetworkManager[55454]: <info>  [1769101676.8645] device (tap9fe870b5-10): carrier: link connected
Jan 22 17:07:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:56.865 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[83163029-a038-4c33-8444-52441a420a69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:56.881 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ff86b661-0fd5-4f73-a76f-28904d3a591e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9fe870b5-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:dc:ce'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 404056, 'reachable_time': 26287, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216098, 'error': None, 'target': 'ovnmeta-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:56.902 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1adfec2a-b6e4-46a3-b8f5-f10eb0895866]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee8:dcce'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 404056, 'tstamp': 404056}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216100, 'error': None, 'target': 'ovnmeta-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:56.919 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[eadb84e4-f54d-4fa5-9125-84f6a2c89aac]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9fe870b5-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:dc:ce'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 404056, 'reachable_time': 26287, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216101, 'error': None, 'target': 'ovnmeta-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.920 183079 DEBUG nova.network.neutron [-] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.934 183079 INFO nova.compute.manager [-] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Took 1.93 seconds to deallocate network for instance.
Jan 22 17:07:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:56.962 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4bad20e0-77bd-4808-9c1e-65b796fd831a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.973 183079 DEBUG oslo_concurrency.lockutils [None req-e4de29b1-671b-4a9b-94eb-03974f244909 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:56 compute-0 nova_compute[183075]: 2026-01-22 17:07:56.973 183079 DEBUG oslo_concurrency.lockutils [None req-e4de29b1-671b-4a9b-94eb-03974f244909 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:57.036 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f08cc1ed-7fe5-42a8-a19a-835777fa7d36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:57.038 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9fe870b5-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:57.038 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:57.039 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9fe870b5-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:07:57 compute-0 nova_compute[183075]: 2026-01-22 17:07:57.041 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:57 compute-0 NetworkManager[55454]: <info>  [1769101677.0415] manager: (tap9fe870b5-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Jan 22 17:07:57 compute-0 kernel: tap9fe870b5-10: entered promiscuous mode
Jan 22 17:07:57 compute-0 nova_compute[183075]: 2026-01-22 17:07:57.046 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:57.047 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9fe870b5-10, col_values=(('external_ids', {'iface-id': '124a4c16-1255-4f0e-8e20-7c85f64c00e5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:07:57 compute-0 nova_compute[183075]: 2026-01-22 17:07:57.048 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:57 compute-0 ovn_controller[95372]: 2026-01-22T17:07:57Z|00065|binding|INFO|Releasing lport 124a4c16-1255-4f0e-8e20-7c85f64c00e5 from this chassis (sb_readonly=0)
Jan 22 17:07:57 compute-0 nova_compute[183075]: 2026-01-22 17:07:57.062 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:57.064 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9fe870b5-173a-4c8a-b406-6fedb3ddcc4e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9fe870b5-173a-4c8a-b406-6fedb3ddcc4e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:57.065 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[695a8eb1-f76b-430d-8532-78d1986695d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:57.066 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/9fe870b5-173a-4c8a-b406-6fedb3ddcc4e.pid.haproxy
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:07:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:07:57.067 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e', 'env', 'PROCESS_TAG=haproxy-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9fe870b5-173a-4c8a-b406-6fedb3ddcc4e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:07:57 compute-0 nova_compute[183075]: 2026-01-22 17:07:57.136 183079 DEBUG nova.compute.provider_tree [None req-e4de29b1-671b-4a9b-94eb-03974f244909 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:07:57 compute-0 nova_compute[183075]: 2026-01-22 17:07:57.148 183079 DEBUG nova.scheduler.client.report [None req-e4de29b1-671b-4a9b-94eb-03974f244909 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:07:57 compute-0 nova_compute[183075]: 2026-01-22 17:07:57.170 183079 DEBUG oslo_concurrency.lockutils [None req-e4de29b1-671b-4a9b-94eb-03974f244909 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.197s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:57 compute-0 nova_compute[183075]: 2026-01-22 17:07:57.193 183079 INFO nova.scheduler.client.report [None req-e4de29b1-671b-4a9b-94eb-03974f244909 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Deleted allocations for instance bd249764-12e4-4e25-9445-dd6e132ca53c
Jan 22 17:07:57 compute-0 nova_compute[183075]: 2026-01-22 17:07:57.198 183079 INFO nova.compute.manager [None req-f6bfb351-d6ec-4e9d-9e4b-c69f7486976b d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Get console output
Jan 22 17:07:57 compute-0 nova_compute[183075]: 2026-01-22 17:07:57.204 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:07:57 compute-0 nova_compute[183075]: 2026-01-22 17:07:57.261 183079 DEBUG oslo_concurrency.lockutils [None req-e4de29b1-671b-4a9b-94eb-03974f244909 c9cdd80799a74efb8ce82cfb5148ac89 cc642b97aa4e4886902a0d1233877b88 - - default default] Lock "bd249764-12e4-4e25-9445-dd6e132ca53c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.621s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:57 compute-0 ovn_controller[95372]: 2026-01-22T17:07:57Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0a:25:d6 10.100.0.4
Jan 22 17:07:57 compute-0 ovn_controller[95372]: 2026-01-22T17:07:57Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0a:25:d6 10.100.0.4
Jan 22 17:07:57 compute-0 nova_compute[183075]: 2026-01-22 17:07:57.385 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101677.3844655, 4bb7efdc-59ab-46cd-ae0d-582182c85f5b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:07:57 compute-0 nova_compute[183075]: 2026-01-22 17:07:57.386 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] VM Started (Lifecycle Event)
Jan 22 17:07:57 compute-0 nova_compute[183075]: 2026-01-22 17:07:57.402 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:07:57 compute-0 nova_compute[183075]: 2026-01-22 17:07:57.407 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101677.3847644, 4bb7efdc-59ab-46cd-ae0d-582182c85f5b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:07:57 compute-0 nova_compute[183075]: 2026-01-22 17:07:57.408 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] VM Paused (Lifecycle Event)
Jan 22 17:07:57 compute-0 nova_compute[183075]: 2026-01-22 17:07:57.428 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:07:57 compute-0 nova_compute[183075]: 2026-01-22 17:07:57.432 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:07:57 compute-0 nova_compute[183075]: 2026-01-22 17:07:57.447 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:07:57 compute-0 podman[216140]: 2026-01-22 17:07:57.490394348 +0000 UTC m=+0.061473938 container create 3843f3618761b92374dc588c34bf04c8023fc31d75e32df465150b1bfcb808ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 17:07:57 compute-0 systemd[1]: Started libpod-conmon-3843f3618761b92374dc588c34bf04c8023fc31d75e32df465150b1bfcb808ac.scope.
Jan 22 17:07:57 compute-0 podman[216140]: 2026-01-22 17:07:57.457878598 +0000 UTC m=+0.028958218 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:07:57 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:07:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7dbf004c3aa91ae1d161ce24f74ee37bbd39a59273d901ebb8e041f477297d53/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:07:57 compute-0 podman[216140]: 2026-01-22 17:07:57.588984926 +0000 UTC m=+0.160064516 container init 3843f3618761b92374dc588c34bf04c8023fc31d75e32df465150b1bfcb808ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 17:07:57 compute-0 podman[216140]: 2026-01-22 17:07:57.599413499 +0000 UTC m=+0.170493089 container start 3843f3618761b92374dc588c34bf04c8023fc31d75e32df465150b1bfcb808ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:07:57 compute-0 neutron-haproxy-ovnmeta-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216156]: [NOTICE]   (216160) : New worker (216162) forked
Jan 22 17:07:57 compute-0 neutron-haproxy-ovnmeta-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216156]: [NOTICE]   (216160) : Loading success.
Jan 22 17:07:58 compute-0 nova_compute[183075]: 2026-01-22 17:07:58.054 183079 DEBUG nova.compute.manager [req-59b0ee47-058e-4cf7-ad9a-d5b4522a7303 req-6b442254-4902-402b-be7b-0e7a62d12fd3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Received event network-vif-plugged-65d8ece3-00e3-43f9-8231-6893ea4cf9a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:07:58 compute-0 nova_compute[183075]: 2026-01-22 17:07:58.055 183079 DEBUG oslo_concurrency.lockutils [req-59b0ee47-058e-4cf7-ad9a-d5b4522a7303 req-6b442254-4902-402b-be7b-0e7a62d12fd3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "4bb7efdc-59ab-46cd-ae0d-582182c85f5b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:58 compute-0 nova_compute[183075]: 2026-01-22 17:07:58.055 183079 DEBUG oslo_concurrency.lockutils [req-59b0ee47-058e-4cf7-ad9a-d5b4522a7303 req-6b442254-4902-402b-be7b-0e7a62d12fd3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "4bb7efdc-59ab-46cd-ae0d-582182c85f5b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:58 compute-0 nova_compute[183075]: 2026-01-22 17:07:58.056 183079 DEBUG oslo_concurrency.lockutils [req-59b0ee47-058e-4cf7-ad9a-d5b4522a7303 req-6b442254-4902-402b-be7b-0e7a62d12fd3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "4bb7efdc-59ab-46cd-ae0d-582182c85f5b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:58 compute-0 nova_compute[183075]: 2026-01-22 17:07:58.056 183079 DEBUG nova.compute.manager [req-59b0ee47-058e-4cf7-ad9a-d5b4522a7303 req-6b442254-4902-402b-be7b-0e7a62d12fd3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Processing event network-vif-plugged-65d8ece3-00e3-43f9-8231-6893ea4cf9a4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:07:58 compute-0 nova_compute[183075]: 2026-01-22 17:07:58.056 183079 DEBUG nova.compute.manager [req-59b0ee47-058e-4cf7-ad9a-d5b4522a7303 req-6b442254-4902-402b-be7b-0e7a62d12fd3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Received event network-vif-plugged-65d8ece3-00e3-43f9-8231-6893ea4cf9a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:07:58 compute-0 nova_compute[183075]: 2026-01-22 17:07:58.056 183079 DEBUG oslo_concurrency.lockutils [req-59b0ee47-058e-4cf7-ad9a-d5b4522a7303 req-6b442254-4902-402b-be7b-0e7a62d12fd3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "4bb7efdc-59ab-46cd-ae0d-582182c85f5b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:07:58 compute-0 nova_compute[183075]: 2026-01-22 17:07:58.057 183079 DEBUG oslo_concurrency.lockutils [req-59b0ee47-058e-4cf7-ad9a-d5b4522a7303 req-6b442254-4902-402b-be7b-0e7a62d12fd3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "4bb7efdc-59ab-46cd-ae0d-582182c85f5b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:07:58 compute-0 nova_compute[183075]: 2026-01-22 17:07:58.058 183079 DEBUG oslo_concurrency.lockutils [req-59b0ee47-058e-4cf7-ad9a-d5b4522a7303 req-6b442254-4902-402b-be7b-0e7a62d12fd3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "4bb7efdc-59ab-46cd-ae0d-582182c85f5b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:58 compute-0 nova_compute[183075]: 2026-01-22 17:07:58.058 183079 DEBUG nova.compute.manager [req-59b0ee47-058e-4cf7-ad9a-d5b4522a7303 req-6b442254-4902-402b-be7b-0e7a62d12fd3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] No waiting events found dispatching network-vif-plugged-65d8ece3-00e3-43f9-8231-6893ea4cf9a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:07:58 compute-0 nova_compute[183075]: 2026-01-22 17:07:58.058 183079 WARNING nova.compute.manager [req-59b0ee47-058e-4cf7-ad9a-d5b4522a7303 req-6b442254-4902-402b-be7b-0e7a62d12fd3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Received unexpected event network-vif-plugged-65d8ece3-00e3-43f9-8231-6893ea4cf9a4 for instance with vm_state building and task_state spawning.
Jan 22 17:07:58 compute-0 nova_compute[183075]: 2026-01-22 17:07:58.060 183079 DEBUG nova.compute.manager [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:07:58 compute-0 nova_compute[183075]: 2026-01-22 17:07:58.064 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101678.0638359, 4bb7efdc-59ab-46cd-ae0d-582182c85f5b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:07:58 compute-0 nova_compute[183075]: 2026-01-22 17:07:58.064 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] VM Resumed (Lifecycle Event)
Jan 22 17:07:58 compute-0 nova_compute[183075]: 2026-01-22 17:07:58.067 183079 DEBUG nova.virt.libvirt.driver [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:07:58 compute-0 nova_compute[183075]: 2026-01-22 17:07:58.071 183079 INFO nova.virt.libvirt.driver [-] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Instance spawned successfully.
Jan 22 17:07:58 compute-0 nova_compute[183075]: 2026-01-22 17:07:58.071 183079 DEBUG nova.virt.libvirt.driver [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:07:58 compute-0 nova_compute[183075]: 2026-01-22 17:07:58.083 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:07:58 compute-0 nova_compute[183075]: 2026-01-22 17:07:58.095 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:07:58 compute-0 nova_compute[183075]: 2026-01-22 17:07:58.099 183079 DEBUG nova.virt.libvirt.driver [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:07:58 compute-0 nova_compute[183075]: 2026-01-22 17:07:58.099 183079 DEBUG nova.virt.libvirt.driver [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:07:58 compute-0 nova_compute[183075]: 2026-01-22 17:07:58.100 183079 DEBUG nova.virt.libvirt.driver [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:07:58 compute-0 nova_compute[183075]: 2026-01-22 17:07:58.100 183079 DEBUG nova.virt.libvirt.driver [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:07:58 compute-0 nova_compute[183075]: 2026-01-22 17:07:58.101 183079 DEBUG nova.virt.libvirt.driver [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:07:58 compute-0 nova_compute[183075]: 2026-01-22 17:07:58.101 183079 DEBUG nova.virt.libvirt.driver [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:07:58 compute-0 nova_compute[183075]: 2026-01-22 17:07:58.132 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:07:58 compute-0 nova_compute[183075]: 2026-01-22 17:07:58.181 183079 INFO nova.compute.manager [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Took 5.67 seconds to spawn the instance on the hypervisor.
Jan 22 17:07:58 compute-0 nova_compute[183075]: 2026-01-22 17:07:58.181 183079 DEBUG nova.compute.manager [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:07:58 compute-0 nova_compute[183075]: 2026-01-22 17:07:58.247 183079 INFO nova.compute.manager [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Took 6.25 seconds to build instance.
Jan 22 17:07:58 compute-0 nova_compute[183075]: 2026-01-22 17:07:58.273 183079 DEBUG oslo_concurrency.lockutils [None req-3e8bd753-1bae-466c-b1fd-4d2d8b31cd87 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "4bb7efdc-59ab-46cd-ae0d-582182c85f5b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.382s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:07:58 compute-0 nova_compute[183075]: 2026-01-22 17:07:58.428 183079 DEBUG nova.network.neutron [req-2c2ca593-fd5b-4a98-ba1e-9d5b9549cbb0 req-408cc574-f6ef-4ea2-b4f3-eba2455ef5e0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Updated VIF entry in instance network info cache for port 65d8ece3-00e3-43f9-8231-6893ea4cf9a4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:07:58 compute-0 nova_compute[183075]: 2026-01-22 17:07:58.428 183079 DEBUG nova.network.neutron [req-2c2ca593-fd5b-4a98-ba1e-9d5b9549cbb0 req-408cc574-f6ef-4ea2-b4f3-eba2455ef5e0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Updating instance_info_cache with network_info: [{"id": "65d8ece3-00e3-43f9-8231-6893ea4cf9a4", "address": "fa:16:3e:eb:3e:ac", "network": {"id": "9fe870b5-173a-4c8a-b406-6fedb3ddcc4e", "bridge": "br-int", "label": "tempest-test-network--1353019071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "572446eea54c4f0baab88bf4419b4082", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65d8ece3-00", "ovs_interfaceid": "65d8ece3-00e3-43f9-8231-6893ea4cf9a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:07:58 compute-0 nova_compute[183075]: 2026-01-22 17:07:58.443 183079 DEBUG oslo_concurrency.lockutils [req-2c2ca593-fd5b-4a98-ba1e-9d5b9549cbb0 req-408cc574-f6ef-4ea2-b4f3-eba2455ef5e0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-4bb7efdc-59ab-46cd-ae0d-582182c85f5b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:07:59 compute-0 nova_compute[183075]: 2026-01-22 17:07:59.732 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:00 compute-0 nova_compute[183075]: 2026-01-22 17:08:00.257 183079 INFO nova.compute.manager [None req-f158917f-c0b6-485e-9411-0bf27a6b4090 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Get console output
Jan 22 17:08:00 compute-0 nova_compute[183075]: 2026-01-22 17:08:00.261 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:08:00 compute-0 nova_compute[183075]: 2026-01-22 17:08:00.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:08:01 compute-0 nova_compute[183075]: 2026-01-22 17:08:01.556 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:02 compute-0 nova_compute[183075]: 2026-01-22 17:08:02.322 183079 INFO nova.compute.manager [None req-58d75478-208d-46ce-b8b2-b182a2fca799 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Get console output
Jan 22 17:08:02 compute-0 nova_compute[183075]: 2026-01-22 17:08:02.330 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:08:02 compute-0 nova_compute[183075]: 2026-01-22 17:08:02.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:08:03 compute-0 ovn_controller[95372]: 2026-01-22T17:08:03Z|00066|binding|INFO|Releasing lport 424ac40e-403e-4504-adbb-47a319b401fd from this chassis (sb_readonly=0)
Jan 22 17:08:03 compute-0 ovn_controller[95372]: 2026-01-22T17:08:03Z|00067|binding|INFO|Releasing lport 124a4c16-1255-4f0e-8e20-7c85f64c00e5 from this chassis (sb_readonly=0)
Jan 22 17:08:03 compute-0 nova_compute[183075]: 2026-01-22 17:08:03.041 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:03 compute-0 ovn_controller[95372]: 2026-01-22T17:08:03Z|00068|binding|INFO|Releasing lport 424ac40e-403e-4504-adbb-47a319b401fd from this chassis (sb_readonly=0)
Jan 22 17:08:03 compute-0 ovn_controller[95372]: 2026-01-22T17:08:03Z|00069|binding|INFO|Releasing lport 124a4c16-1255-4f0e-8e20-7c85f64c00e5 from this chassis (sb_readonly=0)
Jan 22 17:08:03 compute-0 nova_compute[183075]: 2026-01-22 17:08:03.141 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:03 compute-0 podman[216173]: 2026-01-22 17:08:03.356079314 +0000 UTC m=+0.065387521 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:04.118 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:04.119 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 473b4e99-4018-4fa7-ab1c-2d3e7944d850 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:04 compute-0 nova_compute[183075]: 2026-01-22 17:08:04.751 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:04 compute-0 nova_compute[183075]: 2026-01-22 17:08:04.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:04.929 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:04.930 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.8108599
Jan 22 17:08:04 compute-0 haproxy-metadata-proxy-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215497]: 10.100.0.4:33394 [22/Jan/2026:17:08:04.117] listener listener/metadata 0/0/0/813/813 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:04.940 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:04.940 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 473b4e99-4018-4fa7-ab1c-2d3e7944d850 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:04.956 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:04.956 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0160272
Jan 22 17:08:04 compute-0 haproxy-metadata-proxy-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215497]: 10.100.0.4:33406 [22/Jan/2026:17:08:04.939] listener listener/metadata 0/0/0/17/17 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:04.962 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:04.962 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 473b4e99-4018-4fa7-ab1c-2d3e7944d850 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:04.978 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:04.979 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0163774
Jan 22 17:08:04 compute-0 haproxy-metadata-proxy-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215497]: 10.100.0.4:33418 [22/Jan/2026:17:08:04.961] listener listener/metadata 0/0/0/17/17 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:04.985 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:04.986 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 473b4e99-4018-4fa7-ab1c-2d3e7944d850 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:04.999 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.000 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0134184
Jan 22 17:08:05 compute-0 haproxy-metadata-proxy-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215497]: 10.100.0.4:33422 [22/Jan/2026:17:08:04.984] listener listener/metadata 0/0/0/15/15 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.006 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.009 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 473b4e99-4018-4fa7-ab1c-2d3e7944d850 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.024 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.024 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0150912
Jan 22 17:08:05 compute-0 haproxy-metadata-proxy-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215497]: 10.100.0.4:33436 [22/Jan/2026:17:08:05.005] listener listener/metadata 0/0/0/19/19 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.030 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.031 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 473b4e99-4018-4fa7-ab1c-2d3e7944d850 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.044 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.045 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0139554
Jan 22 17:08:05 compute-0 haproxy-metadata-proxy-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215497]: 10.100.0.4:33450 [22/Jan/2026:17:08:05.030] listener listener/metadata 0/0/0/14/14 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.050 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.050 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 473b4e99-4018-4fa7-ab1c-2d3e7944d850 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.070 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.070 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0203068
Jan 22 17:08:05 compute-0 haproxy-metadata-proxy-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215497]: 10.100.0.4:33454 [22/Jan/2026:17:08:05.049] listener listener/metadata 0/0/0/21/21 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.077 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.078 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 473b4e99-4018-4fa7-ab1c-2d3e7944d850 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.092 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.092 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0147622
Jan 22 17:08:05 compute-0 haproxy-metadata-proxy-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215497]: 10.100.0.4:33458 [22/Jan/2026:17:08:05.076] listener listener/metadata 0/0/0/15/15 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.099 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.100 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 473b4e99-4018-4fa7-ab1c-2d3e7944d850 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.117 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.118 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0178270
Jan 22 17:08:05 compute-0 haproxy-metadata-proxy-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215497]: 10.100.0.4:33460 [22/Jan/2026:17:08:05.099] listener listener/metadata 0/0/0/19/19 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.126 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.126 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 473b4e99-4018-4fa7-ab1c-2d3e7944d850 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.147 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.148 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0212235
Jan 22 17:08:05 compute-0 haproxy-metadata-proxy-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215497]: 10.100.0.4:33466 [22/Jan/2026:17:08:05.125] listener listener/metadata 0/0/0/22/22 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.155 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.156 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 473b4e99-4018-4fa7-ab1c-2d3e7944d850 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:05 compute-0 haproxy-metadata-proxy-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215497]: 10.100.0.4:33468 [22/Jan/2026:17:08:05.155] listener listener/metadata 0/0/0/15/15 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.170 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0143800
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.180 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.180 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 473b4e99-4018-4fa7-ab1c-2d3e7944d850 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.195 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.196 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0154040
Jan 22 17:08:05 compute-0 haproxy-metadata-proxy-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215497]: 10.100.0.4:33470 [22/Jan/2026:17:08:05.179] listener listener/metadata 0/0/0/16/16 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.200 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.200 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 473b4e99-4018-4fa7-ab1c-2d3e7944d850 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.220 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.221 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0202148
Jan 22 17:08:05 compute-0 haproxy-metadata-proxy-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215497]: 10.100.0.4:33478 [22/Jan/2026:17:08:05.200] listener listener/metadata 0/0/0/21/21 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.226 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.227 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 473b4e99-4018-4fa7-ab1c-2d3e7944d850 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.245 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.245 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0185447
Jan 22 17:08:05 compute-0 haproxy-metadata-proxy-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215497]: 10.100.0.4:33494 [22/Jan/2026:17:08:05.226] listener listener/metadata 0/0/0/19/19 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.252 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.253 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 473b4e99-4018-4fa7-ab1c-2d3e7944d850 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:05 compute-0 haproxy-metadata-proxy-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215497]: 10.100.0.4:33496 [22/Jan/2026:17:08:05.252] listener listener/metadata 0/0/0/32/32 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.284 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.285 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0317743
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.292 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.293 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 473b4e99-4018-4fa7-ab1c-2d3e7944d850 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.309 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:05 compute-0 haproxy-metadata-proxy-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215497]: 10.100.0.4:33512 [22/Jan/2026:17:08:05.292] listener listener/metadata 0/0/0/18/18 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:08:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:05.310 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0171468
Jan 22 17:08:05 compute-0 nova_compute[183075]: 2026-01-22 17:08:05.456 183079 INFO nova.compute.manager [None req-b974a273-cf8f-4c45-9e22-e6dbb3ba4877 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Get console output
Jan 22 17:08:05 compute-0 nova_compute[183075]: 2026-01-22 17:08:05.461 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:08:05 compute-0 nova_compute[183075]: 2026-01-22 17:08:05.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:08:05 compute-0 nova_compute[183075]: 2026-01-22 17:08:05.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:08:05 compute-0 nova_compute[183075]: 2026-01-22 17:08:05.789 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:08:06 compute-0 nova_compute[183075]: 2026-01-22 17:08:06.126 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-effaddee-27ef-49f6-ac5f-2e3258c8d5d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:08:06 compute-0 nova_compute[183075]: 2026-01-22 17:08:06.127 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-effaddee-27ef-49f6-ac5f-2e3258c8d5d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:08:06 compute-0 nova_compute[183075]: 2026-01-22 17:08:06.127 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:08:06 compute-0 nova_compute[183075]: 2026-01-22 17:08:06.127 183079 DEBUG nova.objects.instance [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lazy-loading 'info_cache' on Instance uuid effaddee-27ef-49f6-ac5f-2e3258c8d5d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:08:06 compute-0 nova_compute[183075]: 2026-01-22 17:08:06.559 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:06 compute-0 nova_compute[183075]: 2026-01-22 17:08:06.712 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769101671.711068, b9680bfd-e87f-427c-8f13-2b3a415aca39 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:08:06 compute-0 nova_compute[183075]: 2026-01-22 17:08:06.713 183079 INFO nova.compute.manager [-] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] VM Stopped (Lifecycle Event)
Jan 22 17:08:06 compute-0 nova_compute[183075]: 2026-01-22 17:08:06.731 183079 DEBUG nova.compute.manager [None req-01b36e1e-3f35-47a1-a3e1-c1de41339022 - - - - - -] [instance: b9680bfd-e87f-427c-8f13-2b3a415aca39] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:08:07 compute-0 nova_compute[183075]: 2026-01-22 17:08:07.551 183079 INFO nova.compute.manager [None req-28d1be19-9f2b-4fc7-b35d-f11510a81311 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Get console output
Jan 22 17:08:07 compute-0 nova_compute[183075]: 2026-01-22 17:08:07.558 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:08:07 compute-0 nova_compute[183075]: 2026-01-22 17:08:07.941 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Updating instance_info_cache with network_info: [{"id": "44437e9e-7bcf-4942-83a0-cb6139413a8e", "address": "fa:16:3e:c8:71:80", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44437e9e-7b", "ovs_interfaceid": "44437e9e-7bcf-4942-83a0-cb6139413a8e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:08:07 compute-0 nova_compute[183075]: 2026-01-22 17:08:07.964 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-effaddee-27ef-49f6-ac5f-2e3258c8d5d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:08:07 compute-0 nova_compute[183075]: 2026-01-22 17:08:07.965 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:08:07 compute-0 nova_compute[183075]: 2026-01-22 17:08:07.966 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:08:07 compute-0 nova_compute[183075]: 2026-01-22 17:08:07.967 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:08:08 compute-0 nova_compute[183075]: 2026-01-22 17:08:08.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:08:08 compute-0 nova_compute[183075]: 2026-01-22 17:08:08.789 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:08:08 compute-0 nova_compute[183075]: 2026-01-22 17:08:08.789 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:08:09 compute-0 nova_compute[183075]: 2026-01-22 17:08:09.613 183079 DEBUG oslo_concurrency.lockutils [None req-1d302c36-d5c7-4cb4-8752-a5c55f0307b9 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Acquiring lock "interface-effaddee-27ef-49f6-ac5f-2e3258c8d5d2-deda3dcd-de47-47e2-8bb1-526d3882d38e" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:09 compute-0 nova_compute[183075]: 2026-01-22 17:08:09.614 183079 DEBUG oslo_concurrency.lockutils [None req-1d302c36-d5c7-4cb4-8752-a5c55f0307b9 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "interface-effaddee-27ef-49f6-ac5f-2e3258c8d5d2-deda3dcd-de47-47e2-8bb1-526d3882d38e" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:09 compute-0 nova_compute[183075]: 2026-01-22 17:08:09.614 183079 DEBUG nova.objects.instance [None req-1d302c36-d5c7-4cb4-8752-a5c55f0307b9 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lazy-loading 'flavor' on Instance uuid effaddee-27ef-49f6-ac5f-2e3258c8d5d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:08:09 compute-0 ovn_controller[95372]: 2026-01-22T17:08:09Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:eb:3e:ac 10.100.0.8
Jan 22 17:08:09 compute-0 ovn_controller[95372]: 2026-01-22T17:08:09Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:eb:3e:ac 10.100.0.8
Jan 22 17:08:09 compute-0 nova_compute[183075]: 2026-01-22 17:08:09.811 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:09 compute-0 nova_compute[183075]: 2026-01-22 17:08:09.896 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769101674.8952682, bd249764-12e4-4e25-9445-dd6e132ca53c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:08:09 compute-0 nova_compute[183075]: 2026-01-22 17:08:09.897 183079 INFO nova.compute.manager [-] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] VM Stopped (Lifecycle Event)
Jan 22 17:08:09 compute-0 nova_compute[183075]: 2026-01-22 17:08:09.923 183079 DEBUG nova.compute.manager [None req-ac3cc8ec-6bad-4154-af82-32fd374db819 - - - - - -] [instance: bd249764-12e4-4e25-9445-dd6e132ca53c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:08:10 compute-0 nova_compute[183075]: 2026-01-22 17:08:10.619 183079 INFO nova.compute.manager [None req-e370ebff-7b14-461c-986d-586b3bfb2235 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Get console output
Jan 22 17:08:10 compute-0 nova_compute[183075]: 2026-01-22 17:08:10.622 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:08:10 compute-0 nova_compute[183075]: 2026-01-22 17:08:10.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:08:10 compute-0 nova_compute[183075]: 2026-01-22 17:08:10.809 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:10 compute-0 nova_compute[183075]: 2026-01-22 17:08:10.810 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:10 compute-0 nova_compute[183075]: 2026-01-22 17:08:10.811 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:10 compute-0 nova_compute[183075]: 2026-01-22 17:08:10.811 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:08:10 compute-0 nova_compute[183075]: 2026-01-22 17:08:10.889 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4bb7efdc-59ab-46cd-ae0d-582182c85f5b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:08:10 compute-0 nova_compute[183075]: 2026-01-22 17:08:10.947 183079 DEBUG nova.objects.instance [None req-1d302c36-d5c7-4cb4-8752-a5c55f0307b9 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lazy-loading 'pci_requests' on Instance uuid effaddee-27ef-49f6-ac5f-2e3258c8d5d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:08:10 compute-0 nova_compute[183075]: 2026-01-22 17:08:10.950 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4bb7efdc-59ab-46cd-ae0d-582182c85f5b/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:08:10 compute-0 nova_compute[183075]: 2026-01-22 17:08:10.951 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4bb7efdc-59ab-46cd-ae0d-582182c85f5b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:08:10 compute-0 nova_compute[183075]: 2026-01-22 17:08:10.969 183079 DEBUG nova.network.neutron [None req-1d302c36-d5c7-4cb4-8752-a5c55f0307b9 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:08:11 compute-0 nova_compute[183075]: 2026-01-22 17:08:11.014 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4bb7efdc-59ab-46cd-ae0d-582182c85f5b/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:08:11 compute-0 nova_compute[183075]: 2026-01-22 17:08:11.020 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/effaddee-27ef-49f6-ac5f-2e3258c8d5d2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:08:11 compute-0 nova_compute[183075]: 2026-01-22 17:08:11.080 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/effaddee-27ef-49f6-ac5f-2e3258c8d5d2/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:08:11 compute-0 nova_compute[183075]: 2026-01-22 17:08:11.081 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/effaddee-27ef-49f6-ac5f-2e3258c8d5d2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:08:11 compute-0 nova_compute[183075]: 2026-01-22 17:08:11.142 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/effaddee-27ef-49f6-ac5f-2e3258c8d5d2/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:08:11 compute-0 nova_compute[183075]: 2026-01-22 17:08:11.148 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/000b64b8-bcc5-4bbe-9703-8400a83a27d0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:08:11 compute-0 nova_compute[183075]: 2026-01-22 17:08:11.215 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/000b64b8-bcc5-4bbe-9703-8400a83a27d0/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:08:11 compute-0 nova_compute[183075]: 2026-01-22 17:08:11.216 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/000b64b8-bcc5-4bbe-9703-8400a83a27d0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:08:11 compute-0 nova_compute[183075]: 2026-01-22 17:08:11.281 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/000b64b8-bcc5-4bbe-9703-8400a83a27d0/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:08:11 compute-0 nova_compute[183075]: 2026-01-22 17:08:11.451 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:08:11 compute-0 nova_compute[183075]: 2026-01-22 17:08:11.453 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5268MB free_disk=73.29690933227539GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:08:11 compute-0 nova_compute[183075]: 2026-01-22 17:08:11.453 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:11 compute-0 nova_compute[183075]: 2026-01-22 17:08:11.453 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:11 compute-0 nova_compute[183075]: 2026-01-22 17:08:11.553 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance effaddee-27ef-49f6-ac5f-2e3258c8d5d2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:08:11 compute-0 nova_compute[183075]: 2026-01-22 17:08:11.553 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 000b64b8-bcc5-4bbe-9703-8400a83a27d0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:08:11 compute-0 nova_compute[183075]: 2026-01-22 17:08:11.553 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 4bb7efdc-59ab-46cd-ae0d-582182c85f5b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:08:11 compute-0 nova_compute[183075]: 2026-01-22 17:08:11.554 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:08:11 compute-0 nova_compute[183075]: 2026-01-22 17:08:11.554 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:08:11 compute-0 nova_compute[183075]: 2026-01-22 17:08:11.561 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:11 compute-0 nova_compute[183075]: 2026-01-22 17:08:11.623 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:08:11 compute-0 nova_compute[183075]: 2026-01-22 17:08:11.636 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:08:11 compute-0 nova_compute[183075]: 2026-01-22 17:08:11.663 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:08:11 compute-0 nova_compute[183075]: 2026-01-22 17:08:11.663 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.210s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:11 compute-0 nova_compute[183075]: 2026-01-22 17:08:11.955 183079 DEBUG nova.policy [None req-1d302c36-d5c7-4cb4-8752-a5c55f0307b9 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:08:12 compute-0 nova_compute[183075]: 2026-01-22 17:08:12.716 183079 DEBUG nova.network.neutron [None req-1d302c36-d5c7-4cb4-8752-a5c55f0307b9 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Successfully updated port: deda3dcd-de47-47e2-8bb1-526d3882d38e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:08:12 compute-0 nova_compute[183075]: 2026-01-22 17:08:12.747 183079 DEBUG oslo_concurrency.lockutils [None req-1d302c36-d5c7-4cb4-8752-a5c55f0307b9 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Acquiring lock "refresh_cache-effaddee-27ef-49f6-ac5f-2e3258c8d5d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:08:12 compute-0 nova_compute[183075]: 2026-01-22 17:08:12.749 183079 DEBUG oslo_concurrency.lockutils [None req-1d302c36-d5c7-4cb4-8752-a5c55f0307b9 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Acquired lock "refresh_cache-effaddee-27ef-49f6-ac5f-2e3258c8d5d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:08:12 compute-0 nova_compute[183075]: 2026-01-22 17:08:12.749 183079 DEBUG nova.network.neutron [None req-1d302c36-d5c7-4cb4-8752-a5c55f0307b9 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:08:12 compute-0 nova_compute[183075]: 2026-01-22 17:08:12.852 183079 DEBUG nova.compute.manager [req-c74d5af5-f2ab-4a94-8f2d-d8046dbff920 req-1cffcb7f-ef7b-4f9f-85f6-98b28bcbf692 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Received event network-changed-deda3dcd-de47-47e2-8bb1-526d3882d38e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:08:12 compute-0 nova_compute[183075]: 2026-01-22 17:08:12.853 183079 DEBUG nova.compute.manager [req-c74d5af5-f2ab-4a94-8f2d-d8046dbff920 req-1cffcb7f-ef7b-4f9f-85f6-98b28bcbf692 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Refreshing instance network info cache due to event network-changed-deda3dcd-de47-47e2-8bb1-526d3882d38e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:08:12 compute-0 nova_compute[183075]: 2026-01-22 17:08:12.853 183079 DEBUG oslo_concurrency.lockutils [req-c74d5af5-f2ab-4a94-8f2d-d8046dbff920 req-1cffcb7f-ef7b-4f9f-85f6-98b28bcbf692 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-effaddee-27ef-49f6-ac5f-2e3258c8d5d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:08:12 compute-0 nova_compute[183075]: 2026-01-22 17:08:12.911 183079 WARNING nova.network.neutron [None req-1d302c36-d5c7-4cb4-8752-a5c55f0307b9 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] 473b4e99-4018-4fa7-ab1c-2d3e7944d850 already exists in list: networks containing: ['473b4e99-4018-4fa7-ab1c-2d3e7944d850']. ignoring it
Jan 22 17:08:14 compute-0 podman[216242]: 2026-01-22 17:08:14.361961403 +0000 UTC m=+0.059633640 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 17:08:14 compute-0 nova_compute[183075]: 2026-01-22 17:08:14.851 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:15.562 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:15.564 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:08:15 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:15 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:15 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:15 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:15 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:15 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:08:15 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:15 compute-0 nova_compute[183075]: 2026-01-22 17:08:15.773 183079 INFO nova.compute.manager [None req-e2ab2ce9-d135-4d99-9de9-263ff8e616fa 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Get console output
Jan 22 17:08:15 compute-0 nova_compute[183075]: 2026-01-22 17:08:15.779 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:08:15 compute-0 nova_compute[183075]: 2026-01-22 17:08:15.894 183079 DEBUG nova.network.neutron [None req-1d302c36-d5c7-4cb4-8752-a5c55f0307b9 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Updating instance_info_cache with network_info: [{"id": "44437e9e-7bcf-4942-83a0-cb6139413a8e", "address": "fa:16:3e:c8:71:80", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44437e9e-7b", "ovs_interfaceid": "44437e9e-7bcf-4942-83a0-cb6139413a8e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "address": "fa:16:3e:a0:1c:cd", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdeda3dcd-de", "ovs_interfaceid": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:08:15 compute-0 nova_compute[183075]: 2026-01-22 17:08:15.919 183079 DEBUG oslo_concurrency.lockutils [None req-1d302c36-d5c7-4cb4-8752-a5c55f0307b9 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Releasing lock "refresh_cache-effaddee-27ef-49f6-ac5f-2e3258c8d5d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:08:15 compute-0 nova_compute[183075]: 2026-01-22 17:08:15.921 183079 DEBUG oslo_concurrency.lockutils [req-c74d5af5-f2ab-4a94-8f2d-d8046dbff920 req-1cffcb7f-ef7b-4f9f-85f6-98b28bcbf692 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-effaddee-27ef-49f6-ac5f-2e3258c8d5d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:08:15 compute-0 nova_compute[183075]: 2026-01-22 17:08:15.921 183079 DEBUG nova.network.neutron [req-c74d5af5-f2ab-4a94-8f2d-d8046dbff920 req-1cffcb7f-ef7b-4f9f-85f6-98b28bcbf692 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Refreshing network info cache for port deda3dcd-de47-47e2-8bb1-526d3882d38e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:08:15 compute-0 nova_compute[183075]: 2026-01-22 17:08:15.925 183079 DEBUG nova.virt.libvirt.vif [None req-1d302c36-d5c7-4cb4-8752-a5c55f0307b9 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:07:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1783669850',display_name='tempest-server-test-1783669850',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1783669850',id=3,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCYZfzAO5Ui0C6aNyi2ae5nossybaMPhlAmp9Zj0dL/LunopVV9meBl8qfqrR0u5rqBGUC3w2LWkiMuhmtUWjdFFHsn2/jcRm/feYqTpe58YE0uSBcC8gJ0hMPoRirOIAg==',key_name='tempest-keypair-test-1166356111',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:07:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='eb8e9f7a891a4a38af8b01557eddc991',ramdisk_id='',reservation_id='r-89frdmqf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='vi
rtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPPortDetailsTest-1812576526',owner_user_name='tempest-FloatingIPPortDetailsTest-1812576526-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:07:11Z,user_data=None,user_id='d7ee6c51c2b8447baefccea20fa16de5',uuid=effaddee-27ef-49f6-ac5f-2e3258c8d5d2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "address": "fa:16:3e:a0:1c:cd", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdeda3dcd-de", "ovs_interfaceid": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:08:15 compute-0 nova_compute[183075]: 2026-01-22 17:08:15.925 183079 DEBUG nova.network.os_vif_util [None req-1d302c36-d5c7-4cb4-8752-a5c55f0307b9 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Converting VIF {"id": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "address": "fa:16:3e:a0:1c:cd", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdeda3dcd-de", "ovs_interfaceid": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:08:15 compute-0 nova_compute[183075]: 2026-01-22 17:08:15.926 183079 DEBUG nova.network.os_vif_util [None req-1d302c36-d5c7-4cb4-8752-a5c55f0307b9 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:1c:cd,bridge_name='br-int',has_traffic_filtering=True,id=deda3dcd-de47-47e2-8bb1-526d3882d38e,network=Network(473b4e99-4018-4fa7-ab1c-2d3e7944d850),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdeda3dcd-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:08:15 compute-0 nova_compute[183075]: 2026-01-22 17:08:15.926 183079 DEBUG os_vif [None req-1d302c36-d5c7-4cb4-8752-a5c55f0307b9 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:1c:cd,bridge_name='br-int',has_traffic_filtering=True,id=deda3dcd-de47-47e2-8bb1-526d3882d38e,network=Network(473b4e99-4018-4fa7-ab1c-2d3e7944d850),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdeda3dcd-de') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:08:15 compute-0 nova_compute[183075]: 2026-01-22 17:08:15.927 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:15 compute-0 nova_compute[183075]: 2026-01-22 17:08:15.927 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:15 compute-0 nova_compute[183075]: 2026-01-22 17:08:15.928 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:08:15 compute-0 nova_compute[183075]: 2026-01-22 17:08:15.932 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:15 compute-0 nova_compute[183075]: 2026-01-22 17:08:15.932 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdeda3dcd-de, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:15 compute-0 nova_compute[183075]: 2026-01-22 17:08:15.933 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdeda3dcd-de, col_values=(('external_ids', {'iface-id': 'deda3dcd-de47-47e2-8bb1-526d3882d38e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a0:1c:cd', 'vm-uuid': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:15 compute-0 nova_compute[183075]: 2026-01-22 17:08:15.972 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:15 compute-0 NetworkManager[55454]: <info>  [1769101695.9732] manager: (tapdeda3dcd-de): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Jan 22 17:08:15 compute-0 nova_compute[183075]: 2026-01-22 17:08:15.977 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:08:15 compute-0 nova_compute[183075]: 2026-01-22 17:08:15.980 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:15 compute-0 nova_compute[183075]: 2026-01-22 17:08:15.981 183079 INFO os_vif [None req-1d302c36-d5c7-4cb4-8752-a5c55f0307b9 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:1c:cd,bridge_name='br-int',has_traffic_filtering=True,id=deda3dcd-de47-47e2-8bb1-526d3882d38e,network=Network(473b4e99-4018-4fa7-ab1c-2d3e7944d850),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdeda3dcd-de')
Jan 22 17:08:15 compute-0 nova_compute[183075]: 2026-01-22 17:08:15.982 183079 DEBUG nova.virt.libvirt.vif [None req-1d302c36-d5c7-4cb4-8752-a5c55f0307b9 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:07:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1783669850',display_name='tempest-server-test-1783669850',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1783669850',id=3,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCYZfzAO5Ui0C6aNyi2ae5nossybaMPhlAmp9Zj0dL/LunopVV9meBl8qfqrR0u5rqBGUC3w2LWkiMuhmtUWjdFFHsn2/jcRm/feYqTpe58YE0uSBcC8gJ0hMPoRirOIAg==',key_name='tempest-keypair-test-1166356111',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:07:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='eb8e9f7a891a4a38af8b01557eddc991',ramdisk_id='',reservation_id='r-89frdmqf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='vi
rtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPPortDetailsTest-1812576526',owner_user_name='tempest-FloatingIPPortDetailsTest-1812576526-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:07:11Z,user_data=None,user_id='d7ee6c51c2b8447baefccea20fa16de5',uuid=effaddee-27ef-49f6-ac5f-2e3258c8d5d2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "address": "fa:16:3e:a0:1c:cd", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdeda3dcd-de", "ovs_interfaceid": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:08:15 compute-0 nova_compute[183075]: 2026-01-22 17:08:15.982 183079 DEBUG nova.network.os_vif_util [None req-1d302c36-d5c7-4cb4-8752-a5c55f0307b9 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Converting VIF {"id": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "address": "fa:16:3e:a0:1c:cd", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdeda3dcd-de", "ovs_interfaceid": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:08:15 compute-0 nova_compute[183075]: 2026-01-22 17:08:15.983 183079 DEBUG nova.network.os_vif_util [None req-1d302c36-d5c7-4cb4-8752-a5c55f0307b9 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:1c:cd,bridge_name='br-int',has_traffic_filtering=True,id=deda3dcd-de47-47e2-8bb1-526d3882d38e,network=Network(473b4e99-4018-4fa7-ab1c-2d3e7944d850),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdeda3dcd-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:08:15 compute-0 nova_compute[183075]: 2026-01-22 17:08:15.985 183079 DEBUG nova.virt.libvirt.guest [None req-1d302c36-d5c7-4cb4-8752-a5c55f0307b9 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] attach device xml: <interface type="ethernet">
Jan 22 17:08:15 compute-0 nova_compute[183075]:   <mac address="fa:16:3e:a0:1c:cd"/>
Jan 22 17:08:15 compute-0 nova_compute[183075]:   <model type="virtio"/>
Jan 22 17:08:15 compute-0 nova_compute[183075]:   <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:08:15 compute-0 nova_compute[183075]:   <mtu size="1442"/>
Jan 22 17:08:15 compute-0 nova_compute[183075]:   <target dev="tapdeda3dcd-de"/>
Jan 22 17:08:15 compute-0 nova_compute[183075]: </interface>
Jan 22 17:08:15 compute-0 nova_compute[183075]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Jan 22 17:08:15 compute-0 kernel: tapdeda3dcd-de: entered promiscuous mode
Jan 22 17:08:15 compute-0 NetworkManager[55454]: <info>  [1769101695.9970] manager: (tapdeda3dcd-de): new Tun device (/org/freedesktop/NetworkManager/Devices/46)
Jan 22 17:08:15 compute-0 ovn_controller[95372]: 2026-01-22T17:08:15Z|00070|binding|INFO|Claiming lport deda3dcd-de47-47e2-8bb1-526d3882d38e for this chassis.
Jan 22 17:08:15 compute-0 ovn_controller[95372]: 2026-01-22T17:08:15Z|00071|binding|INFO|deda3dcd-de47-47e2-8bb1-526d3882d38e: Claiming fa:16:3e:a0:1c:cd 10.100.0.10
Jan 22 17:08:16 compute-0 nova_compute[183075]: 2026-01-22 17:08:15.998 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.008 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:1c:cd 10.100.0.10'], port_security=['fa:16:3e:a0:1c:cd 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-473b4e99-4018-4fa7-ab1c-2d3e7944d850', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3fb419e8-dd25-4fec-8107-6e7d89977d34', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.234'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a8cd0033-3766-4772-8688-44d3d0e3250a, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=deda3dcd-de47-47e2-8bb1-526d3882d38e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.010 104629 INFO neutron.agent.ovn.metadata.agent [-] Port deda3dcd-de47-47e2-8bb1-526d3882d38e in datapath 473b4e99-4018-4fa7-ab1c-2d3e7944d850 bound to our chassis
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.012 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 473b4e99-4018-4fa7-ab1c-2d3e7944d850
Jan 22 17:08:16 compute-0 ovn_controller[95372]: 2026-01-22T17:08:16Z|00072|binding|INFO|Setting lport deda3dcd-de47-47e2-8bb1-526d3882d38e ovn-installed in OVS
Jan 22 17:08:16 compute-0 ovn_controller[95372]: 2026-01-22T17:08:16Z|00073|binding|INFO|Setting lport deda3dcd-de47-47e2-8bb1-526d3882d38e up in Southbound
Jan 22 17:08:16 compute-0 nova_compute[183075]: 2026-01-22 17:08:16.021 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:16 compute-0 nova_compute[183075]: 2026-01-22 17:08:16.025 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:16 compute-0 systemd-udevd[216273]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.032 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ef10e9ef-6782-4bd8-a554-395462aea842]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:16 compute-0 NetworkManager[55454]: <info>  [1769101696.0459] device (tapdeda3dcd-de): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:08:16 compute-0 NetworkManager[55454]: <info>  [1769101696.0469] device (tapdeda3dcd-de): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:08:16 compute-0 NetworkManager[55454]: <info>  [1769101696.0582] manager: (patch-br-int-to-provnet-c63dea44-3fe3-44c3-ba47-76ee1ea0ad94): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Jan 22 17:08:16 compute-0 ovn_controller[95372]: 2026-01-22T17:08:16Z|00074|binding|INFO|Releasing lport 424ac40e-403e-4504-adbb-47a319b401fd from this chassis (sb_readonly=0)
Jan 22 17:08:16 compute-0 ovn_controller[95372]: 2026-01-22T17:08:16Z|00075|binding|INFO|Releasing lport 124a4c16-1255-4f0e-8e20-7c85f64c00e5 from this chassis (sb_readonly=0)
Jan 22 17:08:16 compute-0 nova_compute[183075]: 2026-01-22 17:08:16.060 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:16 compute-0 NetworkManager[55454]: <info>  [1769101696.0614] manager: (patch-provnet-c63dea44-3fe3-44c3-ba47-76ee1ea0ad94-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.070 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[fcbdfa3b-1bbc-4217-9b1e-b1976abaee45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.074 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[c3ff35c1-40b4-4faa-8b50-0f18e74f2f78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:16 compute-0 ovn_controller[95372]: 2026-01-22T17:08:16Z|00076|binding|INFO|Releasing lport 424ac40e-403e-4504-adbb-47a319b401fd from this chassis (sb_readonly=0)
Jan 22 17:08:16 compute-0 ovn_controller[95372]: 2026-01-22T17:08:16Z|00077|binding|INFO|Releasing lport 124a4c16-1255-4f0e-8e20-7c85f64c00e5 from this chassis (sb_readonly=0)
Jan 22 17:08:16 compute-0 nova_compute[183075]: 2026-01-22 17:08:16.089 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:16 compute-0 nova_compute[183075]: 2026-01-22 17:08:16.094 183079 DEBUG nova.virt.libvirt.driver [None req-1d302c36-d5c7-4cb4-8752-a5c55f0307b9 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:08:16 compute-0 nova_compute[183075]: 2026-01-22 17:08:16.095 183079 DEBUG nova.virt.libvirt.driver [None req-1d302c36-d5c7-4cb4-8752-a5c55f0307b9 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] No VIF found with MAC fa:16:3e:c8:71:80, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:08:16 compute-0 nova_compute[183075]: 2026-01-22 17:08:16.095 183079 DEBUG nova.virt.libvirt.driver [None req-1d302c36-d5c7-4cb4-8752-a5c55f0307b9 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] No VIF found with MAC fa:16:3e:a0:1c:cd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:08:16 compute-0 nova_compute[183075]: 2026-01-22 17:08:16.098 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.102 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[9c7397f2-5a5f-4393-8554-a2ca32f24657]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.115 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[24b45ca0-5408-4c53-9786-6c1d9ddd3ab3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap473b4e99-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4d:87:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 202, 'tx_packets': 105, 'rx_bytes': 17308, 'tx_bytes': 11992, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 202, 'tx_packets': 105, 'rx_bytes': 17308, 'tx_bytes': 11992, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399443, 'reachable_time': 32301, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216280, 'error': None, 'target': 'ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:16 compute-0 nova_compute[183075]: 2026-01-22 17:08:16.121 183079 DEBUG nova.virt.libvirt.guest [None req-1d302c36-d5c7-4cb4-8752-a5c55f0307b9 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:08:16 compute-0 nova_compute[183075]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:08:16 compute-0 nova_compute[183075]:   <nova:name>tempest-server-test-1783669850</nova:name>
Jan 22 17:08:16 compute-0 nova_compute[183075]:   <nova:creationTime>2026-01-22 17:08:16</nova:creationTime>
Jan 22 17:08:16 compute-0 nova_compute[183075]:   <nova:flavor name="m1.nano">
Jan 22 17:08:16 compute-0 nova_compute[183075]:     <nova:memory>128</nova:memory>
Jan 22 17:08:16 compute-0 nova_compute[183075]:     <nova:disk>1</nova:disk>
Jan 22 17:08:16 compute-0 nova_compute[183075]:     <nova:swap>0</nova:swap>
Jan 22 17:08:16 compute-0 nova_compute[183075]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:08:16 compute-0 nova_compute[183075]:     <nova:vcpus>1</nova:vcpus>
Jan 22 17:08:16 compute-0 nova_compute[183075]:   </nova:flavor>
Jan 22 17:08:16 compute-0 nova_compute[183075]:   <nova:owner>
Jan 22 17:08:16 compute-0 nova_compute[183075]:     <nova:user uuid="d7ee6c51c2b8447baefccea20fa16de5">tempest-FloatingIPPortDetailsTest-1812576526-project-member</nova:user>
Jan 22 17:08:16 compute-0 nova_compute[183075]:     <nova:project uuid="eb8e9f7a891a4a38af8b01557eddc991">tempest-FloatingIPPortDetailsTest-1812576526</nova:project>
Jan 22 17:08:16 compute-0 nova_compute[183075]:   </nova:owner>
Jan 22 17:08:16 compute-0 nova_compute[183075]:   <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:08:16 compute-0 nova_compute[183075]:   <nova:ports>
Jan 22 17:08:16 compute-0 nova_compute[183075]:     <nova:port uuid="44437e9e-7bcf-4942-83a0-cb6139413a8e">
Jan 22 17:08:16 compute-0 nova_compute[183075]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 22 17:08:16 compute-0 nova_compute[183075]:     </nova:port>
Jan 22 17:08:16 compute-0 nova_compute[183075]:     <nova:port uuid="deda3dcd-de47-47e2-8bb1-526d3882d38e">
Jan 22 17:08:16 compute-0 nova_compute[183075]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 17:08:16 compute-0 nova_compute[183075]:     </nova:port>
Jan 22 17:08:16 compute-0 nova_compute[183075]:   </nova:ports>
Jan 22 17:08:16 compute-0 nova_compute[183075]: </nova:instance>
Jan 22 17:08:16 compute-0 nova_compute[183075]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.133 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f4ed20f8-4164-45fc-b33c-f471b4d53054]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap473b4e99-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 399457, 'tstamp': 399457}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216281, 'error': None, 'target': 'ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap473b4e99-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 399460, 'tstamp': 399460}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216281, 'error': None, 'target': 'ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.135 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap473b4e99-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:16 compute-0 nova_compute[183075]: 2026-01-22 17:08:16.138 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:16 compute-0 nova_compute[183075]: 2026-01-22 17:08:16.139 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.139 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap473b4e99-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.140 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.141 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap473b4e99-40, col_values=(('external_ids', {'iface-id': '424ac40e-403e-4504-adbb-47a319b401fd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.141 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:08:16 compute-0 nova_compute[183075]: 2026-01-22 17:08:16.143 183079 DEBUG oslo_concurrency.lockutils [None req-1d302c36-d5c7-4cb4-8752-a5c55f0307b9 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "interface-effaddee-27ef-49f6-ac5f-2e3258c8d5d2-deda3dcd-de47-47e2-8bb1-526d3882d38e" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 6.529s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.208 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:16 compute-0 haproxy-metadata-proxy-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216162]: 10.100.0.8:55662 [22/Jan/2026:17:08:15.561] listener listener/metadata 0/0/0/647/647 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.208 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.6447616
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.219 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.222 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.248 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.249 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 168 time: 0.0270605
Jan 22 17:08:16 compute-0 haproxy-metadata-proxy-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216162]: 10.100.0.8:55670 [22/Jan/2026:17:08:16.217] listener listener/metadata 0/0/0/31/31 200 152 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.254 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.255 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:16 compute-0 nova_compute[183075]: 2026-01-22 17:08:16.273 183079 DEBUG nova.compute.manager [req-1f4f2966-b2d7-4d0d-9a50-5993341a71a3 req-1988a7b3-3cf4-491a-aea9-fec71e1dac9f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Received event network-vif-plugged-deda3dcd-de47-47e2-8bb1-526d3882d38e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:08:16 compute-0 nova_compute[183075]: 2026-01-22 17:08:16.273 183079 DEBUG oslo_concurrency.lockutils [req-1f4f2966-b2d7-4d0d-9a50-5993341a71a3 req-1988a7b3-3cf4-491a-aea9-fec71e1dac9f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:16 compute-0 nova_compute[183075]: 2026-01-22 17:08:16.274 183079 DEBUG oslo_concurrency.lockutils [req-1f4f2966-b2d7-4d0d-9a50-5993341a71a3 req-1988a7b3-3cf4-491a-aea9-fec71e1dac9f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:16 compute-0 nova_compute[183075]: 2026-01-22 17:08:16.274 183079 DEBUG oslo_concurrency.lockutils [req-1f4f2966-b2d7-4d0d-9a50-5993341a71a3 req-1988a7b3-3cf4-491a-aea9-fec71e1dac9f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:16 compute-0 nova_compute[183075]: 2026-01-22 17:08:16.274 183079 DEBUG nova.compute.manager [req-1f4f2966-b2d7-4d0d-9a50-5993341a71a3 req-1988a7b3-3cf4-491a-aea9-fec71e1dac9f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] No waiting events found dispatching network-vif-plugged-deda3dcd-de47-47e2-8bb1-526d3882d38e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:08:16 compute-0 nova_compute[183075]: 2026-01-22 17:08:16.274 183079 WARNING nova.compute.manager [req-1f4f2966-b2d7-4d0d-9a50-5993341a71a3 req-1988a7b3-3cf4-491a-aea9-fec71e1dac9f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Received unexpected event network-vif-plugged-deda3dcd-de47-47e2-8bb1-526d3882d38e for instance with vm_state active and task_state None.
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.280 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:16 compute-0 haproxy-metadata-proxy-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216162]: 10.100.0.8:55686 [22/Jan/2026:17:08:16.253] listener listener/metadata 0/0/0/27/27 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.281 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0260179
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.287 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.288 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.306 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.307 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0195932
Jan 22 17:08:16 compute-0 haproxy-metadata-proxy-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216162]: 10.100.0.8:55698 [22/Jan/2026:17:08:16.286] listener listener/metadata 0/0/0/20/20 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.312 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.313 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.331 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:16 compute-0 haproxy-metadata-proxy-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216162]: 10.100.0.8:55702 [22/Jan/2026:17:08:16.312] listener listener/metadata 0/0/0/18/18 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.331 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0179222
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.338 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.339 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.351 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.351 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0126848
Jan 22 17:08:16 compute-0 haproxy-metadata-proxy-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216162]: 10.100.0.8:55718 [22/Jan/2026:17:08:16.337] listener listener/metadata 0/0/0/14/14 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.357 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.357 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.371 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.371 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0143433
Jan 22 17:08:16 compute-0 haproxy-metadata-proxy-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216162]: 10.100.0.8:55730 [22/Jan/2026:17:08:16.356] listener listener/metadata 0/0/0/15/15 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.376 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.378 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.395 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.395 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0178576
Jan 22 17:08:16 compute-0 haproxy-metadata-proxy-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216162]: 10.100.0.8:55734 [22/Jan/2026:17:08:16.376] listener listener/metadata 0/0/0/19/19 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.403 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.405 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.428 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:16 compute-0 haproxy-metadata-proxy-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216162]: 10.100.0.8:55742 [22/Jan/2026:17:08:16.403] listener listener/metadata 0/0/0/25/25 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.429 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0239377
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.435 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.436 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.454 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.455 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0185609
Jan 22 17:08:16 compute-0 haproxy-metadata-proxy-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216162]: 10.100.0.8:55754 [22/Jan/2026:17:08:16.435] listener listener/metadata 0/0/0/19/19 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.460 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.461 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:16 compute-0 haproxy-metadata-proxy-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216162]: 10.100.0.8:55766 [22/Jan/2026:17:08:16.460] listener listener/metadata 0/0/0/32/32 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.492 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0308263
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.501 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.502 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.520 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:16 compute-0 haproxy-metadata-proxy-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216162]: 10.100.0.8:55780 [22/Jan/2026:17:08:16.501] listener listener/metadata 0/0/0/19/19 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.520 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0179629
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.523 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.524 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.544 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:16 compute-0 haproxy-metadata-proxy-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216162]: 10.100.0.8:55790 [22/Jan/2026:17:08:16.523] listener listener/metadata 0/0/0/21/21 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.545 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0205204
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.548 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.549 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.600 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:16 compute-0 haproxy-metadata-proxy-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216162]: 10.100.0.8:55794 [22/Jan/2026:17:08:16.547] listener listener/metadata 0/0/0/53/53 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.601 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0525236
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.606 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.607 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.628 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.629 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0217717
Jan 22 17:08:16 compute-0 haproxy-metadata-proxy-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216162]: 10.100.0.8:55810 [22/Jan/2026:17:08:16.605] listener listener/metadata 0/0/0/23/23 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.634 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.636 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.656 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:16.657 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0215046
Jan 22 17:08:16 compute-0 haproxy-metadata-proxy-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216162]: 10.100.0.8:55816 [22/Jan/2026:17:08:16.634] listener listener/metadata 0/0/0/23/23 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.394 183079 DEBUG oslo_concurrency.lockutils [None req-dbcff40d-6ba2-4f5e-8261-49234fadde66 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Acquiring lock "interface-effaddee-27ef-49f6-ac5f-2e3258c8d5d2-deda3dcd-de47-47e2-8bb1-526d3882d38e" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.395 183079 DEBUG oslo_concurrency.lockutils [None req-dbcff40d-6ba2-4f5e-8261-49234fadde66 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "interface-effaddee-27ef-49f6-ac5f-2e3258c8d5d2-deda3dcd-de47-47e2-8bb1-526d3882d38e" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.415 183079 DEBUG nova.objects.instance [None req-dbcff40d-6ba2-4f5e-8261-49234fadde66 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lazy-loading 'flavor' on Instance uuid effaddee-27ef-49f6-ac5f-2e3258c8d5d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.435 183079 DEBUG nova.network.neutron [req-c74d5af5-f2ab-4a94-8f2d-d8046dbff920 req-1cffcb7f-ef7b-4f9f-85f6-98b28bcbf692 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Updated VIF entry in instance network info cache for port deda3dcd-de47-47e2-8bb1-526d3882d38e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.435 183079 DEBUG nova.network.neutron [req-c74d5af5-f2ab-4a94-8f2d-d8046dbff920 req-1cffcb7f-ef7b-4f9f-85f6-98b28bcbf692 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Updating instance_info_cache with network_info: [{"id": "44437e9e-7bcf-4942-83a0-cb6139413a8e", "address": "fa:16:3e:c8:71:80", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44437e9e-7b", "ovs_interfaceid": "44437e9e-7bcf-4942-83a0-cb6139413a8e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "address": "fa:16:3e:a0:1c:cd", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdeda3dcd-de", "ovs_interfaceid": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.438 183079 DEBUG nova.virt.libvirt.vif [None req-dbcff40d-6ba2-4f5e-8261-49234fadde66 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:07:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1783669850',display_name='tempest-server-test-1783669850',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1783669850',id=3,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCYZfzAO5Ui0C6aNyi2ae5nossybaMPhlAmp9Zj0dL/LunopVV9meBl8qfqrR0u5rqBGUC3w2LWkiMuhmtUWjdFFHsn2/jcRm/feYqTpe58YE0uSBcC8gJ0hMPoRirOIAg==',key_name='tempest-keypair-test-1166356111',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:07:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='eb8e9f7a891a4a38af8b01557eddc991',ramdisk_id='',reservation_id='r-89frdmqf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_
input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPPortDetailsTest-1812576526',owner_user_name='tempest-FloatingIPPortDetailsTest-1812576526-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:07:11Z,user_data=None,user_id='d7ee6c51c2b8447baefccea20fa16de5',uuid=effaddee-27ef-49f6-ac5f-2e3258c8d5d2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "address": "fa:16:3e:a0:1c:cd", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdeda3dcd-de", "ovs_interfaceid": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.438 183079 DEBUG nova.network.os_vif_util [None req-dbcff40d-6ba2-4f5e-8261-49234fadde66 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Converting VIF {"id": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "address": "fa:16:3e:a0:1c:cd", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdeda3dcd-de", "ovs_interfaceid": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.438 183079 DEBUG nova.network.os_vif_util [None req-dbcff40d-6ba2-4f5e-8261-49234fadde66 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:1c:cd,bridge_name='br-int',has_traffic_filtering=True,id=deda3dcd-de47-47e2-8bb1-526d3882d38e,network=Network(473b4e99-4018-4fa7-ab1c-2d3e7944d850),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdeda3dcd-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.441 183079 DEBUG nova.virt.libvirt.guest [None req-dbcff40d-6ba2-4f5e-8261-49234fadde66 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:a0:1c:cd"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapdeda3dcd-de"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.445 183079 DEBUG nova.virt.libvirt.guest [None req-dbcff40d-6ba2-4f5e-8261-49234fadde66 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:a0:1c:cd"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapdeda3dcd-de"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.447 183079 DEBUG nova.virt.libvirt.driver [None req-dbcff40d-6ba2-4f5e-8261-49234fadde66 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Attempting to detach device tapdeda3dcd-de from instance effaddee-27ef-49f6-ac5f-2e3258c8d5d2 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.448 183079 DEBUG nova.virt.libvirt.guest [None req-dbcff40d-6ba2-4f5e-8261-49234fadde66 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] detach device xml: <interface type="ethernet">
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <mac address="fa:16:3e:a0:1c:cd"/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <model type="virtio"/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <mtu size="1442"/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <target dev="tapdeda3dcd-de"/>
Jan 22 17:08:17 compute-0 nova_compute[183075]: </interface>
Jan 22 17:08:17 compute-0 nova_compute[183075]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.450 183079 DEBUG oslo_concurrency.lockutils [req-c74d5af5-f2ab-4a94-8f2d-d8046dbff920 req-1cffcb7f-ef7b-4f9f-85f6-98b28bcbf692 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-effaddee-27ef-49f6-ac5f-2e3258c8d5d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.456 183079 DEBUG nova.virt.libvirt.guest [None req-dbcff40d-6ba2-4f5e-8261-49234fadde66 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:a0:1c:cd"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapdeda3dcd-de"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.461 183079 DEBUG nova.virt.libvirt.guest [None req-dbcff40d-6ba2-4f5e-8261-49234fadde66 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:a0:1c:cd"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapdeda3dcd-de"/></interface>not found in domain: <domain type='kvm' id='3'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <name>instance-00000003</name>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <uuid>effaddee-27ef-49f6-ac5f-2e3258c8d5d2</uuid>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <nova:name>tempest-server-test-1783669850</nova:name>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <nova:creationTime>2026-01-22 17:08:16</nova:creationTime>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <nova:flavor name="m1.nano">
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <nova:memory>128</nova:memory>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <nova:disk>1</nova:disk>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <nova:swap>0</nova:swap>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <nova:vcpus>1</nova:vcpus>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   </nova:flavor>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <nova:owner>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <nova:user uuid="d7ee6c51c2b8447baefccea20fa16de5">tempest-FloatingIPPortDetailsTest-1812576526-project-member</nova:user>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <nova:project uuid="eb8e9f7a891a4a38af8b01557eddc991">tempest-FloatingIPPortDetailsTest-1812576526</nova:project>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   </nova:owner>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <nova:ports>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <nova:port uuid="44437e9e-7bcf-4942-83a0-cb6139413a8e">
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </nova:port>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <nova:port uuid="deda3dcd-de47-47e2-8bb1-526d3882d38e">
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </nova:port>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   </nova:ports>
Jan 22 17:08:17 compute-0 nova_compute[183075]: </nova:instance>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <memory unit='KiB'>131072</memory>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <vcpu placement='static'>1</vcpu>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <resource>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <partition>/machine</partition>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   </resource>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <sysinfo type='smbios'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <system>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <entry name='manufacturer'>RDO</entry>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <entry name='product'>OpenStack Compute</entry>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <entry name='serial'>effaddee-27ef-49f6-ac5f-2e3258c8d5d2</entry>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <entry name='uuid'>effaddee-27ef-49f6-ac5f-2e3258c8d5d2</entry>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <entry name='family'>Virtual Machine</entry>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </system>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <os>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <boot dev='hd'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <smbios mode='sysinfo'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   </os>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <features>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <vmcoreinfo state='on'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   </features>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <cpu mode='custom' match='exact' check='full'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <model fallback='forbid'>EPYC-Rome</model>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <vendor>AMD</vendor>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='require' name='x2apic'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='require' name='tsc-deadline'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='require' name='hypervisor'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='require' name='tsc_adjust'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='require' name='spec-ctrl'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='require' name='stibp'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='require' name='ssbd'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='require' name='cmp_legacy'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='require' name='overflow-recov'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='require' name='succor'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='require' name='ibrs'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='require' name='amd-ssbd'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='require' name='virt-ssbd'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='disable' name='lbrv'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='disable' name='tsc-scale'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='disable' name='vmcb-clean'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='disable' name='flushbyasid'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='disable' name='pause-filter'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='disable' name='pfthreshold'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='disable' name='svme-addr-chk'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='require' name='lfence-always-serializing'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='disable' name='xsaves'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='disable' name='svm'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='require' name='topoext'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='disable' name='npt'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='disable' name='nrip-save'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <clock offset='utc'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <timer name='pit' tickpolicy='delay'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <timer name='hpet' present='no'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <on_poweroff>destroy</on_poweroff>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <on_reboot>restart</on_reboot>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <on_crash>destroy</on_crash>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <disk type='file' device='disk'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <source file='/var/lib/nova/instances/effaddee-27ef-49f6-ac5f-2e3258c8d5d2/disk' index='1'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <backingStore type='file' index='2'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:         <format type='raw'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:         <source file='/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:         <backingStore/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       </backingStore>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target dev='vda' bus='virtio'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='virtio-disk0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='0' model='pcie-root'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pcie.0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='1' port='0x10'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.1'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='2' port='0x11'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.2'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='3' port='0x12'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.3'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='4' port='0x13'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.4'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='5' port='0x14'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.5'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='6' port='0x15'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.6'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='7' port='0x16'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.7'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='8' port='0x17'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.8'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='9' port='0x18'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.9'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='10' port='0x19'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.10'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='11' port='0x1a'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.11'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='12' port='0x1b'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.12'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='13' port='0x1c'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.13'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='14' port='0x1d'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.14'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='15' port='0x1e'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.15'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='16' port='0x1f'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.16'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='17' port='0x20'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.17'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='18' port='0x21'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.18'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='19' port='0x22'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.19'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='20' port='0x23'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.20'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='21' port='0x24'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.21'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='22' port='0x25'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.22'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='23' port='0x26'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.23'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='24' port='0x27'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.24'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='25' port='0x28'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.25'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-pci-bridge'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.26'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='usb'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='sata' index='0'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='ide'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <interface type='ethernet'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <mac address='fa:16:3e:c8:71:80'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target dev='tap44437e9e-7b'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model type='virtio'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <driver name='vhost' rx_queue_size='512'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <mtu size='1442'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='net0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <interface type='ethernet'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <mac address='fa:16:3e:a0:1c:cd'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target dev='tapdeda3dcd-de'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model type='virtio'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <driver name='vhost' rx_queue_size='512'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <mtu size='1442'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='net1'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <serial type='pty'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <source path='/dev/pts/1'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <log file='/var/lib/nova/instances/effaddee-27ef-49f6-ac5f-2e3258c8d5d2/console.log' append='off'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target type='isa-serial' port='0'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:         <model name='isa-serial'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       </target>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='serial0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <console type='pty' tty='/dev/pts/1'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <source path='/dev/pts/1'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <log file='/var/lib/nova/instances/effaddee-27ef-49f6-ac5f-2e3258c8d5d2/console.log' append='off'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target type='serial' port='0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='serial0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </console>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <input type='tablet' bus='usb'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='input0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='usb' bus='0' port='1'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </input>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <input type='mouse' bus='ps2'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='input1'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </input>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <input type='keyboard' bus='ps2'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='input2'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </input>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <listen type='address' address='::0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </graphics>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <audio id='1' type='none'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <video>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model type='virtio' heads='1' primary='yes'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='video0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </video>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <watchdog model='itco' action='reset'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='watchdog0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </watchdog>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <memballoon model='virtio'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <stats period='10'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='balloon0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <rng model='virtio'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <backend model='random'>/dev/urandom</backend>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='rng0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <label>system_u:system_r:svirt_t:s0:c687,c864</label>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c687,c864</imagelabel>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   </seclabel>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <label>+107:+107</label>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <imagelabel>+107:+107</imagelabel>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   </seclabel>
Jan 22 17:08:17 compute-0 nova_compute[183075]: </domain>
Jan 22 17:08:17 compute-0 nova_compute[183075]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.462 183079 INFO nova.virt.libvirt.driver [None req-dbcff40d-6ba2-4f5e-8261-49234fadde66 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Successfully detached device tapdeda3dcd-de from instance effaddee-27ef-49f6-ac5f-2e3258c8d5d2 from the persistent domain config.
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.463 183079 DEBUG nova.virt.libvirt.driver [None req-dbcff40d-6ba2-4f5e-8261-49234fadde66 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] (1/8): Attempting to detach device tapdeda3dcd-de with device alias net1 from instance effaddee-27ef-49f6-ac5f-2e3258c8d5d2 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.463 183079 DEBUG nova.virt.libvirt.guest [None req-dbcff40d-6ba2-4f5e-8261-49234fadde66 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] detach device xml: <interface type="ethernet">
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <mac address="fa:16:3e:a0:1c:cd"/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <model type="virtio"/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <mtu size="1442"/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <target dev="tapdeda3dcd-de"/>
Jan 22 17:08:17 compute-0 nova_compute[183075]: </interface>
Jan 22 17:08:17 compute-0 nova_compute[183075]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 22 17:08:17 compute-0 kernel: tapdeda3dcd-de (unregistering): left promiscuous mode
Jan 22 17:08:17 compute-0 NetworkManager[55454]: <info>  [1769101697.5828] device (tapdeda3dcd-de): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.589 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:17 compute-0 ovn_controller[95372]: 2026-01-22T17:08:17Z|00078|binding|INFO|Releasing lport deda3dcd-de47-47e2-8bb1-526d3882d38e from this chassis (sb_readonly=0)
Jan 22 17:08:17 compute-0 ovn_controller[95372]: 2026-01-22T17:08:17Z|00079|binding|INFO|Setting lport deda3dcd-de47-47e2-8bb1-526d3882d38e down in Southbound
Jan 22 17:08:17 compute-0 ovn_controller[95372]: 2026-01-22T17:08:17Z|00080|binding|INFO|Removing iface tapdeda3dcd-de ovn-installed in OVS
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.592 183079 DEBUG nova.virt.libvirt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Received event <DeviceRemovedEvent: 1769101697.59144, effaddee-27ef-49f6-ac5f-2e3258c8d5d2 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.593 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:17.596 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:1c:cd 10.100.0.10'], port_security=['fa:16:3e:a0:1c:cd 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-473b4e99-4018-4fa7-ab1c-2d3e7944d850', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3fb419e8-dd25-4fec-8107-6e7d89977d34', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.234', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a8cd0033-3766-4772-8688-44d3d0e3250a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=deda3dcd-de47-47e2-8bb1-526d3882d38e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.596 183079 DEBUG nova.virt.libvirt.driver [None req-dbcff40d-6ba2-4f5e-8261-49234fadde66 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Start waiting for the detach event from libvirt for device tapdeda3dcd-de with device alias net1 for instance effaddee-27ef-49f6-ac5f-2e3258c8d5d2 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Jan 22 17:08:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:17.597 104629 INFO neutron.agent.ovn.metadata.agent [-] Port deda3dcd-de47-47e2-8bb1-526d3882d38e in datapath 473b4e99-4018-4fa7-ab1c-2d3e7944d850 unbound from our chassis
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.597 183079 DEBUG nova.virt.libvirt.guest [None req-dbcff40d-6ba2-4f5e-8261-49234fadde66 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:a0:1c:cd"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapdeda3dcd-de"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 17:08:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:17.598 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 473b4e99-4018-4fa7-ab1c-2d3e7944d850
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.602 183079 DEBUG nova.virt.libvirt.guest [None req-dbcff40d-6ba2-4f5e-8261-49234fadde66 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:a0:1c:cd"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapdeda3dcd-de"/></interface>not found in domain: <domain type='kvm' id='3'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <name>instance-00000003</name>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <uuid>effaddee-27ef-49f6-ac5f-2e3258c8d5d2</uuid>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <nova:name>tempest-server-test-1783669850</nova:name>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <nova:creationTime>2026-01-22 17:08:16</nova:creationTime>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <nova:flavor name="m1.nano">
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <nova:memory>128</nova:memory>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <nova:disk>1</nova:disk>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <nova:swap>0</nova:swap>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <nova:vcpus>1</nova:vcpus>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   </nova:flavor>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <nova:owner>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <nova:user uuid="d7ee6c51c2b8447baefccea20fa16de5">tempest-FloatingIPPortDetailsTest-1812576526-project-member</nova:user>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <nova:project uuid="eb8e9f7a891a4a38af8b01557eddc991">tempest-FloatingIPPortDetailsTest-1812576526</nova:project>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   </nova:owner>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <nova:ports>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <nova:port uuid="44437e9e-7bcf-4942-83a0-cb6139413a8e">
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </nova:port>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <nova:port uuid="deda3dcd-de47-47e2-8bb1-526d3882d38e">
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </nova:port>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   </nova:ports>
Jan 22 17:08:17 compute-0 nova_compute[183075]: </nova:instance>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <memory unit='KiB'>131072</memory>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <vcpu placement='static'>1</vcpu>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <resource>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <partition>/machine</partition>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   </resource>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <sysinfo type='smbios'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <system>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <entry name='manufacturer'>RDO</entry>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <entry name='product'>OpenStack Compute</entry>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <entry name='serial'>effaddee-27ef-49f6-ac5f-2e3258c8d5d2</entry>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <entry name='uuid'>effaddee-27ef-49f6-ac5f-2e3258c8d5d2</entry>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <entry name='family'>Virtual Machine</entry>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </system>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <os>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <boot dev='hd'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <smbios mode='sysinfo'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   </os>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <features>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <vmcoreinfo state='on'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   </features>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <cpu mode='custom' match='exact' check='full'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <model fallback='forbid'>EPYC-Rome</model>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <vendor>AMD</vendor>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='require' name='x2apic'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='require' name='tsc-deadline'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='require' name='hypervisor'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='require' name='tsc_adjust'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='require' name='spec-ctrl'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='require' name='stibp'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='require' name='ssbd'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='require' name='cmp_legacy'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='require' name='overflow-recov'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='require' name='succor'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='require' name='ibrs'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='require' name='amd-ssbd'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='require' name='virt-ssbd'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='disable' name='lbrv'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='disable' name='tsc-scale'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='disable' name='vmcb-clean'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='disable' name='flushbyasid'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='disable' name='pause-filter'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='disable' name='pfthreshold'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='disable' name='svme-addr-chk'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='require' name='lfence-always-serializing'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='disable' name='xsaves'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='disable' name='svm'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='require' name='topoext'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='disable' name='npt'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <feature policy='disable' name='nrip-save'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <clock offset='utc'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <timer name='pit' tickpolicy='delay'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <timer name='hpet' present='no'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <on_poweroff>destroy</on_poweroff>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <on_reboot>restart</on_reboot>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <on_crash>destroy</on_crash>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <disk type='file' device='disk'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <source file='/var/lib/nova/instances/effaddee-27ef-49f6-ac5f-2e3258c8d5d2/disk' index='1'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <backingStore type='file' index='2'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:         <format type='raw'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:         <source file='/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:         <backingStore/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       </backingStore>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target dev='vda' bus='virtio'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='virtio-disk0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='0' model='pcie-root'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pcie.0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='1' port='0x10'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.1'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='2' port='0x11'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.2'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='3' port='0x12'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.3'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='4' port='0x13'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.4'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='5' port='0x14'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.5'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='6' port='0x15'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.6'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='7' port='0x16'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.7'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='8' port='0x17'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.8'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='9' port='0x18'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.9'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='10' port='0x19'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.10'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='11' port='0x1a'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.11'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='12' port='0x1b'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.12'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='13' port='0x1c'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.13'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='14' port='0x1d'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.14'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='15' port='0x1e'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.15'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='16' port='0x1f'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.16'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='17' port='0x20'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.17'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='18' port='0x21'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.18'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='19' port='0x22'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.19'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='20' port='0x23'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.20'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='21' port='0x24'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.21'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='22' port='0x25'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.22'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='23' port='0x26'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.23'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='24' port='0x27'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.24'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target chassis='25' port='0x28'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.25'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model name='pcie-pci-bridge'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='pci.26'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='usb'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <controller type='sata' index='0'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='ide'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <interface type='ethernet'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <mac address='fa:16:3e:c8:71:80'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target dev='tap44437e9e-7b'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model type='virtio'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <driver name='vhost' rx_queue_size='512'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <mtu size='1442'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='net0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <serial type='pty'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <source path='/dev/pts/1'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <log file='/var/lib/nova/instances/effaddee-27ef-49f6-ac5f-2e3258c8d5d2/console.log' append='off'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target type='isa-serial' port='0'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:         <model name='isa-serial'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       </target>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='serial0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <console type='pty' tty='/dev/pts/1'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <source path='/dev/pts/1'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <log file='/var/lib/nova/instances/effaddee-27ef-49f6-ac5f-2e3258c8d5d2/console.log' append='off'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <target type='serial' port='0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='serial0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </console>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <input type='tablet' bus='usb'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='input0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='usb' bus='0' port='1'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </input>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <input type='mouse' bus='ps2'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='input1'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </input>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <input type='keyboard' bus='ps2'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='input2'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </input>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <listen type='address' address='::0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </graphics>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <audio id='1' type='none'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <video>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <model type='virtio' heads='1' primary='yes'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='video0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </video>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <watchdog model='itco' action='reset'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='watchdog0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </watchdog>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <memballoon model='virtio'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <stats period='10'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='balloon0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <rng model='virtio'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <backend model='random'>/dev/urandom</backend>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <alias name='rng0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <label>system_u:system_r:svirt_t:s0:c687,c864</label>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c687,c864</imagelabel>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   </seclabel>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <label>+107:+107</label>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <imagelabel>+107:+107</imagelabel>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   </seclabel>
Jan 22 17:08:17 compute-0 nova_compute[183075]: </domain>
Jan 22 17:08:17 compute-0 nova_compute[183075]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.602 183079 INFO nova.virt.libvirt.driver [None req-dbcff40d-6ba2-4f5e-8261-49234fadde66 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Successfully detached device tapdeda3dcd-de from instance effaddee-27ef-49f6-ac5f-2e3258c8d5d2 from the live domain config.
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.609 183079 DEBUG nova.virt.libvirt.vif [None req-dbcff40d-6ba2-4f5e-8261-49234fadde66 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:07:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1783669850',display_name='tempest-server-test-1783669850',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1783669850',id=3,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCYZfzAO5Ui0C6aNyi2ae5nossybaMPhlAmp9Zj0dL/LunopVV9meBl8qfqrR0u5rqBGUC3w2LWkiMuhmtUWjdFFHsn2/jcRm/feYqTpe58YE0uSBcC8gJ0hMPoRirOIAg==',key_name='tempest-keypair-test-1166356111',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:07:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='eb8e9f7a891a4a38af8b01557eddc991',ramdisk_id='',reservation_id='r-89frdmqf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_
input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPPortDetailsTest-1812576526',owner_user_name='tempest-FloatingIPPortDetailsTest-1812576526-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:07:11Z,user_data=None,user_id='d7ee6c51c2b8447baefccea20fa16de5',uuid=effaddee-27ef-49f6-ac5f-2e3258c8d5d2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "address": "fa:16:3e:a0:1c:cd", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdeda3dcd-de", "ovs_interfaceid": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.610 183079 DEBUG nova.network.os_vif_util [None req-dbcff40d-6ba2-4f5e-8261-49234fadde66 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Converting VIF {"id": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "address": "fa:16:3e:a0:1c:cd", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdeda3dcd-de", "ovs_interfaceid": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.611 183079 DEBUG nova.network.os_vif_util [None req-dbcff40d-6ba2-4f5e-8261-49234fadde66 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:1c:cd,bridge_name='br-int',has_traffic_filtering=True,id=deda3dcd-de47-47e2-8bb1-526d3882d38e,network=Network(473b4e99-4018-4fa7-ab1c-2d3e7944d850),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdeda3dcd-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.612 183079 DEBUG os_vif [None req-dbcff40d-6ba2-4f5e-8261-49234fadde66 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:1c:cd,bridge_name='br-int',has_traffic_filtering=True,id=deda3dcd-de47-47e2-8bb1-526d3882d38e,network=Network(473b4e99-4018-4fa7-ab1c-2d3e7944d850),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdeda3dcd-de') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.616 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.617 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdeda3dcd-de, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.619 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.620 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.624 183079 INFO os_vif [None req-dbcff40d-6ba2-4f5e-8261-49234fadde66 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:1c:cd,bridge_name='br-int',has_traffic_filtering=True,id=deda3dcd-de47-47e2-8bb1-526d3882d38e,network=Network(473b4e99-4018-4fa7-ab1c-2d3e7944d850),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdeda3dcd-de')
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.626 183079 DEBUG nova.virt.libvirt.guest [None req-dbcff40d-6ba2-4f5e-8261-49234fadde66 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <nova:name>tempest-server-test-1783669850</nova:name>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <nova:creationTime>2026-01-22 17:08:17</nova:creationTime>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <nova:flavor name="m1.nano">
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <nova:memory>128</nova:memory>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <nova:disk>1</nova:disk>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <nova:swap>0</nova:swap>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <nova:vcpus>1</nova:vcpus>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   </nova:flavor>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <nova:owner>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <nova:user uuid="d7ee6c51c2b8447baefccea20fa16de5">tempest-FloatingIPPortDetailsTest-1812576526-project-member</nova:user>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <nova:project uuid="eb8e9f7a891a4a38af8b01557eddc991">tempest-FloatingIPPortDetailsTest-1812576526</nova:project>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   </nova:owner>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   <nova:ports>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     <nova:port uuid="44437e9e-7bcf-4942-83a0-cb6139413a8e">
Jan 22 17:08:17 compute-0 nova_compute[183075]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 22 17:08:17 compute-0 nova_compute[183075]:     </nova:port>
Jan 22 17:08:17 compute-0 nova_compute[183075]:   </nova:ports>
Jan 22 17:08:17 compute-0 nova_compute[183075]: </nova:instance>
Jan 22 17:08:17 compute-0 nova_compute[183075]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 22 17:08:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:17.626 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[bdc85765-0ef5-401b-9dc2-11db71894450]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:17.670 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[1dc883a9-5766-41fb-8ef4-378316702cdd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:17.676 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[630174d6-3570-4b12-a68d-0c8e27961816]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:17.725 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[f41f3d6e-5081-48bb-b449-6c7c8f1533f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:17.741 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3600a587-eb36-4e8f-a1b7-1d15ee9aabce]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap473b4e99-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4d:87:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 202, 'tx_packets': 107, 'rx_bytes': 17308, 'tx_bytes': 12076, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 202, 'tx_packets': 107, 'rx_bytes': 17308, 'tx_bytes': 12076, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399443, 'reachable_time': 32301, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216291, 'error': None, 'target': 'ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:17.763 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6d00ec44-0711-4174-8102-e40ef0dce45f]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap473b4e99-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 399457, 'tstamp': 399457}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216292, 'error': None, 'target': 'ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap473b4e99-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 399460, 'tstamp': 399460}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216292, 'error': None, 'target': 'ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:17.766 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap473b4e99-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.768 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.784 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:17 compute-0 nova_compute[183075]: 2026-01-22 17:08:17.785 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:17.786 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap473b4e99-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:17.787 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:08:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:17.787 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap473b4e99-40, col_values=(('external_ids', {'iface-id': '424ac40e-403e-4504-adbb-47a319b401fd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:17.788 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:08:18 compute-0 nova_compute[183075]: 2026-01-22 17:08:18.561 183079 DEBUG nova.compute.manager [req-7762c350-a29b-4291-8f4b-e45777131b0c req-daeba1ab-311e-4259-965d-594e714a513b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Received event network-vif-plugged-deda3dcd-de47-47e2-8bb1-526d3882d38e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:08:18 compute-0 nova_compute[183075]: 2026-01-22 17:08:18.561 183079 DEBUG oslo_concurrency.lockutils [req-7762c350-a29b-4291-8f4b-e45777131b0c req-daeba1ab-311e-4259-965d-594e714a513b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:18 compute-0 nova_compute[183075]: 2026-01-22 17:08:18.561 183079 DEBUG oslo_concurrency.lockutils [req-7762c350-a29b-4291-8f4b-e45777131b0c req-daeba1ab-311e-4259-965d-594e714a513b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:18 compute-0 nova_compute[183075]: 2026-01-22 17:08:18.562 183079 DEBUG oslo_concurrency.lockutils [req-7762c350-a29b-4291-8f4b-e45777131b0c req-daeba1ab-311e-4259-965d-594e714a513b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:18 compute-0 nova_compute[183075]: 2026-01-22 17:08:18.562 183079 DEBUG nova.compute.manager [req-7762c350-a29b-4291-8f4b-e45777131b0c req-daeba1ab-311e-4259-965d-594e714a513b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] No waiting events found dispatching network-vif-plugged-deda3dcd-de47-47e2-8bb1-526d3882d38e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:08:18 compute-0 nova_compute[183075]: 2026-01-22 17:08:18.562 183079 WARNING nova.compute.manager [req-7762c350-a29b-4291-8f4b-e45777131b0c req-daeba1ab-311e-4259-965d-594e714a513b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Received unexpected event network-vif-plugged-deda3dcd-de47-47e2-8bb1-526d3882d38e for instance with vm_state active and task_state None.
Jan 22 17:08:18 compute-0 nova_compute[183075]: 2026-01-22 17:08:18.562 183079 DEBUG nova.compute.manager [req-7762c350-a29b-4291-8f4b-e45777131b0c req-daeba1ab-311e-4259-965d-594e714a513b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Received event network-vif-unplugged-deda3dcd-de47-47e2-8bb1-526d3882d38e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:08:18 compute-0 nova_compute[183075]: 2026-01-22 17:08:18.562 183079 DEBUG oslo_concurrency.lockutils [req-7762c350-a29b-4291-8f4b-e45777131b0c req-daeba1ab-311e-4259-965d-594e714a513b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:18 compute-0 nova_compute[183075]: 2026-01-22 17:08:18.563 183079 DEBUG oslo_concurrency.lockutils [req-7762c350-a29b-4291-8f4b-e45777131b0c req-daeba1ab-311e-4259-965d-594e714a513b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:18 compute-0 nova_compute[183075]: 2026-01-22 17:08:18.563 183079 DEBUG oslo_concurrency.lockutils [req-7762c350-a29b-4291-8f4b-e45777131b0c req-daeba1ab-311e-4259-965d-594e714a513b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:18 compute-0 nova_compute[183075]: 2026-01-22 17:08:18.563 183079 DEBUG nova.compute.manager [req-7762c350-a29b-4291-8f4b-e45777131b0c req-daeba1ab-311e-4259-965d-594e714a513b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] No waiting events found dispatching network-vif-unplugged-deda3dcd-de47-47e2-8bb1-526d3882d38e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:08:18 compute-0 nova_compute[183075]: 2026-01-22 17:08:18.563 183079 WARNING nova.compute.manager [req-7762c350-a29b-4291-8f4b-e45777131b0c req-daeba1ab-311e-4259-965d-594e714a513b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Received unexpected event network-vif-unplugged-deda3dcd-de47-47e2-8bb1-526d3882d38e for instance with vm_state active and task_state None.
Jan 22 17:08:18 compute-0 nova_compute[183075]: 2026-01-22 17:08:18.563 183079 DEBUG nova.compute.manager [req-7762c350-a29b-4291-8f4b-e45777131b0c req-daeba1ab-311e-4259-965d-594e714a513b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Received event network-vif-plugged-deda3dcd-de47-47e2-8bb1-526d3882d38e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:08:18 compute-0 nova_compute[183075]: 2026-01-22 17:08:18.564 183079 DEBUG oslo_concurrency.lockutils [req-7762c350-a29b-4291-8f4b-e45777131b0c req-daeba1ab-311e-4259-965d-594e714a513b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:18 compute-0 nova_compute[183075]: 2026-01-22 17:08:18.564 183079 DEBUG oslo_concurrency.lockutils [req-7762c350-a29b-4291-8f4b-e45777131b0c req-daeba1ab-311e-4259-965d-594e714a513b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:18 compute-0 nova_compute[183075]: 2026-01-22 17:08:18.564 183079 DEBUG oslo_concurrency.lockutils [req-7762c350-a29b-4291-8f4b-e45777131b0c req-daeba1ab-311e-4259-965d-594e714a513b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:18 compute-0 nova_compute[183075]: 2026-01-22 17:08:18.564 183079 DEBUG nova.compute.manager [req-7762c350-a29b-4291-8f4b-e45777131b0c req-daeba1ab-311e-4259-965d-594e714a513b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] No waiting events found dispatching network-vif-plugged-deda3dcd-de47-47e2-8bb1-526d3882d38e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:08:18 compute-0 nova_compute[183075]: 2026-01-22 17:08:18.564 183079 WARNING nova.compute.manager [req-7762c350-a29b-4291-8f4b-e45777131b0c req-daeba1ab-311e-4259-965d-594e714a513b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Received unexpected event network-vif-plugged-deda3dcd-de47-47e2-8bb1-526d3882d38e for instance with vm_state active and task_state None.
Jan 22 17:08:19 compute-0 nova_compute[183075]: 2026-01-22 17:08:19.301 183079 DEBUG oslo_concurrency.lockutils [None req-dbcff40d-6ba2-4f5e-8261-49234fadde66 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Acquiring lock "refresh_cache-effaddee-27ef-49f6-ac5f-2e3258c8d5d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:08:19 compute-0 nova_compute[183075]: 2026-01-22 17:08:19.301 183079 DEBUG oslo_concurrency.lockutils [None req-dbcff40d-6ba2-4f5e-8261-49234fadde66 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Acquired lock "refresh_cache-effaddee-27ef-49f6-ac5f-2e3258c8d5d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:08:19 compute-0 nova_compute[183075]: 2026-01-22 17:08:19.301 183079 DEBUG nova.network.neutron [None req-dbcff40d-6ba2-4f5e-8261-49234fadde66 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:08:19 compute-0 nova_compute[183075]: 2026-01-22 17:08:19.855 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:20 compute-0 podman[216294]: 2026-01-22 17:08:20.40669431 +0000 UTC m=+0.105495290 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:08:20 compute-0 podman[216293]: 2026-01-22 17:08:20.431537749 +0000 UTC m=+0.124928998 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:08:20 compute-0 nova_compute[183075]: 2026-01-22 17:08:20.776 183079 INFO nova.network.neutron [None req-dbcff40d-6ba2-4f5e-8261-49234fadde66 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Port deda3dcd-de47-47e2-8bb1-526d3882d38e from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Jan 22 17:08:20 compute-0 nova_compute[183075]: 2026-01-22 17:08:20.776 183079 DEBUG nova.network.neutron [None req-dbcff40d-6ba2-4f5e-8261-49234fadde66 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Updating instance_info_cache with network_info: [{"id": "44437e9e-7bcf-4942-83a0-cb6139413a8e", "address": "fa:16:3e:c8:71:80", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44437e9e-7b", "ovs_interfaceid": "44437e9e-7bcf-4942-83a0-cb6139413a8e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:08:20 compute-0 nova_compute[183075]: 2026-01-22 17:08:20.802 183079 DEBUG oslo_concurrency.lockutils [None req-dbcff40d-6ba2-4f5e-8261-49234fadde66 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Releasing lock "refresh_cache-effaddee-27ef-49f6-ac5f-2e3258c8d5d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:08:20 compute-0 nova_compute[183075]: 2026-01-22 17:08:20.832 183079 DEBUG oslo_concurrency.lockutils [None req-dbcff40d-6ba2-4f5e-8261-49234fadde66 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "interface-effaddee-27ef-49f6-ac5f-2e3258c8d5d2-deda3dcd-de47-47e2-8bb1-526d3882d38e" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 3.437s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:21 compute-0 nova_compute[183075]: 2026-01-22 17:08:21.638 183079 INFO nova.compute.manager [None req-bcc7b963-b567-4c64-8911-cfb90a3c51fd 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Get console output
Jan 22 17:08:21 compute-0 nova_compute[183075]: 2026-01-22 17:08:21.645 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:08:22 compute-0 podman[216336]: 2026-01-22 17:08:22.344649207 +0000 UTC m=+0.057668009 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, name=ubi9-minimal, container_name=openstack_network_exporter, vendor=Red Hat, Inc., managed_by=edpm_ansible, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, release=1755695350, io.buildah.version=1.33.7, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 17:08:22 compute-0 nova_compute[183075]: 2026-01-22 17:08:22.620 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:24 compute-0 nova_compute[183075]: 2026-01-22 17:08:24.318 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:24 compute-0 nova_compute[183075]: 2026-01-22 17:08:24.857 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:25 compute-0 nova_compute[183075]: 2026-01-22 17:08:25.377 183079 DEBUG oslo_concurrency.lockutils [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Acquiring lock "fa77228b-8be7-4bab-9a40-7241201bdbff" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:25 compute-0 nova_compute[183075]: 2026-01-22 17:08:25.378 183079 DEBUG oslo_concurrency.lockutils [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "fa77228b-8be7-4bab-9a40-7241201bdbff" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:25 compute-0 nova_compute[183075]: 2026-01-22 17:08:25.394 183079 DEBUG nova.compute.manager [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:08:25 compute-0 nova_compute[183075]: 2026-01-22 17:08:25.471 183079 DEBUG oslo_concurrency.lockutils [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:25 compute-0 nova_compute[183075]: 2026-01-22 17:08:25.471 183079 DEBUG oslo_concurrency.lockutils [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:25 compute-0 nova_compute[183075]: 2026-01-22 17:08:25.479 183079 DEBUG nova.virt.hardware [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:08:25 compute-0 nova_compute[183075]: 2026-01-22 17:08:25.479 183079 INFO nova.compute.claims [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:08:25 compute-0 nova_compute[183075]: 2026-01-22 17:08:25.696 183079 DEBUG nova.compute.provider_tree [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:08:25 compute-0 nova_compute[183075]: 2026-01-22 17:08:25.714 183079 DEBUG nova.scheduler.client.report [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:08:25 compute-0 nova_compute[183075]: 2026-01-22 17:08:25.735 183079 DEBUG oslo_concurrency.lockutils [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.263s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:25 compute-0 nova_compute[183075]: 2026-01-22 17:08:25.735 183079 DEBUG nova.compute.manager [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:08:25 compute-0 nova_compute[183075]: 2026-01-22 17:08:25.803 183079 DEBUG nova.compute.manager [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:08:25 compute-0 nova_compute[183075]: 2026-01-22 17:08:25.804 183079 DEBUG nova.network.neutron [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:08:25 compute-0 nova_compute[183075]: 2026-01-22 17:08:25.838 183079 INFO nova.virt.libvirt.driver [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:08:25 compute-0 nova_compute[183075]: 2026-01-22 17:08:25.861 183079 DEBUG nova.compute.manager [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:08:25 compute-0 nova_compute[183075]: 2026-01-22 17:08:25.961 183079 DEBUG nova.compute.manager [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:08:25 compute-0 nova_compute[183075]: 2026-01-22 17:08:25.963 183079 DEBUG nova.virt.libvirt.driver [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:08:25 compute-0 nova_compute[183075]: 2026-01-22 17:08:25.964 183079 INFO nova.virt.libvirt.driver [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Creating image(s)
Jan 22 17:08:25 compute-0 nova_compute[183075]: 2026-01-22 17:08:25.965 183079 DEBUG oslo_concurrency.lockutils [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Acquiring lock "/var/lib/nova/instances/fa77228b-8be7-4bab-9a40-7241201bdbff/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:25 compute-0 nova_compute[183075]: 2026-01-22 17:08:25.966 183079 DEBUG oslo_concurrency.lockutils [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "/var/lib/nova/instances/fa77228b-8be7-4bab-9a40-7241201bdbff/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:25 compute-0 nova_compute[183075]: 2026-01-22 17:08:25.967 183079 DEBUG oslo_concurrency.lockutils [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "/var/lib/nova/instances/fa77228b-8be7-4bab-9a40-7241201bdbff/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:25 compute-0 nova_compute[183075]: 2026-01-22 17:08:25.983 183079 DEBUG oslo_concurrency.processutils [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:08:26 compute-0 nova_compute[183075]: 2026-01-22 17:08:26.082 183079 DEBUG oslo_concurrency.processutils [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:08:26 compute-0 nova_compute[183075]: 2026-01-22 17:08:26.084 183079 DEBUG oslo_concurrency.lockutils [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:26 compute-0 nova_compute[183075]: 2026-01-22 17:08:26.085 183079 DEBUG oslo_concurrency.lockutils [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:26 compute-0 nova_compute[183075]: 2026-01-22 17:08:26.102 183079 DEBUG oslo_concurrency.processutils [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:08:26 compute-0 nova_compute[183075]: 2026-01-22 17:08:26.157 183079 DEBUG oslo_concurrency.processutils [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:08:26 compute-0 nova_compute[183075]: 2026-01-22 17:08:26.158 183079 DEBUG oslo_concurrency.processutils [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/fa77228b-8be7-4bab-9a40-7241201bdbff/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:08:26 compute-0 nova_compute[183075]: 2026-01-22 17:08:26.202 183079 DEBUG oslo_concurrency.processutils [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/fa77228b-8be7-4bab-9a40-7241201bdbff/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:08:26 compute-0 nova_compute[183075]: 2026-01-22 17:08:26.203 183079 DEBUG oslo_concurrency.lockutils [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:26 compute-0 nova_compute[183075]: 2026-01-22 17:08:26.204 183079 DEBUG oslo_concurrency.processutils [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:08:26 compute-0 nova_compute[183075]: 2026-01-22 17:08:26.260 183079 DEBUG oslo_concurrency.processutils [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:08:26 compute-0 nova_compute[183075]: 2026-01-22 17:08:26.261 183079 DEBUG nova.virt.disk.api [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Checking if we can resize image /var/lib/nova/instances/fa77228b-8be7-4bab-9a40-7241201bdbff/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:08:26 compute-0 nova_compute[183075]: 2026-01-22 17:08:26.261 183079 DEBUG oslo_concurrency.processutils [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fa77228b-8be7-4bab-9a40-7241201bdbff/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:08:26 compute-0 nova_compute[183075]: 2026-01-22 17:08:26.314 183079 DEBUG oslo_concurrency.processutils [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fa77228b-8be7-4bab-9a40-7241201bdbff/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:08:26 compute-0 nova_compute[183075]: 2026-01-22 17:08:26.315 183079 DEBUG nova.virt.disk.api [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Cannot resize image /var/lib/nova/instances/fa77228b-8be7-4bab-9a40-7241201bdbff/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:08:26 compute-0 nova_compute[183075]: 2026-01-22 17:08:26.315 183079 DEBUG nova.objects.instance [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lazy-loading 'migration_context' on Instance uuid fa77228b-8be7-4bab-9a40-7241201bdbff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:08:26 compute-0 nova_compute[183075]: 2026-01-22 17:08:26.340 183079 DEBUG nova.virt.libvirt.driver [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:08:26 compute-0 nova_compute[183075]: 2026-01-22 17:08:26.341 183079 DEBUG nova.virt.libvirt.driver [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Ensure instance console log exists: /var/lib/nova/instances/fa77228b-8be7-4bab-9a40-7241201bdbff/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:08:26 compute-0 nova_compute[183075]: 2026-01-22 17:08:26.341 183079 DEBUG oslo_concurrency.lockutils [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:26 compute-0 nova_compute[183075]: 2026-01-22 17:08:26.342 183079 DEBUG oslo_concurrency.lockutils [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:26 compute-0 nova_compute[183075]: 2026-01-22 17:08:26.342 183079 DEBUG oslo_concurrency.lockutils [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:26 compute-0 nova_compute[183075]: 2026-01-22 17:08:26.712 183079 DEBUG nova.network.neutron [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Successfully created port: 611e2a01-0a7c-4b7f-a941-623f993a5547 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:08:27 compute-0 podman[216373]: 2026-01-22 17:08:27.355067446 +0000 UTC m=+0.062524776 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:08:27 compute-0 nova_compute[183075]: 2026-01-22 17:08:27.399 183079 DEBUG nova.network.neutron [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Successfully updated port: 611e2a01-0a7c-4b7f-a941-623f993a5547 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:08:27 compute-0 nova_compute[183075]: 2026-01-22 17:08:27.418 183079 DEBUG oslo_concurrency.lockutils [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Acquiring lock "refresh_cache-fa77228b-8be7-4bab-9a40-7241201bdbff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:08:27 compute-0 nova_compute[183075]: 2026-01-22 17:08:27.418 183079 DEBUG oslo_concurrency.lockutils [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Acquired lock "refresh_cache-fa77228b-8be7-4bab-9a40-7241201bdbff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:08:27 compute-0 nova_compute[183075]: 2026-01-22 17:08:27.418 183079 DEBUG nova.network.neutron [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:08:27 compute-0 nova_compute[183075]: 2026-01-22 17:08:27.509 183079 DEBUG nova.compute.manager [req-016ecd24-f4b1-46a3-aad2-0c91651642e6 req-3473966e-3e87-43ff-a00d-7f22274f740e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Received event network-changed-611e2a01-0a7c-4b7f-a941-623f993a5547 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:08:27 compute-0 nova_compute[183075]: 2026-01-22 17:08:27.509 183079 DEBUG nova.compute.manager [req-016ecd24-f4b1-46a3-aad2-0c91651642e6 req-3473966e-3e87-43ff-a00d-7f22274f740e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Refreshing instance network info cache due to event network-changed-611e2a01-0a7c-4b7f-a941-623f993a5547. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:08:27 compute-0 nova_compute[183075]: 2026-01-22 17:08:27.510 183079 DEBUG oslo_concurrency.lockutils [req-016ecd24-f4b1-46a3-aad2-0c91651642e6 req-3473966e-3e87-43ff-a00d-7f22274f740e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-fa77228b-8be7-4bab-9a40-7241201bdbff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:08:27 compute-0 nova_compute[183075]: 2026-01-22 17:08:27.622 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:27 compute-0 nova_compute[183075]: 2026-01-22 17:08:27.860 183079 DEBUG nova.network.neutron [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.509 183079 DEBUG oslo_concurrency.lockutils [None req-39db135c-c216-4e93-9ab8-d612592604b2 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Acquiring lock "interface-000b64b8-bcc5-4bbe-9703-8400a83a27d0-deda3dcd-de47-47e2-8bb1-526d3882d38e" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.510 183079 DEBUG oslo_concurrency.lockutils [None req-39db135c-c216-4e93-9ab8-d612592604b2 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "interface-000b64b8-bcc5-4bbe-9703-8400a83a27d0-deda3dcd-de47-47e2-8bb1-526d3882d38e" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.511 183079 DEBUG nova.objects.instance [None req-39db135c-c216-4e93-9ab8-d612592604b2 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lazy-loading 'flavor' on Instance uuid 000b64b8-bcc5-4bbe-9703-8400a83a27d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.819 183079 DEBUG nova.network.neutron [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Updating instance_info_cache with network_info: [{"id": "611e2a01-0a7c-4b7f-a941-623f993a5547", "address": "fa:16:3e:67:5c:7a", "network": {"id": "9fe870b5-173a-4c8a-b406-6fedb3ddcc4e", "bridge": "br-int", "label": "tempest-test-network--1353019071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "572446eea54c4f0baab88bf4419b4082", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap611e2a01-0a", "ovs_interfaceid": "611e2a01-0a7c-4b7f-a941-623f993a5547", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.849 183079 DEBUG oslo_concurrency.lockutils [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Releasing lock "refresh_cache-fa77228b-8be7-4bab-9a40-7241201bdbff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.850 183079 DEBUG nova.compute.manager [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Instance network_info: |[{"id": "611e2a01-0a7c-4b7f-a941-623f993a5547", "address": "fa:16:3e:67:5c:7a", "network": {"id": "9fe870b5-173a-4c8a-b406-6fedb3ddcc4e", "bridge": "br-int", "label": "tempest-test-network--1353019071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "572446eea54c4f0baab88bf4419b4082", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap611e2a01-0a", "ovs_interfaceid": "611e2a01-0a7c-4b7f-a941-623f993a5547", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.851 183079 DEBUG oslo_concurrency.lockutils [req-016ecd24-f4b1-46a3-aad2-0c91651642e6 req-3473966e-3e87-43ff-a00d-7f22274f740e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-fa77228b-8be7-4bab-9a40-7241201bdbff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.851 183079 DEBUG nova.network.neutron [req-016ecd24-f4b1-46a3-aad2-0c91651642e6 req-3473966e-3e87-43ff-a00d-7f22274f740e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Refreshing network info cache for port 611e2a01-0a7c-4b7f-a941-623f993a5547 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.856 183079 DEBUG nova.virt.libvirt.driver [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Start _get_guest_xml network_info=[{"id": "611e2a01-0a7c-4b7f-a941-623f993a5547", "address": "fa:16:3e:67:5c:7a", "network": {"id": "9fe870b5-173a-4c8a-b406-6fedb3ddcc4e", "bridge": "br-int", "label": "tempest-test-network--1353019071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "572446eea54c4f0baab88bf4419b4082", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap611e2a01-0a", "ovs_interfaceid": "611e2a01-0a7c-4b7f-a941-623f993a5547", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.864 183079 WARNING nova.virt.libvirt.driver [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.869 183079 DEBUG nova.virt.libvirt.host [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.870 183079 DEBUG nova.virt.libvirt.host [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.882 183079 DEBUG nova.virt.libvirt.host [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.883 183079 DEBUG nova.virt.libvirt.host [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.883 183079 DEBUG nova.virt.libvirt.driver [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.884 183079 DEBUG nova.virt.hardware [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.885 183079 DEBUG nova.virt.hardware [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.885 183079 DEBUG nova.virt.hardware [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.885 183079 DEBUG nova.virt.hardware [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.886 183079 DEBUG nova.virt.hardware [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.886 183079 DEBUG nova.virt.hardware [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.886 183079 DEBUG nova.virt.hardware [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.887 183079 DEBUG nova.virt.hardware [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.887 183079 DEBUG nova.virt.hardware [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.887 183079 DEBUG nova.virt.hardware [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.888 183079 DEBUG nova.virt.hardware [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.894 183079 DEBUG nova.virt.libvirt.vif [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:08:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1530587158',display_name='tempest-server-test-1530587158',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1530587158',id=7,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOzuS5c+/c3nchlEjzQZcF6sVTY+If+Ronj929KhDV7B+UoVTOX4QPWtIie7F0q5o+yMVIvrndYwVXNXP5sf1WYo75cdEmtnNs2NDMwzYXACOAw67rfC/ZLmfMKPakZxTQ==',key_name='tempest-keypair-test-694600506',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6884ab5c00114ca19f253d0c91e2706f',ramdisk_id='',reservation_id='r-hnn1nfqy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpTestCasesAdmin-1567259425',owner_user_name='tempest-FloatingIpTestCasesAdmin-1567259425-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:08:25Z,user_data=None,user_id='7554977cf766467891ad30986750ca88',uuid=fa77228b-8be7-4bab-9a40-7241201bdbff,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "611e2a01-0a7c-4b7f-a941-623f993a5547", "address": "fa:16:3e:67:5c:7a", "network": {"id": "9fe870b5-173a-4c8a-b406-6fedb3ddcc4e", "bridge": "br-int", "label": "tempest-test-network--1353019071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "572446eea54c4f0baab88bf4419b4082", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap611e2a01-0a", "ovs_interfaceid": "611e2a01-0a7c-4b7f-a941-623f993a5547", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.895 183079 DEBUG nova.network.os_vif_util [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Converting VIF {"id": "611e2a01-0a7c-4b7f-a941-623f993a5547", "address": "fa:16:3e:67:5c:7a", "network": {"id": "9fe870b5-173a-4c8a-b406-6fedb3ddcc4e", "bridge": "br-int", "label": "tempest-test-network--1353019071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "572446eea54c4f0baab88bf4419b4082", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap611e2a01-0a", "ovs_interfaceid": "611e2a01-0a7c-4b7f-a941-623f993a5547", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.896 183079 DEBUG nova.network.os_vif_util [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:67:5c:7a,bridge_name='br-int',has_traffic_filtering=True,id=611e2a01-0a7c-4b7f-a941-623f993a5547,network=Network(9fe870b5-173a-4c8a-b406-6fedb3ddcc4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap611e2a01-0a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.897 183079 DEBUG nova.objects.instance [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lazy-loading 'pci_devices' on Instance uuid fa77228b-8be7-4bab-9a40-7241201bdbff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.915 183079 DEBUG nova.virt.libvirt.driver [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:08:28 compute-0 nova_compute[183075]:   <uuid>fa77228b-8be7-4bab-9a40-7241201bdbff</uuid>
Jan 22 17:08:28 compute-0 nova_compute[183075]:   <name>instance-00000007</name>
Jan 22 17:08:28 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:08:28 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:08:28 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:08:28 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1530587158</nova:name>
Jan 22 17:08:28 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:08:28</nova:creationTime>
Jan 22 17:08:28 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:08:28 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:08:28 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:08:28 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:08:28 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:08:28 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:08:28 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:08:28 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:08:28 compute-0 nova_compute[183075]:         <nova:user uuid="7554977cf766467891ad30986750ca88">tempest-FloatingIpTestCasesAdmin-1567259425-project-admin</nova:user>
Jan 22 17:08:28 compute-0 nova_compute[183075]:         <nova:project uuid="6884ab5c00114ca19f253d0c91e2706f">tempest-FloatingIpTestCasesAdmin-1567259425</nova:project>
Jan 22 17:08:28 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:08:28 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:08:28 compute-0 nova_compute[183075]:         <nova:port uuid="611e2a01-0a7c-4b7f-a941-623f993a5547">
Jan 22 17:08:28 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:08:28 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:08:28 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:08:28 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <system>
Jan 22 17:08:28 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:08:28 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:08:28 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:08:28 compute-0 nova_compute[183075]:       <entry name="serial">fa77228b-8be7-4bab-9a40-7241201bdbff</entry>
Jan 22 17:08:28 compute-0 nova_compute[183075]:       <entry name="uuid">fa77228b-8be7-4bab-9a40-7241201bdbff</entry>
Jan 22 17:08:28 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     </system>
Jan 22 17:08:28 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:08:28 compute-0 nova_compute[183075]:   <os>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:   </os>
Jan 22 17:08:28 compute-0 nova_compute[183075]:   <features>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:   </features>
Jan 22 17:08:28 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:08:28 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:08:28 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:08:28 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/fa77228b-8be7-4bab-9a40-7241201bdbff/disk"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:08:28 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:67:5c:7a"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:       <target dev="tap611e2a01-0a"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:08:28 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/fa77228b-8be7-4bab-9a40-7241201bdbff/console.log" append="off"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <video>
Jan 22 17:08:28 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     </video>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:08:28 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:08:28 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:08:28 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:08:28 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:08:28 compute-0 nova_compute[183075]: </domain>
Jan 22 17:08:28 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.916 183079 DEBUG nova.compute.manager [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Preparing to wait for external event network-vif-plugged-611e2a01-0a7c-4b7f-a941-623f993a5547 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.917 183079 DEBUG oslo_concurrency.lockutils [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Acquiring lock "fa77228b-8be7-4bab-9a40-7241201bdbff-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.917 183079 DEBUG oslo_concurrency.lockutils [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "fa77228b-8be7-4bab-9a40-7241201bdbff-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.917 183079 DEBUG oslo_concurrency.lockutils [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "fa77228b-8be7-4bab-9a40-7241201bdbff-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.919 183079 DEBUG nova.virt.libvirt.vif [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:08:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1530587158',display_name='tempest-server-test-1530587158',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1530587158',id=7,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOzuS5c+/c3nchlEjzQZcF6sVTY+If+Ronj929KhDV7B+UoVTOX4QPWtIie7F0q5o+yMVIvrndYwVXNXP5sf1WYo75cdEmtnNs2NDMwzYXACOAw67rfC/ZLmfMKPakZxTQ==',key_name='tempest-keypair-test-694600506',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6884ab5c00114ca19f253d0c91e2706f',ramdisk_id='',reservation_id='r-hnn1nfqy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpTestCasesAdmin-1567259425',owner_user_name='tempest-FloatingIpTestCasesAdmin-1567259425-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:08:25Z,user_data=None,user_id='7554977cf766467891ad30986750ca88',uuid=fa77228b-8be7-4bab-9a40-7241201bdbff,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "611e2a01-0a7c-4b7f-a941-623f993a5547", "address": "fa:16:3e:67:5c:7a", "network": {"id": "9fe870b5-173a-4c8a-b406-6fedb3ddcc4e", "bridge": "br-int", "label": "tempest-test-network--1353019071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "572446eea54c4f0baab88bf4419b4082", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap611e2a01-0a", "ovs_interfaceid": "611e2a01-0a7c-4b7f-a941-623f993a5547", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.919 183079 DEBUG nova.network.os_vif_util [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Converting VIF {"id": "611e2a01-0a7c-4b7f-a941-623f993a5547", "address": "fa:16:3e:67:5c:7a", "network": {"id": "9fe870b5-173a-4c8a-b406-6fedb3ddcc4e", "bridge": "br-int", "label": "tempest-test-network--1353019071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "572446eea54c4f0baab88bf4419b4082", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap611e2a01-0a", "ovs_interfaceid": "611e2a01-0a7c-4b7f-a941-623f993a5547", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.920 183079 DEBUG nova.network.os_vif_util [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:67:5c:7a,bridge_name='br-int',has_traffic_filtering=True,id=611e2a01-0a7c-4b7f-a941-623f993a5547,network=Network(9fe870b5-173a-4c8a-b406-6fedb3ddcc4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap611e2a01-0a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.921 183079 DEBUG os_vif [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:67:5c:7a,bridge_name='br-int',has_traffic_filtering=True,id=611e2a01-0a7c-4b7f-a941-623f993a5547,network=Network(9fe870b5-173a-4c8a-b406-6fedb3ddcc4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap611e2a01-0a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.922 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.922 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.923 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.927 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.927 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap611e2a01-0a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.928 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap611e2a01-0a, col_values=(('external_ids', {'iface-id': '611e2a01-0a7c-4b7f-a941-623f993a5547', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:67:5c:7a', 'vm-uuid': 'fa77228b-8be7-4bab-9a40-7241201bdbff'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.963 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:28 compute-0 NetworkManager[55454]: <info>  [1769101708.9647] manager: (tap611e2a01-0a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.966 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.972 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:28 compute-0 nova_compute[183075]: 2026-01-22 17:08:28.973 183079 INFO os_vif [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:67:5c:7a,bridge_name='br-int',has_traffic_filtering=True,id=611e2a01-0a7c-4b7f-a941-623f993a5547,network=Network(9fe870b5-173a-4c8a-b406-6fedb3ddcc4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap611e2a01-0a')
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.029 183079 DEBUG nova.virt.libvirt.driver [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.029 183079 DEBUG nova.virt.libvirt.driver [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] No VIF found with MAC fa:16:3e:67:5c:7a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:08:29 compute-0 kernel: tap611e2a01-0a: entered promiscuous mode
Jan 22 17:08:29 compute-0 NetworkManager[55454]: <info>  [1769101709.1140] manager: (tap611e2a01-0a): new Tun device (/org/freedesktop/NetworkManager/Devices/50)
Jan 22 17:08:29 compute-0 ovn_controller[95372]: 2026-01-22T17:08:29Z|00081|binding|INFO|Claiming lport 611e2a01-0a7c-4b7f-a941-623f993a5547 for this chassis.
Jan 22 17:08:29 compute-0 ovn_controller[95372]: 2026-01-22T17:08:29Z|00082|binding|INFO|611e2a01-0a7c-4b7f-a941-623f993a5547: Claiming fa:16:3e:67:5c:7a 10.100.0.9
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.115 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:29.124 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:67:5c:7a 10.100.0.9'], port_security=['fa:16:3e:67:5c:7a 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'fa77228b-8be7-4bab-9a40-7241201bdbff', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6884ab5c00114ca19f253d0c91e2706f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ced556d8-3a2b-4ec1-a804-8cbb50ada768', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4f9feb03-8564-422d-a49d-142dd411b92f, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=611e2a01-0a7c-4b7f-a941-623f993a5547) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:08:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:29.125 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 611e2a01-0a7c-4b7f-a941-623f993a5547 in datapath 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e bound to our chassis
Jan 22 17:08:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:29.128 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e
Jan 22 17:08:29 compute-0 ovn_controller[95372]: 2026-01-22T17:08:29Z|00083|binding|INFO|Setting lport 611e2a01-0a7c-4b7f-a941-623f993a5547 ovn-installed in OVS
Jan 22 17:08:29 compute-0 ovn_controller[95372]: 2026-01-22T17:08:29Z|00084|binding|INFO|Setting lport 611e2a01-0a7c-4b7f-a941-623f993a5547 up in Southbound
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.132 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.135 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:29.146 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3dcc9aad-306e-4e61-92b7-313d14f1500a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:29 compute-0 systemd-machined[154382]: New machine qemu-7-instance-00000007.
Jan 22 17:08:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:29.185 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[dd7f802c-32bd-40cd-ba80-47b9e3b30bfc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:29 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-00000007.
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.188 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:29.188 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[62d4e2e1-6eda-4e6c-97f0-24849621f616]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:29 compute-0 systemd-udevd[216414]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:08:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:29.221 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[ca098f80-3fae-40ca-8a3d-8dc08daca562]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:29 compute-0 NetworkManager[55454]: <info>  [1769101709.2310] device (tap611e2a01-0a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:08:29 compute-0 NetworkManager[55454]: <info>  [1769101709.2319] device (tap611e2a01-0a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:08:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:29.247 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3ebe8aed-5315-4cf0-a1cf-a975197fa393]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9fe870b5-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:dc:ce'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 55, 'rx_bytes': 8920, 'tx_bytes': 6196, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 55, 'rx_bytes': 8920, 'tx_bytes': 6196, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 404056, 'reachable_time': 30789, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216418, 'error': None, 'target': 'ovnmeta-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:29.267 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[04bbd87d-1f0d-43f6-8e1d-c6539efab59d]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap9fe870b5-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 404069, 'tstamp': 404069}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216423, 'error': None, 'target': 'ovnmeta-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap9fe870b5-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 404072, 'tstamp': 404072}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216423, 'error': None, 'target': 'ovnmeta-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:29.268 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9fe870b5-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.270 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.271 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:29.271 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9fe870b5-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:29.272 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:08:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:29.272 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9fe870b5-10, col_values=(('external_ids', {'iface-id': '124a4c16-1255-4f0e-8e20-7c85f64c00e5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:29.272 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.331 183079 DEBUG nova.objects.instance [None req-39db135c-c216-4e93-9ab8-d612592604b2 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lazy-loading 'pci_requests' on Instance uuid 000b64b8-bcc5-4bbe-9703-8400a83a27d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.346 183079 DEBUG nova.network.neutron [None req-39db135c-c216-4e93-9ab8-d612592604b2 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.685 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101709.6845062, fa77228b-8be7-4bab-9a40-7241201bdbff => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.685 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] VM Started (Lifecycle Event)
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.704 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.709 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101709.684763, fa77228b-8be7-4bab-9a40-7241201bdbff => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.710 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] VM Paused (Lifecycle Event)
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.730 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.734 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.755 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.766 183079 DEBUG nova.compute.manager [req-f419ecd2-2f77-47d6-b05a-11a9b3f8601c req-0ebd98b6-d97e-46b2-81ea-01b41e485890 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Received event network-vif-plugged-611e2a01-0a7c-4b7f-a941-623f993a5547 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.766 183079 DEBUG oslo_concurrency.lockutils [req-f419ecd2-2f77-47d6-b05a-11a9b3f8601c req-0ebd98b6-d97e-46b2-81ea-01b41e485890 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "fa77228b-8be7-4bab-9a40-7241201bdbff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.766 183079 DEBUG oslo_concurrency.lockutils [req-f419ecd2-2f77-47d6-b05a-11a9b3f8601c req-0ebd98b6-d97e-46b2-81ea-01b41e485890 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "fa77228b-8be7-4bab-9a40-7241201bdbff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.767 183079 DEBUG oslo_concurrency.lockutils [req-f419ecd2-2f77-47d6-b05a-11a9b3f8601c req-0ebd98b6-d97e-46b2-81ea-01b41e485890 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "fa77228b-8be7-4bab-9a40-7241201bdbff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.767 183079 DEBUG nova.compute.manager [req-f419ecd2-2f77-47d6-b05a-11a9b3f8601c req-0ebd98b6-d97e-46b2-81ea-01b41e485890 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Processing event network-vif-plugged-611e2a01-0a7c-4b7f-a941-623f993a5547 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.770 183079 DEBUG nova.compute.manager [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.776 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101709.775526, fa77228b-8be7-4bab-9a40-7241201bdbff => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.776 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] VM Resumed (Lifecycle Event)
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.779 183079 DEBUG nova.virt.libvirt.driver [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.784 183079 INFO nova.virt.libvirt.driver [-] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Instance spawned successfully.
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.785 183079 DEBUG nova.virt.libvirt.driver [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.793 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.797 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.812 183079 DEBUG nova.policy [None req-39db135c-c216-4e93-9ab8-d612592604b2 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd7ee6c51c2b8447baefccea20fa16de5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.820 183079 DEBUG nova.virt.libvirt.driver [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.821 183079 DEBUG nova.virt.libvirt.driver [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.821 183079 DEBUG nova.virt.libvirt.driver [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.822 183079 DEBUG nova.virt.libvirt.driver [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.823 183079 DEBUG nova.virt.libvirt.driver [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.824 183079 DEBUG nova.virt.libvirt.driver [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.832 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.859 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.905 183079 INFO nova.compute.manager [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Took 3.94 seconds to spawn the instance on the hypervisor.
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.906 183079 DEBUG nova.compute.manager [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:08:29 compute-0 nova_compute[183075]: 2026-01-22 17:08:29.980 183079 INFO nova.compute.manager [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Took 4.54 seconds to build instance.
Jan 22 17:08:30 compute-0 nova_compute[183075]: 2026-01-22 17:08:30.003 183079 DEBUG oslo_concurrency.lockutils [None req-f7ba9dc5-9665-4325-8c1c-bce23a5e8aad 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "fa77228b-8be7-4bab-9a40-7241201bdbff" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.625s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:30 compute-0 nova_compute[183075]: 2026-01-22 17:08:30.456 183079 DEBUG nova.network.neutron [req-016ecd24-f4b1-46a3-aad2-0c91651642e6 req-3473966e-3e87-43ff-a00d-7f22274f740e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Updated VIF entry in instance network info cache for port 611e2a01-0a7c-4b7f-a941-623f993a5547. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:08:30 compute-0 nova_compute[183075]: 2026-01-22 17:08:30.456 183079 DEBUG nova.network.neutron [req-016ecd24-f4b1-46a3-aad2-0c91651642e6 req-3473966e-3e87-43ff-a00d-7f22274f740e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Updating instance_info_cache with network_info: [{"id": "611e2a01-0a7c-4b7f-a941-623f993a5547", "address": "fa:16:3e:67:5c:7a", "network": {"id": "9fe870b5-173a-4c8a-b406-6fedb3ddcc4e", "bridge": "br-int", "label": "tempest-test-network--1353019071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "572446eea54c4f0baab88bf4419b4082", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap611e2a01-0a", "ovs_interfaceid": "611e2a01-0a7c-4b7f-a941-623f993a5547", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:08:30 compute-0 nova_compute[183075]: 2026-01-22 17:08:30.474 183079 DEBUG oslo_concurrency.lockutils [req-016ecd24-f4b1-46a3-aad2-0c91651642e6 req-3473966e-3e87-43ff-a00d-7f22274f740e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-fa77228b-8be7-4bab-9a40-7241201bdbff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:08:30 compute-0 nova_compute[183075]: 2026-01-22 17:08:30.685 183079 DEBUG nova.network.neutron [None req-39db135c-c216-4e93-9ab8-d612592604b2 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Successfully updated port: deda3dcd-de47-47e2-8bb1-526d3882d38e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:08:30 compute-0 nova_compute[183075]: 2026-01-22 17:08:30.710 183079 DEBUG oslo_concurrency.lockutils [None req-39db135c-c216-4e93-9ab8-d612592604b2 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Acquiring lock "refresh_cache-000b64b8-bcc5-4bbe-9703-8400a83a27d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:08:30 compute-0 nova_compute[183075]: 2026-01-22 17:08:30.711 183079 DEBUG oslo_concurrency.lockutils [None req-39db135c-c216-4e93-9ab8-d612592604b2 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Acquired lock "refresh_cache-000b64b8-bcc5-4bbe-9703-8400a83a27d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:08:30 compute-0 nova_compute[183075]: 2026-01-22 17:08:30.711 183079 DEBUG nova.network.neutron [None req-39db135c-c216-4e93-9ab8-d612592604b2 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:08:31 compute-0 nova_compute[183075]: 2026-01-22 17:08:31.646 183079 WARNING nova.network.neutron [None req-39db135c-c216-4e93-9ab8-d612592604b2 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] 473b4e99-4018-4fa7-ab1c-2d3e7944d850 already exists in list: networks containing: ['473b4e99-4018-4fa7-ab1c-2d3e7944d850']. ignoring it
Jan 22 17:08:32 compute-0 nova_compute[183075]: 2026-01-22 17:08:32.149 183079 DEBUG nova.compute.manager [req-e94b748f-af46-4284-a10d-3d9fc3b89e4d req-c32cf383-7a4d-43d6-bd28-a32b65a10bbe a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Received event network-changed-deda3dcd-de47-47e2-8bb1-526d3882d38e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:08:32 compute-0 nova_compute[183075]: 2026-01-22 17:08:32.149 183079 DEBUG nova.compute.manager [req-e94b748f-af46-4284-a10d-3d9fc3b89e4d req-c32cf383-7a4d-43d6-bd28-a32b65a10bbe a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Refreshing instance network info cache due to event network-changed-deda3dcd-de47-47e2-8bb1-526d3882d38e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:08:32 compute-0 nova_compute[183075]: 2026-01-22 17:08:32.150 183079 DEBUG oslo_concurrency.lockutils [req-e94b748f-af46-4284-a10d-3d9fc3b89e4d req-c32cf383-7a4d-43d6-bd28-a32b65a10bbe a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-000b64b8-bcc5-4bbe-9703-8400a83a27d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:08:32 compute-0 nova_compute[183075]: 2026-01-22 17:08:32.328 183079 DEBUG nova.compute.manager [req-ef304090-c26d-4eb4-921e-67f8549daec7 req-95afbb49-94fd-44c6-88fd-2712e38eabf6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Received event network-vif-plugged-611e2a01-0a7c-4b7f-a941-623f993a5547 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:08:32 compute-0 nova_compute[183075]: 2026-01-22 17:08:32.329 183079 DEBUG oslo_concurrency.lockutils [req-ef304090-c26d-4eb4-921e-67f8549daec7 req-95afbb49-94fd-44c6-88fd-2712e38eabf6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "fa77228b-8be7-4bab-9a40-7241201bdbff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:32 compute-0 nova_compute[183075]: 2026-01-22 17:08:32.329 183079 DEBUG oslo_concurrency.lockutils [req-ef304090-c26d-4eb4-921e-67f8549daec7 req-95afbb49-94fd-44c6-88fd-2712e38eabf6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "fa77228b-8be7-4bab-9a40-7241201bdbff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:32 compute-0 nova_compute[183075]: 2026-01-22 17:08:32.329 183079 DEBUG oslo_concurrency.lockutils [req-ef304090-c26d-4eb4-921e-67f8549daec7 req-95afbb49-94fd-44c6-88fd-2712e38eabf6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "fa77228b-8be7-4bab-9a40-7241201bdbff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:32 compute-0 nova_compute[183075]: 2026-01-22 17:08:32.329 183079 DEBUG nova.compute.manager [req-ef304090-c26d-4eb4-921e-67f8549daec7 req-95afbb49-94fd-44c6-88fd-2712e38eabf6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] No waiting events found dispatching network-vif-plugged-611e2a01-0a7c-4b7f-a941-623f993a5547 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:08:32 compute-0 nova_compute[183075]: 2026-01-22 17:08:32.330 183079 WARNING nova.compute.manager [req-ef304090-c26d-4eb4-921e-67f8549daec7 req-95afbb49-94fd-44c6-88fd-2712e38eabf6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Received unexpected event network-vif-plugged-611e2a01-0a7c-4b7f-a941-623f993a5547 for instance with vm_state active and task_state None.
Jan 22 17:08:32 compute-0 nova_compute[183075]: 2026-01-22 17:08:32.496 183079 INFO nova.compute.manager [None req-791c3915-badc-46ad-a6f8-64a49843a773 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Get console output
Jan 22 17:08:32 compute-0 nova_compute[183075]: 2026-01-22 17:08:32.501 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:08:33 compute-0 nova_compute[183075]: 2026-01-22 17:08:33.965 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:34 compute-0 podman[216432]: 2026-01-22 17:08:34.359537067 +0000 UTC m=+0.059640080 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 17:08:34 compute-0 nova_compute[183075]: 2026-01-22 17:08:34.862 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:35.885 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:08:35 compute-0 nova_compute[183075]: 2026-01-22 17:08:35.886 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:35.887 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:08:35 compute-0 nova_compute[183075]: 2026-01-22 17:08:35.934 183079 DEBUG nova.network.neutron [None req-39db135c-c216-4e93-9ab8-d612592604b2 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Updating instance_info_cache with network_info: [{"id": "c694dca0-bf6e-4e89-a43e-1d10b3b23075", "address": "fa:16:3e:0a:25:d6", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc694dca0-bf", "ovs_interfaceid": "c694dca0-bf6e-4e89-a43e-1d10b3b23075", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "address": "fa:16:3e:a0:1c:cd", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdeda3dcd-de", "ovs_interfaceid": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:08:35 compute-0 nova_compute[183075]: 2026-01-22 17:08:35.955 183079 DEBUG oslo_concurrency.lockutils [None req-39db135c-c216-4e93-9ab8-d612592604b2 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Releasing lock "refresh_cache-000b64b8-bcc5-4bbe-9703-8400a83a27d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:08:35 compute-0 nova_compute[183075]: 2026-01-22 17:08:35.956 183079 DEBUG oslo_concurrency.lockutils [req-e94b748f-af46-4284-a10d-3d9fc3b89e4d req-c32cf383-7a4d-43d6-bd28-a32b65a10bbe a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-000b64b8-bcc5-4bbe-9703-8400a83a27d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:08:35 compute-0 nova_compute[183075]: 2026-01-22 17:08:35.956 183079 DEBUG nova.network.neutron [req-e94b748f-af46-4284-a10d-3d9fc3b89e4d req-c32cf383-7a4d-43d6-bd28-a32b65a10bbe a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Refreshing network info cache for port deda3dcd-de47-47e2-8bb1-526d3882d38e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:08:35 compute-0 nova_compute[183075]: 2026-01-22 17:08:35.958 183079 DEBUG nova.virt.libvirt.vif [None req-39db135c-c216-4e93-9ab8-d612592604b2 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:07:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1172585183',display_name='tempest-server-test-1172585183',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1172585183',id=5,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCYZfzAO5Ui0C6aNyi2ae5nossybaMPhlAmp9Zj0dL/LunopVV9meBl8qfqrR0u5rqBGUC3w2LWkiMuhmtUWjdFFHsn2/jcRm/feYqTpe58YE0uSBcC8gJ0hMPoRirOIAg==',key_name='tempest-keypair-test-1166356111',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:07:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='eb8e9f7a891a4a38af8b01557eddc991',ramdisk_id='',reservation_id='r-68us24bx',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPPortDetailsTest-1812576526',owner_user_name='tempest-FloatingIPPortDetailsTest-1812576526-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:07:45Z,user_data=None,user_id='d7ee6c51c2b8447baefccea20fa16de5',uuid=000b64b8-bcc5-4bbe-9703-8400a83a27d0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "address": "fa:16:3e:a0:1c:cd", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdeda3dcd-de", "ovs_interfaceid": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:08:35 compute-0 nova_compute[183075]: 2026-01-22 17:08:35.959 183079 DEBUG nova.network.os_vif_util [None req-39db135c-c216-4e93-9ab8-d612592604b2 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Converting VIF {"id": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "address": "fa:16:3e:a0:1c:cd", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdeda3dcd-de", "ovs_interfaceid": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:08:35 compute-0 nova_compute[183075]: 2026-01-22 17:08:35.960 183079 DEBUG nova.network.os_vif_util [None req-39db135c-c216-4e93-9ab8-d612592604b2 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:1c:cd,bridge_name='br-int',has_traffic_filtering=True,id=deda3dcd-de47-47e2-8bb1-526d3882d38e,network=Network(473b4e99-4018-4fa7-ab1c-2d3e7944d850),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdeda3dcd-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:08:35 compute-0 nova_compute[183075]: 2026-01-22 17:08:35.960 183079 DEBUG os_vif [None req-39db135c-c216-4e93-9ab8-d612592604b2 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:1c:cd,bridge_name='br-int',has_traffic_filtering=True,id=deda3dcd-de47-47e2-8bb1-526d3882d38e,network=Network(473b4e99-4018-4fa7-ab1c-2d3e7944d850),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdeda3dcd-de') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:08:35 compute-0 nova_compute[183075]: 2026-01-22 17:08:35.961 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:35 compute-0 nova_compute[183075]: 2026-01-22 17:08:35.961 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:35 compute-0 nova_compute[183075]: 2026-01-22 17:08:35.961 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:08:35 compute-0 nova_compute[183075]: 2026-01-22 17:08:35.963 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:35 compute-0 nova_compute[183075]: 2026-01-22 17:08:35.964 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdeda3dcd-de, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:35 compute-0 nova_compute[183075]: 2026-01-22 17:08:35.964 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdeda3dcd-de, col_values=(('external_ids', {'iface-id': 'deda3dcd-de47-47e2-8bb1-526d3882d38e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a0:1c:cd', 'vm-uuid': '000b64b8-bcc5-4bbe-9703-8400a83a27d0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:35 compute-0 nova_compute[183075]: 2026-01-22 17:08:35.966 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:35 compute-0 NetworkManager[55454]: <info>  [1769101715.9677] manager: (tapdeda3dcd-de): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Jan 22 17:08:35 compute-0 nova_compute[183075]: 2026-01-22 17:08:35.969 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:08:35 compute-0 nova_compute[183075]: 2026-01-22 17:08:35.978 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:35 compute-0 nova_compute[183075]: 2026-01-22 17:08:35.980 183079 INFO os_vif [None req-39db135c-c216-4e93-9ab8-d612592604b2 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:1c:cd,bridge_name='br-int',has_traffic_filtering=True,id=deda3dcd-de47-47e2-8bb1-526d3882d38e,network=Network(473b4e99-4018-4fa7-ab1c-2d3e7944d850),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdeda3dcd-de')
Jan 22 17:08:35 compute-0 nova_compute[183075]: 2026-01-22 17:08:35.981 183079 DEBUG nova.virt.libvirt.vif [None req-39db135c-c216-4e93-9ab8-d612592604b2 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:07:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1172585183',display_name='tempest-server-test-1172585183',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1172585183',id=5,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCYZfzAO5Ui0C6aNyi2ae5nossybaMPhlAmp9Zj0dL/LunopVV9meBl8qfqrR0u5rqBGUC3w2LWkiMuhmtUWjdFFHsn2/jcRm/feYqTpe58YE0uSBcC8gJ0hMPoRirOIAg==',key_name='tempest-keypair-test-1166356111',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:07:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='eb8e9f7a891a4a38af8b01557eddc991',ramdisk_id='',reservation_id='r-68us24bx',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPPortDetailsTest-1812576526',owner_user_name='tempest-FloatingIPPortDetailsTest-1812576526-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:07:45Z,user_data=None,user_id='d7ee6c51c2b8447baefccea20fa16de5',uuid=000b64b8-bcc5-4bbe-9703-8400a83a27d0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "address": "fa:16:3e:a0:1c:cd", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdeda3dcd-de", "ovs_interfaceid": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:08:35 compute-0 nova_compute[183075]: 2026-01-22 17:08:35.981 183079 DEBUG nova.network.os_vif_util [None req-39db135c-c216-4e93-9ab8-d612592604b2 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Converting VIF {"id": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "address": "fa:16:3e:a0:1c:cd", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdeda3dcd-de", "ovs_interfaceid": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:08:35 compute-0 nova_compute[183075]: 2026-01-22 17:08:35.982 183079 DEBUG nova.network.os_vif_util [None req-39db135c-c216-4e93-9ab8-d612592604b2 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:1c:cd,bridge_name='br-int',has_traffic_filtering=True,id=deda3dcd-de47-47e2-8bb1-526d3882d38e,network=Network(473b4e99-4018-4fa7-ab1c-2d3e7944d850),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdeda3dcd-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:08:35 compute-0 nova_compute[183075]: 2026-01-22 17:08:35.985 183079 DEBUG nova.virt.libvirt.guest [None req-39db135c-c216-4e93-9ab8-d612592604b2 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] attach device xml: <interface type="ethernet">
Jan 22 17:08:35 compute-0 nova_compute[183075]:   <mac address="fa:16:3e:a0:1c:cd"/>
Jan 22 17:08:35 compute-0 nova_compute[183075]:   <model type="virtio"/>
Jan 22 17:08:35 compute-0 nova_compute[183075]:   <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:08:35 compute-0 nova_compute[183075]:   <mtu size="1442"/>
Jan 22 17:08:35 compute-0 nova_compute[183075]:   <target dev="tapdeda3dcd-de"/>
Jan 22 17:08:35 compute-0 nova_compute[183075]: </interface>
Jan 22 17:08:35 compute-0 nova_compute[183075]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Jan 22 17:08:36 compute-0 kernel: tapdeda3dcd-de: entered promiscuous mode
Jan 22 17:08:36 compute-0 NetworkManager[55454]: <info>  [1769101716.0029] manager: (tapdeda3dcd-de): new Tun device (/org/freedesktop/NetworkManager/Devices/52)
Jan 22 17:08:36 compute-0 nova_compute[183075]: 2026-01-22 17:08:36.002 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:36 compute-0 ovn_controller[95372]: 2026-01-22T17:08:36Z|00085|binding|INFO|Claiming lport deda3dcd-de47-47e2-8bb1-526d3882d38e for this chassis.
Jan 22 17:08:36 compute-0 ovn_controller[95372]: 2026-01-22T17:08:36Z|00086|binding|INFO|deda3dcd-de47-47e2-8bb1-526d3882d38e: Claiming fa:16:3e:a0:1c:cd 10.100.0.10
Jan 22 17:08:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:36.026 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:1c:cd 10.100.0.10'], port_security=['fa:16:3e:a0:1c:cd 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '000b64b8-bcc5-4bbe-9703-8400a83a27d0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-473b4e99-4018-4fa7-ab1c-2d3e7944d850', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'neutron:revision_number': '7', 'neutron:security_group_ids': '3fb419e8-dd25-4fec-8107-6e7d89977d34', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.234'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a8cd0033-3766-4772-8688-44d3d0e3250a, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=deda3dcd-de47-47e2-8bb1-526d3882d38e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:08:36 compute-0 ovn_controller[95372]: 2026-01-22T17:08:36Z|00087|binding|INFO|Setting lport deda3dcd-de47-47e2-8bb1-526d3882d38e ovn-installed in OVS
Jan 22 17:08:36 compute-0 ovn_controller[95372]: 2026-01-22T17:08:36Z|00088|binding|INFO|Setting lport deda3dcd-de47-47e2-8bb1-526d3882d38e up in Southbound
Jan 22 17:08:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:36.028 104629 INFO neutron.agent.ovn.metadata.agent [-] Port deda3dcd-de47-47e2-8bb1-526d3882d38e in datapath 473b4e99-4018-4fa7-ab1c-2d3e7944d850 bound to our chassis
Jan 22 17:08:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:36.030 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 473b4e99-4018-4fa7-ab1c-2d3e7944d850
Jan 22 17:08:36 compute-0 nova_compute[183075]: 2026-01-22 17:08:36.032 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:36 compute-0 nova_compute[183075]: 2026-01-22 17:08:36.035 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:36 compute-0 systemd-udevd[216464]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:08:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:36.050 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[56a42538-b9bc-477b-8651-f875b0ffaaf8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:36 compute-0 NetworkManager[55454]: <info>  [1769101716.0561] device (tapdeda3dcd-de): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:08:36 compute-0 NetworkManager[55454]: <info>  [1769101716.0566] device (tapdeda3dcd-de): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:08:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:36.080 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[b65efa8d-2163-4e4f-9e74-2469b9c2b323]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:36.082 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[102a7b3e-1bf3-4106-80c9-5b5416cb62cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:36 compute-0 nova_compute[183075]: 2026-01-22 17:08:36.095 183079 DEBUG nova.virt.libvirt.driver [None req-39db135c-c216-4e93-9ab8-d612592604b2 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:08:36 compute-0 nova_compute[183075]: 2026-01-22 17:08:36.096 183079 DEBUG nova.virt.libvirt.driver [None req-39db135c-c216-4e93-9ab8-d612592604b2 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] No VIF found with MAC fa:16:3e:0a:25:d6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:08:36 compute-0 nova_compute[183075]: 2026-01-22 17:08:36.096 183079 DEBUG nova.virt.libvirt.driver [None req-39db135c-c216-4e93-9ab8-d612592604b2 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] No VIF found with MAC fa:16:3e:a0:1c:cd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:08:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:36.109 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[7d56f856-f158-4de4-8b54-966e04d5c195]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:36.128 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[9cbdcacd-7ef8-4c51-9ce2-2f0762b8d5a4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap473b4e99-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4d:87:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 203, 'tx_packets': 109, 'rx_bytes': 17350, 'tx_bytes': 12160, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 203, 'tx_packets': 109, 'rx_bytes': 17350, 'tx_bytes': 12160, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399443, 'reachable_time': 41758, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216471, 'error': None, 'target': 'ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:36 compute-0 nova_compute[183075]: 2026-01-22 17:08:36.130 183079 DEBUG nova.virt.libvirt.guest [None req-39db135c-c216-4e93-9ab8-d612592604b2 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:08:36 compute-0 nova_compute[183075]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:08:36 compute-0 nova_compute[183075]:   <nova:name>tempest-server-test-1172585183</nova:name>
Jan 22 17:08:36 compute-0 nova_compute[183075]:   <nova:creationTime>2026-01-22 17:08:36</nova:creationTime>
Jan 22 17:08:36 compute-0 nova_compute[183075]:   <nova:flavor name="m1.nano">
Jan 22 17:08:36 compute-0 nova_compute[183075]:     <nova:memory>128</nova:memory>
Jan 22 17:08:36 compute-0 nova_compute[183075]:     <nova:disk>1</nova:disk>
Jan 22 17:08:36 compute-0 nova_compute[183075]:     <nova:swap>0</nova:swap>
Jan 22 17:08:36 compute-0 nova_compute[183075]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:08:36 compute-0 nova_compute[183075]:     <nova:vcpus>1</nova:vcpus>
Jan 22 17:08:36 compute-0 nova_compute[183075]:   </nova:flavor>
Jan 22 17:08:36 compute-0 nova_compute[183075]:   <nova:owner>
Jan 22 17:08:36 compute-0 nova_compute[183075]:     <nova:user uuid="d7ee6c51c2b8447baefccea20fa16de5">tempest-FloatingIPPortDetailsTest-1812576526-project-member</nova:user>
Jan 22 17:08:36 compute-0 nova_compute[183075]:     <nova:project uuid="eb8e9f7a891a4a38af8b01557eddc991">tempest-FloatingIPPortDetailsTest-1812576526</nova:project>
Jan 22 17:08:36 compute-0 nova_compute[183075]:   </nova:owner>
Jan 22 17:08:36 compute-0 nova_compute[183075]:   <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:08:36 compute-0 nova_compute[183075]:   <nova:ports>
Jan 22 17:08:36 compute-0 nova_compute[183075]:     <nova:port uuid="c694dca0-bf6e-4e89-a43e-1d10b3b23075">
Jan 22 17:08:36 compute-0 nova_compute[183075]:       <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 22 17:08:36 compute-0 nova_compute[183075]:     </nova:port>
Jan 22 17:08:36 compute-0 nova_compute[183075]:     <nova:port uuid="deda3dcd-de47-47e2-8bb1-526d3882d38e">
Jan 22 17:08:36 compute-0 nova_compute[183075]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 17:08:36 compute-0 nova_compute[183075]:     </nova:port>
Jan 22 17:08:36 compute-0 nova_compute[183075]:   </nova:ports>
Jan 22 17:08:36 compute-0 nova_compute[183075]: </nova:instance>
Jan 22 17:08:36 compute-0 nova_compute[183075]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 22 17:08:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:36.151 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[eac4e8d3-ca65-4fde-a6cf-b596e3007327]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap473b4e99-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 399457, 'tstamp': 399457}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216472, 'error': None, 'target': 'ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap473b4e99-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 399460, 'tstamp': 399460}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216472, 'error': None, 'target': 'ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:36.152 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap473b4e99-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:36 compute-0 nova_compute[183075]: 2026-01-22 17:08:36.156 183079 DEBUG oslo_concurrency.lockutils [None req-39db135c-c216-4e93-9ab8-d612592604b2 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "interface-000b64b8-bcc5-4bbe-9703-8400a83a27d0-deda3dcd-de47-47e2-8bb1-526d3882d38e" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 7.645s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:36 compute-0 nova_compute[183075]: 2026-01-22 17:08:36.158 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:36 compute-0 nova_compute[183075]: 2026-01-22 17:08:36.160 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:36.160 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap473b4e99-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:36.160 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:08:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:36.161 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap473b4e99-40, col_values=(('external_ids', {'iface-id': '424ac40e-403e-4504-adbb-47a319b401fd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:36.161 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:08:36 compute-0 nova_compute[183075]: 2026-01-22 17:08:36.265 183079 DEBUG nova.compute.manager [req-a6529051-4046-4582-a499-dce97968c4d3 req-8994ba46-319e-4c71-a1e4-7497f66cc23e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Received event network-vif-plugged-deda3dcd-de47-47e2-8bb1-526d3882d38e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:08:36 compute-0 nova_compute[183075]: 2026-01-22 17:08:36.266 183079 DEBUG oslo_concurrency.lockutils [req-a6529051-4046-4582-a499-dce97968c4d3 req-8994ba46-319e-4c71-a1e4-7497f66cc23e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "000b64b8-bcc5-4bbe-9703-8400a83a27d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:36 compute-0 nova_compute[183075]: 2026-01-22 17:08:36.266 183079 DEBUG oslo_concurrency.lockutils [req-a6529051-4046-4582-a499-dce97968c4d3 req-8994ba46-319e-4c71-a1e4-7497f66cc23e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "000b64b8-bcc5-4bbe-9703-8400a83a27d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:36 compute-0 nova_compute[183075]: 2026-01-22 17:08:36.266 183079 DEBUG oslo_concurrency.lockutils [req-a6529051-4046-4582-a499-dce97968c4d3 req-8994ba46-319e-4c71-a1e4-7497f66cc23e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "000b64b8-bcc5-4bbe-9703-8400a83a27d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:36 compute-0 nova_compute[183075]: 2026-01-22 17:08:36.266 183079 DEBUG nova.compute.manager [req-a6529051-4046-4582-a499-dce97968c4d3 req-8994ba46-319e-4c71-a1e4-7497f66cc23e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] No waiting events found dispatching network-vif-plugged-deda3dcd-de47-47e2-8bb1-526d3882d38e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:08:36 compute-0 nova_compute[183075]: 2026-01-22 17:08:36.267 183079 WARNING nova.compute.manager [req-a6529051-4046-4582-a499-dce97968c4d3 req-8994ba46-319e-4c71-a1e4-7497f66cc23e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Received unexpected event network-vif-plugged-deda3dcd-de47-47e2-8bb1-526d3882d38e for instance with vm_state active and task_state None.
Jan 22 17:08:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:36.889 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:37 compute-0 nova_compute[183075]: 2026-01-22 17:08:37.283 183079 DEBUG oslo_concurrency.lockutils [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "ee33030a-2035-4fd1-8de4-261142b89bc6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:37 compute-0 nova_compute[183075]: 2026-01-22 17:08:37.283 183079 DEBUG oslo_concurrency.lockutils [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "ee33030a-2035-4fd1-8de4-261142b89bc6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:37 compute-0 nova_compute[183075]: 2026-01-22 17:08:37.306 183079 DEBUG nova.compute.manager [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:08:37 compute-0 nova_compute[183075]: 2026-01-22 17:08:37.408 183079 DEBUG oslo_concurrency.lockutils [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:37 compute-0 nova_compute[183075]: 2026-01-22 17:08:37.409 183079 DEBUG oslo_concurrency.lockutils [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:37 compute-0 nova_compute[183075]: 2026-01-22 17:08:37.417 183079 DEBUG nova.virt.hardware [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:08:37 compute-0 nova_compute[183075]: 2026-01-22 17:08:37.417 183079 INFO nova.compute.claims [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:08:37 compute-0 nova_compute[183075]: 2026-01-22 17:08:37.587 183079 DEBUG nova.compute.provider_tree [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:08:37 compute-0 nova_compute[183075]: 2026-01-22 17:08:37.606 183079 DEBUG nova.scheduler.client.report [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:08:37 compute-0 nova_compute[183075]: 2026-01-22 17:08:37.611 183079 DEBUG nova.network.neutron [req-e94b748f-af46-4284-a10d-3d9fc3b89e4d req-c32cf383-7a4d-43d6-bd28-a32b65a10bbe a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Updated VIF entry in instance network info cache for port deda3dcd-de47-47e2-8bb1-526d3882d38e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:08:37 compute-0 nova_compute[183075]: 2026-01-22 17:08:37.612 183079 DEBUG nova.network.neutron [req-e94b748f-af46-4284-a10d-3d9fc3b89e4d req-c32cf383-7a4d-43d6-bd28-a32b65a10bbe a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Updating instance_info_cache with network_info: [{"id": "c694dca0-bf6e-4e89-a43e-1d10b3b23075", "address": "fa:16:3e:0a:25:d6", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc694dca0-bf", "ovs_interfaceid": "c694dca0-bf6e-4e89-a43e-1d10b3b23075", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "address": "fa:16:3e:a0:1c:cd", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdeda3dcd-de", "ovs_interfaceid": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:08:37 compute-0 nova_compute[183075]: 2026-01-22 17:08:37.630 183079 DEBUG oslo_concurrency.lockutils [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:37 compute-0 nova_compute[183075]: 2026-01-22 17:08:37.631 183079 DEBUG nova.compute.manager [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:08:37 compute-0 nova_compute[183075]: 2026-01-22 17:08:37.634 183079 DEBUG oslo_concurrency.lockutils [req-e94b748f-af46-4284-a10d-3d9fc3b89e4d req-c32cf383-7a4d-43d6-bd28-a32b65a10bbe a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-000b64b8-bcc5-4bbe-9703-8400a83a27d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:08:37 compute-0 nova_compute[183075]: 2026-01-22 17:08:37.666 183079 DEBUG nova.compute.manager [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:08:37 compute-0 nova_compute[183075]: 2026-01-22 17:08:37.667 183079 DEBUG nova.network.neutron [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:08:37 compute-0 nova_compute[183075]: 2026-01-22 17:08:37.687 183079 INFO nova.virt.libvirt.driver [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:08:37 compute-0 nova_compute[183075]: 2026-01-22 17:08:37.695 183079 INFO nova.compute.manager [None req-bbff7bc1-5069-4738-8c20-b425d364c069 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Get console output
Jan 22 17:08:37 compute-0 nova_compute[183075]: 2026-01-22 17:08:37.706 183079 DEBUG nova.compute.manager [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:08:37 compute-0 nova_compute[183075]: 2026-01-22 17:08:37.831 183079 DEBUG oslo_concurrency.lockutils [None req-ab74e55c-f22a-43b9-95a2-a48745fa2bd8 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Acquiring lock "interface-000b64b8-bcc5-4bbe-9703-8400a83a27d0-deda3dcd-de47-47e2-8bb1-526d3882d38e" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:37 compute-0 nova_compute[183075]: 2026-01-22 17:08:37.832 183079 DEBUG oslo_concurrency.lockutils [None req-ab74e55c-f22a-43b9-95a2-a48745fa2bd8 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "interface-000b64b8-bcc5-4bbe-9703-8400a83a27d0-deda3dcd-de47-47e2-8bb1-526d3882d38e" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:37 compute-0 nova_compute[183075]: 2026-01-22 17:08:37.841 183079 DEBUG nova.compute.manager [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:08:37 compute-0 nova_compute[183075]: 2026-01-22 17:08:37.845 183079 DEBUG nova.virt.libvirt.driver [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:08:37 compute-0 nova_compute[183075]: 2026-01-22 17:08:37.846 183079 INFO nova.virt.libvirt.driver [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Creating image(s)
Jan 22 17:08:37 compute-0 nova_compute[183075]: 2026-01-22 17:08:37.847 183079 DEBUG oslo_concurrency.lockutils [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "/var/lib/nova/instances/ee33030a-2035-4fd1-8de4-261142b89bc6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:37 compute-0 nova_compute[183075]: 2026-01-22 17:08:37.847 183079 DEBUG oslo_concurrency.lockutils [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "/var/lib/nova/instances/ee33030a-2035-4fd1-8de4-261142b89bc6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:37 compute-0 nova_compute[183075]: 2026-01-22 17:08:37.849 183079 DEBUG oslo_concurrency.lockutils [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "/var/lib/nova/instances/ee33030a-2035-4fd1-8de4-261142b89bc6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:37 compute-0 nova_compute[183075]: 2026-01-22 17:08:37.877 183079 DEBUG oslo_concurrency.processutils [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:08:37 compute-0 nova_compute[183075]: 2026-01-22 17:08:37.914 183079 DEBUG nova.objects.instance [None req-ab74e55c-f22a-43b9-95a2-a48745fa2bd8 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lazy-loading 'flavor' on Instance uuid 000b64b8-bcc5-4bbe-9703-8400a83a27d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:08:37 compute-0 nova_compute[183075]: 2026-01-22 17:08:37.966 183079 DEBUG oslo_concurrency.processutils [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:08:37 compute-0 nova_compute[183075]: 2026-01-22 17:08:37.968 183079 DEBUG oslo_concurrency.lockutils [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:37 compute-0 nova_compute[183075]: 2026-01-22 17:08:37.968 183079 DEBUG oslo_concurrency.lockutils [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:37 compute-0 nova_compute[183075]: 2026-01-22 17:08:37.983 183079 DEBUG oslo_concurrency.processutils [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.002 183079 DEBUG nova.virt.libvirt.vif [None req-ab74e55c-f22a-43b9-95a2-a48745fa2bd8 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:07:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1172585183',display_name='tempest-server-test-1172585183',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1172585183',id=5,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCYZfzAO5Ui0C6aNyi2ae5nossybaMPhlAmp9Zj0dL/LunopVV9meBl8qfqrR0u5rqBGUC3w2LWkiMuhmtUWjdFFHsn2/jcRm/feYqTpe58YE0uSBcC8gJ0hMPoRirOIAg==',key_name='tempest-keypair-test-1166356111',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:07:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='eb8e9f7a891a4a38af8b01557eddc991',ramdisk_id='',reservation_id='r-68us24bx',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_
input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPPortDetailsTest-1812576526',owner_user_name='tempest-FloatingIPPortDetailsTest-1812576526-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:07:45Z,user_data=None,user_id='d7ee6c51c2b8447baefccea20fa16de5',uuid=000b64b8-bcc5-4bbe-9703-8400a83a27d0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "address": "fa:16:3e:a0:1c:cd", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdeda3dcd-de", "ovs_interfaceid": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.003 183079 DEBUG nova.network.os_vif_util [None req-ab74e55c-f22a-43b9-95a2-a48745fa2bd8 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Converting VIF {"id": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "address": "fa:16:3e:a0:1c:cd", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdeda3dcd-de", "ovs_interfaceid": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.004 183079 DEBUG nova.network.os_vif_util [None req-ab74e55c-f22a-43b9-95a2-a48745fa2bd8 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:1c:cd,bridge_name='br-int',has_traffic_filtering=True,id=deda3dcd-de47-47e2-8bb1-526d3882d38e,network=Network(473b4e99-4018-4fa7-ab1c-2d3e7944d850),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdeda3dcd-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.008 183079 DEBUG nova.virt.libvirt.guest [None req-ab74e55c-f22a-43b9-95a2-a48745fa2bd8 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:a0:1c:cd"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapdeda3dcd-de"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.010 183079 DEBUG nova.virt.libvirt.guest [None req-ab74e55c-f22a-43b9-95a2-a48745fa2bd8 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:a0:1c:cd"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapdeda3dcd-de"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.014 183079 DEBUG nova.virt.libvirt.driver [None req-ab74e55c-f22a-43b9-95a2-a48745fa2bd8 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Attempting to detach device tapdeda3dcd-de from instance 000b64b8-bcc5-4bbe-9703-8400a83a27d0 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.015 183079 DEBUG nova.virt.libvirt.guest [None req-ab74e55c-f22a-43b9-95a2-a48745fa2bd8 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] detach device xml: <interface type="ethernet">
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <mac address="fa:16:3e:a0:1c:cd"/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <model type="virtio"/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <mtu size="1442"/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <target dev="tapdeda3dcd-de"/>
Jan 22 17:08:38 compute-0 nova_compute[183075]: </interface>
Jan 22 17:08:38 compute-0 nova_compute[183075]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.023 183079 DEBUG nova.virt.libvirt.guest [None req-ab74e55c-f22a-43b9-95a2-a48745fa2bd8 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:a0:1c:cd"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapdeda3dcd-de"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.027 183079 DEBUG nova.virt.libvirt.guest [None req-ab74e55c-f22a-43b9-95a2-a48745fa2bd8 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:a0:1c:cd"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapdeda3dcd-de"/></interface>not found in domain: <domain type='kvm' id='5'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <name>instance-00000005</name>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <uuid>000b64b8-bcc5-4bbe-9703-8400a83a27d0</uuid>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <nova:name>tempest-server-test-1172585183</nova:name>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <nova:creationTime>2026-01-22 17:08:36</nova:creationTime>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <nova:flavor name="m1.nano">
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <nova:memory>128</nova:memory>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <nova:disk>1</nova:disk>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <nova:swap>0</nova:swap>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <nova:vcpus>1</nova:vcpus>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   </nova:flavor>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <nova:owner>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <nova:user uuid="d7ee6c51c2b8447baefccea20fa16de5">tempest-FloatingIPPortDetailsTest-1812576526-project-member</nova:user>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <nova:project uuid="eb8e9f7a891a4a38af8b01557eddc991">tempest-FloatingIPPortDetailsTest-1812576526</nova:project>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   </nova:owner>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <nova:ports>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <nova:port uuid="c694dca0-bf6e-4e89-a43e-1d10b3b23075">
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </nova:port>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <nova:port uuid="deda3dcd-de47-47e2-8bb1-526d3882d38e">
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </nova:port>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   </nova:ports>
Jan 22 17:08:38 compute-0 nova_compute[183075]: </nova:instance>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <memory unit='KiB'>131072</memory>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <vcpu placement='static'>1</vcpu>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <resource>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <partition>/machine</partition>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   </resource>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <sysinfo type='smbios'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <system>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <entry name='manufacturer'>RDO</entry>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <entry name='product'>OpenStack Compute</entry>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <entry name='serial'>000b64b8-bcc5-4bbe-9703-8400a83a27d0</entry>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <entry name='uuid'>000b64b8-bcc5-4bbe-9703-8400a83a27d0</entry>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <entry name='family'>Virtual Machine</entry>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </system>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <os>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <boot dev='hd'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <smbios mode='sysinfo'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   </os>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <features>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <vmcoreinfo state='on'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   </features>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <cpu mode='custom' match='exact' check='full'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <model fallback='forbid'>EPYC-Rome</model>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <vendor>AMD</vendor>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='require' name='x2apic'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='require' name='tsc-deadline'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='require' name='hypervisor'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='require' name='tsc_adjust'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='require' name='spec-ctrl'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='require' name='stibp'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='require' name='ssbd'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='require' name='cmp_legacy'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='require' name='overflow-recov'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='require' name='succor'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='require' name='ibrs'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='require' name='amd-ssbd'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='require' name='virt-ssbd'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='disable' name='lbrv'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='disable' name='tsc-scale'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='disable' name='vmcb-clean'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='disable' name='flushbyasid'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='disable' name='pause-filter'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='disable' name='pfthreshold'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='disable' name='svme-addr-chk'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='require' name='lfence-always-serializing'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='disable' name='xsaves'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='disable' name='svm'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='require' name='topoext'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='disable' name='npt'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='disable' name='nrip-save'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <clock offset='utc'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <timer name='pit' tickpolicy='delay'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <timer name='hpet' present='no'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <on_poweroff>destroy</on_poweroff>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <on_reboot>restart</on_reboot>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <on_crash>destroy</on_crash>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <disk type='file' device='disk'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <source file='/var/lib/nova/instances/000b64b8-bcc5-4bbe-9703-8400a83a27d0/disk' index='1'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <backingStore type='file' index='2'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:         <format type='raw'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:         <source file='/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:         <backingStore/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       </backingStore>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target dev='vda' bus='virtio'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='virtio-disk0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='0' model='pcie-root'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pcie.0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='1' port='0x10'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.1'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='2' port='0x11'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.2'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='3' port='0x12'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.3'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='4' port='0x13'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.4'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='5' port='0x14'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.5'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='6' port='0x15'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.6'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='7' port='0x16'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.7'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='8' port='0x17'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.8'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='9' port='0x18'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.9'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='10' port='0x19'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.10'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='11' port='0x1a'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.11'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='12' port='0x1b'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.12'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='13' port='0x1c'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.13'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='14' port='0x1d'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.14'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='15' port='0x1e'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.15'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='16' port='0x1f'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.16'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='17' port='0x20'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.17'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='18' port='0x21'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.18'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='19' port='0x22'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.19'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='20' port='0x23'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.20'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='21' port='0x24'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.21'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='22' port='0x25'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.22'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='23' port='0x26'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.23'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='24' port='0x27'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.24'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='25' port='0x28'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.25'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-pci-bridge'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.26'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='usb'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='sata' index='0'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='ide'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <interface type='ethernet'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <mac address='fa:16:3e:0a:25:d6'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target dev='tapc694dca0-bf'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model type='virtio'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <driver name='vhost' rx_queue_size='512'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <mtu size='1442'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='net0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <interface type='ethernet'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <mac address='fa:16:3e:a0:1c:cd'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target dev='tapdeda3dcd-de'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model type='virtio'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <driver name='vhost' rx_queue_size='512'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <mtu size='1442'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='net1'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <serial type='pty'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <source path='/dev/pts/3'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <log file='/var/lib/nova/instances/000b64b8-bcc5-4bbe-9703-8400a83a27d0/console.log' append='off'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target type='isa-serial' port='0'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:         <model name='isa-serial'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       </target>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='serial0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <console type='pty' tty='/dev/pts/3'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <source path='/dev/pts/3'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <log file='/var/lib/nova/instances/000b64b8-bcc5-4bbe-9703-8400a83a27d0/console.log' append='off'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target type='serial' port='0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='serial0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </console>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <input type='tablet' bus='usb'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='input0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='usb' bus='0' port='1'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </input>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <input type='mouse' bus='ps2'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='input1'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </input>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <input type='keyboard' bus='ps2'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='input2'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </input>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <graphics type='vnc' port='5903' autoport='yes' listen='::0'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <listen type='address' address='::0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </graphics>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <audio id='1' type='none'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <video>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model type='virtio' heads='1' primary='yes'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='video0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </video>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <watchdog model='itco' action='reset'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='watchdog0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </watchdog>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <memballoon model='virtio'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <stats period='10'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='balloon0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <rng model='virtio'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <backend model='random'>/dev/urandom</backend>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='rng0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <label>system_u:system_r:svirt_t:s0:c754,c781</label>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c754,c781</imagelabel>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   </seclabel>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <label>+107:+107</label>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <imagelabel>+107:+107</imagelabel>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   </seclabel>
Jan 22 17:08:38 compute-0 nova_compute[183075]: </domain>
Jan 22 17:08:38 compute-0 nova_compute[183075]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.029 183079 INFO nova.virt.libvirt.driver [None req-ab74e55c-f22a-43b9-95a2-a48745fa2bd8 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Successfully detached device tapdeda3dcd-de from instance 000b64b8-bcc5-4bbe-9703-8400a83a27d0 from the persistent domain config.
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.029 183079 DEBUG nova.virt.libvirt.driver [None req-ab74e55c-f22a-43b9-95a2-a48745fa2bd8 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] (1/8): Attempting to detach device tapdeda3dcd-de with device alias net1 from instance 000b64b8-bcc5-4bbe-9703-8400a83a27d0 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.030 183079 DEBUG nova.virt.libvirt.guest [None req-ab74e55c-f22a-43b9-95a2-a48745fa2bd8 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] detach device xml: <interface type="ethernet">
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <mac address="fa:16:3e:a0:1c:cd"/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <model type="virtio"/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <mtu size="1442"/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <target dev="tapdeda3dcd-de"/>
Jan 22 17:08:38 compute-0 nova_compute[183075]: </interface>
Jan 22 17:08:38 compute-0 nova_compute[183075]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.041 183079 DEBUG oslo_concurrency.processutils [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.042 183079 DEBUG oslo_concurrency.processutils [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/ee33030a-2035-4fd1-8de4-261142b89bc6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.079 183079 DEBUG oslo_concurrency.processutils [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/ee33030a-2035-4fd1-8de4-261142b89bc6/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.080 183079 DEBUG oslo_concurrency.lockutils [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.080 183079 DEBUG oslo_concurrency.processutils [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:08:38 compute-0 kernel: tapdeda3dcd-de (unregistering): left promiscuous mode
Jan 22 17:08:38 compute-0 NetworkManager[55454]: <info>  [1769101718.1328] device (tapdeda3dcd-de): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.145 183079 DEBUG nova.virt.libvirt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Received event <DeviceRemovedEvent: 1769101718.1448326, 000b64b8-bcc5-4bbe-9703-8400a83a27d0 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.147 183079 DEBUG oslo_concurrency.processutils [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.148 183079 DEBUG nova.virt.disk.api [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Checking if we can resize image /var/lib/nova/instances/ee33030a-2035-4fd1-8de4-261142b89bc6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.148 183079 DEBUG oslo_concurrency.processutils [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee33030a-2035-4fd1-8de4-261142b89bc6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:08:38 compute-0 ovn_controller[95372]: 2026-01-22T17:08:38Z|00089|binding|INFO|Releasing lport deda3dcd-de47-47e2-8bb1-526d3882d38e from this chassis (sb_readonly=0)
Jan 22 17:08:38 compute-0 ovn_controller[95372]: 2026-01-22T17:08:38Z|00090|binding|INFO|Setting lport deda3dcd-de47-47e2-8bb1-526d3882d38e down in Southbound
Jan 22 17:08:38 compute-0 ovn_controller[95372]: 2026-01-22T17:08:38Z|00091|binding|INFO|Removing iface tapdeda3dcd-de ovn-installed in OVS
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.166 183079 DEBUG nova.virt.libvirt.driver [None req-ab74e55c-f22a-43b9-95a2-a48745fa2bd8 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Start waiting for the detach event from libvirt for device tapdeda3dcd-de with device alias net1 for instance 000b64b8-bcc5-4bbe-9703-8400a83a27d0 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.167 183079 DEBUG nova.virt.libvirt.guest [None req-ab74e55c-f22a-43b9-95a2-a48745fa2bd8 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:a0:1c:cd"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapdeda3dcd-de"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.168 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:38.160 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:1c:cd 10.100.0.10'], port_security=['fa:16:3e:a0:1c:cd 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '000b64b8-bcc5-4bbe-9703-8400a83a27d0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-473b4e99-4018-4fa7-ab1c-2d3e7944d850', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'neutron:revision_number': '9', 'neutron:security_group_ids': '3fb419e8-dd25-4fec-8107-6e7d89977d34', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.234', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a8cd0033-3766-4772-8688-44d3d0e3250a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=deda3dcd-de47-47e2-8bb1-526d3882d38e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:08:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:38.161 104629 INFO neutron.agent.ovn.metadata.agent [-] Port deda3dcd-de47-47e2-8bb1-526d3882d38e in datapath 473b4e99-4018-4fa7-ab1c-2d3e7944d850 unbound from our chassis
Jan 22 17:08:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:38.163 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 473b4e99-4018-4fa7-ab1c-2d3e7944d850
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.174 183079 DEBUG nova.virt.libvirt.guest [None req-ab74e55c-f22a-43b9-95a2-a48745fa2bd8 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:a0:1c:cd"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapdeda3dcd-de"/></interface>not found in domain: <domain type='kvm' id='5'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <name>instance-00000005</name>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <uuid>000b64b8-bcc5-4bbe-9703-8400a83a27d0</uuid>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <nova:name>tempest-server-test-1172585183</nova:name>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <nova:creationTime>2026-01-22 17:08:36</nova:creationTime>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <nova:flavor name="m1.nano">
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <nova:memory>128</nova:memory>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <nova:disk>1</nova:disk>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <nova:swap>0</nova:swap>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <nova:vcpus>1</nova:vcpus>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   </nova:flavor>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <nova:owner>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <nova:user uuid="d7ee6c51c2b8447baefccea20fa16de5">tempest-FloatingIPPortDetailsTest-1812576526-project-member</nova:user>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <nova:project uuid="eb8e9f7a891a4a38af8b01557eddc991">tempest-FloatingIPPortDetailsTest-1812576526</nova:project>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   </nova:owner>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <nova:ports>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <nova:port uuid="c694dca0-bf6e-4e89-a43e-1d10b3b23075">
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </nova:port>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <nova:port uuid="deda3dcd-de47-47e2-8bb1-526d3882d38e">
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </nova:port>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   </nova:ports>
Jan 22 17:08:38 compute-0 nova_compute[183075]: </nova:instance>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <memory unit='KiB'>131072</memory>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <vcpu placement='static'>1</vcpu>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <resource>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <partition>/machine</partition>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   </resource>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <sysinfo type='smbios'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <system>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <entry name='manufacturer'>RDO</entry>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <entry name='product'>OpenStack Compute</entry>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <entry name='serial'>000b64b8-bcc5-4bbe-9703-8400a83a27d0</entry>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <entry name='uuid'>000b64b8-bcc5-4bbe-9703-8400a83a27d0</entry>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <entry name='family'>Virtual Machine</entry>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </system>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <os>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <boot dev='hd'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <smbios mode='sysinfo'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   </os>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <features>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <vmcoreinfo state='on'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   </features>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <cpu mode='custom' match='exact' check='full'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <model fallback='forbid'>EPYC-Rome</model>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <vendor>AMD</vendor>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='require' name='x2apic'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='require' name='tsc-deadline'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='require' name='hypervisor'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='require' name='tsc_adjust'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='require' name='spec-ctrl'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='require' name='stibp'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='require' name='ssbd'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='require' name='cmp_legacy'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='require' name='overflow-recov'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='require' name='succor'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='require' name='ibrs'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='require' name='amd-ssbd'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='require' name='virt-ssbd'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='disable' name='lbrv'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='disable' name='tsc-scale'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='disable' name='vmcb-clean'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='disable' name='flushbyasid'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='disable' name='pause-filter'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='disable' name='pfthreshold'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='disable' name='svme-addr-chk'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='require' name='lfence-always-serializing'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='disable' name='xsaves'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='disable' name='svm'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='require' name='topoext'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='disable' name='npt'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <feature policy='disable' name='nrip-save'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <clock offset='utc'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <timer name='pit' tickpolicy='delay'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <timer name='hpet' present='no'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <on_poweroff>destroy</on_poweroff>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <on_reboot>restart</on_reboot>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <on_crash>destroy</on_crash>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <disk type='file' device='disk'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <source file='/var/lib/nova/instances/000b64b8-bcc5-4bbe-9703-8400a83a27d0/disk' index='1'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <backingStore type='file' index='2'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:         <format type='raw'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:         <source file='/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:         <backingStore/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       </backingStore>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target dev='vda' bus='virtio'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='virtio-disk0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='0' model='pcie-root'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pcie.0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='1' port='0x10'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.1'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='2' port='0x11'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.2'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='3' port='0x12'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.3'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='4' port='0x13'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.4'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='5' port='0x14'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.5'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='6' port='0x15'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.6'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='7' port='0x16'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.7'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='8' port='0x17'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.8'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='9' port='0x18'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.9'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='10' port='0x19'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.10'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='11' port='0x1a'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.11'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='12' port='0x1b'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.12'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='13' port='0x1c'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.13'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='14' port='0x1d'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.14'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='15' port='0x1e'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.15'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='16' port='0x1f'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.16'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='17' port='0x20'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.17'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='18' port='0x21'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.18'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='19' port='0x22'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.19'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='20' port='0x23'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.20'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='21' port='0x24'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.21'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='22' port='0x25'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.22'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='23' port='0x26'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.23'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='24' port='0x27'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.24'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target chassis='25' port='0x28'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.25'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model name='pcie-pci-bridge'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='pci.26'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='usb'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <controller type='sata' index='0'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='ide'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <interface type='ethernet'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <mac address='fa:16:3e:0a:25:d6'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target dev='tapc694dca0-bf'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model type='virtio'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <driver name='vhost' rx_queue_size='512'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <mtu size='1442'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='net0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <serial type='pty'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <source path='/dev/pts/3'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <log file='/var/lib/nova/instances/000b64b8-bcc5-4bbe-9703-8400a83a27d0/console.log' append='off'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target type='isa-serial' port='0'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:         <model name='isa-serial'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       </target>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='serial0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <console type='pty' tty='/dev/pts/3'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <source path='/dev/pts/3'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <log file='/var/lib/nova/instances/000b64b8-bcc5-4bbe-9703-8400a83a27d0/console.log' append='off'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <target type='serial' port='0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='serial0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </console>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <input type='tablet' bus='usb'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='input0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='usb' bus='0' port='1'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </input>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <input type='mouse' bus='ps2'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='input1'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </input>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <input type='keyboard' bus='ps2'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='input2'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </input>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <graphics type='vnc' port='5903' autoport='yes' listen='::0'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <listen type='address' address='::0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </graphics>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <audio id='1' type='none'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <video>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <model type='virtio' heads='1' primary='yes'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='video0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </video>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <watchdog model='itco' action='reset'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='watchdog0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </watchdog>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <memballoon model='virtio'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <stats period='10'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='balloon0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <rng model='virtio'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <backend model='random'>/dev/urandom</backend>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <alias name='rng0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <label>system_u:system_r:svirt_t:s0:c754,c781</label>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c754,c781</imagelabel>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   </seclabel>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <label>+107:+107</label>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <imagelabel>+107:+107</imagelabel>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   </seclabel>
Jan 22 17:08:38 compute-0 nova_compute[183075]: </domain>
Jan 22 17:08:38 compute-0 nova_compute[183075]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 22 17:08:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:38.178 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3af6960a-4034-42e5-b28c-3451c3b27163]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.187 183079 INFO nova.virt.libvirt.driver [None req-ab74e55c-f22a-43b9-95a2-a48745fa2bd8 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Successfully detached device tapdeda3dcd-de from instance 000b64b8-bcc5-4bbe-9703-8400a83a27d0 from the live domain config.
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.188 183079 DEBUG nova.virt.libvirt.vif [None req-ab74e55c-f22a-43b9-95a2-a48745fa2bd8 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:07:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1172585183',display_name='tempest-server-test-1172585183',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1172585183',id=5,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCYZfzAO5Ui0C6aNyi2ae5nossybaMPhlAmp9Zj0dL/LunopVV9meBl8qfqrR0u5rqBGUC3w2LWkiMuhmtUWjdFFHsn2/jcRm/feYqTpe58YE0uSBcC8gJ0hMPoRirOIAg==',key_name='tempest-keypair-test-1166356111',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:07:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='eb8e9f7a891a4a38af8b01557eddc991',ramdisk_id='',reservation_id='r-68us24bx',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_
input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPPortDetailsTest-1812576526',owner_user_name='tempest-FloatingIPPortDetailsTest-1812576526-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:07:45Z,user_data=None,user_id='d7ee6c51c2b8447baefccea20fa16de5',uuid=000b64b8-bcc5-4bbe-9703-8400a83a27d0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "address": "fa:16:3e:a0:1c:cd", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdeda3dcd-de", "ovs_interfaceid": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.188 183079 DEBUG nova.network.os_vif_util [None req-ab74e55c-f22a-43b9-95a2-a48745fa2bd8 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Converting VIF {"id": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "address": "fa:16:3e:a0:1c:cd", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdeda3dcd-de", "ovs_interfaceid": "deda3dcd-de47-47e2-8bb1-526d3882d38e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.189 183079 DEBUG nova.network.os_vif_util [None req-ab74e55c-f22a-43b9-95a2-a48745fa2bd8 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:1c:cd,bridge_name='br-int',has_traffic_filtering=True,id=deda3dcd-de47-47e2-8bb1-526d3882d38e,network=Network(473b4e99-4018-4fa7-ab1c-2d3e7944d850),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdeda3dcd-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.190 183079 DEBUG os_vif [None req-ab74e55c-f22a-43b9-95a2-a48745fa2bd8 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:1c:cd,bridge_name='br-int',has_traffic_filtering=True,id=deda3dcd-de47-47e2-8bb1-526d3882d38e,network=Network(473b4e99-4018-4fa7-ab1c-2d3e7944d850),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdeda3dcd-de') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.191 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.192 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdeda3dcd-de, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.193 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.195 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.197 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.203 183079 INFO os_vif [None req-ab74e55c-f22a-43b9-95a2-a48745fa2bd8 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:1c:cd,bridge_name='br-int',has_traffic_filtering=True,id=deda3dcd-de47-47e2-8bb1-526d3882d38e,network=Network(473b4e99-4018-4fa7-ab1c-2d3e7944d850),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdeda3dcd-de')
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.204 183079 DEBUG nova.virt.libvirt.guest [None req-ab74e55c-f22a-43b9-95a2-a48745fa2bd8 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <nova:name>tempest-server-test-1172585183</nova:name>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <nova:creationTime>2026-01-22 17:08:38</nova:creationTime>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <nova:flavor name="m1.nano">
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <nova:memory>128</nova:memory>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <nova:disk>1</nova:disk>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <nova:swap>0</nova:swap>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <nova:vcpus>1</nova:vcpus>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   </nova:flavor>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <nova:owner>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <nova:user uuid="d7ee6c51c2b8447baefccea20fa16de5">tempest-FloatingIPPortDetailsTest-1812576526-project-member</nova:user>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <nova:project uuid="eb8e9f7a891a4a38af8b01557eddc991">tempest-FloatingIPPortDetailsTest-1812576526</nova:project>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   </nova:owner>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   <nova:ports>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     <nova:port uuid="c694dca0-bf6e-4e89-a43e-1d10b3b23075">
Jan 22 17:08:38 compute-0 nova_compute[183075]:       <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 22 17:08:38 compute-0 nova_compute[183075]:     </nova:port>
Jan 22 17:08:38 compute-0 nova_compute[183075]:   </nova:ports>
Jan 22 17:08:38 compute-0 nova_compute[183075]: </nova:instance>
Jan 22 17:08:38 compute-0 nova_compute[183075]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.213 183079 DEBUG oslo_concurrency.processutils [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee33030a-2035-4fd1-8de4-261142b89bc6/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.214 183079 DEBUG nova.virt.disk.api [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Cannot resize image /var/lib/nova/instances/ee33030a-2035-4fd1-8de4-261142b89bc6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.214 183079 DEBUG nova.objects.instance [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lazy-loading 'migration_context' on Instance uuid ee33030a-2035-4fd1-8de4-261142b89bc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:08:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:38.216 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[26e63836-b9f5-4a18-9d75-58d3553243cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:38.219 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[844483f3-67d3-49da-a09e-0cbd76e1f00b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.233 183079 DEBUG nova.virt.libvirt.driver [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.233 183079 DEBUG nova.virt.libvirt.driver [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Ensure instance console log exists: /var/lib/nova/instances/ee33030a-2035-4fd1-8de4-261142b89bc6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.234 183079 DEBUG oslo_concurrency.lockutils [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.234 183079 DEBUG oslo_concurrency.lockutils [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.235 183079 DEBUG oslo_concurrency.lockutils [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:38.244 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[5f3d017e-662f-4bc4-ab8d-4569a31e694f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:38.262 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[345c194e-784b-468b-918e-dfb8a415a27b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap473b4e99-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4d:87:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 203, 'tx_packets': 111, 'rx_bytes': 17350, 'tx_bytes': 12244, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 203, 'tx_packets': 111, 'rx_bytes': 17350, 'tx_bytes': 12244, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399443, 'reachable_time': 41758, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216505, 'error': None, 'target': 'ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:38.279 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[cf0655d5-05ab-4fe5-a6f6-85d872b35d62]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap473b4e99-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 399457, 'tstamp': 399457}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216506, 'error': None, 'target': 'ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap473b4e99-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 399460, 'tstamp': 399460}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216506, 'error': None, 'target': 'ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:38.280 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap473b4e99-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.281 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.282 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:38.283 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap473b4e99-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:38.283 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:08:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:38.284 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap473b4e99-40, col_values=(('external_ids', {'iface-id': '424ac40e-403e-4504-adbb-47a319b401fd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:38.284 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.546 183079 DEBUG nova.compute.manager [req-d94880b9-0ad2-483a-97a0-05454eb2fc8a req-3115f508-26ee-4b2a-8bd3-22dba5346158 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Received event network-vif-plugged-deda3dcd-de47-47e2-8bb1-526d3882d38e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.547 183079 DEBUG oslo_concurrency.lockutils [req-d94880b9-0ad2-483a-97a0-05454eb2fc8a req-3115f508-26ee-4b2a-8bd3-22dba5346158 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "000b64b8-bcc5-4bbe-9703-8400a83a27d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.548 183079 DEBUG oslo_concurrency.lockutils [req-d94880b9-0ad2-483a-97a0-05454eb2fc8a req-3115f508-26ee-4b2a-8bd3-22dba5346158 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "000b64b8-bcc5-4bbe-9703-8400a83a27d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.548 183079 DEBUG oslo_concurrency.lockutils [req-d94880b9-0ad2-483a-97a0-05454eb2fc8a req-3115f508-26ee-4b2a-8bd3-22dba5346158 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "000b64b8-bcc5-4bbe-9703-8400a83a27d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.548 183079 DEBUG nova.compute.manager [req-d94880b9-0ad2-483a-97a0-05454eb2fc8a req-3115f508-26ee-4b2a-8bd3-22dba5346158 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] No waiting events found dispatching network-vif-plugged-deda3dcd-de47-47e2-8bb1-526d3882d38e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.549 183079 WARNING nova.compute.manager [req-d94880b9-0ad2-483a-97a0-05454eb2fc8a req-3115f508-26ee-4b2a-8bd3-22dba5346158 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Received unexpected event network-vif-plugged-deda3dcd-de47-47e2-8bb1-526d3882d38e for instance with vm_state active and task_state None.
Jan 22 17:08:38 compute-0 nova_compute[183075]: 2026-01-22 17:08:38.953 183079 DEBUG nova.policy [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:08:39 compute-0 nova_compute[183075]: 2026-01-22 17:08:39.865 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:39 compute-0 nova_compute[183075]: 2026-01-22 17:08:39.883 183079 DEBUG nova.network.neutron [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Successfully updated port: 96fce86f-5c20-4930-8c2f-5cb0eb62e7fa _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:08:39 compute-0 nova_compute[183075]: 2026-01-22 17:08:39.901 183079 DEBUG oslo_concurrency.lockutils [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "refresh_cache-ee33030a-2035-4fd1-8de4-261142b89bc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:08:39 compute-0 nova_compute[183075]: 2026-01-22 17:08:39.901 183079 DEBUG oslo_concurrency.lockutils [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquired lock "refresh_cache-ee33030a-2035-4fd1-8de4-261142b89bc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:08:39 compute-0 nova_compute[183075]: 2026-01-22 17:08:39.901 183079 DEBUG nova.network.neutron [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:08:39 compute-0 nova_compute[183075]: 2026-01-22 17:08:39.955 183079 DEBUG oslo_concurrency.lockutils [None req-ab74e55c-f22a-43b9-95a2-a48745fa2bd8 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Acquiring lock "refresh_cache-000b64b8-bcc5-4bbe-9703-8400a83a27d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:08:39 compute-0 nova_compute[183075]: 2026-01-22 17:08:39.956 183079 DEBUG oslo_concurrency.lockutils [None req-ab74e55c-f22a-43b9-95a2-a48745fa2bd8 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Acquired lock "refresh_cache-000b64b8-bcc5-4bbe-9703-8400a83a27d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:08:39 compute-0 nova_compute[183075]: 2026-01-22 17:08:39.956 183079 DEBUG nova.network.neutron [None req-ab74e55c-f22a-43b9-95a2-a48745fa2bd8 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:08:40 compute-0 nova_compute[183075]: 2026-01-22 17:08:40.002 183079 DEBUG nova.compute.manager [req-60212762-c474-4078-acc6-4ebadcc6a005 req-b79efd9e-20b9-4ec9-bc78-95867fb47ff5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Received event network-changed-96fce86f-5c20-4930-8c2f-5cb0eb62e7fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:08:40 compute-0 nova_compute[183075]: 2026-01-22 17:08:40.003 183079 DEBUG nova.compute.manager [req-60212762-c474-4078-acc6-4ebadcc6a005 req-b79efd9e-20b9-4ec9-bc78-95867fb47ff5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Refreshing instance network info cache due to event network-changed-96fce86f-5c20-4930-8c2f-5cb0eb62e7fa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:08:40 compute-0 nova_compute[183075]: 2026-01-22 17:08:40.003 183079 DEBUG oslo_concurrency.lockutils [req-60212762-c474-4078-acc6-4ebadcc6a005 req-b79efd9e-20b9-4ec9-bc78-95867fb47ff5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-ee33030a-2035-4fd1-8de4-261142b89bc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:08:40 compute-0 nova_compute[183075]: 2026-01-22 17:08:40.639 183079 DEBUG nova.compute.manager [req-b0fe8a7a-d4a1-4c70-bc2a-8ce5795a59bd req-01a83f80-cbbf-4259-8beb-ef054302650a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Received event network-vif-unplugged-deda3dcd-de47-47e2-8bb1-526d3882d38e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:08:40 compute-0 nova_compute[183075]: 2026-01-22 17:08:40.639 183079 DEBUG oslo_concurrency.lockutils [req-b0fe8a7a-d4a1-4c70-bc2a-8ce5795a59bd req-01a83f80-cbbf-4259-8beb-ef054302650a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "000b64b8-bcc5-4bbe-9703-8400a83a27d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:40 compute-0 nova_compute[183075]: 2026-01-22 17:08:40.640 183079 DEBUG oslo_concurrency.lockutils [req-b0fe8a7a-d4a1-4c70-bc2a-8ce5795a59bd req-01a83f80-cbbf-4259-8beb-ef054302650a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "000b64b8-bcc5-4bbe-9703-8400a83a27d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:40 compute-0 nova_compute[183075]: 2026-01-22 17:08:40.640 183079 DEBUG oslo_concurrency.lockutils [req-b0fe8a7a-d4a1-4c70-bc2a-8ce5795a59bd req-01a83f80-cbbf-4259-8beb-ef054302650a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "000b64b8-bcc5-4bbe-9703-8400a83a27d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:40 compute-0 nova_compute[183075]: 2026-01-22 17:08:40.640 183079 DEBUG nova.compute.manager [req-b0fe8a7a-d4a1-4c70-bc2a-8ce5795a59bd req-01a83f80-cbbf-4259-8beb-ef054302650a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] No waiting events found dispatching network-vif-unplugged-deda3dcd-de47-47e2-8bb1-526d3882d38e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:08:40 compute-0 nova_compute[183075]: 2026-01-22 17:08:40.640 183079 WARNING nova.compute.manager [req-b0fe8a7a-d4a1-4c70-bc2a-8ce5795a59bd req-01a83f80-cbbf-4259-8beb-ef054302650a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Received unexpected event network-vif-unplugged-deda3dcd-de47-47e2-8bb1-526d3882d38e for instance with vm_state active and task_state None.
Jan 22 17:08:40 compute-0 nova_compute[183075]: 2026-01-22 17:08:40.640 183079 DEBUG nova.compute.manager [req-b0fe8a7a-d4a1-4c70-bc2a-8ce5795a59bd req-01a83f80-cbbf-4259-8beb-ef054302650a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Received event network-vif-plugged-deda3dcd-de47-47e2-8bb1-526d3882d38e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:08:40 compute-0 nova_compute[183075]: 2026-01-22 17:08:40.640 183079 DEBUG oslo_concurrency.lockutils [req-b0fe8a7a-d4a1-4c70-bc2a-8ce5795a59bd req-01a83f80-cbbf-4259-8beb-ef054302650a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "000b64b8-bcc5-4bbe-9703-8400a83a27d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:40 compute-0 nova_compute[183075]: 2026-01-22 17:08:40.641 183079 DEBUG oslo_concurrency.lockutils [req-b0fe8a7a-d4a1-4c70-bc2a-8ce5795a59bd req-01a83f80-cbbf-4259-8beb-ef054302650a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "000b64b8-bcc5-4bbe-9703-8400a83a27d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:40 compute-0 nova_compute[183075]: 2026-01-22 17:08:40.641 183079 DEBUG oslo_concurrency.lockutils [req-b0fe8a7a-d4a1-4c70-bc2a-8ce5795a59bd req-01a83f80-cbbf-4259-8beb-ef054302650a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "000b64b8-bcc5-4bbe-9703-8400a83a27d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:40 compute-0 nova_compute[183075]: 2026-01-22 17:08:40.641 183079 DEBUG nova.compute.manager [req-b0fe8a7a-d4a1-4c70-bc2a-8ce5795a59bd req-01a83f80-cbbf-4259-8beb-ef054302650a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] No waiting events found dispatching network-vif-plugged-deda3dcd-de47-47e2-8bb1-526d3882d38e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:08:40 compute-0 nova_compute[183075]: 2026-01-22 17:08:40.641 183079 WARNING nova.compute.manager [req-b0fe8a7a-d4a1-4c70-bc2a-8ce5795a59bd req-01a83f80-cbbf-4259-8beb-ef054302650a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Received unexpected event network-vif-plugged-deda3dcd-de47-47e2-8bb1-526d3882d38e for instance with vm_state active and task_state None.
Jan 22 17:08:40 compute-0 nova_compute[183075]: 2026-01-22 17:08:40.906 183079 DEBUG nova.network.neutron [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:08:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:41.921 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:41.922 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:41.923 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.092 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:42 compute-0 ovn_controller[95372]: 2026-01-22T17:08:42Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:67:5c:7a 10.100.0.9
Jan 22 17:08:42 compute-0 ovn_controller[95372]: 2026-01-22T17:08:42Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:67:5c:7a 10.100.0.9
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.823 183079 INFO nova.compute.manager [None req-8267d195-002e-451f-955e-49464385f6c7 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Get console output
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.828 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.848 183079 INFO nova.network.neutron [None req-ab74e55c-f22a-43b9-95a2-a48745fa2bd8 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Port deda3dcd-de47-47e2-8bb1-526d3882d38e from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.849 183079 DEBUG nova.network.neutron [None req-ab74e55c-f22a-43b9-95a2-a48745fa2bd8 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Updating instance_info_cache with network_info: [{"id": "c694dca0-bf6e-4e89-a43e-1d10b3b23075", "address": "fa:16:3e:0a:25:d6", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc694dca0-bf", "ovs_interfaceid": "c694dca0-bf6e-4e89-a43e-1d10b3b23075", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.868 183079 DEBUG oslo_concurrency.lockutils [None req-ab74e55c-f22a-43b9-95a2-a48745fa2bd8 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Releasing lock "refresh_cache-000b64b8-bcc5-4bbe-9703-8400a83a27d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.895 183079 DEBUG oslo_concurrency.lockutils [None req-ab74e55c-f22a-43b9-95a2-a48745fa2bd8 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "interface-000b64b8-bcc5-4bbe-9703-8400a83a27d0-deda3dcd-de47-47e2-8bb1-526d3882d38e" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 5.063s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.896 183079 DEBUG nova.network.neutron [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Updating instance_info_cache with network_info: [{"id": "96fce86f-5c20-4930-8c2f-5cb0eb62e7fa", "address": "fa:16:3e:c6:77:15", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96fce86f-5c", "ovs_interfaceid": "96fce86f-5c20-4930-8c2f-5cb0eb62e7fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.916 183079 DEBUG oslo_concurrency.lockutils [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Releasing lock "refresh_cache-ee33030a-2035-4fd1-8de4-261142b89bc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.916 183079 DEBUG nova.compute.manager [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Instance network_info: |[{"id": "96fce86f-5c20-4930-8c2f-5cb0eb62e7fa", "address": "fa:16:3e:c6:77:15", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96fce86f-5c", "ovs_interfaceid": "96fce86f-5c20-4930-8c2f-5cb0eb62e7fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.917 183079 DEBUG oslo_concurrency.lockutils [req-60212762-c474-4078-acc6-4ebadcc6a005 req-b79efd9e-20b9-4ec9-bc78-95867fb47ff5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-ee33030a-2035-4fd1-8de4-261142b89bc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.917 183079 DEBUG nova.network.neutron [req-60212762-c474-4078-acc6-4ebadcc6a005 req-b79efd9e-20b9-4ec9-bc78-95867fb47ff5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Refreshing network info cache for port 96fce86f-5c20-4930-8c2f-5cb0eb62e7fa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.919 183079 DEBUG nova.virt.libvirt.driver [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Start _get_guest_xml network_info=[{"id": "96fce86f-5c20-4930-8c2f-5cb0eb62e7fa", "address": "fa:16:3e:c6:77:15", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96fce86f-5c", "ovs_interfaceid": "96fce86f-5c20-4930-8c2f-5cb0eb62e7fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.923 183079 WARNING nova.virt.libvirt.driver [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.927 183079 DEBUG nova.virt.libvirt.host [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.927 183079 DEBUG nova.virt.libvirt.host [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.933 183079 DEBUG nova.virt.libvirt.host [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.933 183079 DEBUG nova.virt.libvirt.host [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.933 183079 DEBUG nova.virt.libvirt.driver [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.934 183079 DEBUG nova.virt.hardware [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.934 183079 DEBUG nova.virt.hardware [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.934 183079 DEBUG nova.virt.hardware [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.934 183079 DEBUG nova.virt.hardware [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.935 183079 DEBUG nova.virt.hardware [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.935 183079 DEBUG nova.virt.hardware [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.935 183079 DEBUG nova.virt.hardware [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.935 183079 DEBUG nova.virt.hardware [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.935 183079 DEBUG nova.virt.hardware [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.936 183079 DEBUG nova.virt.hardware [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.936 183079 DEBUG nova.virt.hardware [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.938 183079 DEBUG nova.virt.libvirt.vif [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:08:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-197701255',display_name='tempest-server-test-197701255',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-197701255',id=8,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFroJSJbEvCYe2sCwXkwL5KPVfwqqRPNqrWU5nnhv8U4G8wYWbEqk5AtRWXltmX7CAIPBST+sTwCKfymhALKd54XNN92D6KKnKgA6jmOihENah4FGfU80fx1UEE0osLpiw==',key_name='tempest-keypair-test-618317104',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e05c7aae349e4a1d859a387df45650a0',ramdisk_id='',reservation_id='r-9ywe3qjk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virt
io',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSeparateNetwork-931877966',owner_user_name='tempest-FloatingIpSeparateNetwork-931877966-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:08:37Z,user_data=None,user_id='cd47d63cff2548a88e21e5c2e6a5c161',uuid=ee33030a-2035-4fd1-8de4-261142b89bc6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "96fce86f-5c20-4930-8c2f-5cb0eb62e7fa", "address": "fa:16:3e:c6:77:15", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96fce86f-5c", "ovs_interfaceid": "96fce86f-5c20-4930-8c2f-5cb0eb62e7fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.939 183079 DEBUG nova.network.os_vif_util [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converting VIF {"id": "96fce86f-5c20-4930-8c2f-5cb0eb62e7fa", "address": "fa:16:3e:c6:77:15", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96fce86f-5c", "ovs_interfaceid": "96fce86f-5c20-4930-8c2f-5cb0eb62e7fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.939 183079 DEBUG nova.network.os_vif_util [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:77:15,bridge_name='br-int',has_traffic_filtering=True,id=96fce86f-5c20-4930-8c2f-5cb0eb62e7fa,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap96fce86f-5c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.940 183079 DEBUG nova.objects.instance [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lazy-loading 'pci_devices' on Instance uuid ee33030a-2035-4fd1-8de4-261142b89bc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.949 183079 DEBUG nova.virt.libvirt.driver [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:08:42 compute-0 nova_compute[183075]:   <uuid>ee33030a-2035-4fd1-8de4-261142b89bc6</uuid>
Jan 22 17:08:42 compute-0 nova_compute[183075]:   <name>instance-00000008</name>
Jan 22 17:08:42 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:08:42 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:08:42 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:08:42 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-197701255</nova:name>
Jan 22 17:08:42 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:08:42</nova:creationTime>
Jan 22 17:08:42 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:08:42 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:08:42 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:08:42 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:08:42 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:08:42 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:08:42 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:08:42 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:08:42 compute-0 nova_compute[183075]:         <nova:user uuid="cd47d63cff2548a88e21e5c2e6a5c161">tempest-FloatingIpSeparateNetwork-931877966-project-member</nova:user>
Jan 22 17:08:42 compute-0 nova_compute[183075]:         <nova:project uuid="e05c7aae349e4a1d859a387df45650a0">tempest-FloatingIpSeparateNetwork-931877966</nova:project>
Jan 22 17:08:42 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:08:42 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:08:42 compute-0 nova_compute[183075]:         <nova:port uuid="96fce86f-5c20-4930-8c2f-5cb0eb62e7fa">
Jan 22 17:08:42 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:08:42 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:08:42 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:08:42 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <system>
Jan 22 17:08:42 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:08:42 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:08:42 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:08:42 compute-0 nova_compute[183075]:       <entry name="serial">ee33030a-2035-4fd1-8de4-261142b89bc6</entry>
Jan 22 17:08:42 compute-0 nova_compute[183075]:       <entry name="uuid">ee33030a-2035-4fd1-8de4-261142b89bc6</entry>
Jan 22 17:08:42 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     </system>
Jan 22 17:08:42 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:08:42 compute-0 nova_compute[183075]:   <os>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:   </os>
Jan 22 17:08:42 compute-0 nova_compute[183075]:   <features>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:   </features>
Jan 22 17:08:42 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:08:42 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:08:42 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:08:42 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/ee33030a-2035-4fd1-8de4-261142b89bc6/disk"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:08:42 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:c6:77:15"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:       <target dev="tap96fce86f-5c"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:08:42 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/ee33030a-2035-4fd1-8de4-261142b89bc6/console.log" append="off"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <video>
Jan 22 17:08:42 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     </video>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:08:42 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:08:42 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:08:42 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:08:42 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:08:42 compute-0 nova_compute[183075]: </domain>
Jan 22 17:08:42 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.950 183079 DEBUG nova.compute.manager [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Preparing to wait for external event network-vif-plugged-96fce86f-5c20-4930-8c2f-5cb0eb62e7fa prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.950 183079 DEBUG oslo_concurrency.lockutils [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "ee33030a-2035-4fd1-8de4-261142b89bc6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.951 183079 DEBUG oslo_concurrency.lockutils [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "ee33030a-2035-4fd1-8de4-261142b89bc6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.951 183079 DEBUG oslo_concurrency.lockutils [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "ee33030a-2035-4fd1-8de4-261142b89bc6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.951 183079 DEBUG nova.virt.libvirt.vif [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:08:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-197701255',display_name='tempest-server-test-197701255',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-197701255',id=8,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFroJSJbEvCYe2sCwXkwL5KPVfwqqRPNqrWU5nnhv8U4G8wYWbEqk5AtRWXltmX7CAIPBST+sTwCKfymhALKd54XNN92D6KKnKgA6jmOihENah4FGfU80fx1UEE0osLpiw==',key_name='tempest-keypair-test-618317104',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e05c7aae349e4a1d859a387df45650a0',ramdisk_id='',reservation_id='r-9ywe3qjk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSeparateNetwork-931877966',owner_user_name='tempest-FloatingIpSeparateNetwork-931877966-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:08:37Z,user_data=None,user_id='cd47d63cff2548a88e21e5c2e6a5c161',uuid=ee33030a-2035-4fd1-8de4-261142b89bc6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "96fce86f-5c20-4930-8c2f-5cb0eb62e7fa", "address": "fa:16:3e:c6:77:15", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96fce86f-5c", "ovs_interfaceid": "96fce86f-5c20-4930-8c2f-5cb0eb62e7fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.952 183079 DEBUG nova.network.os_vif_util [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converting VIF {"id": "96fce86f-5c20-4930-8c2f-5cb0eb62e7fa", "address": "fa:16:3e:c6:77:15", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96fce86f-5c", "ovs_interfaceid": "96fce86f-5c20-4930-8c2f-5cb0eb62e7fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.952 183079 DEBUG nova.network.os_vif_util [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:77:15,bridge_name='br-int',has_traffic_filtering=True,id=96fce86f-5c20-4930-8c2f-5cb0eb62e7fa,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap96fce86f-5c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.953 183079 DEBUG os_vif [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:77:15,bridge_name='br-int',has_traffic_filtering=True,id=96fce86f-5c20-4930-8c2f-5cb0eb62e7fa,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap96fce86f-5c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.953 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.953 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.954 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.956 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.956 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap96fce86f-5c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.956 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap96fce86f-5c, col_values=(('external_ids', {'iface-id': '96fce86f-5c20-4930-8c2f-5cb0eb62e7fa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c6:77:15', 'vm-uuid': 'ee33030a-2035-4fd1-8de4-261142b89bc6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.957 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:42 compute-0 NetworkManager[55454]: <info>  [1769101722.9587] manager: (tap96fce86f-5c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.960 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.963 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:42 compute-0 nova_compute[183075]: 2026-01-22 17:08:42.964 183079 INFO os_vif [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:77:15,bridge_name='br-int',has_traffic_filtering=True,id=96fce86f-5c20-4930-8c2f-5cb0eb62e7fa,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap96fce86f-5c')
Jan 22 17:08:43 compute-0 nova_compute[183075]: 2026-01-22 17:08:43.009 183079 DEBUG nova.virt.libvirt.driver [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:08:43 compute-0 nova_compute[183075]: 2026-01-22 17:08:43.010 183079 DEBUG nova.virt.libvirt.driver [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] No VIF found with MAC fa:16:3e:c6:77:15, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:08:43 compute-0 kernel: tap96fce86f-5c: entered promiscuous mode
Jan 22 17:08:43 compute-0 NetworkManager[55454]: <info>  [1769101723.0601] manager: (tap96fce86f-5c): new Tun device (/org/freedesktop/NetworkManager/Devices/54)
Jan 22 17:08:43 compute-0 ovn_controller[95372]: 2026-01-22T17:08:43Z|00092|binding|INFO|Claiming lport 96fce86f-5c20-4930-8c2f-5cb0eb62e7fa for this chassis.
Jan 22 17:08:43 compute-0 ovn_controller[95372]: 2026-01-22T17:08:43Z|00093|binding|INFO|96fce86f-5c20-4930-8c2f-5cb0eb62e7fa: Claiming fa:16:3e:c6:77:15 10.100.0.12
Jan 22 17:08:43 compute-0 nova_compute[183075]: 2026-01-22 17:08:43.061 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:43.072 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:77:15 10.100.0.12'], port_security=['fa:16:3e:c6:77:15 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'ee33030a-2035-4fd1-8de4-261142b89bc6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-576f6598-999f-46d9-809a-65b7475a1ec7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e05c7aae349e4a1d859a387df45650a0', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b9cc1ac9-85f4-4da7-891e-b0644711e574', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.203'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=92087573-6410-4f80-a195-ce8b2058420e, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=96fce86f-5c20-4930-8c2f-5cb0eb62e7fa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:43.073 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 96fce86f-5c20-4930-8c2f-5cb0eb62e7fa in datapath 576f6598-999f-46d9-809a-65b7475a1ec7 bound to our chassis
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:43.075 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 576f6598-999f-46d9-809a-65b7475a1ec7
Jan 22 17:08:43 compute-0 ovn_controller[95372]: 2026-01-22T17:08:43Z|00094|binding|INFO|Setting lport 96fce86f-5c20-4930-8c2f-5cb0eb62e7fa ovn-installed in OVS
Jan 22 17:08:43 compute-0 ovn_controller[95372]: 2026-01-22T17:08:43Z|00095|binding|INFO|Setting lport 96fce86f-5c20-4930-8c2f-5cb0eb62e7fa up in Southbound
Jan 22 17:08:43 compute-0 nova_compute[183075]: 2026-01-22 17:08:43.081 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:43.090 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f9356818-5f6a-42ff-af76-262f136690f5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:43.092 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap576f6598-91 in ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:08:43 compute-0 systemd-machined[154382]: New machine qemu-8-instance-00000008.
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:43.094 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap576f6598-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:43.094 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6d3464a2-5cf0-4018-87ce-aee432d2d33c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:43.095 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0e0cc234-d676-4730-bbae-d5e89778cdfd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:43 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-00000008.
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:43.107 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[530d72c9-dd6e-4844-9e09-0772b7ad36dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:43 compute-0 systemd-udevd[216540]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:08:43 compute-0 NetworkManager[55454]: <info>  [1769101723.1301] device (tap96fce86f-5c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:08:43 compute-0 NetworkManager[55454]: <info>  [1769101723.1314] device (tap96fce86f-5c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:43.135 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1fadfc0c-4c1f-4873-a60b-b5dde48c6d82]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:43.163 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[5e4c9bf5-bab3-46bc-ae0e-e7adabd93098]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:43.167 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[fc7ddf7c-ff6a-4de6-87fa-15f61304176b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:43 compute-0 NetworkManager[55454]: <info>  [1769101723.1687] manager: (tap576f6598-90): new Veth device (/org/freedesktop/NetworkManager/Devices/55)
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:43.196 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[a04c5699-903a-4a50-8155-7d208406999c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:43.199 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[c2126a94-4aba-4131-a54e-a2202484b732]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:43 compute-0 NetworkManager[55454]: <info>  [1769101723.2254] device (tap576f6598-90): carrier: link connected
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:43.230 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[2ababd7c-95e7-472a-8545-38cbd49ba446]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:43.247 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[002befbf-c6f1-4d6e-ac89-2e72eaf5e504]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap576f6598-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a1:fa:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 408692, 'reachable_time': 30470, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216571, 'error': None, 'target': 'ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:43.263 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[97f21634-2227-4249-9d01-5816fd5221af]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea1:facd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 408692, 'tstamp': 408692}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216572, 'error': None, 'target': 'ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:43.281 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[bfbeee1d-444e-4cec-9c62-2a45de94d36c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap576f6598-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a1:fa:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 408692, 'reachable_time': 30470, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216573, 'error': None, 'target': 'ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:43.319 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[bfea2d45-f79d-474e-8f25-37c48697dc9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:43.388 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f316af53-a4dc-4368-8722-04a8da7f5938]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:43.389 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap576f6598-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:43.390 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:43.390 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap576f6598-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:43 compute-0 nova_compute[183075]: 2026-01-22 17:08:43.392 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:43 compute-0 NetworkManager[55454]: <info>  [1769101723.3925] manager: (tap576f6598-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Jan 22 17:08:43 compute-0 kernel: tap576f6598-90: entered promiscuous mode
Jan 22 17:08:43 compute-0 nova_compute[183075]: 2026-01-22 17:08:43.395 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:43.396 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap576f6598-90, col_values=(('external_ids', {'iface-id': '1759254b-798a-4e65-baf5-489557c1f604'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:43 compute-0 ovn_controller[95372]: 2026-01-22T17:08:43Z|00096|binding|INFO|Releasing lport 1759254b-798a-4e65-baf5-489557c1f604 from this chassis (sb_readonly=0)
Jan 22 17:08:43 compute-0 nova_compute[183075]: 2026-01-22 17:08:43.398 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:43 compute-0 nova_compute[183075]: 2026-01-22 17:08:43.423 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:43.424 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/576f6598-999f-46d9-809a-65b7475a1ec7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/576f6598-999f-46d9-809a-65b7475a1ec7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:43.426 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e92a4c0c-36a6-4574-b8b0-5048ae5b33da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:43.426 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/576f6598-999f-46d9-809a-65b7475a1ec7.pid.haproxy
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 576f6598-999f-46d9-809a-65b7475a1ec7
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:08:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:43.427 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7', 'env', 'PROCESS_TAG=haproxy-576f6598-999f-46d9-809a-65b7475a1ec7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/576f6598-999f-46d9-809a-65b7475a1ec7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:08:43 compute-0 podman[216607]: 2026-01-22 17:08:43.796925781 +0000 UTC m=+0.061644673 container create f2b5b24e23b87235f99f4debb313a36e66992aa00f49bd54657271c674244e6f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 22 17:08:43 compute-0 nova_compute[183075]: 2026-01-22 17:08:43.817 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101723.8166907, ee33030a-2035-4fd1-8de4-261142b89bc6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:08:43 compute-0 nova_compute[183075]: 2026-01-22 17:08:43.818 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] VM Started (Lifecycle Event)
Jan 22 17:08:43 compute-0 systemd[1]: Started libpod-conmon-f2b5b24e23b87235f99f4debb313a36e66992aa00f49bd54657271c674244e6f.scope.
Jan 22 17:08:43 compute-0 nova_compute[183075]: 2026-01-22 17:08:43.846 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:08:43 compute-0 nova_compute[183075]: 2026-01-22 17:08:43.853 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101723.816813, ee33030a-2035-4fd1-8de4-261142b89bc6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:08:43 compute-0 nova_compute[183075]: 2026-01-22 17:08:43.854 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] VM Paused (Lifecycle Event)
Jan 22 17:08:43 compute-0 podman[216607]: 2026-01-22 17:08:43.765311225 +0000 UTC m=+0.030030117 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:08:43 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:08:43 compute-0 nova_compute[183075]: 2026-01-22 17:08:43.876 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:08:43 compute-0 nova_compute[183075]: 2026-01-22 17:08:43.878 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:08:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db7cdd337bbfefe149ee8eac594df93ad5083e1b3fc7233b3c22e94254963d11/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:08:43 compute-0 podman[216607]: 2026-01-22 17:08:43.892245964 +0000 UTC m=+0.156964886 container init f2b5b24e23b87235f99f4debb313a36e66992aa00f49bd54657271c674244e6f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 22 17:08:43 compute-0 podman[216607]: 2026-01-22 17:08:43.897329437 +0000 UTC m=+0.162048329 container start f2b5b24e23b87235f99f4debb313a36e66992aa00f49bd54657271c674244e6f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 22 17:08:43 compute-0 nova_compute[183075]: 2026-01-22 17:08:43.912 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:08:43 compute-0 neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7[216628]: [NOTICE]   (216632) : New worker (216634) forked
Jan 22 17:08:43 compute-0 neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7[216628]: [NOTICE]   (216632) : Loading success.
Jan 22 17:08:43 compute-0 nova_compute[183075]: 2026-01-22 17:08:43.992 183079 DEBUG nova.compute.manager [req-39d61760-ebb0-404f-aae0-ff7bcdfb3edd req-81f3c8a8-6290-4e30-878a-2db4b8829ce5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Received event network-vif-plugged-96fce86f-5c20-4930-8c2f-5cb0eb62e7fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:08:43 compute-0 nova_compute[183075]: 2026-01-22 17:08:43.992 183079 DEBUG oslo_concurrency.lockutils [req-39d61760-ebb0-404f-aae0-ff7bcdfb3edd req-81f3c8a8-6290-4e30-878a-2db4b8829ce5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "ee33030a-2035-4fd1-8de4-261142b89bc6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:43 compute-0 nova_compute[183075]: 2026-01-22 17:08:43.992 183079 DEBUG oslo_concurrency.lockutils [req-39d61760-ebb0-404f-aae0-ff7bcdfb3edd req-81f3c8a8-6290-4e30-878a-2db4b8829ce5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "ee33030a-2035-4fd1-8de4-261142b89bc6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:43 compute-0 nova_compute[183075]: 2026-01-22 17:08:43.992 183079 DEBUG oslo_concurrency.lockutils [req-39d61760-ebb0-404f-aae0-ff7bcdfb3edd req-81f3c8a8-6290-4e30-878a-2db4b8829ce5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "ee33030a-2035-4fd1-8de4-261142b89bc6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:43 compute-0 nova_compute[183075]: 2026-01-22 17:08:43.992 183079 DEBUG nova.compute.manager [req-39d61760-ebb0-404f-aae0-ff7bcdfb3edd req-81f3c8a8-6290-4e30-878a-2db4b8829ce5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Processing event network-vif-plugged-96fce86f-5c20-4930-8c2f-5cb0eb62e7fa _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:08:43 compute-0 nova_compute[183075]: 2026-01-22 17:08:43.993 183079 DEBUG nova.compute.manager [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:08:43 compute-0 nova_compute[183075]: 2026-01-22 17:08:43.996 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101723.9958768, ee33030a-2035-4fd1-8de4-261142b89bc6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:08:43 compute-0 nova_compute[183075]: 2026-01-22 17:08:43.996 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] VM Resumed (Lifecycle Event)
Jan 22 17:08:43 compute-0 nova_compute[183075]: 2026-01-22 17:08:43.997 183079 DEBUG nova.virt.libvirt.driver [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:08:44 compute-0 nova_compute[183075]: 2026-01-22 17:08:43.999 183079 INFO nova.virt.libvirt.driver [-] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Instance spawned successfully.
Jan 22 17:08:44 compute-0 nova_compute[183075]: 2026-01-22 17:08:44.000 183079 DEBUG nova.virt.libvirt.driver [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:08:44 compute-0 nova_compute[183075]: 2026-01-22 17:08:44.023 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:08:44 compute-0 nova_compute[183075]: 2026-01-22 17:08:44.036 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:08:44 compute-0 nova_compute[183075]: 2026-01-22 17:08:44.042 183079 DEBUG nova.virt.libvirt.driver [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:08:44 compute-0 nova_compute[183075]: 2026-01-22 17:08:44.043 183079 DEBUG nova.virt.libvirt.driver [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:08:44 compute-0 nova_compute[183075]: 2026-01-22 17:08:44.044 183079 DEBUG nova.virt.libvirt.driver [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:08:44 compute-0 nova_compute[183075]: 2026-01-22 17:08:44.045 183079 DEBUG nova.virt.libvirt.driver [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:08:44 compute-0 nova_compute[183075]: 2026-01-22 17:08:44.046 183079 DEBUG nova.virt.libvirt.driver [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:08:44 compute-0 nova_compute[183075]: 2026-01-22 17:08:44.047 183079 DEBUG nova.virt.libvirt.driver [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:08:44 compute-0 nova_compute[183075]: 2026-01-22 17:08:44.062 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:08:44 compute-0 nova_compute[183075]: 2026-01-22 17:08:44.117 183079 INFO nova.compute.manager [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Took 6.27 seconds to spawn the instance on the hypervisor.
Jan 22 17:08:44 compute-0 nova_compute[183075]: 2026-01-22 17:08:44.118 183079 DEBUG nova.compute.manager [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:08:44 compute-0 nova_compute[183075]: 2026-01-22 17:08:44.213 183079 INFO nova.compute.manager [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Took 6.87 seconds to build instance.
Jan 22 17:08:44 compute-0 nova_compute[183075]: 2026-01-22 17:08:44.227 183079 DEBUG oslo_concurrency.lockutils [None req-d3176b63-9e32-43ae-89c1-00507f46ee69 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "ee33030a-2035-4fd1-8de4-261142b89bc6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.944s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:44 compute-0 nova_compute[183075]: 2026-01-22 17:08:44.872 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:45 compute-0 podman[216643]: 2026-01-22 17:08:45.407835235 +0000 UTC m=+0.095987251 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:08:46 compute-0 nova_compute[183075]: 2026-01-22 17:08:46.086 183079 DEBUG nova.compute.manager [req-bb847a82-2cf2-4a0f-9d23-a4adcb2925b1 req-55b78e4e-688b-42dd-acae-c22831114227 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Received event network-vif-plugged-96fce86f-5c20-4930-8c2f-5cb0eb62e7fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:08:46 compute-0 nova_compute[183075]: 2026-01-22 17:08:46.086 183079 DEBUG oslo_concurrency.lockutils [req-bb847a82-2cf2-4a0f-9d23-a4adcb2925b1 req-55b78e4e-688b-42dd-acae-c22831114227 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "ee33030a-2035-4fd1-8de4-261142b89bc6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:46 compute-0 nova_compute[183075]: 2026-01-22 17:08:46.086 183079 DEBUG oslo_concurrency.lockutils [req-bb847a82-2cf2-4a0f-9d23-a4adcb2925b1 req-55b78e4e-688b-42dd-acae-c22831114227 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "ee33030a-2035-4fd1-8de4-261142b89bc6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:46 compute-0 nova_compute[183075]: 2026-01-22 17:08:46.087 183079 DEBUG oslo_concurrency.lockutils [req-bb847a82-2cf2-4a0f-9d23-a4adcb2925b1 req-55b78e4e-688b-42dd-acae-c22831114227 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "ee33030a-2035-4fd1-8de4-261142b89bc6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:46 compute-0 nova_compute[183075]: 2026-01-22 17:08:46.087 183079 DEBUG nova.compute.manager [req-bb847a82-2cf2-4a0f-9d23-a4adcb2925b1 req-55b78e4e-688b-42dd-acae-c22831114227 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] No waiting events found dispatching network-vif-plugged-96fce86f-5c20-4930-8c2f-5cb0eb62e7fa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:08:46 compute-0 nova_compute[183075]: 2026-01-22 17:08:46.087 183079 WARNING nova.compute.manager [req-bb847a82-2cf2-4a0f-9d23-a4adcb2925b1 req-55b78e4e-688b-42dd-acae-c22831114227 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Received unexpected event network-vif-plugged-96fce86f-5c20-4930-8c2f-5cb0eb62e7fa for instance with vm_state active and task_state None.
Jan 22 17:08:46 compute-0 nova_compute[183075]: 2026-01-22 17:08:46.420 183079 DEBUG nova.network.neutron [req-60212762-c474-4078-acc6-4ebadcc6a005 req-b79efd9e-20b9-4ec9-bc78-95867fb47ff5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Updated VIF entry in instance network info cache for port 96fce86f-5c20-4930-8c2f-5cb0eb62e7fa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:08:46 compute-0 nova_compute[183075]: 2026-01-22 17:08:46.420 183079 DEBUG nova.network.neutron [req-60212762-c474-4078-acc6-4ebadcc6a005 req-b79efd9e-20b9-4ec9-bc78-95867fb47ff5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Updating instance_info_cache with network_info: [{"id": "96fce86f-5c20-4930-8c2f-5cb0eb62e7fa", "address": "fa:16:3e:c6:77:15", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96fce86f-5c", "ovs_interfaceid": "96fce86f-5c20-4930-8c2f-5cb0eb62e7fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:08:46 compute-0 nova_compute[183075]: 2026-01-22 17:08:46.434 183079 DEBUG oslo_concurrency.lockutils [req-60212762-c474-4078-acc6-4ebadcc6a005 req-b79efd9e-20b9-4ec9-bc78-95867fb47ff5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-ee33030a-2035-4fd1-8de4-261142b89bc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:08:46 compute-0 nova_compute[183075]: 2026-01-22 17:08:46.698 183079 INFO nova.compute.manager [None req-16515626-a954-408d-aa5f-70450cd48c98 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Get console output
Jan 22 17:08:46 compute-0 nova_compute[183075]: 2026-01-22 17:08:46.706 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:08:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:47.463 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:47.465 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:08:47 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:47 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:47 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:47 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:47 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:47 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:08:47 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:47 compute-0 nova_compute[183075]: 2026-01-22 17:08:47.946 183079 INFO nova.compute.manager [None req-a19fed82-bf16-4165-b00c-84d0c8f41bd5 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Get console output
Jan 22 17:08:47 compute-0 nova_compute[183075]: 2026-01-22 17:08:47.951 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:08:48 compute-0 nova_compute[183075]: 2026-01-22 17:08:48.003 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.095 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.095 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.6304762
Jan 22 17:08:48 compute-0 haproxy-metadata-proxy-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216162]: 10.100.0.9:39508 [22/Jan/2026:17:08:47.461] listener listener/metadata 0/0/0/634/634 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.103 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.104 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.120 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.120 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 168 time: 0.0160832
Jan 22 17:08:48 compute-0 haproxy-metadata-proxy-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216162]: 10.100.0.9:39524 [22/Jan/2026:17:08:48.103] listener listener/metadata 0/0/0/17/17 200 152 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.124 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.125 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.146 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:48 compute-0 haproxy-metadata-proxy-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216162]: 10.100.0.9:39532 [22/Jan/2026:17:08:48.124] listener listener/metadata 0/0/0/22/22 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.146 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0212882
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.152 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.152 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.176 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:48 compute-0 haproxy-metadata-proxy-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216162]: 10.100.0.9:39546 [22/Jan/2026:17:08:48.151] listener listener/metadata 0/0/0/25/25 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.177 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0246036
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.183 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.184 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.202 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:48 compute-0 haproxy-metadata-proxy-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216162]: 10.100.0.9:39548 [22/Jan/2026:17:08:48.183] listener listener/metadata 0/0/0/20/20 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.203 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0191190
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.207 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.209 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.225 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:48 compute-0 haproxy-metadata-proxy-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216162]: 10.100.0.9:39554 [22/Jan/2026:17:08:48.207] listener listener/metadata 0/0/0/18/18 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.225 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0169432
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.230 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.230 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.244 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:48 compute-0 haproxy-metadata-proxy-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216162]: 10.100.0.9:39568 [22/Jan/2026:17:08:48.229] listener listener/metadata 0/0/0/15/15 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.245 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0143783
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.249 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.250 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.267 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:48 compute-0 haproxy-metadata-proxy-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216162]: 10.100.0.9:39574 [22/Jan/2026:17:08:48.249] listener listener/metadata 0/0/0/18/18 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.268 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0175426
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.272 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.273 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.293 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.294 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0207732
Jan 22 17:08:48 compute-0 haproxy-metadata-proxy-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216162]: 10.100.0.9:39578 [22/Jan/2026:17:08:48.272] listener listener/metadata 0/0/0/21/21 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.298 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.299 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.315 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:48 compute-0 haproxy-metadata-proxy-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216162]: 10.100.0.9:39594 [22/Jan/2026:17:08:48.298] listener listener/metadata 0/0/0/17/17 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.315 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0162041
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.320 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.320 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:48 compute-0 haproxy-metadata-proxy-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216162]: 10.100.0.9:39600 [22/Jan/2026:17:08:48.320] listener listener/metadata 0/0/0/39/39 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.359 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0382667
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.376 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.376 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.396 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.396 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0201387
Jan 22 17:08:48 compute-0 haproxy-metadata-proxy-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216162]: 10.100.0.9:39610 [22/Jan/2026:17:08:48.375] listener listener/metadata 0/0/0/21/21 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.400 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.401 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.423 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.424 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0226073
Jan 22 17:08:48 compute-0 haproxy-metadata-proxy-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216162]: 10.100.0.9:39616 [22/Jan/2026:17:08:48.400] listener listener/metadata 0/0/0/24/24 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.428 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.429 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.450 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.450 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0217271
Jan 22 17:08:48 compute-0 haproxy-metadata-proxy-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216162]: 10.100.0.9:39628 [22/Jan/2026:17:08:48.427] listener listener/metadata 0/0/0/23/23 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.460 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.465 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.493 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:48 compute-0 haproxy-metadata-proxy-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216162]: 10.100.0.9:39644 [22/Jan/2026:17:08:48.460] listener listener/metadata 0/0/0/33/33 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.494 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0276685
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.499 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.500 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.519 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:08:48 compute-0 haproxy-metadata-proxy-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216162]: 10.100.0.9:39654 [22/Jan/2026:17:08:48.498] listener listener/metadata 0/0/0/20/20 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:08:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:48.519 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0194657
Jan 22 17:08:49 compute-0 nova_compute[183075]: 2026-01-22 17:08:49.179 183079 DEBUG oslo_concurrency.lockutils [None req-27eaba90-596a-416c-9408-8bbbecbfd952 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Acquiring lock "000b64b8-bcc5-4bbe-9703-8400a83a27d0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:49 compute-0 nova_compute[183075]: 2026-01-22 17:08:49.180 183079 DEBUG oslo_concurrency.lockutils [None req-27eaba90-596a-416c-9408-8bbbecbfd952 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "000b64b8-bcc5-4bbe-9703-8400a83a27d0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:49 compute-0 nova_compute[183075]: 2026-01-22 17:08:49.180 183079 DEBUG oslo_concurrency.lockutils [None req-27eaba90-596a-416c-9408-8bbbecbfd952 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Acquiring lock "000b64b8-bcc5-4bbe-9703-8400a83a27d0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:49 compute-0 nova_compute[183075]: 2026-01-22 17:08:49.180 183079 DEBUG oslo_concurrency.lockutils [None req-27eaba90-596a-416c-9408-8bbbecbfd952 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "000b64b8-bcc5-4bbe-9703-8400a83a27d0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:49 compute-0 nova_compute[183075]: 2026-01-22 17:08:49.181 183079 DEBUG oslo_concurrency.lockutils [None req-27eaba90-596a-416c-9408-8bbbecbfd952 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "000b64b8-bcc5-4bbe-9703-8400a83a27d0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:49 compute-0 nova_compute[183075]: 2026-01-22 17:08:49.182 183079 INFO nova.compute.manager [None req-27eaba90-596a-416c-9408-8bbbecbfd952 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Terminating instance
Jan 22 17:08:49 compute-0 nova_compute[183075]: 2026-01-22 17:08:49.183 183079 DEBUG nova.compute.manager [None req-27eaba90-596a-416c-9408-8bbbecbfd952 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:08:49 compute-0 kernel: tapc694dca0-bf (unregistering): left promiscuous mode
Jan 22 17:08:49 compute-0 NetworkManager[55454]: <info>  [1769101729.2197] device (tapc694dca0-bf): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:08:49 compute-0 ovn_controller[95372]: 2026-01-22T17:08:49Z|00097|binding|INFO|Releasing lport c694dca0-bf6e-4e89-a43e-1d10b3b23075 from this chassis (sb_readonly=0)
Jan 22 17:08:49 compute-0 ovn_controller[95372]: 2026-01-22T17:08:49Z|00098|binding|INFO|Setting lport c694dca0-bf6e-4e89-a43e-1d10b3b23075 down in Southbound
Jan 22 17:08:49 compute-0 ovn_controller[95372]: 2026-01-22T17:08:49Z|00099|binding|INFO|Removing iface tapc694dca0-bf ovn-installed in OVS
Jan 22 17:08:49 compute-0 nova_compute[183075]: 2026-01-22 17:08:49.226 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:49.237 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0a:25:d6 10.100.0.4'], port_security=['fa:16:3e:0a:25:d6 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '000b64b8-bcc5-4bbe-9703-8400a83a27d0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-473b4e99-4018-4fa7-ab1c-2d3e7944d850', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b0c128dc-3c6d-4d32-ac6a-884653522196', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a8cd0033-3766-4772-8688-44d3d0e3250a, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=c694dca0-bf6e-4e89-a43e-1d10b3b23075) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:08:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:49.241 104629 INFO neutron.agent.ovn.metadata.agent [-] Port c694dca0-bf6e-4e89-a43e-1d10b3b23075 in datapath 473b4e99-4018-4fa7-ab1c-2d3e7944d850 unbound from our chassis
Jan 22 17:08:49 compute-0 nova_compute[183075]: 2026-01-22 17:08:49.246 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:49.250 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 473b4e99-4018-4fa7-ab1c-2d3e7944d850
Jan 22 17:08:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:49.267 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[eb8ba5c4-2dc7-47e1-8724-3d2cc96ccb32]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:49 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Deactivated successfully.
Jan 22 17:08:49 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Consumed 14.829s CPU time.
Jan 22 17:08:49 compute-0 systemd-machined[154382]: Machine qemu-5-instance-00000005 terminated.
Jan 22 17:08:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:49.300 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[404bac96-cf7e-4f56-91b5-9f2fb162fa1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:49.308 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[ecfa7902-68a2-4a37-8c3d-61a621a91bf0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:49.340 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[fdd2821a-501e-48be-8ecf-663d620f12d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:49.367 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[761603a1-0a0d-48eb-bda6-c5c8d770c8e7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap473b4e99-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4d:87:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 203, 'tx_packets': 113, 'rx_bytes': 17350, 'tx_bytes': 12328, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 203, 'tx_packets': 113, 'rx_bytes': 17350, 'tx_bytes': 12328, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399443, 'reachable_time': 41758, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216686, 'error': None, 'target': 'ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:49.388 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6790ba80-4261-458e-b986-b003452b19a9]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap473b4e99-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 399457, 'tstamp': 399457}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216687, 'error': None, 'target': 'ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap473b4e99-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 399460, 'tstamp': 399460}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216687, 'error': None, 'target': 'ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:49.392 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap473b4e99-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:49 compute-0 nova_compute[183075]: 2026-01-22 17:08:49.394 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:49 compute-0 nova_compute[183075]: 2026-01-22 17:08:49.401 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:49.401 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap473b4e99-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:49.402 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:08:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:49.402 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap473b4e99-40, col_values=(('external_ids', {'iface-id': '424ac40e-403e-4504-adbb-47a319b401fd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:49.403 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:08:49 compute-0 nova_compute[183075]: 2026-01-22 17:08:49.407 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:49 compute-0 nova_compute[183075]: 2026-01-22 17:08:49.413 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:49 compute-0 nova_compute[183075]: 2026-01-22 17:08:49.445 183079 INFO nova.virt.libvirt.driver [-] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Instance destroyed successfully.
Jan 22 17:08:49 compute-0 nova_compute[183075]: 2026-01-22 17:08:49.447 183079 DEBUG nova.objects.instance [None req-27eaba90-596a-416c-9408-8bbbecbfd952 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lazy-loading 'resources' on Instance uuid 000b64b8-bcc5-4bbe-9703-8400a83a27d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:08:49 compute-0 nova_compute[183075]: 2026-01-22 17:08:49.467 183079 DEBUG nova.virt.libvirt.vif [None req-27eaba90-596a-416c-9408-8bbbecbfd952 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:07:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1172585183',display_name='tempest-server-test-1172585183',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1172585183',id=5,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCYZfzAO5Ui0C6aNyi2ae5nossybaMPhlAmp9Zj0dL/LunopVV9meBl8qfqrR0u5rqBGUC3w2LWkiMuhmtUWjdFFHsn2/jcRm/feYqTpe58YE0uSBcC8gJ0hMPoRirOIAg==',key_name='tempest-keypair-test-1166356111',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:07:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='eb8e9f7a891a4a38af8b01557eddc991',ramdisk_id='',reservation_id='r-68us24bx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw
_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPPortDetailsTest-1812576526',owner_user_name='tempest-FloatingIPPortDetailsTest-1812576526-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:07:45Z,user_data=None,user_id='d7ee6c51c2b8447baefccea20fa16de5',uuid=000b64b8-bcc5-4bbe-9703-8400a83a27d0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c694dca0-bf6e-4e89-a43e-1d10b3b23075", "address": "fa:16:3e:0a:25:d6", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc694dca0-bf", "ovs_interfaceid": "c694dca0-bf6e-4e89-a43e-1d10b3b23075", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:08:49 compute-0 nova_compute[183075]: 2026-01-22 17:08:49.468 183079 DEBUG nova.network.os_vif_util [None req-27eaba90-596a-416c-9408-8bbbecbfd952 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Converting VIF {"id": "c694dca0-bf6e-4e89-a43e-1d10b3b23075", "address": "fa:16:3e:0a:25:d6", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc694dca0-bf", "ovs_interfaceid": "c694dca0-bf6e-4e89-a43e-1d10b3b23075", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:08:49 compute-0 nova_compute[183075]: 2026-01-22 17:08:49.469 183079 DEBUG nova.network.os_vif_util [None req-27eaba90-596a-416c-9408-8bbbecbfd952 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0a:25:d6,bridge_name='br-int',has_traffic_filtering=True,id=c694dca0-bf6e-4e89-a43e-1d10b3b23075,network=Network(473b4e99-4018-4fa7-ab1c-2d3e7944d850),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc694dca0-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:08:49 compute-0 nova_compute[183075]: 2026-01-22 17:08:49.470 183079 DEBUG os_vif [None req-27eaba90-596a-416c-9408-8bbbecbfd952 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0a:25:d6,bridge_name='br-int',has_traffic_filtering=True,id=c694dca0-bf6e-4e89-a43e-1d10b3b23075,network=Network(473b4e99-4018-4fa7-ab1c-2d3e7944d850),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc694dca0-bf') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:08:49 compute-0 nova_compute[183075]: 2026-01-22 17:08:49.472 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:49 compute-0 nova_compute[183075]: 2026-01-22 17:08:49.472 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc694dca0-bf, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:49 compute-0 nova_compute[183075]: 2026-01-22 17:08:49.477 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:08:49 compute-0 nova_compute[183075]: 2026-01-22 17:08:49.480 183079 INFO os_vif [None req-27eaba90-596a-416c-9408-8bbbecbfd952 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0a:25:d6,bridge_name='br-int',has_traffic_filtering=True,id=c694dca0-bf6e-4e89-a43e-1d10b3b23075,network=Network(473b4e99-4018-4fa7-ab1c-2d3e7944d850),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc694dca0-bf')
Jan 22 17:08:49 compute-0 nova_compute[183075]: 2026-01-22 17:08:49.480 183079 INFO nova.virt.libvirt.driver [None req-27eaba90-596a-416c-9408-8bbbecbfd952 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Deleting instance files /var/lib/nova/instances/000b64b8-bcc5-4bbe-9703-8400a83a27d0_del
Jan 22 17:08:49 compute-0 nova_compute[183075]: 2026-01-22 17:08:49.481 183079 INFO nova.virt.libvirt.driver [None req-27eaba90-596a-416c-9408-8bbbecbfd952 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Deletion of /var/lib/nova/instances/000b64b8-bcc5-4bbe-9703-8400a83a27d0_del complete
Jan 22 17:08:49 compute-0 nova_compute[183075]: 2026-01-22 17:08:49.534 183079 INFO nova.compute.manager [None req-27eaba90-596a-416c-9408-8bbbecbfd952 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Took 0.35 seconds to destroy the instance on the hypervisor.
Jan 22 17:08:49 compute-0 nova_compute[183075]: 2026-01-22 17:08:49.535 183079 DEBUG oslo.service.loopingcall [None req-27eaba90-596a-416c-9408-8bbbecbfd952 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:08:49 compute-0 nova_compute[183075]: 2026-01-22 17:08:49.535 183079 DEBUG nova.compute.manager [-] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:08:49 compute-0 nova_compute[183075]: 2026-01-22 17:08:49.536 183079 DEBUG nova.network.neutron [-] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:08:49 compute-0 nova_compute[183075]: 2026-01-22 17:08:49.875 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:51 compute-0 podman[216702]: 2026-01-22 17:08:51.368154625 +0000 UTC m=+0.077306473 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:08:51 compute-0 podman[216701]: 2026-01-22 17:08:51.381780761 +0000 UTC m=+0.091552885 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
config_id=ovn_controller, org.label-schema.license=GPLv2)
Jan 22 17:08:51 compute-0 nova_compute[183075]: 2026-01-22 17:08:51.821 183079 INFO nova.compute.manager [None req-d9f2af8b-1408-43d2-8479-273a4fd3c84a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Get console output
Jan 22 17:08:52 compute-0 nova_compute[183075]: 2026-01-22 17:08:52.082 183079 DEBUG nova.network.neutron [-] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:08:52 compute-0 nova_compute[183075]: 2026-01-22 17:08:52.118 183079 INFO nova.compute.manager [-] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Took 2.58 seconds to deallocate network for instance.
Jan 22 17:08:52 compute-0 nova_compute[183075]: 2026-01-22 17:08:52.349 183079 DEBUG oslo_concurrency.lockutils [None req-27eaba90-596a-416c-9408-8bbbecbfd952 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:52 compute-0 nova_compute[183075]: 2026-01-22 17:08:52.350 183079 DEBUG oslo_concurrency.lockutils [None req-27eaba90-596a-416c-9408-8bbbecbfd952 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:52 compute-0 nova_compute[183075]: 2026-01-22 17:08:52.549 183079 DEBUG nova.compute.provider_tree [None req-27eaba90-596a-416c-9408-8bbbecbfd952 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:08:52 compute-0 nova_compute[183075]: 2026-01-22 17:08:52.568 183079 DEBUG nova.scheduler.client.report [None req-27eaba90-596a-416c-9408-8bbbecbfd952 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:08:52 compute-0 nova_compute[183075]: 2026-01-22 17:08:52.635 183079 DEBUG oslo_concurrency.lockutils [None req-27eaba90-596a-416c-9408-8bbbecbfd952 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.285s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:52 compute-0 nova_compute[183075]: 2026-01-22 17:08:52.662 183079 INFO nova.scheduler.client.report [None req-27eaba90-596a-416c-9408-8bbbecbfd952 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Deleted allocations for instance 000b64b8-bcc5-4bbe-9703-8400a83a27d0
Jan 22 17:08:52 compute-0 nova_compute[183075]: 2026-01-22 17:08:52.724 183079 DEBUG oslo_concurrency.lockutils [None req-27eaba90-596a-416c-9408-8bbbecbfd952 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "000b64b8-bcc5-4bbe-9703-8400a83a27d0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.544s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.078 183079 INFO nova.compute.manager [None req-bcf57101-2eaa-45fc-96f2-ce4334d25f84 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Get console output
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.084 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.199 183079 DEBUG oslo_concurrency.lockutils [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Acquiring lock "1fa7475b-9f51-4229-8ded-3a0c4de806c5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.200 183079 DEBUG oslo_concurrency.lockutils [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Lock "1fa7475b-9f51-4229-8ded-3a0c4de806c5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.217 183079 DEBUG nova.compute.manager [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.281 183079 DEBUG oslo_concurrency.lockutils [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.282 183079 DEBUG oslo_concurrency.lockutils [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.287 183079 DEBUG nova.virt.hardware [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.287 183079 INFO nova.compute.claims [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:08:53 compute-0 podman[216743]: 2026-01-22 17:08:53.342097992 +0000 UTC m=+0.055984175 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=openstack_network_exporter, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6)
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.456 183079 DEBUG nova.compute.provider_tree [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.470 183079 DEBUG nova.scheduler.client.report [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.503 183079 DEBUG oslo_concurrency.lockutils [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.504 183079 DEBUG nova.compute.manager [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.576 183079 DEBUG nova.compute.manager [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.576 183079 DEBUG nova.network.neutron [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.623 183079 INFO nova.virt.libvirt.driver [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.660 183079 DEBUG nova.compute.manager [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.665 183079 DEBUG oslo_concurrency.lockutils [None req-872fb4e8-ee36-4fed-8b4f-01ff29ebc822 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Acquiring lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.665 183079 DEBUG oslo_concurrency.lockutils [None req-872fb4e8-ee36-4fed-8b4f-01ff29ebc822 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.666 183079 DEBUG oslo_concurrency.lockutils [None req-872fb4e8-ee36-4fed-8b4f-01ff29ebc822 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Acquiring lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.666 183079 DEBUG oslo_concurrency.lockutils [None req-872fb4e8-ee36-4fed-8b4f-01ff29ebc822 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.667 183079 DEBUG oslo_concurrency.lockutils [None req-872fb4e8-ee36-4fed-8b4f-01ff29ebc822 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.668 183079 INFO nova.compute.manager [None req-872fb4e8-ee36-4fed-8b4f-01ff29ebc822 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Terminating instance
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.669 183079 DEBUG nova.compute.manager [None req-872fb4e8-ee36-4fed-8b4f-01ff29ebc822 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:08:53 compute-0 kernel: tap44437e9e-7b (unregistering): left promiscuous mode
Jan 22 17:08:53 compute-0 NetworkManager[55454]: <info>  [1769101733.6907] device (tap44437e9e-7b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:08:53 compute-0 ovn_controller[95372]: 2026-01-22T17:08:53Z|00100|binding|INFO|Releasing lport 44437e9e-7bcf-4942-83a0-cb6139413a8e from this chassis (sb_readonly=0)
Jan 22 17:08:53 compute-0 ovn_controller[95372]: 2026-01-22T17:08:53Z|00101|binding|INFO|Setting lport 44437e9e-7bcf-4942-83a0-cb6139413a8e down in Southbound
Jan 22 17:08:53 compute-0 ovn_controller[95372]: 2026-01-22T17:08:53Z|00102|binding|INFO|Removing iface tap44437e9e-7b ovn-installed in OVS
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.704 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:53.711 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:71:80 10.100.0.6'], port_security=['fa:16:3e:c8:71:80 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-473b4e99-4018-4fa7-ab1c-2d3e7944d850', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b0c128dc-3c6d-4d32-ac6a-884653522196', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a8cd0033-3766-4772-8688-44d3d0e3250a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=44437e9e-7bcf-4942-83a0-cb6139413a8e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:08:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:53.712 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 44437e9e-7bcf-4942-83a0-cb6139413a8e in datapath 473b4e99-4018-4fa7-ab1c-2d3e7944d850 unbound from our chassis
Jan 22 17:08:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:53.713 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 473b4e99-4018-4fa7-ab1c-2d3e7944d850, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:08:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:53.714 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[9e57f605-36b5-434e-a8c0-b393cb7a6b78]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:53.715 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850 namespace which is not needed anymore
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.715 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:53 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Deactivated successfully.
Jan 22 17:08:53 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Consumed 15.979s CPU time.
Jan 22 17:08:53 compute-0 systemd-machined[154382]: Machine qemu-3-instance-00000003 terminated.
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.778 183079 DEBUG nova.compute.manager [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.780 183079 DEBUG nova.virt.libvirt.driver [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.780 183079 INFO nova.virt.libvirt.driver [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Creating image(s)
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.781 183079 DEBUG oslo_concurrency.lockutils [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Acquiring lock "/var/lib/nova/instances/1fa7475b-9f51-4229-8ded-3a0c4de806c5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.781 183079 DEBUG oslo_concurrency.lockutils [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Lock "/var/lib/nova/instances/1fa7475b-9f51-4229-8ded-3a0c4de806c5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.782 183079 DEBUG oslo_concurrency.lockutils [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Lock "/var/lib/nova/instances/1fa7475b-9f51-4229-8ded-3a0c4de806c5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.799 183079 DEBUG oslo_concurrency.processutils [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:08:53 compute-0 neutron-haproxy-ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215490]: [NOTICE]   (215495) : haproxy version is 2.8.14-c23fe91
Jan 22 17:08:53 compute-0 neutron-haproxy-ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215490]: [NOTICE]   (215495) : path to executable is /usr/sbin/haproxy
Jan 22 17:08:53 compute-0 neutron-haproxy-ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215490]: [WARNING]  (215495) : Exiting Master process...
Jan 22 17:08:53 compute-0 neutron-haproxy-ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215490]: [WARNING]  (215495) : Exiting Master process...
Jan 22 17:08:53 compute-0 neutron-haproxy-ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215490]: [ALERT]    (215495) : Current worker (215497) exited with code 143 (Terminated)
Jan 22 17:08:53 compute-0 neutron-haproxy-ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850[215490]: [WARNING]  (215495) : All workers exited. Exiting... (0)
Jan 22 17:08:53 compute-0 systemd[1]: libpod-d57a73aad38b21a6f0273d0b23e99556d1dbeb69953d3deb8b92141d2bdbe91a.scope: Deactivated successfully.
Jan 22 17:08:53 compute-0 podman[216786]: 2026-01-22 17:08:53.860055636 +0000 UTC m=+0.061136170 container died d57a73aad38b21a6f0273d0b23e99556d1dbeb69953d3deb8b92141d2bdbe91a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.882 183079 DEBUG oslo_concurrency.processutils [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:08:53 compute-0 kernel: tap44437e9e-7b: entered promiscuous mode
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.887 183079 DEBUG oslo_concurrency.lockutils [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:53 compute-0 systemd-udevd[216768]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:08:53 compute-0 NetworkManager[55454]: <info>  [1769101733.8919] manager: (tap44437e9e-7b): new Tun device (/org/freedesktop/NetworkManager/Devices/57)
Jan 22 17:08:53 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d57a73aad38b21a6f0273d0b23e99556d1dbeb69953d3deb8b92141d2bdbe91a-userdata-shm.mount: Deactivated successfully.
Jan 22 17:08:53 compute-0 ovn_controller[95372]: 2026-01-22T17:08:53Z|00103|binding|INFO|Claiming lport 44437e9e-7bcf-4942-83a0-cb6139413a8e for this chassis.
Jan 22 17:08:53 compute-0 ovn_controller[95372]: 2026-01-22T17:08:53Z|00104|binding|INFO|44437e9e-7bcf-4942-83a0-cb6139413a8e: Claiming fa:16:3e:c8:71:80 10.100.0.6
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.895 183079 DEBUG oslo_concurrency.lockutils [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.008s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:53 compute-0 systemd[1]: var-lib-containers-storage-overlay-18797a17e2baf24a32ea7ca8564085167b02b4354113699b0ae9251360e34b34-merged.mount: Deactivated successfully.
Jan 22 17:08:53 compute-0 kernel: tap44437e9e-7b (unregistering): left promiscuous mode
Jan 22 17:08:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:53.908 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:71:80 10.100.0.6'], port_security=['fa:16:3e:c8:71:80 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-473b4e99-4018-4fa7-ab1c-2d3e7944d850', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b0c128dc-3c6d-4d32-ac6a-884653522196', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a8cd0033-3766-4772-8688-44d3d0e3250a, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=44437e9e-7bcf-4942-83a0-cb6139413a8e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:08:53 compute-0 podman[216786]: 2026-01-22 17:08:53.910752672 +0000 UTC m=+0.111833206 container cleanup d57a73aad38b21a6f0273d0b23e99556d1dbeb69953d3deb8b92141d2bdbe91a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 22 17:08:53 compute-0 virtnodedevd[182988]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Jan 22 17:08:53 compute-0 virtnodedevd[182988]: hostname: compute-0
Jan 22 17:08:53 compute-0 virtnodedevd[182988]: ethtool ioctl error on tap44437e9e-7b: No such device
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.915 183079 DEBUG oslo_concurrency.processutils [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:08:53 compute-0 virtnodedevd[182988]: ethtool ioctl error on tap44437e9e-7b: No such device
Jan 22 17:08:53 compute-0 ovn_controller[95372]: 2026-01-22T17:08:53Z|00105|binding|INFO|Setting lport 44437e9e-7bcf-4942-83a0-cb6139413a8e ovn-installed in OVS
Jan 22 17:08:53 compute-0 virtnodedevd[182988]: ethtool ioctl error on tap44437e9e-7b: No such device
Jan 22 17:08:53 compute-0 ovn_controller[95372]: 2026-01-22T17:08:53Z|00106|binding|INFO|Setting lport 44437e9e-7bcf-4942-83a0-cb6139413a8e up in Southbound
Jan 22 17:08:53 compute-0 ovn_controller[95372]: 2026-01-22T17:08:53Z|00107|binding|INFO|Releasing lport 44437e9e-7bcf-4942-83a0-cb6139413a8e from this chassis (sb_readonly=1)
Jan 22 17:08:53 compute-0 ovn_controller[95372]: 2026-01-22T17:08:53Z|00108|if_status|INFO|Not setting lport 44437e9e-7bcf-4942-83a0-cb6139413a8e down as sb is readonly
Jan 22 17:08:53 compute-0 virtnodedevd[182988]: ethtool ioctl error on tap44437e9e-7b: No such device
Jan 22 17:08:53 compute-0 ovn_controller[95372]: 2026-01-22T17:08:53Z|00109|binding|INFO|Removing iface tap44437e9e-7b ovn-installed in OVS
Jan 22 17:08:53 compute-0 ovn_controller[95372]: 2026-01-22T17:08:53Z|00110|binding|INFO|Releasing lport 44437e9e-7bcf-4942-83a0-cb6139413a8e from this chassis (sb_readonly=0)
Jan 22 17:08:53 compute-0 ovn_controller[95372]: 2026-01-22T17:08:53Z|00111|binding|INFO|Setting lport 44437e9e-7bcf-4942-83a0-cb6139413a8e down in Southbound
Jan 22 17:08:53 compute-0 systemd[1]: libpod-conmon-d57a73aad38b21a6f0273d0b23e99556d1dbeb69953d3deb8b92141d2bdbe91a.scope: Deactivated successfully.
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.933 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:53 compute-0 virtnodedevd[182988]: ethtool ioctl error on tap44437e9e-7b: No such device
Jan 22 17:08:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:53.939 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:71:80 10.100.0.6'], port_security=['fa:16:3e:c8:71:80 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'effaddee-27ef-49f6-ac5f-2e3258c8d5d2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-473b4e99-4018-4fa7-ab1c-2d3e7944d850', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eb8e9f7a891a4a38af8b01557eddc991', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b0c128dc-3c6d-4d32-ac6a-884653522196', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a8cd0033-3766-4772-8688-44d3d0e3250a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=44437e9e-7bcf-4942-83a0-cb6139413a8e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.941 183079 DEBUG nova.policy [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '56c736d5c1ab41d8a02fcbc021d28353', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a08acddaa8b748f8b4fbb432b95408d1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:08:53 compute-0 virtnodedevd[182988]: ethtool ioctl error on tap44437e9e-7b: No such device
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.946 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:53 compute-0 virtnodedevd[182988]: ethtool ioctl error on tap44437e9e-7b: No such device
Jan 22 17:08:53 compute-0 virtnodedevd[182988]: ethtool ioctl error on tap44437e9e-7b: No such device
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.960 183079 INFO nova.virt.libvirt.driver [-] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Instance destroyed successfully.
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.960 183079 DEBUG nova.objects.instance [None req-872fb4e8-ee36-4fed-8b4f-01ff29ebc822 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lazy-loading 'resources' on Instance uuid effaddee-27ef-49f6-ac5f-2e3258c8d5d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.977 183079 DEBUG oslo_concurrency.processutils [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.978 183079 DEBUG oslo_concurrency.processutils [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/1fa7475b-9f51-4229-8ded-3a0c4de806c5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.994 183079 DEBUG nova.virt.libvirt.vif [None req-872fb4e8-ee36-4fed-8b4f-01ff29ebc822 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:07:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1783669850',display_name='tempest-server-test-1783669850',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1783669850',id=3,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCYZfzAO5Ui0C6aNyi2ae5nossybaMPhlAmp9Zj0dL/LunopVV9meBl8qfqrR0u5rqBGUC3w2LWkiMuhmtUWjdFFHsn2/jcRm/feYqTpe58YE0uSBcC8gJ0hMPoRirOIAg==',key_name='tempest-keypair-test-1166356111',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:07:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='eb8e9f7a891a4a38af8b01557eddc991',ramdisk_id='',reservation_id='r-89frdmqf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw
_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPPortDetailsTest-1812576526',owner_user_name='tempest-FloatingIPPortDetailsTest-1812576526-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:07:11Z,user_data=None,user_id='d7ee6c51c2b8447baefccea20fa16de5',uuid=effaddee-27ef-49f6-ac5f-2e3258c8d5d2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "44437e9e-7bcf-4942-83a0-cb6139413a8e", "address": "fa:16:3e:c8:71:80", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44437e9e-7b", "ovs_interfaceid": "44437e9e-7bcf-4942-83a0-cb6139413a8e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:08:53 compute-0 podman[216827]: 2026-01-22 17:08:53.995329933 +0000 UTC m=+0.050233934 container remove d57a73aad38b21a6f0273d0b23e99556d1dbeb69953d3deb8b92141d2bdbe91a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.995 183079 DEBUG nova.network.os_vif_util [None req-872fb4e8-ee36-4fed-8b4f-01ff29ebc822 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Converting VIF {"id": "44437e9e-7bcf-4942-83a0-cb6139413a8e", "address": "fa:16:3e:c8:71:80", "network": {"id": "473b4e99-4018-4fa7-ab1c-2d3e7944d850", "bridge": "br-int", "label": "tempest-test-network--1171979719", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb8e9f7a891a4a38af8b01557eddc991", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44437e9e-7b", "ovs_interfaceid": "44437e9e-7bcf-4942-83a0-cb6139413a8e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.996 183079 DEBUG nova.network.os_vif_util [None req-872fb4e8-ee36-4fed-8b4f-01ff29ebc822 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c8:71:80,bridge_name='br-int',has_traffic_filtering=True,id=44437e9e-7bcf-4942-83a0-cb6139413a8e,network=Network(473b4e99-4018-4fa7-ab1c-2d3e7944d850),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap44437e9e-7b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.996 183079 DEBUG os_vif [None req-872fb4e8-ee36-4fed-8b4f-01ff29ebc822 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c8:71:80,bridge_name='br-int',has_traffic_filtering=True,id=44437e9e-7bcf-4942-83a0-cb6139413a8e,network=Network(473b4e99-4018-4fa7-ab1c-2d3e7944d850),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap44437e9e-7b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.998 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:53 compute-0 nova_compute[183075]: 2026-01-22 17:08:53.998 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44437e9e-7b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:54 compute-0 nova_compute[183075]: 2026-01-22 17:08:54.003 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:54.003 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6624b01a-e7ac-48ee-88c2-31d19fd63038]: (4, ('Thu Jan 22 05:08:53 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850 (d57a73aad38b21a6f0273d0b23e99556d1dbeb69953d3deb8b92141d2bdbe91a)\nd57a73aad38b21a6f0273d0b23e99556d1dbeb69953d3deb8b92141d2bdbe91a\nThu Jan 22 05:08:53 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850 (d57a73aad38b21a6f0273d0b23e99556d1dbeb69953d3deb8b92141d2bdbe91a)\nd57a73aad38b21a6f0273d0b23e99556d1dbeb69953d3deb8b92141d2bdbe91a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:54 compute-0 nova_compute[183075]: 2026-01-22 17:08:54.005 183079 INFO os_vif [None req-872fb4e8-ee36-4fed-8b4f-01ff29ebc822 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c8:71:80,bridge_name='br-int',has_traffic_filtering=True,id=44437e9e-7bcf-4942-83a0-cb6139413a8e,network=Network(473b4e99-4018-4fa7-ab1c-2d3e7944d850),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap44437e9e-7b')
Jan 22 17:08:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:54.005 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a9006cc7-d64a-4a62-ad4c-7a84b822c272]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:54 compute-0 nova_compute[183075]: 2026-01-22 17:08:54.005 183079 INFO nova.virt.libvirt.driver [None req-872fb4e8-ee36-4fed-8b4f-01ff29ebc822 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Deleting instance files /var/lib/nova/instances/effaddee-27ef-49f6-ac5f-2e3258c8d5d2_del
Jan 22 17:08:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:54.006 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap473b4e99-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:54 compute-0 nova_compute[183075]: 2026-01-22 17:08:54.006 183079 INFO nova.virt.libvirt.driver [None req-872fb4e8-ee36-4fed-8b4f-01ff29ebc822 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Deletion of /var/lib/nova/instances/effaddee-27ef-49f6-ac5f-2e3258c8d5d2_del complete
Jan 22 17:08:54 compute-0 kernel: tap473b4e99-40: left promiscuous mode
Jan 22 17:08:54 compute-0 nova_compute[183075]: 2026-01-22 17:08:54.014 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:54 compute-0 nova_compute[183075]: 2026-01-22 17:08:54.016 183079 DEBUG oslo_concurrency.processutils [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/1fa7475b-9f51-4229-8ded-3a0c4de806c5/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:08:54 compute-0 nova_compute[183075]: 2026-01-22 17:08:54.016 183079 DEBUG oslo_concurrency.lockutils [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:54 compute-0 nova_compute[183075]: 2026-01-22 17:08:54.016 183079 DEBUG oslo_concurrency.processutils [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:08:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:54.029 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d1583299-4b52-44b0-b81d-16bb0aa08dd7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:54 compute-0 nova_compute[183075]: 2026-01-22 17:08:54.031 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:54.046 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[df1618ee-e2f5-4041-a58f-dac7f9a0ac06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:54.047 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[509a3896-b266-4114-9f43-6212a94aa91c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:54.062 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8689b211-5370-4825-9715-681ae48d6953]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399435, 'reachable_time': 22435, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216861, 'error': None, 'target': 'ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:54.065 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-473b4e99-4018-4fa7-ab1c-2d3e7944d850 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:08:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:54.065 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[59916442-47e7-452c-8e5f-6aabefeceb64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:54.065 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 44437e9e-7bcf-4942-83a0-cb6139413a8e in datapath 473b4e99-4018-4fa7-ab1c-2d3e7944d850 unbound from our chassis
Jan 22 17:08:54 compute-0 systemd[1]: run-netns-ovnmeta\x2d473b4e99\x2d4018\x2d4fa7\x2dab1c\x2d2d3e7944d850.mount: Deactivated successfully.
Jan 22 17:08:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:54.067 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 473b4e99-4018-4fa7-ab1c-2d3e7944d850, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:08:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:54.067 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[5433052b-3ade-4ac3-b9e2-5dfc7456072f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:54.068 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 44437e9e-7bcf-4942-83a0-cb6139413a8e in datapath 473b4e99-4018-4fa7-ab1c-2d3e7944d850 unbound from our chassis
Jan 22 17:08:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:54.069 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 473b4e99-4018-4fa7-ab1c-2d3e7944d850, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:08:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:54.069 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c67c4b19-70e1-41f3-aa57-eb86e3e9f763]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:54 compute-0 nova_compute[183075]: 2026-01-22 17:08:54.070 183079 DEBUG oslo_concurrency.processutils [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:08:54 compute-0 nova_compute[183075]: 2026-01-22 17:08:54.070 183079 DEBUG nova.virt.disk.api [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Checking if we can resize image /var/lib/nova/instances/1fa7475b-9f51-4229-8ded-3a0c4de806c5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:08:54 compute-0 nova_compute[183075]: 2026-01-22 17:08:54.071 183079 DEBUG oslo_concurrency.processutils [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fa7475b-9f51-4229-8ded-3a0c4de806c5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:08:54 compute-0 nova_compute[183075]: 2026-01-22 17:08:54.089 183079 INFO nova.compute.manager [None req-872fb4e8-ee36-4fed-8b4f-01ff29ebc822 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Took 0.42 seconds to destroy the instance on the hypervisor.
Jan 22 17:08:54 compute-0 nova_compute[183075]: 2026-01-22 17:08:54.090 183079 DEBUG oslo.service.loopingcall [None req-872fb4e8-ee36-4fed-8b4f-01ff29ebc822 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:08:54 compute-0 nova_compute[183075]: 2026-01-22 17:08:54.090 183079 DEBUG nova.compute.manager [-] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:08:54 compute-0 nova_compute[183075]: 2026-01-22 17:08:54.090 183079 DEBUG nova.network.neutron [-] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:08:54 compute-0 nova_compute[183075]: 2026-01-22 17:08:54.123 183079 DEBUG oslo_concurrency.processutils [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fa7475b-9f51-4229-8ded-3a0c4de806c5/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:08:54 compute-0 nova_compute[183075]: 2026-01-22 17:08:54.124 183079 DEBUG nova.virt.disk.api [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Cannot resize image /var/lib/nova/instances/1fa7475b-9f51-4229-8ded-3a0c4de806c5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:08:54 compute-0 nova_compute[183075]: 2026-01-22 17:08:54.124 183079 DEBUG nova.objects.instance [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Lazy-loading 'migration_context' on Instance uuid 1fa7475b-9f51-4229-8ded-3a0c4de806c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:08:54 compute-0 nova_compute[183075]: 2026-01-22 17:08:54.142 183079 DEBUG nova.virt.libvirt.driver [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:08:54 compute-0 nova_compute[183075]: 2026-01-22 17:08:54.142 183079 DEBUG nova.virt.libvirt.driver [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Ensure instance console log exists: /var/lib/nova/instances/1fa7475b-9f51-4229-8ded-3a0c4de806c5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:08:54 compute-0 nova_compute[183075]: 2026-01-22 17:08:54.143 183079 DEBUG oslo_concurrency.lockutils [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:54 compute-0 nova_compute[183075]: 2026-01-22 17:08:54.143 183079 DEBUG oslo_concurrency.lockutils [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:54 compute-0 nova_compute[183075]: 2026-01-22 17:08:54.143 183079 DEBUG oslo_concurrency.lockutils [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:54 compute-0 nova_compute[183075]: 2026-01-22 17:08:54.224 183079 DEBUG nova.compute.manager [req-63b0672d-0939-40db-b2fe-c61d3fc9793c req-451c23b6-b674-4977-bad7-d27e43dc1ddf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Received event network-vif-unplugged-44437e9e-7bcf-4942-83a0-cb6139413a8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:08:54 compute-0 nova_compute[183075]: 2026-01-22 17:08:54.224 183079 DEBUG oslo_concurrency.lockutils [req-63b0672d-0939-40db-b2fe-c61d3fc9793c req-451c23b6-b674-4977-bad7-d27e43dc1ddf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:54 compute-0 nova_compute[183075]: 2026-01-22 17:08:54.224 183079 DEBUG oslo_concurrency.lockutils [req-63b0672d-0939-40db-b2fe-c61d3fc9793c req-451c23b6-b674-4977-bad7-d27e43dc1ddf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:54 compute-0 nova_compute[183075]: 2026-01-22 17:08:54.225 183079 DEBUG oslo_concurrency.lockutils [req-63b0672d-0939-40db-b2fe-c61d3fc9793c req-451c23b6-b674-4977-bad7-d27e43dc1ddf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:54 compute-0 nova_compute[183075]: 2026-01-22 17:08:54.225 183079 DEBUG nova.compute.manager [req-63b0672d-0939-40db-b2fe-c61d3fc9793c req-451c23b6-b674-4977-bad7-d27e43dc1ddf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] No waiting events found dispatching network-vif-unplugged-44437e9e-7bcf-4942-83a0-cb6139413a8e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:08:54 compute-0 nova_compute[183075]: 2026-01-22 17:08:54.225 183079 DEBUG nova.compute.manager [req-63b0672d-0939-40db-b2fe-c61d3fc9793c req-451c23b6-b674-4977-bad7-d27e43dc1ddf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Received event network-vif-unplugged-44437e9e-7bcf-4942-83a0-cb6139413a8e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:08:54 compute-0 nova_compute[183075]: 2026-01-22 17:08:54.923 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:55 compute-0 nova_compute[183075]: 2026-01-22 17:08:55.245 183079 DEBUG nova.network.neutron [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Successfully created port: d53b3c49-e24c-4e07-944d-d72baf3994e0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:08:55 compute-0 nova_compute[183075]: 2026-01-22 17:08:55.984 183079 DEBUG nova.network.neutron [-] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.004 183079 INFO nova.compute.manager [-] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Took 1.91 seconds to deallocate network for instance.
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.054 183079 DEBUG oslo_concurrency.lockutils [None req-872fb4e8-ee36-4fed-8b4f-01ff29ebc822 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.054 183079 DEBUG oslo_concurrency.lockutils [None req-872fb4e8-ee36-4fed-8b4f-01ff29ebc822 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.101 183079 DEBUG nova.network.neutron [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Successfully updated port: d53b3c49-e24c-4e07-944d-d72baf3994e0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.158 183079 DEBUG oslo_concurrency.lockutils [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Acquiring lock "refresh_cache-1fa7475b-9f51-4229-8ded-3a0c4de806c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.158 183079 DEBUG oslo_concurrency.lockutils [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Acquired lock "refresh_cache-1fa7475b-9f51-4229-8ded-3a0c4de806c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.159 183079 DEBUG nova.network.neutron [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.216 183079 DEBUG nova.compute.provider_tree [None req-872fb4e8-ee36-4fed-8b4f-01ff29ebc822 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.221 183079 DEBUG nova.compute.manager [req-74364e86-c0a8-4a35-9678-96037237e056 req-f9274e6b-be9f-400c-bad9-de7a1c0ddbc7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Received event network-changed-65d8ece3-00e3-43f9-8231-6893ea4cf9a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.222 183079 DEBUG nova.compute.manager [req-74364e86-c0a8-4a35-9678-96037237e056 req-f9274e6b-be9f-400c-bad9-de7a1c0ddbc7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Refreshing instance network info cache due to event network-changed-65d8ece3-00e3-43f9-8231-6893ea4cf9a4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.222 183079 DEBUG oslo_concurrency.lockutils [req-74364e86-c0a8-4a35-9678-96037237e056 req-f9274e6b-be9f-400c-bad9-de7a1c0ddbc7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-4bb7efdc-59ab-46cd-ae0d-582182c85f5b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.222 183079 DEBUG oslo_concurrency.lockutils [req-74364e86-c0a8-4a35-9678-96037237e056 req-f9274e6b-be9f-400c-bad9-de7a1c0ddbc7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-4bb7efdc-59ab-46cd-ae0d-582182c85f5b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.223 183079 DEBUG nova.network.neutron [req-74364e86-c0a8-4a35-9678-96037237e056 req-f9274e6b-be9f-400c-bad9-de7a1c0ddbc7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Refreshing network info cache for port 65d8ece3-00e3-43f9-8231-6893ea4cf9a4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.231 183079 DEBUG nova.scheduler.client.report [None req-872fb4e8-ee36-4fed-8b4f-01ff29ebc822 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.255 183079 DEBUG oslo_concurrency.lockutils [None req-872fb4e8-ee36-4fed-8b4f-01ff29ebc822 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.201s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.294 183079 INFO nova.scheduler.client.report [None req-872fb4e8-ee36-4fed-8b4f-01ff29ebc822 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Deleted allocations for instance effaddee-27ef-49f6-ac5f-2e3258c8d5d2
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.332 183079 DEBUG nova.compute.manager [req-4d38b0b5-5c7d-4929-92c1-926502f443f4 req-5fe9583f-2127-47c3-b065-bec2139e1e52 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Received event network-vif-plugged-44437e9e-7bcf-4942-83a0-cb6139413a8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.333 183079 DEBUG oslo_concurrency.lockutils [req-4d38b0b5-5c7d-4929-92c1-926502f443f4 req-5fe9583f-2127-47c3-b065-bec2139e1e52 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.333 183079 DEBUG oslo_concurrency.lockutils [req-4d38b0b5-5c7d-4929-92c1-926502f443f4 req-5fe9583f-2127-47c3-b065-bec2139e1e52 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.333 183079 DEBUG oslo_concurrency.lockutils [req-4d38b0b5-5c7d-4929-92c1-926502f443f4 req-5fe9583f-2127-47c3-b065-bec2139e1e52 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.333 183079 DEBUG nova.compute.manager [req-4d38b0b5-5c7d-4929-92c1-926502f443f4 req-5fe9583f-2127-47c3-b065-bec2139e1e52 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] No waiting events found dispatching network-vif-plugged-44437e9e-7bcf-4942-83a0-cb6139413a8e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.333 183079 WARNING nova.compute.manager [req-4d38b0b5-5c7d-4929-92c1-926502f443f4 req-5fe9583f-2127-47c3-b065-bec2139e1e52 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Received unexpected event network-vif-plugged-44437e9e-7bcf-4942-83a0-cb6139413a8e for instance with vm_state deleted and task_state None.
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.333 183079 DEBUG nova.compute.manager [req-4d38b0b5-5c7d-4929-92c1-926502f443f4 req-5fe9583f-2127-47c3-b065-bec2139e1e52 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Received event network-vif-plugged-44437e9e-7bcf-4942-83a0-cb6139413a8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.334 183079 DEBUG oslo_concurrency.lockutils [req-4d38b0b5-5c7d-4929-92c1-926502f443f4 req-5fe9583f-2127-47c3-b065-bec2139e1e52 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.334 183079 DEBUG oslo_concurrency.lockutils [req-4d38b0b5-5c7d-4929-92c1-926502f443f4 req-5fe9583f-2127-47c3-b065-bec2139e1e52 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.334 183079 DEBUG oslo_concurrency.lockutils [req-4d38b0b5-5c7d-4929-92c1-926502f443f4 req-5fe9583f-2127-47c3-b065-bec2139e1e52 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.334 183079 DEBUG nova.compute.manager [req-4d38b0b5-5c7d-4929-92c1-926502f443f4 req-5fe9583f-2127-47c3-b065-bec2139e1e52 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] No waiting events found dispatching network-vif-plugged-44437e9e-7bcf-4942-83a0-cb6139413a8e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.334 183079 WARNING nova.compute.manager [req-4d38b0b5-5c7d-4929-92c1-926502f443f4 req-5fe9583f-2127-47c3-b065-bec2139e1e52 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Received unexpected event network-vif-plugged-44437e9e-7bcf-4942-83a0-cb6139413a8e for instance with vm_state deleted and task_state None.
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.334 183079 DEBUG nova.compute.manager [req-4d38b0b5-5c7d-4929-92c1-926502f443f4 req-5fe9583f-2127-47c3-b065-bec2139e1e52 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Received event network-vif-plugged-44437e9e-7bcf-4942-83a0-cb6139413a8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.335 183079 DEBUG oslo_concurrency.lockutils [req-4d38b0b5-5c7d-4929-92c1-926502f443f4 req-5fe9583f-2127-47c3-b065-bec2139e1e52 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.335 183079 DEBUG oslo_concurrency.lockutils [req-4d38b0b5-5c7d-4929-92c1-926502f443f4 req-5fe9583f-2127-47c3-b065-bec2139e1e52 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.335 183079 DEBUG oslo_concurrency.lockutils [req-4d38b0b5-5c7d-4929-92c1-926502f443f4 req-5fe9583f-2127-47c3-b065-bec2139e1e52 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.335 183079 DEBUG nova.compute.manager [req-4d38b0b5-5c7d-4929-92c1-926502f443f4 req-5fe9583f-2127-47c3-b065-bec2139e1e52 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] No waiting events found dispatching network-vif-plugged-44437e9e-7bcf-4942-83a0-cb6139413a8e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.335 183079 WARNING nova.compute.manager [req-4d38b0b5-5c7d-4929-92c1-926502f443f4 req-5fe9583f-2127-47c3-b065-bec2139e1e52 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Received unexpected event network-vif-plugged-44437e9e-7bcf-4942-83a0-cb6139413a8e for instance with vm_state deleted and task_state None.
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.336 183079 DEBUG nova.compute.manager [req-4d38b0b5-5c7d-4929-92c1-926502f443f4 req-5fe9583f-2127-47c3-b065-bec2139e1e52 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Received event network-vif-plugged-44437e9e-7bcf-4942-83a0-cb6139413a8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.336 183079 DEBUG oslo_concurrency.lockutils [req-4d38b0b5-5c7d-4929-92c1-926502f443f4 req-5fe9583f-2127-47c3-b065-bec2139e1e52 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.336 183079 DEBUG oslo_concurrency.lockutils [req-4d38b0b5-5c7d-4929-92c1-926502f443f4 req-5fe9583f-2127-47c3-b065-bec2139e1e52 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.336 183079 DEBUG oslo_concurrency.lockutils [req-4d38b0b5-5c7d-4929-92c1-926502f443f4 req-5fe9583f-2127-47c3-b065-bec2139e1e52 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.336 183079 DEBUG nova.compute.manager [req-4d38b0b5-5c7d-4929-92c1-926502f443f4 req-5fe9583f-2127-47c3-b065-bec2139e1e52 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] No waiting events found dispatching network-vif-plugged-44437e9e-7bcf-4942-83a0-cb6139413a8e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.336 183079 WARNING nova.compute.manager [req-4d38b0b5-5c7d-4929-92c1-926502f443f4 req-5fe9583f-2127-47c3-b065-bec2139e1e52 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Received unexpected event network-vif-plugged-44437e9e-7bcf-4942-83a0-cb6139413a8e for instance with vm_state deleted and task_state None.
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.336 183079 DEBUG nova.compute.manager [req-4d38b0b5-5c7d-4929-92c1-926502f443f4 req-5fe9583f-2127-47c3-b065-bec2139e1e52 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Received event network-changed-d53b3c49-e24c-4e07-944d-d72baf3994e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.337 183079 DEBUG nova.compute.manager [req-4d38b0b5-5c7d-4929-92c1-926502f443f4 req-5fe9583f-2127-47c3-b065-bec2139e1e52 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Refreshing instance network info cache due to event network-changed-d53b3c49-e24c-4e07-944d-d72baf3994e0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.337 183079 DEBUG oslo_concurrency.lockutils [req-4d38b0b5-5c7d-4929-92c1-926502f443f4 req-5fe9583f-2127-47c3-b065-bec2139e1e52 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-1fa7475b-9f51-4229-8ded-3a0c4de806c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.388 183079 DEBUG oslo_concurrency.lockutils [None req-872fb4e8-ee36-4fed-8b4f-01ff29ebc822 d7ee6c51c2b8447baefccea20fa16de5 eb8e9f7a891a4a38af8b01557eddc991 - - default default] Lock "effaddee-27ef-49f6-ac5f-2e3258c8d5d2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.723s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.389 183079 DEBUG nova.network.neutron [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:08:56 compute-0 nova_compute[183075]: 2026-01-22 17:08:56.967 183079 INFO nova.compute.manager [None req-bbe02ebb-70cd-4eda-ac6e-a47a487645ea cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Get console output
Jan 22 17:08:57 compute-0 ovn_controller[95372]: 2026-01-22T17:08:57Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c6:77:15 10.100.0.12
Jan 22 17:08:57 compute-0 ovn_controller[95372]: 2026-01-22T17:08:57Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c6:77:15 10.100.0.12
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.286 183079 DEBUG nova.network.neutron [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Updating instance_info_cache with network_info: [{"id": "d53b3c49-e24c-4e07-944d-d72baf3994e0", "address": "fa:16:3e:72:f9:88", "network": {"id": "a2733777-0394-47df-88c8-302fae8b0aef", "bridge": "br-int", "label": "tempest-test-network--1786359447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a08acddaa8b748f8b4fbb432b95408d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd53b3c49-e2", "ovs_interfaceid": "d53b3c49-e24c-4e07-944d-d72baf3994e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.306 183079 DEBUG oslo_concurrency.lockutils [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Releasing lock "refresh_cache-1fa7475b-9f51-4229-8ded-3a0c4de806c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.306 183079 DEBUG nova.compute.manager [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Instance network_info: |[{"id": "d53b3c49-e24c-4e07-944d-d72baf3994e0", "address": "fa:16:3e:72:f9:88", "network": {"id": "a2733777-0394-47df-88c8-302fae8b0aef", "bridge": "br-int", "label": "tempest-test-network--1786359447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a08acddaa8b748f8b4fbb432b95408d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd53b3c49-e2", "ovs_interfaceid": "d53b3c49-e24c-4e07-944d-d72baf3994e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.306 183079 DEBUG oslo_concurrency.lockutils [req-4d38b0b5-5c7d-4929-92c1-926502f443f4 req-5fe9583f-2127-47c3-b065-bec2139e1e52 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-1fa7475b-9f51-4229-8ded-3a0c4de806c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.307 183079 DEBUG nova.network.neutron [req-4d38b0b5-5c7d-4929-92c1-926502f443f4 req-5fe9583f-2127-47c3-b065-bec2139e1e52 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Refreshing network info cache for port d53b3c49-e24c-4e07-944d-d72baf3994e0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.309 183079 DEBUG nova.virt.libvirt.driver [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Start _get_guest_xml network_info=[{"id": "d53b3c49-e24c-4e07-944d-d72baf3994e0", "address": "fa:16:3e:72:f9:88", "network": {"id": "a2733777-0394-47df-88c8-302fae8b0aef", "bridge": "br-int", "label": "tempest-test-network--1786359447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a08acddaa8b748f8b4fbb432b95408d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd53b3c49-e2", "ovs_interfaceid": "d53b3c49-e24c-4e07-944d-d72baf3994e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.312 183079 WARNING nova.virt.libvirt.driver [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.316 183079 DEBUG nova.virt.libvirt.host [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.317 183079 DEBUG nova.virt.libvirt.host [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.320 183079 DEBUG nova.virt.libvirt.host [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.320 183079 DEBUG nova.virt.libvirt.host [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.321 183079 DEBUG nova.virt.libvirt.driver [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.321 183079 DEBUG nova.virt.hardware [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.321 183079 DEBUG nova.virt.hardware [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.321 183079 DEBUG nova.virt.hardware [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.321 183079 DEBUG nova.virt.hardware [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.322 183079 DEBUG nova.virt.hardware [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.322 183079 DEBUG nova.virt.hardware [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.322 183079 DEBUG nova.virt.hardware [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.322 183079 DEBUG nova.virt.hardware [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.322 183079 DEBUG nova.virt.hardware [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.322 183079 DEBUG nova.virt.hardware [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.322 183079 DEBUG nova.virt.hardware [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.325 183079 DEBUG nova.virt.libvirt.vif [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:08:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-901772077',display_name='tempest-server-test-901772077',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-901772077',id=9,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHQuJuvkIPPJyGB3O37xHb6l4WpEoydCT10tXlWa50WKy7gHgurcOWRMNAPM4HnhDWmUgLmLU1COdO9GFsKDe7/yg9HhZhk7hp24KACIdwwfi6BYaCn9sbAGhcwglP9yYw==',key_name='tempest-keypair-test-1152764916',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a08acddaa8b748f8b4fbb432b95408d1',ramdisk_id='',reservation_id='r-hxqj1zr5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='vir
tio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NetworkBasicTest-330550171',owner_user_name='tempest-NetworkBasicTest-330550171-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:08:53Z,user_data=None,user_id='56c736d5c1ab41d8a02fcbc021d28353',uuid=1fa7475b-9f51-4229-8ded-3a0c4de806c5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d53b3c49-e24c-4e07-944d-d72baf3994e0", "address": "fa:16:3e:72:f9:88", "network": {"id": "a2733777-0394-47df-88c8-302fae8b0aef", "bridge": "br-int", "label": "tempest-test-network--1786359447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a08acddaa8b748f8b4fbb432b95408d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd53b3c49-e2", "ovs_interfaceid": "d53b3c49-e24c-4e07-944d-d72baf3994e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.326 183079 DEBUG nova.network.os_vif_util [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Converting VIF {"id": "d53b3c49-e24c-4e07-944d-d72baf3994e0", "address": "fa:16:3e:72:f9:88", "network": {"id": "a2733777-0394-47df-88c8-302fae8b0aef", "bridge": "br-int", "label": "tempest-test-network--1786359447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a08acddaa8b748f8b4fbb432b95408d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd53b3c49-e2", "ovs_interfaceid": "d53b3c49-e24c-4e07-944d-d72baf3994e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.328 183079 DEBUG nova.network.os_vif_util [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:72:f9:88,bridge_name='br-int',has_traffic_filtering=True,id=d53b3c49-e24c-4e07-944d-d72baf3994e0,network=Network(a2733777-0394-47df-88c8-302fae8b0aef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd53b3c49-e2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.328 183079 DEBUG nova.objects.instance [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1fa7475b-9f51-4229-8ded-3a0c4de806c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.341 183079 DEBUG nova.virt.libvirt.driver [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:08:58 compute-0 nova_compute[183075]:   <uuid>1fa7475b-9f51-4229-8ded-3a0c4de806c5</uuid>
Jan 22 17:08:58 compute-0 nova_compute[183075]:   <name>instance-00000009</name>
Jan 22 17:08:58 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:08:58 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:08:58 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:08:58 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-901772077</nova:name>
Jan 22 17:08:58 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:08:58</nova:creationTime>
Jan 22 17:08:58 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:08:58 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:08:58 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:08:58 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:08:58 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:08:58 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:08:58 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:08:58 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:08:58 compute-0 nova_compute[183075]:         <nova:user uuid="56c736d5c1ab41d8a02fcbc021d28353">tempest-NetworkBasicTest-330550171-project-member</nova:user>
Jan 22 17:08:58 compute-0 nova_compute[183075]:         <nova:project uuid="a08acddaa8b748f8b4fbb432b95408d1">tempest-NetworkBasicTest-330550171</nova:project>
Jan 22 17:08:58 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:08:58 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:08:58 compute-0 nova_compute[183075]:         <nova:port uuid="d53b3c49-e24c-4e07-944d-d72baf3994e0">
Jan 22 17:08:58 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:08:58 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:08:58 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:08:58 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <system>
Jan 22 17:08:58 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:08:58 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:08:58 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:08:58 compute-0 nova_compute[183075]:       <entry name="serial">1fa7475b-9f51-4229-8ded-3a0c4de806c5</entry>
Jan 22 17:08:58 compute-0 nova_compute[183075]:       <entry name="uuid">1fa7475b-9f51-4229-8ded-3a0c4de806c5</entry>
Jan 22 17:08:58 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     </system>
Jan 22 17:08:58 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:08:58 compute-0 nova_compute[183075]:   <os>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:   </os>
Jan 22 17:08:58 compute-0 nova_compute[183075]:   <features>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:   </features>
Jan 22 17:08:58 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:08:58 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:08:58 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:08:58 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/1fa7475b-9f51-4229-8ded-3a0c4de806c5/disk"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:08:58 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:72:f9:88"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:       <target dev="tapd53b3c49-e2"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:08:58 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/1fa7475b-9f51-4229-8ded-3a0c4de806c5/console.log" append="off"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <video>
Jan 22 17:08:58 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     </video>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:08:58 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:08:58 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:08:58 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:08:58 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:08:58 compute-0 nova_compute[183075]: </domain>
Jan 22 17:08:58 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.342 183079 DEBUG nova.compute.manager [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Preparing to wait for external event network-vif-plugged-d53b3c49-e24c-4e07-944d-d72baf3994e0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.342 183079 DEBUG oslo_concurrency.lockutils [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Acquiring lock "1fa7475b-9f51-4229-8ded-3a0c4de806c5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.343 183079 DEBUG oslo_concurrency.lockutils [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Lock "1fa7475b-9f51-4229-8ded-3a0c4de806c5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.343 183079 DEBUG oslo_concurrency.lockutils [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Lock "1fa7475b-9f51-4229-8ded-3a0c4de806c5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.344 183079 DEBUG nova.virt.libvirt.vif [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:08:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-901772077',display_name='tempest-server-test-901772077',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-901772077',id=9,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHQuJuvkIPPJyGB3O37xHb6l4WpEoydCT10tXlWa50WKy7gHgurcOWRMNAPM4HnhDWmUgLmLU1COdO9GFsKDe7/yg9HhZhk7hp24KACIdwwfi6BYaCn9sbAGhcwglP9yYw==',key_name='tempest-keypair-test-1152764916',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a08acddaa8b748f8b4fbb432b95408d1',ramdisk_id='',reservation_id='r-hxqj1zr5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_
model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NetworkBasicTest-330550171',owner_user_name='tempest-NetworkBasicTest-330550171-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:08:53Z,user_data=None,user_id='56c736d5c1ab41d8a02fcbc021d28353',uuid=1fa7475b-9f51-4229-8ded-3a0c4de806c5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d53b3c49-e24c-4e07-944d-d72baf3994e0", "address": "fa:16:3e:72:f9:88", "network": {"id": "a2733777-0394-47df-88c8-302fae8b0aef", "bridge": "br-int", "label": "tempest-test-network--1786359447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a08acddaa8b748f8b4fbb432b95408d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd53b3c49-e2", "ovs_interfaceid": "d53b3c49-e24c-4e07-944d-d72baf3994e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.344 183079 DEBUG nova.network.os_vif_util [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Converting VIF {"id": "d53b3c49-e24c-4e07-944d-d72baf3994e0", "address": "fa:16:3e:72:f9:88", "network": {"id": "a2733777-0394-47df-88c8-302fae8b0aef", "bridge": "br-int", "label": "tempest-test-network--1786359447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a08acddaa8b748f8b4fbb432b95408d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd53b3c49-e2", "ovs_interfaceid": "d53b3c49-e24c-4e07-944d-d72baf3994e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.345 183079 DEBUG nova.network.os_vif_util [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:72:f9:88,bridge_name='br-int',has_traffic_filtering=True,id=d53b3c49-e24c-4e07-944d-d72baf3994e0,network=Network(a2733777-0394-47df-88c8-302fae8b0aef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd53b3c49-e2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.345 183079 DEBUG os_vif [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:72:f9:88,bridge_name='br-int',has_traffic_filtering=True,id=d53b3c49-e24c-4e07-944d-d72baf3994e0,network=Network(a2733777-0394-47df-88c8-302fae8b0aef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd53b3c49-e2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.345 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.346 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.346 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.349 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.349 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd53b3c49-e2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.350 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd53b3c49-e2, col_values=(('external_ids', {'iface-id': 'd53b3c49-e24c-4e07-944d-d72baf3994e0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:72:f9:88', 'vm-uuid': '1fa7475b-9f51-4229-8ded-3a0c4de806c5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:58 compute-0 podman[216880]: 2026-01-22 17:08:58.353353574 +0000 UTC m=+0.062868715 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, tcib_managed=true)
Jan 22 17:08:58 compute-0 NetworkManager[55454]: <info>  [1769101738.3857] manager: (tapd53b3c49-e2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.384 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.388 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.392 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.392 183079 INFO os_vif [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:72:f9:88,bridge_name='br-int',has_traffic_filtering=True,id=d53b3c49-e24c-4e07-944d-d72baf3994e0,network=Network(a2733777-0394-47df-88c8-302fae8b0aef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd53b3c49-e2')
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.449 183079 DEBUG nova.virt.libvirt.driver [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.450 183079 DEBUG nova.virt.libvirt.driver [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] No VIF found with MAC fa:16:3e:72:f9:88, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:08:58 compute-0 NetworkManager[55454]: <info>  [1769101738.5184] manager: (tapd53b3c49-e2): new Tun device (/org/freedesktop/NetworkManager/Devices/59)
Jan 22 17:08:58 compute-0 kernel: tapd53b3c49-e2: entered promiscuous mode
Jan 22 17:08:58 compute-0 ovn_controller[95372]: 2026-01-22T17:08:58Z|00112|binding|INFO|Claiming lport d53b3c49-e24c-4e07-944d-d72baf3994e0 for this chassis.
Jan 22 17:08:58 compute-0 ovn_controller[95372]: 2026-01-22T17:08:58Z|00113|binding|INFO|d53b3c49-e24c-4e07-944d-d72baf3994e0: Claiming fa:16:3e:72:f9:88 10.100.0.7
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.523 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:58.534 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:72:f9:88 10.100.0.7'], port_security=['fa:16:3e:72:f9:88 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '1fa7475b-9f51-4229-8ded-3a0c4de806c5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a2733777-0394-47df-88c8-302fae8b0aef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a08acddaa8b748f8b4fbb432b95408d1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3c4cdf4d-bf9c-4640-a79a-ea997ee124d5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fe09b5f5-d61e-4c9e-ad28-b743a26596fc, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=d53b3c49-e24c-4e07-944d-d72baf3994e0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:58.539 104629 INFO neutron.agent.ovn.metadata.agent [-] Port d53b3c49-e24c-4e07-944d-d72baf3994e0 in datapath a2733777-0394-47df-88c8-302fae8b0aef bound to our chassis
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:58.544 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a2733777-0394-47df-88c8-302fae8b0aef
Jan 22 17:08:58 compute-0 ovn_controller[95372]: 2026-01-22T17:08:58Z|00114|binding|INFO|Setting lport d53b3c49-e24c-4e07-944d-d72baf3994e0 ovn-installed in OVS
Jan 22 17:08:58 compute-0 ovn_controller[95372]: 2026-01-22T17:08:58Z|00115|binding|INFO|Setting lport d53b3c49-e24c-4e07-944d-d72baf3994e0 up in Southbound
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:58.553 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[48e1a747-d9c7-4938-a757-e3a01d1591a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:58.555 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa2733777-01 in ovnmeta-a2733777-0394-47df-88c8-302fae8b0aef namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.555 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.556 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:58 compute-0 systemd-udevd[216917]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:58.559 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa2733777-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:58.559 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e97574de-d242-4cf5-9047-6b3df4ac96bf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:58.560 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ac1653c4-8b00-435d-b06d-596207311f54]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:58 compute-0 NetworkManager[55454]: <info>  [1769101738.5701] device (tapd53b3c49-e2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:08:58 compute-0 NetworkManager[55454]: <info>  [1769101738.5710] device (tapd53b3c49-e2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:58.570 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[6986ae4b-2797-47ee-b01f-ba542ce95405]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:58 compute-0 systemd-machined[154382]: New machine qemu-9-instance-00000009.
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:58.594 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[45e457e8-0e49-48e2-a0ab-499e3064395f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:58 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-00000009.
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:58.623 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[efbee72c-01f8-42a8-bfc8-d1696d142926]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:58.628 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c9297df9-2387-4af2-a668-2b98b4a112c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:58 compute-0 NetworkManager[55454]: <info>  [1769101738.6312] manager: (tapa2733777-00): new Veth device (/org/freedesktop/NetworkManager/Devices/60)
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:58.658 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[bf5393ac-a9ef-4715-a18e-a8cd902d2e55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:58.661 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[ede369b5-9132-4911-a689-2bed49846022]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:58 compute-0 NetworkManager[55454]: <info>  [1769101738.6839] device (tapa2733777-00): carrier: link connected
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:58.688 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[ca8079da-7e4c-499f-81dc-f684f56f9598]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:58.703 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e8eb1325-1d93-4c11-94a7-b74361e4d9b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa2733777-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:13:b0:f6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 410238, 'reachable_time': 19964, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216950, 'error': None, 'target': 'ovnmeta-a2733777-0394-47df-88c8-302fae8b0aef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:58.716 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[fba91e36-9e31-4319-a772-525a39c09164]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe13:b0f6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 410238, 'tstamp': 410238}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216951, 'error': None, 'target': 'ovnmeta-a2733777-0394-47df-88c8-302fae8b0aef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:58.733 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c11ca5ec-4030-4b04-b54c-9049437c941d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa2733777-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:13:b0:f6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 410238, 'reachable_time': 19964, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216952, 'error': None, 'target': 'ovnmeta-a2733777-0394-47df-88c8-302fae8b0aef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:58.759 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[694a62c5-e1fc-445e-bffe-71d5d60837c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.828 183079 DEBUG nova.network.neutron [req-74364e86-c0a8-4a35-9678-96037237e056 req-f9274e6b-be9f-400c-bad9-de7a1c0ddbc7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Updated VIF entry in instance network info cache for port 65d8ece3-00e3-43f9-8231-6893ea4cf9a4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.829 183079 DEBUG nova.network.neutron [req-74364e86-c0a8-4a35-9678-96037237e056 req-f9274e6b-be9f-400c-bad9-de7a1c0ddbc7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Updating instance_info_cache with network_info: [{"id": "65d8ece3-00e3-43f9-8231-6893ea4cf9a4", "address": "fa:16:3e:eb:3e:ac", "network": {"id": "9fe870b5-173a-4c8a-b406-6fedb3ddcc4e", "bridge": "br-int", "label": "tempest-test-network--1353019071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "572446eea54c4f0baab88bf4419b4082", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65d8ece3-00", "ovs_interfaceid": "65d8ece3-00e3-43f9-8231-6893ea4cf9a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:58.829 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[fa6d2251-9c27-46ab-a812-64d147c54289]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:58.830 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2733777-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:58.831 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:58.831 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa2733777-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.832 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:58 compute-0 kernel: tapa2733777-00: entered promiscuous mode
Jan 22 17:08:58 compute-0 NetworkManager[55454]: <info>  [1769101738.8351] manager: (tapa2733777-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:58.836 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa2733777-00, col_values=(('external_ids', {'iface-id': '6ec27c8a-b564-4b6c-b0c1-5475212e439c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.837 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:58 compute-0 ovn_controller[95372]: 2026-01-22T17:08:58Z|00116|binding|INFO|Releasing lport 6ec27c8a-b564-4b6c-b0c1-5475212e439c from this chassis (sb_readonly=0)
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.852 183079 DEBUG oslo_concurrency.lockutils [req-74364e86-c0a8-4a35-9678-96037237e056 req-f9274e6b-be9f-400c-bad9-de7a1c0ddbc7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-4bb7efdc-59ab-46cd-ae0d-582182c85f5b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:08:58 compute-0 nova_compute[183075]: 2026-01-22 17:08:58.854 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:58.856 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a2733777-0394-47df-88c8-302fae8b0aef.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a2733777-0394-47df-88c8-302fae8b0aef.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:58.857 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[fe47bc39-f4f6-44e5-8e72-549b6543f175]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:58.858 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-a2733777-0394-47df-88c8-302fae8b0aef
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/a2733777-0394-47df-88c8-302fae8b0aef.pid.haproxy
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID a2733777-0394-47df-88c8-302fae8b0aef
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:08:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:08:58.860 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a2733777-0394-47df-88c8-302fae8b0aef', 'env', 'PROCESS_TAG=haproxy-a2733777-0394-47df-88c8-302fae8b0aef', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a2733777-0394-47df-88c8-302fae8b0aef.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.164 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101739.1638954, 1fa7475b-9f51-4229-8ded-3a0c4de806c5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.165 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] VM Started (Lifecycle Event)
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.196 183079 DEBUG nova.compute.manager [req-acfa4580-28b7-4dcb-9e3a-a0d87c8b42b3 req-e28eef9d-aad6-4d72-b399-215db9f042b2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Received event network-vif-plugged-d53b3c49-e24c-4e07-944d-d72baf3994e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.197 183079 DEBUG oslo_concurrency.lockutils [req-acfa4580-28b7-4dcb-9e3a-a0d87c8b42b3 req-e28eef9d-aad6-4d72-b399-215db9f042b2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "1fa7475b-9f51-4229-8ded-3a0c4de806c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.197 183079 DEBUG oslo_concurrency.lockutils [req-acfa4580-28b7-4dcb-9e3a-a0d87c8b42b3 req-e28eef9d-aad6-4d72-b399-215db9f042b2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "1fa7475b-9f51-4229-8ded-3a0c4de806c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.197 183079 DEBUG oslo_concurrency.lockutils [req-acfa4580-28b7-4dcb-9e3a-a0d87c8b42b3 req-e28eef9d-aad6-4d72-b399-215db9f042b2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "1fa7475b-9f51-4229-8ded-3a0c4de806c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.197 183079 DEBUG nova.compute.manager [req-acfa4580-28b7-4dcb-9e3a-a0d87c8b42b3 req-e28eef9d-aad6-4d72-b399-215db9f042b2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Processing event network-vif-plugged-d53b3c49-e24c-4e07-944d-d72baf3994e0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.198 183079 DEBUG nova.compute.manager [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.199 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.204 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.215 183079 DEBUG nova.virt.libvirt.driver [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.218 183079 INFO nova.virt.libvirt.driver [-] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Instance spawned successfully.
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.218 183079 DEBUG nova.virt.libvirt.driver [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.226 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.227 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101739.1640778, 1fa7475b-9f51-4229-8ded-3a0c4de806c5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.227 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] VM Paused (Lifecycle Event)
Jan 22 17:08:59 compute-0 podman[216991]: 2026-01-22 17:08:59.231349772 +0000 UTC m=+0.052830222 container create f5c7be3cd8cf91f488b76e6d98ff905752304abfae464bcffcf5594c59e3cf63 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2733777-0394-47df-88c8-302fae8b0aef, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.238 183079 DEBUG nova.virt.libvirt.driver [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.238 183079 DEBUG nova.virt.libvirt.driver [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.239 183079 DEBUG nova.virt.libvirt.driver [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.240 183079 DEBUG nova.virt.libvirt.driver [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.241 183079 DEBUG nova.virt.libvirt.driver [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.241 183079 DEBUG nova.virt.libvirt.driver [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.248 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.252 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101739.2017083, 1fa7475b-9f51-4229-8ded-3a0c4de806c5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.252 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] VM Resumed (Lifecycle Event)
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.281 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:08:59 compute-0 systemd[1]: Started libpod-conmon-f5c7be3cd8cf91f488b76e6d98ff905752304abfae464bcffcf5594c59e3cf63.scope.
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.284 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:08:59 compute-0 podman[216991]: 2026-01-22 17:08:59.202640151 +0000 UTC m=+0.024120621 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.310 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:08:59 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:08:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ea4763da4ca0b3487e0caf7be75a3f64ec6586a9c55f7d467a972ce94a3f654f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.319 183079 INFO nova.compute.manager [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Took 5.54 seconds to spawn the instance on the hypervisor.
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.320 183079 DEBUG nova.compute.manager [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:08:59 compute-0 podman[216991]: 2026-01-22 17:08:59.333157444 +0000 UTC m=+0.154637924 container init f5c7be3cd8cf91f488b76e6d98ff905752304abfae464bcffcf5594c59e3cf63 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2733777-0394-47df-88c8-302fae8b0aef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:08:59 compute-0 podman[216991]: 2026-01-22 17:08:59.339445289 +0000 UTC m=+0.160925729 container start f5c7be3cd8cf91f488b76e6d98ff905752304abfae464bcffcf5594c59e3cf63 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2733777-0394-47df-88c8-302fae8b0aef, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 22 17:08:59 compute-0 neutron-haproxy-ovnmeta-a2733777-0394-47df-88c8-302fae8b0aef[217006]: [NOTICE]   (217010) : New worker (217012) forked
Jan 22 17:08:59 compute-0 neutron-haproxy-ovnmeta-a2733777-0394-47df-88c8-302fae8b0aef[217006]: [NOTICE]   (217010) : Loading success.
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.381 183079 INFO nova.compute.manager [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Took 6.12 seconds to build instance.
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.397 183079 DEBUG oslo_concurrency.lockutils [None req-5673209d-7d2b-438f-97a3-29b9e4c9cd5f 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Lock "1fa7475b-9f51-4229-8ded-3a0c4de806c5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.511 183079 DEBUG nova.network.neutron [req-4d38b0b5-5c7d-4929-92c1-926502f443f4 req-5fe9583f-2127-47c3-b065-bec2139e1e52 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Updated VIF entry in instance network info cache for port d53b3c49-e24c-4e07-944d-d72baf3994e0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.512 183079 DEBUG nova.network.neutron [req-4d38b0b5-5c7d-4929-92c1-926502f443f4 req-5fe9583f-2127-47c3-b065-bec2139e1e52 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Updating instance_info_cache with network_info: [{"id": "d53b3c49-e24c-4e07-944d-d72baf3994e0", "address": "fa:16:3e:72:f9:88", "network": {"id": "a2733777-0394-47df-88c8-302fae8b0aef", "bridge": "br-int", "label": "tempest-test-network--1786359447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a08acddaa8b748f8b4fbb432b95408d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd53b3c49-e2", "ovs_interfaceid": "d53b3c49-e24c-4e07-944d-d72baf3994e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:08:59 compute-0 nova_compute[183075]: 2026-01-22 17:08:59.925 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:00 compute-0 nova_compute[183075]: 2026-01-22 17:09:00.178 183079 DEBUG oslo_concurrency.lockutils [req-4d38b0b5-5c7d-4929-92c1-926502f443f4 req-5fe9583f-2127-47c3-b065-bec2139e1e52 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-1fa7475b-9f51-4229-8ded-3a0c4de806c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:09:00 compute-0 nova_compute[183075]: 2026-01-22 17:09:00.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:09:00 compute-0 nova_compute[183075]: 2026-01-22 17:09:00.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 17:09:01 compute-0 nova_compute[183075]: 2026-01-22 17:09:01.580 183079 DEBUG nova.compute.manager [req-70579ea2-00a4-43d7-907a-682e555568fd req-54b9990f-e8d2-43b6-914b-6ca28517c543 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Received event network-vif-plugged-d53b3c49-e24c-4e07-944d-d72baf3994e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:09:01 compute-0 nova_compute[183075]: 2026-01-22 17:09:01.580 183079 DEBUG oslo_concurrency.lockutils [req-70579ea2-00a4-43d7-907a-682e555568fd req-54b9990f-e8d2-43b6-914b-6ca28517c543 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "1fa7475b-9f51-4229-8ded-3a0c4de806c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:01 compute-0 nova_compute[183075]: 2026-01-22 17:09:01.581 183079 DEBUG oslo_concurrency.lockutils [req-70579ea2-00a4-43d7-907a-682e555568fd req-54b9990f-e8d2-43b6-914b-6ca28517c543 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "1fa7475b-9f51-4229-8ded-3a0c4de806c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:01 compute-0 nova_compute[183075]: 2026-01-22 17:09:01.581 183079 DEBUG oslo_concurrency.lockutils [req-70579ea2-00a4-43d7-907a-682e555568fd req-54b9990f-e8d2-43b6-914b-6ca28517c543 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "1fa7475b-9f51-4229-8ded-3a0c4de806c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:01 compute-0 nova_compute[183075]: 2026-01-22 17:09:01.582 183079 DEBUG nova.compute.manager [req-70579ea2-00a4-43d7-907a-682e555568fd req-54b9990f-e8d2-43b6-914b-6ca28517c543 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] No waiting events found dispatching network-vif-plugged-d53b3c49-e24c-4e07-944d-d72baf3994e0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:09:01 compute-0 nova_compute[183075]: 2026-01-22 17:09:01.582 183079 WARNING nova.compute.manager [req-70579ea2-00a4-43d7-907a-682e555568fd req-54b9990f-e8d2-43b6-914b-6ca28517c543 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Received unexpected event network-vif-plugged-d53b3c49-e24c-4e07-944d-d72baf3994e0 for instance with vm_state active and task_state None.
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.115 183079 INFO nova.compute.manager [None req-b4794323-007f-4392-ba03-ac02eaec098e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Get console output
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.119 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.373 183079 INFO nova.compute.manager [None req-f568c4f1-2d5a-4c0c-bbc2-49a33f36504c 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Get console output
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.379 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.610 183079 DEBUG oslo_concurrency.lockutils [None req-e1a548fb-666f-4321-9e7a-ca872b366baa 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Acquiring lock "fa77228b-8be7-4bab-9a40-7241201bdbff" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.611 183079 DEBUG oslo_concurrency.lockutils [None req-e1a548fb-666f-4321-9e7a-ca872b366baa 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "fa77228b-8be7-4bab-9a40-7241201bdbff" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.612 183079 DEBUG oslo_concurrency.lockutils [None req-e1a548fb-666f-4321-9e7a-ca872b366baa 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Acquiring lock "fa77228b-8be7-4bab-9a40-7241201bdbff-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.612 183079 DEBUG oslo_concurrency.lockutils [None req-e1a548fb-666f-4321-9e7a-ca872b366baa 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "fa77228b-8be7-4bab-9a40-7241201bdbff-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.613 183079 DEBUG oslo_concurrency.lockutils [None req-e1a548fb-666f-4321-9e7a-ca872b366baa 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "fa77228b-8be7-4bab-9a40-7241201bdbff-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.615 183079 INFO nova.compute.manager [None req-e1a548fb-666f-4321-9e7a-ca872b366baa 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Terminating instance
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.617 183079 DEBUG nova.compute.manager [None req-e1a548fb-666f-4321-9e7a-ca872b366baa 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:09:02 compute-0 kernel: tap611e2a01-0a (unregistering): left promiscuous mode
Jan 22 17:09:02 compute-0 NetworkManager[55454]: <info>  [1769101742.6403] device (tap611e2a01-0a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:09:02 compute-0 ovn_controller[95372]: 2026-01-22T17:09:02Z|00117|binding|INFO|Releasing lport 611e2a01-0a7c-4b7f-a941-623f993a5547 from this chassis (sb_readonly=0)
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.648 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:02 compute-0 ovn_controller[95372]: 2026-01-22T17:09:02Z|00118|binding|INFO|Setting lport 611e2a01-0a7c-4b7f-a941-623f993a5547 down in Southbound
Jan 22 17:09:02 compute-0 ovn_controller[95372]: 2026-01-22T17:09:02Z|00119|binding|INFO|Removing iface tap611e2a01-0a ovn-installed in OVS
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.650 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:02.657 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:67:5c:7a 10.100.0.9'], port_security=['fa:16:3e:67:5c:7a 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'fa77228b-8be7-4bab-9a40-7241201bdbff', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6884ab5c00114ca19f253d0c91e2706f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ced556d8-3a2b-4ec1-a804-8cbb50ada768', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.248'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4f9feb03-8564-422d-a49d-142dd411b92f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=611e2a01-0a7c-4b7f-a941-623f993a5547) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:09:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:02.659 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 611e2a01-0a7c-4b7f-a941-623f993a5547 in datapath 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e unbound from our chassis
Jan 22 17:09:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:02.662 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e
Jan 22 17:09:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:02.690 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[fc009d84-d04d-4e04-8880-6bab19a59c11]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.697 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:02 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000007.scope: Deactivated successfully.
Jan 22 17:09:02 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000007.scope: Consumed 13.004s CPU time.
Jan 22 17:09:02 compute-0 systemd-machined[154382]: Machine qemu-7-instance-00000007 terminated.
Jan 22 17:09:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:02.726 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[03e60a22-e33c-40b6-a9f0-0b738cfaf13e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:02.730 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[51ad48ba-280c-46ba-bf5a-c236be871ea7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:02.754 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[48e5f611-55fb-469e-957e-c5ca4af9d7e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:02.775 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[52c5afce-7dfe-4060-aa33-cf799f9b774b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9fe870b5-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:dc:ce'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 202, 'tx_packets': 106, 'rx_bytes': 17308, 'tx_bytes': 12056, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 202, 'tx_packets': 106, 'rx_bytes': 17308, 'tx_bytes': 12056, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 404056, 'reachable_time': 30789, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217033, 'error': None, 'target': 'ovnmeta-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:02.802 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b72c42b2-74bc-40e9-ba7c-c36d45eb5510]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap9fe870b5-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 404069, 'tstamp': 404069}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217034, 'error': None, 'target': 'ovnmeta-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap9fe870b5-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 404072, 'tstamp': 404072}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217034, 'error': None, 'target': 'ovnmeta-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:02.805 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9fe870b5-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.808 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.809 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.812 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:02.813 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9fe870b5-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:09:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:02.814 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:09:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:02.814 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9fe870b5-10, col_values=(('external_ids', {'iface-id': '124a4c16-1255-4f0e-8e20-7c85f64c00e5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:09:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:02.815 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.823 183079 DEBUG nova.compute.manager [req-d2242bf6-9634-4a8d-a0cf-52b6d1e746f6 req-b06aed04-9bf6-402b-87c7-fce6b8f6aa5b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Received event network-vif-unplugged-611e2a01-0a7c-4b7f-a941-623f993a5547 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.823 183079 DEBUG oslo_concurrency.lockutils [req-d2242bf6-9634-4a8d-a0cf-52b6d1e746f6 req-b06aed04-9bf6-402b-87c7-fce6b8f6aa5b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "fa77228b-8be7-4bab-9a40-7241201bdbff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.824 183079 DEBUG oslo_concurrency.lockutils [req-d2242bf6-9634-4a8d-a0cf-52b6d1e746f6 req-b06aed04-9bf6-402b-87c7-fce6b8f6aa5b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "fa77228b-8be7-4bab-9a40-7241201bdbff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.824 183079 DEBUG oslo_concurrency.lockutils [req-d2242bf6-9634-4a8d-a0cf-52b6d1e746f6 req-b06aed04-9bf6-402b-87c7-fce6b8f6aa5b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "fa77228b-8be7-4bab-9a40-7241201bdbff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.825 183079 DEBUG nova.compute.manager [req-d2242bf6-9634-4a8d-a0cf-52b6d1e746f6 req-b06aed04-9bf6-402b-87c7-fce6b8f6aa5b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] No waiting events found dispatching network-vif-unplugged-611e2a01-0a7c-4b7f-a941-623f993a5547 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.825 183079 DEBUG nova.compute.manager [req-d2242bf6-9634-4a8d-a0cf-52b6d1e746f6 req-b06aed04-9bf6-402b-87c7-fce6b8f6aa5b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Received event network-vif-unplugged-611e2a01-0a7c-4b7f-a941-623f993a5547 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.896 183079 INFO nova.virt.libvirt.driver [-] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Instance destroyed successfully.
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.897 183079 DEBUG nova.objects.instance [None req-e1a548fb-666f-4321-9e7a-ca872b366baa 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lazy-loading 'resources' on Instance uuid fa77228b-8be7-4bab-9a40-7241201bdbff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.911 183079 DEBUG nova.virt.libvirt.vif [None req-e1a548fb-666f-4321-9e7a-ca872b366baa 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:08:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1530587158',display_name='tempest-server-test-1530587158',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1530587158',id=7,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOzuS5c+/c3nchlEjzQZcF6sVTY+If+Ronj929KhDV7B+UoVTOX4QPWtIie7F0q5o+yMVIvrndYwVXNXP5sf1WYo75cdEmtnNs2NDMwzYXACOAw67rfC/ZLmfMKPakZxTQ==',key_name='tempest-keypair-test-694600506',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:08:29Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6884ab5c00114ca19f253d0c91e2706f',ramdisk_id='',reservation_id='r-hnn1nfqy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',ima
ge_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIpTestCasesAdmin-1567259425',owner_user_name='tempest-FloatingIpTestCasesAdmin-1567259425-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:08:29Z,user_data=None,user_id='7554977cf766467891ad30986750ca88',uuid=fa77228b-8be7-4bab-9a40-7241201bdbff,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "611e2a01-0a7c-4b7f-a941-623f993a5547", "address": "fa:16:3e:67:5c:7a", "network": {"id": "9fe870b5-173a-4c8a-b406-6fedb3ddcc4e", "bridge": "br-int", "label": "tempest-test-network--1353019071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "572446eea54c4f0baab88bf4419b4082", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap611e2a01-0a", "ovs_interfaceid": "611e2a01-0a7c-4b7f-a941-623f993a5547", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.911 183079 DEBUG nova.network.os_vif_util [None req-e1a548fb-666f-4321-9e7a-ca872b366baa 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Converting VIF {"id": "611e2a01-0a7c-4b7f-a941-623f993a5547", "address": "fa:16:3e:67:5c:7a", "network": {"id": "9fe870b5-173a-4c8a-b406-6fedb3ddcc4e", "bridge": "br-int", "label": "tempest-test-network--1353019071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "572446eea54c4f0baab88bf4419b4082", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap611e2a01-0a", "ovs_interfaceid": "611e2a01-0a7c-4b7f-a941-623f993a5547", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.912 183079 DEBUG nova.network.os_vif_util [None req-e1a548fb-666f-4321-9e7a-ca872b366baa 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:67:5c:7a,bridge_name='br-int',has_traffic_filtering=True,id=611e2a01-0a7c-4b7f-a941-623f993a5547,network=Network(9fe870b5-173a-4c8a-b406-6fedb3ddcc4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap611e2a01-0a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.912 183079 DEBUG os_vif [None req-e1a548fb-666f-4321-9e7a-ca872b366baa 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:67:5c:7a,bridge_name='br-int',has_traffic_filtering=True,id=611e2a01-0a7c-4b7f-a941-623f993a5547,network=Network(9fe870b5-173a-4c8a-b406-6fedb3ddcc4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap611e2a01-0a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.914 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.914 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap611e2a01-0a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.915 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.917 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.919 183079 INFO os_vif [None req-e1a548fb-666f-4321-9e7a-ca872b366baa 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:67:5c:7a,bridge_name='br-int',has_traffic_filtering=True,id=611e2a01-0a7c-4b7f-a941-623f993a5547,network=Network(9fe870b5-173a-4c8a-b406-6fedb3ddcc4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap611e2a01-0a')
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.919 183079 INFO nova.virt.libvirt.driver [None req-e1a548fb-666f-4321-9e7a-ca872b366baa 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Deleting instance files /var/lib/nova/instances/fa77228b-8be7-4bab-9a40-7241201bdbff_del
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.920 183079 INFO nova.virt.libvirt.driver [None req-e1a548fb-666f-4321-9e7a-ca872b366baa 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Deletion of /var/lib/nova/instances/fa77228b-8be7-4bab-9a40-7241201bdbff_del complete
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.973 183079 INFO nova.compute.manager [None req-e1a548fb-666f-4321-9e7a-ca872b366baa 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Took 0.36 seconds to destroy the instance on the hypervisor.
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.973 183079 DEBUG oslo.service.loopingcall [None req-e1a548fb-666f-4321-9e7a-ca872b366baa 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.974 183079 DEBUG nova.compute.manager [-] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:09:02 compute-0 nova_compute[183075]: 2026-01-22 17:09:02.974 183079 DEBUG nova.network.neutron [-] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:09:03 compute-0 nova_compute[183075]: 2026-01-22 17:09:03.677 183079 DEBUG nova.compute.manager [req-3c940853-4ab2-41c0-b185-7b75ed294152 req-4da59581-eac0-4812-91a2-1545abd26ffe a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Received event network-changed-611e2a01-0a7c-4b7f-a941-623f993a5547 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:09:03 compute-0 nova_compute[183075]: 2026-01-22 17:09:03.677 183079 DEBUG nova.compute.manager [req-3c940853-4ab2-41c0-b185-7b75ed294152 req-4da59581-eac0-4812-91a2-1545abd26ffe a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Refreshing instance network info cache due to event network-changed-611e2a01-0a7c-4b7f-a941-623f993a5547. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:09:03 compute-0 nova_compute[183075]: 2026-01-22 17:09:03.677 183079 DEBUG oslo_concurrency.lockutils [req-3c940853-4ab2-41c0-b185-7b75ed294152 req-4da59581-eac0-4812-91a2-1545abd26ffe a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-fa77228b-8be7-4bab-9a40-7241201bdbff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:09:03 compute-0 nova_compute[183075]: 2026-01-22 17:09:03.677 183079 DEBUG oslo_concurrency.lockutils [req-3c940853-4ab2-41c0-b185-7b75ed294152 req-4da59581-eac0-4812-91a2-1545abd26ffe a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-fa77228b-8be7-4bab-9a40-7241201bdbff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:09:03 compute-0 nova_compute[183075]: 2026-01-22 17:09:03.678 183079 DEBUG nova.network.neutron [req-3c940853-4ab2-41c0-b185-7b75ed294152 req-4da59581-eac0-4812-91a2-1545abd26ffe a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Refreshing network info cache for port 611e2a01-0a7c-4b7f-a941-623f993a5547 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:09:03 compute-0 nova_compute[183075]: 2026-01-22 17:09:03.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:09:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:03.805 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:03.806 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:09:03 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:03 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:03 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:03 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:03 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:03 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:09:03 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:04 compute-0 nova_compute[183075]: 2026-01-22 17:09:04.444 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769101729.4431813, 000b64b8-bcc5-4bbe-9703-8400a83a27d0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:09:04 compute-0 nova_compute[183075]: 2026-01-22 17:09:04.445 183079 INFO nova.compute.manager [-] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] VM Stopped (Lifecycle Event)
Jan 22 17:09:04 compute-0 nova_compute[183075]: 2026-01-22 17:09:04.472 183079 DEBUG nova.compute.manager [None req-835f6511-e01c-4232-a9f0-5a38620f165d - - - - - -] [instance: 000b64b8-bcc5-4bbe-9703-8400a83a27d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:09:04 compute-0 nova_compute[183075]: 2026-01-22 17:09:04.964 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:05 compute-0 nova_compute[183075]: 2026-01-22 17:09:05.059 183079 DEBUG nova.compute.manager [req-e1595d3e-e55e-4224-9433-f7815b1301d4 req-05b3d84e-a505-439c-a96a-1e4225a83d6d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Received event network-vif-plugged-611e2a01-0a7c-4b7f-a941-623f993a5547 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:09:05 compute-0 nova_compute[183075]: 2026-01-22 17:09:05.059 183079 DEBUG oslo_concurrency.lockutils [req-e1595d3e-e55e-4224-9433-f7815b1301d4 req-05b3d84e-a505-439c-a96a-1e4225a83d6d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "fa77228b-8be7-4bab-9a40-7241201bdbff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:05 compute-0 nova_compute[183075]: 2026-01-22 17:09:05.059 183079 DEBUG oslo_concurrency.lockutils [req-e1595d3e-e55e-4224-9433-f7815b1301d4 req-05b3d84e-a505-439c-a96a-1e4225a83d6d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "fa77228b-8be7-4bab-9a40-7241201bdbff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:05 compute-0 nova_compute[183075]: 2026-01-22 17:09:05.059 183079 DEBUG oslo_concurrency.lockutils [req-e1595d3e-e55e-4224-9433-f7815b1301d4 req-05b3d84e-a505-439c-a96a-1e4225a83d6d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "fa77228b-8be7-4bab-9a40-7241201bdbff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:05 compute-0 nova_compute[183075]: 2026-01-22 17:09:05.059 183079 DEBUG nova.compute.manager [req-e1595d3e-e55e-4224-9433-f7815b1301d4 req-05b3d84e-a505-439c-a96a-1e4225a83d6d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] No waiting events found dispatching network-vif-plugged-611e2a01-0a7c-4b7f-a941-623f993a5547 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:09:05 compute-0 nova_compute[183075]: 2026-01-22 17:09:05.060 183079 WARNING nova.compute.manager [req-e1595d3e-e55e-4224-9433-f7815b1301d4 req-05b3d84e-a505-439c-a96a-1e4225a83d6d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Received unexpected event network-vif-plugged-611e2a01-0a7c-4b7f-a941-623f993a5547 for instance with vm_state active and task_state deleting.
Jan 22 17:09:05 compute-0 nova_compute[183075]: 2026-01-22 17:09:05.131 183079 DEBUG nova.network.neutron [-] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:09:05 compute-0 nova_compute[183075]: 2026-01-22 17:09:05.148 183079 INFO nova.compute.manager [-] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Took 2.17 seconds to deallocate network for instance.
Jan 22 17:09:05 compute-0 nova_compute[183075]: 2026-01-22 17:09:05.199 183079 DEBUG oslo_concurrency.lockutils [None req-e1a548fb-666f-4321-9e7a-ca872b366baa 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:05 compute-0 nova_compute[183075]: 2026-01-22 17:09:05.199 183079 DEBUG oslo_concurrency.lockutils [None req-e1a548fb-666f-4321-9e7a-ca872b366baa 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.209 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.209 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 1.4038222
Jan 22 17:09:05 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[216634]: 10.100.0.12:42364 [22/Jan/2026:17:09:03.804] listener listener/metadata 0/0/0/1405/1405 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.218 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.221 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.243 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.243 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 168 time: 0.0228052
Jan 22 17:09:05 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[216634]: 10.100.0.12:42372 [22/Jan/2026:17:09:05.218] listener listener/metadata 0/0/0/25/25 200 152 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.248 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.249 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.269 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.270 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0211639
Jan 22 17:09:05 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[216634]: 10.100.0.12:42380 [22/Jan/2026:17:09:05.247] listener listener/metadata 0/0/0/22/22 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.275 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.276 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.309 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.310 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0337486
Jan 22 17:09:05 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[216634]: 10.100.0.12:42382 [22/Jan/2026:17:09:05.275] listener listener/metadata 0/0/0/34/34 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.315 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.316 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:05 compute-0 nova_compute[183075]: 2026-01-22 17:09:05.324 183079 DEBUG nova.compute.provider_tree [None req-e1a548fb-666f-4321-9e7a-ca872b366baa 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:09:05 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[216634]: 10.100.0.12:42388 [22/Jan/2026:17:09:05.315] listener listener/metadata 0/0/0/22/22 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.338 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.338 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0216701
Jan 22 17:09:05 compute-0 nova_compute[183075]: 2026-01-22 17:09:05.344 183079 DEBUG nova.scheduler.client.report [None req-e1a548fb-666f-4321-9e7a-ca872b366baa 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.347 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.348 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:05 compute-0 podman[217051]: 2026-01-22 17:09:05.355349212 +0000 UTC m=+0.063865801 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:09:05 compute-0 nova_compute[183075]: 2026-01-22 17:09:05.377 183079 DEBUG oslo_concurrency.lockutils [None req-e1a548fb-666f-4321-9e7a-ca872b366baa 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.381 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.382 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0339081
Jan 22 17:09:05 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[216634]: 10.100.0.12:42400 [22/Jan/2026:17:09:05.344] listener listener/metadata 0/0/0/38/38 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.386 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.387 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.403 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.404 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0170231
Jan 22 17:09:05 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[216634]: 10.100.0.12:42412 [22/Jan/2026:17:09:05.386] listener listener/metadata 0/0/0/17/17 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.409 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.409 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:05 compute-0 nova_compute[183075]: 2026-01-22 17:09:05.411 183079 INFO nova.scheduler.client.report [None req-e1a548fb-666f-4321-9e7a-ca872b366baa 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Deleted allocations for instance fa77228b-8be7-4bab-9a40-7241201bdbff
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.435 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:05 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[216634]: 10.100.0.12:42424 [22/Jan/2026:17:09:05.408] listener listener/metadata 0/0/0/27/27 200 135 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.436 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 151 time: 0.0268929
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.440 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.440 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.455 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.455 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 165 time: 0.0146663
Jan 22 17:09:05 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[216634]: 10.100.0.12:42440 [22/Jan/2026:17:09:05.440] listener listener/metadata 0/0/0/15/15 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.460 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.461 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:05 compute-0 nova_compute[183075]: 2026-01-22 17:09:05.477 183079 DEBUG oslo_concurrency.lockutils [None req-e1a548fb-666f-4321-9e7a-ca872b366baa 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "fa77228b-8be7-4bab-9a40-7241201bdbff" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.866s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.499 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.500 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 165 time: 0.0392575
Jan 22 17:09:05 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[216634]: 10.100.0.12:42444 [22/Jan/2026:17:09:05.460] listener listener/metadata 0/0/0/40/40 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.504 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.505 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.526 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0214283
Jan 22 17:09:05 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[216634]: 10.100.0.12:42452 [22/Jan/2026:17:09:05.504] listener listener/metadata 0/0/0/22/22 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.534 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.535 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.551 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.551 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0165129
Jan 22 17:09:05 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[216634]: 10.100.0.12:42454 [22/Jan/2026:17:09:05.534] listener listener/metadata 0/0/0/17/17 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.554 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.555 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.571 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:05 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[216634]: 10.100.0.12:42458 [22/Jan/2026:17:09:05.554] listener listener/metadata 0/0/0/17/17 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.572 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0175424
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.576 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.577 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.591 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.591 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0145559
Jan 22 17:09:05 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[216634]: 10.100.0.12:42472 [22/Jan/2026:17:09:05.576] listener listener/metadata 0/0/0/15/15 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.595 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.596 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.625 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.625 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 165 time: 0.0293553
Jan 22 17:09:05 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[216634]: 10.100.0.12:42482 [22/Jan/2026:17:09:05.595] listener listener/metadata 0/0/0/30/30 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.634 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.635 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.659 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:05 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[216634]: 10.100.0.12:42486 [22/Jan/2026:17:09:05.632] listener listener/metadata 0/0/0/27/27 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:09:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:05.660 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0252435
Jan 22 17:09:05 compute-0 nova_compute[183075]: 2026-01-22 17:09:05.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:09:06 compute-0 nova_compute[183075]: 2026-01-22 17:09:06.067 183079 DEBUG nova.network.neutron [req-3c940853-4ab2-41c0-b185-7b75ed294152 req-4da59581-eac0-4812-91a2-1545abd26ffe a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Updated VIF entry in instance network info cache for port 611e2a01-0a7c-4b7f-a941-623f993a5547. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:09:06 compute-0 nova_compute[183075]: 2026-01-22 17:09:06.067 183079 DEBUG nova.network.neutron [req-3c940853-4ab2-41c0-b185-7b75ed294152 req-4da59581-eac0-4812-91a2-1545abd26ffe a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Updating instance_info_cache with network_info: [{"id": "611e2a01-0a7c-4b7f-a941-623f993a5547", "address": "fa:16:3e:67:5c:7a", "network": {"id": "9fe870b5-173a-4c8a-b406-6fedb3ddcc4e", "bridge": "br-int", "label": "tempest-test-network--1353019071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "572446eea54c4f0baab88bf4419b4082", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap611e2a01-0a", "ovs_interfaceid": "611e2a01-0a7c-4b7f-a941-623f993a5547", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:09:06 compute-0 nova_compute[183075]: 2026-01-22 17:09:06.086 183079 DEBUG oslo_concurrency.lockutils [req-3c940853-4ab2-41c0-b185-7b75ed294152 req-4da59581-eac0-4812-91a2-1545abd26ffe a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-fa77228b-8be7-4bab-9a40-7241201bdbff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:09:06 compute-0 ovn_controller[95372]: 2026-01-22T17:09:06Z|00120|binding|INFO|Releasing lport 1759254b-798a-4e65-baf5-489557c1f604 from this chassis (sb_readonly=0)
Jan 22 17:09:06 compute-0 ovn_controller[95372]: 2026-01-22T17:09:06Z|00121|binding|INFO|Releasing lport 124a4c16-1255-4f0e-8e20-7c85f64c00e5 from this chassis (sb_readonly=0)
Jan 22 17:09:06 compute-0 ovn_controller[95372]: 2026-01-22T17:09:06Z|00122|binding|INFO|Releasing lport 6ec27c8a-b564-4b6c-b0c1-5475212e439c from this chassis (sb_readonly=0)
Jan 22 17:09:06 compute-0 nova_compute[183075]: 2026-01-22 17:09:06.192 183079 DEBUG oslo_concurrency.lockutils [None req-52eaff83-d704-4208-a7ba-cc0302894f4a 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Acquiring lock "4bb7efdc-59ab-46cd-ae0d-582182c85f5b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:06 compute-0 nova_compute[183075]: 2026-01-22 17:09:06.192 183079 DEBUG oslo_concurrency.lockutils [None req-52eaff83-d704-4208-a7ba-cc0302894f4a 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "4bb7efdc-59ab-46cd-ae0d-582182c85f5b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:06 compute-0 nova_compute[183075]: 2026-01-22 17:09:06.193 183079 DEBUG oslo_concurrency.lockutils [None req-52eaff83-d704-4208-a7ba-cc0302894f4a 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Acquiring lock "4bb7efdc-59ab-46cd-ae0d-582182c85f5b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:06 compute-0 nova_compute[183075]: 2026-01-22 17:09:06.194 183079 DEBUG oslo_concurrency.lockutils [None req-52eaff83-d704-4208-a7ba-cc0302894f4a 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "4bb7efdc-59ab-46cd-ae0d-582182c85f5b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:06 compute-0 nova_compute[183075]: 2026-01-22 17:09:06.194 183079 DEBUG oslo_concurrency.lockutils [None req-52eaff83-d704-4208-a7ba-cc0302894f4a 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "4bb7efdc-59ab-46cd-ae0d-582182c85f5b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:06 compute-0 nova_compute[183075]: 2026-01-22 17:09:06.195 183079 INFO nova.compute.manager [None req-52eaff83-d704-4208-a7ba-cc0302894f4a 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Terminating instance
Jan 22 17:09:06 compute-0 nova_compute[183075]: 2026-01-22 17:09:06.196 183079 DEBUG nova.compute.manager [None req-52eaff83-d704-4208-a7ba-cc0302894f4a 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:09:06 compute-0 nova_compute[183075]: 2026-01-22 17:09:06.200 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:06 compute-0 kernel: tap65d8ece3-00 (unregistering): left promiscuous mode
Jan 22 17:09:06 compute-0 NetworkManager[55454]: <info>  [1769101746.2285] device (tap65d8ece3-00): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:09:06 compute-0 ovn_controller[95372]: 2026-01-22T17:09:06Z|00123|binding|INFO|Releasing lport 65d8ece3-00e3-43f9-8231-6893ea4cf9a4 from this chassis (sb_readonly=0)
Jan 22 17:09:06 compute-0 ovn_controller[95372]: 2026-01-22T17:09:06Z|00124|binding|INFO|Setting lport 65d8ece3-00e3-43f9-8231-6893ea4cf9a4 down in Southbound
Jan 22 17:09:06 compute-0 nova_compute[183075]: 2026-01-22 17:09:06.254 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:06 compute-0 ovn_controller[95372]: 2026-01-22T17:09:06Z|00125|binding|INFO|Removing iface tap65d8ece3-00 ovn-installed in OVS
Jan 22 17:09:06 compute-0 nova_compute[183075]: 2026-01-22 17:09:06.261 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:06.270 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:eb:3e:ac 10.100.0.8'], port_security=['fa:16:3e:eb:3e:ac 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '4bb7efdc-59ab-46cd-ae0d-582182c85f5b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6884ab5c00114ca19f253d0c91e2706f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ced556d8-3a2b-4ec1-a804-8cbb50ada768', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.239'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4f9feb03-8564-422d-a49d-142dd411b92f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=65d8ece3-00e3-43f9-8231-6893ea4cf9a4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:09:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:06.272 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 65d8ece3-00e3-43f9-8231-6893ea4cf9a4 in datapath 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e unbound from our chassis
Jan 22 17:09:06 compute-0 nova_compute[183075]: 2026-01-22 17:09:06.273 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:06.278 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9fe870b5-173a-4c8a-b406-6fedb3ddcc4e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:09:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:06.279 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[bb24ff7d-7ab2-4722-a223-18ec6c15be0f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:06.280 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e namespace which is not needed anymore
Jan 22 17:09:06 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Deactivated successfully.
Jan 22 17:09:06 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Consumed 14.939s CPU time.
Jan 22 17:09:06 compute-0 systemd-machined[154382]: Machine qemu-6-instance-00000006 terminated.
Jan 22 17:09:06 compute-0 neutron-haproxy-ovnmeta-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216156]: [NOTICE]   (216160) : haproxy version is 2.8.14-c23fe91
Jan 22 17:09:06 compute-0 neutron-haproxy-ovnmeta-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216156]: [NOTICE]   (216160) : path to executable is /usr/sbin/haproxy
Jan 22 17:09:06 compute-0 neutron-haproxy-ovnmeta-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216156]: [WARNING]  (216160) : Exiting Master process...
Jan 22 17:09:06 compute-0 neutron-haproxy-ovnmeta-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216156]: [ALERT]    (216160) : Current worker (216162) exited with code 143 (Terminated)
Jan 22 17:09:06 compute-0 neutron-haproxy-ovnmeta-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e[216156]: [WARNING]  (216160) : All workers exited. Exiting... (0)
Jan 22 17:09:06 compute-0 systemd[1]: libpod-3843f3618761b92374dc588c34bf04c8023fc31d75e32df465150b1bfcb808ac.scope: Deactivated successfully.
Jan 22 17:09:06 compute-0 podman[217098]: 2026-01-22 17:09:06.417549367 +0000 UTC m=+0.054212649 container died 3843f3618761b92374dc588c34bf04c8023fc31d75e32df465150b1bfcb808ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:09:06 compute-0 nova_compute[183075]: 2026-01-22 17:09:06.453 183079 INFO nova.virt.libvirt.driver [-] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Instance destroyed successfully.
Jan 22 17:09:06 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3843f3618761b92374dc588c34bf04c8023fc31d75e32df465150b1bfcb808ac-userdata-shm.mount: Deactivated successfully.
Jan 22 17:09:06 compute-0 nova_compute[183075]: 2026-01-22 17:09:06.454 183079 DEBUG nova.objects.instance [None req-52eaff83-d704-4208-a7ba-cc0302894f4a 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lazy-loading 'resources' on Instance uuid 4bb7efdc-59ab-46cd-ae0d-582182c85f5b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:09:06 compute-0 systemd[1]: var-lib-containers-storage-overlay-7dbf004c3aa91ae1d161ce24f74ee37bbd39a59273d901ebb8e041f477297d53-merged.mount: Deactivated successfully.
Jan 22 17:09:06 compute-0 podman[217098]: 2026-01-22 17:09:06.462374989 +0000 UTC m=+0.099038271 container cleanup 3843f3618761b92374dc588c34bf04c8023fc31d75e32df465150b1bfcb808ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:09:06 compute-0 nova_compute[183075]: 2026-01-22 17:09:06.469 183079 DEBUG nova.virt.libvirt.vif [None req-52eaff83-d704-4208-a7ba-cc0302894f4a 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:07:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1697070694',display_name='tempest-server-test-1697070694',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1697070694',id=6,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOzuS5c+/c3nchlEjzQZcF6sVTY+If+Ronj929KhDV7B+UoVTOX4QPWtIie7F0q5o+yMVIvrndYwVXNXP5sf1WYo75cdEmtnNs2NDMwzYXACOAw67rfC/ZLmfMKPakZxTQ==',key_name='tempest-keypair-test-694600506',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:07:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6884ab5c00114ca19f253d0c91e2706f',ramdisk_id='',reservation_id='r-rlpd8jek',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIpTestCasesAdmin-1567259425',owner_user_name='tempest-FloatingIpTestCasesAdmin-1567259425-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:07:58Z,user_data=None,user_id='7554977cf766467891ad30986750ca88',uuid=4bb7efdc-59ab-46cd-ae0d-582182c85f5b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "65d8ece3-00e3-43f9-8231-6893ea4cf9a4", "address": "fa:16:3e:eb:3e:ac", "network": {"id": "9fe870b5-173a-4c8a-b406-6fedb3ddcc4e", "bridge": "br-int", "label": "tempest-test-network--1353019071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "572446eea54c4f0baab88bf4419b4082", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65d8ece3-00", "ovs_interfaceid": "65d8ece3-00e3-43f9-8231-6893ea4cf9a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:09:06 compute-0 nova_compute[183075]: 2026-01-22 17:09:06.469 183079 DEBUG nova.network.os_vif_util [None req-52eaff83-d704-4208-a7ba-cc0302894f4a 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Converting VIF {"id": "65d8ece3-00e3-43f9-8231-6893ea4cf9a4", "address": "fa:16:3e:eb:3e:ac", "network": {"id": "9fe870b5-173a-4c8a-b406-6fedb3ddcc4e", "bridge": "br-int", "label": "tempest-test-network--1353019071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "572446eea54c4f0baab88bf4419b4082", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65d8ece3-00", "ovs_interfaceid": "65d8ece3-00e3-43f9-8231-6893ea4cf9a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:09:06 compute-0 nova_compute[183075]: 2026-01-22 17:09:06.470 183079 DEBUG nova.network.os_vif_util [None req-52eaff83-d704-4208-a7ba-cc0302894f4a 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:eb:3e:ac,bridge_name='br-int',has_traffic_filtering=True,id=65d8ece3-00e3-43f9-8231-6893ea4cf9a4,network=Network(9fe870b5-173a-4c8a-b406-6fedb3ddcc4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap65d8ece3-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:09:06 compute-0 nova_compute[183075]: 2026-01-22 17:09:06.471 183079 DEBUG os_vif [None req-52eaff83-d704-4208-a7ba-cc0302894f4a 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:eb:3e:ac,bridge_name='br-int',has_traffic_filtering=True,id=65d8ece3-00e3-43f9-8231-6893ea4cf9a4,network=Network(9fe870b5-173a-4c8a-b406-6fedb3ddcc4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap65d8ece3-00') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:09:06 compute-0 nova_compute[183075]: 2026-01-22 17:09:06.473 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:06 compute-0 nova_compute[183075]: 2026-01-22 17:09:06.473 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap65d8ece3-00, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:09:06 compute-0 nova_compute[183075]: 2026-01-22 17:09:06.477 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:09:06 compute-0 nova_compute[183075]: 2026-01-22 17:09:06.481 183079 INFO os_vif [None req-52eaff83-d704-4208-a7ba-cc0302894f4a 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:eb:3e:ac,bridge_name='br-int',has_traffic_filtering=True,id=65d8ece3-00e3-43f9-8231-6893ea4cf9a4,network=Network(9fe870b5-173a-4c8a-b406-6fedb3ddcc4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap65d8ece3-00')
Jan 22 17:09:06 compute-0 nova_compute[183075]: 2026-01-22 17:09:06.482 183079 INFO nova.virt.libvirt.driver [None req-52eaff83-d704-4208-a7ba-cc0302894f4a 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Deleting instance files /var/lib/nova/instances/4bb7efdc-59ab-46cd-ae0d-582182c85f5b_del
Jan 22 17:09:06 compute-0 nova_compute[183075]: 2026-01-22 17:09:06.482 183079 INFO nova.virt.libvirt.driver [None req-52eaff83-d704-4208-a7ba-cc0302894f4a 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Deletion of /var/lib/nova/instances/4bb7efdc-59ab-46cd-ae0d-582182c85f5b_del complete
Jan 22 17:09:06 compute-0 systemd[1]: libpod-conmon-3843f3618761b92374dc588c34bf04c8023fc31d75e32df465150b1bfcb808ac.scope: Deactivated successfully.
Jan 22 17:09:06 compute-0 podman[217141]: 2026-01-22 17:09:06.524012851 +0000 UTC m=+0.038884178 container remove 3843f3618761b92374dc588c34bf04c8023fc31d75e32df465150b1bfcb808ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true)
Jan 22 17:09:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:06.529 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4e788036-8de8-4cd1-af37-99ba9636ef92]: (4, ('Thu Jan 22 05:09:06 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e (3843f3618761b92374dc588c34bf04c8023fc31d75e32df465150b1bfcb808ac)\n3843f3618761b92374dc588c34bf04c8023fc31d75e32df465150b1bfcb808ac\nThu Jan 22 05:09:06 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e (3843f3618761b92374dc588c34bf04c8023fc31d75e32df465150b1bfcb808ac)\n3843f3618761b92374dc588c34bf04c8023fc31d75e32df465150b1bfcb808ac\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:06.530 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4f0f1bec-45f2-4bed-8b51-350773b491d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:06.531 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9fe870b5-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:09:06 compute-0 nova_compute[183075]: 2026-01-22 17:09:06.533 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:06 compute-0 kernel: tap9fe870b5-10: left promiscuous mode
Jan 22 17:09:06 compute-0 nova_compute[183075]: 2026-01-22 17:09:06.541 183079 INFO nova.compute.manager [None req-52eaff83-d704-4208-a7ba-cc0302894f4a 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Took 0.35 seconds to destroy the instance on the hypervisor.
Jan 22 17:09:06 compute-0 nova_compute[183075]: 2026-01-22 17:09:06.542 183079 DEBUG oslo.service.loopingcall [None req-52eaff83-d704-4208-a7ba-cc0302894f4a 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:09:06 compute-0 nova_compute[183075]: 2026-01-22 17:09:06.543 183079 DEBUG nova.compute.manager [-] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:09:06 compute-0 nova_compute[183075]: 2026-01-22 17:09:06.543 183079 DEBUG nova.network.neutron [-] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:09:06 compute-0 nova_compute[183075]: 2026-01-22 17:09:06.554 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:06.556 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b7ad6e69-36d1-44f8-85b5-9ac955ddf6df]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:06.573 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3ba69ff7-f084-49d7-86c9-84ac6f20a0ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:06.575 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7ae9f230-d26d-4298-be8a-f84e6a635e47]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:06.590 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[fa6b864e-1b63-43a2-9a2c-0ea878de3fa2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 404049, 'reachable_time': 19284, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217156, 'error': None, 'target': 'ovnmeta-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:06.592 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9fe870b5-173a-4c8a-b406-6fedb3ddcc4e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:09:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:06.592 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[8092d308-07a0-4288-ba6e-9235021982f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:06 compute-0 systemd[1]: run-netns-ovnmeta\x2d9fe870b5\x2d173a\x2d4c8a\x2db406\x2d6fedb3ddcc4e.mount: Deactivated successfully.
Jan 22 17:09:07 compute-0 nova_compute[183075]: 2026-01-22 17:09:07.174 183079 DEBUG nova.compute.manager [req-f702ab13-41fa-4d65-8306-e68ca63362d8 req-7ab4a9d4-0622-432e-9692-5b7c732d2fb8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Received event network-vif-deleted-611e2a01-0a7c-4b7f-a941-623f993a5547 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:09:07 compute-0 nova_compute[183075]: 2026-01-22 17:09:07.174 183079 DEBUG nova.compute.manager [req-f702ab13-41fa-4d65-8306-e68ca63362d8 req-7ab4a9d4-0622-432e-9692-5b7c732d2fb8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Received event network-vif-unplugged-65d8ece3-00e3-43f9-8231-6893ea4cf9a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:09:07 compute-0 nova_compute[183075]: 2026-01-22 17:09:07.175 183079 DEBUG oslo_concurrency.lockutils [req-f702ab13-41fa-4d65-8306-e68ca63362d8 req-7ab4a9d4-0622-432e-9692-5b7c732d2fb8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "4bb7efdc-59ab-46cd-ae0d-582182c85f5b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:07 compute-0 nova_compute[183075]: 2026-01-22 17:09:07.175 183079 DEBUG oslo_concurrency.lockutils [req-f702ab13-41fa-4d65-8306-e68ca63362d8 req-7ab4a9d4-0622-432e-9692-5b7c732d2fb8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "4bb7efdc-59ab-46cd-ae0d-582182c85f5b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:07 compute-0 nova_compute[183075]: 2026-01-22 17:09:07.175 183079 DEBUG oslo_concurrency.lockutils [req-f702ab13-41fa-4d65-8306-e68ca63362d8 req-7ab4a9d4-0622-432e-9692-5b7c732d2fb8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "4bb7efdc-59ab-46cd-ae0d-582182c85f5b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:07 compute-0 nova_compute[183075]: 2026-01-22 17:09:07.175 183079 DEBUG nova.compute.manager [req-f702ab13-41fa-4d65-8306-e68ca63362d8 req-7ab4a9d4-0622-432e-9692-5b7c732d2fb8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] No waiting events found dispatching network-vif-unplugged-65d8ece3-00e3-43f9-8231-6893ea4cf9a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:09:07 compute-0 nova_compute[183075]: 2026-01-22 17:09:07.175 183079 DEBUG nova.compute.manager [req-f702ab13-41fa-4d65-8306-e68ca63362d8 req-7ab4a9d4-0622-432e-9692-5b7c732d2fb8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Received event network-vif-unplugged-65d8ece3-00e3-43f9-8231-6893ea4cf9a4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:09:07 compute-0 nova_compute[183075]: 2026-01-22 17:09:07.175 183079 DEBUG nova.compute.manager [req-f702ab13-41fa-4d65-8306-e68ca63362d8 req-7ab4a9d4-0622-432e-9692-5b7c732d2fb8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Received event network-vif-plugged-65d8ece3-00e3-43f9-8231-6893ea4cf9a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:09:07 compute-0 nova_compute[183075]: 2026-01-22 17:09:07.176 183079 DEBUG oslo_concurrency.lockutils [req-f702ab13-41fa-4d65-8306-e68ca63362d8 req-7ab4a9d4-0622-432e-9692-5b7c732d2fb8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "4bb7efdc-59ab-46cd-ae0d-582182c85f5b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:07 compute-0 nova_compute[183075]: 2026-01-22 17:09:07.176 183079 DEBUG oslo_concurrency.lockutils [req-f702ab13-41fa-4d65-8306-e68ca63362d8 req-7ab4a9d4-0622-432e-9692-5b7c732d2fb8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "4bb7efdc-59ab-46cd-ae0d-582182c85f5b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:07 compute-0 nova_compute[183075]: 2026-01-22 17:09:07.176 183079 DEBUG oslo_concurrency.lockutils [req-f702ab13-41fa-4d65-8306-e68ca63362d8 req-7ab4a9d4-0622-432e-9692-5b7c732d2fb8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "4bb7efdc-59ab-46cd-ae0d-582182c85f5b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:07 compute-0 nova_compute[183075]: 2026-01-22 17:09:07.176 183079 DEBUG nova.compute.manager [req-f702ab13-41fa-4d65-8306-e68ca63362d8 req-7ab4a9d4-0622-432e-9692-5b7c732d2fb8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] No waiting events found dispatching network-vif-plugged-65d8ece3-00e3-43f9-8231-6893ea4cf9a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:09:07 compute-0 nova_compute[183075]: 2026-01-22 17:09:07.176 183079 WARNING nova.compute.manager [req-f702ab13-41fa-4d65-8306-e68ca63362d8 req-7ab4a9d4-0622-432e-9692-5b7c732d2fb8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Received unexpected event network-vif-plugged-65d8ece3-00e3-43f9-8231-6893ea4cf9a4 for instance with vm_state active and task_state deleting.
Jan 22 17:09:07 compute-0 nova_compute[183075]: 2026-01-22 17:09:07.247 183079 INFO nova.compute.manager [None req-15abd5fd-890f-4b5f-9f88-60d166f21c66 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Get console output
Jan 22 17:09:07 compute-0 nova_compute[183075]: 2026-01-22 17:09:07.253 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:09:07 compute-0 nova_compute[183075]: 2026-01-22 17:09:07.533 183079 INFO nova.compute.manager [None req-1f99936c-5db5-4122-ad04-16d4a3c1e4ac 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Get console output
Jan 22 17:09:07 compute-0 nova_compute[183075]: 2026-01-22 17:09:07.537 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:09:07 compute-0 nova_compute[183075]: 2026-01-22 17:09:07.663 183079 DEBUG nova.network.neutron [-] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:09:07 compute-0 nova_compute[183075]: 2026-01-22 17:09:07.691 183079 INFO nova.compute.manager [-] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Took 1.15 seconds to deallocate network for instance.
Jan 22 17:09:07 compute-0 nova_compute[183075]: 2026-01-22 17:09:07.754 183079 DEBUG nova.compute.manager [req-f7252bb3-c56e-4ea9-a419-4b16f97a5070 req-20ac1ed3-fdeb-4911-adee-9d081ce240a5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Received event network-vif-deleted-65d8ece3-00e3-43f9-8231-6893ea4cf9a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:09:07 compute-0 nova_compute[183075]: 2026-01-22 17:09:07.760 183079 DEBUG oslo_concurrency.lockutils [None req-52eaff83-d704-4208-a7ba-cc0302894f4a 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:07 compute-0 nova_compute[183075]: 2026-01-22 17:09:07.762 183079 DEBUG oslo_concurrency.lockutils [None req-52eaff83-d704-4208-a7ba-cc0302894f4a 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:07 compute-0 nova_compute[183075]: 2026-01-22 17:09:07.784 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:09:07 compute-0 nova_compute[183075]: 2026-01-22 17:09:07.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:09:07 compute-0 nova_compute[183075]: 2026-01-22 17:09:07.789 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:09:07 compute-0 nova_compute[183075]: 2026-01-22 17:09:07.893 183079 DEBUG nova.compute.provider_tree [None req-52eaff83-d704-4208-a7ba-cc0302894f4a 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:09:07 compute-0 nova_compute[183075]: 2026-01-22 17:09:07.912 183079 DEBUG nova.scheduler.client.report [None req-52eaff83-d704-4208-a7ba-cc0302894f4a 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:09:07 compute-0 nova_compute[183075]: 2026-01-22 17:09:07.936 183079 DEBUG oslo_concurrency.lockutils [None req-52eaff83-d704-4208-a7ba-cc0302894f4a 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:07 compute-0 nova_compute[183075]: 2026-01-22 17:09:07.973 183079 INFO nova.scheduler.client.report [None req-52eaff83-d704-4208-a7ba-cc0302894f4a 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Deleted allocations for instance 4bb7efdc-59ab-46cd-ae0d-582182c85f5b
Jan 22 17:09:08 compute-0 nova_compute[183075]: 2026-01-22 17:09:08.060 183079 DEBUG oslo_concurrency.lockutils [None req-52eaff83-d704-4208-a7ba-cc0302894f4a 7554977cf766467891ad30986750ca88 6884ab5c00114ca19f253d0c91e2706f - - default default] Lock "4bb7efdc-59ab-46cd-ae0d-582182c85f5b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.868s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:08 compute-0 nova_compute[183075]: 2026-01-22 17:09:08.064 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-4bb7efdc-59ab-46cd-ae0d-582182c85f5b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:09:08 compute-0 nova_compute[183075]: 2026-01-22 17:09:08.064 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-4bb7efdc-59ab-46cd-ae0d-582182c85f5b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:09:08 compute-0 nova_compute[183075]: 2026-01-22 17:09:08.064 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:09:08 compute-0 nova_compute[183075]: 2026-01-22 17:09:08.089 183079 DEBUG nova.compute.utils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Can not refresh info_cache because instance was not found refresh_info_cache_for_instance /usr/lib/python3.9/site-packages/nova/compute/utils.py:1010
Jan 22 17:09:08 compute-0 nova_compute[183075]: 2026-01-22 17:09:08.281 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:09:08 compute-0 nova_compute[183075]: 2026-01-22 17:09:08.957 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769101733.9559467, effaddee-27ef-49f6-ac5f-2e3258c8d5d2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:09:08 compute-0 nova_compute[183075]: 2026-01-22 17:09:08.957 183079 INFO nova.compute.manager [-] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] VM Stopped (Lifecycle Event)
Jan 22 17:09:08 compute-0 nova_compute[183075]: 2026-01-22 17:09:08.981 183079 DEBUG nova.compute.manager [None req-9f799d10-5b52-4837-8d72-77744ee10c99 - - - - - -] [instance: effaddee-27ef-49f6-ac5f-2e3258c8d5d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:09:09 compute-0 nova_compute[183075]: 2026-01-22 17:09:09.263 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:09:09 compute-0 nova_compute[183075]: 2026-01-22 17:09:09.277 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-4bb7efdc-59ab-46cd-ae0d-582182c85f5b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:09:09 compute-0 nova_compute[183075]: 2026-01-22 17:09:09.277 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:09:09 compute-0 nova_compute[183075]: 2026-01-22 17:09:09.277 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:09:09 compute-0 nova_compute[183075]: 2026-01-22 17:09:09.278 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:09:09 compute-0 nova_compute[183075]: 2026-01-22 17:09:09.278 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:09:09 compute-0 nova_compute[183075]: 2026-01-22 17:09:09.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:09:09 compute-0 nova_compute[183075]: 2026-01-22 17:09:09.968 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:10 compute-0 ovn_controller[95372]: 2026-01-22T17:09:10Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:72:f9:88 10.100.0.7
Jan 22 17:09:10 compute-0 ovn_controller[95372]: 2026-01-22T17:09:10Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:72:f9:88 10.100.0.7
Jan 22 17:09:10 compute-0 nova_compute[183075]: 2026-01-22 17:09:10.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:09:10 compute-0 nova_compute[183075]: 2026-01-22 17:09:10.823 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:10 compute-0 nova_compute[183075]: 2026-01-22 17:09:10.823 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:10 compute-0 nova_compute[183075]: 2026-01-22 17:09:10.824 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:10 compute-0 nova_compute[183075]: 2026-01-22 17:09:10.824 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:09:10 compute-0 nova_compute[183075]: 2026-01-22 17:09:10.920 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fa7475b-9f51-4229-8ded-3a0c4de806c5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:09:10 compute-0 nova_compute[183075]: 2026-01-22 17:09:10.997 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fa7475b-9f51-4229-8ded-3a0c4de806c5/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:09:10 compute-0 nova_compute[183075]: 2026-01-22 17:09:10.998 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fa7475b-9f51-4229-8ded-3a0c4de806c5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:09:11 compute-0 nova_compute[183075]: 2026-01-22 17:09:11.051 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fa7475b-9f51-4229-8ded-3a0c4de806c5/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:09:11 compute-0 nova_compute[183075]: 2026-01-22 17:09:11.056 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee33030a-2035-4fd1-8de4-261142b89bc6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:09:11 compute-0 nova_compute[183075]: 2026-01-22 17:09:11.108 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee33030a-2035-4fd1-8de4-261142b89bc6/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:09:11 compute-0 nova_compute[183075]: 2026-01-22 17:09:11.109 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee33030a-2035-4fd1-8de4-261142b89bc6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:09:11 compute-0 nova_compute[183075]: 2026-01-22 17:09:11.161 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee33030a-2035-4fd1-8de4-261142b89bc6/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:09:11 compute-0 nova_compute[183075]: 2026-01-22 17:09:11.316 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:09:11 compute-0 nova_compute[183075]: 2026-01-22 17:09:11.317 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5401MB free_disk=73.3248176574707GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:09:11 compute-0 nova_compute[183075]: 2026-01-22 17:09:11.318 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:11 compute-0 nova_compute[183075]: 2026-01-22 17:09:11.318 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:11 compute-0 nova_compute[183075]: 2026-01-22 17:09:11.409 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance ee33030a-2035-4fd1-8de4-261142b89bc6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:09:11 compute-0 nova_compute[183075]: 2026-01-22 17:09:11.409 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 1fa7475b-9f51-4229-8ded-3a0c4de806c5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:09:11 compute-0 nova_compute[183075]: 2026-01-22 17:09:11.410 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:09:11 compute-0 nova_compute[183075]: 2026-01-22 17:09:11.410 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:09:11 compute-0 nova_compute[183075]: 2026-01-22 17:09:11.476 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:11 compute-0 nova_compute[183075]: 2026-01-22 17:09:11.733 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:09:11 compute-0 nova_compute[183075]: 2026-01-22 17:09:11.748 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:09:11 compute-0 nova_compute[183075]: 2026-01-22 17:09:11.768 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:09:11 compute-0 nova_compute[183075]: 2026-01-22 17:09:11.769 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.451s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:12 compute-0 nova_compute[183075]: 2026-01-22 17:09:12.864 183079 INFO nova.compute.manager [None req-c55681d3-fb41-45f6-9094-974184b8d068 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Get console output
Jan 22 17:09:12 compute-0 nova_compute[183075]: 2026-01-22 17:09:12.873 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:09:13 compute-0 nova_compute[183075]: 2026-01-22 17:09:13.740 183079 DEBUG oslo_concurrency.lockutils [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "7d42beab-5bb4-43a0-9756-ced73188f5ba" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:13 compute-0 nova_compute[183075]: 2026-01-22 17:09:13.740 183079 DEBUG oslo_concurrency.lockutils [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "7d42beab-5bb4-43a0-9756-ced73188f5ba" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:13 compute-0 nova_compute[183075]: 2026-01-22 17:09:13.767 183079 DEBUG nova.compute.manager [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:09:13 compute-0 nova_compute[183075]: 2026-01-22 17:09:13.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:09:13 compute-0 nova_compute[183075]: 2026-01-22 17:09:13.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 17:09:13 compute-0 nova_compute[183075]: 2026-01-22 17:09:13.813 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 17:09:13 compute-0 nova_compute[183075]: 2026-01-22 17:09:13.841 183079 DEBUG oslo_concurrency.lockutils [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:13 compute-0 nova_compute[183075]: 2026-01-22 17:09:13.842 183079 DEBUG oslo_concurrency.lockutils [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:13 compute-0 nova_compute[183075]: 2026-01-22 17:09:13.848 183079 DEBUG nova.virt.hardware [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:09:13 compute-0 nova_compute[183075]: 2026-01-22 17:09:13.848 183079 INFO nova.compute.claims [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:09:13 compute-0 nova_compute[183075]: 2026-01-22 17:09:13.998 183079 DEBUG nova.compute.provider_tree [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.017 183079 DEBUG nova.scheduler.client.report [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.042 183079 DEBUG oslo_concurrency.lockutils [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.200s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.042 183079 DEBUG nova.compute.manager [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.103 183079 DEBUG nova.compute.manager [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.104 183079 DEBUG nova.network.neutron [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.126 183079 INFO nova.virt.libvirt.driver [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.146 183079 DEBUG nova.compute.manager [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.244 183079 DEBUG nova.compute.manager [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.245 183079 DEBUG nova.virt.libvirt.driver [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.246 183079 INFO nova.virt.libvirt.driver [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Creating image(s)
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.246 183079 DEBUG oslo_concurrency.lockutils [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "/var/lib/nova/instances/7d42beab-5bb4-43a0-9756-ced73188f5ba/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.247 183079 DEBUG oslo_concurrency.lockutils [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "/var/lib/nova/instances/7d42beab-5bb4-43a0-9756-ced73188f5ba/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.247 183079 DEBUG oslo_concurrency.lockutils [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "/var/lib/nova/instances/7d42beab-5bb4-43a0-9756-ced73188f5ba/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.259 183079 DEBUG oslo_concurrency.processutils [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.315 183079 DEBUG oslo_concurrency.processutils [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.316 183079 DEBUG oslo_concurrency.lockutils [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.317 183079 DEBUG oslo_concurrency.lockutils [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.331 183079 DEBUG oslo_concurrency.processutils [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.386 183079 DEBUG oslo_concurrency.processutils [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.387 183079 DEBUG oslo_concurrency.processutils [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/7d42beab-5bb4-43a0-9756-ced73188f5ba/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.422 183079 DEBUG oslo_concurrency.processutils [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/7d42beab-5bb4-43a0-9756-ced73188f5ba/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.423 183079 DEBUG oslo_concurrency.lockutils [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.424 183079 DEBUG oslo_concurrency.processutils [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.476 183079 DEBUG oslo_concurrency.processutils [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.477 183079 DEBUG nova.virt.disk.api [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Checking if we can resize image /var/lib/nova/instances/7d42beab-5bb4-43a0-9756-ced73188f5ba/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.477 183079 DEBUG oslo_concurrency.processutils [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7d42beab-5bb4-43a0-9756-ced73188f5ba/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.534 183079 DEBUG oslo_concurrency.processutils [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7d42beab-5bb4-43a0-9756-ced73188f5ba/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.535 183079 DEBUG nova.virt.disk.api [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Cannot resize image /var/lib/nova/instances/7d42beab-5bb4-43a0-9756-ced73188f5ba/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.535 183079 DEBUG nova.objects.instance [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lazy-loading 'migration_context' on Instance uuid 7d42beab-5bb4-43a0-9756-ced73188f5ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.553 183079 DEBUG nova.virt.libvirt.driver [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.554 183079 DEBUG nova.virt.libvirt.driver [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Ensure instance console log exists: /var/lib/nova/instances/7d42beab-5bb4-43a0-9756-ced73188f5ba/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.555 183079 DEBUG oslo_concurrency.lockutils [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.556 183079 DEBUG oslo_concurrency.lockutils [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.556 183079 DEBUG oslo_concurrency.lockutils [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.740 183079 DEBUG nova.policy [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:09:14 compute-0 nova_compute[183075]: 2026-01-22 17:09:14.971 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:15 compute-0 ovn_controller[95372]: 2026-01-22T17:09:15Z|00126|binding|INFO|Releasing lport 1759254b-798a-4e65-baf5-489557c1f604 from this chassis (sb_readonly=0)
Jan 22 17:09:15 compute-0 ovn_controller[95372]: 2026-01-22T17:09:15Z|00127|binding|INFO|Releasing lport 6ec27c8a-b564-4b6c-b0c1-5475212e439c from this chassis (sb_readonly=0)
Jan 22 17:09:15 compute-0 nova_compute[183075]: 2026-01-22 17:09:15.113 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:16.338 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:16.339 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:09:16 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:16 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:16 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:16 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:16 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:16 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:09:16 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: a2733777-0394-47df-88c8-302fae8b0aef __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:16 compute-0 nova_compute[183075]: 2026-01-22 17:09:16.385 183079 DEBUG nova.network.neutron [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Successfully updated port: 877fd3c4-01ce-4616-b9d4-92cf337f7f6f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:09:16 compute-0 podman[217208]: 2026-01-22 17:09:16.40446977 +0000 UTC m=+0.098763764 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:09:16 compute-0 nova_compute[183075]: 2026-01-22 17:09:16.408 183079 DEBUG oslo_concurrency.lockutils [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "refresh_cache-7d42beab-5bb4-43a0-9756-ced73188f5ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:09:16 compute-0 nova_compute[183075]: 2026-01-22 17:09:16.408 183079 DEBUG oslo_concurrency.lockutils [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquired lock "refresh_cache-7d42beab-5bb4-43a0-9756-ced73188f5ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:09:16 compute-0 nova_compute[183075]: 2026-01-22 17:09:16.408 183079 DEBUG nova.network.neutron [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:09:16 compute-0 nova_compute[183075]: 2026-01-22 17:09:16.479 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:16 compute-0 nova_compute[183075]: 2026-01-22 17:09:16.510 183079 DEBUG nova.compute.manager [req-26cadd06-3349-48c2-94a1-844ccc67363d req-00c83120-4c9f-4af0-93d5-81398523c003 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Received event network-changed-877fd3c4-01ce-4616-b9d4-92cf337f7f6f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:09:16 compute-0 nova_compute[183075]: 2026-01-22 17:09:16.511 183079 DEBUG nova.compute.manager [req-26cadd06-3349-48c2-94a1-844ccc67363d req-00c83120-4c9f-4af0-93d5-81398523c003 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Refreshing instance network info cache due to event network-changed-877fd3c4-01ce-4616-b9d4-92cf337f7f6f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:09:16 compute-0 nova_compute[183075]: 2026-01-22 17:09:16.511 183079 DEBUG oslo_concurrency.lockutils [req-26cadd06-3349-48c2-94a1-844ccc67363d req-00c83120-4c9f-4af0-93d5-81398523c003 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-7d42beab-5bb4-43a0-9756-ced73188f5ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:09:16 compute-0 nova_compute[183075]: 2026-01-22 17:09:16.592 183079 DEBUG nova.network.neutron [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:09:16 compute-0 nova_compute[183075]: 2026-01-22 17:09:16.809 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:09:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:16.984 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:16 compute-0 haproxy-metadata-proxy-a2733777-0394-47df-88c8-302fae8b0aef[217012]: 10.100.0.7:39410 [22/Jan/2026:17:09:16.337] listener listener/metadata 0/0/0/647/647 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:09:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:16.984 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.6452460
Jan 22 17:09:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:16.994 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:16.995 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:09:16 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:16 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:16 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:16 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:16 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:16 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:09:16 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: a2733777-0394-47df-88c8-302fae8b0aef __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.016 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.016 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0217605
Jan 22 17:09:17 compute-0 haproxy-metadata-proxy-a2733777-0394-47df-88c8-302fae8b0aef[217012]: 10.100.0.7:39412 [22/Jan/2026:17:09:16.993] listener listener/metadata 0/0/0/23/23 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.021 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.022 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: a2733777-0394-47df-88c8-302fae8b0aef __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.035 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:17 compute-0 haproxy-metadata-proxy-a2733777-0394-47df-88c8-302fae8b0aef[217012]: 10.100.0.7:39414 [22/Jan/2026:17:09:17.020] listener listener/metadata 0/0/0/14/14 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.035 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0132494
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.040 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.041 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: a2733777-0394-47df-88c8-302fae8b0aef __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.053 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.054 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0127597
Jan 22 17:09:17 compute-0 haproxy-metadata-proxy-a2733777-0394-47df-88c8-302fae8b0aef[217012]: 10.100.0.7:39422 [22/Jan/2026:17:09:17.040] listener listener/metadata 0/0/0/13/13 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.060 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.061 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: a2733777-0394-47df-88c8-302fae8b0aef __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.080 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.080 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0196726
Jan 22 17:09:17 compute-0 haproxy-metadata-proxy-a2733777-0394-47df-88c8-302fae8b0aef[217012]: 10.100.0.7:39438 [22/Jan/2026:17:09:17.060] listener listener/metadata 0/0/0/20/20 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.090 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.091 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: a2733777-0394-47df-88c8-302fae8b0aef __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.106 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.106 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0153866
Jan 22 17:09:17 compute-0 haproxy-metadata-proxy-a2733777-0394-47df-88c8-302fae8b0aef[217012]: 10.100.0.7:39440 [22/Jan/2026:17:09:17.090] listener listener/metadata 0/0/0/16/16 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.118 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.120 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: a2733777-0394-47df-88c8-302fae8b0aef __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.144 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.145 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0252261
Jan 22 17:09:17 compute-0 haproxy-metadata-proxy-a2733777-0394-47df-88c8-302fae8b0aef[217012]: 10.100.0.7:39448 [22/Jan/2026:17:09:17.117] listener listener/metadata 0/0/0/28/28 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.154 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.155 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: a2733777-0394-47df-88c8-302fae8b0aef __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.174 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.174 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0192058
Jan 22 17:09:17 compute-0 haproxy-metadata-proxy-a2733777-0394-47df-88c8-302fae8b0aef[217012]: 10.100.0.7:39454 [22/Jan/2026:17:09:17.153] listener listener/metadata 0/0/0/20/20 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.183 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.183 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: a2733777-0394-47df-88c8-302fae8b0aef __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.198 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:17 compute-0 haproxy-metadata-proxy-a2733777-0394-47df-88c8-302fae8b0aef[217012]: 10.100.0.7:39456 [22/Jan/2026:17:09:17.182] listener listener/metadata 0/0/0/15/15 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.198 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 165 time: 0.0148208
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.204 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.204 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: a2733777-0394-47df-88c8-302fae8b0aef __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.224 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:17 compute-0 haproxy-metadata-proxy-a2733777-0394-47df-88c8-302fae8b0aef[217012]: 10.100.0.7:39470 [22/Jan/2026:17:09:17.203] listener listener/metadata 0/0/0/21/21 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.225 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 165 time: 0.0207005
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.230 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.230 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: a2733777-0394-47df-88c8-302fae8b0aef __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.245 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0152080
Jan 22 17:09:17 compute-0 haproxy-metadata-proxy-a2733777-0394-47df-88c8-302fae8b0aef[217012]: 10.100.0.7:39472 [22/Jan/2026:17:09:17.229] listener listener/metadata 0/0/0/16/16 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.254 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.255 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: a2733777-0394-47df-88c8-302fae8b0aef __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.273 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:17 compute-0 haproxy-metadata-proxy-a2733777-0394-47df-88c8-302fae8b0aef[217012]: 10.100.0.7:39482 [22/Jan/2026:17:09:17.253] listener listener/metadata 0/0/0/20/20 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.274 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0187981
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.278 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.278 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: a2733777-0394-47df-88c8-302fae8b0aef __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.298 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:17 compute-0 haproxy-metadata-proxy-a2733777-0394-47df-88c8-302fae8b0aef[217012]: 10.100.0.7:39486 [22/Jan/2026:17:09:17.277] listener listener/metadata 0/0/0/20/20 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.298 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0199480
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.303 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.304 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: a2733777-0394-47df-88c8-302fae8b0aef __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.321 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:17 compute-0 haproxy-metadata-proxy-a2733777-0394-47df-88c8-302fae8b0aef[217012]: 10.100.0.7:39490 [22/Jan/2026:17:09:17.302] listener listener/metadata 0/0/0/19/19 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.322 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0183406
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.326 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.327 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: a2733777-0394-47df-88c8-302fae8b0aef __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.343 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.344 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 165 time: 0.0171983
Jan 22 17:09:17 compute-0 haproxy-metadata-proxy-a2733777-0394-47df-88c8-302fae8b0aef[217012]: 10.100.0.7:39500 [22/Jan/2026:17:09:17.326] listener listener/metadata 0/0/0/18/18 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.348 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.349 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: a2733777-0394-47df-88c8-302fae8b0aef __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.374 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:17.375 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0263739
Jan 22 17:09:17 compute-0 haproxy-metadata-proxy-a2733777-0394-47df-88c8-302fae8b0aef[217012]: 10.100.0.7:39516 [22/Jan/2026:17:09:17.347] listener listener/metadata 0/0/0/27/27 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:09:17 compute-0 nova_compute[183075]: 2026-01-22 17:09:17.892 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769101742.8913739, fa77228b-8be7-4bab-9a40-7241201bdbff => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:09:17 compute-0 nova_compute[183075]: 2026-01-22 17:09:17.893 183079 INFO nova.compute.manager [-] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] VM Stopped (Lifecycle Event)
Jan 22 17:09:17 compute-0 nova_compute[183075]: 2026-01-22 17:09:17.918 183079 DEBUG nova.compute.manager [None req-3b81ac44-3434-4cd8-9f0e-10ad16c6fa1a - - - - - -] [instance: fa77228b-8be7-4bab-9a40-7241201bdbff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.140 183079 INFO nova.compute.manager [None req-e4ef603c-13a2-458b-bbe7-4ee87fd3824b 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Get console output
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.146 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.554 183079 DEBUG nova.network.neutron [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Updating instance_info_cache with network_info: [{"id": "877fd3c4-01ce-4616-b9d4-92cf337f7f6f", "address": "fa:16:3e:4e:ec:b6", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap877fd3c4-01", "ovs_interfaceid": "877fd3c4-01ce-4616-b9d4-92cf337f7f6f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.587 183079 DEBUG oslo_concurrency.lockutils [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Releasing lock "refresh_cache-7d42beab-5bb4-43a0-9756-ced73188f5ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.587 183079 DEBUG nova.compute.manager [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Instance network_info: |[{"id": "877fd3c4-01ce-4616-b9d4-92cf337f7f6f", "address": "fa:16:3e:4e:ec:b6", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap877fd3c4-01", "ovs_interfaceid": "877fd3c4-01ce-4616-b9d4-92cf337f7f6f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.588 183079 DEBUG oslo_concurrency.lockutils [req-26cadd06-3349-48c2-94a1-844ccc67363d req-00c83120-4c9f-4af0-93d5-81398523c003 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-7d42beab-5bb4-43a0-9756-ced73188f5ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.588 183079 DEBUG nova.network.neutron [req-26cadd06-3349-48c2-94a1-844ccc67363d req-00c83120-4c9f-4af0-93d5-81398523c003 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Refreshing network info cache for port 877fd3c4-01ce-4616-b9d4-92cf337f7f6f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.594 183079 DEBUG nova.virt.libvirt.driver [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Start _get_guest_xml network_info=[{"id": "877fd3c4-01ce-4616-b9d4-92cf337f7f6f", "address": "fa:16:3e:4e:ec:b6", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap877fd3c4-01", "ovs_interfaceid": "877fd3c4-01ce-4616-b9d4-92cf337f7f6f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.601 183079 WARNING nova.virt.libvirt.driver [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.610 183079 DEBUG nova.virt.libvirt.host [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.613 183079 DEBUG nova.virt.libvirt.host [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.634 183079 DEBUG nova.virt.libvirt.host [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.634 183079 DEBUG nova.virt.libvirt.host [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.635 183079 DEBUG nova.virt.libvirt.driver [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.635 183079 DEBUG nova.virt.hardware [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.635 183079 DEBUG nova.virt.hardware [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.635 183079 DEBUG nova.virt.hardware [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.636 183079 DEBUG nova.virt.hardware [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.636 183079 DEBUG nova.virt.hardware [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.636 183079 DEBUG nova.virt.hardware [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.636 183079 DEBUG nova.virt.hardware [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.636 183079 DEBUG nova.virt.hardware [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.636 183079 DEBUG nova.virt.hardware [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.637 183079 DEBUG nova.virt.hardware [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.637 183079 DEBUG nova.virt.hardware [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.639 183079 DEBUG nova.virt.libvirt.vif [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:09:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1958859049',display_name='tempest-server-test-1958859049',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1958859049',id=10,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFroJSJbEvCYe2sCwXkwL5KPVfwqqRPNqrWU5nnhv8U4G8wYWbEqk5AtRWXltmX7CAIPBST+sTwCKfymhALKd54XNN92D6KKnKgA6jmOihENah4FGfU80fx1UEE0osLpiw==',key_name='tempest-keypair-test-618317104',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e05c7aae349e4a1d859a387df45650a0',ramdisk_id='',reservation_id='r-r79zmuj9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='
virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSeparateNetwork-931877966',owner_user_name='tempest-FloatingIpSeparateNetwork-931877966-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:09:14Z,user_data=None,user_id='cd47d63cff2548a88e21e5c2e6a5c161',uuid=7d42beab-5bb4-43a0-9756-ced73188f5ba,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "877fd3c4-01ce-4616-b9d4-92cf337f7f6f", "address": "fa:16:3e:4e:ec:b6", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap877fd3c4-01", "ovs_interfaceid": "877fd3c4-01ce-4616-b9d4-92cf337f7f6f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.640 183079 DEBUG nova.network.os_vif_util [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converting VIF {"id": "877fd3c4-01ce-4616-b9d4-92cf337f7f6f", "address": "fa:16:3e:4e:ec:b6", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap877fd3c4-01", "ovs_interfaceid": "877fd3c4-01ce-4616-b9d4-92cf337f7f6f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.640 183079 DEBUG nova.network.os_vif_util [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:ec:b6,bridge_name='br-int',has_traffic_filtering=True,id=877fd3c4-01ce-4616-b9d4-92cf337f7f6f,network=Network(0a16be1a-262e-47f7-8518-5f24ee15796e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap877fd3c4-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.641 183079 DEBUG nova.objects.instance [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7d42beab-5bb4-43a0-9756-ced73188f5ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.653 183079 DEBUG nova.virt.libvirt.driver [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:09:18 compute-0 nova_compute[183075]:   <uuid>7d42beab-5bb4-43a0-9756-ced73188f5ba</uuid>
Jan 22 17:09:18 compute-0 nova_compute[183075]:   <name>instance-0000000a</name>
Jan 22 17:09:18 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:09:18 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:09:18 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:09:18 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1958859049</nova:name>
Jan 22 17:09:18 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:09:18</nova:creationTime>
Jan 22 17:09:18 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:09:18 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:09:18 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:09:18 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:09:18 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:09:18 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:09:18 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:09:18 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:09:18 compute-0 nova_compute[183075]:         <nova:user uuid="cd47d63cff2548a88e21e5c2e6a5c161">tempest-FloatingIpSeparateNetwork-931877966-project-member</nova:user>
Jan 22 17:09:18 compute-0 nova_compute[183075]:         <nova:project uuid="e05c7aae349e4a1d859a387df45650a0">tempest-FloatingIpSeparateNetwork-931877966</nova:project>
Jan 22 17:09:18 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:09:18 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:09:18 compute-0 nova_compute[183075]:         <nova:port uuid="877fd3c4-01ce-4616-b9d4-92cf337f7f6f">
Jan 22 17:09:18 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.25" ipVersion="4"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:09:18 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:09:18 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:09:18 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <system>
Jan 22 17:09:18 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:09:18 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:09:18 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:09:18 compute-0 nova_compute[183075]:       <entry name="serial">7d42beab-5bb4-43a0-9756-ced73188f5ba</entry>
Jan 22 17:09:18 compute-0 nova_compute[183075]:       <entry name="uuid">7d42beab-5bb4-43a0-9756-ced73188f5ba</entry>
Jan 22 17:09:18 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     </system>
Jan 22 17:09:18 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:09:18 compute-0 nova_compute[183075]:   <os>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:   </os>
Jan 22 17:09:18 compute-0 nova_compute[183075]:   <features>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:   </features>
Jan 22 17:09:18 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:09:18 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:09:18 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:09:18 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/7d42beab-5bb4-43a0-9756-ced73188f5ba/disk"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:09:18 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:4e:ec:b6"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:       <target dev="tap877fd3c4-01"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:09:18 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/7d42beab-5bb4-43a0-9756-ced73188f5ba/console.log" append="off"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <video>
Jan 22 17:09:18 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     </video>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:09:18 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:09:18 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:09:18 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:09:18 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:09:18 compute-0 nova_compute[183075]: </domain>
Jan 22 17:09:18 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.654 183079 DEBUG nova.compute.manager [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Preparing to wait for external event network-vif-plugged-877fd3c4-01ce-4616-b9d4-92cf337f7f6f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.654 183079 DEBUG oslo_concurrency.lockutils [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "7d42beab-5bb4-43a0-9756-ced73188f5ba-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.654 183079 DEBUG oslo_concurrency.lockutils [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "7d42beab-5bb4-43a0-9756-ced73188f5ba-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.654 183079 DEBUG oslo_concurrency.lockutils [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "7d42beab-5bb4-43a0-9756-ced73188f5ba-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.655 183079 DEBUG nova.virt.libvirt.vif [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:09:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1958859049',display_name='tempest-server-test-1958859049',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1958859049',id=10,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFroJSJbEvCYe2sCwXkwL5KPVfwqqRPNqrWU5nnhv8U4G8wYWbEqk5AtRWXltmX7CAIPBST+sTwCKfymhALKd54XNN92D6KKnKgA6jmOihENah4FGfU80fx1UEE0osLpiw==',key_name='tempest-keypair-test-618317104',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e05c7aae349e4a1d859a387df45650a0',ramdisk_id='',reservation_id='r-r79zmuj9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_r
ng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSeparateNetwork-931877966',owner_user_name='tempest-FloatingIpSeparateNetwork-931877966-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:09:14Z,user_data=None,user_id='cd47d63cff2548a88e21e5c2e6a5c161',uuid=7d42beab-5bb4-43a0-9756-ced73188f5ba,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "877fd3c4-01ce-4616-b9d4-92cf337f7f6f", "address": "fa:16:3e:4e:ec:b6", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap877fd3c4-01", "ovs_interfaceid": "877fd3c4-01ce-4616-b9d4-92cf337f7f6f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.655 183079 DEBUG nova.network.os_vif_util [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converting VIF {"id": "877fd3c4-01ce-4616-b9d4-92cf337f7f6f", "address": "fa:16:3e:4e:ec:b6", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap877fd3c4-01", "ovs_interfaceid": "877fd3c4-01ce-4616-b9d4-92cf337f7f6f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.655 183079 DEBUG nova.network.os_vif_util [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:ec:b6,bridge_name='br-int',has_traffic_filtering=True,id=877fd3c4-01ce-4616-b9d4-92cf337f7f6f,network=Network(0a16be1a-262e-47f7-8518-5f24ee15796e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap877fd3c4-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.655 183079 DEBUG os_vif [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:ec:b6,bridge_name='br-int',has_traffic_filtering=True,id=877fd3c4-01ce-4616-b9d4-92cf337f7f6f,network=Network(0a16be1a-262e-47f7-8518-5f24ee15796e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap877fd3c4-01') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.656 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.656 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.656 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.658 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.659 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap877fd3c4-01, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.659 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap877fd3c4-01, col_values=(('external_ids', {'iface-id': '877fd3c4-01ce-4616-b9d4-92cf337f7f6f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4e:ec:b6', 'vm-uuid': '7d42beab-5bb4-43a0-9756-ced73188f5ba'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.660 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:18 compute-0 NetworkManager[55454]: <info>  [1769101758.6613] manager: (tap877fd3c4-01): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.663 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.666 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.666 183079 INFO os_vif [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:ec:b6,bridge_name='br-int',has_traffic_filtering=True,id=877fd3c4-01ce-4616-b9d4-92cf337f7f6f,network=Network(0a16be1a-262e-47f7-8518-5f24ee15796e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap877fd3c4-01')
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.716 183079 DEBUG nova.virt.libvirt.driver [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.716 183079 DEBUG nova.virt.libvirt.driver [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] No VIF found with MAC fa:16:3e:4e:ec:b6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:09:18 compute-0 kernel: tap877fd3c4-01: entered promiscuous mode
Jan 22 17:09:18 compute-0 NetworkManager[55454]: <info>  [1769101758.8065] manager: (tap877fd3c4-01): new Tun device (/org/freedesktop/NetworkManager/Devices/63)
Jan 22 17:09:18 compute-0 ovn_controller[95372]: 2026-01-22T17:09:18Z|00128|binding|INFO|Claiming lport 877fd3c4-01ce-4616-b9d4-92cf337f7f6f for this chassis.
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.808 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:18 compute-0 ovn_controller[95372]: 2026-01-22T17:09:18Z|00129|binding|INFO|877fd3c4-01ce-4616-b9d4-92cf337f7f6f: Claiming fa:16:3e:4e:ec:b6 10.100.0.25
Jan 22 17:09:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:18.823 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:ec:b6 10.100.0.25'], port_security=['fa:16:3e:4e:ec:b6 10.100.0.25'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.25/28', 'neutron:device_id': '7d42beab-5bb4-43a0-9756-ced73188f5ba', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a16be1a-262e-47f7-8518-5f24ee15796e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e05c7aae349e4a1d859a387df45650a0', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b9cc1ac9-85f4-4da7-891e-b0644711e574', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.222'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5b6ccb16-1216-4deb-9d72-42005a3163bb, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=877fd3c4-01ce-4616-b9d4-92cf337f7f6f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:09:18 compute-0 ovn_controller[95372]: 2026-01-22T17:09:18Z|00130|binding|INFO|Setting lport 877fd3c4-01ce-4616-b9d4-92cf337f7f6f ovn-installed in OVS
Jan 22 17:09:18 compute-0 ovn_controller[95372]: 2026-01-22T17:09:18Z|00131|binding|INFO|Setting lport 877fd3c4-01ce-4616-b9d4-92cf337f7f6f up in Southbound
Jan 22 17:09:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:18.825 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 877fd3c4-01ce-4616-b9d4-92cf337f7f6f in datapath 0a16be1a-262e-47f7-8518-5f24ee15796e bound to our chassis
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.826 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:18.829 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0a16be1a-262e-47f7-8518-5f24ee15796e
Jan 22 17:09:18 compute-0 nova_compute[183075]: 2026-01-22 17:09:18.830 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:18 compute-0 systemd-udevd[217248]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:09:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:18.840 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[20277f8a-cab3-4177-84d3-10b7043d085b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:18.841 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0a16be1a-21 in ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:09:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:18.844 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0a16be1a-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:09:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:18.844 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[677d225e-8a68-4e2e-ab22-6dbf95500b73]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:18.845 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e1382500-ab64-40dc-a41b-5445c5c7126c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:18 compute-0 NetworkManager[55454]: <info>  [1769101758.8592] device (tap877fd3c4-01): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:09:18 compute-0 NetworkManager[55454]: <info>  [1769101758.8597] device (tap877fd3c4-01): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:09:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:18.854 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[e360b984-bc56-443e-9cf4-cd544a974414]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:18 compute-0 systemd-machined[154382]: New machine qemu-10-instance-0000000a.
Jan 22 17:09:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:18.872 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[9b202c0c-8d46-47ff-8ae4-06d92cf3ae42]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:18 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-0000000a.
Jan 22 17:09:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:18.904 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[57eefd5d-e2bf-40d0-87e7-042617fec153]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:18.909 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2537afed-e53b-4203-b70c-54eb128f2373]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:18 compute-0 NetworkManager[55454]: <info>  [1769101758.9102] manager: (tap0a16be1a-20): new Veth device (/org/freedesktop/NetworkManager/Devices/64)
Jan 22 17:09:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:18.936 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[e0ac051e-18bd-44a2-b52f-34f41a16a2e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:18.939 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[208c0078-b2b6-4526-b16d-5e254b7886aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:18 compute-0 NetworkManager[55454]: <info>  [1769101758.9588] device (tap0a16be1a-20): carrier: link connected
Jan 22 17:09:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:18.963 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[0928d77a-a941-42c1-806d-204b86bf80b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:18.977 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[38672e77-5e34-43a8-9809-c1bd1148fa32]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0a16be1a-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ba:16:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 412266, 'reachable_time': 18583, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217283, 'error': None, 'target': 'ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:18.992 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3b03a6c9-200a-448f-a5ad-1bc1eeafbe92]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feba:16c2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 412266, 'tstamp': 412266}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217284, 'error': None, 'target': 'ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:19.005 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[03f96c0e-b105-4384-8339-89227688fbaf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0a16be1a-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ba:16:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 412266, 'reachable_time': 18583, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217285, 'error': None, 'target': 'ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:19.032 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1d4ec650-ca11-4a30-b2c9-3d63e532fa3c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:19.087 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[fc4763d8-5d7d-413d-9078-54f32ca0690b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:19.091 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a16be1a-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:19.092 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:19.092 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0a16be1a-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:09:19 compute-0 nova_compute[183075]: 2026-01-22 17:09:19.093 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:19 compute-0 kernel: tap0a16be1a-20: entered promiscuous mode
Jan 22 17:09:19 compute-0 NetworkManager[55454]: <info>  [1769101759.0948] manager: (tap0a16be1a-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Jan 22 17:09:19 compute-0 nova_compute[183075]: 2026-01-22 17:09:19.095 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:19.096 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0a16be1a-20, col_values=(('external_ids', {'iface-id': 'f5af8e72-5100-4440-84f0-c68eec4b5e5e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:09:19 compute-0 nova_compute[183075]: 2026-01-22 17:09:19.097 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:19 compute-0 ovn_controller[95372]: 2026-01-22T17:09:19Z|00132|binding|INFO|Releasing lport f5af8e72-5100-4440-84f0-c68eec4b5e5e from this chassis (sb_readonly=0)
Jan 22 17:09:19 compute-0 nova_compute[183075]: 2026-01-22 17:09:19.108 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:19.109 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0a16be1a-262e-47f7-8518-5f24ee15796e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0a16be1a-262e-47f7-8518-5f24ee15796e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:19.110 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c06cec95-6c55-44d6-bc97-5adfb84e8168]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:19.110 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/0a16be1a-262e-47f7-8518-5f24ee15796e.pid.haproxy
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 0a16be1a-262e-47f7-8518-5f24ee15796e
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:09:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:19.111 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e', 'env', 'PROCESS_TAG=haproxy-0a16be1a-262e-47f7-8518-5f24ee15796e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0a16be1a-262e-47f7-8518-5f24ee15796e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:09:19 compute-0 nova_compute[183075]: 2026-01-22 17:09:19.234 183079 DEBUG nova.compute.manager [req-d1af1b20-cb4c-4027-bbf3-8ee96621caa8 req-9d7fbaa7-dcd8-4a70-b0b5-6db7c81e2512 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Received event network-vif-plugged-877fd3c4-01ce-4616-b9d4-92cf337f7f6f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:09:19 compute-0 nova_compute[183075]: 2026-01-22 17:09:19.235 183079 DEBUG oslo_concurrency.lockutils [req-d1af1b20-cb4c-4027-bbf3-8ee96621caa8 req-9d7fbaa7-dcd8-4a70-b0b5-6db7c81e2512 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "7d42beab-5bb4-43a0-9756-ced73188f5ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:19 compute-0 nova_compute[183075]: 2026-01-22 17:09:19.235 183079 DEBUG oslo_concurrency.lockutils [req-d1af1b20-cb4c-4027-bbf3-8ee96621caa8 req-9d7fbaa7-dcd8-4a70-b0b5-6db7c81e2512 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7d42beab-5bb4-43a0-9756-ced73188f5ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:19 compute-0 nova_compute[183075]: 2026-01-22 17:09:19.236 183079 DEBUG oslo_concurrency.lockutils [req-d1af1b20-cb4c-4027-bbf3-8ee96621caa8 req-9d7fbaa7-dcd8-4a70-b0b5-6db7c81e2512 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7d42beab-5bb4-43a0-9756-ced73188f5ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:19 compute-0 nova_compute[183075]: 2026-01-22 17:09:19.236 183079 DEBUG nova.compute.manager [req-d1af1b20-cb4c-4027-bbf3-8ee96621caa8 req-9d7fbaa7-dcd8-4a70-b0b5-6db7c81e2512 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Processing event network-vif-plugged-877fd3c4-01ce-4616-b9d4-92cf337f7f6f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:09:19 compute-0 podman[217316]: 2026-01-22 17:09:19.538637306 +0000 UTC m=+0.081078471 container create ea33e279bc93bc8b13a6332e3a5396a084bd7749415e094cacd8ad958b29f27a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:09:19 compute-0 podman[217316]: 2026-01-22 17:09:19.48713048 +0000 UTC m=+0.029571725 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:09:19 compute-0 systemd[1]: Started libpod-conmon-ea33e279bc93bc8b13a6332e3a5396a084bd7749415e094cacd8ad958b29f27a.scope.
Jan 22 17:09:19 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:09:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9e0e3b4cfe372c25a86bba776e808b8aac9c27de621d24b19c7730dce1323361/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:09:19 compute-0 podman[217316]: 2026-01-22 17:09:19.647721739 +0000 UTC m=+0.190162984 container init ea33e279bc93bc8b13a6332e3a5396a084bd7749415e094cacd8ad958b29f27a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 17:09:19 compute-0 podman[217316]: 2026-01-22 17:09:19.653579572 +0000 UTC m=+0.196020767 container start ea33e279bc93bc8b13a6332e3a5396a084bd7749415e094cacd8ad958b29f27a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:09:19 compute-0 neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e[217331]: [NOTICE]   (217335) : New worker (217337) forked
Jan 22 17:09:19 compute-0 neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e[217331]: [NOTICE]   (217335) : Loading success.
Jan 22 17:09:19 compute-0 nova_compute[183075]: 2026-01-22 17:09:19.937 183079 DEBUG nova.compute.manager [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:09:19 compute-0 nova_compute[183075]: 2026-01-22 17:09:19.937 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101759.9365482, 7d42beab-5bb4-43a0-9756-ced73188f5ba => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:09:19 compute-0 nova_compute[183075]: 2026-01-22 17:09:19.938 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] VM Started (Lifecycle Event)
Jan 22 17:09:19 compute-0 nova_compute[183075]: 2026-01-22 17:09:19.942 183079 DEBUG nova.virt.libvirt.driver [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:09:19 compute-0 nova_compute[183075]: 2026-01-22 17:09:19.944 183079 INFO nova.virt.libvirt.driver [-] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Instance spawned successfully.
Jan 22 17:09:19 compute-0 nova_compute[183075]: 2026-01-22 17:09:19.945 183079 DEBUG nova.virt.libvirt.driver [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:09:19 compute-0 nova_compute[183075]: 2026-01-22 17:09:19.974 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:19 compute-0 nova_compute[183075]: 2026-01-22 17:09:19.978 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:09:19 compute-0 nova_compute[183075]: 2026-01-22 17:09:19.982 183079 DEBUG nova.virt.libvirt.driver [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:09:19 compute-0 nova_compute[183075]: 2026-01-22 17:09:19.982 183079 DEBUG nova.virt.libvirt.driver [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:09:19 compute-0 nova_compute[183075]: 2026-01-22 17:09:19.982 183079 DEBUG nova.virt.libvirt.driver [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:09:19 compute-0 nova_compute[183075]: 2026-01-22 17:09:19.983 183079 DEBUG nova.virt.libvirt.driver [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:09:19 compute-0 nova_compute[183075]: 2026-01-22 17:09:19.983 183079 DEBUG nova.virt.libvirt.driver [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:09:19 compute-0 nova_compute[183075]: 2026-01-22 17:09:19.984 183079 DEBUG nova.virt.libvirt.driver [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:09:19 compute-0 nova_compute[183075]: 2026-01-22 17:09:19.989 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:09:20 compute-0 nova_compute[183075]: 2026-01-22 17:09:20.031 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:09:20 compute-0 nova_compute[183075]: 2026-01-22 17:09:20.032 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101759.9383974, 7d42beab-5bb4-43a0-9756-ced73188f5ba => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:09:20 compute-0 nova_compute[183075]: 2026-01-22 17:09:20.032 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] VM Paused (Lifecycle Event)
Jan 22 17:09:20 compute-0 nova_compute[183075]: 2026-01-22 17:09:20.053 183079 INFO nova.compute.manager [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Took 5.81 seconds to spawn the instance on the hypervisor.
Jan 22 17:09:20 compute-0 nova_compute[183075]: 2026-01-22 17:09:20.053 183079 DEBUG nova.compute.manager [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:09:20 compute-0 nova_compute[183075]: 2026-01-22 17:09:20.054 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:09:20 compute-0 nova_compute[183075]: 2026-01-22 17:09:20.060 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101759.9402056, 7d42beab-5bb4-43a0-9756-ced73188f5ba => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:09:20 compute-0 nova_compute[183075]: 2026-01-22 17:09:20.061 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] VM Resumed (Lifecycle Event)
Jan 22 17:09:20 compute-0 nova_compute[183075]: 2026-01-22 17:09:20.096 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:09:20 compute-0 nova_compute[183075]: 2026-01-22 17:09:20.100 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:09:20 compute-0 nova_compute[183075]: 2026-01-22 17:09:20.118 183079 INFO nova.compute.manager [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Took 6.30 seconds to build instance.
Jan 22 17:09:20 compute-0 nova_compute[183075]: 2026-01-22 17:09:20.132 183079 DEBUG oslo_concurrency.lockutils [None req-bc017e0f-427c-46db-bc09-1a5185d0e68e cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "7d42beab-5bb4-43a0-9756-ced73188f5ba" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.392s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:21 compute-0 nova_compute[183075]: 2026-01-22 17:09:21.426 183079 DEBUG nova.compute.manager [req-ebd5dc1f-e77a-4e51-9ceb-68dfd31dd620 req-a6b0a4d9-6aca-4f1d-a700-bf5c54359039 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Received event network-vif-plugged-877fd3c4-01ce-4616-b9d4-92cf337f7f6f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:09:21 compute-0 nova_compute[183075]: 2026-01-22 17:09:21.426 183079 DEBUG oslo_concurrency.lockutils [req-ebd5dc1f-e77a-4e51-9ceb-68dfd31dd620 req-a6b0a4d9-6aca-4f1d-a700-bf5c54359039 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "7d42beab-5bb4-43a0-9756-ced73188f5ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:21 compute-0 nova_compute[183075]: 2026-01-22 17:09:21.427 183079 DEBUG oslo_concurrency.lockutils [req-ebd5dc1f-e77a-4e51-9ceb-68dfd31dd620 req-a6b0a4d9-6aca-4f1d-a700-bf5c54359039 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7d42beab-5bb4-43a0-9756-ced73188f5ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:21 compute-0 nova_compute[183075]: 2026-01-22 17:09:21.427 183079 DEBUG oslo_concurrency.lockutils [req-ebd5dc1f-e77a-4e51-9ceb-68dfd31dd620 req-a6b0a4d9-6aca-4f1d-a700-bf5c54359039 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7d42beab-5bb4-43a0-9756-ced73188f5ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:21 compute-0 nova_compute[183075]: 2026-01-22 17:09:21.427 183079 DEBUG nova.compute.manager [req-ebd5dc1f-e77a-4e51-9ceb-68dfd31dd620 req-a6b0a4d9-6aca-4f1d-a700-bf5c54359039 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] No waiting events found dispatching network-vif-plugged-877fd3c4-01ce-4616-b9d4-92cf337f7f6f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:09:21 compute-0 nova_compute[183075]: 2026-01-22 17:09:21.427 183079 WARNING nova.compute.manager [req-ebd5dc1f-e77a-4e51-9ceb-68dfd31dd620 req-a6b0a4d9-6aca-4f1d-a700-bf5c54359039 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Received unexpected event network-vif-plugged-877fd3c4-01ce-4616-b9d4-92cf337f7f6f for instance with vm_state active and task_state None.
Jan 22 17:09:21 compute-0 nova_compute[183075]: 2026-01-22 17:09:21.452 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769101746.4472437, 4bb7efdc-59ab-46cd-ae0d-582182c85f5b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:09:21 compute-0 nova_compute[183075]: 2026-01-22 17:09:21.463 183079 INFO nova.compute.manager [-] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] VM Stopped (Lifecycle Event)
Jan 22 17:09:21 compute-0 nova_compute[183075]: 2026-01-22 17:09:21.491 183079 DEBUG nova.compute.manager [None req-ce9a02f7-9591-423e-bb9f-af2b5557a1be - - - - - -] [instance: 4bb7efdc-59ab-46cd-ae0d-582182c85f5b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:09:22 compute-0 nova_compute[183075]: 2026-01-22 17:09:22.253 183079 DEBUG nova.network.neutron [req-26cadd06-3349-48c2-94a1-844ccc67363d req-00c83120-4c9f-4af0-93d5-81398523c003 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Updated VIF entry in instance network info cache for port 877fd3c4-01ce-4616-b9d4-92cf337f7f6f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:09:22 compute-0 nova_compute[183075]: 2026-01-22 17:09:22.254 183079 DEBUG nova.network.neutron [req-26cadd06-3349-48c2-94a1-844ccc67363d req-00c83120-4c9f-4af0-93d5-81398523c003 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Updating instance_info_cache with network_info: [{"id": "877fd3c4-01ce-4616-b9d4-92cf337f7f6f", "address": "fa:16:3e:4e:ec:b6", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap877fd3c4-01", "ovs_interfaceid": "877fd3c4-01ce-4616-b9d4-92cf337f7f6f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:09:22 compute-0 nova_compute[183075]: 2026-01-22 17:09:22.271 183079 DEBUG oslo_concurrency.lockutils [req-26cadd06-3349-48c2-94a1-844ccc67363d req-00c83120-4c9f-4af0-93d5-81398523c003 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-7d42beab-5bb4-43a0-9756-ced73188f5ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:09:22 compute-0 podman[217354]: 2026-01-22 17:09:22.347836265 +0000 UTC m=+0.052816522 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:09:22 compute-0 podman[217353]: 2026-01-22 17:09:22.413520973 +0000 UTC m=+0.119815974 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 17:09:22 compute-0 nova_compute[183075]: 2026-01-22 17:09:22.425 183079 INFO nova.compute.manager [None req-84905ab1-0a06-4d26-acde-39a8f02ffd57 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Get console output
Jan 22 17:09:22 compute-0 nova_compute[183075]: 2026-01-22 17:09:22.433 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:09:23 compute-0 nova_compute[183075]: 2026-01-22 17:09:23.479 183079 INFO nova.compute.manager [None req-ec1588b8-85c5-4290-8230-7a161e04552a 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Get console output
Jan 22 17:09:23 compute-0 nova_compute[183075]: 2026-01-22 17:09:23.488 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:09:23 compute-0 nova_compute[183075]: 2026-01-22 17:09:23.661 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:24 compute-0 podman[217399]: 2026-01-22 17:09:24.344294331 +0000 UTC m=+0.054825845 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, managed_by=edpm_ansible, name=ubi9-minimal, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, architecture=x86_64)
Jan 22 17:09:24 compute-0 nova_compute[183075]: 2026-01-22 17:09:24.977 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:27 compute-0 nova_compute[183075]: 2026-01-22 17:09:27.636 183079 INFO nova.compute.manager [None req-0d589e60-c9bd-42b4-a2c0-a2b94beeb438 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Get console output
Jan 22 17:09:27 compute-0 nova_compute[183075]: 2026-01-22 17:09:27.641 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:09:28 compute-0 nova_compute[183075]: 2026-01-22 17:09:28.335 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:28 compute-0 nova_compute[183075]: 2026-01-22 17:09:28.720 183079 INFO nova.compute.manager [None req-96d97595-2dcf-4ccd-b1d8-74c9297f43d8 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Get console output
Jan 22 17:09:28 compute-0 nova_compute[183075]: 2026-01-22 17:09:28.721 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:28 compute-0 nova_compute[183075]: 2026-01-22 17:09:28.724 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:09:29 compute-0 podman[217421]: 2026-01-22 17:09:29.377963689 +0000 UTC m=+0.091131074 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Jan 22 17:09:29 compute-0 nova_compute[183075]: 2026-01-22 17:09:29.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:09:29 compute-0 nova_compute[183075]: 2026-01-22 17:09:29.980 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:31 compute-0 nova_compute[183075]: 2026-01-22 17:09:31.761 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:09:31 compute-0 nova_compute[183075]: 2026-01-22 17:09:31.801 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Triggering sync for uuid ee33030a-2035-4fd1-8de4-261142b89bc6 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 22 17:09:31 compute-0 nova_compute[183075]: 2026-01-22 17:09:31.801 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Triggering sync for uuid 1fa7475b-9f51-4229-8ded-3a0c4de806c5 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 22 17:09:31 compute-0 nova_compute[183075]: 2026-01-22 17:09:31.802 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Triggering sync for uuid 7d42beab-5bb4-43a0-9756-ced73188f5ba _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 22 17:09:31 compute-0 nova_compute[183075]: 2026-01-22 17:09:31.802 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "ee33030a-2035-4fd1-8de4-261142b89bc6" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:31 compute-0 nova_compute[183075]: 2026-01-22 17:09:31.802 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "ee33030a-2035-4fd1-8de4-261142b89bc6" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:31 compute-0 nova_compute[183075]: 2026-01-22 17:09:31.803 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "1fa7475b-9f51-4229-8ded-3a0c4de806c5" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:31 compute-0 nova_compute[183075]: 2026-01-22 17:09:31.803 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "1fa7475b-9f51-4229-8ded-3a0c4de806c5" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:31 compute-0 nova_compute[183075]: 2026-01-22 17:09:31.804 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "7d42beab-5bb4-43a0-9756-ced73188f5ba" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:31 compute-0 nova_compute[183075]: 2026-01-22 17:09:31.804 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "7d42beab-5bb4-43a0-9756-ced73188f5ba" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:31 compute-0 ovn_controller[95372]: 2026-01-22T17:09:31Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4e:ec:b6 10.100.0.25
Jan 22 17:09:31 compute-0 ovn_controller[95372]: 2026-01-22T17:09:31Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4e:ec:b6 10.100.0.25
Jan 22 17:09:31 compute-0 nova_compute[183075]: 2026-01-22 17:09:31.896 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "ee33030a-2035-4fd1-8de4-261142b89bc6" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:31 compute-0 nova_compute[183075]: 2026-01-22 17:09:31.898 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "1fa7475b-9f51-4229-8ded-3a0c4de806c5" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:31 compute-0 nova_compute[183075]: 2026-01-22 17:09:31.906 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "7d42beab-5bb4-43a0-9756-ced73188f5ba" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:32 compute-0 nova_compute[183075]: 2026-01-22 17:09:32.087 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:32 compute-0 nova_compute[183075]: 2026-01-22 17:09:32.222 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:32 compute-0 nova_compute[183075]: 2026-01-22 17:09:32.809 183079 INFO nova.compute.manager [None req-3b107289-de39-457b-87ca-99dbf70c99c5 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Get console output
Jan 22 17:09:32 compute-0 nova_compute[183075]: 2026-01-22 17:09:32.814 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:09:33 compute-0 nova_compute[183075]: 2026-01-22 17:09:33.724 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:33 compute-0 nova_compute[183075]: 2026-01-22 17:09:33.890 183079 INFO nova.compute.manager [None req-32dae1de-9c68-4f15-b233-bceb242f49c4 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Get console output
Jan 22 17:09:33 compute-0 nova_compute[183075]: 2026-01-22 17:09:33.894 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:09:35 compute-0 nova_compute[183075]: 2026-01-22 17:09:35.024 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:35 compute-0 nova_compute[183075]: 2026-01-22 17:09:35.989 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:36 compute-0 podman[217462]: 2026-01-22 17:09:36.352488929 +0000 UTC m=+0.054024304 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 17:09:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:38.060 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:09:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:38.061 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:09:38 compute-0 nova_compute[183075]: 2026-01-22 17:09:38.062 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:38 compute-0 nova_compute[183075]: 2026-01-22 17:09:38.341 183079 INFO nova.compute.manager [None req-c37366b7-d790-4d89-b68b-e447a9b7c38a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Get console output
Jan 22 17:09:38 compute-0 nova_compute[183075]: 2026-01-22 17:09:38.348 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:09:38 compute-0 nova_compute[183075]: 2026-01-22 17:09:38.727 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:39 compute-0 nova_compute[183075]: 2026-01-22 17:09:39.035 183079 INFO nova.compute.manager [None req-9d90e8a7-546e-4c02-994d-559c55f3f4a4 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Get console output
Jan 22 17:09:39 compute-0 nova_compute[183075]: 2026-01-22 17:09:39.039 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.051 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.052 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.25
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:39 compute-0 nova_compute[183075]: 2026-01-22 17:09:39.285 183079 DEBUG oslo_concurrency.lockutils [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Acquiring lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:39 compute-0 nova_compute[183075]: 2026-01-22 17:09:39.286 183079 DEBUG oslo_concurrency.lockutils [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:39 compute-0 nova_compute[183075]: 2026-01-22 17:09:39.309 183079 DEBUG nova.compute.manager [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:09:39 compute-0 nova_compute[183075]: 2026-01-22 17:09:39.384 183079 DEBUG oslo_concurrency.lockutils [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "e4683d56-25f3-42a9-aedd-1b076e9a5245" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:39 compute-0 nova_compute[183075]: 2026-01-22 17:09:39.384 183079 DEBUG oslo_concurrency.lockutils [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "e4683d56-25f3-42a9-aedd-1b076e9a5245" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:39 compute-0 nova_compute[183075]: 2026-01-22 17:09:39.396 183079 DEBUG oslo_concurrency.lockutils [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:39 compute-0 nova_compute[183075]: 2026-01-22 17:09:39.396 183079 DEBUG oslo_concurrency.lockutils [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:39 compute-0 nova_compute[183075]: 2026-01-22 17:09:39.403 183079 DEBUG nova.compute.manager [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:09:39 compute-0 nova_compute[183075]: 2026-01-22 17:09:39.413 183079 DEBUG nova.virt.hardware [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:09:39 compute-0 nova_compute[183075]: 2026-01-22 17:09:39.413 183079 INFO nova.compute.claims [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.438 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.438 104990 INFO eventlet.wsgi.server [-] 10.100.0.25,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.3867054
Jan 22 17:09:39 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[217337]: 10.100.0.25:37450 [22/Jan/2026:17:09:39.050] listener listener/metadata 0/0/0/388/388 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.449 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.450 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.25
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.474 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:39 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[217337]: 10.100.0.25:37454 [22/Jan/2026:17:09:39.448] listener listener/metadata 0/0/0/26/26 200 152 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.475 104990 INFO eventlet.wsgi.server [-] 10.100.0.25,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 168 time: 0.0249934
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.482 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.483 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.25
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:39 compute-0 nova_compute[183075]: 2026-01-22 17:09:39.496 183079 DEBUG oslo_concurrency.lockutils [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.502 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.503 104990 INFO eventlet.wsgi.server [-] 10.100.0.25,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0201821
Jan 22 17:09:39 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[217337]: 10.100.0.25:37464 [22/Jan/2026:17:09:39.481] listener listener/metadata 0/0/0/21/21 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.511 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.512 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.25
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.534 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:39 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[217337]: 10.100.0.25:37476 [22/Jan/2026:17:09:39.511] listener listener/metadata 0/0/0/23/23 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.534 104990 INFO eventlet.wsgi.server [-] 10.100.0.25,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0219326
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.543 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.545 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.25
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.571 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.572 104990 INFO eventlet.wsgi.server [-] 10.100.0.25,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0265486
Jan 22 17:09:39 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[217337]: 10.100.0.25:37480 [22/Jan/2026:17:09:39.542] listener listener/metadata 0/0/0/29/29 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.580 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.581 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.25
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.636 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.637 104990 INFO eventlet.wsgi.server [-] 10.100.0.25,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0558078
Jan 22 17:09:39 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[217337]: 10.100.0.25:37486 [22/Jan/2026:17:09:39.580] listener listener/metadata 0/0/0/57/57 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.645 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.646 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.25
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.672 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.673 104990 INFO eventlet.wsgi.server [-] 10.100.0.25,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0266094
Jan 22 17:09:39 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[217337]: 10.100.0.25:37500 [22/Jan/2026:17:09:39.645] listener listener/metadata 0/0/0/28/28 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.677 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.678 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.25
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.699 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:39 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[217337]: 10.100.0.25:37512 [22/Jan/2026:17:09:39.677] listener listener/metadata 0/0/0/22/22 200 135 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:09:39 compute-0 nova_compute[183075]: 2026-01-22 17:09:39.699 183079 DEBUG nova.compute.provider_tree [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.700 104990 INFO eventlet.wsgi.server [-] 10.100.0.25,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 151 time: 0.0221863
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.709 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.709 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.25
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:39 compute-0 nova_compute[183075]: 2026-01-22 17:09:39.716 183079 DEBUG nova.scheduler.client.report [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.728 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:39 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[217337]: 10.100.0.25:37516 [22/Jan/2026:17:09:39.709] listener listener/metadata 0/0/0/20/20 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.729 104990 INFO eventlet.wsgi.server [-] 10.100.0.25,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0200303
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.737 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.738 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.25
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:39 compute-0 nova_compute[183075]: 2026-01-22 17:09:39.740 183079 DEBUG oslo_concurrency.lockutils [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.343s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:39 compute-0 nova_compute[183075]: 2026-01-22 17:09:39.740 183079 DEBUG nova.compute.manager [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:09:39 compute-0 nova_compute[183075]: 2026-01-22 17:09:39.742 183079 DEBUG oslo_concurrency.lockutils [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.246s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:39 compute-0 nova_compute[183075]: 2026-01-22 17:09:39.749 183079 DEBUG nova.virt.hardware [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:09:39 compute-0 nova_compute[183075]: 2026-01-22 17:09:39.749 183079 INFO nova.compute.claims [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.757 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.758 104990 INFO eventlet.wsgi.server [-] 10.100.0.25,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0198271
Jan 22 17:09:39 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[217337]: 10.100.0.25:37520 [22/Jan/2026:17:09:39.736] listener listener/metadata 0/0/0/21/21 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.770 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.770 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.25
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:39 compute-0 nova_compute[183075]: 2026-01-22 17:09:39.818 183079 DEBUG nova.compute.manager [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:09:39 compute-0 nova_compute[183075]: 2026-01-22 17:09:39.820 183079 DEBUG nova.network.neutron [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.948 104990 INFO eventlet.wsgi.server [-] 10.100.0.25,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.1778007
Jan 22 17:09:39 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[217337]: 10.100.0.25:37536 [22/Jan/2026:17:09:39.769] listener listener/metadata 0/0/0/179/179 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:09:39 compute-0 nova_compute[183075]: 2026-01-22 17:09:39.954 183079 INFO nova.virt.libvirt.driver [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.967 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.968 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.25
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:39 compute-0 nova_compute[183075]: 2026-01-22 17:09:39.971 183079 DEBUG nova.compute.manager [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.996 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:39.996 104990 INFO eventlet.wsgi.server [-] 10.100.0.25,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0283608
Jan 22 17:09:39 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[217337]: 10.100.0.25:37552 [22/Jan/2026:17:09:39.966] listener listener/metadata 0/0/0/29/29 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:40.002 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:40.003 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.25
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:40.025 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:40.025 104990 INFO eventlet.wsgi.server [-] 10.100.0.25,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0219073
Jan 22 17:09:40 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[217337]: 10.100.0.25:37568 [22/Jan/2026:17:09:40.002] listener listener/metadata 0/0/0/22/22 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.027 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:40.031 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:40.032 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.25
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:40.047 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:40.048 104990 INFO eventlet.wsgi.server [-] 10.100.0.25,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0159492
Jan 22 17:09:40 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[217337]: 10.100.0.25:37580 [22/Jan/2026:17:09:40.031] listener listener/metadata 0/0/0/16/16 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:40.054 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:40.054 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.25
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:40.069 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:40.070 104990 INFO eventlet.wsgi.server [-] 10.100.0.25,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0154283
Jan 22 17:09:40 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[217337]: 10.100.0.25:37586 [22/Jan/2026:17:09:40.053] listener listener/metadata 0/0/0/16/16 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.071 183079 DEBUG nova.compute.manager [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.073 183079 DEBUG nova.virt.libvirt.driver [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.074 183079 INFO nova.virt.libvirt.driver [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Creating image(s)
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.076 183079 DEBUG oslo_concurrency.lockutils [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Acquiring lock "/var/lib/nova/instances/7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.077 183079 DEBUG oslo_concurrency.lockutils [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "/var/lib/nova/instances/7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.077 183079 DEBUG oslo_concurrency.lockutils [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "/var/lib/nova/instances/7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:40.082 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:40.083 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.25
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.093 183079 DEBUG oslo_concurrency.processutils [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:40.096 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:09:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:40.096 104990 INFO eventlet.wsgi.server [-] 10.100.0.25,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0133193
Jan 22 17:09:40 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[217337]: 10.100.0.25:37590 [22/Jan/2026:17:09:40.082] listener listener/metadata 0/0/0/14/14 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.126 183079 DEBUG nova.compute.provider_tree [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.145 183079 DEBUG nova.scheduler.client.report [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.163 183079 DEBUG oslo_concurrency.lockutils [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.421s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.164 183079 DEBUG nova.compute.manager [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.182 183079 DEBUG oslo_concurrency.processutils [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.184 183079 DEBUG oslo_concurrency.lockutils [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.184 183079 DEBUG oslo_concurrency.lockutils [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.204 183079 DEBUG oslo_concurrency.processutils [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.228 183079 DEBUG nova.compute.manager [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.229 183079 DEBUG nova.network.neutron [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.251 183079 INFO nova.virt.libvirt.driver [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.268 183079 DEBUG nova.compute.manager [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.277 183079 DEBUG oslo_concurrency.processutils [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.278 183079 DEBUG oslo_concurrency.processutils [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.325 183079 DEBUG oslo_concurrency.processutils [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.326 183079 DEBUG oslo_concurrency.lockutils [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.327 183079 DEBUG oslo_concurrency.processutils [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.372 183079 DEBUG nova.compute.manager [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.374 183079 DEBUG nova.virt.libvirt.driver [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.374 183079 INFO nova.virt.libvirt.driver [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Creating image(s)
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.375 183079 DEBUG oslo_concurrency.lockutils [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "/var/lib/nova/instances/e4683d56-25f3-42a9-aedd-1b076e9a5245/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.375 183079 DEBUG oslo_concurrency.lockutils [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "/var/lib/nova/instances/e4683d56-25f3-42a9-aedd-1b076e9a5245/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.376 183079 DEBUG oslo_concurrency.lockutils [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "/var/lib/nova/instances/e4683d56-25f3-42a9-aedd-1b076e9a5245/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.393 183079 DEBUG oslo_concurrency.processutils [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.394 183079 DEBUG oslo_concurrency.processutils [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.414 183079 DEBUG nova.virt.disk.api [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Checking if we can resize image /var/lib/nova/instances/7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.415 183079 DEBUG oslo_concurrency.processutils [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.458 183079 DEBUG oslo_concurrency.processutils [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.460 183079 DEBUG oslo_concurrency.lockutils [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.461 183079 DEBUG oslo_concurrency.lockutils [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.477 183079 DEBUG oslo_concurrency.processutils [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.510 183079 DEBUG oslo_concurrency.processutils [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.512 183079 DEBUG nova.virt.disk.api [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Cannot resize image /var/lib/nova/instances/7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.513 183079 DEBUG nova.objects.instance [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lazy-loading 'migration_context' on Instance uuid 7e8d077b-66fc-42ee-ad4e-a13327ad6764 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.541 183079 DEBUG nova.virt.libvirt.driver [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.542 183079 DEBUG nova.virt.libvirt.driver [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Ensure instance console log exists: /var/lib/nova/instances/7e8d077b-66fc-42ee-ad4e-a13327ad6764/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.544 183079 DEBUG oslo_concurrency.lockutils [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.545 183079 DEBUG oslo_concurrency.lockutils [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.546 183079 DEBUG oslo_concurrency.lockutils [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.569 183079 DEBUG oslo_concurrency.processutils [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.570 183079 DEBUG oslo_concurrency.processutils [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/e4683d56-25f3-42a9-aedd-1b076e9a5245/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.606 183079 DEBUG oslo_concurrency.processutils [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/e4683d56-25f3-42a9-aedd-1b076e9a5245/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.608 183079 DEBUG oslo_concurrency.lockutils [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.608 183079 DEBUG oslo_concurrency.processutils [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.686 183079 DEBUG oslo_concurrency.processutils [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.687 183079 DEBUG nova.virt.disk.api [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Checking if we can resize image /var/lib/nova/instances/e4683d56-25f3-42a9-aedd-1b076e9a5245/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.688 183079 DEBUG oslo_concurrency.processutils [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4683d56-25f3-42a9-aedd-1b076e9a5245/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.759 183079 DEBUG oslo_concurrency.processutils [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4683d56-25f3-42a9-aedd-1b076e9a5245/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.760 183079 DEBUG nova.virt.disk.api [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Cannot resize image /var/lib/nova/instances/e4683d56-25f3-42a9-aedd-1b076e9a5245/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.760 183079 DEBUG nova.objects.instance [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lazy-loading 'migration_context' on Instance uuid e4683d56-25f3-42a9-aedd-1b076e9a5245 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.828 183079 DEBUG nova.virt.libvirt.driver [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.828 183079 DEBUG nova.virt.libvirt.driver [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Ensure instance console log exists: /var/lib/nova/instances/e4683d56-25f3-42a9-aedd-1b076e9a5245/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.829 183079 DEBUG oslo_concurrency.lockutils [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.829 183079 DEBUG oslo_concurrency.lockutils [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:40 compute-0 nova_compute[183075]: 2026-01-22 17:09:40.829 183079 DEBUG oslo_concurrency.lockutils [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:41 compute-0 nova_compute[183075]: 2026-01-22 17:09:41.049 183079 DEBUG nova.policy [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:09:41 compute-0 nova_compute[183075]: 2026-01-22 17:09:41.916 183079 DEBUG nova.policy [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1e61127d65144bcbaa0d43fe3eb484c0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:09:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:41.923 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:41.924 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:41.925 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:43 compute-0 nova_compute[183075]: 2026-01-22 17:09:43.000 183079 DEBUG nova.network.neutron [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Successfully created port: 5644ae2a-c35b-431d-88a1-ad18de811d83 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:09:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:43.063 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:09:43 compute-0 nova_compute[183075]: 2026-01-22 17:09:43.488 183079 INFO nova.compute.manager [None req-82f7b075-ac8c-45c5-9830-12fea31e6770 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Get console output
Jan 22 17:09:43 compute-0 nova_compute[183075]: 2026-01-22 17:09:43.494 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:09:43 compute-0 nova_compute[183075]: 2026-01-22 17:09:43.500 183079 DEBUG nova.network.neutron [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Successfully updated port: 096b36b4-87c4-423a-a3ef-3c47a75704f7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:09:43 compute-0 nova_compute[183075]: 2026-01-22 17:09:43.512 183079 DEBUG oslo_concurrency.lockutils [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "refresh_cache-e4683d56-25f3-42a9-aedd-1b076e9a5245" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:09:43 compute-0 nova_compute[183075]: 2026-01-22 17:09:43.513 183079 DEBUG oslo_concurrency.lockutils [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquired lock "refresh_cache-e4683d56-25f3-42a9-aedd-1b076e9a5245" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:09:43 compute-0 nova_compute[183075]: 2026-01-22 17:09:43.513 183079 DEBUG nova.network.neutron [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:09:43 compute-0 nova_compute[183075]: 2026-01-22 17:09:43.729 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:43 compute-0 nova_compute[183075]: 2026-01-22 17:09:43.844 183079 DEBUG nova.compute.manager [req-ee7c5064-b2df-45e7-b87b-c8cc22dc3253 req-ca0bcbd1-ed68-4187-a691-a4a002af6434 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Received event network-changed-096b36b4-87c4-423a-a3ef-3c47a75704f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:09:43 compute-0 nova_compute[183075]: 2026-01-22 17:09:43.845 183079 DEBUG nova.compute.manager [req-ee7c5064-b2df-45e7-b87b-c8cc22dc3253 req-ca0bcbd1-ed68-4187-a691-a4a002af6434 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Refreshing instance network info cache due to event network-changed-096b36b4-87c4-423a-a3ef-3c47a75704f7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:09:43 compute-0 nova_compute[183075]: 2026-01-22 17:09:43.846 183079 DEBUG oslo_concurrency.lockutils [req-ee7c5064-b2df-45e7-b87b-c8cc22dc3253 req-ca0bcbd1-ed68-4187-a691-a4a002af6434 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-e4683d56-25f3-42a9-aedd-1b076e9a5245" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:09:44 compute-0 nova_compute[183075]: 2026-01-22 17:09:44.020 183079 DEBUG nova.network.neutron [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:09:44 compute-0 nova_compute[183075]: 2026-01-22 17:09:44.272 183079 DEBUG nova.network.neutron [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Successfully updated port: 5644ae2a-c35b-431d-88a1-ad18de811d83 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:09:44 compute-0 nova_compute[183075]: 2026-01-22 17:09:44.292 183079 DEBUG oslo_concurrency.lockutils [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Acquiring lock "refresh_cache-7e8d077b-66fc-42ee-ad4e-a13327ad6764" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:09:44 compute-0 nova_compute[183075]: 2026-01-22 17:09:44.293 183079 DEBUG oslo_concurrency.lockutils [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Acquired lock "refresh_cache-7e8d077b-66fc-42ee-ad4e-a13327ad6764" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:09:44 compute-0 nova_compute[183075]: 2026-01-22 17:09:44.293 183079 DEBUG nova.network.neutron [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:09:44 compute-0 nova_compute[183075]: 2026-01-22 17:09:44.428 183079 DEBUG nova.network.neutron [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.030 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.445 183079 INFO nova.compute.manager [None req-d0014cfb-59ce-4756-8d90-f715a3f9818b 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Get console output
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.455 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.550 183079 DEBUG nova.compute.manager [req-6790f3d7-770c-4458-8a24-b812e0b0b6d6 req-9649e322-fac9-4057-aaf4-4af23866fb8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Received event network-changed-5644ae2a-c35b-431d-88a1-ad18de811d83 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.550 183079 DEBUG nova.compute.manager [req-6790f3d7-770c-4458-8a24-b812e0b0b6d6 req-9649e322-fac9-4057-aaf4-4af23866fb8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Refreshing instance network info cache due to event network-changed-5644ae2a-c35b-431d-88a1-ad18de811d83. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.551 183079 DEBUG oslo_concurrency.lockutils [req-6790f3d7-770c-4458-8a24-b812e0b0b6d6 req-9649e322-fac9-4057-aaf4-4af23866fb8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-7e8d077b-66fc-42ee-ad4e-a13327ad6764" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.677 183079 DEBUG oslo_concurrency.lockutils [None req-3932db1c-287f-410b-9e0c-bc3df11d3a47 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "7d42beab-5bb4-43a0-9756-ced73188f5ba" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.677 183079 DEBUG oslo_concurrency.lockutils [None req-3932db1c-287f-410b-9e0c-bc3df11d3a47 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "7d42beab-5bb4-43a0-9756-ced73188f5ba" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.678 183079 DEBUG oslo_concurrency.lockutils [None req-3932db1c-287f-410b-9e0c-bc3df11d3a47 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "7d42beab-5bb4-43a0-9756-ced73188f5ba-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.678 183079 DEBUG oslo_concurrency.lockutils [None req-3932db1c-287f-410b-9e0c-bc3df11d3a47 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "7d42beab-5bb4-43a0-9756-ced73188f5ba-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.678 183079 DEBUG oslo_concurrency.lockutils [None req-3932db1c-287f-410b-9e0c-bc3df11d3a47 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "7d42beab-5bb4-43a0-9756-ced73188f5ba-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.679 183079 INFO nova.compute.manager [None req-3932db1c-287f-410b-9e0c-bc3df11d3a47 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Terminating instance
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.680 183079 DEBUG nova.compute.manager [None req-3932db1c-287f-410b-9e0c-bc3df11d3a47 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:09:45 compute-0 kernel: tap877fd3c4-01 (unregistering): left promiscuous mode
Jan 22 17:09:45 compute-0 NetworkManager[55454]: <info>  [1769101785.7155] device (tap877fd3c4-01): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:09:45 compute-0 ovn_controller[95372]: 2026-01-22T17:09:45Z|00133|binding|INFO|Releasing lport 877fd3c4-01ce-4616-b9d4-92cf337f7f6f from this chassis (sb_readonly=0)
Jan 22 17:09:45 compute-0 ovn_controller[95372]: 2026-01-22T17:09:45Z|00134|binding|INFO|Setting lport 877fd3c4-01ce-4616-b9d4-92cf337f7f6f down in Southbound
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.726 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:45 compute-0 ovn_controller[95372]: 2026-01-22T17:09:45Z|00135|binding|INFO|Removing iface tap877fd3c4-01 ovn-installed in OVS
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.730 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:45.735 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:ec:b6 10.100.0.25'], port_security=['fa:16:3e:4e:ec:b6 10.100.0.25'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.25/28', 'neutron:device_id': '7d42beab-5bb4-43a0-9756-ced73188f5ba', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a16be1a-262e-47f7-8518-5f24ee15796e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e05c7aae349e4a1d859a387df45650a0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b9cc1ac9-85f4-4da7-891e-b0644711e574', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.222', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5b6ccb16-1216-4deb-9d72-42005a3163bb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=877fd3c4-01ce-4616-b9d4-92cf337f7f6f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:09:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:45.737 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 877fd3c4-01ce-4616-b9d4-92cf337f7f6f in datapath 0a16be1a-262e-47f7-8518-5f24ee15796e unbound from our chassis
Jan 22 17:09:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:45.740 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0a16be1a-262e-47f7-8518-5f24ee15796e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:09:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:45.742 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7e6fbc58-7408-4db3-96b1-6df31ad46a51]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:45.743 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e namespace which is not needed anymore
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.746 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:45 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Jan 22 17:09:45 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000a.scope: Consumed 13.468s CPU time.
Jan 22 17:09:45 compute-0 systemd-machined[154382]: Machine qemu-10-instance-0000000a terminated.
Jan 22 17:09:45 compute-0 neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e[217331]: [NOTICE]   (217335) : haproxy version is 2.8.14-c23fe91
Jan 22 17:09:45 compute-0 neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e[217331]: [NOTICE]   (217335) : path to executable is /usr/sbin/haproxy
Jan 22 17:09:45 compute-0 neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e[217331]: [WARNING]  (217335) : Exiting Master process...
Jan 22 17:09:45 compute-0 neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e[217331]: [WARNING]  (217335) : Exiting Master process...
Jan 22 17:09:45 compute-0 neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e[217331]: [ALERT]    (217335) : Current worker (217337) exited with code 143 (Terminated)
Jan 22 17:09:45 compute-0 neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e[217331]: [WARNING]  (217335) : All workers exited. Exiting... (0)
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.907 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:45 compute-0 systemd[1]: libpod-ea33e279bc93bc8b13a6332e3a5396a084bd7749415e094cacd8ad958b29f27a.scope: Deactivated successfully.
Jan 22 17:09:45 compute-0 podman[217550]: 2026-01-22 17:09:45.912968839 +0000 UTC m=+0.060133263 container died ea33e279bc93bc8b13a6332e3a5396a084bd7749415e094cacd8ad958b29f27a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.917 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.934 183079 DEBUG nova.network.neutron [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Updating instance_info_cache with network_info: [{"id": "5644ae2a-c35b-431d-88a1-ad18de811d83", "address": "fa:16:3e:d4:c8:e5", "network": {"id": "ce346f8d-be8d-455f-b61c-12fea213a3f4", "bridge": "br-int", "label": "tempest-test-network--508539761", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5644ae2a-c3", "ovs_interfaceid": "5644ae2a-c35b-431d-88a1-ad18de811d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.948 183079 INFO nova.virt.libvirt.driver [-] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Instance destroyed successfully.
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.950 183079 DEBUG nova.objects.instance [None req-3932db1c-287f-410b-9e0c-bc3df11d3a47 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lazy-loading 'resources' on Instance uuid 7d42beab-5bb4-43a0-9756-ced73188f5ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:09:45 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ea33e279bc93bc8b13a6332e3a5396a084bd7749415e094cacd8ad958b29f27a-userdata-shm.mount: Deactivated successfully.
Jan 22 17:09:45 compute-0 systemd[1]: var-lib-containers-storage-overlay-9e0e3b4cfe372c25a86bba776e808b8aac9c27de621d24b19c7730dce1323361-merged.mount: Deactivated successfully.
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.969 183079 DEBUG nova.virt.libvirt.vif [None req-3932db1c-287f-410b-9e0c-bc3df11d3a47 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:09:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1958859049',display_name='tempest-server-test-1958859049',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1958859049',id=10,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFroJSJbEvCYe2sCwXkwL5KPVfwqqRPNqrWU5nnhv8U4G8wYWbEqk5AtRWXltmX7CAIPBST+sTwCKfymhALKd54XNN92D6KKnKgA6jmOihENah4FGfU80fx1UEE0osLpiw==',key_name='tempest-keypair-test-618317104',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:09:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e05c7aae349e4a1d859a387df45650a0',ramdisk_id='',reservation_id='r-r79zmuj9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIpSeparateNetwork-931877966',owner_user_name='tempest-FloatingIpSeparateNetwork-931877966-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:09:20Z,user_data=None,user_id='cd47d63cff2548a88e21e5c2e6a5c161',uuid=7d42beab-5bb4-43a0-9756-ced73188f5ba,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "877fd3c4-01ce-4616-b9d4-92cf337f7f6f", "address": "fa:16:3e:4e:ec:b6", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap877fd3c4-01", "ovs_interfaceid": "877fd3c4-01ce-4616-b9d4-92cf337f7f6f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.969 183079 DEBUG nova.network.os_vif_util [None req-3932db1c-287f-410b-9e0c-bc3df11d3a47 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converting VIF {"id": "877fd3c4-01ce-4616-b9d4-92cf337f7f6f", "address": "fa:16:3e:4e:ec:b6", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap877fd3c4-01", "ovs_interfaceid": "877fd3c4-01ce-4616-b9d4-92cf337f7f6f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:09:45 compute-0 podman[217550]: 2026-01-22 17:09:45.971348976 +0000 UTC m=+0.118513400 container cleanup ea33e279bc93bc8b13a6332e3a5396a084bd7749415e094cacd8ad958b29f27a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.971 183079 DEBUG nova.network.os_vif_util [None req-3932db1c-287f-410b-9e0c-bc3df11d3a47 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:ec:b6,bridge_name='br-int',has_traffic_filtering=True,id=877fd3c4-01ce-4616-b9d4-92cf337f7f6f,network=Network(0a16be1a-262e-47f7-8518-5f24ee15796e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap877fd3c4-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.972 183079 DEBUG os_vif [None req-3932db1c-287f-410b-9e0c-bc3df11d3a47 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:ec:b6,bridge_name='br-int',has_traffic_filtering=True,id=877fd3c4-01ce-4616-b9d4-92cf337f7f6f,network=Network(0a16be1a-262e-47f7-8518-5f24ee15796e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap877fd3c4-01') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.975 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.976 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap877fd3c4-01, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.978 183079 DEBUG oslo_concurrency.lockutils [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Releasing lock "refresh_cache-7e8d077b-66fc-42ee-ad4e-a13327ad6764" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.979 183079 DEBUG nova.compute.manager [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Instance network_info: |[{"id": "5644ae2a-c35b-431d-88a1-ad18de811d83", "address": "fa:16:3e:d4:c8:e5", "network": {"id": "ce346f8d-be8d-455f-b61c-12fea213a3f4", "bridge": "br-int", "label": "tempest-test-network--508539761", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5644ae2a-c3", "ovs_interfaceid": "5644ae2a-c35b-431d-88a1-ad18de811d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.979 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:45 compute-0 systemd[1]: libpod-conmon-ea33e279bc93bc8b13a6332e3a5396a084bd7749415e094cacd8ad958b29f27a.scope: Deactivated successfully.
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.980 183079 DEBUG oslo_concurrency.lockutils [req-6790f3d7-770c-4458-8a24-b812e0b0b6d6 req-9649e322-fac9-4057-aaf4-4af23866fb8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-7e8d077b-66fc-42ee-ad4e-a13327ad6764" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.981 183079 DEBUG nova.network.neutron [req-6790f3d7-770c-4458-8a24-b812e0b0b6d6 req-9649e322-fac9-4057-aaf4-4af23866fb8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Refreshing network info cache for port 5644ae2a-c35b-431d-88a1-ad18de811d83 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.984 183079 DEBUG nova.virt.libvirt.driver [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Start _get_guest_xml network_info=[{"id": "5644ae2a-c35b-431d-88a1-ad18de811d83", "address": "fa:16:3e:d4:c8:e5", "network": {"id": "ce346f8d-be8d-455f-b61c-12fea213a3f4", "bridge": "br-int", "label": "tempest-test-network--508539761", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5644ae2a-c3", "ovs_interfaceid": "5644ae2a-c35b-431d-88a1-ad18de811d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.985 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.988 183079 INFO os_vif [None req-3932db1c-287f-410b-9e0c-bc3df11d3a47 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:ec:b6,bridge_name='br-int',has_traffic_filtering=True,id=877fd3c4-01ce-4616-b9d4-92cf337f7f6f,network=Network(0a16be1a-262e-47f7-8518-5f24ee15796e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap877fd3c4-01')
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.989 183079 INFO nova.virt.libvirt.driver [None req-3932db1c-287f-410b-9e0c-bc3df11d3a47 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Deleting instance files /var/lib/nova/instances/7d42beab-5bb4-43a0-9756-ced73188f5ba_del
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.990 183079 INFO nova.virt.libvirt.driver [None req-3932db1c-287f-410b-9e0c-bc3df11d3a47 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Deletion of /var/lib/nova/instances/7d42beab-5bb4-43a0-9756-ced73188f5ba_del complete
Jan 22 17:09:45 compute-0 nova_compute[183075]: 2026-01-22 17:09:45.997 183079 WARNING nova.virt.libvirt.driver [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.011 183079 DEBUG nova.virt.libvirt.host [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.012 183079 DEBUG nova.virt.libvirt.host [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.015 183079 DEBUG nova.virt.libvirt.host [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.016 183079 DEBUG nova.virt.libvirt.host [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.017 183079 DEBUG nova.virt.libvirt.driver [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.017 183079 DEBUG nova.virt.hardware [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.017 183079 DEBUG nova.virt.hardware [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.017 183079 DEBUG nova.virt.hardware [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.018 183079 DEBUG nova.virt.hardware [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.018 183079 DEBUG nova.virt.hardware [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.018 183079 DEBUG nova.virt.hardware [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.018 183079 DEBUG nova.virt.hardware [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.018 183079 DEBUG nova.virt.hardware [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.019 183079 DEBUG nova.virt.hardware [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.019 183079 DEBUG nova.virt.hardware [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.019 183079 DEBUG nova.virt.hardware [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.022 183079 DEBUG nova.virt.libvirt.vif [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:09:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-233401537',display_name='tempest-server-test-233401537',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-233401537',id=11,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA4izGuVdf36SsG+7n8kX9aNpboq22Z55adiWGM5qlH08LxqMkSxkCnGlFdsMKL8t/vQsOXqbCU1vgc4to/WoKVrvDSrylB83cxSgDIuuaEZv45HgYlb5csi4YLKl3Bk4g==',key_name='tempest-keypair-test-110348497',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c2b37b797ca344f2b31c3861277068d8',ramdisk_id='',reservation_id='r-lld7rcgo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpMultipleRoutersTest-2036232412',owner_user_name='tempest-FloatingIpMultipleRoutersTest-2036232412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:09:40Z,user_data=None,user_id='28bc4852545149e59d0541d4f39eb38e',uuid=7e8d077b-66fc-42ee-ad4e-a13327ad6764,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5644ae2a-c35b-431d-88a1-ad18de811d83", "address": "fa:16:3e:d4:c8:e5", "network": {"id": "ce346f8d-be8d-455f-b61c-12fea213a3f4", "bridge": "br-int", "label": "tempest-test-network--508539761", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5644ae2a-c3", "ovs_interfaceid": "5644ae2a-c35b-431d-88a1-ad18de811d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.023 183079 DEBUG nova.network.os_vif_util [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Converting VIF {"id": "5644ae2a-c35b-431d-88a1-ad18de811d83", "address": "fa:16:3e:d4:c8:e5", "network": {"id": "ce346f8d-be8d-455f-b61c-12fea213a3f4", "bridge": "br-int", "label": "tempest-test-network--508539761", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5644ae2a-c3", "ovs_interfaceid": "5644ae2a-c35b-431d-88a1-ad18de811d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.023 183079 DEBUG nova.network.os_vif_util [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d4:c8:e5,bridge_name='br-int',has_traffic_filtering=True,id=5644ae2a-c35b-431d-88a1-ad18de811d83,network=Network(ce346f8d-be8d-455f-b61c-12fea213a3f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5644ae2a-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.024 183079 DEBUG nova.objects.instance [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7e8d077b-66fc-42ee-ad4e-a13327ad6764 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.029 183079 DEBUG nova.network.neutron [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Updating instance_info_cache with network_info: [{"id": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "address": "fa:16:3e:f6:53:65", "network": {"id": "9c1e909c-8e03-49be-b02d-6bf4a2cedc0c", "bridge": "br-int", "label": "tempest-test-network--1863582698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap096b36b4-87", "ovs_interfaceid": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.057 183079 DEBUG nova.virt.libvirt.driver [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:09:46 compute-0 nova_compute[183075]:   <uuid>7e8d077b-66fc-42ee-ad4e-a13327ad6764</uuid>
Jan 22 17:09:46 compute-0 nova_compute[183075]:   <name>instance-0000000b</name>
Jan 22 17:09:46 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:09:46 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:09:46 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-233401537</nova:name>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:09:45</nova:creationTime>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:09:46 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:09:46 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:09:46 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:09:46 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:09:46 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:09:46 compute-0 nova_compute[183075]:         <nova:user uuid="28bc4852545149e59d0541d4f39eb38e">tempest-FloatingIpMultipleRoutersTest-2036232412-project-member</nova:user>
Jan 22 17:09:46 compute-0 nova_compute[183075]:         <nova:project uuid="c2b37b797ca344f2b31c3861277068d8">tempest-FloatingIpMultipleRoutersTest-2036232412</nova:project>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:09:46 compute-0 nova_compute[183075]:         <nova:port uuid="5644ae2a-c35b-431d-88a1-ad18de811d83">
Jan 22 17:09:46 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:09:46 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:09:46 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <system>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <entry name="serial">7e8d077b-66fc-42ee-ad4e-a13327ad6764</entry>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <entry name="uuid">7e8d077b-66fc-42ee-ad4e-a13327ad6764</entry>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     </system>
Jan 22 17:09:46 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:09:46 compute-0 nova_compute[183075]:   <os>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:   </os>
Jan 22 17:09:46 compute-0 nova_compute[183075]:   <features>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:   </features>
Jan 22 17:09:46 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:09:46 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:09:46 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:d4:c8:e5"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <target dev="tap5644ae2a-c3"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/7e8d077b-66fc-42ee-ad4e-a13327ad6764/console.log" append="off"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <video>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     </video>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 podman[217592]: 2026-01-22 17:09:46.058326961 +0000 UTC m=+0.056815327 container remove ea33e279bc93bc8b13a6332e3a5396a084bd7749415e094cacd8ad958b29f27a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:09:46 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:09:46 compute-0 nova_compute[183075]: </domain>
Jan 22 17:09:46 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.057 183079 DEBUG nova.compute.manager [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Preparing to wait for external event network-vif-plugged-5644ae2a-c35b-431d-88a1-ad18de811d83 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.058 183079 DEBUG oslo_concurrency.lockutils [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Acquiring lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.058 183079 DEBUG oslo_concurrency.lockutils [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.058 183079 DEBUG oslo_concurrency.lockutils [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.059 183079 DEBUG nova.virt.libvirt.vif [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:09:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-233401537',display_name='tempest-server-test-233401537',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-233401537',id=11,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA4izGuVdf36SsG+7n8kX9aNpboq22Z55adiWGM5qlH08LxqMkSxkCnGlFdsMKL8t/vQsOXqbCU1vgc4to/WoKVrvDSrylB83cxSgDIuuaEZv45HgYlb5csi4YLKl3Bk4g==',key_name='tempest-keypair-test-110348497',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c2b37b797ca344f2b31c3861277068d8',ramdisk_id='',reservation_id='r-lld7rcgo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_
model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpMultipleRoutersTest-2036232412',owner_user_name='tempest-FloatingIpMultipleRoutersTest-2036232412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:09:40Z,user_data=None,user_id='28bc4852545149e59d0541d4f39eb38e',uuid=7e8d077b-66fc-42ee-ad4e-a13327ad6764,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5644ae2a-c35b-431d-88a1-ad18de811d83", "address": "fa:16:3e:d4:c8:e5", "network": {"id": "ce346f8d-be8d-455f-b61c-12fea213a3f4", "bridge": "br-int", "label": "tempest-test-network--508539761", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5644ae2a-c3", "ovs_interfaceid": "5644ae2a-c35b-431d-88a1-ad18de811d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.059 183079 DEBUG nova.network.os_vif_util [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Converting VIF {"id": "5644ae2a-c35b-431d-88a1-ad18de811d83", "address": "fa:16:3e:d4:c8:e5", "network": {"id": "ce346f8d-be8d-455f-b61c-12fea213a3f4", "bridge": "br-int", "label": "tempest-test-network--508539761", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5644ae2a-c3", "ovs_interfaceid": "5644ae2a-c35b-431d-88a1-ad18de811d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.060 183079 DEBUG nova.network.os_vif_util [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d4:c8:e5,bridge_name='br-int',has_traffic_filtering=True,id=5644ae2a-c35b-431d-88a1-ad18de811d83,network=Network(ce346f8d-be8d-455f-b61c-12fea213a3f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5644ae2a-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.060 183079 DEBUG os_vif [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d4:c8:e5,bridge_name='br-int',has_traffic_filtering=True,id=5644ae2a-c35b-431d-88a1-ad18de811d83,network=Network(ce346f8d-be8d-455f-b61c-12fea213a3f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5644ae2a-c3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.061 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.061 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.062 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.064 183079 INFO nova.compute.manager [None req-3932db1c-287f-410b-9e0c-bc3df11d3a47 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Took 0.38 seconds to destroy the instance on the hypervisor.
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.064 183079 DEBUG oslo.service.loopingcall [None req-3932db1c-287f-410b-9e0c-bc3df11d3a47 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.065 183079 DEBUG oslo_concurrency.lockutils [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Releasing lock "refresh_cache-e4683d56-25f3-42a9-aedd-1b076e9a5245" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.064 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[79806598-9deb-4f96-8e93-e58f1d096b81]: (4, ('Thu Jan 22 05:09:45 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e (ea33e279bc93bc8b13a6332e3a5396a084bd7749415e094cacd8ad958b29f27a)\nea33e279bc93bc8b13a6332e3a5396a084bd7749415e094cacd8ad958b29f27a\nThu Jan 22 05:09:45 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e (ea33e279bc93bc8b13a6332e3a5396a084bd7749415e094cacd8ad958b29f27a)\nea33e279bc93bc8b13a6332e3a5396a084bd7749415e094cacd8ad958b29f27a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.065 183079 DEBUG nova.compute.manager [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Instance network_info: |[{"id": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "address": "fa:16:3e:f6:53:65", "network": {"id": "9c1e909c-8e03-49be-b02d-6bf4a2cedc0c", "bridge": "br-int", "label": "tempest-test-network--1863582698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap096b36b4-87", "ovs_interfaceid": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.065 183079 DEBUG nova.compute.manager [-] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.066 183079 DEBUG nova.network.neutron [-] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.066 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[eb16335e-79ad-4e14-83f1-8ac32d94c10a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.067 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a16be1a-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.069 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:46 compute-0 kernel: tap0a16be1a-20: left promiscuous mode
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.072 183079 DEBUG oslo_concurrency.lockutils [req-ee7c5064-b2df-45e7-b87b-c8cc22dc3253 req-ca0bcbd1-ed68-4187-a691-a4a002af6434 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-e4683d56-25f3-42a9-aedd-1b076e9a5245" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.072 183079 DEBUG nova.network.neutron [req-ee7c5064-b2df-45e7-b87b-c8cc22dc3253 req-ca0bcbd1-ed68-4187-a691-a4a002af6434 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Refreshing network info cache for port 096b36b4-87c4-423a-a3ef-3c47a75704f7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.076 183079 DEBUG nova.virt.libvirt.driver [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Start _get_guest_xml network_info=[{"id": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "address": "fa:16:3e:f6:53:65", "network": {"id": "9c1e909c-8e03-49be-b02d-6bf4a2cedc0c", "bridge": "br-int", "label": "tempest-test-network--1863582698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap096b36b4-87", "ovs_interfaceid": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.080 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.080 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5644ae2a-c3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.081 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5644ae2a-c3, col_values=(('external_ids', {'iface-id': '5644ae2a-c35b-431d-88a1-ad18de811d83', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d4:c8:e5', 'vm-uuid': '7e8d077b-66fc-42ee-ad4e-a13327ad6764'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:09:46 compute-0 NetworkManager[55454]: <info>  [1769101786.0839] manager: (tap5644ae2a-c3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.084 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.088 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.090 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[07b12565-9e28-4793-889c-566b3cdc8f45]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.092 183079 WARNING nova.virt.libvirt.driver [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.094 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.095 183079 INFO os_vif [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d4:c8:e5,bridge_name='br-int',has_traffic_filtering=True,id=5644ae2a-c35b-431d-88a1-ad18de811d83,network=Network(ce346f8d-be8d-455f-b61c-12fea213a3f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5644ae2a-c3')
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.109 183079 DEBUG nova.virt.libvirt.host [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.111 183079 DEBUG nova.virt.libvirt.host [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.115 183079 DEBUG nova.virt.libvirt.host [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.116 183079 DEBUG nova.virt.libvirt.host [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.115 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[bbab9a42-af1b-42e6-8ed1-a4f1c943e93e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.116 183079 DEBUG nova.virt.libvirt.driver [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.116 183079 DEBUG nova.virt.hardware [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.116 183079 DEBUG nova.virt.hardware [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.117 183079 DEBUG nova.virt.hardware [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.117 183079 DEBUG nova.virt.hardware [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.117 183079 DEBUG nova.virt.hardware [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.117 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[20bec834-5d62-4954-9e7e-7cecd2bef1a1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.117 183079 DEBUG nova.virt.hardware [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.117 183079 DEBUG nova.virt.hardware [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.117 183079 DEBUG nova.virt.hardware [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.118 183079 DEBUG nova.virt.hardware [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.118 183079 DEBUG nova.virt.hardware [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.118 183079 DEBUG nova.virt.hardware [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.121 183079 DEBUG nova.virt.libvirt.vif [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:09:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-590988812',display_name='tempest-server-test-590988812',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-590988812',id=12,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPrkA/Z5RmtQU4jmjRMv9OOPvEkTJSvzTw8ebk65GzPrHqEHbv+wizg7XUt+WWaoThVx02ADkoi97wsj98MvMQXzRu+T8wQKRmnd1AKmVJARy0gGVc4wBfQufwEt526HBw==',key_name='tempest-keypair-test-1746127176',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bfc6667804934c92b71ce7638089e9e3',ramdisk_id='',reservation_id='r-n6f17aqe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='vi
rtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-QoSTest-2146064006',owner_user_name='tempest-QoSTest-2146064006-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:09:40Z,user_data=None,user_id='1e61127d65144bcbaa0d43fe3eb484c0',uuid=e4683d56-25f3-42a9-aedd-1b076e9a5245,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "address": "fa:16:3e:f6:53:65", "network": {"id": "9c1e909c-8e03-49be-b02d-6bf4a2cedc0c", "bridge": "br-int", "label": "tempest-test-network--1863582698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap096b36b4-87", "ovs_interfaceid": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.122 183079 DEBUG nova.network.os_vif_util [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Converting VIF {"id": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "address": "fa:16:3e:f6:53:65", "network": {"id": "9c1e909c-8e03-49be-b02d-6bf4a2cedc0c", "bridge": "br-int", "label": "tempest-test-network--1863582698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap096b36b4-87", "ovs_interfaceid": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.122 183079 DEBUG nova.network.os_vif_util [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:53:65,bridge_name='br-int',has_traffic_filtering=True,id=096b36b4-87c4-423a-a3ef-3c47a75704f7,network=Network(9c1e909c-8e03-49be-b02d-6bf4a2cedc0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap096b36b4-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.123 183079 DEBUG nova.objects.instance [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lazy-loading 'pci_devices' on Instance uuid e4683d56-25f3-42a9-aedd-1b076e9a5245 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.126 183079 DEBUG nova.compute.manager [req-a1b30d17-8973-45bd-9bda-2aac08d4a521 req-e631f1e4-2486-4f82-ad16-eefa6b3d2828 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Received event network-vif-unplugged-877fd3c4-01ce-4616-b9d4-92cf337f7f6f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.127 183079 DEBUG oslo_concurrency.lockutils [req-a1b30d17-8973-45bd-9bda-2aac08d4a521 req-e631f1e4-2486-4f82-ad16-eefa6b3d2828 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "7d42beab-5bb4-43a0-9756-ced73188f5ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.127 183079 DEBUG oslo_concurrency.lockutils [req-a1b30d17-8973-45bd-9bda-2aac08d4a521 req-e631f1e4-2486-4f82-ad16-eefa6b3d2828 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7d42beab-5bb4-43a0-9756-ced73188f5ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.127 183079 DEBUG oslo_concurrency.lockutils [req-a1b30d17-8973-45bd-9bda-2aac08d4a521 req-e631f1e4-2486-4f82-ad16-eefa6b3d2828 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7d42beab-5bb4-43a0-9756-ced73188f5ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.127 183079 DEBUG nova.compute.manager [req-a1b30d17-8973-45bd-9bda-2aac08d4a521 req-e631f1e4-2486-4f82-ad16-eefa6b3d2828 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] No waiting events found dispatching network-vif-unplugged-877fd3c4-01ce-4616-b9d4-92cf337f7f6f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.127 183079 DEBUG nova.compute.manager [req-a1b30d17-8973-45bd-9bda-2aac08d4a521 req-e631f1e4-2486-4f82-ad16-eefa6b3d2828 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Received event network-vif-unplugged-877fd3c4-01ce-4616-b9d4-92cf337f7f6f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.135 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b2bc5a9f-3911-4e5c-96e1-725c92406a77]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 412260, 'reachable_time': 15991, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217611, 'error': None, 'target': 'ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.137 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.137 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[9a2448ca-f3d5-4c05-af7a-0ebba23bbbbe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:46 compute-0 systemd[1]: run-netns-ovnmeta\x2d0a16be1a\x2d262e\x2d47f7\x2d8518\x2d5f24ee15796e.mount: Deactivated successfully.
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.141 183079 DEBUG nova.virt.libvirt.driver [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:09:46 compute-0 nova_compute[183075]:   <uuid>e4683d56-25f3-42a9-aedd-1b076e9a5245</uuid>
Jan 22 17:09:46 compute-0 nova_compute[183075]:   <name>instance-0000000c</name>
Jan 22 17:09:46 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:09:46 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:09:46 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-590988812</nova:name>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:09:46</nova:creationTime>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:09:46 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:09:46 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:09:46 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:09:46 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:09:46 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:09:46 compute-0 nova_compute[183075]:         <nova:user uuid="1e61127d65144bcbaa0d43fe3eb484c0">tempest-QoSTest-2146064006-project-member</nova:user>
Jan 22 17:09:46 compute-0 nova_compute[183075]:         <nova:project uuid="bfc6667804934c92b71ce7638089e9e3">tempest-QoSTest-2146064006</nova:project>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:09:46 compute-0 nova_compute[183075]:         <nova:port uuid="096b36b4-87c4-423a-a3ef-3c47a75704f7">
Jan 22 17:09:46 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:09:46 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:09:46 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <system>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <entry name="serial">e4683d56-25f3-42a9-aedd-1b076e9a5245</entry>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <entry name="uuid">e4683d56-25f3-42a9-aedd-1b076e9a5245</entry>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     </system>
Jan 22 17:09:46 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:09:46 compute-0 nova_compute[183075]:   <os>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:   </os>
Jan 22 17:09:46 compute-0 nova_compute[183075]:   <features>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:   </features>
Jan 22 17:09:46 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:09:46 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:09:46 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/e4683d56-25f3-42a9-aedd-1b076e9a5245/disk"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:f6:53:65"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <target dev="tap096b36b4-87"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/e4683d56-25f3-42a9-aedd-1b076e9a5245/console.log" append="off"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <video>
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     </video>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:09:46 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:09:46 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:09:46 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:09:46 compute-0 nova_compute[183075]: </domain>
Jan 22 17:09:46 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.141 183079 DEBUG nova.compute.manager [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Preparing to wait for external event network-vif-plugged-096b36b4-87c4-423a-a3ef-3c47a75704f7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.142 183079 DEBUG oslo_concurrency.lockutils [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "e4683d56-25f3-42a9-aedd-1b076e9a5245-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.142 183079 DEBUG oslo_concurrency.lockutils [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "e4683d56-25f3-42a9-aedd-1b076e9a5245-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.142 183079 DEBUG oslo_concurrency.lockutils [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "e4683d56-25f3-42a9-aedd-1b076e9a5245-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.142 183079 DEBUG nova.virt.libvirt.vif [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:09:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-590988812',display_name='tempest-server-test-590988812',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-590988812',id=12,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPrkA/Z5RmtQU4jmjRMv9OOPvEkTJSvzTw8ebk65GzPrHqEHbv+wizg7XUt+WWaoThVx02ADkoi97wsj98MvMQXzRu+T8wQKRmnd1AKmVJARy0gGVc4wBfQufwEt526HBw==',key_name='tempest-keypair-test-1746127176',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bfc6667804934c92b71ce7638089e9e3',ramdisk_id='',reservation_id='r-n6f17aqe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng
_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-QoSTest-2146064006',owner_user_name='tempest-QoSTest-2146064006-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:09:40Z,user_data=None,user_id='1e61127d65144bcbaa0d43fe3eb484c0',uuid=e4683d56-25f3-42a9-aedd-1b076e9a5245,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "address": "fa:16:3e:f6:53:65", "network": {"id": "9c1e909c-8e03-49be-b02d-6bf4a2cedc0c", "bridge": "br-int", "label": "tempest-test-network--1863582698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap096b36b4-87", "ovs_interfaceid": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.143 183079 DEBUG nova.network.os_vif_util [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Converting VIF {"id": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "address": "fa:16:3e:f6:53:65", "network": {"id": "9c1e909c-8e03-49be-b02d-6bf4a2cedc0c", "bridge": "br-int", "label": "tempest-test-network--1863582698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap096b36b4-87", "ovs_interfaceid": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.143 183079 DEBUG nova.network.os_vif_util [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:53:65,bridge_name='br-int',has_traffic_filtering=True,id=096b36b4-87c4-423a-a3ef-3c47a75704f7,network=Network(9c1e909c-8e03-49be-b02d-6bf4a2cedc0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap096b36b4-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.143 183079 DEBUG os_vif [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:53:65,bridge_name='br-int',has_traffic_filtering=True,id=096b36b4-87c4-423a-a3ef-3c47a75704f7,network=Network(9c1e909c-8e03-49be-b02d-6bf4a2cedc0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap096b36b4-87') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.144 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.144 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.144 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.147 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.148 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap096b36b4-87, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.148 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap096b36b4-87, col_values=(('external_ids', {'iface-id': '096b36b4-87c4-423a-a3ef-3c47a75704f7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f6:53:65', 'vm-uuid': 'e4683d56-25f3-42a9-aedd-1b076e9a5245'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.149 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:46 compute-0 NetworkManager[55454]: <info>  [1769101786.1511] manager: (tap096b36b4-87): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.151 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.157 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.158 183079 INFO os_vif [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:53:65,bridge_name='br-int',has_traffic_filtering=True,id=096b36b4-87c4-423a-a3ef-3c47a75704f7,network=Network(9c1e909c-8e03-49be-b02d-6bf4a2cedc0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap096b36b4-87')
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.168 183079 DEBUG nova.virt.libvirt.driver [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.169 183079 DEBUG nova.virt.libvirt.driver [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] No VIF found with MAC fa:16:3e:d4:c8:e5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.212 183079 DEBUG nova.virt.libvirt.driver [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.212 183079 DEBUG nova.virt.libvirt.driver [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] No VIF found with MAC fa:16:3e:f6:53:65, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:09:46 compute-0 kernel: tap5644ae2a-c3: entered promiscuous mode
Jan 22 17:09:46 compute-0 systemd-udevd[217530]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:09:46 compute-0 NetworkManager[55454]: <info>  [1769101786.2206] manager: (tap5644ae2a-c3): new Tun device (/org/freedesktop/NetworkManager/Devices/68)
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.223 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:46 compute-0 ovn_controller[95372]: 2026-01-22T17:09:46Z|00136|binding|INFO|Claiming lport 5644ae2a-c35b-431d-88a1-ad18de811d83 for this chassis.
Jan 22 17:09:46 compute-0 ovn_controller[95372]: 2026-01-22T17:09:46Z|00137|binding|INFO|5644ae2a-c35b-431d-88a1-ad18de811d83: Claiming fa:16:3e:d4:c8:e5 10.100.0.9
Jan 22 17:09:46 compute-0 NetworkManager[55454]: <info>  [1769101786.2352] device (tap5644ae2a-c3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:09:46 compute-0 NetworkManager[55454]: <info>  [1769101786.2363] device (tap5644ae2a-c3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:09:46 compute-0 ovn_controller[95372]: 2026-01-22T17:09:46Z|00138|binding|INFO|Setting lport 5644ae2a-c35b-431d-88a1-ad18de811d83 ovn-installed in OVS
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.243 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:46 compute-0 ovn_controller[95372]: 2026-01-22T17:09:46Z|00139|binding|INFO|Setting lport 5644ae2a-c35b-431d-88a1-ad18de811d83 up in Southbound
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.250 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:c8:e5 10.100.0.9'], port_security=['fa:16:3e:d4:c8:e5 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ce346f8d-be8d-455f-b61c-12fea213a3f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c2b37b797ca344f2b31c3861277068d8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2f51838f-8a2c-425b-a70e-e288886c38d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9f29732e-c99f-480d-89f6-9caa444040c9, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=5644ae2a-c35b-431d-88a1-ad18de811d83) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.251 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 5644ae2a-c35b-431d-88a1-ad18de811d83 in datapath ce346f8d-be8d-455f-b61c-12fea213a3f4 bound to our chassis
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.255 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ce346f8d-be8d-455f-b61c-12fea213a3f4
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.251 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.266 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[397d752d-96f2-484b-b265-ef7c8e3bbf41]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.267 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapce346f8d-b1 in ovnmeta-ce346f8d-be8d-455f-b61c-12fea213a3f4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:09:46 compute-0 systemd-machined[154382]: New machine qemu-11-instance-0000000b.
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.269 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapce346f8d-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.269 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[018fd823-24c2-40d2-92ef-ce9d75c7e58b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:46 compute-0 NetworkManager[55454]: <info>  [1769101786.2717] manager: (tap096b36b4-87): new Tun device (/org/freedesktop/NetworkManager/Devices/69)
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.270 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ff7fce82-f41d-4b33-a8fa-3984fe045b6a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:46 compute-0 kernel: tap096b36b4-87: entered promiscuous mode
Jan 22 17:09:46 compute-0 ovn_controller[95372]: 2026-01-22T17:09:46Z|00140|binding|INFO|Claiming lport 096b36b4-87c4-423a-a3ef-3c47a75704f7 for this chassis.
Jan 22 17:09:46 compute-0 ovn_controller[95372]: 2026-01-22T17:09:46Z|00141|binding|INFO|096b36b4-87c4-423a-a3ef-3c47a75704f7: Claiming fa:16:3e:f6:53:65 10.100.0.14
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.274 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:46 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-0000000b.
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.282 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:53:65 10.100.0.14'], port_security=['fa:16:3e:f6:53:65 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e4683d56-25f3-42a9-aedd-1b076e9a5245', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bfc6667804934c92b71ce7638089e9e3', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd9af03c0-27db-4d08-b124-ee395583cdd3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.225'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fd725c57-a5bb-4dca-9677-d74d2fa01c15, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=096b36b4-87c4-423a-a3ef-3c47a75704f7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.284 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[15cc3f9e-1937-40aa-b2cf-a8b8c76ea9ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:46 compute-0 NetworkManager[55454]: <info>  [1769101786.2864] device (tap096b36b4-87): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:09:46 compute-0 NetworkManager[55454]: <info>  [1769101786.2870] device (tap096b36b4-87): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:09:46 compute-0 ovn_controller[95372]: 2026-01-22T17:09:46Z|00142|binding|INFO|Setting lport 096b36b4-87c4-423a-a3ef-3c47a75704f7 ovn-installed in OVS
Jan 22 17:09:46 compute-0 ovn_controller[95372]: 2026-01-22T17:09:46Z|00143|binding|INFO|Setting lport 096b36b4-87c4-423a-a3ef-3c47a75704f7 up in Southbound
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.294 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:46 compute-0 systemd-machined[154382]: New machine qemu-12-instance-0000000c.
Jan 22 17:09:46 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-0000000c.
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.308 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ac0979ed-ad59-4b80-a401-5e273a14e495]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.343 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[06b291dd-3233-4302-a7b7-78e69e7ca62c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.350 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f0b22397-002f-4b10-be53-63effed81f18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:46 compute-0 NetworkManager[55454]: <info>  [1769101786.3518] manager: (tapce346f8d-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/70)
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.383 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[04a9ce74-359f-44bf-9cf7-e83ccc9901d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.385 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[84af28ba-5ed8-45c5-9d4b-94e913d7e9ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:46 compute-0 NetworkManager[55454]: <info>  [1769101786.4078] device (tapce346f8d-b0): carrier: link connected
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.413 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[bedfbba6-0fcb-4423-b69f-49ddb4487f1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.430 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e82954e7-b523-4f8c-8444-8e77016d61a6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapce346f8d-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:30:a7:4b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415010, 'reachable_time': 25211, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217683, 'error': None, 'target': 'ovnmeta-ce346f8d-be8d-455f-b61c-12fea213a3f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.447 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b66efee0-2d94-448c-8618-3b132186eec4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe30:a74b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415010, 'tstamp': 415010}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217684, 'error': None, 'target': 'ovnmeta-ce346f8d-be8d-455f-b61c-12fea213a3f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.465 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7a9e186c-1c18-4fb2-bd71-7138c38af00c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapce346f8d-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:30:a7:4b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415010, 'reachable_time': 25211, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217685, 'error': None, 'target': 'ovnmeta-ce346f8d-be8d-455f-b61c-12fea213a3f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.495 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7eaac1f0-64b5-4cfd-9771-66bbd324a331]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.552 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c40f30f6-dfdc-43fb-98d8-83f1f29b759c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.553 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapce346f8d-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.554 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.554 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapce346f8d-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.555 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:46 compute-0 kernel: tapce346f8d-b0: entered promiscuous mode
Jan 22 17:09:46 compute-0 NetworkManager[55454]: <info>  [1769101786.5568] manager: (tapce346f8d-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.558 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.560 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapce346f8d-b0, col_values=(('external_ids', {'iface-id': '255f865e-6322-48b0-a0d1-c16ced648c78'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.561 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:46 compute-0 ovn_controller[95372]: 2026-01-22T17:09:46Z|00144|binding|INFO|Releasing lport 255f865e-6322-48b0-a0d1-c16ced648c78 from this chassis (sb_readonly=0)
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.562 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.564 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ce346f8d-be8d-455f-b61c-12fea213a3f4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ce346f8d-be8d-455f-b61c-12fea213a3f4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.577 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[00170553-9b48-4487-933f-d13229678e81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.578 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-ce346f8d-be8d-455f-b61c-12fea213a3f4
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/ce346f8d-be8d-455f-b61c-12fea213a3f4.pid.haproxy
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID ce346f8d-be8d-455f-b61c-12fea213a3f4
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:09:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:46.579 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ce346f8d-be8d-455f-b61c-12fea213a3f4', 'env', 'PROCESS_TAG=haproxy-ce346f8d-be8d-455f-b61c-12fea213a3f4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ce346f8d-be8d-455f-b61c-12fea213a3f4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.582 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.647 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101786.6466715, 7e8d077b-66fc-42ee-ad4e-a13327ad6764 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.647 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] VM Started (Lifecycle Event)
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.671 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.675 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101786.6468706, 7e8d077b-66fc-42ee-ad4e-a13327ad6764 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.675 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] VM Paused (Lifecycle Event)
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.700 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.704 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:09:46 compute-0 nova_compute[183075]: 2026-01-22 17:09:46.729 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:09:47 compute-0 podman[217722]: 2026-01-22 17:09:47.005263003 +0000 UTC m=+0.059623111 container create de6e0593f000d31e8ceafde0b4908ce9bfd4992406fcad5cbdf6e36686c72865 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce346f8d-be8d-455f-b61c-12fea213a3f4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 17:09:47 compute-0 systemd[1]: Started libpod-conmon-de6e0593f000d31e8ceafde0b4908ce9bfd4992406fcad5cbdf6e36686c72865.scope.
Jan 22 17:09:47 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:09:47 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb99b46687ec2aa5d3c7adf23d310d69db4a67b07a13c288ee5deb7b5100ba51/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:09:47 compute-0 podman[217722]: 2026-01-22 17:09:46.973880712 +0000 UTC m=+0.028240870 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:09:47 compute-0 podman[217722]: 2026-01-22 17:09:47.069719848 +0000 UTC m=+0.124079986 container init de6e0593f000d31e8ceafde0b4908ce9bfd4992406fcad5cbdf6e36686c72865 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce346f8d-be8d-455f-b61c-12fea213a3f4, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:09:47 compute-0 podman[217722]: 2026-01-22 17:09:47.075061918 +0000 UTC m=+0.129422036 container start de6e0593f000d31e8ceafde0b4908ce9bfd4992406fcad5cbdf6e36686c72865 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce346f8d-be8d-455f-b61c-12fea213a3f4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:09:47 compute-0 podman[217735]: 2026-01-22 17:09:47.085956933 +0000 UTC m=+0.046443886 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 17:09:47 compute-0 neutron-haproxy-ovnmeta-ce346f8d-be8d-455f-b61c-12fea213a3f4[217746]: [NOTICE]   (217764) : New worker (217774) forked
Jan 22 17:09:47 compute-0 neutron-haproxy-ovnmeta-ce346f8d-be8d-455f-b61c-12fea213a3f4[217746]: [NOTICE]   (217764) : Loading success.
Jan 22 17:09:47 compute-0 nova_compute[183075]: 2026-01-22 17:09:47.116 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101787.1162822, e4683d56-25f3-42a9-aedd-1b076e9a5245 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:09:47 compute-0 nova_compute[183075]: 2026-01-22 17:09:47.117 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] VM Started (Lifecycle Event)
Jan 22 17:09:47 compute-0 nova_compute[183075]: 2026-01-22 17:09:47.147 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:09:47 compute-0 nova_compute[183075]: 2026-01-22 17:09:47.150 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101787.1173804, e4683d56-25f3-42a9-aedd-1b076e9a5245 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:09:47 compute-0 nova_compute[183075]: 2026-01-22 17:09:47.150 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] VM Paused (Lifecycle Event)
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:47.178 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 096b36b4-87c4-423a-a3ef-3c47a75704f7 in datapath 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c unbound from our chassis
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:47.180 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:47.191 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7453bede-5e9c-49d6-b6cd-ee5848409b57]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:47.192 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9c1e909c-81 in ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:47.193 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9c1e909c-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:47.193 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d7289dac-bc63-44ba-b4c0-566fa1590b12]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:47.194 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6c8dd71f-9422-4327-b7e7-ce202b5e2b1f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:47.206 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[796cac50-e4fb-498f-8d92-630edc982772]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:47.229 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[606463de-63e2-4cea-93ec-46b401109cf2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:47.257 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[4dd0652a-47ab-4cb5-b2c2-b7d80a80db28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:47 compute-0 NetworkManager[55454]: <info>  [1769101787.2663] manager: (tap9c1e909c-80): new Veth device (/org/freedesktop/NetworkManager/Devices/72)
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:47.266 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ce5189f8-0afc-4d9f-bed0-9cff5bfec75e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:47 compute-0 systemd-udevd[217671]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:09:47 compute-0 nova_compute[183075]: 2026-01-22 17:09:47.278 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:09:47 compute-0 nova_compute[183075]: 2026-01-22 17:09:47.281 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:47.296 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[5d1c6318-8b5f-41ac-8236-49e6722a89ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:47 compute-0 nova_compute[183075]: 2026-01-22 17:09:47.297 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:47.299 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[03dfbadf-c424-4efc-aada-356790b0adaa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:47 compute-0 NetworkManager[55454]: <info>  [1769101787.3217] device (tap9c1e909c-80): carrier: link connected
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:47.326 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[353398b1-f247-4c7e-99f8-d994267bddb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:47.343 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6e2f5a69-aa71-45e6-a9b4-809adad1ce16]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9c1e909c-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:65:42:25'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415102, 'reachable_time': 29834, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217793, 'error': None, 'target': 'ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:47.358 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ecd03b43-d03f-418d-a0db-8b73cb5bf1c4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe65:4225'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415102, 'tstamp': 415102}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217794, 'error': None, 'target': 'ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:47.374 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e4637189-cb39-4543-a35d-7099bc1f983e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9c1e909c-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:65:42:25'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415102, 'reachable_time': 29834, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217795, 'error': None, 'target': 'ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:47.404 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[391a1c63-995c-4e7d-a29f-24cdd60c41ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:47.458 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[87d5d1e1-1a91-4144-923f-b3d0ad2094de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:47.461 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9c1e909c-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:47.461 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:47.462 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9c1e909c-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:09:47 compute-0 nova_compute[183075]: 2026-01-22 17:09:47.466 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:47 compute-0 NetworkManager[55454]: <info>  [1769101787.4673] manager: (tap9c1e909c-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Jan 22 17:09:47 compute-0 kernel: tap9c1e909c-80: entered promiscuous mode
Jan 22 17:09:47 compute-0 nova_compute[183075]: 2026-01-22 17:09:47.469 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:47.470 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9c1e909c-80, col_values=(('external_ids', {'iface-id': '02f52a63-476f-468b-a774-c9514d6b2206'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:09:47 compute-0 nova_compute[183075]: 2026-01-22 17:09:47.471 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:47 compute-0 ovn_controller[95372]: 2026-01-22T17:09:47Z|00145|binding|INFO|Releasing lport 02f52a63-476f-468b-a774-c9514d6b2206 from this chassis (sb_readonly=0)
Jan 22 17:09:47 compute-0 nova_compute[183075]: 2026-01-22 17:09:47.474 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:47.474 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9c1e909c-8e03-49be-b02d-6bf4a2cedc0c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9c1e909c-8e03-49be-b02d-6bf4a2cedc0c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:47.475 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[77dbfd40-3b4d-4ae2-9299-2250b9eea73b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:47.476 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/9c1e909c-8e03-49be-b02d-6bf4a2cedc0c.pid.haproxy
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:09:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:47.477 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c', 'env', 'PROCESS_TAG=haproxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9c1e909c-8e03-49be-b02d-6bf4a2cedc0c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:09:47 compute-0 nova_compute[183075]: 2026-01-22 17:09:47.499 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:47 compute-0 podman[217827]: 2026-01-22 17:09:47.845420922 +0000 UTC m=+0.064828306 container create d08ace27ecb69db472cf1c88d34cd385ad0ccc6297f06e0652738cdd93aacc1d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:09:47 compute-0 systemd[1]: Started libpod-conmon-d08ace27ecb69db472cf1c88d34cd385ad0ccc6297f06e0652738cdd93aacc1d.scope.
Jan 22 17:09:47 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:09:47 compute-0 podman[217827]: 2026-01-22 17:09:47.805209901 +0000 UTC m=+0.024617325 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:09:47 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0707cc5d81be2218fac9e3ab46bc067375d6e998cfcdfb09daec75defb5bce1f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:09:47 compute-0 podman[217827]: 2026-01-22 17:09:47.916201093 +0000 UTC m=+0.135608507 container init d08ace27ecb69db472cf1c88d34cd385ad0ccc6297f06e0652738cdd93aacc1d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:09:47 compute-0 podman[217827]: 2026-01-22 17:09:47.921368678 +0000 UTC m=+0.140776062 container start d08ace27ecb69db472cf1c88d34cd385ad0ccc6297f06e0652738cdd93aacc1d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 17:09:47 compute-0 neutron-haproxy-ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[217842]: [NOTICE]   (217846) : New worker (217848) forked
Jan 22 17:09:47 compute-0 neutron-haproxy-ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[217842]: [NOTICE]   (217846) : Loading success.
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.119 183079 DEBUG nova.compute.manager [req-b1059075-a104-41ca-868d-d4e13dcd08f7 req-0212af1a-5bb9-4bb7-b586-fd23ef3543ee a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Received event network-vif-plugged-096b36b4-87c4-423a-a3ef-3c47a75704f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.120 183079 DEBUG oslo_concurrency.lockutils [req-b1059075-a104-41ca-868d-d4e13dcd08f7 req-0212af1a-5bb9-4bb7-b586-fd23ef3543ee a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "e4683d56-25f3-42a9-aedd-1b076e9a5245-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.120 183079 DEBUG oslo_concurrency.lockutils [req-b1059075-a104-41ca-868d-d4e13dcd08f7 req-0212af1a-5bb9-4bb7-b586-fd23ef3543ee a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e4683d56-25f3-42a9-aedd-1b076e9a5245-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.120 183079 DEBUG oslo_concurrency.lockutils [req-b1059075-a104-41ca-868d-d4e13dcd08f7 req-0212af1a-5bb9-4bb7-b586-fd23ef3543ee a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e4683d56-25f3-42a9-aedd-1b076e9a5245-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.121 183079 DEBUG nova.compute.manager [req-b1059075-a104-41ca-868d-d4e13dcd08f7 req-0212af1a-5bb9-4bb7-b586-fd23ef3543ee a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Processing event network-vif-plugged-096b36b4-87c4-423a-a3ef-3c47a75704f7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.122 183079 DEBUG nova.compute.manager [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.126 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101788.1265395, e4683d56-25f3-42a9-aedd-1b076e9a5245 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.127 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] VM Resumed (Lifecycle Event)
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.128 183079 DEBUG nova.virt.libvirt.driver [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.131 183079 INFO nova.virt.libvirt.driver [-] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Instance spawned successfully.
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.132 183079 DEBUG nova.virt.libvirt.driver [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.147 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.152 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.156 183079 DEBUG nova.virt.libvirt.driver [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.157 183079 DEBUG nova.virt.libvirt.driver [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.157 183079 DEBUG nova.virt.libvirt.driver [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.158 183079 DEBUG nova.virt.libvirt.driver [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.158 183079 DEBUG nova.virt.libvirt.driver [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.158 183079 DEBUG nova.virt.libvirt.driver [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.184 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.224 183079 INFO nova.compute.manager [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Took 7.85 seconds to spawn the instance on the hypervisor.
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.225 183079 DEBUG nova.compute.manager [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.242 183079 DEBUG nova.network.neutron [req-6790f3d7-770c-4458-8a24-b812e0b0b6d6 req-9649e322-fac9-4057-aaf4-4af23866fb8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Updated VIF entry in instance network info cache for port 5644ae2a-c35b-431d-88a1-ad18de811d83. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.243 183079 DEBUG nova.network.neutron [req-6790f3d7-770c-4458-8a24-b812e0b0b6d6 req-9649e322-fac9-4057-aaf4-4af23866fb8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Updating instance_info_cache with network_info: [{"id": "5644ae2a-c35b-431d-88a1-ad18de811d83", "address": "fa:16:3e:d4:c8:e5", "network": {"id": "ce346f8d-be8d-455f-b61c-12fea213a3f4", "bridge": "br-int", "label": "tempest-test-network--508539761", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5644ae2a-c3", "ovs_interfaceid": "5644ae2a-c35b-431d-88a1-ad18de811d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.251 183079 DEBUG nova.compute.manager [req-a8102cf7-6650-4fc8-b0cf-4cb0679d46d4 req-ac8852e9-d7ba-4684-aa4f-42c437e94ca3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Received event network-vif-plugged-877fd3c4-01ce-4616-b9d4-92cf337f7f6f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.251 183079 DEBUG oslo_concurrency.lockutils [req-a8102cf7-6650-4fc8-b0cf-4cb0679d46d4 req-ac8852e9-d7ba-4684-aa4f-42c437e94ca3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "7d42beab-5bb4-43a0-9756-ced73188f5ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.251 183079 DEBUG oslo_concurrency.lockutils [req-a8102cf7-6650-4fc8-b0cf-4cb0679d46d4 req-ac8852e9-d7ba-4684-aa4f-42c437e94ca3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7d42beab-5bb4-43a0-9756-ced73188f5ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.252 183079 DEBUG oslo_concurrency.lockutils [req-a8102cf7-6650-4fc8-b0cf-4cb0679d46d4 req-ac8852e9-d7ba-4684-aa4f-42c437e94ca3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7d42beab-5bb4-43a0-9756-ced73188f5ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.252 183079 DEBUG nova.compute.manager [req-a8102cf7-6650-4fc8-b0cf-4cb0679d46d4 req-ac8852e9-d7ba-4684-aa4f-42c437e94ca3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] No waiting events found dispatching network-vif-plugged-877fd3c4-01ce-4616-b9d4-92cf337f7f6f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.252 183079 WARNING nova.compute.manager [req-a8102cf7-6650-4fc8-b0cf-4cb0679d46d4 req-ac8852e9-d7ba-4684-aa4f-42c437e94ca3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Received unexpected event network-vif-plugged-877fd3c4-01ce-4616-b9d4-92cf337f7f6f for instance with vm_state active and task_state deleting.
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.252 183079 DEBUG nova.compute.manager [req-a8102cf7-6650-4fc8-b0cf-4cb0679d46d4 req-ac8852e9-d7ba-4684-aa4f-42c437e94ca3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Received event network-vif-plugged-5644ae2a-c35b-431d-88a1-ad18de811d83 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.252 183079 DEBUG oslo_concurrency.lockutils [req-a8102cf7-6650-4fc8-b0cf-4cb0679d46d4 req-ac8852e9-d7ba-4684-aa4f-42c437e94ca3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.253 183079 DEBUG oslo_concurrency.lockutils [req-a8102cf7-6650-4fc8-b0cf-4cb0679d46d4 req-ac8852e9-d7ba-4684-aa4f-42c437e94ca3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.253 183079 DEBUG oslo_concurrency.lockutils [req-a8102cf7-6650-4fc8-b0cf-4cb0679d46d4 req-ac8852e9-d7ba-4684-aa4f-42c437e94ca3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.253 183079 DEBUG nova.compute.manager [req-a8102cf7-6650-4fc8-b0cf-4cb0679d46d4 req-ac8852e9-d7ba-4684-aa4f-42c437e94ca3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Processing event network-vif-plugged-5644ae2a-c35b-431d-88a1-ad18de811d83 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.253 183079 DEBUG nova.compute.manager [req-a8102cf7-6650-4fc8-b0cf-4cb0679d46d4 req-ac8852e9-d7ba-4684-aa4f-42c437e94ca3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Received event network-vif-plugged-5644ae2a-c35b-431d-88a1-ad18de811d83 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.254 183079 DEBUG oslo_concurrency.lockutils [req-a8102cf7-6650-4fc8-b0cf-4cb0679d46d4 req-ac8852e9-d7ba-4684-aa4f-42c437e94ca3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.254 183079 DEBUG oslo_concurrency.lockutils [req-a8102cf7-6650-4fc8-b0cf-4cb0679d46d4 req-ac8852e9-d7ba-4684-aa4f-42c437e94ca3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.254 183079 DEBUG oslo_concurrency.lockutils [req-a8102cf7-6650-4fc8-b0cf-4cb0679d46d4 req-ac8852e9-d7ba-4684-aa4f-42c437e94ca3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.254 183079 DEBUG nova.compute.manager [req-a8102cf7-6650-4fc8-b0cf-4cb0679d46d4 req-ac8852e9-d7ba-4684-aa4f-42c437e94ca3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] No waiting events found dispatching network-vif-plugged-5644ae2a-c35b-431d-88a1-ad18de811d83 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.254 183079 WARNING nova.compute.manager [req-a8102cf7-6650-4fc8-b0cf-4cb0679d46d4 req-ac8852e9-d7ba-4684-aa4f-42c437e94ca3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Received unexpected event network-vif-plugged-5644ae2a-c35b-431d-88a1-ad18de811d83 for instance with vm_state building and task_state spawning.
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.255 183079 DEBUG nova.compute.manager [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.259 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101788.2587295, 7e8d077b-66fc-42ee-ad4e-a13327ad6764 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.259 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] VM Resumed (Lifecycle Event)
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.260 183079 DEBUG nova.virt.libvirt.driver [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.263 183079 INFO nova.virt.libvirt.driver [-] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Instance spawned successfully.
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.264 183079 DEBUG nova.virt.libvirt.driver [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.268 183079 DEBUG oslo_concurrency.lockutils [req-6790f3d7-770c-4458-8a24-b812e0b0b6d6 req-9649e322-fac9-4057-aaf4-4af23866fb8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-7e8d077b-66fc-42ee-ad4e-a13327ad6764" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.291 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.293 183079 INFO nova.compute.manager [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Took 8.82 seconds to build instance.
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.298 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.300 183079 DEBUG nova.virt.libvirt.driver [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.300 183079 DEBUG nova.virt.libvirt.driver [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.301 183079 DEBUG nova.virt.libvirt.driver [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.301 183079 DEBUG nova.virt.libvirt.driver [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.302 183079 DEBUG nova.virt.libvirt.driver [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.302 183079 DEBUG nova.virt.libvirt.driver [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.325 183079 DEBUG oslo_concurrency.lockutils [None req-ceed4b41-3c81-4fc7-93cc-d77aaa4a4786 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "e4683d56-25f3-42a9-aedd-1b076e9a5245" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.941s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.327 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.364 183079 INFO nova.compute.manager [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Took 8.29 seconds to spawn the instance on the hypervisor.
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.364 183079 DEBUG nova.compute.manager [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.428 183079 INFO nova.compute.manager [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Took 9.07 seconds to build instance.
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.440 183079 DEBUG nova.network.neutron [req-ee7c5064-b2df-45e7-b87b-c8cc22dc3253 req-ca0bcbd1-ed68-4187-a691-a4a002af6434 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Updated VIF entry in instance network info cache for port 096b36b4-87c4-423a-a3ef-3c47a75704f7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.440 183079 DEBUG nova.network.neutron [req-ee7c5064-b2df-45e7-b87b-c8cc22dc3253 req-ca0bcbd1-ed68-4187-a691-a4a002af6434 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Updating instance_info_cache with network_info: [{"id": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "address": "fa:16:3e:f6:53:65", "network": {"id": "9c1e909c-8e03-49be-b02d-6bf4a2cedc0c", "bridge": "br-int", "label": "tempest-test-network--1863582698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap096b36b4-87", "ovs_interfaceid": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.456 183079 DEBUG oslo_concurrency.lockutils [req-ee7c5064-b2df-45e7-b87b-c8cc22dc3253 req-ca0bcbd1-ed68-4187-a691-a4a002af6434 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-e4683d56-25f3-42a9-aedd-1b076e9a5245" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.461 183079 DEBUG oslo_concurrency.lockutils [None req-40fd79af-28a9-4d6f-80f1-ed4e3e485205 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.462 183079 DEBUG nova.network.neutron [-] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.477 183079 INFO nova.compute.manager [-] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Took 2.41 seconds to deallocate network for instance.
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.511 183079 DEBUG oslo_concurrency.lockutils [None req-3932db1c-287f-410b-9e0c-bc3df11d3a47 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.512 183079 DEBUG oslo_concurrency.lockutils [None req-3932db1c-287f-410b-9e0c-bc3df11d3a47 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.618 183079 DEBUG nova.compute.provider_tree [None req-3932db1c-287f-410b-9e0c-bc3df11d3a47 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.643 183079 DEBUG nova.scheduler.client.report [None req-3932db1c-287f-410b-9e0c-bc3df11d3a47 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.682 183079 DEBUG oslo_concurrency.lockutils [None req-3932db1c-287f-410b-9e0c-bc3df11d3a47 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.732 183079 INFO nova.scheduler.client.report [None req-3932db1c-287f-410b-9e0c-bc3df11d3a47 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Deleted allocations for instance 7d42beab-5bb4-43a0-9756-ced73188f5ba
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.795 183079 DEBUG oslo_concurrency.lockutils [None req-3932db1c-287f-410b-9e0c-bc3df11d3a47 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "7d42beab-5bb4-43a0-9756-ced73188f5ba" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.864 183079 INFO nova.compute.manager [None req-76c7db30-ee36-4e3c-a5c0-6c2c7ccf9c41 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Get console output
Jan 22 17:09:48 compute-0 nova_compute[183075]: 2026-01-22 17:09:48.869 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:09:49 compute-0 nova_compute[183075]: 2026-01-22 17:09:49.551 183079 DEBUG oslo_concurrency.lockutils [None req-a0146de4-f4f5-434e-a72e-0f3e11bb9d88 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "ee33030a-2035-4fd1-8de4-261142b89bc6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:49 compute-0 nova_compute[183075]: 2026-01-22 17:09:49.552 183079 DEBUG oslo_concurrency.lockutils [None req-a0146de4-f4f5-434e-a72e-0f3e11bb9d88 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "ee33030a-2035-4fd1-8de4-261142b89bc6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:49 compute-0 nova_compute[183075]: 2026-01-22 17:09:49.552 183079 DEBUG oslo_concurrency.lockutils [None req-a0146de4-f4f5-434e-a72e-0f3e11bb9d88 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "ee33030a-2035-4fd1-8de4-261142b89bc6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:49 compute-0 nova_compute[183075]: 2026-01-22 17:09:49.552 183079 DEBUG oslo_concurrency.lockutils [None req-a0146de4-f4f5-434e-a72e-0f3e11bb9d88 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "ee33030a-2035-4fd1-8de4-261142b89bc6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:49 compute-0 nova_compute[183075]: 2026-01-22 17:09:49.552 183079 DEBUG oslo_concurrency.lockutils [None req-a0146de4-f4f5-434e-a72e-0f3e11bb9d88 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "ee33030a-2035-4fd1-8de4-261142b89bc6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:49 compute-0 nova_compute[183075]: 2026-01-22 17:09:49.553 183079 INFO nova.compute.manager [None req-a0146de4-f4f5-434e-a72e-0f3e11bb9d88 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Terminating instance
Jan 22 17:09:49 compute-0 nova_compute[183075]: 2026-01-22 17:09:49.554 183079 DEBUG nova.compute.manager [None req-a0146de4-f4f5-434e-a72e-0f3e11bb9d88 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:09:49 compute-0 kernel: tap96fce86f-5c (unregistering): left promiscuous mode
Jan 22 17:09:49 compute-0 NetworkManager[55454]: <info>  [1769101789.5845] device (tap96fce86f-5c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:09:49 compute-0 nova_compute[183075]: 2026-01-22 17:09:49.597 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:49 compute-0 nova_compute[183075]: 2026-01-22 17:09:49.600 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:49 compute-0 ovn_controller[95372]: 2026-01-22T17:09:49Z|00146|binding|INFO|Releasing lport 96fce86f-5c20-4930-8c2f-5cb0eb62e7fa from this chassis (sb_readonly=0)
Jan 22 17:09:49 compute-0 ovn_controller[95372]: 2026-01-22T17:09:49Z|00147|binding|INFO|Setting lport 96fce86f-5c20-4930-8c2f-5cb0eb62e7fa down in Southbound
Jan 22 17:09:49 compute-0 ovn_controller[95372]: 2026-01-22T17:09:49Z|00148|binding|INFO|Removing iface tap96fce86f-5c ovn-installed in OVS
Jan 22 17:09:49 compute-0 nova_compute[183075]: 2026-01-22 17:09:49.602 183079 INFO nova.compute.manager [None req-016d4639-1834-4a5b-be37-85c108dd54f7 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Get console output
Jan 22 17:09:49 compute-0 nova_compute[183075]: 2026-01-22 17:09:49.606 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:09:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:49.613 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:77:15 10.100.0.12'], port_security=['fa:16:3e:c6:77:15 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'ee33030a-2035-4fd1-8de4-261142b89bc6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-576f6598-999f-46d9-809a-65b7475a1ec7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e05c7aae349e4a1d859a387df45650a0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b9cc1ac9-85f4-4da7-891e-b0644711e574', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.203', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=92087573-6410-4f80-a195-ce8b2058420e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=96fce86f-5c20-4930-8c2f-5cb0eb62e7fa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:09:49 compute-0 nova_compute[183075]: 2026-01-22 17:09:49.613 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:49.615 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 96fce86f-5c20-4930-8c2f-5cb0eb62e7fa in datapath 576f6598-999f-46d9-809a-65b7475a1ec7 unbound from our chassis
Jan 22 17:09:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:49.617 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 576f6598-999f-46d9-809a-65b7475a1ec7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:09:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:49.619 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c7ea9a35-6e72-4fc1-b58f-0de6ce239e67]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:49.619 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7 namespace which is not needed anymore
Jan 22 17:09:49 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000008.scope: Deactivated successfully.
Jan 22 17:09:49 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000008.scope: Consumed 15.747s CPU time.
Jan 22 17:09:49 compute-0 systemd-machined[154382]: Machine qemu-8-instance-00000008 terminated.
Jan 22 17:09:49 compute-0 neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7[216628]: [NOTICE]   (216632) : haproxy version is 2.8.14-c23fe91
Jan 22 17:09:49 compute-0 neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7[216628]: [NOTICE]   (216632) : path to executable is /usr/sbin/haproxy
Jan 22 17:09:49 compute-0 neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7[216628]: [WARNING]  (216632) : Exiting Master process...
Jan 22 17:09:49 compute-0 neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7[216628]: [ALERT]    (216632) : Current worker (216634) exited with code 143 (Terminated)
Jan 22 17:09:49 compute-0 neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7[216628]: [WARNING]  (216632) : All workers exited. Exiting... (0)
Jan 22 17:09:49 compute-0 systemd[1]: libpod-f2b5b24e23b87235f99f4debb313a36e66992aa00f49bd54657271c674244e6f.scope: Deactivated successfully.
Jan 22 17:09:49 compute-0 podman[217879]: 2026-01-22 17:09:49.76009949 +0000 UTC m=+0.053935912 container died f2b5b24e23b87235f99f4debb313a36e66992aa00f49bd54657271c674244e6f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 22 17:09:49 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f2b5b24e23b87235f99f4debb313a36e66992aa00f49bd54657271c674244e6f-userdata-shm.mount: Deactivated successfully.
Jan 22 17:09:49 compute-0 systemd[1]: var-lib-containers-storage-overlay-db7cdd337bbfefe149ee8eac594df93ad5083e1b3fc7233b3c22e94254963d11-merged.mount: Deactivated successfully.
Jan 22 17:09:49 compute-0 podman[217879]: 2026-01-22 17:09:49.809672986 +0000 UTC m=+0.103509408 container cleanup f2b5b24e23b87235f99f4debb313a36e66992aa00f49bd54657271c674244e6f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:09:49 compute-0 nova_compute[183075]: 2026-01-22 17:09:49.810 183079 INFO nova.virt.libvirt.driver [-] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Instance destroyed successfully.
Jan 22 17:09:49 compute-0 nova_compute[183075]: 2026-01-22 17:09:49.810 183079 DEBUG nova.objects.instance [None req-a0146de4-f4f5-434e-a72e-0f3e11bb9d88 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lazy-loading 'resources' on Instance uuid ee33030a-2035-4fd1-8de4-261142b89bc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:09:49 compute-0 systemd[1]: libpod-conmon-f2b5b24e23b87235f99f4debb313a36e66992aa00f49bd54657271c674244e6f.scope: Deactivated successfully.
Jan 22 17:09:49 compute-0 nova_compute[183075]: 2026-01-22 17:09:49.832 183079 DEBUG nova.virt.libvirt.vif [None req-a0146de4-f4f5-434e-a72e-0f3e11bb9d88 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:08:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-197701255',display_name='tempest-server-test-197701255',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-197701255',id=8,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFroJSJbEvCYe2sCwXkwL5KPVfwqqRPNqrWU5nnhv8U4G8wYWbEqk5AtRWXltmX7CAIPBST+sTwCKfymhALKd54XNN92D6KKnKgA6jmOihENah4FGfU80fx1UEE0osLpiw==',key_name='tempest-keypair-test-618317104',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:08:44Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e05c7aae349e4a1d859a387df45650a0',ramdisk_id='',reservation_id='r-9ywe3qjk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIpSeparateNetwork-931877966',owner_user_name='tempest-FloatingIpSeparateNetwork-931877966-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:08:44Z,user_data=None,user_id='cd47d63cff2548a88e21e5c2e6a5c161',uuid=ee33030a-2035-4fd1-8de4-261142b89bc6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "96fce86f-5c20-4930-8c2f-5cb0eb62e7fa", "address": "fa:16:3e:c6:77:15", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96fce86f-5c", "ovs_interfaceid": "96fce86f-5c20-4930-8c2f-5cb0eb62e7fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:09:49 compute-0 nova_compute[183075]: 2026-01-22 17:09:49.832 183079 DEBUG nova.network.os_vif_util [None req-a0146de4-f4f5-434e-a72e-0f3e11bb9d88 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converting VIF {"id": "96fce86f-5c20-4930-8c2f-5cb0eb62e7fa", "address": "fa:16:3e:c6:77:15", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96fce86f-5c", "ovs_interfaceid": "96fce86f-5c20-4930-8c2f-5cb0eb62e7fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:09:49 compute-0 nova_compute[183075]: 2026-01-22 17:09:49.833 183079 DEBUG nova.network.os_vif_util [None req-a0146de4-f4f5-434e-a72e-0f3e11bb9d88 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:77:15,bridge_name='br-int',has_traffic_filtering=True,id=96fce86f-5c20-4930-8c2f-5cb0eb62e7fa,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap96fce86f-5c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:09:49 compute-0 nova_compute[183075]: 2026-01-22 17:09:49.834 183079 DEBUG os_vif [None req-a0146de4-f4f5-434e-a72e-0f3e11bb9d88 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:77:15,bridge_name='br-int',has_traffic_filtering=True,id=96fce86f-5c20-4930-8c2f-5cb0eb62e7fa,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap96fce86f-5c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:09:49 compute-0 nova_compute[183075]: 2026-01-22 17:09:49.835 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:49 compute-0 nova_compute[183075]: 2026-01-22 17:09:49.836 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap96fce86f-5c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:09:49 compute-0 nova_compute[183075]: 2026-01-22 17:09:49.840 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:49 compute-0 nova_compute[183075]: 2026-01-22 17:09:49.841 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:09:49 compute-0 nova_compute[183075]: 2026-01-22 17:09:49.843 183079 INFO os_vif [None req-a0146de4-f4f5-434e-a72e-0f3e11bb9d88 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:77:15,bridge_name='br-int',has_traffic_filtering=True,id=96fce86f-5c20-4930-8c2f-5cb0eb62e7fa,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap96fce86f-5c')
Jan 22 17:09:49 compute-0 nova_compute[183075]: 2026-01-22 17:09:49.844 183079 INFO nova.virt.libvirt.driver [None req-a0146de4-f4f5-434e-a72e-0f3e11bb9d88 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Deleting instance files /var/lib/nova/instances/ee33030a-2035-4fd1-8de4-261142b89bc6_del
Jan 22 17:09:49 compute-0 nova_compute[183075]: 2026-01-22 17:09:49.844 183079 INFO nova.virt.libvirt.driver [None req-a0146de4-f4f5-434e-a72e-0f3e11bb9d88 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Deletion of /var/lib/nova/instances/ee33030a-2035-4fd1-8de4-261142b89bc6_del complete
Jan 22 17:09:49 compute-0 podman[217923]: 2026-01-22 17:09:49.887229984 +0000 UTC m=+0.056481538 container remove f2b5b24e23b87235f99f4debb313a36e66992aa00f49bd54657271c674244e6f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 22 17:09:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:49.892 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d53d85c0-94ac-4bfd-ad1f-91e601703004]: (4, ('Thu Jan 22 05:09:49 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7 (f2b5b24e23b87235f99f4debb313a36e66992aa00f49bd54657271c674244e6f)\nf2b5b24e23b87235f99f4debb313a36e66992aa00f49bd54657271c674244e6f\nThu Jan 22 05:09:49 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7 (f2b5b24e23b87235f99f4debb313a36e66992aa00f49bd54657271c674244e6f)\nf2b5b24e23b87235f99f4debb313a36e66992aa00f49bd54657271c674244e6f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:49.895 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3962c749-37ae-45db-bd3f-17f23a5a55e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:49.896 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap576f6598-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:09:49 compute-0 nova_compute[183075]: 2026-01-22 17:09:49.898 183079 INFO nova.compute.manager [None req-a0146de4-f4f5-434e-a72e-0f3e11bb9d88 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Took 0.34 seconds to destroy the instance on the hypervisor.
Jan 22 17:09:49 compute-0 nova_compute[183075]: 2026-01-22 17:09:49.898 183079 DEBUG oslo.service.loopingcall [None req-a0146de4-f4f5-434e-a72e-0f3e11bb9d88 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:09:49 compute-0 nova_compute[183075]: 2026-01-22 17:09:49.899 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:49 compute-0 kernel: tap576f6598-90: left promiscuous mode
Jan 22 17:09:49 compute-0 nova_compute[183075]: 2026-01-22 17:09:49.900 183079 DEBUG nova.compute.manager [-] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:09:49 compute-0 nova_compute[183075]: 2026-01-22 17:09:49.900 183079 DEBUG nova.network.neutron [-] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:09:49 compute-0 nova_compute[183075]: 2026-01-22 17:09:49.913 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:49.916 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[68a336ae-1b4b-4f54-91b3-8c48e3a08f4a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:49.929 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8a0e05b7-ccd8-4ea6-950a-47d3310a6c89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:49.931 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[cbfa8bd1-84e4-4c1d-8ec8-dfe8521eeb71]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:49.945 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ce5ede2a-2694-4234-aaae-ccd6d0a2f2a0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 408685, 'reachable_time': 23840, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217938, 'error': None, 'target': 'ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:49 compute-0 systemd[1]: run-netns-ovnmeta\x2d576f6598\x2d999f\x2d46d9\x2d809a\x2d65b7475a1ec7.mount: Deactivated successfully.
Jan 22 17:09:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:49.947 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:09:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:49.947 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[4fcac22d-39e4-4fb5-a0b8-43a003e47997]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:50 compute-0 nova_compute[183075]: 2026-01-22 17:09:50.036 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:50 compute-0 nova_compute[183075]: 2026-01-22 17:09:50.236 183079 DEBUG nova.compute.manager [req-6af56c3b-0172-4e90-aecb-abe889363793 req-710c0f0f-5747-4c80-9e8b-13bad6d6de19 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Received event network-vif-plugged-096b36b4-87c4-423a-a3ef-3c47a75704f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:09:50 compute-0 nova_compute[183075]: 2026-01-22 17:09:50.237 183079 DEBUG oslo_concurrency.lockutils [req-6af56c3b-0172-4e90-aecb-abe889363793 req-710c0f0f-5747-4c80-9e8b-13bad6d6de19 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "e4683d56-25f3-42a9-aedd-1b076e9a5245-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:50 compute-0 nova_compute[183075]: 2026-01-22 17:09:50.237 183079 DEBUG oslo_concurrency.lockutils [req-6af56c3b-0172-4e90-aecb-abe889363793 req-710c0f0f-5747-4c80-9e8b-13bad6d6de19 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e4683d56-25f3-42a9-aedd-1b076e9a5245-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:50 compute-0 nova_compute[183075]: 2026-01-22 17:09:50.238 183079 DEBUG oslo_concurrency.lockutils [req-6af56c3b-0172-4e90-aecb-abe889363793 req-710c0f0f-5747-4c80-9e8b-13bad6d6de19 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e4683d56-25f3-42a9-aedd-1b076e9a5245-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:50 compute-0 nova_compute[183075]: 2026-01-22 17:09:50.239 183079 DEBUG nova.compute.manager [req-6af56c3b-0172-4e90-aecb-abe889363793 req-710c0f0f-5747-4c80-9e8b-13bad6d6de19 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] No waiting events found dispatching network-vif-plugged-096b36b4-87c4-423a-a3ef-3c47a75704f7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:09:50 compute-0 nova_compute[183075]: 2026-01-22 17:09:50.240 183079 WARNING nova.compute.manager [req-6af56c3b-0172-4e90-aecb-abe889363793 req-710c0f0f-5747-4c80-9e8b-13bad6d6de19 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Received unexpected event network-vif-plugged-096b36b4-87c4-423a-a3ef-3c47a75704f7 for instance with vm_state active and task_state None.
Jan 22 17:09:50 compute-0 nova_compute[183075]: 2026-01-22 17:09:50.351 183079 DEBUG nova.compute.manager [req-322f4f34-43c7-4db7-9321-14b7e8cc5ac2 req-73102138-8e40-4be1-a455-f7fdd8f49f81 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Received event network-vif-unplugged-96fce86f-5c20-4930-8c2f-5cb0eb62e7fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:09:50 compute-0 nova_compute[183075]: 2026-01-22 17:09:50.352 183079 DEBUG oslo_concurrency.lockutils [req-322f4f34-43c7-4db7-9321-14b7e8cc5ac2 req-73102138-8e40-4be1-a455-f7fdd8f49f81 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "ee33030a-2035-4fd1-8de4-261142b89bc6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:50 compute-0 nova_compute[183075]: 2026-01-22 17:09:50.352 183079 DEBUG oslo_concurrency.lockutils [req-322f4f34-43c7-4db7-9321-14b7e8cc5ac2 req-73102138-8e40-4be1-a455-f7fdd8f49f81 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "ee33030a-2035-4fd1-8de4-261142b89bc6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:50 compute-0 nova_compute[183075]: 2026-01-22 17:09:50.352 183079 DEBUG oslo_concurrency.lockutils [req-322f4f34-43c7-4db7-9321-14b7e8cc5ac2 req-73102138-8e40-4be1-a455-f7fdd8f49f81 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "ee33030a-2035-4fd1-8de4-261142b89bc6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:50 compute-0 nova_compute[183075]: 2026-01-22 17:09:50.352 183079 DEBUG nova.compute.manager [req-322f4f34-43c7-4db7-9321-14b7e8cc5ac2 req-73102138-8e40-4be1-a455-f7fdd8f49f81 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] No waiting events found dispatching network-vif-unplugged-96fce86f-5c20-4930-8c2f-5cb0eb62e7fa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:09:50 compute-0 nova_compute[183075]: 2026-01-22 17:09:50.352 183079 DEBUG nova.compute.manager [req-322f4f34-43c7-4db7-9321-14b7e8cc5ac2 req-73102138-8e40-4be1-a455-f7fdd8f49f81 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Received event network-vif-unplugged-96fce86f-5c20-4930-8c2f-5cb0eb62e7fa for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:09:50 compute-0 nova_compute[183075]: 2026-01-22 17:09:50.353 183079 DEBUG nova.compute.manager [req-322f4f34-43c7-4db7-9321-14b7e8cc5ac2 req-73102138-8e40-4be1-a455-f7fdd8f49f81 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Received event network-vif-plugged-96fce86f-5c20-4930-8c2f-5cb0eb62e7fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:09:50 compute-0 nova_compute[183075]: 2026-01-22 17:09:50.353 183079 DEBUG oslo_concurrency.lockutils [req-322f4f34-43c7-4db7-9321-14b7e8cc5ac2 req-73102138-8e40-4be1-a455-f7fdd8f49f81 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "ee33030a-2035-4fd1-8de4-261142b89bc6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:50 compute-0 nova_compute[183075]: 2026-01-22 17:09:50.353 183079 DEBUG oslo_concurrency.lockutils [req-322f4f34-43c7-4db7-9321-14b7e8cc5ac2 req-73102138-8e40-4be1-a455-f7fdd8f49f81 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "ee33030a-2035-4fd1-8de4-261142b89bc6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:50 compute-0 nova_compute[183075]: 2026-01-22 17:09:50.353 183079 DEBUG oslo_concurrency.lockutils [req-322f4f34-43c7-4db7-9321-14b7e8cc5ac2 req-73102138-8e40-4be1-a455-f7fdd8f49f81 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "ee33030a-2035-4fd1-8de4-261142b89bc6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:50 compute-0 nova_compute[183075]: 2026-01-22 17:09:50.353 183079 DEBUG nova.compute.manager [req-322f4f34-43c7-4db7-9321-14b7e8cc5ac2 req-73102138-8e40-4be1-a455-f7fdd8f49f81 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] No waiting events found dispatching network-vif-plugged-96fce86f-5c20-4930-8c2f-5cb0eb62e7fa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:09:50 compute-0 nova_compute[183075]: 2026-01-22 17:09:50.353 183079 WARNING nova.compute.manager [req-322f4f34-43c7-4db7-9321-14b7e8cc5ac2 req-73102138-8e40-4be1-a455-f7fdd8f49f81 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Received unexpected event network-vif-plugged-96fce86f-5c20-4930-8c2f-5cb0eb62e7fa for instance with vm_state active and task_state deleting.
Jan 22 17:09:50 compute-0 nova_compute[183075]: 2026-01-22 17:09:50.550 183079 INFO nova.compute.manager [None req-7a975df2-dd25-48d4-9f16-96288129247a 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Get console output
Jan 22 17:09:50 compute-0 nova_compute[183075]: 2026-01-22 17:09:50.557 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:09:51 compute-0 nova_compute[183075]: 2026-01-22 17:09:51.469 183079 DEBUG nova.network.neutron [-] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:09:51 compute-0 nova_compute[183075]: 2026-01-22 17:09:51.501 183079 INFO nova.compute.manager [-] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Took 1.60 seconds to deallocate network for instance.
Jan 22 17:09:51 compute-0 nova_compute[183075]: 2026-01-22 17:09:51.565 183079 DEBUG oslo_concurrency.lockutils [None req-a0146de4-f4f5-434e-a72e-0f3e11bb9d88 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:51 compute-0 nova_compute[183075]: 2026-01-22 17:09:51.566 183079 DEBUG oslo_concurrency.lockutils [None req-a0146de4-f4f5-434e-a72e-0f3e11bb9d88 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:51 compute-0 nova_compute[183075]: 2026-01-22 17:09:51.837 183079 DEBUG nova.compute.provider_tree [None req-a0146de4-f4f5-434e-a72e-0f3e11bb9d88 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:09:51 compute-0 nova_compute[183075]: 2026-01-22 17:09:51.850 183079 DEBUG nova.scheduler.client.report [None req-a0146de4-f4f5-434e-a72e-0f3e11bb9d88 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:09:51 compute-0 nova_compute[183075]: 2026-01-22 17:09:51.871 183079 DEBUG oslo_concurrency.lockutils [None req-a0146de4-f4f5-434e-a72e-0f3e11bb9d88 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.305s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:51 compute-0 nova_compute[183075]: 2026-01-22 17:09:51.907 183079 INFO nova.scheduler.client.report [None req-a0146de4-f4f5-434e-a72e-0f3e11bb9d88 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Deleted allocations for instance ee33030a-2035-4fd1-8de4-261142b89bc6
Jan 22 17:09:51 compute-0 nova_compute[183075]: 2026-01-22 17:09:51.983 183079 DEBUG oslo_concurrency.lockutils [None req-a0146de4-f4f5-434e-a72e-0f3e11bb9d88 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "ee33030a-2035-4fd1-8de4-261142b89bc6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.431s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:52 compute-0 nova_compute[183075]: 2026-01-22 17:09:52.607 183079 DEBUG nova.compute.manager [req-a1e34201-7f87-43e7-b30f-e2486387141c req-7e521deb-945f-4858-b7c8-02a2d8c638bf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Received event network-changed-d53b3c49-e24c-4e07-944d-d72baf3994e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:09:52 compute-0 nova_compute[183075]: 2026-01-22 17:09:52.607 183079 DEBUG nova.compute.manager [req-a1e34201-7f87-43e7-b30f-e2486387141c req-7e521deb-945f-4858-b7c8-02a2d8c638bf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Refreshing instance network info cache due to event network-changed-d53b3c49-e24c-4e07-944d-d72baf3994e0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:09:52 compute-0 nova_compute[183075]: 2026-01-22 17:09:52.607 183079 DEBUG oslo_concurrency.lockutils [req-a1e34201-7f87-43e7-b30f-e2486387141c req-7e521deb-945f-4858-b7c8-02a2d8c638bf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-1fa7475b-9f51-4229-8ded-3a0c4de806c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:09:52 compute-0 nova_compute[183075]: 2026-01-22 17:09:52.608 183079 DEBUG oslo_concurrency.lockutils [req-a1e34201-7f87-43e7-b30f-e2486387141c req-7e521deb-945f-4858-b7c8-02a2d8c638bf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-1fa7475b-9f51-4229-8ded-3a0c4de806c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:09:52 compute-0 nova_compute[183075]: 2026-01-22 17:09:52.608 183079 DEBUG nova.network.neutron [req-a1e34201-7f87-43e7-b30f-e2486387141c req-7e521deb-945f-4858-b7c8-02a2d8c638bf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Refreshing network info cache for port d53b3c49-e24c-4e07-944d-d72baf3994e0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:09:53 compute-0 podman[217940]: 2026-01-22 17:09:53.420301542 +0000 UTC m=+0.096232148 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 22 17:09:53 compute-0 podman[217939]: 2026-01-22 17:09:53.468139833 +0000 UTC m=+0.146193194 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:09:53 compute-0 nova_compute[183075]: 2026-01-22 17:09:53.994 183079 DEBUG oslo_concurrency.lockutils [None req-bd53f738-1e8d-4445-ac45-787410ceb51a 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Acquiring lock "1fa7475b-9f51-4229-8ded-3a0c4de806c5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:53 compute-0 nova_compute[183075]: 2026-01-22 17:09:53.995 183079 DEBUG oslo_concurrency.lockutils [None req-bd53f738-1e8d-4445-ac45-787410ceb51a 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Lock "1fa7475b-9f51-4229-8ded-3a0c4de806c5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:53 compute-0 nova_compute[183075]: 2026-01-22 17:09:53.995 183079 DEBUG oslo_concurrency.lockutils [None req-bd53f738-1e8d-4445-ac45-787410ceb51a 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Acquiring lock "1fa7475b-9f51-4229-8ded-3a0c4de806c5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:53 compute-0 nova_compute[183075]: 2026-01-22 17:09:53.995 183079 DEBUG oslo_concurrency.lockutils [None req-bd53f738-1e8d-4445-ac45-787410ceb51a 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Lock "1fa7475b-9f51-4229-8ded-3a0c4de806c5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:53 compute-0 nova_compute[183075]: 2026-01-22 17:09:53.996 183079 DEBUG oslo_concurrency.lockutils [None req-bd53f738-1e8d-4445-ac45-787410ceb51a 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Lock "1fa7475b-9f51-4229-8ded-3a0c4de806c5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:53 compute-0 nova_compute[183075]: 2026-01-22 17:09:53.997 183079 INFO nova.compute.manager [None req-bd53f738-1e8d-4445-ac45-787410ceb51a 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Terminating instance
Jan 22 17:09:53 compute-0 nova_compute[183075]: 2026-01-22 17:09:53.998 183079 DEBUG nova.compute.manager [None req-bd53f738-1e8d-4445-ac45-787410ceb51a 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.000 183079 DEBUG nova.network.neutron [req-a1e34201-7f87-43e7-b30f-e2486387141c req-7e521deb-945f-4858-b7c8-02a2d8c638bf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Updated VIF entry in instance network info cache for port d53b3c49-e24c-4e07-944d-d72baf3994e0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.001 183079 DEBUG nova.network.neutron [req-a1e34201-7f87-43e7-b30f-e2486387141c req-7e521deb-945f-4858-b7c8-02a2d8c638bf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Updating instance_info_cache with network_info: [{"id": "d53b3c49-e24c-4e07-944d-d72baf3994e0", "address": "fa:16:3e:72:f9:88", "network": {"id": "a2733777-0394-47df-88c8-302fae8b0aef", "bridge": "br-int", "label": "tempest-test-network--1786359447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a08acddaa8b748f8b4fbb432b95408d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd53b3c49-e2", "ovs_interfaceid": "d53b3c49-e24c-4e07-944d-d72baf3994e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.003 183079 INFO nova.compute.manager [None req-6d0c15b3-f17b-4cbf-a903-7bf5c273c434 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Get console output
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.010 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.021 183079 DEBUG oslo_concurrency.lockutils [req-a1e34201-7f87-43e7-b30f-e2486387141c req-7e521deb-945f-4858-b7c8-02a2d8c638bf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-1fa7475b-9f51-4229-8ded-3a0c4de806c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:09:54 compute-0 kernel: tapd53b3c49-e2 (unregistering): left promiscuous mode
Jan 22 17:09:54 compute-0 NetworkManager[55454]: <info>  [1769101794.0378] device (tapd53b3c49-e2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.079 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:54 compute-0 ovn_controller[95372]: 2026-01-22T17:09:54Z|00149|binding|INFO|Releasing lport d53b3c49-e24c-4e07-944d-d72baf3994e0 from this chassis (sb_readonly=0)
Jan 22 17:09:54 compute-0 ovn_controller[95372]: 2026-01-22T17:09:54Z|00150|binding|INFO|Setting lport d53b3c49-e24c-4e07-944d-d72baf3994e0 down in Southbound
Jan 22 17:09:54 compute-0 ovn_controller[95372]: 2026-01-22T17:09:54Z|00151|binding|INFO|Removing iface tapd53b3c49-e2 ovn-installed in OVS
Jan 22 17:09:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:54.088 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:72:f9:88 10.100.0.7'], port_security=['fa:16:3e:72:f9:88 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '1fa7475b-9f51-4229-8ded-3a0c4de806c5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a2733777-0394-47df-88c8-302fae8b0aef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a08acddaa8b748f8b4fbb432b95408d1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3c4cdf4d-bf9c-4640-a79a-ea997ee124d5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.245'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fe09b5f5-d61e-4c9e-ad28-b743a26596fc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=d53b3c49-e24c-4e07-944d-d72baf3994e0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:09:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:54.090 104629 INFO neutron.agent.ovn.metadata.agent [-] Port d53b3c49-e24c-4e07-944d-d72baf3994e0 in datapath a2733777-0394-47df-88c8-302fae8b0aef unbound from our chassis
Jan 22 17:09:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:54.091 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a2733777-0394-47df-88c8-302fae8b0aef, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.086 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:54.093 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[175a346d-21c8-486e-ac7c-b2ce2ed1b156]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:54.094 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a2733777-0394-47df-88c8-302fae8b0aef namespace which is not needed anymore
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.096 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:54 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000009.scope: Deactivated successfully.
Jan 22 17:09:54 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000009.scope: Consumed 14.204s CPU time.
Jan 22 17:09:54 compute-0 systemd-machined[154382]: Machine qemu-9-instance-00000009 terminated.
Jan 22 17:09:54 compute-0 neutron-haproxy-ovnmeta-a2733777-0394-47df-88c8-302fae8b0aef[217006]: [NOTICE]   (217010) : haproxy version is 2.8.14-c23fe91
Jan 22 17:09:54 compute-0 neutron-haproxy-ovnmeta-a2733777-0394-47df-88c8-302fae8b0aef[217006]: [NOTICE]   (217010) : path to executable is /usr/sbin/haproxy
Jan 22 17:09:54 compute-0 neutron-haproxy-ovnmeta-a2733777-0394-47df-88c8-302fae8b0aef[217006]: [WARNING]  (217010) : Exiting Master process...
Jan 22 17:09:54 compute-0 neutron-haproxy-ovnmeta-a2733777-0394-47df-88c8-302fae8b0aef[217006]: [WARNING]  (217010) : Exiting Master process...
Jan 22 17:09:54 compute-0 neutron-haproxy-ovnmeta-a2733777-0394-47df-88c8-302fae8b0aef[217006]: [ALERT]    (217010) : Current worker (217012) exited with code 143 (Terminated)
Jan 22 17:09:54 compute-0 neutron-haproxy-ovnmeta-a2733777-0394-47df-88c8-302fae8b0aef[217006]: [WARNING]  (217010) : All workers exited. Exiting... (0)
Jan 22 17:09:54 compute-0 systemd[1]: libpod-f5c7be3cd8cf91f488b76e6d98ff905752304abfae464bcffcf5594c59e3cf63.scope: Deactivated successfully.
Jan 22 17:09:54 compute-0 podman[218005]: 2026-01-22 17:09:54.224250475 +0000 UTC m=+0.047171175 container died f5c7be3cd8cf91f488b76e6d98ff905752304abfae464bcffcf5594c59e3cf63 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2733777-0394-47df-88c8-302fae8b0aef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true)
Jan 22 17:09:54 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f5c7be3cd8cf91f488b76e6d98ff905752304abfae464bcffcf5594c59e3cf63-userdata-shm.mount: Deactivated successfully.
Jan 22 17:09:54 compute-0 systemd[1]: var-lib-containers-storage-overlay-ea4763da4ca0b3487e0caf7be75a3f64ec6586a9c55f7d467a972ce94a3f654f-merged.mount: Deactivated successfully.
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.259 183079 INFO nova.virt.libvirt.driver [-] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Instance destroyed successfully.
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.260 183079 DEBUG nova.objects.instance [None req-bd53f738-1e8d-4445-ac45-787410ceb51a 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Lazy-loading 'resources' on Instance uuid 1fa7475b-9f51-4229-8ded-3a0c4de806c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:09:54 compute-0 podman[218005]: 2026-01-22 17:09:54.263519792 +0000 UTC m=+0.086440482 container cleanup f5c7be3cd8cf91f488b76e6d98ff905752304abfae464bcffcf5594c59e3cf63 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2733777-0394-47df-88c8-302fae8b0aef, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.270 183079 DEBUG nova.virt.libvirt.vif [None req-bd53f738-1e8d-4445-ac45-787410ceb51a 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:08:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-901772077',display_name='tempest-server-test-901772077',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-901772077',id=9,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHQuJuvkIPPJyGB3O37xHb6l4WpEoydCT10tXlWa50WKy7gHgurcOWRMNAPM4HnhDWmUgLmLU1COdO9GFsKDe7/yg9HhZhk7hp24KACIdwwfi6BYaCn9sbAGhcwglP9yYw==',key_name='tempest-keypair-test-1152764916',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:08:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a08acddaa8b748f8b4fbb432b95408d1',ramdisk_id='',reservation_id='r-hxqj1zr5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_in
put_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-NetworkBasicTest-330550171',owner_user_name='tempest-NetworkBasicTest-330550171-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:08:59Z,user_data=None,user_id='56c736d5c1ab41d8a02fcbc021d28353',uuid=1fa7475b-9f51-4229-8ded-3a0c4de806c5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d53b3c49-e24c-4e07-944d-d72baf3994e0", "address": "fa:16:3e:72:f9:88", "network": {"id": "a2733777-0394-47df-88c8-302fae8b0aef", "bridge": "br-int", "label": "tempest-test-network--1786359447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a08acddaa8b748f8b4fbb432b95408d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd53b3c49-e2", "ovs_interfaceid": "d53b3c49-e24c-4e07-944d-d72baf3994e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.271 183079 DEBUG nova.network.os_vif_util [None req-bd53f738-1e8d-4445-ac45-787410ceb51a 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Converting VIF {"id": "d53b3c49-e24c-4e07-944d-d72baf3994e0", "address": "fa:16:3e:72:f9:88", "network": {"id": "a2733777-0394-47df-88c8-302fae8b0aef", "bridge": "br-int", "label": "tempest-test-network--1786359447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a08acddaa8b748f8b4fbb432b95408d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd53b3c49-e2", "ovs_interfaceid": "d53b3c49-e24c-4e07-944d-d72baf3994e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.272 183079 DEBUG nova.network.os_vif_util [None req-bd53f738-1e8d-4445-ac45-787410ceb51a 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:72:f9:88,bridge_name='br-int',has_traffic_filtering=True,id=d53b3c49-e24c-4e07-944d-d72baf3994e0,network=Network(a2733777-0394-47df-88c8-302fae8b0aef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd53b3c49-e2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.272 183079 DEBUG os_vif [None req-bd53f738-1e8d-4445-ac45-787410ceb51a 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:72:f9:88,bridge_name='br-int',has_traffic_filtering=True,id=d53b3c49-e24c-4e07-944d-d72baf3994e0,network=Network(a2733777-0394-47df-88c8-302fae8b0aef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd53b3c49-e2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.274 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.275 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd53b3c49-e2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.277 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:54 compute-0 systemd[1]: libpod-conmon-f5c7be3cd8cf91f488b76e6d98ff905752304abfae464bcffcf5594c59e3cf63.scope: Deactivated successfully.
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.278 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.283 183079 INFO os_vif [None req-bd53f738-1e8d-4445-ac45-787410ceb51a 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:72:f9:88,bridge_name='br-int',has_traffic_filtering=True,id=d53b3c49-e24c-4e07-944d-d72baf3994e0,network=Network(a2733777-0394-47df-88c8-302fae8b0aef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd53b3c49-e2')
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.284 183079 INFO nova.virt.libvirt.driver [None req-bd53f738-1e8d-4445-ac45-787410ceb51a 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Deleting instance files /var/lib/nova/instances/1fa7475b-9f51-4229-8ded-3a0c4de806c5_del
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.285 183079 INFO nova.virt.libvirt.driver [None req-bd53f738-1e8d-4445-ac45-787410ceb51a 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Deletion of /var/lib/nova/instances/1fa7475b-9f51-4229-8ded-3a0c4de806c5_del complete
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.329 183079 INFO nova.compute.manager [None req-bd53f738-1e8d-4445-ac45-787410ceb51a 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Took 0.33 seconds to destroy the instance on the hypervisor.
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.330 183079 DEBUG oslo.service.loopingcall [None req-bd53f738-1e8d-4445-ac45-787410ceb51a 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.330 183079 DEBUG nova.compute.manager [-] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.331 183079 DEBUG nova.network.neutron [-] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:09:54 compute-0 podman[218047]: 2026-01-22 17:09:54.345318811 +0000 UTC m=+0.052464253 container remove f5c7be3cd8cf91f488b76e6d98ff905752304abfae464bcffcf5594c59e3cf63 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2733777-0394-47df-88c8-302fae8b0aef, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 17:09:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:54.351 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d7cab29a-4041-47e6-b7ee-7c77b82e4bda]: (4, ('Thu Jan 22 05:09:54 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a2733777-0394-47df-88c8-302fae8b0aef (f5c7be3cd8cf91f488b76e6d98ff905752304abfae464bcffcf5594c59e3cf63)\nf5c7be3cd8cf91f488b76e6d98ff905752304abfae464bcffcf5594c59e3cf63\nThu Jan 22 05:09:54 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a2733777-0394-47df-88c8-302fae8b0aef (f5c7be3cd8cf91f488b76e6d98ff905752304abfae464bcffcf5594c59e3cf63)\nf5c7be3cd8cf91f488b76e6d98ff905752304abfae464bcffcf5594c59e3cf63\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:54.354 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[cd15bcbc-12f6-4e2c-b93e-3176eec0b73a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:54.355 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2733777-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.357 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:54 compute-0 kernel: tapa2733777-00: left promiscuous mode
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.359 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.374 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:54.377 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6fcf77d0-dc89-4b7f-b267-9afabd2eb5dc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:54.398 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[fd9dfe28-6184-4909-adfe-9c23d9ed321e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:54.398 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d7c91366-4fda-4c11-a5bb-9b7659109af5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:54.413 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2760272f-278a-4664-8c23-d7090f3af275]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 410232, 'reachable_time': 25307, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218067, 'error': None, 'target': 'ovnmeta-a2733777-0394-47df-88c8-302fae8b0aef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:54.416 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a2733777-0394-47df-88c8-302fae8b0aef deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:09:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:09:54.416 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[2b296595-fadd-4653-a1d0-51ddc8f7056c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:09:54 compute-0 systemd[1]: run-netns-ovnmeta\x2da2733777\x2d0394\x2d47df\x2d88c8\x2d302fae8b0aef.mount: Deactivated successfully.
Jan 22 17:09:54 compute-0 podman[218059]: 2026-01-22 17:09:54.448456047 +0000 UTC m=+0.051674842 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, maintainer=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.buildah.version=1.33.7)
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.716 183079 DEBUG nova.compute.manager [req-f2a38089-51de-4d7e-af33-1e7495fc18ec req-f81ddde1-b45e-4a49-a744-deb40e620f75 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Received event network-vif-unplugged-d53b3c49-e24c-4e07-944d-d72baf3994e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.717 183079 DEBUG oslo_concurrency.lockutils [req-f2a38089-51de-4d7e-af33-1e7495fc18ec req-f81ddde1-b45e-4a49-a744-deb40e620f75 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "1fa7475b-9f51-4229-8ded-3a0c4de806c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.717 183079 DEBUG oslo_concurrency.lockutils [req-f2a38089-51de-4d7e-af33-1e7495fc18ec req-f81ddde1-b45e-4a49-a744-deb40e620f75 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "1fa7475b-9f51-4229-8ded-3a0c4de806c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.717 183079 DEBUG oslo_concurrency.lockutils [req-f2a38089-51de-4d7e-af33-1e7495fc18ec req-f81ddde1-b45e-4a49-a744-deb40e620f75 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "1fa7475b-9f51-4229-8ded-3a0c4de806c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.718 183079 DEBUG nova.compute.manager [req-f2a38089-51de-4d7e-af33-1e7495fc18ec req-f81ddde1-b45e-4a49-a744-deb40e620f75 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] No waiting events found dispatching network-vif-unplugged-d53b3c49-e24c-4e07-944d-d72baf3994e0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.718 183079 DEBUG nova.compute.manager [req-f2a38089-51de-4d7e-af33-1e7495fc18ec req-f81ddde1-b45e-4a49-a744-deb40e620f75 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Received event network-vif-unplugged-d53b3c49-e24c-4e07-944d-d72baf3994e0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.754 183079 INFO nova.compute.manager [None req-2f1fbce6-ba79-4bbe-964a-7f1b604a9525 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Get console output
Jan 22 17:09:54 compute-0 nova_compute[183075]: 2026-01-22 17:09:54.762 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:09:55 compute-0 nova_compute[183075]: 2026-01-22 17:09:55.037 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:55 compute-0 nova_compute[183075]: 2026-01-22 17:09:55.412 183079 DEBUG nova.network.neutron [-] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:09:55 compute-0 nova_compute[183075]: 2026-01-22 17:09:55.427 183079 INFO nova.compute.manager [-] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Took 1.10 seconds to deallocate network for instance.
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.451 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'name': 'tempest-server-test-233401537', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000b', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c2b37b797ca344f2b31c3861277068d8', 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'hostId': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.456 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'e4683d56-25f3-42a9-aedd-1b076e9a5245', 'name': 'tempest-server-test-590988812', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000c', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'bfc6667804934c92b71ce7638089e9e3', 'user_id': '1e61127d65144bcbaa0d43fe3eb484c0', 'hostId': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.457 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:09:55 compute-0 nova_compute[183075]: 2026-01-22 17:09:55.464 183079 DEBUG oslo_concurrency.lockutils [None req-bd53f738-1e8d-4445-ac45-787410ceb51a 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:55 compute-0 nova_compute[183075]: 2026-01-22 17:09:55.465 183079 DEBUG oslo_concurrency.lockutils [None req-bd53f738-1e8d-4445-ac45-787410ceb51a 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.470 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.481 12 DEBUG ceilometer.compute.pollsters [-] e4683d56-25f3-42a9-aedd-1b076e9a5245/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '38811e51-4697-4d2d-b4cb-f53385bacbb4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764-vda', 'timestamp': '2026-01-22T17:09:55.457606', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'instance-0000000b', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2c69197a-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.218148403, 'message_signature': 'b5e70e89221e575ac4130250934681eda53a0f5890b5577e258039020fdd1085'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 
'e4683d56-25f3-42a9-aedd-1b076e9a5245-vda', 'timestamp': '2026-01-22T17:09:55.457606', 'resource_metadata': {'display_name': 'tempest-server-test-590988812', 'name': 'instance-0000000c', 'instance_id': 'e4683d56-25f3-42a9-aedd-1b076e9a5245', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2c6ad1a2-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.232277292, 'message_signature': '74acb0737d6a2bc82e6f103196cf5b1f2d598d9997893492bf8e9bea40d56611'}]}, 'timestamp': '2026-01-22 17:09:55.482856', '_unique_id': '1ceb32c3e9784daaaa7653a51e54f3c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.485 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.486 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.505 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.520 12 DEBUG ceilometer.compute.pollsters [-] e4683d56-25f3-42a9-aedd-1b076e9a5245/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd1fd9328-a213-43ec-8fff-2fef598368d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764-vda', 'timestamp': '2026-01-22T17:09:55.486994', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'instance-0000000b', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2c6e6254-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.247461619, 'message_signature': 'e37f9ca55d27ec6eb78a4b7426a831dd447932e55ddc0a7042eefa06034fe422'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 
'e4683d56-25f3-42a9-aedd-1b076e9a5245-vda', 'timestamp': '2026-01-22T17:09:55.486994', 'resource_metadata': {'display_name': 'tempest-server-test-590988812', 'name': 'instance-0000000c', 'instance_id': 'e4683d56-25f3-42a9-aedd-1b076e9a5245', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2c70b388-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.266543758, 'message_signature': 'ea3714a5fd896c524e4fead73fde6bae5be89e50303aed1ad3ac668d1881283a'}]}, 'timestamp': '2026-01-22 17:09:55.521258', '_unique_id': 'd71d2041eeec4670ba157de1996530f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.522 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.523 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.523 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.523 12 DEBUG ceilometer.compute.pollsters [-] e4683d56-25f3-42a9-aedd-1b076e9a5245/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8c17b95d-f102-412f-a972-10c3880375a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764-vda', 'timestamp': '2026-01-22T17:09:55.523132', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'instance-0000000b', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2c710ce8-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.247461619, 'message_signature': 'b4b7ffd66385f7cac348e16a2e4401a4ac6eeb0b4f828d5df7914dcf6eabdbdb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 
'e4683d56-25f3-42a9-aedd-1b076e9a5245-vda', 'timestamp': '2026-01-22T17:09:55.523132', 'resource_metadata': {'display_name': 'tempest-server-test-590988812', 'name': 'instance-0000000c', 'instance_id': 'e4683d56-25f3-42a9-aedd-1b076e9a5245', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2c711828-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.266543758, 'message_signature': '4ffb2db8d22baf61bbe5930ab430a2adf990117696b86991d707c8882237478c'}]}, 'timestamp': '2026-01-22 17:09:55.523745', '_unique_id': 'd843f0d0a40846c48080d3f64a813ad8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.524 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.525 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.527 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 7e8d077b-66fc-42ee-ad4e-a13327ad6764 / tap5644ae2a-c3 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.527 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.530 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for e4683d56-25f3-42a9-aedd-1b076e9a5245 / tap096b36b4-87 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.531 12 DEBUG ceilometer.compute.pollsters [-] e4683d56-25f3-42a9-aedd-1b076e9a5245/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7401dc54-2b9f-4277-ad6f-e7d31bb0a307', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': 'instance-0000000b-7e8d077b-66fc-42ee-ad4e-a13327ad6764-tap5644ae2a-c3', 'timestamp': '2026-01-22T17:09:55.525235', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'tap5644ae2a-c3', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:c8:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5644ae2a-c3'}, 'message_id': '2c71c9bc-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.285693769, 'message_signature': '9777356ffc53c60040d41b2e8f7845b44fa6836c5b883f5844a5e136928b29d7'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 'instance-0000000c-e4683d56-25f3-42a9-aedd-1b076e9a5245-tap096b36b4-87', 'timestamp': '2026-01-22T17:09:55.525235', 'resource_metadata': {'display_name': 'tempest-server-test-590988812', 'name': 'tap096b36b4-87', 'instance_id': 'e4683d56-25f3-42a9-aedd-1b076e9a5245', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f6:53:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap096b36b4-87'}, 'message_id': '2c724734-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.288680507, 'message_signature': '8dcbba17808fbd0299fef1510bd46d341e0900df50a4c045772c510317082194'}]}, 'timestamp': '2026-01-22 17:09:55.531504', '_unique_id': 'd760acf3fb1441c5bad1eb036af7bcec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.532 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.533 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.533 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.533 12 DEBUG ceilometer.compute.pollsters [-] e4683d56-25f3-42a9-aedd-1b076e9a5245/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ac5682d-07e2-44bb-af01-605f8ed9506b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764-vda', 'timestamp': '2026-01-22T17:09:55.533235', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'instance-0000000b', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2c72973e-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.218148403, 'message_signature': '1d7078c6121aa07ea181113dc3b3e46a56b733be2d11840d5b3b826902b8c1ce'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 'e4683d56-25f3-42a9-aedd-1b076e9a5245-vda', 'timestamp': '2026-01-22T17:09:55.533235', 'resource_metadata': {'display_name': 'tempest-server-test-590988812', 'name': 'instance-0000000c', 'instance_id': 'e4683d56-25f3-42a9-aedd-1b076e9a5245', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2c72a184-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.232277292, 'message_signature': '4003f98555a98a894b1cfb2574f358b08430bf833c1ad7a48279c91d8a7fb115'}]}, 'timestamp': '2026-01-22 17:09:55.533783', '_unique_id': '8965e095f26b493091ed31d754ac33e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.534 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.535 12 DEBUG ceilometer.compute.pollsters [-] e4683d56-25f3-42a9-aedd-1b076e9a5245/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c50bba9-15dd-4bc2-ad19-a4d9f927da21', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': 'instance-0000000b-7e8d077b-66fc-42ee-ad4e-a13327ad6764-tap5644ae2a-c3', 'timestamp': '2026-01-22T17:09:55.534961', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'tap5644ae2a-c3', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:c8:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5644ae2a-c3'}, 'message_id': '2c72daaa-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.285693769, 'message_signature': 'af3021d3f5e5fbb8d3eb30fcefaaec694b56dfc29781f2e40e8fd37cb4f58579'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 'instance-0000000c-e4683d56-25f3-42a9-aedd-1b076e9a5245-tap096b36b4-87', 'timestamp': '2026-01-22T17:09:55.534961', 'resource_metadata': {'display_name': 'tempest-server-test-590988812', 'name': 'tap096b36b4-87', 'instance_id': 'e4683d56-25f3-42a9-aedd-1b076e9a5245', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f6:53:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap096b36b4-87'}, 'message_id': '2c72e5fe-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.288680507, 'message_signature': '767005fe57c4cd5b053b688670ad4497abe50dcf0736a971ec4d93257baa8a0b'}]}, 'timestamp': '2026-01-22 17:09:55.535553', '_unique_id': '3b11956d1d2d4fb9802b5a2c65dbb823'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.536 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.537 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.537 12 DEBUG ceilometer.compute.pollsters [-] e4683d56-25f3-42a9-aedd-1b076e9a5245/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5916234f-0aa4-4751-9a44-befca48a2621', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': 'instance-0000000b-7e8d077b-66fc-42ee-ad4e-a13327ad6764-tap5644ae2a-c3', 'timestamp': '2026-01-22T17:09:55.537030', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'tap5644ae2a-c3', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:c8:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5644ae2a-c3'}, 'message_id': '2c732b72-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.285693769, 'message_signature': '9717174ab7249a9b793b12c5ae0a97c6fe95dddc78043ddc115d4d11c9f8e66a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 'instance-0000000c-e4683d56-25f3-42a9-aedd-1b076e9a5245-tap096b36b4-87', 'timestamp': '2026-01-22T17:09:55.537030', 'resource_metadata': {'display_name': 'tempest-server-test-590988812', 'name': 'tap096b36b4-87', 'instance_id': 'e4683d56-25f3-42a9-aedd-1b076e9a5245', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f6:53:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap096b36b4-87'}, 'message_id': '2c73369e-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.288680507, 'message_signature': 'a872bce018f728818236c7b889df5afcea431fd341637bd6f788f11115d5131a'}]}, 'timestamp': '2026-01-22 17:09:55.537616', '_unique_id': '0cf6b395c6fd44d1b4167bcd6014b606'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.538 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.539 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.555 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/cpu volume: 7030000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 nova_compute[183075]: 2026-01-22 17:09:55.554 183079 DEBUG nova.compute.provider_tree [None req-bd53f738-1e8d-4445-ac45-787410ceb51a 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:09:55 compute-0 nova_compute[183075]: 2026-01-22 17:09:55.571 183079 DEBUG nova.scheduler.client.report [None req-bd53f738-1e8d-4445-ac45-787410ceb51a 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.582 12 DEBUG ceilometer.compute.pollsters [-] e4683d56-25f3-42a9-aedd-1b076e9a5245/cpu volume: 7110000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0bb52523-d4a5-4e9a-b183-88160ce4bfa5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7030000000, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'timestamp': '2026-01-22T17:09:55.539256', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'instance-0000000b', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '2c75f64a-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.315464938, 'message_signature': 'fa9925fd54325666c09b44b6a1863b73f5379b77f6f61d8336e33d4a539cd222'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7110000000, 'user_id': '1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 'e4683d56-25f3-42a9-aedd-1b076e9a5245', 
'timestamp': '2026-01-22T17:09:55.539256', 'resource_metadata': {'display_name': 'tempest-server-test-590988812', 'name': 'instance-0000000c', 'instance_id': 'e4683d56-25f3-42a9-aedd-1b076e9a5245', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '2c7a2120-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.342758301, 'message_signature': '3c3af8e78834181c8be97762a7681ad74ae16e31c29171d94d73d8f75566c6a1'}]}, 'timestamp': '2026-01-22 17:09:55.582967', '_unique_id': 'e9cf456820ed41d486f25449904083f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.583 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.584 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.584 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.584 12 DEBUG ceilometer.compute.pollsters [-] e4683d56-25f3-42a9-aedd-1b076e9a5245/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '001cd1fe-eabc-4e99-be20-c7cd802cdb30', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': 'instance-0000000b-7e8d077b-66fc-42ee-ad4e-a13327ad6764-tap5644ae2a-c3', 'timestamp': '2026-01-22T17:09:55.584690', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'tap5644ae2a-c3', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:c8:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5644ae2a-c3'}, 'message_id': '2c7a701c-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.285693769, 'message_signature': 'cc1672c7e8c3bd355ad78ebe4972a53fed6ffa56b52d886a9fcc22681e8b8bae'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 'instance-0000000c-e4683d56-25f3-42a9-aedd-1b076e9a5245-tap096b36b4-87', 'timestamp': '2026-01-22T17:09:55.584690', 'resource_metadata': {'display_name': 'tempest-server-test-590988812', 'name': 'tap096b36b4-87', 'instance_id': 'e4683d56-25f3-42a9-aedd-1b076e9a5245', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f6:53:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap096b36b4-87'}, 'message_id': '2c7a785a-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.288680507, 'message_signature': '581072663d2c24f7b60b99897020beb62f8166416348e96e6f34c8e106dfa667'}]}, 'timestamp': '2026-01-22 17:09:55.585135', '_unique_id': '7ea417bf96b141f191de09183d0a7b0c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.585 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.586 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.586 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.586 12 DEBUG ceilometer.compute.pollsters [-] e4683d56-25f3-42a9-aedd-1b076e9a5245/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7cde9585-f55b-497a-98c6-21132e037ea8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': 'instance-0000000b-7e8d077b-66fc-42ee-ad4e-a13327ad6764-tap5644ae2a-c3', 'timestamp': '2026-01-22T17:09:55.586259', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'tap5644ae2a-c3', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:c8:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5644ae2a-c3'}, 'message_id': '2c7aad02-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.285693769, 'message_signature': '943e5739bc7b3f94226e396145cc118c0bb07bb84d6bd5e5270d8a77a27a9848'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 'instance-0000000c-e4683d56-25f3-42a9-aedd-1b076e9a5245-tap096b36b4-87', 'timestamp': '2026-01-22T17:09:55.586259', 'resource_metadata': {'display_name': 'tempest-server-test-590988812', 'name': 'tap096b36b4-87', 'instance_id': 'e4683d56-25f3-42a9-aedd-1b076e9a5245', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f6:53:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap096b36b4-87'}, 'message_id': '2c7ab50e-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.288680507, 'message_signature': '1653ffc2d67acb390ca5074b6a432ee67693f31f47a6fcc23b4f1d1953354450'}]}, 'timestamp': '2026-01-22 17:09:55.586712', '_unique_id': 'ef7e43a6485249b98516240335ed4be2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.587 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.588 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.588 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-233401537>, <NovaLikeServer: tempest-server-test-590988812>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-233401537>, <NovaLikeServer: tempest-server-test-590988812>]
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.588 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.588 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.588 12 DEBUG ceilometer.compute.pollsters [-] e4683d56-25f3-42a9-aedd-1b076e9a5245/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e5119ca-3a64-441b-8eb0-3ebb41e2659b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764-vda', 'timestamp': '2026-01-22T17:09:55.588364', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'instance-0000000b', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2c7b0004-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.218148403, 'message_signature': 'de7b80f2bf3d3551c7aa58f6b1917286ab64155a744e18cbd5a440fa8c56f7fa'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 'e4683d56-25f3-42a9-aedd-1b076e9a5245-vda', 'timestamp': '2026-01-22T17:09:55.588364', 'resource_metadata': {'display_name': 'tempest-server-test-590988812', 'name': 'instance-0000000c', 'instance_id': 'e4683d56-25f3-42a9-aedd-1b076e9a5245', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2c7b0946-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.232277292, 'message_signature': '89ec2a8f4c5c8b7e4cbaee4fd99b6078b828de0dfd617578723a5938920eb064'}]}, 'timestamp': '2026-01-22 17:09:55.588838', '_unique_id': '7c100394939a4256ae5c1b33e66941e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.589 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 DEBUG ceilometer.compute.pollsters [-] e4683d56-25f3-42a9-aedd-1b076e9a5245/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cb606e7e-c397-4919-91d6-3ec210a7d234', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764-vda', 'timestamp': '2026-01-22T17:09:55.589954', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'instance-0000000b', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2c7b3d1c-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.247461619, 'message_signature': 'f47b5ffd612fc9c2c5dd1c40c5694750a073a12054f237f30bb6a57e3dc8aaf3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 
'e4683d56-25f3-42a9-aedd-1b076e9a5245-vda', 'timestamp': '2026-01-22T17:09:55.589954', 'resource_metadata': {'display_name': 'tempest-server-test-590988812', 'name': 'instance-0000000c', 'instance_id': 'e4683d56-25f3-42a9-aedd-1b076e9a5245', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2c7b4596-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.266543758, 'message_signature': '22ffec3be875ae2c3ad3ae33a1cbb68ba7c8fe9900d51c449a7a1afa52353e26'}]}, 'timestamp': '2026-01-22 17:09:55.590381', '_unique_id': '3a22c9830fbd461eb766fd75745bb998'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.590 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.591 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.591 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.591 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-233401537>, <NovaLikeServer: tempest-server-test-590988812>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-233401537>, <NovaLikeServer: tempest-server-test-590988812>]
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.591 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.591 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 nova_compute[183075]: 2026-01-22 17:09:55.594 183079 DEBUG oslo_concurrency.lockutils [None req-bd53f738-1e8d-4445-ac45-787410ceb51a 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 DEBUG ceilometer.compute.pollsters [-] e4683d56-25f3-42a9-aedd-1b076e9a5245/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6aeb7d8a-d85e-4d25-9a2c-39cc206d11f1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764-vda', 'timestamp': '2026-01-22T17:09:55.591934', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'instance-0000000b', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2c7b8a42-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.247461619, 'message_signature': 'c879d8a63d1d39023321b60117b7c57ad520f4768d1876a1a73149eff89b00ef'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 
'e4683d56-25f3-42a9-aedd-1b076e9a5245-vda', 'timestamp': '2026-01-22T17:09:55.591934', 'resource_metadata': {'display_name': 'tempest-server-test-590988812', 'name': 'instance-0000000c', 'instance_id': 'e4683d56-25f3-42a9-aedd-1b076e9a5245', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2c7b9320-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.266543758, 'message_signature': '8ff167a61117600d2330753f2119ae47cc8b3bd26dbf396def3979bf61abbf23'}]}, 'timestamp': '2026-01-22 17:09:55.592384', '_unique_id': '04526b396f16456a85abf3db5bedc08a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.592 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.593 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.593 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.593 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-233401537>, <NovaLikeServer: tempest-server-test-590988812>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-233401537>, <NovaLikeServer: tempest-server-test-590988812>]
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.593 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.593 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk.device.read.latency volume: 125155986 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 DEBUG ceilometer.compute.pollsters [-] e4683d56-25f3-42a9-aedd-1b076e9a5245/disk.device.read.latency volume: 106971241 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a8a7dc8-d20f-42b8-8770-b4374226a364', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 125155986, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764-vda', 'timestamp': '2026-01-22T17:09:55.593788', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'instance-0000000b', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2c7bd2ae-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.247461619, 'message_signature': '8cc714e027e5e084a6ff7f86868bfa966c780e4da83a344e1ac37e8052693c63'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 106971241, 'user_id': '1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 
'e4683d56-25f3-42a9-aedd-1b076e9a5245-vda', 'timestamp': '2026-01-22T17:09:55.593788', 'resource_metadata': {'display_name': 'tempest-server-test-590988812', 'name': 'instance-0000000c', 'instance_id': 'e4683d56-25f3-42a9-aedd-1b076e9a5245', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2c7bda4c-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.266543758, 'message_signature': 'ff76763104c61306ad0f6f8a1112dc44d9cbaff33297568f0d05f9ffcab1ccea'}]}, 'timestamp': '2026-01-22 17:09:55.594211', '_unique_id': '981e439829f54ea7b150f3fa87803aba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.594 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.595 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.595 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 DEBUG ceilometer.compute.pollsters [-] e4683d56-25f3-42a9-aedd-1b076e9a5245/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '94aae1bb-5de4-42ef-8bc0-5df4e2fbb380', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': 'instance-0000000b-7e8d077b-66fc-42ee-ad4e-a13327ad6764-tap5644ae2a-c3', 'timestamp': '2026-01-22T17:09:55.595776', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'tap5644ae2a-c3', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:c8:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5644ae2a-c3'}, 'message_id': '2c7c2146-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.285693769, 'message_signature': '8df41577b8c716022bef4a3faecbfa7095d97fed99446259550e111f117e1dee'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 'instance-0000000c-e4683d56-25f3-42a9-aedd-1b076e9a5245-tap096b36b4-87', 'timestamp': '2026-01-22T17:09:55.595776', 'resource_metadata': {'display_name': 'tempest-server-test-590988812', 'name': 'tap096b36b4-87', 'instance_id': 'e4683d56-25f3-42a9-aedd-1b076e9a5245', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f6:53:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap096b36b4-87'}, 'message_id': '2c7c2952-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.288680507, 'message_signature': '8ad5dd233272911fb83f0d1ba132047916b3433cce83f25b80f5981038f928ec'}]}, 'timestamp': '2026-01-22 17:09:55.596242', '_unique_id': 'f055db4ba6a7446c9be17258fc3b2695'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.596 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.597 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.597 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.597 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 7e8d077b-66fc-42ee-ad4e-a13327ad6764: ceilometer.compute.pollsters.NoVolumeException
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.597 12 DEBUG ceilometer.compute.pollsters [-] e4683d56-25f3-42a9-aedd-1b076e9a5245/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.597 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance e4683d56-25f3-42a9-aedd-1b076e9a5245: ceilometer.compute.pollsters.NoVolumeException
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.597 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.598 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.598 12 DEBUG ceilometer.compute.pollsters [-] e4683d56-25f3-42a9-aedd-1b076e9a5245/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd120f1a5-9ae7-4838-8b9e-2d037f189991', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': 'instance-0000000b-7e8d077b-66fc-42ee-ad4e-a13327ad6764-tap5644ae2a-c3', 'timestamp': '2026-01-22T17:09:55.598075', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'tap5644ae2a-c3', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:c8:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5644ae2a-c3'}, 'message_id': '2c7c7b28-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.285693769, 'message_signature': 'f619f1bf9cd8fdf8a74e42bea877389bd98189df136ed2a2b60900f66dd21d61'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 'instance-0000000c-e4683d56-25f3-42a9-aedd-1b076e9a5245-tap096b36b4-87', 'timestamp': '2026-01-22T17:09:55.598075', 'resource_metadata': {'display_name': 'tempest-server-test-590988812', 'name': 'tap096b36b4-87', 'instance_id': 'e4683d56-25f3-42a9-aedd-1b076e9a5245', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f6:53:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap096b36b4-87'}, 'message_id': '2c7c858c-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.288680507, 'message_signature': '913ee668276a36bcec84083e72eb40a0dbb5c10d8f0711ec343721ab2dd56e32'}]}, 'timestamp': '2026-01-22 17:09:55.598582', '_unique_id': '63b810a9120b4a5ba2e71bbfd35b949c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.599 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 DEBUG ceilometer.compute.pollsters [-] e4683d56-25f3-42a9-aedd-1b076e9a5245/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a2daee06-3e60-4ac4-b1fc-5d52b9b58d50', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': 'instance-0000000b-7e8d077b-66fc-42ee-ad4e-a13327ad6764-tap5644ae2a-c3', 'timestamp': '2026-01-22T17:09:55.599805', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'tap5644ae2a-c3', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:c8:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5644ae2a-c3'}, 'message_id': '2c7cbe1c-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.285693769, 'message_signature': '82bdbaf388dc9e54391df74cec086bf3d13c103d70ed5d4c3fc1248460723910'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 
'1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 'instance-0000000c-e4683d56-25f3-42a9-aedd-1b076e9a5245-tap096b36b4-87', 'timestamp': '2026-01-22T17:09:55.599805', 'resource_metadata': {'display_name': 'tempest-server-test-590988812', 'name': 'tap096b36b4-87', 'instance_id': 'e4683d56-25f3-42a9-aedd-1b076e9a5245', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f6:53:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap096b36b4-87'}, 'message_id': '2c7cc628-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.288680507, 'message_signature': '62bfeb91d565a87ac24e7de7260c6c199d5bd5a1966ab88a7e11787439b4e620'}]}, 'timestamp': '2026-01-22 17:09:55.600262', '_unique_id': '14f706767cfc4095976109f30f2bf1ed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.600 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.601 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.601 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.601 12 DEBUG ceilometer.compute.pollsters [-] e4683d56-25f3-42a9-aedd-1b076e9a5245/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '50fdef69-7931-4033-830f-895078e82a24', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': 'instance-0000000b-7e8d077b-66fc-42ee-ad4e-a13327ad6764-tap5644ae2a-c3', 'timestamp': '2026-01-22T17:09:55.601517', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'tap5644ae2a-c3', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:c8:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5644ae2a-c3'}, 'message_id': '2c7d0304-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.285693769, 'message_signature': '4d135aa0a1849377d4dcf4bc80903ea2b38a95342b474dc2dff8e9c54c4a3646'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 'instance-0000000c-e4683d56-25f3-42a9-aedd-1b076e9a5245-tap096b36b4-87', 'timestamp': '2026-01-22T17:09:55.601517', 'resource_metadata': {'display_name': 'tempest-server-test-590988812', 'name': 'tap096b36b4-87', 'instance_id': 'e4683d56-25f3-42a9-aedd-1b076e9a5245', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f6:53:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap096b36b4-87'}, 'message_id': '2c7d0b60-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.288680507, 'message_signature': 'baf879052cd321a411f7248a72587bec08018cc01285be1b8da4a81015b00240'}]}, 'timestamp': '2026-01-22 17:09:55.602004', '_unique_id': 'e6f85439a6c24e54bfc696f6b467fda9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.602 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.603 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.603 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.603 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-233401537>, <NovaLikeServer: tempest-server-test-590988812>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-233401537>, <NovaLikeServer: tempest-server-test-590988812>]
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.603 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.603 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 DEBUG ceilometer.compute.pollsters [-] e4683d56-25f3-42a9-aedd-1b076e9a5245/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6e8efac2-eea5-41b9-b8be-d10efae2ce12', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': 'instance-0000000b-7e8d077b-66fc-42ee-ad4e-a13327ad6764-tap5644ae2a-c3', 'timestamp': '2026-01-22T17:09:55.603795', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'tap5644ae2a-c3', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:c8:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5644ae2a-c3'}, 'message_id': '2c7d5a20-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.285693769, 'message_signature': '4475bc7c17679d316c564096a8f7410711b21dc28370bf1b029c671cffeaede6'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 
'1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 'instance-0000000c-e4683d56-25f3-42a9-aedd-1b076e9a5245-tap096b36b4-87', 'timestamp': '2026-01-22T17:09:55.603795', 'resource_metadata': {'display_name': 'tempest-server-test-590988812', 'name': 'tap096b36b4-87', 'instance_id': 'e4683d56-25f3-42a9-aedd-1b076e9a5245', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f6:53:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap096b36b4-87'}, 'message_id': '2c7d62e0-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.288680507, 'message_signature': 'a72d897bd5ecfebe02ab05379619508ad0b67001c848d1d7405e48a6f3423074'}]}, 'timestamp': '2026-01-22 17:09:55.604267', '_unique_id': 'a38b5dbc1e7a4bb4a150ed875e6d00e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.604 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.605 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.605 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.605 12 DEBUG ceilometer.compute.pollsters [-] e4683d56-25f3-42a9-aedd-1b076e9a5245/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15bf0110-f8be-4829-9dd8-0f5e2c72776b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764-vda', 'timestamp': '2026-01-22T17:09:55.605489', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'instance-0000000b', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2c7d9bf2-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.247461619, 'message_signature': '13d2b8f902b395528d6cb8ca1bbc71b630c717f9c2e2f7495b9bb3104a743b23'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 
'e4683d56-25f3-42a9-aedd-1b076e9a5245-vda', 'timestamp': '2026-01-22T17:09:55.605489', 'resource_metadata': {'display_name': 'tempest-server-test-590988812', 'name': 'instance-0000000c', 'instance_id': 'e4683d56-25f3-42a9-aedd-1b076e9a5245', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2c7da4e4-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4159.266543758, 'message_signature': '547945231b08512d71305a9037993d5e9a82610e6fd2a2780ed7d839825900d8'}]}, 'timestamp': '2026-01-22 17:09:55.605926', '_unique_id': 'aa1ff2139fcc4de3a804d82bea964471'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:09:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:09:55.606 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:09:55 compute-0 nova_compute[183075]: 2026-01-22 17:09:55.621 183079 INFO nova.scheduler.client.report [None req-bd53f738-1e8d-4445-ac45-787410ceb51a 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Deleted allocations for instance 1fa7475b-9f51-4229-8ded-3a0c4de806c5
Jan 22 17:09:55 compute-0 nova_compute[183075]: 2026-01-22 17:09:55.672 183079 DEBUG oslo_concurrency.lockutils [None req-bd53f738-1e8d-4445-ac45-787410ceb51a 56c736d5c1ab41d8a02fcbc021d28353 a08acddaa8b748f8b4fbb432b95408d1 - - default default] Lock "1fa7475b-9f51-4229-8ded-3a0c4de806c5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.677s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:55 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 17:09:56 compute-0 nova_compute[183075]: 2026-01-22 17:09:56.916 183079 DEBUG nova.compute.manager [req-26cf6ec1-45f5-4afe-b7d5-b924b9d2045b req-1dfe8206-f803-452d-b6e5-15f0df51a4f4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Received event network-vif-plugged-d53b3c49-e24c-4e07-944d-d72baf3994e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:09:56 compute-0 nova_compute[183075]: 2026-01-22 17:09:56.917 183079 DEBUG oslo_concurrency.lockutils [req-26cf6ec1-45f5-4afe-b7d5-b924b9d2045b req-1dfe8206-f803-452d-b6e5-15f0df51a4f4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "1fa7475b-9f51-4229-8ded-3a0c4de806c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:56 compute-0 nova_compute[183075]: 2026-01-22 17:09:56.919 183079 DEBUG oslo_concurrency.lockutils [req-26cf6ec1-45f5-4afe-b7d5-b924b9d2045b req-1dfe8206-f803-452d-b6e5-15f0df51a4f4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "1fa7475b-9f51-4229-8ded-3a0c4de806c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:56 compute-0 nova_compute[183075]: 2026-01-22 17:09:56.920 183079 DEBUG oslo_concurrency.lockutils [req-26cf6ec1-45f5-4afe-b7d5-b924b9d2045b req-1dfe8206-f803-452d-b6e5-15f0df51a4f4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "1fa7475b-9f51-4229-8ded-3a0c4de806c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:56 compute-0 nova_compute[183075]: 2026-01-22 17:09:56.920 183079 DEBUG nova.compute.manager [req-26cf6ec1-45f5-4afe-b7d5-b924b9d2045b req-1dfe8206-f803-452d-b6e5-15f0df51a4f4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] No waiting events found dispatching network-vif-plugged-d53b3c49-e24c-4e07-944d-d72baf3994e0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:09:56 compute-0 nova_compute[183075]: 2026-01-22 17:09:56.921 183079 WARNING nova.compute.manager [req-26cf6ec1-45f5-4afe-b7d5-b924b9d2045b req-1dfe8206-f803-452d-b6e5-15f0df51a4f4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Received unexpected event network-vif-plugged-d53b3c49-e24c-4e07-944d-d72baf3994e0 for instance with vm_state deleted and task_state None.
Jan 22 17:09:56 compute-0 nova_compute[183075]: 2026-01-22 17:09:56.921 183079 DEBUG nova.compute.manager [req-26cf6ec1-45f5-4afe-b7d5-b924b9d2045b req-1dfe8206-f803-452d-b6e5-15f0df51a4f4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Received event network-vif-deleted-d53b3c49-e24c-4e07-944d-d72baf3994e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:09:59 compute-0 nova_compute[183075]: 2026-01-22 17:09:59.122 183079 INFO nova.compute.manager [None req-514d11e5-0744-4448-b671-7c52a5b4a1c7 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Get console output
Jan 22 17:09:59 compute-0 nova_compute[183075]: 2026-01-22 17:09:59.128 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:09:59 compute-0 nova_compute[183075]: 2026-01-22 17:09:59.277 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:09:59 compute-0 nova_compute[183075]: 2026-01-22 17:09:59.342 183079 DEBUG oslo_concurrency.lockutils [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "618a8c78-4b30-4b4d-9617-4c54bfdd414e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:59 compute-0 nova_compute[183075]: 2026-01-22 17:09:59.343 183079 DEBUG oslo_concurrency.lockutils [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "618a8c78-4b30-4b4d-9617-4c54bfdd414e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:59 compute-0 nova_compute[183075]: 2026-01-22 17:09:59.367 183079 DEBUG nova.compute.manager [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:09:59 compute-0 nova_compute[183075]: 2026-01-22 17:09:59.434 183079 DEBUG oslo_concurrency.lockutils [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:59 compute-0 nova_compute[183075]: 2026-01-22 17:09:59.435 183079 DEBUG oslo_concurrency.lockutils [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:59 compute-0 nova_compute[183075]: 2026-01-22 17:09:59.441 183079 DEBUG nova.virt.hardware [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:09:59 compute-0 nova_compute[183075]: 2026-01-22 17:09:59.441 183079 INFO nova.compute.claims [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:09:59 compute-0 nova_compute[183075]: 2026-01-22 17:09:59.595 183079 DEBUG nova.compute.provider_tree [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:09:59 compute-0 nova_compute[183075]: 2026-01-22 17:09:59.612 183079 DEBUG nova.scheduler.client.report [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:09:59 compute-0 nova_compute[183075]: 2026-01-22 17:09:59.628 183079 DEBUG oslo_concurrency.lockutils [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.194s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:59 compute-0 nova_compute[183075]: 2026-01-22 17:09:59.629 183079 DEBUG nova.compute.manager [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:09:59 compute-0 nova_compute[183075]: 2026-01-22 17:09:59.695 183079 DEBUG nova.compute.manager [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:09:59 compute-0 nova_compute[183075]: 2026-01-22 17:09:59.696 183079 DEBUG nova.network.neutron [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:09:59 compute-0 nova_compute[183075]: 2026-01-22 17:09:59.715 183079 INFO nova.virt.libvirt.driver [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:09:59 compute-0 nova_compute[183075]: 2026-01-22 17:09:59.730 183079 DEBUG nova.compute.manager [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:09:59 compute-0 nova_compute[183075]: 2026-01-22 17:09:59.821 183079 DEBUG nova.compute.manager [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:09:59 compute-0 nova_compute[183075]: 2026-01-22 17:09:59.822 183079 DEBUG nova.virt.libvirt.driver [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:09:59 compute-0 nova_compute[183075]: 2026-01-22 17:09:59.823 183079 INFO nova.virt.libvirt.driver [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Creating image(s)
Jan 22 17:09:59 compute-0 nova_compute[183075]: 2026-01-22 17:09:59.824 183079 DEBUG oslo_concurrency.lockutils [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "/var/lib/nova/instances/618a8c78-4b30-4b4d-9617-4c54bfdd414e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:59 compute-0 nova_compute[183075]: 2026-01-22 17:09:59.824 183079 DEBUG oslo_concurrency.lockutils [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "/var/lib/nova/instances/618a8c78-4b30-4b4d-9617-4c54bfdd414e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:59 compute-0 nova_compute[183075]: 2026-01-22 17:09:59.825 183079 DEBUG oslo_concurrency.lockutils [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "/var/lib/nova/instances/618a8c78-4b30-4b4d-9617-4c54bfdd414e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:59 compute-0 nova_compute[183075]: 2026-01-22 17:09:59.841 183079 DEBUG oslo_concurrency.processutils [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:09:59 compute-0 nova_compute[183075]: 2026-01-22 17:09:59.896 183079 INFO nova.compute.manager [None req-7f28221a-8af3-4f39-ba1d-837bb80f7280 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Get console output
Jan 22 17:09:59 compute-0 nova_compute[183075]: 2026-01-22 17:09:59.900 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:09:59 compute-0 nova_compute[183075]: 2026-01-22 17:09:59.913 183079 DEBUG oslo_concurrency.processutils [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:09:59 compute-0 nova_compute[183075]: 2026-01-22 17:09:59.914 183079 DEBUG oslo_concurrency.lockutils [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:59 compute-0 nova_compute[183075]: 2026-01-22 17:09:59.914 183079 DEBUG oslo_concurrency.lockutils [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:59 compute-0 nova_compute[183075]: 2026-01-22 17:09:59.925 183079 DEBUG oslo_concurrency.processutils [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:09:59 compute-0 nova_compute[183075]: 2026-01-22 17:09:59.984 183079 DEBUG oslo_concurrency.processutils [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:09:59 compute-0 nova_compute[183075]: 2026-01-22 17:09:59.985 183079 DEBUG oslo_concurrency.processutils [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/618a8c78-4b30-4b4d-9617-4c54bfdd414e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:10:00 compute-0 nova_compute[183075]: 2026-01-22 17:10:00.018 183079 DEBUG oslo_concurrency.processutils [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/618a8c78-4b30-4b4d-9617-4c54bfdd414e/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:10:00 compute-0 nova_compute[183075]: 2026-01-22 17:10:00.019 183079 DEBUG oslo_concurrency.lockutils [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:00 compute-0 nova_compute[183075]: 2026-01-22 17:10:00.019 183079 DEBUG oslo_concurrency.processutils [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:10:00 compute-0 nova_compute[183075]: 2026-01-22 17:10:00.036 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:00 compute-0 nova_compute[183075]: 2026-01-22 17:10:00.076 183079 DEBUG oslo_concurrency.processutils [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:10:00 compute-0 nova_compute[183075]: 2026-01-22 17:10:00.077 183079 DEBUG nova.virt.disk.api [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Checking if we can resize image /var/lib/nova/instances/618a8c78-4b30-4b4d-9617-4c54bfdd414e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:10:00 compute-0 nova_compute[183075]: 2026-01-22 17:10:00.077 183079 DEBUG oslo_concurrency.processutils [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/618a8c78-4b30-4b4d-9617-4c54bfdd414e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:10:00 compute-0 nova_compute[183075]: 2026-01-22 17:10:00.132 183079 DEBUG oslo_concurrency.processutils [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/618a8c78-4b30-4b4d-9617-4c54bfdd414e/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:10:00 compute-0 nova_compute[183075]: 2026-01-22 17:10:00.133 183079 DEBUG nova.virt.disk.api [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Cannot resize image /var/lib/nova/instances/618a8c78-4b30-4b4d-9617-4c54bfdd414e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:10:00 compute-0 nova_compute[183075]: 2026-01-22 17:10:00.133 183079 DEBUG nova.objects.instance [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lazy-loading 'migration_context' on Instance uuid 618a8c78-4b30-4b4d-9617-4c54bfdd414e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:10:00 compute-0 nova_compute[183075]: 2026-01-22 17:10:00.147 183079 DEBUG nova.virt.libvirt.driver [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:10:00 compute-0 nova_compute[183075]: 2026-01-22 17:10:00.148 183079 DEBUG nova.virt.libvirt.driver [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Ensure instance console log exists: /var/lib/nova/instances/618a8c78-4b30-4b4d-9617-4c54bfdd414e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:10:00 compute-0 nova_compute[183075]: 2026-01-22 17:10:00.148 183079 DEBUG oslo_concurrency.lockutils [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:00 compute-0 nova_compute[183075]: 2026-01-22 17:10:00.148 183079 DEBUG oslo_concurrency.lockutils [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:00 compute-0 nova_compute[183075]: 2026-01-22 17:10:00.148 183079 DEBUG oslo_concurrency.lockutils [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:00 compute-0 nova_compute[183075]: 2026-01-22 17:10:00.220 183079 DEBUG nova.policy [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:10:00 compute-0 podman[218139]: 2026-01-22 17:10:00.368793693 +0000 UTC m=+0.071703036 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:10:00 compute-0 ovn_controller[95372]: 2026-01-22T17:10:00Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d4:c8:e5 10.100.0.9
Jan 22 17:10:00 compute-0 ovn_controller[95372]: 2026-01-22T17:10:00Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d4:c8:e5 10.100.0.9
Jan 22 17:10:00 compute-0 nova_compute[183075]: 2026-01-22 17:10:00.941 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769101785.9397206, 7d42beab-5bb4-43a0-9756-ced73188f5ba => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:10:00 compute-0 nova_compute[183075]: 2026-01-22 17:10:00.943 183079 INFO nova.compute.manager [-] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] VM Stopped (Lifecycle Event)
Jan 22 17:10:00 compute-0 nova_compute[183075]: 2026-01-22 17:10:00.966 183079 DEBUG nova.compute.manager [None req-20dacf9e-5c73-4e7d-9446-c91e842a280f - - - - - -] [instance: 7d42beab-5bb4-43a0-9756-ced73188f5ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:10:01 compute-0 ovn_controller[95372]: 2026-01-22T17:10:01Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f6:53:65 10.100.0.14
Jan 22 17:10:01 compute-0 ovn_controller[95372]: 2026-01-22T17:10:01Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f6:53:65 10.100.0.14
Jan 22 17:10:02 compute-0 nova_compute[183075]: 2026-01-22 17:10:02.300 183079 DEBUG nova.network.neutron [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Successfully updated port: 1ea3e9d7-15d1-4941-93be-4710d9a29763 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:10:02 compute-0 nova_compute[183075]: 2026-01-22 17:10:02.321 183079 DEBUG oslo_concurrency.lockutils [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "refresh_cache-618a8c78-4b30-4b4d-9617-4c54bfdd414e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:10:02 compute-0 nova_compute[183075]: 2026-01-22 17:10:02.322 183079 DEBUG oslo_concurrency.lockutils [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquired lock "refresh_cache-618a8c78-4b30-4b4d-9617-4c54bfdd414e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:10:02 compute-0 nova_compute[183075]: 2026-01-22 17:10:02.322 183079 DEBUG nova.network.neutron [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:10:02 compute-0 nova_compute[183075]: 2026-01-22 17:10:02.386 183079 DEBUG nova.compute.manager [req-eb665bc4-e9d5-40f7-97ac-f2e4468facd7 req-a599af76-57bf-4844-ad0e-e6eeadfe5448 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Received event network-changed-1ea3e9d7-15d1-4941-93be-4710d9a29763 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:10:02 compute-0 nova_compute[183075]: 2026-01-22 17:10:02.387 183079 DEBUG nova.compute.manager [req-eb665bc4-e9d5-40f7-97ac-f2e4468facd7 req-a599af76-57bf-4844-ad0e-e6eeadfe5448 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Refreshing instance network info cache due to event network-changed-1ea3e9d7-15d1-4941-93be-4710d9a29763. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:10:02 compute-0 nova_compute[183075]: 2026-01-22 17:10:02.387 183079 DEBUG oslo_concurrency.lockutils [req-eb665bc4-e9d5-40f7-97ac-f2e4468facd7 req-a599af76-57bf-4844-ad0e-e6eeadfe5448 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-618a8c78-4b30-4b4d-9617-4c54bfdd414e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:10:02 compute-0 nova_compute[183075]: 2026-01-22 17:10:02.526 183079 DEBUG nova.network.neutron [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:10:02 compute-0 ovn_controller[95372]: 2026-01-22T17:10:02Z|00152|binding|INFO|Releasing lport 02f52a63-476f-468b-a774-c9514d6b2206 from this chassis (sb_readonly=0)
Jan 22 17:10:02 compute-0 ovn_controller[95372]: 2026-01-22T17:10:02Z|00153|binding|INFO|Releasing lport 255f865e-6322-48b0-a0d1-c16ced648c78 from this chassis (sb_readonly=0)
Jan 22 17:10:02 compute-0 nova_compute[183075]: 2026-01-22 17:10:02.756 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:03 compute-0 nova_compute[183075]: 2026-01-22 17:10:03.830 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.050 183079 DEBUG nova.network.neutron [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Updating instance_info_cache with network_info: [{"id": "1ea3e9d7-15d1-4941-93be-4710d9a29763", "address": "fa:16:3e:ca:cc:e7", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ea3e9d7-15", "ovs_interfaceid": "1ea3e9d7-15d1-4941-93be-4710d9a29763", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.073 183079 DEBUG oslo_concurrency.lockutils [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Releasing lock "refresh_cache-618a8c78-4b30-4b4d-9617-4c54bfdd414e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.074 183079 DEBUG nova.compute.manager [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Instance network_info: |[{"id": "1ea3e9d7-15d1-4941-93be-4710d9a29763", "address": "fa:16:3e:ca:cc:e7", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ea3e9d7-15", "ovs_interfaceid": "1ea3e9d7-15d1-4941-93be-4710d9a29763", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.075 183079 DEBUG oslo_concurrency.lockutils [req-eb665bc4-e9d5-40f7-97ac-f2e4468facd7 req-a599af76-57bf-4844-ad0e-e6eeadfe5448 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-618a8c78-4b30-4b4d-9617-4c54bfdd414e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.075 183079 DEBUG nova.network.neutron [req-eb665bc4-e9d5-40f7-97ac-f2e4468facd7 req-a599af76-57bf-4844-ad0e-e6eeadfe5448 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Refreshing network info cache for port 1ea3e9d7-15d1-4941-93be-4710d9a29763 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.078 183079 DEBUG nova.virt.libvirt.driver [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Start _get_guest_xml network_info=[{"id": "1ea3e9d7-15d1-4941-93be-4710d9a29763", "address": "fa:16:3e:ca:cc:e7", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ea3e9d7-15", "ovs_interfaceid": "1ea3e9d7-15d1-4941-93be-4710d9a29763", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.082 183079 WARNING nova.virt.libvirt.driver [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.088 183079 DEBUG nova.virt.libvirt.host [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.089 183079 DEBUG nova.virt.libvirt.host [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.096 183079 DEBUG nova.virt.libvirt.host [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.097 183079 DEBUG nova.virt.libvirt.host [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.098 183079 DEBUG nova.virt.libvirt.driver [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.098 183079 DEBUG nova.virt.hardware [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.099 183079 DEBUG nova.virt.hardware [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.099 183079 DEBUG nova.virt.hardware [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.099 183079 DEBUG nova.virt.hardware [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.100 183079 DEBUG nova.virt.hardware [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.100 183079 DEBUG nova.virt.hardware [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.100 183079 DEBUG nova.virt.hardware [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.101 183079 DEBUG nova.virt.hardware [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.101 183079 DEBUG nova.virt.hardware [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.101 183079 DEBUG nova.virt.hardware [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.101 183079 DEBUG nova.virt.hardware [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.106 183079 DEBUG nova.virt.libvirt.vif [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:09:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-254681299',display_name='tempest-server-test-254681299',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-254681299',id=13,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFroJSJbEvCYe2sCwXkwL5KPVfwqqRPNqrWU5nnhv8U4G8wYWbEqk5AtRWXltmX7CAIPBST+sTwCKfymhALKd54XNN92D6KKnKgA6jmOihENah4FGfU80fx1UEE0osLpiw==',key_name='tempest-keypair-test-618317104',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e05c7aae349e4a1d859a387df45650a0',ramdisk_id='',reservation_id='r-nqvb2q9c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSeparateNetwork-931877966',owner_user_name='tempest-FloatingIpSeparateNetwork-931877966-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:09:59Z,user_data=None,user_id='cd47d63cff2548a88e21e5c2e6a5c161',uuid=618a8c78-4b30-4b4d-9617-4c54bfdd414e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1ea3e9d7-15d1-4941-93be-4710d9a29763", "address": "fa:16:3e:ca:cc:e7", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ea3e9d7-15", "ovs_interfaceid": "1ea3e9d7-15d1-4941-93be-4710d9a29763", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.107 183079 DEBUG nova.network.os_vif_util [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converting VIF {"id": "1ea3e9d7-15d1-4941-93be-4710d9a29763", "address": "fa:16:3e:ca:cc:e7", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ea3e9d7-15", "ovs_interfaceid": "1ea3e9d7-15d1-4941-93be-4710d9a29763", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.108 183079 DEBUG nova.network.os_vif_util [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ca:cc:e7,bridge_name='br-int',has_traffic_filtering=True,id=1ea3e9d7-15d1-4941-93be-4710d9a29763,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1ea3e9d7-15') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.109 183079 DEBUG nova.objects.instance [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 618a8c78-4b30-4b4d-9617-4c54bfdd414e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.126 183079 DEBUG nova.virt.libvirt.driver [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:10:04 compute-0 nova_compute[183075]:   <uuid>618a8c78-4b30-4b4d-9617-4c54bfdd414e</uuid>
Jan 22 17:10:04 compute-0 nova_compute[183075]:   <name>instance-0000000d</name>
Jan 22 17:10:04 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:10:04 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:10:04 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:10:04 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-254681299</nova:name>
Jan 22 17:10:04 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:10:04</nova:creationTime>
Jan 22 17:10:04 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:10:04 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:10:04 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:10:04 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:10:04 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:10:04 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:10:04 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:10:04 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:10:04 compute-0 nova_compute[183075]:         <nova:user uuid="cd47d63cff2548a88e21e5c2e6a5c161">tempest-FloatingIpSeparateNetwork-931877966-project-member</nova:user>
Jan 22 17:10:04 compute-0 nova_compute[183075]:         <nova:project uuid="e05c7aae349e4a1d859a387df45650a0">tempest-FloatingIpSeparateNetwork-931877966</nova:project>
Jan 22 17:10:04 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:10:04 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:10:04 compute-0 nova_compute[183075]:         <nova:port uuid="1ea3e9d7-15d1-4941-93be-4710d9a29763">
Jan 22 17:10:04 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:10:04 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:10:04 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:10:04 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <system>
Jan 22 17:10:04 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:10:04 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:10:04 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:10:04 compute-0 nova_compute[183075]:       <entry name="serial">618a8c78-4b30-4b4d-9617-4c54bfdd414e</entry>
Jan 22 17:10:04 compute-0 nova_compute[183075]:       <entry name="uuid">618a8c78-4b30-4b4d-9617-4c54bfdd414e</entry>
Jan 22 17:10:04 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     </system>
Jan 22 17:10:04 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:10:04 compute-0 nova_compute[183075]:   <os>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:   </os>
Jan 22 17:10:04 compute-0 nova_compute[183075]:   <features>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:   </features>
Jan 22 17:10:04 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:10:04 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:10:04 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:10:04 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/618a8c78-4b30-4b4d-9617-4c54bfdd414e/disk"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:10:04 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:ca:cc:e7"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:       <target dev="tap1ea3e9d7-15"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:10:04 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/618a8c78-4b30-4b4d-9617-4c54bfdd414e/console.log" append="off"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <video>
Jan 22 17:10:04 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     </video>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:10:04 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:10:04 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:10:04 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:10:04 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:10:04 compute-0 nova_compute[183075]: </domain>
Jan 22 17:10:04 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.127 183079 DEBUG nova.compute.manager [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Preparing to wait for external event network-vif-plugged-1ea3e9d7-15d1-4941-93be-4710d9a29763 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.128 183079 DEBUG oslo_concurrency.lockutils [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "618a8c78-4b30-4b4d-9617-4c54bfdd414e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.128 183079 DEBUG oslo_concurrency.lockutils [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "618a8c78-4b30-4b4d-9617-4c54bfdd414e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.128 183079 DEBUG oslo_concurrency.lockutils [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "618a8c78-4b30-4b4d-9617-4c54bfdd414e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.129 183079 DEBUG nova.virt.libvirt.vif [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:09:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-254681299',display_name='tempest-server-test-254681299',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-254681299',id=13,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFroJSJbEvCYe2sCwXkwL5KPVfwqqRPNqrWU5nnhv8U4G8wYWbEqk5AtRWXltmX7CAIPBST+sTwCKfymhALKd54XNN92D6KKnKgA6jmOihENah4FGfU80fx1UEE0osLpiw==',key_name='tempest-keypair-test-618317104',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e05c7aae349e4a1d859a387df45650a0',ramdisk_id='',reservation_id='r-nqvb2q9c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSeparateNetwork-931877966',owner_user_name='tempest-FloatingIpSeparateNetwork-931877966-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:09:59Z,user_data=None,user_id='cd47d63cff2548a88e21e5c2e6a5c161',uuid=618a8c78-4b30-4b4d-9617-4c54bfdd414e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1ea3e9d7-15d1-4941-93be-4710d9a29763", "address": "fa:16:3e:ca:cc:e7", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ea3e9d7-15", "ovs_interfaceid": "1ea3e9d7-15d1-4941-93be-4710d9a29763", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.129 183079 DEBUG nova.network.os_vif_util [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converting VIF {"id": "1ea3e9d7-15d1-4941-93be-4710d9a29763", "address": "fa:16:3e:ca:cc:e7", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ea3e9d7-15", "ovs_interfaceid": "1ea3e9d7-15d1-4941-93be-4710d9a29763", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.130 183079 DEBUG nova.network.os_vif_util [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ca:cc:e7,bridge_name='br-int',has_traffic_filtering=True,id=1ea3e9d7-15d1-4941-93be-4710d9a29763,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1ea3e9d7-15') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.131 183079 DEBUG os_vif [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ca:cc:e7,bridge_name='br-int',has_traffic_filtering=True,id=1ea3e9d7-15d1-4941-93be-4710d9a29763,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1ea3e9d7-15') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.131 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.132 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.132 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.135 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.136 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1ea3e9d7-15, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.136 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1ea3e9d7-15, col_values=(('external_ids', {'iface-id': '1ea3e9d7-15d1-4941-93be-4710d9a29763', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ca:cc:e7', 'vm-uuid': '618a8c78-4b30-4b4d-9617-4c54bfdd414e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.138 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:04 compute-0 NetworkManager[55454]: <info>  [1769101804.1397] manager: (tap1ea3e9d7-15): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.142 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.146 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.148 183079 INFO os_vif [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ca:cc:e7,bridge_name='br-int',has_traffic_filtering=True,id=1ea3e9d7-15d1-4941-93be-4710d9a29763,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1ea3e9d7-15')
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.223 183079 DEBUG nova.virt.libvirt.driver [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.225 183079 DEBUG nova.virt.libvirt.driver [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] No VIF found with MAC fa:16:3e:ca:cc:e7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.273 183079 INFO nova.compute.manager [None req-ff48f38e-5194-412d-a115-b350ee49f0fe 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Get console output
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.279 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:10:04 compute-0 kernel: tap1ea3e9d7-15: entered promiscuous mode
Jan 22 17:10:04 compute-0 NetworkManager[55454]: <info>  [1769101804.2897] manager: (tap1ea3e9d7-15): new Tun device (/org/freedesktop/NetworkManager/Devices/75)
Jan 22 17:10:04 compute-0 ovn_controller[95372]: 2026-01-22T17:10:04Z|00154|binding|INFO|Claiming lport 1ea3e9d7-15d1-4941-93be-4710d9a29763 for this chassis.
Jan 22 17:10:04 compute-0 ovn_controller[95372]: 2026-01-22T17:10:04Z|00155|binding|INFO|1ea3e9d7-15d1-4941-93be-4710d9a29763: Claiming fa:16:3e:ca:cc:e7 10.100.0.8
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.293 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:04.299 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ca:cc:e7 10.100.0.8'], port_security=['fa:16:3e:ca:cc:e7 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '618a8c78-4b30-4b4d-9617-4c54bfdd414e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-576f6598-999f-46d9-809a-65b7475a1ec7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e05c7aae349e4a1d859a387df45650a0', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b9cc1ac9-85f4-4da7-891e-b0644711e574', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.185'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=92087573-6410-4f80-a195-ce8b2058420e, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=1ea3e9d7-15d1-4941-93be-4710d9a29763) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:04.300 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 1ea3e9d7-15d1-4941-93be-4710d9a29763 in datapath 576f6598-999f-46d9-809a-65b7475a1ec7 bound to our chassis
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:04.301 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 576f6598-999f-46d9-809a-65b7475a1ec7
Jan 22 17:10:04 compute-0 ovn_controller[95372]: 2026-01-22T17:10:04Z|00156|binding|INFO|Setting lport 1ea3e9d7-15d1-4941-93be-4710d9a29763 ovn-installed in OVS
Jan 22 17:10:04 compute-0 ovn_controller[95372]: 2026-01-22T17:10:04Z|00157|binding|INFO|Setting lport 1ea3e9d7-15d1-4941-93be-4710d9a29763 up in Southbound
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.312 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.315 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:04.316 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f25f4574-7817-49b4-82d1-d0949256e54d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:04.317 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap576f6598-91 in ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:04.318 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap576f6598-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:04.319 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6a8eabaa-b42b-4991-b80d-55260f142b20]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:04.321 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[5ec882a7-d72a-4c41-86aa-531f528451f4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:04 compute-0 systemd-udevd[218177]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:04.333 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[09dcacdb-7f34-40ff-9d70-1f7c5f83768a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:04 compute-0 NetworkManager[55454]: <info>  [1769101804.3428] device (tap1ea3e9d7-15): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:10:04 compute-0 NetworkManager[55454]: <info>  [1769101804.3438] device (tap1ea3e9d7-15): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:10:04 compute-0 systemd-machined[154382]: New machine qemu-13-instance-0000000d.
Jan 22 17:10:04 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-0000000d.
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:04.356 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2a3d09ac-2335-4137-baef-769175a91cf5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:04.387 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[c95a4e39-0eb1-4d34-a5d1-caf3fa44e59b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:04 compute-0 systemd-udevd[218181]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:04.395 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[5f894dc2-a9a3-450c-b6f0-054ac2a1debf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:04 compute-0 NetworkManager[55454]: <info>  [1769101804.3961] manager: (tap576f6598-90): new Veth device (/org/freedesktop/NetworkManager/Devices/76)
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:04.429 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[14c9a897-744e-4359-9f78-3e98260f142b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:04.433 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[117e6946-a158-41c9-ab7a-0e85faaf0cf5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:04 compute-0 NetworkManager[55454]: <info>  [1769101804.4598] device (tap576f6598-90): carrier: link connected
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:04.467 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[9d888cd1-96fe-4615-a569-54c5b11d8e7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:04.489 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0e424d04-f481-4f23-961a-846c38f857b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap576f6598-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a1:fa:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 416816, 'reachable_time': 22631, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218209, 'error': None, 'target': 'ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:04.513 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[05170826-5fb6-4bed-b3f2-166639936488]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea1:facd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 416816, 'tstamp': 416816}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218210, 'error': None, 'target': 'ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:04.537 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[5a544a8c-4a79-4540-9dec-1a87abb90670]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap576f6598-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a1:fa:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 416816, 'reachable_time': 22631, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218212, 'error': None, 'target': 'ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:04.577 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[5aef57de-deb4-45d9-b0a5-48c7f5e5cc56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.661 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101804.660793, 618a8c78-4b30-4b4d-9617-4c54bfdd414e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.661 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] VM Started (Lifecycle Event)
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:04.661 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[bf2fbd71-9021-497f-81f7-97aabee115dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:04.664 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap576f6598-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:04.664 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:04.665 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap576f6598-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:10:04 compute-0 NetworkManager[55454]: <info>  [1769101804.6683] manager: (tap576f6598-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.667 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:04 compute-0 kernel: tap576f6598-90: entered promiscuous mode
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.670 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:04.671 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap576f6598-90, col_values=(('external_ids', {'iface-id': '1759254b-798a-4e65-baf5-489557c1f604'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.673 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:04 compute-0 ovn_controller[95372]: 2026-01-22T17:10:04Z|00158|binding|INFO|Releasing lport 1759254b-798a-4e65-baf5-489557c1f604 from this chassis (sb_readonly=0)
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.675 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:04.675 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/576f6598-999f-46d9-809a-65b7475a1ec7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/576f6598-999f-46d9-809a-65b7475a1ec7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:04.676 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[059bb279-499f-459f-81c5-70c2d76d4664]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:04.677 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/576f6598-999f-46d9-809a-65b7475a1ec7.pid.haproxy
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 576f6598-999f-46d9-809a-65b7475a1ec7
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:10:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:04.678 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7', 'env', 'PROCESS_TAG=haproxy-576f6598-999f-46d9-809a-65b7475a1ec7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/576f6598-999f-46d9-809a-65b7475a1ec7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.686 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.690 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.695 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101804.660901, 618a8c78-4b30-4b4d-9617-4c54bfdd414e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.695 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] VM Paused (Lifecycle Event)
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.712 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.716 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.737 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.803 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769101789.8025265, ee33030a-2035-4fd1-8de4-261142b89bc6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.804 183079 INFO nova.compute.manager [-] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] VM Stopped (Lifecycle Event)
Jan 22 17:10:04 compute-0 nova_compute[183075]: 2026-01-22 17:10:04.839 183079 DEBUG nova.compute.manager [None req-716f55ae-d7f7-49d6-86d1-0f607e7d210b - - - - - -] [instance: ee33030a-2035-4fd1-8de4-261142b89bc6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:10:05 compute-0 nova_compute[183075]: 2026-01-22 17:10:05.040 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:05 compute-0 nova_compute[183075]: 2026-01-22 17:10:05.073 183079 INFO nova.compute.manager [None req-76e3cfc8-6960-4257-9fe0-21500b29b95f 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Get console output
Jan 22 17:10:05 compute-0 nova_compute[183075]: 2026-01-22 17:10:05.080 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:10:05 compute-0 podman[218249]: 2026-01-22 17:10:05.110478336 +0000 UTC m=+0.061370586 container create 3a21c7084bfab33bfc6021ace08f3f1dbd4dd8dc47e25071b33adb61bf86283f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 22 17:10:05 compute-0 systemd[1]: Started libpod-conmon-3a21c7084bfab33bfc6021ace08f3f1dbd4dd8dc47e25071b33adb61bf86283f.scope.
Jan 22 17:10:05 compute-0 nova_compute[183075]: 2026-01-22 17:10:05.160 183079 DEBUG nova.compute.manager [req-432ba639-b35e-4875-b440-d9fb4955a996 req-0bb0b449-554f-4eea-901e-aadb0630fb7c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Received event network-vif-plugged-1ea3e9d7-15d1-4941-93be-4710d9a29763 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:10:05 compute-0 nova_compute[183075]: 2026-01-22 17:10:05.161 183079 DEBUG oslo_concurrency.lockutils [req-432ba639-b35e-4875-b440-d9fb4955a996 req-0bb0b449-554f-4eea-901e-aadb0630fb7c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "618a8c78-4b30-4b4d-9617-4c54bfdd414e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:05 compute-0 nova_compute[183075]: 2026-01-22 17:10:05.161 183079 DEBUG oslo_concurrency.lockutils [req-432ba639-b35e-4875-b440-d9fb4955a996 req-0bb0b449-554f-4eea-901e-aadb0630fb7c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "618a8c78-4b30-4b4d-9617-4c54bfdd414e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:05 compute-0 nova_compute[183075]: 2026-01-22 17:10:05.162 183079 DEBUG oslo_concurrency.lockutils [req-432ba639-b35e-4875-b440-d9fb4955a996 req-0bb0b449-554f-4eea-901e-aadb0630fb7c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "618a8c78-4b30-4b4d-9617-4c54bfdd414e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:05 compute-0 nova_compute[183075]: 2026-01-22 17:10:05.162 183079 DEBUG nova.compute.manager [req-432ba639-b35e-4875-b440-d9fb4955a996 req-0bb0b449-554f-4eea-901e-aadb0630fb7c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Processing event network-vif-plugged-1ea3e9d7-15d1-4941-93be-4710d9a29763 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:10:05 compute-0 nova_compute[183075]: 2026-01-22 17:10:05.162 183079 DEBUG nova.compute.manager [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:10:05 compute-0 nova_compute[183075]: 2026-01-22 17:10:05.168 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101805.1681395, 618a8c78-4b30-4b4d-9617-4c54bfdd414e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:10:05 compute-0 nova_compute[183075]: 2026-01-22 17:10:05.168 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] VM Resumed (Lifecycle Event)
Jan 22 17:10:05 compute-0 nova_compute[183075]: 2026-01-22 17:10:05.170 183079 DEBUG nova.virt.libvirt.driver [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:10:05 compute-0 nova_compute[183075]: 2026-01-22 17:10:05.174 183079 INFO nova.virt.libvirt.driver [-] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Instance spawned successfully.
Jan 22 17:10:05 compute-0 nova_compute[183075]: 2026-01-22 17:10:05.174 183079 DEBUG nova.virt.libvirt.driver [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:10:05 compute-0 podman[218249]: 2026-01-22 17:10:05.08004304 +0000 UTC m=+0.030935320 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:10:05 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:10:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a5198b9617c24de85708d09a3bb8ad0b0d2b51b2a0f24d17911cec713817343d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:10:05 compute-0 nova_compute[183075]: 2026-01-22 17:10:05.189 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:10:05 compute-0 nova_compute[183075]: 2026-01-22 17:10:05.198 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:10:05 compute-0 podman[218249]: 2026-01-22 17:10:05.205599623 +0000 UTC m=+0.156491883 container init 3a21c7084bfab33bfc6021ace08f3f1dbd4dd8dc47e25071b33adb61bf86283f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 17:10:05 compute-0 nova_compute[183075]: 2026-01-22 17:10:05.206 183079 DEBUG nova.virt.libvirt.driver [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:10:05 compute-0 nova_compute[183075]: 2026-01-22 17:10:05.208 183079 DEBUG nova.virt.libvirt.driver [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:10:05 compute-0 nova_compute[183075]: 2026-01-22 17:10:05.208 183079 DEBUG nova.virt.libvirt.driver [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:10:05 compute-0 nova_compute[183075]: 2026-01-22 17:10:05.209 183079 DEBUG nova.virt.libvirt.driver [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:10:05 compute-0 nova_compute[183075]: 2026-01-22 17:10:05.210 183079 DEBUG nova.virt.libvirt.driver [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:10:05 compute-0 nova_compute[183075]: 2026-01-22 17:10:05.211 183079 DEBUG nova.virt.libvirt.driver [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:10:05 compute-0 podman[218249]: 2026-01-22 17:10:05.217055803 +0000 UTC m=+0.167948043 container start 3a21c7084bfab33bfc6021ace08f3f1dbd4dd8dc47e25071b33adb61bf86283f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:10:05 compute-0 nova_compute[183075]: 2026-01-22 17:10:05.218 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:10:05 compute-0 neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7[218264]: [NOTICE]   (218269) : New worker (218271) forked
Jan 22 17:10:05 compute-0 neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7[218264]: [NOTICE]   (218269) : Loading success.
Jan 22 17:10:05 compute-0 nova_compute[183075]: 2026-01-22 17:10:05.282 183079 INFO nova.compute.manager [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Took 5.46 seconds to spawn the instance on the hypervisor.
Jan 22 17:10:05 compute-0 nova_compute[183075]: 2026-01-22 17:10:05.283 183079 DEBUG nova.compute.manager [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:10:05 compute-0 nova_compute[183075]: 2026-01-22 17:10:05.349 183079 INFO nova.compute.manager [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Took 5.93 seconds to build instance.
Jan 22 17:10:05 compute-0 nova_compute[183075]: 2026-01-22 17:10:05.369 183079 DEBUG oslo_concurrency.lockutils [None req-cf34456f-fef8-421c-8fd1-fd41b4e35d84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "618a8c78-4b30-4b4d-9617-4c54bfdd414e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.027s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:06 compute-0 nova_compute[183075]: 2026-01-22 17:10:06.147 183079 DEBUG nova.network.neutron [req-eb665bc4-e9d5-40f7-97ac-f2e4468facd7 req-a599af76-57bf-4844-ad0e-e6eeadfe5448 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Updated VIF entry in instance network info cache for port 1ea3e9d7-15d1-4941-93be-4710d9a29763. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:10:06 compute-0 nova_compute[183075]: 2026-01-22 17:10:06.147 183079 DEBUG nova.network.neutron [req-eb665bc4-e9d5-40f7-97ac-f2e4468facd7 req-a599af76-57bf-4844-ad0e-e6eeadfe5448 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Updating instance_info_cache with network_info: [{"id": "1ea3e9d7-15d1-4941-93be-4710d9a29763", "address": "fa:16:3e:ca:cc:e7", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ea3e9d7-15", "ovs_interfaceid": "1ea3e9d7-15d1-4941-93be-4710d9a29763", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:10:06 compute-0 nova_compute[183075]: 2026-01-22 17:10:06.164 183079 DEBUG oslo_concurrency.lockutils [req-eb665bc4-e9d5-40f7-97ac-f2e4468facd7 req-a599af76-57bf-4844-ad0e-e6eeadfe5448 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-618a8c78-4b30-4b4d-9617-4c54bfdd414e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:10:06 compute-0 nova_compute[183075]: 2026-01-22 17:10:06.195 183079 INFO nova.compute.manager [None req-b31efb01-71ce-436c-9b4a-d1e4d531bfc1 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Get console output
Jan 22 17:10:06 compute-0 nova_compute[183075]: 2026-01-22 17:10:06.201 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:10:06 compute-0 nova_compute[183075]: 2026-01-22 17:10:06.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:10:07 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:07.135 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:07 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:07.136 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:10:07 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:07 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:07 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:07 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:07 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:07 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:10:07 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:07 compute-0 nova_compute[183075]: 2026-01-22 17:10:07.298 183079 DEBUG nova.compute.manager [req-8b80fcd4-e679-4d86-bd78-5606255ccaf1 req-020ec4bb-a627-47eb-aab6-3926305cdb2a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Received event network-vif-plugged-1ea3e9d7-15d1-4941-93be-4710d9a29763 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:10:07 compute-0 nova_compute[183075]: 2026-01-22 17:10:07.299 183079 DEBUG oslo_concurrency.lockutils [req-8b80fcd4-e679-4d86-bd78-5606255ccaf1 req-020ec4bb-a627-47eb-aab6-3926305cdb2a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "618a8c78-4b30-4b4d-9617-4c54bfdd414e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:07 compute-0 nova_compute[183075]: 2026-01-22 17:10:07.300 183079 DEBUG oslo_concurrency.lockutils [req-8b80fcd4-e679-4d86-bd78-5606255ccaf1 req-020ec4bb-a627-47eb-aab6-3926305cdb2a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "618a8c78-4b30-4b4d-9617-4c54bfdd414e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:07 compute-0 nova_compute[183075]: 2026-01-22 17:10:07.300 183079 DEBUG oslo_concurrency.lockutils [req-8b80fcd4-e679-4d86-bd78-5606255ccaf1 req-020ec4bb-a627-47eb-aab6-3926305cdb2a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "618a8c78-4b30-4b4d-9617-4c54bfdd414e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:07 compute-0 nova_compute[183075]: 2026-01-22 17:10:07.301 183079 DEBUG nova.compute.manager [req-8b80fcd4-e679-4d86-bd78-5606255ccaf1 req-020ec4bb-a627-47eb-aab6-3926305cdb2a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] No waiting events found dispatching network-vif-plugged-1ea3e9d7-15d1-4941-93be-4710d9a29763 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:10:07 compute-0 nova_compute[183075]: 2026-01-22 17:10:07.301 183079 WARNING nova.compute.manager [req-8b80fcd4-e679-4d86-bd78-5606255ccaf1 req-020ec4bb-a627-47eb-aab6-3926305cdb2a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Received unexpected event network-vif-plugged-1ea3e9d7-15d1-4941-93be-4710d9a29763 for instance with vm_state active and task_state None.
Jan 22 17:10:07 compute-0 podman[218280]: 2026-01-22 17:10:07.362234407 +0000 UTC m=+0.070166666 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:10:07 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:07.565 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:07 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:07.566 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:10:07 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:07 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:07 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:07 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:07 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:07 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:10:07 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ce346f8d-be8d-455f-b61c-12fea213a3f4 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:07 compute-0 nova_compute[183075]: 2026-01-22 17:10:07.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:10:07 compute-0 nova_compute[183075]: 2026-01-22 17:10:07.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.142 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.143 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 1.0063837
Jan 22 17:10:08 compute-0 haproxy-metadata-proxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[217848]: 10.100.0.14:58390 [22/Jan/2026:17:10:07.134] listener listener/metadata 0/0/0/1008/1008 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.151 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.152 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.5856864
Jan 22 17:10:08 compute-0 haproxy-metadata-proxy-ce346f8d-be8d-455f-b61c-12fea213a3f4[217774]: 10.100.0.9:36222 [22/Jan/2026:17:10:07.564] listener listener/metadata 0/0/0/587/587 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.159 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.160 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.167 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.168 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ce346f8d-be8d-455f-b61c-12fea213a3f4 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.185 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.186 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0256276
Jan 22 17:10:08 compute-0 haproxy-metadata-proxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[217848]: 10.100.0.14:58400 [22/Jan/2026:17:10:08.158] listener listener/metadata 0/0/0/27/27 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.188 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.189 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 168 time: 0.0212557
Jan 22 17:10:08 compute-0 haproxy-metadata-proxy-ce346f8d-be8d-455f-b61c-12fea213a3f4[217774]: 10.100.0.9:36230 [22/Jan/2026:17:10:08.163] listener listener/metadata 0/0/0/25/25 200 152 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.193 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.195 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.202 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.203 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ce346f8d-be8d-455f-b61c-12fea213a3f4 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.217 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:08 compute-0 haproxy-metadata-proxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[217848]: 10.100.0.14:58414 [22/Jan/2026:17:10:08.192] listener listener/metadata 0/0/0/25/25 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.218 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0234175
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.225 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.226 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.230 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.231 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0278974
Jan 22 17:10:08 compute-0 haproxy-metadata-proxy-ce346f8d-be8d-455f-b61c-12fea213a3f4[217774]: 10.100.0.9:36238 [22/Jan/2026:17:10:08.195] listener listener/metadata 0/0/0/35/35 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.240 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.241 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ce346f8d-be8d-455f-b61c-12fea213a3f4 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.244 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.244 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0181632
Jan 22 17:10:08 compute-0 haproxy-metadata-proxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[217848]: 10.100.0.14:58416 [22/Jan/2026:17:10:08.225] listener listener/metadata 0/0/0/19/19 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.252 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.253 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.260 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.260 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0195649
Jan 22 17:10:08 compute-0 haproxy-metadata-proxy-ce346f8d-be8d-455f-b61c-12fea213a3f4[217774]: 10.100.0.9:36244 [22/Jan/2026:17:10:08.240] listener listener/metadata 0/0/0/20/20 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.272 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.274 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ce346f8d-be8d-455f-b61c-12fea213a3f4 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.280 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.280 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0272145
Jan 22 17:10:08 compute-0 haproxy-metadata-proxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[217848]: 10.100.0.14:58430 [22/Jan/2026:17:10:08.251] listener listener/metadata 0/0/0/29/29 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.285 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.286 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.302 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.302 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0160143
Jan 22 17:10:08 compute-0 haproxy-metadata-proxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[217848]: 10.100.0.14:58438 [22/Jan/2026:17:10:08.285] listener listener/metadata 0/0/0/16/16 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.306 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.307 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0327816
Jan 22 17:10:08 compute-0 haproxy-metadata-proxy-ce346f8d-be8d-455f-b61c-12fea213a3f4[217774]: 10.100.0.9:36258 [22/Jan/2026:17:10:08.271] listener listener/metadata 0/0/0/35/35 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.313 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.314 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.318 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.318 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ce346f8d-be8d-455f-b61c-12fea213a3f4 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.334 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:08 compute-0 haproxy-metadata-proxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[217848]: 10.100.0.14:58440 [22/Jan/2026:17:10:08.313] listener listener/metadata 0/0/0/22/22 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.335 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0215693
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.341 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:08 compute-0 haproxy-metadata-proxy-ce346f8d-be8d-455f-b61c-12fea213a3f4[217774]: 10.100.0.9:36264 [22/Jan/2026:17:10:08.316] listener listener/metadata 0/0/0/25/25 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.342 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0240228
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.343 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.344 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.349 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.349 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ce346f8d-be8d-455f-b61c-12fea213a3f4 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.371 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.371 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 151 time: 0.0276620
Jan 22 17:10:08 compute-0 haproxy-metadata-proxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[217848]: 10.100.0.14:58452 [22/Jan/2026:17:10:08.340] listener listener/metadata 0/0/0/30/30 200 135 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.373 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:08 compute-0 haproxy-metadata-proxy-ce346f8d-be8d-455f-b61c-12fea213a3f4[217774]: 10.100.0.9:36270 [22/Jan/2026:17:10:08.347] listener listener/metadata 0/0/0/25/25 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.373 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0239375
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.380 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.380 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.381 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.388 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ce346f8d-be8d-455f-b61c-12fea213a3f4 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.411 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.412 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 165 time: 0.0310669
Jan 22 17:10:08 compute-0 haproxy-metadata-proxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[217848]: 10.100.0.14:58456 [22/Jan/2026:17:10:08.379] listener listener/metadata 0/0/0/33/33 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.413 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:08 compute-0 haproxy-metadata-proxy-ce346f8d-be8d-455f-b61c-12fea213a3f4[217774]: 10.100.0.9:36272 [22/Jan/2026:17:10:08.380] listener listener/metadata 0/0/0/34/34 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.414 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0259225
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.418 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.420 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ce346f8d-be8d-455f-b61c-12fea213a3f4 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.425 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.426 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.447 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:08 compute-0 haproxy-metadata-proxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[217848]: 10.100.0.14:58458 [22/Jan/2026:17:10:08.420] listener listener/metadata 0/0/0/27/27 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.448 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 165 time: 0.0220988
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.449 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.449 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 165 time: 0.0297618
Jan 22 17:10:08 compute-0 haproxy-metadata-proxy-ce346f8d-be8d-455f-b61c-12fea213a3f4[217774]: 10.100.0.9:36282 [22/Jan/2026:17:10:08.418] listener listener/metadata 0/0/0/31/31 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.454 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.455 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.459 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.460 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ce346f8d-be8d-455f-b61c-12fea213a3f4 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:08 compute-0 haproxy-metadata-proxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[217848]: 10.100.0.14:58466 [22/Jan/2026:17:10:08.453] listener listener/metadata 0/0/0/22/22 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.476 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0213063
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.485 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.486 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.489 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.489 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 165 time: 0.0296934
Jan 22 17:10:08 compute-0 haproxy-metadata-proxy-ce346f8d-be8d-455f-b61c-12fea213a3f4[217774]: 10.100.0.9:36290 [22/Jan/2026:17:10:08.457] listener listener/metadata 0/0/0/32/32 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.493 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.494 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ce346f8d-be8d-455f-b61c-12fea213a3f4 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.519 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.520 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0342069
Jan 22 17:10:08 compute-0 haproxy-metadata-proxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[217848]: 10.100.0.14:58468 [22/Jan/2026:17:10:08.484] listener listener/metadata 0/0/0/35/35 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.524 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.525 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.534 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0404134
Jan 22 17:10:08 compute-0 haproxy-metadata-proxy-ce346f8d-be8d-455f-b61c-12fea213a3f4[217774]: 10.100.0.9:36294 [22/Jan/2026:17:10:08.493] listener listener/metadata 0/0/0/41/41 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.549 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.550 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ce346f8d-be8d-455f-b61c-12fea213a3f4 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.557 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:08 compute-0 haproxy-metadata-proxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[217848]: 10.100.0.14:58470 [22/Jan/2026:17:10:08.524] listener listener/metadata 0/0/0/33/33 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.557 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0328150
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.562 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.562 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.565 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.565 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0155039
Jan 22 17:10:08 compute-0 haproxy-metadata-proxy-ce346f8d-be8d-455f-b61c-12fea213a3f4[217774]: 10.100.0.9:36306 [22/Jan/2026:17:10:08.548] listener listener/metadata 0/0/0/16/16 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.570 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.571 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ce346f8d-be8d-455f-b61c-12fea213a3f4 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.576 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.577 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0146794
Jan 22 17:10:08 compute-0 haproxy-metadata-proxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[217848]: 10.100.0.14:58472 [22/Jan/2026:17:10:08.561] listener listener/metadata 0/0/0/15/15 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.581 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.582 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.585 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.585 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0144169
Jan 22 17:10:08 compute-0 haproxy-metadata-proxy-ce346f8d-be8d-455f-b61c-12fea213a3f4[217774]: 10.100.0.9:36308 [22/Jan/2026:17:10:08.569] listener listener/metadata 0/0/0/15/15 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.590 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.590 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ce346f8d-be8d-455f-b61c-12fea213a3f4 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.602 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.602 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 165 time: 0.0205030
Jan 22 17:10:08 compute-0 haproxy-metadata-proxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[217848]: 10.100.0.14:58486 [22/Jan/2026:17:10:08.581] listener listener/metadata 0/0/0/21/21 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.603 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.603 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0126007
Jan 22 17:10:08 compute-0 haproxy-metadata-proxy-ce346f8d-be8d-455f-b61c-12fea213a3f4[217774]: 10.100.0.9:36314 [22/Jan/2026:17:10:08.590] listener listener/metadata 0/0/0/13/13 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.607 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.607 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.610 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.611 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ce346f8d-be8d-455f-b61c-12fea213a3f4 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.622 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.622 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 165 time: 0.0116465
Jan 22 17:10:08 compute-0 haproxy-metadata-proxy-ce346f8d-be8d-455f-b61c-12fea213a3f4[217774]: 10.100.0.9:36328 [22/Jan/2026:17:10:08.608] listener listener/metadata 0/0/0/14/14 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.628 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.628 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0209668
Jan 22 17:10:08 compute-0 haproxy-metadata-proxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[217848]: 10.100.0.14:58498 [22/Jan/2026:17:10:08.606] listener listener/metadata 0/0/0/22/22 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.629 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.629 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ce346f8d-be8d-455f-b61c-12fea213a3f4 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.641 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:08 compute-0 haproxy-metadata-proxy-ce346f8d-be8d-455f-b61c-12fea213a3f4[217774]: 10.100.0.9:36330 [22/Jan/2026:17:10:08.627] listener listener/metadata 0/0/0/14/14 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:10:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:08.642 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0129690
Jan 22 17:10:08 compute-0 nova_compute[183075]: 2026-01-22 17:10:08.789 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:10:09 compute-0 nova_compute[183075]: 2026-01-22 17:10:09.196 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:09 compute-0 nova_compute[183075]: 2026-01-22 17:10:09.252 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769101794.2509959, 1fa7475b-9f51-4229-8ded-3a0c4de806c5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:10:09 compute-0 nova_compute[183075]: 2026-01-22 17:10:09.252 183079 INFO nova.compute.manager [-] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] VM Stopped (Lifecycle Event)
Jan 22 17:10:09 compute-0 nova_compute[183075]: 2026-01-22 17:10:09.278 183079 DEBUG nova.compute.manager [None req-393ac33d-9aa7-41de-b65d-b72b4d4ebb4f - - - - - -] [instance: 1fa7475b-9f51-4229-8ded-3a0c4de806c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:10:09 compute-0 nova_compute[183075]: 2026-01-22 17:10:09.630 183079 INFO nova.compute.manager [None req-67409e7e-37d8-4151-908f-742ba0636b0d 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Get console output
Jan 22 17:10:09 compute-0 nova_compute[183075]: 2026-01-22 17:10:09.636 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:10:09 compute-0 nova_compute[183075]: 2026-01-22 17:10:09.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:10:09 compute-0 nova_compute[183075]: 2026-01-22 17:10:09.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:10:09 compute-0 nova_compute[183075]: 2026-01-22 17:10:09.787 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:10:09 compute-0 nova_compute[183075]: 2026-01-22 17:10:09.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:10:10 compute-0 nova_compute[183075]: 2026-01-22 17:10:10.044 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:10 compute-0 nova_compute[183075]: 2026-01-22 17:10:10.263 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-7e8d077b-66fc-42ee-ad4e-a13327ad6764" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:10:10 compute-0 nova_compute[183075]: 2026-01-22 17:10:10.263 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-7e8d077b-66fc-42ee-ad4e-a13327ad6764" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:10:10 compute-0 nova_compute[183075]: 2026-01-22 17:10:10.263 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:10:10 compute-0 nova_compute[183075]: 2026-01-22 17:10:10.264 183079 DEBUG nova.objects.instance [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 7e8d077b-66fc-42ee-ad4e-a13327ad6764 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:10:10 compute-0 nova_compute[183075]: 2026-01-22 17:10:10.306 183079 INFO nova.compute.manager [None req-9701c60f-ada9-4ace-a486-4f402420cb48 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Get console output
Jan 22 17:10:10 compute-0 nova_compute[183075]: 2026-01-22 17:10:10.312 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:10:11 compute-0 nova_compute[183075]: 2026-01-22 17:10:11.442 183079 INFO nova.compute.manager [None req-1341bcf6-ddc8-4719-89f0-fbcabfefb53d cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Get console output
Jan 22 17:10:11 compute-0 nova_compute[183075]: 2026-01-22 17:10:11.447 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:10:11 compute-0 nova_compute[183075]: 2026-01-22 17:10:11.889 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Updating instance_info_cache with network_info: [{"id": "5644ae2a-c35b-431d-88a1-ad18de811d83", "address": "fa:16:3e:d4:c8:e5", "network": {"id": "ce346f8d-be8d-455f-b61c-12fea213a3f4", "bridge": "br-int", "label": "tempest-test-network--508539761", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5644ae2a-c3", "ovs_interfaceid": "5644ae2a-c35b-431d-88a1-ad18de811d83", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:10:11 compute-0 nova_compute[183075]: 2026-01-22 17:10:11.909 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-7e8d077b-66fc-42ee-ad4e-a13327ad6764" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:10:11 compute-0 nova_compute[183075]: 2026-01-22 17:10:11.911 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:10:11 compute-0 nova_compute[183075]: 2026-01-22 17:10:11.912 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:10:11 compute-0 nova_compute[183075]: 2026-01-22 17:10:11.912 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:10:11 compute-0 nova_compute[183075]: 2026-01-22 17:10:11.933 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:11 compute-0 nova_compute[183075]: 2026-01-22 17:10:11.934 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:11 compute-0 nova_compute[183075]: 2026-01-22 17:10:11.935 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:11 compute-0 nova_compute[183075]: 2026-01-22 17:10:11.936 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:10:12 compute-0 nova_compute[183075]: 2026-01-22 17:10:12.036 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/618a8c78-4b30-4b4d-9617-4c54bfdd414e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:10:12 compute-0 nova_compute[183075]: 2026-01-22 17:10:12.106 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/618a8c78-4b30-4b4d-9617-4c54bfdd414e/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:10:12 compute-0 nova_compute[183075]: 2026-01-22 17:10:12.108 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/618a8c78-4b30-4b4d-9617-4c54bfdd414e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:10:12 compute-0 nova_compute[183075]: 2026-01-22 17:10:12.177 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/618a8c78-4b30-4b4d-9617-4c54bfdd414e/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:10:12 compute-0 nova_compute[183075]: 2026-01-22 17:10:12.189 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:10:12 compute-0 nova_compute[183075]: 2026-01-22 17:10:12.251 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:10:12 compute-0 nova_compute[183075]: 2026-01-22 17:10:12.252 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:10:12 compute-0 nova_compute[183075]: 2026-01-22 17:10:12.305 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:10:12 compute-0 nova_compute[183075]: 2026-01-22 17:10:12.314 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4683d56-25f3-42a9-aedd-1b076e9a5245/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:10:12 compute-0 nova_compute[183075]: 2026-01-22 17:10:12.379 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4683d56-25f3-42a9-aedd-1b076e9a5245/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:10:12 compute-0 nova_compute[183075]: 2026-01-22 17:10:12.381 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4683d56-25f3-42a9-aedd-1b076e9a5245/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:10:12 compute-0 nova_compute[183075]: 2026-01-22 17:10:12.446 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4683d56-25f3-42a9-aedd-1b076e9a5245/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:10:12 compute-0 nova_compute[183075]: 2026-01-22 17:10:12.716 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:10:12 compute-0 nova_compute[183075]: 2026-01-22 17:10:12.718 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5230MB free_disk=73.31864547729492GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:10:12 compute-0 nova_compute[183075]: 2026-01-22 17:10:12.718 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:12 compute-0 nova_compute[183075]: 2026-01-22 17:10:12.719 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:13 compute-0 nova_compute[183075]: 2026-01-22 17:10:13.501 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 7e8d077b-66fc-42ee-ad4e-a13327ad6764 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:10:13 compute-0 nova_compute[183075]: 2026-01-22 17:10:13.502 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance e4683d56-25f3-42a9-aedd-1b076e9a5245 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:10:13 compute-0 nova_compute[183075]: 2026-01-22 17:10:13.503 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 618a8c78-4b30-4b4d-9617-4c54bfdd414e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:10:13 compute-0 nova_compute[183075]: 2026-01-22 17:10:13.503 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:10:13 compute-0 nova_compute[183075]: 2026-01-22 17:10:13.504 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:10:13 compute-0 nova_compute[183075]: 2026-01-22 17:10:13.616 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:10:13 compute-0 nova_compute[183075]: 2026-01-22 17:10:13.637 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:10:13 compute-0 nova_compute[183075]: 2026-01-22 17:10:13.666 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:10:13 compute-0 nova_compute[183075]: 2026-01-22 17:10:13.667 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.948s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:14 compute-0 nova_compute[183075]: 2026-01-22 17:10:14.228 183079 DEBUG nova.compute.manager [req-ff9b6de5-6e78-4fc7-9fb5-6d6e706ed933 req-4f8c3f1c-de1f-4c54-8f90-b85e5008e955 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Received event network-changed-5644ae2a-c35b-431d-88a1-ad18de811d83 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:10:14 compute-0 nova_compute[183075]: 2026-01-22 17:10:14.228 183079 DEBUG nova.compute.manager [req-ff9b6de5-6e78-4fc7-9fb5-6d6e706ed933 req-4f8c3f1c-de1f-4c54-8f90-b85e5008e955 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Refreshing instance network info cache due to event network-changed-5644ae2a-c35b-431d-88a1-ad18de811d83. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:10:14 compute-0 nova_compute[183075]: 2026-01-22 17:10:14.229 183079 DEBUG oslo_concurrency.lockutils [req-ff9b6de5-6e78-4fc7-9fb5-6d6e706ed933 req-4f8c3f1c-de1f-4c54-8f90-b85e5008e955 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-7e8d077b-66fc-42ee-ad4e-a13327ad6764" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:10:14 compute-0 nova_compute[183075]: 2026-01-22 17:10:14.229 183079 DEBUG oslo_concurrency.lockutils [req-ff9b6de5-6e78-4fc7-9fb5-6d6e706ed933 req-4f8c3f1c-de1f-4c54-8f90-b85e5008e955 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-7e8d077b-66fc-42ee-ad4e-a13327ad6764" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:10:14 compute-0 nova_compute[183075]: 2026-01-22 17:10:14.229 183079 DEBUG nova.network.neutron [req-ff9b6de5-6e78-4fc7-9fb5-6d6e706ed933 req-4f8c3f1c-de1f-4c54-8f90-b85e5008e955 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Refreshing network info cache for port 5644ae2a-c35b-431d-88a1-ad18de811d83 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:10:14 compute-0 nova_compute[183075]: 2026-01-22 17:10:14.232 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:14 compute-0 nova_compute[183075]: 2026-01-22 17:10:14.840 183079 INFO nova.compute.manager [None req-455efe4f-e479-4e37-84fc-0852d9a772cf 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Get console output
Jan 22 17:10:14 compute-0 nova_compute[183075]: 2026-01-22 17:10:14.851 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:10:15 compute-0 nova_compute[183075]: 2026-01-22 17:10:15.046 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:16 compute-0 nova_compute[183075]: 2026-01-22 17:10:16.661 183079 INFO nova.compute.manager [None req-c7e3ebc0-74fb-46c6-bca2-45eeef8a9f89 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Get console output
Jan 22 17:10:16 compute-0 nova_compute[183075]: 2026-01-22 17:10:16.667 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:10:17 compute-0 podman[218340]: 2026-01-22 17:10:17.389497635 +0000 UTC m=+0.077947329 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:10:17 compute-0 ovn_controller[95372]: 2026-01-22T17:10:17Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ca:cc:e7 10.100.0.8
Jan 22 17:10:17 compute-0 ovn_controller[95372]: 2026-01-22T17:10:17Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ca:cc:e7 10.100.0.8
Jan 22 17:10:18 compute-0 nova_compute[183075]: 2026-01-22 17:10:18.013 183079 DEBUG nova.network.neutron [req-ff9b6de5-6e78-4fc7-9fb5-6d6e706ed933 req-4f8c3f1c-de1f-4c54-8f90-b85e5008e955 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Updated VIF entry in instance network info cache for port 5644ae2a-c35b-431d-88a1-ad18de811d83. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:10:18 compute-0 nova_compute[183075]: 2026-01-22 17:10:18.014 183079 DEBUG nova.network.neutron [req-ff9b6de5-6e78-4fc7-9fb5-6d6e706ed933 req-4f8c3f1c-de1f-4c54-8f90-b85e5008e955 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Updating instance_info_cache with network_info: [{"id": "5644ae2a-c35b-431d-88a1-ad18de811d83", "address": "fa:16:3e:d4:c8:e5", "network": {"id": "ce346f8d-be8d-455f-b61c-12fea213a3f4", "bridge": "br-int", "label": "tempest-test-network--508539761", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5644ae2a-c3", "ovs_interfaceid": "5644ae2a-c35b-431d-88a1-ad18de811d83", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:10:18 compute-0 nova_compute[183075]: 2026-01-22 17:10:18.036 183079 DEBUG oslo_concurrency.lockutils [req-ff9b6de5-6e78-4fc7-9fb5-6d6e706ed933 req-4f8c3f1c-de1f-4c54-8f90-b85e5008e955 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-7e8d077b-66fc-42ee-ad4e-a13327ad6764" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:10:19 compute-0 nova_compute[183075]: 2026-01-22 17:10:19.234 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:19 compute-0 nova_compute[183075]: 2026-01-22 17:10:19.444 183079 DEBUG oslo_concurrency.lockutils [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Acquiring lock "7d7be65d-c615-4cfd-936e-e5b57b3f29c1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:19 compute-0 nova_compute[183075]: 2026-01-22 17:10:19.444 183079 DEBUG oslo_concurrency.lockutils [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "7d7be65d-c615-4cfd-936e-e5b57b3f29c1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:19 compute-0 nova_compute[183075]: 2026-01-22 17:10:19.514 183079 DEBUG nova.compute.manager [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:10:19 compute-0 nova_compute[183075]: 2026-01-22 17:10:19.597 183079 DEBUG oslo_concurrency.lockutils [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:19 compute-0 nova_compute[183075]: 2026-01-22 17:10:19.597 183079 DEBUG oslo_concurrency.lockutils [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:19 compute-0 nova_compute[183075]: 2026-01-22 17:10:19.604 183079 DEBUG nova.virt.hardware [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:10:19 compute-0 nova_compute[183075]: 2026-01-22 17:10:19.604 183079 INFO nova.compute.claims [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:10:19 compute-0 nova_compute[183075]: 2026-01-22 17:10:19.768 183079 DEBUG nova.compute.provider_tree [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:10:19 compute-0 nova_compute[183075]: 2026-01-22 17:10:19.786 183079 DEBUG nova.scheduler.client.report [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:10:19 compute-0 nova_compute[183075]: 2026-01-22 17:10:19.812 183079 DEBUG oslo_concurrency.lockutils [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.215s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:19 compute-0 nova_compute[183075]: 2026-01-22 17:10:19.814 183079 DEBUG nova.compute.manager [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:10:19 compute-0 nova_compute[183075]: 2026-01-22 17:10:19.863 183079 DEBUG nova.compute.manager [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:10:19 compute-0 nova_compute[183075]: 2026-01-22 17:10:19.863 183079 DEBUG nova.network.neutron [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:10:19 compute-0 nova_compute[183075]: 2026-01-22 17:10:19.883 183079 INFO nova.virt.libvirt.driver [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:10:19 compute-0 nova_compute[183075]: 2026-01-22 17:10:19.901 183079 DEBUG nova.compute.manager [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:10:19 compute-0 nova_compute[183075]: 2026-01-22 17:10:19.992 183079 INFO nova.compute.manager [None req-7e513478-6673-45b7-aed9-98b961fd7bdc 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Get console output
Jan 22 17:10:20 compute-0 nova_compute[183075]: 2026-01-22 17:10:19.999 183079 DEBUG nova.compute.manager [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:10:20 compute-0 nova_compute[183075]: 2026-01-22 17:10:20.000 183079 DEBUG nova.virt.libvirt.driver [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:10:20 compute-0 nova_compute[183075]: 2026-01-22 17:10:20.001 183079 INFO nova.virt.libvirt.driver [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Creating image(s)
Jan 22 17:10:20 compute-0 nova_compute[183075]: 2026-01-22 17:10:20.001 183079 DEBUG oslo_concurrency.lockutils [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Acquiring lock "/var/lib/nova/instances/7d7be65d-c615-4cfd-936e-e5b57b3f29c1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:20 compute-0 nova_compute[183075]: 2026-01-22 17:10:20.002 183079 DEBUG oslo_concurrency.lockutils [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "/var/lib/nova/instances/7d7be65d-c615-4cfd-936e-e5b57b3f29c1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:20 compute-0 nova_compute[183075]: 2026-01-22 17:10:20.003 183079 DEBUG oslo_concurrency.lockutils [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "/var/lib/nova/instances/7d7be65d-c615-4cfd-936e-e5b57b3f29c1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:20 compute-0 nova_compute[183075]: 2026-01-22 17:10:19.997 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:10:20 compute-0 nova_compute[183075]: 2026-01-22 17:10:20.021 183079 DEBUG oslo_concurrency.processutils [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:10:20 compute-0 nova_compute[183075]: 2026-01-22 17:10:20.051 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:20 compute-0 nova_compute[183075]: 2026-01-22 17:10:20.091 183079 DEBUG oslo_concurrency.processutils [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:10:20 compute-0 nova_compute[183075]: 2026-01-22 17:10:20.092 183079 DEBUG oslo_concurrency.lockutils [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:20 compute-0 nova_compute[183075]: 2026-01-22 17:10:20.093 183079 DEBUG oslo_concurrency.lockutils [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:20 compute-0 nova_compute[183075]: 2026-01-22 17:10:20.109 183079 DEBUG oslo_concurrency.processutils [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:10:20 compute-0 nova_compute[183075]: 2026-01-22 17:10:20.141 183079 DEBUG nova.policy [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:10:20 compute-0 nova_compute[183075]: 2026-01-22 17:10:20.170 183079 DEBUG oslo_concurrency.processutils [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:10:20 compute-0 nova_compute[183075]: 2026-01-22 17:10:20.173 183079 DEBUG oslo_concurrency.processutils [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/7d7be65d-c615-4cfd-936e-e5b57b3f29c1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:10:20 compute-0 nova_compute[183075]: 2026-01-22 17:10:20.215 183079 DEBUG oslo_concurrency.processutils [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/7d7be65d-c615-4cfd-936e-e5b57b3f29c1/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:10:20 compute-0 nova_compute[183075]: 2026-01-22 17:10:20.216 183079 DEBUG oslo_concurrency.lockutils [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:20 compute-0 nova_compute[183075]: 2026-01-22 17:10:20.217 183079 DEBUG oslo_concurrency.processutils [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:10:20 compute-0 nova_compute[183075]: 2026-01-22 17:10:20.275 183079 DEBUG oslo_concurrency.processutils [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:10:20 compute-0 nova_compute[183075]: 2026-01-22 17:10:20.277 183079 DEBUG nova.virt.disk.api [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Checking if we can resize image /var/lib/nova/instances/7d7be65d-c615-4cfd-936e-e5b57b3f29c1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:10:20 compute-0 nova_compute[183075]: 2026-01-22 17:10:20.278 183079 DEBUG oslo_concurrency.processutils [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7d7be65d-c615-4cfd-936e-e5b57b3f29c1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:10:20 compute-0 nova_compute[183075]: 2026-01-22 17:10:20.335 183079 DEBUG oslo_concurrency.processutils [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7d7be65d-c615-4cfd-936e-e5b57b3f29c1/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:10:20 compute-0 nova_compute[183075]: 2026-01-22 17:10:20.336 183079 DEBUG nova.virt.disk.api [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Cannot resize image /var/lib/nova/instances/7d7be65d-c615-4cfd-936e-e5b57b3f29c1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:10:20 compute-0 nova_compute[183075]: 2026-01-22 17:10:20.336 183079 DEBUG nova.objects.instance [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lazy-loading 'migration_context' on Instance uuid 7d7be65d-c615-4cfd-936e-e5b57b3f29c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:10:20 compute-0 nova_compute[183075]: 2026-01-22 17:10:20.350 183079 DEBUG nova.virt.libvirt.driver [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:10:20 compute-0 nova_compute[183075]: 2026-01-22 17:10:20.351 183079 DEBUG nova.virt.libvirt.driver [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Ensure instance console log exists: /var/lib/nova/instances/7d7be65d-c615-4cfd-936e-e5b57b3f29c1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:10:20 compute-0 nova_compute[183075]: 2026-01-22 17:10:20.351 183079 DEBUG oslo_concurrency.lockutils [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:20 compute-0 nova_compute[183075]: 2026-01-22 17:10:20.351 183079 DEBUG oslo_concurrency.lockutils [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:20 compute-0 nova_compute[183075]: 2026-01-22 17:10:20.351 183079 DEBUG oslo_concurrency.lockutils [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:21 compute-0 nova_compute[183075]: 2026-01-22 17:10:21.418 183079 DEBUG nova.network.neutron [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Successfully created port: 1188a618-4567-453e-b4f1-8d3fafe1d314 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:10:21 compute-0 nova_compute[183075]: 2026-01-22 17:10:21.798 183079 INFO nova.compute.manager [None req-f6f9c2bb-e824-4375-ba4f-dbf34a83854b cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Get console output
Jan 22 17:10:21 compute-0 nova_compute[183075]: 2026-01-22 17:10:21.805 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:10:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:22.943 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:22.944 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:10:22 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:22 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:22 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:22 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:22 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:22 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:10:22 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:24 compute-0 nova_compute[183075]: 2026-01-22 17:10:24.238 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:24 compute-0 podman[218380]: 2026-01-22 17:10:24.438100323 +0000 UTC m=+0.121104498 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 22 17:10:24 compute-0 podman[218379]: 2026-01-22 17:10:24.453764962 +0000 UTC m=+0.152514869 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:10:24 compute-0 podman[218424]: 2026-01-22 17:10:24.593595879 +0000 UTC m=+0.092325285 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, container_name=openstack_network_exporter, release=1755695350, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, name=ubi9-minimal, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, managed_by=edpm_ansible, vcs-type=git)
Jan 22 17:10:25 compute-0 nova_compute[183075]: 2026-01-22 17:10:25.006 183079 DEBUG nova.network.neutron [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Successfully updated port: 1188a618-4567-453e-b4f1-8d3fafe1d314 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:10:25 compute-0 nova_compute[183075]: 2026-01-22 17:10:25.030 183079 DEBUG oslo_concurrency.lockutils [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Acquiring lock "refresh_cache-7d7be65d-c615-4cfd-936e-e5b57b3f29c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:10:25 compute-0 nova_compute[183075]: 2026-01-22 17:10:25.030 183079 DEBUG oslo_concurrency.lockutils [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Acquired lock "refresh_cache-7d7be65d-c615-4cfd-936e-e5b57b3f29c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:10:25 compute-0 nova_compute[183075]: 2026-01-22 17:10:25.030 183079 DEBUG nova.network.neutron [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:10:25 compute-0 nova_compute[183075]: 2026-01-22 17:10:25.052 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.067 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.068 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 2.1234288
Jan 22 17:10:25 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[218271]: 10.100.0.8:41512 [22/Jan/2026:17:10:22.942] listener listener/metadata 0/0/0/2125/2125 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.076 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.077 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.092 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.092 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 168 time: 0.0152800
Jan 22 17:10:25 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[218271]: 10.100.0.8:41522 [22/Jan/2026:17:10:25.076] listener listener/metadata 0/0/0/16/16 200 152 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.097 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.098 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.114 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.114 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0162554
Jan 22 17:10:25 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[218271]: 10.100.0.8:41532 [22/Jan/2026:17:10:25.096] listener listener/metadata 0/0/0/17/17 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.122 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.123 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:25 compute-0 nova_compute[183075]: 2026-01-22 17:10:25.125 183079 DEBUG nova.compute.manager [req-468fa30e-eb09-4066-aab4-beedf3b8d103 req-1f8e76e0-b590-47a2-842c-eae6f9dbf330 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Received event network-changed-1188a618-4567-453e-b4f1-8d3fafe1d314 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:10:25 compute-0 nova_compute[183075]: 2026-01-22 17:10:25.126 183079 DEBUG nova.compute.manager [req-468fa30e-eb09-4066-aab4-beedf3b8d103 req-1f8e76e0-b590-47a2-842c-eae6f9dbf330 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Refreshing instance network info cache due to event network-changed-1188a618-4567-453e-b4f1-8d3fafe1d314. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:10:25 compute-0 nova_compute[183075]: 2026-01-22 17:10:25.126 183079 DEBUG oslo_concurrency.lockutils [req-468fa30e-eb09-4066-aab4-beedf3b8d103 req-1f8e76e0-b590-47a2-842c-eae6f9dbf330 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-7d7be65d-c615-4cfd-936e-e5b57b3f29c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.140 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.141 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0174658
Jan 22 17:10:25 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[218271]: 10.100.0.8:41536 [22/Jan/2026:17:10:25.122] listener listener/metadata 0/0/0/18/18 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.149 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.149 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.163 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.164 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0144410
Jan 22 17:10:25 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[218271]: 10.100.0.8:41540 [22/Jan/2026:17:10:25.148] listener listener/metadata 0/0/0/15/15 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.172 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.173 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.191 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:25 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[218271]: 10.100.0.8:41550 [22/Jan/2026:17:10:25.171] listener listener/metadata 0/0/0/19/19 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.192 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0187116
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.201 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.202 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:25 compute-0 nova_compute[183075]: 2026-01-22 17:10:25.208 183079 INFO nova.compute.manager [None req-9c654862-b2c5-4c78-bb2a-26a9c4ed02c1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Get console output
Jan 22 17:10:25 compute-0 nova_compute[183075]: 2026-01-22 17:10:25.214 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:10:25 compute-0 nova_compute[183075]: 2026-01-22 17:10:25.232 183079 DEBUG nova.network.neutron [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.233 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.234 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0317578
Jan 22 17:10:25 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[218271]: 10.100.0.8:41552 [22/Jan/2026:17:10:25.200] listener listener/metadata 0/0/0/33/33 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.243 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.244 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.277 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.277 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 151 time: 0.0332453
Jan 22 17:10:25 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[218271]: 10.100.0.8:41554 [22/Jan/2026:17:10:25.242] listener listener/metadata 0/0/0/35/35 200 135 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.283 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.284 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.301 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.301 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 165 time: 0.0177016
Jan 22 17:10:25 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[218271]: 10.100.0.8:41560 [22/Jan/2026:17:10:25.282] listener listener/metadata 0/0/0/19/19 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.306 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.306 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.331 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.331 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 165 time: 0.0247567
Jan 22 17:10:25 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[218271]: 10.100.0.8:41568 [22/Jan/2026:17:10:25.305] listener listener/metadata 0/0/0/26/26 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.338 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.338 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:25 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[218271]: 10.100.0.8:41578 [22/Jan/2026:17:10:25.336] listener listener/metadata 0/0/0/17/17 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.354 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0150707
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.362 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.363 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.379 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.379 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0162840
Jan 22 17:10:25 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[218271]: 10.100.0.8:41580 [22/Jan/2026:17:10:25.361] listener listener/metadata 0/0/0/17/17 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.382 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.383 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.400 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.400 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0174732
Jan 22 17:10:25 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[218271]: 10.100.0.8:41584 [22/Jan/2026:17:10:25.382] listener listener/metadata 0/0/0/18/18 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.404 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.404 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.420 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.420 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0158904
Jan 22 17:10:25 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[218271]: 10.100.0.8:41598 [22/Jan/2026:17:10:25.403] listener listener/metadata 0/0/0/16/16 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.426 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.427 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.441 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:25 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[218271]: 10.100.0.8:41602 [22/Jan/2026:17:10:25.426] listener listener/metadata 0/0/0/15/15 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.441 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 165 time: 0.0146079
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.447 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.448 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.465 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:25.466 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0176020
Jan 22 17:10:25 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[218271]: 10.100.0.8:41618 [22/Jan/2026:17:10:25.447] listener listener/metadata 0/0/0/18/18 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.116 183079 DEBUG nova.network.neutron [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Updating instance_info_cache with network_info: [{"id": "1188a618-4567-453e-b4f1-8d3fafe1d314", "address": "fa:16:3e:54:2a:05", "network": {"id": "ce346f8d-be8d-455f-b61c-12fea213a3f4", "bridge": "br-int", "label": "tempest-test-network--508539761", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1188a618-45", "ovs_interfaceid": "1188a618-4567-453e-b4f1-8d3fafe1d314", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.158 183079 DEBUG oslo_concurrency.lockutils [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Releasing lock "refresh_cache-7d7be65d-c615-4cfd-936e-e5b57b3f29c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.159 183079 DEBUG nova.compute.manager [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Instance network_info: |[{"id": "1188a618-4567-453e-b4f1-8d3fafe1d314", "address": "fa:16:3e:54:2a:05", "network": {"id": "ce346f8d-be8d-455f-b61c-12fea213a3f4", "bridge": "br-int", "label": "tempest-test-network--508539761", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1188a618-45", "ovs_interfaceid": "1188a618-4567-453e-b4f1-8d3fafe1d314", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.159 183079 DEBUG oslo_concurrency.lockutils [req-468fa30e-eb09-4066-aab4-beedf3b8d103 req-1f8e76e0-b590-47a2-842c-eae6f9dbf330 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-7d7be65d-c615-4cfd-936e-e5b57b3f29c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.159 183079 DEBUG nova.network.neutron [req-468fa30e-eb09-4066-aab4-beedf3b8d103 req-1f8e76e0-b590-47a2-842c-eae6f9dbf330 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Refreshing network info cache for port 1188a618-4567-453e-b4f1-8d3fafe1d314 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.162 183079 DEBUG nova.virt.libvirt.driver [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Start _get_guest_xml network_info=[{"id": "1188a618-4567-453e-b4f1-8d3fafe1d314", "address": "fa:16:3e:54:2a:05", "network": {"id": "ce346f8d-be8d-455f-b61c-12fea213a3f4", "bridge": "br-int", "label": "tempest-test-network--508539761", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1188a618-45", "ovs_interfaceid": "1188a618-4567-453e-b4f1-8d3fafe1d314", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.170 183079 WARNING nova.virt.libvirt.driver [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.177 183079 DEBUG nova.virt.libvirt.host [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.178 183079 DEBUG nova.virt.libvirt.host [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.185 183079 DEBUG nova.virt.libvirt.host [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.186 183079 DEBUG nova.virt.libvirt.host [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.187 183079 DEBUG nova.virt.libvirt.driver [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.187 183079 DEBUG nova.virt.hardware [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.187 183079 DEBUG nova.virt.hardware [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.187 183079 DEBUG nova.virt.hardware [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.188 183079 DEBUG nova.virt.hardware [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.188 183079 DEBUG nova.virt.hardware [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.188 183079 DEBUG nova.virt.hardware [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.188 183079 DEBUG nova.virt.hardware [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.188 183079 DEBUG nova.virt.hardware [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.189 183079 DEBUG nova.virt.hardware [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.189 183079 DEBUG nova.virt.hardware [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.189 183079 DEBUG nova.virt.hardware [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.193 183079 DEBUG nova.virt.libvirt.vif [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:10:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1811743565',display_name='tempest-server-test-1811743565',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1811743565',id=14,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA4izGuVdf36SsG+7n8kX9aNpboq22Z55adiWGM5qlH08LxqMkSxkCnGlFdsMKL8t/vQsOXqbCU1vgc4to/WoKVrvDSrylB83cxSgDIuuaEZv45HgYlb5csi4YLKl3Bk4g==',key_name='tempest-keypair-test-110348497',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c2b37b797ca344f2b31c3861277068d8',ramdisk_id='',reservation_id='r-9elkgxck',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='
virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpMultipleRoutersTest-2036232412',owner_user_name='tempest-FloatingIpMultipleRoutersTest-2036232412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:10:19Z,user_data=None,user_id='28bc4852545149e59d0541d4f39eb38e',uuid=7d7be65d-c615-4cfd-936e-e5b57b3f29c1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1188a618-4567-453e-b4f1-8d3fafe1d314", "address": "fa:16:3e:54:2a:05", "network": {"id": "ce346f8d-be8d-455f-b61c-12fea213a3f4", "bridge": "br-int", "label": "tempest-test-network--508539761", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1188a618-45", "ovs_interfaceid": "1188a618-4567-453e-b4f1-8d3fafe1d314", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.194 183079 DEBUG nova.network.os_vif_util [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Converting VIF {"id": "1188a618-4567-453e-b4f1-8d3fafe1d314", "address": "fa:16:3e:54:2a:05", "network": {"id": "ce346f8d-be8d-455f-b61c-12fea213a3f4", "bridge": "br-int", "label": "tempest-test-network--508539761", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1188a618-45", "ovs_interfaceid": "1188a618-4567-453e-b4f1-8d3fafe1d314", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.194 183079 DEBUG nova.network.os_vif_util [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:54:2a:05,bridge_name='br-int',has_traffic_filtering=True,id=1188a618-4567-453e-b4f1-8d3fafe1d314,network=Network(ce346f8d-be8d-455f-b61c-12fea213a3f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1188a618-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.195 183079 DEBUG nova.objects.instance [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7d7be65d-c615-4cfd-936e-e5b57b3f29c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.222 183079 DEBUG nova.virt.libvirt.driver [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:10:26 compute-0 nova_compute[183075]:   <uuid>7d7be65d-c615-4cfd-936e-e5b57b3f29c1</uuid>
Jan 22 17:10:26 compute-0 nova_compute[183075]:   <name>instance-0000000e</name>
Jan 22 17:10:26 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:10:26 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:10:26 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:10:26 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1811743565</nova:name>
Jan 22 17:10:26 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:10:26</nova:creationTime>
Jan 22 17:10:26 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:10:26 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:10:26 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:10:26 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:10:26 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:10:26 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:10:26 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:10:26 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:10:26 compute-0 nova_compute[183075]:         <nova:user uuid="28bc4852545149e59d0541d4f39eb38e">tempest-FloatingIpMultipleRoutersTest-2036232412-project-member</nova:user>
Jan 22 17:10:26 compute-0 nova_compute[183075]:         <nova:project uuid="c2b37b797ca344f2b31c3861277068d8">tempest-FloatingIpMultipleRoutersTest-2036232412</nova:project>
Jan 22 17:10:26 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:10:26 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:10:26 compute-0 nova_compute[183075]:         <nova:port uuid="1188a618-4567-453e-b4f1-8d3fafe1d314">
Jan 22 17:10:26 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:10:26 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:10:26 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:10:26 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <system>
Jan 22 17:10:26 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:10:26 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:10:26 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:10:26 compute-0 nova_compute[183075]:       <entry name="serial">7d7be65d-c615-4cfd-936e-e5b57b3f29c1</entry>
Jan 22 17:10:26 compute-0 nova_compute[183075]:       <entry name="uuid">7d7be65d-c615-4cfd-936e-e5b57b3f29c1</entry>
Jan 22 17:10:26 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     </system>
Jan 22 17:10:26 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:10:26 compute-0 nova_compute[183075]:   <os>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:   </os>
Jan 22 17:10:26 compute-0 nova_compute[183075]:   <features>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:   </features>
Jan 22 17:10:26 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:10:26 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:10:26 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:10:26 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/7d7be65d-c615-4cfd-936e-e5b57b3f29c1/disk"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:10:26 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:54:2a:05"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:       <target dev="tap1188a618-45"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:10:26 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/7d7be65d-c615-4cfd-936e-e5b57b3f29c1/console.log" append="off"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <video>
Jan 22 17:10:26 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     </video>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:10:26 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:10:26 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:10:26 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:10:26 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:10:26 compute-0 nova_compute[183075]: </domain>
Jan 22 17:10:26 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.223 183079 DEBUG nova.compute.manager [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Preparing to wait for external event network-vif-plugged-1188a618-4567-453e-b4f1-8d3fafe1d314 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.224 183079 DEBUG oslo_concurrency.lockutils [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Acquiring lock "7d7be65d-c615-4cfd-936e-e5b57b3f29c1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.224 183079 DEBUG oslo_concurrency.lockutils [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "7d7be65d-c615-4cfd-936e-e5b57b3f29c1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.224 183079 DEBUG oslo_concurrency.lockutils [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "7d7be65d-c615-4cfd-936e-e5b57b3f29c1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.225 183079 DEBUG nova.virt.libvirt.vif [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:10:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1811743565',display_name='tempest-server-test-1811743565',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1811743565',id=14,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA4izGuVdf36SsG+7n8kX9aNpboq22Z55adiWGM5qlH08LxqMkSxkCnGlFdsMKL8t/vQsOXqbCU1vgc4to/WoKVrvDSrylB83cxSgDIuuaEZv45HgYlb5csi4YLKl3Bk4g==',key_name='tempest-keypair-test-110348497',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c2b37b797ca344f2b31c3861277068d8',ramdisk_id='',reservation_id='r-9elkgxck',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_r
ng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpMultipleRoutersTest-2036232412',owner_user_name='tempest-FloatingIpMultipleRoutersTest-2036232412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:10:19Z,user_data=None,user_id='28bc4852545149e59d0541d4f39eb38e',uuid=7d7be65d-c615-4cfd-936e-e5b57b3f29c1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1188a618-4567-453e-b4f1-8d3fafe1d314", "address": "fa:16:3e:54:2a:05", "network": {"id": "ce346f8d-be8d-455f-b61c-12fea213a3f4", "bridge": "br-int", "label": "tempest-test-network--508539761", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1188a618-45", "ovs_interfaceid": "1188a618-4567-453e-b4f1-8d3fafe1d314", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.226 183079 DEBUG nova.network.os_vif_util [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Converting VIF {"id": "1188a618-4567-453e-b4f1-8d3fafe1d314", "address": "fa:16:3e:54:2a:05", "network": {"id": "ce346f8d-be8d-455f-b61c-12fea213a3f4", "bridge": "br-int", "label": "tempest-test-network--508539761", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1188a618-45", "ovs_interfaceid": "1188a618-4567-453e-b4f1-8d3fafe1d314", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.227 183079 DEBUG nova.network.os_vif_util [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:54:2a:05,bridge_name='br-int',has_traffic_filtering=True,id=1188a618-4567-453e-b4f1-8d3fafe1d314,network=Network(ce346f8d-be8d-455f-b61c-12fea213a3f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1188a618-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.227 183079 DEBUG os_vif [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:2a:05,bridge_name='br-int',has_traffic_filtering=True,id=1188a618-4567-453e-b4f1-8d3fafe1d314,network=Network(ce346f8d-be8d-455f-b61c-12fea213a3f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1188a618-45') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.228 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.229 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.230 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.240 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.241 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1188a618-45, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.241 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1188a618-45, col_values=(('external_ids', {'iface-id': '1188a618-4567-453e-b4f1-8d3fafe1d314', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:54:2a:05', 'vm-uuid': '7d7be65d-c615-4cfd-936e-e5b57b3f29c1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.270 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:26 compute-0 NetworkManager[55454]: <info>  [1769101826.2714] manager: (tap1188a618-45): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.273 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.283 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.283 183079 INFO os_vif [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:2a:05,bridge_name='br-int',has_traffic_filtering=True,id=1188a618-4567-453e-b4f1-8d3fafe1d314,network=Network(ce346f8d-be8d-455f-b61c-12fea213a3f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1188a618-45')
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.336 183079 DEBUG nova.virt.libvirt.driver [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.337 183079 DEBUG nova.virt.libvirt.driver [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] No VIF found with MAC fa:16:3e:54:2a:05, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:10:26 compute-0 kernel: tap1188a618-45: entered promiscuous mode
Jan 22 17:10:26 compute-0 NetworkManager[55454]: <info>  [1769101826.4139] manager: (tap1188a618-45): new Tun device (/org/freedesktop/NetworkManager/Devices/79)
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.414 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:26 compute-0 ovn_controller[95372]: 2026-01-22T17:10:26Z|00159|binding|INFO|Claiming lport 1188a618-4567-453e-b4f1-8d3fafe1d314 for this chassis.
Jan 22 17:10:26 compute-0 ovn_controller[95372]: 2026-01-22T17:10:26Z|00160|binding|INFO|1188a618-4567-453e-b4f1-8d3fafe1d314: Claiming fa:16:3e:54:2a:05 10.100.0.13
Jan 22 17:10:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:26.422 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:54:2a:05 10.100.0.13'], port_security=['fa:16:3e:54:2a:05 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '7d7be65d-c615-4cfd-936e-e5b57b3f29c1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ce346f8d-be8d-455f-b61c-12fea213a3f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c2b37b797ca344f2b31c3861277068d8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2f51838f-8a2c-425b-a70e-e288886c38d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9f29732e-c99f-480d-89f6-9caa444040c9, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=1188a618-4567-453e-b4f1-8d3fafe1d314) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:10:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:26.424 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 1188a618-4567-453e-b4f1-8d3fafe1d314 in datapath ce346f8d-be8d-455f-b61c-12fea213a3f4 bound to our chassis
Jan 22 17:10:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:26.426 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ce346f8d-be8d-455f-b61c-12fea213a3f4
Jan 22 17:10:26 compute-0 ovn_controller[95372]: 2026-01-22T17:10:26Z|00161|binding|INFO|Setting lport 1188a618-4567-453e-b4f1-8d3fafe1d314 ovn-installed in OVS
Jan 22 17:10:26 compute-0 ovn_controller[95372]: 2026-01-22T17:10:26Z|00162|binding|INFO|Setting lport 1188a618-4567-453e-b4f1-8d3fafe1d314 up in Southbound
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.431 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.434 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:26.447 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[645b8312-049c-424e-b433-168734c0900e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:26 compute-0 systemd-machined[154382]: New machine qemu-14-instance-0000000e.
Jan 22 17:10:26 compute-0 systemd-udevd[218465]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:10:26 compute-0 NetworkManager[55454]: <info>  [1769101826.4700] device (tap1188a618-45): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:10:26 compute-0 NetworkManager[55454]: <info>  [1769101826.4710] device (tap1188a618-45): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:10:26 compute-0 systemd[1]: Started Virtual Machine qemu-14-instance-0000000e.
Jan 22 17:10:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:26.490 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[ae6794d5-5c83-4911-840e-6732973b1e1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:26.494 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[64636715-9dfb-465c-9a36-363374e27222]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:26.532 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[3e5be3a4-7203-4f72-b3ec-1885d83534e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:26.549 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[fb4e75bb-578f-456b-b376-c4e3aae25d86]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapce346f8d-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:30:a7:4b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 55, 'rx_bytes': 8920, 'tx_bytes': 6193, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 55, 'rx_bytes': 8920, 'tx_bytes': 6193, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415010, 'reachable_time': 25211, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218477, 'error': None, 'target': 'ovnmeta-ce346f8d-be8d-455f-b61c-12fea213a3f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:26.567 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[881b23a7-2d11-48e9-8a81-8906f3be31c1]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapce346f8d-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415022, 'tstamp': 415022}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218478, 'error': None, 'target': 'ovnmeta-ce346f8d-be8d-455f-b61c-12fea213a3f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapce346f8d-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415024, 'tstamp': 415024}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218478, 'error': None, 'target': 'ovnmeta-ce346f8d-be8d-455f-b61c-12fea213a3f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:26.569 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapce346f8d-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.570 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:26.571 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapce346f8d-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:10:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:26.572 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:10:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:26.572 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapce346f8d-b0, col_values=(('external_ids', {'iface-id': '255f865e-6322-48b0-a0d1-c16ced648c78'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:10:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:26.572 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.698 183079 DEBUG nova.compute.manager [req-c3db0db3-d7c6-4732-bd70-6be839661429 req-06208e71-b23b-4ad5-a124-472f3b7bd9c0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Received event network-vif-plugged-1188a618-4567-453e-b4f1-8d3fafe1d314 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.699 183079 DEBUG oslo_concurrency.lockutils [req-c3db0db3-d7c6-4732-bd70-6be839661429 req-06208e71-b23b-4ad5-a124-472f3b7bd9c0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "7d7be65d-c615-4cfd-936e-e5b57b3f29c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.699 183079 DEBUG oslo_concurrency.lockutils [req-c3db0db3-d7c6-4732-bd70-6be839661429 req-06208e71-b23b-4ad5-a124-472f3b7bd9c0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7d7be65d-c615-4cfd-936e-e5b57b3f29c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.699 183079 DEBUG oslo_concurrency.lockutils [req-c3db0db3-d7c6-4732-bd70-6be839661429 req-06208e71-b23b-4ad5-a124-472f3b7bd9c0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7d7be65d-c615-4cfd-936e-e5b57b3f29c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.699 183079 DEBUG nova.compute.manager [req-c3db0db3-d7c6-4732-bd70-6be839661429 req-06208e71-b23b-4ad5-a124-472f3b7bd9c0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Processing event network-vif-plugged-1188a618-4567-453e-b4f1-8d3fafe1d314 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.759 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101826.7592707, 7d7be65d-c615-4cfd-936e-e5b57b3f29c1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.760 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] VM Started (Lifecycle Event)
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.763 183079 DEBUG nova.compute.manager [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.769 183079 DEBUG nova.virt.libvirt.driver [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.774 183079 INFO nova.virt.libvirt.driver [-] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Instance spawned successfully.
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.775 183079 DEBUG nova.virt.libvirt.driver [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.779 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.783 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.792 183079 DEBUG nova.virt.libvirt.driver [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.793 183079 DEBUG nova.virt.libvirt.driver [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.793 183079 DEBUG nova.virt.libvirt.driver [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.794 183079 DEBUG nova.virt.libvirt.driver [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.794 183079 DEBUG nova.virt.libvirt.driver [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.794 183079 DEBUG nova.virt.libvirt.driver [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.806 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.806 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101826.7635338, 7d7be65d-c615-4cfd-936e-e5b57b3f29c1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.806 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] VM Paused (Lifecycle Event)
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.845 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.849 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101826.7677708, 7d7be65d-c615-4cfd-936e-e5b57b3f29c1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.849 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] VM Resumed (Lifecycle Event)
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.874 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.878 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.885 183079 INFO nova.compute.manager [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Took 6.89 seconds to spawn the instance on the hypervisor.
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.885 183079 DEBUG nova.compute.manager [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.896 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.951 183079 INFO nova.compute.manager [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Took 7.38 seconds to build instance.
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.954 183079 INFO nova.compute.manager [None req-5d172bd0-e655-48ee-8329-cf1e516c57eb cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Get console output
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.961 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:10:26 compute-0 nova_compute[183075]: 2026-01-22 17:10:26.967 183079 DEBUG oslo_concurrency.lockutils [None req-90766e87-7ee7-40b9-8c36-8fdc00a92830 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "7d7be65d-c615-4cfd-936e-e5b57b3f29c1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.523s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:30 compute-0 nova_compute[183075]: 2026-01-22 17:10:30.004 183079 DEBUG nova.network.neutron [req-468fa30e-eb09-4066-aab4-beedf3b8d103 req-1f8e76e0-b590-47a2-842c-eae6f9dbf330 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Updated VIF entry in instance network info cache for port 1188a618-4567-453e-b4f1-8d3fafe1d314. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:10:30 compute-0 nova_compute[183075]: 2026-01-22 17:10:30.005 183079 DEBUG nova.network.neutron [req-468fa30e-eb09-4066-aab4-beedf3b8d103 req-1f8e76e0-b590-47a2-842c-eae6f9dbf330 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Updating instance_info_cache with network_info: [{"id": "1188a618-4567-453e-b4f1-8d3fafe1d314", "address": "fa:16:3e:54:2a:05", "network": {"id": "ce346f8d-be8d-455f-b61c-12fea213a3f4", "bridge": "br-int", "label": "tempest-test-network--508539761", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1188a618-45", "ovs_interfaceid": "1188a618-4567-453e-b4f1-8d3fafe1d314", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:10:30 compute-0 nova_compute[183075]: 2026-01-22 17:10:30.019 183079 DEBUG oslo_concurrency.lockutils [req-468fa30e-eb09-4066-aab4-beedf3b8d103 req-1f8e76e0-b590-47a2-842c-eae6f9dbf330 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-7d7be65d-c615-4cfd-936e-e5b57b3f29c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:10:30 compute-0 nova_compute[183075]: 2026-01-22 17:10:30.055 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:30 compute-0 nova_compute[183075]: 2026-01-22 17:10:30.111 183079 DEBUG nova.compute.manager [req-ad331e52-94fa-4406-8e7c-fa606ac568a0 req-93cf01d0-7bda-4a6e-9cc5-347c95888e8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Received event network-vif-plugged-1188a618-4567-453e-b4f1-8d3fafe1d314 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:10:30 compute-0 nova_compute[183075]: 2026-01-22 17:10:30.111 183079 DEBUG oslo_concurrency.lockutils [req-ad331e52-94fa-4406-8e7c-fa606ac568a0 req-93cf01d0-7bda-4a6e-9cc5-347c95888e8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "7d7be65d-c615-4cfd-936e-e5b57b3f29c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:30 compute-0 nova_compute[183075]: 2026-01-22 17:10:30.111 183079 DEBUG oslo_concurrency.lockutils [req-ad331e52-94fa-4406-8e7c-fa606ac568a0 req-93cf01d0-7bda-4a6e-9cc5-347c95888e8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7d7be65d-c615-4cfd-936e-e5b57b3f29c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:30 compute-0 nova_compute[183075]: 2026-01-22 17:10:30.111 183079 DEBUG oslo_concurrency.lockutils [req-ad331e52-94fa-4406-8e7c-fa606ac568a0 req-93cf01d0-7bda-4a6e-9cc5-347c95888e8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7d7be65d-c615-4cfd-936e-e5b57b3f29c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:30 compute-0 nova_compute[183075]: 2026-01-22 17:10:30.112 183079 DEBUG nova.compute.manager [req-ad331e52-94fa-4406-8e7c-fa606ac568a0 req-93cf01d0-7bda-4a6e-9cc5-347c95888e8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] No waiting events found dispatching network-vif-plugged-1188a618-4567-453e-b4f1-8d3fafe1d314 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:10:30 compute-0 nova_compute[183075]: 2026-01-22 17:10:30.112 183079 WARNING nova.compute.manager [req-ad331e52-94fa-4406-8e7c-fa606ac568a0 req-93cf01d0-7bda-4a6e-9cc5-347c95888e8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Received unexpected event network-vif-plugged-1188a618-4567-453e-b4f1-8d3fafe1d314 for instance with vm_state active and task_state None.
Jan 22 17:10:30 compute-0 nova_compute[183075]: 2026-01-22 17:10:30.120 183079 INFO nova.compute.manager [None req-a8148ae2-357c-4f81-a3c0-73f6b6b73501 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Get console output
Jan 22 17:10:30 compute-0 nova_compute[183075]: 2026-01-22 17:10:30.309 183079 INFO nova.compute.manager [None req-9efde6c9-f1ba-4d4b-9f0e-bacd0e21adde 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Get console output
Jan 22 17:10:30 compute-0 nova_compute[183075]: 2026-01-22 17:10:30.314 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:10:31 compute-0 nova_compute[183075]: 2026-01-22 17:10:31.270 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:31 compute-0 podman[218487]: 2026-01-22 17:10:31.36959335 +0000 UTC m=+0.076134686 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:10:32 compute-0 nova_compute[183075]: 2026-01-22 17:10:32.486 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:35 compute-0 nova_compute[183075]: 2026-01-22 17:10:35.107 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:35 compute-0 nova_compute[183075]: 2026-01-22 17:10:35.255 183079 INFO nova.compute.manager [None req-ba8e6b45-bb5a-480a-9167-141ccef06666 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Get console output
Jan 22 17:10:35 compute-0 nova_compute[183075]: 2026-01-22 17:10:35.450 183079 INFO nova.compute.manager [None req-3a03cf1a-edda-4327-bb6a-dbef9e53e767 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Get console output
Jan 22 17:10:35 compute-0 nova_compute[183075]: 2026-01-22 17:10:35.455 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:10:36 compute-0 nova_compute[183075]: 2026-01-22 17:10:36.322 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:37 compute-0 nova_compute[183075]: 2026-01-22 17:10:37.823 183079 DEBUG oslo_concurrency.lockutils [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "2bf3289b-0c4e-4286-80ca-c74eb06a8b96" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:37 compute-0 nova_compute[183075]: 2026-01-22 17:10:37.823 183079 DEBUG oslo_concurrency.lockutils [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "2bf3289b-0c4e-4286-80ca-c74eb06a8b96" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:37 compute-0 nova_compute[183075]: 2026-01-22 17:10:37.835 183079 DEBUG nova.compute.manager [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:10:37 compute-0 nova_compute[183075]: 2026-01-22 17:10:37.917 183079 DEBUG oslo_concurrency.lockutils [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:37 compute-0 nova_compute[183075]: 2026-01-22 17:10:37.917 183079 DEBUG oslo_concurrency.lockutils [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:37 compute-0 nova_compute[183075]: 2026-01-22 17:10:37.930 183079 DEBUG nova.virt.hardware [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:10:37 compute-0 nova_compute[183075]: 2026-01-22 17:10:37.930 183079 INFO nova.compute.claims [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.122 183079 DEBUG nova.compute.provider_tree [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.148 183079 DEBUG nova.scheduler.client.report [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.173 183079 DEBUG oslo_concurrency.lockutils [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.255s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.174 183079 DEBUG nova.compute.manager [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.231 183079 DEBUG nova.compute.manager [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.231 183079 DEBUG nova.network.neutron [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.257 183079 INFO nova.virt.libvirt.driver [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.302 183079 DEBUG nova.compute.manager [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.370 183079 DEBUG nova.compute.manager [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:10:38 compute-0 podman[218522]: 2026-01-22 17:10:38.370969205 +0000 UTC m=+0.063478596 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.371 183079 DEBUG nova.virt.libvirt.driver [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.372 183079 INFO nova.virt.libvirt.driver [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Creating image(s)
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.372 183079 DEBUG oslo_concurrency.lockutils [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "/var/lib/nova/instances/2bf3289b-0c4e-4286-80ca-c74eb06a8b96/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.372 183079 DEBUG oslo_concurrency.lockutils [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "/var/lib/nova/instances/2bf3289b-0c4e-4286-80ca-c74eb06a8b96/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.373 183079 DEBUG oslo_concurrency.lockutils [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "/var/lib/nova/instances/2bf3289b-0c4e-4286-80ca-c74eb06a8b96/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.384 183079 DEBUG oslo_concurrency.processutils [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.441 183079 DEBUG oslo_concurrency.processutils [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.442 183079 DEBUG oslo_concurrency.lockutils [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.442 183079 DEBUG oslo_concurrency.lockutils [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.452 183079 DEBUG oslo_concurrency.processutils [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.502 183079 DEBUG oslo_concurrency.processutils [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.504 183079 DEBUG oslo_concurrency.processutils [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/2bf3289b-0c4e-4286-80ca-c74eb06a8b96/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.539 183079 DEBUG oslo_concurrency.processutils [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/2bf3289b-0c4e-4286-80ca-c74eb06a8b96/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.540 183079 DEBUG oslo_concurrency.lockutils [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.541 183079 DEBUG oslo_concurrency.processutils [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.598 183079 DEBUG oslo_concurrency.processutils [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.600 183079 DEBUG nova.virt.disk.api [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Checking if we can resize image /var/lib/nova/instances/2bf3289b-0c4e-4286-80ca-c74eb06a8b96/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.600 183079 DEBUG oslo_concurrency.processutils [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2bf3289b-0c4e-4286-80ca-c74eb06a8b96/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.665 183079 DEBUG oslo_concurrency.processutils [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2bf3289b-0c4e-4286-80ca-c74eb06a8b96/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.666 183079 DEBUG nova.virt.disk.api [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Cannot resize image /var/lib/nova/instances/2bf3289b-0c4e-4286-80ca-c74eb06a8b96/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.667 183079 DEBUG nova.objects.instance [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lazy-loading 'migration_context' on Instance uuid 2bf3289b-0c4e-4286-80ca-c74eb06a8b96 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.695 183079 DEBUG nova.virt.libvirt.driver [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.696 183079 DEBUG nova.virt.libvirt.driver [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Ensure instance console log exists: /var/lib/nova/instances/2bf3289b-0c4e-4286-80ca-c74eb06a8b96/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.696 183079 DEBUG oslo_concurrency.lockutils [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.697 183079 DEBUG oslo_concurrency.lockutils [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:38 compute-0 nova_compute[183075]: 2026-01-22 17:10:38.698 183079 DEBUG oslo_concurrency.lockutils [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:39 compute-0 nova_compute[183075]: 2026-01-22 17:10:39.074 183079 DEBUG nova.policy [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:10:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:39.129 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:10:39 compute-0 nova_compute[183075]: 2026-01-22 17:10:39.129 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:39.130 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:10:39 compute-0 ovn_controller[95372]: 2026-01-22T17:10:39Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:54:2a:05 10.100.0.13
Jan 22 17:10:39 compute-0 ovn_controller[95372]: 2026-01-22T17:10:39Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:54:2a:05 10.100.0.13
Jan 22 17:10:40 compute-0 nova_compute[183075]: 2026-01-22 17:10:40.109 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:40 compute-0 nova_compute[183075]: 2026-01-22 17:10:40.412 183079 INFO nova.compute.manager [None req-a992ccb0-0893-4715-90f7-45e0362dc461 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Get console output
Jan 22 17:10:40 compute-0 nova_compute[183075]: 2026-01-22 17:10:40.417 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:10:40 compute-0 ovn_controller[95372]: 2026-01-22T17:10:40Z|00163|binding|INFO|Releasing lport 1759254b-798a-4e65-baf5-489557c1f604 from this chassis (sb_readonly=0)
Jan 22 17:10:40 compute-0 ovn_controller[95372]: 2026-01-22T17:10:40Z|00164|binding|INFO|Releasing lport 02f52a63-476f-468b-a774-c9514d6b2206 from this chassis (sb_readonly=0)
Jan 22 17:10:40 compute-0 ovn_controller[95372]: 2026-01-22T17:10:40Z|00165|binding|INFO|Releasing lport 255f865e-6322-48b0-a0d1-c16ced648c78 from this chassis (sb_readonly=0)
Jan 22 17:10:40 compute-0 nova_compute[183075]: 2026-01-22 17:10:40.503 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:40 compute-0 nova_compute[183075]: 2026-01-22 17:10:40.703 183079 INFO nova.compute.manager [None req-b2b053c5-ff02-463a-b2c2-2607b0a06092 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Get console output
Jan 22 17:10:40 compute-0 nova_compute[183075]: 2026-01-22 17:10:40.709 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:10:40 compute-0 nova_compute[183075]: 2026-01-22 17:10:40.779 183079 DEBUG nova.network.neutron [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Successfully updated port: 9b370d66-b3f5-492d-b735-289048caa64f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:10:40 compute-0 nova_compute[183075]: 2026-01-22 17:10:40.799 183079 DEBUG oslo_concurrency.lockutils [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "refresh_cache-2bf3289b-0c4e-4286-80ca-c74eb06a8b96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:10:40 compute-0 nova_compute[183075]: 2026-01-22 17:10:40.800 183079 DEBUG oslo_concurrency.lockutils [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquired lock "refresh_cache-2bf3289b-0c4e-4286-80ca-c74eb06a8b96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:10:40 compute-0 nova_compute[183075]: 2026-01-22 17:10:40.800 183079 DEBUG nova.network.neutron [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:10:40 compute-0 nova_compute[183075]: 2026-01-22 17:10:40.904 183079 DEBUG nova.compute.manager [req-55f56932-f0ea-4317-ba6e-8d0918418f48 req-1832b580-bab4-4753-b282-b84b6fb57536 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Received event network-changed-9b370d66-b3f5-492d-b735-289048caa64f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:10:40 compute-0 nova_compute[183075]: 2026-01-22 17:10:40.905 183079 DEBUG nova.compute.manager [req-55f56932-f0ea-4317-ba6e-8d0918418f48 req-1832b580-bab4-4753-b282-b84b6fb57536 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Refreshing instance network info cache due to event network-changed-9b370d66-b3f5-492d-b735-289048caa64f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:10:40 compute-0 nova_compute[183075]: 2026-01-22 17:10:40.905 183079 DEBUG oslo_concurrency.lockutils [req-55f56932-f0ea-4317-ba6e-8d0918418f48 req-1832b580-bab4-4753-b282-b84b6fb57536 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-2bf3289b-0c4e-4286-80ca-c74eb06a8b96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:10:40 compute-0 nova_compute[183075]: 2026-01-22 17:10:40.984 183079 DEBUG nova.network.neutron [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:10:41 compute-0 nova_compute[183075]: 2026-01-22 17:10:41.324 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:41.925 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:41.926 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:41.927 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.726 183079 DEBUG nova.network.neutron [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Updating instance_info_cache with network_info: [{"id": "9b370d66-b3f5-492d-b735-289048caa64f", "address": "fa:16:3e:76:97:ec", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b370d66-b3", "ovs_interfaceid": "9b370d66-b3f5-492d-b735-289048caa64f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.746 183079 DEBUG oslo_concurrency.lockutils [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Releasing lock "refresh_cache-2bf3289b-0c4e-4286-80ca-c74eb06a8b96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.746 183079 DEBUG nova.compute.manager [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Instance network_info: |[{"id": "9b370d66-b3f5-492d-b735-289048caa64f", "address": "fa:16:3e:76:97:ec", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b370d66-b3", "ovs_interfaceid": "9b370d66-b3f5-492d-b735-289048caa64f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.747 183079 DEBUG oslo_concurrency.lockutils [req-55f56932-f0ea-4317-ba6e-8d0918418f48 req-1832b580-bab4-4753-b282-b84b6fb57536 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-2bf3289b-0c4e-4286-80ca-c74eb06a8b96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.747 183079 DEBUG nova.network.neutron [req-55f56932-f0ea-4317-ba6e-8d0918418f48 req-1832b580-bab4-4753-b282-b84b6fb57536 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Refreshing network info cache for port 9b370d66-b3f5-492d-b735-289048caa64f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.750 183079 DEBUG nova.virt.libvirt.driver [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Start _get_guest_xml network_info=[{"id": "9b370d66-b3f5-492d-b735-289048caa64f", "address": "fa:16:3e:76:97:ec", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b370d66-b3", "ovs_interfaceid": "9b370d66-b3f5-492d-b735-289048caa64f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.754 183079 WARNING nova.virt.libvirt.driver [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.762 183079 DEBUG nova.virt.libvirt.host [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.763 183079 DEBUG nova.virt.libvirt.host [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.769 183079 DEBUG nova.virt.libvirt.host [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.770 183079 DEBUG nova.virt.libvirt.host [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.771 183079 DEBUG nova.virt.libvirt.driver [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.771 183079 DEBUG nova.virt.hardware [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.772 183079 DEBUG nova.virt.hardware [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.772 183079 DEBUG nova.virt.hardware [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.772 183079 DEBUG nova.virt.hardware [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.772 183079 DEBUG nova.virt.hardware [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.772 183079 DEBUG nova.virt.hardware [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.773 183079 DEBUG nova.virt.hardware [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.773 183079 DEBUG nova.virt.hardware [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.773 183079 DEBUG nova.virt.hardware [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.773 183079 DEBUG nova.virt.hardware [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.774 183079 DEBUG nova.virt.hardware [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.777 183079 DEBUG nova.virt.libvirt.vif [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:10:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-368474683',display_name='tempest-server-test-368474683',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-368474683',id=15,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFroJSJbEvCYe2sCwXkwL5KPVfwqqRPNqrWU5nnhv8U4G8wYWbEqk5AtRWXltmX7CAIPBST+sTwCKfymhALKd54XNN92D6KKnKgA6jmOihENah4FGfU80fx1UEE0osLpiw==',key_name='tempest-keypair-test-618317104',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e05c7aae349e4a1d859a387df45650a0',ramdisk_id='',reservation_id='r-lzk0if63',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSeparateNetwork-931877966',owner_user_name='tempest-FloatingIpSeparateNetwork-931877966-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:10:38Z,user_data=None,user_id='cd47d63cff2548a88e21e5c2e6a5c161',uuid=2bf3289b-0c4e-4286-80ca-c74eb06a8b96,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9b370d66-b3f5-492d-b735-289048caa64f", "address": "fa:16:3e:76:97:ec", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b370d66-b3", "ovs_interfaceid": "9b370d66-b3f5-492d-b735-289048caa64f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.777 183079 DEBUG nova.network.os_vif_util [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converting VIF {"id": "9b370d66-b3f5-492d-b735-289048caa64f", "address": "fa:16:3e:76:97:ec", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b370d66-b3", "ovs_interfaceid": "9b370d66-b3f5-492d-b735-289048caa64f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.777 183079 DEBUG nova.network.os_vif_util [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:76:97:ec,bridge_name='br-int',has_traffic_filtering=True,id=9b370d66-b3f5-492d-b735-289048caa64f,network=Network(0a16be1a-262e-47f7-8518-5f24ee15796e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9b370d66-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.778 183079 DEBUG nova.objects.instance [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2bf3289b-0c4e-4286-80ca-c74eb06a8b96 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.791 183079 DEBUG nova.virt.libvirt.driver [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:10:42 compute-0 nova_compute[183075]:   <uuid>2bf3289b-0c4e-4286-80ca-c74eb06a8b96</uuid>
Jan 22 17:10:42 compute-0 nova_compute[183075]:   <name>instance-0000000f</name>
Jan 22 17:10:42 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:10:42 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:10:42 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:10:42 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-368474683</nova:name>
Jan 22 17:10:42 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:10:42</nova:creationTime>
Jan 22 17:10:42 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:10:42 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:10:42 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:10:42 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:10:42 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:10:42 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:10:42 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:10:42 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:10:42 compute-0 nova_compute[183075]:         <nova:user uuid="cd47d63cff2548a88e21e5c2e6a5c161">tempest-FloatingIpSeparateNetwork-931877966-project-member</nova:user>
Jan 22 17:10:42 compute-0 nova_compute[183075]:         <nova:project uuid="e05c7aae349e4a1d859a387df45650a0">tempest-FloatingIpSeparateNetwork-931877966</nova:project>
Jan 22 17:10:42 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:10:42 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:10:42 compute-0 nova_compute[183075]:         <nova:port uuid="9b370d66-b3f5-492d-b735-289048caa64f">
Jan 22 17:10:42 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.30" ipVersion="4"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:10:42 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:10:42 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:10:42 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <system>
Jan 22 17:10:42 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:10:42 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:10:42 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:10:42 compute-0 nova_compute[183075]:       <entry name="serial">2bf3289b-0c4e-4286-80ca-c74eb06a8b96</entry>
Jan 22 17:10:42 compute-0 nova_compute[183075]:       <entry name="uuid">2bf3289b-0c4e-4286-80ca-c74eb06a8b96</entry>
Jan 22 17:10:42 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     </system>
Jan 22 17:10:42 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:10:42 compute-0 nova_compute[183075]:   <os>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:   </os>
Jan 22 17:10:42 compute-0 nova_compute[183075]:   <features>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:   </features>
Jan 22 17:10:42 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:10:42 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:10:42 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:10:42 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/2bf3289b-0c4e-4286-80ca-c74eb06a8b96/disk"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:10:42 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:76:97:ec"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:       <target dev="tap9b370d66-b3"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:10:42 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/2bf3289b-0c4e-4286-80ca-c74eb06a8b96/console.log" append="off"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <video>
Jan 22 17:10:42 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     </video>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:10:42 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:10:42 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:10:42 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:10:42 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:10:42 compute-0 nova_compute[183075]: </domain>
Jan 22 17:10:42 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.792 183079 DEBUG nova.compute.manager [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Preparing to wait for external event network-vif-plugged-9b370d66-b3f5-492d-b735-289048caa64f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.793 183079 DEBUG oslo_concurrency.lockutils [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "2bf3289b-0c4e-4286-80ca-c74eb06a8b96-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.793 183079 DEBUG oslo_concurrency.lockutils [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "2bf3289b-0c4e-4286-80ca-c74eb06a8b96-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.794 183079 DEBUG oslo_concurrency.lockutils [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "2bf3289b-0c4e-4286-80ca-c74eb06a8b96-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.794 183079 DEBUG nova.virt.libvirt.vif [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:10:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-368474683',display_name='tempest-server-test-368474683',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-368474683',id=15,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFroJSJbEvCYe2sCwXkwL5KPVfwqqRPNqrWU5nnhv8U4G8wYWbEqk5AtRWXltmX7CAIPBST+sTwCKfymhALKd54XNN92D6KKnKgA6jmOihENah4FGfU80fx1UEE0osLpiw==',key_name='tempest-keypair-test-618317104',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e05c7aae349e4a1d859a387df45650a0',ramdisk_id='',reservation_id='r-lzk0if63',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_
model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSeparateNetwork-931877966',owner_user_name='tempest-FloatingIpSeparateNetwork-931877966-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:10:38Z,user_data=None,user_id='cd47d63cff2548a88e21e5c2e6a5c161',uuid=2bf3289b-0c4e-4286-80ca-c74eb06a8b96,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9b370d66-b3f5-492d-b735-289048caa64f", "address": "fa:16:3e:76:97:ec", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b370d66-b3", "ovs_interfaceid": "9b370d66-b3f5-492d-b735-289048caa64f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.794 183079 DEBUG nova.network.os_vif_util [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converting VIF {"id": "9b370d66-b3f5-492d-b735-289048caa64f", "address": "fa:16:3e:76:97:ec", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b370d66-b3", "ovs_interfaceid": "9b370d66-b3f5-492d-b735-289048caa64f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.795 183079 DEBUG nova.network.os_vif_util [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:76:97:ec,bridge_name='br-int',has_traffic_filtering=True,id=9b370d66-b3f5-492d-b735-289048caa64f,network=Network(0a16be1a-262e-47f7-8518-5f24ee15796e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9b370d66-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.796 183079 DEBUG os_vif [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:76:97:ec,bridge_name='br-int',has_traffic_filtering=True,id=9b370d66-b3f5-492d-b735-289048caa64f,network=Network(0a16be1a-262e-47f7-8518-5f24ee15796e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9b370d66-b3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.796 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.796 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.797 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.799 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.799 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9b370d66-b3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.800 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9b370d66-b3, col_values=(('external_ids', {'iface-id': '9b370d66-b3f5-492d-b735-289048caa64f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:76:97:ec', 'vm-uuid': '2bf3289b-0c4e-4286-80ca-c74eb06a8b96'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.801 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:42 compute-0 NetworkManager[55454]: <info>  [1769101842.8023] manager: (tap9b370d66-b3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.803 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.808 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.809 183079 INFO os_vif [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:76:97:ec,bridge_name='br-int',has_traffic_filtering=True,id=9b370d66-b3f5-492d-b735-289048caa64f,network=Network(0a16be1a-262e-47f7-8518-5f24ee15796e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9b370d66-b3')
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.864 183079 DEBUG nova.virt.libvirt.driver [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.864 183079 DEBUG nova.virt.libvirt.driver [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] No VIF found with MAC fa:16:3e:76:97:ec, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:10:42 compute-0 kernel: tap9b370d66-b3: entered promiscuous mode
Jan 22 17:10:42 compute-0 NetworkManager[55454]: <info>  [1769101842.9209] manager: (tap9b370d66-b3): new Tun device (/org/freedesktop/NetworkManager/Devices/81)
Jan 22 17:10:42 compute-0 ovn_controller[95372]: 2026-01-22T17:10:42Z|00166|binding|INFO|Claiming lport 9b370d66-b3f5-492d-b735-289048caa64f for this chassis.
Jan 22 17:10:42 compute-0 ovn_controller[95372]: 2026-01-22T17:10:42Z|00167|binding|INFO|9b370d66-b3f5-492d-b735-289048caa64f: Claiming fa:16:3e:76:97:ec 10.100.0.30
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.922 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:42.928 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:76:97:ec 10.100.0.30'], port_security=['fa:16:3e:76:97:ec 10.100.0.30'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.30/28', 'neutron:device_id': '2bf3289b-0c4e-4286-80ca-c74eb06a8b96', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a16be1a-262e-47f7-8518-5f24ee15796e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e05c7aae349e4a1d859a387df45650a0', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b9cc1ac9-85f4-4da7-891e-b0644711e574', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5b6ccb16-1216-4deb-9d72-42005a3163bb, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=9b370d66-b3f5-492d-b735-289048caa64f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:10:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:42.929 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 9b370d66-b3f5-492d-b735-289048caa64f in datapath 0a16be1a-262e-47f7-8518-5f24ee15796e bound to our chassis
Jan 22 17:10:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:42.931 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0a16be1a-262e-47f7-8518-5f24ee15796e
Jan 22 17:10:42 compute-0 ovn_controller[95372]: 2026-01-22T17:10:42Z|00168|binding|INFO|Setting lport 9b370d66-b3f5-492d-b735-289048caa64f ovn-installed in OVS
Jan 22 17:10:42 compute-0 ovn_controller[95372]: 2026-01-22T17:10:42Z|00169|binding|INFO|Setting lport 9b370d66-b3f5-492d-b735-289048caa64f up in Southbound
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.938 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:42 compute-0 nova_compute[183075]: 2026-01-22 17:10:42.940 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:42.943 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b503e7d6-631e-4fd4-84be-7e767478fc3a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:42.944 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0a16be1a-21 in ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:10:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:42.946 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0a16be1a-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:10:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:42.946 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[77d7f4cc-e306-4388-bfc8-34bb95b366f0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:42.948 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8d4f2f8e-f01c-46cd-83da-7e5d3430b631]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:42 compute-0 systemd-udevd[218580]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:10:42 compute-0 systemd-machined[154382]: New machine qemu-15-instance-0000000f.
Jan 22 17:10:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:42.961 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[472c8f9a-c1e0-46cb-be3e-bfb337fac992]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:42 compute-0 NetworkManager[55454]: <info>  [1769101842.9675] device (tap9b370d66-b3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:10:42 compute-0 NetworkManager[55454]: <info>  [1769101842.9682] device (tap9b370d66-b3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:10:42 compute-0 systemd[1]: Started Virtual Machine qemu-15-instance-0000000f.
Jan 22 17:10:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:42.989 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[33716f94-5a7d-40ae-8faa-ff018ecbc2b2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:43.020 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[c0b42e2e-95ee-405b-b08f-9c4c6527d9d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:43.025 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c8b9ed19-bbdd-447e-8170-50aea89ccf03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:43 compute-0 NetworkManager[55454]: <info>  [1769101843.0263] manager: (tap0a16be1a-20): new Veth device (/org/freedesktop/NetworkManager/Devices/82)
Jan 22 17:10:43 compute-0 systemd-udevd[218583]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:43.057 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[fdc96306-91a6-402a-b3cf-252eca177cfb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:43.060 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[af58dc5b-cecd-4050-a661-e87cc331b905]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:43 compute-0 NetworkManager[55454]: <info>  [1769101843.0874] device (tap0a16be1a-20): carrier: link connected
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:43.093 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[b2db4bd8-3ed7-4164-aeff-1a0b28170074]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:43.113 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3f05594e-5a33-4ee8-8f18-4ed645f27ec6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0a16be1a-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ba:16:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 420678, 'reachable_time': 28773, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218612, 'error': None, 'target': 'ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:43.132 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:43.131 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[5fdbeee8-87a1-45ba-b024-e55236bdac2d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feba:16c2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 420678, 'tstamp': 420678}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218613, 'error': None, 'target': 'ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:43.148 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[dada8324-5fbf-46e2-a4f8-2025ab0c7313]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0a16be1a-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ba:16:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 420678, 'reachable_time': 28773, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218614, 'error': None, 'target': 'ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:43.182 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4a43f1fd-ad41-4266-b477-0d9abdf24ba3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:43.237 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[57981d03-d55d-4d32-a091-d03a31933c93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:43.239 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a16be1a-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:43.239 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:43.239 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0a16be1a-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:10:43 compute-0 nova_compute[183075]: 2026-01-22 17:10:43.240 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:43 compute-0 NetworkManager[55454]: <info>  [1769101843.2415] manager: (tap0a16be1a-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/83)
Jan 22 17:10:43 compute-0 kernel: tap0a16be1a-20: entered promiscuous mode
Jan 22 17:10:43 compute-0 nova_compute[183075]: 2026-01-22 17:10:43.243 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:43.244 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0a16be1a-20, col_values=(('external_ids', {'iface-id': 'f5af8e72-5100-4440-84f0-c68eec4b5e5e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:10:43 compute-0 ovn_controller[95372]: 2026-01-22T17:10:43Z|00170|binding|INFO|Releasing lport f5af8e72-5100-4440-84f0-c68eec4b5e5e from this chassis (sb_readonly=0)
Jan 22 17:10:43 compute-0 nova_compute[183075]: 2026-01-22 17:10:43.245 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:43 compute-0 nova_compute[183075]: 2026-01-22 17:10:43.257 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:43.258 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0a16be1a-262e-47f7-8518-5f24ee15796e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0a16be1a-262e-47f7-8518-5f24ee15796e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:43.259 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[32b558a5-bc0e-42de-b31b-40d8dd64087d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:43.260 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/0a16be1a-262e-47f7-8518-5f24ee15796e.pid.haproxy
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 0a16be1a-262e-47f7-8518-5f24ee15796e
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:10:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:43.261 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e', 'env', 'PROCESS_TAG=haproxy-0a16be1a-262e-47f7-8518-5f24ee15796e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0a16be1a-262e-47f7-8518-5f24ee15796e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:10:43 compute-0 podman[218646]: 2026-01-22 17:10:43.647682566 +0000 UTC m=+0.060093468 container create 81f287b3ae086e0074d492f8f8a158bac7404596bd7c579dc4ed4edbe5ec93c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 17:10:43 compute-0 systemd[1]: Started libpod-conmon-81f287b3ae086e0074d492f8f8a158bac7404596bd7c579dc4ed4edbe5ec93c1.scope.
Jan 22 17:10:43 compute-0 podman[218646]: 2026-01-22 17:10:43.615578689 +0000 UTC m=+0.027989611 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:10:43 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:10:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d29937df1ac6d693a9d69c64067ccf3334b0d3988fbec9f9723241c1cfe6246/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:10:43 compute-0 podman[218646]: 2026-01-22 17:10:43.742479777 +0000 UTC m=+0.154890659 container init 81f287b3ae086e0074d492f8f8a158bac7404596bd7c579dc4ed4edbe5ec93c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 17:10:43 compute-0 podman[218646]: 2026-01-22 17:10:43.747674873 +0000 UTC m=+0.160085755 container start 81f287b3ae086e0074d492f8f8a158bac7404596bd7c579dc4ed4edbe5ec93c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:10:43 compute-0 neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e[218662]: [NOTICE]   (218666) : New worker (218668) forked
Jan 22 17:10:43 compute-0 neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e[218662]: [NOTICE]   (218666) : Loading success.
Jan 22 17:10:44 compute-0 nova_compute[183075]: 2026-01-22 17:10:44.007 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101844.0071917, 2bf3289b-0c4e-4286-80ca-c74eb06a8b96 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:10:44 compute-0 nova_compute[183075]: 2026-01-22 17:10:44.008 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] VM Started (Lifecycle Event)
Jan 22 17:10:44 compute-0 nova_compute[183075]: 2026-01-22 17:10:44.027 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:10:44 compute-0 nova_compute[183075]: 2026-01-22 17:10:44.031 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101844.0073674, 2bf3289b-0c4e-4286-80ca-c74eb06a8b96 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:10:44 compute-0 nova_compute[183075]: 2026-01-22 17:10:44.031 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] VM Paused (Lifecycle Event)
Jan 22 17:10:44 compute-0 nova_compute[183075]: 2026-01-22 17:10:44.050 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:10:44 compute-0 nova_compute[183075]: 2026-01-22 17:10:44.055 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:10:44 compute-0 nova_compute[183075]: 2026-01-22 17:10:44.073 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:10:44 compute-0 nova_compute[183075]: 2026-01-22 17:10:44.113 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:44 compute-0 nova_compute[183075]: 2026-01-22 17:10:44.300 183079 DEBUG nova.network.neutron [req-55f56932-f0ea-4317-ba6e-8d0918418f48 req-1832b580-bab4-4753-b282-b84b6fb57536 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Updated VIF entry in instance network info cache for port 9b370d66-b3f5-492d-b735-289048caa64f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:10:44 compute-0 nova_compute[183075]: 2026-01-22 17:10:44.301 183079 DEBUG nova.network.neutron [req-55f56932-f0ea-4317-ba6e-8d0918418f48 req-1832b580-bab4-4753-b282-b84b6fb57536 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Updating instance_info_cache with network_info: [{"id": "9b370d66-b3f5-492d-b735-289048caa64f", "address": "fa:16:3e:76:97:ec", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b370d66-b3", "ovs_interfaceid": "9b370d66-b3f5-492d-b735-289048caa64f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:10:44 compute-0 nova_compute[183075]: 2026-01-22 17:10:44.316 183079 DEBUG oslo_concurrency.lockutils [req-55f56932-f0ea-4317-ba6e-8d0918418f48 req-1832b580-bab4-4753-b282-b84b6fb57536 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-2bf3289b-0c4e-4286-80ca-c74eb06a8b96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:10:45 compute-0 nova_compute[183075]: 2026-01-22 17:10:45.112 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:45.818 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:45.819 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:10:45 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:45 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:45 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:45 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:45 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:45 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:10:45 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ce346f8d-be8d-455f-b61c-12fea213a3f4 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:46 compute-0 nova_compute[183075]: 2026-01-22 17:10:46.070 183079 INFO nova.compute.manager [None req-fce77bde-d3de-419f-8a0b-a11a1329ea8c 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Get console output
Jan 22 17:10:46 compute-0 nova_compute[183075]: 2026-01-22 17:10:46.075 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:10:46 compute-0 nova_compute[183075]: 2026-01-22 17:10:46.103 183079 INFO nova.compute.manager [None req-9ae9a792-8090-491b-a647-783a1995c272 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Get console output
Jan 22 17:10:46 compute-0 nova_compute[183075]: 2026-01-22 17:10:46.108 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.213 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.213 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.3942089
Jan 22 17:10:46 compute-0 haproxy-metadata-proxy-ce346f8d-be8d-455f-b61c-12fea213a3f4[217774]: 10.100.0.13:52154 [22/Jan/2026:17:10:45.817] listener listener/metadata 0/0/0/396/396 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.222 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.223 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ce346f8d-be8d-455f-b61c-12fea213a3f4 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.239 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.239 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 168 time: 0.0157123
Jan 22 17:10:46 compute-0 haproxy-metadata-proxy-ce346f8d-be8d-455f-b61c-12fea213a3f4[217774]: 10.100.0.13:52160 [22/Jan/2026:17:10:46.222] listener listener/metadata 0/0/0/17/17 200 152 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.242 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.243 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ce346f8d-be8d-455f-b61c-12fea213a3f4 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.261 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.261 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0180728
Jan 22 17:10:46 compute-0 haproxy-metadata-proxy-ce346f8d-be8d-455f-b61c-12fea213a3f4[217774]: 10.100.0.13:52176 [22/Jan/2026:17:10:46.242] listener listener/metadata 0/0/0/18/18 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.265 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.265 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ce346f8d-be8d-455f-b61c-12fea213a3f4 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.280 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:46 compute-0 haproxy-metadata-proxy-ce346f8d-be8d-455f-b61c-12fea213a3f4[217774]: 10.100.0.13:52180 [22/Jan/2026:17:10:46.265] listener listener/metadata 0/0/0/15/15 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.280 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0145712
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.284 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.284 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ce346f8d-be8d-455f-b61c-12fea213a3f4 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.297 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:46 compute-0 haproxy-metadata-proxy-ce346f8d-be8d-455f-b61c-12fea213a3f4[217774]: 10.100.0.13:52196 [22/Jan/2026:17:10:46.284] listener listener/metadata 0/0/0/14/14 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.298 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0134103
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.301 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.302 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ce346f8d-be8d-455f-b61c-12fea213a3f4 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.314 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:46 compute-0 haproxy-metadata-proxy-ce346f8d-be8d-455f-b61c-12fea213a3f4[217774]: 10.100.0.13:52212 [22/Jan/2026:17:10:46.301] listener listener/metadata 0/0/0/13/13 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.315 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0132797
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.319 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.319 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ce346f8d-be8d-455f-b61c-12fea213a3f4 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.333 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.334 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0145383
Jan 22 17:10:46 compute-0 haproxy-metadata-proxy-ce346f8d-be8d-455f-b61c-12fea213a3f4[217774]: 10.100.0.13:52228 [22/Jan/2026:17:10:46.318] listener listener/metadata 0/0/0/15/15 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.337 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.338 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ce346f8d-be8d-455f-b61c-12fea213a3f4 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.363 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:46 compute-0 haproxy-metadata-proxy-ce346f8d-be8d-455f-b61c-12fea213a3f4[217774]: 10.100.0.13:52236 [22/Jan/2026:17:10:46.337] listener listener/metadata 0/0/0/26/26 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.364 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0259104
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.367 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.368 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ce346f8d-be8d-455f-b61c-12fea213a3f4 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.380 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:46 compute-0 haproxy-metadata-proxy-ce346f8d-be8d-455f-b61c-12fea213a3f4[217774]: 10.100.0.13:52244 [22/Jan/2026:17:10:46.367] listener listener/metadata 0/0/0/13/13 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.381 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0131547
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.385 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.386 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ce346f8d-be8d-455f-b61c-12fea213a3f4 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.400 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.400 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0146046
Jan 22 17:10:46 compute-0 haproxy-metadata-proxy-ce346f8d-be8d-455f-b61c-12fea213a3f4[217774]: 10.100.0.13:52254 [22/Jan/2026:17:10:46.385] listener listener/metadata 0/0/0/15/15 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.404 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.405 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ce346f8d-be8d-455f-b61c-12fea213a3f4 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.417 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0121553
Jan 22 17:10:46 compute-0 haproxy-metadata-proxy-ce346f8d-be8d-455f-b61c-12fea213a3f4[217774]: 10.100.0.13:52256 [22/Jan/2026:17:10:46.404] listener listener/metadata 0/0/0/13/13 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.428 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.428 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ce346f8d-be8d-455f-b61c-12fea213a3f4 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.442 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:46 compute-0 haproxy-metadata-proxy-ce346f8d-be8d-455f-b61c-12fea213a3f4[217774]: 10.100.0.13:52270 [22/Jan/2026:17:10:46.428] listener listener/metadata 0/0/0/14/14 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.442 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0138600
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.446 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.446 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ce346f8d-be8d-455f-b61c-12fea213a3f4 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.461 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.461 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0145490
Jan 22 17:10:46 compute-0 haproxy-metadata-proxy-ce346f8d-be8d-455f-b61c-12fea213a3f4[217774]: 10.100.0.13:52286 [22/Jan/2026:17:10:46.446] listener listener/metadata 0/0/0/15/15 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.464 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.465 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ce346f8d-be8d-455f-b61c-12fea213a3f4 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.480 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.480 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0152121
Jan 22 17:10:46 compute-0 haproxy-metadata-proxy-ce346f8d-be8d-455f-b61c-12fea213a3f4[217774]: 10.100.0.13:52302 [22/Jan/2026:17:10:46.464] listener listener/metadata 0/0/0/15/15 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.485 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.485 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ce346f8d-be8d-455f-b61c-12fea213a3f4 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.499 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.500 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0144353
Jan 22 17:10:46 compute-0 haproxy-metadata-proxy-ce346f8d-be8d-455f-b61c-12fea213a3f4[217774]: 10.100.0.13:52318 [22/Jan/2026:17:10:46.484] listener listener/metadata 0/0/0/15/15 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.504 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.504 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ce346f8d-be8d-455f-b61c-12fea213a3f4 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.523 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:10:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:46.524 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0195765
Jan 22 17:10:46 compute-0 haproxy-metadata-proxy-ce346f8d-be8d-455f-b61c-12fea213a3f4[217774]: 10.100.0.13:52326 [22/Jan/2026:17:10:46.503] listener listener/metadata 0/0/0/20/20 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:10:46 compute-0 nova_compute[183075]: 2026-01-22 17:10:46.715 183079 INFO nova.compute.manager [None req-0cb85d07-0c1c-4f4d-bfad-3d323f1d3a6c 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Get console output
Jan 22 17:10:46 compute-0 nova_compute[183075]: 2026-01-22 17:10:46.722 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:10:46 compute-0 nova_compute[183075]: 2026-01-22 17:10:46.966 183079 DEBUG oslo_concurrency.lockutils [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "36a2dc63-6945-45c9-8e82-9d3aacdfc3bc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:46 compute-0 nova_compute[183075]: 2026-01-22 17:10:46.967 183079 DEBUG oslo_concurrency.lockutils [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "36a2dc63-6945-45c9-8e82-9d3aacdfc3bc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:46 compute-0 nova_compute[183075]: 2026-01-22 17:10:46.988 183079 DEBUG nova.compute.manager [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.071 183079 DEBUG oslo_concurrency.lockutils [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.072 183079 DEBUG oslo_concurrency.lockutils [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.078 183079 DEBUG nova.virt.hardware [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.079 183079 INFO nova.compute.claims [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.297 183079 DEBUG nova.compute.provider_tree [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.311 183079 DEBUG nova.scheduler.client.report [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.331 183079 DEBUG oslo_concurrency.lockutils [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.259s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.332 183079 DEBUG nova.compute.manager [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.380 183079 DEBUG nova.compute.manager [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.380 183079 DEBUG nova.network.neutron [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.399 183079 INFO nova.virt.libvirt.driver [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.415 183079 DEBUG nova.compute.manager [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.490 183079 DEBUG nova.compute.manager [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.491 183079 DEBUG nova.virt.libvirt.driver [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.491 183079 INFO nova.virt.libvirt.driver [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Creating image(s)
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.492 183079 DEBUG oslo_concurrency.lockutils [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "/var/lib/nova/instances/36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.492 183079 DEBUG oslo_concurrency.lockutils [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "/var/lib/nova/instances/36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.493 183079 DEBUG oslo_concurrency.lockutils [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "/var/lib/nova/instances/36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.504 183079 DEBUG oslo_concurrency.processutils [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.576 183079 DEBUG oslo_concurrency.processutils [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.578 183079 DEBUG oslo_concurrency.lockutils [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.579 183079 DEBUG oslo_concurrency.lockutils [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.604 183079 DEBUG oslo_concurrency.processutils [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.680 183079 DEBUG oslo_concurrency.processutils [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.682 183079 DEBUG oslo_concurrency.processutils [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.722 183079 DEBUG oslo_concurrency.processutils [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.724 183079 DEBUG oslo_concurrency.lockutils [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.725 183079 DEBUG oslo_concurrency.processutils [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.781 183079 DEBUG oslo_concurrency.processutils [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.783 183079 DEBUG nova.virt.disk.api [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Checking if we can resize image /var/lib/nova/instances/36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.784 183079 DEBUG oslo_concurrency.processutils [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.810 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.877 183079 DEBUG oslo_concurrency.processutils [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.878 183079 DEBUG nova.virt.disk.api [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Cannot resize image /var/lib/nova/instances/36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.879 183079 DEBUG nova.objects.instance [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lazy-loading 'migration_context' on Instance uuid 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.905 183079 DEBUG nova.virt.libvirt.driver [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.906 183079 DEBUG nova.virt.libvirt.driver [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Ensure instance console log exists: /var/lib/nova/instances/36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.907 183079 DEBUG oslo_concurrency.lockutils [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.908 183079 DEBUG oslo_concurrency.lockutils [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:47 compute-0 nova_compute[183075]: 2026-01-22 17:10:47.908 183079 DEBUG oslo_concurrency.lockutils [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:48 compute-0 podman[218699]: 2026-01-22 17:10:48.372124 +0000 UTC m=+0.075726936 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:10:48 compute-0 nova_compute[183075]: 2026-01-22 17:10:48.956 183079 DEBUG nova.policy [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:10:49 compute-0 nova_compute[183075]: 2026-01-22 17:10:49.164 183079 DEBUG nova.compute.manager [req-4351c100-686a-4b5a-81af-d59621b518a8 req-68e287ef-c714-4a11-a735-1401315598c0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Received event network-vif-plugged-9b370d66-b3f5-492d-b735-289048caa64f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:10:49 compute-0 nova_compute[183075]: 2026-01-22 17:10:49.164 183079 DEBUG oslo_concurrency.lockutils [req-4351c100-686a-4b5a-81af-d59621b518a8 req-68e287ef-c714-4a11-a735-1401315598c0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "2bf3289b-0c4e-4286-80ca-c74eb06a8b96-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:49 compute-0 nova_compute[183075]: 2026-01-22 17:10:49.164 183079 DEBUG oslo_concurrency.lockutils [req-4351c100-686a-4b5a-81af-d59621b518a8 req-68e287ef-c714-4a11-a735-1401315598c0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "2bf3289b-0c4e-4286-80ca-c74eb06a8b96-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:49 compute-0 nova_compute[183075]: 2026-01-22 17:10:49.165 183079 DEBUG oslo_concurrency.lockutils [req-4351c100-686a-4b5a-81af-d59621b518a8 req-68e287ef-c714-4a11-a735-1401315598c0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "2bf3289b-0c4e-4286-80ca-c74eb06a8b96-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:49 compute-0 nova_compute[183075]: 2026-01-22 17:10:49.165 183079 DEBUG nova.compute.manager [req-4351c100-686a-4b5a-81af-d59621b518a8 req-68e287ef-c714-4a11-a735-1401315598c0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Processing event network-vif-plugged-9b370d66-b3f5-492d-b735-289048caa64f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:10:49 compute-0 nova_compute[183075]: 2026-01-22 17:10:49.165 183079 DEBUG nova.compute.manager [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Instance event wait completed in 5 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:10:49 compute-0 nova_compute[183075]: 2026-01-22 17:10:49.169 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101849.1683671, 2bf3289b-0c4e-4286-80ca-c74eb06a8b96 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:10:49 compute-0 nova_compute[183075]: 2026-01-22 17:10:49.169 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] VM Resumed (Lifecycle Event)
Jan 22 17:10:49 compute-0 nova_compute[183075]: 2026-01-22 17:10:49.171 183079 DEBUG nova.virt.libvirt.driver [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:10:49 compute-0 nova_compute[183075]: 2026-01-22 17:10:49.174 183079 INFO nova.virt.libvirt.driver [-] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Instance spawned successfully.
Jan 22 17:10:49 compute-0 nova_compute[183075]: 2026-01-22 17:10:49.174 183079 DEBUG nova.virt.libvirt.driver [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:10:49 compute-0 nova_compute[183075]: 2026-01-22 17:10:49.190 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:10:49 compute-0 nova_compute[183075]: 2026-01-22 17:10:49.196 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:10:49 compute-0 nova_compute[183075]: 2026-01-22 17:10:49.199 183079 DEBUG nova.virt.libvirt.driver [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:10:49 compute-0 nova_compute[183075]: 2026-01-22 17:10:49.199 183079 DEBUG nova.virt.libvirt.driver [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:10:49 compute-0 nova_compute[183075]: 2026-01-22 17:10:49.200 183079 DEBUG nova.virt.libvirt.driver [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:10:49 compute-0 nova_compute[183075]: 2026-01-22 17:10:49.200 183079 DEBUG nova.virt.libvirt.driver [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:10:49 compute-0 nova_compute[183075]: 2026-01-22 17:10:49.201 183079 DEBUG nova.virt.libvirt.driver [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:10:49 compute-0 nova_compute[183075]: 2026-01-22 17:10:49.201 183079 DEBUG nova.virt.libvirt.driver [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:10:49 compute-0 nova_compute[183075]: 2026-01-22 17:10:49.226 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:10:49 compute-0 nova_compute[183075]: 2026-01-22 17:10:49.263 183079 INFO nova.compute.manager [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Took 10.89 seconds to spawn the instance on the hypervisor.
Jan 22 17:10:49 compute-0 nova_compute[183075]: 2026-01-22 17:10:49.263 183079 DEBUG nova.compute.manager [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:10:49 compute-0 nova_compute[183075]: 2026-01-22 17:10:49.338 183079 INFO nova.compute.manager [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Took 11.46 seconds to build instance.
Jan 22 17:10:49 compute-0 nova_compute[183075]: 2026-01-22 17:10:49.358 183079 DEBUG oslo_concurrency.lockutils [None req-6624bf61-d61c-47be-9154-246603ab48b4 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "2bf3289b-0c4e-4286-80ca-c74eb06a8b96" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.534s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:50 compute-0 nova_compute[183075]: 2026-01-22 17:10:50.115 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:50 compute-0 nova_compute[183075]: 2026-01-22 17:10:50.904 183079 INFO nova.compute.manager [None req-4b96855d-d279-4654-93cc-74605c1a5168 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Get console output
Jan 22 17:10:50 compute-0 nova_compute[183075]: 2026-01-22 17:10:50.966 183079 DEBUG nova.network.neutron [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Successfully updated port: c26b2385-71db-477e-888c-d10712732db6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:10:50 compute-0 nova_compute[183075]: 2026-01-22 17:10:50.982 183079 DEBUG nova.compute.manager [req-000de649-84ea-4ce0-9c6e-b5178ef020b4 req-1b76361a-9eb4-4677-a72a-9b1a47bcb188 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Received event network-changed-096b36b4-87c4-423a-a3ef-3c47a75704f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:10:50 compute-0 nova_compute[183075]: 2026-01-22 17:10:50.982 183079 DEBUG nova.compute.manager [req-000de649-84ea-4ce0-9c6e-b5178ef020b4 req-1b76361a-9eb4-4677-a72a-9b1a47bcb188 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Refreshing instance network info cache due to event network-changed-096b36b4-87c4-423a-a3ef-3c47a75704f7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:10:50 compute-0 nova_compute[183075]: 2026-01-22 17:10:50.982 183079 DEBUG oslo_concurrency.lockutils [req-000de649-84ea-4ce0-9c6e-b5178ef020b4 req-1b76361a-9eb4-4677-a72a-9b1a47bcb188 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-e4683d56-25f3-42a9-aedd-1b076e9a5245" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:10:50 compute-0 nova_compute[183075]: 2026-01-22 17:10:50.983 183079 DEBUG oslo_concurrency.lockutils [req-000de649-84ea-4ce0-9c6e-b5178ef020b4 req-1b76361a-9eb4-4677-a72a-9b1a47bcb188 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-e4683d56-25f3-42a9-aedd-1b076e9a5245" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:10:50 compute-0 nova_compute[183075]: 2026-01-22 17:10:50.983 183079 DEBUG nova.network.neutron [req-000de649-84ea-4ce0-9c6e-b5178ef020b4 req-1b76361a-9eb4-4677-a72a-9b1a47bcb188 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Refreshing network info cache for port 096b36b4-87c4-423a-a3ef-3c47a75704f7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:10:50 compute-0 nova_compute[183075]: 2026-01-22 17:10:50.986 183079 DEBUG oslo_concurrency.lockutils [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "refresh_cache-36a2dc63-6945-45c9-8e82-9d3aacdfc3bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:10:50 compute-0 nova_compute[183075]: 2026-01-22 17:10:50.986 183079 DEBUG oslo_concurrency.lockutils [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquired lock "refresh_cache-36a2dc63-6945-45c9-8e82-9d3aacdfc3bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:10:50 compute-0 nova_compute[183075]: 2026-01-22 17:10:50.986 183079 DEBUG nova.network.neutron [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:10:51 compute-0 nova_compute[183075]: 2026-01-22 17:10:51.017 183079 DEBUG oslo_concurrency.lockutils [None req-23d6ed77-9738-493b-bdfe-d9e80291c215 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "e4683d56-25f3-42a9-aedd-1b076e9a5245" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:51 compute-0 nova_compute[183075]: 2026-01-22 17:10:51.017 183079 DEBUG oslo_concurrency.lockutils [None req-23d6ed77-9738-493b-bdfe-d9e80291c215 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "e4683d56-25f3-42a9-aedd-1b076e9a5245" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:51 compute-0 nova_compute[183075]: 2026-01-22 17:10:51.017 183079 DEBUG oslo_concurrency.lockutils [None req-23d6ed77-9738-493b-bdfe-d9e80291c215 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "e4683d56-25f3-42a9-aedd-1b076e9a5245-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:51 compute-0 nova_compute[183075]: 2026-01-22 17:10:51.018 183079 DEBUG oslo_concurrency.lockutils [None req-23d6ed77-9738-493b-bdfe-d9e80291c215 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "e4683d56-25f3-42a9-aedd-1b076e9a5245-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:51 compute-0 nova_compute[183075]: 2026-01-22 17:10:51.018 183079 DEBUG oslo_concurrency.lockutils [None req-23d6ed77-9738-493b-bdfe-d9e80291c215 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "e4683d56-25f3-42a9-aedd-1b076e9a5245-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:51 compute-0 nova_compute[183075]: 2026-01-22 17:10:51.019 183079 INFO nova.compute.manager [None req-23d6ed77-9738-493b-bdfe-d9e80291c215 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Terminating instance
Jan 22 17:10:51 compute-0 nova_compute[183075]: 2026-01-22 17:10:51.020 183079 DEBUG nova.compute.manager [None req-23d6ed77-9738-493b-bdfe-d9e80291c215 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:10:51 compute-0 kernel: tap096b36b4-87 (unregistering): left promiscuous mode
Jan 22 17:10:51 compute-0 NetworkManager[55454]: <info>  [1769101851.0443] device (tap096b36b4-87): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:10:51 compute-0 nova_compute[183075]: 2026-01-22 17:10:51.059 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:51 compute-0 ovn_controller[95372]: 2026-01-22T17:10:51Z|00171|binding|INFO|Releasing lport 096b36b4-87c4-423a-a3ef-3c47a75704f7 from this chassis (sb_readonly=0)
Jan 22 17:10:51 compute-0 ovn_controller[95372]: 2026-01-22T17:10:51Z|00172|binding|INFO|Setting lport 096b36b4-87c4-423a-a3ef-3c47a75704f7 down in Southbound
Jan 22 17:10:51 compute-0 ovn_controller[95372]: 2026-01-22T17:10:51Z|00173|binding|INFO|Removing iface tap096b36b4-87 ovn-installed in OVS
Jan 22 17:10:51 compute-0 nova_compute[183075]: 2026-01-22 17:10:51.067 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:51.068 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:53:65 10.100.0.14'], port_security=['fa:16:3e:f6:53:65 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e4683d56-25f3-42a9-aedd-1b076e9a5245', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bfc6667804934c92b71ce7638089e9e3', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'd9af03c0-27db-4d08-b124-ee395583cdd3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.225', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fd725c57-a5bb-4dca-9677-d74d2fa01c15, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=096b36b4-87c4-423a-a3ef-3c47a75704f7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:10:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:51.069 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 096b36b4-87c4-423a-a3ef-3c47a75704f7 in datapath 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c unbound from our chassis
Jan 22 17:10:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:51.071 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:10:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:51.072 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3357ad50-42f5-42c9-bd45-13f0e9eb7514]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:51.072 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c namespace which is not needed anymore
Jan 22 17:10:51 compute-0 nova_compute[183075]: 2026-01-22 17:10:51.087 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:51 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Jan 22 17:10:51 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000c.scope: Consumed 15.355s CPU time.
Jan 22 17:10:51 compute-0 systemd-machined[154382]: Machine qemu-12-instance-0000000c terminated.
Jan 22 17:10:51 compute-0 nova_compute[183075]: 2026-01-22 17:10:51.144 183079 DEBUG nova.network.neutron [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:10:51 compute-0 neutron-haproxy-ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[217842]: [NOTICE]   (217846) : haproxy version is 2.8.14-c23fe91
Jan 22 17:10:51 compute-0 neutron-haproxy-ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[217842]: [NOTICE]   (217846) : path to executable is /usr/sbin/haproxy
Jan 22 17:10:51 compute-0 neutron-haproxy-ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[217842]: [WARNING]  (217846) : Exiting Master process...
Jan 22 17:10:51 compute-0 neutron-haproxy-ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[217842]: [WARNING]  (217846) : Exiting Master process...
Jan 22 17:10:51 compute-0 neutron-haproxy-ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[217842]: [ALERT]    (217846) : Current worker (217848) exited with code 143 (Terminated)
Jan 22 17:10:51 compute-0 neutron-haproxy-ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[217842]: [WARNING]  (217846) : All workers exited. Exiting... (0)
Jan 22 17:10:51 compute-0 systemd[1]: libpod-d08ace27ecb69db472cf1c88d34cd385ad0ccc6297f06e0652738cdd93aacc1d.scope: Deactivated successfully.
Jan 22 17:10:51 compute-0 podman[218750]: 2026-01-22 17:10:51.211540527 +0000 UTC m=+0.045311212 container died d08ace27ecb69db472cf1c88d34cd385ad0ccc6297f06e0652738cdd93aacc1d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:10:51 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d08ace27ecb69db472cf1c88d34cd385ad0ccc6297f06e0652738cdd93aacc1d-userdata-shm.mount: Deactivated successfully.
Jan 22 17:10:51 compute-0 systemd[1]: var-lib-containers-storage-overlay-0707cc5d81be2218fac9e3ab46bc067375d6e998cfcdfb09daec75defb5bce1f-merged.mount: Deactivated successfully.
Jan 22 17:10:51 compute-0 nova_compute[183075]: 2026-01-22 17:10:51.295 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:51 compute-0 podman[218750]: 2026-01-22 17:10:51.302419057 +0000 UTC m=+0.136189742 container cleanup d08ace27ecb69db472cf1c88d34cd385ad0ccc6297f06e0652738cdd93aacc1d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:10:51 compute-0 systemd[1]: libpod-conmon-d08ace27ecb69db472cf1c88d34cd385ad0ccc6297f06e0652738cdd93aacc1d.scope: Deactivated successfully.
Jan 22 17:10:51 compute-0 nova_compute[183075]: 2026-01-22 17:10:51.325 183079 INFO nova.virt.libvirt.driver [-] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Instance destroyed successfully.
Jan 22 17:10:51 compute-0 nova_compute[183075]: 2026-01-22 17:10:51.326 183079 DEBUG nova.objects.instance [None req-23d6ed77-9738-493b-bdfe-d9e80291c215 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lazy-loading 'resources' on Instance uuid e4683d56-25f3-42a9-aedd-1b076e9a5245 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:10:51 compute-0 nova_compute[183075]: 2026-01-22 17:10:51.340 183079 DEBUG nova.virt.libvirt.vif [None req-23d6ed77-9738-493b-bdfe-d9e80291c215 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:09:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-590988812',display_name='tempest-server-test-590988812',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-590988812',id=12,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPrkA/Z5RmtQU4jmjRMv9OOPvEkTJSvzTw8ebk65GzPrHqEHbv+wizg7XUt+WWaoThVx02ADkoi97wsj98MvMQXzRu+T8wQKRmnd1AKmVJARy0gGVc4wBfQufwEt526HBw==',key_name='tempest-keypair-test-1746127176',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:09:48Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bfc6667804934c92b71ce7638089e9e3',ramdisk_id='',reservation_id='r-n6f17aqe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-QoSTest-2146064006',owner_user_name='tempest-QoSTest-2146064006-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:09:48Z,user_data=None,user_id='1e61127d65144bcbaa0d43fe3eb484c0',uuid=e4683d56-25f3-42a9-aedd-1b076e9a5245,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "address": "fa:16:3e:f6:53:65", "network": {"id": "9c1e909c-8e03-49be-b02d-6bf4a2cedc0c", "bridge": "br-int", "label": "tempest-test-network--1863582698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap096b36b4-87", "ovs_interfaceid": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:10:51 compute-0 nova_compute[183075]: 2026-01-22 17:10:51.340 183079 DEBUG nova.network.os_vif_util [None req-23d6ed77-9738-493b-bdfe-d9e80291c215 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Converting VIF {"id": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "address": "fa:16:3e:f6:53:65", "network": {"id": "9c1e909c-8e03-49be-b02d-6bf4a2cedc0c", "bridge": "br-int", "label": "tempest-test-network--1863582698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap096b36b4-87", "ovs_interfaceid": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:10:51 compute-0 nova_compute[183075]: 2026-01-22 17:10:51.341 183079 DEBUG nova.network.os_vif_util [None req-23d6ed77-9738-493b-bdfe-d9e80291c215 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:53:65,bridge_name='br-int',has_traffic_filtering=True,id=096b36b4-87c4-423a-a3ef-3c47a75704f7,network=Network(9c1e909c-8e03-49be-b02d-6bf4a2cedc0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap096b36b4-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:10:51 compute-0 nova_compute[183075]: 2026-01-22 17:10:51.341 183079 DEBUG os_vif [None req-23d6ed77-9738-493b-bdfe-d9e80291c215 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:53:65,bridge_name='br-int',has_traffic_filtering=True,id=096b36b4-87c4-423a-a3ef-3c47a75704f7,network=Network(9c1e909c-8e03-49be-b02d-6bf4a2cedc0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap096b36b4-87') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:10:51 compute-0 nova_compute[183075]: 2026-01-22 17:10:51.343 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:51 compute-0 nova_compute[183075]: 2026-01-22 17:10:51.343 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap096b36b4-87, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:10:51 compute-0 nova_compute[183075]: 2026-01-22 17:10:51.344 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:51 compute-0 nova_compute[183075]: 2026-01-22 17:10:51.346 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:10:51 compute-0 nova_compute[183075]: 2026-01-22 17:10:51.348 183079 INFO os_vif [None req-23d6ed77-9738-493b-bdfe-d9e80291c215 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:53:65,bridge_name='br-int',has_traffic_filtering=True,id=096b36b4-87c4-423a-a3ef-3c47a75704f7,network=Network(9c1e909c-8e03-49be-b02d-6bf4a2cedc0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap096b36b4-87')
Jan 22 17:10:51 compute-0 nova_compute[183075]: 2026-01-22 17:10:51.348 183079 INFO nova.virt.libvirt.driver [None req-23d6ed77-9738-493b-bdfe-d9e80291c215 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Deleting instance files /var/lib/nova/instances/e4683d56-25f3-42a9-aedd-1b076e9a5245_del
Jan 22 17:10:51 compute-0 nova_compute[183075]: 2026-01-22 17:10:51.349 183079 INFO nova.virt.libvirt.driver [None req-23d6ed77-9738-493b-bdfe-d9e80291c215 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Deletion of /var/lib/nova/instances/e4683d56-25f3-42a9-aedd-1b076e9a5245_del complete
Jan 22 17:10:51 compute-0 podman[218792]: 2026-01-22 17:10:51.380121233 +0000 UTC m=+0.052286154 container remove d08ace27ecb69db472cf1c88d34cd385ad0ccc6297f06e0652738cdd93aacc1d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 22 17:10:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:51.387 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f1c79522-2f99-40d9-beff-1da2406027fc]: (4, ('Thu Jan 22 05:10:51 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c (d08ace27ecb69db472cf1c88d34cd385ad0ccc6297f06e0652738cdd93aacc1d)\nd08ace27ecb69db472cf1c88d34cd385ad0ccc6297f06e0652738cdd93aacc1d\nThu Jan 22 05:10:51 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c (d08ace27ecb69db472cf1c88d34cd385ad0ccc6297f06e0652738cdd93aacc1d)\nd08ace27ecb69db472cf1c88d34cd385ad0ccc6297f06e0652738cdd93aacc1d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:51.390 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[df2bcfc7-d56d-44f2-b091-a798861a9230]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:51.392 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9c1e909c-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:10:51 compute-0 nova_compute[183075]: 2026-01-22 17:10:51.394 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:51 compute-0 kernel: tap9c1e909c-80: left promiscuous mode
Jan 22 17:10:51 compute-0 nova_compute[183075]: 2026-01-22 17:10:51.398 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:51 compute-0 nova_compute[183075]: 2026-01-22 17:10:51.399 183079 INFO nova.compute.manager [None req-23d6ed77-9738-493b-bdfe-d9e80291c215 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Took 0.38 seconds to destroy the instance on the hypervisor.
Jan 22 17:10:51 compute-0 nova_compute[183075]: 2026-01-22 17:10:51.399 183079 DEBUG oslo.service.loopingcall [None req-23d6ed77-9738-493b-bdfe-d9e80291c215 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:10:51 compute-0 nova_compute[183075]: 2026-01-22 17:10:51.399 183079 DEBUG nova.compute.manager [-] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:10:51 compute-0 nova_compute[183075]: 2026-01-22 17:10:51.399 183079 DEBUG nova.network.neutron [-] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:10:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:51.406 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ddf2758f-84bb-4f41-98cc-60294483b452]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:51 compute-0 nova_compute[183075]: 2026-01-22 17:10:51.414 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:51.426 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[cc05c521-e9b4-40c4-ae34-3c29bb0e0d2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:51.427 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c017b195-28db-47c4-be4e-7b10478c1514]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:51.440 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1a41396c-9fc3-4e70-9ec2-d029d37c742f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415095, 'reachable_time': 24013, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218807, 'error': None, 'target': 'ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:51 compute-0 systemd[1]: run-netns-ovnmeta\x2d9c1e909c\x2d8e03\x2d49be\x2db02d\x2d6bf4a2cedc0c.mount: Deactivated successfully.
Jan 22 17:10:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:51.445 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:10:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:51.445 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[d0adae27-cddf-4c77-a4c2-0c6b985ed62d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:52 compute-0 nova_compute[183075]: 2026-01-22 17:10:52.680 183079 INFO nova.compute.manager [None req-d1122049-c666-49e4-9bd6-a3ac1365fd6e 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Get console output
Jan 22 17:10:52 compute-0 nova_compute[183075]: 2026-01-22 17:10:52.686 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:10:52 compute-0 nova_compute[183075]: 2026-01-22 17:10:52.711 183079 DEBUG nova.compute.manager [req-012cfcbf-9e27-48bf-9c2a-bb3279470dc3 req-2dbe9371-5636-4375-acf9-b4f05229b3ba a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Received event network-vif-plugged-9b370d66-b3f5-492d-b735-289048caa64f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:10:52 compute-0 nova_compute[183075]: 2026-01-22 17:10:52.712 183079 DEBUG oslo_concurrency.lockutils [req-012cfcbf-9e27-48bf-9c2a-bb3279470dc3 req-2dbe9371-5636-4375-acf9-b4f05229b3ba a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "2bf3289b-0c4e-4286-80ca-c74eb06a8b96-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:52 compute-0 nova_compute[183075]: 2026-01-22 17:10:52.713 183079 DEBUG oslo_concurrency.lockutils [req-012cfcbf-9e27-48bf-9c2a-bb3279470dc3 req-2dbe9371-5636-4375-acf9-b4f05229b3ba a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "2bf3289b-0c4e-4286-80ca-c74eb06a8b96-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:52 compute-0 nova_compute[183075]: 2026-01-22 17:10:52.713 183079 DEBUG oslo_concurrency.lockutils [req-012cfcbf-9e27-48bf-9c2a-bb3279470dc3 req-2dbe9371-5636-4375-acf9-b4f05229b3ba a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "2bf3289b-0c4e-4286-80ca-c74eb06a8b96-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:52 compute-0 nova_compute[183075]: 2026-01-22 17:10:52.714 183079 DEBUG nova.compute.manager [req-012cfcbf-9e27-48bf-9c2a-bb3279470dc3 req-2dbe9371-5636-4375-acf9-b4f05229b3ba a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] No waiting events found dispatching network-vif-plugged-9b370d66-b3f5-492d-b735-289048caa64f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:10:52 compute-0 nova_compute[183075]: 2026-01-22 17:10:52.714 183079 WARNING nova.compute.manager [req-012cfcbf-9e27-48bf-9c2a-bb3279470dc3 req-2dbe9371-5636-4375-acf9-b4f05229b3ba a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Received unexpected event network-vif-plugged-9b370d66-b3f5-492d-b735-289048caa64f for instance with vm_state active and task_state None.
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.054 183079 DEBUG nova.network.neutron [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Updating instance_info_cache with network_info: [{"id": "c26b2385-71db-477e-888c-d10712732db6", "address": "fa:16:3e:3c:de:8a", "network": {"id": "ea1cd914-64be-4fd0-b944-45368957fb5b", "bridge": "br-int", "label": "tempest-test-network--945581638", "subnets": [{"cidr": "10.10.1.0/24", "dns": [], "gateway": {"address": "10.10.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.1.232", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc26b2385-71", "ovs_interfaceid": "c26b2385-71db-477e-888c-d10712732db6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.058 183079 DEBUG nova.network.neutron [req-000de649-84ea-4ce0-9c6e-b5178ef020b4 req-1b76361a-9eb4-4677-a72a-9b1a47bcb188 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Updated VIF entry in instance network info cache for port 096b36b4-87c4-423a-a3ef-3c47a75704f7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.059 183079 DEBUG nova.network.neutron [req-000de649-84ea-4ce0-9c6e-b5178ef020b4 req-1b76361a-9eb4-4677-a72a-9b1a47bcb188 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Updating instance_info_cache with network_info: [{"id": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "address": "fa:16:3e:f6:53:65", "network": {"id": "9c1e909c-8e03-49be-b02d-6bf4a2cedc0c", "bridge": "br-int", "label": "tempest-test-network--1863582698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap096b36b4-87", "ovs_interfaceid": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.060 183079 DEBUG nova.compute.manager [req-6f9c5517-457f-4f7d-a8c7-e9d4b7358f2c req-f43b1b75-1350-4af8-acb1-b8347f48d835 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Received event network-changed-c26b2385-71db-477e-888c-d10712732db6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.061 183079 DEBUG nova.compute.manager [req-6f9c5517-457f-4f7d-a8c7-e9d4b7358f2c req-f43b1b75-1350-4af8-acb1-b8347f48d835 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Refreshing instance network info cache due to event network-changed-c26b2385-71db-477e-888c-d10712732db6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.061 183079 DEBUG oslo_concurrency.lockutils [req-6f9c5517-457f-4f7d-a8c7-e9d4b7358f2c req-f43b1b75-1350-4af8-acb1-b8347f48d835 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-36a2dc63-6945-45c9-8e82-9d3aacdfc3bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.074 183079 DEBUG oslo_concurrency.lockutils [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Releasing lock "refresh_cache-36a2dc63-6945-45c9-8e82-9d3aacdfc3bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.074 183079 DEBUG nova.compute.manager [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Instance network_info: |[{"id": "c26b2385-71db-477e-888c-d10712732db6", "address": "fa:16:3e:3c:de:8a", "network": {"id": "ea1cd914-64be-4fd0-b944-45368957fb5b", "bridge": "br-int", "label": "tempest-test-network--945581638", "subnets": [{"cidr": "10.10.1.0/24", "dns": [], "gateway": {"address": "10.10.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.1.232", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc26b2385-71", "ovs_interfaceid": "c26b2385-71db-477e-888c-d10712732db6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.075 183079 DEBUG oslo_concurrency.lockutils [req-6f9c5517-457f-4f7d-a8c7-e9d4b7358f2c req-f43b1b75-1350-4af8-acb1-b8347f48d835 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-36a2dc63-6945-45c9-8e82-9d3aacdfc3bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.076 183079 DEBUG nova.network.neutron [req-6f9c5517-457f-4f7d-a8c7-e9d4b7358f2c req-f43b1b75-1350-4af8-acb1-b8347f48d835 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Refreshing network info cache for port c26b2385-71db-477e-888c-d10712732db6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.079 183079 DEBUG nova.virt.libvirt.driver [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Start _get_guest_xml network_info=[{"id": "c26b2385-71db-477e-888c-d10712732db6", "address": "fa:16:3e:3c:de:8a", "network": {"id": "ea1cd914-64be-4fd0-b944-45368957fb5b", "bridge": "br-int", "label": "tempest-test-network--945581638", "subnets": [{"cidr": "10.10.1.0/24", "dns": [], "gateway": {"address": "10.10.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.1.232", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc26b2385-71", "ovs_interfaceid": "c26b2385-71db-477e-888c-d10712732db6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.079 183079 DEBUG oslo_concurrency.lockutils [req-000de649-84ea-4ce0-9c6e-b5178ef020b4 req-1b76361a-9eb4-4677-a72a-9b1a47bcb188 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-e4683d56-25f3-42a9-aedd-1b076e9a5245" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.084 183079 WARNING nova.virt.libvirt.driver [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.092 183079 DEBUG nova.virt.libvirt.host [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.093 183079 DEBUG nova.virt.libvirt.host [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.097 183079 DEBUG nova.virt.libvirt.host [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.098 183079 DEBUG nova.virt.libvirt.host [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.098 183079 DEBUG nova.virt.libvirt.driver [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.098 183079 DEBUG nova.virt.hardware [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.099 183079 DEBUG nova.virt.hardware [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.099 183079 DEBUG nova.virt.hardware [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.100 183079 DEBUG nova.virt.hardware [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.100 183079 DEBUG nova.virt.hardware [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.100 183079 DEBUG nova.virt.hardware [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.101 183079 DEBUG nova.virt.hardware [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.101 183079 DEBUG nova.virt.hardware [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.102 183079 DEBUG nova.virt.hardware [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.102 183079 DEBUG nova.virt.hardware [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.102 183079 DEBUG nova.virt.hardware [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.107 183079 DEBUG nova.virt.libvirt.vif [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:10:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-939954687',display_name='tempest-server-test-939954687',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-939954687',id=16,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFDbtc4hRCSv9gHmltB2G2DhAS2UXQKlSKw7wO8jzDWMIBVwAbtKYxxZ/33aOA3bpXNLm1KqC5K8xgOmfq2yl/h8qq6Gn/Z0l0pwwJ1+wDWZki4CnB3qyJDKSWstQshx5w==',key_name='tempest-keypair-test-612599565',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='26cca885d303443380036cbbe9e70744',ramdisk_id='',reservation_id='r-0x1txn9l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='vir
tio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NetworkConnectivityTest-1809867331',owner_user_name='tempest-NetworkConnectivityTest-1809867331-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:10:47Z,user_data=None,user_id='4a7542774b9c42618cf9d00113f9d23d',uuid=36a2dc63-6945-45c9-8e82-9d3aacdfc3bc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c26b2385-71db-477e-888c-d10712732db6", "address": "fa:16:3e:3c:de:8a", "network": {"id": "ea1cd914-64be-4fd0-b944-45368957fb5b", "bridge": "br-int", "label": "tempest-test-network--945581638", "subnets": [{"cidr": "10.10.1.0/24", "dns": [], "gateway": {"address": "10.10.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.1.232", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc26b2385-71", "ovs_interfaceid": "c26b2385-71db-477e-888c-d10712732db6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.107 183079 DEBUG nova.network.os_vif_util [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Converting VIF {"id": "c26b2385-71db-477e-888c-d10712732db6", "address": "fa:16:3e:3c:de:8a", "network": {"id": "ea1cd914-64be-4fd0-b944-45368957fb5b", "bridge": "br-int", "label": "tempest-test-network--945581638", "subnets": [{"cidr": "10.10.1.0/24", "dns": [], "gateway": {"address": "10.10.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.1.232", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc26b2385-71", "ovs_interfaceid": "c26b2385-71db-477e-888c-d10712732db6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.108 183079 DEBUG nova.network.os_vif_util [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3c:de:8a,bridge_name='br-int',has_traffic_filtering=True,id=c26b2385-71db-477e-888c-d10712732db6,network=Network(ea1cd914-64be-4fd0-b944-45368957fb5b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc26b2385-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.109 183079 DEBUG nova.objects.instance [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lazy-loading 'pci_devices' on Instance uuid 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.122 183079 DEBUG nova.virt.libvirt.driver [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:10:53 compute-0 nova_compute[183075]:   <uuid>36a2dc63-6945-45c9-8e82-9d3aacdfc3bc</uuid>
Jan 22 17:10:53 compute-0 nova_compute[183075]:   <name>instance-00000010</name>
Jan 22 17:10:53 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:10:53 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:10:53 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:10:53 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-939954687</nova:name>
Jan 22 17:10:53 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:10:53</nova:creationTime>
Jan 22 17:10:53 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:10:53 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:10:53 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:10:53 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:10:53 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:10:53 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:10:53 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:10:53 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:10:53 compute-0 nova_compute[183075]:         <nova:user uuid="4a7542774b9c42618cf9d00113f9d23d">tempest-NetworkConnectivityTest-1809867331-project-member</nova:user>
Jan 22 17:10:53 compute-0 nova_compute[183075]:         <nova:project uuid="26cca885d303443380036cbbe9e70744">tempest-NetworkConnectivityTest-1809867331</nova:project>
Jan 22 17:10:53 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:10:53 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:10:53 compute-0 nova_compute[183075]:         <nova:port uuid="c26b2385-71db-477e-888c-d10712732db6">
Jan 22 17:10:53 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.10.1.232" ipVersion="4"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:10:53 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:10:53 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:10:53 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <system>
Jan 22 17:10:53 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:10:53 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:10:53 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:10:53 compute-0 nova_compute[183075]:       <entry name="serial">36a2dc63-6945-45c9-8e82-9d3aacdfc3bc</entry>
Jan 22 17:10:53 compute-0 nova_compute[183075]:       <entry name="uuid">36a2dc63-6945-45c9-8e82-9d3aacdfc3bc</entry>
Jan 22 17:10:53 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     </system>
Jan 22 17:10:53 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:10:53 compute-0 nova_compute[183075]:   <os>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:   </os>
Jan 22 17:10:53 compute-0 nova_compute[183075]:   <features>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:   </features>
Jan 22 17:10:53 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:10:53 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:10:53 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:10:53 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/disk"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:10:53 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:3c:de:8a"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:       <target dev="tapc26b2385-71"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:10:53 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/console.log" append="off"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <video>
Jan 22 17:10:53 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     </video>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:10:53 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:10:53 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:10:53 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:10:53 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:10:53 compute-0 nova_compute[183075]: </domain>
Jan 22 17:10:53 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.124 183079 DEBUG nova.compute.manager [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Preparing to wait for external event network-vif-plugged-c26b2385-71db-477e-888c-d10712732db6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.124 183079 DEBUG oslo_concurrency.lockutils [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.125 183079 DEBUG oslo_concurrency.lockutils [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.125 183079 DEBUG oslo_concurrency.lockutils [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.126 183079 DEBUG nova.virt.libvirt.vif [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:10:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-939954687',display_name='tempest-server-test-939954687',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-939954687',id=16,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFDbtc4hRCSv9gHmltB2G2DhAS2UXQKlSKw7wO8jzDWMIBVwAbtKYxxZ/33aOA3bpXNLm1KqC5K8xgOmfq2yl/h8qq6Gn/Z0l0pwwJ1+wDWZki4CnB3qyJDKSWstQshx5w==',key_name='tempest-keypair-test-612599565',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='26cca885d303443380036cbbe9e70744',ramdisk_id='',reservation_id='r-0x1txn9l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_
model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NetworkConnectivityTest-1809867331',owner_user_name='tempest-NetworkConnectivityTest-1809867331-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:10:47Z,user_data=None,user_id='4a7542774b9c42618cf9d00113f9d23d',uuid=36a2dc63-6945-45c9-8e82-9d3aacdfc3bc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c26b2385-71db-477e-888c-d10712732db6", "address": "fa:16:3e:3c:de:8a", "network": {"id": "ea1cd914-64be-4fd0-b944-45368957fb5b", "bridge": "br-int", "label": "tempest-test-network--945581638", "subnets": [{"cidr": "10.10.1.0/24", "dns": [], "gateway": {"address": "10.10.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.1.232", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc26b2385-71", "ovs_interfaceid": "c26b2385-71db-477e-888c-d10712732db6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.126 183079 DEBUG nova.network.os_vif_util [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Converting VIF {"id": "c26b2385-71db-477e-888c-d10712732db6", "address": "fa:16:3e:3c:de:8a", "network": {"id": "ea1cd914-64be-4fd0-b944-45368957fb5b", "bridge": "br-int", "label": "tempest-test-network--945581638", "subnets": [{"cidr": "10.10.1.0/24", "dns": [], "gateway": {"address": "10.10.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.1.232", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc26b2385-71", "ovs_interfaceid": "c26b2385-71db-477e-888c-d10712732db6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.127 183079 DEBUG nova.network.os_vif_util [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3c:de:8a,bridge_name='br-int',has_traffic_filtering=True,id=c26b2385-71db-477e-888c-d10712732db6,network=Network(ea1cd914-64be-4fd0-b944-45368957fb5b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc26b2385-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.127 183079 DEBUG os_vif [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3c:de:8a,bridge_name='br-int',has_traffic_filtering=True,id=c26b2385-71db-477e-888c-d10712732db6,network=Network(ea1cd914-64be-4fd0-b944-45368957fb5b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc26b2385-71') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.128 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.128 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.129 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.131 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.132 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc26b2385-71, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.132 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc26b2385-71, col_values=(('external_ids', {'iface-id': 'c26b2385-71db-477e-888c-d10712732db6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3c:de:8a', 'vm-uuid': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.134 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:53 compute-0 NetworkManager[55454]: <info>  [1769101853.1348] manager: (tapc26b2385-71): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.138 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.140 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.141 183079 INFO os_vif [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3c:de:8a,bridge_name='br-int',has_traffic_filtering=True,id=c26b2385-71db-477e-888c-d10712732db6,network=Network(ea1cd914-64be-4fd0-b944-45368957fb5b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc26b2385-71')
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.193 183079 DEBUG nova.virt.libvirt.driver [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.194 183079 DEBUG nova.virt.libvirt.driver [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] No VIF found with MAC fa:16:3e:3c:de:8a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:10:53 compute-0 kernel: tapc26b2385-71: entered promiscuous mode
Jan 22 17:10:53 compute-0 systemd-udevd[218730]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:10:53 compute-0 NetworkManager[55454]: <info>  [1769101853.2488] manager: (tapc26b2385-71): new Tun device (/org/freedesktop/NetworkManager/Devices/85)
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.250 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:53 compute-0 ovn_controller[95372]: 2026-01-22T17:10:53Z|00174|binding|INFO|Claiming lport c26b2385-71db-477e-888c-d10712732db6 for this chassis.
Jan 22 17:10:53 compute-0 ovn_controller[95372]: 2026-01-22T17:10:53Z|00175|binding|INFO|c26b2385-71db-477e-888c-d10712732db6: Claiming fa:16:3e:3c:de:8a 10.10.1.232
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:53.258 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3c:de:8a 10.10.1.232'], port_security=['fa:16:3e:3c:de:8a 10.10.1.232'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.10.1.232/24', 'neutron:device_id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ea1cd914-64be-4fd0-b944-45368957fb5b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '26cca885d303443380036cbbe9e70744', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9427a67d-1313-4d60-b73e-5a3f81f9a54d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f87222fa-7187-4fb1-9f2e-117949cb78fa, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=c26b2385-71db-477e-888c-d10712732db6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:53.259 104629 INFO neutron.agent.ovn.metadata.agent [-] Port c26b2385-71db-477e-888c-d10712732db6 in datapath ea1cd914-64be-4fd0-b944-45368957fb5b bound to our chassis
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:53.261 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ea1cd914-64be-4fd0-b944-45368957fb5b
Jan 22 17:10:53 compute-0 NetworkManager[55454]: <info>  [1769101853.2683] device (tapc26b2385-71): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:10:53 compute-0 NetworkManager[55454]: <info>  [1769101853.2688] device (tapc26b2385-71): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:53.271 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[46fb5e2c-3aaf-4345-9804-4157c6ac3d08]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:53.272 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapea1cd914-61 in ovnmeta-ea1cd914-64be-4fd0-b944-45368957fb5b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:53.274 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapea1cd914-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:53.274 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[92f1d8fb-1554-40fb-b0e3-cd4f10e4c523]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:53.275 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2ed25262-a8bb-4b04-9ee6-01423b77004d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:53 compute-0 ovn_controller[95372]: 2026-01-22T17:10:53Z|00176|binding|INFO|Setting lport c26b2385-71db-477e-888c-d10712732db6 ovn-installed in OVS
Jan 22 17:10:53 compute-0 ovn_controller[95372]: 2026-01-22T17:10:53Z|00177|binding|INFO|Setting lport c26b2385-71db-477e-888c-d10712732db6 up in Southbound
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.284 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:53.292 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[ee77a921-c5b6-4d9c-9506-dc2f0ef1b261]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:53 compute-0 systemd-machined[154382]: New machine qemu-16-instance-00000010.
Jan 22 17:10:53 compute-0 systemd[1]: Started Virtual Machine qemu-16-instance-00000010.
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:53.316 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d119adc7-88fe-4aac-b60f-e67ee69ba4ad]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:53.354 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[c40f8e37-0b21-4423-ab17-fb0ffd668f2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:53 compute-0 NetworkManager[55454]: <info>  [1769101853.3614] manager: (tapea1cd914-60): new Veth device (/org/freedesktop/NetworkManager/Devices/86)
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:53.362 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e064b692-5ad1-4043-94fa-2064e5dbddbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:53.397 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[d40a8be9-e2ad-4ff4-a7c4-4498b5bd2c77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:53.400 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[a4830782-b439-42d8-bb2f-12b91932afc6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:53 compute-0 NetworkManager[55454]: <info>  [1769101853.4191] device (tapea1cd914-60): carrier: link connected
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:53.424 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[6442499e-2c84-48c7-8b4f-dd91ec39c8de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:53.438 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[866d03af-1d74-49c5-9a01-2c4512dba0cb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapea1cd914-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:44:13:10'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 421712, 'reachable_time': 29321, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218855, 'error': None, 'target': 'ovnmeta-ea1cd914-64be-4fd0-b944-45368957fb5b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:53.453 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b6ee8a43-91ea-4d0c-b527-59c5d92f07b0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe44:1310'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 421712, 'tstamp': 421712}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218856, 'error': None, 'target': 'ovnmeta-ea1cd914-64be-4fd0-b944-45368957fb5b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:53.469 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0919002d-c909-4084-9e0c-d2d67b4286ee]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapea1cd914-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:44:13:10'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 421712, 'reachable_time': 29321, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218857, 'error': None, 'target': 'ovnmeta-ea1cd914-64be-4fd0-b944-45368957fb5b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:53.503 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[352abc57-4e04-4cc6-a84f-98f9bbb30759]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:53.566 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1c53cba4-bae0-4c7a-aabf-14e2ff8b498b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:53.567 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapea1cd914-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:53.568 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:53.568 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapea1cd914-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.619 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:53 compute-0 NetworkManager[55454]: <info>  [1769101853.6199] manager: (tapea1cd914-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Jan 22 17:10:53 compute-0 kernel: tapea1cd914-60: entered promiscuous mode
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:53.627 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapea1cd914-60, col_values=(('external_ids', {'iface-id': '75c62732-e203-4484-b6f5-77c2880e15a3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.628 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:53 compute-0 ovn_controller[95372]: 2026-01-22T17:10:53Z|00178|binding|INFO|Releasing lport 75c62732-e203-4484-b6f5-77c2880e15a3 from this chassis (sb_readonly=0)
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.629 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:53.631 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ea1cd914-64be-4fd0-b944-45368957fb5b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ea1cd914-64be-4fd0-b944-45368957fb5b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:53.632 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[bd99f680-4b71-4b2c-a593-f4683044f708]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:53.633 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-ea1cd914-64be-4fd0-b944-45368957fb5b
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/ea1cd914-64be-4fd0-b944-45368957fb5b.pid.haproxy
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID ea1cd914-64be-4fd0-b944-45368957fb5b
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:10:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:10:53.633 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ea1cd914-64be-4fd0-b944-45368957fb5b', 'env', 'PROCESS_TAG=haproxy-ea1cd914-64be-4fd0-b944-45368957fb5b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ea1cd914-64be-4fd0-b944-45368957fb5b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.640 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.730 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101853.7301166, 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.730 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] VM Started (Lifecycle Event)
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.769 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.772 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101853.7301896, 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.772 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] VM Paused (Lifecycle Event)
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.797 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.800 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:10:53 compute-0 nova_compute[183075]: 2026-01-22 17:10:53.826 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:10:53 compute-0 podman[218897]: 2026-01-22 17:10:53.989493833 +0000 UTC m=+0.055170210 container create 04243dd0edec7173998e19ee857249c039bac30befee2d558a8d8b171cc9f8c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ea1cd914-64be-4fd0-b944-45368957fb5b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:10:54 compute-0 systemd[1]: Started libpod-conmon-04243dd0edec7173998e19ee857249c039bac30befee2d558a8d8b171cc9f8c7.scope.
Jan 22 17:10:54 compute-0 podman[218897]: 2026-01-22 17:10:53.95983674 +0000 UTC m=+0.025513127 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:10:54 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:10:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/98a90ffaaf19cf754860dd41470f93fa7233e95a32e8f90c678af9e3a1058b81/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:10:54 compute-0 podman[218897]: 2026-01-22 17:10:54.095811205 +0000 UTC m=+0.161487592 container init 04243dd0edec7173998e19ee857249c039bac30befee2d558a8d8b171cc9f8c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ea1cd914-64be-4fd0-b944-45368957fb5b, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:10:54 compute-0 podman[218897]: 2026-01-22 17:10:54.109514922 +0000 UTC m=+0.175191259 container start 04243dd0edec7173998e19ee857249c039bac30befee2d558a8d8b171cc9f8c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ea1cd914-64be-4fd0-b944-45368957fb5b, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 22 17:10:54 compute-0 neutron-haproxy-ovnmeta-ea1cd914-64be-4fd0-b944-45368957fb5b[218912]: [NOTICE]   (218916) : New worker (218918) forked
Jan 22 17:10:54 compute-0 neutron-haproxy-ovnmeta-ea1cd914-64be-4fd0-b944-45368957fb5b[218912]: [NOTICE]   (218916) : Loading success.
Jan 22 17:10:54 compute-0 nova_compute[183075]: 2026-01-22 17:10:54.801 183079 DEBUG nova.compute.manager [req-5a43b415-5767-4d81-8b36-f1ec9ee089f6 req-643f3e8f-fca8-4fa8-bd29-1c3d05e3d8a2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Received event network-vif-unplugged-096b36b4-87c4-423a-a3ef-3c47a75704f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:10:54 compute-0 nova_compute[183075]: 2026-01-22 17:10:54.802 183079 DEBUG oslo_concurrency.lockutils [req-5a43b415-5767-4d81-8b36-f1ec9ee089f6 req-643f3e8f-fca8-4fa8-bd29-1c3d05e3d8a2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "e4683d56-25f3-42a9-aedd-1b076e9a5245-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:54 compute-0 nova_compute[183075]: 2026-01-22 17:10:54.802 183079 DEBUG oslo_concurrency.lockutils [req-5a43b415-5767-4d81-8b36-f1ec9ee089f6 req-643f3e8f-fca8-4fa8-bd29-1c3d05e3d8a2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e4683d56-25f3-42a9-aedd-1b076e9a5245-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:54 compute-0 nova_compute[183075]: 2026-01-22 17:10:54.803 183079 DEBUG oslo_concurrency.lockutils [req-5a43b415-5767-4d81-8b36-f1ec9ee089f6 req-643f3e8f-fca8-4fa8-bd29-1c3d05e3d8a2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e4683d56-25f3-42a9-aedd-1b076e9a5245-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:54 compute-0 nova_compute[183075]: 2026-01-22 17:10:54.803 183079 DEBUG nova.compute.manager [req-5a43b415-5767-4d81-8b36-f1ec9ee089f6 req-643f3e8f-fca8-4fa8-bd29-1c3d05e3d8a2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] No waiting events found dispatching network-vif-unplugged-096b36b4-87c4-423a-a3ef-3c47a75704f7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:10:54 compute-0 nova_compute[183075]: 2026-01-22 17:10:54.803 183079 DEBUG nova.compute.manager [req-5a43b415-5767-4d81-8b36-f1ec9ee089f6 req-643f3e8f-fca8-4fa8-bd29-1c3d05e3d8a2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Received event network-vif-unplugged-096b36b4-87c4-423a-a3ef-3c47a75704f7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:10:54 compute-0 nova_compute[183075]: 2026-01-22 17:10:54.803 183079 DEBUG nova.compute.manager [req-5a43b415-5767-4d81-8b36-f1ec9ee089f6 req-643f3e8f-fca8-4fa8-bd29-1c3d05e3d8a2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Received event network-vif-plugged-096b36b4-87c4-423a-a3ef-3c47a75704f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:10:54 compute-0 nova_compute[183075]: 2026-01-22 17:10:54.804 183079 DEBUG oslo_concurrency.lockutils [req-5a43b415-5767-4d81-8b36-f1ec9ee089f6 req-643f3e8f-fca8-4fa8-bd29-1c3d05e3d8a2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "e4683d56-25f3-42a9-aedd-1b076e9a5245-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:54 compute-0 nova_compute[183075]: 2026-01-22 17:10:54.804 183079 DEBUG oslo_concurrency.lockutils [req-5a43b415-5767-4d81-8b36-f1ec9ee089f6 req-643f3e8f-fca8-4fa8-bd29-1c3d05e3d8a2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e4683d56-25f3-42a9-aedd-1b076e9a5245-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:54 compute-0 nova_compute[183075]: 2026-01-22 17:10:54.804 183079 DEBUG oslo_concurrency.lockutils [req-5a43b415-5767-4d81-8b36-f1ec9ee089f6 req-643f3e8f-fca8-4fa8-bd29-1c3d05e3d8a2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e4683d56-25f3-42a9-aedd-1b076e9a5245-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:54 compute-0 nova_compute[183075]: 2026-01-22 17:10:54.805 183079 DEBUG nova.compute.manager [req-5a43b415-5767-4d81-8b36-f1ec9ee089f6 req-643f3e8f-fca8-4fa8-bd29-1c3d05e3d8a2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] No waiting events found dispatching network-vif-plugged-096b36b4-87c4-423a-a3ef-3c47a75704f7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:10:54 compute-0 nova_compute[183075]: 2026-01-22 17:10:54.805 183079 WARNING nova.compute.manager [req-5a43b415-5767-4d81-8b36-f1ec9ee089f6 req-643f3e8f-fca8-4fa8-bd29-1c3d05e3d8a2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Received unexpected event network-vif-plugged-096b36b4-87c4-423a-a3ef-3c47a75704f7 for instance with vm_state active and task_state deleting.
Jan 22 17:10:54 compute-0 nova_compute[183075]: 2026-01-22 17:10:54.805 183079 DEBUG nova.compute.manager [req-5a43b415-5767-4d81-8b36-f1ec9ee089f6 req-643f3e8f-fca8-4fa8-bd29-1c3d05e3d8a2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Received event network-vif-plugged-c26b2385-71db-477e-888c-d10712732db6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:10:54 compute-0 nova_compute[183075]: 2026-01-22 17:10:54.805 183079 DEBUG oslo_concurrency.lockutils [req-5a43b415-5767-4d81-8b36-f1ec9ee089f6 req-643f3e8f-fca8-4fa8-bd29-1c3d05e3d8a2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:54 compute-0 nova_compute[183075]: 2026-01-22 17:10:54.806 183079 DEBUG oslo_concurrency.lockutils [req-5a43b415-5767-4d81-8b36-f1ec9ee089f6 req-643f3e8f-fca8-4fa8-bd29-1c3d05e3d8a2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:54 compute-0 nova_compute[183075]: 2026-01-22 17:10:54.806 183079 DEBUG oslo_concurrency.lockutils [req-5a43b415-5767-4d81-8b36-f1ec9ee089f6 req-643f3e8f-fca8-4fa8-bd29-1c3d05e3d8a2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:54 compute-0 nova_compute[183075]: 2026-01-22 17:10:54.806 183079 DEBUG nova.compute.manager [req-5a43b415-5767-4d81-8b36-f1ec9ee089f6 req-643f3e8f-fca8-4fa8-bd29-1c3d05e3d8a2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Processing event network-vif-plugged-c26b2385-71db-477e-888c-d10712732db6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:10:54 compute-0 nova_compute[183075]: 2026-01-22 17:10:54.807 183079 DEBUG nova.compute.manager [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:10:54 compute-0 nova_compute[183075]: 2026-01-22 17:10:54.816 183079 DEBUG nova.virt.libvirt.driver [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:10:54 compute-0 nova_compute[183075]: 2026-01-22 17:10:54.817 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101854.8165925, 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:10:54 compute-0 nova_compute[183075]: 2026-01-22 17:10:54.818 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] VM Resumed (Lifecycle Event)
Jan 22 17:10:54 compute-0 nova_compute[183075]: 2026-01-22 17:10:54.823 183079 INFO nova.virt.libvirt.driver [-] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Instance spawned successfully.
Jan 22 17:10:54 compute-0 nova_compute[183075]: 2026-01-22 17:10:54.825 183079 DEBUG nova.virt.libvirt.driver [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:10:54 compute-0 nova_compute[183075]: 2026-01-22 17:10:54.836 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:10:54 compute-0 nova_compute[183075]: 2026-01-22 17:10:54.842 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:10:54 compute-0 nova_compute[183075]: 2026-01-22 17:10:54.846 183079 DEBUG nova.virt.libvirt.driver [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:10:54 compute-0 nova_compute[183075]: 2026-01-22 17:10:54.846 183079 DEBUG nova.virt.libvirt.driver [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:10:54 compute-0 nova_compute[183075]: 2026-01-22 17:10:54.847 183079 DEBUG nova.virt.libvirt.driver [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:10:54 compute-0 nova_compute[183075]: 2026-01-22 17:10:54.847 183079 DEBUG nova.virt.libvirt.driver [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:10:54 compute-0 nova_compute[183075]: 2026-01-22 17:10:54.848 183079 DEBUG nova.virt.libvirt.driver [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:10:54 compute-0 nova_compute[183075]: 2026-01-22 17:10:54.849 183079 DEBUG nova.virt.libvirt.driver [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:10:54 compute-0 nova_compute[183075]: 2026-01-22 17:10:54.870 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:10:54 compute-0 nova_compute[183075]: 2026-01-22 17:10:54.902 183079 INFO nova.compute.manager [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Took 7.41 seconds to spawn the instance on the hypervisor.
Jan 22 17:10:54 compute-0 nova_compute[183075]: 2026-01-22 17:10:54.903 183079 DEBUG nova.compute.manager [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:10:55 compute-0 nova_compute[183075]: 2026-01-22 17:10:55.003 183079 INFO nova.compute.manager [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Took 7.96 seconds to build instance.
Jan 22 17:10:55 compute-0 nova_compute[183075]: 2026-01-22 17:10:55.032 183079 DEBUG oslo_concurrency.lockutils [None req-a9be5cb0-1fea-4225-8cb1-228495e276fc 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "36a2dc63-6945-45c9-8e82-9d3aacdfc3bc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.066s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:55 compute-0 nova_compute[183075]: 2026-01-22 17:10:55.162 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:55 compute-0 podman[218928]: 2026-01-22 17:10:55.367391657 +0000 UTC m=+0.070843008 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 22 17:10:55 compute-0 podman[218929]: 2026-01-22 17:10:55.371092194 +0000 UTC m=+0.070887650 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible)
Jan 22 17:10:55 compute-0 podman[218927]: 2026-01-22 17:10:55.390998793 +0000 UTC m=+0.096861817 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 17:10:55 compute-0 nova_compute[183075]: 2026-01-22 17:10:55.494 183079 DEBUG nova.network.neutron [req-6f9c5517-457f-4f7d-a8c7-e9d4b7358f2c req-f43b1b75-1350-4af8-acb1-b8347f48d835 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Updated VIF entry in instance network info cache for port c26b2385-71db-477e-888c-d10712732db6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:10:55 compute-0 nova_compute[183075]: 2026-01-22 17:10:55.495 183079 DEBUG nova.network.neutron [req-6f9c5517-457f-4f7d-a8c7-e9d4b7358f2c req-f43b1b75-1350-4af8-acb1-b8347f48d835 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Updating instance_info_cache with network_info: [{"id": "c26b2385-71db-477e-888c-d10712732db6", "address": "fa:16:3e:3c:de:8a", "network": {"id": "ea1cd914-64be-4fd0-b944-45368957fb5b", "bridge": "br-int", "label": "tempest-test-network--945581638", "subnets": [{"cidr": "10.10.1.0/24", "dns": [], "gateway": {"address": "10.10.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.1.232", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc26b2385-71", "ovs_interfaceid": "c26b2385-71db-477e-888c-d10712732db6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:10:55 compute-0 nova_compute[183075]: 2026-01-22 17:10:55.510 183079 DEBUG oslo_concurrency.lockutils [req-6f9c5517-457f-4f7d-a8c7-e9d4b7358f2c req-f43b1b75-1350-4af8-acb1-b8347f48d835 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-36a2dc63-6945-45c9-8e82-9d3aacdfc3bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:10:55 compute-0 nova_compute[183075]: 2026-01-22 17:10:55.566 183079 DEBUG nova.network.neutron [-] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:10:55 compute-0 nova_compute[183075]: 2026-01-22 17:10:55.586 183079 INFO nova.compute.manager [-] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Took 4.19 seconds to deallocate network for instance.
Jan 22 17:10:55 compute-0 nova_compute[183075]: 2026-01-22 17:10:55.636 183079 DEBUG oslo_concurrency.lockutils [None req-23d6ed77-9738-493b-bdfe-d9e80291c215 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:55 compute-0 nova_compute[183075]: 2026-01-22 17:10:55.637 183079 DEBUG oslo_concurrency.lockutils [None req-23d6ed77-9738-493b-bdfe-d9e80291c215 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:55 compute-0 nova_compute[183075]: 2026-01-22 17:10:55.776 183079 DEBUG nova.compute.provider_tree [None req-23d6ed77-9738-493b-bdfe-d9e80291c215 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:10:55 compute-0 nova_compute[183075]: 2026-01-22 17:10:55.806 183079 DEBUG nova.scheduler.client.report [None req-23d6ed77-9738-493b-bdfe-d9e80291c215 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:10:55 compute-0 nova_compute[183075]: 2026-01-22 17:10:55.830 183079 DEBUG oslo_concurrency.lockutils [None req-23d6ed77-9738-493b-bdfe-d9e80291c215 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.193s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:55 compute-0 nova_compute[183075]: 2026-01-22 17:10:55.874 183079 INFO nova.scheduler.client.report [None req-23d6ed77-9738-493b-bdfe-d9e80291c215 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Deleted allocations for instance e4683d56-25f3-42a9-aedd-1b076e9a5245
Jan 22 17:10:55 compute-0 nova_compute[183075]: 2026-01-22 17:10:55.933 183079 DEBUG oslo_concurrency.lockutils [None req-23d6ed77-9738-493b-bdfe-d9e80291c215 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "e4683d56-25f3-42a9-aedd-1b076e9a5245" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.915s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:56 compute-0 nova_compute[183075]: 2026-01-22 17:10:56.045 183079 INFO nova.compute.manager [None req-14556521-6653-45aa-a874-989309a8649b cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Get console output
Jan 22 17:10:56 compute-0 nova_compute[183075]: 2026-01-22 17:10:56.067 183079 INFO nova.compute.manager [None req-5af36c39-2b7f-4726-ae99-7fd5584dcb06 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Get console output
Jan 22 17:10:56 compute-0 nova_compute[183075]: 2026-01-22 17:10:56.071 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:10:56 compute-0 nova_compute[183075]: 2026-01-22 17:10:56.885 183079 DEBUG nova.compute.manager [req-d89a6fe6-2f8b-4da5-b037-4c9e549a1aff req-34187275-47ef-40df-a8e8-a6f3c4fee661 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Received event network-vif-plugged-c26b2385-71db-477e-888c-d10712732db6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:10:56 compute-0 nova_compute[183075]: 2026-01-22 17:10:56.886 183079 DEBUG oslo_concurrency.lockutils [req-d89a6fe6-2f8b-4da5-b037-4c9e549a1aff req-34187275-47ef-40df-a8e8-a6f3c4fee661 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:56 compute-0 nova_compute[183075]: 2026-01-22 17:10:56.887 183079 DEBUG oslo_concurrency.lockutils [req-d89a6fe6-2f8b-4da5-b037-4c9e549a1aff req-34187275-47ef-40df-a8e8-a6f3c4fee661 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:56 compute-0 nova_compute[183075]: 2026-01-22 17:10:56.887 183079 DEBUG oslo_concurrency.lockutils [req-d89a6fe6-2f8b-4da5-b037-4c9e549a1aff req-34187275-47ef-40df-a8e8-a6f3c4fee661 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:56 compute-0 nova_compute[183075]: 2026-01-22 17:10:56.888 183079 DEBUG nova.compute.manager [req-d89a6fe6-2f8b-4da5-b037-4c9e549a1aff req-34187275-47ef-40df-a8e8-a6f3c4fee661 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] No waiting events found dispatching network-vif-plugged-c26b2385-71db-477e-888c-d10712732db6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:10:56 compute-0 nova_compute[183075]: 2026-01-22 17:10:56.888 183079 WARNING nova.compute.manager [req-d89a6fe6-2f8b-4da5-b037-4c9e549a1aff req-34187275-47ef-40df-a8e8-a6f3c4fee661 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Received unexpected event network-vif-plugged-c26b2385-71db-477e-888c-d10712732db6 for instance with vm_state active and task_state None.
Jan 22 17:10:58 compute-0 nova_compute[183075]: 2026-01-22 17:10:58.135 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:10:58 compute-0 nova_compute[183075]: 2026-01-22 17:10:58.980 183079 DEBUG nova.compute.manager [req-440a88b3-46d2-4ece-b91a-af3ae6b91329 req-377d2d3a-abf2-4340-842e-b9a143f4b2cc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Received event network-changed-1188a618-4567-453e-b4f1-8d3fafe1d314 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:10:58 compute-0 nova_compute[183075]: 2026-01-22 17:10:58.981 183079 DEBUG nova.compute.manager [req-440a88b3-46d2-4ece-b91a-af3ae6b91329 req-377d2d3a-abf2-4340-842e-b9a143f4b2cc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Refreshing instance network info cache due to event network-changed-1188a618-4567-453e-b4f1-8d3fafe1d314. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:10:58 compute-0 nova_compute[183075]: 2026-01-22 17:10:58.981 183079 DEBUG oslo_concurrency.lockutils [req-440a88b3-46d2-4ece-b91a-af3ae6b91329 req-377d2d3a-abf2-4340-842e-b9a143f4b2cc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-7d7be65d-c615-4cfd-936e-e5b57b3f29c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:10:58 compute-0 nova_compute[183075]: 2026-01-22 17:10:58.982 183079 DEBUG oslo_concurrency.lockutils [req-440a88b3-46d2-4ece-b91a-af3ae6b91329 req-377d2d3a-abf2-4340-842e-b9a143f4b2cc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-7d7be65d-c615-4cfd-936e-e5b57b3f29c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:10:58 compute-0 nova_compute[183075]: 2026-01-22 17:10:58.983 183079 DEBUG nova.network.neutron [req-440a88b3-46d2-4ece-b91a-af3ae6b91329 req-377d2d3a-abf2-4340-842e-b9a143f4b2cc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Refreshing network info cache for port 1188a618-4567-453e-b4f1-8d3fafe1d314 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:11:00 compute-0 nova_compute[183075]: 2026-01-22 17:11:00.164 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:00 compute-0 nova_compute[183075]: 2026-01-22 17:11:00.263 183079 DEBUG oslo_concurrency.lockutils [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "c1a1134b-933b-41d1-ba12-adb71c18d006" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:00 compute-0 nova_compute[183075]: 2026-01-22 17:11:00.263 183079 DEBUG oslo_concurrency.lockutils [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "c1a1134b-933b-41d1-ba12-adb71c18d006" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:00 compute-0 nova_compute[183075]: 2026-01-22 17:11:00.282 183079 DEBUG nova.compute.manager [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:11:00 compute-0 nova_compute[183075]: 2026-01-22 17:11:00.344 183079 DEBUG oslo_concurrency.lockutils [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:00 compute-0 nova_compute[183075]: 2026-01-22 17:11:00.345 183079 DEBUG oslo_concurrency.lockutils [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:00 compute-0 nova_compute[183075]: 2026-01-22 17:11:00.350 183079 DEBUG nova.virt.hardware [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:11:00 compute-0 nova_compute[183075]: 2026-01-22 17:11:00.350 183079 INFO nova.compute.claims [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:11:00 compute-0 nova_compute[183075]: 2026-01-22 17:11:00.552 183079 DEBUG nova.compute.provider_tree [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:11:00 compute-0 nova_compute[183075]: 2026-01-22 17:11:00.573 183079 DEBUG nova.scheduler.client.report [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:11:00 compute-0 nova_compute[183075]: 2026-01-22 17:11:00.602 183079 DEBUG oslo_concurrency.lockutils [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.257s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:00 compute-0 nova_compute[183075]: 2026-01-22 17:11:00.603 183079 DEBUG nova.compute.manager [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:11:00 compute-0 nova_compute[183075]: 2026-01-22 17:11:00.654 183079 DEBUG nova.compute.manager [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:11:00 compute-0 nova_compute[183075]: 2026-01-22 17:11:00.655 183079 DEBUG nova.network.neutron [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:11:00 compute-0 nova_compute[183075]: 2026-01-22 17:11:00.677 183079 INFO nova.virt.libvirt.driver [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:11:00 compute-0 nova_compute[183075]: 2026-01-22 17:11:00.695 183079 DEBUG nova.compute.manager [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:11:00 compute-0 nova_compute[183075]: 2026-01-22 17:11:00.786 183079 DEBUG nova.compute.manager [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:11:00 compute-0 nova_compute[183075]: 2026-01-22 17:11:00.788 183079 DEBUG nova.virt.libvirt.driver [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:11:00 compute-0 nova_compute[183075]: 2026-01-22 17:11:00.789 183079 INFO nova.virt.libvirt.driver [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Creating image(s)
Jan 22 17:11:00 compute-0 nova_compute[183075]: 2026-01-22 17:11:00.790 183079 DEBUG oslo_concurrency.lockutils [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "/var/lib/nova/instances/c1a1134b-933b-41d1-ba12-adb71c18d006/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:00 compute-0 nova_compute[183075]: 2026-01-22 17:11:00.791 183079 DEBUG oslo_concurrency.lockutils [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "/var/lib/nova/instances/c1a1134b-933b-41d1-ba12-adb71c18d006/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:00 compute-0 nova_compute[183075]: 2026-01-22 17:11:00.792 183079 DEBUG oslo_concurrency.lockutils [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "/var/lib/nova/instances/c1a1134b-933b-41d1-ba12-adb71c18d006/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:00 compute-0 nova_compute[183075]: 2026-01-22 17:11:00.818 183079 DEBUG oslo_concurrency.processutils [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:11:00 compute-0 nova_compute[183075]: 2026-01-22 17:11:00.888 183079 DEBUG oslo_concurrency.processutils [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:11:00 compute-0 nova_compute[183075]: 2026-01-22 17:11:00.890 183079 DEBUG oslo_concurrency.lockutils [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:00 compute-0 nova_compute[183075]: 2026-01-22 17:11:00.891 183079 DEBUG oslo_concurrency.lockutils [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:00 compute-0 nova_compute[183075]: 2026-01-22 17:11:00.910 183079 DEBUG oslo_concurrency.processutils [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:11:00 compute-0 nova_compute[183075]: 2026-01-22 17:11:00.975 183079 DEBUG oslo_concurrency.processutils [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:11:00 compute-0 nova_compute[183075]: 2026-01-22 17:11:00.976 183079 DEBUG oslo_concurrency.processutils [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/c1a1134b-933b-41d1-ba12-adb71c18d006/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:11:01 compute-0 nova_compute[183075]: 2026-01-22 17:11:01.026 183079 DEBUG oslo_concurrency.processutils [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/c1a1134b-933b-41d1-ba12-adb71c18d006/disk 1073741824" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:11:01 compute-0 nova_compute[183075]: 2026-01-22 17:11:01.028 183079 DEBUG oslo_concurrency.lockutils [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:01 compute-0 nova_compute[183075]: 2026-01-22 17:11:01.028 183079 DEBUG oslo_concurrency.processutils [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:11:01 compute-0 nova_compute[183075]: 2026-01-22 17:11:01.100 183079 DEBUG oslo_concurrency.processutils [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:11:01 compute-0 nova_compute[183075]: 2026-01-22 17:11:01.103 183079 DEBUG nova.virt.disk.api [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Checking if we can resize image /var/lib/nova/instances/c1a1134b-933b-41d1-ba12-adb71c18d006/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:11:01 compute-0 nova_compute[183075]: 2026-01-22 17:11:01.104 183079 DEBUG oslo_concurrency.processutils [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1a1134b-933b-41d1-ba12-adb71c18d006/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:11:01 compute-0 nova_compute[183075]: 2026-01-22 17:11:01.166 183079 DEBUG oslo_concurrency.processutils [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1a1134b-933b-41d1-ba12-adb71c18d006/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:11:01 compute-0 nova_compute[183075]: 2026-01-22 17:11:01.169 183079 DEBUG nova.virt.disk.api [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Cannot resize image /var/lib/nova/instances/c1a1134b-933b-41d1-ba12-adb71c18d006/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:11:01 compute-0 nova_compute[183075]: 2026-01-22 17:11:01.170 183079 DEBUG nova.objects.instance [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lazy-loading 'migration_context' on Instance uuid c1a1134b-933b-41d1-ba12-adb71c18d006 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:11:01 compute-0 nova_compute[183075]: 2026-01-22 17:11:01.192 183079 DEBUG nova.network.neutron [req-440a88b3-46d2-4ece-b91a-af3ae6b91329 req-377d2d3a-abf2-4340-842e-b9a143f4b2cc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Updated VIF entry in instance network info cache for port 1188a618-4567-453e-b4f1-8d3fafe1d314. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:11:01 compute-0 nova_compute[183075]: 2026-01-22 17:11:01.193 183079 DEBUG nova.network.neutron [req-440a88b3-46d2-4ece-b91a-af3ae6b91329 req-377d2d3a-abf2-4340-842e-b9a143f4b2cc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Updating instance_info_cache with network_info: [{"id": "1188a618-4567-453e-b4f1-8d3fafe1d314", "address": "fa:16:3e:54:2a:05", "network": {"id": "ce346f8d-be8d-455f-b61c-12fea213a3f4", "bridge": "br-int", "label": "tempest-test-network--508539761", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1188a618-45", "ovs_interfaceid": "1188a618-4567-453e-b4f1-8d3fafe1d314", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:11:01 compute-0 nova_compute[183075]: 2026-01-22 17:11:01.196 183079 DEBUG nova.virt.libvirt.driver [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:11:01 compute-0 nova_compute[183075]: 2026-01-22 17:11:01.196 183079 DEBUG nova.virt.libvirt.driver [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Ensure instance console log exists: /var/lib/nova/instances/c1a1134b-933b-41d1-ba12-adb71c18d006/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:11:01 compute-0 nova_compute[183075]: 2026-01-22 17:11:01.197 183079 DEBUG oslo_concurrency.lockutils [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:01 compute-0 nova_compute[183075]: 2026-01-22 17:11:01.197 183079 DEBUG oslo_concurrency.lockutils [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:01 compute-0 nova_compute[183075]: 2026-01-22 17:11:01.198 183079 DEBUG oslo_concurrency.lockutils [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:01 compute-0 nova_compute[183075]: 2026-01-22 17:11:01.216 183079 INFO nova.compute.manager [None req-023b74b8-9720-4046-bd63-e94b191cdd88 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Get console output
Jan 22 17:11:01 compute-0 nova_compute[183075]: 2026-01-22 17:11:01.217 183079 DEBUG oslo_concurrency.lockutils [req-440a88b3-46d2-4ece-b91a-af3ae6b91329 req-377d2d3a-abf2-4340-842e-b9a143f4b2cc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-7d7be65d-c615-4cfd-936e-e5b57b3f29c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:11:01 compute-0 nova_compute[183075]: 2026-01-22 17:11:01.221 183079 INFO nova.compute.manager [None req-26f974df-e4b6-4b9a-87b3-96146dae9571 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Get console output
Jan 22 17:11:01 compute-0 nova_compute[183075]: 2026-01-22 17:11:01.225 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:11:01 compute-0 nova_compute[183075]: 2026-01-22 17:11:01.227 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:11:01 compute-0 nova_compute[183075]: 2026-01-22 17:11:01.276 183079 DEBUG nova.policy [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1e61127d65144bcbaa0d43fe3eb484c0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:11:01 compute-0 ovn_controller[95372]: 2026-01-22T17:11:01Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:76:97:ec 10.100.0.30
Jan 22 17:11:01 compute-0 ovn_controller[95372]: 2026-01-22T17:11:01Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:76:97:ec 10.100.0.30
Jan 22 17:11:02 compute-0 podman[219023]: 2026-01-22 17:11:02.370104128 +0000 UTC m=+0.074509683 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 17:11:02 compute-0 ovn_controller[95372]: 2026-01-22T17:11:02Z|00179|memory|INFO|peak resident set size grew 52% in last 1792.8 seconds, from 16000 kB to 24372 kB
Jan 22 17:11:02 compute-0 ovn_controller[95372]: 2026-01-22T17:11:02Z|00180|memory|INFO|idl-cells-OVN_Southbound:10666 idl-cells-Open_vSwitch:1098 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:420 lflow-cache-entries-cache-matches:286 lflow-cache-size-KB:1652 local_datapath_usage-KB:3 ofctrl_desired_flow_usage-KB:673 ofctrl_installed_flow_usage-KB:492 ofctrl_sb_flow_ref_usage-KB:252 oflow_update_usage-KB:1
Jan 22 17:11:02 compute-0 nova_compute[183075]: 2026-01-22 17:11:02.875 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:03 compute-0 nova_compute[183075]: 2026-01-22 17:11:03.137 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:03 compute-0 nova_compute[183075]: 2026-01-22 17:11:03.990 183079 DEBUG nova.network.neutron [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Successfully updated port: 096b36b4-87c4-423a-a3ef-3c47a75704f7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:11:04 compute-0 nova_compute[183075]: 2026-01-22 17:11:04.012 183079 DEBUG oslo_concurrency.lockutils [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "refresh_cache-c1a1134b-933b-41d1-ba12-adb71c18d006" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:11:04 compute-0 nova_compute[183075]: 2026-01-22 17:11:04.012 183079 DEBUG oslo_concurrency.lockutils [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquired lock "refresh_cache-c1a1134b-933b-41d1-ba12-adb71c18d006" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:11:04 compute-0 nova_compute[183075]: 2026-01-22 17:11:04.012 183079 DEBUG nova.network.neutron [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:11:04 compute-0 nova_compute[183075]: 2026-01-22 17:11:04.118 183079 DEBUG nova.compute.manager [req-b8e72f1e-727c-4e67-a62e-22cc55f296ab req-64993a28-52a6-409b-bb14-efc27ead817c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Received event network-changed-096b36b4-87c4-423a-a3ef-3c47a75704f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:11:04 compute-0 nova_compute[183075]: 2026-01-22 17:11:04.118 183079 DEBUG nova.compute.manager [req-b8e72f1e-727c-4e67-a62e-22cc55f296ab req-64993a28-52a6-409b-bb14-efc27ead817c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Refreshing instance network info cache due to event network-changed-096b36b4-87c4-423a-a3ef-3c47a75704f7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:11:04 compute-0 nova_compute[183075]: 2026-01-22 17:11:04.119 183079 DEBUG oslo_concurrency.lockutils [req-b8e72f1e-727c-4e67-a62e-22cc55f296ab req-64993a28-52a6-409b-bb14-efc27ead817c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-c1a1134b-933b-41d1-ba12-adb71c18d006" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:11:04 compute-0 nova_compute[183075]: 2026-01-22 17:11:04.256 183079 DEBUG nova.network.neutron [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.166 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.313 183079 DEBUG nova.network.neutron [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Updating instance_info_cache with network_info: [{"id": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "address": "fa:16:3e:f6:53:65", "network": {"id": "9c1e909c-8e03-49be-b02d-6bf4a2cedc0c", "bridge": "br-int", "label": "tempest-test-network--1863582698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap096b36b4-87", "ovs_interfaceid": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.333 183079 DEBUG oslo_concurrency.lockutils [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Releasing lock "refresh_cache-c1a1134b-933b-41d1-ba12-adb71c18d006" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.333 183079 DEBUG nova.compute.manager [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Instance network_info: |[{"id": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "address": "fa:16:3e:f6:53:65", "network": {"id": "9c1e909c-8e03-49be-b02d-6bf4a2cedc0c", "bridge": "br-int", "label": "tempest-test-network--1863582698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap096b36b4-87", "ovs_interfaceid": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.334 183079 DEBUG oslo_concurrency.lockutils [req-b8e72f1e-727c-4e67-a62e-22cc55f296ab req-64993a28-52a6-409b-bb14-efc27ead817c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-c1a1134b-933b-41d1-ba12-adb71c18d006" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.334 183079 DEBUG nova.network.neutron [req-b8e72f1e-727c-4e67-a62e-22cc55f296ab req-64993a28-52a6-409b-bb14-efc27ead817c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Refreshing network info cache for port 096b36b4-87c4-423a-a3ef-3c47a75704f7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.339 183079 DEBUG nova.virt.libvirt.driver [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Start _get_guest_xml network_info=[{"id": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "address": "fa:16:3e:f6:53:65", "network": {"id": "9c1e909c-8e03-49be-b02d-6bf4a2cedc0c", "bridge": "br-int", "label": "tempest-test-network--1863582698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap096b36b4-87", "ovs_interfaceid": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.345 183079 WARNING nova.virt.libvirt.driver [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.349 183079 DEBUG nova.virt.libvirt.host [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.350 183079 DEBUG nova.virt.libvirt.host [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.352 183079 DEBUG nova.virt.libvirt.host [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.353 183079 DEBUG nova.virt.libvirt.host [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.353 183079 DEBUG nova.virt.libvirt.driver [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.354 183079 DEBUG nova.virt.hardware [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.354 183079 DEBUG nova.virt.hardware [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.354 183079 DEBUG nova.virt.hardware [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.355 183079 DEBUG nova.virt.hardware [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.355 183079 DEBUG nova.virt.hardware [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.355 183079 DEBUG nova.virt.hardware [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.355 183079 DEBUG nova.virt.hardware [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.355 183079 DEBUG nova.virt.hardware [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.356 183079 DEBUG nova.virt.hardware [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.356 183079 DEBUG nova.virt.hardware [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.356 183079 DEBUG nova.virt.hardware [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.360 183079 DEBUG nova.virt.libvirt.vif [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:10:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-38386895',display_name='tempest-server-test-38386895',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-38386895',id=17,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHfe8rLYrRVMQd5qVieqZtJXgXMRXXWLO3wIeFfb7KYA9eoyBAovnsonBtjcWSfX5askB1oLz9+GVLr2BbeT56cbjxFVwHBiF5ai0hYAzgMHQMj/KeUJm66j5OTKSNVWEQ==',key_name='tempest-keypair-test-1258081705',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bfc6667804934c92b71ce7638089e9e3',ramdisk_id='',reservation_id='r-ddwqtn0p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virti
o',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-QoSTest-2146064006',owner_user_name='tempest-QoSTest-2146064006-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:11:00Z,user_data=None,user_id='1e61127d65144bcbaa0d43fe3eb484c0',uuid=c1a1134b-933b-41d1-ba12-adb71c18d006,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "address": "fa:16:3e:f6:53:65", "network": {"id": "9c1e909c-8e03-49be-b02d-6bf4a2cedc0c", "bridge": "br-int", "label": "tempest-test-network--1863582698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap096b36b4-87", "ovs_interfaceid": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.361 183079 DEBUG nova.network.os_vif_util [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Converting VIF {"id": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "address": "fa:16:3e:f6:53:65", "network": {"id": "9c1e909c-8e03-49be-b02d-6bf4a2cedc0c", "bridge": "br-int", "label": "tempest-test-network--1863582698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap096b36b4-87", "ovs_interfaceid": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.362 183079 DEBUG nova.network.os_vif_util [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:53:65,bridge_name='br-int',has_traffic_filtering=True,id=096b36b4-87c4-423a-a3ef-3c47a75704f7,network=Network(9c1e909c-8e03-49be-b02d-6bf4a2cedc0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap096b36b4-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.362 183079 DEBUG nova.objects.instance [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lazy-loading 'pci_devices' on Instance uuid c1a1134b-933b-41d1-ba12-adb71c18d006 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.384 183079 DEBUG nova.virt.libvirt.driver [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:11:05 compute-0 nova_compute[183075]:   <uuid>c1a1134b-933b-41d1-ba12-adb71c18d006</uuid>
Jan 22 17:11:05 compute-0 nova_compute[183075]:   <name>instance-00000011</name>
Jan 22 17:11:05 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:11:05 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:11:05 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:11:05 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-38386895</nova:name>
Jan 22 17:11:05 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:11:05</nova:creationTime>
Jan 22 17:11:05 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:11:05 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:11:05 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:11:05 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:11:05 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:11:05 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:11:05 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:11:05 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:11:05 compute-0 nova_compute[183075]:         <nova:user uuid="1e61127d65144bcbaa0d43fe3eb484c0">tempest-QoSTest-2146064006-project-member</nova:user>
Jan 22 17:11:05 compute-0 nova_compute[183075]:         <nova:project uuid="bfc6667804934c92b71ce7638089e9e3">tempest-QoSTest-2146064006</nova:project>
Jan 22 17:11:05 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:11:05 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:11:05 compute-0 nova_compute[183075]:         <nova:port uuid="096b36b4-87c4-423a-a3ef-3c47a75704f7">
Jan 22 17:11:05 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:11:05 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:11:05 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:11:05 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <system>
Jan 22 17:11:05 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:11:05 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:11:05 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:11:05 compute-0 nova_compute[183075]:       <entry name="serial">c1a1134b-933b-41d1-ba12-adb71c18d006</entry>
Jan 22 17:11:05 compute-0 nova_compute[183075]:       <entry name="uuid">c1a1134b-933b-41d1-ba12-adb71c18d006</entry>
Jan 22 17:11:05 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     </system>
Jan 22 17:11:05 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:11:05 compute-0 nova_compute[183075]:   <os>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:   </os>
Jan 22 17:11:05 compute-0 nova_compute[183075]:   <features>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:   </features>
Jan 22 17:11:05 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:11:05 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:11:05 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:11:05 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/c1a1134b-933b-41d1-ba12-adb71c18d006/disk"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:11:05 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:f6:53:65"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:       <target dev="tap096b36b4-87"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:11:05 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/c1a1134b-933b-41d1-ba12-adb71c18d006/console.log" append="off"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <video>
Jan 22 17:11:05 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     </video>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:11:05 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:11:05 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:11:05 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:11:05 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:11:05 compute-0 nova_compute[183075]: </domain>
Jan 22 17:11:05 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.385 183079 DEBUG nova.compute.manager [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Preparing to wait for external event network-vif-plugged-096b36b4-87c4-423a-a3ef-3c47a75704f7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.385 183079 DEBUG oslo_concurrency.lockutils [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "c1a1134b-933b-41d1-ba12-adb71c18d006-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.385 183079 DEBUG oslo_concurrency.lockutils [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "c1a1134b-933b-41d1-ba12-adb71c18d006-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.386 183079 DEBUG oslo_concurrency.lockutils [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "c1a1134b-933b-41d1-ba12-adb71c18d006-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.386 183079 DEBUG nova.virt.libvirt.vif [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:10:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-38386895',display_name='tempest-server-test-38386895',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-38386895',id=17,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHfe8rLYrRVMQd5qVieqZtJXgXMRXXWLO3wIeFfb7KYA9eoyBAovnsonBtjcWSfX5askB1oLz9+GVLr2BbeT56cbjxFVwHBiF5ai0hYAzgMHQMj/KeUJm66j5OTKSNVWEQ==',key_name='tempest-keypair-test-1258081705',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bfc6667804934c92b71ce7638089e9e3',ramdisk_id='',reservation_id='r-ddwqtn0p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_mo
del='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-QoSTest-2146064006',owner_user_name='tempest-QoSTest-2146064006-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:11:00Z,user_data=None,user_id='1e61127d65144bcbaa0d43fe3eb484c0',uuid=c1a1134b-933b-41d1-ba12-adb71c18d006,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "address": "fa:16:3e:f6:53:65", "network": {"id": "9c1e909c-8e03-49be-b02d-6bf4a2cedc0c", "bridge": "br-int", "label": "tempest-test-network--1863582698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap096b36b4-87", "ovs_interfaceid": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.387 183079 DEBUG nova.network.os_vif_util [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Converting VIF {"id": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "address": "fa:16:3e:f6:53:65", "network": {"id": "9c1e909c-8e03-49be-b02d-6bf4a2cedc0c", "bridge": "br-int", "label": "tempest-test-network--1863582698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap096b36b4-87", "ovs_interfaceid": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.387 183079 DEBUG nova.network.os_vif_util [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:53:65,bridge_name='br-int',has_traffic_filtering=True,id=096b36b4-87c4-423a-a3ef-3c47a75704f7,network=Network(9c1e909c-8e03-49be-b02d-6bf4a2cedc0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap096b36b4-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.388 183079 DEBUG os_vif [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:53:65,bridge_name='br-int',has_traffic_filtering=True,id=096b36b4-87c4-423a-a3ef-3c47a75704f7,network=Network(9c1e909c-8e03-49be-b02d-6bf4a2cedc0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap096b36b4-87') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.389 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.389 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.389 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.392 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.393 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap096b36b4-87, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.393 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap096b36b4-87, col_values=(('external_ids', {'iface-id': '096b36b4-87c4-423a-a3ef-3c47a75704f7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f6:53:65', 'vm-uuid': 'c1a1134b-933b-41d1-ba12-adb71c18d006'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.395 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:05 compute-0 NetworkManager[55454]: <info>  [1769101865.3959] manager: (tap096b36b4-87): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/88)
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.397 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.402 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.404 183079 INFO os_vif [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:53:65,bridge_name='br-int',has_traffic_filtering=True,id=096b36b4-87c4-423a-a3ef-3c47a75704f7,network=Network(9c1e909c-8e03-49be-b02d-6bf4a2cedc0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap096b36b4-87')
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.465 183079 DEBUG nova.virt.libvirt.driver [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.465 183079 DEBUG nova.virt.libvirt.driver [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] No VIF found with MAC fa:16:3e:f6:53:65, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:11:05 compute-0 kernel: tap096b36b4-87: entered promiscuous mode
Jan 22 17:11:05 compute-0 NetworkManager[55454]: <info>  [1769101865.5254] manager: (tap096b36b4-87): new Tun device (/org/freedesktop/NetworkManager/Devices/89)
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.527 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:05 compute-0 ovn_controller[95372]: 2026-01-22T17:11:05Z|00181|binding|INFO|Claiming lport 096b36b4-87c4-423a-a3ef-3c47a75704f7 for this chassis.
Jan 22 17:11:05 compute-0 ovn_controller[95372]: 2026-01-22T17:11:05Z|00182|binding|INFO|096b36b4-87c4-423a-a3ef-3c47a75704f7: Claiming fa:16:3e:f6:53:65 10.100.0.14
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:05.536 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:53:65 10.100.0.14'], port_security=['fa:16:3e:f6:53:65 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'c1a1134b-933b-41d1-ba12-adb71c18d006', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bfc6667804934c92b71ce7638089e9e3', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'd9af03c0-27db-4d08-b124-ee395583cdd3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.225'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fd725c57-a5bb-4dca-9677-d74d2fa01c15, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=096b36b4-87c4-423a-a3ef-3c47a75704f7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:05.537 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 096b36b4-87c4-423a-a3ef-3c47a75704f7 in datapath 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c bound to our chassis
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:05.539 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c
Jan 22 17:11:05 compute-0 ovn_controller[95372]: 2026-01-22T17:11:05Z|00183|binding|INFO|Setting lport 096b36b4-87c4-423a-a3ef-3c47a75704f7 ovn-installed in OVS
Jan 22 17:11:05 compute-0 ovn_controller[95372]: 2026-01-22T17:11:05Z|00184|binding|INFO|Setting lport 096b36b4-87c4-423a-a3ef-3c47a75704f7 up in Southbound
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.548 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:05.551 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8a88e725-bd42-4cbf-adfc-9947a9d1acfa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.551 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:05.552 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9c1e909c-81 in ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:11:05 compute-0 systemd-udevd[219065]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:05.555 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9c1e909c-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:05.556 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[72e88a3b-b218-49c9-85ce-7783b1be5de8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:05.560 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[05d8e917-04f9-443a-be49-252282a018c2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:05 compute-0 systemd-machined[154382]: New machine qemu-17-instance-00000011.
Jan 22 17:11:05 compute-0 NetworkManager[55454]: <info>  [1769101865.5729] device (tap096b36b4-87): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:11:05 compute-0 NetworkManager[55454]: <info>  [1769101865.5734] device (tap096b36b4-87): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:11:05 compute-0 systemd[1]: Started Virtual Machine qemu-17-instance-00000011.
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:05.577 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[596e7001-f3cd-45d9-bcd6-21887f89768c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:05.602 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0ee62dc1-f622-4bfa-a7ce-c92562911807]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:05.637 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[c0ca594c-7263-402c-8186-78143ad462e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:05 compute-0 NetworkManager[55454]: <info>  [1769101865.6463] manager: (tap9c1e909c-80): new Veth device (/org/freedesktop/NetworkManager/Devices/90)
Jan 22 17:11:05 compute-0 systemd-udevd[219070]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:05.646 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3be7ca7d-d0db-4559-9648-09bcdb4dc59b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:05.679 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[f2943a0d-32b4-444a-a1ed-18a77a52574f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:05.682 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[e99484fb-6ff8-45dd-a5f4-c62e816278c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:05 compute-0 NetworkManager[55454]: <info>  [1769101865.7060] device (tap9c1e909c-80): carrier: link connected
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:05.710 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[d9f3c85c-7c48-4115-b1f7-b88a8ffa5c3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:05.737 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[672cfedc-05aa-421a-867a-0d0288e92147]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9c1e909c-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:65:42:25'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 422940, 'reachable_time': 44267, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219114, 'error': None, 'target': 'ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:05.755 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[65ebb287-ce40-46ad-88e3-f2d6692067fb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe65:4225'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 422940, 'tstamp': 422940}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219119, 'error': None, 'target': 'ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:05.773 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d95b7ef6-5a84-44b4-8b9b-f644e53d76e8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9c1e909c-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:65:42:25'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 422940, 'reachable_time': 44267, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219120, 'error': None, 'target': 'ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:05.820 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d11fa79c-7f99-43f9-a96a-228099d53558]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.826 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101865.8261213, c1a1134b-933b-41d1-ba12-adb71c18d006 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.826 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] VM Started (Lifecycle Event)
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.852 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.856 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101865.8262918, c1a1134b-933b-41d1-ba12-adb71c18d006 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.856 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] VM Paused (Lifecycle Event)
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.885 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.891 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:05.906 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3494f101-0c2d-4fdc-a578-8cb69455e8c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:05.908 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9c1e909c-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:05.908 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:05.908 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9c1e909c-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.910 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:05 compute-0 NetworkManager[55454]: <info>  [1769101865.9111] manager: (tap9c1e909c-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Jan 22 17:11:05 compute-0 kernel: tap9c1e909c-80: entered promiscuous mode
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:05.913 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9c1e909c-80, col_values=(('external_ids', {'iface-id': '02f52a63-476f-468b-a774-c9514d6b2206'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:05 compute-0 ovn_controller[95372]: 2026-01-22T17:11:05Z|00185|binding|INFO|Releasing lport 02f52a63-476f-468b-a774-c9514d6b2206 from this chassis (sb_readonly=0)
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.914 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.922 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.930 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:05 compute-0 nova_compute[183075]: 2026-01-22 17:11:05.931 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:05.933 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9c1e909c-8e03-49be-b02d-6bf4a2cedc0c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9c1e909c-8e03-49be-b02d-6bf4a2cedc0c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:05.934 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[de4f4b23-8205-42ad-8596-a4992674d66f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:05.935 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/9c1e909c-8e03-49be-b02d-6bf4a2cedc0c.pid.haproxy
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:11:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:05.936 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c', 'env', 'PROCESS_TAG=haproxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9c1e909c-8e03-49be-b02d-6bf4a2cedc0c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:11:06 compute-0 nova_compute[183075]: 2026-01-22 17:11:06.183 183079 DEBUG nova.compute.manager [req-3a5d0f38-c2dc-4a05-bdd8-3c6394c326a9 req-323afe25-7398-4ad7-8784-b08f40c39de1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Received event network-vif-plugged-096b36b4-87c4-423a-a3ef-3c47a75704f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:11:06 compute-0 nova_compute[183075]: 2026-01-22 17:11:06.184 183079 DEBUG oslo_concurrency.lockutils [req-3a5d0f38-c2dc-4a05-bdd8-3c6394c326a9 req-323afe25-7398-4ad7-8784-b08f40c39de1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c1a1134b-933b-41d1-ba12-adb71c18d006-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:06 compute-0 nova_compute[183075]: 2026-01-22 17:11:06.184 183079 DEBUG oslo_concurrency.lockutils [req-3a5d0f38-c2dc-4a05-bdd8-3c6394c326a9 req-323afe25-7398-4ad7-8784-b08f40c39de1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c1a1134b-933b-41d1-ba12-adb71c18d006-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:06 compute-0 nova_compute[183075]: 2026-01-22 17:11:06.184 183079 DEBUG oslo_concurrency.lockutils [req-3a5d0f38-c2dc-4a05-bdd8-3c6394c326a9 req-323afe25-7398-4ad7-8784-b08f40c39de1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c1a1134b-933b-41d1-ba12-adb71c18d006-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:06 compute-0 nova_compute[183075]: 2026-01-22 17:11:06.185 183079 DEBUG nova.compute.manager [req-3a5d0f38-c2dc-4a05-bdd8-3c6394c326a9 req-323afe25-7398-4ad7-8784-b08f40c39de1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Processing event network-vif-plugged-096b36b4-87c4-423a-a3ef-3c47a75704f7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:11:06 compute-0 nova_compute[183075]: 2026-01-22 17:11:06.185 183079 DEBUG nova.compute.manager [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:11:06 compute-0 nova_compute[183075]: 2026-01-22 17:11:06.188 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101866.1886692, c1a1134b-933b-41d1-ba12-adb71c18d006 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:11:06 compute-0 nova_compute[183075]: 2026-01-22 17:11:06.189 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] VM Resumed (Lifecycle Event)
Jan 22 17:11:06 compute-0 nova_compute[183075]: 2026-01-22 17:11:06.191 183079 DEBUG nova.virt.libvirt.driver [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:11:06 compute-0 nova_compute[183075]: 2026-01-22 17:11:06.194 183079 INFO nova.virt.libvirt.driver [-] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Instance spawned successfully.
Jan 22 17:11:06 compute-0 nova_compute[183075]: 2026-01-22 17:11:06.194 183079 DEBUG nova.virt.libvirt.driver [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:11:06 compute-0 nova_compute[183075]: 2026-01-22 17:11:06.217 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:11:06 compute-0 nova_compute[183075]: 2026-01-22 17:11:06.223 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:11:06 compute-0 nova_compute[183075]: 2026-01-22 17:11:06.227 183079 DEBUG nova.virt.libvirt.driver [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:11:06 compute-0 nova_compute[183075]: 2026-01-22 17:11:06.227 183079 DEBUG nova.virt.libvirt.driver [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:11:06 compute-0 nova_compute[183075]: 2026-01-22 17:11:06.228 183079 DEBUG nova.virt.libvirt.driver [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:11:06 compute-0 nova_compute[183075]: 2026-01-22 17:11:06.228 183079 DEBUG nova.virt.libvirt.driver [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:11:06 compute-0 nova_compute[183075]: 2026-01-22 17:11:06.228 183079 DEBUG nova.virt.libvirt.driver [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:11:06 compute-0 nova_compute[183075]: 2026-01-22 17:11:06.229 183079 DEBUG nova.virt.libvirt.driver [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:11:06 compute-0 nova_compute[183075]: 2026-01-22 17:11:06.266 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:11:06 compute-0 podman[219153]: 2026-01-22 17:11:06.312422211 +0000 UTC m=+0.057437129 container create 5559883f0eb78b2767cd5b1c2099ef9ddce66df902b91fbcdeb902d14ce3416f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 22 17:11:06 compute-0 nova_compute[183075]: 2026-01-22 17:11:06.315 183079 INFO nova.compute.manager [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Took 5.53 seconds to spawn the instance on the hypervisor.
Jan 22 17:11:06 compute-0 nova_compute[183075]: 2026-01-22 17:11:06.315 183079 DEBUG nova.compute.manager [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:11:06 compute-0 nova_compute[183075]: 2026-01-22 17:11:06.325 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769101851.3228924, e4683d56-25f3-42a9-aedd-1b076e9a5245 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:11:06 compute-0 nova_compute[183075]: 2026-01-22 17:11:06.325 183079 INFO nova.compute.manager [-] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] VM Stopped (Lifecycle Event)
Jan 22 17:11:06 compute-0 systemd[1]: Started libpod-conmon-5559883f0eb78b2767cd5b1c2099ef9ddce66df902b91fbcdeb902d14ce3416f.scope.
Jan 22 17:11:06 compute-0 nova_compute[183075]: 2026-01-22 17:11:06.367 183079 INFO nova.compute.manager [None req-6921f576-e3b9-4e80-b178-2b8eb67d2816 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Get console output
Jan 22 17:11:06 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:11:06 compute-0 nova_compute[183075]: 2026-01-22 17:11:06.368 183079 DEBUG nova.compute.manager [None req-8490e1a6-c58a-4b92-9ef3-dc8abf42d9c6 - - - - - -] [instance: e4683d56-25f3-42a9-aedd-1b076e9a5245] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:11:06 compute-0 nova_compute[183075]: 2026-01-22 17:11:06.374 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:11:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/00880bfd293daaf1c42a33064a76a1798cdfeb466c7691cc3e6701338cd821ef/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:11:06 compute-0 podman[219153]: 2026-01-22 17:11:06.285079578 +0000 UTC m=+0.030094516 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:11:06 compute-0 nova_compute[183075]: 2026-01-22 17:11:06.388 183079 INFO nova.compute.manager [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Took 6.06 seconds to build instance.
Jan 22 17:11:06 compute-0 podman[219153]: 2026-01-22 17:11:06.392140699 +0000 UTC m=+0.137155637 container init 5559883f0eb78b2767cd5b1c2099ef9ddce66df902b91fbcdeb902d14ce3416f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:11:06 compute-0 podman[219153]: 2026-01-22 17:11:06.398500085 +0000 UTC m=+0.143515003 container start 5559883f0eb78b2767cd5b1c2099ef9ddce66df902b91fbcdeb902d14ce3416f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:11:06 compute-0 nova_compute[183075]: 2026-01-22 17:11:06.404 183079 DEBUG oslo_concurrency.lockutils [None req-fefb4648-151e-4d68-80a3-054523663bd1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "c1a1134b-933b-41d1-ba12-adb71c18d006" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:06 compute-0 neutron-haproxy-ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[219169]: [NOTICE]   (219173) : New worker (219175) forked
Jan 22 17:11:06 compute-0 neutron-haproxy-ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[219169]: [NOTICE]   (219173) : Loading success.
Jan 22 17:11:06 compute-0 nova_compute[183075]: 2026-01-22 17:11:06.483 183079 INFO nova.compute.manager [None req-77c28471-bd38-40b1-a0e8-c07c74151f8f cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Get console output
Jan 22 17:11:06 compute-0 nova_compute[183075]: 2026-01-22 17:11:06.489 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:11:06 compute-0 nova_compute[183075]: 2026-01-22 17:11:06.542 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:11:06 compute-0 nova_compute[183075]: 2026-01-22 17:11:06.543 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:11:06 compute-0 nova_compute[183075]: 2026-01-22 17:11:06.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:11:07 compute-0 ovn_controller[95372]: 2026-01-22T17:11:07Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3c:de:8a 10.10.1.232
Jan 22 17:11:07 compute-0 ovn_controller[95372]: 2026-01-22T17:11:07Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3c:de:8a 10.10.1.232
Jan 22 17:11:07 compute-0 nova_compute[183075]: 2026-01-22 17:11:07.676 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:08 compute-0 nova_compute[183075]: 2026-01-22 17:11:08.220 183079 DEBUG nova.network.neutron [req-b8e72f1e-727c-4e67-a62e-22cc55f296ab req-64993a28-52a6-409b-bb14-efc27ead817c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Updated VIF entry in instance network info cache for port 096b36b4-87c4-423a-a3ef-3c47a75704f7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:11:08 compute-0 nova_compute[183075]: 2026-01-22 17:11:08.220 183079 DEBUG nova.network.neutron [req-b8e72f1e-727c-4e67-a62e-22cc55f296ab req-64993a28-52a6-409b-bb14-efc27ead817c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Updating instance_info_cache with network_info: [{"id": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "address": "fa:16:3e:f6:53:65", "network": {"id": "9c1e909c-8e03-49be-b02d-6bf4a2cedc0c", "bridge": "br-int", "label": "tempest-test-network--1863582698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap096b36b4-87", "ovs_interfaceid": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:11:08 compute-0 nova_compute[183075]: 2026-01-22 17:11:08.244 183079 DEBUG oslo_concurrency.lockutils [req-b8e72f1e-727c-4e67-a62e-22cc55f296ab req-64993a28-52a6-409b-bb14-efc27ead817c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-c1a1134b-933b-41d1-ba12-adb71c18d006" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:11:08 compute-0 nova_compute[183075]: 2026-01-22 17:11:08.267 183079 INFO nova.compute.manager [None req-fdafcf6e-1c54-4935-998d-33d958166536 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Get console output
Jan 22 17:11:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:08.280 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:08.281 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:11:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.30
Jan 22 17:11:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:08 compute-0 nova_compute[183075]: 2026-01-22 17:11:08.398 183079 DEBUG nova.compute.manager [req-b312b906-2314-4876-a724-4420c6ff2584 req-09e6ab01-2221-4212-bfe3-49082f41fce2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Received event network-vif-plugged-096b36b4-87c4-423a-a3ef-3c47a75704f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:11:08 compute-0 nova_compute[183075]: 2026-01-22 17:11:08.398 183079 DEBUG oslo_concurrency.lockutils [req-b312b906-2314-4876-a724-4420c6ff2584 req-09e6ab01-2221-4212-bfe3-49082f41fce2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c1a1134b-933b-41d1-ba12-adb71c18d006-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:08 compute-0 nova_compute[183075]: 2026-01-22 17:11:08.398 183079 DEBUG oslo_concurrency.lockutils [req-b312b906-2314-4876-a724-4420c6ff2584 req-09e6ab01-2221-4212-bfe3-49082f41fce2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c1a1134b-933b-41d1-ba12-adb71c18d006-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:08 compute-0 nova_compute[183075]: 2026-01-22 17:11:08.399 183079 DEBUG oslo_concurrency.lockutils [req-b312b906-2314-4876-a724-4420c6ff2584 req-09e6ab01-2221-4212-bfe3-49082f41fce2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c1a1134b-933b-41d1-ba12-adb71c18d006-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:08 compute-0 nova_compute[183075]: 2026-01-22 17:11:08.399 183079 DEBUG nova.compute.manager [req-b312b906-2314-4876-a724-4420c6ff2584 req-09e6ab01-2221-4212-bfe3-49082f41fce2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] No waiting events found dispatching network-vif-plugged-096b36b4-87c4-423a-a3ef-3c47a75704f7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:11:08 compute-0 nova_compute[183075]: 2026-01-22 17:11:08.399 183079 WARNING nova.compute.manager [req-b312b906-2314-4876-a724-4420c6ff2584 req-09e6ab01-2221-4212-bfe3-49082f41fce2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Received unexpected event network-vif-plugged-096b36b4-87c4-423a-a3ef-3c47a75704f7 for instance with vm_state active and task_state None.
Jan 22 17:11:08 compute-0 nova_compute[183075]: 2026-01-22 17:11:08.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:11:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:08.947 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:08.947 104990 INFO eventlet.wsgi.server [-] 10.100.0.30,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.6660209
Jan 22 17:11:08 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[218668]: 10.100.0.30:54248 [22/Jan/2026:17:11:08.279] listener listener/metadata 0/0/0/667/667 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:11:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:08.959 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:08.960 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:11:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.30
Jan 22 17:11:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:08.999 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:08.999 104990 INFO eventlet.wsgi.server [-] 10.100.0.30,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 168 time: 0.0393972
Jan 22 17:11:08 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[218668]: 10.100.0.30:54260 [22/Jan/2026:17:11:08.958] listener listener/metadata 0/0/0/41/41 200 152 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.003 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.004 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.30
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.024 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.024 104990 INFO eventlet.wsgi.server [-] 10.100.0.30,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0202997
Jan 22 17:11:09 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[218668]: 10.100.0.30:54262 [22/Jan/2026:17:11:09.003] listener listener/metadata 0/0/0/21/21 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.029 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.031 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.30
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.051 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:09 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[218668]: 10.100.0.30:54276 [22/Jan/2026:17:11:09.029] listener listener/metadata 0/0/0/23/23 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.052 104990 INFO eventlet.wsgi.server [-] 10.100.0.30,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0216770
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.057 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.057 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.30
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.073 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:09 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[218668]: 10.100.0.30:54282 [22/Jan/2026:17:11:09.056] listener listener/metadata 0/0/0/17/17 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.074 104990 INFO eventlet.wsgi.server [-] 10.100.0.30,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0167577
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.079 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.079 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.30
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.105 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.105 104990 INFO eventlet.wsgi.server [-] 10.100.0.30,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0255921
Jan 22 17:11:09 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[218668]: 10.100.0.30:54292 [22/Jan/2026:17:11:09.078] listener listener/metadata 0/0/0/26/26 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.110 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.111 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.30
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.129 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.129 104990 INFO eventlet.wsgi.server [-] 10.100.0.30,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0185876
Jan 22 17:11:09 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[218668]: 10.100.0.30:54298 [22/Jan/2026:17:11:09.110] listener listener/metadata 0/0/0/19/19 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.134 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.135 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.30
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.163 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.163 104990 INFO eventlet.wsgi.server [-] 10.100.0.30,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0281632
Jan 22 17:11:09 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[218668]: 10.100.0.30:54308 [22/Jan/2026:17:11:09.134] listener listener/metadata 0/0/0/28/28 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.168 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.168 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.30
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.183 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.183 104990 INFO eventlet.wsgi.server [-] 10.100.0.30,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 165 time: 0.0152161
Jan 22 17:11:09 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[218668]: 10.100.0.30:54316 [22/Jan/2026:17:11:09.167] listener listener/metadata 0/0/0/15/15 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.187 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.188 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.30
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.210 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.210 104990 INFO eventlet.wsgi.server [-] 10.100.0.30,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 165 time: 0.0219719
Jan 22 17:11:09 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[218668]: 10.100.0.30:54326 [22/Jan/2026:17:11:09.187] listener listener/metadata 0/0/0/22/22 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.215 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.215 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.30
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:09 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[218668]: 10.100.0.30:54330 [22/Jan/2026:17:11:09.214] listener listener/metadata 0/0/0/24/24 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.239 104990 INFO eventlet.wsgi.server [-] 10.100.0.30,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0235941
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.250 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.251 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.30
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.284 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.284 104990 INFO eventlet.wsgi.server [-] 10.100.0.30,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0329809
Jan 22 17:11:09 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[218668]: 10.100.0.30:54338 [22/Jan/2026:17:11:09.250] listener listener/metadata 0/0/0/34/34 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.289 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.290 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.30
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.309 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:09 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[218668]: 10.100.0.30:54352 [22/Jan/2026:17:11:09.288] listener listener/metadata 0/0/0/20/20 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.309 104990 INFO eventlet.wsgi.server [-] 10.100.0.30,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0193079
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.313 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.314 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.30
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.335 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.336 104990 INFO eventlet.wsgi.server [-] 10.100.0.30,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0219250
Jan 22 17:11:09 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[218668]: 10.100.0.30:54354 [22/Jan/2026:17:11:09.313] listener listener/metadata 0/0/0/22/22 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.341 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.342 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.30
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:09 compute-0 podman[219184]: 2026-01-22 17:11:09.352325215 +0000 UTC m=+0.053181267 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.363 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.363 104990 INFO eventlet.wsgi.server [-] 10.100.0.30,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 165 time: 0.0212982
Jan 22 17:11:09 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[218668]: 10.100.0.30:54370 [22/Jan/2026:17:11:09.340] listener listener/metadata 0/0/0/22/22 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.368 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.369 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.30
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.393 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:09.393 104990 INFO eventlet.wsgi.server [-] 10.100.0.30,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0241549
Jan 22 17:11:09 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[218668]: 10.100.0.30:54372 [22/Jan/2026:17:11:09.368] listener listener/metadata 0/0/0/24/24 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:11:09 compute-0 nova_compute[183075]: 2026-01-22 17:11:09.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:11:09 compute-0 nova_compute[183075]: 2026-01-22 17:11:09.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:11:09 compute-0 nova_compute[183075]: 2026-01-22 17:11:09.978 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-618a8c78-4b30-4b4d-9617-4c54bfdd414e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:11:09 compute-0 nova_compute[183075]: 2026-01-22 17:11:09.979 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-618a8c78-4b30-4b4d-9617-4c54bfdd414e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:11:09 compute-0 nova_compute[183075]: 2026-01-22 17:11:09.979 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:11:10 compute-0 nova_compute[183075]: 2026-01-22 17:11:10.169 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:10 compute-0 nova_compute[183075]: 2026-01-22 17:11:10.395 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:10 compute-0 nova_compute[183075]: 2026-01-22 17:11:10.660 183079 DEBUG oslo_concurrency.lockutils [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Acquiring lock "d54ce6ac-7fff-4f20-a6e0-48c13efded58" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:10 compute-0 nova_compute[183075]: 2026-01-22 17:11:10.660 183079 DEBUG oslo_concurrency.lockutils [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "d54ce6ac-7fff-4f20-a6e0-48c13efded58" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:10 compute-0 nova_compute[183075]: 2026-01-22 17:11:10.677 183079 DEBUG nova.compute.manager [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:11:10 compute-0 nova_compute[183075]: 2026-01-22 17:11:10.746 183079 DEBUG oslo_concurrency.lockutils [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:10 compute-0 nova_compute[183075]: 2026-01-22 17:11:10.746 183079 DEBUG oslo_concurrency.lockutils [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:10 compute-0 nova_compute[183075]: 2026-01-22 17:11:10.754 183079 DEBUG nova.virt.hardware [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:11:10 compute-0 nova_compute[183075]: 2026-01-22 17:11:10.755 183079 INFO nova.compute.claims [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:11:10 compute-0 nova_compute[183075]: 2026-01-22 17:11:10.940 183079 DEBUG nova.compute.provider_tree [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:11:10 compute-0 nova_compute[183075]: 2026-01-22 17:11:10.954 183079 DEBUG nova.scheduler.client.report [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:11:10 compute-0 nova_compute[183075]: 2026-01-22 17:11:10.974 183079 DEBUG oslo_concurrency.lockutils [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.228s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:10 compute-0 nova_compute[183075]: 2026-01-22 17:11:10.975 183079 DEBUG nova.compute.manager [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.052 183079 DEBUG nova.compute.manager [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.053 183079 DEBUG nova.network.neutron [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.075 183079 INFO nova.virt.libvirt.driver [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.093 183079 DEBUG nova.compute.manager [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.104 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Updating instance_info_cache with network_info: [{"id": "1ea3e9d7-15d1-4941-93be-4710d9a29763", "address": "fa:16:3e:ca:cc:e7", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ea3e9d7-15", "ovs_interfaceid": "1ea3e9d7-15d1-4941-93be-4710d9a29763", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.137 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-618a8c78-4b30-4b4d-9617-4c54bfdd414e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.137 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.138 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.138 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.192 183079 DEBUG nova.compute.manager [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.193 183079 DEBUG nova.virt.libvirt.driver [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.193 183079 INFO nova.virt.libvirt.driver [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Creating image(s)
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.194 183079 DEBUG oslo_concurrency.lockutils [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Acquiring lock "/var/lib/nova/instances/d54ce6ac-7fff-4f20-a6e0-48c13efded58/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.194 183079 DEBUG oslo_concurrency.lockutils [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "/var/lib/nova/instances/d54ce6ac-7fff-4f20-a6e0-48c13efded58/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.195 183079 DEBUG oslo_concurrency.lockutils [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "/var/lib/nova/instances/d54ce6ac-7fff-4f20-a6e0-48c13efded58/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.206 183079 DEBUG oslo_concurrency.processutils [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.237 183079 DEBUG nova.policy [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.323 183079 DEBUG oslo_concurrency.processutils [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.324 183079 DEBUG oslo_concurrency.lockutils [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.325 183079 DEBUG oslo_concurrency.lockutils [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.341 183079 DEBUG oslo_concurrency.processutils [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.399 183079 DEBUG oslo_concurrency.processutils [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.401 183079 DEBUG oslo_concurrency.processutils [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/d54ce6ac-7fff-4f20-a6e0-48c13efded58/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.456 183079 DEBUG oslo_concurrency.processutils [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/d54ce6ac-7fff-4f20-a6e0-48c13efded58/disk 1073741824" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.457 183079 DEBUG oslo_concurrency.lockutils [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.457 183079 DEBUG oslo_concurrency.processutils [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.503 183079 INFO nova.compute.manager [None req-6f4a7b2d-95a7-4eee-accb-34943e90570c 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Get console output
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.508 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.523 183079 DEBUG oslo_concurrency.processutils [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.523 183079 DEBUG nova.virt.disk.api [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Checking if we can resize image /var/lib/nova/instances/d54ce6ac-7fff-4f20-a6e0-48c13efded58/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.523 183079 DEBUG oslo_concurrency.processutils [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d54ce6ac-7fff-4f20-a6e0-48c13efded58/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.575 183079 DEBUG oslo_concurrency.processutils [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d54ce6ac-7fff-4f20-a6e0-48c13efded58/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.576 183079 DEBUG nova.virt.disk.api [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Cannot resize image /var/lib/nova/instances/d54ce6ac-7fff-4f20-a6e0-48c13efded58/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.576 183079 DEBUG nova.objects.instance [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lazy-loading 'migration_context' on Instance uuid d54ce6ac-7fff-4f20-a6e0-48c13efded58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.588 183079 INFO nova.compute.manager [None req-da26267e-225c-4346-a895-25626de4f8ce cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Get console output
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.595 183079 DEBUG nova.virt.libvirt.driver [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.595 183079 DEBUG nova.virt.libvirt.driver [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Ensure instance console log exists: /var/lib/nova/instances/d54ce6ac-7fff-4f20-a6e0-48c13efded58/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.595 183079 DEBUG oslo_concurrency.lockutils [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.596 183079 DEBUG oslo_concurrency.lockutils [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.596 183079 DEBUG oslo_concurrency.lockutils [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.594 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:11:11 compute-0 nova_compute[183075]: 2026-01-22 17:11:11.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.083 183079 DEBUG nova.network.neutron [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Successfully created port: e1d81dc2-e73c-45fb-be4a-7192b576b628 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.546 183079 DEBUG oslo_concurrency.lockutils [None req-f75b5fe6-23f7-4661-86fc-9e8a3ac90a59 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "2bf3289b-0c4e-4286-80ca-c74eb06a8b96" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.547 183079 DEBUG oslo_concurrency.lockutils [None req-f75b5fe6-23f7-4661-86fc-9e8a3ac90a59 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "2bf3289b-0c4e-4286-80ca-c74eb06a8b96" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.547 183079 DEBUG oslo_concurrency.lockutils [None req-f75b5fe6-23f7-4661-86fc-9e8a3ac90a59 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "2bf3289b-0c4e-4286-80ca-c74eb06a8b96-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.547 183079 DEBUG oslo_concurrency.lockutils [None req-f75b5fe6-23f7-4661-86fc-9e8a3ac90a59 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "2bf3289b-0c4e-4286-80ca-c74eb06a8b96-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.548 183079 DEBUG oslo_concurrency.lockutils [None req-f75b5fe6-23f7-4661-86fc-9e8a3ac90a59 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "2bf3289b-0c4e-4286-80ca-c74eb06a8b96-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.549 183079 INFO nova.compute.manager [None req-f75b5fe6-23f7-4661-86fc-9e8a3ac90a59 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Terminating instance
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.550 183079 DEBUG nova.compute.manager [None req-f75b5fe6-23f7-4661-86fc-9e8a3ac90a59 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:11:12 compute-0 kernel: tap9b370d66-b3 (unregistering): left promiscuous mode
Jan 22 17:11:12 compute-0 NetworkManager[55454]: <info>  [1769101872.5738] device (tap9b370d66-b3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:11:12 compute-0 ovn_controller[95372]: 2026-01-22T17:11:12Z|00186|binding|INFO|Releasing lport 9b370d66-b3f5-492d-b735-289048caa64f from this chassis (sb_readonly=0)
Jan 22 17:11:12 compute-0 ovn_controller[95372]: 2026-01-22T17:11:12Z|00187|binding|INFO|Setting lport 9b370d66-b3f5-492d-b735-289048caa64f down in Southbound
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.582 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:12 compute-0 ovn_controller[95372]: 2026-01-22T17:11:12Z|00188|binding|INFO|Removing iface tap9b370d66-b3 ovn-installed in OVS
Jan 22 17:11:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:12.592 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:76:97:ec 10.100.0.30'], port_security=['fa:16:3e:76:97:ec 10.100.0.30'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.30/28', 'neutron:device_id': '2bf3289b-0c4e-4286-80ca-c74eb06a8b96', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a16be1a-262e-47f7-8518-5f24ee15796e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e05c7aae349e4a1d859a387df45650a0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b9cc1ac9-85f4-4da7-891e-b0644711e574', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5b6ccb16-1216-4deb-9d72-42005a3163bb, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=9b370d66-b3f5-492d-b735-289048caa64f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:11:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:12.593 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 9b370d66-b3f5-492d-b735-289048caa64f in datapath 0a16be1a-262e-47f7-8518-5f24ee15796e unbound from our chassis
Jan 22 17:11:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:12.595 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0a16be1a-262e-47f7-8518-5f24ee15796e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:11:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:12.608 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[cbaade3d-0bd2-464b-b628-b5659a7c7b7b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:12.609 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e namespace which is not needed anymore
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.630 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:12 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Jan 22 17:11:12 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000000f.scope: Consumed 13.016s CPU time.
Jan 22 17:11:12 compute-0 systemd-machined[154382]: Machine qemu-15-instance-0000000f terminated.
Jan 22 17:11:12 compute-0 neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e[218662]: [NOTICE]   (218666) : haproxy version is 2.8.14-c23fe91
Jan 22 17:11:12 compute-0 neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e[218662]: [NOTICE]   (218666) : path to executable is /usr/sbin/haproxy
Jan 22 17:11:12 compute-0 neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e[218662]: [WARNING]  (218666) : Exiting Master process...
Jan 22 17:11:12 compute-0 neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e[218662]: [WARNING]  (218666) : Exiting Master process...
Jan 22 17:11:12 compute-0 neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e[218662]: [ALERT]    (218666) : Current worker (218668) exited with code 143 (Terminated)
Jan 22 17:11:12 compute-0 neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e[218662]: [WARNING]  (218666) : All workers exited. Exiting... (0)
Jan 22 17:11:12 compute-0 systemd[1]: libpod-81f287b3ae086e0074d492f8f8a158bac7404596bd7c579dc4ed4edbe5ec93c1.scope: Deactivated successfully.
Jan 22 17:11:12 compute-0 podman[219247]: 2026-01-22 17:11:12.772965146 +0000 UTC m=+0.062878371 container died 81f287b3ae086e0074d492f8f8a158bac7404596bd7c579dc4ed4edbe5ec93c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.800 183079 INFO nova.virt.libvirt.driver [-] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Instance destroyed successfully.
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.801 183079 DEBUG nova.objects.instance [None req-f75b5fe6-23f7-4661-86fc-9e8a3ac90a59 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lazy-loading 'resources' on Instance uuid 2bf3289b-0c4e-4286-80ca-c74eb06a8b96 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:11:12 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-81f287b3ae086e0074d492f8f8a158bac7404596bd7c579dc4ed4edbe5ec93c1-userdata-shm.mount: Deactivated successfully.
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.816 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.817 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.817 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.817 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:11:12 compute-0 systemd[1]: var-lib-containers-storage-overlay-4d29937df1ac6d693a9d69c64067ccf3334b0d3988fbec9f9723241c1cfe6246-merged.mount: Deactivated successfully.
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.822 183079 DEBUG nova.virt.libvirt.vif [None req-f75b5fe6-23f7-4661-86fc-9e8a3ac90a59 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:10:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-368474683',display_name='tempest-server-test-368474683',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-368474683',id=15,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFroJSJbEvCYe2sCwXkwL5KPVfwqqRPNqrWU5nnhv8U4G8wYWbEqk5AtRWXltmX7CAIPBST+sTwCKfymhALKd54XNN92D6KKnKgA6jmOihENah4FGfU80fx1UEE0osLpiw==',key_name='tempest-keypair-test-618317104',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:10:49Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e05c7aae349e4a1d859a387df45650a0',ramdisk_id='',reservation_id='r-lzk0if63',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_in
put_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIpSeparateNetwork-931877966',owner_user_name='tempest-FloatingIpSeparateNetwork-931877966-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:10:49Z,user_data=None,user_id='cd47d63cff2548a88e21e5c2e6a5c161',uuid=2bf3289b-0c4e-4286-80ca-c74eb06a8b96,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9b370d66-b3f5-492d-b735-289048caa64f", "address": "fa:16:3e:76:97:ec", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b370d66-b3", "ovs_interfaceid": "9b370d66-b3f5-492d-b735-289048caa64f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.822 183079 DEBUG nova.network.os_vif_util [None req-f75b5fe6-23f7-4661-86fc-9e8a3ac90a59 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converting VIF {"id": "9b370d66-b3f5-492d-b735-289048caa64f", "address": "fa:16:3e:76:97:ec", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b370d66-b3", "ovs_interfaceid": "9b370d66-b3f5-492d-b735-289048caa64f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:11:12 compute-0 podman[219247]: 2026-01-22 17:11:12.826001588 +0000 UTC m=+0.115914813 container cleanup 81f287b3ae086e0074d492f8f8a158bac7404596bd7c579dc4ed4edbe5ec93c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.824 183079 DEBUG nova.network.os_vif_util [None req-f75b5fe6-23f7-4661-86fc-9e8a3ac90a59 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:76:97:ec,bridge_name='br-int',has_traffic_filtering=True,id=9b370d66-b3f5-492d-b735-289048caa64f,network=Network(0a16be1a-262e-47f7-8518-5f24ee15796e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9b370d66-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.826 183079 DEBUG os_vif [None req-f75b5fe6-23f7-4661-86fc-9e8a3ac90a59 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:76:97:ec,bridge_name='br-int',has_traffic_filtering=True,id=9b370d66-b3f5-492d-b735-289048caa64f,network=Network(0a16be1a-262e-47f7-8518-5f24ee15796e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9b370d66-b3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.828 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.829 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9b370d66-b3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.834 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.835 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:11:12 compute-0 systemd[1]: libpod-conmon-81f287b3ae086e0074d492f8f8a158bac7404596bd7c579dc4ed4edbe5ec93c1.scope: Deactivated successfully.
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.838 183079 INFO os_vif [None req-f75b5fe6-23f7-4661-86fc-9e8a3ac90a59 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:76:97:ec,bridge_name='br-int',has_traffic_filtering=True,id=9b370d66-b3f5-492d-b735-289048caa64f,network=Network(0a16be1a-262e-47f7-8518-5f24ee15796e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9b370d66-b3')
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.838 183079 INFO nova.virt.libvirt.driver [None req-f75b5fe6-23f7-4661-86fc-9e8a3ac90a59 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Deleting instance files /var/lib/nova/instances/2bf3289b-0c4e-4286-80ca-c74eb06a8b96_del
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.839 183079 INFO nova.virt.libvirt.driver [None req-f75b5fe6-23f7-4661-86fc-9e8a3ac90a59 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Deletion of /var/lib/nova/instances/2bf3289b-0c4e-4286-80ca-c74eb06a8b96_del complete
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.891 183079 INFO nova.compute.manager [None req-f75b5fe6-23f7-4661-86fc-9e8a3ac90a59 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Took 0.34 seconds to destroy the instance on the hypervisor.
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.891 183079 DEBUG oslo.service.loopingcall [None req-f75b5fe6-23f7-4661-86fc-9e8a3ac90a59 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.892 183079 DEBUG nova.compute.manager [-] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.892 183079 DEBUG nova.network.neutron [-] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.918 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/618a8c78-4b30-4b4d-9617-4c54bfdd414e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:11:12 compute-0 podman[219290]: 2026-01-22 17:11:12.923924641 +0000 UTC m=+0.060698123 container remove 81f287b3ae086e0074d492f8f8a158bac7404596bd7c579dc4ed4edbe5ec93c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:11:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:12.938 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[782c30fb-796e-4d04-ad71-9f21eb548a92]: (4, ('Thu Jan 22 05:11:12 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e (81f287b3ae086e0074d492f8f8a158bac7404596bd7c579dc4ed4edbe5ec93c1)\n81f287b3ae086e0074d492f8f8a158bac7404596bd7c579dc4ed4edbe5ec93c1\nThu Jan 22 05:11:12 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e (81f287b3ae086e0074d492f8f8a158bac7404596bd7c579dc4ed4edbe5ec93c1)\n81f287b3ae086e0074d492f8f8a158bac7404596bd7c579dc4ed4edbe5ec93c1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:12.941 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[9c6db8ce-e04e-49cd-a555-b600a7cfdb51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:12.942 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a16be1a-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:12 compute-0 kernel: tap0a16be1a-20: left promiscuous mode
Jan 22 17:11:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:12.949 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0d850630-11bc-4e92-a988-bf2374768d7b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.955 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:12.966 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1bafc56c-d1d7-4d25-aeb7-8aa048390779]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.967 183079 DEBUG nova.network.neutron [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Successfully updated port: e1d81dc2-e73c-45fb-be4a-7192b576b628 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:11:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:12.967 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7f65ab65-8d58-42f3-af02-d2d588dd1aad]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.969 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.983 183079 DEBUG oslo_concurrency.lockutils [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Acquiring lock "refresh_cache-d54ce6ac-7fff-4f20-a6e0-48c13efded58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.983 183079 DEBUG oslo_concurrency.lockutils [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Acquired lock "refresh_cache-d54ce6ac-7fff-4f20-a6e0-48c13efded58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.983 183079 DEBUG nova.network.neutron [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:11:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:12.986 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8264037a-f05c-4456-a6ea-e7099f0e8a75]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 420671, 'reachable_time': 41511, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219306, 'error': None, 'target': 'ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:12 compute-0 systemd[1]: run-netns-ovnmeta\x2d0a16be1a\x2d262e\x2d47f7\x2d8518\x2d5f24ee15796e.mount: Deactivated successfully.
Jan 22 17:11:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:12.988 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:11:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:12.988 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[89fbb186-ffcc-40a3-87cd-8b2e980e23b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.992 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/618a8c78-4b30-4b4d-9617-4c54bfdd414e/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:11:12 compute-0 nova_compute[183075]: 2026-01-22 17:11:12.994 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/618a8c78-4b30-4b4d-9617-4c54bfdd414e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.044 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/618a8c78-4b30-4b4d-9617-4c54bfdd414e/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.046 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Error from libvirt while getting description of instance-0000000f: [Error Code 42] Domain not found: no domain with matching uuid '2bf3289b-0c4e-4286-80ca-c74eb06a8b96' (instance-0000000f): libvirt.libvirtError: Domain not found: no domain with matching uuid '2bf3289b-0c4e-4286-80ca-c74eb06a8b96' (instance-0000000f)
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.050 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1a1134b-933b-41d1-ba12-adb71c18d006/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.107 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1a1134b-933b-41d1-ba12-adb71c18d006/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.108 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1a1134b-933b-41d1-ba12-adb71c18d006/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.156 183079 DEBUG nova.network.neutron [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.163 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1a1134b-933b-41d1-ba12-adb71c18d006/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.167 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7d7be65d-c615-4cfd-936e-e5b57b3f29c1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.226 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7d7be65d-c615-4cfd-936e-e5b57b3f29c1/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.227 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7d7be65d-c615-4cfd-936e-e5b57b3f29c1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.247 183079 DEBUG nova.compute.manager [req-4824c3fa-5aa9-4cda-b801-5f2535b35d16 req-37eb40ea-ffd3-4261-8574-06c41807e0fb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Received event network-vif-unplugged-9b370d66-b3f5-492d-b735-289048caa64f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.248 183079 DEBUG oslo_concurrency.lockutils [req-4824c3fa-5aa9-4cda-b801-5f2535b35d16 req-37eb40ea-ffd3-4261-8574-06c41807e0fb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "2bf3289b-0c4e-4286-80ca-c74eb06a8b96-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.248 183079 DEBUG oslo_concurrency.lockutils [req-4824c3fa-5aa9-4cda-b801-5f2535b35d16 req-37eb40ea-ffd3-4261-8574-06c41807e0fb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "2bf3289b-0c4e-4286-80ca-c74eb06a8b96-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.248 183079 DEBUG oslo_concurrency.lockutils [req-4824c3fa-5aa9-4cda-b801-5f2535b35d16 req-37eb40ea-ffd3-4261-8574-06c41807e0fb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "2bf3289b-0c4e-4286-80ca-c74eb06a8b96-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.248 183079 DEBUG nova.compute.manager [req-4824c3fa-5aa9-4cda-b801-5f2535b35d16 req-37eb40ea-ffd3-4261-8574-06c41807e0fb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] No waiting events found dispatching network-vif-unplugged-9b370d66-b3f5-492d-b735-289048caa64f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.248 183079 DEBUG nova.compute.manager [req-4824c3fa-5aa9-4cda-b801-5f2535b35d16 req-37eb40ea-ffd3-4261-8574-06c41807e0fb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Received event network-vif-unplugged-9b370d66-b3f5-492d-b735-289048caa64f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.249 183079 DEBUG nova.compute.manager [req-9fb15475-5693-4671-8b3a-765a56f29552 req-6535da28-04a2-41c6-9980-a1e991be1847 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Received event network-changed-e1d81dc2-e73c-45fb-be4a-7192b576b628 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.249 183079 DEBUG nova.compute.manager [req-9fb15475-5693-4671-8b3a-765a56f29552 req-6535da28-04a2-41c6-9980-a1e991be1847 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Refreshing instance network info cache due to event network-changed-e1d81dc2-e73c-45fb-be4a-7192b576b628. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.250 183079 DEBUG oslo_concurrency.lockutils [req-9fb15475-5693-4671-8b3a-765a56f29552 req-6535da28-04a2-41c6-9980-a1e991be1847 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-d54ce6ac-7fff-4f20-a6e0-48c13efded58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.302 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7d7be65d-c615-4cfd-936e-e5b57b3f29c1/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.306 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.372 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.373 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.394 183079 INFO nova.compute.manager [None req-b292e0e6-55bd-46e6-b07b-0f21a6d317c6 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Get console output
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.434 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.438 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.508 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.509 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.562 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.783 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.785 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=4941MB free_disk=73.23446655273438GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.785 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.786 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.912 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 7e8d077b-66fc-42ee-ad4e-a13327ad6764 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.912 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 618a8c78-4b30-4b4d-9617-4c54bfdd414e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.912 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 7d7be65d-c615-4cfd-936e-e5b57b3f29c1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.912 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 2bf3289b-0c4e-4286-80ca-c74eb06a8b96 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.912 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.913 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance c1a1134b-933b-41d1-ba12-adb71c18d006 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.913 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance d54ce6ac-7fff-4f20-a6e0-48c13efded58 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.913 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 7 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:11:13 compute-0 nova_compute[183075]: 2026-01-22 17:11:13.913 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1408MB phys_disk=79GB used_disk=7GB total_vcpus=8 used_vcpus=7 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:11:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:13.933 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:13.933 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:11:13 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:13 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:13 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:13 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:13 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:13 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.1.232
Jan 22 17:11:13 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ea1cd914-64be-4fd0-b944-45368957fb5b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.018 183079 DEBUG nova.network.neutron [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Updating instance_info_cache with network_info: [{"id": "e1d81dc2-e73c-45fb-be4a-7192b576b628", "address": "fa:16:3e:65:16:89", "network": {"id": "83f76843-09f1-4ec7-b234-50118063210a", "bridge": "br-int", "label": "tempest-test-network--33948725", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1d81dc2-e7", "ovs_interfaceid": "e1d81dc2-e73c-45fb-be4a-7192b576b628", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.044 183079 DEBUG oslo_concurrency.lockutils [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Releasing lock "refresh_cache-d54ce6ac-7fff-4f20-a6e0-48c13efded58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.044 183079 DEBUG nova.compute.manager [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Instance network_info: |[{"id": "e1d81dc2-e73c-45fb-be4a-7192b576b628", "address": "fa:16:3e:65:16:89", "network": {"id": "83f76843-09f1-4ec7-b234-50118063210a", "bridge": "br-int", "label": "tempest-test-network--33948725", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1d81dc2-e7", "ovs_interfaceid": "e1d81dc2-e73c-45fb-be4a-7192b576b628", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.044 183079 DEBUG oslo_concurrency.lockutils [req-9fb15475-5693-4671-8b3a-765a56f29552 req-6535da28-04a2-41c6-9980-a1e991be1847 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-d54ce6ac-7fff-4f20-a6e0-48c13efded58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.045 183079 DEBUG nova.network.neutron [req-9fb15475-5693-4671-8b3a-765a56f29552 req-6535da28-04a2-41c6-9980-a1e991be1847 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Refreshing network info cache for port e1d81dc2-e73c-45fb-be4a-7192b576b628 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.047 183079 DEBUG nova.virt.libvirt.driver [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Start _get_guest_xml network_info=[{"id": "e1d81dc2-e73c-45fb-be4a-7192b576b628", "address": "fa:16:3e:65:16:89", "network": {"id": "83f76843-09f1-4ec7-b234-50118063210a", "bridge": "br-int", "label": "tempest-test-network--33948725", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1d81dc2-e7", "ovs_interfaceid": "e1d81dc2-e73c-45fb-be4a-7192b576b628", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.051 183079 WARNING nova.virt.libvirt.driver [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.055 183079 DEBUG nova.virt.libvirt.host [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.056 183079 DEBUG nova.virt.libvirt.host [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.058 183079 DEBUG nova.virt.libvirt.host [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.059 183079 DEBUG nova.virt.libvirt.host [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.059 183079 DEBUG nova.virt.libvirt.driver [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.059 183079 DEBUG nova.virt.hardware [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.059 183079 DEBUG nova.virt.hardware [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.060 183079 DEBUG nova.virt.hardware [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.060 183079 DEBUG nova.virt.hardware [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.060 183079 DEBUG nova.virt.hardware [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.060 183079 DEBUG nova.virt.hardware [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.060 183079 DEBUG nova.virt.hardware [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.061 183079 DEBUG nova.virt.hardware [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.061 183079 DEBUG nova.virt.hardware [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.061 183079 DEBUG nova.virt.hardware [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.061 183079 DEBUG nova.virt.hardware [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.064 183079 DEBUG nova.virt.libvirt.vif [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:11:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-124054286',display_name='tempest-server-test-124054286',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-124054286',id=18,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA4izGuVdf36SsG+7n8kX9aNpboq22Z55adiWGM5qlH08LxqMkSxkCnGlFdsMKL8t/vQsOXqbCU1vgc4to/WoKVrvDSrylB83cxSgDIuuaEZv45HgYlb5csi4YLKl3Bk4g==',key_name='tempest-keypair-test-110348497',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c2b37b797ca344f2b31c3861277068d8',ramdisk_id='',reservation_id='r-a96ualme',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='vir
tio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpMultipleRoutersTest-2036232412',owner_user_name='tempest-FloatingIpMultipleRoutersTest-2036232412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:11:11Z,user_data=None,user_id='28bc4852545149e59d0541d4f39eb38e',uuid=d54ce6ac-7fff-4f20-a6e0-48c13efded58,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e1d81dc2-e73c-45fb-be4a-7192b576b628", "address": "fa:16:3e:65:16:89", "network": {"id": "83f76843-09f1-4ec7-b234-50118063210a", "bridge": "br-int", "label": "tempest-test-network--33948725", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1d81dc2-e7", "ovs_interfaceid": "e1d81dc2-e73c-45fb-be4a-7192b576b628", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.065 183079 DEBUG nova.network.os_vif_util [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Converting VIF {"id": "e1d81dc2-e73c-45fb-be4a-7192b576b628", "address": "fa:16:3e:65:16:89", "network": {"id": "83f76843-09f1-4ec7-b234-50118063210a", "bridge": "br-int", "label": "tempest-test-network--33948725", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1d81dc2-e7", "ovs_interfaceid": "e1d81dc2-e73c-45fb-be4a-7192b576b628", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.066 183079 DEBUG nova.network.os_vif_util [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:16:89,bridge_name='br-int',has_traffic_filtering=True,id=e1d81dc2-e73c-45fb-be4a-7192b576b628,network=Network(83f76843-09f1-4ec7-b234-50118063210a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1d81dc2-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.066 183079 DEBUG nova.objects.instance [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lazy-loading 'pci_devices' on Instance uuid d54ce6ac-7fff-4f20-a6e0-48c13efded58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.083 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.086 183079 DEBUG nova.virt.libvirt.driver [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:11:14 compute-0 nova_compute[183075]:   <uuid>d54ce6ac-7fff-4f20-a6e0-48c13efded58</uuid>
Jan 22 17:11:14 compute-0 nova_compute[183075]:   <name>instance-00000012</name>
Jan 22 17:11:14 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:11:14 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:11:14 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:11:14 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-124054286</nova:name>
Jan 22 17:11:14 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:11:14</nova:creationTime>
Jan 22 17:11:14 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:11:14 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:11:14 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:11:14 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:11:14 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:11:14 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:11:14 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:11:14 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:11:14 compute-0 nova_compute[183075]:         <nova:user uuid="28bc4852545149e59d0541d4f39eb38e">tempest-FloatingIpMultipleRoutersTest-2036232412-project-member</nova:user>
Jan 22 17:11:14 compute-0 nova_compute[183075]:         <nova:project uuid="c2b37b797ca344f2b31c3861277068d8">tempest-FloatingIpMultipleRoutersTest-2036232412</nova:project>
Jan 22 17:11:14 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:11:14 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:11:14 compute-0 nova_compute[183075]:         <nova:port uuid="e1d81dc2-e73c-45fb-be4a-7192b576b628">
Jan 22 17:11:14 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.24" ipVersion="4"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:11:14 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:11:14 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:11:14 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <system>
Jan 22 17:11:14 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:11:14 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:11:14 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:11:14 compute-0 nova_compute[183075]:       <entry name="serial">d54ce6ac-7fff-4f20-a6e0-48c13efded58</entry>
Jan 22 17:11:14 compute-0 nova_compute[183075]:       <entry name="uuid">d54ce6ac-7fff-4f20-a6e0-48c13efded58</entry>
Jan 22 17:11:14 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     </system>
Jan 22 17:11:14 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:11:14 compute-0 nova_compute[183075]:   <os>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:   </os>
Jan 22 17:11:14 compute-0 nova_compute[183075]:   <features>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:   </features>
Jan 22 17:11:14 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:11:14 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:11:14 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:11:14 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/d54ce6ac-7fff-4f20-a6e0-48c13efded58/disk"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:11:14 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:65:16:89"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:       <target dev="tape1d81dc2-e7"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:11:14 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/d54ce6ac-7fff-4f20-a6e0-48c13efded58/console.log" append="off"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <video>
Jan 22 17:11:14 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     </video>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:11:14 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:11:14 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:11:14 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:11:14 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:11:14 compute-0 nova_compute[183075]: </domain>
Jan 22 17:11:14 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.087 183079 DEBUG nova.compute.manager [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Preparing to wait for external event network-vif-plugged-e1d81dc2-e73c-45fb-be4a-7192b576b628 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.087 183079 DEBUG oslo_concurrency.lockutils [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Acquiring lock "d54ce6ac-7fff-4f20-a6e0-48c13efded58-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.088 183079 DEBUG oslo_concurrency.lockutils [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "d54ce6ac-7fff-4f20-a6e0-48c13efded58-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.088 183079 DEBUG oslo_concurrency.lockutils [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "d54ce6ac-7fff-4f20-a6e0-48c13efded58-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.089 183079 DEBUG nova.virt.libvirt.vif [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:11:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-124054286',display_name='tempest-server-test-124054286',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-124054286',id=18,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA4izGuVdf36SsG+7n8kX9aNpboq22Z55adiWGM5qlH08LxqMkSxkCnGlFdsMKL8t/vQsOXqbCU1vgc4to/WoKVrvDSrylB83cxSgDIuuaEZv45HgYlb5csi4YLKl3Bk4g==',key_name='tempest-keypair-test-110348497',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c2b37b797ca344f2b31c3861277068d8',ramdisk_id='',reservation_id='r-a96ualme',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_
model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpMultipleRoutersTest-2036232412',owner_user_name='tempest-FloatingIpMultipleRoutersTest-2036232412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:11:11Z,user_data=None,user_id='28bc4852545149e59d0541d4f39eb38e',uuid=d54ce6ac-7fff-4f20-a6e0-48c13efded58,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e1d81dc2-e73c-45fb-be4a-7192b576b628", "address": "fa:16:3e:65:16:89", "network": {"id": "83f76843-09f1-4ec7-b234-50118063210a", "bridge": "br-int", "label": "tempest-test-network--33948725", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1d81dc2-e7", "ovs_interfaceid": "e1d81dc2-e73c-45fb-be4a-7192b576b628", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.089 183079 DEBUG nova.network.os_vif_util [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Converting VIF {"id": "e1d81dc2-e73c-45fb-be4a-7192b576b628", "address": "fa:16:3e:65:16:89", "network": {"id": "83f76843-09f1-4ec7-b234-50118063210a", "bridge": "br-int", "label": "tempest-test-network--33948725", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1d81dc2-e7", "ovs_interfaceid": "e1d81dc2-e73c-45fb-be4a-7192b576b628", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.090 183079 DEBUG nova.network.os_vif_util [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:16:89,bridge_name='br-int',has_traffic_filtering=True,id=e1d81dc2-e73c-45fb-be4a-7192b576b628,network=Network(83f76843-09f1-4ec7-b234-50118063210a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1d81dc2-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.090 183079 DEBUG os_vif [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:16:89,bridge_name='br-int',has_traffic_filtering=True,id=e1d81dc2-e73c-45fb-be4a-7192b576b628,network=Network(83f76843-09f1-4ec7-b234-50118063210a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1d81dc2-e7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.091 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.091 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.092 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.095 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.095 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape1d81dc2-e7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.096 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape1d81dc2-e7, col_values=(('external_ids', {'iface-id': 'e1d81dc2-e73c-45fb-be4a-7192b576b628', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:65:16:89', 'vm-uuid': 'd54ce6ac-7fff-4f20-a6e0-48c13efded58'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.097 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:14 compute-0 NetworkManager[55454]: <info>  [1769101874.0994] manager: (tape1d81dc2-e7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.101 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.106 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.107 183079 INFO os_vif [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:16:89,bridge_name='br-int',has_traffic_filtering=True,id=e1d81dc2-e73c-45fb-be4a-7192b576b628,network=Network(83f76843-09f1-4ec7-b234-50118063210a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1d81dc2-e7')
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.140 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.140 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.355s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.150 183079 DEBUG nova.network.neutron [-] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.154 183079 DEBUG nova.virt.libvirt.driver [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.154 183079 DEBUG nova.virt.libvirt.driver [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] No VIF found with MAC fa:16:3e:65:16:89, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.170 183079 INFO nova.compute.manager [-] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Took 1.28 seconds to deallocate network for instance.
Jan 22 17:11:14 compute-0 kernel: tape1d81dc2-e7: entered promiscuous mode
Jan 22 17:11:14 compute-0 systemd-udevd[219226]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:11:14 compute-0 ovn_controller[95372]: 2026-01-22T17:11:14Z|00189|binding|INFO|Claiming lport e1d81dc2-e73c-45fb-be4a-7192b576b628 for this chassis.
Jan 22 17:11:14 compute-0 ovn_controller[95372]: 2026-01-22T17:11:14Z|00190|binding|INFO|e1d81dc2-e73c-45fb-be4a-7192b576b628: Claiming fa:16:3e:65:16:89 10.100.0.24
Jan 22 17:11:14 compute-0 NetworkManager[55454]: <info>  [1769101874.2087] manager: (tape1d81dc2-e7): new Tun device (/org/freedesktop/NetworkManager/Devices/93)
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.210 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.214 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:16:89 10.100.0.24'], port_security=['fa:16:3e:65:16:89 10.100.0.24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.24/28', 'neutron:device_id': 'd54ce6ac-7fff-4f20-a6e0-48c13efded58', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-83f76843-09f1-4ec7-b234-50118063210a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c2b37b797ca344f2b31c3861277068d8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2f51838f-8a2c-425b-a70e-e288886c38d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c1b57117-a4a4-4755-be03-74567148d139, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=e1d81dc2-e73c-45fb-be4a-7192b576b628) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.215 104629 INFO neutron.agent.ovn.metadata.agent [-] Port e1d81dc2-e73c-45fb-be4a-7192b576b628 in datapath 83f76843-09f1-4ec7-b234-50118063210a bound to our chassis
Jan 22 17:11:14 compute-0 NetworkManager[55454]: <info>  [1769101874.2197] device (tape1d81dc2-e7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:11:14 compute-0 NetworkManager[55454]: <info>  [1769101874.2213] device (tape1d81dc2-e7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.225 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 83f76843-09f1-4ec7-b234-50118063210a
Jan 22 17:11:14 compute-0 ovn_controller[95372]: 2026-01-22T17:11:14Z|00191|binding|INFO|Setting lport e1d81dc2-e73c-45fb-be4a-7192b576b628 ovn-installed in OVS
Jan 22 17:11:14 compute-0 ovn_controller[95372]: 2026-01-22T17:11:14Z|00192|binding|INFO|Setting lport e1d81dc2-e73c-45fb-be4a-7192b576b628 up in Southbound
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.228 183079 DEBUG oslo_concurrency.lockutils [None req-f75b5fe6-23f7-4661-86fc-9e8a3ac90a59 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.229 183079 DEBUG oslo_concurrency.lockutils [None req-f75b5fe6-23f7-4661-86fc-9e8a3ac90a59 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.229 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.235 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.237 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d48625d6-d3c8-4e03-977f-ee879c4a77be]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.239 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap83f76843-01 in ovnmeta-83f76843-09f1-4ec7-b234-50118063210a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.242 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap83f76843-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.242 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0f80f738-9a21-4978-b99b-06c23ef71b37]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.243 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2e0f4e1a-8f52-45bd-8228-87f62375d8de]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:14 compute-0 systemd-machined[154382]: New machine qemu-18-instance-00000012.
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.254 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[331ebf33-5ea1-46f1-8ddf-476b0a3a5b39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:14 compute-0 systemd[1]: Started Virtual Machine qemu-18-instance-00000012.
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.278 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[5d6b1e3a-38c4-45a8-b498-70901d955fb5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.322 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[9176ecf3-6ce6-4fb7-9b91-66f6166fae48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.335 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.336 104990 INFO eventlet.wsgi.server [-] 10.10.1.232,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.4023509
Jan 22 17:11:14 compute-0 haproxy-metadata-proxy-ea1cd914-64be-4fd0-b944-45368957fb5b[218918]: 10.10.1.232:36428 [22/Jan/2026:17:11:13.932] listener listener/metadata 0/0/0/403/403 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.344 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:14 compute-0 NetworkManager[55454]: <info>  [1769101874.3474] manager: (tap83f76843-00): new Veth device (/org/freedesktop/NetworkManager/Devices/94)
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.345 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[5aa465b0-4289-4fd9-a1ed-a1c0488ddf6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.344 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.1.232
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ea1cd914-64be-4fd0-b944-45368957fb5b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.371 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.372 104990 INFO eventlet.wsgi.server [-] 10.10.1.232,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 168 time: 0.0277631
Jan 22 17:11:14 compute-0 haproxy-metadata-proxy-ea1cd914-64be-4fd0-b944-45368957fb5b[218918]: 10.10.1.232:36442 [22/Jan/2026:17:11:14.343] listener listener/metadata 0/0/0/29/29 200 152 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.378 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.378 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.1.232
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ea1cd914-64be-4fd0-b944-45368957fb5b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.388 183079 DEBUG nova.compute.provider_tree [None req-f75b5fe6-23f7-4661-86fc-9e8a3ac90a59 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.400 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[de9d969e-5540-4014-a54f-ac2ffb80bc24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.405 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[6a29df0d-bb36-4a8e-b5fb-402068cb3333]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.406 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:14 compute-0 haproxy-metadata-proxy-ea1cd914-64be-4fd0-b944-45368957fb5b[218918]: 10.10.1.232:36444 [22/Jan/2026:17:11:14.377] listener listener/metadata 0/0/0/28/28 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.406 104990 INFO eventlet.wsgi.server [-] 10.10.1.232,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0278165
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.407 183079 DEBUG nova.scheduler.client.report [None req-f75b5fe6-23f7-4661-86fc-9e8a3ac90a59 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.411 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.412 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.1.232
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ea1cd914-64be-4fd0-b944-45368957fb5b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.430 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.430 183079 DEBUG oslo_concurrency.lockutils [None req-f75b5fe6-23f7-4661-86fc-9e8a3ac90a59 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.201s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.430 104990 INFO eventlet.wsgi.server [-] 10.10.1.232,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0179636
Jan 22 17:11:14 compute-0 haproxy-metadata-proxy-ea1cd914-64be-4fd0-b944-45368957fb5b[218918]: 10.10.1.232:36450 [22/Jan/2026:17:11:14.411] listener listener/metadata 0/0/0/19/19 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:11:14 compute-0 NetworkManager[55454]: <info>  [1769101874.4338] device (tap83f76843-00): carrier: link connected
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.438 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[a5e5fe4f-6143-44de-b3bb-102e7734e37e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.440 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.440 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.1.232
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ea1cd914-64be-4fd0-b944-45368957fb5b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.452 183079 INFO nova.scheduler.client.report [None req-f75b5fe6-23f7-4661-86fc-9e8a3ac90a59 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Deleted allocations for instance 2bf3289b-0c4e-4286-80ca-c74eb06a8b96
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.453 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[fe5063b5-8fa2-4aa2-aded-35ac7f515492]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap83f76843-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:87:54'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 423813, 'reachable_time': 17781, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219383, 'error': None, 'target': 'ovnmeta-83f76843-09f1-4ec7-b234-50118063210a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.467 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.468 104990 INFO eventlet.wsgi.server [-] 10.10.1.232,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0272853
Jan 22 17:11:14 compute-0 haproxy-metadata-proxy-ea1cd914-64be-4fd0-b944-45368957fb5b[218918]: 10.10.1.232:36464 [22/Jan/2026:17:11:14.439] listener listener/metadata 0/0/0/28/28 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.468 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[05f6e95b-c24d-4fad-ba5a-3d01e509c223]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb5:8754'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 423813, 'tstamp': 423813}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219384, 'error': None, 'target': 'ovnmeta-83f76843-09f1-4ec7-b234-50118063210a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.476 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.477 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.1.232
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ea1cd914-64be-4fd0-b944-45368957fb5b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.490 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b3e6824f-84c5-4324-95df-185c90a2c9d6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap83f76843-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:87:54'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 423813, 'reachable_time': 17781, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219385, 'error': None, 'target': 'ovnmeta-83f76843-09f1-4ec7-b234-50118063210a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.497 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:14 compute-0 haproxy-metadata-proxy-ea1cd914-64be-4fd0-b944-45368957fb5b[218918]: 10.10.1.232:36474 [22/Jan/2026:17:11:14.475] listener listener/metadata 0/0/0/22/22 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.498 104990 INFO eventlet.wsgi.server [-] 10.10.1.232,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0210481
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.506 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.507 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.1.232
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ea1cd914-64be-4fd0-b944-45368957fb5b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.521 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:14 compute-0 haproxy-metadata-proxy-ea1cd914-64be-4fd0-b944-45368957fb5b[218918]: 10.10.1.232:36478 [22/Jan/2026:17:11:14.506] listener listener/metadata 0/0/0/15/15 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.521 104990 INFO eventlet.wsgi.server [-] 10.10.1.232,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0142787
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.530 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e71e9c22-7d80-420b-a875-10f002cfeef4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.532 183079 DEBUG oslo_concurrency.lockutils [None req-f75b5fe6-23f7-4661-86fc-9e8a3ac90a59 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "2bf3289b-0c4e-4286-80ca-c74eb06a8b96" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.985s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.533 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.533 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.1.232
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ea1cd914-64be-4fd0-b944-45368957fb5b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.550 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:14 compute-0 haproxy-metadata-proxy-ea1cd914-64be-4fd0-b944-45368957fb5b[218918]: 10.10.1.232:36492 [22/Jan/2026:17:11:14.533] listener listener/metadata 0/0/0/18/18 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.551 104990 INFO eventlet.wsgi.server [-] 10.10.1.232,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0172284
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.565 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.566 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.1.232
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ea1cd914-64be-4fd0-b944-45368957fb5b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.581 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.581 104990 INFO eventlet.wsgi.server [-] 10.10.1.232,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 165 time: 0.0151374
Jan 22 17:11:14 compute-0 haproxy-metadata-proxy-ea1cd914-64be-4fd0-b944-45368957fb5b[218918]: 10.10.1.232:36494 [22/Jan/2026:17:11:14.565] listener listener/metadata 0/0/0/16/16 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.597 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.598 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.1.232
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ea1cd914-64be-4fd0-b944-45368957fb5b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.615 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:14 compute-0 haproxy-metadata-proxy-ea1cd914-64be-4fd0-b944-45368957fb5b[218918]: 10.10.1.232:36502 [22/Jan/2026:17:11:14.597] listener listener/metadata 0/0/0/18/18 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.615 104990 INFO eventlet.wsgi.server [-] 10.10.1.232,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 165 time: 0.0171740
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.617 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[60793beb-e496-4e6e-9d7d-22c335a3139b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.618 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap83f76843-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.619 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.620 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap83f76843-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.627 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.628 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.1.232
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ea1cd914-64be-4fd0-b944-45368957fb5b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:14 compute-0 haproxy-metadata-proxy-ea1cd914-64be-4fd0-b944-45368957fb5b[218918]: 10.10.1.232:36512 [22/Jan/2026:17:11:14.627] listener listener/metadata 0/0/0/13/13 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.641 104990 INFO eventlet.wsgi.server [-] 10.10.1.232,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0125785
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.658 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:14 compute-0 kernel: tap83f76843-00: entered promiscuous mode
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.667 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:14 compute-0 NetworkManager[55454]: <info>  [1769101874.6684] manager: (tap83f76843-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/95)
Jan 22 17:11:14 compute-0 ovn_controller[95372]: 2026-01-22T17:11:14Z|00193|binding|INFO|Releasing lport 5bbc40c1-aff4-4dc7-9a6d-ceb120b7d682 from this chassis (sb_readonly=0)
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.668 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap83f76843-00, col_values=(('external_ids', {'iface-id': '5bbc40c1-aff4-4dc7-9a6d-ceb120b7d682'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.670 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.671 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.1.232
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ea1cd914-64be-4fd0-b944-45368957fb5b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.681 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.681 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/83f76843-09f1-4ec7-b234-50118063210a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/83f76843-09f1-4ec7-b234-50118063210a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.682 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[820d0657-4118-4274-80d4-e1eaf4d203eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.683 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-83f76843-09f1-4ec7-b234-50118063210a
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/83f76843-09f1-4ec7-b234-50118063210a.pid.haproxy
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 83f76843-09f1-4ec7-b234-50118063210a
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.683 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-83f76843-09f1-4ec7-b234-50118063210a', 'env', 'PROCESS_TAG=haproxy-83f76843-09f1-4ec7-b234-50118063210a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/83f76843-09f1-4ec7-b234-50118063210a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:11:14 compute-0 haproxy-metadata-proxy-ea1cd914-64be-4fd0-b944-45368957fb5b[218918]: 10.10.1.232:36516 [22/Jan/2026:17:11:14.670] listener listener/metadata 0/0/0/20/20 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.690 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.690 104990 INFO eventlet.wsgi.server [-] 10.10.1.232,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0192318
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.694 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.694 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.1.232
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ea1cd914-64be-4fd0-b944-45368957fb5b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.708 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.708 104990 INFO eventlet.wsgi.server [-] 10.10.1.232,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0136681
Jan 22 17:11:14 compute-0 haproxy-metadata-proxy-ea1cd914-64be-4fd0-b944-45368957fb5b[218918]: 10.10.1.232:36524 [22/Jan/2026:17:11:14.694] listener listener/metadata 0/0/0/14/14 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.712 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.713 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.1.232
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ea1cd914-64be-4fd0-b944-45368957fb5b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:14 compute-0 haproxy-metadata-proxy-ea1cd914-64be-4fd0-b944-45368957fb5b[218918]: 10.10.1.232:36536 [22/Jan/2026:17:11:14.712] listener listener/metadata 0/0/0/16/16 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.728 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.728 104990 INFO eventlet.wsgi.server [-] 10.10.1.232,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0153120
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.733 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.734 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.1.232
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ea1cd914-64be-4fd0-b944-45368957fb5b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.746 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.747 104990 INFO eventlet.wsgi.server [-] 10.10.1.232,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 165 time: 0.0126219
Jan 22 17:11:14 compute-0 haproxy-metadata-proxy-ea1cd914-64be-4fd0-b944-45368957fb5b[218918]: 10.10.1.232:36550 [22/Jan/2026:17:11:14.733] listener listener/metadata 0/0/0/13/13 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.751 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.752 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.1.232
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ea1cd914-64be-4fd0-b944-45368957fb5b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.761 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101874.7609878, d54ce6ac-7fff-4f20-a6e0-48c13efded58 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.762 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] VM Started (Lifecycle Event)
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.773 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:14 compute-0 haproxy-metadata-proxy-ea1cd914-64be-4fd0-b944-45368957fb5b[218918]: 10.10.1.232:36564 [22/Jan/2026:17:11:14.751] listener listener/metadata 0/0/0/22/22 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:11:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:14.774 104990 INFO eventlet.wsgi.server [-] 10.10.1.232,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0214553
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.790 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.795 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101874.7613547, d54ce6ac-7fff-4f20-a6e0-48c13efded58 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.795 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] VM Paused (Lifecycle Event)
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.819 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.822 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:11:14 compute-0 nova_compute[183075]: 2026-01-22 17:11:14.854 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:11:15 compute-0 podman[219423]: 2026-01-22 17:11:15.09832535 +0000 UTC m=+0.056322189 container create 13d088dd1087fe7421decb8a8c4e1ecee3d0b6ddf440c37422f8f2045d897c14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-83f76843-09f1-4ec7-b234-50118063210a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:11:15 compute-0 systemd[1]: Started libpod-conmon-13d088dd1087fe7421decb8a8c4e1ecee3d0b6ddf440c37422f8f2045d897c14.scope.
Jan 22 17:11:15 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:11:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f0130c894500ce812c42f46180af996b668b5c463d553ab9ae157b9d45606d2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:11:15 compute-0 podman[219423]: 2026-01-22 17:11:15.075344131 +0000 UTC m=+0.033340990 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.178 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:15 compute-0 podman[219423]: 2026-01-22 17:11:15.1884552 +0000 UTC m=+0.146452059 container init 13d088dd1087fe7421decb8a8c4e1ecee3d0b6ddf440c37422f8f2045d897c14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-83f76843-09f1-4ec7-b234-50118063210a, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:11:15 compute-0 podman[219423]: 2026-01-22 17:11:15.197001533 +0000 UTC m=+0.154998372 container start 13d088dd1087fe7421decb8a8c4e1ecee3d0b6ddf440c37422f8f2045d897c14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-83f76843-09f1-4ec7-b234-50118063210a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:11:15 compute-0 neutron-haproxy-ovnmeta-83f76843-09f1-4ec7-b234-50118063210a[219438]: [NOTICE]   (219442) : New worker (219444) forked
Jan 22 17:11:15 compute-0 neutron-haproxy-ovnmeta-83f76843-09f1-4ec7-b234-50118063210a[219438]: [NOTICE]   (219442) : Loading success.
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.326 183079 DEBUG nova.compute.manager [req-0cb9e7f1-1640-43c4-bf82-5fad0b88558e req-a7252d94-3119-40d3-8a27-b51b661be082 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Received event network-vif-plugged-9b370d66-b3f5-492d-b735-289048caa64f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.327 183079 DEBUG oslo_concurrency.lockutils [req-0cb9e7f1-1640-43c4-bf82-5fad0b88558e req-a7252d94-3119-40d3-8a27-b51b661be082 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "2bf3289b-0c4e-4286-80ca-c74eb06a8b96-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.327 183079 DEBUG oslo_concurrency.lockutils [req-0cb9e7f1-1640-43c4-bf82-5fad0b88558e req-a7252d94-3119-40d3-8a27-b51b661be082 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "2bf3289b-0c4e-4286-80ca-c74eb06a8b96-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.327 183079 DEBUG oslo_concurrency.lockutils [req-0cb9e7f1-1640-43c4-bf82-5fad0b88558e req-a7252d94-3119-40d3-8a27-b51b661be082 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "2bf3289b-0c4e-4286-80ca-c74eb06a8b96-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.328 183079 DEBUG nova.compute.manager [req-0cb9e7f1-1640-43c4-bf82-5fad0b88558e req-a7252d94-3119-40d3-8a27-b51b661be082 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] No waiting events found dispatching network-vif-plugged-9b370d66-b3f5-492d-b735-289048caa64f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.328 183079 WARNING nova.compute.manager [req-0cb9e7f1-1640-43c4-bf82-5fad0b88558e req-a7252d94-3119-40d3-8a27-b51b661be082 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Received unexpected event network-vif-plugged-9b370d66-b3f5-492d-b735-289048caa64f for instance with vm_state deleted and task_state None.
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.328 183079 DEBUG nova.compute.manager [req-0cb9e7f1-1640-43c4-bf82-5fad0b88558e req-a7252d94-3119-40d3-8a27-b51b661be082 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Received event network-vif-plugged-e1d81dc2-e73c-45fb-be4a-7192b576b628 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.328 183079 DEBUG oslo_concurrency.lockutils [req-0cb9e7f1-1640-43c4-bf82-5fad0b88558e req-a7252d94-3119-40d3-8a27-b51b661be082 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "d54ce6ac-7fff-4f20-a6e0-48c13efded58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.328 183079 DEBUG oslo_concurrency.lockutils [req-0cb9e7f1-1640-43c4-bf82-5fad0b88558e req-a7252d94-3119-40d3-8a27-b51b661be082 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "d54ce6ac-7fff-4f20-a6e0-48c13efded58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.329 183079 DEBUG oslo_concurrency.lockutils [req-0cb9e7f1-1640-43c4-bf82-5fad0b88558e req-a7252d94-3119-40d3-8a27-b51b661be082 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "d54ce6ac-7fff-4f20-a6e0-48c13efded58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.329 183079 DEBUG nova.compute.manager [req-0cb9e7f1-1640-43c4-bf82-5fad0b88558e req-a7252d94-3119-40d3-8a27-b51b661be082 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Processing event network-vif-plugged-e1d81dc2-e73c-45fb-be4a-7192b576b628 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.329 183079 DEBUG nova.compute.manager [req-0cb9e7f1-1640-43c4-bf82-5fad0b88558e req-a7252d94-3119-40d3-8a27-b51b661be082 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Received event network-vif-plugged-e1d81dc2-e73c-45fb-be4a-7192b576b628 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.329 183079 DEBUG oslo_concurrency.lockutils [req-0cb9e7f1-1640-43c4-bf82-5fad0b88558e req-a7252d94-3119-40d3-8a27-b51b661be082 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "d54ce6ac-7fff-4f20-a6e0-48c13efded58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.329 183079 DEBUG oslo_concurrency.lockutils [req-0cb9e7f1-1640-43c4-bf82-5fad0b88558e req-a7252d94-3119-40d3-8a27-b51b661be082 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "d54ce6ac-7fff-4f20-a6e0-48c13efded58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.330 183079 DEBUG oslo_concurrency.lockutils [req-0cb9e7f1-1640-43c4-bf82-5fad0b88558e req-a7252d94-3119-40d3-8a27-b51b661be082 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "d54ce6ac-7fff-4f20-a6e0-48c13efded58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.330 183079 DEBUG nova.compute.manager [req-0cb9e7f1-1640-43c4-bf82-5fad0b88558e req-a7252d94-3119-40d3-8a27-b51b661be082 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] No waiting events found dispatching network-vif-plugged-e1d81dc2-e73c-45fb-be4a-7192b576b628 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.330 183079 WARNING nova.compute.manager [req-0cb9e7f1-1640-43c4-bf82-5fad0b88558e req-a7252d94-3119-40d3-8a27-b51b661be082 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Received unexpected event network-vif-plugged-e1d81dc2-e73c-45fb-be4a-7192b576b628 for instance with vm_state building and task_state spawning.
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.331 183079 DEBUG nova.compute.manager [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.334 183079 DEBUG nova.virt.libvirt.driver [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.336 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101875.3361795, d54ce6ac-7fff-4f20-a6e0-48c13efded58 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.336 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] VM Resumed (Lifecycle Event)
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.338 183079 INFO nova.virt.libvirt.driver [-] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Instance spawned successfully.
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.338 183079 DEBUG nova.virt.libvirt.driver [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.362 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.365 183079 DEBUG nova.virt.libvirt.driver [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.365 183079 DEBUG nova.virt.libvirt.driver [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.366 183079 DEBUG nova.virt.libvirt.driver [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.366 183079 DEBUG nova.virt.libvirt.driver [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.366 183079 DEBUG nova.virt.libvirt.driver [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.367 183079 DEBUG nova.virt.libvirt.driver [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.370 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.409 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.442 183079 INFO nova.compute.manager [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Took 4.25 seconds to spawn the instance on the hypervisor.
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.443 183079 DEBUG nova.compute.manager [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.524 183079 INFO nova.compute.manager [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Took 4.80 seconds to build instance.
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.545 183079 DEBUG oslo_concurrency.lockutils [None req-a9b1b53e-3ac9-4bfa-9c98-64de327f58ec 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "d54ce6ac-7fff-4f20-a6e0-48c13efded58" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.885s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.756 183079 DEBUG oslo_concurrency.lockutils [None req-4273b044-cdf2-4596-b032-41fbbf7d39a3 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "618a8c78-4b30-4b4d-9617-4c54bfdd414e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.756 183079 DEBUG oslo_concurrency.lockutils [None req-4273b044-cdf2-4596-b032-41fbbf7d39a3 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "618a8c78-4b30-4b4d-9617-4c54bfdd414e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.757 183079 DEBUG oslo_concurrency.lockutils [None req-4273b044-cdf2-4596-b032-41fbbf7d39a3 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "618a8c78-4b30-4b4d-9617-4c54bfdd414e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.757 183079 DEBUG oslo_concurrency.lockutils [None req-4273b044-cdf2-4596-b032-41fbbf7d39a3 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "618a8c78-4b30-4b4d-9617-4c54bfdd414e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.757 183079 DEBUG oslo_concurrency.lockutils [None req-4273b044-cdf2-4596-b032-41fbbf7d39a3 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "618a8c78-4b30-4b4d-9617-4c54bfdd414e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.758 183079 INFO nova.compute.manager [None req-4273b044-cdf2-4596-b032-41fbbf7d39a3 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Terminating instance
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.760 183079 DEBUG nova.compute.manager [None req-4273b044-cdf2-4596-b032-41fbbf7d39a3 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:11:15 compute-0 kernel: tap1ea3e9d7-15 (unregistering): left promiscuous mode
Jan 22 17:11:15 compute-0 NetworkManager[55454]: <info>  [1769101875.7863] device (tap1ea3e9d7-15): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:11:15 compute-0 ovn_controller[95372]: 2026-01-22T17:11:15Z|00194|binding|INFO|Releasing lport 1ea3e9d7-15d1-4941-93be-4710d9a29763 from this chassis (sb_readonly=0)
Jan 22 17:11:15 compute-0 ovn_controller[95372]: 2026-01-22T17:11:15Z|00195|binding|INFO|Setting lport 1ea3e9d7-15d1-4941-93be-4710d9a29763 down in Southbound
Jan 22 17:11:15 compute-0 ovn_controller[95372]: 2026-01-22T17:11:15Z|00196|binding|INFO|Removing iface tap1ea3e9d7-15 ovn-installed in OVS
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.833 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.836 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:15.844 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ca:cc:e7 10.100.0.8'], port_security=['fa:16:3e:ca:cc:e7 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '618a8c78-4b30-4b4d-9617-4c54bfdd414e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-576f6598-999f-46d9-809a-65b7475a1ec7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e05c7aae349e4a1d859a387df45650a0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b9cc1ac9-85f4-4da7-891e-b0644711e574', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.185', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=92087573-6410-4f80-a195-ce8b2058420e, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=1ea3e9d7-15d1-4941-93be-4710d9a29763) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:11:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:15.847 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 1ea3e9d7-15d1-4941-93be-4710d9a29763 in datapath 576f6598-999f-46d9-809a-65b7475a1ec7 unbound from our chassis
Jan 22 17:11:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:15.851 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 576f6598-999f-46d9-809a-65b7475a1ec7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:11:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:15.852 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[56e99c79-4f7c-461d-bab6-fd1f28576a91]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:15.853 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7 namespace which is not needed anymore
Jan 22 17:11:15 compute-0 nova_compute[183075]: 2026-01-22 17:11:15.864 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:15 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Jan 22 17:11:15 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000d.scope: Consumed 14.795s CPU time.
Jan 22 17:11:15 compute-0 systemd-machined[154382]: Machine qemu-13-instance-0000000d terminated.
Jan 22 17:11:16 compute-0 nova_compute[183075]: 2026-01-22 17:11:16.023 183079 INFO nova.virt.libvirt.driver [-] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Instance destroyed successfully.
Jan 22 17:11:16 compute-0 nova_compute[183075]: 2026-01-22 17:11:16.024 183079 DEBUG nova.objects.instance [None req-4273b044-cdf2-4596-b032-41fbbf7d39a3 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lazy-loading 'resources' on Instance uuid 618a8c78-4b30-4b4d-9617-4c54bfdd414e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:11:16 compute-0 neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7[218264]: [NOTICE]   (218269) : haproxy version is 2.8.14-c23fe91
Jan 22 17:11:16 compute-0 neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7[218264]: [NOTICE]   (218269) : path to executable is /usr/sbin/haproxy
Jan 22 17:11:16 compute-0 neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7[218264]: [WARNING]  (218269) : Exiting Master process...
Jan 22 17:11:16 compute-0 neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7[218264]: [ALERT]    (218269) : Current worker (218271) exited with code 143 (Terminated)
Jan 22 17:11:16 compute-0 neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7[218264]: [WARNING]  (218269) : All workers exited. Exiting... (0)
Jan 22 17:11:16 compute-0 systemd[1]: libpod-3a21c7084bfab33bfc6021ace08f3f1dbd4dd8dc47e25071b33adb61bf86283f.scope: Deactivated successfully.
Jan 22 17:11:16 compute-0 nova_compute[183075]: 2026-01-22 17:11:16.044 183079 DEBUG nova.compute.manager [req-e31d4c23-46e9-4a59-be9e-730499d38bd9 req-04d5c95b-3683-4985-9784-77a2a50a402a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Received event network-vif-unplugged-1ea3e9d7-15d1-4941-93be-4710d9a29763 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:11:16 compute-0 nova_compute[183075]: 2026-01-22 17:11:16.044 183079 DEBUG oslo_concurrency.lockutils [req-e31d4c23-46e9-4a59-be9e-730499d38bd9 req-04d5c95b-3683-4985-9784-77a2a50a402a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "618a8c78-4b30-4b4d-9617-4c54bfdd414e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:16 compute-0 nova_compute[183075]: 2026-01-22 17:11:16.045 183079 DEBUG oslo_concurrency.lockutils [req-e31d4c23-46e9-4a59-be9e-730499d38bd9 req-04d5c95b-3683-4985-9784-77a2a50a402a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "618a8c78-4b30-4b4d-9617-4c54bfdd414e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:16 compute-0 nova_compute[183075]: 2026-01-22 17:11:16.045 183079 DEBUG oslo_concurrency.lockutils [req-e31d4c23-46e9-4a59-be9e-730499d38bd9 req-04d5c95b-3683-4985-9784-77a2a50a402a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "618a8c78-4b30-4b4d-9617-4c54bfdd414e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:16 compute-0 nova_compute[183075]: 2026-01-22 17:11:16.045 183079 DEBUG nova.compute.manager [req-e31d4c23-46e9-4a59-be9e-730499d38bd9 req-04d5c95b-3683-4985-9784-77a2a50a402a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] No waiting events found dispatching network-vif-unplugged-1ea3e9d7-15d1-4941-93be-4710d9a29763 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:11:16 compute-0 nova_compute[183075]: 2026-01-22 17:11:16.046 183079 DEBUG nova.compute.manager [req-e31d4c23-46e9-4a59-be9e-730499d38bd9 req-04d5c95b-3683-4985-9784-77a2a50a402a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Received event network-vif-unplugged-1ea3e9d7-15d1-4941-93be-4710d9a29763 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:11:16 compute-0 nova_compute[183075]: 2026-01-22 17:11:16.047 183079 DEBUG nova.virt.libvirt.vif [None req-4273b044-cdf2-4596-b032-41fbbf7d39a3 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:09:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-254681299',display_name='tempest-server-test-254681299',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-254681299',id=13,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFroJSJbEvCYe2sCwXkwL5KPVfwqqRPNqrWU5nnhv8U4G8wYWbEqk5AtRWXltmX7CAIPBST+sTwCKfymhALKd54XNN92D6KKnKgA6jmOihENah4FGfU80fx1UEE0osLpiw==',key_name='tempest-keypair-test-618317104',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:10:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e05c7aae349e4a1d859a387df45650a0',ramdisk_id='',reservation_id='r-nqvb2q9c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIpSeparateNetwork-931877966',owner_user_name='tempest-FloatingIpSeparateNetwork-931877966-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:10:05Z,user_data=None,user_id='cd47d63cff2548a88e21e5c2e6a5c161',uuid=618a8c78-4b30-4b4d-9617-4c54bfdd414e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1ea3e9d7-15d1-4941-93be-4710d9a29763", "address": "fa:16:3e:ca:cc:e7", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ea3e9d7-15", "ovs_interfaceid": "1ea3e9d7-15d1-4941-93be-4710d9a29763", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:11:16 compute-0 nova_compute[183075]: 2026-01-22 17:11:16.047 183079 DEBUG nova.network.os_vif_util [None req-4273b044-cdf2-4596-b032-41fbbf7d39a3 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converting VIF {"id": "1ea3e9d7-15d1-4941-93be-4710d9a29763", "address": "fa:16:3e:ca:cc:e7", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ea3e9d7-15", "ovs_interfaceid": "1ea3e9d7-15d1-4941-93be-4710d9a29763", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:11:16 compute-0 nova_compute[183075]: 2026-01-22 17:11:16.048 183079 DEBUG nova.network.os_vif_util [None req-4273b044-cdf2-4596-b032-41fbbf7d39a3 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ca:cc:e7,bridge_name='br-int',has_traffic_filtering=True,id=1ea3e9d7-15d1-4941-93be-4710d9a29763,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1ea3e9d7-15') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:11:16 compute-0 nova_compute[183075]: 2026-01-22 17:11:16.048 183079 DEBUG os_vif [None req-4273b044-cdf2-4596-b032-41fbbf7d39a3 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ca:cc:e7,bridge_name='br-int',has_traffic_filtering=True,id=1ea3e9d7-15d1-4941-93be-4710d9a29763,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1ea3e9d7-15') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:11:16 compute-0 nova_compute[183075]: 2026-01-22 17:11:16.049 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:16 compute-0 nova_compute[183075]: 2026-01-22 17:11:16.049 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1ea3e9d7-15, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:16 compute-0 podman[219477]: 2026-01-22 17:11:16.052886277 +0000 UTC m=+0.067098530 container died 3a21c7084bfab33bfc6021ace08f3f1dbd4dd8dc47e25071b33adb61bf86283f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:11:16 compute-0 nova_compute[183075]: 2026-01-22 17:11:16.053 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:16 compute-0 nova_compute[183075]: 2026-01-22 17:11:16.054 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:11:16 compute-0 nova_compute[183075]: 2026-01-22 17:11:16.060 183079 INFO os_vif [None req-4273b044-cdf2-4596-b032-41fbbf7d39a3 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ca:cc:e7,bridge_name='br-int',has_traffic_filtering=True,id=1ea3e9d7-15d1-4941-93be-4710d9a29763,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1ea3e9d7-15')
Jan 22 17:11:16 compute-0 nova_compute[183075]: 2026-01-22 17:11:16.061 183079 INFO nova.virt.libvirt.driver [None req-4273b044-cdf2-4596-b032-41fbbf7d39a3 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Deleting instance files /var/lib/nova/instances/618a8c78-4b30-4b4d-9617-4c54bfdd414e_del
Jan 22 17:11:16 compute-0 nova_compute[183075]: 2026-01-22 17:11:16.061 183079 INFO nova.virt.libvirt.driver [None req-4273b044-cdf2-4596-b032-41fbbf7d39a3 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Deletion of /var/lib/nova/instances/618a8c78-4b30-4b4d-9617-4c54bfdd414e_del complete
Jan 22 17:11:16 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3a21c7084bfab33bfc6021ace08f3f1dbd4dd8dc47e25071b33adb61bf86283f-userdata-shm.mount: Deactivated successfully.
Jan 22 17:11:16 compute-0 systemd[1]: var-lib-containers-storage-overlay-a5198b9617c24de85708d09a3bb8ad0b0d2b51b2a0f24d17911cec713817343d-merged.mount: Deactivated successfully.
Jan 22 17:11:16 compute-0 podman[219477]: 2026-01-22 17:11:16.096972017 +0000 UTC m=+0.111184250 container cleanup 3a21c7084bfab33bfc6021ace08f3f1dbd4dd8dc47e25071b33adb61bf86283f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:11:16 compute-0 systemd[1]: libpod-conmon-3a21c7084bfab33bfc6021ace08f3f1dbd4dd8dc47e25071b33adb61bf86283f.scope: Deactivated successfully.
Jan 22 17:11:16 compute-0 nova_compute[183075]: 2026-01-22 17:11:16.126 183079 INFO nova.compute.manager [None req-4273b044-cdf2-4596-b032-41fbbf7d39a3 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Took 0.37 seconds to destroy the instance on the hypervisor.
Jan 22 17:11:16 compute-0 nova_compute[183075]: 2026-01-22 17:11:16.126 183079 DEBUG oslo.service.loopingcall [None req-4273b044-cdf2-4596-b032-41fbbf7d39a3 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:11:16 compute-0 nova_compute[183075]: 2026-01-22 17:11:16.127 183079 DEBUG nova.compute.manager [-] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:11:16 compute-0 nova_compute[183075]: 2026-01-22 17:11:16.127 183079 DEBUG nova.network.neutron [-] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:11:16 compute-0 podman[219520]: 2026-01-22 17:11:16.172412364 +0000 UTC m=+0.050443936 container remove 3a21c7084bfab33bfc6021ace08f3f1dbd4dd8dc47e25071b33adb61bf86283f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 22 17:11:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:16.177 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8aac99e0-a88f-43a2-897a-05c2a69dcef5]: (4, ('Thu Jan 22 05:11:15 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7 (3a21c7084bfab33bfc6021ace08f3f1dbd4dd8dc47e25071b33adb61bf86283f)\n3a21c7084bfab33bfc6021ace08f3f1dbd4dd8dc47e25071b33adb61bf86283f\nThu Jan 22 05:11:16 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7 (3a21c7084bfab33bfc6021ace08f3f1dbd4dd8dc47e25071b33adb61bf86283f)\n3a21c7084bfab33bfc6021ace08f3f1dbd4dd8dc47e25071b33adb61bf86283f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:16.179 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[83c4c52b-432a-4202-96d9-77f6ca91c7d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:16.180 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap576f6598-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:16 compute-0 kernel: tap576f6598-90: left promiscuous mode
Jan 22 17:11:16 compute-0 nova_compute[183075]: 2026-01-22 17:11:16.185 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:16 compute-0 nova_compute[183075]: 2026-01-22 17:11:16.196 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:16.200 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[27935c2e-a389-4a1c-9808-362771c4ea9c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:16.217 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1a51aedb-fceb-4407-9256-c43ff85c8be3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:16.218 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c5c4bcbb-d91a-4d40-b031-9d5bdcf5757f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:16.232 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3ce60dfd-f4ef-420b-aa53-f8114192e606]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 416808, 'reachable_time': 26963, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219535, 'error': None, 'target': 'ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:16 compute-0 systemd[1]: run-netns-ovnmeta\x2d576f6598\x2d999f\x2d46d9\x2d809a\x2d65b7475a1ec7.mount: Deactivated successfully.
Jan 22 17:11:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:16.235 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:11:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:16.235 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[eac7f8ed-ee60-4951-a7ec-9e4fe9d95cec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:17 compute-0 nova_compute[183075]: 2026-01-22 17:11:17.023 183079 DEBUG nova.network.neutron [req-9fb15475-5693-4671-8b3a-765a56f29552 req-6535da28-04a2-41c6-9980-a1e991be1847 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Updated VIF entry in instance network info cache for port e1d81dc2-e73c-45fb-be4a-7192b576b628. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:11:17 compute-0 nova_compute[183075]: 2026-01-22 17:11:17.023 183079 DEBUG nova.network.neutron [req-9fb15475-5693-4671-8b3a-765a56f29552 req-6535da28-04a2-41c6-9980-a1e991be1847 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Updating instance_info_cache with network_info: [{"id": "e1d81dc2-e73c-45fb-be4a-7192b576b628", "address": "fa:16:3e:65:16:89", "network": {"id": "83f76843-09f1-4ec7-b234-50118063210a", "bridge": "br-int", "label": "tempest-test-network--33948725", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1d81dc2-e7", "ovs_interfaceid": "e1d81dc2-e73c-45fb-be4a-7192b576b628", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:11:17 compute-0 nova_compute[183075]: 2026-01-22 17:11:17.042 183079 DEBUG oslo_concurrency.lockutils [req-9fb15475-5693-4671-8b3a-765a56f29552 req-6535da28-04a2-41c6-9980-a1e991be1847 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-d54ce6ac-7fff-4f20-a6e0-48c13efded58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:11:17 compute-0 nova_compute[183075]: 2026-01-22 17:11:17.116 183079 INFO nova.compute.manager [None req-0602544d-3c0e-456d-8f42-047e34db5930 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Get console output
Jan 22 17:11:17 compute-0 nova_compute[183075]: 2026-01-22 17:11:17.124 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:11:18 compute-0 nova_compute[183075]: 2026-01-22 17:11:18.125 183079 INFO nova.compute.manager [None req-7ee83545-a75b-452e-b2b4-bb1ac1b6b800 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Get console output
Jan 22 17:11:18 compute-0 nova_compute[183075]: 2026-01-22 17:11:18.131 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:11:19 compute-0 ovn_controller[95372]: 2026-01-22T17:11:19Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f6:53:65 10.100.0.14
Jan 22 17:11:19 compute-0 ovn_controller[95372]: 2026-01-22T17:11:19Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f6:53:65 10.100.0.14
Jan 22 17:11:19 compute-0 nova_compute[183075]: 2026-01-22 17:11:19.249 183079 DEBUG nova.compute.manager [req-15a8fd1e-ec41-4e4c-9411-35d332fcba81 req-b9bd5d42-1b1c-416f-8512-914a7e92a2f7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Received event network-vif-plugged-1ea3e9d7-15d1-4941-93be-4710d9a29763 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:11:19 compute-0 nova_compute[183075]: 2026-01-22 17:11:19.249 183079 DEBUG oslo_concurrency.lockutils [req-15a8fd1e-ec41-4e4c-9411-35d332fcba81 req-b9bd5d42-1b1c-416f-8512-914a7e92a2f7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "618a8c78-4b30-4b4d-9617-4c54bfdd414e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:19 compute-0 nova_compute[183075]: 2026-01-22 17:11:19.250 183079 DEBUG oslo_concurrency.lockutils [req-15a8fd1e-ec41-4e4c-9411-35d332fcba81 req-b9bd5d42-1b1c-416f-8512-914a7e92a2f7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "618a8c78-4b30-4b4d-9617-4c54bfdd414e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:19 compute-0 nova_compute[183075]: 2026-01-22 17:11:19.251 183079 DEBUG oslo_concurrency.lockutils [req-15a8fd1e-ec41-4e4c-9411-35d332fcba81 req-b9bd5d42-1b1c-416f-8512-914a7e92a2f7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "618a8c78-4b30-4b4d-9617-4c54bfdd414e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:19 compute-0 nova_compute[183075]: 2026-01-22 17:11:19.251 183079 DEBUG nova.compute.manager [req-15a8fd1e-ec41-4e4c-9411-35d332fcba81 req-b9bd5d42-1b1c-416f-8512-914a7e92a2f7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] No waiting events found dispatching network-vif-plugged-1ea3e9d7-15d1-4941-93be-4710d9a29763 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:11:19 compute-0 nova_compute[183075]: 2026-01-22 17:11:19.251 183079 WARNING nova.compute.manager [req-15a8fd1e-ec41-4e4c-9411-35d332fcba81 req-b9bd5d42-1b1c-416f-8512-914a7e92a2f7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Received unexpected event network-vif-plugged-1ea3e9d7-15d1-4941-93be-4710d9a29763 for instance with vm_state active and task_state deleting.
Jan 22 17:11:19 compute-0 nova_compute[183075]: 2026-01-22 17:11:19.333 183079 INFO nova.compute.manager [None req-a8c0c06a-fd1b-474a-82ba-fd09e57dc29a 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Get console output
Jan 22 17:11:19 compute-0 nova_compute[183075]: 2026-01-22 17:11:19.338 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:11:19 compute-0 podman[219551]: 2026-01-22 17:11:19.356703543 +0000 UTC m=+0.070037647 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:11:20 compute-0 nova_compute[183075]: 2026-01-22 17:11:20.136 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:11:20 compute-0 nova_compute[183075]: 2026-01-22 17:11:20.190 183079 DEBUG nova.network.neutron [-] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:11:20 compute-0 nova_compute[183075]: 2026-01-22 17:11:20.193 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:20 compute-0 nova_compute[183075]: 2026-01-22 17:11:20.218 183079 INFO nova.compute.manager [-] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Took 4.09 seconds to deallocate network for instance.
Jan 22 17:11:20 compute-0 nova_compute[183075]: 2026-01-22 17:11:20.274 183079 DEBUG oslo_concurrency.lockutils [None req-4273b044-cdf2-4596-b032-41fbbf7d39a3 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:20 compute-0 nova_compute[183075]: 2026-01-22 17:11:20.275 183079 DEBUG oslo_concurrency.lockutils [None req-4273b044-cdf2-4596-b032-41fbbf7d39a3 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:20 compute-0 nova_compute[183075]: 2026-01-22 17:11:20.449 183079 DEBUG nova.compute.provider_tree [None req-4273b044-cdf2-4596-b032-41fbbf7d39a3 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:11:20 compute-0 nova_compute[183075]: 2026-01-22 17:11:20.484 183079 DEBUG nova.scheduler.client.report [None req-4273b044-cdf2-4596-b032-41fbbf7d39a3 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:11:20 compute-0 nova_compute[183075]: 2026-01-22 17:11:20.517 183079 DEBUG oslo_concurrency.lockutils [None req-4273b044-cdf2-4596-b032-41fbbf7d39a3 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.242s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:20 compute-0 nova_compute[183075]: 2026-01-22 17:11:20.545 183079 INFO nova.scheduler.client.report [None req-4273b044-cdf2-4596-b032-41fbbf7d39a3 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Deleted allocations for instance 618a8c78-4b30-4b4d-9617-4c54bfdd414e
Jan 22 17:11:20 compute-0 nova_compute[183075]: 2026-01-22 17:11:20.630 183079 DEBUG oslo_concurrency.lockutils [None req-4273b044-cdf2-4596-b032-41fbbf7d39a3 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "618a8c78-4b30-4b4d-9617-4c54bfdd414e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.874s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:21 compute-0 nova_compute[183075]: 2026-01-22 17:11:21.089 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:21 compute-0 nova_compute[183075]: 2026-01-22 17:11:21.321 183079 DEBUG oslo_concurrency.lockutils [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "e69f0100-85ca-4ff8-a177-27d35d4580de" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:21 compute-0 nova_compute[183075]: 2026-01-22 17:11:21.322 183079 DEBUG oslo_concurrency.lockutils [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "e69f0100-85ca-4ff8-a177-27d35d4580de" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:21 compute-0 nova_compute[183075]: 2026-01-22 17:11:21.337 183079 DEBUG nova.compute.manager [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:11:21 compute-0 nova_compute[183075]: 2026-01-22 17:11:21.398 183079 DEBUG oslo_concurrency.lockutils [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:21 compute-0 nova_compute[183075]: 2026-01-22 17:11:21.400 183079 DEBUG oslo_concurrency.lockutils [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:21 compute-0 nova_compute[183075]: 2026-01-22 17:11:21.409 183079 DEBUG nova.virt.hardware [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:11:21 compute-0 nova_compute[183075]: 2026-01-22 17:11:21.410 183079 INFO nova.compute.claims [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:11:21 compute-0 nova_compute[183075]: 2026-01-22 17:11:21.641 183079 DEBUG nova.compute.provider_tree [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:11:21 compute-0 nova_compute[183075]: 2026-01-22 17:11:21.655 183079 DEBUG nova.scheduler.client.report [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:11:21 compute-0 nova_compute[183075]: 2026-01-22 17:11:21.681 183079 DEBUG oslo_concurrency.lockutils [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.281s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:21 compute-0 nova_compute[183075]: 2026-01-22 17:11:21.683 183079 DEBUG nova.compute.manager [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:11:21 compute-0 nova_compute[183075]: 2026-01-22 17:11:21.734 183079 DEBUG nova.compute.manager [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:11:21 compute-0 nova_compute[183075]: 2026-01-22 17:11:21.735 183079 DEBUG nova.network.neutron [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:11:21 compute-0 nova_compute[183075]: 2026-01-22 17:11:21.763 183079 INFO nova.virt.libvirt.driver [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:11:21 compute-0 nova_compute[183075]: 2026-01-22 17:11:21.782 183079 DEBUG nova.compute.manager [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:11:21 compute-0 nova_compute[183075]: 2026-01-22 17:11:21.861 183079 DEBUG nova.compute.manager [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:11:21 compute-0 nova_compute[183075]: 2026-01-22 17:11:21.863 183079 DEBUG nova.virt.libvirt.driver [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:11:21 compute-0 nova_compute[183075]: 2026-01-22 17:11:21.863 183079 INFO nova.virt.libvirt.driver [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Creating image(s)
Jan 22 17:11:21 compute-0 nova_compute[183075]: 2026-01-22 17:11:21.864 183079 DEBUG oslo_concurrency.lockutils [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "/var/lib/nova/instances/e69f0100-85ca-4ff8-a177-27d35d4580de/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:21 compute-0 nova_compute[183075]: 2026-01-22 17:11:21.865 183079 DEBUG oslo_concurrency.lockutils [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "/var/lib/nova/instances/e69f0100-85ca-4ff8-a177-27d35d4580de/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:21 compute-0 nova_compute[183075]: 2026-01-22 17:11:21.866 183079 DEBUG oslo_concurrency.lockutils [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "/var/lib/nova/instances/e69f0100-85ca-4ff8-a177-27d35d4580de/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:21 compute-0 nova_compute[183075]: 2026-01-22 17:11:21.878 183079 DEBUG oslo_concurrency.processutils [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:11:21 compute-0 nova_compute[183075]: 2026-01-22 17:11:21.939 183079 DEBUG oslo_concurrency.processutils [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:11:21 compute-0 nova_compute[183075]: 2026-01-22 17:11:21.945 183079 DEBUG oslo_concurrency.lockutils [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:21 compute-0 nova_compute[183075]: 2026-01-22 17:11:21.947 183079 DEBUG oslo_concurrency.lockutils [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:21 compute-0 nova_compute[183075]: 2026-01-22 17:11:21.969 183079 DEBUG oslo_concurrency.processutils [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:11:22 compute-0 nova_compute[183075]: 2026-01-22 17:11:22.033 183079 DEBUG oslo_concurrency.processutils [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:11:22 compute-0 nova_compute[183075]: 2026-01-22 17:11:22.035 183079 DEBUG oslo_concurrency.processutils [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/e69f0100-85ca-4ff8-a177-27d35d4580de/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:11:22 compute-0 nova_compute[183075]: 2026-01-22 17:11:22.076 183079 DEBUG oslo_concurrency.processutils [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/e69f0100-85ca-4ff8-a177-27d35d4580de/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:11:22 compute-0 nova_compute[183075]: 2026-01-22 17:11:22.079 183079 DEBUG oslo_concurrency.lockutils [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:22 compute-0 nova_compute[183075]: 2026-01-22 17:11:22.080 183079 DEBUG oslo_concurrency.processutils [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:11:22 compute-0 nova_compute[183075]: 2026-01-22 17:11:22.145 183079 DEBUG oslo_concurrency.processutils [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:11:22 compute-0 nova_compute[183075]: 2026-01-22 17:11:22.148 183079 DEBUG nova.virt.disk.api [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Checking if we can resize image /var/lib/nova/instances/e69f0100-85ca-4ff8-a177-27d35d4580de/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:11:22 compute-0 nova_compute[183075]: 2026-01-22 17:11:22.149 183079 DEBUG oslo_concurrency.processutils [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e69f0100-85ca-4ff8-a177-27d35d4580de/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:11:22 compute-0 nova_compute[183075]: 2026-01-22 17:11:22.215 183079 DEBUG oslo_concurrency.processutils [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e69f0100-85ca-4ff8-a177-27d35d4580de/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:11:22 compute-0 nova_compute[183075]: 2026-01-22 17:11:22.218 183079 DEBUG nova.virt.disk.api [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Cannot resize image /var/lib/nova/instances/e69f0100-85ca-4ff8-a177-27d35d4580de/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:11:22 compute-0 nova_compute[183075]: 2026-01-22 17:11:22.219 183079 DEBUG nova.objects.instance [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lazy-loading 'migration_context' on Instance uuid e69f0100-85ca-4ff8-a177-27d35d4580de obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:11:22 compute-0 nova_compute[183075]: 2026-01-22 17:11:22.235 183079 DEBUG nova.virt.libvirt.driver [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:11:22 compute-0 nova_compute[183075]: 2026-01-22 17:11:22.235 183079 DEBUG nova.virt.libvirt.driver [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Ensure instance console log exists: /var/lib/nova/instances/e69f0100-85ca-4ff8-a177-27d35d4580de/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:11:22 compute-0 nova_compute[183075]: 2026-01-22 17:11:22.236 183079 DEBUG oslo_concurrency.lockutils [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:22 compute-0 nova_compute[183075]: 2026-01-22 17:11:22.236 183079 DEBUG oslo_concurrency.lockutils [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:22 compute-0 nova_compute[183075]: 2026-01-22 17:11:22.237 183079 DEBUG oslo_concurrency.lockutils [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:22 compute-0 nova_compute[183075]: 2026-01-22 17:11:22.608 183079 DEBUG nova.policy [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:11:23 compute-0 nova_compute[183075]: 2026-01-22 17:11:23.245 183079 INFO nova.compute.manager [None req-d6eb710f-4372-4b87-8724-cfa6adbd7e93 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Get console output
Jan 22 17:11:23 compute-0 nova_compute[183075]: 2026-01-22 17:11:23.821 183079 DEBUG nova.network.neutron [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Successfully updated port: 0e3bc449-87f9-4d63-9fee-5ac925d686c4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:11:23 compute-0 nova_compute[183075]: 2026-01-22 17:11:23.843 183079 DEBUG oslo_concurrency.lockutils [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "refresh_cache-e69f0100-85ca-4ff8-a177-27d35d4580de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:11:23 compute-0 nova_compute[183075]: 2026-01-22 17:11:23.843 183079 DEBUG oslo_concurrency.lockutils [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquired lock "refresh_cache-e69f0100-85ca-4ff8-a177-27d35d4580de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:11:23 compute-0 nova_compute[183075]: 2026-01-22 17:11:23.843 183079 DEBUG nova.network.neutron [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:11:23 compute-0 nova_compute[183075]: 2026-01-22 17:11:23.942 183079 DEBUG nova.compute.manager [req-a2eeff99-5a48-4f70-aec9-3ec67fa6e612 req-9ee30527-98cc-44b7-a0f1-0a6a4d1601c2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Received event network-changed-0e3bc449-87f9-4d63-9fee-5ac925d686c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:11:23 compute-0 nova_compute[183075]: 2026-01-22 17:11:23.942 183079 DEBUG nova.compute.manager [req-a2eeff99-5a48-4f70-aec9-3ec67fa6e612 req-9ee30527-98cc-44b7-a0f1-0a6a4d1601c2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Refreshing instance network info cache due to event network-changed-0e3bc449-87f9-4d63-9fee-5ac925d686c4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:11:23 compute-0 nova_compute[183075]: 2026-01-22 17:11:23.943 183079 DEBUG oslo_concurrency.lockutils [req-a2eeff99-5a48-4f70-aec9-3ec67fa6e612 req-9ee30527-98cc-44b7-a0f1-0a6a4d1601c2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-e69f0100-85ca-4ff8-a177-27d35d4580de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:11:24 compute-0 nova_compute[183075]: 2026-01-22 17:11:24.125 183079 DEBUG nova.network.neutron [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:11:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:24.531 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:24.534 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:11:24 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:24 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:24 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:24 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:24 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:24 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:11:24 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:24 compute-0 nova_compute[183075]: 2026-01-22 17:11:24.690 183079 INFO nova.compute.manager [None req-2c4e8be5-e454-4644-a75d-39fb5fd1d430 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Get console output
Jan 22 17:11:24 compute-0 nova_compute[183075]: 2026-01-22 17:11:24.699 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:11:24 compute-0 nova_compute[183075]: 2026-01-22 17:11:24.974 183079 DEBUG nova.network.neutron [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Updating instance_info_cache with network_info: [{"id": "0e3bc449-87f9-4d63-9fee-5ac925d686c4", "address": "fa:16:3e:22:e2:d4", "network": {"id": "3d23d5e4-bd70-4266-8b97-203b9af8d4ef", "bridge": "br-int", "label": "tempest-test-network--2043252313", "subnets": [{"cidr": "10.10.2.0/24", "dns": [], "gateway": {"address": "10.10.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.2.65", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e3bc449-87", "ovs_interfaceid": "0e3bc449-87f9-4d63-9fee-5ac925d686c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.001 183079 DEBUG oslo_concurrency.lockutils [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Releasing lock "refresh_cache-e69f0100-85ca-4ff8-a177-27d35d4580de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.001 183079 DEBUG nova.compute.manager [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Instance network_info: |[{"id": "0e3bc449-87f9-4d63-9fee-5ac925d686c4", "address": "fa:16:3e:22:e2:d4", "network": {"id": "3d23d5e4-bd70-4266-8b97-203b9af8d4ef", "bridge": "br-int", "label": "tempest-test-network--2043252313", "subnets": [{"cidr": "10.10.2.0/24", "dns": [], "gateway": {"address": "10.10.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.2.65", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e3bc449-87", "ovs_interfaceid": "0e3bc449-87f9-4d63-9fee-5ac925d686c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.002 183079 DEBUG oslo_concurrency.lockutils [req-a2eeff99-5a48-4f70-aec9-3ec67fa6e612 req-9ee30527-98cc-44b7-a0f1-0a6a4d1601c2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-e69f0100-85ca-4ff8-a177-27d35d4580de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.002 183079 DEBUG nova.network.neutron [req-a2eeff99-5a48-4f70-aec9-3ec67fa6e612 req-9ee30527-98cc-44b7-a0f1-0a6a4d1601c2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Refreshing network info cache for port 0e3bc449-87f9-4d63-9fee-5ac925d686c4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.005 183079 DEBUG nova.virt.libvirt.driver [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Start _get_guest_xml network_info=[{"id": "0e3bc449-87f9-4d63-9fee-5ac925d686c4", "address": "fa:16:3e:22:e2:d4", "network": {"id": "3d23d5e4-bd70-4266-8b97-203b9af8d4ef", "bridge": "br-int", "label": "tempest-test-network--2043252313", "subnets": [{"cidr": "10.10.2.0/24", "dns": [], "gateway": {"address": "10.10.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.2.65", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e3bc449-87", "ovs_interfaceid": "0e3bc449-87f9-4d63-9fee-5ac925d686c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.011 183079 WARNING nova.virt.libvirt.driver [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.019 183079 DEBUG nova.virt.libvirt.host [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.019 183079 DEBUG nova.virt.libvirt.host [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.028 183079 DEBUG nova.virt.libvirt.host [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.028 183079 DEBUG nova.virt.libvirt.host [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.029 183079 DEBUG nova.virt.libvirt.driver [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.029 183079 DEBUG nova.virt.hardware [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.030 183079 DEBUG nova.virt.hardware [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.030 183079 DEBUG nova.virt.hardware [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.030 183079 DEBUG nova.virt.hardware [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.031 183079 DEBUG nova.virt.hardware [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.031 183079 DEBUG nova.virt.hardware [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.031 183079 DEBUG nova.virt.hardware [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.032 183079 DEBUG nova.virt.hardware [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.032 183079 DEBUG nova.virt.hardware [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.032 183079 DEBUG nova.virt.hardware [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.032 183079 DEBUG nova.virt.hardware [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.037 183079 DEBUG nova.virt.libvirt.vif [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:11:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1314522391',display_name='tempest-server-test-1314522391',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1314522391',id=19,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFDbtc4hRCSv9gHmltB2G2DhAS2UXQKlSKw7wO8jzDWMIBVwAbtKYxxZ/33aOA3bpXNLm1KqC5K8xgOmfq2yl/h8qq6Gn/Z0l0pwwJ1+wDWZki4CnB3qyJDKSWstQshx5w==',key_name='tempest-keypair-test-612599565',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='26cca885d303443380036cbbe9e70744',ramdisk_id='',reservation_id='r-i7hk38t1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='
virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NetworkConnectivityTest-1809867331',owner_user_name='tempest-NetworkConnectivityTest-1809867331-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:11:21Z,user_data=None,user_id='4a7542774b9c42618cf9d00113f9d23d',uuid=e69f0100-85ca-4ff8-a177-27d35d4580de,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0e3bc449-87f9-4d63-9fee-5ac925d686c4", "address": "fa:16:3e:22:e2:d4", "network": {"id": "3d23d5e4-bd70-4266-8b97-203b9af8d4ef", "bridge": "br-int", "label": "tempest-test-network--2043252313", "subnets": [{"cidr": "10.10.2.0/24", "dns": [], "gateway": {"address": "10.10.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.2.65", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e3bc449-87", "ovs_interfaceid": "0e3bc449-87f9-4d63-9fee-5ac925d686c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.038 183079 DEBUG nova.network.os_vif_util [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Converting VIF {"id": "0e3bc449-87f9-4d63-9fee-5ac925d686c4", "address": "fa:16:3e:22:e2:d4", "network": {"id": "3d23d5e4-bd70-4266-8b97-203b9af8d4ef", "bridge": "br-int", "label": "tempest-test-network--2043252313", "subnets": [{"cidr": "10.10.2.0/24", "dns": [], "gateway": {"address": "10.10.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.2.65", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e3bc449-87", "ovs_interfaceid": "0e3bc449-87f9-4d63-9fee-5ac925d686c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.039 183079 DEBUG nova.network.os_vif_util [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:e2:d4,bridge_name='br-int',has_traffic_filtering=True,id=0e3bc449-87f9-4d63-9fee-5ac925d686c4,network=Network(3d23d5e4-bd70-4266-8b97-203b9af8d4ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0e3bc449-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.040 183079 DEBUG nova.objects.instance [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lazy-loading 'pci_devices' on Instance uuid e69f0100-85ca-4ff8-a177-27d35d4580de obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.052 183079 DEBUG nova.virt.libvirt.driver [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:11:25 compute-0 nova_compute[183075]:   <uuid>e69f0100-85ca-4ff8-a177-27d35d4580de</uuid>
Jan 22 17:11:25 compute-0 nova_compute[183075]:   <name>instance-00000013</name>
Jan 22 17:11:25 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:11:25 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:11:25 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:11:25 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1314522391</nova:name>
Jan 22 17:11:25 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:11:25</nova:creationTime>
Jan 22 17:11:25 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:11:25 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:11:25 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:11:25 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:11:25 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:11:25 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:11:25 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:11:25 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:11:25 compute-0 nova_compute[183075]:         <nova:user uuid="4a7542774b9c42618cf9d00113f9d23d">tempest-NetworkConnectivityTest-1809867331-project-member</nova:user>
Jan 22 17:11:25 compute-0 nova_compute[183075]:         <nova:project uuid="26cca885d303443380036cbbe9e70744">tempest-NetworkConnectivityTest-1809867331</nova:project>
Jan 22 17:11:25 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:11:25 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:11:25 compute-0 nova_compute[183075]:         <nova:port uuid="0e3bc449-87f9-4d63-9fee-5ac925d686c4">
Jan 22 17:11:25 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.10.2.65" ipVersion="4"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:11:25 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:11:25 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:11:25 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <system>
Jan 22 17:11:25 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:11:25 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:11:25 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:11:25 compute-0 nova_compute[183075]:       <entry name="serial">e69f0100-85ca-4ff8-a177-27d35d4580de</entry>
Jan 22 17:11:25 compute-0 nova_compute[183075]:       <entry name="uuid">e69f0100-85ca-4ff8-a177-27d35d4580de</entry>
Jan 22 17:11:25 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     </system>
Jan 22 17:11:25 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:11:25 compute-0 nova_compute[183075]:   <os>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:   </os>
Jan 22 17:11:25 compute-0 nova_compute[183075]:   <features>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:   </features>
Jan 22 17:11:25 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:11:25 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:11:25 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:11:25 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/e69f0100-85ca-4ff8-a177-27d35d4580de/disk"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:11:25 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:22:e2:d4"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:       <target dev="tap0e3bc449-87"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:11:25 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/e69f0100-85ca-4ff8-a177-27d35d4580de/console.log" append="off"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <video>
Jan 22 17:11:25 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     </video>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:11:25 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:11:25 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:11:25 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:11:25 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:11:25 compute-0 nova_compute[183075]: </domain>
Jan 22 17:11:25 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.054 183079 DEBUG nova.compute.manager [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Preparing to wait for external event network-vif-plugged-0e3bc449-87f9-4d63-9fee-5ac925d686c4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.054 183079 DEBUG oslo_concurrency.lockutils [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "e69f0100-85ca-4ff8-a177-27d35d4580de-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.054 183079 DEBUG oslo_concurrency.lockutils [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "e69f0100-85ca-4ff8-a177-27d35d4580de-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.054 183079 DEBUG oslo_concurrency.lockutils [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "e69f0100-85ca-4ff8-a177-27d35d4580de-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.055 183079 DEBUG nova.virt.libvirt.vif [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:11:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1314522391',display_name='tempest-server-test-1314522391',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1314522391',id=19,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFDbtc4hRCSv9gHmltB2G2DhAS2UXQKlSKw7wO8jzDWMIBVwAbtKYxxZ/33aOA3bpXNLm1KqC5K8xgOmfq2yl/h8qq6Gn/Z0l0pwwJ1+wDWZki4CnB3qyJDKSWstQshx5w==',key_name='tempest-keypair-test-612599565',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='26cca885d303443380036cbbe9e70744',ramdisk_id='',reservation_id='r-i7hk38t1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_r
ng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NetworkConnectivityTest-1809867331',owner_user_name='tempest-NetworkConnectivityTest-1809867331-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:11:21Z,user_data=None,user_id='4a7542774b9c42618cf9d00113f9d23d',uuid=e69f0100-85ca-4ff8-a177-27d35d4580de,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0e3bc449-87f9-4d63-9fee-5ac925d686c4", "address": "fa:16:3e:22:e2:d4", "network": {"id": "3d23d5e4-bd70-4266-8b97-203b9af8d4ef", "bridge": "br-int", "label": "tempest-test-network--2043252313", "subnets": [{"cidr": "10.10.2.0/24", "dns": [], "gateway": {"address": "10.10.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.2.65", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e3bc449-87", "ovs_interfaceid": "0e3bc449-87f9-4d63-9fee-5ac925d686c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.055 183079 DEBUG nova.network.os_vif_util [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Converting VIF {"id": "0e3bc449-87f9-4d63-9fee-5ac925d686c4", "address": "fa:16:3e:22:e2:d4", "network": {"id": "3d23d5e4-bd70-4266-8b97-203b9af8d4ef", "bridge": "br-int", "label": "tempest-test-network--2043252313", "subnets": [{"cidr": "10.10.2.0/24", "dns": [], "gateway": {"address": "10.10.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.2.65", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e3bc449-87", "ovs_interfaceid": "0e3bc449-87f9-4d63-9fee-5ac925d686c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.056 183079 DEBUG nova.network.os_vif_util [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:e2:d4,bridge_name='br-int',has_traffic_filtering=True,id=0e3bc449-87f9-4d63-9fee-5ac925d686c4,network=Network(3d23d5e4-bd70-4266-8b97-203b9af8d4ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0e3bc449-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.056 183079 DEBUG os_vif [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:e2:d4,bridge_name='br-int',has_traffic_filtering=True,id=0e3bc449-87f9-4d63-9fee-5ac925d686c4,network=Network(3d23d5e4-bd70-4266-8b97-203b9af8d4ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0e3bc449-87') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.057 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.057 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.057 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.059 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.059 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0e3bc449-87, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.060 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0e3bc449-87, col_values=(('external_ids', {'iface-id': '0e3bc449-87f9-4d63-9fee-5ac925d686c4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:22:e2:d4', 'vm-uuid': 'e69f0100-85ca-4ff8-a177-27d35d4580de'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.098 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:25 compute-0 NetworkManager[55454]: <info>  [1769101885.1004] manager: (tap0e3bc449-87): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/96)
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.101 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.109 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.110 183079 INFO os_vif [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:e2:d4,bridge_name='br-int',has_traffic_filtering=True,id=0e3bc449-87f9-4d63-9fee-5ac925d686c4,network=Network(3d23d5e4-bd70-4266-8b97-203b9af8d4ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0e3bc449-87')
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.169 183079 DEBUG nova.virt.libvirt.driver [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.169 183079 DEBUG nova.virt.libvirt.driver [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] No VIF found with MAC fa:16:3e:22:e2:d4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.184 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.193 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:25 compute-0 haproxy-metadata-proxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[219175]: 10.100.0.14:58978 [22/Jan/2026:17:11:24.529] listener listener/metadata 0/0/0/664/664 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.194 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.6604924
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.202 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.203 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.221 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:25 compute-0 haproxy-metadata-proxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[219175]: 10.100.0.14:58992 [22/Jan/2026:17:11:25.201] listener listener/metadata 0/0/0/19/19 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.222 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0190039
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.228 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.238 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:25 compute-0 NetworkManager[55454]: <info>  [1769101885.2468] manager: (tap0e3bc449-87): new Tun device (/org/freedesktop/NetworkManager/Devices/97)
Jan 22 17:11:25 compute-0 kernel: tap0e3bc449-87: entered promiscuous mode
Jan 22 17:11:25 compute-0 ovn_controller[95372]: 2026-01-22T17:11:25Z|00197|binding|INFO|Claiming lport 0e3bc449-87f9-4d63-9fee-5ac925d686c4 for this chassis.
Jan 22 17:11:25 compute-0 ovn_controller[95372]: 2026-01-22T17:11:25Z|00198|binding|INFO|0e3bc449-87f9-4d63-9fee-5ac925d686c4: Claiming fa:16:3e:22:e2:d4 10.10.2.65
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.265 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:25 compute-0 haproxy-metadata-proxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[219175]: 10.100.0.14:59002 [22/Jan/2026:17:11:25.227] listener listener/metadata 0/0/0/39/39 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.266 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.267 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0288954
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.270 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:e2:d4 10.10.2.65'], port_security=['fa:16:3e:22:e2:d4 10.10.2.65'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.10.2.65/24', 'neutron:device_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3d23d5e4-bd70-4266-8b97-203b9af8d4ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '26cca885d303443380036cbbe9e70744', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9427a67d-1313-4d60-b73e-5a3f81f9a54d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=33071058-a726-4eee-b55a-420f0eebe73b, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=0e3bc449-87f9-4d63-9fee-5ac925d686c4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.271 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 0e3bc449-87f9-4d63-9fee-5ac925d686c4 in datapath 3d23d5e4-bd70-4266-8b97-203b9af8d4ef bound to our chassis
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.275 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3d23d5e4-bd70-4266-8b97-203b9af8d4ef
Jan 22 17:11:25 compute-0 ovn_controller[95372]: 2026-01-22T17:11:25Z|00199|binding|INFO|Setting lport 0e3bc449-87f9-4d63-9fee-5ac925d686c4 ovn-installed in OVS
Jan 22 17:11:25 compute-0 ovn_controller[95372]: 2026-01-22T17:11:25Z|00200|binding|INFO|Setting lport 0e3bc449-87f9-4d63-9fee-5ac925d686c4 up in Southbound
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.278 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.282 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.286 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.287 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:25 compute-0 systemd-machined[154382]: New machine qemu-19-instance-00000013.
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.291 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[409428ae-db0e-4d94-85de-86e2e5e1348d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.293 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3d23d5e4-b1 in ovnmeta-3d23d5e4-bd70-4266-8b97-203b9af8d4ef namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.299 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3d23d5e4-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.300 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[50bd42b7-a0f1-4c4f-8de5-843faf9d4260]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.301 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[24acb09c-f230-4700-818d-356f62c51a5c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:25 compute-0 systemd[1]: Started Virtual Machine qemu-19-instance-00000013.
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.305 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.305 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0189848
Jan 22 17:11:25 compute-0 haproxy-metadata-proxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[219175]: 10.100.0.14:59018 [22/Jan/2026:17:11:25.272] listener listener/metadata 0/0/0/33/33 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.310 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.310 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.316 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[82032b33-bf33-4142-a44e-796bd67a46a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:25 compute-0 haproxy-metadata-proxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[219175]: 10.100.0.14:59034 [22/Jan/2026:17:11:25.309] listener listener/metadata 0/0/0/20/20 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:11:25 compute-0 systemd-udevd[219609]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.329 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.330 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0196760
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.337 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.336 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[9d05d0e6-46da-418c-ac88-c3572512b095]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.338 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:25 compute-0 NetworkManager[55454]: <info>  [1769101885.3513] device (tap0e3bc449-87): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:11:25 compute-0 NetworkManager[55454]: <info>  [1769101885.3524] device (tap0e3bc449-87): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.354 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.354 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0167177
Jan 22 17:11:25 compute-0 haproxy-metadata-proxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[219175]: 10.100.0.14:59050 [22/Jan/2026:17:11:25.336] listener listener/metadata 0/0/0/19/19 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.361 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.362 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.367 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[1440d191-f8fc-4476-9680-23e78a1c5797]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.379 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[078c971d-b363-4c13-9994-63202b11d516]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:25 compute-0 NetworkManager[55454]: <info>  [1769101885.3807] manager: (tap3d23d5e4-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/98)
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.381 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.381 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0199547
Jan 22 17:11:25 compute-0 haproxy-metadata-proxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[219175]: 10.100.0.14:59056 [22/Jan/2026:17:11:25.360] listener listener/metadata 0/0/0/21/21 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.388 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.388 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.412 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.412 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 151 time: 0.0239334
Jan 22 17:11:25 compute-0 haproxy-metadata-proxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[219175]: 10.100.0.14:59058 [22/Jan/2026:17:11:25.387] listener listener/metadata 0/0/0/25/25 200 135 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.420 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.421 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.426 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[1ccbdd8c-338c-4e98-9407-1760e4c7faa7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.430 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[7b440d15-2a01-48d7-bb3b-c7985e0c03eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:25 compute-0 haproxy-metadata-proxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[219175]: 10.100.0.14:59060 [22/Jan/2026:17:11:25.418] listener listener/metadata 0/0/0/26/26 200 148 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.444 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.444 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 164 time: 0.0232928
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.450 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.450 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:25 compute-0 NetworkManager[55454]: <info>  [1769101885.4606] device (tap3d23d5e4-b0): carrier: link connected
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.465 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[d81faf6d-c92f-4927-983a-624cec994101]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.472 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.473 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 164 time: 0.0229931
Jan 22 17:11:25 compute-0 haproxy-metadata-proxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[219175]: 10.100.0.14:59072 [22/Jan/2026:17:11:25.449] listener listener/metadata 0/0/0/24/24 200 148 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.480 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.480 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.488 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0a4c683e-762f-4b90-8258-d19694c0ab20]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3d23d5e4-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:1f:e5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424916, 'reachable_time': 42675, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219672, 'error': None, 'target': 'ovnmeta-3d23d5e4-bd70-4266-8b97-203b9af8d4ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:25 compute-0 haproxy-metadata-proxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[219175]: 10.100.0.14:59088 [22/Jan/2026:17:11:25.479] listener listener/metadata 0/0/0/32/32 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.511 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0310612
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.516 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f7bdf834-d338-41f3-abbb-4d242c709148]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe58:1fe5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 424916, 'tstamp': 424916}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219685, 'error': None, 'target': 'ovnmeta-3d23d5e4-bd70-4266-8b97-203b9af8d4ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.523 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.526 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.545 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[034acca2-786a-4176-a965-f329bff3d145]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3d23d5e4-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:1f:e5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424916, 'reachable_time': 42675, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219697, 'error': None, 'target': 'ovnmeta-3d23d5e4-bd70-4266-8b97-203b9af8d4ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:25 compute-0 podman[219626]: 2026-01-22 17:11:25.547933767 +0000 UTC m=+0.126465457 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 17:11:25 compute-0 podman[219638]: 2026-01-22 17:11:25.549280482 +0000 UTC m=+0.122624987 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, architecture=x86_64, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, distribution-scope=public, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 22 17:11:25 compute-0 haproxy-metadata-proxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[219175]: 10.100.0.14:59102 [22/Jan/2026:17:11:25.523] listener listener/metadata 0/0/0/28/28 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.550 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.551 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0250144
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.557 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.558 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:25 compute-0 podman[219639]: 2026-01-22 17:11:25.560583497 +0000 UTC m=+0.132844804 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true)
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.562 183079 DEBUG nova.compute.manager [req-8ba6bb3e-6082-4838-9f63-9574c8639825 req-5d98858b-aa82-4c52-8dab-4d36a12d9b16 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Received event network-vif-plugged-0e3bc449-87f9-4d63-9fee-5ac925d686c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.562 183079 DEBUG oslo_concurrency.lockutils [req-8ba6bb3e-6082-4838-9f63-9574c8639825 req-5d98858b-aa82-4c52-8dab-4d36a12d9b16 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "e69f0100-85ca-4ff8-a177-27d35d4580de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.562 183079 DEBUG oslo_concurrency.lockutils [req-8ba6bb3e-6082-4838-9f63-9574c8639825 req-5d98858b-aa82-4c52-8dab-4d36a12d9b16 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e69f0100-85ca-4ff8-a177-27d35d4580de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.563 183079 DEBUG oslo_concurrency.lockutils [req-8ba6bb3e-6082-4838-9f63-9574c8639825 req-5d98858b-aa82-4c52-8dab-4d36a12d9b16 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e69f0100-85ca-4ff8-a177-27d35d4580de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.563 183079 DEBUG nova.compute.manager [req-8ba6bb3e-6082-4838-9f63-9574c8639825 req-5d98858b-aa82-4c52-8dab-4d36a12d9b16 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Processing event network-vif-plugged-0e3bc449-87f9-4d63-9fee-5ac925d686c4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.575 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.575 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0174513
Jan 22 17:11:25 compute-0 haproxy-metadata-proxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[219175]: 10.100.0.14:59118 [22/Jan/2026:17:11:25.555] listener listener/metadata 0/0/0/19/19 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.580 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.580 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.597 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.597 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0170102
Jan 22 17:11:25 compute-0 haproxy-metadata-proxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[219175]: 10.100.0.14:59134 [22/Jan/2026:17:11:25.579] listener listener/metadata 0/0/0/18/18 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.602 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.602 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7b3fc432-8388-498e-9794-8e1fc58a14ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.603 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.616 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.617 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 164 time: 0.0139048
Jan 22 17:11:25 compute-0 haproxy-metadata-proxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[219175]: 10.100.0.14:59148 [22/Jan/2026:17:11:25.602] listener listener/metadata 0/0/0/15/15 200 148 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.622 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.622 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.638 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.638 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0161839
Jan 22 17:11:25 compute-0 haproxy-metadata-proxy-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[219175]: 10.100.0.14:59164 [22/Jan/2026:17:11:25.621] listener listener/metadata 0/0/0/17/17 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.687 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3c99a365-b948-43c2-badf-aef0c58d4d22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.688 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3d23d5e4-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.688 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.688 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3d23d5e4-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.691 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:25 compute-0 kernel: tap3d23d5e4-b0: entered promiscuous mode
Jan 22 17:11:25 compute-0 NetworkManager[55454]: <info>  [1769101885.6920] manager: (tap3d23d5e4-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/99)
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.695 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.700 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3d23d5e4-b0, col_values=(('external_ids', {'iface-id': 'd022c2ed-36a9-47fd-8947-7dd18cdeb285'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.701 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:25 compute-0 ovn_controller[95372]: 2026-01-22T17:11:25Z|00201|binding|INFO|Releasing lport d022c2ed-36a9-47fd-8947-7dd18cdeb285 from this chassis (sb_readonly=0)
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.714 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3d23d5e4-bd70-4266-8b97-203b9af8d4ef.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3d23d5e4-bd70-4266-8b97-203b9af8d4ef.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.714 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.715 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[84229902-4942-46c9-813f-c9b72460be39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.716 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-3d23d5e4-bd70-4266-8b97-203b9af8d4ef
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/3d23d5e4-bd70-4266-8b97-203b9af8d4ef.pid.haproxy
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 3d23d5e4-bd70-4266-8b97-203b9af8d4ef
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:11:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:25.716 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3d23d5e4-bd70-4266-8b97-203b9af8d4ef', 'env', 'PROCESS_TAG=haproxy-3d23d5e4-bd70-4266-8b97-203b9af8d4ef', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3d23d5e4-bd70-4266-8b97-203b9af8d4ef.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.961 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101885.960866, e69f0100-85ca-4ff8-a177-27d35d4580de => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.963 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] VM Started (Lifecycle Event)
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.965 183079 DEBUG nova.compute.manager [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.970 183079 DEBUG nova.virt.libvirt.driver [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.974 183079 INFO nova.virt.libvirt.driver [-] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Instance spawned successfully.
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.974 183079 DEBUG nova.virt.libvirt.driver [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:11:25 compute-0 nova_compute[183075]: 2026-01-22 17:11:25.996 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:11:26 compute-0 nova_compute[183075]: 2026-01-22 17:11:26.003 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:11:26 compute-0 nova_compute[183075]: 2026-01-22 17:11:26.006 183079 DEBUG nova.virt.libvirt.driver [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:11:26 compute-0 nova_compute[183075]: 2026-01-22 17:11:26.006 183079 DEBUG nova.virt.libvirt.driver [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:11:26 compute-0 nova_compute[183075]: 2026-01-22 17:11:26.007 183079 DEBUG nova.virt.libvirt.driver [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:11:26 compute-0 nova_compute[183075]: 2026-01-22 17:11:26.007 183079 DEBUG nova.virt.libvirt.driver [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:11:26 compute-0 nova_compute[183075]: 2026-01-22 17:11:26.008 183079 DEBUG nova.virt.libvirt.driver [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:11:26 compute-0 nova_compute[183075]: 2026-01-22 17:11:26.008 183079 DEBUG nova.virt.libvirt.driver [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:11:26 compute-0 nova_compute[183075]: 2026-01-22 17:11:26.031 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:11:26 compute-0 nova_compute[183075]: 2026-01-22 17:11:26.031 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101885.9620395, e69f0100-85ca-4ff8-a177-27d35d4580de => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:11:26 compute-0 nova_compute[183075]: 2026-01-22 17:11:26.032 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] VM Paused (Lifecycle Event)
Jan 22 17:11:26 compute-0 nova_compute[183075]: 2026-01-22 17:11:26.056 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:11:26 compute-0 nova_compute[183075]: 2026-01-22 17:11:26.063 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101885.9697108, e69f0100-85ca-4ff8-a177-27d35d4580de => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:11:26 compute-0 nova_compute[183075]: 2026-01-22 17:11:26.064 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] VM Resumed (Lifecycle Event)
Jan 22 17:11:26 compute-0 nova_compute[183075]: 2026-01-22 17:11:26.076 183079 INFO nova.compute.manager [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Took 4.21 seconds to spawn the instance on the hypervisor.
Jan 22 17:11:26 compute-0 nova_compute[183075]: 2026-01-22 17:11:26.077 183079 DEBUG nova.compute.manager [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:11:26 compute-0 nova_compute[183075]: 2026-01-22 17:11:26.085 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:11:26 compute-0 nova_compute[183075]: 2026-01-22 17:11:26.088 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:11:26 compute-0 nova_compute[183075]: 2026-01-22 17:11:26.110 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:11:26 compute-0 nova_compute[183075]: 2026-01-22 17:11:26.140 183079 DEBUG nova.network.neutron [req-a2eeff99-5a48-4f70-aec9-3ec67fa6e612 req-9ee30527-98cc-44b7-a0f1-0a6a4d1601c2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Updated VIF entry in instance network info cache for port 0e3bc449-87f9-4d63-9fee-5ac925d686c4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:11:26 compute-0 nova_compute[183075]: 2026-01-22 17:11:26.141 183079 DEBUG nova.network.neutron [req-a2eeff99-5a48-4f70-aec9-3ec67fa6e612 req-9ee30527-98cc-44b7-a0f1-0a6a4d1601c2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Updating instance_info_cache with network_info: [{"id": "0e3bc449-87f9-4d63-9fee-5ac925d686c4", "address": "fa:16:3e:22:e2:d4", "network": {"id": "3d23d5e4-bd70-4266-8b97-203b9af8d4ef", "bridge": "br-int", "label": "tempest-test-network--2043252313", "subnets": [{"cidr": "10.10.2.0/24", "dns": [], "gateway": {"address": "10.10.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.2.65", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e3bc449-87", "ovs_interfaceid": "0e3bc449-87f9-4d63-9fee-5ac925d686c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:11:26 compute-0 nova_compute[183075]: 2026-01-22 17:11:26.144 183079 INFO nova.compute.manager [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Took 4.76 seconds to build instance.
Jan 22 17:11:26 compute-0 nova_compute[183075]: 2026-01-22 17:11:26.160 183079 DEBUG oslo_concurrency.lockutils [req-a2eeff99-5a48-4f70-aec9-3ec67fa6e612 req-9ee30527-98cc-44b7-a0f1-0a6a4d1601c2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-e69f0100-85ca-4ff8-a177-27d35d4580de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:11:26 compute-0 nova_compute[183075]: 2026-01-22 17:11:26.161 183079 DEBUG oslo_concurrency.lockutils [None req-c06ff669-7ff5-44c9-8994-c0a6bef29f46 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "e69f0100-85ca-4ff8-a177-27d35d4580de" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.839s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:26 compute-0 podman[219740]: 2026-01-22 17:11:26.162999493 +0000 UTC m=+0.064036951 container create c35ae59c6888fa03b353884b930a2ae5dcdbde61665b112595820ed76cc4310d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d23d5e4-bd70-4266-8b97-203b9af8d4ef, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:11:26 compute-0 systemd[1]: Started libpod-conmon-c35ae59c6888fa03b353884b930a2ae5dcdbde61665b112595820ed76cc4310d.scope.
Jan 22 17:11:26 compute-0 podman[219740]: 2026-01-22 17:11:26.127884617 +0000 UTC m=+0.028922165 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:11:26 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:11:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ef46b5b029f8510f0162b5237ad7a6ce67e655f4fab8a923ff26c7a15c1618a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:11:26 compute-0 podman[219740]: 2026-01-22 17:11:26.269438608 +0000 UTC m=+0.170476086 container init c35ae59c6888fa03b353884b930a2ae5dcdbde61665b112595820ed76cc4310d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d23d5e4-bd70-4266-8b97-203b9af8d4ef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:11:26 compute-0 podman[219740]: 2026-01-22 17:11:26.279696395 +0000 UTC m=+0.180733853 container start c35ae59c6888fa03b353884b930a2ae5dcdbde61665b112595820ed76cc4310d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d23d5e4-bd70-4266-8b97-203b9af8d4ef, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 17:11:26 compute-0 neutron-haproxy-ovnmeta-3d23d5e4-bd70-4266-8b97-203b9af8d4ef[219755]: [NOTICE]   (219759) : New worker (219761) forked
Jan 22 17:11:26 compute-0 neutron-haproxy-ovnmeta-3d23d5e4-bd70-4266-8b97-203b9af8d4ef[219755]: [NOTICE]   (219759) : Loading success.
Jan 22 17:11:26 compute-0 nova_compute[183075]: 2026-01-22 17:11:26.643 183079 DEBUG oslo_concurrency.lockutils [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "936001bf-d51b-4243-87b8-e363ef3c47a8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:26 compute-0 nova_compute[183075]: 2026-01-22 17:11:26.643 183079 DEBUG oslo_concurrency.lockutils [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "936001bf-d51b-4243-87b8-e363ef3c47a8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:26 compute-0 nova_compute[183075]: 2026-01-22 17:11:26.660 183079 DEBUG nova.compute.manager [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:11:26 compute-0 nova_compute[183075]: 2026-01-22 17:11:26.741 183079 DEBUG oslo_concurrency.lockutils [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:26 compute-0 nova_compute[183075]: 2026-01-22 17:11:26.741 183079 DEBUG oslo_concurrency.lockutils [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:26 compute-0 nova_compute[183075]: 2026-01-22 17:11:26.750 183079 DEBUG nova.virt.hardware [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:11:26 compute-0 nova_compute[183075]: 2026-01-22 17:11:26.750 183079 INFO nova.compute.claims [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.008 183079 DEBUG nova.compute.provider_tree [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.029 183079 DEBUG nova.scheduler.client.report [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.053 183079 DEBUG oslo_concurrency.lockutils [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.312s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.054 183079 DEBUG nova.compute.manager [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.106 183079 DEBUG nova.compute.manager [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.107 183079 DEBUG nova.network.neutron [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.134 183079 INFO nova.virt.libvirt.driver [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.158 183079 DEBUG nova.compute.manager [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.277 183079 DEBUG nova.compute.manager [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.279 183079 DEBUG nova.virt.libvirt.driver [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.279 183079 INFO nova.virt.libvirt.driver [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Creating image(s)
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.280 183079 DEBUG oslo_concurrency.lockutils [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "/var/lib/nova/instances/936001bf-d51b-4243-87b8-e363ef3c47a8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.281 183079 DEBUG oslo_concurrency.lockutils [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "/var/lib/nova/instances/936001bf-d51b-4243-87b8-e363ef3c47a8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.282 183079 DEBUG oslo_concurrency.lockutils [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "/var/lib/nova/instances/936001bf-d51b-4243-87b8-e363ef3c47a8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.311 183079 DEBUG oslo_concurrency.processutils [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.387 183079 DEBUG oslo_concurrency.processutils [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.389 183079 DEBUG oslo_concurrency.lockutils [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.390 183079 DEBUG oslo_concurrency.lockutils [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.408 183079 DEBUG oslo_concurrency.processutils [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.494 183079 DEBUG oslo_concurrency.processutils [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.495 183079 DEBUG oslo_concurrency.processutils [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/936001bf-d51b-4243-87b8-e363ef3c47a8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.551 183079 DEBUG oslo_concurrency.processutils [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/936001bf-d51b-4243-87b8-e363ef3c47a8/disk 1073741824" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.552 183079 DEBUG oslo_concurrency.lockutils [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.553 183079 DEBUG oslo_concurrency.processutils [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.610 183079 DEBUG oslo_concurrency.processutils [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.611 183079 DEBUG nova.virt.disk.api [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Checking if we can resize image /var/lib/nova/instances/936001bf-d51b-4243-87b8-e363ef3c47a8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.612 183079 DEBUG oslo_concurrency.processutils [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/936001bf-d51b-4243-87b8-e363ef3c47a8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.672 183079 DEBUG oslo_concurrency.processutils [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/936001bf-d51b-4243-87b8-e363ef3c47a8/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.673 183079 DEBUG nova.virt.disk.api [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Cannot resize image /var/lib/nova/instances/936001bf-d51b-4243-87b8-e363ef3c47a8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.674 183079 DEBUG nova.objects.instance [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lazy-loading 'migration_context' on Instance uuid 936001bf-d51b-4243-87b8-e363ef3c47a8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.701 183079 DEBUG nova.virt.libvirt.driver [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.701 183079 DEBUG nova.virt.libvirt.driver [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Ensure instance console log exists: /var/lib/nova/instances/936001bf-d51b-4243-87b8-e363ef3c47a8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.701 183079 DEBUG oslo_concurrency.lockutils [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.702 183079 DEBUG oslo_concurrency.lockutils [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.702 183079 DEBUG oslo_concurrency.lockutils [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.799 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769101872.7977788, 2bf3289b-0c4e-4286-80ca-c74eb06a8b96 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.799 183079 INFO nova.compute.manager [-] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] VM Stopped (Lifecycle Event)
Jan 22 17:11:27 compute-0 nova_compute[183075]: 2026-01-22 17:11:27.817 183079 DEBUG nova.compute.manager [None req-eabcbfa0-2221-483a-b798-a412072b0851 - - - - - -] [instance: 2bf3289b-0c4e-4286-80ca-c74eb06a8b96] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:11:28 compute-0 nova_compute[183075]: 2026-01-22 17:11:28.028 183079 DEBUG nova.compute.manager [req-c4529e4e-60a0-41a9-b3a6-c2a1a93f4506 req-7e716aea-8509-4d33-93da-14e8d4003fce a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Received event network-vif-plugged-0e3bc449-87f9-4d63-9fee-5ac925d686c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:11:28 compute-0 nova_compute[183075]: 2026-01-22 17:11:28.028 183079 DEBUG oslo_concurrency.lockutils [req-c4529e4e-60a0-41a9-b3a6-c2a1a93f4506 req-7e716aea-8509-4d33-93da-14e8d4003fce a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "e69f0100-85ca-4ff8-a177-27d35d4580de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:28 compute-0 nova_compute[183075]: 2026-01-22 17:11:28.028 183079 DEBUG oslo_concurrency.lockutils [req-c4529e4e-60a0-41a9-b3a6-c2a1a93f4506 req-7e716aea-8509-4d33-93da-14e8d4003fce a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e69f0100-85ca-4ff8-a177-27d35d4580de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:28 compute-0 nova_compute[183075]: 2026-01-22 17:11:28.029 183079 DEBUG oslo_concurrency.lockutils [req-c4529e4e-60a0-41a9-b3a6-c2a1a93f4506 req-7e716aea-8509-4d33-93da-14e8d4003fce a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e69f0100-85ca-4ff8-a177-27d35d4580de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:28 compute-0 nova_compute[183075]: 2026-01-22 17:11:28.029 183079 DEBUG nova.compute.manager [req-c4529e4e-60a0-41a9-b3a6-c2a1a93f4506 req-7e716aea-8509-4d33-93da-14e8d4003fce a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] No waiting events found dispatching network-vif-plugged-0e3bc449-87f9-4d63-9fee-5ac925d686c4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:11:28 compute-0 nova_compute[183075]: 2026-01-22 17:11:28.029 183079 WARNING nova.compute.manager [req-c4529e4e-60a0-41a9-b3a6-c2a1a93f4506 req-7e716aea-8509-4d33-93da-14e8d4003fce a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Received unexpected event network-vif-plugged-0e3bc449-87f9-4d63-9fee-5ac925d686c4 for instance with vm_state active and task_state None.
Jan 22 17:11:28 compute-0 nova_compute[183075]: 2026-01-22 17:11:28.114 183079 DEBUG nova.policy [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:11:28 compute-0 nova_compute[183075]: 2026-01-22 17:11:28.226 183079 INFO nova.compute.manager [None req-9036fe0f-5d60-4d5b-a7d2-9428efe10590 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Get console output
Jan 22 17:11:28 compute-0 nova_compute[183075]: 2026-01-22 17:11:28.349 183079 INFO nova.compute.manager [None req-c6ed7aa1-bed1-4563-a781-cb754bb04af5 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Get console output
Jan 22 17:11:28 compute-0 ovn_controller[95372]: 2026-01-22T17:11:28Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:65:16:89 10.100.0.24
Jan 22 17:11:28 compute-0 ovn_controller[95372]: 2026-01-22T17:11:28Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:65:16:89 10.100.0.24
Jan 22 17:11:29 compute-0 nova_compute[183075]: 2026-01-22 17:11:29.838 183079 INFO nova.compute.manager [None req-0978e769-5745-4222-926a-29c33d4ecd12 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Get console output
Jan 22 17:11:29 compute-0 nova_compute[183075]: 2026-01-22 17:11:29.845 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:11:30 compute-0 nova_compute[183075]: 2026-01-22 17:11:30.099 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:30 compute-0 nova_compute[183075]: 2026-01-22 17:11:30.187 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:30 compute-0 nova_compute[183075]: 2026-01-22 17:11:30.238 183079 DEBUG nova.network.neutron [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Successfully updated port: 804a64f5-797f-4eba-ae49-100790171545 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:11:30 compute-0 nova_compute[183075]: 2026-01-22 17:11:30.256 183079 DEBUG oslo_concurrency.lockutils [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "refresh_cache-936001bf-d51b-4243-87b8-e363ef3c47a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:11:30 compute-0 nova_compute[183075]: 2026-01-22 17:11:30.257 183079 DEBUG oslo_concurrency.lockutils [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquired lock "refresh_cache-936001bf-d51b-4243-87b8-e363ef3c47a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:11:30 compute-0 nova_compute[183075]: 2026-01-22 17:11:30.257 183079 DEBUG nova.network.neutron [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:11:30 compute-0 nova_compute[183075]: 2026-01-22 17:11:30.346 183079 DEBUG nova.compute.manager [req-89cebc10-81b8-4eb7-99d4-839a45e36037 req-77fe56aa-ae9a-4cb7-8358-a2148bd37750 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Received event network-changed-804a64f5-797f-4eba-ae49-100790171545 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:11:30 compute-0 nova_compute[183075]: 2026-01-22 17:11:30.346 183079 DEBUG nova.compute.manager [req-89cebc10-81b8-4eb7-99d4-839a45e36037 req-77fe56aa-ae9a-4cb7-8358-a2148bd37750 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Refreshing instance network info cache due to event network-changed-804a64f5-797f-4eba-ae49-100790171545. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:11:30 compute-0 nova_compute[183075]: 2026-01-22 17:11:30.347 183079 DEBUG oslo_concurrency.lockutils [req-89cebc10-81b8-4eb7-99d4-839a45e36037 req-77fe56aa-ae9a-4cb7-8358-a2148bd37750 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-936001bf-d51b-4243-87b8-e363ef3c47a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:11:30 compute-0 nova_compute[183075]: 2026-01-22 17:11:30.949 183079 DEBUG nova.network.neutron [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:11:31 compute-0 nova_compute[183075]: 2026-01-22 17:11:31.022 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769101876.021128, 618a8c78-4b30-4b4d-9617-4c54bfdd414e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:11:31 compute-0 nova_compute[183075]: 2026-01-22 17:11:31.023 183079 INFO nova.compute.manager [-] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] VM Stopped (Lifecycle Event)
Jan 22 17:11:31 compute-0 nova_compute[183075]: 2026-01-22 17:11:31.050 183079 DEBUG nova.compute.manager [None req-81b421b1-1ba7-480c-a073-916e28193cc0 - - - - - -] [instance: 618a8c78-4b30-4b4d-9617-4c54bfdd414e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.134 183079 DEBUG nova.network.neutron [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Updating instance_info_cache with network_info: [{"id": "804a64f5-797f-4eba-ae49-100790171545", "address": "fa:16:3e:3a:03:2f", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap804a64f5-79", "ovs_interfaceid": "804a64f5-797f-4eba-ae49-100790171545", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.216 183079 DEBUG oslo_concurrency.lockutils [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Releasing lock "refresh_cache-936001bf-d51b-4243-87b8-e363ef3c47a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.216 183079 DEBUG nova.compute.manager [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Instance network_info: |[{"id": "804a64f5-797f-4eba-ae49-100790171545", "address": "fa:16:3e:3a:03:2f", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap804a64f5-79", "ovs_interfaceid": "804a64f5-797f-4eba-ae49-100790171545", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.217 183079 DEBUG oslo_concurrency.lockutils [req-89cebc10-81b8-4eb7-99d4-839a45e36037 req-77fe56aa-ae9a-4cb7-8358-a2148bd37750 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-936001bf-d51b-4243-87b8-e363ef3c47a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.217 183079 DEBUG nova.network.neutron [req-89cebc10-81b8-4eb7-99d4-839a45e36037 req-77fe56aa-ae9a-4cb7-8358-a2148bd37750 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Refreshing network info cache for port 804a64f5-797f-4eba-ae49-100790171545 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.220 183079 DEBUG nova.virt.libvirt.driver [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Start _get_guest_xml network_info=[{"id": "804a64f5-797f-4eba-ae49-100790171545", "address": "fa:16:3e:3a:03:2f", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap804a64f5-79", "ovs_interfaceid": "804a64f5-797f-4eba-ae49-100790171545", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.225 183079 WARNING nova.virt.libvirt.driver [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.230 183079 DEBUG nova.virt.libvirt.host [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.230 183079 DEBUG nova.virt.libvirt.host [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.233 183079 DEBUG nova.virt.libvirt.host [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.234 183079 DEBUG nova.virt.libvirt.host [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.235 183079 DEBUG nova.virt.libvirt.driver [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.235 183079 DEBUG nova.virt.hardware [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.236 183079 DEBUG nova.virt.hardware [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.236 183079 DEBUG nova.virt.hardware [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.236 183079 DEBUG nova.virt.hardware [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.237 183079 DEBUG nova.virt.hardware [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.237 183079 DEBUG nova.virt.hardware [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.237 183079 DEBUG nova.virt.hardware [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.237 183079 DEBUG nova.virt.hardware [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.237 183079 DEBUG nova.virt.hardware [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.238 183079 DEBUG nova.virt.hardware [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.238 183079 DEBUG nova.virt.hardware [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.242 183079 DEBUG nova.virt.libvirt.vif [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:11:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1891253532',display_name='tempest-server-test-1891253532',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1891253532',id=20,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFroJSJbEvCYe2sCwXkwL5KPVfwqqRPNqrWU5nnhv8U4G8wYWbEqk5AtRWXltmX7CAIPBST+sTwCKfymhALKd54XNN92D6KKnKgA6jmOihENah4FGfU80fx1UEE0osLpiw==',key_name='tempest-keypair-test-618317104',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e05c7aae349e4a1d859a387df45650a0',ramdisk_id='',reservation_id='r-io1lyazv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='
virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSeparateNetwork-931877966',owner_user_name='tempest-FloatingIpSeparateNetwork-931877966-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:11:27Z,user_data=None,user_id='cd47d63cff2548a88e21e5c2e6a5c161',uuid=936001bf-d51b-4243-87b8-e363ef3c47a8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "804a64f5-797f-4eba-ae49-100790171545", "address": "fa:16:3e:3a:03:2f", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap804a64f5-79", "ovs_interfaceid": "804a64f5-797f-4eba-ae49-100790171545", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.243 183079 DEBUG nova.network.os_vif_util [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converting VIF {"id": "804a64f5-797f-4eba-ae49-100790171545", "address": "fa:16:3e:3a:03:2f", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap804a64f5-79", "ovs_interfaceid": "804a64f5-797f-4eba-ae49-100790171545", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.244 183079 DEBUG nova.network.os_vif_util [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3a:03:2f,bridge_name='br-int',has_traffic_filtering=True,id=804a64f5-797f-4eba-ae49-100790171545,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap804a64f5-79') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.245 183079 DEBUG nova.objects.instance [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 936001bf-d51b-4243-87b8-e363ef3c47a8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.262 183079 DEBUG nova.virt.libvirt.driver [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:11:32 compute-0 nova_compute[183075]:   <uuid>936001bf-d51b-4243-87b8-e363ef3c47a8</uuid>
Jan 22 17:11:32 compute-0 nova_compute[183075]:   <name>instance-00000014</name>
Jan 22 17:11:32 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:11:32 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:11:32 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:11:32 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1891253532</nova:name>
Jan 22 17:11:32 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:11:32</nova:creationTime>
Jan 22 17:11:32 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:11:32 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:11:32 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:11:32 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:11:32 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:11:32 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:11:32 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:11:32 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:11:32 compute-0 nova_compute[183075]:         <nova:user uuid="cd47d63cff2548a88e21e5c2e6a5c161">tempest-FloatingIpSeparateNetwork-931877966-project-member</nova:user>
Jan 22 17:11:32 compute-0 nova_compute[183075]:         <nova:project uuid="e05c7aae349e4a1d859a387df45650a0">tempest-FloatingIpSeparateNetwork-931877966</nova:project>
Jan 22 17:11:32 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:11:32 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:11:32 compute-0 nova_compute[183075]:         <nova:port uuid="804a64f5-797f-4eba-ae49-100790171545">
Jan 22 17:11:32 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:11:32 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:11:32 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:11:32 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <system>
Jan 22 17:11:32 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:11:32 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:11:32 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:11:32 compute-0 nova_compute[183075]:       <entry name="serial">936001bf-d51b-4243-87b8-e363ef3c47a8</entry>
Jan 22 17:11:32 compute-0 nova_compute[183075]:       <entry name="uuid">936001bf-d51b-4243-87b8-e363ef3c47a8</entry>
Jan 22 17:11:32 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     </system>
Jan 22 17:11:32 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:11:32 compute-0 nova_compute[183075]:   <os>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:   </os>
Jan 22 17:11:32 compute-0 nova_compute[183075]:   <features>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:   </features>
Jan 22 17:11:32 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:11:32 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:11:32 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:11:32 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/936001bf-d51b-4243-87b8-e363ef3c47a8/disk"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:11:32 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:3a:03:2f"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:       <target dev="tap804a64f5-79"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:11:32 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/936001bf-d51b-4243-87b8-e363ef3c47a8/console.log" append="off"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <video>
Jan 22 17:11:32 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     </video>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:11:32 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:11:32 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:11:32 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:11:32 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:11:32 compute-0 nova_compute[183075]: </domain>
Jan 22 17:11:32 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.263 183079 DEBUG nova.compute.manager [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Preparing to wait for external event network-vif-plugged-804a64f5-797f-4eba-ae49-100790171545 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.263 183079 DEBUG oslo_concurrency.lockutils [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "936001bf-d51b-4243-87b8-e363ef3c47a8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.263 183079 DEBUG oslo_concurrency.lockutils [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "936001bf-d51b-4243-87b8-e363ef3c47a8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.264 183079 DEBUG oslo_concurrency.lockutils [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "936001bf-d51b-4243-87b8-e363ef3c47a8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.265 183079 DEBUG nova.virt.libvirt.vif [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:11:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1891253532',display_name='tempest-server-test-1891253532',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1891253532',id=20,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFroJSJbEvCYe2sCwXkwL5KPVfwqqRPNqrWU5nnhv8U4G8wYWbEqk5AtRWXltmX7CAIPBST+sTwCKfymhALKd54XNN92D6KKnKgA6jmOihENah4FGfU80fx1UEE0osLpiw==',key_name='tempest-keypair-test-618317104',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e05c7aae349e4a1d859a387df45650a0',ramdisk_id='',reservation_id='r-io1lyazv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSeparateNetwork-931877966',owner_user_name='tempest-FloatingIpSeparateNetwork-931877966-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:11:27Z,user_data=None,user_id='cd47d63cff2548a88e21e5c2e6a5c161',uuid=936001bf-d51b-4243-87b8-e363ef3c47a8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "804a64f5-797f-4eba-ae49-100790171545", "address": "fa:16:3e:3a:03:2f", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap804a64f5-79", "ovs_interfaceid": "804a64f5-797f-4eba-ae49-100790171545", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.265 183079 DEBUG nova.network.os_vif_util [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converting VIF {"id": "804a64f5-797f-4eba-ae49-100790171545", "address": "fa:16:3e:3a:03:2f", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap804a64f5-79", "ovs_interfaceid": "804a64f5-797f-4eba-ae49-100790171545", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.266 183079 DEBUG nova.network.os_vif_util [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3a:03:2f,bridge_name='br-int',has_traffic_filtering=True,id=804a64f5-797f-4eba-ae49-100790171545,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap804a64f5-79') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.266 183079 DEBUG os_vif [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:03:2f,bridge_name='br-int',has_traffic_filtering=True,id=804a64f5-797f-4eba-ae49-100790171545,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap804a64f5-79') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.267 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.267 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.268 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.271 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.271 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap804a64f5-79, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.272 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap804a64f5-79, col_values=(('external_ids', {'iface-id': '804a64f5-797f-4eba-ae49-100790171545', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3a:03:2f', 'vm-uuid': '936001bf-d51b-4243-87b8-e363ef3c47a8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.273 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:32 compute-0 NetworkManager[55454]: <info>  [1769101892.2748] manager: (tap804a64f5-79): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/100)
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.277 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.280 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.281 183079 INFO os_vif [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:03:2f,bridge_name='br-int',has_traffic_filtering=True,id=804a64f5-797f-4eba-ae49-100790171545,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap804a64f5-79')
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.335 183079 DEBUG nova.virt.libvirt.driver [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.336 183079 DEBUG nova.virt.libvirt.driver [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] No VIF found with MAC fa:16:3e:3a:03:2f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:11:32 compute-0 kernel: tap804a64f5-79: entered promiscuous mode
Jan 22 17:11:32 compute-0 NetworkManager[55454]: <info>  [1769101892.4125] manager: (tap804a64f5-79): new Tun device (/org/freedesktop/NetworkManager/Devices/101)
Jan 22 17:11:32 compute-0 ovn_controller[95372]: 2026-01-22T17:11:32Z|00202|binding|INFO|Claiming lport 804a64f5-797f-4eba-ae49-100790171545 for this chassis.
Jan 22 17:11:32 compute-0 ovn_controller[95372]: 2026-01-22T17:11:32Z|00203|binding|INFO|804a64f5-797f-4eba-ae49-100790171545: Claiming fa:16:3e:3a:03:2f 10.100.0.9
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.417 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:32.426 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3a:03:2f 10.100.0.9'], port_security=['fa:16:3e:3a:03:2f 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '936001bf-d51b-4243-87b8-e363ef3c47a8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-576f6598-999f-46d9-809a-65b7475a1ec7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e05c7aae349e4a1d859a387df45650a0', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b9cc1ac9-85f4-4da7-891e-b0644711e574', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.196'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=92087573-6410-4f80-a195-ce8b2058420e, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=804a64f5-797f-4eba-ae49-100790171545) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:32.427 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 804a64f5-797f-4eba-ae49-100790171545 in datapath 576f6598-999f-46d9-809a-65b7475a1ec7 bound to our chassis
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:32.432 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 576f6598-999f-46d9-809a-65b7475a1ec7
Jan 22 17:11:32 compute-0 systemd-udevd[219834]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:11:32 compute-0 ovn_controller[95372]: 2026-01-22T17:11:32Z|00204|binding|INFO|Setting lport 804a64f5-797f-4eba-ae49-100790171545 ovn-installed in OVS
Jan 22 17:11:32 compute-0 ovn_controller[95372]: 2026-01-22T17:11:32Z|00205|binding|INFO|Setting lport 804a64f5-797f-4eba-ae49-100790171545 up in Southbound
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.451 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:32.452 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[cd51ed28-11ca-446c-8773-d85b709289b6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:32.454 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap576f6598-91 in ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.453 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:32.457 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap576f6598-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:32.457 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8e3dd2b4-5172-4bcc-9d38-88d3c6fe4b18]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:32.460 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b9d88e45-d2dd-4677-bc62-9faa8ca7147c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:32 compute-0 NetworkManager[55454]: <info>  [1769101892.4659] device (tap804a64f5-79): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:11:32 compute-0 NetworkManager[55454]: <info>  [1769101892.4669] device (tap804a64f5-79): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:11:32 compute-0 systemd-machined[154382]: New machine qemu-20-instance-00000014.
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:32.473 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[fb902996-dc7f-4e39-af0e-08dd964f5fa8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:32 compute-0 systemd[1]: Started Virtual Machine qemu-20-instance-00000014.
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:32.504 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[775c730a-49b2-4f0f-9233-3ed8e0d47da8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:32 compute-0 podman[219824]: 2026-01-22 17:11:32.527177298 +0000 UTC m=+0.106451097 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:32.538 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[c9bb9ce1-d000-40bd-a5a9-d051911ff604]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:32.544 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[cffdc55d-efc2-42e5-9c24-35520439055c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:32 compute-0 NetworkManager[55454]: <info>  [1769101892.5480] manager: (tap576f6598-90): new Veth device (/org/freedesktop/NetworkManager/Devices/102)
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:32.582 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[210f91f9-e988-4d17-971b-862f78c87b38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:32.585 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[18c21470-2605-4697-96fe-8a7c9add7b64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:32 compute-0 NetworkManager[55454]: <info>  [1769101892.6116] device (tap576f6598-90): carrier: link connected
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:32.618 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[af30de6d-97a4-403e-87e7-eb459ed45b87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:32.637 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[66b3af00-1ab0-46dc-a41d-0ba6c82538aa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap576f6598-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a1:fa:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425631, 'reachable_time': 19752, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219882, 'error': None, 'target': 'ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:32.651 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[908bf0dd-1ec8-4b71-8092-14090e62d1ff]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea1:facd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 425631, 'tstamp': 425631}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219883, 'error': None, 'target': 'ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:32.669 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[80b6cbd8-0f12-4545-8d71-7c3aae197f03]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap576f6598-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a1:fa:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425631, 'reachable_time': 19752, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219884, 'error': None, 'target': 'ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.676 183079 DEBUG nova.compute.manager [req-b42b21df-1ed6-41a1-9c41-77c85ae7ad64 req-39586034-2fba-4087-99c5-5c6e8356ffde a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Received event network-vif-plugged-804a64f5-797f-4eba-ae49-100790171545 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.678 183079 DEBUG oslo_concurrency.lockutils [req-b42b21df-1ed6-41a1-9c41-77c85ae7ad64 req-39586034-2fba-4087-99c5-5c6e8356ffde a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "936001bf-d51b-4243-87b8-e363ef3c47a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.679 183079 DEBUG oslo_concurrency.lockutils [req-b42b21df-1ed6-41a1-9c41-77c85ae7ad64 req-39586034-2fba-4087-99c5-5c6e8356ffde a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "936001bf-d51b-4243-87b8-e363ef3c47a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.679 183079 DEBUG oslo_concurrency.lockutils [req-b42b21df-1ed6-41a1-9c41-77c85ae7ad64 req-39586034-2fba-4087-99c5-5c6e8356ffde a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "936001bf-d51b-4243-87b8-e363ef3c47a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.679 183079 DEBUG nova.compute.manager [req-b42b21df-1ed6-41a1-9c41-77c85ae7ad64 req-39586034-2fba-4087-99c5-5c6e8356ffde a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Processing event network-vif-plugged-804a64f5-797f-4eba-ae49-100790171545 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:32.715 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[71462691-9a10-49de-aa4c-e2e8a5d3685d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:32.795 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d0b0554f-5f77-4306-a3fc-84f924f7bfbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:32.797 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap576f6598-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:32.797 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:32.798 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap576f6598-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.800 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:32 compute-0 kernel: tap576f6598-90: entered promiscuous mode
Jan 22 17:11:32 compute-0 NetworkManager[55454]: <info>  [1769101892.8022] manager: (tap576f6598-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/103)
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.803 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:32.812 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap576f6598-90, col_values=(('external_ids', {'iface-id': '1759254b-798a-4e65-baf5-489557c1f604'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.814 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:32 compute-0 ovn_controller[95372]: 2026-01-22T17:11:32Z|00206|binding|INFO|Releasing lport 1759254b-798a-4e65-baf5-489557c1f604 from this chassis (sb_readonly=0)
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.814 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:32.817 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/576f6598-999f-46d9-809a-65b7475a1ec7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/576f6598-999f-46d9-809a-65b7475a1ec7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:32.818 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8cc327d1-c36a-486b-96c7-0453491c88cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:32.819 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/576f6598-999f-46d9-809a-65b7475a1ec7.pid.haproxy
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 576f6598-999f-46d9-809a-65b7475a1ec7
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:11:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:32.820 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7', 'env', 'PROCESS_TAG=haproxy-576f6598-999f-46d9-809a-65b7475a1ec7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/576f6598-999f-46d9-809a-65b7475a1ec7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.827 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.896 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101892.89561, 936001bf-d51b-4243-87b8-e363ef3c47a8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.897 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] VM Started (Lifecycle Event)
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.898 183079 DEBUG nova.compute.manager [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.902 183079 DEBUG nova.virt.libvirt.driver [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.906 183079 INFO nova.virt.libvirt.driver [-] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Instance spawned successfully.
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.906 183079 DEBUG nova.virt.libvirt.driver [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.920 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.926 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.929 183079 DEBUG nova.virt.libvirt.driver [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.929 183079 DEBUG nova.virt.libvirt.driver [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.929 183079 DEBUG nova.virt.libvirt.driver [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.930 183079 DEBUG nova.virt.libvirt.driver [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.930 183079 DEBUG nova.virt.libvirt.driver [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.931 183079 DEBUG nova.virt.libvirt.driver [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.968 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.969 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101892.8959699, 936001bf-d51b-4243-87b8-e363ef3c47a8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.969 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] VM Paused (Lifecycle Event)
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.994 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.998 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101892.9013755, 936001bf-d51b-4243-87b8-e363ef3c47a8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:11:32 compute-0 nova_compute[183075]: 2026-01-22 17:11:32.998 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] VM Resumed (Lifecycle Event)
Jan 22 17:11:33 compute-0 nova_compute[183075]: 2026-01-22 17:11:33.005 183079 INFO nova.compute.manager [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Took 5.73 seconds to spawn the instance on the hypervisor.
Jan 22 17:11:33 compute-0 nova_compute[183075]: 2026-01-22 17:11:33.005 183079 DEBUG nova.compute.manager [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:11:33 compute-0 nova_compute[183075]: 2026-01-22 17:11:33.018 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:11:33 compute-0 nova_compute[183075]: 2026-01-22 17:11:33.023 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:11:33 compute-0 nova_compute[183075]: 2026-01-22 17:11:33.055 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:11:33 compute-0 nova_compute[183075]: 2026-01-22 17:11:33.082 183079 INFO nova.compute.manager [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Took 6.38 seconds to build instance.
Jan 22 17:11:33 compute-0 nova_compute[183075]: 2026-01-22 17:11:33.099 183079 DEBUG oslo_concurrency.lockutils [None req-7dbef97c-b284-4a19-a377-0571bf5b6717 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "936001bf-d51b-4243-87b8-e363ef3c47a8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.456s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:33 compute-0 podman[219925]: 2026-01-22 17:11:33.233397539 +0000 UTC m=+0.059846301 container create ce557a5a19054a6576b5f8e43aaebf77a0f24635c178933357e9c8fca275c8cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 22 17:11:33 compute-0 systemd[1]: Started libpod-conmon-ce557a5a19054a6576b5f8e43aaebf77a0f24635c178933357e9c8fca275c8cd.scope.
Jan 22 17:11:33 compute-0 podman[219925]: 2026-01-22 17:11:33.202153614 +0000 UTC m=+0.028602426 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:11:33 compute-0 nova_compute[183075]: 2026-01-22 17:11:33.297 183079 DEBUG nova.network.neutron [req-89cebc10-81b8-4eb7-99d4-839a45e36037 req-77fe56aa-ae9a-4cb7-8358-a2148bd37750 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Updated VIF entry in instance network info cache for port 804a64f5-797f-4eba-ae49-100790171545. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:11:33 compute-0 nova_compute[183075]: 2026-01-22 17:11:33.298 183079 DEBUG nova.network.neutron [req-89cebc10-81b8-4eb7-99d4-839a45e36037 req-77fe56aa-ae9a-4cb7-8358-a2148bd37750 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Updating instance_info_cache with network_info: [{"id": "804a64f5-797f-4eba-ae49-100790171545", "address": "fa:16:3e:3a:03:2f", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap804a64f5-79", "ovs_interfaceid": "804a64f5-797f-4eba-ae49-100790171545", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:11:33 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:11:33 compute-0 nova_compute[183075]: 2026-01-22 17:11:33.311 183079 DEBUG oslo_concurrency.lockutils [req-89cebc10-81b8-4eb7-99d4-839a45e36037 req-77fe56aa-ae9a-4cb7-8358-a2148bd37750 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-936001bf-d51b-4243-87b8-e363ef3c47a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:11:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a9ec44c08dfafb854a6c74492a29420ecebc41a6de1085bc5e755ed3934c6c7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:11:33 compute-0 podman[219925]: 2026-01-22 17:11:33.325711116 +0000 UTC m=+0.152159898 container init ce557a5a19054a6576b5f8e43aaebf77a0f24635c178933357e9c8fca275c8cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:11:33 compute-0 podman[219925]: 2026-01-22 17:11:33.333464688 +0000 UTC m=+0.159913460 container start ce557a5a19054a6576b5f8e43aaebf77a0f24635c178933357e9c8fca275c8cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 22 17:11:33 compute-0 neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7[219940]: [NOTICE]   (219944) : New worker (219946) forked
Jan 22 17:11:33 compute-0 neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7[219940]: [NOTICE]   (219944) : Loading success.
Jan 22 17:11:33 compute-0 nova_compute[183075]: 2026-01-22 17:11:33.433 183079 INFO nova.compute.manager [None req-3068a8ca-0ccf-45b8-977b-012b923bf222 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Get console output
Jan 22 17:11:33 compute-0 nova_compute[183075]: 2026-01-22 17:11:33.540 183079 INFO nova.compute.manager [None req-79da4a40-afc0-453a-bbea-a747a3b4be60 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Get console output
Jan 22 17:11:33 compute-0 nova_compute[183075]: 2026-01-22 17:11:33.547 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:11:34 compute-0 nova_compute[183075]: 2026-01-22 17:11:34.202 183079 INFO nova.compute.manager [None req-0b8a9341-b0cc-4386-b551-a1eca9439a29 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Get console output
Jan 22 17:11:34 compute-0 nova_compute[183075]: 2026-01-22 17:11:34.208 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:11:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:34.755 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:34.756 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:11:34 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:34 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:34 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:34 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:34 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:34 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.24
Jan 22 17:11:34 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 83f76843-09f1-4ec7-b234-50118063210a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:34 compute-0 nova_compute[183075]: 2026-01-22 17:11:34.781 183079 DEBUG nova.compute.manager [req-95957b42-d654-4c30-9cb8-8161b80e060a req-290677de-88c7-4f3b-acf2-20b852ad2564 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Received event network-vif-plugged-804a64f5-797f-4eba-ae49-100790171545 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:11:34 compute-0 nova_compute[183075]: 2026-01-22 17:11:34.782 183079 DEBUG oslo_concurrency.lockutils [req-95957b42-d654-4c30-9cb8-8161b80e060a req-290677de-88c7-4f3b-acf2-20b852ad2564 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "936001bf-d51b-4243-87b8-e363ef3c47a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:34 compute-0 nova_compute[183075]: 2026-01-22 17:11:34.782 183079 DEBUG oslo_concurrency.lockutils [req-95957b42-d654-4c30-9cb8-8161b80e060a req-290677de-88c7-4f3b-acf2-20b852ad2564 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "936001bf-d51b-4243-87b8-e363ef3c47a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:34 compute-0 nova_compute[183075]: 2026-01-22 17:11:34.782 183079 DEBUG oslo_concurrency.lockutils [req-95957b42-d654-4c30-9cb8-8161b80e060a req-290677de-88c7-4f3b-acf2-20b852ad2564 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "936001bf-d51b-4243-87b8-e363ef3c47a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:34 compute-0 nova_compute[183075]: 2026-01-22 17:11:34.783 183079 DEBUG nova.compute.manager [req-95957b42-d654-4c30-9cb8-8161b80e060a req-290677de-88c7-4f3b-acf2-20b852ad2564 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] No waiting events found dispatching network-vif-plugged-804a64f5-797f-4eba-ae49-100790171545 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:11:34 compute-0 nova_compute[183075]: 2026-01-22 17:11:34.783 183079 WARNING nova.compute.manager [req-95957b42-d654-4c30-9cb8-8161b80e060a req-290677de-88c7-4f3b-acf2-20b852ad2564 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Received unexpected event network-vif-plugged-804a64f5-797f-4eba-ae49-100790171545 for instance with vm_state active and task_state None.
Jan 22 17:11:34 compute-0 nova_compute[183075]: 2026-01-22 17:11:34.950 183079 INFO nova.compute.manager [None req-97cb2091-97ff-4644-87d2-637ee499b719 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Get console output
Jan 22 17:11:34 compute-0 nova_compute[183075]: 2026-01-22 17:11:34.959 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:11:35 compute-0 nova_compute[183075]: 2026-01-22 17:11:35.190 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.144 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.144 104990 INFO eventlet.wsgi.server [-] 10.100.0.24,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 1.3880026
Jan 22 17:11:36 compute-0 haproxy-metadata-proxy-83f76843-09f1-4ec7-b234-50118063210a[219444]: 10.100.0.24:36324 [22/Jan/2026:17:11:34.754] listener listener/metadata 0/0/0/1390/1390 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.154 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.155 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.24
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 83f76843-09f1-4ec7-b234-50118063210a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.171 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.172 104990 INFO eventlet.wsgi.server [-] 10.100.0.24,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 168 time: 0.0170648
Jan 22 17:11:36 compute-0 haproxy-metadata-proxy-83f76843-09f1-4ec7-b234-50118063210a[219444]: 10.100.0.24:36338 [22/Jan/2026:17:11:36.153] listener listener/metadata 0/0/0/18/18 200 152 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.177 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.178 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.24
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 83f76843-09f1-4ec7-b234-50118063210a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.193 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.193 104990 INFO eventlet.wsgi.server [-] 10.100.0.24,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0154483
Jan 22 17:11:36 compute-0 haproxy-metadata-proxy-83f76843-09f1-4ec7-b234-50118063210a[219444]: 10.100.0.24:36344 [22/Jan/2026:17:11:36.176] listener listener/metadata 0/0/0/16/16 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.203 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.204 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.24
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 83f76843-09f1-4ec7-b234-50118063210a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.225 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.225 104990 INFO eventlet.wsgi.server [-] 10.100.0.24,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0213709
Jan 22 17:11:36 compute-0 haproxy-metadata-proxy-83f76843-09f1-4ec7-b234-50118063210a[219444]: 10.100.0.24:36360 [22/Jan/2026:17:11:36.203] listener listener/metadata 0/0/0/22/22 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.235 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.235 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.24
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 83f76843-09f1-4ec7-b234-50118063210a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.252 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.252 104990 INFO eventlet.wsgi.server [-] 10.100.0.24,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0166442
Jan 22 17:11:36 compute-0 haproxy-metadata-proxy-83f76843-09f1-4ec7-b234-50118063210a[219444]: 10.100.0.24:36376 [22/Jan/2026:17:11:36.234] listener listener/metadata 0/0/0/17/17 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.257 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.257 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.24
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 83f76843-09f1-4ec7-b234-50118063210a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.277 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:36 compute-0 haproxy-metadata-proxy-83f76843-09f1-4ec7-b234-50118063210a[219444]: 10.100.0.24:36382 [22/Jan/2026:17:11:36.256] listener listener/metadata 0/0/0/21/21 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.278 104990 INFO eventlet.wsgi.server [-] 10.100.0.24,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0204937
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.282 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.283 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.24
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 83f76843-09f1-4ec7-b234-50118063210a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.297 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.297 104990 INFO eventlet.wsgi.server [-] 10.100.0.24,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0138781
Jan 22 17:11:36 compute-0 haproxy-metadata-proxy-83f76843-09f1-4ec7-b234-50118063210a[219444]: 10.100.0.24:36386 [22/Jan/2026:17:11:36.282] listener listener/metadata 0/0/0/14/14 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.302 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.302 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.24
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 83f76843-09f1-4ec7-b234-50118063210a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.321 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:36 compute-0 haproxy-metadata-proxy-83f76843-09f1-4ec7-b234-50118063210a[219444]: 10.100.0.24:36396 [22/Jan/2026:17:11:36.301] listener listener/metadata 0/0/0/19/19 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.321 104990 INFO eventlet.wsgi.server [-] 10.100.0.24,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0192573
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.326 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.327 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.24
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 83f76843-09f1-4ec7-b234-50118063210a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.339 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.339 104990 INFO eventlet.wsgi.server [-] 10.100.0.24,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 165 time: 0.0125806
Jan 22 17:11:36 compute-0 haproxy-metadata-proxy-83f76843-09f1-4ec7-b234-50118063210a[219444]: 10.100.0.24:36408 [22/Jan/2026:17:11:36.326] listener listener/metadata 0/0/0/13/13 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.345 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.345 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.24
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 83f76843-09f1-4ec7-b234-50118063210a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.367 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.367 104990 INFO eventlet.wsgi.server [-] 10.100.0.24,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 165 time: 0.0219269
Jan 22 17:11:36 compute-0 haproxy-metadata-proxy-83f76843-09f1-4ec7-b234-50118063210a[219444]: 10.100.0.24:36424 [22/Jan/2026:17:11:36.344] listener listener/metadata 0/0/0/22/22 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.372 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.372 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.24
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 83f76843-09f1-4ec7-b234-50118063210a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.389 104990 INFO eventlet.wsgi.server [-] 10.100.0.24,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0168645
Jan 22 17:11:36 compute-0 haproxy-metadata-proxy-83f76843-09f1-4ec7-b234-50118063210a[219444]: 10.100.0.24:36428 [22/Jan/2026:17:11:36.371] listener listener/metadata 0/0/0/17/17 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.398 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.399 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.24
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 83f76843-09f1-4ec7-b234-50118063210a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.414 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.414 104990 INFO eventlet.wsgi.server [-] 10.100.0.24,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0158355
Jan 22 17:11:36 compute-0 haproxy-metadata-proxy-83f76843-09f1-4ec7-b234-50118063210a[219444]: 10.100.0.24:36440 [22/Jan/2026:17:11:36.397] listener listener/metadata 0/0/0/17/17 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.419 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.421 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.24
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 83f76843-09f1-4ec7-b234-50118063210a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.435 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.435 104990 INFO eventlet.wsgi.server [-] 10.100.0.24,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0146148
Jan 22 17:11:36 compute-0 haproxy-metadata-proxy-83f76843-09f1-4ec7-b234-50118063210a[219444]: 10.100.0.24:36446 [22/Jan/2026:17:11:36.419] listener listener/metadata 0/0/0/15/15 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.439 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.439 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.24
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 83f76843-09f1-4ec7-b234-50118063210a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.455 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:36 compute-0 haproxy-metadata-proxy-83f76843-09f1-4ec7-b234-50118063210a[219444]: 10.100.0.24:36456 [22/Jan/2026:17:11:36.438] listener listener/metadata 0/0/0/17/17 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.456 104990 INFO eventlet.wsgi.server [-] 10.100.0.24,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0166841
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.459 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.460 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.24
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 83f76843-09f1-4ec7-b234-50118063210a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.486 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.486 104990 INFO eventlet.wsgi.server [-] 10.100.0.24,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 165 time: 0.0260746
Jan 22 17:11:36 compute-0 haproxy-metadata-proxy-83f76843-09f1-4ec7-b234-50118063210a[219444]: 10.100.0.24:36462 [22/Jan/2026:17:11:36.459] listener listener/metadata 0/0/0/26/26 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.491 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.491 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.24
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 83f76843-09f1-4ec7-b234-50118063210a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.505 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:36 compute-0 haproxy-metadata-proxy-83f76843-09f1-4ec7-b234-50118063210a[219444]: 10.100.0.24:36478 [22/Jan/2026:17:11:36.490] listener listener/metadata 0/0/0/14/14 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:11:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:36.505 104990 INFO eventlet.wsgi.server [-] 10.100.0.24,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0138237
Jan 22 17:11:37 compute-0 nova_compute[183075]: 2026-01-22 17:11:37.356 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:38 compute-0 nova_compute[183075]: 2026-01-22 17:11:38.578 183079 INFO nova.compute.manager [None req-dc6563a9-870f-4ec4-8f61-860dbaebb01a 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Get console output
Jan 22 17:11:38 compute-0 nova_compute[183075]: 2026-01-22 17:11:38.583 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:11:38 compute-0 nova_compute[183075]: 2026-01-22 17:11:38.713 183079 INFO nova.compute.manager [None req-1f1bfe3c-23a1-427b-ba89-2d92fc27d894 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Get console output
Jan 22 17:11:38 compute-0 nova_compute[183075]: 2026-01-22 17:11:38.721 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:11:39 compute-0 nova_compute[183075]: 2026-01-22 17:11:39.333 183079 INFO nova.compute.manager [None req-dd4e0775-10b8-4698-8ae9-09cbb4d45c12 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Get console output
Jan 22 17:11:39 compute-0 nova_compute[183075]: 2026-01-22 17:11:39.339 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:11:39 compute-0 ovn_controller[95372]: 2026-01-22T17:11:39Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:22:e2:d4 10.10.2.65
Jan 22 17:11:39 compute-0 ovn_controller[95372]: 2026-01-22T17:11:39Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:22:e2:d4 10.10.2.65
Jan 22 17:11:40 compute-0 nova_compute[183075]: 2026-01-22 17:11:40.193 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:40 compute-0 podman[219968]: 2026-01-22 17:11:40.35267926 +0000 UTC m=+0.054263846 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 17:11:40 compute-0 nova_compute[183075]: 2026-01-22 17:11:40.491 183079 INFO nova.compute.manager [None req-fb17f1ac-34e7-4835-8473-56227b8df864 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Get console output
Jan 22 17:11:40 compute-0 nova_compute[183075]: 2026-01-22 17:11:40.498 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:11:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:41.931 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:41.932 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:41.934 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:42 compute-0 nova_compute[183075]: 2026-01-22 17:11:42.359 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:42 compute-0 nova_compute[183075]: 2026-01-22 17:11:42.802 183079 DEBUG nova.compute.manager [req-6b820dcd-1800-4bbf-929b-1962d82180c7 req-fd9710f5-bafc-43d4-93d0-25fa288bf55c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Received event network-changed-5644ae2a-c35b-431d-88a1-ad18de811d83 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:11:42 compute-0 nova_compute[183075]: 2026-01-22 17:11:42.802 183079 DEBUG nova.compute.manager [req-6b820dcd-1800-4bbf-929b-1962d82180c7 req-fd9710f5-bafc-43d4-93d0-25fa288bf55c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Refreshing instance network info cache due to event network-changed-5644ae2a-c35b-431d-88a1-ad18de811d83. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:11:42 compute-0 nova_compute[183075]: 2026-01-22 17:11:42.803 183079 DEBUG oslo_concurrency.lockutils [req-6b820dcd-1800-4bbf-929b-1962d82180c7 req-fd9710f5-bafc-43d4-93d0-25fa288bf55c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-7e8d077b-66fc-42ee-ad4e-a13327ad6764" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:11:42 compute-0 nova_compute[183075]: 2026-01-22 17:11:42.803 183079 DEBUG oslo_concurrency.lockutils [req-6b820dcd-1800-4bbf-929b-1962d82180c7 req-fd9710f5-bafc-43d4-93d0-25fa288bf55c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-7e8d077b-66fc-42ee-ad4e-a13327ad6764" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:11:42 compute-0 nova_compute[183075]: 2026-01-22 17:11:42.803 183079 DEBUG nova.network.neutron [req-6b820dcd-1800-4bbf-929b-1962d82180c7 req-fd9710f5-bafc-43d4-93d0-25fa288bf55c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Refreshing network info cache for port 5644ae2a-c35b-431d-88a1-ad18de811d83 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:11:43 compute-0 nova_compute[183075]: 2026-01-22 17:11:43.729 183079 INFO nova.compute.manager [None req-05e21072-e239-43a7-9276-8b9bf82d8d0e 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Get console output
Jan 22 17:11:43 compute-0 nova_compute[183075]: 2026-01-22 17:11:43.733 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:11:44 compute-0 nova_compute[183075]: 2026-01-22 17:11:44.433 183079 INFO nova.compute.manager [None req-614d3f29-4cb2-4560-a3c0-5d68fe2fc582 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Get console output
Jan 22 17:11:44 compute-0 nova_compute[183075]: 2026-01-22 17:11:44.440 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:11:44 compute-0 nova_compute[183075]: 2026-01-22 17:11:44.975 183079 DEBUG nova.network.neutron [req-6b820dcd-1800-4bbf-929b-1962d82180c7 req-fd9710f5-bafc-43d4-93d0-25fa288bf55c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Updated VIF entry in instance network info cache for port 5644ae2a-c35b-431d-88a1-ad18de811d83. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:11:44 compute-0 nova_compute[183075]: 2026-01-22 17:11:44.976 183079 DEBUG nova.network.neutron [req-6b820dcd-1800-4bbf-929b-1962d82180c7 req-fd9710f5-bafc-43d4-93d0-25fa288bf55c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Updating instance_info_cache with network_info: [{"id": "5644ae2a-c35b-431d-88a1-ad18de811d83", "address": "fa:16:3e:d4:c8:e5", "network": {"id": "ce346f8d-be8d-455f-b61c-12fea213a3f4", "bridge": "br-int", "label": "tempest-test-network--508539761", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5644ae2a-c3", "ovs_interfaceid": "5644ae2a-c35b-431d-88a1-ad18de811d83", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:11:44 compute-0 nova_compute[183075]: 2026-01-22 17:11:44.997 183079 DEBUG oslo_concurrency.lockutils [req-6b820dcd-1800-4bbf-929b-1962d82180c7 req-fd9710f5-bafc-43d4-93d0-25fa288bf55c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-7e8d077b-66fc-42ee-ad4e-a13327ad6764" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:11:45 compute-0 nova_compute[183075]: 2026-01-22 17:11:45.196 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:45 compute-0 nova_compute[183075]: 2026-01-22 17:11:45.687 183079 INFO nova.compute.manager [None req-7d8b0953-759f-49f1-a973-2b9ba7214c77 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Get console output
Jan 22 17:11:45 compute-0 nova_compute[183075]: 2026-01-22 17:11:45.691 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:11:45 compute-0 ovn_controller[95372]: 2026-01-22T17:11:45Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3a:03:2f 10.100.0.9
Jan 22 17:11:45 compute-0 ovn_controller[95372]: 2026-01-22T17:11:45Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3a:03:2f 10.100.0.9
Jan 22 17:11:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:46.072 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:46.073 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:11:46 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:46 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:46 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:46 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:46 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:46 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.2.65
Jan 22 17:11:46 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3d23d5e4-bd70-4266-8b97-203b9af8d4ef __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.102 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.102 104990 INFO eventlet.wsgi.server [-] 10.10.2.65,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 1.0293622
Jan 22 17:11:47 compute-0 haproxy-metadata-proxy-3d23d5e4-bd70-4266-8b97-203b9af8d4ef[219761]: 10.10.2.65:56584 [22/Jan/2026:17:11:46.071] listener listener/metadata 0/0/0/1031/1031 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.111 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.112 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.2.65
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3d23d5e4-bd70-4266-8b97-203b9af8d4ef __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.133 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:47 compute-0 haproxy-metadata-proxy-3d23d5e4-bd70-4266-8b97-203b9af8d4ef[219761]: 10.10.2.65:56590 [22/Jan/2026:17:11:47.110] listener listener/metadata 0/0/0/23/23 200 152 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.134 104990 INFO eventlet.wsgi.server [-] 10.10.2.65,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 168 time: 0.0217965
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.138 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.139 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.2.65
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3d23d5e4-bd70-4266-8b97-203b9af8d4ef __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.158 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:47 compute-0 haproxy-metadata-proxy-3d23d5e4-bd70-4266-8b97-203b9af8d4ef[219761]: 10.10.2.65:56596 [22/Jan/2026:17:11:47.137] listener listener/metadata 0/0/0/21/21 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.159 104990 INFO eventlet.wsgi.server [-] 10.10.2.65,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0198641
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.163 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.164 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.2.65
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3d23d5e4-bd70-4266-8b97-203b9af8d4ef __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.185 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:47 compute-0 haproxy-metadata-proxy-3d23d5e4-bd70-4266-8b97-203b9af8d4ef[219761]: 10.10.2.65:56610 [22/Jan/2026:17:11:47.162] listener listener/metadata 0/0/0/23/23 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.186 104990 INFO eventlet.wsgi.server [-] 10.10.2.65,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0223801
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.192 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.193 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.2.65
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3d23d5e4-bd70-4266-8b97-203b9af8d4ef __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.216 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:47 compute-0 haproxy-metadata-proxy-3d23d5e4-bd70-4266-8b97-203b9af8d4ef[219761]: 10.10.2.65:56614 [22/Jan/2026:17:11:47.191] listener listener/metadata 0/0/0/25/25 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.217 104990 INFO eventlet.wsgi.server [-] 10.10.2.65,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0244014
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.221 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.222 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.2.65
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3d23d5e4-bd70-4266-8b97-203b9af8d4ef __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.241 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.241 104990 INFO eventlet.wsgi.server [-] 10.10.2.65,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0196707
Jan 22 17:11:47 compute-0 haproxy-metadata-proxy-3d23d5e4-bd70-4266-8b97-203b9af8d4ef[219761]: 10.10.2.65:56624 [22/Jan/2026:17:11:47.221] listener listener/metadata 0/0/0/20/20 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.251 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.252 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.2.65
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3d23d5e4-bd70-4266-8b97-203b9af8d4ef __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.265 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.266 104990 INFO eventlet.wsgi.server [-] 10.10.2.65,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0139647
Jan 22 17:11:47 compute-0 haproxy-metadata-proxy-3d23d5e4-bd70-4266-8b97-203b9af8d4ef[219761]: 10.10.2.65:56640 [22/Jan/2026:17:11:47.251] listener listener/metadata 0/0/0/14/14 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.272 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.273 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.2.65
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3d23d5e4-bd70-4266-8b97-203b9af8d4ef __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.288 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.288 104990 INFO eventlet.wsgi.server [-] 10.10.2.65,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0155058
Jan 22 17:11:47 compute-0 haproxy-metadata-proxy-3d23d5e4-bd70-4266-8b97-203b9af8d4ef[219761]: 10.10.2.65:56646 [22/Jan/2026:17:11:47.272] listener listener/metadata 0/0/0/16/16 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.294 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.295 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.2.65
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3d23d5e4-bd70-4266-8b97-203b9af8d4ef __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.308 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.308 104990 INFO eventlet.wsgi.server [-] 10.10.2.65,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0131750
Jan 22 17:11:47 compute-0 haproxy-metadata-proxy-3d23d5e4-bd70-4266-8b97-203b9af8d4ef[219761]: 10.10.2.65:56662 [22/Jan/2026:17:11:47.294] listener listener/metadata 0/0/0/13/13 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.314 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.314 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.2.65
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3d23d5e4-bd70-4266-8b97-203b9af8d4ef __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.327 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:47 compute-0 haproxy-metadata-proxy-3d23d5e4-bd70-4266-8b97-203b9af8d4ef[219761]: 10.10.2.65:56668 [22/Jan/2026:17:11:47.313] listener listener/metadata 0/0/0/14/14 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.328 104990 INFO eventlet.wsgi.server [-] 10.10.2.65,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0138235
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.333 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.334 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.2.65
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3d23d5e4-bd70-4266-8b97-203b9af8d4ef __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:47 compute-0 haproxy-metadata-proxy-3d23d5e4-bd70-4266-8b97-203b9af8d4ef[219761]: 10.10.2.65:56684 [22/Jan/2026:17:11:47.333] listener listener/metadata 0/0/0/16/16 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.349 104990 INFO eventlet.wsgi.server [-] 10.10.2.65,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0157111
Jan 22 17:11:47 compute-0 nova_compute[183075]: 2026-01-22 17:11:47.362 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.362 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.364 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.2.65
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3d23d5e4-bd70-4266-8b97-203b9af8d4ef __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.384 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.384 104990 INFO eventlet.wsgi.server [-] 10.10.2.65,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0203941
Jan 22 17:11:47 compute-0 haproxy-metadata-proxy-3d23d5e4-bd70-4266-8b97-203b9af8d4ef[219761]: 10.10.2.65:56686 [22/Jan/2026:17:11:47.361] listener listener/metadata 0/0/0/22/22 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.392 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.392 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.2.65
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3d23d5e4-bd70-4266-8b97-203b9af8d4ef __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.408 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.408 104990 INFO eventlet.wsgi.server [-] 10.10.2.65,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0156679
Jan 22 17:11:47 compute-0 haproxy-metadata-proxy-3d23d5e4-bd70-4266-8b97-203b9af8d4ef[219761]: 10.10.2.65:56702 [22/Jan/2026:17:11:47.391] listener listener/metadata 0/0/0/16/16 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.413 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.414 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.2.65
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3d23d5e4-bd70-4266-8b97-203b9af8d4ef __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:47 compute-0 nova_compute[183075]: 2026-01-22 17:11:47.420 183079 DEBUG nova.compute.manager [req-9f7f9b32-1fd4-4721-b896-79254748a610 req-eb2a1461-019a-460e-9056-5313e7438e5a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Received event network-changed-e1d81dc2-e73c-45fb-be4a-7192b576b628 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:11:47 compute-0 nova_compute[183075]: 2026-01-22 17:11:47.420 183079 DEBUG nova.compute.manager [req-9f7f9b32-1fd4-4721-b896-79254748a610 req-eb2a1461-019a-460e-9056-5313e7438e5a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Refreshing instance network info cache due to event network-changed-e1d81dc2-e73c-45fb-be4a-7192b576b628. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:11:47 compute-0 nova_compute[183075]: 2026-01-22 17:11:47.421 183079 DEBUG oslo_concurrency.lockutils [req-9f7f9b32-1fd4-4721-b896-79254748a610 req-eb2a1461-019a-460e-9056-5313e7438e5a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-d54ce6ac-7fff-4f20-a6e0-48c13efded58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:11:47 compute-0 nova_compute[183075]: 2026-01-22 17:11:47.421 183079 DEBUG oslo_concurrency.lockutils [req-9f7f9b32-1fd4-4721-b896-79254748a610 req-eb2a1461-019a-460e-9056-5313e7438e5a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-d54ce6ac-7fff-4f20-a6e0-48c13efded58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:11:47 compute-0 nova_compute[183075]: 2026-01-22 17:11:47.422 183079 DEBUG nova.network.neutron [req-9f7f9b32-1fd4-4721-b896-79254748a610 req-eb2a1461-019a-460e-9056-5313e7438e5a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Refreshing network info cache for port e1d81dc2-e73c-45fb-be4a-7192b576b628 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.430 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.431 104990 INFO eventlet.wsgi.server [-] 10.10.2.65,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0170171
Jan 22 17:11:47 compute-0 haproxy-metadata-proxy-3d23d5e4-bd70-4266-8b97-203b9af8d4ef[219761]: 10.10.2.65:56718 [22/Jan/2026:17:11:47.413] listener listener/metadata 0/0/0/18/18 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.441 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.443 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.2.65
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3d23d5e4-bd70-4266-8b97-203b9af8d4ef __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.466 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.467 104990 INFO eventlet.wsgi.server [-] 10.10.2.65,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0243180
Jan 22 17:11:47 compute-0 haproxy-metadata-proxy-3d23d5e4-bd70-4266-8b97-203b9af8d4ef[219761]: 10.10.2.65:56734 [22/Jan/2026:17:11:47.441] listener listener/metadata 0/0/0/26/26 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.472 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.472 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.2.65
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3d23d5e4-bd70-4266-8b97-203b9af8d4ef __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.490 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:47 compute-0 haproxy-metadata-proxy-3d23d5e4-bd70-4266-8b97-203b9af8d4ef[219761]: 10.10.2.65:56746 [22/Jan/2026:17:11:47.471] listener listener/metadata 0/0/0/19/19 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:11:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:47.491 104990 INFO eventlet.wsgi.server [-] 10.10.2.65,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0181468
Jan 22 17:11:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:48.571 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:11:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:48.572 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:11:48 compute-0 nova_compute[183075]: 2026-01-22 17:11:48.572 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:48 compute-0 nova_compute[183075]: 2026-01-22 17:11:48.825 183079 DEBUG nova.compute.manager [req-888e0255-6f40-4595-990d-f29c110783ec req-e27a4c25-6b38-4008-b5fb-af1e31cc5d06 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Received event network-changed-e1d81dc2-e73c-45fb-be4a-7192b576b628 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:11:48 compute-0 nova_compute[183075]: 2026-01-22 17:11:48.825 183079 DEBUG nova.compute.manager [req-888e0255-6f40-4595-990d-f29c110783ec req-e27a4c25-6b38-4008-b5fb-af1e31cc5d06 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Refreshing instance network info cache due to event network-changed-e1d81dc2-e73c-45fb-be4a-7192b576b628. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:11:48 compute-0 nova_compute[183075]: 2026-01-22 17:11:48.826 183079 DEBUG oslo_concurrency.lockutils [req-888e0255-6f40-4595-990d-f29c110783ec req-e27a4c25-6b38-4008-b5fb-af1e31cc5d06 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-d54ce6ac-7fff-4f20-a6e0-48c13efded58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:11:48 compute-0 nova_compute[183075]: 2026-01-22 17:11:48.911 183079 INFO nova.compute.manager [None req-f8af37d1-9696-444b-b4da-4407116fb99c 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Get console output
Jan 22 17:11:48 compute-0 nova_compute[183075]: 2026-01-22 17:11:48.918 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:11:48 compute-0 nova_compute[183075]: 2026-01-22 17:11:48.972 183079 DEBUG oslo_concurrency.lockutils [None req-e5745dfb-4ac0-4bfb-be0f-810822f07609 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Acquiring lock "d54ce6ac-7fff-4f20-a6e0-48c13efded58" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:48 compute-0 nova_compute[183075]: 2026-01-22 17:11:48.973 183079 DEBUG oslo_concurrency.lockutils [None req-e5745dfb-4ac0-4bfb-be0f-810822f07609 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "d54ce6ac-7fff-4f20-a6e0-48c13efded58" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:48 compute-0 nova_compute[183075]: 2026-01-22 17:11:48.973 183079 DEBUG oslo_concurrency.lockutils [None req-e5745dfb-4ac0-4bfb-be0f-810822f07609 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Acquiring lock "d54ce6ac-7fff-4f20-a6e0-48c13efded58-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:48 compute-0 nova_compute[183075]: 2026-01-22 17:11:48.974 183079 DEBUG oslo_concurrency.lockutils [None req-e5745dfb-4ac0-4bfb-be0f-810822f07609 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "d54ce6ac-7fff-4f20-a6e0-48c13efded58-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:48 compute-0 nova_compute[183075]: 2026-01-22 17:11:48.974 183079 DEBUG oslo_concurrency.lockutils [None req-e5745dfb-4ac0-4bfb-be0f-810822f07609 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "d54ce6ac-7fff-4f20-a6e0-48c13efded58-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:48 compute-0 nova_compute[183075]: 2026-01-22 17:11:48.976 183079 INFO nova.compute.manager [None req-e5745dfb-4ac0-4bfb-be0f-810822f07609 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Terminating instance
Jan 22 17:11:48 compute-0 nova_compute[183075]: 2026-01-22 17:11:48.977 183079 DEBUG nova.compute.manager [None req-e5745dfb-4ac0-4bfb-be0f-810822f07609 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:11:48 compute-0 nova_compute[183075]: 2026-01-22 17:11:48.989 183079 DEBUG nova.network.neutron [req-9f7f9b32-1fd4-4721-b896-79254748a610 req-eb2a1461-019a-460e-9056-5313e7438e5a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Updated VIF entry in instance network info cache for port e1d81dc2-e73c-45fb-be4a-7192b576b628. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:11:48 compute-0 nova_compute[183075]: 2026-01-22 17:11:48.989 183079 DEBUG nova.network.neutron [req-9f7f9b32-1fd4-4721-b896-79254748a610 req-eb2a1461-019a-460e-9056-5313e7438e5a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Updating instance_info_cache with network_info: [{"id": "e1d81dc2-e73c-45fb-be4a-7192b576b628", "address": "fa:16:3e:65:16:89", "network": {"id": "83f76843-09f1-4ec7-b234-50118063210a", "bridge": "br-int", "label": "tempest-test-network--33948725", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1d81dc2-e7", "ovs_interfaceid": "e1d81dc2-e73c-45fb-be4a-7192b576b628", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:11:49 compute-0 kernel: tape1d81dc2-e7 (unregistering): left promiscuous mode
Jan 22 17:11:49 compute-0 NetworkManager[55454]: <info>  [1769101909.0084] device (tape1d81dc2-e7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:11:49 compute-0 nova_compute[183075]: 2026-01-22 17:11:49.021 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:49 compute-0 ovn_controller[95372]: 2026-01-22T17:11:49Z|00207|binding|INFO|Releasing lport e1d81dc2-e73c-45fb-be4a-7192b576b628 from this chassis (sb_readonly=0)
Jan 22 17:11:49 compute-0 ovn_controller[95372]: 2026-01-22T17:11:49Z|00208|binding|INFO|Setting lport e1d81dc2-e73c-45fb-be4a-7192b576b628 down in Southbound
Jan 22 17:11:49 compute-0 ovn_controller[95372]: 2026-01-22T17:11:49Z|00209|binding|INFO|Removing iface tape1d81dc2-e7 ovn-installed in OVS
Jan 22 17:11:49 compute-0 nova_compute[183075]: 2026-01-22 17:11:49.027 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:49 compute-0 nova_compute[183075]: 2026-01-22 17:11:49.039 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:49.064 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:16:89 10.100.0.24'], port_security=['fa:16:3e:65:16:89 10.100.0.24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.24/28', 'neutron:device_id': 'd54ce6ac-7fff-4f20-a6e0-48c13efded58', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-83f76843-09f1-4ec7-b234-50118063210a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c2b37b797ca344f2b31c3861277068d8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2f51838f-8a2c-425b-a70e-e288886c38d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c1b57117-a4a4-4755-be03-74567148d139, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=e1d81dc2-e73c-45fb-be4a-7192b576b628) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:11:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:49.065 104629 INFO neutron.agent.ovn.metadata.agent [-] Port e1d81dc2-e73c-45fb-be4a-7192b576b628 in datapath 83f76843-09f1-4ec7-b234-50118063210a unbound from our chassis
Jan 22 17:11:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:49.068 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 83f76843-09f1-4ec7-b234-50118063210a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:11:49 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000012.scope: Deactivated successfully.
Jan 22 17:11:49 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000012.scope: Consumed 13.086s CPU time.
Jan 22 17:11:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:49.070 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[02b016ba-5724-4f3e-8d99-9c8496a026e6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:49.072 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-83f76843-09f1-4ec7-b234-50118063210a namespace which is not needed anymore
Jan 22 17:11:49 compute-0 systemd-machined[154382]: Machine qemu-18-instance-00000012 terminated.
Jan 22 17:11:49 compute-0 nova_compute[183075]: 2026-01-22 17:11:49.148 183079 DEBUG oslo_concurrency.lockutils [req-9f7f9b32-1fd4-4721-b896-79254748a610 req-eb2a1461-019a-460e-9056-5313e7438e5a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-d54ce6ac-7fff-4f20-a6e0-48c13efded58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:11:49 compute-0 nova_compute[183075]: 2026-01-22 17:11:49.149 183079 DEBUG oslo_concurrency.lockutils [req-888e0255-6f40-4595-990d-f29c110783ec req-e27a4c25-6b38-4008-b5fb-af1e31cc5d06 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-d54ce6ac-7fff-4f20-a6e0-48c13efded58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:11:49 compute-0 nova_compute[183075]: 2026-01-22 17:11:49.150 183079 DEBUG nova.network.neutron [req-888e0255-6f40-4595-990d-f29c110783ec req-e27a4c25-6b38-4008-b5fb-af1e31cc5d06 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Refreshing network info cache for port e1d81dc2-e73c-45fb-be4a-7192b576b628 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:11:49 compute-0 nova_compute[183075]: 2026-01-22 17:11:49.203 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:49 compute-0 nova_compute[183075]: 2026-01-22 17:11:49.209 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:49 compute-0 nova_compute[183075]: 2026-01-22 17:11:49.239 183079 INFO nova.virt.libvirt.driver [-] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Instance destroyed successfully.
Jan 22 17:11:49 compute-0 nova_compute[183075]: 2026-01-22 17:11:49.240 183079 DEBUG nova.objects.instance [None req-e5745dfb-4ac0-4bfb-be0f-810822f07609 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lazy-loading 'resources' on Instance uuid d54ce6ac-7fff-4f20-a6e0-48c13efded58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:11:49 compute-0 nova_compute[183075]: 2026-01-22 17:11:49.387 183079 DEBUG nova.virt.libvirt.vif [None req-e5745dfb-4ac0-4bfb-be0f-810822f07609 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:11:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-124054286',display_name='tempest-server-test-124054286',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-124054286',id=18,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA4izGuVdf36SsG+7n8kX9aNpboq22Z55adiWGM5qlH08LxqMkSxkCnGlFdsMKL8t/vQsOXqbCU1vgc4to/WoKVrvDSrylB83cxSgDIuuaEZv45HgYlb5csi4YLKl3Bk4g==',key_name='tempest-keypair-test-110348497',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:11:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c2b37b797ca344f2b31c3861277068d8',ramdisk_id='',reservation_id='r-a96ualme',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIpMultipleRoutersTest-2036232412',owner_user_name='tempest-FloatingIpMultipleRoutersTest-2036232412-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:11:15Z,user_data=None,user_id='28bc4852545149e59d0541d4f39eb38e',uuid=d54ce6ac-7fff-4f20-a6e0-48c13efded58,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e1d81dc2-e73c-45fb-be4a-7192b576b628", "address": "fa:16:3e:65:16:89", "network": {"id": "83f76843-09f1-4ec7-b234-50118063210a", "bridge": "br-int", "label": "tempest-test-network--33948725", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1d81dc2-e7", "ovs_interfaceid": "e1d81dc2-e73c-45fb-be4a-7192b576b628", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:11:49 compute-0 nova_compute[183075]: 2026-01-22 17:11:49.388 183079 DEBUG nova.network.os_vif_util [None req-e5745dfb-4ac0-4bfb-be0f-810822f07609 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Converting VIF {"id": "e1d81dc2-e73c-45fb-be4a-7192b576b628", "address": "fa:16:3e:65:16:89", "network": {"id": "83f76843-09f1-4ec7-b234-50118063210a", "bridge": "br-int", "label": "tempest-test-network--33948725", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1d81dc2-e7", "ovs_interfaceid": "e1d81dc2-e73c-45fb-be4a-7192b576b628", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:11:49 compute-0 nova_compute[183075]: 2026-01-22 17:11:49.389 183079 DEBUG nova.network.os_vif_util [None req-e5745dfb-4ac0-4bfb-be0f-810822f07609 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:16:89,bridge_name='br-int',has_traffic_filtering=True,id=e1d81dc2-e73c-45fb-be4a-7192b576b628,network=Network(83f76843-09f1-4ec7-b234-50118063210a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1d81dc2-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:11:49 compute-0 nova_compute[183075]: 2026-01-22 17:11:49.390 183079 DEBUG os_vif [None req-e5745dfb-4ac0-4bfb-be0f-810822f07609 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:16:89,bridge_name='br-int',has_traffic_filtering=True,id=e1d81dc2-e73c-45fb-be4a-7192b576b628,network=Network(83f76843-09f1-4ec7-b234-50118063210a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1d81dc2-e7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:11:49 compute-0 nova_compute[183075]: 2026-01-22 17:11:49.393 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:49 compute-0 nova_compute[183075]: 2026-01-22 17:11:49.394 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape1d81dc2-e7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:49 compute-0 nova_compute[183075]: 2026-01-22 17:11:49.446 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:49 compute-0 nova_compute[183075]: 2026-01-22 17:11:49.449 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:49 compute-0 neutron-haproxy-ovnmeta-83f76843-09f1-4ec7-b234-50118063210a[219438]: [NOTICE]   (219442) : haproxy version is 2.8.14-c23fe91
Jan 22 17:11:49 compute-0 neutron-haproxy-ovnmeta-83f76843-09f1-4ec7-b234-50118063210a[219438]: [NOTICE]   (219442) : path to executable is /usr/sbin/haproxy
Jan 22 17:11:49 compute-0 neutron-haproxy-ovnmeta-83f76843-09f1-4ec7-b234-50118063210a[219438]: [WARNING]  (219442) : Exiting Master process...
Jan 22 17:11:49 compute-0 neutron-haproxy-ovnmeta-83f76843-09f1-4ec7-b234-50118063210a[219438]: [WARNING]  (219442) : Exiting Master process...
Jan 22 17:11:49 compute-0 nova_compute[183075]: 2026-01-22 17:11:49.452 183079 INFO os_vif [None req-e5745dfb-4ac0-4bfb-be0f-810822f07609 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:16:89,bridge_name='br-int',has_traffic_filtering=True,id=e1d81dc2-e73c-45fb-be4a-7192b576b628,network=Network(83f76843-09f1-4ec7-b234-50118063210a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1d81dc2-e7')
Jan 22 17:11:49 compute-0 nova_compute[183075]: 2026-01-22 17:11:49.452 183079 INFO nova.virt.libvirt.driver [None req-e5745dfb-4ac0-4bfb-be0f-810822f07609 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Deleting instance files /var/lib/nova/instances/d54ce6ac-7fff-4f20-a6e0-48c13efded58_del
Jan 22 17:11:49 compute-0 neutron-haproxy-ovnmeta-83f76843-09f1-4ec7-b234-50118063210a[219438]: [ALERT]    (219442) : Current worker (219444) exited with code 143 (Terminated)
Jan 22 17:11:49 compute-0 neutron-haproxy-ovnmeta-83f76843-09f1-4ec7-b234-50118063210a[219438]: [WARNING]  (219442) : All workers exited. Exiting... (0)
Jan 22 17:11:49 compute-0 nova_compute[183075]: 2026-01-22 17:11:49.453 183079 INFO nova.virt.libvirt.driver [None req-e5745dfb-4ac0-4bfb-be0f-810822f07609 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Deletion of /var/lib/nova/instances/d54ce6ac-7fff-4f20-a6e0-48c13efded58_del complete
Jan 22 17:11:49 compute-0 systemd[1]: libpod-13d088dd1087fe7421decb8a8c4e1ecee3d0b6ddf440c37422f8f2045d897c14.scope: Deactivated successfully.
Jan 22 17:11:49 compute-0 podman[220035]: 2026-01-22 17:11:49.461708457 +0000 UTC m=+0.285826473 container died 13d088dd1087fe7421decb8a8c4e1ecee3d0b6ddf440c37422f8f2045d897c14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-83f76843-09f1-4ec7-b234-50118063210a, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 17:11:49 compute-0 nova_compute[183075]: 2026-01-22 17:11:49.542 183079 INFO nova.compute.manager [None req-ae3f82d4-8311-4d8d-afc8-40a2e2179698 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Get console output
Jan 22 17:11:49 compute-0 nova_compute[183075]: 2026-01-22 17:11:49.551 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:11:49 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-13d088dd1087fe7421decb8a8c4e1ecee3d0b6ddf440c37422f8f2045d897c14-userdata-shm.mount: Deactivated successfully.
Jan 22 17:11:49 compute-0 systemd[1]: var-lib-containers-storage-overlay-1f0130c894500ce812c42f46180af996b668b5c463d553ab9ae157b9d45606d2-merged.mount: Deactivated successfully.
Jan 22 17:11:49 compute-0 nova_compute[183075]: 2026-01-22 17:11:49.571 183079 INFO nova.compute.manager [None req-e5745dfb-4ac0-4bfb-be0f-810822f07609 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Took 0.59 seconds to destroy the instance on the hypervisor.
Jan 22 17:11:49 compute-0 nova_compute[183075]: 2026-01-22 17:11:49.572 183079 DEBUG oslo.service.loopingcall [None req-e5745dfb-4ac0-4bfb-be0f-810822f07609 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:11:49 compute-0 nova_compute[183075]: 2026-01-22 17:11:49.573 183079 DEBUG nova.compute.manager [-] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:11:49 compute-0 nova_compute[183075]: 2026-01-22 17:11:49.573 183079 DEBUG nova.network.neutron [-] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:11:49 compute-0 podman[220035]: 2026-01-22 17:11:49.581745736 +0000 UTC m=+0.405863782 container cleanup 13d088dd1087fe7421decb8a8c4e1ecee3d0b6ddf440c37422f8f2045d897c14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-83f76843-09f1-4ec7-b234-50118063210a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 17:11:49 compute-0 systemd[1]: libpod-conmon-13d088dd1087fe7421decb8a8c4e1ecee3d0b6ddf440c37422f8f2045d897c14.scope: Deactivated successfully.
Jan 22 17:11:49 compute-0 podman[220063]: 2026-01-22 17:11:49.609614323 +0000 UTC m=+0.107539355 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:11:49 compute-0 podman[220095]: 2026-01-22 17:11:49.663244111 +0000 UTC m=+0.055908538 container remove 13d088dd1087fe7421decb8a8c4e1ecee3d0b6ddf440c37422f8f2045d897c14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-83f76843-09f1-4ec7-b234-50118063210a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:11:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:49.670 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f7bbdb3b-37e2-412d-977c-a7d3cf44f288]: (4, ('Thu Jan 22 05:11:49 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-83f76843-09f1-4ec7-b234-50118063210a (13d088dd1087fe7421decb8a8c4e1ecee3d0b6ddf440c37422f8f2045d897c14)\n13d088dd1087fe7421decb8a8c4e1ecee3d0b6ddf440c37422f8f2045d897c14\nThu Jan 22 05:11:49 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-83f76843-09f1-4ec7-b234-50118063210a (13d088dd1087fe7421decb8a8c4e1ecee3d0b6ddf440c37422f8f2045d897c14)\n13d088dd1087fe7421decb8a8c4e1ecee3d0b6ddf440c37422f8f2045d897c14\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:49.673 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c6530bc1-aa00-4d98-bfbc-772532fa9e68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:49.674 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap83f76843-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:49 compute-0 nova_compute[183075]: 2026-01-22 17:11:49.676 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:49 compute-0 kernel: tap83f76843-00: left promiscuous mode
Jan 22 17:11:49 compute-0 nova_compute[183075]: 2026-01-22 17:11:49.691 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:49.695 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[faece665-cdd0-479b-9b18-11892982ec29]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:49.717 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4a571a9d-ed1e-45ad-a8cd-048d668de1e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:49.718 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b0d92b5b-442d-46ad-9ac7-0b19402115d8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:49.736 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[69f384ec-b455-4fb0-b738-cb3f7d38cdc6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 423801, 'reachable_time': 33028, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220117, 'error': None, 'target': 'ovnmeta-83f76843-09f1-4ec7-b234-50118063210a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:49.738 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-83f76843-09f1-4ec7-b234-50118063210a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:11:49 compute-0 systemd[1]: run-netns-ovnmeta\x2d83f76843\x2d09f1\x2d4ec7\x2db234\x2d50118063210a.mount: Deactivated successfully.
Jan 22 17:11:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:49.739 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[1074d581-5b81-4a37-af5f-19eec2cfae48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.200 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.216 183079 DEBUG nova.network.neutron [-] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.234 183079 INFO nova.compute.manager [-] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Took 0.66 seconds to deallocate network for instance.
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.279 183079 DEBUG oslo_concurrency.lockutils [None req-e5745dfb-4ac0-4bfb-be0f-810822f07609 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.279 183079 DEBUG oslo_concurrency.lockutils [None req-e5745dfb-4ac0-4bfb-be0f-810822f07609 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.310 183079 DEBUG nova.scheduler.client.report [None req-e5745dfb-4ac0-4bfb-be0f-810822f07609 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Refreshing inventories for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.442 183079 DEBUG nova.scheduler.client.report [None req-e5745dfb-4ac0-4bfb-be0f-810822f07609 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Updating ProviderTree inventory for provider 2513134c-f67c-4237-84bf-4ebe2450d610 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.442 183079 DEBUG nova.compute.provider_tree [None req-e5745dfb-4ac0-4bfb-be0f-810822f07609 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Updating inventory in ProviderTree for provider 2513134c-f67c-4237-84bf-4ebe2450d610 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.464 183079 DEBUG nova.scheduler.client.report [None req-e5745dfb-4ac0-4bfb-be0f-810822f07609 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Refreshing aggregate associations for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.491 183079 DEBUG nova.scheduler.client.report [None req-e5745dfb-4ac0-4bfb-be0f-810822f07609 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Refreshing trait associations for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610, traits: COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SVM,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE42,COMPUTE_NODE,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE4A,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_FMA3,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_AUTO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.556 183079 DEBUG nova.network.neutron [req-888e0255-6f40-4595-990d-f29c110783ec req-e27a4c25-6b38-4008-b5fb-af1e31cc5d06 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Updated VIF entry in instance network info cache for port e1d81dc2-e73c-45fb-be4a-7192b576b628. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.556 183079 DEBUG nova.network.neutron [req-888e0255-6f40-4595-990d-f29c110783ec req-e27a4c25-6b38-4008-b5fb-af1e31cc5d06 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Updating instance_info_cache with network_info: [{"id": "e1d81dc2-e73c-45fb-be4a-7192b576b628", "address": "fa:16:3e:65:16:89", "network": {"id": "83f76843-09f1-4ec7-b234-50118063210a", "bridge": "br-int", "label": "tempest-test-network--33948725", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1d81dc2-e7", "ovs_interfaceid": "e1d81dc2-e73c-45fb-be4a-7192b576b628", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.584 183079 DEBUG oslo_concurrency.lockutils [req-888e0255-6f40-4595-990d-f29c110783ec req-e27a4c25-6b38-4008-b5fb-af1e31cc5d06 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-d54ce6ac-7fff-4f20-a6e0-48c13efded58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.659 183079 DEBUG nova.compute.provider_tree [None req-e5745dfb-4ac0-4bfb-be0f-810822f07609 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.674 183079 DEBUG nova.scheduler.client.report [None req-e5745dfb-4ac0-4bfb-be0f-810822f07609 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.695 183079 DEBUG oslo_concurrency.lockutils [None req-e5745dfb-4ac0-4bfb-be0f-810822f07609 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.416s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.732 183079 INFO nova.scheduler.client.report [None req-e5745dfb-4ac0-4bfb-be0f-810822f07609 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Deleted allocations for instance d54ce6ac-7fff-4f20-a6e0-48c13efded58
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.818 183079 DEBUG oslo_concurrency.lockutils [None req-e5745dfb-4ac0-4bfb-be0f-810822f07609 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "d54ce6ac-7fff-4f20-a6e0-48c13efded58" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.845s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.823 183079 INFO nova.compute.manager [None req-3cda5a44-6cdf-4da1-b637-b9280d54ecbf 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Get console output
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.832 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.917 183079 DEBUG nova.compute.manager [req-49d07b6c-b4ec-41ba-89c7-cdc5a3302ac3 req-911aa76f-666a-401f-9421-245c3aa41309 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Received event network-vif-unplugged-e1d81dc2-e73c-45fb-be4a-7192b576b628 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.918 183079 DEBUG oslo_concurrency.lockutils [req-49d07b6c-b4ec-41ba-89c7-cdc5a3302ac3 req-911aa76f-666a-401f-9421-245c3aa41309 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "d54ce6ac-7fff-4f20-a6e0-48c13efded58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.918 183079 DEBUG oslo_concurrency.lockutils [req-49d07b6c-b4ec-41ba-89c7-cdc5a3302ac3 req-911aa76f-666a-401f-9421-245c3aa41309 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "d54ce6ac-7fff-4f20-a6e0-48c13efded58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.919 183079 DEBUG oslo_concurrency.lockutils [req-49d07b6c-b4ec-41ba-89c7-cdc5a3302ac3 req-911aa76f-666a-401f-9421-245c3aa41309 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "d54ce6ac-7fff-4f20-a6e0-48c13efded58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.919 183079 DEBUG nova.compute.manager [req-49d07b6c-b4ec-41ba-89c7-cdc5a3302ac3 req-911aa76f-666a-401f-9421-245c3aa41309 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] No waiting events found dispatching network-vif-unplugged-e1d81dc2-e73c-45fb-be4a-7192b576b628 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.920 183079 WARNING nova.compute.manager [req-49d07b6c-b4ec-41ba-89c7-cdc5a3302ac3 req-911aa76f-666a-401f-9421-245c3aa41309 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Received unexpected event network-vif-unplugged-e1d81dc2-e73c-45fb-be4a-7192b576b628 for instance with vm_state deleted and task_state None.
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.920 183079 DEBUG nova.compute.manager [req-49d07b6c-b4ec-41ba-89c7-cdc5a3302ac3 req-911aa76f-666a-401f-9421-245c3aa41309 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Received event network-vif-plugged-e1d81dc2-e73c-45fb-be4a-7192b576b628 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.920 183079 DEBUG oslo_concurrency.lockutils [req-49d07b6c-b4ec-41ba-89c7-cdc5a3302ac3 req-911aa76f-666a-401f-9421-245c3aa41309 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "d54ce6ac-7fff-4f20-a6e0-48c13efded58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.921 183079 DEBUG oslo_concurrency.lockutils [req-49d07b6c-b4ec-41ba-89c7-cdc5a3302ac3 req-911aa76f-666a-401f-9421-245c3aa41309 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "d54ce6ac-7fff-4f20-a6e0-48c13efded58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.921 183079 DEBUG oslo_concurrency.lockutils [req-49d07b6c-b4ec-41ba-89c7-cdc5a3302ac3 req-911aa76f-666a-401f-9421-245c3aa41309 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "d54ce6ac-7fff-4f20-a6e0-48c13efded58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.921 183079 DEBUG nova.compute.manager [req-49d07b6c-b4ec-41ba-89c7-cdc5a3302ac3 req-911aa76f-666a-401f-9421-245c3aa41309 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] No waiting events found dispatching network-vif-plugged-e1d81dc2-e73c-45fb-be4a-7192b576b628 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.922 183079 WARNING nova.compute.manager [req-49d07b6c-b4ec-41ba-89c7-cdc5a3302ac3 req-911aa76f-666a-401f-9421-245c3aa41309 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Received unexpected event network-vif-plugged-e1d81dc2-e73c-45fb-be4a-7192b576b628 for instance with vm_state deleted and task_state None.
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.922 183079 DEBUG nova.compute.manager [req-49d07b6c-b4ec-41ba-89c7-cdc5a3302ac3 req-911aa76f-666a-401f-9421-245c3aa41309 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Received event network-vif-deleted-e1d81dc2-e73c-45fb-be4a-7192b576b628 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.923 183079 INFO nova.compute.manager [req-49d07b6c-b4ec-41ba-89c7-cdc5a3302ac3 req-911aa76f-666a-401f-9421-245c3aa41309 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Neutron deleted interface e1d81dc2-e73c-45fb-be4a-7192b576b628; detaching it from the instance and deleting it from the info cache
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.923 183079 DEBUG nova.network.neutron [req-49d07b6c-b4ec-41ba-89c7-cdc5a3302ac3 req-911aa76f-666a-401f-9421-245c3aa41309 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Jan 22 17:11:50 compute-0 nova_compute[183075]: 2026-01-22 17:11:50.927 183079 DEBUG nova.compute.manager [req-49d07b6c-b4ec-41ba-89c7-cdc5a3302ac3 req-911aa76f-666a-401f-9421-245c3aa41309 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Detach interface failed, port_id=e1d81dc2-e73c-45fb-be4a-7192b576b628, reason: Instance d54ce6ac-7fff-4f20-a6e0-48c13efded58 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 22 17:11:51 compute-0 nova_compute[183075]: 2026-01-22 17:11:51.954 183079 DEBUG oslo_concurrency.lockutils [None req-909f71d0-b3e4-4fad-a6f7-5b5fa8a9f4a6 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Acquiring lock "7d7be65d-c615-4cfd-936e-e5b57b3f29c1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:51 compute-0 nova_compute[183075]: 2026-01-22 17:11:51.955 183079 DEBUG oslo_concurrency.lockutils [None req-909f71d0-b3e4-4fad-a6f7-5b5fa8a9f4a6 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "7d7be65d-c615-4cfd-936e-e5b57b3f29c1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:51 compute-0 nova_compute[183075]: 2026-01-22 17:11:51.955 183079 DEBUG oslo_concurrency.lockutils [None req-909f71d0-b3e4-4fad-a6f7-5b5fa8a9f4a6 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Acquiring lock "7d7be65d-c615-4cfd-936e-e5b57b3f29c1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:51 compute-0 nova_compute[183075]: 2026-01-22 17:11:51.956 183079 DEBUG oslo_concurrency.lockutils [None req-909f71d0-b3e4-4fad-a6f7-5b5fa8a9f4a6 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "7d7be65d-c615-4cfd-936e-e5b57b3f29c1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:51 compute-0 nova_compute[183075]: 2026-01-22 17:11:51.956 183079 DEBUG oslo_concurrency.lockutils [None req-909f71d0-b3e4-4fad-a6f7-5b5fa8a9f4a6 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "7d7be65d-c615-4cfd-936e-e5b57b3f29c1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:51 compute-0 nova_compute[183075]: 2026-01-22 17:11:51.958 183079 INFO nova.compute.manager [None req-909f71d0-b3e4-4fad-a6f7-5b5fa8a9f4a6 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Terminating instance
Jan 22 17:11:51 compute-0 nova_compute[183075]: 2026-01-22 17:11:51.960 183079 DEBUG nova.compute.manager [None req-909f71d0-b3e4-4fad-a6f7-5b5fa8a9f4a6 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:11:51 compute-0 kernel: tap1188a618-45 (unregistering): left promiscuous mode
Jan 22 17:11:51 compute-0 NetworkManager[55454]: <info>  [1769101911.9912] device (tap1188a618-45): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:11:52 compute-0 nova_compute[183075]: 2026-01-22 17:11:52.001 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:52 compute-0 ovn_controller[95372]: 2026-01-22T17:11:51Z|00210|binding|INFO|Releasing lport 1188a618-4567-453e-b4f1-8d3fafe1d314 from this chassis (sb_readonly=0)
Jan 22 17:11:52 compute-0 ovn_controller[95372]: 2026-01-22T17:11:52Z|00211|binding|INFO|Setting lport 1188a618-4567-453e-b4f1-8d3fafe1d314 down in Southbound
Jan 22 17:11:52 compute-0 ovn_controller[95372]: 2026-01-22T17:11:52Z|00212|binding|INFO|Removing iface tap1188a618-45 ovn-installed in OVS
Jan 22 17:11:52 compute-0 nova_compute[183075]: 2026-01-22 17:11:52.006 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.010 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:54:2a:05 10.100.0.13'], port_security=['fa:16:3e:54:2a:05 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '7d7be65d-c615-4cfd-936e-e5b57b3f29c1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ce346f8d-be8d-455f-b61c-12fea213a3f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c2b37b797ca344f2b31c3861277068d8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2f51838f-8a2c-425b-a70e-e288886c38d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.231'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9f29732e-c99f-480d-89f6-9caa444040c9, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=1188a618-4567-453e-b4f1-8d3fafe1d314) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.012 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 1188a618-4567-453e-b4f1-8d3fafe1d314 in datapath ce346f8d-be8d-455f-b61c-12fea213a3f4 unbound from our chassis
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.013 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ce346f8d-be8d-455f-b61c-12fea213a3f4
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.028 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8343c31e-6286-4a48-8a5b-b0f2a581ba18]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:52 compute-0 nova_compute[183075]: 2026-01-22 17:11:52.036 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:52 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Jan 22 17:11:52 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d0000000e.scope: Consumed 14.857s CPU time.
Jan 22 17:11:52 compute-0 systemd-machined[154382]: Machine qemu-14-instance-0000000e terminated.
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.091 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[6a6a0003-3a38-4adf-91bc-8e3248821bfe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.096 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[58284c85-f993-4538-ae99-c353feed36ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.140 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[9f780fa5-c23b-4b25-a10a-fdc16e45416d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.172 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[13f71906-23e2-4874-ade0-2678d389cbc7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapce346f8d-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:30:a7:4b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 202, 'tx_packets': 106, 'rx_bytes': 17308, 'tx_bytes': 12054, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 202, 'tx_packets': 106, 'rx_bytes': 17308, 'tx_bytes': 12054, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415010, 'reachable_time': 25211, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220129, 'error': None, 'target': 'ovnmeta-ce346f8d-be8d-455f-b61c-12fea213a3f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.196 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2c7fb3bd-9a2f-4a5d-9e73-c1ccbeae49e7]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapce346f8d-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415022, 'tstamp': 415022}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220131, 'error': None, 'target': 'ovnmeta-ce346f8d-be8d-455f-b61c-12fea213a3f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapce346f8d-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415024, 'tstamp': 415024}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220131, 'error': None, 'target': 'ovnmeta-ce346f8d-be8d-455f-b61c-12fea213a3f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.199 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapce346f8d-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:52 compute-0 nova_compute[183075]: 2026-01-22 17:11:52.202 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:52 compute-0 nova_compute[183075]: 2026-01-22 17:11:52.215 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.215 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapce346f8d-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.215 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.216 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapce346f8d-b0, col_values=(('external_ids', {'iface-id': '255f865e-6322-48b0-a0d1-c16ced648c78'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.216 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:11:52 compute-0 nova_compute[183075]: 2026-01-22 17:11:52.233 183079 INFO nova.virt.libvirt.driver [-] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Instance destroyed successfully.
Jan 22 17:11:52 compute-0 nova_compute[183075]: 2026-01-22 17:11:52.234 183079 DEBUG nova.objects.instance [None req-909f71d0-b3e4-4fad-a6f7-5b5fa8a9f4a6 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lazy-loading 'resources' on Instance uuid 7d7be65d-c615-4cfd-936e-e5b57b3f29c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:11:52 compute-0 nova_compute[183075]: 2026-01-22 17:11:52.246 183079 DEBUG nova.virt.libvirt.vif [None req-909f71d0-b3e4-4fad-a6f7-5b5fa8a9f4a6 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:10:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1811743565',display_name='tempest-server-test-1811743565',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1811743565',id=14,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA4izGuVdf36SsG+7n8kX9aNpboq22Z55adiWGM5qlH08LxqMkSxkCnGlFdsMKL8t/vQsOXqbCU1vgc4to/WoKVrvDSrylB83cxSgDIuuaEZv45HgYlb5csi4YLKl3Bk4g==',key_name='tempest-keypair-test-110348497',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:10:26Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c2b37b797ca344f2b31c3861277068d8',ramdisk_id='',reservation_id='r-9elkgxck',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIpMultipleRoutersTest-2036232412',owner_user_name='tempest-FloatingIpMultipleRoutersTest-2036232412-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:10:26Z,user_data=None,user_id='28bc4852545149e59d0541d4f39eb38e',uuid=7d7be65d-c615-4cfd-936e-e5b57b3f29c1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1188a618-4567-453e-b4f1-8d3fafe1d314", "address": "fa:16:3e:54:2a:05", "network": {"id": "ce346f8d-be8d-455f-b61c-12fea213a3f4", "bridge": "br-int", "label": "tempest-test-network--508539761", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1188a618-45", "ovs_interfaceid": "1188a618-4567-453e-b4f1-8d3fafe1d314", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:11:52 compute-0 nova_compute[183075]: 2026-01-22 17:11:52.246 183079 DEBUG nova.network.os_vif_util [None req-909f71d0-b3e4-4fad-a6f7-5b5fa8a9f4a6 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Converting VIF {"id": "1188a618-4567-453e-b4f1-8d3fafe1d314", "address": "fa:16:3e:54:2a:05", "network": {"id": "ce346f8d-be8d-455f-b61c-12fea213a3f4", "bridge": "br-int", "label": "tempest-test-network--508539761", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1188a618-45", "ovs_interfaceid": "1188a618-4567-453e-b4f1-8d3fafe1d314", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:11:52 compute-0 nova_compute[183075]: 2026-01-22 17:11:52.247 183079 DEBUG nova.network.os_vif_util [None req-909f71d0-b3e4-4fad-a6f7-5b5fa8a9f4a6 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:54:2a:05,bridge_name='br-int',has_traffic_filtering=True,id=1188a618-4567-453e-b4f1-8d3fafe1d314,network=Network(ce346f8d-be8d-455f-b61c-12fea213a3f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1188a618-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:11:52 compute-0 nova_compute[183075]: 2026-01-22 17:11:52.247 183079 DEBUG os_vif [None req-909f71d0-b3e4-4fad-a6f7-5b5fa8a9f4a6 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:54:2a:05,bridge_name='br-int',has_traffic_filtering=True,id=1188a618-4567-453e-b4f1-8d3fafe1d314,network=Network(ce346f8d-be8d-455f-b61c-12fea213a3f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1188a618-45') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:11:52 compute-0 nova_compute[183075]: 2026-01-22 17:11:52.249 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:52 compute-0 nova_compute[183075]: 2026-01-22 17:11:52.249 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1188a618-45, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:52 compute-0 nova_compute[183075]: 2026-01-22 17:11:52.251 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:52 compute-0 nova_compute[183075]: 2026-01-22 17:11:52.252 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:52 compute-0 nova_compute[183075]: 2026-01-22 17:11:52.254 183079 INFO os_vif [None req-909f71d0-b3e4-4fad-a6f7-5b5fa8a9f4a6 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:54:2a:05,bridge_name='br-int',has_traffic_filtering=True,id=1188a618-4567-453e-b4f1-8d3fafe1d314,network=Network(ce346f8d-be8d-455f-b61c-12fea213a3f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1188a618-45')
Jan 22 17:11:52 compute-0 nova_compute[183075]: 2026-01-22 17:11:52.255 183079 INFO nova.virt.libvirt.driver [None req-909f71d0-b3e4-4fad-a6f7-5b5fa8a9f4a6 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Deleting instance files /var/lib/nova/instances/7d7be65d-c615-4cfd-936e-e5b57b3f29c1_del
Jan 22 17:11:52 compute-0 nova_compute[183075]: 2026-01-22 17:11:52.256 183079 INFO nova.virt.libvirt.driver [None req-909f71d0-b3e4-4fad-a6f7-5b5fa8a9f4a6 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Deletion of /var/lib/nova/instances/7d7be65d-c615-4cfd-936e-e5b57b3f29c1_del complete
Jan 22 17:11:52 compute-0 nova_compute[183075]: 2026-01-22 17:11:52.300 183079 INFO nova.compute.manager [None req-909f71d0-b3e4-4fad-a6f7-5b5fa8a9f4a6 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Took 0.34 seconds to destroy the instance on the hypervisor.
Jan 22 17:11:52 compute-0 nova_compute[183075]: 2026-01-22 17:11:52.300 183079 DEBUG oslo.service.loopingcall [None req-909f71d0-b3e4-4fad-a6f7-5b5fa8a9f4a6 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:11:52 compute-0 nova_compute[183075]: 2026-01-22 17:11:52.300 183079 DEBUG nova.compute.manager [-] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:11:52 compute-0 nova_compute[183075]: 2026-01-22 17:11:52.301 183079 DEBUG nova.network.neutron [-] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.396 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.398 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.575 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.837 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.838 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.4404233
Jan 22 17:11:52 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[219946]: 10.100.0.9:47330 [22/Jan/2026:17:11:52.394] listener listener/metadata 0/0/0/443/443 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.850 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.853 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.879 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.880 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 168 time: 0.0277474
Jan 22 17:11:52 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[219946]: 10.100.0.9:47334 [22/Jan/2026:17:11:52.849] listener listener/metadata 0/0/0/30/30 200 152 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.886 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.887 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.908 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.908 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0209622
Jan 22 17:11:52 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[219946]: 10.100.0.9:47348 [22/Jan/2026:17:11:52.886] listener listener/metadata 0/0/0/22/22 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.914 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.915 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.932 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.932 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0171375
Jan 22 17:11:52 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[219946]: 10.100.0.9:47362 [22/Jan/2026:17:11:52.913] listener listener/metadata 0/0/0/19/19 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.938 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.939 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.957 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:52 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[219946]: 10.100.0.9:47372 [22/Jan/2026:17:11:52.937] listener listener/metadata 0/0/0/20/20 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.958 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0188265
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.962 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.963 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.989 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.989 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0257950
Jan 22 17:11:52 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[219946]: 10.100.0.9:47384 [22/Jan/2026:17:11:52.962] listener listener/metadata 0/0/0/27/27 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.996 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:52.997 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:11:52 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.016 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:53 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[219946]: 10.100.0.9:47390 [22/Jan/2026:17:11:52.995] listener listener/metadata 0/0/0/21/21 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.017 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0205548
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.022 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.023 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.042 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.043 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 151 time: 0.0201247
Jan 22 17:11:53 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[219946]: 10.100.0.9:47404 [22/Jan/2026:17:11:53.021] listener listener/metadata 0/0/0/21/21 200 135 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.048 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.048 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.069 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.069 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0212080
Jan 22 17:11:53 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[219946]: 10.100.0.9:47416 [22/Jan/2026:17:11:53.047] listener listener/metadata 0/0/0/22/22 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.077 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.078 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.098 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.099 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0203302
Jan 22 17:11:53 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[219946]: 10.100.0.9:47432 [22/Jan/2026:17:11:53.076] listener listener/metadata 0/0/0/22/22 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.110 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.111 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:53 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[219946]: 10.100.0.9:47434 [22/Jan/2026:17:11:53.110] listener listener/metadata 0/0/0/24/24 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.134 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0233700
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.146 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.147 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.165 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:53 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[219946]: 10.100.0.9:47448 [22/Jan/2026:17:11:53.146] listener listener/metadata 0/0/0/20/20 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.166 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0197117
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.170 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.171 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.190 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.191 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0195470
Jan 22 17:11:53 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[219946]: 10.100.0.9:47456 [22/Jan/2026:17:11:53.170] listener listener/metadata 0/0/0/20/20 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.199 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.199 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.215 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:53 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[219946]: 10.100.0.9:47470 [22/Jan/2026:17:11:53.198] listener listener/metadata 0/0/0/17/17 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.216 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0168071
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.226 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.227 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.243 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.243 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0160518
Jan 22 17:11:53 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[219946]: 10.100.0.9:47472 [22/Jan/2026:17:11:53.226] listener listener/metadata 0/0/0/17/17 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.252 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.252 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.273 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:11:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:53.273 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0210533
Jan 22 17:11:53 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[219946]: 10.100.0.9:47482 [22/Jan/2026:17:11:53.251] listener listener/metadata 0/0/0/22/22 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:11:54 compute-0 nova_compute[183075]: 2026-01-22 17:11:54.254 183079 DEBUG nova.compute.manager [req-7d5f5dbf-fc43-4202-b57a-aa37747fdc8f req-98d22f82-b500-4a6d-86ea-dd2f6452d68e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Received event network-vif-unplugged-1188a618-4567-453e-b4f1-8d3fafe1d314 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:11:54 compute-0 nova_compute[183075]: 2026-01-22 17:11:54.255 183079 DEBUG oslo_concurrency.lockutils [req-7d5f5dbf-fc43-4202-b57a-aa37747fdc8f req-98d22f82-b500-4a6d-86ea-dd2f6452d68e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "7d7be65d-c615-4cfd-936e-e5b57b3f29c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:54 compute-0 nova_compute[183075]: 2026-01-22 17:11:54.255 183079 DEBUG oslo_concurrency.lockutils [req-7d5f5dbf-fc43-4202-b57a-aa37747fdc8f req-98d22f82-b500-4a6d-86ea-dd2f6452d68e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7d7be65d-c615-4cfd-936e-e5b57b3f29c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:54 compute-0 nova_compute[183075]: 2026-01-22 17:11:54.255 183079 DEBUG oslo_concurrency.lockutils [req-7d5f5dbf-fc43-4202-b57a-aa37747fdc8f req-98d22f82-b500-4a6d-86ea-dd2f6452d68e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7d7be65d-c615-4cfd-936e-e5b57b3f29c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:54 compute-0 nova_compute[183075]: 2026-01-22 17:11:54.255 183079 DEBUG nova.compute.manager [req-7d5f5dbf-fc43-4202-b57a-aa37747fdc8f req-98d22f82-b500-4a6d-86ea-dd2f6452d68e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] No waiting events found dispatching network-vif-unplugged-1188a618-4567-453e-b4f1-8d3fafe1d314 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:11:54 compute-0 nova_compute[183075]: 2026-01-22 17:11:54.255 183079 DEBUG nova.compute.manager [req-7d5f5dbf-fc43-4202-b57a-aa37747fdc8f req-98d22f82-b500-4a6d-86ea-dd2f6452d68e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Received event network-vif-unplugged-1188a618-4567-453e-b4f1-8d3fafe1d314 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:11:54 compute-0 nova_compute[183075]: 2026-01-22 17:11:54.391 183079 INFO nova.compute.manager [None req-95368924-861c-4c99-8cc0-0b412cdd82e8 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Get console output
Jan 22 17:11:54 compute-0 nova_compute[183075]: 2026-01-22 17:11:54.398 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:11:55 compute-0 nova_compute[183075]: 2026-01-22 17:11:55.110 183079 INFO nova.compute.manager [None req-1ac47362-0fdc-403c-b7c2-14d5a0ae220a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Get console output
Jan 22 17:11:55 compute-0 nova_compute[183075]: 2026-01-22 17:11:55.119 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:11:55 compute-0 nova_compute[183075]: 2026-01-22 17:11:55.203 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.452 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '936001bf-d51b-4243-87b8-e363ef3c47a8', 'name': 'tempest-server-test-1891253532', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000014', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e05c7aae349e4a1d859a387df45650a0', 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'hostId': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.456 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c1a1134b-933b-41d1-ba12-adb71c18d006', 'name': 'tempest-server-test-38386895', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000011', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'bfc6667804934c92b71ce7638089e9e3', 'user_id': '1e61127d65144bcbaa0d43fe3eb484c0', 'hostId': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.459 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc', 'name': 'tempest-server-test-939954687', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000010', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '26cca885d303443380036cbbe9e70744', 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'hostId': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.462 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'name': 'tempest-server-test-233401537', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000b', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c2b37b797ca344f2b31c3861277068d8', 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'hostId': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.465 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'e69f0100-85ca-4ff8-a177-27d35d4580de', 'name': 'tempest-server-test-1314522391', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000013', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '26cca885d303443380036cbbe9e70744', 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'hostId': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.465 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.487 12 DEBUG ceilometer.compute.pollsters [-] 936001bf-d51b-4243-87b8-e363ef3c47a8/disk.device.write.requests volume: 316 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.514 12 DEBUG ceilometer.compute.pollsters [-] c1a1134b-933b-41d1-ba12-adb71c18d006/disk.device.write.requests volume: 319 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.531 12 DEBUG ceilometer.compute.pollsters [-] 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/disk.device.write.requests volume: 335 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.549 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk.device.write.requests volume: 348 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.574 12 DEBUG ceilometer.compute.pollsters [-] e69f0100-85ca-4ff8-a177-27d35d4580de/disk.device.write.requests volume: 291 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 nova_compute[183075]: 2026-01-22 17:11:55.576 183079 DEBUG nova.network.neutron [-] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd455e606-e1e9-4afb-9dfd-c6e184375d8b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 316, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': '936001bf-d51b-4243-87b8-e363ef3c47a8-vda', 'timestamp': '2026-01-22T17:11:55.466175', 'resource_metadata': {'display_name': 'tempest-server-test-1891253532', 'name': 'instance-00000014', 'instance_id': '936001bf-d51b-4243-87b8-e363ef3c47a8', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '73f241fe-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.226648524, 'message_signature': 'f5590b33afaf731c774e68876e953b870d177a8bd61270f64207ad0001cdc73f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 319, 'user_id': '1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 'c1a1134b-933b-41d1-ba12-adb71c18d006-vda', 'timestamp': '2026-01-22T17:11:55.466175', 'resource_metadata': {'display_name': 'tempest-server-test-38386895', 'name': 'instance-00000011', 'instance_id': 'c1a1134b-933b-41d1-ba12-adb71c18d006', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '73f66a7c-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.249044838, 'message_signature': 'cfbae373736d807e78f964a3a62fd047461e928510b0e276a0dde95c5d8b9b3a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 335, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-vda', 'timestamp': '2026-01-22T17:11:55.466175', 'resource_metadata': {'display_name': 'tempest-server-test-939954687', 'name': 'instance-00000010', 'instance_id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '73f8f922-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.276392581, 'message_signature': '9b08dab45fdc83307aa9ae6797ad690ef39ba2b29e8db35cb30a088ecfe146a3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 348, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764-vda', 'timestamp': '2026-01-22T17:11:55.466175', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'instance-0000000b', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '73fbbd9c-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.292967533, 'message_signature': '84fc01b1cdc8e8a0a0d708cc99de86a5fe81d8292cd4a710ed2aa24b8f6fc82a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 291, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de-vda', 'timestamp': '2026-01-22T17:11:55.466175', 'resource_metadata': {'display_name': 'tempest-server-test-1314522391', 'name': 'instance-00000013', 'instance_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '73ff81b6-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.31127652, 'message_signature': '5fed292bd8cbf89e7536ece7e36b6996e594d86bbb8935407e19fe2726701a79'}]}, 'timestamp': '2026-01-22 17:11:55.575437', '_unique_id': 'd7699d1fa559425ca99c324d6c21fd13'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.577 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.579 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.582 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 936001bf-d51b-4243-87b8-e363ef3c47a8 / tap804a64f5-79 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.582 12 DEBUG ceilometer.compute.pollsters [-] 936001bf-d51b-4243-87b8-e363ef3c47a8/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.585 12 DEBUG ceilometer.compute.pollsters [-] c1a1134b-933b-41d1-ba12-adb71c18d006/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.587 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc / tapc26b2385-71 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.587 12 DEBUG ceilometer.compute.pollsters [-] 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.589 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.591 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for e69f0100-85ca-4ff8-a177-27d35d4580de / tap0e3bc449-87 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.592 12 DEBUG ceilometer.compute.pollsters [-] e69f0100-85ca-4ff8-a177-27d35d4580de/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6cb6740d-a9ff-4763-88e8-3104afe24278', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': 'instance-00000014-936001bf-d51b-4243-87b8-e363ef3c47a8-tap804a64f5-79', 'timestamp': '2026-01-22T17:11:55.579455', 'resource_metadata': {'display_name': 'tempest-server-test-1891253532', 'name': 'tap804a64f5-79', 'instance_id': '936001bf-d51b-4243-87b8-e363ef3c47a8', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3a:03:2f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap804a64f5-79'}, 'message_id': '7400a514-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.339918457, 'message_signature': 'd159f286eb216416e7a0a8a16cbfe66237b6cd7c7fb213c2fa912434513fa2da'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 'instance-00000011-c1a1134b-933b-41d1-ba12-adb71c18d006-tap096b36b4-87', 'timestamp': '2026-01-22T17:11:55.579455', 'resource_metadata': {'display_name': 'tempest-server-test-38386895', 'name': 'tap096b36b4-87', 'instance_id': 'c1a1134b-933b-41d1-ba12-adb71c18d006', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f6:53:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap096b36b4-87'}, 'message_id': '74010f5e-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.343158932, 'message_signature': 'd11e2abc2f2dd581bfabed49a1b8302df0cb0f468ea7c57508cec41cc6360baf'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': 'instance-00000010-36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-tapc26b2385-71', 'timestamp': '2026-01-22T17:11:55.579455', 'resource_metadata': {'display_name': 'tempest-server-test-939954687', 'name': 'tapc26b2385-71', 'instance_id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3c:de:8a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc26b2385-71'}, 'message_id': '74016bb6-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.345831821, 'message_signature': 'd2922238339dfdd1438f275eb61a8aadf6bc07ebaa30a73c15385c972055f664'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': 'instance-0000000b-7e8d077b-66fc-42ee-ad4e-a13327ad6764-tap5644ae2a-c3', 'timestamp': '2026-01-22T17:11:55.579455', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'tap5644ae2a-c3', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:c8:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5644ae2a-c3'}, 'message_id': '7401c3fe-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.348179582, 'message_signature': '61a960b1195e054c24b688e976aa23f1aa673ee28020767945a54ef8e6e2db7f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': 'instance-00000013-e69f0100-85ca-4ff8-a177-27d35d4580de-tap0e3bc449-87', 'timestamp': '2026-01-22T17:11:55.579455', 'resource_metadata': {'display_name': 'tempest-server-test-1314522391', 'name': 'tap0e3bc449-87', 'instance_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:22:e2:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0e3bc449-87'}, 'message_id': '74021a98-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.350434941, 'message_signature': 'e34e022949eaac43c8415b5013e960ff733b827e5337599a23ef03ea7bf0d0a1'}]}, 'timestamp': '2026-01-22 17:11:55.592276', '_unique_id': '346bf681ed384829a95e0dcafeaffc12'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.593 12 DEBUG ceilometer.compute.pollsters [-] 936001bf-d51b-4243-87b8-e363ef3c47a8/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.594 12 DEBUG ceilometer.compute.pollsters [-] c1a1134b-933b-41d1-ba12-adb71c18d006/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.594 12 DEBUG ceilometer.compute.pollsters [-] 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.594 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.594 12 DEBUG ceilometer.compute.pollsters [-] e69f0100-85ca-4ff8-a177-27d35d4580de/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad4f4584-cc08-4b1a-841c-1613e84a843e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': 'instance-00000014-936001bf-d51b-4243-87b8-e363ef3c47a8-tap804a64f5-79', 'timestamp': '2026-01-22T17:11:55.593860', 'resource_metadata': {'display_name': 'tempest-server-test-1891253532', 'name': 'tap804a64f5-79', 'instance_id': '936001bf-d51b-4243-87b8-e363ef3c47a8', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3a:03:2f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap804a64f5-79'}, 'message_id': '7402623c-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.339918457, 'message_signature': '106b583c3c3d7a811d99befbcb5290aea709b11b7f048d6c33be532c75da78cb'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 'instance-00000011-c1a1134b-933b-41d1-ba12-adb71c18d006-tap096b36b4-87', 'timestamp': '2026-01-22T17:11:55.593860', 'resource_metadata': {'display_name': 'tempest-server-test-38386895', 'name': 'tap096b36b4-87', 'instance_id': 'c1a1134b-933b-41d1-ba12-adb71c18d006', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f6:53:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap096b36b4-87'}, 'message_id': '74026a7a-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.343158932, 'message_signature': 'dc00a7b43d87a5a0ad83b4f18fdfc630c6d226b0c24508d5529898829040e1b0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': 'instance-00000010-36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-tapc26b2385-71', 'timestamp': '2026-01-22T17:11:55.593860', 'resource_metadata': {'display_name': 'tempest-server-test-939954687', 'name': 'tapc26b2385-71', 'instance_id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3c:de:8a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc26b2385-71'}, 'message_id': '74027268-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.345831821, 'message_signature': '94a4745e6e113c40563ff162b657ceff942f0f460d21c67cf13c1ddf207cc806'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': 'instance-0000000b-7e8d077b-66fc-42ee-ad4e-a13327ad6764-tap5644ae2a-c3', 'timestamp': '2026-01-22T17:11:55.593860', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'tap5644ae2a-c3', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:c8:e5', 'fref': 
None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5644ae2a-c3'}, 'message_id': '74027c40-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.348179582, 'message_signature': '9daa2a21145ae818b2e691b7aea4ebdb8ceda3f37bc75afca9f629853fb81e59'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': 'instance-00000013-e69f0100-85ca-4ff8-a177-27d35d4580de-tap0e3bc449-87', 'timestamp': '2026-01-22T17:11:55.593860', 'resource_metadata': {'display_name': 'tempest-server-test-1314522391', 'name': 'tap0e3bc449-87', 'instance_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:22:e2:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0e3bc449-87'}, 'message_id': '74028460-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.350434941, 'message_signature': 'db70f5f1ada4ad54c7780dc36a07dbb346fe2a70d3aa1898b9821f416ba1c83d'}]}, 'timestamp': '2026-01-22 17:11:55.594979', '_unique_id': 'cbf0a6739228414580c812f17ddf4a62'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.595 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.596 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.596 12 DEBUG ceilometer.compute.pollsters [-] 936001bf-d51b-4243-87b8-e363ef3c47a8/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.596 12 DEBUG ceilometer.compute.pollsters [-] c1a1134b-933b-41d1-ba12-adb71c18d006/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.596 12 DEBUG ceilometer.compute.pollsters [-] 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.596 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 DEBUG ceilometer.compute.pollsters [-] e69f0100-85ca-4ff8-a177-27d35d4580de/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e949817b-7d7e-4c48-9e10-c1a92a48ae18', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': 'instance-00000014-936001bf-d51b-4243-87b8-e363ef3c47a8-tap804a64f5-79', 'timestamp': '2026-01-22T17:11:55.596277', 'resource_metadata': {'display_name': 'tempest-server-test-1891253532', 'name': 'tap804a64f5-79', 'instance_id': '936001bf-d51b-4243-87b8-e363ef3c47a8', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3a:03:2f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap804a64f5-79'}, 'message_id': '7402c0d8-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.339918457, 'message_signature': '873588b57727e9a08782348de748ece29681d108665ecdc9000524f4071584ba'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 'instance-00000011-c1a1134b-933b-41d1-ba12-adb71c18d006-tap096b36b4-87', 'timestamp': '2026-01-22T17:11:55.596277', 'resource_metadata': {'display_name': 'tempest-server-test-38386895', 'name': 'tap096b36b4-87', 'instance_id': 'c1a1134b-933b-41d1-ba12-adb71c18d006', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f6:53:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap096b36b4-87'}, 'message_id': '7402cb78-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.343158932, 'message_signature': '8d7da49d6e67d80d354b95840432202002d139aed92ae06122be584efa00e41e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': 'instance-00000010-36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-tapc26b2385-71', 'timestamp': '2026-01-22T17:11:55.596277', 'resource_metadata': {'display_name': 'tempest-server-test-939954687', 'name': 'tapc26b2385-71', 'instance_id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3c:de:8a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc26b2385-71'}, 'message_id': '7402d3c0-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.345831821, 'message_signature': '5a926f3b09daedb194a71e38764f97a41e7b6926171908c97f0db2b4f3e07168'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': 'instance-0000000b-7e8d077b-66fc-42ee-ad4e-a13327ad6764-tap5644ae2a-c3', 'timestamp': '2026-01-22T17:11:55.596277', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'tap5644ae2a-c3', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:c8:e5', 'fref': 
None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5644ae2a-c3'}, 'message_id': '7402dc8a-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.348179582, 'message_signature': 'c941027edfa10a7af7a7b5ef6ff9e47daf3407cd1c22b3815f4724ded4bfc316'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': 'instance-00000013-e69f0100-85ca-4ff8-a177-27d35d4580de-tap0e3bc449-87', 'timestamp': '2026-01-22T17:11:55.596277', 'resource_metadata': {'display_name': 'tempest-server-test-1314522391', 'name': 'tap0e3bc449-87', 'instance_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:22:e2:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0e3bc449-87'}, 'message_id': '7402e52c-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.350434941, 'message_signature': '71f4bc7e778bfce08a2a8d85905156e7d69393e4900852706acf3d2275c33f1a'}]}, 'timestamp': '2026-01-22 17:11:55.597440', '_unique_id': '4c8f7ff78d3749f2a893fd18999050db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.597 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.598 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:11:55 compute-0 nova_compute[183075]: 2026-01-22 17:11:55.599 183079 INFO nova.compute.manager [-] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Took 3.30 seconds to deallocate network for instance.
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.604 12 DEBUG ceilometer.compute.pollsters [-] 936001bf-d51b-4243-87b8-e363ef3c47a8/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.610 12 DEBUG ceilometer.compute.pollsters [-] c1a1134b-933b-41d1-ba12-adb71c18d006/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.615 12 DEBUG ceilometer.compute.pollsters [-] 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.621 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.627 12 DEBUG ceilometer.compute.pollsters [-] e69f0100-85ca-4ff8-a177-27d35d4580de/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 nova_compute[183075]: 2026-01-22 17:11:55.628 183079 INFO nova.compute.manager [None req-d2a47d43-9bbe-4d1c-adfe-d64f29be01dd 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Get console output
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a147f8b7-37d6-4a62-b7c5-ac96ded53cde', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': '936001bf-d51b-4243-87b8-e363ef3c47a8-vda', 'timestamp': '2026-01-22T17:11:55.598615', 'resource_metadata': {'display_name': 'tempest-server-test-1891253532', 'name': 'instance-00000014', 'instance_id': '936001bf-d51b-4243-87b8-e363ef3c47a8', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7403f3d6-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.359053226, 'message_signature': 'dcfa8a690f5947f6908eb16855eb0a8f096f349b41e6c1bb0b672f6ea0a9c03d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 
'c1a1134b-933b-41d1-ba12-adb71c18d006-vda', 'timestamp': '2026-01-22T17:11:55.598615', 'resource_metadata': {'display_name': 'tempest-server-test-38386895', 'name': 'instance-00000011', 'instance_id': 'c1a1134b-933b-41d1-ba12-adb71c18d006', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7404e322-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.364776195, 'message_signature': 'a93ab92cd400f084977f28c9c74f7bbbdc8e9491d2a801c558f1abd0b936f5ea'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-vda', 'timestamp': '2026-01-22T17:11:55.598615', 'resource_metadata': {'display_name': 'tempest-server-test-939954687', 'name': 'instance-00000010', 'instance_id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7405a5b4-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.370887994, 'message_signature': '41cef0882484854e63c41c7444a1197094b2876f1f6c599b096f76d525ba6c9a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764-vda', 'timestamp': '2026-01-22T17:11:55.598615', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'instance-0000000b', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '740698de-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.375867664, 'message_signature': 'e350183d46fb40edb248b914ed13b22dd86071b739284992bfdd8cdc53a59e2f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 
'resource_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de-vda', 'timestamp': '2026-01-22T17:11:55.598615', 'resource_metadata': {'display_name': 'tempest-server-test-1314522391', 'name': 'instance-00000013', 'instance_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '74078532-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.382113347, 'message_signature': 'c4669ba670727a26fb991e9cbdaa57ccbc91e827e67e13a9e093d5bfeff80deb'}]}, 'timestamp': '2026-01-22 17:11:55.627795', '_unique_id': '8c5208cad8684fcc8c4af3bc0bce6b7e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.628 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.629 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.629 12 DEBUG ceilometer.compute.pollsters [-] 936001bf-d51b-4243-87b8-e363ef3c47a8/network.incoming.bytes volume: 7326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.629 12 DEBUG ceilometer.compute.pollsters [-] c1a1134b-933b-41d1-ba12-adb71c18d006/network.incoming.bytes volume: 7224 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.629 12 DEBUG ceilometer.compute.pollsters [-] 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/network.incoming.bytes volume: 7308 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/network.incoming.bytes volume: 10862 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 DEBUG ceilometer.compute.pollsters [-] e69f0100-85ca-4ff8-a177-27d35d4580de/network.incoming.bytes volume: 7310 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '652cd6d2-eb8a-475c-b722-64843994e2a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7326, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': 'instance-00000014-936001bf-d51b-4243-87b8-e363ef3c47a8-tap804a64f5-79', 'timestamp': '2026-01-22T17:11:55.629366', 'resource_metadata': {'display_name': 'tempest-server-test-1891253532', 'name': 'tap804a64f5-79', 'instance_id': '936001bf-d51b-4243-87b8-e363ef3c47a8', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3a:03:2f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap804a64f5-79'}, 'message_id': '7407cd26-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.339918457, 'message_signature': '0181c3b3ee452e498f4ab4fb5821cac55ced5bc9549561c1a9175991d66af760'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7224, 'user_id': 
'1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 'instance-00000011-c1a1134b-933b-41d1-ba12-adb71c18d006-tap096b36b4-87', 'timestamp': '2026-01-22T17:11:55.629366', 'resource_metadata': {'display_name': 'tempest-server-test-38386895', 'name': 'tap096b36b4-87', 'instance_id': 'c1a1134b-933b-41d1-ba12-adb71c18d006', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f6:53:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap096b36b4-87'}, 'message_id': '7407d6b8-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.343158932, 'message_signature': '89a45d40f68bc001ae301f99c3fe4f626df64f8d51ffd60ebe596a1c4bc3b2d2'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7308, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': 'instance-00000010-36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-tapc26b2385-71', 'timestamp': '2026-01-22T17:11:55.629366', 'resource_metadata': {'display_name': 'tempest-server-test-939954687', 'name': 'tapc26b2385-71', 'instance_id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3c:de:8a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc26b2385-71'}, 'message_id': '7407de9c-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.345831821, 'message_signature': 'f217b561eb65678ac6560b6b490d8878b35d2e8656325f414daec4f7cea9e097'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10862, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': 'instance-0000000b-7e8d077b-66fc-42ee-ad4e-a13327ad6764-tap5644ae2a-c3', 'timestamp': '2026-01-22T17:11:55.629366', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'tap5644ae2a-c3', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:c8:e5', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5644ae2a-c3'}, 'message_id': '7407e644-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.348179582, 'message_signature': 'ef82a19b4dfa82f87773bbdc713bbe1125361b4b741592144e41b02a2e0d8203'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7310, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': 'instance-00000013-e69f0100-85ca-4ff8-a177-27d35d4580de-tap0e3bc449-87', 'timestamp': '2026-01-22T17:11:55.629366', 'resource_metadata': {'display_name': 'tempest-server-test-1314522391', 'name': 'tap0e3bc449-87', 'instance_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:22:e2:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0e3bc449-87'}, 'message_id': '7407edce-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.350434941, 'message_signature': '8fe3bd15411ada9ce183c1ba82557dc3f3617f3909b11d444da528d6b9b6f793'}]}, 'timestamp': '2026-01-22 17:11:55.630429', '_unique_id': '989103de07104d13bf5509ddbe02888a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.630 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.632 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.632 12 DEBUG ceilometer.compute.pollsters [-] 936001bf-d51b-4243-87b8-e363ef3c47a8/disk.device.read.requests volume: 1132 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.632 12 DEBUG ceilometer.compute.pollsters [-] c1a1134b-933b-41d1-ba12-adb71c18d006/disk.device.read.requests volume: 1137 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.633 12 DEBUG ceilometer.compute.pollsters [-] 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/disk.device.read.requests volume: 1185 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.633 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk.device.read.requests volume: 1144 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.633 12 DEBUG ceilometer.compute.pollsters [-] e69f0100-85ca-4ff8-a177-27d35d4580de/disk.device.read.requests volume: 1145 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 nova_compute[183075]: 2026-01-22 17:11:55.633 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4306ba4e-06f6-403c-ba28-d3728a13fd57', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1132, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': '936001bf-d51b-4243-87b8-e363ef3c47a8-vda', 'timestamp': '2026-01-22T17:11:55.632703', 'resource_metadata': {'display_name': 'tempest-server-test-1891253532', 'name': 'instance-00000014', 'instance_id': '936001bf-d51b-4243-87b8-e363ef3c47a8', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '74084f6c-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.226648524, 'message_signature': '2b2676b991869656815d93d58e494a42fad3b5f691725afaf367f34b297d2dbc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1137, 'user_id': '1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 
'resource_id': 'c1a1134b-933b-41d1-ba12-adb71c18d006-vda', 'timestamp': '2026-01-22T17:11:55.632703', 'resource_metadata': {'display_name': 'tempest-server-test-38386895', 'name': 'instance-00000011', 'instance_id': 'c1a1134b-933b-41d1-ba12-adb71c18d006', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '74085728-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.249044838, 'message_signature': 'f5201988a11f71444de7635fc3588d5b767d338c26442c3e696ff726897b0ea9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1185, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-vda', 'timestamp': '2026-01-22T17:11:55.632703', 'resource_metadata': {'display_name': 'tempest-server-test-939954687', 'name': 'instance-00000010', 'instance_id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 
'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '74085e8a-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.276392581, 'message_signature': '5010d52dc9628cefc3f64f6249d716b4ae21bf333837ce1043fcdbed28fbe22e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1144, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764-vda', 'timestamp': '2026-01-22T17:11:55.632703', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'instance-0000000b', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '740865c4-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.292967533, 'message_signature': 'a090a1b62421e3c3857fb261196fc5a250c2d5707f7b66bab2b8779f93da9786'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1145, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': 
'26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de-vda', 'timestamp': '2026-01-22T17:11:55.632703', 'resource_metadata': {'display_name': 'tempest-server-test-1314522391', 'name': 'instance-00000013', 'instance_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '74086cf4-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.31127652, 'message_signature': 'aa7a3d8245c8b36743fb7eb00204afc9b127e9ce2c061edca299941524a30da9'}]}, 'timestamp': '2026-01-22 17:11:55.633698', '_unique_id': '4e9bfcf50b724b23a7872a8532a71e07'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.634 12 DEBUG ceilometer.compute.pollsters [-] 936001bf-d51b-4243-87b8-e363ef3c47a8/disk.device.allocation volume: 29892608 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.635 12 DEBUG ceilometer.compute.pollsters [-] c1a1134b-933b-41d1-ba12-adb71c18d006/disk.device.allocation volume: 30875648 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.635 12 DEBUG ceilometer.compute.pollsters [-] 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/disk.device.allocation volume: 30810112 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.635 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk.device.allocation volume: 30220288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.635 12 DEBUG ceilometer.compute.pollsters [-] e69f0100-85ca-4ff8-a177-27d35d4580de/disk.device.allocation volume: 30220288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f2f234d0-6577-4259-81c4-220892c1b23c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29892608, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': '936001bf-d51b-4243-87b8-e363ef3c47a8-vda', 'timestamp': '2026-01-22T17:11:55.634908', 'resource_metadata': {'display_name': 'tempest-server-test-1891253532', 'name': 'instance-00000014', 'instance_id': '936001bf-d51b-4243-87b8-e363ef3c47a8', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7408a520-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.359053226, 'message_signature': '9552e40d27df1df884cca709851dbaab2a8cfe104b7435a8431f6c1be63af4b8'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30875648, 'user_id': '1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 
'c1a1134b-933b-41d1-ba12-adb71c18d006-vda', 'timestamp': '2026-01-22T17:11:55.634908', 'resource_metadata': {'display_name': 'tempest-server-test-38386895', 'name': 'instance-00000011', 'instance_id': 'c1a1134b-933b-41d1-ba12-adb71c18d006', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7408acdc-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.364776195, 'message_signature': 'bda910a7b53ed65717372aaa4a0dc49d9a386eb63fcf37043f098c934fe49036'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30810112, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-vda', 'timestamp': '2026-01-22T17:11:55.634908', 'resource_metadata': {'display_name': 'tempest-server-test-939954687', 'name': 'instance-00000010', 'instance_id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7408b420-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.370887994, 'message_signature': 'd087312c4b554f04f4e1e2dd5f5d73baf7cee75c59ba4866a49077f486429a5a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30220288, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764-vda', 'timestamp': '2026-01-22T17:11:55.634908', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'instance-0000000b', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7408bb46-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.375867664, 'message_signature': '56deade06893718cc6d2f50d9a9f471b202fd0ca18f3f5f3d62302440d838f19'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30220288, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 
'resource_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de-vda', 'timestamp': '2026-01-22T17:11:55.634908', 'resource_metadata': {'display_name': 'tempest-server-test-1314522391', 'name': 'instance-00000013', 'instance_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7408c334-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.382113347, 'message_signature': 'b5263106565b7aa25f0dd97a985f371dd1d62ba544d1343093b2dce1b0174c51'}]}, 'timestamp': '2026-01-22 17:11:55.635884', '_unique_id': '3f8190413ae84facbd796691e9ac912f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.636 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.637 12 DEBUG ceilometer.compute.pollsters [-] 936001bf-d51b-4243-87b8-e363ef3c47a8/disk.device.write.latency volume: 4999230782 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.637 12 DEBUG ceilometer.compute.pollsters [-] c1a1134b-933b-41d1-ba12-adb71c18d006/disk.device.write.latency volume: 3569971490 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.637 12 DEBUG ceilometer.compute.pollsters [-] 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/disk.device.write.latency volume: 3588469124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.637 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk.device.write.latency volume: 3495665049 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.637 12 DEBUG ceilometer.compute.pollsters [-] e69f0100-85ca-4ff8-a177-27d35d4580de/disk.device.write.latency volume: 3671233723 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6087dc8-ee57-4fe1-b6c0-27a6772bc9c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4999230782, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': '936001bf-d51b-4243-87b8-e363ef3c47a8-vda', 'timestamp': '2026-01-22T17:11:55.637054', 'resource_metadata': {'display_name': 'tempest-server-test-1891253532', 'name': 'instance-00000014', 'instance_id': '936001bf-d51b-4243-87b8-e363ef3c47a8', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7408f912-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.226648524, 'message_signature': '8ec6631b3adaa0364ea44bdcbf976ceba2a33ae178e8efb6775268439af325bf'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3569971490, 'user_id': '1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 
'resource_id': 'c1a1134b-933b-41d1-ba12-adb71c18d006-vda', 'timestamp': '2026-01-22T17:11:55.637054', 'resource_metadata': {'display_name': 'tempest-server-test-38386895', 'name': 'instance-00000011', 'instance_id': 'c1a1134b-933b-41d1-ba12-adb71c18d006', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '740900d8-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.249044838, 'message_signature': 'e1a4c0a9292020a1a844c29f569d33c2eb5365e81c144135a49082ff7f809f76'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3588469124, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-vda', 'timestamp': '2026-01-22T17:11:55.637054', 'resource_metadata': {'display_name': 'tempest-server-test-939954687', 'name': 'instance-00000010', 'instance_id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 
'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '74090a56-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.276392581, 'message_signature': '5ff4201f2d7275cfb62fc2d859165ff3662485ce5f2ce33b6b5da73a0f72ae93'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3495665049, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764-vda', 'timestamp': '2026-01-22T17:11:55.637054', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'instance-0000000b', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '740912c6-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.292967533, 'message_signature': 'e159380b900fa5213a207edd5d38522724995b295f450e632ea8578ccdd41d2d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3671233723, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': 
'26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de-vda', 'timestamp': '2026-01-22T17:11:55.637054', 'resource_metadata': {'display_name': 'tempest-server-test-1314522391', 'name': 'instance-00000013', 'instance_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '74091b5e-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.31127652, 'message_signature': '2b0da6f74a59244aa0bf45963d013004ca8a69f675bab4aa837b82c646b76d81'}]}, 'timestamp': '2026-01-22 17:11:55.638176', '_unique_id': 'aa945aabb38d472b827ffaede0468012'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.638 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.639 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.639 12 DEBUG ceilometer.compute.pollsters [-] 936001bf-d51b-4243-87b8-e363ef3c47a8/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.639 12 DEBUG ceilometer.compute.pollsters [-] c1a1134b-933b-41d1-ba12-adb71c18d006/network.outgoing.bytes.delta volume: 11596 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.639 12 DEBUG ceilometer.compute.pollsters [-] 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/network.outgoing.bytes.delta volume: 13804 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 DEBUG ceilometer.compute.pollsters [-] e69f0100-85ca-4ff8-a177-27d35d4580de/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7aa5535f-975f-471b-b002-f2ab2a069abc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': 'instance-00000014-936001bf-d51b-4243-87b8-e363ef3c47a8-tap804a64f5-79', 'timestamp': '2026-01-22T17:11:55.639413', 'resource_metadata': {'display_name': 'tempest-server-test-1891253532', 'name': 'tap804a64f5-79', 'instance_id': '936001bf-d51b-4243-87b8-e363ef3c47a8', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3a:03:2f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap804a64f5-79'}, 'message_id': '74095556-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.339918457, 'message_signature': '10b5ebe57356eeb2f3897e0319b148ba9110be77c900cbcabf3bac81b09eeece'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 11596, 'user_id': 
'1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 'instance-00000011-c1a1134b-933b-41d1-ba12-adb71c18d006-tap096b36b4-87', 'timestamp': '2026-01-22T17:11:55.639413', 'resource_metadata': {'display_name': 'tempest-server-test-38386895', 'name': 'tap096b36b4-87', 'instance_id': 'c1a1134b-933b-41d1-ba12-adb71c18d006', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f6:53:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap096b36b4-87'}, 'message_id': '74095e3e-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.343158932, 'message_signature': '8350b8654995fc5a332c0979e747eec39bc88dbc5368cb121803c5f09b48c64c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': 'instance-00000010-36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-tapc26b2385-71', 'timestamp': '2026-01-22T17:11:55.639413', 'resource_metadata': {'display_name': 'tempest-server-test-939954687', 'name': 'tapc26b2385-71', 'instance_id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3c:de:8a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc26b2385-71'}, 'message_id': '74096618-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.345831821, 'message_signature': '4b6ee6d2651d0cb951e31c4f5b651f1e070f71ae7c5ed52af680738f307cd263'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 13804, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': 'instance-0000000b-7e8d077b-66fc-42ee-ad4e-a13327ad6764-tap5644ae2a-c3', 'timestamp': '2026-01-22T17:11:55.639413', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'tap5644ae2a-c3', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:c8:e5', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5644ae2a-c3'}, 'message_id': '74096dca-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.348179582, 'message_signature': 'af600d3fa8a6a3d77b3ec644c62c5867b9457c576e5c91032caec150901ead61'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': 'instance-00000013-e69f0100-85ca-4ff8-a177-27d35d4580de-tap0e3bc449-87', 'timestamp': '2026-01-22T17:11:55.639413', 'resource_metadata': {'display_name': 'tempest-server-test-1314522391', 'name': 'tap0e3bc449-87', 'instance_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:22:e2:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0e3bc449-87'}, 'message_id': '74097612-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.350434941, 'message_signature': '2af5fea5da226d08362c0e12a49457a6bcb742e91badaadd9d0e10053fe74d9f'}]}, 'timestamp': '2026-01-22 17:11:55.640469', '_unique_id': '01eb6287bd904f41837cc5fdbb671f0d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.640 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.641 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.641 12 DEBUG ceilometer.compute.pollsters [-] 936001bf-d51b-4243-87b8-e363ef3c47a8/disk.device.read.latency volume: 194682394 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.641 12 DEBUG ceilometer.compute.pollsters [-] c1a1134b-933b-41d1-ba12-adb71c18d006/disk.device.read.latency volume: 272887253 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.642 12 DEBUG ceilometer.compute.pollsters [-] 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/disk.device.read.latency volume: 185920405 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.642 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk.device.read.latency volume: 224239445 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 nova_compute[183075]: 2026-01-22 17:11:55.642 183079 DEBUG oslo_concurrency.lockutils [None req-909f71d0-b3e4-4fad-a6f7-5b5fa8a9f4a6 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.642 12 DEBUG ceilometer.compute.pollsters [-] e69f0100-85ca-4ff8-a177-27d35d4580de/disk.device.read.latency volume: 205856885 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 nova_compute[183075]: 2026-01-22 17:11:55.642 183079 DEBUG oslo_concurrency.lockutils [None req-909f71d0-b3e4-4fad-a6f7-5b5fa8a9f4a6 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6067b9cf-a212-4864-8b8f-a55e181df052', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 194682394, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': '936001bf-d51b-4243-87b8-e363ef3c47a8-vda', 'timestamp': '2026-01-22T17:11:55.641652', 'resource_metadata': {'display_name': 'tempest-server-test-1891253532', 'name': 'instance-00000014', 'instance_id': '936001bf-d51b-4243-87b8-e363ef3c47a8', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7409acb8-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.226648524, 'message_signature': 'a23801a0fa5c0421793e00d16c266985bd1cecd3caf77d2f739b127626a2f20d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 272887253, 'user_id': '1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 
'resource_id': 'c1a1134b-933b-41d1-ba12-adb71c18d006-vda', 'timestamp': '2026-01-22T17:11:55.641652', 'resource_metadata': {'display_name': 'tempest-server-test-38386895', 'name': 'instance-00000011', 'instance_id': 'c1a1134b-933b-41d1-ba12-adb71c18d006', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7409b438-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.249044838, 'message_signature': '8a97e8bdf6c55d87406f8c191632b0024e5be437fcc78668c957389d3d7d5e97'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 185920405, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-vda', 'timestamp': '2026-01-22T17:11:55.641652', 'resource_metadata': {'display_name': 'tempest-server-test-939954687', 'name': 'instance-00000010', 'instance_id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 
'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7409bb7c-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.276392581, 'message_signature': '15d687e513cb4784f51a5b6d515d1bdd5fe25b92f3f1c6db570e0906830025d6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 224239445, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764-vda', 'timestamp': '2026-01-22T17:11:55.641652', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'instance-0000000b', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7409c298-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.292967533, 'message_signature': '384c1f37614e3bc3d6ef3fa271e90c06c34ce3cd24361a01267871f6c3395c8d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 205856885, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': 
'26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de-vda', 'timestamp': '2026-01-22T17:11:55.641652', 'resource_metadata': {'display_name': 'tempest-server-test-1314522391', 'name': 'instance-00000013', 'instance_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7409c9aa-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.31127652, 'message_signature': '726d59ad8ee459ace9f014b46299b76fdc6eee15290fff1915045c2657c665a6'}]}, 'timestamp': '2026-01-22 17:11:55.642646', '_unique_id': '12008cfd7f6e4c7fa89f2262f6f46bac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.643 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.644 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.644 12 DEBUG ceilometer.compute.pollsters [-] 936001bf-d51b-4243-87b8-e363ef3c47a8/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.644 12 DEBUG ceilometer.compute.pollsters [-] c1a1134b-933b-41d1-ba12-adb71c18d006/network.incoming.bytes.delta volume: 7134 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.644 12 DEBUG ceilometer.compute.pollsters [-] 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.645 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/network.incoming.bytes.delta volume: 10772 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.645 12 DEBUG ceilometer.compute.pollsters [-] e69f0100-85ca-4ff8-a177-27d35d4580de/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad2df902-b728-4460-b340-3c56297e56a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': 'instance-00000014-936001bf-d51b-4243-87b8-e363ef3c47a8-tap804a64f5-79', 'timestamp': '2026-01-22T17:11:55.644367', 'resource_metadata': {'display_name': 'tempest-server-test-1891253532', 'name': 'tap804a64f5-79', 'instance_id': '936001bf-d51b-4243-87b8-e363ef3c47a8', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3a:03:2f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap804a64f5-79'}, 'message_id': '740a16bc-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.339918457, 'message_signature': 'fce7bfa0474be7c0633bc7da2d52cea360b4e2621a4a60b8c36a690576bd9030'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 7134, 'user_id': 
'1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 'instance-00000011-c1a1134b-933b-41d1-ba12-adb71c18d006-tap096b36b4-87', 'timestamp': '2026-01-22T17:11:55.644367', 'resource_metadata': {'display_name': 'tempest-server-test-38386895', 'name': 'tap096b36b4-87', 'instance_id': 'c1a1134b-933b-41d1-ba12-adb71c18d006', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f6:53:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap096b36b4-87'}, 'message_id': '740a204e-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.343158932, 'message_signature': '88da9539ac2ae21bf095842a27963907319295acd4a7a6c135a24de1e8c28375'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': 'instance-00000010-36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-tapc26b2385-71', 'timestamp': '2026-01-22T17:11:55.644367', 'resource_metadata': {'display_name': 'tempest-server-test-939954687', 'name': 'tapc26b2385-71', 'instance_id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3c:de:8a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc26b2385-71'}, 'message_id': '740a28c8-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.345831821, 'message_signature': 'e63b4fb62fb88d380c93b51fab7aa63898719721c93efcdd3fe63a722c7a3d08'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 10772, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': 'instance-0000000b-7e8d077b-66fc-42ee-ad4e-a13327ad6764-tap5644ae2a-c3', 'timestamp': '2026-01-22T17:11:55.644367', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'tap5644ae2a-c3', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:c8:e5', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5644ae2a-c3'}, 'message_id': '740a3750-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.348179582, 'message_signature': '07ede7d0959a8b2f90194d7824d9687a91f84e915f0d092e7ca2d598742b269d'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': 'instance-00000013-e69f0100-85ca-4ff8-a177-27d35d4580de-tap0e3bc449-87', 'timestamp': '2026-01-22T17:11:55.644367', 'resource_metadata': {'display_name': 'tempest-server-test-1314522391', 'name': 'tap0e3bc449-87', 'instance_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:22:e2:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0e3bc449-87'}, 'message_id': '740a3ec6-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.350434941, 'message_signature': '2b2b3e8365f2edc61334bedb079ec23c1e9b33d443279db771a39980827a7e3e'}]}, 'timestamp': '2026-01-22 17:11:55.645608', '_unique_id': 'ab925b2b1d904b8d9f58c45f147e6e97'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.646 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.647 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.647 12 DEBUG ceilometer.compute.pollsters [-] 936001bf-d51b-4243-87b8-e363ef3c47a8/disk.device.usage volume: 29818880 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.647 12 DEBUG ceilometer.compute.pollsters [-] c1a1134b-933b-41d1-ba12-adb71c18d006/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.647 12 DEBUG ceilometer.compute.pollsters [-] 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.647 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.647 12 DEBUG ceilometer.compute.pollsters [-] e69f0100-85ca-4ff8-a177-27d35d4580de/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '17f17b85-b498-4a90-bfee-42b2b5b9520a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29818880, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': '936001bf-d51b-4243-87b8-e363ef3c47a8-vda', 'timestamp': '2026-01-22T17:11:55.647084', 'resource_metadata': {'display_name': 'tempest-server-test-1891253532', 'name': 'instance-00000014', 'instance_id': '936001bf-d51b-4243-87b8-e363ef3c47a8', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '740a80de-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.359053226, 'message_signature': '3bd9063f0df3574b511ea2019115b9c6f3bc5d6f2fb53999e4ab788c479abeef'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 
'c1a1134b-933b-41d1-ba12-adb71c18d006-vda', 'timestamp': '2026-01-22T17:11:55.647084', 'resource_metadata': {'display_name': 'tempest-server-test-38386895', 'name': 'instance-00000011', 'instance_id': 'c1a1134b-933b-41d1-ba12-adb71c18d006', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '740a885e-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.364776195, 'message_signature': '2bd4827991e8cebdf2d1d98621a53a291f63dd12fad94d983306c320b4ccf925'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-vda', 'timestamp': '2026-01-22T17:11:55.647084', 'resource_metadata': {'display_name': 'tempest-server-test-939954687', 'name': 'instance-00000010', 'instance_id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '740a91e6-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.370887994, 'message_signature': '82d910d5cb41279bce6d3f26dbbdb4bfb29db52004cda0b550f1c1ca7d817d2f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764-vda', 'timestamp': '2026-01-22T17:11:55.647084', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'instance-0000000b', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '740a99ac-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.375867664, 'message_signature': '282397660a7764be0ba770812a8cb550b12e794efe16a4cc97965b7ae87bd1d5'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': 
'e69f0100-85ca-4ff8-a177-27d35d4580de-vda', 'timestamp': '2026-01-22T17:11:55.647084', 'resource_metadata': {'display_name': 'tempest-server-test-1314522391', 'name': 'instance-00000013', 'instance_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '740aa276-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.382113347, 'message_signature': '4abdceba593a650734458c204db4dd62e6c0cb5b3bf922f74661e1668b696440'}]}, 'timestamp': '2026-01-22 17:11:55.648154', '_unique_id': 'b5d36180e6a1480dbbbb333d59199eb7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.648 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.649 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.649 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.649 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-1891253532>, <NovaLikeServer: tempest-server-test-38386895>, <NovaLikeServer: tempest-server-test-939954687>, <NovaLikeServer: tempest-server-test-1314522391>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1891253532>, <NovaLikeServer: tempest-server-test-38386895>, <NovaLikeServer: tempest-server-test-939954687>, <NovaLikeServer: tempest-server-test-1314522391>]
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.649 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.649 12 DEBUG ceilometer.compute.pollsters [-] 936001bf-d51b-4243-87b8-e363ef3c47a8/disk.device.write.bytes volume: 72871936 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.649 12 DEBUG ceilometer.compute.pollsters [-] c1a1134b-933b-41d1-ba12-adb71c18d006/disk.device.write.bytes volume: 72986624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.650 12 DEBUG ceilometer.compute.pollsters [-] 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/disk.device.write.bytes volume: 73129984 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.650 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk.device.write.bytes volume: 73175040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.650 12 DEBUG ceilometer.compute.pollsters [-] e69f0100-85ca-4ff8-a177-27d35d4580de/disk.device.write.bytes volume: 73011200 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '05aed0d9-b3a3-47e7-ba60-fd8208c3a802', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72871936, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': '936001bf-d51b-4243-87b8-e363ef3c47a8-vda', 'timestamp': '2026-01-22T17:11:55.649733', 'resource_metadata': {'display_name': 'tempest-server-test-1891253532', 'name': 'instance-00000014', 'instance_id': '936001bf-d51b-4243-87b8-e363ef3c47a8', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '740ae844-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.226648524, 'message_signature': 'f43f9a17c2c4b377a46e0c5197453a576c96505380de0ebe0b60483f7463ddff'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72986624, 'user_id': '1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 
'c1a1134b-933b-41d1-ba12-adb71c18d006-vda', 'timestamp': '2026-01-22T17:11:55.649733', 'resource_metadata': {'display_name': 'tempest-server-test-38386895', 'name': 'instance-00000011', 'instance_id': 'c1a1134b-933b-41d1-ba12-adb71c18d006', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '740af14a-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.249044838, 'message_signature': 'b5c78e2552d655f8db6dc957cfd2a78902b18025a0a75d3a423cc2c66d64d504'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73129984, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-vda', 'timestamp': '2026-01-22T17:11:55.649733', 'resource_metadata': {'display_name': 'tempest-server-test-939954687', 'name': 'instance-00000010', 'instance_id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '740af8a2-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.276392581, 'message_signature': '5a4ea1169ed665582b39c383935b47cc3d326aa695cf7c571672324ef0c548ae'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73175040, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764-vda', 'timestamp': '2026-01-22T17:11:55.649733', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'instance-0000000b', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '740affd2-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.292967533, 'message_signature': 'e37af83e603de23c36e8d45dab9796333d762c075ede252b6119757abc5ed421'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73011200, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': 
None, 'resource_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de-vda', 'timestamp': '2026-01-22T17:11:55.649733', 'resource_metadata': {'display_name': 'tempest-server-test-1314522391', 'name': 'instance-00000013', 'instance_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '740b0860-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.31127652, 'message_signature': '46e48c790edc4024dbfa357ceccfd84509d14abdaa531347033fb6f0db21ec73'}]}, 'timestamp': '2026-01-22 17:11:55.650766', '_unique_id': '328d4d3715424a29915e6555b34b6780'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.651 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.652 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.664 12 DEBUG ceilometer.compute.pollsters [-] 936001bf-d51b-4243-87b8-e363ef3c47a8/cpu volume: 11330000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.681 12 DEBUG ceilometer.compute.pollsters [-] c1a1134b-933b-41d1-ba12-adb71c18d006/cpu volume: 12340000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.699 12 DEBUG ceilometer.compute.pollsters [-] 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/cpu volume: 11010000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.714 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/cpu volume: 11760000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.735 12 DEBUG ceilometer.compute.pollsters [-] e69f0100-85ca-4ff8-a177-27d35d4580de/cpu volume: 12060000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5e9222c-20b6-4905-8c6b-09759f57446b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11330000000, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': '936001bf-d51b-4243-87b8-e363ef3c47a8', 'timestamp': '2026-01-22T17:11:55.652130', 'resource_metadata': {'display_name': 'tempest-server-test-1891253532', 'name': 'instance-00000014', 'instance_id': '936001bf-d51b-4243-87b8-e363ef3c47a8', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '740d2adc-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.424849571, 'message_signature': '97a462fc420d13fb6bd6cb0fca50ec58b5ead906119e2261e8bb0552d54046b1'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12340000000, 'user_id': '1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 'c1a1134b-933b-41d1-ba12-adb71c18d006', 
'timestamp': '2026-01-22T17:11:55.652130', 'resource_metadata': {'display_name': 'tempest-server-test-38386895', 'name': 'instance-00000011', 'instance_id': 'c1a1134b-933b-41d1-ba12-adb71c18d006', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '740fb70c-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.441475095, 'message_signature': 'dfb30a0e6da2dbe8aa665be87638a5a646ced57efddd65d12128a5f346b55e55'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11010000000, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc', 'timestamp': '2026-01-22T17:11:55.652130', 'resource_metadata': {'display_name': 'tempest-server-test-939954687', 'name': 'instance-00000010', 'instance_id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '741291e8-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.460243154, 'message_signature': 'c07e5e8f37f80394e0780e01ebfad32c124b0fb75037850e21783aa53a4ef8c3'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11760000000, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'timestamp': '2026-01-22T17:11:55.652130', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'instance-0000000b', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '7414dba6-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.475087651, 'message_signature': 'bcb8c107923c700330f7ef6965e9154f38ce16a9bc119df9d461360fb5caa1e2'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12060000000, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de', 'timestamp': '2026-01-22T17:11:55.652130', 
'resource_metadata': {'display_name': 'tempest-server-test-1314522391', 'name': 'instance-00000013', 'instance_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '741805a6-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.495850542, 'message_signature': 'b64e0adb7eeca1774596f805e56f959ec0d91e5780782d8672ff9de5050e757e'}]}, 'timestamp': '2026-01-22 17:11:55.736007', '_unique_id': '09fde8422d574104a59da4fbe61a9d54'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.737 12 DEBUG ceilometer.compute.pollsters [-] 936001bf-d51b-4243-87b8-e363ef3c47a8/memory.usage volume: 42.640625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.738 12 DEBUG ceilometer.compute.pollsters [-] c1a1134b-933b-41d1-ba12-adb71c18d006/memory.usage volume: 43.41015625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.738 12 DEBUG ceilometer.compute.pollsters [-] 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/memory.usage volume: 42.484375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.738 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/memory.usage volume: 42.171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.738 12 DEBUG ceilometer.compute.pollsters [-] e69f0100-85ca-4ff8-a177-27d35d4580de/memory.usage volume: 42.76953125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0764f2c3-bcde-4e2c-9261-4b796dea2db4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.640625, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': '936001bf-d51b-4243-87b8-e363ef3c47a8', 'timestamp': '2026-01-22T17:11:55.737766', 'resource_metadata': {'display_name': 'tempest-server-test-1891253532', 'name': 'instance-00000014', 'instance_id': '936001bf-d51b-4243-87b8-e363ef3c47a8', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '74185876-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.424849571, 'message_signature': '5429a33c35e1a6c9277c625b81bfee8427b6632b832e7165752d6cf5ac22039c'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.41015625, 'user_id': '1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 'c1a1134b-933b-41d1-ba12-adb71c18d006', 'timestamp': 
'2026-01-22T17:11:55.737766', 'resource_metadata': {'display_name': 'tempest-server-test-38386895', 'name': 'instance-00000011', 'instance_id': 'c1a1134b-933b-41d1-ba12-adb71c18d006', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '741860e6-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.441475095, 'message_signature': 'f35750224919e6c91a616af009ea05cf74d653ae306471a60e0210a891d540d0'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.484375, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc', 'timestamp': '2026-01-22T17:11:55.737766', 'resource_metadata': {'display_name': 'tempest-server-test-939954687', 'name': 'instance-00000010', 'instance_id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 
'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '74186866-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.460243154, 'message_signature': 'befc9354f85e90398bcb9b27f6df91d07411536953b76e879206a56bf2c927c6'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.171875, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'timestamp': '2026-01-22T17:11:55.737766', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'instance-0000000b', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '74186fdc-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.475087651, 'message_signature': '812ed764efade5cd936913ba3d352f7b138427f3bf584c93ee6309e68625d911'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.76953125, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de', 'timestamp': '2026-01-22T17:11:55.737766', 'resource_metadata': {'display_name': 
'tempest-server-test-1314522391', 'name': 'instance-00000013', 'instance_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '741878ba-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.495850542, 'message_signature': '9509f0ac46999ebafc0329763d416b43abe7222fc741d306010e839c59c54580'}]}, 'timestamp': '2026-01-22 17:11:55.738850', '_unique_id': '93c11cbd44d44de489f1d27b2b67ea62'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.739 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.740 12 DEBUG ceilometer.compute.pollsters [-] 936001bf-d51b-4243-87b8-e363ef3c47a8/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.740 12 DEBUG ceilometer.compute.pollsters [-] c1a1134b-933b-41d1-ba12-adb71c18d006/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.740 12 DEBUG ceilometer.compute.pollsters [-] 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.740 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.740 12 DEBUG ceilometer.compute.pollsters [-] e69f0100-85ca-4ff8-a177-27d35d4580de/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b49ae909-3f08-4288-955f-54a4e36f41d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': 'instance-00000014-936001bf-d51b-4243-87b8-e363ef3c47a8-tap804a64f5-79', 'timestamp': '2026-01-22T17:11:55.740029', 'resource_metadata': {'display_name': 'tempest-server-test-1891253532', 'name': 'tap804a64f5-79', 'instance_id': '936001bf-d51b-4243-87b8-e363ef3c47a8', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3a:03:2f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap804a64f5-79'}, 'message_id': '7418afce-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.339918457, 'message_signature': '3a8e49f5050a3acab3b8fc4f7d516e614e0968b7811713734ba8ed371438d493'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 'instance-00000011-c1a1134b-933b-41d1-ba12-adb71c18d006-tap096b36b4-87', 'timestamp': '2026-01-22T17:11:55.740029', 'resource_metadata': {'display_name': 'tempest-server-test-38386895', 'name': 'tap096b36b4-87', 'instance_id': 'c1a1134b-933b-41d1-ba12-adb71c18d006', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f6:53:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap096b36b4-87'}, 'message_id': '7418b7f8-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.343158932, 'message_signature': '614204f73bb65e5a0b1d7c0d26ab8b44ba7cbf56b5429e2f237016d4357f4de4'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': 'instance-00000010-36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-tapc26b2385-71', 'timestamp': '2026-01-22T17:11:55.740029', 'resource_metadata': {'display_name': 'tempest-server-test-939954687', 'name': 'tapc26b2385-71', 'instance_id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3c:de:8a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc26b2385-71'}, 'message_id': '7418c1bc-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.345831821, 'message_signature': 'bd998cbee7ca1b701bd2ea16cd9d7fa1e6dec0e9f908b481efb5cd63c8e19e36'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': 'instance-0000000b-7e8d077b-66fc-42ee-ad4e-a13327ad6764-tap5644ae2a-c3', 'timestamp': '2026-01-22T17:11:55.740029', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'tap5644ae2a-c3', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:c8:e5', 'fref': 
None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5644ae2a-c3'}, 'message_id': '7418cb08-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.348179582, 'message_signature': '6f6c7330e2dae50b1fdf56353e623eba7bc5d7b55a23ee4b0840b8b75587c728'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': 'instance-00000013-e69f0100-85ca-4ff8-a177-27d35d4580de-tap0e3bc449-87', 'timestamp': '2026-01-22T17:11:55.740029', 'resource_metadata': {'display_name': 'tempest-server-test-1314522391', 'name': 'tap0e3bc449-87', 'instance_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:22:e2:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0e3bc449-87'}, 'message_id': '7418d4ae-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.350434941, 'message_signature': '05e2ee3b4f66d21bfb094bc33e33b6bf8d7f29978cf87a48c9980e9b481dbb6b'}]}, 'timestamp': '2026-01-22 17:11:55.741198', '_unique_id': 'b58c8e0290ae42878c0f53523798beb3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.741 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.742 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.742 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.742 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1891253532>, <NovaLikeServer: tempest-server-test-38386895>, <NovaLikeServer: tempest-server-test-939954687>, <NovaLikeServer: tempest-server-test-1314522391>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1891253532>, <NovaLikeServer: tempest-server-test-38386895>, <NovaLikeServer: tempest-server-test-939954687>, <NovaLikeServer: tempest-server-test-1314522391>]
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.742 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.742 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.742 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1891253532>, <NovaLikeServer: tempest-server-test-38386895>, <NovaLikeServer: tempest-server-test-939954687>, <NovaLikeServer: tempest-server-test-1314522391>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1891253532>, <NovaLikeServer: tempest-server-test-38386895>, <NovaLikeServer: tempest-server-test-939954687>, <NovaLikeServer: tempest-server-test-1314522391>]
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.742 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.743 12 DEBUG ceilometer.compute.pollsters [-] 936001bf-d51b-4243-87b8-e363ef3c47a8/network.incoming.packets volume: 61 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.743 12 DEBUG ceilometer.compute.pollsters [-] c1a1134b-933b-41d1-ba12-adb71c18d006/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.743 12 DEBUG ceilometer.compute.pollsters [-] 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/network.incoming.packets volume: 61 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.743 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/network.incoming.packets volume: 86 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.743 12 DEBUG ceilometer.compute.pollsters [-] e69f0100-85ca-4ff8-a177-27d35d4580de/network.incoming.packets volume: 61 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd0ef8f4f-d928-4776-a36e-f1d55fec7376', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 61, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': 'instance-00000014-936001bf-d51b-4243-87b8-e363ef3c47a8-tap804a64f5-79', 'timestamp': '2026-01-22T17:11:55.743033', 'resource_metadata': {'display_name': 'tempest-server-test-1891253532', 'name': 'tap804a64f5-79', 'instance_id': '936001bf-d51b-4243-87b8-e363ef3c47a8', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3a:03:2f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap804a64f5-79'}, 'message_id': '7419251c-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.339918457, 'message_signature': 'e3446c3a130e9bfea37905070aadf7cb4aa4d7f6e6371308e7692ffbe61695c8'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 
'1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 'instance-00000011-c1a1134b-933b-41d1-ba12-adb71c18d006-tap096b36b4-87', 'timestamp': '2026-01-22T17:11:55.743033', 'resource_metadata': {'display_name': 'tempest-server-test-38386895', 'name': 'tap096b36b4-87', 'instance_id': 'c1a1134b-933b-41d1-ba12-adb71c18d006', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f6:53:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap096b36b4-87'}, 'message_id': '74192d96-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.343158932, 'message_signature': '10ae08b4635244dc923c3121f447dcdfba0af67de1c293abd2d69293b75bb8b8'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 61, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': 'instance-00000010-36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-tapc26b2385-71', 'timestamp': '2026-01-22T17:11:55.743033', 'resource_metadata': {'display_name': 'tempest-server-test-939954687', 'name': 'tapc26b2385-71', 'instance_id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3c:de:8a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc26b2385-71'}, 'message_id': '741935c0-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.345831821, 'message_signature': '15001af3da6119ba4e775eaf75b97c563b37e4d6914b5f6e9ce4ed355b9ae77e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 86, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': 'instance-0000000b-7e8d077b-66fc-42ee-ad4e-a13327ad6764-tap5644ae2a-c3', 'timestamp': '2026-01-22T17:11:55.743033', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'tap5644ae2a-c3', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:c8:e5', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5644ae2a-c3'}, 'message_id': '74193e8a-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.348179582, 'message_signature': 'e22846486ef55b38097c07a8b80020879d6a5f455862e2d03606501190bbac74'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 61, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': 'instance-00000013-e69f0100-85ca-4ff8-a177-27d35d4580de-tap0e3bc449-87', 'timestamp': '2026-01-22T17:11:55.743033', 'resource_metadata': {'display_name': 'tempest-server-test-1314522391', 'name': 'tap0e3bc449-87', 'instance_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:22:e2:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0e3bc449-87'}, 'message_id': '74194682-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.350434941, 'message_signature': '92a43330c41dfcf1180bf54a84c78a209d37ee9f55cf739de6464b35a23b4c37'}]}, 'timestamp': '2026-01-22 17:11:55.744123', '_unique_id': 'ef6bb0bb4ed4444288271a02d4cf776a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.744 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.745 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.745 12 DEBUG ceilometer.compute.pollsters [-] 936001bf-d51b-4243-87b8-e363ef3c47a8/disk.device.read.bytes volume: 30427648 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.745 12 DEBUG ceilometer.compute.pollsters [-] c1a1134b-933b-41d1-ba12-adb71c18d006/disk.device.read.bytes volume: 30824960 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.746 12 DEBUG ceilometer.compute.pollsters [-] 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/disk.device.read.bytes volume: 31955456 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.746 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/disk.device.read.bytes volume: 30894592 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.746 12 DEBUG ceilometer.compute.pollsters [-] e69f0100-85ca-4ff8-a177-27d35d4580de/disk.device.read.bytes volume: 31115776 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9ad6ba5e-86bb-4e37-b93b-a8a417f803cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30427648, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': '936001bf-d51b-4243-87b8-e363ef3c47a8-vda', 'timestamp': '2026-01-22T17:11:55.745524', 'resource_metadata': {'display_name': 'tempest-server-test-1891253532', 'name': 'instance-00000014', 'instance_id': '936001bf-d51b-4243-87b8-e363ef3c47a8', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '74198840-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.226648524, 'message_signature': '6ac35af4358776f633a67134ef637f1b63f1c1c0dd6bf2e9dc16740845e7062a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30824960, 'user_id': '1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 
'c1a1134b-933b-41d1-ba12-adb71c18d006-vda', 'timestamp': '2026-01-22T17:11:55.745524', 'resource_metadata': {'display_name': 'tempest-server-test-38386895', 'name': 'instance-00000011', 'instance_id': 'c1a1134b-933b-41d1-ba12-adb71c18d006', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '74199272-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.249044838, 'message_signature': 'ead570fcffc8d416988be3519abb01ade721ccfcd6c0ef5f266f9a6062d240f7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31955456, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-vda', 'timestamp': '2026-01-22T17:11:55.745524', 'resource_metadata': {'display_name': 'tempest-server-test-939954687', 'name': 'instance-00000010', 'instance_id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '74199ce0-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.276392581, 'message_signature': 'e4da851f2c13e771a44df78e026913a8bc2f7c82795530276941c8d117691331'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30894592, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764-vda', 'timestamp': '2026-01-22T17:11:55.745524', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'instance-0000000b', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7419a622-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.292967533, 'message_signature': '76ff858157b5159af21ff58e2ec74cb6c1cecd19870463b6e62dda8ec7931321'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31115776, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': 
None, 'resource_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de-vda', 'timestamp': '2026-01-22T17:11:55.745524', 'resource_metadata': {'display_name': 'tempest-server-test-1314522391', 'name': 'instance-00000013', 'instance_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7419afd2-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.31127652, 'message_signature': '64a755912ec2fb2dec9c4386107d74ac4797eddde298bbe3c3bb8dc7bd9e83ee'}]}, 'timestamp': '2026-01-22 17:11:55.746818', '_unique_id': 'ce0bf429df9245b6ab1364f08bb341b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.747 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.748 12 DEBUG ceilometer.compute.pollsters [-] 936001bf-d51b-4243-87b8-e363ef3c47a8/network.outgoing.packets volume: 115 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.748 12 DEBUG ceilometer.compute.pollsters [-] c1a1134b-933b-41d1-ba12-adb71c18d006/network.outgoing.packets volume: 131 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.748 12 DEBUG ceilometer.compute.pollsters [-] 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/network.outgoing.packets volume: 121 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.748 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/network.outgoing.packets volume: 142 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.748 12 DEBUG ceilometer.compute.pollsters [-] e69f0100-85ca-4ff8-a177-27d35d4580de/network.outgoing.packets volume: 117 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad3a1243-6ff7-4b5a-9415-e29c34ef27f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 115, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': 'instance-00000014-936001bf-d51b-4243-87b8-e363ef3c47a8-tap804a64f5-79', 'timestamp': '2026-01-22T17:11:55.747992', 'resource_metadata': {'display_name': 'tempest-server-test-1891253532', 'name': 'tap804a64f5-79', 'instance_id': '936001bf-d51b-4243-87b8-e363ef3c47a8', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3a:03:2f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap804a64f5-79'}, 'message_id': '7419e6a0-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.339918457, 'message_signature': '84e56d42493f45bff728a839062396ebfdc724f79176314d57f56f06555985c3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 131, 'user_id': 
'1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 'instance-00000011-c1a1134b-933b-41d1-ba12-adb71c18d006-tap096b36b4-87', 'timestamp': '2026-01-22T17:11:55.747992', 'resource_metadata': {'display_name': 'tempest-server-test-38386895', 'name': 'tap096b36b4-87', 'instance_id': 'c1a1134b-933b-41d1-ba12-adb71c18d006', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f6:53:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap096b36b4-87'}, 'message_id': '7419eea2-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.343158932, 'message_signature': 'b609652b75310fb9037706b6ddfaf30612e666779534d75691c6592130599427'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 121, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': 'instance-00000010-36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-tapc26b2385-71', 'timestamp': '2026-01-22T17:11:55.747992', 'resource_metadata': {'display_name': 'tempest-server-test-939954687', 'name': 'tapc26b2385-71', 'instance_id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3c:de:8a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc26b2385-71'}, 'message_id': '7419f6b8-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.345831821, 'message_signature': '23badf0ff6c53cdf9f5d5d80456b14c9050a520e716860067e97c0b59a8aa55b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 142, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': 'instance-0000000b-7e8d077b-66fc-42ee-ad4e-a13327ad6764-tap5644ae2a-c3', 'timestamp': '2026-01-22T17:11:55.747992', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'tap5644ae2a-c3', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:c8:e5', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5644ae2a-c3'}, 'message_id': '7419ff82-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.348179582, 'message_signature': 'ac90e3563ff821ab2d0d21d56837c0db096f7b2f49681362a790a01db1fba7e2'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 117, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': 'instance-00000013-e69f0100-85ca-4ff8-a177-27d35d4580de-tap0e3bc449-87', 'timestamp': '2026-01-22T17:11:55.747992', 'resource_metadata': {'display_name': 'tempest-server-test-1314522391', 'name': 'tap0e3bc449-87', 'instance_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:22:e2:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0e3bc449-87'}, 'message_id': '741a075c-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.350434941, 'message_signature': '7b894b7404fbb5b696f56782c12ab238537d3e23abfa821ec78ecca7d3534fa5'}]}, 'timestamp': '2026-01-22 17:11:55.749047', '_unique_id': '1158b655c76446f492d667db149b1bf0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.749 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.750 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.750 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.750 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-1891253532>, <NovaLikeServer: tempest-server-test-38386895>, <NovaLikeServer: tempest-server-test-939954687>, <NovaLikeServer: tempest-server-test-1314522391>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1891253532>, <NovaLikeServer: tempest-server-test-38386895>, <NovaLikeServer: tempest-server-test-939954687>, <NovaLikeServer: tempest-server-test-1314522391>]
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.750 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.750 12 DEBUG ceilometer.compute.pollsters [-] 936001bf-d51b-4243-87b8-e363ef3c47a8/network.outgoing.bytes volume: 10121 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.750 12 DEBUG ceilometer.compute.pollsters [-] c1a1134b-933b-41d1-ba12-adb71c18d006/network.outgoing.bytes volume: 11596 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.751 12 DEBUG ceilometer.compute.pollsters [-] 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc/network.outgoing.bytes volume: 10616 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.751 12 DEBUG ceilometer.compute.pollsters [-] 7e8d077b-66fc-42ee-ad4e-a13327ad6764/network.outgoing.bytes volume: 13804 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.751 12 DEBUG ceilometer.compute.pollsters [-] e69f0100-85ca-4ff8-a177-27d35d4580de/network.outgoing.bytes volume: 10276 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3aa203e5-9ee2-4632-9054-6e76dbe71199', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10121, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': 'instance-00000014-936001bf-d51b-4243-87b8-e363ef3c47a8-tap804a64f5-79', 'timestamp': '2026-01-22T17:11:55.750516', 'resource_metadata': {'display_name': 'tempest-server-test-1891253532', 'name': 'tap804a64f5-79', 'instance_id': '936001bf-d51b-4243-87b8-e363ef3c47a8', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3a:03:2f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap804a64f5-79'}, 'message_id': '741a4a5a-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.339918457, 'message_signature': '2c7bf17b46520aa75d81bf5fbb5414963e2c3fc94a08104d0c784ef29e4a49a6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11596, 'user_id': 
'1e61127d65144bcbaa0d43fe3eb484c0', 'user_name': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_name': None, 'resource_id': 'instance-00000011-c1a1134b-933b-41d1-ba12-adb71c18d006-tap096b36b4-87', 'timestamp': '2026-01-22T17:11:55.750516', 'resource_metadata': {'display_name': 'tempest-server-test-38386895', 'name': 'tap096b36b4-87', 'instance_id': 'c1a1134b-933b-41d1-ba12-adb71c18d006', 'instance_type': 'm1.nano', 'host': 'a4f6eac946aee0b365b250f469b66f13eee36a12a296e9bec9c6bd5b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f6:53:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap096b36b4-87'}, 'message_id': '741a5310-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.343158932, 'message_signature': 'a2d321c2cf28cbdb54de50de0d0f509df4aa71ebea3cca29b9924568522917b5'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10616, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': 'instance-00000010-36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-tapc26b2385-71', 'timestamp': '2026-01-22T17:11:55.750516', 'resource_metadata': {'display_name': 'tempest-server-test-939954687', 'name': 'tapc26b2385-71', 'instance_id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3c:de:8a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc26b2385-71'}, 'message_id': '741a5fe0-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.345831821, 'message_signature': '0d6f13b2aee2d7931fd458cc9e68a3bb463643b1c9da191a146256e1c80d6245'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 13804, 'user_id': '28bc4852545149e59d0541d4f39eb38e', 'user_name': None, 'project_id': 'c2b37b797ca344f2b31c3861277068d8', 'project_name': None, 'resource_id': 'instance-0000000b-7e8d077b-66fc-42ee-ad4e-a13327ad6764-tap5644ae2a-c3', 'timestamp': '2026-01-22T17:11:55.750516', 'resource_metadata': {'display_name': 'tempest-server-test-233401537', 'name': 'tap5644ae2a-c3', 'instance_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'instance_type': 'm1.nano', 'host': '5768678c2c8518615c02c4b38d92ffded2de8a3f4b91eb720a2d75be', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:c8:e5', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5644ae2a-c3'}, 'message_id': '741a679c-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.348179582, 'message_signature': '5cd6af58a230d1a0384c6d53b6f3ccdb11a53aca572fc739df292e1793ef3c32'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10276, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_name': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_name': None, 'resource_id': 'instance-00000013-e69f0100-85ca-4ff8-a177-27d35d4580de-tap0e3bc449-87', 'timestamp': '2026-01-22T17:11:55.750516', 'resource_metadata': {'display_name': 'tempest-server-test-1314522391', 'name': 'tap0e3bc449-87', 'instance_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de', 'instance_type': 'm1.nano', 'host': 'bc9209b8e975aadd369712a557aa682fe2641d361b5e14ec23435572', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:22:e2:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0e3bc449-87'}, 'message_id': '741a6f30-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4279.350434941, 'message_signature': '3ead6f1b0d35327ab0ef2b99e2948afd151373cd89bbc962140ff3f5e708d379'}]}, 'timestamp': '2026-01-22 17:11:55.751741', '_unique_id': '993f89f4836d47c7b8aa7cf8bccd8af4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:11:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:11:55.752 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:11:55 compute-0 nova_compute[183075]: 2026-01-22 17:11:55.820 183079 DEBUG nova.compute.provider_tree [None req-909f71d0-b3e4-4fad-a6f7-5b5fa8a9f4a6 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:11:55 compute-0 nova_compute[183075]: 2026-01-22 17:11:55.839 183079 DEBUG nova.scheduler.client.report [None req-909f71d0-b3e4-4fad-a6f7-5b5fa8a9f4a6 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:11:55 compute-0 nova_compute[183075]: 2026-01-22 17:11:55.863 183079 DEBUG oslo_concurrency.lockutils [None req-909f71d0-b3e4-4fad-a6f7-5b5fa8a9f4a6 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.220s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:55 compute-0 nova_compute[183075]: 2026-01-22 17:11:55.891 183079 INFO nova.scheduler.client.report [None req-909f71d0-b3e4-4fad-a6f7-5b5fa8a9f4a6 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Deleted allocations for instance 7d7be65d-c615-4cfd-936e-e5b57b3f29c1
Jan 22 17:11:55 compute-0 nova_compute[183075]: 2026-01-22 17:11:55.955 183079 DEBUG oslo_concurrency.lockutils [None req-909f71d0-b3e4-4fad-a6f7-5b5fa8a9f4a6 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "7d7be65d-c615-4cfd-936e-e5b57b3f29c1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:55 compute-0 nova_compute[183075]: 2026-01-22 17:11:55.985 183079 INFO nova.compute.manager [None req-28d5335d-6804-43d2-9cf8-1fb55ec31f1b 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Get console output
Jan 22 17:11:55 compute-0 nova_compute[183075]: 2026-01-22 17:11:55.989 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.249 183079 DEBUG oslo_concurrency.lockutils [None req-c9d4e3a1-9fac-4657-9e33-f3ba63a4d3b7 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Acquiring lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.249 183079 DEBUG oslo_concurrency.lockutils [None req-c9d4e3a1-9fac-4657-9e33-f3ba63a4d3b7 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.250 183079 DEBUG oslo_concurrency.lockutils [None req-c9d4e3a1-9fac-4657-9e33-f3ba63a4d3b7 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Acquiring lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.250 183079 DEBUG oslo_concurrency.lockutils [None req-c9d4e3a1-9fac-4657-9e33-f3ba63a4d3b7 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.251 183079 DEBUG oslo_concurrency.lockutils [None req-c9d4e3a1-9fac-4657-9e33-f3ba63a4d3b7 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.252 183079 INFO nova.compute.manager [None req-c9d4e3a1-9fac-4657-9e33-f3ba63a4d3b7 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Terminating instance
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.254 183079 DEBUG nova.compute.manager [None req-c9d4e3a1-9fac-4657-9e33-f3ba63a4d3b7 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:11:56 compute-0 kernel: tap5644ae2a-c3 (unregistering): left promiscuous mode
Jan 22 17:11:56 compute-0 NetworkManager[55454]: <info>  [1769101916.2884] device (tap5644ae2a-c3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:11:56 compute-0 ovn_controller[95372]: 2026-01-22T17:11:56Z|00213|binding|INFO|Releasing lport 5644ae2a-c35b-431d-88a1-ad18de811d83 from this chassis (sb_readonly=0)
Jan 22 17:11:56 compute-0 ovn_controller[95372]: 2026-01-22T17:11:56Z|00214|binding|INFO|Setting lport 5644ae2a-c35b-431d-88a1-ad18de811d83 down in Southbound
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.341 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:56 compute-0 ovn_controller[95372]: 2026-01-22T17:11:56Z|00215|binding|INFO|Removing iface tap5644ae2a-c3 ovn-installed in OVS
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.345 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.351 183079 DEBUG nova.compute.manager [req-1a7f468d-4f4b-4df2-b0fc-a70054bed979 req-55c3ff41-66ac-4580-be71-c9cf2e8c9f28 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Received event network-vif-plugged-1188a618-4567-453e-b4f1-8d3fafe1d314 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.351 183079 DEBUG oslo_concurrency.lockutils [req-1a7f468d-4f4b-4df2-b0fc-a70054bed979 req-55c3ff41-66ac-4580-be71-c9cf2e8c9f28 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "7d7be65d-c615-4cfd-936e-e5b57b3f29c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.351 183079 DEBUG oslo_concurrency.lockutils [req-1a7f468d-4f4b-4df2-b0fc-a70054bed979 req-55c3ff41-66ac-4580-be71-c9cf2e8c9f28 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7d7be65d-c615-4cfd-936e-e5b57b3f29c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.351 183079 DEBUG oslo_concurrency.lockutils [req-1a7f468d-4f4b-4df2-b0fc-a70054bed979 req-55c3ff41-66ac-4580-be71-c9cf2e8c9f28 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7d7be65d-c615-4cfd-936e-e5b57b3f29c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.352 183079 DEBUG nova.compute.manager [req-1a7f468d-4f4b-4df2-b0fc-a70054bed979 req-55c3ff41-66ac-4580-be71-c9cf2e8c9f28 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] No waiting events found dispatching network-vif-plugged-1188a618-4567-453e-b4f1-8d3fafe1d314 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.352 183079 WARNING nova.compute.manager [req-1a7f468d-4f4b-4df2-b0fc-a70054bed979 req-55c3ff41-66ac-4580-be71-c9cf2e8c9f28 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Received unexpected event network-vif-plugged-1188a618-4567-453e-b4f1-8d3fafe1d314 for instance with vm_state deleted and task_state None.
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.352 183079 DEBUG nova.compute.manager [req-1a7f468d-4f4b-4df2-b0fc-a70054bed979 req-55c3ff41-66ac-4580-be71-c9cf2e8c9f28 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Received event network-vif-deleted-1188a618-4567-453e-b4f1-8d3fafe1d314 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:11:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:56.354 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:c8:e5 10.100.0.9'], port_security=['fa:16:3e:d4:c8:e5 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ce346f8d-be8d-455f-b61c-12fea213a3f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c2b37b797ca344f2b31c3861277068d8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2f51838f-8a2c-425b-a70e-e288886c38d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9f29732e-c99f-480d-89f6-9caa444040c9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=5644ae2a-c35b-431d-88a1-ad18de811d83) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:11:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:56.355 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 5644ae2a-c35b-431d-88a1-ad18de811d83 in datapath ce346f8d-be8d-455f-b61c-12fea213a3f4 unbound from our chassis
Jan 22 17:11:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:56.357 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ce346f8d-be8d-455f-b61c-12fea213a3f4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.358 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:56.358 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4833f74b-981e-4c11-938c-3d328ffbfb56]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:56.358 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ce346f8d-be8d-455f-b61c-12fea213a3f4 namespace which is not needed anymore
Jan 22 17:11:56 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Jan 22 17:11:56 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000b.scope: Consumed 17.383s CPU time.
Jan 22 17:11:56 compute-0 systemd-machined[154382]: Machine qemu-11-instance-0000000b terminated.
Jan 22 17:11:56 compute-0 podman[220150]: 2026-01-22 17:11:56.397771299 +0000 UTC m=+0.098189801 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 17:11:56 compute-0 podman[220151]: 2026-01-22 17:11:56.406697582 +0000 UTC m=+0.102966375 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.buildah.version=1.33.7, architecture=x86_64, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41)
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.425 183079 INFO nova.compute.manager [None req-bd6f7fc5-97bb-4753-8178-d5761826b924 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Get console output
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.431 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:11:56 compute-0 podman[220149]: 2026-01-22 17:11:56.435702958 +0000 UTC m=+0.136172671 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:11:56 compute-0 kernel: tap5644ae2a-c3: entered promiscuous mode
Jan 22 17:11:56 compute-0 kernel: tap5644ae2a-c3 (unregistering): left promiscuous mode
Jan 22 17:11:56 compute-0 NetworkManager[55454]: <info>  [1769101916.4762] manager: (tap5644ae2a-c3): new Tun device (/org/freedesktop/NetworkManager/Devices/104)
Jan 22 17:11:56 compute-0 ovn_controller[95372]: 2026-01-22T17:11:56Z|00216|binding|INFO|Claiming lport 5644ae2a-c35b-431d-88a1-ad18de811d83 for this chassis.
Jan 22 17:11:56 compute-0 ovn_controller[95372]: 2026-01-22T17:11:56Z|00217|binding|INFO|5644ae2a-c35b-431d-88a1-ad18de811d83: Claiming fa:16:3e:d4:c8:e5 10.100.0.9
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.480 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:56.489 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:c8:e5 10.100.0.9'], port_security=['fa:16:3e:d4:c8:e5 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ce346f8d-be8d-455f-b61c-12fea213a3f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c2b37b797ca344f2b31c3861277068d8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2f51838f-8a2c-425b-a70e-e288886c38d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9f29732e-c99f-480d-89f6-9caa444040c9, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=5644ae2a-c35b-431d-88a1-ad18de811d83) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:11:56 compute-0 neutron-haproxy-ovnmeta-ce346f8d-be8d-455f-b61c-12fea213a3f4[217746]: [NOTICE]   (217764) : haproxy version is 2.8.14-c23fe91
Jan 22 17:11:56 compute-0 neutron-haproxy-ovnmeta-ce346f8d-be8d-455f-b61c-12fea213a3f4[217746]: [NOTICE]   (217764) : path to executable is /usr/sbin/haproxy
Jan 22 17:11:56 compute-0 neutron-haproxy-ovnmeta-ce346f8d-be8d-455f-b61c-12fea213a3f4[217746]: [WARNING]  (217764) : Exiting Master process...
Jan 22 17:11:56 compute-0 neutron-haproxy-ovnmeta-ce346f8d-be8d-455f-b61c-12fea213a3f4[217746]: [ALERT]    (217764) : Current worker (217774) exited with code 143 (Terminated)
Jan 22 17:11:56 compute-0 neutron-haproxy-ovnmeta-ce346f8d-be8d-455f-b61c-12fea213a3f4[217746]: [WARNING]  (217764) : All workers exited. Exiting... (0)
Jan 22 17:11:56 compute-0 systemd[1]: libpod-de6e0593f000d31e8ceafde0b4908ce9bfd4992406fcad5cbdf6e36686c72865.scope: Deactivated successfully.
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.500 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:56 compute-0 ovn_controller[95372]: 2026-01-22T17:11:56Z|00218|binding|INFO|Setting lport 5644ae2a-c35b-431d-88a1-ad18de811d83 ovn-installed in OVS
Jan 22 17:11:56 compute-0 ovn_controller[95372]: 2026-01-22T17:11:56Z|00219|binding|INFO|Setting lport 5644ae2a-c35b-431d-88a1-ad18de811d83 up in Southbound
Jan 22 17:11:56 compute-0 ovn_controller[95372]: 2026-01-22T17:11:56Z|00220|binding|INFO|Releasing lport 5644ae2a-c35b-431d-88a1-ad18de811d83 from this chassis (sb_readonly=1)
Jan 22 17:11:56 compute-0 ovn_controller[95372]: 2026-01-22T17:11:56Z|00221|if_status|INFO|Dropped 2 log messages in last 183 seconds (most recently, 183 seconds ago) due to excessive rate
Jan 22 17:11:56 compute-0 podman[220231]: 2026-01-22 17:11:56.504744738 +0000 UTC m=+0.052694074 container died de6e0593f000d31e8ceafde0b4908ce9bfd4992406fcad5cbdf6e36686c72865 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce346f8d-be8d-455f-b61c-12fea213a3f4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 17:11:56 compute-0 ovn_controller[95372]: 2026-01-22T17:11:56Z|00222|if_status|INFO|Not setting lport 5644ae2a-c35b-431d-88a1-ad18de811d83 down as sb is readonly
Jan 22 17:11:56 compute-0 ovn_controller[95372]: 2026-01-22T17:11:56Z|00223|binding|INFO|Releasing lport 5644ae2a-c35b-431d-88a1-ad18de811d83 from this chassis (sb_readonly=0)
Jan 22 17:11:56 compute-0 ovn_controller[95372]: 2026-01-22T17:11:56Z|00224|binding|INFO|Removing iface tap5644ae2a-c3 ovn-installed in OVS
Jan 22 17:11:56 compute-0 ovn_controller[95372]: 2026-01-22T17:11:56Z|00225|binding|INFO|Setting lport 5644ae2a-c35b-431d-88a1-ad18de811d83 down in Southbound
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.513 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:56.517 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:c8:e5 10.100.0.9'], port_security=['fa:16:3e:d4:c8:e5 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '7e8d077b-66fc-42ee-ad4e-a13327ad6764', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ce346f8d-be8d-455f-b61c-12fea213a3f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c2b37b797ca344f2b31c3861277068d8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2f51838f-8a2c-425b-a70e-e288886c38d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9f29732e-c99f-480d-89f6-9caa444040c9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=5644ae2a-c35b-431d-88a1-ad18de811d83) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.519 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.521 183079 INFO nova.virt.libvirt.driver [-] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Instance destroyed successfully.
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.522 183079 DEBUG nova.objects.instance [None req-c9d4e3a1-9fac-4657-9e33-f3ba63a4d3b7 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lazy-loading 'resources' on Instance uuid 7e8d077b-66fc-42ee-ad4e-a13327ad6764 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:11:56 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-de6e0593f000d31e8ceafde0b4908ce9bfd4992406fcad5cbdf6e36686c72865-userdata-shm.mount: Deactivated successfully.
Jan 22 17:11:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-cb99b46687ec2aa5d3c7adf23d310d69db4a67b07a13c288ee5deb7b5100ba51-merged.mount: Deactivated successfully.
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.546 183079 DEBUG nova.virt.libvirt.vif [None req-c9d4e3a1-9fac-4657-9e33-f3ba63a4d3b7 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:09:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-233401537',display_name='tempest-server-test-233401537',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-233401537',id=11,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA4izGuVdf36SsG+7n8kX9aNpboq22Z55adiWGM5qlH08LxqMkSxkCnGlFdsMKL8t/vQsOXqbCU1vgc4to/WoKVrvDSrylB83cxSgDIuuaEZv45HgYlb5csi4YLKl3Bk4g==',key_name='tempest-keypair-test-110348497',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:09:48Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c2b37b797ca344f2b31c3861277068d8',ramdisk_id='',reservation_id='r-lld7rcgo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_in
put_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIpMultipleRoutersTest-2036232412',owner_user_name='tempest-FloatingIpMultipleRoutersTest-2036232412-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:09:48Z,user_data=None,user_id='28bc4852545149e59d0541d4f39eb38e',uuid=7e8d077b-66fc-42ee-ad4e-a13327ad6764,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5644ae2a-c35b-431d-88a1-ad18de811d83", "address": "fa:16:3e:d4:c8:e5", "network": {"id": "ce346f8d-be8d-455f-b61c-12fea213a3f4", "bridge": "br-int", "label": "tempest-test-network--508539761", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5644ae2a-c3", "ovs_interfaceid": "5644ae2a-c35b-431d-88a1-ad18de811d83", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.546 183079 DEBUG nova.network.os_vif_util [None req-c9d4e3a1-9fac-4657-9e33-f3ba63a4d3b7 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Converting VIF {"id": "5644ae2a-c35b-431d-88a1-ad18de811d83", "address": "fa:16:3e:d4:c8:e5", "network": {"id": "ce346f8d-be8d-455f-b61c-12fea213a3f4", "bridge": "br-int", "label": "tempest-test-network--508539761", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2b37b797ca344f2b31c3861277068d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5644ae2a-c3", "ovs_interfaceid": "5644ae2a-c35b-431d-88a1-ad18de811d83", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.547 183079 DEBUG nova.network.os_vif_util [None req-c9d4e3a1-9fac-4657-9e33-f3ba63a4d3b7 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d4:c8:e5,bridge_name='br-int',has_traffic_filtering=True,id=5644ae2a-c35b-431d-88a1-ad18de811d83,network=Network(ce346f8d-be8d-455f-b61c-12fea213a3f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5644ae2a-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.547 183079 DEBUG os_vif [None req-c9d4e3a1-9fac-4657-9e33-f3ba63a4d3b7 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d4:c8:e5,bridge_name='br-int',has_traffic_filtering=True,id=5644ae2a-c35b-431d-88a1-ad18de811d83,network=Network(ce346f8d-be8d-455f-b61c-12fea213a3f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5644ae2a-c3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.549 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.549 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5644ae2a-c3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:56 compute-0 podman[220231]: 2026-01-22 17:11:56.551139198 +0000 UTC m=+0.099088534 container cleanup de6e0593f000d31e8ceafde0b4908ce9bfd4992406fcad5cbdf6e36686c72865 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce346f8d-be8d-455f-b61c-12fea213a3f4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.551 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.553 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.553 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.556 183079 INFO os_vif [None req-c9d4e3a1-9fac-4657-9e33-f3ba63a4d3b7 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d4:c8:e5,bridge_name='br-int',has_traffic_filtering=True,id=5644ae2a-c35b-431d-88a1-ad18de811d83,network=Network(ce346f8d-be8d-455f-b61c-12fea213a3f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5644ae2a-c3')
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.557 183079 INFO nova.virt.libvirt.driver [None req-c9d4e3a1-9fac-4657-9e33-f3ba63a4d3b7 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Deleting instance files /var/lib/nova/instances/7e8d077b-66fc-42ee-ad4e-a13327ad6764_del
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.557 183079 INFO nova.virt.libvirt.driver [None req-c9d4e3a1-9fac-4657-9e33-f3ba63a4d3b7 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Deletion of /var/lib/nova/instances/7e8d077b-66fc-42ee-ad4e-a13327ad6764_del complete
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.563 183079 DEBUG nova.compute.manager [req-ae2f20ba-c301-45f0-92e7-3f1d79342e55 req-817efdd1-270b-44d7-a134-cbd6653eb6f7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Received event network-vif-unplugged-5644ae2a-c35b-431d-88a1-ad18de811d83 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.563 183079 DEBUG oslo_concurrency.lockutils [req-ae2f20ba-c301-45f0-92e7-3f1d79342e55 req-817efdd1-270b-44d7-a134-cbd6653eb6f7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.564 183079 DEBUG oslo_concurrency.lockutils [req-ae2f20ba-c301-45f0-92e7-3f1d79342e55 req-817efdd1-270b-44d7-a134-cbd6653eb6f7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.564 183079 DEBUG oslo_concurrency.lockutils [req-ae2f20ba-c301-45f0-92e7-3f1d79342e55 req-817efdd1-270b-44d7-a134-cbd6653eb6f7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.565 183079 DEBUG nova.compute.manager [req-ae2f20ba-c301-45f0-92e7-3f1d79342e55 req-817efdd1-270b-44d7-a134-cbd6653eb6f7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] No waiting events found dispatching network-vif-unplugged-5644ae2a-c35b-431d-88a1-ad18de811d83 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.565 183079 DEBUG nova.compute.manager [req-ae2f20ba-c301-45f0-92e7-3f1d79342e55 req-817efdd1-270b-44d7-a134-cbd6653eb6f7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Received event network-vif-unplugged-5644ae2a-c35b-431d-88a1-ad18de811d83 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:11:56 compute-0 systemd[1]: libpod-conmon-de6e0593f000d31e8ceafde0b4908ce9bfd4992406fcad5cbdf6e36686c72865.scope: Deactivated successfully.
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.606 183079 INFO nova.compute.manager [None req-c9d4e3a1-9fac-4657-9e33-f3ba63a4d3b7 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Took 0.35 seconds to destroy the instance on the hypervisor.
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.607 183079 DEBUG oslo.service.loopingcall [None req-c9d4e3a1-9fac-4657-9e33-f3ba63a4d3b7 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.607 183079 DEBUG nova.compute.manager [-] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.608 183079 DEBUG nova.network.neutron [-] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:11:56 compute-0 podman[220273]: 2026-01-22 17:11:56.62522821 +0000 UTC m=+0.049406940 container remove de6e0593f000d31e8ceafde0b4908ce9bfd4992406fcad5cbdf6e36686c72865 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce346f8d-be8d-455f-b61c-12fea213a3f4, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 22 17:11:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:56.631 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2a8e3efb-0566-4f93-9260-b446723d2d00]: (4, ('Thu Jan 22 05:11:56 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ce346f8d-be8d-455f-b61c-12fea213a3f4 (de6e0593f000d31e8ceafde0b4908ce9bfd4992406fcad5cbdf6e36686c72865)\nde6e0593f000d31e8ceafde0b4908ce9bfd4992406fcad5cbdf6e36686c72865\nThu Jan 22 05:11:56 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ce346f8d-be8d-455f-b61c-12fea213a3f4 (de6e0593f000d31e8ceafde0b4908ce9bfd4992406fcad5cbdf6e36686c72865)\nde6e0593f000d31e8ceafde0b4908ce9bfd4992406fcad5cbdf6e36686c72865\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:56.633 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3bb60bbf-a1a8-4134-993c-13e4d5b1049b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:56.634 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapce346f8d-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.636 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:56 compute-0 kernel: tapce346f8d-b0: left promiscuous mode
Jan 22 17:11:56 compute-0 nova_compute[183075]: 2026-01-22 17:11:56.650 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:11:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:56.655 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[cb4af222-5cf3-4333-b45f-4dc756908d2e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:56.667 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[46722618-5fa4-4ff7-b598-171e4e39e1a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:56.669 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[00afe698-042a-49a2-9f8a-4cf4d1511cd1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:56.689 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3a65eb97-2db5-46eb-8063-2bd0bb4c51c8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415003, 'reachable_time': 36362, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220288, 'error': None, 'target': 'ovnmeta-ce346f8d-be8d-455f-b61c-12fea213a3f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:56.692 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ce346f8d-be8d-455f-b61c-12fea213a3f4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:11:56 compute-0 systemd[1]: run-netns-ovnmeta\x2dce346f8d\x2dbe8d\x2d455f\x2db61c\x2d12fea213a3f4.mount: Deactivated successfully.
Jan 22 17:11:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:56.692 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[46e7f9e1-a086-481b-b69a-fbbbae41b563]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:56.693 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 5644ae2a-c35b-431d-88a1-ad18de811d83 in datapath ce346f8d-be8d-455f-b61c-12fea213a3f4 unbound from our chassis
Jan 22 17:11:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:56.695 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ce346f8d-be8d-455f-b61c-12fea213a3f4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:11:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:56.696 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4a49d136-2bee-48de-892f-007619834119]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:56.697 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 5644ae2a-c35b-431d-88a1-ad18de811d83 in datapath ce346f8d-be8d-455f-b61c-12fea213a3f4 unbound from our chassis
Jan 22 17:11:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:56.699 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ce346f8d-be8d-455f-b61c-12fea213a3f4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:11:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:11:56.699 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[39f8da22-2518-44e1-8731-d130fca3ef97]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:11:57 compute-0 nova_compute[183075]: 2026-01-22 17:11:57.586 183079 DEBUG nova.network.neutron [-] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:11:57 compute-0 nova_compute[183075]: 2026-01-22 17:11:57.763 183079 INFO nova.compute.manager [-] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Took 1.16 seconds to deallocate network for instance.
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.331 183079 DEBUG oslo_concurrency.lockutils [None req-c9d4e3a1-9fac-4657-9e33-f3ba63a4d3b7 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.332 183079 DEBUG oslo_concurrency.lockutils [None req-c9d4e3a1-9fac-4657-9e33-f3ba63a4d3b7 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.606 183079 DEBUG nova.compute.manager [req-ab915fe0-776d-4077-9325-22b97a059322 req-d9951375-7b05-471c-9453-6b04395e86a6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Received event network-vif-deleted-5644ae2a-c35b-431d-88a1-ad18de811d83 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.624 183079 DEBUG nova.compute.provider_tree [None req-c9d4e3a1-9fac-4657-9e33-f3ba63a4d3b7 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.640 183079 DEBUG nova.scheduler.client.report [None req-c9d4e3a1-9fac-4657-9e33-f3ba63a4d3b7 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.649 183079 DEBUG nova.compute.manager [req-d33eedce-d64d-477f-81da-bb7b7ebd5f27 req-455fa66e-b559-4ad2-8d56-05fbc300fd9d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Received event network-vif-plugged-5644ae2a-c35b-431d-88a1-ad18de811d83 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.649 183079 DEBUG oslo_concurrency.lockutils [req-d33eedce-d64d-477f-81da-bb7b7ebd5f27 req-455fa66e-b559-4ad2-8d56-05fbc300fd9d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.650 183079 DEBUG oslo_concurrency.lockutils [req-d33eedce-d64d-477f-81da-bb7b7ebd5f27 req-455fa66e-b559-4ad2-8d56-05fbc300fd9d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.650 183079 DEBUG oslo_concurrency.lockutils [req-d33eedce-d64d-477f-81da-bb7b7ebd5f27 req-455fa66e-b559-4ad2-8d56-05fbc300fd9d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.650 183079 DEBUG nova.compute.manager [req-d33eedce-d64d-477f-81da-bb7b7ebd5f27 req-455fa66e-b559-4ad2-8d56-05fbc300fd9d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] No waiting events found dispatching network-vif-plugged-5644ae2a-c35b-431d-88a1-ad18de811d83 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.650 183079 WARNING nova.compute.manager [req-d33eedce-d64d-477f-81da-bb7b7ebd5f27 req-455fa66e-b559-4ad2-8d56-05fbc300fd9d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Received unexpected event network-vif-plugged-5644ae2a-c35b-431d-88a1-ad18de811d83 for instance with vm_state deleted and task_state None.
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.650 183079 DEBUG nova.compute.manager [req-d33eedce-d64d-477f-81da-bb7b7ebd5f27 req-455fa66e-b559-4ad2-8d56-05fbc300fd9d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Received event network-vif-plugged-5644ae2a-c35b-431d-88a1-ad18de811d83 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.650 183079 DEBUG oslo_concurrency.lockutils [req-d33eedce-d64d-477f-81da-bb7b7ebd5f27 req-455fa66e-b559-4ad2-8d56-05fbc300fd9d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.651 183079 DEBUG oslo_concurrency.lockutils [req-d33eedce-d64d-477f-81da-bb7b7ebd5f27 req-455fa66e-b559-4ad2-8d56-05fbc300fd9d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.651 183079 DEBUG oslo_concurrency.lockutils [req-d33eedce-d64d-477f-81da-bb7b7ebd5f27 req-455fa66e-b559-4ad2-8d56-05fbc300fd9d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.651 183079 DEBUG nova.compute.manager [req-d33eedce-d64d-477f-81da-bb7b7ebd5f27 req-455fa66e-b559-4ad2-8d56-05fbc300fd9d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] No waiting events found dispatching network-vif-plugged-5644ae2a-c35b-431d-88a1-ad18de811d83 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.651 183079 WARNING nova.compute.manager [req-d33eedce-d64d-477f-81da-bb7b7ebd5f27 req-455fa66e-b559-4ad2-8d56-05fbc300fd9d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Received unexpected event network-vif-plugged-5644ae2a-c35b-431d-88a1-ad18de811d83 for instance with vm_state deleted and task_state None.
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.651 183079 DEBUG nova.compute.manager [req-d33eedce-d64d-477f-81da-bb7b7ebd5f27 req-455fa66e-b559-4ad2-8d56-05fbc300fd9d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Received event network-vif-plugged-5644ae2a-c35b-431d-88a1-ad18de811d83 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.651 183079 DEBUG oslo_concurrency.lockutils [req-d33eedce-d64d-477f-81da-bb7b7ebd5f27 req-455fa66e-b559-4ad2-8d56-05fbc300fd9d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.651 183079 DEBUG oslo_concurrency.lockutils [req-d33eedce-d64d-477f-81da-bb7b7ebd5f27 req-455fa66e-b559-4ad2-8d56-05fbc300fd9d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.652 183079 DEBUG oslo_concurrency.lockutils [req-d33eedce-d64d-477f-81da-bb7b7ebd5f27 req-455fa66e-b559-4ad2-8d56-05fbc300fd9d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.652 183079 DEBUG nova.compute.manager [req-d33eedce-d64d-477f-81da-bb7b7ebd5f27 req-455fa66e-b559-4ad2-8d56-05fbc300fd9d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] No waiting events found dispatching network-vif-plugged-5644ae2a-c35b-431d-88a1-ad18de811d83 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.652 183079 WARNING nova.compute.manager [req-d33eedce-d64d-477f-81da-bb7b7ebd5f27 req-455fa66e-b559-4ad2-8d56-05fbc300fd9d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Received unexpected event network-vif-plugged-5644ae2a-c35b-431d-88a1-ad18de811d83 for instance with vm_state deleted and task_state None.
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.652 183079 DEBUG nova.compute.manager [req-d33eedce-d64d-477f-81da-bb7b7ebd5f27 req-455fa66e-b559-4ad2-8d56-05fbc300fd9d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Received event network-vif-unplugged-5644ae2a-c35b-431d-88a1-ad18de811d83 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.652 183079 DEBUG oslo_concurrency.lockutils [req-d33eedce-d64d-477f-81da-bb7b7ebd5f27 req-455fa66e-b559-4ad2-8d56-05fbc300fd9d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.652 183079 DEBUG oslo_concurrency.lockutils [req-d33eedce-d64d-477f-81da-bb7b7ebd5f27 req-455fa66e-b559-4ad2-8d56-05fbc300fd9d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.652 183079 DEBUG oslo_concurrency.lockutils [req-d33eedce-d64d-477f-81da-bb7b7ebd5f27 req-455fa66e-b559-4ad2-8d56-05fbc300fd9d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.653 183079 DEBUG nova.compute.manager [req-d33eedce-d64d-477f-81da-bb7b7ebd5f27 req-455fa66e-b559-4ad2-8d56-05fbc300fd9d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] No waiting events found dispatching network-vif-unplugged-5644ae2a-c35b-431d-88a1-ad18de811d83 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.653 183079 WARNING nova.compute.manager [req-d33eedce-d64d-477f-81da-bb7b7ebd5f27 req-455fa66e-b559-4ad2-8d56-05fbc300fd9d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Received unexpected event network-vif-unplugged-5644ae2a-c35b-431d-88a1-ad18de811d83 for instance with vm_state deleted and task_state None.
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.653 183079 DEBUG nova.compute.manager [req-d33eedce-d64d-477f-81da-bb7b7ebd5f27 req-455fa66e-b559-4ad2-8d56-05fbc300fd9d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Received event network-vif-plugged-5644ae2a-c35b-431d-88a1-ad18de811d83 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.653 183079 DEBUG oslo_concurrency.lockutils [req-d33eedce-d64d-477f-81da-bb7b7ebd5f27 req-455fa66e-b559-4ad2-8d56-05fbc300fd9d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.653 183079 DEBUG oslo_concurrency.lockutils [req-d33eedce-d64d-477f-81da-bb7b7ebd5f27 req-455fa66e-b559-4ad2-8d56-05fbc300fd9d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.653 183079 DEBUG oslo_concurrency.lockutils [req-d33eedce-d64d-477f-81da-bb7b7ebd5f27 req-455fa66e-b559-4ad2-8d56-05fbc300fd9d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.653 183079 DEBUG nova.compute.manager [req-d33eedce-d64d-477f-81da-bb7b7ebd5f27 req-455fa66e-b559-4ad2-8d56-05fbc300fd9d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] No waiting events found dispatching network-vif-plugged-5644ae2a-c35b-431d-88a1-ad18de811d83 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.654 183079 WARNING nova.compute.manager [req-d33eedce-d64d-477f-81da-bb7b7ebd5f27 req-455fa66e-b559-4ad2-8d56-05fbc300fd9d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Received unexpected event network-vif-plugged-5644ae2a-c35b-431d-88a1-ad18de811d83 for instance with vm_state deleted and task_state None.
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.660 183079 DEBUG oslo_concurrency.lockutils [None req-c9d4e3a1-9fac-4657-9e33-f3ba63a4d3b7 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.328s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:11:58 compute-0 nova_compute[183075]: 2026-01-22 17:11:58.855 183079 INFO nova.scheduler.client.report [None req-c9d4e3a1-9fac-4657-9e33-f3ba63a4d3b7 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Deleted allocations for instance 7e8d077b-66fc-42ee-ad4e-a13327ad6764
Jan 22 17:11:59 compute-0 nova_compute[183075]: 2026-01-22 17:11:59.032 183079 DEBUG oslo_concurrency.lockutils [None req-c9d4e3a1-9fac-4657-9e33-f3ba63a4d3b7 28bc4852545149e59d0541d4f39eb38e c2b37b797ca344f2b31c3861277068d8 - - default default] Lock "7e8d077b-66fc-42ee-ad4e-a13327ad6764" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.783s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:00 compute-0 nova_compute[183075]: 2026-01-22 17:12:00.205 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:00 compute-0 nova_compute[183075]: 2026-01-22 17:12:00.585 183079 INFO nova.compute.manager [None req-37038e39-5492-4799-83dd-c8cf679ffbc6 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Get console output
Jan 22 17:12:00 compute-0 nova_compute[183075]: 2026-01-22 17:12:00.592 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:12:00 compute-0 nova_compute[183075]: 2026-01-22 17:12:00.707 183079 DEBUG nova.compute.manager [req-e4bbff31-2079-4734-988b-561d9c00833a req-8a3f5e4c-46ba-41e7-8afa-d1a4aa1a0802 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Received event network-changed-c26b2385-71db-477e-888c-d10712732db6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:12:00 compute-0 nova_compute[183075]: 2026-01-22 17:12:00.707 183079 DEBUG nova.compute.manager [req-e4bbff31-2079-4734-988b-561d9c00833a req-8a3f5e4c-46ba-41e7-8afa-d1a4aa1a0802 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Refreshing instance network info cache due to event network-changed-c26b2385-71db-477e-888c-d10712732db6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:12:00 compute-0 nova_compute[183075]: 2026-01-22 17:12:00.708 183079 DEBUG oslo_concurrency.lockutils [req-e4bbff31-2079-4734-988b-561d9c00833a req-8a3f5e4c-46ba-41e7-8afa-d1a4aa1a0802 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-36a2dc63-6945-45c9-8e82-9d3aacdfc3bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:12:00 compute-0 nova_compute[183075]: 2026-01-22 17:12:00.708 183079 DEBUG oslo_concurrency.lockutils [req-e4bbff31-2079-4734-988b-561d9c00833a req-8a3f5e4c-46ba-41e7-8afa-d1a4aa1a0802 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-36a2dc63-6945-45c9-8e82-9d3aacdfc3bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:12:00 compute-0 nova_compute[183075]: 2026-01-22 17:12:00.709 183079 DEBUG nova.network.neutron [req-e4bbff31-2079-4734-988b-561d9c00833a req-8a3f5e4c-46ba-41e7-8afa-d1a4aa1a0802 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Refreshing network info cache for port c26b2385-71db-477e-888c-d10712732db6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:12:01 compute-0 nova_compute[183075]: 2026-01-22 17:12:01.257 183079 INFO nova.compute.manager [None req-9cd7f256-2569-4d3a-8cab-6b3ee43d07aa 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Get console output
Jan 22 17:12:01 compute-0 nova_compute[183075]: 2026-01-22 17:12:01.268 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:12:01 compute-0 nova_compute[183075]: 2026-01-22 17:12:01.553 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:02 compute-0 nova_compute[183075]: 2026-01-22 17:12:02.949 183079 DEBUG nova.network.neutron [req-e4bbff31-2079-4734-988b-561d9c00833a req-8a3f5e4c-46ba-41e7-8afa-d1a4aa1a0802 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Updated VIF entry in instance network info cache for port c26b2385-71db-477e-888c-d10712732db6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:12:02 compute-0 nova_compute[183075]: 2026-01-22 17:12:02.950 183079 DEBUG nova.network.neutron [req-e4bbff31-2079-4734-988b-561d9c00833a req-8a3f5e4c-46ba-41e7-8afa-d1a4aa1a0802 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Updating instance_info_cache with network_info: [{"id": "c26b2385-71db-477e-888c-d10712732db6", "address": "fa:16:3e:3c:de:8a", "network": {"id": "ea1cd914-64be-4fd0-b944-45368957fb5b", "bridge": "br-int", "label": "tempest-test-network--945581638", "subnets": [{"cidr": "10.10.1.0/24", "dns": [], "gateway": {"address": "10.10.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.1.232", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc26b2385-71", "ovs_interfaceid": "c26b2385-71db-477e-888c-d10712732db6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:12:02 compute-0 nova_compute[183075]: 2026-01-22 17:12:02.992 183079 DEBUG oslo_concurrency.lockutils [req-e4bbff31-2079-4734-988b-561d9c00833a req-8a3f5e4c-46ba-41e7-8afa-d1a4aa1a0802 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-36a2dc63-6945-45c9-8e82-9d3aacdfc3bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:12:03 compute-0 nova_compute[183075]: 2026-01-22 17:12:03.033 183079 INFO nova.compute.manager [None req-64adf8e5-8167-4d1c-b5ea-8603558bc554 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Get console output
Jan 22 17:12:03 compute-0 nova_compute[183075]: 2026-01-22 17:12:03.040 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:12:03 compute-0 podman[220291]: 2026-01-22 17:12:03.391294579 +0000 UTC m=+0.097255577 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:12:03 compute-0 nova_compute[183075]: 2026-01-22 17:12:03.584 183079 DEBUG oslo_concurrency.lockutils [None req-1c554789-37a1-4e66-b20a-ff3a6d6892a4 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "c1a1134b-933b-41d1-ba12-adb71c18d006" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:03 compute-0 nova_compute[183075]: 2026-01-22 17:12:03.585 183079 DEBUG oslo_concurrency.lockutils [None req-1c554789-37a1-4e66-b20a-ff3a6d6892a4 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "c1a1134b-933b-41d1-ba12-adb71c18d006" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:03 compute-0 nova_compute[183075]: 2026-01-22 17:12:03.585 183079 DEBUG oslo_concurrency.lockutils [None req-1c554789-37a1-4e66-b20a-ff3a6d6892a4 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "c1a1134b-933b-41d1-ba12-adb71c18d006-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:03 compute-0 nova_compute[183075]: 2026-01-22 17:12:03.586 183079 DEBUG oslo_concurrency.lockutils [None req-1c554789-37a1-4e66-b20a-ff3a6d6892a4 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "c1a1134b-933b-41d1-ba12-adb71c18d006-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:03 compute-0 nova_compute[183075]: 2026-01-22 17:12:03.586 183079 DEBUG oslo_concurrency.lockutils [None req-1c554789-37a1-4e66-b20a-ff3a6d6892a4 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "c1a1134b-933b-41d1-ba12-adb71c18d006-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:03 compute-0 nova_compute[183075]: 2026-01-22 17:12:03.587 183079 INFO nova.compute.manager [None req-1c554789-37a1-4e66-b20a-ff3a6d6892a4 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Terminating instance
Jan 22 17:12:03 compute-0 nova_compute[183075]: 2026-01-22 17:12:03.588 183079 DEBUG nova.compute.manager [None req-1c554789-37a1-4e66-b20a-ff3a6d6892a4 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:12:03 compute-0 kernel: tap096b36b4-87 (unregistering): left promiscuous mode
Jan 22 17:12:03 compute-0 NetworkManager[55454]: <info>  [1769101923.6201] device (tap096b36b4-87): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:12:03 compute-0 nova_compute[183075]: 2026-01-22 17:12:03.624 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:03 compute-0 ovn_controller[95372]: 2026-01-22T17:12:03Z|00226|binding|INFO|Releasing lport 096b36b4-87c4-423a-a3ef-3c47a75704f7 from this chassis (sb_readonly=0)
Jan 22 17:12:03 compute-0 ovn_controller[95372]: 2026-01-22T17:12:03Z|00227|binding|INFO|Setting lport 096b36b4-87c4-423a-a3ef-3c47a75704f7 down in Southbound
Jan 22 17:12:03 compute-0 ovn_controller[95372]: 2026-01-22T17:12:03Z|00228|binding|INFO|Removing iface tap096b36b4-87 ovn-installed in OVS
Jan 22 17:12:03 compute-0 nova_compute[183075]: 2026-01-22 17:12:03.627 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:03 compute-0 nova_compute[183075]: 2026-01-22 17:12:03.646 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:03 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000011.scope: Deactivated successfully.
Jan 22 17:12:03 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000011.scope: Consumed 15.333s CPU time.
Jan 22 17:12:03 compute-0 systemd-machined[154382]: Machine qemu-17-instance-00000011 terminated.
Jan 22 17:12:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:03.713 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:53:65 10.100.0.14'], port_security=['fa:16:3e:f6:53:65 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'c1a1134b-933b-41d1-ba12-adb71c18d006', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bfc6667804934c92b71ce7638089e9e3', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'd9af03c0-27db-4d08-b124-ee395583cdd3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.225', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fd725c57-a5bb-4dca-9677-d74d2fa01c15, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=096b36b4-87c4-423a-a3ef-3c47a75704f7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:12:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:03.714 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 096b36b4-87c4-423a-a3ef-3c47a75704f7 in datapath 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c unbound from our chassis
Jan 22 17:12:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:03.717 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9c1e909c-8e03-49be-b02d-6bf4a2cedc0c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:12:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:03.718 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[5efcadc8-7f4b-47b5-865a-db53e4f55b30]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:03.718 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c namespace which is not needed anymore
Jan 22 17:12:03 compute-0 nova_compute[183075]: 2026-01-22 17:12:03.871 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:03 compute-0 neutron-haproxy-ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[219169]: [NOTICE]   (219173) : haproxy version is 2.8.14-c23fe91
Jan 22 17:12:03 compute-0 neutron-haproxy-ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[219169]: [NOTICE]   (219173) : path to executable is /usr/sbin/haproxy
Jan 22 17:12:03 compute-0 neutron-haproxy-ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[219169]: [WARNING]  (219173) : Exiting Master process...
Jan 22 17:12:03 compute-0 neutron-haproxy-ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[219169]: [ALERT]    (219173) : Current worker (219175) exited with code 143 (Terminated)
Jan 22 17:12:03 compute-0 neutron-haproxy-ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c[219169]: [WARNING]  (219173) : All workers exited. Exiting... (0)
Jan 22 17:12:03 compute-0 systemd[1]: libpod-5559883f0eb78b2767cd5b1c2099ef9ddce66df902b91fbcdeb902d14ce3416f.scope: Deactivated successfully.
Jan 22 17:12:03 compute-0 podman[220336]: 2026-01-22 17:12:03.884806985 +0000 UTC m=+0.082954843 container died 5559883f0eb78b2767cd5b1c2099ef9ddce66df902b91fbcdeb902d14ce3416f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:12:03 compute-0 nova_compute[183075]: 2026-01-22 17:12:03.899 183079 INFO nova.virt.libvirt.driver [-] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Instance destroyed successfully.
Jan 22 17:12:03 compute-0 nova_compute[183075]: 2026-01-22 17:12:03.900 183079 DEBUG nova.objects.instance [None req-1c554789-37a1-4e66-b20a-ff3a6d6892a4 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lazy-loading 'resources' on Instance uuid c1a1134b-933b-41d1-ba12-adb71c18d006 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:12:03 compute-0 nova_compute[183075]: 2026-01-22 17:12:03.914 183079 DEBUG nova.virt.libvirt.vif [None req-1c554789-37a1-4e66-b20a-ff3a6d6892a4 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:10:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-38386895',display_name='tempest-server-test-38386895',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-38386895',id=17,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHfe8rLYrRVMQd5qVieqZtJXgXMRXXWLO3wIeFfb7KYA9eoyBAovnsonBtjcWSfX5askB1oLz9+GVLr2BbeT56cbjxFVwHBiF5ai0hYAzgMHQMj/KeUJm66j5OTKSNVWEQ==',key_name='tempest-keypair-test-1258081705',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:11:06Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bfc6667804934c92b71ce7638089e9e3',ramdisk_id='',reservation_id='r-ddwqtn0p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-QoSTest-2146064006',owner_user_name='tempest-QoSTest-2146064006-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:11:06Z,user_data=None,user_id='1e61127d65144bcbaa0d43fe3eb484c0',uuid=c1a1134b-933b-41d1-ba12-adb71c18d006,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "address": "fa:16:3e:f6:53:65", "network": {"id": "9c1e909c-8e03-49be-b02d-6bf4a2cedc0c", "bridge": "br-int", "label": "tempest-test-network--1863582698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap096b36b4-87", "ovs_interfaceid": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:12:03 compute-0 nova_compute[183075]: 2026-01-22 17:12:03.915 183079 DEBUG nova.network.os_vif_util [None req-1c554789-37a1-4e66-b20a-ff3a6d6892a4 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Converting VIF {"id": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "address": "fa:16:3e:f6:53:65", "network": {"id": "9c1e909c-8e03-49be-b02d-6bf4a2cedc0c", "bridge": "br-int", "label": "tempest-test-network--1863582698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap096b36b4-87", "ovs_interfaceid": "096b36b4-87c4-423a-a3ef-3c47a75704f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:12:03 compute-0 nova_compute[183075]: 2026-01-22 17:12:03.916 183079 DEBUG nova.network.os_vif_util [None req-1c554789-37a1-4e66-b20a-ff3a6d6892a4 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:53:65,bridge_name='br-int',has_traffic_filtering=True,id=096b36b4-87c4-423a-a3ef-3c47a75704f7,network=Network(9c1e909c-8e03-49be-b02d-6bf4a2cedc0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap096b36b4-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:12:03 compute-0 nova_compute[183075]: 2026-01-22 17:12:03.916 183079 DEBUG os_vif [None req-1c554789-37a1-4e66-b20a-ff3a6d6892a4 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:53:65,bridge_name='br-int',has_traffic_filtering=True,id=096b36b4-87c4-423a-a3ef-3c47a75704f7,network=Network(9c1e909c-8e03-49be-b02d-6bf4a2cedc0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap096b36b4-87') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:12:03 compute-0 nova_compute[183075]: 2026-01-22 17:12:03.918 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:03 compute-0 nova_compute[183075]: 2026-01-22 17:12:03.919 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap096b36b4-87, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:12:03 compute-0 nova_compute[183075]: 2026-01-22 17:12:03.920 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:03 compute-0 nova_compute[183075]: 2026-01-22 17:12:03.923 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:03 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5559883f0eb78b2767cd5b1c2099ef9ddce66df902b91fbcdeb902d14ce3416f-userdata-shm.mount: Deactivated successfully.
Jan 22 17:12:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-00880bfd293daaf1c42a33064a76a1798cdfeb466c7691cc3e6701338cd821ef-merged.mount: Deactivated successfully.
Jan 22 17:12:03 compute-0 nova_compute[183075]: 2026-01-22 17:12:03.928 183079 INFO os_vif [None req-1c554789-37a1-4e66-b20a-ff3a6d6892a4 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:53:65,bridge_name='br-int',has_traffic_filtering=True,id=096b36b4-87c4-423a-a3ef-3c47a75704f7,network=Network(9c1e909c-8e03-49be-b02d-6bf4a2cedc0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap096b36b4-87')
Jan 22 17:12:03 compute-0 nova_compute[183075]: 2026-01-22 17:12:03.929 183079 INFO nova.virt.libvirt.driver [None req-1c554789-37a1-4e66-b20a-ff3a6d6892a4 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Deleting instance files /var/lib/nova/instances/c1a1134b-933b-41d1-ba12-adb71c18d006_del
Jan 22 17:12:03 compute-0 nova_compute[183075]: 2026-01-22 17:12:03.930 183079 INFO nova.virt.libvirt.driver [None req-1c554789-37a1-4e66-b20a-ff3a6d6892a4 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Deletion of /var/lib/nova/instances/c1a1134b-933b-41d1-ba12-adb71c18d006_del complete
Jan 22 17:12:03 compute-0 podman[220336]: 2026-01-22 17:12:03.930581909 +0000 UTC m=+0.128729767 container cleanup 5559883f0eb78b2767cd5b1c2099ef9ddce66df902b91fbcdeb902d14ce3416f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:12:03 compute-0 systemd[1]: libpod-conmon-5559883f0eb78b2767cd5b1c2099ef9ddce66df902b91fbcdeb902d14ce3416f.scope: Deactivated successfully.
Jan 22 17:12:03 compute-0 nova_compute[183075]: 2026-01-22 17:12:03.988 183079 INFO nova.compute.manager [None req-1c554789-37a1-4e66-b20a-ff3a6d6892a4 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Took 0.40 seconds to destroy the instance on the hypervisor.
Jan 22 17:12:03 compute-0 nova_compute[183075]: 2026-01-22 17:12:03.989 183079 DEBUG oslo.service.loopingcall [None req-1c554789-37a1-4e66-b20a-ff3a6d6892a4 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:12:03 compute-0 nova_compute[183075]: 2026-01-22 17:12:03.989 183079 DEBUG nova.compute.manager [-] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:12:03 compute-0 nova_compute[183075]: 2026-01-22 17:12:03.989 183079 DEBUG nova.network.neutron [-] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:12:03 compute-0 podman[220379]: 2026-01-22 17:12:03.996783685 +0000 UTC m=+0.044510572 container remove 5559883f0eb78b2767cd5b1c2099ef9ddce66df902b91fbcdeb902d14ce3416f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 22 17:12:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:04.001 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f9f36f34-8c57-4feb-a8fc-f3d395cf0e96]: (4, ('Thu Jan 22 05:12:03 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c (5559883f0eb78b2767cd5b1c2099ef9ddce66df902b91fbcdeb902d14ce3416f)\n5559883f0eb78b2767cd5b1c2099ef9ddce66df902b91fbcdeb902d14ce3416f\nThu Jan 22 05:12:03 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c (5559883f0eb78b2767cd5b1c2099ef9ddce66df902b91fbcdeb902d14ce3416f)\n5559883f0eb78b2767cd5b1c2099ef9ddce66df902b91fbcdeb902d14ce3416f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:04.003 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3a823375-b4cc-40be-9830-46700018fae1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:04.003 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9c1e909c-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:12:04 compute-0 nova_compute[183075]: 2026-01-22 17:12:04.004 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:04 compute-0 kernel: tap9c1e909c-80: left promiscuous mode
Jan 22 17:12:04 compute-0 nova_compute[183075]: 2026-01-22 17:12:04.016 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:04.019 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4cec2cd1-799c-47ef-b225-5931b0860cc8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:04.030 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b397c231-9058-41d3-a949-44ad68823f18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:04.031 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f813eb84-af66-40aa-8695-09d8fa95985d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:04.046 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c2290f5e-6763-4281-8acb-0777f1dfc1a3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 422933, 'reachable_time': 34581, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220394, 'error': None, 'target': 'ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:04.048 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9c1e909c-8e03-49be-b02d-6bf4a2cedc0c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:12:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:04.048 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[e5667cbf-57d7-4b5b-a2cc-b89eb2ca5b9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:04 compute-0 systemd[1]: run-netns-ovnmeta\x2d9c1e909c\x2d8e03\x2d49be\x2db02d\x2d6bf4a2cedc0c.mount: Deactivated successfully.
Jan 22 17:12:04 compute-0 nova_compute[183075]: 2026-01-22 17:12:04.238 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769101909.2374067, d54ce6ac-7fff-4f20-a6e0-48c13efded58 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:12:04 compute-0 nova_compute[183075]: 2026-01-22 17:12:04.239 183079 INFO nova.compute.manager [-] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] VM Stopped (Lifecycle Event)
Jan 22 17:12:04 compute-0 nova_compute[183075]: 2026-01-22 17:12:04.256 183079 DEBUG nova.compute.manager [None req-061efc93-a04b-427b-942c-f81ed97f4759 - - - - - -] [instance: d54ce6ac-7fff-4f20-a6e0-48c13efded58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:12:04 compute-0 ovn_controller[95372]: 2026-01-22T17:12:04Z|00229|binding|INFO|Releasing lport 1759254b-798a-4e65-baf5-489557c1f604 from this chassis (sb_readonly=0)
Jan 22 17:12:04 compute-0 ovn_controller[95372]: 2026-01-22T17:12:04Z|00230|binding|INFO|Releasing lport 75c62732-e203-4484-b6f5-77c2880e15a3 from this chassis (sb_readonly=0)
Jan 22 17:12:04 compute-0 ovn_controller[95372]: 2026-01-22T17:12:04Z|00231|binding|INFO|Releasing lport d022c2ed-36a9-47fd-8947-7dd18cdeb285 from this chassis (sb_readonly=0)
Jan 22 17:12:04 compute-0 nova_compute[183075]: 2026-01-22 17:12:04.408 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:04 compute-0 nova_compute[183075]: 2026-01-22 17:12:04.490 183079 DEBUG nova.compute.manager [req-db6171c5-1a07-420f-b42b-5d61412b95bf req-a5a3cc6d-d97e-45f6-a5af-e519a4e22c39 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Received event network-vif-unplugged-096b36b4-87c4-423a-a3ef-3c47a75704f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:12:04 compute-0 nova_compute[183075]: 2026-01-22 17:12:04.491 183079 DEBUG oslo_concurrency.lockutils [req-db6171c5-1a07-420f-b42b-5d61412b95bf req-a5a3cc6d-d97e-45f6-a5af-e519a4e22c39 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c1a1134b-933b-41d1-ba12-adb71c18d006-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:04 compute-0 nova_compute[183075]: 2026-01-22 17:12:04.491 183079 DEBUG oslo_concurrency.lockutils [req-db6171c5-1a07-420f-b42b-5d61412b95bf req-a5a3cc6d-d97e-45f6-a5af-e519a4e22c39 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c1a1134b-933b-41d1-ba12-adb71c18d006-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:04 compute-0 nova_compute[183075]: 2026-01-22 17:12:04.491 183079 DEBUG oslo_concurrency.lockutils [req-db6171c5-1a07-420f-b42b-5d61412b95bf req-a5a3cc6d-d97e-45f6-a5af-e519a4e22c39 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c1a1134b-933b-41d1-ba12-adb71c18d006-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:04 compute-0 nova_compute[183075]: 2026-01-22 17:12:04.492 183079 DEBUG nova.compute.manager [req-db6171c5-1a07-420f-b42b-5d61412b95bf req-a5a3cc6d-d97e-45f6-a5af-e519a4e22c39 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] No waiting events found dispatching network-vif-unplugged-096b36b4-87c4-423a-a3ef-3c47a75704f7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:12:04 compute-0 nova_compute[183075]: 2026-01-22 17:12:04.492 183079 DEBUG nova.compute.manager [req-db6171c5-1a07-420f-b42b-5d61412b95bf req-a5a3cc6d-d97e-45f6-a5af-e519a4e22c39 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Received event network-vif-unplugged-096b36b4-87c4-423a-a3ef-3c47a75704f7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:12:05 compute-0 nova_compute[183075]: 2026-01-22 17:12:05.207 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:05 compute-0 nova_compute[183075]: 2026-01-22 17:12:05.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:12:05 compute-0 nova_compute[183075]: 2026-01-22 17:12:05.957 183079 DEBUG nova.network.neutron [-] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:12:05 compute-0 nova_compute[183075]: 2026-01-22 17:12:05.976 183079 INFO nova.compute.manager [-] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Took 1.99 seconds to deallocate network for instance.
Jan 22 17:12:06 compute-0 nova_compute[183075]: 2026-01-22 17:12:06.061 183079 DEBUG oslo_concurrency.lockutils [None req-1c554789-37a1-4e66-b20a-ff3a6d6892a4 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:06 compute-0 nova_compute[183075]: 2026-01-22 17:12:06.062 183079 DEBUG oslo_concurrency.lockutils [None req-1c554789-37a1-4e66-b20a-ff3a6d6892a4 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:06 compute-0 nova_compute[183075]: 2026-01-22 17:12:06.140 183079 DEBUG oslo_concurrency.lockutils [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "91845d3c-b89e-43ba-b1d2-40f99d79ae8e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:06 compute-0 nova_compute[183075]: 2026-01-22 17:12:06.141 183079 DEBUG oslo_concurrency.lockutils [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "91845d3c-b89e-43ba-b1d2-40f99d79ae8e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:06 compute-0 nova_compute[183075]: 2026-01-22 17:12:06.161 183079 DEBUG nova.compute.manager [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:12:06 compute-0 nova_compute[183075]: 2026-01-22 17:12:06.205 183079 DEBUG nova.compute.provider_tree [None req-1c554789-37a1-4e66-b20a-ff3a6d6892a4 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:12:06 compute-0 nova_compute[183075]: 2026-01-22 17:12:06.304 183079 DEBUG oslo_concurrency.lockutils [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:06 compute-0 nova_compute[183075]: 2026-01-22 17:12:06.471 183079 DEBUG nova.scheduler.client.report [None req-1c554789-37a1-4e66-b20a-ff3a6d6892a4 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:12:06 compute-0 nova_compute[183075]: 2026-01-22 17:12:06.564 183079 DEBUG oslo_concurrency.lockutils [None req-1c554789-37a1-4e66-b20a-ff3a6d6892a4 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.502s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:06 compute-0 nova_compute[183075]: 2026-01-22 17:12:06.566 183079 DEBUG oslo_concurrency.lockutils [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.262s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:06 compute-0 nova_compute[183075]: 2026-01-22 17:12:06.572 183079 DEBUG nova.virt.hardware [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:12:06 compute-0 nova_compute[183075]: 2026-01-22 17:12:06.572 183079 INFO nova.compute.claims [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:12:06 compute-0 nova_compute[183075]: 2026-01-22 17:12:06.592 183079 INFO nova.scheduler.client.report [None req-1c554789-37a1-4e66-b20a-ff3a6d6892a4 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Deleted allocations for instance c1a1134b-933b-41d1-ba12-adb71c18d006
Jan 22 17:12:06 compute-0 nova_compute[183075]: 2026-01-22 17:12:06.617 183079 DEBUG nova.compute.manager [req-0bb89ca4-3a50-4974-8103-557e42947b4a req-c73762c5-3f12-4196-8bb4-769f79814034 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Received event network-vif-plugged-096b36b4-87c4-423a-a3ef-3c47a75704f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:12:06 compute-0 nova_compute[183075]: 2026-01-22 17:12:06.617 183079 DEBUG oslo_concurrency.lockutils [req-0bb89ca4-3a50-4974-8103-557e42947b4a req-c73762c5-3f12-4196-8bb4-769f79814034 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c1a1134b-933b-41d1-ba12-adb71c18d006-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:06 compute-0 nova_compute[183075]: 2026-01-22 17:12:06.617 183079 DEBUG oslo_concurrency.lockutils [req-0bb89ca4-3a50-4974-8103-557e42947b4a req-c73762c5-3f12-4196-8bb4-769f79814034 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c1a1134b-933b-41d1-ba12-adb71c18d006-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:06 compute-0 nova_compute[183075]: 2026-01-22 17:12:06.617 183079 DEBUG oslo_concurrency.lockutils [req-0bb89ca4-3a50-4974-8103-557e42947b4a req-c73762c5-3f12-4196-8bb4-769f79814034 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c1a1134b-933b-41d1-ba12-adb71c18d006-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:06 compute-0 nova_compute[183075]: 2026-01-22 17:12:06.617 183079 DEBUG nova.compute.manager [req-0bb89ca4-3a50-4974-8103-557e42947b4a req-c73762c5-3f12-4196-8bb4-769f79814034 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] No waiting events found dispatching network-vif-plugged-096b36b4-87c4-423a-a3ef-3c47a75704f7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:12:06 compute-0 nova_compute[183075]: 2026-01-22 17:12:06.618 183079 WARNING nova.compute.manager [req-0bb89ca4-3a50-4974-8103-557e42947b4a req-c73762c5-3f12-4196-8bb4-769f79814034 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Received unexpected event network-vif-plugged-096b36b4-87c4-423a-a3ef-3c47a75704f7 for instance with vm_state deleted and task_state None.
Jan 22 17:12:06 compute-0 nova_compute[183075]: 2026-01-22 17:12:06.900 183079 DEBUG oslo_concurrency.lockutils [None req-1c554789-37a1-4e66-b20a-ff3a6d6892a4 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "c1a1134b-933b-41d1-ba12-adb71c18d006" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.315s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.015 183079 DEBUG nova.compute.provider_tree [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.040 183079 DEBUG nova.scheduler.client.report [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.068 183079 DEBUG oslo_concurrency.lockutils [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.502s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.069 183079 DEBUG nova.compute.manager [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.143 183079 DEBUG nova.compute.manager [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.144 183079 DEBUG nova.network.neutron [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.160 183079 INFO nova.virt.libvirt.driver [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.176 183079 DEBUG nova.compute.manager [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.232 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769101912.2314544, 7d7be65d-c615-4cfd-936e-e5b57b3f29c1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.233 183079 INFO nova.compute.manager [-] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] VM Stopped (Lifecycle Event)
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.264 183079 DEBUG nova.compute.manager [None req-27b05eb3-48f9-479c-b40e-1ee02700864e - - - - - -] [instance: 7d7be65d-c615-4cfd-936e-e5b57b3f29c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.273 183079 DEBUG nova.compute.manager [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.276 183079 DEBUG nova.virt.libvirt.driver [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.277 183079 INFO nova.virt.libvirt.driver [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Creating image(s)
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.278 183079 DEBUG oslo_concurrency.lockutils [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "/var/lib/nova/instances/91845d3c-b89e-43ba-b1d2-40f99d79ae8e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.278 183079 DEBUG oslo_concurrency.lockutils [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "/var/lib/nova/instances/91845d3c-b89e-43ba-b1d2-40f99d79ae8e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.279 183079 DEBUG oslo_concurrency.lockutils [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "/var/lib/nova/instances/91845d3c-b89e-43ba-b1d2-40f99d79ae8e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.294 183079 DEBUG oslo_concurrency.processutils [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.358 183079 DEBUG oslo_concurrency.processutils [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.358 183079 DEBUG oslo_concurrency.lockutils [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.359 183079 DEBUG oslo_concurrency.lockutils [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.369 183079 DEBUG oslo_concurrency.processutils [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.443 183079 DEBUG oslo_concurrency.processutils [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.444 183079 DEBUG oslo_concurrency.processutils [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/91845d3c-b89e-43ba-b1d2-40f99d79ae8e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.489 183079 DEBUG oslo_concurrency.processutils [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/91845d3c-b89e-43ba-b1d2-40f99d79ae8e/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.490 183079 DEBUG oslo_concurrency.lockutils [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.491 183079 DEBUG oslo_concurrency.processutils [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.548 183079 DEBUG oslo_concurrency.processutils [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.550 183079 DEBUG nova.virt.disk.api [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Checking if we can resize image /var/lib/nova/instances/91845d3c-b89e-43ba-b1d2-40f99d79ae8e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.550 183079 DEBUG oslo_concurrency.processutils [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/91845d3c-b89e-43ba-b1d2-40f99d79ae8e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.584 183079 DEBUG nova.policy [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.610 183079 DEBUG oslo_concurrency.processutils [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/91845d3c-b89e-43ba-b1d2-40f99d79ae8e/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.611 183079 DEBUG nova.virt.disk.api [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Cannot resize image /var/lib/nova/instances/91845d3c-b89e-43ba-b1d2-40f99d79ae8e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.611 183079 DEBUG nova.objects.instance [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lazy-loading 'migration_context' on Instance uuid 91845d3c-b89e-43ba-b1d2-40f99d79ae8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.624 183079 DEBUG nova.virt.libvirt.driver [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.624 183079 DEBUG nova.virt.libvirt.driver [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Ensure instance console log exists: /var/lib/nova/instances/91845d3c-b89e-43ba-b1d2-40f99d79ae8e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.625 183079 DEBUG oslo_concurrency.lockutils [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.625 183079 DEBUG oslo_concurrency.lockutils [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.625 183079 DEBUG oslo_concurrency.lockutils [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:07 compute-0 nova_compute[183075]: 2026-01-22 17:12:07.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:12:08 compute-0 ovn_controller[95372]: 2026-01-22T17:12:08Z|00232|binding|INFO|Releasing lport 1759254b-798a-4e65-baf5-489557c1f604 from this chassis (sb_readonly=0)
Jan 22 17:12:08 compute-0 ovn_controller[95372]: 2026-01-22T17:12:08Z|00233|binding|INFO|Releasing lport 75c62732-e203-4484-b6f5-77c2880e15a3 from this chassis (sb_readonly=0)
Jan 22 17:12:08 compute-0 ovn_controller[95372]: 2026-01-22T17:12:08Z|00234|binding|INFO|Releasing lport d022c2ed-36a9-47fd-8947-7dd18cdeb285 from this chassis (sb_readonly=0)
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.069 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.395 183079 DEBUG oslo_concurrency.lockutils [None req-eae44f3b-1699-467b-8ba7-937461e7642a 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "e69f0100-85ca-4ff8-a177-27d35d4580de" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.396 183079 DEBUG oslo_concurrency.lockutils [None req-eae44f3b-1699-467b-8ba7-937461e7642a 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "e69f0100-85ca-4ff8-a177-27d35d4580de" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.396 183079 DEBUG oslo_concurrency.lockutils [None req-eae44f3b-1699-467b-8ba7-937461e7642a 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "e69f0100-85ca-4ff8-a177-27d35d4580de-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.396 183079 DEBUG oslo_concurrency.lockutils [None req-eae44f3b-1699-467b-8ba7-937461e7642a 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "e69f0100-85ca-4ff8-a177-27d35d4580de-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.396 183079 DEBUG oslo_concurrency.lockutils [None req-eae44f3b-1699-467b-8ba7-937461e7642a 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "e69f0100-85ca-4ff8-a177-27d35d4580de-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.397 183079 INFO nova.compute.manager [None req-eae44f3b-1699-467b-8ba7-937461e7642a 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Terminating instance
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.398 183079 DEBUG nova.compute.manager [None req-eae44f3b-1699-467b-8ba7-937461e7642a 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:12:08 compute-0 kernel: tap0e3bc449-87 (unregistering): left promiscuous mode
Jan 22 17:12:08 compute-0 NetworkManager[55454]: <info>  [1769101928.4323] device (tap0e3bc449-87): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.473 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:08 compute-0 ovn_controller[95372]: 2026-01-22T17:12:08Z|00235|binding|INFO|Releasing lport 0e3bc449-87f9-4d63-9fee-5ac925d686c4 from this chassis (sb_readonly=0)
Jan 22 17:12:08 compute-0 ovn_controller[95372]: 2026-01-22T17:12:08Z|00236|binding|INFO|Setting lport 0e3bc449-87f9-4d63-9fee-5ac925d686c4 down in Southbound
Jan 22 17:12:08 compute-0 ovn_controller[95372]: 2026-01-22T17:12:08Z|00237|binding|INFO|Removing iface tap0e3bc449-87 ovn-installed in OVS
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.478 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:08.483 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:e2:d4 10.10.2.65'], port_security=['fa:16:3e:22:e2:d4 10.10.2.65'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.10.2.65/24', 'neutron:device_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3d23d5e4-bd70-4266-8b97-203b9af8d4ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '26cca885d303443380036cbbe9e70744', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9427a67d-1313-4d60-b73e-5a3f81f9a54d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=33071058-a726-4eee-b55a-420f0eebe73b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=0e3bc449-87f9-4d63-9fee-5ac925d686c4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:12:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:08.486 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 0e3bc449-87f9-4d63-9fee-5ac925d686c4 in datapath 3d23d5e4-bd70-4266-8b97-203b9af8d4ef unbound from our chassis
Jan 22 17:12:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:08.489 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3d23d5e4-bd70-4266-8b97-203b9af8d4ef, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:12:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:08.491 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[66a72b9f-f161-4f4d-8d98-a884bec81b9a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:08.492 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3d23d5e4-bd70-4266-8b97-203b9af8d4ef namespace which is not needed anymore
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.499 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:08 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000013.scope: Deactivated successfully.
Jan 22 17:12:08 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000013.scope: Consumed 14.708s CPU time.
Jan 22 17:12:08 compute-0 systemd-machined[154382]: Machine qemu-19-instance-00000013 terminated.
Jan 22 17:12:08 compute-0 kernel: tap0e3bc449-87: entered promiscuous mode
Jan 22 17:12:08 compute-0 NetworkManager[55454]: <info>  [1769101928.6264] manager: (tap0e3bc449-87): new Tun device (/org/freedesktop/NetworkManager/Devices/105)
Jan 22 17:12:08 compute-0 kernel: tap0e3bc449-87 (unregistering): left promiscuous mode
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.627 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:08 compute-0 ovn_controller[95372]: 2026-01-22T17:12:08Z|00238|binding|INFO|Claiming lport 0e3bc449-87f9-4d63-9fee-5ac925d686c4 for this chassis.
Jan 22 17:12:08 compute-0 ovn_controller[95372]: 2026-01-22T17:12:08Z|00239|binding|INFO|0e3bc449-87f9-4d63-9fee-5ac925d686c4: Claiming fa:16:3e:22:e2:d4 10.10.2.65
Jan 22 17:12:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:08.639 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:e2:d4 10.10.2.65'], port_security=['fa:16:3e:22:e2:d4 10.10.2.65'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.10.2.65/24', 'neutron:device_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3d23d5e4-bd70-4266-8b97-203b9af8d4ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '26cca885d303443380036cbbe9e70744', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9427a67d-1313-4d60-b73e-5a3f81f9a54d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=33071058-a726-4eee-b55a-420f0eebe73b, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=0e3bc449-87f9-4d63-9fee-5ac925d686c4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:12:08 compute-0 ovn_controller[95372]: 2026-01-22T17:12:08Z|00240|binding|INFO|Setting lport 0e3bc449-87f9-4d63-9fee-5ac925d686c4 ovn-installed in OVS
Jan 22 17:12:08 compute-0 ovn_controller[95372]: 2026-01-22T17:12:08Z|00241|binding|INFO|Setting lport 0e3bc449-87f9-4d63-9fee-5ac925d686c4 up in Southbound
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.656 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:08 compute-0 ovn_controller[95372]: 2026-01-22T17:12:08Z|00242|binding|INFO|Releasing lport 0e3bc449-87f9-4d63-9fee-5ac925d686c4 from this chassis (sb_readonly=1)
Jan 22 17:12:08 compute-0 ovn_controller[95372]: 2026-01-22T17:12:08Z|00243|if_status|INFO|Not setting lport 0e3bc449-87f9-4d63-9fee-5ac925d686c4 down as sb is readonly
Jan 22 17:12:08 compute-0 ovn_controller[95372]: 2026-01-22T17:12:08Z|00244|binding|INFO|Removing iface tap0e3bc449-87 ovn-installed in OVS
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.660 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:08 compute-0 ovn_controller[95372]: 2026-01-22T17:12:08Z|00245|binding|INFO|Releasing lport 0e3bc449-87f9-4d63-9fee-5ac925d686c4 from this chassis (sb_readonly=0)
Jan 22 17:12:08 compute-0 ovn_controller[95372]: 2026-01-22T17:12:08Z|00246|binding|INFO|Setting lport 0e3bc449-87f9-4d63-9fee-5ac925d686c4 down in Southbound
Jan 22 17:12:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:08.670 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:e2:d4 10.10.2.65'], port_security=['fa:16:3e:22:e2:d4 10.10.2.65'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.10.2.65/24', 'neutron:device_id': 'e69f0100-85ca-4ff8-a177-27d35d4580de', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3d23d5e4-bd70-4266-8b97-203b9af8d4ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '26cca885d303443380036cbbe9e70744', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9427a67d-1313-4d60-b73e-5a3f81f9a54d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=33071058-a726-4eee-b55a-420f0eebe73b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=0e3bc449-87f9-4d63-9fee-5ac925d686c4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.676 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:08 compute-0 neutron-haproxy-ovnmeta-3d23d5e4-bd70-4266-8b97-203b9af8d4ef[219755]: [NOTICE]   (219759) : haproxy version is 2.8.14-c23fe91
Jan 22 17:12:08 compute-0 neutron-haproxy-ovnmeta-3d23d5e4-bd70-4266-8b97-203b9af8d4ef[219755]: [NOTICE]   (219759) : path to executable is /usr/sbin/haproxy
Jan 22 17:12:08 compute-0 neutron-haproxy-ovnmeta-3d23d5e4-bd70-4266-8b97-203b9af8d4ef[219755]: [WARNING]  (219759) : Exiting Master process...
Jan 22 17:12:08 compute-0 neutron-haproxy-ovnmeta-3d23d5e4-bd70-4266-8b97-203b9af8d4ef[219755]: [WARNING]  (219759) : Exiting Master process...
Jan 22 17:12:08 compute-0 neutron-haproxy-ovnmeta-3d23d5e4-bd70-4266-8b97-203b9af8d4ef[219755]: [ALERT]    (219759) : Current worker (219761) exited with code 143 (Terminated)
Jan 22 17:12:08 compute-0 neutron-haproxy-ovnmeta-3d23d5e4-bd70-4266-8b97-203b9af8d4ef[219755]: [WARNING]  (219759) : All workers exited. Exiting... (0)
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.681 183079 INFO nova.virt.libvirt.driver [-] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Instance destroyed successfully.
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.682 183079 DEBUG nova.objects.instance [None req-eae44f3b-1699-467b-8ba7-937461e7642a 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lazy-loading 'resources' on Instance uuid e69f0100-85ca-4ff8-a177-27d35d4580de obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:12:08 compute-0 systemd[1]: libpod-c35ae59c6888fa03b353884b930a2ae5dcdbde61665b112595820ed76cc4310d.scope: Deactivated successfully.
Jan 22 17:12:08 compute-0 podman[220434]: 2026-01-22 17:12:08.689648844 +0000 UTC m=+0.081292540 container died c35ae59c6888fa03b353884b930a2ae5dcdbde61665b112595820ed76cc4310d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d23d5e4-bd70-4266-8b97-203b9af8d4ef, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.699 183079 DEBUG nova.virt.libvirt.vif [None req-eae44f3b-1699-467b-8ba7-937461e7642a 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:11:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1314522391',display_name='tempest-server-test-1314522391',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1314522391',id=19,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFDbtc4hRCSv9gHmltB2G2DhAS2UXQKlSKw7wO8jzDWMIBVwAbtKYxxZ/33aOA3bpXNLm1KqC5K8xgOmfq2yl/h8qq6Gn/Z0l0pwwJ1+wDWZki4CnB3qyJDKSWstQshx5w==',key_name='tempest-keypair-test-612599565',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:11:26Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='26cca885d303443380036cbbe9e70744',ramdisk_id='',reservation_id='r-i7hk38t1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw
_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-NetworkConnectivityTest-1809867331',owner_user_name='tempest-NetworkConnectivityTest-1809867331-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:11:26Z,user_data=None,user_id='4a7542774b9c42618cf9d00113f9d23d',uuid=e69f0100-85ca-4ff8-a177-27d35d4580de,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0e3bc449-87f9-4d63-9fee-5ac925d686c4", "address": "fa:16:3e:22:e2:d4", "network": {"id": "3d23d5e4-bd70-4266-8b97-203b9af8d4ef", "bridge": "br-int", "label": "tempest-test-network--2043252313", "subnets": [{"cidr": "10.10.2.0/24", "dns": [], "gateway": {"address": "10.10.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.2.65", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e3bc449-87", "ovs_interfaceid": "0e3bc449-87f9-4d63-9fee-5ac925d686c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.700 183079 DEBUG nova.network.os_vif_util [None req-eae44f3b-1699-467b-8ba7-937461e7642a 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Converting VIF {"id": "0e3bc449-87f9-4d63-9fee-5ac925d686c4", "address": "fa:16:3e:22:e2:d4", "network": {"id": "3d23d5e4-bd70-4266-8b97-203b9af8d4ef", "bridge": "br-int", "label": "tempest-test-network--2043252313", "subnets": [{"cidr": "10.10.2.0/24", "dns": [], "gateway": {"address": "10.10.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.2.65", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e3bc449-87", "ovs_interfaceid": "0e3bc449-87f9-4d63-9fee-5ac925d686c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.701 183079 DEBUG nova.network.os_vif_util [None req-eae44f3b-1699-467b-8ba7-937461e7642a 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:e2:d4,bridge_name='br-int',has_traffic_filtering=True,id=0e3bc449-87f9-4d63-9fee-5ac925d686c4,network=Network(3d23d5e4-bd70-4266-8b97-203b9af8d4ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0e3bc449-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.701 183079 DEBUG os_vif [None req-eae44f3b-1699-467b-8ba7-937461e7642a 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:e2:d4,bridge_name='br-int',has_traffic_filtering=True,id=0e3bc449-87f9-4d63-9fee-5ac925d686c4,network=Network(3d23d5e4-bd70-4266-8b97-203b9af8d4ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0e3bc449-87') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.703 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.703 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0e3bc449-87, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.706 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.707 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.710 183079 INFO os_vif [None req-eae44f3b-1699-467b-8ba7-937461e7642a 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:e2:d4,bridge_name='br-int',has_traffic_filtering=True,id=0e3bc449-87f9-4d63-9fee-5ac925d686c4,network=Network(3d23d5e4-bd70-4266-8b97-203b9af8d4ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0e3bc449-87')
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.711 183079 INFO nova.virt.libvirt.driver [None req-eae44f3b-1699-467b-8ba7-937461e7642a 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Deleting instance files /var/lib/nova/instances/e69f0100-85ca-4ff8-a177-27d35d4580de_del
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.711 183079 INFO nova.virt.libvirt.driver [None req-eae44f3b-1699-467b-8ba7-937461e7642a 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Deletion of /var/lib/nova/instances/e69f0100-85ca-4ff8-a177-27d35d4580de_del complete
Jan 22 17:12:08 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c35ae59c6888fa03b353884b930a2ae5dcdbde61665b112595820ed76cc4310d-userdata-shm.mount: Deactivated successfully.
Jan 22 17:12:08 compute-0 systemd[1]: var-lib-containers-storage-overlay-7ef46b5b029f8510f0162b5237ad7a6ce67e655f4fab8a923ff26c7a15c1618a-merged.mount: Deactivated successfully.
Jan 22 17:12:08 compute-0 podman[220434]: 2026-01-22 17:12:08.738290812 +0000 UTC m=+0.129934508 container cleanup c35ae59c6888fa03b353884b930a2ae5dcdbde61665b112595820ed76cc4310d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d23d5e4-bd70-4266-8b97-203b9af8d4ef, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:12:08 compute-0 systemd[1]: libpod-conmon-c35ae59c6888fa03b353884b930a2ae5dcdbde61665b112595820ed76cc4310d.scope: Deactivated successfully.
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.770 183079 INFO nova.compute.manager [None req-eae44f3b-1699-467b-8ba7-937461e7642a 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Took 0.37 seconds to destroy the instance on the hypervisor.
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.771 183079 DEBUG oslo.service.loopingcall [None req-eae44f3b-1699-467b-8ba7-937461e7642a 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.771 183079 DEBUG nova.compute.manager [-] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.771 183079 DEBUG nova.network.neutron [-] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:12:08 compute-0 podman[220473]: 2026-01-22 17:12:08.825894676 +0000 UTC m=+0.063656350 container remove c35ae59c6888fa03b353884b930a2ae5dcdbde61665b112595820ed76cc4310d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d23d5e4-bd70-4266-8b97-203b9af8d4ef, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:12:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:08.834 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[fabd60bc-6891-4bff-8271-055672dd1a46]: (4, ('Thu Jan 22 05:12:08 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3d23d5e4-bd70-4266-8b97-203b9af8d4ef (c35ae59c6888fa03b353884b930a2ae5dcdbde61665b112595820ed76cc4310d)\nc35ae59c6888fa03b353884b930a2ae5dcdbde61665b112595820ed76cc4310d\nThu Jan 22 05:12:08 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3d23d5e4-bd70-4266-8b97-203b9af8d4ef (c35ae59c6888fa03b353884b930a2ae5dcdbde61665b112595820ed76cc4310d)\nc35ae59c6888fa03b353884b930a2ae5dcdbde61665b112595820ed76cc4310d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:08.836 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8c1c4a86-1699-4138-8261-574e040d2e4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:08.837 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3d23d5e4-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.839 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:08 compute-0 kernel: tap3d23d5e4-b0: left promiscuous mode
Jan 22 17:12:08 compute-0 nova_compute[183075]: 2026-01-22 17:12:08.857 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:08.860 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[25c2aa64-6174-4977-b767-2e50b177bd0e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:08.880 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[38fb0193-13e5-439d-918e-46e6e3f399b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:08.882 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[73ddf690-115b-49e4-9b67-8575b9d554c1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:08.897 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[5b8be8ef-c59b-4855-bdb4-a00ec32eaf94]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424906, 'reachable_time': 19721, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220488, 'error': None, 'target': 'ovnmeta-3d23d5e4-bd70-4266-8b97-203b9af8d4ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:08 compute-0 systemd[1]: run-netns-ovnmeta\x2d3d23d5e4\x2dbd70\x2d4266\x2d8b97\x2d203b9af8d4ef.mount: Deactivated successfully.
Jan 22 17:12:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:08.902 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3d23d5e4-bd70-4266-8b97-203b9af8d4ef deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:12:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:08.902 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[270d6218-49cb-46de-83eb-c83610fe7916]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:08.904 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 0e3bc449-87f9-4d63-9fee-5ac925d686c4 in datapath 3d23d5e4-bd70-4266-8b97-203b9af8d4ef unbound from our chassis
Jan 22 17:12:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:08.906 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3d23d5e4-bd70-4266-8b97-203b9af8d4ef, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:12:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:08.907 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2a099489-c794-4c20-ba67-9613c5fc16d3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:08.908 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 0e3bc449-87f9-4d63-9fee-5ac925d686c4 in datapath 3d23d5e4-bd70-4266-8b97-203b9af8d4ef unbound from our chassis
Jan 22 17:12:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:08.909 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3d23d5e4-bd70-4266-8b97-203b9af8d4ef, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:12:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:08.910 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[960608c2-a3fc-455e-bfd6-93213eb3c69f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:09 compute-0 nova_compute[183075]: 2026-01-22 17:12:09.148 183079 DEBUG nova.compute.manager [req-4d675ba1-fdad-43a4-b5b8-466b2e531356 req-84a7eba5-b34a-4765-adf7-ce0453910b98 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Received event network-vif-unplugged-0e3bc449-87f9-4d63-9fee-5ac925d686c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:12:09 compute-0 nova_compute[183075]: 2026-01-22 17:12:09.148 183079 DEBUG oslo_concurrency.lockutils [req-4d675ba1-fdad-43a4-b5b8-466b2e531356 req-84a7eba5-b34a-4765-adf7-ce0453910b98 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "e69f0100-85ca-4ff8-a177-27d35d4580de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:09 compute-0 nova_compute[183075]: 2026-01-22 17:12:09.149 183079 DEBUG oslo_concurrency.lockutils [req-4d675ba1-fdad-43a4-b5b8-466b2e531356 req-84a7eba5-b34a-4765-adf7-ce0453910b98 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e69f0100-85ca-4ff8-a177-27d35d4580de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:09 compute-0 nova_compute[183075]: 2026-01-22 17:12:09.149 183079 DEBUG oslo_concurrency.lockutils [req-4d675ba1-fdad-43a4-b5b8-466b2e531356 req-84a7eba5-b34a-4765-adf7-ce0453910b98 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e69f0100-85ca-4ff8-a177-27d35d4580de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:09 compute-0 nova_compute[183075]: 2026-01-22 17:12:09.149 183079 DEBUG nova.compute.manager [req-4d675ba1-fdad-43a4-b5b8-466b2e531356 req-84a7eba5-b34a-4765-adf7-ce0453910b98 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] No waiting events found dispatching network-vif-unplugged-0e3bc449-87f9-4d63-9fee-5ac925d686c4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:12:09 compute-0 nova_compute[183075]: 2026-01-22 17:12:09.149 183079 DEBUG nova.compute.manager [req-4d675ba1-fdad-43a4-b5b8-466b2e531356 req-84a7eba5-b34a-4765-adf7-ce0453910b98 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Received event network-vif-unplugged-0e3bc449-87f9-4d63-9fee-5ac925d686c4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:12:09 compute-0 nova_compute[183075]: 2026-01-22 17:12:09.151 183079 DEBUG nova.network.neutron [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Successfully updated port: 1728cec9-ef37-4d9b-8c9c-54c2a6640439 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:12:09 compute-0 nova_compute[183075]: 2026-01-22 17:12:09.171 183079 DEBUG oslo_concurrency.lockutils [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "refresh_cache-91845d3c-b89e-43ba-b1d2-40f99d79ae8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:12:09 compute-0 nova_compute[183075]: 2026-01-22 17:12:09.171 183079 DEBUG oslo_concurrency.lockutils [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquired lock "refresh_cache-91845d3c-b89e-43ba-b1d2-40f99d79ae8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:12:09 compute-0 nova_compute[183075]: 2026-01-22 17:12:09.172 183079 DEBUG nova.network.neutron [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:12:09 compute-0 nova_compute[183075]: 2026-01-22 17:12:09.364 183079 DEBUG nova.network.neutron [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:12:09 compute-0 nova_compute[183075]: 2026-01-22 17:12:09.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:12:09 compute-0 nova_compute[183075]: 2026-01-22 17:12:09.996 183079 DEBUG nova.network.neutron [-] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.014 183079 INFO nova.compute.manager [-] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Took 1.24 seconds to deallocate network for instance.
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.061 183079 DEBUG oslo_concurrency.lockutils [None req-eae44f3b-1699-467b-8ba7-937461e7642a 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.062 183079 DEBUG oslo_concurrency.lockutils [None req-eae44f3b-1699-467b-8ba7-937461e7642a 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.141 183079 DEBUG nova.network.neutron [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Updating instance_info_cache with network_info: [{"id": "1728cec9-ef37-4d9b-8c9c-54c2a6640439", "address": "fa:16:3e:2c:21:9e", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1728cec9-ef", "ovs_interfaceid": "1728cec9-ef37-4d9b-8c9c-54c2a6640439", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.157 183079 DEBUG nova.compute.provider_tree [None req-eae44f3b-1699-467b-8ba7-937461e7642a 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.210 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.276 183079 DEBUG nova.scheduler.client.report [None req-eae44f3b-1699-467b-8ba7-937461e7642a 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.284 183079 DEBUG oslo_concurrency.lockutils [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Releasing lock "refresh_cache-91845d3c-b89e-43ba-b1d2-40f99d79ae8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.285 183079 DEBUG nova.compute.manager [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Instance network_info: |[{"id": "1728cec9-ef37-4d9b-8c9c-54c2a6640439", "address": "fa:16:3e:2c:21:9e", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1728cec9-ef", "ovs_interfaceid": "1728cec9-ef37-4d9b-8c9c-54c2a6640439", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.289 183079 DEBUG nova.virt.libvirt.driver [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Start _get_guest_xml network_info=[{"id": "1728cec9-ef37-4d9b-8c9c-54c2a6640439", "address": "fa:16:3e:2c:21:9e", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1728cec9-ef", "ovs_interfaceid": "1728cec9-ef37-4d9b-8c9c-54c2a6640439", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.296 183079 WARNING nova.virt.libvirt.driver [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.302 183079 DEBUG oslo_concurrency.lockutils [None req-eae44f3b-1699-467b-8ba7-937461e7642a 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.240s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.309 183079 DEBUG nova.virt.libvirt.host [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.310 183079 DEBUG nova.virt.libvirt.host [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.315 183079 DEBUG nova.virt.libvirt.host [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.316 183079 DEBUG nova.virt.libvirt.host [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.317 183079 DEBUG nova.virt.libvirt.driver [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.317 183079 DEBUG nova.virt.hardware [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.318 183079 DEBUG nova.virt.hardware [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.318 183079 DEBUG nova.virt.hardware [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.319 183079 DEBUG nova.virt.hardware [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.319 183079 DEBUG nova.virt.hardware [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.319 183079 DEBUG nova.virt.hardware [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.320 183079 DEBUG nova.virt.hardware [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.320 183079 DEBUG nova.virt.hardware [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.321 183079 DEBUG nova.virt.hardware [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.321 183079 DEBUG nova.virt.hardware [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.321 183079 DEBUG nova.virt.hardware [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.329 183079 DEBUG nova.virt.libvirt.vif [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:12:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-105042380',display_name='tempest-server-test-105042380',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-105042380',id=21,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFroJSJbEvCYe2sCwXkwL5KPVfwqqRPNqrWU5nnhv8U4G8wYWbEqk5AtRWXltmX7CAIPBST+sTwCKfymhALKd54XNN92D6KKnKgA6jmOihENah4FGfU80fx1UEE0osLpiw==',key_name='tempest-keypair-test-618317104',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e05c7aae349e4a1d859a387df45650a0',ramdisk_id='',reservation_id='r-kyiddvi0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='vir
tio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSeparateNetwork-931877966',owner_user_name='tempest-FloatingIpSeparateNetwork-931877966-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:12:07Z,user_data=None,user_id='cd47d63cff2548a88e21e5c2e6a5c161',uuid=91845d3c-b89e-43ba-b1d2-40f99d79ae8e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1728cec9-ef37-4d9b-8c9c-54c2a6640439", "address": "fa:16:3e:2c:21:9e", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1728cec9-ef", "ovs_interfaceid": "1728cec9-ef37-4d9b-8c9c-54c2a6640439", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.329 183079 DEBUG nova.network.os_vif_util [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converting VIF {"id": "1728cec9-ef37-4d9b-8c9c-54c2a6640439", "address": "fa:16:3e:2c:21:9e", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1728cec9-ef", "ovs_interfaceid": "1728cec9-ef37-4d9b-8c9c-54c2a6640439", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.330 183079 DEBUG nova.network.os_vif_util [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2c:21:9e,bridge_name='br-int',has_traffic_filtering=True,id=1728cec9-ef37-4d9b-8c9c-54c2a6640439,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1728cec9-ef') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.332 183079 DEBUG nova.objects.instance [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 91845d3c-b89e-43ba-b1d2-40f99d79ae8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.339 183079 INFO nova.scheduler.client.report [None req-eae44f3b-1699-467b-8ba7-937461e7642a 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Deleted allocations for instance e69f0100-85ca-4ff8-a177-27d35d4580de
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.345 183079 DEBUG nova.virt.libvirt.driver [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:12:10 compute-0 nova_compute[183075]:   <uuid>91845d3c-b89e-43ba-b1d2-40f99d79ae8e</uuid>
Jan 22 17:12:10 compute-0 nova_compute[183075]:   <name>instance-00000015</name>
Jan 22 17:12:10 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:12:10 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:12:10 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:12:10 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-105042380</nova:name>
Jan 22 17:12:10 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:12:10</nova:creationTime>
Jan 22 17:12:10 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:12:10 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:12:10 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:12:10 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:12:10 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:12:10 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:12:10 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:12:10 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:12:10 compute-0 nova_compute[183075]:         <nova:user uuid="cd47d63cff2548a88e21e5c2e6a5c161">tempest-FloatingIpSeparateNetwork-931877966-project-member</nova:user>
Jan 22 17:12:10 compute-0 nova_compute[183075]:         <nova:project uuid="e05c7aae349e4a1d859a387df45650a0">tempest-FloatingIpSeparateNetwork-931877966</nova:project>
Jan 22 17:12:10 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:12:10 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:12:10 compute-0 nova_compute[183075]:         <nova:port uuid="1728cec9-ef37-4d9b-8c9c-54c2a6640439">
Jan 22 17:12:10 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:12:10 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:12:10 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:12:10 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <system>
Jan 22 17:12:10 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:12:10 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:12:10 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:12:10 compute-0 nova_compute[183075]:       <entry name="serial">91845d3c-b89e-43ba-b1d2-40f99d79ae8e</entry>
Jan 22 17:12:10 compute-0 nova_compute[183075]:       <entry name="uuid">91845d3c-b89e-43ba-b1d2-40f99d79ae8e</entry>
Jan 22 17:12:10 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     </system>
Jan 22 17:12:10 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:12:10 compute-0 nova_compute[183075]:   <os>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:   </os>
Jan 22 17:12:10 compute-0 nova_compute[183075]:   <features>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:   </features>
Jan 22 17:12:10 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:12:10 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:12:10 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:12:10 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/91845d3c-b89e-43ba-b1d2-40f99d79ae8e/disk"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:12:10 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:2c:21:9e"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:       <target dev="tap1728cec9-ef"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:12:10 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/91845d3c-b89e-43ba-b1d2-40f99d79ae8e/console.log" append="off"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <video>
Jan 22 17:12:10 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     </video>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:12:10 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:12:10 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:12:10 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:12:10 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:12:10 compute-0 nova_compute[183075]: </domain>
Jan 22 17:12:10 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.346 183079 DEBUG nova.compute.manager [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Preparing to wait for external event network-vif-plugged-1728cec9-ef37-4d9b-8c9c-54c2a6640439 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.346 183079 DEBUG oslo_concurrency.lockutils [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "91845d3c-b89e-43ba-b1d2-40f99d79ae8e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.346 183079 DEBUG oslo_concurrency.lockutils [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "91845d3c-b89e-43ba-b1d2-40f99d79ae8e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.346 183079 DEBUG oslo_concurrency.lockutils [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "91845d3c-b89e-43ba-b1d2-40f99d79ae8e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.347 183079 DEBUG nova.virt.libvirt.vif [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:12:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-105042380',display_name='tempest-server-test-105042380',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-105042380',id=21,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFroJSJbEvCYe2sCwXkwL5KPVfwqqRPNqrWU5nnhv8U4G8wYWbEqk5AtRWXltmX7CAIPBST+sTwCKfymhALKd54XNN92D6KKnKgA6jmOihENah4FGfU80fx1UEE0osLpiw==',key_name='tempest-keypair-test-618317104',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e05c7aae349e4a1d859a387df45650a0',ramdisk_id='',reservation_id='r-kyiddvi0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_
model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSeparateNetwork-931877966',owner_user_name='tempest-FloatingIpSeparateNetwork-931877966-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:12:07Z,user_data=None,user_id='cd47d63cff2548a88e21e5c2e6a5c161',uuid=91845d3c-b89e-43ba-b1d2-40f99d79ae8e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1728cec9-ef37-4d9b-8c9c-54c2a6640439", "address": "fa:16:3e:2c:21:9e", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1728cec9-ef", "ovs_interfaceid": "1728cec9-ef37-4d9b-8c9c-54c2a6640439", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.347 183079 DEBUG nova.network.os_vif_util [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converting VIF {"id": "1728cec9-ef37-4d9b-8c9c-54c2a6640439", "address": "fa:16:3e:2c:21:9e", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1728cec9-ef", "ovs_interfaceid": "1728cec9-ef37-4d9b-8c9c-54c2a6640439", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.348 183079 DEBUG nova.network.os_vif_util [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2c:21:9e,bridge_name='br-int',has_traffic_filtering=True,id=1728cec9-ef37-4d9b-8c9c-54c2a6640439,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1728cec9-ef') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.348 183079 DEBUG os_vif [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2c:21:9e,bridge_name='br-int',has_traffic_filtering=True,id=1728cec9-ef37-4d9b-8c9c-54c2a6640439,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1728cec9-ef') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.349 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.349 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.349 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.353 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.354 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1728cec9-ef, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.354 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1728cec9-ef, col_values=(('external_ids', {'iface-id': '1728cec9-ef37-4d9b-8c9c-54c2a6640439', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2c:21:9e', 'vm-uuid': '91845d3c-b89e-43ba-b1d2-40f99d79ae8e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.356 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:10 compute-0 NetworkManager[55454]: <info>  [1769101930.3568] manager: (tap1728cec9-ef): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/106)
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.360 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.361 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.362 183079 INFO os_vif [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2c:21:9e,bridge_name='br-int',has_traffic_filtering=True,id=1728cec9-ef37-4d9b-8c9c-54c2a6640439,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1728cec9-ef')
Jan 22 17:12:10 compute-0 podman[220491]: 2026-01-22 17:12:10.478325848 +0000 UTC m=+0.081373893 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.554 183079 DEBUG nova.virt.libvirt.driver [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.555 183079 DEBUG nova.virt.libvirt.driver [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] No VIF found with MAC fa:16:3e:2c:21:9e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.610 183079 DEBUG oslo_concurrency.lockutils [None req-eae44f3b-1699-467b-8ba7-937461e7642a 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "e69f0100-85ca-4ff8-a177-27d35d4580de" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:10 compute-0 kernel: tap1728cec9-ef: entered promiscuous mode
Jan 22 17:12:10 compute-0 NetworkManager[55454]: <info>  [1769101930.6264] manager: (tap1728cec9-ef): new Tun device (/org/freedesktop/NetworkManager/Devices/107)
Jan 22 17:12:10 compute-0 systemd-udevd[220415]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.632 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:10 compute-0 ovn_controller[95372]: 2026-01-22T17:12:10Z|00247|binding|INFO|Claiming lport 1728cec9-ef37-4d9b-8c9c-54c2a6640439 for this chassis.
Jan 22 17:12:10 compute-0 ovn_controller[95372]: 2026-01-22T17:12:10Z|00248|binding|INFO|1728cec9-ef37-4d9b-8c9c-54c2a6640439: Claiming fa:16:3e:2c:21:9e 10.100.0.13
Jan 22 17:12:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:10.641 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2c:21:9e 10.100.0.13'], port_security=['fa:16:3e:2c:21:9e 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '91845d3c-b89e-43ba-b1d2-40f99d79ae8e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-576f6598-999f-46d9-809a-65b7475a1ec7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e05c7aae349e4a1d859a387df45650a0', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b9cc1ac9-85f4-4da7-891e-b0644711e574', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=92087573-6410-4f80-a195-ce8b2058420e, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=1728cec9-ef37-4d9b-8c9c-54c2a6640439) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:12:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:10.644 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 1728cec9-ef37-4d9b-8c9c-54c2a6640439 in datapath 576f6598-999f-46d9-809a-65b7475a1ec7 bound to our chassis
Jan 22 17:12:10 compute-0 NetworkManager[55454]: <info>  [1769101930.6450] device (tap1728cec9-ef): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:12:10 compute-0 NetworkManager[55454]: <info>  [1769101930.6458] device (tap1728cec9-ef): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:12:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:10.649 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 576f6598-999f-46d9-809a-65b7475a1ec7
Jan 22 17:12:10 compute-0 ovn_controller[95372]: 2026-01-22T17:12:10Z|00249|binding|INFO|Setting lport 1728cec9-ef37-4d9b-8c9c-54c2a6640439 ovn-installed in OVS
Jan 22 17:12:10 compute-0 ovn_controller[95372]: 2026-01-22T17:12:10Z|00250|binding|INFO|Setting lport 1728cec9-ef37-4d9b-8c9c-54c2a6640439 up in Southbound
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.660 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.664 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:10.672 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b1a92623-0115-4c05-b4e5-aa62e93cf58e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:10 compute-0 systemd-machined[154382]: New machine qemu-21-instance-00000015.
Jan 22 17:12:10 compute-0 systemd[1]: Started Virtual Machine qemu-21-instance-00000015.
Jan 22 17:12:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:10.719 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[eb7a66a5-8171-4f52-890f-8bd181177d87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:10.723 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[e5e8b97e-cef9-4405-9e17-673fa5fcfb42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:10.767 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[5f5fd770-027c-4201-938d-99ba0ffdd41c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:10.786 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b1c26185-adbc-4296-8020-5ed0045b02f6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap576f6598-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a1:fa:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 54, 'rx_bytes': 8920, 'tx_bytes': 6146, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 54, 'rx_bytes': 8920, 'tx_bytes': 6146, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425631, 'reachable_time': 19752, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220541, 'error': None, 'target': 'ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:10.810 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c037290e-6843-46c4-bad5-8e80caef2d66]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap576f6598-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 425644, 'tstamp': 425644}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220543, 'error': None, 'target': 'ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap576f6598-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 425648, 'tstamp': 425648}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220543, 'error': None, 'target': 'ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:10.812 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap576f6598-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:12:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:10.815 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap576f6598-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:12:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:10.815 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:12:10 compute-0 nova_compute[183075]: 2026-01-22 17:12:10.815 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:10.816 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap576f6598-90, col_values=(('external_ids', {'iface-id': '1759254b-798a-4e65-baf5-489557c1f604'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:12:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:10.816 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:12:11 compute-0 nova_compute[183075]: 2026-01-22 17:12:11.125 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101931.1250901, 91845d3c-b89e-43ba-b1d2-40f99d79ae8e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:12:11 compute-0 nova_compute[183075]: 2026-01-22 17:12:11.126 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] VM Started (Lifecycle Event)
Jan 22 17:12:11 compute-0 nova_compute[183075]: 2026-01-22 17:12:11.144 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:12:11 compute-0 nova_compute[183075]: 2026-01-22 17:12:11.148 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101931.125797, 91845d3c-b89e-43ba-b1d2-40f99d79ae8e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:12:11 compute-0 nova_compute[183075]: 2026-01-22 17:12:11.149 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] VM Paused (Lifecycle Event)
Jan 22 17:12:11 compute-0 nova_compute[183075]: 2026-01-22 17:12:11.214 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:12:11 compute-0 nova_compute[183075]: 2026-01-22 17:12:11.218 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:12:11 compute-0 nova_compute[183075]: 2026-01-22 17:12:11.260 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:12:11 compute-0 nova_compute[183075]: 2026-01-22 17:12:11.287 183079 DEBUG nova.compute.manager [req-b370faab-275c-49c7-8bbf-b99d502f4762 req-f6bb4206-f509-49cc-a71d-aa376a57f92c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Received event network-changed-1728cec9-ef37-4d9b-8c9c-54c2a6640439 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:12:11 compute-0 nova_compute[183075]: 2026-01-22 17:12:11.287 183079 DEBUG nova.compute.manager [req-b370faab-275c-49c7-8bbf-b99d502f4762 req-f6bb4206-f509-49cc-a71d-aa376a57f92c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Refreshing instance network info cache due to event network-changed-1728cec9-ef37-4d9b-8c9c-54c2a6640439. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:12:11 compute-0 nova_compute[183075]: 2026-01-22 17:12:11.288 183079 DEBUG oslo_concurrency.lockutils [req-b370faab-275c-49c7-8bbf-b99d502f4762 req-f6bb4206-f509-49cc-a71d-aa376a57f92c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-91845d3c-b89e-43ba-b1d2-40f99d79ae8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:12:11 compute-0 nova_compute[183075]: 2026-01-22 17:12:11.288 183079 DEBUG oslo_concurrency.lockutils [req-b370faab-275c-49c7-8bbf-b99d502f4762 req-f6bb4206-f509-49cc-a71d-aa376a57f92c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-91845d3c-b89e-43ba-b1d2-40f99d79ae8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:12:11 compute-0 nova_compute[183075]: 2026-01-22 17:12:11.288 183079 DEBUG nova.network.neutron [req-b370faab-275c-49c7-8bbf-b99d502f4762 req-f6bb4206-f509-49cc-a71d-aa376a57f92c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Refreshing network info cache for port 1728cec9-ef37-4d9b-8c9c-54c2a6640439 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:12:11 compute-0 nova_compute[183075]: 2026-01-22 17:12:11.515 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769101916.513847, 7e8d077b-66fc-42ee-ad4e-a13327ad6764 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:12:11 compute-0 nova_compute[183075]: 2026-01-22 17:12:11.516 183079 INFO nova.compute.manager [-] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] VM Stopped (Lifecycle Event)
Jan 22 17:12:11 compute-0 nova_compute[183075]: 2026-01-22 17:12:11.563 183079 DEBUG nova.compute.manager [None req-d63d7673-c2e2-465f-a1e9-57204c3dc869 - - - - - -] [instance: 7e8d077b-66fc-42ee-ad4e-a13327ad6764] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:12:11 compute-0 nova_compute[183075]: 2026-01-22 17:12:11.763 183079 DEBUG oslo_concurrency.lockutils [None req-753711fd-6dff-4fec-869e-1b1c13fc5990 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "36a2dc63-6945-45c9-8e82-9d3aacdfc3bc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:11 compute-0 nova_compute[183075]: 2026-01-22 17:12:11.764 183079 DEBUG oslo_concurrency.lockutils [None req-753711fd-6dff-4fec-869e-1b1c13fc5990 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "36a2dc63-6945-45c9-8e82-9d3aacdfc3bc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:11 compute-0 nova_compute[183075]: 2026-01-22 17:12:11.764 183079 DEBUG oslo_concurrency.lockutils [None req-753711fd-6dff-4fec-869e-1b1c13fc5990 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:11 compute-0 nova_compute[183075]: 2026-01-22 17:12:11.764 183079 DEBUG oslo_concurrency.lockutils [None req-753711fd-6dff-4fec-869e-1b1c13fc5990 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:11 compute-0 nova_compute[183075]: 2026-01-22 17:12:11.764 183079 DEBUG oslo_concurrency.lockutils [None req-753711fd-6dff-4fec-869e-1b1c13fc5990 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:11 compute-0 nova_compute[183075]: 2026-01-22 17:12:11.765 183079 INFO nova.compute.manager [None req-753711fd-6dff-4fec-869e-1b1c13fc5990 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Terminating instance
Jan 22 17:12:11 compute-0 nova_compute[183075]: 2026-01-22 17:12:11.766 183079 DEBUG nova.compute.manager [None req-753711fd-6dff-4fec-869e-1b1c13fc5990 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:12:11 compute-0 nova_compute[183075]: 2026-01-22 17:12:11.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:12:11 compute-0 kernel: tapc26b2385-71 (unregistering): left promiscuous mode
Jan 22 17:12:11 compute-0 nova_compute[183075]: 2026-01-22 17:12:11.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:12:11 compute-0 nova_compute[183075]: 2026-01-22 17:12:11.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:12:11 compute-0 nova_compute[183075]: 2026-01-22 17:12:11.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:12:11 compute-0 NetworkManager[55454]: <info>  [1769101931.7930] device (tapc26b2385-71): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:12:11 compute-0 nova_compute[183075]: 2026-01-22 17:12:11.807 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Jan 22 17:12:11 compute-0 nova_compute[183075]: 2026-01-22 17:12:11.807 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 22 17:12:11 compute-0 ovn_controller[95372]: 2026-01-22T17:12:11Z|00251|binding|INFO|Releasing lport c26b2385-71db-477e-888c-d10712732db6 from this chassis (sb_readonly=0)
Jan 22 17:12:11 compute-0 ovn_controller[95372]: 2026-01-22T17:12:11Z|00252|binding|INFO|Setting lport c26b2385-71db-477e-888c-d10712732db6 down in Southbound
Jan 22 17:12:11 compute-0 nova_compute[183075]: 2026-01-22 17:12:11.810 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:11 compute-0 ovn_controller[95372]: 2026-01-22T17:12:11Z|00253|binding|INFO|Removing iface tapc26b2385-71 ovn-installed in OVS
Jan 22 17:12:11 compute-0 nova_compute[183075]: 2026-01-22 17:12:11.813 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:11 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:11.822 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3c:de:8a 10.10.1.232'], port_security=['fa:16:3e:3c:de:8a 10.10.1.232'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.10.1.232/24', 'neutron:device_id': '36a2dc63-6945-45c9-8e82-9d3aacdfc3bc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ea1cd914-64be-4fd0-b944-45368957fb5b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '26cca885d303443380036cbbe9e70744', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9427a67d-1313-4d60-b73e-5a3f81f9a54d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.249'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f87222fa-7187-4fb1-9f2e-117949cb78fa, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=c26b2385-71db-477e-888c-d10712732db6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:12:11 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:11.824 104629 INFO neutron.agent.ovn.metadata.agent [-] Port c26b2385-71db-477e-888c-d10712732db6 in datapath ea1cd914-64be-4fd0-b944-45368957fb5b unbound from our chassis
Jan 22 17:12:11 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:11.825 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ea1cd914-64be-4fd0-b944-45368957fb5b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:12:11 compute-0 nova_compute[183075]: 2026-01-22 17:12:11.825 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:11 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:11.827 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[79b11847-e292-4568-97a7-c141f8910eeb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:11 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:11.827 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ea1cd914-64be-4fd0-b944-45368957fb5b namespace which is not needed anymore
Jan 22 17:12:11 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000010.scope: Deactivated successfully.
Jan 22 17:12:11 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000010.scope: Consumed 14.710s CPU time.
Jan 22 17:12:11 compute-0 systemd-machined[154382]: Machine qemu-16-instance-00000010 terminated.
Jan 22 17:12:11 compute-0 neutron-haproxy-ovnmeta-ea1cd914-64be-4fd0-b944-45368957fb5b[218912]: [NOTICE]   (218916) : haproxy version is 2.8.14-c23fe91
Jan 22 17:12:11 compute-0 neutron-haproxy-ovnmeta-ea1cd914-64be-4fd0-b944-45368957fb5b[218912]: [NOTICE]   (218916) : path to executable is /usr/sbin/haproxy
Jan 22 17:12:11 compute-0 neutron-haproxy-ovnmeta-ea1cd914-64be-4fd0-b944-45368957fb5b[218912]: [WARNING]  (218916) : Exiting Master process...
Jan 22 17:12:11 compute-0 neutron-haproxy-ovnmeta-ea1cd914-64be-4fd0-b944-45368957fb5b[218912]: [WARNING]  (218916) : Exiting Master process...
Jan 22 17:12:11 compute-0 neutron-haproxy-ovnmeta-ea1cd914-64be-4fd0-b944-45368957fb5b[218912]: [ALERT]    (218916) : Current worker (218918) exited with code 143 (Terminated)
Jan 22 17:12:11 compute-0 neutron-haproxy-ovnmeta-ea1cd914-64be-4fd0-b944-45368957fb5b[218912]: [WARNING]  (218916) : All workers exited. Exiting... (0)
Jan 22 17:12:11 compute-0 systemd[1]: libpod-04243dd0edec7173998e19ee857249c039bac30befee2d558a8d8b171cc9f8c7.scope: Deactivated successfully.
Jan 22 17:12:11 compute-0 podman[220575]: 2026-01-22 17:12:11.968984252 +0000 UTC m=+0.044655355 container died 04243dd0edec7173998e19ee857249c039bac30befee2d558a8d8b171cc9f8c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ea1cd914-64be-4fd0-b944-45368957fb5b, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 17:12:12 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-04243dd0edec7173998e19ee857249c039bac30befee2d558a8d8b171cc9f8c7-userdata-shm.mount: Deactivated successfully.
Jan 22 17:12:12 compute-0 systemd[1]: var-lib-containers-storage-overlay-98a90ffaaf19cf754860dd41470f93fa7233e95a32e8f90c678af9e3a1058b81-merged.mount: Deactivated successfully.
Jan 22 17:12:12 compute-0 podman[220575]: 2026-01-22 17:12:12.025136425 +0000 UTC m=+0.100807528 container cleanup 04243dd0edec7173998e19ee857249c039bac30befee2d558a8d8b171cc9f8c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ea1cd914-64be-4fd0-b944-45368957fb5b, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 17:12:12 compute-0 nova_compute[183075]: 2026-01-22 17:12:12.031 183079 INFO nova.virt.libvirt.driver [-] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Instance destroyed successfully.
Jan 22 17:12:12 compute-0 nova_compute[183075]: 2026-01-22 17:12:12.032 183079 DEBUG nova.objects.instance [None req-753711fd-6dff-4fec-869e-1b1c13fc5990 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lazy-loading 'resources' on Instance uuid 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:12:12 compute-0 systemd[1]: libpod-conmon-04243dd0edec7173998e19ee857249c039bac30befee2d558a8d8b171cc9f8c7.scope: Deactivated successfully.
Jan 22 17:12:12 compute-0 nova_compute[183075]: 2026-01-22 17:12:12.045 183079 DEBUG nova.virt.libvirt.vif [None req-753711fd-6dff-4fec-869e-1b1c13fc5990 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:10:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-939954687',display_name='tempest-server-test-939954687',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-939954687',id=16,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFDbtc4hRCSv9gHmltB2G2DhAS2UXQKlSKw7wO8jzDWMIBVwAbtKYxxZ/33aOA3bpXNLm1KqC5K8xgOmfq2yl/h8qq6Gn/Z0l0pwwJ1+wDWZki4CnB3qyJDKSWstQshx5w==',key_name='tempest-keypair-test-612599565',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:10:54Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='26cca885d303443380036cbbe9e70744',ramdisk_id='',reservation_id='r-0x1txn9l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_in
put_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-NetworkConnectivityTest-1809867331',owner_user_name='tempest-NetworkConnectivityTest-1809867331-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:10:54Z,user_data=None,user_id='4a7542774b9c42618cf9d00113f9d23d',uuid=36a2dc63-6945-45c9-8e82-9d3aacdfc3bc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c26b2385-71db-477e-888c-d10712732db6", "address": "fa:16:3e:3c:de:8a", "network": {"id": "ea1cd914-64be-4fd0-b944-45368957fb5b", "bridge": "br-int", "label": "tempest-test-network--945581638", "subnets": [{"cidr": "10.10.1.0/24", "dns": [], "gateway": {"address": "10.10.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.1.232", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc26b2385-71", "ovs_interfaceid": "c26b2385-71db-477e-888c-d10712732db6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:12:12 compute-0 nova_compute[183075]: 2026-01-22 17:12:12.046 183079 DEBUG nova.network.os_vif_util [None req-753711fd-6dff-4fec-869e-1b1c13fc5990 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Converting VIF {"id": "c26b2385-71db-477e-888c-d10712732db6", "address": "fa:16:3e:3c:de:8a", "network": {"id": "ea1cd914-64be-4fd0-b944-45368957fb5b", "bridge": "br-int", "label": "tempest-test-network--945581638", "subnets": [{"cidr": "10.10.1.0/24", "dns": [], "gateway": {"address": "10.10.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.1.232", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc26b2385-71", "ovs_interfaceid": "c26b2385-71db-477e-888c-d10712732db6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:12:12 compute-0 nova_compute[183075]: 2026-01-22 17:12:12.047 183079 DEBUG nova.network.os_vif_util [None req-753711fd-6dff-4fec-869e-1b1c13fc5990 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3c:de:8a,bridge_name='br-int',has_traffic_filtering=True,id=c26b2385-71db-477e-888c-d10712732db6,network=Network(ea1cd914-64be-4fd0-b944-45368957fb5b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc26b2385-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:12:12 compute-0 nova_compute[183075]: 2026-01-22 17:12:12.047 183079 DEBUG os_vif [None req-753711fd-6dff-4fec-869e-1b1c13fc5990 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3c:de:8a,bridge_name='br-int',has_traffic_filtering=True,id=c26b2385-71db-477e-888c-d10712732db6,network=Network(ea1cd914-64be-4fd0-b944-45368957fb5b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc26b2385-71') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:12:12 compute-0 nova_compute[183075]: 2026-01-22 17:12:12.048 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:12 compute-0 nova_compute[183075]: 2026-01-22 17:12:12.048 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc26b2385-71, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:12:12 compute-0 nova_compute[183075]: 2026-01-22 17:12:12.049 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:12 compute-0 nova_compute[183075]: 2026-01-22 17:12:12.051 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:12 compute-0 nova_compute[183075]: 2026-01-22 17:12:12.054 183079 INFO os_vif [None req-753711fd-6dff-4fec-869e-1b1c13fc5990 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3c:de:8a,bridge_name='br-int',has_traffic_filtering=True,id=c26b2385-71db-477e-888c-d10712732db6,network=Network(ea1cd914-64be-4fd0-b944-45368957fb5b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc26b2385-71')
Jan 22 17:12:12 compute-0 nova_compute[183075]: 2026-01-22 17:12:12.054 183079 INFO nova.virt.libvirt.driver [None req-753711fd-6dff-4fec-869e-1b1c13fc5990 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Deleting instance files /var/lib/nova/instances/36a2dc63-6945-45c9-8e82-9d3aacdfc3bc_del
Jan 22 17:12:12 compute-0 nova_compute[183075]: 2026-01-22 17:12:12.054 183079 INFO nova.virt.libvirt.driver [None req-753711fd-6dff-4fec-869e-1b1c13fc5990 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Deletion of /var/lib/nova/instances/36a2dc63-6945-45c9-8e82-9d3aacdfc3bc_del complete
Jan 22 17:12:12 compute-0 podman[220621]: 2026-01-22 17:12:12.096340711 +0000 UTC m=+0.048163266 container remove 04243dd0edec7173998e19ee857249c039bac30befee2d558a8d8b171cc9f8c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ea1cd914-64be-4fd0-b944-45368957fb5b, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 17:12:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:12.101 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c81116ab-969d-45a7-9b44-42eb9add0463]: (4, ('Thu Jan 22 05:12:11 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ea1cd914-64be-4fd0-b944-45368957fb5b (04243dd0edec7173998e19ee857249c039bac30befee2d558a8d8b171cc9f8c7)\n04243dd0edec7173998e19ee857249c039bac30befee2d558a8d8b171cc9f8c7\nThu Jan 22 05:12:12 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ea1cd914-64be-4fd0-b944-45368957fb5b (04243dd0edec7173998e19ee857249c039bac30befee2d558a8d8b171cc9f8c7)\n04243dd0edec7173998e19ee857249c039bac30befee2d558a8d8b171cc9f8c7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:12.103 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f9b13c52-4d8e-412d-9952-b376442b6741]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:12.104 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapea1cd914-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:12:12 compute-0 nova_compute[183075]: 2026-01-22 17:12:12.106 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:12 compute-0 kernel: tapea1cd914-60: left promiscuous mode
Jan 22 17:12:12 compute-0 nova_compute[183075]: 2026-01-22 17:12:12.111 183079 INFO nova.compute.manager [None req-753711fd-6dff-4fec-869e-1b1c13fc5990 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Took 0.35 seconds to destroy the instance on the hypervisor.
Jan 22 17:12:12 compute-0 nova_compute[183075]: 2026-01-22 17:12:12.112 183079 DEBUG oslo.service.loopingcall [None req-753711fd-6dff-4fec-869e-1b1c13fc5990 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:12:12 compute-0 nova_compute[183075]: 2026-01-22 17:12:12.112 183079 DEBUG nova.compute.manager [-] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:12:12 compute-0 nova_compute[183075]: 2026-01-22 17:12:12.113 183079 DEBUG nova.network.neutron [-] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:12:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:12.120 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e14b4adb-843f-48ce-a51b-d0100b1afbbd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:12 compute-0 nova_compute[183075]: 2026-01-22 17:12:12.124 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:12.144 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[21189880-6bae-4423-96e7-675db8be989b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:12.145 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1ab9be63-55ef-431a-8c09-3c0da3d5aa65]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:12.166 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[89334678-14fc-4ca5-b351-b2f62a400cc7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 421704, 'reachable_time': 16237, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220636, 'error': None, 'target': 'ovnmeta-ea1cd914-64be-4fd0-b944-45368957fb5b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:12.168 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ea1cd914-64be-4fd0-b944-45368957fb5b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:12:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:12.168 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[386d4011-6430-43be-a0ae-d1dc8053c3dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:12 compute-0 systemd[1]: run-netns-ovnmeta\x2dea1cd914\x2d64be\x2d4fd0\x2db944\x2d45368957fb5b.mount: Deactivated successfully.
Jan 22 17:12:12 compute-0 nova_compute[183075]: 2026-01-22 17:12:12.407 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-936001bf-d51b-4243-87b8-e363ef3c47a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:12:12 compute-0 nova_compute[183075]: 2026-01-22 17:12:12.408 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-936001bf-d51b-4243-87b8-e363ef3c47a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:12:12 compute-0 nova_compute[183075]: 2026-01-22 17:12:12.408 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:12:12 compute-0 nova_compute[183075]: 2026-01-22 17:12:12.408 183079 DEBUG nova.objects.instance [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 936001bf-d51b-4243-87b8-e363ef3c47a8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.442 183079 DEBUG nova.compute.manager [req-8554d871-bfbd-46e3-887f-08c784cf17c5 req-57370372-dae2-406c-a995-08b8f372fc65 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Received event network-vif-plugged-1728cec9-ef37-4d9b-8c9c-54c2a6640439 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.442 183079 DEBUG oslo_concurrency.lockutils [req-8554d871-bfbd-46e3-887f-08c784cf17c5 req-57370372-dae2-406c-a995-08b8f372fc65 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "91845d3c-b89e-43ba-b1d2-40f99d79ae8e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.442 183079 DEBUG oslo_concurrency.lockutils [req-8554d871-bfbd-46e3-887f-08c784cf17c5 req-57370372-dae2-406c-a995-08b8f372fc65 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "91845d3c-b89e-43ba-b1d2-40f99d79ae8e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.442 183079 DEBUG oslo_concurrency.lockutils [req-8554d871-bfbd-46e3-887f-08c784cf17c5 req-57370372-dae2-406c-a995-08b8f372fc65 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "91845d3c-b89e-43ba-b1d2-40f99d79ae8e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.443 183079 DEBUG nova.compute.manager [req-8554d871-bfbd-46e3-887f-08c784cf17c5 req-57370372-dae2-406c-a995-08b8f372fc65 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Processing event network-vif-plugged-1728cec9-ef37-4d9b-8c9c-54c2a6640439 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.443 183079 DEBUG nova.compute.manager [req-8554d871-bfbd-46e3-887f-08c784cf17c5 req-57370372-dae2-406c-a995-08b8f372fc65 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Received event network-vif-unplugged-c26b2385-71db-477e-888c-d10712732db6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.443 183079 DEBUG oslo_concurrency.lockutils [req-8554d871-bfbd-46e3-887f-08c784cf17c5 req-57370372-dae2-406c-a995-08b8f372fc65 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.443 183079 DEBUG oslo_concurrency.lockutils [req-8554d871-bfbd-46e3-887f-08c784cf17c5 req-57370372-dae2-406c-a995-08b8f372fc65 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.444 183079 DEBUG oslo_concurrency.lockutils [req-8554d871-bfbd-46e3-887f-08c784cf17c5 req-57370372-dae2-406c-a995-08b8f372fc65 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.444 183079 DEBUG nova.compute.manager [req-8554d871-bfbd-46e3-887f-08c784cf17c5 req-57370372-dae2-406c-a995-08b8f372fc65 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] No waiting events found dispatching network-vif-unplugged-c26b2385-71db-477e-888c-d10712732db6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.444 183079 DEBUG nova.compute.manager [req-8554d871-bfbd-46e3-887f-08c784cf17c5 req-57370372-dae2-406c-a995-08b8f372fc65 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Received event network-vif-unplugged-c26b2385-71db-477e-888c-d10712732db6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.444 183079 DEBUG nova.compute.manager [req-8554d871-bfbd-46e3-887f-08c784cf17c5 req-57370372-dae2-406c-a995-08b8f372fc65 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Received event network-vif-plugged-c26b2385-71db-477e-888c-d10712732db6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.444 183079 DEBUG oslo_concurrency.lockutils [req-8554d871-bfbd-46e3-887f-08c784cf17c5 req-57370372-dae2-406c-a995-08b8f372fc65 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.444 183079 DEBUG oslo_concurrency.lockutils [req-8554d871-bfbd-46e3-887f-08c784cf17c5 req-57370372-dae2-406c-a995-08b8f372fc65 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.445 183079 DEBUG oslo_concurrency.lockutils [req-8554d871-bfbd-46e3-887f-08c784cf17c5 req-57370372-dae2-406c-a995-08b8f372fc65 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "36a2dc63-6945-45c9-8e82-9d3aacdfc3bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.445 183079 DEBUG nova.compute.manager [req-8554d871-bfbd-46e3-887f-08c784cf17c5 req-57370372-dae2-406c-a995-08b8f372fc65 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] No waiting events found dispatching network-vif-plugged-c26b2385-71db-477e-888c-d10712732db6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.445 183079 WARNING nova.compute.manager [req-8554d871-bfbd-46e3-887f-08c784cf17c5 req-57370372-dae2-406c-a995-08b8f372fc65 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Received unexpected event network-vif-plugged-c26b2385-71db-477e-888c-d10712732db6 for instance with vm_state active and task_state deleting.
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.446 183079 DEBUG nova.compute.manager [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.452 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101933.4518845, 91845d3c-b89e-43ba-b1d2-40f99d79ae8e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.452 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] VM Resumed (Lifecycle Event)
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.455 183079 DEBUG nova.virt.libvirt.driver [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.458 183079 INFO nova.virt.libvirt.driver [-] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Instance spawned successfully.
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.459 183079 DEBUG nova.virt.libvirt.driver [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.476 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.482 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.486 183079 DEBUG nova.virt.libvirt.driver [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.486 183079 DEBUG nova.virt.libvirt.driver [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.486 183079 DEBUG nova.virt.libvirt.driver [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.487 183079 DEBUG nova.virt.libvirt.driver [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.487 183079 DEBUG nova.virt.libvirt.driver [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.488 183079 DEBUG nova.virt.libvirt.driver [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.580 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.615 183079 INFO nova.compute.manager [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Took 6.34 seconds to spawn the instance on the hypervisor.
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.615 183079 DEBUG nova.compute.manager [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:12:13 compute-0 nova_compute[183075]: 2026-01-22 17:12:13.676 183079 INFO nova.compute.manager [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Took 7.39 seconds to build instance.
Jan 22 17:12:14 compute-0 nova_compute[183075]: 2026-01-22 17:12:14.003 183079 DEBUG nova.network.neutron [req-b370faab-275c-49c7-8bbf-b99d502f4762 req-f6bb4206-f509-49cc-a71d-aa376a57f92c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Updated VIF entry in instance network info cache for port 1728cec9-ef37-4d9b-8c9c-54c2a6640439. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:12:14 compute-0 nova_compute[183075]: 2026-01-22 17:12:14.003 183079 DEBUG nova.network.neutron [req-b370faab-275c-49c7-8bbf-b99d502f4762 req-f6bb4206-f509-49cc-a71d-aa376a57f92c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Updating instance_info_cache with network_info: [{"id": "1728cec9-ef37-4d9b-8c9c-54c2a6640439", "address": "fa:16:3e:2c:21:9e", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1728cec9-ef", "ovs_interfaceid": "1728cec9-ef37-4d9b-8c9c-54c2a6640439", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:12:14 compute-0 nova_compute[183075]: 2026-01-22 17:12:14.026 183079 DEBUG oslo_concurrency.lockutils [None req-b369a94f-4eae-48ec-a124-ea9f47b35e72 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "91845d3c-b89e-43ba-b1d2-40f99d79ae8e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.885s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:14 compute-0 nova_compute[183075]: 2026-01-22 17:12:14.043 183079 DEBUG oslo_concurrency.lockutils [req-b370faab-275c-49c7-8bbf-b99d502f4762 req-f6bb4206-f509-49cc-a71d-aa376a57f92c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-91845d3c-b89e-43ba-b1d2-40f99d79ae8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:12:14 compute-0 nova_compute[183075]: 2026-01-22 17:12:14.043 183079 DEBUG nova.compute.manager [req-b370faab-275c-49c7-8bbf-b99d502f4762 req-f6bb4206-f509-49cc-a71d-aa376a57f92c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Received event network-vif-plugged-0e3bc449-87f9-4d63-9fee-5ac925d686c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:12:14 compute-0 nova_compute[183075]: 2026-01-22 17:12:14.044 183079 DEBUG oslo_concurrency.lockutils [req-b370faab-275c-49c7-8bbf-b99d502f4762 req-f6bb4206-f509-49cc-a71d-aa376a57f92c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "e69f0100-85ca-4ff8-a177-27d35d4580de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:14 compute-0 nova_compute[183075]: 2026-01-22 17:12:14.044 183079 DEBUG oslo_concurrency.lockutils [req-b370faab-275c-49c7-8bbf-b99d502f4762 req-f6bb4206-f509-49cc-a71d-aa376a57f92c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e69f0100-85ca-4ff8-a177-27d35d4580de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:14 compute-0 nova_compute[183075]: 2026-01-22 17:12:14.044 183079 DEBUG oslo_concurrency.lockutils [req-b370faab-275c-49c7-8bbf-b99d502f4762 req-f6bb4206-f509-49cc-a71d-aa376a57f92c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e69f0100-85ca-4ff8-a177-27d35d4580de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:14 compute-0 nova_compute[183075]: 2026-01-22 17:12:14.044 183079 DEBUG nova.compute.manager [req-b370faab-275c-49c7-8bbf-b99d502f4762 req-f6bb4206-f509-49cc-a71d-aa376a57f92c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] No waiting events found dispatching network-vif-plugged-0e3bc449-87f9-4d63-9fee-5ac925d686c4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:12:14 compute-0 nova_compute[183075]: 2026-01-22 17:12:14.045 183079 WARNING nova.compute.manager [req-b370faab-275c-49c7-8bbf-b99d502f4762 req-f6bb4206-f509-49cc-a71d-aa376a57f92c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Received unexpected event network-vif-plugged-0e3bc449-87f9-4d63-9fee-5ac925d686c4 for instance with vm_state deleted and task_state None.
Jan 22 17:12:14 compute-0 nova_compute[183075]: 2026-01-22 17:12:14.045 183079 DEBUG nova.compute.manager [req-b370faab-275c-49c7-8bbf-b99d502f4762 req-f6bb4206-f509-49cc-a71d-aa376a57f92c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Received event network-vif-plugged-1728cec9-ef37-4d9b-8c9c-54c2a6640439 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:12:14 compute-0 nova_compute[183075]: 2026-01-22 17:12:14.045 183079 DEBUG oslo_concurrency.lockutils [req-b370faab-275c-49c7-8bbf-b99d502f4762 req-f6bb4206-f509-49cc-a71d-aa376a57f92c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "91845d3c-b89e-43ba-b1d2-40f99d79ae8e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:14 compute-0 nova_compute[183075]: 2026-01-22 17:12:14.045 183079 DEBUG oslo_concurrency.lockutils [req-b370faab-275c-49c7-8bbf-b99d502f4762 req-f6bb4206-f509-49cc-a71d-aa376a57f92c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "91845d3c-b89e-43ba-b1d2-40f99d79ae8e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:14 compute-0 nova_compute[183075]: 2026-01-22 17:12:14.046 183079 DEBUG oslo_concurrency.lockutils [req-b370faab-275c-49c7-8bbf-b99d502f4762 req-f6bb4206-f509-49cc-a71d-aa376a57f92c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "91845d3c-b89e-43ba-b1d2-40f99d79ae8e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:14 compute-0 nova_compute[183075]: 2026-01-22 17:12:14.046 183079 DEBUG nova.compute.manager [req-b370faab-275c-49c7-8bbf-b99d502f4762 req-f6bb4206-f509-49cc-a71d-aa376a57f92c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] No waiting events found dispatching network-vif-plugged-1728cec9-ef37-4d9b-8c9c-54c2a6640439 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:12:14 compute-0 nova_compute[183075]: 2026-01-22 17:12:14.046 183079 WARNING nova.compute.manager [req-b370faab-275c-49c7-8bbf-b99d502f4762 req-f6bb4206-f509-49cc-a71d-aa376a57f92c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Received unexpected event network-vif-plugged-1728cec9-ef37-4d9b-8c9c-54c2a6640439 for instance with vm_state building and task_state spawning.
Jan 22 17:12:15 compute-0 nova_compute[183075]: 2026-01-22 17:12:15.213 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:15 compute-0 nova_compute[183075]: 2026-01-22 17:12:15.949 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Updating instance_info_cache with network_info: [{"id": "804a64f5-797f-4eba-ae49-100790171545", "address": "fa:16:3e:3a:03:2f", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap804a64f5-79", "ovs_interfaceid": "804a64f5-797f-4eba-ae49-100790171545", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:12:16 compute-0 nova_compute[183075]: 2026-01-22 17:12:16.143 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-936001bf-d51b-4243-87b8-e363ef3c47a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:12:16 compute-0 nova_compute[183075]: 2026-01-22 17:12:16.143 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:12:16 compute-0 nova_compute[183075]: 2026-01-22 17:12:16.144 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:12:16 compute-0 nova_compute[183075]: 2026-01-22 17:12:16.144 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:12:16 compute-0 nova_compute[183075]: 2026-01-22 17:12:16.145 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:12:16 compute-0 nova_compute[183075]: 2026-01-22 17:12:16.145 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:12:16 compute-0 nova_compute[183075]: 2026-01-22 17:12:16.173 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:16 compute-0 nova_compute[183075]: 2026-01-22 17:12:16.175 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:16 compute-0 nova_compute[183075]: 2026-01-22 17:12:16.175 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:16 compute-0 nova_compute[183075]: 2026-01-22 17:12:16.175 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:12:16 compute-0 nova_compute[183075]: 2026-01-22 17:12:16.438 183079 DEBUG nova.network.neutron [-] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:12:16 compute-0 nova_compute[183075]: 2026-01-22 17:12:16.735 183079 INFO nova.compute.manager [-] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Took 4.62 seconds to deallocate network for instance.
Jan 22 17:12:16 compute-0 nova_compute[183075]: 2026-01-22 17:12:16.740 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/91845d3c-b89e-43ba-b1d2-40f99d79ae8e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:12:16 compute-0 nova_compute[183075]: 2026-01-22 17:12:16.813 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/91845d3c-b89e-43ba-b1d2-40f99d79ae8e/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:12:16 compute-0 nova_compute[183075]: 2026-01-22 17:12:16.815 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/91845d3c-b89e-43ba-b1d2-40f99d79ae8e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:12:16 compute-0 nova_compute[183075]: 2026-01-22 17:12:16.838 183079 DEBUG oslo_concurrency.lockutils [None req-753711fd-6dff-4fec-869e-1b1c13fc5990 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:16 compute-0 nova_compute[183075]: 2026-01-22 17:12:16.839 183079 DEBUG oslo_concurrency.lockutils [None req-753711fd-6dff-4fec-869e-1b1c13fc5990 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:16 compute-0 nova_compute[183075]: 2026-01-22 17:12:16.875 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/91845d3c-b89e-43ba-b1d2-40f99d79ae8e/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:12:16 compute-0 nova_compute[183075]: 2026-01-22 17:12:16.881 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/936001bf-d51b-4243-87b8-e363ef3c47a8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:12:16 compute-0 nova_compute[183075]: 2026-01-22 17:12:16.908 183079 INFO nova.compute.manager [None req-a9d03322-ad4c-4237-8e16-e7ab78466f11 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Get console output
Jan 22 17:12:16 compute-0 nova_compute[183075]: 2026-01-22 17:12:16.918 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:12:16 compute-0 nova_compute[183075]: 2026-01-22 17:12:16.938 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/936001bf-d51b-4243-87b8-e363ef3c47a8/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:12:16 compute-0 nova_compute[183075]: 2026-01-22 17:12:16.939 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/936001bf-d51b-4243-87b8-e363ef3c47a8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:12:16 compute-0 nova_compute[183075]: 2026-01-22 17:12:16.987 183079 DEBUG nova.compute.provider_tree [None req-753711fd-6dff-4fec-869e-1b1c13fc5990 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:12:17 compute-0 nova_compute[183075]: 2026-01-22 17:12:17.002 183079 DEBUG nova.scheduler.client.report [None req-753711fd-6dff-4fec-869e-1b1c13fc5990 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:12:17 compute-0 nova_compute[183075]: 2026-01-22 17:12:17.022 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/936001bf-d51b-4243-87b8-e363ef3c47a8/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:12:17 compute-0 nova_compute[183075]: 2026-01-22 17:12:17.033 183079 DEBUG oslo_concurrency.lockutils [None req-753711fd-6dff-4fec-869e-1b1c13fc5990 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.193s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:17 compute-0 nova_compute[183075]: 2026-01-22 17:12:17.051 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:17 compute-0 nova_compute[183075]: 2026-01-22 17:12:17.056 183079 INFO nova.scheduler.client.report [None req-753711fd-6dff-4fec-869e-1b1c13fc5990 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Deleted allocations for instance 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc
Jan 22 17:12:17 compute-0 nova_compute[183075]: 2026-01-22 17:12:17.138 183079 DEBUG oslo_concurrency.lockutils [None req-753711fd-6dff-4fec-869e-1b1c13fc5990 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "36a2dc63-6945-45c9-8e82-9d3aacdfc3bc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.375s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:17 compute-0 nova_compute[183075]: 2026-01-22 17:12:17.190 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:12:17 compute-0 nova_compute[183075]: 2026-01-22 17:12:17.191 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5382MB free_disk=73.3463134765625GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:12:17 compute-0 nova_compute[183075]: 2026-01-22 17:12:17.191 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:17 compute-0 nova_compute[183075]: 2026-01-22 17:12:17.192 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:17 compute-0 nova_compute[183075]: 2026-01-22 17:12:17.268 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 936001bf-d51b-4243-87b8-e363ef3c47a8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:12:17 compute-0 nova_compute[183075]: 2026-01-22 17:12:17.269 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 91845d3c-b89e-43ba-b1d2-40f99d79ae8e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:12:17 compute-0 nova_compute[183075]: 2026-01-22 17:12:17.269 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:12:17 compute-0 nova_compute[183075]: 2026-01-22 17:12:17.270 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:12:17 compute-0 nova_compute[183075]: 2026-01-22 17:12:17.329 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:12:17 compute-0 nova_compute[183075]: 2026-01-22 17:12:17.342 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:12:17 compute-0 nova_compute[183075]: 2026-01-22 17:12:17.376 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:12:17 compute-0 nova_compute[183075]: 2026-01-22 17:12:17.377 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:17 compute-0 nova_compute[183075]: 2026-01-22 17:12:17.746 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:18 compute-0 nova_compute[183075]: 2026-01-22 17:12:18.898 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769101923.8970468, c1a1134b-933b-41d1-ba12-adb71c18d006 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:12:18 compute-0 nova_compute[183075]: 2026-01-22 17:12:18.899 183079 INFO nova.compute.manager [-] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] VM Stopped (Lifecycle Event)
Jan 22 17:12:18 compute-0 nova_compute[183075]: 2026-01-22 17:12:18.915 183079 DEBUG nova.compute.manager [None req-da430587-28be-4cff-8aeb-3853940451c7 - - - - - -] [instance: c1a1134b-933b-41d1-ba12-adb71c18d006] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:12:20 compute-0 nova_compute[183075]: 2026-01-22 17:12:20.216 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:20 compute-0 podman[220651]: 2026-01-22 17:12:20.4126076 +0000 UTC m=+0.107719720 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:12:22 compute-0 nova_compute[183075]: 2026-01-22 17:12:22.056 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:22 compute-0 nova_compute[183075]: 2026-01-22 17:12:22.105 183079 INFO nova.compute.manager [None req-9132a6b8-577d-4cca-8187-c648e4cb14d0 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Get console output
Jan 22 17:12:23 compute-0 nova_compute[183075]: 2026-01-22 17:12:23.678 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769101928.675842, e69f0100-85ca-4ff8-a177-27d35d4580de => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:12:23 compute-0 nova_compute[183075]: 2026-01-22 17:12:23.678 183079 INFO nova.compute.manager [-] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] VM Stopped (Lifecycle Event)
Jan 22 17:12:23 compute-0 nova_compute[183075]: 2026-01-22 17:12:23.713 183079 DEBUG nova.compute.manager [None req-e3d1c3ff-bef6-497e-b43d-ea6881129b97 - - - - - -] [instance: e69f0100-85ca-4ff8-a177-27d35d4580de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:12:24 compute-0 nova_compute[183075]: 2026-01-22 17:12:24.170 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:25 compute-0 nova_compute[183075]: 2026-01-22 17:12:25.266 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:26 compute-0 ovn_controller[95372]: 2026-01-22T17:12:26Z|00044|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2c:21:9e 10.100.0.13
Jan 22 17:12:26 compute-0 ovn_controller[95372]: 2026-01-22T17:12:26Z|00045|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2c:21:9e 10.100.0.13
Jan 22 17:12:27 compute-0 nova_compute[183075]: 2026-01-22 17:12:27.029 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769101932.0283616, 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:12:27 compute-0 nova_compute[183075]: 2026-01-22 17:12:27.030 183079 INFO nova.compute.manager [-] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] VM Stopped (Lifecycle Event)
Jan 22 17:12:27 compute-0 nova_compute[183075]: 2026-01-22 17:12:27.048 183079 DEBUG nova.compute.manager [None req-ef00bbb4-ca62-4ad0-939b-f6b847acf502 - - - - - -] [instance: 36a2dc63-6945-45c9-8e82-9d3aacdfc3bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:12:27 compute-0 nova_compute[183075]: 2026-01-22 17:12:27.060 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:27 compute-0 nova_compute[183075]: 2026-01-22 17:12:27.224 183079 INFO nova.compute.manager [None req-cd6b75ff-43bb-467b-aebd-e39b92ef40fe cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Get console output
Jan 22 17:12:27 compute-0 podman[220700]: 2026-01-22 17:12:27.36656096 +0000 UTC m=+0.063356693 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:12:27 compute-0 podman[220701]: 2026-01-22 17:12:27.37730763 +0000 UTC m=+0.066641678 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_id=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container)
Jan 22 17:12:27 compute-0 podman[220699]: 2026-01-22 17:12:27.416553513 +0000 UTC m=+0.114095925 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 22 17:12:27 compute-0 nova_compute[183075]: 2026-01-22 17:12:27.425 183079 DEBUG oslo_concurrency.lockutils [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "a39a5d00-6f96-4405-aff0-1449aee94079" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:27 compute-0 nova_compute[183075]: 2026-01-22 17:12:27.426 183079 DEBUG oslo_concurrency.lockutils [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "a39a5d00-6f96-4405-aff0-1449aee94079" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:27 compute-0 nova_compute[183075]: 2026-01-22 17:12:27.443 183079 DEBUG nova.compute.manager [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:12:27 compute-0 nova_compute[183075]: 2026-01-22 17:12:27.548 183079 DEBUG oslo_concurrency.lockutils [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:27 compute-0 nova_compute[183075]: 2026-01-22 17:12:27.550 183079 DEBUG oslo_concurrency.lockutils [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:27 compute-0 nova_compute[183075]: 2026-01-22 17:12:27.563 183079 DEBUG nova.virt.hardware [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:12:27 compute-0 nova_compute[183075]: 2026-01-22 17:12:27.564 183079 INFO nova.compute.claims [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:12:27 compute-0 nova_compute[183075]: 2026-01-22 17:12:27.731 183079 DEBUG nova.compute.provider_tree [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:12:27 compute-0 nova_compute[183075]: 2026-01-22 17:12:27.747 183079 DEBUG nova.scheduler.client.report [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:12:27 compute-0 nova_compute[183075]: 2026-01-22 17:12:27.768 183079 DEBUG oslo_concurrency.lockutils [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.219s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:27 compute-0 nova_compute[183075]: 2026-01-22 17:12:27.769 183079 DEBUG nova.compute.manager [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:12:27 compute-0 nova_compute[183075]: 2026-01-22 17:12:27.822 183079 DEBUG nova.compute.manager [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:12:27 compute-0 nova_compute[183075]: 2026-01-22 17:12:27.823 183079 DEBUG nova.network.neutron [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:12:27 compute-0 nova_compute[183075]: 2026-01-22 17:12:27.844 183079 INFO nova.virt.libvirt.driver [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:12:27 compute-0 nova_compute[183075]: 2026-01-22 17:12:27.864 183079 DEBUG nova.compute.manager [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:12:28 compute-0 nova_compute[183075]: 2026-01-22 17:12:28.034 183079 DEBUG nova.policy [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1e61127d65144bcbaa0d43fe3eb484c0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:12:28 compute-0 nova_compute[183075]: 2026-01-22 17:12:28.043 183079 DEBUG nova.compute.manager [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:12:28 compute-0 nova_compute[183075]: 2026-01-22 17:12:28.045 183079 DEBUG nova.virt.libvirt.driver [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:12:28 compute-0 nova_compute[183075]: 2026-01-22 17:12:28.046 183079 INFO nova.virt.libvirt.driver [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Creating image(s)
Jan 22 17:12:28 compute-0 nova_compute[183075]: 2026-01-22 17:12:28.048 183079 DEBUG oslo_concurrency.lockutils [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "/var/lib/nova/instances/a39a5d00-6f96-4405-aff0-1449aee94079/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:28 compute-0 nova_compute[183075]: 2026-01-22 17:12:28.049 183079 DEBUG oslo_concurrency.lockutils [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "/var/lib/nova/instances/a39a5d00-6f96-4405-aff0-1449aee94079/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:28 compute-0 nova_compute[183075]: 2026-01-22 17:12:28.050 183079 DEBUG oslo_concurrency.lockutils [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "/var/lib/nova/instances/a39a5d00-6f96-4405-aff0-1449aee94079/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:28 compute-0 nova_compute[183075]: 2026-01-22 17:12:28.075 183079 DEBUG oslo_concurrency.processutils [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:12:28 compute-0 nova_compute[183075]: 2026-01-22 17:12:28.104 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:28 compute-0 nova_compute[183075]: 2026-01-22 17:12:28.158 183079 DEBUG oslo_concurrency.processutils [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:12:28 compute-0 nova_compute[183075]: 2026-01-22 17:12:28.159 183079 DEBUG oslo_concurrency.lockutils [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:28 compute-0 nova_compute[183075]: 2026-01-22 17:12:28.160 183079 DEBUG oslo_concurrency.lockutils [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:28 compute-0 nova_compute[183075]: 2026-01-22 17:12:28.186 183079 DEBUG oslo_concurrency.processutils [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:12:28 compute-0 nova_compute[183075]: 2026-01-22 17:12:28.257 183079 DEBUG oslo_concurrency.processutils [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:12:28 compute-0 nova_compute[183075]: 2026-01-22 17:12:28.259 183079 DEBUG oslo_concurrency.processutils [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/a39a5d00-6f96-4405-aff0-1449aee94079/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:12:28 compute-0 nova_compute[183075]: 2026-01-22 17:12:28.312 183079 DEBUG oslo_concurrency.processutils [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/a39a5d00-6f96-4405-aff0-1449aee94079/disk 1073741824" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:12:28 compute-0 nova_compute[183075]: 2026-01-22 17:12:28.314 183079 DEBUG oslo_concurrency.lockutils [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:28 compute-0 nova_compute[183075]: 2026-01-22 17:12:28.314 183079 DEBUG oslo_concurrency.processutils [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:12:28 compute-0 nova_compute[183075]: 2026-01-22 17:12:28.393 183079 DEBUG oslo_concurrency.processutils [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:12:28 compute-0 nova_compute[183075]: 2026-01-22 17:12:28.395 183079 DEBUG nova.virt.disk.api [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Checking if we can resize image /var/lib/nova/instances/a39a5d00-6f96-4405-aff0-1449aee94079/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:12:28 compute-0 nova_compute[183075]: 2026-01-22 17:12:28.396 183079 DEBUG oslo_concurrency.processutils [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a39a5d00-6f96-4405-aff0-1449aee94079/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:12:28 compute-0 nova_compute[183075]: 2026-01-22 17:12:28.464 183079 DEBUG oslo_concurrency.processutils [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a39a5d00-6f96-4405-aff0-1449aee94079/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:12:28 compute-0 nova_compute[183075]: 2026-01-22 17:12:28.467 183079 DEBUG nova.virt.disk.api [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Cannot resize image /var/lib/nova/instances/a39a5d00-6f96-4405-aff0-1449aee94079/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:12:28 compute-0 nova_compute[183075]: 2026-01-22 17:12:28.467 183079 DEBUG nova.objects.instance [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lazy-loading 'migration_context' on Instance uuid a39a5d00-6f96-4405-aff0-1449aee94079 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:12:28 compute-0 nova_compute[183075]: 2026-01-22 17:12:28.485 183079 DEBUG nova.virt.libvirt.driver [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:12:28 compute-0 nova_compute[183075]: 2026-01-22 17:12:28.486 183079 DEBUG nova.virt.libvirt.driver [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Ensure instance console log exists: /var/lib/nova/instances/a39a5d00-6f96-4405-aff0-1449aee94079/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:12:28 compute-0 nova_compute[183075]: 2026-01-22 17:12:28.487 183079 DEBUG oslo_concurrency.lockutils [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:28 compute-0 nova_compute[183075]: 2026-01-22 17:12:28.487 183079 DEBUG oslo_concurrency.lockutils [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:28 compute-0 nova_compute[183075]: 2026-01-22 17:12:28.488 183079 DEBUG oslo_concurrency.lockutils [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:29 compute-0 nova_compute[183075]: 2026-01-22 17:12:29.203 183079 DEBUG nova.network.neutron [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Successfully created port: c30636af-db80-4279-8e40-c0266175c726 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:12:30 compute-0 nova_compute[183075]: 2026-01-22 17:12:30.269 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:30.986 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:12:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:30.988 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:12:30 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:12:30 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:12:30 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:12:30 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:12:30 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:12:30 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:12:30 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:12:31 compute-0 nova_compute[183075]: 2026-01-22 17:12:31.334 183079 DEBUG nova.network.neutron [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Successfully updated port: c30636af-db80-4279-8e40-c0266175c726 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.438 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.439 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.4514859
Jan 22 17:12:31 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[219946]: 10.100.0.13:50804 [22/Jan/2026:17:12:30.985] listener listener/metadata 0/0/0/453/453 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.454 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.455 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.476 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.476 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 168 time: 0.0218520
Jan 22 17:12:31 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[219946]: 10.100.0.13:50806 [22/Jan/2026:17:12:31.453] listener listener/metadata 0/0/0/23/23 200 152 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:12:31 compute-0 nova_compute[183075]: 2026-01-22 17:12:31.483 183079 DEBUG nova.compute.manager [req-22596de5-12fb-4445-a10d-d09ead485b4c req-10b6d648-8e49-4772-afc3-c525227e547b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Received event network-changed-c30636af-db80-4279-8e40-c0266175c726 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:12:31 compute-0 nova_compute[183075]: 2026-01-22 17:12:31.483 183079 DEBUG nova.compute.manager [req-22596de5-12fb-4445-a10d-d09ead485b4c req-10b6d648-8e49-4772-afc3-c525227e547b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Refreshing instance network info cache due to event network-changed-c30636af-db80-4279-8e40-c0266175c726. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:12:31 compute-0 nova_compute[183075]: 2026-01-22 17:12:31.484 183079 DEBUG oslo_concurrency.lockutils [req-22596de5-12fb-4445-a10d-d09ead485b4c req-10b6d648-8e49-4772-afc3-c525227e547b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-a39a5d00-6f96-4405-aff0-1449aee94079" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:12:31 compute-0 nova_compute[183075]: 2026-01-22 17:12:31.484 183079 DEBUG oslo_concurrency.lockutils [req-22596de5-12fb-4445-a10d-d09ead485b4c req-10b6d648-8e49-4772-afc3-c525227e547b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-a39a5d00-6f96-4405-aff0-1449aee94079" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:12:31 compute-0 nova_compute[183075]: 2026-01-22 17:12:31.484 183079 DEBUG nova.network.neutron [req-22596de5-12fb-4445-a10d-d09ead485b4c req-10b6d648-8e49-4772-afc3-c525227e547b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Refreshing network info cache for port c30636af-db80-4279-8e40-c0266175c726 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.487 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:12:31 compute-0 nova_compute[183075]: 2026-01-22 17:12:31.488 183079 DEBUG oslo_concurrency.lockutils [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "refresh_cache-a39a5d00-6f96-4405-aff0-1449aee94079" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.488 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.507 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.508 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0199747
Jan 22 17:12:31 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[219946]: 10.100.0.13:50822 [22/Jan/2026:17:12:31.486] listener listener/metadata 0/0/0/21/21 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.516 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.517 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.537 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.538 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0213342
Jan 22 17:12:31 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[219946]: 10.100.0.13:50834 [22/Jan/2026:17:12:31.515] listener listener/metadata 0/0/0/22/22 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.549 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.550 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.581 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:12:31 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[219946]: 10.100.0.13:50838 [22/Jan/2026:17:12:31.548] listener listener/metadata 0/0/0/33/33 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.582 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0323963
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.591 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.592 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.624 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:12:31 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[219946]: 10.100.0.13:50842 [22/Jan/2026:17:12:31.590] listener listener/metadata 0/0/0/34/34 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.625 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0329919
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.630 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.631 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.659 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.660 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0284564
Jan 22 17:12:31 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[219946]: 10.100.0.13:50854 [22/Jan/2026:17:12:31.630] listener listener/metadata 0/0/0/29/29 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.665 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.666 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.690 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.690 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0246077
Jan 22 17:12:31 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[219946]: 10.100.0.13:50868 [22/Jan/2026:17:12:31.664] listener listener/metadata 0/0/0/25/25 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.695 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.696 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:12:31 compute-0 nova_compute[183075]: 2026-01-22 17:12:31.700 183079 DEBUG nova.network.neutron [req-22596de5-12fb-4445-a10d-d09ead485b4c req-10b6d648-8e49-4772-afc3-c525227e547b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.717 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.718 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 165 time: 0.0222352
Jan 22 17:12:31 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[219946]: 10.100.0.13:50876 [22/Jan/2026:17:12:31.695] listener listener/metadata 0/0/0/23/23 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.722 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.724 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.743 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.744 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 165 time: 0.0204003
Jan 22 17:12:31 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[219946]: 10.100.0.13:50878 [22/Jan/2026:17:12:31.722] listener listener/metadata 0/0/0/21/21 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.749 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.749 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.769 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0194795
Jan 22 17:12:31 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[219946]: 10.100.0.13:50888 [22/Jan/2026:17:12:31.748] listener listener/metadata 0/0/0/20/20 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.787 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.787 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.808 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.809 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0214517
Jan 22 17:12:31 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[219946]: 10.100.0.13:50890 [22/Jan/2026:17:12:31.786] listener listener/metadata 0/0/0/22/22 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.815 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.816 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.835 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.836 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0199778
Jan 22 17:12:31 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[219946]: 10.100.0.13:50906 [22/Jan/2026:17:12:31.815] listener listener/metadata 0/0/0/21/21 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.842 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.843 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.864 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:12:31 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[219946]: 10.100.0.13:50910 [22/Jan/2026:17:12:31.841] listener listener/metadata 0/0/0/23/23 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.865 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0224810
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.875 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.877 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.920 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:12:31 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[219946]: 10.100.0.13:50916 [22/Jan/2026:17:12:31.875] listener listener/metadata 0/0/0/46/46 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.921 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 165 time: 0.0441282
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.928 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.929 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.996 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:12:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:31.997 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0674975
Jan 22 17:12:31 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[219946]: 10.100.0.13:50924 [22/Jan/2026:17:12:31.928] listener listener/metadata 0/0/0/68/68 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:12:32 compute-0 nova_compute[183075]: 2026-01-22 17:12:32.063 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:32 compute-0 nova_compute[183075]: 2026-01-22 17:12:32.323 183079 DEBUG nova.network.neutron [req-22596de5-12fb-4445-a10d-d09ead485b4c req-10b6d648-8e49-4772-afc3-c525227e547b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:12:32 compute-0 nova_compute[183075]: 2026-01-22 17:12:32.358 183079 DEBUG oslo_concurrency.lockutils [req-22596de5-12fb-4445-a10d-d09ead485b4c req-10b6d648-8e49-4772-afc3-c525227e547b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-a39a5d00-6f96-4405-aff0-1449aee94079" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:12:32 compute-0 nova_compute[183075]: 2026-01-22 17:12:32.359 183079 DEBUG oslo_concurrency.lockutils [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquired lock "refresh_cache-a39a5d00-6f96-4405-aff0-1449aee94079" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:12:32 compute-0 nova_compute[183075]: 2026-01-22 17:12:32.359 183079 DEBUG nova.network.neutron [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:12:32 compute-0 nova_compute[183075]: 2026-01-22 17:12:32.374 183079 INFO nova.compute.manager [None req-9255609b-8288-40bb-8b39-ab09dbe02991 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Get console output
Jan 22 17:12:32 compute-0 nova_compute[183075]: 2026-01-22 17:12:32.380 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:12:32 compute-0 nova_compute[183075]: 2026-01-22 17:12:32.502 183079 DEBUG nova.network.neutron [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.255 183079 DEBUG nova.network.neutron [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Updating instance_info_cache with network_info: [{"id": "c30636af-db80-4279-8e40-c0266175c726", "address": "fa:16:3e:cd:f6:c4", "network": {"id": "cbad0d35-4bf3-49f1-bb21-0be199e1e42e", "bridge": "br-int", "label": "tempest-test-network--998799692", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc30636af-db", "ovs_interfaceid": "c30636af-db80-4279-8e40-c0266175c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.285 183079 DEBUG oslo_concurrency.lockutils [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Releasing lock "refresh_cache-a39a5d00-6f96-4405-aff0-1449aee94079" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.285 183079 DEBUG nova.compute.manager [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Instance network_info: |[{"id": "c30636af-db80-4279-8e40-c0266175c726", "address": "fa:16:3e:cd:f6:c4", "network": {"id": "cbad0d35-4bf3-49f1-bb21-0be199e1e42e", "bridge": "br-int", "label": "tempest-test-network--998799692", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc30636af-db", "ovs_interfaceid": "c30636af-db80-4279-8e40-c0266175c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.288 183079 DEBUG nova.virt.libvirt.driver [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Start _get_guest_xml network_info=[{"id": "c30636af-db80-4279-8e40-c0266175c726", "address": "fa:16:3e:cd:f6:c4", "network": {"id": "cbad0d35-4bf3-49f1-bb21-0be199e1e42e", "bridge": "br-int", "label": "tempest-test-network--998799692", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc30636af-db", "ovs_interfaceid": "c30636af-db80-4279-8e40-c0266175c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.294 183079 WARNING nova.virt.libvirt.driver [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.302 183079 DEBUG nova.virt.libvirt.host [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.302 183079 DEBUG nova.virt.libvirt.host [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.307 183079 DEBUG nova.virt.libvirt.host [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.307 183079 DEBUG nova.virt.libvirt.host [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.308 183079 DEBUG nova.virt.libvirt.driver [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.308 183079 DEBUG nova.virt.hardware [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.309 183079 DEBUG nova.virt.hardware [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.309 183079 DEBUG nova.virt.hardware [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.309 183079 DEBUG nova.virt.hardware [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.309 183079 DEBUG nova.virt.hardware [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.310 183079 DEBUG nova.virt.hardware [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.310 183079 DEBUG nova.virt.hardware [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.310 183079 DEBUG nova.virt.hardware [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.311 183079 DEBUG nova.virt.hardware [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.311 183079 DEBUG nova.virt.hardware [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.311 183079 DEBUG nova.virt.hardware [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.315 183079 DEBUG nova.virt.libvirt.vif [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:12:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-2004519396',display_name='tempest-server-test-2004519396',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-2004519396',id=22,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIwX3Tnjd5UGmg0w9k/BN9eS1qe75E7Lic/jqsTQaVUTG16NFNysn4OP5OqeIQEMSgvijvcEmFLUdbKXTJ+WqhpTczbZR3YnhHyqcZ3vgAR6NGGdmWhQ6meJ9Nv3J8mm/Q==',key_name='tempest-keypair-test-368848261',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bfc6667804934c92b71ce7638089e9e3',ramdisk_id='',reservation_id='r-m04apwhz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='
virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-QoSTest-2146064006',owner_user_name='tempest-QoSTest-2146064006-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:12:27Z,user_data=None,user_id='1e61127d65144bcbaa0d43fe3eb484c0',uuid=a39a5d00-6f96-4405-aff0-1449aee94079,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c30636af-db80-4279-8e40-c0266175c726", "address": "fa:16:3e:cd:f6:c4", "network": {"id": "cbad0d35-4bf3-49f1-bb21-0be199e1e42e", "bridge": "br-int", "label": "tempest-test-network--998799692", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc30636af-db", "ovs_interfaceid": "c30636af-db80-4279-8e40-c0266175c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.316 183079 DEBUG nova.network.os_vif_util [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Converting VIF {"id": "c30636af-db80-4279-8e40-c0266175c726", "address": "fa:16:3e:cd:f6:c4", "network": {"id": "cbad0d35-4bf3-49f1-bb21-0be199e1e42e", "bridge": "br-int", "label": "tempest-test-network--998799692", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc30636af-db", "ovs_interfaceid": "c30636af-db80-4279-8e40-c0266175c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.317 183079 DEBUG nova.network.os_vif_util [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cd:f6:c4,bridge_name='br-int',has_traffic_filtering=True,id=c30636af-db80-4279-8e40-c0266175c726,network=Network(cbad0d35-4bf3-49f1-bb21-0be199e1e42e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc30636af-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.318 183079 DEBUG nova.objects.instance [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lazy-loading 'pci_devices' on Instance uuid a39a5d00-6f96-4405-aff0-1449aee94079 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.334 183079 DEBUG nova.virt.libvirt.driver [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:12:33 compute-0 nova_compute[183075]:   <uuid>a39a5d00-6f96-4405-aff0-1449aee94079</uuid>
Jan 22 17:12:33 compute-0 nova_compute[183075]:   <name>instance-00000016</name>
Jan 22 17:12:33 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:12:33 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:12:33 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:12:33 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-2004519396</nova:name>
Jan 22 17:12:33 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:12:33</nova:creationTime>
Jan 22 17:12:33 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:12:33 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:12:33 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:12:33 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:12:33 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:12:33 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:12:33 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:12:33 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:12:33 compute-0 nova_compute[183075]:         <nova:user uuid="1e61127d65144bcbaa0d43fe3eb484c0">tempest-QoSTest-2146064006-project-member</nova:user>
Jan 22 17:12:33 compute-0 nova_compute[183075]:         <nova:project uuid="bfc6667804934c92b71ce7638089e9e3">tempest-QoSTest-2146064006</nova:project>
Jan 22 17:12:33 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:12:33 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:12:33 compute-0 nova_compute[183075]:         <nova:port uuid="c30636af-db80-4279-8e40-c0266175c726">
Jan 22 17:12:33 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.19" ipVersion="4"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:12:33 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:12:33 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:12:33 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <system>
Jan 22 17:12:33 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:12:33 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:12:33 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:12:33 compute-0 nova_compute[183075]:       <entry name="serial">a39a5d00-6f96-4405-aff0-1449aee94079</entry>
Jan 22 17:12:33 compute-0 nova_compute[183075]:       <entry name="uuid">a39a5d00-6f96-4405-aff0-1449aee94079</entry>
Jan 22 17:12:33 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     </system>
Jan 22 17:12:33 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:12:33 compute-0 nova_compute[183075]:   <os>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:   </os>
Jan 22 17:12:33 compute-0 nova_compute[183075]:   <features>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:   </features>
Jan 22 17:12:33 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:12:33 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:12:33 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:12:33 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/a39a5d00-6f96-4405-aff0-1449aee94079/disk"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:12:33 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:cd:f6:c4"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:       <target dev="tapc30636af-db"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:12:33 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/a39a5d00-6f96-4405-aff0-1449aee94079/console.log" append="off"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <video>
Jan 22 17:12:33 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     </video>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:12:33 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:12:33 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:12:33 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:12:33 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:12:33 compute-0 nova_compute[183075]: </domain>
Jan 22 17:12:33 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.336 183079 DEBUG nova.compute.manager [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Preparing to wait for external event network-vif-plugged-c30636af-db80-4279-8e40-c0266175c726 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.337 183079 DEBUG oslo_concurrency.lockutils [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "a39a5d00-6f96-4405-aff0-1449aee94079-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.338 183079 DEBUG oslo_concurrency.lockutils [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "a39a5d00-6f96-4405-aff0-1449aee94079-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.338 183079 DEBUG oslo_concurrency.lockutils [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "a39a5d00-6f96-4405-aff0-1449aee94079-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.340 183079 DEBUG nova.virt.libvirt.vif [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:12:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-2004519396',display_name='tempest-server-test-2004519396',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-2004519396',id=22,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIwX3Tnjd5UGmg0w9k/BN9eS1qe75E7Lic/jqsTQaVUTG16NFNysn4OP5OqeIQEMSgvijvcEmFLUdbKXTJ+WqhpTczbZR3YnhHyqcZ3vgAR6NGGdmWhQ6meJ9Nv3J8mm/Q==',key_name='tempest-keypair-test-368848261',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bfc6667804934c92b71ce7638089e9e3',ramdisk_id='',reservation_id='r-m04apwhz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_r
ng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-QoSTest-2146064006',owner_user_name='tempest-QoSTest-2146064006-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:12:27Z,user_data=None,user_id='1e61127d65144bcbaa0d43fe3eb484c0',uuid=a39a5d00-6f96-4405-aff0-1449aee94079,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c30636af-db80-4279-8e40-c0266175c726", "address": "fa:16:3e:cd:f6:c4", "network": {"id": "cbad0d35-4bf3-49f1-bb21-0be199e1e42e", "bridge": "br-int", "label": "tempest-test-network--998799692", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc30636af-db", "ovs_interfaceid": "c30636af-db80-4279-8e40-c0266175c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.341 183079 DEBUG nova.network.os_vif_util [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Converting VIF {"id": "c30636af-db80-4279-8e40-c0266175c726", "address": "fa:16:3e:cd:f6:c4", "network": {"id": "cbad0d35-4bf3-49f1-bb21-0be199e1e42e", "bridge": "br-int", "label": "tempest-test-network--998799692", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc30636af-db", "ovs_interfaceid": "c30636af-db80-4279-8e40-c0266175c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.342 183079 DEBUG nova.network.os_vif_util [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cd:f6:c4,bridge_name='br-int',has_traffic_filtering=True,id=c30636af-db80-4279-8e40-c0266175c726,network=Network(cbad0d35-4bf3-49f1-bb21-0be199e1e42e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc30636af-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.343 183079 DEBUG os_vif [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:f6:c4,bridge_name='br-int',has_traffic_filtering=True,id=c30636af-db80-4279-8e40-c0266175c726,network=Network(cbad0d35-4bf3-49f1-bb21-0be199e1e42e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc30636af-db') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.344 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.345 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.346 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.353 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.353 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc30636af-db, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.354 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc30636af-db, col_values=(('external_ids', {'iface-id': 'c30636af-db80-4279-8e40-c0266175c726', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cd:f6:c4', 'vm-uuid': 'a39a5d00-6f96-4405-aff0-1449aee94079'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.388 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:33 compute-0 NetworkManager[55454]: <info>  [1769101953.3900] manager: (tapc30636af-db): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/108)
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.391 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.402 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.404 183079 INFO os_vif [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:f6:c4,bridge_name='br-int',has_traffic_filtering=True,id=c30636af-db80-4279-8e40-c0266175c726,network=Network(cbad0d35-4bf3-49f1-bb21-0be199e1e42e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc30636af-db')
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.455 183079 DEBUG nova.virt.libvirt.driver [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.455 183079 DEBUG nova.virt.libvirt.driver [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] No VIF found with MAC fa:16:3e:cd:f6:c4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:12:33 compute-0 kernel: tapc30636af-db: entered promiscuous mode
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.553 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:33 compute-0 NetworkManager[55454]: <info>  [1769101953.5532] manager: (tapc30636af-db): new Tun device (/org/freedesktop/NetworkManager/Devices/109)
Jan 22 17:12:33 compute-0 ovn_controller[95372]: 2026-01-22T17:12:33Z|00254|binding|INFO|Claiming lport c30636af-db80-4279-8e40-c0266175c726 for this chassis.
Jan 22 17:12:33 compute-0 ovn_controller[95372]: 2026-01-22T17:12:33Z|00255|binding|INFO|c30636af-db80-4279-8e40-c0266175c726: Claiming fa:16:3e:cd:f6:c4 10.100.0.19
Jan 22 17:12:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:33.567 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cd:f6:c4 10.100.0.19'], port_security=['fa:16:3e:cd:f6:c4 10.100.0.19'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28', 'neutron:device_id': 'a39a5d00-6f96-4405-aff0-1449aee94079', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cbad0d35-4bf3-49f1-bb21-0be199e1e42e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bfc6667804934c92b71ce7638089e9e3', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ff5109fd-5275-48f3-bbdf-9e01013834de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=429b02ac-0e82-46dc-96b9-403150e7bdc7, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=c30636af-db80-4279-8e40-c0266175c726) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:12:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:33.571 104629 INFO neutron.agent.ovn.metadata.agent [-] Port c30636af-db80-4279-8e40-c0266175c726 in datapath cbad0d35-4bf3-49f1-bb21-0be199e1e42e bound to our chassis
Jan 22 17:12:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:33.577 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cbad0d35-4bf3-49f1-bb21-0be199e1e42e
Jan 22 17:12:33 compute-0 ovn_controller[95372]: 2026-01-22T17:12:33Z|00256|binding|INFO|Setting lport c30636af-db80-4279-8e40-c0266175c726 ovn-installed in OVS
Jan 22 17:12:33 compute-0 ovn_controller[95372]: 2026-01-22T17:12:33Z|00257|binding|INFO|Setting lport c30636af-db80-4279-8e40-c0266175c726 up in Southbound
Jan 22 17:12:33 compute-0 nova_compute[183075]: 2026-01-22 17:12:33.590 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:33 compute-0 systemd-udevd[220804]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:12:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:33.601 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a7b0fd9c-5221-4b87-a99f-17b5bc0a32c1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:33.602 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcbad0d35-41 in ovnmeta-cbad0d35-4bf3-49f1-bb21-0be199e1e42e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:12:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:33.605 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcbad0d35-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:12:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:33.605 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8b358d41-b165-4a3e-820b-c4468b7dfdf9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:33.608 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6f6a3d43-b8dd-485d-8957-045dcecbeab1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:33 compute-0 NetworkManager[55454]: <info>  [1769101953.6191] device (tapc30636af-db): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:12:33 compute-0 NetworkManager[55454]: <info>  [1769101953.6196] device (tapc30636af-db): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:12:33 compute-0 systemd-machined[154382]: New machine qemu-22-instance-00000016.
Jan 22 17:12:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:33.632 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[bbfcba5d-3e90-4089-b939-87c67803fd92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:33 compute-0 systemd[1]: Started Virtual Machine qemu-22-instance-00000016.
Jan 22 17:12:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:33.661 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[9ca25fba-9e6c-45df-b96d-492ce1e93cb0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:33 compute-0 podman[220788]: 2026-01-22 17:12:33.6708995 +0000 UTC m=+0.119038124 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 17:12:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:33.699 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[584093ff-0c55-483a-beb9-b2c817517866]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:33 compute-0 NetworkManager[55454]: <info>  [1769101953.7094] manager: (tapcbad0d35-40): new Veth device (/org/freedesktop/NetworkManager/Devices/110)
Jan 22 17:12:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:33.708 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ecd36752-a601-426f-a6d9-4441a1862e0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:33.773 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[11f0ae11-84a7-44b6-8c14-670699b0fca4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:33.780 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[b4bf8190-9521-4012-8566-f0cc10bc35af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:33 compute-0 NetworkManager[55454]: <info>  [1769101953.8056] device (tapcbad0d35-40): carrier: link connected
Jan 22 17:12:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:33.810 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[4d93fc53-8743-4466-9ffe-a6660e9422dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:33.837 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[218bb5be-c9d1-4154-851e-842f1b78d7af]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcbad0d35-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:15:f3:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431750, 'reachable_time': 36376, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220847, 'error': None, 'target': 'ovnmeta-cbad0d35-4bf3-49f1-bb21-0be199e1e42e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:33.860 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[294b542a-6550-4d83-a6ac-98e8c464f6ea]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe15:f3b2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431750, 'tstamp': 431750}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220848, 'error': None, 'target': 'ovnmeta-cbad0d35-4bf3-49f1-bb21-0be199e1e42e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:33.888 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d20ceddc-9b97-4b73-a423-230ea87e18df]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcbad0d35-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:15:f3:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431750, 'reachable_time': 36376, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220849, 'error': None, 'target': 'ovnmeta-cbad0d35-4bf3-49f1-bb21-0be199e1e42e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:33.933 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0c42c6ab-2b07-49df-b818-b3961133a47a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:34.027 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[9c858705-fc08-4528-b8a8-70ef36cc8fe6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:34.029 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcbad0d35-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:34.029 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:34.030 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcbad0d35-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:12:34 compute-0 NetworkManager[55454]: <info>  [1769101954.0334] manager: (tapcbad0d35-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/111)
Jan 22 17:12:34 compute-0 kernel: tapcbad0d35-40: entered promiscuous mode
Jan 22 17:12:34 compute-0 nova_compute[183075]: 2026-01-22 17:12:34.034 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:34.044 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcbad0d35-40, col_values=(('external_ids', {'iface-id': '9a8cacf5-7e05-4d6d-ac33-e4996692a784'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:12:34 compute-0 ovn_controller[95372]: 2026-01-22T17:12:34Z|00258|binding|INFO|Releasing lport 9a8cacf5-7e05-4d6d-ac33-e4996692a784 from this chassis (sb_readonly=0)
Jan 22 17:12:34 compute-0 nova_compute[183075]: 2026-01-22 17:12:34.046 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:34.048 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cbad0d35-4bf3-49f1-bb21-0be199e1e42e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cbad0d35-4bf3-49f1-bb21-0be199e1e42e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:34.049 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0d28a46a-ecdb-46c1-828d-6c550f97286a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:34.051 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-cbad0d35-4bf3-49f1-bb21-0be199e1e42e
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/cbad0d35-4bf3-49f1-bb21-0be199e1e42e.pid.haproxy
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID cbad0d35-4bf3-49f1-bb21-0be199e1e42e
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:12:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:34.052 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cbad0d35-4bf3-49f1-bb21-0be199e1e42e', 'env', 'PROCESS_TAG=haproxy-cbad0d35-4bf3-49f1-bb21-0be199e1e42e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cbad0d35-4bf3-49f1-bb21-0be199e1e42e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:12:34 compute-0 nova_compute[183075]: 2026-01-22 17:12:34.060 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:34 compute-0 nova_compute[183075]: 2026-01-22 17:12:34.104 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101954.1032517, a39a5d00-6f96-4405-aff0-1449aee94079 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:12:34 compute-0 nova_compute[183075]: 2026-01-22 17:12:34.104 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] VM Started (Lifecycle Event)
Jan 22 17:12:34 compute-0 nova_compute[183075]: 2026-01-22 17:12:34.128 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:12:34 compute-0 nova_compute[183075]: 2026-01-22 17:12:34.132 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101954.1035614, a39a5d00-6f96-4405-aff0-1449aee94079 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:12:34 compute-0 nova_compute[183075]: 2026-01-22 17:12:34.132 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] VM Paused (Lifecycle Event)
Jan 22 17:12:34 compute-0 nova_compute[183075]: 2026-01-22 17:12:34.151 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:12:34 compute-0 nova_compute[183075]: 2026-01-22 17:12:34.155 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:12:34 compute-0 nova_compute[183075]: 2026-01-22 17:12:34.180 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:12:34 compute-0 podman[220888]: 2026-01-22 17:12:34.523424597 +0000 UTC m=+0.078854097 container create 4faf985257e3c47d365d9893848cb23e428415e25aabdb2f680c13454df413aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cbad0d35-4bf3-49f1-bb21-0be199e1e42e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:12:34 compute-0 podman[220888]: 2026-01-22 17:12:34.486091874 +0000 UTC m=+0.041521454 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:12:34 compute-0 nova_compute[183075]: 2026-01-22 17:12:34.590 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:34 compute-0 systemd[1]: Started libpod-conmon-4faf985257e3c47d365d9893848cb23e428415e25aabdb2f680c13454df413aa.scope.
Jan 22 17:12:34 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:12:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b15dfc45478c199d4e87377273bc377a98fa0798e6507e78b443f0dfcffbdafc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:12:34 compute-0 podman[220888]: 2026-01-22 17:12:34.648011485 +0000 UTC m=+0.203441005 container init 4faf985257e3c47d365d9893848cb23e428415e25aabdb2f680c13454df413aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cbad0d35-4bf3-49f1-bb21-0be199e1e42e, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 22 17:12:34 compute-0 podman[220888]: 2026-01-22 17:12:34.653985831 +0000 UTC m=+0.209415331 container start 4faf985257e3c47d365d9893848cb23e428415e25aabdb2f680c13454df413aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cbad0d35-4bf3-49f1-bb21-0be199e1e42e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:12:34 compute-0 neutron-haproxy-ovnmeta-cbad0d35-4bf3-49f1-bb21-0be199e1e42e[220905]: [NOTICE]   (220909) : New worker (220911) forked
Jan 22 17:12:34 compute-0 neutron-haproxy-ovnmeta-cbad0d35-4bf3-49f1-bb21-0be199e1e42e[220905]: [NOTICE]   (220909) : Loading success.
Jan 22 17:12:35 compute-0 nova_compute[183075]: 2026-01-22 17:12:35.273 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:36 compute-0 nova_compute[183075]: 2026-01-22 17:12:36.956 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:37 compute-0 nova_compute[183075]: 2026-01-22 17:12:37.506 183079 INFO nova.compute.manager [None req-8c6f7dc0-0474-4e9c-bfcc-b0656d4971a6 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Get console output
Jan 22 17:12:37 compute-0 nova_compute[183075]: 2026-01-22 17:12:37.514 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:12:38 compute-0 nova_compute[183075]: 2026-01-22 17:12:38.390 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:38 compute-0 nova_compute[183075]: 2026-01-22 17:12:38.678 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:39 compute-0 nova_compute[183075]: 2026-01-22 17:12:39.435 183079 DEBUG nova.compute.manager [req-3a248d7d-6c95-40bb-b20c-4dda7f49484c req-594fb250-2402-4624-8cf1-51c02c4c16fb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Received event network-vif-plugged-c30636af-db80-4279-8e40-c0266175c726 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:12:39 compute-0 nova_compute[183075]: 2026-01-22 17:12:39.435 183079 DEBUG oslo_concurrency.lockutils [req-3a248d7d-6c95-40bb-b20c-4dda7f49484c req-594fb250-2402-4624-8cf1-51c02c4c16fb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "a39a5d00-6f96-4405-aff0-1449aee94079-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:39 compute-0 nova_compute[183075]: 2026-01-22 17:12:39.436 183079 DEBUG oslo_concurrency.lockutils [req-3a248d7d-6c95-40bb-b20c-4dda7f49484c req-594fb250-2402-4624-8cf1-51c02c4c16fb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "a39a5d00-6f96-4405-aff0-1449aee94079-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:39 compute-0 nova_compute[183075]: 2026-01-22 17:12:39.436 183079 DEBUG oslo_concurrency.lockutils [req-3a248d7d-6c95-40bb-b20c-4dda7f49484c req-594fb250-2402-4624-8cf1-51c02c4c16fb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "a39a5d00-6f96-4405-aff0-1449aee94079-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:39 compute-0 nova_compute[183075]: 2026-01-22 17:12:39.436 183079 DEBUG nova.compute.manager [req-3a248d7d-6c95-40bb-b20c-4dda7f49484c req-594fb250-2402-4624-8cf1-51c02c4c16fb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Processing event network-vif-plugged-c30636af-db80-4279-8e40-c0266175c726 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:12:39 compute-0 nova_compute[183075]: 2026-01-22 17:12:39.438 183079 DEBUG nova.compute.manager [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Instance event wait completed in 5 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:12:39 compute-0 nova_compute[183075]: 2026-01-22 17:12:39.442 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101959.4423416, a39a5d00-6f96-4405-aff0-1449aee94079 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:12:39 compute-0 nova_compute[183075]: 2026-01-22 17:12:39.443 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] VM Resumed (Lifecycle Event)
Jan 22 17:12:39 compute-0 nova_compute[183075]: 2026-01-22 17:12:39.446 183079 DEBUG nova.virt.libvirt.driver [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:12:39 compute-0 nova_compute[183075]: 2026-01-22 17:12:39.451 183079 INFO nova.virt.libvirt.driver [-] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Instance spawned successfully.
Jan 22 17:12:39 compute-0 nova_compute[183075]: 2026-01-22 17:12:39.451 183079 DEBUG nova.virt.libvirt.driver [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:12:39 compute-0 nova_compute[183075]: 2026-01-22 17:12:39.467 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:12:39 compute-0 nova_compute[183075]: 2026-01-22 17:12:39.476 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:12:39 compute-0 nova_compute[183075]: 2026-01-22 17:12:39.481 183079 DEBUG nova.virt.libvirt.driver [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:12:39 compute-0 nova_compute[183075]: 2026-01-22 17:12:39.481 183079 DEBUG nova.virt.libvirt.driver [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:12:39 compute-0 nova_compute[183075]: 2026-01-22 17:12:39.482 183079 DEBUG nova.virt.libvirt.driver [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:12:39 compute-0 nova_compute[183075]: 2026-01-22 17:12:39.482 183079 DEBUG nova.virt.libvirt.driver [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:12:39 compute-0 nova_compute[183075]: 2026-01-22 17:12:39.483 183079 DEBUG nova.virt.libvirt.driver [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:12:39 compute-0 nova_compute[183075]: 2026-01-22 17:12:39.483 183079 DEBUG nova.virt.libvirt.driver [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:12:39 compute-0 nova_compute[183075]: 2026-01-22 17:12:39.509 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:12:39 compute-0 nova_compute[183075]: 2026-01-22 17:12:39.543 183079 INFO nova.compute.manager [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Took 11.50 seconds to spawn the instance on the hypervisor.
Jan 22 17:12:39 compute-0 nova_compute[183075]: 2026-01-22 17:12:39.544 183079 DEBUG nova.compute.manager [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:12:39 compute-0 nova_compute[183075]: 2026-01-22 17:12:39.614 183079 INFO nova.compute.manager [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Took 12.10 seconds to build instance.
Jan 22 17:12:39 compute-0 nova_compute[183075]: 2026-01-22 17:12:39.630 183079 DEBUG oslo_concurrency.lockutils [None req-c86748c4-62b3-4f0b-b579-cd50e3d6e9e8 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "a39a5d00-6f96-4405-aff0-1449aee94079" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.205s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:40 compute-0 nova_compute[183075]: 2026-01-22 17:12:40.279 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:40 compute-0 nova_compute[183075]: 2026-01-22 17:12:40.408 183079 INFO nova.compute.manager [None req-2d278ba3-0aea-4bc5-adb1-972e7cb2c008 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Get console output
Jan 22 17:12:40 compute-0 nova_compute[183075]: 2026-01-22 17:12:40.413 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:12:41 compute-0 podman[220920]: 2026-01-22 17:12:41.410151205 +0000 UTC m=+0.099320670 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 17:12:41 compute-0 nova_compute[183075]: 2026-01-22 17:12:41.559 183079 DEBUG nova.compute.manager [req-23c28238-b64a-4124-b34d-2a2db0b4a8d9 req-6e0e47a3-7b64-4a52-bd8a-f4adaf57e38f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Received event network-vif-plugged-c30636af-db80-4279-8e40-c0266175c726 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:12:41 compute-0 nova_compute[183075]: 2026-01-22 17:12:41.560 183079 DEBUG oslo_concurrency.lockutils [req-23c28238-b64a-4124-b34d-2a2db0b4a8d9 req-6e0e47a3-7b64-4a52-bd8a-f4adaf57e38f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "a39a5d00-6f96-4405-aff0-1449aee94079-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:41 compute-0 nova_compute[183075]: 2026-01-22 17:12:41.560 183079 DEBUG oslo_concurrency.lockutils [req-23c28238-b64a-4124-b34d-2a2db0b4a8d9 req-6e0e47a3-7b64-4a52-bd8a-f4adaf57e38f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "a39a5d00-6f96-4405-aff0-1449aee94079-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:41 compute-0 nova_compute[183075]: 2026-01-22 17:12:41.561 183079 DEBUG oslo_concurrency.lockutils [req-23c28238-b64a-4124-b34d-2a2db0b4a8d9 req-6e0e47a3-7b64-4a52-bd8a-f4adaf57e38f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "a39a5d00-6f96-4405-aff0-1449aee94079-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:41 compute-0 nova_compute[183075]: 2026-01-22 17:12:41.561 183079 DEBUG nova.compute.manager [req-23c28238-b64a-4124-b34d-2a2db0b4a8d9 req-6e0e47a3-7b64-4a52-bd8a-f4adaf57e38f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] No waiting events found dispatching network-vif-plugged-c30636af-db80-4279-8e40-c0266175c726 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:12:41 compute-0 nova_compute[183075]: 2026-01-22 17:12:41.561 183079 WARNING nova.compute.manager [req-23c28238-b64a-4124-b34d-2a2db0b4a8d9 req-6e0e47a3-7b64-4a52-bd8a-f4adaf57e38f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Received unexpected event network-vif-plugged-c30636af-db80-4279-8e40-c0266175c726 for instance with vm_state active and task_state None.
Jan 22 17:12:41 compute-0 nova_compute[183075]: 2026-01-22 17:12:41.594 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:41.929 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:41.930 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:41.931 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:43 compute-0 nova_compute[183075]: 2026-01-22 17:12:43.435 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:43 compute-0 nova_compute[183075]: 2026-01-22 17:12:43.833 183079 DEBUG oslo_concurrency.lockutils [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "7915ef96-3b31-447b-a4b5-1feeb4997869" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:43 compute-0 nova_compute[183075]: 2026-01-22 17:12:43.834 183079 DEBUG oslo_concurrency.lockutils [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "7915ef96-3b31-447b-a4b5-1feeb4997869" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:43 compute-0 nova_compute[183075]: 2026-01-22 17:12:43.856 183079 DEBUG nova.compute.manager [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:12:43 compute-0 nova_compute[183075]: 2026-01-22 17:12:43.947 183079 DEBUG oslo_concurrency.lockutils [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:43 compute-0 nova_compute[183075]: 2026-01-22 17:12:43.949 183079 DEBUG oslo_concurrency.lockutils [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:43 compute-0 nova_compute[183075]: 2026-01-22 17:12:43.957 183079 DEBUG nova.virt.hardware [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:12:43 compute-0 nova_compute[183075]: 2026-01-22 17:12:43.958 183079 INFO nova.compute.claims [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.120 183079 DEBUG nova.compute.provider_tree [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.137 183079 DEBUG nova.scheduler.client.report [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.162 183079 DEBUG oslo_concurrency.lockutils [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.163 183079 DEBUG nova.compute.manager [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.213 183079 DEBUG nova.compute.manager [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.214 183079 DEBUG nova.network.neutron [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.232 183079 INFO nova.virt.libvirt.driver [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.248 183079 DEBUG nova.compute.manager [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.339 183079 DEBUG nova.compute.manager [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.341 183079 DEBUG nova.virt.libvirt.driver [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.342 183079 INFO nova.virt.libvirt.driver [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Creating image(s)
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.342 183079 DEBUG oslo_concurrency.lockutils [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "/var/lib/nova/instances/7915ef96-3b31-447b-a4b5-1feeb4997869/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.343 183079 DEBUG oslo_concurrency.lockutils [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "/var/lib/nova/instances/7915ef96-3b31-447b-a4b5-1feeb4997869/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.343 183079 DEBUG oslo_concurrency.lockutils [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "/var/lib/nova/instances/7915ef96-3b31-447b-a4b5-1feeb4997869/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.358 183079 DEBUG oslo_concurrency.processutils [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.428 183079 DEBUG oslo_concurrency.processutils [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.430 183079 DEBUG oslo_concurrency.lockutils [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.431 183079 DEBUG oslo_concurrency.lockutils [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.446 183079 DEBUG oslo_concurrency.processutils [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.512 183079 DEBUG oslo_concurrency.processutils [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.513 183079 DEBUG oslo_concurrency.processutils [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/7915ef96-3b31-447b-a4b5-1feeb4997869/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.568 183079 DEBUG oslo_concurrency.processutils [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/7915ef96-3b31-447b-a4b5-1feeb4997869/disk 1073741824" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.570 183079 DEBUG oslo_concurrency.lockutils [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.571 183079 DEBUG oslo_concurrency.processutils [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.638 183079 DEBUG oslo_concurrency.processutils [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.641 183079 DEBUG nova.virt.disk.api [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Checking if we can resize image /var/lib/nova/instances/7915ef96-3b31-447b-a4b5-1feeb4997869/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.642 183079 DEBUG oslo_concurrency.processutils [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7915ef96-3b31-447b-a4b5-1feeb4997869/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.708 183079 DEBUG oslo_concurrency.processutils [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7915ef96-3b31-447b-a4b5-1feeb4997869/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.710 183079 DEBUG nova.virt.disk.api [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Cannot resize image /var/lib/nova/instances/7915ef96-3b31-447b-a4b5-1feeb4997869/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.711 183079 DEBUG nova.objects.instance [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lazy-loading 'migration_context' on Instance uuid 7915ef96-3b31-447b-a4b5-1feeb4997869 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.735 183079 DEBUG nova.virt.libvirt.driver [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.736 183079 DEBUG nova.virt.libvirt.driver [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Ensure instance console log exists: /var/lib/nova/instances/7915ef96-3b31-447b-a4b5-1feeb4997869/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.737 183079 DEBUG oslo_concurrency.lockutils [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.737 183079 DEBUG oslo_concurrency.lockutils [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:44 compute-0 nova_compute[183075]: 2026-01-22 17:12:44.738 183079 DEBUG oslo_concurrency.lockutils [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:45 compute-0 nova_compute[183075]: 2026-01-22 17:12:45.284 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:45 compute-0 nova_compute[183075]: 2026-01-22 17:12:45.526 183079 INFO nova.compute.manager [None req-161c0563-8c95-4fff-8e07-551fbcce3d59 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Get console output
Jan 22 17:12:45 compute-0 nova_compute[183075]: 2026-01-22 17:12:45.533 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:12:46 compute-0 nova_compute[183075]: 2026-01-22 17:12:46.035 183079 DEBUG nova.policy [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:12:46 compute-0 nova_compute[183075]: 2026-01-22 17:12:46.825 183079 DEBUG nova.network.neutron [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Successfully updated port: d9e056da-23a3-44e3-b7a8-44b73622dbb1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:12:46 compute-0 nova_compute[183075]: 2026-01-22 17:12:46.840 183079 DEBUG oslo_concurrency.lockutils [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "refresh_cache-7915ef96-3b31-447b-a4b5-1feeb4997869" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:12:46 compute-0 nova_compute[183075]: 2026-01-22 17:12:46.840 183079 DEBUG oslo_concurrency.lockutils [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquired lock "refresh_cache-7915ef96-3b31-447b-a4b5-1feeb4997869" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:12:46 compute-0 nova_compute[183075]: 2026-01-22 17:12:46.841 183079 DEBUG nova.network.neutron [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:12:46 compute-0 nova_compute[183075]: 2026-01-22 17:12:46.936 183079 DEBUG nova.compute.manager [req-ae82d368-a1f1-4103-b223-c7adf29349fa req-8c49c2ed-8611-4251-9221-3181a49b8430 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Received event network-changed-d9e056da-23a3-44e3-b7a8-44b73622dbb1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:12:46 compute-0 nova_compute[183075]: 2026-01-22 17:12:46.937 183079 DEBUG nova.compute.manager [req-ae82d368-a1f1-4103-b223-c7adf29349fa req-8c49c2ed-8611-4251-9221-3181a49b8430 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Refreshing instance network info cache due to event network-changed-d9e056da-23a3-44e3-b7a8-44b73622dbb1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:12:46 compute-0 nova_compute[183075]: 2026-01-22 17:12:46.937 183079 DEBUG oslo_concurrency.lockutils [req-ae82d368-a1f1-4103-b223-c7adf29349fa req-8c49c2ed-8611-4251-9221-3181a49b8430 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-7915ef96-3b31-447b-a4b5-1feeb4997869" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:12:46 compute-0 nova_compute[183075]: 2026-01-22 17:12:46.986 183079 DEBUG nova.network.neutron [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.393 183079 DEBUG oslo_concurrency.lockutils [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "cfe610a3-4dee-46ca-a82a-1c8993fdd52c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.395 183079 DEBUG oslo_concurrency.lockutils [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "cfe610a3-4dee-46ca-a82a-1c8993fdd52c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.412 183079 DEBUG nova.compute.manager [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.436 183079 DEBUG nova.network.neutron [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Updating instance_info_cache with network_info: [{"id": "d9e056da-23a3-44e3-b7a8-44b73622dbb1", "address": "fa:16:3e:fc:05:dd", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd9e056da-23", "ovs_interfaceid": "d9e056da-23a3-44e3-b7a8-44b73622dbb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.439 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.465 183079 DEBUG oslo_concurrency.lockutils [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Releasing lock "refresh_cache-7915ef96-3b31-447b-a4b5-1feeb4997869" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.466 183079 DEBUG nova.compute.manager [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Instance network_info: |[{"id": "d9e056da-23a3-44e3-b7a8-44b73622dbb1", "address": "fa:16:3e:fc:05:dd", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd9e056da-23", "ovs_interfaceid": "d9e056da-23a3-44e3-b7a8-44b73622dbb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.466 183079 DEBUG oslo_concurrency.lockutils [req-ae82d368-a1f1-4103-b223-c7adf29349fa req-8c49c2ed-8611-4251-9221-3181a49b8430 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-7915ef96-3b31-447b-a4b5-1feeb4997869" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.467 183079 DEBUG nova.network.neutron [req-ae82d368-a1f1-4103-b223-c7adf29349fa req-8c49c2ed-8611-4251-9221-3181a49b8430 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Refreshing network info cache for port d9e056da-23a3-44e3-b7a8-44b73622dbb1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.469 183079 DEBUG nova.virt.libvirt.driver [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Start _get_guest_xml network_info=[{"id": "d9e056da-23a3-44e3-b7a8-44b73622dbb1", "address": "fa:16:3e:fc:05:dd", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd9e056da-23", "ovs_interfaceid": "d9e056da-23a3-44e3-b7a8-44b73622dbb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.475 183079 WARNING nova.virt.libvirt.driver [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.484 183079 DEBUG oslo_concurrency.lockutils [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.485 183079 DEBUG oslo_concurrency.lockutils [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.486 183079 DEBUG nova.virt.libvirt.host [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.486 183079 DEBUG nova.virt.libvirt.host [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.493 183079 DEBUG nova.virt.hardware [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.493 183079 INFO nova.compute.claims [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.496 183079 DEBUG nova.virt.libvirt.host [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.497 183079 DEBUG nova.virt.libvirt.host [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.497 183079 DEBUG nova.virt.libvirt.driver [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.498 183079 DEBUG nova.virt.hardware [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.498 183079 DEBUG nova.virt.hardware [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.498 183079 DEBUG nova.virt.hardware [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.499 183079 DEBUG nova.virt.hardware [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.499 183079 DEBUG nova.virt.hardware [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.499 183079 DEBUG nova.virt.hardware [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.499 183079 DEBUG nova.virt.hardware [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.500 183079 DEBUG nova.virt.hardware [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.500 183079 DEBUG nova.virt.hardware [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.500 183079 DEBUG nova.virt.hardware [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.501 183079 DEBUG nova.virt.hardware [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.504 183079 DEBUG nova.virt.libvirt.vif [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:12:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1675680184',display_name='tempest-server-test-1675680184',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1675680184',id=23,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFroJSJbEvCYe2sCwXkwL5KPVfwqqRPNqrWU5nnhv8U4G8wYWbEqk5AtRWXltmX7CAIPBST+sTwCKfymhALKd54XNN92D6KKnKgA6jmOihENah4FGfU80fx1UEE0osLpiw==',key_name='tempest-keypair-test-618317104',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e05c7aae349e4a1d859a387df45650a0',ramdisk_id='',reservation_id='r-b8u19k66',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='
virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSeparateNetwork-931877966',owner_user_name='tempest-FloatingIpSeparateNetwork-931877966-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:12:44Z,user_data=None,user_id='cd47d63cff2548a88e21e5c2e6a5c161',uuid=7915ef96-3b31-447b-a4b5-1feeb4997869,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d9e056da-23a3-44e3-b7a8-44b73622dbb1", "address": "fa:16:3e:fc:05:dd", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd9e056da-23", "ovs_interfaceid": "d9e056da-23a3-44e3-b7a8-44b73622dbb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.505 183079 DEBUG nova.network.os_vif_util [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converting VIF {"id": "d9e056da-23a3-44e3-b7a8-44b73622dbb1", "address": "fa:16:3e:fc:05:dd", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd9e056da-23", "ovs_interfaceid": "d9e056da-23a3-44e3-b7a8-44b73622dbb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.506 183079 DEBUG nova.network.os_vif_util [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fc:05:dd,bridge_name='br-int',has_traffic_filtering=True,id=d9e056da-23a3-44e3-b7a8-44b73622dbb1,network=Network(0a16be1a-262e-47f7-8518-5f24ee15796e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd9e056da-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.507 183079 DEBUG nova.objects.instance [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7915ef96-3b31-447b-a4b5-1feeb4997869 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.522 183079 DEBUG nova.virt.libvirt.driver [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:12:48 compute-0 nova_compute[183075]:   <uuid>7915ef96-3b31-447b-a4b5-1feeb4997869</uuid>
Jan 22 17:12:48 compute-0 nova_compute[183075]:   <name>instance-00000017</name>
Jan 22 17:12:48 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:12:48 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:12:48 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:12:48 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1675680184</nova:name>
Jan 22 17:12:48 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:12:48</nova:creationTime>
Jan 22 17:12:48 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:12:48 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:12:48 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:12:48 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:12:48 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:12:48 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:12:48 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:12:48 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:12:48 compute-0 nova_compute[183075]:         <nova:user uuid="cd47d63cff2548a88e21e5c2e6a5c161">tempest-FloatingIpSeparateNetwork-931877966-project-member</nova:user>
Jan 22 17:12:48 compute-0 nova_compute[183075]:         <nova:project uuid="e05c7aae349e4a1d859a387df45650a0">tempest-FloatingIpSeparateNetwork-931877966</nova:project>
Jan 22 17:12:48 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:12:48 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:12:48 compute-0 nova_compute[183075]:         <nova:port uuid="d9e056da-23a3-44e3-b7a8-44b73622dbb1">
Jan 22 17:12:48 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.27" ipVersion="4"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:12:48 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:12:48 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:12:48 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <system>
Jan 22 17:12:48 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:12:48 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:12:48 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:12:48 compute-0 nova_compute[183075]:       <entry name="serial">7915ef96-3b31-447b-a4b5-1feeb4997869</entry>
Jan 22 17:12:48 compute-0 nova_compute[183075]:       <entry name="uuid">7915ef96-3b31-447b-a4b5-1feeb4997869</entry>
Jan 22 17:12:48 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     </system>
Jan 22 17:12:48 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:12:48 compute-0 nova_compute[183075]:   <os>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:   </os>
Jan 22 17:12:48 compute-0 nova_compute[183075]:   <features>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:   </features>
Jan 22 17:12:48 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:12:48 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:12:48 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:12:48 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/7915ef96-3b31-447b-a4b5-1feeb4997869/disk"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:12:48 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:fc:05:dd"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:       <target dev="tapd9e056da-23"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:12:48 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/7915ef96-3b31-447b-a4b5-1feeb4997869/console.log" append="off"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <video>
Jan 22 17:12:48 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     </video>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:12:48 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:12:48 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:12:48 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:12:48 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:12:48 compute-0 nova_compute[183075]: </domain>
Jan 22 17:12:48 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.527 183079 DEBUG nova.compute.manager [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Preparing to wait for external event network-vif-plugged-d9e056da-23a3-44e3-b7a8-44b73622dbb1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.528 183079 DEBUG oslo_concurrency.lockutils [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "7915ef96-3b31-447b-a4b5-1feeb4997869-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.528 183079 DEBUG oslo_concurrency.lockutils [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "7915ef96-3b31-447b-a4b5-1feeb4997869-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.528 183079 DEBUG oslo_concurrency.lockutils [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "7915ef96-3b31-447b-a4b5-1feeb4997869-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.529 183079 DEBUG nova.virt.libvirt.vif [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:12:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1675680184',display_name='tempest-server-test-1675680184',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1675680184',id=23,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFroJSJbEvCYe2sCwXkwL5KPVfwqqRPNqrWU5nnhv8U4G8wYWbEqk5AtRWXltmX7CAIPBST+sTwCKfymhALKd54XNN92D6KKnKgA6jmOihENah4FGfU80fx1UEE0osLpiw==',key_name='tempest-keypair-test-618317104',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e05c7aae349e4a1d859a387df45650a0',ramdisk_id='',reservation_id='r-b8u19k66',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSeparateNetwork-931877966',owner_user_name='tempest-FloatingIpSeparateNetwork-931877966-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:12:44Z,user_data=None,user_id='cd47d63cff2548a88e21e5c2e6a5c161',uuid=7915ef96-3b31-447b-a4b5-1feeb4997869,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d9e056da-23a3-44e3-b7a8-44b73622dbb1", "address": "fa:16:3e:fc:05:dd", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd9e056da-23", "ovs_interfaceid": "d9e056da-23a3-44e3-b7a8-44b73622dbb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.529 183079 DEBUG nova.network.os_vif_util [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converting VIF {"id": "d9e056da-23a3-44e3-b7a8-44b73622dbb1", "address": "fa:16:3e:fc:05:dd", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd9e056da-23", "ovs_interfaceid": "d9e056da-23a3-44e3-b7a8-44b73622dbb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.530 183079 DEBUG nova.network.os_vif_util [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fc:05:dd,bridge_name='br-int',has_traffic_filtering=True,id=d9e056da-23a3-44e3-b7a8-44b73622dbb1,network=Network(0a16be1a-262e-47f7-8518-5f24ee15796e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd9e056da-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.530 183079 DEBUG os_vif [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fc:05:dd,bridge_name='br-int',has_traffic_filtering=True,id=d9e056da-23a3-44e3-b7a8-44b73622dbb1,network=Network(0a16be1a-262e-47f7-8518-5f24ee15796e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd9e056da-23') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.531 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.532 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.532 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.536 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.536 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd9e056da-23, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.536 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd9e056da-23, col_values=(('external_ids', {'iface-id': 'd9e056da-23a3-44e3-b7a8-44b73622dbb1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fc:05:dd', 'vm-uuid': '7915ef96-3b31-447b-a4b5-1feeb4997869'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.538 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:48 compute-0 NetworkManager[55454]: <info>  [1769101968.5394] manager: (tapd9e056da-23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/112)
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.541 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.545 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.546 183079 INFO os_vif [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fc:05:dd,bridge_name='br-int',has_traffic_filtering=True,id=d9e056da-23a3-44e3-b7a8-44b73622dbb1,network=Network(0a16be1a-262e-47f7-8518-5f24ee15796e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd9e056da-23')
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.595 183079 DEBUG nova.virt.libvirt.driver [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.596 183079 DEBUG nova.virt.libvirt.driver [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] No VIF found with MAC fa:16:3e:fc:05:dd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:12:48 compute-0 kernel: tapd9e056da-23: entered promiscuous mode
Jan 22 17:12:48 compute-0 NetworkManager[55454]: <info>  [1769101968.6649] manager: (tapd9e056da-23): new Tun device (/org/freedesktop/NetworkManager/Devices/113)
Jan 22 17:12:48 compute-0 ovn_controller[95372]: 2026-01-22T17:12:48Z|00259|binding|INFO|Claiming lport d9e056da-23a3-44e3-b7a8-44b73622dbb1 for this chassis.
Jan 22 17:12:48 compute-0 ovn_controller[95372]: 2026-01-22T17:12:48Z|00260|binding|INFO|d9e056da-23a3-44e3-b7a8-44b73622dbb1: Claiming fa:16:3e:fc:05:dd 10.100.0.27
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.668 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:48.678 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fc:05:dd 10.100.0.27'], port_security=['fa:16:3e:fc:05:dd 10.100.0.27'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.27/28', 'neutron:device_id': '7915ef96-3b31-447b-a4b5-1feeb4997869', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a16be1a-262e-47f7-8518-5f24ee15796e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e05c7aae349e4a1d859a387df45650a0', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b9cc1ac9-85f4-4da7-891e-b0644711e574', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.224'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5b6ccb16-1216-4deb-9d72-42005a3163bb, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=d9e056da-23a3-44e3-b7a8-44b73622dbb1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:12:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:48.679 104629 INFO neutron.agent.ovn.metadata.agent [-] Port d9e056da-23a3-44e3-b7a8-44b73622dbb1 in datapath 0a16be1a-262e-47f7-8518-5f24ee15796e bound to our chassis
Jan 22 17:12:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:48.681 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0a16be1a-262e-47f7-8518-5f24ee15796e
Jan 22 17:12:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:48.696 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[27e06f23-1f88-4e10-9890-0b8b082dda39]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:48.697 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0a16be1a-21 in ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:12:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:48.700 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0a16be1a-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:12:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:48.700 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[51fa6abc-e450-4d17-9ae7-beab2de35071]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:48 compute-0 systemd-udevd[220976]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:12:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:48.704 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f76baa26-7546-4623-ba45-d48e24b189be]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:48 compute-0 ovn_controller[95372]: 2026-01-22T17:12:48Z|00261|binding|INFO|Setting lport d9e056da-23a3-44e3-b7a8-44b73622dbb1 up in Southbound
Jan 22 17:12:48 compute-0 ovn_controller[95372]: 2026-01-22T17:12:48Z|00262|binding|INFO|Setting lport d9e056da-23a3-44e3-b7a8-44b73622dbb1 ovn-installed in OVS
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.708 183079 DEBUG nova.compute.provider_tree [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.712 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:48 compute-0 systemd-machined[154382]: New machine qemu-23-instance-00000017.
Jan 22 17:12:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:48.724 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[42134ebd-842d-4f48-855e-463e71ae41c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:48 compute-0 NetworkManager[55454]: <info>  [1769101968.7268] device (tapd9e056da-23): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:12:48 compute-0 NetworkManager[55454]: <info>  [1769101968.7279] device (tapd9e056da-23): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.730 183079 DEBUG nova.scheduler.client.report [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:12:48 compute-0 systemd[1]: Started Virtual Machine qemu-23-instance-00000017.
Jan 22 17:12:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:48.757 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[60dd774a-d4b2-4bbc-a538-ca90257409c1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.760 183079 DEBUG oslo_concurrency.lockutils [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.276s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.761 183079 DEBUG nova.compute.manager [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:12:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:48.786 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[665c87b2-2faa-457a-a24d-5281ee7fbc49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:48 compute-0 NetworkManager[55454]: <info>  [1769101968.7919] manager: (tap0a16be1a-20): new Veth device (/org/freedesktop/NetworkManager/Devices/114)
Jan 22 17:12:48 compute-0 systemd-udevd[220980]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:12:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:48.794 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[59718a64-cf03-454c-b648-9c66ea18d7c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.805 183079 DEBUG nova.compute.manager [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.805 183079 DEBUG nova.network.neutron [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.836 183079 INFO nova.virt.libvirt.driver [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:12:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:48.848 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[5b38897d-8185-46d0-b1ad-ee199f8269e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:48.854 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[16634dc3-16cf-432c-a957-bf5115764776]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:48 compute-0 NetworkManager[55454]: <info>  [1769101968.8776] device (tap0a16be1a-20): carrier: link connected
Jan 22 17:12:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:48.883 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[60ee302c-6e4f-4ce5-a3d3-7dbabcb50eb9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:48.904 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[36c7de7f-2a17-4a41-ab71-45ff752f3981]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0a16be1a-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ba:16:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 433257, 'reachable_time': 44205, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221009, 'error': None, 'target': 'ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:48.922 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[200dfecf-de9b-4d1e-9124-73036bb13784]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feba:16c2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 433257, 'tstamp': 433257}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221010, 'error': None, 'target': 'ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:48.943 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[faccb23b-0bdc-4b97-844b-c66e45ba1fdd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0a16be1a-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ba:16:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 433257, 'reachable_time': 44205, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221011, 'error': None, 'target': 'ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:48.977 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b880fbf2-fec4-4007-88c7-e673a11c928e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:48 compute-0 nova_compute[183075]: 2026-01-22 17:12:48.989 183079 DEBUG nova.compute.manager [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:49.043 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7ec655b9-d41f-4f3a-b180-e7d8aae7588e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:49.049 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a16be1a-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:49.050 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:49.050 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0a16be1a-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:12:49 compute-0 kernel: tap0a16be1a-20: entered promiscuous mode
Jan 22 17:12:49 compute-0 NetworkManager[55454]: <info>  [1769101969.0542] manager: (tap0a16be1a-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/115)
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.054 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:49.059 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0a16be1a-20, col_values=(('external_ids', {'iface-id': 'f5af8e72-5100-4440-84f0-c68eec4b5e5e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.060 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:49 compute-0 ovn_controller[95372]: 2026-01-22T17:12:49Z|00263|binding|INFO|Releasing lport f5af8e72-5100-4440-84f0-c68eec4b5e5e from this chassis (sb_readonly=0)
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.061 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:49.063 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0a16be1a-262e-47f7-8518-5f24ee15796e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0a16be1a-262e-47f7-8518-5f24ee15796e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:49.076 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[852ff845-1665-4924-a3cd-080070119e11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:49.077 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/0a16be1a-262e-47f7-8518-5f24ee15796e.pid.haproxy
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 0a16be1a-262e-47f7-8518-5f24ee15796e
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:12:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:49.077 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e', 'env', 'PROCESS_TAG=haproxy-0a16be1a-262e-47f7-8518-5f24ee15796e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0a16be1a-262e-47f7-8518-5f24ee15796e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.080 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.094 183079 DEBUG nova.compute.manager [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.095 183079 DEBUG nova.virt.libvirt.driver [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.096 183079 INFO nova.virt.libvirt.driver [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Creating image(s)
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.096 183079 DEBUG oslo_concurrency.lockutils [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "/var/lib/nova/instances/cfe610a3-4dee-46ca-a82a-1c8993fdd52c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.096 183079 DEBUG oslo_concurrency.lockutils [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "/var/lib/nova/instances/cfe610a3-4dee-46ca-a82a-1c8993fdd52c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.097 183079 DEBUG oslo_concurrency.lockutils [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "/var/lib/nova/instances/cfe610a3-4dee-46ca-a82a-1c8993fdd52c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.112 183079 DEBUG oslo_concurrency.processutils [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.162 183079 DEBUG nova.compute.manager [req-9d2ec980-0c48-4f3d-b5a3-f06d0afc922c req-192692ab-d85e-4a78-8634-ffaa29e57226 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Received event network-vif-plugged-d9e056da-23a3-44e3-b7a8-44b73622dbb1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.162 183079 DEBUG oslo_concurrency.lockutils [req-9d2ec980-0c48-4f3d-b5a3-f06d0afc922c req-192692ab-d85e-4a78-8634-ffaa29e57226 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "7915ef96-3b31-447b-a4b5-1feeb4997869-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.163 183079 DEBUG oslo_concurrency.lockutils [req-9d2ec980-0c48-4f3d-b5a3-f06d0afc922c req-192692ab-d85e-4a78-8634-ffaa29e57226 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7915ef96-3b31-447b-a4b5-1feeb4997869-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.163 183079 DEBUG oslo_concurrency.lockutils [req-9d2ec980-0c48-4f3d-b5a3-f06d0afc922c req-192692ab-d85e-4a78-8634-ffaa29e57226 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7915ef96-3b31-447b-a4b5-1feeb4997869-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.163 183079 DEBUG nova.compute.manager [req-9d2ec980-0c48-4f3d-b5a3-f06d0afc922c req-192692ab-d85e-4a78-8634-ffaa29e57226 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Processing event network-vif-plugged-d9e056da-23a3-44e3-b7a8-44b73622dbb1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.170 183079 DEBUG oslo_concurrency.processutils [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.170 183079 DEBUG oslo_concurrency.lockutils [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.171 183079 DEBUG oslo_concurrency.lockutils [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.183 183079 DEBUG oslo_concurrency.processutils [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.238 183079 DEBUG oslo_concurrency.processutils [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.239 183079 DEBUG oslo_concurrency.processutils [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/cfe610a3-4dee-46ca-a82a-1c8993fdd52c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.273 183079 DEBUG oslo_concurrency.processutils [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/cfe610a3-4dee-46ca-a82a-1c8993fdd52c/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.274 183079 DEBUG oslo_concurrency.lockutils [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.274 183079 DEBUG oslo_concurrency.processutils [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.292 183079 DEBUG nova.policy [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.328 183079 DEBUG oslo_concurrency.processutils [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.329 183079 DEBUG nova.virt.disk.api [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Checking if we can resize image /var/lib/nova/instances/cfe610a3-4dee-46ca-a82a-1c8993fdd52c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.329 183079 DEBUG oslo_concurrency.processutils [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cfe610a3-4dee-46ca-a82a-1c8993fdd52c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.346 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101969.3392208, 7915ef96-3b31-447b-a4b5-1feeb4997869 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.347 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] VM Started (Lifecycle Event)
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.351 183079 DEBUG nova.compute.manager [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.359 183079 DEBUG nova.virt.libvirt.driver [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.365 183079 INFO nova.virt.libvirt.driver [-] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Instance spawned successfully.
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.366 183079 DEBUG nova.virt.libvirt.driver [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.380 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.393 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.395 183079 DEBUG oslo_concurrency.processutils [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cfe610a3-4dee-46ca-a82a-1c8993fdd52c/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.395 183079 DEBUG nova.virt.disk.api [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Cannot resize image /var/lib/nova/instances/cfe610a3-4dee-46ca-a82a-1c8993fdd52c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.396 183079 DEBUG nova.objects.instance [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lazy-loading 'migration_context' on Instance uuid cfe610a3-4dee-46ca-a82a-1c8993fdd52c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.398 183079 DEBUG nova.virt.libvirt.driver [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.398 183079 DEBUG nova.virt.libvirt.driver [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.398 183079 DEBUG nova.virt.libvirt.driver [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.399 183079 DEBUG nova.virt.libvirt.driver [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.399 183079 DEBUG nova.virt.libvirt.driver [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.399 183079 DEBUG nova.virt.libvirt.driver [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.425 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.426 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101969.339461, 7915ef96-3b31-447b-a4b5-1feeb4997869 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.426 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] VM Paused (Lifecycle Event)
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.428 183079 DEBUG nova.virt.libvirt.driver [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.428 183079 DEBUG nova.virt.libvirt.driver [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Ensure instance console log exists: /var/lib/nova/instances/cfe610a3-4dee-46ca-a82a-1c8993fdd52c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.428 183079 DEBUG oslo_concurrency.lockutils [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.429 183079 DEBUG oslo_concurrency.lockutils [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.429 183079 DEBUG oslo_concurrency.lockutils [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.453 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.457 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101969.3589504, 7915ef96-3b31-447b-a4b5-1feeb4997869 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.457 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] VM Resumed (Lifecycle Event)
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.467 183079 INFO nova.compute.manager [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Took 5.13 seconds to spawn the instance on the hypervisor.
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.468 183079 DEBUG nova.compute.manager [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.481 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.483 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:12:49 compute-0 podman[221063]: 2026-01-22 17:12:49.490993635 +0000 UTC m=+0.049682176 container create aadf64025bb13b881ecce5591040cd7c5d46fba17d2e4f89db588408b3f4acbc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.509 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.529 183079 INFO nova.compute.manager [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Took 5.62 seconds to build instance.
Jan 22 17:12:49 compute-0 systemd[1]: Started libpod-conmon-aadf64025bb13b881ecce5591040cd7c5d46fba17d2e4f89db588408b3f4acbc.scope.
Jan 22 17:12:49 compute-0 nova_compute[183075]: 2026-01-22 17:12:49.543 183079 DEBUG oslo_concurrency.lockutils [None req-f434599a-06b6-49e2-900a-56c959d624f9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "7915ef96-3b31-447b-a4b5-1feeb4997869" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.709s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:49 compute-0 podman[221063]: 2026-01-22 17:12:49.464793852 +0000 UTC m=+0.023482423 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:12:49 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:12:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c2012cf507196a4129ccacfa2a827c15e086993982512a9339f7011ae117a27e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:12:49 compute-0 podman[221063]: 2026-01-22 17:12:49.594853773 +0000 UTC m=+0.153542404 container init aadf64025bb13b881ecce5591040cd7c5d46fba17d2e4f89db588408b3f4acbc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 22 17:12:49 compute-0 podman[221063]: 2026-01-22 17:12:49.608114099 +0000 UTC m=+0.166802680 container start aadf64025bb13b881ecce5591040cd7c5d46fba17d2e4f89db588408b3f4acbc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 22 17:12:49 compute-0 neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e[221075]: [NOTICE]   (221082) : New worker (221084) forked
Jan 22 17:12:49 compute-0 neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e[221075]: [NOTICE]   (221082) : Loading success.
Jan 22 17:12:50 compute-0 nova_compute[183075]: 2026-01-22 17:12:50.285 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:50.387 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:12:50 compute-0 nova_compute[183075]: 2026-01-22 17:12:50.388 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:50.389 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:12:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:50.391 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:12:50 compute-0 nova_compute[183075]: 2026-01-22 17:12:50.663 183079 INFO nova.compute.manager [None req-418873f2-87c0-4a82-acca-7f6f728060a0 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Get console output
Jan 22 17:12:50 compute-0 nova_compute[183075]: 2026-01-22 17:12:50.671 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:12:51 compute-0 nova_compute[183075]: 2026-01-22 17:12:51.014 183079 DEBUG nova.network.neutron [req-ae82d368-a1f1-4103-b223-c7adf29349fa req-8c49c2ed-8611-4251-9221-3181a49b8430 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Updated VIF entry in instance network info cache for port d9e056da-23a3-44e3-b7a8-44b73622dbb1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:12:51 compute-0 nova_compute[183075]: 2026-01-22 17:12:51.015 183079 DEBUG nova.network.neutron [req-ae82d368-a1f1-4103-b223-c7adf29349fa req-8c49c2ed-8611-4251-9221-3181a49b8430 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Updating instance_info_cache with network_info: [{"id": "d9e056da-23a3-44e3-b7a8-44b73622dbb1", "address": "fa:16:3e:fc:05:dd", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd9e056da-23", "ovs_interfaceid": "d9e056da-23a3-44e3-b7a8-44b73622dbb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:12:51 compute-0 nova_compute[183075]: 2026-01-22 17:12:51.036 183079 DEBUG oslo_concurrency.lockutils [req-ae82d368-a1f1-4103-b223-c7adf29349fa req-8c49c2ed-8611-4251-9221-3181a49b8430 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-7915ef96-3b31-447b-a4b5-1feeb4997869" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:12:51 compute-0 nova_compute[183075]: 2026-01-22 17:12:51.216 183079 DEBUG nova.network.neutron [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Successfully updated port: 87f56506-3e49-4545-b8c0-8c58cbe49f15 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:12:51 compute-0 nova_compute[183075]: 2026-01-22 17:12:51.231 183079 DEBUG oslo_concurrency.lockutils [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "refresh_cache-cfe610a3-4dee-46ca-a82a-1c8993fdd52c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:12:51 compute-0 nova_compute[183075]: 2026-01-22 17:12:51.231 183079 DEBUG oslo_concurrency.lockutils [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquired lock "refresh_cache-cfe610a3-4dee-46ca-a82a-1c8993fdd52c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:12:51 compute-0 nova_compute[183075]: 2026-01-22 17:12:51.231 183079 DEBUG nova.network.neutron [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:12:51 compute-0 nova_compute[183075]: 2026-01-22 17:12:51.267 183079 DEBUG nova.compute.manager [req-b27effd9-1a57-43c9-bf3d-ace03121fdeb req-a286d25b-0069-4ea9-9604-94dbabf01701 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Received event network-vif-plugged-d9e056da-23a3-44e3-b7a8-44b73622dbb1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:12:51 compute-0 nova_compute[183075]: 2026-01-22 17:12:51.268 183079 DEBUG oslo_concurrency.lockutils [req-b27effd9-1a57-43c9-bf3d-ace03121fdeb req-a286d25b-0069-4ea9-9604-94dbabf01701 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "7915ef96-3b31-447b-a4b5-1feeb4997869-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:51 compute-0 nova_compute[183075]: 2026-01-22 17:12:51.269 183079 DEBUG oslo_concurrency.lockutils [req-b27effd9-1a57-43c9-bf3d-ace03121fdeb req-a286d25b-0069-4ea9-9604-94dbabf01701 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7915ef96-3b31-447b-a4b5-1feeb4997869-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:51 compute-0 nova_compute[183075]: 2026-01-22 17:12:51.270 183079 DEBUG oslo_concurrency.lockutils [req-b27effd9-1a57-43c9-bf3d-ace03121fdeb req-a286d25b-0069-4ea9-9604-94dbabf01701 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7915ef96-3b31-447b-a4b5-1feeb4997869-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:51 compute-0 nova_compute[183075]: 2026-01-22 17:12:51.270 183079 DEBUG nova.compute.manager [req-b27effd9-1a57-43c9-bf3d-ace03121fdeb req-a286d25b-0069-4ea9-9604-94dbabf01701 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] No waiting events found dispatching network-vif-plugged-d9e056da-23a3-44e3-b7a8-44b73622dbb1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:12:51 compute-0 nova_compute[183075]: 2026-01-22 17:12:51.271 183079 WARNING nova.compute.manager [req-b27effd9-1a57-43c9-bf3d-ace03121fdeb req-a286d25b-0069-4ea9-9604-94dbabf01701 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Received unexpected event network-vif-plugged-d9e056da-23a3-44e3-b7a8-44b73622dbb1 for instance with vm_state active and task_state None.
Jan 22 17:12:51 compute-0 podman[221104]: 2026-01-22 17:12:51.365859096 +0000 UTC m=+0.072605674 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 17:12:51 compute-0 nova_compute[183075]: 2026-01-22 17:12:51.379 183079 DEBUG nova.compute.manager [req-a0ff466b-b6a5-4501-aeff-b21fe8873a3a req-d38287c3-dbb4-4e50-970c-d7d03cc11666 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Received event network-changed-87f56506-3e49-4545-b8c0-8c58cbe49f15 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:12:51 compute-0 nova_compute[183075]: 2026-01-22 17:12:51.379 183079 DEBUG nova.compute.manager [req-a0ff466b-b6a5-4501-aeff-b21fe8873a3a req-d38287c3-dbb4-4e50-970c-d7d03cc11666 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Refreshing instance network info cache due to event network-changed-87f56506-3e49-4545-b8c0-8c58cbe49f15. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:12:51 compute-0 nova_compute[183075]: 2026-01-22 17:12:51.379 183079 DEBUG oslo_concurrency.lockutils [req-a0ff466b-b6a5-4501-aeff-b21fe8873a3a req-d38287c3-dbb4-4e50-970c-d7d03cc11666 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-cfe610a3-4dee-46ca-a82a-1c8993fdd52c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:12:51 compute-0 nova_compute[183075]: 2026-01-22 17:12:51.445 183079 DEBUG nova.network.neutron [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:12:51 compute-0 sshd-session[221102]: Received disconnect from 45.227.254.170 port 19325:11:  [preauth]
Jan 22 17:12:51 compute-0 sshd-session[221102]: Disconnected from authenticating user root 45.227.254.170 port 19325 [preauth]
Jan 22 17:12:52 compute-0 ovn_controller[95372]: 2026-01-22T17:12:52Z|00046|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:cd:f6:c4 10.100.0.19
Jan 22 17:12:52 compute-0 ovn_controller[95372]: 2026-01-22T17:12:52Z|00047|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:cd:f6:c4 10.100.0.19
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.325 183079 INFO nova.compute.manager [None req-9d3f2e85-6c26-4177-8513-d64dfc12d849 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Get console output
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.329 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.780 183079 DEBUG nova.network.neutron [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Updating instance_info_cache with network_info: [{"id": "87f56506-3e49-4545-b8c0-8c58cbe49f15", "address": "fa:16:3e:c6:8a:7a", "network": {"id": "3f5295ae-a32e-4595-80e2-a52b2a6b9934", "bridge": "br-int", "label": "tempest-test-network--1267664325", "subnets": [{"cidr": "10.10.210.0/24", "dns": [], "gateway": {"address": "10.10.210.254", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.210.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87f56506-3e", "ovs_interfaceid": "87f56506-3e49-4545-b8c0-8c58cbe49f15", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.800 183079 DEBUG oslo_concurrency.lockutils [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Releasing lock "refresh_cache-cfe610a3-4dee-46ca-a82a-1c8993fdd52c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.800 183079 DEBUG nova.compute.manager [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Instance network_info: |[{"id": "87f56506-3e49-4545-b8c0-8c58cbe49f15", "address": "fa:16:3e:c6:8a:7a", "network": {"id": "3f5295ae-a32e-4595-80e2-a52b2a6b9934", "bridge": "br-int", "label": "tempest-test-network--1267664325", "subnets": [{"cidr": "10.10.210.0/24", "dns": [], "gateway": {"address": "10.10.210.254", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.210.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87f56506-3e", "ovs_interfaceid": "87f56506-3e49-4545-b8c0-8c58cbe49f15", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.801 183079 DEBUG oslo_concurrency.lockutils [req-a0ff466b-b6a5-4501-aeff-b21fe8873a3a req-d38287c3-dbb4-4e50-970c-d7d03cc11666 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-cfe610a3-4dee-46ca-a82a-1c8993fdd52c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.801 183079 DEBUG nova.network.neutron [req-a0ff466b-b6a5-4501-aeff-b21fe8873a3a req-d38287c3-dbb4-4e50-970c-d7d03cc11666 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Refreshing network info cache for port 87f56506-3e49-4545-b8c0-8c58cbe49f15 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.806 183079 DEBUG nova.virt.libvirt.driver [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Start _get_guest_xml network_info=[{"id": "87f56506-3e49-4545-b8c0-8c58cbe49f15", "address": "fa:16:3e:c6:8a:7a", "network": {"id": "3f5295ae-a32e-4595-80e2-a52b2a6b9934", "bridge": "br-int", "label": "tempest-test-network--1267664325", "subnets": [{"cidr": "10.10.210.0/24", "dns": [], "gateway": {"address": "10.10.210.254", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.210.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87f56506-3e", "ovs_interfaceid": "87f56506-3e49-4545-b8c0-8c58cbe49f15", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.812 183079 WARNING nova.virt.libvirt.driver [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.819 183079 DEBUG nova.virt.libvirt.host [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.820 183079 DEBUG nova.virt.libvirt.host [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.830 183079 DEBUG nova.virt.libvirt.host [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.831 183079 DEBUG nova.virt.libvirt.host [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.832 183079 DEBUG nova.virt.libvirt.driver [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.832 183079 DEBUG nova.virt.hardware [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.833 183079 DEBUG nova.virt.hardware [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.833 183079 DEBUG nova.virt.hardware [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.833 183079 DEBUG nova.virt.hardware [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.834 183079 DEBUG nova.virt.hardware [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.834 183079 DEBUG nova.virt.hardware [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.834 183079 DEBUG nova.virt.hardware [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.834 183079 DEBUG nova.virt.hardware [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.835 183079 DEBUG nova.virt.hardware [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.835 183079 DEBUG nova.virt.hardware [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.835 183079 DEBUG nova.virt.hardware [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.842 183079 DEBUG nova.virt.libvirt.vif [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:12:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1791420059',display_name='tempest-server-test-1791420059',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1791420059',id=24,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFDbtc4hRCSv9gHmltB2G2DhAS2UXQKlSKw7wO8jzDWMIBVwAbtKYxxZ/33aOA3bpXNLm1KqC5K8xgOmfq2yl/h8qq6Gn/Z0l0pwwJ1+wDWZki4CnB3qyJDKSWstQshx5w==',key_name='tempest-keypair-test-612599565',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='26cca885d303443380036cbbe9e70744',ramdisk_id='',reservation_id='r-zfljqd21',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NetworkConnectivityTest-1809867331',owner_user_name='tempest-NetworkConnectivityTest-1809867331-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:12:49Z,user_data=None,user_id='4a7542774b9c42618cf9d00113f9d23d',uuid=cfe610a3-4dee-46ca-a82a-1c8993fdd52c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "87f56506-3e49-4545-b8c0-8c58cbe49f15", "address": "fa:16:3e:c6:8a:7a", "network": {"id": "3f5295ae-a32e-4595-80e2-a52b2a6b9934", "bridge": "br-int", "label": "tempest-test-network--1267664325", "subnets": [{"cidr": "10.10.210.0/24", "dns": [], "gateway": {"address": "10.10.210.254", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.210.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87f56506-3e", "ovs_interfaceid": "87f56506-3e49-4545-b8c0-8c58cbe49f15", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.843 183079 DEBUG nova.network.os_vif_util [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Converting VIF {"id": "87f56506-3e49-4545-b8c0-8c58cbe49f15", "address": "fa:16:3e:c6:8a:7a", "network": {"id": "3f5295ae-a32e-4595-80e2-a52b2a6b9934", "bridge": "br-int", "label": "tempest-test-network--1267664325", "subnets": [{"cidr": "10.10.210.0/24", "dns": [], "gateway": {"address": "10.10.210.254", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.210.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87f56506-3e", "ovs_interfaceid": "87f56506-3e49-4545-b8c0-8c58cbe49f15", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.844 183079 DEBUG nova.network.os_vif_util [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:8a:7a,bridge_name='br-int',has_traffic_filtering=True,id=87f56506-3e49-4545-b8c0-8c58cbe49f15,network=Network(3f5295ae-a32e-4595-80e2-a52b2a6b9934),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap87f56506-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.846 183079 DEBUG nova.objects.instance [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lazy-loading 'pci_devices' on Instance uuid cfe610a3-4dee-46ca-a82a-1c8993fdd52c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.865 183079 DEBUG nova.virt.libvirt.driver [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:12:52 compute-0 nova_compute[183075]:   <uuid>cfe610a3-4dee-46ca-a82a-1c8993fdd52c</uuid>
Jan 22 17:12:52 compute-0 nova_compute[183075]:   <name>instance-00000018</name>
Jan 22 17:12:52 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:12:52 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:12:52 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:12:52 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1791420059</nova:name>
Jan 22 17:12:52 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:12:52</nova:creationTime>
Jan 22 17:12:52 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:12:52 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:12:52 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:12:52 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:12:52 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:12:52 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:12:52 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:12:52 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:12:52 compute-0 nova_compute[183075]:         <nova:user uuid="4a7542774b9c42618cf9d00113f9d23d">tempest-NetworkConnectivityTest-1809867331-project-member</nova:user>
Jan 22 17:12:52 compute-0 nova_compute[183075]:         <nova:project uuid="26cca885d303443380036cbbe9e70744">tempest-NetworkConnectivityTest-1809867331</nova:project>
Jan 22 17:12:52 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:12:52 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:12:52 compute-0 nova_compute[183075]:         <nova:port uuid="87f56506-3e49-4545-b8c0-8c58cbe49f15">
Jan 22 17:12:52 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.10.210.20" ipVersion="4"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:12:52 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:12:52 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:12:52 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <system>
Jan 22 17:12:52 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:12:52 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:12:52 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:12:52 compute-0 nova_compute[183075]:       <entry name="serial">cfe610a3-4dee-46ca-a82a-1c8993fdd52c</entry>
Jan 22 17:12:52 compute-0 nova_compute[183075]:       <entry name="uuid">cfe610a3-4dee-46ca-a82a-1c8993fdd52c</entry>
Jan 22 17:12:52 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     </system>
Jan 22 17:12:52 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:12:52 compute-0 nova_compute[183075]:   <os>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:   </os>
Jan 22 17:12:52 compute-0 nova_compute[183075]:   <features>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:   </features>
Jan 22 17:12:52 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:12:52 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:12:52 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:12:52 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/cfe610a3-4dee-46ca-a82a-1c8993fdd52c/disk"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:12:52 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:c6:8a:7a"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:       <target dev="tap87f56506-3e"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:12:52 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/cfe610a3-4dee-46ca-a82a-1c8993fdd52c/console.log" append="off"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <video>
Jan 22 17:12:52 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     </video>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:12:52 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:12:52 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:12:52 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:12:52 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:12:52 compute-0 nova_compute[183075]: </domain>
Jan 22 17:12:52 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.866 183079 DEBUG nova.compute.manager [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Preparing to wait for external event network-vif-plugged-87f56506-3e49-4545-b8c0-8c58cbe49f15 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.866 183079 DEBUG oslo_concurrency.lockutils [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "cfe610a3-4dee-46ca-a82a-1c8993fdd52c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.866 183079 DEBUG oslo_concurrency.lockutils [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "cfe610a3-4dee-46ca-a82a-1c8993fdd52c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.867 183079 DEBUG oslo_concurrency.lockutils [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "cfe610a3-4dee-46ca-a82a-1c8993fdd52c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.867 183079 DEBUG nova.virt.libvirt.vif [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:12:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1791420059',display_name='tempest-server-test-1791420059',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1791420059',id=24,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFDbtc4hRCSv9gHmltB2G2DhAS2UXQKlSKw7wO8jzDWMIBVwAbtKYxxZ/33aOA3bpXNLm1KqC5K8xgOmfq2yl/h8qq6Gn/Z0l0pwwJ1+wDWZki4CnB3qyJDKSWstQshx5w==',key_name='tempest-keypair-test-612599565',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='26cca885d303443380036cbbe9e70744',ramdisk_id='',reservation_id='r-zfljqd21',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NetworkConnectivityTest-1809867331',owner_user_name='tempest-NetworkConnectivityTest-1809867331-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:12:49Z,user_data=None,user_id='4a7542774b9c42618cf9d00113f9d23d',uuid=cfe610a3-4dee-46ca-a82a-1c8993fdd52c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "87f56506-3e49-4545-b8c0-8c58cbe49f15", "address": "fa:16:3e:c6:8a:7a", "network": {"id": "3f5295ae-a32e-4595-80e2-a52b2a6b9934", "bridge": "br-int", "label": "tempest-test-network--1267664325", "subnets": [{"cidr": "10.10.210.0/24", "dns": [], "gateway": {"address": "10.10.210.254", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.210.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87f56506-3e", "ovs_interfaceid": "87f56506-3e49-4545-b8c0-8c58cbe49f15", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.868 183079 DEBUG nova.network.os_vif_util [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Converting VIF {"id": "87f56506-3e49-4545-b8c0-8c58cbe49f15", "address": "fa:16:3e:c6:8a:7a", "network": {"id": "3f5295ae-a32e-4595-80e2-a52b2a6b9934", "bridge": "br-int", "label": "tempest-test-network--1267664325", "subnets": [{"cidr": "10.10.210.0/24", "dns": [], "gateway": {"address": "10.10.210.254", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.210.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87f56506-3e", "ovs_interfaceid": "87f56506-3e49-4545-b8c0-8c58cbe49f15", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.868 183079 DEBUG nova.network.os_vif_util [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:8a:7a,bridge_name='br-int',has_traffic_filtering=True,id=87f56506-3e49-4545-b8c0-8c58cbe49f15,network=Network(3f5295ae-a32e-4595-80e2-a52b2a6b9934),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap87f56506-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.869 183079 DEBUG os_vif [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:8a:7a,bridge_name='br-int',has_traffic_filtering=True,id=87f56506-3e49-4545-b8c0-8c58cbe49f15,network=Network(3f5295ae-a32e-4595-80e2-a52b2a6b9934),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap87f56506-3e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.869 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.869 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.870 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.872 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.872 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap87f56506-3e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.873 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap87f56506-3e, col_values=(('external_ids', {'iface-id': '87f56506-3e49-4545-b8c0-8c58cbe49f15', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c6:8a:7a', 'vm-uuid': 'cfe610a3-4dee-46ca-a82a-1c8993fdd52c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.874 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:52 compute-0 NetworkManager[55454]: <info>  [1769101972.8751] manager: (tap87f56506-3e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/116)
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.876 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.880 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.881 183079 INFO os_vif [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:8a:7a,bridge_name='br-int',has_traffic_filtering=True,id=87f56506-3e49-4545-b8c0-8c58cbe49f15,network=Network(3f5295ae-a32e-4595-80e2-a52b2a6b9934),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap87f56506-3e')
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.941 183079 DEBUG nova.virt.libvirt.driver [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:12:52 compute-0 nova_compute[183075]: 2026-01-22 17:12:52.942 183079 DEBUG nova.virt.libvirt.driver [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] No VIF found with MAC fa:16:3e:c6:8a:7a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:12:53 compute-0 kernel: tap87f56506-3e: entered promiscuous mode
Jan 22 17:12:53 compute-0 NetworkManager[55454]: <info>  [1769101973.0049] manager: (tap87f56506-3e): new Tun device (/org/freedesktop/NetworkManager/Devices/117)
Jan 22 17:12:53 compute-0 ovn_controller[95372]: 2026-01-22T17:12:53Z|00264|binding|INFO|Claiming lport 87f56506-3e49-4545-b8c0-8c58cbe49f15 for this chassis.
Jan 22 17:12:53 compute-0 ovn_controller[95372]: 2026-01-22T17:12:53Z|00265|binding|INFO|87f56506-3e49-4545-b8c0-8c58cbe49f15: Claiming fa:16:3e:c6:8a:7a 10.10.210.20
Jan 22 17:12:53 compute-0 nova_compute[183075]: 2026-01-22 17:12:53.011 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:53.029 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:8a:7a 10.10.210.20'], port_security=['fa:16:3e:c6:8a:7a 10.10.210.20'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.10.210.20/24', 'neutron:device_id': 'cfe610a3-4dee-46ca-a82a-1c8993fdd52c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f5295ae-a32e-4595-80e2-a52b2a6b9934', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '26cca885d303443380036cbbe9e70744', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9427a67d-1313-4d60-b73e-5a3f81f9a54d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=688e5202-8125-409e-8a42-679a9dc31876, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=87f56506-3e49-4545-b8c0-8c58cbe49f15) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:53.031 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 87f56506-3e49-4545-b8c0-8c58cbe49f15 in datapath 3f5295ae-a32e-4595-80e2-a52b2a6b9934 bound to our chassis
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:53.034 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3f5295ae-a32e-4595-80e2-a52b2a6b9934
Jan 22 17:12:53 compute-0 ovn_controller[95372]: 2026-01-22T17:12:53Z|00266|binding|INFO|Setting lport 87f56506-3e49-4545-b8c0-8c58cbe49f15 ovn-installed in OVS
Jan 22 17:12:53 compute-0 ovn_controller[95372]: 2026-01-22T17:12:53Z|00267|binding|INFO|Setting lport 87f56506-3e49-4545-b8c0-8c58cbe49f15 up in Southbound
Jan 22 17:12:53 compute-0 nova_compute[183075]: 2026-01-22 17:12:53.039 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:53 compute-0 nova_compute[183075]: 2026-01-22 17:12:53.045 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:53.048 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[095a62e2-62ef-4f15-adcb-422cf84906a0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:53.049 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3f5295ae-a1 in ovnmeta-3f5295ae-a32e-4595-80e2-a52b2a6b9934 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:12:53 compute-0 systemd-udevd[221144]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:53.054 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3f5295ae-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:53.054 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[9c147e6b-f86c-4c50-a953-9a190cf57f08]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:53.055 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3eb1b70a-30af-41ca-8a40-66fadd3ed7ac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:53 compute-0 NetworkManager[55454]: <info>  [1769101973.0634] device (tap87f56506-3e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:12:53 compute-0 NetworkManager[55454]: <info>  [1769101973.0641] device (tap87f56506-3e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:53.066 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[d742b20d-d185-44d4-9769-a0b09f9c17e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:53 compute-0 systemd-machined[154382]: New machine qemu-24-instance-00000018.
Jan 22 17:12:53 compute-0 systemd[1]: Started Virtual Machine qemu-24-instance-00000018.
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:53.095 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[27ccea33-b599-4efa-93dc-996a96f0af77]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:53.128 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[3726d888-a913-4e38-a3f9-2e02f0addb47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:53 compute-0 NetworkManager[55454]: <info>  [1769101973.1352] manager: (tap3f5295ae-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/118)
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:53.134 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a8218008-4a77-431e-8e3c-9a3deb7e0ff2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:53.168 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[d0107185-b57c-4f1f-8869-3583ee1a4aee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:53.171 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[e10c6acb-7db1-41a4-9059-e591aee17ec5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:53 compute-0 NetworkManager[55454]: <info>  [1769101973.1979] device (tap3f5295ae-a0): carrier: link connected
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:53.203 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[aa54829f-7f80-47a4-8764-4cb7a43b3184]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:53.222 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f30564f8-1045-40fc-9dee-b0bfd516e2fc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3f5295ae-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:61:8b:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 74], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 433689, 'reachable_time': 42430, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221178, 'error': None, 'target': 'ovnmeta-3f5295ae-a32e-4595-80e2-a52b2a6b9934', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:53.240 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e19bb26a-4873-4024-81b6-2fa9bce1f66a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe61:8b02'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 433689, 'tstamp': 433689}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221179, 'error': None, 'target': 'ovnmeta-3f5295ae-a32e-4595-80e2-a52b2a6b9934', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:53.257 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[fe8a9709-8371-4295-96e8-3ac627215cac]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3f5295ae-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:61:8b:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 74], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 433689, 'reachable_time': 42430, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221180, 'error': None, 'target': 'ovnmeta-3f5295ae-a32e-4595-80e2-a52b2a6b9934', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:53.289 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[86502c62-8aee-41b1-8c8f-1d3af1ba144d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:53.348 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b5d9978e-17f9-478f-be89-68a00f86c859]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:53.349 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3f5295ae-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:53.350 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:53.350 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3f5295ae-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:12:53 compute-0 NetworkManager[55454]: <info>  [1769101973.3528] manager: (tap3f5295ae-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/119)
Jan 22 17:12:53 compute-0 kernel: tap3f5295ae-a0: entered promiscuous mode
Jan 22 17:12:53 compute-0 nova_compute[183075]: 2026-01-22 17:12:53.354 183079 DEBUG nova.compute.manager [req-14b67059-7386-42c3-b578-3289acd4673a req-4e2432b7-24f0-49ad-a267-627252a0ba37 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Received event network-vif-plugged-87f56506-3e49-4545-b8c0-8c58cbe49f15 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:12:53 compute-0 nova_compute[183075]: 2026-01-22 17:12:53.354 183079 DEBUG oslo_concurrency.lockutils [req-14b67059-7386-42c3-b578-3289acd4673a req-4e2432b7-24f0-49ad-a267-627252a0ba37 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "cfe610a3-4dee-46ca-a82a-1c8993fdd52c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:53 compute-0 nova_compute[183075]: 2026-01-22 17:12:53.354 183079 DEBUG oslo_concurrency.lockutils [req-14b67059-7386-42c3-b578-3289acd4673a req-4e2432b7-24f0-49ad-a267-627252a0ba37 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "cfe610a3-4dee-46ca-a82a-1c8993fdd52c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:53 compute-0 nova_compute[183075]: 2026-01-22 17:12:53.355 183079 DEBUG oslo_concurrency.lockutils [req-14b67059-7386-42c3-b578-3289acd4673a req-4e2432b7-24f0-49ad-a267-627252a0ba37 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "cfe610a3-4dee-46ca-a82a-1c8993fdd52c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:53 compute-0 nova_compute[183075]: 2026-01-22 17:12:53.355 183079 DEBUG nova.compute.manager [req-14b67059-7386-42c3-b578-3289acd4673a req-4e2432b7-24f0-49ad-a267-627252a0ba37 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Processing event network-vif-plugged-87f56506-3e49-4545-b8c0-8c58cbe49f15 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:53.355 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3f5295ae-a0, col_values=(('external_ids', {'iface-id': '02b0d2da-138a-4eeb-a429-bc777aacaea1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:12:53 compute-0 nova_compute[183075]: 2026-01-22 17:12:53.355 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:53 compute-0 ovn_controller[95372]: 2026-01-22T17:12:53Z|00268|binding|INFO|Releasing lport 02b0d2da-138a-4eeb-a429-bc777aacaea1 from this chassis (sb_readonly=0)
Jan 22 17:12:53 compute-0 nova_compute[183075]: 2026-01-22 17:12:53.369 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:53.370 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3f5295ae-a32e-4595-80e2-a52b2a6b9934.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3f5295ae-a32e-4595-80e2-a52b2a6b9934.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:53.371 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[47e99eb3-68fc-45bd-b0c1-d0ffb4079442]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:53.372 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-3f5295ae-a32e-4595-80e2-a52b2a6b9934
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/3f5295ae-a32e-4595-80e2-a52b2a6b9934.pid.haproxy
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 3f5295ae-a32e-4595-80e2-a52b2a6b9934
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:12:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:53.374 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3f5295ae-a32e-4595-80e2-a52b2a6b9934', 'env', 'PROCESS_TAG=haproxy-3f5295ae-a32e-4595-80e2-a52b2a6b9934', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3f5295ae-a32e-4595-80e2-a52b2a6b9934.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:12:53 compute-0 podman[221212]: 2026-01-22 17:12:53.756779371 +0000 UTC m=+0.063511777 container create 43800d9786d9454fc4ffb8be77c82fa1d377458ca9fa029d83ecb4fe8282be11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f5295ae-a32e-4595-80e2-a52b2a6b9934, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:12:53 compute-0 systemd[1]: Started libpod-conmon-43800d9786d9454fc4ffb8be77c82fa1d377458ca9fa029d83ecb4fe8282be11.scope.
Jan 22 17:12:53 compute-0 podman[221212]: 2026-01-22 17:12:53.720209368 +0000 UTC m=+0.026941804 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:12:53 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:12:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0fe8517455866911bc916d8d9b4447d7603db5756319ac74770d1a46e7bf747d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:12:53 compute-0 podman[221212]: 2026-01-22 17:12:53.862691572 +0000 UTC m=+0.169423978 container init 43800d9786d9454fc4ffb8be77c82fa1d377458ca9fa029d83ecb4fe8282be11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f5295ae-a32e-4595-80e2-a52b2a6b9934, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:12:53 compute-0 podman[221212]: 2026-01-22 17:12:53.86875006 +0000 UTC m=+0.175482506 container start 43800d9786d9454fc4ffb8be77c82fa1d377458ca9fa029d83ecb4fe8282be11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f5295ae-a32e-4595-80e2-a52b2a6b9934, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 17:12:53 compute-0 neutron-haproxy-ovnmeta-3f5295ae-a32e-4595-80e2-a52b2a6b9934[221227]: [NOTICE]   (221231) : New worker (221233) forked
Jan 22 17:12:53 compute-0 neutron-haproxy-ovnmeta-3f5295ae-a32e-4595-80e2-a52b2a6b9934[221227]: [NOTICE]   (221231) : Loading success.
Jan 22 17:12:54 compute-0 nova_compute[183075]: 2026-01-22 17:12:54.164 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101974.1643028, cfe610a3-4dee-46ca-a82a-1c8993fdd52c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:12:54 compute-0 nova_compute[183075]: 2026-01-22 17:12:54.165 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] VM Started (Lifecycle Event)
Jan 22 17:12:54 compute-0 nova_compute[183075]: 2026-01-22 17:12:54.167 183079 DEBUG nova.compute.manager [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:12:54 compute-0 nova_compute[183075]: 2026-01-22 17:12:54.171 183079 DEBUG nova.virt.libvirt.driver [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:12:54 compute-0 nova_compute[183075]: 2026-01-22 17:12:54.175 183079 INFO nova.virt.libvirt.driver [-] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Instance spawned successfully.
Jan 22 17:12:54 compute-0 nova_compute[183075]: 2026-01-22 17:12:54.175 183079 DEBUG nova.virt.libvirt.driver [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:12:54 compute-0 nova_compute[183075]: 2026-01-22 17:12:54.196 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:12:54 compute-0 nova_compute[183075]: 2026-01-22 17:12:54.201 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:12:54 compute-0 nova_compute[183075]: 2026-01-22 17:12:54.205 183079 DEBUG nova.virt.libvirt.driver [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:12:54 compute-0 nova_compute[183075]: 2026-01-22 17:12:54.206 183079 DEBUG nova.virt.libvirt.driver [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:12:54 compute-0 nova_compute[183075]: 2026-01-22 17:12:54.206 183079 DEBUG nova.virt.libvirt.driver [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:12:54 compute-0 nova_compute[183075]: 2026-01-22 17:12:54.207 183079 DEBUG nova.virt.libvirt.driver [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:12:54 compute-0 nova_compute[183075]: 2026-01-22 17:12:54.207 183079 DEBUG nova.virt.libvirt.driver [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:12:54 compute-0 nova_compute[183075]: 2026-01-22 17:12:54.208 183079 DEBUG nova.virt.libvirt.driver [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:12:54 compute-0 nova_compute[183075]: 2026-01-22 17:12:54.242 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:12:54 compute-0 nova_compute[183075]: 2026-01-22 17:12:54.243 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101974.16457, cfe610a3-4dee-46ca-a82a-1c8993fdd52c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:12:54 compute-0 nova_compute[183075]: 2026-01-22 17:12:54.243 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] VM Paused (Lifecycle Event)
Jan 22 17:12:54 compute-0 nova_compute[183075]: 2026-01-22 17:12:54.269 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:12:54 compute-0 nova_compute[183075]: 2026-01-22 17:12:54.273 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769101974.1714256, cfe610a3-4dee-46ca-a82a-1c8993fdd52c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:12:54 compute-0 nova_compute[183075]: 2026-01-22 17:12:54.273 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] VM Resumed (Lifecycle Event)
Jan 22 17:12:54 compute-0 nova_compute[183075]: 2026-01-22 17:12:54.278 183079 INFO nova.compute.manager [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Took 5.18 seconds to spawn the instance on the hypervisor.
Jan 22 17:12:54 compute-0 nova_compute[183075]: 2026-01-22 17:12:54.278 183079 DEBUG nova.compute.manager [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:12:54 compute-0 nova_compute[183075]: 2026-01-22 17:12:54.289 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:12:54 compute-0 nova_compute[183075]: 2026-01-22 17:12:54.292 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:12:54 compute-0 nova_compute[183075]: 2026-01-22 17:12:54.295 183079 DEBUG nova.network.neutron [req-a0ff466b-b6a5-4501-aeff-b21fe8873a3a req-d38287c3-dbb4-4e50-970c-d7d03cc11666 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Updated VIF entry in instance network info cache for port 87f56506-3e49-4545-b8c0-8c58cbe49f15. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:12:54 compute-0 nova_compute[183075]: 2026-01-22 17:12:54.296 183079 DEBUG nova.network.neutron [req-a0ff466b-b6a5-4501-aeff-b21fe8873a3a req-d38287c3-dbb4-4e50-970c-d7d03cc11666 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Updating instance_info_cache with network_info: [{"id": "87f56506-3e49-4545-b8c0-8c58cbe49f15", "address": "fa:16:3e:c6:8a:7a", "network": {"id": "3f5295ae-a32e-4595-80e2-a52b2a6b9934", "bridge": "br-int", "label": "tempest-test-network--1267664325", "subnets": [{"cidr": "10.10.210.0/24", "dns": [], "gateway": {"address": "10.10.210.254", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.210.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87f56506-3e", "ovs_interfaceid": "87f56506-3e49-4545-b8c0-8c58cbe49f15", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:12:54 compute-0 nova_compute[183075]: 2026-01-22 17:12:54.326 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:12:54 compute-0 nova_compute[183075]: 2026-01-22 17:12:54.327 183079 DEBUG oslo_concurrency.lockutils [req-a0ff466b-b6a5-4501-aeff-b21fe8873a3a req-d38287c3-dbb4-4e50-970c-d7d03cc11666 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-cfe610a3-4dee-46ca-a82a-1c8993fdd52c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:12:54 compute-0 nova_compute[183075]: 2026-01-22 17:12:54.347 183079 INFO nova.compute.manager [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Took 5.89 seconds to build instance.
Jan 22 17:12:54 compute-0 nova_compute[183075]: 2026-01-22 17:12:54.371 183079 DEBUG oslo_concurrency.lockutils [None req-8d22cc57-6b21-4904-881b-2639dcc9bc78 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "cfe610a3-4dee-46ca-a82a-1c8993fdd52c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.976s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:55 compute-0 nova_compute[183075]: 2026-01-22 17:12:55.326 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:55 compute-0 nova_compute[183075]: 2026-01-22 17:12:55.354 183079 INFO nova.compute.manager [None req-ea6ef976-087c-4229-8bf6-82bd9af7614c 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Get console output
Jan 22 17:12:55 compute-0 nova_compute[183075]: 2026-01-22 17:12:55.359 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:12:55 compute-0 nova_compute[183075]: 2026-01-22 17:12:55.422 183079 DEBUG nova.compute.manager [req-4f89c072-86bb-448b-9204-c4c1646e050f req-6badd293-6e31-4488-90c4-da3d19621060 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Received event network-vif-plugged-87f56506-3e49-4545-b8c0-8c58cbe49f15 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:12:55 compute-0 nova_compute[183075]: 2026-01-22 17:12:55.423 183079 DEBUG oslo_concurrency.lockutils [req-4f89c072-86bb-448b-9204-c4c1646e050f req-6badd293-6e31-4488-90c4-da3d19621060 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "cfe610a3-4dee-46ca-a82a-1c8993fdd52c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:12:55 compute-0 nova_compute[183075]: 2026-01-22 17:12:55.423 183079 DEBUG oslo_concurrency.lockutils [req-4f89c072-86bb-448b-9204-c4c1646e050f req-6badd293-6e31-4488-90c4-da3d19621060 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "cfe610a3-4dee-46ca-a82a-1c8993fdd52c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:12:55 compute-0 nova_compute[183075]: 2026-01-22 17:12:55.424 183079 DEBUG oslo_concurrency.lockutils [req-4f89c072-86bb-448b-9204-c4c1646e050f req-6badd293-6e31-4488-90c4-da3d19621060 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "cfe610a3-4dee-46ca-a82a-1c8993fdd52c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:12:55 compute-0 nova_compute[183075]: 2026-01-22 17:12:55.424 183079 DEBUG nova.compute.manager [req-4f89c072-86bb-448b-9204-c4c1646e050f req-6badd293-6e31-4488-90c4-da3d19621060 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] No waiting events found dispatching network-vif-plugged-87f56506-3e49-4545-b8c0-8c58cbe49f15 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:12:55 compute-0 nova_compute[183075]: 2026-01-22 17:12:55.424 183079 WARNING nova.compute.manager [req-4f89c072-86bb-448b-9204-c4c1646e050f req-6badd293-6e31-4488-90c4-da3d19621060 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Received unexpected event network-vif-plugged-87f56506-3e49-4545-b8c0-8c58cbe49f15 for instance with vm_state active and task_state None.
Jan 22 17:12:55 compute-0 nova_compute[183075]: 2026-01-22 17:12:55.842 183079 INFO nova.compute.manager [None req-d2ceb180-77e0-4d6f-a98c-9cc2ec415803 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Get console output
Jan 22 17:12:55 compute-0 nova_compute[183075]: 2026-01-22 17:12:55.848 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:12:57 compute-0 nova_compute[183075]: 2026-01-22 17:12:57.483 183079 INFO nova.compute.manager [None req-b9d7540b-01cd-4aea-8016-7179e8d1d2ee cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Get console output
Jan 22 17:12:57 compute-0 nova_compute[183075]: 2026-01-22 17:12:57.875 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:12:58 compute-0 podman[221250]: 2026-01-22 17:12:58.365027596 +0000 UTC m=+0.061961697 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 17:12:58 compute-0 podman[221251]: 2026-01-22 17:12:58.399185166 +0000 UTC m=+0.085255054 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, architecture=x86_64, config_id=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vendor=Red Hat, Inc., description=The Universal Base Image Minimal 
is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., container_name=openstack_network_exporter)
Jan 22 17:12:58 compute-0 podman[221249]: 2026-01-22 17:12:58.403509309 +0000 UTC m=+0.104507216 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 17:12:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:59.141 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:12:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:12:59.142 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:12:59 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:12:59 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:12:59 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:12:59 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:12:59 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:12:59 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.19
Jan 22 17:12:59 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cbad0d35-4bf3-49f1-bb21-0be199e1e42e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.157 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.158 104990 INFO eventlet.wsgi.server [-] 10.100.0.19,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 1.0156093
Jan 22 17:13:00 compute-0 haproxy-metadata-proxy-cbad0d35-4bf3-49f1-bb21-0be199e1e42e[220911]: 10.100.0.19:56826 [22/Jan/2026:17:12:59.140] listener listener/metadata 0/0/0/1017/1017 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.168 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.168 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.19
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cbad0d35-4bf3-49f1-bb21-0be199e1e42e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.182 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:00 compute-0 haproxy-metadata-proxy-cbad0d35-4bf3-49f1-bb21-0be199e1e42e[220911]: 10.100.0.19:56838 [22/Jan/2026:17:13:00.167] listener listener/metadata 0/0/0/14/14 200 152 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.183 104990 INFO eventlet.wsgi.server [-] 10.100.0.19,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 168 time: 0.0145519
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.186 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.187 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.19
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cbad0d35-4bf3-49f1-bb21-0be199e1e42e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.206 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.206 104990 INFO eventlet.wsgi.server [-] 10.100.0.19,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0191793
Jan 22 17:13:00 compute-0 haproxy-metadata-proxy-cbad0d35-4bf3-49f1-bb21-0be199e1e42e[220911]: 10.100.0.19:56846 [22/Jan/2026:17:13:00.186] listener listener/metadata 0/0/0/19/19 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.211 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.212 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.19
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cbad0d35-4bf3-49f1-bb21-0be199e1e42e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.233 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.234 104990 INFO eventlet.wsgi.server [-] 10.100.0.19,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0216095
Jan 22 17:13:00 compute-0 haproxy-metadata-proxy-cbad0d35-4bf3-49f1-bb21-0be199e1e42e[220911]: 10.100.0.19:56848 [22/Jan/2026:17:13:00.211] listener listener/metadata 0/0/0/22/22 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.239 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.239 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.19
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cbad0d35-4bf3-49f1-bb21-0be199e1e42e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.254 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.255 104990 INFO eventlet.wsgi.server [-] 10.100.0.19,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0152564
Jan 22 17:13:00 compute-0 haproxy-metadata-proxy-cbad0d35-4bf3-49f1-bb21-0be199e1e42e[220911]: 10.100.0.19:56854 [22/Jan/2026:17:13:00.239] listener listener/metadata 0/0/0/15/15 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.260 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.260 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.19
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cbad0d35-4bf3-49f1-bb21-0be199e1e42e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.275 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:00 compute-0 haproxy-metadata-proxy-cbad0d35-4bf3-49f1-bb21-0be199e1e42e[220911]: 10.100.0.19:56862 [22/Jan/2026:17:13:00.259] listener listener/metadata 0/0/0/15/15 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.275 104990 INFO eventlet.wsgi.server [-] 10.100.0.19,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0149083
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.280 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.280 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.19
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cbad0d35-4bf3-49f1-bb21-0be199e1e42e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.293 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:00 compute-0 haproxy-metadata-proxy-cbad0d35-4bf3-49f1-bb21-0be199e1e42e[220911]: 10.100.0.19:56866 [22/Jan/2026:17:13:00.280] listener listener/metadata 0/0/0/13/13 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.294 104990 INFO eventlet.wsgi.server [-] 10.100.0.19,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0134304
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.298 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.299 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.19
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cbad0d35-4bf3-49f1-bb21-0be199e1e42e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.322 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.322 104990 INFO eventlet.wsgi.server [-] 10.100.0.19,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0236623
Jan 22 17:13:00 compute-0 haproxy-metadata-proxy-cbad0d35-4bf3-49f1-bb21-0be199e1e42e[220911]: 10.100.0.19:56874 [22/Jan/2026:17:13:00.298] listener listener/metadata 0/0/0/24/24 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.350 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.352 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.19
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cbad0d35-4bf3-49f1-bb21-0be199e1e42e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:00 compute-0 nova_compute[183075]: 2026-01-22 17:13:00.351 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.371 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.371 104990 INFO eventlet.wsgi.server [-] 10.100.0.19,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0195475
Jan 22 17:13:00 compute-0 haproxy-metadata-proxy-cbad0d35-4bf3-49f1-bb21-0be199e1e42e[220911]: 10.100.0.19:56878 [22/Jan/2026:17:13:00.328] listener listener/metadata 0/0/0/43/43 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.377 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.378 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.19
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cbad0d35-4bf3-49f1-bb21-0be199e1e42e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.396 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.396 104990 INFO eventlet.wsgi.server [-] 10.100.0.19,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0186241
Jan 22 17:13:00 compute-0 haproxy-metadata-proxy-cbad0d35-4bf3-49f1-bb21-0be199e1e42e[220911]: 10.100.0.19:56882 [22/Jan/2026:17:13:00.377] listener listener/metadata 0/0/0/19/19 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.401 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.402 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.19
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cbad0d35-4bf3-49f1-bb21-0be199e1e42e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:00 compute-0 haproxy-metadata-proxy-cbad0d35-4bf3-49f1-bb21-0be199e1e42e[220911]: 10.100.0.19:56894 [22/Jan/2026:17:13:00.401] listener listener/metadata 0/0/0/20/20 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.422 104990 INFO eventlet.wsgi.server [-] 10.100.0.19,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0199602
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.434 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.436 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.19
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cbad0d35-4bf3-49f1-bb21-0be199e1e42e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.463 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:00 compute-0 haproxy-metadata-proxy-cbad0d35-4bf3-49f1-bb21-0be199e1e42e[220911]: 10.100.0.19:56904 [22/Jan/2026:17:13:00.434] listener listener/metadata 0/0/0/29/29 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.463 104990 INFO eventlet.wsgi.server [-] 10.100.0.19,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0278759
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.468 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.469 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.19
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cbad0d35-4bf3-49f1-bb21-0be199e1e42e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.487 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:00 compute-0 haproxy-metadata-proxy-cbad0d35-4bf3-49f1-bb21-0be199e1e42e[220911]: 10.100.0.19:56918 [22/Jan/2026:17:13:00.468] listener listener/metadata 0/0/0/20/20 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.488 104990 INFO eventlet.wsgi.server [-] 10.100.0.19,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0192227
Jan 22 17:13:00 compute-0 nova_compute[183075]: 2026-01-22 17:13:00.489 183079 INFO nova.compute.manager [None req-5012f069-d944-4fc7-a52e-fc085d05a27b 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Get console output
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.494 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.495 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.19
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cbad0d35-4bf3-49f1-bb21-0be199e1e42e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.519 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:00 compute-0 haproxy-metadata-proxy-cbad0d35-4bf3-49f1-bb21-0be199e1e42e[220911]: 10.100.0.19:56926 [22/Jan/2026:17:13:00.493] listener listener/metadata 0/0/0/26/26 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.519 104990 INFO eventlet.wsgi.server [-] 10.100.0.19,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0242212
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.526 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.527 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.19
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cbad0d35-4bf3-49f1-bb21-0be199e1e42e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.543 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.544 104990 INFO eventlet.wsgi.server [-] 10.100.0.19,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0174108
Jan 22 17:13:00 compute-0 haproxy-metadata-proxy-cbad0d35-4bf3-49f1-bb21-0be199e1e42e[220911]: 10.100.0.19:56930 [22/Jan/2026:17:13:00.525] listener listener/metadata 0/0/0/18/18 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.550 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.551 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.19
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cbad0d35-4bf3-49f1-bb21-0be199e1e42e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.574 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:00 compute-0 haproxy-metadata-proxy-cbad0d35-4bf3-49f1-bb21-0be199e1e42e[220911]: 10.100.0.19:56940 [22/Jan/2026:17:13:00.549] listener listener/metadata 0/0/0/25/25 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:13:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:00.575 104990 INFO eventlet.wsgi.server [-] 10.100.0.19,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0239763
Jan 22 17:13:00 compute-0 nova_compute[183075]: 2026-01-22 17:13:00.978 183079 INFO nova.compute.manager [None req-e24b8fe4-9393-4ab0-9fbc-d72f6bde4310 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Get console output
Jan 22 17:13:00 compute-0 nova_compute[183075]: 2026-01-22 17:13:00.984 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:13:01 compute-0 ovn_controller[95372]: 2026-01-22T17:13:01Z|00048|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:fc:05:dd 10.100.0.27
Jan 22 17:13:01 compute-0 ovn_controller[95372]: 2026-01-22T17:13:01Z|00049|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:fc:05:dd 10.100.0.27
Jan 22 17:13:02 compute-0 nova_compute[183075]: 2026-01-22 17:13:02.680 183079 INFO nova.compute.manager [None req-e8112bda-69cd-460f-a541-61d65455f7e9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Get console output
Jan 22 17:13:02 compute-0 nova_compute[183075]: 2026-01-22 17:13:02.876 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:04 compute-0 podman[221322]: 2026-01-22 17:13:04.359942292 +0000 UTC m=+0.070065298 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Jan 22 17:13:05 compute-0 nova_compute[183075]: 2026-01-22 17:13:05.354 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:05 compute-0 nova_compute[183075]: 2026-01-22 17:13:05.637 183079 INFO nova.compute.manager [None req-cec55028-1f17-4b52-9198-e6aeb497916e 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Get console output
Jan 22 17:13:05 compute-0 nova_compute[183075]: 2026-01-22 17:13:05.644 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:13:06 compute-0 nova_compute[183075]: 2026-01-22 17:13:06.100 183079 INFO nova.compute.manager [None req-1b98681a-4963-45c6-b108-bfe97acb72fb 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Get console output
Jan 22 17:13:06 compute-0 nova_compute[183075]: 2026-01-22 17:13:06.105 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:13:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:06.900 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:06.901 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:13:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.27
Jan 22 17:13:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:07 compute-0 ovn_controller[95372]: 2026-01-22T17:13:07Z|00050|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c6:8a:7a 10.10.210.20
Jan 22 17:13:07 compute-0 ovn_controller[95372]: 2026-01-22T17:13:07Z|00051|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c6:8a:7a 10.10.210.20
Jan 22 17:13:07 compute-0 nova_compute[183075]: 2026-01-22 17:13:07.816 183079 INFO nova.compute.manager [None req-700697a9-73ad-4825-b876-758d993fb282 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Get console output
Jan 22 17:13:07 compute-0 nova_compute[183075]: 2026-01-22 17:13:07.824 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:13:07 compute-0 nova_compute[183075]: 2026-01-22 17:13:07.877 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:08 compute-0 nova_compute[183075]: 2026-01-22 17:13:08.020 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:13:08 compute-0 nova_compute[183075]: 2026-01-22 17:13:08.021 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.141 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:08 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[221084]: 10.100.0.27:40646 [22/Jan/2026:17:13:06.899] listener listener/metadata 0/0/0/1243/1243 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.143 104990 INFO eventlet.wsgi.server [-] 10.100.0.27,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 1.2407224
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.164 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.165 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.27
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.183 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.184 104990 INFO eventlet.wsgi.server [-] 10.100.0.27,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 168 time: 0.0187938
Jan 22 17:13:08 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[221084]: 10.100.0.27:40660 [22/Jan/2026:17:13:08.163] listener listener/metadata 0/0/0/20/20 200 152 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.191 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.192 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.27
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.213 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:08 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[221084]: 10.100.0.27:40674 [22/Jan/2026:17:13:08.191] listener listener/metadata 0/0/0/22/22 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.213 104990 INFO eventlet.wsgi.server [-] 10.100.0.27,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0212479
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.226 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.227 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.27
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.246 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.247 104990 INFO eventlet.wsgi.server [-] 10.100.0.27,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0195136
Jan 22 17:13:08 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[221084]: 10.100.0.27:40678 [22/Jan/2026:17:13:08.225] listener listener/metadata 0/0/0/21/21 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.257 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.258 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.27
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.279 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.279 104990 INFO eventlet.wsgi.server [-] 10.100.0.27,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0209229
Jan 22 17:13:08 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[221084]: 10.100.0.27:40682 [22/Jan/2026:17:13:08.257] listener listener/metadata 0/0/0/22/22 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.289 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.290 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.27
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.313 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.313 104990 INFO eventlet.wsgi.server [-] 10.100.0.27,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0231414
Jan 22 17:13:08 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[221084]: 10.100.0.27:40694 [22/Jan/2026:17:13:08.288] listener listener/metadata 0/0/0/25/25 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.319 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.320 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.27
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.337 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.338 104990 INFO eventlet.wsgi.server [-] 10.100.0.27,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0181909
Jan 22 17:13:08 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[221084]: 10.100.0.27:40706 [22/Jan/2026:17:13:08.319] listener listener/metadata 0/0/0/19/19 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.343 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.343 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.27
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.363 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.364 104990 INFO eventlet.wsgi.server [-] 10.100.0.27,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 151 time: 0.0203266
Jan 22 17:13:08 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[221084]: 10.100.0.27:40714 [22/Jan/2026:17:13:08.342] listener listener/metadata 0/0/0/21/21 200 135 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.370 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.370 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.27
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.395 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:08 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[221084]: 10.100.0.27:40720 [22/Jan/2026:17:13:08.369] listener listener/metadata 0/0/0/26/26 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.395 104990 INFO eventlet.wsgi.server [-] 10.100.0.27,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0251682
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.401 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.402 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.27
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.427 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.428 104990 INFO eventlet.wsgi.server [-] 10.100.0.27,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0256443
Jan 22 17:13:08 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[221084]: 10.100.0.27:40724 [22/Jan/2026:17:13:08.400] listener listener/metadata 0/0/0/27/27 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.434 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.435 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.27
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:08 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[221084]: 10.100.0.27:40728 [22/Jan/2026:17:13:08.433] listener listener/metadata 0/0/0/25/25 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.459 104990 INFO eventlet.wsgi.server [-] 10.100.0.27,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0244887
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.471 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.471 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.27
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.491 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:08 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[221084]: 10.100.0.27:40730 [22/Jan/2026:17:13:08.470] listener listener/metadata 0/0/0/21/21 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.492 104990 INFO eventlet.wsgi.server [-] 10.100.0.27,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0204515
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.498 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.499 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.27
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.519 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.520 104990 INFO eventlet.wsgi.server [-] 10.100.0.27,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0208905
Jan 22 17:13:08 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[221084]: 10.100.0.27:40732 [22/Jan/2026:17:13:08.498] listener listener/metadata 0/0/0/22/22 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.526 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.527 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.27
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.547 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.548 104990 INFO eventlet.wsgi.server [-] 10.100.0.27,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0210025
Jan 22 17:13:08 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[221084]: 10.100.0.27:40740 [22/Jan/2026:17:13:08.525] listener listener/metadata 0/0/0/22/22 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.555 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.556 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.27
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.582 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.582 104990 INFO eventlet.wsgi.server [-] 10.100.0.27,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0261304
Jan 22 17:13:08 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[221084]: 10.100.0.27:40742 [22/Jan/2026:17:13:08.555] listener listener/metadata 0/0/0/27/27 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.593 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.594 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.27
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.619 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:08.619 104990 INFO eventlet.wsgi.server [-] 10.100.0.27,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0248394
Jan 22 17:13:08 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[221084]: 10.100.0.27:40748 [22/Jan/2026:17:13:08.593] listener listener/metadata 0/0/0/26/26 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:13:09 compute-0 nova_compute[183075]: 2026-01-22 17:13:09.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:13:10 compute-0 nova_compute[183075]: 2026-01-22 17:13:10.356 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:10 compute-0 nova_compute[183075]: 2026-01-22 17:13:10.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:13:10 compute-0 nova_compute[183075]: 2026-01-22 17:13:10.805 183079 INFO nova.compute.manager [None req-b4d5e402-372a-4c20-a1f4-f04896418e0f 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Get console output
Jan 22 17:13:10 compute-0 nova_compute[183075]: 2026-01-22 17:13:10.809 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:13:11 compute-0 nova_compute[183075]: 2026-01-22 17:13:11.259 183079 INFO nova.compute.manager [None req-7f80eed4-5ea6-4c71-9cf6-f3cf340f712f 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Get console output
Jan 22 17:13:11 compute-0 nova_compute[183075]: 2026-01-22 17:13:11.267 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:13:12 compute-0 podman[221358]: 2026-01-22 17:13:12.382439468 +0000 UTC m=+0.076578317 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:13:12 compute-0 nova_compute[183075]: 2026-01-22 17:13:12.885 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:13 compute-0 nova_compute[183075]: 2026-01-22 17:13:13.027 183079 INFO nova.compute.manager [None req-e8fa7628-63d6-45f9-bdb6-0135e8e79f84 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Get console output
Jan 22 17:13:13 compute-0 nova_compute[183075]: 2026-01-22 17:13:13.036 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:13:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:13.656 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:13.657 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:13:13 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:13 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:13 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:13 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:13 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:13 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.210.20
Jan 22 17:13:13 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3f5295ae-a32e-4595-80e2-a52b2a6b9934 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:13 compute-0 nova_compute[183075]: 2026-01-22 17:13:13.782 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:13:13 compute-0 nova_compute[183075]: 2026-01-22 17:13:13.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:13:13 compute-0 nova_compute[183075]: 2026-01-22 17:13:13.787 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:13:13 compute-0 nova_compute[183075]: 2026-01-22 17:13:13.787 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:13:14 compute-0 nova_compute[183075]: 2026-01-22 17:13:14.017 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-936001bf-d51b-4243-87b8-e363ef3c47a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:13:14 compute-0 nova_compute[183075]: 2026-01-22 17:13:14.017 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-936001bf-d51b-4243-87b8-e363ef3c47a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:13:14 compute-0 nova_compute[183075]: 2026-01-22 17:13:14.018 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:13:14 compute-0 nova_compute[183075]: 2026-01-22 17:13:14.018 183079 DEBUG nova.objects.instance [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 936001bf-d51b-4243-87b8-e363ef3c47a8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.168 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.169 104990 INFO eventlet.wsgi.server [-] 10.10.210.20,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.5118968
Jan 22 17:13:14 compute-0 haproxy-metadata-proxy-3f5295ae-a32e-4595-80e2-a52b2a6b9934[221233]: 10.10.210.20:35066 [22/Jan/2026:17:13:13.655] listener listener/metadata 0/0/0/513/513 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.178 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.179 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.210.20
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3f5295ae-a32e-4595-80e2-a52b2a6b9934 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.203 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.203 104990 INFO eventlet.wsgi.server [-] 10.10.210.20,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 168 time: 0.0245881
Jan 22 17:13:14 compute-0 haproxy-metadata-proxy-3f5295ae-a32e-4595-80e2-a52b2a6b9934[221233]: 10.10.210.20:35070 [22/Jan/2026:17:13:14.177] listener listener/metadata 0/0/0/25/25 200 152 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.210 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.211 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.210.20
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3f5295ae-a32e-4595-80e2-a52b2a6b9934 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.232 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.232 104990 INFO eventlet.wsgi.server [-] 10.10.210.20,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0215976
Jan 22 17:13:14 compute-0 haproxy-metadata-proxy-3f5295ae-a32e-4595-80e2-a52b2a6b9934[221233]: 10.10.210.20:35086 [22/Jan/2026:17:13:14.209] listener listener/metadata 0/0/0/23/23 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.242 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.243 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.210.20
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3f5295ae-a32e-4595-80e2-a52b2a6b9934 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.270 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.270 104990 INFO eventlet.wsgi.server [-] 10.10.210.20,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0275550
Jan 22 17:13:14 compute-0 haproxy-metadata-proxy-3f5295ae-a32e-4595-80e2-a52b2a6b9934[221233]: 10.10.210.20:35100 [22/Jan/2026:17:13:14.242] listener listener/metadata 0/0/0/29/29 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.277 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.277 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.210.20
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3f5295ae-a32e-4595-80e2-a52b2a6b9934 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.306 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.306 104990 INFO eventlet.wsgi.server [-] 10.10.210.20,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0286460
Jan 22 17:13:14 compute-0 haproxy-metadata-proxy-3f5295ae-a32e-4595-80e2-a52b2a6b9934[221233]: 10.10.210.20:35110 [22/Jan/2026:17:13:14.276] listener listener/metadata 0/0/0/29/29 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.317 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.317 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.210.20
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3f5295ae-a32e-4595-80e2-a52b2a6b9934 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.335 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.335 104990 INFO eventlet.wsgi.server [-] 10.10.210.20,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0184178
Jan 22 17:13:14 compute-0 haproxy-metadata-proxy-3f5295ae-a32e-4595-80e2-a52b2a6b9934[221233]: 10.10.210.20:35116 [22/Jan/2026:17:13:14.316] listener listener/metadata 0/0/0/19/19 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.344 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.345 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.210.20
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3f5295ae-a32e-4595-80e2-a52b2a6b9934 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.367 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.368 104990 INFO eventlet.wsgi.server [-] 10.10.210.20,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 148 time: 0.0229440
Jan 22 17:13:14 compute-0 haproxy-metadata-proxy-3f5295ae-a32e-4595-80e2-a52b2a6b9934[221233]: 10.10.210.20:35118 [22/Jan/2026:17:13:14.343] listener listener/metadata 0/0/0/24/24 200 132 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.374 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.375 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.210.20
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3f5295ae-a32e-4595-80e2-a52b2a6b9934 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.396 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.397 104990 INFO eventlet.wsgi.server [-] 10.10.210.20,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0222788
Jan 22 17:13:14 compute-0 haproxy-metadata-proxy-3f5295ae-a32e-4595-80e2-a52b2a6b9934[221233]: 10.10.210.20:35134 [22/Jan/2026:17:13:14.373] listener listener/metadata 0/0/0/23/23 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.403 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.404 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.210.20
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3f5295ae-a32e-4595-80e2-a52b2a6b9934 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.429 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:14 compute-0 haproxy-metadata-proxy-3f5295ae-a32e-4595-80e2-a52b2a6b9934[221233]: 10.10.210.20:35148 [22/Jan/2026:17:13:14.403] listener listener/metadata 0/0/0/27/27 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.430 104990 INFO eventlet.wsgi.server [-] 10.10.210.20,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0259349
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.436 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.438 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.210.20
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3f5295ae-a32e-4595-80e2-a52b2a6b9934 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.458 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:14 compute-0 haproxy-metadata-proxy-3f5295ae-a32e-4595-80e2-a52b2a6b9934[221233]: 10.10.210.20:35158 [22/Jan/2026:17:13:14.435] listener listener/metadata 0/0/0/23/23 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.459 104990 INFO eventlet.wsgi.server [-] 10.10.210.20,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0206230
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.468 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.469 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.210.20
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3f5295ae-a32e-4595-80e2-a52b2a6b9934 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:14 compute-0 haproxy-metadata-proxy-3f5295ae-a32e-4595-80e2-a52b2a6b9934[221233]: 10.10.210.20:35172 [22/Jan/2026:17:13:14.467] listener listener/metadata 0/0/0/25/25 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.492 104990 INFO eventlet.wsgi.server [-] 10.10.210.20,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0237863
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.508 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.509 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.210.20
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3f5295ae-a32e-4595-80e2-a52b2a6b9934 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.534 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.535 104990 INFO eventlet.wsgi.server [-] 10.10.210.20,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0252872
Jan 22 17:13:14 compute-0 haproxy-metadata-proxy-3f5295ae-a32e-4595-80e2-a52b2a6b9934[221233]: 10.10.210.20:35180 [22/Jan/2026:17:13:14.508] listener listener/metadata 0/0/0/26/26 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.539 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.540 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.210.20
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3f5295ae-a32e-4595-80e2-a52b2a6b9934 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.564 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:14 compute-0 haproxy-metadata-proxy-3f5295ae-a32e-4595-80e2-a52b2a6b9934[221233]: 10.10.210.20:35196 [22/Jan/2026:17:13:14.539] listener listener/metadata 0/0/0/25/25 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.564 104990 INFO eventlet.wsgi.server [-] 10.10.210.20,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0241392
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.570 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.571 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.210.20
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3f5295ae-a32e-4595-80e2-a52b2a6b9934 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.585 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.585 104990 INFO eventlet.wsgi.server [-] 10.10.210.20,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0146616
Jan 22 17:13:14 compute-0 haproxy-metadata-proxy-3f5295ae-a32e-4595-80e2-a52b2a6b9934[221233]: 10.10.210.20:35208 [22/Jan/2026:17:13:14.569] listener listener/metadata 0/0/0/16/16 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.591 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.591 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.210.20
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3f5295ae-a32e-4595-80e2-a52b2a6b9934 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.607 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.607 104990 INFO eventlet.wsgi.server [-] 10.10.210.20,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0158145
Jan 22 17:13:14 compute-0 haproxy-metadata-proxy-3f5295ae-a32e-4595-80e2-a52b2a6b9934[221233]: 10.10.210.20:35214 [22/Jan/2026:17:13:14.590] listener listener/metadata 0/0/0/16/16 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.613 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.613 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.210.20
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3f5295ae-a32e-4595-80e2-a52b2a6b9934 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.627 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.628 104990 INFO eventlet.wsgi.server [-] 10.10.210.20,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0146835
Jan 22 17:13:14 compute-0 haproxy-metadata-proxy-3f5295ae-a32e-4595-80e2-a52b2a6b9934[221233]: 10.10.210.20:35222 [22/Jan/2026:17:13:14.612] listener listener/metadata 0/0/0/15/15 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:13:14 compute-0 nova_compute[183075]: 2026-01-22 17:13:14.920 183079 DEBUG oslo_concurrency.lockutils [None req-e96d75bf-f4b8-4285-be14-389ab2faf210 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "7915ef96-3b31-447b-a4b5-1feeb4997869" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:14 compute-0 nova_compute[183075]: 2026-01-22 17:13:14.920 183079 DEBUG oslo_concurrency.lockutils [None req-e96d75bf-f4b8-4285-be14-389ab2faf210 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "7915ef96-3b31-447b-a4b5-1feeb4997869" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:14 compute-0 nova_compute[183075]: 2026-01-22 17:13:14.921 183079 DEBUG oslo_concurrency.lockutils [None req-e96d75bf-f4b8-4285-be14-389ab2faf210 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "7915ef96-3b31-447b-a4b5-1feeb4997869-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:14 compute-0 nova_compute[183075]: 2026-01-22 17:13:14.921 183079 DEBUG oslo_concurrency.lockutils [None req-e96d75bf-f4b8-4285-be14-389ab2faf210 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "7915ef96-3b31-447b-a4b5-1feeb4997869-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:14 compute-0 nova_compute[183075]: 2026-01-22 17:13:14.922 183079 DEBUG oslo_concurrency.lockutils [None req-e96d75bf-f4b8-4285-be14-389ab2faf210 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "7915ef96-3b31-447b-a4b5-1feeb4997869-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:14 compute-0 nova_compute[183075]: 2026-01-22 17:13:14.925 183079 INFO nova.compute.manager [None req-e96d75bf-f4b8-4285-be14-389ab2faf210 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Terminating instance
Jan 22 17:13:14 compute-0 nova_compute[183075]: 2026-01-22 17:13:14.926 183079 DEBUG nova.compute.manager [None req-e96d75bf-f4b8-4285-be14-389ab2faf210 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:13:14 compute-0 kernel: tapd9e056da-23 (unregistering): left promiscuous mode
Jan 22 17:13:14 compute-0 NetworkManager[55454]: <info>  [1769101994.9552] device (tapd9e056da-23): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:13:14 compute-0 nova_compute[183075]: 2026-01-22 17:13:14.965 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:14 compute-0 ovn_controller[95372]: 2026-01-22T17:13:14Z|00269|binding|INFO|Releasing lport d9e056da-23a3-44e3-b7a8-44b73622dbb1 from this chassis (sb_readonly=0)
Jan 22 17:13:14 compute-0 ovn_controller[95372]: 2026-01-22T17:13:14Z|00270|binding|INFO|Setting lport d9e056da-23a3-44e3-b7a8-44b73622dbb1 down in Southbound
Jan 22 17:13:14 compute-0 ovn_controller[95372]: 2026-01-22T17:13:14Z|00271|binding|INFO|Removing iface tapd9e056da-23 ovn-installed in OVS
Jan 22 17:13:14 compute-0 nova_compute[183075]: 2026-01-22 17:13:14.971 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.978 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fc:05:dd 10.100.0.27'], port_security=['fa:16:3e:fc:05:dd 10.100.0.27'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.27/28', 'neutron:device_id': '7915ef96-3b31-447b-a4b5-1feeb4997869', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a16be1a-262e-47f7-8518-5f24ee15796e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e05c7aae349e4a1d859a387df45650a0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b9cc1ac9-85f4-4da7-891e-b0644711e574', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.224', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5b6ccb16-1216-4deb-9d72-42005a3163bb, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=d9e056da-23a3-44e3-b7a8-44b73622dbb1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.980 104629 INFO neutron.agent.ovn.metadata.agent [-] Port d9e056da-23a3-44e3-b7a8-44b73622dbb1 in datapath 0a16be1a-262e-47f7-8518-5f24ee15796e unbound from our chassis
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.983 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0a16be1a-262e-47f7-8518-5f24ee15796e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.985 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3f4ebcd6-b6ce-4bc7-bc9f-a4b20cb7919f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:14.988 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e namespace which is not needed anymore
Jan 22 17:13:14 compute-0 nova_compute[183075]: 2026-01-22 17:13:14.996 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:15 compute-0 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000017.scope: Deactivated successfully.
Jan 22 17:13:15 compute-0 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000017.scope: Consumed 13.063s CPU time.
Jan 22 17:13:15 compute-0 systemd-machined[154382]: Machine qemu-23-instance-00000017 terminated.
Jan 22 17:13:15 compute-0 neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e[221075]: [NOTICE]   (221082) : haproxy version is 2.8.14-c23fe91
Jan 22 17:13:15 compute-0 neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e[221075]: [NOTICE]   (221082) : path to executable is /usr/sbin/haproxy
Jan 22 17:13:15 compute-0 neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e[221075]: [WARNING]  (221082) : Exiting Master process...
Jan 22 17:13:15 compute-0 neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e[221075]: [WARNING]  (221082) : Exiting Master process...
Jan 22 17:13:15 compute-0 neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e[221075]: [ALERT]    (221082) : Current worker (221084) exited with code 143 (Terminated)
Jan 22 17:13:15 compute-0 neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e[221075]: [WARNING]  (221082) : All workers exited. Exiting... (0)
Jan 22 17:13:15 compute-0 systemd[1]: libpod-aadf64025bb13b881ecce5591040cd7c5d46fba17d2e4f89db588408b3f4acbc.scope: Deactivated successfully.
Jan 22 17:13:15 compute-0 podman[221407]: 2026-01-22 17:13:15.156564943 +0000 UTC m=+0.047142600 container died aadf64025bb13b881ecce5591040cd7c5d46fba17d2e4f89db588408b3f4acbc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 22 17:13:15 compute-0 nova_compute[183075]: 2026-01-22 17:13:15.182 183079 INFO nova.virt.libvirt.driver [-] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Instance destroyed successfully.
Jan 22 17:13:15 compute-0 nova_compute[183075]: 2026-01-22 17:13:15.183 183079 DEBUG nova.objects.instance [None req-e96d75bf-f4b8-4285-be14-389ab2faf210 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lazy-loading 'resources' on Instance uuid 7915ef96-3b31-447b-a4b5-1feeb4997869 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:13:15 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aadf64025bb13b881ecce5591040cd7c5d46fba17d2e4f89db588408b3f4acbc-userdata-shm.mount: Deactivated successfully.
Jan 22 17:13:15 compute-0 nova_compute[183075]: 2026-01-22 17:13:15.190 183079 DEBUG nova.compute.manager [req-552dfed2-926e-4a44-b738-dd5eb85d2806 req-d692efec-3853-427c-8e33-f8c31a14a200 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Received event network-vif-unplugged-d9e056da-23a3-44e3-b7a8-44b73622dbb1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:13:15 compute-0 nova_compute[183075]: 2026-01-22 17:13:15.191 183079 DEBUG oslo_concurrency.lockutils [req-552dfed2-926e-4a44-b738-dd5eb85d2806 req-d692efec-3853-427c-8e33-f8c31a14a200 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "7915ef96-3b31-447b-a4b5-1feeb4997869-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:15 compute-0 nova_compute[183075]: 2026-01-22 17:13:15.191 183079 DEBUG oslo_concurrency.lockutils [req-552dfed2-926e-4a44-b738-dd5eb85d2806 req-d692efec-3853-427c-8e33-f8c31a14a200 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7915ef96-3b31-447b-a4b5-1feeb4997869-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:15 compute-0 nova_compute[183075]: 2026-01-22 17:13:15.192 183079 DEBUG oslo_concurrency.lockutils [req-552dfed2-926e-4a44-b738-dd5eb85d2806 req-d692efec-3853-427c-8e33-f8c31a14a200 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7915ef96-3b31-447b-a4b5-1feeb4997869-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:15 compute-0 nova_compute[183075]: 2026-01-22 17:13:15.192 183079 DEBUG nova.compute.manager [req-552dfed2-926e-4a44-b738-dd5eb85d2806 req-d692efec-3853-427c-8e33-f8c31a14a200 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] No waiting events found dispatching network-vif-unplugged-d9e056da-23a3-44e3-b7a8-44b73622dbb1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:13:15 compute-0 nova_compute[183075]: 2026-01-22 17:13:15.192 183079 DEBUG nova.compute.manager [req-552dfed2-926e-4a44-b738-dd5eb85d2806 req-d692efec-3853-427c-8e33-f8c31a14a200 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Received event network-vif-unplugged-d9e056da-23a3-44e3-b7a8-44b73622dbb1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:13:15 compute-0 systemd[1]: var-lib-containers-storage-overlay-c2012cf507196a4129ccacfa2a827c15e086993982512a9339f7011ae117a27e-merged.mount: Deactivated successfully.
Jan 22 17:13:15 compute-0 nova_compute[183075]: 2026-01-22 17:13:15.199 183079 DEBUG nova.virt.libvirt.vif [None req-e96d75bf-f4b8-4285-be14-389ab2faf210 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:12:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1675680184',display_name='tempest-server-test-1675680184',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1675680184',id=23,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFroJSJbEvCYe2sCwXkwL5KPVfwqqRPNqrWU5nnhv8U4G8wYWbEqk5AtRWXltmX7CAIPBST+sTwCKfymhALKd54XNN92D6KKnKgA6jmOihENah4FGfU80fx1UEE0osLpiw==',key_name='tempest-keypair-test-618317104',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:12:49Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e05c7aae349e4a1d859a387df45650a0',ramdisk_id='',reservation_id='r-b8u19k66',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw
_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIpSeparateNetwork-931877966',owner_user_name='tempest-FloatingIpSeparateNetwork-931877966-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:12:49Z,user_data=None,user_id='cd47d63cff2548a88e21e5c2e6a5c161',uuid=7915ef96-3b31-447b-a4b5-1feeb4997869,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d9e056da-23a3-44e3-b7a8-44b73622dbb1", "address": "fa:16:3e:fc:05:dd", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd9e056da-23", "ovs_interfaceid": "d9e056da-23a3-44e3-b7a8-44b73622dbb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:13:15 compute-0 nova_compute[183075]: 2026-01-22 17:13:15.199 183079 DEBUG nova.network.os_vif_util [None req-e96d75bf-f4b8-4285-be14-389ab2faf210 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converting VIF {"id": "d9e056da-23a3-44e3-b7a8-44b73622dbb1", "address": "fa:16:3e:fc:05:dd", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd9e056da-23", "ovs_interfaceid": "d9e056da-23a3-44e3-b7a8-44b73622dbb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:13:15 compute-0 nova_compute[183075]: 2026-01-22 17:13:15.200 183079 DEBUG nova.network.os_vif_util [None req-e96d75bf-f4b8-4285-be14-389ab2faf210 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fc:05:dd,bridge_name='br-int',has_traffic_filtering=True,id=d9e056da-23a3-44e3-b7a8-44b73622dbb1,network=Network(0a16be1a-262e-47f7-8518-5f24ee15796e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd9e056da-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:13:15 compute-0 nova_compute[183075]: 2026-01-22 17:13:15.201 183079 DEBUG os_vif [None req-e96d75bf-f4b8-4285-be14-389ab2faf210 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fc:05:dd,bridge_name='br-int',has_traffic_filtering=True,id=d9e056da-23a3-44e3-b7a8-44b73622dbb1,network=Network(0a16be1a-262e-47f7-8518-5f24ee15796e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd9e056da-23') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:13:15 compute-0 nova_compute[183075]: 2026-01-22 17:13:15.202 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:15 compute-0 nova_compute[183075]: 2026-01-22 17:13:15.202 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd9e056da-23, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:13:15 compute-0 podman[221407]: 2026-01-22 17:13:15.208698973 +0000 UTC m=+0.099276630 container cleanup aadf64025bb13b881ecce5591040cd7c5d46fba17d2e4f89db588408b3f4acbc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:13:15 compute-0 systemd[1]: libpod-conmon-aadf64025bb13b881ecce5591040cd7c5d46fba17d2e4f89db588408b3f4acbc.scope: Deactivated successfully.
Jan 22 17:13:15 compute-0 nova_compute[183075]: 2026-01-22 17:13:15.241 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:15 compute-0 nova_compute[183075]: 2026-01-22 17:13:15.243 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:15 compute-0 nova_compute[183075]: 2026-01-22 17:13:15.246 183079 INFO os_vif [None req-e96d75bf-f4b8-4285-be14-389ab2faf210 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fc:05:dd,bridge_name='br-int',has_traffic_filtering=True,id=d9e056da-23a3-44e3-b7a8-44b73622dbb1,network=Network(0a16be1a-262e-47f7-8518-5f24ee15796e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd9e056da-23')
Jan 22 17:13:15 compute-0 nova_compute[183075]: 2026-01-22 17:13:15.246 183079 INFO nova.virt.libvirt.driver [None req-e96d75bf-f4b8-4285-be14-389ab2faf210 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Deleting instance files /var/lib/nova/instances/7915ef96-3b31-447b-a4b5-1feeb4997869_del
Jan 22 17:13:15 compute-0 nova_compute[183075]: 2026-01-22 17:13:15.247 183079 INFO nova.virt.libvirt.driver [None req-e96d75bf-f4b8-4285-be14-389ab2faf210 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Deletion of /var/lib/nova/instances/7915ef96-3b31-447b-a4b5-1feeb4997869_del complete
Jan 22 17:13:15 compute-0 nova_compute[183075]: 2026-01-22 17:13:15.305 183079 INFO nova.compute.manager [None req-e96d75bf-f4b8-4285-be14-389ab2faf210 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Took 0.38 seconds to destroy the instance on the hypervisor.
Jan 22 17:13:15 compute-0 nova_compute[183075]: 2026-01-22 17:13:15.306 183079 DEBUG oslo.service.loopingcall [None req-e96d75bf-f4b8-4285-be14-389ab2faf210 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:13:15 compute-0 nova_compute[183075]: 2026-01-22 17:13:15.307 183079 DEBUG nova.compute.manager [-] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:13:15 compute-0 nova_compute[183075]: 2026-01-22 17:13:15.308 183079 DEBUG nova.network.neutron [-] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:13:15 compute-0 podman[221450]: 2026-01-22 17:13:15.337224073 +0000 UTC m=+0.062328396 container remove aadf64025bb13b881ecce5591040cd7c5d46fba17d2e4f89db588408b3f4acbc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:13:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:15.341 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a94a33d9-02da-45a5-88f1-0cec902887d3]: (4, ('Thu Jan 22 05:13:15 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e (aadf64025bb13b881ecce5591040cd7c5d46fba17d2e4f89db588408b3f4acbc)\naadf64025bb13b881ecce5591040cd7c5d46fba17d2e4f89db588408b3f4acbc\nThu Jan 22 05:13:15 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e (aadf64025bb13b881ecce5591040cd7c5d46fba17d2e4f89db588408b3f4acbc)\naadf64025bb13b881ecce5591040cd7c5d46fba17d2e4f89db588408b3f4acbc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:15.343 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2fd7f5e0-a4f2-4f60-8907-962f77549fe3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:15.344 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a16be1a-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:13:15 compute-0 nova_compute[183075]: 2026-01-22 17:13:15.346 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:15 compute-0 kernel: tap0a16be1a-20: left promiscuous mode
Jan 22 17:13:15 compute-0 nova_compute[183075]: 2026-01-22 17:13:15.359 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:15 compute-0 nova_compute[183075]: 2026-01-22 17:13:15.360 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:15.365 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c5b12fea-d2c0-4910-b296-10ebe52776fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:15.386 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ec4f54d6-7f8c-432d-8693-29230f4012dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:15.387 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c2a89983-4f53-450e-99d2-34b444ec099f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:15.403 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1f219c0e-daa0-4137-acd6-ce25b21e0875]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 433248, 'reachable_time': 26457, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221465, 'error': None, 'target': 'ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:15.406 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:13:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:15.406 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[b53d5179-c583-4a37-bc52-b6d56093975c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:15 compute-0 systemd[1]: run-netns-ovnmeta\x2d0a16be1a\x2d262e\x2d47f7\x2d8518\x2d5f24ee15796e.mount: Deactivated successfully.
Jan 22 17:13:15 compute-0 nova_compute[183075]: 2026-01-22 17:13:15.936 183079 INFO nova.compute.manager [None req-ac947176-8b2e-48fd-8e67-a2bb05819487 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Get console output
Jan 22 17:13:16 compute-0 nova_compute[183075]: 2026-01-22 17:13:16.344 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Updating instance_info_cache with network_info: [{"id": "804a64f5-797f-4eba-ae49-100790171545", "address": "fa:16:3e:3a:03:2f", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap804a64f5-79", "ovs_interfaceid": "804a64f5-797f-4eba-ae49-100790171545", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:13:16 compute-0 nova_compute[183075]: 2026-01-22 17:13:16.359 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-936001bf-d51b-4243-87b8-e363ef3c47a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:13:16 compute-0 nova_compute[183075]: 2026-01-22 17:13:16.360 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:13:16 compute-0 nova_compute[183075]: 2026-01-22 17:13:16.360 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:13:16 compute-0 nova_compute[183075]: 2026-01-22 17:13:16.360 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:13:16 compute-0 nova_compute[183075]: 2026-01-22 17:13:16.360 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:13:16 compute-0 nova_compute[183075]: 2026-01-22 17:13:16.361 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:13:16 compute-0 nova_compute[183075]: 2026-01-22 17:13:16.381 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:16 compute-0 nova_compute[183075]: 2026-01-22 17:13:16.381 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:16 compute-0 nova_compute[183075]: 2026-01-22 17:13:16.381 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:16 compute-0 nova_compute[183075]: 2026-01-22 17:13:16.382 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:13:16 compute-0 nova_compute[183075]: 2026-01-22 17:13:16.419 183079 INFO nova.compute.manager [None req-7c2427f2-c6ff-4ee3-84c7-9ac3e8e2fff7 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Get console output
Jan 22 17:13:16 compute-0 nova_compute[183075]: 2026-01-22 17:13:16.426 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:13:16 compute-0 nova_compute[183075]: 2026-01-22 17:13:16.625 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/91845d3c-b89e-43ba-b1d2-40f99d79ae8e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:13:16 compute-0 nova_compute[183075]: 2026-01-22 17:13:16.706 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/91845d3c-b89e-43ba-b1d2-40f99d79ae8e/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:13:16 compute-0 nova_compute[183075]: 2026-01-22 17:13:16.707 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/91845d3c-b89e-43ba-b1d2-40f99d79ae8e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:13:16 compute-0 nova_compute[183075]: 2026-01-22 17:13:16.786 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/91845d3c-b89e-43ba-b1d2-40f99d79ae8e/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:13:16 compute-0 nova_compute[183075]: 2026-01-22 17:13:16.792 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cfe610a3-4dee-46ca-a82a-1c8993fdd52c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:13:16 compute-0 nova_compute[183075]: 2026-01-22 17:13:16.884 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cfe610a3-4dee-46ca-a82a-1c8993fdd52c/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:13:16 compute-0 nova_compute[183075]: 2026-01-22 17:13:16.886 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cfe610a3-4dee-46ca-a82a-1c8993fdd52c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:13:16 compute-0 nova_compute[183075]: 2026-01-22 17:13:16.942 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cfe610a3-4dee-46ca-a82a-1c8993fdd52c/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:13:16 compute-0 nova_compute[183075]: 2026-01-22 17:13:16.952 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/936001bf-d51b-4243-87b8-e363ef3c47a8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.013 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/936001bf-d51b-4243-87b8-e363ef3c47a8/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.015 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/936001bf-d51b-4243-87b8-e363ef3c47a8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.107 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/936001bf-d51b-4243-87b8-e363ef3c47a8/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.115 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a39a5d00-6f96-4405-aff0-1449aee94079/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.191 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a39a5d00-6f96-4405-aff0-1449aee94079/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.193 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a39a5d00-6f96-4405-aff0-1449aee94079/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.267 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a39a5d00-6f96-4405-aff0-1449aee94079/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.394 183079 DEBUG nova.network.neutron [-] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.400 183079 DEBUG nova.compute.manager [req-1993ce89-a415-4adc-996d-2422e29a8b9a req-17ec144d-eeb7-4e2e-9def-13ef57361e2c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Received event network-vif-plugged-d9e056da-23a3-44e3-b7a8-44b73622dbb1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.400 183079 DEBUG oslo_concurrency.lockutils [req-1993ce89-a415-4adc-996d-2422e29a8b9a req-17ec144d-eeb7-4e2e-9def-13ef57361e2c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "7915ef96-3b31-447b-a4b5-1feeb4997869-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.400 183079 DEBUG oslo_concurrency.lockutils [req-1993ce89-a415-4adc-996d-2422e29a8b9a req-17ec144d-eeb7-4e2e-9def-13ef57361e2c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7915ef96-3b31-447b-a4b5-1feeb4997869-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.401 183079 DEBUG oslo_concurrency.lockutils [req-1993ce89-a415-4adc-996d-2422e29a8b9a req-17ec144d-eeb7-4e2e-9def-13ef57361e2c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7915ef96-3b31-447b-a4b5-1feeb4997869-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.401 183079 DEBUG nova.compute.manager [req-1993ce89-a415-4adc-996d-2422e29a8b9a req-17ec144d-eeb7-4e2e-9def-13ef57361e2c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] No waiting events found dispatching network-vif-plugged-d9e056da-23a3-44e3-b7a8-44b73622dbb1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.401 183079 WARNING nova.compute.manager [req-1993ce89-a415-4adc-996d-2422e29a8b9a req-17ec144d-eeb7-4e2e-9def-13ef57361e2c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Received unexpected event network-vif-plugged-d9e056da-23a3-44e3-b7a8-44b73622dbb1 for instance with vm_state active and task_state deleting.
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.411 183079 INFO nova.compute.manager [-] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Took 2.10 seconds to deallocate network for instance.
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.447 183079 DEBUG oslo_concurrency.lockutils [None req-e96d75bf-f4b8-4285-be14-389ab2faf210 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.448 183079 DEBUG oslo_concurrency.lockutils [None req-e96d75bf-f4b8-4285-be14-389ab2faf210 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.559 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.560 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=4922MB free_disk=73.26124954223633GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.561 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.598 183079 DEBUG nova.compute.provider_tree [None req-e96d75bf-f4b8-4285-be14-389ab2faf210 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.613 183079 DEBUG nova.scheduler.client.report [None req-e96d75bf-f4b8-4285-be14-389ab2faf210 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.637 183079 DEBUG oslo_concurrency.lockutils [None req-e96d75bf-f4b8-4285-be14-389ab2faf210 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.640 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.673 183079 INFO nova.scheduler.client.report [None req-e96d75bf-f4b8-4285-be14-389ab2faf210 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Deleted allocations for instance 7915ef96-3b31-447b-a4b5-1feeb4997869
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.729 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 936001bf-d51b-4243-87b8-e363ef3c47a8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.729 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 91845d3c-b89e-43ba-b1d2-40f99d79ae8e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.729 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance a39a5d00-6f96-4405-aff0-1449aee94079 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.730 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance cfe610a3-4dee-46ca-a82a-1c8993fdd52c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.730 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 4 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.731 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1024MB phys_disk=79GB used_disk=4GB total_vcpus=8 used_vcpus=4 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.736 183079 DEBUG oslo_concurrency.lockutils [None req-e96d75bf-f4b8-4285-be14-389ab2faf210 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "7915ef96-3b31-447b-a4b5-1feeb4997869" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.816s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.838 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.854 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.882 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:13:17 compute-0 nova_compute[183075]: 2026-01-22 17:13:17.883 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.242s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:18 compute-0 nova_compute[183075]: 2026-01-22 17:13:18.592 183079 DEBUG oslo_concurrency.lockutils [None req-e91e7370-6565-427e-aca9-e64eefc7a89a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "91845d3c-b89e-43ba-b1d2-40f99d79ae8e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:18 compute-0 nova_compute[183075]: 2026-01-22 17:13:18.592 183079 DEBUG oslo_concurrency.lockutils [None req-e91e7370-6565-427e-aca9-e64eefc7a89a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "91845d3c-b89e-43ba-b1d2-40f99d79ae8e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:18 compute-0 nova_compute[183075]: 2026-01-22 17:13:18.592 183079 DEBUG oslo_concurrency.lockutils [None req-e91e7370-6565-427e-aca9-e64eefc7a89a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "91845d3c-b89e-43ba-b1d2-40f99d79ae8e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:18 compute-0 nova_compute[183075]: 2026-01-22 17:13:18.593 183079 DEBUG oslo_concurrency.lockutils [None req-e91e7370-6565-427e-aca9-e64eefc7a89a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "91845d3c-b89e-43ba-b1d2-40f99d79ae8e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:18 compute-0 nova_compute[183075]: 2026-01-22 17:13:18.593 183079 DEBUG oslo_concurrency.lockutils [None req-e91e7370-6565-427e-aca9-e64eefc7a89a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "91845d3c-b89e-43ba-b1d2-40f99d79ae8e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:18 compute-0 nova_compute[183075]: 2026-01-22 17:13:18.594 183079 INFO nova.compute.manager [None req-e91e7370-6565-427e-aca9-e64eefc7a89a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Terminating instance
Jan 22 17:13:18 compute-0 nova_compute[183075]: 2026-01-22 17:13:18.595 183079 DEBUG nova.compute.manager [None req-e91e7370-6565-427e-aca9-e64eefc7a89a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:13:18 compute-0 kernel: tap1728cec9-ef (unregistering): left promiscuous mode
Jan 22 17:13:18 compute-0 NetworkManager[55454]: <info>  [1769101998.6255] device (tap1728cec9-ef): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:13:18 compute-0 ovn_controller[95372]: 2026-01-22T17:13:18Z|00272|binding|INFO|Releasing lport 1728cec9-ef37-4d9b-8c9c-54c2a6640439 from this chassis (sb_readonly=0)
Jan 22 17:13:18 compute-0 ovn_controller[95372]: 2026-01-22T17:13:18Z|00273|binding|INFO|Setting lport 1728cec9-ef37-4d9b-8c9c-54c2a6640439 down in Southbound
Jan 22 17:13:18 compute-0 nova_compute[183075]: 2026-01-22 17:13:18.643 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:18 compute-0 ovn_controller[95372]: 2026-01-22T17:13:18Z|00274|binding|INFO|Removing iface tap1728cec9-ef ovn-installed in OVS
Jan 22 17:13:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:18.653 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2c:21:9e 10.100.0.13'], port_security=['fa:16:3e:2c:21:9e 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '91845d3c-b89e-43ba-b1d2-40f99d79ae8e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-576f6598-999f-46d9-809a-65b7475a1ec7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e05c7aae349e4a1d859a387df45650a0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b9cc1ac9-85f4-4da7-891e-b0644711e574', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=92087573-6410-4f80-a195-ce8b2058420e, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=1728cec9-ef37-4d9b-8c9c-54c2a6640439) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:13:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:18.656 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 1728cec9-ef37-4d9b-8c9c-54c2a6640439 in datapath 576f6598-999f-46d9-809a-65b7475a1ec7 unbound from our chassis
Jan 22 17:13:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:18.661 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 576f6598-999f-46d9-809a-65b7475a1ec7
Jan 22 17:13:18 compute-0 nova_compute[183075]: 2026-01-22 17:13:18.663 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:18 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000015.scope: Deactivated successfully.
Jan 22 17:13:18 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000015.scope: Consumed 15.338s CPU time.
Jan 22 17:13:18 compute-0 systemd-machined[154382]: Machine qemu-21-instance-00000015 terminated.
Jan 22 17:13:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:18.687 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[67c0c032-95f5-43a7-a001-f04b83961e4e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:18.723 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[67bb4761-98ee-4504-aa32-95fae3b23778]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:18.728 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[345bdb8c-7261-451e-ad9a-09ca9ceac1df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:18.756 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[752d5dcc-2266-49d2-918b-e8d58e1d8b4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:18.778 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[35ef425b-c2d4-40b7-a1fc-86f5ef85dba2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap576f6598-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a1:fa:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 204, 'tx_packets': 107, 'rx_bytes': 17392, 'tx_bytes': 12136, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 204, 'tx_packets': 107, 'rx_bytes': 17392, 'tx_bytes': 12136, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425631, 'reachable_time': 19752, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221503, 'error': None, 'target': 'ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:18.797 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f1f577cc-35f8-48b6-9017-5ccad2f86286]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap576f6598-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 425644, 'tstamp': 425644}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221504, 'error': None, 'target': 'ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap576f6598-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 425648, 'tstamp': 425648}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221504, 'error': None, 'target': 'ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:18.801 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap576f6598-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:13:18 compute-0 nova_compute[183075]: 2026-01-22 17:13:18.837 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:18 compute-0 nova_compute[183075]: 2026-01-22 17:13:18.839 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:18 compute-0 nova_compute[183075]: 2026-01-22 17:13:18.846 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:18.846 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap576f6598-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:13:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:18.847 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:13:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:18.847 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap576f6598-90, col_values=(('external_ids', {'iface-id': '1759254b-798a-4e65-baf5-489557c1f604'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:13:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:18.848 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:13:18 compute-0 nova_compute[183075]: 2026-01-22 17:13:18.878 183079 INFO nova.virt.libvirt.driver [-] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Instance destroyed successfully.
Jan 22 17:13:18 compute-0 nova_compute[183075]: 2026-01-22 17:13:18.879 183079 DEBUG nova.objects.instance [None req-e91e7370-6565-427e-aca9-e64eefc7a89a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lazy-loading 'resources' on Instance uuid 91845d3c-b89e-43ba-b1d2-40f99d79ae8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:13:18 compute-0 nova_compute[183075]: 2026-01-22 17:13:18.898 183079 DEBUG nova.virt.libvirt.vif [None req-e91e7370-6565-427e-aca9-e64eefc7a89a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:12:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-105042380',display_name='tempest-server-test-105042380',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-105042380',id=21,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFroJSJbEvCYe2sCwXkwL5KPVfwqqRPNqrWU5nnhv8U4G8wYWbEqk5AtRWXltmX7CAIPBST+sTwCKfymhALKd54XNN92D6KKnKgA6jmOihENah4FGfU80fx1UEE0osLpiw==',key_name='tempest-keypair-test-618317104',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:12:13Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e05c7aae349e4a1d859a387df45650a0',ramdisk_id='',reservation_id='r-kyiddvi0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIpSeparateNetwork-931877966',owner_user_name='tempest-FloatingIpSeparateNetwork-931877966-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:12:13Z,user_data=None,user_id='cd47d63cff2548a88e21e5c2e6a5c161',uuid=91845d3c-b89e-43ba-b1d2-40f99d79ae8e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1728cec9-ef37-4d9b-8c9c-54c2a6640439", "address": "fa:16:3e:2c:21:9e", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1728cec9-ef", "ovs_interfaceid": "1728cec9-ef37-4d9b-8c9c-54c2a6640439", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:13:18 compute-0 nova_compute[183075]: 2026-01-22 17:13:18.898 183079 DEBUG nova.network.os_vif_util [None req-e91e7370-6565-427e-aca9-e64eefc7a89a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converting VIF {"id": "1728cec9-ef37-4d9b-8c9c-54c2a6640439", "address": "fa:16:3e:2c:21:9e", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1728cec9-ef", "ovs_interfaceid": "1728cec9-ef37-4d9b-8c9c-54c2a6640439", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:13:18 compute-0 nova_compute[183075]: 2026-01-22 17:13:18.899 183079 DEBUG nova.network.os_vif_util [None req-e91e7370-6565-427e-aca9-e64eefc7a89a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2c:21:9e,bridge_name='br-int',has_traffic_filtering=True,id=1728cec9-ef37-4d9b-8c9c-54c2a6640439,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1728cec9-ef') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:13:18 compute-0 nova_compute[183075]: 2026-01-22 17:13:18.899 183079 DEBUG os_vif [None req-e91e7370-6565-427e-aca9-e64eefc7a89a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2c:21:9e,bridge_name='br-int',has_traffic_filtering=True,id=1728cec9-ef37-4d9b-8c9c-54c2a6640439,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1728cec9-ef') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:13:18 compute-0 nova_compute[183075]: 2026-01-22 17:13:18.901 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:18 compute-0 nova_compute[183075]: 2026-01-22 17:13:18.901 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1728cec9-ef, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:13:18 compute-0 nova_compute[183075]: 2026-01-22 17:13:18.903 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:18 compute-0 nova_compute[183075]: 2026-01-22 17:13:18.906 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:18 compute-0 nova_compute[183075]: 2026-01-22 17:13:18.912 183079 INFO os_vif [None req-e91e7370-6565-427e-aca9-e64eefc7a89a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2c:21:9e,bridge_name='br-int',has_traffic_filtering=True,id=1728cec9-ef37-4d9b-8c9c-54c2a6640439,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1728cec9-ef')
Jan 22 17:13:18 compute-0 nova_compute[183075]: 2026-01-22 17:13:18.913 183079 INFO nova.virt.libvirt.driver [None req-e91e7370-6565-427e-aca9-e64eefc7a89a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Deleting instance files /var/lib/nova/instances/91845d3c-b89e-43ba-b1d2-40f99d79ae8e_del
Jan 22 17:13:18 compute-0 nova_compute[183075]: 2026-01-22 17:13:18.914 183079 INFO nova.virt.libvirt.driver [None req-e91e7370-6565-427e-aca9-e64eefc7a89a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Deletion of /var/lib/nova/instances/91845d3c-b89e-43ba-b1d2-40f99d79ae8e_del complete
Jan 22 17:13:18 compute-0 nova_compute[183075]: 2026-01-22 17:13:18.992 183079 INFO nova.compute.manager [None req-e91e7370-6565-427e-aca9-e64eefc7a89a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Took 0.40 seconds to destroy the instance on the hypervisor.
Jan 22 17:13:18 compute-0 nova_compute[183075]: 2026-01-22 17:13:18.993 183079 DEBUG oslo.service.loopingcall [None req-e91e7370-6565-427e-aca9-e64eefc7a89a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:13:18 compute-0 nova_compute[183075]: 2026-01-22 17:13:18.993 183079 DEBUG nova.compute.manager [-] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:13:18 compute-0 nova_compute[183075]: 2026-01-22 17:13:18.994 183079 DEBUG nova.network.neutron [-] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:13:19 compute-0 nova_compute[183075]: 2026-01-22 17:13:19.195 183079 DEBUG nova.compute.manager [req-eab80c12-62b0-468d-8ecf-46b20b41bd5a req-58e3c032-8a50-44d2-9f59-57e27e4912a3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Received event network-vif-unplugged-1728cec9-ef37-4d9b-8c9c-54c2a6640439 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:13:19 compute-0 nova_compute[183075]: 2026-01-22 17:13:19.196 183079 DEBUG oslo_concurrency.lockutils [req-eab80c12-62b0-468d-8ecf-46b20b41bd5a req-58e3c032-8a50-44d2-9f59-57e27e4912a3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "91845d3c-b89e-43ba-b1d2-40f99d79ae8e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:19 compute-0 nova_compute[183075]: 2026-01-22 17:13:19.196 183079 DEBUG oslo_concurrency.lockutils [req-eab80c12-62b0-468d-8ecf-46b20b41bd5a req-58e3c032-8a50-44d2-9f59-57e27e4912a3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "91845d3c-b89e-43ba-b1d2-40f99d79ae8e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:19 compute-0 nova_compute[183075]: 2026-01-22 17:13:19.197 183079 DEBUG oslo_concurrency.lockutils [req-eab80c12-62b0-468d-8ecf-46b20b41bd5a req-58e3c032-8a50-44d2-9f59-57e27e4912a3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "91845d3c-b89e-43ba-b1d2-40f99d79ae8e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:19 compute-0 nova_compute[183075]: 2026-01-22 17:13:19.197 183079 DEBUG nova.compute.manager [req-eab80c12-62b0-468d-8ecf-46b20b41bd5a req-58e3c032-8a50-44d2-9f59-57e27e4912a3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] No waiting events found dispatching network-vif-unplugged-1728cec9-ef37-4d9b-8c9c-54c2a6640439 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:13:19 compute-0 nova_compute[183075]: 2026-01-22 17:13:19.197 183079 DEBUG nova.compute.manager [req-eab80c12-62b0-468d-8ecf-46b20b41bd5a req-58e3c032-8a50-44d2-9f59-57e27e4912a3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Received event network-vif-unplugged-1728cec9-ef37-4d9b-8c9c-54c2a6640439 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:13:19 compute-0 nova_compute[183075]: 2026-01-22 17:13:19.565 183079 DEBUG oslo_concurrency.lockutils [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "ed1e087d-92fe-41d0-bd0f-e907e799d3bc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:19 compute-0 nova_compute[183075]: 2026-01-22 17:13:19.565 183079 DEBUG oslo_concurrency.lockutils [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "ed1e087d-92fe-41d0-bd0f-e907e799d3bc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:19 compute-0 nova_compute[183075]: 2026-01-22 17:13:19.582 183079 DEBUG nova.compute.manager [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:13:19 compute-0 nova_compute[183075]: 2026-01-22 17:13:19.647 183079 DEBUG oslo_concurrency.lockutils [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:19 compute-0 nova_compute[183075]: 2026-01-22 17:13:19.647 183079 DEBUG oslo_concurrency.lockutils [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:19 compute-0 nova_compute[183075]: 2026-01-22 17:13:19.656 183079 DEBUG nova.virt.hardware [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:13:19 compute-0 nova_compute[183075]: 2026-01-22 17:13:19.657 183079 INFO nova.compute.claims [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:13:19 compute-0 nova_compute[183075]: 2026-01-22 17:13:19.855 183079 DEBUG nova.compute.provider_tree [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:13:19 compute-0 nova_compute[183075]: 2026-01-22 17:13:19.876 183079 DEBUG nova.scheduler.client.report [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:13:19 compute-0 nova_compute[183075]: 2026-01-22 17:13:19.900 183079 DEBUG oslo_concurrency.lockutils [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.253s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:19 compute-0 nova_compute[183075]: 2026-01-22 17:13:19.901 183079 DEBUG nova.compute.manager [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:13:19 compute-0 nova_compute[183075]: 2026-01-22 17:13:19.953 183079 DEBUG nova.compute.manager [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:13:19 compute-0 nova_compute[183075]: 2026-01-22 17:13:19.954 183079 DEBUG nova.network.neutron [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:13:19 compute-0 nova_compute[183075]: 2026-01-22 17:13:19.975 183079 INFO nova.virt.libvirt.driver [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:13:19 compute-0 nova_compute[183075]: 2026-01-22 17:13:19.993 183079 DEBUG nova.compute.manager [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.092 183079 DEBUG nova.compute.manager [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.094 183079 DEBUG nova.virt.libvirt.driver [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.094 183079 INFO nova.virt.libvirt.driver [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Creating image(s)
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.095 183079 DEBUG oslo_concurrency.lockutils [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "/var/lib/nova/instances/ed1e087d-92fe-41d0-bd0f-e907e799d3bc/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.096 183079 DEBUG oslo_concurrency.lockutils [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "/var/lib/nova/instances/ed1e087d-92fe-41d0-bd0f-e907e799d3bc/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.097 183079 DEBUG oslo_concurrency.lockutils [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "/var/lib/nova/instances/ed1e087d-92fe-41d0-bd0f-e907e799d3bc/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.122 183079 DEBUG oslo_concurrency.processutils [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.183 183079 DEBUG oslo_concurrency.processutils [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.184 183079 DEBUG oslo_concurrency.lockutils [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.185 183079 DEBUG oslo_concurrency.lockutils [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.200 183079 DEBUG oslo_concurrency.processutils [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.247 183079 DEBUG nova.policy [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4a7542774b9c42618cf9d00113f9d23d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '26cca885d303443380036cbbe9e70744', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.256 183079 DEBUG oslo_concurrency.processutils [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.257 183079 DEBUG oslo_concurrency.processutils [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/ed1e087d-92fe-41d0-bd0f-e907e799d3bc/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.308 183079 DEBUG oslo_concurrency.processutils [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/ed1e087d-92fe-41d0-bd0f-e907e799d3bc/disk 1073741824" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.310 183079 DEBUG oslo_concurrency.lockutils [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.310 183079 DEBUG oslo_concurrency.processutils [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.365 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.385 183079 DEBUG oslo_concurrency.processutils [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.386 183079 DEBUG nova.virt.disk.api [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Checking if we can resize image /var/lib/nova/instances/ed1e087d-92fe-41d0-bd0f-e907e799d3bc/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.386 183079 DEBUG oslo_concurrency.processutils [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed1e087d-92fe-41d0-bd0f-e907e799d3bc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.450 183079 DEBUG oslo_concurrency.processutils [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed1e087d-92fe-41d0-bd0f-e907e799d3bc/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.451 183079 DEBUG nova.virt.disk.api [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Cannot resize image /var/lib/nova/instances/ed1e087d-92fe-41d0-bd0f-e907e799d3bc/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.452 183079 DEBUG nova.objects.instance [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lazy-loading 'migration_context' on Instance uuid ed1e087d-92fe-41d0-bd0f-e907e799d3bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.474 183079 DEBUG nova.virt.libvirt.driver [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.474 183079 DEBUG nova.virt.libvirt.driver [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Ensure instance console log exists: /var/lib/nova/instances/ed1e087d-92fe-41d0-bd0f-e907e799d3bc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.475 183079 DEBUG oslo_concurrency.lockutils [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.475 183079 DEBUG oslo_concurrency.lockutils [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.475 183079 DEBUG oslo_concurrency.lockutils [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.561 183079 DEBUG nova.network.neutron [-] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.588 183079 INFO nova.compute.manager [-] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Took 1.59 seconds to deallocate network for instance.
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.631 183079 DEBUG oslo_concurrency.lockutils [None req-e91e7370-6565-427e-aca9-e64eefc7a89a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.632 183079 DEBUG oslo_concurrency.lockutils [None req-e91e7370-6565-427e-aca9-e64eefc7a89a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.734 183079 DEBUG nova.compute.provider_tree [None req-e91e7370-6565-427e-aca9-e64eefc7a89a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.756 183079 DEBUG nova.scheduler.client.report [None req-e91e7370-6565-427e-aca9-e64eefc7a89a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.778 183079 DEBUG oslo_concurrency.lockutils [None req-e91e7370-6565-427e-aca9-e64eefc7a89a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.799 183079 INFO nova.scheduler.client.report [None req-e91e7370-6565-427e-aca9-e64eefc7a89a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Deleted allocations for instance 91845d3c-b89e-43ba-b1d2-40f99d79ae8e
Jan 22 17:13:20 compute-0 nova_compute[183075]: 2026-01-22 17:13:20.859 183079 DEBUG oslo_concurrency.lockutils [None req-e91e7370-6565-427e-aca9-e64eefc7a89a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "91845d3c-b89e-43ba-b1d2-40f99d79ae8e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.267s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:21 compute-0 nova_compute[183075]: 2026-01-22 17:13:21.068 183079 DEBUG nova.network.neutron [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Successfully updated port: 3f6f0766-40d7-4301-8513-fdb50502511a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:13:21 compute-0 nova_compute[183075]: 2026-01-22 17:13:21.091 183079 DEBUG oslo_concurrency.lockutils [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "refresh_cache-ed1e087d-92fe-41d0-bd0f-e907e799d3bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:13:21 compute-0 nova_compute[183075]: 2026-01-22 17:13:21.091 183079 DEBUG oslo_concurrency.lockutils [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquired lock "refresh_cache-ed1e087d-92fe-41d0-bd0f-e907e799d3bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:13:21 compute-0 nova_compute[183075]: 2026-01-22 17:13:21.091 183079 DEBUG nova.network.neutron [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:13:21 compute-0 nova_compute[183075]: 2026-01-22 17:13:21.168 183079 DEBUG nova.compute.manager [req-aa58f36f-71c6-4651-96ff-15267bcc528b req-3c3b4c2d-a25f-4676-b0fa-be5ccfe41e4c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Received event network-changed-3f6f0766-40d7-4301-8513-fdb50502511a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:13:21 compute-0 nova_compute[183075]: 2026-01-22 17:13:21.168 183079 DEBUG nova.compute.manager [req-aa58f36f-71c6-4651-96ff-15267bcc528b req-3c3b4c2d-a25f-4676-b0fa-be5ccfe41e4c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Refreshing instance network info cache due to event network-changed-3f6f0766-40d7-4301-8513-fdb50502511a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:13:21 compute-0 nova_compute[183075]: 2026-01-22 17:13:21.169 183079 DEBUG oslo_concurrency.lockutils [req-aa58f36f-71c6-4651-96ff-15267bcc528b req-3c3b4c2d-a25f-4676-b0fa-be5ccfe41e4c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-ed1e087d-92fe-41d0-bd0f-e907e799d3bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:13:21 compute-0 nova_compute[183075]: 2026-01-22 17:13:21.258 183079 DEBUG nova.network.neutron [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:13:21 compute-0 nova_compute[183075]: 2026-01-22 17:13:21.303 183079 DEBUG nova.compute.manager [req-2568eccc-0ecb-4306-b81f-ef566786e31d req-330dff9c-1560-412c-b5d3-2b8f629d14cd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Received event network-vif-plugged-1728cec9-ef37-4d9b-8c9c-54c2a6640439 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:13:21 compute-0 nova_compute[183075]: 2026-01-22 17:13:21.303 183079 DEBUG oslo_concurrency.lockutils [req-2568eccc-0ecb-4306-b81f-ef566786e31d req-330dff9c-1560-412c-b5d3-2b8f629d14cd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "91845d3c-b89e-43ba-b1d2-40f99d79ae8e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:21 compute-0 nova_compute[183075]: 2026-01-22 17:13:21.303 183079 DEBUG oslo_concurrency.lockutils [req-2568eccc-0ecb-4306-b81f-ef566786e31d req-330dff9c-1560-412c-b5d3-2b8f629d14cd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "91845d3c-b89e-43ba-b1d2-40f99d79ae8e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:21 compute-0 nova_compute[183075]: 2026-01-22 17:13:21.304 183079 DEBUG oslo_concurrency.lockutils [req-2568eccc-0ecb-4306-b81f-ef566786e31d req-330dff9c-1560-412c-b5d3-2b8f629d14cd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "91845d3c-b89e-43ba-b1d2-40f99d79ae8e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:21 compute-0 nova_compute[183075]: 2026-01-22 17:13:21.304 183079 DEBUG nova.compute.manager [req-2568eccc-0ecb-4306-b81f-ef566786e31d req-330dff9c-1560-412c-b5d3-2b8f629d14cd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] No waiting events found dispatching network-vif-plugged-1728cec9-ef37-4d9b-8c9c-54c2a6640439 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:13:21 compute-0 nova_compute[183075]: 2026-01-22 17:13:21.304 183079 WARNING nova.compute.manager [req-2568eccc-0ecb-4306-b81f-ef566786e31d req-330dff9c-1560-412c-b5d3-2b8f629d14cd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Received unexpected event network-vif-plugged-1728cec9-ef37-4d9b-8c9c-54c2a6640439 for instance with vm_state deleted and task_state None.
Jan 22 17:13:21 compute-0 nova_compute[183075]: 2026-01-22 17:13:21.569 183079 INFO nova.compute.manager [None req-f83d757e-ac6a-42da-a2d8-5a3a9d9b70e9 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Get console output
Jan 22 17:13:21 compute-0 nova_compute[183075]: 2026-01-22 17:13:21.575 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:13:21 compute-0 nova_compute[183075]: 2026-01-22 17:13:21.845 183079 DEBUG oslo_concurrency.lockutils [None req-7aa3e1c1-f2de-438d-ab6f-51f886835ba9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "936001bf-d51b-4243-87b8-e363ef3c47a8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:21 compute-0 nova_compute[183075]: 2026-01-22 17:13:21.846 183079 DEBUG oslo_concurrency.lockutils [None req-7aa3e1c1-f2de-438d-ab6f-51f886835ba9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "936001bf-d51b-4243-87b8-e363ef3c47a8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:21 compute-0 nova_compute[183075]: 2026-01-22 17:13:21.846 183079 DEBUG oslo_concurrency.lockutils [None req-7aa3e1c1-f2de-438d-ab6f-51f886835ba9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "936001bf-d51b-4243-87b8-e363ef3c47a8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:21 compute-0 nova_compute[183075]: 2026-01-22 17:13:21.846 183079 DEBUG oslo_concurrency.lockutils [None req-7aa3e1c1-f2de-438d-ab6f-51f886835ba9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "936001bf-d51b-4243-87b8-e363ef3c47a8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:21 compute-0 nova_compute[183075]: 2026-01-22 17:13:21.846 183079 DEBUG oslo_concurrency.lockutils [None req-7aa3e1c1-f2de-438d-ab6f-51f886835ba9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "936001bf-d51b-4243-87b8-e363ef3c47a8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:21 compute-0 nova_compute[183075]: 2026-01-22 17:13:21.847 183079 INFO nova.compute.manager [None req-7aa3e1c1-f2de-438d-ab6f-51f886835ba9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Terminating instance
Jan 22 17:13:21 compute-0 nova_compute[183075]: 2026-01-22 17:13:21.848 183079 DEBUG nova.compute.manager [None req-7aa3e1c1-f2de-438d-ab6f-51f886835ba9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:13:21 compute-0 kernel: tap804a64f5-79 (unregistering): left promiscuous mode
Jan 22 17:13:21 compute-0 NetworkManager[55454]: <info>  [1769102001.8804] device (tap804a64f5-79): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:13:21 compute-0 ovn_controller[95372]: 2026-01-22T17:13:21Z|00275|binding|INFO|Releasing lport 804a64f5-797f-4eba-ae49-100790171545 from this chassis (sb_readonly=0)
Jan 22 17:13:21 compute-0 ovn_controller[95372]: 2026-01-22T17:13:21Z|00276|binding|INFO|Setting lport 804a64f5-797f-4eba-ae49-100790171545 down in Southbound
Jan 22 17:13:21 compute-0 nova_compute[183075]: 2026-01-22 17:13:21.884 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:21 compute-0 ovn_controller[95372]: 2026-01-22T17:13:21Z|00277|binding|INFO|Removing iface tap804a64f5-79 ovn-installed in OVS
Jan 22 17:13:21 compute-0 nova_compute[183075]: 2026-01-22 17:13:21.888 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:21.895 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3a:03:2f 10.100.0.9'], port_security=['fa:16:3e:3a:03:2f 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '936001bf-d51b-4243-87b8-e363ef3c47a8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-576f6598-999f-46d9-809a-65b7475a1ec7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e05c7aae349e4a1d859a387df45650a0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b9cc1ac9-85f4-4da7-891e-b0644711e574', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.196', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=92087573-6410-4f80-a195-ce8b2058420e, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=804a64f5-797f-4eba-ae49-100790171545) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:13:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:21.897 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 804a64f5-797f-4eba-ae49-100790171545 in datapath 576f6598-999f-46d9-809a-65b7475a1ec7 unbound from our chassis
Jan 22 17:13:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:21.899 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 576f6598-999f-46d9-809a-65b7475a1ec7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:13:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:21.900 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[64cf03e4-24bf-459b-99ad-3844fc05afff]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:21.901 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7 namespace which is not needed anymore
Jan 22 17:13:21 compute-0 nova_compute[183075]: 2026-01-22 17:13:21.902 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:21 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000014.scope: Deactivated successfully.
Jan 22 17:13:21 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000014.scope: Consumed 17.068s CPU time.
Jan 22 17:13:21 compute-0 systemd-machined[154382]: Machine qemu-20-instance-00000014 terminated.
Jan 22 17:13:21 compute-0 podman[221536]: 2026-01-22 17:13:21.969962218 +0000 UTC m=+0.064689277 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 17:13:22 compute-0 neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7[219940]: [NOTICE]   (219944) : haproxy version is 2.8.14-c23fe91
Jan 22 17:13:22 compute-0 neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7[219940]: [NOTICE]   (219944) : path to executable is /usr/sbin/haproxy
Jan 22 17:13:22 compute-0 neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7[219940]: [WARNING]  (219944) : Exiting Master process...
Jan 22 17:13:22 compute-0 neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7[219940]: [WARNING]  (219944) : Exiting Master process...
Jan 22 17:13:22 compute-0 neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7[219940]: [ALERT]    (219944) : Current worker (219946) exited with code 143 (Terminated)
Jan 22 17:13:22 compute-0 neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7[219940]: [WARNING]  (219944) : All workers exited. Exiting... (0)
Jan 22 17:13:22 compute-0 systemd[1]: libpod-ce557a5a19054a6576b5f8e43aaebf77a0f24635c178933357e9c8fca275c8cd.scope: Deactivated successfully.
Jan 22 17:13:22 compute-0 podman[221581]: 2026-01-22 17:13:22.045816386 +0000 UTC m=+0.047849499 container died ce557a5a19054a6576b5f8e43aaebf77a0f24635c178933357e9c8fca275c8cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 17:13:22 compute-0 kernel: tap804a64f5-79: entered promiscuous mode
Jan 22 17:13:22 compute-0 NetworkManager[55454]: <info>  [1769102002.0691] manager: (tap804a64f5-79): new Tun device (/org/freedesktop/NetworkManager/Devices/120)
Jan 22 17:13:22 compute-0 systemd-udevd[221551]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:13:22 compute-0 kernel: tap804a64f5-79 (unregistering): left promiscuous mode
Jan 22 17:13:22 compute-0 ovn_controller[95372]: 2026-01-22T17:13:22Z|00278|binding|INFO|Claiming lport 804a64f5-797f-4eba-ae49-100790171545 for this chassis.
Jan 22 17:13:22 compute-0 ovn_controller[95372]: 2026-01-22T17:13:22Z|00279|binding|INFO|804a64f5-797f-4eba-ae49-100790171545: Claiming fa:16:3e:3a:03:2f 10.100.0.9
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.073 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:22 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ce557a5a19054a6576b5f8e43aaebf77a0f24635c178933357e9c8fca275c8cd-userdata-shm.mount: Deactivated successfully.
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.086 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3a:03:2f 10.100.0.9'], port_security=['fa:16:3e:3a:03:2f 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '936001bf-d51b-4243-87b8-e363ef3c47a8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-576f6598-999f-46d9-809a-65b7475a1ec7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e05c7aae349e4a1d859a387df45650a0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b9cc1ac9-85f4-4da7-891e-b0644711e574', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.196', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=92087573-6410-4f80-a195-ce8b2058420e, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=804a64f5-797f-4eba-ae49-100790171545) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:13:22 compute-0 systemd[1]: var-lib-containers-storage-overlay-3a9ec44c08dfafb854a6c74492a29420ecebc41a6de1085bc5e755ed3934c6c7-merged.mount: Deactivated successfully.
Jan 22 17:13:22 compute-0 ovn_controller[95372]: 2026-01-22T17:13:22Z|00280|binding|INFO|Setting lport 804a64f5-797f-4eba-ae49-100790171545 ovn-installed in OVS
Jan 22 17:13:22 compute-0 ovn_controller[95372]: 2026-01-22T17:13:22Z|00281|binding|INFO|Setting lport 804a64f5-797f-4eba-ae49-100790171545 up in Southbound
Jan 22 17:13:22 compute-0 ovn_controller[95372]: 2026-01-22T17:13:22Z|00282|binding|INFO|Releasing lport 804a64f5-797f-4eba-ae49-100790171545 from this chassis (sb_readonly=1)
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.097 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:22 compute-0 ovn_controller[95372]: 2026-01-22T17:13:22Z|00283|binding|INFO|Removing iface tap804a64f5-79 ovn-installed in OVS
Jan 22 17:13:22 compute-0 ovn_controller[95372]: 2026-01-22T17:13:22Z|00284|if_status|INFO|Dropped 2 log messages in last 73 seconds (most recently, 73 seconds ago) due to excessive rate
Jan 22 17:13:22 compute-0 ovn_controller[95372]: 2026-01-22T17:13:22Z|00285|if_status|INFO|Not setting lport 804a64f5-797f-4eba-ae49-100790171545 down as sb is readonly
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.100 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:22 compute-0 podman[221581]: 2026-01-22 17:13:22.102241287 +0000 UTC m=+0.104274400 container cleanup ce557a5a19054a6576b5f8e43aaebf77a0f24635c178933357e9c8fca275c8cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 17:13:22 compute-0 ovn_controller[95372]: 2026-01-22T17:13:22Z|00286|binding|INFO|Releasing lport 804a64f5-797f-4eba-ae49-100790171545 from this chassis (sb_readonly=0)
Jan 22 17:13:22 compute-0 ovn_controller[95372]: 2026-01-22T17:13:22Z|00287|binding|INFO|Setting lport 804a64f5-797f-4eba-ae49-100790171545 down in Southbound
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.107 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3a:03:2f 10.100.0.9'], port_security=['fa:16:3e:3a:03:2f 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '936001bf-d51b-4243-87b8-e363ef3c47a8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-576f6598-999f-46d9-809a-65b7475a1ec7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e05c7aae349e4a1d859a387df45650a0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b9cc1ac9-85f4-4da7-891e-b0644711e574', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.196', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=92087573-6410-4f80-a195-ce8b2058420e, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=804a64f5-797f-4eba-ae49-100790171545) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:13:22 compute-0 systemd[1]: libpod-conmon-ce557a5a19054a6576b5f8e43aaebf77a0f24635c178933357e9c8fca275c8cd.scope: Deactivated successfully.
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.121 183079 INFO nova.virt.libvirt.driver [-] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Instance destroyed successfully.
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.122 183079 DEBUG nova.objects.instance [None req-7aa3e1c1-f2de-438d-ab6f-51f886835ba9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lazy-loading 'resources' on Instance uuid 936001bf-d51b-4243-87b8-e363ef3c47a8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.135 183079 DEBUG nova.virt.libvirt.vif [None req-7aa3e1c1-f2de-438d-ab6f-51f886835ba9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:11:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1891253532',display_name='tempest-server-test-1891253532',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1891253532',id=20,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFroJSJbEvCYe2sCwXkwL5KPVfwqqRPNqrWU5nnhv8U4G8wYWbEqk5AtRWXltmX7CAIPBST+sTwCKfymhALKd54XNN92D6KKnKgA6jmOihENah4FGfU80fx1UEE0osLpiw==',key_name='tempest-keypair-test-618317104',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:11:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e05c7aae349e4a1d859a387df45650a0',ramdisk_id='',reservation_id='r-io1lyazv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIpSeparateNetwork-931877966',owner_user_name='tempest-FloatingIpSeparateNetwork-931877966-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:11:33Z,user_data=None,user_id='cd47d63cff2548a88e21e5c2e6a5c161',uuid=936001bf-d51b-4243-87b8-e363ef3c47a8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "804a64f5-797f-4eba-ae49-100790171545", "address": "fa:16:3e:3a:03:2f", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap804a64f5-79", "ovs_interfaceid": "804a64f5-797f-4eba-ae49-100790171545", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.136 183079 DEBUG nova.network.os_vif_util [None req-7aa3e1c1-f2de-438d-ab6f-51f886835ba9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converting VIF {"id": "804a64f5-797f-4eba-ae49-100790171545", "address": "fa:16:3e:3a:03:2f", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap804a64f5-79", "ovs_interfaceid": "804a64f5-797f-4eba-ae49-100790171545", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.136 183079 DEBUG nova.network.os_vif_util [None req-7aa3e1c1-f2de-438d-ab6f-51f886835ba9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3a:03:2f,bridge_name='br-int',has_traffic_filtering=True,id=804a64f5-797f-4eba-ae49-100790171545,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap804a64f5-79') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.137 183079 DEBUG os_vif [None req-7aa3e1c1-f2de-438d-ab6f-51f886835ba9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3a:03:2f,bridge_name='br-int',has_traffic_filtering=True,id=804a64f5-797f-4eba-ae49-100790171545,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap804a64f5-79') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.138 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.138 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap804a64f5-79, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.153 183079 DEBUG nova.network.neutron [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Updating instance_info_cache with network_info: [{"id": "3f6f0766-40d7-4301-8513-fdb50502511a", "address": "fa:16:3e:5b:24:9f", "network": {"id": "b6706592-b3b7-4148-ba2f-3b2dac46e91f", "bridge": "br-int", "label": "tempest-test-network--969320467", "subnets": [{"cidr": "10.10.220.0/24", "dns": [], "gateway": {"address": "10.10.220.254", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.220.230", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f6f0766-40", "ovs_interfaceid": "3f6f0766-40d7-4301-8513-fdb50502511a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.162 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.164 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.166 183079 INFO os_vif [None req-7aa3e1c1-f2de-438d-ab6f-51f886835ba9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3a:03:2f,bridge_name='br-int',has_traffic_filtering=True,id=804a64f5-797f-4eba-ae49-100790171545,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap804a64f5-79')
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.167 183079 INFO nova.virt.libvirt.driver [None req-7aa3e1c1-f2de-438d-ab6f-51f886835ba9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Deleting instance files /var/lib/nova/instances/936001bf-d51b-4243-87b8-e363ef3c47a8_del
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.167 183079 INFO nova.virt.libvirt.driver [None req-7aa3e1c1-f2de-438d-ab6f-51f886835ba9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Deletion of /var/lib/nova/instances/936001bf-d51b-4243-87b8-e363ef3c47a8_del complete
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.172 183079 DEBUG oslo_concurrency.lockutils [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Releasing lock "refresh_cache-ed1e087d-92fe-41d0-bd0f-e907e799d3bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.173 183079 DEBUG nova.compute.manager [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Instance network_info: |[{"id": "3f6f0766-40d7-4301-8513-fdb50502511a", "address": "fa:16:3e:5b:24:9f", "network": {"id": "b6706592-b3b7-4148-ba2f-3b2dac46e91f", "bridge": "br-int", "label": "tempest-test-network--969320467", "subnets": [{"cidr": "10.10.220.0/24", "dns": [], "gateway": {"address": "10.10.220.254", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.220.230", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f6f0766-40", "ovs_interfaceid": "3f6f0766-40d7-4301-8513-fdb50502511a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:13:22 compute-0 podman[221618]: 2026-01-22 17:13:22.17331239 +0000 UTC m=+0.046992656 container remove ce557a5a19054a6576b5f8e43aaebf77a0f24635c178933357e9c8fca275c8cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.173 183079 DEBUG oslo_concurrency.lockutils [req-aa58f36f-71c6-4651-96ff-15267bcc528b req-3c3b4c2d-a25f-4676-b0fa-be5ccfe41e4c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-ed1e087d-92fe-41d0-bd0f-e907e799d3bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.174 183079 DEBUG nova.network.neutron [req-aa58f36f-71c6-4651-96ff-15267bcc528b req-3c3b4c2d-a25f-4676-b0fa-be5ccfe41e4c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Refreshing network info cache for port 3f6f0766-40d7-4301-8513-fdb50502511a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.177 183079 DEBUG nova.virt.libvirt.driver [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Start _get_guest_xml network_info=[{"id": "3f6f0766-40d7-4301-8513-fdb50502511a", "address": "fa:16:3e:5b:24:9f", "network": {"id": "b6706592-b3b7-4148-ba2f-3b2dac46e91f", "bridge": "br-int", "label": "tempest-test-network--969320467", "subnets": [{"cidr": "10.10.220.0/24", "dns": [], "gateway": {"address": "10.10.220.254", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.220.230", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f6f0766-40", "ovs_interfaceid": "3f6f0766-40d7-4301-8513-fdb50502511a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.176 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[54289e8d-cb3c-485a-8e26-18fd7e63c2ff]: (4, ('Thu Jan 22 05:13:21 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7 (ce557a5a19054a6576b5f8e43aaebf77a0f24635c178933357e9c8fca275c8cd)\nce557a5a19054a6576b5f8e43aaebf77a0f24635c178933357e9c8fca275c8cd\nThu Jan 22 05:13:22 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7 (ce557a5a19054a6576b5f8e43aaebf77a0f24635c178933357e9c8fca275c8cd)\nce557a5a19054a6576b5f8e43aaebf77a0f24635c178933357e9c8fca275c8cd\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.179 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[db0f6f04-e439-4181-a3a4-45457c01aec5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.180 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap576f6598-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.182 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:22 compute-0 kernel: tap576f6598-90: left promiscuous mode
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.187 183079 WARNING nova.virt.libvirt.driver [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.191 183079 DEBUG nova.virt.libvirt.host [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.191 183079 DEBUG nova.virt.libvirt.host [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.197 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.198 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[77822796-4ef1-4446-821b-0e58cb90c5ba]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.198 183079 DEBUG nova.virt.libvirt.host [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.199 183079 DEBUG nova.virt.libvirt.host [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.200 183079 DEBUG nova.virt.libvirt.driver [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.200 183079 DEBUG nova.virt.hardware [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.200 183079 DEBUG nova.virt.hardware [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.201 183079 DEBUG nova.virt.hardware [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.201 183079 DEBUG nova.virt.hardware [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.201 183079 DEBUG nova.virt.hardware [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.201 183079 DEBUG nova.virt.hardware [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.202 183079 DEBUG nova.virt.hardware [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.202 183079 DEBUG nova.virt.hardware [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.202 183079 DEBUG nova.virt.hardware [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.202 183079 DEBUG nova.virt.hardware [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.202 183079 DEBUG nova.virt.hardware [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.206 183079 DEBUG nova.virt.libvirt.vif [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:13:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-310083763',display_name='tempest-server-test-310083763',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-310083763',id=25,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFDbtc4hRCSv9gHmltB2G2DhAS2UXQKlSKw7wO8jzDWMIBVwAbtKYxxZ/33aOA3bpXNLm1KqC5K8xgOmfq2yl/h8qq6Gn/Z0l0pwwJ1+wDWZki4CnB3qyJDKSWstQshx5w==',key_name='tempest-keypair-test-612599565',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='26cca885d303443380036cbbe9e70744',ramdisk_id='',reservation_id='r-k3vwoe13',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NetworkConnectivityTest-1809867331',owner_user_name='tempest-NetworkConnectivityTest-1809867331-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:13:20Z,user_data=None,user_id='4a7542774b9c42618cf9d00113f9d23d',uuid=ed1e087d-92fe-41d0-bd0f-e907e799d3bc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3f6f0766-40d7-4301-8513-fdb50502511a", "address": "fa:16:3e:5b:24:9f", "network": {"id": "b6706592-b3b7-4148-ba2f-3b2dac46e91f", "bridge": "br-int", "label": "tempest-test-network--969320467", "subnets": [{"cidr": "10.10.220.0/24", "dns": [], "gateway": {"address": "10.10.220.254", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.220.230", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f6f0766-40", "ovs_interfaceid": "3f6f0766-40d7-4301-8513-fdb50502511a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.206 183079 DEBUG nova.network.os_vif_util [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Converting VIF {"id": "3f6f0766-40d7-4301-8513-fdb50502511a", "address": "fa:16:3e:5b:24:9f", "network": {"id": "b6706592-b3b7-4148-ba2f-3b2dac46e91f", "bridge": "br-int", "label": "tempest-test-network--969320467", "subnets": [{"cidr": "10.10.220.0/24", "dns": [], "gateway": {"address": "10.10.220.254", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.220.230", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f6f0766-40", "ovs_interfaceid": "3f6f0766-40d7-4301-8513-fdb50502511a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.207 183079 DEBUG nova.network.os_vif_util [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:24:9f,bridge_name='br-int',has_traffic_filtering=True,id=3f6f0766-40d7-4301-8513-fdb50502511a,network=Network(b6706592-b3b7-4148-ba2f-3b2dac46e91f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3f6f0766-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.207 183079 DEBUG nova.objects.instance [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lazy-loading 'pci_devices' on Instance uuid ed1e087d-92fe-41d0-bd0f-e907e799d3bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.211 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3ecd201c-5d78-4e1f-9727-6a3d91b21b2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.212 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8e504f1d-9931-45be-922b-dd4dd9771723]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.224 183079 DEBUG nova.virt.libvirt.driver [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:13:22 compute-0 nova_compute[183075]:   <uuid>ed1e087d-92fe-41d0-bd0f-e907e799d3bc</uuid>
Jan 22 17:13:22 compute-0 nova_compute[183075]:   <name>instance-00000019</name>
Jan 22 17:13:22 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:13:22 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:13:22 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:13:22 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-310083763</nova:name>
Jan 22 17:13:22 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:13:22</nova:creationTime>
Jan 22 17:13:22 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:13:22 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:13:22 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:13:22 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:13:22 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:13:22 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:13:22 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:13:22 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:13:22 compute-0 nova_compute[183075]:         <nova:user uuid="4a7542774b9c42618cf9d00113f9d23d">tempest-NetworkConnectivityTest-1809867331-project-member</nova:user>
Jan 22 17:13:22 compute-0 nova_compute[183075]:         <nova:project uuid="26cca885d303443380036cbbe9e70744">tempest-NetworkConnectivityTest-1809867331</nova:project>
Jan 22 17:13:22 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:13:22 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:13:22 compute-0 nova_compute[183075]:         <nova:port uuid="3f6f0766-40d7-4301-8513-fdb50502511a">
Jan 22 17:13:22 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.10.220.230" ipVersion="4"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:13:22 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:13:22 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:13:22 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <system>
Jan 22 17:13:22 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:13:22 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:13:22 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:13:22 compute-0 nova_compute[183075]:       <entry name="serial">ed1e087d-92fe-41d0-bd0f-e907e799d3bc</entry>
Jan 22 17:13:22 compute-0 nova_compute[183075]:       <entry name="uuid">ed1e087d-92fe-41d0-bd0f-e907e799d3bc</entry>
Jan 22 17:13:22 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     </system>
Jan 22 17:13:22 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:13:22 compute-0 nova_compute[183075]:   <os>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:   </os>
Jan 22 17:13:22 compute-0 nova_compute[183075]:   <features>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:   </features>
Jan 22 17:13:22 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:13:22 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:13:22 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:13:22 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/ed1e087d-92fe-41d0-bd0f-e907e799d3bc/disk"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:13:22 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:5b:24:9f"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:       <target dev="tap3f6f0766-40"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:13:22 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/ed1e087d-92fe-41d0-bd0f-e907e799d3bc/console.log" append="off"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <video>
Jan 22 17:13:22 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     </video>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:13:22 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:13:22 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:13:22 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:13:22 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:13:22 compute-0 nova_compute[183075]: </domain>
Jan 22 17:13:22 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.226 183079 DEBUG nova.compute.manager [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Preparing to wait for external event network-vif-plugged-3f6f0766-40d7-4301-8513-fdb50502511a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.226 183079 DEBUG oslo_concurrency.lockutils [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "ed1e087d-92fe-41d0-bd0f-e907e799d3bc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.227 183079 DEBUG oslo_concurrency.lockutils [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "ed1e087d-92fe-41d0-bd0f-e907e799d3bc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.227 183079 DEBUG oslo_concurrency.lockutils [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "ed1e087d-92fe-41d0-bd0f-e907e799d3bc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.228 183079 DEBUG nova.virt.libvirt.vif [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:13:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-310083763',display_name='tempest-server-test-310083763',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-310083763',id=25,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFDbtc4hRCSv9gHmltB2G2DhAS2UXQKlSKw7wO8jzDWMIBVwAbtKYxxZ/33aOA3bpXNLm1KqC5K8xgOmfq2yl/h8qq6Gn/Z0l0pwwJ1+wDWZki4CnB3qyJDKSWstQshx5w==',key_name='tempest-keypair-test-612599565',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='26cca885d303443380036cbbe9e70744',ramdisk_id='',reservation_id='r-k3vwoe13',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_
model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NetworkConnectivityTest-1809867331',owner_user_name='tempest-NetworkConnectivityTest-1809867331-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:13:20Z,user_data=None,user_id='4a7542774b9c42618cf9d00113f9d23d',uuid=ed1e087d-92fe-41d0-bd0f-e907e799d3bc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3f6f0766-40d7-4301-8513-fdb50502511a", "address": "fa:16:3e:5b:24:9f", "network": {"id": "b6706592-b3b7-4148-ba2f-3b2dac46e91f", "bridge": "br-int", "label": "tempest-test-network--969320467", "subnets": [{"cidr": "10.10.220.0/24", "dns": [], "gateway": {"address": "10.10.220.254", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.220.230", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f6f0766-40", "ovs_interfaceid": "3f6f0766-40d7-4301-8513-fdb50502511a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.228 183079 DEBUG nova.network.os_vif_util [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Converting VIF {"id": "3f6f0766-40d7-4301-8513-fdb50502511a", "address": "fa:16:3e:5b:24:9f", "network": {"id": "b6706592-b3b7-4148-ba2f-3b2dac46e91f", "bridge": "br-int", "label": "tempest-test-network--969320467", "subnets": [{"cidr": "10.10.220.0/24", "dns": [], "gateway": {"address": "10.10.220.254", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.220.230", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f6f0766-40", "ovs_interfaceid": "3f6f0766-40d7-4301-8513-fdb50502511a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.229 183079 DEBUG nova.network.os_vif_util [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:24:9f,bridge_name='br-int',has_traffic_filtering=True,id=3f6f0766-40d7-4301-8513-fdb50502511a,network=Network(b6706592-b3b7-4148-ba2f-3b2dac46e91f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3f6f0766-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.228 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b3756017-e4be-4ace-ad6c-b403b2ffc180]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425623, 'reachable_time': 22238, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221633, 'error': None, 'target': 'ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.229 183079 DEBUG os_vif [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:24:9f,bridge_name='br-int',has_traffic_filtering=True,id=3f6f0766-40d7-4301-8513-fdb50502511a,network=Network(b6706592-b3b7-4148-ba2f-3b2dac46e91f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3f6f0766-40') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.230 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.231 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.231 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[b568b13c-c160-483e-9397-3b51f7d7b3f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.232 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 804a64f5-797f-4eba-ae49-100790171545 in datapath 576f6598-999f-46d9-809a-65b7475a1ec7 unbound from our chassis
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.231 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.232 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:13:22 compute-0 systemd[1]: run-netns-ovnmeta\x2d576f6598\x2d999f\x2d46d9\x2d809a\x2d65b7475a1ec7.mount: Deactivated successfully.
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.233 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 576f6598-999f-46d9-809a-65b7475a1ec7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.234 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e2a9ab19-64d6-46f0-9f2e-d834615013c9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.235 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 804a64f5-797f-4eba-ae49-100790171545 in datapath 576f6598-999f-46d9-809a-65b7475a1ec7 unbound from our chassis
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.236 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.236 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3f6f0766-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.236 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3f6f0766-40, col_values=(('external_ids', {'iface-id': '3f6f0766-40d7-4301-8513-fdb50502511a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5b:24:9f', 'vm-uuid': 'ed1e087d-92fe-41d0-bd0f-e907e799d3bc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.237 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 576f6598-999f-46d9-809a-65b7475a1ec7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.238 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:22 compute-0 NetworkManager[55454]: <info>  [1769102002.2390] manager: (tap3f6f0766-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/121)
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.238 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7bd4ad6b-8d51-4571-b94c-d10c5724e315]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.240 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.242 183079 INFO nova.compute.manager [None req-7aa3e1c1-f2de-438d-ab6f-51f886835ba9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Took 0.39 seconds to destroy the instance on the hypervisor.
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.243 183079 DEBUG oslo.service.loopingcall [None req-7aa3e1c1-f2de-438d-ab6f-51f886835ba9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.243 183079 DEBUG nova.compute.manager [-] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.243 183079 DEBUG nova.network.neutron [-] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.245 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.246 183079 INFO os_vif [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:24:9f,bridge_name='br-int',has_traffic_filtering=True,id=3f6f0766-40d7-4301-8513-fdb50502511a,network=Network(b6706592-b3b7-4148-ba2f-3b2dac46e91f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3f6f0766-40')
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.296 183079 DEBUG nova.virt.libvirt.driver [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.296 183079 DEBUG nova.virt.libvirt.driver [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] No VIF found with MAC fa:16:3e:5b:24:9f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:13:22 compute-0 NetworkManager[55454]: <info>  [1769102002.3478] manager: (tap3f6f0766-40): new Tun device (/org/freedesktop/NetworkManager/Devices/122)
Jan 22 17:13:22 compute-0 kernel: tap3f6f0766-40: entered promiscuous mode
Jan 22 17:13:22 compute-0 ovn_controller[95372]: 2026-01-22T17:13:22Z|00288|binding|INFO|Claiming lport 3f6f0766-40d7-4301-8513-fdb50502511a for this chassis.
Jan 22 17:13:22 compute-0 ovn_controller[95372]: 2026-01-22T17:13:22Z|00289|binding|INFO|3f6f0766-40d7-4301-8513-fdb50502511a: Claiming fa:16:3e:5b:24:9f 10.10.220.230
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.350 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.358 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:24:9f 10.10.220.230'], port_security=['fa:16:3e:5b:24:9f 10.10.220.230'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.10.220.230/24', 'neutron:device_id': 'ed1e087d-92fe-41d0-bd0f-e907e799d3bc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b6706592-b3b7-4148-ba2f-3b2dac46e91f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '26cca885d303443380036cbbe9e70744', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9427a67d-1313-4d60-b73e-5a3f81f9a54d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d4be7746-829b-4878-bcc8-6c751abfe325, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=3f6f0766-40d7-4301-8513-fdb50502511a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:13:22 compute-0 NetworkManager[55454]: <info>  [1769102002.3592] device (tap3f6f0766-40): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.359 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 3f6f0766-40d7-4301-8513-fdb50502511a in datapath b6706592-b3b7-4148-ba2f-3b2dac46e91f bound to our chassis
Jan 22 17:13:22 compute-0 NetworkManager[55454]: <info>  [1769102002.3622] device (tap3f6f0766-40): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.362 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b6706592-b3b7-4148-ba2f-3b2dac46e91f
Jan 22 17:13:22 compute-0 ovn_controller[95372]: 2026-01-22T17:13:22Z|00290|binding|INFO|Setting lport 3f6f0766-40d7-4301-8513-fdb50502511a ovn-installed in OVS
Jan 22 17:13:22 compute-0 ovn_controller[95372]: 2026-01-22T17:13:22Z|00291|binding|INFO|Setting lport 3f6f0766-40d7-4301-8513-fdb50502511a up in Southbound
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.366 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.369 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.375 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e1f1cd7a-53f1-4efa-9868-7e94cd12cd66]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.376 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb6706592-b1 in ovnmeta-b6706592-b3b7-4148-ba2f-3b2dac46e91f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.378 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb6706592-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.378 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[fa12fe48-43c1-4247-a54e-9b0f1575a144]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.379 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7103ebbe-caf3-4a59-afd7-8872f2f6b444]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:22 compute-0 systemd-machined[154382]: New machine qemu-25-instance-00000019.
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.391 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[fb523173-be1d-4d19-bfb7-544e52b35989]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:22 compute-0 systemd[1]: Started Virtual Machine qemu-25-instance-00000019.
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.413 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[96acbfcf-51bf-487e-8ecd-53f536231f75]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.441 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[f93fb395-6691-4d10-986a-8ea56ce473a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.446 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f60a1e46-f248-4ba6-a8ea-ef34b54fc3ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:22 compute-0 NetworkManager[55454]: <info>  [1769102002.4482] manager: (tapb6706592-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/123)
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.479 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[3c34bf48-3a83-48f9-bfec-d8713b7d4fd2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.481 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[ae05001e-adc7-428f-9c3f-75b28df674bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:22 compute-0 NetworkManager[55454]: <info>  [1769102002.5051] device (tapb6706592-b0): carrier: link connected
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.511 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[e177c7ca-2ea3-4b18-81d3-98b8b8240b4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.529 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[61aae1d1-ad36-43e2-9312-e2850bdc3674]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb6706592-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:d1:ba'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 79], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 436620, 'reachable_time': 21392, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221681, 'error': None, 'target': 'ovnmeta-b6706592-b3b7-4148-ba2f-3b2dac46e91f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.543 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f88a0809-c3c9-4aee-a651-0d85c01cdf0b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb3:d1ba'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 436620, 'tstamp': 436620}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221682, 'error': None, 'target': 'ovnmeta-b6706592-b3b7-4148-ba2f-3b2dac46e91f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.558 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a7999cf4-a68a-42a4-95a6-d909f86f4bc1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb6706592-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:d1:ba'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 79], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 436620, 'reachable_time': 21392, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221683, 'error': None, 'target': 'ovnmeta-b6706592-b3b7-4148-ba2f-3b2dac46e91f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.589 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[39d907a5-ee2b-4045-a1fa-2d4d3c422e7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.655 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[20274a50-2f27-4874-b29a-4083b7635a0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.657 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb6706592-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.658 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.658 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb6706592-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:13:22 compute-0 NetworkManager[55454]: <info>  [1769102002.6613] manager: (tapb6706592-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/124)
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.660 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:22 compute-0 kernel: tapb6706592-b0: entered promiscuous mode
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.664 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb6706592-b0, col_values=(('external_ids', {'iface-id': 'd0c72a1e-55e5-4054-b84d-c12ef156a0e7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:13:22 compute-0 ovn_controller[95372]: 2026-01-22T17:13:22Z|00292|binding|INFO|Releasing lport d0c72a1e-55e5-4054-b84d-c12ef156a0e7 from this chassis (sb_readonly=0)
Jan 22 17:13:22 compute-0 nova_compute[183075]: 2026-01-22 17:13:22.676 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.677 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b6706592-b3b7-4148-ba2f-3b2dac46e91f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b6706592-b3b7-4148-ba2f-3b2dac46e91f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.678 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c20b3aec-a371-4196-aa94-c4241a3b3a7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.679 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-b6706592-b3b7-4148-ba2f-3b2dac46e91f
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/b6706592-b3b7-4148-ba2f-3b2dac46e91f.pid.haproxy
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID b6706592-b3b7-4148-ba2f-3b2dac46e91f
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:13:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:22.680 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b6706592-b3b7-4148-ba2f-3b2dac46e91f', 'env', 'PROCESS_TAG=haproxy-b6706592-b3b7-4148-ba2f-3b2dac46e91f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b6706592-b3b7-4148-ba2f-3b2dac46e91f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:13:23 compute-0 podman[221713]: 2026-01-22 17:13:23.103688786 +0000 UTC m=+0.061225257 container create 8f8b650c4cb3750afb653a102a60863b104a09927661f4ae48d4caf8149337ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b6706592-b3b7-4148-ba2f-3b2dac46e91f, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 17:13:23 compute-0 systemd[1]: Started libpod-conmon-8f8b650c4cb3750afb653a102a60863b104a09927661f4ae48d4caf8149337ec.scope.
Jan 22 17:13:23 compute-0 podman[221713]: 2026-01-22 17:13:23.065598873 +0000 UTC m=+0.023135324 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:13:23 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:13:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2197d0b6530bf1130a3d1477e9e62aa4f77a854014f1a11c01ad7194fcfea478/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:13:23 compute-0 podman[221713]: 2026-01-22 17:13:23.197605685 +0000 UTC m=+0.155142136 container init 8f8b650c4cb3750afb653a102a60863b104a09927661f4ae48d4caf8149337ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b6706592-b3b7-4148-ba2f-3b2dac46e91f, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 22 17:13:23 compute-0 podman[221713]: 2026-01-22 17:13:23.204907145 +0000 UTC m=+0.162443586 container start 8f8b650c4cb3750afb653a102a60863b104a09927661f4ae48d4caf8149337ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b6706592-b3b7-4148-ba2f-3b2dac46e91f, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 22 17:13:23 compute-0 neutron-haproxy-ovnmeta-b6706592-b3b7-4148-ba2f-3b2dac46e91f[221729]: [NOTICE]   (221733) : New worker (221735) forked
Jan 22 17:13:23 compute-0 neutron-haproxy-ovnmeta-b6706592-b3b7-4148-ba2f-3b2dac46e91f[221729]: [NOTICE]   (221733) : Loading success.
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.275 183079 DEBUG nova.compute.manager [req-ac09746a-0ee4-447d-860e-9c6366ecd2fe req-dbe9c0f9-4502-4fa7-b0b7-c0f6abacc417 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Received event network-vif-unplugged-804a64f5-797f-4eba-ae49-100790171545 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.276 183079 DEBUG oslo_concurrency.lockutils [req-ac09746a-0ee4-447d-860e-9c6366ecd2fe req-dbe9c0f9-4502-4fa7-b0b7-c0f6abacc417 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "936001bf-d51b-4243-87b8-e363ef3c47a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.277 183079 DEBUG oslo_concurrency.lockutils [req-ac09746a-0ee4-447d-860e-9c6366ecd2fe req-dbe9c0f9-4502-4fa7-b0b7-c0f6abacc417 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "936001bf-d51b-4243-87b8-e363ef3c47a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.277 183079 DEBUG oslo_concurrency.lockutils [req-ac09746a-0ee4-447d-860e-9c6366ecd2fe req-dbe9c0f9-4502-4fa7-b0b7-c0f6abacc417 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "936001bf-d51b-4243-87b8-e363ef3c47a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.277 183079 DEBUG nova.compute.manager [req-ac09746a-0ee4-447d-860e-9c6366ecd2fe req-dbe9c0f9-4502-4fa7-b0b7-c0f6abacc417 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] No waiting events found dispatching network-vif-unplugged-804a64f5-797f-4eba-ae49-100790171545 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.277 183079 DEBUG nova.compute.manager [req-ac09746a-0ee4-447d-860e-9c6366ecd2fe req-dbe9c0f9-4502-4fa7-b0b7-c0f6abacc417 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Received event network-vif-unplugged-804a64f5-797f-4eba-ae49-100790171545 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.278 183079 DEBUG nova.compute.manager [req-ac09746a-0ee4-447d-860e-9c6366ecd2fe req-dbe9c0f9-4502-4fa7-b0b7-c0f6abacc417 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Received event network-vif-plugged-804a64f5-797f-4eba-ae49-100790171545 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.278 183079 DEBUG oslo_concurrency.lockutils [req-ac09746a-0ee4-447d-860e-9c6366ecd2fe req-dbe9c0f9-4502-4fa7-b0b7-c0f6abacc417 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "936001bf-d51b-4243-87b8-e363ef3c47a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.278 183079 DEBUG oslo_concurrency.lockutils [req-ac09746a-0ee4-447d-860e-9c6366ecd2fe req-dbe9c0f9-4502-4fa7-b0b7-c0f6abacc417 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "936001bf-d51b-4243-87b8-e363ef3c47a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.278 183079 DEBUG oslo_concurrency.lockutils [req-ac09746a-0ee4-447d-860e-9c6366ecd2fe req-dbe9c0f9-4502-4fa7-b0b7-c0f6abacc417 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "936001bf-d51b-4243-87b8-e363ef3c47a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.279 183079 DEBUG nova.compute.manager [req-ac09746a-0ee4-447d-860e-9c6366ecd2fe req-dbe9c0f9-4502-4fa7-b0b7-c0f6abacc417 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] No waiting events found dispatching network-vif-plugged-804a64f5-797f-4eba-ae49-100790171545 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.279 183079 WARNING nova.compute.manager [req-ac09746a-0ee4-447d-860e-9c6366ecd2fe req-dbe9c0f9-4502-4fa7-b0b7-c0f6abacc417 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Received unexpected event network-vif-plugged-804a64f5-797f-4eba-ae49-100790171545 for instance with vm_state active and task_state deleting.
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.279 183079 DEBUG nova.compute.manager [req-ac09746a-0ee4-447d-860e-9c6366ecd2fe req-dbe9c0f9-4502-4fa7-b0b7-c0f6abacc417 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Received event network-vif-plugged-804a64f5-797f-4eba-ae49-100790171545 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.279 183079 DEBUG oslo_concurrency.lockutils [req-ac09746a-0ee4-447d-860e-9c6366ecd2fe req-dbe9c0f9-4502-4fa7-b0b7-c0f6abacc417 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "936001bf-d51b-4243-87b8-e363ef3c47a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.280 183079 DEBUG oslo_concurrency.lockutils [req-ac09746a-0ee4-447d-860e-9c6366ecd2fe req-dbe9c0f9-4502-4fa7-b0b7-c0f6abacc417 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "936001bf-d51b-4243-87b8-e363ef3c47a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.280 183079 DEBUG oslo_concurrency.lockutils [req-ac09746a-0ee4-447d-860e-9c6366ecd2fe req-dbe9c0f9-4502-4fa7-b0b7-c0f6abacc417 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "936001bf-d51b-4243-87b8-e363ef3c47a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.280 183079 DEBUG nova.compute.manager [req-ac09746a-0ee4-447d-860e-9c6366ecd2fe req-dbe9c0f9-4502-4fa7-b0b7-c0f6abacc417 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] No waiting events found dispatching network-vif-plugged-804a64f5-797f-4eba-ae49-100790171545 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.280 183079 WARNING nova.compute.manager [req-ac09746a-0ee4-447d-860e-9c6366ecd2fe req-dbe9c0f9-4502-4fa7-b0b7-c0f6abacc417 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Received unexpected event network-vif-plugged-804a64f5-797f-4eba-ae49-100790171545 for instance with vm_state active and task_state deleting.
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.280 183079 DEBUG nova.compute.manager [req-ac09746a-0ee4-447d-860e-9c6366ecd2fe req-dbe9c0f9-4502-4fa7-b0b7-c0f6abacc417 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Received event network-vif-plugged-804a64f5-797f-4eba-ae49-100790171545 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.281 183079 DEBUG oslo_concurrency.lockutils [req-ac09746a-0ee4-447d-860e-9c6366ecd2fe req-dbe9c0f9-4502-4fa7-b0b7-c0f6abacc417 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "936001bf-d51b-4243-87b8-e363ef3c47a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.281 183079 DEBUG oslo_concurrency.lockutils [req-ac09746a-0ee4-447d-860e-9c6366ecd2fe req-dbe9c0f9-4502-4fa7-b0b7-c0f6abacc417 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "936001bf-d51b-4243-87b8-e363ef3c47a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.281 183079 DEBUG oslo_concurrency.lockutils [req-ac09746a-0ee4-447d-860e-9c6366ecd2fe req-dbe9c0f9-4502-4fa7-b0b7-c0f6abacc417 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "936001bf-d51b-4243-87b8-e363ef3c47a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.281 183079 DEBUG nova.compute.manager [req-ac09746a-0ee4-447d-860e-9c6366ecd2fe req-dbe9c0f9-4502-4fa7-b0b7-c0f6abacc417 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] No waiting events found dispatching network-vif-plugged-804a64f5-797f-4eba-ae49-100790171545 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.282 183079 WARNING nova.compute.manager [req-ac09746a-0ee4-447d-860e-9c6366ecd2fe req-dbe9c0f9-4502-4fa7-b0b7-c0f6abacc417 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Received unexpected event network-vif-plugged-804a64f5-797f-4eba-ae49-100790171545 for instance with vm_state active and task_state deleting.
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.363 183079 DEBUG nova.network.neutron [-] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.378 183079 DEBUG nova.compute.manager [req-ddea4e2e-df74-4535-9899-79d0e350e0d8 req-ed5fe2fb-1cc4-4057-b895-ef42669f96dd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Received event network-vif-plugged-3f6f0766-40d7-4301-8513-fdb50502511a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.378 183079 DEBUG oslo_concurrency.lockutils [req-ddea4e2e-df74-4535-9899-79d0e350e0d8 req-ed5fe2fb-1cc4-4057-b895-ef42669f96dd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "ed1e087d-92fe-41d0-bd0f-e907e799d3bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.378 183079 DEBUG oslo_concurrency.lockutils [req-ddea4e2e-df74-4535-9899-79d0e350e0d8 req-ed5fe2fb-1cc4-4057-b895-ef42669f96dd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "ed1e087d-92fe-41d0-bd0f-e907e799d3bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.379 183079 DEBUG oslo_concurrency.lockutils [req-ddea4e2e-df74-4535-9899-79d0e350e0d8 req-ed5fe2fb-1cc4-4057-b895-ef42669f96dd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "ed1e087d-92fe-41d0-bd0f-e907e799d3bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.379 183079 DEBUG nova.compute.manager [req-ddea4e2e-df74-4535-9899-79d0e350e0d8 req-ed5fe2fb-1cc4-4057-b895-ef42669f96dd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Processing event network-vif-plugged-3f6f0766-40d7-4301-8513-fdb50502511a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.379 183079 DEBUG nova.compute.manager [req-ddea4e2e-df74-4535-9899-79d0e350e0d8 req-ed5fe2fb-1cc4-4057-b895-ef42669f96dd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Received event network-vif-plugged-3f6f0766-40d7-4301-8513-fdb50502511a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.379 183079 DEBUG oslo_concurrency.lockutils [req-ddea4e2e-df74-4535-9899-79d0e350e0d8 req-ed5fe2fb-1cc4-4057-b895-ef42669f96dd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "ed1e087d-92fe-41d0-bd0f-e907e799d3bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.380 183079 DEBUG oslo_concurrency.lockutils [req-ddea4e2e-df74-4535-9899-79d0e350e0d8 req-ed5fe2fb-1cc4-4057-b895-ef42669f96dd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "ed1e087d-92fe-41d0-bd0f-e907e799d3bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.380 183079 DEBUG oslo_concurrency.lockutils [req-ddea4e2e-df74-4535-9899-79d0e350e0d8 req-ed5fe2fb-1cc4-4057-b895-ef42669f96dd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "ed1e087d-92fe-41d0-bd0f-e907e799d3bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.380 183079 DEBUG nova.compute.manager [req-ddea4e2e-df74-4535-9899-79d0e350e0d8 req-ed5fe2fb-1cc4-4057-b895-ef42669f96dd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] No waiting events found dispatching network-vif-plugged-3f6f0766-40d7-4301-8513-fdb50502511a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.380 183079 WARNING nova.compute.manager [req-ddea4e2e-df74-4535-9899-79d0e350e0d8 req-ed5fe2fb-1cc4-4057-b895-ef42669f96dd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Received unexpected event network-vif-plugged-3f6f0766-40d7-4301-8513-fdb50502511a for instance with vm_state building and task_state spawning.
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.382 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102003.3817296, ed1e087d-92fe-41d0-bd0f-e907e799d3bc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.382 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] VM Started (Lifecycle Event)
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.384 183079 INFO nova.compute.manager [-] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Took 1.14 seconds to deallocate network for instance.
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.384 183079 DEBUG nova.compute.manager [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.392 183079 DEBUG nova.virt.libvirt.driver [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.396 183079 INFO nova.virt.libvirt.driver [-] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Instance spawned successfully.
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.397 183079 DEBUG nova.virt.libvirt.driver [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.406 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.408 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.416 183079 DEBUG nova.virt.libvirt.driver [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.416 183079 DEBUG nova.virt.libvirt.driver [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.417 183079 DEBUG nova.virt.libvirt.driver [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.417 183079 DEBUG nova.virt.libvirt.driver [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.418 183079 DEBUG nova.virt.libvirt.driver [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.418 183079 DEBUG nova.virt.libvirt.driver [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.438 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.438 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102003.3839262, ed1e087d-92fe-41d0-bd0f-e907e799d3bc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.438 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] VM Paused (Lifecycle Event)
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.448 183079 DEBUG oslo_concurrency.lockutils [None req-7aa3e1c1-f2de-438d-ab6f-51f886835ba9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.448 183079 DEBUG oslo_concurrency.lockutils [None req-7aa3e1c1-f2de-438d-ab6f-51f886835ba9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.462 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.467 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102003.3906941, ed1e087d-92fe-41d0-bd0f-e907e799d3bc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.467 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] VM Resumed (Lifecycle Event)
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.487 183079 INFO nova.compute.manager [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Took 3.39 seconds to spawn the instance on the hypervisor.
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.487 183079 DEBUG nova.compute.manager [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.488 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.495 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.525 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.557 183079 INFO nova.compute.manager [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Took 3.93 seconds to build instance.
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.572 183079 DEBUG nova.compute.provider_tree [None req-7aa3e1c1-f2de-438d-ab6f-51f886835ba9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.575 183079 DEBUG oslo_concurrency.lockutils [None req-6865622c-4e23-47d2-a89c-2d3bd4dff9b7 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "ed1e087d-92fe-41d0-bd0f-e907e799d3bc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.009s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.585 183079 DEBUG nova.scheduler.client.report [None req-7aa3e1c1-f2de-438d-ab6f-51f886835ba9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.607 183079 DEBUG oslo_concurrency.lockutils [None req-7aa3e1c1-f2de-438d-ab6f-51f886835ba9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.633 183079 INFO nova.scheduler.client.report [None req-7aa3e1c1-f2de-438d-ab6f-51f886835ba9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Deleted allocations for instance 936001bf-d51b-4243-87b8-e363ef3c47a8
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.700 183079 DEBUG nova.network.neutron [req-aa58f36f-71c6-4651-96ff-15267bcc528b req-3c3b4c2d-a25f-4676-b0fa-be5ccfe41e4c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Updated VIF entry in instance network info cache for port 3f6f0766-40d7-4301-8513-fdb50502511a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.701 183079 DEBUG nova.network.neutron [req-aa58f36f-71c6-4651-96ff-15267bcc528b req-3c3b4c2d-a25f-4676-b0fa-be5ccfe41e4c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Updating instance_info_cache with network_info: [{"id": "3f6f0766-40d7-4301-8513-fdb50502511a", "address": "fa:16:3e:5b:24:9f", "network": {"id": "b6706592-b3b7-4148-ba2f-3b2dac46e91f", "bridge": "br-int", "label": "tempest-test-network--969320467", "subnets": [{"cidr": "10.10.220.0/24", "dns": [], "gateway": {"address": "10.10.220.254", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.220.230", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f6f0766-40", "ovs_interfaceid": "3f6f0766-40d7-4301-8513-fdb50502511a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.715 183079 DEBUG oslo_concurrency.lockutils [None req-7aa3e1c1-f2de-438d-ab6f-51f886835ba9 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "936001bf-d51b-4243-87b8-e363ef3c47a8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.869s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:23 compute-0 nova_compute[183075]: 2026-01-22 17:13:23.726 183079 DEBUG oslo_concurrency.lockutils [req-aa58f36f-71c6-4651-96ff-15267bcc528b req-3c3b4c2d-a25f-4676-b0fa-be5ccfe41e4c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-ed1e087d-92fe-41d0-bd0f-e907e799d3bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:13:24 compute-0 nova_compute[183075]: 2026-01-22 17:13:24.255 183079 INFO nova.compute.manager [None req-394d1bc1-4297-4e0e-a425-77af64b0827c 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Get console output
Jan 22 17:13:24 compute-0 nova_compute[183075]: 2026-01-22 17:13:24.260 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:13:25 compute-0 nova_compute[183075]: 2026-01-22 17:13:25.370 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:25 compute-0 nova_compute[183075]: 2026-01-22 17:13:25.378 183079 DEBUG nova.compute.manager [req-3f95bc83-1d2a-42bb-869f-34bbe76f2b4e req-f7965d48-eee6-4b22-bb51-395b583f508a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Received event network-vif-plugged-804a64f5-797f-4eba-ae49-100790171545 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:13:25 compute-0 nova_compute[183075]: 2026-01-22 17:13:25.379 183079 DEBUG oslo_concurrency.lockutils [req-3f95bc83-1d2a-42bb-869f-34bbe76f2b4e req-f7965d48-eee6-4b22-bb51-395b583f508a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "936001bf-d51b-4243-87b8-e363ef3c47a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:25 compute-0 nova_compute[183075]: 2026-01-22 17:13:25.379 183079 DEBUG oslo_concurrency.lockutils [req-3f95bc83-1d2a-42bb-869f-34bbe76f2b4e req-f7965d48-eee6-4b22-bb51-395b583f508a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "936001bf-d51b-4243-87b8-e363ef3c47a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:25 compute-0 nova_compute[183075]: 2026-01-22 17:13:25.380 183079 DEBUG oslo_concurrency.lockutils [req-3f95bc83-1d2a-42bb-869f-34bbe76f2b4e req-f7965d48-eee6-4b22-bb51-395b583f508a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "936001bf-d51b-4243-87b8-e363ef3c47a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:25 compute-0 nova_compute[183075]: 2026-01-22 17:13:25.380 183079 DEBUG nova.compute.manager [req-3f95bc83-1d2a-42bb-869f-34bbe76f2b4e req-f7965d48-eee6-4b22-bb51-395b583f508a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] No waiting events found dispatching network-vif-plugged-804a64f5-797f-4eba-ae49-100790171545 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:13:25 compute-0 nova_compute[183075]: 2026-01-22 17:13:25.380 183079 WARNING nova.compute.manager [req-3f95bc83-1d2a-42bb-869f-34bbe76f2b4e req-f7965d48-eee6-4b22-bb51-395b583f508a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Received unexpected event network-vif-plugged-804a64f5-797f-4eba-ae49-100790171545 for instance with vm_state deleted and task_state None.
Jan 22 17:13:26 compute-0 nova_compute[183075]: 2026-01-22 17:13:26.733 183079 INFO nova.compute.manager [None req-4fe42abe-422a-412d-8751-9ac8453533cc 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Get console output
Jan 22 17:13:26 compute-0 nova_compute[183075]: 2026-01-22 17:13:26.739 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:13:26 compute-0 nova_compute[183075]: 2026-01-22 17:13:26.880 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:13:27 compute-0 nova_compute[183075]: 2026-01-22 17:13:27.241 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:28 compute-0 nova_compute[183075]: 2026-01-22 17:13:28.641 183079 DEBUG oslo_concurrency.lockutils [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "84b90c1e-91a0-437d-8ed2-956840c552ab" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:28 compute-0 nova_compute[183075]: 2026-01-22 17:13:28.641 183079 DEBUG oslo_concurrency.lockutils [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "84b90c1e-91a0-437d-8ed2-956840c552ab" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:28 compute-0 nova_compute[183075]: 2026-01-22 17:13:28.665 183079 DEBUG nova.compute.manager [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:13:28 compute-0 nova_compute[183075]: 2026-01-22 17:13:28.794 183079 DEBUG oslo_concurrency.lockutils [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:28 compute-0 nova_compute[183075]: 2026-01-22 17:13:28.795 183079 DEBUG oslo_concurrency.lockutils [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:28 compute-0 nova_compute[183075]: 2026-01-22 17:13:28.801 183079 DEBUG nova.virt.hardware [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:13:28 compute-0 nova_compute[183075]: 2026-01-22 17:13:28.802 183079 INFO nova.compute.claims [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:13:28 compute-0 nova_compute[183075]: 2026-01-22 17:13:28.970 183079 DEBUG nova.compute.provider_tree [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.033 183079 DEBUG nova.scheduler.client.report [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.056 183079 DEBUG oslo_concurrency.lockutils [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.262s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.057 183079 DEBUG nova.compute.manager [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.109 183079 DEBUG nova.compute.manager [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.109 183079 DEBUG nova.network.neutron [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.133 183079 INFO nova.virt.libvirt.driver [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.153 183079 DEBUG nova.compute.manager [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.241 183079 DEBUG nova.compute.manager [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.244 183079 DEBUG nova.virt.libvirt.driver [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.244 183079 INFO nova.virt.libvirt.driver [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Creating image(s)
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.245 183079 DEBUG oslo_concurrency.lockutils [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "/var/lib/nova/instances/84b90c1e-91a0-437d-8ed2-956840c552ab/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.246 183079 DEBUG oslo_concurrency.lockutils [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "/var/lib/nova/instances/84b90c1e-91a0-437d-8ed2-956840c552ab/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.247 183079 DEBUG oslo_concurrency.lockutils [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "/var/lib/nova/instances/84b90c1e-91a0-437d-8ed2-956840c552ab/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.271 183079 DEBUG oslo_concurrency.processutils [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.345 183079 DEBUG oslo_concurrency.processutils [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.346 183079 DEBUG oslo_concurrency.lockutils [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.347 183079 DEBUG oslo_concurrency.lockutils [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.372 183079 DEBUG oslo_concurrency.processutils [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:13:29 compute-0 podman[221754]: 2026-01-22 17:13:29.387789002 +0000 UTC m=+0.090072119 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, version=9.6, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-type=git, distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.396 183079 INFO nova.compute.manager [None req-3406ea42-4cd1-4be6-8980-ab4329d04a38 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Get console output
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.403 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:13:29 compute-0 podman[221752]: 2026-01-22 17:13:29.404920139 +0000 UTC m=+0.106239691 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:13:29 compute-0 podman[221751]: 2026-01-22 17:13:29.430891796 +0000 UTC m=+0.133022409 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.433 183079 DEBUG oslo_concurrency.processutils [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.434 183079 DEBUG oslo_concurrency.processutils [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/84b90c1e-91a0-437d-8ed2-956840c552ab/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.477 183079 DEBUG oslo_concurrency.processutils [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/84b90c1e-91a0-437d-8ed2-956840c552ab/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.478 183079 DEBUG oslo_concurrency.lockutils [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.479 183079 DEBUG oslo_concurrency.processutils [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.542 183079 DEBUG oslo_concurrency.processutils [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.543 183079 DEBUG nova.virt.disk.api [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Checking if we can resize image /var/lib/nova/instances/84b90c1e-91a0-437d-8ed2-956840c552ab/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.544 183079 DEBUG oslo_concurrency.processutils [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/84b90c1e-91a0-437d-8ed2-956840c552ab/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.596 183079 DEBUG oslo_concurrency.processutils [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/84b90c1e-91a0-437d-8ed2-956840c552ab/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.597 183079 DEBUG nova.virt.disk.api [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Cannot resize image /var/lib/nova/instances/84b90c1e-91a0-437d-8ed2-956840c552ab/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.598 183079 DEBUG nova.objects.instance [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lazy-loading 'migration_context' on Instance uuid 84b90c1e-91a0-437d-8ed2-956840c552ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.612 183079 DEBUG nova.virt.libvirt.driver [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.612 183079 DEBUG nova.virt.libvirt.driver [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Ensure instance console log exists: /var/lib/nova/instances/84b90c1e-91a0-437d-8ed2-956840c552ab/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.612 183079 DEBUG oslo_concurrency.lockutils [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.612 183079 DEBUG oslo_concurrency.lockutils [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:29 compute-0 nova_compute[183075]: 2026-01-22 17:13:29.613 183079 DEBUG oslo_concurrency.lockutils [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:30 compute-0 nova_compute[183075]: 2026-01-22 17:13:30.180 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769101995.1766505, 7915ef96-3b31-447b-a4b5-1feeb4997869 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:13:30 compute-0 nova_compute[183075]: 2026-01-22 17:13:30.181 183079 INFO nova.compute.manager [-] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] VM Stopped (Lifecycle Event)
Jan 22 17:13:30 compute-0 nova_compute[183075]: 2026-01-22 17:13:30.202 183079 DEBUG nova.compute.manager [None req-da48f212-2014-431b-b7e7-199593fd7c4f - - - - - -] [instance: 7915ef96-3b31-447b-a4b5-1feeb4997869] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:13:30 compute-0 nova_compute[183075]: 2026-01-22 17:13:30.248 183079 DEBUG nova.policy [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:13:30 compute-0 nova_compute[183075]: 2026-01-22 17:13:30.372 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:31 compute-0 nova_compute[183075]: 2026-01-22 17:13:31.823 183079 DEBUG nova.network.neutron [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Successfully updated port: b3bc8962-61ba-4d8d-9a4a-705e9e713574 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:13:31 compute-0 nova_compute[183075]: 2026-01-22 17:13:31.839 183079 DEBUG oslo_concurrency.lockutils [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "refresh_cache-84b90c1e-91a0-437d-8ed2-956840c552ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:13:31 compute-0 nova_compute[183075]: 2026-01-22 17:13:31.840 183079 DEBUG oslo_concurrency.lockutils [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquired lock "refresh_cache-84b90c1e-91a0-437d-8ed2-956840c552ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:13:31 compute-0 nova_compute[183075]: 2026-01-22 17:13:31.840 183079 DEBUG nova.network.neutron [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:13:31 compute-0 nova_compute[183075]: 2026-01-22 17:13:31.865 183079 INFO nova.compute.manager [None req-8638f7cd-8079-4193-991f-9730d48ec73a 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Get console output
Jan 22 17:13:31 compute-0 nova_compute[183075]: 2026-01-22 17:13:31.872 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:13:31 compute-0 nova_compute[183075]: 2026-01-22 17:13:31.907 183079 DEBUG nova.compute.manager [req-bba9dc1a-f2fc-4891-874f-58bc8e1be682 req-42ea4c3a-97f1-42a9-9042-12b91781d4ae a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Received event network-changed-b3bc8962-61ba-4d8d-9a4a-705e9e713574 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:13:31 compute-0 nova_compute[183075]: 2026-01-22 17:13:31.908 183079 DEBUG nova.compute.manager [req-bba9dc1a-f2fc-4891-874f-58bc8e1be682 req-42ea4c3a-97f1-42a9-9042-12b91781d4ae a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Refreshing instance network info cache due to event network-changed-b3bc8962-61ba-4d8d-9a4a-705e9e713574. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:13:31 compute-0 nova_compute[183075]: 2026-01-22 17:13:31.908 183079 DEBUG oslo_concurrency.lockutils [req-bba9dc1a-f2fc-4891-874f-58bc8e1be682 req-42ea4c3a-97f1-42a9-9042-12b91781d4ae a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-84b90c1e-91a0-437d-8ed2-956840c552ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.001 183079 DEBUG nova.network.neutron [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.244 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.718 183079 DEBUG nova.network.neutron [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Updating instance_info_cache with network_info: [{"id": "b3bc8962-61ba-4d8d-9a4a-705e9e713574", "address": "fa:16:3e:24:ac:67", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3bc8962-61", "ovs_interfaceid": "b3bc8962-61ba-4d8d-9a4a-705e9e713574", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.746 183079 DEBUG oslo_concurrency.lockutils [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Releasing lock "refresh_cache-84b90c1e-91a0-437d-8ed2-956840c552ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.746 183079 DEBUG nova.compute.manager [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Instance network_info: |[{"id": "b3bc8962-61ba-4d8d-9a4a-705e9e713574", "address": "fa:16:3e:24:ac:67", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3bc8962-61", "ovs_interfaceid": "b3bc8962-61ba-4d8d-9a4a-705e9e713574", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.747 183079 DEBUG oslo_concurrency.lockutils [req-bba9dc1a-f2fc-4891-874f-58bc8e1be682 req-42ea4c3a-97f1-42a9-9042-12b91781d4ae a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-84b90c1e-91a0-437d-8ed2-956840c552ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.747 183079 DEBUG nova.network.neutron [req-bba9dc1a-f2fc-4891-874f-58bc8e1be682 req-42ea4c3a-97f1-42a9-9042-12b91781d4ae a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Refreshing network info cache for port b3bc8962-61ba-4d8d-9a4a-705e9e713574 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.753 183079 DEBUG nova.virt.libvirt.driver [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Start _get_guest_xml network_info=[{"id": "b3bc8962-61ba-4d8d-9a4a-705e9e713574", "address": "fa:16:3e:24:ac:67", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3bc8962-61", "ovs_interfaceid": "b3bc8962-61ba-4d8d-9a4a-705e9e713574", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.761 183079 WARNING nova.virt.libvirt.driver [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.767 183079 DEBUG nova.virt.libvirt.host [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.768 183079 DEBUG nova.virt.libvirt.host [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.772 183079 DEBUG nova.virt.libvirt.host [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.773 183079 DEBUG nova.virt.libvirt.host [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.773 183079 DEBUG nova.virt.libvirt.driver [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.774 183079 DEBUG nova.virt.hardware [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.774 183079 DEBUG nova.virt.hardware [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.775 183079 DEBUG nova.virt.hardware [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.775 183079 DEBUG nova.virt.hardware [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.776 183079 DEBUG nova.virt.hardware [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.776 183079 DEBUG nova.virt.hardware [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.776 183079 DEBUG nova.virt.hardware [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.777 183079 DEBUG nova.virt.hardware [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.777 183079 DEBUG nova.virt.hardware [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.777 183079 DEBUG nova.virt.hardware [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.778 183079 DEBUG nova.virt.hardware [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.784 183079 DEBUG nova.virt.libvirt.vif [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:13:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1407213754',display_name='tempest-server-test-1407213754',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1407213754',id=26,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFroJSJbEvCYe2sCwXkwL5KPVfwqqRPNqrWU5nnhv8U4G8wYWbEqk5AtRWXltmX7CAIPBST+sTwCKfymhALKd54XNN92D6KKnKgA6jmOihENah4FGfU80fx1UEE0osLpiw==',key_name='tempest-keypair-test-618317104',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e05c7aae349e4a1d859a387df45650a0',ramdisk_id='',reservation_id='r-f5ng9jql',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='
virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSeparateNetwork-931877966',owner_user_name='tempest-FloatingIpSeparateNetwork-931877966-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:13:29Z,user_data=None,user_id='cd47d63cff2548a88e21e5c2e6a5c161',uuid=84b90c1e-91a0-437d-8ed2-956840c552ab,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b3bc8962-61ba-4d8d-9a4a-705e9e713574", "address": "fa:16:3e:24:ac:67", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3bc8962-61", "ovs_interfaceid": "b3bc8962-61ba-4d8d-9a4a-705e9e713574", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.785 183079 DEBUG nova.network.os_vif_util [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converting VIF {"id": "b3bc8962-61ba-4d8d-9a4a-705e9e713574", "address": "fa:16:3e:24:ac:67", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3bc8962-61", "ovs_interfaceid": "b3bc8962-61ba-4d8d-9a4a-705e9e713574", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.786 183079 DEBUG nova.network.os_vif_util [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:ac:67,bridge_name='br-int',has_traffic_filtering=True,id=b3bc8962-61ba-4d8d-9a4a-705e9e713574,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb3bc8962-61') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.788 183079 DEBUG nova.objects.instance [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 84b90c1e-91a0-437d-8ed2-956840c552ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.814 183079 DEBUG nova.virt.libvirt.driver [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:13:32 compute-0 nova_compute[183075]:   <uuid>84b90c1e-91a0-437d-8ed2-956840c552ab</uuid>
Jan 22 17:13:32 compute-0 nova_compute[183075]:   <name>instance-0000001a</name>
Jan 22 17:13:32 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:13:32 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:13:32 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:13:32 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1407213754</nova:name>
Jan 22 17:13:32 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:13:32</nova:creationTime>
Jan 22 17:13:32 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:13:32 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:13:32 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:13:32 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:13:32 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:13:32 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:13:32 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:13:32 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:13:32 compute-0 nova_compute[183075]:         <nova:user uuid="cd47d63cff2548a88e21e5c2e6a5c161">tempest-FloatingIpSeparateNetwork-931877966-project-member</nova:user>
Jan 22 17:13:32 compute-0 nova_compute[183075]:         <nova:project uuid="e05c7aae349e4a1d859a387df45650a0">tempest-FloatingIpSeparateNetwork-931877966</nova:project>
Jan 22 17:13:32 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:13:32 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:13:32 compute-0 nova_compute[183075]:         <nova:port uuid="b3bc8962-61ba-4d8d-9a4a-705e9e713574">
Jan 22 17:13:32 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:13:32 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:13:32 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:13:32 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <system>
Jan 22 17:13:32 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:13:32 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:13:32 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:13:32 compute-0 nova_compute[183075]:       <entry name="serial">84b90c1e-91a0-437d-8ed2-956840c552ab</entry>
Jan 22 17:13:32 compute-0 nova_compute[183075]:       <entry name="uuid">84b90c1e-91a0-437d-8ed2-956840c552ab</entry>
Jan 22 17:13:32 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     </system>
Jan 22 17:13:32 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:13:32 compute-0 nova_compute[183075]:   <os>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:   </os>
Jan 22 17:13:32 compute-0 nova_compute[183075]:   <features>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:   </features>
Jan 22 17:13:32 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:13:32 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:13:32 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:13:32 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/84b90c1e-91a0-437d-8ed2-956840c552ab/disk"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:13:32 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:24:ac:67"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:       <target dev="tapb3bc8962-61"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:13:32 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/84b90c1e-91a0-437d-8ed2-956840c552ab/console.log" append="off"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <video>
Jan 22 17:13:32 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     </video>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:13:32 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:13:32 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:13:32 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:13:32 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:13:32 compute-0 nova_compute[183075]: </domain>
Jan 22 17:13:32 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.815 183079 DEBUG nova.compute.manager [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Preparing to wait for external event network-vif-plugged-b3bc8962-61ba-4d8d-9a4a-705e9e713574 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.815 183079 DEBUG oslo_concurrency.lockutils [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "84b90c1e-91a0-437d-8ed2-956840c552ab-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.816 183079 DEBUG oslo_concurrency.lockutils [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "84b90c1e-91a0-437d-8ed2-956840c552ab-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.816 183079 DEBUG oslo_concurrency.lockutils [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "84b90c1e-91a0-437d-8ed2-956840c552ab-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.817 183079 DEBUG nova.virt.libvirt.vif [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:13:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1407213754',display_name='tempest-server-test-1407213754',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1407213754',id=26,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFroJSJbEvCYe2sCwXkwL5KPVfwqqRPNqrWU5nnhv8U4G8wYWbEqk5AtRWXltmX7CAIPBST+sTwCKfymhALKd54XNN92D6KKnKgA6jmOihENah4FGfU80fx1UEE0osLpiw==',key_name='tempest-keypair-test-618317104',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e05c7aae349e4a1d859a387df45650a0',ramdisk_id='',reservation_id='r-f5ng9jql',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSeparateNetwork-931877966',owner_user_name='tempest-FloatingIpSeparateNetwork-931877966-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:13:29Z,user_data=None,user_id='cd47d63cff2548a88e21e5c2e6a5c161',uuid=84b90c1e-91a0-437d-8ed2-956840c552ab,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b3bc8962-61ba-4d8d-9a4a-705e9e713574", "address": "fa:16:3e:24:ac:67", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3bc8962-61", "ovs_interfaceid": "b3bc8962-61ba-4d8d-9a4a-705e9e713574", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.818 183079 DEBUG nova.network.os_vif_util [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converting VIF {"id": "b3bc8962-61ba-4d8d-9a4a-705e9e713574", "address": "fa:16:3e:24:ac:67", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3bc8962-61", "ovs_interfaceid": "b3bc8962-61ba-4d8d-9a4a-705e9e713574", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.819 183079 DEBUG nova.network.os_vif_util [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:ac:67,bridge_name='br-int',has_traffic_filtering=True,id=b3bc8962-61ba-4d8d-9a4a-705e9e713574,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb3bc8962-61') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.820 183079 DEBUG os_vif [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:ac:67,bridge_name='br-int',has_traffic_filtering=True,id=b3bc8962-61ba-4d8d-9a4a-705e9e713574,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb3bc8962-61') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.821 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.822 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.823 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.826 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.827 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb3bc8962-61, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.827 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb3bc8962-61, col_values=(('external_ids', {'iface-id': 'b3bc8962-61ba-4d8d-9a4a-705e9e713574', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:24:ac:67', 'vm-uuid': '84b90c1e-91a0-437d-8ed2-956840c552ab'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:13:32 compute-0 NetworkManager[55454]: <info>  [1769102012.8313] manager: (tapb3bc8962-61): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/125)
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.830 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.836 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.841 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.842 183079 INFO os_vif [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:ac:67,bridge_name='br-int',has_traffic_filtering=True,id=b3bc8962-61ba-4d8d-9a4a-705e9e713574,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb3bc8962-61')
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.909 183079 DEBUG nova.virt.libvirt.driver [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.909 183079 DEBUG nova.virt.libvirt.driver [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] No VIF found with MAC fa:16:3e:24:ac:67, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:13:32 compute-0 kernel: tapb3bc8962-61: entered promiscuous mode
Jan 22 17:13:32 compute-0 NetworkManager[55454]: <info>  [1769102012.9767] manager: (tapb3bc8962-61): new Tun device (/org/freedesktop/NetworkManager/Devices/126)
Jan 22 17:13:32 compute-0 ovn_controller[95372]: 2026-01-22T17:13:32Z|00293|binding|INFO|Claiming lport b3bc8962-61ba-4d8d-9a4a-705e9e713574 for this chassis.
Jan 22 17:13:32 compute-0 ovn_controller[95372]: 2026-01-22T17:13:32Z|00294|binding|INFO|b3bc8962-61ba-4d8d-9a4a-705e9e713574: Claiming fa:16:3e:24:ac:67 10.100.0.5
Jan 22 17:13:32 compute-0 nova_compute[183075]: 2026-01-22 17:13:32.984 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:32.995 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:ac:67 10.100.0.5'], port_security=['fa:16:3e:24:ac:67 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '84b90c1e-91a0-437d-8ed2-956840c552ab', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-576f6598-999f-46d9-809a-65b7475a1ec7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e05c7aae349e4a1d859a387df45650a0', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b9cc1ac9-85f4-4da7-891e-b0644711e574', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.229'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=92087573-6410-4f80-a195-ce8b2058420e, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=b3bc8962-61ba-4d8d-9a4a-705e9e713574) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.000 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:33 compute-0 ovn_controller[95372]: 2026-01-22T17:13:33Z|00295|binding|INFO|Setting lport b3bc8962-61ba-4d8d-9a4a-705e9e713574 ovn-installed in OVS
Jan 22 17:13:33 compute-0 ovn_controller[95372]: 2026-01-22T17:13:33Z|00296|binding|INFO|Setting lport b3bc8962-61ba-4d8d-9a4a-705e9e713574 up in Southbound
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:33.004 104629 INFO neutron.agent.ovn.metadata.agent [-] Port b3bc8962-61ba-4d8d-9a4a-705e9e713574 in datapath 576f6598-999f-46d9-809a-65b7475a1ec7 bound to our chassis
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.005 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:33.013 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 576f6598-999f-46d9-809a-65b7475a1ec7
Jan 22 17:13:33 compute-0 systemd-udevd[221847]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:33.028 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d1c7d115-acec-430f-8017-37ad6c01291a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:33.029 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap576f6598-91 in ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:33.031 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap576f6598-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:33.031 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ac44343c-c08c-4cf0-a0cf-3c512e1e1ea7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:33.032 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c815240d-bb6e-435f-aa3e-aad5aa39db07]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:33 compute-0 NetworkManager[55454]: <info>  [1769102013.0370] device (tapb3bc8962-61): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:13:33 compute-0 NetworkManager[55454]: <info>  [1769102013.0377] device (tapb3bc8962-61): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:13:33 compute-0 systemd-machined[154382]: New machine qemu-26-instance-0000001a.
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:33.044 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[f417086a-1c32-46d3-8d16-07258dd58fbe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:33 compute-0 systemd[1]: Started Virtual Machine qemu-26-instance-0000001a.
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:33.063 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a3f0e36a-1cfe-4178-b08b-7e6c71feeb52]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:33.114 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[091f71fc-9b79-402a-8603-1fd87219f67b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:33.120 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[33c8035f-1748-4914-a721-4b099a0c4f58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:33 compute-0 NetworkManager[55454]: <info>  [1769102013.1208] manager: (tap576f6598-90): new Veth device (/org/freedesktop/NetworkManager/Devices/127)
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:33.155 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[41fb34e9-9d68-498d-bc49-7b4b7c426d59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:33.158 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[a35df6cd-84d6-4cdb-b314-3e5a4c35d273]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:33 compute-0 NetworkManager[55454]: <info>  [1769102013.1836] device (tap576f6598-90): carrier: link connected
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:33.189 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[68824dea-0b06-4a76-b59a-f700637a1a7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:33.208 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[02f64b02-72ea-4135-8cc9-74f6d040b2de]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap576f6598-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a1:fa:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 81], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 437688, 'reachable_time': 25491, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221882, 'error': None, 'target': 'ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:33.226 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6a5a164f-2dd4-4075-9333-8b4b0128b022]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea1:facd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 437688, 'tstamp': 437688}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221883, 'error': None, 'target': 'ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:33.244 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[fe3a83fb-a432-44e5-a436-f69303935ab6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap576f6598-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a1:fa:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 81], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 437688, 'reachable_time': 25491, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221884, 'error': None, 'target': 'ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:33.275 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[62dfc2c1-4a1a-4c5e-9055-0e6f7329fac0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:33.332 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e2c4e2a9-b846-4255-9d3d-bf4ba723861e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:33.334 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap576f6598-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:33.335 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:33.335 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap576f6598-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:13:33 compute-0 NetworkManager[55454]: <info>  [1769102013.3378] manager: (tap576f6598-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/128)
Jan 22 17:13:33 compute-0 kernel: tap576f6598-90: entered promiscuous mode
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:33.339 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap576f6598-90, col_values=(('external_ids', {'iface-id': '1759254b-798a-4e65-baf5-489557c1f604'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:13:33 compute-0 ovn_controller[95372]: 2026-01-22T17:13:33Z|00297|binding|INFO|Releasing lport 1759254b-798a-4e65-baf5-489557c1f604 from this chassis (sb_readonly=0)
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.356 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:33.364 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/576f6598-999f-46d9-809a-65b7475a1ec7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/576f6598-999f-46d9-809a-65b7475a1ec7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:33.365 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[dea9d734-c22a-4485-9d0b-10276e577a4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:33.366 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/576f6598-999f-46d9-809a-65b7475a1ec7.pid.haproxy
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 576f6598-999f-46d9-809a-65b7475a1ec7
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.367 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:33.368 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7', 'env', 'PROCESS_TAG=haproxy-576f6598-999f-46d9-809a-65b7475a1ec7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/576f6598-999f-46d9-809a-65b7475a1ec7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.535 183079 DEBUG nova.compute.manager [req-63d39041-1966-42e4-a527-ce77db27a33e req-63c4dd39-8d90-46e2-b795-0d3a3ce76291 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Received event network-vif-plugged-b3bc8962-61ba-4d8d-9a4a-705e9e713574 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.535 183079 DEBUG oslo_concurrency.lockutils [req-63d39041-1966-42e4-a527-ce77db27a33e req-63c4dd39-8d90-46e2-b795-0d3a3ce76291 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "84b90c1e-91a0-437d-8ed2-956840c552ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.536 183079 DEBUG oslo_concurrency.lockutils [req-63d39041-1966-42e4-a527-ce77db27a33e req-63c4dd39-8d90-46e2-b795-0d3a3ce76291 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "84b90c1e-91a0-437d-8ed2-956840c552ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.536 183079 DEBUG oslo_concurrency.lockutils [req-63d39041-1966-42e4-a527-ce77db27a33e req-63c4dd39-8d90-46e2-b795-0d3a3ce76291 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "84b90c1e-91a0-437d-8ed2-956840c552ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.536 183079 DEBUG nova.compute.manager [req-63d39041-1966-42e4-a527-ce77db27a33e req-63c4dd39-8d90-46e2-b795-0d3a3ce76291 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Processing event network-vif-plugged-b3bc8962-61ba-4d8d-9a4a-705e9e713574 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.558 183079 DEBUG nova.compute.manager [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.559 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102013.5583668, 84b90c1e-91a0-437d-8ed2-956840c552ab => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.559 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] VM Started (Lifecycle Event)
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.566 183079 DEBUG nova.virt.libvirt.driver [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.570 183079 INFO nova.virt.libvirt.driver [-] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Instance spawned successfully.
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.570 183079 DEBUG nova.virt.libvirt.driver [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.591 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.597 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.600 183079 DEBUG nova.virt.libvirt.driver [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.600 183079 DEBUG nova.virt.libvirt.driver [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.601 183079 DEBUG nova.virt.libvirt.driver [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.601 183079 DEBUG nova.virt.libvirt.driver [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.602 183079 DEBUG nova.virt.libvirt.driver [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.602 183079 DEBUG nova.virt.libvirt.driver [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.639 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.639 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102013.5620682, 84b90c1e-91a0-437d-8ed2-956840c552ab => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.639 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] VM Paused (Lifecycle Event)
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.667 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.676 183079 INFO nova.compute.manager [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Took 4.43 seconds to spawn the instance on the hypervisor.
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.677 183079 DEBUG nova.compute.manager [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.678 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102013.5647173, 84b90c1e-91a0-437d-8ed2-956840c552ab => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.679 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] VM Resumed (Lifecycle Event)
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.710 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.715 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.744 183079 INFO nova.compute.manager [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Took 5.00 seconds to build instance.
Jan 22 17:13:33 compute-0 podman[221923]: 2026-01-22 17:13:33.774342247 +0000 UTC m=+0.037630092 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.876 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769101998.8749852, 91845d3c-b89e-43ba-b1d2-40f99d79ae8e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:13:33 compute-0 nova_compute[183075]: 2026-01-22 17:13:33.877 183079 INFO nova.compute.manager [-] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] VM Stopped (Lifecycle Event)
Jan 22 17:13:34 compute-0 nova_compute[183075]: 2026-01-22 17:13:34.033 183079 DEBUG oslo_concurrency.lockutils [None req-ed47ab94-44b4-428c-9bcf-287ecf256788 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "84b90c1e-91a0-437d-8ed2-956840c552ab" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.392s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:34 compute-0 nova_compute[183075]: 2026-01-22 17:13:34.047 183079 DEBUG nova.compute.manager [None req-9659780e-0062-46c9-96af-1c1fdc30d8fe - - - - - -] [instance: 91845d3c-b89e-43ba-b1d2-40f99d79ae8e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:13:34 compute-0 podman[221923]: 2026-01-22 17:13:34.075832337 +0000 UTC m=+0.339120142 container create bfe269bc1798759d85db1012917eca0350a00fc4b919aa8f7ee10ddd0066f52b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:13:34 compute-0 systemd[1]: Started libpod-conmon-bfe269bc1798759d85db1012917eca0350a00fc4b919aa8f7ee10ddd0066f52b.scope.
Jan 22 17:13:34 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:13:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3ef0fb847366aceadce50dea20996790699a5583a4e3df818ef68e2a96d69f2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:13:34 compute-0 podman[221923]: 2026-01-22 17:13:34.330953499 +0000 UTC m=+0.594241384 container init bfe269bc1798759d85db1012917eca0350a00fc4b919aa8f7ee10ddd0066f52b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 22 17:13:34 compute-0 podman[221923]: 2026-01-22 17:13:34.338923097 +0000 UTC m=+0.602210892 container start bfe269bc1798759d85db1012917eca0350a00fc4b919aa8f7ee10ddd0066f52b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:13:34 compute-0 neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7[221942]: [NOTICE]   (221950) : New worker (221952) forked
Jan 22 17:13:34 compute-0 neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7[221942]: [NOTICE]   (221950) : Loading success.
Jan 22 17:13:34 compute-0 nova_compute[183075]: 2026-01-22 17:13:34.400 183079 DEBUG nova.network.neutron [req-bba9dc1a-f2fc-4891-874f-58bc8e1be682 req-42ea4c3a-97f1-42a9-9042-12b91781d4ae a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Updated VIF entry in instance network info cache for port b3bc8962-61ba-4d8d-9a4a-705e9e713574. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:13:34 compute-0 nova_compute[183075]: 2026-01-22 17:13:34.400 183079 DEBUG nova.network.neutron [req-bba9dc1a-f2fc-4891-874f-58bc8e1be682 req-42ea4c3a-97f1-42a9-9042-12b91781d4ae a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Updating instance_info_cache with network_info: [{"id": "b3bc8962-61ba-4d8d-9a4a-705e9e713574", "address": "fa:16:3e:24:ac:67", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3bc8962-61", "ovs_interfaceid": "b3bc8962-61ba-4d8d-9a4a-705e9e713574", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:13:34 compute-0 nova_compute[183075]: 2026-01-22 17:13:34.437 183079 DEBUG oslo_concurrency.lockutils [req-bba9dc1a-f2fc-4891-874f-58bc8e1be682 req-42ea4c3a-97f1-42a9-9042-12b91781d4ae a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-84b90c1e-91a0-437d-8ed2-956840c552ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:13:34 compute-0 nova_compute[183075]: 2026-01-22 17:13:34.620 183079 INFO nova.compute.manager [None req-f424bb49-6880-473f-975c-e0e3b7029f15 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Get console output
Jan 22 17:13:34 compute-0 nova_compute[183075]: 2026-01-22 17:13:34.626 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:13:35 compute-0 podman[221978]: 2026-01-22 17:13:35.341283619 +0000 UTC m=+0.051243287 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:13:35 compute-0 nova_compute[183075]: 2026-01-22 17:13:35.373 183079 INFO nova.compute.manager [None req-44585c37-99a2-4a51-9704-c8ebf5d867ee cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Get console output
Jan 22 17:13:35 compute-0 nova_compute[183075]: 2026-01-22 17:13:35.374 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:35 compute-0 ovn_controller[95372]: 2026-01-22T17:13:35Z|00052|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5b:24:9f 10.10.220.230
Jan 22 17:13:35 compute-0 ovn_controller[95372]: 2026-01-22T17:13:35Z|00053|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5b:24:9f 10.10.220.230
Jan 22 17:13:35 compute-0 nova_compute[183075]: 2026-01-22 17:13:35.637 183079 DEBUG nova.compute.manager [req-e50621cb-c744-4da3-b4c2-28c244166bb3 req-a94d184f-9f90-40c9-ac98-55bf1d4998c5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Received event network-vif-plugged-b3bc8962-61ba-4d8d-9a4a-705e9e713574 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:13:35 compute-0 nova_compute[183075]: 2026-01-22 17:13:35.637 183079 DEBUG oslo_concurrency.lockutils [req-e50621cb-c744-4da3-b4c2-28c244166bb3 req-a94d184f-9f90-40c9-ac98-55bf1d4998c5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "84b90c1e-91a0-437d-8ed2-956840c552ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:35 compute-0 nova_compute[183075]: 2026-01-22 17:13:35.638 183079 DEBUG oslo_concurrency.lockutils [req-e50621cb-c744-4da3-b4c2-28c244166bb3 req-a94d184f-9f90-40c9-ac98-55bf1d4998c5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "84b90c1e-91a0-437d-8ed2-956840c552ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:35 compute-0 nova_compute[183075]: 2026-01-22 17:13:35.638 183079 DEBUG oslo_concurrency.lockutils [req-e50621cb-c744-4da3-b4c2-28c244166bb3 req-a94d184f-9f90-40c9-ac98-55bf1d4998c5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "84b90c1e-91a0-437d-8ed2-956840c552ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:35 compute-0 nova_compute[183075]: 2026-01-22 17:13:35.638 183079 DEBUG nova.compute.manager [req-e50621cb-c744-4da3-b4c2-28c244166bb3 req-a94d184f-9f90-40c9-ac98-55bf1d4998c5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] No waiting events found dispatching network-vif-plugged-b3bc8962-61ba-4d8d-9a4a-705e9e713574 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:13:35 compute-0 nova_compute[183075]: 2026-01-22 17:13:35.639 183079 WARNING nova.compute.manager [req-e50621cb-c744-4da3-b4c2-28c244166bb3 req-a94d184f-9f90-40c9-ac98-55bf1d4998c5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Received unexpected event network-vif-plugged-b3bc8962-61ba-4d8d-9a4a-705e9e713574 for instance with vm_state active and task_state None.
Jan 22 17:13:37 compute-0 nova_compute[183075]: 2026-01-22 17:13:37.010 183079 INFO nova.compute.manager [None req-2f0bd3c7-d545-4f93-985c-25b7c4c94cc6 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Get console output
Jan 22 17:13:37 compute-0 nova_compute[183075]: 2026-01-22 17:13:37.016 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:13:37 compute-0 nova_compute[183075]: 2026-01-22 17:13:37.120 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769102002.1194024, 936001bf-d51b-4243-87b8-e363ef3c47a8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:13:37 compute-0 nova_compute[183075]: 2026-01-22 17:13:37.121 183079 INFO nova.compute.manager [-] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] VM Stopped (Lifecycle Event)
Jan 22 17:13:37 compute-0 nova_compute[183075]: 2026-01-22 17:13:37.144 183079 DEBUG nova.compute.manager [None req-0a558f5a-801b-4a60-8ec0-f94771a64a3b - - - - - -] [instance: 936001bf-d51b-4243-87b8-e363ef3c47a8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:13:37 compute-0 nova_compute[183075]: 2026-01-22 17:13:37.832 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:39 compute-0 nova_compute[183075]: 2026-01-22 17:13:39.766 183079 INFO nova.compute.manager [None req-68f4a65d-3e4d-4a94-beca-88856c782fca 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Get console output
Jan 22 17:13:39 compute-0 nova_compute[183075]: 2026-01-22 17:13:39.773 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:13:40 compute-0 nova_compute[183075]: 2026-01-22 17:13:40.377 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:40 compute-0 nova_compute[183075]: 2026-01-22 17:13:40.519 183079 INFO nova.compute.manager [None req-f8f053e2-87ad-46ae-a7eb-153a861e36f5 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Get console output
Jan 22 17:13:40 compute-0 nova_compute[183075]: 2026-01-22 17:13:40.525 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:13:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:40.794 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:40.795 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:13:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.220.230
Jan 22 17:13:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: b6706592-b3b7-4148-ba2f-3b2dac46e91f __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.237 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.237 104990 INFO eventlet.wsgi.server [-] 10.10.220.230,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.4427526
Jan 22 17:13:41 compute-0 haproxy-metadata-proxy-b6706592-b3b7-4148-ba2f-3b2dac46e91f[221735]: 10.10.220.230:48726 [22/Jan/2026:17:13:40.793] listener listener/metadata 0/0/0/444/444 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.246 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.247 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.220.230
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: b6706592-b3b7-4148-ba2f-3b2dac46e91f __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.267 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:41 compute-0 haproxy-metadata-proxy-b6706592-b3b7-4148-ba2f-3b2dac46e91f[221735]: 10.10.220.230:48734 [22/Jan/2026:17:13:41.245] listener listener/metadata 0/0/0/22/22 200 152 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.268 104990 INFO eventlet.wsgi.server [-] 10.10.220.230,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 168 time: 0.0212963
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.272 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.273 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.220.230
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: b6706592-b3b7-4148-ba2f-3b2dac46e91f __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.294 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.295 104990 INFO eventlet.wsgi.server [-] 10.10.220.230,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0222671
Jan 22 17:13:41 compute-0 haproxy-metadata-proxy-b6706592-b3b7-4148-ba2f-3b2dac46e91f[221735]: 10.10.220.230:48746 [22/Jan/2026:17:13:41.271] listener listener/metadata 0/0/0/23/23 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.300 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.301 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.220.230
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: b6706592-b3b7-4148-ba2f-3b2dac46e91f __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.319 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.320 104990 INFO eventlet.wsgi.server [-] 10.10.220.230,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0189002
Jan 22 17:13:41 compute-0 haproxy-metadata-proxy-b6706592-b3b7-4148-ba2f-3b2dac46e91f[221735]: 10.10.220.230:48756 [22/Jan/2026:17:13:41.300] listener listener/metadata 0/0/0/20/20 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.326 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.326 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.220.230
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: b6706592-b3b7-4148-ba2f-3b2dac46e91f __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.341 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.342 104990 INFO eventlet.wsgi.server [-] 10.10.220.230,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0153787
Jan 22 17:13:41 compute-0 haproxy-metadata-proxy-b6706592-b3b7-4148-ba2f-3b2dac46e91f[221735]: 10.10.220.230:48770 [22/Jan/2026:17:13:41.325] listener listener/metadata 0/0/0/16/16 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.348 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.348 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.220.230
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: b6706592-b3b7-4148-ba2f-3b2dac46e91f __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.365 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.365 104990 INFO eventlet.wsgi.server [-] 10.10.220.230,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0170505
Jan 22 17:13:41 compute-0 haproxy-metadata-proxy-b6706592-b3b7-4148-ba2f-3b2dac46e91f[221735]: 10.10.220.230:48786 [22/Jan/2026:17:13:41.347] listener listener/metadata 0/0/0/18/18 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.370 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.371 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.220.230
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: b6706592-b3b7-4148-ba2f-3b2dac46e91f __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.386 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.386 104990 INFO eventlet.wsgi.server [-] 10.10.220.230,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 149 time: 0.0154469
Jan 22 17:13:41 compute-0 haproxy-metadata-proxy-b6706592-b3b7-4148-ba2f-3b2dac46e91f[221735]: 10.10.220.230:48794 [22/Jan/2026:17:13:41.370] listener listener/metadata 0/0/0/16/16 200 133 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.392 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.392 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.220.230
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: b6706592-b3b7-4148-ba2f-3b2dac46e91f __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.409 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.409 104990 INFO eventlet.wsgi.server [-] 10.10.220.230,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0170360
Jan 22 17:13:41 compute-0 haproxy-metadata-proxy-b6706592-b3b7-4148-ba2f-3b2dac46e91f[221735]: 10.10.220.230:48800 [22/Jan/2026:17:13:41.391] listener listener/metadata 0/0/0/18/18 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.415 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.415 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.220.230
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: b6706592-b3b7-4148-ba2f-3b2dac46e91f __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.431 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.431 104990 INFO eventlet.wsgi.server [-] 10.10.220.230,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 165 time: 0.0157654
Jan 22 17:13:41 compute-0 haproxy-metadata-proxy-b6706592-b3b7-4148-ba2f-3b2dac46e91f[221735]: 10.10.220.230:48802 [22/Jan/2026:17:13:41.414] listener listener/metadata 0/0/0/16/16 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.437 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.438 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.220.230
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: b6706592-b3b7-4148-ba2f-3b2dac46e91f __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.457 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:41 compute-0 haproxy-metadata-proxy-b6706592-b3b7-4148-ba2f-3b2dac46e91f[221735]: 10.10.220.230:48818 [22/Jan/2026:17:13:41.436] listener listener/metadata 0/0/0/21/21 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.462 104990 INFO eventlet.wsgi.server [-] 10.10.220.230,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 165 time: 0.0243266
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.463 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.464 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.220.230
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: b6706592-b3b7-4148-ba2f-3b2dac46e91f __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:41 compute-0 haproxy-metadata-proxy-b6706592-b3b7-4148-ba2f-3b2dac46e91f[221735]: 10.10.220.230:48820 [22/Jan/2026:17:13:41.463] listener listener/metadata 0/0/0/17/17 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.480 104990 INFO eventlet.wsgi.server [-] 10.10.220.230,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0164564
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.493 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.494 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.220.230
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: b6706592-b3b7-4148-ba2f-3b2dac46e91f __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.514 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.514 104990 INFO eventlet.wsgi.server [-] 10.10.220.230,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0200965
Jan 22 17:13:41 compute-0 haproxy-metadata-proxy-b6706592-b3b7-4148-ba2f-3b2dac46e91f[221735]: 10.10.220.230:48824 [22/Jan/2026:17:13:41.493] listener listener/metadata 0/0/0/21/21 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.520 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.521 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.220.230
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: b6706592-b3b7-4148-ba2f-3b2dac46e91f __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.538 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.538 104990 INFO eventlet.wsgi.server [-] 10.10.220.230,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0172696
Jan 22 17:13:41 compute-0 haproxy-metadata-proxy-b6706592-b3b7-4148-ba2f-3b2dac46e91f[221735]: 10.10.220.230:48840 [22/Jan/2026:17:13:41.520] listener listener/metadata 0/0/0/18/18 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.545 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.546 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.220.230
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: b6706592-b3b7-4148-ba2f-3b2dac46e91f __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.566 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.567 104990 INFO eventlet.wsgi.server [-] 10.10.220.230,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0211279
Jan 22 17:13:41 compute-0 haproxy-metadata-proxy-b6706592-b3b7-4148-ba2f-3b2dac46e91f[221735]: 10.10.220.230:48854 [22/Jan/2026:17:13:41.544] listener listener/metadata 0/0/0/22/22 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.574 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.575 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.220.230
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: b6706592-b3b7-4148-ba2f-3b2dac46e91f __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.596 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.596 104990 INFO eventlet.wsgi.server [-] 10.10.220.230,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 165 time: 0.0213394
Jan 22 17:13:41 compute-0 haproxy-metadata-proxy-b6706592-b3b7-4148-ba2f-3b2dac46e91f[221735]: 10.10.220.230:48864 [22/Jan/2026:17:13:41.574] listener listener/metadata 0/0/0/22/22 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.604 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.605 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.10.220.230
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: b6706592-b3b7-4148-ba2f-3b2dac46e91f __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.624 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.625 104990 INFO eventlet.wsgi.server [-] 10.10.220.230,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0199931
Jan 22 17:13:41 compute-0 haproxy-metadata-proxy-b6706592-b3b7-4148-ba2f-3b2dac46e91f[221735]: 10.10.220.230:48866 [22/Jan/2026:17:13:41.604] listener listener/metadata 0/0/0/21/21 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.930 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.930 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:41.931 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:42 compute-0 nova_compute[183075]: 2026-01-22 17:13:42.358 183079 DEBUG nova.compute.manager [req-176f9eb7-0ca6-4c48-8007-b5131883e741 req-800da730-47d0-46ae-a450-9004d0e90d12 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Received event network-changed-c30636af-db80-4279-8e40-c0266175c726 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:13:42 compute-0 nova_compute[183075]: 2026-01-22 17:13:42.359 183079 DEBUG nova.compute.manager [req-176f9eb7-0ca6-4c48-8007-b5131883e741 req-800da730-47d0-46ae-a450-9004d0e90d12 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Refreshing instance network info cache due to event network-changed-c30636af-db80-4279-8e40-c0266175c726. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:13:42 compute-0 nova_compute[183075]: 2026-01-22 17:13:42.360 183079 DEBUG oslo_concurrency.lockutils [req-176f9eb7-0ca6-4c48-8007-b5131883e741 req-800da730-47d0-46ae-a450-9004d0e90d12 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-a39a5d00-6f96-4405-aff0-1449aee94079" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:13:42 compute-0 nova_compute[183075]: 2026-01-22 17:13:42.360 183079 DEBUG oslo_concurrency.lockutils [req-176f9eb7-0ca6-4c48-8007-b5131883e741 req-800da730-47d0-46ae-a450-9004d0e90d12 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-a39a5d00-6f96-4405-aff0-1449aee94079" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:13:42 compute-0 nova_compute[183075]: 2026-01-22 17:13:42.360 183079 DEBUG nova.network.neutron [req-176f9eb7-0ca6-4c48-8007-b5131883e741 req-800da730-47d0-46ae-a450-9004d0e90d12 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Refreshing network info cache for port c30636af-db80-4279-8e40-c0266175c726 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:13:42 compute-0 nova_compute[183075]: 2026-01-22 17:13:42.843 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:42 compute-0 nova_compute[183075]: 2026-01-22 17:13:42.858 183079 DEBUG oslo_concurrency.lockutils [None req-e2ab1039-9de9-4e0c-baa6-d6eeaf6205e3 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "a39a5d00-6f96-4405-aff0-1449aee94079" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:42 compute-0 nova_compute[183075]: 2026-01-22 17:13:42.860 183079 DEBUG oslo_concurrency.lockutils [None req-e2ab1039-9de9-4e0c-baa6-d6eeaf6205e3 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "a39a5d00-6f96-4405-aff0-1449aee94079" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:42 compute-0 nova_compute[183075]: 2026-01-22 17:13:42.861 183079 DEBUG oslo_concurrency.lockutils [None req-e2ab1039-9de9-4e0c-baa6-d6eeaf6205e3 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "a39a5d00-6f96-4405-aff0-1449aee94079-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:42 compute-0 nova_compute[183075]: 2026-01-22 17:13:42.862 183079 DEBUG oslo_concurrency.lockutils [None req-e2ab1039-9de9-4e0c-baa6-d6eeaf6205e3 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "a39a5d00-6f96-4405-aff0-1449aee94079-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:42 compute-0 nova_compute[183075]: 2026-01-22 17:13:42.863 183079 DEBUG oslo_concurrency.lockutils [None req-e2ab1039-9de9-4e0c-baa6-d6eeaf6205e3 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "a39a5d00-6f96-4405-aff0-1449aee94079-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:42 compute-0 nova_compute[183075]: 2026-01-22 17:13:42.865 183079 INFO nova.compute.manager [None req-e2ab1039-9de9-4e0c-baa6-d6eeaf6205e3 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Terminating instance
Jan 22 17:13:42 compute-0 nova_compute[183075]: 2026-01-22 17:13:42.868 183079 DEBUG nova.compute.manager [None req-e2ab1039-9de9-4e0c-baa6-d6eeaf6205e3 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:13:42 compute-0 kernel: tapc30636af-db (unregistering): left promiscuous mode
Jan 22 17:13:42 compute-0 NetworkManager[55454]: <info>  [1769102022.8981] device (tapc30636af-db): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:13:42 compute-0 nova_compute[183075]: 2026-01-22 17:13:42.913 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:42 compute-0 ovn_controller[95372]: 2026-01-22T17:13:42Z|00298|binding|INFO|Releasing lport c30636af-db80-4279-8e40-c0266175c726 from this chassis (sb_readonly=0)
Jan 22 17:13:42 compute-0 ovn_controller[95372]: 2026-01-22T17:13:42Z|00299|binding|INFO|Setting lport c30636af-db80-4279-8e40-c0266175c726 down in Southbound
Jan 22 17:13:42 compute-0 ovn_controller[95372]: 2026-01-22T17:13:42Z|00300|binding|INFO|Removing iface tapc30636af-db ovn-installed in OVS
Jan 22 17:13:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:42.927 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cd:f6:c4 10.100.0.19'], port_security=['fa:16:3e:cd:f6:c4 10.100.0.19'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28', 'neutron:device_id': 'a39a5d00-6f96-4405-aff0-1449aee94079', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cbad0d35-4bf3-49f1-bb21-0be199e1e42e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bfc6667804934c92b71ce7638089e9e3', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ff5109fd-5275-48f3-bbdf-9e01013834de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.176'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=429b02ac-0e82-46dc-96b9-403150e7bdc7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=c30636af-db80-4279-8e40-c0266175c726) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:13:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:42.928 104629 INFO neutron.agent.ovn.metadata.agent [-] Port c30636af-db80-4279-8e40-c0266175c726 in datapath cbad0d35-4bf3-49f1-bb21-0be199e1e42e unbound from our chassis
Jan 22 17:13:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:42.931 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cbad0d35-4bf3-49f1-bb21-0be199e1e42e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:13:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:42.932 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[71fae419-d89e-4f82-b88b-61d42fc5a1c6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:42.933 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cbad0d35-4bf3-49f1-bb21-0be199e1e42e namespace which is not needed anymore
Jan 22 17:13:42 compute-0 nova_compute[183075]: 2026-01-22 17:13:42.939 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:42 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000016.scope: Deactivated successfully.
Jan 22 17:13:42 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000016.scope: Consumed 15.222s CPU time.
Jan 22 17:13:42 compute-0 systemd-machined[154382]: Machine qemu-22-instance-00000016 terminated.
Jan 22 17:13:42 compute-0 podman[222001]: 2026-01-22 17:13:42.987027525 +0000 UTC m=+0.066081664 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:13:43 compute-0 neutron-haproxy-ovnmeta-cbad0d35-4bf3-49f1-bb21-0be199e1e42e[220905]: [NOTICE]   (220909) : haproxy version is 2.8.14-c23fe91
Jan 22 17:13:43 compute-0 neutron-haproxy-ovnmeta-cbad0d35-4bf3-49f1-bb21-0be199e1e42e[220905]: [NOTICE]   (220909) : path to executable is /usr/sbin/haproxy
Jan 22 17:13:43 compute-0 neutron-haproxy-ovnmeta-cbad0d35-4bf3-49f1-bb21-0be199e1e42e[220905]: [WARNING]  (220909) : Exiting Master process...
Jan 22 17:13:43 compute-0 neutron-haproxy-ovnmeta-cbad0d35-4bf3-49f1-bb21-0be199e1e42e[220905]: [ALERT]    (220909) : Current worker (220911) exited with code 143 (Terminated)
Jan 22 17:13:43 compute-0 neutron-haproxy-ovnmeta-cbad0d35-4bf3-49f1-bb21-0be199e1e42e[220905]: [WARNING]  (220909) : All workers exited. Exiting... (0)
Jan 22 17:13:43 compute-0 systemd[1]: libpod-4faf985257e3c47d365d9893848cb23e428415e25aabdb2f680c13454df413aa.scope: Deactivated successfully.
Jan 22 17:13:43 compute-0 podman[222045]: 2026-01-22 17:13:43.073389737 +0000 UTC m=+0.048456584 container died 4faf985257e3c47d365d9893848cb23e428415e25aabdb2f680c13454df413aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cbad0d35-4bf3-49f1-bb21-0be199e1e42e, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 17:13:43 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4faf985257e3c47d365d9893848cb23e428415e25aabdb2f680c13454df413aa-userdata-shm.mount: Deactivated successfully.
Jan 22 17:13:43 compute-0 systemd[1]: var-lib-containers-storage-overlay-b15dfc45478c199d4e87377273bc377a98fa0798e6507e78b443f0dfcffbdafc-merged.mount: Deactivated successfully.
Jan 22 17:13:43 compute-0 podman[222045]: 2026-01-22 17:13:43.121297946 +0000 UTC m=+0.096364793 container cleanup 4faf985257e3c47d365d9893848cb23e428415e25aabdb2f680c13454df413aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cbad0d35-4bf3-49f1-bb21-0be199e1e42e, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 22 17:13:43 compute-0 systemd[1]: libpod-conmon-4faf985257e3c47d365d9893848cb23e428415e25aabdb2f680c13454df413aa.scope: Deactivated successfully.
Jan 22 17:13:43 compute-0 nova_compute[183075]: 2026-01-22 17:13:43.135 183079 INFO nova.virt.libvirt.driver [-] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Instance destroyed successfully.
Jan 22 17:13:43 compute-0 nova_compute[183075]: 2026-01-22 17:13:43.136 183079 DEBUG nova.objects.instance [None req-e2ab1039-9de9-4e0c-baa6-d6eeaf6205e3 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lazy-loading 'resources' on Instance uuid a39a5d00-6f96-4405-aff0-1449aee94079 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:13:43 compute-0 nova_compute[183075]: 2026-01-22 17:13:43.149 183079 DEBUG nova.virt.libvirt.vif [None req-e2ab1039-9de9-4e0c-baa6-d6eeaf6205e3 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:12:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-2004519396',display_name='tempest-server-test-2004519396',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-2004519396',id=22,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIwX3Tnjd5UGmg0w9k/BN9eS1qe75E7Lic/jqsTQaVUTG16NFNysn4OP5OqeIQEMSgvijvcEmFLUdbKXTJ+WqhpTczbZR3YnhHyqcZ3vgAR6NGGdmWhQ6meJ9Nv3J8mm/Q==',key_name='tempest-keypair-test-368848261',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:12:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bfc6667804934c92b71ce7638089e9e3',ramdisk_id='',reservation_id='r-m04apwhz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-QoSTest-2146064006',owner_user_name='tempest-QoSTest-2146064006-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:12:39Z,user_data=None,user_id='1e61127d65144bcbaa0d43fe3eb484c0',uuid=a39a5d00-6f96-4405-aff0-1449aee94079,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c30636af-db80-4279-8e40-c0266175c726", "address": "fa:16:3e:cd:f6:c4", "network": {"id": "cbad0d35-4bf3-49f1-bb21-0be199e1e42e", "bridge": "br-int", "label": "tempest-test-network--998799692", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc30636af-db", "ovs_interfaceid": "c30636af-db80-4279-8e40-c0266175c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:13:43 compute-0 nova_compute[183075]: 2026-01-22 17:13:43.149 183079 DEBUG nova.network.os_vif_util [None req-e2ab1039-9de9-4e0c-baa6-d6eeaf6205e3 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Converting VIF {"id": "c30636af-db80-4279-8e40-c0266175c726", "address": "fa:16:3e:cd:f6:c4", "network": {"id": "cbad0d35-4bf3-49f1-bb21-0be199e1e42e", "bridge": "br-int", "label": "tempest-test-network--998799692", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc30636af-db", "ovs_interfaceid": "c30636af-db80-4279-8e40-c0266175c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:13:43 compute-0 nova_compute[183075]: 2026-01-22 17:13:43.150 183079 DEBUG nova.network.os_vif_util [None req-e2ab1039-9de9-4e0c-baa6-d6eeaf6205e3 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cd:f6:c4,bridge_name='br-int',has_traffic_filtering=True,id=c30636af-db80-4279-8e40-c0266175c726,network=Network(cbad0d35-4bf3-49f1-bb21-0be199e1e42e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc30636af-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:13:43 compute-0 nova_compute[183075]: 2026-01-22 17:13:43.150 183079 DEBUG os_vif [None req-e2ab1039-9de9-4e0c-baa6-d6eeaf6205e3 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:f6:c4,bridge_name='br-int',has_traffic_filtering=True,id=c30636af-db80-4279-8e40-c0266175c726,network=Network(cbad0d35-4bf3-49f1-bb21-0be199e1e42e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc30636af-db') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:13:43 compute-0 nova_compute[183075]: 2026-01-22 17:13:43.153 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:43 compute-0 nova_compute[183075]: 2026-01-22 17:13:43.154 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc30636af-db, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:13:43 compute-0 nova_compute[183075]: 2026-01-22 17:13:43.158 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:13:43 compute-0 nova_compute[183075]: 2026-01-22 17:13:43.162 183079 INFO os_vif [None req-e2ab1039-9de9-4e0c-baa6-d6eeaf6205e3 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:f6:c4,bridge_name='br-int',has_traffic_filtering=True,id=c30636af-db80-4279-8e40-c0266175c726,network=Network(cbad0d35-4bf3-49f1-bb21-0be199e1e42e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc30636af-db')
Jan 22 17:13:43 compute-0 nova_compute[183075]: 2026-01-22 17:13:43.162 183079 INFO nova.virt.libvirt.driver [None req-e2ab1039-9de9-4e0c-baa6-d6eeaf6205e3 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Deleting instance files /var/lib/nova/instances/a39a5d00-6f96-4405-aff0-1449aee94079_del
Jan 22 17:13:43 compute-0 nova_compute[183075]: 2026-01-22 17:13:43.163 183079 INFO nova.virt.libvirt.driver [None req-e2ab1039-9de9-4e0c-baa6-d6eeaf6205e3 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Deletion of /var/lib/nova/instances/a39a5d00-6f96-4405-aff0-1449aee94079_del complete
Jan 22 17:13:43 compute-0 nova_compute[183075]: 2026-01-22 17:13:43.170 183079 DEBUG nova.compute.manager [req-d0831c97-f815-455e-8900-a85faddc149f req-61d7b91c-cf53-409e-9dc8-72bb258a7138 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Received event network-vif-unplugged-c30636af-db80-4279-8e40-c0266175c726 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:13:43 compute-0 nova_compute[183075]: 2026-01-22 17:13:43.171 183079 DEBUG oslo_concurrency.lockutils [req-d0831c97-f815-455e-8900-a85faddc149f req-61d7b91c-cf53-409e-9dc8-72bb258a7138 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "a39a5d00-6f96-4405-aff0-1449aee94079-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:43 compute-0 nova_compute[183075]: 2026-01-22 17:13:43.171 183079 DEBUG oslo_concurrency.lockutils [req-d0831c97-f815-455e-8900-a85faddc149f req-61d7b91c-cf53-409e-9dc8-72bb258a7138 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "a39a5d00-6f96-4405-aff0-1449aee94079-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:43 compute-0 nova_compute[183075]: 2026-01-22 17:13:43.171 183079 DEBUG oslo_concurrency.lockutils [req-d0831c97-f815-455e-8900-a85faddc149f req-61d7b91c-cf53-409e-9dc8-72bb258a7138 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "a39a5d00-6f96-4405-aff0-1449aee94079-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:43 compute-0 nova_compute[183075]: 2026-01-22 17:13:43.172 183079 DEBUG nova.compute.manager [req-d0831c97-f815-455e-8900-a85faddc149f req-61d7b91c-cf53-409e-9dc8-72bb258a7138 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] No waiting events found dispatching network-vif-unplugged-c30636af-db80-4279-8e40-c0266175c726 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:13:43 compute-0 nova_compute[183075]: 2026-01-22 17:13:43.172 183079 DEBUG nova.compute.manager [req-d0831c97-f815-455e-8900-a85faddc149f req-61d7b91c-cf53-409e-9dc8-72bb258a7138 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Received event network-vif-unplugged-c30636af-db80-4279-8e40-c0266175c726 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:13:43 compute-0 podman[222088]: 2026-01-22 17:13:43.189222427 +0000 UTC m=+0.039580933 container remove 4faf985257e3c47d365d9893848cb23e428415e25aabdb2f680c13454df413aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cbad0d35-4bf3-49f1-bb21-0be199e1e42e, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 22 17:13:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:43.195 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[bbe641b3-735c-4138-9f39-2451d9a4cf8e]: (4, ('Thu Jan 22 05:13:43 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-cbad0d35-4bf3-49f1-bb21-0be199e1e42e (4faf985257e3c47d365d9893848cb23e428415e25aabdb2f680c13454df413aa)\n4faf985257e3c47d365d9893848cb23e428415e25aabdb2f680c13454df413aa\nThu Jan 22 05:13:43 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-cbad0d35-4bf3-49f1-bb21-0be199e1e42e (4faf985257e3c47d365d9893848cb23e428415e25aabdb2f680c13454df413aa)\n4faf985257e3c47d365d9893848cb23e428415e25aabdb2f680c13454df413aa\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:43.197 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6b044458-09d1-4311-a4be-8849c743b35b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:43.198 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcbad0d35-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:13:43 compute-0 nova_compute[183075]: 2026-01-22 17:13:43.200 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:43 compute-0 kernel: tapcbad0d35-40: left promiscuous mode
Jan 22 17:13:43 compute-0 nova_compute[183075]: 2026-01-22 17:13:43.212 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:43.216 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[fe466011-411c-4ac8-9487-7d394a9f3b15]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:43.231 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c9e6822b-dbe4-4990-b34c-8e0fc40b6228]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:43.232 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e34e082d-643a-4057-89e1-18e4cf2c425c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:43 compute-0 nova_compute[183075]: 2026-01-22 17:13:43.243 183079 INFO nova.compute.manager [None req-e2ab1039-9de9-4e0c-baa6-d6eeaf6205e3 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Took 0.37 seconds to destroy the instance on the hypervisor.
Jan 22 17:13:43 compute-0 nova_compute[183075]: 2026-01-22 17:13:43.244 183079 DEBUG oslo.service.loopingcall [None req-e2ab1039-9de9-4e0c-baa6-d6eeaf6205e3 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:13:43 compute-0 nova_compute[183075]: 2026-01-22 17:13:43.244 183079 DEBUG nova.compute.manager [-] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:13:43 compute-0 nova_compute[183075]: 2026-01-22 17:13:43.244 183079 DEBUG nova.network.neutron [-] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:13:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:43.252 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[63897864-594f-4dc6-a31e-a4084b632042]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431739, 'reachable_time': 19652, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222103, 'error': None, 'target': 'ovnmeta-cbad0d35-4bf3-49f1-bb21-0be199e1e42e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:43 compute-0 systemd[1]: run-netns-ovnmeta\x2dcbad0d35\x2d4bf3\x2d49f1\x2dbb21\x2d0be199e1e42e.mount: Deactivated successfully.
Jan 22 17:13:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:43.260 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cbad0d35-4bf3-49f1-bb21-0be199e1e42e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:13:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:43.261 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[a3e2eeae-9f56-451e-b553-1f46e1f5a816]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:43 compute-0 nova_compute[183075]: 2026-01-22 17:13:43.322 183079 DEBUG nova.network.neutron [req-176f9eb7-0ca6-4c48-8007-b5131883e741 req-800da730-47d0-46ae-a450-9004d0e90d12 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Updated VIF entry in instance network info cache for port c30636af-db80-4279-8e40-c0266175c726. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:13:43 compute-0 nova_compute[183075]: 2026-01-22 17:13:43.323 183079 DEBUG nova.network.neutron [req-176f9eb7-0ca6-4c48-8007-b5131883e741 req-800da730-47d0-46ae-a450-9004d0e90d12 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Updating instance_info_cache with network_info: [{"id": "c30636af-db80-4279-8e40-c0266175c726", "address": "fa:16:3e:cd:f6:c4", "network": {"id": "cbad0d35-4bf3-49f1-bb21-0be199e1e42e", "bridge": "br-int", "label": "tempest-test-network--998799692", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc30636af-db", "ovs_interfaceid": "c30636af-db80-4279-8e40-c0266175c726", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:13:43 compute-0 nova_compute[183075]: 2026-01-22 17:13:43.342 183079 DEBUG oslo_concurrency.lockutils [req-176f9eb7-0ca6-4c48-8007-b5131883e741 req-800da730-47d0-46ae-a450-9004d0e90d12 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-a39a5d00-6f96-4405-aff0-1449aee94079" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:13:44 compute-0 nova_compute[183075]: 2026-01-22 17:13:44.910 183079 INFO nova.compute.manager [None req-24583d48-c3bd-4fb8-bd4b-d5eebafebd9c 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Get console output
Jan 22 17:13:44 compute-0 nova_compute[183075]: 2026-01-22 17:13:44.918 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:13:45 compute-0 nova_compute[183075]: 2026-01-22 17:13:45.025 183079 DEBUG nova.network.neutron [-] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:13:45 compute-0 nova_compute[183075]: 2026-01-22 17:13:45.045 183079 INFO nova.compute.manager [-] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Took 1.80 seconds to deallocate network for instance.
Jan 22 17:13:45 compute-0 nova_compute[183075]: 2026-01-22 17:13:45.101 183079 DEBUG oslo_concurrency.lockutils [None req-e2ab1039-9de9-4e0c-baa6-d6eeaf6205e3 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:45 compute-0 nova_compute[183075]: 2026-01-22 17:13:45.101 183079 DEBUG oslo_concurrency.lockutils [None req-e2ab1039-9de9-4e0c-baa6-d6eeaf6205e3 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:45 compute-0 nova_compute[183075]: 2026-01-22 17:13:45.250 183079 DEBUG nova.compute.provider_tree [None req-e2ab1039-9de9-4e0c-baa6-d6eeaf6205e3 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:13:45 compute-0 nova_compute[183075]: 2026-01-22 17:13:45.265 183079 DEBUG nova.compute.manager [req-3ac3e5e7-fa5c-41a9-aed8-cc73616aed72 req-13b627c0-2e2a-4e10-aede-a10feb9aae49 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Received event network-vif-plugged-c30636af-db80-4279-8e40-c0266175c726 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:13:45 compute-0 nova_compute[183075]: 2026-01-22 17:13:45.265 183079 DEBUG oslo_concurrency.lockutils [req-3ac3e5e7-fa5c-41a9-aed8-cc73616aed72 req-13b627c0-2e2a-4e10-aede-a10feb9aae49 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "a39a5d00-6f96-4405-aff0-1449aee94079-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:45 compute-0 nova_compute[183075]: 2026-01-22 17:13:45.265 183079 DEBUG oslo_concurrency.lockutils [req-3ac3e5e7-fa5c-41a9-aed8-cc73616aed72 req-13b627c0-2e2a-4e10-aede-a10feb9aae49 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "a39a5d00-6f96-4405-aff0-1449aee94079-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:45 compute-0 nova_compute[183075]: 2026-01-22 17:13:45.266 183079 DEBUG oslo_concurrency.lockutils [req-3ac3e5e7-fa5c-41a9-aed8-cc73616aed72 req-13b627c0-2e2a-4e10-aede-a10feb9aae49 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "a39a5d00-6f96-4405-aff0-1449aee94079-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:45 compute-0 nova_compute[183075]: 2026-01-22 17:13:45.266 183079 DEBUG nova.compute.manager [req-3ac3e5e7-fa5c-41a9-aed8-cc73616aed72 req-13b627c0-2e2a-4e10-aede-a10feb9aae49 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] No waiting events found dispatching network-vif-plugged-c30636af-db80-4279-8e40-c0266175c726 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:13:45 compute-0 nova_compute[183075]: 2026-01-22 17:13:45.266 183079 WARNING nova.compute.manager [req-3ac3e5e7-fa5c-41a9-aed8-cc73616aed72 req-13b627c0-2e2a-4e10-aede-a10feb9aae49 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Received unexpected event network-vif-plugged-c30636af-db80-4279-8e40-c0266175c726 for instance with vm_state deleted and task_state None.
Jan 22 17:13:45 compute-0 nova_compute[183075]: 2026-01-22 17:13:45.266 183079 DEBUG nova.compute.manager [req-3ac3e5e7-fa5c-41a9-aed8-cc73616aed72 req-13b627c0-2e2a-4e10-aede-a10feb9aae49 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Received event network-vif-deleted-c30636af-db80-4279-8e40-c0266175c726 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:13:45 compute-0 nova_compute[183075]: 2026-01-22 17:13:45.268 183079 DEBUG nova.scheduler.client.report [None req-e2ab1039-9de9-4e0c-baa6-d6eeaf6205e3 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:13:45 compute-0 nova_compute[183075]: 2026-01-22 17:13:45.298 183079 DEBUG oslo_concurrency.lockutils [None req-e2ab1039-9de9-4e0c-baa6-d6eeaf6205e3 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.197s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:45 compute-0 nova_compute[183075]: 2026-01-22 17:13:45.332 183079 INFO nova.scheduler.client.report [None req-e2ab1039-9de9-4e0c-baa6-d6eeaf6205e3 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Deleted allocations for instance a39a5d00-6f96-4405-aff0-1449aee94079
Jan 22 17:13:45 compute-0 nova_compute[183075]: 2026-01-22 17:13:45.380 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:45 compute-0 nova_compute[183075]: 2026-01-22 17:13:45.398 183079 DEBUG oslo_concurrency.lockutils [None req-e2ab1039-9de9-4e0c-baa6-d6eeaf6205e3 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "a39a5d00-6f96-4405-aff0-1449aee94079" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.539s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:45 compute-0 ovn_controller[95372]: 2026-01-22T17:13:45Z|00054|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:24:ac:67 10.100.0.5
Jan 22 17:13:45 compute-0 ovn_controller[95372]: 2026-01-22T17:13:45Z|00055|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:24:ac:67 10.100.0.5
Jan 22 17:13:45 compute-0 nova_compute[183075]: 2026-01-22 17:13:45.475 183079 INFO nova.compute.manager [None req-305efe10-1d9b-40fe-8f7c-79a3fa2dc30e 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Get console output
Jan 22 17:13:45 compute-0 nova_compute[183075]: 2026-01-22 17:13:45.633 183079 INFO nova.compute.manager [None req-7e167981-a785-4604-8626-4b9e800105e0 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Get console output
Jan 22 17:13:45 compute-0 nova_compute[183075]: 2026-01-22 17:13:45.638 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:13:46 compute-0 nova_compute[183075]: 2026-01-22 17:13:46.409 183079 INFO nova.compute.manager [None req-df3a2aa8-e1ee-4ebd-9fd9-3cfb1c657ce3 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Get console output
Jan 22 17:13:46 compute-0 nova_compute[183075]: 2026-01-22 17:13:46.415 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:13:48 compute-0 nova_compute[183075]: 2026-01-22 17:13:48.156 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:48 compute-0 nova_compute[183075]: 2026-01-22 17:13:48.847 183079 DEBUG nova.compute.manager [req-6ae27933-5f00-40f3-9ebf-c64052417544 req-df094c4e-22e0-4cf6-b2ba-f37c9af0dab9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Received event network-changed-87f56506-3e49-4545-b8c0-8c58cbe49f15 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:13:48 compute-0 nova_compute[183075]: 2026-01-22 17:13:48.848 183079 DEBUG nova.compute.manager [req-6ae27933-5f00-40f3-9ebf-c64052417544 req-df094c4e-22e0-4cf6-b2ba-f37c9af0dab9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Refreshing instance network info cache due to event network-changed-87f56506-3e49-4545-b8c0-8c58cbe49f15. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:13:48 compute-0 nova_compute[183075]: 2026-01-22 17:13:48.848 183079 DEBUG oslo_concurrency.lockutils [req-6ae27933-5f00-40f3-9ebf-c64052417544 req-df094c4e-22e0-4cf6-b2ba-f37c9af0dab9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-cfe610a3-4dee-46ca-a82a-1c8993fdd52c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:13:48 compute-0 nova_compute[183075]: 2026-01-22 17:13:48.848 183079 DEBUG oslo_concurrency.lockutils [req-6ae27933-5f00-40f3-9ebf-c64052417544 req-df094c4e-22e0-4cf6-b2ba-f37c9af0dab9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-cfe610a3-4dee-46ca-a82a-1c8993fdd52c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:13:48 compute-0 nova_compute[183075]: 2026-01-22 17:13:48.848 183079 DEBUG nova.network.neutron [req-6ae27933-5f00-40f3-9ebf-c64052417544 req-df094c4e-22e0-4cf6-b2ba-f37c9af0dab9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Refreshing network info cache for port 87f56506-3e49-4545-b8c0-8c58cbe49f15 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:13:49 compute-0 nova_compute[183075]: 2026-01-22 17:13:49.070 183079 DEBUG oslo_concurrency.lockutils [None req-40b23c51-3289-4e8c-af5f-ba2de199b4ef 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "ed1e087d-92fe-41d0-bd0f-e907e799d3bc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:49 compute-0 nova_compute[183075]: 2026-01-22 17:13:49.071 183079 DEBUG oslo_concurrency.lockutils [None req-40b23c51-3289-4e8c-af5f-ba2de199b4ef 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "ed1e087d-92fe-41d0-bd0f-e907e799d3bc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:49 compute-0 nova_compute[183075]: 2026-01-22 17:13:49.071 183079 DEBUG oslo_concurrency.lockutils [None req-40b23c51-3289-4e8c-af5f-ba2de199b4ef 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "ed1e087d-92fe-41d0-bd0f-e907e799d3bc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:49 compute-0 nova_compute[183075]: 2026-01-22 17:13:49.071 183079 DEBUG oslo_concurrency.lockutils [None req-40b23c51-3289-4e8c-af5f-ba2de199b4ef 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "ed1e087d-92fe-41d0-bd0f-e907e799d3bc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:49 compute-0 nova_compute[183075]: 2026-01-22 17:13:49.071 183079 DEBUG oslo_concurrency.lockutils [None req-40b23c51-3289-4e8c-af5f-ba2de199b4ef 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "ed1e087d-92fe-41d0-bd0f-e907e799d3bc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:49 compute-0 nova_compute[183075]: 2026-01-22 17:13:49.072 183079 INFO nova.compute.manager [None req-40b23c51-3289-4e8c-af5f-ba2de199b4ef 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Terminating instance
Jan 22 17:13:49 compute-0 nova_compute[183075]: 2026-01-22 17:13:49.073 183079 DEBUG nova.compute.manager [None req-40b23c51-3289-4e8c-af5f-ba2de199b4ef 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:13:49 compute-0 kernel: tap3f6f0766-40 (unregistering): left promiscuous mode
Jan 22 17:13:49 compute-0 NetworkManager[55454]: <info>  [1769102029.1005] device (tap3f6f0766-40): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:13:49 compute-0 nova_compute[183075]: 2026-01-22 17:13:49.107 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:49 compute-0 ovn_controller[95372]: 2026-01-22T17:13:49Z|00301|binding|INFO|Releasing lport 3f6f0766-40d7-4301-8513-fdb50502511a from this chassis (sb_readonly=0)
Jan 22 17:13:49 compute-0 ovn_controller[95372]: 2026-01-22T17:13:49Z|00302|binding|INFO|Setting lport 3f6f0766-40d7-4301-8513-fdb50502511a down in Southbound
Jan 22 17:13:49 compute-0 ovn_controller[95372]: 2026-01-22T17:13:49Z|00303|binding|INFO|Removing iface tap3f6f0766-40 ovn-installed in OVS
Jan 22 17:13:49 compute-0 nova_compute[183075]: 2026-01-22 17:13:49.110 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:49 compute-0 nova_compute[183075]: 2026-01-22 17:13:49.123 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:49.135 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:24:9f 10.10.220.230'], port_security=['fa:16:3e:5b:24:9f 10.10.220.230'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.10.220.230/24', 'neutron:device_id': 'ed1e087d-92fe-41d0-bd0f-e907e799d3bc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b6706592-b3b7-4148-ba2f-3b2dac46e91f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '26cca885d303443380036cbbe9e70744', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9427a67d-1313-4d60-b73e-5a3f81f9a54d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d4be7746-829b-4878-bcc8-6c751abfe325, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=3f6f0766-40d7-4301-8513-fdb50502511a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:13:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:49.136 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 3f6f0766-40d7-4301-8513-fdb50502511a in datapath b6706592-b3b7-4148-ba2f-3b2dac46e91f unbound from our chassis
Jan 22 17:13:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:49.139 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b6706592-b3b7-4148-ba2f-3b2dac46e91f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:13:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:49.141 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[28fc6ba2-1aea-4264-97a7-729ee58607ab]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:49.143 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b6706592-b3b7-4148-ba2f-3b2dac46e91f namespace which is not needed anymore
Jan 22 17:13:49 compute-0 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000019.scope: Deactivated successfully.
Jan 22 17:13:49 compute-0 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000019.scope: Consumed 13.409s CPU time.
Jan 22 17:13:49 compute-0 systemd-machined[154382]: Machine qemu-25-instance-00000019 terminated.
Jan 22 17:13:49 compute-0 nova_compute[183075]: 2026-01-22 17:13:49.330 183079 INFO nova.virt.libvirt.driver [-] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Instance destroyed successfully.
Jan 22 17:13:49 compute-0 nova_compute[183075]: 2026-01-22 17:13:49.331 183079 DEBUG nova.objects.instance [None req-40b23c51-3289-4e8c-af5f-ba2de199b4ef 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lazy-loading 'resources' on Instance uuid ed1e087d-92fe-41d0-bd0f-e907e799d3bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:13:49 compute-0 neutron-haproxy-ovnmeta-b6706592-b3b7-4148-ba2f-3b2dac46e91f[221729]: [NOTICE]   (221733) : haproxy version is 2.8.14-c23fe91
Jan 22 17:13:49 compute-0 neutron-haproxy-ovnmeta-b6706592-b3b7-4148-ba2f-3b2dac46e91f[221729]: [NOTICE]   (221733) : path to executable is /usr/sbin/haproxy
Jan 22 17:13:49 compute-0 neutron-haproxy-ovnmeta-b6706592-b3b7-4148-ba2f-3b2dac46e91f[221729]: [WARNING]  (221733) : Exiting Master process...
Jan 22 17:13:49 compute-0 neutron-haproxy-ovnmeta-b6706592-b3b7-4148-ba2f-3b2dac46e91f[221729]: [ALERT]    (221733) : Current worker (221735) exited with code 143 (Terminated)
Jan 22 17:13:49 compute-0 neutron-haproxy-ovnmeta-b6706592-b3b7-4148-ba2f-3b2dac46e91f[221729]: [WARNING]  (221733) : All workers exited. Exiting... (0)
Jan 22 17:13:49 compute-0 systemd[1]: libpod-8f8b650c4cb3750afb653a102a60863b104a09927661f4ae48d4caf8149337ec.scope: Deactivated successfully.
Jan 22 17:13:49 compute-0 nova_compute[183075]: 2026-01-22 17:13:49.342 183079 DEBUG nova.virt.libvirt.vif [None req-40b23c51-3289-4e8c-af5f-ba2de199b4ef 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:13:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-310083763',display_name='tempest-server-test-310083763',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-310083763',id=25,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFDbtc4hRCSv9gHmltB2G2DhAS2UXQKlSKw7wO8jzDWMIBVwAbtKYxxZ/33aOA3bpXNLm1KqC5K8xgOmfq2yl/h8qq6Gn/Z0l0pwwJ1+wDWZki4CnB3qyJDKSWstQshx5w==',key_name='tempest-keypair-test-612599565',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:13:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='26cca885d303443380036cbbe9e70744',ramdisk_id='',reservation_id='r-k3vwoe13',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_in
put_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-NetworkConnectivityTest-1809867331',owner_user_name='tempest-NetworkConnectivityTest-1809867331-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:13:23Z,user_data=None,user_id='4a7542774b9c42618cf9d00113f9d23d',uuid=ed1e087d-92fe-41d0-bd0f-e907e799d3bc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3f6f0766-40d7-4301-8513-fdb50502511a", "address": "fa:16:3e:5b:24:9f", "network": {"id": "b6706592-b3b7-4148-ba2f-3b2dac46e91f", "bridge": "br-int", "label": "tempest-test-network--969320467", "subnets": [{"cidr": "10.10.220.0/24", "dns": [], "gateway": {"address": "10.10.220.254", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.220.230", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f6f0766-40", "ovs_interfaceid": "3f6f0766-40d7-4301-8513-fdb50502511a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:13:49 compute-0 nova_compute[183075]: 2026-01-22 17:13:49.342 183079 DEBUG nova.network.os_vif_util [None req-40b23c51-3289-4e8c-af5f-ba2de199b4ef 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Converting VIF {"id": "3f6f0766-40d7-4301-8513-fdb50502511a", "address": "fa:16:3e:5b:24:9f", "network": {"id": "b6706592-b3b7-4148-ba2f-3b2dac46e91f", "bridge": "br-int", "label": "tempest-test-network--969320467", "subnets": [{"cidr": "10.10.220.0/24", "dns": [], "gateway": {"address": "10.10.220.254", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.220.230", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f6f0766-40", "ovs_interfaceid": "3f6f0766-40d7-4301-8513-fdb50502511a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:13:49 compute-0 nova_compute[183075]: 2026-01-22 17:13:49.342 183079 DEBUG nova.network.os_vif_util [None req-40b23c51-3289-4e8c-af5f-ba2de199b4ef 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:24:9f,bridge_name='br-int',has_traffic_filtering=True,id=3f6f0766-40d7-4301-8513-fdb50502511a,network=Network(b6706592-b3b7-4148-ba2f-3b2dac46e91f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3f6f0766-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:13:49 compute-0 nova_compute[183075]: 2026-01-22 17:13:49.343 183079 DEBUG os_vif [None req-40b23c51-3289-4e8c-af5f-ba2de199b4ef 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:24:9f,bridge_name='br-int',has_traffic_filtering=True,id=3f6f0766-40d7-4301-8513-fdb50502511a,network=Network(b6706592-b3b7-4148-ba2f-3b2dac46e91f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3f6f0766-40') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:13:49 compute-0 nova_compute[183075]: 2026-01-22 17:13:49.344 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:49 compute-0 nova_compute[183075]: 2026-01-22 17:13:49.344 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3f6f0766-40, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:13:49 compute-0 nova_compute[183075]: 2026-01-22 17:13:49.345 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:49 compute-0 nova_compute[183075]: 2026-01-22 17:13:49.347 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:49 compute-0 podman[222154]: 2026-01-22 17:13:49.348699443 +0000 UTC m=+0.112936716 container died 8f8b650c4cb3750afb653a102a60863b104a09927661f4ae48d4caf8149337ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b6706592-b3b7-4148-ba2f-3b2dac46e91f, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 17:13:49 compute-0 nova_compute[183075]: 2026-01-22 17:13:49.351 183079 INFO os_vif [None req-40b23c51-3289-4e8c-af5f-ba2de199b4ef 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:24:9f,bridge_name='br-int',has_traffic_filtering=True,id=3f6f0766-40d7-4301-8513-fdb50502511a,network=Network(b6706592-b3b7-4148-ba2f-3b2dac46e91f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3f6f0766-40')
Jan 22 17:13:49 compute-0 nova_compute[183075]: 2026-01-22 17:13:49.352 183079 INFO nova.virt.libvirt.driver [None req-40b23c51-3289-4e8c-af5f-ba2de199b4ef 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Deleting instance files /var/lib/nova/instances/ed1e087d-92fe-41d0-bd0f-e907e799d3bc_del
Jan 22 17:13:49 compute-0 nova_compute[183075]: 2026-01-22 17:13:49.353 183079 INFO nova.virt.libvirt.driver [None req-40b23c51-3289-4e8c-af5f-ba2de199b4ef 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Deletion of /var/lib/nova/instances/ed1e087d-92fe-41d0-bd0f-e907e799d3bc_del complete
Jan 22 17:13:49 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8f8b650c4cb3750afb653a102a60863b104a09927661f4ae48d4caf8149337ec-userdata-shm.mount: Deactivated successfully.
Jan 22 17:13:49 compute-0 systemd[1]: var-lib-containers-storage-overlay-2197d0b6530bf1130a3d1477e9e62aa4f77a854014f1a11c01ad7194fcfea478-merged.mount: Deactivated successfully.
Jan 22 17:13:49 compute-0 podman[222154]: 2026-01-22 17:13:49.432227281 +0000 UTC m=+0.196464544 container cleanup 8f8b650c4cb3750afb653a102a60863b104a09927661f4ae48d4caf8149337ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b6706592-b3b7-4148-ba2f-3b2dac46e91f, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 22 17:13:49 compute-0 nova_compute[183075]: 2026-01-22 17:13:49.441 183079 INFO nova.compute.manager [None req-40b23c51-3289-4e8c-af5f-ba2de199b4ef 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Took 0.37 seconds to destroy the instance on the hypervisor.
Jan 22 17:13:49 compute-0 systemd[1]: libpod-conmon-8f8b650c4cb3750afb653a102a60863b104a09927661f4ae48d4caf8149337ec.scope: Deactivated successfully.
Jan 22 17:13:49 compute-0 nova_compute[183075]: 2026-01-22 17:13:49.442 183079 DEBUG oslo.service.loopingcall [None req-40b23c51-3289-4e8c-af5f-ba2de199b4ef 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:13:49 compute-0 nova_compute[183075]: 2026-01-22 17:13:49.442 183079 DEBUG nova.compute.manager [-] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:13:49 compute-0 nova_compute[183075]: 2026-01-22 17:13:49.442 183079 DEBUG nova.network.neutron [-] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:13:49 compute-0 podman[222199]: 2026-01-22 17:13:49.535334479 +0000 UTC m=+0.074865893 container remove 8f8b650c4cb3750afb653a102a60863b104a09927661f4ae48d4caf8149337ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b6706592-b3b7-4148-ba2f-3b2dac46e91f, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 22 17:13:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:49.541 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[efbbfb60-8604-4e04-9fff-ea3bf1ed5be7]: (4, ('Thu Jan 22 05:13:49 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b6706592-b3b7-4148-ba2f-3b2dac46e91f (8f8b650c4cb3750afb653a102a60863b104a09927661f4ae48d4caf8149337ec)\n8f8b650c4cb3750afb653a102a60863b104a09927661f4ae48d4caf8149337ec\nThu Jan 22 05:13:49 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b6706592-b3b7-4148-ba2f-3b2dac46e91f (8f8b650c4cb3750afb653a102a60863b104a09927661f4ae48d4caf8149337ec)\n8f8b650c4cb3750afb653a102a60863b104a09927661f4ae48d4caf8149337ec\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:49.543 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a3173486-7de5-4a94-9c99-4d692c2a5c17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:49.544 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb6706592-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:13:49 compute-0 kernel: tapb6706592-b0: left promiscuous mode
Jan 22 17:13:49 compute-0 nova_compute[183075]: 2026-01-22 17:13:49.548 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:49.552 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[aa28fcb6-daad-4170-a8e1-791e243f7619]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:49 compute-0 nova_compute[183075]: 2026-01-22 17:13:49.562 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:49.571 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4ba04b24-93ee-426b-b999-83df3964fd37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:49.572 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c1d3f850-655e-4866-bbba-29609d60d72e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:49.591 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[aad77614-d440-4a7f-8aed-66bca20ee272]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 436613, 'reachable_time': 40927, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222214, 'error': None, 'target': 'ovnmeta-b6706592-b3b7-4148-ba2f-3b2dac46e91f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:49 compute-0 systemd[1]: run-netns-ovnmeta\x2db6706592\x2db3b7\x2d4148\x2dba2f\x2d3b2dac46e91f.mount: Deactivated successfully.
Jan 22 17:13:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:49.595 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b6706592-b3b7-4148-ba2f-3b2dac46e91f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:13:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:49.595 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[16c1d03a-ca5e-46ba-b1a5-5137ddb4a4a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:50 compute-0 nova_compute[183075]: 2026-01-22 17:13:50.384 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:50.603 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:50.605 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:13:50 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:50 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:50 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:50 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:50 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:50 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:13:50 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:50 compute-0 nova_compute[183075]: 2026-01-22 17:13:50.648 183079 DEBUG nova.network.neutron [req-6ae27933-5f00-40f3-9ebf-c64052417544 req-df094c4e-22e0-4cf6-b2ba-f37c9af0dab9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Updated VIF entry in instance network info cache for port 87f56506-3e49-4545-b8c0-8c58cbe49f15. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:13:50 compute-0 nova_compute[183075]: 2026-01-22 17:13:50.649 183079 DEBUG nova.network.neutron [req-6ae27933-5f00-40f3-9ebf-c64052417544 req-df094c4e-22e0-4cf6-b2ba-f37c9af0dab9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Updating instance_info_cache with network_info: [{"id": "87f56506-3e49-4545-b8c0-8c58cbe49f15", "address": "fa:16:3e:c6:8a:7a", "network": {"id": "3f5295ae-a32e-4595-80e2-a52b2a6b9934", "bridge": "br-int", "label": "tempest-test-network--1267664325", "subnets": [{"cidr": "10.10.210.0/24", "dns": [], "gateway": {"address": "10.10.210.254", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.210.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87f56506-3e", "ovs_interfaceid": "87f56506-3e49-4545-b8c0-8c58cbe49f15", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:13:50 compute-0 nova_compute[183075]: 2026-01-22 17:13:50.716 183079 DEBUG oslo_concurrency.lockutils [req-6ae27933-5f00-40f3-9ebf-c64052417544 req-df094c4e-22e0-4cf6-b2ba-f37c9af0dab9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-cfe610a3-4dee-46ca-a82a-1c8993fdd52c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:13:50 compute-0 nova_compute[183075]: 2026-01-22 17:13:50.938 183079 INFO nova.compute.manager [None req-fe5818ed-ea4c-4515-8a3b-5f3aaff70d06 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Get console output
Jan 22 17:13:50 compute-0 nova_compute[183075]: 2026-01-22 17:13:50.943 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:13:51 compute-0 nova_compute[183075]: 2026-01-22 17:13:51.068 183079 DEBUG nova.compute.manager [req-56f21fdb-2cf1-4a17-a2ff-6a0c9df1e566 req-59a5878b-7fba-4e7a-838b-ecb8d741aa92 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Received event network-vif-unplugged-3f6f0766-40d7-4301-8513-fdb50502511a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:13:51 compute-0 nova_compute[183075]: 2026-01-22 17:13:51.068 183079 DEBUG oslo_concurrency.lockutils [req-56f21fdb-2cf1-4a17-a2ff-6a0c9df1e566 req-59a5878b-7fba-4e7a-838b-ecb8d741aa92 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "ed1e087d-92fe-41d0-bd0f-e907e799d3bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:51 compute-0 nova_compute[183075]: 2026-01-22 17:13:51.070 183079 DEBUG oslo_concurrency.lockutils [req-56f21fdb-2cf1-4a17-a2ff-6a0c9df1e566 req-59a5878b-7fba-4e7a-838b-ecb8d741aa92 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "ed1e087d-92fe-41d0-bd0f-e907e799d3bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:51 compute-0 nova_compute[183075]: 2026-01-22 17:13:51.070 183079 DEBUG oslo_concurrency.lockutils [req-56f21fdb-2cf1-4a17-a2ff-6a0c9df1e566 req-59a5878b-7fba-4e7a-838b-ecb8d741aa92 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "ed1e087d-92fe-41d0-bd0f-e907e799d3bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:51 compute-0 nova_compute[183075]: 2026-01-22 17:13:51.070 183079 DEBUG nova.compute.manager [req-56f21fdb-2cf1-4a17-a2ff-6a0c9df1e566 req-59a5878b-7fba-4e7a-838b-ecb8d741aa92 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] No waiting events found dispatching network-vif-unplugged-3f6f0766-40d7-4301-8513-fdb50502511a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:13:51 compute-0 nova_compute[183075]: 2026-01-22 17:13:51.070 183079 DEBUG nova.compute.manager [req-56f21fdb-2cf1-4a17-a2ff-6a0c9df1e566 req-59a5878b-7fba-4e7a-838b-ecb8d741aa92 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Received event network-vif-unplugged-3f6f0766-40d7-4301-8513-fdb50502511a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:13:51 compute-0 nova_compute[183075]: 2026-01-22 17:13:51.070 183079 DEBUG nova.compute.manager [req-56f21fdb-2cf1-4a17-a2ff-6a0c9df1e566 req-59a5878b-7fba-4e7a-838b-ecb8d741aa92 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Received event network-vif-plugged-3f6f0766-40d7-4301-8513-fdb50502511a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:13:51 compute-0 nova_compute[183075]: 2026-01-22 17:13:51.071 183079 DEBUG oslo_concurrency.lockutils [req-56f21fdb-2cf1-4a17-a2ff-6a0c9df1e566 req-59a5878b-7fba-4e7a-838b-ecb8d741aa92 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "ed1e087d-92fe-41d0-bd0f-e907e799d3bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:51 compute-0 nova_compute[183075]: 2026-01-22 17:13:51.071 183079 DEBUG oslo_concurrency.lockutils [req-56f21fdb-2cf1-4a17-a2ff-6a0c9df1e566 req-59a5878b-7fba-4e7a-838b-ecb8d741aa92 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "ed1e087d-92fe-41d0-bd0f-e907e799d3bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:51 compute-0 nova_compute[183075]: 2026-01-22 17:13:51.071 183079 DEBUG oslo_concurrency.lockutils [req-56f21fdb-2cf1-4a17-a2ff-6a0c9df1e566 req-59a5878b-7fba-4e7a-838b-ecb8d741aa92 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "ed1e087d-92fe-41d0-bd0f-e907e799d3bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:51 compute-0 nova_compute[183075]: 2026-01-22 17:13:51.071 183079 DEBUG nova.compute.manager [req-56f21fdb-2cf1-4a17-a2ff-6a0c9df1e566 req-59a5878b-7fba-4e7a-838b-ecb8d741aa92 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] No waiting events found dispatching network-vif-plugged-3f6f0766-40d7-4301-8513-fdb50502511a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:13:51 compute-0 nova_compute[183075]: 2026-01-22 17:13:51.071 183079 WARNING nova.compute.manager [req-56f21fdb-2cf1-4a17-a2ff-6a0c9df1e566 req-59a5878b-7fba-4e7a-838b-ecb8d741aa92 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Received unexpected event network-vif-plugged-3f6f0766-40d7-4301-8513-fdb50502511a for instance with vm_state active and task_state deleting.
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.275 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.275 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.6700392
Jan 22 17:13:51 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[221952]: 10.100.0.5:44008 [22/Jan/2026:17:13:50.602] listener listener/metadata 0/0/0/673/673 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.284 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.286 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.306 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:51 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[221952]: 10.100.0.5:44024 [22/Jan/2026:17:13:51.284] listener listener/metadata 0/0/0/23/23 200 152 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.307 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 168 time: 0.0212772
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.312 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.313 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.333 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.334 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0209444
Jan 22 17:13:51 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[221952]: 10.100.0.5:44030 [22/Jan/2026:17:13:51.311] listener listener/metadata 0/0/0/22/22 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.346 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.347 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:51 compute-0 nova_compute[183075]: 2026-01-22 17:13:51.353 183079 DEBUG nova.network.neutron [-] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.367 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.368 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0204124
Jan 22 17:13:51 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[221952]: 10.100.0.5:44038 [22/Jan/2026:17:13:51.345] listener listener/metadata 0/0/0/22/22 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:13:51 compute-0 nova_compute[183075]: 2026-01-22 17:13:51.369 183079 INFO nova.compute.manager [-] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Took 1.93 seconds to deallocate network for instance.
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.374 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.375 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.398 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:51 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[221952]: 10.100.0.5:44042 [22/Jan/2026:17:13:51.373] listener listener/metadata 0/0/0/25/25 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.399 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0235765
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.407 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:51 compute-0 nova_compute[183075]: 2026-01-22 17:13:51.407 183079 DEBUG oslo_concurrency.lockutils [None req-40b23c51-3289-4e8c-af5f-ba2de199b4ef 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:51 compute-0 nova_compute[183075]: 2026-01-22 17:13:51.407 183079 DEBUG oslo_concurrency.lockutils [None req-40b23c51-3289-4e8c-af5f-ba2de199b4ef 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.408 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.425 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.426 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0179162
Jan 22 17:13:51 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[221952]: 10.100.0.5:44056 [22/Jan/2026:17:13:51.406] listener listener/metadata 0/0/0/19/19 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.431 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.432 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.453 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.454 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0216167
Jan 22 17:13:51 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[221952]: 10.100.0.5:44064 [22/Jan/2026:17:13:51.431] listener listener/metadata 0/0/0/22/22 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.460 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.461 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:51 compute-0 nova_compute[183075]: 2026-01-22 17:13:51.490 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.494 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:51 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[221952]: 10.100.0.5:44074 [22/Jan/2026:17:13:51.460] listener listener/metadata 0/0/0/36/36 200 135 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.496 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 151 time: 0.0341647
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.500 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.501 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.516 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:51 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[221952]: 10.100.0.5:44080 [22/Jan/2026:17:13:51.500] listener listener/metadata 0/0/0/16/16 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.517 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0156336
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.524 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.524 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:51 compute-0 nova_compute[183075]: 2026-01-22 17:13:51.525 183079 DEBUG nova.compute.provider_tree [None req-40b23c51-3289-4e8c-af5f-ba2de199b4ef 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.544 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.544 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0198896
Jan 22 17:13:51 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[221952]: 10.100.0.5:44094 [22/Jan/2026:17:13:51.523] listener listener/metadata 0/0/0/20/20 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:13:51 compute-0 nova_compute[183075]: 2026-01-22 17:13:51.546 183079 DEBUG nova.scheduler.client.report [None req-40b23c51-3289-4e8c-af5f-ba2de199b4ef 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.550 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.550 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:51 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[221952]: 10.100.0.5:44110 [22/Jan/2026:17:13:51.549] listener listener/metadata 0/0/0/15/15 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.565 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0148537
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.582 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.582 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.596 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:51 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[221952]: 10.100.0.5:44114 [22/Jan/2026:17:13:51.581] listener listener/metadata 0/0/0/15/15 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.596 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0138602
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.600 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.600 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.614 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.614 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0141191
Jan 22 17:13:51 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[221952]: 10.100.0.5:44118 [22/Jan/2026:17:13:51.600] listener listener/metadata 0/0/0/14/14 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.618 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.618 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:51 compute-0 nova_compute[183075]: 2026-01-22 17:13:51.627 183079 DEBUG oslo_concurrency.lockutils [None req-40b23c51-3289-4e8c-af5f-ba2de199b4ef 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.219s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.634 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.635 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0161304
Jan 22 17:13:51 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[221952]: 10.100.0.5:44124 [22/Jan/2026:17:13:51.617] listener listener/metadata 0/0/0/17/17 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.640 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.640 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.654 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.654 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0138338
Jan 22 17:13:51 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[221952]: 10.100.0.5:44130 [22/Jan/2026:17:13:51.640] listener listener/metadata 0/0/0/14/14 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.659 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.660 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 576f6598-999f-46d9-809a-65b7475a1ec7 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.672 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:13:51 compute-0 haproxy-metadata-proxy-576f6598-999f-46d9-809a-65b7475a1ec7[221952]: 10.100.0.5:44134 [22/Jan/2026:17:13:51.659] listener listener/metadata 0/0/0/14/14 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:13:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:51.673 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0132666
Jan 22 17:13:51 compute-0 nova_compute[183075]: 2026-01-22 17:13:51.816 183079 INFO nova.scheduler.client.report [None req-40b23c51-3289-4e8c-af5f-ba2de199b4ef 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Deleted allocations for instance ed1e087d-92fe-41d0-bd0f-e907e799d3bc
Jan 22 17:13:51 compute-0 nova_compute[183075]: 2026-01-22 17:13:51.933 183079 DEBUG oslo_concurrency.lockutils [None req-40b23c51-3289-4e8c-af5f-ba2de199b4ef 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "ed1e087d-92fe-41d0-bd0f-e907e799d3bc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.862s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:52 compute-0 nova_compute[183075]: 2026-01-22 17:13:52.345 183079 DEBUG oslo_concurrency.lockutils [None req-3e905935-6a72-469b-ac66-f57306796f25 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "cfe610a3-4dee-46ca-a82a-1c8993fdd52c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:52 compute-0 nova_compute[183075]: 2026-01-22 17:13:52.345 183079 DEBUG oslo_concurrency.lockutils [None req-3e905935-6a72-469b-ac66-f57306796f25 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "cfe610a3-4dee-46ca-a82a-1c8993fdd52c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:52 compute-0 nova_compute[183075]: 2026-01-22 17:13:52.346 183079 DEBUG oslo_concurrency.lockutils [None req-3e905935-6a72-469b-ac66-f57306796f25 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "cfe610a3-4dee-46ca-a82a-1c8993fdd52c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:52 compute-0 nova_compute[183075]: 2026-01-22 17:13:52.346 183079 DEBUG oslo_concurrency.lockutils [None req-3e905935-6a72-469b-ac66-f57306796f25 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "cfe610a3-4dee-46ca-a82a-1c8993fdd52c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:52 compute-0 nova_compute[183075]: 2026-01-22 17:13:52.346 183079 DEBUG oslo_concurrency.lockutils [None req-3e905935-6a72-469b-ac66-f57306796f25 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "cfe610a3-4dee-46ca-a82a-1c8993fdd52c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:52 compute-0 nova_compute[183075]: 2026-01-22 17:13:52.347 183079 INFO nova.compute.manager [None req-3e905935-6a72-469b-ac66-f57306796f25 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Terminating instance
Jan 22 17:13:52 compute-0 nova_compute[183075]: 2026-01-22 17:13:52.348 183079 DEBUG nova.compute.manager [None req-3e905935-6a72-469b-ac66-f57306796f25 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:13:52 compute-0 podman[222215]: 2026-01-22 17:13:52.362024015 +0000 UTC m=+0.065219662 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:13:52 compute-0 kernel: tap87f56506-3e (unregistering): left promiscuous mode
Jan 22 17:13:52 compute-0 NetworkManager[55454]: <info>  [1769102032.3790] device (tap87f56506-3e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:13:52 compute-0 ovn_controller[95372]: 2026-01-22T17:13:52Z|00304|binding|INFO|Releasing lport 87f56506-3e49-4545-b8c0-8c58cbe49f15 from this chassis (sb_readonly=0)
Jan 22 17:13:52 compute-0 ovn_controller[95372]: 2026-01-22T17:13:52Z|00305|binding|INFO|Setting lport 87f56506-3e49-4545-b8c0-8c58cbe49f15 down in Southbound
Jan 22 17:13:52 compute-0 ovn_controller[95372]: 2026-01-22T17:13:52Z|00306|binding|INFO|Removing iface tap87f56506-3e ovn-installed in OVS
Jan 22 17:13:52 compute-0 nova_compute[183075]: 2026-01-22 17:13:52.388 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:52.397 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:8a:7a 10.10.210.20'], port_security=['fa:16:3e:c6:8a:7a 10.10.210.20'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.10.210.20/24', 'neutron:device_id': 'cfe610a3-4dee-46ca-a82a-1c8993fdd52c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f5295ae-a32e-4595-80e2-a52b2a6b9934', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '26cca885d303443380036cbbe9e70744', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9427a67d-1313-4d60-b73e-5a3f81f9a54d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.221'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=688e5202-8125-409e-8a42-679a9dc31876, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=87f56506-3e49-4545-b8c0-8c58cbe49f15) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:13:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:52.398 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 87f56506-3e49-4545-b8c0-8c58cbe49f15 in datapath 3f5295ae-a32e-4595-80e2-a52b2a6b9934 unbound from our chassis
Jan 22 17:13:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:52.400 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3f5295ae-a32e-4595-80e2-a52b2a6b9934, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:13:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:52.401 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8d111baf-ea8a-4542-a4ed-35d0ec9a57ae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:52.402 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3f5295ae-a32e-4595-80e2-a52b2a6b9934 namespace which is not needed anymore
Jan 22 17:13:52 compute-0 nova_compute[183075]: 2026-01-22 17:13:52.399 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:52 compute-0 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000018.scope: Deactivated successfully.
Jan 22 17:13:52 compute-0 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000018.scope: Consumed 15.406s CPU time.
Jan 22 17:13:52 compute-0 systemd-machined[154382]: Machine qemu-24-instance-00000018 terminated.
Jan 22 17:13:52 compute-0 neutron-haproxy-ovnmeta-3f5295ae-a32e-4595-80e2-a52b2a6b9934[221227]: [NOTICE]   (221231) : haproxy version is 2.8.14-c23fe91
Jan 22 17:13:52 compute-0 neutron-haproxy-ovnmeta-3f5295ae-a32e-4595-80e2-a52b2a6b9934[221227]: [NOTICE]   (221231) : path to executable is /usr/sbin/haproxy
Jan 22 17:13:52 compute-0 neutron-haproxy-ovnmeta-3f5295ae-a32e-4595-80e2-a52b2a6b9934[221227]: [WARNING]  (221231) : Exiting Master process...
Jan 22 17:13:52 compute-0 neutron-haproxy-ovnmeta-3f5295ae-a32e-4595-80e2-a52b2a6b9934[221227]: [WARNING]  (221231) : Exiting Master process...
Jan 22 17:13:52 compute-0 neutron-haproxy-ovnmeta-3f5295ae-a32e-4595-80e2-a52b2a6b9934[221227]: [ALERT]    (221231) : Current worker (221233) exited with code 143 (Terminated)
Jan 22 17:13:52 compute-0 neutron-haproxy-ovnmeta-3f5295ae-a32e-4595-80e2-a52b2a6b9934[221227]: [WARNING]  (221231) : All workers exited. Exiting... (0)
Jan 22 17:13:52 compute-0 systemd[1]: libpod-43800d9786d9454fc4ffb8be77c82fa1d377458ca9fa029d83ecb4fe8282be11.scope: Deactivated successfully.
Jan 22 17:13:52 compute-0 podman[222261]: 2026-01-22 17:13:52.546255188 +0000 UTC m=+0.058223889 container died 43800d9786d9454fc4ffb8be77c82fa1d377458ca9fa029d83ecb4fe8282be11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f5295ae-a32e-4595-80e2-a52b2a6b9934, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 17:13:52 compute-0 nova_compute[183075]: 2026-01-22 17:13:52.599 183079 INFO nova.virt.libvirt.driver [-] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Instance destroyed successfully.
Jan 22 17:13:52 compute-0 nova_compute[183075]: 2026-01-22 17:13:52.600 183079 DEBUG nova.objects.instance [None req-3e905935-6a72-469b-ac66-f57306796f25 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lazy-loading 'resources' on Instance uuid cfe610a3-4dee-46ca-a82a-1c8993fdd52c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:13:52 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-43800d9786d9454fc4ffb8be77c82fa1d377458ca9fa029d83ecb4fe8282be11-userdata-shm.mount: Deactivated successfully.
Jan 22 17:13:52 compute-0 systemd[1]: var-lib-containers-storage-overlay-0fe8517455866911bc916d8d9b4447d7603db5756319ac74770d1a46e7bf747d-merged.mount: Deactivated successfully.
Jan 22 17:13:52 compute-0 podman[222261]: 2026-01-22 17:13:52.623894862 +0000 UTC m=+0.135863563 container cleanup 43800d9786d9454fc4ffb8be77c82fa1d377458ca9fa029d83ecb4fe8282be11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f5295ae-a32e-4595-80e2-a52b2a6b9934, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:13:52 compute-0 nova_compute[183075]: 2026-01-22 17:13:52.628 183079 DEBUG nova.virt.libvirt.vif [None req-3e905935-6a72-469b-ac66-f57306796f25 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:12:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1791420059',display_name='tempest-server-test-1791420059',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1791420059',id=24,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFDbtc4hRCSv9gHmltB2G2DhAS2UXQKlSKw7wO8jzDWMIBVwAbtKYxxZ/33aOA3bpXNLm1KqC5K8xgOmfq2yl/h8qq6Gn/Z0l0pwwJ1+wDWZki4CnB3qyJDKSWstQshx5w==',key_name='tempest-keypair-test-612599565',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:12:54Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='26cca885d303443380036cbbe9e70744',ramdisk_id='',reservation_id='r-zfljqd21',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw
_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-NetworkConnectivityTest-1809867331',owner_user_name='tempest-NetworkConnectivityTest-1809867331-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:12:54Z,user_data=None,user_id='4a7542774b9c42618cf9d00113f9d23d',uuid=cfe610a3-4dee-46ca-a82a-1c8993fdd52c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "87f56506-3e49-4545-b8c0-8c58cbe49f15", "address": "fa:16:3e:c6:8a:7a", "network": {"id": "3f5295ae-a32e-4595-80e2-a52b2a6b9934", "bridge": "br-int", "label": "tempest-test-network--1267664325", "subnets": [{"cidr": "10.10.210.0/24", "dns": [], "gateway": {"address": "10.10.210.254", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.210.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87f56506-3e", "ovs_interfaceid": "87f56506-3e49-4545-b8c0-8c58cbe49f15", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:13:52 compute-0 nova_compute[183075]: 2026-01-22 17:13:52.628 183079 DEBUG nova.network.os_vif_util [None req-3e905935-6a72-469b-ac66-f57306796f25 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Converting VIF {"id": "87f56506-3e49-4545-b8c0-8c58cbe49f15", "address": "fa:16:3e:c6:8a:7a", "network": {"id": "3f5295ae-a32e-4595-80e2-a52b2a6b9934", "bridge": "br-int", "label": "tempest-test-network--1267664325", "subnets": [{"cidr": "10.10.210.0/24", "dns": [], "gateway": {"address": "10.10.210.254", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.210.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26cca885d303443380036cbbe9e70744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87f56506-3e", "ovs_interfaceid": "87f56506-3e49-4545-b8c0-8c58cbe49f15", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:13:52 compute-0 nova_compute[183075]: 2026-01-22 17:13:52.629 183079 DEBUG nova.network.os_vif_util [None req-3e905935-6a72-469b-ac66-f57306796f25 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c6:8a:7a,bridge_name='br-int',has_traffic_filtering=True,id=87f56506-3e49-4545-b8c0-8c58cbe49f15,network=Network(3f5295ae-a32e-4595-80e2-a52b2a6b9934),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap87f56506-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:13:52 compute-0 nova_compute[183075]: 2026-01-22 17:13:52.629 183079 DEBUG os_vif [None req-3e905935-6a72-469b-ac66-f57306796f25 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c6:8a:7a,bridge_name='br-int',has_traffic_filtering=True,id=87f56506-3e49-4545-b8c0-8c58cbe49f15,network=Network(3f5295ae-a32e-4595-80e2-a52b2a6b9934),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap87f56506-3e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:13:52 compute-0 nova_compute[183075]: 2026-01-22 17:13:52.632 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:52 compute-0 nova_compute[183075]: 2026-01-22 17:13:52.632 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap87f56506-3e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:13:52 compute-0 nova_compute[183075]: 2026-01-22 17:13:52.633 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:52 compute-0 nova_compute[183075]: 2026-01-22 17:13:52.635 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:52 compute-0 systemd[1]: libpod-conmon-43800d9786d9454fc4ffb8be77c82fa1d377458ca9fa029d83ecb4fe8282be11.scope: Deactivated successfully.
Jan 22 17:13:52 compute-0 nova_compute[183075]: 2026-01-22 17:13:52.637 183079 INFO os_vif [None req-3e905935-6a72-469b-ac66-f57306796f25 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c6:8a:7a,bridge_name='br-int',has_traffic_filtering=True,id=87f56506-3e49-4545-b8c0-8c58cbe49f15,network=Network(3f5295ae-a32e-4595-80e2-a52b2a6b9934),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap87f56506-3e')
Jan 22 17:13:52 compute-0 nova_compute[183075]: 2026-01-22 17:13:52.638 183079 INFO nova.virt.libvirt.driver [None req-3e905935-6a72-469b-ac66-f57306796f25 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Deleting instance files /var/lib/nova/instances/cfe610a3-4dee-46ca-a82a-1c8993fdd52c_del
Jan 22 17:13:52 compute-0 nova_compute[183075]: 2026-01-22 17:13:52.638 183079 INFO nova.virt.libvirt.driver [None req-3e905935-6a72-469b-ac66-f57306796f25 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Deletion of /var/lib/nova/instances/cfe610a3-4dee-46ca-a82a-1c8993fdd52c_del complete
Jan 22 17:13:52 compute-0 nova_compute[183075]: 2026-01-22 17:13:52.760 183079 INFO nova.compute.manager [None req-3e905935-6a72-469b-ac66-f57306796f25 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Took 0.41 seconds to destroy the instance on the hypervisor.
Jan 22 17:13:52 compute-0 nova_compute[183075]: 2026-01-22 17:13:52.761 183079 DEBUG oslo.service.loopingcall [None req-3e905935-6a72-469b-ac66-f57306796f25 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:13:52 compute-0 nova_compute[183075]: 2026-01-22 17:13:52.761 183079 DEBUG nova.compute.manager [-] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:13:52 compute-0 nova_compute[183075]: 2026-01-22 17:13:52.762 183079 DEBUG nova.network.neutron [-] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:13:52 compute-0 podman[222306]: 2026-01-22 17:13:52.772332212 +0000 UTC m=+0.125301368 container remove 43800d9786d9454fc4ffb8be77c82fa1d377458ca9fa029d83ecb4fe8282be11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f5295ae-a32e-4595-80e2-a52b2a6b9934, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:13:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:52.781 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d695aa77-cdaa-4774-b8e5-2f159cb75caa]: (4, ('Thu Jan 22 05:13:52 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3f5295ae-a32e-4595-80e2-a52b2a6b9934 (43800d9786d9454fc4ffb8be77c82fa1d377458ca9fa029d83ecb4fe8282be11)\n43800d9786d9454fc4ffb8be77c82fa1d377458ca9fa029d83ecb4fe8282be11\nThu Jan 22 05:13:52 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3f5295ae-a32e-4595-80e2-a52b2a6b9934 (43800d9786d9454fc4ffb8be77c82fa1d377458ca9fa029d83ecb4fe8282be11)\n43800d9786d9454fc4ffb8be77c82fa1d377458ca9fa029d83ecb4fe8282be11\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:52.783 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3e250a1b-68d1-4b9a-b607-04ec5621e20e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:52.784 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3f5295ae-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:13:52 compute-0 nova_compute[183075]: 2026-01-22 17:13:52.785 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:52 compute-0 kernel: tap3f5295ae-a0: left promiscuous mode
Jan 22 17:13:52 compute-0 nova_compute[183075]: 2026-01-22 17:13:52.805 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:52.810 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[70028564-ea9d-4499-951b-391bfc592219]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:52.828 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e5b98330-28a5-4353-8a96-42554ee253a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:52.829 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[752c9b3d-1d58-4091-baab-374cec38075b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:52.847 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[06daf7e1-7038-42c9-84b8-808e8fd70581]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 433682, 'reachable_time': 32295, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222320, 'error': None, 'target': 'ovnmeta-3f5295ae-a32e-4595-80e2-a52b2a6b9934', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:52.850 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3f5295ae-a32e-4595-80e2-a52b2a6b9934 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:13:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:13:52.850 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[9630e11b-621c-4058-b528-4081e24da40c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:13:52 compute-0 systemd[1]: run-netns-ovnmeta\x2d3f5295ae\x2da32e\x2d4595\x2d80e2\x2da52b2a6b9934.mount: Deactivated successfully.
Jan 22 17:13:53 compute-0 nova_compute[183075]: 2026-01-22 17:13:53.094 183079 DEBUG nova.compute.manager [req-ca8dcd94-0780-441d-a44f-e288e7230c26 req-e362d1fc-c281-41e1-920f-ebd31fcaecdd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Received event network-vif-unplugged-87f56506-3e49-4545-b8c0-8c58cbe49f15 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:13:53 compute-0 nova_compute[183075]: 2026-01-22 17:13:53.095 183079 DEBUG oslo_concurrency.lockutils [req-ca8dcd94-0780-441d-a44f-e288e7230c26 req-e362d1fc-c281-41e1-920f-ebd31fcaecdd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "cfe610a3-4dee-46ca-a82a-1c8993fdd52c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:53 compute-0 nova_compute[183075]: 2026-01-22 17:13:53.095 183079 DEBUG oslo_concurrency.lockutils [req-ca8dcd94-0780-441d-a44f-e288e7230c26 req-e362d1fc-c281-41e1-920f-ebd31fcaecdd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "cfe610a3-4dee-46ca-a82a-1c8993fdd52c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:53 compute-0 nova_compute[183075]: 2026-01-22 17:13:53.096 183079 DEBUG oslo_concurrency.lockutils [req-ca8dcd94-0780-441d-a44f-e288e7230c26 req-e362d1fc-c281-41e1-920f-ebd31fcaecdd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "cfe610a3-4dee-46ca-a82a-1c8993fdd52c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:53 compute-0 nova_compute[183075]: 2026-01-22 17:13:53.096 183079 DEBUG nova.compute.manager [req-ca8dcd94-0780-441d-a44f-e288e7230c26 req-e362d1fc-c281-41e1-920f-ebd31fcaecdd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] No waiting events found dispatching network-vif-unplugged-87f56506-3e49-4545-b8c0-8c58cbe49f15 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:13:53 compute-0 nova_compute[183075]: 2026-01-22 17:13:53.096 183079 DEBUG nova.compute.manager [req-ca8dcd94-0780-441d-a44f-e288e7230c26 req-e362d1fc-c281-41e1-920f-ebd31fcaecdd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Received event network-vif-unplugged-87f56506-3e49-4545-b8c0-8c58cbe49f15 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:13:55 compute-0 nova_compute[183075]: 2026-01-22 17:13:55.357 183079 DEBUG nova.compute.manager [req-52baab0a-8809-4e18-be64-5e1148165000 req-2d0e92b3-a7da-4b5f-b20d-7b453ef10aaf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Received event network-vif-plugged-87f56506-3e49-4545-b8c0-8c58cbe49f15 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:13:55 compute-0 nova_compute[183075]: 2026-01-22 17:13:55.358 183079 DEBUG oslo_concurrency.lockutils [req-52baab0a-8809-4e18-be64-5e1148165000 req-2d0e92b3-a7da-4b5f-b20d-7b453ef10aaf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "cfe610a3-4dee-46ca-a82a-1c8993fdd52c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:55 compute-0 nova_compute[183075]: 2026-01-22 17:13:55.358 183079 DEBUG oslo_concurrency.lockutils [req-52baab0a-8809-4e18-be64-5e1148165000 req-2d0e92b3-a7da-4b5f-b20d-7b453ef10aaf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "cfe610a3-4dee-46ca-a82a-1c8993fdd52c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:55 compute-0 nova_compute[183075]: 2026-01-22 17:13:55.358 183079 DEBUG oslo_concurrency.lockutils [req-52baab0a-8809-4e18-be64-5e1148165000 req-2d0e92b3-a7da-4b5f-b20d-7b453ef10aaf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "cfe610a3-4dee-46ca-a82a-1c8993fdd52c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:55 compute-0 nova_compute[183075]: 2026-01-22 17:13:55.359 183079 DEBUG nova.compute.manager [req-52baab0a-8809-4e18-be64-5e1148165000 req-2d0e92b3-a7da-4b5f-b20d-7b453ef10aaf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] No waiting events found dispatching network-vif-plugged-87f56506-3e49-4545-b8c0-8c58cbe49f15 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:13:55 compute-0 nova_compute[183075]: 2026-01-22 17:13:55.359 183079 WARNING nova.compute.manager [req-52baab0a-8809-4e18-be64-5e1148165000 req-2d0e92b3-a7da-4b5f-b20d-7b453ef10aaf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Received unexpected event network-vif-plugged-87f56506-3e49-4545-b8c0-8c58cbe49f15 for instance with vm_state active and task_state deleting.
Jan 22 17:13:55 compute-0 nova_compute[183075]: 2026-01-22 17:13:55.386 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.453 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '84b90c1e-91a0-437d-8ed2-956840c552ab', 'name': 'tempest-server-test-1407213754', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000001a', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e05c7aae349e4a1d859a387df45650a0', 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'hostId': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.454 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.473 12 DEBUG ceilometer.compute.pollsters [-] 84b90c1e-91a0-437d-8ed2-956840c552ab/disk.device.read.bytes volume: 30185984 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '91640d78-1c54-40cc-8c98-930effed66c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30185984, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': '84b90c1e-91a0-437d-8ed2-956840c552ab-vda', 'timestamp': '2026-01-22T17:13:55.454951', 'resource_metadata': {'display_name': 'tempest-server-test-1407213754', 'name': 'instance-0000001a', 'instance_id': '84b90c1e-91a0-437d-8ed2-956840c552ab', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bb76b12c-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4399.215402973, 'message_signature': '33556595a605f39abb904cdb2ab7e2aa8fcf0e2ef4a5cb0b1d7864b373cf11a9'}]}, 'timestamp': '2026-01-22 17:13:55.474848', '_unique_id': 'b76c00defd5b48289d97304e341fb98e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.476 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.477 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.481 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 84b90c1e-91a0-437d-8ed2-956840c552ab / tapb3bc8962-61 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.481 12 DEBUG ceilometer.compute.pollsters [-] 84b90c1e-91a0-437d-8ed2-956840c552ab/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4c43052-b0ad-4581-9ff0-10c4ae4b772d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': 'instance-0000001a-84b90c1e-91a0-437d-8ed2-956840c552ab-tapb3bc8962-61', 'timestamp': '2026-01-22T17:13:55.478060', 'resource_metadata': {'display_name': 'tempest-server-test-1407213754', 'name': 'tapb3bc8962-61', 'instance_id': '84b90c1e-91a0-437d-8ed2-956840c552ab', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:ac:67', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb3bc8962-61'}, 'message_id': 'bb77e0ec-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4399.238527396, 'message_signature': 'd3a64f70666a77ae5a184b0afb30100235bb0344b242a40821177c1e28120b8c'}]}, 'timestamp': '2026-01-22 17:13:55.482500', '_unique_id': '5aeb3c5aa3b5476eaed5b3e42bc0404a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.483 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.485 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.492 12 DEBUG ceilometer.compute.pollsters [-] 84b90c1e-91a0-437d-8ed2-956840c552ab/disk.device.allocation volume: 29892608 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0d9cb63-6c11-4d34-a2b2-603725ea9eb0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29892608, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': '84b90c1e-91a0-437d-8ed2-956840c552ab-vda', 'timestamp': '2026-01-22T17:13:55.485198', 'resource_metadata': {'display_name': 'tempest-server-test-1407213754', 'name': 'instance-0000001a', 'instance_id': '84b90c1e-91a0-437d-8ed2-956840c552ab', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bb799414-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4399.245657842, 'message_signature': '02a932b132210ca4c54b7b5f2ba7574d9b64e826bbf91ed9186ca4cc9cbedc08'}]}, 'timestamp': '2026-01-22 17:13:55.493613', '_unique_id': '915403a63156479e90a05db24a133290'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.494 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.496 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.496 12 DEBUG ceilometer.compute.pollsters [-] 84b90c1e-91a0-437d-8ed2-956840c552ab/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef2ff2ba-65a2-46a9-9820-2f682bb3cc52', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': 'instance-0000001a-84b90c1e-91a0-437d-8ed2-956840c552ab-tapb3bc8962-61', 'timestamp': '2026-01-22T17:13:55.496292', 'resource_metadata': {'display_name': 'tempest-server-test-1407213754', 'name': 'tapb3bc8962-61', 'instance_id': '84b90c1e-91a0-437d-8ed2-956840c552ab', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:ac:67', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb3bc8962-61'}, 'message_id': 'bb7a101a-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4399.238527396, 'message_signature': 'f4007cfaaa7b26953a5334e7c40d06966c3422b0385149eff8a48685b9b5b421'}]}, 'timestamp': '2026-01-22 17:13:55.496914', '_unique_id': '8eae2e33d635453d82fa208099f0e236'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.497 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.499 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.499 12 DEBUG ceilometer.compute.pollsters [-] 84b90c1e-91a0-437d-8ed2-956840c552ab/network.incoming.packets volume: 61 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98fd7188-2170-4efd-9d47-03bae226a6d0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 61, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': 'instance-0000001a-84b90c1e-91a0-437d-8ed2-956840c552ab-tapb3bc8962-61', 'timestamp': '2026-01-22T17:13:55.499435', 'resource_metadata': {'display_name': 'tempest-server-test-1407213754', 'name': 'tapb3bc8962-61', 'instance_id': '84b90c1e-91a0-437d-8ed2-956840c552ab', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:ac:67', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb3bc8962-61'}, 'message_id': 'bb7a8ca2-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4399.238527396, 'message_signature': '20a1aa7ac3a1abe15c6e8fb7bf982e7827d2fcb918e7689fd3101e89c7e71781'}]}, 'timestamp': '2026-01-22 17:13:55.499984', '_unique_id': '47c5f90b4ca84ff7b684156696d06a5e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.500 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.502 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.502 12 DEBUG ceilometer.compute.pollsters [-] 84b90c1e-91a0-437d-8ed2-956840c552ab/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5fb7d916-fb5d-4093-8f61-42b871a26aa5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': 'instance-0000001a-84b90c1e-91a0-437d-8ed2-956840c552ab-tapb3bc8962-61', 'timestamp': '2026-01-22T17:13:55.502386', 'resource_metadata': {'display_name': 'tempest-server-test-1407213754', 'name': 'tapb3bc8962-61', 'instance_id': '84b90c1e-91a0-437d-8ed2-956840c552ab', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:ac:67', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb3bc8962-61'}, 'message_id': 'bb7afffc-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4399.238527396, 'message_signature': 'c8a53ab1db1b40322c4649f743ce71b5726f08541e2be1138cb214ae9a69a6f8'}]}, 'timestamp': '2026-01-22 17:13:55.503012', '_unique_id': 'c978cb2a82cf4807bd5736fe3c2c43a9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.504 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.505 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.506 12 DEBUG ceilometer.compute.pollsters [-] 84b90c1e-91a0-437d-8ed2-956840c552ab/disk.device.read.requests volume: 1119 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5dd986dd-b42d-4756-8298-d5833fc81c92', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1119, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': '84b90c1e-91a0-437d-8ed2-956840c552ab-vda', 'timestamp': '2026-01-22T17:13:55.506029', 'resource_metadata': {'display_name': 'tempest-server-test-1407213754', 'name': 'instance-0000001a', 'instance_id': '84b90c1e-91a0-437d-8ed2-956840c552ab', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bb7b8c1a-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4399.215402973, 'message_signature': '4ff0d52918f7194c581208ed3356fd091e39319341b1218c77bbf8a6b989275b'}]}, 'timestamp': '2026-01-22 17:13:55.506503', '_unique_id': '45c58c94d41342c18f2e84634b15b946'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.507 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.508 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.508 12 DEBUG ceilometer.compute.pollsters [-] 84b90c1e-91a0-437d-8ed2-956840c552ab/network.outgoing.packets volume: 115 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d36f575-717f-4261-8606-572c19f451bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 115, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': 'instance-0000001a-84b90c1e-91a0-437d-8ed2-956840c552ab-tapb3bc8962-61', 'timestamp': '2026-01-22T17:13:55.508928', 'resource_metadata': {'display_name': 'tempest-server-test-1407213754', 'name': 'tapb3bc8962-61', 'instance_id': '84b90c1e-91a0-437d-8ed2-956840c552ab', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:ac:67', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb3bc8962-61'}, 'message_id': 'bb7bfd44-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4399.238527396, 'message_signature': '1352a1f8ae3ed4bbffeab231f714674cb7fe348e5d938750a0c766a6b6f6390c'}]}, 'timestamp': '2026-01-22 17:13:55.509411', '_unique_id': 'd028d5b75f5a4005b26eb19a0b04d795'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.510 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.511 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.511 12 DEBUG ceilometer.compute.pollsters [-] 84b90c1e-91a0-437d-8ed2-956840c552ab/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '865e255b-d76c-4e9f-a7ae-d7ec51767ce2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': 'instance-0000001a-84b90c1e-91a0-437d-8ed2-956840c552ab-tapb3bc8962-61', 'timestamp': '2026-01-22T17:13:55.511777', 'resource_metadata': {'display_name': 'tempest-server-test-1407213754', 'name': 'tapb3bc8962-61', 'instance_id': '84b90c1e-91a0-437d-8ed2-956840c552ab', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:ac:67', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb3bc8962-61'}, 'message_id': 'bb7c7102-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4399.238527396, 'message_signature': '76bcf4a507478bed712ccb5d53b4c78cd101628c37baf7ec60445ae218cfd84c'}]}, 'timestamp': '2026-01-22 17:13:55.512455', '_unique_id': 'b68c949e4d2245cfa8683e7435fb532d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.513 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.514 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.515 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.515 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-1407213754>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1407213754>]
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.515 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.515 12 DEBUG ceilometer.compute.pollsters [-] 84b90c1e-91a0-437d-8ed2-956840c552ab/disk.device.write.bytes volume: 72867840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc25d943-aeaa-48b9-8126-55757c2ea8a0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72867840, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': '84b90c1e-91a0-437d-8ed2-956840c552ab-vda', 'timestamp': '2026-01-22T17:13:55.515833', 'resource_metadata': {'display_name': 'tempest-server-test-1407213754', 'name': 'instance-0000001a', 'instance_id': '84b90c1e-91a0-437d-8ed2-956840c552ab', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bb7d0b08-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4399.215402973, 'message_signature': '495d9d7b357e9219d8a346e2df4cc71fd06a6b1a02391d8d21ee7b653ad742db'}]}, 'timestamp': '2026-01-22 17:13:55.516308', '_unique_id': '818e8647907f48d8a5fc7e3080235c77'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.517 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.518 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.518 12 DEBUG ceilometer.compute.pollsters [-] 84b90c1e-91a0-437d-8ed2-956840c552ab/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b18474d6-872f-44ef-ae09-f7c0c5315b3e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': 'instance-0000001a-84b90c1e-91a0-437d-8ed2-956840c552ab-tapb3bc8962-61', 'timestamp': '2026-01-22T17:13:55.518587', 'resource_metadata': {'display_name': 'tempest-server-test-1407213754', 'name': 'tapb3bc8962-61', 'instance_id': '84b90c1e-91a0-437d-8ed2-956840c552ab', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:ac:67', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb3bc8962-61'}, 'message_id': 'bb7d78f4-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4399.238527396, 'message_signature': '831f7ec76f1a3c2c5cd0dceb72f07354045b17ac8c07ce910460b197a633def8'}]}, 'timestamp': '2026-01-22 17:13:55.519173', '_unique_id': '73dfe299c68143608829b0d9f8b02547'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.520 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.521 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.521 12 DEBUG ceilometer.compute.pollsters [-] 84b90c1e-91a0-437d-8ed2-956840c552ab/network.incoming.bytes volume: 7326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d969b75-e364-4def-b3e4-51977cb6a3a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7326, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': 'instance-0000001a-84b90c1e-91a0-437d-8ed2-956840c552ab-tapb3bc8962-61', 'timestamp': '2026-01-22T17:13:55.521453', 'resource_metadata': {'display_name': 'tempest-server-test-1407213754', 'name': 'tapb3bc8962-61', 'instance_id': '84b90c1e-91a0-437d-8ed2-956840c552ab', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:ac:67', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb3bc8962-61'}, 'message_id': 'bb7de79e-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4399.238527396, 'message_signature': 'f3730d90855e7b1a82c6eb765371263c15243e89d90b028d8ea53f93f10b52a4'}]}, 'timestamp': '2026-01-22 17:13:55.521969', '_unique_id': '1c3e74ea2b3c48c887431d64fc0d8946'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.522 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.524 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.524 12 DEBUG ceilometer.compute.pollsters [-] 84b90c1e-91a0-437d-8ed2-956840c552ab/disk.device.usage volume: 29818880 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac502522-6b6d-4c3d-906c-3b7e86034560', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29818880, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': '84b90c1e-91a0-437d-8ed2-956840c552ab-vda', 'timestamp': '2026-01-22T17:13:55.524289', 'resource_metadata': {'display_name': 'tempest-server-test-1407213754', 'name': 'instance-0000001a', 'instance_id': '84b90c1e-91a0-437d-8ed2-956840c552ab', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bb7e54c2-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4399.245657842, 'message_signature': '4622d345e436f8ed6623bfdfb8a5d08ece903a2c68cfaff1e58302e302bd0316'}]}, 'timestamp': '2026-01-22 17:13:55.524782', '_unique_id': '7d5a93d2e7944274b1faa2b115c0133c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.525 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.526 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.527 12 DEBUG ceilometer.compute.pollsters [-] 84b90c1e-91a0-437d-8ed2-956840c552ab/disk.device.write.requests volume: 312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2160b264-0003-4681-80a8-09b0b0fab0c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 312, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': '84b90c1e-91a0-437d-8ed2-956840c552ab-vda', 'timestamp': '2026-01-22T17:13:55.527016', 'resource_metadata': {'display_name': 'tempest-server-test-1407213754', 'name': 'instance-0000001a', 'instance_id': '84b90c1e-91a0-437d-8ed2-956840c552ab', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bb7ec128-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4399.215402973, 'message_signature': '2525bd8d587bc3795319b5fac95819583125d87c0482ed14907bf552fc27091a'}]}, 'timestamp': '2026-01-22 17:13:55.527518', '_unique_id': 'a2d1b217c86446cd8dac1d6b9dd93636'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.528 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.529 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.529 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.530 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-1407213754>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1407213754>]
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.530 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.530 12 DEBUG ceilometer.compute.pollsters [-] 84b90c1e-91a0-437d-8ed2-956840c552ab/disk.device.write.latency volume: 2042524747 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24fc3615-6065-4dc0-9f11-efde1dc06c0a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2042524747, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': '84b90c1e-91a0-437d-8ed2-956840c552ab-vda', 'timestamp': '2026-01-22T17:13:55.530592', 'resource_metadata': {'display_name': 'tempest-server-test-1407213754', 'name': 'instance-0000001a', 'instance_id': '84b90c1e-91a0-437d-8ed2-956840c552ab', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bb7f4dc8-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4399.215402973, 'message_signature': 'e7aed6beebf064e9757d00b0e702f47b3c58c6df63ce11bcce880175951a4e57'}]}, 'timestamp': '2026-01-22 17:13:55.531188', '_unique_id': '6fd36d381cb645e09d3466cb15a22b99'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.532 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.533 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.533 12 DEBUG ceilometer.compute.pollsters [-] 84b90c1e-91a0-437d-8ed2-956840c552ab/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0462803f-788f-415b-8952-f07771a8ec38', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': 'instance-0000001a-84b90c1e-91a0-437d-8ed2-956840c552ab-tapb3bc8962-61', 'timestamp': '2026-01-22T17:13:55.533168', 'resource_metadata': {'display_name': 'tempest-server-test-1407213754', 'name': 'tapb3bc8962-61', 'instance_id': '84b90c1e-91a0-437d-8ed2-956840c552ab', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:ac:67', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb3bc8962-61'}, 'message_id': 'bb7facfa-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4399.238527396, 'message_signature': '27fc0cb1a07d912db005f2e1ba17eedd0e0513b5c7bac29ca6955babf2658796'}]}, 'timestamp': '2026-01-22 17:13:55.533493', '_unique_id': 'e3bad4afb7034334bef1530647e6fc2b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.534 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.554 12 DEBUG ceilometer.compute.pollsters [-] 84b90c1e-91a0-437d-8ed2-956840c552ab/memory.usage volume: 46.62890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '652978aa-fd0c-40f5-b8de-0d4c4a059ddf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 46.62890625, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': '84b90c1e-91a0-437d-8ed2-956840c552ab', 'timestamp': '2026-01-22T17:13:55.535034', 'resource_metadata': {'display_name': 'tempest-server-test-1407213754', 'name': 'instance-0000001a', 'instance_id': '84b90c1e-91a0-437d-8ed2-956840c552ab', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'bb82e3a2-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4399.314458196, 'message_signature': 'd7892ae71ab3c201e0397e0ab34d5268a4560e036a705ddcd6580b960407353f'}]}, 'timestamp': '2026-01-22 17:13:55.554673', '_unique_id': 'b2b3e0c73ddc49a1a3e339c5bc388792'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.555 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.556 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.556 12 DEBUG ceilometer.compute.pollsters [-] 84b90c1e-91a0-437d-8ed2-956840c552ab/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'be1ea9cf-110d-48ec-a4c4-c16a338f07a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': '84b90c1e-91a0-437d-8ed2-956840c552ab-vda', 'timestamp': '2026-01-22T17:13:55.556616', 'resource_metadata': {'display_name': 'tempest-server-test-1407213754', 'name': 'instance-0000001a', 'instance_id': '84b90c1e-91a0-437d-8ed2-956840c552ab', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bb834518-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4399.245657842, 'message_signature': 'd18958525496a4663285ecebd5664e9362df9fd48accb8ff91b3a1f9de512d5b'}]}, 'timestamp': '2026-01-22 17:13:55.557113', '_unique_id': 'a38c50a82ff54f8b8622cb0069c57c0e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.558 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.559 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.559 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.559 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1407213754>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1407213754>]
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.559 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.560 12 DEBUG ceilometer.compute.pollsters [-] 84b90c1e-91a0-437d-8ed2-956840c552ab/cpu volume: 10790000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ffcd471d-a3c9-45ac-accc-bf32f3863209', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10790000000, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': '84b90c1e-91a0-437d-8ed2-956840c552ab', 'timestamp': '2026-01-22T17:13:55.559986', 'resource_metadata': {'display_name': 'tempest-server-test-1407213754', 'name': 'instance-0000001a', 'instance_id': '84b90c1e-91a0-437d-8ed2-956840c552ab', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'bb83c740-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4399.314458196, 'message_signature': 'a4bdea70f0396e9806e6cabe26dfee0ebfa6a9cf6923e7c269203b8062907a68'}]}, 'timestamp': '2026-01-22 17:13:55.560440', '_unique_id': '9d52e82c5a4a4143ad3ea58508bf191c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.561 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.562 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.562 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.562 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1407213754>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1407213754>]
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.563 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.563 12 DEBUG ceilometer.compute.pollsters [-] 84b90c1e-91a0-437d-8ed2-956840c552ab/disk.device.read.latency volume: 366125581 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '54d40fb8-a139-4980-b194-7a1883dff52f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 366125581, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': '84b90c1e-91a0-437d-8ed2-956840c552ab-vda', 'timestamp': '2026-01-22T17:13:55.563215', 'resource_metadata': {'display_name': 'tempest-server-test-1407213754', 'name': 'instance-0000001a', 'instance_id': '84b90c1e-91a0-437d-8ed2-956840c552ab', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bb8446de-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4399.215402973, 'message_signature': '8c7d28d97d69a2b5a316adfea5b7870ff3dbe89bae714acb1b8aae9a2ad13d98'}]}, 'timestamp': '2026-01-22 17:13:55.563725', '_unique_id': '46a0be6588094fa0b18c45a25d2409d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.564 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.565 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.565 12 DEBUG ceilometer.compute.pollsters [-] 84b90c1e-91a0-437d-8ed2-956840c552ab/network.outgoing.bytes volume: 10121 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '21ff934b-054e-4052-9b7f-a91896ffd247', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10121, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_name': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_name': None, 'resource_id': 'instance-0000001a-84b90c1e-91a0-437d-8ed2-956840c552ab-tapb3bc8962-61', 'timestamp': '2026-01-22T17:13:55.565874', 'resource_metadata': {'display_name': 'tempest-server-test-1407213754', 'name': 'tapb3bc8962-61', 'instance_id': '84b90c1e-91a0-437d-8ed2-956840c552ab', 'instance_type': 'm1.nano', 'host': '620017861095791b58e81c14d8c50cf7d441a724af895ad24a612ebc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:ac:67', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb3bc8962-61'}, 'message_id': 'bb84ad2c-f7b5-11f0-9e69-fa163eaea1db', 'monotonic_time': 4399.238527396, 'message_signature': 'f34cffefbe4ccd2ec9867bebbdc1c41e412be9424d52f4edf33bb5cf74fa2442'}]}, 'timestamp': '2026-01-22 17:13:55.566338', '_unique_id': 'dde0e3d664764049bc497177d00f6f89'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:13:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:13:55.567 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:13:56 compute-0 nova_compute[183075]: 2026-01-22 17:13:56.192 183079 INFO nova.compute.manager [None req-6b212bd4-832c-489a-853f-adbb4147d5e5 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Get console output
Jan 22 17:13:56 compute-0 nova_compute[183075]: 2026-01-22 17:13:56.197 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:13:56 compute-0 nova_compute[183075]: 2026-01-22 17:13:56.785 183079 DEBUG nova.network.neutron [-] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:13:56 compute-0 nova_compute[183075]: 2026-01-22 17:13:56.906 183079 INFO nova.compute.manager [-] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Took 4.14 seconds to deallocate network for instance.
Jan 22 17:13:57 compute-0 nova_compute[183075]: 2026-01-22 17:13:57.082 183079 DEBUG oslo_concurrency.lockutils [None req-3e905935-6a72-469b-ac66-f57306796f25 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:13:57 compute-0 nova_compute[183075]: 2026-01-22 17:13:57.083 183079 DEBUG oslo_concurrency.lockutils [None req-3e905935-6a72-469b-ac66-f57306796f25 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:13:57 compute-0 nova_compute[183075]: 2026-01-22 17:13:57.152 183079 DEBUG nova.compute.provider_tree [None req-3e905935-6a72-469b-ac66-f57306796f25 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:13:57 compute-0 nova_compute[183075]: 2026-01-22 17:13:57.165 183079 DEBUG nova.scheduler.client.report [None req-3e905935-6a72-469b-ac66-f57306796f25 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:13:57 compute-0 nova_compute[183075]: 2026-01-22 17:13:57.201 183079 DEBUG oslo_concurrency.lockutils [None req-3e905935-6a72-469b-ac66-f57306796f25 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:57 compute-0 nova_compute[183075]: 2026-01-22 17:13:57.233 183079 INFO nova.scheduler.client.report [None req-3e905935-6a72-469b-ac66-f57306796f25 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Deleted allocations for instance cfe610a3-4dee-46ca-a82a-1c8993fdd52c
Jan 22 17:13:57 compute-0 nova_compute[183075]: 2026-01-22 17:13:57.319 183079 DEBUG oslo_concurrency.lockutils [None req-3e905935-6a72-469b-ac66-f57306796f25 4a7542774b9c42618cf9d00113f9d23d 26cca885d303443380036cbbe9e70744 - - default default] Lock "cfe610a3-4dee-46ca-a82a-1c8993fdd52c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.974s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:13:57 compute-0 nova_compute[183075]: 2026-01-22 17:13:57.635 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:57 compute-0 nova_compute[183075]: 2026-01-22 17:13:57.642 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:13:58 compute-0 nova_compute[183075]: 2026-01-22 17:13:58.133 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769102023.1322608, a39a5d00-6f96-4405-aff0-1449aee94079 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:13:58 compute-0 nova_compute[183075]: 2026-01-22 17:13:58.134 183079 INFO nova.compute.manager [-] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] VM Stopped (Lifecycle Event)
Jan 22 17:13:58 compute-0 nova_compute[183075]: 2026-01-22 17:13:58.167 183079 DEBUG nova.compute.manager [None req-ebdb6f7e-4998-4ff5-ba63-c7046129637b - - - - - -] [instance: a39a5d00-6f96-4405-aff0-1449aee94079] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:14:00 compute-0 podman[222323]: 2026-01-22 17:14:00.353450111 +0000 UTC m=+0.055733594 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:14:00 compute-0 podman[222324]: 2026-01-22 17:14:00.357134457 +0000 UTC m=+0.057472809 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, container_name=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.6, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41)
Jan 22 17:14:00 compute-0 podman[222322]: 2026-01-22 17:14:00.380408994 +0000 UTC m=+0.086377853 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 17:14:00 compute-0 nova_compute[183075]: 2026-01-22 17:14:00.388 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:02 compute-0 nova_compute[183075]: 2026-01-22 17:14:02.640 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:04 compute-0 nova_compute[183075]: 2026-01-22 17:14:04.328 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769102029.327137, ed1e087d-92fe-41d0-bd0f-e907e799d3bc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:14:04 compute-0 nova_compute[183075]: 2026-01-22 17:14:04.329 183079 INFO nova.compute.manager [-] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] VM Stopped (Lifecycle Event)
Jan 22 17:14:04 compute-0 nova_compute[183075]: 2026-01-22 17:14:04.349 183079 DEBUG nova.compute.manager [None req-f6d19270-1d3a-4b73-87ce-32ba7614129a - - - - - -] [instance: ed1e087d-92fe-41d0-bd0f-e907e799d3bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:14:04 compute-0 nova_compute[183075]: 2026-01-22 17:14:04.621 183079 DEBUG oslo_concurrency.lockutils [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "26367132-bc45-4c8a-bd7e-5c0883453bbd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:14:04 compute-0 nova_compute[183075]: 2026-01-22 17:14:04.622 183079 DEBUG oslo_concurrency.lockutils [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "26367132-bc45-4c8a-bd7e-5c0883453bbd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:14:04 compute-0 nova_compute[183075]: 2026-01-22 17:14:04.639 183079 DEBUG nova.compute.manager [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:14:04 compute-0 nova_compute[183075]: 2026-01-22 17:14:04.710 183079 DEBUG oslo_concurrency.lockutils [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:14:04 compute-0 nova_compute[183075]: 2026-01-22 17:14:04.711 183079 DEBUG oslo_concurrency.lockutils [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:14:04 compute-0 nova_compute[183075]: 2026-01-22 17:14:04.717 183079 DEBUG nova.virt.hardware [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:14:04 compute-0 nova_compute[183075]: 2026-01-22 17:14:04.718 183079 INFO nova.compute.claims [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:14:05 compute-0 nova_compute[183075]: 2026-01-22 17:14:05.390 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:05 compute-0 nova_compute[183075]: 2026-01-22 17:14:05.510 183079 DEBUG nova.compute.provider_tree [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:14:06 compute-0 podman[222386]: 2026-01-22 17:14:06.371345477 +0000 UTC m=+0.078444556 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:14:06 compute-0 nova_compute[183075]: 2026-01-22 17:14:06.473 183079 DEBUG nova.scheduler.client.report [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:14:06 compute-0 nova_compute[183075]: 2026-01-22 17:14:06.507 183079 DEBUG oslo_concurrency.lockutils [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.796s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:14:06 compute-0 nova_compute[183075]: 2026-01-22 17:14:06.507 183079 DEBUG nova.compute.manager [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:14:06 compute-0 nova_compute[183075]: 2026-01-22 17:14:06.734 183079 DEBUG nova.compute.manager [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:14:06 compute-0 nova_compute[183075]: 2026-01-22 17:14:06.735 183079 DEBUG nova.network.neutron [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:14:06 compute-0 nova_compute[183075]: 2026-01-22 17:14:06.769 183079 INFO nova.virt.libvirt.driver [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:14:06 compute-0 nova_compute[183075]: 2026-01-22 17:14:06.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:14:06 compute-0 nova_compute[183075]: 2026-01-22 17:14:06.792 183079 DEBUG oslo_concurrency.lockutils [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "613fb094-d482-4b33-92c8-184b010a0169" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:14:06 compute-0 nova_compute[183075]: 2026-01-22 17:14:06.793 183079 DEBUG oslo_concurrency.lockutils [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "613fb094-d482-4b33-92c8-184b010a0169" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:14:06 compute-0 nova_compute[183075]: 2026-01-22 17:14:06.793 183079 DEBUG nova.compute.manager [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:14:06 compute-0 nova_compute[183075]: 2026-01-22 17:14:06.820 183079 DEBUG nova.compute.manager [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:14:06 compute-0 nova_compute[183075]: 2026-01-22 17:14:06.910 183079 DEBUG oslo_concurrency.lockutils [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:14:06 compute-0 nova_compute[183075]: 2026-01-22 17:14:06.910 183079 DEBUG oslo_concurrency.lockutils [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:14:06 compute-0 nova_compute[183075]: 2026-01-22 17:14:06.916 183079 DEBUG nova.virt.hardware [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:14:06 compute-0 nova_compute[183075]: 2026-01-22 17:14:06.917 183079 INFO nova.compute.claims [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:14:06 compute-0 nova_compute[183075]: 2026-01-22 17:14:06.933 183079 DEBUG nova.compute.manager [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:14:06 compute-0 nova_compute[183075]: 2026-01-22 17:14:06.935 183079 DEBUG nova.virt.libvirt.driver [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:14:06 compute-0 nova_compute[183075]: 2026-01-22 17:14:06.936 183079 INFO nova.virt.libvirt.driver [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Creating image(s)
Jan 22 17:14:06 compute-0 nova_compute[183075]: 2026-01-22 17:14:06.936 183079 DEBUG oslo_concurrency.lockutils [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "/var/lib/nova/instances/26367132-bc45-4c8a-bd7e-5c0883453bbd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:14:06 compute-0 nova_compute[183075]: 2026-01-22 17:14:06.937 183079 DEBUG oslo_concurrency.lockutils [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "/var/lib/nova/instances/26367132-bc45-4c8a-bd7e-5c0883453bbd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:14:06 compute-0 nova_compute[183075]: 2026-01-22 17:14:06.938 183079 DEBUG oslo_concurrency.lockutils [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "/var/lib/nova/instances/26367132-bc45-4c8a-bd7e-5c0883453bbd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:14:06 compute-0 nova_compute[183075]: 2026-01-22 17:14:06.960 183079 DEBUG oslo_concurrency.processutils [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:14:07 compute-0 nova_compute[183075]: 2026-01-22 17:14:07.059 183079 DEBUG oslo_concurrency.processutils [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:14:07 compute-0 nova_compute[183075]: 2026-01-22 17:14:07.060 183079 DEBUG oslo_concurrency.lockutils [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:14:07 compute-0 nova_compute[183075]: 2026-01-22 17:14:07.060 183079 DEBUG oslo_concurrency.lockutils [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:14:07 compute-0 nova_compute[183075]: 2026-01-22 17:14:07.070 183079 DEBUG oslo_concurrency.processutils [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:14:07 compute-0 nova_compute[183075]: 2026-01-22 17:14:07.123 183079 DEBUG oslo_concurrency.processutils [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:14:07 compute-0 nova_compute[183075]: 2026-01-22 17:14:07.125 183079 DEBUG oslo_concurrency.processutils [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/26367132-bc45-4c8a-bd7e-5c0883453bbd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:14:07 compute-0 nova_compute[183075]: 2026-01-22 17:14:07.169 183079 DEBUG nova.compute.provider_tree [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:14:07 compute-0 nova_compute[183075]: 2026-01-22 17:14:07.249 183079 DEBUG oslo_concurrency.processutils [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/26367132-bc45-4c8a-bd7e-5c0883453bbd/disk 1073741824" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:14:07 compute-0 nova_compute[183075]: 2026-01-22 17:14:07.250 183079 DEBUG oslo_concurrency.lockutils [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:14:07 compute-0 nova_compute[183075]: 2026-01-22 17:14:07.250 183079 DEBUG oslo_concurrency.processutils [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:14:07 compute-0 nova_compute[183075]: 2026-01-22 17:14:07.270 183079 DEBUG nova.scheduler.client.report [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:14:07 compute-0 nova_compute[183075]: 2026-01-22 17:14:07.307 183079 DEBUG oslo_concurrency.processutils [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:14:07 compute-0 nova_compute[183075]: 2026-01-22 17:14:07.308 183079 DEBUG nova.virt.disk.api [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Checking if we can resize image /var/lib/nova/instances/26367132-bc45-4c8a-bd7e-5c0883453bbd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:14:07 compute-0 nova_compute[183075]: 2026-01-22 17:14:07.309 183079 DEBUG oslo_concurrency.processutils [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26367132-bc45-4c8a-bd7e-5c0883453bbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:14:07 compute-0 nova_compute[183075]: 2026-01-22 17:14:07.364 183079 DEBUG oslo_concurrency.processutils [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26367132-bc45-4c8a-bd7e-5c0883453bbd/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:14:07 compute-0 nova_compute[183075]: 2026-01-22 17:14:07.365 183079 DEBUG nova.virt.disk.api [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Cannot resize image /var/lib/nova/instances/26367132-bc45-4c8a-bd7e-5c0883453bbd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:14:07 compute-0 nova_compute[183075]: 2026-01-22 17:14:07.365 183079 DEBUG nova.objects.instance [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lazy-loading 'migration_context' on Instance uuid 26367132-bc45-4c8a-bd7e-5c0883453bbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:14:07 compute-0 nova_compute[183075]: 2026-01-22 17:14:07.396 183079 DEBUG nova.virt.libvirt.driver [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:14:07 compute-0 nova_compute[183075]: 2026-01-22 17:14:07.396 183079 DEBUG nova.virt.libvirt.driver [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Ensure instance console log exists: /var/lib/nova/instances/26367132-bc45-4c8a-bd7e-5c0883453bbd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:14:07 compute-0 nova_compute[183075]: 2026-01-22 17:14:07.397 183079 DEBUG oslo_concurrency.lockutils [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:14:07 compute-0 nova_compute[183075]: 2026-01-22 17:14:07.397 183079 DEBUG oslo_concurrency.lockutils [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:14:07 compute-0 nova_compute[183075]: 2026-01-22 17:14:07.397 183079 DEBUG oslo_concurrency.lockutils [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:14:07 compute-0 nova_compute[183075]: 2026-01-22 17:14:07.399 183079 DEBUG oslo_concurrency.lockutils [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.488s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:14:07 compute-0 nova_compute[183075]: 2026-01-22 17:14:07.399 183079 DEBUG nova.compute.manager [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:14:07 compute-0 nova_compute[183075]: 2026-01-22 17:14:07.417 183079 DEBUG nova.policy [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1e61127d65144bcbaa0d43fe3eb484c0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bfc6667804934c92b71ce7638089e9e3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:14:07 compute-0 nova_compute[183075]: 2026-01-22 17:14:07.482 183079 DEBUG nova.compute.manager [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:14:07 compute-0 nova_compute[183075]: 2026-01-22 17:14:07.483 183079 DEBUG nova.network.neutron [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:14:07 compute-0 nova_compute[183075]: 2026-01-22 17:14:07.560 183079 INFO nova.virt.libvirt.driver [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:14:07 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:07.565 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:14:07 compute-0 nova_compute[183075]: 2026-01-22 17:14:07.566 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:07 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:07.568 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:14:07 compute-0 nova_compute[183075]: 2026-01-22 17:14:07.597 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769102032.5962465, cfe610a3-4dee-46ca-a82a-1c8993fdd52c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:14:07 compute-0 nova_compute[183075]: 2026-01-22 17:14:07.597 183079 INFO nova.compute.manager [-] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] VM Stopped (Lifecycle Event)
Jan 22 17:14:07 compute-0 nova_compute[183075]: 2026-01-22 17:14:07.643 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:07 compute-0 nova_compute[183075]: 2026-01-22 17:14:07.796 183079 DEBUG nova.compute.manager [None req-bda6bc32-a21f-4cee-a18d-aed02952c4ff - - - - - -] [instance: cfe610a3-4dee-46ca-a82a-1c8993fdd52c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:14:07 compute-0 nova_compute[183075]: 2026-01-22 17:14:07.896 183079 DEBUG nova.compute.manager [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:14:08 compute-0 nova_compute[183075]: 2026-01-22 17:14:08.045 183079 DEBUG nova.policy [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cd47d63cff2548a88e21e5c2e6a5c161', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e05c7aae349e4a1d859a387df45650a0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:14:08 compute-0 nova_compute[183075]: 2026-01-22 17:14:08.431 183079 DEBUG nova.compute.manager [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:14:08 compute-0 nova_compute[183075]: 2026-01-22 17:14:08.433 183079 DEBUG nova.virt.libvirt.driver [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:14:08 compute-0 nova_compute[183075]: 2026-01-22 17:14:08.433 183079 INFO nova.virt.libvirt.driver [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Creating image(s)
Jan 22 17:14:08 compute-0 nova_compute[183075]: 2026-01-22 17:14:08.434 183079 DEBUG oslo_concurrency.lockutils [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "/var/lib/nova/instances/613fb094-d482-4b33-92c8-184b010a0169/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:14:08 compute-0 nova_compute[183075]: 2026-01-22 17:14:08.435 183079 DEBUG oslo_concurrency.lockutils [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "/var/lib/nova/instances/613fb094-d482-4b33-92c8-184b010a0169/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:14:08 compute-0 nova_compute[183075]: 2026-01-22 17:14:08.436 183079 DEBUG oslo_concurrency.lockutils [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "/var/lib/nova/instances/613fb094-d482-4b33-92c8-184b010a0169/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:14:08 compute-0 nova_compute[183075]: 2026-01-22 17:14:08.461 183079 DEBUG oslo_concurrency.processutils [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:14:08 compute-0 nova_compute[183075]: 2026-01-22 17:14:08.537 183079 DEBUG oslo_concurrency.processutils [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:14:08 compute-0 nova_compute[183075]: 2026-01-22 17:14:08.539 183079 DEBUG oslo_concurrency.lockutils [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:14:08 compute-0 nova_compute[183075]: 2026-01-22 17:14:08.540 183079 DEBUG oslo_concurrency.lockutils [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:14:08 compute-0 nova_compute[183075]: 2026-01-22 17:14:08.575 183079 DEBUG oslo_concurrency.processutils [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:14:08 compute-0 nova_compute[183075]: 2026-01-22 17:14:08.661 183079 DEBUG oslo_concurrency.processutils [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:14:08 compute-0 nova_compute[183075]: 2026-01-22 17:14:08.662 183079 DEBUG oslo_concurrency.processutils [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/613fb094-d482-4b33-92c8-184b010a0169/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:14:08 compute-0 nova_compute[183075]: 2026-01-22 17:14:08.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:14:08 compute-0 nova_compute[183075]: 2026-01-22 17:14:08.842 183079 DEBUG oslo_concurrency.processutils [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/613fb094-d482-4b33-92c8-184b010a0169/disk 1073741824" returned: 0 in 0.180s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:14:08 compute-0 nova_compute[183075]: 2026-01-22 17:14:08.843 183079 DEBUG oslo_concurrency.lockutils [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.303s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:14:08 compute-0 nova_compute[183075]: 2026-01-22 17:14:08.844 183079 DEBUG oslo_concurrency.processutils [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:14:08 compute-0 nova_compute[183075]: 2026-01-22 17:14:08.903 183079 DEBUG nova.network.neutron [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Successfully created port: 76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:14:08 compute-0 nova_compute[183075]: 2026-01-22 17:14:08.936 183079 DEBUG oslo_concurrency.processutils [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:14:08 compute-0 nova_compute[183075]: 2026-01-22 17:14:08.938 183079 DEBUG nova.virt.disk.api [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Checking if we can resize image /var/lib/nova/instances/613fb094-d482-4b33-92c8-184b010a0169/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:14:08 compute-0 nova_compute[183075]: 2026-01-22 17:14:08.938 183079 DEBUG oslo_concurrency.processutils [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/613fb094-d482-4b33-92c8-184b010a0169/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:14:09 compute-0 nova_compute[183075]: 2026-01-22 17:14:09.010 183079 DEBUG oslo_concurrency.processutils [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/613fb094-d482-4b33-92c8-184b010a0169/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:14:09 compute-0 nova_compute[183075]: 2026-01-22 17:14:09.012 183079 DEBUG nova.virt.disk.api [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Cannot resize image /var/lib/nova/instances/613fb094-d482-4b33-92c8-184b010a0169/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:14:09 compute-0 nova_compute[183075]: 2026-01-22 17:14:09.012 183079 DEBUG nova.objects.instance [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lazy-loading 'migration_context' on Instance uuid 613fb094-d482-4b33-92c8-184b010a0169 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:14:09 compute-0 nova_compute[183075]: 2026-01-22 17:14:09.081 183079 DEBUG nova.virt.libvirt.driver [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:14:09 compute-0 nova_compute[183075]: 2026-01-22 17:14:09.082 183079 DEBUG nova.virt.libvirt.driver [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Ensure instance console log exists: /var/lib/nova/instances/613fb094-d482-4b33-92c8-184b010a0169/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:14:09 compute-0 nova_compute[183075]: 2026-01-22 17:14:09.083 183079 DEBUG oslo_concurrency.lockutils [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:14:09 compute-0 nova_compute[183075]: 2026-01-22 17:14:09.084 183079 DEBUG oslo_concurrency.lockutils [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:14:09 compute-0 nova_compute[183075]: 2026-01-22 17:14:09.084 183079 DEBUG oslo_concurrency.lockutils [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:14:10 compute-0 nova_compute[183075]: 2026-01-22 17:14:10.208 183079 DEBUG nova.network.neutron [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Successfully updated port: 1e2e1eb8-eea7-4040-8e90-9c98522a01da _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:14:10 compute-0 nova_compute[183075]: 2026-01-22 17:14:10.392 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:10 compute-0 nova_compute[183075]: 2026-01-22 17:14:10.444 183079 DEBUG oslo_concurrency.lockutils [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "refresh_cache-613fb094-d482-4b33-92c8-184b010a0169" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:14:10 compute-0 nova_compute[183075]: 2026-01-22 17:14:10.444 183079 DEBUG oslo_concurrency.lockutils [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquired lock "refresh_cache-613fb094-d482-4b33-92c8-184b010a0169" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:14:10 compute-0 nova_compute[183075]: 2026-01-22 17:14:10.445 183079 DEBUG nova.network.neutron [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:14:10 compute-0 nova_compute[183075]: 2026-01-22 17:14:10.701 183079 DEBUG nova.compute.manager [req-a806a219-d5c0-4eb5-a8a2-58200676653c req-e795a686-20e8-4dcd-837a-e4dcc914b422 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Received event network-changed-1e2e1eb8-eea7-4040-8e90-9c98522a01da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:14:10 compute-0 nova_compute[183075]: 2026-01-22 17:14:10.702 183079 DEBUG nova.compute.manager [req-a806a219-d5c0-4eb5-a8a2-58200676653c req-e795a686-20e8-4dcd-837a-e4dcc914b422 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Refreshing instance network info cache due to event network-changed-1e2e1eb8-eea7-4040-8e90-9c98522a01da. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:14:10 compute-0 nova_compute[183075]: 2026-01-22 17:14:10.703 183079 DEBUG oslo_concurrency.lockutils [req-a806a219-d5c0-4eb5-a8a2-58200676653c req-e795a686-20e8-4dcd-837a-e4dcc914b422 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-613fb094-d482-4b33-92c8-184b010a0169" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:14:10 compute-0 nova_compute[183075]: 2026-01-22 17:14:10.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:14:10 compute-0 nova_compute[183075]: 2026-01-22 17:14:10.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:14:10 compute-0 nova_compute[183075]: 2026-01-22 17:14:10.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 17:14:11 compute-0 nova_compute[183075]: 2026-01-22 17:14:11.115 183079 DEBUG nova.network.neutron [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:14:11 compute-0 nova_compute[183075]: 2026-01-22 17:14:11.236 183079 DEBUG nova.network.neutron [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Successfully updated port: 76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:14:11 compute-0 nova_compute[183075]: 2026-01-22 17:14:11.355 183079 DEBUG oslo_concurrency.lockutils [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "refresh_cache-26367132-bc45-4c8a-bd7e-5c0883453bbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:14:11 compute-0 nova_compute[183075]: 2026-01-22 17:14:11.356 183079 DEBUG oslo_concurrency.lockutils [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquired lock "refresh_cache-26367132-bc45-4c8a-bd7e-5c0883453bbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:14:11 compute-0 nova_compute[183075]: 2026-01-22 17:14:11.356 183079 DEBUG nova.network.neutron [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:14:11 compute-0 nova_compute[183075]: 2026-01-22 17:14:11.361 183079 DEBUG nova.compute.manager [req-d455aa81-5157-4e3a-bc5a-58649f2e65a0 req-4e0e3c0a-eb95-45e3-b327-5f0553a94ab3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Received event network-changed-76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:14:11 compute-0 nova_compute[183075]: 2026-01-22 17:14:11.362 183079 DEBUG nova.compute.manager [req-d455aa81-5157-4e3a-bc5a-58649f2e65a0 req-4e0e3c0a-eb95-45e3-b327-5f0553a94ab3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Refreshing instance network info cache due to event network-changed-76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:14:11 compute-0 nova_compute[183075]: 2026-01-22 17:14:11.362 183079 DEBUG oslo_concurrency.lockutils [req-d455aa81-5157-4e3a-bc5a-58649f2e65a0 req-4e0e3c0a-eb95-45e3-b327-5f0553a94ab3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-26367132-bc45-4c8a-bd7e-5c0883453bbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:14:11 compute-0 nova_compute[183075]: 2026-01-22 17:14:11.544 183079 DEBUG nova.network.neutron [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.416 183079 DEBUG nova.network.neutron [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Updating instance_info_cache with network_info: [{"id": "1e2e1eb8-eea7-4040-8e90-9c98522a01da", "address": "fa:16:3e:e1:3a:d3", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e2e1eb8-ee", "ovs_interfaceid": "1e2e1eb8-eea7-4040-8e90-9c98522a01da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.496 183079 DEBUG oslo_concurrency.lockutils [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Releasing lock "refresh_cache-613fb094-d482-4b33-92c8-184b010a0169" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.497 183079 DEBUG nova.compute.manager [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Instance network_info: |[{"id": "1e2e1eb8-eea7-4040-8e90-9c98522a01da", "address": "fa:16:3e:e1:3a:d3", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e2e1eb8-ee", "ovs_interfaceid": "1e2e1eb8-eea7-4040-8e90-9c98522a01da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.497 183079 DEBUG oslo_concurrency.lockutils [req-a806a219-d5c0-4eb5-a8a2-58200676653c req-e795a686-20e8-4dcd-837a-e4dcc914b422 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-613fb094-d482-4b33-92c8-184b010a0169" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.497 183079 DEBUG nova.network.neutron [req-a806a219-d5c0-4eb5-a8a2-58200676653c req-e795a686-20e8-4dcd-837a-e4dcc914b422 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Refreshing network info cache for port 1e2e1eb8-eea7-4040-8e90-9c98522a01da _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.500 183079 DEBUG nova.virt.libvirt.driver [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Start _get_guest_xml network_info=[{"id": "1e2e1eb8-eea7-4040-8e90-9c98522a01da", "address": "fa:16:3e:e1:3a:d3", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e2e1eb8-ee", "ovs_interfaceid": "1e2e1eb8-eea7-4040-8e90-9c98522a01da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.503 183079 WARNING nova.virt.libvirt.driver [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.508 183079 DEBUG nova.virt.libvirt.host [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.508 183079 DEBUG nova.virt.libvirt.host [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.511 183079 DEBUG nova.virt.libvirt.host [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.511 183079 DEBUG nova.virt.libvirt.host [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.512 183079 DEBUG nova.virt.libvirt.driver [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.512 183079 DEBUG nova.virt.hardware [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.512 183079 DEBUG nova.virt.hardware [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.512 183079 DEBUG nova.virt.hardware [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.513 183079 DEBUG nova.virt.hardware [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.513 183079 DEBUG nova.virt.hardware [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.513 183079 DEBUG nova.virt.hardware [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.513 183079 DEBUG nova.virt.hardware [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.514 183079 DEBUG nova.virt.hardware [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.514 183079 DEBUG nova.virt.hardware [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.514 183079 DEBUG nova.virt.hardware [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.514 183079 DEBUG nova.virt.hardware [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.518 183079 DEBUG nova.virt.libvirt.vif [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:14:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-551951220',display_name='tempest-server-test-551951220',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-551951220',id=28,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFroJSJbEvCYe2sCwXkwL5KPVfwqqRPNqrWU5nnhv8U4G8wYWbEqk5AtRWXltmX7CAIPBST+sTwCKfymhALKd54XNN92D6KKnKgA6jmOihENah4FGfU80fx1UEE0osLpiw==',key_name='tempest-keypair-test-618317104',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e05c7aae349e4a1d859a387df45650a0',ramdisk_id='',reservation_id='r-d35qou6r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSeparateNetwork-931877966',owner_user_name='tempest-FloatingIpSeparateNetwork-931877966-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:14:08Z,user_data=None,user_id='cd47d63cff2548a88e21e5c2e6a5c161',uuid=613fb094-d482-4b33-92c8-184b010a0169,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1e2e1eb8-eea7-4040-8e90-9c98522a01da", "address": "fa:16:3e:e1:3a:d3", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e2e1eb8-ee", "ovs_interfaceid": "1e2e1eb8-eea7-4040-8e90-9c98522a01da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.518 183079 DEBUG nova.network.os_vif_util [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converting VIF {"id": "1e2e1eb8-eea7-4040-8e90-9c98522a01da", "address": "fa:16:3e:e1:3a:d3", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e2e1eb8-ee", "ovs_interfaceid": "1e2e1eb8-eea7-4040-8e90-9c98522a01da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.519 183079 DEBUG nova.network.os_vif_util [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e1:3a:d3,bridge_name='br-int',has_traffic_filtering=True,id=1e2e1eb8-eea7-4040-8e90-9c98522a01da,network=Network(0a16be1a-262e-47f7-8518-5f24ee15796e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1e2e1eb8-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.519 183079 DEBUG nova.objects.instance [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 613fb094-d482-4b33-92c8-184b010a0169 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.537 183079 DEBUG nova.virt.libvirt.driver [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:14:12 compute-0 nova_compute[183075]:   <uuid>613fb094-d482-4b33-92c8-184b010a0169</uuid>
Jan 22 17:14:12 compute-0 nova_compute[183075]:   <name>instance-0000001c</name>
Jan 22 17:14:12 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:14:12 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:14:12 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-551951220</nova:name>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:14:12</nova:creationTime>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:14:12 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:14:12 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:14:12 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:14:12 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:14:12 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:14:12 compute-0 nova_compute[183075]:         <nova:user uuid="cd47d63cff2548a88e21e5c2e6a5c161">tempest-FloatingIpSeparateNetwork-931877966-project-member</nova:user>
Jan 22 17:14:12 compute-0 nova_compute[183075]:         <nova:project uuid="e05c7aae349e4a1d859a387df45650a0">tempest-FloatingIpSeparateNetwork-931877966</nova:project>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:14:12 compute-0 nova_compute[183075]:         <nova:port uuid="1e2e1eb8-eea7-4040-8e90-9c98522a01da">
Jan 22 17:14:12 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.24" ipVersion="4"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:14:12 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:14:12 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <system>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <entry name="serial">613fb094-d482-4b33-92c8-184b010a0169</entry>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <entry name="uuid">613fb094-d482-4b33-92c8-184b010a0169</entry>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     </system>
Jan 22 17:14:12 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:14:12 compute-0 nova_compute[183075]:   <os>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:   </os>
Jan 22 17:14:12 compute-0 nova_compute[183075]:   <features>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:   </features>
Jan 22 17:14:12 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:14:12 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:14:12 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/613fb094-d482-4b33-92c8-184b010a0169/disk"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:e1:3a:d3"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <target dev="tap1e2e1eb8-ee"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/613fb094-d482-4b33-92c8-184b010a0169/console.log" append="off"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <video>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     </video>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:14:12 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:14:12 compute-0 nova_compute[183075]: </domain>
Jan 22 17:14:12 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.538 183079 DEBUG nova.compute.manager [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Preparing to wait for external event network-vif-plugged-1e2e1eb8-eea7-4040-8e90-9c98522a01da prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.539 183079 DEBUG oslo_concurrency.lockutils [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "613fb094-d482-4b33-92c8-184b010a0169-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.539 183079 DEBUG oslo_concurrency.lockutils [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "613fb094-d482-4b33-92c8-184b010a0169-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.539 183079 DEBUG oslo_concurrency.lockutils [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "613fb094-d482-4b33-92c8-184b010a0169-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.540 183079 DEBUG nova.virt.libvirt.vif [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:14:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-551951220',display_name='tempest-server-test-551951220',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-551951220',id=28,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFroJSJbEvCYe2sCwXkwL5KPVfwqqRPNqrWU5nnhv8U4G8wYWbEqk5AtRWXltmX7CAIPBST+sTwCKfymhALKd54XNN92D6KKnKgA6jmOihENah4FGfU80fx1UEE0osLpiw==',key_name='tempest-keypair-test-618317104',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e05c7aae349e4a1d859a387df45650a0',ramdisk_id='',reservation_id='r-d35qou6r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSeparateNetwork-931877966',owner_user_name='tempest-FloatingIpSeparateNetwork-931877966-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:14:08Z,user_data=None,user_id='cd47d63cff2548a88e21e5c2e6a5c161',uuid=613fb094-d482-4b33-92c8-184b010a0169,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1e2e1eb8-eea7-4040-8e90-9c98522a01da", "address": "fa:16:3e:e1:3a:d3", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e2e1eb8-ee", "ovs_interfaceid": "1e2e1eb8-eea7-4040-8e90-9c98522a01da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.540 183079 DEBUG nova.network.os_vif_util [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converting VIF {"id": "1e2e1eb8-eea7-4040-8e90-9c98522a01da", "address": "fa:16:3e:e1:3a:d3", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e2e1eb8-ee", "ovs_interfaceid": "1e2e1eb8-eea7-4040-8e90-9c98522a01da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.540 183079 DEBUG nova.network.os_vif_util [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e1:3a:d3,bridge_name='br-int',has_traffic_filtering=True,id=1e2e1eb8-eea7-4040-8e90-9c98522a01da,network=Network(0a16be1a-262e-47f7-8518-5f24ee15796e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1e2e1eb8-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.541 183079 DEBUG os_vif [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e1:3a:d3,bridge_name='br-int',has_traffic_filtering=True,id=1e2e1eb8-eea7-4040-8e90-9c98522a01da,network=Network(0a16be1a-262e-47f7-8518-5f24ee15796e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1e2e1eb8-ee') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.541 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.541 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.542 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.544 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.544 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1e2e1eb8-ee, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.544 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1e2e1eb8-ee, col_values=(('external_ids', {'iface-id': '1e2e1eb8-eea7-4040-8e90-9c98522a01da', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e1:3a:d3', 'vm-uuid': '613fb094-d482-4b33-92c8-184b010a0169'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.545 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:12 compute-0 NetworkManager[55454]: <info>  [1769102052.5466] manager: (tap1e2e1eb8-ee): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/129)
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.550 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.552 183079 INFO os_vif [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e1:3a:d3,bridge_name='br-int',has_traffic_filtering=True,id=1e2e1eb8-eea7-4040-8e90-9c98522a01da,network=Network(0a16be1a-262e-47f7-8518-5f24ee15796e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1e2e1eb8-ee')
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.767 183079 DEBUG nova.network.neutron [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Updating instance_info_cache with network_info: [{"id": "76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885", "address": "fa:16:3e:f2:f5:37", "network": {"id": "2f6bd5e0-c1b9-4783-b6e7-1932fe18705c", "bridge": "br-int", "label": "tempest-test-network--1617607678", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.36", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap76cfd2b1-7a", "ovs_interfaceid": "76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.804 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.900 183079 DEBUG oslo_concurrency.lockutils [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Releasing lock "refresh_cache-26367132-bc45-4c8a-bd7e-5c0883453bbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.901 183079 DEBUG nova.compute.manager [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Instance network_info: |[{"id": "76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885", "address": "fa:16:3e:f2:f5:37", "network": {"id": "2f6bd5e0-c1b9-4783-b6e7-1932fe18705c", "bridge": "br-int", "label": "tempest-test-network--1617607678", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.36", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap76cfd2b1-7a", "ovs_interfaceid": "76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.902 183079 DEBUG oslo_concurrency.lockutils [req-d455aa81-5157-4e3a-bc5a-58649f2e65a0 req-4e0e3c0a-eb95-45e3-b327-5f0553a94ab3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-26367132-bc45-4c8a-bd7e-5c0883453bbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.902 183079 DEBUG nova.network.neutron [req-d455aa81-5157-4e3a-bc5a-58649f2e65a0 req-4e0e3c0a-eb95-45e3-b327-5f0553a94ab3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Refreshing network info cache for port 76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.906 183079 DEBUG nova.virt.libvirt.driver [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Start _get_guest_xml network_info=[{"id": "76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885", "address": "fa:16:3e:f2:f5:37", "network": {"id": "2f6bd5e0-c1b9-4783-b6e7-1932fe18705c", "bridge": "br-int", "label": "tempest-test-network--1617607678", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.36", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap76cfd2b1-7a", "ovs_interfaceid": "76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.915 183079 WARNING nova.virt.libvirt.driver [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.920 183079 DEBUG nova.virt.libvirt.driver [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.921 183079 DEBUG nova.virt.libvirt.driver [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] No VIF found with MAC fa:16:3e:e1:3a:d3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.922 183079 DEBUG nova.virt.libvirt.host [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.923 183079 DEBUG nova.virt.libvirt.host [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.927 183079 DEBUG nova.virt.libvirt.host [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.928 183079 DEBUG nova.virt.libvirt.host [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.928 183079 DEBUG nova.virt.libvirt.driver [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.928 183079 DEBUG nova.virt.hardware [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.929 183079 DEBUG nova.virt.hardware [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.929 183079 DEBUG nova.virt.hardware [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.929 183079 DEBUG nova.virt.hardware [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.930 183079 DEBUG nova.virt.hardware [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.930 183079 DEBUG nova.virt.hardware [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.930 183079 DEBUG nova.virt.hardware [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.930 183079 DEBUG nova.virt.hardware [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.931 183079 DEBUG nova.virt.hardware [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.931 183079 DEBUG nova.virt.hardware [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.931 183079 DEBUG nova.virt.hardware [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.935 183079 DEBUG nova.virt.libvirt.vif [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:14:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-126458854',display_name='tempest-server-test-126458854',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-126458854',id=27,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMrJJ31y1scB3LRFxoJUNXGWF+24G8xnIsagkR4AgFUSi4x4N/JtpAglucdepqXrDN4/cu+UKlGqq1KPQ/3dphaxCQ2ycOFcca6dHGCNqF9JiM6hrakYFD5RWRpIAcONgw==',key_name='tempest-keypair-test-1482310067',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bfc6667804934c92b71ce7638089e9e3',ramdisk_id='',reservation_id='r-jcmuifay',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-QoSTest-2146064006',owner_user_name='tempest-QoSTest-2146064006-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:14:06Z,user_data=None,user_id='1e61127d65144bcbaa0d43fe3eb484c0',uuid=26367132-bc45-4c8a-bd7e-5c0883453bbd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885", "address": "fa:16:3e:f2:f5:37", "network": {"id": "2f6bd5e0-c1b9-4783-b6e7-1932fe18705c", "bridge": "br-int", "label": "tempest-test-network--1617607678", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.36", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap76cfd2b1-7a", "ovs_interfaceid": "76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.935 183079 DEBUG nova.network.os_vif_util [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Converting VIF {"id": "76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885", "address": "fa:16:3e:f2:f5:37", "network": {"id": "2f6bd5e0-c1b9-4783-b6e7-1932fe18705c", "bridge": "br-int", "label": "tempest-test-network--1617607678", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.36", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap76cfd2b1-7a", "ovs_interfaceid": "76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.936 183079 DEBUG nova.network.os_vif_util [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:f5:37,bridge_name='br-int',has_traffic_filtering=True,id=76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885,network=Network(2f6bd5e0-c1b9-4783-b6e7-1932fe18705c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap76cfd2b1-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.938 183079 DEBUG nova.objects.instance [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 26367132-bc45-4c8a-bd7e-5c0883453bbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.973 183079 DEBUG nova.virt.libvirt.driver [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:14:12 compute-0 nova_compute[183075]:   <uuid>26367132-bc45-4c8a-bd7e-5c0883453bbd</uuid>
Jan 22 17:14:12 compute-0 nova_compute[183075]:   <name>instance-0000001b</name>
Jan 22 17:14:12 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:14:12 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:14:12 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-126458854</nova:name>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:14:12</nova:creationTime>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:14:12 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:14:12 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:14:12 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:14:12 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:14:12 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:14:12 compute-0 nova_compute[183075]:         <nova:user uuid="1e61127d65144bcbaa0d43fe3eb484c0">tempest-QoSTest-2146064006-project-member</nova:user>
Jan 22 17:14:12 compute-0 nova_compute[183075]:         <nova:project uuid="bfc6667804934c92b71ce7638089e9e3">tempest-QoSTest-2146064006</nova:project>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:14:12 compute-0 nova_compute[183075]:         <nova:port uuid="76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885">
Jan 22 17:14:12 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.36" ipVersion="4"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:14:12 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:14:12 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <system>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <entry name="serial">26367132-bc45-4c8a-bd7e-5c0883453bbd</entry>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <entry name="uuid">26367132-bc45-4c8a-bd7e-5c0883453bbd</entry>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     </system>
Jan 22 17:14:12 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:14:12 compute-0 nova_compute[183075]:   <os>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:   </os>
Jan 22 17:14:12 compute-0 nova_compute[183075]:   <features>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:   </features>
Jan 22 17:14:12 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:14:12 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:14:12 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/26367132-bc45-4c8a-bd7e-5c0883453bbd/disk"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:f2:f5:37"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <target dev="tap76cfd2b1-7a"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/26367132-bc45-4c8a-bd7e-5c0883453bbd/console.log" append="off"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <video>
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     </video>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:14:12 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:14:12 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:14:12 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:14:12 compute-0 nova_compute[183075]: </domain>
Jan 22 17:14:12 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.975 183079 DEBUG nova.compute.manager [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Preparing to wait for external event network-vif-plugged-76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.976 183079 DEBUG oslo_concurrency.lockutils [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "26367132-bc45-4c8a-bd7e-5c0883453bbd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.976 183079 DEBUG oslo_concurrency.lockutils [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "26367132-bc45-4c8a-bd7e-5c0883453bbd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.977 183079 DEBUG oslo_concurrency.lockutils [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "26367132-bc45-4c8a-bd7e-5c0883453bbd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.978 183079 DEBUG nova.virt.libvirt.vif [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:14:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-126458854',display_name='tempest-server-test-126458854',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-126458854',id=27,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMrJJ31y1scB3LRFxoJUNXGWF+24G8xnIsagkR4AgFUSi4x4N/JtpAglucdepqXrDN4/cu+UKlGqq1KPQ/3dphaxCQ2ycOFcca6dHGCNqF9JiM6hrakYFD5RWRpIAcONgw==',key_name='tempest-keypair-test-1482310067',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bfc6667804934c92b71ce7638089e9e3',ramdisk_id='',reservation_id='r-jcmuifay',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-QoSTest-2146064006',owner_user_name='tempest-QoSTest-2146064006-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:14:06Z,user_data=None,user_id='1e61127d65144bcbaa0d43fe3eb484c0',uuid=26367132-bc45-4c8a-bd7e-5c0883453bbd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885", "address": "fa:16:3e:f2:f5:37", "network": {"id": "2f6bd5e0-c1b9-4783-b6e7-1932fe18705c", "bridge": "br-int", "label": "tempest-test-network--1617607678", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.36", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap76cfd2b1-7a", "ovs_interfaceid": "76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.978 183079 DEBUG nova.network.os_vif_util [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Converting VIF {"id": "76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885", "address": "fa:16:3e:f2:f5:37", "network": {"id": "2f6bd5e0-c1b9-4783-b6e7-1932fe18705c", "bridge": "br-int", "label": "tempest-test-network--1617607678", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.36", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap76cfd2b1-7a", "ovs_interfaceid": "76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.979 183079 DEBUG nova.network.os_vif_util [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:f5:37,bridge_name='br-int',has_traffic_filtering=True,id=76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885,network=Network(2f6bd5e0-c1b9-4783-b6e7-1932fe18705c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap76cfd2b1-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.980 183079 DEBUG os_vif [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:f5:37,bridge_name='br-int',has_traffic_filtering=True,id=76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885,network=Network(2f6bd5e0-c1b9-4783-b6e7-1932fe18705c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap76cfd2b1-7a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.980 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.981 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.981 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.988 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.988 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap76cfd2b1-7a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.989 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap76cfd2b1-7a, col_values=(('external_ids', {'iface-id': '76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f2:f5:37', 'vm-uuid': '26367132-bc45-4c8a-bd7e-5c0883453bbd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.990 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:12 compute-0 NetworkManager[55454]: <info>  [1769102052.9913] manager: (tap76cfd2b1-7a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/130)
Jan 22 17:14:12 compute-0 nova_compute[183075]: 2026-01-22 17:14:12.993 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.002 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.003 183079 INFO os_vif [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:f5:37,bridge_name='br-int',has_traffic_filtering=True,id=76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885,network=Network(2f6bd5e0-c1b9-4783-b6e7-1932fe18705c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap76cfd2b1-7a')
Jan 22 17:14:13 compute-0 NetworkManager[55454]: <info>  [1769102053.0094] manager: (tap1e2e1eb8-ee): new Tun device (/org/freedesktop/NetworkManager/Devices/131)
Jan 22 17:14:13 compute-0 kernel: tap1e2e1eb8-ee: entered promiscuous mode
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.015 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:13 compute-0 ovn_controller[95372]: 2026-01-22T17:14:13Z|00307|binding|INFO|Claiming lport 1e2e1eb8-eea7-4040-8e90-9c98522a01da for this chassis.
Jan 22 17:14:13 compute-0 ovn_controller[95372]: 2026-01-22T17:14:13Z|00308|binding|INFO|1e2e1eb8-eea7-4040-8e90-9c98522a01da: Claiming fa:16:3e:e1:3a:d3 10.100.0.24
Jan 22 17:14:13 compute-0 ovn_controller[95372]: 2026-01-22T17:14:13Z|00309|binding|INFO|Setting lport 1e2e1eb8-eea7-4040-8e90-9c98522a01da ovn-installed in OVS
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.048 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.052 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:13 compute-0 ovn_controller[95372]: 2026-01-22T17:14:13Z|00310|binding|INFO|Setting lport 1e2e1eb8-eea7-4040-8e90-9c98522a01da up in Southbound
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:13.058 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e1:3a:d3 10.100.0.24'], port_security=['fa:16:3e:e1:3a:d3 10.100.0.24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.24/28', 'neutron:device_id': '613fb094-d482-4b33-92c8-184b010a0169', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a16be1a-262e-47f7-8518-5f24ee15796e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e05c7aae349e4a1d859a387df45650a0', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b9cc1ac9-85f4-4da7-891e-b0644711e574', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5b6ccb16-1216-4deb-9d72-42005a3163bb, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=1e2e1eb8-eea7-4040-8e90-9c98522a01da) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:13.060 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 1e2e1eb8-eea7-4040-8e90-9c98522a01da in datapath 0a16be1a-262e-47f7-8518-5f24ee15796e bound to our chassis
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:13.064 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0a16be1a-262e-47f7-8518-5f24ee15796e
Jan 22 17:14:13 compute-0 systemd-udevd[222467]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:13.076 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f69ab8be-40ea-4d4d-b7f0-4d50511833a0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:13.077 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0a16be1a-21 in ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:14:13 compute-0 systemd-machined[154382]: New machine qemu-27-instance-0000001c.
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:13.079 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0a16be1a-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:13.080 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[889d260f-cb1d-458b-b487-140d6f6e8d5e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:13.081 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[fec6d212-669f-4c98-9946-2559a4f00c20]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:13 compute-0 NetworkManager[55454]: <info>  [1769102053.0851] device (tap1e2e1eb8-ee): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:14:13 compute-0 NetworkManager[55454]: <info>  [1769102053.0857] device (tap1e2e1eb8-ee): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:14:13 compute-0 systemd[1]: Started Virtual Machine qemu-27-instance-0000001c.
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:13.097 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[fbc75f91-e35a-4bce-8624-7738b41249bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:13 compute-0 ovn_controller[95372]: 2026-01-22T17:14:13Z|00311|binding|INFO|Releasing lport 1759254b-798a-4e65-baf5-489557c1f604 from this chassis (sb_readonly=0)
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:13.120 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ee2d5c46-1eb0-4742-a4fc-aef205f502e9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:13 compute-0 podman[222454]: 2026-01-22 17:14:13.136113845 +0000 UTC m=+0.087444461 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.141 183079 DEBUG nova.virt.libvirt.driver [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.142 183079 DEBUG nova.virt.libvirt.driver [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] No VIF found with MAC fa:16:3e:f2:f5:37, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:13.159 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[dfacebb5-ee8d-4764-820d-816f8eaf165e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:13 compute-0 NetworkManager[55454]: <info>  [1769102053.1798] manager: (tap0a16be1a-20): new Veth device (/org/freedesktop/NetworkManager/Devices/132)
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:13.179 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[85d1b1a0-eca2-4238-934f-9a0350f40268]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:13 compute-0 systemd-udevd[222511]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:14:13 compute-0 kernel: tap76cfd2b1-7a: entered promiscuous mode
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.272 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:13 compute-0 ovn_controller[95372]: 2026-01-22T17:14:13Z|00312|binding|INFO|Claiming lport 76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885 for this chassis.
Jan 22 17:14:13 compute-0 ovn_controller[95372]: 2026-01-22T17:14:13Z|00313|binding|INFO|76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885: Claiming fa:16:3e:f2:f5:37 10.100.0.36
Jan 22 17:14:13 compute-0 NetworkManager[55454]: <info>  [1769102053.2763] manager: (tap76cfd2b1-7a): new Tun device (/org/freedesktop/NetworkManager/Devices/133)
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:13.277 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[7d6ada79-b6e2-48ba-a4f1-14520fbdff11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:13 compute-0 NetworkManager[55454]: <info>  [1769102053.2888] device (tap76cfd2b1-7a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:14:13 compute-0 NetworkManager[55454]: <info>  [1769102053.2898] device (tap76cfd2b1-7a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:13.289 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:f5:37 10.100.0.36'], port_security=['fa:16:3e:f2:f5:37 10.100.0.36'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.36/28', 'neutron:device_id': '26367132-bc45-4c8a-bd7e-5c0883453bbd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bfc6667804934c92b71ce7638089e9e3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1ec3a99e-543d-4786-af18-fa8f96c0f742', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12ee3cfa-41b0-4935-bdbd-341f1a5f30cc, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:13.290 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[6da6e8a9-3417-4ea3-8810-72138308c83b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:13 compute-0 ovn_controller[95372]: 2026-01-22T17:14:13Z|00314|binding|INFO|Setting lport 76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885 ovn-installed in OVS
Jan 22 17:14:13 compute-0 ovn_controller[95372]: 2026-01-22T17:14:13Z|00315|binding|INFO|Setting lport 76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885 up in Southbound
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.292 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.295 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:13 compute-0 NetworkManager[55454]: <info>  [1769102053.3174] device (tap0a16be1a-20): carrier: link connected
Jan 22 17:14:13 compute-0 systemd-machined[154382]: New machine qemu-28-instance-0000001b.
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:13.324 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[a4381c2b-6673-4a3c-b2ed-b29a754d6a63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.329 183079 DEBUG nova.compute.manager [req-82aebd54-0fd9-4d9f-a8e5-1384f1579faf req-b6a7882b-2b8e-46ec-9655-aa5dfe190f21 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Received event network-vif-plugged-1e2e1eb8-eea7-4040-8e90-9c98522a01da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.330 183079 DEBUG oslo_concurrency.lockutils [req-82aebd54-0fd9-4d9f-a8e5-1384f1579faf req-b6a7882b-2b8e-46ec-9655-aa5dfe190f21 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "613fb094-d482-4b33-92c8-184b010a0169-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.330 183079 DEBUG oslo_concurrency.lockutils [req-82aebd54-0fd9-4d9f-a8e5-1384f1579faf req-b6a7882b-2b8e-46ec-9655-aa5dfe190f21 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "613fb094-d482-4b33-92c8-184b010a0169-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.330 183079 DEBUG oslo_concurrency.lockutils [req-82aebd54-0fd9-4d9f-a8e5-1384f1579faf req-b6a7882b-2b8e-46ec-9655-aa5dfe190f21 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "613fb094-d482-4b33-92c8-184b010a0169-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.330 183079 DEBUG nova.compute.manager [req-82aebd54-0fd9-4d9f-a8e5-1384f1579faf req-b6a7882b-2b8e-46ec-9655-aa5dfe190f21 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Processing event network-vif-plugged-1e2e1eb8-eea7-4040-8e90-9c98522a01da _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:13.339 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[642e9c53-448a-493e-87ef-961c88f20aa3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0a16be1a-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ba:16:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441701, 'reachable_time': 38676, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222530, 'error': None, 'target': 'ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:13 compute-0 systemd[1]: Started Virtual Machine qemu-28-instance-0000001b.
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:13.360 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[32668c21-1b00-4a95-a4db-2308ec1fbc62]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feba:16c2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441701, 'tstamp': 441701}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222531, 'error': None, 'target': 'ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:13.381 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1bd3be5f-4933-44cf-8ae0-d476c7e1336d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0a16be1a-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ba:16:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441701, 'reachable_time': 38676, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222534, 'error': None, 'target': 'ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:13.411 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b12cf891-4a10-45b5-becf-d8cd30d08389]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:13.463 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[915e2406-1c34-4c52-a7be-a8e75d8ae68a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:13.464 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a16be1a-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:13.464 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:13.464 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0a16be1a-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.466 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:13 compute-0 kernel: tap0a16be1a-20: entered promiscuous mode
Jan 22 17:14:13 compute-0 NetworkManager[55454]: <info>  [1769102053.4683] manager: (tap0a16be1a-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/134)
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:13.470 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0a16be1a-20, col_values=(('external_ids', {'iface-id': 'f5af8e72-5100-4440-84f0-c68eec4b5e5e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.472 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:13 compute-0 ovn_controller[95372]: 2026-01-22T17:14:13Z|00316|binding|INFO|Releasing lport f5af8e72-5100-4440-84f0-c68eec4b5e5e from this chassis (sb_readonly=0)
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.473 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:13.474 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0a16be1a-262e-47f7-8518-5f24ee15796e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0a16be1a-262e-47f7-8518-5f24ee15796e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:13.475 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[33ae1712-6182-4b32-a805-8c299f0e3403]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:13.476 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/0a16be1a-262e-47f7-8518-5f24ee15796e.pid.haproxy
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 0a16be1a-262e-47f7-8518-5f24ee15796e
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:14:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:13.477 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e', 'env', 'PROCESS_TAG=haproxy-0a16be1a-262e-47f7-8518-5f24ee15796e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0a16be1a-262e-47f7-8518-5f24ee15796e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.485 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.656 183079 DEBUG nova.network.neutron [req-a806a219-d5c0-4eb5-a8a2-58200676653c req-e795a686-20e8-4dcd-837a-e4dcc914b422 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Updated VIF entry in instance network info cache for port 1e2e1eb8-eea7-4040-8e90-9c98522a01da. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.657 183079 DEBUG nova.network.neutron [req-a806a219-d5c0-4eb5-a8a2-58200676653c req-e795a686-20e8-4dcd-837a-e4dcc914b422 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Updating instance_info_cache with network_info: [{"id": "1e2e1eb8-eea7-4040-8e90-9c98522a01da", "address": "fa:16:3e:e1:3a:d3", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e2e1eb8-ee", "ovs_interfaceid": "1e2e1eb8-eea7-4040-8e90-9c98522a01da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:14:13 compute-0 podman[222570]: 2026-01-22 17:14:13.866027184 +0000 UTC m=+0.082014699 container create e10c75bf8e526c31a039b24eb52457fe532227aeb8eae403d1ec3f110578d5ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.879 183079 DEBUG oslo_concurrency.lockutils [req-a806a219-d5c0-4eb5-a8a2-58200676653c req-e795a686-20e8-4dcd-837a-e4dcc914b422 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-613fb094-d482-4b33-92c8-184b010a0169" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:14:13 compute-0 podman[222570]: 2026-01-22 17:14:13.831319219 +0000 UTC m=+0.047306774 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:14:13 compute-0 systemd[1]: Started libpod-conmon-e10c75bf8e526c31a039b24eb52457fe532227aeb8eae403d1ec3f110578d5ac.scope.
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.935 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102053.9353778, 26367132-bc45-4c8a-bd7e-5c0883453bbd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.936 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] VM Started (Lifecycle Event)
Jan 22 17:14:13 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.939 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.939 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:14:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/41007d1b5a1e8b2b463f75b32e45b0842b07a46ef9248ff75a8cbaa3c4d92c41/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.954 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:14:13 compute-0 podman[222570]: 2026-01-22 17:14:13.956880523 +0000 UTC m=+0.172868068 container init e10c75bf8e526c31a039b24eb52457fe532227aeb8eae403d1ec3f110578d5ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.959 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102053.9365833, 26367132-bc45-4c8a-bd7e-5c0883453bbd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.959 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] VM Paused (Lifecycle Event)
Jan 22 17:14:13 compute-0 podman[222570]: 2026-01-22 17:14:13.962878289 +0000 UTC m=+0.178865814 container start e10c75bf8e526c31a039b24eb52457fe532227aeb8eae403d1ec3f110578d5ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.976 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.979 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:14:13 compute-0 neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e[222592]: [NOTICE]   (222596) : New worker (222598) forked
Jan 22 17:14:13 compute-0 neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e[222592]: [NOTICE]   (222596) : Loading success.
Jan 22 17:14:13 compute-0 nova_compute[183075]: 2026-01-22 17:14:13.997 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:14.017 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885 in datapath 2f6bd5e0-c1b9-4783-b6e7-1932fe18705c unbound from our chassis
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:14.019 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2f6bd5e0-c1b9-4783-b6e7-1932fe18705c
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:14.031 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a203cf05-137d-4ba5-9a9f-163139811938]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:14.031 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2f6bd5e0-c1 in ovnmeta-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:14.033 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2f6bd5e0-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:14.033 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[cc867b88-1065-41c7-a261-6ae97e816123]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:14.034 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[62cf156e-ecc8-4847-bf78-ec6a3e29d6c3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:14.044 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[5f43d172-34f9-42f6-8d13-26e9cf1285cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:14.057 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[db09fbd5-d57e-4bb4-9069-04848df6ec09]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:14.085 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[cfdbc9b0-ae04-4efe-a58c-7295911ad951]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:14.091 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c845d6fa-eae3-4adc-8e9f-379736494b20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:14 compute-0 NetworkManager[55454]: <info>  [1769102054.0927] manager: (tap2f6bd5e0-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/135)
Jan 22 17:14:14 compute-0 systemd-udevd[222507]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:14.119 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[f476f1e8-9cb7-4d06-a7c9-5780e490e5d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:14.122 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[63bed63d-e9e3-4834-b342-cb6868a8c309]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:14 compute-0 NetworkManager[55454]: <info>  [1769102054.1417] device (tap2f6bd5e0-c0): carrier: link connected
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:14.146 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[b0f910ca-846b-4381-8a4d-5062891b2dd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:14.168 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b79729e5-fe86-4b36-9577-7c26ffd6e35b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2f6bd5e0-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:74:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 88], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441784, 'reachable_time': 35818, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222624, 'error': None, 'target': 'ovnmeta-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:14 compute-0 nova_compute[183075]: 2026-01-22 17:14:14.191 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102054.1911192, 613fb094-d482-4b33-92c8-184b010a0169 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:14:14 compute-0 nova_compute[183075]: 2026-01-22 17:14:14.191 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 613fb094-d482-4b33-92c8-184b010a0169] VM Started (Lifecycle Event)
Jan 22 17:14:14 compute-0 nova_compute[183075]: 2026-01-22 17:14:14.193 183079 DEBUG nova.compute.manager [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:14.195 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3b6fec2b-883f-42e1-a007-6c92a94a604d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe43:74ff'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441784, 'tstamp': 441784}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222625, 'error': None, 'target': 'ovnmeta-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:14 compute-0 nova_compute[183075]: 2026-01-22 17:14:14.197 183079 DEBUG nova.virt.libvirt.driver [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:14:14 compute-0 nova_compute[183075]: 2026-01-22 17:14:14.199 183079 INFO nova.virt.libvirt.driver [-] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Instance spawned successfully.
Jan 22 17:14:14 compute-0 nova_compute[183075]: 2026-01-22 17:14:14.200 183079 DEBUG nova.virt.libvirt.driver [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:14.213 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3b7ebf09-e53c-4ffa-ad14-6754c89d0d0f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2f6bd5e0-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:74:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 88], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441784, 'reachable_time': 35818, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222626, 'error': None, 'target': 'ovnmeta-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:14.242 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3ec71100-a16d-4736-86da-d583f687c432]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:14.299 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e6e310f1-f428-46dc-9aca-9a7a37f9e93f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:14.301 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2f6bd5e0-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:14.301 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:14.301 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2f6bd5e0-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:14:14 compute-0 nova_compute[183075]: 2026-01-22 17:14:14.302 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:14 compute-0 kernel: tap2f6bd5e0-c0: entered promiscuous mode
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:14.309 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2f6bd5e0-c0, col_values=(('external_ids', {'iface-id': '89fb1d7a-a439-4b74-977f-72af8114ea6b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:14:14 compute-0 nova_compute[183075]: 2026-01-22 17:14:14.310 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:14 compute-0 ovn_controller[95372]: 2026-01-22T17:14:14Z|00317|binding|INFO|Releasing lport 89fb1d7a-a439-4b74-977f-72af8114ea6b from this chassis (sb_readonly=0)
Jan 22 17:14:14 compute-0 nova_compute[183075]: 2026-01-22 17:14:14.368 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:14 compute-0 NetworkManager[55454]: <info>  [1769102054.3695] manager: (tap2f6bd5e0-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/136)
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:14.370 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2f6bd5e0-c1b9-4783-b6e7-1932fe18705c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2f6bd5e0-c1b9-4783-b6e7-1932fe18705c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:14.371 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a8af1cf4-124f-4f85-909f-f5229633451f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:14.372 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/2f6bd5e0-c1b9-4783-b6e7-1932fe18705c.pid.haproxy
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 2f6bd5e0-c1b9-4783-b6e7-1932fe18705c
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:14:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:14.372 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c', 'env', 'PROCESS_TAG=haproxy-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2f6bd5e0-c1b9-4783-b6e7-1932fe18705c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:14:14 compute-0 podman[222658]: 2026-01-22 17:14:14.750363521 +0000 UTC m=+0.057399818 container create 79c0a3668c1d22d65d670ffded06db05fa72502bc394a4da5302746ba6472e64 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:14:14 compute-0 nova_compute[183075]: 2026-01-22 17:14:14.752 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:14:14 compute-0 nova_compute[183075]: 2026-01-22 17:14:14.763 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:14:14 compute-0 nova_compute[183075]: 2026-01-22 17:14:14.767 183079 DEBUG nova.virt.libvirt.driver [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:14:14 compute-0 nova_compute[183075]: 2026-01-22 17:14:14.768 183079 DEBUG nova.virt.libvirt.driver [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:14:14 compute-0 nova_compute[183075]: 2026-01-22 17:14:14.768 183079 DEBUG nova.virt.libvirt.driver [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:14:14 compute-0 nova_compute[183075]: 2026-01-22 17:14:14.769 183079 DEBUG nova.virt.libvirt.driver [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:14:14 compute-0 nova_compute[183075]: 2026-01-22 17:14:14.770 183079 DEBUG nova.virt.libvirt.driver [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:14:14 compute-0 nova_compute[183075]: 2026-01-22 17:14:14.770 183079 DEBUG nova.virt.libvirt.driver [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:14:14 compute-0 systemd[1]: Started libpod-conmon-79c0a3668c1d22d65d670ffded06db05fa72502bc394a4da5302746ba6472e64.scope.
Jan 22 17:14:14 compute-0 podman[222658]: 2026-01-22 17:14:14.717815452 +0000 UTC m=+0.024851759 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:14:14 compute-0 nova_compute[183075]: 2026-01-22 17:14:14.816 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 613fb094-d482-4b33-92c8-184b010a0169] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:14:14 compute-0 nova_compute[183075]: 2026-01-22 17:14:14.816 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102054.191259, 613fb094-d482-4b33-92c8-184b010a0169 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:14:14 compute-0 nova_compute[183075]: 2026-01-22 17:14:14.817 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 613fb094-d482-4b33-92c8-184b010a0169] VM Paused (Lifecycle Event)
Jan 22 17:14:14 compute-0 nova_compute[183075]: 2026-01-22 17:14:14.841 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:14:14 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:14:14 compute-0 nova_compute[183075]: 2026-01-22 17:14:14.846 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102054.1964443, 613fb094-d482-4b33-92c8-184b010a0169 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:14:14 compute-0 nova_compute[183075]: 2026-01-22 17:14:14.846 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 613fb094-d482-4b33-92c8-184b010a0169] VM Resumed (Lifecycle Event)
Jan 22 17:14:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d604c141741825b731f79c27c1fd23aac9acc55c16b9ce88a0bafacb0adbbc1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:14:14 compute-0 podman[222658]: 2026-01-22 17:14:14.866343514 +0000 UTC m=+0.173379811 container init 79c0a3668c1d22d65d670ffded06db05fa72502bc394a4da5302746ba6472e64 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 22 17:14:14 compute-0 podman[222658]: 2026-01-22 17:14:14.872957857 +0000 UTC m=+0.179994134 container start 79c0a3668c1d22d65d670ffded06db05fa72502bc394a4da5302746ba6472e64 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 17:14:14 compute-0 nova_compute[183075]: 2026-01-22 17:14:14.882 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:14:14 compute-0 nova_compute[183075]: 2026-01-22 17:14:14.884 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:14:14 compute-0 neutron-haproxy-ovnmeta-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c[222674]: [NOTICE]   (222678) : New worker (222680) forked
Jan 22 17:14:14 compute-0 neutron-haproxy-ovnmeta-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c[222674]: [NOTICE]   (222678) : Loading success.
Jan 22 17:14:14 compute-0 nova_compute[183075]: 2026-01-22 17:14:14.937 183079 INFO nova.compute.manager [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Took 6.51 seconds to spawn the instance on the hypervisor.
Jan 22 17:14:14 compute-0 nova_compute[183075]: 2026-01-22 17:14:14.938 183079 DEBUG nova.compute.manager [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:14:15 compute-0 nova_compute[183075]: 2026-01-22 17:14:15.108 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 613fb094-d482-4b33-92c8-184b010a0169] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:14:15 compute-0 nova_compute[183075]: 2026-01-22 17:14:15.155 183079 INFO nova.compute.manager [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Took 8.28 seconds to build instance.
Jan 22 17:14:15 compute-0 nova_compute[183075]: 2026-01-22 17:14:15.176 183079 DEBUG oslo_concurrency.lockutils [None req-7891b36f-7d42-4a5e-90ec-7b04c0258f7a cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "613fb094-d482-4b33-92c8-184b010a0169" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.383s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:14:15 compute-0 nova_compute[183075]: 2026-01-22 17:14:15.423 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:15 compute-0 nova_compute[183075]: 2026-01-22 17:14:15.578 183079 DEBUG nova.network.neutron [req-d455aa81-5157-4e3a-bc5a-58649f2e65a0 req-4e0e3c0a-eb95-45e3-b327-5f0553a94ab3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Updated VIF entry in instance network info cache for port 76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:14:15 compute-0 nova_compute[183075]: 2026-01-22 17:14:15.579 183079 DEBUG nova.network.neutron [req-d455aa81-5157-4e3a-bc5a-58649f2e65a0 req-4e0e3c0a-eb95-45e3-b327-5f0553a94ab3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Updating instance_info_cache with network_info: [{"id": "76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885", "address": "fa:16:3e:f2:f5:37", "network": {"id": "2f6bd5e0-c1b9-4783-b6e7-1932fe18705c", "bridge": "br-int", "label": "tempest-test-network--1617607678", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.36", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap76cfd2b1-7a", "ovs_interfaceid": "76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:14:15 compute-0 nova_compute[183075]: 2026-01-22 17:14:15.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:14:15 compute-0 nova_compute[183075]: 2026-01-22 17:14:15.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:14:15 compute-0 nova_compute[183075]: 2026-01-22 17:14:15.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:14:15 compute-0 nova_compute[183075]: 2026-01-22 17:14:15.932 183079 DEBUG nova.compute.manager [req-83ff1ed4-10c7-47e2-9550-c0bdf21d8791 req-ff96c1ae-4d68-47b7-9d4a-83244e65625d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Received event network-vif-plugged-1e2e1eb8-eea7-4040-8e90-9c98522a01da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:14:15 compute-0 nova_compute[183075]: 2026-01-22 17:14:15.933 183079 DEBUG oslo_concurrency.lockutils [req-83ff1ed4-10c7-47e2-9550-c0bdf21d8791 req-ff96c1ae-4d68-47b7-9d4a-83244e65625d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "613fb094-d482-4b33-92c8-184b010a0169-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:14:15 compute-0 nova_compute[183075]: 2026-01-22 17:14:15.933 183079 DEBUG oslo_concurrency.lockutils [req-83ff1ed4-10c7-47e2-9550-c0bdf21d8791 req-ff96c1ae-4d68-47b7-9d4a-83244e65625d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "613fb094-d482-4b33-92c8-184b010a0169-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:14:15 compute-0 nova_compute[183075]: 2026-01-22 17:14:15.934 183079 DEBUG oslo_concurrency.lockutils [req-83ff1ed4-10c7-47e2-9550-c0bdf21d8791 req-ff96c1ae-4d68-47b7-9d4a-83244e65625d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "613fb094-d482-4b33-92c8-184b010a0169-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:14:15 compute-0 nova_compute[183075]: 2026-01-22 17:14:15.934 183079 DEBUG nova.compute.manager [req-83ff1ed4-10c7-47e2-9550-c0bdf21d8791 req-ff96c1ae-4d68-47b7-9d4a-83244e65625d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] No waiting events found dispatching network-vif-plugged-1e2e1eb8-eea7-4040-8e90-9c98522a01da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:14:15 compute-0 nova_compute[183075]: 2026-01-22 17:14:15.935 183079 WARNING nova.compute.manager [req-83ff1ed4-10c7-47e2-9550-c0bdf21d8791 req-ff96c1ae-4d68-47b7-9d4a-83244e65625d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Received unexpected event network-vif-plugged-1e2e1eb8-eea7-4040-8e90-9c98522a01da for instance with vm_state active and task_state None.
Jan 22 17:14:15 compute-0 nova_compute[183075]: 2026-01-22 17:14:15.935 183079 DEBUG nova.compute.manager [req-83ff1ed4-10c7-47e2-9550-c0bdf21d8791 req-ff96c1ae-4d68-47b7-9d4a-83244e65625d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Received event network-vif-plugged-76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:14:15 compute-0 nova_compute[183075]: 2026-01-22 17:14:15.936 183079 DEBUG oslo_concurrency.lockutils [req-83ff1ed4-10c7-47e2-9550-c0bdf21d8791 req-ff96c1ae-4d68-47b7-9d4a-83244e65625d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "26367132-bc45-4c8a-bd7e-5c0883453bbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:14:15 compute-0 nova_compute[183075]: 2026-01-22 17:14:15.936 183079 DEBUG oslo_concurrency.lockutils [req-83ff1ed4-10c7-47e2-9550-c0bdf21d8791 req-ff96c1ae-4d68-47b7-9d4a-83244e65625d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "26367132-bc45-4c8a-bd7e-5c0883453bbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:14:15 compute-0 nova_compute[183075]: 2026-01-22 17:14:15.937 183079 DEBUG oslo_concurrency.lockutils [req-83ff1ed4-10c7-47e2-9550-c0bdf21d8791 req-ff96c1ae-4d68-47b7-9d4a-83244e65625d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "26367132-bc45-4c8a-bd7e-5c0883453bbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:14:15 compute-0 nova_compute[183075]: 2026-01-22 17:14:15.937 183079 DEBUG nova.compute.manager [req-83ff1ed4-10c7-47e2-9550-c0bdf21d8791 req-ff96c1ae-4d68-47b7-9d4a-83244e65625d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Processing event network-vif-plugged-76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:14:15 compute-0 nova_compute[183075]: 2026-01-22 17:14:15.938 183079 DEBUG nova.compute.manager [req-83ff1ed4-10c7-47e2-9550-c0bdf21d8791 req-ff96c1ae-4d68-47b7-9d4a-83244e65625d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Received event network-vif-plugged-76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:14:15 compute-0 nova_compute[183075]: 2026-01-22 17:14:15.938 183079 DEBUG oslo_concurrency.lockutils [req-83ff1ed4-10c7-47e2-9550-c0bdf21d8791 req-ff96c1ae-4d68-47b7-9d4a-83244e65625d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "26367132-bc45-4c8a-bd7e-5c0883453bbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:14:15 compute-0 nova_compute[183075]: 2026-01-22 17:14:15.939 183079 DEBUG oslo_concurrency.lockutils [req-83ff1ed4-10c7-47e2-9550-c0bdf21d8791 req-ff96c1ae-4d68-47b7-9d4a-83244e65625d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "26367132-bc45-4c8a-bd7e-5c0883453bbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:14:15 compute-0 nova_compute[183075]: 2026-01-22 17:14:15.939 183079 DEBUG oslo_concurrency.lockutils [req-83ff1ed4-10c7-47e2-9550-c0bdf21d8791 req-ff96c1ae-4d68-47b7-9d4a-83244e65625d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "26367132-bc45-4c8a-bd7e-5c0883453bbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:14:15 compute-0 nova_compute[183075]: 2026-01-22 17:14:15.940 183079 DEBUG nova.compute.manager [req-83ff1ed4-10c7-47e2-9550-c0bdf21d8791 req-ff96c1ae-4d68-47b7-9d4a-83244e65625d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] No waiting events found dispatching network-vif-plugged-76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:14:15 compute-0 nova_compute[183075]: 2026-01-22 17:14:15.941 183079 WARNING nova.compute.manager [req-83ff1ed4-10c7-47e2-9550-c0bdf21d8791 req-ff96c1ae-4d68-47b7-9d4a-83244e65625d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Received unexpected event network-vif-plugged-76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885 for instance with vm_state building and task_state spawning.
Jan 22 17:14:15 compute-0 nova_compute[183075]: 2026-01-22 17:14:15.944 183079 DEBUG nova.compute.manager [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:14:15 compute-0 nova_compute[183075]: 2026-01-22 17:14:15.946 183079 INFO nova.compute.manager [None req-5557557e-248c-4384-af45-d07143001b5c cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Get console output
Jan 22 17:14:15 compute-0 nova_compute[183075]: 2026-01-22 17:14:15.965 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102055.9618273, 26367132-bc45-4c8a-bd7e-5c0883453bbd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:14:15 compute-0 nova_compute[183075]: 2026-01-22 17:14:15.966 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] VM Resumed (Lifecycle Event)
Jan 22 17:14:15 compute-0 nova_compute[183075]: 2026-01-22 17:14:15.967 183079 DEBUG nova.virt.libvirt.driver [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:14:15 compute-0 nova_compute[183075]: 2026-01-22 17:14:15.971 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:14:15 compute-0 nova_compute[183075]: 2026-01-22 17:14:15.973 183079 INFO nova.virt.libvirt.driver [-] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Instance spawned successfully.
Jan 22 17:14:15 compute-0 nova_compute[183075]: 2026-01-22 17:14:15.974 183079 DEBUG nova.virt.libvirt.driver [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:14:16 compute-0 nova_compute[183075]: 2026-01-22 17:14:16.083 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:14:16 compute-0 nova_compute[183075]: 2026-01-22 17:14:16.084 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:14:16 compute-0 nova_compute[183075]: 2026-01-22 17:14:16.085 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:14:16 compute-0 nova_compute[183075]: 2026-01-22 17:14:16.085 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:14:16 compute-0 nova_compute[183075]: 2026-01-22 17:14:16.087 183079 DEBUG oslo_concurrency.lockutils [req-d455aa81-5157-4e3a-bc5a-58649f2e65a0 req-4e0e3c0a-eb95-45e3-b327-5f0553a94ab3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-26367132-bc45-4c8a-bd7e-5c0883453bbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:14:16 compute-0 nova_compute[183075]: 2026-01-22 17:14:16.089 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:14:16 compute-0 nova_compute[183075]: 2026-01-22 17:14:16.094 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:14:16 compute-0 nova_compute[183075]: 2026-01-22 17:14:16.111 183079 DEBUG nova.virt.libvirt.driver [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:14:16 compute-0 nova_compute[183075]: 2026-01-22 17:14:16.112 183079 DEBUG nova.virt.libvirt.driver [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:14:16 compute-0 nova_compute[183075]: 2026-01-22 17:14:16.112 183079 DEBUG nova.virt.libvirt.driver [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:14:16 compute-0 nova_compute[183075]: 2026-01-22 17:14:16.112 183079 DEBUG nova.virt.libvirt.driver [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:14:16 compute-0 nova_compute[183075]: 2026-01-22 17:14:16.113 183079 DEBUG nova.virt.libvirt.driver [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:14:16 compute-0 nova_compute[183075]: 2026-01-22 17:14:16.113 183079 DEBUG nova.virt.libvirt.driver [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:14:16 compute-0 nova_compute[183075]: 2026-01-22 17:14:16.119 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:14:16 compute-0 nova_compute[183075]: 2026-01-22 17:14:16.542 183079 INFO nova.compute.manager [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Took 9.61 seconds to spawn the instance on the hypervisor.
Jan 22 17:14:16 compute-0 nova_compute[183075]: 2026-01-22 17:14:16.543 183079 DEBUG nova.compute.manager [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:14:16 compute-0 nova_compute[183075]: 2026-01-22 17:14:16.554 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/84b90c1e-91a0-437d-8ed2-956840c552ab/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:14:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:16.571 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:14:16 compute-0 nova_compute[183075]: 2026-01-22 17:14:16.613 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/84b90c1e-91a0-437d-8ed2-956840c552ab/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:14:16 compute-0 nova_compute[183075]: 2026-01-22 17:14:16.615 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/84b90c1e-91a0-437d-8ed2-956840c552ab/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:14:16 compute-0 nova_compute[183075]: 2026-01-22 17:14:16.673 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/84b90c1e-91a0-437d-8ed2-956840c552ab/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:14:16 compute-0 nova_compute[183075]: 2026-01-22 17:14:16.679 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26367132-bc45-4c8a-bd7e-5c0883453bbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:14:16 compute-0 nova_compute[183075]: 2026-01-22 17:14:16.738 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26367132-bc45-4c8a-bd7e-5c0883453bbd/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:14:16 compute-0 nova_compute[183075]: 2026-01-22 17:14:16.739 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26367132-bc45-4c8a-bd7e-5c0883453bbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:14:16 compute-0 nova_compute[183075]: 2026-01-22 17:14:16.771 183079 INFO nova.compute.manager [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Took 12.09 seconds to build instance.
Jan 22 17:14:16 compute-0 nova_compute[183075]: 2026-01-22 17:14:16.789 183079 DEBUG oslo_concurrency.lockutils [None req-b8062c67-46b9-4182-bb4d-89e9fbdce7e1 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "26367132-bc45-4c8a-bd7e-5c0883453bbd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:14:16 compute-0 nova_compute[183075]: 2026-01-22 17:14:16.794 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26367132-bc45-4c8a-bd7e-5c0883453bbd/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:14:16 compute-0 nova_compute[183075]: 2026-01-22 17:14:16.800 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/613fb094-d482-4b33-92c8-184b010a0169/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:14:16 compute-0 nova_compute[183075]: 2026-01-22 17:14:16.867 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/613fb094-d482-4b33-92c8-184b010a0169/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:14:16 compute-0 nova_compute[183075]: 2026-01-22 17:14:16.868 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/613fb094-d482-4b33-92c8-184b010a0169/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:14:16 compute-0 nova_compute[183075]: 2026-01-22 17:14:16.929 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/613fb094-d482-4b33-92c8-184b010a0169/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:14:17 compute-0 nova_compute[183075]: 2026-01-22 17:14:17.139 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:14:17 compute-0 nova_compute[183075]: 2026-01-22 17:14:17.140 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5271MB free_disk=73.33729553222656GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:14:17 compute-0 nova_compute[183075]: 2026-01-22 17:14:17.141 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:14:17 compute-0 nova_compute[183075]: 2026-01-22 17:14:17.141 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:14:17 compute-0 nova_compute[183075]: 2026-01-22 17:14:17.573 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 84b90c1e-91a0-437d-8ed2-956840c552ab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:14:17 compute-0 nova_compute[183075]: 2026-01-22 17:14:17.573 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 26367132-bc45-4c8a-bd7e-5c0883453bbd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:14:17 compute-0 nova_compute[183075]: 2026-01-22 17:14:17.574 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 613fb094-d482-4b33-92c8-184b010a0169 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:14:17 compute-0 nova_compute[183075]: 2026-01-22 17:14:17.574 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:14:17 compute-0 nova_compute[183075]: 2026-01-22 17:14:17.574 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:14:17 compute-0 nova_compute[183075]: 2026-01-22 17:14:17.723 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:14:17 compute-0 nova_compute[183075]: 2026-01-22 17:14:17.968 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:14:17 compute-0 nova_compute[183075]: 2026-01-22 17:14:17.990 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:17 compute-0 nova_compute[183075]: 2026-01-22 17:14:17.994 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:14:17 compute-0 nova_compute[183075]: 2026-01-22 17:14:17.994 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.853s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:14:18 compute-0 nova_compute[183075]: 2026-01-22 17:14:18.482 183079 INFO nova.compute.manager [None req-37392d32-ee0d-4722-ba8f-650741609360 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Get console output
Jan 22 17:14:18 compute-0 nova_compute[183075]: 2026-01-22 17:14:18.489 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:14:18 compute-0 nova_compute[183075]: 2026-01-22 17:14:18.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:14:18 compute-0 nova_compute[183075]: 2026-01-22 17:14:18.789 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 17:14:18 compute-0 nova_compute[183075]: 2026-01-22 17:14:18.815 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 17:14:20 compute-0 nova_compute[183075]: 2026-01-22 17:14:20.430 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:21 compute-0 nova_compute[183075]: 2026-01-22 17:14:21.205 183079 INFO nova.compute.manager [None req-f94a0047-24dc-46e8-b1e8-df765481ec9f cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Get console output
Jan 22 17:14:21 compute-0 nova_compute[183075]: 2026-01-22 17:14:21.211 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:14:21 compute-0 ovn_controller[95372]: 2026-01-22T17:14:21Z|00318|binding|INFO|Releasing lport 89fb1d7a-a439-4b74-977f-72af8114ea6b from this chassis (sb_readonly=0)
Jan 22 17:14:21 compute-0 ovn_controller[95372]: 2026-01-22T17:14:21Z|00319|binding|INFO|Releasing lport 1759254b-798a-4e65-baf5-489557c1f604 from this chassis (sb_readonly=0)
Jan 22 17:14:21 compute-0 ovn_controller[95372]: 2026-01-22T17:14:21Z|00320|binding|INFO|Releasing lport f5af8e72-5100-4440-84f0-c68eec4b5e5e from this chassis (sb_readonly=0)
Jan 22 17:14:21 compute-0 nova_compute[183075]: 2026-01-22 17:14:21.547 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:22 compute-0 nova_compute[183075]: 2026-01-22 17:14:22.992 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:23 compute-0 podman[222709]: 2026-01-22 17:14:23.376781306 +0000 UTC m=+0.076693131 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 17:14:23 compute-0 nova_compute[183075]: 2026-01-22 17:14:23.614 183079 INFO nova.compute.manager [None req-e018ae1e-a297-4e89-8d03-c9691715da91 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Get console output
Jan 22 17:14:25 compute-0 nova_compute[183075]: 2026-01-22 17:14:25.432 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:25 compute-0 ovn_controller[95372]: 2026-01-22T17:14:25Z|00056|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e1:3a:d3 10.100.0.24
Jan 22 17:14:25 compute-0 ovn_controller[95372]: 2026-01-22T17:14:25Z|00057|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e1:3a:d3 10.100.0.24
Jan 22 17:14:26 compute-0 nova_compute[183075]: 2026-01-22 17:14:26.503 183079 INFO nova.compute.manager [None req-a53499e6-901f-4ab3-a34e-247d90cea288 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Get console output
Jan 22 17:14:26 compute-0 nova_compute[183075]: 2026-01-22 17:14:26.510 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:14:27 compute-0 nova_compute[183075]: 2026-01-22 17:14:27.994 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:28 compute-0 ovn_controller[95372]: 2026-01-22T17:14:28Z|00058|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f2:f5:37 10.100.0.36
Jan 22 17:14:28 compute-0 ovn_controller[95372]: 2026-01-22T17:14:28Z|00059|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f2:f5:37 10.100.0.36
Jan 22 17:14:28 compute-0 nova_compute[183075]: 2026-01-22 17:14:28.744 183079 INFO nova.compute.manager [None req-2fb3c797-3358-4960-8f53-f4248c243dbe 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Get console output
Jan 22 17:14:28 compute-0 nova_compute[183075]: 2026-01-22 17:14:28.748 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:14:30 compute-0 nova_compute[183075]: 2026-01-22 17:14:30.435 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:31.072 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:14:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:31.073 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:14:31 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:14:31 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:14:31 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:14:31 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:14:31 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:14:31 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.24
Jan 22 17:14:31 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:14:31 compute-0 podman[222761]: 2026-01-22 17:14:31.373552482 +0000 UTC m=+0.078419595 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:14:31 compute-0 podman[222762]: 2026-01-22 17:14:31.377492515 +0000 UTC m=+0.080400117 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64, io.openshift.tags=minimal rhel9, distribution-scope=public, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 22 17:14:31 compute-0 podman[222760]: 2026-01-22 17:14:31.407048585 +0000 UTC m=+0.107142754 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 17:14:31 compute-0 nova_compute[183075]: 2026-01-22 17:14:31.715 183079 INFO nova.compute.manager [None req-92271485-397a-4ed6-a2bb-a4ef51fe491f cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Get console output
Jan 22 17:14:31 compute-0 nova_compute[183075]: 2026-01-22 17:14:31.942 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.088 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.089 104990 INFO eventlet.wsgi.server [-] 10.100.0.24,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 1.0162306
Jan 22 17:14:32 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[222598]: 10.100.0.24:41490 [22/Jan/2026:17:14:31.071] listener listener/metadata 0/0/0/1018/1018 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.103 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.104 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.24
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.120 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.120 104990 INFO eventlet.wsgi.server [-] 10.100.0.24,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 168 time: 0.0160453
Jan 22 17:14:32 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[222598]: 10.100.0.24:41502 [22/Jan/2026:17:14:32.103] listener listener/metadata 0/0/0/17/17 200 152 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.124 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.125 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.24
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.138 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.138 104990 INFO eventlet.wsgi.server [-] 10.100.0.24,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0137544
Jan 22 17:14:32 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[222598]: 10.100.0.24:41510 [22/Jan/2026:17:14:32.124] listener listener/metadata 0/0/0/14/14 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.144 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.144 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.24
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.161 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.162 104990 INFO eventlet.wsgi.server [-] 10.100.0.24,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0175130
Jan 22 17:14:32 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[222598]: 10.100.0.24:41514 [22/Jan/2026:17:14:32.143] listener listener/metadata 0/0/0/18/18 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.166 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.167 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.24
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.203 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:14:32 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[222598]: 10.100.0.24:41522 [22/Jan/2026:17:14:32.166] listener listener/metadata 0/0/0/37/37 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.204 104990 INFO eventlet.wsgi.server [-] 10.100.0.24,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0367258
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.213 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.214 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.24
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.239 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.239 104990 INFO eventlet.wsgi.server [-] 10.100.0.24,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0252333
Jan 22 17:14:32 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[222598]: 10.100.0.24:41530 [22/Jan/2026:17:14:32.212] listener listener/metadata 0/0/0/26/26 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.245 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.246 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.24
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:14:32 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[222598]: 10.100.0.24:41542 [22/Jan/2026:17:14:32.245] listener listener/metadata 0/0/0/22/22 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.267 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.268 104990 INFO eventlet.wsgi.server [-] 10.100.0.24,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0218859
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.275 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.275 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.24
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.292 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:14:32 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[222598]: 10.100.0.24:41546 [22/Jan/2026:17:14:32.274] listener listener/metadata 0/0/0/17/17 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.293 104990 INFO eventlet.wsgi.server [-] 10.100.0.24,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0170710
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.300 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.301 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.24
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.319 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:14:32 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[222598]: 10.100.0.24:41554 [22/Jan/2026:17:14:32.299] listener listener/metadata 0/0/0/20/20 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.319 104990 INFO eventlet.wsgi.server [-] 10.100.0.24,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 165 time: 0.0186734
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.329 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.330 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.24
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.344 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.345 104990 INFO eventlet.wsgi.server [-] 10.100.0.24,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 165 time: 0.0145521
Jan 22 17:14:32 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[222598]: 10.100.0.24:41564 [22/Jan/2026:17:14:32.329] listener listener/metadata 0/0/0/15/15 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.353 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.353 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.24
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:14:32 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[222598]: 10.100.0.24:41568 [22/Jan/2026:17:14:32.352] listener listener/metadata 0/0/0/15/15 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.368 104990 INFO eventlet.wsgi.server [-] 10.100.0.24,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0149906
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.389 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.390 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.24
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.408 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.408 104990 INFO eventlet.wsgi.server [-] 10.100.0.24,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0182803
Jan 22 17:14:32 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[222598]: 10.100.0.24:41578 [22/Jan/2026:17:14:32.388] listener listener/metadata 0/0/0/19/19 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.413 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.413 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.24
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.433 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.433 104990 INFO eventlet.wsgi.server [-] 10.100.0.24,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0199754
Jan 22 17:14:32 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[222598]: 10.100.0.24:41594 [22/Jan/2026:17:14:32.412] listener listener/metadata 0/0/0/20/20 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.441 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.442 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.24
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.459 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.459 104990 INFO eventlet.wsgi.server [-] 10.100.0.24,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0172920
Jan 22 17:14:32 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[222598]: 10.100.0.24:41598 [22/Jan/2026:17:14:32.441] listener listener/metadata 0/0/0/18/18 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.469 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.469 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.24
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.487 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.488 104990 INFO eventlet.wsgi.server [-] 10.100.0.24,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 165 time: 0.0182760
Jan 22 17:14:32 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[222598]: 10.100.0.24:41608 [22/Jan/2026:17:14:32.468] listener listener/metadata 0/0/0/19/19 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.497 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.497 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.24
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 0a16be1a-262e-47f7-8518-5f24ee15796e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.513 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:14:32 compute-0 haproxy-metadata-proxy-0a16be1a-262e-47f7-8518-5f24ee15796e[222598]: 10.100.0.24:41622 [22/Jan/2026:17:14:32.496] listener listener/metadata 0/0/0/16/16 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:14:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:32.513 104990 INFO eventlet.wsgi.server [-] 10.100.0.24,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0156806
Jan 22 17:14:32 compute-0 nova_compute[183075]: 2026-01-22 17:14:32.996 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:33 compute-0 nova_compute[183075]: 2026-01-22 17:14:33.901 183079 INFO nova.compute.manager [None req-ab1953c1-09e5-494a-b2ce-e33db86fe675 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Get console output
Jan 22 17:14:33 compute-0 nova_compute[183075]: 2026-01-22 17:14:33.905 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:14:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:35.341 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:14:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:35.342 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:14:35 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:14:35 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:14:35 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:14:35 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:14:35 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:14:35 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.36
Jan 22 17:14:35 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2f6bd5e0-c1b9-4783-b6e7-1932fe18705c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:14:35 compute-0 nova_compute[183075]: 2026-01-22 17:14:35.477 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.225 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.225 104990 INFO eventlet.wsgi.server [-] 10.100.0.36,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.8832824
Jan 22 17:14:36 compute-0 haproxy-metadata-proxy-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c[222680]: 10.100.0.36:33624 [22/Jan/2026:17:14:35.341] listener listener/metadata 0/0/0/884/884 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.234 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.235 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.36
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2f6bd5e0-c1b9-4783-b6e7-1932fe18705c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.249 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.249 104990 INFO eventlet.wsgi.server [-] 10.100.0.36,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0143828
Jan 22 17:14:36 compute-0 haproxy-metadata-proxy-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c[222680]: 10.100.0.36:33634 [22/Jan/2026:17:14:36.234] listener listener/metadata 0/0/0/15/15 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.253 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.253 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.36
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2f6bd5e0-c1b9-4783-b6e7-1932fe18705c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.272 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.272 104990 INFO eventlet.wsgi.server [-] 10.100.0.36,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0189157
Jan 22 17:14:36 compute-0 haproxy-metadata-proxy-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c[222680]: 10.100.0.36:33636 [22/Jan/2026:17:14:36.253] listener listener/metadata 0/0/0/20/20 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.279 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.279 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.36
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2f6bd5e0-c1b9-4783-b6e7-1932fe18705c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.296 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:14:36 compute-0 haproxy-metadata-proxy-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c[222680]: 10.100.0.36:33652 [22/Jan/2026:17:14:36.278] listener listener/metadata 0/0/0/17/17 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.297 104990 INFO eventlet.wsgi.server [-] 10.100.0.36,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0171609
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.300 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.301 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.36
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2f6bd5e0-c1b9-4783-b6e7-1932fe18705c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.324 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.324 104990 INFO eventlet.wsgi.server [-] 10.100.0.36,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0237880
Jan 22 17:14:36 compute-0 haproxy-metadata-proxy-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c[222680]: 10.100.0.36:33660 [22/Jan/2026:17:14:36.300] listener listener/metadata 0/0/0/24/24 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.328 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.328 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.36
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2f6bd5e0-c1b9-4783-b6e7-1932fe18705c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.341 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.342 104990 INFO eventlet.wsgi.server [-] 10.100.0.36,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0134265
Jan 22 17:14:36 compute-0 haproxy-metadata-proxy-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c[222680]: 10.100.0.36:33662 [22/Jan/2026:17:14:36.327] listener listener/metadata 0/0/0/14/14 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.345 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.346 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.36
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2f6bd5e0-c1b9-4783-b6e7-1932fe18705c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.358 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.358 104990 INFO eventlet.wsgi.server [-] 10.100.0.36,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0122464
Jan 22 17:14:36 compute-0 haproxy-metadata-proxy-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c[222680]: 10.100.0.36:33664 [22/Jan/2026:17:14:36.345] listener listener/metadata 0/0/0/13/13 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.363 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.363 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.36
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2f6bd5e0-c1b9-4783-b6e7-1932fe18705c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.375 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.375 104990 INFO eventlet.wsgi.server [-] 10.100.0.36,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0121777
Jan 22 17:14:36 compute-0 haproxy-metadata-proxy-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c[222680]: 10.100.0.36:33680 [22/Jan/2026:17:14:36.362] listener listener/metadata 0/0/0/12/12 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.379 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.379 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.36
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2f6bd5e0-c1b9-4783-b6e7-1932fe18705c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.401 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:14:36 compute-0 haproxy-metadata-proxy-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c[222680]: 10.100.0.36:33696 [22/Jan/2026:17:14:36.379] listener listener/metadata 0/0/0/22/22 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.401 104990 INFO eventlet.wsgi.server [-] 10.100.0.36,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 165 time: 0.0220978
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.405 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.405 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.36
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2f6bd5e0-c1b9-4783-b6e7-1932fe18705c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.424 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:14:36 compute-0 haproxy-metadata-proxy-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c[222680]: 10.100.0.36:33704 [22/Jan/2026:17:14:36.405] listener listener/metadata 0/0/0/19/19 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.424 104990 INFO eventlet.wsgi.server [-] 10.100.0.36,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 165 time: 0.0189760
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.428 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.429 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.36
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2f6bd5e0-c1b9-4783-b6e7-1932fe18705c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:14:36 compute-0 haproxy-metadata-proxy-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c[222680]: 10.100.0.36:33716 [22/Jan/2026:17:14:36.428] listener listener/metadata 0/0/0/13/13 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.442 104990 INFO eventlet.wsgi.server [-] 10.100.0.36,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0129516
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.450 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.451 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.36
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2f6bd5e0-c1b9-4783-b6e7-1932fe18705c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.468 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.469 104990 INFO eventlet.wsgi.server [-] 10.100.0.36,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0177784
Jan 22 17:14:36 compute-0 haproxy-metadata-proxy-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c[222680]: 10.100.0.36:33722 [22/Jan/2026:17:14:36.450] listener listener/metadata 0/0/0/18/18 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.472 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.473 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.36
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2f6bd5e0-c1b9-4783-b6e7-1932fe18705c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.484 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:14:36 compute-0 haproxy-metadata-proxy-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c[222680]: 10.100.0.36:33738 [22/Jan/2026:17:14:36.472] listener listener/metadata 0/0/0/12/12 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.485 104990 INFO eventlet.wsgi.server [-] 10.100.0.36,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0120008
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.489 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.489 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.36
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2f6bd5e0-c1b9-4783-b6e7-1932fe18705c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.503 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:14:36 compute-0 haproxy-metadata-proxy-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c[222680]: 10.100.0.36:33742 [22/Jan/2026:17:14:36.488] listener listener/metadata 0/0/0/15/15 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.504 104990 INFO eventlet.wsgi.server [-] 10.100.0.36,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0143719
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.508 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.508 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.36
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2f6bd5e0-c1b9-4783-b6e7-1932fe18705c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.519 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.520 104990 INFO eventlet.wsgi.server [-] 10.100.0.36,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 165 time: 0.0116286
Jan 22 17:14:36 compute-0 haproxy-metadata-proxy-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c[222680]: 10.100.0.36:33758 [22/Jan/2026:17:14:36.507] listener listener/metadata 0/0/0/12/12 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.524 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.525 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.36
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2f6bd5e0-c1b9-4783-b6e7-1932fe18705c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.541 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:14:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:36.541 104990 INFO eventlet.wsgi.server [-] 10.100.0.36,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0163217
Jan 22 17:14:36 compute-0 haproxy-metadata-proxy-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c[222680]: 10.100.0.36:33764 [22/Jan/2026:17:14:36.524] listener listener/metadata 0/0/0/17/17 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:14:37 compute-0 nova_compute[183075]: 2026-01-22 17:14:37.225 183079 INFO nova.compute.manager [None req-4c0d63b5-5b63-4a95-b812-55bc32bd05e1 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Get console output
Jan 22 17:14:37 compute-0 nova_compute[183075]: 2026-01-22 17:14:37.231 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:14:37 compute-0 podman[222826]: 2026-01-22 17:14:37.344607108 +0000 UTC m=+0.054156483 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 17:14:37 compute-0 nova_compute[183075]: 2026-01-22 17:14:37.998 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:38 compute-0 nova_compute[183075]: 2026-01-22 17:14:38.305 183079 DEBUG oslo_concurrency.lockutils [None req-9aa93046-039d-473a-b61e-b65fe579d0e5 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "613fb094-d482-4b33-92c8-184b010a0169" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:14:38 compute-0 nova_compute[183075]: 2026-01-22 17:14:38.305 183079 DEBUG oslo_concurrency.lockutils [None req-9aa93046-039d-473a-b61e-b65fe579d0e5 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "613fb094-d482-4b33-92c8-184b010a0169" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:14:38 compute-0 nova_compute[183075]: 2026-01-22 17:14:38.305 183079 DEBUG oslo_concurrency.lockutils [None req-9aa93046-039d-473a-b61e-b65fe579d0e5 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "613fb094-d482-4b33-92c8-184b010a0169-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:14:38 compute-0 nova_compute[183075]: 2026-01-22 17:14:38.305 183079 DEBUG oslo_concurrency.lockutils [None req-9aa93046-039d-473a-b61e-b65fe579d0e5 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "613fb094-d482-4b33-92c8-184b010a0169-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:14:38 compute-0 nova_compute[183075]: 2026-01-22 17:14:38.306 183079 DEBUG oslo_concurrency.lockutils [None req-9aa93046-039d-473a-b61e-b65fe579d0e5 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "613fb094-d482-4b33-92c8-184b010a0169-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:14:38 compute-0 nova_compute[183075]: 2026-01-22 17:14:38.307 183079 INFO nova.compute.manager [None req-9aa93046-039d-473a-b61e-b65fe579d0e5 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Terminating instance
Jan 22 17:14:38 compute-0 nova_compute[183075]: 2026-01-22 17:14:38.307 183079 DEBUG nova.compute.manager [None req-9aa93046-039d-473a-b61e-b65fe579d0e5 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:14:38 compute-0 kernel: tap1e2e1eb8-ee (unregistering): left promiscuous mode
Jan 22 17:14:38 compute-0 NetworkManager[55454]: <info>  [1769102078.3312] device (tap1e2e1eb8-ee): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:14:38 compute-0 ovn_controller[95372]: 2026-01-22T17:14:38Z|00321|binding|INFO|Releasing lport 1e2e1eb8-eea7-4040-8e90-9c98522a01da from this chassis (sb_readonly=0)
Jan 22 17:14:38 compute-0 ovn_controller[95372]: 2026-01-22T17:14:38Z|00322|binding|INFO|Setting lport 1e2e1eb8-eea7-4040-8e90-9c98522a01da down in Southbound
Jan 22 17:14:38 compute-0 ovn_controller[95372]: 2026-01-22T17:14:38Z|00323|binding|INFO|Removing iface tap1e2e1eb8-ee ovn-installed in OVS
Jan 22 17:14:38 compute-0 nova_compute[183075]: 2026-01-22 17:14:38.341 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:38 compute-0 nova_compute[183075]: 2026-01-22 17:14:38.343 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:38.347 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e1:3a:d3 10.100.0.24'], port_security=['fa:16:3e:e1:3a:d3 10.100.0.24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.24/28', 'neutron:device_id': '613fb094-d482-4b33-92c8-184b010a0169', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a16be1a-262e-47f7-8518-5f24ee15796e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e05c7aae349e4a1d859a387df45650a0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b9cc1ac9-85f4-4da7-891e-b0644711e574', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5b6ccb16-1216-4deb-9d72-42005a3163bb, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=1e2e1eb8-eea7-4040-8e90-9c98522a01da) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:14:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:38.348 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 1e2e1eb8-eea7-4040-8e90-9c98522a01da in datapath 0a16be1a-262e-47f7-8518-5f24ee15796e unbound from our chassis
Jan 22 17:14:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:38.349 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0a16be1a-262e-47f7-8518-5f24ee15796e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:14:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:38.351 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[815be1b4-5233-4baa-ab0a-a6d8c3b473d0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:38.351 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e namespace which is not needed anymore
Jan 22 17:14:38 compute-0 nova_compute[183075]: 2026-01-22 17:14:38.353 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:38 compute-0 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Jan 22 17:14:38 compute-0 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d0000001c.scope: Consumed 12.960s CPU time.
Jan 22 17:14:38 compute-0 systemd-machined[154382]: Machine qemu-27-instance-0000001c terminated.
Jan 22 17:14:38 compute-0 neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e[222592]: [NOTICE]   (222596) : haproxy version is 2.8.14-c23fe91
Jan 22 17:14:38 compute-0 neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e[222592]: [NOTICE]   (222596) : path to executable is /usr/sbin/haproxy
Jan 22 17:14:38 compute-0 neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e[222592]: [ALERT]    (222596) : Current worker (222598) exited with code 143 (Terminated)
Jan 22 17:14:38 compute-0 neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e[222592]: [WARNING]  (222596) : All workers exited. Exiting... (0)
Jan 22 17:14:38 compute-0 systemd[1]: libpod-e10c75bf8e526c31a039b24eb52457fe532227aeb8eae403d1ec3f110578d5ac.scope: Deactivated successfully.
Jan 22 17:14:38 compute-0 podman[222871]: 2026-01-22 17:14:38.512800034 +0000 UTC m=+0.071024242 container died e10c75bf8e526c31a039b24eb52457fe532227aeb8eae403d1ec3f110578d5ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:14:38 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e10c75bf8e526c31a039b24eb52457fe532227aeb8eae403d1ec3f110578d5ac-userdata-shm.mount: Deactivated successfully.
Jan 22 17:14:38 compute-0 systemd[1]: var-lib-containers-storage-overlay-41007d1b5a1e8b2b463f75b32e45b0842b07a46ef9248ff75a8cbaa3c4d92c41-merged.mount: Deactivated successfully.
Jan 22 17:14:38 compute-0 podman[222871]: 2026-01-22 17:14:38.556592666 +0000 UTC m=+0.114816874 container cleanup e10c75bf8e526c31a039b24eb52457fe532227aeb8eae403d1ec3f110578d5ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:14:38 compute-0 systemd[1]: libpod-conmon-e10c75bf8e526c31a039b24eb52457fe532227aeb8eae403d1ec3f110578d5ac.scope: Deactivated successfully.
Jan 22 17:14:38 compute-0 nova_compute[183075]: 2026-01-22 17:14:38.574 183079 INFO nova.virt.libvirt.driver [-] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Instance destroyed successfully.
Jan 22 17:14:38 compute-0 nova_compute[183075]: 2026-01-22 17:14:38.575 183079 DEBUG nova.objects.instance [None req-9aa93046-039d-473a-b61e-b65fe579d0e5 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lazy-loading 'resources' on Instance uuid 613fb094-d482-4b33-92c8-184b010a0169 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:14:38 compute-0 nova_compute[183075]: 2026-01-22 17:14:38.588 183079 DEBUG nova.virt.libvirt.vif [None req-9aa93046-039d-473a-b61e-b65fe579d0e5 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:14:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-551951220',display_name='tempest-server-test-551951220',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-551951220',id=28,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFroJSJbEvCYe2sCwXkwL5KPVfwqqRPNqrWU5nnhv8U4G8wYWbEqk5AtRWXltmX7CAIPBST+sTwCKfymhALKd54XNN92D6KKnKgA6jmOihENah4FGfU80fx1UEE0osLpiw==',key_name='tempest-keypair-test-618317104',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:14:14Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e05c7aae349e4a1d859a387df45650a0',ramdisk_id='',reservation_id='r-d35qou6r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIpSeparateNetwork-931877966',owner_user_name='tempest-FloatingIpSeparateNetwork-931877966-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:14:15Z,user_data=None,user_id='cd47d63cff2548a88e21e5c2e6a5c161',uuid=613fb094-d482-4b33-92c8-184b010a0169,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1e2e1eb8-eea7-4040-8e90-9c98522a01da", "address": "fa:16:3e:e1:3a:d3", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e2e1eb8-ee", "ovs_interfaceid": "1e2e1eb8-eea7-4040-8e90-9c98522a01da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:14:38 compute-0 nova_compute[183075]: 2026-01-22 17:14:38.588 183079 DEBUG nova.network.os_vif_util [None req-9aa93046-039d-473a-b61e-b65fe579d0e5 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converting VIF {"id": "1e2e1eb8-eea7-4040-8e90-9c98522a01da", "address": "fa:16:3e:e1:3a:d3", "network": {"id": "0a16be1a-262e-47f7-8518-5f24ee15796e", "bridge": "br-int", "label": "tempest-test-network--472783449", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e2e1eb8-ee", "ovs_interfaceid": "1e2e1eb8-eea7-4040-8e90-9c98522a01da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:14:38 compute-0 nova_compute[183075]: 2026-01-22 17:14:38.589 183079 DEBUG nova.network.os_vif_util [None req-9aa93046-039d-473a-b61e-b65fe579d0e5 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e1:3a:d3,bridge_name='br-int',has_traffic_filtering=True,id=1e2e1eb8-eea7-4040-8e90-9c98522a01da,network=Network(0a16be1a-262e-47f7-8518-5f24ee15796e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1e2e1eb8-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:14:38 compute-0 nova_compute[183075]: 2026-01-22 17:14:38.589 183079 DEBUG os_vif [None req-9aa93046-039d-473a-b61e-b65fe579d0e5 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e1:3a:d3,bridge_name='br-int',has_traffic_filtering=True,id=1e2e1eb8-eea7-4040-8e90-9c98522a01da,network=Network(0a16be1a-262e-47f7-8518-5f24ee15796e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1e2e1eb8-ee') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:14:38 compute-0 nova_compute[183075]: 2026-01-22 17:14:38.591 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:38 compute-0 nova_compute[183075]: 2026-01-22 17:14:38.591 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1e2e1eb8-ee, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:14:38 compute-0 nova_compute[183075]: 2026-01-22 17:14:38.593 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:38 compute-0 nova_compute[183075]: 2026-01-22 17:14:38.594 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:38 compute-0 nova_compute[183075]: 2026-01-22 17:14:38.596 183079 INFO os_vif [None req-9aa93046-039d-473a-b61e-b65fe579d0e5 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e1:3a:d3,bridge_name='br-int',has_traffic_filtering=True,id=1e2e1eb8-eea7-4040-8e90-9c98522a01da,network=Network(0a16be1a-262e-47f7-8518-5f24ee15796e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1e2e1eb8-ee')
Jan 22 17:14:38 compute-0 nova_compute[183075]: 2026-01-22 17:14:38.597 183079 INFO nova.virt.libvirt.driver [None req-9aa93046-039d-473a-b61e-b65fe579d0e5 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Deleting instance files /var/lib/nova/instances/613fb094-d482-4b33-92c8-184b010a0169_del
Jan 22 17:14:38 compute-0 nova_compute[183075]: 2026-01-22 17:14:38.597 183079 INFO nova.virt.libvirt.driver [None req-9aa93046-039d-473a-b61e-b65fe579d0e5 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Deletion of /var/lib/nova/instances/613fb094-d482-4b33-92c8-184b010a0169_del complete
Jan 22 17:14:38 compute-0 podman[222914]: 2026-01-22 17:14:38.638892152 +0000 UTC m=+0.050617671 container remove e10c75bf8e526c31a039b24eb52457fe532227aeb8eae403d1ec3f110578d5ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:14:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:38.645 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c0c85f21-094d-4110-b30f-59db5ca714eb]: (4, ('Thu Jan 22 05:14:38 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e (e10c75bf8e526c31a039b24eb52457fe532227aeb8eae403d1ec3f110578d5ac)\ne10c75bf8e526c31a039b24eb52457fe532227aeb8eae403d1ec3f110578d5ac\nThu Jan 22 05:14:38 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e (e10c75bf8e526c31a039b24eb52457fe532227aeb8eae403d1ec3f110578d5ac)\ne10c75bf8e526c31a039b24eb52457fe532227aeb8eae403d1ec3f110578d5ac\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:38.647 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e2cbaf93-8013-4940-a774-3ff825e9c76e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:38.648 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a16be1a-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:14:38 compute-0 nova_compute[183075]: 2026-01-22 17:14:38.650 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:38 compute-0 kernel: tap0a16be1a-20: left promiscuous mode
Jan 22 17:14:38 compute-0 nova_compute[183075]: 2026-01-22 17:14:38.668 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:38.671 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c84ec6bd-b5c9-45d7-b2de-c7aca288400f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:38 compute-0 nova_compute[183075]: 2026-01-22 17:14:38.674 183079 INFO nova.compute.manager [None req-9aa93046-039d-473a-b61e-b65fe579d0e5 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Took 0.37 seconds to destroy the instance on the hypervisor.
Jan 22 17:14:38 compute-0 nova_compute[183075]: 2026-01-22 17:14:38.675 183079 DEBUG oslo.service.loopingcall [None req-9aa93046-039d-473a-b61e-b65fe579d0e5 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:14:38 compute-0 nova_compute[183075]: 2026-01-22 17:14:38.675 183079 DEBUG nova.compute.manager [-] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:14:38 compute-0 nova_compute[183075]: 2026-01-22 17:14:38.676 183079 DEBUG nova.network.neutron [-] [instance: 613fb094-d482-4b33-92c8-184b010a0169] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:14:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:38.688 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[57fe27d5-5f09-4c17-89e9-b17ed8c8f0fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:38.689 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[48d64f77-32f2-42c3-a655-b40b88f020f0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:38.707 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[37026092-4cf4-46b9-a674-bc0357f8e73d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441685, 'reachable_time': 26377, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222929, 'error': None, 'target': 'ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:38.711 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0a16be1a-262e-47f7-8518-5f24ee15796e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:14:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:38.711 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[11ca0945-f5a4-4894-a43b-4092d76c0c7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:38 compute-0 systemd[1]: run-netns-ovnmeta\x2d0a16be1a\x2d262e\x2d47f7\x2d8518\x2d5f24ee15796e.mount: Deactivated successfully.
Jan 22 17:14:39 compute-0 nova_compute[183075]: 2026-01-22 17:14:39.075 183079 INFO nova.compute.manager [None req-efa6602e-8979-4f92-98cf-621e55406b57 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Get console output
Jan 22 17:14:39 compute-0 nova_compute[183075]: 2026-01-22 17:14:39.079 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:14:39 compute-0 nova_compute[183075]: 2026-01-22 17:14:39.267 183079 DEBUG nova.compute.manager [req-147b4cc7-c733-47d5-baf0-25d4fe35e158 req-72c9c19a-6422-43ef-93d5-b7472433dcb7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Received event network-vif-unplugged-1e2e1eb8-eea7-4040-8e90-9c98522a01da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:14:39 compute-0 nova_compute[183075]: 2026-01-22 17:14:39.268 183079 DEBUG oslo_concurrency.lockutils [req-147b4cc7-c733-47d5-baf0-25d4fe35e158 req-72c9c19a-6422-43ef-93d5-b7472433dcb7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "613fb094-d482-4b33-92c8-184b010a0169-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:14:39 compute-0 nova_compute[183075]: 2026-01-22 17:14:39.269 183079 DEBUG oslo_concurrency.lockutils [req-147b4cc7-c733-47d5-baf0-25d4fe35e158 req-72c9c19a-6422-43ef-93d5-b7472433dcb7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "613fb094-d482-4b33-92c8-184b010a0169-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:14:39 compute-0 nova_compute[183075]: 2026-01-22 17:14:39.269 183079 DEBUG oslo_concurrency.lockutils [req-147b4cc7-c733-47d5-baf0-25d4fe35e158 req-72c9c19a-6422-43ef-93d5-b7472433dcb7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "613fb094-d482-4b33-92c8-184b010a0169-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:14:39 compute-0 nova_compute[183075]: 2026-01-22 17:14:39.270 183079 DEBUG nova.compute.manager [req-147b4cc7-c733-47d5-baf0-25d4fe35e158 req-72c9c19a-6422-43ef-93d5-b7472433dcb7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] No waiting events found dispatching network-vif-unplugged-1e2e1eb8-eea7-4040-8e90-9c98522a01da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:14:39 compute-0 nova_compute[183075]: 2026-01-22 17:14:39.271 183079 DEBUG nova.compute.manager [req-147b4cc7-c733-47d5-baf0-25d4fe35e158 req-72c9c19a-6422-43ef-93d5-b7472433dcb7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Received event network-vif-unplugged-1e2e1eb8-eea7-4040-8e90-9c98522a01da for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:14:39 compute-0 nova_compute[183075]: 2026-01-22 17:14:39.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.076 183079 DEBUG nova.network.neutron [-] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.093 183079 INFO nova.compute.manager [-] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Took 1.42 seconds to deallocate network for instance.
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.129 183079 DEBUG oslo_concurrency.lockutils [None req-9aa93046-039d-473a-b61e-b65fe579d0e5 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.130 183079 DEBUG oslo_concurrency.lockutils [None req-9aa93046-039d-473a-b61e-b65fe579d0e5 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.218 183079 DEBUG nova.compute.provider_tree [None req-9aa93046-039d-473a-b61e-b65fe579d0e5 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.233 183079 DEBUG nova.scheduler.client.report [None req-9aa93046-039d-473a-b61e-b65fe579d0e5 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.253 183079 DEBUG oslo_concurrency.lockutils [None req-9aa93046-039d-473a-b61e-b65fe579d0e5 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.290 183079 INFO nova.scheduler.client.report [None req-9aa93046-039d-473a-b61e-b65fe579d0e5 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Deleted allocations for instance 613fb094-d482-4b33-92c8-184b010a0169
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.366 183079 DEBUG oslo_concurrency.lockutils [None req-9aa93046-039d-473a-b61e-b65fe579d0e5 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "613fb094-d482-4b33-92c8-184b010a0169" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.061s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.480 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.631 183079 DEBUG oslo_concurrency.lockutils [None req-20815005-1d37-4839-9bda-9b2588eaaf51 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "84b90c1e-91a0-437d-8ed2-956840c552ab" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.632 183079 DEBUG oslo_concurrency.lockutils [None req-20815005-1d37-4839-9bda-9b2588eaaf51 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "84b90c1e-91a0-437d-8ed2-956840c552ab" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.632 183079 DEBUG oslo_concurrency.lockutils [None req-20815005-1d37-4839-9bda-9b2588eaaf51 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "84b90c1e-91a0-437d-8ed2-956840c552ab-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.632 183079 DEBUG oslo_concurrency.lockutils [None req-20815005-1d37-4839-9bda-9b2588eaaf51 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "84b90c1e-91a0-437d-8ed2-956840c552ab-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.633 183079 DEBUG oslo_concurrency.lockutils [None req-20815005-1d37-4839-9bda-9b2588eaaf51 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "84b90c1e-91a0-437d-8ed2-956840c552ab-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.634 183079 INFO nova.compute.manager [None req-20815005-1d37-4839-9bda-9b2588eaaf51 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Terminating instance
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.635 183079 DEBUG nova.compute.manager [None req-20815005-1d37-4839-9bda-9b2588eaaf51 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:14:40 compute-0 kernel: tapb3bc8962-61 (unregistering): left promiscuous mode
Jan 22 17:14:40 compute-0 NetworkManager[55454]: <info>  [1769102080.6620] device (tapb3bc8962-61): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:14:40 compute-0 ovn_controller[95372]: 2026-01-22T17:14:40Z|00324|binding|INFO|Releasing lport b3bc8962-61ba-4d8d-9a4a-705e9e713574 from this chassis (sb_readonly=0)
Jan 22 17:14:40 compute-0 ovn_controller[95372]: 2026-01-22T17:14:40Z|00325|binding|INFO|Setting lport b3bc8962-61ba-4d8d-9a4a-705e9e713574 down in Southbound
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.664 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:40 compute-0 ovn_controller[95372]: 2026-01-22T17:14:40Z|00326|binding|INFO|Removing iface tapb3bc8962-61 ovn-installed in OVS
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.667 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:40.674 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:ac:67 10.100.0.5'], port_security=['fa:16:3e:24:ac:67 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '84b90c1e-91a0-437d-8ed2-956840c552ab', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-576f6598-999f-46d9-809a-65b7475a1ec7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e05c7aae349e4a1d859a387df45650a0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b9cc1ac9-85f4-4da7-891e-b0644711e574', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.229', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=92087573-6410-4f80-a195-ce8b2058420e, chassis=[], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=b3bc8962-61ba-4d8d-9a4a-705e9e713574) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:14:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:40.676 104629 INFO neutron.agent.ovn.metadata.agent [-] Port b3bc8962-61ba-4d8d-9a4a-705e9e713574 in datapath 576f6598-999f-46d9-809a-65b7475a1ec7 unbound from our chassis
Jan 22 17:14:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:40.680 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 576f6598-999f-46d9-809a-65b7475a1ec7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:14:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:40.681 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1cd0c115-feb3-44a2-907f-58eed44a437c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:40.682 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7 namespace which is not needed anymore
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.696 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:40 compute-0 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Jan 22 17:14:40 compute-0 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d0000001a.scope: Consumed 14.467s CPU time.
Jan 22 17:14:40 compute-0 systemd-machined[154382]: Machine qemu-26-instance-0000001a terminated.
Jan 22 17:14:40 compute-0 neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7[221942]: [NOTICE]   (221950) : haproxy version is 2.8.14-c23fe91
Jan 22 17:14:40 compute-0 neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7[221942]: [NOTICE]   (221950) : path to executable is /usr/sbin/haproxy
Jan 22 17:14:40 compute-0 neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7[221942]: [WARNING]  (221950) : Exiting Master process...
Jan 22 17:14:40 compute-0 neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7[221942]: [ALERT]    (221950) : Current worker (221952) exited with code 143 (Terminated)
Jan 22 17:14:40 compute-0 neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7[221942]: [WARNING]  (221950) : All workers exited. Exiting... (0)
Jan 22 17:14:40 compute-0 systemd[1]: libpod-bfe269bc1798759d85db1012917eca0350a00fc4b919aa8f7ee10ddd0066f52b.scope: Deactivated successfully.
Jan 22 17:14:40 compute-0 podman[222952]: 2026-01-22 17:14:40.829948146 +0000 UTC m=+0.053542967 container died bfe269bc1798759d85db1012917eca0350a00fc4b919aa8f7ee10ddd0066f52b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:14:40 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bfe269bc1798759d85db1012917eca0350a00fc4b919aa8f7ee10ddd0066f52b-userdata-shm.mount: Deactivated successfully.
Jan 22 17:14:40 compute-0 systemd[1]: var-lib-containers-storage-overlay-a3ef0fb847366aceadce50dea20996790699a5583a4e3df818ef68e2a96d69f2-merged.mount: Deactivated successfully.
Jan 22 17:14:40 compute-0 podman[222952]: 2026-01-22 17:14:40.887060455 +0000 UTC m=+0.110655236 container cleanup bfe269bc1798759d85db1012917eca0350a00fc4b919aa8f7ee10ddd0066f52b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 17:14:40 compute-0 systemd[1]: libpod-conmon-bfe269bc1798759d85db1012917eca0350a00fc4b919aa8f7ee10ddd0066f52b.scope: Deactivated successfully.
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.908 183079 INFO nova.virt.libvirt.driver [-] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Instance destroyed successfully.
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.908 183079 DEBUG nova.objects.instance [None req-20815005-1d37-4839-9bda-9b2588eaaf51 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lazy-loading 'resources' on Instance uuid 84b90c1e-91a0-437d-8ed2-956840c552ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.921 183079 DEBUG nova.virt.libvirt.vif [None req-20815005-1d37-4839-9bda-9b2588eaaf51 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:13:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1407213754',display_name='tempest-server-test-1407213754',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1407213754',id=26,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFroJSJbEvCYe2sCwXkwL5KPVfwqqRPNqrWU5nnhv8U4G8wYWbEqk5AtRWXltmX7CAIPBST+sTwCKfymhALKd54XNN92D6KKnKgA6jmOihENah4FGfU80fx1UEE0osLpiw==',key_name='tempest-keypair-test-618317104',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:13:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e05c7aae349e4a1d859a387df45650a0',ramdisk_id='',reservation_id='r-f5ng9jql',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIpSeparateNetwork-931877966',owner_user_name='tempest-FloatingIpSeparateNetwork-931877966-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:13:33Z,user_data=None,user_id='cd47d63cff2548a88e21e5c2e6a5c161',uuid=84b90c1e-91a0-437d-8ed2-956840c552ab,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b3bc8962-61ba-4d8d-9a4a-705e9e713574", "address": "fa:16:3e:24:ac:67", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3bc8962-61", "ovs_interfaceid": "b3bc8962-61ba-4d8d-9a4a-705e9e713574", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.922 183079 DEBUG nova.network.os_vif_util [None req-20815005-1d37-4839-9bda-9b2588eaaf51 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converting VIF {"id": "b3bc8962-61ba-4d8d-9a4a-705e9e713574", "address": "fa:16:3e:24:ac:67", "network": {"id": "576f6598-999f-46d9-809a-65b7475a1ec7", "bridge": "br-int", "label": "tempest-test-network--17141989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e05c7aae349e4a1d859a387df45650a0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3bc8962-61", "ovs_interfaceid": "b3bc8962-61ba-4d8d-9a4a-705e9e713574", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.922 183079 DEBUG nova.network.os_vif_util [None req-20815005-1d37-4839-9bda-9b2588eaaf51 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:ac:67,bridge_name='br-int',has_traffic_filtering=True,id=b3bc8962-61ba-4d8d-9a4a-705e9e713574,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb3bc8962-61') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.923 183079 DEBUG os_vif [None req-20815005-1d37-4839-9bda-9b2588eaaf51 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:ac:67,bridge_name='br-int',has_traffic_filtering=True,id=b3bc8962-61ba-4d8d-9a4a-705e9e713574,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb3bc8962-61') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.924 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.924 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb3bc8962-61, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.962 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.964 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.966 183079 INFO os_vif [None req-20815005-1d37-4839-9bda-9b2588eaaf51 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:ac:67,bridge_name='br-int',has_traffic_filtering=True,id=b3bc8962-61ba-4d8d-9a4a-705e9e713574,network=Network(576f6598-999f-46d9-809a-65b7475a1ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb3bc8962-61')
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.966 183079 INFO nova.virt.libvirt.driver [None req-20815005-1d37-4839-9bda-9b2588eaaf51 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Deleting instance files /var/lib/nova/instances/84b90c1e-91a0-437d-8ed2-956840c552ab_del
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.967 183079 INFO nova.virt.libvirt.driver [None req-20815005-1d37-4839-9bda-9b2588eaaf51 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Deletion of /var/lib/nova/instances/84b90c1e-91a0-437d-8ed2-956840c552ab_del complete
Jan 22 17:14:40 compute-0 podman[222996]: 2026-01-22 17:14:40.97200895 +0000 UTC m=+0.055292363 container remove bfe269bc1798759d85db1012917eca0350a00fc4b919aa8f7ee10ddd0066f52b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:14:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:40.977 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[78f71128-506f-4050-aa0c-067c4e9bf6ab]: (4, ('Thu Jan 22 05:14:40 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7 (bfe269bc1798759d85db1012917eca0350a00fc4b919aa8f7ee10ddd0066f52b)\nbfe269bc1798759d85db1012917eca0350a00fc4b919aa8f7ee10ddd0066f52b\nThu Jan 22 05:14:40 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7 (bfe269bc1798759d85db1012917eca0350a00fc4b919aa8f7ee10ddd0066f52b)\nbfe269bc1798759d85db1012917eca0350a00fc4b919aa8f7ee10ddd0066f52b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:40.978 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1dc7f7d6-3507-4b77-acab-4b99cf61f3fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:40.979 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap576f6598-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.980 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:40 compute-0 kernel: tap576f6598-90: left promiscuous mode
Jan 22 17:14:40 compute-0 nova_compute[183075]: 2026-01-22 17:14:40.997 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:41.000 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[94ea7465-2520-4056-a5bf-0d7adad13bed]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:41 compute-0 nova_compute[183075]: 2026-01-22 17:14:41.012 183079 INFO nova.compute.manager [None req-20815005-1d37-4839-9bda-9b2588eaaf51 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Took 0.38 seconds to destroy the instance on the hypervisor.
Jan 22 17:14:41 compute-0 nova_compute[183075]: 2026-01-22 17:14:41.012 183079 DEBUG oslo.service.loopingcall [None req-20815005-1d37-4839-9bda-9b2588eaaf51 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:14:41 compute-0 nova_compute[183075]: 2026-01-22 17:14:41.013 183079 DEBUG nova.compute.manager [-] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:14:41 compute-0 nova_compute[183075]: 2026-01-22 17:14:41.013 183079 DEBUG nova.network.neutron [-] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:14:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:41.021 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4ffe849a-786c-4327-844c-ceb506fd3b33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:41.022 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e82155ea-b08a-4472-83d7-8272a651833b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:41.035 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f4136fb4-4e0a-4036-ae19-c9d99c8d9687]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 437681, 'reachable_time': 17351, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223012, 'error': None, 'target': 'ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:41.037 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-576f6598-999f-46d9-809a-65b7475a1ec7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:14:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:41.037 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[9c6981e5-4d10-4630-a671-9049217b5a80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:14:41 compute-0 systemd[1]: run-netns-ovnmeta\x2d576f6598\x2d999f\x2d46d9\x2d809a\x2d65b7475a1ec7.mount: Deactivated successfully.
Jan 22 17:14:41 compute-0 nova_compute[183075]: 2026-01-22 17:14:41.372 183079 DEBUG nova.compute.manager [req-2ebbc5b5-5632-45f7-af6a-8bc27eb8d774 req-0e6706c9-3fd8-4c32-bd49-597da9439b3d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Received event network-vif-plugged-1e2e1eb8-eea7-4040-8e90-9c98522a01da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:14:41 compute-0 nova_compute[183075]: 2026-01-22 17:14:41.373 183079 DEBUG oslo_concurrency.lockutils [req-2ebbc5b5-5632-45f7-af6a-8bc27eb8d774 req-0e6706c9-3fd8-4c32-bd49-597da9439b3d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "613fb094-d482-4b33-92c8-184b010a0169-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:14:41 compute-0 nova_compute[183075]: 2026-01-22 17:14:41.374 183079 DEBUG oslo_concurrency.lockutils [req-2ebbc5b5-5632-45f7-af6a-8bc27eb8d774 req-0e6706c9-3fd8-4c32-bd49-597da9439b3d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "613fb094-d482-4b33-92c8-184b010a0169-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:14:41 compute-0 nova_compute[183075]: 2026-01-22 17:14:41.374 183079 DEBUG oslo_concurrency.lockutils [req-2ebbc5b5-5632-45f7-af6a-8bc27eb8d774 req-0e6706c9-3fd8-4c32-bd49-597da9439b3d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "613fb094-d482-4b33-92c8-184b010a0169-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:14:41 compute-0 nova_compute[183075]: 2026-01-22 17:14:41.374 183079 DEBUG nova.compute.manager [req-2ebbc5b5-5632-45f7-af6a-8bc27eb8d774 req-0e6706c9-3fd8-4c32-bd49-597da9439b3d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] No waiting events found dispatching network-vif-plugged-1e2e1eb8-eea7-4040-8e90-9c98522a01da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:14:41 compute-0 nova_compute[183075]: 2026-01-22 17:14:41.375 183079 WARNING nova.compute.manager [req-2ebbc5b5-5632-45f7-af6a-8bc27eb8d774 req-0e6706c9-3fd8-4c32-bd49-597da9439b3d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Received unexpected event network-vif-plugged-1e2e1eb8-eea7-4040-8e90-9c98522a01da for instance with vm_state deleted and task_state None.
Jan 22 17:14:41 compute-0 nova_compute[183075]: 2026-01-22 17:14:41.375 183079 DEBUG nova.compute.manager [req-2ebbc5b5-5632-45f7-af6a-8bc27eb8d774 req-0e6706c9-3fd8-4c32-bd49-597da9439b3d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Received event network-vif-unplugged-b3bc8962-61ba-4d8d-9a4a-705e9e713574 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:14:41 compute-0 nova_compute[183075]: 2026-01-22 17:14:41.376 183079 DEBUG oslo_concurrency.lockutils [req-2ebbc5b5-5632-45f7-af6a-8bc27eb8d774 req-0e6706c9-3fd8-4c32-bd49-597da9439b3d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "84b90c1e-91a0-437d-8ed2-956840c552ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:14:41 compute-0 nova_compute[183075]: 2026-01-22 17:14:41.376 183079 DEBUG oslo_concurrency.lockutils [req-2ebbc5b5-5632-45f7-af6a-8bc27eb8d774 req-0e6706c9-3fd8-4c32-bd49-597da9439b3d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "84b90c1e-91a0-437d-8ed2-956840c552ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:14:41 compute-0 nova_compute[183075]: 2026-01-22 17:14:41.377 183079 DEBUG oslo_concurrency.lockutils [req-2ebbc5b5-5632-45f7-af6a-8bc27eb8d774 req-0e6706c9-3fd8-4c32-bd49-597da9439b3d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "84b90c1e-91a0-437d-8ed2-956840c552ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:14:41 compute-0 nova_compute[183075]: 2026-01-22 17:14:41.377 183079 DEBUG nova.compute.manager [req-2ebbc5b5-5632-45f7-af6a-8bc27eb8d774 req-0e6706c9-3fd8-4c32-bd49-597da9439b3d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] No waiting events found dispatching network-vif-unplugged-b3bc8962-61ba-4d8d-9a4a-705e9e713574 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:14:41 compute-0 nova_compute[183075]: 2026-01-22 17:14:41.378 183079 DEBUG nova.compute.manager [req-2ebbc5b5-5632-45f7-af6a-8bc27eb8d774 req-0e6706c9-3fd8-4c32-bd49-597da9439b3d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Received event network-vif-unplugged-b3bc8962-61ba-4d8d-9a4a-705e9e713574 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:14:41 compute-0 nova_compute[183075]: 2026-01-22 17:14:41.378 183079 DEBUG nova.compute.manager [req-2ebbc5b5-5632-45f7-af6a-8bc27eb8d774 req-0e6706c9-3fd8-4c32-bd49-597da9439b3d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Received event network-vif-plugged-b3bc8962-61ba-4d8d-9a4a-705e9e713574 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:14:41 compute-0 nova_compute[183075]: 2026-01-22 17:14:41.378 183079 DEBUG oslo_concurrency.lockutils [req-2ebbc5b5-5632-45f7-af6a-8bc27eb8d774 req-0e6706c9-3fd8-4c32-bd49-597da9439b3d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "84b90c1e-91a0-437d-8ed2-956840c552ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:14:41 compute-0 nova_compute[183075]: 2026-01-22 17:14:41.379 183079 DEBUG oslo_concurrency.lockutils [req-2ebbc5b5-5632-45f7-af6a-8bc27eb8d774 req-0e6706c9-3fd8-4c32-bd49-597da9439b3d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "84b90c1e-91a0-437d-8ed2-956840c552ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:14:41 compute-0 nova_compute[183075]: 2026-01-22 17:14:41.379 183079 DEBUG oslo_concurrency.lockutils [req-2ebbc5b5-5632-45f7-af6a-8bc27eb8d774 req-0e6706c9-3fd8-4c32-bd49-597da9439b3d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "84b90c1e-91a0-437d-8ed2-956840c552ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:14:41 compute-0 nova_compute[183075]: 2026-01-22 17:14:41.380 183079 DEBUG nova.compute.manager [req-2ebbc5b5-5632-45f7-af6a-8bc27eb8d774 req-0e6706c9-3fd8-4c32-bd49-597da9439b3d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] No waiting events found dispatching network-vif-plugged-b3bc8962-61ba-4d8d-9a4a-705e9e713574 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:14:41 compute-0 nova_compute[183075]: 2026-01-22 17:14:41.380 183079 WARNING nova.compute.manager [req-2ebbc5b5-5632-45f7-af6a-8bc27eb8d774 req-0e6706c9-3fd8-4c32-bd49-597da9439b3d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Received unexpected event network-vif-plugged-b3bc8962-61ba-4d8d-9a4a-705e9e713574 for instance with vm_state active and task_state deleting.
Jan 22 17:14:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:41.930 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:14:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:41.931 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:14:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:14:41.932 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:14:42 compute-0 nova_compute[183075]: 2026-01-22 17:14:42.097 183079 DEBUG nova.network.neutron [-] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:14:42 compute-0 nova_compute[183075]: 2026-01-22 17:14:42.119 183079 INFO nova.compute.manager [-] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Took 1.11 seconds to deallocate network for instance.
Jan 22 17:14:42 compute-0 nova_compute[183075]: 2026-01-22 17:14:42.164 183079 DEBUG oslo_concurrency.lockutils [None req-20815005-1d37-4839-9bda-9b2588eaaf51 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:14:42 compute-0 nova_compute[183075]: 2026-01-22 17:14:42.164 183079 DEBUG oslo_concurrency.lockutils [None req-20815005-1d37-4839-9bda-9b2588eaaf51 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:14:42 compute-0 nova_compute[183075]: 2026-01-22 17:14:42.260 183079 DEBUG nova.compute.provider_tree [None req-20815005-1d37-4839-9bda-9b2588eaaf51 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:14:42 compute-0 nova_compute[183075]: 2026-01-22 17:14:42.315 183079 DEBUG nova.scheduler.client.report [None req-20815005-1d37-4839-9bda-9b2588eaaf51 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:14:42 compute-0 nova_compute[183075]: 2026-01-22 17:14:42.343 183079 DEBUG oslo_concurrency.lockutils [None req-20815005-1d37-4839-9bda-9b2588eaaf51 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:14:42 compute-0 nova_compute[183075]: 2026-01-22 17:14:42.362 183079 INFO nova.scheduler.client.report [None req-20815005-1d37-4839-9bda-9b2588eaaf51 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Deleted allocations for instance 84b90c1e-91a0-437d-8ed2-956840c552ab
Jan 22 17:14:42 compute-0 nova_compute[183075]: 2026-01-22 17:14:42.684 183079 DEBUG oslo_concurrency.lockutils [None req-20815005-1d37-4839-9bda-9b2588eaaf51 cd47d63cff2548a88e21e5c2e6a5c161 e05c7aae349e4a1d859a387df45650a0 - - default default] Lock "84b90c1e-91a0-437d-8ed2-956840c552ab" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.052s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:14:43 compute-0 podman[223013]: 2026-01-22 17:14:43.356086286 +0000 UTC m=+0.067770238 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:14:44 compute-0 nova_compute[183075]: 2026-01-22 17:14:44.288 183079 INFO nova.compute.manager [None req-40f1e34e-71ec-4c44-8e53-e1d015f102b4 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Get console output
Jan 22 17:14:44 compute-0 nova_compute[183075]: 2026-01-22 17:14:44.294 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:14:45 compute-0 nova_compute[183075]: 2026-01-22 17:14:45.525 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:45 compute-0 nova_compute[183075]: 2026-01-22 17:14:45.963 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:46 compute-0 nova_compute[183075]: 2026-01-22 17:14:46.616 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:49 compute-0 nova_compute[183075]: 2026-01-22 17:14:49.445 183079 INFO nova.compute.manager [None req-9a85a23a-72d3-4258-82ae-c46a20f97ff3 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Get console output
Jan 22 17:14:49 compute-0 nova_compute[183075]: 2026-01-22 17:14:49.450 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:14:50 compute-0 nova_compute[183075]: 2026-01-22 17:14:50.571 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:50 compute-0 nova_compute[183075]: 2026-01-22 17:14:50.965 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:52 compute-0 ovn_controller[95372]: 2026-01-22T17:14:52Z|00327|binding|INFO|Releasing lport 89fb1d7a-a439-4b74-977f-72af8114ea6b from this chassis (sb_readonly=0)
Jan 22 17:14:52 compute-0 nova_compute[183075]: 2026-01-22 17:14:52.323 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:52 compute-0 ovn_controller[95372]: 2026-01-22T17:14:52Z|00328|binding|INFO|Releasing lport 89fb1d7a-a439-4b74-977f-72af8114ea6b from this chassis (sb_readonly=0)
Jan 22 17:14:52 compute-0 nova_compute[183075]: 2026-01-22 17:14:52.552 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:53 compute-0 nova_compute[183075]: 2026-01-22 17:14:53.573 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769102078.5727708, 613fb094-d482-4b33-92c8-184b010a0169 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:14:53 compute-0 nova_compute[183075]: 2026-01-22 17:14:53.574 183079 INFO nova.compute.manager [-] [instance: 613fb094-d482-4b33-92c8-184b010a0169] VM Stopped (Lifecycle Event)
Jan 22 17:14:53 compute-0 nova_compute[183075]: 2026-01-22 17:14:53.598 183079 DEBUG nova.compute.manager [None req-f116be7e-cdbd-444d-bbca-7d6f794d58bc - - - - - -] [instance: 613fb094-d482-4b33-92c8-184b010a0169] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:14:54 compute-0 podman[223038]: 2026-01-22 17:14:54.33743268 +0000 UTC m=+0.050880800 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:14:54 compute-0 nova_compute[183075]: 2026-01-22 17:14:54.711 183079 INFO nova.compute.manager [None req-e22fccf5-f81c-4fdd-82fb-e0cc6b47ab31 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Get console output
Jan 22 17:14:54 compute-0 nova_compute[183075]: 2026-01-22 17:14:54.717 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:14:55 compute-0 nova_compute[183075]: 2026-01-22 17:14:55.668 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:14:55 compute-0 nova_compute[183075]: 2026-01-22 17:14:55.906 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769102080.9046066, 84b90c1e-91a0-437d-8ed2-956840c552ab => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:14:55 compute-0 nova_compute[183075]: 2026-01-22 17:14:55.906 183079 INFO nova.compute.manager [-] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] VM Stopped (Lifecycle Event)
Jan 22 17:14:55 compute-0 nova_compute[183075]: 2026-01-22 17:14:55.922 183079 DEBUG nova.compute.manager [None req-c4cafab9-cda4-4939-ba1a-21035b0bf099 - - - - - -] [instance: 84b90c1e-91a0-437d-8ed2-956840c552ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:14:55 compute-0 nova_compute[183075]: 2026-01-22 17:14:55.966 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:00 compute-0 nova_compute[183075]: 2026-01-22 17:15:00.016 183079 INFO nova.compute.manager [None req-307866fa-7c12-4286-bcfc-966c61af3d43 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Get console output
Jan 22 17:15:00 compute-0 nova_compute[183075]: 2026-01-22 17:15:00.020 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:15:00 compute-0 nova_compute[183075]: 2026-01-22 17:15:00.669 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:00 compute-0 nova_compute[183075]: 2026-01-22 17:15:00.968 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:02 compute-0 podman[223064]: 2026-01-22 17:15:02.378903332 +0000 UTC m=+0.077151606 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:15:02 compute-0 podman[223065]: 2026-01-22 17:15:02.389821248 +0000 UTC m=+0.082754243 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9)
Jan 22 17:15:02 compute-0 podman[223063]: 2026-01-22 17:15:02.429280709 +0000 UTC m=+0.125408618 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 17:15:05 compute-0 nova_compute[183075]: 2026-01-22 17:15:05.216 183079 INFO nova.compute.manager [None req-d98a48ed-a76c-42ac-93e1-cadcb06512fc 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Get console output
Jan 22 17:15:05 compute-0 nova_compute[183075]: 2026-01-22 17:15:05.222 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:15:05 compute-0 nova_compute[183075]: 2026-01-22 17:15:05.671 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:05 compute-0 nova_compute[183075]: 2026-01-22 17:15:05.970 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:06 compute-0 nova_compute[183075]: 2026-01-22 17:15:06.803 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:15:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:08.267 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:15:08 compute-0 nova_compute[183075]: 2026-01-22 17:15:08.268 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:08.268 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:15:08 compute-0 podman[223126]: 2026-01-22 17:15:08.382782193 +0000 UTC m=+0.076065588 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 22 17:15:08 compute-0 nova_compute[183075]: 2026-01-22 17:15:08.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:15:10 compute-0 nova_compute[183075]: 2026-01-22 17:15:10.388 183079 INFO nova.compute.manager [None req-0c10bca9-1f88-4782-a176-ebbb06f45130 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Get console output
Jan 22 17:15:10 compute-0 nova_compute[183075]: 2026-01-22 17:15:10.396 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:15:10 compute-0 nova_compute[183075]: 2026-01-22 17:15:10.716 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:10 compute-0 nova_compute[183075]: 2026-01-22 17:15:10.971 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:12 compute-0 nova_compute[183075]: 2026-01-22 17:15:12.096 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:12 compute-0 NetworkManager[55454]: <info>  [1769102112.0976] manager: (patch-br-int-to-provnet-c63dea44-3fe3-44c3-ba47-76ee1ea0ad94): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/137)
Jan 22 17:15:12 compute-0 NetworkManager[55454]: <info>  [1769102112.1000] manager: (patch-provnet-c63dea44-3fe3-44c3-ba47-76ee1ea0ad94-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/138)
Jan 22 17:15:12 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 17:15:12 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 17:15:12 compute-0 nova_compute[183075]: 2026-01-22 17:15:12.253 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:12 compute-0 ovn_controller[95372]: 2026-01-22T17:15:12Z|00329|binding|INFO|Releasing lport 89fb1d7a-a439-4b74-977f-72af8114ea6b from this chassis (sb_readonly=0)
Jan 22 17:15:12 compute-0 nova_compute[183075]: 2026-01-22 17:15:12.280 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:12 compute-0 nova_compute[183075]: 2026-01-22 17:15:12.502 183079 DEBUG nova.compute.manager [req-0cfdafe2-5d0c-49bc-8f31-c42b72ec7d87 req-eeff3a10-80b5-4ccc-b282-84a35616c46a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Received event network-changed-76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:15:12 compute-0 nova_compute[183075]: 2026-01-22 17:15:12.503 183079 DEBUG nova.compute.manager [req-0cfdafe2-5d0c-49bc-8f31-c42b72ec7d87 req-eeff3a10-80b5-4ccc-b282-84a35616c46a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Refreshing instance network info cache due to event network-changed-76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:15:12 compute-0 nova_compute[183075]: 2026-01-22 17:15:12.503 183079 DEBUG oslo_concurrency.lockutils [req-0cfdafe2-5d0c-49bc-8f31-c42b72ec7d87 req-eeff3a10-80b5-4ccc-b282-84a35616c46a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-26367132-bc45-4c8a-bd7e-5c0883453bbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:15:12 compute-0 nova_compute[183075]: 2026-01-22 17:15:12.504 183079 DEBUG oslo_concurrency.lockutils [req-0cfdafe2-5d0c-49bc-8f31-c42b72ec7d87 req-eeff3a10-80b5-4ccc-b282-84a35616c46a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-26367132-bc45-4c8a-bd7e-5c0883453bbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:15:12 compute-0 nova_compute[183075]: 2026-01-22 17:15:12.504 183079 DEBUG nova.network.neutron [req-0cfdafe2-5d0c-49bc-8f31-c42b72ec7d87 req-eeff3a10-80b5-4ccc-b282-84a35616c46a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Refreshing network info cache for port 76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:15:12 compute-0 nova_compute[183075]: 2026-01-22 17:15:12.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:15:12 compute-0 nova_compute[183075]: 2026-01-22 17:15:12.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:15:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:13.271 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:15:13 compute-0 nova_compute[183075]: 2026-01-22 17:15:13.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:15:13 compute-0 nova_compute[183075]: 2026-01-22 17:15:13.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:15:13 compute-0 nova_compute[183075]: 2026-01-22 17:15:13.789 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:15:14 compute-0 nova_compute[183075]: 2026-01-22 17:15:14.020 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-26367132-bc45-4c8a-bd7e-5c0883453bbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:15:14 compute-0 nova_compute[183075]: 2026-01-22 17:15:14.110 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:14 compute-0 podman[223148]: 2026-01-22 17:15:14.368744127 +0000 UTC m=+0.069644641 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 17:15:14 compute-0 nova_compute[183075]: 2026-01-22 17:15:14.815 183079 DEBUG nova.network.neutron [req-0cfdafe2-5d0c-49bc-8f31-c42b72ec7d87 req-eeff3a10-80b5-4ccc-b282-84a35616c46a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Updated VIF entry in instance network info cache for port 76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:15:14 compute-0 nova_compute[183075]: 2026-01-22 17:15:14.815 183079 DEBUG nova.network.neutron [req-0cfdafe2-5d0c-49bc-8f31-c42b72ec7d87 req-eeff3a10-80b5-4ccc-b282-84a35616c46a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Updating instance_info_cache with network_info: [{"id": "76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885", "address": "fa:16:3e:f2:f5:37", "network": {"id": "2f6bd5e0-c1b9-4783-b6e7-1932fe18705c", "bridge": "br-int", "label": "tempest-test-network--1617607678", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.36", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap76cfd2b1-7a", "ovs_interfaceid": "76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:15:14 compute-0 nova_compute[183075]: 2026-01-22 17:15:14.867 183079 DEBUG oslo_concurrency.lockutils [req-0cfdafe2-5d0c-49bc-8f31-c42b72ec7d87 req-eeff3a10-80b5-4ccc-b282-84a35616c46a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-26367132-bc45-4c8a-bd7e-5c0883453bbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:15:14 compute-0 nova_compute[183075]: 2026-01-22 17:15:14.868 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-26367132-bc45-4c8a-bd7e-5c0883453bbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:15:14 compute-0 nova_compute[183075]: 2026-01-22 17:15:14.868 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:15:14 compute-0 nova_compute[183075]: 2026-01-22 17:15:14.869 183079 DEBUG nova.objects.instance [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 26367132-bc45-4c8a-bd7e-5c0883453bbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:15:15 compute-0 nova_compute[183075]: 2026-01-22 17:15:15.760 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:15 compute-0 nova_compute[183075]: 2026-01-22 17:15:15.974 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:16 compute-0 nova_compute[183075]: 2026-01-22 17:15:16.943 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Updating instance_info_cache with network_info: [{"id": "76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885", "address": "fa:16:3e:f2:f5:37", "network": {"id": "2f6bd5e0-c1b9-4783-b6e7-1932fe18705c", "bridge": "br-int", "label": "tempest-test-network--1617607678", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.36", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap76cfd2b1-7a", "ovs_interfaceid": "76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:15:16 compute-0 nova_compute[183075]: 2026-01-22 17:15:16.958 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-26367132-bc45-4c8a-bd7e-5c0883453bbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:15:16 compute-0 nova_compute[183075]: 2026-01-22 17:15:16.958 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:15:16 compute-0 nova_compute[183075]: 2026-01-22 17:15:16.958 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:15:16 compute-0 nova_compute[183075]: 2026-01-22 17:15:16.959 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:15:16 compute-0 nova_compute[183075]: 2026-01-22 17:15:16.959 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:15:16 compute-0 nova_compute[183075]: 2026-01-22 17:15:16.959 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:15:16 compute-0 nova_compute[183075]: 2026-01-22 17:15:16.989 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:15:16 compute-0 nova_compute[183075]: 2026-01-22 17:15:16.989 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:15:16 compute-0 nova_compute[183075]: 2026-01-22 17:15:16.989 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:15:16 compute-0 nova_compute[183075]: 2026-01-22 17:15:16.990 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:15:17 compute-0 nova_compute[183075]: 2026-01-22 17:15:17.055 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26367132-bc45-4c8a-bd7e-5c0883453bbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:15:17 compute-0 nova_compute[183075]: 2026-01-22 17:15:17.131 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26367132-bc45-4c8a-bd7e-5c0883453bbd/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:15:17 compute-0 nova_compute[183075]: 2026-01-22 17:15:17.132 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26367132-bc45-4c8a-bd7e-5c0883453bbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:15:17 compute-0 nova_compute[183075]: 2026-01-22 17:15:17.186 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26367132-bc45-4c8a-bd7e-5c0883453bbd/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:15:17 compute-0 nova_compute[183075]: 2026-01-22 17:15:17.376 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:15:17 compute-0 nova_compute[183075]: 2026-01-22 17:15:17.377 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5511MB free_disk=73.33826446533203GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:15:17 compute-0 nova_compute[183075]: 2026-01-22 17:15:17.377 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:15:17 compute-0 nova_compute[183075]: 2026-01-22 17:15:17.378 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:15:17 compute-0 nova_compute[183075]: 2026-01-22 17:15:17.457 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 26367132-bc45-4c8a-bd7e-5c0883453bbd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:15:17 compute-0 nova_compute[183075]: 2026-01-22 17:15:17.458 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:15:17 compute-0 nova_compute[183075]: 2026-01-22 17:15:17.458 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:15:17 compute-0 nova_compute[183075]: 2026-01-22 17:15:17.496 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:15:17 compute-0 nova_compute[183075]: 2026-01-22 17:15:17.507 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:15:17 compute-0 nova_compute[183075]: 2026-01-22 17:15:17.528 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:15:17 compute-0 nova_compute[183075]: 2026-01-22 17:15:17.528 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:15:18 compute-0 nova_compute[183075]: 2026-01-22 17:15:18.523 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:15:20 compute-0 nova_compute[183075]: 2026-01-22 17:15:20.622 183079 DEBUG oslo_concurrency.lockutils [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Acquiring lock "1b9e0b4e-34d1-46d1-8f04-a07da354e704" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:15:20 compute-0 nova_compute[183075]: 2026-01-22 17:15:20.622 183079 DEBUG oslo_concurrency.lockutils [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Lock "1b9e0b4e-34d1-46d1-8f04-a07da354e704" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:15:20 compute-0 nova_compute[183075]: 2026-01-22 17:15:20.647 183079 DEBUG nova.compute.manager [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:15:20 compute-0 nova_compute[183075]: 2026-01-22 17:15:20.807 183079 DEBUG oslo_concurrency.lockutils [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:15:20 compute-0 nova_compute[183075]: 2026-01-22 17:15:20.808 183079 DEBUG oslo_concurrency.lockutils [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:15:20 compute-0 nova_compute[183075]: 2026-01-22 17:15:20.809 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:20 compute-0 nova_compute[183075]: 2026-01-22 17:15:20.820 183079 DEBUG nova.virt.hardware [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:15:20 compute-0 nova_compute[183075]: 2026-01-22 17:15:20.821 183079 INFO nova.compute.claims [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:15:20 compute-0 nova_compute[183075]: 2026-01-22 17:15:20.963 183079 DEBUG nova.compute.provider_tree [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:15:20 compute-0 nova_compute[183075]: 2026-01-22 17:15:20.975 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:20 compute-0 nova_compute[183075]: 2026-01-22 17:15:20.992 183079 DEBUG nova.scheduler.client.report [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:15:21 compute-0 nova_compute[183075]: 2026-01-22 17:15:21.027 183079 DEBUG oslo_concurrency.lockutils [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.219s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:15:21 compute-0 nova_compute[183075]: 2026-01-22 17:15:21.029 183079 DEBUG nova.compute.manager [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:15:21 compute-0 nova_compute[183075]: 2026-01-22 17:15:21.093 183079 DEBUG nova.compute.manager [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:15:21 compute-0 nova_compute[183075]: 2026-01-22 17:15:21.094 183079 DEBUG nova.network.neutron [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:15:21 compute-0 nova_compute[183075]: 2026-01-22 17:15:21.125 183079 INFO nova.virt.libvirt.driver [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:15:21 compute-0 nova_compute[183075]: 2026-01-22 17:15:21.150 183079 DEBUG nova.compute.manager [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:15:21 compute-0 nova_compute[183075]: 2026-01-22 17:15:21.262 183079 DEBUG nova.compute.manager [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:15:21 compute-0 nova_compute[183075]: 2026-01-22 17:15:21.263 183079 DEBUG nova.virt.libvirt.driver [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:15:21 compute-0 nova_compute[183075]: 2026-01-22 17:15:21.263 183079 INFO nova.virt.libvirt.driver [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Creating image(s)
Jan 22 17:15:21 compute-0 nova_compute[183075]: 2026-01-22 17:15:21.264 183079 DEBUG oslo_concurrency.lockutils [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Acquiring lock "/var/lib/nova/instances/1b9e0b4e-34d1-46d1-8f04-a07da354e704/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:15:21 compute-0 nova_compute[183075]: 2026-01-22 17:15:21.264 183079 DEBUG oslo_concurrency.lockutils [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Lock "/var/lib/nova/instances/1b9e0b4e-34d1-46d1-8f04-a07da354e704/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:15:21 compute-0 nova_compute[183075]: 2026-01-22 17:15:21.265 183079 DEBUG oslo_concurrency.lockutils [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Lock "/var/lib/nova/instances/1b9e0b4e-34d1-46d1-8f04-a07da354e704/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:15:21 compute-0 nova_compute[183075]: 2026-01-22 17:15:21.278 183079 DEBUG oslo_concurrency.processutils [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:15:21 compute-0 nova_compute[183075]: 2026-01-22 17:15:21.355 183079 DEBUG oslo_concurrency.processutils [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:15:21 compute-0 nova_compute[183075]: 2026-01-22 17:15:21.356 183079 DEBUG oslo_concurrency.lockutils [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:15:21 compute-0 nova_compute[183075]: 2026-01-22 17:15:21.357 183079 DEBUG oslo_concurrency.lockutils [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:15:21 compute-0 nova_compute[183075]: 2026-01-22 17:15:21.373 183079 DEBUG oslo_concurrency.processutils [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:15:21 compute-0 nova_compute[183075]: 2026-01-22 17:15:21.430 183079 DEBUG oslo_concurrency.processutils [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:15:21 compute-0 nova_compute[183075]: 2026-01-22 17:15:21.431 183079 DEBUG oslo_concurrency.processutils [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/1b9e0b4e-34d1-46d1-8f04-a07da354e704/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:15:21 compute-0 nova_compute[183075]: 2026-01-22 17:15:21.476 183079 DEBUG oslo_concurrency.processutils [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/1b9e0b4e-34d1-46d1-8f04-a07da354e704/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:15:21 compute-0 nova_compute[183075]: 2026-01-22 17:15:21.477 183079 DEBUG oslo_concurrency.lockutils [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:15:21 compute-0 nova_compute[183075]: 2026-01-22 17:15:21.477 183079 DEBUG oslo_concurrency.processutils [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:15:21 compute-0 nova_compute[183075]: 2026-01-22 17:15:21.567 183079 DEBUG oslo_concurrency.processutils [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:15:21 compute-0 nova_compute[183075]: 2026-01-22 17:15:21.568 183079 DEBUG nova.virt.disk.api [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Checking if we can resize image /var/lib/nova/instances/1b9e0b4e-34d1-46d1-8f04-a07da354e704/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:15:21 compute-0 nova_compute[183075]: 2026-01-22 17:15:21.568 183079 DEBUG oslo_concurrency.processutils [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b9e0b4e-34d1-46d1-8f04-a07da354e704/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:15:21 compute-0 nova_compute[183075]: 2026-01-22 17:15:21.602 183079 DEBUG nova.policy [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '21741b1e79254e698cc6d7684318589f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8d764c6e1fdd46b88f83657e6a259c71', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:15:21 compute-0 nova_compute[183075]: 2026-01-22 17:15:21.628 183079 DEBUG oslo_concurrency.processutils [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b9e0b4e-34d1-46d1-8f04-a07da354e704/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:15:21 compute-0 nova_compute[183075]: 2026-01-22 17:15:21.630 183079 DEBUG nova.virt.disk.api [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Cannot resize image /var/lib/nova/instances/1b9e0b4e-34d1-46d1-8f04-a07da354e704/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:15:21 compute-0 nova_compute[183075]: 2026-01-22 17:15:21.630 183079 DEBUG nova.objects.instance [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Lazy-loading 'migration_context' on Instance uuid 1b9e0b4e-34d1-46d1-8f04-a07da354e704 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:15:21 compute-0 nova_compute[183075]: 2026-01-22 17:15:21.645 183079 DEBUG nova.virt.libvirt.driver [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:15:21 compute-0 nova_compute[183075]: 2026-01-22 17:15:21.645 183079 DEBUG nova.virt.libvirt.driver [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Ensure instance console log exists: /var/lib/nova/instances/1b9e0b4e-34d1-46d1-8f04-a07da354e704/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:15:21 compute-0 nova_compute[183075]: 2026-01-22 17:15:21.646 183079 DEBUG oslo_concurrency.lockutils [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:15:21 compute-0 nova_compute[183075]: 2026-01-22 17:15:21.646 183079 DEBUG oslo_concurrency.lockutils [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:15:21 compute-0 nova_compute[183075]: 2026-01-22 17:15:21.646 183079 DEBUG oslo_concurrency.lockutils [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:15:22 compute-0 nova_compute[183075]: 2026-01-22 17:15:22.624 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:22 compute-0 nova_compute[183075]: 2026-01-22 17:15:22.666 183079 DEBUG nova.network.neutron [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Successfully updated port: 00a6b90a-3de5-45e7-93ca-e2b72cac1de3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:15:22 compute-0 nova_compute[183075]: 2026-01-22 17:15:22.680 183079 DEBUG oslo_concurrency.lockutils [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Acquiring lock "refresh_cache-1b9e0b4e-34d1-46d1-8f04-a07da354e704" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:15:22 compute-0 nova_compute[183075]: 2026-01-22 17:15:22.680 183079 DEBUG oslo_concurrency.lockutils [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Acquired lock "refresh_cache-1b9e0b4e-34d1-46d1-8f04-a07da354e704" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:15:22 compute-0 nova_compute[183075]: 2026-01-22 17:15:22.680 183079 DEBUG nova.network.neutron [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:15:22 compute-0 nova_compute[183075]: 2026-01-22 17:15:22.785 183079 DEBUG nova.compute.manager [req-fa388a56-6a69-48e0-adde-f72ccdcfeb6f req-78afcf46-e55b-43e1-9f15-a14f9304af9e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Received event network-changed-00a6b90a-3de5-45e7-93ca-e2b72cac1de3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:15:22 compute-0 nova_compute[183075]: 2026-01-22 17:15:22.786 183079 DEBUG nova.compute.manager [req-fa388a56-6a69-48e0-adde-f72ccdcfeb6f req-78afcf46-e55b-43e1-9f15-a14f9304af9e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Refreshing instance network info cache due to event network-changed-00a6b90a-3de5-45e7-93ca-e2b72cac1de3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:15:22 compute-0 nova_compute[183075]: 2026-01-22 17:15:22.786 183079 DEBUG oslo_concurrency.lockutils [req-fa388a56-6a69-48e0-adde-f72ccdcfeb6f req-78afcf46-e55b-43e1-9f15-a14f9304af9e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-1b9e0b4e-34d1-46d1-8f04-a07da354e704" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:15:22 compute-0 nova_compute[183075]: 2026-01-22 17:15:22.967 183079 DEBUG nova.network.neutron [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.081 183079 DEBUG nova.network.neutron [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Updating instance_info_cache with network_info: [{"id": "00a6b90a-3de5-45e7-93ca-e2b72cac1de3", "address": "fa:16:3e:05:67:11", "network": {"id": "922155c4-0d93-4488-a55e-c0d6583804c3", "bridge": "br-int", "label": "tempest-DHCPTest-396339124", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8d764c6e1fdd46b88f83657e6a259c71", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00a6b90a-3d", "ovs_interfaceid": "00a6b90a-3de5-45e7-93ca-e2b72cac1de3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.103 183079 DEBUG oslo_concurrency.lockutils [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Releasing lock "refresh_cache-1b9e0b4e-34d1-46d1-8f04-a07da354e704" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.104 183079 DEBUG nova.compute.manager [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Instance network_info: |[{"id": "00a6b90a-3de5-45e7-93ca-e2b72cac1de3", "address": "fa:16:3e:05:67:11", "network": {"id": "922155c4-0d93-4488-a55e-c0d6583804c3", "bridge": "br-int", "label": "tempest-DHCPTest-396339124", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8d764c6e1fdd46b88f83657e6a259c71", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00a6b90a-3d", "ovs_interfaceid": "00a6b90a-3de5-45e7-93ca-e2b72cac1de3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.105 183079 DEBUG oslo_concurrency.lockutils [req-fa388a56-6a69-48e0-adde-f72ccdcfeb6f req-78afcf46-e55b-43e1-9f15-a14f9304af9e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-1b9e0b4e-34d1-46d1-8f04-a07da354e704" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.106 183079 DEBUG nova.network.neutron [req-fa388a56-6a69-48e0-adde-f72ccdcfeb6f req-78afcf46-e55b-43e1-9f15-a14f9304af9e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Refreshing network info cache for port 00a6b90a-3de5-45e7-93ca-e2b72cac1de3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.111 183079 DEBUG nova.virt.libvirt.driver [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Start _get_guest_xml network_info=[{"id": "00a6b90a-3de5-45e7-93ca-e2b72cac1de3", "address": "fa:16:3e:05:67:11", "network": {"id": "922155c4-0d93-4488-a55e-c0d6583804c3", "bridge": "br-int", "label": "tempest-DHCPTest-396339124", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8d764c6e1fdd46b88f83657e6a259c71", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00a6b90a-3d", "ovs_interfaceid": "00a6b90a-3de5-45e7-93ca-e2b72cac1de3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.120 183079 WARNING nova.virt.libvirt.driver [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.131 183079 DEBUG nova.virt.libvirt.host [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.132 183079 DEBUG nova.virt.libvirt.host [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.137 183079 DEBUG nova.virt.libvirt.host [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.138 183079 DEBUG nova.virt.libvirt.host [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.139 183079 DEBUG nova.virt.libvirt.driver [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.139 183079 DEBUG nova.virt.hardware [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.140 183079 DEBUG nova.virt.hardware [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.140 183079 DEBUG nova.virt.hardware [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.140 183079 DEBUG nova.virt.hardware [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.141 183079 DEBUG nova.virt.hardware [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.141 183079 DEBUG nova.virt.hardware [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.141 183079 DEBUG nova.virt.hardware [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.141 183079 DEBUG nova.virt.hardware [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.142 183079 DEBUG nova.virt.hardware [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.142 183079 DEBUG nova.virt.hardware [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.142 183079 DEBUG nova.virt.hardware [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.146 183079 DEBUG nova.virt.libvirt.vif [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:15:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1891638859',display_name='tempest-server-test-1891638859',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1891638859',id=29,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHJ2WpIHRA1kq4lV633wE2x21ZQiUKsNno6rE9q+Yvr8+SBuOLxYX/SrJrCgwnCnqd0qro7QnVhx9iXp6Xfs9luRAXTITM2MCnWaFzPevTHxPPALi/yKqLeXytZxdZTrkw==',key_name='tempest-DHCPTest-396339124',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8d764c6e1fdd46b88f83657e6a259c71',ramdisk_id='',reservation_id='r-rf98r04v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DHCPTest-488220837',owner_user_name='tempest-DHCPTest-488220837-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:15:21Z,user_data=None,user_id='21741b1e79254e698cc6d7684318589f',uuid=1b9e0b4e-34d1-46d1-8f04-a07da354e704,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "00a6b90a-3de5-45e7-93ca-e2b72cac1de3", "address": "fa:16:3e:05:67:11", "network": {"id": "922155c4-0d93-4488-a55e-c0d6583804c3", "bridge": "br-int", "label": "tempest-DHCPTest-396339124", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8d764c6e1fdd46b88f83657e6a259c71", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00a6b90a-3d", "ovs_interfaceid": "00a6b90a-3de5-45e7-93ca-e2b72cac1de3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.146 183079 DEBUG nova.network.os_vif_util [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Converting VIF {"id": "00a6b90a-3de5-45e7-93ca-e2b72cac1de3", "address": "fa:16:3e:05:67:11", "network": {"id": "922155c4-0d93-4488-a55e-c0d6583804c3", "bridge": "br-int", "label": "tempest-DHCPTest-396339124", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8d764c6e1fdd46b88f83657e6a259c71", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00a6b90a-3d", "ovs_interfaceid": "00a6b90a-3de5-45e7-93ca-e2b72cac1de3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.147 183079 DEBUG nova.network.os_vif_util [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:67:11,bridge_name='br-int',has_traffic_filtering=True,id=00a6b90a-3de5-45e7-93ca-e2b72cac1de3,network=Network(922155c4-0d93-4488-a55e-c0d6583804c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap00a6b90a-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.148 183079 DEBUG nova.objects.instance [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1b9e0b4e-34d1-46d1-8f04-a07da354e704 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.163 183079 DEBUG nova.virt.libvirt.driver [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:15:24 compute-0 nova_compute[183075]:   <uuid>1b9e0b4e-34d1-46d1-8f04-a07da354e704</uuid>
Jan 22 17:15:24 compute-0 nova_compute[183075]:   <name>instance-0000001d</name>
Jan 22 17:15:24 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:15:24 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:15:24 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:15:24 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1891638859</nova:name>
Jan 22 17:15:24 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:15:24</nova:creationTime>
Jan 22 17:15:24 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:15:24 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:15:24 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:15:24 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:15:24 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:15:24 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:15:24 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:15:24 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:15:24 compute-0 nova_compute[183075]:         <nova:user uuid="21741b1e79254e698cc6d7684318589f">tempest-DHCPTest-488220837-project-member</nova:user>
Jan 22 17:15:24 compute-0 nova_compute[183075]:         <nova:project uuid="8d764c6e1fdd46b88f83657e6a259c71">tempest-DHCPTest-488220837</nova:project>
Jan 22 17:15:24 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:15:24 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:15:24 compute-0 nova_compute[183075]:         <nova:port uuid="00a6b90a-3de5-45e7-93ca-e2b72cac1de3">
Jan 22 17:15:24 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:15:24 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:15:24 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:15:24 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <system>
Jan 22 17:15:24 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:15:24 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:15:24 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:15:24 compute-0 nova_compute[183075]:       <entry name="serial">1b9e0b4e-34d1-46d1-8f04-a07da354e704</entry>
Jan 22 17:15:24 compute-0 nova_compute[183075]:       <entry name="uuid">1b9e0b4e-34d1-46d1-8f04-a07da354e704</entry>
Jan 22 17:15:24 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     </system>
Jan 22 17:15:24 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:15:24 compute-0 nova_compute[183075]:   <os>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:   </os>
Jan 22 17:15:24 compute-0 nova_compute[183075]:   <features>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:   </features>
Jan 22 17:15:24 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:15:24 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:15:24 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:15:24 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/1b9e0b4e-34d1-46d1-8f04-a07da354e704/disk"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:15:24 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:05:67:11"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:       <target dev="tap00a6b90a-3d"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:15:24 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/1b9e0b4e-34d1-46d1-8f04-a07da354e704/console.log" append="off"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <video>
Jan 22 17:15:24 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     </video>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:15:24 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:15:24 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:15:24 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:15:24 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:15:24 compute-0 nova_compute[183075]: </domain>
Jan 22 17:15:24 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.164 183079 DEBUG nova.compute.manager [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Preparing to wait for external event network-vif-plugged-00a6b90a-3de5-45e7-93ca-e2b72cac1de3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.164 183079 DEBUG oslo_concurrency.lockutils [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Acquiring lock "1b9e0b4e-34d1-46d1-8f04-a07da354e704-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.165 183079 DEBUG oslo_concurrency.lockutils [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Lock "1b9e0b4e-34d1-46d1-8f04-a07da354e704-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.165 183079 DEBUG oslo_concurrency.lockutils [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Lock "1b9e0b4e-34d1-46d1-8f04-a07da354e704-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.167 183079 DEBUG nova.virt.libvirt.vif [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:15:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1891638859',display_name='tempest-server-test-1891638859',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1891638859',id=29,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHJ2WpIHRA1kq4lV633wE2x21ZQiUKsNno6rE9q+Yvr8+SBuOLxYX/SrJrCgwnCnqd0qro7QnVhx9iXp6Xfs9luRAXTITM2MCnWaFzPevTHxPPALi/yKqLeXytZxdZTrkw==',key_name='tempest-DHCPTest-396339124',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8d764c6e1fdd46b88f83657e6a259c71',ramdisk_id='',reservation_id='r-rf98r04v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DHCPTest-488220837',owner_user_name='tempest-DHCPTest-488220837-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:15:21Z,user_data=None,user_id='21741b1e79254e698cc6d7684318589f',uuid=1b9e0b4e-34d1-46d1-8f04-a07da354e704,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "00a6b90a-3de5-45e7-93ca-e2b72cac1de3", "address": "fa:16:3e:05:67:11", "network": {"id": "922155c4-0d93-4488-a55e-c0d6583804c3", "bridge": "br-int", "label": "tempest-DHCPTest-396339124", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8d764c6e1fdd46b88f83657e6a259c71", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00a6b90a-3d", "ovs_interfaceid": "00a6b90a-3de5-45e7-93ca-e2b72cac1de3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.167 183079 DEBUG nova.network.os_vif_util [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Converting VIF {"id": "00a6b90a-3de5-45e7-93ca-e2b72cac1de3", "address": "fa:16:3e:05:67:11", "network": {"id": "922155c4-0d93-4488-a55e-c0d6583804c3", "bridge": "br-int", "label": "tempest-DHCPTest-396339124", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8d764c6e1fdd46b88f83657e6a259c71", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00a6b90a-3d", "ovs_interfaceid": "00a6b90a-3de5-45e7-93ca-e2b72cac1de3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.168 183079 DEBUG nova.network.os_vif_util [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:67:11,bridge_name='br-int',has_traffic_filtering=True,id=00a6b90a-3de5-45e7-93ca-e2b72cac1de3,network=Network(922155c4-0d93-4488-a55e-c0d6583804c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap00a6b90a-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.169 183079 DEBUG os_vif [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:67:11,bridge_name='br-int',has_traffic_filtering=True,id=00a6b90a-3de5-45e7-93ca-e2b72cac1de3,network=Network(922155c4-0d93-4488-a55e-c0d6583804c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap00a6b90a-3d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.170 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.171 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.171 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.175 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.176 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap00a6b90a-3d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.177 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap00a6b90a-3d, col_values=(('external_ids', {'iface-id': '00a6b90a-3de5-45e7-93ca-e2b72cac1de3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:05:67:11', 'vm-uuid': '1b9e0b4e-34d1-46d1-8f04-a07da354e704'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.232 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:24 compute-0 NetworkManager[55454]: <info>  [1769102124.2335] manager: (tap00a6b90a-3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/139)
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.235 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.239 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.240 183079 INFO os_vif [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:67:11,bridge_name='br-int',has_traffic_filtering=True,id=00a6b90a-3de5-45e7-93ca-e2b72cac1de3,network=Network(922155c4-0d93-4488-a55e-c0d6583804c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap00a6b90a-3d')
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.292 183079 DEBUG nova.virt.libvirt.driver [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.292 183079 DEBUG nova.virt.libvirt.driver [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] No VIF found with MAC fa:16:3e:05:67:11, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:15:24 compute-0 kernel: tap00a6b90a-3d: entered promiscuous mode
Jan 22 17:15:24 compute-0 NetworkManager[55454]: <info>  [1769102124.3701] manager: (tap00a6b90a-3d): new Tun device (/org/freedesktop/NetworkManager/Devices/140)
Jan 22 17:15:24 compute-0 ovn_controller[95372]: 2026-01-22T17:15:24Z|00330|binding|INFO|Claiming lport 00a6b90a-3de5-45e7-93ca-e2b72cac1de3 for this chassis.
Jan 22 17:15:24 compute-0 ovn_controller[95372]: 2026-01-22T17:15:24Z|00331|binding|INFO|00a6b90a-3de5-45e7-93ca-e2b72cac1de3: Claiming fa:16:3e:05:67:11 10.100.0.9
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.370 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:24.377 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:67:11 10.100.0.9'], port_security=['fa:16:3e:05:67:11 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-DHCPTest-396339124', 'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-922155c4-0d93-4488-a55e-c0d6583804c3', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-DHCPTest-396339124', 'neutron:project_id': '8d764c6e1fdd46b88f83657e6a259c71', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ef07e124-c32c-447c-8695-9ee6eb139f7e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.229'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=284b2620-f1a0-4ab6-8476-908f65d591a2, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=00a6b90a-3de5-45e7-93ca-e2b72cac1de3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:24.380 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 00a6b90a-3de5-45e7-93ca-e2b72cac1de3 in datapath 922155c4-0d93-4488-a55e-c0d6583804c3 bound to our chassis
Jan 22 17:15:24 compute-0 ovn_controller[95372]: 2026-01-22T17:15:24Z|00332|binding|INFO|Setting lport 00a6b90a-3de5-45e7-93ca-e2b72cac1de3 ovn-installed in OVS
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:24.385 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 922155c4-0d93-4488-a55e-c0d6583804c3
Jan 22 17:15:24 compute-0 ovn_controller[95372]: 2026-01-22T17:15:24Z|00333|binding|INFO|Setting lport 00a6b90a-3de5-45e7-93ca-e2b72cac1de3 up in Southbound
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.387 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.390 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:24.398 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[87e2ece9-8f9e-4365-b8a2-85e3f5dc342e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:24.399 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap922155c4-01 in ovnmeta-922155c4-0d93-4488-a55e-c0d6583804c3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:24.400 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap922155c4-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:24.401 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b1a0c1d7-a4e1-4b04-8e56-236f1d2a6d41]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:24.401 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[07d108b6-584b-45c0-aa93-7855c9a43fba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:24.412 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[51c55256-5e96-4a19-8ee9-84d59e032a6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:24 compute-0 systemd-udevd[223218]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:15:24 compute-0 NetworkManager[55454]: <info>  [1769102124.4349] device (tap00a6b90a-3d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:15:24 compute-0 NetworkManager[55454]: <info>  [1769102124.4358] device (tap00a6b90a-3d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:15:24 compute-0 systemd-machined[154382]: New machine qemu-29-instance-0000001d.
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:24.439 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b73b290f-6f80-4e24-9b03-72b16dca138d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:24 compute-0 systemd[1]: Started Virtual Machine qemu-29-instance-0000001d.
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:24.473 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[a3dadc68-028a-47ca-8f21-8366a72d95e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:24.478 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6416ecc0-0ce5-4df9-9662-102a8b5e293e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:24 compute-0 NetworkManager[55454]: <info>  [1769102124.4796] manager: (tap922155c4-00): new Veth device (/org/freedesktop/NetworkManager/Devices/141)
Jan 22 17:15:24 compute-0 podman[223205]: 2026-01-22 17:15:24.494396738 +0000 UTC m=+0.093932255 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:24.512 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[8d323244-6eaa-4e0b-b1c3-e3bb794a010c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:24.516 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[2970e756-b7ec-4e22-af9d-b58901aa0bc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:24 compute-0 NetworkManager[55454]: <info>  [1769102124.5362] device (tap922155c4-00): carrier: link connected
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:24.542 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[3b00bda3-6a18-4f5b-8194-81c346c3eaa7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:24.557 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[dc7a2d2c-b310-45b0-9969-26c17b124be7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap922155c4-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:62:b8:10'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 92], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 448823, 'reachable_time': 28956, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223266, 'error': None, 'target': 'ovnmeta-922155c4-0d93-4488-a55e-c0d6583804c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:24.571 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ff2537af-3f81-4c5b-b10f-5e24ee85c68a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe62:b810'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 448823, 'tstamp': 448823}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223267, 'error': None, 'target': 'ovnmeta-922155c4-0d93-4488-a55e-c0d6583804c3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:24.589 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7ede3174-241e-4a9d-a1ce-386e4a3e340c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap922155c4-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:62:b8:10'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 92], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 448823, 'reachable_time': 28956, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223268, 'error': None, 'target': 'ovnmeta-922155c4-0d93-4488-a55e-c0d6583804c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:24.623 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[bdfeee98-236c-4c41-bb37-d350b123ce66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:24.685 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2a7d5e06-dbba-4edf-a1e0-d3e9d0e49416]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:24.686 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap922155c4-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:24.686 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:24.687 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap922155c4-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:15:24 compute-0 kernel: tap922155c4-00: entered promiscuous mode
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.688 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:24 compute-0 NetworkManager[55454]: <info>  [1769102124.6896] manager: (tap922155c4-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/142)
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.690 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:24.691 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap922155c4-00, col_values=(('external_ids', {'iface-id': '934896c0-00c7-4510-9857-208659266984'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.694 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:24.694 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/922155c4-0d93-4488-a55e-c0d6583804c3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/922155c4-0d93-4488-a55e-c0d6583804c3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:15:24 compute-0 ovn_controller[95372]: 2026-01-22T17:15:24Z|00334|binding|INFO|Releasing lport 934896c0-00c7-4510-9857-208659266984 from this chassis (sb_readonly=0)
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:24.695 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1332c653-75b4-461a-bbea-2dd60deac9be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:24.696 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-922155c4-0d93-4488-a55e-c0d6583804c3
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/922155c4-0d93-4488-a55e-c0d6583804c3.pid.haproxy
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 922155c4-0d93-4488-a55e-c0d6583804c3
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:15:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:24.696 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-922155c4-0d93-4488-a55e-c0d6583804c3', 'env', 'PROCESS_TAG=haproxy-922155c4-0d93-4488-a55e-c0d6583804c3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/922155c4-0d93-4488-a55e-c0d6583804c3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.706 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.897 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102124.8967633, 1b9e0b4e-34d1-46d1-8f04-a07da354e704 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.898 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] VM Started (Lifecycle Event)
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.915 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.920 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102124.8968978, 1b9e0b4e-34d1-46d1-8f04-a07da354e704 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.920 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] VM Paused (Lifecycle Event)
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.935 183079 DEBUG nova.compute.manager [req-76468582-c17d-4d32-8f7e-1471f3a8c19a req-82478e98-db3c-43e6-b97c-b2879caafba3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Received event network-vif-plugged-00a6b90a-3de5-45e7-93ca-e2b72cac1de3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.936 183079 DEBUG oslo_concurrency.lockutils [req-76468582-c17d-4d32-8f7e-1471f3a8c19a req-82478e98-db3c-43e6-b97c-b2879caafba3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "1b9e0b4e-34d1-46d1-8f04-a07da354e704-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.936 183079 DEBUG oslo_concurrency.lockutils [req-76468582-c17d-4d32-8f7e-1471f3a8c19a req-82478e98-db3c-43e6-b97c-b2879caafba3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "1b9e0b4e-34d1-46d1-8f04-a07da354e704-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.936 183079 DEBUG oslo_concurrency.lockutils [req-76468582-c17d-4d32-8f7e-1471f3a8c19a req-82478e98-db3c-43e6-b97c-b2879caafba3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "1b9e0b4e-34d1-46d1-8f04-a07da354e704-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.936 183079 DEBUG nova.compute.manager [req-76468582-c17d-4d32-8f7e-1471f3a8c19a req-82478e98-db3c-43e6-b97c-b2879caafba3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Processing event network-vif-plugged-00a6b90a-3de5-45e7-93ca-e2b72cac1de3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.937 183079 DEBUG nova.compute.manager [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.941 183079 DEBUG nova.virt.libvirt.driver [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.945 183079 INFO nova.virt.libvirt.driver [-] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Instance spawned successfully.
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.945 183079 DEBUG nova.virt.libvirt.driver [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.951 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.956 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102124.9408762, 1b9e0b4e-34d1-46d1-8f04-a07da354e704 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.956 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] VM Resumed (Lifecycle Event)
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.968 183079 DEBUG nova.virt.libvirt.driver [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.969 183079 DEBUG nova.virt.libvirt.driver [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.970 183079 DEBUG nova.virt.libvirt.driver [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.971 183079 DEBUG nova.virt.libvirt.driver [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.971 183079 DEBUG nova.virt.libvirt.driver [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.972 183079 DEBUG nova.virt.libvirt.driver [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.978 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:15:24 compute-0 nova_compute[183075]: 2026-01-22 17:15:24.983 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:15:25 compute-0 nova_compute[183075]: 2026-01-22 17:15:25.015 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:15:25 compute-0 nova_compute[183075]: 2026-01-22 17:15:25.049 183079 INFO nova.compute.manager [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Took 3.79 seconds to spawn the instance on the hypervisor.
Jan 22 17:15:25 compute-0 nova_compute[183075]: 2026-01-22 17:15:25.049 183079 DEBUG nova.compute.manager [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:15:25 compute-0 podman[223307]: 2026-01-22 17:15:25.092718149 +0000 UTC m=+0.064506936 container create 753d8288d75fee4d824bd426081cff820975aa801c3263879d62b5202b50cdc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-922155c4-0d93-4488-a55e-c0d6583804c3, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 22 17:15:25 compute-0 nova_compute[183075]: 2026-01-22 17:15:25.124 183079 INFO nova.compute.manager [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Took 4.38 seconds to build instance.
Jan 22 17:15:25 compute-0 systemd[1]: Started libpod-conmon-753d8288d75fee4d824bd426081cff820975aa801c3263879d62b5202b50cdc8.scope.
Jan 22 17:15:25 compute-0 nova_compute[183075]: 2026-01-22 17:15:25.145 183079 DEBUG oslo_concurrency.lockutils [None req-2e96c9ec-d0d2-4bba-920e-db9dc0e0f6aa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Lock "1b9e0b4e-34d1-46d1-8f04-a07da354e704" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.523s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:15:25 compute-0 podman[223307]: 2026-01-22 17:15:25.062383257 +0000 UTC m=+0.034172094 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:15:25 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:15:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8ea4a1becb38a6424de738958d8c8d23623d806b180d4f1eb01e9de2c2cb9efd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:15:25 compute-0 podman[223307]: 2026-01-22 17:15:25.200102035 +0000 UTC m=+0.171890882 container init 753d8288d75fee4d824bd426081cff820975aa801c3263879d62b5202b50cdc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-922155c4-0d93-4488-a55e-c0d6583804c3, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:15:25 compute-0 podman[223307]: 2026-01-22 17:15:25.211567564 +0000 UTC m=+0.183356371 container start 753d8288d75fee4d824bd426081cff820975aa801c3263879d62b5202b50cdc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-922155c4-0d93-4488-a55e-c0d6583804c3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 22 17:15:25 compute-0 neutron-haproxy-ovnmeta-922155c4-0d93-4488-a55e-c0d6583804c3[223323]: [NOTICE]   (223327) : New worker (223329) forked
Jan 22 17:15:25 compute-0 neutron-haproxy-ovnmeta-922155c4-0d93-4488-a55e-c0d6583804c3[223323]: [NOTICE]   (223327) : Loading success.
Jan 22 17:15:25 compute-0 nova_compute[183075]: 2026-01-22 17:15:25.923 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:26 compute-0 nova_compute[183075]: 2026-01-22 17:15:26.197 183079 DEBUG nova.network.neutron [req-fa388a56-6a69-48e0-adde-f72ccdcfeb6f req-78afcf46-e55b-43e1-9f15-a14f9304af9e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Updated VIF entry in instance network info cache for port 00a6b90a-3de5-45e7-93ca-e2b72cac1de3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:15:26 compute-0 nova_compute[183075]: 2026-01-22 17:15:26.197 183079 DEBUG nova.network.neutron [req-fa388a56-6a69-48e0-adde-f72ccdcfeb6f req-78afcf46-e55b-43e1-9f15-a14f9304af9e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Updating instance_info_cache with network_info: [{"id": "00a6b90a-3de5-45e7-93ca-e2b72cac1de3", "address": "fa:16:3e:05:67:11", "network": {"id": "922155c4-0d93-4488-a55e-c0d6583804c3", "bridge": "br-int", "label": "tempest-DHCPTest-396339124", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8d764c6e1fdd46b88f83657e6a259c71", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00a6b90a-3d", "ovs_interfaceid": "00a6b90a-3de5-45e7-93ca-e2b72cac1de3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:15:26 compute-0 nova_compute[183075]: 2026-01-22 17:15:26.215 183079 DEBUG oslo_concurrency.lockutils [req-fa388a56-6a69-48e0-adde-f72ccdcfeb6f req-78afcf46-e55b-43e1-9f15-a14f9304af9e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-1b9e0b4e-34d1-46d1-8f04-a07da354e704" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:15:26 compute-0 nova_compute[183075]: 2026-01-22 17:15:26.541 183079 INFO nova.compute.manager [None req-8b62a8fb-b0ca-4dc4-8260-9861e8dac0a3 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Get console output
Jan 22 17:15:26 compute-0 nova_compute[183075]: 2026-01-22 17:15:26.548 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:15:26 compute-0 nova_compute[183075]: 2026-01-22 17:15:26.665 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:26 compute-0 nova_compute[183075]: 2026-01-22 17:15:26.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:15:27 compute-0 nova_compute[183075]: 2026-01-22 17:15:27.041 183079 DEBUG nova.compute.manager [req-f528d6d1-a7fa-45c1-8c7b-c209dad6fb92 req-a9067a3e-aed8-4b6b-9ee1-2c238f0b1c68 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Received event network-vif-plugged-00a6b90a-3de5-45e7-93ca-e2b72cac1de3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:15:27 compute-0 nova_compute[183075]: 2026-01-22 17:15:27.042 183079 DEBUG oslo_concurrency.lockutils [req-f528d6d1-a7fa-45c1-8c7b-c209dad6fb92 req-a9067a3e-aed8-4b6b-9ee1-2c238f0b1c68 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "1b9e0b4e-34d1-46d1-8f04-a07da354e704-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:15:27 compute-0 nova_compute[183075]: 2026-01-22 17:15:27.042 183079 DEBUG oslo_concurrency.lockutils [req-f528d6d1-a7fa-45c1-8c7b-c209dad6fb92 req-a9067a3e-aed8-4b6b-9ee1-2c238f0b1c68 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "1b9e0b4e-34d1-46d1-8f04-a07da354e704-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:15:27 compute-0 nova_compute[183075]: 2026-01-22 17:15:27.043 183079 DEBUG oslo_concurrency.lockutils [req-f528d6d1-a7fa-45c1-8c7b-c209dad6fb92 req-a9067a3e-aed8-4b6b-9ee1-2c238f0b1c68 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "1b9e0b4e-34d1-46d1-8f04-a07da354e704-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:15:27 compute-0 nova_compute[183075]: 2026-01-22 17:15:27.043 183079 DEBUG nova.compute.manager [req-f528d6d1-a7fa-45c1-8c7b-c209dad6fb92 req-a9067a3e-aed8-4b6b-9ee1-2c238f0b1c68 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] No waiting events found dispatching network-vif-plugged-00a6b90a-3de5-45e7-93ca-e2b72cac1de3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:15:27 compute-0 nova_compute[183075]: 2026-01-22 17:15:27.043 183079 WARNING nova.compute.manager [req-f528d6d1-a7fa-45c1-8c7b-c209dad6fb92 req-a9067a3e-aed8-4b6b-9ee1-2c238f0b1c68 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Received unexpected event network-vif-plugged-00a6b90a-3de5-45e7-93ca-e2b72cac1de3 for instance with vm_state active and task_state None.
Jan 22 17:15:29 compute-0 nova_compute[183075]: 2026-01-22 17:15:29.136 183079 DEBUG nova.compute.manager [req-0c520c4b-3624-42f2-9b1d-5196ba62c179 req-601b8e14-0a23-43db-8e1c-87af6f349bee a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Received event network-changed-76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:15:29 compute-0 nova_compute[183075]: 2026-01-22 17:15:29.138 183079 DEBUG nova.compute.manager [req-0c520c4b-3624-42f2-9b1d-5196ba62c179 req-601b8e14-0a23-43db-8e1c-87af6f349bee a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Refreshing instance network info cache due to event network-changed-76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:15:29 compute-0 nova_compute[183075]: 2026-01-22 17:15:29.138 183079 DEBUG oslo_concurrency.lockutils [req-0c520c4b-3624-42f2-9b1d-5196ba62c179 req-601b8e14-0a23-43db-8e1c-87af6f349bee a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-26367132-bc45-4c8a-bd7e-5c0883453bbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:15:29 compute-0 nova_compute[183075]: 2026-01-22 17:15:29.139 183079 DEBUG oslo_concurrency.lockutils [req-0c520c4b-3624-42f2-9b1d-5196ba62c179 req-601b8e14-0a23-43db-8e1c-87af6f349bee a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-26367132-bc45-4c8a-bd7e-5c0883453bbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:15:29 compute-0 nova_compute[183075]: 2026-01-22 17:15:29.139 183079 DEBUG nova.network.neutron [req-0c520c4b-3624-42f2-9b1d-5196ba62c179 req-601b8e14-0a23-43db-8e1c-87af6f349bee a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Refreshing network info cache for port 76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:15:29 compute-0 nova_compute[183075]: 2026-01-22 17:15:29.266 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:30 compute-0 nova_compute[183075]: 2026-01-22 17:15:30.965 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:31 compute-0 nova_compute[183075]: 2026-01-22 17:15:31.740 183079 INFO nova.compute.manager [None req-a9363bfe-bfa6-449a-9d9b-c85598172e11 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Get console output
Jan 22 17:15:32 compute-0 nova_compute[183075]: 2026-01-22 17:15:32.148 183079 DEBUG nova.network.neutron [req-0c520c4b-3624-42f2-9b1d-5196ba62c179 req-601b8e14-0a23-43db-8e1c-87af6f349bee a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Updated VIF entry in instance network info cache for port 76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:15:32 compute-0 nova_compute[183075]: 2026-01-22 17:15:32.149 183079 DEBUG nova.network.neutron [req-0c520c4b-3624-42f2-9b1d-5196ba62c179 req-601b8e14-0a23-43db-8e1c-87af6f349bee a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Updating instance_info_cache with network_info: [{"id": "76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885", "address": "fa:16:3e:f2:f5:37", "network": {"id": "2f6bd5e0-c1b9-4783-b6e7-1932fe18705c", "bridge": "br-int", "label": "tempest-test-network--1617607678", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.36", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap76cfd2b1-7a", "ovs_interfaceid": "76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:15:32 compute-0 nova_compute[183075]: 2026-01-22 17:15:32.170 183079 DEBUG oslo_concurrency.lockutils [req-0c520c4b-3624-42f2-9b1d-5196ba62c179 req-601b8e14-0a23-43db-8e1c-87af6f349bee a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-26367132-bc45-4c8a-bd7e-5c0883453bbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:15:33 compute-0 podman[223340]: 2026-01-22 17:15:33.418600671 +0000 UTC m=+0.113007643 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., version=9.6, config_id=openstack_network_exporter, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, 
description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal)
Jan 22 17:15:33 compute-0 podman[223339]: 2026-01-22 17:15:33.429594278 +0000 UTC m=+0.127699957 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 17:15:33 compute-0 podman[223338]: 2026-01-22 17:15:33.434108196 +0000 UTC m=+0.143978152 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
managed_by=edpm_ansible, io.buildah.version=1.41.3)
Jan 22 17:15:34 compute-0 nova_compute[183075]: 2026-01-22 17:15:34.268 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:36 compute-0 nova_compute[183075]: 2026-01-22 17:15:36.021 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:36 compute-0 nova_compute[183075]: 2026-01-22 17:15:36.943 183079 INFO nova.compute.manager [None req-5766ed4c-124a-41d4-b05b-7f08ab5a9055 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Get console output
Jan 22 17:15:36 compute-0 nova_compute[183075]: 2026-01-22 17:15:36.949 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:15:37 compute-0 nova_compute[183075]: 2026-01-22 17:15:37.331 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:37 compute-0 ovn_controller[95372]: 2026-01-22T17:15:37Z|00060|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:05:67:11 10.100.0.9
Jan 22 17:15:37 compute-0 ovn_controller[95372]: 2026-01-22T17:15:37Z|00061|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:05:67:11 10.100.0.9
Jan 22 17:15:39 compute-0 nova_compute[183075]: 2026-01-22 17:15:39.272 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:39 compute-0 podman[223430]: 2026-01-22 17:15:39.375639349 +0000 UTC m=+0.079624922 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:15:41 compute-0 nova_compute[183075]: 2026-01-22 17:15:41.023 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:41 compute-0 nova_compute[183075]: 2026-01-22 17:15:41.529 183079 DEBUG oslo_concurrency.lockutils [None req-6f28f0b1-fe76-463c-994b-0c85b00b0641 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "26367132-bc45-4c8a-bd7e-5c0883453bbd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:15:41 compute-0 nova_compute[183075]: 2026-01-22 17:15:41.530 183079 DEBUG oslo_concurrency.lockutils [None req-6f28f0b1-fe76-463c-994b-0c85b00b0641 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "26367132-bc45-4c8a-bd7e-5c0883453bbd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:15:41 compute-0 nova_compute[183075]: 2026-01-22 17:15:41.531 183079 DEBUG oslo_concurrency.lockutils [None req-6f28f0b1-fe76-463c-994b-0c85b00b0641 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "26367132-bc45-4c8a-bd7e-5c0883453bbd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:15:41 compute-0 nova_compute[183075]: 2026-01-22 17:15:41.531 183079 DEBUG oslo_concurrency.lockutils [None req-6f28f0b1-fe76-463c-994b-0c85b00b0641 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "26367132-bc45-4c8a-bd7e-5c0883453bbd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:15:41 compute-0 nova_compute[183075]: 2026-01-22 17:15:41.531 183079 DEBUG oslo_concurrency.lockutils [None req-6f28f0b1-fe76-463c-994b-0c85b00b0641 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "26367132-bc45-4c8a-bd7e-5c0883453bbd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:15:41 compute-0 nova_compute[183075]: 2026-01-22 17:15:41.533 183079 INFO nova.compute.manager [None req-6f28f0b1-fe76-463c-994b-0c85b00b0641 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Terminating instance
Jan 22 17:15:41 compute-0 nova_compute[183075]: 2026-01-22 17:15:41.535 183079 DEBUG nova.compute.manager [None req-6f28f0b1-fe76-463c-994b-0c85b00b0641 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:15:41 compute-0 kernel: tap76cfd2b1-7a (unregistering): left promiscuous mode
Jan 22 17:15:41 compute-0 NetworkManager[55454]: <info>  [1769102141.5643] device (tap76cfd2b1-7a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:15:41 compute-0 ovn_controller[95372]: 2026-01-22T17:15:41Z|00335|binding|INFO|Releasing lport 76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885 from this chassis (sb_readonly=0)
Jan 22 17:15:41 compute-0 nova_compute[183075]: 2026-01-22 17:15:41.577 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:41 compute-0 ovn_controller[95372]: 2026-01-22T17:15:41Z|00336|binding|INFO|Setting lport 76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885 down in Southbound
Jan 22 17:15:41 compute-0 ovn_controller[95372]: 2026-01-22T17:15:41Z|00337|binding|INFO|Removing iface tap76cfd2b1-7a ovn-installed in OVS
Jan 22 17:15:41 compute-0 nova_compute[183075]: 2026-01-22 17:15:41.582 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:41.587 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:f5:37 10.100.0.36'], port_security=['fa:16:3e:f2:f5:37 10.100.0.36'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.36/28', 'neutron:device_id': '26367132-bc45-4c8a-bd7e-5c0883453bbd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bfc6667804934c92b71ce7638089e9e3', 'neutron:revision_number': '5', 'neutron:security_group_ids': '1ec3a99e-543d-4786-af18-fa8f96c0f742', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.236'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12ee3cfa-41b0-4935-bdbd-341f1a5f30cc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:15:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:41.588 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885 in datapath 2f6bd5e0-c1b9-4783-b6e7-1932fe18705c unbound from our chassis
Jan 22 17:15:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:41.590 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2f6bd5e0-c1b9-4783-b6e7-1932fe18705c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:15:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:41.592 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a57527c0-7ed6-4d40-ae7b-b795513776ca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:41.593 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c namespace which is not needed anymore
Jan 22 17:15:41 compute-0 nova_compute[183075]: 2026-01-22 17:15:41.596 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:41 compute-0 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d0000001b.scope: Deactivated successfully.
Jan 22 17:15:41 compute-0 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d0000001b.scope: Consumed 16.983s CPU time.
Jan 22 17:15:41 compute-0 systemd-machined[154382]: Machine qemu-28-instance-0000001b terminated.
Jan 22 17:15:41 compute-0 nova_compute[183075]: 2026-01-22 17:15:41.772 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:41 compute-0 nova_compute[183075]: 2026-01-22 17:15:41.782 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:41 compute-0 neutron-haproxy-ovnmeta-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c[222674]: [NOTICE]   (222678) : haproxy version is 2.8.14-c23fe91
Jan 22 17:15:41 compute-0 neutron-haproxy-ovnmeta-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c[222674]: [NOTICE]   (222678) : path to executable is /usr/sbin/haproxy
Jan 22 17:15:41 compute-0 neutron-haproxy-ovnmeta-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c[222674]: [WARNING]  (222678) : Exiting Master process...
Jan 22 17:15:41 compute-0 neutron-haproxy-ovnmeta-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c[222674]: [WARNING]  (222678) : Exiting Master process...
Jan 22 17:15:41 compute-0 neutron-haproxy-ovnmeta-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c[222674]: [ALERT]    (222678) : Current worker (222680) exited with code 143 (Terminated)
Jan 22 17:15:41 compute-0 neutron-haproxy-ovnmeta-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c[222674]: [WARNING]  (222678) : All workers exited. Exiting... (0)
Jan 22 17:15:41 compute-0 systemd[1]: libpod-79c0a3668c1d22d65d670ffded06db05fa72502bc394a4da5302746ba6472e64.scope: Deactivated successfully.
Jan 22 17:15:41 compute-0 podman[223476]: 2026-01-22 17:15:41.83239403 +0000 UTC m=+0.083043921 container died 79c0a3668c1d22d65d670ffded06db05fa72502bc394a4da5302746ba6472e64 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:15:41 compute-0 nova_compute[183075]: 2026-01-22 17:15:41.836 183079 INFO nova.virt.libvirt.driver [-] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Instance destroyed successfully.
Jan 22 17:15:41 compute-0 nova_compute[183075]: 2026-01-22 17:15:41.839 183079 DEBUG nova.objects.instance [None req-6f28f0b1-fe76-463c-994b-0c85b00b0641 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lazy-loading 'resources' on Instance uuid 26367132-bc45-4c8a-bd7e-5c0883453bbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:15:41 compute-0 nova_compute[183075]: 2026-01-22 17:15:41.856 183079 DEBUG nova.virt.libvirt.vif [None req-6f28f0b1-fe76-463c-994b-0c85b00b0641 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:14:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-126458854',display_name='tempest-server-test-126458854',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-126458854',id=27,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMrJJ31y1scB3LRFxoJUNXGWF+24G8xnIsagkR4AgFUSi4x4N/JtpAglucdepqXrDN4/cu+UKlGqq1KPQ/3dphaxCQ2ycOFcca6dHGCNqF9JiM6hrakYFD5RWRpIAcONgw==',key_name='tempest-keypair-test-1482310067',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:14:16Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bfc6667804934c92b71ce7638089e9e3',ramdisk_id='',reservation_id='r-jcmuifay',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_i
nput_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-QoSTest-2146064006',owner_user_name='tempest-QoSTest-2146064006-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:14:16Z,user_data=None,user_id='1e61127d65144bcbaa0d43fe3eb484c0',uuid=26367132-bc45-4c8a-bd7e-5c0883453bbd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885", "address": "fa:16:3e:f2:f5:37", "network": {"id": "2f6bd5e0-c1b9-4783-b6e7-1932fe18705c", "bridge": "br-int", "label": "tempest-test-network--1617607678", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.36", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap76cfd2b1-7a", "ovs_interfaceid": "76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:15:41 compute-0 nova_compute[183075]: 2026-01-22 17:15:41.857 183079 DEBUG nova.network.os_vif_util [None req-6f28f0b1-fe76-463c-994b-0c85b00b0641 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Converting VIF {"id": "76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885", "address": "fa:16:3e:f2:f5:37", "network": {"id": "2f6bd5e0-c1b9-4783-b6e7-1932fe18705c", "bridge": "br-int", "label": "tempest-test-network--1617607678", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.36", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bfc6667804934c92b71ce7638089e9e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap76cfd2b1-7a", "ovs_interfaceid": "76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:15:41 compute-0 nova_compute[183075]: 2026-01-22 17:15:41.858 183079 DEBUG nova.network.os_vif_util [None req-6f28f0b1-fe76-463c-994b-0c85b00b0641 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f2:f5:37,bridge_name='br-int',has_traffic_filtering=True,id=76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885,network=Network(2f6bd5e0-c1b9-4783-b6e7-1932fe18705c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap76cfd2b1-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:15:41 compute-0 nova_compute[183075]: 2026-01-22 17:15:41.859 183079 DEBUG os_vif [None req-6f28f0b1-fe76-463c-994b-0c85b00b0641 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f2:f5:37,bridge_name='br-int',has_traffic_filtering=True,id=76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885,network=Network(2f6bd5e0-c1b9-4783-b6e7-1932fe18705c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap76cfd2b1-7a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:15:41 compute-0 nova_compute[183075]: 2026-01-22 17:15:41.863 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:41 compute-0 nova_compute[183075]: 2026-01-22 17:15:41.865 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap76cfd2b1-7a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:15:41 compute-0 nova_compute[183075]: 2026-01-22 17:15:41.868 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:41 compute-0 nova_compute[183075]: 2026-01-22 17:15:41.873 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:15:41 compute-0 nova_compute[183075]: 2026-01-22 17:15:41.877 183079 INFO os_vif [None req-6f28f0b1-fe76-463c-994b-0c85b00b0641 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f2:f5:37,bridge_name='br-int',has_traffic_filtering=True,id=76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885,network=Network(2f6bd5e0-c1b9-4783-b6e7-1932fe18705c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap76cfd2b1-7a')
Jan 22 17:15:41 compute-0 nova_compute[183075]: 2026-01-22 17:15:41.879 183079 INFO nova.virt.libvirt.driver [None req-6f28f0b1-fe76-463c-994b-0c85b00b0641 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Deleting instance files /var/lib/nova/instances/26367132-bc45-4c8a-bd7e-5c0883453bbd_del
Jan 22 17:15:41 compute-0 nova_compute[183075]: 2026-01-22 17:15:41.880 183079 INFO nova.virt.libvirt.driver [None req-6f28f0b1-fe76-463c-994b-0c85b00b0641 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Deletion of /var/lib/nova/instances/26367132-bc45-4c8a-bd7e-5c0883453bbd_del complete
Jan 22 17:15:41 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-79c0a3668c1d22d65d670ffded06db05fa72502bc394a4da5302746ba6472e64-userdata-shm.mount: Deactivated successfully.
Jan 22 17:15:41 compute-0 systemd[1]: var-lib-containers-storage-overlay-0d604c141741825b731f79c27c1fd23aac9acc55c16b9ce88a0bafacb0adbbc1-merged.mount: Deactivated successfully.
Jan 22 17:15:41 compute-0 podman[223476]: 2026-01-22 17:15:41.904484163 +0000 UTC m=+0.155134064 container cleanup 79c0a3668c1d22d65d670ffded06db05fa72502bc394a4da5302746ba6472e64 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:15:41 compute-0 systemd[1]: libpod-conmon-79c0a3668c1d22d65d670ffded06db05fa72502bc394a4da5302746ba6472e64.scope: Deactivated successfully.
Jan 22 17:15:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:41.931 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:15:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:41.932 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:15:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:41.934 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:15:41 compute-0 nova_compute[183075]: 2026-01-22 17:15:41.933 183079 INFO nova.compute.manager [None req-6f28f0b1-fe76-463c-994b-0c85b00b0641 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Took 0.40 seconds to destroy the instance on the hypervisor.
Jan 22 17:15:41 compute-0 nova_compute[183075]: 2026-01-22 17:15:41.934 183079 DEBUG oslo.service.loopingcall [None req-6f28f0b1-fe76-463c-994b-0c85b00b0641 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:15:41 compute-0 nova_compute[183075]: 2026-01-22 17:15:41.934 183079 DEBUG nova.compute.manager [-] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:15:41 compute-0 nova_compute[183075]: 2026-01-22 17:15:41.935 183079 DEBUG nova.network.neutron [-] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:15:41 compute-0 podman[223522]: 2026-01-22 17:15:41.999596148 +0000 UTC m=+0.058586782 container remove 79c0a3668c1d22d65d670ffded06db05fa72502bc394a4da5302746ba6472e64 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 17:15:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:42.009 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f981718f-1733-416b-9d9d-e107dad13e69]: (4, ('Thu Jan 22 05:15:41 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c (79c0a3668c1d22d65d670ffded06db05fa72502bc394a4da5302746ba6472e64)\n79c0a3668c1d22d65d670ffded06db05fa72502bc394a4da5302746ba6472e64\nThu Jan 22 05:15:41 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c (79c0a3668c1d22d65d670ffded06db05fa72502bc394a4da5302746ba6472e64)\n79c0a3668c1d22d65d670ffded06db05fa72502bc394a4da5302746ba6472e64\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:42.012 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c5e26a49-efa6-4088-9d73-8f2b902d0903]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:42.013 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2f6bd5e0-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:15:42 compute-0 nova_compute[183075]: 2026-01-22 17:15:42.016 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:42 compute-0 kernel: tap2f6bd5e0-c0: left promiscuous mode
Jan 22 17:15:42 compute-0 nova_compute[183075]: 2026-01-22 17:15:42.020 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:42.026 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[efe541f5-e825-440a-8c39-105cceaff86e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:42 compute-0 nova_compute[183075]: 2026-01-22 17:15:42.043 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:42.049 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2f88e455-f488-43fb-8ec9-ac963c73b81a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:42.052 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[04fab842-f1d0-4594-bb42-c120a09f86c9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:42.083 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[88dc7f7b-a539-4363-aa60-eba3192a0e9b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441778, 'reachable_time': 41643, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223535, 'error': None, 'target': 'ovnmeta-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:42.087 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2f6bd5e0-c1b9-4783-b6e7-1932fe18705c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:15:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:42.088 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[cfe8c70a-2c85-401c-988b-df4c00b678d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:42 compute-0 systemd[1]: run-netns-ovnmeta\x2d2f6bd5e0\x2dc1b9\x2d4783\x2db6e7\x2d1932fe18705c.mount: Deactivated successfully.
Jan 22 17:15:42 compute-0 nova_compute[183075]: 2026-01-22 17:15:42.111 183079 INFO nova.compute.manager [None req-19f3a658-d506-4791-91c3-4d8da44634c5 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Get console output
Jan 22 17:15:42 compute-0 nova_compute[183075]: 2026-01-22 17:15:42.119 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:15:43 compute-0 nova_compute[183075]: 2026-01-22 17:15:43.124 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:43 compute-0 nova_compute[183075]: 2026-01-22 17:15:43.230 183079 DEBUG nova.compute.manager [req-5e638461-f598-4d93-aecb-a3bb547cd3e9 req-30baaf10-46f6-40f2-a4bd-2c4a57f7c740 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Received event network-vif-unplugged-76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:15:43 compute-0 nova_compute[183075]: 2026-01-22 17:15:43.230 183079 DEBUG oslo_concurrency.lockutils [req-5e638461-f598-4d93-aecb-a3bb547cd3e9 req-30baaf10-46f6-40f2-a4bd-2c4a57f7c740 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "26367132-bc45-4c8a-bd7e-5c0883453bbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:15:43 compute-0 nova_compute[183075]: 2026-01-22 17:15:43.231 183079 DEBUG oslo_concurrency.lockutils [req-5e638461-f598-4d93-aecb-a3bb547cd3e9 req-30baaf10-46f6-40f2-a4bd-2c4a57f7c740 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "26367132-bc45-4c8a-bd7e-5c0883453bbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:15:43 compute-0 nova_compute[183075]: 2026-01-22 17:15:43.231 183079 DEBUG oslo_concurrency.lockutils [req-5e638461-f598-4d93-aecb-a3bb547cd3e9 req-30baaf10-46f6-40f2-a4bd-2c4a57f7c740 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "26367132-bc45-4c8a-bd7e-5c0883453bbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:15:43 compute-0 nova_compute[183075]: 2026-01-22 17:15:43.232 183079 DEBUG nova.compute.manager [req-5e638461-f598-4d93-aecb-a3bb547cd3e9 req-30baaf10-46f6-40f2-a4bd-2c4a57f7c740 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] No waiting events found dispatching network-vif-unplugged-76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:15:43 compute-0 nova_compute[183075]: 2026-01-22 17:15:43.232 183079 DEBUG nova.compute.manager [req-5e638461-f598-4d93-aecb-a3bb547cd3e9 req-30baaf10-46f6-40f2-a4bd-2c4a57f7c740 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Received event network-vif-unplugged-76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:15:43 compute-0 nova_compute[183075]: 2026-01-22 17:15:43.570 183079 DEBUG nova.network.neutron [-] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:15:43 compute-0 nova_compute[183075]: 2026-01-22 17:15:43.586 183079 INFO nova.compute.manager [-] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Took 1.65 seconds to deallocate network for instance.
Jan 22 17:15:43 compute-0 nova_compute[183075]: 2026-01-22 17:15:43.629 183079 DEBUG oslo_concurrency.lockutils [None req-6f28f0b1-fe76-463c-994b-0c85b00b0641 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:15:43 compute-0 nova_compute[183075]: 2026-01-22 17:15:43.631 183079 DEBUG oslo_concurrency.lockutils [None req-6f28f0b1-fe76-463c-994b-0c85b00b0641 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:15:43 compute-0 nova_compute[183075]: 2026-01-22 17:15:43.716 183079 DEBUG nova.compute.provider_tree [None req-6f28f0b1-fe76-463c-994b-0c85b00b0641 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:15:43 compute-0 nova_compute[183075]: 2026-01-22 17:15:43.735 183079 DEBUG nova.scheduler.client.report [None req-6f28f0b1-fe76-463c-994b-0c85b00b0641 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:15:43 compute-0 nova_compute[183075]: 2026-01-22 17:15:43.753 183079 DEBUG oslo_concurrency.lockutils [None req-6f28f0b1-fe76-463c-994b-0c85b00b0641 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:15:43 compute-0 nova_compute[183075]: 2026-01-22 17:15:43.791 183079 INFO nova.scheduler.client.report [None req-6f28f0b1-fe76-463c-994b-0c85b00b0641 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Deleted allocations for instance 26367132-bc45-4c8a-bd7e-5c0883453bbd
Jan 22 17:15:43 compute-0 nova_compute[183075]: 2026-01-22 17:15:43.876 183079 DEBUG oslo_concurrency.lockutils [None req-6f28f0b1-fe76-463c-994b-0c85b00b0641 1e61127d65144bcbaa0d43fe3eb484c0 bfc6667804934c92b71ce7638089e9e3 - - default default] Lock "26367132-bc45-4c8a-bd7e-5c0883453bbd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.345s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:15:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:44.508 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:15:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:44.509 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:15:44 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:15:44 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:15:44 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:15:44 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:15:44 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:15:44 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:15:44 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 922155c4-0d93-4488-a55e-c0d6583804c3 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:15:45 compute-0 nova_compute[183075]: 2026-01-22 17:15:45.331 183079 DEBUG nova.compute.manager [req-2d33c76b-1722-4fd9-a34a-25a3be9e794e req-1df1ce21-9e54-4bac-86e7-8a598bcad12c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Received event network-vif-plugged-76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:15:45 compute-0 nova_compute[183075]: 2026-01-22 17:15:45.332 183079 DEBUG oslo_concurrency.lockutils [req-2d33c76b-1722-4fd9-a34a-25a3be9e794e req-1df1ce21-9e54-4bac-86e7-8a598bcad12c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "26367132-bc45-4c8a-bd7e-5c0883453bbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:15:45 compute-0 nova_compute[183075]: 2026-01-22 17:15:45.332 183079 DEBUG oslo_concurrency.lockutils [req-2d33c76b-1722-4fd9-a34a-25a3be9e794e req-1df1ce21-9e54-4bac-86e7-8a598bcad12c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "26367132-bc45-4c8a-bd7e-5c0883453bbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:15:45 compute-0 nova_compute[183075]: 2026-01-22 17:15:45.333 183079 DEBUG oslo_concurrency.lockutils [req-2d33c76b-1722-4fd9-a34a-25a3be9e794e req-1df1ce21-9e54-4bac-86e7-8a598bcad12c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "26367132-bc45-4c8a-bd7e-5c0883453bbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:15:45 compute-0 nova_compute[183075]: 2026-01-22 17:15:45.333 183079 DEBUG nova.compute.manager [req-2d33c76b-1722-4fd9-a34a-25a3be9e794e req-1df1ce21-9e54-4bac-86e7-8a598bcad12c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] No waiting events found dispatching network-vif-plugged-76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:15:45 compute-0 nova_compute[183075]: 2026-01-22 17:15:45.334 183079 WARNING nova.compute.manager [req-2d33c76b-1722-4fd9-a34a-25a3be9e794e req-1df1ce21-9e54-4bac-86e7-8a598bcad12c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Received unexpected event network-vif-plugged-76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885 for instance with vm_state deleted and task_state None.
Jan 22 17:15:45 compute-0 nova_compute[183075]: 2026-01-22 17:15:45.334 183079 DEBUG nova.compute.manager [req-2d33c76b-1722-4fd9-a34a-25a3be9e794e req-1df1ce21-9e54-4bac-86e7-8a598bcad12c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Received event network-vif-deleted-76cfd2b1-7a8d-44cd-a50f-41cbb8ac6885 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:15:45 compute-0 podman[223537]: 2026-01-22 17:15:45.384279222 +0000 UTC m=+0.084659573 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.402 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.403 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.8937299
Jan 22 17:15:45 compute-0 haproxy-metadata-proxy-922155c4-0d93-4488-a55e-c0d6583804c3[223329]: 10.100.0.9:34684 [22/Jan/2026:17:15:44.507] listener listener/metadata 0/0/0/896/896 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.414 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.415 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 922155c4-0d93-4488-a55e-c0d6583804c3 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.456 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:15:45 compute-0 haproxy-metadata-proxy-922155c4-0d93-4488-a55e-c0d6583804c3[223329]: 10.100.0.9:34694 [22/Jan/2026:17:15:45.413] listener listener/metadata 0/0/0/43/43 200 148 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.457 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 164 time: 0.0420773
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.464 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.465 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 922155c4-0d93-4488-a55e-c0d6583804c3 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.498 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.499 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0341485
Jan 22 17:15:45 compute-0 haproxy-metadata-proxy-922155c4-0d93-4488-a55e-c0d6583804c3[223329]: 10.100.0.9:34706 [22/Jan/2026:17:15:45.463] listener listener/metadata 0/0/0/36/36 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.508 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.509 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 922155c4-0d93-4488-a55e-c0d6583804c3 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.531 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.532 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0231440
Jan 22 17:15:45 compute-0 haproxy-metadata-proxy-922155c4-0d93-4488-a55e-c0d6583804c3[223329]: 10.100.0.9:34710 [22/Jan/2026:17:15:45.507] listener listener/metadata 0/0/0/25/25 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.542 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.544 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 922155c4-0d93-4488-a55e-c0d6583804c3 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.567 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.568 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0242398
Jan 22 17:15:45 compute-0 haproxy-metadata-proxy-922155c4-0d93-4488-a55e-c0d6583804c3[223329]: 10.100.0.9:34726 [22/Jan/2026:17:15:45.542] listener listener/metadata 0/0/0/26/26 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.578 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.579 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 922155c4-0d93-4488-a55e-c0d6583804c3 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.601 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:15:45 compute-0 haproxy-metadata-proxy-922155c4-0d93-4488-a55e-c0d6583804c3[223329]: 10.100.0.9:34740 [22/Jan/2026:17:15:45.577] listener listener/metadata 0/0/0/24/24 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.603 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0236206
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.613 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.614 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 922155c4-0d93-4488-a55e-c0d6583804c3 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.633 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.634 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0202262
Jan 22 17:15:45 compute-0 haproxy-metadata-proxy-922155c4-0d93-4488-a55e-c0d6583804c3[223329]: 10.100.0.9:34752 [22/Jan/2026:17:15:45.612] listener listener/metadata 0/0/0/21/21 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.644 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.645 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 922155c4-0d93-4488-a55e-c0d6583804c3 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.671 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:15:45 compute-0 haproxy-metadata-proxy-922155c4-0d93-4488-a55e-c0d6583804c3[223329]: 10.100.0.9:34762 [22/Jan/2026:17:15:45.643] listener listener/metadata 0/0/0/28/28 200 135 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.672 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 151 time: 0.0270519
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.682 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.684 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 922155c4-0d93-4488-a55e-c0d6583804c3 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.706 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:15:45 compute-0 haproxy-metadata-proxy-922155c4-0d93-4488-a55e-c0d6583804c3[223329]: 10.100.0.9:34772 [22/Jan/2026:17:15:45.682] listener listener/metadata 0/0/0/25/25 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.707 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0228183
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.713 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.714 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 922155c4-0d93-4488-a55e-c0d6583804c3 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.735 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:15:45 compute-0 haproxy-metadata-proxy-922155c4-0d93-4488-a55e-c0d6583804c3[223329]: 10.100.0.9:34778 [22/Jan/2026:17:15:45.712] listener listener/metadata 0/0/0/23/23 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.735 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0218306
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.743 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.744 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 922155c4-0d93-4488-a55e-c0d6583804c3 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:15:45 compute-0 haproxy-metadata-proxy-922155c4-0d93-4488-a55e-c0d6583804c3[223329]: 10.100.0.9:34790 [22/Jan/2026:17:15:45.742] listener listener/metadata 0/0/0/23/23 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.766 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0218699
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.783 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.784 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 922155c4-0d93-4488-a55e-c0d6583804c3 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.804 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:15:45 compute-0 haproxy-metadata-proxy-922155c4-0d93-4488-a55e-c0d6583804c3[223329]: 10.100.0.9:34804 [22/Jan/2026:17:15:45.782] listener listener/metadata 0/0/0/22/22 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.805 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0210285
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.811 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.811 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 922155c4-0d93-4488-a55e-c0d6583804c3 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.829 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:15:45 compute-0 haproxy-metadata-proxy-922155c4-0d93-4488-a55e-c0d6583804c3[223329]: 10.100.0.9:34810 [22/Jan/2026:17:15:45.810] listener listener/metadata 0/0/0/19/19 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.830 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0183377
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.836 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.837 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 922155c4-0d93-4488-a55e-c0d6583804c3 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.855 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:15:45 compute-0 haproxy-metadata-proxy-922155c4-0d93-4488-a55e-c0d6583804c3[223329]: 10.100.0.9:34816 [22/Jan/2026:17:15:45.835] listener listener/metadata 0/0/0/19/19 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.855 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0186536
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.861 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.862 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 922155c4-0d93-4488-a55e-c0d6583804c3 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.880 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:15:45 compute-0 haproxy-metadata-proxy-922155c4-0d93-4488-a55e-c0d6583804c3[223329]: 10.100.0.9:34822 [22/Jan/2026:17:15:45.860] listener listener/metadata 0/0/0/20/20 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.881 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0190835
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.886 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.888 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 922155c4-0d93-4488-a55e-c0d6583804c3 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.909 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:15:45 compute-0 haproxy-metadata-proxy-922155c4-0d93-4488-a55e-c0d6583804c3[223329]: 10.100.0.9:34830 [22/Jan/2026:17:15:45.886] listener listener/metadata 0/0/0/23/23 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:15:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:45.910 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0221648
Jan 22 17:15:46 compute-0 nova_compute[183075]: 2026-01-22 17:15:46.060 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:46 compute-0 nova_compute[183075]: 2026-01-22 17:15:46.868 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:47 compute-0 nova_compute[183075]: 2026-01-22 17:15:47.229 183079 INFO nova.compute.manager [None req-c45bce8b-3f46-4304-b404-433f9e9de988 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Get console output
Jan 22 17:15:47 compute-0 nova_compute[183075]: 2026-01-22 17:15:47.235 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:15:47 compute-0 nova_compute[183075]: 2026-01-22 17:15:47.801 183079 DEBUG oslo_concurrency.lockutils [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquiring lock "e02af423-c4c3-4fcd-be73-ebeba9fae411" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:15:47 compute-0 nova_compute[183075]: 2026-01-22 17:15:47.802 183079 DEBUG oslo_concurrency.lockutils [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "e02af423-c4c3-4fcd-be73-ebeba9fae411" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:15:47 compute-0 nova_compute[183075]: 2026-01-22 17:15:47.838 183079 DEBUG nova.compute.manager [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:15:47 compute-0 nova_compute[183075]: 2026-01-22 17:15:47.913 183079 DEBUG oslo_concurrency.lockutils [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:15:47 compute-0 nova_compute[183075]: 2026-01-22 17:15:47.914 183079 DEBUG oslo_concurrency.lockutils [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:15:47 compute-0 nova_compute[183075]: 2026-01-22 17:15:47.922 183079 DEBUG nova.virt.hardware [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:15:47 compute-0 nova_compute[183075]: 2026-01-22 17:15:47.922 183079 INFO nova.compute.claims [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:15:48 compute-0 nova_compute[183075]: 2026-01-22 17:15:48.152 183079 DEBUG nova.compute.provider_tree [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:15:48 compute-0 nova_compute[183075]: 2026-01-22 17:15:48.177 183079 DEBUG nova.scheduler.client.report [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:15:48 compute-0 nova_compute[183075]: 2026-01-22 17:15:48.212 183079 DEBUG oslo_concurrency.lockutils [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.299s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:15:48 compute-0 nova_compute[183075]: 2026-01-22 17:15:48.214 183079 DEBUG nova.compute.manager [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:15:48 compute-0 nova_compute[183075]: 2026-01-22 17:15:48.285 183079 DEBUG nova.compute.manager [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:15:48 compute-0 nova_compute[183075]: 2026-01-22 17:15:48.286 183079 DEBUG nova.network.neutron [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:15:48 compute-0 nova_compute[183075]: 2026-01-22 17:15:48.308 183079 INFO nova.virt.libvirt.driver [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:15:48 compute-0 nova_compute[183075]: 2026-01-22 17:15:48.329 183079 DEBUG nova.compute.manager [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:15:48 compute-0 nova_compute[183075]: 2026-01-22 17:15:48.425 183079 DEBUG nova.compute.manager [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:15:48 compute-0 nova_compute[183075]: 2026-01-22 17:15:48.426 183079 DEBUG nova.virt.libvirt.driver [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:15:48 compute-0 nova_compute[183075]: 2026-01-22 17:15:48.426 183079 INFO nova.virt.libvirt.driver [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Creating image(s)
Jan 22 17:15:48 compute-0 nova_compute[183075]: 2026-01-22 17:15:48.427 183079 DEBUG oslo_concurrency.lockutils [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquiring lock "/var/lib/nova/instances/e02af423-c4c3-4fcd-be73-ebeba9fae411/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:15:48 compute-0 nova_compute[183075]: 2026-01-22 17:15:48.427 183079 DEBUG oslo_concurrency.lockutils [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "/var/lib/nova/instances/e02af423-c4c3-4fcd-be73-ebeba9fae411/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:15:48 compute-0 nova_compute[183075]: 2026-01-22 17:15:48.428 183079 DEBUG oslo_concurrency.lockutils [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "/var/lib/nova/instances/e02af423-c4c3-4fcd-be73-ebeba9fae411/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:15:48 compute-0 nova_compute[183075]: 2026-01-22 17:15:48.442 183079 DEBUG oslo_concurrency.processutils [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:15:48 compute-0 nova_compute[183075]: 2026-01-22 17:15:48.507 183079 DEBUG nova.policy [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2fe02f7484a94091bab26aba1c370459', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4f95c62a5a194d2291b03187a9c85702', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:15:48 compute-0 nova_compute[183075]: 2026-01-22 17:15:48.530 183079 DEBUG oslo_concurrency.processutils [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:15:48 compute-0 nova_compute[183075]: 2026-01-22 17:15:48.531 183079 DEBUG oslo_concurrency.lockutils [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:15:48 compute-0 nova_compute[183075]: 2026-01-22 17:15:48.531 183079 DEBUG oslo_concurrency.lockutils [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:15:48 compute-0 nova_compute[183075]: 2026-01-22 17:15:48.545 183079 DEBUG oslo_concurrency.processutils [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:15:48 compute-0 nova_compute[183075]: 2026-01-22 17:15:48.617 183079 DEBUG oslo_concurrency.processutils [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:15:48 compute-0 nova_compute[183075]: 2026-01-22 17:15:48.618 183079 DEBUG oslo_concurrency.processutils [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/e02af423-c4c3-4fcd-be73-ebeba9fae411/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:15:48 compute-0 nova_compute[183075]: 2026-01-22 17:15:48.758 183079 DEBUG oslo_concurrency.processutils [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/e02af423-c4c3-4fcd-be73-ebeba9fae411/disk 1073741824" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:15:48 compute-0 nova_compute[183075]: 2026-01-22 17:15:48.759 183079 DEBUG oslo_concurrency.lockutils [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.228s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:15:48 compute-0 nova_compute[183075]: 2026-01-22 17:15:48.760 183079 DEBUG oslo_concurrency.processutils [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:15:48 compute-0 nova_compute[183075]: 2026-01-22 17:15:48.856 183079 DEBUG oslo_concurrency.processutils [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:15:48 compute-0 nova_compute[183075]: 2026-01-22 17:15:48.857 183079 DEBUG nova.virt.disk.api [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Checking if we can resize image /var/lib/nova/instances/e02af423-c4c3-4fcd-be73-ebeba9fae411/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:15:48 compute-0 nova_compute[183075]: 2026-01-22 17:15:48.858 183079 DEBUG oslo_concurrency.processutils [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e02af423-c4c3-4fcd-be73-ebeba9fae411/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:15:48 compute-0 nova_compute[183075]: 2026-01-22 17:15:48.923 183079 DEBUG oslo_concurrency.processutils [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e02af423-c4c3-4fcd-be73-ebeba9fae411/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:15:48 compute-0 nova_compute[183075]: 2026-01-22 17:15:48.925 183079 DEBUG nova.virt.disk.api [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Cannot resize image /var/lib/nova/instances/e02af423-c4c3-4fcd-be73-ebeba9fae411/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:15:48 compute-0 nova_compute[183075]: 2026-01-22 17:15:48.926 183079 DEBUG nova.objects.instance [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lazy-loading 'migration_context' on Instance uuid e02af423-c4c3-4fcd-be73-ebeba9fae411 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:15:49 compute-0 nova_compute[183075]: 2026-01-22 17:15:49.051 183079 DEBUG nova.virt.libvirt.driver [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:15:49 compute-0 nova_compute[183075]: 2026-01-22 17:15:49.051 183079 DEBUG nova.virt.libvirt.driver [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Ensure instance console log exists: /var/lib/nova/instances/e02af423-c4c3-4fcd-be73-ebeba9fae411/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:15:49 compute-0 nova_compute[183075]: 2026-01-22 17:15:49.052 183079 DEBUG oslo_concurrency.lockutils [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:15:49 compute-0 nova_compute[183075]: 2026-01-22 17:15:49.053 183079 DEBUG oslo_concurrency.lockutils [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:15:49 compute-0 nova_compute[183075]: 2026-01-22 17:15:49.053 183079 DEBUG oslo_concurrency.lockutils [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:15:50 compute-0 nova_compute[183075]: 2026-01-22 17:15:50.327 183079 DEBUG nova.network.neutron [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Successfully created port: 0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:15:51 compute-0 nova_compute[183075]: 2026-01-22 17:15:51.054 183079 DEBUG nova.network.neutron [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Successfully created port: 01441304-20f8-4d07-a1ed-05da5d9297d6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:15:51 compute-0 nova_compute[183075]: 2026-01-22 17:15:51.061 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:51 compute-0 nova_compute[183075]: 2026-01-22 17:15:51.871 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:52 compute-0 ovn_controller[95372]: 2026-01-22T17:15:52Z|00338|binding|INFO|Releasing lport 934896c0-00c7-4510-9857-208659266984 from this chassis (sb_readonly=0)
Jan 22 17:15:52 compute-0 ovn_controller[95372]: 2026-01-22T17:15:52Z|00339|binding|INFO|Releasing lport 934896c0-00c7-4510-9857-208659266984 from this chassis (sb_readonly=0)
Jan 22 17:15:52 compute-0 nova_compute[183075]: 2026-01-22 17:15:52.314 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:52 compute-0 nova_compute[183075]: 2026-01-22 17:15:52.365 183079 INFO nova.compute.manager [None req-270a92a7-2936-4773-8ad2-fe300ba498fa 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Get console output
Jan 22 17:15:52 compute-0 nova_compute[183075]: 2026-01-22 17:15:52.370 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:15:52 compute-0 nova_compute[183075]: 2026-01-22 17:15:52.429 183079 DEBUG nova.network.neutron [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Successfully updated port: 0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:15:52 compute-0 nova_compute[183075]: 2026-01-22 17:15:52.516 183079 DEBUG nova.compute.manager [req-fec75a05-e5c0-41dd-87ff-2552bf6bae37 req-bbc96f36-79da-4579-9f54-76c973caa435 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Received event network-changed-0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:15:52 compute-0 nova_compute[183075]: 2026-01-22 17:15:52.517 183079 DEBUG nova.compute.manager [req-fec75a05-e5c0-41dd-87ff-2552bf6bae37 req-bbc96f36-79da-4579-9f54-76c973caa435 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Refreshing instance network info cache due to event network-changed-0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:15:52 compute-0 nova_compute[183075]: 2026-01-22 17:15:52.517 183079 DEBUG oslo_concurrency.lockutils [req-fec75a05-e5c0-41dd-87ff-2552bf6bae37 req-bbc96f36-79da-4579-9f54-76c973caa435 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-e02af423-c4c3-4fcd-be73-ebeba9fae411" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:15:52 compute-0 nova_compute[183075]: 2026-01-22 17:15:52.518 183079 DEBUG oslo_concurrency.lockutils [req-fec75a05-e5c0-41dd-87ff-2552bf6bae37 req-bbc96f36-79da-4579-9f54-76c973caa435 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-e02af423-c4c3-4fcd-be73-ebeba9fae411" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:15:52 compute-0 nova_compute[183075]: 2026-01-22 17:15:52.518 183079 DEBUG nova.network.neutron [req-fec75a05-e5c0-41dd-87ff-2552bf6bae37 req-bbc96f36-79da-4579-9f54-76c973caa435 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Refreshing network info cache for port 0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:15:52 compute-0 nova_compute[183075]: 2026-01-22 17:15:52.706 183079 DEBUG nova.network.neutron [req-fec75a05-e5c0-41dd-87ff-2552bf6bae37 req-bbc96f36-79da-4579-9f54-76c973caa435 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:15:53 compute-0 nova_compute[183075]: 2026-01-22 17:15:53.024 183079 DEBUG nova.network.neutron [req-fec75a05-e5c0-41dd-87ff-2552bf6bae37 req-bbc96f36-79da-4579-9f54-76c973caa435 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:15:53 compute-0 nova_compute[183075]: 2026-01-22 17:15:53.045 183079 DEBUG oslo_concurrency.lockutils [req-fec75a05-e5c0-41dd-87ff-2552bf6bae37 req-bbc96f36-79da-4579-9f54-76c973caa435 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-e02af423-c4c3-4fcd-be73-ebeba9fae411" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:15:53 compute-0 nova_compute[183075]: 2026-01-22 17:15:53.233 183079 DEBUG nova.network.neutron [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Successfully updated port: 01441304-20f8-4d07-a1ed-05da5d9297d6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:15:53 compute-0 nova_compute[183075]: 2026-01-22 17:15:53.637 183079 DEBUG oslo_concurrency.lockutils [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquiring lock "refresh_cache-e02af423-c4c3-4fcd-be73-ebeba9fae411" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:15:53 compute-0 nova_compute[183075]: 2026-01-22 17:15:53.638 183079 DEBUG oslo_concurrency.lockutils [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquired lock "refresh_cache-e02af423-c4c3-4fcd-be73-ebeba9fae411" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:15:53 compute-0 nova_compute[183075]: 2026-01-22 17:15:53.639 183079 DEBUG nova.network.neutron [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:15:53 compute-0 nova_compute[183075]: 2026-01-22 17:15:53.823 183079 DEBUG nova.network.neutron [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:15:54 compute-0 nova_compute[183075]: 2026-01-22 17:15:54.609 183079 DEBUG nova.compute.manager [req-46d46fee-3102-4044-b6a8-0effc58f663a req-0d2159b8-c02a-43d8-8132-48a6687cc641 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Received event network-changed-01441304-20f8-4d07-a1ed-05da5d9297d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:15:54 compute-0 nova_compute[183075]: 2026-01-22 17:15:54.609 183079 DEBUG nova.compute.manager [req-46d46fee-3102-4044-b6a8-0effc58f663a req-0d2159b8-c02a-43d8-8132-48a6687cc641 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Refreshing instance network info cache due to event network-changed-01441304-20f8-4d07-a1ed-05da5d9297d6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:15:54 compute-0 nova_compute[183075]: 2026-01-22 17:15:54.610 183079 DEBUG oslo_concurrency.lockutils [req-46d46fee-3102-4044-b6a8-0effc58f663a req-0d2159b8-c02a-43d8-8132-48a6687cc641 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-e02af423-c4c3-4fcd-be73-ebeba9fae411" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:15:55 compute-0 ovn_controller[95372]: 2026-01-22T17:15:55Z|00340|binding|INFO|Releasing lport 934896c0-00c7-4510-9857-208659266984 from this chassis (sb_readonly=0)
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.386 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:55 compute-0 podman[223577]: 2026-01-22 17:15:55.389746264 +0000 UTC m=+0.090731691 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.455 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704', 'name': 'tempest-server-test-1891638859', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000001d', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '8d764c6e1fdd46b88f83657e6a259c71', 'user_id': '21741b1e79254e698cc6d7684318589f', 'hostId': '6e9b60e5d1652caafccaf151c3ac0b618f0e3f1c99f980e373c836ff', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.457 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.462 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 1b9e0b4e-34d1-46d1-8f04-a07da354e704 / tap00a6b90a-3d inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.462 12 DEBUG ceilometer.compute.pollsters [-] 1b9e0b4e-34d1-46d1-8f04-a07da354e704/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '10a82e0c-939f-46b7-89b2-992e07b6dab0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '21741b1e79254e698cc6d7684318589f', 'user_name': None, 'project_id': '8d764c6e1fdd46b88f83657e6a259c71', 'project_name': None, 'resource_id': 'instance-0000001d-1b9e0b4e-34d1-46d1-8f04-a07da354e704-tap00a6b90a-3d', 'timestamp': '2026-01-22T17:15:55.457264', 'resource_metadata': {'display_name': 'tempest-server-test-1891638859', 'name': 'tap00a6b90a-3d', 'instance_id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704', 'instance_type': 'm1.nano', 'host': '6e9b60e5d1652caafccaf151c3ac0b618f0e3f1c99f980e373c836ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:05:67:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap00a6b90a-3d'}, 'message_id': '02fb8fc2-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4519.21773043, 'message_signature': 'c987c63907194974f7410df6f8780a674d05895fff11dfbb4b7277ce4758980e'}]}, 'timestamp': '2026-01-22 17:15:55.463874', '_unique_id': 'b59f6c5053d34d2992589de7856a98c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.466 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.467 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.467 12 DEBUG ceilometer.compute.pollsters [-] 1b9e0b4e-34d1-46d1-8f04-a07da354e704/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3c23cfe5-839d-4a5c-9303-d096b08b6b2f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '21741b1e79254e698cc6d7684318589f', 'user_name': None, 'project_id': '8d764c6e1fdd46b88f83657e6a259c71', 'project_name': None, 'resource_id': 'instance-0000001d-1b9e0b4e-34d1-46d1-8f04-a07da354e704-tap00a6b90a-3d', 'timestamp': '2026-01-22T17:15:55.467928', 'resource_metadata': {'display_name': 'tempest-server-test-1891638859', 'name': 'tap00a6b90a-3d', 'instance_id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704', 'instance_type': 'm1.nano', 'host': '6e9b60e5d1652caafccaf151c3ac0b618f0e3f1c99f980e373c836ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:05:67:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap00a6b90a-3d'}, 'message_id': '02fc48c2-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4519.21773043, 'message_signature': '9311e847495575a1ce1e5b52a84ead25c4a28511260f4a4990372646e672928f'}]}, 'timestamp': '2026-01-22 17:15:55.468458', '_unique_id': '94e468f82c6e4ea2bff796519e7b9866'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.469 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.470 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.471 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.471 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1891638859>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1891638859>]
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.471 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.471 12 DEBUG ceilometer.compute.pollsters [-] 1b9e0b4e-34d1-46d1-8f04-a07da354e704/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9425038a-d790-47e2-b538-90172fc6dbdf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '21741b1e79254e698cc6d7684318589f', 'user_name': None, 'project_id': '8d764c6e1fdd46b88f83657e6a259c71', 'project_name': None, 'resource_id': 'instance-0000001d-1b9e0b4e-34d1-46d1-8f04-a07da354e704-tap00a6b90a-3d', 'timestamp': '2026-01-22T17:15:55.471828', 'resource_metadata': {'display_name': 'tempest-server-test-1891638859', 'name': 'tap00a6b90a-3d', 'instance_id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704', 'instance_type': 'm1.nano', 'host': '6e9b60e5d1652caafccaf151c3ac0b618f0e3f1c99f980e373c836ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:05:67:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap00a6b90a-3d'}, 'message_id': '02fce0ac-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4519.21773043, 'message_signature': '9bb10aafa49e834a64ed4dab0250a8eb9deddc99ff0fef812bfdb9cf09a6f088'}]}, 'timestamp': '2026-01-22 17:15:55.472341', '_unique_id': '69080a40863e41f0b30a71f666ea35b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.473 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.474 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.474 12 DEBUG ceilometer.compute.pollsters [-] 1b9e0b4e-34d1-46d1-8f04-a07da354e704/network.outgoing.bytes volume: 10766 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4fbdee69-da3f-46a0-84bd-15de7d0f5d67', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10766, 'user_id': '21741b1e79254e698cc6d7684318589f', 'user_name': None, 'project_id': '8d764c6e1fdd46b88f83657e6a259c71', 'project_name': None, 'resource_id': 'instance-0000001d-1b9e0b4e-34d1-46d1-8f04-a07da354e704-tap00a6b90a-3d', 'timestamp': '2026-01-22T17:15:55.474940', 'resource_metadata': {'display_name': 'tempest-server-test-1891638859', 'name': 'tap00a6b90a-3d', 'instance_id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704', 'instance_type': 'm1.nano', 'host': '6e9b60e5d1652caafccaf151c3ac0b618f0e3f1c99f980e373c836ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:05:67:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap00a6b90a-3d'}, 'message_id': '02fd5af0-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4519.21773043, 'message_signature': '01de5d3fc598a37d15c5da1aa341edf80e010bbb1bd7da9869471156047eae10'}]}, 'timestamp': '2026-01-22 17:15:55.475464', '_unique_id': '9544a2c2df8a41b8970c70ccd742b86f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.476 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.477 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.501 12 DEBUG ceilometer.compute.pollsters [-] 1b9e0b4e-34d1-46d1-8f04-a07da354e704/disk.device.write.latency volume: 5062274914 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73caac0b-01a3-465c-a234-b50caebcb490', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5062274914, 'user_id': '21741b1e79254e698cc6d7684318589f', 'user_name': None, 'project_id': '8d764c6e1fdd46b88f83657e6a259c71', 'project_name': None, 'resource_id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704-vda', 'timestamp': '2026-01-22T17:15:55.477860', 'resource_metadata': {'display_name': 'tempest-server-test-1891638859', 'name': 'instance-0000001d', 'instance_id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704', 'instance_type': 'm1.nano', 'host': '6e9b60e5d1652caafccaf151c3ac0b618f0e3f1c99f980e373c836ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03017400-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4519.238323698, 'message_signature': '73be4f5e8a39d8edb3e8b4ae0dbd219029971311944731d20e8dfc4a78e0a44a'}]}, 'timestamp': '2026-01-22 17:15:55.502341', '_unique_id': 'e219fe309c5b4380a27599bdae3d4ce3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.503 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.505 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.515 12 DEBUG ceilometer.compute.pollsters [-] 1b9e0b4e-34d1-46d1-8f04-a07da354e704/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd12829f5-9046-4c03-adde-5abef1336df9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '21741b1e79254e698cc6d7684318589f', 'user_name': None, 'project_id': '8d764c6e1fdd46b88f83657e6a259c71', 'project_name': None, 'resource_id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704-vda', 'timestamp': '2026-01-22T17:15:55.505209', 'resource_metadata': {'display_name': 'tempest-server-test-1891638859', 'name': 'instance-0000001d', 'instance_id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704', 'instance_type': 'm1.nano', 'host': '6e9b60e5d1652caafccaf151c3ac0b618f0e3f1c99f980e373c836ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03039f50-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4519.265686173, 'message_signature': 'f1212c1558a26a587d8c2d6486690eca4468a1e5357bb3026dae9cc87636ef09'}]}, 'timestamp': '2026-01-22 17:15:55.516549', '_unique_id': '39fe52daa1584d1a8f62fee6c86acf60'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.517 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.519 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.519 12 DEBUG ceilometer.compute.pollsters [-] 1b9e0b4e-34d1-46d1-8f04-a07da354e704/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd10b5a4f-4e9c-47ac-871a-1b8d9846db0c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '21741b1e79254e698cc6d7684318589f', 'user_name': None, 'project_id': '8d764c6e1fdd46b88f83657e6a259c71', 'project_name': None, 'resource_id': 'instance-0000001d-1b9e0b4e-34d1-46d1-8f04-a07da354e704-tap00a6b90a-3d', 'timestamp': '2026-01-22T17:15:55.519832', 'resource_metadata': {'display_name': 'tempest-server-test-1891638859', 'name': 'tap00a6b90a-3d', 'instance_id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704', 'instance_type': 'm1.nano', 'host': '6e9b60e5d1652caafccaf151c3ac0b618f0e3f1c99f980e373c836ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:05:67:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap00a6b90a-3d'}, 'message_id': '0304349c-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4519.21773043, 'message_signature': '22cd88588d4559a3be17d8b37b11930b5c6a943e8b8b9c2b5de9f9f3b16980b1'}]}, 'timestamp': '2026-01-22 17:15:55.520361', '_unique_id': 'fbcbde582e5f4f28b52ee064bd98e13e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.521 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.522 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.522 12 DEBUG ceilometer.compute.pollsters [-] 1b9e0b4e-34d1-46d1-8f04-a07da354e704/disk.device.allocation volume: 30220288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7fd17616-aa68-4784-839a-9b56e1309200', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30220288, 'user_id': '21741b1e79254e698cc6d7684318589f', 'user_name': None, 'project_id': '8d764c6e1fdd46b88f83657e6a259c71', 'project_name': None, 'resource_id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704-vda', 'timestamp': '2026-01-22T17:15:55.522814', 'resource_metadata': {'display_name': 'tempest-server-test-1891638859', 'name': 'instance-0000001d', 'instance_id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704', 'instance_type': 'm1.nano', 'host': '6e9b60e5d1652caafccaf151c3ac0b618f0e3f1c99f980e373c836ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0304a7e2-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4519.265686173, 'message_signature': '3fd040259d157fad3a64592b861449bb8667c5fa60655e55a3c1b83aef835628'}]}, 'timestamp': '2026-01-22 17:15:55.523348', '_unique_id': 'd300f352359b4d4a8c01564754062962'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.524 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.525 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.525 12 DEBUG ceilometer.compute.pollsters [-] 1b9e0b4e-34d1-46d1-8f04-a07da354e704/disk.device.read.bytes volume: 30181888 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fe4e99a0-3935-4c70-8687-54eb81c073c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30181888, 'user_id': '21741b1e79254e698cc6d7684318589f', 'user_name': None, 'project_id': '8d764c6e1fdd46b88f83657e6a259c71', 'project_name': None, 'resource_id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704-vda', 'timestamp': '2026-01-22T17:15:55.525900', 'resource_metadata': {'display_name': 'tempest-server-test-1891638859', 'name': 'instance-0000001d', 'instance_id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704', 'instance_type': 'm1.nano', 'host': '6e9b60e5d1652caafccaf151c3ac0b618f0e3f1c99f980e373c836ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03052096-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4519.238323698, 'message_signature': '58ad45adb3205d1c1633022bf7f982db40b12f310ec477236eeaebe40ee722bd'}]}, 'timestamp': '2026-01-22 17:15:55.526389', '_unique_id': '946b8c9dd26f4226be915c6a45519179'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.527 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.528 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.528 12 DEBUG ceilometer.compute.pollsters [-] 1b9e0b4e-34d1-46d1-8f04-a07da354e704/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0370d50f-6f24-4eaf-95e8-51cfbec0e185', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '21741b1e79254e698cc6d7684318589f', 'user_name': None, 'project_id': '8d764c6e1fdd46b88f83657e6a259c71', 'project_name': None, 'resource_id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704-vda', 'timestamp': '2026-01-22T17:15:55.528844', 'resource_metadata': {'display_name': 'tempest-server-test-1891638859', 'name': 'instance-0000001d', 'instance_id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704', 'instance_type': 'm1.nano', 'host': '6e9b60e5d1652caafccaf151c3ac0b618f0e3f1c99f980e373c836ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03059350-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4519.265686173, 'message_signature': '8b4fe0d657d91ea3af349587fecbb9bd73eeefb48fceab3e38c9aad2386dfd85'}]}, 'timestamp': '2026-01-22 17:15:55.529320', '_unique_id': 'c86f506551e04d0d98a1eb7d1811e1e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.530 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.531 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.531 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.532 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-1891638859>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1891638859>]
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.532 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.532 12 DEBUG ceilometer.compute.pollsters [-] 1b9e0b4e-34d1-46d1-8f04-a07da354e704/disk.device.read.requests volume: 1122 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '42cda47f-0889-4182-9674-eb6cea95ae0d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1122, 'user_id': '21741b1e79254e698cc6d7684318589f', 'user_name': None, 'project_id': '8d764c6e1fdd46b88f83657e6a259c71', 'project_name': None, 'resource_id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704-vda', 'timestamp': '2026-01-22T17:15:55.532512', 'resource_metadata': {'display_name': 'tempest-server-test-1891638859', 'name': 'instance-0000001d', 'instance_id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704', 'instance_type': 'm1.nano', 'host': '6e9b60e5d1652caafccaf151c3ac0b618f0e3f1c99f980e373c836ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03062482-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4519.238323698, 'message_signature': '90a13f5a69ba329dc20965729c9b2c681d03d3a9d48883e9da75419966e53365'}]}, 'timestamp': '2026-01-22 17:15:55.533041', '_unique_id': '0007ef7557954c13bd7d50c7f686d2a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.535 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.535 12 DEBUG ceilometer.compute.pollsters [-] 1b9e0b4e-34d1-46d1-8f04-a07da354e704/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4ef58cf-3eaa-4a74-9be3-675029b1fd8e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '21741b1e79254e698cc6d7684318589f', 'user_name': None, 'project_id': '8d764c6e1fdd46b88f83657e6a259c71', 'project_name': None, 'resource_id': 'instance-0000001d-1b9e0b4e-34d1-46d1-8f04-a07da354e704-tap00a6b90a-3d', 'timestamp': '2026-01-22T17:15:55.535846', 'resource_metadata': {'display_name': 'tempest-server-test-1891638859', 'name': 'tap00a6b90a-3d', 'instance_id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704', 'instance_type': 'm1.nano', 'host': '6e9b60e5d1652caafccaf151c3ac0b618f0e3f1c99f980e373c836ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:05:67:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap00a6b90a-3d'}, 'message_id': '0306a8da-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4519.21773043, 'message_signature': '7d5eab3248b64ea1bde8de6e17d9139b00ad797fc83c82e5b3890e4d84b46c19'}]}, 'timestamp': '2026-01-22 17:15:55.536539', '_unique_id': '38e159832929495ab4ce6898836fa8d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.537 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.539 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.539 12 DEBUG ceilometer.compute.pollsters [-] 1b9e0b4e-34d1-46d1-8f04-a07da354e704/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f627375d-8c20-4a0d-a682-61cd542d9518', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '21741b1e79254e698cc6d7684318589f', 'user_name': None, 'project_id': '8d764c6e1fdd46b88f83657e6a259c71', 'project_name': None, 'resource_id': 'instance-0000001d-1b9e0b4e-34d1-46d1-8f04-a07da354e704-tap00a6b90a-3d', 'timestamp': '2026-01-22T17:15:55.539803', 'resource_metadata': {'display_name': 'tempest-server-test-1891638859', 'name': 'tap00a6b90a-3d', 'instance_id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704', 'instance_type': 'm1.nano', 'host': '6e9b60e5d1652caafccaf151c3ac0b618f0e3f1c99f980e373c836ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:05:67:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap00a6b90a-3d'}, 'message_id': '0307402e-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4519.21773043, 'message_signature': '3dbbf827c72d8979c88a5ed3dc94776dfc9831524e4cef85f5dc67b8453955ff'}]}, 'timestamp': '2026-01-22 17:15:55.540316', '_unique_id': 'b5fcf16adba540d2b17f7f8f8a89f246'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.541 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.542 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.572 12 DEBUG ceilometer.compute.pollsters [-] 1b9e0b4e-34d1-46d1-8f04-a07da354e704/memory.usage volume: 43.3984375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5607953c-1b69-4526-bbdd-ae5b58f2f3fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.3984375, 'user_id': '21741b1e79254e698cc6d7684318589f', 'user_name': None, 'project_id': '8d764c6e1fdd46b88f83657e6a259c71', 'project_name': None, 'resource_id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704', 'timestamp': '2026-01-22T17:15:55.542897', 'resource_metadata': {'display_name': 'tempest-server-test-1891638859', 'name': 'instance-0000001d', 'instance_id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704', 'instance_type': 'm1.nano', 'host': '6e9b60e5d1652caafccaf151c3ac0b618f0e3f1c99f980e373c836ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '030c3692-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4519.332371345, 'message_signature': 'c9bd2a34fbc3ee85a9394b85b74637ceb74cf96ff827969e90eec6955feb64bd'}]}, 'timestamp': '2026-01-22 17:15:55.572968', '_unique_id': 'e5b4d2a409934de9a2b3fb6866e38374'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.574 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.575 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.576 12 DEBUG ceilometer.compute.pollsters [-] 1b9e0b4e-34d1-46d1-8f04-a07da354e704/network.incoming.bytes volume: 7274 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0938a98f-e7a0-40c4-bc90-4913753c839f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7274, 'user_id': '21741b1e79254e698cc6d7684318589f', 'user_name': None, 'project_id': '8d764c6e1fdd46b88f83657e6a259c71', 'project_name': None, 'resource_id': 'instance-0000001d-1b9e0b4e-34d1-46d1-8f04-a07da354e704-tap00a6b90a-3d', 'timestamp': '2026-01-22T17:15:55.576140', 'resource_metadata': {'display_name': 'tempest-server-test-1891638859', 'name': 'tap00a6b90a-3d', 'instance_id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704', 'instance_type': 'm1.nano', 'host': '6e9b60e5d1652caafccaf151c3ac0b618f0e3f1c99f980e373c836ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:05:67:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap00a6b90a-3d'}, 'message_id': '030ccb7a-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4519.21773043, 'message_signature': 'a882a455f60085f158e7e7811da69b3337f92f4ae9fbaef461f5afab8df06917'}]}, 'timestamp': '2026-01-22 17:15:55.576714', '_unique_id': '398dcc9bc3074a0c9ca76ee9028a9d01'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.577 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.578 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.579 12 DEBUG ceilometer.compute.pollsters [-] 1b9e0b4e-34d1-46d1-8f04-a07da354e704/disk.device.write.requests volume: 322 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4731bc38-5959-4ba0-94ef-dfe81e588c2c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 322, 'user_id': '21741b1e79254e698cc6d7684318589f', 'user_name': None, 'project_id': '8d764c6e1fdd46b88f83657e6a259c71', 'project_name': None, 'resource_id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704-vda', 'timestamp': '2026-01-22T17:15:55.579025', 'resource_metadata': {'display_name': 'tempest-server-test-1891638859', 'name': 'instance-0000001d', 'instance_id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704', 'instance_type': 'm1.nano', 'host': '6e9b60e5d1652caafccaf151c3ac0b618f0e3f1c99f980e373c836ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '030d3cfe-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4519.238323698, 'message_signature': 'f7cd748b07dc526b4de3af9a5a388720899c4fb8ee4f89038d19fa8a3d8a8238'}]}, 'timestamp': '2026-01-22 17:15:55.579534', '_unique_id': 'd5510fd129a2418ca2654abe2d42cf3b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.580 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.581 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.581 12 DEBUG ceilometer.compute.pollsters [-] 1b9e0b4e-34d1-46d1-8f04-a07da354e704/network.outgoing.packets volume: 122 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26c1299b-dd8f-47ab-896d-abe6687c97cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 122, 'user_id': '21741b1e79254e698cc6d7684318589f', 'user_name': None, 'project_id': '8d764c6e1fdd46b88f83657e6a259c71', 'project_name': None, 'resource_id': 'instance-0000001d-1b9e0b4e-34d1-46d1-8f04-a07da354e704-tap00a6b90a-3d', 'timestamp': '2026-01-22T17:15:55.581888', 'resource_metadata': {'display_name': 'tempest-server-test-1891638859', 'name': 'tap00a6b90a-3d', 'instance_id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704', 'instance_type': 'm1.nano', 'host': '6e9b60e5d1652caafccaf151c3ac0b618f0e3f1c99f980e373c836ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:05:67:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap00a6b90a-3d'}, 'message_id': '030dab62-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4519.21773043, 'message_signature': 'e76fc0d213ed49f0552a8c341cbfe42ba7087d696e92d31657206572b714e594'}]}, 'timestamp': '2026-01-22 17:15:55.582374', '_unique_id': '450f9df4e2b34cb2b461af05c91b8217'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.583 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.584 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.584 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.584 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1891638859>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1891638859>]
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.585 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.585 12 DEBUG ceilometer.compute.pollsters [-] 1b9e0b4e-34d1-46d1-8f04-a07da354e704/network.incoming.packets volume: 61 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd331a4ff-379f-4374-8b0b-c63b5019e079', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 61, 'user_id': '21741b1e79254e698cc6d7684318589f', 'user_name': None, 'project_id': '8d764c6e1fdd46b88f83657e6a259c71', 'project_name': None, 'resource_id': 'instance-0000001d-1b9e0b4e-34d1-46d1-8f04-a07da354e704-tap00a6b90a-3d', 'timestamp': '2026-01-22T17:15:55.585391', 'resource_metadata': {'display_name': 'tempest-server-test-1891638859', 'name': 'tap00a6b90a-3d', 'instance_id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704', 'instance_type': 'm1.nano', 'host': '6e9b60e5d1652caafccaf151c3ac0b618f0e3f1c99f980e373c836ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:05:67:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap00a6b90a-3d'}, 'message_id': '030e3686-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4519.21773043, 'message_signature': 'd12fe2e52f88aa3522c2ca2f08bfb9e94715dbeb2ddaf5eb282cbad4f270aa92'}]}, 'timestamp': '2026-01-22 17:15:55.586050', '_unique_id': 'a1311b7abb634d1a8c5ee1de13d92496'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.587 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.589 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.589 12 DEBUG ceilometer.compute.pollsters [-] 1b9e0b4e-34d1-46d1-8f04-a07da354e704/cpu volume: 11400000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.588 183079 DEBUG nova.network.neutron [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Updating instance_info_cache with network_info: [{"id": "0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510", "address": "fa:16:3e:d9:53:87", "network": {"id": "359b74c5-cbeb-4440-a3e9-a16a51b1ab77", "bridge": "br-int", "label": "tempest-test-network--531378131", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b803bb5-5b", "ovs_interfaceid": "0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "01441304-20f8-4d07-a1ed-05da5d9297d6", "address": "fa:16:3e:db:72:7a", "network": {"id": "ee52ea03-3241-4391-9bfe-b2039dbf3bfe", "bridge": "br-int", "label": "tempest-test-network--458174995", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fedb:727a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01441304-20", "ovs_interfaceid": "01441304-20f8-4d07-a1ed-05da5d9297d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '593bd7b6-27fe-425b-9415-0efb5840a48c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11400000000, 'user_id': '21741b1e79254e698cc6d7684318589f', 'user_name': None, 'project_id': '8d764c6e1fdd46b88f83657e6a259c71', 'project_name': None, 'resource_id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704', 'timestamp': '2026-01-22T17:15:55.589341', 'resource_metadata': {'display_name': 'tempest-server-test-1891638859', 'name': 'instance-0000001d', 'instance_id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704', 'instance_type': 'm1.nano', 'host': '6e9b60e5d1652caafccaf151c3ac0b618f0e3f1c99f980e373c836ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '030ed44c-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4519.332371345, 'message_signature': 'f628e73edf3538b8326bdc9135b4c74ea8660f9ef8853643e1d45ec46bbefc85'}]}, 'timestamp': '2026-01-22 17:15:55.590061', '_unique_id': '6d4080dc67ee450a97ec2a048beea90a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.591 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.593 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.593 12 DEBUG ceilometer.compute.pollsters [-] 1b9e0b4e-34d1-46d1-8f04-a07da354e704/disk.device.write.bytes volume: 72986624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c27f1f01-c549-448a-8f91-9ab7d1fb280e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72986624, 'user_id': '21741b1e79254e698cc6d7684318589f', 'user_name': None, 'project_id': '8d764c6e1fdd46b88f83657e6a259c71', 'project_name': None, 'resource_id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704-vda', 'timestamp': '2026-01-22T17:15:55.593354', 'resource_metadata': {'display_name': 'tempest-server-test-1891638859', 'name': 'instance-0000001d', 'instance_id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704', 'instance_type': 'm1.nano', 'host': '6e9b60e5d1652caafccaf151c3ac0b618f0e3f1c99f980e373c836ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '030f6f56-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4519.238323698, 'message_signature': 'bbcde1dd9dbf41305ccf1640b058a47921cd237654e9dd4e2c7e2104437858fe'}]}, 'timestamp': '2026-01-22 17:15:55.593945', '_unique_id': 'e362b35a245e4c53bb5c4a48b893bc4e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.595 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.596 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.596 12 DEBUG ceilometer.compute.pollsters [-] 1b9e0b4e-34d1-46d1-8f04-a07da354e704/disk.device.read.latency volume: 202929016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24eae1e4-078c-4844-92a6-1a0c804f72c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 202929016, 'user_id': '21741b1e79254e698cc6d7684318589f', 'user_name': None, 'project_id': '8d764c6e1fdd46b88f83657e6a259c71', 'project_name': None, 'resource_id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704-vda', 'timestamp': '2026-01-22T17:15:55.596456', 'resource_metadata': {'display_name': 'tempest-server-test-1891638859', 'name': 'instance-0000001d', 'instance_id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704', 'instance_type': 'm1.nano', 'host': '6e9b60e5d1652caafccaf151c3ac0b618f0e3f1c99f980e373c836ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '030fe846-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4519.238323698, 'message_signature': 'ebe03302b6058f8c5ca4d7402022aaf018a9afc566058b0390c7ebf2f81dcec7'}]}, 'timestamp': '2026-01-22 17:15:55.597141', '_unique_id': 'ecaad863dcae4998945294f800558678'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.598 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.599 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.599 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:15:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:15:55.600 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-1891638859>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1891638859>]
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.618 183079 DEBUG oslo_concurrency.lockutils [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Releasing lock "refresh_cache-e02af423-c4c3-4fcd-be73-ebeba9fae411" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.618 183079 DEBUG nova.compute.manager [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Instance network_info: |[{"id": "0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510", "address": "fa:16:3e:d9:53:87", "network": {"id": "359b74c5-cbeb-4440-a3e9-a16a51b1ab77", "bridge": "br-int", "label": "tempest-test-network--531378131", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b803bb5-5b", "ovs_interfaceid": "0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "01441304-20f8-4d07-a1ed-05da5d9297d6", "address": "fa:16:3e:db:72:7a", "network": {"id": "ee52ea03-3241-4391-9bfe-b2039dbf3bfe", "bridge": "br-int", "label": "tempest-test-network--458174995", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fedb:727a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01441304-20", "ovs_interfaceid": "01441304-20f8-4d07-a1ed-05da5d9297d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.620 183079 DEBUG oslo_concurrency.lockutils [req-46d46fee-3102-4044-b6a8-0effc58f663a req-0d2159b8-c02a-43d8-8132-48a6687cc641 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-e02af423-c4c3-4fcd-be73-ebeba9fae411" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.620 183079 DEBUG nova.network.neutron [req-46d46fee-3102-4044-b6a8-0effc58f663a req-0d2159b8-c02a-43d8-8132-48a6687cc641 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Refreshing network info cache for port 01441304-20f8-4d07-a1ed-05da5d9297d6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.627 183079 DEBUG nova.virt.libvirt.driver [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Start _get_guest_xml network_info=[{"id": "0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510", "address": "fa:16:3e:d9:53:87", "network": {"id": "359b74c5-cbeb-4440-a3e9-a16a51b1ab77", "bridge": "br-int", "label": "tempest-test-network--531378131", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b803bb5-5b", "ovs_interfaceid": "0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "01441304-20f8-4d07-a1ed-05da5d9297d6", "address": "fa:16:3e:db:72:7a", "network": {"id": "ee52ea03-3241-4391-9bfe-b2039dbf3bfe", "bridge": "br-int", "label": "tempest-test-network--458174995", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fedb:727a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 
1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01441304-20", "ovs_interfaceid": "01441304-20f8-4d07-a1ed-05da5d9297d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.634 183079 WARNING nova.virt.libvirt.driver [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.642 183079 DEBUG nova.virt.libvirt.host [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.643 183079 DEBUG nova.virt.libvirt.host [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.655 183079 DEBUG nova.virt.libvirt.host [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.656 183079 DEBUG nova.virt.libvirt.host [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.657 183079 DEBUG nova.virt.libvirt.driver [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.658 183079 DEBUG nova.virt.hardware [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.659 183079 DEBUG nova.virt.hardware [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.659 183079 DEBUG nova.virt.hardware [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.660 183079 DEBUG nova.virt.hardware [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.660 183079 DEBUG nova.virt.hardware [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.661 183079 DEBUG nova.virt.hardware [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.661 183079 DEBUG nova.virt.hardware [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.662 183079 DEBUG nova.virt.hardware [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.663 183079 DEBUG nova.virt.hardware [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.663 183079 DEBUG nova.virt.hardware [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.664 183079 DEBUG nova.virt.hardware [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.670 183079 DEBUG nova.virt.libvirt.vif [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:15:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1223247828',display_name='tempest-server-test-1223247828',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1223247828',id=30,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDmUdcgi0W3CCW7+0zVPZm4+SGvgMVYIF2EiVCZOZNnFUe7R59w7UIBntHS+djSABVTvpJeBQFo+aQIZ5cRYQ1WALN6CDQL2VSKr+LjaC8setfZMT5V7IalcT1iGutwkuw==',key_name='tempest-keypair-test-914827607',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4f95c62a5a194d2291b03187a9c85702',ramdisk_id='',reservation_id='r-kexgovqt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-IPv6Test-1828780436',owner_user_name='tempest-IPv6Test-1828780436-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:15:48Z,user_data=None,user_id='2fe02f7484a94091bab26aba1c370459',uuid=e02af423-c4c3-4fcd-be73-ebeba9fae411,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510", "address": "fa:16:3e:d9:53:87", "network": {"id": "359b74c5-cbeb-4440-a3e9-a16a51b1ab77", "bridge": "br-int", "label": "tempest-test-network--531378131", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b803bb5-5b", "ovs_interfaceid": "0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.671 183079 DEBUG nova.network.os_vif_util [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converting VIF {"id": "0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510", "address": "fa:16:3e:d9:53:87", "network": {"id": "359b74c5-cbeb-4440-a3e9-a16a51b1ab77", "bridge": "br-int", "label": "tempest-test-network--531378131", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b803bb5-5b", "ovs_interfaceid": "0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.673 183079 DEBUG nova.network.os_vif_util [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:53:87,bridge_name='br-int',has_traffic_filtering=True,id=0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510,network=Network(359b74c5-cbeb-4440-a3e9-a16a51b1ab77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b803bb5-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.674 183079 DEBUG nova.virt.libvirt.vif [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:15:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1223247828',display_name='tempest-server-test-1223247828',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1223247828',id=30,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDmUdcgi0W3CCW7+0zVPZm4+SGvgMVYIF2EiVCZOZNnFUe7R59w7UIBntHS+djSABVTvpJeBQFo+aQIZ5cRYQ1WALN6CDQL2VSKr+LjaC8setfZMT5V7IalcT1iGutwkuw==',key_name='tempest-keypair-test-914827607',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4f95c62a5a194d2291b03187a9c85702',ramdisk_id='',reservation_id='r-kexgovqt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-IPv6Test-1828780436',owner_user_name='tempest-IPv6Test-1828780436-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:15:48Z,user_data=None,user_id='2fe02f7484a94091bab26aba1c370459',uuid=e02af423-c4c3-4fcd-be73-ebeba9fae411,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "01441304-20f8-4d07-a1ed-05da5d9297d6", "address": "fa:16:3e:db:72:7a", "network": {"id": "ee52ea03-3241-4391-9bfe-b2039dbf3bfe", "bridge": "br-int", "label": "tempest-test-network--458174995", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fedb:727a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01441304-20", "ovs_interfaceid": "01441304-20f8-4d07-a1ed-05da5d9297d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.675 183079 DEBUG nova.network.os_vif_util [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converting VIF {"id": "01441304-20f8-4d07-a1ed-05da5d9297d6", "address": "fa:16:3e:db:72:7a", "network": {"id": "ee52ea03-3241-4391-9bfe-b2039dbf3bfe", "bridge": "br-int", "label": "tempest-test-network--458174995", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fedb:727a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01441304-20", "ovs_interfaceid": "01441304-20f8-4d07-a1ed-05da5d9297d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.675 183079 DEBUG nova.network.os_vif_util [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:72:7a,bridge_name='br-int',has_traffic_filtering=True,id=01441304-20f8-4d07-a1ed-05da5d9297d6,network=Network(ee52ea03-3241-4391-9bfe-b2039dbf3bfe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01441304-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.676 183079 DEBUG nova.objects.instance [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lazy-loading 'pci_devices' on Instance uuid e02af423-c4c3-4fcd-be73-ebeba9fae411 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.696 183079 DEBUG nova.virt.libvirt.driver [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:15:55 compute-0 nova_compute[183075]:   <uuid>e02af423-c4c3-4fcd-be73-ebeba9fae411</uuid>
Jan 22 17:15:55 compute-0 nova_compute[183075]:   <name>instance-0000001e</name>
Jan 22 17:15:55 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:15:55 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:15:55 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:15:55 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1223247828</nova:name>
Jan 22 17:15:55 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:15:55</nova:creationTime>
Jan 22 17:15:55 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:15:55 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:15:55 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:15:55 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:15:55 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:15:55 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:15:55 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:15:55 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:15:55 compute-0 nova_compute[183075]:         <nova:user uuid="2fe02f7484a94091bab26aba1c370459">tempest-IPv6Test-1828780436-project-member</nova:user>
Jan 22 17:15:55 compute-0 nova_compute[183075]:         <nova:project uuid="4f95c62a5a194d2291b03187a9c85702">tempest-IPv6Test-1828780436</nova:project>
Jan 22 17:15:55 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:15:55 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:15:55 compute-0 nova_compute[183075]:         <nova:port uuid="0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510">
Jan 22 17:15:55 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:15:55 compute-0 nova_compute[183075]:         <nova:port uuid="01441304-20f8-4d07-a1ed-05da5d9297d6">
Jan 22 17:15:55 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fedb:727a" ipVersion="6"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:15:55 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:15:55 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:15:55 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <system>
Jan 22 17:15:55 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:15:55 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:15:55 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:15:55 compute-0 nova_compute[183075]:       <entry name="serial">e02af423-c4c3-4fcd-be73-ebeba9fae411</entry>
Jan 22 17:15:55 compute-0 nova_compute[183075]:       <entry name="uuid">e02af423-c4c3-4fcd-be73-ebeba9fae411</entry>
Jan 22 17:15:55 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     </system>
Jan 22 17:15:55 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:15:55 compute-0 nova_compute[183075]:   <os>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:   </os>
Jan 22 17:15:55 compute-0 nova_compute[183075]:   <features>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:   </features>
Jan 22 17:15:55 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:15:55 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:15:55 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:15:55 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/e02af423-c4c3-4fcd-be73-ebeba9fae411/disk"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:15:55 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:d9:53:87"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:       <target dev="tap0b803bb5-5b"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:15:55 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:db:72:7a"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:       <target dev="tap01441304-20"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:15:55 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/e02af423-c4c3-4fcd-be73-ebeba9fae411/console.log" append="off"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <video>
Jan 22 17:15:55 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     </video>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:15:55 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:15:55 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:15:55 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:15:55 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:15:55 compute-0 nova_compute[183075]: </domain>
Jan 22 17:15:55 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.698 183079 DEBUG nova.compute.manager [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Preparing to wait for external event network-vif-plugged-0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.698 183079 DEBUG oslo_concurrency.lockutils [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquiring lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.699 183079 DEBUG oslo_concurrency.lockutils [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.699 183079 DEBUG oslo_concurrency.lockutils [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.699 183079 DEBUG nova.compute.manager [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Preparing to wait for external event network-vif-plugged-01441304-20f8-4d07-a1ed-05da5d9297d6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.699 183079 DEBUG oslo_concurrency.lockutils [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquiring lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.699 183079 DEBUG oslo_concurrency.lockutils [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.699 183079 DEBUG oslo_concurrency.lockutils [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.700 183079 DEBUG nova.virt.libvirt.vif [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:15:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1223247828',display_name='tempest-server-test-1223247828',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1223247828',id=30,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDmUdcgi0W3CCW7+0zVPZm4+SGvgMVYIF2EiVCZOZNnFUe7R59w7UIBntHS+djSABVTvpJeBQFo+aQIZ5cRYQ1WALN6CDQL2VSKr+LjaC8setfZMT5V7IalcT1iGutwkuw==',key_name='tempest-keypair-test-914827607',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4f95c62a5a194d2291b03187a9c85702',ramdisk_id='',reservation_id='r-kexgovqt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-IPv6Test-1828780436',owner_user_name='tempest-IPv6Test-1828780436-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:15:48Z,user_data=None,user_id='2fe02f7484a94091bab26aba1c370459',uuid=e02af423-c4c3-4fcd-be73-ebeba9fae411,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510", "address": "fa:16:3e:d9:53:87", "network": {"id": "359b74c5-cbeb-4440-a3e9-a16a51b1ab77", "bridge": "br-int", "label": "tempest-test-network--531378131", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b803bb5-5b", "ovs_interfaceid": "0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.700 183079 DEBUG nova.network.os_vif_util [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converting VIF {"id": "0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510", "address": "fa:16:3e:d9:53:87", "network": {"id": "359b74c5-cbeb-4440-a3e9-a16a51b1ab77", "bridge": "br-int", "label": "tempest-test-network--531378131", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b803bb5-5b", "ovs_interfaceid": "0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.701 183079 DEBUG nova.network.os_vif_util [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:53:87,bridge_name='br-int',has_traffic_filtering=True,id=0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510,network=Network(359b74c5-cbeb-4440-a3e9-a16a51b1ab77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b803bb5-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.701 183079 DEBUG os_vif [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:53:87,bridge_name='br-int',has_traffic_filtering=True,id=0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510,network=Network(359b74c5-cbeb-4440-a3e9-a16a51b1ab77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b803bb5-5b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.702 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.702 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.703 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.705 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.705 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0b803bb5-5b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.706 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0b803bb5-5b, col_values=(('external_ids', {'iface-id': '0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d9:53:87', 'vm-uuid': 'e02af423-c4c3-4fcd-be73-ebeba9fae411'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.752 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:55 compute-0 NetworkManager[55454]: <info>  [1769102155.7528] manager: (tap0b803bb5-5b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/143)
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.754 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.761 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.763 183079 INFO os_vif [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:53:87,bridge_name='br-int',has_traffic_filtering=True,id=0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510,network=Network(359b74c5-cbeb-4440-a3e9-a16a51b1ab77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b803bb5-5b')
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.765 183079 DEBUG nova.virt.libvirt.vif [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:15:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1223247828',display_name='tempest-server-test-1223247828',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1223247828',id=30,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDmUdcgi0W3CCW7+0zVPZm4+SGvgMVYIF2EiVCZOZNnFUe7R59w7UIBntHS+djSABVTvpJeBQFo+aQIZ5cRYQ1WALN6CDQL2VSKr+LjaC8setfZMT5V7IalcT1iGutwkuw==',key_name='tempest-keypair-test-914827607',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4f95c62a5a194d2291b03187a9c85702',ramdisk_id='',reservation_id='r-kexgovqt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-IPv6Test-1828780436',owner_user_name='tempest-IPv6Test-1828780436-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:15:48Z,user_data=None,user_id='2fe02f7484a94091bab26aba1c370459',uuid=e02af423-c4c3-4fcd-be73-ebeba9fae411,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "01441304-20f8-4d07-a1ed-05da5d9297d6", "address": "fa:16:3e:db:72:7a", "network": {"id": "ee52ea03-3241-4391-9bfe-b2039dbf3bfe", "bridge": "br-int", "label": "tempest-test-network--458174995", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fedb:727a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01441304-20", "ovs_interfaceid": "01441304-20f8-4d07-a1ed-05da5d9297d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.766 183079 DEBUG nova.network.os_vif_util [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converting VIF {"id": "01441304-20f8-4d07-a1ed-05da5d9297d6", "address": "fa:16:3e:db:72:7a", "network": {"id": "ee52ea03-3241-4391-9bfe-b2039dbf3bfe", "bridge": "br-int", "label": "tempest-test-network--458174995", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fedb:727a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01441304-20", "ovs_interfaceid": "01441304-20f8-4d07-a1ed-05da5d9297d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.767 183079 DEBUG nova.network.os_vif_util [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:72:7a,bridge_name='br-int',has_traffic_filtering=True,id=01441304-20f8-4d07-a1ed-05da5d9297d6,network=Network(ee52ea03-3241-4391-9bfe-b2039dbf3bfe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01441304-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.767 183079 DEBUG os_vif [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:72:7a,bridge_name='br-int',has_traffic_filtering=True,id=01441304-20f8-4d07-a1ed-05da5d9297d6,network=Network(ee52ea03-3241-4391-9bfe-b2039dbf3bfe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01441304-20') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.769 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.769 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.770 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.773 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.773 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap01441304-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.774 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap01441304-20, col_values=(('external_ids', {'iface-id': '01441304-20f8-4d07-a1ed-05da5d9297d6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:db:72:7a', 'vm-uuid': 'e02af423-c4c3-4fcd-be73-ebeba9fae411'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:15:55 compute-0 NetworkManager[55454]: <info>  [1769102155.7772] manager: (tap01441304-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/144)
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.776 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.781 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.788 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.789 183079 INFO os_vif [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:72:7a,bridge_name='br-int',has_traffic_filtering=True,id=01441304-20f8-4d07-a1ed-05da5d9297d6,network=Network(ee52ea03-3241-4391-9bfe-b2039dbf3bfe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01441304-20')
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.864 183079 DEBUG nova.virt.libvirt.driver [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.865 183079 DEBUG nova.virt.libvirt.driver [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] No VIF found with MAC fa:16:3e:d9:53:87, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.866 183079 DEBUG nova.virt.libvirt.driver [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] No VIF found with MAC fa:16:3e:db:72:7a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:15:55 compute-0 NetworkManager[55454]: <info>  [1769102155.9648] manager: (tap0b803bb5-5b): new Tun device (/org/freedesktop/NetworkManager/Devices/145)
Jan 22 17:15:55 compute-0 kernel: tap0b803bb5-5b: entered promiscuous mode
Jan 22 17:15:55 compute-0 ovn_controller[95372]: 2026-01-22T17:15:55Z|00341|binding|INFO|Claiming lport 0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510 for this chassis.
Jan 22 17:15:55 compute-0 ovn_controller[95372]: 2026-01-22T17:15:55Z|00342|binding|INFO|0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510: Claiming fa:16:3e:d9:53:87 10.100.0.7
Jan 22 17:15:55 compute-0 nova_compute[183075]: 2026-01-22 17:15:55.971 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:55 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:55.984 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:53:87 10.100.0.7'], port_security=['fa:16:3e:d9:53:87 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'e02af423-c4c3-4fcd-be73-ebeba9fae411', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-359b74c5-cbeb-4440-a3e9-a16a51b1ab77', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4f95c62a5a194d2291b03187a9c85702', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1b5e2b25-1ae0-464c-ac9a-7fc65ac893a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee297cf4-fb08-4758-bba6-b8b00aaf6678, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:15:55 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:55.987 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510 in datapath 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 bound to our chassis
Jan 22 17:15:55 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:55.990 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 359b74c5-cbeb-4440-a3e9-a16a51b1ab77
Jan 22 17:15:55 compute-0 NetworkManager[55454]: <info>  [1769102155.9950] manager: (tap01441304-20): new Tun device (/org/freedesktop/NetworkManager/Devices/146)
Jan 22 17:15:56 compute-0 kernel: tap01441304-20: entered promiscuous mode
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:56.010 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[826db3dd-075f-41a1-8ef0-90efe88fab84]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:56.011 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap359b74c5-c1 in ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:15:56 compute-0 ovn_controller[95372]: 2026-01-22T17:15:56Z|00343|binding|INFO|Setting lport 0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510 ovn-installed in OVS
Jan 22 17:15:56 compute-0 ovn_controller[95372]: 2026-01-22T17:15:56Z|00344|binding|INFO|Setting lport 0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510 up in Southbound
Jan 22 17:15:56 compute-0 systemd-udevd[223623]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:56.016 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap359b74c5-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:56.016 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c1c4ad32-b614-461f-96a8-4fce6b5a8a4a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.016 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:56.017 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[65f1570d-15c5-4f99-8074-f0b02758d818]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:56 compute-0 ovn_controller[95372]: 2026-01-22T17:15:56Z|00345|if_status|INFO|Not updating pb chassis for 01441304-20f8-4d07-a1ed-05da5d9297d6 now as sb is readonly
Jan 22 17:15:56 compute-0 ovn_controller[95372]: 2026-01-22T17:15:56Z|00346|binding|INFO|Claiming lport 01441304-20f8-4d07-a1ed-05da5d9297d6 for this chassis.
Jan 22 17:15:56 compute-0 systemd-udevd[223624]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:15:56 compute-0 ovn_controller[95372]: 2026-01-22T17:15:56Z|00347|binding|INFO|01441304-20f8-4d07-a1ed-05da5d9297d6: Claiming fa:16:3e:db:72:7a 2001:db8::f816:3eff:fedb:727a
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:56.027 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:72:7a 2001:db8::f816:3eff:fedb:727a'], port_security=['fa:16:3e:db:72:7a 2001:db8::f816:3eff:fedb:727a'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fedb:727a/64', 'neutron:device_id': 'e02af423-c4c3-4fcd-be73-ebeba9fae411', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee52ea03-3241-4391-9bfe-b2039dbf3bfe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4f95c62a5a194d2291b03187a9c85702', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1b5e2b25-1ae0-464c-ac9a-7fc65ac893a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b9a1c076-b07e-4835-ae9a-9f814ca84200, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=01441304-20f8-4d07-a1ed-05da5d9297d6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:56.034 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[08a7cc3e-a721-4e6d-893e-75414d937f60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:56 compute-0 ovn_controller[95372]: 2026-01-22T17:15:56Z|00348|binding|INFO|Setting lport 01441304-20f8-4d07-a1ed-05da5d9297d6 up in Southbound
Jan 22 17:15:56 compute-0 ovn_controller[95372]: 2026-01-22T17:15:56Z|00349|binding|INFO|Setting lport 01441304-20f8-4d07-a1ed-05da5d9297d6 ovn-installed in OVS
Jan 22 17:15:56 compute-0 NetworkManager[55454]: <info>  [1769102156.0393] device (tap01441304-20): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:15:56 compute-0 NetworkManager[55454]: <info>  [1769102156.0403] device (tap01441304-20): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.042 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:56 compute-0 NetworkManager[55454]: <info>  [1769102156.0437] device (tap0b803bb5-5b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:15:56 compute-0 NetworkManager[55454]: <info>  [1769102156.0442] device (tap0b803bb5-5b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:56.055 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[57038e2b-7a9d-48f8-b0e2-139abce87a96]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:56 compute-0 systemd-machined[154382]: New machine qemu-30-instance-0000001e.
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.063 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:56 compute-0 systemd[1]: Started Virtual Machine qemu-30-instance-0000001e.
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:56.095 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[79ab96d6-49f2-42ba-b498-7a88707e9e41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:56.101 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6588e4bc-8be8-47b1-866d-1aed73f34a57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:56 compute-0 NetworkManager[55454]: <info>  [1769102156.1025] manager: (tap359b74c5-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/147)
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:56.139 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[d1a97265-0f43-4ad5-ad20-53ad434d66f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:56.143 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[b4d28d7a-c32a-498b-bbfa-29d5101c27eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:56 compute-0 NetworkManager[55454]: <info>  [1769102156.1712] device (tap359b74c5-c0): carrier: link connected
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:56.176 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[c39108de-85b7-4355-821d-773ff54eddcd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:56.200 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d6aa639f-7714-47e6-ae8f-12003e030bba]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap359b74c5-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e7:36:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 96], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 451987, 'reachable_time': 41063, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223659, 'error': None, 'target': 'ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:56.219 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6d21dd6b-3022-4328-a435-51b240fdd8a1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee7:3633'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 451987, 'tstamp': 451987}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223660, 'error': None, 'target': 'ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:56.240 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[5db9a313-c60b-4613-8d6b-0f6769df3f66]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap359b74c5-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e7:36:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 96], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 451987, 'reachable_time': 41063, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223661, 'error': None, 'target': 'ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.278 183079 DEBUG nova.compute.manager [req-ca5eb179-d39b-4f02-b4ec-5505d19a883a req-587ee09d-912b-4d21-bb49-c9c39c018f22 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Received event network-vif-plugged-0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.279 183079 DEBUG oslo_concurrency.lockutils [req-ca5eb179-d39b-4f02-b4ec-5505d19a883a req-587ee09d-912b-4d21-bb49-c9c39c018f22 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.279 183079 DEBUG oslo_concurrency.lockutils [req-ca5eb179-d39b-4f02-b4ec-5505d19a883a req-587ee09d-912b-4d21-bb49-c9c39c018f22 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.279 183079 DEBUG oslo_concurrency.lockutils [req-ca5eb179-d39b-4f02-b4ec-5505d19a883a req-587ee09d-912b-4d21-bb49-c9c39c018f22 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.279 183079 DEBUG nova.compute.manager [req-ca5eb179-d39b-4f02-b4ec-5505d19a883a req-587ee09d-912b-4d21-bb49-c9c39c018f22 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Processing event network-vif-plugged-0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:56.285 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[94610705-2f27-4b60-89fe-e6013c3032b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:56.349 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e624dc2e-740f-48a1-8394-37fa5c2ad97f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:56.352 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap359b74c5-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:56.352 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:56.353 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap359b74c5-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.354 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:56 compute-0 NetworkManager[55454]: <info>  [1769102156.3555] manager: (tap359b74c5-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/148)
Jan 22 17:15:56 compute-0 kernel: tap359b74c5-c0: entered promiscuous mode
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.357 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:56.358 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap359b74c5-c0, col_values=(('external_ids', {'iface-id': '705c199a-731e-4515-b4ee-a538f73a29f1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.359 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:56 compute-0 ovn_controller[95372]: 2026-01-22T17:15:56Z|00350|binding|INFO|Releasing lport 705c199a-731e-4515-b4ee-a538f73a29f1 from this chassis (sb_readonly=0)
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.360 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:56.361 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/359b74c5-cbeb-4440-a3e9-a16a51b1ab77.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/359b74c5-cbeb-4440-a3e9-a16a51b1ab77.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:56.362 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ec0fe75d-a650-48d2-ab0c-a9943e21b058]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:56.362 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/359b74c5-cbeb-4440-a3e9-a16a51b1ab77.pid.haproxy
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 359b74c5-cbeb-4440-a3e9-a16a51b1ab77
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:15:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:56.363 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77', 'env', 'PROCESS_TAG=haproxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/359b74c5-cbeb-4440-a3e9-a16a51b1ab77.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.372 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.393 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102156.3933115, e02af423-c4c3-4fcd-be73-ebeba9fae411 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.394 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] VM Started (Lifecycle Event)
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.412 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.417 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102156.3953369, e02af423-c4c3-4fcd-be73-ebeba9fae411 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.417 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] VM Paused (Lifecycle Event)
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.435 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.439 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.455 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.701 183079 DEBUG nova.compute.manager [req-c45aafd4-aac5-43b9-820e-73eb6f440abe req-5abc538e-02f2-41fe-b136-dcfd97c07d2c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Received event network-vif-plugged-01441304-20f8-4d07-a1ed-05da5d9297d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.701 183079 DEBUG oslo_concurrency.lockutils [req-c45aafd4-aac5-43b9-820e-73eb6f440abe req-5abc538e-02f2-41fe-b136-dcfd97c07d2c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.702 183079 DEBUG oslo_concurrency.lockutils [req-c45aafd4-aac5-43b9-820e-73eb6f440abe req-5abc538e-02f2-41fe-b136-dcfd97c07d2c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.702 183079 DEBUG oslo_concurrency.lockutils [req-c45aafd4-aac5-43b9-820e-73eb6f440abe req-5abc538e-02f2-41fe-b136-dcfd97c07d2c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.702 183079 DEBUG nova.compute.manager [req-c45aafd4-aac5-43b9-820e-73eb6f440abe req-5abc538e-02f2-41fe-b136-dcfd97c07d2c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Processing event network-vif-plugged-01441304-20f8-4d07-a1ed-05da5d9297d6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.703 183079 DEBUG nova.compute.manager [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.707 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102156.7067394, e02af423-c4c3-4fcd-be73-ebeba9fae411 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.707 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] VM Resumed (Lifecycle Event)
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.712 183079 DEBUG nova.virt.libvirt.driver [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.716 183079 INFO nova.virt.libvirt.driver [-] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Instance spawned successfully.
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.717 183079 DEBUG nova.virt.libvirt.driver [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.749 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.754 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.758 183079 DEBUG nova.virt.libvirt.driver [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.758 183079 DEBUG nova.virt.libvirt.driver [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.759 183079 DEBUG nova.virt.libvirt.driver [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.759 183079 DEBUG nova.virt.libvirt.driver [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.759 183079 DEBUG nova.virt.libvirt.driver [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.760 183079 DEBUG nova.virt.libvirt.driver [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.790 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:15:56 compute-0 podman[223701]: 2026-01-22 17:15:56.795320405 +0000 UTC m=+0.093884444 container create 23db2a4d832717b097d7becbb77c12bccadb1403cb11f17d49410f55d0c9782c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.828 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769102141.827129, 26367132-bc45-4c8a-bd7e-5c0883453bbd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.828 183079 INFO nova.compute.manager [-] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] VM Stopped (Lifecycle Event)
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.838 183079 INFO nova.compute.manager [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Took 8.41 seconds to spawn the instance on the hypervisor.
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.838 183079 DEBUG nova.compute.manager [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:15:56 compute-0 podman[223701]: 2026-01-22 17:15:56.753443531 +0000 UTC m=+0.052007600 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.845 183079 DEBUG nova.compute.manager [None req-0ea1535e-3b26-4d2d-955f-ee790f5f6e39 - - - - - -] [instance: 26367132-bc45-4c8a-bd7e-5c0883453bbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:15:56 compute-0 systemd[1]: Started libpod-conmon-23db2a4d832717b097d7becbb77c12bccadb1403cb11f17d49410f55d0c9782c.scope.
Jan 22 17:15:56 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.898 183079 INFO nova.compute.manager [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Took 9.00 seconds to build instance.
Jan 22 17:15:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e2261cc405fb906726a062834e89ea717b4ae935ef0400bca60424988b6bf124/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:15:56 compute-0 nova_compute[183075]: 2026-01-22 17:15:56.912 183079 DEBUG oslo_concurrency.lockutils [None req-edf058eb-1336-4c03-a977-7b6ebaffcf3d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "e02af423-c4c3-4fcd-be73-ebeba9fae411" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:15:56 compute-0 podman[223701]: 2026-01-22 17:15:56.916537182 +0000 UTC m=+0.215101211 container init 23db2a4d832717b097d7becbb77c12bccadb1403cb11f17d49410f55d0c9782c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:15:56 compute-0 podman[223701]: 2026-01-22 17:15:56.926833661 +0000 UTC m=+0.225397690 container start 23db2a4d832717b097d7becbb77c12bccadb1403cb11f17d49410f55d0c9782c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 17:15:56 compute-0 neutron-haproxy-ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[223716]: [NOTICE]   (223720) : New worker (223722) forked
Jan 22 17:15:56 compute-0 neutron-haproxy-ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[223716]: [NOTICE]   (223720) : Loading success.
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:57.005 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 01441304-20f8-4d07-a1ed-05da5d9297d6 in datapath ee52ea03-3241-4391-9bfe-b2039dbf3bfe unbound from our chassis
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:57.007 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee52ea03-3241-4391-9bfe-b2039dbf3bfe
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:57.019 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d6b33466-415a-49a6-8a41-21c51a065c72]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:57.020 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapee52ea03-31 in ovnmeta-ee52ea03-3241-4391-9bfe-b2039dbf3bfe namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:57.023 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapee52ea03-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:57.023 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e4627b69-960c-4747-b4a2-2c5669cc560f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:57.024 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d1d1da18-ec87-4e58-8428-f26fa7bf7edf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:57.038 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[589d8a82-0d1b-417d-96c3-2271523134dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:57.068 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4787041d-2b4a-4a48-ae65-94ad66ac7703]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:57.115 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[d8a30371-f6d1-4fa8-9ef6-1132b1ebdc4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:57.125 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[31de6df1-59be-4aad-9d9d-f9d953d50921]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:57 compute-0 NetworkManager[55454]: <info>  [1769102157.1265] manager: (tapee52ea03-30): new Veth device (/org/freedesktop/NetworkManager/Devices/149)
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:57.174 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[0442ee47-3ee2-4304-8aa4-371f228dbe5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:57.178 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[ec53f11a-9f6e-46d4-91a5-8f127ecddfbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:57 compute-0 NetworkManager[55454]: <info>  [1769102157.2210] device (tapee52ea03-30): carrier: link connected
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:57.232 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[88f3c0ff-ef37-433b-a207-e02540449a8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:57.254 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8db2d834-3e45-4bb4-b4ad-641b99789268]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee52ea03-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c5:77:6f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 97], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 452092, 'reachable_time': 21863, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223741, 'error': None, 'target': 'ovnmeta-ee52ea03-3241-4391-9bfe-b2039dbf3bfe', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:57.282 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[63a88e0a-bcfb-4cd8-b826-a4cbc701dbd5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec5:776f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 452092, 'tstamp': 452092}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223742, 'error': None, 'target': 'ovnmeta-ee52ea03-3241-4391-9bfe-b2039dbf3bfe', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:57.304 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2bfedbb5-cd13-4897-b07d-c5c19459f835]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee52ea03-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c5:77:6f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 97], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 452092, 'reachable_time': 21863, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223743, 'error': None, 'target': 'ovnmeta-ee52ea03-3241-4391-9bfe-b2039dbf3bfe', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:57.352 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f5af5fb2-7d43-48ca-b9b3-b509c2d3711a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:57.403 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6a88f989-d0cb-4af0-a6b5-945c03cd486a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:57.405 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee52ea03-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:57.405 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:57.406 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee52ea03-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:15:57 compute-0 nova_compute[183075]: 2026-01-22 17:15:57.408 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:57 compute-0 NetworkManager[55454]: <info>  [1769102157.4097] manager: (tapee52ea03-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/150)
Jan 22 17:15:57 compute-0 kernel: tapee52ea03-30: entered promiscuous mode
Jan 22 17:15:57 compute-0 nova_compute[183075]: 2026-01-22 17:15:57.414 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:57.417 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee52ea03-30, col_values=(('external_ids', {'iface-id': '70580b74-5897-42e6-a447-815ee3b83763'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:15:57 compute-0 nova_compute[183075]: 2026-01-22 17:15:57.418 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:57 compute-0 ovn_controller[95372]: 2026-01-22T17:15:57Z|00351|binding|INFO|Releasing lport 70580b74-5897-42e6-a447-815ee3b83763 from this chassis (sb_readonly=0)
Jan 22 17:15:57 compute-0 nova_compute[183075]: 2026-01-22 17:15:57.446 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:57 compute-0 nova_compute[183075]: 2026-01-22 17:15:57.447 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:57.449 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee52ea03-3241-4391-9bfe-b2039dbf3bfe.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee52ea03-3241-4391-9bfe-b2039dbf3bfe.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:57.450 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b8ddc654-635b-4865-99e4-c90a68f0355b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:57.451 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-ee52ea03-3241-4391-9bfe-b2039dbf3bfe
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/ee52ea03-3241-4391-9bfe-b2039dbf3bfe.pid.haproxy
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID ee52ea03-3241-4391-9bfe-b2039dbf3bfe
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:15:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:15:57.451 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ee52ea03-3241-4391-9bfe-b2039dbf3bfe', 'env', 'PROCESS_TAG=haproxy-ee52ea03-3241-4391-9bfe-b2039dbf3bfe', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ee52ea03-3241-4391-9bfe-b2039dbf3bfe.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:15:57 compute-0 nova_compute[183075]: 2026-01-22 17:15:57.549 183079 INFO nova.compute.manager [None req-10721b71-7743-469c-b3bb-08618279eb13 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Get console output
Jan 22 17:15:57 compute-0 nova_compute[183075]: 2026-01-22 17:15:57.556 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:15:57 compute-0 podman[223774]: 2026-01-22 17:15:57.887736063 +0000 UTC m=+0.071410106 container create b366309911a1697880dcbfb275fb455beb55a61a66936a3f9db95ca020fbc0cf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee52ea03-3241-4391-9bfe-b2039dbf3bfe, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:15:57 compute-0 podman[223774]: 2026-01-22 17:15:57.851524657 +0000 UTC m=+0.035198730 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:15:57 compute-0 systemd[1]: Started libpod-conmon-b366309911a1697880dcbfb275fb455beb55a61a66936a3f9db95ca020fbc0cf.scope.
Jan 22 17:15:57 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:15:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f13a1bdb760faaf3db19fac77e363ec9fc505796619536523e29bcf6c3c8286c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:15:58 compute-0 podman[223774]: 2026-01-22 17:15:58.016560519 +0000 UTC m=+0.200234572 container init b366309911a1697880dcbfb275fb455beb55a61a66936a3f9db95ca020fbc0cf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee52ea03-3241-4391-9bfe-b2039dbf3bfe, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:15:58 compute-0 podman[223774]: 2026-01-22 17:15:58.028229324 +0000 UTC m=+0.211903347 container start b366309911a1697880dcbfb275fb455beb55a61a66936a3f9db95ca020fbc0cf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee52ea03-3241-4391-9bfe-b2039dbf3bfe, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 22 17:15:58 compute-0 neutron-haproxy-ovnmeta-ee52ea03-3241-4391-9bfe-b2039dbf3bfe[223790]: [NOTICE]   (223794) : New worker (223796) forked
Jan 22 17:15:58 compute-0 neutron-haproxy-ovnmeta-ee52ea03-3241-4391-9bfe-b2039dbf3bfe[223790]: [NOTICE]   (223794) : Loading success.
Jan 22 17:15:58 compute-0 nova_compute[183075]: 2026-01-22 17:15:58.256 183079 DEBUG nova.network.neutron [req-46d46fee-3102-4044-b6a8-0effc58f663a req-0d2159b8-c02a-43d8-8132-48a6687cc641 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Updated VIF entry in instance network info cache for port 01441304-20f8-4d07-a1ed-05da5d9297d6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:15:58 compute-0 nova_compute[183075]: 2026-01-22 17:15:58.257 183079 DEBUG nova.network.neutron [req-46d46fee-3102-4044-b6a8-0effc58f663a req-0d2159b8-c02a-43d8-8132-48a6687cc641 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Updating instance_info_cache with network_info: [{"id": "0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510", "address": "fa:16:3e:d9:53:87", "network": {"id": "359b74c5-cbeb-4440-a3e9-a16a51b1ab77", "bridge": "br-int", "label": "tempest-test-network--531378131", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b803bb5-5b", "ovs_interfaceid": "0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "01441304-20f8-4d07-a1ed-05da5d9297d6", "address": "fa:16:3e:db:72:7a", "network": {"id": "ee52ea03-3241-4391-9bfe-b2039dbf3bfe", "bridge": "br-int", "label": "tempest-test-network--458174995", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fedb:727a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, 
"tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01441304-20", "ovs_interfaceid": "01441304-20f8-4d07-a1ed-05da5d9297d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:15:58 compute-0 nova_compute[183075]: 2026-01-22 17:15:58.283 183079 DEBUG oslo_concurrency.lockutils [req-46d46fee-3102-4044-b6a8-0effc58f663a req-0d2159b8-c02a-43d8-8132-48a6687cc641 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-e02af423-c4c3-4fcd-be73-ebeba9fae411" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:15:58 compute-0 nova_compute[183075]: 2026-01-22 17:15:58.385 183079 DEBUG nova.compute.manager [req-77dc48ea-0d41-4dcd-811d-58a0fc1c31e1 req-4a4e3020-2190-4178-8177-a557599faff9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Received event network-vif-plugged-0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:15:58 compute-0 nova_compute[183075]: 2026-01-22 17:15:58.386 183079 DEBUG oslo_concurrency.lockutils [req-77dc48ea-0d41-4dcd-811d-58a0fc1c31e1 req-4a4e3020-2190-4178-8177-a557599faff9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:15:58 compute-0 nova_compute[183075]: 2026-01-22 17:15:58.386 183079 DEBUG oslo_concurrency.lockutils [req-77dc48ea-0d41-4dcd-811d-58a0fc1c31e1 req-4a4e3020-2190-4178-8177-a557599faff9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:15:58 compute-0 nova_compute[183075]: 2026-01-22 17:15:58.387 183079 DEBUG oslo_concurrency.lockutils [req-77dc48ea-0d41-4dcd-811d-58a0fc1c31e1 req-4a4e3020-2190-4178-8177-a557599faff9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:15:58 compute-0 nova_compute[183075]: 2026-01-22 17:15:58.387 183079 DEBUG nova.compute.manager [req-77dc48ea-0d41-4dcd-811d-58a0fc1c31e1 req-4a4e3020-2190-4178-8177-a557599faff9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] No waiting events found dispatching network-vif-plugged-0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:15:58 compute-0 nova_compute[183075]: 2026-01-22 17:15:58.387 183079 WARNING nova.compute.manager [req-77dc48ea-0d41-4dcd-811d-58a0fc1c31e1 req-4a4e3020-2190-4178-8177-a557599faff9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Received unexpected event network-vif-plugged-0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510 for instance with vm_state active and task_state None.
Jan 22 17:15:58 compute-0 nova_compute[183075]: 2026-01-22 17:15:58.819 183079 DEBUG nova.compute.manager [req-6b9338c9-db18-47fa-901e-96d1f210251c req-76330054-11d6-4bc9-8d55-b7101d562930 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Received event network-vif-plugged-01441304-20f8-4d07-a1ed-05da5d9297d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:15:58 compute-0 nova_compute[183075]: 2026-01-22 17:15:58.820 183079 DEBUG oslo_concurrency.lockutils [req-6b9338c9-db18-47fa-901e-96d1f210251c req-76330054-11d6-4bc9-8d55-b7101d562930 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:15:58 compute-0 nova_compute[183075]: 2026-01-22 17:15:58.820 183079 DEBUG oslo_concurrency.lockutils [req-6b9338c9-db18-47fa-901e-96d1f210251c req-76330054-11d6-4bc9-8d55-b7101d562930 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:15:58 compute-0 nova_compute[183075]: 2026-01-22 17:15:58.821 183079 DEBUG oslo_concurrency.lockutils [req-6b9338c9-db18-47fa-901e-96d1f210251c req-76330054-11d6-4bc9-8d55-b7101d562930 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:15:58 compute-0 nova_compute[183075]: 2026-01-22 17:15:58.821 183079 DEBUG nova.compute.manager [req-6b9338c9-db18-47fa-901e-96d1f210251c req-76330054-11d6-4bc9-8d55-b7101d562930 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] No waiting events found dispatching network-vif-plugged-01441304-20f8-4d07-a1ed-05da5d9297d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:15:58 compute-0 nova_compute[183075]: 2026-01-22 17:15:58.821 183079 WARNING nova.compute.manager [req-6b9338c9-db18-47fa-901e-96d1f210251c req-76330054-11d6-4bc9-8d55-b7101d562930 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Received unexpected event network-vif-plugged-01441304-20f8-4d07-a1ed-05da5d9297d6 for instance with vm_state active and task_state None.
Jan 22 17:15:59 compute-0 nova_compute[183075]: 2026-01-22 17:15:59.717 183079 INFO nova.compute.manager [None req-0e64fd59-093f-446f-838f-d06f2200fd5d 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Get console output
Jan 22 17:15:59 compute-0 nova_compute[183075]: 2026-01-22 17:15:59.725 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:16:00 compute-0 nova_compute[183075]: 2026-01-22 17:16:00.778 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:01 compute-0 nova_compute[183075]: 2026-01-22 17:16:01.067 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:01 compute-0 ovn_controller[95372]: 2026-01-22T17:16:01Z|00352|binding|INFO|Releasing lport 705c199a-731e-4515-b4ee-a538f73a29f1 from this chassis (sb_readonly=0)
Jan 22 17:16:01 compute-0 ovn_controller[95372]: 2026-01-22T17:16:01Z|00353|binding|INFO|Releasing lport 70580b74-5897-42e6-a447-815ee3b83763 from this chassis (sb_readonly=0)
Jan 22 17:16:01 compute-0 ovn_controller[95372]: 2026-01-22T17:16:01Z|00354|binding|INFO|Releasing lport 934896c0-00c7-4510-9857-208659266984 from this chassis (sb_readonly=0)
Jan 22 17:16:01 compute-0 nova_compute[183075]: 2026-01-22 17:16:01.392 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:02 compute-0 nova_compute[183075]: 2026-01-22 17:16:02.688 183079 INFO nova.compute.manager [None req-da7fb060-522b-4548-87d8-b06a8fba8f48 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Get console output
Jan 22 17:16:02 compute-0 nova_compute[183075]: 2026-01-22 17:16:02.697 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:16:04 compute-0 podman[223808]: 2026-01-22 17:16:04.405565022 +0000 UTC m=+0.085717691 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=openstack_network_exporter, name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., io.buildah.version=1.33.7, version=9.6, architecture=x86_64, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter)
Jan 22 17:16:04 compute-0 podman[223807]: 2026-01-22 17:16:04.414832784 +0000 UTC m=+0.091445020 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Jan 22 17:16:04 compute-0 podman[223806]: 2026-01-22 17:16:04.465830406 +0000 UTC m=+0.153028659 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 17:16:04 compute-0 nova_compute[183075]: 2026-01-22 17:16:04.913 183079 INFO nova.compute.manager [None req-c6b7f66e-d210-4fed-b037-ef99e11dcee6 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Get console output
Jan 22 17:16:04 compute-0 nova_compute[183075]: 2026-01-22 17:16:04.918 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:16:05 compute-0 nova_compute[183075]: 2026-01-22 17:16:05.816 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:06 compute-0 nova_compute[183075]: 2026-01-22 17:16:06.067 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:06 compute-0 nova_compute[183075]: 2026-01-22 17:16:06.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:16:07 compute-0 nova_compute[183075]: 2026-01-22 17:16:07.820 183079 INFO nova.compute.manager [None req-8b18889f-f629-4996-8c9e-85bab3f8c790 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Get console output
Jan 22 17:16:07 compute-0 nova_compute[183075]: 2026-01-22 17:16:07.830 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:16:09 compute-0 ovn_controller[95372]: 2026-01-22T17:16:09Z|00062|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d9:53:87 10.100.0.7
Jan 22 17:16:09 compute-0 ovn_controller[95372]: 2026-01-22T17:16:09Z|00063|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d9:53:87 10.100.0.7
Jan 22 17:16:09 compute-0 nova_compute[183075]: 2026-01-22 17:16:09.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:16:10 compute-0 nova_compute[183075]: 2026-01-22 17:16:10.164 183079 INFO nova.compute.manager [None req-a491f750-21e9-443a-9a45-62f404352cf2 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Get console output
Jan 22 17:16:10 compute-0 podman[223883]: 2026-01-22 17:16:10.384166441 +0000 UTC m=+0.095688590 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:16:10 compute-0 nova_compute[183075]: 2026-01-22 17:16:10.866 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:11 compute-0 nova_compute[183075]: 2026-01-22 17:16:11.069 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:12 compute-0 nova_compute[183075]: 2026-01-22 17:16:12.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:16:12 compute-0 nova_compute[183075]: 2026-01-22 17:16:12.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:16:14 compute-0 nova_compute[183075]: 2026-01-22 17:16:14.232 183079 INFO nova.compute.manager [None req-373eb991-65ab-4d39-a023-7ebee253b157 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Get console output
Jan 22 17:16:14 compute-0 nova_compute[183075]: 2026-01-22 17:16:14.237 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:16:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:14.660 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:16:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:14.662 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:16:14 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:16:14 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:16:14 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:16:14 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:16:14 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:16:14 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:16:14 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:16:15 compute-0 nova_compute[183075]: 2026-01-22 17:16:15.506 183079 INFO nova.compute.manager [None req-036d946b-1884-4389-bafe-171174875f8f 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Get console output
Jan 22 17:16:15 compute-0 nova_compute[183075]: 2026-01-22 17:16:15.512 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.598 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.599 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.9371133
Jan 22 17:16:15 compute-0 haproxy-metadata-proxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[223722]: 10.100.0.7:50702 [22/Jan/2026:17:16:14.659] listener listener/metadata 0/0/0/939/939 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.609 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.610 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.642 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.642 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 168 time: 0.0319302
Jan 22 17:16:15 compute-0 haproxy-metadata-proxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[223722]: 10.100.0.7:50712 [22/Jan/2026:17:16:15.609] listener listener/metadata 0/0/0/33/33 200 152 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.650 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.651 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.688 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.689 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0382042
Jan 22 17:16:15 compute-0 haproxy-metadata-proxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[223722]: 10.100.0.7:50728 [22/Jan/2026:17:16:15.649] listener listener/metadata 0/0/0/39/39 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.700 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.701 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.725 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.725 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0245779
Jan 22 17:16:15 compute-0 haproxy-metadata-proxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[223722]: 10.100.0.7:50744 [22/Jan/2026:17:16:15.699] listener listener/metadata 0/0/0/25/25 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.732 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.732 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.747 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.747 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0152097
Jan 22 17:16:15 compute-0 haproxy-metadata-proxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[223722]: 10.100.0.7:50758 [22/Jan/2026:17:16:15.731] listener listener/metadata 0/0/0/16/16 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.754 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.755 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.772 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.772 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0170226
Jan 22 17:16:15 compute-0 haproxy-metadata-proxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[223722]: 10.100.0.7:50760 [22/Jan/2026:17:16:15.754] listener listener/metadata 0/0/0/18/18 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.780 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.781 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:16:15 compute-0 nova_compute[183075]: 2026-01-22 17:16:15.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:16:15 compute-0 nova_compute[183075]: 2026-01-22 17:16:15.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:16:15 compute-0 nova_compute[183075]: 2026-01-22 17:16:15.787 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:16:15 compute-0 nova_compute[183075]: 2026-01-22 17:16:15.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.809 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.810 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0290475
Jan 22 17:16:15 compute-0 haproxy-metadata-proxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[223722]: 10.100.0.7:50770 [22/Jan/2026:17:16:15.779] listener listener/metadata 0/0/0/30/30 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.815 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.816 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.831 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.832 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0161009
Jan 22 17:16:15 compute-0 haproxy-metadata-proxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[223722]: 10.100.0.7:50772 [22/Jan/2026:17:16:15.815] listener listener/metadata 0/0/0/16/16 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.837 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.837 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.852 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.852 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0153844
Jan 22 17:16:15 compute-0 haproxy-metadata-proxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[223722]: 10.100.0.7:50776 [22/Jan/2026:17:16:15.836] listener listener/metadata 0/0/0/16/16 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.858 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.858 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:16:15 compute-0 nova_compute[183075]: 2026-01-22 17:16:15.908 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.909 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.909 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0510383
Jan 22 17:16:15 compute-0 haproxy-metadata-proxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[223722]: 10.100.0.7:50790 [22/Jan/2026:17:16:15.857] listener listener/metadata 0/0/0/51/51 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.915 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.916 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:16:15 compute-0 haproxy-metadata-proxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[223722]: 10.100.0.7:50804 [22/Jan/2026:17:16:15.915] listener listener/metadata 0/0/0/20/20 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.935 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0188725
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.955 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.956 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.972 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.973 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0163929
Jan 22 17:16:15 compute-0 haproxy-metadata-proxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[223722]: 10.100.0.7:50812 [22/Jan/2026:17:16:15.955] listener listener/metadata 0/0/0/17/17 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.977 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.977 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.991 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.991 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0135932
Jan 22 17:16:15 compute-0 haproxy-metadata-proxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[223722]: 10.100.0.7:50816 [22/Jan/2026:17:16:15.977] listener listener/metadata 0/0/0/14/14 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.998 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:15.999 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:16:15 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:16:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:16.013 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:16:16 compute-0 haproxy-metadata-proxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[223722]: 10.100.0.7:50826 [22/Jan/2026:17:16:15.998] listener listener/metadata 0/0/0/15/15 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:16:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:16.014 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0149660
Jan 22 17:16:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:16.019 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:16:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:16.020 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:16:16 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:16:16 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:16:16 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:16:16 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:16:16 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:16:16 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:16:16 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:16:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:16.038 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:16:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:16.039 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0184143
Jan 22 17:16:16 compute-0 haproxy-metadata-proxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[223722]: 10.100.0.7:50834 [22/Jan/2026:17:16:16.019] listener listener/metadata 0/0/0/19/19 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:16:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:16.044 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:16:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:16.045 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:16:16 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:16:16 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:16:16 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:16:16 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:16:16 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:16:16 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:16:16 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:16:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:16.061 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:16:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:16.062 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0163989
Jan 22 17:16:16 compute-0 haproxy-metadata-proxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[223722]: 10.100.0.7:50846 [22/Jan/2026:17:16:16.044] listener listener/metadata 0/0/0/17/17 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:16:16 compute-0 nova_compute[183075]: 2026-01-22 17:16:16.072 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:16 compute-0 nova_compute[183075]: 2026-01-22 17:16:16.118 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-1b9e0b4e-34d1-46d1-8f04-a07da354e704" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:16:16 compute-0 nova_compute[183075]: 2026-01-22 17:16:16.118 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-1b9e0b4e-34d1-46d1-8f04-a07da354e704" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:16:16 compute-0 nova_compute[183075]: 2026-01-22 17:16:16.119 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:16:16 compute-0 nova_compute[183075]: 2026-01-22 17:16:16.120 183079 DEBUG nova.objects.instance [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1b9e0b4e-34d1-46d1-8f04-a07da354e704 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:16:16 compute-0 podman[223903]: 2026-01-22 17:16:16.376043199 +0000 UTC m=+0.082291601 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:16:18 compute-0 nova_compute[183075]: 2026-01-22 17:16:18.128 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Updating instance_info_cache with network_info: [{"id": "00a6b90a-3de5-45e7-93ca-e2b72cac1de3", "address": "fa:16:3e:05:67:11", "network": {"id": "922155c4-0d93-4488-a55e-c0d6583804c3", "bridge": "br-int", "label": "tempest-DHCPTest-396339124", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8d764c6e1fdd46b88f83657e6a259c71", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00a6b90a-3d", "ovs_interfaceid": "00a6b90a-3de5-45e7-93ca-e2b72cac1de3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:16:18 compute-0 nova_compute[183075]: 2026-01-22 17:16:18.145 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-1b9e0b4e-34d1-46d1-8f04-a07da354e704" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:16:18 compute-0 nova_compute[183075]: 2026-01-22 17:16:18.146 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:16:18 compute-0 nova_compute[183075]: 2026-01-22 17:16:18.146 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:16:18 compute-0 nova_compute[183075]: 2026-01-22 17:16:18.146 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:16:18 compute-0 nova_compute[183075]: 2026-01-22 17:16:18.146 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:16:18 compute-0 nova_compute[183075]: 2026-01-22 17:16:18.147 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:16:18 compute-0 nova_compute[183075]: 2026-01-22 17:16:18.172 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:16:18 compute-0 nova_compute[183075]: 2026-01-22 17:16:18.172 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:16:18 compute-0 nova_compute[183075]: 2026-01-22 17:16:18.173 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:16:18 compute-0 nova_compute[183075]: 2026-01-22 17:16:18.173 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:16:18 compute-0 nova_compute[183075]: 2026-01-22 17:16:18.260 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e02af423-c4c3-4fcd-be73-ebeba9fae411/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:16:18 compute-0 nova_compute[183075]: 2026-01-22 17:16:18.359 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e02af423-c4c3-4fcd-be73-ebeba9fae411/disk --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:16:18 compute-0 nova_compute[183075]: 2026-01-22 17:16:18.361 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e02af423-c4c3-4fcd-be73-ebeba9fae411/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:16:18 compute-0 nova_compute[183075]: 2026-01-22 17:16:18.432 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e02af423-c4c3-4fcd-be73-ebeba9fae411/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:16:18 compute-0 nova_compute[183075]: 2026-01-22 17:16:18.448 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b9e0b4e-34d1-46d1-8f04-a07da354e704/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:16:18 compute-0 nova_compute[183075]: 2026-01-22 17:16:18.542 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b9e0b4e-34d1-46d1-8f04-a07da354e704/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:16:18 compute-0 nova_compute[183075]: 2026-01-22 17:16:18.543 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b9e0b4e-34d1-46d1-8f04-a07da354e704/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:16:18 compute-0 nova_compute[183075]: 2026-01-22 17:16:18.604 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b9e0b4e-34d1-46d1-8f04-a07da354e704/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:16:18 compute-0 nova_compute[183075]: 2026-01-22 17:16:18.801 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:16:18 compute-0 nova_compute[183075]: 2026-01-22 17:16:18.803 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5294MB free_disk=73.31105041503906GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:16:18 compute-0 nova_compute[183075]: 2026-01-22 17:16:18.803 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:16:18 compute-0 nova_compute[183075]: 2026-01-22 17:16:18.803 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:16:18 compute-0 nova_compute[183075]: 2026-01-22 17:16:18.891 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 1b9e0b4e-34d1-46d1-8f04-a07da354e704 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:16:18 compute-0 nova_compute[183075]: 2026-01-22 17:16:18.892 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance e02af423-c4c3-4fcd-be73-ebeba9fae411 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:16:18 compute-0 nova_compute[183075]: 2026-01-22 17:16:18.892 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:16:18 compute-0 nova_compute[183075]: 2026-01-22 17:16:18.893 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:16:18 compute-0 nova_compute[183075]: 2026-01-22 17:16:18.984 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:16:19 compute-0 nova_compute[183075]: 2026-01-22 17:16:18.999 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:16:19 compute-0 nova_compute[183075]: 2026-01-22 17:16:19.024 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:16:19 compute-0 nova_compute[183075]: 2026-01-22 17:16:19.025 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.222s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:16:19 compute-0 nova_compute[183075]: 2026-01-22 17:16:19.453 183079 INFO nova.compute.manager [None req-3e4e1277-db1d-41e4-885e-c8a560437028 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Get console output
Jan 22 17:16:19 compute-0 nova_compute[183075]: 2026-01-22 17:16:19.459 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:16:20 compute-0 nova_compute[183075]: 2026-01-22 17:16:20.440 183079 INFO nova.compute.manager [None req-a1de9862-1338-41f3-bdbd-928760ae8553 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Get console output
Jan 22 17:16:20 compute-0 nova_compute[183075]: 2026-01-22 17:16:20.446 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:16:20 compute-0 nova_compute[183075]: 2026-01-22 17:16:20.669 183079 INFO nova.compute.manager [None req-cdf22d44-afeb-4c48-9a15-c119e497d895 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Get console output
Jan 22 17:16:20 compute-0 nova_compute[183075]: 2026-01-22 17:16:20.675 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:16:20 compute-0 nova_compute[183075]: 2026-01-22 17:16:20.770 183079 DEBUG oslo_concurrency.lockutils [None req-c39e41c7-2caf-45b4-b5d9-047872448dbd 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Acquiring lock "1b9e0b4e-34d1-46d1-8f04-a07da354e704" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:16:20 compute-0 nova_compute[183075]: 2026-01-22 17:16:20.770 183079 DEBUG oslo_concurrency.lockutils [None req-c39e41c7-2caf-45b4-b5d9-047872448dbd 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Lock "1b9e0b4e-34d1-46d1-8f04-a07da354e704" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:16:20 compute-0 nova_compute[183075]: 2026-01-22 17:16:20.771 183079 DEBUG oslo_concurrency.lockutils [None req-c39e41c7-2caf-45b4-b5d9-047872448dbd 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Acquiring lock "1b9e0b4e-34d1-46d1-8f04-a07da354e704-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:16:20 compute-0 nova_compute[183075]: 2026-01-22 17:16:20.771 183079 DEBUG oslo_concurrency.lockutils [None req-c39e41c7-2caf-45b4-b5d9-047872448dbd 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Lock "1b9e0b4e-34d1-46d1-8f04-a07da354e704-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:16:20 compute-0 nova_compute[183075]: 2026-01-22 17:16:20.772 183079 DEBUG oslo_concurrency.lockutils [None req-c39e41c7-2caf-45b4-b5d9-047872448dbd 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Lock "1b9e0b4e-34d1-46d1-8f04-a07da354e704-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:16:20 compute-0 nova_compute[183075]: 2026-01-22 17:16:20.774 183079 INFO nova.compute.manager [None req-c39e41c7-2caf-45b4-b5d9-047872448dbd 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Terminating instance
Jan 22 17:16:20 compute-0 nova_compute[183075]: 2026-01-22 17:16:20.775 183079 DEBUG nova.compute.manager [None req-c39e41c7-2caf-45b4-b5d9-047872448dbd 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:16:20 compute-0 kernel: tap00a6b90a-3d (unregistering): left promiscuous mode
Jan 22 17:16:20 compute-0 NetworkManager[55454]: <info>  [1769102180.7983] device (tap00a6b90a-3d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:16:20 compute-0 nova_compute[183075]: 2026-01-22 17:16:20.845 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:20 compute-0 ovn_controller[95372]: 2026-01-22T17:16:20Z|00355|binding|INFO|Releasing lport 00a6b90a-3de5-45e7-93ca-e2b72cac1de3 from this chassis (sb_readonly=0)
Jan 22 17:16:20 compute-0 ovn_controller[95372]: 2026-01-22T17:16:20Z|00356|binding|INFO|Setting lport 00a6b90a-3de5-45e7-93ca-e2b72cac1de3 down in Southbound
Jan 22 17:16:20 compute-0 ovn_controller[95372]: 2026-01-22T17:16:20Z|00357|binding|INFO|Removing iface tap00a6b90a-3d ovn-installed in OVS
Jan 22 17:16:20 compute-0 nova_compute[183075]: 2026-01-22 17:16:20.849 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:20 compute-0 nova_compute[183075]: 2026-01-22 17:16:20.861 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:20.870 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:67:11 10.100.0.9'], port_security=['fa:16:3e:05:67:11 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-DHCPTest-396339124', 'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '1b9e0b4e-34d1-46d1-8f04-a07da354e704', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-922155c4-0d93-4488-a55e-c0d6583804c3', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-DHCPTest-396339124', 'neutron:project_id': '8d764c6e1fdd46b88f83657e6a259c71', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ef07e124-c32c-447c-8695-9ee6eb139f7e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.229', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=284b2620-f1a0-4ab6-8476-908f65d591a2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=00a6b90a-3de5-45e7-93ca-e2b72cac1de3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:16:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:20.872 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 00a6b90a-3de5-45e7-93ca-e2b72cac1de3 in datapath 922155c4-0d93-4488-a55e-c0d6583804c3 unbound from our chassis
Jan 22 17:16:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:20.874 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 922155c4-0d93-4488-a55e-c0d6583804c3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:16:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:20.875 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[02a3c2f0-2f06-49a9-9e8a-c306e2eda738]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:20.876 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-922155c4-0d93-4488-a55e-c0d6583804c3 namespace which is not needed anymore
Jan 22 17:16:20 compute-0 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Jan 22 17:16:20 compute-0 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d0000001d.scope: Consumed 15.319s CPU time.
Jan 22 17:16:20 compute-0 nova_compute[183075]: 2026-01-22 17:16:20.909 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:20 compute-0 systemd-machined[154382]: Machine qemu-29-instance-0000001d terminated.
Jan 22 17:16:21 compute-0 neutron-haproxy-ovnmeta-922155c4-0d93-4488-a55e-c0d6583804c3[223323]: [NOTICE]   (223327) : haproxy version is 2.8.14-c23fe91
Jan 22 17:16:21 compute-0 neutron-haproxy-ovnmeta-922155c4-0d93-4488-a55e-c0d6583804c3[223323]: [NOTICE]   (223327) : path to executable is /usr/sbin/haproxy
Jan 22 17:16:21 compute-0 neutron-haproxy-ovnmeta-922155c4-0d93-4488-a55e-c0d6583804c3[223323]: [WARNING]  (223327) : Exiting Master process...
Jan 22 17:16:21 compute-0 nova_compute[183075]: 2026-01-22 17:16:21.054 183079 INFO nova.virt.libvirt.driver [-] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Instance destroyed successfully.
Jan 22 17:16:21 compute-0 nova_compute[183075]: 2026-01-22 17:16:21.055 183079 DEBUG nova.objects.instance [None req-c39e41c7-2caf-45b4-b5d9-047872448dbd 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Lazy-loading 'resources' on Instance uuid 1b9e0b4e-34d1-46d1-8f04-a07da354e704 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:16:21 compute-0 neutron-haproxy-ovnmeta-922155c4-0d93-4488-a55e-c0d6583804c3[223323]: [ALERT]    (223327) : Current worker (223329) exited with code 143 (Terminated)
Jan 22 17:16:21 compute-0 neutron-haproxy-ovnmeta-922155c4-0d93-4488-a55e-c0d6583804c3[223323]: [WARNING]  (223327) : All workers exited. Exiting... (0)
Jan 22 17:16:21 compute-0 systemd[1]: libpod-753d8288d75fee4d824bd426081cff820975aa801c3263879d62b5202b50cdc8.scope: Deactivated successfully.
Jan 22 17:16:21 compute-0 podman[223963]: 2026-01-22 17:16:21.063842388 +0000 UTC m=+0.061141699 container died 753d8288d75fee4d824bd426081cff820975aa801c3263879d62b5202b50cdc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-922155c4-0d93-4488-a55e-c0d6583804c3, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:16:21 compute-0 nova_compute[183075]: 2026-01-22 17:16:21.068 183079 DEBUG nova.virt.libvirt.vif [None req-c39e41c7-2caf-45b4-b5d9-047872448dbd 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:15:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1891638859',display_name='tempest-server-test-1891638859',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1891638859',id=29,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHJ2WpIHRA1kq4lV633wE2x21ZQiUKsNno6rE9q+Yvr8+SBuOLxYX/SrJrCgwnCnqd0qro7QnVhx9iXp6Xfs9luRAXTITM2MCnWaFzPevTHxPPALi/yKqLeXytZxdZTrkw==',key_name='tempest-DHCPTest-396339124',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:15:25Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8d764c6e1fdd46b88f83657e6a259c71',ramdisk_id='',reservation_id='r-rf98r04v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_inp
ut_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-DHCPTest-488220837',owner_user_name='tempest-DHCPTest-488220837-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:15:25Z,user_data=None,user_id='21741b1e79254e698cc6d7684318589f',uuid=1b9e0b4e-34d1-46d1-8f04-a07da354e704,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "00a6b90a-3de5-45e7-93ca-e2b72cac1de3", "address": "fa:16:3e:05:67:11", "network": {"id": "922155c4-0d93-4488-a55e-c0d6583804c3", "bridge": "br-int", "label": "tempest-DHCPTest-396339124", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8d764c6e1fdd46b88f83657e6a259c71", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00a6b90a-3d", "ovs_interfaceid": "00a6b90a-3de5-45e7-93ca-e2b72cac1de3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:16:21 compute-0 nova_compute[183075]: 2026-01-22 17:16:21.069 183079 DEBUG nova.network.os_vif_util [None req-c39e41c7-2caf-45b4-b5d9-047872448dbd 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Converting VIF {"id": "00a6b90a-3de5-45e7-93ca-e2b72cac1de3", "address": "fa:16:3e:05:67:11", "network": {"id": "922155c4-0d93-4488-a55e-c0d6583804c3", "bridge": "br-int", "label": "tempest-DHCPTest-396339124", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8d764c6e1fdd46b88f83657e6a259c71", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00a6b90a-3d", "ovs_interfaceid": "00a6b90a-3de5-45e7-93ca-e2b72cac1de3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:16:21 compute-0 nova_compute[183075]: 2026-01-22 17:16:21.070 183079 DEBUG nova.network.os_vif_util [None req-c39e41c7-2caf-45b4-b5d9-047872448dbd 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:05:67:11,bridge_name='br-int',has_traffic_filtering=True,id=00a6b90a-3de5-45e7-93ca-e2b72cac1de3,network=Network(922155c4-0d93-4488-a55e-c0d6583804c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap00a6b90a-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:16:21 compute-0 nova_compute[183075]: 2026-01-22 17:16:21.071 183079 DEBUG os_vif [None req-c39e41c7-2caf-45b4-b5d9-047872448dbd 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:05:67:11,bridge_name='br-int',has_traffic_filtering=True,id=00a6b90a-3de5-45e7-93ca-e2b72cac1de3,network=Network(922155c4-0d93-4488-a55e-c0d6583804c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap00a6b90a-3d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:16:21 compute-0 nova_compute[183075]: 2026-01-22 17:16:21.075 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:21 compute-0 nova_compute[183075]: 2026-01-22 17:16:21.080 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap00a6b90a-3d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:16:21 compute-0 nova_compute[183075]: 2026-01-22 17:16:21.085 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:21 compute-0 nova_compute[183075]: 2026-01-22 17:16:21.088 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:16:21 compute-0 nova_compute[183075]: 2026-01-22 17:16:21.090 183079 INFO os_vif [None req-c39e41c7-2caf-45b4-b5d9-047872448dbd 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:05:67:11,bridge_name='br-int',has_traffic_filtering=True,id=00a6b90a-3de5-45e7-93ca-e2b72cac1de3,network=Network(922155c4-0d93-4488-a55e-c0d6583804c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap00a6b90a-3d')
Jan 22 17:16:21 compute-0 nova_compute[183075]: 2026-01-22 17:16:21.090 183079 INFO nova.virt.libvirt.driver [None req-c39e41c7-2caf-45b4-b5d9-047872448dbd 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Deleting instance files /var/lib/nova/instances/1b9e0b4e-34d1-46d1-8f04-a07da354e704_del
Jan 22 17:16:21 compute-0 nova_compute[183075]: 2026-01-22 17:16:21.091 183079 INFO nova.virt.libvirt.driver [None req-c39e41c7-2caf-45b4-b5d9-047872448dbd 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Deletion of /var/lib/nova/instances/1b9e0b4e-34d1-46d1-8f04-a07da354e704_del complete
Jan 22 17:16:21 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-753d8288d75fee4d824bd426081cff820975aa801c3263879d62b5202b50cdc8-userdata-shm.mount: Deactivated successfully.
Jan 22 17:16:21 compute-0 systemd[1]: var-lib-containers-storage-overlay-8ea4a1becb38a6424de738958d8c8d23623d806b180d4f1eb01e9de2c2cb9efd-merged.mount: Deactivated successfully.
Jan 22 17:16:21 compute-0 podman[223963]: 2026-01-22 17:16:21.116098073 +0000 UTC m=+0.113397374 container cleanup 753d8288d75fee4d824bd426081cff820975aa801c3263879d62b5202b50cdc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-922155c4-0d93-4488-a55e-c0d6583804c3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:16:21 compute-0 systemd[1]: libpod-conmon-753d8288d75fee4d824bd426081cff820975aa801c3263879d62b5202b50cdc8.scope: Deactivated successfully.
Jan 22 17:16:21 compute-0 nova_compute[183075]: 2026-01-22 17:16:21.149 183079 INFO nova.compute.manager [None req-c39e41c7-2caf-45b4-b5d9-047872448dbd 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Took 0.37 seconds to destroy the instance on the hypervisor.
Jan 22 17:16:21 compute-0 nova_compute[183075]: 2026-01-22 17:16:21.150 183079 DEBUG oslo.service.loopingcall [None req-c39e41c7-2caf-45b4-b5d9-047872448dbd 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:16:21 compute-0 nova_compute[183075]: 2026-01-22 17:16:21.150 183079 DEBUG nova.compute.manager [-] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:16:21 compute-0 nova_compute[183075]: 2026-01-22 17:16:21.150 183079 DEBUG nova.network.neutron [-] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:16:21 compute-0 podman[224007]: 2026-01-22 17:16:21.192682684 +0000 UTC m=+0.048417886 container remove 753d8288d75fee4d824bd426081cff820975aa801c3263879d62b5202b50cdc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-922155c4-0d93-4488-a55e-c0d6583804c3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:16:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:21.197 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e4aa572d-2ee8-48b6-ae24-84b58d5c308e]: (4, ('Thu Jan 22 05:16:20 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-922155c4-0d93-4488-a55e-c0d6583804c3 (753d8288d75fee4d824bd426081cff820975aa801c3263879d62b5202b50cdc8)\n753d8288d75fee4d824bd426081cff820975aa801c3263879d62b5202b50cdc8\nThu Jan 22 05:16:21 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-922155c4-0d93-4488-a55e-c0d6583804c3 (753d8288d75fee4d824bd426081cff820975aa801c3263879d62b5202b50cdc8)\n753d8288d75fee4d824bd426081cff820975aa801c3263879d62b5202b50cdc8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:21.200 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[74b845c2-2b1d-4d07-93e3-b1de010af74b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:21.201 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap922155c4-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:16:21 compute-0 nova_compute[183075]: 2026-01-22 17:16:21.204 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:21 compute-0 kernel: tap922155c4-00: left promiscuous mode
Jan 22 17:16:21 compute-0 nova_compute[183075]: 2026-01-22 17:16:21.229 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:21.233 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1b0536ba-081b-4951-86dd-cd29ac7cd091]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:21.246 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4eac4204-d714-48ca-b637-6c550c39e1dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:21.248 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8532c1a9-757a-49d7-8ff8-700eb00a0c8e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:21.271 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[08ef9bda-22f3-44ea-a289-2775bb9befcc]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 448816, 'reachable_time': 35334, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224022, 'error': None, 'target': 'ovnmeta-922155c4-0d93-4488-a55e-c0d6583804c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:21 compute-0 systemd[1]: run-netns-ovnmeta\x2d922155c4\x2d0d93\x2d4488\x2da55e\x2dc0d6583804c3.mount: Deactivated successfully.
Jan 22 17:16:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:21.275 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-922155c4-0d93-4488-a55e-c0d6583804c3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:16:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:21.275 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[3170b6e6-fdf3-434b-9f9c-c9d0860c888f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:21 compute-0 nova_compute[183075]: 2026-01-22 17:16:21.681 183079 DEBUG nova.compute.manager [req-e9fd37a4-50c7-486b-8527-27ef0eff0249 req-a6ae3612-1a67-475c-a8bf-cb72a909246c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Received event network-vif-unplugged-00a6b90a-3de5-45e7-93ca-e2b72cac1de3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:16:21 compute-0 nova_compute[183075]: 2026-01-22 17:16:21.681 183079 DEBUG oslo_concurrency.lockutils [req-e9fd37a4-50c7-486b-8527-27ef0eff0249 req-a6ae3612-1a67-475c-a8bf-cb72a909246c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "1b9e0b4e-34d1-46d1-8f04-a07da354e704-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:16:21 compute-0 nova_compute[183075]: 2026-01-22 17:16:21.681 183079 DEBUG oslo_concurrency.lockutils [req-e9fd37a4-50c7-486b-8527-27ef0eff0249 req-a6ae3612-1a67-475c-a8bf-cb72a909246c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "1b9e0b4e-34d1-46d1-8f04-a07da354e704-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:16:21 compute-0 nova_compute[183075]: 2026-01-22 17:16:21.681 183079 DEBUG oslo_concurrency.lockutils [req-e9fd37a4-50c7-486b-8527-27ef0eff0249 req-a6ae3612-1a67-475c-a8bf-cb72a909246c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "1b9e0b4e-34d1-46d1-8f04-a07da354e704-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:16:21 compute-0 nova_compute[183075]: 2026-01-22 17:16:21.682 183079 DEBUG nova.compute.manager [req-e9fd37a4-50c7-486b-8527-27ef0eff0249 req-a6ae3612-1a67-475c-a8bf-cb72a909246c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] No waiting events found dispatching network-vif-unplugged-00a6b90a-3de5-45e7-93ca-e2b72cac1de3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:16:21 compute-0 nova_compute[183075]: 2026-01-22 17:16:21.682 183079 DEBUG nova.compute.manager [req-e9fd37a4-50c7-486b-8527-27ef0eff0249 req-a6ae3612-1a67-475c-a8bf-cb72a909246c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Received event network-vif-unplugged-00a6b90a-3de5-45e7-93ca-e2b72cac1de3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:16:21 compute-0 nova_compute[183075]: 2026-01-22 17:16:21.778 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:21.778 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:16:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:21.781 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:16:21 compute-0 nova_compute[183075]: 2026-01-22 17:16:21.864 183079 INFO nova.compute.manager [None req-8943ad7e-77da-4ce3-a897-03508c73075b 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Get console output
Jan 22 17:16:21 compute-0 nova_compute[183075]: 2026-01-22 17:16:21.870 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:16:23 compute-0 nova_compute[183075]: 2026-01-22 17:16:23.190 183079 DEBUG nova.network.neutron [-] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:16:23 compute-0 nova_compute[183075]: 2026-01-22 17:16:23.208 183079 INFO nova.compute.manager [-] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Took 2.06 seconds to deallocate network for instance.
Jan 22 17:16:23 compute-0 nova_compute[183075]: 2026-01-22 17:16:23.259 183079 DEBUG oslo_concurrency.lockutils [None req-c39e41c7-2caf-45b4-b5d9-047872448dbd 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:16:23 compute-0 nova_compute[183075]: 2026-01-22 17:16:23.260 183079 DEBUG oslo_concurrency.lockutils [None req-c39e41c7-2caf-45b4-b5d9-047872448dbd 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:16:23 compute-0 nova_compute[183075]: 2026-01-22 17:16:23.344 183079 DEBUG nova.compute.provider_tree [None req-c39e41c7-2caf-45b4-b5d9-047872448dbd 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:16:23 compute-0 nova_compute[183075]: 2026-01-22 17:16:23.364 183079 DEBUG nova.scheduler.client.report [None req-c39e41c7-2caf-45b4-b5d9-047872448dbd 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:16:23 compute-0 nova_compute[183075]: 2026-01-22 17:16:23.383 183079 DEBUG oslo_concurrency.lockutils [None req-c39e41c7-2caf-45b4-b5d9-047872448dbd 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:16:23 compute-0 nova_compute[183075]: 2026-01-22 17:16:23.405 183079 INFO nova.scheduler.client.report [None req-c39e41c7-2caf-45b4-b5d9-047872448dbd 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Deleted allocations for instance 1b9e0b4e-34d1-46d1-8f04-a07da354e704
Jan 22 17:16:23 compute-0 nova_compute[183075]: 2026-01-22 17:16:23.465 183079 DEBUG oslo_concurrency.lockutils [None req-c39e41c7-2caf-45b4-b5d9-047872448dbd 21741b1e79254e698cc6d7684318589f 8d764c6e1fdd46b88f83657e6a259c71 - - default default] Lock "1b9e0b4e-34d1-46d1-8f04-a07da354e704" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.695s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:16:23 compute-0 nova_compute[183075]: 2026-01-22 17:16:23.824 183079 DEBUG nova.compute.manager [req-473f1804-82cb-4042-b105-3ba9dd715a06 req-223bad2c-2c36-43bd-b1c0-378c78203b41 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Received event network-vif-plugged-00a6b90a-3de5-45e7-93ca-e2b72cac1de3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:16:23 compute-0 nova_compute[183075]: 2026-01-22 17:16:23.824 183079 DEBUG oslo_concurrency.lockutils [req-473f1804-82cb-4042-b105-3ba9dd715a06 req-223bad2c-2c36-43bd-b1c0-378c78203b41 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "1b9e0b4e-34d1-46d1-8f04-a07da354e704-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:16:23 compute-0 nova_compute[183075]: 2026-01-22 17:16:23.825 183079 DEBUG oslo_concurrency.lockutils [req-473f1804-82cb-4042-b105-3ba9dd715a06 req-223bad2c-2c36-43bd-b1c0-378c78203b41 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "1b9e0b4e-34d1-46d1-8f04-a07da354e704-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:16:23 compute-0 nova_compute[183075]: 2026-01-22 17:16:23.825 183079 DEBUG oslo_concurrency.lockutils [req-473f1804-82cb-4042-b105-3ba9dd715a06 req-223bad2c-2c36-43bd-b1c0-378c78203b41 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "1b9e0b4e-34d1-46d1-8f04-a07da354e704-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:16:23 compute-0 nova_compute[183075]: 2026-01-22 17:16:23.826 183079 DEBUG nova.compute.manager [req-473f1804-82cb-4042-b105-3ba9dd715a06 req-223bad2c-2c36-43bd-b1c0-378c78203b41 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] No waiting events found dispatching network-vif-plugged-00a6b90a-3de5-45e7-93ca-e2b72cac1de3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:16:23 compute-0 nova_compute[183075]: 2026-01-22 17:16:23.826 183079 WARNING nova.compute.manager [req-473f1804-82cb-4042-b105-3ba9dd715a06 req-223bad2c-2c36-43bd-b1c0-378c78203b41 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Received unexpected event network-vif-plugged-00a6b90a-3de5-45e7-93ca-e2b72cac1de3 for instance with vm_state deleted and task_state None.
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.579 183079 DEBUG oslo_concurrency.lockutils [None req-2d53e45c-71fe-44c4-afbb-c37112e5af56 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquiring lock "interface-e02af423-c4c3-4fcd-be73-ebeba9fae411-01441304-20f8-4d07-a1ed-05da5d9297d6" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.580 183079 DEBUG oslo_concurrency.lockutils [None req-2d53e45c-71fe-44c4-afbb-c37112e5af56 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "interface-e02af423-c4c3-4fcd-be73-ebeba9fae411-01441304-20f8-4d07-a1ed-05da5d9297d6" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.593 183079 DEBUG nova.objects.instance [None req-2d53e45c-71fe-44c4-afbb-c37112e5af56 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lazy-loading 'flavor' on Instance uuid e02af423-c4c3-4fcd-be73-ebeba9fae411 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.629 183079 DEBUG nova.virt.libvirt.vif [None req-2d53e45c-71fe-44c4-afbb-c37112e5af56 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:15:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1223247828',display_name='tempest-server-test-1223247828',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1223247828',id=30,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDmUdcgi0W3CCW7+0zVPZm4+SGvgMVYIF2EiVCZOZNnFUe7R59w7UIBntHS+djSABVTvpJeBQFo+aQIZ5cRYQ1WALN6CDQL2VSKr+LjaC8setfZMT5V7IalcT1iGutwkuw==',key_name='tempest-keypair-test-914827607',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:15:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4f95c62a5a194d2291b03187a9c85702',ramdisk_id='',reservation_id='r-kexgovqt',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_
input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-IPv6Test-1828780436',owner_user_name='tempest-IPv6Test-1828780436-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:15:56Z,user_data=None,user_id='2fe02f7484a94091bab26aba1c370459',uuid=e02af423-c4c3-4fcd-be73-ebeba9fae411,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "01441304-20f8-4d07-a1ed-05da5d9297d6", "address": "fa:16:3e:db:72:7a", "network": {"id": "ee52ea03-3241-4391-9bfe-b2039dbf3bfe", "bridge": "br-int", "label": "tempest-test-network--458174995", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fedb:727a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01441304-20", "ovs_interfaceid": "01441304-20f8-4d07-a1ed-05da5d9297d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.629 183079 DEBUG nova.network.os_vif_util [None req-2d53e45c-71fe-44c4-afbb-c37112e5af56 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converting VIF {"id": "01441304-20f8-4d07-a1ed-05da5d9297d6", "address": "fa:16:3e:db:72:7a", "network": {"id": "ee52ea03-3241-4391-9bfe-b2039dbf3bfe", "bridge": "br-int", "label": "tempest-test-network--458174995", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fedb:727a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01441304-20", "ovs_interfaceid": "01441304-20f8-4d07-a1ed-05da5d9297d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.630 183079 DEBUG nova.network.os_vif_util [None req-2d53e45c-71fe-44c4-afbb-c37112e5af56 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:72:7a,bridge_name='br-int',has_traffic_filtering=True,id=01441304-20f8-4d07-a1ed-05da5d9297d6,network=Network(ee52ea03-3241-4391-9bfe-b2039dbf3bfe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01441304-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.633 183079 DEBUG nova.virt.libvirt.guest [None req-2d53e45c-71fe-44c4-afbb-c37112e5af56 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:db:72:7a"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap01441304-20"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.635 183079 DEBUG nova.virt.libvirt.guest [None req-2d53e45c-71fe-44c4-afbb-c37112e5af56 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:db:72:7a"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap01441304-20"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.639 183079 DEBUG nova.virt.libvirt.driver [None req-2d53e45c-71fe-44c4-afbb-c37112e5af56 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Attempting to detach device tap01441304-20 from instance e02af423-c4c3-4fcd-be73-ebeba9fae411 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.639 183079 DEBUG nova.virt.libvirt.guest [None req-2d53e45c-71fe-44c4-afbb-c37112e5af56 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] detach device xml: <interface type="ethernet">
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <mac address="fa:16:3e:db:72:7a"/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <model type="virtio"/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <mtu size="1442"/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <target dev="tap01441304-20"/>
Jan 22 17:16:25 compute-0 nova_compute[183075]: </interface>
Jan 22 17:16:25 compute-0 nova_compute[183075]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.645 183079 DEBUG nova.virt.libvirt.guest [None req-2d53e45c-71fe-44c4-afbb-c37112e5af56 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:db:72:7a"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap01441304-20"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.648 183079 DEBUG nova.virt.libvirt.guest [None req-2d53e45c-71fe-44c4-afbb-c37112e5af56 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:db:72:7a"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap01441304-20"/></interface>not found in domain: <domain type='kvm' id='30'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <name>instance-0000001e</name>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <uuid>e02af423-c4c3-4fcd-be73-ebeba9fae411</uuid>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1223247828</nova:name>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:15:55</nova:creationTime>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:16:25 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:16:25 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:16:25 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:16:25 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:16:25 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:16:25 compute-0 nova_compute[183075]:         <nova:user uuid="2fe02f7484a94091bab26aba1c370459">tempest-IPv6Test-1828780436-project-member</nova:user>
Jan 22 17:16:25 compute-0 nova_compute[183075]:         <nova:project uuid="4f95c62a5a194d2291b03187a9c85702">tempest-IPv6Test-1828780436</nova:project>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:16:25 compute-0 nova_compute[183075]:         <nova:port uuid="0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510">
Jan 22 17:16:25 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:16:25 compute-0 nova_compute[183075]:         <nova:port uuid="01441304-20f8-4d07-a1ed-05da5d9297d6">
Jan 22 17:16:25 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fedb:727a" ipVersion="6"/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <memory unit='KiB'>131072</memory>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <vcpu placement='static'>1</vcpu>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <resource>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <partition>/machine</partition>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   </resource>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <sysinfo type='smbios'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <system>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <entry name='manufacturer'>RDO</entry>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <entry name='product'>OpenStack Compute</entry>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <entry name='serial'>e02af423-c4c3-4fcd-be73-ebeba9fae411</entry>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <entry name='uuid'>e02af423-c4c3-4fcd-be73-ebeba9fae411</entry>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <entry name='family'>Virtual Machine</entry>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </system>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <os>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <boot dev='hd'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <smbios mode='sysinfo'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   </os>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <features>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <vmcoreinfo state='on'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   </features>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <cpu mode='custom' match='exact' check='full'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <model fallback='forbid'>EPYC-Rome</model>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <vendor>AMD</vendor>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='require' name='x2apic'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='require' name='tsc-deadline'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='require' name='hypervisor'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='require' name='tsc_adjust'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='require' name='spec-ctrl'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='require' name='stibp'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='require' name='ssbd'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='require' name='cmp_legacy'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='require' name='overflow-recov'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='require' name='succor'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='require' name='ibrs'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='require' name='amd-ssbd'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='require' name='virt-ssbd'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='disable' name='lbrv'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='disable' name='tsc-scale'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='disable' name='vmcb-clean'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='disable' name='flushbyasid'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='disable' name='pause-filter'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='disable' name='pfthreshold'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='disable' name='svme-addr-chk'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='require' name='lfence-always-serializing'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='disable' name='xsaves'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='disable' name='svm'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='require' name='topoext'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='disable' name='npt'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='disable' name='nrip-save'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <clock offset='utc'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <timer name='pit' tickpolicy='delay'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <timer name='hpet' present='no'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <on_poweroff>destroy</on_poweroff>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <on_reboot>restart</on_reboot>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <on_crash>destroy</on_crash>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <disk type='file' device='disk'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <source file='/var/lib/nova/instances/e02af423-c4c3-4fcd-be73-ebeba9fae411/disk' index='1'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <backingStore type='file' index='2'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:         <format type='raw'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:         <source file='/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:         <backingStore/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       </backingStore>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target dev='vda' bus='virtio'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='virtio-disk0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='0' model='pcie-root'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pcie.0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='1' port='0x10'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.1'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='2' port='0x11'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.2'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='3' port='0x12'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.3'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='4' port='0x13'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.4'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='5' port='0x14'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.5'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='6' port='0x15'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.6'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='7' port='0x16'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.7'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='8' port='0x17'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.8'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='9' port='0x18'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.9'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='10' port='0x19'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.10'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='11' port='0x1a'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.11'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='12' port='0x1b'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.12'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='13' port='0x1c'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.13'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='14' port='0x1d'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.14'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='15' port='0x1e'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.15'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='16' port='0x1f'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.16'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='17' port='0x20'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.17'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='18' port='0x21'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.18'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='19' port='0x22'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.19'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='20' port='0x23'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.20'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='21' port='0x24'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.21'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='22' port='0x25'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.22'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='23' port='0x26'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.23'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='24' port='0x27'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.24'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='25' port='0x28'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.25'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-pci-bridge'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.26'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='usb'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='sata' index='0'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='ide'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <interface type='ethernet'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <mac address='fa:16:3e:d9:53:87'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target dev='tap0b803bb5-5b'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model type='virtio'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <driver name='vhost' rx_queue_size='512'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <mtu size='1442'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='net0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <interface type='ethernet'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <mac address='fa:16:3e:db:72:7a'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target dev='tap01441304-20'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model type='virtio'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <driver name='vhost' rx_queue_size='512'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <mtu size='1442'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='net1'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <serial type='pty'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <source path='/dev/pts/0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <log file='/var/lib/nova/instances/e02af423-c4c3-4fcd-be73-ebeba9fae411/console.log' append='off'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target type='isa-serial' port='0'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:         <model name='isa-serial'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       </target>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='serial0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <console type='pty' tty='/dev/pts/0'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <source path='/dev/pts/0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <log file='/var/lib/nova/instances/e02af423-c4c3-4fcd-be73-ebeba9fae411/console.log' append='off'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target type='serial' port='0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='serial0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </console>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <input type='tablet' bus='usb'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='input0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='usb' bus='0' port='1'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </input>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <input type='mouse' bus='ps2'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='input1'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </input>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <input type='keyboard' bus='ps2'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='input2'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </input>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <listen type='address' address='::0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </graphics>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <audio id='1' type='none'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <video>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model type='virtio' heads='1' primary='yes'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='video0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </video>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <watchdog model='itco' action='reset'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='watchdog0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </watchdog>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <memballoon model='virtio'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <stats period='10'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='balloon0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <rng model='virtio'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <backend model='random'>/dev/urandom</backend>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='rng0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <label>system_u:system_r:svirt_t:s0:c203,c337</label>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c203,c337</imagelabel>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   </seclabel>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <label>+107:+107</label>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <imagelabel>+107:+107</imagelabel>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   </seclabel>
Jan 22 17:16:25 compute-0 nova_compute[183075]: </domain>
Jan 22 17:16:25 compute-0 nova_compute[183075]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.649 183079 INFO nova.virt.libvirt.driver [None req-2d53e45c-71fe-44c4-afbb-c37112e5af56 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Successfully detached device tap01441304-20 from instance e02af423-c4c3-4fcd-be73-ebeba9fae411 from the persistent domain config.
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.650 183079 DEBUG nova.virt.libvirt.driver [None req-2d53e45c-71fe-44c4-afbb-c37112e5af56 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] (1/8): Attempting to detach device tap01441304-20 with device alias net1 from instance e02af423-c4c3-4fcd-be73-ebeba9fae411 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.650 183079 DEBUG nova.virt.libvirt.guest [None req-2d53e45c-71fe-44c4-afbb-c37112e5af56 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] detach device xml: <interface type="ethernet">
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <mac address="fa:16:3e:db:72:7a"/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <model type="virtio"/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <mtu size="1442"/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <target dev="tap01441304-20"/>
Jan 22 17:16:25 compute-0 nova_compute[183075]: </interface>
Jan 22 17:16:25 compute-0 nova_compute[183075]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 22 17:16:25 compute-0 kernel: tap01441304-20 (unregistering): left promiscuous mode
Jan 22 17:16:25 compute-0 NetworkManager[55454]: <info>  [1769102185.7067] device (tap01441304-20): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:16:25 compute-0 ovn_controller[95372]: 2026-01-22T17:16:25Z|00358|binding|INFO|Releasing lport 01441304-20f8-4d07-a1ed-05da5d9297d6 from this chassis (sb_readonly=0)
Jan 22 17:16:25 compute-0 ovn_controller[95372]: 2026-01-22T17:16:25Z|00359|binding|INFO|Setting lport 01441304-20f8-4d07-a1ed-05da5d9297d6 down in Southbound
Jan 22 17:16:25 compute-0 ovn_controller[95372]: 2026-01-22T17:16:25Z|00360|binding|INFO|Removing iface tap01441304-20 ovn-installed in OVS
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.724 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.727 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:25.732 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:72:7a 2001:db8::f816:3eff:fedb:727a'], port_security=['fa:16:3e:db:72:7a 2001:db8::f816:3eff:fedb:727a'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fedb:727a/64', 'neutron:device_id': 'e02af423-c4c3-4fcd-be73-ebeba9fae411', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee52ea03-3241-4391-9bfe-b2039dbf3bfe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4f95c62a5a194d2291b03187a9c85702', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1b5e2b25-1ae0-464c-ac9a-7fc65ac893a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b9a1c076-b07e-4835-ae9a-9f814ca84200, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=01441304-20f8-4d07-a1ed-05da5d9297d6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:16:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:25.735 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 01441304-20f8-4d07-a1ed-05da5d9297d6 in datapath ee52ea03-3241-4391-9bfe-b2039dbf3bfe unbound from our chassis
Jan 22 17:16:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:25.738 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ee52ea03-3241-4391-9bfe-b2039dbf3bfe, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.740 183079 DEBUG nova.virt.libvirt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Received event <DeviceRemovedEvent: 1769102185.7397408, e02af423-c4c3-4fcd-be73-ebeba9fae411 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.743 183079 DEBUG nova.virt.libvirt.driver [None req-2d53e45c-71fe-44c4-afbb-c37112e5af56 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Start waiting for the detach event from libvirt for device tap01441304-20 with device alias net1 for instance e02af423-c4c3-4fcd-be73-ebeba9fae411 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.743 183079 DEBUG nova.virt.libvirt.guest [None req-2d53e45c-71fe-44c4-afbb-c37112e5af56 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:db:72:7a"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap01441304-20"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 17:16:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:25.740 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8acf2fe5-932a-40ea-8262-0611cc05a389]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:25.741 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ee52ea03-3241-4391-9bfe-b2039dbf3bfe namespace which is not needed anymore
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.748 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.752 183079 DEBUG nova.virt.libvirt.guest [None req-2d53e45c-71fe-44c4-afbb-c37112e5af56 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:db:72:7a"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap01441304-20"/></interface>not found in domain: <domain type='kvm' id='30'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <name>instance-0000001e</name>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <uuid>e02af423-c4c3-4fcd-be73-ebeba9fae411</uuid>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1223247828</nova:name>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:15:55</nova:creationTime>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:16:25 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:16:25 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:16:25 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:16:25 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:16:25 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:16:25 compute-0 nova_compute[183075]:         <nova:user uuid="2fe02f7484a94091bab26aba1c370459">tempest-IPv6Test-1828780436-project-member</nova:user>
Jan 22 17:16:25 compute-0 nova_compute[183075]:         <nova:project uuid="4f95c62a5a194d2291b03187a9c85702">tempest-IPv6Test-1828780436</nova:project>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:16:25 compute-0 nova_compute[183075]:         <nova:port uuid="0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510">
Jan 22 17:16:25 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:16:25 compute-0 nova_compute[183075]:         <nova:port uuid="01441304-20f8-4d07-a1ed-05da5d9297d6">
Jan 22 17:16:25 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fedb:727a" ipVersion="6"/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <memory unit='KiB'>131072</memory>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <vcpu placement='static'>1</vcpu>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <resource>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <partition>/machine</partition>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   </resource>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <sysinfo type='smbios'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <system>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <entry name='manufacturer'>RDO</entry>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <entry name='product'>OpenStack Compute</entry>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <entry name='serial'>e02af423-c4c3-4fcd-be73-ebeba9fae411</entry>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <entry name='uuid'>e02af423-c4c3-4fcd-be73-ebeba9fae411</entry>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <entry name='family'>Virtual Machine</entry>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </system>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <os>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <boot dev='hd'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <smbios mode='sysinfo'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   </os>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <features>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <vmcoreinfo state='on'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   </features>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <cpu mode='custom' match='exact' check='full'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <model fallback='forbid'>EPYC-Rome</model>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <vendor>AMD</vendor>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='require' name='x2apic'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='require' name='tsc-deadline'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='require' name='hypervisor'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='require' name='tsc_adjust'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='require' name='spec-ctrl'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='require' name='stibp'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='require' name='ssbd'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='require' name='cmp_legacy'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='require' name='overflow-recov'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='require' name='succor'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='require' name='ibrs'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='require' name='amd-ssbd'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='require' name='virt-ssbd'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='disable' name='lbrv'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='disable' name='tsc-scale'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='disable' name='vmcb-clean'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='disable' name='flushbyasid'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='disable' name='pause-filter'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='disable' name='pfthreshold'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='disable' name='svme-addr-chk'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='require' name='lfence-always-serializing'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='disable' name='xsaves'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='disable' name='svm'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='require' name='topoext'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='disable' name='npt'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <feature policy='disable' name='nrip-save'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <clock offset='utc'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <timer name='pit' tickpolicy='delay'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <timer name='hpet' present='no'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <on_poweroff>destroy</on_poweroff>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <on_reboot>restart</on_reboot>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <on_crash>destroy</on_crash>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <disk type='file' device='disk'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <source file='/var/lib/nova/instances/e02af423-c4c3-4fcd-be73-ebeba9fae411/disk' index='1'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <backingStore type='file' index='2'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:         <format type='raw'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:         <source file='/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:         <backingStore/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       </backingStore>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target dev='vda' bus='virtio'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='virtio-disk0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='0' model='pcie-root'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pcie.0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='1' port='0x10'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.1'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='2' port='0x11'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.2'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='3' port='0x12'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.3'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='4' port='0x13'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.4'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='5' port='0x14'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.5'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='6' port='0x15'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.6'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='7' port='0x16'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.7'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='8' port='0x17'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.8'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='9' port='0x18'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.9'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='10' port='0x19'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.10'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='11' port='0x1a'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.11'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='12' port='0x1b'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.12'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='13' port='0x1c'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.13'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='14' port='0x1d'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.14'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='15' port='0x1e'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.15'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='16' port='0x1f'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.16'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='17' port='0x20'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.17'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='18' port='0x21'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.18'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='19' port='0x22'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.19'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='20' port='0x23'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.20'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='21' port='0x24'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.21'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='22' port='0x25'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.22'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='23' port='0x26'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.23'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='24' port='0x27'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.24'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target chassis='25' port='0x28'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.25'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model name='pcie-pci-bridge'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='pci.26'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='usb'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <controller type='sata' index='0'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='ide'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <interface type='ethernet'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <mac address='fa:16:3e:d9:53:87'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target dev='tap0b803bb5-5b'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model type='virtio'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <driver name='vhost' rx_queue_size='512'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <mtu size='1442'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='net0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <serial type='pty'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <source path='/dev/pts/0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <log file='/var/lib/nova/instances/e02af423-c4c3-4fcd-be73-ebeba9fae411/console.log' append='off'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target type='isa-serial' port='0'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:         <model name='isa-serial'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       </target>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='serial0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <console type='pty' tty='/dev/pts/0'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <source path='/dev/pts/0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <log file='/var/lib/nova/instances/e02af423-c4c3-4fcd-be73-ebeba9fae411/console.log' append='off'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <target type='serial' port='0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='serial0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </console>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <input type='tablet' bus='usb'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='input0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='usb' bus='0' port='1'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </input>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <input type='mouse' bus='ps2'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='input1'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </input>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <input type='keyboard' bus='ps2'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='input2'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </input>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <listen type='address' address='::0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </graphics>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <audio id='1' type='none'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <video>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <model type='virtio' heads='1' primary='yes'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='video0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </video>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <watchdog model='itco' action='reset'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='watchdog0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </watchdog>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <memballoon model='virtio'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <stats period='10'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='balloon0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <rng model='virtio'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <backend model='random'>/dev/urandom</backend>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <alias name='rng0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <label>system_u:system_r:svirt_t:s0:c203,c337</label>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c203,c337</imagelabel>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   </seclabel>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <label>+107:+107</label>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <imagelabel>+107:+107</imagelabel>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   </seclabel>
Jan 22 17:16:25 compute-0 nova_compute[183075]: </domain>
Jan 22 17:16:25 compute-0 nova_compute[183075]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.752 183079 INFO nova.virt.libvirt.driver [None req-2d53e45c-71fe-44c4-afbb-c37112e5af56 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Successfully detached device tap01441304-20 from instance e02af423-c4c3-4fcd-be73-ebeba9fae411 from the live domain config.
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.754 183079 DEBUG nova.virt.libvirt.vif [None req-2d53e45c-71fe-44c4-afbb-c37112e5af56 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:15:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1223247828',display_name='tempest-server-test-1223247828',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1223247828',id=30,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDmUdcgi0W3CCW7+0zVPZm4+SGvgMVYIF2EiVCZOZNnFUe7R59w7UIBntHS+djSABVTvpJeBQFo+aQIZ5cRYQ1WALN6CDQL2VSKr+LjaC8setfZMT5V7IalcT1iGutwkuw==',key_name='tempest-keypair-test-914827607',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:15:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4f95c62a5a194d2291b03187a9c85702',ramdisk_id='',reservation_id='r-kexgovqt',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-IPv6Test-1828780436',owner_user_name='tempest-IPv6Test-1828780436-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:15:56Z,user_data=None,user_id='2fe02f7484a94091bab26aba1c370459',uuid=e02af423-c4c3-4fcd-be73-ebeba9fae411,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "01441304-20f8-4d07-a1ed-05da5d9297d6", "address": "fa:16:3e:db:72:7a", "network": {"id": "ee52ea03-3241-4391-9bfe-b2039dbf3bfe", "bridge": "br-int", "label": "tempest-test-network--458174995", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fedb:727a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01441304-20", "ovs_interfaceid": "01441304-20f8-4d07-a1ed-05da5d9297d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.754 183079 DEBUG nova.network.os_vif_util [None req-2d53e45c-71fe-44c4-afbb-c37112e5af56 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converting VIF {"id": "01441304-20f8-4d07-a1ed-05da5d9297d6", "address": "fa:16:3e:db:72:7a", "network": {"id": "ee52ea03-3241-4391-9bfe-b2039dbf3bfe", "bridge": "br-int", "label": "tempest-test-network--458174995", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fedb:727a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01441304-20", "ovs_interfaceid": "01441304-20f8-4d07-a1ed-05da5d9297d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.755 183079 DEBUG nova.network.os_vif_util [None req-2d53e45c-71fe-44c4-afbb-c37112e5af56 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:72:7a,bridge_name='br-int',has_traffic_filtering=True,id=01441304-20f8-4d07-a1ed-05da5d9297d6,network=Network(ee52ea03-3241-4391-9bfe-b2039dbf3bfe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01441304-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.756 183079 DEBUG os_vif [None req-2d53e45c-71fe-44c4-afbb-c37112e5af56 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:72:7a,bridge_name='br-int',has_traffic_filtering=True,id=01441304-20f8-4d07-a1ed-05da5d9297d6,network=Network(ee52ea03-3241-4391-9bfe-b2039dbf3bfe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01441304-20') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.759 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.760 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap01441304-20, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.763 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.765 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.768 183079 INFO os_vif [None req-2d53e45c-71fe-44c4-afbb-c37112e5af56 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:72:7a,bridge_name='br-int',has_traffic_filtering=True,id=01441304-20f8-4d07-a1ed-05da5d9297d6,network=Network(ee52ea03-3241-4391-9bfe-b2039dbf3bfe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01441304-20')
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.769 183079 DEBUG nova.virt.libvirt.guest [None req-2d53e45c-71fe-44c4-afbb-c37112e5af56 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <nova:name>tempest-server-test-1223247828</nova:name>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <nova:creationTime>2026-01-22 17:16:25</nova:creationTime>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <nova:flavor name="m1.nano">
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <nova:memory>128</nova:memory>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <nova:disk>1</nova:disk>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <nova:swap>0</nova:swap>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <nova:vcpus>1</nova:vcpus>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   </nova:flavor>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <nova:owner>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <nova:user uuid="2fe02f7484a94091bab26aba1c370459">tempest-IPv6Test-1828780436-project-member</nova:user>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <nova:project uuid="4f95c62a5a194d2291b03187a9c85702">tempest-IPv6Test-1828780436</nova:project>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   </nova:owner>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   <nova:ports>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     <nova:port uuid="0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510">
Jan 22 17:16:25 compute-0 nova_compute[183075]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 22 17:16:25 compute-0 nova_compute[183075]:     </nova:port>
Jan 22 17:16:25 compute-0 nova_compute[183075]:   </nova:ports>
Jan 22 17:16:25 compute-0 nova_compute[183075]: </nova:instance>
Jan 22 17:16:25 compute-0 nova_compute[183075]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 22 17:16:25 compute-0 podman[224023]: 2026-01-22 17:16:25.808137793 +0000 UTC m=+0.077483715 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 17:16:25 compute-0 neutron-haproxy-ovnmeta-ee52ea03-3241-4391-9bfe-b2039dbf3bfe[223790]: [NOTICE]   (223794) : haproxy version is 2.8.14-c23fe91
Jan 22 17:16:25 compute-0 neutron-haproxy-ovnmeta-ee52ea03-3241-4391-9bfe-b2039dbf3bfe[223790]: [NOTICE]   (223794) : path to executable is /usr/sbin/haproxy
Jan 22 17:16:25 compute-0 neutron-haproxy-ovnmeta-ee52ea03-3241-4391-9bfe-b2039dbf3bfe[223790]: [WARNING]  (223794) : Exiting Master process...
Jan 22 17:16:25 compute-0 neutron-haproxy-ovnmeta-ee52ea03-3241-4391-9bfe-b2039dbf3bfe[223790]: [ALERT]    (223794) : Current worker (223796) exited with code 143 (Terminated)
Jan 22 17:16:25 compute-0 neutron-haproxy-ovnmeta-ee52ea03-3241-4391-9bfe-b2039dbf3bfe[223790]: [WARNING]  (223794) : All workers exited. Exiting... (0)
Jan 22 17:16:25 compute-0 systemd[1]: libpod-b366309911a1697880dcbfb275fb455beb55a61a66936a3f9db95ca020fbc0cf.scope: Deactivated successfully.
Jan 22 17:16:25 compute-0 conmon[223790]: conmon b366309911a1697880dc <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-b366309911a1697880dcbfb275fb455beb55a61a66936a3f9db95ca020fbc0cf.scope/container/memory.events
Jan 22 17:16:25 compute-0 podman[224070]: 2026-01-22 17:16:25.923135077 +0000 UTC m=+0.059553977 container died b366309911a1697880dcbfb275fb455beb55a61a66936a3f9db95ca020fbc0cf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee52ea03-3241-4391-9bfe-b2039dbf3bfe, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.940 183079 DEBUG nova.compute.manager [req-99561639-416f-46cc-8cdc-530b7b7b19c1 req-970be7f1-b05c-4a41-84dd-9d5e5eccefe2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Received event network-changed-0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.941 183079 DEBUG nova.compute.manager [req-99561639-416f-46cc-8cdc-530b7b7b19c1 req-970be7f1-b05c-4a41-84dd-9d5e5eccefe2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Refreshing instance network info cache due to event network-changed-0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.942 183079 DEBUG oslo_concurrency.lockutils [req-99561639-416f-46cc-8cdc-530b7b7b19c1 req-970be7f1-b05c-4a41-84dd-9d5e5eccefe2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-e02af423-c4c3-4fcd-be73-ebeba9fae411" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.942 183079 DEBUG oslo_concurrency.lockutils [req-99561639-416f-46cc-8cdc-530b7b7b19c1 req-970be7f1-b05c-4a41-84dd-9d5e5eccefe2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-e02af423-c4c3-4fcd-be73-ebeba9fae411" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:16:25 compute-0 nova_compute[183075]: 2026-01-22 17:16:25.943 183079 DEBUG nova.network.neutron [req-99561639-416f-46cc-8cdc-530b7b7b19c1 req-970be7f1-b05c-4a41-84dd-9d5e5eccefe2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Refreshing network info cache for port 0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:16:25 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b366309911a1697880dcbfb275fb455beb55a61a66936a3f9db95ca020fbc0cf-userdata-shm.mount: Deactivated successfully.
Jan 22 17:16:25 compute-0 systemd[1]: var-lib-containers-storage-overlay-f13a1bdb760faaf3db19fac77e363ec9fc505796619536523e29bcf6c3c8286c-merged.mount: Deactivated successfully.
Jan 22 17:16:25 compute-0 podman[224070]: 2026-01-22 17:16:25.968060151 +0000 UTC m=+0.104478991 container cleanup b366309911a1697880dcbfb275fb455beb55a61a66936a3f9db95ca020fbc0cf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee52ea03-3241-4391-9bfe-b2039dbf3bfe, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 17:16:25 compute-0 systemd[1]: libpod-conmon-b366309911a1697880dcbfb275fb455beb55a61a66936a3f9db95ca020fbc0cf.scope: Deactivated successfully.
Jan 22 17:16:26 compute-0 podman[224099]: 2026-01-22 17:16:26.042826014 +0000 UTC m=+0.046339742 container remove b366309911a1697880dcbfb275fb455beb55a61a66936a3f9db95ca020fbc0cf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee52ea03-3241-4391-9bfe-b2039dbf3bfe, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:16:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:26.047 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[dc2b5229-da36-4c41-a015-308256927167]: (4, ('Thu Jan 22 05:16:25 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ee52ea03-3241-4391-9bfe-b2039dbf3bfe (b366309911a1697880dcbfb275fb455beb55a61a66936a3f9db95ca020fbc0cf)\nb366309911a1697880dcbfb275fb455beb55a61a66936a3f9db95ca020fbc0cf\nThu Jan 22 05:16:25 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ee52ea03-3241-4391-9bfe-b2039dbf3bfe (b366309911a1697880dcbfb275fb455beb55a61a66936a3f9db95ca020fbc0cf)\nb366309911a1697880dcbfb275fb455beb55a61a66936a3f9db95ca020fbc0cf\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:26.049 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a2dc3ab5-8355-4481-bed8-1bafca69dfda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:26.050 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee52ea03-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:16:26 compute-0 nova_compute[183075]: 2026-01-22 17:16:26.051 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:26 compute-0 kernel: tapee52ea03-30: left promiscuous mode
Jan 22 17:16:26 compute-0 nova_compute[183075]: 2026-01-22 17:16:26.054 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:26.056 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0609921a-c757-43c5-bc7c-c580cefcdcc7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:26 compute-0 nova_compute[183075]: 2026-01-22 17:16:26.065 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:26 compute-0 nova_compute[183075]: 2026-01-22 17:16:26.078 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:26.077 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[5ba4c8c1-5b8d-430e-a3ad-42a0287406d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:26.080 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3a831e7e-25c0-4619-938d-909590d45c90]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:26.098 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[064e0683-d22f-40f3-922b-9ca9836a7d0b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 452080, 'reachable_time': 27364, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224115, 'error': None, 'target': 'ovnmeta-ee52ea03-3241-4391-9bfe-b2039dbf3bfe', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:26.100 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ee52ea03-3241-4391-9bfe-b2039dbf3bfe deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:16:26 compute-0 systemd[1]: run-netns-ovnmeta\x2dee52ea03\x2d3241\x2d4391\x2d9bfe\x2db2039dbf3bfe.mount: Deactivated successfully.
Jan 22 17:16:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:26.101 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[4ba46dcd-4e3e-411f-b8ec-b55de5e17f71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:27 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:27.783 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:16:28 compute-0 nova_compute[183075]: 2026-01-22 17:16:28.042 183079 DEBUG nova.compute.manager [req-b2702ee7-3243-4bba-b56c-f56f22a7b281 req-2bb9e177-45fd-41c7-a8e8-175e9b5ae921 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Received event network-vif-unplugged-01441304-20f8-4d07-a1ed-05da5d9297d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:16:28 compute-0 nova_compute[183075]: 2026-01-22 17:16:28.043 183079 DEBUG oslo_concurrency.lockutils [req-b2702ee7-3243-4bba-b56c-f56f22a7b281 req-2bb9e177-45fd-41c7-a8e8-175e9b5ae921 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:16:28 compute-0 nova_compute[183075]: 2026-01-22 17:16:28.043 183079 DEBUG oslo_concurrency.lockutils [req-b2702ee7-3243-4bba-b56c-f56f22a7b281 req-2bb9e177-45fd-41c7-a8e8-175e9b5ae921 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:16:28 compute-0 nova_compute[183075]: 2026-01-22 17:16:28.044 183079 DEBUG oslo_concurrency.lockutils [req-b2702ee7-3243-4bba-b56c-f56f22a7b281 req-2bb9e177-45fd-41c7-a8e8-175e9b5ae921 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:16:28 compute-0 nova_compute[183075]: 2026-01-22 17:16:28.044 183079 DEBUG nova.compute.manager [req-b2702ee7-3243-4bba-b56c-f56f22a7b281 req-2bb9e177-45fd-41c7-a8e8-175e9b5ae921 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] No waiting events found dispatching network-vif-unplugged-01441304-20f8-4d07-a1ed-05da5d9297d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:16:28 compute-0 nova_compute[183075]: 2026-01-22 17:16:28.044 183079 WARNING nova.compute.manager [req-b2702ee7-3243-4bba-b56c-f56f22a7b281 req-2bb9e177-45fd-41c7-a8e8-175e9b5ae921 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Received unexpected event network-vif-unplugged-01441304-20f8-4d07-a1ed-05da5d9297d6 for instance with vm_state active and task_state None.
Jan 22 17:16:28 compute-0 nova_compute[183075]: 2026-01-22 17:16:28.044 183079 DEBUG nova.compute.manager [req-b2702ee7-3243-4bba-b56c-f56f22a7b281 req-2bb9e177-45fd-41c7-a8e8-175e9b5ae921 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Received event network-vif-plugged-01441304-20f8-4d07-a1ed-05da5d9297d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:16:28 compute-0 nova_compute[183075]: 2026-01-22 17:16:28.045 183079 DEBUG oslo_concurrency.lockutils [req-b2702ee7-3243-4bba-b56c-f56f22a7b281 req-2bb9e177-45fd-41c7-a8e8-175e9b5ae921 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:16:28 compute-0 nova_compute[183075]: 2026-01-22 17:16:28.045 183079 DEBUG oslo_concurrency.lockutils [req-b2702ee7-3243-4bba-b56c-f56f22a7b281 req-2bb9e177-45fd-41c7-a8e8-175e9b5ae921 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:16:28 compute-0 nova_compute[183075]: 2026-01-22 17:16:28.045 183079 DEBUG oslo_concurrency.lockutils [req-b2702ee7-3243-4bba-b56c-f56f22a7b281 req-2bb9e177-45fd-41c7-a8e8-175e9b5ae921 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:16:28 compute-0 nova_compute[183075]: 2026-01-22 17:16:28.045 183079 DEBUG nova.compute.manager [req-b2702ee7-3243-4bba-b56c-f56f22a7b281 req-2bb9e177-45fd-41c7-a8e8-175e9b5ae921 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] No waiting events found dispatching network-vif-plugged-01441304-20f8-4d07-a1ed-05da5d9297d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:16:28 compute-0 nova_compute[183075]: 2026-01-22 17:16:28.046 183079 WARNING nova.compute.manager [req-b2702ee7-3243-4bba-b56c-f56f22a7b281 req-2bb9e177-45fd-41c7-a8e8-175e9b5ae921 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Received unexpected event network-vif-plugged-01441304-20f8-4d07-a1ed-05da5d9297d6 for instance with vm_state active and task_state None.
Jan 22 17:16:29 compute-0 nova_compute[183075]: 2026-01-22 17:16:29.132 183079 DEBUG nova.network.neutron [req-99561639-416f-46cc-8cdc-530b7b7b19c1 req-970be7f1-b05c-4a41-84dd-9d5e5eccefe2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Updated VIF entry in instance network info cache for port 0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:16:29 compute-0 nova_compute[183075]: 2026-01-22 17:16:29.133 183079 DEBUG nova.network.neutron [req-99561639-416f-46cc-8cdc-530b7b7b19c1 req-970be7f1-b05c-4a41-84dd-9d5e5eccefe2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Updating instance_info_cache with network_info: [{"id": "0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510", "address": "fa:16:3e:d9:53:87", "network": {"id": "359b74c5-cbeb-4440-a3e9-a16a51b1ab77", "bridge": "br-int", "label": "tempest-test-network--531378131", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b803bb5-5b", "ovs_interfaceid": "0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "01441304-20f8-4d07-a1ed-05da5d9297d6", "address": "fa:16:3e:db:72:7a", "network": {"id": "ee52ea03-3241-4391-9bfe-b2039dbf3bfe", "bridge": "br-int", "label": "tempest-test-network--458174995", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fedb:727a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01441304-20", "ovs_interfaceid": "01441304-20f8-4d07-a1ed-05da5d9297d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:16:29 compute-0 nova_compute[183075]: 2026-01-22 17:16:29.177 183079 DEBUG oslo_concurrency.lockutils [req-99561639-416f-46cc-8cdc-530b7b7b19c1 req-970be7f1-b05c-4a41-84dd-9d5e5eccefe2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-e02af423-c4c3-4fcd-be73-ebeba9fae411" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:16:29 compute-0 nova_compute[183075]: 2026-01-22 17:16:29.205 183079 DEBUG oslo_concurrency.lockutils [None req-2d53e45c-71fe-44c4-afbb-c37112e5af56 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquiring lock "refresh_cache-e02af423-c4c3-4fcd-be73-ebeba9fae411" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:16:29 compute-0 nova_compute[183075]: 2026-01-22 17:16:29.205 183079 DEBUG oslo_concurrency.lockutils [None req-2d53e45c-71fe-44c4-afbb-c37112e5af56 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquired lock "refresh_cache-e02af423-c4c3-4fcd-be73-ebeba9fae411" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:16:29 compute-0 nova_compute[183075]: 2026-01-22 17:16:29.205 183079 DEBUG nova.network.neutron [None req-2d53e45c-71fe-44c4-afbb-c37112e5af56 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:16:29 compute-0 nova_compute[183075]: 2026-01-22 17:16:29.233 183079 DEBUG oslo_concurrency.lockutils [None req-0f10e684-827f-40be-95b0-2344653d427e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquiring lock "interface-e02af423-c4c3-4fcd-be73-ebeba9fae411-c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:16:29 compute-0 nova_compute[183075]: 2026-01-22 17:16:29.233 183079 DEBUG oslo_concurrency.lockutils [None req-0f10e684-827f-40be-95b0-2344653d427e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "interface-e02af423-c4c3-4fcd-be73-ebeba9fae411-c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:16:29 compute-0 nova_compute[183075]: 2026-01-22 17:16:29.233 183079 DEBUG nova.objects.instance [None req-0f10e684-827f-40be-95b0-2344653d427e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lazy-loading 'flavor' on Instance uuid e02af423-c4c3-4fcd-be73-ebeba9fae411 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:16:30 compute-0 nova_compute[183075]: 2026-01-22 17:16:30.151 183079 DEBUG nova.compute.manager [req-d80e06f2-3745-40c9-a725-fb7062acf553 req-04013850-ddd3-47b1-a4c4-1fd828e99b10 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Received event network-vif-deleted-01441304-20f8-4d07-a1ed-05da5d9297d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:16:30 compute-0 nova_compute[183075]: 2026-01-22 17:16:30.152 183079 INFO nova.compute.manager [req-d80e06f2-3745-40c9-a725-fb7062acf553 req-04013850-ddd3-47b1-a4c4-1fd828e99b10 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Neutron deleted interface 01441304-20f8-4d07-a1ed-05da5d9297d6; detaching it from the instance and deleting it from the info cache
Jan 22 17:16:30 compute-0 nova_compute[183075]: 2026-01-22 17:16:30.152 183079 DEBUG nova.network.neutron [req-d80e06f2-3745-40c9-a725-fb7062acf553 req-04013850-ddd3-47b1-a4c4-1fd828e99b10 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Updating instance_info_cache with network_info: [{"id": "0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510", "address": "fa:16:3e:d9:53:87", "network": {"id": "359b74c5-cbeb-4440-a3e9-a16a51b1ab77", "bridge": "br-int", "label": "tempest-test-network--531378131", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b803bb5-5b", "ovs_interfaceid": "0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:16:30 compute-0 nova_compute[183075]: 2026-01-22 17:16:30.190 183079 DEBUG nova.objects.instance [req-d80e06f2-3745-40c9-a725-fb7062acf553 req-04013850-ddd3-47b1-a4c4-1fd828e99b10 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lazy-loading 'system_metadata' on Instance uuid e02af423-c4c3-4fcd-be73-ebeba9fae411 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:16:30 compute-0 nova_compute[183075]: 2026-01-22 17:16:30.257 183079 DEBUG nova.objects.instance [req-d80e06f2-3745-40c9-a725-fb7062acf553 req-04013850-ddd3-47b1-a4c4-1fd828e99b10 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lazy-loading 'flavor' on Instance uuid e02af423-c4c3-4fcd-be73-ebeba9fae411 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:16:30 compute-0 nova_compute[183075]: 2026-01-22 17:16:30.302 183079 DEBUG nova.virt.libvirt.vif [req-d80e06f2-3745-40c9-a725-fb7062acf553 req-04013850-ddd3-47b1-a4c4-1fd828e99b10 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:15:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1223247828',display_name='tempest-server-test-1223247828',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1223247828',id=30,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDmUdcgi0W3CCW7+0zVPZm4+SGvgMVYIF2EiVCZOZNnFUe7R59w7UIBntHS+djSABVTvpJeBQFo+aQIZ5cRYQ1WALN6CDQL2VSKr+LjaC8setfZMT5V7IalcT1iGutwkuw==',key_name='tempest-keypair-test-914827607',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:15:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4f95c62a5a194d2291b03187a9c85702',ramdisk_id='',reservation_id='r-kexgovqt',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-IPv6Test-1828780436',owner_user_name='tempest-IPv6Test-1828780436-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:15:56Z,user_data=None,user_id='2fe02f7484a94091bab26aba1c370459',uuid=e02af423-c4c3-4fcd-be73-ebeba9fae411,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "01441304-20f8-4d07-a1ed-05da5d9297d6", "address": "fa:16:3e:db:72:7a", "network": {"id": "ee52ea03-3241-4391-9bfe-b2039dbf3bfe", "bridge": "br-int", "label": "tempest-test-network--458174995", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fedb:727a", "type": "fixed", "version": 6, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01441304-20", "ovs_interfaceid": "01441304-20f8-4d07-a1ed-05da5d9297d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:16:30 compute-0 nova_compute[183075]: 2026-01-22 17:16:30.303 183079 DEBUG nova.network.os_vif_util [req-d80e06f2-3745-40c9-a725-fb7062acf553 req-04013850-ddd3-47b1-a4c4-1fd828e99b10 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Converting VIF {"id": "01441304-20f8-4d07-a1ed-05da5d9297d6", "address": "fa:16:3e:db:72:7a", "network": {"id": "ee52ea03-3241-4391-9bfe-b2039dbf3bfe", "bridge": "br-int", "label": "tempest-test-network--458174995", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fedb:727a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01441304-20", "ovs_interfaceid": "01441304-20f8-4d07-a1ed-05da5d9297d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:16:30 compute-0 nova_compute[183075]: 2026-01-22 17:16:30.304 183079 DEBUG nova.network.os_vif_util [req-d80e06f2-3745-40c9-a725-fb7062acf553 req-04013850-ddd3-47b1-a4c4-1fd828e99b10 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:72:7a,bridge_name='br-int',has_traffic_filtering=True,id=01441304-20f8-4d07-a1ed-05da5d9297d6,network=Network(ee52ea03-3241-4391-9bfe-b2039dbf3bfe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01441304-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:16:30 compute-0 nova_compute[183075]: 2026-01-22 17:16:30.310 183079 DEBUG nova.virt.libvirt.guest [req-d80e06f2-3745-40c9-a725-fb7062acf553 req-04013850-ddd3-47b1-a4c4-1fd828e99b10 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:db:72:7a"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap01441304-20"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 17:16:30 compute-0 nova_compute[183075]: 2026-01-22 17:16:30.316 183079 DEBUG nova.virt.libvirt.guest [req-d80e06f2-3745-40c9-a725-fb7062acf553 req-04013850-ddd3-47b1-a4c4-1fd828e99b10 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:db:72:7a"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap01441304-20"/></interface>not found in domain: <domain type='kvm' id='30'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <name>instance-0000001e</name>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <uuid>e02af423-c4c3-4fcd-be73-ebeba9fae411</uuid>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <nova:name>tempest-server-test-1223247828</nova:name>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <nova:creationTime>2026-01-22 17:16:25</nova:creationTime>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <nova:flavor name="m1.nano">
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <nova:memory>128</nova:memory>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <nova:disk>1</nova:disk>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <nova:swap>0</nova:swap>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <nova:vcpus>1</nova:vcpus>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   </nova:flavor>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <nova:owner>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <nova:user uuid="2fe02f7484a94091bab26aba1c370459">tempest-IPv6Test-1828780436-project-member</nova:user>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <nova:project uuid="4f95c62a5a194d2291b03187a9c85702">tempest-IPv6Test-1828780436</nova:project>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   </nova:owner>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <nova:ports>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <nova:port uuid="0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510">
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </nova:port>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   </nova:ports>
Jan 22 17:16:30 compute-0 nova_compute[183075]: </nova:instance>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <memory unit='KiB'>131072</memory>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <vcpu placement='static'>1</vcpu>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <resource>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <partition>/machine</partition>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   </resource>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <sysinfo type='smbios'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <system>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <entry name='manufacturer'>RDO</entry>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <entry name='product'>OpenStack Compute</entry>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <entry name='serial'>e02af423-c4c3-4fcd-be73-ebeba9fae411</entry>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <entry name='uuid'>e02af423-c4c3-4fcd-be73-ebeba9fae411</entry>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <entry name='family'>Virtual Machine</entry>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </system>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <os>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <boot dev='hd'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <smbios mode='sysinfo'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   </os>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <features>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <vmcoreinfo state='on'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   </features>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <cpu mode='custom' match='exact' check='full'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <model fallback='forbid'>EPYC-Rome</model>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <vendor>AMD</vendor>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='require' name='x2apic'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='require' name='tsc-deadline'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='require' name='hypervisor'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='require' name='tsc_adjust'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='require' name='spec-ctrl'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='require' name='stibp'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='require' name='ssbd'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='require' name='cmp_legacy'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='require' name='overflow-recov'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='require' name='succor'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='require' name='ibrs'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='require' name='amd-ssbd'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='require' name='virt-ssbd'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='disable' name='lbrv'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='disable' name='tsc-scale'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='disable' name='vmcb-clean'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='disable' name='flushbyasid'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='disable' name='pause-filter'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='disable' name='pfthreshold'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='disable' name='svme-addr-chk'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='require' name='lfence-always-serializing'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='disable' name='xsaves'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='disable' name='svm'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='require' name='topoext'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='disable' name='npt'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='disable' name='nrip-save'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <clock offset='utc'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <timer name='pit' tickpolicy='delay'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <timer name='hpet' present='no'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <on_poweroff>destroy</on_poweroff>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <on_reboot>restart</on_reboot>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <on_crash>destroy</on_crash>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <disk type='file' device='disk'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <source file='/var/lib/nova/instances/e02af423-c4c3-4fcd-be73-ebeba9fae411/disk' index='1'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <backingStore type='file' index='2'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:         <format type='raw'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:         <source file='/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:         <backingStore/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       </backingStore>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target dev='vda' bus='virtio'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='virtio-disk0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='0' model='pcie-root'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pcie.0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='1' port='0x10'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.1'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='2' port='0x11'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.2'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='3' port='0x12'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.3'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='4' port='0x13'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.4'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='5' port='0x14'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.5'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='6' port='0x15'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.6'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='7' port='0x16'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.7'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='8' port='0x17'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.8'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='9' port='0x18'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.9'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='10' port='0x19'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.10'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='11' port='0x1a'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.11'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='12' port='0x1b'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.12'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='13' port='0x1c'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.13'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='14' port='0x1d'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.14'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='15' port='0x1e'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.15'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='16' port='0x1f'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.16'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='17' port='0x20'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.17'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='18' port='0x21'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.18'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='19' port='0x22'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.19'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='20' port='0x23'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.20'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='21' port='0x24'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.21'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='22' port='0x25'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.22'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='23' port='0x26'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.23'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='24' port='0x27'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.24'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='25' port='0x28'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.25'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-pci-bridge'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.26'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='usb'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='sata' index='0'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='ide'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <interface type='ethernet'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <mac address='fa:16:3e:d9:53:87'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target dev='tap0b803bb5-5b'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model type='virtio'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <driver name='vhost' rx_queue_size='512'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <mtu size='1442'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='net0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <serial type='pty'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <source path='/dev/pts/0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <log file='/var/lib/nova/instances/e02af423-c4c3-4fcd-be73-ebeba9fae411/console.log' append='off'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target type='isa-serial' port='0'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:         <model name='isa-serial'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       </target>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='serial0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <console type='pty' tty='/dev/pts/0'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <source path='/dev/pts/0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <log file='/var/lib/nova/instances/e02af423-c4c3-4fcd-be73-ebeba9fae411/console.log' append='off'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target type='serial' port='0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='serial0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </console>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <input type='tablet' bus='usb'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='input0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='usb' bus='0' port='1'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </input>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <input type='mouse' bus='ps2'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='input1'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </input>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <input type='keyboard' bus='ps2'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='input2'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </input>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <listen type='address' address='::0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </graphics>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <audio id='1' type='none'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <video>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model type='virtio' heads='1' primary='yes'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='video0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </video>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <watchdog model='itco' action='reset'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='watchdog0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </watchdog>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <memballoon model='virtio'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <stats period='10'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='balloon0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <rng model='virtio'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <backend model='random'>/dev/urandom</backend>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='rng0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <label>system_u:system_r:svirt_t:s0:c203,c337</label>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c203,c337</imagelabel>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   </seclabel>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <label>+107:+107</label>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <imagelabel>+107:+107</imagelabel>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   </seclabel>
Jan 22 17:16:30 compute-0 nova_compute[183075]: </domain>
Jan 22 17:16:30 compute-0 nova_compute[183075]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 22 17:16:30 compute-0 nova_compute[183075]: 2026-01-22 17:16:30.317 183079 DEBUG nova.virt.libvirt.guest [req-d80e06f2-3745-40c9-a725-fb7062acf553 req-04013850-ddd3-47b1-a4c4-1fd828e99b10 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:db:72:7a"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap01441304-20"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 17:16:30 compute-0 nova_compute[183075]: 2026-01-22 17:16:30.325 183079 DEBUG nova.virt.libvirt.guest [req-d80e06f2-3745-40c9-a725-fb7062acf553 req-04013850-ddd3-47b1-a4c4-1fd828e99b10 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:db:72:7a"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap01441304-20"/></interface>not found in domain: <domain type='kvm' id='30'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <name>instance-0000001e</name>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <uuid>e02af423-c4c3-4fcd-be73-ebeba9fae411</uuid>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <nova:name>tempest-server-test-1223247828</nova:name>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <nova:creationTime>2026-01-22 17:16:25</nova:creationTime>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <nova:flavor name="m1.nano">
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <nova:memory>128</nova:memory>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <nova:disk>1</nova:disk>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <nova:swap>0</nova:swap>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <nova:vcpus>1</nova:vcpus>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   </nova:flavor>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <nova:owner>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <nova:user uuid="2fe02f7484a94091bab26aba1c370459">tempest-IPv6Test-1828780436-project-member</nova:user>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <nova:project uuid="4f95c62a5a194d2291b03187a9c85702">tempest-IPv6Test-1828780436</nova:project>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   </nova:owner>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <nova:ports>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <nova:port uuid="0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510">
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </nova:port>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   </nova:ports>
Jan 22 17:16:30 compute-0 nova_compute[183075]: </nova:instance>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <memory unit='KiB'>131072</memory>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <vcpu placement='static'>1</vcpu>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <resource>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <partition>/machine</partition>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   </resource>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <sysinfo type='smbios'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <system>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <entry name='manufacturer'>RDO</entry>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <entry name='product'>OpenStack Compute</entry>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <entry name='serial'>e02af423-c4c3-4fcd-be73-ebeba9fae411</entry>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <entry name='uuid'>e02af423-c4c3-4fcd-be73-ebeba9fae411</entry>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <entry name='family'>Virtual Machine</entry>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </system>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <os>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <boot dev='hd'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <smbios mode='sysinfo'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   </os>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <features>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <vmcoreinfo state='on'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   </features>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <cpu mode='custom' match='exact' check='full'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <model fallback='forbid'>EPYC-Rome</model>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <vendor>AMD</vendor>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='require' name='x2apic'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='require' name='tsc-deadline'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='require' name='hypervisor'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='require' name='tsc_adjust'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='require' name='spec-ctrl'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='require' name='stibp'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='require' name='ssbd'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='require' name='cmp_legacy'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='require' name='overflow-recov'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='require' name='succor'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='require' name='ibrs'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='require' name='amd-ssbd'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='require' name='virt-ssbd'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='disable' name='lbrv'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='disable' name='tsc-scale'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='disable' name='vmcb-clean'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='disable' name='flushbyasid'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='disable' name='pause-filter'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='disable' name='pfthreshold'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='disable' name='svme-addr-chk'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='require' name='lfence-always-serializing'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='disable' name='xsaves'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='disable' name='svm'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='require' name='topoext'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='disable' name='npt'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <feature policy='disable' name='nrip-save'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <clock offset='utc'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <timer name='pit' tickpolicy='delay'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <timer name='hpet' present='no'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <on_poweroff>destroy</on_poweroff>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <on_reboot>restart</on_reboot>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <on_crash>destroy</on_crash>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <disk type='file' device='disk'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <source file='/var/lib/nova/instances/e02af423-c4c3-4fcd-be73-ebeba9fae411/disk' index='1'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <backingStore type='file' index='2'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:         <format type='raw'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:         <source file='/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:         <backingStore/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       </backingStore>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target dev='vda' bus='virtio'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='virtio-disk0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='0' model='pcie-root'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pcie.0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='1' port='0x10'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.1'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='2' port='0x11'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.2'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='3' port='0x12'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.3'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='4' port='0x13'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.4'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='5' port='0x14'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.5'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='6' port='0x15'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.6'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='7' port='0x16'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.7'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='8' port='0x17'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.8'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='9' port='0x18'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.9'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='10' port='0x19'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.10'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='11' port='0x1a'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.11'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='12' port='0x1b'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.12'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='13' port='0x1c'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.13'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='14' port='0x1d'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.14'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='15' port='0x1e'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.15'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='16' port='0x1f'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.16'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='17' port='0x20'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.17'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='18' port='0x21'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.18'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='19' port='0x22'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.19'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='20' port='0x23'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.20'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='21' port='0x24'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.21'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='22' port='0x25'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.22'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='23' port='0x26'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.23'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='24' port='0x27'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.24'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target chassis='25' port='0x28'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.25'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model name='pcie-pci-bridge'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='pci.26'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='usb'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <controller type='sata' index='0'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='ide'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <interface type='ethernet'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <mac address='fa:16:3e:d9:53:87'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target dev='tap0b803bb5-5b'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model type='virtio'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <driver name='vhost' rx_queue_size='512'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <mtu size='1442'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='net0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <serial type='pty'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <source path='/dev/pts/0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <log file='/var/lib/nova/instances/e02af423-c4c3-4fcd-be73-ebeba9fae411/console.log' append='off'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target type='isa-serial' port='0'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:         <model name='isa-serial'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       </target>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='serial0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <console type='pty' tty='/dev/pts/0'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <source path='/dev/pts/0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <log file='/var/lib/nova/instances/e02af423-c4c3-4fcd-be73-ebeba9fae411/console.log' append='off'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <target type='serial' port='0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='serial0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </console>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <input type='tablet' bus='usb'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='input0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='usb' bus='0' port='1'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </input>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <input type='mouse' bus='ps2'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='input1'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </input>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <input type='keyboard' bus='ps2'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='input2'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </input>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <listen type='address' address='::0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </graphics>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <audio id='1' type='none'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <video>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <model type='virtio' heads='1' primary='yes'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='video0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </video>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <watchdog model='itco' action='reset'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='watchdog0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </watchdog>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <memballoon model='virtio'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <stats period='10'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='balloon0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <rng model='virtio'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <backend model='random'>/dev/urandom</backend>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <alias name='rng0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <label>system_u:system_r:svirt_t:s0:c203,c337</label>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c203,c337</imagelabel>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   </seclabel>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <label>+107:+107</label>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <imagelabel>+107:+107</imagelabel>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   </seclabel>
Jan 22 17:16:30 compute-0 nova_compute[183075]: </domain>
Jan 22 17:16:30 compute-0 nova_compute[183075]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 22 17:16:30 compute-0 nova_compute[183075]: 2026-01-22 17:16:30.327 183079 WARNING nova.virt.libvirt.driver [req-d80e06f2-3745-40c9-a725-fb7062acf553 req-04013850-ddd3-47b1-a4c4-1fd828e99b10 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Detaching interface fa:16:3e:db:72:7a failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap01441304-20' not found.
Jan 22 17:16:30 compute-0 nova_compute[183075]: 2026-01-22 17:16:30.328 183079 DEBUG nova.virt.libvirt.vif [req-d80e06f2-3745-40c9-a725-fb7062acf553 req-04013850-ddd3-47b1-a4c4-1fd828e99b10 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:15:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1223247828',display_name='tempest-server-test-1223247828',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1223247828',id=30,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDmUdcgi0W3CCW7+0zVPZm4+SGvgMVYIF2EiVCZOZNnFUe7R59w7UIBntHS+djSABVTvpJeBQFo+aQIZ5cRYQ1WALN6CDQL2VSKr+LjaC8setfZMT5V7IalcT1iGutwkuw==',key_name='tempest-keypair-test-914827607',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:15:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4f95c62a5a194d2291b03187a9c85702',ramdisk_id='',reservation_id='r-kexgovqt',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-IPv6Test-1828780436',owner_user_name='tempest-IPv6Test-1828780436-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:15:56Z,user_data=None,user_id='2fe02f7484a94091bab26aba1c370459',uuid=e02af423-c4c3-4fcd-be73-ebeba9fae411,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "01441304-20f8-4d07-a1ed-05da5d9297d6", "address": "fa:16:3e:db:72:7a", "network": {"id": "ee52ea03-3241-4391-9bfe-b2039dbf3bfe", "bridge": "br-int", "label": "tempest-test-network--458174995", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fedb:727a", "type": "fixed", "version": 6, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01441304-20", "ovs_interfaceid": "01441304-20f8-4d07-a1ed-05da5d9297d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:16:30 compute-0 nova_compute[183075]: 2026-01-22 17:16:30.329 183079 DEBUG nova.network.os_vif_util [req-d80e06f2-3745-40c9-a725-fb7062acf553 req-04013850-ddd3-47b1-a4c4-1fd828e99b10 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Converting VIF {"id": "01441304-20f8-4d07-a1ed-05da5d9297d6", "address": "fa:16:3e:db:72:7a", "network": {"id": "ee52ea03-3241-4391-9bfe-b2039dbf3bfe", "bridge": "br-int", "label": "tempest-test-network--458174995", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fedb:727a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01441304-20", "ovs_interfaceid": "01441304-20f8-4d07-a1ed-05da5d9297d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:16:30 compute-0 nova_compute[183075]: 2026-01-22 17:16:30.330 183079 DEBUG nova.network.os_vif_util [req-d80e06f2-3745-40c9-a725-fb7062acf553 req-04013850-ddd3-47b1-a4c4-1fd828e99b10 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:72:7a,bridge_name='br-int',has_traffic_filtering=True,id=01441304-20f8-4d07-a1ed-05da5d9297d6,network=Network(ee52ea03-3241-4391-9bfe-b2039dbf3bfe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01441304-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:16:30 compute-0 nova_compute[183075]: 2026-01-22 17:16:30.331 183079 DEBUG os_vif [req-d80e06f2-3745-40c9-a725-fb7062acf553 req-04013850-ddd3-47b1-a4c4-1fd828e99b10 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:72:7a,bridge_name='br-int',has_traffic_filtering=True,id=01441304-20f8-4d07-a1ed-05da5d9297d6,network=Network(ee52ea03-3241-4391-9bfe-b2039dbf3bfe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01441304-20') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:16:30 compute-0 nova_compute[183075]: 2026-01-22 17:16:30.333 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:30 compute-0 nova_compute[183075]: 2026-01-22 17:16:30.334 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap01441304-20, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:16:30 compute-0 nova_compute[183075]: 2026-01-22 17:16:30.334 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:16:30 compute-0 nova_compute[183075]: 2026-01-22 17:16:30.338 183079 INFO os_vif [req-d80e06f2-3745-40c9-a725-fb7062acf553 req-04013850-ddd3-47b1-a4c4-1fd828e99b10 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:72:7a,bridge_name='br-int',has_traffic_filtering=True,id=01441304-20f8-4d07-a1ed-05da5d9297d6,network=Network(ee52ea03-3241-4391-9bfe-b2039dbf3bfe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01441304-20')
Jan 22 17:16:30 compute-0 nova_compute[183075]: 2026-01-22 17:16:30.339 183079 DEBUG nova.virt.libvirt.guest [req-d80e06f2-3745-40c9-a725-fb7062acf553 req-04013850-ddd3-47b1-a4c4-1fd828e99b10 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <nova:name>tempest-server-test-1223247828</nova:name>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <nova:creationTime>2026-01-22 17:16:30</nova:creationTime>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <nova:flavor name="m1.nano">
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <nova:memory>128</nova:memory>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <nova:disk>1</nova:disk>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <nova:swap>0</nova:swap>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <nova:vcpus>1</nova:vcpus>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   </nova:flavor>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <nova:owner>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <nova:user uuid="2fe02f7484a94091bab26aba1c370459">tempest-IPv6Test-1828780436-project-member</nova:user>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <nova:project uuid="4f95c62a5a194d2291b03187a9c85702">tempest-IPv6Test-1828780436</nova:project>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   </nova:owner>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   <nova:ports>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     <nova:port uuid="0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510">
Jan 22 17:16:30 compute-0 nova_compute[183075]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 22 17:16:30 compute-0 nova_compute[183075]:     </nova:port>
Jan 22 17:16:30 compute-0 nova_compute[183075]:   </nova:ports>
Jan 22 17:16:30 compute-0 nova_compute[183075]: </nova:instance>
Jan 22 17:16:30 compute-0 nova_compute[183075]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 22 17:16:30 compute-0 nova_compute[183075]: 2026-01-22 17:16:30.767 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:31 compute-0 nova_compute[183075]: 2026-01-22 17:16:31.079 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:32 compute-0 nova_compute[183075]: 2026-01-22 17:16:32.170 183079 DEBUG nova.objects.instance [None req-0f10e684-827f-40be-95b0-2344653d427e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lazy-loading 'pci_requests' on Instance uuid e02af423-c4c3-4fcd-be73-ebeba9fae411 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:16:32 compute-0 nova_compute[183075]: 2026-01-22 17:16:32.226 183079 DEBUG nova.network.neutron [None req-0f10e684-827f-40be-95b0-2344653d427e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:16:32 compute-0 nova_compute[183075]: 2026-01-22 17:16:32.824 183079 DEBUG nova.policy [None req-0f10e684-827f-40be-95b0-2344653d427e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2fe02f7484a94091bab26aba1c370459', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4f95c62a5a194d2291b03187a9c85702', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:16:33 compute-0 nova_compute[183075]: 2026-01-22 17:16:33.026 183079 INFO nova.network.neutron [None req-2d53e45c-71fe-44c4-afbb-c37112e5af56 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Port 01441304-20f8-4d07-a1ed-05da5d9297d6 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Jan 22 17:16:33 compute-0 nova_compute[183075]: 2026-01-22 17:16:33.027 183079 DEBUG nova.network.neutron [None req-2d53e45c-71fe-44c4-afbb-c37112e5af56 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Updating instance_info_cache with network_info: [{"id": "0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510", "address": "fa:16:3e:d9:53:87", "network": {"id": "359b74c5-cbeb-4440-a3e9-a16a51b1ab77", "bridge": "br-int", "label": "tempest-test-network--531378131", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b803bb5-5b", "ovs_interfaceid": "0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:16:33 compute-0 nova_compute[183075]: 2026-01-22 17:16:33.045 183079 DEBUG oslo_concurrency.lockutils [None req-2d53e45c-71fe-44c4-afbb-c37112e5af56 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Releasing lock "refresh_cache-e02af423-c4c3-4fcd-be73-ebeba9fae411" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:16:33 compute-0 nova_compute[183075]: 2026-01-22 17:16:33.074 183079 DEBUG oslo_concurrency.lockutils [None req-2d53e45c-71fe-44c4-afbb-c37112e5af56 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "interface-e02af423-c4c3-4fcd-be73-ebeba9fae411-01441304-20f8-4d07-a1ed-05da5d9297d6" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 7.495s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:16:33 compute-0 ovn_controller[95372]: 2026-01-22T17:16:33Z|00361|binding|INFO|Releasing lport 705c199a-731e-4515-b4ee-a538f73a29f1 from this chassis (sb_readonly=0)
Jan 22 17:16:33 compute-0 nova_compute[183075]: 2026-01-22 17:16:33.634 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:33 compute-0 nova_compute[183075]: 2026-01-22 17:16:33.706 183079 DEBUG nova.network.neutron [None req-0f10e684-827f-40be-95b0-2344653d427e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Successfully updated port: c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:16:33 compute-0 nova_compute[183075]: 2026-01-22 17:16:33.731 183079 DEBUG oslo_concurrency.lockutils [None req-0f10e684-827f-40be-95b0-2344653d427e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquiring lock "refresh_cache-e02af423-c4c3-4fcd-be73-ebeba9fae411" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:16:33 compute-0 nova_compute[183075]: 2026-01-22 17:16:33.732 183079 DEBUG oslo_concurrency.lockutils [None req-0f10e684-827f-40be-95b0-2344653d427e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquired lock "refresh_cache-e02af423-c4c3-4fcd-be73-ebeba9fae411" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:16:33 compute-0 nova_compute[183075]: 2026-01-22 17:16:33.732 183079 DEBUG nova.network.neutron [None req-0f10e684-827f-40be-95b0-2344653d427e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:16:33 compute-0 nova_compute[183075]: 2026-01-22 17:16:33.771 183079 DEBUG nova.compute.manager [req-8da53f79-ff05-4ac3-ad1a-592fecd52484 req-72b97f7b-f94c-4e5d-8a3a-16e8c5723404 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Received event network-changed-c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:16:33 compute-0 nova_compute[183075]: 2026-01-22 17:16:33.772 183079 DEBUG nova.compute.manager [req-8da53f79-ff05-4ac3-ad1a-592fecd52484 req-72b97f7b-f94c-4e5d-8a3a-16e8c5723404 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Refreshing instance network info cache due to event network-changed-c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:16:33 compute-0 nova_compute[183075]: 2026-01-22 17:16:33.772 183079 DEBUG oslo_concurrency.lockutils [req-8da53f79-ff05-4ac3-ad1a-592fecd52484 req-72b97f7b-f94c-4e5d-8a3a-16e8c5723404 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-e02af423-c4c3-4fcd-be73-ebeba9fae411" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:16:35 compute-0 podman[224117]: 2026-01-22 17:16:35.33883182 +0000 UTC m=+0.048512188 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:16:35 compute-0 podman[224118]: 2026-01-22 17:16:35.356517802 +0000 UTC m=+0.057979955 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., 
container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7, version=9.6, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, release=1755695350)
Jan 22 17:16:35 compute-0 podman[224116]: 2026-01-22 17:16:35.375399186 +0000 UTC m=+0.086079470 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 
Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:16:35 compute-0 nova_compute[183075]: 2026-01-22 17:16:35.814 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:36 compute-0 nova_compute[183075]: 2026-01-22 17:16:36.053 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769102181.0510821, 1b9e0b4e-34d1-46d1-8f04-a07da354e704 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:16:36 compute-0 nova_compute[183075]: 2026-01-22 17:16:36.054 183079 INFO nova.compute.manager [-] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] VM Stopped (Lifecycle Event)
Jan 22 17:16:36 compute-0 nova_compute[183075]: 2026-01-22 17:16:36.084 183079 DEBUG nova.compute.manager [None req-f3913f24-c05d-4d16-92d1-a51daf3acd6a - - - - - -] [instance: 1b9e0b4e-34d1-46d1-8f04-a07da354e704] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:16:36 compute-0 nova_compute[183075]: 2026-01-22 17:16:36.085 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.398 183079 DEBUG nova.network.neutron [None req-0f10e684-827f-40be-95b0-2344653d427e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Updating instance_info_cache with network_info: [{"id": "0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510", "address": "fa:16:3e:d9:53:87", "network": {"id": "359b74c5-cbeb-4440-a3e9-a16a51b1ab77", "bridge": "br-int", "label": "tempest-test-network--531378131", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b803bb5-5b", "ovs_interfaceid": "0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d", "address": "fa:16:3e:74:7b:55", "network": {"id": "2eff683c-dd2b-4457-9348-c55cd5c2c95b", "bridge": "br-int", "label": "tempest-test-network--1024664376", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe74:7b55", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": 
"dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4d225ba-29", "ovs_interfaceid": "c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.420 183079 DEBUG oslo_concurrency.lockutils [None req-0f10e684-827f-40be-95b0-2344653d427e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Releasing lock "refresh_cache-e02af423-c4c3-4fcd-be73-ebeba9fae411" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.422 183079 DEBUG oslo_concurrency.lockutils [req-8da53f79-ff05-4ac3-ad1a-592fecd52484 req-72b97f7b-f94c-4e5d-8a3a-16e8c5723404 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-e02af423-c4c3-4fcd-be73-ebeba9fae411" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.422 183079 DEBUG nova.network.neutron [req-8da53f79-ff05-4ac3-ad1a-592fecd52484 req-72b97f7b-f94c-4e5d-8a3a-16e8c5723404 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Refreshing network info cache for port c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.427 183079 DEBUG nova.virt.libvirt.vif [None req-0f10e684-827f-40be-95b0-2344653d427e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:15:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1223247828',display_name='tempest-server-test-1223247828',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1223247828',id=30,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDmUdcgi0W3CCW7+0zVPZm4+SGvgMVYIF2EiVCZOZNnFUe7R59w7UIBntHS+djSABVTvpJeBQFo+aQIZ5cRYQ1WALN6CDQL2VSKr+LjaC8setfZMT5V7IalcT1iGutwkuw==',key_name='tempest-keypair-test-914827607',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:15:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4f95c62a5a194d2291b03187a9c85702',ramdisk_id='',reservation_id='r-kexgovqt',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='vi
rtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-IPv6Test-1828780436',owner_user_name='tempest-IPv6Test-1828780436-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:15:56Z,user_data=None,user_id='2fe02f7484a94091bab26aba1c370459',uuid=e02af423-c4c3-4fcd-be73-ebeba9fae411,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d", "address": "fa:16:3e:74:7b:55", "network": {"id": "2eff683c-dd2b-4457-9348-c55cd5c2c95b", "bridge": "br-int", "label": "tempest-test-network--1024664376", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe74:7b55", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4d225ba-29", "ovs_interfaceid": "c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.427 183079 DEBUG nova.network.os_vif_util [None req-0f10e684-827f-40be-95b0-2344653d427e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converting VIF {"id": "c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d", "address": "fa:16:3e:74:7b:55", "network": {"id": "2eff683c-dd2b-4457-9348-c55cd5c2c95b", "bridge": "br-int", "label": "tempest-test-network--1024664376", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe74:7b55", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4d225ba-29", "ovs_interfaceid": "c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.428 183079 DEBUG nova.network.os_vif_util [None req-0f10e684-827f-40be-95b0-2344653d427e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:7b:55,bridge_name='br-int',has_traffic_filtering=True,id=c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d,network=Network(2eff683c-dd2b-4457-9348-c55cd5c2c95b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc4d225ba-29') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.429 183079 DEBUG os_vif [None req-0f10e684-827f-40be-95b0-2344653d427e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:7b:55,bridge_name='br-int',has_traffic_filtering=True,id=c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d,network=Network(2eff683c-dd2b-4457-9348-c55cd5c2c95b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc4d225ba-29') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.430 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.431 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.431 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.434 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.435 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc4d225ba-29, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.435 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc4d225ba-29, col_values=(('external_ids', {'iface-id': 'c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:74:7b:55', 'vm-uuid': 'e02af423-c4c3-4fcd-be73-ebeba9fae411'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:16:39 compute-0 NetworkManager[55454]: <info>  [1769102199.4402] manager: (tapc4d225ba-29): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/151)
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.441 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.444 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.445 183079 INFO os_vif [None req-0f10e684-827f-40be-95b0-2344653d427e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:7b:55,bridge_name='br-int',has_traffic_filtering=True,id=c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d,network=Network(2eff683c-dd2b-4457-9348-c55cd5c2c95b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc4d225ba-29')
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.446 183079 DEBUG nova.virt.libvirt.vif [None req-0f10e684-827f-40be-95b0-2344653d427e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:15:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1223247828',display_name='tempest-server-test-1223247828',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1223247828',id=30,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDmUdcgi0W3CCW7+0zVPZm4+SGvgMVYIF2EiVCZOZNnFUe7R59w7UIBntHS+djSABVTvpJeBQFo+aQIZ5cRYQ1WALN6CDQL2VSKr+LjaC8setfZMT5V7IalcT1iGutwkuw==',key_name='tempest-keypair-test-914827607',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:15:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4f95c62a5a194d2291b03187a9c85702',ramdisk_id='',reservation_id='r-kexgovqt',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='vi
rtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-IPv6Test-1828780436',owner_user_name='tempest-IPv6Test-1828780436-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:15:56Z,user_data=None,user_id='2fe02f7484a94091bab26aba1c370459',uuid=e02af423-c4c3-4fcd-be73-ebeba9fae411,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d", "address": "fa:16:3e:74:7b:55", "network": {"id": "2eff683c-dd2b-4457-9348-c55cd5c2c95b", "bridge": "br-int", "label": "tempest-test-network--1024664376", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe74:7b55", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4d225ba-29", "ovs_interfaceid": "c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.447 183079 DEBUG nova.network.os_vif_util [None req-0f10e684-827f-40be-95b0-2344653d427e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converting VIF {"id": "c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d", "address": "fa:16:3e:74:7b:55", "network": {"id": "2eff683c-dd2b-4457-9348-c55cd5c2c95b", "bridge": "br-int", "label": "tempest-test-network--1024664376", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe74:7b55", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4d225ba-29", "ovs_interfaceid": "c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.448 183079 DEBUG nova.network.os_vif_util [None req-0f10e684-827f-40be-95b0-2344653d427e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:7b:55,bridge_name='br-int',has_traffic_filtering=True,id=c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d,network=Network(2eff683c-dd2b-4457-9348-c55cd5c2c95b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc4d225ba-29') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.450 183079 DEBUG nova.virt.libvirt.guest [None req-0f10e684-827f-40be-95b0-2344653d427e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] attach device xml: <interface type="ethernet">
Jan 22 17:16:39 compute-0 nova_compute[183075]:   <mac address="fa:16:3e:74:7b:55"/>
Jan 22 17:16:39 compute-0 nova_compute[183075]:   <model type="virtio"/>
Jan 22 17:16:39 compute-0 nova_compute[183075]:   <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:16:39 compute-0 nova_compute[183075]:   <mtu size="1442"/>
Jan 22 17:16:39 compute-0 nova_compute[183075]:   <target dev="tapc4d225ba-29"/>
Jan 22 17:16:39 compute-0 nova_compute[183075]: </interface>
Jan 22 17:16:39 compute-0 nova_compute[183075]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Jan 22 17:16:39 compute-0 kernel: tapc4d225ba-29: entered promiscuous mode
Jan 22 17:16:39 compute-0 NetworkManager[55454]: <info>  [1769102199.4720] manager: (tapc4d225ba-29): new Tun device (/org/freedesktop/NetworkManager/Devices/152)
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.473 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:39 compute-0 ovn_controller[95372]: 2026-01-22T17:16:39Z|00362|binding|INFO|Claiming lport c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d for this chassis.
Jan 22 17:16:39 compute-0 ovn_controller[95372]: 2026-01-22T17:16:39Z|00363|binding|INFO|c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d: Claiming fa:16:3e:74:7b:55 2001:db8:0:1:f816:3eff:fe74:7b55
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:39.483 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:7b:55 2001:db8:0:1:f816:3eff:fe74:7b55'], port_security=['fa:16:3e:74:7b:55 2001:db8:0:1:f816:3eff:fe74:7b55'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe74:7b55/64', 'neutron:device_id': 'e02af423-c4c3-4fcd-be73-ebeba9fae411', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2eff683c-dd2b-4457-9348-c55cd5c2c95b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4f95c62a5a194d2291b03187a9c85702', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0ac3e154-5d63-4269-957b-eeadd273d2d6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9c011efd-4fe8-4a3e-bea2-c6f67eca3806, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:39.489 104629 INFO neutron.agent.ovn.metadata.agent [-] Port c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d in datapath 2eff683c-dd2b-4457-9348-c55cd5c2c95b bound to our chassis
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:39.492 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2eff683c-dd2b-4457-9348-c55cd5c2c95b
Jan 22 17:16:39 compute-0 ovn_controller[95372]: 2026-01-22T17:16:39Z|00364|binding|INFO|Setting lport c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d ovn-installed in OVS
Jan 22 17:16:39 compute-0 ovn_controller[95372]: 2026-01-22T17:16:39Z|00365|binding|INFO|Setting lport c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d up in Southbound
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.508 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.512 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:39.514 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b450d66a-0944-4996-9e92-52b1e97e5369]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:39.515 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2eff683c-d1 in ovnmeta-2eff683c-dd2b-4457-9348-c55cd5c2c95b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:39.518 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2eff683c-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:39.518 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ebb71de3-6204-49e1-b01e-d2dc20206f42]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:39.519 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3ddd91f1-58fc-4f00-8ebe-7e54fa5cf91f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:39 compute-0 systemd-udevd[224181]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:39.539 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[ed6d9032-d708-4078-a365-7c2cddd88319]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:39 compute-0 NetworkManager[55454]: <info>  [1769102199.5486] device (tapc4d225ba-29): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:16:39 compute-0 NetworkManager[55454]: <info>  [1769102199.5511] device (tapc4d225ba-29): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:39.560 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[408259bb-88c5-4b5d-9afc-f0ff98ca1c43]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:39.605 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[ac9baa28-580e-4dca-8c01-cff863fda4cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:39 compute-0 systemd-udevd[224185]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:39.613 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c91a5a1f-4bc0-4c08-a6cb-817a6d9a77f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:39 compute-0 NetworkManager[55454]: <info>  [1769102199.6154] manager: (tap2eff683c-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/153)
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.621 183079 DEBUG nova.virt.libvirt.driver [None req-0f10e684-827f-40be-95b0-2344653d427e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.622 183079 DEBUG nova.virt.libvirt.driver [None req-0f10e684-827f-40be-95b0-2344653d427e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] No VIF found with MAC fa:16:3e:d9:53:87, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.622 183079 DEBUG nova.virt.libvirt.driver [None req-0f10e684-827f-40be-95b0-2344653d427e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] No VIF found with MAC fa:16:3e:74:7b:55, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.650 183079 DEBUG nova.virt.libvirt.guest [None req-0f10e684-827f-40be-95b0-2344653d427e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:16:39 compute-0 nova_compute[183075]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:16:39 compute-0 nova_compute[183075]:   <nova:name>tempest-server-test-1223247828</nova:name>
Jan 22 17:16:39 compute-0 nova_compute[183075]:   <nova:creationTime>2026-01-22 17:16:39</nova:creationTime>
Jan 22 17:16:39 compute-0 nova_compute[183075]:   <nova:flavor name="m1.nano">
Jan 22 17:16:39 compute-0 nova_compute[183075]:     <nova:memory>128</nova:memory>
Jan 22 17:16:39 compute-0 nova_compute[183075]:     <nova:disk>1</nova:disk>
Jan 22 17:16:39 compute-0 nova_compute[183075]:     <nova:swap>0</nova:swap>
Jan 22 17:16:39 compute-0 nova_compute[183075]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:16:39 compute-0 nova_compute[183075]:     <nova:vcpus>1</nova:vcpus>
Jan 22 17:16:39 compute-0 nova_compute[183075]:   </nova:flavor>
Jan 22 17:16:39 compute-0 nova_compute[183075]:   <nova:owner>
Jan 22 17:16:39 compute-0 nova_compute[183075]:     <nova:user uuid="2fe02f7484a94091bab26aba1c370459">tempest-IPv6Test-1828780436-project-member</nova:user>
Jan 22 17:16:39 compute-0 nova_compute[183075]:     <nova:project uuid="4f95c62a5a194d2291b03187a9c85702">tempest-IPv6Test-1828780436</nova:project>
Jan 22 17:16:39 compute-0 nova_compute[183075]:   </nova:owner>
Jan 22 17:16:39 compute-0 nova_compute[183075]:   <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:16:39 compute-0 nova_compute[183075]:   <nova:ports>
Jan 22 17:16:39 compute-0 nova_compute[183075]:     <nova:port uuid="0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510">
Jan 22 17:16:39 compute-0 nova_compute[183075]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 22 17:16:39 compute-0 nova_compute[183075]:     </nova:port>
Jan 22 17:16:39 compute-0 nova_compute[183075]:     <nova:port uuid="c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d">
Jan 22 17:16:39 compute-0 nova_compute[183075]:       <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe74:7b55" ipVersion="6"/>
Jan 22 17:16:39 compute-0 nova_compute[183075]:     </nova:port>
Jan 22 17:16:39 compute-0 nova_compute[183075]:   </nova:ports>
Jan 22 17:16:39 compute-0 nova_compute[183075]: </nova:instance>
Jan 22 17:16:39 compute-0 nova_compute[183075]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:39.652 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[6c82dc5c-51a7-4489-86d4-801b089ed598]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:39.656 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[cc13ad01-3dde-4eae-b0d7-fa2d8d97e25c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:39 compute-0 NetworkManager[55454]: <info>  [1769102199.6791] device (tap2eff683c-d0): carrier: link connected
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.679 183079 DEBUG oslo_concurrency.lockutils [None req-0f10e684-827f-40be-95b0-2344653d427e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "interface-e02af423-c4c3-4fcd-be73-ebeba9fae411-c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 10.446s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:39.684 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[5a6379f7-091f-4fa6-a28a-309af64d14ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:39.705 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8baee1d0-9a92-4487-a206-427b468cf1ef]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2eff683c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ed:b3:37'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 100], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 456338, 'reachable_time': 24668, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224207, 'error': None, 'target': 'ovnmeta-2eff683c-dd2b-4457-9348-c55cd5c2c95b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:39.730 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4305b126-8056-41e3-be1e-69952dd2f1e2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feed:b337'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 456338, 'tstamp': 456338}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224208, 'error': None, 'target': 'ovnmeta-2eff683c-dd2b-4457-9348-c55cd5c2c95b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:39.754 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e3702723-c5e2-475f-b84c-a58a18dcc323]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2eff683c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ed:b3:37'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 100], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 456338, 'reachable_time': 24668, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224209, 'error': None, 'target': 'ovnmeta-2eff683c-dd2b-4457-9348-c55cd5c2c95b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:39.798 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d9ee7d38-2c84-40bc-ba6c-ce3faf2edc6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:39.837 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[17168e00-1fcf-4fc8-8392-fe472b3f0493]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:39.838 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2eff683c-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:39.839 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:39.839 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2eff683c-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.845 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:39 compute-0 NetworkManager[55454]: <info>  [1769102199.8467] manager: (tap2eff683c-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/154)
Jan 22 17:16:39 compute-0 kernel: tap2eff683c-d0: entered promiscuous mode
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.848 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:39.851 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2eff683c-d0, col_values=(('external_ids', {'iface-id': '73e80cb6-5be2-4391-bd01-25f96ab9d9a0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.853 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:39 compute-0 ovn_controller[95372]: 2026-01-22T17:16:39Z|00366|binding|INFO|Releasing lport 73e80cb6-5be2-4391-bd01-25f96ab9d9a0 from this chassis (sb_readonly=0)
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.878 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:39.879 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2eff683c-dd2b-4457-9348-c55cd5c2c95b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2eff683c-dd2b-4457-9348-c55cd5c2c95b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:39.881 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6705da71-bfb0-4b50-8813-740a3bfed77c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:39.882 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-2eff683c-dd2b-4457-9348-c55cd5c2c95b
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/2eff683c-dd2b-4457-9348-c55cd5c2c95b.pid.haproxy
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 2eff683c-dd2b-4457-9348-c55cd5c2c95b
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:16:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:39.883 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2eff683c-dd2b-4457-9348-c55cd5c2c95b', 'env', 'PROCESS_TAG=haproxy-2eff683c-dd2b-4457-9348-c55cd5c2c95b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2eff683c-dd2b-4457-9348-c55cd5c2c95b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.951 183079 DEBUG nova.compute.manager [req-c9965870-1b2a-4246-8fe6-03e7e2df391d req-64a13bbb-e3b7-4b91-b9b3-56661a2edbb5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Received event network-vif-plugged-c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.952 183079 DEBUG oslo_concurrency.lockutils [req-c9965870-1b2a-4246-8fe6-03e7e2df391d req-64a13bbb-e3b7-4b91-b9b3-56661a2edbb5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.952 183079 DEBUG oslo_concurrency.lockutils [req-c9965870-1b2a-4246-8fe6-03e7e2df391d req-64a13bbb-e3b7-4b91-b9b3-56661a2edbb5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.953 183079 DEBUG oslo_concurrency.lockutils [req-c9965870-1b2a-4246-8fe6-03e7e2df391d req-64a13bbb-e3b7-4b91-b9b3-56661a2edbb5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.953 183079 DEBUG nova.compute.manager [req-c9965870-1b2a-4246-8fe6-03e7e2df391d req-64a13bbb-e3b7-4b91-b9b3-56661a2edbb5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] No waiting events found dispatching network-vif-plugged-c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:16:39 compute-0 nova_compute[183075]: 2026-01-22 17:16:39.953 183079 WARNING nova.compute.manager [req-c9965870-1b2a-4246-8fe6-03e7e2df391d req-64a13bbb-e3b7-4b91-b9b3-56661a2edbb5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Received unexpected event network-vif-plugged-c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d for instance with vm_state active and task_state None.
Jan 22 17:16:40 compute-0 podman[224239]: 2026-01-22 17:16:40.333500006 +0000 UTC m=+0.084212611 container create b367b69b7ee7ac94d649913339cdf7d37e4f4ba500235c0de673869925ff3135 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2eff683c-dd2b-4457-9348-c55cd5c2c95b, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:16:40 compute-0 systemd[1]: Started libpod-conmon-b367b69b7ee7ac94d649913339cdf7d37e4f4ba500235c0de673869925ff3135.scope.
Jan 22 17:16:40 compute-0 podman[224239]: 2026-01-22 17:16:40.290115253 +0000 UTC m=+0.040827928 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:16:40 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:16:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b30d804e73f85b9350f23b1bf0345860c3dd1ab1634fe62afe17fabfa5863666/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:16:40 compute-0 podman[224239]: 2026-01-22 17:16:40.44578016 +0000 UTC m=+0.196492795 container init b367b69b7ee7ac94d649913339cdf7d37e4f4ba500235c0de673869925ff3135 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2eff683c-dd2b-4457-9348-c55cd5c2c95b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:16:40 compute-0 podman[224239]: 2026-01-22 17:16:40.457756333 +0000 UTC m=+0.208468938 container start b367b69b7ee7ac94d649913339cdf7d37e4f4ba500235c0de673869925ff3135 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2eff683c-dd2b-4457-9348-c55cd5c2c95b, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 17:16:40 compute-0 neutron-haproxy-ovnmeta-2eff683c-dd2b-4457-9348-c55cd5c2c95b[224254]: [NOTICE]   (224269) : New worker (224275) forked
Jan 22 17:16:40 compute-0 neutron-haproxy-ovnmeta-2eff683c-dd2b-4457-9348-c55cd5c2c95b[224254]: [NOTICE]   (224269) : Loading success.
Jan 22 17:16:40 compute-0 podman[224257]: 2026-01-22 17:16:40.534104276 +0000 UTC m=+0.095022302 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, 
maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 17:16:41 compute-0 nova_compute[183075]: 2026-01-22 17:16:41.118 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:41 compute-0 nova_compute[183075]: 2026-01-22 17:16:41.540 183079 DEBUG nova.network.neutron [req-8da53f79-ff05-4ac3-ad1a-592fecd52484 req-72b97f7b-f94c-4e5d-8a3a-16e8c5723404 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Updated VIF entry in instance network info cache for port c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:16:41 compute-0 nova_compute[183075]: 2026-01-22 17:16:41.541 183079 DEBUG nova.network.neutron [req-8da53f79-ff05-4ac3-ad1a-592fecd52484 req-72b97f7b-f94c-4e5d-8a3a-16e8c5723404 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Updating instance_info_cache with network_info: [{"id": "0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510", "address": "fa:16:3e:d9:53:87", "network": {"id": "359b74c5-cbeb-4440-a3e9-a16a51b1ab77", "bridge": "br-int", "label": "tempest-test-network--531378131", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b803bb5-5b", "ovs_interfaceid": "0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d", "address": "fa:16:3e:74:7b:55", "network": {"id": "2eff683c-dd2b-4457-9348-c55cd5c2c95b", "bridge": "br-int", "label": "tempest-test-network--1024664376", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe74:7b55", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4d225ba-29", "ovs_interfaceid": "c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:16:41 compute-0 nova_compute[183075]: 2026-01-22 17:16:41.581 183079 DEBUG oslo_concurrency.lockutils [req-8da53f79-ff05-4ac3-ad1a-592fecd52484 req-72b97f7b-f94c-4e5d-8a3a-16e8c5723404 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-e02af423-c4c3-4fcd-be73-ebeba9fae411" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:16:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:41.932 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:16:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:41.934 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:16:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:41.935 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:16:42 compute-0 nova_compute[183075]: 2026-01-22 17:16:42.084 183079 DEBUG nova.compute.manager [req-bbf7c026-2b00-48f7-9571-0d8616a2d3aa req-365331af-8bd6-4e68-bf9a-52c920eb09eb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Received event network-vif-plugged-c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:16:42 compute-0 nova_compute[183075]: 2026-01-22 17:16:42.084 183079 DEBUG oslo_concurrency.lockutils [req-bbf7c026-2b00-48f7-9571-0d8616a2d3aa req-365331af-8bd6-4e68-bf9a-52c920eb09eb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:16:42 compute-0 nova_compute[183075]: 2026-01-22 17:16:42.084 183079 DEBUG oslo_concurrency.lockutils [req-bbf7c026-2b00-48f7-9571-0d8616a2d3aa req-365331af-8bd6-4e68-bf9a-52c920eb09eb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:16:42 compute-0 nova_compute[183075]: 2026-01-22 17:16:42.085 183079 DEBUG oslo_concurrency.lockutils [req-bbf7c026-2b00-48f7-9571-0d8616a2d3aa req-365331af-8bd6-4e68-bf9a-52c920eb09eb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:16:42 compute-0 nova_compute[183075]: 2026-01-22 17:16:42.085 183079 DEBUG nova.compute.manager [req-bbf7c026-2b00-48f7-9571-0d8616a2d3aa req-365331af-8bd6-4e68-bf9a-52c920eb09eb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] No waiting events found dispatching network-vif-plugged-c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:16:42 compute-0 nova_compute[183075]: 2026-01-22 17:16:42.085 183079 WARNING nova.compute.manager [req-bbf7c026-2b00-48f7-9571-0d8616a2d3aa req-365331af-8bd6-4e68-bf9a-52c920eb09eb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Received unexpected event network-vif-plugged-c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d for instance with vm_state active and task_state None.
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.575 183079 DEBUG oslo_concurrency.lockutils [None req-4d80a31b-f0cc-4ea6-9b36-2b98a61b598a 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquiring lock "e02af423-c4c3-4fcd-be73-ebeba9fae411" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.576 183079 DEBUG oslo_concurrency.lockutils [None req-4d80a31b-f0cc-4ea6-9b36-2b98a61b598a 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "e02af423-c4c3-4fcd-be73-ebeba9fae411" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.576 183079 DEBUG oslo_concurrency.lockutils [None req-4d80a31b-f0cc-4ea6-9b36-2b98a61b598a 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquiring lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.577 183079 DEBUG oslo_concurrency.lockutils [None req-4d80a31b-f0cc-4ea6-9b36-2b98a61b598a 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.577 183079 DEBUG oslo_concurrency.lockutils [None req-4d80a31b-f0cc-4ea6-9b36-2b98a61b598a 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.578 183079 INFO nova.compute.manager [None req-4d80a31b-f0cc-4ea6-9b36-2b98a61b598a 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Terminating instance
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.579 183079 DEBUG nova.compute.manager [None req-4d80a31b-f0cc-4ea6-9b36-2b98a61b598a 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:16:43 compute-0 kernel: tap0b803bb5-5b (unregistering): left promiscuous mode
Jan 22 17:16:43 compute-0 NetworkManager[55454]: <info>  [1769102203.6031] device (tap0b803bb5-5b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.616 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:43 compute-0 ovn_controller[95372]: 2026-01-22T17:16:43Z|00367|binding|INFO|Releasing lport 0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510 from this chassis (sb_readonly=0)
Jan 22 17:16:43 compute-0 ovn_controller[95372]: 2026-01-22T17:16:43Z|00368|binding|INFO|Setting lport 0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510 down in Southbound
Jan 22 17:16:43 compute-0 ovn_controller[95372]: 2026-01-22T17:16:43Z|00369|binding|INFO|Removing iface tap0b803bb5-5b ovn-installed in OVS
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.619 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:43.625 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:53:87 10.100.0.7'], port_security=['fa:16:3e:d9:53:87 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'e02af423-c4c3-4fcd-be73-ebeba9fae411', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-359b74c5-cbeb-4440-a3e9-a16a51b1ab77', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4f95c62a5a194d2291b03187a9c85702', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1b5e2b25-1ae0-464c-ac9a-7fc65ac893a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.221'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee297cf4-fb08-4758-bba6-b8b00aaf6678, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:16:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:43.627 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510 in datapath 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 unbound from our chassis
Jan 22 17:16:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:43.630 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 359b74c5-cbeb-4440-a3e9-a16a51b1ab77, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:16:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:43.631 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[fbca3511-3fff-4acc-9f25-c30de52018fc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:43.632 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77 namespace which is not needed anymore
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.643 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:43 compute-0 kernel: tapc4d225ba-29 (unregistering): left promiscuous mode
Jan 22 17:16:43 compute-0 NetworkManager[55454]: <info>  [1769102203.6480] device (tapc4d225ba-29): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.651 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:43 compute-0 ovn_controller[95372]: 2026-01-22T17:16:43Z|00370|binding|INFO|Releasing lport c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d from this chassis (sb_readonly=0)
Jan 22 17:16:43 compute-0 ovn_controller[95372]: 2026-01-22T17:16:43Z|00371|binding|INFO|Setting lport c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d down in Southbound
Jan 22 17:16:43 compute-0 ovn_controller[95372]: 2026-01-22T17:16:43Z|00372|binding|INFO|Removing iface tapc4d225ba-29 ovn-installed in OVS
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.671 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.673 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:43.680 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:7b:55 2001:db8:0:1:f816:3eff:fe74:7b55'], port_security=['fa:16:3e:74:7b:55 2001:db8:0:1:f816:3eff:fe74:7b55'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe74:7b55/64', 'neutron:device_id': 'e02af423-c4c3-4fcd-be73-ebeba9fae411', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2eff683c-dd2b-4457-9348-c55cd5c2c95b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4f95c62a5a194d2291b03187a9c85702', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0ac3e154-5d63-4269-957b-eeadd273d2d6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9c011efd-4fe8-4a3e-bea2-c6f67eca3806, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.682 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:43 compute-0 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Jan 22 17:16:43 compute-0 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d0000001e.scope: Consumed 15.027s CPU time.
Jan 22 17:16:43 compute-0 systemd-machined[154382]: Machine qemu-30-instance-0000001e terminated.
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.812 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:43 compute-0 NetworkManager[55454]: <info>  [1769102203.8213] manager: (tapc4d225ba-29): new Tun device (/org/freedesktop/NetworkManager/Devices/155)
Jan 22 17:16:43 compute-0 neutron-haproxy-ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[223716]: [NOTICE]   (223720) : haproxy version is 2.8.14-c23fe91
Jan 22 17:16:43 compute-0 neutron-haproxy-ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[223716]: [NOTICE]   (223720) : path to executable is /usr/sbin/haproxy
Jan 22 17:16:43 compute-0 neutron-haproxy-ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[223716]: [WARNING]  (223720) : Exiting Master process...
Jan 22 17:16:43 compute-0 neutron-haproxy-ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[223716]: [ALERT]    (223720) : Current worker (223722) exited with code 143 (Terminated)
Jan 22 17:16:43 compute-0 neutron-haproxy-ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[223716]: [WARNING]  (223720) : All workers exited. Exiting... (0)
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.826 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:43 compute-0 systemd[1]: libpod-23db2a4d832717b097d7becbb77c12bccadb1403cb11f17d49410f55d0c9782c.scope: Deactivated successfully.
Jan 22 17:16:43 compute-0 podman[224316]: 2026-01-22 17:16:43.834148291 +0000 UTC m=+0.072335081 container died 23db2a4d832717b097d7becbb77c12bccadb1403cb11f17d49410f55d0c9782c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.870 183079 INFO nova.virt.libvirt.driver [-] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Instance destroyed successfully.
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.873 183079 DEBUG nova.objects.instance [None req-4d80a31b-f0cc-4ea6-9b36-2b98a61b598a 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lazy-loading 'resources' on Instance uuid e02af423-c4c3-4fcd-be73-ebeba9fae411 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:16:43 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-23db2a4d832717b097d7becbb77c12bccadb1403cb11f17d49410f55d0c9782c-userdata-shm.mount: Deactivated successfully.
Jan 22 17:16:43 compute-0 systemd[1]: var-lib-containers-storage-overlay-e2261cc405fb906726a062834e89ea717b4ae935ef0400bca60424988b6bf124-merged.mount: Deactivated successfully.
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.892 183079 DEBUG nova.virt.libvirt.vif [None req-4d80a31b-f0cc-4ea6-9b36-2b98a61b598a 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:15:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1223247828',display_name='tempest-server-test-1223247828',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1223247828',id=30,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDmUdcgi0W3CCW7+0zVPZm4+SGvgMVYIF2EiVCZOZNnFUe7R59w7UIBntHS+djSABVTvpJeBQFo+aQIZ5cRYQ1WALN6CDQL2VSKr+LjaC8setfZMT5V7IalcT1iGutwkuw==',key_name='tempest-keypair-test-914827607',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:15:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4f95c62a5a194d2291b03187a9c85702',ramdisk_id='',reservation_id='r-kexgovqt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw
_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-IPv6Test-1828780436',owner_user_name='tempest-IPv6Test-1828780436-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:15:56Z,user_data=None,user_id='2fe02f7484a94091bab26aba1c370459',uuid=e02af423-c4c3-4fcd-be73-ebeba9fae411,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510", "address": "fa:16:3e:d9:53:87", "network": {"id": "359b74c5-cbeb-4440-a3e9-a16a51b1ab77", "bridge": "br-int", "label": "tempest-test-network--531378131", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b803bb5-5b", "ovs_interfaceid": "0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.894 183079 DEBUG nova.network.os_vif_util [None req-4d80a31b-f0cc-4ea6-9b36-2b98a61b598a 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converting VIF {"id": "0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510", "address": "fa:16:3e:d9:53:87", "network": {"id": "359b74c5-cbeb-4440-a3e9-a16a51b1ab77", "bridge": "br-int", "label": "tempest-test-network--531378131", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b803bb5-5b", "ovs_interfaceid": "0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.895 183079 DEBUG nova.network.os_vif_util [None req-4d80a31b-f0cc-4ea6-9b36-2b98a61b598a 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d9:53:87,bridge_name='br-int',has_traffic_filtering=True,id=0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510,network=Network(359b74c5-cbeb-4440-a3e9-a16a51b1ab77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b803bb5-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.895 183079 DEBUG os_vif [None req-4d80a31b-f0cc-4ea6-9b36-2b98a61b598a 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d9:53:87,bridge_name='br-int',has_traffic_filtering=True,id=0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510,network=Network(359b74c5-cbeb-4440-a3e9-a16a51b1ab77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b803bb5-5b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.897 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.897 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0b803bb5-5b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.898 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:43 compute-0 podman[224316]: 2026-01-22 17:16:43.900987127 +0000 UTC m=+0.139173907 container cleanup 23db2a4d832717b097d7becbb77c12bccadb1403cb11f17d49410f55d0c9782c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.901 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.906 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:43 compute-0 systemd[1]: libpod-conmon-23db2a4d832717b097d7becbb77c12bccadb1403cb11f17d49410f55d0c9782c.scope: Deactivated successfully.
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.908 183079 INFO os_vif [None req-4d80a31b-f0cc-4ea6-9b36-2b98a61b598a 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d9:53:87,bridge_name='br-int',has_traffic_filtering=True,id=0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510,network=Network(359b74c5-cbeb-4440-a3e9-a16a51b1ab77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b803bb5-5b')
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.909 183079 DEBUG nova.virt.libvirt.vif [None req-4d80a31b-f0cc-4ea6-9b36-2b98a61b598a 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:15:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1223247828',display_name='tempest-server-test-1223247828',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1223247828',id=30,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDmUdcgi0W3CCW7+0zVPZm4+SGvgMVYIF2EiVCZOZNnFUe7R59w7UIBntHS+djSABVTvpJeBQFo+aQIZ5cRYQ1WALN6CDQL2VSKr+LjaC8setfZMT5V7IalcT1iGutwkuw==',key_name='tempest-keypair-test-914827607',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:15:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4f95c62a5a194d2291b03187a9c85702',ramdisk_id='',reservation_id='r-kexgovqt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-IPv6Test-1828780436',owner_user_name='tempest-IPv6Test-1828780436-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:15:56Z,user_data=None,user_id='2fe02f7484a94091bab26aba1c370459',uuid=e02af423-c4c3-4fcd-be73-ebeba9fae411,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d", "address": "fa:16:3e:74:7b:55", "network": {"id": "2eff683c-dd2b-4457-9348-c55cd5c2c95b", "bridge": "br-int", "label": "tempest-test-network--1024664376", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe74:7b55", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4d225ba-29", "ovs_interfaceid": "c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.910 183079 DEBUG nova.network.os_vif_util [None req-4d80a31b-f0cc-4ea6-9b36-2b98a61b598a 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converting VIF {"id": "c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d", "address": "fa:16:3e:74:7b:55", "network": {"id": "2eff683c-dd2b-4457-9348-c55cd5c2c95b", "bridge": "br-int", "label": "tempest-test-network--1024664376", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe74:7b55", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4d225ba-29", "ovs_interfaceid": "c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.910 183079 DEBUG nova.network.os_vif_util [None req-4d80a31b-f0cc-4ea6-9b36-2b98a61b598a 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:74:7b:55,bridge_name='br-int',has_traffic_filtering=True,id=c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d,network=Network(2eff683c-dd2b-4457-9348-c55cd5c2c95b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc4d225ba-29') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.911 183079 DEBUG os_vif [None req-4d80a31b-f0cc-4ea6-9b36-2b98a61b598a 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:74:7b:55,bridge_name='br-int',has_traffic_filtering=True,id=c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d,network=Network(2eff683c-dd2b-4457-9348-c55cd5c2c95b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc4d225ba-29') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.913 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.913 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc4d225ba-29, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.914 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.916 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.918 183079 INFO os_vif [None req-4d80a31b-f0cc-4ea6-9b36-2b98a61b598a 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:74:7b:55,bridge_name='br-int',has_traffic_filtering=True,id=c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d,network=Network(2eff683c-dd2b-4457-9348-c55cd5c2c95b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc4d225ba-29')
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.918 183079 INFO nova.virt.libvirt.driver [None req-4d80a31b-f0cc-4ea6-9b36-2b98a61b598a 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Deleting instance files /var/lib/nova/instances/e02af423-c4c3-4fcd-be73-ebeba9fae411_del
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.919 183079 INFO nova.virt.libvirt.driver [None req-4d80a31b-f0cc-4ea6-9b36-2b98a61b598a 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Deletion of /var/lib/nova/instances/e02af423-c4c3-4fcd-be73-ebeba9fae411_del complete
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.960 183079 INFO nova.compute.manager [None req-4d80a31b-f0cc-4ea6-9b36-2b98a61b598a 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Took 0.38 seconds to destroy the instance on the hypervisor.
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.961 183079 DEBUG oslo.service.loopingcall [None req-4d80a31b-f0cc-4ea6-9b36-2b98a61b598a 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.961 183079 DEBUG nova.compute.manager [-] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.961 183079 DEBUG nova.network.neutron [-] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:16:43 compute-0 podman[224371]: 2026-01-22 17:16:43.981911211 +0000 UTC m=+0.050620113 container remove 23db2a4d832717b097d7becbb77c12bccadb1403cb11f17d49410f55d0c9782c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:16:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:43.988 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b329bf2e-5d08-447e-9ff3-7e497f4c819b]: (4, ('Thu Jan 22 05:16:43 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77 (23db2a4d832717b097d7becbb77c12bccadb1403cb11f17d49410f55d0c9782c)\n23db2a4d832717b097d7becbb77c12bccadb1403cb11f17d49410f55d0c9782c\nThu Jan 22 05:16:43 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77 (23db2a4d832717b097d7becbb77c12bccadb1403cb11f17d49410f55d0c9782c)\n23db2a4d832717b097d7becbb77c12bccadb1403cb11f17d49410f55d0c9782c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:43.990 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3082330e-11b6-4b04-b45e-ca633e1bb90a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:43.992 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap359b74c5-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:16:43 compute-0 nova_compute[183075]: 2026-01-22 17:16:43.994 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:43 compute-0 kernel: tap359b74c5-c0: left promiscuous mode
Jan 22 17:16:44 compute-0 nova_compute[183075]: 2026-01-22 17:16:44.018 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:44.022 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1279934e-ea3a-4bdb-9abc-6e19f2098cc0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:44.038 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e8c8b443-683c-4de3-bae7-4b1e5c20985e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:44.041 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[85ef9bfc-073e-4aeb-95c7-9956972a59d8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:44.058 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3cb7ad46-63a8-432b-a6f8-de2bf6d3dc37]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 451978, 'reachable_time': 39217, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224386, 'error': None, 'target': 'ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:44.061 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:16:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:44.061 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[660746b0-6f68-4b95-a1d3-180624e80e9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:44.061 104629 INFO neutron.agent.ovn.metadata.agent [-] Port c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d in datapath 2eff683c-dd2b-4457-9348-c55cd5c2c95b unbound from our chassis
Jan 22 17:16:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:44.063 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2eff683c-dd2b-4457-9348-c55cd5c2c95b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:16:44 compute-0 systemd[1]: run-netns-ovnmeta\x2d359b74c5\x2dcbeb\x2d4440\x2da3e9\x2da16a51b1ab77.mount: Deactivated successfully.
Jan 22 17:16:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:44.063 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1802a223-7d2c-4045-a39a-7e4efeb0a05c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:44.064 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2eff683c-dd2b-4457-9348-c55cd5c2c95b namespace which is not needed anymore
Jan 22 17:16:44 compute-0 neutron-haproxy-ovnmeta-2eff683c-dd2b-4457-9348-c55cd5c2c95b[224254]: [NOTICE]   (224269) : haproxy version is 2.8.14-c23fe91
Jan 22 17:16:44 compute-0 neutron-haproxy-ovnmeta-2eff683c-dd2b-4457-9348-c55cd5c2c95b[224254]: [NOTICE]   (224269) : path to executable is /usr/sbin/haproxy
Jan 22 17:16:44 compute-0 neutron-haproxy-ovnmeta-2eff683c-dd2b-4457-9348-c55cd5c2c95b[224254]: [WARNING]  (224269) : Exiting Master process...
Jan 22 17:16:44 compute-0 neutron-haproxy-ovnmeta-2eff683c-dd2b-4457-9348-c55cd5c2c95b[224254]: [ALERT]    (224269) : Current worker (224275) exited with code 143 (Terminated)
Jan 22 17:16:44 compute-0 neutron-haproxy-ovnmeta-2eff683c-dd2b-4457-9348-c55cd5c2c95b[224254]: [WARNING]  (224269) : All workers exited. Exiting... (0)
Jan 22 17:16:44 compute-0 systemd[1]: libpod-b367b69b7ee7ac94d649913339cdf7d37e4f4ba500235c0de673869925ff3135.scope: Deactivated successfully.
Jan 22 17:16:44 compute-0 podman[224404]: 2026-01-22 17:16:44.2755053 +0000 UTC m=+0.070034710 container died b367b69b7ee7ac94d649913339cdf7d37e4f4ba500235c0de673869925ff3135 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2eff683c-dd2b-4457-9348-c55cd5c2c95b, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:16:44 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b367b69b7ee7ac94d649913339cdf7d37e4f4ba500235c0de673869925ff3135-userdata-shm.mount: Deactivated successfully.
Jan 22 17:16:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-b30d804e73f85b9350f23b1bf0345860c3dd1ab1634fe62afe17fabfa5863666-merged.mount: Deactivated successfully.
Jan 22 17:16:44 compute-0 podman[224404]: 2026-01-22 17:16:44.315830604 +0000 UTC m=+0.110359964 container cleanup b367b69b7ee7ac94d649913339cdf7d37e4f4ba500235c0de673869925ff3135 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2eff683c-dd2b-4457-9348-c55cd5c2c95b, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 17:16:44 compute-0 systemd[1]: libpod-conmon-b367b69b7ee7ac94d649913339cdf7d37e4f4ba500235c0de673869925ff3135.scope: Deactivated successfully.
Jan 22 17:16:44 compute-0 podman[224433]: 2026-01-22 17:16:44.383417479 +0000 UTC m=+0.049434952 container remove b367b69b7ee7ac94d649913339cdf7d37e4f4ba500235c0de673869925ff3135 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2eff683c-dd2b-4457-9348-c55cd5c2c95b, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:16:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:44.391 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[41671586-eafb-44b7-b1a5-0be22f0dfb51]: (4, ('Thu Jan 22 05:16:44 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-2eff683c-dd2b-4457-9348-c55cd5c2c95b (b367b69b7ee7ac94d649913339cdf7d37e4f4ba500235c0de673869925ff3135)\nb367b69b7ee7ac94d649913339cdf7d37e4f4ba500235c0de673869925ff3135\nThu Jan 22 05:16:44 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-2eff683c-dd2b-4457-9348-c55cd5c2c95b (b367b69b7ee7ac94d649913339cdf7d37e4f4ba500235c0de673869925ff3135)\nb367b69b7ee7ac94d649913339cdf7d37e4f4ba500235c0de673869925ff3135\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:44.394 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f8bf031a-d0f3-4827-8402-884194b7bf1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:44.396 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2eff683c-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:16:44 compute-0 nova_compute[183075]: 2026-01-22 17:16:44.426 183079 DEBUG nova.compute.manager [req-bd85ed3b-7289-4a0c-8b88-65e7543d99b8 req-be19d209-7f9e-4014-a08b-8c93b2a037b6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Received event network-vif-unplugged-0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:16:44 compute-0 nova_compute[183075]: 2026-01-22 17:16:44.427 183079 DEBUG oslo_concurrency.lockutils [req-bd85ed3b-7289-4a0c-8b88-65e7543d99b8 req-be19d209-7f9e-4014-a08b-8c93b2a037b6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:16:44 compute-0 nova_compute[183075]: 2026-01-22 17:16:44.428 183079 DEBUG oslo_concurrency.lockutils [req-bd85ed3b-7289-4a0c-8b88-65e7543d99b8 req-be19d209-7f9e-4014-a08b-8c93b2a037b6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:16:44 compute-0 nova_compute[183075]: 2026-01-22 17:16:44.428 183079 DEBUG oslo_concurrency.lockutils [req-bd85ed3b-7289-4a0c-8b88-65e7543d99b8 req-be19d209-7f9e-4014-a08b-8c93b2a037b6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:16:44 compute-0 nova_compute[183075]: 2026-01-22 17:16:44.428 183079 DEBUG nova.compute.manager [req-bd85ed3b-7289-4a0c-8b88-65e7543d99b8 req-be19d209-7f9e-4014-a08b-8c93b2a037b6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] No waiting events found dispatching network-vif-unplugged-0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:16:44 compute-0 nova_compute[183075]: 2026-01-22 17:16:44.429 183079 DEBUG nova.compute.manager [req-bd85ed3b-7289-4a0c-8b88-65e7543d99b8 req-be19d209-7f9e-4014-a08b-8c93b2a037b6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Received event network-vif-unplugged-0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:16:44 compute-0 nova_compute[183075]: 2026-01-22 17:16:44.452 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:44 compute-0 kernel: tap2eff683c-d0: left promiscuous mode
Jan 22 17:16:44 compute-0 nova_compute[183075]: 2026-01-22 17:16:44.466 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:44 compute-0 nova_compute[183075]: 2026-01-22 17:16:44.466 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:44.469 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[cf80199c-4f86-44d4-98ba-f75393854bf5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:44.484 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7da0a574-07fb-40ae-8345-6c5f9d7dde32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:44.485 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e92fe9b9-8ad3-4c18-ab50-5b61b742d469]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:44.517 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[22af8409-4208-4794-9192-31a6d93eadb2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 456329, 'reachable_time': 20097, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224451, 'error': None, 'target': 'ovnmeta-2eff683c-dd2b-4457-9348-c55cd5c2c95b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:44.521 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2eff683c-dd2b-4457-9348-c55cd5c2c95b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:16:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:16:44.521 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[66f9c245-9d39-420d-b901-16b30112d59e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:16:44 compute-0 nova_compute[183075]: 2026-01-22 17:16:44.537 183079 DEBUG nova.compute.manager [req-cea579e3-5c59-4196-ae9d-2935cb51aede req-d5972d27-2451-4b92-ac19-f0608a84d433 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Received event network-vif-unplugged-c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:16:44 compute-0 nova_compute[183075]: 2026-01-22 17:16:44.538 183079 DEBUG oslo_concurrency.lockutils [req-cea579e3-5c59-4196-ae9d-2935cb51aede req-d5972d27-2451-4b92-ac19-f0608a84d433 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:16:44 compute-0 nova_compute[183075]: 2026-01-22 17:16:44.538 183079 DEBUG oslo_concurrency.lockutils [req-cea579e3-5c59-4196-ae9d-2935cb51aede req-d5972d27-2451-4b92-ac19-f0608a84d433 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:16:44 compute-0 nova_compute[183075]: 2026-01-22 17:16:44.539 183079 DEBUG oslo_concurrency.lockutils [req-cea579e3-5c59-4196-ae9d-2935cb51aede req-d5972d27-2451-4b92-ac19-f0608a84d433 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:16:44 compute-0 nova_compute[183075]: 2026-01-22 17:16:44.539 183079 DEBUG nova.compute.manager [req-cea579e3-5c59-4196-ae9d-2935cb51aede req-d5972d27-2451-4b92-ac19-f0608a84d433 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] No waiting events found dispatching network-vif-unplugged-c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:16:44 compute-0 nova_compute[183075]: 2026-01-22 17:16:44.540 183079 DEBUG nova.compute.manager [req-cea579e3-5c59-4196-ae9d-2935cb51aede req-d5972d27-2451-4b92-ac19-f0608a84d433 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Received event network-vif-unplugged-c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:16:44 compute-0 systemd[1]: run-netns-ovnmeta\x2d2eff683c\x2ddd2b\x2d4457\x2d9348\x2dc55cd5c2c95b.mount: Deactivated successfully.
Jan 22 17:16:46 compute-0 nova_compute[183075]: 2026-01-22 17:16:46.122 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:46 compute-0 nova_compute[183075]: 2026-01-22 17:16:46.176 183079 DEBUG nova.network.neutron [-] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:16:46 compute-0 nova_compute[183075]: 2026-01-22 17:16:46.197 183079 INFO nova.compute.manager [-] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Took 2.24 seconds to deallocate network for instance.
Jan 22 17:16:46 compute-0 nova_compute[183075]: 2026-01-22 17:16:46.253 183079 DEBUG oslo_concurrency.lockutils [None req-4d80a31b-f0cc-4ea6-9b36-2b98a61b598a 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:16:46 compute-0 nova_compute[183075]: 2026-01-22 17:16:46.254 183079 DEBUG oslo_concurrency.lockutils [None req-4d80a31b-f0cc-4ea6-9b36-2b98a61b598a 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:16:46 compute-0 nova_compute[183075]: 2026-01-22 17:16:46.352 183079 DEBUG nova.compute.provider_tree [None req-4d80a31b-f0cc-4ea6-9b36-2b98a61b598a 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:16:46 compute-0 nova_compute[183075]: 2026-01-22 17:16:46.369 183079 DEBUG nova.scheduler.client.report [None req-4d80a31b-f0cc-4ea6-9b36-2b98a61b598a 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:16:46 compute-0 nova_compute[183075]: 2026-01-22 17:16:46.393 183079 DEBUG oslo_concurrency.lockutils [None req-4d80a31b-f0cc-4ea6-9b36-2b98a61b598a 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:16:46 compute-0 nova_compute[183075]: 2026-01-22 17:16:46.422 183079 INFO nova.scheduler.client.report [None req-4d80a31b-f0cc-4ea6-9b36-2b98a61b598a 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Deleted allocations for instance e02af423-c4c3-4fcd-be73-ebeba9fae411
Jan 22 17:16:46 compute-0 nova_compute[183075]: 2026-01-22 17:16:46.512 183079 DEBUG oslo_concurrency.lockutils [None req-4d80a31b-f0cc-4ea6-9b36-2b98a61b598a 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "e02af423-c4c3-4fcd-be73-ebeba9fae411" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.936s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:16:46 compute-0 nova_compute[183075]: 2026-01-22 17:16:46.534 183079 DEBUG nova.compute.manager [req-401e2959-d920-4b9a-83c0-142b61814ddf req-dab5935c-1875-4a07-8124-9d4ef7ec8fd0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Received event network-vif-plugged-0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:16:46 compute-0 nova_compute[183075]: 2026-01-22 17:16:46.534 183079 DEBUG oslo_concurrency.lockutils [req-401e2959-d920-4b9a-83c0-142b61814ddf req-dab5935c-1875-4a07-8124-9d4ef7ec8fd0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:16:46 compute-0 nova_compute[183075]: 2026-01-22 17:16:46.535 183079 DEBUG oslo_concurrency.lockutils [req-401e2959-d920-4b9a-83c0-142b61814ddf req-dab5935c-1875-4a07-8124-9d4ef7ec8fd0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:16:46 compute-0 nova_compute[183075]: 2026-01-22 17:16:46.535 183079 DEBUG oslo_concurrency.lockutils [req-401e2959-d920-4b9a-83c0-142b61814ddf req-dab5935c-1875-4a07-8124-9d4ef7ec8fd0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:16:46 compute-0 nova_compute[183075]: 2026-01-22 17:16:46.535 183079 DEBUG nova.compute.manager [req-401e2959-d920-4b9a-83c0-142b61814ddf req-dab5935c-1875-4a07-8124-9d4ef7ec8fd0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] No waiting events found dispatching network-vif-plugged-0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:16:46 compute-0 nova_compute[183075]: 2026-01-22 17:16:46.536 183079 WARNING nova.compute.manager [req-401e2959-d920-4b9a-83c0-142b61814ddf req-dab5935c-1875-4a07-8124-9d4ef7ec8fd0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Received unexpected event network-vif-plugged-0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510 for instance with vm_state deleted and task_state None.
Jan 22 17:16:46 compute-0 nova_compute[183075]: 2026-01-22 17:16:46.602 183079 DEBUG nova.compute.manager [req-d72c7d62-f705-4948-9f7a-df12d1c376be req-811b867a-f568-41e3-925a-9a430553ebeb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Received event network-vif-plugged-c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:16:46 compute-0 nova_compute[183075]: 2026-01-22 17:16:46.603 183079 DEBUG oslo_concurrency.lockutils [req-d72c7d62-f705-4948-9f7a-df12d1c376be req-811b867a-f568-41e3-925a-9a430553ebeb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:16:46 compute-0 nova_compute[183075]: 2026-01-22 17:16:46.603 183079 DEBUG oslo_concurrency.lockutils [req-d72c7d62-f705-4948-9f7a-df12d1c376be req-811b867a-f568-41e3-925a-9a430553ebeb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:16:46 compute-0 nova_compute[183075]: 2026-01-22 17:16:46.604 183079 DEBUG oslo_concurrency.lockutils [req-d72c7d62-f705-4948-9f7a-df12d1c376be req-811b867a-f568-41e3-925a-9a430553ebeb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e02af423-c4c3-4fcd-be73-ebeba9fae411-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:16:46 compute-0 nova_compute[183075]: 2026-01-22 17:16:46.604 183079 DEBUG nova.compute.manager [req-d72c7d62-f705-4948-9f7a-df12d1c376be req-811b867a-f568-41e3-925a-9a430553ebeb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] No waiting events found dispatching network-vif-plugged-c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:16:46 compute-0 nova_compute[183075]: 2026-01-22 17:16:46.604 183079 WARNING nova.compute.manager [req-d72c7d62-f705-4948-9f7a-df12d1c376be req-811b867a-f568-41e3-925a-9a430553ebeb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Received unexpected event network-vif-plugged-c4d225ba-2970-4b71-9ecd-8f3ef13e1b4d for instance with vm_state deleted and task_state None.
Jan 22 17:16:46 compute-0 nova_compute[183075]: 2026-01-22 17:16:46.604 183079 DEBUG nova.compute.manager [req-d72c7d62-f705-4948-9f7a-df12d1c376be req-811b867a-f568-41e3-925a-9a430553ebeb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Received event network-vif-deleted-0b803bb5-5b58-4d1c-bf2c-a8adbbcbe510 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:16:47 compute-0 podman[224452]: 2026-01-22 17:16:47.384516694 +0000 UTC m=+0.084339375 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 17:16:48 compute-0 nova_compute[183075]: 2026-01-22 17:16:48.915 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:50 compute-0 nova_compute[183075]: 2026-01-22 17:16:50.741 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:51 compute-0 nova_compute[183075]: 2026-01-22 17:16:51.125 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:53 compute-0 nova_compute[183075]: 2026-01-22 17:16:53.920 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:54 compute-0 nova_compute[183075]: 2026-01-22 17:16:54.073 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:54 compute-0 nova_compute[183075]: 2026-01-22 17:16:54.811 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:56 compute-0 nova_compute[183075]: 2026-01-22 17:16:56.180 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:56 compute-0 podman[224476]: 2026-01-22 17:16:56.367751088 +0000 UTC m=+0.082064195 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 17:16:58 compute-0 nova_compute[183075]: 2026-01-22 17:16:58.867 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769102203.8632317, e02af423-c4c3-4fcd-be73-ebeba9fae411 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:16:58 compute-0 nova_compute[183075]: 2026-01-22 17:16:58.867 183079 INFO nova.compute.manager [-] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] VM Stopped (Lifecycle Event)
Jan 22 17:16:58 compute-0 nova_compute[183075]: 2026-01-22 17:16:58.916 183079 DEBUG nova.compute.manager [None req-a96b766a-e6d1-4f80-a5df-a585f205ecd7 - - - - - -] [instance: e02af423-c4c3-4fcd-be73-ebeba9fae411] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:16:58 compute-0 nova_compute[183075]: 2026-01-22 17:16:58.922 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:00.118 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:17:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:00.119 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:17:00 compute-0 nova_compute[183075]: 2026-01-22 17:17:00.119 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:01 compute-0 nova_compute[183075]: 2026-01-22 17:17:01.181 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:02 compute-0 nova_compute[183075]: 2026-01-22 17:17:02.604 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:03 compute-0 nova_compute[183075]: 2026-01-22 17:17:03.926 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:06 compute-0 nova_compute[183075]: 2026-01-22 17:17:06.182 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:06 compute-0 podman[224504]: 2026-01-22 17:17:06.372054959 +0000 UTC m=+0.063268234 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 22 17:17:06 compute-0 podman[224503]: 2026-01-22 17:17:06.391339543 +0000 UTC m=+0.099069259 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:17:06 compute-0 podman[224510]: 2026-01-22 17:17:06.392116263 +0000 UTC m=+0.076704895 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, version=9.6, maintainer=Red Hat, Inc., 
com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=)
Jan 22 17:17:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:08.121 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:17:08 compute-0 nova_compute[183075]: 2026-01-22 17:17:08.456 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:08 compute-0 nova_compute[183075]: 2026-01-22 17:17:08.668 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:17:08 compute-0 nova_compute[183075]: 2026-01-22 17:17:08.928 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:11 compute-0 nova_compute[183075]: 2026-01-22 17:17:11.068 183079 DEBUG oslo_concurrency.lockutils [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquiring lock "c925ab60-0524-40ab-a82b-52f810b9023f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:17:11 compute-0 nova_compute[183075]: 2026-01-22 17:17:11.069 183079 DEBUG oslo_concurrency.lockutils [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "c925ab60-0524-40ab-a82b-52f810b9023f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:17:11 compute-0 nova_compute[183075]: 2026-01-22 17:17:11.106 183079 DEBUG nova.compute.manager [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:17:11 compute-0 nova_compute[183075]: 2026-01-22 17:17:11.188 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:11 compute-0 nova_compute[183075]: 2026-01-22 17:17:11.221 183079 DEBUG oslo_concurrency.lockutils [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:17:11 compute-0 nova_compute[183075]: 2026-01-22 17:17:11.222 183079 DEBUG oslo_concurrency.lockutils [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:17:11 compute-0 nova_compute[183075]: 2026-01-22 17:17:11.233 183079 DEBUG nova.virt.hardware [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:17:11 compute-0 nova_compute[183075]: 2026-01-22 17:17:11.233 183079 INFO nova.compute.claims [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:17:11 compute-0 nova_compute[183075]: 2026-01-22 17:17:11.353 183079 DEBUG nova.scheduler.client.report [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Refreshing inventories for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 17:17:11 compute-0 podman[224567]: 2026-01-22 17:17:11.394210943 +0000 UTC m=+0.091121151 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 22 17:17:11 compute-0 nova_compute[183075]: 2026-01-22 17:17:11.423 183079 DEBUG nova.scheduler.client.report [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Updating ProviderTree inventory for provider 2513134c-f67c-4237-84bf-4ebe2450d610 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 17:17:11 compute-0 nova_compute[183075]: 2026-01-22 17:17:11.423 183079 DEBUG nova.compute.provider_tree [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Updating inventory in ProviderTree for provider 2513134c-f67c-4237-84bf-4ebe2450d610 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 17:17:11 compute-0 nova_compute[183075]: 2026-01-22 17:17:11.451 183079 DEBUG nova.scheduler.client.report [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Refreshing aggregate associations for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 17:17:11 compute-0 nova_compute[183075]: 2026-01-22 17:17:11.484 183079 DEBUG nova.scheduler.client.report [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Refreshing trait associations for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610, traits: COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SVM,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE42,COMPUTE_NODE,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE4A,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_FMA3,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_AUTO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 17:17:11 compute-0 nova_compute[183075]: 2026-01-22 17:17:11.532 183079 DEBUG nova.compute.provider_tree [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:17:11 compute-0 nova_compute[183075]: 2026-01-22 17:17:11.550 183079 DEBUG nova.scheduler.client.report [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:17:11 compute-0 nova_compute[183075]: 2026-01-22 17:17:11.589 183079 DEBUG oslo_concurrency.lockutils [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.367s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:17:11 compute-0 nova_compute[183075]: 2026-01-22 17:17:11.590 183079 DEBUG nova.compute.manager [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:17:11 compute-0 nova_compute[183075]: 2026-01-22 17:17:11.664 183079 DEBUG nova.compute.manager [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:17:11 compute-0 nova_compute[183075]: 2026-01-22 17:17:11.665 183079 DEBUG nova.network.neutron [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:17:11 compute-0 nova_compute[183075]: 2026-01-22 17:17:11.685 183079 INFO nova.virt.libvirt.driver [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:17:11 compute-0 nova_compute[183075]: 2026-01-22 17:17:11.703 183079 DEBUG nova.compute.manager [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:17:11 compute-0 nova_compute[183075]: 2026-01-22 17:17:11.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:17:11 compute-0 nova_compute[183075]: 2026-01-22 17:17:11.798 183079 DEBUG nova.compute.manager [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:17:11 compute-0 nova_compute[183075]: 2026-01-22 17:17:11.799 183079 DEBUG nova.virt.libvirt.driver [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:17:11 compute-0 nova_compute[183075]: 2026-01-22 17:17:11.800 183079 INFO nova.virt.libvirt.driver [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Creating image(s)
Jan 22 17:17:11 compute-0 nova_compute[183075]: 2026-01-22 17:17:11.801 183079 DEBUG oslo_concurrency.lockutils [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquiring lock "/var/lib/nova/instances/c925ab60-0524-40ab-a82b-52f810b9023f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:17:11 compute-0 nova_compute[183075]: 2026-01-22 17:17:11.801 183079 DEBUG oslo_concurrency.lockutils [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "/var/lib/nova/instances/c925ab60-0524-40ab-a82b-52f810b9023f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:17:11 compute-0 nova_compute[183075]: 2026-01-22 17:17:11.802 183079 DEBUG oslo_concurrency.lockutils [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "/var/lib/nova/instances/c925ab60-0524-40ab-a82b-52f810b9023f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:17:11 compute-0 nova_compute[183075]: 2026-01-22 17:17:11.825 183079 DEBUG oslo_concurrency.processutils [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:17:11 compute-0 nova_compute[183075]: 2026-01-22 17:17:11.913 183079 DEBUG oslo_concurrency.processutils [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:17:11 compute-0 nova_compute[183075]: 2026-01-22 17:17:11.915 183079 DEBUG oslo_concurrency.lockutils [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:17:11 compute-0 nova_compute[183075]: 2026-01-22 17:17:11.916 183079 DEBUG oslo_concurrency.lockutils [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:17:11 compute-0 nova_compute[183075]: 2026-01-22 17:17:11.939 183079 DEBUG oslo_concurrency.processutils [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:17:12 compute-0 nova_compute[183075]: 2026-01-22 17:17:12.035 183079 DEBUG oslo_concurrency.processutils [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:17:12 compute-0 nova_compute[183075]: 2026-01-22 17:17:12.036 183079 DEBUG oslo_concurrency.processutils [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/c925ab60-0524-40ab-a82b-52f810b9023f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:17:12 compute-0 nova_compute[183075]: 2026-01-22 17:17:12.096 183079 DEBUG oslo_concurrency.processutils [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/c925ab60-0524-40ab-a82b-52f810b9023f/disk 1073741824" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:17:12 compute-0 nova_compute[183075]: 2026-01-22 17:17:12.099 183079 DEBUG oslo_concurrency.lockutils [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:17:12 compute-0 nova_compute[183075]: 2026-01-22 17:17:12.100 183079 DEBUG oslo_concurrency.processutils [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:17:12 compute-0 nova_compute[183075]: 2026-01-22 17:17:12.192 183079 DEBUG oslo_concurrency.processutils [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:17:12 compute-0 nova_compute[183075]: 2026-01-22 17:17:12.193 183079 DEBUG nova.virt.disk.api [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Checking if we can resize image /var/lib/nova/instances/c925ab60-0524-40ab-a82b-52f810b9023f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:17:12 compute-0 nova_compute[183075]: 2026-01-22 17:17:12.194 183079 DEBUG oslo_concurrency.processutils [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c925ab60-0524-40ab-a82b-52f810b9023f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:17:12 compute-0 nova_compute[183075]: 2026-01-22 17:17:12.286 183079 DEBUG oslo_concurrency.processutils [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c925ab60-0524-40ab-a82b-52f810b9023f/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:17:12 compute-0 nova_compute[183075]: 2026-01-22 17:17:12.287 183079 DEBUG nova.virt.disk.api [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Cannot resize image /var/lib/nova/instances/c925ab60-0524-40ab-a82b-52f810b9023f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:17:12 compute-0 nova_compute[183075]: 2026-01-22 17:17:12.288 183079 DEBUG nova.objects.instance [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lazy-loading 'migration_context' on Instance uuid c925ab60-0524-40ab-a82b-52f810b9023f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:17:12 compute-0 nova_compute[183075]: 2026-01-22 17:17:12.302 183079 DEBUG nova.virt.libvirt.driver [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:17:12 compute-0 nova_compute[183075]: 2026-01-22 17:17:12.303 183079 DEBUG nova.virt.libvirt.driver [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Ensure instance console log exists: /var/lib/nova/instances/c925ab60-0524-40ab-a82b-52f810b9023f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:17:12 compute-0 nova_compute[183075]: 2026-01-22 17:17:12.303 183079 DEBUG oslo_concurrency.lockutils [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:17:12 compute-0 nova_compute[183075]: 2026-01-22 17:17:12.304 183079 DEBUG oslo_concurrency.lockutils [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:17:12 compute-0 nova_compute[183075]: 2026-01-22 17:17:12.304 183079 DEBUG oslo_concurrency.lockutils [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:17:12 compute-0 nova_compute[183075]: 2026-01-22 17:17:12.342 183079 DEBUG nova.policy [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2fe02f7484a94091bab26aba1c370459', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4f95c62a5a194d2291b03187a9c85702', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:17:12 compute-0 nova_compute[183075]: 2026-01-22 17:17:12.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:17:12 compute-0 nova_compute[183075]: 2026-01-22 17:17:12.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:17:13 compute-0 nova_compute[183075]: 2026-01-22 17:17:13.457 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:13 compute-0 nova_compute[183075]: 2026-01-22 17:17:13.930 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:13 compute-0 nova_compute[183075]: 2026-01-22 17:17:13.955 183079 DEBUG nova.network.neutron [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Successfully created port: 6b1bf9db-e098-4d03-b185-9a64eee8cec2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:17:15 compute-0 nova_compute[183075]: 2026-01-22 17:17:15.251 183079 DEBUG nova.network.neutron [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Successfully created port: 6cf839fd-ff11-4ab3-a473-8a61175f769b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:17:16 compute-0 nova_compute[183075]: 2026-01-22 17:17:16.087 183079 DEBUG nova.network.neutron [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Successfully updated port: 6b1bf9db-e098-4d03-b185-9a64eee8cec2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:17:16 compute-0 nova_compute[183075]: 2026-01-22 17:17:16.187 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:16 compute-0 nova_compute[183075]: 2026-01-22 17:17:16.213 183079 DEBUG nova.compute.manager [req-79ed88cf-0b6b-470e-aca2-91b11596c351 req-9cc7c9c1-ba5a-4388-b6b4-f6fd55b8c35b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Received event network-changed-6b1bf9db-e098-4d03-b185-9a64eee8cec2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:17:16 compute-0 nova_compute[183075]: 2026-01-22 17:17:16.214 183079 DEBUG nova.compute.manager [req-79ed88cf-0b6b-470e-aca2-91b11596c351 req-9cc7c9c1-ba5a-4388-b6b4-f6fd55b8c35b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Refreshing instance network info cache due to event network-changed-6b1bf9db-e098-4d03-b185-9a64eee8cec2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:17:16 compute-0 nova_compute[183075]: 2026-01-22 17:17:16.214 183079 DEBUG oslo_concurrency.lockutils [req-79ed88cf-0b6b-470e-aca2-91b11596c351 req-9cc7c9c1-ba5a-4388-b6b4-f6fd55b8c35b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-c925ab60-0524-40ab-a82b-52f810b9023f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:17:16 compute-0 nova_compute[183075]: 2026-01-22 17:17:16.214 183079 DEBUG oslo_concurrency.lockutils [req-79ed88cf-0b6b-470e-aca2-91b11596c351 req-9cc7c9c1-ba5a-4388-b6b4-f6fd55b8c35b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-c925ab60-0524-40ab-a82b-52f810b9023f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:17:16 compute-0 nova_compute[183075]: 2026-01-22 17:17:16.214 183079 DEBUG nova.network.neutron [req-79ed88cf-0b6b-470e-aca2-91b11596c351 req-9cc7c9c1-ba5a-4388-b6b4-f6fd55b8c35b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Refreshing network info cache for port 6b1bf9db-e098-4d03-b185-9a64eee8cec2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:17:16 compute-0 nova_compute[183075]: 2026-01-22 17:17:16.448 183079 DEBUG nova.network.neutron [req-79ed88cf-0b6b-470e-aca2-91b11596c351 req-9cc7c9c1-ba5a-4388-b6b4-f6fd55b8c35b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:17:16 compute-0 nova_compute[183075]: 2026-01-22 17:17:16.767 183079 DEBUG oslo_concurrency.lockutils [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Acquiring lock "02af6288-0bd3-438c-982d-f36b31e1a9bf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:17:16 compute-0 nova_compute[183075]: 2026-01-22 17:17:16.768 183079 DEBUG oslo_concurrency.lockutils [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Lock "02af6288-0bd3-438c-982d-f36b31e1a9bf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:17:16 compute-0 nova_compute[183075]: 2026-01-22 17:17:16.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:17:16 compute-0 nova_compute[183075]: 2026-01-22 17:17:16.787 183079 DEBUG nova.compute.manager [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:17:16 compute-0 nova_compute[183075]: 2026-01-22 17:17:16.842 183079 DEBUG nova.network.neutron [req-79ed88cf-0b6b-470e-aca2-91b11596c351 req-9cc7c9c1-ba5a-4388-b6b4-f6fd55b8c35b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:17:16 compute-0 nova_compute[183075]: 2026-01-22 17:17:16.879 183079 DEBUG oslo_concurrency.lockutils [req-79ed88cf-0b6b-470e-aca2-91b11596c351 req-9cc7c9c1-ba5a-4388-b6b4-f6fd55b8c35b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-c925ab60-0524-40ab-a82b-52f810b9023f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:17:16 compute-0 nova_compute[183075]: 2026-01-22 17:17:16.901 183079 DEBUG oslo_concurrency.lockutils [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:17:16 compute-0 nova_compute[183075]: 2026-01-22 17:17:16.902 183079 DEBUG oslo_concurrency.lockutils [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:17:16 compute-0 nova_compute[183075]: 2026-01-22 17:17:16.911 183079 DEBUG nova.virt.hardware [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:17:16 compute-0 nova_compute[183075]: 2026-01-22 17:17:16.912 183079 INFO nova.compute.claims [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.053 183079 DEBUG nova.compute.provider_tree [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.066 183079 DEBUG nova.scheduler.client.report [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.070 183079 DEBUG nova.network.neutron [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Successfully updated port: 6cf839fd-ff11-4ab3-a473-8a61175f769b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.089 183079 DEBUG oslo_concurrency.lockutils [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquiring lock "refresh_cache-c925ab60-0524-40ab-a82b-52f810b9023f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.089 183079 DEBUG oslo_concurrency.lockutils [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquired lock "refresh_cache-c925ab60-0524-40ab-a82b-52f810b9023f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.089 183079 DEBUG nova.network.neutron [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.091 183079 DEBUG oslo_concurrency.lockutils [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.092 183079 DEBUG nova.compute.manager [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.134 183079 DEBUG nova.compute.manager [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.134 183079 DEBUG nova.network.neutron [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.150 183079 INFO nova.virt.libvirt.driver [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.165 183079 DEBUG nova.compute.manager [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.246 183079 DEBUG nova.network.neutron [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.256 183079 DEBUG nova.compute.manager [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.257 183079 DEBUG nova.virt.libvirt.driver [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.257 183079 INFO nova.virt.libvirt.driver [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Creating image(s)
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.257 183079 DEBUG oslo_concurrency.lockutils [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Acquiring lock "/var/lib/nova/instances/02af6288-0bd3-438c-982d-f36b31e1a9bf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.258 183079 DEBUG oslo_concurrency.lockutils [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Lock "/var/lib/nova/instances/02af6288-0bd3-438c-982d-f36b31e1a9bf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.258 183079 DEBUG oslo_concurrency.lockutils [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Lock "/var/lib/nova/instances/02af6288-0bd3-438c-982d-f36b31e1a9bf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.269 183079 DEBUG oslo_concurrency.processutils [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.315 183079 DEBUG nova.policy [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3d4b274173814c359055fed8dfc2bdeb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '70a37fbbd795434fbeb722ad97dda552', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.328 183079 DEBUG oslo_concurrency.processutils [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.328 183079 DEBUG oslo_concurrency.lockutils [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.329 183079 DEBUG oslo_concurrency.lockutils [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.339 183079 DEBUG oslo_concurrency.processutils [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.414 183079 DEBUG oslo_concurrency.processutils [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.415 183079 DEBUG oslo_concurrency.processutils [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/02af6288-0bd3-438c-982d-f36b31e1a9bf/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.457 183079 DEBUG oslo_concurrency.processutils [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/02af6288-0bd3-438c-982d-f36b31e1a9bf/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.458 183079 DEBUG oslo_concurrency.lockutils [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.459 183079 DEBUG oslo_concurrency.processutils [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.515 183079 DEBUG oslo_concurrency.processutils [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.515 183079 DEBUG nova.virt.disk.api [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Checking if we can resize image /var/lib/nova/instances/02af6288-0bd3-438c-982d-f36b31e1a9bf/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.516 183079 DEBUG oslo_concurrency.processutils [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02af6288-0bd3-438c-982d-f36b31e1a9bf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.579 183079 DEBUG oslo_concurrency.processutils [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02af6288-0bd3-438c-982d-f36b31e1a9bf/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.580 183079 DEBUG nova.virt.disk.api [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Cannot resize image /var/lib/nova/instances/02af6288-0bd3-438c-982d-f36b31e1a9bf/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.580 183079 DEBUG nova.objects.instance [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Lazy-loading 'migration_context' on Instance uuid 02af6288-0bd3-438c-982d-f36b31e1a9bf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.593 183079 DEBUG nova.virt.libvirt.driver [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.594 183079 DEBUG nova.virt.libvirt.driver [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Ensure instance console log exists: /var/lib/nova/instances/02af6288-0bd3-438c-982d-f36b31e1a9bf/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.594 183079 DEBUG oslo_concurrency.lockutils [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.595 183079 DEBUG oslo_concurrency.lockutils [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.595 183079 DEBUG oslo_concurrency.lockutils [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.806 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.807 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:17:17 compute-0 nova_compute[183075]: 2026-01-22 17:17:17.807 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:17:18 compute-0 nova_compute[183075]: 2026-01-22 17:17:18.370 183079 DEBUG nova.compute.manager [req-02a7a093-5ce6-4fce-a297-b654d29b958b req-27111d8f-f2ef-4eab-9f60-731eb272bf69 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Received event network-changed-6cf839fd-ff11-4ab3-a473-8a61175f769b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:17:18 compute-0 nova_compute[183075]: 2026-01-22 17:17:18.370 183079 DEBUG nova.compute.manager [req-02a7a093-5ce6-4fce-a297-b654d29b958b req-27111d8f-f2ef-4eab-9f60-731eb272bf69 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Refreshing instance network info cache due to event network-changed-6cf839fd-ff11-4ab3-a473-8a61175f769b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:17:18 compute-0 nova_compute[183075]: 2026-01-22 17:17:18.371 183079 DEBUG oslo_concurrency.lockutils [req-02a7a093-5ce6-4fce-a297-b654d29b958b req-27111d8f-f2ef-4eab-9f60-731eb272bf69 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-c925ab60-0524-40ab-a82b-52f810b9023f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:17:18 compute-0 podman[224617]: 2026-01-22 17:17:18.38894849 +0000 UTC m=+0.087437155 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:17:18 compute-0 nova_compute[183075]: 2026-01-22 17:17:18.424 183079 DEBUG nova.network.neutron [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Successfully created port: 9a06288c-d8e5-43c4-9559-23674152a05e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:17:18 compute-0 nova_compute[183075]: 2026-01-22 17:17:18.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:17:18 compute-0 nova_compute[183075]: 2026-01-22 17:17:18.932 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.255 183079 DEBUG nova.network.neutron [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Updating instance_info_cache with network_info: [{"id": "6b1bf9db-e098-4d03-b185-9a64eee8cec2", "address": "fa:16:3e:e6:59:b5", "network": {"id": "359b74c5-cbeb-4440-a3e9-a16a51b1ab77", "bridge": "br-int", "label": "tempest-test-network--531378131", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b1bf9db-e0", "ovs_interfaceid": "6b1bf9db-e098-4d03-b185-9a64eee8cec2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6cf839fd-ff11-4ab3-a473-8a61175f769b", "address": "fa:16:3e:ae:c3:d7", "network": {"id": "3b6ff245-6da2-41fe-a6c8-3c52ded12515", "bridge": "br-int", "label": "tempest-test-network--1156553269", "subnets": [{"cidr": "2001:db8:0:2::/64", "dns": [], "gateway": {"address": "2001:db8:0:2::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:2:f816:3eff:feae:c3d7", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cf839fd-ff", "ovs_interfaceid": "6cf839fd-ff11-4ab3-a473-8a61175f769b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.274 183079 DEBUG oslo_concurrency.lockutils [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Releasing lock "refresh_cache-c925ab60-0524-40ab-a82b-52f810b9023f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.274 183079 DEBUG nova.compute.manager [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Instance network_info: |[{"id": "6b1bf9db-e098-4d03-b185-9a64eee8cec2", "address": "fa:16:3e:e6:59:b5", "network": {"id": "359b74c5-cbeb-4440-a3e9-a16a51b1ab77", "bridge": "br-int", "label": "tempest-test-network--531378131", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b1bf9db-e0", "ovs_interfaceid": "6b1bf9db-e098-4d03-b185-9a64eee8cec2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6cf839fd-ff11-4ab3-a473-8a61175f769b", "address": "fa:16:3e:ae:c3:d7", "network": {"id": "3b6ff245-6da2-41fe-a6c8-3c52ded12515", "bridge": "br-int", "label": "tempest-test-network--1156553269", "subnets": [{"cidr": "2001:db8:0:2::/64", "dns": [], "gateway": {"address": "2001:db8:0:2::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:2:f816:3eff:feae:c3d7", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cf839fd-ff", "ovs_interfaceid": "6cf839fd-ff11-4ab3-a473-8a61175f769b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.275 183079 DEBUG oslo_concurrency.lockutils [req-02a7a093-5ce6-4fce-a297-b654d29b958b req-27111d8f-f2ef-4eab-9f60-731eb272bf69 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-c925ab60-0524-40ab-a82b-52f810b9023f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.275 183079 DEBUG nova.network.neutron [req-02a7a093-5ce6-4fce-a297-b654d29b958b req-27111d8f-f2ef-4eab-9f60-731eb272bf69 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Refreshing network info cache for port 6cf839fd-ff11-4ab3-a473-8a61175f769b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.281 183079 DEBUG nova.virt.libvirt.driver [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Start _get_guest_xml network_info=[{"id": "6b1bf9db-e098-4d03-b185-9a64eee8cec2", "address": "fa:16:3e:e6:59:b5", "network": {"id": "359b74c5-cbeb-4440-a3e9-a16a51b1ab77", "bridge": "br-int", "label": "tempest-test-network--531378131", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b1bf9db-e0", "ovs_interfaceid": "6b1bf9db-e098-4d03-b185-9a64eee8cec2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6cf839fd-ff11-4ab3-a473-8a61175f769b", "address": "fa:16:3e:ae:c3:d7", "network": {"id": "3b6ff245-6da2-41fe-a6c8-3c52ded12515", "bridge": "br-int", "label": "tempest-test-network--1156553269", "subnets": [{"cidr": "2001:db8:0:2::/64", "dns": [], "gateway": {"address": "2001:db8:0:2::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:2:f816:3eff:feae:c3d7", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cf839fd-ff", "ovs_interfaceid": "6cf839fd-ff11-4ab3-a473-8a61175f769b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.287 183079 WARNING nova.virt.libvirt.driver [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.301 183079 DEBUG nova.virt.libvirt.host [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.301 183079 DEBUG nova.virt.libvirt.host [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.305 183079 DEBUG nova.virt.libvirt.host [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.305 183079 DEBUG nova.virt.libvirt.host [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.305 183079 DEBUG nova.virt.libvirt.driver [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.306 183079 DEBUG nova.virt.hardware [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.306 183079 DEBUG nova.virt.hardware [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.306 183079 DEBUG nova.virt.hardware [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.307 183079 DEBUG nova.virt.hardware [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.307 183079 DEBUG nova.virt.hardware [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.307 183079 DEBUG nova.virt.hardware [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.307 183079 DEBUG nova.virt.hardware [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.307 183079 DEBUG nova.virt.hardware [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.308 183079 DEBUG nova.virt.hardware [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.308 183079 DEBUG nova.virt.hardware [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.308 183079 DEBUG nova.virt.hardware [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.311 183079 DEBUG nova.virt.libvirt.vif [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:17:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-37806664',display_name='tempest-server-test-37806664',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-37806664',id=31,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDmUdcgi0W3CCW7+0zVPZm4+SGvgMVYIF2EiVCZOZNnFUe7R59w7UIBntHS+djSABVTvpJeBQFo+aQIZ5cRYQ1WALN6CDQL2VSKr+LjaC8setfZMT5V7IalcT1iGutwkuw==',key_name='tempest-keypair-test-914827607',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4f95c62a5a194d2291b03187a9c85702',ramdisk_id='',reservation_id='r-q3uks0k4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio
',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-IPv6Test-1828780436',owner_user_name='tempest-IPv6Test-1828780436-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:17:11Z,user_data=None,user_id='2fe02f7484a94091bab26aba1c370459',uuid=c925ab60-0524-40ab-a82b-52f810b9023f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6b1bf9db-e098-4d03-b185-9a64eee8cec2", "address": "fa:16:3e:e6:59:b5", "network": {"id": "359b74c5-cbeb-4440-a3e9-a16a51b1ab77", "bridge": "br-int", "label": "tempest-test-network--531378131", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b1bf9db-e0", "ovs_interfaceid": "6b1bf9db-e098-4d03-b185-9a64eee8cec2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.311 183079 DEBUG nova.network.os_vif_util [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converting VIF {"id": "6b1bf9db-e098-4d03-b185-9a64eee8cec2", "address": "fa:16:3e:e6:59:b5", "network": {"id": "359b74c5-cbeb-4440-a3e9-a16a51b1ab77", "bridge": "br-int", "label": "tempest-test-network--531378131", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b1bf9db-e0", "ovs_interfaceid": "6b1bf9db-e098-4d03-b185-9a64eee8cec2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.312 183079 DEBUG nova.network.os_vif_util [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e6:59:b5,bridge_name='br-int',has_traffic_filtering=True,id=6b1bf9db-e098-4d03-b185-9a64eee8cec2,network=Network(359b74c5-cbeb-4440-a3e9-a16a51b1ab77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b1bf9db-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.313 183079 DEBUG nova.virt.libvirt.vif [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:17:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-37806664',display_name='tempest-server-test-37806664',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-37806664',id=31,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDmUdcgi0W3CCW7+0zVPZm4+SGvgMVYIF2EiVCZOZNnFUe7R59w7UIBntHS+djSABVTvpJeBQFo+aQIZ5cRYQ1WALN6CDQL2VSKr+LjaC8setfZMT5V7IalcT1iGutwkuw==',key_name='tempest-keypair-test-914827607',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4f95c62a5a194d2291b03187a9c85702',ramdisk_id='',reservation_id='r-q3uks0k4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio
',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-IPv6Test-1828780436',owner_user_name='tempest-IPv6Test-1828780436-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:17:11Z,user_data=None,user_id='2fe02f7484a94091bab26aba1c370459',uuid=c925ab60-0524-40ab-a82b-52f810b9023f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6cf839fd-ff11-4ab3-a473-8a61175f769b", "address": "fa:16:3e:ae:c3:d7", "network": {"id": "3b6ff245-6da2-41fe-a6c8-3c52ded12515", "bridge": "br-int", "label": "tempest-test-network--1156553269", "subnets": [{"cidr": "2001:db8:0:2::/64", "dns": [], "gateway": {"address": "2001:db8:0:2::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:2:f816:3eff:feae:c3d7", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cf839fd-ff", "ovs_interfaceid": "6cf839fd-ff11-4ab3-a473-8a61175f769b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.313 183079 DEBUG nova.network.os_vif_util [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converting VIF {"id": "6cf839fd-ff11-4ab3-a473-8a61175f769b", "address": "fa:16:3e:ae:c3:d7", "network": {"id": "3b6ff245-6da2-41fe-a6c8-3c52ded12515", "bridge": "br-int", "label": "tempest-test-network--1156553269", "subnets": [{"cidr": "2001:db8:0:2::/64", "dns": [], "gateway": {"address": "2001:db8:0:2::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:2:f816:3eff:feae:c3d7", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cf839fd-ff", "ovs_interfaceid": "6cf839fd-ff11-4ab3-a473-8a61175f769b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.313 183079 DEBUG nova.network.os_vif_util [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ae:c3:d7,bridge_name='br-int',has_traffic_filtering=True,id=6cf839fd-ff11-4ab3-a473-8a61175f769b,network=Network(3b6ff245-6da2-41fe-a6c8-3c52ded12515),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6cf839fd-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.314 183079 DEBUG nova.objects.instance [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lazy-loading 'pci_devices' on Instance uuid c925ab60-0524-40ab-a82b-52f810b9023f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.327 183079 DEBUG nova.virt.libvirt.driver [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:17:19 compute-0 nova_compute[183075]:   <uuid>c925ab60-0524-40ab-a82b-52f810b9023f</uuid>
Jan 22 17:17:19 compute-0 nova_compute[183075]:   <name>instance-0000001f</name>
Jan 22 17:17:19 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:17:19 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:17:19 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:17:19 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-37806664</nova:name>
Jan 22 17:17:19 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:17:19</nova:creationTime>
Jan 22 17:17:19 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:17:19 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:17:19 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:17:19 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:17:19 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:17:19 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:17:19 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:17:19 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:17:19 compute-0 nova_compute[183075]:         <nova:user uuid="2fe02f7484a94091bab26aba1c370459">tempest-IPv6Test-1828780436-project-member</nova:user>
Jan 22 17:17:19 compute-0 nova_compute[183075]:         <nova:project uuid="4f95c62a5a194d2291b03187a9c85702">tempest-IPv6Test-1828780436</nova:project>
Jan 22 17:17:19 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:17:19 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:17:19 compute-0 nova_compute[183075]:         <nova:port uuid="6b1bf9db-e098-4d03-b185-9a64eee8cec2">
Jan 22 17:17:19 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:17:19 compute-0 nova_compute[183075]:         <nova:port uuid="6cf839fd-ff11-4ab3-a473-8a61175f769b">
Jan 22 17:17:19 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="2001:db8:0:2:f816:3eff:feae:c3d7" ipVersion="6"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:17:19 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:17:19 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:17:19 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <system>
Jan 22 17:17:19 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:17:19 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:17:19 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:17:19 compute-0 nova_compute[183075]:       <entry name="serial">c925ab60-0524-40ab-a82b-52f810b9023f</entry>
Jan 22 17:17:19 compute-0 nova_compute[183075]:       <entry name="uuid">c925ab60-0524-40ab-a82b-52f810b9023f</entry>
Jan 22 17:17:19 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     </system>
Jan 22 17:17:19 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:17:19 compute-0 nova_compute[183075]:   <os>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:   </os>
Jan 22 17:17:19 compute-0 nova_compute[183075]:   <features>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:   </features>
Jan 22 17:17:19 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:17:19 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:17:19 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:17:19 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/c925ab60-0524-40ab-a82b-52f810b9023f/disk"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:17:19 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:e6:59:b5"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:       <target dev="tap6b1bf9db-e0"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:17:19 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:ae:c3:d7"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:       <target dev="tap6cf839fd-ff"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:17:19 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/c925ab60-0524-40ab-a82b-52f810b9023f/console.log" append="off"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <video>
Jan 22 17:17:19 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     </video>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:17:19 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:17:19 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:17:19 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:17:19 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:17:19 compute-0 nova_compute[183075]: </domain>
Jan 22 17:17:19 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.329 183079 DEBUG nova.compute.manager [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Preparing to wait for external event network-vif-plugged-6b1bf9db-e098-4d03-b185-9a64eee8cec2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.329 183079 DEBUG oslo_concurrency.lockutils [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquiring lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.329 183079 DEBUG oslo_concurrency.lockutils [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.329 183079 DEBUG oslo_concurrency.lockutils [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.330 183079 DEBUG nova.compute.manager [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Preparing to wait for external event network-vif-plugged-6cf839fd-ff11-4ab3-a473-8a61175f769b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.330 183079 DEBUG oslo_concurrency.lockutils [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquiring lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.330 183079 DEBUG oslo_concurrency.lockutils [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.330 183079 DEBUG oslo_concurrency.lockutils [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.331 183079 DEBUG nova.virt.libvirt.vif [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:17:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-37806664',display_name='tempest-server-test-37806664',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-37806664',id=31,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDmUdcgi0W3CCW7+0zVPZm4+SGvgMVYIF2EiVCZOZNnFUe7R59w7UIBntHS+djSABVTvpJeBQFo+aQIZ5cRYQ1WALN6CDQL2VSKr+LjaC8setfZMT5V7IalcT1iGutwkuw==',key_name='tempest-keypair-test-914827607',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4f95c62a5a194d2291b03187a9c85702',ramdisk_id='',reservation_id='r-q3uks0k4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_mod
el='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-IPv6Test-1828780436',owner_user_name='tempest-IPv6Test-1828780436-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:17:11Z,user_data=None,user_id='2fe02f7484a94091bab26aba1c370459',uuid=c925ab60-0524-40ab-a82b-52f810b9023f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6b1bf9db-e098-4d03-b185-9a64eee8cec2", "address": "fa:16:3e:e6:59:b5", "network": {"id": "359b74c5-cbeb-4440-a3e9-a16a51b1ab77", "bridge": "br-int", "label": "tempest-test-network--531378131", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b1bf9db-e0", "ovs_interfaceid": "6b1bf9db-e098-4d03-b185-9a64eee8cec2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.331 183079 DEBUG nova.network.os_vif_util [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converting VIF {"id": "6b1bf9db-e098-4d03-b185-9a64eee8cec2", "address": "fa:16:3e:e6:59:b5", "network": {"id": "359b74c5-cbeb-4440-a3e9-a16a51b1ab77", "bridge": "br-int", "label": "tempest-test-network--531378131", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b1bf9db-e0", "ovs_interfaceid": "6b1bf9db-e098-4d03-b185-9a64eee8cec2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.332 183079 DEBUG nova.network.os_vif_util [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e6:59:b5,bridge_name='br-int',has_traffic_filtering=True,id=6b1bf9db-e098-4d03-b185-9a64eee8cec2,network=Network(359b74c5-cbeb-4440-a3e9-a16a51b1ab77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b1bf9db-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.332 183079 DEBUG os_vif [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e6:59:b5,bridge_name='br-int',has_traffic_filtering=True,id=6b1bf9db-e098-4d03-b185-9a64eee8cec2,network=Network(359b74c5-cbeb-4440-a3e9-a16a51b1ab77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b1bf9db-e0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.333 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.333 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.333 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.336 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.336 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6b1bf9db-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.336 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6b1bf9db-e0, col_values=(('external_ids', {'iface-id': '6b1bf9db-e098-4d03-b185-9a64eee8cec2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e6:59:b5', 'vm-uuid': 'c925ab60-0524-40ab-a82b-52f810b9023f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:17:19 compute-0 NetworkManager[55454]: <info>  [1769102239.3389] manager: (tap6b1bf9db-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/156)
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.339 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.347 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.348 183079 INFO os_vif [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e6:59:b5,bridge_name='br-int',has_traffic_filtering=True,id=6b1bf9db-e098-4d03-b185-9a64eee8cec2,network=Network(359b74c5-cbeb-4440-a3e9-a16a51b1ab77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b1bf9db-e0')
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.348 183079 DEBUG nova.virt.libvirt.vif [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:17:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-37806664',display_name='tempest-server-test-37806664',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-37806664',id=31,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDmUdcgi0W3CCW7+0zVPZm4+SGvgMVYIF2EiVCZOZNnFUe7R59w7UIBntHS+djSABVTvpJeBQFo+aQIZ5cRYQ1WALN6CDQL2VSKr+LjaC8setfZMT5V7IalcT1iGutwkuw==',key_name='tempest-keypair-test-914827607',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4f95c62a5a194d2291b03187a9c85702',ramdisk_id='',reservation_id='r-q3uks0k4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_mod
el='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-IPv6Test-1828780436',owner_user_name='tempest-IPv6Test-1828780436-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:17:11Z,user_data=None,user_id='2fe02f7484a94091bab26aba1c370459',uuid=c925ab60-0524-40ab-a82b-52f810b9023f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6cf839fd-ff11-4ab3-a473-8a61175f769b", "address": "fa:16:3e:ae:c3:d7", "network": {"id": "3b6ff245-6da2-41fe-a6c8-3c52ded12515", "bridge": "br-int", "label": "tempest-test-network--1156553269", "subnets": [{"cidr": "2001:db8:0:2::/64", "dns": [], "gateway": {"address": "2001:db8:0:2::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:2:f816:3eff:feae:c3d7", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cf839fd-ff", "ovs_interfaceid": "6cf839fd-ff11-4ab3-a473-8a61175f769b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.349 183079 DEBUG nova.network.os_vif_util [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converting VIF {"id": "6cf839fd-ff11-4ab3-a473-8a61175f769b", "address": "fa:16:3e:ae:c3:d7", "network": {"id": "3b6ff245-6da2-41fe-a6c8-3c52ded12515", "bridge": "br-int", "label": "tempest-test-network--1156553269", "subnets": [{"cidr": "2001:db8:0:2::/64", "dns": [], "gateway": {"address": "2001:db8:0:2::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:2:f816:3eff:feae:c3d7", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cf839fd-ff", "ovs_interfaceid": "6cf839fd-ff11-4ab3-a473-8a61175f769b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.349 183079 DEBUG nova.network.os_vif_util [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ae:c3:d7,bridge_name='br-int',has_traffic_filtering=True,id=6cf839fd-ff11-4ab3-a473-8a61175f769b,network=Network(3b6ff245-6da2-41fe-a6c8-3c52ded12515),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6cf839fd-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.349 183079 DEBUG os_vif [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ae:c3:d7,bridge_name='br-int',has_traffic_filtering=True,id=6cf839fd-ff11-4ab3-a473-8a61175f769b,network=Network(3b6ff245-6da2-41fe-a6c8-3c52ded12515),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6cf839fd-ff') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.350 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.350 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.350 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.352 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.352 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6cf839fd-ff, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.352 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6cf839fd-ff, col_values=(('external_ids', {'iface-id': '6cf839fd-ff11-4ab3-a473-8a61175f769b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ae:c3:d7', 'vm-uuid': 'c925ab60-0524-40ab-a82b-52f810b9023f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:17:19 compute-0 NetworkManager[55454]: <info>  [1769102239.3544] manager: (tap6cf839fd-ff): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/157)
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.355 183079 DEBUG nova.network.neutron [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Successfully updated port: 9a06288c-d8e5-43c4-9559-23674152a05e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.356 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.361 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.362 183079 INFO os_vif [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ae:c3:d7,bridge_name='br-int',has_traffic_filtering=True,id=6cf839fd-ff11-4ab3-a473-8a61175f769b,network=Network(3b6ff245-6da2-41fe-a6c8-3c52ded12515),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6cf839fd-ff')
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.366 183079 DEBUG oslo_concurrency.lockutils [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Acquiring lock "refresh_cache-02af6288-0bd3-438c-982d-f36b31e1a9bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.366 183079 DEBUG oslo_concurrency.lockutils [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Acquired lock "refresh_cache-02af6288-0bd3-438c-982d-f36b31e1a9bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.366 183079 DEBUG nova.network.neutron [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.404 183079 DEBUG nova.virt.libvirt.driver [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.404 183079 DEBUG nova.virt.libvirt.driver [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] No VIF found with MAC fa:16:3e:e6:59:b5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.404 183079 DEBUG nova.virt.libvirt.driver [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] No VIF found with MAC fa:16:3e:ae:c3:d7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:17:19 compute-0 NetworkManager[55454]: <info>  [1769102239.4711] manager: (tap6b1bf9db-e0): new Tun device (/org/freedesktop/NetworkManager/Devices/158)
Jan 22 17:17:19 compute-0 kernel: tap6b1bf9db-e0: entered promiscuous mode
Jan 22 17:17:19 compute-0 ovn_controller[95372]: 2026-01-22T17:17:19Z|00373|binding|INFO|Claiming lport 6b1bf9db-e098-4d03-b185-9a64eee8cec2 for this chassis.
Jan 22 17:17:19 compute-0 ovn_controller[95372]: 2026-01-22T17:17:19Z|00374|binding|INFO|6b1bf9db-e098-4d03-b185-9a64eee8cec2: Claiming fa:16:3e:e6:59:b5 10.100.0.12
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.474 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:19.482 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e6:59:b5 10.100.0.12'], port_security=['fa:16:3e:e6:59:b5 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c925ab60-0524-40ab-a82b-52f810b9023f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-359b74c5-cbeb-4440-a3e9-a16a51b1ab77', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4f95c62a5a194d2291b03187a9c85702', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1b5e2b25-1ae0-464c-ac9a-7fc65ac893a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee297cf4-fb08-4758-bba6-b8b00aaf6678, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=6b1bf9db-e098-4d03-b185-9a64eee8cec2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:19.483 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 6b1bf9db-e098-4d03-b185-9a64eee8cec2 in datapath 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 bound to our chassis
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:19.485 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 359b74c5-cbeb-4440-a3e9-a16a51b1ab77
Jan 22 17:17:19 compute-0 ovn_controller[95372]: 2026-01-22T17:17:19Z|00375|binding|INFO|Setting lport 6b1bf9db-e098-4d03-b185-9a64eee8cec2 ovn-installed in OVS
Jan 22 17:17:19 compute-0 ovn_controller[95372]: 2026-01-22T17:17:19Z|00376|binding|INFO|Setting lport 6b1bf9db-e098-4d03-b185-9a64eee8cec2 up in Southbound
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.492 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:19 compute-0 kernel: tap6cf839fd-ff: entered promiscuous mode
Jan 22 17:17:19 compute-0 NetworkManager[55454]: <info>  [1769102239.4965] manager: (tap6cf839fd-ff): new Tun device (/org/freedesktop/NetworkManager/Devices/159)
Jan 22 17:17:19 compute-0 ovn_controller[95372]: 2026-01-22T17:17:19Z|00377|binding|INFO|Claiming lport 6cf839fd-ff11-4ab3-a473-8a61175f769b for this chassis.
Jan 22 17:17:19 compute-0 ovn_controller[95372]: 2026-01-22T17:17:19Z|00378|binding|INFO|6cf839fd-ff11-4ab3-a473-8a61175f769b: Claiming fa:16:3e:ae:c3:d7 2001:db8:0:2:f816:3eff:feae:c3d7
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.498 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:19.497 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[728032f9-ad4a-405d-96a0-d01b646dbc22]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:19.498 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap359b74c5-c1 in ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:19.500 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap359b74c5-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:19.500 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f820ff48-40f1-4b4f-9b3a-7461a3a71f95]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:19.501 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[122ded7e-119c-4738-8a40-fedcd2d63eae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:19.505 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ae:c3:d7 2001:db8:0:2:f816:3eff:feae:c3d7'], port_security=['fa:16:3e:ae:c3:d7 2001:db8:0:2:f816:3eff:feae:c3d7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:2:f816:3eff:feae:c3d7/64', 'neutron:device_id': 'c925ab60-0524-40ab-a82b-52f810b9023f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3b6ff245-6da2-41fe-a6c8-3c52ded12515', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4f95c62a5a194d2291b03187a9c85702', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1b5e2b25-1ae0-464c-ac9a-7fc65ac893a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=29e660d1-e0aa-49fa-ba81-62cbb14776f0, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=6cf839fd-ff11-4ab3-a473-8a61175f769b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:17:19 compute-0 systemd-udevd[224663]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:17:19 compute-0 systemd-udevd[224665]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:19.512 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[bbcd4797-d100-4c40-bd1c-aede99ba33de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.516 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.517 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:19 compute-0 ovn_controller[95372]: 2026-01-22T17:17:19Z|00379|binding|INFO|Setting lport 6cf839fd-ff11-4ab3-a473-8a61175f769b ovn-installed in OVS
Jan 22 17:17:19 compute-0 ovn_controller[95372]: 2026-01-22T17:17:19Z|00380|binding|INFO|Setting lport 6cf839fd-ff11-4ab3-a473-8a61175f769b up in Southbound
Jan 22 17:17:19 compute-0 NetworkManager[55454]: <info>  [1769102239.5254] device (tap6b1bf9db-e0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:17:19 compute-0 NetworkManager[55454]: <info>  [1769102239.5269] device (tap6cf839fd-ff): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:17:19 compute-0 NetworkManager[55454]: <info>  [1769102239.5274] device (tap6b1bf9db-e0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:17:19 compute-0 NetworkManager[55454]: <info>  [1769102239.5280] device (tap6cf839fd-ff): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:19.536 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[05b4d1c5-dc7b-4567-89f8-3214b5a60268]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:19 compute-0 systemd-machined[154382]: New machine qemu-31-instance-0000001f.
Jan 22 17:17:19 compute-0 systemd[1]: Started Virtual Machine qemu-31-instance-0000001f.
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.554 183079 DEBUG nova.network.neutron [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:19.561 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[208315c1-bac1-4f68-a854-42ffde903468]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:19 compute-0 NetworkManager[55454]: <info>  [1769102239.5685] manager: (tap359b74c5-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/160)
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:19.568 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2cd2b457-927d-4408-a210-9c2a0d8ca5e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:19.598 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[06c51af7-c101-49fe-93a7-3ee9d40a9846]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:19.602 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[da85b873-222f-4999-8f58-03ef6730b0e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:19 compute-0 NetworkManager[55454]: <info>  [1769102239.6275] device (tap359b74c5-c0): carrier: link connected
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:19.634 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[9b9f7927-b668-4402-803b-5f952b235885]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:19.652 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2b2acb9b-0e5c-421d-a7ce-33a9f17a6eec]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap359b74c5-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e7:36:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 105], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 460332, 'reachable_time': 18220, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224699, 'error': None, 'target': 'ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:19.668 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[43f9ef04-5292-478b-af82-ca1f2e514170]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee7:3633'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 460332, 'tstamp': 460332}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224700, 'error': None, 'target': 'ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:19.686 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[19d6cb06-7728-4e4a-97d0-f0b44da3d270]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap359b74c5-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e7:36:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 105], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 460332, 'reachable_time': 18220, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224701, 'error': None, 'target': 'ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.709 183079 DEBUG nova.compute.manager [req-10db3420-e872-4552-92cf-02209333175b req-43b66a7b-2ca0-4c71-b854-5d705156dbc7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Received event network-vif-plugged-6b1bf9db-e098-4d03-b185-9a64eee8cec2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.709 183079 DEBUG oslo_concurrency.lockutils [req-10db3420-e872-4552-92cf-02209333175b req-43b66a7b-2ca0-4c71-b854-5d705156dbc7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.709 183079 DEBUG oslo_concurrency.lockutils [req-10db3420-e872-4552-92cf-02209333175b req-43b66a7b-2ca0-4c71-b854-5d705156dbc7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.710 183079 DEBUG oslo_concurrency.lockutils [req-10db3420-e872-4552-92cf-02209333175b req-43b66a7b-2ca0-4c71-b854-5d705156dbc7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.710 183079 DEBUG nova.compute.manager [req-10db3420-e872-4552-92cf-02209333175b req-43b66a7b-2ca0-4c71-b854-5d705156dbc7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Processing event network-vif-plugged-6b1bf9db-e098-4d03-b185-9a64eee8cec2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:19.720 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[92b6290e-a7e2-43c0-aa35-f0657a751bc2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:19.793 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[00b9f99d-b69b-4468-8e4a-f86c23414b1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:19.795 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap359b74c5-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:19.796 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:19.797 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap359b74c5-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.808 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.808 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.808 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.808 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:17:19 compute-0 NetworkManager[55454]: <info>  [1769102239.8377] manager: (tap359b74c5-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/161)
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.838 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:19 compute-0 kernel: tap359b74c5-c0: entered promiscuous mode
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:19.842 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap359b74c5-c0, col_values=(('external_ids', {'iface-id': '705c199a-731e-4515-b4ee-a538f73a29f1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:17:19 compute-0 ovn_controller[95372]: 2026-01-22T17:17:19Z|00381|binding|INFO|Releasing lport 705c199a-731e-4515-b4ee-a538f73a29f1 from this chassis (sb_readonly=0)
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.844 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.867 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:19.868 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/359b74c5-cbeb-4440-a3e9-a16a51b1ab77.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/359b74c5-cbeb-4440-a3e9-a16a51b1ab77.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:19.870 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ae4e8743-ca28-4f15-90c9-4d78e2e33c29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:19.871 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/359b74c5-cbeb-4440-a3e9-a16a51b1ab77.pid.haproxy
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 359b74c5-cbeb-4440-a3e9-a16a51b1ab77
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:17:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:19.872 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77', 'env', 'PROCESS_TAG=haproxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/359b74c5-cbeb-4440-a3e9-a16a51b1ab77.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.893 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102239.8917282, c925ab60-0524-40ab-a82b-52f810b9023f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.893 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] VM Started (Lifecycle Event)
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.901 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c925ab60-0524-40ab-a82b-52f810b9023f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.928 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.932 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102239.8920307, c925ab60-0524-40ab-a82b-52f810b9023f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.932 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] VM Paused (Lifecycle Event)
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.957 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.960 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.984 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.987 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c925ab60-0524-40ab-a82b-52f810b9023f/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:17:19 compute-0 nova_compute[183075]: 2026-01-22 17:17:19.988 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c925ab60-0524-40ab-a82b-52f810b9023f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.045 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c925ab60-0524-40ab-a82b-52f810b9023f/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.244 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.245 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5648MB free_disk=73.36653137207031GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.245 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.245 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:17:20 compute-0 podman[224748]: 2026-01-22 17:17:20.261926261 +0000 UTC m=+0.050110680 container create 041e28fca1f7500546a492b8274ffdd0943b67da426adbbf4410bcfd2f008135 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:17:20 compute-0 systemd[1]: Started libpod-conmon-041e28fca1f7500546a492b8274ffdd0943b67da426adbbf4410bcfd2f008135.scope.
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.317 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance c925ab60-0524-40ab-a82b-52f810b9023f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.317 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 02af6288-0bd3-438c-982d-f36b31e1a9bf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.318 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.318 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:17:20 compute-0 podman[224748]: 2026-01-22 17:17:20.233384325 +0000 UTC m=+0.021568764 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:17:20 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:17:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1ba61d2a984fcf8a3b7b4e33aad2c392ec4e190543aa7d97027ace06a14863c9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:17:20 compute-0 podman[224748]: 2026-01-22 17:17:20.358758171 +0000 UTC m=+0.146942650 container init 041e28fca1f7500546a492b8274ffdd0943b67da426adbbf4410bcfd2f008135 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 22 17:17:20 compute-0 podman[224748]: 2026-01-22 17:17:20.368356882 +0000 UTC m=+0.156541341 container start 041e28fca1f7500546a492b8274ffdd0943b67da426adbbf4410bcfd2f008135 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.382 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.397 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:17:20 compute-0 neutron-haproxy-ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[224764]: [NOTICE]   (224768) : New worker (224770) forked
Jan 22 17:17:20 compute-0 neutron-haproxy-ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[224764]: [NOTICE]   (224768) : Loading success.
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.417 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.417 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:20.441 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 6cf839fd-ff11-4ab3-a473-8a61175f769b in datapath 3b6ff245-6da2-41fe-a6c8-3c52ded12515 unbound from our chassis
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:20.445 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3b6ff245-6da2-41fe-a6c8-3c52ded12515
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:20.456 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a106cd4b-3284-466a-9534-7d463c53d99a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:20.457 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3b6ff245-61 in ovnmeta-3b6ff245-6da2-41fe-a6c8-3c52ded12515 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:20.459 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3b6ff245-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:20.459 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[425960b1-8903-4e24-a14a-d42ddeea606c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:20.460 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[081e4db5-2084-4bfb-9bca-576b88787b7a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:20.474 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[e107d8a1-bc0a-44fd-bd70-f8345ade55bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:20.493 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f4ce6ce7-be79-4e28-a886-0ffe6bc073a0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:20.531 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[180e3e56-0ab2-4dbb-a6fb-1ff581799dab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:20.541 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[cb04d1f2-43c5-4cf9-94ae-f91cc111007c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:20 compute-0 NetworkManager[55454]: <info>  [1769102240.5424] manager: (tap3b6ff245-60): new Veth device (/org/freedesktop/NetworkManager/Devices/162)
Jan 22 17:17:20 compute-0 systemd-udevd[224687]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:20.601 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[27d01fd3-2c53-490f-a8d8-df7a82842401]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:20.606 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[0ae44db1-6ff9-4382-b8ec-1bd9eb6fb640]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:20 compute-0 NetworkManager[55454]: <info>  [1769102240.6486] device (tap3b6ff245-60): carrier: link connected
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:20.657 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[b002fc17-8c26-48f7-8aed-1659ade7f12d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:20.686 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[55974042-5a3c-4907-8cfa-1750aeaff9b6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3b6ff245-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:fe:35'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 106], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 460434, 'reachable_time': 31104, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224789, 'error': None, 'target': 'ovnmeta-3b6ff245-6da2-41fe-a6c8-3c52ded12515', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:20.710 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4a1a381a-2b36-4085-8efd-ee6e86f780dc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb2:fe35'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 460434, 'tstamp': 460434}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224790, 'error': None, 'target': 'ovnmeta-3b6ff245-6da2-41fe-a6c8-3c52ded12515', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:20.736 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[dbf5d4b2-a583-4f04-bd6f-6c513bbbb553]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3b6ff245-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:fe:35'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 106], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 460434, 'reachable_time': 31104, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224791, 'error': None, 'target': 'ovnmeta-3b6ff245-6da2-41fe-a6c8-3c52ded12515', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.757 183079 DEBUG nova.network.neutron [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Updating instance_info_cache with network_info: [{"id": "9a06288c-d8e5-43c4-9559-23674152a05e", "address": "fa:16:3e:b0:5d:07", "network": {"id": "b8340cc9-e27b-4dbd-8de5-9c101e7b64ce", "bridge": "br-int", "label": "tempest-test-network--400188420", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70a37fbbd795434fbeb722ad97dda552", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a06288c-d8", "ovs_interfaceid": "9a06288c-d8e5-43c4-9559-23674152a05e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.781 183079 DEBUG oslo_concurrency.lockutils [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Releasing lock "refresh_cache-02af6288-0bd3-438c-982d-f36b31e1a9bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.782 183079 DEBUG nova.compute.manager [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Instance network_info: |[{"id": "9a06288c-d8e5-43c4-9559-23674152a05e", "address": "fa:16:3e:b0:5d:07", "network": {"id": "b8340cc9-e27b-4dbd-8de5-9c101e7b64ce", "bridge": "br-int", "label": "tempest-test-network--400188420", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70a37fbbd795434fbeb722ad97dda552", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a06288c-d8", "ovs_interfaceid": "9a06288c-d8e5-43c4-9559-23674152a05e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.786 183079 DEBUG nova.virt.libvirt.driver [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Start _get_guest_xml network_info=[{"id": "9a06288c-d8e5-43c4-9559-23674152a05e", "address": "fa:16:3e:b0:5d:07", "network": {"id": "b8340cc9-e27b-4dbd-8de5-9c101e7b64ce", "bridge": "br-int", "label": "tempest-test-network--400188420", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70a37fbbd795434fbeb722ad97dda552", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a06288c-d8", "ovs_interfaceid": "9a06288c-d8e5-43c4-9559-23674152a05e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:20.789 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7c2c1b5e-03b3-4c56-8e84-e3e405d50689]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.793 183079 WARNING nova.virt.libvirt.driver [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.798 183079 DEBUG nova.virt.libvirt.host [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.800 183079 DEBUG nova.virt.libvirt.host [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.803 183079 DEBUG nova.virt.libvirt.host [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.804 183079 DEBUG nova.virt.libvirt.host [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.805 183079 DEBUG nova.virt.libvirt.driver [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.805 183079 DEBUG nova.virt.hardware [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.806 183079 DEBUG nova.virt.hardware [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.806 183079 DEBUG nova.virt.hardware [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.807 183079 DEBUG nova.virt.hardware [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.807 183079 DEBUG nova.virt.hardware [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.807 183079 DEBUG nova.virt.hardware [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.808 183079 DEBUG nova.virt.hardware [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.808 183079 DEBUG nova.virt.hardware [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.809 183079 DEBUG nova.virt.hardware [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.809 183079 DEBUG nova.virt.hardware [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.810 183079 DEBUG nova.virt.hardware [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.816 183079 DEBUG nova.virt.libvirt.vif [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:17:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-571651884',display_name='tempest-server-test-571651884',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-571651884',id=32,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOheSF+/g7pqjxa37mBWTVnLSWr9OoUZA+yJcO7BU9vrZDKpB0HwI4MttcuyJijhiuyAewJavO9K5NemBxxQoaBd71z7dq8hTIGwLmdOggCBA+UUuizOD4iEYMwLvvpiWQ==',key_name='tempest-keypair-test-332367794',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70a37fbbd795434fbeb722ad97dda552',ramdisk_id='',reservation_id='r-smyxjhwi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPQosTest-2049146665',owner_user_name='tempest-FloatingIPQosTest-2049146665-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:17:17Z,user_data=None,user_id='3d4b274173814c359055fed8dfc2bdeb',uuid=02af6288-0bd3-438c-982d-f36b31e1a9bf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9a06288c-d8e5-43c4-9559-23674152a05e", "address": "fa:16:3e:b0:5d:07", "network": {"id": "b8340cc9-e27b-4dbd-8de5-9c101e7b64ce", "bridge": "br-int", "label": "tempest-test-network--400188420", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70a37fbbd795434fbeb722ad97dda552", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a06288c-d8", "ovs_interfaceid": "9a06288c-d8e5-43c4-9559-23674152a05e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.817 183079 DEBUG nova.network.os_vif_util [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Converting VIF {"id": "9a06288c-d8e5-43c4-9559-23674152a05e", "address": "fa:16:3e:b0:5d:07", "network": {"id": "b8340cc9-e27b-4dbd-8de5-9c101e7b64ce", "bridge": "br-int", "label": "tempest-test-network--400188420", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70a37fbbd795434fbeb722ad97dda552", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a06288c-d8", "ovs_interfaceid": "9a06288c-d8e5-43c4-9559-23674152a05e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.818 183079 DEBUG nova.network.os_vif_util [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b0:5d:07,bridge_name='br-int',has_traffic_filtering=True,id=9a06288c-d8e5-43c4-9559-23674152a05e,network=Network(b8340cc9-e27b-4dbd-8de5-9c101e7b64ce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a06288c-d8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.821 183079 DEBUG nova.objects.instance [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Lazy-loading 'pci_devices' on Instance uuid 02af6288-0bd3-438c-982d-f36b31e1a9bf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.828 183079 DEBUG nova.compute.manager [req-3b43f9ba-24e6-48a3-8a21-11eb44e0acf0 req-5bdfc9a1-ea9a-48f3-b3aa-ee80bfd47c1e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Received event network-changed-9a06288c-d8e5-43c4-9559-23674152a05e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.829 183079 DEBUG nova.compute.manager [req-3b43f9ba-24e6-48a3-8a21-11eb44e0acf0 req-5bdfc9a1-ea9a-48f3-b3aa-ee80bfd47c1e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Refreshing instance network info cache due to event network-changed-9a06288c-d8e5-43c4-9559-23674152a05e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.829 183079 DEBUG oslo_concurrency.lockutils [req-3b43f9ba-24e6-48a3-8a21-11eb44e0acf0 req-5bdfc9a1-ea9a-48f3-b3aa-ee80bfd47c1e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-02af6288-0bd3-438c-982d-f36b31e1a9bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.830 183079 DEBUG oslo_concurrency.lockutils [req-3b43f9ba-24e6-48a3-8a21-11eb44e0acf0 req-5bdfc9a1-ea9a-48f3-b3aa-ee80bfd47c1e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-02af6288-0bd3-438c-982d-f36b31e1a9bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.830 183079 DEBUG nova.network.neutron [req-3b43f9ba-24e6-48a3-8a21-11eb44e0acf0 req-5bdfc9a1-ea9a-48f3-b3aa-ee80bfd47c1e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Refreshing network info cache for port 9a06288c-d8e5-43c4-9559-23674152a05e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:20.834 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[217aa62e-abfd-409e-97d5-2dcc20e464e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:20.836 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3b6ff245-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:20.836 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:20.837 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3b6ff245-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.838 183079 DEBUG nova.virt.libvirt.driver [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:17:20 compute-0 nova_compute[183075]:   <uuid>02af6288-0bd3-438c-982d-f36b31e1a9bf</uuid>
Jan 22 17:17:20 compute-0 nova_compute[183075]:   <name>instance-00000020</name>
Jan 22 17:17:20 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:17:20 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:17:20 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:17:20 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-571651884</nova:name>
Jan 22 17:17:20 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:17:20</nova:creationTime>
Jan 22 17:17:20 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:17:20 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:17:20 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:17:20 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:17:20 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:17:20 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:17:20 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:17:20 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:17:20 compute-0 nova_compute[183075]:         <nova:user uuid="3d4b274173814c359055fed8dfc2bdeb">tempest-FloatingIPQosTest-2049146665-project-member</nova:user>
Jan 22 17:17:20 compute-0 nova_compute[183075]:         <nova:project uuid="70a37fbbd795434fbeb722ad97dda552">tempest-FloatingIPQosTest-2049146665</nova:project>
Jan 22 17:17:20 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:17:20 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:17:20 compute-0 nova_compute[183075]:         <nova:port uuid="9a06288c-d8e5-43c4-9559-23674152a05e">
Jan 22 17:17:20 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.27" ipVersion="4"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:17:20 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:17:20 compute-0 kernel: tap3b6ff245-60: entered promiscuous mode
Jan 22 17:17:20 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:17:20 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:17:20 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <system>
Jan 22 17:17:20 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:17:20 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:17:20 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:17:20 compute-0 nova_compute[183075]:       <entry name="serial">02af6288-0bd3-438c-982d-f36b31e1a9bf</entry>
Jan 22 17:17:20 compute-0 nova_compute[183075]:       <entry name="uuid">02af6288-0bd3-438c-982d-f36b31e1a9bf</entry>
Jan 22 17:17:20 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     </system>
Jan 22 17:17:20 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:17:20 compute-0 nova_compute[183075]:   <os>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:   </os>
Jan 22 17:17:20 compute-0 nova_compute[183075]:   <features>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:   </features>
Jan 22 17:17:20 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:17:20 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:17:20 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:17:20 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/02af6288-0bd3-438c-982d-f36b31e1a9bf/disk"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:17:20 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:b0:5d:07"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:       <target dev="tap9a06288c-d8"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:17:20 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/02af6288-0bd3-438c-982d-f36b31e1a9bf/console.log" append="off"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <video>
Jan 22 17:17:20 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     </video>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:17:20 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:17:20 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:17:20 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:17:20 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:17:20 compute-0 nova_compute[183075]: </domain>
Jan 22 17:17:20 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:17:20 compute-0 NetworkManager[55454]: <info>  [1769102240.8416] manager: (tap3b6ff245-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/163)
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.841 183079 DEBUG nova.compute.manager [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Preparing to wait for external event network-vif-plugged-9a06288c-d8e5-43c4-9559-23674152a05e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.841 183079 DEBUG oslo_concurrency.lockutils [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Acquiring lock "02af6288-0bd3-438c-982d-f36b31e1a9bf-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.842 183079 DEBUG oslo_concurrency.lockutils [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Lock "02af6288-0bd3-438c-982d-f36b31e1a9bf-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.842 183079 DEBUG oslo_concurrency.lockutils [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Lock "02af6288-0bd3-438c-982d-f36b31e1a9bf-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.845 183079 DEBUG nova.virt.libvirt.vif [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:17:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-571651884',display_name='tempest-server-test-571651884',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-571651884',id=32,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOheSF+/g7pqjxa37mBWTVnLSWr9OoUZA+yJcO7BU9vrZDKpB0HwI4MttcuyJijhiuyAewJavO9K5NemBxxQoaBd71z7dq8hTIGwLmdOggCBA+UUuizOD4iEYMwLvvpiWQ==',key_name='tempest-keypair-test-332367794',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70a37fbbd795434fbeb722ad97dda552',ramdisk_id='',reservation_id='r-smyxjhwi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPQosTest-2049146665',owner_user_name='tempest-FloatingIPQosTest-2049146665-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:17:17Z,user_data=None,user_id='3d4b274173814c359055fed8dfc2bdeb',uuid=02af6288-0bd3-438c-982d-f36b31e1a9bf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9a06288c-d8e5-43c4-9559-23674152a05e", "address": "fa:16:3e:b0:5d:07", "network": {"id": "b8340cc9-e27b-4dbd-8de5-9c101e7b64ce", "bridge": "br-int", "label": "tempest-test-network--400188420", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70a37fbbd795434fbeb722ad97dda552", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a06288c-d8", "ovs_interfaceid": "9a06288c-d8e5-43c4-9559-23674152a05e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.846 183079 DEBUG nova.network.os_vif_util [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Converting VIF {"id": "9a06288c-d8e5-43c4-9559-23674152a05e", "address": "fa:16:3e:b0:5d:07", "network": {"id": "b8340cc9-e27b-4dbd-8de5-9c101e7b64ce", "bridge": "br-int", "label": "tempest-test-network--400188420", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70a37fbbd795434fbeb722ad97dda552", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a06288c-d8", "ovs_interfaceid": "9a06288c-d8e5-43c4-9559-23674152a05e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:20.847 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3b6ff245-60, col_values=(('external_ids', {'iface-id': '2f605645-0ecc-448a-928d-43b9a99fc5dc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.848 183079 DEBUG nova.network.os_vif_util [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b0:5d:07,bridge_name='br-int',has_traffic_filtering=True,id=9a06288c-d8e5-43c4-9559-23674152a05e,network=Network(b8340cc9-e27b-4dbd-8de5-9c101e7b64ce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a06288c-d8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:17:20 compute-0 ovn_controller[95372]: 2026-01-22T17:17:20Z|00382|binding|INFO|Releasing lport 2f605645-0ecc-448a-928d-43b9a99fc5dc from this chassis (sb_readonly=0)
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.849 183079 DEBUG os_vif [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b0:5d:07,bridge_name='br-int',has_traffic_filtering=True,id=9a06288c-d8e5-43c4-9559-23674152a05e,network=Network(b8340cc9-e27b-4dbd-8de5-9c101e7b64ce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a06288c-d8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.850 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.854 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.855 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.861 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.868 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.868 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9a06288c-d8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.869 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9a06288c-d8, col_values=(('external_ids', {'iface-id': '9a06288c-d8e5-43c4-9559-23674152a05e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b0:5d:07', 'vm-uuid': '02af6288-0bd3-438c-982d-f36b31e1a9bf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.916 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:20.917 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3b6ff245-6da2-41fe-a6c8-3c52ded12515.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3b6ff245-6da2-41fe-a6c8-3c52ded12515.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.918 183079 DEBUG nova.network.neutron [req-02a7a093-5ce6-4fce-a297-b654d29b958b req-27111d8f-f2ef-4eab-9f60-731eb272bf69 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Updated VIF entry in instance network info cache for port 6cf839fd-ff11-4ab3-a473-8a61175f769b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:17:20 compute-0 NetworkManager[55454]: <info>  [1769102240.9193] manager: (tap9a06288c-d8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/164)
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.919 183079 DEBUG nova.network.neutron [req-02a7a093-5ce6-4fce-a297-b654d29b958b req-27111d8f-f2ef-4eab-9f60-731eb272bf69 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Updating instance_info_cache with network_info: [{"id": "6b1bf9db-e098-4d03-b185-9a64eee8cec2", "address": "fa:16:3e:e6:59:b5", "network": {"id": "359b74c5-cbeb-4440-a3e9-a16a51b1ab77", "bridge": "br-int", "label": "tempest-test-network--531378131", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b1bf9db-e0", "ovs_interfaceid": "6b1bf9db-e098-4d03-b185-9a64eee8cec2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6cf839fd-ff11-4ab3-a473-8a61175f769b", "address": "fa:16:3e:ae:c3:d7", "network": {"id": "3b6ff245-6da2-41fe-a6c8-3c52ded12515", "bridge": "br-int", "label": "tempest-test-network--1156553269", "subnets": [{"cidr": "2001:db8:0:2::/64", "dns": [], "gateway": {"address": "2001:db8:0:2::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:2:f816:3eff:feae:c3d7", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, 
"tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cf839fd-ff", "ovs_interfaceid": "6cf839fd-ff11-4ab3-a473-8a61175f769b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.920 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:20.920 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e1aac78b-727b-46b2-8fde-c2e9c565a189]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:20.921 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-3b6ff245-6da2-41fe-a6c8-3c52ded12515
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/3b6ff245-6da2-41fe-a6c8-3c52ded12515.pid.haproxy
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 3b6ff245-6da2-41fe-a6c8-3c52ded12515
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:17:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:20.921 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3b6ff245-6da2-41fe-a6c8-3c52ded12515', 'env', 'PROCESS_TAG=haproxy-3b6ff245-6da2-41fe-a6c8-3c52ded12515', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3b6ff245-6da2-41fe-a6c8-3c52ded12515.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.927 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.928 183079 INFO os_vif [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b0:5d:07,bridge_name='br-int',has_traffic_filtering=True,id=9a06288c-d8e5-43c4-9559-23674152a05e,network=Network(b8340cc9-e27b-4dbd-8de5-9c101e7b64ce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a06288c-d8')
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.943 183079 DEBUG oslo_concurrency.lockutils [req-02a7a093-5ce6-4fce-a297-b654d29b958b req-27111d8f-f2ef-4eab-9f60-731eb272bf69 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-c925ab60-0524-40ab-a82b-52f810b9023f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.990 183079 DEBUG nova.virt.libvirt.driver [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:17:20 compute-0 nova_compute[183075]: 2026-01-22 17:17:20.990 183079 DEBUG nova.virt.libvirt.driver [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] No VIF found with MAC fa:16:3e:b0:5d:07, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:17:21 compute-0 kernel: tap9a06288c-d8: entered promiscuous mode
Jan 22 17:17:21 compute-0 nova_compute[183075]: 2026-01-22 17:17:21.078 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:21 compute-0 ovn_controller[95372]: 2026-01-22T17:17:21Z|00383|binding|INFO|Claiming lport 9a06288c-d8e5-43c4-9559-23674152a05e for this chassis.
Jan 22 17:17:21 compute-0 ovn_controller[95372]: 2026-01-22T17:17:21Z|00384|binding|INFO|9a06288c-d8e5-43c4-9559-23674152a05e: Claiming fa:16:3e:b0:5d:07 10.100.0.27
Jan 22 17:17:21 compute-0 NetworkManager[55454]: <info>  [1769102241.0830] manager: (tap9a06288c-d8): new Tun device (/org/freedesktop/NetworkManager/Devices/165)
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:21.086 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:5d:07 10.100.0.27'], port_security=['fa:16:3e:b0:5d:07 10.100.0.27'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.27/28', 'neutron:device_id': '02af6288-0bd3-438c-982d-f36b31e1a9bf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '70a37fbbd795434fbeb722ad97dda552', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ec8664a7-160c-4633-979c-dec0eb1895f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=679d6e1b-038a-49c0-93ea-1c7e7848d42c, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=9a06288c-d8e5-43c4-9559-23674152a05e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:17:21 compute-0 NetworkManager[55454]: <info>  [1769102241.1105] device (tap9a06288c-d8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:17:21 compute-0 NetworkManager[55454]: <info>  [1769102241.1111] device (tap9a06288c-d8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:17:21 compute-0 ovn_controller[95372]: 2026-01-22T17:17:21Z|00385|binding|INFO|Setting lport 9a06288c-d8e5-43c4-9559-23674152a05e ovn-installed in OVS
Jan 22 17:17:21 compute-0 ovn_controller[95372]: 2026-01-22T17:17:21Z|00386|binding|INFO|Setting lport 9a06288c-d8e5-43c4-9559-23674152a05e up in Southbound
Jan 22 17:17:21 compute-0 nova_compute[183075]: 2026-01-22 17:17:21.117 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:21 compute-0 systemd-machined[154382]: New machine qemu-32-instance-00000020.
Jan 22 17:17:21 compute-0 systemd[1]: Started Virtual Machine qemu-32-instance-00000020.
Jan 22 17:17:21 compute-0 nova_compute[183075]: 2026-01-22 17:17:21.189 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:21 compute-0 podman[224844]: 2026-01-22 17:17:21.319190872 +0000 UTC m=+0.065541023 container create 3ed5997b2da70bd8c6fe5711915d3ed0fea452810a5e9316c87b9faa210b2db4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3b6ff245-6da2-41fe-a6c8-3c52ded12515, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 22 17:17:21 compute-0 systemd[1]: Started libpod-conmon-3ed5997b2da70bd8c6fe5711915d3ed0fea452810a5e9316c87b9faa210b2db4.scope.
Jan 22 17:17:21 compute-0 podman[224844]: 2026-01-22 17:17:21.277584975 +0000 UTC m=+0.023935166 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:17:21 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:17:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f7d1faeafc17ebfb2cb8e0422d67804b55d383fcb80ca43817fd60b6da1860e1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:17:21 compute-0 podman[224844]: 2026-01-22 17:17:21.413666071 +0000 UTC m=+0.160016202 container init 3ed5997b2da70bd8c6fe5711915d3ed0fea452810a5e9316c87b9faa210b2db4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3b6ff245-6da2-41fe-a6c8-3c52ded12515, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 22 17:17:21 compute-0 nova_compute[183075]: 2026-01-22 17:17:21.421 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102241.4207888, 02af6288-0bd3-438c-982d-f36b31e1a9bf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:17:21 compute-0 nova_compute[183075]: 2026-01-22 17:17:21.422 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] VM Started (Lifecycle Event)
Jan 22 17:17:21 compute-0 podman[224844]: 2026-01-22 17:17:21.424467183 +0000 UTC m=+0.170817284 container start 3ed5997b2da70bd8c6fe5711915d3ed0fea452810a5e9316c87b9faa210b2db4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3b6ff245-6da2-41fe-a6c8-3c52ded12515, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:17:21 compute-0 nova_compute[183075]: 2026-01-22 17:17:21.445 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:17:21 compute-0 nova_compute[183075]: 2026-01-22 17:17:21.452 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102241.421049, 02af6288-0bd3-438c-982d-f36b31e1a9bf => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:17:21 compute-0 nova_compute[183075]: 2026-01-22 17:17:21.452 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] VM Paused (Lifecycle Event)
Jan 22 17:17:21 compute-0 neutron-haproxy-ovnmeta-3b6ff245-6da2-41fe-a6c8-3c52ded12515[224866]: [NOTICE]   (224871) : New worker (224873) forked
Jan 22 17:17:21 compute-0 neutron-haproxy-ovnmeta-3b6ff245-6da2-41fe-a6c8-3c52ded12515[224866]: [NOTICE]   (224871) : Loading success.
Jan 22 17:17:21 compute-0 nova_compute[183075]: 2026-01-22 17:17:21.472 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:17:21 compute-0 nova_compute[183075]: 2026-01-22 17:17:21.475 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:17:21 compute-0 nova_compute[183075]: 2026-01-22 17:17:21.496 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:21.500 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 9a06288c-d8e5-43c4-9559-23674152a05e in datapath b8340cc9-e27b-4dbd-8de5-9c101e7b64ce unbound from our chassis
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:21.502 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b8340cc9-e27b-4dbd-8de5-9c101e7b64ce
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:21.514 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e2bab15d-ee06-438b-b87e-5dc8d88e6543]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:21.514 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb8340cc9-e1 in ovnmeta-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:21.517 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb8340cc9-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:21.517 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b395cf13-929a-43fa-8658-b16dde1b1d7b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:21.518 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[96896bb7-f591-4c33-bbdc-46c2933fce87]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:21.528 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[ebcf11e7-7224-4a0c-ac16-5fb360a83fd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:21.551 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c1973f42-cd82-40e1-b857-32ae354c51ef]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:21.575 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[58a3cfc9-111d-4220-acce-203035b216e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:21.580 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0731c9df-cac7-433a-ad51-a205d7a310ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:21 compute-0 NetworkManager[55454]: <info>  [1769102241.5812] manager: (tapb8340cc9-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/166)
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:21.618 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[32dca9de-b08b-4fae-a4ff-1ac0e468c927]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:21.622 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[e3a50c37-5bae-4b60-8a7b-e0fd33e98766]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:21 compute-0 NetworkManager[55454]: <info>  [1769102241.6451] device (tapb8340cc9-e0): carrier: link connected
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:21.650 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[25346167-b73f-4785-bf76-d19f2a7c0e84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:21.671 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6bd212e2-2785-4171-abe8-6fa390c60565]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb8340cc9-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:60:98:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 108], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 460534, 'reachable_time': 29205, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224892, 'error': None, 'target': 'ovnmeta-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:21.693 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[51da25c2-3a12-4dfa-ad58-9edfc419f807]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe60:9816'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 460534, 'tstamp': 460534}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224893, 'error': None, 'target': 'ovnmeta-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:21.721 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d752fc31-600e-4c1d-85b6-dfce8c3cfe93]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb8340cc9-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:60:98:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 108], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 460534, 'reachable_time': 29205, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224894, 'error': None, 'target': 'ovnmeta-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:21.756 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[65dfeabe-d55a-499c-8f77-1da6b3408a3c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:21 compute-0 nova_compute[183075]: 2026-01-22 17:17:21.795 183079 DEBUG nova.compute.manager [req-9490fbcb-b3a9-45f5-bffb-3caa5ae93db7 req-be5479a9-1037-4530-b383-4be22fdb5fca a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Received event network-vif-plugged-6b1bf9db-e098-4d03-b185-9a64eee8cec2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:17:21 compute-0 nova_compute[183075]: 2026-01-22 17:17:21.796 183079 DEBUG oslo_concurrency.lockutils [req-9490fbcb-b3a9-45f5-bffb-3caa5ae93db7 req-be5479a9-1037-4530-b383-4be22fdb5fca a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:17:21 compute-0 nova_compute[183075]: 2026-01-22 17:17:21.796 183079 DEBUG oslo_concurrency.lockutils [req-9490fbcb-b3a9-45f5-bffb-3caa5ae93db7 req-be5479a9-1037-4530-b383-4be22fdb5fca a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:17:21 compute-0 nova_compute[183075]: 2026-01-22 17:17:21.796 183079 DEBUG oslo_concurrency.lockutils [req-9490fbcb-b3a9-45f5-bffb-3caa5ae93db7 req-be5479a9-1037-4530-b383-4be22fdb5fca a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:17:21 compute-0 nova_compute[183075]: 2026-01-22 17:17:21.796 183079 DEBUG nova.compute.manager [req-9490fbcb-b3a9-45f5-bffb-3caa5ae93db7 req-be5479a9-1037-4530-b383-4be22fdb5fca a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] No event matching network-vif-plugged-6b1bf9db-e098-4d03-b185-9a64eee8cec2 in dict_keys([('network-vif-plugged', '6cf839fd-ff11-4ab3-a473-8a61175f769b')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325
Jan 22 17:17:21 compute-0 nova_compute[183075]: 2026-01-22 17:17:21.797 183079 WARNING nova.compute.manager [req-9490fbcb-b3a9-45f5-bffb-3caa5ae93db7 req-be5479a9-1037-4530-b383-4be22fdb5fca a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Received unexpected event network-vif-plugged-6b1bf9db-e098-4d03-b185-9a64eee8cec2 for instance with vm_state building and task_state spawning.
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:21.831 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[02f52022-ece0-4414-ab97-6611b6d23038]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:21.832 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb8340cc9-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:21.832 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:21.833 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb8340cc9-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:17:21 compute-0 nova_compute[183075]: 2026-01-22 17:17:21.834 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:21 compute-0 kernel: tapb8340cc9-e0: entered promiscuous mode
Jan 22 17:17:21 compute-0 NetworkManager[55454]: <info>  [1769102241.8356] manager: (tapb8340cc9-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/167)
Jan 22 17:17:21 compute-0 nova_compute[183075]: 2026-01-22 17:17:21.836 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:21.838 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb8340cc9-e0, col_values=(('external_ids', {'iface-id': 'cd0e1120-5be8-4515-9d17-af992fbbcf85'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:17:21 compute-0 nova_compute[183075]: 2026-01-22 17:17:21.839 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:21 compute-0 ovn_controller[95372]: 2026-01-22T17:17:21Z|00387|binding|INFO|Releasing lport cd0e1120-5be8-4515-9d17-af992fbbcf85 from this chassis (sb_readonly=0)
Jan 22 17:17:21 compute-0 nova_compute[183075]: 2026-01-22 17:17:21.849 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:21.850 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b8340cc9-e27b-4dbd-8de5-9c101e7b64ce.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b8340cc9-e27b-4dbd-8de5-9c101e7b64ce.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:21.851 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[09894d6a-b540-4cf4-8e0b-0c811e3eaf47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:21.852 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/b8340cc9-e27b-4dbd-8de5-9c101e7b64ce.pid.haproxy
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID b8340cc9-e27b-4dbd-8de5-9c101e7b64ce
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:17:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:21.852 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce', 'env', 'PROCESS_TAG=haproxy-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b8340cc9-e27b-4dbd-8de5-9c101e7b64ce.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.046 183079 DEBUG nova.network.neutron [req-3b43f9ba-24e6-48a3-8a21-11eb44e0acf0 req-5bdfc9a1-ea9a-48f3-b3aa-ee80bfd47c1e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Updated VIF entry in instance network info cache for port 9a06288c-d8e5-43c4-9559-23674152a05e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.046 183079 DEBUG nova.network.neutron [req-3b43f9ba-24e6-48a3-8a21-11eb44e0acf0 req-5bdfc9a1-ea9a-48f3-b3aa-ee80bfd47c1e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Updating instance_info_cache with network_info: [{"id": "9a06288c-d8e5-43c4-9559-23674152a05e", "address": "fa:16:3e:b0:5d:07", "network": {"id": "b8340cc9-e27b-4dbd-8de5-9c101e7b64ce", "bridge": "br-int", "label": "tempest-test-network--400188420", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70a37fbbd795434fbeb722ad97dda552", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a06288c-d8", "ovs_interfaceid": "9a06288c-d8e5-43c4-9559-23674152a05e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.060 183079 DEBUG oslo_concurrency.lockutils [req-3b43f9ba-24e6-48a3-8a21-11eb44e0acf0 req-5bdfc9a1-ea9a-48f3-b3aa-ee80bfd47c1e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-02af6288-0bd3-438c-982d-f36b31e1a9bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.060 183079 DEBUG nova.compute.manager [req-3b43f9ba-24e6-48a3-8a21-11eb44e0acf0 req-5bdfc9a1-ea9a-48f3-b3aa-ee80bfd47c1e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Received event network-vif-plugged-6cf839fd-ff11-4ab3-a473-8a61175f769b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.060 183079 DEBUG oslo_concurrency.lockutils [req-3b43f9ba-24e6-48a3-8a21-11eb44e0acf0 req-5bdfc9a1-ea9a-48f3-b3aa-ee80bfd47c1e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.061 183079 DEBUG oslo_concurrency.lockutils [req-3b43f9ba-24e6-48a3-8a21-11eb44e0acf0 req-5bdfc9a1-ea9a-48f3-b3aa-ee80bfd47c1e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.061 183079 DEBUG oslo_concurrency.lockutils [req-3b43f9ba-24e6-48a3-8a21-11eb44e0acf0 req-5bdfc9a1-ea9a-48f3-b3aa-ee80bfd47c1e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.061 183079 DEBUG nova.compute.manager [req-3b43f9ba-24e6-48a3-8a21-11eb44e0acf0 req-5bdfc9a1-ea9a-48f3-b3aa-ee80bfd47c1e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Processing event network-vif-plugged-6cf839fd-ff11-4ab3-a473-8a61175f769b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.062 183079 DEBUG nova.compute.manager [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Instance event wait completed in 2 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.066 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102242.0662599, c925ab60-0524-40ab-a82b-52f810b9023f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.066 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] VM Resumed (Lifecycle Event)
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.068 183079 DEBUG nova.virt.libvirt.driver [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.072 183079 INFO nova.virt.libvirt.driver [-] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Instance spawned successfully.
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.072 183079 DEBUG nova.virt.libvirt.driver [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.086 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.092 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.095 183079 DEBUG nova.virt.libvirt.driver [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.096 183079 DEBUG nova.virt.libvirt.driver [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.097 183079 DEBUG nova.virt.libvirt.driver [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.097 183079 DEBUG nova.virt.libvirt.driver [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.098 183079 DEBUG nova.virt.libvirt.driver [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.099 183079 DEBUG nova.virt.libvirt.driver [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.107 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.161 183079 INFO nova.compute.manager [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Took 10.36 seconds to spawn the instance on the hypervisor.
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.161 183079 DEBUG nova.compute.manager [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.274 183079 INFO nova.compute.manager [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Took 11.09 seconds to build instance.
Jan 22 17:17:22 compute-0 podman[224926]: 2026-01-22 17:17:22.289165093 +0000 UTC m=+0.056081236 container create d70199136438c29fa3a929bd121f7aaccaabfca76249cd9ac5bd414d39cb8b1a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.289 183079 DEBUG oslo_concurrency.lockutils [None req-fe1ebe0c-d111-4682-abcf-8fe36119ab7e 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "c925ab60-0524-40ab-a82b-52f810b9023f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.220s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:17:22 compute-0 systemd[1]: Started libpod-conmon-d70199136438c29fa3a929bd121f7aaccaabfca76249cd9ac5bd414d39cb8b1a.scope.
Jan 22 17:17:22 compute-0 podman[224926]: 2026-01-22 17:17:22.256990162 +0000 UTC m=+0.023906355 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:17:22 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:17:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/84aed5423a13ebe8528ab2577273d5ba1594bac999ee5ce19a6d35e9fcf523ac/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:17:22 compute-0 podman[224926]: 2026-01-22 17:17:22.384648257 +0000 UTC m=+0.151564410 container init d70199136438c29fa3a929bd121f7aaccaabfca76249cd9ac5bd414d39cb8b1a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 22 17:17:22 compute-0 podman[224926]: 2026-01-22 17:17:22.394336691 +0000 UTC m=+0.161252834 container start d70199136438c29fa3a929bd121f7aaccaabfca76249cd9ac5bd414d39cb8b1a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 22 17:17:22 compute-0 neutron-haproxy-ovnmeta-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce[224941]: [NOTICE]   (224945) : New worker (224947) forked
Jan 22 17:17:22 compute-0 neutron-haproxy-ovnmeta-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce[224941]: [NOTICE]   (224945) : Loading success.
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.897 183079 DEBUG nova.compute.manager [req-5a26efd9-34e7-415f-bdfa-97e98aa756db req-010599b1-37f4-4884-b273-2e88d4d061b4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Received event network-vif-plugged-6cf839fd-ff11-4ab3-a473-8a61175f769b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.898 183079 DEBUG oslo_concurrency.lockutils [req-5a26efd9-34e7-415f-bdfa-97e98aa756db req-010599b1-37f4-4884-b273-2e88d4d061b4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.898 183079 DEBUG oslo_concurrency.lockutils [req-5a26efd9-34e7-415f-bdfa-97e98aa756db req-010599b1-37f4-4884-b273-2e88d4d061b4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.899 183079 DEBUG oslo_concurrency.lockutils [req-5a26efd9-34e7-415f-bdfa-97e98aa756db req-010599b1-37f4-4884-b273-2e88d4d061b4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.899 183079 DEBUG nova.compute.manager [req-5a26efd9-34e7-415f-bdfa-97e98aa756db req-010599b1-37f4-4884-b273-2e88d4d061b4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] No waiting events found dispatching network-vif-plugged-6cf839fd-ff11-4ab3-a473-8a61175f769b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.900 183079 WARNING nova.compute.manager [req-5a26efd9-34e7-415f-bdfa-97e98aa756db req-010599b1-37f4-4884-b273-2e88d4d061b4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Received unexpected event network-vif-plugged-6cf839fd-ff11-4ab3-a473-8a61175f769b for instance with vm_state active and task_state None.
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.900 183079 DEBUG nova.compute.manager [req-5a26efd9-34e7-415f-bdfa-97e98aa756db req-010599b1-37f4-4884-b273-2e88d4d061b4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Received event network-vif-plugged-9a06288c-d8e5-43c4-9559-23674152a05e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.900 183079 DEBUG oslo_concurrency.lockutils [req-5a26efd9-34e7-415f-bdfa-97e98aa756db req-010599b1-37f4-4884-b273-2e88d4d061b4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "02af6288-0bd3-438c-982d-f36b31e1a9bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.901 183079 DEBUG oslo_concurrency.lockutils [req-5a26efd9-34e7-415f-bdfa-97e98aa756db req-010599b1-37f4-4884-b273-2e88d4d061b4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "02af6288-0bd3-438c-982d-f36b31e1a9bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.901 183079 DEBUG oslo_concurrency.lockutils [req-5a26efd9-34e7-415f-bdfa-97e98aa756db req-010599b1-37f4-4884-b273-2e88d4d061b4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "02af6288-0bd3-438c-982d-f36b31e1a9bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.902 183079 DEBUG nova.compute.manager [req-5a26efd9-34e7-415f-bdfa-97e98aa756db req-010599b1-37f4-4884-b273-2e88d4d061b4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Processing event network-vif-plugged-9a06288c-d8e5-43c4-9559-23674152a05e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.902 183079 DEBUG nova.compute.manager [req-5a26efd9-34e7-415f-bdfa-97e98aa756db req-010599b1-37f4-4884-b273-2e88d4d061b4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Received event network-vif-plugged-9a06288c-d8e5-43c4-9559-23674152a05e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.903 183079 DEBUG oslo_concurrency.lockutils [req-5a26efd9-34e7-415f-bdfa-97e98aa756db req-010599b1-37f4-4884-b273-2e88d4d061b4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "02af6288-0bd3-438c-982d-f36b31e1a9bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.903 183079 DEBUG oslo_concurrency.lockutils [req-5a26efd9-34e7-415f-bdfa-97e98aa756db req-010599b1-37f4-4884-b273-2e88d4d061b4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "02af6288-0bd3-438c-982d-f36b31e1a9bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.904 183079 DEBUG oslo_concurrency.lockutils [req-5a26efd9-34e7-415f-bdfa-97e98aa756db req-010599b1-37f4-4884-b273-2e88d4d061b4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "02af6288-0bd3-438c-982d-f36b31e1a9bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.904 183079 DEBUG nova.compute.manager [req-5a26efd9-34e7-415f-bdfa-97e98aa756db req-010599b1-37f4-4884-b273-2e88d4d061b4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] No waiting events found dispatching network-vif-plugged-9a06288c-d8e5-43c4-9559-23674152a05e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.905 183079 WARNING nova.compute.manager [req-5a26efd9-34e7-415f-bdfa-97e98aa756db req-010599b1-37f4-4884-b273-2e88d4d061b4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Received unexpected event network-vif-plugged-9a06288c-d8e5-43c4-9559-23674152a05e for instance with vm_state building and task_state spawning.
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.906 183079 DEBUG nova.compute.manager [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.911 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102242.910721, 02af6288-0bd3-438c-982d-f36b31e1a9bf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.911 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] VM Resumed (Lifecycle Event)
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.913 183079 DEBUG nova.virt.libvirt.driver [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.917 183079 INFO nova.virt.libvirt.driver [-] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Instance spawned successfully.
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.918 183079 DEBUG nova.virt.libvirt.driver [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.936 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.942 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.946 183079 DEBUG nova.virt.libvirt.driver [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.946 183079 DEBUG nova.virt.libvirt.driver [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.947 183079 DEBUG nova.virt.libvirt.driver [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.947 183079 DEBUG nova.virt.libvirt.driver [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.948 183079 DEBUG nova.virt.libvirt.driver [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.948 183079 DEBUG nova.virt.libvirt.driver [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:17:22 compute-0 nova_compute[183075]: 2026-01-22 17:17:22.988 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:17:23 compute-0 nova_compute[183075]: 2026-01-22 17:17:23.119 183079 INFO nova.compute.manager [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Took 5.86 seconds to spawn the instance on the hypervisor.
Jan 22 17:17:23 compute-0 nova_compute[183075]: 2026-01-22 17:17:23.120 183079 DEBUG nova.compute.manager [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:17:23 compute-0 nova_compute[183075]: 2026-01-22 17:17:23.209 183079 INFO nova.compute.manager [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Took 6.34 seconds to build instance.
Jan 22 17:17:23 compute-0 nova_compute[183075]: 2026-01-22 17:17:23.229 183079 DEBUG oslo_concurrency.lockutils [None req-83e65706-a2e0-4db1-923c-9da46948b158 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Lock "02af6288-0bd3-438c-982d-f36b31e1a9bf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.462s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:17:24 compute-0 nova_compute[183075]: 2026-01-22 17:17:24.556 183079 INFO nova.compute.manager [None req-04ad7747-9df2-4b02-9f1f-988470541bb8 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Get console output
Jan 22 17:17:24 compute-0 nova_compute[183075]: 2026-01-22 17:17:24.561 183079 INFO nova.compute.manager [None req-1cdbe074-98eb-4cf0-b247-414a367e0565 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Get console output
Jan 22 17:17:24 compute-0 nova_compute[183075]: 2026-01-22 17:17:24.566 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:17:25 compute-0 nova_compute[183075]: 2026-01-22 17:17:25.919 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:26 compute-0 nova_compute[183075]: 2026-01-22 17:17:26.191 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:27 compute-0 podman[224956]: 2026-01-22 17:17:27.354313839 +0000 UTC m=+0.063179832 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 17:17:29 compute-0 nova_compute[183075]: 2026-01-22 17:17:29.730 183079 INFO nova.compute.manager [None req-2d48b360-abbe-45bd-b3cb-a9afeb658794 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Get console output
Jan 22 17:17:29 compute-0 nova_compute[183075]: 2026-01-22 17:17:29.736 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:17:29 compute-0 nova_compute[183075]: 2026-01-22 17:17:29.752 183079 INFO nova.compute.manager [None req-562b5582-3dc3-4e95-94b6-9a5ebbbdbec0 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Get console output
Jan 22 17:17:29 compute-0 nova_compute[183075]: 2026-01-22 17:17:29.760 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:17:30 compute-0 nova_compute[183075]: 2026-01-22 17:17:30.969 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:31 compute-0 nova_compute[183075]: 2026-01-22 17:17:31.193 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:32 compute-0 nova_compute[183075]: 2026-01-22 17:17:32.413 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:17:34 compute-0 ovn_controller[95372]: 2026-01-22T17:17:34Z|00064|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b0:5d:07 10.100.0.27
Jan 22 17:17:34 compute-0 ovn_controller[95372]: 2026-01-22T17:17:34Z|00065|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b0:5d:07 10.100.0.27
Jan 22 17:17:34 compute-0 nova_compute[183075]: 2026-01-22 17:17:34.935 183079 INFO nova.compute.manager [None req-aec9e1e0-3d89-42da-ae2b-a75377c17e02 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Get console output
Jan 22 17:17:34 compute-0 nova_compute[183075]: 2026-01-22 17:17:34.952 183079 INFO nova.compute.manager [None req-4fb08637-78cd-40e9-b420-2e5658aa638d 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Get console output
Jan 22 17:17:34 compute-0 nova_compute[183075]: 2026-01-22 17:17:34.958 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:17:35 compute-0 ovn_controller[95372]: 2026-01-22T17:17:35Z|00066|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e6:59:b5 10.100.0.12
Jan 22 17:17:35 compute-0 ovn_controller[95372]: 2026-01-22T17:17:35Z|00067|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e6:59:b5 10.100.0.12
Jan 22 17:17:35 compute-0 nova_compute[183075]: 2026-01-22 17:17:35.974 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:36 compute-0 nova_compute[183075]: 2026-01-22 17:17:36.196 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:37 compute-0 podman[225006]: 2026-01-22 17:17:37.366008534 +0000 UTC m=+0.061778985 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 17:17:37 compute-0 podman[225007]: 2026-01-22 17:17:37.396437009 +0000 UTC m=+0.077271750 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, architecture=x86_64, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, io.buildah.version=1.33.7, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter)
Jan 22 17:17:37 compute-0 podman[225005]: 2026-01-22 17:17:37.420553079 +0000 UTC m=+0.110778285 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:17:40 compute-0 nova_compute[183075]: 2026-01-22 17:17:40.105 183079 INFO nova.compute.manager [None req-74389a60-b5da-440c-83ba-f4a2516157ac 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Get console output
Jan 22 17:17:40 compute-0 nova_compute[183075]: 2026-01-22 17:17:40.109 183079 INFO nova.compute.manager [None req-ead02f88-57c1-40be-9828-76db25d55a61 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Get console output
Jan 22 17:17:40 compute-0 nova_compute[183075]: 2026-01-22 17:17:40.115 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:17:40 compute-0 nova_compute[183075]: 2026-01-22 17:17:40.117 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:17:41 compute-0 nova_compute[183075]: 2026-01-22 17:17:41.009 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:41 compute-0 nova_compute[183075]: 2026-01-22 17:17:41.198 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:41.892 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:17:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:41.893 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:17:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:17:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:17:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:17:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:17:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:17:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:17:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:17:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:41.933 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:17:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:41.934 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:17:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:41.935 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:17:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:41.965 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:17:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:41.966 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:17:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:17:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:17:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:17:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:17:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:17:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.27
Jan 22 17:17:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: b8340cc9-e27b-4dbd-8de5-9c101e7b64ce __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:17:42 compute-0 podman[225074]: 2026-01-22 17:17:42.391645977 +0000 UTC m=+0.087184879 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.705 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.706 104990 INFO eventlet.wsgi.server [-] 10.100.0.27,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.7404025
Jan 22 17:17:42 compute-0 haproxy-metadata-proxy-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce[224947]: 10.100.0.27:41470 [22/Jan/2026:17:17:41.964] listener listener/metadata 0/0/0/742/742 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.710 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:17:42 compute-0 haproxy-metadata-proxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[224770]: 10.100.0.12:56462 [22/Jan/2026:17:17:41.891] listener listener/metadata 0/0/0/820/820 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.711 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.8181813
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.723 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.724 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.27
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: b8340cc9-e27b-4dbd-8de5-9c101e7b64ce __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.730 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.731 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.753 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:17:42 compute-0 haproxy-metadata-proxy-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce[224947]: 10.100.0.27:41484 [22/Jan/2026:17:17:42.722] listener listener/metadata 0/0/0/32/32 200 152 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.754 104990 INFO eventlet.wsgi.server [-] 10.100.0.27,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 168 time: 0.0303383
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.761 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.762 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.27
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: b8340cc9-e27b-4dbd-8de5-9c101e7b64ce __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.770 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.771 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 168 time: 0.0403910
Jan 22 17:17:42 compute-0 haproxy-metadata-proxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[224770]: 10.100.0.12:56476 [22/Jan/2026:17:17:42.726] listener listener/metadata 0/0/0/44/44 200 152 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.778 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.779 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.803 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.804 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0248256
Jan 22 17:17:42 compute-0 haproxy-metadata-proxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[224770]: 10.100.0.12:56486 [22/Jan/2026:17:17:42.777] listener listener/metadata 0/0/0/26/26 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.805 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.805 104990 INFO eventlet.wsgi.server [-] 10.100.0.27,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0433929
Jan 22 17:17:42 compute-0 haproxy-metadata-proxy-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce[224947]: 10.100.0.27:41494 [22/Jan/2026:17:17:42.760] listener listener/metadata 0/0/0/44/44 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.809 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.810 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.814 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.814 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.27
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: b8340cc9-e27b-4dbd-8de5-9c101e7b64ce __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.829 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:17:42 compute-0 haproxy-metadata-proxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[224770]: 10.100.0.12:56498 [22/Jan/2026:17:17:42.809] listener listener/metadata 0/0/0/20/20 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.830 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0197654
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.831 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.831 104990 INFO eventlet.wsgi.server [-] 10.100.0.27,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0170374
Jan 22 17:17:42 compute-0 haproxy-metadata-proxy-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce[224947]: 10.100.0.27:41508 [22/Jan/2026:17:17:42.812] listener listener/metadata 0/0/0/18/18 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.836 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.837 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.841 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.841 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.27
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: b8340cc9-e27b-4dbd-8de5-9c101e7b64ce __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.861 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.861 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0247467
Jan 22 17:17:42 compute-0 haproxy-metadata-proxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[224770]: 10.100.0.12:56504 [22/Jan/2026:17:17:42.835] listener listener/metadata 0/0/0/25/25 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.865 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.865 104990 INFO eventlet.wsgi.server [-] 10.100.0.27,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0242836
Jan 22 17:17:42 compute-0 haproxy-metadata-proxy-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce[224947]: 10.100.0.27:41516 [22/Jan/2026:17:17:42.840] listener listener/metadata 0/0/0/25/25 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.868 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.868 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.878 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.878 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.27
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: b8340cc9-e27b-4dbd-8de5-9c101e7b64ce __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.883 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.884 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0153339
Jan 22 17:17:42 compute-0 haproxy-metadata-proxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[224770]: 10.100.0.12:56512 [22/Jan/2026:17:17:42.867] listener listener/metadata 0/0/0/16/16 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.889 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.890 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.897 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.897 104990 INFO eventlet.wsgi.server [-] 10.100.0.27,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0184550
Jan 22 17:17:42 compute-0 haproxy-metadata-proxy-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce[224947]: 10.100.0.27:41526 [22/Jan/2026:17:17:42.877] listener listener/metadata 0/0/0/19/19 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.905 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:17:42 compute-0 haproxy-metadata-proxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[224770]: 10.100.0.12:56520 [22/Jan/2026:17:17:42.888] listener listener/metadata 0/0/0/18/18 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.906 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0161219
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.907 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.907 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.27
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: b8340cc9-e27b-4dbd-8de5-9c101e7b64ce __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.911 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.911 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.923 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:17:42 compute-0 haproxy-metadata-proxy-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce[224947]: 10.100.0.27:41538 [22/Jan/2026:17:17:42.906] listener listener/metadata 0/0/0/17/17 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.923 104990 INFO eventlet.wsgi.server [-] 10.100.0.27,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0162766
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.924 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.925 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0134552
Jan 22 17:17:42 compute-0 haproxy-metadata-proxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[224770]: 10.100.0.12:56524 [22/Jan/2026:17:17:42.909] listener listener/metadata 0/0/0/15/15 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.928 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.929 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.933 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.933 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.27
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: b8340cc9-e27b-4dbd-8de5-9c101e7b64ce __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.948 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.948 104990 INFO eventlet.wsgi.server [-] 10.100.0.27,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0149457
Jan 22 17:17:42 compute-0 haproxy-metadata-proxy-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce[224947]: 10.100.0.27:41552 [22/Jan/2026:17:17:42.931] listener listener/metadata 0/0/0/17/17 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.952 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.952 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 164 time: 0.0230055
Jan 22 17:17:42 compute-0 haproxy-metadata-proxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[224770]: 10.100.0.12:56530 [22/Jan/2026:17:17:42.928] listener listener/metadata 0/0/0/23/23 200 148 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.955 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.956 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.27
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: b8340cc9-e27b-4dbd-8de5-9c101e7b64ce __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.959 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.960 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.973 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.973 104990 INFO eventlet.wsgi.server [-] 10.100.0.27,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 165 time: 0.0174727
Jan 22 17:17:42 compute-0 haproxy-metadata-proxy-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce[224947]: 10.100.0.27:41556 [22/Jan/2026:17:17:42.955] listener listener/metadata 0/0/0/18/18 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.978 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.978 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 164 time: 0.0187230
Jan 22 17:17:42 compute-0 haproxy-metadata-proxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[224770]: 10.100.0.12:56532 [22/Jan/2026:17:17:42.957] listener listener/metadata 0/0/0/21/21 200 148 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.980 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.981 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.27
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: b8340cc9-e27b-4dbd-8de5-9c101e7b64ce __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.986 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:42.987 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:17:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.005 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.006 104990 INFO eventlet.wsgi.server [-] 10.100.0.27,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 165 time: 0.0253365
Jan 22 17:17:43 compute-0 haproxy-metadata-proxy-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce[224947]: 10.100.0.27:41564 [22/Jan/2026:17:17:42.980] listener listener/metadata 0/0/0/26/26 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.012 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.013 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.27
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: b8340cc9-e27b-4dbd-8de5-9c101e7b64ce __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:17:43 compute-0 haproxy-metadata-proxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[224770]: 10.100.0.12:56534 [22/Jan/2026:17:17:42.985] listener listener/metadata 0/0/0/35/35 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.020 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0331612
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.030 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.030 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.034 104990 INFO eventlet.wsgi.server [-] 10.100.0.27,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0214689
Jan 22 17:17:43 compute-0 haproxy-metadata-proxy-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce[224947]: 10.100.0.27:41578 [22/Jan/2026:17:17:43.011] listener listener/metadata 0/0/0/22/22 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.049 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.051 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.27
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: b8340cc9-e27b-4dbd-8de5-9c101e7b64ce __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.058 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.058 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0278425
Jan 22 17:17:43 compute-0 haproxy-metadata-proxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[224770]: 10.100.0.12:56544 [22/Jan/2026:17:17:43.029] listener listener/metadata 0/0/0/28/28 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.063 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.064 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.084 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:17:43 compute-0 haproxy-metadata-proxy-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce[224947]: 10.100.0.27:41588 [22/Jan/2026:17:17:43.048] listener listener/metadata 0/0/0/36/36 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.085 104990 INFO eventlet.wsgi.server [-] 10.100.0.27,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0341573
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.086 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.086 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0222518
Jan 22 17:17:43 compute-0 haproxy-metadata-proxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[224770]: 10.100.0.12:56560 [22/Jan/2026:17:17:43.063] listener listener/metadata 0/0/0/23/23 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.088 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.089 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.27
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: b8340cc9-e27b-4dbd-8de5-9c101e7b64ce __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.093 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.094 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.110 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:17:43 compute-0 haproxy-metadata-proxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[224770]: 10.100.0.12:56570 [22/Jan/2026:17:17:43.090] listener listener/metadata 0/0/0/20/20 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.110 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0165362
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.114 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:17:43 compute-0 haproxy-metadata-proxy-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce[224947]: 10.100.0.27:41598 [22/Jan/2026:17:17:43.088] listener listener/metadata 0/0/0/26/26 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.115 104990 INFO eventlet.wsgi.server [-] 10.100.0.27,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0257394
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.115 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.116 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.120 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.121 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.27
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: b8340cc9-e27b-4dbd-8de5-9c101e7b64ce __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.135 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:17:43 compute-0 haproxy-metadata-proxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[224770]: 10.100.0.12:56580 [22/Jan/2026:17:17:43.114] listener listener/metadata 0/0/0/22/22 200 148 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.136 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 164 time: 0.0200603
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.141 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:17:43 compute-0 haproxy-metadata-proxy-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce[224947]: 10.100.0.27:41612 [22/Jan/2026:17:17:43.118] listener listener/metadata 0/0/0/23/23 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.141 104990 INFO eventlet.wsgi.server [-] 10.100.0.27,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0202644
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.144 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.144 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.149 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.149 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.27
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: b8340cc9-e27b-4dbd-8de5-9c101e7b64ce __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.312 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.312 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.1680405
Jan 22 17:17:43 compute-0 haproxy-metadata-proxy-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[224770]: 10.100.0.12:56588 [22/Jan/2026:17:17:43.143] listener listener/metadata 0/0/0/169/169 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.314 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:17:43 compute-0 haproxy-metadata-proxy-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce[224947]: 10.100.0.27:41616 [22/Jan/2026:17:17:43.145] listener listener/metadata 0/0/0/168/168 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.314 104990 INFO eventlet.wsgi.server [-] 10.100.0.27,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 165 time: 0.1650498
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.322 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.323 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.27
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: b8340cc9-e27b-4dbd-8de5-9c101e7b64ce __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.342 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:17:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:43.343 104990 INFO eventlet.wsgi.server [-] 10.100.0.27,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0193381
Jan 22 17:17:43 compute-0 haproxy-metadata-proxy-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce[224947]: 10.100.0.27:41632 [22/Jan/2026:17:17:43.321] listener listener/metadata 0/0/0/21/21 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:17:45 compute-0 nova_compute[183075]: 2026-01-22 17:17:45.253 183079 INFO nova.compute.manager [None req-7fbf50f4-8d7e-4506-a7f9-c0ee0dd36690 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Get console output
Jan 22 17:17:45 compute-0 nova_compute[183075]: 2026-01-22 17:17:45.259 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:17:45 compute-0 nova_compute[183075]: 2026-01-22 17:17:45.306 183079 INFO nova.compute.manager [None req-19d76258-ced6-43cb-8ca1-0f72a089297c 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Get console output
Jan 22 17:17:45 compute-0 nova_compute[183075]: 2026-01-22 17:17:45.312 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:17:46 compute-0 nova_compute[183075]: 2026-01-22 17:17:46.013 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:46 compute-0 nova_compute[183075]: 2026-01-22 17:17:46.202 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:46 compute-0 nova_compute[183075]: 2026-01-22 17:17:46.578 183079 INFO nova.compute.manager [None req-91119485-7a0e-4174-92ee-50eb6a579fff 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Get console output
Jan 22 17:17:46 compute-0 nova_compute[183075]: 2026-01-22 17:17:46.581 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:17:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:48.174 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:17:48 compute-0 nova_compute[183075]: 2026-01-22 17:17:48.175 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:48.175 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:17:49 compute-0 nova_compute[183075]: 2026-01-22 17:17:49.228 183079 DEBUG nova.compute.manager [req-c6ac8c29-7204-4f6d-834f-b2516d023fb9 req-88bbaf67-d1ef-4871-8b6b-154ac35d93ba a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Received event network-changed-6b1bf9db-e098-4d03-b185-9a64eee8cec2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:17:49 compute-0 nova_compute[183075]: 2026-01-22 17:17:49.228 183079 DEBUG nova.compute.manager [req-c6ac8c29-7204-4f6d-834f-b2516d023fb9 req-88bbaf67-d1ef-4871-8b6b-154ac35d93ba a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Refreshing instance network info cache due to event network-changed-6b1bf9db-e098-4d03-b185-9a64eee8cec2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:17:49 compute-0 nova_compute[183075]: 2026-01-22 17:17:49.229 183079 DEBUG oslo_concurrency.lockutils [req-c6ac8c29-7204-4f6d-834f-b2516d023fb9 req-88bbaf67-d1ef-4871-8b6b-154ac35d93ba a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-c925ab60-0524-40ab-a82b-52f810b9023f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:17:49 compute-0 nova_compute[183075]: 2026-01-22 17:17:49.229 183079 DEBUG oslo_concurrency.lockutils [req-c6ac8c29-7204-4f6d-834f-b2516d023fb9 req-88bbaf67-d1ef-4871-8b6b-154ac35d93ba a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-c925ab60-0524-40ab-a82b-52f810b9023f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:17:49 compute-0 nova_compute[183075]: 2026-01-22 17:17:49.230 183079 DEBUG nova.network.neutron [req-c6ac8c29-7204-4f6d-834f-b2516d023fb9 req-88bbaf67-d1ef-4871-8b6b-154ac35d93ba a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Refreshing network info cache for port 6b1bf9db-e098-4d03-b185-9a64eee8cec2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:17:49 compute-0 podman[225095]: 2026-01-22 17:17:49.354211994 +0000 UTC m=+0.062811752 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:17:50 compute-0 nova_compute[183075]: 2026-01-22 17:17:50.471 183079 INFO nova.compute.manager [None req-a153c8f8-99ce-4b8a-8787-7c3554aa39b9 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Get console output
Jan 22 17:17:50 compute-0 nova_compute[183075]: 2026-01-22 17:17:50.476 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:17:50 compute-0 nova_compute[183075]: 2026-01-22 17:17:50.730 183079 DEBUG nova.network.neutron [req-c6ac8c29-7204-4f6d-834f-b2516d023fb9 req-88bbaf67-d1ef-4871-8b6b-154ac35d93ba a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Updated VIF entry in instance network info cache for port 6b1bf9db-e098-4d03-b185-9a64eee8cec2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:17:50 compute-0 nova_compute[183075]: 2026-01-22 17:17:50.731 183079 DEBUG nova.network.neutron [req-c6ac8c29-7204-4f6d-834f-b2516d023fb9 req-88bbaf67-d1ef-4871-8b6b-154ac35d93ba a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Updating instance_info_cache with network_info: [{"id": "6b1bf9db-e098-4d03-b185-9a64eee8cec2", "address": "fa:16:3e:e6:59:b5", "network": {"id": "359b74c5-cbeb-4440-a3e9-a16a51b1ab77", "bridge": "br-int", "label": "tempest-test-network--531378131", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b1bf9db-e0", "ovs_interfaceid": "6b1bf9db-e098-4d03-b185-9a64eee8cec2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6cf839fd-ff11-4ab3-a473-8a61175f769b", "address": "fa:16:3e:ae:c3:d7", "network": {"id": "3b6ff245-6da2-41fe-a6c8-3c52ded12515", "bridge": "br-int", "label": "tempest-test-network--1156553269", "subnets": [{"cidr": "2001:db8:0:2::/64", "dns": [], "gateway": {"address": "2001:db8:0:2::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:2:f816:3eff:feae:c3d7", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cf839fd-ff", "ovs_interfaceid": "6cf839fd-ff11-4ab3-a473-8a61175f769b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:17:50 compute-0 nova_compute[183075]: 2026-01-22 17:17:50.758 183079 DEBUG oslo_concurrency.lockutils [req-c6ac8c29-7204-4f6d-834f-b2516d023fb9 req-88bbaf67-d1ef-4871-8b6b-154ac35d93ba a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-c925ab60-0524-40ab-a82b-52f810b9023f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:17:50 compute-0 nova_compute[183075]: 2026-01-22 17:17:50.887 183079 DEBUG oslo_concurrency.lockutils [None req-090ddacf-afd6-4e0d-a8e3-6d33cd24bd59 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquiring lock "interface-c925ab60-0524-40ab-a82b-52f810b9023f-6cf839fd-ff11-4ab3-a473-8a61175f769b" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:17:50 compute-0 nova_compute[183075]: 2026-01-22 17:17:50.888 183079 DEBUG oslo_concurrency.lockutils [None req-090ddacf-afd6-4e0d-a8e3-6d33cd24bd59 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "interface-c925ab60-0524-40ab-a82b-52f810b9023f-6cf839fd-ff11-4ab3-a473-8a61175f769b" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:17:50 compute-0 nova_compute[183075]: 2026-01-22 17:17:50.913 183079 DEBUG nova.objects.instance [None req-090ddacf-afd6-4e0d-a8e3-6d33cd24bd59 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lazy-loading 'flavor' on Instance uuid c925ab60-0524-40ab-a82b-52f810b9023f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:17:50 compute-0 nova_compute[183075]: 2026-01-22 17:17:50.930 183079 DEBUG nova.virt.libvirt.vif [None req-090ddacf-afd6-4e0d-a8e3-6d33cd24bd59 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:17:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-37806664',display_name='tempest-server-test-37806664',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-37806664',id=31,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDmUdcgi0W3CCW7+0zVPZm4+SGvgMVYIF2EiVCZOZNnFUe7R59w7UIBntHS+djSABVTvpJeBQFo+aQIZ5cRYQ1WALN6CDQL2VSKr+LjaC8setfZMT5V7IalcT1iGutwkuw==',key_name='tempest-keypair-test-914827607',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:17:22Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4f95c62a5a194d2291b03187a9c85702',ramdisk_id='',reservation_id='r-q3uks0k4',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-IPv6Test-1828780436',owner_user_name='tempest-IPv6Test-1828780436-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:17:22Z,user_data=None,user_id='2fe02f7484a94091bab26aba1c370459',uuid=c925ab60-0524-40ab-a82b-52f810b9023f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6cf839fd-ff11-4ab3-a473-8a61175f769b", "address": "fa:16:3e:ae:c3:d7", "network": {"id": "3b6ff245-6da2-41fe-a6c8-3c52ded12515", "bridge": "br-int", "label": "tempest-test-network--1156553269", "subnets": [{"cidr": "2001:db8:0:2::/64", "dns": [], "gateway": {"address": "2001:db8:0:2::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:2:f816:3eff:feae:c3d7", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cf839fd-ff", "ovs_interfaceid": "6cf839fd-ff11-4ab3-a473-8a61175f769b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:17:50 compute-0 nova_compute[183075]: 2026-01-22 17:17:50.930 183079 DEBUG nova.network.os_vif_util [None req-090ddacf-afd6-4e0d-a8e3-6d33cd24bd59 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converting VIF {"id": "6cf839fd-ff11-4ab3-a473-8a61175f769b", "address": "fa:16:3e:ae:c3:d7", "network": {"id": "3b6ff245-6da2-41fe-a6c8-3c52ded12515", "bridge": "br-int", "label": "tempest-test-network--1156553269", "subnets": [{"cidr": "2001:db8:0:2::/64", "dns": [], "gateway": {"address": "2001:db8:0:2::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:2:f816:3eff:feae:c3d7", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cf839fd-ff", "ovs_interfaceid": "6cf839fd-ff11-4ab3-a473-8a61175f769b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:17:50 compute-0 nova_compute[183075]: 2026-01-22 17:17:50.931 183079 DEBUG nova.network.os_vif_util [None req-090ddacf-afd6-4e0d-a8e3-6d33cd24bd59 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ae:c3:d7,bridge_name='br-int',has_traffic_filtering=True,id=6cf839fd-ff11-4ab3-a473-8a61175f769b,network=Network(3b6ff245-6da2-41fe-a6c8-3c52ded12515),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6cf839fd-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:17:50 compute-0 nova_compute[183075]: 2026-01-22 17:17:50.934 183079 DEBUG nova.virt.libvirt.guest [None req-090ddacf-afd6-4e0d-a8e3-6d33cd24bd59 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:ae:c3:d7"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6cf839fd-ff"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 17:17:50 compute-0 nova_compute[183075]: 2026-01-22 17:17:50.936 183079 DEBUG nova.virt.libvirt.guest [None req-090ddacf-afd6-4e0d-a8e3-6d33cd24bd59 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:ae:c3:d7"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6cf839fd-ff"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 17:17:50 compute-0 nova_compute[183075]: 2026-01-22 17:17:50.939 183079 DEBUG nova.virt.libvirt.driver [None req-090ddacf-afd6-4e0d-a8e3-6d33cd24bd59 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Attempting to detach device tap6cf839fd-ff from instance c925ab60-0524-40ab-a82b-52f810b9023f from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Jan 22 17:17:50 compute-0 nova_compute[183075]: 2026-01-22 17:17:50.939 183079 DEBUG nova.virt.libvirt.guest [None req-090ddacf-afd6-4e0d-a8e3-6d33cd24bd59 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] detach device xml: <interface type="ethernet">
Jan 22 17:17:50 compute-0 nova_compute[183075]:   <mac address="fa:16:3e:ae:c3:d7"/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   <model type="virtio"/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   <mtu size="1442"/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   <target dev="tap6cf839fd-ff"/>
Jan 22 17:17:50 compute-0 nova_compute[183075]: </interface>
Jan 22 17:17:50 compute-0 nova_compute[183075]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 22 17:17:50 compute-0 nova_compute[183075]: 2026-01-22 17:17:50.946 183079 DEBUG nova.virt.libvirt.guest [None req-090ddacf-afd6-4e0d-a8e3-6d33cd24bd59 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:ae:c3:d7"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6cf839fd-ff"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 17:17:50 compute-0 nova_compute[183075]: 2026-01-22 17:17:50.948 183079 DEBUG nova.virt.libvirt.guest [None req-090ddacf-afd6-4e0d-a8e3-6d33cd24bd59 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:ae:c3:d7"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6cf839fd-ff"/></interface>not found in domain: <domain type='kvm' id='31'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   <name>instance-0000001f</name>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   <uuid>c925ab60-0524-40ab-a82b-52f810b9023f</uuid>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-37806664</nova:name>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:17:19</nova:creationTime>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:17:50 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:17:50 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:17:50 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:17:50 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:17:50 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:17:50 compute-0 nova_compute[183075]:         <nova:user uuid="2fe02f7484a94091bab26aba1c370459">tempest-IPv6Test-1828780436-project-member</nova:user>
Jan 22 17:17:50 compute-0 nova_compute[183075]:         <nova:project uuid="4f95c62a5a194d2291b03187a9c85702">tempest-IPv6Test-1828780436</nova:project>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:17:50 compute-0 nova_compute[183075]:         <nova:port uuid="6b1bf9db-e098-4d03-b185-9a64eee8cec2">
Jan 22 17:17:50 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:17:50 compute-0 nova_compute[183075]:         <nova:port uuid="6cf839fd-ff11-4ab3-a473-8a61175f769b">
Jan 22 17:17:50 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="2001:db8:0:2:f816:3eff:feae:c3d7" ipVersion="6"/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   <memory unit='KiB'>131072</memory>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   <vcpu placement='static'>1</vcpu>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   <resource>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <partition>/machine</partition>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   </resource>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   <sysinfo type='smbios'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <system>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <entry name='manufacturer'>RDO</entry>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <entry name='product'>OpenStack Compute</entry>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <entry name='serial'>c925ab60-0524-40ab-a82b-52f810b9023f</entry>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <entry name='uuid'>c925ab60-0524-40ab-a82b-52f810b9023f</entry>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <entry name='family'>Virtual Machine</entry>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </system>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   <os>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <boot dev='hd'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <smbios mode='sysinfo'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   </os>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   <features>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <vmcoreinfo state='on'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   </features>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   <cpu mode='custom' match='exact' check='full'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <model fallback='forbid'>EPYC-Rome</model>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <vendor>AMD</vendor>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <feature policy='require' name='x2apic'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <feature policy='require' name='tsc-deadline'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <feature policy='require' name='hypervisor'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <feature policy='require' name='tsc_adjust'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <feature policy='require' name='spec-ctrl'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <feature policy='require' name='stibp'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <feature policy='require' name='ssbd'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <feature policy='require' name='cmp_legacy'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <feature policy='require' name='overflow-recov'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <feature policy='require' name='succor'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <feature policy='require' name='ibrs'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <feature policy='require' name='amd-ssbd'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <feature policy='require' name='virt-ssbd'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <feature policy='disable' name='lbrv'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <feature policy='disable' name='tsc-scale'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <feature policy='disable' name='vmcb-clean'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <feature policy='disable' name='flushbyasid'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <feature policy='disable' name='pause-filter'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <feature policy='disable' name='pfthreshold'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <feature policy='disable' name='svme-addr-chk'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <feature policy='require' name='lfence-always-serializing'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <feature policy='disable' name='xsaves'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <feature policy='disable' name='svm'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <feature policy='require' name='topoext'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <feature policy='disable' name='npt'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <feature policy='disable' name='nrip-save'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   <clock offset='utc'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <timer name='pit' tickpolicy='delay'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <timer name='hpet' present='no'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   <on_poweroff>destroy</on_poweroff>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   <on_reboot>restart</on_reboot>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   <on_crash>destroy</on_crash>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <disk type='file' device='disk'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <source file='/var/lib/nova/instances/c925ab60-0524-40ab-a82b-52f810b9023f/disk' index='1'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <backingStore type='file' index='2'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:         <format type='raw'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:         <source file='/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:         <backingStore/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       </backingStore>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <target dev='vda' bus='virtio'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='virtio-disk0'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <controller type='pci' index='0' model='pcie-root'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='pcie.0'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <target chassis='1' port='0x10'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='pci.1'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <target chassis='2' port='0x11'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='pci.2'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <target chassis='3' port='0x12'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='pci.3'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <target chassis='4' port='0x13'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='pci.4'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <target chassis='5' port='0x14'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='pci.5'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <target chassis='6' port='0x15'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='pci.6'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <target chassis='7' port='0x16'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='pci.7'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <target chassis='8' port='0x17'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='pci.8'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <target chassis='9' port='0x18'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='pci.9'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <target chassis='10' port='0x19'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='pci.10'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <target chassis='11' port='0x1a'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='pci.11'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <target chassis='12' port='0x1b'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='pci.12'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <target chassis='13' port='0x1c'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='pci.13'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <target chassis='14' port='0x1d'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='pci.14'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <target chassis='15' port='0x1e'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='pci.15'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <target chassis='16' port='0x1f'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='pci.16'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <target chassis='17' port='0x20'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='pci.17'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <target chassis='18' port='0x21'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='pci.18'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <target chassis='19' port='0x22'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='pci.19'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <target chassis='20' port='0x23'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='pci.20'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <target chassis='21' port='0x24'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='pci.21'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <target chassis='22' port='0x25'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='pci.22'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <target chassis='23' port='0x26'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='pci.23'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <target chassis='24' port='0x27'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='pci.24'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <target chassis='25' port='0x28'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='pci.25'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <model name='pcie-pci-bridge'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='pci.26'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='usb'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <controller type='sata' index='0'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='ide'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <interface type='ethernet'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <mac address='fa:16:3e:e6:59:b5'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <target dev='tap6b1bf9db-e0'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <model type='virtio'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <driver name='vhost' rx_queue_size='512'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <mtu size='1442'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='net0'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <interface type='ethernet'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <mac address='fa:16:3e:ae:c3:d7'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <target dev='tap6cf839fd-ff'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <model type='virtio'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <driver name='vhost' rx_queue_size='512'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <mtu size='1442'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='net1'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <serial type='pty'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <source path='/dev/pts/0'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <log file='/var/lib/nova/instances/c925ab60-0524-40ab-a82b-52f810b9023f/console.log' append='off'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <target type='isa-serial' port='0'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:         <model name='isa-serial'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       </target>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='serial0'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <console type='pty' tty='/dev/pts/0'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <source path='/dev/pts/0'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <log file='/var/lib/nova/instances/c925ab60-0524-40ab-a82b-52f810b9023f/console.log' append='off'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <target type='serial' port='0'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='serial0'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </console>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <input type='tablet' bus='usb'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='input0'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='usb' bus='0' port='1'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </input>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <input type='mouse' bus='ps2'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='input1'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </input>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <input type='keyboard' bus='ps2'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='input2'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </input>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <listen type='address' address='::0'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </graphics>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <audio id='1' type='none'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <video>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <model type='virtio' heads='1' primary='yes'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='video0'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </video>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <watchdog model='itco' action='reset'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='watchdog0'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </watchdog>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <memballoon model='virtio'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <stats period='10'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='balloon0'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <rng model='virtio'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <backend model='random'>/dev/urandom</backend>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <alias name='rng0'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <label>system_u:system_r:svirt_t:s0:c217,c950</label>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c217,c950</imagelabel>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   </seclabel>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <label>+107:+107</label>
Jan 22 17:17:50 compute-0 nova_compute[183075]:     <imagelabel>+107:+107</imagelabel>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   </seclabel>
Jan 22 17:17:50 compute-0 nova_compute[183075]: </domain>
Jan 22 17:17:50 compute-0 nova_compute[183075]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 22 17:17:50 compute-0 nova_compute[183075]: 2026-01-22 17:17:50.949 183079 INFO nova.virt.libvirt.driver [None req-090ddacf-afd6-4e0d-a8e3-6d33cd24bd59 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Successfully detached device tap6cf839fd-ff from instance c925ab60-0524-40ab-a82b-52f810b9023f from the persistent domain config.
Jan 22 17:17:50 compute-0 nova_compute[183075]: 2026-01-22 17:17:50.950 183079 DEBUG nova.virt.libvirt.driver [None req-090ddacf-afd6-4e0d-a8e3-6d33cd24bd59 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] (1/8): Attempting to detach device tap6cf839fd-ff with device alias net1 from instance c925ab60-0524-40ab-a82b-52f810b9023f from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 22 17:17:50 compute-0 nova_compute[183075]: 2026-01-22 17:17:50.950 183079 DEBUG nova.virt.libvirt.guest [None req-090ddacf-afd6-4e0d-a8e3-6d33cd24bd59 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] detach device xml: <interface type="ethernet">
Jan 22 17:17:50 compute-0 nova_compute[183075]:   <mac address="fa:16:3e:ae:c3:d7"/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   <model type="virtio"/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   <mtu size="1442"/>
Jan 22 17:17:50 compute-0 nova_compute[183075]:   <target dev="tap6cf839fd-ff"/>
Jan 22 17:17:50 compute-0 nova_compute[183075]: </interface>
Jan 22 17:17:50 compute-0 nova_compute[183075]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 22 17:17:51 compute-0 nova_compute[183075]: 2026-01-22 17:17:51.016 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:51 compute-0 kernel: tap6cf839fd-ff (unregistering): left promiscuous mode
Jan 22 17:17:51 compute-0 NetworkManager[55454]: <info>  [1769102271.0267] device (tap6cf839fd-ff): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:17:51 compute-0 nova_compute[183075]: 2026-01-22 17:17:51.031 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:51 compute-0 ovn_controller[95372]: 2026-01-22T17:17:51Z|00388|binding|INFO|Releasing lport 6cf839fd-ff11-4ab3-a473-8a61175f769b from this chassis (sb_readonly=0)
Jan 22 17:17:51 compute-0 ovn_controller[95372]: 2026-01-22T17:17:51Z|00389|binding|INFO|Setting lport 6cf839fd-ff11-4ab3-a473-8a61175f769b down in Southbound
Jan 22 17:17:51 compute-0 ovn_controller[95372]: 2026-01-22T17:17:51Z|00390|binding|INFO|Removing iface tap6cf839fd-ff ovn-installed in OVS
Jan 22 17:17:51 compute-0 nova_compute[183075]: 2026-01-22 17:17:51.035 183079 DEBUG nova.virt.libvirt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Received event <DeviceRemovedEvent: 1769102271.035435, c925ab60-0524-40ab-a82b-52f810b9023f => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Jan 22 17:17:51 compute-0 nova_compute[183075]: 2026-01-22 17:17:51.036 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:51 compute-0 nova_compute[183075]: 2026-01-22 17:17:51.037 183079 DEBUG nova.virt.libvirt.driver [None req-090ddacf-afd6-4e0d-a8e3-6d33cd24bd59 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Start waiting for the detach event from libvirt for device tap6cf839fd-ff with device alias net1 for instance c925ab60-0524-40ab-a82b-52f810b9023f _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Jan 22 17:17:51 compute-0 nova_compute[183075]: 2026-01-22 17:17:51.037 183079 DEBUG nova.virt.libvirt.guest [None req-090ddacf-afd6-4e0d-a8e3-6d33cd24bd59 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:ae:c3:d7"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6cf839fd-ff"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 17:17:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:51.043 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ae:c3:d7 2001:db8:0:2:f816:3eff:feae:c3d7'], port_security=['fa:16:3e:ae:c3:d7 2001:db8:0:2:f816:3eff:feae:c3d7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:2:f816:3eff:feae:c3d7/64', 'neutron:device_id': 'c925ab60-0524-40ab-a82b-52f810b9023f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3b6ff245-6da2-41fe-a6c8-3c52ded12515', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4f95c62a5a194d2291b03187a9c85702', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1b5e2b25-1ae0-464c-ac9a-7fc65ac893a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=29e660d1-e0aa-49fa-ba81-62cbb14776f0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=6cf839fd-ff11-4ab3-a473-8a61175f769b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:17:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:51.045 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 6cf839fd-ff11-4ab3-a473-8a61175f769b in datapath 3b6ff245-6da2-41fe-a6c8-3c52ded12515 unbound from our chassis
Jan 22 17:17:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:51.047 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3b6ff245-6da2-41fe-a6c8-3c52ded12515, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:17:51 compute-0 nova_compute[183075]: 2026-01-22 17:17:51.049 183079 DEBUG nova.virt.libvirt.guest [None req-090ddacf-afd6-4e0d-a8e3-6d33cd24bd59 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:ae:c3:d7"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6cf839fd-ff"/></interface>not found in domain: <domain type='kvm' id='31'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   <name>instance-0000001f</name>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   <uuid>c925ab60-0524-40ab-a82b-52f810b9023f</uuid>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-37806664</nova:name>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:17:19</nova:creationTime>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:17:51 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:17:51 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:17:51 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:17:51 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:17:51 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:17:51 compute-0 nova_compute[183075]:         <nova:user uuid="2fe02f7484a94091bab26aba1c370459">tempest-IPv6Test-1828780436-project-member</nova:user>
Jan 22 17:17:51 compute-0 nova_compute[183075]:         <nova:project uuid="4f95c62a5a194d2291b03187a9c85702">tempest-IPv6Test-1828780436</nova:project>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:17:51 compute-0 nova_compute[183075]:         <nova:port uuid="6b1bf9db-e098-4d03-b185-9a64eee8cec2">
Jan 22 17:17:51 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:17:51 compute-0 nova_compute[183075]:         <nova:port uuid="6cf839fd-ff11-4ab3-a473-8a61175f769b">
Jan 22 17:17:51 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="2001:db8:0:2:f816:3eff:feae:c3d7" ipVersion="6"/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   <memory unit='KiB'>131072</memory>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   <vcpu placement='static'>1</vcpu>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   <resource>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <partition>/machine</partition>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   </resource>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   <sysinfo type='smbios'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <system>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <entry name='manufacturer'>RDO</entry>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <entry name='product'>OpenStack Compute</entry>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <entry name='serial'>c925ab60-0524-40ab-a82b-52f810b9023f</entry>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <entry name='uuid'>c925ab60-0524-40ab-a82b-52f810b9023f</entry>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <entry name='family'>Virtual Machine</entry>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </system>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   <os>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <boot dev='hd'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <smbios mode='sysinfo'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   </os>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   <features>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <vmcoreinfo state='on'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   </features>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   <cpu mode='custom' match='exact' check='full'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <model fallback='forbid'>EPYC-Rome</model>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <vendor>AMD</vendor>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <feature policy='require' name='x2apic'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <feature policy='require' name='tsc-deadline'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <feature policy='require' name='hypervisor'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <feature policy='require' name='tsc_adjust'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <feature policy='require' name='spec-ctrl'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <feature policy='require' name='stibp'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <feature policy='require' name='ssbd'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <feature policy='require' name='cmp_legacy'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <feature policy='require' name='overflow-recov'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <feature policy='require' name='succor'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <feature policy='require' name='ibrs'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <feature policy='require' name='amd-ssbd'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <feature policy='require' name='virt-ssbd'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <feature policy='disable' name='lbrv'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <feature policy='disable' name='tsc-scale'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <feature policy='disable' name='vmcb-clean'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <feature policy='disable' name='flushbyasid'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <feature policy='disable' name='pause-filter'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <feature policy='disable' name='pfthreshold'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <feature policy='disable' name='svme-addr-chk'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <feature policy='require' name='lfence-always-serializing'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <feature policy='disable' name='xsaves'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <feature policy='disable' name='svm'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <feature policy='require' name='topoext'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <feature policy='disable' name='npt'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <feature policy='disable' name='nrip-save'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   <clock offset='utc'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <timer name='pit' tickpolicy='delay'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <timer name='hpet' present='no'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   <on_poweroff>destroy</on_poweroff>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   <on_reboot>restart</on_reboot>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   <on_crash>destroy</on_crash>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <disk type='file' device='disk'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <source file='/var/lib/nova/instances/c925ab60-0524-40ab-a82b-52f810b9023f/disk' index='1'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <backingStore type='file' index='2'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:         <format type='raw'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:         <source file='/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:         <backingStore/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       </backingStore>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <target dev='vda' bus='virtio'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='virtio-disk0'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <controller type='pci' index='0' model='pcie-root'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='pcie.0'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <target chassis='1' port='0x10'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='pci.1'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <target chassis='2' port='0x11'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='pci.2'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <target chassis='3' port='0x12'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='pci.3'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <target chassis='4' port='0x13'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='pci.4'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <target chassis='5' port='0x14'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='pci.5'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <target chassis='6' port='0x15'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='pci.6'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <target chassis='7' port='0x16'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='pci.7'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <target chassis='8' port='0x17'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='pci.8'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <target chassis='9' port='0x18'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='pci.9'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <target chassis='10' port='0x19'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='pci.10'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <target chassis='11' port='0x1a'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='pci.11'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <target chassis='12' port='0x1b'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='pci.12'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <target chassis='13' port='0x1c'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='pci.13'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <target chassis='14' port='0x1d'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='pci.14'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <target chassis='15' port='0x1e'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='pci.15'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <target chassis='16' port='0x1f'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='pci.16'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <target chassis='17' port='0x20'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='pci.17'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <target chassis='18' port='0x21'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='pci.18'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <target chassis='19' port='0x22'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='pci.19'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <target chassis='20' port='0x23'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='pci.20'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <target chassis='21' port='0x24'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='pci.21'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <target chassis='22' port='0x25'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='pci.22'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <target chassis='23' port='0x26'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='pci.23'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <target chassis='24' port='0x27'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='pci.24'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <model name='pcie-root-port'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <target chassis='25' port='0x28'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='pci.25'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <model name='pcie-pci-bridge'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='pci.26'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='usb'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:51.049 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6f4ca15f-c079-4c38-9237-e5ee186cf308]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <controller type='sata' index='0'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='ide'/>
Jan 22 17:17:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:51.050 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3b6ff245-6da2-41fe-a6c8-3c52ded12515 namespace which is not needed anymore
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </controller>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <interface type='ethernet'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <mac address='fa:16:3e:e6:59:b5'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <target dev='tap6b1bf9db-e0'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <model type='virtio'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <driver name='vhost' rx_queue_size='512'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <mtu size='1442'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='net0'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <serial type='pty'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <source path='/dev/pts/0'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <log file='/var/lib/nova/instances/c925ab60-0524-40ab-a82b-52f810b9023f/console.log' append='off'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <target type='isa-serial' port='0'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:         <model name='isa-serial'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       </target>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='serial0'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <console type='pty' tty='/dev/pts/0'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <source path='/dev/pts/0'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <log file='/var/lib/nova/instances/c925ab60-0524-40ab-a82b-52f810b9023f/console.log' append='off'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <target type='serial' port='0'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='serial0'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </console>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <input type='tablet' bus='usb'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='input0'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <address type='usb' bus='0' port='1'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </input>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <input type='mouse' bus='ps2'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='input1'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </input>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <input type='keyboard' bus='ps2'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='input2'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </input>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <listen type='address' address='::0'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </graphics>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <audio id='1' type='none'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <video>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <model type='virtio' heads='1' primary='yes'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='video0'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </video>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <watchdog model='itco' action='reset'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='watchdog0'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </watchdog>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <memballoon model='virtio'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <stats period='10'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='balloon0'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <rng model='virtio'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <backend model='random'>/dev/urandom</backend>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <alias name='rng0'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <label>system_u:system_r:svirt_t:s0:c217,c950</label>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c217,c950</imagelabel>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   </seclabel>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <label>+107:+107</label>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <imagelabel>+107:+107</imagelabel>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   </seclabel>
Jan 22 17:17:51 compute-0 nova_compute[183075]: </domain>
Jan 22 17:17:51 compute-0 nova_compute[183075]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 22 17:17:51 compute-0 nova_compute[183075]: 2026-01-22 17:17:51.049 183079 INFO nova.virt.libvirt.driver [None req-090ddacf-afd6-4e0d-a8e3-6d33cd24bd59 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Successfully detached device tap6cf839fd-ff from instance c925ab60-0524-40ab-a82b-52f810b9023f from the live domain config.
Jan 22 17:17:51 compute-0 nova_compute[183075]: 2026-01-22 17:17:51.051 183079 DEBUG nova.virt.libvirt.vif [None req-090ddacf-afd6-4e0d-a8e3-6d33cd24bd59 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:17:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-37806664',display_name='tempest-server-test-37806664',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-37806664',id=31,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDmUdcgi0W3CCW7+0zVPZm4+SGvgMVYIF2EiVCZOZNnFUe7R59w7UIBntHS+djSABVTvpJeBQFo+aQIZ5cRYQ1WALN6CDQL2VSKr+LjaC8setfZMT5V7IalcT1iGutwkuw==',key_name='tempest-keypair-test-914827607',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:17:22Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4f95c62a5a194d2291b03187a9c85702',ramdisk_id='',reservation_id='r-q3uks0k4',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-IPv6Test-1828780436',owner_user_name='tempest-IPv6Test-1828780436-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:17:22Z,user_data=None,user_id='2fe02f7484a94091bab26aba1c370459',uuid=c925ab60-0524-40ab-a82b-52f810b9023f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6cf839fd-ff11-4ab3-a473-8a61175f769b", "address": "fa:16:3e:ae:c3:d7", "network": {"id": "3b6ff245-6da2-41fe-a6c8-3c52ded12515", "bridge": "br-int", "label": "tempest-test-network--1156553269", "subnets": [{"cidr": "2001:db8:0:2::/64", "dns": [], "gateway": {"address": "2001:db8:0:2::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:2:f816:3eff:feae:c3d7", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cf839fd-ff", "ovs_interfaceid": "6cf839fd-ff11-4ab3-a473-8a61175f769b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:17:51 compute-0 nova_compute[183075]: 2026-01-22 17:17:51.052 183079 DEBUG nova.network.os_vif_util [None req-090ddacf-afd6-4e0d-a8e3-6d33cd24bd59 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converting VIF {"id": "6cf839fd-ff11-4ab3-a473-8a61175f769b", "address": "fa:16:3e:ae:c3:d7", "network": {"id": "3b6ff245-6da2-41fe-a6c8-3c52ded12515", "bridge": "br-int", "label": "tempest-test-network--1156553269", "subnets": [{"cidr": "2001:db8:0:2::/64", "dns": [], "gateway": {"address": "2001:db8:0:2::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:2:f816:3eff:feae:c3d7", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cf839fd-ff", "ovs_interfaceid": "6cf839fd-ff11-4ab3-a473-8a61175f769b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:17:51 compute-0 nova_compute[183075]: 2026-01-22 17:17:51.053 183079 DEBUG nova.network.os_vif_util [None req-090ddacf-afd6-4e0d-a8e3-6d33cd24bd59 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ae:c3:d7,bridge_name='br-int',has_traffic_filtering=True,id=6cf839fd-ff11-4ab3-a473-8a61175f769b,network=Network(3b6ff245-6da2-41fe-a6c8-3c52ded12515),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6cf839fd-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:17:51 compute-0 nova_compute[183075]: 2026-01-22 17:17:51.053 183079 DEBUG os_vif [None req-090ddacf-afd6-4e0d-a8e3-6d33cd24bd59 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ae:c3:d7,bridge_name='br-int',has_traffic_filtering=True,id=6cf839fd-ff11-4ab3-a473-8a61175f769b,network=Network(3b6ff245-6da2-41fe-a6c8-3c52ded12515),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6cf839fd-ff') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:17:51 compute-0 nova_compute[183075]: 2026-01-22 17:17:51.055 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:51 compute-0 nova_compute[183075]: 2026-01-22 17:17:51.055 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6cf839fd-ff, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:17:51 compute-0 nova_compute[183075]: 2026-01-22 17:17:51.056 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:51 compute-0 nova_compute[183075]: 2026-01-22 17:17:51.059 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:51 compute-0 nova_compute[183075]: 2026-01-22 17:17:51.063 183079 INFO os_vif [None req-090ddacf-afd6-4e0d-a8e3-6d33cd24bd59 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ae:c3:d7,bridge_name='br-int',has_traffic_filtering=True,id=6cf839fd-ff11-4ab3-a473-8a61175f769b,network=Network(3b6ff245-6da2-41fe-a6c8-3c52ded12515),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6cf839fd-ff')
Jan 22 17:17:51 compute-0 nova_compute[183075]: 2026-01-22 17:17:51.063 183079 DEBUG nova.virt.libvirt.guest [None req-090ddacf-afd6-4e0d-a8e3-6d33cd24bd59 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:17:51 compute-0 nova_compute[183075]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   <nova:name>tempest-server-test-37806664</nova:name>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   <nova:creationTime>2026-01-22 17:17:51</nova:creationTime>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   <nova:flavor name="m1.nano">
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <nova:memory>128</nova:memory>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <nova:disk>1</nova:disk>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <nova:swap>0</nova:swap>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <nova:vcpus>1</nova:vcpus>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   </nova:flavor>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   <nova:owner>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <nova:user uuid="2fe02f7484a94091bab26aba1c370459">tempest-IPv6Test-1828780436-project-member</nova:user>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <nova:project uuid="4f95c62a5a194d2291b03187a9c85702">tempest-IPv6Test-1828780436</nova:project>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   </nova:owner>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   <nova:ports>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     <nova:port uuid="6b1bf9db-e098-4d03-b185-9a64eee8cec2">
Jan 22 17:17:51 compute-0 nova_compute[183075]:       <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 17:17:51 compute-0 nova_compute[183075]:     </nova:port>
Jan 22 17:17:51 compute-0 nova_compute[183075]:   </nova:ports>
Jan 22 17:17:51 compute-0 nova_compute[183075]: </nova:instance>
Jan 22 17:17:51 compute-0 nova_compute[183075]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 22 17:17:51 compute-0 neutron-haproxy-ovnmeta-3b6ff245-6da2-41fe-a6c8-3c52ded12515[224866]: [NOTICE]   (224871) : haproxy version is 2.8.14-c23fe91
Jan 22 17:17:51 compute-0 neutron-haproxy-ovnmeta-3b6ff245-6da2-41fe-a6c8-3c52ded12515[224866]: [NOTICE]   (224871) : path to executable is /usr/sbin/haproxy
Jan 22 17:17:51 compute-0 neutron-haproxy-ovnmeta-3b6ff245-6da2-41fe-a6c8-3c52ded12515[224866]: [WARNING]  (224871) : Exiting Master process...
Jan 22 17:17:51 compute-0 nova_compute[183075]: 2026-01-22 17:17:51.207 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:51 compute-0 neutron-haproxy-ovnmeta-3b6ff245-6da2-41fe-a6c8-3c52ded12515[224866]: [ALERT]    (224871) : Current worker (224873) exited with code 143 (Terminated)
Jan 22 17:17:51 compute-0 neutron-haproxy-ovnmeta-3b6ff245-6da2-41fe-a6c8-3c52ded12515[224866]: [WARNING]  (224871) : All workers exited. Exiting... (0)
Jan 22 17:17:51 compute-0 systemd[1]: libpod-3ed5997b2da70bd8c6fe5711915d3ed0fea452810a5e9316c87b9faa210b2db4.scope: Deactivated successfully.
Jan 22 17:17:51 compute-0 podman[225144]: 2026-01-22 17:17:51.215303465 +0000 UTC m=+0.049075933 container died 3ed5997b2da70bd8c6fe5711915d3ed0fea452810a5e9316c87b9faa210b2db4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3b6ff245-6da2-41fe-a6c8-3c52ded12515, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 17:17:51 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3ed5997b2da70bd8c6fe5711915d3ed0fea452810a5e9316c87b9faa210b2db4-userdata-shm.mount: Deactivated successfully.
Jan 22 17:17:51 compute-0 systemd[1]: var-lib-containers-storage-overlay-f7d1faeafc17ebfb2cb8e0422d67804b55d383fcb80ca43817fd60b6da1860e1-merged.mount: Deactivated successfully.
Jan 22 17:17:51 compute-0 podman[225144]: 2026-01-22 17:17:51.250410002 +0000 UTC m=+0.084182450 container cleanup 3ed5997b2da70bd8c6fe5711915d3ed0fea452810a5e9316c87b9faa210b2db4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3b6ff245-6da2-41fe-a6c8-3c52ded12515, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 17:17:51 compute-0 systemd[1]: libpod-conmon-3ed5997b2da70bd8c6fe5711915d3ed0fea452810a5e9316c87b9faa210b2db4.scope: Deactivated successfully.
Jan 22 17:17:51 compute-0 podman[225175]: 2026-01-22 17:17:51.306460897 +0000 UTC m=+0.037367348 container remove 3ed5997b2da70bd8c6fe5711915d3ed0fea452810a5e9316c87b9faa210b2db4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3b6ff245-6da2-41fe-a6c8-3c52ded12515, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:17:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:51.312 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7a7403c5-1892-4a5b-a14a-6d38bfe5dcd1]: (4, ('Thu Jan 22 05:17:51 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3b6ff245-6da2-41fe-a6c8-3c52ded12515 (3ed5997b2da70bd8c6fe5711915d3ed0fea452810a5e9316c87b9faa210b2db4)\n3ed5997b2da70bd8c6fe5711915d3ed0fea452810a5e9316c87b9faa210b2db4\nThu Jan 22 05:17:51 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3b6ff245-6da2-41fe-a6c8-3c52ded12515 (3ed5997b2da70bd8c6fe5711915d3ed0fea452810a5e9316c87b9faa210b2db4)\n3ed5997b2da70bd8c6fe5711915d3ed0fea452810a5e9316c87b9faa210b2db4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:51.313 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[33e9baa3-a2e0-4346-b153-721e9268e6cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:51.314 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3b6ff245-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:17:51 compute-0 nova_compute[183075]: 2026-01-22 17:17:51.316 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:51 compute-0 kernel: tap3b6ff245-60: left promiscuous mode
Jan 22 17:17:51 compute-0 nova_compute[183075]: 2026-01-22 17:17:51.330 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:51 compute-0 nova_compute[183075]: 2026-01-22 17:17:51.331 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:51.333 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7ef323ee-86bf-45d3-9f4f-247a28104d94]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:51.352 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[80cab0fc-32e2-4f97-b3bf-a855f7467b22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:51.353 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6edbd717-9b4e-496d-bc90-d6ee94af4b44]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:51.371 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[69cf669a-4147-443d-8ecb-3203030805ee]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 460422, 'reachable_time': 21882, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225190, 'error': None, 'target': 'ovnmeta-3b6ff245-6da2-41fe-a6c8-3c52ded12515', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:51.376 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3b6ff245-6da2-41fe-a6c8-3c52ded12515 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:17:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:51.376 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[a483a007-8054-440d-8a70-3c699fb3c979]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:51 compute-0 systemd[1]: run-netns-ovnmeta\x2d3b6ff245\x2d6da2\x2d41fe\x2da6c8\x2d3c52ded12515.mount: Deactivated successfully.
Jan 22 17:17:51 compute-0 nova_compute[183075]: 2026-01-22 17:17:51.420 183079 DEBUG nova.compute.manager [req-edcc6407-52bd-474d-aeba-9c4ddda71a0b req-5e9ecf18-dc5b-4580-9837-d051ce1b07d8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Received event network-vif-unplugged-6cf839fd-ff11-4ab3-a473-8a61175f769b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:17:51 compute-0 nova_compute[183075]: 2026-01-22 17:17:51.420 183079 DEBUG oslo_concurrency.lockutils [req-edcc6407-52bd-474d-aeba-9c4ddda71a0b req-5e9ecf18-dc5b-4580-9837-d051ce1b07d8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:17:51 compute-0 nova_compute[183075]: 2026-01-22 17:17:51.420 183079 DEBUG oslo_concurrency.lockutils [req-edcc6407-52bd-474d-aeba-9c4ddda71a0b req-5e9ecf18-dc5b-4580-9837-d051ce1b07d8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:17:51 compute-0 nova_compute[183075]: 2026-01-22 17:17:51.421 183079 DEBUG oslo_concurrency.lockutils [req-edcc6407-52bd-474d-aeba-9c4ddda71a0b req-5e9ecf18-dc5b-4580-9837-d051ce1b07d8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:17:51 compute-0 nova_compute[183075]: 2026-01-22 17:17:51.421 183079 DEBUG nova.compute.manager [req-edcc6407-52bd-474d-aeba-9c4ddda71a0b req-5e9ecf18-dc5b-4580-9837-d051ce1b07d8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] No waiting events found dispatching network-vif-unplugged-6cf839fd-ff11-4ab3-a473-8a61175f769b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:17:51 compute-0 nova_compute[183075]: 2026-01-22 17:17:51.421 183079 WARNING nova.compute.manager [req-edcc6407-52bd-474d-aeba-9c4ddda71a0b req-5e9ecf18-dc5b-4580-9837-d051ce1b07d8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Received unexpected event network-vif-unplugged-6cf839fd-ff11-4ab3-a473-8a61175f769b for instance with vm_state active and task_state None.
Jan 22 17:17:52 compute-0 nova_compute[183075]: 2026-01-22 17:17:52.073 183079 DEBUG oslo_concurrency.lockutils [None req-090ddacf-afd6-4e0d-a8e3-6d33cd24bd59 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquiring lock "refresh_cache-c925ab60-0524-40ab-a82b-52f810b9023f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:17:52 compute-0 nova_compute[183075]: 2026-01-22 17:17:52.073 183079 DEBUG oslo_concurrency.lockutils [None req-090ddacf-afd6-4e0d-a8e3-6d33cd24bd59 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquired lock "refresh_cache-c925ab60-0524-40ab-a82b-52f810b9023f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:17:52 compute-0 nova_compute[183075]: 2026-01-22 17:17:52.073 183079 DEBUG nova.network.neutron [None req-090ddacf-afd6-4e0d-a8e3-6d33cd24bd59 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:17:52 compute-0 nova_compute[183075]: 2026-01-22 17:17:52.621 183079 DEBUG oslo_concurrency.lockutils [None req-6e8075c2-d17c-479a-93cc-284dfa0221f0 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquiring lock "interface-c925ab60-0524-40ab-a82b-52f810b9023f-7e367df6-1641-474f-ae64-7a03b03508ec" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:17:52 compute-0 nova_compute[183075]: 2026-01-22 17:17:52.622 183079 DEBUG oslo_concurrency.lockutils [None req-6e8075c2-d17c-479a-93cc-284dfa0221f0 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "interface-c925ab60-0524-40ab-a82b-52f810b9023f-7e367df6-1641-474f-ae64-7a03b03508ec" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:17:52 compute-0 nova_compute[183075]: 2026-01-22 17:17:52.623 183079 DEBUG nova.objects.instance [None req-6e8075c2-d17c-479a-93cc-284dfa0221f0 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lazy-loading 'flavor' on Instance uuid c925ab60-0524-40ab-a82b-52f810b9023f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:17:53 compute-0 nova_compute[183075]: 2026-01-22 17:17:53.222 183079 INFO nova.network.neutron [None req-090ddacf-afd6-4e0d-a8e3-6d33cd24bd59 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Port 6cf839fd-ff11-4ab3-a473-8a61175f769b from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Jan 22 17:17:53 compute-0 nova_compute[183075]: 2026-01-22 17:17:53.223 183079 DEBUG nova.network.neutron [None req-090ddacf-afd6-4e0d-a8e3-6d33cd24bd59 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Updating instance_info_cache with network_info: [{"id": "6b1bf9db-e098-4d03-b185-9a64eee8cec2", "address": "fa:16:3e:e6:59:b5", "network": {"id": "359b74c5-cbeb-4440-a3e9-a16a51b1ab77", "bridge": "br-int", "label": "tempest-test-network--531378131", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b1bf9db-e0", "ovs_interfaceid": "6b1bf9db-e098-4d03-b185-9a64eee8cec2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:17:53 compute-0 nova_compute[183075]: 2026-01-22 17:17:53.227 183079 DEBUG nova.objects.instance [None req-6e8075c2-d17c-479a-93cc-284dfa0221f0 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lazy-loading 'pci_requests' on Instance uuid c925ab60-0524-40ab-a82b-52f810b9023f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:17:53 compute-0 nova_compute[183075]: 2026-01-22 17:17:53.241 183079 DEBUG nova.network.neutron [None req-6e8075c2-d17c-479a-93cc-284dfa0221f0 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:17:53 compute-0 nova_compute[183075]: 2026-01-22 17:17:53.244 183079 DEBUG oslo_concurrency.lockutils [None req-090ddacf-afd6-4e0d-a8e3-6d33cd24bd59 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Releasing lock "refresh_cache-c925ab60-0524-40ab-a82b-52f810b9023f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:17:53 compute-0 nova_compute[183075]: 2026-01-22 17:17:53.265 183079 DEBUG oslo_concurrency.lockutils [None req-090ddacf-afd6-4e0d-a8e3-6d33cd24bd59 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "interface-c925ab60-0524-40ab-a82b-52f810b9023f-6cf839fd-ff11-4ab3-a473-8a61175f769b" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 2.377s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:17:53 compute-0 nova_compute[183075]: 2026-01-22 17:17:53.495 183079 DEBUG nova.compute.manager [req-b7db5f3e-b0f0-4ef8-b5dd-e946b55f6692 req-8350918a-d06f-4c0a-bcd5-d1b92687649a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Received event network-vif-plugged-6cf839fd-ff11-4ab3-a473-8a61175f769b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:17:53 compute-0 nova_compute[183075]: 2026-01-22 17:17:53.496 183079 DEBUG oslo_concurrency.lockutils [req-b7db5f3e-b0f0-4ef8-b5dd-e946b55f6692 req-8350918a-d06f-4c0a-bcd5-d1b92687649a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:17:53 compute-0 nova_compute[183075]: 2026-01-22 17:17:53.496 183079 DEBUG oslo_concurrency.lockutils [req-b7db5f3e-b0f0-4ef8-b5dd-e946b55f6692 req-8350918a-d06f-4c0a-bcd5-d1b92687649a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:17:53 compute-0 nova_compute[183075]: 2026-01-22 17:17:53.497 183079 DEBUG oslo_concurrency.lockutils [req-b7db5f3e-b0f0-4ef8-b5dd-e946b55f6692 req-8350918a-d06f-4c0a-bcd5-d1b92687649a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:17:53 compute-0 nova_compute[183075]: 2026-01-22 17:17:53.497 183079 DEBUG nova.compute.manager [req-b7db5f3e-b0f0-4ef8-b5dd-e946b55f6692 req-8350918a-d06f-4c0a-bcd5-d1b92687649a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] No waiting events found dispatching network-vif-plugged-6cf839fd-ff11-4ab3-a473-8a61175f769b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:17:53 compute-0 nova_compute[183075]: 2026-01-22 17:17:53.497 183079 WARNING nova.compute.manager [req-b7db5f3e-b0f0-4ef8-b5dd-e946b55f6692 req-8350918a-d06f-4c0a-bcd5-d1b92687649a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Received unexpected event network-vif-plugged-6cf839fd-ff11-4ab3-a473-8a61175f769b for instance with vm_state active and task_state None.
Jan 22 17:17:53 compute-0 nova_compute[183075]: 2026-01-22 17:17:53.498 183079 DEBUG nova.compute.manager [req-b7db5f3e-b0f0-4ef8-b5dd-e946b55f6692 req-8350918a-d06f-4c0a-bcd5-d1b92687649a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Received event network-vif-deleted-6cf839fd-ff11-4ab3-a473-8a61175f769b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:17:53 compute-0 nova_compute[183075]: 2026-01-22 17:17:53.503 183079 DEBUG nova.policy [None req-6e8075c2-d17c-479a-93cc-284dfa0221f0 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2fe02f7484a94091bab26aba1c370459', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4f95c62a5a194d2291b03187a9c85702', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:17:54 compute-0 nova_compute[183075]: 2026-01-22 17:17:54.171 183079 DEBUG nova.network.neutron [None req-6e8075c2-d17c-479a-93cc-284dfa0221f0 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Successfully updated port: 7e367df6-1641-474f-ae64-7a03b03508ec _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:17:54 compute-0 nova_compute[183075]: 2026-01-22 17:17:54.191 183079 DEBUG oslo_concurrency.lockutils [None req-6e8075c2-d17c-479a-93cc-284dfa0221f0 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquiring lock "refresh_cache-c925ab60-0524-40ab-a82b-52f810b9023f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:17:54 compute-0 nova_compute[183075]: 2026-01-22 17:17:54.191 183079 DEBUG oslo_concurrency.lockutils [None req-6e8075c2-d17c-479a-93cc-284dfa0221f0 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquired lock "refresh_cache-c925ab60-0524-40ab-a82b-52f810b9023f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:17:54 compute-0 nova_compute[183075]: 2026-01-22 17:17:54.192 183079 DEBUG nova.network.neutron [None req-6e8075c2-d17c-479a-93cc-284dfa0221f0 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:17:55 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:55.177 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.454 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c925ab60-0524-40ab-a82b-52f810b9023f', 'name': 'tempest-server-test-37806664', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000001f', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '4f95c62a5a194d2291b03187a9c85702', 'user_id': '2fe02f7484a94091bab26aba1c370459', 'hostId': 'bc83ccfc8630a483b91f8ec6b9cbc1691f83379d0f12fffb808a881d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.456 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '02af6288-0bd3-438c-982d-f36b31e1a9bf', 'name': 'tempest-server-test-571651884', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000020', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '70a37fbbd795434fbeb722ad97dda552', 'user_id': '3d4b274173814c359055fed8dfc2bdeb', 'hostId': 'e8933227d6a3efd2305f5d9d26f91ebb25e55f4f685af8a0a733b206', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.456 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.456 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.456 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-37806664>, <NovaLikeServer: tempest-server-test-571651884>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-37806664>, <NovaLikeServer: tempest-server-test-571651884>]
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.457 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.459 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for c925ab60-0524-40ab-a82b-52f810b9023f / tap6b1bf9db-e0 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.459 12 DEBUG ceilometer.compute.pollsters [-] c925ab60-0524-40ab-a82b-52f810b9023f/network.incoming.packets volume: 144 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.462 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 02af6288-0bd3-438c-982d-f36b31e1a9bf / tap9a06288c-d8 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.462 12 DEBUG ceilometer.compute.pollsters [-] 02af6288-0bd3-438c-982d-f36b31e1a9bf/network.incoming.packets volume: 62 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5341f46f-92ee-44f8-8c43-c0e27695cf6d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 144, 'user_id': '2fe02f7484a94091bab26aba1c370459', 'user_name': None, 'project_id': '4f95c62a5a194d2291b03187a9c85702', 'project_name': None, 'resource_id': 'instance-0000001f-c925ab60-0524-40ab-a82b-52f810b9023f-tap6b1bf9db-e0', 'timestamp': '2026-01-22T17:17:55.457073', 'resource_metadata': {'display_name': 'tempest-server-test-37806664', 'name': 'tap6b1bf9db-e0', 'instance_id': 'c925ab60-0524-40ab-a82b-52f810b9023f', 'instance_type': 'm1.nano', 'host': 'bc83ccfc8630a483b91f8ec6b9cbc1691f83379d0f12fffb808a881d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:59:b5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b1bf9db-e0'}, 'message_id': '4a818856-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.217482503, 'message_signature': 'c8cc326e2acbdb690afb37279b085320f292e2e38b8e67d90f4c801057e48243'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 62, 'user_id': 
'3d4b274173814c359055fed8dfc2bdeb', 'user_name': None, 'project_id': '70a37fbbd795434fbeb722ad97dda552', 'project_name': None, 'resource_id': 'instance-00000020-02af6288-0bd3-438c-982d-f36b31e1a9bf-tap9a06288c-d8', 'timestamp': '2026-01-22T17:17:55.457073', 'resource_metadata': {'display_name': 'tempest-server-test-571651884', 'name': 'tap9a06288c-d8', 'instance_id': '02af6288-0bd3-438c-982d-f36b31e1a9bf', 'instance_type': 'm1.nano', 'host': 'e8933227d6a3efd2305f5d9d26f91ebb25e55f4f685af8a0a733b206', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b0:5d:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9a06288c-d8'}, 'message_id': '4a8204ac-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.220246535, 'message_signature': 'a1db87993709859c216ca0c6b46a2727b929c90ddfd456c89ac04decbd21f095'}]}, 'timestamp': '2026-01-22 17:17:55.462999', '_unique_id': '069a2637994847c9b2b5e6b6d701f1f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.464 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.478 12 DEBUG ceilometer.compute.pollsters [-] c925ab60-0524-40ab-a82b-52f810b9023f/disk.device.read.latency volume: 263586075 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.498 12 DEBUG ceilometer.compute.pollsters [-] 02af6288-0bd3-438c-982d-f36b31e1a9bf/disk.device.read.latency volume: 271629862 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6b57a524-eb66-468f-a94d-d370e9a7ff1c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 263586075, 'user_id': '2fe02f7484a94091bab26aba1c370459', 'user_name': None, 'project_id': '4f95c62a5a194d2291b03187a9c85702', 'project_name': None, 'resource_id': 'c925ab60-0524-40ab-a82b-52f810b9023f-vda', 'timestamp': '2026-01-22T17:17:55.464901', 'resource_metadata': {'display_name': 'tempest-server-test-37806664', 'name': 'instance-0000001f', 'instance_id': 'c925ab60-0524-40ab-a82b-52f810b9023f', 'instance_type': 'm1.nano', 'host': 'bc83ccfc8630a483b91f8ec6b9cbc1691f83379d0f12fffb808a881d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4a847ed0-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.225313738, 'message_signature': '2abf405c627331623db9f2cbdeea6f3f3b568ee868f34abd8350e7f28d9ec0eb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 271629862, 'user_id': '3d4b274173814c359055fed8dfc2bdeb', 'user_name': None, 'project_id': '70a37fbbd795434fbeb722ad97dda552', 'project_name': None, 'resource_id': 
'02af6288-0bd3-438c-982d-f36b31e1a9bf-vda', 'timestamp': '2026-01-22T17:17:55.464901', 'resource_metadata': {'display_name': 'tempest-server-test-571651884', 'name': 'instance-00000020', 'instance_id': '02af6288-0bd3-438c-982d-f36b31e1a9bf', 'instance_type': 'm1.nano', 'host': 'e8933227d6a3efd2305f5d9d26f91ebb25e55f4f685af8a0a733b206', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4a8785da-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.239622951, 'message_signature': '65e3b0bb2959725df055df3bafa40d23408330d6feeb288d4a3d086c283560ea'}]}, 'timestamp': '2026-01-22 17:17:55.499079', '_unique_id': 'e87358b1cdef45038b57eeb6f77b8081'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.499 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.500 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.510 12 DEBUG ceilometer.compute.pollsters [-] c925ab60-0524-40ab-a82b-52f810b9023f/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.521 12 DEBUG ceilometer.compute.pollsters [-] 02af6288-0bd3-438c-982d-f36b31e1a9bf/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '09c3e20b-c190-47f9-a939-a6d08d6e70d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '2fe02f7484a94091bab26aba1c370459', 'user_name': None, 'project_id': '4f95c62a5a194d2291b03187a9c85702', 'project_name': None, 'resource_id': 'c925ab60-0524-40ab-a82b-52f810b9023f-vda', 'timestamp': '2026-01-22T17:17:55.500603', 'resource_metadata': {'display_name': 'tempest-server-test-37806664', 'name': 'instance-0000001f', 'instance_id': 'c925ab60-0524-40ab-a82b-52f810b9023f', 'instance_type': 'm1.nano', 'host': 'bc83ccfc8630a483b91f8ec6b9cbc1691f83379d0f12fffb808a881d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4a8961c0-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.261050171, 'message_signature': 'deb0b8dfa99d020a68af6b028ee36d16c21a0931f2b415bafafdbe231bdb3918'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '3d4b274173814c359055fed8dfc2bdeb', 'user_name': None, 'project_id': '70a37fbbd795434fbeb722ad97dda552', 'project_name': None, 'resource_id': 
'02af6288-0bd3-438c-982d-f36b31e1a9bf-vda', 'timestamp': '2026-01-22T17:17:55.500603', 'resource_metadata': {'display_name': 'tempest-server-test-571651884', 'name': 'instance-00000020', 'instance_id': '02af6288-0bd3-438c-982d-f36b31e1a9bf', 'instance_type': 'm1.nano', 'host': 'e8933227d6a3efd2305f5d9d26f91ebb25e55f4f685af8a0a733b206', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4a8b06a6-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.271869394, 'message_signature': '4d0416df8fb0cb96c171d980c9e5618da71b8bb5c42c5cf865d7eb11c9858b53'}]}, 'timestamp': '2026-01-22 17:17:55.522244', '_unique_id': '3eb093e22e114df589fe8c3bed79d0ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.523 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.526 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.526 12 DEBUG ceilometer.compute.pollsters [-] c925ab60-0524-40ab-a82b-52f810b9023f/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.527 12 DEBUG ceilometer.compute.pollsters [-] 02af6288-0bd3-438c-982d-f36b31e1a9bf/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f0003db-1e63-4dd0-8c0d-02d85fccf462', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2fe02f7484a94091bab26aba1c370459', 'user_name': None, 'project_id': '4f95c62a5a194d2291b03187a9c85702', 'project_name': None, 'resource_id': 'c925ab60-0524-40ab-a82b-52f810b9023f-vda', 'timestamp': '2026-01-22T17:17:55.526603', 'resource_metadata': {'display_name': 'tempest-server-test-37806664', 'name': 'instance-0000001f', 'instance_id': 'c925ab60-0524-40ab-a82b-52f810b9023f', 'instance_type': 'm1.nano', 'host': 'bc83ccfc8630a483b91f8ec6b9cbc1691f83379d0f12fffb808a881d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4a8bcbc2-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.261050171, 'message_signature': '31eb6959ce8c62933fb90e9d1416a24cbc9a33dd98e2303c859cf32c6f966df1'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '3d4b274173814c359055fed8dfc2bdeb', 'user_name': None, 'project_id': '70a37fbbd795434fbeb722ad97dda552', 'project_name': None, 'resource_id': 
'02af6288-0bd3-438c-982d-f36b31e1a9bf-vda', 'timestamp': '2026-01-22T17:17:55.526603', 'resource_metadata': {'display_name': 'tempest-server-test-571651884', 'name': 'instance-00000020', 'instance_id': '02af6288-0bd3-438c-982d-f36b31e1a9bf', 'instance_type': 'm1.nano', 'host': 'e8933227d6a3efd2305f5d9d26f91ebb25e55f4f685af8a0a733b206', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4a8bdf90-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.271869394, 'message_signature': 'ae10d14452537b8f9cdcd2bd7f20cd47198502c7c54172d38ea4ed30fbcd87c4'}]}, 'timestamp': '2026-01-22 17:17:55.527772', '_unique_id': '6804dcd30a1343a5a43705cb62fe9ad3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.528 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.530 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.531 12 DEBUG ceilometer.compute.pollsters [-] c925ab60-0524-40ab-a82b-52f810b9023f/network.outgoing.bytes volume: 26143 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.532 12 DEBUG ceilometer.compute.pollsters [-] 02af6288-0bd3-438c-982d-f36b31e1a9bf/network.outgoing.bytes volume: 10851 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '911c21e9-2e69-4afb-b1aa-8bd2c7d25c47', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 26143, 'user_id': '2fe02f7484a94091bab26aba1c370459', 'user_name': None, 'project_id': '4f95c62a5a194d2291b03187a9c85702', 'project_name': None, 'resource_id': 'instance-0000001f-c925ab60-0524-40ab-a82b-52f810b9023f-tap6b1bf9db-e0', 'timestamp': '2026-01-22T17:17:55.531248', 'resource_metadata': {'display_name': 'tempest-server-test-37806664', 'name': 'tap6b1bf9db-e0', 'instance_id': 'c925ab60-0524-40ab-a82b-52f810b9023f', 'instance_type': 'm1.nano', 'host': 'bc83ccfc8630a483b91f8ec6b9cbc1691f83379d0f12fffb808a881d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:59:b5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b1bf9db-e0'}, 'message_id': '4a8c86ac-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.217482503, 'message_signature': 'd96e748fc3ee97a9381bf225f8f0dc0713832c4ee4dde1961c1d91596c7641c7'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10851, 'user_id': 
'3d4b274173814c359055fed8dfc2bdeb', 'user_name': None, 'project_id': '70a37fbbd795434fbeb722ad97dda552', 'project_name': None, 'resource_id': 'instance-00000020-02af6288-0bd3-438c-982d-f36b31e1a9bf-tap9a06288c-d8', 'timestamp': '2026-01-22T17:17:55.531248', 'resource_metadata': {'display_name': 'tempest-server-test-571651884', 'name': 'tap9a06288c-d8', 'instance_id': '02af6288-0bd3-438c-982d-f36b31e1a9bf', 'instance_type': 'm1.nano', 'host': 'e8933227d6a3efd2305f5d9d26f91ebb25e55f4f685af8a0a733b206', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b0:5d:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9a06288c-d8'}, 'message_id': '4a8ca236-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.220246535, 'message_signature': '9522bafc425f7b124ea6f3e7d98c88cadb5af7f0e67fe8cc072ba81d34ce597f'}]}, 'timestamp': '2026-01-22 17:17:55.532773', '_unique_id': '99b661d8434c4eec88fb16ccd2b1c499'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.537 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.537 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.537 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-37806664>, <NovaLikeServer: tempest-server-test-571651884>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-37806664>, <NovaLikeServer: tempest-server-test-571651884>]
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.538 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.538 12 DEBUG ceilometer.compute.pollsters [-] c925ab60-0524-40ab-a82b-52f810b9023f/disk.device.write.requests volume: 329 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.539 12 DEBUG ceilometer.compute.pollsters [-] 02af6288-0bd3-438c-982d-f36b31e1a9bf/disk.device.write.requests volume: 326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab08950b-588a-4c89-8cb8-027627e80d25', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 329, 'user_id': '2fe02f7484a94091bab26aba1c370459', 'user_name': None, 'project_id': '4f95c62a5a194d2291b03187a9c85702', 'project_name': None, 'resource_id': 'c925ab60-0524-40ab-a82b-52f810b9023f-vda', 'timestamp': '2026-01-22T17:17:55.538546', 'resource_metadata': {'display_name': 'tempest-server-test-37806664', 'name': 'instance-0000001f', 'instance_id': 'c925ab60-0524-40ab-a82b-52f810b9023f', 'instance_type': 'm1.nano', 'host': 'bc83ccfc8630a483b91f8ec6b9cbc1691f83379d0f12fffb808a881d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4a8d9fe2-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.225313738, 'message_signature': 'b113430c27605eb2796d74c0918fbd14bdd14706a00523796ba14bd561c1fc76'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 326, 'user_id': '3d4b274173814c359055fed8dfc2bdeb', 'user_name': None, 'project_id': '70a37fbbd795434fbeb722ad97dda552', 'project_name': None, 
'resource_id': '02af6288-0bd3-438c-982d-f36b31e1a9bf-vda', 'timestamp': '2026-01-22T17:17:55.538546', 'resource_metadata': {'display_name': 'tempest-server-test-571651884', 'name': 'instance-00000020', 'instance_id': '02af6288-0bd3-438c-982d-f36b31e1a9bf', 'instance_type': 'm1.nano', 'host': 'e8933227d6a3efd2305f5d9d26f91ebb25e55f4f685af8a0a733b206', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4a8db9c8-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.239622951, 'message_signature': 'f15936e35c7a340ecf82b5af7be9fa6eaa5cf567d1a5b46017610e14cf296fc2'}]}, 'timestamp': '2026-01-22 17:17:55.539932', '_unique_id': '1f83cedfbb2b4b79b2799a3531995ad2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.541 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.543 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.544 12 DEBUG ceilometer.compute.pollsters [-] c925ab60-0524-40ab-a82b-52f810b9023f/disk.device.write.latency volume: 4421423679 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.544 12 DEBUG ceilometer.compute.pollsters [-] 02af6288-0bd3-438c-982d-f36b31e1a9bf/disk.device.write.latency volume: 2747888268 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '222fb487-db2c-45c8-a831-3be9d89e8f0f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4421423679, 'user_id': '2fe02f7484a94091bab26aba1c370459', 'user_name': None, 'project_id': '4f95c62a5a194d2291b03187a9c85702', 'project_name': None, 'resource_id': 'c925ab60-0524-40ab-a82b-52f810b9023f-vda', 'timestamp': '2026-01-22T17:17:55.544229', 'resource_metadata': {'display_name': 'tempest-server-test-37806664', 'name': 'instance-0000001f', 'instance_id': 'c925ab60-0524-40ab-a82b-52f810b9023f', 'instance_type': 'm1.nano', 'host': 'bc83ccfc8630a483b91f8ec6b9cbc1691f83379d0f12fffb808a881d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4a8e7bb0-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.225313738, 'message_signature': 'd95c5e2fccf331e743d07838229681f1fa82dc0b50b5352f4cc42f60614bf2f5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2747888268, 'user_id': '3d4b274173814c359055fed8dfc2bdeb', 'user_name': None, 'project_id': '70a37fbbd795434fbeb722ad97dda552', 'project_name': None, 
'resource_id': '02af6288-0bd3-438c-982d-f36b31e1a9bf-vda', 'timestamp': '2026-01-22T17:17:55.544229', 'resource_metadata': {'display_name': 'tempest-server-test-571651884', 'name': 'instance-00000020', 'instance_id': '02af6288-0bd3-438c-982d-f36b31e1a9bf', 'instance_type': 'm1.nano', 'host': 'e8933227d6a3efd2305f5d9d26f91ebb25e55f4f685af8a0a733b206', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4a8e90dc-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.239622951, 'message_signature': '4d1d3e756f1108eb0a9a84625721138ea02f60056978f3c0cbd6c58dd698e8fb'}]}, 'timestamp': '2026-01-22 17:17:55.545324', '_unique_id': '7999c146ae364386ae4f666cdbaea750'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.546 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.548 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.574 12 DEBUG ceilometer.compute.pollsters [-] c925ab60-0524-40ab-a82b-52f810b9023f/cpu volume: 11460000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.600 12 DEBUG ceilometer.compute.pollsters [-] 02af6288-0bd3-438c-982d-f36b31e1a9bf/cpu volume: 11110000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '28f59a63-807f-4b5e-ba0c-8d84868b6fbf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11460000000, 'user_id': '2fe02f7484a94091bab26aba1c370459', 'user_name': None, 'project_id': '4f95c62a5a194d2291b03187a9c85702', 'project_name': None, 'resource_id': 'c925ab60-0524-40ab-a82b-52f810b9023f', 'timestamp': '2026-01-22T17:17:55.548203', 'resource_metadata': {'display_name': 'tempest-server-test-37806664', 'name': 'instance-0000001f', 'instance_id': 'c925ab60-0524-40ab-a82b-52f810b9023f', 'instance_type': 'm1.nano', 'host': 'bc83ccfc8630a483b91f8ec6b9cbc1691f83379d0f12fffb808a881d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '4a932372-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.33448598, 'message_signature': '2b20944ab13c373644fc4b9a5c1bd98a0cf76560ee6bd8d446cef98de154d8cd'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11110000000, 'user_id': '3d4b274173814c359055fed8dfc2bdeb', 'user_name': None, 'project_id': '70a37fbbd795434fbeb722ad97dda552', 'project_name': None, 'resource_id': '02af6288-0bd3-438c-982d-f36b31e1a9bf', 
'timestamp': '2026-01-22T17:17:55.548203', 'resource_metadata': {'display_name': 'tempest-server-test-571651884', 'name': 'instance-00000020', 'instance_id': '02af6288-0bd3-438c-982d-f36b31e1a9bf', 'instance_type': 'm1.nano', 'host': 'e8933227d6a3efd2305f5d9d26f91ebb25e55f4f685af8a0a733b206', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '4a971360-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.360756856, 'message_signature': 'c10ee3083eb0d11091b356f81fef46e3291ff993f417c320bc7625445f03591b'}]}, 'timestamp': '2026-01-22 17:17:55.601095', '_unique_id': 'ac5a9135e7cc41fd839775faa79572c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.602 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.603 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.603 12 DEBUG ceilometer.compute.pollsters [-] c925ab60-0524-40ab-a82b-52f810b9023f/disk.device.allocation volume: 30220288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.603 12 DEBUG ceilometer.compute.pollsters [-] 02af6288-0bd3-438c-982d-f36b31e1a9bf/disk.device.allocation volume: 30810112 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a20f3fd-ca86-4972-a05f-7716d9d0496f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30220288, 'user_id': '2fe02f7484a94091bab26aba1c370459', 'user_name': None, 'project_id': '4f95c62a5a194d2291b03187a9c85702', 'project_name': None, 'resource_id': 'c925ab60-0524-40ab-a82b-52f810b9023f-vda', 'timestamp': '2026-01-22T17:17:55.603121', 'resource_metadata': {'display_name': 'tempest-server-test-37806664', 'name': 'instance-0000001f', 'instance_id': 'c925ab60-0524-40ab-a82b-52f810b9023f', 'instance_type': 'm1.nano', 'host': 'bc83ccfc8630a483b91f8ec6b9cbc1691f83379d0f12fffb808a881d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4a977026-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.261050171, 'message_signature': '111b1983783ecd5f08b977fb89d39dde3a2bedb6844f3f91b9b27040074b893c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30810112, 'user_id': '3d4b274173814c359055fed8dfc2bdeb', 'user_name': None, 'project_id': '70a37fbbd795434fbeb722ad97dda552', 'project_name': None, 'resource_id': 
'02af6288-0bd3-438c-982d-f36b31e1a9bf-vda', 'timestamp': '2026-01-22T17:17:55.603121', 'resource_metadata': {'display_name': 'tempest-server-test-571651884', 'name': 'instance-00000020', 'instance_id': '02af6288-0bd3-438c-982d-f36b31e1a9bf', 'instance_type': 'm1.nano', 'host': 'e8933227d6a3efd2305f5d9d26f91ebb25e55f4f685af8a0a733b206', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4a977b52-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.271869394, 'message_signature': '26f458f7041753fc635d62236de8406874905561648394aa9968d9a2bcc599d7'}]}, 'timestamp': '2026-01-22 17:17:55.603710', '_unique_id': 'ff55179817654c7b9c8f5102c881883a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.604 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.605 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.605 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-37806664>, <NovaLikeServer: tempest-server-test-571651884>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-37806664>, <NovaLikeServer: tempest-server-test-571651884>]
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.605 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.605 12 DEBUG ceilometer.compute.pollsters [-] c925ab60-0524-40ab-a82b-52f810b9023f/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.605 12 DEBUG ceilometer.compute.pollsters [-] 02af6288-0bd3-438c-982d-f36b31e1a9bf/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a52e57a5-a02e-439a-8e3b-a655056cf3a1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2fe02f7484a94091bab26aba1c370459', 'user_name': None, 'project_id': '4f95c62a5a194d2291b03187a9c85702', 'project_name': None, 'resource_id': 'instance-0000001f-c925ab60-0524-40ab-a82b-52f810b9023f-tap6b1bf9db-e0', 'timestamp': '2026-01-22T17:17:55.605315', 'resource_metadata': {'display_name': 'tempest-server-test-37806664', 'name': 'tap6b1bf9db-e0', 'instance_id': 'c925ab60-0524-40ab-a82b-52f810b9023f', 'instance_type': 'm1.nano', 'host': 'bc83ccfc8630a483b91f8ec6b9cbc1691f83379d0f12fffb808a881d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:59:b5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b1bf9db-e0'}, 'message_id': '4a97c580-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.217482503, 'message_signature': '162f17b93c67f5c86965a3d18c6f2c0f0cc70ca3b77a469a063b58b426fd8e20'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3d4b274173814c359055fed8dfc2bdeb', 'user_name': None, 'project_id': '70a37fbbd795434fbeb722ad97dda552', 'project_name': None, 'resource_id': 'instance-00000020-02af6288-0bd3-438c-982d-f36b31e1a9bf-tap9a06288c-d8', 'timestamp': '2026-01-22T17:17:55.605315', 'resource_metadata': {'display_name': 'tempest-server-test-571651884', 'name': 'tap9a06288c-d8', 'instance_id': '02af6288-0bd3-438c-982d-f36b31e1a9bf', 'instance_type': 'm1.nano', 'host': 'e8933227d6a3efd2305f5d9d26f91ebb25e55f4f685af8a0a733b206', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b0:5d:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9a06288c-d8'}, 'message_id': '4a97ce9a-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.220246535, 'message_signature': '399fe262a61240b44e6d484dceb7ff5afe87634ba7c69fbfedfe881041bad797'}]}, 'timestamp': '2026-01-22 17:17:55.605777', '_unique_id': 'c9cdac21a96841c8be7a9cf31234a26c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.606 12 DEBUG ceilometer.compute.pollsters [-] c925ab60-0524-40ab-a82b-52f810b9023f/disk.device.read.requests volume: 1130 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 DEBUG ceilometer.compute.pollsters [-] 02af6288-0bd3-438c-982d-f36b31e1a9bf/disk.device.read.requests volume: 1151 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '221a7fbf-f25f-4166-bd0e-2efd60844089', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1130, 'user_id': '2fe02f7484a94091bab26aba1c370459', 'user_name': None, 'project_id': '4f95c62a5a194d2291b03187a9c85702', 'project_name': None, 'resource_id': 'c925ab60-0524-40ab-a82b-52f810b9023f-vda', 'timestamp': '2026-01-22T17:17:55.606910', 'resource_metadata': {'display_name': 'tempest-server-test-37806664', 'name': 'instance-0000001f', 'instance_id': 'c925ab60-0524-40ab-a82b-52f810b9023f', 'instance_type': 'm1.nano', 'host': 'bc83ccfc8630a483b91f8ec6b9cbc1691f83379d0f12fffb808a881d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4a980388-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.225313738, 'message_signature': '187f07f7f1b80d150ebabbe4001fe7d5378dd8d0734939b705834ddf1b0339cc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1151, 'user_id': '3d4b274173814c359055fed8dfc2bdeb', 'user_name': None, 'project_id': '70a37fbbd795434fbeb722ad97dda552', 'project_name': None, 'resource_id': '02af6288-0bd3-438c-982d-f36b31e1a9bf-vda', 'timestamp': '2026-01-22T17:17:55.606910', 'resource_metadata': {'display_name': 'tempest-server-test-571651884', 'name': 'instance-00000020', 'instance_id': '02af6288-0bd3-438c-982d-f36b31e1a9bf', 'instance_type': 'm1.nano', 'host': 'e8933227d6a3efd2305f5d9d26f91ebb25e55f4f685af8a0a733b206', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4a980bda-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.239622951, 'message_signature': '16afd71c8ad5e50c4f648ce8c53f856661d551e5d283016f8dd444342e6ae98c'}]}, 'timestamp': '2026-01-22 17:17:55.607335', '_unique_id': '4eea77b1f02d43b2a2e71fdd14ec0759'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.607 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.608 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.608 12 DEBUG ceilometer.compute.pollsters [-] c925ab60-0524-40ab-a82b-52f810b9023f/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.609 12 DEBUG ceilometer.compute.pollsters [-] 02af6288-0bd3-438c-982d-f36b31e1a9bf/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cce5114d-f9e7-49b3-b55e-c57cdc132849', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2fe02f7484a94091bab26aba1c370459', 'user_name': None, 'project_id': '4f95c62a5a194d2291b03187a9c85702', 'project_name': None, 'resource_id': 'instance-0000001f-c925ab60-0524-40ab-a82b-52f810b9023f-tap6b1bf9db-e0', 'timestamp': '2026-01-22T17:17:55.608747', 'resource_metadata': {'display_name': 'tempest-server-test-37806664', 'name': 'tap6b1bf9db-e0', 'instance_id': 'c925ab60-0524-40ab-a82b-52f810b9023f', 'instance_type': 'm1.nano', 'host': 'bc83ccfc8630a483b91f8ec6b9cbc1691f83379d0f12fffb808a881d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:59:b5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b1bf9db-e0'}, 'message_id': '4a984e92-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.217482503, 'message_signature': 'ff8e70130ca61b064a7ded6eac3af4c7d99b12ee38c3e31d075d2ae5fddc6f3e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3d4b274173814c359055fed8dfc2bdeb', 'user_name': None, 'project_id': '70a37fbbd795434fbeb722ad97dda552', 'project_name': None, 'resource_id': 'instance-00000020-02af6288-0bd3-438c-982d-f36b31e1a9bf-tap9a06288c-d8', 'timestamp': '2026-01-22T17:17:55.608747', 'resource_metadata': {'display_name': 'tempest-server-test-571651884', 'name': 'tap9a06288c-d8', 'instance_id': '02af6288-0bd3-438c-982d-f36b31e1a9bf', 'instance_type': 'm1.nano', 'host': 'e8933227d6a3efd2305f5d9d26f91ebb25e55f4f685af8a0a733b206', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b0:5d:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9a06288c-d8'}, 'message_id': '4a985ae0-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.220246535, 'message_signature': '3837d5c8ca5f552209d4fd95f94249e2f01953cbbac24ab711bbe401b46ca555'}]}, 'timestamp': '2026-01-22 17:17:55.609416', '_unique_id': '5558c470310c46728be294b364d678fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.610 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.611 12 DEBUG ceilometer.compute.pollsters [-] c925ab60-0524-40ab-a82b-52f810b9023f/network.outgoing.packets volume: 208 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.611 12 DEBUG ceilometer.compute.pollsters [-] 02af6288-0bd3-438c-982d-f36b31e1a9bf/network.outgoing.packets volume: 123 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 nova_compute[183075]: 2026-01-22 17:17:55.611 183079 DEBUG nova.compute.manager [req-2e3e1f64-acf5-47e1-ab69-3c481ebdeca4 req-2ae8ea1d-0bba-49ac-b6a8-c005796709d2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Received event network-changed-7e367df6-1641-474f-ae64-7a03b03508ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:17:55 compute-0 nova_compute[183075]: 2026-01-22 17:17:55.612 183079 DEBUG nova.compute.manager [req-2e3e1f64-acf5-47e1-ab69-3c481ebdeca4 req-2ae8ea1d-0bba-49ac-b6a8-c005796709d2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Refreshing instance network info cache due to event network-changed-7e367df6-1641-474f-ae64-7a03b03508ec. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:17:55 compute-0 nova_compute[183075]: 2026-01-22 17:17:55.612 183079 DEBUG oslo_concurrency.lockutils [req-2e3e1f64-acf5-47e1-ab69-3c481ebdeca4 req-2ae8ea1d-0bba-49ac-b6a8-c005796709d2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-c925ab60-0524-40ab-a82b-52f810b9023f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '74e9a0dc-a0a1-4ea0-b8d4-df41ab47b2a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 208, 'user_id': '2fe02f7484a94091bab26aba1c370459', 'user_name': None, 'project_id': '4f95c62a5a194d2291b03187a9c85702', 'project_name': None, 'resource_id': 'instance-0000001f-c925ab60-0524-40ab-a82b-52f810b9023f-tap6b1bf9db-e0', 'timestamp': '2026-01-22T17:17:55.611081', 'resource_metadata': {'display_name': 'tempest-server-test-37806664', 'name': 'tap6b1bf9db-e0', 'instance_id': 'c925ab60-0524-40ab-a82b-52f810b9023f', 'instance_type': 'm1.nano', 'host': 'bc83ccfc8630a483b91f8ec6b9cbc1691f83379d0f12fffb808a881d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:59:b5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b1bf9db-e0'}, 'message_id': '4a98a8ba-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.217482503, 'message_signature': '9de551dce9539ee70758c4f2618553d17b5b292a3f78123016ffbbdaeb6643f0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 123, 'user_id': 
'3d4b274173814c359055fed8dfc2bdeb', 'user_name': None, 'project_id': '70a37fbbd795434fbeb722ad97dda552', 'project_name': None, 'resource_id': 'instance-00000020-02af6288-0bd3-438c-982d-f36b31e1a9bf-tap9a06288c-d8', 'timestamp': '2026-01-22T17:17:55.611081', 'resource_metadata': {'display_name': 'tempest-server-test-571651884', 'name': 'tap9a06288c-d8', 'instance_id': '02af6288-0bd3-438c-982d-f36b31e1a9bf', 'instance_type': 'm1.nano', 'host': 'e8933227d6a3efd2305f5d9d26f91ebb25e55f4f685af8a0a733b206', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b0:5d:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9a06288c-d8'}, 'message_id': '4a98b4cc-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.220246535, 'message_signature': '884aab0a7308edcb92a0c2b6c5b2e23d453033fc2e5b893ef9e55bc05c07fd1d'}]}, 'timestamp': '2026-01-22 17:17:55.611744', '_unique_id': '57fa8f737801419bb0c56e93264037f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.612 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.613 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.613 12 DEBUG ceilometer.compute.pollsters [-] c925ab60-0524-40ab-a82b-52f810b9023f/disk.device.read.bytes volume: 30353920 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.613 12 DEBUG ceilometer.compute.pollsters [-] 02af6288-0bd3-438c-982d-f36b31e1a9bf/disk.device.read.bytes volume: 31095296 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b3e72779-5e9b-4960-a284-e8b5b9708112', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30353920, 'user_id': '2fe02f7484a94091bab26aba1c370459', 'user_name': None, 'project_id': '4f95c62a5a194d2291b03187a9c85702', 'project_name': None, 'resource_id': 'c925ab60-0524-40ab-a82b-52f810b9023f-vda', 'timestamp': '2026-01-22T17:17:55.613278', 'resource_metadata': {'display_name': 'tempest-server-test-37806664', 'name': 'instance-0000001f', 'instance_id': 'c925ab60-0524-40ab-a82b-52f810b9023f', 'instance_type': 'm1.nano', 'host': 'bc83ccfc8630a483b91f8ec6b9cbc1691f83379d0f12fffb808a881d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4a98fe3c-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.225313738, 'message_signature': 'ff6a4649e5d0182be1feff3c5d4526c2e520c2bff7aeb357bdaae6aaacfb02d6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31095296, 'user_id': '3d4b274173814c359055fed8dfc2bdeb', 'user_name': None, 'project_id': '70a37fbbd795434fbeb722ad97dda552', 'project_name': None, 'resource_id': 
'02af6288-0bd3-438c-982d-f36b31e1a9bf-vda', 'timestamp': '2026-01-22T17:17:55.613278', 'resource_metadata': {'display_name': 'tempest-server-test-571651884', 'name': 'instance-00000020', 'instance_id': '02af6288-0bd3-438c-982d-f36b31e1a9bf', 'instance_type': 'm1.nano', 'host': 'e8933227d6a3efd2305f5d9d26f91ebb25e55f4f685af8a0a733b206', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4a990b34-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.239622951, 'message_signature': '28d13e250f8cb945a5c1e3d3e9bd6f4f8f8b2eaa8778d03a08f4ebd698faae13'}]}, 'timestamp': '2026-01-22 17:17:55.613939', '_unique_id': '6a2dacdde9f44b8abcc44ca5b32ae043'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.614 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.615 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.615 12 DEBUG ceilometer.compute.pollsters [-] c925ab60-0524-40ab-a82b-52f810b9023f/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.615 12 DEBUG ceilometer.compute.pollsters [-] 02af6288-0bd3-438c-982d-f36b31e1a9bf/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82e862ee-0a7f-4937-b91a-40fbcbc35045', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2fe02f7484a94091bab26aba1c370459', 'user_name': None, 'project_id': '4f95c62a5a194d2291b03187a9c85702', 'project_name': None, 'resource_id': 'instance-0000001f-c925ab60-0524-40ab-a82b-52f810b9023f-tap6b1bf9db-e0', 'timestamp': '2026-01-22T17:17:55.615459', 'resource_metadata': {'display_name': 'tempest-server-test-37806664', 'name': 'tap6b1bf9db-e0', 'instance_id': 'c925ab60-0524-40ab-a82b-52f810b9023f', 'instance_type': 'm1.nano', 'host': 'bc83ccfc8630a483b91f8ec6b9cbc1691f83379d0f12fffb808a881d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:59:b5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b1bf9db-e0'}, 'message_id': '4a99556c-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.217482503, 'message_signature': '88cbc0a1d4ea3acb34cc9c971a14cc8dce5c66d9e5dddd51b703f0aa1815ffa4'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'3d4b274173814c359055fed8dfc2bdeb', 'user_name': None, 'project_id': '70a37fbbd795434fbeb722ad97dda552', 'project_name': None, 'resource_id': 'instance-00000020-02af6288-0bd3-438c-982d-f36b31e1a9bf-tap9a06288c-d8', 'timestamp': '2026-01-22T17:17:55.615459', 'resource_metadata': {'display_name': 'tempest-server-test-571651884', 'name': 'tap9a06288c-d8', 'instance_id': '02af6288-0bd3-438c-982d-f36b31e1a9bf', 'instance_type': 'm1.nano', 'host': 'e8933227d6a3efd2305f5d9d26f91ebb25e55f4f685af8a0a733b206', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b0:5d:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9a06288c-d8'}, 'message_id': '4a9962aa-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.220246535, 'message_signature': '549803ba67de3a9ccc292c40e7d27aac4e717ac010bd0c66e81d81dc784894f0'}]}, 'timestamp': '2026-01-22 17:17:55.616174', '_unique_id': 'f44a6a5998ee46379b707a18e0c8e9aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.616 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.617 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.617 12 DEBUG ceilometer.compute.pollsters [-] c925ab60-0524-40ab-a82b-52f810b9023f/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.617 12 DEBUG ceilometer.compute.pollsters [-] 02af6288-0bd3-438c-982d-f36b31e1a9bf/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '183676c8-9d6c-479d-9796-ab5be8228f71', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2fe02f7484a94091bab26aba1c370459', 'user_name': None, 'project_id': '4f95c62a5a194d2291b03187a9c85702', 'project_name': None, 'resource_id': 'instance-0000001f-c925ab60-0524-40ab-a82b-52f810b9023f-tap6b1bf9db-e0', 'timestamp': '2026-01-22T17:17:55.617614', 'resource_metadata': {'display_name': 'tempest-server-test-37806664', 'name': 'tap6b1bf9db-e0', 'instance_id': 'c925ab60-0524-40ab-a82b-52f810b9023f', 'instance_type': 'm1.nano', 'host': 'bc83ccfc8630a483b91f8ec6b9cbc1691f83379d0f12fffb808a881d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:59:b5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b1bf9db-e0'}, 'message_id': '4a99a882-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.217482503, 'message_signature': '36ce97af909e4b8651fff4c430df60874f1600f241cc669c0b9cb6d48b6ce620'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'3d4b274173814c359055fed8dfc2bdeb', 'user_name': None, 'project_id': '70a37fbbd795434fbeb722ad97dda552', 'project_name': None, 'resource_id': 'instance-00000020-02af6288-0bd3-438c-982d-f36b31e1a9bf-tap9a06288c-d8', 'timestamp': '2026-01-22T17:17:55.617614', 'resource_metadata': {'display_name': 'tempest-server-test-571651884', 'name': 'tap9a06288c-d8', 'instance_id': '02af6288-0bd3-438c-982d-f36b31e1a9bf', 'instance_type': 'm1.nano', 'host': 'e8933227d6a3efd2305f5d9d26f91ebb25e55f4f685af8a0a733b206', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b0:5d:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9a06288c-d8'}, 'message_id': '4a99b282-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.220246535, 'message_signature': 'c80c2f48b61f9cabb79da639fd0be1a9314bce068ce10584051c21bd316fad27'}]}, 'timestamp': '2026-01-22 17:17:55.618162', '_unique_id': '19af19c11c56476d900e305829540e83'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.618 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.619 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.619 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.619 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-37806664>, <NovaLikeServer: tempest-server-test-571651884>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-37806664>, <NovaLikeServer: tempest-server-test-571651884>]
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.620 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.620 12 DEBUG ceilometer.compute.pollsters [-] c925ab60-0524-40ab-a82b-52f810b9023f/disk.device.write.bytes volume: 73007104 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.620 12 DEBUG ceilometer.compute.pollsters [-] 02af6288-0bd3-438c-982d-f36b31e1a9bf/disk.device.write.bytes volume: 72986624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '253581ff-0f18-4738-85de-b6e9717c8bce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73007104, 'user_id': '2fe02f7484a94091bab26aba1c370459', 'user_name': None, 'project_id': '4f95c62a5a194d2291b03187a9c85702', 'project_name': None, 'resource_id': 'c925ab60-0524-40ab-a82b-52f810b9023f-vda', 'timestamp': '2026-01-22T17:17:55.620217', 'resource_metadata': {'display_name': 'tempest-server-test-37806664', 'name': 'instance-0000001f', 'instance_id': 'c925ab60-0524-40ab-a82b-52f810b9023f', 'instance_type': 'm1.nano', 'host': 'bc83ccfc8630a483b91f8ec6b9cbc1691f83379d0f12fffb808a881d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4a9a0de0-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.225313738, 'message_signature': 'b31f7e7846e4d9f67f565e16801892e8d7336dd4d604bf6558e3fb4f611aa09c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72986624, 'user_id': '3d4b274173814c359055fed8dfc2bdeb', 'user_name': None, 'project_id': '70a37fbbd795434fbeb722ad97dda552', 'project_name': None, 'resource_id': 
'02af6288-0bd3-438c-982d-f36b31e1a9bf-vda', 'timestamp': '2026-01-22T17:17:55.620217', 'resource_metadata': {'display_name': 'tempest-server-test-571651884', 'name': 'instance-00000020', 'instance_id': '02af6288-0bd3-438c-982d-f36b31e1a9bf', 'instance_type': 'm1.nano', 'host': 'e8933227d6a3efd2305f5d9d26f91ebb25e55f4f685af8a0a733b206', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4a9a1650-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.239622951, 'message_signature': 'c7d7374dcd3af3b5fbba812aa9bddc9b287c1af59af957a518628034de1e8769'}]}, 'timestamp': '2026-01-22 17:17:55.620728', '_unique_id': 'ebe6560381864eae8da17ec396ec589d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.621 12 DEBUG ceilometer.compute.pollsters [-] c925ab60-0524-40ab-a82b-52f810b9023f/memory.usage volume: 43.58203125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 DEBUG ceilometer.compute.pollsters [-] 02af6288-0bd3-438c-982d-f36b31e1a9bf/memory.usage volume: 43.25 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eef85944-7391-4683-a5a9-87d9abf05a61', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.58203125, 'user_id': '2fe02f7484a94091bab26aba1c370459', 'user_name': None, 'project_id': '4f95c62a5a194d2291b03187a9c85702', 'project_name': None, 'resource_id': 'c925ab60-0524-40ab-a82b-52f810b9023f', 'timestamp': '2026-01-22T17:17:55.621920', 'resource_metadata': {'display_name': 'tempest-server-test-37806664', 'name': 'instance-0000001f', 'instance_id': 'c925ab60-0524-40ab-a82b-52f810b9023f', 'instance_type': 'm1.nano', 'host': 'bc83ccfc8630a483b91f8ec6b9cbc1691f83379d0f12fffb808a881d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '4a9a4de6-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.33448598, 'message_signature': 'db7e5bda5c5578d042702cc642d4b639ebf6da2e34a29de85d448b7068f91ea5'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.25, 'user_id': '3d4b274173814c359055fed8dfc2bdeb', 'user_name': None, 'project_id': '70a37fbbd795434fbeb722ad97dda552', 'project_name': None, 'resource_id': '02af6288-0bd3-438c-982d-f36b31e1a9bf', 'timestamp': 
'2026-01-22T17:17:55.621920', 'resource_metadata': {'display_name': 'tempest-server-test-571651884', 'name': 'instance-00000020', 'instance_id': '02af6288-0bd3-438c-982d-f36b31e1a9bf', 'instance_type': 'm1.nano', 'host': 'e8933227d6a3efd2305f5d9d26f91ebb25e55f4f685af8a0a733b206', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '4a9a55c0-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.360756856, 'message_signature': '1228e0ae313dfd6f47f9090489bedbe2829c17d93aebb2a6e6cab9f8c27d1516'}]}, 'timestamp': '2026-01-22 17:17:55.622333', '_unique_id': '6e440f1b7ffb482c9a240d333a6a70f8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.622 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.623 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.623 12 DEBUG ceilometer.compute.pollsters [-] c925ab60-0524-40ab-a82b-52f810b9023f/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.624 12 DEBUG ceilometer.compute.pollsters [-] 02af6288-0bd3-438c-982d-f36b31e1a9bf/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '911306a5-dae3-460b-9389-e4527022ff60', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2fe02f7484a94091bab26aba1c370459', 'user_name': None, 'project_id': '4f95c62a5a194d2291b03187a9c85702', 'project_name': None, 'resource_id': 'instance-0000001f-c925ab60-0524-40ab-a82b-52f810b9023f-tap6b1bf9db-e0', 'timestamp': '2026-01-22T17:17:55.623857', 'resource_metadata': {'display_name': 'tempest-server-test-37806664', 'name': 'tap6b1bf9db-e0', 'instance_id': 'c925ab60-0524-40ab-a82b-52f810b9023f', 'instance_type': 'm1.nano', 'host': 'bc83ccfc8630a483b91f8ec6b9cbc1691f83379d0f12fffb808a881d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:59:b5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b1bf9db-e0'}, 'message_id': '4a9a9b84-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.217482503, 'message_signature': 'e5f21d60d9b1f06a4fd6a367dcd2c8645ac9becdcf183237ea99558533511308'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'3d4b274173814c359055fed8dfc2bdeb', 'user_name': None, 'project_id': '70a37fbbd795434fbeb722ad97dda552', 'project_name': None, 'resource_id': 'instance-00000020-02af6288-0bd3-438c-982d-f36b31e1a9bf-tap9a06288c-d8', 'timestamp': '2026-01-22T17:17:55.623857', 'resource_metadata': {'display_name': 'tempest-server-test-571651884', 'name': 'tap9a06288c-d8', 'instance_id': '02af6288-0bd3-438c-982d-f36b31e1a9bf', 'instance_type': 'm1.nano', 'host': 'e8933227d6a3efd2305f5d9d26f91ebb25e55f4f685af8a0a733b206', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b0:5d:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9a06288c-d8'}, 'message_id': '4a9aa7be-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.220246535, 'message_signature': '3fe1c92233b500870fa51a13d65dfa2defe8b42915088056404c01953d07ac13'}]}, 'timestamp': '2026-01-22 17:17:55.624494', '_unique_id': '9200688cb03144c784a21e505d4e82db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 DEBUG ceilometer.compute.pollsters [-] c925ab60-0524-40ab-a82b-52f810b9023f/network.incoming.bytes volume: 22964 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.625 12 DEBUG ceilometer.compute.pollsters [-] 02af6288-0bd3-438c-982d-f36b31e1a9bf/network.incoming.bytes volume: 7342 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '46b3096b-e047-4a25-8bd8-58ef3939b99d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 22964, 'user_id': '2fe02f7484a94091bab26aba1c370459', 'user_name': None, 'project_id': '4f95c62a5a194d2291b03187a9c85702', 'project_name': None, 'resource_id': 'instance-0000001f-c925ab60-0524-40ab-a82b-52f810b9023f-tap6b1bf9db-e0', 'timestamp': '2026-01-22T17:17:55.625756', 'resource_metadata': {'display_name': 'tempest-server-test-37806664', 'name': 'tap6b1bf9db-e0', 'instance_id': 'c925ab60-0524-40ab-a82b-52f810b9023f', 'instance_type': 'm1.nano', 'host': 'bc83ccfc8630a483b91f8ec6b9cbc1691f83379d0f12fffb808a881d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:59:b5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b1bf9db-e0'}, 'message_id': '4a9ae3b4-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.217482503, 'message_signature': '48e3eaa9402fe95667546b1ef3ae74afd4646bc62b4ca09dab2010faab552300'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7342, 'user_id': 
'3d4b274173814c359055fed8dfc2bdeb', 'user_name': None, 'project_id': '70a37fbbd795434fbeb722ad97dda552', 'project_name': None, 'resource_id': 'instance-00000020-02af6288-0bd3-438c-982d-f36b31e1a9bf-tap9a06288c-d8', 'timestamp': '2026-01-22T17:17:55.625756', 'resource_metadata': {'display_name': 'tempest-server-test-571651884', 'name': 'tap9a06288c-d8', 'instance_id': '02af6288-0bd3-438c-982d-f36b31e1a9bf', 'instance_type': 'm1.nano', 'host': 'e8933227d6a3efd2305f5d9d26f91ebb25e55f4f685af8a0a733b206', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b0:5d:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9a06288c-d8'}, 'message_id': '4a9aed28-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.220246535, 'message_signature': 'f1251f377fa80cbd922979727daf1d49e0fb1b79e67d3a7b4f8305dcea442d77'}]}, 'timestamp': '2026-01-22 17:17:55.626232', '_unique_id': 'e68932f30ded49159684b9864270d133'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.626 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.627 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.627 12 DEBUG ceilometer.compute.pollsters [-] c925ab60-0524-40ab-a82b-52f810b9023f/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.627 12 DEBUG ceilometer.compute.pollsters [-] 02af6288-0bd3-438c-982d-f36b31e1a9bf/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e787c603-12e3-4b01-814d-25be6c1dc4c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2fe02f7484a94091bab26aba1c370459', 'user_name': None, 'project_id': '4f95c62a5a194d2291b03187a9c85702', 'project_name': None, 'resource_id': 'instance-0000001f-c925ab60-0524-40ab-a82b-52f810b9023f-tap6b1bf9db-e0', 'timestamp': '2026-01-22T17:17:55.627340', 'resource_metadata': {'display_name': 'tempest-server-test-37806664', 'name': 'tap6b1bf9db-e0', 'instance_id': 'c925ab60-0524-40ab-a82b-52f810b9023f', 'instance_type': 'm1.nano', 'host': 'bc83ccfc8630a483b91f8ec6b9cbc1691f83379d0f12fffb808a881d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:59:b5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b1bf9db-e0'}, 'message_id': '4a9b2194-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.217482503, 'message_signature': 'fc548fba1adb40ae7299ea06e1c3e453a8d46f14db72e65e7eb3333206cb9e2b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'3d4b274173814c359055fed8dfc2bdeb', 'user_name': None, 'project_id': '70a37fbbd795434fbeb722ad97dda552', 'project_name': None, 'resource_id': 'instance-00000020-02af6288-0bd3-438c-982d-f36b31e1a9bf-tap9a06288c-d8', 'timestamp': '2026-01-22T17:17:55.627340', 'resource_metadata': {'display_name': 'tempest-server-test-571651884', 'name': 'tap9a06288c-d8', 'instance_id': '02af6288-0bd3-438c-982d-f36b31e1a9bf', 'instance_type': 'm1.nano', 'host': 'e8933227d6a3efd2305f5d9d26f91ebb25e55f4f685af8a0a733b206', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b0:5d:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9a06288c-d8'}, 'message_id': '4a9b2bd0-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4639.220246535, 'message_signature': 'bc515306156a08884d39513f7e93a0b77763b6c65f2d11713b63bb380892b204'}]}, 'timestamp': '2026-01-22 17:17:55.627829', '_unique_id': 'dae075314e2e469b92407c7028bfd666'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:17:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:17:55.628 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:17:55 compute-0 nova_compute[183075]: 2026-01-22 17:17:55.690 183079 INFO nova.compute.manager [None req-56149322-b187-41e3-9781-ecc27baec981 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Get console output
Jan 22 17:17:55 compute-0 nova_compute[183075]: 2026-01-22 17:17:55.703 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:17:56 compute-0 nova_compute[183075]: 2026-01-22 17:17:56.057 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:56 compute-0 nova_compute[183075]: 2026-01-22 17:17:56.266 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:58 compute-0 nova_compute[183075]: 2026-01-22 17:17:58.225 183079 DEBUG nova.network.neutron [None req-6e8075c2-d17c-479a-93cc-284dfa0221f0 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Updating instance_info_cache with network_info: [{"id": "6b1bf9db-e098-4d03-b185-9a64eee8cec2", "address": "fa:16:3e:e6:59:b5", "network": {"id": "359b74c5-cbeb-4440-a3e9-a16a51b1ab77", "bridge": "br-int", "label": "tempest-test-network--531378131", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b1bf9db-e0", "ovs_interfaceid": "6b1bf9db-e098-4d03-b185-9a64eee8cec2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7e367df6-1641-474f-ae64-7a03b03508ec", "address": "fa:16:3e:24:11:71", "network": {"id": "6c418cf9-3d22-4e00-a57b-fdff8a1243c2", "bridge": "br-int", "label": "tempest-test-network--970566607", "subnets": [{"cidr": "2001:db8:0:3::/64", "dns": [], "gateway": {"address": "2001:db8:0:3::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:3:f816:3eff:fe24:1171", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": 
"slaac"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e367df6-16", "ovs_interfaceid": "7e367df6-1641-474f-ae64-7a03b03508ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:17:58 compute-0 nova_compute[183075]: 2026-01-22 17:17:58.249 183079 DEBUG oslo_concurrency.lockutils [None req-6e8075c2-d17c-479a-93cc-284dfa0221f0 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Releasing lock "refresh_cache-c925ab60-0524-40ab-a82b-52f810b9023f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:17:58 compute-0 nova_compute[183075]: 2026-01-22 17:17:58.250 183079 DEBUG oslo_concurrency.lockutils [req-2e3e1f64-acf5-47e1-ab69-3c481ebdeca4 req-2ae8ea1d-0bba-49ac-b6a8-c005796709d2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-c925ab60-0524-40ab-a82b-52f810b9023f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:17:58 compute-0 nova_compute[183075]: 2026-01-22 17:17:58.251 183079 DEBUG nova.network.neutron [req-2e3e1f64-acf5-47e1-ab69-3c481ebdeca4 req-2ae8ea1d-0bba-49ac-b6a8-c005796709d2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Refreshing network info cache for port 7e367df6-1641-474f-ae64-7a03b03508ec _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:17:58 compute-0 nova_compute[183075]: 2026-01-22 17:17:58.254 183079 DEBUG nova.virt.libvirt.vif [None req-6e8075c2-d17c-479a-93cc-284dfa0221f0 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:17:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-37806664',display_name='tempest-server-test-37806664',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-37806664',id=31,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDmUdcgi0W3CCW7+0zVPZm4+SGvgMVYIF2EiVCZOZNnFUe7R59w7UIBntHS+djSABVTvpJeBQFo+aQIZ5cRYQ1WALN6CDQL2VSKr+LjaC8setfZMT5V7IalcT1iGutwkuw==',key_name='tempest-keypair-test-914827607',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:17:22Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4f95c62a5a194d2291b03187a9c85702',ramdisk_id='',reservation_id='r-q3uks0k4',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',
image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-IPv6Test-1828780436',owner_user_name='tempest-IPv6Test-1828780436-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:17:22Z,user_data=None,user_id='2fe02f7484a94091bab26aba1c370459',uuid=c925ab60-0524-40ab-a82b-52f810b9023f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7e367df6-1641-474f-ae64-7a03b03508ec", "address": "fa:16:3e:24:11:71", "network": {"id": "6c418cf9-3d22-4e00-a57b-fdff8a1243c2", "bridge": "br-int", "label": "tempest-test-network--970566607", "subnets": [{"cidr": "2001:db8:0:3::/64", "dns": [], "gateway": {"address": "2001:db8:0:3::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:3:f816:3eff:fe24:1171", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e367df6-16", "ovs_interfaceid": "7e367df6-1641-474f-ae64-7a03b03508ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:17:58 compute-0 nova_compute[183075]: 2026-01-22 17:17:58.255 183079 DEBUG nova.network.os_vif_util [None req-6e8075c2-d17c-479a-93cc-284dfa0221f0 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converting VIF {"id": "7e367df6-1641-474f-ae64-7a03b03508ec", "address": "fa:16:3e:24:11:71", "network": {"id": "6c418cf9-3d22-4e00-a57b-fdff8a1243c2", "bridge": "br-int", "label": "tempest-test-network--970566607", "subnets": [{"cidr": "2001:db8:0:3::/64", "dns": [], "gateway": {"address": "2001:db8:0:3::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:3:f816:3eff:fe24:1171", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e367df6-16", "ovs_interfaceid": "7e367df6-1641-474f-ae64-7a03b03508ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:17:58 compute-0 nova_compute[183075]: 2026-01-22 17:17:58.256 183079 DEBUG nova.network.os_vif_util [None req-6e8075c2-d17c-479a-93cc-284dfa0221f0 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:11:71,bridge_name='br-int',has_traffic_filtering=True,id=7e367df6-1641-474f-ae64-7a03b03508ec,network=Network(6c418cf9-3d22-4e00-a57b-fdff8a1243c2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7e367df6-16') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:17:58 compute-0 nova_compute[183075]: 2026-01-22 17:17:58.256 183079 DEBUG os_vif [None req-6e8075c2-d17c-479a-93cc-284dfa0221f0 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:11:71,bridge_name='br-int',has_traffic_filtering=True,id=7e367df6-1641-474f-ae64-7a03b03508ec,network=Network(6c418cf9-3d22-4e00-a57b-fdff8a1243c2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7e367df6-16') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:17:58 compute-0 nova_compute[183075]: 2026-01-22 17:17:58.257 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:58 compute-0 nova_compute[183075]: 2026-01-22 17:17:58.257 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:17:58 compute-0 nova_compute[183075]: 2026-01-22 17:17:58.257 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:17:58 compute-0 nova_compute[183075]: 2026-01-22 17:17:58.260 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:58 compute-0 nova_compute[183075]: 2026-01-22 17:17:58.260 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7e367df6-16, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:17:58 compute-0 nova_compute[183075]: 2026-01-22 17:17:58.260 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7e367df6-16, col_values=(('external_ids', {'iface-id': '7e367df6-1641-474f-ae64-7a03b03508ec', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:24:11:71', 'vm-uuid': 'c925ab60-0524-40ab-a82b-52f810b9023f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:17:58 compute-0 nova_compute[183075]: 2026-01-22 17:17:58.262 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:58 compute-0 NetworkManager[55454]: <info>  [1769102278.2630] manager: (tap7e367df6-16): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/168)
Jan 22 17:17:58 compute-0 nova_compute[183075]: 2026-01-22 17:17:58.268 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:58 compute-0 nova_compute[183075]: 2026-01-22 17:17:58.269 183079 INFO os_vif [None req-6e8075c2-d17c-479a-93cc-284dfa0221f0 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:11:71,bridge_name='br-int',has_traffic_filtering=True,id=7e367df6-1641-474f-ae64-7a03b03508ec,network=Network(6c418cf9-3d22-4e00-a57b-fdff8a1243c2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7e367df6-16')
Jan 22 17:17:58 compute-0 nova_compute[183075]: 2026-01-22 17:17:58.270 183079 DEBUG nova.virt.libvirt.vif [None req-6e8075c2-d17c-479a-93cc-284dfa0221f0 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:17:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-37806664',display_name='tempest-server-test-37806664',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-37806664',id=31,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDmUdcgi0W3CCW7+0zVPZm4+SGvgMVYIF2EiVCZOZNnFUe7R59w7UIBntHS+djSABVTvpJeBQFo+aQIZ5cRYQ1WALN6CDQL2VSKr+LjaC8setfZMT5V7IalcT1iGutwkuw==',key_name='tempest-keypair-test-914827607',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:17:22Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4f95c62a5a194d2291b03187a9c85702',ramdisk_id='',reservation_id='r-q3uks0k4',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',
image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-IPv6Test-1828780436',owner_user_name='tempest-IPv6Test-1828780436-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:17:22Z,user_data=None,user_id='2fe02f7484a94091bab26aba1c370459',uuid=c925ab60-0524-40ab-a82b-52f810b9023f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7e367df6-1641-474f-ae64-7a03b03508ec", "address": "fa:16:3e:24:11:71", "network": {"id": "6c418cf9-3d22-4e00-a57b-fdff8a1243c2", "bridge": "br-int", "label": "tempest-test-network--970566607", "subnets": [{"cidr": "2001:db8:0:3::/64", "dns": [], "gateway": {"address": "2001:db8:0:3::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:3:f816:3eff:fe24:1171", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e367df6-16", "ovs_interfaceid": "7e367df6-1641-474f-ae64-7a03b03508ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:17:58 compute-0 nova_compute[183075]: 2026-01-22 17:17:58.271 183079 DEBUG nova.network.os_vif_util [None req-6e8075c2-d17c-479a-93cc-284dfa0221f0 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converting VIF {"id": "7e367df6-1641-474f-ae64-7a03b03508ec", "address": "fa:16:3e:24:11:71", "network": {"id": "6c418cf9-3d22-4e00-a57b-fdff8a1243c2", "bridge": "br-int", "label": "tempest-test-network--970566607", "subnets": [{"cidr": "2001:db8:0:3::/64", "dns": [], "gateway": {"address": "2001:db8:0:3::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:3:f816:3eff:fe24:1171", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e367df6-16", "ovs_interfaceid": "7e367df6-1641-474f-ae64-7a03b03508ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:17:58 compute-0 nova_compute[183075]: 2026-01-22 17:17:58.271 183079 DEBUG nova.network.os_vif_util [None req-6e8075c2-d17c-479a-93cc-284dfa0221f0 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:11:71,bridge_name='br-int',has_traffic_filtering=True,id=7e367df6-1641-474f-ae64-7a03b03508ec,network=Network(6c418cf9-3d22-4e00-a57b-fdff8a1243c2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7e367df6-16') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:17:58 compute-0 nova_compute[183075]: 2026-01-22 17:17:58.273 183079 DEBUG nova.virt.libvirt.guest [None req-6e8075c2-d17c-479a-93cc-284dfa0221f0 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] attach device xml: <interface type="ethernet">
Jan 22 17:17:58 compute-0 nova_compute[183075]:   <mac address="fa:16:3e:24:11:71"/>
Jan 22 17:17:58 compute-0 nova_compute[183075]:   <model type="virtio"/>
Jan 22 17:17:58 compute-0 nova_compute[183075]:   <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:17:58 compute-0 nova_compute[183075]:   <mtu size="1442"/>
Jan 22 17:17:58 compute-0 nova_compute[183075]:   <target dev="tap7e367df6-16"/>
Jan 22 17:17:58 compute-0 nova_compute[183075]: </interface>
Jan 22 17:17:58 compute-0 nova_compute[183075]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Jan 22 17:17:58 compute-0 kernel: tap7e367df6-16: entered promiscuous mode
Jan 22 17:17:58 compute-0 NetworkManager[55454]: <info>  [1769102278.2877] manager: (tap7e367df6-16): new Tun device (/org/freedesktop/NetworkManager/Devices/169)
Jan 22 17:17:58 compute-0 ovn_controller[95372]: 2026-01-22T17:17:58Z|00391|binding|INFO|Claiming lport 7e367df6-1641-474f-ae64-7a03b03508ec for this chassis.
Jan 22 17:17:58 compute-0 ovn_controller[95372]: 2026-01-22T17:17:58Z|00392|binding|INFO|7e367df6-1641-474f-ae64-7a03b03508ec: Claiming fa:16:3e:24:11:71 2001:db8:0:3:f816:3eff:fe24:1171
Jan 22 17:17:58 compute-0 nova_compute[183075]: 2026-01-22 17:17:58.287 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:58.296 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:11:71 2001:db8:0:3:f816:3eff:fe24:1171'], port_security=['fa:16:3e:24:11:71 2001:db8:0:3:f816:3eff:fe24:1171'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:3:f816:3eff:fe24:1171/64', 'neutron:device_id': 'c925ab60-0524-40ab-a82b-52f810b9023f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6c418cf9-3d22-4e00-a57b-fdff8a1243c2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4f95c62a5a194d2291b03187a9c85702', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0ac3e154-5d63-4269-957b-eeadd273d2d6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6d3b513f-3c67-48df-94de-712c23da7c75, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=7e367df6-1641-474f-ae64-7a03b03508ec) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:58.298 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 7e367df6-1641-474f-ae64-7a03b03508ec in datapath 6c418cf9-3d22-4e00-a57b-fdff8a1243c2 bound to our chassis
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:58.301 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6c418cf9-3d22-4e00-a57b-fdff8a1243c2
Jan 22 17:17:58 compute-0 ovn_controller[95372]: 2026-01-22T17:17:58Z|00393|binding|INFO|Setting lport 7e367df6-1641-474f-ae64-7a03b03508ec ovn-installed in OVS
Jan 22 17:17:58 compute-0 ovn_controller[95372]: 2026-01-22T17:17:58Z|00394|binding|INFO|Setting lport 7e367df6-1641-474f-ae64-7a03b03508ec up in Southbound
Jan 22 17:17:58 compute-0 nova_compute[183075]: 2026-01-22 17:17:58.309 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:58 compute-0 nova_compute[183075]: 2026-01-22 17:17:58.314 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:58.312 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2964b1db-3d4b-47da-b0fe-c7169c31b82e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:58.314 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6c418cf9-31 in ovnmeta-6c418cf9-3d22-4e00-a57b-fdff8a1243c2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:58.315 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6c418cf9-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:58.315 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f37563ea-14c9-4820-a925-7cf4430e10e6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:58.317 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[bc8d4b11-5597-4185-b967-7f6ec175eed9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:58 compute-0 systemd-udevd[225206]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:17:58 compute-0 NetworkManager[55454]: <info>  [1769102278.3354] device (tap7e367df6-16): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:17:58 compute-0 NetworkManager[55454]: <info>  [1769102278.3361] device (tap7e367df6-16): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:58.336 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[01334dee-cf81-4904-b979-9241453a30df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:58.359 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8e7baebc-37a6-4bc3-befa-77709ca4648b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:58 compute-0 podman[225192]: 2026-01-22 17:17:58.360529835 +0000 UTC m=+0.062484493 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:17:58 compute-0 nova_compute[183075]: 2026-01-22 17:17:58.377 183079 DEBUG nova.virt.libvirt.driver [None req-6e8075c2-d17c-479a-93cc-284dfa0221f0 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:17:58 compute-0 nova_compute[183075]: 2026-01-22 17:17:58.378 183079 DEBUG nova.virt.libvirt.driver [None req-6e8075c2-d17c-479a-93cc-284dfa0221f0 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] No VIF found with MAC fa:16:3e:e6:59:b5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:17:58 compute-0 nova_compute[183075]: 2026-01-22 17:17:58.378 183079 DEBUG nova.virt.libvirt.driver [None req-6e8075c2-d17c-479a-93cc-284dfa0221f0 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] No VIF found with MAC fa:16:3e:24:11:71, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:58.387 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[64f736e4-ffd8-4171-a5f3-f8d81e59010c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:58 compute-0 systemd-udevd[225212]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:17:58 compute-0 NetworkManager[55454]: <info>  [1769102278.3937] manager: (tap6c418cf9-30): new Veth device (/org/freedesktop/NetworkManager/Devices/170)
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:58.393 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a52546df-7eab-4de8-91c3-bddb98cbdb36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:58 compute-0 nova_compute[183075]: 2026-01-22 17:17:58.398 183079 DEBUG nova.virt.libvirt.guest [None req-6e8075c2-d17c-479a-93cc-284dfa0221f0 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:17:58 compute-0 nova_compute[183075]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:17:58 compute-0 nova_compute[183075]:   <nova:name>tempest-server-test-37806664</nova:name>
Jan 22 17:17:58 compute-0 nova_compute[183075]:   <nova:creationTime>2026-01-22 17:17:58</nova:creationTime>
Jan 22 17:17:58 compute-0 nova_compute[183075]:   <nova:flavor name="m1.nano">
Jan 22 17:17:58 compute-0 nova_compute[183075]:     <nova:memory>128</nova:memory>
Jan 22 17:17:58 compute-0 nova_compute[183075]:     <nova:disk>1</nova:disk>
Jan 22 17:17:58 compute-0 nova_compute[183075]:     <nova:swap>0</nova:swap>
Jan 22 17:17:58 compute-0 nova_compute[183075]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:17:58 compute-0 nova_compute[183075]:     <nova:vcpus>1</nova:vcpus>
Jan 22 17:17:58 compute-0 nova_compute[183075]:   </nova:flavor>
Jan 22 17:17:58 compute-0 nova_compute[183075]:   <nova:owner>
Jan 22 17:17:58 compute-0 nova_compute[183075]:     <nova:user uuid="2fe02f7484a94091bab26aba1c370459">tempest-IPv6Test-1828780436-project-member</nova:user>
Jan 22 17:17:58 compute-0 nova_compute[183075]:     <nova:project uuid="4f95c62a5a194d2291b03187a9c85702">tempest-IPv6Test-1828780436</nova:project>
Jan 22 17:17:58 compute-0 nova_compute[183075]:   </nova:owner>
Jan 22 17:17:58 compute-0 nova_compute[183075]:   <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:17:58 compute-0 nova_compute[183075]:   <nova:ports>
Jan 22 17:17:58 compute-0 nova_compute[183075]:     <nova:port uuid="6b1bf9db-e098-4d03-b185-9a64eee8cec2">
Jan 22 17:17:58 compute-0 nova_compute[183075]:       <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 17:17:58 compute-0 nova_compute[183075]:     </nova:port>
Jan 22 17:17:58 compute-0 nova_compute[183075]:     <nova:port uuid="7e367df6-1641-474f-ae64-7a03b03508ec">
Jan 22 17:17:58 compute-0 nova_compute[183075]:       <nova:ip type="fixed" address="2001:db8:0:3:f816:3eff:fe24:1171" ipVersion="6"/>
Jan 22 17:17:58 compute-0 nova_compute[183075]:     </nova:port>
Jan 22 17:17:58 compute-0 nova_compute[183075]:   </nova:ports>
Jan 22 17:17:58 compute-0 nova_compute[183075]: </nova:instance>
Jan 22 17:17:58 compute-0 nova_compute[183075]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:58.419 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[b3802d83-e467-4873-ba90-baad9b015d93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:58.422 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[22d93c1d-a23b-41cc-a37f-6a5c4d19b8ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:58 compute-0 nova_compute[183075]: 2026-01-22 17:17:58.424 183079 DEBUG oslo_concurrency.lockutils [None req-6e8075c2-d17c-479a-93cc-284dfa0221f0 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "interface-c925ab60-0524-40ab-a82b-52f810b9023f-7e367df6-1641-474f-ae64-7a03b03508ec" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 5.802s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:17:58 compute-0 NetworkManager[55454]: <info>  [1769102278.4435] device (tap6c418cf9-30): carrier: link connected
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:58.449 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[216805b2-4d9f-4a40-b382-96992609e03d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:58.465 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4153d089-bd7a-48a5-9d1a-553df05d4f8b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6c418cf9-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:71:5f:9a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 110], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 464214, 'reachable_time': 36942, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225246, 'error': None, 'target': 'ovnmeta-6c418cf9-3d22-4e00-a57b-fdff8a1243c2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:58.481 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[44bc0e97-e490-4d62-b62d-7773b340fb20]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe71:5f9a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 464214, 'tstamp': 464214}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225247, 'error': None, 'target': 'ovnmeta-6c418cf9-3d22-4e00-a57b-fdff8a1243c2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:58.502 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a35e7914-2ac5-4a51-af4c-55a7aa714c3c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6c418cf9-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:71:5f:9a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 110], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 464214, 'reachable_time': 36942, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225248, 'error': None, 'target': 'ovnmeta-6c418cf9-3d22-4e00-a57b-fdff8a1243c2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:58.529 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0f1ed06f-6254-4097-8548-0c9b84b4d560]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:58.556 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1b83a84a-419c-4a3e-8796-a257bb91142e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:58.557 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6c418cf9-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:58.557 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:58.558 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6c418cf9-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:17:58 compute-0 nova_compute[183075]: 2026-01-22 17:17:58.559 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:58 compute-0 kernel: tap6c418cf9-30: entered promiscuous mode
Jan 22 17:17:58 compute-0 NetworkManager[55454]: <info>  [1769102278.5621] manager: (tap6c418cf9-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/171)
Jan 22 17:17:58 compute-0 nova_compute[183075]: 2026-01-22 17:17:58.563 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:58.564 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6c418cf9-30, col_values=(('external_ids', {'iface-id': '22d851ae-7a73-491f-b969-ae4f0315ddec'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:17:58 compute-0 nova_compute[183075]: 2026-01-22 17:17:58.565 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:58 compute-0 ovn_controller[95372]: 2026-01-22T17:17:58Z|00395|binding|INFO|Releasing lport 22d851ae-7a73-491f-b969-ae4f0315ddec from this chassis (sb_readonly=0)
Jan 22 17:17:58 compute-0 nova_compute[183075]: 2026-01-22 17:17:58.620 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:58.621 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6c418cf9-3d22-4e00-a57b-fdff8a1243c2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6c418cf9-3d22-4e00-a57b-fdff8a1243c2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:58.621 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[080d5a08-8492-4875-ba68-c08cd7bd25f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:58.622 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-6c418cf9-3d22-4e00-a57b-fdff8a1243c2
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/6c418cf9-3d22-4e00-a57b-fdff8a1243c2.pid.haproxy
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 6c418cf9-3d22-4e00-a57b-fdff8a1243c2
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:17:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:17:58.622 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6c418cf9-3d22-4e00-a57b-fdff8a1243c2', 'env', 'PROCESS_TAG=haproxy-6c418cf9-3d22-4e00-a57b-fdff8a1243c2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6c418cf9-3d22-4e00-a57b-fdff8a1243c2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:17:58 compute-0 podman[225278]: 2026-01-22 17:17:58.937902069 +0000 UTC m=+0.044484783 container create 100756c2c1ca9154c5fd9396da462fae4b462634e49c333e3cda7ad5a4587548 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c418cf9-3d22-4e00-a57b-fdff8a1243c2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 22 17:17:58 compute-0 systemd[1]: Started libpod-conmon-100756c2c1ca9154c5fd9396da462fae4b462634e49c333e3cda7ad5a4587548.scope.
Jan 22 17:17:58 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:17:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d104e43ad36d12e6db9eea08fb45c113a0219dc7d5602be960ccddd60626bba/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:17:59 compute-0 podman[225278]: 2026-01-22 17:17:58.914731774 +0000 UTC m=+0.021314498 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:17:59 compute-0 podman[225278]: 2026-01-22 17:17:59.014287755 +0000 UTC m=+0.120870469 container init 100756c2c1ca9154c5fd9396da462fae4b462634e49c333e3cda7ad5a4587548 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c418cf9-3d22-4e00-a57b-fdff8a1243c2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 17:17:59 compute-0 podman[225278]: 2026-01-22 17:17:59.018903475 +0000 UTC m=+0.125486189 container start 100756c2c1ca9154c5fd9396da462fae4b462634e49c333e3cda7ad5a4587548 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c418cf9-3d22-4e00-a57b-fdff8a1243c2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 22 17:17:59 compute-0 neutron-haproxy-ovnmeta-6c418cf9-3d22-4e00-a57b-fdff8a1243c2[225294]: [NOTICE]   (225298) : New worker (225300) forked
Jan 22 17:17:59 compute-0 neutron-haproxy-ovnmeta-6c418cf9-3d22-4e00-a57b-fdff8a1243c2[225294]: [NOTICE]   (225298) : Loading success.
Jan 22 17:17:59 compute-0 nova_compute[183075]: 2026-01-22 17:17:59.411 183079 DEBUG nova.compute.manager [req-5a3d6f64-a2ba-479a-ad95-cb9ff81f8950 req-abe662b2-c452-4256-a449-cfab2afff4a7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Received event network-vif-plugged-7e367df6-1641-474f-ae64-7a03b03508ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:17:59 compute-0 nova_compute[183075]: 2026-01-22 17:17:59.412 183079 DEBUG oslo_concurrency.lockutils [req-5a3d6f64-a2ba-479a-ad95-cb9ff81f8950 req-abe662b2-c452-4256-a449-cfab2afff4a7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:17:59 compute-0 nova_compute[183075]: 2026-01-22 17:17:59.412 183079 DEBUG oslo_concurrency.lockutils [req-5a3d6f64-a2ba-479a-ad95-cb9ff81f8950 req-abe662b2-c452-4256-a449-cfab2afff4a7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:17:59 compute-0 nova_compute[183075]: 2026-01-22 17:17:59.413 183079 DEBUG oslo_concurrency.lockutils [req-5a3d6f64-a2ba-479a-ad95-cb9ff81f8950 req-abe662b2-c452-4256-a449-cfab2afff4a7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:17:59 compute-0 nova_compute[183075]: 2026-01-22 17:17:59.413 183079 DEBUG nova.compute.manager [req-5a3d6f64-a2ba-479a-ad95-cb9ff81f8950 req-abe662b2-c452-4256-a449-cfab2afff4a7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] No waiting events found dispatching network-vif-plugged-7e367df6-1641-474f-ae64-7a03b03508ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:17:59 compute-0 nova_compute[183075]: 2026-01-22 17:17:59.414 183079 WARNING nova.compute.manager [req-5a3d6f64-a2ba-479a-ad95-cb9ff81f8950 req-abe662b2-c452-4256-a449-cfab2afff4a7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Received unexpected event network-vif-plugged-7e367df6-1641-474f-ae64-7a03b03508ec for instance with vm_state active and task_state None.
Jan 22 17:18:00 compute-0 nova_compute[183075]: 2026-01-22 17:18:00.818 183079 INFO nova.compute.manager [None req-8ed3f4ff-90d5-4dcc-86f0-7f6b459c5ca0 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Get console output
Jan 22 17:18:00 compute-0 nova_compute[183075]: 2026-01-22 17:18:00.824 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:18:01 compute-0 nova_compute[183075]: 2026-01-22 17:18:01.269 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:01 compute-0 nova_compute[183075]: 2026-01-22 17:18:01.502 183079 DEBUG nova.compute.manager [req-3c710442-443b-4af3-8a31-2a70347ee73b req-2398d314-e145-4bd7-a10a-bfecdf986ecf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Received event network-vif-plugged-7e367df6-1641-474f-ae64-7a03b03508ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:18:01 compute-0 nova_compute[183075]: 2026-01-22 17:18:01.503 183079 DEBUG oslo_concurrency.lockutils [req-3c710442-443b-4af3-8a31-2a70347ee73b req-2398d314-e145-4bd7-a10a-bfecdf986ecf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:18:01 compute-0 nova_compute[183075]: 2026-01-22 17:18:01.504 183079 DEBUG oslo_concurrency.lockutils [req-3c710442-443b-4af3-8a31-2a70347ee73b req-2398d314-e145-4bd7-a10a-bfecdf986ecf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:18:01 compute-0 nova_compute[183075]: 2026-01-22 17:18:01.505 183079 DEBUG oslo_concurrency.lockutils [req-3c710442-443b-4af3-8a31-2a70347ee73b req-2398d314-e145-4bd7-a10a-bfecdf986ecf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:18:01 compute-0 nova_compute[183075]: 2026-01-22 17:18:01.505 183079 DEBUG nova.compute.manager [req-3c710442-443b-4af3-8a31-2a70347ee73b req-2398d314-e145-4bd7-a10a-bfecdf986ecf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] No waiting events found dispatching network-vif-plugged-7e367df6-1641-474f-ae64-7a03b03508ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:18:01 compute-0 nova_compute[183075]: 2026-01-22 17:18:01.506 183079 WARNING nova.compute.manager [req-3c710442-443b-4af3-8a31-2a70347ee73b req-2398d314-e145-4bd7-a10a-bfecdf986ecf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Received unexpected event network-vif-plugged-7e367df6-1641-474f-ae64-7a03b03508ec for instance with vm_state active and task_state None.
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.022 183079 DEBUG oslo_concurrency.lockutils [None req-417c9b21-e4ad-4af1-8078-d7f891c4e2d9 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquiring lock "c925ab60-0524-40ab-a82b-52f810b9023f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.023 183079 DEBUG oslo_concurrency.lockutils [None req-417c9b21-e4ad-4af1-8078-d7f891c4e2d9 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "c925ab60-0524-40ab-a82b-52f810b9023f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.023 183079 DEBUG oslo_concurrency.lockutils [None req-417c9b21-e4ad-4af1-8078-d7f891c4e2d9 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquiring lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.024 183079 DEBUG oslo_concurrency.lockutils [None req-417c9b21-e4ad-4af1-8078-d7f891c4e2d9 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.024 183079 DEBUG oslo_concurrency.lockutils [None req-417c9b21-e4ad-4af1-8078-d7f891c4e2d9 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.026 183079 INFO nova.compute.manager [None req-417c9b21-e4ad-4af1-8078-d7f891c4e2d9 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Terminating instance
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.028 183079 DEBUG nova.compute.manager [None req-417c9b21-e4ad-4af1-8078-d7f891c4e2d9 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:18:02 compute-0 kernel: tap6b1bf9db-e0 (unregistering): left promiscuous mode
Jan 22 17:18:02 compute-0 NetworkManager[55454]: <info>  [1769102282.0598] device (tap6b1bf9db-e0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.067 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:02 compute-0 ovn_controller[95372]: 2026-01-22T17:18:02Z|00396|binding|INFO|Releasing lport 6b1bf9db-e098-4d03-b185-9a64eee8cec2 from this chassis (sb_readonly=0)
Jan 22 17:18:02 compute-0 ovn_controller[95372]: 2026-01-22T17:18:02Z|00397|binding|INFO|Setting lport 6b1bf9db-e098-4d03-b185-9a64eee8cec2 down in Southbound
Jan 22 17:18:02 compute-0 ovn_controller[95372]: 2026-01-22T17:18:02Z|00398|binding|INFO|Removing iface tap6b1bf9db-e0 ovn-installed in OVS
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.072 183079 DEBUG nova.network.neutron [req-2e3e1f64-acf5-47e1-ab69-3c481ebdeca4 req-2ae8ea1d-0bba-49ac-b6a8-c005796709d2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Updated VIF entry in instance network info cache for port 7e367df6-1641-474f-ae64-7a03b03508ec. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.073 183079 DEBUG nova.network.neutron [req-2e3e1f64-acf5-47e1-ab69-3c481ebdeca4 req-2ae8ea1d-0bba-49ac-b6a8-c005796709d2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Updating instance_info_cache with network_info: [{"id": "6b1bf9db-e098-4d03-b185-9a64eee8cec2", "address": "fa:16:3e:e6:59:b5", "network": {"id": "359b74c5-cbeb-4440-a3e9-a16a51b1ab77", "bridge": "br-int", "label": "tempest-test-network--531378131", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b1bf9db-e0", "ovs_interfaceid": "6b1bf9db-e098-4d03-b185-9a64eee8cec2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7e367df6-1641-474f-ae64-7a03b03508ec", "address": "fa:16:3e:24:11:71", "network": {"id": "6c418cf9-3d22-4e00-a57b-fdff8a1243c2", "bridge": "br-int", "label": "tempest-test-network--970566607", "subnets": [{"cidr": "2001:db8:0:3::/64", "dns": [], "gateway": {"address": "2001:db8:0:3::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:3:f816:3eff:fe24:1171", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e367df6-16", "ovs_interfaceid": "7e367df6-1641-474f-ae64-7a03b03508ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.075 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:02.079 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e6:59:b5 10.100.0.12'], port_security=['fa:16:3e:e6:59:b5 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c925ab60-0524-40ab-a82b-52f810b9023f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-359b74c5-cbeb-4440-a3e9-a16a51b1ab77', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4f95c62a5a194d2291b03187a9c85702', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1b5e2b25-1ae0-464c-ac9a-7fc65ac893a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.239'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee297cf4-fb08-4758-bba6-b8b00aaf6678, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=6b1bf9db-e098-4d03-b185-9a64eee8cec2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.082 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:02.083 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 6b1bf9db-e098-4d03-b185-9a64eee8cec2 in datapath 359b74c5-cbeb-4440-a3e9-a16a51b1ab77 unbound from our chassis
Jan 22 17:18:02 compute-0 kernel: tap7e367df6-16 (unregistering): left promiscuous mode
Jan 22 17:18:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:02.087 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 359b74c5-cbeb-4440-a3e9-a16a51b1ab77, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:18:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:02.089 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1a43c79a-ed96-43e8-bb7d-e200f6712c86]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:18:02 compute-0 NetworkManager[55454]: <info>  [1769102282.0907] device (tap7e367df6-16): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:18:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:02.090 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77 namespace which is not needed anymore
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.097 183079 DEBUG oslo_concurrency.lockutils [req-2e3e1f64-acf5-47e1-ab69-3c481ebdeca4 req-2ae8ea1d-0bba-49ac-b6a8-c005796709d2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-c925ab60-0524-40ab-a82b-52f810b9023f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.103 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:02 compute-0 ovn_controller[95372]: 2026-01-22T17:18:02Z|00399|binding|INFO|Releasing lport 7e367df6-1641-474f-ae64-7a03b03508ec from this chassis (sb_readonly=0)
Jan 22 17:18:02 compute-0 ovn_controller[95372]: 2026-01-22T17:18:02Z|00400|binding|INFO|Setting lport 7e367df6-1641-474f-ae64-7a03b03508ec down in Southbound
Jan 22 17:18:02 compute-0 ovn_controller[95372]: 2026-01-22T17:18:02Z|00401|binding|INFO|Removing iface tap7e367df6-16 ovn-installed in OVS
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.104 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:02.111 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:11:71 2001:db8:0:3:f816:3eff:fe24:1171'], port_security=['fa:16:3e:24:11:71 2001:db8:0:3:f816:3eff:fe24:1171'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:3:f816:3eff:fe24:1171/64', 'neutron:device_id': 'c925ab60-0524-40ab-a82b-52f810b9023f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6c418cf9-3d22-4e00-a57b-fdff8a1243c2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4f95c62a5a194d2291b03187a9c85702', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0ac3e154-5d63-4269-957b-eeadd273d2d6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6d3b513f-3c67-48df-94de-712c23da7c75, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=7e367df6-1641-474f-ae64-7a03b03508ec) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.115 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:02 compute-0 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d0000001f.scope: Deactivated successfully.
Jan 22 17:18:02 compute-0 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d0000001f.scope: Consumed 14.040s CPU time.
Jan 22 17:18:02 compute-0 systemd-machined[154382]: Machine qemu-31-instance-0000001f terminated.
Jan 22 17:18:02 compute-0 neutron-haproxy-ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[224764]: [NOTICE]   (224768) : haproxy version is 2.8.14-c23fe91
Jan 22 17:18:02 compute-0 neutron-haproxy-ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[224764]: [NOTICE]   (224768) : path to executable is /usr/sbin/haproxy
Jan 22 17:18:02 compute-0 neutron-haproxy-ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[224764]: [WARNING]  (224768) : Exiting Master process...
Jan 22 17:18:02 compute-0 neutron-haproxy-ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[224764]: [ALERT]    (224768) : Current worker (224770) exited with code 143 (Terminated)
Jan 22 17:18:02 compute-0 neutron-haproxy-ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77[224764]: [WARNING]  (224768) : All workers exited. Exiting... (0)
Jan 22 17:18:02 compute-0 systemd[1]: libpod-041e28fca1f7500546a492b8274ffdd0943b67da426adbbf4410bcfd2f008135.scope: Deactivated successfully.
Jan 22 17:18:02 compute-0 podman[225339]: 2026-01-22 17:18:02.241339502 +0000 UTC m=+0.049721450 container died 041e28fca1f7500546a492b8274ffdd0943b67da426adbbf4410bcfd2f008135 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:18:02 compute-0 NetworkManager[55454]: <info>  [1769102282.2596] manager: (tap7e367df6-16): new Tun device (/org/freedesktop/NetworkManager/Devices/172)
Jan 22 17:18:02 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-041e28fca1f7500546a492b8274ffdd0943b67da426adbbf4410bcfd2f008135-userdata-shm.mount: Deactivated successfully.
Jan 22 17:18:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-1ba61d2a984fcf8a3b7b4e33aad2c392ec4e190543aa7d97027ace06a14863c9-merged.mount: Deactivated successfully.
Jan 22 17:18:02 compute-0 podman[225339]: 2026-01-22 17:18:02.291469362 +0000 UTC m=+0.099851290 container cleanup 041e28fca1f7500546a492b8274ffdd0943b67da426adbbf4410bcfd2f008135 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.291 183079 INFO nova.virt.libvirt.driver [-] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Instance destroyed successfully.
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.291 183079 DEBUG nova.objects.instance [None req-417c9b21-e4ad-4af1-8078-d7f891c4e2d9 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lazy-loading 'resources' on Instance uuid c925ab60-0524-40ab-a82b-52f810b9023f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:18:02 compute-0 systemd[1]: libpod-conmon-041e28fca1f7500546a492b8274ffdd0943b67da426adbbf4410bcfd2f008135.scope: Deactivated successfully.
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.319 183079 DEBUG nova.virt.libvirt.vif [None req-417c9b21-e4ad-4af1-8078-d7f891c4e2d9 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:17:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-37806664',display_name='tempest-server-test-37806664',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-37806664',id=31,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDmUdcgi0W3CCW7+0zVPZm4+SGvgMVYIF2EiVCZOZNnFUe7R59w7UIBntHS+djSABVTvpJeBQFo+aQIZ5cRYQ1WALN6CDQL2VSKr+LjaC8setfZMT5V7IalcT1iGutwkuw==',key_name='tempest-keypair-test-914827607',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:17:22Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4f95c62a5a194d2291b03187a9c85702',ramdisk_id='',reservation_id='r-q3uks0k4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-IPv6Test-1828780436',owner_user_name='tempest-IPv6Test-1828780436-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:17:22Z,user_data=None,user_id='2fe02f7484a94091bab26aba1c370459',uuid=c925ab60-0524-40ab-a82b-52f810b9023f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6b1bf9db-e098-4d03-b185-9a64eee8cec2", "address": "fa:16:3e:e6:59:b5", "network": {"id": "359b74c5-cbeb-4440-a3e9-a16a51b1ab77", "bridge": "br-int", "label": "tempest-test-network--531378131", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b1bf9db-e0", "ovs_interfaceid": "6b1bf9db-e098-4d03-b185-9a64eee8cec2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.319 183079 DEBUG nova.network.os_vif_util [None req-417c9b21-e4ad-4af1-8078-d7f891c4e2d9 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converting VIF {"id": "6b1bf9db-e098-4d03-b185-9a64eee8cec2", "address": "fa:16:3e:e6:59:b5", "network": {"id": "359b74c5-cbeb-4440-a3e9-a16a51b1ab77", "bridge": "br-int", "label": "tempest-test-network--531378131", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b1bf9db-e0", "ovs_interfaceid": "6b1bf9db-e098-4d03-b185-9a64eee8cec2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.320 183079 DEBUG nova.network.os_vif_util [None req-417c9b21-e4ad-4af1-8078-d7f891c4e2d9 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e6:59:b5,bridge_name='br-int',has_traffic_filtering=True,id=6b1bf9db-e098-4d03-b185-9a64eee8cec2,network=Network(359b74c5-cbeb-4440-a3e9-a16a51b1ab77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b1bf9db-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.321 183079 DEBUG os_vif [None req-417c9b21-e4ad-4af1-8078-d7f891c4e2d9 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e6:59:b5,bridge_name='br-int',has_traffic_filtering=True,id=6b1bf9db-e098-4d03-b185-9a64eee8cec2,network=Network(359b74c5-cbeb-4440-a3e9-a16a51b1ab77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b1bf9db-e0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.322 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.323 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6b1bf9db-e0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.324 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.326 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.328 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.330 183079 INFO os_vif [None req-417c9b21-e4ad-4af1-8078-d7f891c4e2d9 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e6:59:b5,bridge_name='br-int',has_traffic_filtering=True,id=6b1bf9db-e098-4d03-b185-9a64eee8cec2,network=Network(359b74c5-cbeb-4440-a3e9-a16a51b1ab77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b1bf9db-e0')
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.330 183079 DEBUG nova.virt.libvirt.vif [None req-417c9b21-e4ad-4af1-8078-d7f891c4e2d9 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:17:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-37806664',display_name='tempest-server-test-37806664',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-37806664',id=31,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDmUdcgi0W3CCW7+0zVPZm4+SGvgMVYIF2EiVCZOZNnFUe7R59w7UIBntHS+djSABVTvpJeBQFo+aQIZ5cRYQ1WALN6CDQL2VSKr+LjaC8setfZMT5V7IalcT1iGutwkuw==',key_name='tempest-keypair-test-914827607',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:17:22Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4f95c62a5a194d2291b03187a9c85702',ramdisk_id='',reservation_id='r-q3uks0k4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input
_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-IPv6Test-1828780436',owner_user_name='tempest-IPv6Test-1828780436-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:17:22Z,user_data=None,user_id='2fe02f7484a94091bab26aba1c370459',uuid=c925ab60-0524-40ab-a82b-52f810b9023f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7e367df6-1641-474f-ae64-7a03b03508ec", "address": "fa:16:3e:24:11:71", "network": {"id": "6c418cf9-3d22-4e00-a57b-fdff8a1243c2", "bridge": "br-int", "label": "tempest-test-network--970566607", "subnets": [{"cidr": "2001:db8:0:3::/64", "dns": [], "gateway": {"address": "2001:db8:0:3::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:3:f816:3eff:fe24:1171", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e367df6-16", "ovs_interfaceid": "7e367df6-1641-474f-ae64-7a03b03508ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.330 183079 DEBUG nova.network.os_vif_util [None req-417c9b21-e4ad-4af1-8078-d7f891c4e2d9 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converting VIF {"id": "7e367df6-1641-474f-ae64-7a03b03508ec", "address": "fa:16:3e:24:11:71", "network": {"id": "6c418cf9-3d22-4e00-a57b-fdff8a1243c2", "bridge": "br-int", "label": "tempest-test-network--970566607", "subnets": [{"cidr": "2001:db8:0:3::/64", "dns": [], "gateway": {"address": "2001:db8:0:3::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:3:f816:3eff:fe24:1171", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "4f95c62a5a194d2291b03187a9c85702", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e367df6-16", "ovs_interfaceid": "7e367df6-1641-474f-ae64-7a03b03508ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.331 183079 DEBUG nova.network.os_vif_util [None req-417c9b21-e4ad-4af1-8078-d7f891c4e2d9 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:11:71,bridge_name='br-int',has_traffic_filtering=True,id=7e367df6-1641-474f-ae64-7a03b03508ec,network=Network(6c418cf9-3d22-4e00-a57b-fdff8a1243c2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7e367df6-16') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.331 183079 DEBUG os_vif [None req-417c9b21-e4ad-4af1-8078-d7f891c4e2d9 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:11:71,bridge_name='br-int',has_traffic_filtering=True,id=7e367df6-1641-474f-ae64-7a03b03508ec,network=Network(6c418cf9-3d22-4e00-a57b-fdff8a1243c2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7e367df6-16') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.332 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.332 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7e367df6-16, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.333 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.334 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.335 183079 INFO os_vif [None req-417c9b21-e4ad-4af1-8078-d7f891c4e2d9 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:11:71,bridge_name='br-int',has_traffic_filtering=True,id=7e367df6-1641-474f-ae64-7a03b03508ec,network=Network(6c418cf9-3d22-4e00-a57b-fdff8a1243c2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7e367df6-16')
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.336 183079 INFO nova.virt.libvirt.driver [None req-417c9b21-e4ad-4af1-8078-d7f891c4e2d9 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Deleting instance files /var/lib/nova/instances/c925ab60-0524-40ab-a82b-52f810b9023f_del
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.337 183079 INFO nova.virt.libvirt.driver [None req-417c9b21-e4ad-4af1-8078-d7f891c4e2d9 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Deletion of /var/lib/nova/instances/c925ab60-0524-40ab-a82b-52f810b9023f_del complete
Jan 22 17:18:02 compute-0 podman[225395]: 2026-01-22 17:18:02.352876856 +0000 UTC m=+0.040896240 container remove 041e28fca1f7500546a492b8274ffdd0943b67da426adbbf4410bcfd2f008135 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:18:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:02.357 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0f8d8190-3c4d-4d29-95d9-c8e888f6eeef]: (4, ('Thu Jan 22 05:18:02 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77 (041e28fca1f7500546a492b8274ffdd0943b67da426adbbf4410bcfd2f008135)\n041e28fca1f7500546a492b8274ffdd0943b67da426adbbf4410bcfd2f008135\nThu Jan 22 05:18:02 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77 (041e28fca1f7500546a492b8274ffdd0943b67da426adbbf4410bcfd2f008135)\n041e28fca1f7500546a492b8274ffdd0943b67da426adbbf4410bcfd2f008135\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:18:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:02.358 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[94d5673b-63cb-4f4f-8ad3-18af45de208d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:18:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:02.359 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap359b74c5-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.360 183079 DEBUG nova.compute.manager [req-223a5d91-fdf1-4aee-b9da-3ecac3319f07 req-ef79e15b-2338-4323-9573-32b57c0205ba a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Received event network-vif-unplugged-6b1bf9db-e098-4d03-b185-9a64eee8cec2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.360 183079 DEBUG oslo_concurrency.lockutils [req-223a5d91-fdf1-4aee-b9da-3ecac3319f07 req-ef79e15b-2338-4323-9573-32b57c0205ba a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.360 183079 DEBUG oslo_concurrency.lockutils [req-223a5d91-fdf1-4aee-b9da-3ecac3319f07 req-ef79e15b-2338-4323-9573-32b57c0205ba a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.361 183079 DEBUG oslo_concurrency.lockutils [req-223a5d91-fdf1-4aee-b9da-3ecac3319f07 req-ef79e15b-2338-4323-9573-32b57c0205ba a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:18:02 compute-0 kernel: tap359b74c5-c0: left promiscuous mode
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.361 183079 DEBUG nova.compute.manager [req-223a5d91-fdf1-4aee-b9da-3ecac3319f07 req-ef79e15b-2338-4323-9573-32b57c0205ba a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] No waiting events found dispatching network-vif-unplugged-6b1bf9db-e098-4d03-b185-9a64eee8cec2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.362 183079 DEBUG nova.compute.manager [req-223a5d91-fdf1-4aee-b9da-3ecac3319f07 req-ef79e15b-2338-4323-9573-32b57c0205ba a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Received event network-vif-unplugged-6b1bf9db-e098-4d03-b185-9a64eee8cec2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.363 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:02.364 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0c22a525-e584-4040-a15c-ea8208213252]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.373 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.383 183079 INFO nova.compute.manager [None req-417c9b21-e4ad-4af1-8078-d7f891c4e2d9 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Took 0.35 seconds to destroy the instance on the hypervisor.
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.384 183079 DEBUG oslo.service.loopingcall [None req-417c9b21-e4ad-4af1-8078-d7f891c4e2d9 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.384 183079 DEBUG nova.compute.manager [-] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.385 183079 DEBUG nova.network.neutron [-] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:18:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:02.387 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c96733f5-2eb0-4955-87a1-2f3a54c647f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:18:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:02.388 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[20568fd2-9759-40d9-8952-db3db6fa834c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:18:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:02.404 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f4b22e8c-6f96-4a15-89dc-34e384c26718]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 460325, 'reachable_time': 18394, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225411, 'error': None, 'target': 'ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:18:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:02.406 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-359b74c5-cbeb-4440-a3e9-a16a51b1ab77 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:18:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:02.406 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[f42894ef-9fa6-4e0d-8cfa-dbeeca307163]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:18:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:02.407 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 7e367df6-1641-474f-ae64-7a03b03508ec in datapath 6c418cf9-3d22-4e00-a57b-fdff8a1243c2 unbound from our chassis
Jan 22 17:18:02 compute-0 systemd[1]: run-netns-ovnmeta\x2d359b74c5\x2dcbeb\x2d4440\x2da3e9\x2da16a51b1ab77.mount: Deactivated successfully.
Jan 22 17:18:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:02.409 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6c418cf9-3d22-4e00-a57b-fdff8a1243c2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:18:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:02.409 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[60484de6-196b-4d87-8be7-94da3b157d16]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:18:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:02.410 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6c418cf9-3d22-4e00-a57b-fdff8a1243c2 namespace which is not needed anymore
Jan 22 17:18:02 compute-0 neutron-haproxy-ovnmeta-6c418cf9-3d22-4e00-a57b-fdff8a1243c2[225294]: [NOTICE]   (225298) : haproxy version is 2.8.14-c23fe91
Jan 22 17:18:02 compute-0 neutron-haproxy-ovnmeta-6c418cf9-3d22-4e00-a57b-fdff8a1243c2[225294]: [NOTICE]   (225298) : path to executable is /usr/sbin/haproxy
Jan 22 17:18:02 compute-0 neutron-haproxy-ovnmeta-6c418cf9-3d22-4e00-a57b-fdff8a1243c2[225294]: [WARNING]  (225298) : Exiting Master process...
Jan 22 17:18:02 compute-0 neutron-haproxy-ovnmeta-6c418cf9-3d22-4e00-a57b-fdff8a1243c2[225294]: [WARNING]  (225298) : Exiting Master process...
Jan 22 17:18:02 compute-0 neutron-haproxy-ovnmeta-6c418cf9-3d22-4e00-a57b-fdff8a1243c2[225294]: [ALERT]    (225298) : Current worker (225300) exited with code 143 (Terminated)
Jan 22 17:18:02 compute-0 neutron-haproxy-ovnmeta-6c418cf9-3d22-4e00-a57b-fdff8a1243c2[225294]: [WARNING]  (225298) : All workers exited. Exiting... (0)
Jan 22 17:18:02 compute-0 systemd[1]: libpod-100756c2c1ca9154c5fd9396da462fae4b462634e49c333e3cda7ad5a4587548.scope: Deactivated successfully.
Jan 22 17:18:02 compute-0 podman[225430]: 2026-01-22 17:18:02.530815705 +0000 UTC m=+0.043518728 container died 100756c2c1ca9154c5fd9396da462fae4b462634e49c333e3cda7ad5a4587548 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c418cf9-3d22-4e00-a57b-fdff8a1243c2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:18:02 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-100756c2c1ca9154c5fd9396da462fae4b462634e49c333e3cda7ad5a4587548-userdata-shm.mount: Deactivated successfully.
Jan 22 17:18:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-0d104e43ad36d12e6db9eea08fb45c113a0219dc7d5602be960ccddd60626bba-merged.mount: Deactivated successfully.
Jan 22 17:18:02 compute-0 podman[225430]: 2026-01-22 17:18:02.560522611 +0000 UTC m=+0.073225644 container cleanup 100756c2c1ca9154c5fd9396da462fae4b462634e49c333e3cda7ad5a4587548 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c418cf9-3d22-4e00-a57b-fdff8a1243c2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 22 17:18:02 compute-0 systemd[1]: libpod-conmon-100756c2c1ca9154c5fd9396da462fae4b462634e49c333e3cda7ad5a4587548.scope: Deactivated successfully.
Jan 22 17:18:02 compute-0 podman[225462]: 2026-01-22 17:18:02.617751016 +0000 UTC m=+0.035422957 container remove 100756c2c1ca9154c5fd9396da462fae4b462634e49c333e3cda7ad5a4587548 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c418cf9-3d22-4e00-a57b-fdff8a1243c2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 17:18:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:02.622 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6b00f8e5-f86c-403d-9638-312d20a8555d]: (4, ('Thu Jan 22 05:18:02 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6c418cf9-3d22-4e00-a57b-fdff8a1243c2 (100756c2c1ca9154c5fd9396da462fae4b462634e49c333e3cda7ad5a4587548)\n100756c2c1ca9154c5fd9396da462fae4b462634e49c333e3cda7ad5a4587548\nThu Jan 22 05:18:02 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6c418cf9-3d22-4e00-a57b-fdff8a1243c2 (100756c2c1ca9154c5fd9396da462fae4b462634e49c333e3cda7ad5a4587548)\n100756c2c1ca9154c5fd9396da462fae4b462634e49c333e3cda7ad5a4587548\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:18:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:02.624 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b099174b-f887-4851-9af0-8bf7ea6dbb4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:18:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:02.625 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6c418cf9-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.626 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:02 compute-0 kernel: tap6c418cf9-30: left promiscuous mode
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.628 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:02.629 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[cdf6f621-a2c1-4564-b29a-6d5f9971dae6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:18:02 compute-0 nova_compute[183075]: 2026-01-22 17:18:02.638 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:02.648 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[62665738-0d40-4971-8f58-b264f7a9bfd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:18:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:02.649 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e33422eb-d5f4-4976-9a34-1ef97c5436b4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:18:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:02.668 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7091214e-ad1f-4b25-972c-bbd59b94a2b0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 464208, 'reachable_time': 24588, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225477, 'error': None, 'target': 'ovnmeta-6c418cf9-3d22-4e00-a57b-fdff8a1243c2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:18:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:02.670 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6c418cf9-3d22-4e00-a57b-fdff8a1243c2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:18:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:02.670 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[cc21aaf8-ff1c-4cca-b7d5-173fed78963b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:18:03 compute-0 systemd[1]: run-netns-ovnmeta\x2d6c418cf9\x2d3d22\x2d4e00\x2da57b\x2dfdff8a1243c2.mount: Deactivated successfully.
Jan 22 17:18:03 compute-0 nova_compute[183075]: 2026-01-22 17:18:03.784 183079 DEBUG nova.compute.manager [req-9928ec85-1b08-424e-83e7-07e4962b1129 req-9b940f51-d1e1-49f9-a14a-b3184a1745b5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Received event network-vif-unplugged-7e367df6-1641-474f-ae64-7a03b03508ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:18:03 compute-0 nova_compute[183075]: 2026-01-22 17:18:03.785 183079 DEBUG oslo_concurrency.lockutils [req-9928ec85-1b08-424e-83e7-07e4962b1129 req-9b940f51-d1e1-49f9-a14a-b3184a1745b5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:18:03 compute-0 nova_compute[183075]: 2026-01-22 17:18:03.785 183079 DEBUG oslo_concurrency.lockutils [req-9928ec85-1b08-424e-83e7-07e4962b1129 req-9b940f51-d1e1-49f9-a14a-b3184a1745b5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:18:03 compute-0 nova_compute[183075]: 2026-01-22 17:18:03.785 183079 DEBUG oslo_concurrency.lockutils [req-9928ec85-1b08-424e-83e7-07e4962b1129 req-9b940f51-d1e1-49f9-a14a-b3184a1745b5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:18:03 compute-0 nova_compute[183075]: 2026-01-22 17:18:03.785 183079 DEBUG nova.compute.manager [req-9928ec85-1b08-424e-83e7-07e4962b1129 req-9b940f51-d1e1-49f9-a14a-b3184a1745b5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] No waiting events found dispatching network-vif-unplugged-7e367df6-1641-474f-ae64-7a03b03508ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:18:03 compute-0 nova_compute[183075]: 2026-01-22 17:18:03.785 183079 DEBUG nova.compute.manager [req-9928ec85-1b08-424e-83e7-07e4962b1129 req-9b940f51-d1e1-49f9-a14a-b3184a1745b5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Received event network-vif-unplugged-7e367df6-1641-474f-ae64-7a03b03508ec for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:18:03 compute-0 nova_compute[183075]: 2026-01-22 17:18:03.786 183079 DEBUG nova.compute.manager [req-9928ec85-1b08-424e-83e7-07e4962b1129 req-9b940f51-d1e1-49f9-a14a-b3184a1745b5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Received event network-vif-plugged-7e367df6-1641-474f-ae64-7a03b03508ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:18:03 compute-0 nova_compute[183075]: 2026-01-22 17:18:03.786 183079 DEBUG oslo_concurrency.lockutils [req-9928ec85-1b08-424e-83e7-07e4962b1129 req-9b940f51-d1e1-49f9-a14a-b3184a1745b5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:18:03 compute-0 nova_compute[183075]: 2026-01-22 17:18:03.786 183079 DEBUG oslo_concurrency.lockutils [req-9928ec85-1b08-424e-83e7-07e4962b1129 req-9b940f51-d1e1-49f9-a14a-b3184a1745b5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:18:03 compute-0 nova_compute[183075]: 2026-01-22 17:18:03.786 183079 DEBUG oslo_concurrency.lockutils [req-9928ec85-1b08-424e-83e7-07e4962b1129 req-9b940f51-d1e1-49f9-a14a-b3184a1745b5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:18:03 compute-0 nova_compute[183075]: 2026-01-22 17:18:03.787 183079 DEBUG nova.compute.manager [req-9928ec85-1b08-424e-83e7-07e4962b1129 req-9b940f51-d1e1-49f9-a14a-b3184a1745b5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] No waiting events found dispatching network-vif-plugged-7e367df6-1641-474f-ae64-7a03b03508ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:18:03 compute-0 nova_compute[183075]: 2026-01-22 17:18:03.787 183079 WARNING nova.compute.manager [req-9928ec85-1b08-424e-83e7-07e4962b1129 req-9b940f51-d1e1-49f9-a14a-b3184a1745b5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Received unexpected event network-vif-plugged-7e367df6-1641-474f-ae64-7a03b03508ec for instance with vm_state active and task_state deleting.
Jan 22 17:18:04 compute-0 nova_compute[183075]: 2026-01-22 17:18:04.905 183079 DEBUG nova.compute.manager [req-b1b3c4f7-4875-4bab-bb20-7b1239a54ca2 req-aa1a0c79-60ad-4e12-b38a-6ce8aec05ebd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Received event network-vif-plugged-6b1bf9db-e098-4d03-b185-9a64eee8cec2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:18:04 compute-0 nova_compute[183075]: 2026-01-22 17:18:04.906 183079 DEBUG oslo_concurrency.lockutils [req-b1b3c4f7-4875-4bab-bb20-7b1239a54ca2 req-aa1a0c79-60ad-4e12-b38a-6ce8aec05ebd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:18:04 compute-0 nova_compute[183075]: 2026-01-22 17:18:04.907 183079 DEBUG oslo_concurrency.lockutils [req-b1b3c4f7-4875-4bab-bb20-7b1239a54ca2 req-aa1a0c79-60ad-4e12-b38a-6ce8aec05ebd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:18:04 compute-0 nova_compute[183075]: 2026-01-22 17:18:04.907 183079 DEBUG oslo_concurrency.lockutils [req-b1b3c4f7-4875-4bab-bb20-7b1239a54ca2 req-aa1a0c79-60ad-4e12-b38a-6ce8aec05ebd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c925ab60-0524-40ab-a82b-52f810b9023f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:18:04 compute-0 nova_compute[183075]: 2026-01-22 17:18:04.908 183079 DEBUG nova.compute.manager [req-b1b3c4f7-4875-4bab-bb20-7b1239a54ca2 req-aa1a0c79-60ad-4e12-b38a-6ce8aec05ebd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] No waiting events found dispatching network-vif-plugged-6b1bf9db-e098-4d03-b185-9a64eee8cec2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:18:04 compute-0 nova_compute[183075]: 2026-01-22 17:18:04.908 183079 WARNING nova.compute.manager [req-b1b3c4f7-4875-4bab-bb20-7b1239a54ca2 req-aa1a0c79-60ad-4e12-b38a-6ce8aec05ebd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Received unexpected event network-vif-plugged-6b1bf9db-e098-4d03-b185-9a64eee8cec2 for instance with vm_state active and task_state deleting.
Jan 22 17:18:05 compute-0 nova_compute[183075]: 2026-01-22 17:18:05.920 183079 INFO nova.compute.manager [None req-617999cb-3ef8-427a-82d7-ed6e742f40d5 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Get console output
Jan 22 17:18:05 compute-0 nova_compute[183075]: 2026-01-22 17:18:05.925 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:18:06 compute-0 nova_compute[183075]: 2026-01-22 17:18:06.319 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:06 compute-0 nova_compute[183075]: 2026-01-22 17:18:06.357 183079 DEBUG nova.network.neutron [-] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:18:06 compute-0 nova_compute[183075]: 2026-01-22 17:18:06.375 183079 INFO nova.compute.manager [-] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Took 3.99 seconds to deallocate network for instance.
Jan 22 17:18:06 compute-0 nova_compute[183075]: 2026-01-22 17:18:06.406 183079 DEBUG nova.compute.manager [req-48bb3d1f-51d9-4ce3-9e07-59ab3cfd3d09 req-2fe9b472-614e-4e8e-80f5-f50f85e1304d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Received event network-vif-deleted-6b1bf9db-e098-4d03-b185-9a64eee8cec2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:18:06 compute-0 nova_compute[183075]: 2026-01-22 17:18:06.422 183079 DEBUG oslo_concurrency.lockutils [None req-417c9b21-e4ad-4af1-8078-d7f891c4e2d9 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:18:06 compute-0 nova_compute[183075]: 2026-01-22 17:18:06.423 183079 DEBUG oslo_concurrency.lockutils [None req-417c9b21-e4ad-4af1-8078-d7f891c4e2d9 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:18:06 compute-0 nova_compute[183075]: 2026-01-22 17:18:06.492 183079 DEBUG nova.compute.provider_tree [None req-417c9b21-e4ad-4af1-8078-d7f891c4e2d9 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:18:06 compute-0 nova_compute[183075]: 2026-01-22 17:18:06.505 183079 DEBUG nova.scheduler.client.report [None req-417c9b21-e4ad-4af1-8078-d7f891c4e2d9 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:18:06 compute-0 nova_compute[183075]: 2026-01-22 17:18:06.524 183079 DEBUG oslo_concurrency.lockutils [None req-417c9b21-e4ad-4af1-8078-d7f891c4e2d9 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:18:06 compute-0 nova_compute[183075]: 2026-01-22 17:18:06.562 183079 INFO nova.scheduler.client.report [None req-417c9b21-e4ad-4af1-8078-d7f891c4e2d9 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Deleted allocations for instance c925ab60-0524-40ab-a82b-52f810b9023f
Jan 22 17:18:06 compute-0 nova_compute[183075]: 2026-01-22 17:18:06.624 183079 DEBUG oslo_concurrency.lockutils [None req-417c9b21-e4ad-4af1-8078-d7f891c4e2d9 2fe02f7484a94091bab26aba1c370459 4f95c62a5a194d2291b03187a9c85702 - - default default] Lock "c925ab60-0524-40ab-a82b-52f810b9023f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.602s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:18:07 compute-0 nova_compute[183075]: 2026-01-22 17:18:07.334 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:08 compute-0 podman[225479]: 2026-01-22 17:18:08.392719835 +0000 UTC m=+0.082251050 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 22 17:18:08 compute-0 podman[225480]: 2026-01-22 17:18:08.413595941 +0000 UTC m=+0.100650841 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, managed_by=edpm_ansible, release=1755695350, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6, config_id=openstack_network_exporter)
Jan 22 17:18:08 compute-0 podman[225478]: 2026-01-22 17:18:08.445363591 +0000 UTC m=+0.142145705 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:18:08 compute-0 nova_compute[183075]: 2026-01-22 17:18:08.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:18:11 compute-0 nova_compute[183075]: 2026-01-22 17:18:11.099 183079 INFO nova.compute.manager [None req-666bc30c-3915-4be3-8144-18cb2f41eb9a 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Get console output
Jan 22 17:18:11 compute-0 nova_compute[183075]: 2026-01-22 17:18:11.105 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:18:11 compute-0 nova_compute[183075]: 2026-01-22 17:18:11.321 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:11 compute-0 nova_compute[183075]: 2026-01-22 17:18:11.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:18:12 compute-0 nova_compute[183075]: 2026-01-22 17:18:12.337 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:12 compute-0 nova_compute[183075]: 2026-01-22 17:18:12.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:18:13 compute-0 podman[225540]: 2026-01-22 17:18:13.408346139 +0000 UTC m=+0.106925045 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 17:18:14 compute-0 nova_compute[183075]: 2026-01-22 17:18:14.789 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:18:16 compute-0 nova_compute[183075]: 2026-01-22 17:18:16.287 183079 INFO nova.compute.manager [None req-a2b991bc-73b3-4cd9-953d-952af41115fc 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Get console output
Jan 22 17:18:16 compute-0 nova_compute[183075]: 2026-01-22 17:18:16.294 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:18:16 compute-0 nova_compute[183075]: 2026-01-22 17:18:16.324 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:16 compute-0 nova_compute[183075]: 2026-01-22 17:18:16.784 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:18:17 compute-0 nova_compute[183075]: 2026-01-22 17:18:17.290 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769102282.2893598, c925ab60-0524-40ab-a82b-52f810b9023f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:18:17 compute-0 nova_compute[183075]: 2026-01-22 17:18:17.291 183079 INFO nova.compute.manager [-] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] VM Stopped (Lifecycle Event)
Jan 22 17:18:17 compute-0 nova_compute[183075]: 2026-01-22 17:18:17.378 183079 DEBUG nova.compute.manager [None req-0c88d55a-6016-425e-a77e-a47d7bf17643 - - - - - -] [instance: c925ab60-0524-40ab-a82b-52f810b9023f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:18:17 compute-0 nova_compute[183075]: 2026-01-22 17:18:17.379 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:17 compute-0 nova_compute[183075]: 2026-01-22 17:18:17.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:18:17 compute-0 nova_compute[183075]: 2026-01-22 17:18:17.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:18:18 compute-0 nova_compute[183075]: 2026-01-22 17:18:18.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:18:18 compute-0 nova_compute[183075]: 2026-01-22 17:18:18.789 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:18:18 compute-0 nova_compute[183075]: 2026-01-22 17:18:18.789 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:18:18 compute-0 nova_compute[183075]: 2026-01-22 17:18:18.844 183079 DEBUG nova.compute.manager [req-541f2c52-8168-4ffe-aaa7-d1858637cb8e req-adc229d9-3753-42a3-bf52-eeef7b3a8e87 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Received event network-changed-9a06288c-d8e5-43c4-9559-23674152a05e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:18:18 compute-0 nova_compute[183075]: 2026-01-22 17:18:18.845 183079 DEBUG nova.compute.manager [req-541f2c52-8168-4ffe-aaa7-d1858637cb8e req-adc229d9-3753-42a3-bf52-eeef7b3a8e87 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Refreshing instance network info cache due to event network-changed-9a06288c-d8e5-43c4-9559-23674152a05e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:18:18 compute-0 nova_compute[183075]: 2026-01-22 17:18:18.845 183079 DEBUG oslo_concurrency.lockutils [req-541f2c52-8168-4ffe-aaa7-d1858637cb8e req-adc229d9-3753-42a3-bf52-eeef7b3a8e87 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-02af6288-0bd3-438c-982d-f36b31e1a9bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:18:18 compute-0 nova_compute[183075]: 2026-01-22 17:18:18.845 183079 DEBUG oslo_concurrency.lockutils [req-541f2c52-8168-4ffe-aaa7-d1858637cb8e req-adc229d9-3753-42a3-bf52-eeef7b3a8e87 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-02af6288-0bd3-438c-982d-f36b31e1a9bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:18:18 compute-0 nova_compute[183075]: 2026-01-22 17:18:18.845 183079 DEBUG nova.network.neutron [req-541f2c52-8168-4ffe-aaa7-d1858637cb8e req-adc229d9-3753-42a3-bf52-eeef7b3a8e87 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Refreshing network info cache for port 9a06288c-d8e5-43c4-9559-23674152a05e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:18:19 compute-0 nova_compute[183075]: 2026-01-22 17:18:19.107 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-02af6288-0bd3-438c-982d-f36b31e1a9bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:18:19 compute-0 ovn_controller[95372]: 2026-01-22T17:18:19Z|00402|binding|INFO|Releasing lport cd0e1120-5be8-4515-9d17-af992fbbcf85 from this chassis (sb_readonly=0)
Jan 22 17:18:19 compute-0 nova_compute[183075]: 2026-01-22 17:18:19.753 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:20 compute-0 podman[225560]: 2026-01-22 17:18:20.353062179 +0000 UTC m=+0.060711567 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 17:18:20 compute-0 nova_compute[183075]: 2026-01-22 17:18:20.634 183079 DEBUG nova.network.neutron [req-541f2c52-8168-4ffe-aaa7-d1858637cb8e req-adc229d9-3753-42a3-bf52-eeef7b3a8e87 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Updated VIF entry in instance network info cache for port 9a06288c-d8e5-43c4-9559-23674152a05e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:18:20 compute-0 nova_compute[183075]: 2026-01-22 17:18:20.635 183079 DEBUG nova.network.neutron [req-541f2c52-8168-4ffe-aaa7-d1858637cb8e req-adc229d9-3753-42a3-bf52-eeef7b3a8e87 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Updating instance_info_cache with network_info: [{"id": "9a06288c-d8e5-43c4-9559-23674152a05e", "address": "fa:16:3e:b0:5d:07", "network": {"id": "b8340cc9-e27b-4dbd-8de5-9c101e7b64ce", "bridge": "br-int", "label": "tempest-test-network--400188420", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70a37fbbd795434fbeb722ad97dda552", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a06288c-d8", "ovs_interfaceid": "9a06288c-d8e5-43c4-9559-23674152a05e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:18:20 compute-0 nova_compute[183075]: 2026-01-22 17:18:20.660 183079 DEBUG oslo_concurrency.lockutils [req-541f2c52-8168-4ffe-aaa7-d1858637cb8e req-adc229d9-3753-42a3-bf52-eeef7b3a8e87 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-02af6288-0bd3-438c-982d-f36b31e1a9bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:18:20 compute-0 nova_compute[183075]: 2026-01-22 17:18:20.660 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-02af6288-0bd3-438c-982d-f36b31e1a9bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:18:20 compute-0 nova_compute[183075]: 2026-01-22 17:18:20.661 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:18:20 compute-0 nova_compute[183075]: 2026-01-22 17:18:20.661 183079 DEBUG nova.objects.instance [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 02af6288-0bd3-438c-982d-f36b31e1a9bf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:18:21 compute-0 nova_compute[183075]: 2026-01-22 17:18:21.326 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:21 compute-0 nova_compute[183075]: 2026-01-22 17:18:21.760 183079 DEBUG nova.compute.manager [req-b883bab6-8ce4-44c8-ad79-e9edd175be17 req-16047310-f710-43cd-adc5-50d4213bc328 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Received event network-changed-9a06288c-d8e5-43c4-9559-23674152a05e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:18:21 compute-0 nova_compute[183075]: 2026-01-22 17:18:21.761 183079 DEBUG nova.compute.manager [req-b883bab6-8ce4-44c8-ad79-e9edd175be17 req-16047310-f710-43cd-adc5-50d4213bc328 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Refreshing instance network info cache due to event network-changed-9a06288c-d8e5-43c4-9559-23674152a05e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:18:21 compute-0 nova_compute[183075]: 2026-01-22 17:18:21.762 183079 DEBUG oslo_concurrency.lockutils [req-b883bab6-8ce4-44c8-ad79-e9edd175be17 req-16047310-f710-43cd-adc5-50d4213bc328 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-02af6288-0bd3-438c-982d-f36b31e1a9bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:18:22 compute-0 nova_compute[183075]: 2026-01-22 17:18:22.237 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Updating instance_info_cache with network_info: [{"id": "9a06288c-d8e5-43c4-9559-23674152a05e", "address": "fa:16:3e:b0:5d:07", "network": {"id": "b8340cc9-e27b-4dbd-8de5-9c101e7b64ce", "bridge": "br-int", "label": "tempest-test-network--400188420", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70a37fbbd795434fbeb722ad97dda552", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a06288c-d8", "ovs_interfaceid": "9a06288c-d8e5-43c4-9559-23674152a05e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:18:22 compute-0 nova_compute[183075]: 2026-01-22 17:18:22.255 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-02af6288-0bd3-438c-982d-f36b31e1a9bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:18:22 compute-0 nova_compute[183075]: 2026-01-22 17:18:22.255 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:18:22 compute-0 nova_compute[183075]: 2026-01-22 17:18:22.256 183079 DEBUG oslo_concurrency.lockutils [req-b883bab6-8ce4-44c8-ad79-e9edd175be17 req-16047310-f710-43cd-adc5-50d4213bc328 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-02af6288-0bd3-438c-982d-f36b31e1a9bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:18:22 compute-0 nova_compute[183075]: 2026-01-22 17:18:22.256 183079 DEBUG nova.network.neutron [req-b883bab6-8ce4-44c8-ad79-e9edd175be17 req-16047310-f710-43cd-adc5-50d4213bc328 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Refreshing network info cache for port 9a06288c-d8e5-43c4-9559-23674152a05e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:18:22 compute-0 nova_compute[183075]: 2026-01-22 17:18:22.257 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:18:22 compute-0 nova_compute[183075]: 2026-01-22 17:18:22.259 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:18:22 compute-0 nova_compute[183075]: 2026-01-22 17:18:22.283 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:18:22 compute-0 nova_compute[183075]: 2026-01-22 17:18:22.283 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:18:22 compute-0 nova_compute[183075]: 2026-01-22 17:18:22.283 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:18:22 compute-0 nova_compute[183075]: 2026-01-22 17:18:22.284 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:18:22 compute-0 nova_compute[183075]: 2026-01-22 17:18:22.364 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02af6288-0bd3-438c-982d-f36b31e1a9bf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:18:22 compute-0 nova_compute[183075]: 2026-01-22 17:18:22.381 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:22 compute-0 nova_compute[183075]: 2026-01-22 17:18:22.419 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02af6288-0bd3-438c-982d-f36b31e1a9bf/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:18:22 compute-0 nova_compute[183075]: 2026-01-22 17:18:22.420 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02af6288-0bd3-438c-982d-f36b31e1a9bf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:18:22 compute-0 nova_compute[183075]: 2026-01-22 17:18:22.484 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02af6288-0bd3-438c-982d-f36b31e1a9bf/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:18:22 compute-0 nova_compute[183075]: 2026-01-22 17:18:22.644 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:18:22 compute-0 nova_compute[183075]: 2026-01-22 17:18:22.645 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5554MB free_disk=73.3380012512207GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:18:22 compute-0 nova_compute[183075]: 2026-01-22 17:18:22.645 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:18:22 compute-0 nova_compute[183075]: 2026-01-22 17:18:22.645 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:18:22 compute-0 nova_compute[183075]: 2026-01-22 17:18:22.740 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 02af6288-0bd3-438c-982d-f36b31e1a9bf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:18:22 compute-0 nova_compute[183075]: 2026-01-22 17:18:22.740 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:18:22 compute-0 nova_compute[183075]: 2026-01-22 17:18:22.740 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:18:22 compute-0 nova_compute[183075]: 2026-01-22 17:18:22.800 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:18:22 compute-0 nova_compute[183075]: 2026-01-22 17:18:22.818 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:18:22 compute-0 nova_compute[183075]: 2026-01-22 17:18:22.847 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:18:22 compute-0 nova_compute[183075]: 2026-01-22 17:18:22.848 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:18:23 compute-0 nova_compute[183075]: 2026-01-22 17:18:23.844 183079 DEBUG nova.compute.manager [req-335ce94e-45a4-4b76-a519-1d9b2ab9e57b req-130cc66d-8fd6-483e-a3c7-2404689fe68b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Received event network-changed-9a06288c-d8e5-43c4-9559-23674152a05e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:18:23 compute-0 nova_compute[183075]: 2026-01-22 17:18:23.845 183079 DEBUG nova.compute.manager [req-335ce94e-45a4-4b76-a519-1d9b2ab9e57b req-130cc66d-8fd6-483e-a3c7-2404689fe68b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Refreshing instance network info cache due to event network-changed-9a06288c-d8e5-43c4-9559-23674152a05e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:18:23 compute-0 nova_compute[183075]: 2026-01-22 17:18:23.845 183079 DEBUG oslo_concurrency.lockutils [req-335ce94e-45a4-4b76-a519-1d9b2ab9e57b req-130cc66d-8fd6-483e-a3c7-2404689fe68b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-02af6288-0bd3-438c-982d-f36b31e1a9bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:18:24 compute-0 nova_compute[183075]: 2026-01-22 17:18:24.252 183079 DEBUG nova.network.neutron [req-b883bab6-8ce4-44c8-ad79-e9edd175be17 req-16047310-f710-43cd-adc5-50d4213bc328 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Updated VIF entry in instance network info cache for port 9a06288c-d8e5-43c4-9559-23674152a05e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:18:24 compute-0 nova_compute[183075]: 2026-01-22 17:18:24.253 183079 DEBUG nova.network.neutron [req-b883bab6-8ce4-44c8-ad79-e9edd175be17 req-16047310-f710-43cd-adc5-50d4213bc328 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Updating instance_info_cache with network_info: [{"id": "9a06288c-d8e5-43c4-9559-23674152a05e", "address": "fa:16:3e:b0:5d:07", "network": {"id": "b8340cc9-e27b-4dbd-8de5-9c101e7b64ce", "bridge": "br-int", "label": "tempest-test-network--400188420", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70a37fbbd795434fbeb722ad97dda552", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a06288c-d8", "ovs_interfaceid": "9a06288c-d8e5-43c4-9559-23674152a05e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:18:24 compute-0 nova_compute[183075]: 2026-01-22 17:18:24.275 183079 DEBUG oslo_concurrency.lockutils [req-b883bab6-8ce4-44c8-ad79-e9edd175be17 req-16047310-f710-43cd-adc5-50d4213bc328 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-02af6288-0bd3-438c-982d-f36b31e1a9bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:18:24 compute-0 nova_compute[183075]: 2026-01-22 17:18:24.277 183079 DEBUG oslo_concurrency.lockutils [req-335ce94e-45a4-4b76-a519-1d9b2ab9e57b req-130cc66d-8fd6-483e-a3c7-2404689fe68b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-02af6288-0bd3-438c-982d-f36b31e1a9bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:18:24 compute-0 nova_compute[183075]: 2026-01-22 17:18:24.277 183079 DEBUG nova.network.neutron [req-335ce94e-45a4-4b76-a519-1d9b2ab9e57b req-130cc66d-8fd6-483e-a3c7-2404689fe68b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Refreshing network info cache for port 9a06288c-d8e5-43c4-9559-23674152a05e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:18:26 compute-0 nova_compute[183075]: 2026-01-22 17:18:26.327 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:26 compute-0 nova_compute[183075]: 2026-01-22 17:18:26.961 183079 DEBUG nova.network.neutron [req-335ce94e-45a4-4b76-a519-1d9b2ab9e57b req-130cc66d-8fd6-483e-a3c7-2404689fe68b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Updated VIF entry in instance network info cache for port 9a06288c-d8e5-43c4-9559-23674152a05e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:18:26 compute-0 nova_compute[183075]: 2026-01-22 17:18:26.962 183079 DEBUG nova.network.neutron [req-335ce94e-45a4-4b76-a519-1d9b2ab9e57b req-130cc66d-8fd6-483e-a3c7-2404689fe68b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Updating instance_info_cache with network_info: [{"id": "9a06288c-d8e5-43c4-9559-23674152a05e", "address": "fa:16:3e:b0:5d:07", "network": {"id": "b8340cc9-e27b-4dbd-8de5-9c101e7b64ce", "bridge": "br-int", "label": "tempest-test-network--400188420", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70a37fbbd795434fbeb722ad97dda552", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a06288c-d8", "ovs_interfaceid": "9a06288c-d8e5-43c4-9559-23674152a05e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:18:26 compute-0 nova_compute[183075]: 2026-01-22 17:18:26.994 183079 DEBUG oslo_concurrency.lockutils [req-335ce94e-45a4-4b76-a519-1d9b2ab9e57b req-130cc66d-8fd6-483e-a3c7-2404689fe68b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-02af6288-0bd3-438c-982d-f36b31e1a9bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:18:27 compute-0 nova_compute[183075]: 2026-01-22 17:18:27.384 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:29 compute-0 podman[225591]: 2026-01-22 17:18:29.362650033 +0000 UTC m=+0.059787653 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:18:31 compute-0 nova_compute[183075]: 2026-01-22 17:18:31.329 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:32 compute-0 nova_compute[183075]: 2026-01-22 17:18:32.419 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.180 183079 DEBUG oslo_concurrency.lockutils [None req-5d490315-4892-49fd-9b07-fe499de4d01a 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Acquiring lock "02af6288-0bd3-438c-982d-f36b31e1a9bf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.181 183079 DEBUG oslo_concurrency.lockutils [None req-5d490315-4892-49fd-9b07-fe499de4d01a 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Lock "02af6288-0bd3-438c-982d-f36b31e1a9bf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.181 183079 DEBUG oslo_concurrency.lockutils [None req-5d490315-4892-49fd-9b07-fe499de4d01a 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Acquiring lock "02af6288-0bd3-438c-982d-f36b31e1a9bf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.182 183079 DEBUG oslo_concurrency.lockutils [None req-5d490315-4892-49fd-9b07-fe499de4d01a 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Lock "02af6288-0bd3-438c-982d-f36b31e1a9bf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.182 183079 DEBUG oslo_concurrency.lockutils [None req-5d490315-4892-49fd-9b07-fe499de4d01a 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Lock "02af6288-0bd3-438c-982d-f36b31e1a9bf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.184 183079 INFO nova.compute.manager [None req-5d490315-4892-49fd-9b07-fe499de4d01a 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Terminating instance
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.186 183079 DEBUG nova.compute.manager [None req-5d490315-4892-49fd-9b07-fe499de4d01a 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:18:35 compute-0 kernel: tap9a06288c-d8 (unregistering): left promiscuous mode
Jan 22 17:18:35 compute-0 NetworkManager[55454]: <info>  [1769102315.2174] device (tap9a06288c-d8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.225 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:35 compute-0 ovn_controller[95372]: 2026-01-22T17:18:35Z|00403|binding|INFO|Releasing lport 9a06288c-d8e5-43c4-9559-23674152a05e from this chassis (sb_readonly=0)
Jan 22 17:18:35 compute-0 ovn_controller[95372]: 2026-01-22T17:18:35Z|00404|binding|INFO|Setting lport 9a06288c-d8e5-43c4-9559-23674152a05e down in Southbound
Jan 22 17:18:35 compute-0 ovn_controller[95372]: 2026-01-22T17:18:35Z|00405|binding|INFO|Removing iface tap9a06288c-d8 ovn-installed in OVS
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.230 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.246 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:35.261 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:5d:07 10.100.0.27'], port_security=['fa:16:3e:b0:5d:07 10.100.0.27'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.27/28', 'neutron:device_id': '02af6288-0bd3-438c-982d-f36b31e1a9bf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '70a37fbbd795434fbeb722ad97dda552', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ec8664a7-160c-4633-979c-dec0eb1895f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.197'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=679d6e1b-038a-49c0-93ea-1c7e7848d42c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=9a06288c-d8e5-43c4-9559-23674152a05e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:18:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:35.262 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 9a06288c-d8e5-43c4-9559-23674152a05e in datapath b8340cc9-e27b-4dbd-8de5-9c101e7b64ce unbound from our chassis
Jan 22 17:18:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:35.264 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b8340cc9-e27b-4dbd-8de5-9c101e7b64ce, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:18:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:35.265 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[55f31556-3433-4771-9a66-7c037cb4b2db]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:18:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:35.266 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce namespace which is not needed anymore
Jan 22 17:18:35 compute-0 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d00000020.scope: Deactivated successfully.
Jan 22 17:18:35 compute-0 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d00000020.scope: Consumed 15.504s CPU time.
Jan 22 17:18:35 compute-0 systemd-machined[154382]: Machine qemu-32-instance-00000020 terminated.
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.406 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.411 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:35 compute-0 neutron-haproxy-ovnmeta-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce[224941]: [NOTICE]   (224945) : haproxy version is 2.8.14-c23fe91
Jan 22 17:18:35 compute-0 neutron-haproxy-ovnmeta-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce[224941]: [NOTICE]   (224945) : path to executable is /usr/sbin/haproxy
Jan 22 17:18:35 compute-0 neutron-haproxy-ovnmeta-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce[224941]: [WARNING]  (224945) : Exiting Master process...
Jan 22 17:18:35 compute-0 neutron-haproxy-ovnmeta-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce[224941]: [WARNING]  (224945) : Exiting Master process...
Jan 22 17:18:35 compute-0 neutron-haproxy-ovnmeta-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce[224941]: [ALERT]    (224945) : Current worker (224947) exited with code 143 (Terminated)
Jan 22 17:18:35 compute-0 neutron-haproxy-ovnmeta-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce[224941]: [WARNING]  (224945) : All workers exited. Exiting... (0)
Jan 22 17:18:35 compute-0 systemd[1]: libpod-d70199136438c29fa3a929bd121f7aaccaabfca76249cd9ac5bd414d39cb8b1a.scope: Deactivated successfully.
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.439 183079 INFO nova.virt.libvirt.driver [-] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Instance destroyed successfully.
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.439 183079 DEBUG nova.objects.instance [None req-5d490315-4892-49fd-9b07-fe499de4d01a 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Lazy-loading 'resources' on Instance uuid 02af6288-0bd3-438c-982d-f36b31e1a9bf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:18:35 compute-0 podman[225640]: 2026-01-22 17:18:35.447185102 +0000 UTC m=+0.089062888 container died d70199136438c29fa3a929bd121f7aaccaabfca76249cd9ac5bd414d39cb8b1a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.470 183079 DEBUG nova.virt.libvirt.vif [None req-5d490315-4892-49fd-9b07-fe499de4d01a 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:17:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-571651884',display_name='tempest-server-test-571651884',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-571651884',id=32,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOheSF+/g7pqjxa37mBWTVnLSWr9OoUZA+yJcO7BU9vrZDKpB0HwI4MttcuyJijhiuyAewJavO9K5NemBxxQoaBd71z7dq8hTIGwLmdOggCBA+UUuizOD4iEYMwLvvpiWQ==',key_name='tempest-keypair-test-332367794',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:17:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='70a37fbbd795434fbeb722ad97dda552',ramdisk_id='',reservation_id='r-smyxjhwi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPQosTest-2049146665',owner_user_name='tempest-FloatingIPQosTest-2049146665-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:17:23Z,user_data=None,user_id='3d4b274173814c359055fed8dfc2bdeb',uuid=02af6288-0bd3-438c-982d-f36b31e1a9bf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9a06288c-d8e5-43c4-9559-23674152a05e", "address": "fa:16:3e:b0:5d:07", "network": {"id": "b8340cc9-e27b-4dbd-8de5-9c101e7b64ce", "bridge": "br-int", "label": "tempest-test-network--400188420", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70a37fbbd795434fbeb722ad97dda552", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a06288c-d8", "ovs_interfaceid": "9a06288c-d8e5-43c4-9559-23674152a05e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.471 183079 DEBUG nova.network.os_vif_util [None req-5d490315-4892-49fd-9b07-fe499de4d01a 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Converting VIF {"id": "9a06288c-d8e5-43c4-9559-23674152a05e", "address": "fa:16:3e:b0:5d:07", "network": {"id": "b8340cc9-e27b-4dbd-8de5-9c101e7b64ce", "bridge": "br-int", "label": "tempest-test-network--400188420", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70a37fbbd795434fbeb722ad97dda552", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a06288c-d8", "ovs_interfaceid": "9a06288c-d8e5-43c4-9559-23674152a05e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.472 183079 DEBUG nova.network.os_vif_util [None req-5d490315-4892-49fd-9b07-fe499de4d01a 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b0:5d:07,bridge_name='br-int',has_traffic_filtering=True,id=9a06288c-d8e5-43c4-9559-23674152a05e,network=Network(b8340cc9-e27b-4dbd-8de5-9c101e7b64ce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a06288c-d8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.473 183079 DEBUG os_vif [None req-5d490315-4892-49fd-9b07-fe499de4d01a 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b0:5d:07,bridge_name='br-int',has_traffic_filtering=True,id=9a06288c-d8e5-43c4-9559-23674152a05e,network=Network(b8340cc9-e27b-4dbd-8de5-9c101e7b64ce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a06288c-d8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.478 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.479 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9a06288c-d8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.483 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.486 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.489 183079 INFO os_vif [None req-5d490315-4892-49fd-9b07-fe499de4d01a 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b0:5d:07,bridge_name='br-int',has_traffic_filtering=True,id=9a06288c-d8e5-43c4-9559-23674152a05e,network=Network(b8340cc9-e27b-4dbd-8de5-9c101e7b64ce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a06288c-d8')
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.490 183079 INFO nova.virt.libvirt.driver [None req-5d490315-4892-49fd-9b07-fe499de4d01a 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Deleting instance files /var/lib/nova/instances/02af6288-0bd3-438c-982d-f36b31e1a9bf_del
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.491 183079 INFO nova.virt.libvirt.driver [None req-5d490315-4892-49fd-9b07-fe499de4d01a 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Deletion of /var/lib/nova/instances/02af6288-0bd3-438c-982d-f36b31e1a9bf_del complete
Jan 22 17:18:35 compute-0 systemd[1]: var-lib-containers-storage-overlay-84aed5423a13ebe8528ab2577273d5ba1594bac999ee5ce19a6d35e9fcf523ac-merged.mount: Deactivated successfully.
Jan 22 17:18:35 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d70199136438c29fa3a929bd121f7aaccaabfca76249cd9ac5bd414d39cb8b1a-userdata-shm.mount: Deactivated successfully.
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.575 183079 INFO nova.compute.manager [None req-5d490315-4892-49fd-9b07-fe499de4d01a 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Took 0.39 seconds to destroy the instance on the hypervisor.
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.576 183079 DEBUG oslo.service.loopingcall [None req-5d490315-4892-49fd-9b07-fe499de4d01a 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.577 183079 DEBUG nova.compute.manager [-] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.577 183079 DEBUG nova.network.neutron [-] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:18:35 compute-0 podman[225640]: 2026-01-22 17:18:35.617322306 +0000 UTC m=+0.259200072 container cleanup d70199136438c29fa3a929bd121f7aaccaabfca76249cd9ac5bd414d39cb8b1a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 17:18:35 compute-0 systemd[1]: libpod-conmon-d70199136438c29fa3a929bd121f7aaccaabfca76249cd9ac5bd414d39cb8b1a.scope: Deactivated successfully.
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.631 183079 DEBUG nova.compute.manager [req-39f25454-8c18-4453-80f1-08c4121dbb00 req-bb506739-f299-4818-8e6b-2b9fa9deea34 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Received event network-vif-unplugged-9a06288c-d8e5-43c4-9559-23674152a05e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.632 183079 DEBUG oslo_concurrency.lockutils [req-39f25454-8c18-4453-80f1-08c4121dbb00 req-bb506739-f299-4818-8e6b-2b9fa9deea34 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "02af6288-0bd3-438c-982d-f36b31e1a9bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.633 183079 DEBUG oslo_concurrency.lockutils [req-39f25454-8c18-4453-80f1-08c4121dbb00 req-bb506739-f299-4818-8e6b-2b9fa9deea34 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "02af6288-0bd3-438c-982d-f36b31e1a9bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.633 183079 DEBUG oslo_concurrency.lockutils [req-39f25454-8c18-4453-80f1-08c4121dbb00 req-bb506739-f299-4818-8e6b-2b9fa9deea34 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "02af6288-0bd3-438c-982d-f36b31e1a9bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.634 183079 DEBUG nova.compute.manager [req-39f25454-8c18-4453-80f1-08c4121dbb00 req-bb506739-f299-4818-8e6b-2b9fa9deea34 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] No waiting events found dispatching network-vif-unplugged-9a06288c-d8e5-43c4-9559-23674152a05e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.634 183079 DEBUG nova.compute.manager [req-39f25454-8c18-4453-80f1-08c4121dbb00 req-bb506739-f299-4818-8e6b-2b9fa9deea34 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Received event network-vif-unplugged-9a06288c-d8e5-43c4-9559-23674152a05e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:18:35 compute-0 podman[225686]: 2026-01-22 17:18:35.78089396 +0000 UTC m=+0.138579512 container remove d70199136438c29fa3a929bd121f7aaccaabfca76249cd9ac5bd414d39cb8b1a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 22 17:18:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:35.789 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[9f8bacea-5baa-4e2b-974b-f0a3a5cc38dc]: (4, ('Thu Jan 22 05:18:35 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce (d70199136438c29fa3a929bd121f7aaccaabfca76249cd9ac5bd414d39cb8b1a)\nd70199136438c29fa3a929bd121f7aaccaabfca76249cd9ac5bd414d39cb8b1a\nThu Jan 22 05:18:35 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce (d70199136438c29fa3a929bd121f7aaccaabfca76249cd9ac5bd414d39cb8b1a)\nd70199136438c29fa3a929bd121f7aaccaabfca76249cd9ac5bd414d39cb8b1a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:18:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:35.791 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[307eb0c6-2c50-4c18-a2d9-5c5b3c1e87a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:18:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:35.792 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb8340cc9-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.794 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.817 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:35 compute-0 kernel: tapb8340cc9-e0: left promiscuous mode
Jan 22 17:18:35 compute-0 nova_compute[183075]: 2026-01-22 17:18:35.820 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:35.824 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b5f84e4b-617f-44e5-88f8-00403c9c4a45]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:18:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:35.838 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[5e53e82b-81cd-4816-b1d4-26f3d0b82ea2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:18:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:35.839 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[cf8c1a9d-aa8b-4ee2-a599-b8d3e54b7b6e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:18:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:35.859 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[dbb10266-3fab-43df-aaee-2a1e98794d7b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 460527, 'reachable_time': 19211, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225701, 'error': None, 'target': 'ovnmeta-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:18:35 compute-0 systemd[1]: run-netns-ovnmeta\x2db8340cc9\x2de27b\x2d4dbd\x2d8de5\x2d9c101e7b64ce.mount: Deactivated successfully.
Jan 22 17:18:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:35.864 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b8340cc9-e27b-4dbd-8de5-9c101e7b64ce deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:18:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:35.864 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[f724b875-d08e-4a15-b6e9-dd35f1f380ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:18:36 compute-0 nova_compute[183075]: 2026-01-22 17:18:36.334 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:37 compute-0 nova_compute[183075]: 2026-01-22 17:18:37.957 183079 DEBUG nova.compute.manager [req-6571cf14-8ef7-4e6e-8bd3-3c6f15e13dc2 req-99671d9f-0835-40ce-a278-2572af88b0b6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Received event network-vif-plugged-9a06288c-d8e5-43c4-9559-23674152a05e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:18:37 compute-0 nova_compute[183075]: 2026-01-22 17:18:37.958 183079 DEBUG oslo_concurrency.lockutils [req-6571cf14-8ef7-4e6e-8bd3-3c6f15e13dc2 req-99671d9f-0835-40ce-a278-2572af88b0b6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "02af6288-0bd3-438c-982d-f36b31e1a9bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:18:37 compute-0 nova_compute[183075]: 2026-01-22 17:18:37.958 183079 DEBUG oslo_concurrency.lockutils [req-6571cf14-8ef7-4e6e-8bd3-3c6f15e13dc2 req-99671d9f-0835-40ce-a278-2572af88b0b6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "02af6288-0bd3-438c-982d-f36b31e1a9bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:18:37 compute-0 nova_compute[183075]: 2026-01-22 17:18:37.959 183079 DEBUG oslo_concurrency.lockutils [req-6571cf14-8ef7-4e6e-8bd3-3c6f15e13dc2 req-99671d9f-0835-40ce-a278-2572af88b0b6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "02af6288-0bd3-438c-982d-f36b31e1a9bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:18:37 compute-0 nova_compute[183075]: 2026-01-22 17:18:37.960 183079 DEBUG nova.compute.manager [req-6571cf14-8ef7-4e6e-8bd3-3c6f15e13dc2 req-99671d9f-0835-40ce-a278-2572af88b0b6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] No waiting events found dispatching network-vif-plugged-9a06288c-d8e5-43c4-9559-23674152a05e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:18:37 compute-0 nova_compute[183075]: 2026-01-22 17:18:37.960 183079 WARNING nova.compute.manager [req-6571cf14-8ef7-4e6e-8bd3-3c6f15e13dc2 req-99671d9f-0835-40ce-a278-2572af88b0b6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Received unexpected event network-vif-plugged-9a06288c-d8e5-43c4-9559-23674152a05e for instance with vm_state active and task_state deleting.
Jan 22 17:18:37 compute-0 nova_compute[183075]: 2026-01-22 17:18:37.962 183079 DEBUG nova.network.neutron [-] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:18:38 compute-0 nova_compute[183075]: 2026-01-22 17:18:38.106 183079 INFO nova.compute.manager [-] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Took 2.53 seconds to deallocate network for instance.
Jan 22 17:18:38 compute-0 nova_compute[183075]: 2026-01-22 17:18:38.163 183079 DEBUG oslo_concurrency.lockutils [None req-5d490315-4892-49fd-9b07-fe499de4d01a 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:18:38 compute-0 nova_compute[183075]: 2026-01-22 17:18:38.164 183079 DEBUG oslo_concurrency.lockutils [None req-5d490315-4892-49fd-9b07-fe499de4d01a 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:18:38 compute-0 nova_compute[183075]: 2026-01-22 17:18:38.251 183079 DEBUG nova.compute.provider_tree [None req-5d490315-4892-49fd-9b07-fe499de4d01a 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:18:38 compute-0 nova_compute[183075]: 2026-01-22 17:18:38.287 183079 DEBUG nova.scheduler.client.report [None req-5d490315-4892-49fd-9b07-fe499de4d01a 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:18:38 compute-0 nova_compute[183075]: 2026-01-22 17:18:38.337 183079 DEBUG oslo_concurrency.lockutils [None req-5d490315-4892-49fd-9b07-fe499de4d01a 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:18:38 compute-0 nova_compute[183075]: 2026-01-22 17:18:38.414 183079 INFO nova.scheduler.client.report [None req-5d490315-4892-49fd-9b07-fe499de4d01a 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Deleted allocations for instance 02af6288-0bd3-438c-982d-f36b31e1a9bf
Jan 22 17:18:38 compute-0 nova_compute[183075]: 2026-01-22 17:18:38.574 183079 DEBUG oslo_concurrency.lockutils [None req-5d490315-4892-49fd-9b07-fe499de4d01a 3d4b274173814c359055fed8dfc2bdeb 70a37fbbd795434fbeb722ad97dda552 - - default default] Lock "02af6288-0bd3-438c-982d-f36b31e1a9bf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.393s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:18:39 compute-0 podman[225703]: 2026-01-22 17:18:39.38666453 +0000 UTC m=+0.076076748 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:18:39 compute-0 podman[225704]: 2026-01-22 17:18:39.396103017 +0000 UTC m=+0.082543907 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, vcs-type=git, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 22 17:18:39 compute-0 podman[225702]: 2026-01-22 17:18:39.429096539 +0000 UTC m=+0.125814758 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:18:40 compute-0 nova_compute[183075]: 2026-01-22 17:18:40.098 183079 DEBUG nova.compute.manager [req-4dc0d424-f93c-4268-a095-b8955e9dbb6b req-c0fdb1e2-6850-4318-9589-c163a8540bd6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Received event network-vif-deleted-9a06288c-d8e5-43c4-9559-23674152a05e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:18:40 compute-0 nova_compute[183075]: 2026-01-22 17:18:40.482 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:41 compute-0 nova_compute[183075]: 2026-01-22 17:18:41.336 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:41.934 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:18:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:41.934 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:18:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:41.935 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:18:44 compute-0 podman[225765]: 2026-01-22 17:18:44.412164171 +0000 UTC m=+0.107819378 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 22 17:18:45 compute-0 nova_compute[183075]: 2026-01-22 17:18:45.485 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:45 compute-0 nova_compute[183075]: 2026-01-22 17:18:45.600 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:45 compute-0 nova_compute[183075]: 2026-01-22 17:18:45.734 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:46 compute-0 nova_compute[183075]: 2026-01-22 17:18:46.382 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:48.839 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:18:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:48.840 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:18:48 compute-0 nova_compute[183075]: 2026-01-22 17:18:48.840 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:50 compute-0 nova_compute[183075]: 2026-01-22 17:18:50.436 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769102315.4343574, 02af6288-0bd3-438c-982d-f36b31e1a9bf => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:18:50 compute-0 nova_compute[183075]: 2026-01-22 17:18:50.436 183079 INFO nova.compute.manager [-] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] VM Stopped (Lifecycle Event)
Jan 22 17:18:50 compute-0 nova_compute[183075]: 2026-01-22 17:18:50.469 183079 DEBUG nova.compute.manager [None req-60f8d5f4-1ffe-465e-86d1-ec272ef25528 - - - - - -] [instance: 02af6288-0bd3-438c-982d-f36b31e1a9bf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:18:50 compute-0 nova_compute[183075]: 2026-01-22 17:18:50.489 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:51 compute-0 podman[225789]: 2026-01-22 17:18:51.344323031 +0000 UTC m=+0.059983747 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:18:51 compute-0 nova_compute[183075]: 2026-01-22 17:18:51.384 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:55 compute-0 nova_compute[183075]: 2026-01-22 17:18:55.492 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:56 compute-0 nova_compute[183075]: 2026-01-22 17:18:56.386 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:18:58.842 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:19:00 compute-0 podman[225814]: 2026-01-22 17:19:00.352786496 +0000 UTC m=+0.066936710 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 17:19:00 compute-0 nova_compute[183075]: 2026-01-22 17:19:00.496 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:01 compute-0 anacron[4164]: Job `cron.monthly' started
Jan 22 17:19:01 compute-0 anacron[4164]: Job `cron.monthly' terminated
Jan 22 17:19:01 compute-0 anacron[4164]: Normal exit (3 jobs run)
Jan 22 17:19:01 compute-0 nova_compute[183075]: 2026-01-22 17:19:01.389 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:05 compute-0 nova_compute[183075]: 2026-01-22 17:19:05.500 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:06 compute-0 nova_compute[183075]: 2026-01-22 17:19:06.391 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:09 compute-0 nova_compute[183075]: 2026-01-22 17:19:09.380 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:19:10 compute-0 podman[225841]: 2026-01-22 17:19:10.388792522 +0000 UTC m=+0.079338274 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 22 17:19:10 compute-0 podman[225847]: 2026-01-22 17:19:10.400763257 +0000 UTC m=+0.079280422 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, maintainer=Red Hat, Inc., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350, build-date=2025-08-20T13:12:41)
Jan 22 17:19:10 compute-0 podman[225840]: 2026-01-22 17:19:10.435414201 +0000 UTC m=+0.142270033 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 22 17:19:10 compute-0 nova_compute[183075]: 2026-01-22 17:19:10.502 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:11 compute-0 nova_compute[183075]: 2026-01-22 17:19:11.422 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:11 compute-0 nova_compute[183075]: 2026-01-22 17:19:11.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:19:11 compute-0 nova_compute[183075]: 2026-01-22 17:19:11.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 17:19:12 compute-0 nova_compute[183075]: 2026-01-22 17:19:12.799 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:19:13 compute-0 nova_compute[183075]: 2026-01-22 17:19:13.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:19:15 compute-0 podman[225904]: 2026-01-22 17:19:15.345819114 +0000 UTC m=+0.060822126 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:19:15 compute-0 nova_compute[183075]: 2026-01-22 17:19:15.505 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:16 compute-0 nova_compute[183075]: 2026-01-22 17:19:16.423 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:16 compute-0 nova_compute[183075]: 2026-01-22 17:19:16.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:19:17 compute-0 nova_compute[183075]: 2026-01-22 17:19:17.782 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:19:18 compute-0 nova_compute[183075]: 2026-01-22 17:19:18.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:19:19 compute-0 nova_compute[183075]: 2026-01-22 17:19:19.791 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:19:19 compute-0 nova_compute[183075]: 2026-01-22 17:19:19.792 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:19:19 compute-0 nova_compute[183075]: 2026-01-22 17:19:19.792 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:19:19 compute-0 nova_compute[183075]: 2026-01-22 17:19:19.829 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:19 compute-0 nova_compute[183075]: 2026-01-22 17:19:19.829 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:19 compute-0 nova_compute[183075]: 2026-01-22 17:19:19.830 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:19 compute-0 nova_compute[183075]: 2026-01-22 17:19:19.830 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:19:20 compute-0 nova_compute[183075]: 2026-01-22 17:19:20.052 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:19:20 compute-0 nova_compute[183075]: 2026-01-22 17:19:20.054 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5710MB free_disk=73.36702728271484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:19:20 compute-0 nova_compute[183075]: 2026-01-22 17:19:20.054 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:20 compute-0 nova_compute[183075]: 2026-01-22 17:19:20.054 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:20 compute-0 nova_compute[183075]: 2026-01-22 17:19:20.223 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:19:20 compute-0 nova_compute[183075]: 2026-01-22 17:19:20.223 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:19:20 compute-0 nova_compute[183075]: 2026-01-22 17:19:20.252 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:19:20 compute-0 nova_compute[183075]: 2026-01-22 17:19:20.268 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:19:20 compute-0 nova_compute[183075]: 2026-01-22 17:19:20.291 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:19:20 compute-0 nova_compute[183075]: 2026-01-22 17:19:20.292 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.237s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:20 compute-0 nova_compute[183075]: 2026-01-22 17:19:20.508 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:21 compute-0 nova_compute[183075]: 2026-01-22 17:19:21.289 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:19:21 compute-0 nova_compute[183075]: 2026-01-22 17:19:21.289 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:19:21 compute-0 nova_compute[183075]: 2026-01-22 17:19:21.289 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:19:21 compute-0 nova_compute[183075]: 2026-01-22 17:19:21.316 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 17:19:21 compute-0 nova_compute[183075]: 2026-01-22 17:19:21.317 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:19:21 compute-0 nova_compute[183075]: 2026-01-22 17:19:21.425 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:22 compute-0 podman[225925]: 2026-01-22 17:19:22.35007298 +0000 UTC m=+0.062962542 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:19:23 compute-0 ovn_controller[95372]: 2026-01-22T17:19:23Z|00406|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 22 17:19:23 compute-0 nova_compute[183075]: 2026-01-22 17:19:23.769 183079 DEBUG oslo_concurrency.lockutils [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "c4c0edb4-a206-4617-9465-58c87dcdf7d5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:23 compute-0 nova_compute[183075]: 2026-01-22 17:19:23.769 183079 DEBUG oslo_concurrency.lockutils [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "c4c0edb4-a206-4617-9465-58c87dcdf7d5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:23 compute-0 nova_compute[183075]: 2026-01-22 17:19:23.786 183079 DEBUG nova.compute.manager [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:19:23 compute-0 nova_compute[183075]: 2026-01-22 17:19:23.790 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:19:23 compute-0 nova_compute[183075]: 2026-01-22 17:19:23.791 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 17:19:23 compute-0 nova_compute[183075]: 2026-01-22 17:19:23.823 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.003 183079 DEBUG oslo_concurrency.lockutils [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.004 183079 DEBUG oslo_concurrency.lockutils [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.013 183079 DEBUG nova.virt.hardware [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.014 183079 INFO nova.compute.claims [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.132 183079 DEBUG nova.compute.provider_tree [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.144 183079 DEBUG nova.scheduler.client.report [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.161 183079 DEBUG oslo_concurrency.lockutils [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.162 183079 DEBUG nova.compute.manager [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.196 183079 DEBUG nova.compute.manager [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.197 183079 DEBUG nova.network.neutron [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.217 183079 INFO nova.virt.libvirt.driver [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.233 183079 DEBUG nova.compute.manager [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.340 183079 DEBUG nova.compute.manager [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.341 183079 DEBUG nova.virt.libvirt.driver [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.341 183079 INFO nova.virt.libvirt.driver [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Creating image(s)
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.342 183079 DEBUG oslo_concurrency.lockutils [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "/var/lib/nova/instances/c4c0edb4-a206-4617-9465-58c87dcdf7d5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.342 183079 DEBUG oslo_concurrency.lockutils [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "/var/lib/nova/instances/c4c0edb4-a206-4617-9465-58c87dcdf7d5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.343 183079 DEBUG oslo_concurrency.lockutils [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "/var/lib/nova/instances/c4c0edb4-a206-4617-9465-58c87dcdf7d5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.355 183079 DEBUG oslo_concurrency.processutils [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.411 183079 DEBUG oslo_concurrency.processutils [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.412 183079 DEBUG oslo_concurrency.lockutils [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.412 183079 DEBUG oslo_concurrency.lockutils [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.423 183079 DEBUG oslo_concurrency.processutils [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.477 183079 DEBUG oslo_concurrency.processutils [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.478 183079 DEBUG oslo_concurrency.processutils [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/c4c0edb4-a206-4617-9465-58c87dcdf7d5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.530 183079 DEBUG oslo_concurrency.processutils [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/c4c0edb4-a206-4617-9465-58c87dcdf7d5/disk 1073741824" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.531 183079 DEBUG oslo_concurrency.lockutils [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.532 183079 DEBUG oslo_concurrency.processutils [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.588 183079 DEBUG oslo_concurrency.processutils [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.589 183079 DEBUG nova.virt.disk.api [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Checking if we can resize image /var/lib/nova/instances/c4c0edb4-a206-4617-9465-58c87dcdf7d5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.589 183079 DEBUG oslo_concurrency.processutils [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4c0edb4-a206-4617-9465-58c87dcdf7d5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.650 183079 DEBUG nova.policy [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.672 183079 DEBUG oslo_concurrency.processutils [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4c0edb4-a206-4617-9465-58c87dcdf7d5/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.672 183079 DEBUG nova.virt.disk.api [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Cannot resize image /var/lib/nova/instances/c4c0edb4-a206-4617-9465-58c87dcdf7d5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.673 183079 DEBUG nova.objects.instance [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lazy-loading 'migration_context' on Instance uuid c4c0edb4-a206-4617-9465-58c87dcdf7d5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.688 183079 DEBUG nova.virt.libvirt.driver [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.688 183079 DEBUG nova.virt.libvirt.driver [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Ensure instance console log exists: /var/lib/nova/instances/c4c0edb4-a206-4617-9465-58c87dcdf7d5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.688 183079 DEBUG oslo_concurrency.lockutils [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.689 183079 DEBUG oslo_concurrency.lockutils [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:24 compute-0 nova_compute[183075]: 2026-01-22 17:19:24.689 183079 DEBUG oslo_concurrency.lockutils [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:25 compute-0 nova_compute[183075]: 2026-01-22 17:19:25.511 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:26 compute-0 nova_compute[183075]: 2026-01-22 17:19:26.453 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:26 compute-0 nova_compute[183075]: 2026-01-22 17:19:26.685 183079 DEBUG nova.network.neutron [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Successfully updated port: f801fa72-ebf3-48b4-a510-f75bbe40e687 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:19:26 compute-0 nova_compute[183075]: 2026-01-22 17:19:26.859 183079 DEBUG nova.compute.manager [req-f2df6728-3279-44fe-ab78-bcd7788164b2 req-3661355e-fcdd-4609-96a8-9cf0bb3488eb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Received event network-changed-f801fa72-ebf3-48b4-a510-f75bbe40e687 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:19:26 compute-0 nova_compute[183075]: 2026-01-22 17:19:26.859 183079 DEBUG nova.compute.manager [req-f2df6728-3279-44fe-ab78-bcd7788164b2 req-3661355e-fcdd-4609-96a8-9cf0bb3488eb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Refreshing instance network info cache due to event network-changed-f801fa72-ebf3-48b4-a510-f75bbe40e687. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:19:26 compute-0 nova_compute[183075]: 2026-01-22 17:19:26.860 183079 DEBUG oslo_concurrency.lockutils [req-f2df6728-3279-44fe-ab78-bcd7788164b2 req-3661355e-fcdd-4609-96a8-9cf0bb3488eb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-c4c0edb4-a206-4617-9465-58c87dcdf7d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:19:26 compute-0 nova_compute[183075]: 2026-01-22 17:19:26.861 183079 DEBUG oslo_concurrency.lockutils [req-f2df6728-3279-44fe-ab78-bcd7788164b2 req-3661355e-fcdd-4609-96a8-9cf0bb3488eb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-c4c0edb4-a206-4617-9465-58c87dcdf7d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:19:26 compute-0 nova_compute[183075]: 2026-01-22 17:19:26.861 183079 DEBUG nova.network.neutron [req-f2df6728-3279-44fe-ab78-bcd7788164b2 req-3661355e-fcdd-4609-96a8-9cf0bb3488eb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Refreshing network info cache for port f801fa72-ebf3-48b4-a510-f75bbe40e687 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:19:26 compute-0 nova_compute[183075]: 2026-01-22 17:19:26.866 183079 DEBUG oslo_concurrency.lockutils [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "refresh_cache-c4c0edb4-a206-4617-9465-58c87dcdf7d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:19:27 compute-0 nova_compute[183075]: 2026-01-22 17:19:27.244 183079 DEBUG nova.network.neutron [req-f2df6728-3279-44fe-ab78-bcd7788164b2 req-3661355e-fcdd-4609-96a8-9cf0bb3488eb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:19:27 compute-0 nova_compute[183075]: 2026-01-22 17:19:27.631 183079 DEBUG nova.network.neutron [req-f2df6728-3279-44fe-ab78-bcd7788164b2 req-3661355e-fcdd-4609-96a8-9cf0bb3488eb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:19:27 compute-0 nova_compute[183075]: 2026-01-22 17:19:27.649 183079 DEBUG oslo_concurrency.lockutils [req-f2df6728-3279-44fe-ab78-bcd7788164b2 req-3661355e-fcdd-4609-96a8-9cf0bb3488eb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-c4c0edb4-a206-4617-9465-58c87dcdf7d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:19:27 compute-0 nova_compute[183075]: 2026-01-22 17:19:27.650 183079 DEBUG oslo_concurrency.lockutils [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquired lock "refresh_cache-c4c0edb4-a206-4617-9465-58c87dcdf7d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:19:27 compute-0 nova_compute[183075]: 2026-01-22 17:19:27.651 183079 DEBUG nova.network.neutron [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:19:27 compute-0 nova_compute[183075]: 2026-01-22 17:19:27.822 183079 DEBUG nova.network.neutron [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:19:28 compute-0 nova_compute[183075]: 2026-01-22 17:19:28.967 183079 DEBUG nova.network.neutron [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Updating instance_info_cache with network_info: [{"id": "f801fa72-ebf3-48b4-a510-f75bbe40e687", "address": "fa:16:3e:1b:b8:fd", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf801fa72-eb", "ovs_interfaceid": "f801fa72-ebf3-48b4-a510-f75bbe40e687", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:19:28 compute-0 nova_compute[183075]: 2026-01-22 17:19:28.999 183079 DEBUG oslo_concurrency.lockutils [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Releasing lock "refresh_cache-c4c0edb4-a206-4617-9465-58c87dcdf7d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:28.999 183079 DEBUG nova.compute.manager [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Instance network_info: |[{"id": "f801fa72-ebf3-48b4-a510-f75bbe40e687", "address": "fa:16:3e:1b:b8:fd", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf801fa72-eb", "ovs_interfaceid": "f801fa72-ebf3-48b4-a510-f75bbe40e687", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.002 183079 DEBUG nova.virt.libvirt.driver [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Start _get_guest_xml network_info=[{"id": "f801fa72-ebf3-48b4-a510-f75bbe40e687", "address": "fa:16:3e:1b:b8:fd", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf801fa72-eb", "ovs_interfaceid": "f801fa72-ebf3-48b4-a510-f75bbe40e687", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.289 183079 WARNING nova.virt.libvirt.driver [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.295 183079 DEBUG nova.virt.libvirt.host [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.295 183079 DEBUG nova.virt.libvirt.host [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.300 183079 DEBUG nova.virt.libvirt.host [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.301 183079 DEBUG nova.virt.libvirt.host [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.302 183079 DEBUG nova.virt.libvirt.driver [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.303 183079 DEBUG nova.virt.hardware [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.304 183079 DEBUG nova.virt.hardware [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.305 183079 DEBUG nova.virt.hardware [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.306 183079 DEBUG nova.virt.hardware [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.306 183079 DEBUG nova.virt.hardware [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.307 183079 DEBUG nova.virt.hardware [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.308 183079 DEBUG nova.virt.hardware [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.309 183079 DEBUG nova.virt.hardware [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.309 183079 DEBUG nova.virt.hardware [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.310 183079 DEBUG nova.virt.hardware [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.311 183079 DEBUG nova.virt.hardware [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.317 183079 DEBUG nova.virt.libvirt.vif [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:19:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1941539204',display_name='tempest-server-test-1941539204',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1941539204',id=33,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBZ1addGiefngOIZky2bZbl/LpQHm8ydIPPcYG47RZ8D1pquJ/FsAcVn7mNe0QyfiQPXvuSs1eSw/jGOy+x3K8tzxt5N8fIelXKcTizhDHr81ws2uYn+dW6F9Hiy6pJp3A==',key_name='tempest-keypair-test-453437320',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='02818155e7af4645bc909d4ba671f11f',ramdisk_id='',reservation_id='r-dlb2hgpr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSameNetwork-953620552',owner_user_name='tempest-FloatingIpSameNetwork-953620552-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:19:24Z,user_data=None,user_id='1148a46489e842e6a0c7660c54567798',uuid=c4c0edb4-a206-4617-9465-58c87dcdf7d5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f801fa72-ebf3-48b4-a510-f75bbe40e687", "address": "fa:16:3e:1b:b8:fd", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf801fa72-eb", "ovs_interfaceid": "f801fa72-ebf3-48b4-a510-f75bbe40e687", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.318 183079 DEBUG nova.network.os_vif_util [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converting VIF {"id": "f801fa72-ebf3-48b4-a510-f75bbe40e687", "address": "fa:16:3e:1b:b8:fd", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf801fa72-eb", "ovs_interfaceid": "f801fa72-ebf3-48b4-a510-f75bbe40e687", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.319 183079 DEBUG nova.network.os_vif_util [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1b:b8:fd,bridge_name='br-int',has_traffic_filtering=True,id=f801fa72-ebf3-48b4-a510-f75bbe40e687,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapf801fa72-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.321 183079 DEBUG nova.objects.instance [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lazy-loading 'pci_devices' on Instance uuid c4c0edb4-a206-4617-9465-58c87dcdf7d5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.338 183079 DEBUG nova.virt.libvirt.driver [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:19:29 compute-0 nova_compute[183075]:   <uuid>c4c0edb4-a206-4617-9465-58c87dcdf7d5</uuid>
Jan 22 17:19:29 compute-0 nova_compute[183075]:   <name>instance-00000021</name>
Jan 22 17:19:29 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:19:29 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:19:29 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:19:29 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1941539204</nova:name>
Jan 22 17:19:29 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:19:29</nova:creationTime>
Jan 22 17:19:29 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:19:29 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:19:29 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:19:29 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:19:29 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:19:29 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:19:29 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:19:29 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:19:29 compute-0 nova_compute[183075]:         <nova:user uuid="1148a46489e842e6a0c7660c54567798">tempest-FloatingIpSameNetwork-953620552-project-member</nova:user>
Jan 22 17:19:29 compute-0 nova_compute[183075]:         <nova:project uuid="02818155e7af4645bc909d4ba671f11f">tempest-FloatingIpSameNetwork-953620552</nova:project>
Jan 22 17:19:29 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:19:29 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:19:29 compute-0 nova_compute[183075]:         <nova:port uuid="f801fa72-ebf3-48b4-a510-f75bbe40e687">
Jan 22 17:19:29 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:19:29 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:19:29 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:19:29 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <system>
Jan 22 17:19:29 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:19:29 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:19:29 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:19:29 compute-0 nova_compute[183075]:       <entry name="serial">c4c0edb4-a206-4617-9465-58c87dcdf7d5</entry>
Jan 22 17:19:29 compute-0 nova_compute[183075]:       <entry name="uuid">c4c0edb4-a206-4617-9465-58c87dcdf7d5</entry>
Jan 22 17:19:29 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     </system>
Jan 22 17:19:29 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:19:29 compute-0 nova_compute[183075]:   <os>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:   </os>
Jan 22 17:19:29 compute-0 nova_compute[183075]:   <features>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:   </features>
Jan 22 17:19:29 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:19:29 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:19:29 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:19:29 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/c4c0edb4-a206-4617-9465-58c87dcdf7d5/disk"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:19:29 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:1b:b8:fd"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:       <target dev="tapf801fa72-eb"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:19:29 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/c4c0edb4-a206-4617-9465-58c87dcdf7d5/console.log" append="off"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <video>
Jan 22 17:19:29 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     </video>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:19:29 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:19:29 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:19:29 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:19:29 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:19:29 compute-0 nova_compute[183075]: </domain>
Jan 22 17:19:29 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.341 183079 DEBUG nova.compute.manager [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Preparing to wait for external event network-vif-plugged-f801fa72-ebf3-48b4-a510-f75bbe40e687 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.341 183079 DEBUG oslo_concurrency.lockutils [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "c4c0edb4-a206-4617-9465-58c87dcdf7d5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.342 183079 DEBUG oslo_concurrency.lockutils [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "c4c0edb4-a206-4617-9465-58c87dcdf7d5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.342 183079 DEBUG oslo_concurrency.lockutils [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "c4c0edb4-a206-4617-9465-58c87dcdf7d5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.343 183079 DEBUG nova.virt.libvirt.vif [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:19:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1941539204',display_name='tempest-server-test-1941539204',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1941539204',id=33,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBZ1addGiefngOIZky2bZbl/LpQHm8ydIPPcYG47RZ8D1pquJ/FsAcVn7mNe0QyfiQPXvuSs1eSw/jGOy+x3K8tzxt5N8fIelXKcTizhDHr81ws2uYn+dW6F9Hiy6pJp3A==',key_name='tempest-keypair-test-453437320',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='02818155e7af4645bc909d4ba671f11f',ramdisk_id='',reservation_id='r-dlb2hgpr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSameNetwork-953620552',owner_user_name='tempest-FloatingIpSameNetwork-953620552-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:19:24Z,user_data=None,user_id='1148a46489e842e6a0c7660c54567798',uuid=c4c0edb4-a206-4617-9465-58c87dcdf7d5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f801fa72-ebf3-48b4-a510-f75bbe40e687", "address": "fa:16:3e:1b:b8:fd", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf801fa72-eb", "ovs_interfaceid": "f801fa72-ebf3-48b4-a510-f75bbe40e687", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.344 183079 DEBUG nova.network.os_vif_util [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converting VIF {"id": "f801fa72-ebf3-48b4-a510-f75bbe40e687", "address": "fa:16:3e:1b:b8:fd", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf801fa72-eb", "ovs_interfaceid": "f801fa72-ebf3-48b4-a510-f75bbe40e687", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.345 183079 DEBUG nova.network.os_vif_util [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1b:b8:fd,bridge_name='br-int',has_traffic_filtering=True,id=f801fa72-ebf3-48b4-a510-f75bbe40e687,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapf801fa72-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.346 183079 DEBUG os_vif [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:b8:fd,bridge_name='br-int',has_traffic_filtering=True,id=f801fa72-ebf3-48b4-a510-f75bbe40e687,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapf801fa72-eb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.347 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.347 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.348 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.351 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.352 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf801fa72-eb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.353 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf801fa72-eb, col_values=(('external_ids', {'iface-id': 'f801fa72-ebf3-48b4-a510-f75bbe40e687', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1b:b8:fd', 'vm-uuid': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.355 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:29 compute-0 NetworkManager[55454]: <info>  [1769102369.3566] manager: (tapf801fa72-eb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/173)
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.358 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.364 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.365 183079 INFO os_vif [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:b8:fd,bridge_name='br-int',has_traffic_filtering=True,id=f801fa72-ebf3-48b4-a510-f75bbe40e687,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapf801fa72-eb')
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.431 183079 DEBUG nova.virt.libvirt.driver [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.432 183079 DEBUG nova.virt.libvirt.driver [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] No VIF found with MAC fa:16:3e:1b:b8:fd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:19:29 compute-0 kernel: tapf801fa72-eb: entered promiscuous mode
Jan 22 17:19:29 compute-0 NetworkManager[55454]: <info>  [1769102369.5158] manager: (tapf801fa72-eb): new Tun device (/org/freedesktop/NetworkManager/Devices/174)
Jan 22 17:19:29 compute-0 ovn_controller[95372]: 2026-01-22T17:19:29Z|00407|binding|INFO|Claiming lport f801fa72-ebf3-48b4-a510-f75bbe40e687 for this chassis.
Jan 22 17:19:29 compute-0 ovn_controller[95372]: 2026-01-22T17:19:29Z|00408|binding|INFO|f801fa72-ebf3-48b4-a510-f75bbe40e687: Claiming fa:16:3e:1b:b8:fd 10.100.0.6
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.515 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.519 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.524 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:29.535 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1b:b8:fd 10.100.0.6'], port_security=['fa:16:3e:1b:b8:fd 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eee918a6-66b2-47ae-b702-620a23ef395b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02818155e7af4645bc909d4ba671f11f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c960fc2f-fbcd-48ff-b6bd-d3b900f07332', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.231'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bad11b1d-0e62-4467-b3f8-4f80f0fd75da, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=f801fa72-ebf3-48b4-a510-f75bbe40e687) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:29.537 104629 INFO neutron.agent.ovn.metadata.agent [-] Port f801fa72-ebf3-48b4-a510-f75bbe40e687 in datapath eee918a6-66b2-47ae-b702-620a23ef395b bound to our chassis
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:29.540 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network eee918a6-66b2-47ae-b702-620a23ef395b
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:29.560 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[94806261-e202-459a-9ec6-274d8bb00eb4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:29.562 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapeee918a6-61 in ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:29.566 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapeee918a6-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:29.566 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2733116f-dba8-48e1-bffc-213d484aa444]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:29.568 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[bc6515e0-7ee4-40e6-9a95-7b7d9557581c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:29 compute-0 systemd-udevd[225982]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:19:29 compute-0 systemd-machined[154382]: New machine qemu-33-instance-00000021.
Jan 22 17:19:29 compute-0 NetworkManager[55454]: <info>  [1769102369.5840] device (tapf801fa72-eb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:29.583 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[bdfead7d-ff39-4b07-b44f-c50d4a306c45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:29 compute-0 NetworkManager[55454]: <info>  [1769102369.5855] device (tapf801fa72-eb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.592 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:29 compute-0 ovn_controller[95372]: 2026-01-22T17:19:29Z|00409|binding|INFO|Setting lport f801fa72-ebf3-48b4-a510-f75bbe40e687 ovn-installed in OVS
Jan 22 17:19:29 compute-0 ovn_controller[95372]: 2026-01-22T17:19:29Z|00410|binding|INFO|Setting lport f801fa72-ebf3-48b4-a510-f75bbe40e687 up in Southbound
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.596 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:29 compute-0 systemd[1]: Started Virtual Machine qemu-33-instance-00000021.
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:29.615 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8632a1aa-a595-47be-b1aa-ca16e33607a8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:29.649 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[9c39638c-b47f-48dc-8b84-56e43ad840a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:29.654 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[bb7fc113-fcb5-4ec4-8174-8e595a59555b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:29 compute-0 NetworkManager[55454]: <info>  [1769102369.6560] manager: (tapeee918a6-60): new Veth device (/org/freedesktop/NetworkManager/Devices/175)
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:29.684 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[efbc5c39-f9f6-445d-a44e-8f7019fde3c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:29.687 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[22612363-7f6b-43eb-b2f2-d41b3fe71814]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:29 compute-0 NetworkManager[55454]: <info>  [1769102369.7065] device (tapeee918a6-60): carrier: link connected
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:29.710 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[be173663-44f8-4afd-aaf0-569fe4572506]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:29.724 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0b2f2e24-031d-4e05-8048-aa69d33296f5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeee918a6-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:e2:7e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 115], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473340, 'reachable_time': 17462, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226014, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:29.736 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c8c9aa98-767a-4887-820d-c2c54c2bdb8a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe68:e27e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 473340, 'tstamp': 473340}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226015, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:29.749 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6e4a0d73-a48a-489c-8c0e-e61bfa318387]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeee918a6-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:e2:7e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 115], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473340, 'reachable_time': 17462, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226016, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:29.772 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[de354e00-de0d-4344-a382-0bfa2bf0eafd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:29.816 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[12aa7c71-eb36-4fe0-9d0c-db5285aff218]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:29.817 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeee918a6-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:29.818 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:29.818 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeee918a6-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.820 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:29 compute-0 NetworkManager[55454]: <info>  [1769102369.8206] manager: (tapeee918a6-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/176)
Jan 22 17:19:29 compute-0 kernel: tapeee918a6-60: entered promiscuous mode
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.823 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:29.824 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapeee918a6-60, col_values=(('external_ids', {'iface-id': '15d4de90-41f4-4532-aebd-197c2a33c6d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.824 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:29 compute-0 ovn_controller[95372]: 2026-01-22T17:19:29Z|00411|binding|INFO|Releasing lport 15d4de90-41f4-4532-aebd-197c2a33c6d6 from this chassis (sb_readonly=0)
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.847 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:29 compute-0 nova_compute[183075]: 2026-01-22 17:19:29.849 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:29.849 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/eee918a6-66b2-47ae-b702-620a23ef395b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/eee918a6-66b2-47ae-b702-620a23ef395b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:29.850 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0e8ae06c-a3c4-47bc-833a-3a8e3f987916]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:29.851 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/eee918a6-66b2-47ae-b702-620a23ef395b.pid.haproxy
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID eee918a6-66b2-47ae-b702-620a23ef395b
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:19:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:29.852 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'env', 'PROCESS_TAG=haproxy-eee918a6-66b2-47ae-b702-620a23ef395b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/eee918a6-66b2-47ae-b702-620a23ef395b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:19:30 compute-0 nova_compute[183075]: 2026-01-22 17:19:30.028 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102370.027501, c4c0edb4-a206-4617-9465-58c87dcdf7d5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:19:30 compute-0 nova_compute[183075]: 2026-01-22 17:19:30.028 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] VM Started (Lifecycle Event)
Jan 22 17:19:30 compute-0 nova_compute[183075]: 2026-01-22 17:19:30.060 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:19:30 compute-0 nova_compute[183075]: 2026-01-22 17:19:30.064 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102370.0300508, c4c0edb4-a206-4617-9465-58c87dcdf7d5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:19:30 compute-0 nova_compute[183075]: 2026-01-22 17:19:30.064 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] VM Paused (Lifecycle Event)
Jan 22 17:19:30 compute-0 nova_compute[183075]: 2026-01-22 17:19:30.087 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:19:30 compute-0 nova_compute[183075]: 2026-01-22 17:19:30.095 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:19:30 compute-0 nova_compute[183075]: 2026-01-22 17:19:30.131 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:19:30 compute-0 podman[226056]: 2026-01-22 17:19:30.278592855 +0000 UTC m=+0.060118716 container create fcdc4fd31b0b112f1c951fd1ef2ec16d34bd8931479c922b5f45611715bede86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true)
Jan 22 17:19:30 compute-0 systemd[1]: Started libpod-conmon-fcdc4fd31b0b112f1c951fd1ef2ec16d34bd8931479c922b5f45611715bede86.scope.
Jan 22 17:19:30 compute-0 podman[226056]: 2026-01-22 17:19:30.246651443 +0000 UTC m=+0.028177304 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:19:30 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:19:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/31de07f8a2dd99678bcac0fef902867481f5fbf35583b4cb74db643f14141e01/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:19:30 compute-0 podman[226056]: 2026-01-22 17:19:30.390884187 +0000 UTC m=+0.172410068 container init fcdc4fd31b0b112f1c951fd1ef2ec16d34bd8931479c922b5f45611715bede86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Jan 22 17:19:30 compute-0 podman[226056]: 2026-01-22 17:19:30.402972356 +0000 UTC m=+0.184498197 container start fcdc4fd31b0b112f1c951fd1ef2ec16d34bd8931479c922b5f45611715bede86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 17:19:30 compute-0 neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b[226071]: [NOTICE]   (226087) : New worker (226094) forked
Jan 22 17:19:30 compute-0 neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b[226071]: [NOTICE]   (226087) : Loading success.
Jan 22 17:19:30 compute-0 podman[226074]: 2026-01-22 17:19:30.457122424 +0000 UTC m=+0.071951389 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 17:19:31 compute-0 nova_compute[183075]: 2026-01-22 17:19:31.437 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:31 compute-0 NetworkManager[55454]: <info>  [1769102371.4387] manager: (patch-br-int-to-provnet-c63dea44-3fe3-44c3-ba47-76ee1ea0ad94): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/177)
Jan 22 17:19:31 compute-0 NetworkManager[55454]: <info>  [1769102371.4394] manager: (patch-provnet-c63dea44-3fe3-44c3-ba47-76ee1ea0ad94-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/178)
Jan 22 17:19:31 compute-0 nova_compute[183075]: 2026-01-22 17:19:31.503 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:31 compute-0 ovn_controller[95372]: 2026-01-22T17:19:31Z|00412|binding|INFO|Releasing lport 15d4de90-41f4-4532-aebd-197c2a33c6d6 from this chassis (sb_readonly=0)
Jan 22 17:19:31 compute-0 nova_compute[183075]: 2026-01-22 17:19:31.523 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:31 compute-0 nova_compute[183075]: 2026-01-22 17:19:31.603 183079 DEBUG nova.compute.manager [req-c9eb1683-65a7-4931-9012-de9e338f0670 req-db324142-c84f-46c6-8d48-7fa035f7a783 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Received event network-vif-plugged-f801fa72-ebf3-48b4-a510-f75bbe40e687 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:19:31 compute-0 nova_compute[183075]: 2026-01-22 17:19:31.604 183079 DEBUG oslo_concurrency.lockutils [req-c9eb1683-65a7-4931-9012-de9e338f0670 req-db324142-c84f-46c6-8d48-7fa035f7a783 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c4c0edb4-a206-4617-9465-58c87dcdf7d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:31 compute-0 nova_compute[183075]: 2026-01-22 17:19:31.604 183079 DEBUG oslo_concurrency.lockutils [req-c9eb1683-65a7-4931-9012-de9e338f0670 req-db324142-c84f-46c6-8d48-7fa035f7a783 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c4c0edb4-a206-4617-9465-58c87dcdf7d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:31 compute-0 nova_compute[183075]: 2026-01-22 17:19:31.605 183079 DEBUG oslo_concurrency.lockutils [req-c9eb1683-65a7-4931-9012-de9e338f0670 req-db324142-c84f-46c6-8d48-7fa035f7a783 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c4c0edb4-a206-4617-9465-58c87dcdf7d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:31 compute-0 nova_compute[183075]: 2026-01-22 17:19:31.605 183079 DEBUG nova.compute.manager [req-c9eb1683-65a7-4931-9012-de9e338f0670 req-db324142-c84f-46c6-8d48-7fa035f7a783 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Processing event network-vif-plugged-f801fa72-ebf3-48b4-a510-f75bbe40e687 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:19:31 compute-0 nova_compute[183075]: 2026-01-22 17:19:31.606 183079 DEBUG nova.compute.manager [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:19:31 compute-0 nova_compute[183075]: 2026-01-22 17:19:31.612 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102371.6117723, c4c0edb4-a206-4617-9465-58c87dcdf7d5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:19:31 compute-0 nova_compute[183075]: 2026-01-22 17:19:31.612 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] VM Resumed (Lifecycle Event)
Jan 22 17:19:31 compute-0 nova_compute[183075]: 2026-01-22 17:19:31.616 183079 DEBUG nova.virt.libvirt.driver [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:19:31 compute-0 nova_compute[183075]: 2026-01-22 17:19:31.620 183079 INFO nova.virt.libvirt.driver [-] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Instance spawned successfully.
Jan 22 17:19:31 compute-0 nova_compute[183075]: 2026-01-22 17:19:31.621 183079 DEBUG nova.virt.libvirt.driver [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:19:31 compute-0 nova_compute[183075]: 2026-01-22 17:19:31.643 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:19:31 compute-0 nova_compute[183075]: 2026-01-22 17:19:31.648 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:19:31 compute-0 nova_compute[183075]: 2026-01-22 17:19:31.659 183079 DEBUG nova.virt.libvirt.driver [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:19:31 compute-0 nova_compute[183075]: 2026-01-22 17:19:31.660 183079 DEBUG nova.virt.libvirt.driver [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:19:31 compute-0 nova_compute[183075]: 2026-01-22 17:19:31.660 183079 DEBUG nova.virt.libvirt.driver [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:19:31 compute-0 nova_compute[183075]: 2026-01-22 17:19:31.661 183079 DEBUG nova.virt.libvirt.driver [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:19:31 compute-0 nova_compute[183075]: 2026-01-22 17:19:31.662 183079 DEBUG nova.virt.libvirt.driver [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:19:31 compute-0 nova_compute[183075]: 2026-01-22 17:19:31.662 183079 DEBUG nova.virt.libvirt.driver [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:19:31 compute-0 nova_compute[183075]: 2026-01-22 17:19:31.669 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:19:31 compute-0 nova_compute[183075]: 2026-01-22 17:19:31.739 183079 INFO nova.compute.manager [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Took 7.40 seconds to spawn the instance on the hypervisor.
Jan 22 17:19:31 compute-0 nova_compute[183075]: 2026-01-22 17:19:31.740 183079 DEBUG nova.compute.manager [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:19:32 compute-0 nova_compute[183075]: 2026-01-22 17:19:32.248 183079 INFO nova.compute.manager [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Took 8.37 seconds to build instance.
Jan 22 17:19:32 compute-0 nova_compute[183075]: 2026-01-22 17:19:32.467 183079 DEBUG oslo_concurrency.lockutils [None req-1a724de5-e7b8-47c8-af07-e257c08af7c0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "c4c0edb4-a206-4617-9465-58c87dcdf7d5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.697s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:32 compute-0 nova_compute[183075]: 2026-01-22 17:19:32.909 183079 DEBUG oslo_concurrency.lockutils [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "01065d98-95b1-4364-b9dc-eaf2c3a6d8f8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:32 compute-0 nova_compute[183075]: 2026-01-22 17:19:32.911 183079 DEBUG oslo_concurrency.lockutils [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "01065d98-95b1-4364-b9dc-eaf2c3a6d8f8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:32 compute-0 nova_compute[183075]: 2026-01-22 17:19:32.917 183079 INFO nova.compute.manager [None req-4836baa7-5f4f-436b-99ae-256895079ebd 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Get console output
Jan 22 17:19:32 compute-0 nova_compute[183075]: 2026-01-22 17:19:32.925 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:19:32 compute-0 nova_compute[183075]: 2026-01-22 17:19:32.929 183079 DEBUG nova.compute.manager [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.003 183079 DEBUG oslo_concurrency.lockutils [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.005 183079 DEBUG oslo_concurrency.lockutils [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.015 183079 DEBUG nova.virt.hardware [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.015 183079 INFO nova.compute.claims [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.154 183079 DEBUG nova.compute.provider_tree [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.170 183079 DEBUG nova.scheduler.client.report [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.191 183079 DEBUG oslo_concurrency.lockutils [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.186s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.192 183079 DEBUG nova.compute.manager [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.250 183079 DEBUG nova.compute.manager [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.251 183079 DEBUG nova.network.neutron [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.274 183079 INFO nova.virt.libvirt.driver [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.305 183079 DEBUG nova.compute.manager [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.438 183079 DEBUG nova.compute.manager [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.440 183079 DEBUG nova.virt.libvirt.driver [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.440 183079 INFO nova.virt.libvirt.driver [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Creating image(s)
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.441 183079 DEBUG oslo_concurrency.lockutils [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "/var/lib/nova/instances/01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.441 183079 DEBUG oslo_concurrency.lockutils [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "/var/lib/nova/instances/01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.442 183079 DEBUG oslo_concurrency.lockutils [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "/var/lib/nova/instances/01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.454 183079 DEBUG oslo_concurrency.processutils [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.510 183079 DEBUG oslo_concurrency.processutils [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.511 183079 DEBUG oslo_concurrency.lockutils [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.512 183079 DEBUG oslo_concurrency.lockutils [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.522 183079 DEBUG oslo_concurrency.processutils [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.577 183079 DEBUG oslo_concurrency.processutils [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.578 183079 DEBUG oslo_concurrency.processutils [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.629 183079 DEBUG oslo_concurrency.processutils [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/disk 1073741824" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.631 183079 DEBUG oslo_concurrency.lockutils [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.632 183079 DEBUG oslo_concurrency.processutils [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.692 183079 DEBUG nova.compute.manager [req-7a86547d-ab64-43ba-b87e-7a27fab23ac4 req-49d8746c-ce60-4c12-822d-f4ecbf3727f1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Received event network-vif-plugged-f801fa72-ebf3-48b4-a510-f75bbe40e687 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.693 183079 DEBUG oslo_concurrency.lockutils [req-7a86547d-ab64-43ba-b87e-7a27fab23ac4 req-49d8746c-ce60-4c12-822d-f4ecbf3727f1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c4c0edb4-a206-4617-9465-58c87dcdf7d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.693 183079 DEBUG oslo_concurrency.lockutils [req-7a86547d-ab64-43ba-b87e-7a27fab23ac4 req-49d8746c-ce60-4c12-822d-f4ecbf3727f1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c4c0edb4-a206-4617-9465-58c87dcdf7d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.694 183079 DEBUG oslo_concurrency.lockutils [req-7a86547d-ab64-43ba-b87e-7a27fab23ac4 req-49d8746c-ce60-4c12-822d-f4ecbf3727f1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c4c0edb4-a206-4617-9465-58c87dcdf7d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.694 183079 DEBUG nova.compute.manager [req-7a86547d-ab64-43ba-b87e-7a27fab23ac4 req-49d8746c-ce60-4c12-822d-f4ecbf3727f1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] No waiting events found dispatching network-vif-plugged-f801fa72-ebf3-48b4-a510-f75bbe40e687 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.694 183079 WARNING nova.compute.manager [req-7a86547d-ab64-43ba-b87e-7a27fab23ac4 req-49d8746c-ce60-4c12-822d-f4ecbf3727f1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Received unexpected event network-vif-plugged-f801fa72-ebf3-48b4-a510-f75bbe40e687 for instance with vm_state active and task_state None.
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.715 183079 DEBUG oslo_concurrency.processutils [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.716 183079 DEBUG nova.virt.disk.api [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Checking if we can resize image /var/lib/nova/instances/01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.717 183079 DEBUG oslo_concurrency.processutils [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.771 183079 DEBUG oslo_concurrency.processutils [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.772 183079 DEBUG nova.virt.disk.api [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Cannot resize image /var/lib/nova/instances/01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.772 183079 DEBUG nova.objects.instance [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lazy-loading 'migration_context' on Instance uuid 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.789 183079 DEBUG nova.virt.libvirt.driver [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.789 183079 DEBUG nova.virt.libvirt.driver [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Ensure instance console log exists: /var/lib/nova/instances/01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.790 183079 DEBUG oslo_concurrency.lockutils [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.790 183079 DEBUG oslo_concurrency.lockutils [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:33 compute-0 nova_compute[183075]: 2026-01-22 17:19:33.790 183079 DEBUG oslo_concurrency.lockutils [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:34 compute-0 nova_compute[183075]: 2026-01-22 17:19:34.356 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:34 compute-0 nova_compute[183075]: 2026-01-22 17:19:34.429 183079 DEBUG nova.policy [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:19:35 compute-0 nova_compute[183075]: 2026-01-22 17:19:35.815 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:19:36 compute-0 nova_compute[183075]: 2026-01-22 17:19:36.388 183079 DEBUG nova.network.neutron [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Successfully updated port: 4452f367-08e1-4434-a6fa-e97f48bf084c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:19:36 compute-0 nova_compute[183075]: 2026-01-22 17:19:36.409 183079 DEBUG oslo_concurrency.lockutils [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "refresh_cache-01065d98-95b1-4364-b9dc-eaf2c3a6d8f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:19:36 compute-0 nova_compute[183075]: 2026-01-22 17:19:36.409 183079 DEBUG oslo_concurrency.lockutils [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquired lock "refresh_cache-01065d98-95b1-4364-b9dc-eaf2c3a6d8f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:19:36 compute-0 nova_compute[183075]: 2026-01-22 17:19:36.410 183079 DEBUG nova.network.neutron [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:19:36 compute-0 nova_compute[183075]: 2026-01-22 17:19:36.508 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:36 compute-0 nova_compute[183075]: 2026-01-22 17:19:36.522 183079 DEBUG nova.compute.manager [req-c1efe537-ecbf-4e4a-b092-989c219e8c95 req-98932300-7104-4eb8-9c4d-6d6b4ef77024 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Received event network-changed-4452f367-08e1-4434-a6fa-e97f48bf084c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:19:36 compute-0 nova_compute[183075]: 2026-01-22 17:19:36.522 183079 DEBUG nova.compute.manager [req-c1efe537-ecbf-4e4a-b092-989c219e8c95 req-98932300-7104-4eb8-9c4d-6d6b4ef77024 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Refreshing instance network info cache due to event network-changed-4452f367-08e1-4434-a6fa-e97f48bf084c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:19:36 compute-0 nova_compute[183075]: 2026-01-22 17:19:36.523 183079 DEBUG oslo_concurrency.lockutils [req-c1efe537-ecbf-4e4a-b092-989c219e8c95 req-98932300-7104-4eb8-9c4d-6d6b4ef77024 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-01065d98-95b1-4364-b9dc-eaf2c3a6d8f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:19:37 compute-0 nova_compute[183075]: 2026-01-22 17:19:37.253 183079 DEBUG nova.network.neutron [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.097 183079 INFO nova.compute.manager [None req-d79bd869-4d41-4330-8647-b6a2e4f649de 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Get console output
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.108 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.629 183079 DEBUG nova.network.neutron [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Updating instance_info_cache with network_info: [{"id": "4452f367-08e1-4434-a6fa-e97f48bf084c", "address": "fa:16:3e:19:a5:65", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4452f367-08", "ovs_interfaceid": "4452f367-08e1-4434-a6fa-e97f48bf084c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.654 183079 DEBUG oslo_concurrency.lockutils [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Releasing lock "refresh_cache-01065d98-95b1-4364-b9dc-eaf2c3a6d8f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.654 183079 DEBUG nova.compute.manager [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Instance network_info: |[{"id": "4452f367-08e1-4434-a6fa-e97f48bf084c", "address": "fa:16:3e:19:a5:65", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4452f367-08", "ovs_interfaceid": "4452f367-08e1-4434-a6fa-e97f48bf084c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.655 183079 DEBUG oslo_concurrency.lockutils [req-c1efe537-ecbf-4e4a-b092-989c219e8c95 req-98932300-7104-4eb8-9c4d-6d6b4ef77024 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-01065d98-95b1-4364-b9dc-eaf2c3a6d8f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.655 183079 DEBUG nova.network.neutron [req-c1efe537-ecbf-4e4a-b092-989c219e8c95 req-98932300-7104-4eb8-9c4d-6d6b4ef77024 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Refreshing network info cache for port 4452f367-08e1-4434-a6fa-e97f48bf084c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.660 183079 DEBUG nova.virt.libvirt.driver [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Start _get_guest_xml network_info=[{"id": "4452f367-08e1-4434-a6fa-e97f48bf084c", "address": "fa:16:3e:19:a5:65", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4452f367-08", "ovs_interfaceid": "4452f367-08e1-4434-a6fa-e97f48bf084c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.665 183079 WARNING nova.virt.libvirt.driver [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.672 183079 DEBUG nova.virt.libvirt.host [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.673 183079 DEBUG nova.virt.libvirt.host [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.680 183079 DEBUG nova.virt.libvirt.host [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.681 183079 DEBUG nova.virt.libvirt.host [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.682 183079 DEBUG nova.virt.libvirt.driver [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.682 183079 DEBUG nova.virt.hardware [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.683 183079 DEBUG nova.virt.hardware [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.683 183079 DEBUG nova.virt.hardware [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.684 183079 DEBUG nova.virt.hardware [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.684 183079 DEBUG nova.virt.hardware [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.684 183079 DEBUG nova.virt.hardware [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.685 183079 DEBUG nova.virt.hardware [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.685 183079 DEBUG nova.virt.hardware [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.686 183079 DEBUG nova.virt.hardware [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.686 183079 DEBUG nova.virt.hardware [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.687 183079 DEBUG nova.virt.hardware [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.692 183079 DEBUG nova.virt.libvirt.vif [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:19:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-1-871110647',display_name='tempest-server-1-871110647',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-1-871110647',id=34,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMynqvoXBI34qH0ZDQNq01ZPmk41Xdi4JxkvNO4nJ7GrdggfhpXkKQhhUZRV3fv3bSwcXMqi04ipV9xe7IsTm4GBRV+U9o6VN5JSMLulp+KIvjPuEmjq0h65ra09/zQB9w==',key_name='tempest-keypair-test-1762910261',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e4c0bb18013747dfad2e25b2495090eb',ramdisk_id='',reservation_id='r-343r0wfy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortForwardingTestJSON-1240706675',owner_user_name='tempest-PortForwardingTestJSON-1240706675-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:19:33Z,user_data=None,user_id='852aea4e08344f39ae07e6b57393c767',uuid=01065d98-95b1-4364-b9dc-eaf2c3a6d8f8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4452f367-08e1-4434-a6fa-e97f48bf084c", "address": "fa:16:3e:19:a5:65", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4452f367-08", "ovs_interfaceid": "4452f367-08e1-4434-a6fa-e97f48bf084c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.693 183079 DEBUG nova.network.os_vif_util [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Converting VIF {"id": "4452f367-08e1-4434-a6fa-e97f48bf084c", "address": "fa:16:3e:19:a5:65", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4452f367-08", "ovs_interfaceid": "4452f367-08e1-4434-a6fa-e97f48bf084c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.694 183079 DEBUG nova.network.os_vif_util [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:a5:65,bridge_name='br-int',has_traffic_filtering=True,id=4452f367-08e1-4434-a6fa-e97f48bf084c,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4452f367-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.695 183079 DEBUG nova.objects.instance [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lazy-loading 'pci_devices' on Instance uuid 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.712 183079 DEBUG nova.virt.libvirt.driver [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:19:38 compute-0 nova_compute[183075]:   <uuid>01065d98-95b1-4364-b9dc-eaf2c3a6d8f8</uuid>
Jan 22 17:19:38 compute-0 nova_compute[183075]:   <name>instance-00000022</name>
Jan 22 17:19:38 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:19:38 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:19:38 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:19:38 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:       <nova:name>tempest-server-1-871110647</nova:name>
Jan 22 17:19:38 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:19:38</nova:creationTime>
Jan 22 17:19:38 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:19:38 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:19:38 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:19:38 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:19:38 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:19:38 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:19:38 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:19:38 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:19:38 compute-0 nova_compute[183075]:         <nova:user uuid="852aea4e08344f39ae07e6b57393c767">tempest-PortForwardingTestJSON-1240706675-project-member</nova:user>
Jan 22 17:19:38 compute-0 nova_compute[183075]:         <nova:project uuid="e4c0bb18013747dfad2e25b2495090eb">tempest-PortForwardingTestJSON-1240706675</nova:project>
Jan 22 17:19:38 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:19:38 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:19:38 compute-0 nova_compute[183075]:         <nova:port uuid="4452f367-08e1-4434-a6fa-e97f48bf084c">
Jan 22 17:19:38 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:19:38 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:19:38 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:19:38 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <system>
Jan 22 17:19:38 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:19:38 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:19:38 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:19:38 compute-0 nova_compute[183075]:       <entry name="serial">01065d98-95b1-4364-b9dc-eaf2c3a6d8f8</entry>
Jan 22 17:19:38 compute-0 nova_compute[183075]:       <entry name="uuid">01065d98-95b1-4364-b9dc-eaf2c3a6d8f8</entry>
Jan 22 17:19:38 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     </system>
Jan 22 17:19:38 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:19:38 compute-0 nova_compute[183075]:   <os>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:   </os>
Jan 22 17:19:38 compute-0 nova_compute[183075]:   <features>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:   </features>
Jan 22 17:19:38 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:19:38 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:19:38 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:19:38 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/disk"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:19:38 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:19:a5:65"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:       <target dev="tap4452f367-08"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:19:38 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/console.log" append="off"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <video>
Jan 22 17:19:38 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     </video>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:19:38 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:19:38 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:19:38 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:19:38 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:19:38 compute-0 nova_compute[183075]: </domain>
Jan 22 17:19:38 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.714 183079 DEBUG nova.compute.manager [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Preparing to wait for external event network-vif-plugged-4452f367-08e1-4434-a6fa-e97f48bf084c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.715 183079 DEBUG oslo_concurrency.lockutils [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.715 183079 DEBUG oslo_concurrency.lockutils [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.716 183079 DEBUG oslo_concurrency.lockutils [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.717 183079 DEBUG nova.virt.libvirt.vif [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:19:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-1-871110647',display_name='tempest-server-1-871110647',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-1-871110647',id=34,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMynqvoXBI34qH0ZDQNq01ZPmk41Xdi4JxkvNO4nJ7GrdggfhpXkKQhhUZRV3fv3bSwcXMqi04ipV9xe7IsTm4GBRV+U9o6VN5JSMLulp+KIvjPuEmjq0h65ra09/zQB9w==',key_name='tempest-keypair-test-1762910261',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e4c0bb18013747dfad2e25b2495090eb',ramdisk_id='',reservation_id='r-343r0wfy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortForwardingTestJSON-1240706675',owner_user_name='tempest-PortForwardingTestJSON-1240706675-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:19:33Z,user_data=None,user_id='852aea4e08344f39ae07e6b57393c767',uuid=01065d98-95b1-4364-b9dc-eaf2c3a6d8f8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4452f367-08e1-4434-a6fa-e97f48bf084c", "address": "fa:16:3e:19:a5:65", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4452f367-08", "ovs_interfaceid": "4452f367-08e1-4434-a6fa-e97f48bf084c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.717 183079 DEBUG nova.network.os_vif_util [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Converting VIF {"id": "4452f367-08e1-4434-a6fa-e97f48bf084c", "address": "fa:16:3e:19:a5:65", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4452f367-08", "ovs_interfaceid": "4452f367-08e1-4434-a6fa-e97f48bf084c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.718 183079 DEBUG nova.network.os_vif_util [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:a5:65,bridge_name='br-int',has_traffic_filtering=True,id=4452f367-08e1-4434-a6fa-e97f48bf084c,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4452f367-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.719 183079 DEBUG os_vif [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:a5:65,bridge_name='br-int',has_traffic_filtering=True,id=4452f367-08e1-4434-a6fa-e97f48bf084c,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4452f367-08') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.720 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.720 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.721 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.728 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.729 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4452f367-08, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.730 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4452f367-08, col_values=(('external_ids', {'iface-id': '4452f367-08e1-4434-a6fa-e97f48bf084c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:19:a5:65', 'vm-uuid': '01065d98-95b1-4364-b9dc-eaf2c3a6d8f8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.732 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:38 compute-0 NetworkManager[55454]: <info>  [1769102378.7337] manager: (tap4452f367-08): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/179)
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.736 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.744 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.745 183079 INFO os_vif [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:a5:65,bridge_name='br-int',has_traffic_filtering=True,id=4452f367-08e1-4434-a6fa-e97f48bf084c,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4452f367-08')
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.843 183079 DEBUG nova.virt.libvirt.driver [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.844 183079 DEBUG nova.virt.libvirt.driver [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] No VIF found with MAC fa:16:3e:19:a5:65, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:19:38 compute-0 kernel: tap4452f367-08: entered promiscuous mode
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.951 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:38 compute-0 NetworkManager[55454]: <info>  [1769102378.9538] manager: (tap4452f367-08): new Tun device (/org/freedesktop/NetworkManager/Devices/180)
Jan 22 17:19:38 compute-0 ovn_controller[95372]: 2026-01-22T17:19:38Z|00413|binding|INFO|Claiming lport 4452f367-08e1-4434-a6fa-e97f48bf084c for this chassis.
Jan 22 17:19:38 compute-0 ovn_controller[95372]: 2026-01-22T17:19:38Z|00414|binding|INFO|4452f367-08e1-4434-a6fa-e97f48bf084c: Claiming fa:16:3e:19:a5:65 10.100.0.10
Jan 22 17:19:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:38.970 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:a5:65 10.100.0.10'], port_security=['fa:16:3e:19:a5:65 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '01065d98-95b1-4364-b9dc-eaf2c3a6d8f8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'neutron:revision_number': '2', 'neutron:security_group_ids': '94e3530f-8012-4817-a338-7919b109ef3e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12343ce0-7cef-4f7f-9439-6550d878d4ba, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=4452f367-08e1-4434-a6fa-e97f48bf084c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:19:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:38.972 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 4452f367-08e1-4434-a6fa-e97f48bf084c in datapath 44326f3c-1431-44d6-85ce-61ecbbb5ed7a bound to our chassis
Jan 22 17:19:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:38.976 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 44326f3c-1431-44d6-85ce-61ecbbb5ed7a
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.982 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:38 compute-0 ovn_controller[95372]: 2026-01-22T17:19:38Z|00415|binding|INFO|Setting lport 4452f367-08e1-4434-a6fa-e97f48bf084c ovn-installed in OVS
Jan 22 17:19:38 compute-0 ovn_controller[95372]: 2026-01-22T17:19:38Z|00416|binding|INFO|Setting lport 4452f367-08e1-4434-a6fa-e97f48bf084c up in Southbound
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.990 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:38 compute-0 nova_compute[183075]: 2026-01-22 17:19:38.994 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:38.996 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[9490244f-3d2e-4aa9-bb16-4c1ca1a0d49a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:38.999 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap44326f3c-11 in ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:39.004 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap44326f3c-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:39.004 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4bd8e089-4ba2-4fd5-bdbd-3df58e504e56]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:39.007 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1181c522-c418-4448-b74d-154d53a5da72]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:39 compute-0 systemd-udevd[226142]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:19:39 compute-0 NetworkManager[55454]: <info>  [1769102379.0216] device (tap4452f367-08): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:19:39 compute-0 NetworkManager[55454]: <info>  [1769102379.0226] device (tap4452f367-08): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:39.029 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[1c68a814-f70b-4120-a116-b70d00d335ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:39 compute-0 systemd-machined[154382]: New machine qemu-34-instance-00000022.
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:39.058 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1843af22-9044-4695-8967-fc4868e49437]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:39 compute-0 systemd[1]: Started Virtual Machine qemu-34-instance-00000022.
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:39.096 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[26c2aa4a-0492-429c-a7c7-889821c7b9b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:39 compute-0 NetworkManager[55454]: <info>  [1769102379.1057] manager: (tap44326f3c-10): new Veth device (/org/freedesktop/NetworkManager/Devices/181)
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:39.105 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ce633728-af41-430d-9b4a-2ab2ad017e0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:39 compute-0 systemd-udevd[226147]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:39.154 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[f698f96e-56c5-4331-a635-24b8e1c5905c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:39.159 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[b109ba76-5aeb-44b0-a7b9-c71f4248cfc4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:39 compute-0 NetworkManager[55454]: <info>  [1769102379.1995] device (tap44326f3c-10): carrier: link connected
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:39.209 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[618da15a-6081-4cd1-9b5e-12fe4807ab09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:39.234 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[30da2da4-add4-42ee-8e49-83b56d23f802]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap44326f3c-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:1b:89'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 117], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474290, 'reachable_time': 26132, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226177, 'error': None, 'target': 'ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:39.262 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b4617310-f87e-414b-bfce-e285140df247]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe37:1b89'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 474290, 'tstamp': 474290}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226178, 'error': None, 'target': 'ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:39.287 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[99289d25-2c99-423a-8262-94895ed7ca34]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap44326f3c-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:1b:89'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 117], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474290, 'reachable_time': 26132, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226179, 'error': None, 'target': 'ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:39.341 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[df99aec6-735c-4013-9896-9cae88254318]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.426 183079 DEBUG nova.compute.manager [req-56d9f1bd-d4b0-4485-b5ef-24b258c35c24 req-5c1ad78f-dd0e-44ea-80a6-6b7103a77820 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Received event network-vif-plugged-4452f367-08e1-4434-a6fa-e97f48bf084c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.426 183079 DEBUG oslo_concurrency.lockutils [req-56d9f1bd-d4b0-4485-b5ef-24b258c35c24 req-5c1ad78f-dd0e-44ea-80a6-6b7103a77820 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.427 183079 DEBUG oslo_concurrency.lockutils [req-56d9f1bd-d4b0-4485-b5ef-24b258c35c24 req-5c1ad78f-dd0e-44ea-80a6-6b7103a77820 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.427 183079 DEBUG oslo_concurrency.lockutils [req-56d9f1bd-d4b0-4485-b5ef-24b258c35c24 req-5c1ad78f-dd0e-44ea-80a6-6b7103a77820 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.427 183079 DEBUG nova.compute.manager [req-56d9f1bd-d4b0-4485-b5ef-24b258c35c24 req-5c1ad78f-dd0e-44ea-80a6-6b7103a77820 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Processing event network-vif-plugged-4452f367-08e1-4434-a6fa-e97f48bf084c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:39.432 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[9ffea8a5-dd5b-4053-a7bf-5544d7d5c032]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:39.435 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44326f3c-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:39.435 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:39.436 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap44326f3c-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:19:39 compute-0 NetworkManager[55454]: <info>  [1769102379.4748] manager: (tap44326f3c-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/182)
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.474 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:39 compute-0 kernel: tap44326f3c-10: entered promiscuous mode
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.480 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:39.483 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap44326f3c-10, col_values=(('external_ids', {'iface-id': '118957e0-7da0-4d87-b7d4-2c204e19e5b6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:19:39 compute-0 ovn_controller[95372]: 2026-01-22T17:19:39Z|00417|binding|INFO|Releasing lport 118957e0-7da0-4d87-b7d4-2c204e19e5b6 from this chassis (sb_readonly=0)
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.485 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:39.487 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/44326f3c-1431-44d6-85ce-61ecbbb5ed7a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/44326f3c-1431-44d6-85ce-61ecbbb5ed7a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:39.488 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[cedbe560-daff-447d-b4cd-93505e8ac3c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:39.489 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/44326f3c-1431-44d6-85ce-61ecbbb5ed7a.pid.haproxy
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 44326f3c-1431-44d6-85ce-61ecbbb5ed7a
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:19:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:39.491 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'env', 'PROCESS_TAG=haproxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/44326f3c-1431-44d6-85ce-61ecbbb5ed7a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.499 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.587 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102379.5864272, 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.587 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] VM Started (Lifecycle Event)
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.591 183079 DEBUG nova.compute.manager [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.597 183079 DEBUG nova.virt.libvirt.driver [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.605 183079 INFO nova.virt.libvirt.driver [-] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Instance spawned successfully.
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.606 183079 DEBUG nova.virt.libvirt.driver [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.619 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.625 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.640 183079 DEBUG nova.virt.libvirt.driver [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.641 183079 DEBUG nova.virt.libvirt.driver [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.642 183079 DEBUG nova.virt.libvirt.driver [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.642 183079 DEBUG nova.virt.libvirt.driver [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.643 183079 DEBUG nova.virt.libvirt.driver [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.644 183079 DEBUG nova.virt.libvirt.driver [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.649 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.649 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102379.58656, 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.650 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] VM Paused (Lifecycle Event)
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.681 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.687 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102379.5952306, 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.688 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] VM Resumed (Lifecycle Event)
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.714 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.719 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.740 183079 INFO nova.compute.manager [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Took 6.30 seconds to spawn the instance on the hypervisor.
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.741 183079 DEBUG nova.compute.manager [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.749 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.818 183079 INFO nova.compute.manager [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Took 6.85 seconds to build instance.
Jan 22 17:19:39 compute-0 nova_compute[183075]: 2026-01-22 17:19:39.841 183079 DEBUG oslo_concurrency.lockutils [None req-68377633-ac32-4fb4-b770-01d61082fb74 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "01065d98-95b1-4364-b9dc-eaf2c3a6d8f8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.930s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:39 compute-0 podman[226215]: 2026-01-22 17:19:39.951594461 +0000 UTC m=+0.088741822 container create 78748a997f6fcf5d1ed1de9e255753aa5a5c7b5540551db9f55005e20505c8a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 22 17:19:39 compute-0 podman[226215]: 2026-01-22 17:19:39.899778824 +0000 UTC m=+0.036926245 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:19:39 compute-0 systemd[1]: Started libpod-conmon-78748a997f6fcf5d1ed1de9e255753aa5a5c7b5540551db9f55005e20505c8a2.scope.
Jan 22 17:19:40 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:19:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fee4a742c266e2f827121656d7e379fe1d600c08915bfe27ca923c1e944c78af/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:19:40 compute-0 podman[226215]: 2026-01-22 17:19:40.066980404 +0000 UTC m=+0.204127745 container init 78748a997f6fcf5d1ed1de9e255753aa5a5c7b5540551db9f55005e20505c8a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:19:40 compute-0 podman[226215]: 2026-01-22 17:19:40.078111898 +0000 UTC m=+0.215259229 container start 78748a997f6fcf5d1ed1de9e255753aa5a5c7b5540551db9f55005e20505c8a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:19:40 compute-0 neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[226230]: [NOTICE]   (226234) : New worker (226236) forked
Jan 22 17:19:40 compute-0 neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[226230]: [NOTICE]   (226234) : Loading success.
Jan 22 17:19:41 compute-0 nova_compute[183075]: 2026-01-22 17:19:41.286 183079 DEBUG nova.network.neutron [req-c1efe537-ecbf-4e4a-b092-989c219e8c95 req-98932300-7104-4eb8-9c4d-6d6b4ef77024 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Updated VIF entry in instance network info cache for port 4452f367-08e1-4434-a6fa-e97f48bf084c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:19:41 compute-0 nova_compute[183075]: 2026-01-22 17:19:41.295 183079 DEBUG nova.network.neutron [req-c1efe537-ecbf-4e4a-b092-989c219e8c95 req-98932300-7104-4eb8-9c4d-6d6b4ef77024 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Updating instance_info_cache with network_info: [{"id": "4452f367-08e1-4434-a6fa-e97f48bf084c", "address": "fa:16:3e:19:a5:65", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4452f367-08", "ovs_interfaceid": "4452f367-08e1-4434-a6fa-e97f48bf084c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:19:41 compute-0 nova_compute[183075]: 2026-01-22 17:19:41.322 183079 DEBUG oslo_concurrency.lockutils [req-c1efe537-ecbf-4e4a-b092-989c219e8c95 req-98932300-7104-4eb8-9c4d-6d6b4ef77024 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-01065d98-95b1-4364-b9dc-eaf2c3a6d8f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:19:41 compute-0 podman[226246]: 2026-01-22 17:19:41.411498766 +0000 UTC m=+0.086815461 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 17:19:41 compute-0 podman[226247]: 2026-01-22 17:19:41.415773399 +0000 UTC m=+0.088241428 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, container_name=openstack_network_exporter)
Jan 22 17:19:41 compute-0 podman[226245]: 2026-01-22 17:19:41.461726351 +0000 UTC m=+0.150437229 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, 
org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 17:19:41 compute-0 nova_compute[183075]: 2026-01-22 17:19:41.510 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:41 compute-0 nova_compute[183075]: 2026-01-22 17:19:41.528 183079 DEBUG nova.compute.manager [req-30ad3df1-42c1-49d2-b0bb-036f16106ef8 req-4d5ba9fa-aac8-4686-b0b3-6516f004cf6f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Received event network-vif-plugged-4452f367-08e1-4434-a6fa-e97f48bf084c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:19:41 compute-0 nova_compute[183075]: 2026-01-22 17:19:41.529 183079 DEBUG oslo_concurrency.lockutils [req-30ad3df1-42c1-49d2-b0bb-036f16106ef8 req-4d5ba9fa-aac8-4686-b0b3-6516f004cf6f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:41 compute-0 nova_compute[183075]: 2026-01-22 17:19:41.529 183079 DEBUG oslo_concurrency.lockutils [req-30ad3df1-42c1-49d2-b0bb-036f16106ef8 req-4d5ba9fa-aac8-4686-b0b3-6516f004cf6f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:41 compute-0 nova_compute[183075]: 2026-01-22 17:19:41.530 183079 DEBUG oslo_concurrency.lockutils [req-30ad3df1-42c1-49d2-b0bb-036f16106ef8 req-4d5ba9fa-aac8-4686-b0b3-6516f004cf6f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:41 compute-0 nova_compute[183075]: 2026-01-22 17:19:41.531 183079 DEBUG nova.compute.manager [req-30ad3df1-42c1-49d2-b0bb-036f16106ef8 req-4d5ba9fa-aac8-4686-b0b3-6516f004cf6f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] No waiting events found dispatching network-vif-plugged-4452f367-08e1-4434-a6fa-e97f48bf084c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:19:41 compute-0 nova_compute[183075]: 2026-01-22 17:19:41.531 183079 WARNING nova.compute.manager [req-30ad3df1-42c1-49d2-b0bb-036f16106ef8 req-4d5ba9fa-aac8-4686-b0b3-6516f004cf6f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Received unexpected event network-vif-plugged-4452f367-08e1-4434-a6fa-e97f48bf084c for instance with vm_state active and task_state None.
Jan 22 17:19:41 compute-0 nova_compute[183075]: 2026-01-22 17:19:41.553 183079 INFO nova.compute.manager [None req-ff75e0b0-1c19-4977-830b-43188b2a76d4 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Get console output
Jan 22 17:19:41 compute-0 nova_compute[183075]: 2026-01-22 17:19:41.564 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:19:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:41.935 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:41.938 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:41.942 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:43 compute-0 nova_compute[183075]: 2026-01-22 17:19:43.239 183079 INFO nova.compute.manager [None req-3524fe35-8fc2-410d-ba1a-9d3dc164e396 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Get console output
Jan 22 17:19:43 compute-0 nova_compute[183075]: 2026-01-22 17:19:43.247 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:19:43 compute-0 nova_compute[183075]: 2026-01-22 17:19:43.735 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:44 compute-0 ovn_controller[95372]: 2026-01-22T17:19:44Z|00068|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1b:b8:fd 10.100.0.6
Jan 22 17:19:44 compute-0 ovn_controller[95372]: 2026-01-22T17:19:44Z|00069|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1b:b8:fd 10.100.0.6
Jan 22 17:19:46 compute-0 podman[226320]: 2026-01-22 17:19:46.419809211 +0000 UTC m=+0.112790416 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 17:19:46 compute-0 nova_compute[183075]: 2026-01-22 17:19:46.512 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:46 compute-0 nova_compute[183075]: 2026-01-22 17:19:46.729 183079 INFO nova.compute.manager [None req-1f27a2a9-9dce-4297-b026-fcf348d9d14f 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Get console output
Jan 22 17:19:46 compute-0 nova_compute[183075]: 2026-01-22 17:19:46.736 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:19:48 compute-0 nova_compute[183075]: 2026-01-22 17:19:48.640 183079 INFO nova.compute.manager [None req-013265e2-1655-41d1-87cd-b51e674f1242 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Get console output
Jan 22 17:19:48 compute-0 nova_compute[183075]: 2026-01-22 17:19:48.651 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:19:48 compute-0 nova_compute[183075]: 2026-01-22 17:19:48.786 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:48 compute-0 nova_compute[183075]: 2026-01-22 17:19:48.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:19:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:49.030 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:19:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:49.032 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:19:49 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:19:49 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:19:49 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:19:49 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:19:49 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:19:49 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:19:49 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.271 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.271 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 2.2400663
Jan 22 17:19:51 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[226094]: 10.100.0.6:36230 [22/Jan/2026:17:19:49.029] listener listener/metadata 0/0/0/2242/2242 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.288 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.289 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.318 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.319 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 168 time: 0.0296926
Jan 22 17:19:51 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[226094]: 10.100.0.6:36236 [22/Jan/2026:17:19:51.287] listener listener/metadata 0/0/0/31/31 200 152 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.329 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.330 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.353 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:19:51 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[226094]: 10.100.0.6:36238 [22/Jan/2026:17:19:51.328] listener listener/metadata 0/0/0/25/25 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.354 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0243263
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.364 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.365 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.384 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.385 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0205336
Jan 22 17:19:51 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[226094]: 10.100.0.6:36246 [22/Jan/2026:17:19:51.362] listener listener/metadata 0/0/0/22/22 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.390 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.391 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.413 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.413 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0221221
Jan 22 17:19:51 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[226094]: 10.100.0.6:36260 [22/Jan/2026:17:19:51.390] listener listener/metadata 0/0/0/23/23 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.419 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.420 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.442 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.442 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0227695
Jan 22 17:19:51 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[226094]: 10.100.0.6:36270 [22/Jan/2026:17:19:51.418] listener listener/metadata 0/0/0/24/24 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.448 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.449 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.469 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.469 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0206115
Jan 22 17:19:51 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[226094]: 10.100.0.6:36280 [22/Jan/2026:17:19:51.447] listener listener/metadata 0/0/0/22/22 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.476 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.477 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.503 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:19:51 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[226094]: 10.100.0.6:36292 [22/Jan/2026:17:19:51.475] listener listener/metadata 0/0/0/29/29 200 135 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.505 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 151 time: 0.0277040
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.512 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.512 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:19:51 compute-0 nova_compute[183075]: 2026-01-22 17:19:51.515 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.538 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:19:51 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[226094]: 10.100.0.6:36296 [22/Jan/2026:17:19:51.511] listener listener/metadata 0/0/0/27/27 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.539 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0264692
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.545 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.546 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.564 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:19:51 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[226094]: 10.100.0.6:36304 [22/Jan/2026:17:19:51.544] listener listener/metadata 0/0/0/20/20 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.565 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0194764
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.570 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.571 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:19:51 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[226094]: 10.100.0.6:36314 [22/Jan/2026:17:19:51.569] listener listener/metadata 0/0/0/24/24 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.594 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0230780
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.604 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.605 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.628 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:19:51 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[226094]: 10.100.0.6:36320 [22/Jan/2026:17:19:51.603] listener listener/metadata 0/0/0/25/25 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.629 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0239019
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.644 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.649 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.675 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:19:51 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[226094]: 10.100.0.6:36336 [22/Jan/2026:17:19:51.636] listener listener/metadata 0/0/0/39/39 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.675 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0269339
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.681 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.682 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.706 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:19:51 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[226094]: 10.100.0.6:36352 [22/Jan/2026:17:19:51.681] listener listener/metadata 0/0/0/25/25 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.706 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0241671
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.717 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.718 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.745 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.746 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0280464
Jan 22 17:19:51 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[226094]: 10.100.0.6:36356 [22/Jan/2026:17:19:51.716] listener listener/metadata 0/0/0/29/29 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.756 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.757 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.956 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:19:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:51.957 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.1998041
Jan 22 17:19:51 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[226094]: 10.100.0.6:36358 [22/Jan/2026:17:19:51.756] listener listener/metadata 0/0/0/201/201 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:19:52 compute-0 nova_compute[183075]: 2026-01-22 17:19:52.039 183079 INFO nova.compute.manager [None req-a1d8b39d-b97d-4dc6-ac7b-cdc9f1e29e80 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Get console output
Jan 22 17:19:52 compute-0 nova_compute[183075]: 2026-01-22 17:19:52.049 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:19:52 compute-0 ovn_controller[95372]: 2026-01-22T17:19:52Z|00070|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:19:a5:65 10.100.0.10
Jan 22 17:19:52 compute-0 ovn_controller[95372]: 2026-01-22T17:19:52Z|00071|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:19:a5:65 10.100.0.10
Jan 22 17:19:53 compute-0 podman[226358]: 2026-01-22 17:19:53.398581546 +0000 UTC m=+0.089963754 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:19:53 compute-0 nova_compute[183075]: 2026-01-22 17:19:53.791 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:53 compute-0 nova_compute[183075]: 2026-01-22 17:19:53.816 183079 INFO nova.compute.manager [None req-a9bf2290-2179-4189-a887-cb69e6fac709 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Get console output
Jan 22 17:19:53 compute-0 nova_compute[183075]: 2026-01-22 17:19:53.822 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:19:54 compute-0 nova_compute[183075]: 2026-01-22 17:19:54.762 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:19:54 compute-0 nova_compute[183075]: 2026-01-22 17:19:54.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Triggering sync for uuid c4c0edb4-a206-4617-9465-58c87dcdf7d5 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 22 17:19:54 compute-0 nova_compute[183075]: 2026-01-22 17:19:54.789 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Triggering sync for uuid 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 22 17:19:54 compute-0 nova_compute[183075]: 2026-01-22 17:19:54.789 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "c4c0edb4-a206-4617-9465-58c87dcdf7d5" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:54 compute-0 nova_compute[183075]: 2026-01-22 17:19:54.790 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "c4c0edb4-a206-4617-9465-58c87dcdf7d5" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:54 compute-0 nova_compute[183075]: 2026-01-22 17:19:54.790 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "01065d98-95b1-4364-b9dc-eaf2c3a6d8f8" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:54 compute-0 nova_compute[183075]: 2026-01-22 17:19:54.791 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "01065d98-95b1-4364-b9dc-eaf2c3a6d8f8" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:54 compute-0 nova_compute[183075]: 2026-01-22 17:19:54.830 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "c4c0edb4-a206-4617-9465-58c87dcdf7d5" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.040s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:54 compute-0 nova_compute[183075]: 2026-01-22 17:19:54.831 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "01065d98-95b1-4364-b9dc-eaf2c3a6d8f8" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.041s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:54.928 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:19:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:54.929 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:19:54 compute-0 nova_compute[183075]: 2026-01-22 17:19:54.958 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.459 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5', 'name': 'tempest-server-test-1941539204', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000021', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '02818155e7af4645bc909d4ba671f11f', 'user_id': '1148a46489e842e6a0c7660c54567798', 'hostId': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.463 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '01065d98-95b1-4364-b9dc-eaf2c3a6d8f8', 'name': 'tempest-server-1-871110647', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000022', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e4c0bb18013747dfad2e25b2495090eb', 'user_id': '852aea4e08344f39ae07e6b57393c767', 'hostId': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.464 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.468 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for c4c0edb4-a206-4617-9465-58c87dcdf7d5 / tapf801fa72-eb inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.468 12 DEBUG ceilometer.compute.pollsters [-] c4c0edb4-a206-4617-9465-58c87dcdf7d5/network.incoming.bytes volume: 7392 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.472 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8 / tap4452f367-08 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.473 12 DEBUG ceilometer.compute.pollsters [-] 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/network.incoming.bytes volume: 1352 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ace1df59-4538-45fa-b68a-072077010fa8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7392, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000021-c4c0edb4-a206-4617-9465-58c87dcdf7d5-tapf801fa72-eb', 'timestamp': '2026-01-22T17:19:55.464275', 'resource_metadata': {'display_name': 'tempest-server-test-1941539204', 'name': 'tapf801fa72-eb', 'instance_id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:b8:fd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf801fa72-eb'}, 'message_id': '9209947a-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.224766571, 'message_signature': '9656a20f62914dcd05d880c392640131b4cfb6afd8b774f6419769f0c0ae3f65'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1352, 'user_id': 
'852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-00000022-01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-tap4452f367-08', 'timestamp': '2026-01-22T17:19:55.464275', 'resource_metadata': {'display_name': 'tempest-server-1-871110647', 'name': 'tap4452f367-08', 'instance_id': '01065d98-95b1-4364-b9dc-eaf2c3a6d8f8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:a5:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4452f367-08'}, 'message_id': '920a30ec-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.230294927, 'message_signature': '41099f381f168cc247dbb49ce6094b1ec4beaf1606bdadec0af1f494e769cd34'}]}, 'timestamp': '2026-01-22 17:19:55.473931', '_unique_id': '309b7e978417469b96902f477089a472'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.476 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.477 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.478 12 DEBUG ceilometer.compute.pollsters [-] c4c0edb4-a206-4617-9465-58c87dcdf7d5/network.outgoing.bytes volume: 10121 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.478 12 DEBUG ceilometer.compute.pollsters [-] 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/network.outgoing.bytes volume: 1284 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '31cdf70f-1428-49ca-b028-99a750484c38', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10121, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000021-c4c0edb4-a206-4617-9465-58c87dcdf7d5-tapf801fa72-eb', 'timestamp': '2026-01-22T17:19:55.478036', 'resource_metadata': {'display_name': 'tempest-server-test-1941539204', 'name': 'tapf801fa72-eb', 'instance_id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:b8:fd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf801fa72-eb'}, 'message_id': '920aeb90-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.224766571, 'message_signature': '2079d803d3a114fca479307264d5e6f4af13d308590c5d5237d3dc77aa48bdd2'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1284, 'user_id': 
'852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-00000022-01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-tap4452f367-08', 'timestamp': '2026-01-22T17:19:55.478036', 'resource_metadata': {'display_name': 'tempest-server-1-871110647', 'name': 'tap4452f367-08', 'instance_id': '01065d98-95b1-4364-b9dc-eaf2c3a6d8f8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:a5:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4452f367-08'}, 'message_id': '920b0166-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.230294927, 'message_signature': '777337b09a48dd3a8d9c1001cd71261338d100214a021eca8fcd6ebf02d87119'}]}, 'timestamp': '2026-01-22 17:19:55.479101', '_unique_id': '9923b8f359e1449a9eb85e9bc91cb066'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.480 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.481 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.481 12 DEBUG ceilometer.compute.pollsters [-] c4c0edb4-a206-4617-9465-58c87dcdf7d5/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.482 12 DEBUG ceilometer.compute.pollsters [-] 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8a4fe104-4f11-488c-b72f-f332c2800c55', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000021-c4c0edb4-a206-4617-9465-58c87dcdf7d5-tapf801fa72-eb', 'timestamp': '2026-01-22T17:19:55.481841', 'resource_metadata': {'display_name': 'tempest-server-test-1941539204', 'name': 'tapf801fa72-eb', 'instance_id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:b8:fd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf801fa72-eb'}, 'message_id': '920b80fa-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.224766571, 'message_signature': 'bbe7a48e3b032f62f6f85f4c100ac9170dc4688329883689b86f20ce4895fc5d'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-00000022-01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-tap4452f367-08', 'timestamp': '2026-01-22T17:19:55.481841', 'resource_metadata': {'display_name': 'tempest-server-1-871110647', 'name': 'tap4452f367-08', 'instance_id': '01065d98-95b1-4364-b9dc-eaf2c3a6d8f8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:a5:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4452f367-08'}, 'message_id': '920b939c-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.230294927, 'message_signature': 'b7e64855ad45b7e650bbf7f7ecb295e1940edc6c53949feac4260ceb0db99683'}]}, 'timestamp': '2026-01-22 17:19:55.482886', '_unique_id': 'e275553d910242399dd198a7a28f1294'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.483 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.485 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.495 12 DEBUG ceilometer.compute.pollsters [-] c4c0edb4-a206-4617-9465-58c87dcdf7d5/disk.device.allocation volume: 30220288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.505 12 DEBUG ceilometer.compute.pollsters [-] 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/disk.device.allocation volume: 28319744 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bb76f928-f800-400c-870f-b3bfca389601', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30220288, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5-vda', 'timestamp': '2026-01-22T17:19:55.485374', 'resource_metadata': {'display_name': 'tempest-server-test-1941539204', 'name': 'instance-00000021', 'instance_id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '920d8efe-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.245847877, 'message_signature': 'cbf2b38b367fc03a41a49d281529acf81b20bcba811c8a6433ff976540dd763f'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 28319744, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 
'01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-vda', 'timestamp': '2026-01-22T17:19:55.485374', 'resource_metadata': {'display_name': 'tempest-server-1-871110647', 'name': 'instance-00000022', 'instance_id': '01065d98-95b1-4364-b9dc-eaf2c3a6d8f8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '920f0f04-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.256274752, 'message_signature': 'daf5ec9a9a24b98ff30de1585b60739da366abf43ea2171e55feb8553e441f57'}]}, 'timestamp': '2026-01-22 17:19:55.505804', '_unique_id': 'd6f6d7277dd2469a99d7a58d12a5da9d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.507 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.508 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.508 12 DEBUG ceilometer.compute.pollsters [-] c4c0edb4-a206-4617-9465-58c87dcdf7d5/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.509 12 DEBUG ceilometer.compute.pollsters [-] 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd3b6602b-196a-4348-ac3b-41d0c81640a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000021-c4c0edb4-a206-4617-9465-58c87dcdf7d5-tapf801fa72-eb', 'timestamp': '2026-01-22T17:19:55.508731', 'resource_metadata': {'display_name': 'tempest-server-test-1941539204', 'name': 'tapf801fa72-eb', 'instance_id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:b8:fd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf801fa72-eb'}, 'message_id': '920f9b54-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.224766571, 'message_signature': 'a54cca47beec5db58116cf5f81d5007fa13edae98f7b8854ebce219d8ea3614d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-00000022-01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-tap4452f367-08', 'timestamp': '2026-01-22T17:19:55.508731', 'resource_metadata': {'display_name': 'tempest-server-1-871110647', 'name': 'tap4452f367-08', 'instance_id': '01065d98-95b1-4364-b9dc-eaf2c3a6d8f8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:a5:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4452f367-08'}, 'message_id': '920fb09e-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.230294927, 'message_signature': 'dc23402d47d74b45398f80adecd0216cadc1874ba649c9cc94b64a84f90e0f4b'}]}, 'timestamp': '2026-01-22 17:19:55.509840', '_unique_id': 'ef43554ec0d74f559761a70e8852be3e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.510 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.512 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.512 12 DEBUG ceilometer.compute.pollsters [-] c4c0edb4-a206-4617-9465-58c87dcdf7d5/network.outgoing.packets volume: 115 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.513 12 DEBUG ceilometer.compute.pollsters [-] 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/network.outgoing.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df99e7ed-dbd9-42dd-b276-42f39e7363b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 115, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000021-c4c0edb4-a206-4617-9465-58c87dcdf7d5-tapf801fa72-eb', 'timestamp': '2026-01-22T17:19:55.512619', 'resource_metadata': {'display_name': 'tempest-server-test-1941539204', 'name': 'tapf801fa72-eb', 'instance_id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:b8:fd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf801fa72-eb'}, 'message_id': '9210333e-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.224766571, 'message_signature': 'f217d871f25ce0f97e8186f694adab74a621797062ee8c8d8fe48b3527f8fd5c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': 
'852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-00000022-01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-tap4452f367-08', 'timestamp': '2026-01-22T17:19:55.512619', 'resource_metadata': {'display_name': 'tempest-server-1-871110647', 'name': 'tap4452f367-08', 'instance_id': '01065d98-95b1-4364-b9dc-eaf2c3a6d8f8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:a5:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4452f367-08'}, 'message_id': '921051c0-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.230294927, 'message_signature': '2c7adec017ed3525d9d619eb6c3997ced1a5a740562aeec6d94a12af5f2f8b3d'}]}, 'timestamp': '2026-01-22 17:19:55.513919', '_unique_id': '74f2351769c3417e894f5a3c84b78714'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.514 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.516 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.539 12 DEBUG ceilometer.compute.pollsters [-] c4c0edb4-a206-4617-9465-58c87dcdf7d5/disk.device.read.latency volume: 226096054 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.560 12 DEBUG ceilometer.compute.pollsters [-] 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/disk.device.read.latency volume: 255548309 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd8b85a43-b1d2-481e-830d-3903c7f4af9f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 226096054, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5-vda', 'timestamp': '2026-01-22T17:19:55.516353', 'resource_metadata': {'display_name': 'tempest-server-test-1941539204', 'name': 'instance-00000021', 'instance_id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92145c5c-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.276818954, 'message_signature': '2fa2a17b21bb98ecec4792dcf9b426c2fe9dfdaca8226adf189f84450f9e9777'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 255548309, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': '01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-vda', 'timestamp': '2026-01-22T17:19:55.516353', 'resource_metadata': {'display_name': 'tempest-server-1-871110647', 'name': 'instance-00000022', 'instance_id': '01065d98-95b1-4364-b9dc-eaf2c3a6d8f8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '921779aa-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.300872778, 'message_signature': '459f2d9d727ef99d7a83a1d09b4f52054e45b6fb3ab2104911d914617cea913c'}]}, 'timestamp': '2026-01-22 17:19:55.560938', '_unique_id': '15fe57f0c04c42c39607e63e5f99cacc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.562 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.564 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.564 12 DEBUG ceilometer.compute.pollsters [-] c4c0edb4-a206-4617-9465-58c87dcdf7d5/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.565 12 DEBUG ceilometer.compute.pollsters [-] 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2197af4f-9921-47e9-84fc-9e4503a1b53a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000021-c4c0edb4-a206-4617-9465-58c87dcdf7d5-tapf801fa72-eb', 'timestamp': '2026-01-22T17:19:55.564565', 'resource_metadata': {'display_name': 'tempest-server-test-1941539204', 'name': 'tapf801fa72-eb', 'instance_id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:b8:fd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf801fa72-eb'}, 'message_id': '92182422-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.224766571, 'message_signature': 'a4797c27e434b340bb43c6a846dcc8d75a4116aabf6e1c86f10348a6cc6a6feb'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-00000022-01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-tap4452f367-08', 'timestamp': '2026-01-22T17:19:55.564565', 'resource_metadata': {'display_name': 'tempest-server-1-871110647', 'name': 'tap4452f367-08', 'instance_id': '01065d98-95b1-4364-b9dc-eaf2c3a6d8f8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:a5:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4452f367-08'}, 'message_id': '92183714-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.230294927, 'message_signature': '04b9f34dd512be7860100135bdfd722c61c95f3616789c989bb95ca1b01f7d2f'}]}, 'timestamp': '2026-01-22 17:19:55.565703', '_unique_id': '4f64442b6cd24fcc84d6b1ad5cac83f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.566 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.568 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.593 12 DEBUG ceilometer.compute.pollsters [-] c4c0edb4-a206-4617-9465-58c87dcdf7d5/memory.usage volume: 42.65234375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.612 12 DEBUG ceilometer.compute.pollsters [-] 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/memory.usage volume: 40.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7fe5bead-ea40-43c3-940f-d75a621055f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.65234375, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5', 'timestamp': '2026-01-22T17:19:55.568230', 'resource_metadata': {'display_name': 'tempest-server-test-1941539204', 'name': 'instance-00000021', 'instance_id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '921c9f70-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.354186044, 'message_signature': '414c88676786022181e28247eef074a19428fabef262548f8874dd3704cfbf02'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.38671875, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': '01065d98-95b1-4364-b9dc-eaf2c3a6d8f8', 'timestamp': '2026-01-22T17:19:55.568230', 'resource_metadata': {'display_name': 'tempest-server-1-871110647', 'name': 'instance-00000022', 'instance_id': '01065d98-95b1-4364-b9dc-eaf2c3a6d8f8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '921f6020-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.372326483, 'message_signature': '31ceb02b2c9a3b34280b78f304e4a9586e46ba39ffef4943838938218a03619a'}]}, 'timestamp': '2026-01-22 17:19:55.612661', '_unique_id': '3a7abf8fa8b34f01855589142f97444a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.614 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.615 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.616 12 DEBUG ceilometer.compute.pollsters [-] c4c0edb4-a206-4617-9465-58c87dcdf7d5/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.616 12 DEBUG ceilometer.compute.pollsters [-] 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a420d092-ac04-43d7-9e43-1efab4966e69', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000021-c4c0edb4-a206-4617-9465-58c87dcdf7d5-tapf801fa72-eb', 'timestamp': '2026-01-22T17:19:55.615976', 'resource_metadata': {'display_name': 'tempest-server-test-1941539204', 'name': 'tapf801fa72-eb', 'instance_id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:b8:fd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf801fa72-eb'}, 'message_id': '921ff7c4-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.224766571, 'message_signature': '50fc7ef909aa4352cba7adf830d54f7ce47ccfc2c7c9435870149706531a7273'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-00000022-01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-tap4452f367-08', 'timestamp': '2026-01-22T17:19:55.615976', 'resource_metadata': {'display_name': 'tempest-server-1-871110647', 'name': 'tap4452f367-08', 'instance_id': '01065d98-95b1-4364-b9dc-eaf2c3a6d8f8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:a5:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4452f367-08'}, 'message_id': '92200dfe-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.230294927, 'message_signature': '3090528b7c2a66a67dfdf8a25589d833055513f277d8989b866c30084683f5ff'}]}, 'timestamp': '2026-01-22 17:19:55.617046', '_unique_id': 'fe0cc91e1de146e3a8266aa7cc47f259'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.618 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.619 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.619 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.619 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-1941539204>, <NovaLikeServer: tempest-server-1-871110647>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1941539204>, <NovaLikeServer: tempest-server-1-871110647>]
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.620 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.620 12 DEBUG ceilometer.compute.pollsters [-] c4c0edb4-a206-4617-9465-58c87dcdf7d5/disk.device.read.requests volume: 1140 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.621 12 DEBUG ceilometer.compute.pollsters [-] 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/disk.device.read.requests volume: 984 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d6173ab-ea00-4bc2-afdc-83a3bdbb53b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1140, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5-vda', 'timestamp': '2026-01-22T17:19:55.620353', 'resource_metadata': {'display_name': 'tempest-server-test-1941539204', 'name': 'instance-00000021', 'instance_id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9220a6ec-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.276818954, 'message_signature': '43f6fc44fc1a515e9d420cabc637874b612f9c44648b8ffd6ad1adcd2e4a8a76'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 984, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': '01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-vda', 'timestamp': '2026-01-22T17:19:55.620353', 'resource_metadata': {'display_name': 'tempest-server-1-871110647', 'name': 'instance-00000022', 'instance_id': '01065d98-95b1-4364-b9dc-eaf2c3a6d8f8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9220bce0-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.300872778, 'message_signature': '5bb63c7c6430051acb82bae407f73d143077630c1f163b8818cfe6428aa7b600'}]}, 'timestamp': '2026-01-22 17:19:55.621513', '_unique_id': '86f9db36726347568d356dea0077cbfc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.622 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.624 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.624 12 DEBUG ceilometer.compute.pollsters [-] c4c0edb4-a206-4617-9465-58c87dcdf7d5/disk.device.usage volume: 29818880 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.624 12 DEBUG ceilometer.compute.pollsters [-] 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/disk.device.usage volume: 28246016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '237002c7-0c0d-4af6-bfd5-f03d0aab9ea5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29818880, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5-vda', 'timestamp': '2026-01-22T17:19:55.624191', 'resource_metadata': {'display_name': 'tempest-server-test-1941539204', 'name': 'instance-00000021', 'instance_id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9221394a-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.245847877, 'message_signature': '79d246e952ee6a4a8eb8913b7878917d880aa4881bea2166d594f6dbfa9a05b6'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 28246016, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': '01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-vda', 'timestamp': '2026-01-22T17:19:55.624191', 'resource_metadata': {'display_name': 'tempest-server-1-871110647', 'name': 'instance-00000022', 'instance_id': '01065d98-95b1-4364-b9dc-eaf2c3a6d8f8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92214d18-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.256274752, 'message_signature': '210407ac632bfff521c56bb2a857d6b94b7cc5a0b953c2b7b00f3f0741586f32'}]}, 'timestamp': '2026-01-22 17:19:55.625245', '_unique_id': '14a4b533c3ac4e70a8857b1fff05c5e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.626 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.627 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.628 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.628 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1941539204>, <NovaLikeServer: tempest-server-1-871110647>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1941539204>, <NovaLikeServer: tempest-server-1-871110647>]
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.628 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.628 12 DEBUG ceilometer.compute.pollsters [-] c4c0edb4-a206-4617-9465-58c87dcdf7d5/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.629 12 DEBUG ceilometer.compute.pollsters [-] 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8ba9256a-cead-41d6-a72e-c8a03415a393', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000021-c4c0edb4-a206-4617-9465-58c87dcdf7d5-tapf801fa72-eb', 'timestamp': '2026-01-22T17:19:55.628849', 'resource_metadata': {'display_name': 'tempest-server-test-1941539204', 'name': 'tapf801fa72-eb', 'instance_id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:b8:fd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf801fa72-eb'}, 'message_id': '9221ef70-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.224766571, 'message_signature': '333d34fff901a0bf46e28c6b474e23cf608bef83feb93aa5a670c9bfc0a2531f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-00000022-01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-tap4452f367-08', 'timestamp': '2026-01-22T17:19:55.628849', 'resource_metadata': {'display_name': 'tempest-server-1-871110647', 'name': 'tap4452f367-08', 'instance_id': '01065d98-95b1-4364-b9dc-eaf2c3a6d8f8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:a5:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4452f367-08'}, 'message_id': '922203d4-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.230294927, 'message_signature': '60fb3f90d42c5caae8e792b81d377fbe410be03f640a435c462255b8399f5e48'}]}, 'timestamp': '2026-01-22 17:19:55.629889', '_unique_id': 'aa90b8247915415d93013a61747def00'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.630 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.632 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.632 12 DEBUG ceilometer.compute.pollsters [-] c4c0edb4-a206-4617-9465-58c87dcdf7d5/network.incoming.packets volume: 62 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.633 12 DEBUG ceilometer.compute.pollsters [-] 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a74c7bce-198c-44ec-b893-fd96cd514175', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 62, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000021-c4c0edb4-a206-4617-9465-58c87dcdf7d5-tapf801fa72-eb', 'timestamp': '2026-01-22T17:19:55.632479', 'resource_metadata': {'display_name': 'tempest-server-test-1941539204', 'name': 'tapf801fa72-eb', 'instance_id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:b8:fd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf801fa72-eb'}, 'message_id': '92227d32-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.224766571, 'message_signature': 'e6d1f270893301f2eb7faac47f4b212bcd6b0e2aba19a5256d3b1621071cb837'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 'user_id': 
'852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-00000022-01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-tap4452f367-08', 'timestamp': '2026-01-22T17:19:55.632479', 'resource_metadata': {'display_name': 'tempest-server-1-871110647', 'name': 'tap4452f367-08', 'instance_id': '01065d98-95b1-4364-b9dc-eaf2c3a6d8f8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:a5:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4452f367-08'}, 'message_id': '922290d8-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.230294927, 'message_signature': '3b194c3cca6a72c1c03fac15b95758f770b4aff8c7d691c33ccb19ca442c46db'}]}, 'timestamp': '2026-01-22 17:19:55.633501', '_unique_id': 'f59952df9ea24f8f8adc8c674c865a4f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.634 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.635 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.636 12 DEBUG ceilometer.compute.pollsters [-] c4c0edb4-a206-4617-9465-58c87dcdf7d5/cpu volume: 11500000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.636 12 DEBUG ceilometer.compute.pollsters [-] 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/cpu volume: 10180000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5ff97d5c-82a0-4a5a-83aa-d5691266c323', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11500000000, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5', 'timestamp': '2026-01-22T17:19:55.636012', 'resource_metadata': {'display_name': 'tempest-server-test-1941539204', 'name': 'instance-00000021', 'instance_id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '922305a4-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.354186044, 'message_signature': '214a6ec58eac248e7294eb327fcab29fa9f857eeb31b5e26b04d41d117e7683e'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10180000000, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': '01065d98-95b1-4364-b9dc-eaf2c3a6d8f8', 
'timestamp': '2026-01-22T17:19:55.636012', 'resource_metadata': {'display_name': 'tempest-server-1-871110647', 'name': 'instance-00000022', 'instance_id': '01065d98-95b1-4364-b9dc-eaf2c3a6d8f8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '92231a12-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.372326483, 'message_signature': 'a7aa8ce210b3c38d9a5d8e13b5b287f2e079d7ccad0e9ad723bd2b021d0ba7c5'}]}, 'timestamp': '2026-01-22 17:19:55.637000', '_unique_id': '8dc48b685c8a43828683d6ead1a7d658'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.638 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.639 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.639 12 DEBUG ceilometer.compute.pollsters [-] c4c0edb4-a206-4617-9465-58c87dcdf7d5/disk.device.write.bytes volume: 72871936 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.640 12 DEBUG ceilometer.compute.pollsters [-] 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/disk.device.write.bytes volume: 25628672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8b578a41-41a0-47f4-a2fb-e225394d1b95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72871936, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5-vda', 'timestamp': '2026-01-22T17:19:55.639787', 'resource_metadata': {'display_name': 'tempest-server-test-1941539204', 'name': 'instance-00000021', 'instance_id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92239d5c-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.276818954, 'message_signature': 'eef84a30894409a3fa46cf206ffa8b4592bee01e88595b57626a528d1cfbce89'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 25628672, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': '01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-vda', 'timestamp': '2026-01-22T17:19:55.639787', 'resource_metadata': {'display_name': 'tempest-server-1-871110647', 'name': 'instance-00000022', 'instance_id': '01065d98-95b1-4364-b9dc-eaf2c3a6d8f8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9223b918-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.300872778, 'message_signature': '7e7309bae38c7527e46dc8102c8c71685c066b363c22b860669c9af371d60c44'}]}, 'timestamp': '2026-01-22 17:19:55.641151', '_unique_id': '57ef658c131b4790be8cdb23bae47708'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.642 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.643 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.643 12 DEBUG ceilometer.compute.pollsters [-] c4c0edb4-a206-4617-9465-58c87dcdf7d5/disk.device.read.bytes volume: 30837248 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.644 12 DEBUG ceilometer.compute.pollsters [-] 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/disk.device.read.bytes volume: 28441600 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd54663cb-aa77-4f2e-bb87-d5d16ee10172', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30837248, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5-vda', 'timestamp': '2026-01-22T17:19:55.643671', 'resource_metadata': {'display_name': 'tempest-server-test-1941539204', 'name': 'instance-00000021', 'instance_id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9224310e-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.276818954, 'message_signature': '30cf767f1bed869c9f2733b6f364f415a4512d64ccdb9bb4970fa8dfbdbfc687'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 28441600, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': '01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-vda', 'timestamp': '2026-01-22T17:19:55.643671', 'resource_metadata': {'display_name': 'tempest-server-1-871110647', 'name': 'instance-00000022', 'instance_id': '01065d98-95b1-4364-b9dc-eaf2c3a6d8f8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9224459a-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.300872778, 'message_signature': '95f1d833b21b00a14201f47ef58a65b76bd8121be045f060c9e29648a89f0d6e'}]}, 'timestamp': '2026-01-22 17:19:55.644703', '_unique_id': 'bc4e6d96eb2b4de1a2fe2dd96bd5ad10'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.645 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.647 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.647 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.647 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1941539204>, <NovaLikeServer: tempest-server-1-871110647>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1941539204>, <NovaLikeServer: tempest-server-1-871110647>]
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.647 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.647 12 DEBUG ceilometer.compute.pollsters [-] c4c0edb4-a206-4617-9465-58c87dcdf7d5/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.648 12 DEBUG ceilometer.compute.pollsters [-] 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bfbd62d8-51d7-4e19-bfd1-91dd440a1a45', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5-vda', 'timestamp': '2026-01-22T17:19:55.647883', 'resource_metadata': {'display_name': 'tempest-server-test-1941539204', 'name': 'instance-00000021', 'instance_id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9224d62c-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.245847877, 'message_signature': '911629ce01b31902bc366dbbe266a77f965260f085333b4d69986d72cd85a0ae'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 
'01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-vda', 'timestamp': '2026-01-22T17:19:55.647883', 'resource_metadata': {'display_name': 'tempest-server-1-871110647', 'name': 'instance-00000022', 'instance_id': '01065d98-95b1-4364-b9dc-eaf2c3a6d8f8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9224eb12-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.256274752, 'message_signature': '7b67f3bdd4bdce5e84e694ded9d7a69e690b5a171f0c60c14ccc3dfabe2b48a7'}]}, 'timestamp': '2026-01-22 17:19:55.648903', '_unique_id': '5d5fe7d885e54e1c90a38afa9d921951'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.649 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.650 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.650 12 DEBUG ceilometer.compute.pollsters [-] c4c0edb4-a206-4617-9465-58c87dcdf7d5/disk.device.write.latency volume: 3982450037 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.650 12 DEBUG ceilometer.compute.pollsters [-] 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/disk.device.write.latency volume: 4947876495 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97a1eb1b-871c-4551-b82b-d3e088f57a2e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3982450037, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5-vda', 'timestamp': '2026-01-22T17:19:55.650437', 'resource_metadata': {'display_name': 'tempest-server-test-1941539204', 'name': 'instance-00000021', 'instance_id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9225352c-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.276818954, 'message_signature': 'b6fcf1fbb802f5a6be3a0bb2e3a65013b1608db8c9318ebfc6b3b94196f1223e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4947876495, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 
'resource_id': '01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-vda', 'timestamp': '2026-01-22T17:19:55.650437', 'resource_metadata': {'display_name': 'tempest-server-1-871110647', 'name': 'instance-00000022', 'instance_id': '01065d98-95b1-4364-b9dc-eaf2c3a6d8f8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '922540a8-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.300872778, 'message_signature': '996d14591136038eb39251a2dcb81f191d27d43ea5baa7310800595c1caef0df'}]}, 'timestamp': '2026-01-22 17:19:55.651010', '_unique_id': 'bef95f2b7bfb40eaa6505863baf48919'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.651 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.652 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.652 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.652 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-1941539204>, <NovaLikeServer: tempest-server-1-871110647>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1941539204>, <NovaLikeServer: tempest-server-1-871110647>]
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.652 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.653 12 DEBUG ceilometer.compute.pollsters [-] c4c0edb4-a206-4617-9465-58c87dcdf7d5/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.653 12 DEBUG ceilometer.compute.pollsters [-] 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a428ce92-c768-4988-b95f-7f457217148c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000021-c4c0edb4-a206-4617-9465-58c87dcdf7d5-tapf801fa72-eb', 'timestamp': '2026-01-22T17:19:55.653039', 'resource_metadata': {'display_name': 'tempest-server-test-1941539204', 'name': 'tapf801fa72-eb', 'instance_id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:b8:fd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf801fa72-eb'}, 'message_id': '92259ac6-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.224766571, 'message_signature': '7be5f52d0cd869e4761ef3af829f11c2a791e9eb41584597c90b6426e879a00c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-00000022-01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-tap4452f367-08', 'timestamp': '2026-01-22T17:19:55.653039', 'resource_metadata': {'display_name': 'tempest-server-1-871110647', 'name': 'tap4452f367-08', 'instance_id': '01065d98-95b1-4364-b9dc-eaf2c3a6d8f8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:a5:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4452f367-08'}, 'message_id': '9225a610-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.230294927, 'message_signature': '5894f97c5426f87a3f25d896fe90b926174ff00b65f4b5a243acfaf1c42814f3'}]}, 'timestamp': '2026-01-22 17:19:55.653616', '_unique_id': '8844c59c60f448a6a58220edf2b112d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.654 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.655 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.655 12 DEBUG ceilometer.compute.pollsters [-] c4c0edb4-a206-4617-9465-58c87dcdf7d5/disk.device.write.requests volume: 312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.655 12 DEBUG ceilometer.compute.pollsters [-] 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/disk.device.write.requests volume: 241 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6313f38a-ef09-441d-a27f-f6c159b2cd66', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 312, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5-vda', 'timestamp': '2026-01-22T17:19:55.655106', 'resource_metadata': {'display_name': 'tempest-server-test-1941539204', 'name': 'instance-00000021', 'instance_id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9225eb84-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.276818954, 'message_signature': '037f013989012d321bc62b54cb78972258a94b747c0601e8e6126ec099c46d25'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 241, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 
'resource_id': '01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-vda', 'timestamp': '2026-01-22T17:19:55.655106', 'resource_metadata': {'display_name': 'tempest-server-1-871110647', 'name': 'instance-00000022', 'instance_id': '01065d98-95b1-4364-b9dc-eaf2c3a6d8f8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9225f5ac-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4759.300872778, 'message_signature': '8a5bde2b6a71b86bc41360d7a9d06c378eade8215fc51e5bf503fa6a312459f4'}]}, 'timestamp': '2026-01-22 17:19:55.655668', '_unique_id': 'a1077c3831cd46d28585ad30c764677c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:19:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:19:55.656 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:19:56 compute-0 nova_compute[183075]: 2026-01-22 17:19:56.517 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:57 compute-0 nova_compute[183075]: 2026-01-22 17:19:57.181 183079 INFO nova.compute.manager [None req-75400a88-5ff8-4c2c-a8bd-26eba9cb33ca 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Get console output
Jan 22 17:19:57 compute-0 nova_compute[183075]: 2026-01-22 17:19:57.185 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:19:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:57.932 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:19:58 compute-0 nova_compute[183075]: 2026-01-22 17:19:58.837 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:58.867 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:19:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:58.869 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:19:58 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:19:58 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:19:58 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:19:58 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:19:58 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:19:58 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:19:58 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.446 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.447 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.5782249
Jan 22 17:19:59 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[226236]: 10.100.0.10:34896 [22/Jan/2026:17:19:58.866] listener listener/metadata 0/0/0/580/580 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.460 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.461 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.476 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.477 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0161667
Jan 22 17:19:59 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[226236]: 10.100.0.10:34908 [22/Jan/2026:17:19:59.459] listener listener/metadata 0/0/0/17/17 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.482 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.483 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.500 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.501 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0186040
Jan 22 17:19:59 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[226236]: 10.100.0.10:34910 [22/Jan/2026:17:19:59.481] listener listener/metadata 0/0/0/19/19 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.507 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.507 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.523 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.523 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0159616
Jan 22 17:19:59 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[226236]: 10.100.0.10:34914 [22/Jan/2026:17:19:59.506] listener listener/metadata 0/0/0/17/17 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.529 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.530 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.544 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.545 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0145707
Jan 22 17:19:59 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[226236]: 10.100.0.10:34924 [22/Jan/2026:17:19:59.529] listener listener/metadata 0/0/0/15/15 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.550 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.551 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.575 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.575 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0247569
Jan 22 17:19:59 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[226236]: 10.100.0.10:34938 [22/Jan/2026:17:19:59.549] listener listener/metadata 0/0/0/26/26 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.581 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.581 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.596 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:19:59 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[226236]: 10.100.0.10:34948 [22/Jan/2026:17:19:59.580] listener listener/metadata 0/0/0/16/16 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.597 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0158567
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.606 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.607 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.623 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.624 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0168400
Jan 22 17:19:59 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[226236]: 10.100.0.10:34954 [22/Jan/2026:17:19:59.605] listener listener/metadata 0/0/0/18/18 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.632 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.632 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.644 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:19:59 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[226236]: 10.100.0.10:34964 [22/Jan/2026:17:19:59.631] listener listener/metadata 0/0/0/13/13 200 146 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.645 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 162 time: 0.0125413
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.653 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.654 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.673 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.673 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 162 time: 0.0196457
Jan 22 17:19:59 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[226236]: 10.100.0.10:34976 [22/Jan/2026:17:19:59.652] listener listener/metadata 0/0/0/21/21 200 146 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.683 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.684 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:19:59 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[226236]: 10.100.0.10:34980 [22/Jan/2026:17:19:59.683] listener listener/metadata 0/0/0/20/20 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.703 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0186961
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.720 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.721 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.742 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.743 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0214856
Jan 22 17:19:59 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[226236]: 10.100.0.10:34990 [22/Jan/2026:17:19:59.720] listener listener/metadata 0/0/0/23/23 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.750 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.751 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.771 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.772 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0215435
Jan 22 17:19:59 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[226236]: 10.100.0.10:35000 [22/Jan/2026:17:19:59.749] listener listener/metadata 0/0/0/22/22 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.780 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.781 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.794 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.794 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0137269
Jan 22 17:19:59 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[226236]: 10.100.0.10:35014 [22/Jan/2026:17:19:59.779] listener listener/metadata 0/0/0/14/14 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.803 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.803 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.821 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:19:59 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[226236]: 10.100.0.10:35022 [22/Jan/2026:17:19:59.802] listener listener/metadata 0/0/0/19/19 200 146 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.822 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 162 time: 0.0188999
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.830 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.831 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.848 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:19:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:19:59.849 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0180151
Jan 22 17:19:59 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[226236]: 10.100.0.10:35024 [22/Jan/2026:17:19:59.829] listener listener/metadata 0/0/0/19/19 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:20:00 compute-0 nova_compute[183075]: 2026-01-22 17:20:00.728 183079 DEBUG oslo_concurrency.lockutils [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "d424f9c2-6a07-485c-9feb-fd1f6a145be8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:20:00 compute-0 nova_compute[183075]: 2026-01-22 17:20:00.729 183079 DEBUG oslo_concurrency.lockutils [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "d424f9c2-6a07-485c-9feb-fd1f6a145be8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:20:00 compute-0 nova_compute[183075]: 2026-01-22 17:20:00.746 183079 DEBUG nova.compute.manager [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:20:00 compute-0 nova_compute[183075]: 2026-01-22 17:20:00.817 183079 DEBUG oslo_concurrency.lockutils [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:20:00 compute-0 nova_compute[183075]: 2026-01-22 17:20:00.818 183079 DEBUG oslo_concurrency.lockutils [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:20:00 compute-0 nova_compute[183075]: 2026-01-22 17:20:00.827 183079 DEBUG nova.virt.hardware [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:20:00 compute-0 nova_compute[183075]: 2026-01-22 17:20:00.827 183079 INFO nova.compute.claims [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:20:00 compute-0 nova_compute[183075]: 2026-01-22 17:20:00.961 183079 DEBUG nova.compute.provider_tree [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:20:00 compute-0 nova_compute[183075]: 2026-01-22 17:20:00.978 183079 DEBUG nova.scheduler.client.report [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.009 183079 DEBUG oslo_concurrency.lockutils [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.011 183079 DEBUG nova.compute.manager [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.065 183079 DEBUG nova.compute.manager [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.065 183079 DEBUG nova.network.neutron [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.091 183079 INFO nova.virt.libvirt.driver [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.117 183079 DEBUG nova.compute.manager [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.279 183079 DEBUG nova.compute.manager [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.280 183079 DEBUG nova.virt.libvirt.driver [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.281 183079 INFO nova.virt.libvirt.driver [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Creating image(s)
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.281 183079 DEBUG oslo_concurrency.lockutils [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "/var/lib/nova/instances/d424f9c2-6a07-485c-9feb-fd1f6a145be8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.281 183079 DEBUG oslo_concurrency.lockutils [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "/var/lib/nova/instances/d424f9c2-6a07-485c-9feb-fd1f6a145be8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.282 183079 DEBUG oslo_concurrency.lockutils [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "/var/lib/nova/instances/d424f9c2-6a07-485c-9feb-fd1f6a145be8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.295 183079 DEBUG oslo_concurrency.processutils [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.350 183079 DEBUG oslo_concurrency.processutils [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.352 183079 DEBUG oslo_concurrency.lockutils [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.353 183079 DEBUG oslo_concurrency.lockutils [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.376 183079 DEBUG oslo_concurrency.processutils [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:20:01 compute-0 podman[226383]: 2026-01-22 17:20:01.383966121 +0000 UTC m=+0.078546223 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.436 183079 DEBUG oslo_concurrency.processutils [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.438 183079 DEBUG oslo_concurrency.processutils [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/d424f9c2-6a07-485c-9feb-fd1f6a145be8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.461 183079 DEBUG nova.policy [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.475 183079 DEBUG oslo_concurrency.processutils [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/d424f9c2-6a07-485c-9feb-fd1f6a145be8/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.476 183079 DEBUG oslo_concurrency.lockutils [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.477 183079 DEBUG oslo_concurrency.processutils [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.520 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.576 183079 DEBUG oslo_concurrency.processutils [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.577 183079 DEBUG nova.virt.disk.api [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Checking if we can resize image /var/lib/nova/instances/d424f9c2-6a07-485c-9feb-fd1f6a145be8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.577 183079 DEBUG oslo_concurrency.processutils [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d424f9c2-6a07-485c-9feb-fd1f6a145be8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.663 183079 DEBUG oslo_concurrency.processutils [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d424f9c2-6a07-485c-9feb-fd1f6a145be8/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.665 183079 DEBUG nova.virt.disk.api [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Cannot resize image /var/lib/nova/instances/d424f9c2-6a07-485c-9feb-fd1f6a145be8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.665 183079 DEBUG nova.objects.instance [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lazy-loading 'migration_context' on Instance uuid d424f9c2-6a07-485c-9feb-fd1f6a145be8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.683 183079 DEBUG nova.virt.libvirt.driver [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.683 183079 DEBUG nova.virt.libvirt.driver [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Ensure instance console log exists: /var/lib/nova/instances/d424f9c2-6a07-485c-9feb-fd1f6a145be8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.684 183079 DEBUG oslo_concurrency.lockutils [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.685 183079 DEBUG oslo_concurrency.lockutils [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:20:01 compute-0 nova_compute[183075]: 2026-01-22 17:20:01.685 183079 DEBUG oslo_concurrency.lockutils [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:20:02 compute-0 nova_compute[183075]: 2026-01-22 17:20:02.315 183079 INFO nova.compute.manager [None req-544c292e-f66f-4be8-a7d9-236d7cad6d08 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Get console output
Jan 22 17:20:02 compute-0 nova_compute[183075]: 2026-01-22 17:20:02.323 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:20:02 compute-0 nova_compute[183075]: 2026-01-22 17:20:02.467 183079 DEBUG nova.network.neutron [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Successfully updated port: d6646594-6084-469b-933d-86261a4e465e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:20:02 compute-0 nova_compute[183075]: 2026-01-22 17:20:02.485 183079 DEBUG oslo_concurrency.lockutils [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "refresh_cache-d424f9c2-6a07-485c-9feb-fd1f6a145be8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:20:02 compute-0 nova_compute[183075]: 2026-01-22 17:20:02.485 183079 DEBUG oslo_concurrency.lockutils [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquired lock "refresh_cache-d424f9c2-6a07-485c-9feb-fd1f6a145be8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:20:02 compute-0 nova_compute[183075]: 2026-01-22 17:20:02.485 183079 DEBUG nova.network.neutron [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:20:02 compute-0 nova_compute[183075]: 2026-01-22 17:20:02.568 183079 DEBUG nova.compute.manager [req-ed737a3d-e652-4bce-91d3-176b92e3fd98 req-2c709d37-f811-432a-81a6-cecb6320c163 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Received event network-changed-d6646594-6084-469b-933d-86261a4e465e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:20:02 compute-0 nova_compute[183075]: 2026-01-22 17:20:02.569 183079 DEBUG nova.compute.manager [req-ed737a3d-e652-4bce-91d3-176b92e3fd98 req-2c709d37-f811-432a-81a6-cecb6320c163 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Refreshing instance network info cache due to event network-changed-d6646594-6084-469b-933d-86261a4e465e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:20:02 compute-0 nova_compute[183075]: 2026-01-22 17:20:02.569 183079 DEBUG oslo_concurrency.lockutils [req-ed737a3d-e652-4bce-91d3-176b92e3fd98 req-2c709d37-f811-432a-81a6-cecb6320c163 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-d424f9c2-6a07-485c-9feb-fd1f6a145be8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:20:02 compute-0 nova_compute[183075]: 2026-01-22 17:20:02.682 183079 DEBUG nova.network.neutron [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.598 183079 DEBUG nova.network.neutron [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Updating instance_info_cache with network_info: [{"id": "d6646594-6084-469b-933d-86261a4e465e", "address": "fa:16:3e:85:37:b5", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6646594-60", "ovs_interfaceid": "d6646594-6084-469b-933d-86261a4e465e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.840 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.854 183079 DEBUG oslo_concurrency.lockutils [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Releasing lock "refresh_cache-d424f9c2-6a07-485c-9feb-fd1f6a145be8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.854 183079 DEBUG nova.compute.manager [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Instance network_info: |[{"id": "d6646594-6084-469b-933d-86261a4e465e", "address": "fa:16:3e:85:37:b5", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6646594-60", "ovs_interfaceid": "d6646594-6084-469b-933d-86261a4e465e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.856 183079 DEBUG oslo_concurrency.lockutils [req-ed737a3d-e652-4bce-91d3-176b92e3fd98 req-2c709d37-f811-432a-81a6-cecb6320c163 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-d424f9c2-6a07-485c-9feb-fd1f6a145be8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.856 183079 DEBUG nova.network.neutron [req-ed737a3d-e652-4bce-91d3-176b92e3fd98 req-2c709d37-f811-432a-81a6-cecb6320c163 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Refreshing network info cache for port d6646594-6084-469b-933d-86261a4e465e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.861 183079 DEBUG nova.virt.libvirt.driver [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Start _get_guest_xml network_info=[{"id": "d6646594-6084-469b-933d-86261a4e465e", "address": "fa:16:3e:85:37:b5", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6646594-60", "ovs_interfaceid": "d6646594-6084-469b-933d-86261a4e465e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.866 183079 WARNING nova.virt.libvirt.driver [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.871 183079 DEBUG nova.virt.libvirt.host [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.872 183079 DEBUG nova.virt.libvirt.host [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.874 183079 DEBUG nova.virt.libvirt.host [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.875 183079 DEBUG nova.virt.libvirt.host [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.875 183079 DEBUG nova.virt.libvirt.driver [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.876 183079 DEBUG nova.virt.hardware [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.876 183079 DEBUG nova.virt.hardware [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.876 183079 DEBUG nova.virt.hardware [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.877 183079 DEBUG nova.virt.hardware [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.877 183079 DEBUG nova.virt.hardware [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.877 183079 DEBUG nova.virt.hardware [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.877 183079 DEBUG nova.virt.hardware [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.878 183079 DEBUG nova.virt.hardware [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.878 183079 DEBUG nova.virt.hardware [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.878 183079 DEBUG nova.virt.hardware [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.879 183079 DEBUG nova.virt.hardware [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.882 183079 DEBUG nova.virt.libvirt.vif [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:19:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-534742582',display_name='tempest-server-test-534742582',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-534742582',id=35,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBZ1addGiefngOIZky2bZbl/LpQHm8ydIPPcYG47RZ8D1pquJ/FsAcVn7mNe0QyfiQPXvuSs1eSw/jGOy+x3K8tzxt5N8fIelXKcTizhDHr81ws2uYn+dW6F9Hiy6pJp3A==',key_name='tempest-keypair-test-453437320',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='02818155e7af4645bc909d4ba671f11f',ramdisk_id='',reservation_id='r-kzu0013k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSameNetwork-953620552',owner_user_name='tempest-FloatingIpSameNetwork-953620552-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:20:01Z,user_data=None,user_id='1148a46489e842e6a0c7660c54567798',uuid=d424f9c2-6a07-485c-9feb-fd1f6a145be8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d6646594-6084-469b-933d-86261a4e465e", "address": "fa:16:3e:85:37:b5", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6646594-60", "ovs_interfaceid": "d6646594-6084-469b-933d-86261a4e465e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.883 183079 DEBUG nova.network.os_vif_util [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converting VIF {"id": "d6646594-6084-469b-933d-86261a4e465e", "address": "fa:16:3e:85:37:b5", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6646594-60", "ovs_interfaceid": "d6646594-6084-469b-933d-86261a4e465e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.884 183079 DEBUG nova.network.os_vif_util [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:85:37:b5,bridge_name='br-int',has_traffic_filtering=True,id=d6646594-6084-469b-933d-86261a4e465e,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd6646594-60') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.885 183079 DEBUG nova.objects.instance [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lazy-loading 'pci_devices' on Instance uuid d424f9c2-6a07-485c-9feb-fd1f6a145be8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.899 183079 DEBUG nova.virt.libvirt.driver [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:20:03 compute-0 nova_compute[183075]:   <uuid>d424f9c2-6a07-485c-9feb-fd1f6a145be8</uuid>
Jan 22 17:20:03 compute-0 nova_compute[183075]:   <name>instance-00000023</name>
Jan 22 17:20:03 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:20:03 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:20:03 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:20:03 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-534742582</nova:name>
Jan 22 17:20:03 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:20:03</nova:creationTime>
Jan 22 17:20:03 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:20:03 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:20:03 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:20:03 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:20:03 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:20:03 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:20:03 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:20:03 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:20:03 compute-0 nova_compute[183075]:         <nova:user uuid="1148a46489e842e6a0c7660c54567798">tempest-FloatingIpSameNetwork-953620552-project-member</nova:user>
Jan 22 17:20:03 compute-0 nova_compute[183075]:         <nova:project uuid="02818155e7af4645bc909d4ba671f11f">tempest-FloatingIpSameNetwork-953620552</nova:project>
Jan 22 17:20:03 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:20:03 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:20:03 compute-0 nova_compute[183075]:         <nova:port uuid="d6646594-6084-469b-933d-86261a4e465e">
Jan 22 17:20:03 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:20:03 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:20:03 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:20:03 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <system>
Jan 22 17:20:03 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:20:03 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:20:03 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:20:03 compute-0 nova_compute[183075]:       <entry name="serial">d424f9c2-6a07-485c-9feb-fd1f6a145be8</entry>
Jan 22 17:20:03 compute-0 nova_compute[183075]:       <entry name="uuid">d424f9c2-6a07-485c-9feb-fd1f6a145be8</entry>
Jan 22 17:20:03 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     </system>
Jan 22 17:20:03 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:20:03 compute-0 nova_compute[183075]:   <os>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:   </os>
Jan 22 17:20:03 compute-0 nova_compute[183075]:   <features>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:   </features>
Jan 22 17:20:03 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:20:03 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:20:03 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:20:03 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/d424f9c2-6a07-485c-9feb-fd1f6a145be8/disk"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:20:03 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:85:37:b5"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:       <target dev="tapd6646594-60"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:20:03 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/d424f9c2-6a07-485c-9feb-fd1f6a145be8/console.log" append="off"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <video>
Jan 22 17:20:03 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     </video>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:20:03 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:20:03 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:20:03 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:20:03 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:20:03 compute-0 nova_compute[183075]: </domain>
Jan 22 17:20:03 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.901 183079 DEBUG nova.compute.manager [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Preparing to wait for external event network-vif-plugged-d6646594-6084-469b-933d-86261a4e465e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.901 183079 DEBUG oslo_concurrency.lockutils [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "d424f9c2-6a07-485c-9feb-fd1f6a145be8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.902 183079 DEBUG oslo_concurrency.lockutils [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "d424f9c2-6a07-485c-9feb-fd1f6a145be8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.903 183079 DEBUG oslo_concurrency.lockutils [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "d424f9c2-6a07-485c-9feb-fd1f6a145be8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.904 183079 DEBUG nova.virt.libvirt.vif [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:19:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-534742582',display_name='tempest-server-test-534742582',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-534742582',id=35,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBZ1addGiefngOIZky2bZbl/LpQHm8ydIPPcYG47RZ8D1pquJ/FsAcVn7mNe0QyfiQPXvuSs1eSw/jGOy+x3K8tzxt5N8fIelXKcTizhDHr81ws2uYn+dW6F9Hiy6pJp3A==',key_name='tempest-keypair-test-453437320',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='02818155e7af4645bc909d4ba671f11f',ramdisk_id='',reservation_id='r-kzu0013k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSameNetwork-953620552',owner_user_name='tempest-FloatingIpSameNetwork-953620552-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:20:01Z,user_data=None,user_id='1148a46489e842e6a0c7660c54567798',uuid=d424f9c2-6a07-485c-9feb-fd1f6a145be8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d6646594-6084-469b-933d-86261a4e465e", "address": "fa:16:3e:85:37:b5", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6646594-60", "ovs_interfaceid": "d6646594-6084-469b-933d-86261a4e465e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.904 183079 DEBUG nova.network.os_vif_util [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converting VIF {"id": "d6646594-6084-469b-933d-86261a4e465e", "address": "fa:16:3e:85:37:b5", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6646594-60", "ovs_interfaceid": "d6646594-6084-469b-933d-86261a4e465e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.905 183079 DEBUG nova.network.os_vif_util [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:85:37:b5,bridge_name='br-int',has_traffic_filtering=True,id=d6646594-6084-469b-933d-86261a4e465e,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd6646594-60') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.905 183079 DEBUG os_vif [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:37:b5,bridge_name='br-int',has_traffic_filtering=True,id=d6646594-6084-469b-933d-86261a4e465e,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd6646594-60') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.906 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.906 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.907 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.910 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.910 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd6646594-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.911 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd6646594-60, col_values=(('external_ids', {'iface-id': 'd6646594-6084-469b-933d-86261a4e465e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:85:37:b5', 'vm-uuid': 'd424f9c2-6a07-485c-9feb-fd1f6a145be8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.912 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:03 compute-0 NetworkManager[55454]: <info>  [1769102403.9141] manager: (tapd6646594-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/183)
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.915 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.921 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:03 compute-0 nova_compute[183075]: 2026-01-22 17:20:03.922 183079 INFO os_vif [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:37:b5,bridge_name='br-int',has_traffic_filtering=True,id=d6646594-6084-469b-933d-86261a4e465e,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd6646594-60')
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.008 183079 DEBUG nova.virt.libvirt.driver [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.009 183079 DEBUG nova.virt.libvirt.driver [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] No VIF found with MAC fa:16:3e:85:37:b5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:20:04 compute-0 NetworkManager[55454]: <info>  [1769102404.0653] manager: (tapd6646594-60): new Tun device (/org/freedesktop/NetworkManager/Devices/184)
Jan 22 17:20:04 compute-0 kernel: tapd6646594-60: entered promiscuous mode
Jan 22 17:20:04 compute-0 ovn_controller[95372]: 2026-01-22T17:20:04Z|00418|binding|INFO|Claiming lport d6646594-6084-469b-933d-86261a4e465e for this chassis.
Jan 22 17:20:04 compute-0 ovn_controller[95372]: 2026-01-22T17:20:04Z|00419|binding|INFO|d6646594-6084-469b-933d-86261a4e465e: Claiming fa:16:3e:85:37:b5 10.100.0.8
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.073 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:04.075 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:85:37:b5 10.100.0.8'], port_security=['fa:16:3e:85:37:b5 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'd424f9c2-6a07-485c-9feb-fd1f6a145be8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eee918a6-66b2-47ae-b702-620a23ef395b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02818155e7af4645bc909d4ba671f11f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c960fc2f-fbcd-48ff-b6bd-d3b900f07332', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.192'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bad11b1d-0e62-4467-b3f8-4f80f0fd75da, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=d6646594-6084-469b-933d-86261a4e465e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:20:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:04.077 104629 INFO neutron.agent.ovn.metadata.agent [-] Port d6646594-6084-469b-933d-86261a4e465e in datapath eee918a6-66b2-47ae-b702-620a23ef395b bound to our chassis
Jan 22 17:20:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:04.079 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network eee918a6-66b2-47ae-b702-620a23ef395b
Jan 22 17:20:04 compute-0 ovn_controller[95372]: 2026-01-22T17:20:04Z|00420|binding|INFO|Setting lport d6646594-6084-469b-933d-86261a4e465e ovn-installed in OVS
Jan 22 17:20:04 compute-0 ovn_controller[95372]: 2026-01-22T17:20:04Z|00421|binding|INFO|Setting lport d6646594-6084-469b-933d-86261a4e465e up in Southbound
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.090 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:04.100 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1a154900-b692-4246-8110-5712e0a2df23]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:20:04 compute-0 systemd-udevd[226439]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:20:04 compute-0 systemd-machined[154382]: New machine qemu-35-instance-00000023.
Jan 22 17:20:04 compute-0 systemd[1]: Started Virtual Machine qemu-35-instance-00000023.
Jan 22 17:20:04 compute-0 NetworkManager[55454]: <info>  [1769102404.1245] device (tapd6646594-60): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:20:04 compute-0 NetworkManager[55454]: <info>  [1769102404.1250] device (tapd6646594-60): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:20:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:04.135 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[061a1cee-f3fe-4328-83ca-031af7b8e68a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:20:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:04.139 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[6c4297b6-6dc4-46ca-a359-b5a0893db2a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:20:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:04.169 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[cb92d280-47ad-413b-b191-d5978ead22a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:20:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:04.186 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a50da462-cce7-49e5-a525-ed7a9696eabc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeee918a6-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:e2:7e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 55, 'rx_bytes': 8920, 'tx_bytes': 6212, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 55, 'rx_bytes': 8920, 'tx_bytes': 6212, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 115], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473340, 'reachable_time': 17462, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226450, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:20:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:04.202 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[532d5ea0-7587-4597-95af-b9d093631cec]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapeee918a6-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 473349, 'tstamp': 473349}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226452, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapeee918a6-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 473351, 'tstamp': 473351}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226452, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:20:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:04.204 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeee918a6-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.205 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.206 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:04.207 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeee918a6-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:20:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:04.207 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:20:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:04.207 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapeee918a6-60, col_values=(('external_ids', {'iface-id': '15d4de90-41f4-4532-aebd-197c2a33c6d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:20:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:04.207 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.643 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102404.6424618, d424f9c2-6a07-485c-9feb-fd1f6a145be8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.643 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] VM Started (Lifecycle Event)
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.649 183079 DEBUG nova.compute.manager [req-b2b0cacd-e828-410e-87b7-3567534ef99e req-711c15fc-59e6-4480-9d28-72b4f7b091c3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Received event network-vif-plugged-d6646594-6084-469b-933d-86261a4e465e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.649 183079 DEBUG oslo_concurrency.lockutils [req-b2b0cacd-e828-410e-87b7-3567534ef99e req-711c15fc-59e6-4480-9d28-72b4f7b091c3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "d424f9c2-6a07-485c-9feb-fd1f6a145be8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.649 183079 DEBUG oslo_concurrency.lockutils [req-b2b0cacd-e828-410e-87b7-3567534ef99e req-711c15fc-59e6-4480-9d28-72b4f7b091c3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "d424f9c2-6a07-485c-9feb-fd1f6a145be8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.650 183079 DEBUG oslo_concurrency.lockutils [req-b2b0cacd-e828-410e-87b7-3567534ef99e req-711c15fc-59e6-4480-9d28-72b4f7b091c3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "d424f9c2-6a07-485c-9feb-fd1f6a145be8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.650 183079 DEBUG nova.compute.manager [req-b2b0cacd-e828-410e-87b7-3567534ef99e req-711c15fc-59e6-4480-9d28-72b4f7b091c3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Processing event network-vif-plugged-d6646594-6084-469b-933d-86261a4e465e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.650 183079 DEBUG nova.compute.manager [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.654 183079 DEBUG nova.virt.libvirt.driver [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.658 183079 INFO nova.virt.libvirt.driver [-] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Instance spawned successfully.
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.658 183079 DEBUG nova.virt.libvirt.driver [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.662 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.665 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.679 183079 DEBUG nova.virt.libvirt.driver [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.679 183079 DEBUG nova.virt.libvirt.driver [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.679 183079 DEBUG nova.virt.libvirt.driver [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.680 183079 DEBUG nova.virt.libvirt.driver [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.680 183079 DEBUG nova.virt.libvirt.driver [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.680 183079 DEBUG nova.virt.libvirt.driver [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.684 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.684 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102404.64283, d424f9c2-6a07-485c-9feb-fd1f6a145be8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.685 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] VM Paused (Lifecycle Event)
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.714 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.718 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102404.6545682, d424f9c2-6a07-485c-9feb-fd1f6a145be8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.719 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] VM Resumed (Lifecycle Event)
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.737 183079 INFO nova.compute.manager [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Took 3.46 seconds to spawn the instance on the hypervisor.
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.737 183079 DEBUG nova.compute.manager [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.738 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.746 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.790 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.848 183079 INFO nova.compute.manager [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Took 4.06 seconds to build instance.
Jan 22 17:20:04 compute-0 nova_compute[183075]: 2026-01-22 17:20:04.870 183079 DEBUG oslo_concurrency.lockutils [None req-15e2e9bb-630d-4f6c-91cb-11ae6d420528 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "d424f9c2-6a07-485c-9feb-fd1f6a145be8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:20:05 compute-0 nova_compute[183075]: 2026-01-22 17:20:05.286 183079 DEBUG nova.network.neutron [req-ed737a3d-e652-4bce-91d3-176b92e3fd98 req-2c709d37-f811-432a-81a6-cecb6320c163 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Updated VIF entry in instance network info cache for port d6646594-6084-469b-933d-86261a4e465e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:20:05 compute-0 nova_compute[183075]: 2026-01-22 17:20:05.287 183079 DEBUG nova.network.neutron [req-ed737a3d-e652-4bce-91d3-176b92e3fd98 req-2c709d37-f811-432a-81a6-cecb6320c163 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Updating instance_info_cache with network_info: [{"id": "d6646594-6084-469b-933d-86261a4e465e", "address": "fa:16:3e:85:37:b5", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6646594-60", "ovs_interfaceid": "d6646594-6084-469b-933d-86261a4e465e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:20:05 compute-0 nova_compute[183075]: 2026-01-22 17:20:05.304 183079 DEBUG oslo_concurrency.lockutils [req-ed737a3d-e652-4bce-91d3-176b92e3fd98 req-2c709d37-f811-432a-81a6-cecb6320c163 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-d424f9c2-6a07-485c-9feb-fd1f6a145be8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:20:06 compute-0 nova_compute[183075]: 2026-01-22 17:20:06.102 183079 INFO nova.compute.manager [None req-246d7aba-1076-4c56-b322-248aeb71c374 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Get console output
Jan 22 17:20:06 compute-0 nova_compute[183075]: 2026-01-22 17:20:06.523 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:06 compute-0 nova_compute[183075]: 2026-01-22 17:20:06.726 183079 DEBUG nova.compute.manager [req-71bdf7fa-62b1-483f-8066-071841f3de1a req-63390c83-f1d1-4742-bea2-9c06c9b1c98f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Received event network-vif-plugged-d6646594-6084-469b-933d-86261a4e465e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:20:06 compute-0 nova_compute[183075]: 2026-01-22 17:20:06.726 183079 DEBUG oslo_concurrency.lockutils [req-71bdf7fa-62b1-483f-8066-071841f3de1a req-63390c83-f1d1-4742-bea2-9c06c9b1c98f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "d424f9c2-6a07-485c-9feb-fd1f6a145be8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:20:06 compute-0 nova_compute[183075]: 2026-01-22 17:20:06.726 183079 DEBUG oslo_concurrency.lockutils [req-71bdf7fa-62b1-483f-8066-071841f3de1a req-63390c83-f1d1-4742-bea2-9c06c9b1c98f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "d424f9c2-6a07-485c-9feb-fd1f6a145be8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:20:06 compute-0 nova_compute[183075]: 2026-01-22 17:20:06.727 183079 DEBUG oslo_concurrency.lockutils [req-71bdf7fa-62b1-483f-8066-071841f3de1a req-63390c83-f1d1-4742-bea2-9c06c9b1c98f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "d424f9c2-6a07-485c-9feb-fd1f6a145be8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:20:06 compute-0 nova_compute[183075]: 2026-01-22 17:20:06.727 183079 DEBUG nova.compute.manager [req-71bdf7fa-62b1-483f-8066-071841f3de1a req-63390c83-f1d1-4742-bea2-9c06c9b1c98f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] No waiting events found dispatching network-vif-plugged-d6646594-6084-469b-933d-86261a4e465e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:20:06 compute-0 nova_compute[183075]: 2026-01-22 17:20:06.727 183079 WARNING nova.compute.manager [req-71bdf7fa-62b1-483f-8066-071841f3de1a req-63390c83-f1d1-4742-bea2-9c06c9b1c98f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Received unexpected event network-vif-plugged-d6646594-6084-469b-933d-86261a4e465e for instance with vm_state active and task_state None.
Jan 22 17:20:07 compute-0 nova_compute[183075]: 2026-01-22 17:20:07.470 183079 INFO nova.compute.manager [None req-cd23a512-5ef1-46a2-9436-2da84d851595 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Get console output
Jan 22 17:20:07 compute-0 nova_compute[183075]: 2026-01-22 17:20:07.477 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:20:08 compute-0 nova_compute[183075]: 2026-01-22 17:20:08.816 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:20:08 compute-0 nova_compute[183075]: 2026-01-22 17:20:08.915 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:11 compute-0 nova_compute[183075]: 2026-01-22 17:20:11.309 183079 INFO nova.compute.manager [None req-70c02ed5-d21b-4aa4-8474-70e1768faa64 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Get console output
Jan 22 17:20:11 compute-0 nova_compute[183075]: 2026-01-22 17:20:11.526 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:11 compute-0 podman[226464]: 2026-01-22 17:20:11.625390739 +0000 UTC m=+0.058621318 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 17:20:11 compute-0 podman[226465]: 2026-01-22 17:20:11.638906425 +0000 UTC m=+0.066504705 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, vcs-type=git, version=9.6)
Jan 22 17:20:11 compute-0 podman[226463]: 2026-01-22 17:20:11.660855373 +0000 UTC m=+0.096648999 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 17:20:12 compute-0 sshd-session[226461]: Received disconnect from 91.224.92.108 port 24616:11:  [preauth]
Jan 22 17:20:12 compute-0 sshd-session[226461]: Disconnected from authenticating user root 91.224.92.108 port 24616 [preauth]
Jan 22 17:20:12 compute-0 nova_compute[183075]: 2026-01-22 17:20:12.632 183079 INFO nova.compute.manager [None req-1be3b2e0-827e-45ff-98ef-e43347623561 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Get console output
Jan 22 17:20:12 compute-0 nova_compute[183075]: 2026-01-22 17:20:12.639 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:20:13 compute-0 nova_compute[183075]: 2026-01-22 17:20:13.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:20:13 compute-0 nova_compute[183075]: 2026-01-22 17:20:13.917 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:14 compute-0 nova_compute[183075]: 2026-01-22 17:20:14.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:20:16 compute-0 nova_compute[183075]: 2026-01-22 17:20:16.460 183079 INFO nova.compute.manager [None req-fcb6b33f-819b-455a-aac7-1c6cb3567f89 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Get console output
Jan 22 17:20:16 compute-0 nova_compute[183075]: 2026-01-22 17:20:16.466 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:20:16 compute-0 nova_compute[183075]: 2026-01-22 17:20:16.529 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:16 compute-0 nova_compute[183075]: 2026-01-22 17:20:16.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:20:17 compute-0 podman[226555]: 2026-01-22 17:20:17.371086481 +0000 UTC m=+0.080821693 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, 
org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 17:20:17 compute-0 nova_compute[183075]: 2026-01-22 17:20:17.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:20:18 compute-0 nova_compute[183075]: 2026-01-22 17:20:18.087 183079 INFO nova.compute.manager [None req-c275b38b-c2c1-4b2d-b61a-8d15c973d05b 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Get console output
Jan 22 17:20:18 compute-0 nova_compute[183075]: 2026-01-22 17:20:18.095 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:20:18 compute-0 ovn_controller[95372]: 2026-01-22T17:20:18Z|00072|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:85:37:b5 10.100.0.8
Jan 22 17:20:18 compute-0 ovn_controller[95372]: 2026-01-22T17:20:18Z|00073|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:85:37:b5 10.100.0.8
Jan 22 17:20:18 compute-0 nova_compute[183075]: 2026-01-22 17:20:18.922 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:19 compute-0 nova_compute[183075]: 2026-01-22 17:20:19.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:20:19 compute-0 nova_compute[183075]: 2026-01-22 17:20:19.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:20:20 compute-0 nova_compute[183075]: 2026-01-22 17:20:20.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:20:20 compute-0 nova_compute[183075]: 2026-01-22 17:20:20.820 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:20:20 compute-0 nova_compute[183075]: 2026-01-22 17:20:20.821 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:20:20 compute-0 nova_compute[183075]: 2026-01-22 17:20:20.821 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:20:20 compute-0 nova_compute[183075]: 2026-01-22 17:20:20.821 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:20:20 compute-0 nova_compute[183075]: 2026-01-22 17:20:20.968 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4c0edb4-a206-4617-9465-58c87dcdf7d5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:20:21 compute-0 nova_compute[183075]: 2026-01-22 17:20:21.065 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4c0edb4-a206-4617-9465-58c87dcdf7d5/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:20:21 compute-0 nova_compute[183075]: 2026-01-22 17:20:21.066 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4c0edb4-a206-4617-9465-58c87dcdf7d5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:20:21 compute-0 nova_compute[183075]: 2026-01-22 17:20:21.131 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4c0edb4-a206-4617-9465-58c87dcdf7d5/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:20:21 compute-0 nova_compute[183075]: 2026-01-22 17:20:21.138 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:20:21 compute-0 nova_compute[183075]: 2026-01-22 17:20:21.203 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:20:21 compute-0 nova_compute[183075]: 2026-01-22 17:20:21.204 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:20:21 compute-0 nova_compute[183075]: 2026-01-22 17:20:21.267 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/01065d98-95b1-4364-b9dc-eaf2c3a6d8f8/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:20:21 compute-0 nova_compute[183075]: 2026-01-22 17:20:21.275 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d424f9c2-6a07-485c-9feb-fd1f6a145be8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:20:21 compute-0 nova_compute[183075]: 2026-01-22 17:20:21.331 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d424f9c2-6a07-485c-9feb-fd1f6a145be8/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:20:21 compute-0 nova_compute[183075]: 2026-01-22 17:20:21.332 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d424f9c2-6a07-485c-9feb-fd1f6a145be8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:20:21 compute-0 nova_compute[183075]: 2026-01-22 17:20:21.390 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d424f9c2-6a07-485c-9feb-fd1f6a145be8/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:20:21 compute-0 nova_compute[183075]: 2026-01-22 17:20:21.532 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:21 compute-0 nova_compute[183075]: 2026-01-22 17:20:21.614 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:20:21 compute-0 nova_compute[183075]: 2026-01-22 17:20:21.615 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5187MB free_disk=73.28284454345703GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:20:21 compute-0 nova_compute[183075]: 2026-01-22 17:20:21.615 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:20:21 compute-0 nova_compute[183075]: 2026-01-22 17:20:21.616 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:20:21 compute-0 nova_compute[183075]: 2026-01-22 17:20:21.808 183079 INFO nova.compute.manager [None req-2005cfe2-b887-40f5-8085-ff2056494f31 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Get console output
Jan 22 17:20:21 compute-0 nova_compute[183075]: 2026-01-22 17:20:21.813 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:20:21 compute-0 nova_compute[183075]: 2026-01-22 17:20:21.891 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance c4c0edb4-a206-4617-9465-58c87dcdf7d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:20:21 compute-0 nova_compute[183075]: 2026-01-22 17:20:21.892 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:20:21 compute-0 nova_compute[183075]: 2026-01-22 17:20:21.892 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance d424f9c2-6a07-485c-9feb-fd1f6a145be8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:20:21 compute-0 nova_compute[183075]: 2026-01-22 17:20:21.892 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:20:21 compute-0 nova_compute[183075]: 2026-01-22 17:20:21.892 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:20:21 compute-0 nova_compute[183075]: 2026-01-22 17:20:21.965 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:20:22 compute-0 nova_compute[183075]: 2026-01-22 17:20:22.198 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:20:22 compute-0 nova_compute[183075]: 2026-01-22 17:20:22.247 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:20:22 compute-0 nova_compute[183075]: 2026-01-22 17:20:22.248 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.632s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:20:23 compute-0 nova_compute[183075]: 2026-01-22 17:20:23.250 183079 INFO nova.compute.manager [None req-e365c6d9-8166-42f3-8d6e-33c18b92636f 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Get console output
Jan 22 17:20:23 compute-0 nova_compute[183075]: 2026-01-22 17:20:23.251 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:20:23 compute-0 nova_compute[183075]: 2026-01-22 17:20:23.252 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:20:23 compute-0 nova_compute[183075]: 2026-01-22 17:20:23.252 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:20:23 compute-0 nova_compute[183075]: 2026-01-22 17:20:23.259 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:20:23 compute-0 nova_compute[183075]: 2026-01-22 17:20:23.927 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:24.240 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:20:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:24.243 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:20:24 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:20:24 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:20:24 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:20:24 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:20:24 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:20:24 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:20:24 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:20:24 compute-0 nova_compute[183075]: 2026-01-22 17:20:24.268 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-c4c0edb4-a206-4617-9465-58c87dcdf7d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:20:24 compute-0 nova_compute[183075]: 2026-01-22 17:20:24.269 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-c4c0edb4-a206-4617-9465-58c87dcdf7d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:20:24 compute-0 nova_compute[183075]: 2026-01-22 17:20:24.269 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:20:24 compute-0 nova_compute[183075]: 2026-01-22 17:20:24.270 183079 DEBUG nova.objects.instance [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c4c0edb4-a206-4617-9465-58c87dcdf7d5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:20:24 compute-0 podman[226595]: 2026-01-22 17:20:24.370367167 +0000 UTC m=+0.062564411 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.448 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.449 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 1.2071636
Jan 22 17:20:25 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[226094]: 10.100.0.8:50150 [22/Jan/2026:17:20:24.239] listener listener/metadata 0/0/0/1210/1210 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.458 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.459 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.474 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.474 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 168 time: 0.0158067
Jan 22 17:20:25 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[226094]: 10.100.0.8:50166 [22/Jan/2026:17:20:25.457] listener listener/metadata 0/0/0/16/16 200 152 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.478 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.479 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.494 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.494 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0155540
Jan 22 17:20:25 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[226094]: 10.100.0.8:50172 [22/Jan/2026:17:20:25.478] listener listener/metadata 0/0/0/16/16 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.499 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.500 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.514 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.514 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0142570
Jan 22 17:20:25 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[226094]: 10.100.0.8:50188 [22/Jan/2026:17:20:25.499] listener listener/metadata 0/0/0/15/15 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.519 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.519 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.540 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.540 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0207613
Jan 22 17:20:25 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[226094]: 10.100.0.8:50196 [22/Jan/2026:17:20:25.518] listener listener/metadata 0/0/0/22/22 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.545 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.545 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.562 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:20:25 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[226094]: 10.100.0.8:50200 [22/Jan/2026:17:20:25.544] listener listener/metadata 0/0/0/18/18 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.563 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0175333
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.567 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.568 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.586 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.586 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0185101
Jan 22 17:20:25 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[226094]: 10.100.0.8:50202 [22/Jan/2026:17:20:25.567] listener listener/metadata 0/0/0/19/19 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.593 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.594 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.609 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.610 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 151 time: 0.0163858
Jan 22 17:20:25 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[226094]: 10.100.0.8:50206 [22/Jan/2026:17:20:25.592] listener listener/metadata 0/0/0/17/17 200 135 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.616 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.616 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.643 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.644 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 165 time: 0.0274494
Jan 22 17:20:25 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[226094]: 10.100.0.8:50216 [22/Jan/2026:17:20:25.615] listener listener/metadata 0/0/0/28/28 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.649 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.649 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.664 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.665 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 165 time: 0.0153952
Jan 22 17:20:25 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[226094]: 10.100.0.8:50222 [22/Jan/2026:17:20:25.648] listener listener/metadata 0/0/0/16/16 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.669 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.670 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:20:25 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[226094]: 10.100.0.8:50230 [22/Jan/2026:17:20:25.668] listener listener/metadata 0/0/0/21/21 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.690 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0205019
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.708 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.709 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.727 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:20:25 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[226094]: 10.100.0.8:50246 [22/Jan/2026:17:20:25.707] listener listener/metadata 0/0/0/19/19 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.728 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0185564
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.733 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.734 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.751 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.752 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0182703
Jan 22 17:20:25 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[226094]: 10.100.0.8:50258 [22/Jan/2026:17:20:25.732] listener listener/metadata 0/0/0/19/19 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.758 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.759 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.782 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.783 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0241067
Jan 22 17:20:25 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[226094]: 10.100.0.8:50274 [22/Jan/2026:17:20:25.757] listener listener/metadata 0/0/0/25/25 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.787 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.788 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.806 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:20:25 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[226094]: 10.100.0.8:50288 [22/Jan/2026:17:20:25.787] listener listener/metadata 0/0/0/20/20 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.807 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 165 time: 0.0189090
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.812 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.814 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.8
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.838 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:20:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:25.839 104990 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0252776
Jan 22 17:20:25 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[226094]: 10.100.0.8:50292 [22/Jan/2026:17:20:25.812] listener listener/metadata 0/0/0/26/26 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:20:26 compute-0 nova_compute[183075]: 2026-01-22 17:20:26.534 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:26 compute-0 nova_compute[183075]: 2026-01-22 17:20:26.993 183079 INFO nova.compute.manager [None req-daa2b192-c191-42ac-b06f-5dba351631ea 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Get console output
Jan 22 17:20:27 compute-0 nova_compute[183075]: 2026-01-22 17:20:27.001 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:20:28 compute-0 nova_compute[183075]: 2026-01-22 17:20:28.299 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Updating instance_info_cache with network_info: [{"id": "f801fa72-ebf3-48b4-a510-f75bbe40e687", "address": "fa:16:3e:1b:b8:fd", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf801fa72-eb", "ovs_interfaceid": "f801fa72-ebf3-48b4-a510-f75bbe40e687", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:20:28 compute-0 nova_compute[183075]: 2026-01-22 17:20:28.435 183079 INFO nova.compute.manager [None req-5d7f3dd1-58e9-4ebc-889f-1b789a6b46ad 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Get console output
Jan 22 17:20:28 compute-0 nova_compute[183075]: 2026-01-22 17:20:28.442 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:20:28 compute-0 nova_compute[183075]: 2026-01-22 17:20:28.490 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-c4c0edb4-a206-4617-9465-58c87dcdf7d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:20:28 compute-0 nova_compute[183075]: 2026-01-22 17:20:28.491 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:20:28 compute-0 nova_compute[183075]: 2026-01-22 17:20:28.492 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:20:28 compute-0 nova_compute[183075]: 2026-01-22 17:20:28.932 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:31 compute-0 nova_compute[183075]: 2026-01-22 17:20:31.535 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:32 compute-0 nova_compute[183075]: 2026-01-22 17:20:32.125 183079 INFO nova.compute.manager [None req-a5f16d37-6989-4b70-9a43-23d639d9190d 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Get console output
Jan 22 17:20:32 compute-0 nova_compute[183075]: 2026-01-22 17:20:32.129 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:20:32 compute-0 podman[226619]: 2026-01-22 17:20:32.369592527 +0000 UTC m=+0.083738400 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:20:33 compute-0 nova_compute[183075]: 2026-01-22 17:20:33.181 183079 DEBUG oslo_concurrency.lockutils [None req-a8b7bdbf-0d43-4407-bb34-0da06f336cc1 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "d424f9c2-6a07-485c-9feb-fd1f6a145be8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:20:33 compute-0 nova_compute[183075]: 2026-01-22 17:20:33.182 183079 DEBUG oslo_concurrency.lockutils [None req-a8b7bdbf-0d43-4407-bb34-0da06f336cc1 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "d424f9c2-6a07-485c-9feb-fd1f6a145be8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:20:33 compute-0 nova_compute[183075]: 2026-01-22 17:20:33.183 183079 DEBUG oslo_concurrency.lockutils [None req-a8b7bdbf-0d43-4407-bb34-0da06f336cc1 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "d424f9c2-6a07-485c-9feb-fd1f6a145be8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:20:33 compute-0 nova_compute[183075]: 2026-01-22 17:20:33.183 183079 DEBUG oslo_concurrency.lockutils [None req-a8b7bdbf-0d43-4407-bb34-0da06f336cc1 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "d424f9c2-6a07-485c-9feb-fd1f6a145be8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:20:33 compute-0 nova_compute[183075]: 2026-01-22 17:20:33.184 183079 DEBUG oslo_concurrency.lockutils [None req-a8b7bdbf-0d43-4407-bb34-0da06f336cc1 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "d424f9c2-6a07-485c-9feb-fd1f6a145be8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:20:33 compute-0 nova_compute[183075]: 2026-01-22 17:20:33.186 183079 INFO nova.compute.manager [None req-a8b7bdbf-0d43-4407-bb34-0da06f336cc1 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Terminating instance
Jan 22 17:20:33 compute-0 nova_compute[183075]: 2026-01-22 17:20:33.188 183079 DEBUG nova.compute.manager [None req-a8b7bdbf-0d43-4407-bb34-0da06f336cc1 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:20:33 compute-0 kernel: tapd6646594-60 (unregistering): left promiscuous mode
Jan 22 17:20:33 compute-0 NetworkManager[55454]: <info>  [1769102433.2062] device (tapd6646594-60): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:20:33 compute-0 nova_compute[183075]: 2026-01-22 17:20:33.214 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:33 compute-0 ovn_controller[95372]: 2026-01-22T17:20:33Z|00422|binding|INFO|Releasing lport d6646594-6084-469b-933d-86261a4e465e from this chassis (sb_readonly=0)
Jan 22 17:20:33 compute-0 ovn_controller[95372]: 2026-01-22T17:20:33Z|00423|binding|INFO|Setting lport d6646594-6084-469b-933d-86261a4e465e down in Southbound
Jan 22 17:20:33 compute-0 ovn_controller[95372]: 2026-01-22T17:20:33Z|00424|binding|INFO|Removing iface tapd6646594-60 ovn-installed in OVS
Jan 22 17:20:33 compute-0 nova_compute[183075]: 2026-01-22 17:20:33.217 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:33.223 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:85:37:b5 10.100.0.8'], port_security=['fa:16:3e:85:37:b5 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'd424f9c2-6a07-485c-9feb-fd1f6a145be8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eee918a6-66b2-47ae-b702-620a23ef395b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02818155e7af4645bc909d4ba671f11f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c960fc2f-fbcd-48ff-b6bd-d3b900f07332', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.192', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bad11b1d-0e62-4467-b3f8-4f80f0fd75da, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=d6646594-6084-469b-933d-86261a4e465e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:20:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:33.225 104629 INFO neutron.agent.ovn.metadata.agent [-] Port d6646594-6084-469b-933d-86261a4e465e in datapath eee918a6-66b2-47ae-b702-620a23ef395b unbound from our chassis
Jan 22 17:20:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:33.226 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network eee918a6-66b2-47ae-b702-620a23ef395b
Jan 22 17:20:33 compute-0 nova_compute[183075]: 2026-01-22 17:20:33.228 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:33.242 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[41423038-db30-473b-bb80-9b915c63dd10]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:20:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:33.274 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[880d47ad-68f2-4afa-965e-d063f4d025b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:20:33 compute-0 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000023.scope: Deactivated successfully.
Jan 22 17:20:33 compute-0 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000023.scope: Consumed 12.884s CPU time.
Jan 22 17:20:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:33.278 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[d7dae1b8-1cec-40a1-af5d-c2b2a641c122]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:20:33 compute-0 systemd-machined[154382]: Machine qemu-35-instance-00000023 terminated.
Jan 22 17:20:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:33.307 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[3629cbca-e0c9-4350-80fc-6c13cbffc29c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:20:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:33.323 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7a154b41-86a9-48d5-9e35-953d65852249]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeee918a6-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:e2:7e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 202, 'tx_packets': 106, 'rx_bytes': 17308, 'tx_bytes': 12085, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 202, 'tx_packets': 106, 'rx_bytes': 17308, 'tx_bytes': 12085, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 115], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473340, 'reachable_time': 17462, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226655, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:20:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:33.342 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[152b4f8d-a3e5-4ed0-9127-81a4f2dbf01a]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapeee918a6-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 473349, 'tstamp': 473349}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226656, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapeee918a6-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 473351, 'tstamp': 473351}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226656, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:20:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:33.344 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeee918a6-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:20:33 compute-0 nova_compute[183075]: 2026-01-22 17:20:33.346 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:33 compute-0 nova_compute[183075]: 2026-01-22 17:20:33.350 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:33.350 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeee918a6-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:20:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:33.350 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:20:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:33.351 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapeee918a6-60, col_values=(('external_ids', {'iface-id': '15d4de90-41f4-4532-aebd-197c2a33c6d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:20:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:33.351 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:20:33 compute-0 nova_compute[183075]: 2026-01-22 17:20:33.410 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:33 compute-0 nova_compute[183075]: 2026-01-22 17:20:33.416 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:33 compute-0 nova_compute[183075]: 2026-01-22 17:20:33.443 183079 INFO nova.virt.libvirt.driver [-] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Instance destroyed successfully.
Jan 22 17:20:33 compute-0 nova_compute[183075]: 2026-01-22 17:20:33.444 183079 DEBUG nova.objects.instance [None req-a8b7bdbf-0d43-4407-bb34-0da06f336cc1 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lazy-loading 'resources' on Instance uuid d424f9c2-6a07-485c-9feb-fd1f6a145be8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:20:33 compute-0 nova_compute[183075]: 2026-01-22 17:20:33.461 183079 DEBUG nova.virt.libvirt.vif [None req-a8b7bdbf-0d43-4407-bb34-0da06f336cc1 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:19:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-534742582',display_name='tempest-server-test-534742582',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-534742582',id=35,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBZ1addGiefngOIZky2bZbl/LpQHm8ydIPPcYG47RZ8D1pquJ/FsAcVn7mNe0QyfiQPXvuSs1eSw/jGOy+x3K8tzxt5N8fIelXKcTizhDHr81ws2uYn+dW6F9Hiy6pJp3A==',key_name='tempest-keypair-test-453437320',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:20:04Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='02818155e7af4645bc909d4ba671f11f',ramdisk_id='',reservation_id='r-kzu0013k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_in
put_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIpSameNetwork-953620552',owner_user_name='tempest-FloatingIpSameNetwork-953620552-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:20:04Z,user_data=None,user_id='1148a46489e842e6a0c7660c54567798',uuid=d424f9c2-6a07-485c-9feb-fd1f6a145be8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d6646594-6084-469b-933d-86261a4e465e", "address": "fa:16:3e:85:37:b5", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6646594-60", "ovs_interfaceid": "d6646594-6084-469b-933d-86261a4e465e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:20:33 compute-0 nova_compute[183075]: 2026-01-22 17:20:33.462 183079 DEBUG nova.network.os_vif_util [None req-a8b7bdbf-0d43-4407-bb34-0da06f336cc1 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converting VIF {"id": "d6646594-6084-469b-933d-86261a4e465e", "address": "fa:16:3e:85:37:b5", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6646594-60", "ovs_interfaceid": "d6646594-6084-469b-933d-86261a4e465e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:20:33 compute-0 nova_compute[183075]: 2026-01-22 17:20:33.463 183079 DEBUG nova.network.os_vif_util [None req-a8b7bdbf-0d43-4407-bb34-0da06f336cc1 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:85:37:b5,bridge_name='br-int',has_traffic_filtering=True,id=d6646594-6084-469b-933d-86261a4e465e,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd6646594-60') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:20:33 compute-0 nova_compute[183075]: 2026-01-22 17:20:33.463 183079 DEBUG os_vif [None req-a8b7bdbf-0d43-4407-bb34-0da06f336cc1 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:37:b5,bridge_name='br-int',has_traffic_filtering=True,id=d6646594-6084-469b-933d-86261a4e465e,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd6646594-60') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:20:33 compute-0 nova_compute[183075]: 2026-01-22 17:20:33.465 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:33 compute-0 nova_compute[183075]: 2026-01-22 17:20:33.465 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6646594-60, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:20:33 compute-0 nova_compute[183075]: 2026-01-22 17:20:33.467 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:33 compute-0 nova_compute[183075]: 2026-01-22 17:20:33.470 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:20:33 compute-0 nova_compute[183075]: 2026-01-22 17:20:33.473 183079 INFO os_vif [None req-a8b7bdbf-0d43-4407-bb34-0da06f336cc1 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:37:b5,bridge_name='br-int',has_traffic_filtering=True,id=d6646594-6084-469b-933d-86261a4e465e,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd6646594-60')
Jan 22 17:20:33 compute-0 nova_compute[183075]: 2026-01-22 17:20:33.474 183079 INFO nova.virt.libvirt.driver [None req-a8b7bdbf-0d43-4407-bb34-0da06f336cc1 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Deleting instance files /var/lib/nova/instances/d424f9c2-6a07-485c-9feb-fd1f6a145be8_del
Jan 22 17:20:33 compute-0 nova_compute[183075]: 2026-01-22 17:20:33.475 183079 INFO nova.virt.libvirt.driver [None req-a8b7bdbf-0d43-4407-bb34-0da06f336cc1 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Deletion of /var/lib/nova/instances/d424f9c2-6a07-485c-9feb-fd1f6a145be8_del complete
Jan 22 17:20:33 compute-0 nova_compute[183075]: 2026-01-22 17:20:33.523 183079 INFO nova.compute.manager [None req-a8b7bdbf-0d43-4407-bb34-0da06f336cc1 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Took 0.33 seconds to destroy the instance on the hypervisor.
Jan 22 17:20:33 compute-0 nova_compute[183075]: 2026-01-22 17:20:33.523 183079 DEBUG oslo.service.loopingcall [None req-a8b7bdbf-0d43-4407-bb34-0da06f336cc1 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:20:33 compute-0 nova_compute[183075]: 2026-01-22 17:20:33.523 183079 DEBUG nova.compute.manager [-] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:20:33 compute-0 nova_compute[183075]: 2026-01-22 17:20:33.523 183079 DEBUG nova.network.neutron [-] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:20:33 compute-0 nova_compute[183075]: 2026-01-22 17:20:33.589 183079 INFO nova.compute.manager [None req-e08f192b-c749-435f-8a34-2657b595b0c3 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Get console output
Jan 22 17:20:33 compute-0 nova_compute[183075]: 2026-01-22 17:20:33.594 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:20:34 compute-0 ovn_controller[95372]: 2026-01-22T17:20:34Z|00425|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Jan 22 17:20:35 compute-0 nova_compute[183075]: 2026-01-22 17:20:35.635 183079 INFO nova.compute.manager [None req-4d04fbf7-9134-4063-a625-0361a7ceefce 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Get console output
Jan 22 17:20:35 compute-0 nova_compute[183075]: 2026-01-22 17:20:35.637 183079 DEBUG nova.compute.manager [req-e5d1ac51-52c8-4ea2-9db5-ed92000d17ac req-67159665-9194-4cc1-8661-ebbdc059cf97 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Received event network-vif-unplugged-d6646594-6084-469b-933d-86261a4e465e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:20:35 compute-0 nova_compute[183075]: 2026-01-22 17:20:35.639 183079 DEBUG oslo_concurrency.lockutils [req-e5d1ac51-52c8-4ea2-9db5-ed92000d17ac req-67159665-9194-4cc1-8661-ebbdc059cf97 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "d424f9c2-6a07-485c-9feb-fd1f6a145be8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:20:35 compute-0 nova_compute[183075]: 2026-01-22 17:20:35.639 183079 DEBUG oslo_concurrency.lockutils [req-e5d1ac51-52c8-4ea2-9db5-ed92000d17ac req-67159665-9194-4cc1-8661-ebbdc059cf97 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "d424f9c2-6a07-485c-9feb-fd1f6a145be8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:20:35 compute-0 nova_compute[183075]: 2026-01-22 17:20:35.640 183079 DEBUG oslo_concurrency.lockutils [req-e5d1ac51-52c8-4ea2-9db5-ed92000d17ac req-67159665-9194-4cc1-8661-ebbdc059cf97 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "d424f9c2-6a07-485c-9feb-fd1f6a145be8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:20:35 compute-0 nova_compute[183075]: 2026-01-22 17:20:35.640 183079 DEBUG nova.compute.manager [req-e5d1ac51-52c8-4ea2-9db5-ed92000d17ac req-67159665-9194-4cc1-8661-ebbdc059cf97 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] No waiting events found dispatching network-vif-unplugged-d6646594-6084-469b-933d-86261a4e465e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:20:35 compute-0 nova_compute[183075]: 2026-01-22 17:20:35.641 183079 DEBUG nova.compute.manager [req-e5d1ac51-52c8-4ea2-9db5-ed92000d17ac req-67159665-9194-4cc1-8661-ebbdc059cf97 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Received event network-vif-unplugged-d6646594-6084-469b-933d-86261a4e465e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:20:35 compute-0 nova_compute[183075]: 2026-01-22 17:20:35.649 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:20:36 compute-0 nova_compute[183075]: 2026-01-22 17:20:36.537 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:38 compute-0 nova_compute[183075]: 2026-01-22 17:20:38.467 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:38 compute-0 nova_compute[183075]: 2026-01-22 17:20:38.539 183079 DEBUG nova.compute.manager [req-f2842c75-ce16-4a8b-8b5a-185411077603 req-1f485cb0-fefc-4a91-bd3f-f400dffaeb3c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Received event network-vif-plugged-d6646594-6084-469b-933d-86261a4e465e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:20:38 compute-0 nova_compute[183075]: 2026-01-22 17:20:38.540 183079 DEBUG oslo_concurrency.lockutils [req-f2842c75-ce16-4a8b-8b5a-185411077603 req-1f485cb0-fefc-4a91-bd3f-f400dffaeb3c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "d424f9c2-6a07-485c-9feb-fd1f6a145be8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:20:38 compute-0 nova_compute[183075]: 2026-01-22 17:20:38.541 183079 DEBUG oslo_concurrency.lockutils [req-f2842c75-ce16-4a8b-8b5a-185411077603 req-1f485cb0-fefc-4a91-bd3f-f400dffaeb3c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "d424f9c2-6a07-485c-9feb-fd1f6a145be8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:20:38 compute-0 nova_compute[183075]: 2026-01-22 17:20:38.541 183079 DEBUG oslo_concurrency.lockutils [req-f2842c75-ce16-4a8b-8b5a-185411077603 req-1f485cb0-fefc-4a91-bd3f-f400dffaeb3c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "d424f9c2-6a07-485c-9feb-fd1f6a145be8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:20:38 compute-0 nova_compute[183075]: 2026-01-22 17:20:38.542 183079 DEBUG nova.compute.manager [req-f2842c75-ce16-4a8b-8b5a-185411077603 req-1f485cb0-fefc-4a91-bd3f-f400dffaeb3c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] No waiting events found dispatching network-vif-plugged-d6646594-6084-469b-933d-86261a4e465e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:20:38 compute-0 nova_compute[183075]: 2026-01-22 17:20:38.542 183079 WARNING nova.compute.manager [req-f2842c75-ce16-4a8b-8b5a-185411077603 req-1f485cb0-fefc-4a91-bd3f-f400dffaeb3c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Received unexpected event network-vif-plugged-d6646594-6084-469b-933d-86261a4e465e for instance with vm_state active and task_state deleting.
Jan 22 17:20:39 compute-0 nova_compute[183075]: 2026-01-22 17:20:39.216 183079 DEBUG nova.network.neutron [-] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:20:39 compute-0 nova_compute[183075]: 2026-01-22 17:20:39.282 183079 INFO nova.compute.manager [-] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Took 5.76 seconds to deallocate network for instance.
Jan 22 17:20:39 compute-0 nova_compute[183075]: 2026-01-22 17:20:39.350 183079 DEBUG oslo_concurrency.lockutils [None req-a8b7bdbf-0d43-4407-bb34-0da06f336cc1 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:20:39 compute-0 nova_compute[183075]: 2026-01-22 17:20:39.350 183079 DEBUG oslo_concurrency.lockutils [None req-a8b7bdbf-0d43-4407-bb34-0da06f336cc1 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:20:39 compute-0 nova_compute[183075]: 2026-01-22 17:20:39.622 183079 DEBUG nova.compute.provider_tree [None req-a8b7bdbf-0d43-4407-bb34-0da06f336cc1 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:20:39 compute-0 nova_compute[183075]: 2026-01-22 17:20:39.643 183079 DEBUG nova.scheduler.client.report [None req-a8b7bdbf-0d43-4407-bb34-0da06f336cc1 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:20:39 compute-0 nova_compute[183075]: 2026-01-22 17:20:39.676 183079 DEBUG oslo_concurrency.lockutils [None req-a8b7bdbf-0d43-4407-bb34-0da06f336cc1 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.326s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:20:40 compute-0 nova_compute[183075]: 2026-01-22 17:20:40.115 183079 INFO nova.scheduler.client.report [None req-a8b7bdbf-0d43-4407-bb34-0da06f336cc1 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Deleted allocations for instance d424f9c2-6a07-485c-9feb-fd1f6a145be8
Jan 22 17:20:40 compute-0 nova_compute[183075]: 2026-01-22 17:20:40.342 183079 DEBUG oslo_concurrency.lockutils [None req-a8b7bdbf-0d43-4407-bb34-0da06f336cc1 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "d424f9c2-6a07-485c-9feb-fd1f6a145be8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:20:41 compute-0 nova_compute[183075]: 2026-01-22 17:20:41.539 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:41 compute-0 nova_compute[183075]: 2026-01-22 17:20:41.549 183079 DEBUG oslo_concurrency.lockutils [None req-20d67897-a51d-4f17-8403-8c63f6908549 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "c4c0edb4-a206-4617-9465-58c87dcdf7d5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:20:41 compute-0 nova_compute[183075]: 2026-01-22 17:20:41.550 183079 DEBUG oslo_concurrency.lockutils [None req-20d67897-a51d-4f17-8403-8c63f6908549 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "c4c0edb4-a206-4617-9465-58c87dcdf7d5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:20:41 compute-0 nova_compute[183075]: 2026-01-22 17:20:41.550 183079 DEBUG oslo_concurrency.lockutils [None req-20d67897-a51d-4f17-8403-8c63f6908549 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "c4c0edb4-a206-4617-9465-58c87dcdf7d5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:20:41 compute-0 nova_compute[183075]: 2026-01-22 17:20:41.550 183079 DEBUG oslo_concurrency.lockutils [None req-20d67897-a51d-4f17-8403-8c63f6908549 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "c4c0edb4-a206-4617-9465-58c87dcdf7d5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:20:41 compute-0 nova_compute[183075]: 2026-01-22 17:20:41.551 183079 DEBUG oslo_concurrency.lockutils [None req-20d67897-a51d-4f17-8403-8c63f6908549 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "c4c0edb4-a206-4617-9465-58c87dcdf7d5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:20:41 compute-0 nova_compute[183075]: 2026-01-22 17:20:41.552 183079 INFO nova.compute.manager [None req-20d67897-a51d-4f17-8403-8c63f6908549 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Terminating instance
Jan 22 17:20:41 compute-0 nova_compute[183075]: 2026-01-22 17:20:41.553 183079 DEBUG nova.compute.manager [None req-20d67897-a51d-4f17-8403-8c63f6908549 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:20:41 compute-0 kernel: tapf801fa72-eb (unregistering): left promiscuous mode
Jan 22 17:20:41 compute-0 NetworkManager[55454]: <info>  [1769102441.5772] device (tapf801fa72-eb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:20:41 compute-0 ovn_controller[95372]: 2026-01-22T17:20:41Z|00426|binding|INFO|Releasing lport f801fa72-ebf3-48b4-a510-f75bbe40e687 from this chassis (sb_readonly=0)
Jan 22 17:20:41 compute-0 ovn_controller[95372]: 2026-01-22T17:20:41Z|00427|binding|INFO|Setting lport f801fa72-ebf3-48b4-a510-f75bbe40e687 down in Southbound
Jan 22 17:20:41 compute-0 nova_compute[183075]: 2026-01-22 17:20:41.586 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:41 compute-0 ovn_controller[95372]: 2026-01-22T17:20:41Z|00428|binding|INFO|Removing iface tapf801fa72-eb ovn-installed in OVS
Jan 22 17:20:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:41.594 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1b:b8:fd 10.100.0.6'], port_security=['fa:16:3e:1b:b8:fd 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'c4c0edb4-a206-4617-9465-58c87dcdf7d5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eee918a6-66b2-47ae-b702-620a23ef395b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02818155e7af4645bc909d4ba671f11f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c960fc2f-fbcd-48ff-b6bd-d3b900f07332', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.231', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bad11b1d-0e62-4467-b3f8-4f80f0fd75da, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=f801fa72-ebf3-48b4-a510-f75bbe40e687) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:20:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:41.595 104629 INFO neutron.agent.ovn.metadata.agent [-] Port f801fa72-ebf3-48b4-a510-f75bbe40e687 in datapath eee918a6-66b2-47ae-b702-620a23ef395b unbound from our chassis
Jan 22 17:20:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:41.597 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network eee918a6-66b2-47ae-b702-620a23ef395b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:20:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:41.598 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[eb0dae35-7a88-4c4f-8b91-eb6f3ce0db59]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:20:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:41.599 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b namespace which is not needed anymore
Jan 22 17:20:41 compute-0 nova_compute[183075]: 2026-01-22 17:20:41.600 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:41 compute-0 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000021.scope: Deactivated successfully.
Jan 22 17:20:41 compute-0 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000021.scope: Consumed 15.643s CPU time.
Jan 22 17:20:41 compute-0 systemd-machined[154382]: Machine qemu-33-instance-00000021 terminated.
Jan 22 17:20:41 compute-0 kernel: tapf801fa72-eb: entered promiscuous mode
Jan 22 17:20:41 compute-0 kernel: tapf801fa72-eb (unregistering): left promiscuous mode
Jan 22 17:20:41 compute-0 nova_compute[183075]: 2026-01-22 17:20:41.789 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:41 compute-0 nova_compute[183075]: 2026-01-22 17:20:41.828 183079 INFO nova.virt.libvirt.driver [-] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Instance destroyed successfully.
Jan 22 17:20:41 compute-0 nova_compute[183075]: 2026-01-22 17:20:41.829 183079 DEBUG nova.objects.instance [None req-20d67897-a51d-4f17-8403-8c63f6908549 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lazy-loading 'resources' on Instance uuid c4c0edb4-a206-4617-9465-58c87dcdf7d5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:20:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:41.936 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:20:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:41.937 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:20:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:41.938 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:20:42 compute-0 neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b[226071]: [NOTICE]   (226087) : haproxy version is 2.8.14-c23fe91
Jan 22 17:20:42 compute-0 neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b[226071]: [NOTICE]   (226087) : path to executable is /usr/sbin/haproxy
Jan 22 17:20:42 compute-0 neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b[226071]: [WARNING]  (226087) : Exiting Master process...
Jan 22 17:20:42 compute-0 neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b[226071]: [ALERT]    (226087) : Current worker (226094) exited with code 143 (Terminated)
Jan 22 17:20:42 compute-0 neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b[226071]: [WARNING]  (226087) : All workers exited. Exiting... (0)
Jan 22 17:20:42 compute-0 nova_compute[183075]: 2026-01-22 17:20:42.039 183079 DEBUG nova.virt.libvirt.vif [None req-20d67897-a51d-4f17-8403-8c63f6908549 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:19:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1941539204',display_name='tempest-server-test-1941539204',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1941539204',id=33,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBZ1addGiefngOIZky2bZbl/LpQHm8ydIPPcYG47RZ8D1pquJ/FsAcVn7mNe0QyfiQPXvuSs1eSw/jGOy+x3K8tzxt5N8fIelXKcTizhDHr81ws2uYn+dW6F9Hiy6pJp3A==',key_name='tempest-keypair-test-453437320',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:19:31Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='02818155e7af4645bc909d4ba671f11f',ramdisk_id='',reservation_id='r-dlb2hgpr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIpSameNetwork-953620552',owner_user_name='tempest-FloatingIpSameNetwork-953620552-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:19:31Z,user_data=None,user_id='1148a46489e842e6a0c7660c54567798',uuid=c4c0edb4-a206-4617-9465-58c87dcdf7d5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f801fa72-ebf3-48b4-a510-f75bbe40e687", "address": "fa:16:3e:1b:b8:fd", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf801fa72-eb", "ovs_interfaceid": "f801fa72-ebf3-48b4-a510-f75bbe40e687", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:20:42 compute-0 nova_compute[183075]: 2026-01-22 17:20:42.040 183079 DEBUG nova.network.os_vif_util [None req-20d67897-a51d-4f17-8403-8c63f6908549 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converting VIF {"id": "f801fa72-ebf3-48b4-a510-f75bbe40e687", "address": "fa:16:3e:1b:b8:fd", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf801fa72-eb", "ovs_interfaceid": "f801fa72-ebf3-48b4-a510-f75bbe40e687", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:20:42 compute-0 nova_compute[183075]: 2026-01-22 17:20:42.042 183079 DEBUG nova.network.os_vif_util [None req-20d67897-a51d-4f17-8403-8c63f6908549 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1b:b8:fd,bridge_name='br-int',has_traffic_filtering=True,id=f801fa72-ebf3-48b4-a510-f75bbe40e687,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapf801fa72-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:20:42 compute-0 systemd[1]: libpod-fcdc4fd31b0b112f1c951fd1ef2ec16d34bd8931479c922b5f45611715bede86.scope: Deactivated successfully.
Jan 22 17:20:42 compute-0 nova_compute[183075]: 2026-01-22 17:20:42.042 183079 DEBUG os_vif [None req-20d67897-a51d-4f17-8403-8c63f6908549 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1b:b8:fd,bridge_name='br-int',has_traffic_filtering=True,id=f801fa72-ebf3-48b4-a510-f75bbe40e687,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapf801fa72-eb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:20:42 compute-0 nova_compute[183075]: 2026-01-22 17:20:42.046 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:42 compute-0 nova_compute[183075]: 2026-01-22 17:20:42.046 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf801fa72-eb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:20:42 compute-0 podman[226697]: 2026-01-22 17:20:42.050058227 +0000 UTC m=+0.340384778 container died fcdc4fd31b0b112f1c951fd1ef2ec16d34bd8931479c922b5f45611715bede86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 17:20:42 compute-0 nova_compute[183075]: 2026-01-22 17:20:42.051 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:20:42 compute-0 nova_compute[183075]: 2026-01-22 17:20:42.055 183079 INFO os_vif [None req-20d67897-a51d-4f17-8403-8c63f6908549 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1b:b8:fd,bridge_name='br-int',has_traffic_filtering=True,id=f801fa72-ebf3-48b4-a510-f75bbe40e687,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapf801fa72-eb')
Jan 22 17:20:42 compute-0 nova_compute[183075]: 2026-01-22 17:20:42.056 183079 INFO nova.virt.libvirt.driver [None req-20d67897-a51d-4f17-8403-8c63f6908549 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Deleting instance files /var/lib/nova/instances/c4c0edb4-a206-4617-9465-58c87dcdf7d5_del
Jan 22 17:20:42 compute-0 nova_compute[183075]: 2026-01-22 17:20:42.057 183079 INFO nova.virt.libvirt.driver [None req-20d67897-a51d-4f17-8403-8c63f6908549 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Deletion of /var/lib/nova/instances/c4c0edb4-a206-4617-9465-58c87dcdf7d5_del complete
Jan 22 17:20:42 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fcdc4fd31b0b112f1c951fd1ef2ec16d34bd8931479c922b5f45611715bede86-userdata-shm.mount: Deactivated successfully.
Jan 22 17:20:42 compute-0 systemd[1]: var-lib-containers-storage-overlay-31de07f8a2dd99678bcac0fef902867481f5fbf35583b4cb74db643f14141e01-merged.mount: Deactivated successfully.
Jan 22 17:20:42 compute-0 podman[226718]: 2026-01-22 17:20:42.115574415 +0000 UTC m=+0.302146259 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, architecture=x86_64, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., version=9.6, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41)
Jan 22 17:20:42 compute-0 nova_compute[183075]: 2026-01-22 17:20:42.189 183079 INFO nova.compute.manager [None req-20d67897-a51d-4f17-8403-8c63f6908549 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Took 0.63 seconds to destroy the instance on the hypervisor.
Jan 22 17:20:42 compute-0 nova_compute[183075]: 2026-01-22 17:20:42.189 183079 DEBUG oslo.service.loopingcall [None req-20d67897-a51d-4f17-8403-8c63f6908549 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:20:42 compute-0 nova_compute[183075]: 2026-01-22 17:20:42.190 183079 DEBUG nova.compute.manager [-] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:20:42 compute-0 nova_compute[183075]: 2026-01-22 17:20:42.190 183079 DEBUG nova.network.neutron [-] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:20:42 compute-0 podman[226697]: 2026-01-22 17:20:42.218432028 +0000 UTC m=+0.508758549 container cleanup fcdc4fd31b0b112f1c951fd1ef2ec16d34bd8931479c922b5f45611715bede86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 22 17:20:42 compute-0 systemd[1]: libpod-conmon-fcdc4fd31b0b112f1c951fd1ef2ec16d34bd8931479c922b5f45611715bede86.scope: Deactivated successfully.
Jan 22 17:20:42 compute-0 podman[226777]: 2026-01-22 17:20:42.295603544 +0000 UTC m=+0.052071855 container remove fcdc4fd31b0b112f1c951fd1ef2ec16d34bd8931479c922b5f45611715bede86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 17:20:42 compute-0 podman[226716]: 2026-01-22 17:20:42.29924482 +0000 UTC m=+0.492338096 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 22 17:20:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:42.303 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[58bdbbe4-390d-4f73-b3d3-13b8fc0d8550]: (4, ('Thu Jan 22 05:20:41 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b (fcdc4fd31b0b112f1c951fd1ef2ec16d34bd8931479c922b5f45611715bede86)\nfcdc4fd31b0b112f1c951fd1ef2ec16d34bd8931479c922b5f45611715bede86\nThu Jan 22 05:20:42 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b (fcdc4fd31b0b112f1c951fd1ef2ec16d34bd8931479c922b5f45611715bede86)\nfcdc4fd31b0b112f1c951fd1ef2ec16d34bd8931479c922b5f45611715bede86\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:20:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:42.305 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b0cab6b9-4cc5-4276-9eb4-257ac8db34e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:20:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:42.306 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeee918a6-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:20:42 compute-0 nova_compute[183075]: 2026-01-22 17:20:42.308 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:42 compute-0 kernel: tapeee918a6-60: left promiscuous mode
Jan 22 17:20:42 compute-0 podman[226711]: 2026-01-22 17:20:42.325384949 +0000 UTC m=+0.523487148 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 17:20:42 compute-0 nova_compute[183075]: 2026-01-22 17:20:42.325 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:42.328 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f60d01d4-9b32-454f-839c-7797146caa21]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:20:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:42.342 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3037f206-b1b9-4f64-91b7-6416c9141baa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:20:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:42.343 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8f995926-ee61-498f-be10-404111032664]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:20:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:42.359 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[aafe6eec-b2d1-4c67-82a5-9fa97be48f43]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473334, 'reachable_time': 44123, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226818, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:20:42 compute-0 systemd[1]: run-netns-ovnmeta\x2deee918a6\x2d66b2\x2d47ae\x2db702\x2d620a23ef395b.mount: Deactivated successfully.
Jan 22 17:20:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:42.363 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:20:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:42.364 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[a720d1bf-3ce6-45c5-a9f7-946e5bbeda63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:20:43 compute-0 nova_compute[183075]: 2026-01-22 17:20:43.324 183079 DEBUG nova.compute.manager [req-b4de7c1a-de4c-4b3c-9051-688b87d4a7aa req-451a719d-e7ab-40be-ad3b-f610c81a5cef a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Received event network-vif-unplugged-f801fa72-ebf3-48b4-a510-f75bbe40e687 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:20:43 compute-0 nova_compute[183075]: 2026-01-22 17:20:43.325 183079 DEBUG oslo_concurrency.lockutils [req-b4de7c1a-de4c-4b3c-9051-688b87d4a7aa req-451a719d-e7ab-40be-ad3b-f610c81a5cef a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c4c0edb4-a206-4617-9465-58c87dcdf7d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:20:43 compute-0 nova_compute[183075]: 2026-01-22 17:20:43.326 183079 DEBUG oslo_concurrency.lockutils [req-b4de7c1a-de4c-4b3c-9051-688b87d4a7aa req-451a719d-e7ab-40be-ad3b-f610c81a5cef a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c4c0edb4-a206-4617-9465-58c87dcdf7d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:20:43 compute-0 nova_compute[183075]: 2026-01-22 17:20:43.326 183079 DEBUG oslo_concurrency.lockutils [req-b4de7c1a-de4c-4b3c-9051-688b87d4a7aa req-451a719d-e7ab-40be-ad3b-f610c81a5cef a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c4c0edb4-a206-4617-9465-58c87dcdf7d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:20:43 compute-0 nova_compute[183075]: 2026-01-22 17:20:43.326 183079 DEBUG nova.compute.manager [req-b4de7c1a-de4c-4b3c-9051-688b87d4a7aa req-451a719d-e7ab-40be-ad3b-f610c81a5cef a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] No waiting events found dispatching network-vif-unplugged-f801fa72-ebf3-48b4-a510-f75bbe40e687 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:20:43 compute-0 nova_compute[183075]: 2026-01-22 17:20:43.328 183079 DEBUG nova.compute.manager [req-b4de7c1a-de4c-4b3c-9051-688b87d4a7aa req-451a719d-e7ab-40be-ad3b-f610c81a5cef a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Received event network-vif-unplugged-f801fa72-ebf3-48b4-a510-f75bbe40e687 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:20:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:43.817 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:20:43 compute-0 nova_compute[183075]: 2026-01-22 17:20:43.817 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:43.819 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:20:43 compute-0 nova_compute[183075]: 2026-01-22 17:20:43.981 183079 DEBUG nova.network.neutron [-] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:20:44 compute-0 nova_compute[183075]: 2026-01-22 17:20:44.384 183079 INFO nova.compute.manager [-] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Took 2.19 seconds to deallocate network for instance.
Jan 22 17:20:44 compute-0 nova_compute[183075]: 2026-01-22 17:20:44.579 183079 DEBUG oslo_concurrency.lockutils [None req-20d67897-a51d-4f17-8403-8c63f6908549 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:20:44 compute-0 nova_compute[183075]: 2026-01-22 17:20:44.580 183079 DEBUG oslo_concurrency.lockutils [None req-20d67897-a51d-4f17-8403-8c63f6908549 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:20:44 compute-0 nova_compute[183075]: 2026-01-22 17:20:44.644 183079 DEBUG nova.compute.provider_tree [None req-20d67897-a51d-4f17-8403-8c63f6908549 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:20:45 compute-0 nova_compute[183075]: 2026-01-22 17:20:45.094 183079 DEBUG nova.scheduler.client.report [None req-20d67897-a51d-4f17-8403-8c63f6908549 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:20:45 compute-0 nova_compute[183075]: 2026-01-22 17:20:45.382 183079 DEBUG oslo_concurrency.lockutils [None req-20d67897-a51d-4f17-8403-8c63f6908549 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.802s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:20:45 compute-0 nova_compute[183075]: 2026-01-22 17:20:45.412 183079 INFO nova.scheduler.client.report [None req-20d67897-a51d-4f17-8403-8c63f6908549 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Deleted allocations for instance c4c0edb4-a206-4617-9465-58c87dcdf7d5
Jan 22 17:20:45 compute-0 nova_compute[183075]: 2026-01-22 17:20:45.446 183079 DEBUG nova.compute.manager [req-e4ff395e-3f57-4ccf-877d-e5c2af791cbf req-730edf36-68b4-4cb2-a2ae-415cb87edd1f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Received event network-vif-plugged-f801fa72-ebf3-48b4-a510-f75bbe40e687 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:20:45 compute-0 nova_compute[183075]: 2026-01-22 17:20:45.446 183079 DEBUG oslo_concurrency.lockutils [req-e4ff395e-3f57-4ccf-877d-e5c2af791cbf req-730edf36-68b4-4cb2-a2ae-415cb87edd1f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c4c0edb4-a206-4617-9465-58c87dcdf7d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:20:45 compute-0 nova_compute[183075]: 2026-01-22 17:20:45.447 183079 DEBUG oslo_concurrency.lockutils [req-e4ff395e-3f57-4ccf-877d-e5c2af791cbf req-730edf36-68b4-4cb2-a2ae-415cb87edd1f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c4c0edb4-a206-4617-9465-58c87dcdf7d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:20:45 compute-0 nova_compute[183075]: 2026-01-22 17:20:45.447 183079 DEBUG oslo_concurrency.lockutils [req-e4ff395e-3f57-4ccf-877d-e5c2af791cbf req-730edf36-68b4-4cb2-a2ae-415cb87edd1f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c4c0edb4-a206-4617-9465-58c87dcdf7d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:20:45 compute-0 nova_compute[183075]: 2026-01-22 17:20:45.447 183079 DEBUG nova.compute.manager [req-e4ff395e-3f57-4ccf-877d-e5c2af791cbf req-730edf36-68b4-4cb2-a2ae-415cb87edd1f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] No waiting events found dispatching network-vif-plugged-f801fa72-ebf3-48b4-a510-f75bbe40e687 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:20:45 compute-0 nova_compute[183075]: 2026-01-22 17:20:45.447 183079 WARNING nova.compute.manager [req-e4ff395e-3f57-4ccf-877d-e5c2af791cbf req-730edf36-68b4-4cb2-a2ae-415cb87edd1f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Received unexpected event network-vif-plugged-f801fa72-ebf3-48b4-a510-f75bbe40e687 for instance with vm_state deleted and task_state None.
Jan 22 17:20:45 compute-0 nova_compute[183075]: 2026-01-22 17:20:45.482 183079 DEBUG oslo_concurrency.lockutils [None req-20d67897-a51d-4f17-8403-8c63f6908549 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "c4c0edb4-a206-4617-9465-58c87dcdf7d5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.933s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:20:46 compute-0 nova_compute[183075]: 2026-01-22 17:20:46.542 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:47 compute-0 nova_compute[183075]: 2026-01-22 17:20:47.049 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:48 compute-0 podman[226819]: 2026-01-22 17:20:48.40466005 +0000 UTC m=+0.099752402 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 17:20:48 compute-0 nova_compute[183075]: 2026-01-22 17:20:48.442 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769102433.4409204, d424f9c2-6a07-485c-9feb-fd1f6a145be8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:20:48 compute-0 nova_compute[183075]: 2026-01-22 17:20:48.443 183079 INFO nova.compute.manager [-] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] VM Stopped (Lifecycle Event)
Jan 22 17:20:48 compute-0 nova_compute[183075]: 2026-01-22 17:20:48.465 183079 DEBUG nova.compute.manager [None req-b5320349-52f7-4da9-8781-297d680363a7 - - - - - -] [instance: d424f9c2-6a07-485c-9feb-fd1f6a145be8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:20:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:20:48.821 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:20:51 compute-0 nova_compute[183075]: 2026-01-22 17:20:51.544 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:52 compute-0 nova_compute[183075]: 2026-01-22 17:20:52.051 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:53 compute-0 nova_compute[183075]: 2026-01-22 17:20:53.750 183079 INFO nova.compute.manager [None req-962bffac-dcf1-4272-a0bf-93335dc3f551 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Get console output
Jan 22 17:20:53 compute-0 nova_compute[183075]: 2026-01-22 17:20:53.759 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:20:54 compute-0 nova_compute[183075]: 2026-01-22 17:20:54.803 183079 DEBUG oslo_concurrency.lockutils [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "803da6ed-0f79-4c4d-b054-593b0dee0c0b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:20:54 compute-0 nova_compute[183075]: 2026-01-22 17:20:54.804 183079 DEBUG oslo_concurrency.lockutils [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "803da6ed-0f79-4c4d-b054-593b0dee0c0b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:20:54 compute-0 nova_compute[183075]: 2026-01-22 17:20:54.833 183079 DEBUG nova.compute.manager [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:20:54 compute-0 nova_compute[183075]: 2026-01-22 17:20:54.923 183079 DEBUG oslo_concurrency.lockutils [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:20:54 compute-0 nova_compute[183075]: 2026-01-22 17:20:54.923 183079 DEBUG oslo_concurrency.lockutils [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:20:54 compute-0 nova_compute[183075]: 2026-01-22 17:20:54.931 183079 DEBUG nova.virt.hardware [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:20:54 compute-0 nova_compute[183075]: 2026-01-22 17:20:54.932 183079 INFO nova.compute.claims [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.058 183079 DEBUG nova.compute.provider_tree [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.072 183079 DEBUG nova.scheduler.client.report [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.095 183079 DEBUG oslo_concurrency.lockutils [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.096 183079 DEBUG nova.compute.manager [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.161 183079 DEBUG nova.compute.manager [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.162 183079 DEBUG nova.network.neutron [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.178 183079 INFO nova.virt.libvirt.driver [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.198 183079 DEBUG nova.compute.manager [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.291 183079 DEBUG nova.compute.manager [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.292 183079 DEBUG nova.virt.libvirt.driver [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.293 183079 INFO nova.virt.libvirt.driver [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Creating image(s)
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.293 183079 DEBUG oslo_concurrency.lockutils [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "/var/lib/nova/instances/803da6ed-0f79-4c4d-b054-593b0dee0c0b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.294 183079 DEBUG oslo_concurrency.lockutils [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "/var/lib/nova/instances/803da6ed-0f79-4c4d-b054-593b0dee0c0b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.294 183079 DEBUG oslo_concurrency.lockutils [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "/var/lib/nova/instances/803da6ed-0f79-4c4d-b054-593b0dee0c0b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.312 183079 DEBUG oslo_concurrency.processutils [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:20:55 compute-0 podman[226840]: 2026-01-22 17:20:55.366635712 +0000 UTC m=+0.072193055 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.394 183079 DEBUG oslo_concurrency.processutils [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.395 183079 DEBUG oslo_concurrency.lockutils [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.396 183079 DEBUG oslo_concurrency.lockutils [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.408 183079 DEBUG oslo_concurrency.processutils [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.476 183079 DEBUG oslo_concurrency.processutils [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.477 183079 DEBUG oslo_concurrency.processutils [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/803da6ed-0f79-4c4d-b054-593b0dee0c0b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.513 183079 DEBUG oslo_concurrency.processutils [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/803da6ed-0f79-4c4d-b054-593b0dee0c0b/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.514 183079 DEBUG oslo_concurrency.lockutils [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.514 183079 DEBUG oslo_concurrency.processutils [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.572 183079 DEBUG oslo_concurrency.processutils [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.573 183079 DEBUG nova.virt.disk.api [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Checking if we can resize image /var/lib/nova/instances/803da6ed-0f79-4c4d-b054-593b0dee0c0b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.573 183079 DEBUG oslo_concurrency.processutils [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/803da6ed-0f79-4c4d-b054-593b0dee0c0b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.642 183079 DEBUG oslo_concurrency.processutils [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/803da6ed-0f79-4c4d-b054-593b0dee0c0b/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.643 183079 DEBUG nova.virt.disk.api [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Cannot resize image /var/lib/nova/instances/803da6ed-0f79-4c4d-b054-593b0dee0c0b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.643 183079 DEBUG nova.objects.instance [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lazy-loading 'migration_context' on Instance uuid 803da6ed-0f79-4c4d-b054-593b0dee0c0b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.657 183079 DEBUG nova.virt.libvirt.driver [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.657 183079 DEBUG nova.virt.libvirt.driver [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Ensure instance console log exists: /var/lib/nova/instances/803da6ed-0f79-4c4d-b054-593b0dee0c0b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.658 183079 DEBUG oslo_concurrency.lockutils [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.658 183079 DEBUG oslo_concurrency.lockutils [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:20:55 compute-0 nova_compute[183075]: 2026-01-22 17:20:55.658 183079 DEBUG oslo_concurrency.lockutils [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:20:56 compute-0 nova_compute[183075]: 2026-01-22 17:20:56.548 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:56 compute-0 nova_compute[183075]: 2026-01-22 17:20:56.826 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769102441.8247302, c4c0edb4-a206-4617-9465-58c87dcdf7d5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:20:56 compute-0 nova_compute[183075]: 2026-01-22 17:20:56.826 183079 INFO nova.compute.manager [-] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] VM Stopped (Lifecycle Event)
Jan 22 17:20:56 compute-0 nova_compute[183075]: 2026-01-22 17:20:56.855 183079 DEBUG nova.compute.manager [None req-abb3f85f-26fc-474c-8493-5787039e6ded - - - - - -] [instance: c4c0edb4-a206-4617-9465-58c87dcdf7d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:20:57 compute-0 nova_compute[183075]: 2026-01-22 17:20:57.054 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:57 compute-0 nova_compute[183075]: 2026-01-22 17:20:57.370 183079 DEBUG nova.policy [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:21:00 compute-0 nova_compute[183075]: 2026-01-22 17:21:00.422 183079 DEBUG nova.network.neutron [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Successfully updated port: 7cc526f2-d707-4153-92b0-f1be81a8b1e3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:21:00 compute-0 nova_compute[183075]: 2026-01-22 17:21:00.438 183079 DEBUG oslo_concurrency.lockutils [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "refresh_cache-803da6ed-0f79-4c4d-b054-593b0dee0c0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:21:00 compute-0 nova_compute[183075]: 2026-01-22 17:21:00.438 183079 DEBUG oslo_concurrency.lockutils [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquired lock "refresh_cache-803da6ed-0f79-4c4d-b054-593b0dee0c0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:21:00 compute-0 nova_compute[183075]: 2026-01-22 17:21:00.439 183079 DEBUG nova.network.neutron [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:21:00 compute-0 nova_compute[183075]: 2026-01-22 17:21:00.526 183079 DEBUG nova.compute.manager [req-a2e308b1-aebb-4d71-b04b-a13daf03e1ea req-6055fdcc-cac1-4490-a856-67817f2eed3e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Received event network-changed-7cc526f2-d707-4153-92b0-f1be81a8b1e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:21:00 compute-0 nova_compute[183075]: 2026-01-22 17:21:00.527 183079 DEBUG nova.compute.manager [req-a2e308b1-aebb-4d71-b04b-a13daf03e1ea req-6055fdcc-cac1-4490-a856-67817f2eed3e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Refreshing instance network info cache due to event network-changed-7cc526f2-d707-4153-92b0-f1be81a8b1e3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:21:00 compute-0 nova_compute[183075]: 2026-01-22 17:21:00.528 183079 DEBUG oslo_concurrency.lockutils [req-a2e308b1-aebb-4d71-b04b-a13daf03e1ea req-6055fdcc-cac1-4490-a856-67817f2eed3e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-803da6ed-0f79-4c4d-b054-593b0dee0c0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:21:00 compute-0 nova_compute[183075]: 2026-01-22 17:21:00.616 183079 DEBUG nova.network.neutron [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.544 183079 DEBUG nova.network.neutron [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Updating instance_info_cache with network_info: [{"id": "7cc526f2-d707-4153-92b0-f1be81a8b1e3", "address": "fa:16:3e:31:9a:74", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cc526f2-d7", "ovs_interfaceid": "7cc526f2-d707-4153-92b0-f1be81a8b1e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.547 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.562 183079 DEBUG oslo_concurrency.lockutils [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Releasing lock "refresh_cache-803da6ed-0f79-4c4d-b054-593b0dee0c0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.563 183079 DEBUG nova.compute.manager [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Instance network_info: |[{"id": "7cc526f2-d707-4153-92b0-f1be81a8b1e3", "address": "fa:16:3e:31:9a:74", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cc526f2-d7", "ovs_interfaceid": "7cc526f2-d707-4153-92b0-f1be81a8b1e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.563 183079 DEBUG oslo_concurrency.lockutils [req-a2e308b1-aebb-4d71-b04b-a13daf03e1ea req-6055fdcc-cac1-4490-a856-67817f2eed3e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-803da6ed-0f79-4c4d-b054-593b0dee0c0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.563 183079 DEBUG nova.network.neutron [req-a2e308b1-aebb-4d71-b04b-a13daf03e1ea req-6055fdcc-cac1-4490-a856-67817f2eed3e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Refreshing network info cache for port 7cc526f2-d707-4153-92b0-f1be81a8b1e3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.566 183079 DEBUG nova.virt.libvirt.driver [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Start _get_guest_xml network_info=[{"id": "7cc526f2-d707-4153-92b0-f1be81a8b1e3", "address": "fa:16:3e:31:9a:74", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cc526f2-d7", "ovs_interfaceid": "7cc526f2-d707-4153-92b0-f1be81a8b1e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.569 183079 WARNING nova.virt.libvirt.driver [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.575 183079 DEBUG nova.virt.libvirt.host [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.575 183079 DEBUG nova.virt.libvirt.host [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.581 183079 DEBUG nova.virt.libvirt.host [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.582 183079 DEBUG nova.virt.libvirt.host [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.583 183079 DEBUG nova.virt.libvirt.driver [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.584 183079 DEBUG nova.virt.hardware [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.584 183079 DEBUG nova.virt.hardware [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.585 183079 DEBUG nova.virt.hardware [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.585 183079 DEBUG nova.virt.hardware [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.585 183079 DEBUG nova.virt.hardware [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.586 183079 DEBUG nova.virt.hardware [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.586 183079 DEBUG nova.virt.hardware [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.586 183079 DEBUG nova.virt.hardware [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.587 183079 DEBUG nova.virt.hardware [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.587 183079 DEBUG nova.virt.hardware [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.587 183079 DEBUG nova.virt.hardware [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.593 183079 DEBUG nova.virt.libvirt.vif [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:20:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1984705589',display_name='tempest-server-test-1984705589',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1984705589',id=36,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBZ1addGiefngOIZky2bZbl/LpQHm8ydIPPcYG47RZ8D1pquJ/FsAcVn7mNe0QyfiQPXvuSs1eSw/jGOy+x3K8tzxt5N8fIelXKcTizhDHr81ws2uYn+dW6F9Hiy6pJp3A==',key_name='tempest-keypair-test-453437320',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='02818155e7af4645bc909d4ba671f11f',ramdisk_id='',reservation_id='r-v6730tnl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='
virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSameNetwork-953620552',owner_user_name='tempest-FloatingIpSameNetwork-953620552-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:20:55Z,user_data=None,user_id='1148a46489e842e6a0c7660c54567798',uuid=803da6ed-0f79-4c4d-b054-593b0dee0c0b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7cc526f2-d707-4153-92b0-f1be81a8b1e3", "address": "fa:16:3e:31:9a:74", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cc526f2-d7", "ovs_interfaceid": "7cc526f2-d707-4153-92b0-f1be81a8b1e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.594 183079 DEBUG nova.network.os_vif_util [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converting VIF {"id": "7cc526f2-d707-4153-92b0-f1be81a8b1e3", "address": "fa:16:3e:31:9a:74", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cc526f2-d7", "ovs_interfaceid": "7cc526f2-d707-4153-92b0-f1be81a8b1e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.595 183079 DEBUG nova.network.os_vif_util [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:9a:74,bridge_name='br-int',has_traffic_filtering=True,id=7cc526f2-d707-4153-92b0-f1be81a8b1e3,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7cc526f2-d7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.597 183079 DEBUG nova.objects.instance [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lazy-loading 'pci_devices' on Instance uuid 803da6ed-0f79-4c4d-b054-593b0dee0c0b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.612 183079 DEBUG nova.virt.libvirt.driver [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:21:01 compute-0 nova_compute[183075]:   <uuid>803da6ed-0f79-4c4d-b054-593b0dee0c0b</uuid>
Jan 22 17:21:01 compute-0 nova_compute[183075]:   <name>instance-00000024</name>
Jan 22 17:21:01 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:21:01 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:21:01 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:21:01 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1984705589</nova:name>
Jan 22 17:21:01 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:21:01</nova:creationTime>
Jan 22 17:21:01 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:21:01 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:21:01 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:21:01 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:21:01 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:21:01 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:21:01 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:21:01 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:21:01 compute-0 nova_compute[183075]:         <nova:user uuid="1148a46489e842e6a0c7660c54567798">tempest-FloatingIpSameNetwork-953620552-project-member</nova:user>
Jan 22 17:21:01 compute-0 nova_compute[183075]:         <nova:project uuid="02818155e7af4645bc909d4ba671f11f">tempest-FloatingIpSameNetwork-953620552</nova:project>
Jan 22 17:21:01 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:21:01 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:21:01 compute-0 nova_compute[183075]:         <nova:port uuid="7cc526f2-d707-4153-92b0-f1be81a8b1e3">
Jan 22 17:21:01 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:21:01 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:21:01 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:21:01 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <system>
Jan 22 17:21:01 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:21:01 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:21:01 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:21:01 compute-0 nova_compute[183075]:       <entry name="serial">803da6ed-0f79-4c4d-b054-593b0dee0c0b</entry>
Jan 22 17:21:01 compute-0 nova_compute[183075]:       <entry name="uuid">803da6ed-0f79-4c4d-b054-593b0dee0c0b</entry>
Jan 22 17:21:01 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     </system>
Jan 22 17:21:01 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:21:01 compute-0 nova_compute[183075]:   <os>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:   </os>
Jan 22 17:21:01 compute-0 nova_compute[183075]:   <features>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:   </features>
Jan 22 17:21:01 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:21:01 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:21:01 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:21:01 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/803da6ed-0f79-4c4d-b054-593b0dee0c0b/disk"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:21:01 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:31:9a:74"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:       <target dev="tap7cc526f2-d7"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:21:01 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/803da6ed-0f79-4c4d-b054-593b0dee0c0b/console.log" append="off"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <video>
Jan 22 17:21:01 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     </video>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:21:01 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:21:01 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:21:01 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:21:01 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:21:01 compute-0 nova_compute[183075]: </domain>
Jan 22 17:21:01 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.613 183079 DEBUG nova.compute.manager [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Preparing to wait for external event network-vif-plugged-7cc526f2-d707-4153-92b0-f1be81a8b1e3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.614 183079 DEBUG oslo_concurrency.lockutils [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "803da6ed-0f79-4c4d-b054-593b0dee0c0b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.614 183079 DEBUG oslo_concurrency.lockutils [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "803da6ed-0f79-4c4d-b054-593b0dee0c0b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.614 183079 DEBUG oslo_concurrency.lockutils [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "803da6ed-0f79-4c4d-b054-593b0dee0c0b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.615 183079 DEBUG nova.virt.libvirt.vif [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:20:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1984705589',display_name='tempest-server-test-1984705589',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1984705589',id=36,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBZ1addGiefngOIZky2bZbl/LpQHm8ydIPPcYG47RZ8D1pquJ/FsAcVn7mNe0QyfiQPXvuSs1eSw/jGOy+x3K8tzxt5N8fIelXKcTizhDHr81ws2uYn+dW6F9Hiy6pJp3A==',key_name='tempest-keypair-test-453437320',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='02818155e7af4645bc909d4ba671f11f',ramdisk_id='',reservation_id='r-v6730tnl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_r
ng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSameNetwork-953620552',owner_user_name='tempest-FloatingIpSameNetwork-953620552-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:20:55Z,user_data=None,user_id='1148a46489e842e6a0c7660c54567798',uuid=803da6ed-0f79-4c4d-b054-593b0dee0c0b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7cc526f2-d707-4153-92b0-f1be81a8b1e3", "address": "fa:16:3e:31:9a:74", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cc526f2-d7", "ovs_interfaceid": "7cc526f2-d707-4153-92b0-f1be81a8b1e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.616 183079 DEBUG nova.network.os_vif_util [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converting VIF {"id": "7cc526f2-d707-4153-92b0-f1be81a8b1e3", "address": "fa:16:3e:31:9a:74", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cc526f2-d7", "ovs_interfaceid": "7cc526f2-d707-4153-92b0-f1be81a8b1e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.616 183079 DEBUG nova.network.os_vif_util [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:9a:74,bridge_name='br-int',has_traffic_filtering=True,id=7cc526f2-d707-4153-92b0-f1be81a8b1e3,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7cc526f2-d7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.617 183079 DEBUG os_vif [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:9a:74,bridge_name='br-int',has_traffic_filtering=True,id=7cc526f2-d707-4153-92b0-f1be81a8b1e3,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7cc526f2-d7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.617 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.618 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.618 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.622 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.622 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7cc526f2-d7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.623 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7cc526f2-d7, col_values=(('external_ids', {'iface-id': '7cc526f2-d707-4153-92b0-f1be81a8b1e3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:31:9a:74', 'vm-uuid': '803da6ed-0f79-4c4d-b054-593b0dee0c0b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:21:01 compute-0 NetworkManager[55454]: <info>  [1769102461.6253] manager: (tap7cc526f2-d7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/185)
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.626 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.631 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.633 183079 INFO os_vif [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:9a:74,bridge_name='br-int',has_traffic_filtering=True,id=7cc526f2-d707-4153-92b0-f1be81a8b1e3,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7cc526f2-d7')
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.693 183079 DEBUG nova.virt.libvirt.driver [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.694 183079 DEBUG nova.virt.libvirt.driver [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] No VIF found with MAC fa:16:3e:31:9a:74, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:21:01 compute-0 kernel: tap7cc526f2-d7: entered promiscuous mode
Jan 22 17:21:01 compute-0 NetworkManager[55454]: <info>  [1769102461.7636] manager: (tap7cc526f2-d7): new Tun device (/org/freedesktop/NetworkManager/Devices/186)
Jan 22 17:21:01 compute-0 ovn_controller[95372]: 2026-01-22T17:21:01Z|00429|binding|INFO|Claiming lport 7cc526f2-d707-4153-92b0-f1be81a8b1e3 for this chassis.
Jan 22 17:21:01 compute-0 ovn_controller[95372]: 2026-01-22T17:21:01Z|00430|binding|INFO|7cc526f2-d707-4153-92b0-f1be81a8b1e3: Claiming fa:16:3e:31:9a:74 10.100.0.9
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.764 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:01 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:01.774 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:9a:74 10.100.0.9'], port_security=['fa:16:3e:31:9a:74 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '803da6ed-0f79-4c4d-b054-593b0dee0c0b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eee918a6-66b2-47ae-b702-620a23ef395b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02818155e7af4645bc909d4ba671f11f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c960fc2f-fbcd-48ff-b6bd-d3b900f07332', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.180'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bad11b1d-0e62-4467-b3f8-4f80f0fd75da, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=7cc526f2-d707-4153-92b0-f1be81a8b1e3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:21:01 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:01.777 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 7cc526f2-d707-4153-92b0-f1be81a8b1e3 in datapath eee918a6-66b2-47ae-b702-620a23ef395b bound to our chassis
Jan 22 17:21:01 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:01.779 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network eee918a6-66b2-47ae-b702-620a23ef395b
Jan 22 17:21:01 compute-0 ovn_controller[95372]: 2026-01-22T17:21:01Z|00431|binding|INFO|Setting lport 7cc526f2-d707-4153-92b0-f1be81a8b1e3 ovn-installed in OVS
Jan 22 17:21:01 compute-0 ovn_controller[95372]: 2026-01-22T17:21:01Z|00432|binding|INFO|Setting lport 7cc526f2-d707-4153-92b0-f1be81a8b1e3 up in Southbound
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.786 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:01 compute-0 nova_compute[183075]: 2026-01-22 17:21:01.789 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:01 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:01.796 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3ee4b38d-734c-4aea-bcba-a31491a0c5a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:01 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:01.797 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapeee918a6-61 in ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:21:01 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:01.799 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapeee918a6-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:21:01 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:01.799 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d1d2e6f0-b776-4d81-9b1e-0a54c6cc34af]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:01 compute-0 systemd-udevd[226897]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:21:01 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:01.800 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[430111e9-a72b-4183-a3ec-4ab06ac88357]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:01 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:01.814 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[974f2baf-717d-4166-8c18-54e6ff5d1423]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:01 compute-0 NetworkManager[55454]: <info>  [1769102461.8194] device (tap7cc526f2-d7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:21:01 compute-0 NetworkManager[55454]: <info>  [1769102461.8201] device (tap7cc526f2-d7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:21:01 compute-0 systemd-machined[154382]: New machine qemu-36-instance-00000024.
Jan 22 17:21:01 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:01.836 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[95f4f101-74c6-4cb7-ae36-bb41d10ab3b2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:01 compute-0 systemd[1]: Started Virtual Machine qemu-36-instance-00000024.
Jan 22 17:21:01 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:01.873 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[5e38c4bc-6646-4b74-8506-ff3c9b9234bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:01 compute-0 NetworkManager[55454]: <info>  [1769102461.8798] manager: (tapeee918a6-60): new Veth device (/org/freedesktop/NetworkManager/Devices/187)
Jan 22 17:21:01 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:01.879 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6083150d-e479-41df-91b5-770edcd2302a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:01 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:01.914 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[d7151144-b114-4109-87e5-835e69fe5eb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:01 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:01.917 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[c23cfd50-0890-42f9-9f38-614632f5f783]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:01 compute-0 NetworkManager[55454]: <info>  [1769102461.9391] device (tapeee918a6-60): carrier: link connected
Jan 22 17:21:01 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:01.949 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[129ef4e7-f583-4c7b-8c2e-1947c26c7e80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:01 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:01.970 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[fdd71768-002b-4253-8066-4e6675e94ec4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeee918a6-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:e2:7e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 122], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482564, 'reachable_time': 38352, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226930, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:01 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:01.986 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[40898080-a61d-438d-be7a-58994f8ab6dc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe68:e27e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 482564, 'tstamp': 482564}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226931, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:02.006 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[eaeebbed-0f57-4974-9c0c-cc6b120bca02]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeee918a6-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:e2:7e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 122], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482564, 'reachable_time': 38352, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226932, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.007 183079 DEBUG nova.compute.manager [req-67b0b131-a915-424c-939e-ebf8ccd2fa8a req-6122d6c0-8b57-4e34-8367-277850305f92 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Received event network-vif-plugged-7cc526f2-d707-4153-92b0-f1be81a8b1e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.007 183079 DEBUG oslo_concurrency.lockutils [req-67b0b131-a915-424c-939e-ebf8ccd2fa8a req-6122d6c0-8b57-4e34-8367-277850305f92 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "803da6ed-0f79-4c4d-b054-593b0dee0c0b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.008 183079 DEBUG oslo_concurrency.lockutils [req-67b0b131-a915-424c-939e-ebf8ccd2fa8a req-6122d6c0-8b57-4e34-8367-277850305f92 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "803da6ed-0f79-4c4d-b054-593b0dee0c0b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.008 183079 DEBUG oslo_concurrency.lockutils [req-67b0b131-a915-424c-939e-ebf8ccd2fa8a req-6122d6c0-8b57-4e34-8367-277850305f92 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "803da6ed-0f79-4c4d-b054-593b0dee0c0b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.008 183079 DEBUG nova.compute.manager [req-67b0b131-a915-424c-939e-ebf8ccd2fa8a req-6122d6c0-8b57-4e34-8367-277850305f92 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Processing event network-vif-plugged-7cc526f2-d707-4153-92b0-f1be81a8b1e3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:02.035 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3851d22d-8b99-4dee-9d25-4123390aeba2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:02.113 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b6a06405-32a9-4090-b3c2-99557d7172cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:02.115 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeee918a6-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:02.115 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:02.115 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeee918a6-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:21:02 compute-0 kernel: tapeee918a6-60: entered promiscuous mode
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:02.119 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapeee918a6-60, col_values=(('external_ids', {'iface-id': '15d4de90-41f4-4532-aebd-197c2a33c6d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:21:02 compute-0 NetworkManager[55454]: <info>  [1769102462.1213] manager: (tapeee918a6-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/188)
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.119 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:02 compute-0 ovn_controller[95372]: 2026-01-22T17:21:02Z|00433|binding|INFO|Releasing lport 15d4de90-41f4-4532-aebd-197c2a33c6d6 from this chassis (sb_readonly=0)
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:02.123 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/eee918a6-66b2-47ae-b702-620a23ef395b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/eee918a6-66b2-47ae-b702-620a23ef395b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:02.124 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[9debfdab-a6b2-4481-88e4-18d2d251e96a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:02.125 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/eee918a6-66b2-47ae-b702-620a23ef395b.pid.haproxy
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID eee918a6-66b2-47ae-b702-620a23ef395b
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:21:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:02.127 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'env', 'PROCESS_TAG=haproxy-eee918a6-66b2-47ae-b702-620a23ef395b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/eee918a6-66b2-47ae-b702-620a23ef395b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.134 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.169 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102462.1687176, 803da6ed-0f79-4c4d-b054-593b0dee0c0b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.169 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] VM Started (Lifecycle Event)
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.171 183079 DEBUG nova.compute.manager [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.174 183079 DEBUG nova.virt.libvirt.driver [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.178 183079 INFO nova.virt.libvirt.driver [-] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Instance spawned successfully.
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.178 183079 DEBUG nova.virt.libvirt.driver [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.187 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.190 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.200 183079 DEBUG nova.virt.libvirt.driver [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.200 183079 DEBUG nova.virt.libvirt.driver [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.201 183079 DEBUG nova.virt.libvirt.driver [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.201 183079 DEBUG nova.virt.libvirt.driver [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.202 183079 DEBUG nova.virt.libvirt.driver [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.202 183079 DEBUG nova.virt.libvirt.driver [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.213 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.213 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102462.1706707, 803da6ed-0f79-4c4d-b054-593b0dee0c0b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.213 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] VM Paused (Lifecycle Event)
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.253 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.257 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102462.1735852, 803da6ed-0f79-4c4d-b054-593b0dee0c0b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.257 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] VM Resumed (Lifecycle Event)
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.275 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.280 183079 INFO nova.compute.manager [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Took 6.99 seconds to spawn the instance on the hypervisor.
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.280 183079 DEBUG nova.compute.manager [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.285 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.313 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.346 183079 INFO nova.compute.manager [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Took 7.45 seconds to build instance.
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.364 183079 DEBUG oslo_concurrency.lockutils [None req-e1fc8eaa-c64b-49b1-9f9a-e912f9ad6c20 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "803da6ed-0f79-4c4d-b054-593b0dee0c0b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.560s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:21:02 compute-0 podman[226987]: 2026-01-22 17:21:02.564043844 +0000 UTC m=+0.070754008 container create 3442a1ee6a2c184f5fc6c40c63d18109f86621fd3a893b509606c67e3ad9914d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 17:21:02 compute-0 systemd[1]: Started libpod-conmon-3442a1ee6a2c184f5fc6c40c63d18109f86621fd3a893b509606c67e3ad9914d.scope.
Jan 22 17:21:02 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:21:02 compute-0 podman[226987]: 2026-01-22 17:21:02.535506901 +0000 UTC m=+0.042217095 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:21:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd1f8f6301b712456f55b0676b70aa1a1156b7d38c0a295fbe888b5143ea00fc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:21:02 compute-0 podman[226987]: 2026-01-22 17:21:02.647718701 +0000 UTC m=+0.154428865 container init 3442a1ee6a2c184f5fc6c40c63d18109f86621fd3a893b509606c67e3ad9914d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 22 17:21:02 compute-0 podman[226987]: 2026-01-22 17:21:02.653653677 +0000 UTC m=+0.160363841 container start 3442a1ee6a2c184f5fc6c40c63d18109f86621fd3a893b509606c67e3ad9914d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 22 17:21:02 compute-0 podman[226999]: 2026-01-22 17:21:02.676600352 +0000 UTC m=+0.070459879 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:21:02 compute-0 neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b[227002]: [NOTICE]   (227018) : New worker (227029) forked
Jan 22 17:21:02 compute-0 neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b[227002]: [NOTICE]   (227018) : Loading success.
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.717 183079 DEBUG nova.network.neutron [req-a2e308b1-aebb-4d71-b04b-a13daf03e1ea req-6055fdcc-cac1-4490-a856-67817f2eed3e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Updated VIF entry in instance network info cache for port 7cc526f2-d707-4153-92b0-f1be81a8b1e3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.717 183079 DEBUG nova.network.neutron [req-a2e308b1-aebb-4d71-b04b-a13daf03e1ea req-6055fdcc-cac1-4490-a856-67817f2eed3e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Updating instance_info_cache with network_info: [{"id": "7cc526f2-d707-4153-92b0-f1be81a8b1e3", "address": "fa:16:3e:31:9a:74", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cc526f2-d7", "ovs_interfaceid": "7cc526f2-d707-4153-92b0-f1be81a8b1e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:21:02 compute-0 nova_compute[183075]: 2026-01-22 17:21:02.729 183079 DEBUG oslo_concurrency.lockutils [req-a2e308b1-aebb-4d71-b04b-a13daf03e1ea req-6055fdcc-cac1-4490-a856-67817f2eed3e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-803da6ed-0f79-4c4d-b054-593b0dee0c0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:21:03 compute-0 nova_compute[183075]: 2026-01-22 17:21:03.152 183079 INFO nova.compute.manager [None req-978b2c05-098e-48e8-9562-b3673e137092 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Get console output
Jan 22 17:21:03 compute-0 nova_compute[183075]: 2026-01-22 17:21:03.157 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:21:04 compute-0 nova_compute[183075]: 2026-01-22 17:21:04.098 183079 DEBUG nova.compute.manager [req-25247c98-0b77-40c6-913d-3f01e8ad7fdd req-62ab8c2f-c8e8-4f97-98bf-e658020dfd00 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Received event network-vif-plugged-7cc526f2-d707-4153-92b0-f1be81a8b1e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:21:04 compute-0 nova_compute[183075]: 2026-01-22 17:21:04.099 183079 DEBUG oslo_concurrency.lockutils [req-25247c98-0b77-40c6-913d-3f01e8ad7fdd req-62ab8c2f-c8e8-4f97-98bf-e658020dfd00 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "803da6ed-0f79-4c4d-b054-593b0dee0c0b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:21:04 compute-0 nova_compute[183075]: 2026-01-22 17:21:04.099 183079 DEBUG oslo_concurrency.lockutils [req-25247c98-0b77-40c6-913d-3f01e8ad7fdd req-62ab8c2f-c8e8-4f97-98bf-e658020dfd00 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "803da6ed-0f79-4c4d-b054-593b0dee0c0b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:21:04 compute-0 nova_compute[183075]: 2026-01-22 17:21:04.099 183079 DEBUG oslo_concurrency.lockutils [req-25247c98-0b77-40c6-913d-3f01e8ad7fdd req-62ab8c2f-c8e8-4f97-98bf-e658020dfd00 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "803da6ed-0f79-4c4d-b054-593b0dee0c0b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:21:04 compute-0 nova_compute[183075]: 2026-01-22 17:21:04.100 183079 DEBUG nova.compute.manager [req-25247c98-0b77-40c6-913d-3f01e8ad7fdd req-62ab8c2f-c8e8-4f97-98bf-e658020dfd00 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] No waiting events found dispatching network-vif-plugged-7cc526f2-d707-4153-92b0-f1be81a8b1e3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:21:04 compute-0 nova_compute[183075]: 2026-01-22 17:21:04.100 183079 WARNING nova.compute.manager [req-25247c98-0b77-40c6-913d-3f01e8ad7fdd req-62ab8c2f-c8e8-4f97-98bf-e658020dfd00 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Received unexpected event network-vif-plugged-7cc526f2-d707-4153-92b0-f1be81a8b1e3 for instance with vm_state active and task_state None.
Jan 22 17:21:04 compute-0 nova_compute[183075]: 2026-01-22 17:21:04.899 183079 INFO nova.compute.manager [None req-b71e6760-55ea-4561-9790-b92debee0979 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Get console output
Jan 22 17:21:04 compute-0 nova_compute[183075]: 2026-01-22 17:21:04.904 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:21:06 compute-0 nova_compute[183075]: 2026-01-22 17:21:06.550 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:06 compute-0 nova_compute[183075]: 2026-01-22 17:21:06.625 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:07 compute-0 nova_compute[183075]: 2026-01-22 17:21:07.577 183079 DEBUG oslo_concurrency.lockutils [None req-53691b4d-8a9b-456f-a8a7-fafa8ce1222e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "01065d98-95b1-4364-b9dc-eaf2c3a6d8f8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:21:07 compute-0 nova_compute[183075]: 2026-01-22 17:21:07.578 183079 DEBUG oslo_concurrency.lockutils [None req-53691b4d-8a9b-456f-a8a7-fafa8ce1222e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "01065d98-95b1-4364-b9dc-eaf2c3a6d8f8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:21:07 compute-0 nova_compute[183075]: 2026-01-22 17:21:07.579 183079 DEBUG oslo_concurrency.lockutils [None req-53691b4d-8a9b-456f-a8a7-fafa8ce1222e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:21:07 compute-0 nova_compute[183075]: 2026-01-22 17:21:07.579 183079 DEBUG oslo_concurrency.lockutils [None req-53691b4d-8a9b-456f-a8a7-fafa8ce1222e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:21:07 compute-0 nova_compute[183075]: 2026-01-22 17:21:07.580 183079 DEBUG oslo_concurrency.lockutils [None req-53691b4d-8a9b-456f-a8a7-fafa8ce1222e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:21:07 compute-0 nova_compute[183075]: 2026-01-22 17:21:07.581 183079 INFO nova.compute.manager [None req-53691b4d-8a9b-456f-a8a7-fafa8ce1222e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Terminating instance
Jan 22 17:21:07 compute-0 nova_compute[183075]: 2026-01-22 17:21:07.582 183079 DEBUG nova.compute.manager [None req-53691b4d-8a9b-456f-a8a7-fafa8ce1222e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:21:07 compute-0 kernel: tap4452f367-08 (unregistering): left promiscuous mode
Jan 22 17:21:07 compute-0 NetworkManager[55454]: <info>  [1769102467.6281] device (tap4452f367-08): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:21:07 compute-0 ovn_controller[95372]: 2026-01-22T17:21:07Z|00434|binding|INFO|Releasing lport 4452f367-08e1-4434-a6fa-e97f48bf084c from this chassis (sb_readonly=0)
Jan 22 17:21:07 compute-0 ovn_controller[95372]: 2026-01-22T17:21:07Z|00435|binding|INFO|Setting lport 4452f367-08e1-4434-a6fa-e97f48bf084c down in Southbound
Jan 22 17:21:07 compute-0 ovn_controller[95372]: 2026-01-22T17:21:07Z|00436|binding|INFO|Removing iface tap4452f367-08 ovn-installed in OVS
Jan 22 17:21:07 compute-0 nova_compute[183075]: 2026-01-22 17:21:07.709 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:07 compute-0 nova_compute[183075]: 2026-01-22 17:21:07.725 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:07 compute-0 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000022.scope: Deactivated successfully.
Jan 22 17:21:07 compute-0 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000022.scope: Consumed 16.553s CPU time.
Jan 22 17:21:07 compute-0 systemd-machined[154382]: Machine qemu-34-instance-00000022 terminated.
Jan 22 17:21:07 compute-0 kernel: tap4452f367-08: entered promiscuous mode
Jan 22 17:21:07 compute-0 systemd-udevd[227040]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:21:07 compute-0 NetworkManager[55454]: <info>  [1769102467.8105] manager: (tap4452f367-08): new Tun device (/org/freedesktop/NetworkManager/Devices/189)
Jan 22 17:21:07 compute-0 ovn_controller[95372]: 2026-01-22T17:21:07Z|00437|if_status|INFO|Not updating pb chassis for 4452f367-08e1-4434-a6fa-e97f48bf084c now as sb is readonly
Jan 22 17:21:07 compute-0 nova_compute[183075]: 2026-01-22 17:21:07.816 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:07 compute-0 kernel: tap4452f367-08 (unregistering): left promiscuous mode
Jan 22 17:21:07 compute-0 ovn_controller[95372]: 2026-01-22T17:21:07Z|00438|binding|INFO|Releasing lport 4452f367-08e1-4434-a6fa-e97f48bf084c from this chassis (sb_readonly=1)
Jan 22 17:21:07 compute-0 ovn_controller[95372]: 2026-01-22T17:21:07Z|00439|if_status|INFO|Dropped 1 log messages in last 466 seconds (most recently, 466 seconds ago) due to excessive rate
Jan 22 17:21:07 compute-0 ovn_controller[95372]: 2026-01-22T17:21:07Z|00440|if_status|INFO|Not setting lport 4452f367-08e1-4434-a6fa-e97f48bf084c down as sb is readonly
Jan 22 17:21:07 compute-0 nova_compute[183075]: 2026-01-22 17:21:07.860 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:07 compute-0 nova_compute[183075]: 2026-01-22 17:21:07.875 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:07 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:07.881 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:a5:65 10.100.0.10'], port_security=['fa:16:3e:19:a5:65 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '01065d98-95b1-4364-b9dc-eaf2c3a6d8f8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'neutron:revision_number': '4', 'neutron:security_group_ids': '94e3530f-8012-4817-a338-7919b109ef3e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12343ce0-7cef-4f7f-9439-6550d878d4ba, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=4452f367-08e1-4434-a6fa-e97f48bf084c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:21:07 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:07.883 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 4452f367-08e1-4434-a6fa-e97f48bf084c in datapath 44326f3c-1431-44d6-85ce-61ecbbb5ed7a unbound from our chassis
Jan 22 17:21:07 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:07.885 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 44326f3c-1431-44d6-85ce-61ecbbb5ed7a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:21:07 compute-0 nova_compute[183075]: 2026-01-22 17:21:07.887 183079 INFO nova.virt.libvirt.driver [-] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Instance destroyed successfully.
Jan 22 17:21:07 compute-0 nova_compute[183075]: 2026-01-22 17:21:07.888 183079 DEBUG nova.objects.instance [None req-53691b4d-8a9b-456f-a8a7-fafa8ce1222e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lazy-loading 'resources' on Instance uuid 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:21:07 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:07.886 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8f4cabb9-57ad-476a-8cb1-30b891b4cf65]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:07 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:07.887 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a namespace which is not needed anymore
Jan 22 17:21:07 compute-0 nova_compute[183075]: 2026-01-22 17:21:07.941 183079 DEBUG nova.virt.libvirt.vif [None req-53691b4d-8a9b-456f-a8a7-fafa8ce1222e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:19:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-1-871110647',display_name='tempest-server-1-871110647',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-1-871110647',id=34,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMynqvoXBI34qH0ZDQNq01ZPmk41Xdi4JxkvNO4nJ7GrdggfhpXkKQhhUZRV3fv3bSwcXMqi04ipV9xe7IsTm4GBRV+U9o6VN5JSMLulp+KIvjPuEmjq0h65ra09/zQB9w==',key_name='tempest-keypair-test-1762910261',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:19:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e4c0bb18013747dfad2e25b2495090eb',ramdisk_id='',reservation_id='r-343r0wfy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus=
'usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-PortForwardingTestJSON-1240706675',owner_user_name='tempest-PortForwardingTestJSON-1240706675-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:19:39Z,user_data=None,user_id='852aea4e08344f39ae07e6b57393c767',uuid=01065d98-95b1-4364-b9dc-eaf2c3a6d8f8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4452f367-08e1-4434-a6fa-e97f48bf084c", "address": "fa:16:3e:19:a5:65", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4452f367-08", "ovs_interfaceid": "4452f367-08e1-4434-a6fa-e97f48bf084c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:21:07 compute-0 nova_compute[183075]: 2026-01-22 17:21:07.942 183079 DEBUG nova.network.os_vif_util [None req-53691b4d-8a9b-456f-a8a7-fafa8ce1222e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Converting VIF {"id": "4452f367-08e1-4434-a6fa-e97f48bf084c", "address": "fa:16:3e:19:a5:65", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4452f367-08", "ovs_interfaceid": "4452f367-08e1-4434-a6fa-e97f48bf084c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:21:07 compute-0 nova_compute[183075]: 2026-01-22 17:21:07.943 183079 DEBUG nova.network.os_vif_util [None req-53691b4d-8a9b-456f-a8a7-fafa8ce1222e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:a5:65,bridge_name='br-int',has_traffic_filtering=True,id=4452f367-08e1-4434-a6fa-e97f48bf084c,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4452f367-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:21:07 compute-0 nova_compute[183075]: 2026-01-22 17:21:07.944 183079 DEBUG os_vif [None req-53691b4d-8a9b-456f-a8a7-fafa8ce1222e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:a5:65,bridge_name='br-int',has_traffic_filtering=True,id=4452f367-08e1-4434-a6fa-e97f48bf084c,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4452f367-08') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:21:07 compute-0 nova_compute[183075]: 2026-01-22 17:21:07.947 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:07 compute-0 nova_compute[183075]: 2026-01-22 17:21:07.948 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4452f367-08, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:21:07 compute-0 nova_compute[183075]: 2026-01-22 17:21:07.954 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:07 compute-0 nova_compute[183075]: 2026-01-22 17:21:07.958 183079 INFO os_vif [None req-53691b4d-8a9b-456f-a8a7-fafa8ce1222e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:a5:65,bridge_name='br-int',has_traffic_filtering=True,id=4452f367-08e1-4434-a6fa-e97f48bf084c,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4452f367-08')
Jan 22 17:21:07 compute-0 nova_compute[183075]: 2026-01-22 17:21:07.959 183079 INFO nova.virt.libvirt.driver [None req-53691b4d-8a9b-456f-a8a7-fafa8ce1222e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Deleting instance files /var/lib/nova/instances/01065d98-95b1-4364-b9dc-eaf2c3a6d8f8_del
Jan 22 17:21:07 compute-0 nova_compute[183075]: 2026-01-22 17:21:07.961 183079 INFO nova.virt.libvirt.driver [None req-53691b4d-8a9b-456f-a8a7-fafa8ce1222e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Deletion of /var/lib/nova/instances/01065d98-95b1-4364-b9dc-eaf2c3a6d8f8_del complete
Jan 22 17:21:08 compute-0 nova_compute[183075]: 2026-01-22 17:21:08.070 183079 INFO nova.compute.manager [None req-53691b4d-8a9b-456f-a8a7-fafa8ce1222e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Took 0.49 seconds to destroy the instance on the hypervisor.
Jan 22 17:21:08 compute-0 nova_compute[183075]: 2026-01-22 17:21:08.071 183079 DEBUG oslo.service.loopingcall [None req-53691b4d-8a9b-456f-a8a7-fafa8ce1222e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:21:08 compute-0 nova_compute[183075]: 2026-01-22 17:21:08.071 183079 DEBUG nova.compute.manager [-] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:21:08 compute-0 nova_compute[183075]: 2026-01-22 17:21:08.072 183079 DEBUG nova.network.neutron [-] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:21:08 compute-0 neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[226230]: [NOTICE]   (226234) : haproxy version is 2.8.14-c23fe91
Jan 22 17:21:08 compute-0 neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[226230]: [NOTICE]   (226234) : path to executable is /usr/sbin/haproxy
Jan 22 17:21:08 compute-0 neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[226230]: [WARNING]  (226234) : Exiting Master process...
Jan 22 17:21:08 compute-0 neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[226230]: [ALERT]    (226234) : Current worker (226236) exited with code 143 (Terminated)
Jan 22 17:21:08 compute-0 neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[226230]: [WARNING]  (226234) : All workers exited. Exiting... (0)
Jan 22 17:21:08 compute-0 systemd[1]: libpod-78748a997f6fcf5d1ed1de9e255753aa5a5c7b5540551db9f55005e20505c8a2.scope: Deactivated successfully.
Jan 22 17:21:08 compute-0 podman[227069]: 2026-01-22 17:21:08.097443097 +0000 UTC m=+0.080263828 container died 78748a997f6fcf5d1ed1de9e255753aa5a5c7b5540551db9f55005e20505c8a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 22 17:21:08 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-78748a997f6fcf5d1ed1de9e255753aa5a5c7b5540551db9f55005e20505c8a2-userdata-shm.mount: Deactivated successfully.
Jan 22 17:21:08 compute-0 systemd[1]: var-lib-containers-storage-overlay-fee4a742c266e2f827121656d7e379fe1d600c08915bfe27ca923c1e944c78af-merged.mount: Deactivated successfully.
Jan 22 17:21:08 compute-0 podman[227069]: 2026-01-22 17:21:08.226954983 +0000 UTC m=+0.209775724 container cleanup 78748a997f6fcf5d1ed1de9e255753aa5a5c7b5540551db9f55005e20505c8a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 17:21:08 compute-0 systemd[1]: libpod-conmon-78748a997f6fcf5d1ed1de9e255753aa5a5c7b5540551db9f55005e20505c8a2.scope: Deactivated successfully.
Jan 22 17:21:08 compute-0 nova_compute[183075]: 2026-01-22 17:21:08.256 183079 INFO nova.compute.manager [None req-f720ccb9-4b0c-4682-b456-196a93e70e92 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Get console output
Jan 22 17:21:08 compute-0 nova_compute[183075]: 2026-01-22 17:21:08.261 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:21:08 compute-0 podman[227099]: 2026-01-22 17:21:08.296298222 +0000 UTC m=+0.042724057 container remove 78748a997f6fcf5d1ed1de9e255753aa5a5c7b5540551db9f55005e20505c8a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 17:21:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:08.301 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[cef94eb1-4d1c-48a4-8aca-a43b2203f46a]: (4, ('Thu Jan 22 05:21:07 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a (78748a997f6fcf5d1ed1de9e255753aa5a5c7b5540551db9f55005e20505c8a2)\n78748a997f6fcf5d1ed1de9e255753aa5a5c7b5540551db9f55005e20505c8a2\nThu Jan 22 05:21:08 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a (78748a997f6fcf5d1ed1de9e255753aa5a5c7b5540551db9f55005e20505c8a2)\n78748a997f6fcf5d1ed1de9e255753aa5a5c7b5540551db9f55005e20505c8a2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:08.303 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2f42d5f3-fa30-4464-bd65-b9c67fd2bf58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:08.304 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44326f3c-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:21:08 compute-0 nova_compute[183075]: 2026-01-22 17:21:08.307 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:08 compute-0 kernel: tap44326f3c-10: left promiscuous mode
Jan 22 17:21:08 compute-0 nova_compute[183075]: 2026-01-22 17:21:08.319 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:08.324 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[9dc640ac-b27c-472b-9e57-25f6592c76b2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:08.340 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f81a34da-7c97-427c-9bf1-6b499fe885de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:08.342 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8fd7e0fb-8c2f-45f8-b0ed-925c65463c58]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:08.356 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f4036262-b098-49e7-bb71-226b31e5492a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474279, 'reachable_time': 40160, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227114, 'error': None, 'target': 'ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:08.358 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:21:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:08.358 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[79aba3bb-2f2a-4f8c-ba61-8df14bf77c5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:08 compute-0 systemd[1]: run-netns-ovnmeta\x2d44326f3c\x2d1431\x2d44d6\x2d85ce\x2d61ecbbb5ed7a.mount: Deactivated successfully.
Jan 22 17:21:08 compute-0 nova_compute[183075]: 2026-01-22 17:21:08.581 183079 DEBUG nova.compute.manager [req-b01f58fe-7b9c-4bbe-bc31-974ae4cda3a2 req-08b976e3-3466-4176-9db1-d442672136b5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Received event network-vif-unplugged-4452f367-08e1-4434-a6fa-e97f48bf084c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:21:08 compute-0 nova_compute[183075]: 2026-01-22 17:21:08.582 183079 DEBUG oslo_concurrency.lockutils [req-b01f58fe-7b9c-4bbe-bc31-974ae4cda3a2 req-08b976e3-3466-4176-9db1-d442672136b5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:21:08 compute-0 nova_compute[183075]: 2026-01-22 17:21:08.582 183079 DEBUG oslo_concurrency.lockutils [req-b01f58fe-7b9c-4bbe-bc31-974ae4cda3a2 req-08b976e3-3466-4176-9db1-d442672136b5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:21:08 compute-0 nova_compute[183075]: 2026-01-22 17:21:08.582 183079 DEBUG oslo_concurrency.lockutils [req-b01f58fe-7b9c-4bbe-bc31-974ae4cda3a2 req-08b976e3-3466-4176-9db1-d442672136b5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:21:08 compute-0 nova_compute[183075]: 2026-01-22 17:21:08.583 183079 DEBUG nova.compute.manager [req-b01f58fe-7b9c-4bbe-bc31-974ae4cda3a2 req-08b976e3-3466-4176-9db1-d442672136b5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] No waiting events found dispatching network-vif-unplugged-4452f367-08e1-4434-a6fa-e97f48bf084c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:21:08 compute-0 nova_compute[183075]: 2026-01-22 17:21:08.583 183079 DEBUG nova.compute.manager [req-b01f58fe-7b9c-4bbe-bc31-974ae4cda3a2 req-08b976e3-3466-4176-9db1-d442672136b5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Received event network-vif-unplugged-4452f367-08e1-4434-a6fa-e97f48bf084c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:21:10 compute-0 nova_compute[183075]: 2026-01-22 17:21:10.448 183079 DEBUG nova.network.neutron [-] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:21:10 compute-0 nova_compute[183075]: 2026-01-22 17:21:10.503 183079 INFO nova.compute.manager [-] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Took 2.43 seconds to deallocate network for instance.
Jan 22 17:21:10 compute-0 nova_compute[183075]: 2026-01-22 17:21:10.636 183079 DEBUG oslo_concurrency.lockutils [None req-53691b4d-8a9b-456f-a8a7-fafa8ce1222e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:21:10 compute-0 nova_compute[183075]: 2026-01-22 17:21:10.636 183079 DEBUG oslo_concurrency.lockutils [None req-53691b4d-8a9b-456f-a8a7-fafa8ce1222e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:21:10 compute-0 nova_compute[183075]: 2026-01-22 17:21:10.665 183079 DEBUG nova.compute.manager [req-4dabbb98-f87e-4449-a7c3-d8bbef1cc341 req-23a6c388-1c92-4739-a18c-948f522e290f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Received event network-vif-plugged-4452f367-08e1-4434-a6fa-e97f48bf084c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:21:10 compute-0 nova_compute[183075]: 2026-01-22 17:21:10.666 183079 DEBUG oslo_concurrency.lockutils [req-4dabbb98-f87e-4449-a7c3-d8bbef1cc341 req-23a6c388-1c92-4739-a18c-948f522e290f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:21:10 compute-0 nova_compute[183075]: 2026-01-22 17:21:10.666 183079 DEBUG oslo_concurrency.lockutils [req-4dabbb98-f87e-4449-a7c3-d8bbef1cc341 req-23a6c388-1c92-4739-a18c-948f522e290f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:21:10 compute-0 nova_compute[183075]: 2026-01-22 17:21:10.667 183079 DEBUG oslo_concurrency.lockutils [req-4dabbb98-f87e-4449-a7c3-d8bbef1cc341 req-23a6c388-1c92-4739-a18c-948f522e290f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "01065d98-95b1-4364-b9dc-eaf2c3a6d8f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:21:10 compute-0 nova_compute[183075]: 2026-01-22 17:21:10.667 183079 DEBUG nova.compute.manager [req-4dabbb98-f87e-4449-a7c3-d8bbef1cc341 req-23a6c388-1c92-4739-a18c-948f522e290f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] No waiting events found dispatching network-vif-plugged-4452f367-08e1-4434-a6fa-e97f48bf084c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:21:10 compute-0 nova_compute[183075]: 2026-01-22 17:21:10.668 183079 WARNING nova.compute.manager [req-4dabbb98-f87e-4449-a7c3-d8bbef1cc341 req-23a6c388-1c92-4739-a18c-948f522e290f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Received unexpected event network-vif-plugged-4452f367-08e1-4434-a6fa-e97f48bf084c for instance with vm_state deleted and task_state None.
Jan 22 17:21:10 compute-0 nova_compute[183075]: 2026-01-22 17:21:10.724 183079 DEBUG nova.compute.provider_tree [None req-53691b4d-8a9b-456f-a8a7-fafa8ce1222e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:21:10 compute-0 nova_compute[183075]: 2026-01-22 17:21:10.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:21:10 compute-0 nova_compute[183075]: 2026-01-22 17:21:10.883 183079 DEBUG nova.scheduler.client.report [None req-53691b4d-8a9b-456f-a8a7-fafa8ce1222e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:21:10 compute-0 nova_compute[183075]: 2026-01-22 17:21:10.980 183079 DEBUG oslo_concurrency.lockutils [None req-53691b4d-8a9b-456f-a8a7-fafa8ce1222e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.344s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:21:11 compute-0 nova_compute[183075]: 2026-01-22 17:21:11.058 183079 INFO nova.scheduler.client.report [None req-53691b4d-8a9b-456f-a8a7-fafa8ce1222e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Deleted allocations for instance 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8
Jan 22 17:21:11 compute-0 nova_compute[183075]: 2026-01-22 17:21:11.197 183079 DEBUG oslo_concurrency.lockutils [None req-53691b4d-8a9b-456f-a8a7-fafa8ce1222e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "01065d98-95b1-4364-b9dc-eaf2c3a6d8f8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:21:11 compute-0 nova_compute[183075]: 2026-01-22 17:21:11.553 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:12 compute-0 podman[227115]: 2026-01-22 17:21:12.389380377 +0000 UTC m=+0.092074799 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, version=9.6, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-type=git, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, release=1755695350, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, name=ubi9-minimal, architecture=x86_64)
Jan 22 17:21:12 compute-0 podman[227137]: 2026-01-22 17:21:12.509956357 +0000 UTC m=+0.072371139 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 22 17:21:12 compute-0 podman[227136]: 2026-01-22 17:21:12.533104917 +0000 UTC m=+0.099896565 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 17:21:12 compute-0 nova_compute[183075]: 2026-01-22 17:21:12.951 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:13 compute-0 nova_compute[183075]: 2026-01-22 17:21:13.375 183079 INFO nova.compute.manager [None req-14274765-cdcb-43dd-a0be-84f1513e628b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Get console output
Jan 22 17:21:13 compute-0 nova_compute[183075]: 2026-01-22 17:21:13.382 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:21:14 compute-0 ovn_controller[95372]: 2026-01-22T17:21:14Z|00074|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:31:9a:74 10.100.0.9
Jan 22 17:21:14 compute-0 ovn_controller[95372]: 2026-01-22T17:21:14Z|00075|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:31:9a:74 10.100.0.9
Jan 22 17:21:14 compute-0 nova_compute[183075]: 2026-01-22 17:21:14.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:21:14 compute-0 nova_compute[183075]: 2026-01-22 17:21:14.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:21:15 compute-0 nova_compute[183075]: 2026-01-22 17:21:15.786 183079 DEBUG oslo_concurrency.lockutils [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "b4f5d7ef-7780-43fe-9ed3-e83542116fa8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:21:15 compute-0 nova_compute[183075]: 2026-01-22 17:21:15.787 183079 DEBUG oslo_concurrency.lockutils [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "b4f5d7ef-7780-43fe-9ed3-e83542116fa8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:21:15 compute-0 nova_compute[183075]: 2026-01-22 17:21:15.808 183079 DEBUG nova.compute.manager [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:21:15 compute-0 nova_compute[183075]: 2026-01-22 17:21:15.895 183079 DEBUG oslo_concurrency.lockutils [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:21:15 compute-0 nova_compute[183075]: 2026-01-22 17:21:15.895 183079 DEBUG oslo_concurrency.lockutils [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:21:15 compute-0 nova_compute[183075]: 2026-01-22 17:21:15.903 183079 DEBUG nova.virt.hardware [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:21:15 compute-0 nova_compute[183075]: 2026-01-22 17:21:15.903 183079 INFO nova.compute.claims [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.014 183079 DEBUG nova.compute.provider_tree [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.028 183079 DEBUG nova.scheduler.client.report [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.053 183079 DEBUG oslo_concurrency.lockutils [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.054 183079 DEBUG nova.compute.manager [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.110 183079 DEBUG nova.compute.manager [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.110 183079 DEBUG nova.network.neutron [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.134 183079 INFO nova.virt.libvirt.driver [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.151 183079 DEBUG nova.compute.manager [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.255 183079 DEBUG nova.compute.manager [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.257 183079 DEBUG nova.virt.libvirt.driver [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.257 183079 INFO nova.virt.libvirt.driver [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Creating image(s)
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.258 183079 DEBUG oslo_concurrency.lockutils [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "/var/lib/nova/instances/b4f5d7ef-7780-43fe-9ed3-e83542116fa8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.259 183079 DEBUG oslo_concurrency.lockutils [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "/var/lib/nova/instances/b4f5d7ef-7780-43fe-9ed3-e83542116fa8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.259 183079 DEBUG oslo_concurrency.lockutils [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "/var/lib/nova/instances/b4f5d7ef-7780-43fe-9ed3-e83542116fa8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.274 183079 DEBUG oslo_concurrency.processutils [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.352 183079 DEBUG oslo_concurrency.processutils [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.353 183079 DEBUG oslo_concurrency.lockutils [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.354 183079 DEBUG oslo_concurrency.lockutils [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.368 183079 DEBUG oslo_concurrency.processutils [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.418 183079 DEBUG oslo_concurrency.processutils [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.419 183079 DEBUG oslo_concurrency.processutils [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/b4f5d7ef-7780-43fe-9ed3-e83542116fa8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.449 183079 DEBUG oslo_concurrency.processutils [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/b4f5d7ef-7780-43fe-9ed3-e83542116fa8/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.450 183079 DEBUG oslo_concurrency.lockutils [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.451 183079 DEBUG oslo_concurrency.processutils [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.514 183079 DEBUG oslo_concurrency.processutils [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.516 183079 DEBUG nova.virt.disk.api [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Checking if we can resize image /var/lib/nova/instances/b4f5d7ef-7780-43fe-9ed3-e83542116fa8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.517 183079 DEBUG oslo_concurrency.processutils [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4f5d7ef-7780-43fe-9ed3-e83542116fa8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.562 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.572 183079 DEBUG nova.policy [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.579 183079 DEBUG oslo_concurrency.processutils [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4f5d7ef-7780-43fe-9ed3-e83542116fa8/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.580 183079 DEBUG nova.virt.disk.api [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Cannot resize image /var/lib/nova/instances/b4f5d7ef-7780-43fe-9ed3-e83542116fa8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.580 183079 DEBUG nova.objects.instance [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lazy-loading 'migration_context' on Instance uuid b4f5d7ef-7780-43fe-9ed3-e83542116fa8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.597 183079 DEBUG nova.virt.libvirt.driver [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.598 183079 DEBUG nova.virt.libvirt.driver [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Ensure instance console log exists: /var/lib/nova/instances/b4f5d7ef-7780-43fe-9ed3-e83542116fa8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.599 183079 DEBUG oslo_concurrency.lockutils [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.599 183079 DEBUG oslo_concurrency.lockutils [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:21:16 compute-0 nova_compute[183075]: 2026-01-22 17:21:16.600 183079 DEBUG oslo_concurrency.lockutils [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:21:17 compute-0 nova_compute[183075]: 2026-01-22 17:21:17.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:21:17 compute-0 nova_compute[183075]: 2026-01-22 17:21:17.953 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:18 compute-0 nova_compute[183075]: 2026-01-22 17:21:18.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:21:18 compute-0 nova_compute[183075]: 2026-01-22 17:21:18.948 183079 INFO nova.compute.manager [None req-b44826c5-6384-40ad-ace7-0c6704e01e84 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Get console output
Jan 22 17:21:18 compute-0 nova_compute[183075]: 2026-01-22 17:21:18.956 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:21:19 compute-0 nova_compute[183075]: 2026-01-22 17:21:19.300 183079 DEBUG nova.network.neutron [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Successfully updated port: ce445831-3685-4525-80bd-f4dc617c4911 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:21:19 compute-0 podman[227208]: 2026-01-22 17:21:19.41353371 +0000 UTC m=+0.108293467 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 22 17:21:19 compute-0 nova_compute[183075]: 2026-01-22 17:21:19.473 183079 DEBUG nova.compute.manager [req-2dd6ef4e-d2f6-478c-9240-b1a88d8ab6cd req-63472f70-ebde-4695-b139-d6aa447f54a0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Received event network-changed-ce445831-3685-4525-80bd-f4dc617c4911 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:21:19 compute-0 nova_compute[183075]: 2026-01-22 17:21:19.474 183079 DEBUG nova.compute.manager [req-2dd6ef4e-d2f6-478c-9240-b1a88d8ab6cd req-63472f70-ebde-4695-b139-d6aa447f54a0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Refreshing instance network info cache due to event network-changed-ce445831-3685-4525-80bd-f4dc617c4911. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:21:19 compute-0 nova_compute[183075]: 2026-01-22 17:21:19.474 183079 DEBUG oslo_concurrency.lockutils [req-2dd6ef4e-d2f6-478c-9240-b1a88d8ab6cd req-63472f70-ebde-4695-b139-d6aa447f54a0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-b4f5d7ef-7780-43fe-9ed3-e83542116fa8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:21:19 compute-0 nova_compute[183075]: 2026-01-22 17:21:19.474 183079 DEBUG oslo_concurrency.lockutils [req-2dd6ef4e-d2f6-478c-9240-b1a88d8ab6cd req-63472f70-ebde-4695-b139-d6aa447f54a0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-b4f5d7ef-7780-43fe-9ed3-e83542116fa8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:21:19 compute-0 nova_compute[183075]: 2026-01-22 17:21:19.475 183079 DEBUG nova.network.neutron [req-2dd6ef4e-d2f6-478c-9240-b1a88d8ab6cd req-63472f70-ebde-4695-b139-d6aa447f54a0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Refreshing network info cache for port ce445831-3685-4525-80bd-f4dc617c4911 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:21:19 compute-0 nova_compute[183075]: 2026-01-22 17:21:19.477 183079 DEBUG oslo_concurrency.lockutils [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "refresh_cache-b4f5d7ef-7780-43fe-9ed3-e83542116fa8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:21:19 compute-0 nova_compute[183075]: 2026-01-22 17:21:19.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:21:19 compute-0 nova_compute[183075]: 2026-01-22 17:21:19.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:21:20 compute-0 nova_compute[183075]: 2026-01-22 17:21:20.296 183079 DEBUG nova.network.neutron [req-2dd6ef4e-d2f6-478c-9240-b1a88d8ab6cd req-63472f70-ebde-4695-b139-d6aa447f54a0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:21:20 compute-0 nova_compute[183075]: 2026-01-22 17:21:20.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:21:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:20.935 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:21:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:20.936 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:21:20 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:21:20 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:21:20 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:21:20 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:21:20 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:21:20 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:21:20 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:21:21 compute-0 nova_compute[183075]: 2026-01-22 17:21:21.327 183079 DEBUG nova.network.neutron [req-2dd6ef4e-d2f6-478c-9240-b1a88d8ab6cd req-63472f70-ebde-4695-b139-d6aa447f54a0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:21:21 compute-0 nova_compute[183075]: 2026-01-22 17:21:21.330 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:21:21 compute-0 nova_compute[183075]: 2026-01-22 17:21:21.331 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:21:21 compute-0 nova_compute[183075]: 2026-01-22 17:21:21.331 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:21:21 compute-0 nova_compute[183075]: 2026-01-22 17:21:21.331 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:21:21 compute-0 nova_compute[183075]: 2026-01-22 17:21:21.349 183079 DEBUG oslo_concurrency.lockutils [req-2dd6ef4e-d2f6-478c-9240-b1a88d8ab6cd req-63472f70-ebde-4695-b139-d6aa447f54a0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-b4f5d7ef-7780-43fe-9ed3-e83542116fa8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:21:21 compute-0 nova_compute[183075]: 2026-01-22 17:21:21.349 183079 DEBUG oslo_concurrency.lockutils [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquired lock "refresh_cache-b4f5d7ef-7780-43fe-9ed3-e83542116fa8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:21:21 compute-0 nova_compute[183075]: 2026-01-22 17:21:21.350 183079 DEBUG nova.network.neutron [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:21:21 compute-0 nova_compute[183075]: 2026-01-22 17:21:21.487 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/803da6ed-0f79-4c4d-b054-593b0dee0c0b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:21:21 compute-0 nova_compute[183075]: 2026-01-22 17:21:21.514 183079 DEBUG nova.network.neutron [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.526 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:21:21 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[227029]: 10.100.0.9:41792 [22/Jan/2026:17:21:20.933] listener listener/metadata 0/0/0/594/594 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.528 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.5918751
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.536 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.537 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.553 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.553 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 168 time: 0.0164082
Jan 22 17:21:21 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[227029]: 10.100.0.9:41796 [22/Jan/2026:17:21:21.536] listener listener/metadata 0/0/0/17/17 200 152 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:21:21 compute-0 nova_compute[183075]: 2026-01-22 17:21:21.554 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/803da6ed-0f79-4c4d-b054-593b0dee0c0b/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:21:21 compute-0 nova_compute[183075]: 2026-01-22 17:21:21.555 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/803da6ed-0f79-4c4d-b054-593b0dee0c0b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.557 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.558 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.577 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:21:21 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[227029]: 10.100.0.9:41810 [22/Jan/2026:17:21:21.557] listener listener/metadata 0/0/0/20/20 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.578 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0196195
Jan 22 17:21:21 compute-0 nova_compute[183075]: 2026-01-22 17:21:21.592 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.595 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.596 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.608 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.609 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0130608
Jan 22 17:21:21 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[227029]: 10.100.0.9:41820 [22/Jan/2026:17:21:21.595] listener listener/metadata 0/0/0/14/14 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.614 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.614 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:21:21 compute-0 nova_compute[183075]: 2026-01-22 17:21:21.616 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/803da6ed-0f79-4c4d-b054-593b0dee0c0b/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.630 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.631 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0161712
Jan 22 17:21:21 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[227029]: 10.100.0.9:41832 [22/Jan/2026:17:21:21.613] listener listener/metadata 0/0/0/17/17 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.636 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.637 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.650 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:21:21 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[227029]: 10.100.0.9:41840 [22/Jan/2026:17:21:21.635] listener listener/metadata 0/0/0/15/15 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.651 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0140707
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.655 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.656 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.668 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.669 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0130007
Jan 22 17:21:21 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[227029]: 10.100.0.9:41852 [22/Jan/2026:17:21:21.655] listener listener/metadata 0/0/0/13/13 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.674 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.675 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.689 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.689 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 151 time: 0.0144920
Jan 22 17:21:21 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[227029]: 10.100.0.9:41858 [22/Jan/2026:17:21:21.673] listener listener/metadata 0/0/0/15/15 200 135 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.694 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.695 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.709 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.709 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0144582
Jan 22 17:21:21 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[227029]: 10.100.0.9:41874 [22/Jan/2026:17:21:21.694] listener listener/metadata 0/0/0/15/15 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.714 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.715 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.732 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:21:21 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[227029]: 10.100.0.9:41890 [22/Jan/2026:17:21:21.714] listener listener/metadata 0/0/0/19/19 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.733 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0183260
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.738 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.739 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:21:21 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[227029]: 10.100.0.9:41898 [22/Jan/2026:17:21:21.738] listener listener/metadata 0/0/0/24/24 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.763 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0235231
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.771 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.772 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.787 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.787 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0155687
Jan 22 17:21:21 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[227029]: 10.100.0.9:41900 [22/Jan/2026:17:21:21.771] listener listener/metadata 0/0/0/16/16 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.791 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.792 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:21:21 compute-0 nova_compute[183075]: 2026-01-22 17:21:21.798 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:21:21 compute-0 nova_compute[183075]: 2026-01-22 17:21:21.799 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5520MB free_disk=73.33951950073242GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:21:21 compute-0 nova_compute[183075]: 2026-01-22 17:21:21.799 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:21:21 compute-0 nova_compute[183075]: 2026-01-22 17:21:21.799 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.804 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.805 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0131240
Jan 22 17:21:21 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[227029]: 10.100.0.9:41908 [22/Jan/2026:17:21:21.791] listener listener/metadata 0/0/0/14/14 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.809 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.809 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.825 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:21:21 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[227029]: 10.100.0.9:41922 [22/Jan/2026:17:21:21.808] listener listener/metadata 0/0/0/16/16 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.825 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0157552
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.829 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.830 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.845 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.846 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0157094
Jan 22 17:21:21 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[227029]: 10.100.0.9:41926 [22/Jan/2026:17:21:21.829] listener listener/metadata 0/0/0/17/17 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.850 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.850 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.872 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:21:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:21.873 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0225921
Jan 22 17:21:21 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[227029]: 10.100.0.9:41930 [22/Jan/2026:17:21:21.850] listener listener/metadata 0/0/0/23/23 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:21:21 compute-0 nova_compute[183075]: 2026-01-22 17:21:21.875 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 803da6ed-0f79-4c4d-b054-593b0dee0c0b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:21:21 compute-0 nova_compute[183075]: 2026-01-22 17:21:21.875 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance b4f5d7ef-7780-43fe-9ed3-e83542116fa8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:21:21 compute-0 nova_compute[183075]: 2026-01-22 17:21:21.875 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:21:21 compute-0 nova_compute[183075]: 2026-01-22 17:21:21.875 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:21:21 compute-0 nova_compute[183075]: 2026-01-22 17:21:21.924 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:21:21 compute-0 nova_compute[183075]: 2026-01-22 17:21:21.941 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:21:21 compute-0 nova_compute[183075]: 2026-01-22 17:21:21.961 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:21:21 compute-0 nova_compute[183075]: 2026-01-22 17:21:21.962 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.719 183079 DEBUG nova.network.neutron [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Updating instance_info_cache with network_info: [{"id": "ce445831-3685-4525-80bd-f4dc617c4911", "address": "fa:16:3e:c5:38:5a", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce445831-36", "ovs_interfaceid": "ce445831-3685-4525-80bd-f4dc617c4911", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.748 183079 DEBUG oslo_concurrency.lockutils [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Releasing lock "refresh_cache-b4f5d7ef-7780-43fe-9ed3-e83542116fa8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.748 183079 DEBUG nova.compute.manager [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Instance network_info: |[{"id": "ce445831-3685-4525-80bd-f4dc617c4911", "address": "fa:16:3e:c5:38:5a", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce445831-36", "ovs_interfaceid": "ce445831-3685-4525-80bd-f4dc617c4911", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.751 183079 DEBUG nova.virt.libvirt.driver [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Start _get_guest_xml network_info=[{"id": "ce445831-3685-4525-80bd-f4dc617c4911", "address": "fa:16:3e:c5:38:5a", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce445831-36", "ovs_interfaceid": "ce445831-3685-4525-80bd-f4dc617c4911", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.755 183079 WARNING nova.virt.libvirt.driver [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.760 183079 DEBUG nova.virt.libvirt.host [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.761 183079 DEBUG nova.virt.libvirt.host [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.764 183079 DEBUG nova.virt.libvirt.host [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.765 183079 DEBUG nova.virt.libvirt.host [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.765 183079 DEBUG nova.virt.libvirt.driver [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.766 183079 DEBUG nova.virt.hardware [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.766 183079 DEBUG nova.virt.hardware [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.767 183079 DEBUG nova.virt.hardware [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.767 183079 DEBUG nova.virt.hardware [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.767 183079 DEBUG nova.virt.hardware [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.768 183079 DEBUG nova.virt.hardware [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.768 183079 DEBUG nova.virt.hardware [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.769 183079 DEBUG nova.virt.hardware [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.769 183079 DEBUG nova.virt.hardware [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.770 183079 DEBUG nova.virt.hardware [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.770 183079 DEBUG nova.virt.hardware [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.776 183079 DEBUG nova.virt.libvirt.vif [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:21:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-1-1949948609',display_name='tempest-server-1-1949948609',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-1-1949948609',id=37,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMynqvoXBI34qH0ZDQNq01ZPmk41Xdi4JxkvNO4nJ7GrdggfhpXkKQhhUZRV3fv3bSwcXMqi04ipV9xe7IsTm4GBRV+U9o6VN5JSMLulp+KIvjPuEmjq0h65ra09/zQB9w==',key_name='tempest-keypair-test-1762910261',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e4c0bb18013747dfad2e25b2495090eb',ramdisk_id='',reservation_id='r-j8vsskaf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',
image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortForwardingTestJSON-1240706675',owner_user_name='tempest-PortForwardingTestJSON-1240706675-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:21:16Z,user_data=None,user_id='852aea4e08344f39ae07e6b57393c767',uuid=b4f5d7ef-7780-43fe-9ed3-e83542116fa8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ce445831-3685-4525-80bd-f4dc617c4911", "address": "fa:16:3e:c5:38:5a", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce445831-36", "ovs_interfaceid": "ce445831-3685-4525-80bd-f4dc617c4911", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.776 183079 DEBUG nova.network.os_vif_util [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Converting VIF {"id": "ce445831-3685-4525-80bd-f4dc617c4911", "address": "fa:16:3e:c5:38:5a", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce445831-36", "ovs_interfaceid": "ce445831-3685-4525-80bd-f4dc617c4911", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.777 183079 DEBUG nova.network.os_vif_util [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c5:38:5a,bridge_name='br-int',has_traffic_filtering=True,id=ce445831-3685-4525-80bd-f4dc617c4911,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapce445831-36') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.778 183079 DEBUG nova.objects.instance [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lazy-loading 'pci_devices' on Instance uuid b4f5d7ef-7780-43fe-9ed3-e83542116fa8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.877 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769102467.8767712, 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.878 183079 INFO nova.compute.manager [-] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] VM Stopped (Lifecycle Event)
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.961 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.962 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.990 183079 DEBUG nova.virt.libvirt.driver [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:21:22 compute-0 nova_compute[183075]:   <uuid>b4f5d7ef-7780-43fe-9ed3-e83542116fa8</uuid>
Jan 22 17:21:22 compute-0 nova_compute[183075]:   <name>instance-00000025</name>
Jan 22 17:21:22 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:21:22 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:21:22 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:21:22 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:       <nova:name>tempest-server-1-1949948609</nova:name>
Jan 22 17:21:22 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:21:22</nova:creationTime>
Jan 22 17:21:22 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:21:22 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:21:22 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:21:22 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:21:22 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:21:22 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:21:22 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:21:22 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:21:22 compute-0 nova_compute[183075]:         <nova:user uuid="852aea4e08344f39ae07e6b57393c767">tempest-PortForwardingTestJSON-1240706675-project-member</nova:user>
Jan 22 17:21:22 compute-0 nova_compute[183075]:         <nova:project uuid="e4c0bb18013747dfad2e25b2495090eb">tempest-PortForwardingTestJSON-1240706675</nova:project>
Jan 22 17:21:22 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:21:22 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:21:22 compute-0 nova_compute[183075]:         <nova:port uuid="ce445831-3685-4525-80bd-f4dc617c4911">
Jan 22 17:21:22 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:21:22 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:21:22 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:21:22 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <system>
Jan 22 17:21:22 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:21:22 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:21:22 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:21:22 compute-0 nova_compute[183075]:       <entry name="serial">b4f5d7ef-7780-43fe-9ed3-e83542116fa8</entry>
Jan 22 17:21:22 compute-0 nova_compute[183075]:       <entry name="uuid">b4f5d7ef-7780-43fe-9ed3-e83542116fa8</entry>
Jan 22 17:21:22 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     </system>
Jan 22 17:21:22 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:21:22 compute-0 nova_compute[183075]:   <os>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:   </os>
Jan 22 17:21:22 compute-0 nova_compute[183075]:   <features>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:   </features>
Jan 22 17:21:22 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:21:22 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:21:22 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:21:22 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/b4f5d7ef-7780-43fe-9ed3-e83542116fa8/disk"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:21:22 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:c5:38:5a"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:       <target dev="tapce445831-36"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:21:22 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/b4f5d7ef-7780-43fe-9ed3-e83542116fa8/console.log" append="off"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <video>
Jan 22 17:21:22 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     </video>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:21:22 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:21:22 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:21:22 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:21:22 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:21:22 compute-0 nova_compute[183075]: </domain>
Jan 22 17:21:22 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.991 183079 DEBUG nova.compute.manager [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Preparing to wait for external event network-vif-plugged-ce445831-3685-4525-80bd-f4dc617c4911 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.992 183079 DEBUG oslo_concurrency.lockutils [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "b4f5d7ef-7780-43fe-9ed3-e83542116fa8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.992 183079 DEBUG oslo_concurrency.lockutils [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "b4f5d7ef-7780-43fe-9ed3-e83542116fa8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.992 183079 DEBUG oslo_concurrency.lockutils [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "b4f5d7ef-7780-43fe-9ed3-e83542116fa8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.993 183079 DEBUG nova.virt.libvirt.vif [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:21:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-1-1949948609',display_name='tempest-server-1-1949948609',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-1-1949948609',id=37,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMynqvoXBI34qH0ZDQNq01ZPmk41Xdi4JxkvNO4nJ7GrdggfhpXkKQhhUZRV3fv3bSwcXMqi04ipV9xe7IsTm4GBRV+U9o6VN5JSMLulp+KIvjPuEmjq0h65ra09/zQB9w==',key_name='tempest-keypair-test-1762910261',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e4c0bb18013747dfad2e25b2495090eb',ramdisk_id='',reservation_id='r-j8vsskaf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model
='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortForwardingTestJSON-1240706675',owner_user_name='tempest-PortForwardingTestJSON-1240706675-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:21:16Z,user_data=None,user_id='852aea4e08344f39ae07e6b57393c767',uuid=b4f5d7ef-7780-43fe-9ed3-e83542116fa8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ce445831-3685-4525-80bd-f4dc617c4911", "address": "fa:16:3e:c5:38:5a", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce445831-36", "ovs_interfaceid": "ce445831-3685-4525-80bd-f4dc617c4911", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.994 183079 DEBUG nova.network.os_vif_util [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Converting VIF {"id": "ce445831-3685-4525-80bd-f4dc617c4911", "address": "fa:16:3e:c5:38:5a", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce445831-36", "ovs_interfaceid": "ce445831-3685-4525-80bd-f4dc617c4911", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.995 183079 DEBUG nova.network.os_vif_util [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c5:38:5a,bridge_name='br-int',has_traffic_filtering=True,id=ce445831-3685-4525-80bd-f4dc617c4911,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapce445831-36') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.995 183079 DEBUG os_vif [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c5:38:5a,bridge_name='br-int',has_traffic_filtering=True,id=ce445831-3685-4525-80bd-f4dc617c4911,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapce445831-36') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.996 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.996 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.997 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:21:22 compute-0 nova_compute[183075]: 2026-01-22 17:21:22.997 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.000 183079 DEBUG nova.compute.manager [None req-9393c191-1af3-4db2-ac93-ac06c41accdd - - - - - -] [instance: 01065d98-95b1-4364-b9dc-eaf2c3a6d8f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.002 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.002 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapce445831-36, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.002 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapce445831-36, col_values=(('external_ids', {'iface-id': 'ce445831-3685-4525-80bd-f4dc617c4911', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c5:38:5a', 'vm-uuid': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.005 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:23 compute-0 NetworkManager[55454]: <info>  [1769102483.0066] manager: (tapce445831-36): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/190)
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.007 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.012 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.013 183079 INFO os_vif [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c5:38:5a,bridge_name='br-int',has_traffic_filtering=True,id=ce445831-3685-4525-80bd-f4dc617c4911,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapce445831-36')
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.024 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.025 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.195 183079 DEBUG nova.virt.libvirt.driver [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.195 183079 DEBUG nova.virt.libvirt.driver [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] No VIF found with MAC fa:16:3e:c5:38:5a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:21:23 compute-0 kernel: tapce445831-36: entered promiscuous mode
Jan 22 17:21:23 compute-0 NetworkManager[55454]: <info>  [1769102483.2675] manager: (tapce445831-36): new Tun device (/org/freedesktop/NetworkManager/Devices/191)
Jan 22 17:21:23 compute-0 ovn_controller[95372]: 2026-01-22T17:21:23Z|00441|binding|INFO|Claiming lport ce445831-3685-4525-80bd-f4dc617c4911 for this chassis.
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.269 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:23 compute-0 ovn_controller[95372]: 2026-01-22T17:21:23Z|00442|binding|INFO|ce445831-3685-4525-80bd-f4dc617c4911: Claiming fa:16:3e:c5:38:5a 10.100.0.3
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:23.279 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c5:38:5a 10.100.0.3'], port_security=['fa:16:3e:c5:38:5a 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'neutron:revision_number': '2', 'neutron:security_group_ids': '94e3530f-8012-4817-a338-7919b109ef3e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12343ce0-7cef-4f7f-9439-6550d878d4ba, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=ce445831-3685-4525-80bd-f4dc617c4911) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:23.281 104629 INFO neutron.agent.ovn.metadata.agent [-] Port ce445831-3685-4525-80bd-f4dc617c4911 in datapath 44326f3c-1431-44d6-85ce-61ecbbb5ed7a bound to our chassis
Jan 22 17:21:23 compute-0 ovn_controller[95372]: 2026-01-22T17:21:23Z|00443|binding|INFO|Setting lport ce445831-3685-4525-80bd-f4dc617c4911 ovn-installed in OVS
Jan 22 17:21:23 compute-0 ovn_controller[95372]: 2026-01-22T17:21:23Z|00444|binding|INFO|Setting lport ce445831-3685-4525-80bd-f4dc617c4911 up in Southbound
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:23.283 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 44326f3c-1431-44d6-85ce-61ecbbb5ed7a
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.285 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:23.298 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6257103b-6ff8-4ae8-a2c6-76c9d360b5ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:23.299 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap44326f3c-11 in ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:23.300 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap44326f3c-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:23.301 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d091ea5e-c826-4f84-81bf-229a6053e045]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:23.301 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[9e1f95e7-58b6-4369-acb1-82ce98f0090a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:23 compute-0 systemd-machined[154382]: New machine qemu-37-instance-00000025.
Jan 22 17:21:23 compute-0 systemd-udevd[227253]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:23.314 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[be473b69-ec9f-4380-9aa5-97b06aef8a0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:23 compute-0 NetworkManager[55454]: <info>  [1769102483.3245] device (tapce445831-36): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:21:23 compute-0 NetworkManager[55454]: <info>  [1769102483.3252] device (tapce445831-36): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:21:23 compute-0 systemd[1]: Started Virtual Machine qemu-37-instance-00000025.
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:23.330 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a816f441-46e9-4211-91ab-7bb617a80a88]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:23.365 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[c84c7052-f77b-4d16-9742-2434ae754d70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:23 compute-0 systemd-udevd[227256]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:23.372 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b027c47d-d8ee-4ec6-bf60-bc4d4906de96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:23 compute-0 NetworkManager[55454]: <info>  [1769102483.3732] manager: (tap44326f3c-10): new Veth device (/org/freedesktop/NetworkManager/Devices/192)
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:23.404 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[d4c3c1dc-4dd6-44d3-9916-c32470fd1acd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:23.407 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[f5ea7593-c0ba-4aa2-bb47-daf48939fb33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:23 compute-0 NetworkManager[55454]: <info>  [1769102483.4382] device (tap44326f3c-10): carrier: link connected
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:23.446 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[eb97461b-1784-4499-8ff5-9e7b45d76f5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.466 183079 DEBUG nova.compute.manager [req-72e1c87d-0b2b-4469-9384-02dc9a086a7f req-83c23765-ed14-410a-8487-b032a1fec934 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Received event network-vif-plugged-ce445831-3685-4525-80bd-f4dc617c4911 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.466 183079 DEBUG oslo_concurrency.lockutils [req-72e1c87d-0b2b-4469-9384-02dc9a086a7f req-83c23765-ed14-410a-8487-b032a1fec934 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "b4f5d7ef-7780-43fe-9ed3-e83542116fa8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.467 183079 DEBUG oslo_concurrency.lockutils [req-72e1c87d-0b2b-4469-9384-02dc9a086a7f req-83c23765-ed14-410a-8487-b032a1fec934 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b4f5d7ef-7780-43fe-9ed3-e83542116fa8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.467 183079 DEBUG oslo_concurrency.lockutils [req-72e1c87d-0b2b-4469-9384-02dc9a086a7f req-83c23765-ed14-410a-8487-b032a1fec934 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b4f5d7ef-7780-43fe-9ed3-e83542116fa8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.467 183079 DEBUG nova.compute.manager [req-72e1c87d-0b2b-4469-9384-02dc9a086a7f req-83c23765-ed14-410a-8487-b032a1fec934 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Processing event network-vif-plugged-ce445831-3685-4525-80bd-f4dc617c4911 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:23.467 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c56957d6-0292-427e-9576-32b2b6549137]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap44326f3c-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:1b:89'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 125], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 484714, 'reachable_time': 26366, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227284, 'error': None, 'target': 'ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:23.483 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ce813438-d706-4a53-a906-0272bb36e76a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe37:1b89'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 484714, 'tstamp': 484714}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227285, 'error': None, 'target': 'ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:23.498 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a091c557-9eaa-445a-af5c-b26fd7bbeb5d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap44326f3c-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:1b:89'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 125], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 484714, 'reachable_time': 26366, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227286, 'error': None, 'target': 'ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:23.533 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[578b033c-46c9-408a-bdf2-499eac7a5b7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:23.591 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[987c6e5c-a94c-4fff-9e78-1244db88293e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:23.593 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44326f3c-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:23.593 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:23.593 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap44326f3c-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.596 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:23 compute-0 NetworkManager[55454]: <info>  [1769102483.5965] manager: (tap44326f3c-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/193)
Jan 22 17:21:23 compute-0 kernel: tap44326f3c-10: entered promiscuous mode
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:23.600 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap44326f3c-10, col_values=(('external_ids', {'iface-id': '118957e0-7da0-4d87-b7d4-2c204e19e5b6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:21:23 compute-0 ovn_controller[95372]: 2026-01-22T17:21:23Z|00445|binding|INFO|Releasing lport 118957e0-7da0-4d87-b7d4-2c204e19e5b6 from this chassis (sb_readonly=0)
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.602 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:23.603 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/44326f3c-1431-44d6-85ce-61ecbbb5ed7a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/44326f3c-1431-44d6-85ce-61ecbbb5ed7a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:23.603 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[da6c4e54-08e2-4ee5-afb9-0e586902a0f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:23.604 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/44326f3c-1431-44d6-85ce-61ecbbb5ed7a.pid.haproxy
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 44326f3c-1431-44d6-85ce-61ecbbb5ed7a
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:21:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:23.605 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'env', 'PROCESS_TAG=haproxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/44326f3c-1431-44d6-85ce-61ecbbb5ed7a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.613 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.871 183079 DEBUG nova.compute.manager [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.873 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102483.871408, b4f5d7ef-7780-43fe-9ed3-e83542116fa8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.873 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] VM Started (Lifecycle Event)
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.883 183079 DEBUG nova.virt.libvirt.driver [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.886 183079 INFO nova.virt.libvirt.driver [-] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Instance spawned successfully.
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.887 183079 DEBUG nova.virt.libvirt.driver [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.893 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.895 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.921 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.922 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102483.8723333, b4f5d7ef-7780-43fe-9ed3-e83542116fa8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.922 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] VM Paused (Lifecycle Event)
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.926 183079 DEBUG nova.virt.libvirt.driver [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.926 183079 DEBUG nova.virt.libvirt.driver [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.927 183079 DEBUG nova.virt.libvirt.driver [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.927 183079 DEBUG nova.virt.libvirt.driver [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.928 183079 DEBUG nova.virt.libvirt.driver [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.928 183079 DEBUG nova.virt.libvirt.driver [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.954 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.957 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102483.8782923, b4f5d7ef-7780-43fe-9ed3-e83542116fa8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.958 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] VM Resumed (Lifecycle Event)
Jan 22 17:21:23 compute-0 podman[227322]: 2026-01-22 17:21:23.973261303 +0000 UTC m=+0.052023813 container create d6408edcaee6f293a780cdb210de85f2c13c219ecb5789eaebe87747d660f204 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.985 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.991 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.998 183079 INFO nova.compute.manager [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Took 7.74 seconds to spawn the instance on the hypervisor.
Jan 22 17:21:23 compute-0 nova_compute[183075]: 2026-01-22 17:21:23.998 183079 DEBUG nova.compute.manager [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:21:24 compute-0 nova_compute[183075]: 2026-01-22 17:21:24.010 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:21:24 compute-0 systemd[1]: Started libpod-conmon-d6408edcaee6f293a780cdb210de85f2c13c219ecb5789eaebe87747d660f204.scope.
Jan 22 17:21:24 compute-0 podman[227322]: 2026-01-22 17:21:23.946835406 +0000 UTC m=+0.025597946 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:21:24 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:21:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/edf6d6b1b0aa25cb11fb54da7a8df253d0383fd7d373a159881698b2490805e3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:21:24 compute-0 nova_compute[183075]: 2026-01-22 17:21:24.213 183079 INFO nova.compute.manager [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Took 8.34 seconds to build instance.
Jan 22 17:21:24 compute-0 nova_compute[183075]: 2026-01-22 17:21:24.680 183079 INFO nova.compute.manager [None req-1d92b00a-a15d-40ce-8de3-856df429372c 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Get console output
Jan 22 17:21:24 compute-0 nova_compute[183075]: 2026-01-22 17:21:24.687 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:21:24 compute-0 podman[227322]: 2026-01-22 17:21:24.755197997 +0000 UTC m=+0.833960547 container init d6408edcaee6f293a780cdb210de85f2c13c219ecb5789eaebe87747d660f204 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 22 17:21:24 compute-0 podman[227322]: 2026-01-22 17:21:24.769132294 +0000 UTC m=+0.847894834 container start d6408edcaee6f293a780cdb210de85f2c13c219ecb5789eaebe87747d660f204 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:21:24 compute-0 neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[227337]: [NOTICE]   (227342) : New worker (227344) forked
Jan 22 17:21:24 compute-0 neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[227337]: [NOTICE]   (227342) : Loading success.
Jan 22 17:21:25 compute-0 nova_compute[183075]: 2026-01-22 17:21:25.127 183079 DEBUG oslo_concurrency.lockutils [None req-336ee9cc-ed13-45e2-93d6-5f0e8207d5b2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "b4f5d7ef-7780-43fe-9ed3-e83542116fa8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.340s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:21:25 compute-0 nova_compute[183075]: 2026-01-22 17:21:25.508 183079 INFO nova.compute.manager [None req-b4788440-8738-4676-862f-843e6e013a21 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Get console output
Jan 22 17:21:25 compute-0 nova_compute[183075]: 2026-01-22 17:21:25.515 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:21:25 compute-0 nova_compute[183075]: 2026-01-22 17:21:25.611 183079 DEBUG nova.compute.manager [req-59d654e3-30cd-41f5-97d5-bde13d654517 req-b600a59b-8c5e-4949-967a-67b359f8355e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Received event network-vif-plugged-ce445831-3685-4525-80bd-f4dc617c4911 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:21:25 compute-0 nova_compute[183075]: 2026-01-22 17:21:25.612 183079 DEBUG oslo_concurrency.lockutils [req-59d654e3-30cd-41f5-97d5-bde13d654517 req-b600a59b-8c5e-4949-967a-67b359f8355e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "b4f5d7ef-7780-43fe-9ed3-e83542116fa8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:21:25 compute-0 nova_compute[183075]: 2026-01-22 17:21:25.612 183079 DEBUG oslo_concurrency.lockutils [req-59d654e3-30cd-41f5-97d5-bde13d654517 req-b600a59b-8c5e-4949-967a-67b359f8355e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b4f5d7ef-7780-43fe-9ed3-e83542116fa8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:21:25 compute-0 nova_compute[183075]: 2026-01-22 17:21:25.613 183079 DEBUG oslo_concurrency.lockutils [req-59d654e3-30cd-41f5-97d5-bde13d654517 req-b600a59b-8c5e-4949-967a-67b359f8355e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b4f5d7ef-7780-43fe-9ed3-e83542116fa8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:21:25 compute-0 nova_compute[183075]: 2026-01-22 17:21:25.614 183079 DEBUG nova.compute.manager [req-59d654e3-30cd-41f5-97d5-bde13d654517 req-b600a59b-8c5e-4949-967a-67b359f8355e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] No waiting events found dispatching network-vif-plugged-ce445831-3685-4525-80bd-f4dc617c4911 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:21:25 compute-0 nova_compute[183075]: 2026-01-22 17:21:25.614 183079 WARNING nova.compute.manager [req-59d654e3-30cd-41f5-97d5-bde13d654517 req-b600a59b-8c5e-4949-967a-67b359f8355e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Received unexpected event network-vif-plugged-ce445831-3685-4525-80bd-f4dc617c4911 for instance with vm_state active and task_state None.
Jan 22 17:21:26 compute-0 podman[227353]: 2026-01-22 17:21:26.395515241 +0000 UTC m=+0.090416475 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:21:26 compute-0 nova_compute[183075]: 2026-01-22 17:21:26.596 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:28 compute-0 nova_compute[183075]: 2026-01-22 17:21:28.006 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:31 compute-0 nova_compute[183075]: 2026-01-22 17:21:31.599 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:32 compute-0 nova_compute[183075]: 2026-01-22 17:21:32.758 183079 INFO nova.compute.manager [None req-c1d438b5-bd69-4d79-98a4-cbdb0cc218ef 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Get console output
Jan 22 17:21:32 compute-0 nova_compute[183075]: 2026-01-22 17:21:32.765 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:21:33 compute-0 nova_compute[183075]: 2026-01-22 17:21:33.009 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:33 compute-0 podman[227374]: 2026-01-22 17:21:33.374587814 +0000 UTC m=+0.071325622 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.110 183079 DEBUG oslo_concurrency.lockutils [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "2ee81c99-07a0-41e3-bfac-dfb718a8e4c6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.111 183079 DEBUG oslo_concurrency.lockutils [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "2ee81c99-07a0-41e3-bfac-dfb718a8e4c6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.126 183079 DEBUG nova.compute.manager [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.192 183079 DEBUG oslo_concurrency.lockutils [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.193 183079 DEBUG oslo_concurrency.lockutils [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.200 183079 DEBUG nova.virt.hardware [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.200 183079 INFO nova.compute.claims [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.374 183079 DEBUG nova.compute.provider_tree [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.447 183079 DEBUG nova.scheduler.client.report [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.470 183079 DEBUG oslo_concurrency.lockutils [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.277s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.471 183079 DEBUG nova.compute.manager [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.517 183079 DEBUG nova.compute.manager [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.517 183079 DEBUG nova.network.neutron [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.535 183079 INFO nova.virt.libvirt.driver [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.556 183079 DEBUG nova.compute.manager [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.655 183079 DEBUG nova.compute.manager [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.657 183079 DEBUG nova.virt.libvirt.driver [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.657 183079 INFO nova.virt.libvirt.driver [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Creating image(s)
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.658 183079 DEBUG oslo_concurrency.lockutils [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "/var/lib/nova/instances/2ee81c99-07a0-41e3-bfac-dfb718a8e4c6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.658 183079 DEBUG oslo_concurrency.lockutils [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "/var/lib/nova/instances/2ee81c99-07a0-41e3-bfac-dfb718a8e4c6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.659 183079 DEBUG oslo_concurrency.lockutils [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "/var/lib/nova/instances/2ee81c99-07a0-41e3-bfac-dfb718a8e4c6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.679 183079 DEBUG oslo_concurrency.processutils [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.754 183079 DEBUG oslo_concurrency.processutils [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.756 183079 DEBUG oslo_concurrency.lockutils [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.757 183079 DEBUG oslo_concurrency.lockutils [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.774 183079 DEBUG oslo_concurrency.processutils [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.829 183079 DEBUG oslo_concurrency.processutils [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.831 183079 DEBUG oslo_concurrency.processutils [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/2ee81c99-07a0-41e3-bfac-dfb718a8e4c6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.868 183079 DEBUG oslo_concurrency.processutils [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/2ee81c99-07a0-41e3-bfac-dfb718a8e4c6/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.870 183079 DEBUG oslo_concurrency.lockutils [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.871 183079 DEBUG oslo_concurrency.processutils [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.926 183079 DEBUG oslo_concurrency.processutils [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.928 183079 DEBUG nova.virt.disk.api [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Checking if we can resize image /var/lib/nova/instances/2ee81c99-07a0-41e3-bfac-dfb718a8e4c6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.929 183079 DEBUG oslo_concurrency.processutils [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ee81c99-07a0-41e3-bfac-dfb718a8e4c6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.993 183079 DEBUG oslo_concurrency.processutils [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ee81c99-07a0-41e3-bfac-dfb718a8e4c6/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.994 183079 DEBUG nova.virt.disk.api [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Cannot resize image /var/lib/nova/instances/2ee81c99-07a0-41e3-bfac-dfb718a8e4c6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:21:34 compute-0 nova_compute[183075]: 2026-01-22 17:21:34.995 183079 DEBUG nova.objects.instance [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lazy-loading 'migration_context' on Instance uuid 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:21:35 compute-0 nova_compute[183075]: 2026-01-22 17:21:35.016 183079 DEBUG nova.virt.libvirt.driver [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:21:35 compute-0 nova_compute[183075]: 2026-01-22 17:21:35.017 183079 DEBUG nova.virt.libvirt.driver [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Ensure instance console log exists: /var/lib/nova/instances/2ee81c99-07a0-41e3-bfac-dfb718a8e4c6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:21:35 compute-0 nova_compute[183075]: 2026-01-22 17:21:35.018 183079 DEBUG oslo_concurrency.lockutils [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:21:35 compute-0 nova_compute[183075]: 2026-01-22 17:21:35.018 183079 DEBUG oslo_concurrency.lockutils [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:21:35 compute-0 nova_compute[183075]: 2026-01-22 17:21:35.019 183079 DEBUG oslo_concurrency.lockutils [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:21:35 compute-0 ovn_controller[95372]: 2026-01-22T17:21:35Z|00076|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c5:38:5a 10.100.0.3
Jan 22 17:21:35 compute-0 ovn_controller[95372]: 2026-01-22T17:21:35Z|00077|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c5:38:5a 10.100.0.3
Jan 22 17:21:35 compute-0 nova_compute[183075]: 2026-01-22 17:21:35.541 183079 DEBUG nova.policy [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:21:36 compute-0 nova_compute[183075]: 2026-01-22 17:21:36.601 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:36 compute-0 nova_compute[183075]: 2026-01-22 17:21:36.846 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:21:37 compute-0 nova_compute[183075]: 2026-01-22 17:21:37.426 183079 DEBUG nova.network.neutron [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Successfully updated port: d6bc5516-013e-4fd3-981b-a500723e5cc6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:21:37 compute-0 nova_compute[183075]: 2026-01-22 17:21:37.450 183079 DEBUG oslo_concurrency.lockutils [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "refresh_cache-2ee81c99-07a0-41e3-bfac-dfb718a8e4c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:21:37 compute-0 nova_compute[183075]: 2026-01-22 17:21:37.451 183079 DEBUG oslo_concurrency.lockutils [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquired lock "refresh_cache-2ee81c99-07a0-41e3-bfac-dfb718a8e4c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:21:37 compute-0 nova_compute[183075]: 2026-01-22 17:21:37.451 183079 DEBUG nova.network.neutron [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:21:37 compute-0 nova_compute[183075]: 2026-01-22 17:21:37.508 183079 DEBUG nova.compute.manager [req-cadb06f7-e305-43a2-8944-87060b385097 req-ca0819dc-4342-4473-a0a2-d13f3cdf29ed a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Received event network-changed-d6bc5516-013e-4fd3-981b-a500723e5cc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:21:37 compute-0 nova_compute[183075]: 2026-01-22 17:21:37.509 183079 DEBUG nova.compute.manager [req-cadb06f7-e305-43a2-8944-87060b385097 req-ca0819dc-4342-4473-a0a2-d13f3cdf29ed a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Refreshing instance network info cache due to event network-changed-d6bc5516-013e-4fd3-981b-a500723e5cc6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:21:37 compute-0 nova_compute[183075]: 2026-01-22 17:21:37.509 183079 DEBUG oslo_concurrency.lockutils [req-cadb06f7-e305-43a2-8944-87060b385097 req-ca0819dc-4342-4473-a0a2-d13f3cdf29ed a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-2ee81c99-07a0-41e3-bfac-dfb718a8e4c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:21:37 compute-0 nova_compute[183075]: 2026-01-22 17:21:37.623 183079 DEBUG nova.network.neutron [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:21:37 compute-0 nova_compute[183075]: 2026-01-22 17:21:37.884 183079 INFO nova.compute.manager [None req-6a4c45a8-693e-45e9-b263-10bc723aa7d7 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Get console output
Jan 22 17:21:37 compute-0 nova_compute[183075]: 2026-01-22 17:21:37.891 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:21:38 compute-0 nova_compute[183075]: 2026-01-22 17:21:38.013 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.332 183079 DEBUG nova.network.neutron [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Updating instance_info_cache with network_info: [{"id": "d6bc5516-013e-4fd3-981b-a500723e5cc6", "address": "fa:16:3e:13:5a:48", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6bc5516-01", "ovs_interfaceid": "d6bc5516-013e-4fd3-981b-a500723e5cc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.357 183079 DEBUG oslo_concurrency.lockutils [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Releasing lock "refresh_cache-2ee81c99-07a0-41e3-bfac-dfb718a8e4c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.357 183079 DEBUG nova.compute.manager [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Instance network_info: |[{"id": "d6bc5516-013e-4fd3-981b-a500723e5cc6", "address": "fa:16:3e:13:5a:48", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6bc5516-01", "ovs_interfaceid": "d6bc5516-013e-4fd3-981b-a500723e5cc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.357 183079 DEBUG oslo_concurrency.lockutils [req-cadb06f7-e305-43a2-8944-87060b385097 req-ca0819dc-4342-4473-a0a2-d13f3cdf29ed a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-2ee81c99-07a0-41e3-bfac-dfb718a8e4c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.357 183079 DEBUG nova.network.neutron [req-cadb06f7-e305-43a2-8944-87060b385097 req-ca0819dc-4342-4473-a0a2-d13f3cdf29ed a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Refreshing network info cache for port d6bc5516-013e-4fd3-981b-a500723e5cc6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.360 183079 DEBUG nova.virt.libvirt.driver [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Start _get_guest_xml network_info=[{"id": "d6bc5516-013e-4fd3-981b-a500723e5cc6", "address": "fa:16:3e:13:5a:48", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6bc5516-01", "ovs_interfaceid": "d6bc5516-013e-4fd3-981b-a500723e5cc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.365 183079 WARNING nova.virt.libvirt.driver [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.371 183079 DEBUG nova.virt.libvirt.host [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.372 183079 DEBUG nova.virt.libvirt.host [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.380 183079 DEBUG nova.virt.libvirt.host [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.381 183079 DEBUG nova.virt.libvirt.host [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.382 183079 DEBUG nova.virt.libvirt.driver [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.382 183079 DEBUG nova.virt.hardware [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.383 183079 DEBUG nova.virt.hardware [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.384 183079 DEBUG nova.virt.hardware [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.384 183079 DEBUG nova.virt.hardware [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.385 183079 DEBUG nova.virt.hardware [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.385 183079 DEBUG nova.virt.hardware [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.386 183079 DEBUG nova.virt.hardware [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.386 183079 DEBUG nova.virt.hardware [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.387 183079 DEBUG nova.virt.hardware [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.387 183079 DEBUG nova.virt.hardware [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.388 183079 DEBUG nova.virt.hardware [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.394 183079 DEBUG nova.virt.libvirt.vif [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:21:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-766639800',display_name='tempest-server-test-766639800',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-766639800',id=38,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBZ1addGiefngOIZky2bZbl/LpQHm8ydIPPcYG47RZ8D1pquJ/FsAcVn7mNe0QyfiQPXvuSs1eSw/jGOy+x3K8tzxt5N8fIelXKcTizhDHr81ws2uYn+dW6F9Hiy6pJp3A==',key_name='tempest-keypair-test-453437320',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='02818155e7af4645bc909d4ba671f11f',ramdisk_id='',reservation_id='r-ttafj00f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='vir
tio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSameNetwork-953620552',owner_user_name='tempest-FloatingIpSameNetwork-953620552-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:21:34Z,user_data=None,user_id='1148a46489e842e6a0c7660c54567798',uuid=2ee81c99-07a0-41e3-bfac-dfb718a8e4c6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d6bc5516-013e-4fd3-981b-a500723e5cc6", "address": "fa:16:3e:13:5a:48", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6bc5516-01", "ovs_interfaceid": "d6bc5516-013e-4fd3-981b-a500723e5cc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.394 183079 DEBUG nova.network.os_vif_util [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converting VIF {"id": "d6bc5516-013e-4fd3-981b-a500723e5cc6", "address": "fa:16:3e:13:5a:48", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6bc5516-01", "ovs_interfaceid": "d6bc5516-013e-4fd3-981b-a500723e5cc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.396 183079 DEBUG nova.network.os_vif_util [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:13:5a:48,bridge_name='br-int',has_traffic_filtering=True,id=d6bc5516-013e-4fd3-981b-a500723e5cc6,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd6bc5516-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.397 183079 DEBUG nova.objects.instance [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lazy-loading 'pci_devices' on Instance uuid 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.416 183079 DEBUG nova.virt.libvirt.driver [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:21:39 compute-0 nova_compute[183075]:   <uuid>2ee81c99-07a0-41e3-bfac-dfb718a8e4c6</uuid>
Jan 22 17:21:39 compute-0 nova_compute[183075]:   <name>instance-00000026</name>
Jan 22 17:21:39 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:21:39 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:21:39 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:21:39 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-766639800</nova:name>
Jan 22 17:21:39 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:21:39</nova:creationTime>
Jan 22 17:21:39 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:21:39 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:21:39 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:21:39 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:21:39 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:21:39 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:21:39 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:21:39 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:21:39 compute-0 nova_compute[183075]:         <nova:user uuid="1148a46489e842e6a0c7660c54567798">tempest-FloatingIpSameNetwork-953620552-project-member</nova:user>
Jan 22 17:21:39 compute-0 nova_compute[183075]:         <nova:project uuid="02818155e7af4645bc909d4ba671f11f">tempest-FloatingIpSameNetwork-953620552</nova:project>
Jan 22 17:21:39 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:21:39 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:21:39 compute-0 nova_compute[183075]:         <nova:port uuid="d6bc5516-013e-4fd3-981b-a500723e5cc6">
Jan 22 17:21:39 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:21:39 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:21:39 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:21:39 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <system>
Jan 22 17:21:39 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:21:39 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:21:39 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:21:39 compute-0 nova_compute[183075]:       <entry name="serial">2ee81c99-07a0-41e3-bfac-dfb718a8e4c6</entry>
Jan 22 17:21:39 compute-0 nova_compute[183075]:       <entry name="uuid">2ee81c99-07a0-41e3-bfac-dfb718a8e4c6</entry>
Jan 22 17:21:39 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     </system>
Jan 22 17:21:39 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:21:39 compute-0 nova_compute[183075]:   <os>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:   </os>
Jan 22 17:21:39 compute-0 nova_compute[183075]:   <features>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:   </features>
Jan 22 17:21:39 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:21:39 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:21:39 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:21:39 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/2ee81c99-07a0-41e3-bfac-dfb718a8e4c6/disk"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:21:39 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:13:5a:48"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:       <target dev="tapd6bc5516-01"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:21:39 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/2ee81c99-07a0-41e3-bfac-dfb718a8e4c6/console.log" append="off"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <video>
Jan 22 17:21:39 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     </video>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:21:39 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:21:39 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:21:39 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:21:39 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:21:39 compute-0 nova_compute[183075]: </domain>
Jan 22 17:21:39 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.418 183079 DEBUG nova.compute.manager [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Preparing to wait for external event network-vif-plugged-d6bc5516-013e-4fd3-981b-a500723e5cc6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.418 183079 DEBUG oslo_concurrency.lockutils [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.420 183079 DEBUG oslo_concurrency.lockutils [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.420 183079 DEBUG oslo_concurrency.lockutils [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.421 183079 DEBUG nova.virt.libvirt.vif [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:21:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-766639800',display_name='tempest-server-test-766639800',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-766639800',id=38,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBZ1addGiefngOIZky2bZbl/LpQHm8ydIPPcYG47RZ8D1pquJ/FsAcVn7mNe0QyfiQPXvuSs1eSw/jGOy+x3K8tzxt5N8fIelXKcTizhDHr81ws2uYn+dW6F9Hiy6pJp3A==',key_name='tempest-keypair-test-453437320',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='02818155e7af4645bc909d4ba671f11f',ramdisk_id='',reservation_id='r-ttafj00f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSameNetwork-953620552',owner_user_name='tempest-FloatingIpSameNetwork-953620552-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:21:34Z,user_data=None,user_id='1148a46489e842e6a0c7660c54567798',uuid=2ee81c99-07a0-41e3-bfac-dfb718a8e4c6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d6bc5516-013e-4fd3-981b-a500723e5cc6", "address": "fa:16:3e:13:5a:48", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6bc5516-01", "ovs_interfaceid": "d6bc5516-013e-4fd3-981b-a500723e5cc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.422 183079 DEBUG nova.network.os_vif_util [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converting VIF {"id": "d6bc5516-013e-4fd3-981b-a500723e5cc6", "address": "fa:16:3e:13:5a:48", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6bc5516-01", "ovs_interfaceid": "d6bc5516-013e-4fd3-981b-a500723e5cc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.423 183079 DEBUG nova.network.os_vif_util [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:13:5a:48,bridge_name='br-int',has_traffic_filtering=True,id=d6bc5516-013e-4fd3-981b-a500723e5cc6,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd6bc5516-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.424 183079 DEBUG os_vif [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:13:5a:48,bridge_name='br-int',has_traffic_filtering=True,id=d6bc5516-013e-4fd3-981b-a500723e5cc6,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd6bc5516-01') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.425 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.425 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.426 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.430 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.430 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd6bc5516-01, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.431 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd6bc5516-01, col_values=(('external_ids', {'iface-id': 'd6bc5516-013e-4fd3-981b-a500723e5cc6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:13:5a:48', 'vm-uuid': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.434 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:39 compute-0 NetworkManager[55454]: <info>  [1769102499.4353] manager: (tapd6bc5516-01): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/194)
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.437 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.446 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.447 183079 INFO os_vif [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:13:5a:48,bridge_name='br-int',has_traffic_filtering=True,id=d6bc5516-013e-4fd3-981b-a500723e5cc6,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd6bc5516-01')
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.516 183079 DEBUG nova.virt.libvirt.driver [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.516 183079 DEBUG nova.virt.libvirt.driver [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] No VIF found with MAC fa:16:3e:13:5a:48, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:21:39 compute-0 kernel: tapd6bc5516-01: entered promiscuous mode
Jan 22 17:21:39 compute-0 NetworkManager[55454]: <info>  [1769102499.5905] manager: (tapd6bc5516-01): new Tun device (/org/freedesktop/NetworkManager/Devices/195)
Jan 22 17:21:39 compute-0 ovn_controller[95372]: 2026-01-22T17:21:39Z|00446|binding|INFO|Claiming lport d6bc5516-013e-4fd3-981b-a500723e5cc6 for this chassis.
Jan 22 17:21:39 compute-0 ovn_controller[95372]: 2026-01-22T17:21:39Z|00447|binding|INFO|d6bc5516-013e-4fd3-981b-a500723e5cc6: Claiming fa:16:3e:13:5a:48 10.100.0.4
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.593 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:39.606 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:13:5a:48 10.100.0.4'], port_security=['fa:16:3e:13:5a:48 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eee918a6-66b2-47ae-b702-620a23ef395b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02818155e7af4645bc909d4ba671f11f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c960fc2f-fbcd-48ff-b6bd-d3b900f07332', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bad11b1d-0e62-4467-b3f8-4f80f0fd75da, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=d6bc5516-013e-4fd3-981b-a500723e5cc6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:21:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:39.609 104629 INFO neutron.agent.ovn.metadata.agent [-] Port d6bc5516-013e-4fd3-981b-a500723e5cc6 in datapath eee918a6-66b2-47ae-b702-620a23ef395b bound to our chassis
Jan 22 17:21:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:39.613 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network eee918a6-66b2-47ae-b702-620a23ef395b
Jan 22 17:21:39 compute-0 ovn_controller[95372]: 2026-01-22T17:21:39Z|00448|binding|INFO|Setting lport d6bc5516-013e-4fd3-981b-a500723e5cc6 ovn-installed in OVS
Jan 22 17:21:39 compute-0 ovn_controller[95372]: 2026-01-22T17:21:39Z|00449|binding|INFO|Setting lport d6bc5516-013e-4fd3-981b-a500723e5cc6 up in Southbound
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.629 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.632 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:39.639 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[5d28c65d-1b72-42ed-95ec-2588eefe4963]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:39 compute-0 systemd-udevd[227447]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:21:39 compute-0 NetworkManager[55454]: <info>  [1769102499.6557] device (tapd6bc5516-01): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:21:39 compute-0 NetworkManager[55454]: <info>  [1769102499.6563] device (tapd6bc5516-01): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:21:39 compute-0 systemd-machined[154382]: New machine qemu-38-instance-00000026.
Jan 22 17:21:39 compute-0 systemd[1]: Started Virtual Machine qemu-38-instance-00000026.
Jan 22 17:21:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:39.680 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[b54cb2fd-2f32-4c5e-a28c-7319bd569c05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:39.685 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[614da00e-5588-4e2a-b4a0-6de93c66bbae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:39.721 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[eb7a09ec-cf4f-463a-9365-64d66302d3c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:39.748 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[57b9077f-6ef8-4f24-8e3a-91986357ceb6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeee918a6-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:e2:7e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 54, 'rx_bytes': 8920, 'tx_bytes': 6146, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 54, 'rx_bytes': 8920, 'tx_bytes': 6146, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 122], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482564, 'reachable_time': 38352, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227459, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:39.772 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b1273df4-5800-49d0-9eda-c26fdf8138a4]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapeee918a6-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 482576, 'tstamp': 482576}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227462, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapeee918a6-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 482580, 'tstamp': 482580}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227462, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:21:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:39.774 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeee918a6-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.776 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.777 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:39.777 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeee918a6-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:21:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:39.778 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:21:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:39.778 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapeee918a6-60, col_values=(('external_ids', {'iface-id': '15d4de90-41f4-4532-aebd-197c2a33c6d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:21:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:39.778 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.832 183079 DEBUG nova.compute.manager [req-cc3d5032-9138-47d2-b6b3-86efdbe4b7f9 req-25421e65-241c-4479-a045-f7dfabc73fa1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Received event network-vif-plugged-d6bc5516-013e-4fd3-981b-a500723e5cc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.832 183079 DEBUG oslo_concurrency.lockutils [req-cc3d5032-9138-47d2-b6b3-86efdbe4b7f9 req-25421e65-241c-4479-a045-f7dfabc73fa1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.833 183079 DEBUG oslo_concurrency.lockutils [req-cc3d5032-9138-47d2-b6b3-86efdbe4b7f9 req-25421e65-241c-4479-a045-f7dfabc73fa1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.833 183079 DEBUG oslo_concurrency.lockutils [req-cc3d5032-9138-47d2-b6b3-86efdbe4b7f9 req-25421e65-241c-4479-a045-f7dfabc73fa1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:21:39 compute-0 nova_compute[183075]: 2026-01-22 17:21:39.833 183079 DEBUG nova.compute.manager [req-cc3d5032-9138-47d2-b6b3-86efdbe4b7f9 req-25421e65-241c-4479-a045-f7dfabc73fa1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Processing event network-vif-plugged-d6bc5516-013e-4fd3-981b-a500723e5cc6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:21:40 compute-0 nova_compute[183075]: 2026-01-22 17:21:40.063 183079 DEBUG nova.compute.manager [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:21:40 compute-0 nova_compute[183075]: 2026-01-22 17:21:40.065 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102500.0625193, 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:21:40 compute-0 nova_compute[183075]: 2026-01-22 17:21:40.065 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] VM Started (Lifecycle Event)
Jan 22 17:21:40 compute-0 nova_compute[183075]: 2026-01-22 17:21:40.071 183079 DEBUG nova.virt.libvirt.driver [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:21:40 compute-0 nova_compute[183075]: 2026-01-22 17:21:40.075 183079 INFO nova.virt.libvirt.driver [-] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Instance spawned successfully.
Jan 22 17:21:40 compute-0 nova_compute[183075]: 2026-01-22 17:21:40.075 183079 DEBUG nova.virt.libvirt.driver [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:21:40 compute-0 nova_compute[183075]: 2026-01-22 17:21:40.098 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:21:40 compute-0 nova_compute[183075]: 2026-01-22 17:21:40.107 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:21:40 compute-0 nova_compute[183075]: 2026-01-22 17:21:40.115 183079 DEBUG nova.virt.libvirt.driver [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:21:40 compute-0 nova_compute[183075]: 2026-01-22 17:21:40.116 183079 DEBUG nova.virt.libvirt.driver [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:21:40 compute-0 nova_compute[183075]: 2026-01-22 17:21:40.117 183079 DEBUG nova.virt.libvirt.driver [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:21:40 compute-0 nova_compute[183075]: 2026-01-22 17:21:40.117 183079 DEBUG nova.virt.libvirt.driver [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:21:40 compute-0 nova_compute[183075]: 2026-01-22 17:21:40.118 183079 DEBUG nova.virt.libvirt.driver [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:21:40 compute-0 nova_compute[183075]: 2026-01-22 17:21:40.119 183079 DEBUG nova.virt.libvirt.driver [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:21:40 compute-0 nova_compute[183075]: 2026-01-22 17:21:40.156 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:21:40 compute-0 nova_compute[183075]: 2026-01-22 17:21:40.157 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102500.063085, 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:21:40 compute-0 nova_compute[183075]: 2026-01-22 17:21:40.158 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] VM Paused (Lifecycle Event)
Jan 22 17:21:40 compute-0 nova_compute[183075]: 2026-01-22 17:21:40.189 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:21:40 compute-0 nova_compute[183075]: 2026-01-22 17:21:40.195 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102500.0707328, 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:21:40 compute-0 nova_compute[183075]: 2026-01-22 17:21:40.195 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] VM Resumed (Lifecycle Event)
Jan 22 17:21:40 compute-0 nova_compute[183075]: 2026-01-22 17:21:40.203 183079 INFO nova.compute.manager [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Took 5.55 seconds to spawn the instance on the hypervisor.
Jan 22 17:21:40 compute-0 nova_compute[183075]: 2026-01-22 17:21:40.205 183079 DEBUG nova.compute.manager [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:21:40 compute-0 nova_compute[183075]: 2026-01-22 17:21:40.220 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:21:40 compute-0 nova_compute[183075]: 2026-01-22 17:21:40.225 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:21:40 compute-0 nova_compute[183075]: 2026-01-22 17:21:40.262 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:21:40 compute-0 nova_compute[183075]: 2026-01-22 17:21:40.306 183079 INFO nova.compute.manager [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Took 6.14 seconds to build instance.
Jan 22 17:21:40 compute-0 nova_compute[183075]: 2026-01-22 17:21:40.332 183079 DEBUG oslo_concurrency.lockutils [None req-c059b3ad-135f-446f-8deb-dc777b183267 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "2ee81c99-07a0-41e3-bfac-dfb718a8e4c6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:21:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:40.796 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:21:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:40.798 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:21:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:21:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:21:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:21:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:21:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:21:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:21:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:21:40 compute-0 nova_compute[183075]: 2026-01-22 17:21:40.833 183079 INFO nova.compute.manager [None req-d9530738-ad65-4872-a876-6b61dd7949d0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Get console output
Jan 22 17:21:40 compute-0 nova_compute[183075]: 2026-01-22 17:21:40.838 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:21:41 compute-0 nova_compute[183075]: 2026-01-22 17:21:41.342 183079 DEBUG nova.network.neutron [req-cadb06f7-e305-43a2-8944-87060b385097 req-ca0819dc-4342-4473-a0a2-d13f3cdf29ed a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Updated VIF entry in instance network info cache for port d6bc5516-013e-4fd3-981b-a500723e5cc6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:21:41 compute-0 nova_compute[183075]: 2026-01-22 17:21:41.342 183079 DEBUG nova.network.neutron [req-cadb06f7-e305-43a2-8944-87060b385097 req-ca0819dc-4342-4473-a0a2-d13f3cdf29ed a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Updating instance_info_cache with network_info: [{"id": "d6bc5516-013e-4fd3-981b-a500723e5cc6", "address": "fa:16:3e:13:5a:48", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6bc5516-01", "ovs_interfaceid": "d6bc5516-013e-4fd3-981b-a500723e5cc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:21:41 compute-0 nova_compute[183075]: 2026-01-22 17:21:41.367 183079 DEBUG oslo_concurrency.lockutils [req-cadb06f7-e305-43a2-8944-87060b385097 req-ca0819dc-4342-4473-a0a2-d13f3cdf29ed a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-2ee81c99-07a0-41e3-bfac-dfb718a8e4c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.498 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:21:41 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[227344]: 10.100.0.3:40618 [22/Jan/2026:17:21:40.795] listener listener/metadata 0/0/0/704/704 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.500 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.7028949
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.508 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.509 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.532 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.532 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0230100
Jan 22 17:21:41 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[227344]: 10.100.0.3:40634 [22/Jan/2026:17:21:41.508] listener listener/metadata 0/0/0/23/23 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.537 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.537 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.552 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.552 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0152757
Jan 22 17:21:41 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[227344]: 10.100.0.3:40638 [22/Jan/2026:17:21:41.536] listener listener/metadata 0/0/0/16/16 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.558 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.559 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.581 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.581 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0222459
Jan 22 17:21:41 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[227344]: 10.100.0.3:40642 [22/Jan/2026:17:21:41.558] listener listener/metadata 0/0/0/23/23 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.589 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.590 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:21:41 compute-0 nova_compute[183075]: 2026-01-22 17:21:41.605 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.610 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.611 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0210631
Jan 22 17:21:41 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[227344]: 10.100.0.3:40654 [22/Jan/2026:17:21:41.588] listener listener/metadata 0/0/0/23/23 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.626 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.627 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.646 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.647 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0198429
Jan 22 17:21:41 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[227344]: 10.100.0.3:40662 [22/Jan/2026:17:21:41.626] listener listener/metadata 0/0/0/21/21 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.656 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.657 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.683 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.684 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0269625
Jan 22 17:21:41 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[227344]: 10.100.0.3:40678 [22/Jan/2026:17:21:41.655] listener listener/metadata 0/0/0/28/28 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.692 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.693 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.714 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.715 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0217559
Jan 22 17:21:41 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[227344]: 10.100.0.3:40684 [22/Jan/2026:17:21:41.692] listener listener/metadata 0/0/0/23/23 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.723 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.724 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.740 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:21:41 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[227344]: 10.100.0.3:40700 [22/Jan/2026:17:21:41.722] listener listener/metadata 0/0/0/18/18 200 147 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.741 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 163 time: 0.0171409
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.749 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.750 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.766 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:21:41 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[227344]: 10.100.0.3:40708 [22/Jan/2026:17:21:41.748] listener listener/metadata 0/0/0/18/18 200 147 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.766 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 163 time: 0.0166798
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.774 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.775 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:21:41 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[227344]: 10.100.0.3:40720 [22/Jan/2026:17:21:41.773] listener listener/metadata 0/0/0/19/19 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.792 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0178797
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.807 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.809 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.828 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.829 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0199249
Jan 22 17:21:41 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[227344]: 10.100.0.3:40722 [22/Jan/2026:17:21:41.806] listener listener/metadata 0/0/0/22/22 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.834 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.835 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.855 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:21:41 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[227344]: 10.100.0.3:40724 [22/Jan/2026:17:21:41.834] listener listener/metadata 0/0/0/21/21 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.856 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0206342
Jan 22 17:21:41 compute-0 rsyslogd[1006]: imjournal from <np0005592449:ovn_metadata_agent>: begin to drop messages due to rate-limiting
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.869 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.871 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.887 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.887 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0162945
Jan 22 17:21:41 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[227344]: 10.100.0.3:40730 [22/Jan/2026:17:21:41.869] listener listener/metadata 0/0/0/18/18 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.893 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.894 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.912 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.912 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 163 time: 0.0185764
Jan 22 17:21:41 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[227344]: 10.100.0.3:40740 [22/Jan/2026:17:21:41.893] listener listener/metadata 0/0/0/19/19 200 147 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.924 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.926 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:21:41 compute-0 nova_compute[183075]: 2026-01-22 17:21:41.930 183079 DEBUG nova.compute.manager [req-ddb85d77-f6de-45cf-9f05-2ff5412d951e req-b4e3ef6d-cd05-4749-909c-0d744068c32a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Received event network-vif-plugged-d6bc5516-013e-4fd3-981b-a500723e5cc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:21:41 compute-0 nova_compute[183075]: 2026-01-22 17:21:41.930 183079 DEBUG oslo_concurrency.lockutils [req-ddb85d77-f6de-45cf-9f05-2ff5412d951e req-b4e3ef6d-cd05-4749-909c-0d744068c32a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:21:41 compute-0 nova_compute[183075]: 2026-01-22 17:21:41.930 183079 DEBUG oslo_concurrency.lockutils [req-ddb85d77-f6de-45cf-9f05-2ff5412d951e req-b4e3ef6d-cd05-4749-909c-0d744068c32a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:21:41 compute-0 nova_compute[183075]: 2026-01-22 17:21:41.931 183079 DEBUG oslo_concurrency.lockutils [req-ddb85d77-f6de-45cf-9f05-2ff5412d951e req-b4e3ef6d-cd05-4749-909c-0d744068c32a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:21:41 compute-0 nova_compute[183075]: 2026-01-22 17:21:41.931 183079 DEBUG nova.compute.manager [req-ddb85d77-f6de-45cf-9f05-2ff5412d951e req-b4e3ef6d-cd05-4749-909c-0d744068c32a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] No waiting events found dispatching network-vif-plugged-d6bc5516-013e-4fd3-981b-a500723e5cc6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:21:41 compute-0 nova_compute[183075]: 2026-01-22 17:21:41.931 183079 WARNING nova.compute.manager [req-ddb85d77-f6de-45cf-9f05-2ff5412d951e req-b4e3ef6d-cd05-4749-909c-0d744068c32a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Received unexpected event network-vif-plugged-d6bc5516-013e-4fd3-981b-a500723e5cc6 for instance with vm_state active and task_state None.
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.938 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.939 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.940 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.944 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:21:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:41.945 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0191145
Jan 22 17:21:41 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[227344]: 10.100.0.3:40754 [22/Jan/2026:17:21:41.924] listener listener/metadata 0/0/0/20/20 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:21:43 compute-0 nova_compute[183075]: 2026-01-22 17:21:43.044 183079 INFO nova.compute.manager [None req-a701a069-77cd-4f6d-b648-e45fd6cdce5d 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Get console output
Jan 22 17:21:43 compute-0 nova_compute[183075]: 2026-01-22 17:21:43.050 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:21:43 compute-0 podman[227472]: 2026-01-22 17:21:43.38129631 +0000 UTC m=+0.064576994 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent)
Jan 22 17:21:43 compute-0 podman[227473]: 2026-01-22 17:21:43.407206373 +0000 UTC m=+0.094443312 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, config_id=openstack_network_exporter, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 
'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41)
Jan 22 17:21:43 compute-0 podman[227471]: 2026-01-22 17:21:43.434944625 +0000 UTC m=+0.124169756 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, 
org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Jan 22 17:21:44 compute-0 nova_compute[183075]: 2026-01-22 17:21:44.470 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:45 compute-0 nova_compute[183075]: 2026-01-22 17:21:45.959 183079 INFO nova.compute.manager [None req-af51bd33-4317-4729-bddc-43ab7d890ee7 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Get console output
Jan 22 17:21:45 compute-0 nova_compute[183075]: 2026-01-22 17:21:45.966 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:21:46 compute-0 nova_compute[183075]: 2026-01-22 17:21:46.608 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:48 compute-0 nova_compute[183075]: 2026-01-22 17:21:48.227 183079 INFO nova.compute.manager [None req-6b3d0b0d-7f34-48a1-bb32-d89d34e218ba 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Get console output
Jan 22 17:21:48 compute-0 nova_compute[183075]: 2026-01-22 17:21:48.235 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:21:49 compute-0 nova_compute[183075]: 2026-01-22 17:21:49.473 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:50 compute-0 podman[227536]: 2026-01-22 17:21:50.382307541 +0000 UTC m=+0.079294002 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 17:21:51 compute-0 nova_compute[183075]: 2026-01-22 17:21:51.084 183079 INFO nova.compute.manager [None req-203ff70d-6238-4239-8edc-6c10f52aad42 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Get console output
Jan 22 17:21:51 compute-0 nova_compute[183075]: 2026-01-22 17:21:51.092 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:21:51 compute-0 nova_compute[183075]: 2026-01-22 17:21:51.656 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:53 compute-0 ovn_controller[95372]: 2026-01-22T17:21:53Z|00078|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:13:5a:48 10.100.0.4
Jan 22 17:21:53 compute-0 ovn_controller[95372]: 2026-01-22T17:21:53Z|00079|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:13:5a:48 10.100.0.4
Jan 22 17:21:53 compute-0 nova_compute[183075]: 2026-01-22 17:21:53.384 183079 INFO nova.compute.manager [None req-664e6476-be34-4aef-a358-0e4d2653d183 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Get console output
Jan 22 17:21:53 compute-0 nova_compute[183075]: 2026-01-22 17:21:53.392 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:21:54 compute-0 nova_compute[183075]: 2026-01-22 17:21:54.522 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.457 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8', 'name': 'tempest-server-1-1949948609', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000025', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e4c0bb18013747dfad2e25b2495090eb', 'user_id': '852aea4e08344f39ae07e6b57393c767', 'hostId': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.461 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '803da6ed-0f79-4c4d-b054-593b0dee0c0b', 'name': 'tempest-server-test-1984705589', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000024', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '02818155e7af4645bc909d4ba671f11f', 'user_id': '1148a46489e842e6a0c7660c54567798', 'hostId': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.464 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6', 'name': 'tempest-server-test-766639800', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000026', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '02818155e7af4645bc909d4ba671f11f', 'user_id': '1148a46489e842e6a0c7660c54567798', 'hostId': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.465 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.465 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.465 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-1-1949948609>, <NovaLikeServer: tempest-server-test-1984705589>, <NovaLikeServer: tempest-server-test-766639800>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-1-1949948609>, <NovaLikeServer: tempest-server-test-1984705589>, <NovaLikeServer: tempest-server-test-766639800>]
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.466 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.471 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for b4f5d7ef-7780-43fe-9ed3-e83542116fa8 / tapce445831-36 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.471 12 DEBUG ceilometer.compute.pollsters [-] b4f5d7ef-7780-43fe-9ed3-e83542116fa8/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.475 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 803da6ed-0f79-4c4d-b054-593b0dee0c0b / tap7cc526f2-d7 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.476 12 DEBUG ceilometer.compute.pollsters [-] 803da6ed-0f79-4c4d-b054-593b0dee0c0b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.479 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6 / tapd6bc5516-01 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.480 12 DEBUG ceilometer.compute.pollsters [-] 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e107f1b1-3d64-4c33-a299-06ac4a714bc5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-00000025-b4f5d7ef-7780-43fe-9ed3-e83542116fa8-tapce445831-36', 'timestamp': '2026-01-22T17:21:55.466819', 'resource_metadata': {'display_name': 'tempest-server-1-1949948609', 'name': 'tapce445831-36', 'instance_id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c5:38:5a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapce445831-36'}, 'message_id': 'd99084de-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.227270997, 'message_signature': '8a522f76be69dc36d50b55e93628a516665048dc0506452742ebb424ee93404b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000024-803da6ed-0f79-4c4d-b054-593b0dee0c0b-tap7cc526f2-d7', 'timestamp': '2026-01-22T17:21:55.466819', 'resource_metadata': {'display_name': 'tempest-server-test-1984705589', 'name': 'tap7cc526f2-d7', 'instance_id': '803da6ed-0f79-4c4d-b054-593b0dee0c0b', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:9a:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7cc526f2-d7'}, 'message_id': 'd991376c-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.232847144, 'message_signature': 'fbbc76a734108e20e8f66213bb69a3fdfd39f10ab158bff85aa2f96fd1952ca6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000026-2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-tapd6bc5516-01', 'timestamp': '2026-01-22T17:21:55.466819', 'resource_metadata': {'display_name': 'tempest-server-test-766639800', 'name': 'tapd6bc5516-01', 'instance_id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:13:5a:48', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd6bc5516-01'}, 'message_id': 'd991dd16-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.237398574, 'message_signature': 'f8f3b95a361b1445e766fdf993ca6ef35bcf10d8a7db24e23473014487f36f48'}]}, 'timestamp': '2026-01-22 17:21:55.481263', '_unique_id': '4e240f4f48604bbda72820e7c23a1f06'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.483 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.485 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.486 12 DEBUG ceilometer.compute.pollsters [-] b4f5d7ef-7780-43fe-9ed3-e83542116fa8/network.incoming.bytes volume: 7114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.486 12 DEBUG ceilometer.compute.pollsters [-] 803da6ed-0f79-4c4d-b054-593b0dee0c0b/network.incoming.bytes volume: 7410 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.487 12 DEBUG ceilometer.compute.pollsters [-] 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6/network.incoming.bytes volume: 1346 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c1bf040-e0dd-4b6a-aa07-0b06873e7640', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7114, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-00000025-b4f5d7ef-7780-43fe-9ed3-e83542116fa8-tapce445831-36', 'timestamp': '2026-01-22T17:21:55.486142', 'resource_metadata': {'display_name': 'tempest-server-1-1949948609', 'name': 'tapce445831-36', 'instance_id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c5:38:5a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapce445831-36'}, 'message_id': 'd992b808-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.227270997, 'message_signature': 'bdda6fd17f9b1c25c5381be9078b0b601086db649a262276df9cc172d00ac4c0'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7410, 'user_id': 
'1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000024-803da6ed-0f79-4c4d-b054-593b0dee0c0b-tap7cc526f2-d7', 'timestamp': '2026-01-22T17:21:55.486142', 'resource_metadata': {'display_name': 'tempest-server-test-1984705589', 'name': 'tap7cc526f2-d7', 'instance_id': '803da6ed-0f79-4c4d-b054-593b0dee0c0b', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:9a:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7cc526f2-d7'}, 'message_id': 'd992d0fe-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.232847144, 'message_signature': 'c71c0e894b0e0308f06dd9110f14a7af2f72d5923f2d778ed4ef9ce895128253'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1346, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000026-2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-tapd6bc5516-01', 'timestamp': '2026-01-22T17:21:55.486142', 'resource_metadata': {'display_name': 'tempest-server-test-766639800', 'name': 'tapd6bc5516-01', 'instance_id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:13:5a:48', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd6bc5516-01'}, 'message_id': 'd992ec4c-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.237398574, 'message_signature': '65b6fadcc1042ce17caaf921b799e73592e46d801b0dbbea4368c7da72c7652c'}]}, 'timestamp': '2026-01-22 17:21:55.488109', '_unique_id': '607720d610e14c1f8414a512726b0c86'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.489 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.490 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.491 12 DEBUG ceilometer.compute.pollsters [-] b4f5d7ef-7780-43fe-9ed3-e83542116fa8/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.491 12 DEBUG ceilometer.compute.pollsters [-] 803da6ed-0f79-4c4d-b054-593b0dee0c0b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.492 12 DEBUG ceilometer.compute.pollsters [-] 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2bfc59e2-5db0-48b0-b974-d8d8fca0f7dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-00000025-b4f5d7ef-7780-43fe-9ed3-e83542116fa8-tapce445831-36', 'timestamp': '2026-01-22T17:21:55.491041', 'resource_metadata': {'display_name': 'tempest-server-1-1949948609', 'name': 'tapce445831-36', 'instance_id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c5:38:5a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapce445831-36'}, 'message_id': 'd99373b0-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.227270997, 'message_signature': '566164f1103a031bddd28031bc133278520df5c0594dccdc96343daca39a355f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000024-803da6ed-0f79-4c4d-b054-593b0dee0c0b-tap7cc526f2-d7', 'timestamp': '2026-01-22T17:21:55.491041', 'resource_metadata': {'display_name': 'tempest-server-test-1984705589', 'name': 'tap7cc526f2-d7', 'instance_id': '803da6ed-0f79-4c4d-b054-593b0dee0c0b', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:9a:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7cc526f2-d7'}, 'message_id': 'd99388fa-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.232847144, 'message_signature': 'e21f9a2eadf1a52ad634bbcca35a6fc41067a034e4a9c343460d2848f04560d7'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000026-2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-tapd6bc5516-01', 'timestamp': '2026-01-22T17:21:55.491041', 'resource_metadata': {'display_name': 'tempest-server-test-766639800', 'name': 'tapd6bc5516-01', 'instance_id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:13:5a:48', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd6bc5516-01'}, 'message_id': 'd9939c00-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.237398574, 'message_signature': 'c7df5bfc30267bda1b95e91c4a6774f06a6c75a4dbd5ad4f1556743aa5d73ec5'}]}, 'timestamp': '2026-01-22 17:21:55.492580', '_unique_id': '2d5c65b419514733a92b309f30ace720'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.494 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.495 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.524 12 DEBUG ceilometer.compute.pollsters [-] b4f5d7ef-7780-43fe-9ed3-e83542116fa8/cpu volume: 10770000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.549 12 DEBUG ceilometer.compute.pollsters [-] 803da6ed-0f79-4c4d-b054-593b0dee0c0b/cpu volume: 11440000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.575 12 DEBUG ceilometer.compute.pollsters [-] 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6/cpu volume: 10080000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cb1e0234-1efe-4c86-93d2-92641bb7094c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10770000000, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8', 'timestamp': '2026-01-22T17:21:55.495954', 'resource_metadata': {'display_name': 'tempest-server-1-1949948609', 'name': 'instance-00000025', 'instance_id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'd998a074-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.284832385, 'message_signature': '3ecccda1d3d7ec1a1dc8bc718d248c6579c15a7ac7f9d9e2ff2785dff9d92833'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11440000000, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': '803da6ed-0f79-4c4d-b054-593b0dee0c0b', 
'timestamp': '2026-01-22T17:21:55.495954', 'resource_metadata': {'display_name': 'tempest-server-test-1984705589', 'name': 'instance-00000024', 'instance_id': '803da6ed-0f79-4c4d-b054-593b0dee0c0b', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'd99c5eee-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.309357052, 'message_signature': '182249773d2ea39a67539bb323aeb45f3c73c7aa2f7fab587f1343610d54bdc8'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10080000000, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6', 'timestamp': '2026-01-22T17:21:55.495954', 'resource_metadata': {'display_name': 'tempest-server-test-766639800', 'name': 'instance-00000026', 'instance_id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'd9a07b00-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.336273042, 'message_signature': '58195df999c1adc8ac8a4dd920eeac9cf530b9b9809510d3e9795ef2bd1f75bf'}]}, 'timestamp': '2026-01-22 17:21:55.577082', '_unique_id': 'd335703b33fd4201aafe0cdee6e4ab93'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.579 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.581 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.581 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.581 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-1-1949948609>, <NovaLikeServer: tempest-server-test-1984705589>, <NovaLikeServer: tempest-server-test-766639800>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-1-1949948609>, <NovaLikeServer: tempest-server-test-1984705589>, <NovaLikeServer: tempest-server-test-766639800>]
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.582 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.582 12 DEBUG ceilometer.compute.pollsters [-] b4f5d7ef-7780-43fe-9ed3-e83542116fa8/network.incoming.packets volume: 59 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.582 12 DEBUG ceilometer.compute.pollsters [-] 803da6ed-0f79-4c4d-b054-593b0dee0c0b/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.583 12 DEBUG ceilometer.compute.pollsters [-] 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6/network.incoming.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27f1c30f-4f74-47b9-b64a-38bfe855c7f1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 59, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-00000025-b4f5d7ef-7780-43fe-9ed3-e83542116fa8-tapce445831-36', 'timestamp': '2026-01-22T17:21:55.582277', 'resource_metadata': {'display_name': 'tempest-server-1-1949948609', 'name': 'tapce445831-36', 'instance_id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c5:38:5a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapce445831-36'}, 'message_id': 'd9a1615a-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.227270997, 'message_signature': '6d9b3b18ea77be4409c8e6ca9ba81bf8c067cdce3f121577a8eb4e328c4a945c'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': 
'1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000024-803da6ed-0f79-4c4d-b054-593b0dee0c0b-tap7cc526f2-d7', 'timestamp': '2026-01-22T17:21:55.582277', 'resource_metadata': {'display_name': 'tempest-server-test-1984705589', 'name': 'tap7cc526f2-d7', 'instance_id': '803da6ed-0f79-4c4d-b054-593b0dee0c0b', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:9a:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7cc526f2-d7'}, 'message_id': 'd9a1774e-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.232847144, 'message_signature': 'd736bd29840599a625a0278b30f23928d0d16de071863df01926e452fe13dfe5'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000026-2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-tapd6bc5516-01', 'timestamp': '2026-01-22T17:21:55.582277', 'resource_metadata': {'display_name': 'tempest-server-test-766639800', 'name': 'tapd6bc5516-01', 'instance_id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:13:5a:48', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd6bc5516-01'}, 'message_id': 'd9a18ad6-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.237398574, 'message_signature': '6966fe09ea6abb81da87f272b0240c845f877de9444a93964682b4c85b70d6f4'}]}, 'timestamp': '2026-01-22 17:21:55.583956', '_unique_id': '0ee4c9ace65c4b0bb83ca77ad46f5fb0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.585 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.586 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.611 12 DEBUG ceilometer.compute.pollsters [-] b4f5d7ef-7780-43fe-9ed3-e83542116fa8/disk.device.read.latency volume: 212775172 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.637 12 DEBUG ceilometer.compute.pollsters [-] 803da6ed-0f79-4c4d-b054-593b0dee0c0b/disk.device.read.latency volume: 211532981 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.660 12 DEBUG ceilometer.compute.pollsters [-] 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6/disk.device.read.latency volume: 214356827 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3f829330-09ca-4281-a884-56bec51c486a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 212775172, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8-vda', 'timestamp': '2026-01-22T17:21:55.586994', 'resource_metadata': {'display_name': 'tempest-server-1-1949948609', 'name': 'instance-00000025', 'instance_id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9a5ef68-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.347480697, 'message_signature': '1938a094d817f5eef3485087bc6d8debfa69a3e61aa817d607e2d4f9b30dbb35'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 211532981, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 
'803da6ed-0f79-4c4d-b054-593b0dee0c0b-vda', 'timestamp': '2026-01-22T17:21:55.586994', 'resource_metadata': {'display_name': 'tempest-server-test-1984705589', 'name': 'instance-00000024', 'instance_id': '803da6ed-0f79-4c4d-b054-593b0dee0c0b', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9a9d13c-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.373327769, 'message_signature': '4a994abb07651f636dedadd5ec2bda33a1ee5eb1bfb0121c966fd16ab3f645ad'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 214356827, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-vda', 'timestamp': '2026-01-22T17:21:55.586994', 'resource_metadata': {'display_name': 'tempest-server-test-766639800', 'name': 'instance-00000026', 'instance_id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9ad6360-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.398724659, 'message_signature': 'cc620a7dc4a36490efd15de141762443f1a86fb0e164aa084526d5bdcd230ee9'}]}, 'timestamp': '2026-01-22 17:21:55.661600', '_unique_id': '11f08918b1a64eb698500f89542bc52f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.663 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.665 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.675 12 DEBUG ceilometer.compute.pollsters [-] b4f5d7ef-7780-43fe-9ed3-e83542116fa8/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.685 12 DEBUG ceilometer.compute.pollsters [-] 803da6ed-0f79-4c4d-b054-593b0dee0c0b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.692 12 DEBUG ceilometer.compute.pollsters [-] 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ead58654-7aa2-47f9-87b0-8bc796a1c9db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8-vda', 'timestamp': '2026-01-22T17:21:55.665493', 'resource_metadata': {'display_name': 'tempest-server-1-1949948609', 'name': 'instance-00000025', 'instance_id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9af96f8-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.425975058, 'message_signature': '3a24397a71b84eb6e01bea22d1e816b6e216a803a5a57bc47f26a9212a656caf'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 
'803da6ed-0f79-4c4d-b054-593b0dee0c0b-vda', 'timestamp': '2026-01-22T17:21:55.665493', 'resource_metadata': {'display_name': 'tempest-server-test-1984705589', 'name': 'instance-00000024', 'instance_id': '803da6ed-0f79-4c4d-b054-593b0dee0c0b', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9b11dac-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.436364542, 'message_signature': '21ed1349c0f23e9678d329eeb7d72740884db8c8ac52b033ea6b590ec1fcbc9a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-vda', 'timestamp': '2026-01-22T17:21:55.665493', 'resource_metadata': {'display_name': 'tempest-server-test-766639800', 'name': 'instance-00000026', 'instance_id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9b242d6-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.446385476, 'message_signature': '31a54a4e8485c0053a6becab4ece31dddd02cd3b4853f2d25c54cc1a93ee9a9e'}]}, 'timestamp': '2026-01-22 17:21:55.693522', '_unique_id': 'b17e1669f11549669b5108a9f72ee664'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.695 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.697 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.697 12 DEBUG ceilometer.compute.pollsters [-] b4f5d7ef-7780-43fe-9ed3-e83542116fa8/disk.device.allocation volume: 30875648 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.697 12 DEBUG ceilometer.compute.pollsters [-] 803da6ed-0f79-4c4d-b054-593b0dee0c0b/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.698 12 DEBUG ceilometer.compute.pollsters [-] 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6/disk.device.allocation volume: 29171712 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ba5e710-0dbf-4184-a249-495377c8a52d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30875648, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8-vda', 'timestamp': '2026-01-22T17:21:55.697259', 'resource_metadata': {'display_name': 'tempest-server-1-1949948609', 'name': 'instance-00000025', 'instance_id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9b2ea74-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.425975058, 'message_signature': '96f8b5bf681e17dbbb503fb00f20e07311d590d69ae213aa6577982ae777449a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 
'803da6ed-0f79-4c4d-b054-593b0dee0c0b-vda', 'timestamp': '2026-01-22T17:21:55.697259', 'resource_metadata': {'display_name': 'tempest-server-test-1984705589', 'name': 'instance-00000024', 'instance_id': '803da6ed-0f79-4c4d-b054-593b0dee0c0b', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9b2fe60-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.436364542, 'message_signature': '13119124852c0ed63d123d3efbd309df9623f687c8145237c644b7efa847fb21'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29171712, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-vda', 'timestamp': '2026-01-22T17:21:55.697259', 'resource_metadata': {'display_name': 'tempest-server-test-766639800', 'name': 'instance-00000026', 'instance_id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9b31080-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.446385476, 'message_signature': 'd6c9a2b66dfd11a74635578fc75001165f63f3d2fe89185ae92b9207950b4467'}]}, 'timestamp': '2026-01-22 17:21:55.698754', '_unique_id': 'c743e730c91040ecb8f148801f8cddf2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.699 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.701 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.701 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.701 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-1-1949948609>, <NovaLikeServer: tempest-server-test-1984705589>, <NovaLikeServer: tempest-server-test-766639800>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-1-1949948609>, <NovaLikeServer: tempest-server-test-1984705589>, <NovaLikeServer: tempest-server-test-766639800>]
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.701 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.702 12 DEBUG ceilometer.compute.pollsters [-] b4f5d7ef-7780-43fe-9ed3-e83542116fa8/disk.device.write.latency volume: 3926637944 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.702 12 DEBUG ceilometer.compute.pollsters [-] 803da6ed-0f79-4c4d-b054-593b0dee0c0b/disk.device.write.latency volume: 2806086617 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.703 12 DEBUG ceilometer.compute.pollsters [-] 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6/disk.device.write.latency volume: 2990181254 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73cfbaf5-e9e7-42f7-9f43-27829bbad61a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3926637944, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8-vda', 'timestamp': '2026-01-22T17:21:55.702168', 'resource_metadata': {'display_name': 'tempest-server-1-1949948609', 'name': 'instance-00000025', 'instance_id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9b3b01c-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.347480697, 'message_signature': 'b6f41ce591e49753c132c9ba5434d9a81f683fbac022779c615b6f3f5178a08e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2806086617, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 
'resource_id': '803da6ed-0f79-4c4d-b054-593b0dee0c0b-vda', 'timestamp': '2026-01-22T17:21:55.702168', 'resource_metadata': {'display_name': 'tempest-server-test-1984705589', 'name': 'instance-00000024', 'instance_id': '803da6ed-0f79-4c4d-b054-593b0dee0c0b', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9b3c84a-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.373327769, 'message_signature': 'af9ac4af9c430e625d58ea39ac2c0b41c9b2fb311f15ed78702ee350ef2ce178'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2990181254, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-vda', 'timestamp': '2026-01-22T17:21:55.702168', 'resource_metadata': {'display_name': 'tempest-server-test-766639800', 'name': 'instance-00000026', 'instance_id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9b3db3c-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.398724659, 'message_signature': '44483961fff61e3009577bf6166486899af32ddcea61355bd090969434976ee0'}]}, 'timestamp': '2026-01-22 17:21:55.703895', '_unique_id': '5ee769d721f64f039335f535a1a57996'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.705 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.706 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.706 12 DEBUG ceilometer.compute.pollsters [-] b4f5d7ef-7780-43fe-9ed3-e83542116fa8/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.707 12 DEBUG ceilometer.compute.pollsters [-] 803da6ed-0f79-4c4d-b054-593b0dee0c0b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.707 12 DEBUG ceilometer.compute.pollsters [-] 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ca916c0-8144-4a65-97c2-a783f75037b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-00000025-b4f5d7ef-7780-43fe-9ed3-e83542116fa8-tapce445831-36', 'timestamp': '2026-01-22T17:21:55.706665', 'resource_metadata': {'display_name': 'tempest-server-1-1949948609', 'name': 'tapce445831-36', 'instance_id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c5:38:5a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapce445831-36'}, 'message_id': 'd9b45a76-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.227270997, 'message_signature': '326b47e4f359b0a70dea6c56d71caa9856ae7a5fe6560cbaa687ded6e58a7447'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000024-803da6ed-0f79-4c4d-b054-593b0dee0c0b-tap7cc526f2-d7', 'timestamp': '2026-01-22T17:21:55.706665', 'resource_metadata': {'display_name': 'tempest-server-test-1984705589', 'name': 'tap7cc526f2-d7', 'instance_id': '803da6ed-0f79-4c4d-b054-593b0dee0c0b', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:9a:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7cc526f2-d7'}, 'message_id': 'd9b46c64-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.232847144, 'message_signature': '373a5d192d360e60d5065bbff94d7bd0ede1eda8b36706061841fd71eb9df868'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000026-2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-tapd6bc5516-01', 'timestamp': '2026-01-22T17:21:55.706665', 'resource_metadata': {'display_name': 'tempest-server-test-766639800', 'name': 'tapd6bc5516-01', 'instance_id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:13:5a:48', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd6bc5516-01'}, 'message_id': 'd9b47f42-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.237398574, 'message_signature': '5c7eca158dffd68f02acb0a7fda3b17517dd82c1535250471ce586887aa64aa0'}]}, 'timestamp': '2026-01-22 17:21:55.708117', '_unique_id': 'd5b765adb8154e3cb6aa3834f438c932'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.709 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.710 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.710 12 DEBUG ceilometer.compute.pollsters [-] b4f5d7ef-7780-43fe-9ed3-e83542116fa8/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.711 12 DEBUG ceilometer.compute.pollsters [-] 803da6ed-0f79-4c4d-b054-593b0dee0c0b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.711 12 DEBUG ceilometer.compute.pollsters [-] 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7621d49f-babd-4eb6-be56-0d76541d69ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-00000025-b4f5d7ef-7780-43fe-9ed3-e83542116fa8-tapce445831-36', 'timestamp': '2026-01-22T17:21:55.710741', 'resource_metadata': {'display_name': 'tempest-server-1-1949948609', 'name': 'tapce445831-36', 'instance_id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c5:38:5a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapce445831-36'}, 'message_id': 'd9b4f954-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.227270997, 'message_signature': '29003f957a745c6f14ddc84313d2b63f8353933af24295308fd85cd713de1285'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000024-803da6ed-0f79-4c4d-b054-593b0dee0c0b-tap7cc526f2-d7', 'timestamp': '2026-01-22T17:21:55.710741', 'resource_metadata': {'display_name': 'tempest-server-test-1984705589', 'name': 'tap7cc526f2-d7', 'instance_id': '803da6ed-0f79-4c4d-b054-593b0dee0c0b', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:9a:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7cc526f2-d7'}, 'message_id': 'd9b50ca0-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.232847144, 'message_signature': 'e4053cc681e62d9de3643a9423c2a768442485859ce02ba1998288c3908bef0f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000026-2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-tapd6bc5516-01', 'timestamp': '2026-01-22T17:21:55.710741', 'resource_metadata': {'display_name': 'tempest-server-test-766639800', 'name': 'tapd6bc5516-01', 'instance_id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:13:5a:48', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd6bc5516-01'}, 'message_id': 'd9b5200a-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.237398574, 'message_signature': '4b79c5bb484e793c9824aae23cfdabc1f06bb03f7509eac9c7a7d76507d9f3bb'}]}, 'timestamp': '2026-01-22 17:21:55.712220', '_unique_id': '51062f73565c4a95906209f4e64efa52'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.713 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.714 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.715 12 DEBUG ceilometer.compute.pollsters [-] b4f5d7ef-7780-43fe-9ed3-e83542116fa8/network.outgoing.packets volume: 123 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.715 12 DEBUG ceilometer.compute.pollsters [-] 803da6ed-0f79-4c4d-b054-593b0dee0c0b/network.outgoing.packets volume: 121 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.716 12 DEBUG ceilometer.compute.pollsters [-] 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6/network.outgoing.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6464d793-a28e-49e4-b712-11ef2059b1e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 123, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-00000025-b4f5d7ef-7780-43fe-9ed3-e83542116fa8-tapce445831-36', 'timestamp': '2026-01-22T17:21:55.715111', 'resource_metadata': {'display_name': 'tempest-server-1-1949948609', 'name': 'tapce445831-36', 'instance_id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c5:38:5a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapce445831-36'}, 'message_id': 'd9b5a5c0-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.227270997, 'message_signature': '8e29f3502d33adbda30509846e308f371465686e51a094b58426f6a05028afd2'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 121, 'user_id': 
'1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000024-803da6ed-0f79-4c4d-b054-593b0dee0c0b-tap7cc526f2-d7', 'timestamp': '2026-01-22T17:21:55.715111', 'resource_metadata': {'display_name': 'tempest-server-test-1984705589', 'name': 'tap7cc526f2-d7', 'instance_id': '803da6ed-0f79-4c4d-b054-593b0dee0c0b', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:9a:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7cc526f2-d7'}, 'message_id': 'd9b5ba6a-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.232847144, 'message_signature': '3deeba58ecbc230cdb1f1e00fd73fbb7a5201425d2fbe55212222fca32ad2b30'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000026-2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-tapd6bc5516-01', 'timestamp': '2026-01-22T17:21:55.715111', 'resource_metadata': {'display_name': 'tempest-server-test-766639800', 'name': 'tapd6bc5516-01', 'instance_id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:13:5a:48', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd6bc5516-01'}, 'message_id': 'd9b5cd0c-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.237398574, 'message_signature': '88cca08d92162828cd598901c7cf06136da80724153b7f17604284982cc85bd0'}]}, 'timestamp': '2026-01-22 17:21:55.716690', '_unique_id': '942ff6b0f81b4347a93a91288eece00a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.717 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.719 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.719 12 DEBUG ceilometer.compute.pollsters [-] b4f5d7ef-7780-43fe-9ed3-e83542116fa8/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.719 12 DEBUG ceilometer.compute.pollsters [-] 803da6ed-0f79-4c4d-b054-593b0dee0c0b/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.720 12 DEBUG ceilometer.compute.pollsters [-] 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6/disk.device.usage volume: 28246016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c9d89066-c20a-4109-b49c-7988a3dcb62f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8-vda', 'timestamp': '2026-01-22T17:21:55.719193', 'resource_metadata': {'display_name': 'tempest-server-1-1949948609', 'name': 'instance-00000025', 'instance_id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9b64304-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.425975058, 'message_signature': '7ae607082ca2adb91f5e764f3d32a06cde2859ddcf440568ce222d2c9387d994'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 
'803da6ed-0f79-4c4d-b054-593b0dee0c0b-vda', 'timestamp': '2026-01-22T17:21:55.719193', 'resource_metadata': {'display_name': 'tempest-server-test-1984705589', 'name': 'instance-00000024', 'instance_id': '803da6ed-0f79-4c4d-b054-593b0dee0c0b', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9b656e6-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.436364542, 'message_signature': '86f0f4d8131ba33cc9a54b5763bf220d816ad45d15e27aa3afa114f63b6a702e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 28246016, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-vda', 'timestamp': '2026-01-22T17:21:55.719193', 'resource_metadata': {'display_name': 'tempest-server-test-766639800', 'name': 'instance-00000026', 'instance_id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9b66744-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.446385476, 'message_signature': 'dbb504285ee79fb576fc8852aa6f3d788383565c5caff8a0e7f25336380887e3'}]}, 'timestamp': '2026-01-22 17:21:55.720581', '_unique_id': '886f4b8d363048b0b9a3eed02591d978'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.721 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.723 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.723 12 DEBUG ceilometer.compute.pollsters [-] b4f5d7ef-7780-43fe-9ed3-e83542116fa8/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.724 12 DEBUG ceilometer.compute.pollsters [-] 803da6ed-0f79-4c4d-b054-593b0dee0c0b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.724 12 DEBUG ceilometer.compute.pollsters [-] 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9c783ec-de74-42da-8c9d-46258dbc6c14', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-00000025-b4f5d7ef-7780-43fe-9ed3-e83542116fa8-tapce445831-36', 'timestamp': '2026-01-22T17:21:55.723474', 'resource_metadata': {'display_name': 'tempest-server-1-1949948609', 'name': 'tapce445831-36', 'instance_id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c5:38:5a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapce445831-36'}, 'message_id': 'd9b6ec6e-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.227270997, 'message_signature': 'b0b7ebc8da0f0cbd99e0f972fe10af17d74acc54f8eb70f2ea97542963bda68c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000024-803da6ed-0f79-4c4d-b054-593b0dee0c0b-tap7cc526f2-d7', 'timestamp': '2026-01-22T17:21:55.723474', 'resource_metadata': {'display_name': 'tempest-server-test-1984705589', 'name': 'tap7cc526f2-d7', 'instance_id': '803da6ed-0f79-4c4d-b054-593b0dee0c0b', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:9a:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7cc526f2-d7'}, 'message_id': 'd9b6fe20-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.232847144, 'message_signature': 'e55f477fe53f04bff674dd56ebc60401a8872e6e995374f31329ad9b1d2a35cf'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000026-2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-tapd6bc5516-01', 'timestamp': '2026-01-22T17:21:55.723474', 'resource_metadata': {'display_name': 'tempest-server-test-766639800', 'name': 'tapd6bc5516-01', 'instance_id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:13:5a:48', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd6bc5516-01'}, 'message_id': 'd9b710e0-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.237398574, 'message_signature': '820b16163b1440158ba44bfadab16acd26d20dd9eb85238f81f04f12c5c6da3f'}]}, 'timestamp': '2026-01-22 17:21:55.724978', '_unique_id': '62eb2116cc4546db89dde2d492cdf8d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.726 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.727 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.727 12 DEBUG ceilometer.compute.pollsters [-] b4f5d7ef-7780-43fe-9ed3-e83542116fa8/disk.device.read.bytes volume: 30046720 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.727 12 DEBUG ceilometer.compute.pollsters [-] 803da6ed-0f79-4c4d-b054-593b0dee0c0b/disk.device.read.bytes volume: 31255040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.728 12 DEBUG ceilometer.compute.pollsters [-] 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6/disk.device.read.bytes volume: 27724800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23d26d18-70d5-4bd2-9a72-9222856940e3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30046720, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8-vda', 'timestamp': '2026-01-22T17:21:55.727499', 'resource_metadata': {'display_name': 'tempest-server-1-1949948609', 'name': 'instance-00000025', 'instance_id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9b784d0-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.347480697, 'message_signature': 'a5102a07c1d83e3818ef7268883a545be04ed05746a4b3482d608b699a3cb194'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31255040, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 
'803da6ed-0f79-4c4d-b054-593b0dee0c0b-vda', 'timestamp': '2026-01-22T17:21:55.727499', 'resource_metadata': {'display_name': 'tempest-server-test-1984705589', 'name': 'instance-00000024', 'instance_id': '803da6ed-0f79-4c4d-b054-593b0dee0c0b', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9b78f84-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.373327769, 'message_signature': 'b45b8f77861801a9bd01d51b14dfe276cc205bb3f1e1790a9b776a02ac82d295'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 27724800, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-vda', 'timestamp': '2026-01-22T17:21:55.727499', 'resource_metadata': {'display_name': 'tempest-server-test-766639800', 'name': 'instance-00000026', 'instance_id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9b79934-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.398724659, 'message_signature': '7fdf595cb3b4c2d48dabfd40313ffc9355dc76f74b36857a0b31679380a5dc25'}]}, 'timestamp': '2026-01-22 17:21:55.728334', '_unique_id': 'e6659f30dcf44185beb93ae5a47f076f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.729 12 DEBUG ceilometer.compute.pollsters [-] b4f5d7ef-7780-43fe-9ed3-e83542116fa8/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.730 12 DEBUG ceilometer.compute.pollsters [-] 803da6ed-0f79-4c4d-b054-593b0dee0c0b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.730 12 DEBUG ceilometer.compute.pollsters [-] 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1591bf6d-2d8a-4e33-bc8f-5ac01132dd31', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-00000025-b4f5d7ef-7780-43fe-9ed3-e83542116fa8-tapce445831-36', 'timestamp': '2026-01-22T17:21:55.729874', 'resource_metadata': {'display_name': 'tempest-server-1-1949948609', 'name': 'tapce445831-36', 'instance_id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c5:38:5a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapce445831-36'}, 'message_id': 'd9b7e1e6-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.227270997, 'message_signature': '32d52ffc869384b164660a0189dd93c8cebde3c55ef31f52fb3cfeb8890dc573'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000024-803da6ed-0f79-4c4d-b054-593b0dee0c0b-tap7cc526f2-d7', 'timestamp': '2026-01-22T17:21:55.729874', 'resource_metadata': {'display_name': 'tempest-server-test-1984705589', 'name': 'tap7cc526f2-d7', 'instance_id': '803da6ed-0f79-4c4d-b054-593b0dee0c0b', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:9a:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7cc526f2-d7'}, 'message_id': 'd9b7f05a-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.232847144, 'message_signature': '6018bf692b3df96a6bb0723a69cf22c68c92311983bae6cdbf06732490d75037'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000026-2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-tapd6bc5516-01', 'timestamp': '2026-01-22T17:21:55.729874', 'resource_metadata': {'display_name': 'tempest-server-test-766639800', 'name': 'tapd6bc5516-01', 'instance_id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:13:5a:48', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd6bc5516-01'}, 'message_id': 'd9b80158-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.237398574, 'message_signature': '83b88b0e0a58e74166a6432d3b159f210a05be529f696aa48519bde2d5eba558'}]}, 'timestamp': '2026-01-22 17:21:55.731018', '_unique_id': '0b2fedb725524c0d9c6ace094897cfa3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.731 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.732 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.732 12 DEBUG ceilometer.compute.pollsters [-] b4f5d7ef-7780-43fe-9ed3-e83542116fa8/disk.device.write.requests volume: 323 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.732 12 DEBUG ceilometer.compute.pollsters [-] 803da6ed-0f79-4c4d-b054-593b0dee0c0b/disk.device.write.requests volume: 336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.733 12 DEBUG ceilometer.compute.pollsters [-] 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6/disk.device.write.requests volume: 233 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bf96a69a-25a4-48a0-9026-4d17d51bec32', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 323, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8-vda', 'timestamp': '2026-01-22T17:21:55.732661', 'resource_metadata': {'display_name': 'tempest-server-1-1949948609', 'name': 'instance-00000025', 'instance_id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9b84d70-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.347480697, 'message_signature': '2cb762c6d819a87298a06afc2eef2de99be7697384a0434dee31f3926453cf9a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 336, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 
'803da6ed-0f79-4c4d-b054-593b0dee0c0b-vda', 'timestamp': '2026-01-22T17:21:55.732661', 'resource_metadata': {'display_name': 'tempest-server-test-1984705589', 'name': 'instance-00000024', 'instance_id': '803da6ed-0f79-4c4d-b054-593b0dee0c0b', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9b85806-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.373327769, 'message_signature': 'd2e019101bb728043659060843679011fbfe92a5373e33345b00aaebf3410680'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 233, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-vda', 'timestamp': '2026-01-22T17:21:55.732661', 'resource_metadata': {'display_name': 'tempest-server-test-766639800', 'name': 'instance-00000026', 'instance_id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9b86364-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.398724659, 'message_signature': '9cb139bcc2cfed481b9e237e2ca4cdd344dba85ee9e52f98902751b4632baf27'}]}, 'timestamp': '2026-01-22 17:21:55.733554', '_unique_id': '7e61f5cb839b490a80642ed2100e97e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.734 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.735 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.735 12 DEBUG ceilometer.compute.pollsters [-] b4f5d7ef-7780-43fe-9ed3-e83542116fa8/disk.device.read.requests volume: 1112 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.735 12 DEBUG ceilometer.compute.pollsters [-] 803da6ed-0f79-4c4d-b054-593b0dee0c0b/disk.device.read.requests volume: 1165 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.735 12 DEBUG ceilometer.compute.pollsters [-] 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6/disk.device.read.requests volume: 967 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b2abb72-72d1-4a7e-bc1b-b5e8a32153f1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1112, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8-vda', 'timestamp': '2026-01-22T17:21:55.735138', 'resource_metadata': {'display_name': 'tempest-server-1-1949948609', 'name': 'instance-00000025', 'instance_id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9b8ade2-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.347480697, 'message_signature': '1d9f0d4ea5857110559bce72eb8590c52db3b476718f959346346d77a1308536'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1165, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 
'803da6ed-0f79-4c4d-b054-593b0dee0c0b-vda', 'timestamp': '2026-01-22T17:21:55.735138', 'resource_metadata': {'display_name': 'tempest-server-test-1984705589', 'name': 'instance-00000024', 'instance_id': '803da6ed-0f79-4c4d-b054-593b0dee0c0b', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9b8b832-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.373327769, 'message_signature': '1168e33454ac73f079e89a0f72e1b741ddf872ffdc53bd6fb0ac59bec110ed1c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 967, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-vda', 'timestamp': '2026-01-22T17:21:55.735138', 'resource_metadata': {'display_name': 'tempest-server-test-766639800', 'name': 'instance-00000026', 'instance_id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9b8c37c-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.398724659, 'message_signature': 'a2ee46428975f3f19a501ab9886d09ffe8c2347e9b2543c205e9185e29b47f26'}]}, 'timestamp': '2026-01-22 17:21:55.735970', '_unique_id': '98c8e19ded3b47df8076377125186a03'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.736 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.737 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.737 12 DEBUG ceilometer.compute.pollsters [-] b4f5d7ef-7780-43fe-9ed3-e83542116fa8/memory.usage volume: 43.296875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.737 12 DEBUG ceilometer.compute.pollsters [-] 803da6ed-0f79-4c4d-b054-593b0dee0c0b/memory.usage volume: 42.64453125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.738 12 DEBUG ceilometer.compute.pollsters [-] 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4eda6029-0f66-4a39-99c2-d3db0d8dcc89', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.296875, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8', 'timestamp': '2026-01-22T17:21:55.737607', 'resource_metadata': {'display_name': 'tempest-server-1-1949948609', 'name': 'instance-00000025', 'instance_id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'd9b911f6-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.284832385, 'message_signature': '8888861751e0c62dcfc8c415dd6d65f08774dbc15853fd70628fadd0a20f2a58'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.64453125, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': '803da6ed-0f79-4c4d-b054-593b0dee0c0b', 'timestamp': 
'2026-01-22T17:21:55.737607', 'resource_metadata': {'display_name': 'tempest-server-test-1984705589', 'name': 'instance-00000024', 'instance_id': '803da6ed-0f79-4c4d-b054-593b0dee0c0b', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'd9b91ca0-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.309357052, 'message_signature': '68ea9e2fe5eeb3e8f3052958db69a3e3d67651b9886d6a2de855567c9a87b34b'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6', 'timestamp': '2026-01-22T17:21:55.737607', 'resource_metadata': {'display_name': 'tempest-server-test-766639800', 'name': 'instance-00000026', 'instance_id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'd9b9268c-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.336273042, 'message_signature': 'a0696ef263b37e9972e4ab53d5fb2ec801ccf7f08d2c51e22ecd652af30ec017'}]}, 'timestamp': '2026-01-22 17:21:55.738508', '_unique_id': 'cf8e5d92544c4f78a276629e8b63e8ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.739 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.740 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.740 12 DEBUG ceilometer.compute.pollsters [-] b4f5d7ef-7780-43fe-9ed3-e83542116fa8/disk.device.write.bytes volume: 72978432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.740 12 DEBUG ceilometer.compute.pollsters [-] 803da6ed-0f79-4c4d-b054-593b0dee0c0b/disk.device.write.bytes volume: 73129984 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.741 12 DEBUG ceilometer.compute.pollsters [-] 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6/disk.device.write.bytes volume: 25628672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '50f78cae-bfa4-4820-b6f2-f7bd55b9ad0b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72978432, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8-vda', 'timestamp': '2026-01-22T17:21:55.740424', 'resource_metadata': {'display_name': 'tempest-server-1-1949948609', 'name': 'instance-00000025', 'instance_id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9b97c72-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.347480697, 'message_signature': 'daf0d9483f408d08fba5ec7dcc51c33912e639901b6068613ac863c493fcc810'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73129984, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 
'803da6ed-0f79-4c4d-b054-593b0dee0c0b-vda', 'timestamp': '2026-01-22T17:21:55.740424', 'resource_metadata': {'display_name': 'tempest-server-test-1984705589', 'name': 'instance-00000024', 'instance_id': '803da6ed-0f79-4c4d-b054-593b0dee0c0b', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9b9899c-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.373327769, 'message_signature': 'c840731825d5343b0c0375ba644615a777c1b72dd7549c20cadf4fe6717b32cb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 25628672, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-vda', 'timestamp': '2026-01-22T17:21:55.740424', 'resource_metadata': {'display_name': 'tempest-server-test-766639800', 'name': 'instance-00000026', 'instance_id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9b9977a-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.398724659, 'message_signature': '683bdaea0b543d4edad773b30710c9925cb460166e59cbc0b9a00506fe0604ca'}]}, 'timestamp': '2026-01-22 17:21:55.741459', '_unique_id': '47eb9c9740a84f0fbf4acf276ea4d6f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.742 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.743 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.743 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.743 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-1-1949948609>, <NovaLikeServer: tempest-server-test-1984705589>, <NovaLikeServer: tempest-server-test-766639800>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-1-1949948609>, <NovaLikeServer: tempest-server-test-1984705589>, <NovaLikeServer: tempest-server-test-766639800>]
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.743 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.743 12 DEBUG ceilometer.compute.pollsters [-] b4f5d7ef-7780-43fe-9ed3-e83542116fa8/network.outgoing.bytes volume: 10851 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.744 12 DEBUG ceilometer.compute.pollsters [-] 803da6ed-0f79-4c4d-b054-593b0dee0c0b/network.outgoing.bytes volume: 10616 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.744 12 DEBUG ceilometer.compute.pollsters [-] 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6/network.outgoing.bytes volume: 1284 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f76fc420-0f1e-48ad-bd5b-4ae27ffac3f5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10851, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-00000025-b4f5d7ef-7780-43fe-9ed3-e83542116fa8-tapce445831-36', 'timestamp': '2026-01-22T17:21:55.743692', 'resource_metadata': {'display_name': 'tempest-server-1-1949948609', 'name': 'tapce445831-36', 'instance_id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c5:38:5a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapce445831-36'}, 'message_id': 'd9b9fee0-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.227270997, 'message_signature': '6b2ca89a0627d6f3f2df12dde58c613bc0c4da2acb1b10e174a876c0fc67246b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10616, 'user_id': 
'1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000024-803da6ed-0f79-4c4d-b054-593b0dee0c0b-tap7cc526f2-d7', 'timestamp': '2026-01-22T17:21:55.743692', 'resource_metadata': {'display_name': 'tempest-server-test-1984705589', 'name': 'tap7cc526f2-d7', 'instance_id': '803da6ed-0f79-4c4d-b054-593b0dee0c0b', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:9a:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7cc526f2-d7'}, 'message_id': 'd9ba0e08-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.232847144, 'message_signature': 'a5d9902159e748caa26a95f85c0342c2c53c8da771e3334c6deeca5146c6c9da'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1284, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000026-2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-tapd6bc5516-01', 'timestamp': '2026-01-22T17:21:55.743692', 'resource_metadata': {'display_name': 'tempest-server-test-766639800', 'name': 'tapd6bc5516-01', 'instance_id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:13:5a:48', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd6bc5516-01'}, 'message_id': 'd9ba1a10-f7b6-11f0-9e69-fa163eaea1db', 'monotonic_time': 4879.237398574, 'message_signature': '7dd6968311c0a616941b4656fed6e80d1ece38e0e370e659751dd81934da4dc5'}]}, 'timestamp': '2026-01-22 17:21:55.744779', '_unique_id': '41d0e1c8ebf94744a8fb6532a85e9dcc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:21:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:21:55.745 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:21:56 compute-0 nova_compute[183075]: 2026-01-22 17:21:56.252 183079 INFO nova.compute.manager [None req-4cbf890a-f2a8-4b18-949f-e05a3da805b0 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Get console output
Jan 22 17:21:56 compute-0 nova_compute[183075]: 2026-01-22 17:21:56.259 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:21:56 compute-0 nova_compute[183075]: 2026-01-22 17:21:56.659 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:57 compute-0 podman[227587]: 2026-01-22 17:21:57.367525757 +0000 UTC m=+0.063500876 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:21:58 compute-0 nova_compute[183075]: 2026-01-22 17:21:58.523 183079 INFO nova.compute.manager [None req-6bc5598c-9e3c-4381-b558-6141d1f97380 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Get console output
Jan 22 17:21:58 compute-0 nova_compute[183075]: 2026-01-22 17:21:58.529 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:21:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:59.006 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:21:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:21:59.009 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:21:59 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:21:59 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:21:59 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:21:59 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:21:59 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:21:59 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:21:59 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:21:59 compute-0 nova_compute[183075]: 2026-01-22 17:21:59.525 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.325 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.326 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 1.3169899
Jan 22 17:22:00 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[227029]: 10.100.0.4:41760 [22/Jan/2026:17:21:59.006] listener listener/metadata 0/0/0/1320/1320 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.343 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.344 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.370 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.370 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 168 time: 0.0259726
Jan 22 17:22:00 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[227029]: 10.100.0.4:41762 [22/Jan/2026:17:22:00.342] listener listener/metadata 0/0/0/27/27 200 152 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.375 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.376 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.398 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.398 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0223646
Jan 22 17:22:00 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[227029]: 10.100.0.4:41764 [22/Jan/2026:17:22:00.375] listener listener/metadata 0/0/0/23/23 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.408 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.409 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.432 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.433 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0237401
Jan 22 17:22:00 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[227029]: 10.100.0.4:41766 [22/Jan/2026:17:22:00.407] listener listener/metadata 0/0/0/25/25 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.439 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.440 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.460 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.461 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0208728
Jan 22 17:22:00 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[227029]: 10.100.0.4:41780 [22/Jan/2026:17:22:00.438] listener listener/metadata 0/0/0/22/22 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.471 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.472 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.489 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.490 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0180664
Jan 22 17:22:00 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[227029]: 10.100.0.4:41786 [22/Jan/2026:17:22:00.471] listener listener/metadata 0/0/0/19/19 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.500 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.500 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.519 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.520 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0194728
Jan 22 17:22:00 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[227029]: 10.100.0.4:41796 [22/Jan/2026:17:22:00.499] listener listener/metadata 0/0/0/20/20 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.528 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.529 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.550 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:22:00 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[227029]: 10.100.0.4:41812 [22/Jan/2026:17:22:00.528] listener listener/metadata 0/0/0/22/22 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.550 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0218031
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.558 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.559 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.583 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.584 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 165 time: 0.0253434
Jan 22 17:22:00 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[227029]: 10.100.0.4:41826 [22/Jan/2026:17:22:00.558] listener listener/metadata 0/0/0/26/26 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.593 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.594 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:22:00 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.617 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:22:00 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[227029]: 10.100.0.4:41842 [22/Jan/2026:17:22:00.593] listener listener/metadata 0/0/0/25/25 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.618 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 165 time: 0.0238352
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.627 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.628 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:22:00 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[227029]: 10.100.0.4:41844 [22/Jan/2026:17:22:00.627] listener listener/metadata 0/0/0/28/28 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.655 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0270126
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.676 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.677 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.703 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.703 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0258331
Jan 22 17:22:00 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[227029]: 10.100.0.4:41848 [22/Jan/2026:17:22:00.676] listener listener/metadata 0/0/0/27/27 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.712 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.712 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.734 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:22:00 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[227029]: 10.100.0.4:41860 [22/Jan/2026:17:22:00.711] listener listener/metadata 0/0/0/23/23 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.734 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0219226
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.741 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.742 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.763 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:22:00 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[227029]: 10.100.0.4:41872 [22/Jan/2026:17:22:00.740] listener listener/metadata 0/0/0/23/23 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.764 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0223312
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.774 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.775 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.798 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.799 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 165 time: 0.0239964
Jan 22 17:22:00 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[227029]: 10.100.0.4:41874 [22/Jan/2026:17:22:00.774] listener listener/metadata 0/0/0/25/25 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.808 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.808 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.831 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:22:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:00.831 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0229936
Jan 22 17:22:00 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[227029]: 10.100.0.4:41884 [22/Jan/2026:17:22:00.807] listener listener/metadata 0/0/0/23/23 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:22:01 compute-0 nova_compute[183075]: 2026-01-22 17:22:01.411 183079 INFO nova.compute.manager [None req-fbad538d-fd4b-48be-a1f1-ca15d712fa77 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Get console output
Jan 22 17:22:01 compute-0 nova_compute[183075]: 2026-01-22 17:22:01.418 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:22:01 compute-0 nova_compute[183075]: 2026-01-22 17:22:01.665 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:03 compute-0 nova_compute[183075]: 2026-01-22 17:22:03.706 183079 INFO nova.compute.manager [None req-4aa44580-3b78-4b4b-ab64-3032deba049c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Get console output
Jan 22 17:22:03 compute-0 nova_compute[183075]: 2026-01-22 17:22:03.713 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:22:04 compute-0 podman[227613]: 2026-01-22 17:22:04.3670281 +0000 UTC m=+0.066332930 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:22:04 compute-0 nova_compute[183075]: 2026-01-22 17:22:04.527 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:06 compute-0 nova_compute[183075]: 2026-01-22 17:22:06.549 183079 INFO nova.compute.manager [None req-44b27c2d-c8ac-4a6f-899c-833065044b46 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Get console output
Jan 22 17:22:06 compute-0 nova_compute[183075]: 2026-01-22 17:22:06.555 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:22:06 compute-0 nova_compute[183075]: 2026-01-22 17:22:06.669 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:07 compute-0 rsyslogd[1006]: imjournal: 1608 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Jan 22 17:22:07 compute-0 nova_compute[183075]: 2026-01-22 17:22:07.810 183079 DEBUG oslo_concurrency.lockutils [None req-6ea4c860-0935-4470-9599-9c5d9a3131f9 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "2ee81c99-07a0-41e3-bfac-dfb718a8e4c6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:22:07 compute-0 nova_compute[183075]: 2026-01-22 17:22:07.811 183079 DEBUG oslo_concurrency.lockutils [None req-6ea4c860-0935-4470-9599-9c5d9a3131f9 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "2ee81c99-07a0-41e3-bfac-dfb718a8e4c6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:22:07 compute-0 nova_compute[183075]: 2026-01-22 17:22:07.812 183079 DEBUG oslo_concurrency.lockutils [None req-6ea4c860-0935-4470-9599-9c5d9a3131f9 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:22:07 compute-0 nova_compute[183075]: 2026-01-22 17:22:07.814 183079 DEBUG oslo_concurrency.lockutils [None req-6ea4c860-0935-4470-9599-9c5d9a3131f9 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:22:07 compute-0 nova_compute[183075]: 2026-01-22 17:22:07.815 183079 DEBUG oslo_concurrency.lockutils [None req-6ea4c860-0935-4470-9599-9c5d9a3131f9 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:22:07 compute-0 nova_compute[183075]: 2026-01-22 17:22:07.817 183079 INFO nova.compute.manager [None req-6ea4c860-0935-4470-9599-9c5d9a3131f9 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Terminating instance
Jan 22 17:22:07 compute-0 nova_compute[183075]: 2026-01-22 17:22:07.818 183079 DEBUG nova.compute.manager [None req-6ea4c860-0935-4470-9599-9c5d9a3131f9 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:22:07 compute-0 kernel: tapd6bc5516-01 (unregistering): left promiscuous mode
Jan 22 17:22:07 compute-0 NetworkManager[55454]: <info>  [1769102527.8463] device (tapd6bc5516-01): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:22:07 compute-0 ovn_controller[95372]: 2026-01-22T17:22:07Z|00450|binding|INFO|Releasing lport d6bc5516-013e-4fd3-981b-a500723e5cc6 from this chassis (sb_readonly=0)
Jan 22 17:22:07 compute-0 ovn_controller[95372]: 2026-01-22T17:22:07Z|00451|binding|INFO|Setting lport d6bc5516-013e-4fd3-981b-a500723e5cc6 down in Southbound
Jan 22 17:22:07 compute-0 nova_compute[183075]: 2026-01-22 17:22:07.893 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:07 compute-0 ovn_controller[95372]: 2026-01-22T17:22:07Z|00452|binding|INFO|Removing iface tapd6bc5516-01 ovn-installed in OVS
Jan 22 17:22:07 compute-0 nova_compute[183075]: 2026-01-22 17:22:07.898 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:07 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:07.914 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:13:5a:48 10.100.0.4'], port_security=['fa:16:3e:13:5a:48 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '2ee81c99-07a0-41e3-bfac-dfb718a8e4c6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eee918a6-66b2-47ae-b702-620a23ef395b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02818155e7af4645bc909d4ba671f11f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c960fc2f-fbcd-48ff-b6bd-d3b900f07332', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bad11b1d-0e62-4467-b3f8-4f80f0fd75da, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=d6bc5516-013e-4fd3-981b-a500723e5cc6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:22:07 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:07.916 104629 INFO neutron.agent.ovn.metadata.agent [-] Port d6bc5516-013e-4fd3-981b-a500723e5cc6 in datapath eee918a6-66b2-47ae-b702-620a23ef395b unbound from our chassis
Jan 22 17:22:07 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:07.920 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network eee918a6-66b2-47ae-b702-620a23ef395b
Jan 22 17:22:07 compute-0 nova_compute[183075]: 2026-01-22 17:22:07.926 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:07 compute-0 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d00000026.scope: Deactivated successfully.
Jan 22 17:22:07 compute-0 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d00000026.scope: Consumed 12.865s CPU time.
Jan 22 17:22:07 compute-0 systemd-machined[154382]: Machine qemu-38-instance-00000026 terminated.
Jan 22 17:22:07 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:07.949 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[5f25afe7-dae6-42a8-b0c1-d20fd37221ac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:07 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:07.995 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[116c0f3c-5aac-4415-aea8-77aece398219]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:08.000 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[f120f42a-1db5-4b9c-9077-4f47080f8c17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:08.044 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[1110423f-80e5-43a1-a292-30586ed0b5b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:08 compute-0 nova_compute[183075]: 2026-01-22 17:22:08.053 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:08 compute-0 nova_compute[183075]: 2026-01-22 17:22:08.062 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:08.075 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3b610f3e-16d2-4ff3-9a72-7b9804f02ce5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeee918a6-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:e2:7e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 202, 'tx_packets': 106, 'rx_bytes': 17308, 'tx_bytes': 12069, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 202, 'tx_packets': 106, 'rx_bytes': 17308, 'tx_bytes': 12069, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 122], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482564, 'reachable_time': 38352, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227653, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:08 compute-0 nova_compute[183075]: 2026-01-22 17:22:08.098 183079 INFO nova.virt.libvirt.driver [-] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Instance destroyed successfully.
Jan 22 17:22:08 compute-0 nova_compute[183075]: 2026-01-22 17:22:08.098 183079 DEBUG nova.objects.instance [None req-6ea4c860-0935-4470-9599-9c5d9a3131f9 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lazy-loading 'resources' on Instance uuid 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:22:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:08.108 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[00b01063-0ee6-43d3-ab9b-6526aca64c76]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapeee918a6-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 482576, 'tstamp': 482576}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227664, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapeee918a6-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 482580, 'tstamp': 482580}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227664, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:08.111 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeee918a6-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:22:08 compute-0 nova_compute[183075]: 2026-01-22 17:22:08.113 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:08 compute-0 nova_compute[183075]: 2026-01-22 17:22:08.115 183079 DEBUG nova.virt.libvirt.vif [None req-6ea4c860-0935-4470-9599-9c5d9a3131f9 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:21:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-766639800',display_name='tempest-server-test-766639800',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-766639800',id=38,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBZ1addGiefngOIZky2bZbl/LpQHm8ydIPPcYG47RZ8D1pquJ/FsAcVn7mNe0QyfiQPXvuSs1eSw/jGOy+x3K8tzxt5N8fIelXKcTizhDHr81ws2uYn+dW6F9Hiy6pJp3A==',key_name='tempest-keypair-test-453437320',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:21:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='02818155e7af4645bc909d4ba671f11f',ramdisk_id='',reservation_id='r-ttafj00f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_in
put_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIpSameNetwork-953620552',owner_user_name='tempest-FloatingIpSameNetwork-953620552-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:21:40Z,user_data=None,user_id='1148a46489e842e6a0c7660c54567798',uuid=2ee81c99-07a0-41e3-bfac-dfb718a8e4c6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d6bc5516-013e-4fd3-981b-a500723e5cc6", "address": "fa:16:3e:13:5a:48", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6bc5516-01", "ovs_interfaceid": "d6bc5516-013e-4fd3-981b-a500723e5cc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:22:08 compute-0 nova_compute[183075]: 2026-01-22 17:22:08.116 183079 DEBUG nova.network.os_vif_util [None req-6ea4c860-0935-4470-9599-9c5d9a3131f9 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converting VIF {"id": "d6bc5516-013e-4fd3-981b-a500723e5cc6", "address": "fa:16:3e:13:5a:48", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6bc5516-01", "ovs_interfaceid": "d6bc5516-013e-4fd3-981b-a500723e5cc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:22:08 compute-0 nova_compute[183075]: 2026-01-22 17:22:08.117 183079 DEBUG nova.network.os_vif_util [None req-6ea4c860-0935-4470-9599-9c5d9a3131f9 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:13:5a:48,bridge_name='br-int',has_traffic_filtering=True,id=d6bc5516-013e-4fd3-981b-a500723e5cc6,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd6bc5516-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:22:08 compute-0 nova_compute[183075]: 2026-01-22 17:22:08.117 183079 DEBUG os_vif [None req-6ea4c860-0935-4470-9599-9c5d9a3131f9 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:13:5a:48,bridge_name='br-int',has_traffic_filtering=True,id=d6bc5516-013e-4fd3-981b-a500723e5cc6,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd6bc5516-01') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:22:08 compute-0 nova_compute[183075]: 2026-01-22 17:22:08.118 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:08 compute-0 nova_compute[183075]: 2026-01-22 17:22:08.119 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6bc5516-01, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:22:08 compute-0 nova_compute[183075]: 2026-01-22 17:22:08.121 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:08.121 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeee918a6-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:22:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:08.122 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:22:08 compute-0 nova_compute[183075]: 2026-01-22 17:22:08.123 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:22:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:08.123 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapeee918a6-60, col_values=(('external_ids', {'iface-id': '15d4de90-41f4-4532-aebd-197c2a33c6d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:22:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:08.124 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:22:08 compute-0 nova_compute[183075]: 2026-01-22 17:22:08.128 183079 INFO os_vif [None req-6ea4c860-0935-4470-9599-9c5d9a3131f9 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:13:5a:48,bridge_name='br-int',has_traffic_filtering=True,id=d6bc5516-013e-4fd3-981b-a500723e5cc6,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd6bc5516-01')
Jan 22 17:22:08 compute-0 nova_compute[183075]: 2026-01-22 17:22:08.130 183079 INFO nova.virt.libvirt.driver [None req-6ea4c860-0935-4470-9599-9c5d9a3131f9 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Deleting instance files /var/lib/nova/instances/2ee81c99-07a0-41e3-bfac-dfb718a8e4c6_del
Jan 22 17:22:08 compute-0 nova_compute[183075]: 2026-01-22 17:22:08.131 183079 INFO nova.virt.libvirt.driver [None req-6ea4c860-0935-4470-9599-9c5d9a3131f9 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Deletion of /var/lib/nova/instances/2ee81c99-07a0-41e3-bfac-dfb718a8e4c6_del complete
Jan 22 17:22:08 compute-0 nova_compute[183075]: 2026-01-22 17:22:08.206 183079 INFO nova.compute.manager [None req-6ea4c860-0935-4470-9599-9c5d9a3131f9 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Took 0.39 seconds to destroy the instance on the hypervisor.
Jan 22 17:22:08 compute-0 nova_compute[183075]: 2026-01-22 17:22:08.207 183079 DEBUG oslo.service.loopingcall [None req-6ea4c860-0935-4470-9599-9c5d9a3131f9 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:22:08 compute-0 nova_compute[183075]: 2026-01-22 17:22:08.207 183079 DEBUG nova.compute.manager [-] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:22:08 compute-0 nova_compute[183075]: 2026-01-22 17:22:08.207 183079 DEBUG nova.network.neutron [-] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:22:08 compute-0 nova_compute[183075]: 2026-01-22 17:22:08.513 183079 DEBUG nova.compute.manager [req-10bd5e40-d7c4-4aac-95bc-60fe9bb6514f req-54e40f35-51d3-4ad1-b442-f381ce07ab3d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Received event network-vif-unplugged-d6bc5516-013e-4fd3-981b-a500723e5cc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:22:08 compute-0 nova_compute[183075]: 2026-01-22 17:22:08.513 183079 DEBUG oslo_concurrency.lockutils [req-10bd5e40-d7c4-4aac-95bc-60fe9bb6514f req-54e40f35-51d3-4ad1-b442-f381ce07ab3d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:22:08 compute-0 nova_compute[183075]: 2026-01-22 17:22:08.514 183079 DEBUG oslo_concurrency.lockutils [req-10bd5e40-d7c4-4aac-95bc-60fe9bb6514f req-54e40f35-51d3-4ad1-b442-f381ce07ab3d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:22:08 compute-0 nova_compute[183075]: 2026-01-22 17:22:08.514 183079 DEBUG oslo_concurrency.lockutils [req-10bd5e40-d7c4-4aac-95bc-60fe9bb6514f req-54e40f35-51d3-4ad1-b442-f381ce07ab3d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:22:08 compute-0 nova_compute[183075]: 2026-01-22 17:22:08.515 183079 DEBUG nova.compute.manager [req-10bd5e40-d7c4-4aac-95bc-60fe9bb6514f req-54e40f35-51d3-4ad1-b442-f381ce07ab3d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] No waiting events found dispatching network-vif-unplugged-d6bc5516-013e-4fd3-981b-a500723e5cc6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:22:08 compute-0 nova_compute[183075]: 2026-01-22 17:22:08.515 183079 DEBUG nova.compute.manager [req-10bd5e40-d7c4-4aac-95bc-60fe9bb6514f req-54e40f35-51d3-4ad1-b442-f381ce07ab3d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Received event network-vif-unplugged-d6bc5516-013e-4fd3-981b-a500723e5cc6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:22:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:08.640 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:22:08 compute-0 nova_compute[183075]: 2026-01-22 17:22:08.641 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:08.643 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:22:08 compute-0 nova_compute[183075]: 2026-01-22 17:22:08.813 183079 INFO nova.compute.manager [None req-b7560d6e-e95c-43f8-9c6d-f3d735011a5f 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Get console output
Jan 22 17:22:08 compute-0 nova_compute[183075]: 2026-01-22 17:22:08.821 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:22:09 compute-0 nova_compute[183075]: 2026-01-22 17:22:09.481 183079 DEBUG nova.network.neutron [-] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:22:09 compute-0 nova_compute[183075]: 2026-01-22 17:22:09.499 183079 INFO nova.compute.manager [-] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Took 1.29 seconds to deallocate network for instance.
Jan 22 17:22:09 compute-0 nova_compute[183075]: 2026-01-22 17:22:09.540 183079 DEBUG oslo_concurrency.lockutils [None req-6ea4c860-0935-4470-9599-9c5d9a3131f9 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:22:09 compute-0 nova_compute[183075]: 2026-01-22 17:22:09.541 183079 DEBUG oslo_concurrency.lockutils [None req-6ea4c860-0935-4470-9599-9c5d9a3131f9 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:22:09 compute-0 ovn_controller[95372]: 2026-01-22T17:22:09Z|00453|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 22 17:22:09 compute-0 nova_compute[183075]: 2026-01-22 17:22:09.650 183079 DEBUG nova.compute.provider_tree [None req-6ea4c860-0935-4470-9599-9c5d9a3131f9 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:22:09 compute-0 nova_compute[183075]: 2026-01-22 17:22:09.668 183079 DEBUG nova.scheduler.client.report [None req-6ea4c860-0935-4470-9599-9c5d9a3131f9 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:22:09 compute-0 nova_compute[183075]: 2026-01-22 17:22:09.689 183079 DEBUG oslo_concurrency.lockutils [None req-6ea4c860-0935-4470-9599-9c5d9a3131f9 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:22:09 compute-0 nova_compute[183075]: 2026-01-22 17:22:09.711 183079 INFO nova.scheduler.client.report [None req-6ea4c860-0935-4470-9599-9c5d9a3131f9 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Deleted allocations for instance 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6
Jan 22 17:22:09 compute-0 nova_compute[183075]: 2026-01-22 17:22:09.803 183079 DEBUG oslo_concurrency.lockutils [None req-6ea4c860-0935-4470-9599-9c5d9a3131f9 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "2ee81c99-07a0-41e3-bfac-dfb718a8e4c6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.992s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:22:10 compute-0 nova_compute[183075]: 2026-01-22 17:22:10.637 183079 DEBUG nova.compute.manager [req-8ae09dc6-a8bd-4d2d-a983-9b9508a17dc8 req-f541b3c7-dc28-4895-94bb-7ad4a22666f9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Received event network-vif-plugged-d6bc5516-013e-4fd3-981b-a500723e5cc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:22:10 compute-0 nova_compute[183075]: 2026-01-22 17:22:10.638 183079 DEBUG oslo_concurrency.lockutils [req-8ae09dc6-a8bd-4d2d-a983-9b9508a17dc8 req-f541b3c7-dc28-4895-94bb-7ad4a22666f9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:22:10 compute-0 nova_compute[183075]: 2026-01-22 17:22:10.638 183079 DEBUG oslo_concurrency.lockutils [req-8ae09dc6-a8bd-4d2d-a983-9b9508a17dc8 req-f541b3c7-dc28-4895-94bb-7ad4a22666f9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:22:10 compute-0 nova_compute[183075]: 2026-01-22 17:22:10.639 183079 DEBUG oslo_concurrency.lockutils [req-8ae09dc6-a8bd-4d2d-a983-9b9508a17dc8 req-f541b3c7-dc28-4895-94bb-7ad4a22666f9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "2ee81c99-07a0-41e3-bfac-dfb718a8e4c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:22:10 compute-0 nova_compute[183075]: 2026-01-22 17:22:10.640 183079 DEBUG nova.compute.manager [req-8ae09dc6-a8bd-4d2d-a983-9b9508a17dc8 req-f541b3c7-dc28-4895-94bb-7ad4a22666f9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] No waiting events found dispatching network-vif-plugged-d6bc5516-013e-4fd3-981b-a500723e5cc6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:22:10 compute-0 nova_compute[183075]: 2026-01-22 17:22:10.641 183079 WARNING nova.compute.manager [req-8ae09dc6-a8bd-4d2d-a983-9b9508a17dc8 req-f541b3c7-dc28-4895-94bb-7ad4a22666f9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Received unexpected event network-vif-plugged-d6bc5516-013e-4fd3-981b-a500723e5cc6 for instance with vm_state deleted and task_state None.
Jan 22 17:22:11 compute-0 nova_compute[183075]: 2026-01-22 17:22:11.121 183079 DEBUG oslo_concurrency.lockutils [None req-8b9b090d-dfa9-41e0-976c-4b33048a159d 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "803da6ed-0f79-4c4d-b054-593b0dee0c0b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:22:11 compute-0 nova_compute[183075]: 2026-01-22 17:22:11.123 183079 DEBUG oslo_concurrency.lockutils [None req-8b9b090d-dfa9-41e0-976c-4b33048a159d 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "803da6ed-0f79-4c4d-b054-593b0dee0c0b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:22:11 compute-0 nova_compute[183075]: 2026-01-22 17:22:11.123 183079 DEBUG oslo_concurrency.lockutils [None req-8b9b090d-dfa9-41e0-976c-4b33048a159d 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "803da6ed-0f79-4c4d-b054-593b0dee0c0b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:22:11 compute-0 nova_compute[183075]: 2026-01-22 17:22:11.124 183079 DEBUG oslo_concurrency.lockutils [None req-8b9b090d-dfa9-41e0-976c-4b33048a159d 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "803da6ed-0f79-4c4d-b054-593b0dee0c0b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:22:11 compute-0 nova_compute[183075]: 2026-01-22 17:22:11.124 183079 DEBUG oslo_concurrency.lockutils [None req-8b9b090d-dfa9-41e0-976c-4b33048a159d 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "803da6ed-0f79-4c4d-b054-593b0dee0c0b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:22:11 compute-0 nova_compute[183075]: 2026-01-22 17:22:11.127 183079 INFO nova.compute.manager [None req-8b9b090d-dfa9-41e0-976c-4b33048a159d 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Terminating instance
Jan 22 17:22:11 compute-0 nova_compute[183075]: 2026-01-22 17:22:11.128 183079 DEBUG nova.compute.manager [None req-8b9b090d-dfa9-41e0-976c-4b33048a159d 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:22:11 compute-0 kernel: tap7cc526f2-d7 (unregistering): left promiscuous mode
Jan 22 17:22:11 compute-0 NetworkManager[55454]: <info>  [1769102531.3353] device (tap7cc526f2-d7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:22:11 compute-0 nova_compute[183075]: 2026-01-22 17:22:11.363 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:11 compute-0 ovn_controller[95372]: 2026-01-22T17:22:11Z|00454|binding|INFO|Releasing lport 7cc526f2-d707-4153-92b0-f1be81a8b1e3 from this chassis (sb_readonly=0)
Jan 22 17:22:11 compute-0 ovn_controller[95372]: 2026-01-22T17:22:11Z|00455|binding|INFO|Setting lport 7cc526f2-d707-4153-92b0-f1be81a8b1e3 down in Southbound
Jan 22 17:22:11 compute-0 ovn_controller[95372]: 2026-01-22T17:22:11Z|00456|binding|INFO|Removing iface tap7cc526f2-d7 ovn-installed in OVS
Jan 22 17:22:11 compute-0 nova_compute[183075]: 2026-01-22 17:22:11.366 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:11 compute-0 nova_compute[183075]: 2026-01-22 17:22:11.377 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:11 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:11.381 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:9a:74 10.100.0.9'], port_security=['fa:16:3e:31:9a:74 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '803da6ed-0f79-4c4d-b054-593b0dee0c0b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eee918a6-66b2-47ae-b702-620a23ef395b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02818155e7af4645bc909d4ba671f11f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c960fc2f-fbcd-48ff-b6bd-d3b900f07332', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.180', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bad11b1d-0e62-4467-b3f8-4f80f0fd75da, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=7cc526f2-d707-4153-92b0-f1be81a8b1e3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:22:11 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:11.382 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 7cc526f2-d707-4153-92b0-f1be81a8b1e3 in datapath eee918a6-66b2-47ae-b702-620a23ef395b unbound from our chassis
Jan 22 17:22:11 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:11.384 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network eee918a6-66b2-47ae-b702-620a23ef395b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:22:11 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:11.385 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2d85e17e-5812-4cc0-8787-602a3387b7ae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:11 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:11.385 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b namespace which is not needed anymore
Jan 22 17:22:11 compute-0 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000024.scope: Deactivated successfully.
Jan 22 17:22:11 compute-0 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000024.scope: Consumed 15.215s CPU time.
Jan 22 17:22:11 compute-0 systemd-machined[154382]: Machine qemu-36-instance-00000024 terminated.
Jan 22 17:22:11 compute-0 nova_compute[183075]: 2026-01-22 17:22:11.549 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:11 compute-0 nova_compute[183075]: 2026-01-22 17:22:11.553 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:11 compute-0 nova_compute[183075]: 2026-01-22 17:22:11.599 183079 INFO nova.virt.libvirt.driver [-] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Instance destroyed successfully.
Jan 22 17:22:11 compute-0 nova_compute[183075]: 2026-01-22 17:22:11.599 183079 DEBUG nova.objects.instance [None req-8b9b090d-dfa9-41e0-976c-4b33048a159d 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lazy-loading 'resources' on Instance uuid 803da6ed-0f79-4c4d-b054-593b0dee0c0b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:22:11 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:11.645 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:22:11 compute-0 nova_compute[183075]: 2026-01-22 17:22:11.671 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:11 compute-0 nova_compute[183075]: 2026-01-22 17:22:11.762 183079 DEBUG nova.virt.libvirt.vif [None req-8b9b090d-dfa9-41e0-976c-4b33048a159d 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:20:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1984705589',display_name='tempest-server-test-1984705589',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1984705589',id=36,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBZ1addGiefngOIZky2bZbl/LpQHm8ydIPPcYG47RZ8D1pquJ/FsAcVn7mNe0QyfiQPXvuSs1eSw/jGOy+x3K8tzxt5N8fIelXKcTizhDHr81ws2uYn+dW6F9Hiy6pJp3A==',key_name='tempest-keypair-test-453437320',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:21:02Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='02818155e7af4645bc909d4ba671f11f',ramdisk_id='',reservation_id='r-v6730tnl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIpSameNetwork-953620552',owner_user_name='tempest-FloatingIpSameNetwork-953620552-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:21:02Z,user_data=None,user_id='1148a46489e842e6a0c7660c54567798',uuid=803da6ed-0f79-4c4d-b054-593b0dee0c0b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7cc526f2-d707-4153-92b0-f1be81a8b1e3", "address": "fa:16:3e:31:9a:74", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cc526f2-d7", "ovs_interfaceid": "7cc526f2-d707-4153-92b0-f1be81a8b1e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:22:11 compute-0 nova_compute[183075]: 2026-01-22 17:22:11.763 183079 DEBUG nova.network.os_vif_util [None req-8b9b090d-dfa9-41e0-976c-4b33048a159d 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converting VIF {"id": "7cc526f2-d707-4153-92b0-f1be81a8b1e3", "address": "fa:16:3e:31:9a:74", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cc526f2-d7", "ovs_interfaceid": "7cc526f2-d707-4153-92b0-f1be81a8b1e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:22:11 compute-0 nova_compute[183075]: 2026-01-22 17:22:11.764 183079 DEBUG nova.network.os_vif_util [None req-8b9b090d-dfa9-41e0-976c-4b33048a159d 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:9a:74,bridge_name='br-int',has_traffic_filtering=True,id=7cc526f2-d707-4153-92b0-f1be81a8b1e3,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7cc526f2-d7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:22:11 compute-0 nova_compute[183075]: 2026-01-22 17:22:11.764 183079 DEBUG os_vif [None req-8b9b090d-dfa9-41e0-976c-4b33048a159d 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:9a:74,bridge_name='br-int',has_traffic_filtering=True,id=7cc526f2-d707-4153-92b0-f1be81a8b1e3,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7cc526f2-d7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:22:11 compute-0 nova_compute[183075]: 2026-01-22 17:22:11.766 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:11 compute-0 nova_compute[183075]: 2026-01-22 17:22:11.766 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7cc526f2-d7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:22:11 compute-0 neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b[227002]: [NOTICE]   (227018) : haproxy version is 2.8.14-c23fe91
Jan 22 17:22:11 compute-0 neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b[227002]: [NOTICE]   (227018) : path to executable is /usr/sbin/haproxy
Jan 22 17:22:11 compute-0 neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b[227002]: [WARNING]  (227018) : Exiting Master process...
Jan 22 17:22:11 compute-0 neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b[227002]: [WARNING]  (227018) : Exiting Master process...
Jan 22 17:22:11 compute-0 nova_compute[183075]: 2026-01-22 17:22:11.769 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:11 compute-0 neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b[227002]: [ALERT]    (227018) : Current worker (227029) exited with code 143 (Terminated)
Jan 22 17:22:11 compute-0 neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b[227002]: [WARNING]  (227018) : All workers exited. Exiting... (0)
Jan 22 17:22:11 compute-0 nova_compute[183075]: 2026-01-22 17:22:11.771 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:11 compute-0 systemd[1]: libpod-3442a1ee6a2c184f5fc6c40c63d18109f86621fd3a893b509606c67e3ad9914d.scope: Deactivated successfully.
Jan 22 17:22:11 compute-0 nova_compute[183075]: 2026-01-22 17:22:11.775 183079 INFO os_vif [None req-8b9b090d-dfa9-41e0-976c-4b33048a159d 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:9a:74,bridge_name='br-int',has_traffic_filtering=True,id=7cc526f2-d707-4153-92b0-f1be81a8b1e3,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7cc526f2-d7')
Jan 22 17:22:11 compute-0 nova_compute[183075]: 2026-01-22 17:22:11.777 183079 INFO nova.virt.libvirt.driver [None req-8b9b090d-dfa9-41e0-976c-4b33048a159d 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Deleting instance files /var/lib/nova/instances/803da6ed-0f79-4c4d-b054-593b0dee0c0b_del
Jan 22 17:22:11 compute-0 nova_compute[183075]: 2026-01-22 17:22:11.777 183079 INFO nova.virt.libvirt.driver [None req-8b9b090d-dfa9-41e0-976c-4b33048a159d 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Deletion of /var/lib/nova/instances/803da6ed-0f79-4c4d-b054-593b0dee0c0b_del complete
Jan 22 17:22:11 compute-0 podman[227692]: 2026-01-22 17:22:11.778865807 +0000 UTC m=+0.297963679 container died 3442a1ee6a2c184f5fc6c40c63d18109f86621fd3a893b509606c67e3ad9914d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 17:22:11 compute-0 nova_compute[183075]: 2026-01-22 17:22:11.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:22:11 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3442a1ee6a2c184f5fc6c40c63d18109f86621fd3a893b509606c67e3ad9914d-userdata-shm.mount: Deactivated successfully.
Jan 22 17:22:11 compute-0 systemd[1]: var-lib-containers-storage-overlay-dd1f8f6301b712456f55b0676b70aa1a1156b7d38c0a295fbe888b5143ea00fc-merged.mount: Deactivated successfully.
Jan 22 17:22:12 compute-0 podman[227692]: 2026-01-22 17:22:12.005778812 +0000 UTC m=+0.524876654 container cleanup 3442a1ee6a2c184f5fc6c40c63d18109f86621fd3a893b509606c67e3ad9914d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 22 17:22:12 compute-0 nova_compute[183075]: 2026-01-22 17:22:12.026 183079 INFO nova.compute.manager [None req-8b9b090d-dfa9-41e0-976c-4b33048a159d 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Took 0.90 seconds to destroy the instance on the hypervisor.
Jan 22 17:22:12 compute-0 nova_compute[183075]: 2026-01-22 17:22:12.027 183079 DEBUG oslo.service.loopingcall [None req-8b9b090d-dfa9-41e0-976c-4b33048a159d 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:22:12 compute-0 nova_compute[183075]: 2026-01-22 17:22:12.027 183079 DEBUG nova.compute.manager [-] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:22:12 compute-0 nova_compute[183075]: 2026-01-22 17:22:12.028 183079 DEBUG nova.network.neutron [-] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:22:12 compute-0 systemd[1]: libpod-conmon-3442a1ee6a2c184f5fc6c40c63d18109f86621fd3a893b509606c67e3ad9914d.scope: Deactivated successfully.
Jan 22 17:22:12 compute-0 podman[227737]: 2026-01-22 17:22:12.091651157 +0000 UTC m=+0.056513481 container remove 3442a1ee6a2c184f5fc6c40c63d18109f86621fd3a893b509606c67e3ad9914d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:22:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:12.097 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d0f1edae-601e-4915-bdd5-024947c2b129]: (4, ('Thu Jan 22 05:22:11 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b (3442a1ee6a2c184f5fc6c40c63d18109f86621fd3a893b509606c67e3ad9914d)\n3442a1ee6a2c184f5fc6c40c63d18109f86621fd3a893b509606c67e3ad9914d\nThu Jan 22 05:22:12 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b (3442a1ee6a2c184f5fc6c40c63d18109f86621fd3a893b509606c67e3ad9914d)\n3442a1ee6a2c184f5fc6c40c63d18109f86621fd3a893b509606c67e3ad9914d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:12.099 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c0b4a6b5-cf4d-496a-9504-f009ab0a07bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:12.100 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeee918a6-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:22:12 compute-0 nova_compute[183075]: 2026-01-22 17:22:12.102 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:12 compute-0 kernel: tapeee918a6-60: left promiscuous mode
Jan 22 17:22:12 compute-0 nova_compute[183075]: 2026-01-22 17:22:12.115 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:12.119 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[69074dcb-3f10-41a9-8165-25cac4ba705b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:12.134 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f2f620ed-e3df-4756-9fb2-eade64a0ef58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:12.135 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[13768fe0-6f2e-4fcc-9bfa-24c053cf73f2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:12.150 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e6eb94c7-7313-44ed-8fbb-6165ea2ffa68]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482556, 'reachable_time': 41687, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227749, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:12 compute-0 systemd[1]: run-netns-ovnmeta\x2deee918a6\x2d66b2\x2d47ae\x2db702\x2d620a23ef395b.mount: Deactivated successfully.
Jan 22 17:22:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:12.157 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:22:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:12.158 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[d18d5e5a-f82f-4e34-9e5c-09a908b17162]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:12 compute-0 nova_compute[183075]: 2026-01-22 17:22:12.841 183079 DEBUG nova.compute.manager [req-1fb8d17c-b28e-4a2b-9cde-c3d07793967f req-572c9ad4-9ba2-4150-9415-0d064a8b42fc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Received event network-vif-unplugged-7cc526f2-d707-4153-92b0-f1be81a8b1e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:22:12 compute-0 nova_compute[183075]: 2026-01-22 17:22:12.841 183079 DEBUG oslo_concurrency.lockutils [req-1fb8d17c-b28e-4a2b-9cde-c3d07793967f req-572c9ad4-9ba2-4150-9415-0d064a8b42fc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "803da6ed-0f79-4c4d-b054-593b0dee0c0b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:22:12 compute-0 nova_compute[183075]: 2026-01-22 17:22:12.841 183079 DEBUG oslo_concurrency.lockutils [req-1fb8d17c-b28e-4a2b-9cde-c3d07793967f req-572c9ad4-9ba2-4150-9415-0d064a8b42fc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "803da6ed-0f79-4c4d-b054-593b0dee0c0b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:22:12 compute-0 nova_compute[183075]: 2026-01-22 17:22:12.842 183079 DEBUG oslo_concurrency.lockutils [req-1fb8d17c-b28e-4a2b-9cde-c3d07793967f req-572c9ad4-9ba2-4150-9415-0d064a8b42fc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "803da6ed-0f79-4c4d-b054-593b0dee0c0b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:22:12 compute-0 nova_compute[183075]: 2026-01-22 17:22:12.842 183079 DEBUG nova.compute.manager [req-1fb8d17c-b28e-4a2b-9cde-c3d07793967f req-572c9ad4-9ba2-4150-9415-0d064a8b42fc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] No waiting events found dispatching network-vif-unplugged-7cc526f2-d707-4153-92b0-f1be81a8b1e3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:22:12 compute-0 nova_compute[183075]: 2026-01-22 17:22:12.842 183079 DEBUG nova.compute.manager [req-1fb8d17c-b28e-4a2b-9cde-c3d07793967f req-572c9ad4-9ba2-4150-9415-0d064a8b42fc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Received event network-vif-unplugged-7cc526f2-d707-4153-92b0-f1be81a8b1e3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:22:13 compute-0 nova_compute[183075]: 2026-01-22 17:22:13.919 183079 INFO nova.compute.manager [None req-c4422d74-317a-4847-892e-1468bcea1dbb 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Get console output
Jan 22 17:22:13 compute-0 nova_compute[183075]: 2026-01-22 17:22:13.924 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:22:14 compute-0 podman[227751]: 2026-01-22 17:22:14.364704809 +0000 UTC m=+0.059821339 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 22 17:22:14 compute-0 podman[227750]: 2026-01-22 17:22:14.396445746 +0000 UTC m=+0.097579234 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 17:22:14 compute-0 podman[227752]: 2026-01-22 17:22:14.397000501 +0000 UTC m=+0.086961275 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, architecture=x86_64, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41, distribution-scope=public, maintainer=Red Hat, Inc., managed_by=edpm_ansible)
Jan 22 17:22:14 compute-0 nova_compute[183075]: 2026-01-22 17:22:14.965 183079 DEBUG nova.compute.manager [req-84fe869f-e405-40ec-ab9d-918c16aee825 req-fe6e55dc-d722-4db4-9d90-d4682344244c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Received event network-vif-plugged-7cc526f2-d707-4153-92b0-f1be81a8b1e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:22:14 compute-0 nova_compute[183075]: 2026-01-22 17:22:14.966 183079 DEBUG oslo_concurrency.lockutils [req-84fe869f-e405-40ec-ab9d-918c16aee825 req-fe6e55dc-d722-4db4-9d90-d4682344244c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "803da6ed-0f79-4c4d-b054-593b0dee0c0b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:22:14 compute-0 nova_compute[183075]: 2026-01-22 17:22:14.966 183079 DEBUG oslo_concurrency.lockutils [req-84fe869f-e405-40ec-ab9d-918c16aee825 req-fe6e55dc-d722-4db4-9d90-d4682344244c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "803da6ed-0f79-4c4d-b054-593b0dee0c0b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:22:14 compute-0 nova_compute[183075]: 2026-01-22 17:22:14.966 183079 DEBUG oslo_concurrency.lockutils [req-84fe869f-e405-40ec-ab9d-918c16aee825 req-fe6e55dc-d722-4db4-9d90-d4682344244c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "803da6ed-0f79-4c4d-b054-593b0dee0c0b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:22:14 compute-0 nova_compute[183075]: 2026-01-22 17:22:14.966 183079 DEBUG nova.compute.manager [req-84fe869f-e405-40ec-ab9d-918c16aee825 req-fe6e55dc-d722-4db4-9d90-d4682344244c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] No waiting events found dispatching network-vif-plugged-7cc526f2-d707-4153-92b0-f1be81a8b1e3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:22:14 compute-0 nova_compute[183075]: 2026-01-22 17:22:14.966 183079 WARNING nova.compute.manager [req-84fe869f-e405-40ec-ab9d-918c16aee825 req-fe6e55dc-d722-4db4-9d90-d4682344244c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Received unexpected event network-vif-plugged-7cc526f2-d707-4153-92b0-f1be81a8b1e3 for instance with vm_state active and task_state deleting.
Jan 22 17:22:15 compute-0 nova_compute[183075]: 2026-01-22 17:22:15.473 183079 DEBUG nova.network.neutron [-] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:22:15 compute-0 nova_compute[183075]: 2026-01-22 17:22:15.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:22:15 compute-0 nova_compute[183075]: 2026-01-22 17:22:15.846 183079 INFO nova.compute.manager [-] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Took 3.82 seconds to deallocate network for instance.
Jan 22 17:22:16 compute-0 nova_compute[183075]: 2026-01-22 17:22:16.107 183079 DEBUG oslo_concurrency.lockutils [None req-8b9b090d-dfa9-41e0-976c-4b33048a159d 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:22:16 compute-0 nova_compute[183075]: 2026-01-22 17:22:16.108 183079 DEBUG oslo_concurrency.lockutils [None req-8b9b090d-dfa9-41e0-976c-4b33048a159d 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:22:16 compute-0 nova_compute[183075]: 2026-01-22 17:22:16.200 183079 DEBUG nova.scheduler.client.report [None req-8b9b090d-dfa9-41e0-976c-4b33048a159d 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Refreshing inventories for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 17:22:16 compute-0 nova_compute[183075]: 2026-01-22 17:22:16.229 183079 DEBUG nova.scheduler.client.report [None req-8b9b090d-dfa9-41e0-976c-4b33048a159d 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Updating ProviderTree inventory for provider 2513134c-f67c-4237-84bf-4ebe2450d610 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 17:22:16 compute-0 nova_compute[183075]: 2026-01-22 17:22:16.230 183079 DEBUG nova.compute.provider_tree [None req-8b9b090d-dfa9-41e0-976c-4b33048a159d 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Updating inventory in ProviderTree for provider 2513134c-f67c-4237-84bf-4ebe2450d610 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 17:22:16 compute-0 nova_compute[183075]: 2026-01-22 17:22:16.251 183079 DEBUG nova.scheduler.client.report [None req-8b9b090d-dfa9-41e0-976c-4b33048a159d 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Refreshing aggregate associations for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 17:22:16 compute-0 nova_compute[183075]: 2026-01-22 17:22:16.285 183079 DEBUG nova.scheduler.client.report [None req-8b9b090d-dfa9-41e0-976c-4b33048a159d 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Refreshing trait associations for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610, traits: COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SVM,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE42,COMPUTE_NODE,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE4A,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_FMA3,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_AUTO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 17:22:16 compute-0 nova_compute[183075]: 2026-01-22 17:22:16.352 183079 DEBUG nova.compute.provider_tree [None req-8b9b090d-dfa9-41e0-976c-4b33048a159d 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:22:16 compute-0 nova_compute[183075]: 2026-01-22 17:22:16.409 183079 DEBUG nova.scheduler.client.report [None req-8b9b090d-dfa9-41e0-976c-4b33048a159d 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:22:16 compute-0 nova_compute[183075]: 2026-01-22 17:22:16.574 183079 DEBUG oslo_concurrency.lockutils [None req-8b9b090d-dfa9-41e0-976c-4b33048a159d 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.466s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:22:16 compute-0 nova_compute[183075]: 2026-01-22 17:22:16.622 183079 INFO nova.scheduler.client.report [None req-8b9b090d-dfa9-41e0-976c-4b33048a159d 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Deleted allocations for instance 803da6ed-0f79-4c4d-b054-593b0dee0c0b
Jan 22 17:22:16 compute-0 nova_compute[183075]: 2026-01-22 17:22:16.681 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:16 compute-0 nova_compute[183075]: 2026-01-22 17:22:16.769 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:16 compute-0 nova_compute[183075]: 2026-01-22 17:22:16.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:22:17 compute-0 nova_compute[183075]: 2026-01-22 17:22:17.271 183079 DEBUG oslo_concurrency.lockutils [None req-8b9b090d-dfa9-41e0-976c-4b33048a159d 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "803da6ed-0f79-4c4d-b054-593b0dee0c0b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:22:19 compute-0 nova_compute[183075]: 2026-01-22 17:22:19.100 183079 INFO nova.compute.manager [None req-f8b35bfb-81f9-4550-a191-91f8f926ac62 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Get console output
Jan 22 17:22:19 compute-0 nova_compute[183075]: 2026-01-22 17:22:19.104 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:22:19 compute-0 nova_compute[183075]: 2026-01-22 17:22:19.782 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:22:19 compute-0 nova_compute[183075]: 2026-01-22 17:22:19.786 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:22:19 compute-0 nova_compute[183075]: 2026-01-22 17:22:19.786 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:22:20 compute-0 nova_compute[183075]: 2026-01-22 17:22:20.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:22:21 compute-0 nova_compute[183075]: 2026-01-22 17:22:21.103 183079 INFO nova.compute.manager [None req-5bd07a86-705a-4d2b-984d-4803f552a30f 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Get console output
Jan 22 17:22:21 compute-0 nova_compute[183075]: 2026-01-22 17:22:21.110 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:22:21 compute-0 podman[227813]: 2026-01-22 17:22:21.391941933 +0000 UTC m=+0.092418948 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 22 17:22:21 compute-0 nova_compute[183075]: 2026-01-22 17:22:21.684 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:21 compute-0 nova_compute[183075]: 2026-01-22 17:22:21.772 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:21 compute-0 nova_compute[183075]: 2026-01-22 17:22:21.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:22:21 compute-0 nova_compute[183075]: 2026-01-22 17:22:21.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:22:21 compute-0 nova_compute[183075]: 2026-01-22 17:22:21.902 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:22:21 compute-0 nova_compute[183075]: 2026-01-22 17:22:21.902 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:22:21 compute-0 nova_compute[183075]: 2026-01-22 17:22:21.903 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:22:21 compute-0 nova_compute[183075]: 2026-01-22 17:22:21.903 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:22:22 compute-0 nova_compute[183075]: 2026-01-22 17:22:22.320 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4f5d7ef-7780-43fe-9ed3-e83542116fa8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:22:22 compute-0 nova_compute[183075]: 2026-01-22 17:22:22.411 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4f5d7ef-7780-43fe-9ed3-e83542116fa8/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:22:22 compute-0 nova_compute[183075]: 2026-01-22 17:22:22.412 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4f5d7ef-7780-43fe-9ed3-e83542116fa8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:22:22 compute-0 nova_compute[183075]: 2026-01-22 17:22:22.505 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4f5d7ef-7780-43fe-9ed3-e83542116fa8/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:22:22 compute-0 nova_compute[183075]: 2026-01-22 17:22:22.691 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:22:22 compute-0 nova_compute[183075]: 2026-01-22 17:22:22.693 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5573MB free_disk=73.33749008178711GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:22:22 compute-0 nova_compute[183075]: 2026-01-22 17:22:22.693 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:22:22 compute-0 nova_compute[183075]: 2026-01-22 17:22:22.693 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:22:23 compute-0 nova_compute[183075]: 2026-01-22 17:22:23.096 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769102528.0953023, 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:22:23 compute-0 nova_compute[183075]: 2026-01-22 17:22:23.097 183079 INFO nova.compute.manager [-] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] VM Stopped (Lifecycle Event)
Jan 22 17:22:23 compute-0 sshd-session[227637]: Connection closed by 206.168.34.195 port 57710 [preauth]
Jan 22 17:22:24 compute-0 nova_compute[183075]: 2026-01-22 17:22:24.016 183079 DEBUG nova.compute.manager [None req-f214a283-8bda-4525-8aa5-b2b4884caaaf - - - - - -] [instance: 2ee81c99-07a0-41e3-bfac-dfb718a8e4c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:22:24 compute-0 nova_compute[183075]: 2026-01-22 17:22:24.070 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance b4f5d7ef-7780-43fe-9ed3-e83542116fa8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:22:24 compute-0 nova_compute[183075]: 2026-01-22 17:22:24.071 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:22:24 compute-0 nova_compute[183075]: 2026-01-22 17:22:24.072 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:22:24 compute-0 nova_compute[183075]: 2026-01-22 17:22:24.113 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:22:25 compute-0 nova_compute[183075]: 2026-01-22 17:22:25.175 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:22:26 compute-0 nova_compute[183075]: 2026-01-22 17:22:26.597 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769102531.5956225, 803da6ed-0f79-4c4d-b054-593b0dee0c0b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:22:26 compute-0 nova_compute[183075]: 2026-01-22 17:22:26.597 183079 INFO nova.compute.manager [-] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] VM Stopped (Lifecycle Event)
Jan 22 17:22:26 compute-0 nova_compute[183075]: 2026-01-22 17:22:26.686 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:26 compute-0 nova_compute[183075]: 2026-01-22 17:22:26.773 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:26 compute-0 nova_compute[183075]: 2026-01-22 17:22:26.972 183079 DEBUG nova.compute.manager [None req-e4558111-2486-46e0-9f58-88e1d9e69907 - - - - - -] [instance: 803da6ed-0f79-4c4d-b054-593b0dee0c0b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:22:26 compute-0 nova_compute[183075]: 2026-01-22 17:22:26.973 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:22:26 compute-0 nova_compute[183075]: 2026-01-22 17:22:26.974 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.280s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:22:27 compute-0 nova_compute[183075]: 2026-01-22 17:22:27.975 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:22:27 compute-0 nova_compute[183075]: 2026-01-22 17:22:27.976 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:22:27 compute-0 nova_compute[183075]: 2026-01-22 17:22:27.976 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:22:28 compute-0 podman[227841]: 2026-01-22 17:22:28.359118523 +0000 UTC m=+0.068148969 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:22:31 compute-0 nova_compute[183075]: 2026-01-22 17:22:31.318 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-b4f5d7ef-7780-43fe-9ed3-e83542116fa8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:22:31 compute-0 nova_compute[183075]: 2026-01-22 17:22:31.319 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-b4f5d7ef-7780-43fe-9ed3-e83542116fa8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:22:31 compute-0 nova_compute[183075]: 2026-01-22 17:22:31.320 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:22:31 compute-0 nova_compute[183075]: 2026-01-22 17:22:31.320 183079 DEBUG nova.objects.instance [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b4f5d7ef-7780-43fe-9ed3-e83542116fa8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:22:31 compute-0 nova_compute[183075]: 2026-01-22 17:22:31.689 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:31 compute-0 nova_compute[183075]: 2026-01-22 17:22:31.775 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:33 compute-0 nova_compute[183075]: 2026-01-22 17:22:33.601 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Updating instance_info_cache with network_info: [{"id": "ce445831-3685-4525-80bd-f4dc617c4911", "address": "fa:16:3e:c5:38:5a", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce445831-36", "ovs_interfaceid": "ce445831-3685-4525-80bd-f4dc617c4911", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:22:33 compute-0 nova_compute[183075]: 2026-01-22 17:22:33.620 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-b4f5d7ef-7780-43fe-9ed3-e83542116fa8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:22:33 compute-0 nova_compute[183075]: 2026-01-22 17:22:33.620 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.207 183079 DEBUG oslo_concurrency.lockutils [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "a7440e72-b977-4601-88ad-ce8a4c72e883" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.207 183079 DEBUG oslo_concurrency.lockutils [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "a7440e72-b977-4601-88ad-ce8a4c72e883" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.225 183079 DEBUG nova.compute.manager [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.286 183079 DEBUG oslo_concurrency.lockutils [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.287 183079 DEBUG oslo_concurrency.lockutils [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.295 183079 DEBUG nova.virt.hardware [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.295 183079 INFO nova.compute.claims [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:22:35 compute-0 podman[227865]: 2026-01-22 17:22:35.352701878 +0000 UTC m=+0.050752039 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.414 183079 DEBUG nova.compute.provider_tree [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.428 183079 DEBUG nova.scheduler.client.report [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.455 183079 DEBUG oslo_concurrency.lockutils [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.456 183079 DEBUG nova.compute.manager [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.506 183079 DEBUG nova.compute.manager [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.506 183079 DEBUG nova.network.neutron [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.529 183079 INFO nova.virt.libvirt.driver [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.545 183079 DEBUG nova.compute.manager [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.622 183079 DEBUG nova.compute.manager [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.623 183079 DEBUG nova.virt.libvirt.driver [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.624 183079 INFO nova.virt.libvirt.driver [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Creating image(s)
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.624 183079 DEBUG oslo_concurrency.lockutils [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "/var/lib/nova/instances/a7440e72-b977-4601-88ad-ce8a4c72e883/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.624 183079 DEBUG oslo_concurrency.lockutils [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "/var/lib/nova/instances/a7440e72-b977-4601-88ad-ce8a4c72e883/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.625 183079 DEBUG oslo_concurrency.lockutils [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "/var/lib/nova/instances/a7440e72-b977-4601-88ad-ce8a4c72e883/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.639 183079 DEBUG oslo_concurrency.processutils [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.692 183079 DEBUG oslo_concurrency.processutils [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.693 183079 DEBUG oslo_concurrency.lockutils [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.694 183079 DEBUG oslo_concurrency.lockutils [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.711 183079 DEBUG oslo_concurrency.processutils [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.765 183079 DEBUG oslo_concurrency.processutils [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.766 183079 DEBUG oslo_concurrency.processutils [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/a7440e72-b977-4601-88ad-ce8a4c72e883/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.828 183079 DEBUG oslo_concurrency.processutils [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/a7440e72-b977-4601-88ad-ce8a4c72e883/disk 1073741824" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.829 183079 DEBUG oslo_concurrency.lockutils [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.829 183079 DEBUG oslo_concurrency.processutils [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.885 183079 DEBUG oslo_concurrency.processutils [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.887 183079 DEBUG nova.virt.disk.api [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Checking if we can resize image /var/lib/nova/instances/a7440e72-b977-4601-88ad-ce8a4c72e883/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.888 183079 DEBUG oslo_concurrency.processutils [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7440e72-b977-4601-88ad-ce8a4c72e883/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.945 183079 DEBUG oslo_concurrency.processutils [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7440e72-b977-4601-88ad-ce8a4c72e883/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.946 183079 DEBUG nova.virt.disk.api [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Cannot resize image /var/lib/nova/instances/a7440e72-b977-4601-88ad-ce8a4c72e883/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.946 183079 DEBUG nova.objects.instance [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lazy-loading 'migration_context' on Instance uuid a7440e72-b977-4601-88ad-ce8a4c72e883 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.964 183079 DEBUG nova.virt.libvirt.driver [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.965 183079 DEBUG nova.virt.libvirt.driver [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Ensure instance console log exists: /var/lib/nova/instances/a7440e72-b977-4601-88ad-ce8a4c72e883/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.965 183079 DEBUG oslo_concurrency.lockutils [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.966 183079 DEBUG oslo_concurrency.lockutils [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:22:35 compute-0 nova_compute[183075]: 2026-01-22 17:22:35.967 183079 DEBUG oslo_concurrency.lockutils [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:22:36 compute-0 nova_compute[183075]: 2026-01-22 17:22:36.691 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:36 compute-0 nova_compute[183075]: 2026-01-22 17:22:36.778 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:37 compute-0 nova_compute[183075]: 2026-01-22 17:22:37.355 183079 DEBUG nova.policy [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:22:40 compute-0 nova_compute[183075]: 2026-01-22 17:22:40.381 183079 DEBUG nova.network.neutron [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Successfully updated port: 6b312169-d575-48ac-b3c3-72634952d91f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:22:40 compute-0 nova_compute[183075]: 2026-01-22 17:22:40.419 183079 DEBUG oslo_concurrency.lockutils [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "refresh_cache-a7440e72-b977-4601-88ad-ce8a4c72e883" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:22:40 compute-0 nova_compute[183075]: 2026-01-22 17:22:40.420 183079 DEBUG oslo_concurrency.lockutils [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquired lock "refresh_cache-a7440e72-b977-4601-88ad-ce8a4c72e883" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:22:40 compute-0 nova_compute[183075]: 2026-01-22 17:22:40.420 183079 DEBUG nova.network.neutron [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:22:40 compute-0 nova_compute[183075]: 2026-01-22 17:22:40.470 183079 DEBUG nova.compute.manager [req-180ce162-bd75-46b3-b657-3141905e960e req-c9d989a7-0fb8-456c-95b1-5a14bfc3fc17 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Received event network-changed-6b312169-d575-48ac-b3c3-72634952d91f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:22:40 compute-0 nova_compute[183075]: 2026-01-22 17:22:40.470 183079 DEBUG nova.compute.manager [req-180ce162-bd75-46b3-b657-3141905e960e req-c9d989a7-0fb8-456c-95b1-5a14bfc3fc17 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Refreshing instance network info cache due to event network-changed-6b312169-d575-48ac-b3c3-72634952d91f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:22:40 compute-0 nova_compute[183075]: 2026-01-22 17:22:40.471 183079 DEBUG oslo_concurrency.lockutils [req-180ce162-bd75-46b3-b657-3141905e960e req-c9d989a7-0fb8-456c-95b1-5a14bfc3fc17 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-a7440e72-b977-4601-88ad-ce8a4c72e883" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:22:40 compute-0 nova_compute[183075]: 2026-01-22 17:22:40.592 183079 DEBUG nova.network.neutron [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:22:41 compute-0 nova_compute[183075]: 2026-01-22 17:22:41.693 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:41 compute-0 nova_compute[183075]: 2026-01-22 17:22:41.780 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:41.939 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:22:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:41.940 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:22:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:41.941 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.717 183079 DEBUG nova.network.neutron [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Updating instance_info_cache with network_info: [{"id": "6b312169-d575-48ac-b3c3-72634952d91f", "address": "fa:16:3e:bf:52:41", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b312169-d5", "ovs_interfaceid": "6b312169-d575-48ac-b3c3-72634952d91f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.737 183079 DEBUG oslo_concurrency.lockutils [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Releasing lock "refresh_cache-a7440e72-b977-4601-88ad-ce8a4c72e883" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.738 183079 DEBUG nova.compute.manager [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Instance network_info: |[{"id": "6b312169-d575-48ac-b3c3-72634952d91f", "address": "fa:16:3e:bf:52:41", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b312169-d5", "ovs_interfaceid": "6b312169-d575-48ac-b3c3-72634952d91f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.739 183079 DEBUG oslo_concurrency.lockutils [req-180ce162-bd75-46b3-b657-3141905e960e req-c9d989a7-0fb8-456c-95b1-5a14bfc3fc17 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-a7440e72-b977-4601-88ad-ce8a4c72e883" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.739 183079 DEBUG nova.network.neutron [req-180ce162-bd75-46b3-b657-3141905e960e req-c9d989a7-0fb8-456c-95b1-5a14bfc3fc17 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Refreshing network info cache for port 6b312169-d575-48ac-b3c3-72634952d91f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.744 183079 DEBUG nova.virt.libvirt.driver [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Start _get_guest_xml network_info=[{"id": "6b312169-d575-48ac-b3c3-72634952d91f", "address": "fa:16:3e:bf:52:41", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b312169-d5", "ovs_interfaceid": "6b312169-d575-48ac-b3c3-72634952d91f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.750 183079 WARNING nova.virt.libvirt.driver [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.757 183079 DEBUG nova.virt.libvirt.host [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.758 183079 DEBUG nova.virt.libvirt.host [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.762 183079 DEBUG nova.virt.libvirt.host [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.763 183079 DEBUG nova.virt.libvirt.host [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.764 183079 DEBUG nova.virt.libvirt.driver [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.764 183079 DEBUG nova.virt.hardware [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.765 183079 DEBUG nova.virt.hardware [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.766 183079 DEBUG nova.virt.hardware [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.766 183079 DEBUG nova.virt.hardware [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.767 183079 DEBUG nova.virt.hardware [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.767 183079 DEBUG nova.virt.hardware [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.768 183079 DEBUG nova.virt.hardware [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.768 183079 DEBUG nova.virt.hardware [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.769 183079 DEBUG nova.virt.hardware [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.770 183079 DEBUG nova.virt.hardware [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.770 183079 DEBUG nova.virt.hardware [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.777 183079 DEBUG nova.virt.libvirt.vif [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:22:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1599866618',display_name='tempest-server-test-1599866618',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1599866618',id=39,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBZ1addGiefngOIZky2bZbl/LpQHm8ydIPPcYG47RZ8D1pquJ/FsAcVn7mNe0QyfiQPXvuSs1eSw/jGOy+x3K8tzxt5N8fIelXKcTizhDHr81ws2uYn+dW6F9Hiy6pJp3A==',key_name='tempest-keypair-test-453437320',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='02818155e7af4645bc909d4ba671f11f',ramdisk_id='',reservation_id='r-pr2e3mbj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='
virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSameNetwork-953620552',owner_user_name='tempest-FloatingIpSameNetwork-953620552-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:22:35Z,user_data=None,user_id='1148a46489e842e6a0c7660c54567798',uuid=a7440e72-b977-4601-88ad-ce8a4c72e883,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6b312169-d575-48ac-b3c3-72634952d91f", "address": "fa:16:3e:bf:52:41", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b312169-d5", "ovs_interfaceid": "6b312169-d575-48ac-b3c3-72634952d91f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.777 183079 DEBUG nova.network.os_vif_util [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converting VIF {"id": "6b312169-d575-48ac-b3c3-72634952d91f", "address": "fa:16:3e:bf:52:41", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b312169-d5", "ovs_interfaceid": "6b312169-d575-48ac-b3c3-72634952d91f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.779 183079 DEBUG nova.network.os_vif_util [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bf:52:41,bridge_name='br-int',has_traffic_filtering=True,id=6b312169-d575-48ac-b3c3-72634952d91f,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6b312169-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.780 183079 DEBUG nova.objects.instance [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lazy-loading 'pci_devices' on Instance uuid a7440e72-b977-4601-88ad-ce8a4c72e883 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.808 183079 DEBUG nova.virt.libvirt.driver [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:22:42 compute-0 nova_compute[183075]:   <uuid>a7440e72-b977-4601-88ad-ce8a4c72e883</uuid>
Jan 22 17:22:42 compute-0 nova_compute[183075]:   <name>instance-00000027</name>
Jan 22 17:22:42 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:22:42 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:22:42 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:22:42 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1599866618</nova:name>
Jan 22 17:22:42 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:22:42</nova:creationTime>
Jan 22 17:22:42 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:22:42 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:22:42 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:22:42 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:22:42 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:22:42 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:22:42 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:22:42 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:22:42 compute-0 nova_compute[183075]:         <nova:user uuid="1148a46489e842e6a0c7660c54567798">tempest-FloatingIpSameNetwork-953620552-project-member</nova:user>
Jan 22 17:22:42 compute-0 nova_compute[183075]:         <nova:project uuid="02818155e7af4645bc909d4ba671f11f">tempest-FloatingIpSameNetwork-953620552</nova:project>
Jan 22 17:22:42 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:22:42 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:22:42 compute-0 nova_compute[183075]:         <nova:port uuid="6b312169-d575-48ac-b3c3-72634952d91f">
Jan 22 17:22:42 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:22:42 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:22:42 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:22:42 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <system>
Jan 22 17:22:42 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:22:42 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:22:42 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:22:42 compute-0 nova_compute[183075]:       <entry name="serial">a7440e72-b977-4601-88ad-ce8a4c72e883</entry>
Jan 22 17:22:42 compute-0 nova_compute[183075]:       <entry name="uuid">a7440e72-b977-4601-88ad-ce8a4c72e883</entry>
Jan 22 17:22:42 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     </system>
Jan 22 17:22:42 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:22:42 compute-0 nova_compute[183075]:   <os>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:   </os>
Jan 22 17:22:42 compute-0 nova_compute[183075]:   <features>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:   </features>
Jan 22 17:22:42 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:22:42 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:22:42 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:22:42 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/a7440e72-b977-4601-88ad-ce8a4c72e883/disk"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:22:42 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:bf:52:41"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:       <target dev="tap6b312169-d5"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:22:42 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/a7440e72-b977-4601-88ad-ce8a4c72e883/console.log" append="off"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <video>
Jan 22 17:22:42 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     </video>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:22:42 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:22:42 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:22:42 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:22:42 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:22:42 compute-0 nova_compute[183075]: </domain>
Jan 22 17:22:42 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.810 183079 DEBUG nova.compute.manager [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Preparing to wait for external event network-vif-plugged-6b312169-d575-48ac-b3c3-72634952d91f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.810 183079 DEBUG oslo_concurrency.lockutils [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "a7440e72-b977-4601-88ad-ce8a4c72e883-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.810 183079 DEBUG oslo_concurrency.lockutils [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "a7440e72-b977-4601-88ad-ce8a4c72e883-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.811 183079 DEBUG oslo_concurrency.lockutils [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "a7440e72-b977-4601-88ad-ce8a4c72e883-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.812 183079 DEBUG nova.virt.libvirt.vif [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:22:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1599866618',display_name='tempest-server-test-1599866618',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1599866618',id=39,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBZ1addGiefngOIZky2bZbl/LpQHm8ydIPPcYG47RZ8D1pquJ/FsAcVn7mNe0QyfiQPXvuSs1eSw/jGOy+x3K8tzxt5N8fIelXKcTizhDHr81ws2uYn+dW6F9Hiy6pJp3A==',key_name='tempest-keypair-test-453437320',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='02818155e7af4645bc909d4ba671f11f',ramdisk_id='',reservation_id='r-pr2e3mbj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_r
ng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSameNetwork-953620552',owner_user_name='tempest-FloatingIpSameNetwork-953620552-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:22:35Z,user_data=None,user_id='1148a46489e842e6a0c7660c54567798',uuid=a7440e72-b977-4601-88ad-ce8a4c72e883,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6b312169-d575-48ac-b3c3-72634952d91f", "address": "fa:16:3e:bf:52:41", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b312169-d5", "ovs_interfaceid": "6b312169-d575-48ac-b3c3-72634952d91f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.812 183079 DEBUG nova.network.os_vif_util [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converting VIF {"id": "6b312169-d575-48ac-b3c3-72634952d91f", "address": "fa:16:3e:bf:52:41", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b312169-d5", "ovs_interfaceid": "6b312169-d575-48ac-b3c3-72634952d91f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.813 183079 DEBUG nova.network.os_vif_util [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bf:52:41,bridge_name='br-int',has_traffic_filtering=True,id=6b312169-d575-48ac-b3c3-72634952d91f,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6b312169-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.814 183079 DEBUG os_vif [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bf:52:41,bridge_name='br-int',has_traffic_filtering=True,id=6b312169-d575-48ac-b3c3-72634952d91f,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6b312169-d5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.814 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.815 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.816 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.821 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.821 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6b312169-d5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.822 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6b312169-d5, col_values=(('external_ids', {'iface-id': '6b312169-d575-48ac-b3c3-72634952d91f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bf:52:41', 'vm-uuid': 'a7440e72-b977-4601-88ad-ce8a4c72e883'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.824 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:42 compute-0 NetworkManager[55454]: <info>  [1769102562.8257] manager: (tap6b312169-d5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/196)
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.826 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.831 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.832 183079 INFO os_vif [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bf:52:41,bridge_name='br-int',has_traffic_filtering=True,id=6b312169-d575-48ac-b3c3-72634952d91f,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6b312169-d5')
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.878 183079 DEBUG nova.virt.libvirt.driver [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.878 183079 DEBUG nova.virt.libvirt.driver [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] No VIF found with MAC fa:16:3e:bf:52:41, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:22:42 compute-0 kernel: tap6b312169-d5: entered promiscuous mode
Jan 22 17:22:42 compute-0 ovn_controller[95372]: 2026-01-22T17:22:42Z|00457|binding|INFO|Claiming lport 6b312169-d575-48ac-b3c3-72634952d91f for this chassis.
Jan 22 17:22:42 compute-0 ovn_controller[95372]: 2026-01-22T17:22:42Z|00458|binding|INFO|6b312169-d575-48ac-b3c3-72634952d91f: Claiming fa:16:3e:bf:52:41 10.100.0.10
Jan 22 17:22:42 compute-0 NetworkManager[55454]: <info>  [1769102562.9665] manager: (tap6b312169-d5): new Tun device (/org/freedesktop/NetworkManager/Devices/197)
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.965 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:42.972 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bf:52:41 10.100.0.10'], port_security=['fa:16:3e:bf:52:41 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'a7440e72-b977-4601-88ad-ce8a4c72e883', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eee918a6-66b2-47ae-b702-620a23ef395b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02818155e7af4645bc909d4ba671f11f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c960fc2f-fbcd-48ff-b6bd-d3b900f07332', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.213'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bad11b1d-0e62-4467-b3f8-4f80f0fd75da, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=6b312169-d575-48ac-b3c3-72634952d91f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:22:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:42.974 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 6b312169-d575-48ac-b3c3-72634952d91f in datapath eee918a6-66b2-47ae-b702-620a23ef395b bound to our chassis
Jan 22 17:22:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:42.976 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network eee918a6-66b2-47ae-b702-620a23ef395b
Jan 22 17:22:42 compute-0 ovn_controller[95372]: 2026-01-22T17:22:42Z|00459|binding|INFO|Setting lport 6b312169-d575-48ac-b3c3-72634952d91f ovn-installed in OVS
Jan 22 17:22:42 compute-0 ovn_controller[95372]: 2026-01-22T17:22:42Z|00460|binding|INFO|Setting lport 6b312169-d575-48ac-b3c3-72634952d91f up in Southbound
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.983 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:42 compute-0 nova_compute[183075]: 2026-01-22 17:22:42.985 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:42.993 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[eb85fb1f-fd88-4e38-9868-706b963a9781]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:42.994 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapeee918a6-61 in ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:22:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:42.997 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapeee918a6-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:22:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:42.997 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[66342ce4-7d0b-4fbb-9ebe-44ad218faaef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:42.998 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[9e1757c7-217e-48b4-b681-d19b0e01a5ad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:43 compute-0 systemd-machined[154382]: New machine qemu-39-instance-00000027.
Jan 22 17:22:43 compute-0 systemd-udevd[227922]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:43.015 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[f9ab5000-1de1-4ede-bdd2-a62856381dcf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:43 compute-0 systemd[1]: Started Virtual Machine qemu-39-instance-00000027.
Jan 22 17:22:43 compute-0 NetworkManager[55454]: <info>  [1769102563.0293] device (tap6b312169-d5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:22:43 compute-0 NetworkManager[55454]: <info>  [1769102563.0305] device (tap6b312169-d5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:43.038 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[44cf5b2e-cc9e-4d84-ba47-f5126a5c5c37]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:43.067 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[9fbb8de7-d124-43ae-b9ca-3d74bc82ecd0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:43.074 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a4de34d0-d49e-4cb6-8f76-bb86f63a9ad4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:43 compute-0 NetworkManager[55454]: <info>  [1769102563.0757] manager: (tapeee918a6-60): new Veth device (/org/freedesktop/NetworkManager/Devices/198)
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:43.106 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[26b65995-f3ef-4517-bfa3-1b23a70f11e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:43.109 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[a60f3f37-c60c-4d13-8b43-d6d84c83f8d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:43 compute-0 NetworkManager[55454]: <info>  [1769102563.1389] device (tapeee918a6-60): carrier: link connected
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:43.147 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[f9d9aba3-0e79-4128-b1e5-004222942eb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:43.170 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[af6313fa-f123-4e6b-9dd9-123564b536a2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeee918a6-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:e2:7e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 130], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492684, 'reachable_time': 42858, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227953, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:43.191 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2cf56591-28dd-416f-9405-cb098308e37e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe68:e27e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 492684, 'tstamp': 492684}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227954, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:43 compute-0 nova_compute[183075]: 2026-01-22 17:22:43.201 183079 DEBUG nova.compute.manager [req-72dccd27-9a3e-4f4c-8ffd-a360bee7113f req-0b3d9ee7-189b-4697-afcd-c3d7a844d031 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Received event network-vif-plugged-6b312169-d575-48ac-b3c3-72634952d91f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:22:43 compute-0 nova_compute[183075]: 2026-01-22 17:22:43.201 183079 DEBUG oslo_concurrency.lockutils [req-72dccd27-9a3e-4f4c-8ffd-a360bee7113f req-0b3d9ee7-189b-4697-afcd-c3d7a844d031 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "a7440e72-b977-4601-88ad-ce8a4c72e883-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:22:43 compute-0 nova_compute[183075]: 2026-01-22 17:22:43.202 183079 DEBUG oslo_concurrency.lockutils [req-72dccd27-9a3e-4f4c-8ffd-a360bee7113f req-0b3d9ee7-189b-4697-afcd-c3d7a844d031 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "a7440e72-b977-4601-88ad-ce8a4c72e883-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:22:43 compute-0 nova_compute[183075]: 2026-01-22 17:22:43.202 183079 DEBUG oslo_concurrency.lockutils [req-72dccd27-9a3e-4f4c-8ffd-a360bee7113f req-0b3d9ee7-189b-4697-afcd-c3d7a844d031 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "a7440e72-b977-4601-88ad-ce8a4c72e883-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:22:43 compute-0 nova_compute[183075]: 2026-01-22 17:22:43.202 183079 DEBUG nova.compute.manager [req-72dccd27-9a3e-4f4c-8ffd-a360bee7113f req-0b3d9ee7-189b-4697-afcd-c3d7a844d031 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Processing event network-vif-plugged-6b312169-d575-48ac-b3c3-72634952d91f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:43.212 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[498a53ab-258e-4214-85f3-4a613fea0293]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeee918a6-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:e2:7e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 130], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492684, 'reachable_time': 42858, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227955, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:43.255 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c1a3032e-d290-4990-96f7-f97eaf1a7d8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:43.328 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[86ee6817-c546-44ba-a278-83108cbc81ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:43.330 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeee918a6-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:43.330 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:43.331 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeee918a6-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:22:43 compute-0 NetworkManager[55454]: <info>  [1769102563.3827] manager: (tapeee918a6-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/199)
Jan 22 17:22:43 compute-0 kernel: tapeee918a6-60: entered promiscuous mode
Jan 22 17:22:43 compute-0 nova_compute[183075]: 2026-01-22 17:22:43.383 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:43.387 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapeee918a6-60, col_values=(('external_ids', {'iface-id': '15d4de90-41f4-4532-aebd-197c2a33c6d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:22:43 compute-0 nova_compute[183075]: 2026-01-22 17:22:43.388 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:43 compute-0 ovn_controller[95372]: 2026-01-22T17:22:43Z|00461|binding|INFO|Releasing lport 15d4de90-41f4-4532-aebd-197c2a33c6d6 from this chassis (sb_readonly=0)
Jan 22 17:22:43 compute-0 nova_compute[183075]: 2026-01-22 17:22:43.389 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:43.390 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/eee918a6-66b2-47ae-b702-620a23ef395b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/eee918a6-66b2-47ae-b702-620a23ef395b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:43.391 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f5fb2f53-620e-422b-8fc1-51955e3806a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:43.392 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/eee918a6-66b2-47ae-b702-620a23ef395b.pid.haproxy
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID eee918a6-66b2-47ae-b702-620a23ef395b
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:22:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:43.393 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'env', 'PROCESS_TAG=haproxy-eee918a6-66b2-47ae-b702-620a23ef395b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/eee918a6-66b2-47ae-b702-620a23ef395b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:22:43 compute-0 nova_compute[183075]: 2026-01-22 17:22:43.399 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:43 compute-0 nova_compute[183075]: 2026-01-22 17:22:43.443 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102563.4430504, a7440e72-b977-4601-88ad-ce8a4c72e883 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:22:43 compute-0 nova_compute[183075]: 2026-01-22 17:22:43.444 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] VM Started (Lifecycle Event)
Jan 22 17:22:43 compute-0 nova_compute[183075]: 2026-01-22 17:22:43.446 183079 DEBUG nova.compute.manager [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:22:43 compute-0 nova_compute[183075]: 2026-01-22 17:22:43.450 183079 DEBUG nova.virt.libvirt.driver [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:22:43 compute-0 nova_compute[183075]: 2026-01-22 17:22:43.455 183079 INFO nova.virt.libvirt.driver [-] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Instance spawned successfully.
Jan 22 17:22:43 compute-0 nova_compute[183075]: 2026-01-22 17:22:43.455 183079 DEBUG nova.virt.libvirt.driver [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:22:43 compute-0 podman[227994]: 2026-01-22 17:22:43.811563081 +0000 UTC m=+0.041324331 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:22:44 compute-0 nova_compute[183075]: 2026-01-22 17:22:44.016 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:22:44 compute-0 podman[227994]: 2026-01-22 17:22:44.018850038 +0000 UTC m=+0.248611228 container create 965a6e6fb48931b041a0430e955e9d11408557368b5064c9e68f9e7227950093 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 22 17:22:44 compute-0 nova_compute[183075]: 2026-01-22 17:22:44.026 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:22:44 compute-0 nova_compute[183075]: 2026-01-22 17:22:44.032 183079 DEBUG nova.virt.libvirt.driver [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:22:44 compute-0 nova_compute[183075]: 2026-01-22 17:22:44.033 183079 DEBUG nova.virt.libvirt.driver [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:22:44 compute-0 nova_compute[183075]: 2026-01-22 17:22:44.034 183079 DEBUG nova.virt.libvirt.driver [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:22:44 compute-0 nova_compute[183075]: 2026-01-22 17:22:44.035 183079 DEBUG nova.virt.libvirt.driver [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:22:44 compute-0 nova_compute[183075]: 2026-01-22 17:22:44.036 183079 DEBUG nova.virt.libvirt.driver [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:22:44 compute-0 nova_compute[183075]: 2026-01-22 17:22:44.037 183079 DEBUG nova.virt.libvirt.driver [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:22:44 compute-0 nova_compute[183075]: 2026-01-22 17:22:44.048 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:22:44 compute-0 nova_compute[183075]: 2026-01-22 17:22:44.049 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102563.4440577, a7440e72-b977-4601-88ad-ce8a4c72e883 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:22:44 compute-0 nova_compute[183075]: 2026-01-22 17:22:44.049 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] VM Paused (Lifecycle Event)
Jan 22 17:22:44 compute-0 systemd[1]: Started libpod-conmon-965a6e6fb48931b041a0430e955e9d11408557368b5064c9e68f9e7227950093.scope.
Jan 22 17:22:44 compute-0 nova_compute[183075]: 2026-01-22 17:22:44.082 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:22:44 compute-0 nova_compute[183075]: 2026-01-22 17:22:44.087 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102563.448675, a7440e72-b977-4601-88ad-ce8a4c72e883 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:22:44 compute-0 nova_compute[183075]: 2026-01-22 17:22:44.087 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] VM Resumed (Lifecycle Event)
Jan 22 17:22:44 compute-0 nova_compute[183075]: 2026-01-22 17:22:44.101 183079 INFO nova.compute.manager [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Took 8.48 seconds to spawn the instance on the hypervisor.
Jan 22 17:22:44 compute-0 nova_compute[183075]: 2026-01-22 17:22:44.102 183079 DEBUG nova.compute.manager [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:22:44 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:22:44 compute-0 nova_compute[183075]: 2026-01-22 17:22:44.111 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:22:44 compute-0 nova_compute[183075]: 2026-01-22 17:22:44.114 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:22:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4eedbd7cc9a36558f05b5597a3eb5eb97b510e492fcef83ea116135477665986/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:22:44 compute-0 podman[227994]: 2026-01-22 17:22:44.132868505 +0000 UTC m=+0.362629775 container init 965a6e6fb48931b041a0430e955e9d11408557368b5064c9e68f9e7227950093 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:22:44 compute-0 nova_compute[183075]: 2026-01-22 17:22:44.138 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:22:44 compute-0 podman[227994]: 2026-01-22 17:22:44.140553828 +0000 UTC m=+0.370315028 container start 965a6e6fb48931b041a0430e955e9d11408557368b5064c9e68f9e7227950093 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 22 17:22:44 compute-0 neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b[228022]: [NOTICE]   (228026) : New worker (228028) forked
Jan 22 17:22:44 compute-0 neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b[228022]: [NOTICE]   (228026) : Loading success.
Jan 22 17:22:44 compute-0 nova_compute[183075]: 2026-01-22 17:22:44.171 183079 INFO nova.compute.manager [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Took 8.90 seconds to build instance.
Jan 22 17:22:44 compute-0 nova_compute[183075]: 2026-01-22 17:22:44.187 183079 DEBUG oslo_concurrency.lockutils [None req-9ac82ab6-e222-40ba-9520-4f91335eff4b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "a7440e72-b977-4601-88ad-ce8a4c72e883" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.980s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:22:44 compute-0 nova_compute[183075]: 2026-01-22 17:22:44.609 183079 DEBUG nova.network.neutron [req-180ce162-bd75-46b3-b657-3141905e960e req-c9d989a7-0fb8-456c-95b1-5a14bfc3fc17 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Updated VIF entry in instance network info cache for port 6b312169-d575-48ac-b3c3-72634952d91f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:22:44 compute-0 nova_compute[183075]: 2026-01-22 17:22:44.610 183079 DEBUG nova.network.neutron [req-180ce162-bd75-46b3-b657-3141905e960e req-c9d989a7-0fb8-456c-95b1-5a14bfc3fc17 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Updating instance_info_cache with network_info: [{"id": "6b312169-d575-48ac-b3c3-72634952d91f", "address": "fa:16:3e:bf:52:41", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b312169-d5", "ovs_interfaceid": "6b312169-d575-48ac-b3c3-72634952d91f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:22:44 compute-0 nova_compute[183075]: 2026-01-22 17:22:44.624 183079 DEBUG oslo_concurrency.lockutils [req-180ce162-bd75-46b3-b657-3141905e960e req-c9d989a7-0fb8-456c-95b1-5a14bfc3fc17 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-a7440e72-b977-4601-88ad-ce8a4c72e883" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:22:45 compute-0 podman[228038]: 2026-01-22 17:22:45.353781267 +0000 UTC m=+0.057814226 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 22 17:22:45 compute-0 podman[228039]: 2026-01-22 17:22:45.366253146 +0000 UTC m=+0.069659848 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 22 17:22:45 compute-0 podman[228037]: 2026-01-22 17:22:45.390027083 +0000 UTC m=+0.088915526 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 22 17:22:45 compute-0 nova_compute[183075]: 2026-01-22 17:22:45.492 183079 DEBUG nova.compute.manager [req-0ec15ecd-c52c-408a-8097-c1408a7b05ae req-a7d55d50-c40a-46a3-8394-5c231c2b9ed4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Received event network-vif-plugged-6b312169-d575-48ac-b3c3-72634952d91f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:22:45 compute-0 nova_compute[183075]: 2026-01-22 17:22:45.493 183079 DEBUG oslo_concurrency.lockutils [req-0ec15ecd-c52c-408a-8097-c1408a7b05ae req-a7d55d50-c40a-46a3-8394-5c231c2b9ed4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "a7440e72-b977-4601-88ad-ce8a4c72e883-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:22:45 compute-0 nova_compute[183075]: 2026-01-22 17:22:45.493 183079 DEBUG oslo_concurrency.lockutils [req-0ec15ecd-c52c-408a-8097-c1408a7b05ae req-a7d55d50-c40a-46a3-8394-5c231c2b9ed4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "a7440e72-b977-4601-88ad-ce8a4c72e883-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:22:45 compute-0 nova_compute[183075]: 2026-01-22 17:22:45.494 183079 DEBUG oslo_concurrency.lockutils [req-0ec15ecd-c52c-408a-8097-c1408a7b05ae req-a7d55d50-c40a-46a3-8394-5c231c2b9ed4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "a7440e72-b977-4601-88ad-ce8a4c72e883-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:22:45 compute-0 nova_compute[183075]: 2026-01-22 17:22:45.494 183079 DEBUG nova.compute.manager [req-0ec15ecd-c52c-408a-8097-c1408a7b05ae req-a7d55d50-c40a-46a3-8394-5c231c2b9ed4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] No waiting events found dispatching network-vif-plugged-6b312169-d575-48ac-b3c3-72634952d91f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:22:45 compute-0 nova_compute[183075]: 2026-01-22 17:22:45.494 183079 WARNING nova.compute.manager [req-0ec15ecd-c52c-408a-8097-c1408a7b05ae req-a7d55d50-c40a-46a3-8394-5c231c2b9ed4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Received unexpected event network-vif-plugged-6b312169-d575-48ac-b3c3-72634952d91f for instance with vm_state active and task_state None.
Jan 22 17:22:46 compute-0 nova_compute[183075]: 2026-01-22 17:22:46.480 183079 INFO nova.compute.manager [None req-8c88b8cb-12d7-45ec-8d30-232aec231a6a 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Get console output
Jan 22 17:22:46 compute-0 nova_compute[183075]: 2026-01-22 17:22:46.484 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:22:46 compute-0 nova_compute[183075]: 2026-01-22 17:22:46.695 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:47 compute-0 nova_compute[183075]: 2026-01-22 17:22:47.825 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:49 compute-0 nova_compute[183075]: 2026-01-22 17:22:49.897 183079 DEBUG oslo_concurrency.lockutils [None req-fc217ff1-b25a-4a06-92fb-00c82a27b184 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "b4f5d7ef-7780-43fe-9ed3-e83542116fa8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:22:49 compute-0 nova_compute[183075]: 2026-01-22 17:22:49.898 183079 DEBUG oslo_concurrency.lockutils [None req-fc217ff1-b25a-4a06-92fb-00c82a27b184 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "b4f5d7ef-7780-43fe-9ed3-e83542116fa8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:22:49 compute-0 nova_compute[183075]: 2026-01-22 17:22:49.899 183079 DEBUG oslo_concurrency.lockutils [None req-fc217ff1-b25a-4a06-92fb-00c82a27b184 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "b4f5d7ef-7780-43fe-9ed3-e83542116fa8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:22:49 compute-0 nova_compute[183075]: 2026-01-22 17:22:49.899 183079 DEBUG oslo_concurrency.lockutils [None req-fc217ff1-b25a-4a06-92fb-00c82a27b184 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "b4f5d7ef-7780-43fe-9ed3-e83542116fa8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:22:49 compute-0 nova_compute[183075]: 2026-01-22 17:22:49.899 183079 DEBUG oslo_concurrency.lockutils [None req-fc217ff1-b25a-4a06-92fb-00c82a27b184 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "b4f5d7ef-7780-43fe-9ed3-e83542116fa8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:22:49 compute-0 nova_compute[183075]: 2026-01-22 17:22:49.900 183079 INFO nova.compute.manager [None req-fc217ff1-b25a-4a06-92fb-00c82a27b184 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Terminating instance
Jan 22 17:22:49 compute-0 nova_compute[183075]: 2026-01-22 17:22:49.901 183079 DEBUG nova.compute.manager [None req-fc217ff1-b25a-4a06-92fb-00c82a27b184 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:22:49 compute-0 kernel: tapce445831-36 (unregistering): left promiscuous mode
Jan 22 17:22:49 compute-0 NetworkManager[55454]: <info>  [1769102569.9251] device (tapce445831-36): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:22:49 compute-0 nova_compute[183075]: 2026-01-22 17:22:49.954 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:49 compute-0 ovn_controller[95372]: 2026-01-22T17:22:49Z|00462|binding|INFO|Releasing lport ce445831-3685-4525-80bd-f4dc617c4911 from this chassis (sb_readonly=0)
Jan 22 17:22:49 compute-0 ovn_controller[95372]: 2026-01-22T17:22:49Z|00463|binding|INFO|Setting lport ce445831-3685-4525-80bd-f4dc617c4911 down in Southbound
Jan 22 17:22:49 compute-0 ovn_controller[95372]: 2026-01-22T17:22:49Z|00464|binding|INFO|Removing iface tapce445831-36 ovn-installed in OVS
Jan 22 17:22:49 compute-0 nova_compute[183075]: 2026-01-22 17:22:49.974 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:49 compute-0 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d00000025.scope: Deactivated successfully.
Jan 22 17:22:49 compute-0 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d00000025.scope: Consumed 16.131s CPU time.
Jan 22 17:22:50 compute-0 systemd-machined[154382]: Machine qemu-37-instance-00000025 terminated.
Jan 22 17:22:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:50.004 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c5:38:5a 10.100.0.3'], port_security=['fa:16:3e:c5:38:5a 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'b4f5d7ef-7780-43fe-9ed3-e83542116fa8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'neutron:revision_number': '4', 'neutron:security_group_ids': '94e3530f-8012-4817-a338-7919b109ef3e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12343ce0-7cef-4f7f-9439-6550d878d4ba, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=ce445831-3685-4525-80bd-f4dc617c4911) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:22:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:50.005 104629 INFO neutron.agent.ovn.metadata.agent [-] Port ce445831-3685-4525-80bd-f4dc617c4911 in datapath 44326f3c-1431-44d6-85ce-61ecbbb5ed7a unbound from our chassis
Jan 22 17:22:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:50.006 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 44326f3c-1431-44d6-85ce-61ecbbb5ed7a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:22:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:50.007 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4a1e397c-e93a-48db-a4d4-640fca723c88]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:50.008 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a namespace which is not needed anymore
Jan 22 17:22:50 compute-0 neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[227337]: [NOTICE]   (227342) : haproxy version is 2.8.14-c23fe91
Jan 22 17:22:50 compute-0 neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[227337]: [NOTICE]   (227342) : path to executable is /usr/sbin/haproxy
Jan 22 17:22:50 compute-0 neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[227337]: [WARNING]  (227342) : Exiting Master process...
Jan 22 17:22:50 compute-0 neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[227337]: [WARNING]  (227342) : Exiting Master process...
Jan 22 17:22:50 compute-0 neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[227337]: [ALERT]    (227342) : Current worker (227344) exited with code 143 (Terminated)
Jan 22 17:22:50 compute-0 neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[227337]: [WARNING]  (227342) : All workers exited. Exiting... (0)
Jan 22 17:22:50 compute-0 systemd[1]: libpod-d6408edcaee6f293a780cdb210de85f2c13c219ecb5789eaebe87747d660f204.scope: Deactivated successfully.
Jan 22 17:22:50 compute-0 podman[228128]: 2026-01-22 17:22:50.155950994 +0000 UTC m=+0.054181060 container died d6408edcaee6f293a780cdb210de85f2c13c219ecb5789eaebe87747d660f204 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:22:50 compute-0 nova_compute[183075]: 2026-01-22 17:22:50.157 183079 INFO nova.virt.libvirt.driver [-] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Instance destroyed successfully.
Jan 22 17:22:50 compute-0 nova_compute[183075]: 2026-01-22 17:22:50.158 183079 DEBUG nova.objects.instance [None req-fc217ff1-b25a-4a06-92fb-00c82a27b184 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lazy-loading 'resources' on Instance uuid b4f5d7ef-7780-43fe-9ed3-e83542116fa8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:22:50 compute-0 nova_compute[183075]: 2026-01-22 17:22:50.170 183079 DEBUG nova.virt.libvirt.vif [None req-fc217ff1-b25a-4a06-92fb-00c82a27b184 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:21:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-1-1949948609',display_name='tempest-server-1-1949948609',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-1-1949948609',id=37,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMynqvoXBI34qH0ZDQNq01ZPmk41Xdi4JxkvNO4nJ7GrdggfhpXkKQhhUZRV3fv3bSwcXMqi04ipV9xe7IsTm4GBRV+U9o6VN5JSMLulp+KIvjPuEmjq0h65ra09/zQB9w==',key_name='tempest-keypair-test-1762910261',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:21:24Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e4c0bb18013747dfad2e25b2495090eb',ramdisk_id='',reservation_id='r-j8vsskaf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-PortForwardingTestJSON-1240706675',owner_user_name='tempest-PortForwardingTestJSON-1240706675-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:21:24Z,user_data=None,user_id='852aea4e08344f39ae07e6b57393c767',uuid=b4f5d7ef-7780-43fe-9ed3-e83542116fa8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ce445831-3685-4525-80bd-f4dc617c4911", "address": "fa:16:3e:c5:38:5a", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce445831-36", "ovs_interfaceid": "ce445831-3685-4525-80bd-f4dc617c4911", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:22:50 compute-0 nova_compute[183075]: 2026-01-22 17:22:50.170 183079 DEBUG nova.network.os_vif_util [None req-fc217ff1-b25a-4a06-92fb-00c82a27b184 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Converting VIF {"id": "ce445831-3685-4525-80bd-f4dc617c4911", "address": "fa:16:3e:c5:38:5a", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce445831-36", "ovs_interfaceid": "ce445831-3685-4525-80bd-f4dc617c4911", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:22:50 compute-0 nova_compute[183075]: 2026-01-22 17:22:50.171 183079 DEBUG nova.network.os_vif_util [None req-fc217ff1-b25a-4a06-92fb-00c82a27b184 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c5:38:5a,bridge_name='br-int',has_traffic_filtering=True,id=ce445831-3685-4525-80bd-f4dc617c4911,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapce445831-36') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:22:50 compute-0 nova_compute[183075]: 2026-01-22 17:22:50.171 183079 DEBUG os_vif [None req-fc217ff1-b25a-4a06-92fb-00c82a27b184 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c5:38:5a,bridge_name='br-int',has_traffic_filtering=True,id=ce445831-3685-4525-80bd-f4dc617c4911,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapce445831-36') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:22:50 compute-0 nova_compute[183075]: 2026-01-22 17:22:50.172 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:50 compute-0 nova_compute[183075]: 2026-01-22 17:22:50.173 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapce445831-36, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:22:50 compute-0 nova_compute[183075]: 2026-01-22 17:22:50.174 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:50 compute-0 nova_compute[183075]: 2026-01-22 17:22:50.176 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:50 compute-0 nova_compute[183075]: 2026-01-22 17:22:50.178 183079 INFO os_vif [None req-fc217ff1-b25a-4a06-92fb-00c82a27b184 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c5:38:5a,bridge_name='br-int',has_traffic_filtering=True,id=ce445831-3685-4525-80bd-f4dc617c4911,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapce445831-36')
Jan 22 17:22:50 compute-0 nova_compute[183075]: 2026-01-22 17:22:50.178 183079 INFO nova.virt.libvirt.driver [None req-fc217ff1-b25a-4a06-92fb-00c82a27b184 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Deleting instance files /var/lib/nova/instances/b4f5d7ef-7780-43fe-9ed3-e83542116fa8_del
Jan 22 17:22:50 compute-0 nova_compute[183075]: 2026-01-22 17:22:50.179 183079 INFO nova.virt.libvirt.driver [None req-fc217ff1-b25a-4a06-92fb-00c82a27b184 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Deletion of /var/lib/nova/instances/b4f5d7ef-7780-43fe-9ed3-e83542116fa8_del complete
Jan 22 17:22:50 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d6408edcaee6f293a780cdb210de85f2c13c219ecb5789eaebe87747d660f204-userdata-shm.mount: Deactivated successfully.
Jan 22 17:22:50 compute-0 systemd[1]: var-lib-containers-storage-overlay-edf6d6b1b0aa25cb11fb54da7a8df253d0383fd7d373a159881698b2490805e3-merged.mount: Deactivated successfully.
Jan 22 17:22:50 compute-0 podman[228128]: 2026-01-22 17:22:50.197347445 +0000 UTC m=+0.095577531 container cleanup d6408edcaee6f293a780cdb210de85f2c13c219ecb5789eaebe87747d660f204 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:22:50 compute-0 systemd[1]: libpod-conmon-d6408edcaee6f293a780cdb210de85f2c13c219ecb5789eaebe87747d660f204.scope: Deactivated successfully.
Jan 22 17:22:50 compute-0 nova_compute[183075]: 2026-01-22 17:22:50.230 183079 INFO nova.compute.manager [None req-fc217ff1-b25a-4a06-92fb-00c82a27b184 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Took 0.33 seconds to destroy the instance on the hypervisor.
Jan 22 17:22:50 compute-0 nova_compute[183075]: 2026-01-22 17:22:50.230 183079 DEBUG oslo.service.loopingcall [None req-fc217ff1-b25a-4a06-92fb-00c82a27b184 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:22:50 compute-0 nova_compute[183075]: 2026-01-22 17:22:50.231 183079 DEBUG nova.compute.manager [-] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:22:50 compute-0 nova_compute[183075]: 2026-01-22 17:22:50.231 183079 DEBUG nova.network.neutron [-] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:22:50 compute-0 podman[228172]: 2026-01-22 17:22:50.259396092 +0000 UTC m=+0.040551771 container remove d6408edcaee6f293a780cdb210de85f2c13c219ecb5789eaebe87747d660f204 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 17:22:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:50.265 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1711cabb-99cd-4bbb-8112-0206981f321f]: (4, ('Thu Jan 22 05:22:50 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a (d6408edcaee6f293a780cdb210de85f2c13c219ecb5789eaebe87747d660f204)\nd6408edcaee6f293a780cdb210de85f2c13c219ecb5789eaebe87747d660f204\nThu Jan 22 05:22:50 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a (d6408edcaee6f293a780cdb210de85f2c13c219ecb5789eaebe87747d660f204)\nd6408edcaee6f293a780cdb210de85f2c13c219ecb5789eaebe87747d660f204\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:50.267 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[92ceefc5-7eba-4b2b-a5e4-c0fe0cfa3c55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:50.268 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44326f3c-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:22:50 compute-0 kernel: tap44326f3c-10: left promiscuous mode
Jan 22 17:22:50 compute-0 nova_compute[183075]: 2026-01-22 17:22:50.269 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:50 compute-0 nova_compute[183075]: 2026-01-22 17:22:50.281 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:50.285 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[193164ac-ed58-47c0-956b-d7e06e18d462]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:50.300 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[934da0bb-2717-4afe-8b7f-c47806d5b645]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:50.301 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e5c2c5a2-044f-40e6-b955-8adedc6a2778]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:50.322 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[47a2c27d-3726-422a-8502-1a30d5c96c92]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 484706, 'reachable_time': 36039, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228187, 'error': None, 'target': 'ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:50 compute-0 systemd[1]: run-netns-ovnmeta\x2d44326f3c\x2d1431\x2d44d6\x2d85ce\x2d61ecbbb5ed7a.mount: Deactivated successfully.
Jan 22 17:22:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:50.326 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:22:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:22:50.327 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[3512263e-fbc7-4553-b031-bc2c39de91de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:22:50 compute-0 nova_compute[183075]: 2026-01-22 17:22:50.495 183079 DEBUG nova.compute.manager [req-48e75a3d-ff1b-470a-a15c-57cf4137d486 req-c2d28bcf-811d-4e7c-b6f7-084c5fa0e3f6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Received event network-vif-unplugged-ce445831-3685-4525-80bd-f4dc617c4911 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:22:50 compute-0 nova_compute[183075]: 2026-01-22 17:22:50.496 183079 DEBUG oslo_concurrency.lockutils [req-48e75a3d-ff1b-470a-a15c-57cf4137d486 req-c2d28bcf-811d-4e7c-b6f7-084c5fa0e3f6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "b4f5d7ef-7780-43fe-9ed3-e83542116fa8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:22:50 compute-0 nova_compute[183075]: 2026-01-22 17:22:50.496 183079 DEBUG oslo_concurrency.lockutils [req-48e75a3d-ff1b-470a-a15c-57cf4137d486 req-c2d28bcf-811d-4e7c-b6f7-084c5fa0e3f6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b4f5d7ef-7780-43fe-9ed3-e83542116fa8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:22:50 compute-0 nova_compute[183075]: 2026-01-22 17:22:50.497 183079 DEBUG oslo_concurrency.lockutils [req-48e75a3d-ff1b-470a-a15c-57cf4137d486 req-c2d28bcf-811d-4e7c-b6f7-084c5fa0e3f6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b4f5d7ef-7780-43fe-9ed3-e83542116fa8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:22:50 compute-0 nova_compute[183075]: 2026-01-22 17:22:50.497 183079 DEBUG nova.compute.manager [req-48e75a3d-ff1b-470a-a15c-57cf4137d486 req-c2d28bcf-811d-4e7c-b6f7-084c5fa0e3f6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] No waiting events found dispatching network-vif-unplugged-ce445831-3685-4525-80bd-f4dc617c4911 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:22:50 compute-0 nova_compute[183075]: 2026-01-22 17:22:50.497 183079 DEBUG nova.compute.manager [req-48e75a3d-ff1b-470a-a15c-57cf4137d486 req-c2d28bcf-811d-4e7c-b6f7-084c5fa0e3f6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Received event network-vif-unplugged-ce445831-3685-4525-80bd-f4dc617c4911 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:22:51 compute-0 nova_compute[183075]: 2026-01-22 17:22:51.697 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:52 compute-0 nova_compute[183075]: 2026-01-22 17:22:52.221 183079 INFO nova.compute.manager [None req-0e5210ca-9f7f-4081-a15d-1f5ca9c97c93 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Get console output
Jan 22 17:22:52 compute-0 nova_compute[183075]: 2026-01-22 17:22:52.227 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:22:52 compute-0 podman[228188]: 2026-01-22 17:22:52.354693526 +0000 UTC m=+0.060784194 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:22:52 compute-0 nova_compute[183075]: 2026-01-22 17:22:52.880 183079 DEBUG nova.compute.manager [req-5f451571-9270-4c84-b614-a9ba7f0dabe8 req-e65e8b5a-ed45-4949-9cd2-7ed17ece903a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Received event network-vif-plugged-ce445831-3685-4525-80bd-f4dc617c4911 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:22:52 compute-0 nova_compute[183075]: 2026-01-22 17:22:52.881 183079 DEBUG oslo_concurrency.lockutils [req-5f451571-9270-4c84-b614-a9ba7f0dabe8 req-e65e8b5a-ed45-4949-9cd2-7ed17ece903a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "b4f5d7ef-7780-43fe-9ed3-e83542116fa8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:22:52 compute-0 nova_compute[183075]: 2026-01-22 17:22:52.882 183079 DEBUG oslo_concurrency.lockutils [req-5f451571-9270-4c84-b614-a9ba7f0dabe8 req-e65e8b5a-ed45-4949-9cd2-7ed17ece903a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b4f5d7ef-7780-43fe-9ed3-e83542116fa8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:22:52 compute-0 nova_compute[183075]: 2026-01-22 17:22:52.883 183079 DEBUG oslo_concurrency.lockutils [req-5f451571-9270-4c84-b614-a9ba7f0dabe8 req-e65e8b5a-ed45-4949-9cd2-7ed17ece903a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b4f5d7ef-7780-43fe-9ed3-e83542116fa8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:22:52 compute-0 nova_compute[183075]: 2026-01-22 17:22:52.883 183079 DEBUG nova.compute.manager [req-5f451571-9270-4c84-b614-a9ba7f0dabe8 req-e65e8b5a-ed45-4949-9cd2-7ed17ece903a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] No waiting events found dispatching network-vif-plugged-ce445831-3685-4525-80bd-f4dc617c4911 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:22:52 compute-0 nova_compute[183075]: 2026-01-22 17:22:52.884 183079 WARNING nova.compute.manager [req-5f451571-9270-4c84-b614-a9ba7f0dabe8 req-e65e8b5a-ed45-4949-9cd2-7ed17ece903a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Received unexpected event network-vif-plugged-ce445831-3685-4525-80bd-f4dc617c4911 for instance with vm_state active and task_state deleting.
Jan 22 17:22:53 compute-0 nova_compute[183075]: 2026-01-22 17:22:53.203 183079 DEBUG nova.network.neutron [-] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:22:53 compute-0 nova_compute[183075]: 2026-01-22 17:22:53.229 183079 INFO nova.compute.manager [-] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Took 3.00 seconds to deallocate network for instance.
Jan 22 17:22:53 compute-0 nova_compute[183075]: 2026-01-22 17:22:53.269 183079 DEBUG oslo_concurrency.lockutils [None req-fc217ff1-b25a-4a06-92fb-00c82a27b184 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:22:53 compute-0 nova_compute[183075]: 2026-01-22 17:22:53.269 183079 DEBUG oslo_concurrency.lockutils [None req-fc217ff1-b25a-4a06-92fb-00c82a27b184 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:22:53 compute-0 nova_compute[183075]: 2026-01-22 17:22:53.364 183079 DEBUG nova.compute.provider_tree [None req-fc217ff1-b25a-4a06-92fb-00c82a27b184 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:22:53 compute-0 nova_compute[183075]: 2026-01-22 17:22:53.382 183079 DEBUG nova.scheduler.client.report [None req-fc217ff1-b25a-4a06-92fb-00c82a27b184 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:22:53 compute-0 nova_compute[183075]: 2026-01-22 17:22:53.403 183079 DEBUG oslo_concurrency.lockutils [None req-fc217ff1-b25a-4a06-92fb-00c82a27b184 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:22:53 compute-0 nova_compute[183075]: 2026-01-22 17:22:53.427 183079 INFO nova.scheduler.client.report [None req-fc217ff1-b25a-4a06-92fb-00c82a27b184 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Deleted allocations for instance b4f5d7ef-7780-43fe-9ed3-e83542116fa8
Jan 22 17:22:53 compute-0 nova_compute[183075]: 2026-01-22 17:22:53.484 183079 DEBUG oslo_concurrency.lockutils [None req-fc217ff1-b25a-4a06-92fb-00c82a27b184 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "b4f5d7ef-7780-43fe-9ed3-e83542116fa8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.586s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:22:55 compute-0 nova_compute[183075]: 2026-01-22 17:22:55.214 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:56 compute-0 ovn_controller[95372]: 2026-01-22T17:22:56Z|00080|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:bf:52:41 10.100.0.10
Jan 22 17:22:56 compute-0 ovn_controller[95372]: 2026-01-22T17:22:56Z|00081|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bf:52:41 10.100.0.10
Jan 22 17:22:56 compute-0 nova_compute[183075]: 2026-01-22 17:22:56.701 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:57 compute-0 nova_compute[183075]: 2026-01-22 17:22:57.327 183079 INFO nova.compute.manager [None req-9fb73414-6a83-4a6b-a6a1-6d51df53a2ac 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Get console output
Jan 22 17:22:57 compute-0 nova_compute[183075]: 2026-01-22 17:22:57.333 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:22:59 compute-0 podman[228224]: 2026-01-22 17:22:59.37172167 +0000 UTC m=+0.075473752 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:22:59 compute-0 nova_compute[183075]: 2026-01-22 17:22:59.950 183079 DEBUG oslo_concurrency.lockutils [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "7c4cc341-c93c-4077-a541-31a8487482f0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:22:59 compute-0 nova_compute[183075]: 2026-01-22 17:22:59.951 183079 DEBUG oslo_concurrency.lockutils [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "7c4cc341-c93c-4077-a541-31a8487482f0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:22:59 compute-0 nova_compute[183075]: 2026-01-22 17:22:59.969 183079 DEBUG nova.compute.manager [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.064 183079 DEBUG oslo_concurrency.lockutils [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.064 183079 DEBUG oslo_concurrency.lockutils [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.069 183079 DEBUG nova.virt.hardware [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.069 183079 INFO nova.compute.claims [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.215 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.219 183079 DEBUG nova.compute.provider_tree [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.235 183079 DEBUG nova.scheduler.client.report [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.259 183079 DEBUG oslo_concurrency.lockutils [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.260 183079 DEBUG nova.compute.manager [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.316 183079 DEBUG nova.compute.manager [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.317 183079 DEBUG nova.network.neutron [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.336 183079 INFO nova.virt.libvirt.driver [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.353 183079 DEBUG nova.compute.manager [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.452 183079 DEBUG nova.compute.manager [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.453 183079 DEBUG nova.virt.libvirt.driver [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.454 183079 INFO nova.virt.libvirt.driver [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Creating image(s)
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.454 183079 DEBUG oslo_concurrency.lockutils [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "/var/lib/nova/instances/7c4cc341-c93c-4077-a541-31a8487482f0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.455 183079 DEBUG oslo_concurrency.lockutils [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "/var/lib/nova/instances/7c4cc341-c93c-4077-a541-31a8487482f0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.455 183079 DEBUG oslo_concurrency.lockutils [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "/var/lib/nova/instances/7c4cc341-c93c-4077-a541-31a8487482f0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.466 183079 DEBUG oslo_concurrency.processutils [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.517 183079 DEBUG oslo_concurrency.processutils [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.518 183079 DEBUG oslo_concurrency.lockutils [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.519 183079 DEBUG oslo_concurrency.lockutils [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.529 183079 DEBUG oslo_concurrency.processutils [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.576 183079 DEBUG oslo_concurrency.processutils [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.578 183079 DEBUG oslo_concurrency.processutils [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/7c4cc341-c93c-4077-a541-31a8487482f0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.607 183079 DEBUG oslo_concurrency.processutils [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/7c4cc341-c93c-4077-a541-31a8487482f0/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.608 183079 DEBUG oslo_concurrency.lockutils [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.090s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.609 183079 DEBUG oslo_concurrency.processutils [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.676 183079 DEBUG oslo_concurrency.processutils [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.677 183079 DEBUG nova.virt.disk.api [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Checking if we can resize image /var/lib/nova/instances/7c4cc341-c93c-4077-a541-31a8487482f0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.678 183079 DEBUG oslo_concurrency.processutils [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7c4cc341-c93c-4077-a541-31a8487482f0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.724 183079 DEBUG oslo_concurrency.processutils [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7c4cc341-c93c-4077-a541-31a8487482f0/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.725 183079 DEBUG nova.virt.disk.api [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Cannot resize image /var/lib/nova/instances/7c4cc341-c93c-4077-a541-31a8487482f0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.726 183079 DEBUG nova.objects.instance [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lazy-loading 'migration_context' on Instance uuid 7c4cc341-c93c-4077-a541-31a8487482f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.740 183079 DEBUG nova.virt.libvirt.driver [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.741 183079 DEBUG nova.virt.libvirt.driver [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Ensure instance console log exists: /var/lib/nova/instances/7c4cc341-c93c-4077-a541-31a8487482f0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.742 183079 DEBUG oslo_concurrency.lockutils [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.743 183079 DEBUG oslo_concurrency.lockutils [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:00 compute-0 nova_compute[183075]: 2026-01-22 17:23:00.743 183079 DEBUG oslo_concurrency.lockutils [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:01 compute-0 nova_compute[183075]: 2026-01-22 17:23:01.476 183079 DEBUG nova.policy [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:23:01 compute-0 nova_compute[183075]: 2026-01-22 17:23:01.704 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:01 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:01.943 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:01 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:01.944 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:23:01 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:01 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:01 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:01 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:01 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:01 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:23:01 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:02 compute-0 nova_compute[183075]: 2026-01-22 17:23:02.455 183079 INFO nova.compute.manager [None req-1c1bf1c2-f727-4fb4-a93e-bd344c236e4e 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Get console output
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.461 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.463 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.5182512
Jan 22 17:23:02 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.10:42942 [22/Jan/2026:17:23:01.942] listener listener/metadata 0/0/0/520/520 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:23:02 compute-0 nova_compute[183075]: 2026-01-22 17:23:02.464 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.481 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.482 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.514 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.515 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 168 time: 0.0326676
Jan 22 17:23:02 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.10:42944 [22/Jan/2026:17:23:02.480] listener listener/metadata 0/0/0/35/35 200 152 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.520 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.521 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.540 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.541 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0199947
Jan 22 17:23:02 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.10:42952 [22/Jan/2026:17:23:02.519] listener listener/metadata 0/0/0/21/21 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.545 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.546 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.567 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.568 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0217724
Jan 22 17:23:02 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.10:42956 [22/Jan/2026:17:23:02.545] listener listener/metadata 0/0/0/23/23 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.573 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.575 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.593 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:02 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.10:42968 [22/Jan/2026:17:23:02.573] listener listener/metadata 0/0/0/20/20 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.593 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0188243
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.598 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.599 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.616 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.617 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0175920
Jan 22 17:23:02 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.10:42982 [22/Jan/2026:17:23:02.598] listener listener/metadata 0/0/0/18/18 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.622 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.623 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.639 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.640 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0164366
Jan 22 17:23:02 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.10:42990 [22/Jan/2026:17:23:02.622] listener listener/metadata 0/0/0/17/17 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.644 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.645 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.663 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.663 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 151 time: 0.0185444
Jan 22 17:23:02 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.10:43006 [22/Jan/2026:17:23:02.644] listener listener/metadata 0/0/0/19/19 200 135 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.670 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.670 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.689 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.690 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0193417
Jan 22 17:23:02 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.10:43020 [22/Jan/2026:17:23:02.669] listener listener/metadata 0/0/0/20/20 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.697 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.698 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.720 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.721 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0231969
Jan 22 17:23:02 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.10:43036 [22/Jan/2026:17:23:02.696] listener listener/metadata 0/0/0/24/24 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.731 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.732 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:02 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.10:43040 [22/Jan/2026:17:23:02.730] listener listener/metadata 0/0/0/28/28 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.759 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0269172
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.768 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.769 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.788 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.789 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0199907
Jan 22 17:23:02 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.10:43042 [22/Jan/2026:17:23:02.767] listener listener/metadata 0/0/0/21/21 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.796 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.797 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.816 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.817 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0198243
Jan 22 17:23:02 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.10:43044 [22/Jan/2026:17:23:02.796] listener listener/metadata 0/0/0/20/20 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.820 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.821 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.842 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.842 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0211177
Jan 22 17:23:02 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.10:43060 [22/Jan/2026:17:23:02.820] listener listener/metadata 0/0/0/22/22 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.847 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.847 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.863 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.864 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0163465
Jan 22 17:23:02 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.10:43066 [22/Jan/2026:17:23:02.846] listener listener/metadata 0/0/0/17/17 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.868 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.869 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.884 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:02.884 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0150933
Jan 22 17:23:02 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.10:43074 [22/Jan/2026:17:23:02.868] listener listener/metadata 0/0/0/16/16 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:23:03 compute-0 nova_compute[183075]: 2026-01-22 17:23:03.409 183079 DEBUG nova.network.neutron [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Successfully updated port: dbad268a-40fe-4d38-aab1-20fbfbcc0775 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:23:03 compute-0 nova_compute[183075]: 2026-01-22 17:23:03.428 183079 DEBUG oslo_concurrency.lockutils [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "refresh_cache-7c4cc341-c93c-4077-a541-31a8487482f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:23:03 compute-0 nova_compute[183075]: 2026-01-22 17:23:03.428 183079 DEBUG oslo_concurrency.lockutils [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquired lock "refresh_cache-7c4cc341-c93c-4077-a541-31a8487482f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:23:03 compute-0 nova_compute[183075]: 2026-01-22 17:23:03.428 183079 DEBUG nova.network.neutron [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:23:03 compute-0 nova_compute[183075]: 2026-01-22 17:23:03.524 183079 DEBUG nova.compute.manager [req-50172a5a-9be4-464a-8f36-b2fc5e29ef57 req-74202e77-59a0-4bf5-9f3b-9ebae062d069 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Received event network-changed-dbad268a-40fe-4d38-aab1-20fbfbcc0775 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:23:03 compute-0 nova_compute[183075]: 2026-01-22 17:23:03.525 183079 DEBUG nova.compute.manager [req-50172a5a-9be4-464a-8f36-b2fc5e29ef57 req-74202e77-59a0-4bf5-9f3b-9ebae062d069 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Refreshing instance network info cache due to event network-changed-dbad268a-40fe-4d38-aab1-20fbfbcc0775. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:23:03 compute-0 nova_compute[183075]: 2026-01-22 17:23:03.525 183079 DEBUG oslo_concurrency.lockutils [req-50172a5a-9be4-464a-8f36-b2fc5e29ef57 req-74202e77-59a0-4bf5-9f3b-9ebae062d069 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-7c4cc341-c93c-4077-a541-31a8487482f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:23:03 compute-0 nova_compute[183075]: 2026-01-22 17:23:03.589 183079 DEBUG nova.network.neutron [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.154 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769102570.1529324, b4f5d7ef-7780-43fe-9ed3-e83542116fa8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.154 183079 INFO nova.compute.manager [-] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] VM Stopped (Lifecycle Event)
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.177 183079 DEBUG nova.compute.manager [None req-3e138d79-8f15-4562-9743-61ffbcfd1170 - - - - - -] [instance: b4f5d7ef-7780-43fe-9ed3-e83542116fa8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.218 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.334 183079 DEBUG nova.network.neutron [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Updating instance_info_cache with network_info: [{"id": "dbad268a-40fe-4d38-aab1-20fbfbcc0775", "address": "fa:16:3e:d2:a8:c8", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdbad268a-40", "ovs_interfaceid": "dbad268a-40fe-4d38-aab1-20fbfbcc0775", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.351 183079 DEBUG oslo_concurrency.lockutils [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Releasing lock "refresh_cache-7c4cc341-c93c-4077-a541-31a8487482f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.351 183079 DEBUG nova.compute.manager [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Instance network_info: |[{"id": "dbad268a-40fe-4d38-aab1-20fbfbcc0775", "address": "fa:16:3e:d2:a8:c8", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdbad268a-40", "ovs_interfaceid": "dbad268a-40fe-4d38-aab1-20fbfbcc0775", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.351 183079 DEBUG oslo_concurrency.lockutils [req-50172a5a-9be4-464a-8f36-b2fc5e29ef57 req-74202e77-59a0-4bf5-9f3b-9ebae062d069 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-7c4cc341-c93c-4077-a541-31a8487482f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.351 183079 DEBUG nova.network.neutron [req-50172a5a-9be4-464a-8f36-b2fc5e29ef57 req-74202e77-59a0-4bf5-9f3b-9ebae062d069 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Refreshing network info cache for port dbad268a-40fe-4d38-aab1-20fbfbcc0775 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.353 183079 DEBUG nova.virt.libvirt.driver [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Start _get_guest_xml network_info=[{"id": "dbad268a-40fe-4d38-aab1-20fbfbcc0775", "address": "fa:16:3e:d2:a8:c8", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdbad268a-40", "ovs_interfaceid": "dbad268a-40fe-4d38-aab1-20fbfbcc0775", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.358 183079 WARNING nova.virt.libvirt.driver [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.362 183079 DEBUG nova.virt.libvirt.host [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.363 183079 DEBUG nova.virt.libvirt.host [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.366 183079 DEBUG nova.virt.libvirt.host [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.366 183079 DEBUG nova.virt.libvirt.host [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.367 183079 DEBUG nova.virt.libvirt.driver [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.367 183079 DEBUG nova.virt.hardware [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.367 183079 DEBUG nova.virt.hardware [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.367 183079 DEBUG nova.virt.hardware [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.367 183079 DEBUG nova.virt.hardware [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.368 183079 DEBUG nova.virt.hardware [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.368 183079 DEBUG nova.virt.hardware [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.368 183079 DEBUG nova.virt.hardware [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.368 183079 DEBUG nova.virt.hardware [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.368 183079 DEBUG nova.virt.hardware [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.368 183079 DEBUG nova.virt.hardware [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.368 183079 DEBUG nova.virt.hardware [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.371 183079 DEBUG nova.virt.libvirt.vif [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:22:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-0-352913770',display_name='tempest-server-0-352913770',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-0-352913770',id=40,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMynqvoXBI34qH0ZDQNq01ZPmk41Xdi4JxkvNO4nJ7GrdggfhpXkKQhhUZRV3fv3bSwcXMqi04ipV9xe7IsTm4GBRV+U9o6VN5JSMLulp+KIvjPuEmjq0h65ra09/zQB9w==',key_name='tempest-keypair-test-1762910261',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e4c0bb18013747dfad2e25b2495090eb',ramdisk_id='',reservation_id='r-xwmek31g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortForwardingTestJSON-1240706675',owner_user_name='tempest-PortForwardingTestJSON-1240706675-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:23:00Z,user_data=None,user_id='852aea4e08344f39ae07e6b57393c767',uuid=7c4cc341-c93c-4077-a541-31a8487482f0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dbad268a-40fe-4d38-aab1-20fbfbcc0775", "address": "fa:16:3e:d2:a8:c8", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdbad268a-40", "ovs_interfaceid": "dbad268a-40fe-4d38-aab1-20fbfbcc0775", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.372 183079 DEBUG nova.network.os_vif_util [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Converting VIF {"id": "dbad268a-40fe-4d38-aab1-20fbfbcc0775", "address": "fa:16:3e:d2:a8:c8", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdbad268a-40", "ovs_interfaceid": "dbad268a-40fe-4d38-aab1-20fbfbcc0775", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.372 183079 DEBUG nova.network.os_vif_util [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d2:a8:c8,bridge_name='br-int',has_traffic_filtering=True,id=dbad268a-40fe-4d38-aab1-20fbfbcc0775,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdbad268a-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.373 183079 DEBUG nova.objects.instance [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lazy-loading 'pci_devices' on Instance uuid 7c4cc341-c93c-4077-a541-31a8487482f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.386 183079 DEBUG nova.virt.libvirt.driver [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:23:05 compute-0 nova_compute[183075]:   <uuid>7c4cc341-c93c-4077-a541-31a8487482f0</uuid>
Jan 22 17:23:05 compute-0 nova_compute[183075]:   <name>instance-00000028</name>
Jan 22 17:23:05 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:23:05 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:23:05 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:23:05 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:       <nova:name>tempest-server-0-352913770</nova:name>
Jan 22 17:23:05 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:23:05</nova:creationTime>
Jan 22 17:23:05 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:23:05 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:23:05 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:23:05 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:23:05 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:23:05 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:23:05 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:23:05 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:23:05 compute-0 nova_compute[183075]:         <nova:user uuid="852aea4e08344f39ae07e6b57393c767">tempest-PortForwardingTestJSON-1240706675-project-member</nova:user>
Jan 22 17:23:05 compute-0 nova_compute[183075]:         <nova:project uuid="e4c0bb18013747dfad2e25b2495090eb">tempest-PortForwardingTestJSON-1240706675</nova:project>
Jan 22 17:23:05 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:23:05 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:23:05 compute-0 nova_compute[183075]:         <nova:port uuid="dbad268a-40fe-4d38-aab1-20fbfbcc0775">
Jan 22 17:23:05 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:23:05 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:23:05 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:23:05 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <system>
Jan 22 17:23:05 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:23:05 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:23:05 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:23:05 compute-0 nova_compute[183075]:       <entry name="serial">7c4cc341-c93c-4077-a541-31a8487482f0</entry>
Jan 22 17:23:05 compute-0 nova_compute[183075]:       <entry name="uuid">7c4cc341-c93c-4077-a541-31a8487482f0</entry>
Jan 22 17:23:05 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     </system>
Jan 22 17:23:05 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:23:05 compute-0 nova_compute[183075]:   <os>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:   </os>
Jan 22 17:23:05 compute-0 nova_compute[183075]:   <features>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:   </features>
Jan 22 17:23:05 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:23:05 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:23:05 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:23:05 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/7c4cc341-c93c-4077-a541-31a8487482f0/disk"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:23:05 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:d2:a8:c8"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:       <target dev="tapdbad268a-40"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:23:05 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/7c4cc341-c93c-4077-a541-31a8487482f0/console.log" append="off"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <video>
Jan 22 17:23:05 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     </video>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:23:05 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:23:05 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:23:05 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:23:05 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:23:05 compute-0 nova_compute[183075]: </domain>
Jan 22 17:23:05 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.387 183079 DEBUG nova.compute.manager [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Preparing to wait for external event network-vif-plugged-dbad268a-40fe-4d38-aab1-20fbfbcc0775 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.387 183079 DEBUG oslo_concurrency.lockutils [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "7c4cc341-c93c-4077-a541-31a8487482f0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.387 183079 DEBUG oslo_concurrency.lockutils [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "7c4cc341-c93c-4077-a541-31a8487482f0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.387 183079 DEBUG oslo_concurrency.lockutils [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "7c4cc341-c93c-4077-a541-31a8487482f0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.388 183079 DEBUG nova.virt.libvirt.vif [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:22:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-0-352913770',display_name='tempest-server-0-352913770',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-0-352913770',id=40,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMynqvoXBI34qH0ZDQNq01ZPmk41Xdi4JxkvNO4nJ7GrdggfhpXkKQhhUZRV3fv3bSwcXMqi04ipV9xe7IsTm4GBRV+U9o6VN5JSMLulp+KIvjPuEmjq0h65ra09/zQB9w==',key_name='tempest-keypair-test-1762910261',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e4c0bb18013747dfad2e25b2495090eb',ramdisk_id='',reservation_id='r-xwmek31g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortForwardingTestJSON-1240706675',owner_user_name='tempest-PortForwardingTestJSON-1240706675-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:23:00Z,user_data=None,user_id='852aea4e08344f39ae07e6b57393c767',uuid=7c4cc341-c93c-4077-a541-31a8487482f0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dbad268a-40fe-4d38-aab1-20fbfbcc0775", "address": "fa:16:3e:d2:a8:c8", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdbad268a-40", "ovs_interfaceid": "dbad268a-40fe-4d38-aab1-20fbfbcc0775", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.388 183079 DEBUG nova.network.os_vif_util [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Converting VIF {"id": "dbad268a-40fe-4d38-aab1-20fbfbcc0775", "address": "fa:16:3e:d2:a8:c8", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdbad268a-40", "ovs_interfaceid": "dbad268a-40fe-4d38-aab1-20fbfbcc0775", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.389 183079 DEBUG nova.network.os_vif_util [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d2:a8:c8,bridge_name='br-int',has_traffic_filtering=True,id=dbad268a-40fe-4d38-aab1-20fbfbcc0775,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdbad268a-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.389 183079 DEBUG os_vif [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d2:a8:c8,bridge_name='br-int',has_traffic_filtering=True,id=dbad268a-40fe-4d38-aab1-20fbfbcc0775,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdbad268a-40') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.389 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.390 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.390 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.392 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.392 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdbad268a-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.392 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdbad268a-40, col_values=(('external_ids', {'iface-id': 'dbad268a-40fe-4d38-aab1-20fbfbcc0775', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d2:a8:c8', 'vm-uuid': '7c4cc341-c93c-4077-a541-31a8487482f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.394 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:05 compute-0 NetworkManager[55454]: <info>  [1769102585.3959] manager: (tapdbad268a-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/200)
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.396 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.403 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.403 183079 INFO os_vif [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d2:a8:c8,bridge_name='br-int',has_traffic_filtering=True,id=dbad268a-40fe-4d38-aab1-20fbfbcc0775,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdbad268a-40')
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.453 183079 DEBUG nova.virt.libvirt.driver [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.453 183079 DEBUG nova.virt.libvirt.driver [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] No VIF found with MAC fa:16:3e:d2:a8:c8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:23:05 compute-0 podman[228269]: 2026-01-22 17:23:05.507574472 +0000 UTC m=+0.063786513 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 17:23:05 compute-0 kernel: tapdbad268a-40: entered promiscuous mode
Jan 22 17:23:05 compute-0 NetworkManager[55454]: <info>  [1769102585.5307] manager: (tapdbad268a-40): new Tun device (/org/freedesktop/NetworkManager/Devices/201)
Jan 22 17:23:05 compute-0 ovn_controller[95372]: 2026-01-22T17:23:05Z|00465|binding|INFO|Claiming lport dbad268a-40fe-4d38-aab1-20fbfbcc0775 for this chassis.
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.530 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:05 compute-0 ovn_controller[95372]: 2026-01-22T17:23:05Z|00466|binding|INFO|dbad268a-40fe-4d38-aab1-20fbfbcc0775: Claiming fa:16:3e:d2:a8:c8 10.100.0.11
Jan 22 17:23:05 compute-0 ovn_controller[95372]: 2026-01-22T17:23:05Z|00467|binding|INFO|Setting lport dbad268a-40fe-4d38-aab1-20fbfbcc0775 ovn-installed in OVS
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.549 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.551 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:05 compute-0 ovn_controller[95372]: 2026-01-22T17:23:05Z|00468|binding|INFO|Setting lport dbad268a-40fe-4d38-aab1-20fbfbcc0775 up in Southbound
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:05.557 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d2:a8:c8 10.100.0.11'], port_security=['fa:16:3e:d2:a8:c8 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '7c4cc341-c93c-4077-a541-31a8487482f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'neutron:revision_number': '2', 'neutron:security_group_ids': '94e3530f-8012-4817-a338-7919b109ef3e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12343ce0-7cef-4f7f-9439-6550d878d4ba, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=dbad268a-40fe-4d38-aab1-20fbfbcc0775) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:05.559 104629 INFO neutron.agent.ovn.metadata.agent [-] Port dbad268a-40fe-4d38-aab1-20fbfbcc0775 in datapath 44326f3c-1431-44d6-85ce-61ecbbb5ed7a bound to our chassis
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:05.562 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 44326f3c-1431-44d6-85ce-61ecbbb5ed7a
Jan 22 17:23:05 compute-0 systemd-udevd[228306]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:23:05 compute-0 systemd-machined[154382]: New machine qemu-40-instance-00000028.
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:05.578 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b06c3798-f630-495e-aae0-69ad65e181cb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:05.580 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap44326f3c-11 in ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:05.582 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap44326f3c-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:05.582 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e6e003f4-ce1c-402f-ad6c-e6b8b111b257]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:05.583 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[207b175a-fb45-4879-b46b-0ba4761558a4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:23:05 compute-0 NetworkManager[55454]: <info>  [1769102585.5864] device (tapdbad268a-40): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:23:05 compute-0 NetworkManager[55454]: <info>  [1769102585.5873] device (tapdbad268a-40): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:23:05 compute-0 systemd[1]: Started Virtual Machine qemu-40-instance-00000028.
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:05.595 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[b5ceb8d4-edac-4f5f-9042-1283135e9aae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:05.609 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[47b24e29-0794-463e-97d1-5fbf37dbf1fa]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:05.637 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[3b5c83b3-5ff2-475c-b36c-c04d35857e2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:05.642 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6410f46b-1861-4fe9-8e67-3f29557d2f9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:23:05 compute-0 NetworkManager[55454]: <info>  [1769102585.6441] manager: (tap44326f3c-10): new Veth device (/org/freedesktop/NetworkManager/Devices/202)
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:05.673 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[e9350fd9-e43b-43df-a5af-92b74085e9b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:05.676 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[297ad2ed-2310-4520-b6a2-f29ab74862e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:23:05 compute-0 NetworkManager[55454]: <info>  [1769102585.7009] device (tap44326f3c-10): carrier: link connected
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:05.706 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[0311c44b-e3f7-4bc3-b706-a7c7a87f544a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:05.728 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a435a3e0-050d-450c-be72-cbffc3d4b9ad]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap44326f3c-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:1b:89'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 133], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494940, 'reachable_time': 30280, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228340, 'error': None, 'target': 'ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:05.745 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7d24dc42-5473-40ce-96e9-26c9bfe94ff9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe37:1b89'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 494940, 'tstamp': 494940}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228341, 'error': None, 'target': 'ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:05.773 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[116a93f3-8d23-40d9-a5f4-dea3aeb7f308]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap44326f3c-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:1b:89'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 133], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494940, 'reachable_time': 30280, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228342, 'error': None, 'target': 'ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.781 183079 DEBUG nova.compute.manager [req-2772f30e-d5aa-4427-97a6-2e5489e4c73d req-1c48d547-29a2-4feb-923c-4628036ff204 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Received event network-vif-plugged-dbad268a-40fe-4d38-aab1-20fbfbcc0775 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.782 183079 DEBUG oslo_concurrency.lockutils [req-2772f30e-d5aa-4427-97a6-2e5489e4c73d req-1c48d547-29a2-4feb-923c-4628036ff204 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "7c4cc341-c93c-4077-a541-31a8487482f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.782 183079 DEBUG oslo_concurrency.lockutils [req-2772f30e-d5aa-4427-97a6-2e5489e4c73d req-1c48d547-29a2-4feb-923c-4628036ff204 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7c4cc341-c93c-4077-a541-31a8487482f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.782 183079 DEBUG oslo_concurrency.lockutils [req-2772f30e-d5aa-4427-97a6-2e5489e4c73d req-1c48d547-29a2-4feb-923c-4628036ff204 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7c4cc341-c93c-4077-a541-31a8487482f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.782 183079 DEBUG nova.compute.manager [req-2772f30e-d5aa-4427-97a6-2e5489e4c73d req-1c48d547-29a2-4feb-923c-4628036ff204 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Processing event network-vif-plugged-dbad268a-40fe-4d38-aab1-20fbfbcc0775 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:05.815 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[53e4d454-12e7-4463-97cb-172ef359c688]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:05.881 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8de12d59-9758-48b4-b9b9-e32c920ada5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:05.883 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44326f3c-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:05.883 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:05.884 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap44326f3c-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:23:05 compute-0 NetworkManager[55454]: <info>  [1769102585.8875] manager: (tap44326f3c-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/203)
Jan 22 17:23:05 compute-0 kernel: tap44326f3c-10: entered promiscuous mode
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.886 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:05.891 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap44326f3c-10, col_values=(('external_ids', {'iface-id': '118957e0-7da0-4d87-b7d4-2c204e19e5b6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:23:05 compute-0 ovn_controller[95372]: 2026-01-22T17:23:05Z|00469|binding|INFO|Releasing lport 118957e0-7da0-4d87-b7d4-2c204e19e5b6 from this chassis (sb_readonly=0)
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.892 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:05 compute-0 nova_compute[183075]: 2026-01-22 17:23:05.953 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:05.955 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/44326f3c-1431-44d6-85ce-61ecbbb5ed7a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/44326f3c-1431-44d6-85ce-61ecbbb5ed7a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:05.956 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b0780cdf-53a7-428b-a351-c071bb0780f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:05.956 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/44326f3c-1431-44d6-85ce-61ecbbb5ed7a.pid.haproxy
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 44326f3c-1431-44d6-85ce-61ecbbb5ed7a
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:23:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:05.958 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'env', 'PROCESS_TAG=haproxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/44326f3c-1431-44d6-85ce-61ecbbb5ed7a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:23:06 compute-0 podman[228374]: 2026-01-22 17:23:06.34653894 +0000 UTC m=+0.069318309 container create 916153c4b06663be31dca44d1d83090ae4fd70bc7c5370aace02e9a7aa5b4a74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:23:06 compute-0 systemd[1]: Started libpod-conmon-916153c4b06663be31dca44d1d83090ae4fd70bc7c5370aace02e9a7aa5b4a74.scope.
Jan 22 17:23:06 compute-0 nova_compute[183075]: 2026-01-22 17:23:06.409 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102586.4084973, 7c4cc341-c93c-4077-a541-31a8487482f0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:23:06 compute-0 nova_compute[183075]: 2026-01-22 17:23:06.410 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] VM Started (Lifecycle Event)
Jan 22 17:23:06 compute-0 podman[228374]: 2026-01-22 17:23:06.31505861 +0000 UTC m=+0.037837999 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:23:06 compute-0 nova_compute[183075]: 2026-01-22 17:23:06.412 183079 DEBUG nova.compute.manager [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:23:06 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:23:06 compute-0 nova_compute[183075]: 2026-01-22 17:23:06.417 183079 DEBUG nova.virt.libvirt.driver [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:23:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cfb125d12df97dbf39808d17fbefcdaef63f0fd816b75d3157f1d18bcc1e9450/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:23:06 compute-0 nova_compute[183075]: 2026-01-22 17:23:06.426 183079 INFO nova.virt.libvirt.driver [-] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Instance spawned successfully.
Jan 22 17:23:06 compute-0 nova_compute[183075]: 2026-01-22 17:23:06.426 183079 DEBUG nova.virt.libvirt.driver [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:23:06 compute-0 nova_compute[183075]: 2026-01-22 17:23:06.436 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:23:06 compute-0 podman[228374]: 2026-01-22 17:23:06.441735141 +0000 UTC m=+0.164514550 container init 916153c4b06663be31dca44d1d83090ae4fd70bc7c5370aace02e9a7aa5b4a74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Jan 22 17:23:06 compute-0 nova_compute[183075]: 2026-01-22 17:23:06.443 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:23:06 compute-0 nova_compute[183075]: 2026-01-22 17:23:06.447 183079 DEBUG nova.virt.libvirt.driver [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:23:06 compute-0 nova_compute[183075]: 2026-01-22 17:23:06.447 183079 DEBUG nova.virt.libvirt.driver [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:23:06 compute-0 podman[228374]: 2026-01-22 17:23:06.448299764 +0000 UTC m=+0.171079153 container start 916153c4b06663be31dca44d1d83090ae4fd70bc7c5370aace02e9a7aa5b4a74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:23:06 compute-0 nova_compute[183075]: 2026-01-22 17:23:06.448 183079 DEBUG nova.virt.libvirt.driver [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:23:06 compute-0 nova_compute[183075]: 2026-01-22 17:23:06.448 183079 DEBUG nova.virt.libvirt.driver [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:23:06 compute-0 nova_compute[183075]: 2026-01-22 17:23:06.449 183079 DEBUG nova.virt.libvirt.driver [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:23:06 compute-0 nova_compute[183075]: 2026-01-22 17:23:06.449 183079 DEBUG nova.virt.libvirt.driver [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:23:06 compute-0 nova_compute[183075]: 2026-01-22 17:23:06.475 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:23:06 compute-0 nova_compute[183075]: 2026-01-22 17:23:06.475 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102586.4098535, 7c4cc341-c93c-4077-a541-31a8487482f0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:23:06 compute-0 nova_compute[183075]: 2026-01-22 17:23:06.476 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] VM Paused (Lifecycle Event)
Jan 22 17:23:06 compute-0 neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[228396]: [NOTICE]   (228400) : New worker (228402) forked
Jan 22 17:23:06 compute-0 neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[228396]: [NOTICE]   (228400) : Loading success.
Jan 22 17:23:06 compute-0 nova_compute[183075]: 2026-01-22 17:23:06.496 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:23:06 compute-0 nova_compute[183075]: 2026-01-22 17:23:06.499 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102586.416377, 7c4cc341-c93c-4077-a541-31a8487482f0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:23:06 compute-0 nova_compute[183075]: 2026-01-22 17:23:06.500 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] VM Resumed (Lifecycle Event)
Jan 22 17:23:06 compute-0 nova_compute[183075]: 2026-01-22 17:23:06.508 183079 INFO nova.compute.manager [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Took 6.05 seconds to spawn the instance on the hypervisor.
Jan 22 17:23:06 compute-0 nova_compute[183075]: 2026-01-22 17:23:06.508 183079 DEBUG nova.compute.manager [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:23:06 compute-0 nova_compute[183075]: 2026-01-22 17:23:06.515 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:23:06 compute-0 nova_compute[183075]: 2026-01-22 17:23:06.519 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:23:06 compute-0 nova_compute[183075]: 2026-01-22 17:23:06.540 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:23:06 compute-0 nova_compute[183075]: 2026-01-22 17:23:06.562 183079 INFO nova.compute.manager [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Took 6.52 seconds to build instance.
Jan 22 17:23:06 compute-0 nova_compute[183075]: 2026-01-22 17:23:06.578 183079 DEBUG oslo_concurrency.lockutils [None req-9e5dfb19-d26b-4ea0-87dd-fd6fbbb09bfe 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "7c4cc341-c93c-4077-a541-31a8487482f0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.627s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:06 compute-0 nova_compute[183075]: 2026-01-22 17:23:06.707 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:06 compute-0 nova_compute[183075]: 2026-01-22 17:23:06.774 183079 DEBUG nova.network.neutron [req-50172a5a-9be4-464a-8f36-b2fc5e29ef57 req-74202e77-59a0-4bf5-9f3b-9ebae062d069 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Updated VIF entry in instance network info cache for port dbad268a-40fe-4d38-aab1-20fbfbcc0775. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:23:06 compute-0 nova_compute[183075]: 2026-01-22 17:23:06.775 183079 DEBUG nova.network.neutron [req-50172a5a-9be4-464a-8f36-b2fc5e29ef57 req-74202e77-59a0-4bf5-9f3b-9ebae062d069 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Updating instance_info_cache with network_info: [{"id": "dbad268a-40fe-4d38-aab1-20fbfbcc0775", "address": "fa:16:3e:d2:a8:c8", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdbad268a-40", "ovs_interfaceid": "dbad268a-40fe-4d38-aab1-20fbfbcc0775", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:23:06 compute-0 nova_compute[183075]: 2026-01-22 17:23:06.789 183079 DEBUG oslo_concurrency.lockutils [req-50172a5a-9be4-464a-8f36-b2fc5e29ef57 req-74202e77-59a0-4bf5-9f3b-9ebae062d069 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-7c4cc341-c93c-4077-a541-31a8487482f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:23:07 compute-0 nova_compute[183075]: 2026-01-22 17:23:07.645 183079 INFO nova.compute.manager [None req-c72e7d51-4428-41b4-8deb-195f3bfefc14 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Get console output
Jan 22 17:23:07 compute-0 nova_compute[183075]: 2026-01-22 17:23:07.652 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:23:08 compute-0 nova_compute[183075]: 2026-01-22 17:23:08.607 183079 DEBUG nova.compute.manager [req-195eb3e4-7dbe-4316-a747-1281be6e1e14 req-ad78f13f-eab1-4c13-9a02-27c812ea24e2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Received event network-vif-plugged-dbad268a-40fe-4d38-aab1-20fbfbcc0775 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:23:08 compute-0 nova_compute[183075]: 2026-01-22 17:23:08.608 183079 DEBUG oslo_concurrency.lockutils [req-195eb3e4-7dbe-4316-a747-1281be6e1e14 req-ad78f13f-eab1-4c13-9a02-27c812ea24e2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "7c4cc341-c93c-4077-a541-31a8487482f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:08 compute-0 nova_compute[183075]: 2026-01-22 17:23:08.608 183079 DEBUG oslo_concurrency.lockutils [req-195eb3e4-7dbe-4316-a747-1281be6e1e14 req-ad78f13f-eab1-4c13-9a02-27c812ea24e2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7c4cc341-c93c-4077-a541-31a8487482f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:08 compute-0 nova_compute[183075]: 2026-01-22 17:23:08.608 183079 DEBUG oslo_concurrency.lockutils [req-195eb3e4-7dbe-4316-a747-1281be6e1e14 req-ad78f13f-eab1-4c13-9a02-27c812ea24e2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7c4cc341-c93c-4077-a541-31a8487482f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:08 compute-0 nova_compute[183075]: 2026-01-22 17:23:08.608 183079 DEBUG nova.compute.manager [req-195eb3e4-7dbe-4316-a747-1281be6e1e14 req-ad78f13f-eab1-4c13-9a02-27c812ea24e2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] No waiting events found dispatching network-vif-plugged-dbad268a-40fe-4d38-aab1-20fbfbcc0775 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:23:08 compute-0 nova_compute[183075]: 2026-01-22 17:23:08.609 183079 WARNING nova.compute.manager [req-195eb3e4-7dbe-4316-a747-1281be6e1e14 req-ad78f13f-eab1-4c13-9a02-27c812ea24e2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Received unexpected event network-vif-plugged-dbad268a-40fe-4d38-aab1-20fbfbcc0775 for instance with vm_state active and task_state None.
Jan 22 17:23:08 compute-0 nova_compute[183075]: 2026-01-22 17:23:08.611 183079 INFO nova.compute.manager [None req-135450ff-82aa-41ac-b88b-547120995e50 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Get console output
Jan 22 17:23:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:10.381 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:23:10 compute-0 nova_compute[183075]: 2026-01-22 17:23:10.381 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:10.383 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:23:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:10.385 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:23:10 compute-0 nova_compute[183075]: 2026-01-22 17:23:10.394 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:11 compute-0 nova_compute[183075]: 2026-01-22 17:23:11.710 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:11 compute-0 nova_compute[183075]: 2026-01-22 17:23:11.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:23:13 compute-0 nova_compute[183075]: 2026-01-22 17:23:13.744 183079 DEBUG oslo_concurrency.lockutils [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "89784883-b435-428a-8936-a513f9e65fe0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:13 compute-0 nova_compute[183075]: 2026-01-22 17:23:13.745 183079 DEBUG oslo_concurrency.lockutils [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "89784883-b435-428a-8936-a513f9e65fe0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:13 compute-0 nova_compute[183075]: 2026-01-22 17:23:13.764 183079 DEBUG nova.compute.manager [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:23:13 compute-0 nova_compute[183075]: 2026-01-22 17:23:13.775 183079 INFO nova.compute.manager [None req-3782e023-806b-41aa-b0cf-033973ce0795 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Get console output
Jan 22 17:23:13 compute-0 nova_compute[183075]: 2026-01-22 17:23:13.846 183079 DEBUG oslo_concurrency.lockutils [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:13 compute-0 nova_compute[183075]: 2026-01-22 17:23:13.846 183079 DEBUG oslo_concurrency.lockutils [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:13 compute-0 nova_compute[183075]: 2026-01-22 17:23:13.858 183079 DEBUG nova.virt.hardware [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:23:13 compute-0 nova_compute[183075]: 2026-01-22 17:23:13.858 183079 INFO nova.compute.claims [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.026 183079 DEBUG nova.compute.provider_tree [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.040 183079 DEBUG nova.scheduler.client.report [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.060 183079 DEBUG oslo_concurrency.lockutils [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.060 183079 DEBUG nova.compute.manager [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.111 183079 DEBUG nova.compute.manager [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.112 183079 DEBUG nova.network.neutron [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.132 183079 INFO nova.virt.libvirt.driver [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.151 183079 DEBUG nova.compute.manager [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.236 183079 DEBUG nova.compute.manager [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.238 183079 DEBUG nova.virt.libvirt.driver [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.238 183079 INFO nova.virt.libvirt.driver [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Creating image(s)
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.239 183079 DEBUG oslo_concurrency.lockutils [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "/var/lib/nova/instances/89784883-b435-428a-8936-a513f9e65fe0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.240 183079 DEBUG oslo_concurrency.lockutils [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "/var/lib/nova/instances/89784883-b435-428a-8936-a513f9e65fe0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.241 183079 DEBUG oslo_concurrency.lockutils [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "/var/lib/nova/instances/89784883-b435-428a-8936-a513f9e65fe0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.266 183079 DEBUG oslo_concurrency.processutils [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.362 183079 DEBUG oslo_concurrency.processutils [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.364 183079 DEBUG oslo_concurrency.lockutils [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.365 183079 DEBUG oslo_concurrency.lockutils [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.389 183079 DEBUG oslo_concurrency.processutils [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.458 183079 DEBUG nova.policy [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.485 183079 DEBUG oslo_concurrency.processutils [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.487 183079 DEBUG oslo_concurrency.processutils [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/89784883-b435-428a-8936-a513f9e65fe0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.725 183079 DEBUG oslo_concurrency.processutils [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/89784883-b435-428a-8936-a513f9e65fe0/disk 1073741824" returned: 0 in 0.237s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.726 183079 DEBUG oslo_concurrency.lockutils [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.362s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.727 183079 DEBUG oslo_concurrency.processutils [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.793 183079 DEBUG oslo_concurrency.processutils [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.796 183079 DEBUG nova.virt.disk.api [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Checking if we can resize image /var/lib/nova/instances/89784883-b435-428a-8936-a513f9e65fe0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.796 183079 DEBUG oslo_concurrency.processutils [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89784883-b435-428a-8936-a513f9e65fe0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.885 183079 DEBUG oslo_concurrency.processutils [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89784883-b435-428a-8936-a513f9e65fe0/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.887 183079 DEBUG nova.virt.disk.api [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Cannot resize image /var/lib/nova/instances/89784883-b435-428a-8936-a513f9e65fe0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.888 183079 DEBUG nova.objects.instance [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lazy-loading 'migration_context' on Instance uuid 89784883-b435-428a-8936-a513f9e65fe0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.909 183079 DEBUG nova.virt.libvirt.driver [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.910 183079 DEBUG nova.virt.libvirt.driver [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Ensure instance console log exists: /var/lib/nova/instances/89784883-b435-428a-8936-a513f9e65fe0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.911 183079 DEBUG oslo_concurrency.lockutils [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.911 183079 DEBUG oslo_concurrency.lockutils [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:14 compute-0 nova_compute[183075]: 2026-01-22 17:23:14.912 183079 DEBUG oslo_concurrency.lockutils [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:15 compute-0 nova_compute[183075]: 2026-01-22 17:23:15.128 183079 DEBUG nova.network.neutron [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Successfully updated port: b49f37fd-778f-41bc-b520-547fbfd8002e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:23:15 compute-0 nova_compute[183075]: 2026-01-22 17:23:15.149 183079 DEBUG oslo_concurrency.lockutils [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "refresh_cache-89784883-b435-428a-8936-a513f9e65fe0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:23:15 compute-0 nova_compute[183075]: 2026-01-22 17:23:15.150 183079 DEBUG oslo_concurrency.lockutils [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquired lock "refresh_cache-89784883-b435-428a-8936-a513f9e65fe0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:23:15 compute-0 nova_compute[183075]: 2026-01-22 17:23:15.150 183079 DEBUG nova.network.neutron [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:23:15 compute-0 nova_compute[183075]: 2026-01-22 17:23:15.236 183079 DEBUG nova.compute.manager [req-e95e4def-b288-4555-a86b-d89ae50b6c98 req-fe779328-10b8-4aa2-a6be-ae1fab93fa3a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Received event network-changed-b49f37fd-778f-41bc-b520-547fbfd8002e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:23:15 compute-0 nova_compute[183075]: 2026-01-22 17:23:15.237 183079 DEBUG nova.compute.manager [req-e95e4def-b288-4555-a86b-d89ae50b6c98 req-fe779328-10b8-4aa2-a6be-ae1fab93fa3a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Refreshing instance network info cache due to event network-changed-b49f37fd-778f-41bc-b520-547fbfd8002e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:23:15 compute-0 nova_compute[183075]: 2026-01-22 17:23:15.238 183079 DEBUG oslo_concurrency.lockutils [req-e95e4def-b288-4555-a86b-d89ae50b6c98 req-fe779328-10b8-4aa2-a6be-ae1fab93fa3a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-89784883-b435-428a-8936-a513f9e65fe0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:23:15 compute-0 nova_compute[183075]: 2026-01-22 17:23:15.318 183079 DEBUG nova.network.neutron [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:23:15 compute-0 nova_compute[183075]: 2026-01-22 17:23:15.397 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:15 compute-0 nova_compute[183075]: 2026-01-22 17:23:15.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.085 183079 DEBUG nova.network.neutron [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Updating instance_info_cache with network_info: [{"id": "b49f37fd-778f-41bc-b520-547fbfd8002e", "address": "fa:16:3e:95:2f:57", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb49f37fd-77", "ovs_interfaceid": "b49f37fd-778f-41bc-b520-547fbfd8002e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.104 183079 DEBUG oslo_concurrency.lockutils [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Releasing lock "refresh_cache-89784883-b435-428a-8936-a513f9e65fe0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.105 183079 DEBUG nova.compute.manager [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Instance network_info: |[{"id": "b49f37fd-778f-41bc-b520-547fbfd8002e", "address": "fa:16:3e:95:2f:57", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb49f37fd-77", "ovs_interfaceid": "b49f37fd-778f-41bc-b520-547fbfd8002e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.105 183079 DEBUG oslo_concurrency.lockutils [req-e95e4def-b288-4555-a86b-d89ae50b6c98 req-fe779328-10b8-4aa2-a6be-ae1fab93fa3a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-89784883-b435-428a-8936-a513f9e65fe0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.106 183079 DEBUG nova.network.neutron [req-e95e4def-b288-4555-a86b-d89ae50b6c98 req-fe779328-10b8-4aa2-a6be-ae1fab93fa3a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Refreshing network info cache for port b49f37fd-778f-41bc-b520-547fbfd8002e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.108 183079 DEBUG nova.virt.libvirt.driver [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Start _get_guest_xml network_info=[{"id": "b49f37fd-778f-41bc-b520-547fbfd8002e", "address": "fa:16:3e:95:2f:57", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb49f37fd-77", "ovs_interfaceid": "b49f37fd-778f-41bc-b520-547fbfd8002e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.112 183079 WARNING nova.virt.libvirt.driver [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.119 183079 DEBUG nova.virt.libvirt.host [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.119 183079 DEBUG nova.virt.libvirt.host [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.124 183079 DEBUG nova.virt.libvirt.host [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.125 183079 DEBUG nova.virt.libvirt.host [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.126 183079 DEBUG nova.virt.libvirt.driver [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.126 183079 DEBUG nova.virt.hardware [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.126 183079 DEBUG nova.virt.hardware [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.127 183079 DEBUG nova.virt.hardware [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.127 183079 DEBUG nova.virt.hardware [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.127 183079 DEBUG nova.virt.hardware [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.128 183079 DEBUG nova.virt.hardware [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.128 183079 DEBUG nova.virt.hardware [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.128 183079 DEBUG nova.virt.hardware [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.128 183079 DEBUG nova.virt.hardware [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.129 183079 DEBUG nova.virt.hardware [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.129 183079 DEBUG nova.virt.hardware [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.132 183079 DEBUG nova.virt.libvirt.vif [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:23:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1551955303',display_name='tempest-server-test-1551955303',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1551955303',id=41,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBZ1addGiefngOIZky2bZbl/LpQHm8ydIPPcYG47RZ8D1pquJ/FsAcVn7mNe0QyfiQPXvuSs1eSw/jGOy+x3K8tzxt5N8fIelXKcTizhDHr81ws2uYn+dW6F9Hiy6pJp3A==',key_name='tempest-keypair-test-453437320',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='02818155e7af4645bc909d4ba671f11f',ramdisk_id='',reservation_id='r-zmub7s9r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='
virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSameNetwork-953620552',owner_user_name='tempest-FloatingIpSameNetwork-953620552-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:23:14Z,user_data=None,user_id='1148a46489e842e6a0c7660c54567798',uuid=89784883-b435-428a-8936-a513f9e65fe0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b49f37fd-778f-41bc-b520-547fbfd8002e", "address": "fa:16:3e:95:2f:57", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb49f37fd-77", "ovs_interfaceid": "b49f37fd-778f-41bc-b520-547fbfd8002e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.133 183079 DEBUG nova.network.os_vif_util [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converting VIF {"id": "b49f37fd-778f-41bc-b520-547fbfd8002e", "address": "fa:16:3e:95:2f:57", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb49f37fd-77", "ovs_interfaceid": "b49f37fd-778f-41bc-b520-547fbfd8002e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.133 183079 DEBUG nova.network.os_vif_util [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:2f:57,bridge_name='br-int',has_traffic_filtering=True,id=b49f37fd-778f-41bc-b520-547fbfd8002e,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb49f37fd-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.134 183079 DEBUG nova.objects.instance [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lazy-loading 'pci_devices' on Instance uuid 89784883-b435-428a-8936-a513f9e65fe0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.152 183079 DEBUG nova.virt.libvirt.driver [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:23:16 compute-0 nova_compute[183075]:   <uuid>89784883-b435-428a-8936-a513f9e65fe0</uuid>
Jan 22 17:23:16 compute-0 nova_compute[183075]:   <name>instance-00000029</name>
Jan 22 17:23:16 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:23:16 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:23:16 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:23:16 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1551955303</nova:name>
Jan 22 17:23:16 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:23:16</nova:creationTime>
Jan 22 17:23:16 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:23:16 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:23:16 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:23:16 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:23:16 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:23:16 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:23:16 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:23:16 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:23:16 compute-0 nova_compute[183075]:         <nova:user uuid="1148a46489e842e6a0c7660c54567798">tempest-FloatingIpSameNetwork-953620552-project-member</nova:user>
Jan 22 17:23:16 compute-0 nova_compute[183075]:         <nova:project uuid="02818155e7af4645bc909d4ba671f11f">tempest-FloatingIpSameNetwork-953620552</nova:project>
Jan 22 17:23:16 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:23:16 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:23:16 compute-0 nova_compute[183075]:         <nova:port uuid="b49f37fd-778f-41bc-b520-547fbfd8002e">
Jan 22 17:23:16 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:23:16 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:23:16 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:23:16 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <system>
Jan 22 17:23:16 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:23:16 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:23:16 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:23:16 compute-0 nova_compute[183075]:       <entry name="serial">89784883-b435-428a-8936-a513f9e65fe0</entry>
Jan 22 17:23:16 compute-0 nova_compute[183075]:       <entry name="uuid">89784883-b435-428a-8936-a513f9e65fe0</entry>
Jan 22 17:23:16 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     </system>
Jan 22 17:23:16 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:23:16 compute-0 nova_compute[183075]:   <os>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:   </os>
Jan 22 17:23:16 compute-0 nova_compute[183075]:   <features>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:   </features>
Jan 22 17:23:16 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:23:16 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:23:16 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:23:16 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/89784883-b435-428a-8936-a513f9e65fe0/disk"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:23:16 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:95:2f:57"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:       <target dev="tapb49f37fd-77"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:23:16 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/89784883-b435-428a-8936-a513f9e65fe0/console.log" append="off"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <video>
Jan 22 17:23:16 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     </video>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:23:16 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:23:16 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:23:16 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:23:16 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:23:16 compute-0 nova_compute[183075]: </domain>
Jan 22 17:23:16 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.157 183079 DEBUG nova.compute.manager [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Preparing to wait for external event network-vif-plugged-b49f37fd-778f-41bc-b520-547fbfd8002e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.158 183079 DEBUG oslo_concurrency.lockutils [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "89784883-b435-428a-8936-a513f9e65fe0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.158 183079 DEBUG oslo_concurrency.lockutils [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "89784883-b435-428a-8936-a513f9e65fe0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.158 183079 DEBUG oslo_concurrency.lockutils [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "89784883-b435-428a-8936-a513f9e65fe0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.159 183079 DEBUG nova.virt.libvirt.vif [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:23:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1551955303',display_name='tempest-server-test-1551955303',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1551955303',id=41,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBZ1addGiefngOIZky2bZbl/LpQHm8ydIPPcYG47RZ8D1pquJ/FsAcVn7mNe0QyfiQPXvuSs1eSw/jGOy+x3K8tzxt5N8fIelXKcTizhDHr81ws2uYn+dW6F9Hiy6pJp3A==',key_name='tempest-keypair-test-453437320',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='02818155e7af4645bc909d4ba671f11f',ramdisk_id='',reservation_id='r-zmub7s9r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_r
ng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSameNetwork-953620552',owner_user_name='tempest-FloatingIpSameNetwork-953620552-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:23:14Z,user_data=None,user_id='1148a46489e842e6a0c7660c54567798',uuid=89784883-b435-428a-8936-a513f9e65fe0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b49f37fd-778f-41bc-b520-547fbfd8002e", "address": "fa:16:3e:95:2f:57", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb49f37fd-77", "ovs_interfaceid": "b49f37fd-778f-41bc-b520-547fbfd8002e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.160 183079 DEBUG nova.network.os_vif_util [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converting VIF {"id": "b49f37fd-778f-41bc-b520-547fbfd8002e", "address": "fa:16:3e:95:2f:57", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb49f37fd-77", "ovs_interfaceid": "b49f37fd-778f-41bc-b520-547fbfd8002e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.160 183079 DEBUG nova.network.os_vif_util [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:2f:57,bridge_name='br-int',has_traffic_filtering=True,id=b49f37fd-778f-41bc-b520-547fbfd8002e,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb49f37fd-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.161 183079 DEBUG os_vif [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:2f:57,bridge_name='br-int',has_traffic_filtering=True,id=b49f37fd-778f-41bc-b520-547fbfd8002e,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb49f37fd-77') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.161 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.162 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.162 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.166 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.166 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb49f37fd-77, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.167 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb49f37fd-77, col_values=(('external_ids', {'iface-id': 'b49f37fd-778f-41bc-b520-547fbfd8002e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:95:2f:57', 'vm-uuid': '89784883-b435-428a-8936-a513f9e65fe0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.208 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:16 compute-0 NetworkManager[55454]: <info>  [1769102596.2092] manager: (tapb49f37fd-77): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/204)
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.211 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.220 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.221 183079 INFO os_vif [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:2f:57,bridge_name='br-int',has_traffic_filtering=True,id=b49f37fd-778f-41bc-b520-547fbfd8002e,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb49f37fd-77')
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.293 183079 DEBUG nova.virt.libvirt.driver [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.293 183079 DEBUG nova.virt.libvirt.driver [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] No VIF found with MAC fa:16:3e:95:2f:57, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:23:16 compute-0 podman[228431]: 2026-01-22 17:23:16.345737028 +0000 UTC m=+0.067306826 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 17:23:16 compute-0 podman[228432]: 2026-01-22 17:23:16.367605705 +0000 UTC m=+0.093608120 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, version=9.6, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-type=git, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vendor=Red Hat, Inc.)
Jan 22 17:23:16 compute-0 kernel: tapb49f37fd-77: entered promiscuous mode
Jan 22 17:23:16 compute-0 ovn_controller[95372]: 2026-01-22T17:23:16Z|00470|binding|INFO|Claiming lport b49f37fd-778f-41bc-b520-547fbfd8002e for this chassis.
Jan 22 17:23:16 compute-0 ovn_controller[95372]: 2026-01-22T17:23:16Z|00471|binding|INFO|b49f37fd-778f-41bc-b520-547fbfd8002e: Claiming fa:16:3e:95:2f:57 10.100.0.3
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.394 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:16 compute-0 NetworkManager[55454]: <info>  [1769102596.3968] manager: (tapb49f37fd-77): new Tun device (/org/freedesktop/NetworkManager/Devices/205)
Jan 22 17:23:16 compute-0 podman[228430]: 2026-01-22 17:23:16.402961938 +0000 UTC m=+0.121646080 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 17:23:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:16.403 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:2f:57 10.100.0.3'], port_security=['fa:16:3e:95:2f:57 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '89784883-b435-428a-8936-a513f9e65fe0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eee918a6-66b2-47ae-b702-620a23ef395b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02818155e7af4645bc909d4ba671f11f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c960fc2f-fbcd-48ff-b6bd-d3b900f07332', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bad11b1d-0e62-4467-b3f8-4f80f0fd75da, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=b49f37fd-778f-41bc-b520-547fbfd8002e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:23:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:16.405 104629 INFO neutron.agent.ovn.metadata.agent [-] Port b49f37fd-778f-41bc-b520-547fbfd8002e in datapath eee918a6-66b2-47ae-b702-620a23ef395b bound to our chassis
Jan 22 17:23:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:16.407 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network eee918a6-66b2-47ae-b702-620a23ef395b
Jan 22 17:23:16 compute-0 ovn_controller[95372]: 2026-01-22T17:23:16Z|00472|binding|INFO|Setting lport b49f37fd-778f-41bc-b520-547fbfd8002e ovn-installed in OVS
Jan 22 17:23:16 compute-0 ovn_controller[95372]: 2026-01-22T17:23:16Z|00473|binding|INFO|Setting lport b49f37fd-778f-41bc-b520-547fbfd8002e up in Southbound
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.416 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.422 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:16.432 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4fc94850-4e12-475a-9216-fa18ff75df9d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:23:16 compute-0 systemd-udevd[228508]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:23:16 compute-0 systemd-machined[154382]: New machine qemu-41-instance-00000029.
Jan 22 17:23:16 compute-0 systemd[1]: Started Virtual Machine qemu-41-instance-00000029.
Jan 22 17:23:16 compute-0 NetworkManager[55454]: <info>  [1769102596.4665] device (tapb49f37fd-77): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:23:16 compute-0 NetworkManager[55454]: <info>  [1769102596.4677] device (tapb49f37fd-77): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:23:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:16.476 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[cd35ec8e-0a00-4f4a-a97c-4eafc1cc61c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:23:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:16.479 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[a59efa5e-d5d3-4c14-8f6a-b531c59b1fb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:23:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:16.521 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[05f34e5b-a0c8-4ccf-be43-9071fabd336e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:23:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:16.544 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d993926c-3e91-4eeb-85f5-a90e4fe91910]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeee918a6-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:e2:7e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 54, 'rx_bytes': 8920, 'tx_bytes': 6147, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 54, 'rx_bytes': 8920, 'tx_bytes': 6147, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 130], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492684, 'reachable_time': 42858, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228520, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:23:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:16.565 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0f9b973a-b5a9-401e-9084-e78a6e2b5c6c]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapeee918a6-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 492699, 'tstamp': 492699}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228522, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapeee918a6-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 492702, 'tstamp': 492702}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228522, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:23:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:16.568 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeee918a6-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.570 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.572 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:16.572 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeee918a6-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:23:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:16.573 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:23:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:16.573 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapeee918a6-60, col_values=(('external_ids', {'iface-id': '15d4de90-41f4-4532-aebd-197c2a33c6d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:23:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:16.573 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.713 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.779 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102596.7791018, 89784883-b435-428a-8936-a513f9e65fe0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.780 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 89784883-b435-428a-8936-a513f9e65fe0] VM Started (Lifecycle Event)
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.819 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.824 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102596.781592, 89784883-b435-428a-8936-a513f9e65fe0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.824 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 89784883-b435-428a-8936-a513f9e65fe0] VM Paused (Lifecycle Event)
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.850 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.853 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:23:16 compute-0 nova_compute[183075]: 2026-01-22 17:23:16.870 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 89784883-b435-428a-8936-a513f9e65fe0] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:23:18 compute-0 nova_compute[183075]: 2026-01-22 17:23:18.508 183079 DEBUG nova.compute.manager [req-f4226a63-7766-484c-8ccd-698f97117f76 req-2ce29c29-70ec-4759-ab7f-77c93928e33f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Received event network-vif-plugged-b49f37fd-778f-41bc-b520-547fbfd8002e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:23:18 compute-0 nova_compute[183075]: 2026-01-22 17:23:18.510 183079 DEBUG oslo_concurrency.lockutils [req-f4226a63-7766-484c-8ccd-698f97117f76 req-2ce29c29-70ec-4759-ab7f-77c93928e33f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "89784883-b435-428a-8936-a513f9e65fe0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:18 compute-0 nova_compute[183075]: 2026-01-22 17:23:18.510 183079 DEBUG oslo_concurrency.lockutils [req-f4226a63-7766-484c-8ccd-698f97117f76 req-2ce29c29-70ec-4759-ab7f-77c93928e33f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "89784883-b435-428a-8936-a513f9e65fe0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:18 compute-0 nova_compute[183075]: 2026-01-22 17:23:18.511 183079 DEBUG oslo_concurrency.lockutils [req-f4226a63-7766-484c-8ccd-698f97117f76 req-2ce29c29-70ec-4759-ab7f-77c93928e33f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "89784883-b435-428a-8936-a513f9e65fe0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:18 compute-0 nova_compute[183075]: 2026-01-22 17:23:18.511 183079 DEBUG nova.compute.manager [req-f4226a63-7766-484c-8ccd-698f97117f76 req-2ce29c29-70ec-4759-ab7f-77c93928e33f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Processing event network-vif-plugged-b49f37fd-778f-41bc-b520-547fbfd8002e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:23:18 compute-0 nova_compute[183075]: 2026-01-22 17:23:18.512 183079 DEBUG nova.compute.manager [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:23:18 compute-0 nova_compute[183075]: 2026-01-22 17:23:18.519 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102598.518457, 89784883-b435-428a-8936-a513f9e65fe0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:23:18 compute-0 nova_compute[183075]: 2026-01-22 17:23:18.520 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 89784883-b435-428a-8936-a513f9e65fe0] VM Resumed (Lifecycle Event)
Jan 22 17:23:18 compute-0 nova_compute[183075]: 2026-01-22 17:23:18.523 183079 DEBUG nova.virt.libvirt.driver [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:23:18 compute-0 nova_compute[183075]: 2026-01-22 17:23:18.528 183079 INFO nova.virt.libvirt.driver [-] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Instance spawned successfully.
Jan 22 17:23:18 compute-0 nova_compute[183075]: 2026-01-22 17:23:18.529 183079 DEBUG nova.virt.libvirt.driver [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:23:18 compute-0 nova_compute[183075]: 2026-01-22 17:23:18.547 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:23:18 compute-0 nova_compute[183075]: 2026-01-22 17:23:18.554 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:23:18 compute-0 nova_compute[183075]: 2026-01-22 17:23:18.558 183079 DEBUG nova.virt.libvirt.driver [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:23:18 compute-0 nova_compute[183075]: 2026-01-22 17:23:18.559 183079 DEBUG nova.virt.libvirt.driver [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:23:18 compute-0 nova_compute[183075]: 2026-01-22 17:23:18.560 183079 DEBUG nova.virt.libvirt.driver [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:23:18 compute-0 nova_compute[183075]: 2026-01-22 17:23:18.560 183079 DEBUG nova.virt.libvirt.driver [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:23:18 compute-0 nova_compute[183075]: 2026-01-22 17:23:18.561 183079 DEBUG nova.virt.libvirt.driver [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:23:18 compute-0 nova_compute[183075]: 2026-01-22 17:23:18.561 183079 DEBUG nova.virt.libvirt.driver [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:23:18 compute-0 nova_compute[183075]: 2026-01-22 17:23:18.607 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 89784883-b435-428a-8936-a513f9e65fe0] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:23:18 compute-0 nova_compute[183075]: 2026-01-22 17:23:18.651 183079 INFO nova.compute.manager [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Took 4.41 seconds to spawn the instance on the hypervisor.
Jan 22 17:23:18 compute-0 nova_compute[183075]: 2026-01-22 17:23:18.651 183079 DEBUG nova.compute.manager [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:23:18 compute-0 nova_compute[183075]: 2026-01-22 17:23:18.722 183079 INFO nova.compute.manager [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Took 4.91 seconds to build instance.
Jan 22 17:23:18 compute-0 nova_compute[183075]: 2026-01-22 17:23:18.743 183079 DEBUG oslo_concurrency.lockutils [None req-ebb4aba1-018f-4226-ad81-24256c9a1a78 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "89784883-b435-428a-8936-a513f9e65fe0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.998s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:18 compute-0 nova_compute[183075]: 2026-01-22 17:23:18.944 183079 INFO nova.compute.manager [None req-ed01fa31-14df-4258-b6ed-d3b565c37022 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Get console output
Jan 22 17:23:18 compute-0 nova_compute[183075]: 2026-01-22 17:23:18.951 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:23:19 compute-0 ovn_controller[95372]: 2026-01-22T17:23:19Z|00082|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d2:a8:c8 10.100.0.11
Jan 22 17:23:19 compute-0 ovn_controller[95372]: 2026-01-22T17:23:19Z|00083|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d2:a8:c8 10.100.0.11
Jan 22 17:23:20 compute-0 nova_compute[183075]: 2026-01-22 17:23:20.587 183079 DEBUG nova.compute.manager [req-24af147d-23bc-4f32-8253-ed60ad18c537 req-577bbe0b-e6da-4373-a2d7-4a234e374fb5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Received event network-vif-plugged-b49f37fd-778f-41bc-b520-547fbfd8002e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:23:20 compute-0 nova_compute[183075]: 2026-01-22 17:23:20.587 183079 DEBUG oslo_concurrency.lockutils [req-24af147d-23bc-4f32-8253-ed60ad18c537 req-577bbe0b-e6da-4373-a2d7-4a234e374fb5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "89784883-b435-428a-8936-a513f9e65fe0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:20 compute-0 nova_compute[183075]: 2026-01-22 17:23:20.588 183079 DEBUG oslo_concurrency.lockutils [req-24af147d-23bc-4f32-8253-ed60ad18c537 req-577bbe0b-e6da-4373-a2d7-4a234e374fb5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "89784883-b435-428a-8936-a513f9e65fe0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:20 compute-0 nova_compute[183075]: 2026-01-22 17:23:20.588 183079 DEBUG oslo_concurrency.lockutils [req-24af147d-23bc-4f32-8253-ed60ad18c537 req-577bbe0b-e6da-4373-a2d7-4a234e374fb5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "89784883-b435-428a-8936-a513f9e65fe0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:20 compute-0 nova_compute[183075]: 2026-01-22 17:23:20.588 183079 DEBUG nova.compute.manager [req-24af147d-23bc-4f32-8253-ed60ad18c537 req-577bbe0b-e6da-4373-a2d7-4a234e374fb5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] No waiting events found dispatching network-vif-plugged-b49f37fd-778f-41bc-b520-547fbfd8002e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:23:20 compute-0 nova_compute[183075]: 2026-01-22 17:23:20.588 183079 WARNING nova.compute.manager [req-24af147d-23bc-4f32-8253-ed60ad18c537 req-577bbe0b-e6da-4373-a2d7-4a234e374fb5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Received unexpected event network-vif-plugged-b49f37fd-778f-41bc-b520-547fbfd8002e for instance with vm_state active and task_state None.
Jan 22 17:23:20 compute-0 nova_compute[183075]: 2026-01-22 17:23:20.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:23:20 compute-0 nova_compute[183075]: 2026-01-22 17:23:20.787 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:23:21 compute-0 nova_compute[183075]: 2026-01-22 17:23:21.211 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:21 compute-0 nova_compute[183075]: 2026-01-22 17:23:21.371 183079 DEBUG nova.network.neutron [req-e95e4def-b288-4555-a86b-d89ae50b6c98 req-fe779328-10b8-4aa2-a6be-ae1fab93fa3a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Updated VIF entry in instance network info cache for port b49f37fd-778f-41bc-b520-547fbfd8002e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:23:21 compute-0 nova_compute[183075]: 2026-01-22 17:23:21.372 183079 DEBUG nova.network.neutron [req-e95e4def-b288-4555-a86b-d89ae50b6c98 req-fe779328-10b8-4aa2-a6be-ae1fab93fa3a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Updating instance_info_cache with network_info: [{"id": "b49f37fd-778f-41bc-b520-547fbfd8002e", "address": "fa:16:3e:95:2f:57", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb49f37fd-77", "ovs_interfaceid": "b49f37fd-778f-41bc-b520-547fbfd8002e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:23:21 compute-0 nova_compute[183075]: 2026-01-22 17:23:21.394 183079 DEBUG oslo_concurrency.lockutils [req-e95e4def-b288-4555-a86b-d89ae50b6c98 req-fe779328-10b8-4aa2-a6be-ae1fab93fa3a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-89784883-b435-428a-8936-a513f9e65fe0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:23:21 compute-0 nova_compute[183075]: 2026-01-22 17:23:21.432 183079 INFO nova.compute.manager [None req-10425ed9-a694-4daf-9ee9-9f944c9ef17c 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Get console output
Jan 22 17:23:21 compute-0 nova_compute[183075]: 2026-01-22 17:23:21.437 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:23:21 compute-0 nova_compute[183075]: 2026-01-22 17:23:21.716 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:21 compute-0 nova_compute[183075]: 2026-01-22 17:23:21.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:23:22 compute-0 nova_compute[183075]: 2026-01-22 17:23:22.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:23:23 compute-0 podman[228540]: 2026-01-22 17:23:23.36285465 +0000 UTC m=+0.067765612 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 17:23:23 compute-0 nova_compute[183075]: 2026-01-22 17:23:23.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:23:23 compute-0 nova_compute[183075]: 2026-01-22 17:23:23.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:23:23 compute-0 nova_compute[183075]: 2026-01-22 17:23:23.813 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:23 compute-0 nova_compute[183075]: 2026-01-22 17:23:23.813 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:23 compute-0 nova_compute[183075]: 2026-01-22 17:23:23.814 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:23 compute-0 nova_compute[183075]: 2026-01-22 17:23:23.814 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:23:23 compute-0 nova_compute[183075]: 2026-01-22 17:23:23.903 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89784883-b435-428a-8936-a513f9e65fe0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:23:23 compute-0 nova_compute[183075]: 2026-01-22 17:23:23.963 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89784883-b435-428a-8936-a513f9e65fe0/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:23:23 compute-0 nova_compute[183075]: 2026-01-22 17:23:23.965 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89784883-b435-428a-8936-a513f9e65fe0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:23:24 compute-0 nova_compute[183075]: 2026-01-22 17:23:24.020 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89784883-b435-428a-8936-a513f9e65fe0/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:23:24 compute-0 nova_compute[183075]: 2026-01-22 17:23:24.027 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7440e72-b977-4601-88ad-ce8a4c72e883/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:23:24 compute-0 nova_compute[183075]: 2026-01-22 17:23:24.080 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7440e72-b977-4601-88ad-ce8a4c72e883/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:23:24 compute-0 nova_compute[183075]: 2026-01-22 17:23:24.081 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7440e72-b977-4601-88ad-ce8a4c72e883/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:23:24 compute-0 nova_compute[183075]: 2026-01-22 17:23:24.109 183079 INFO nova.compute.manager [None req-ff453210-593e-4d80-9b14-740976a7b517 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Get console output
Jan 22 17:23:24 compute-0 nova_compute[183075]: 2026-01-22 17:23:24.117 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:23:24 compute-0 nova_compute[183075]: 2026-01-22 17:23:24.155 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7440e72-b977-4601-88ad-ce8a4c72e883/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:23:24 compute-0 nova_compute[183075]: 2026-01-22 17:23:24.162 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7c4cc341-c93c-4077-a541-31a8487482f0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:23:24 compute-0 nova_compute[183075]: 2026-01-22 17:23:24.220 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7c4cc341-c93c-4077-a541-31a8487482f0/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:23:24 compute-0 nova_compute[183075]: 2026-01-22 17:23:24.222 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7c4cc341-c93c-4077-a541-31a8487482f0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:23:24 compute-0 nova_compute[183075]: 2026-01-22 17:23:24.314 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7c4cc341-c93c-4077-a541-31a8487482f0/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:23:24 compute-0 nova_compute[183075]: 2026-01-22 17:23:24.509 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:23:24 compute-0 nova_compute[183075]: 2026-01-22 17:23:24.510 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5268MB free_disk=73.31132125854492GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:23:24 compute-0 nova_compute[183075]: 2026-01-22 17:23:24.511 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:24 compute-0 nova_compute[183075]: 2026-01-22 17:23:24.511 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:24 compute-0 nova_compute[183075]: 2026-01-22 17:23:24.618 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance a7440e72-b977-4601-88ad-ce8a4c72e883 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:23:24 compute-0 nova_compute[183075]: 2026-01-22 17:23:24.618 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 7c4cc341-c93c-4077-a541-31a8487482f0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:23:24 compute-0 nova_compute[183075]: 2026-01-22 17:23:24.619 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 89784883-b435-428a-8936-a513f9e65fe0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:23:24 compute-0 nova_compute[183075]: 2026-01-22 17:23:24.619 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:23:24 compute-0 nova_compute[183075]: 2026-01-22 17:23:24.619 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:23:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:24.671 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:24.672 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:23:24 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:24 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:24 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:24 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:24 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:24 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:23:24 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:24 compute-0 nova_compute[183075]: 2026-01-22 17:23:24.712 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:23:24 compute-0 nova_compute[183075]: 2026-01-22 17:23:24.728 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:23:24 compute-0 nova_compute[183075]: 2026-01-22 17:23:24.767 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:23:24 compute-0 nova_compute[183075]: 2026-01-22 17:23:24.768 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.257s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.520 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.521 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.8486202
Jan 22 17:23:25 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[228402]: 10.100.0.11:34230 [22/Jan/2026:17:23:24.670] listener listener/metadata 0/0/0/850/850 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.528 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.529 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.547 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.548 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0192668
Jan 22 17:23:25 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[228402]: 10.100.0.11:34246 [22/Jan/2026:17:23:25.527] listener listener/metadata 0/0/0/20/20 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.553 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.554 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.573 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.574 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0207603
Jan 22 17:23:25 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[228402]: 10.100.0.11:34248 [22/Jan/2026:17:23:25.552] listener listener/metadata 0/0/0/22/22 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.580 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.581 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.600 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.601 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0197484
Jan 22 17:23:25 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[228402]: 10.100.0.11:34264 [22/Jan/2026:17:23:25.579] listener listener/metadata 0/0/0/21/21 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.606 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.607 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.631 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.632 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0248513
Jan 22 17:23:25 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[228402]: 10.100.0.11:34268 [22/Jan/2026:17:23:25.606] listener listener/metadata 0/0/0/26/26 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.637 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.638 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.657 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.658 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0198894
Jan 22 17:23:25 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[228402]: 10.100.0.11:34276 [22/Jan/2026:17:23:25.637] listener listener/metadata 0/0/0/21/21 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.663 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.663 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.684 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.685 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0217776
Jan 22 17:23:25 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[228402]: 10.100.0.11:34284 [22/Jan/2026:17:23:25.662] listener listener/metadata 0/0/0/22/22 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.690 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.690 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.709 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.710 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0192175
Jan 22 17:23:25 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[228402]: 10.100.0.11:34298 [22/Jan/2026:17:23:25.689] listener listener/metadata 0/0/0/20/20 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.713 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.714 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.727 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.728 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 162 time: 0.0138206
Jan 22 17:23:25 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[228402]: 10.100.0.11:34304 [22/Jan/2026:17:23:25.713] listener listener/metadata 0/0/0/14/14 200 146 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.732 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.732 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.754 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.755 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 162 time: 0.0222273
Jan 22 17:23:25 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[228402]: 10.100.0.11:34310 [22/Jan/2026:17:23:25.731] listener listener/metadata 0/0/0/23/23 200 146 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.759 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.759 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:25 compute-0 nova_compute[183075]: 2026-01-22 17:23:25.768 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:23:25 compute-0 nova_compute[183075]: 2026-01-22 17:23:25.768 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:23:25 compute-0 nova_compute[183075]: 2026-01-22 17:23:25.768 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:23:25 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[228402]: 10.100.0.11:34316 [22/Jan/2026:17:23:25.758] listener listener/metadata 0/0/0/23/23 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.781 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0222311
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.790 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.791 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.805 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.805 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0142660
Jan 22 17:23:25 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[228402]: 10.100.0.11:34322 [22/Jan/2026:17:23:25.790] listener listener/metadata 0/0/0/15/15 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.809 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.810 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.825 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.826 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0161726
Jan 22 17:23:25 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[228402]: 10.100.0.11:34336 [22/Jan/2026:17:23:25.809] listener listener/metadata 0/0/0/17/17 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.830 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.831 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.843 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.844 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0132990
Jan 22 17:23:25 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[228402]: 10.100.0.11:34340 [22/Jan/2026:17:23:25.829] listener listener/metadata 0/0/0/14/14 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.848 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.849 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.862 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.863 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 162 time: 0.0142322
Jan 22 17:23:25 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[228402]: 10.100.0.11:34352 [22/Jan/2026:17:23:25.848] listener listener/metadata 0/0/0/15/15 200 146 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.873 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.874 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.893 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:25 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[228402]: 10.100.0.11:34368 [22/Jan/2026:17:23:25.872] listener listener/metadata 0/0/0/21/21 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:23:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:25.894 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0201979
Jan 22 17:23:26 compute-0 nova_compute[183075]: 2026-01-22 17:23:26.286 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:26 compute-0 nova_compute[183075]: 2026-01-22 17:23:26.720 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:26 compute-0 nova_compute[183075]: 2026-01-22 17:23:26.791 183079 INFO nova.compute.manager [None req-50f0588e-c3fa-493b-aa34-c92b43518ef2 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Get console output
Jan 22 17:23:26 compute-0 nova_compute[183075]: 2026-01-22 17:23:26.797 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:23:27 compute-0 nova_compute[183075]: 2026-01-22 17:23:27.342 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-a7440e72-b977-4601-88ad-ce8a4c72e883" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:23:27 compute-0 nova_compute[183075]: 2026-01-22 17:23:27.342 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-a7440e72-b977-4601-88ad-ce8a4c72e883" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:23:27 compute-0 nova_compute[183075]: 2026-01-22 17:23:27.342 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:23:27 compute-0 nova_compute[183075]: 2026-01-22 17:23:27.342 183079 DEBUG nova.objects.instance [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a7440e72-b977-4601-88ad-ce8a4c72e883 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:23:29 compute-0 nova_compute[183075]: 2026-01-22 17:23:29.252 183079 INFO nova.compute.manager [None req-76cc94ba-591e-4b94-b901-0529d09a1cc2 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Get console output
Jan 22 17:23:29 compute-0 nova_compute[183075]: 2026-01-22 17:23:29.259 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:23:30 compute-0 nova_compute[183075]: 2026-01-22 17:23:30.341 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Updating instance_info_cache with network_info: [{"id": "6b312169-d575-48ac-b3c3-72634952d91f", "address": "fa:16:3e:bf:52:41", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b312169-d5", "ovs_interfaceid": "6b312169-d575-48ac-b3c3-72634952d91f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:23:30 compute-0 nova_compute[183075]: 2026-01-22 17:23:30.361 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-a7440e72-b977-4601-88ad-ce8a4c72e883" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:23:30 compute-0 nova_compute[183075]: 2026-01-22 17:23:30.361 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:23:30 compute-0 podman[228595]: 2026-01-22 17:23:30.389790705 +0000 UTC m=+0.085706571 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 17:23:31 compute-0 nova_compute[183075]: 2026-01-22 17:23:31.288 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:31 compute-0 nova_compute[183075]: 2026-01-22 17:23:31.723 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:31 compute-0 ovn_controller[95372]: 2026-01-22T17:23:31Z|00084|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:95:2f:57 10.100.0.3
Jan 22 17:23:31 compute-0 ovn_controller[95372]: 2026-01-22T17:23:31Z|00085|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:95:2f:57 10.100.0.3
Jan 22 17:23:32 compute-0 nova_compute[183075]: 2026-01-22 17:23:32.245 183079 INFO nova.compute.manager [None req-1369e753-811d-4a73-bc61-c4ae30268931 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Get console output
Jan 22 17:23:32 compute-0 nova_compute[183075]: 2026-01-22 17:23:32.250 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:23:34 compute-0 nova_compute[183075]: 2026-01-22 17:23:34.399 183079 INFO nova.compute.manager [None req-1a44bcb6-2d27-4bcf-a832-3c9f6f8bdb90 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Get console output
Jan 22 17:23:34 compute-0 nova_compute[183075]: 2026-01-22 17:23:34.408 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:23:36 compute-0 nova_compute[183075]: 2026-01-22 17:23:36.292 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:36 compute-0 podman[228627]: 2026-01-22 17:23:36.372060749 +0000 UTC m=+0.075293372 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 17:23:36 compute-0 nova_compute[183075]: 2026-01-22 17:23:36.727 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:37 compute-0 nova_compute[183075]: 2026-01-22 17:23:37.396 183079 INFO nova.compute.manager [None req-24f78f5f-182a-4410-82aa-6b04f37bbd84 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Get console output
Jan 22 17:23:37 compute-0 nova_compute[183075]: 2026-01-22 17:23:37.403 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:23:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:38.068 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:38.069 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:23:38 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:38 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:38 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:38 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:38 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:38 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:23:38 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.476 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.477 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 1.4076238
Jan 22 17:23:39 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.3:32944 [22/Jan/2026:17:23:38.067] listener listener/metadata 0/0/0/1409/1409 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.485 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.486 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.505 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:39 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.3:32952 [22/Jan/2026:17:23:39.485] listener listener/metadata 0/0/0/21/21 200 152 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.506 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 168 time: 0.0199592
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.512 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.513 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.532 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.532 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0193522
Jan 22 17:23:39 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.3:32960 [22/Jan/2026:17:23:39.512] listener listener/metadata 0/0/0/20/20 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.541 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.542 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.565 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:39 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.3:32972 [22/Jan/2026:17:23:39.541] listener listener/metadata 0/0/0/25/25 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.566 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0240819
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.574 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.576 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.599 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:39 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.3:32986 [22/Jan/2026:17:23:39.574] listener listener/metadata 0/0/0/26/26 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.600 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0243976
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.608 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.610 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.632 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.633 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0230737
Jan 22 17:23:39 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.3:33002 [22/Jan/2026:17:23:39.607] listener listener/metadata 0/0/0/25/25 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.640 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.641 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.671 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:39 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.3:33016 [22/Jan/2026:17:23:39.639] listener listener/metadata 0/0/0/32/32 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.671 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0308218
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.677 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.678 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.699 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:39 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.3:33032 [22/Jan/2026:17:23:39.677] listener listener/metadata 0/0/0/22/22 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.700 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0218773
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.705 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.705 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.726 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.727 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0213172
Jan 22 17:23:39 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.3:33044 [22/Jan/2026:17:23:39.704] listener listener/metadata 0/0/0/22/22 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.739 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.740 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.769 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:39 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.3:33052 [22/Jan/2026:17:23:39.739] listener listener/metadata 0/0/0/32/32 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.771 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0314870
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.783 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.785 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.803 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0186100
Jan 22 17:23:39 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.3:33060 [22/Jan/2026:17:23:39.782] listener listener/metadata 0/0/0/21/21 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.815 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.816 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:39 compute-0 nova_compute[183075]: 2026-01-22 17:23:39.834 183079 INFO nova.compute.manager [None req-b6c284ed-1c11-4f86-90b0-2ba0e41c3952 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Get console output
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.837 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.838 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0216110
Jan 22 17:23:39 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.3:33070 [22/Jan/2026:17:23:39.814] listener listener/metadata 0/0/0/24/24 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:23:39 compute-0 nova_compute[183075]: 2026-01-22 17:23:39.841 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.844 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.845 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.860 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:39 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.3:33086 [22/Jan/2026:17:23:39.843] listener listener/metadata 0/0/0/16/16 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.860 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0152588
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.864 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.865 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.886 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.887 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0215313
Jan 22 17:23:39 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.3:33088 [22/Jan/2026:17:23:39.864] listener listener/metadata 0/0/0/22/22 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.893 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.894 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.910 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:39 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.3:33104 [22/Jan/2026:17:23:39.893] listener listener/metadata 0/0/0/18/18 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.911 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0173013
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.916 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.917 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.935 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:39.935 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0182111
Jan 22 17:23:39 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.3:33110 [22/Jan/2026:17:23:39.916] listener listener/metadata 0/0/0/19/19 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:23:41 compute-0 nova_compute[183075]: 2026-01-22 17:23:41.295 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:41 compute-0 nova_compute[183075]: 2026-01-22 17:23:41.729 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:41.940 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:41.941 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:41.942 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:42 compute-0 nova_compute[183075]: 2026-01-22 17:23:42.560 183079 INFO nova.compute.manager [None req-2445ae64-4b2b-4fe3-9d95-6d91939c05d3 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Get console output
Jan 22 17:23:42 compute-0 nova_compute[183075]: 2026-01-22 17:23:42.567 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:23:43 compute-0 nova_compute[183075]: 2026-01-22 17:23:43.378 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:23:44 compute-0 nova_compute[183075]: 2026-01-22 17:23:44.974 183079 INFO nova.compute.manager [None req-6a563b5b-6b24-4a58-ba8f-1134890b1249 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Get console output
Jan 22 17:23:44 compute-0 nova_compute[183075]: 2026-01-22 17:23:44.982 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:23:46 compute-0 nova_compute[183075]: 2026-01-22 17:23:46.296 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:46 compute-0 nova_compute[183075]: 2026-01-22 17:23:46.732 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:47 compute-0 podman[228653]: 2026-01-22 17:23:47.367290223 +0000 UTC m=+0.062151201 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 17:23:47 compute-0 podman[228654]: 2026-01-22 17:23:47.378654917 +0000 UTC m=+0.071147382 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, name=ubi9-minimal, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.expose-services=)
Jan 22 17:23:47 compute-0 podman[228652]: 2026-01-22 17:23:47.404386814 +0000 UTC m=+0.102606502 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 17:23:49 compute-0 nova_compute[183075]: 2026-01-22 17:23:49.919 183079 DEBUG oslo_concurrency.lockutils [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "c4708d03-a0cd-40d7-be06-e97b6a4b45b7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:49 compute-0 nova_compute[183075]: 2026-01-22 17:23:49.920 183079 DEBUG oslo_concurrency.lockutils [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "c4708d03-a0cd-40d7-be06-e97b6a4b45b7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:49 compute-0 nova_compute[183075]: 2026-01-22 17:23:49.941 183079 DEBUG nova.compute.manager [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.018 183079 DEBUG oslo_concurrency.lockutils [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.019 183079 DEBUG oslo_concurrency.lockutils [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.027 183079 DEBUG nova.virt.hardware [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.027 183079 INFO nova.compute.claims [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.092 183079 INFO nova.compute.manager [None req-44ce0fe3-f7e6-46d8-84de-843feca834b9 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Get console output
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.097 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.186 183079 DEBUG nova.compute.provider_tree [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.202 183079 DEBUG nova.scheduler.client.report [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.222 183079 DEBUG oslo_concurrency.lockutils [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.223 183079 DEBUG nova.compute.manager [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.258 183079 DEBUG nova.compute.manager [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.259 183079 DEBUG nova.network.neutron [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.277 183079 INFO nova.virt.libvirt.driver [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.292 183079 DEBUG nova.compute.manager [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.382 183079 DEBUG nova.compute.manager [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.383 183079 DEBUG nova.virt.libvirt.driver [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.384 183079 INFO nova.virt.libvirt.driver [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Creating image(s)
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.384 183079 DEBUG oslo_concurrency.lockutils [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "/var/lib/nova/instances/c4708d03-a0cd-40d7-be06-e97b6a4b45b7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.385 183079 DEBUG oslo_concurrency.lockutils [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "/var/lib/nova/instances/c4708d03-a0cd-40d7-be06-e97b6a4b45b7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.385 183079 DEBUG oslo_concurrency.lockutils [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "/var/lib/nova/instances/c4708d03-a0cd-40d7-be06-e97b6a4b45b7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.397 183079 DEBUG oslo_concurrency.processutils [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.452 183079 DEBUG oslo_concurrency.processutils [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.453 183079 DEBUG oslo_concurrency.lockutils [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.454 183079 DEBUG oslo_concurrency.lockutils [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.469 183079 DEBUG oslo_concurrency.processutils [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.535 183079 DEBUG oslo_concurrency.processutils [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.537 183079 DEBUG oslo_concurrency.processutils [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/c4708d03-a0cd-40d7-be06-e97b6a4b45b7/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.596 183079 DEBUG oslo_concurrency.processutils [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/c4708d03-a0cd-40d7-be06-e97b6a4b45b7/disk 1073741824" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.598 183079 DEBUG oslo_concurrency.lockutils [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.599 183079 DEBUG oslo_concurrency.processutils [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.685 183079 DEBUG oslo_concurrency.processutils [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.687 183079 DEBUG nova.virt.disk.api [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Checking if we can resize image /var/lib/nova/instances/c4708d03-a0cd-40d7-be06-e97b6a4b45b7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.688 183079 DEBUG oslo_concurrency.processutils [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4708d03-a0cd-40d7-be06-e97b6a4b45b7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.763 183079 DEBUG oslo_concurrency.processutils [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4708d03-a0cd-40d7-be06-e97b6a4b45b7/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.765 183079 DEBUG nova.virt.disk.api [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Cannot resize image /var/lib/nova/instances/c4708d03-a0cd-40d7-be06-e97b6a4b45b7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.765 183079 DEBUG nova.objects.instance [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lazy-loading 'migration_context' on Instance uuid c4708d03-a0cd-40d7-be06-e97b6a4b45b7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.780 183079 DEBUG nova.virt.libvirt.driver [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.781 183079 DEBUG nova.virt.libvirt.driver [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Ensure instance console log exists: /var/lib/nova/instances/c4708d03-a0cd-40d7-be06-e97b6a4b45b7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.781 183079 DEBUG oslo_concurrency.lockutils [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.782 183079 DEBUG oslo_concurrency.lockutils [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:50 compute-0 nova_compute[183075]: 2026-01-22 17:23:50.782 183079 DEBUG oslo_concurrency.lockutils [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:51 compute-0 nova_compute[183075]: 2026-01-22 17:23:51.300 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:51 compute-0 nova_compute[183075]: 2026-01-22 17:23:51.734 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:52 compute-0 nova_compute[183075]: 2026-01-22 17:23:52.371 183079 DEBUG nova.policy [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:23:53 compute-0 nova_compute[183075]: 2026-01-22 17:23:53.074 183079 DEBUG nova.network.neutron [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Successfully updated port: 8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:23:53 compute-0 nova_compute[183075]: 2026-01-22 17:23:53.087 183079 DEBUG oslo_concurrency.lockutils [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "refresh_cache-c4708d03-a0cd-40d7-be06-e97b6a4b45b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:23:53 compute-0 nova_compute[183075]: 2026-01-22 17:23:53.087 183079 DEBUG oslo_concurrency.lockutils [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquired lock "refresh_cache-c4708d03-a0cd-40d7-be06-e97b6a4b45b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:23:53 compute-0 nova_compute[183075]: 2026-01-22 17:23:53.088 183079 DEBUG nova.network.neutron [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:23:53 compute-0 nova_compute[183075]: 2026-01-22 17:23:53.161 183079 DEBUG nova.compute.manager [req-418324c7-e253-4686-a7c5-75c944e43877 req-02e3ce01-cce6-4b7d-a3fe-345bd9747737 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Received event network-changed-8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:23:53 compute-0 nova_compute[183075]: 2026-01-22 17:23:53.161 183079 DEBUG nova.compute.manager [req-418324c7-e253-4686-a7c5-75c944e43877 req-02e3ce01-cce6-4b7d-a3fe-345bd9747737 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Refreshing instance network info cache due to event network-changed-8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:23:53 compute-0 nova_compute[183075]: 2026-01-22 17:23:53.162 183079 DEBUG oslo_concurrency.lockutils [req-418324c7-e253-4686-a7c5-75c944e43877 req-02e3ce01-cce6-4b7d-a3fe-345bd9747737 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-c4708d03-a0cd-40d7-be06-e97b6a4b45b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:23:53 compute-0 nova_compute[183075]: 2026-01-22 17:23:53.221 183079 DEBUG nova.network.neutron [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:23:54 compute-0 podman[228730]: 2026-01-22 17:23:54.354515518 +0000 UTC m=+0.070603537 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202)
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.688 183079 DEBUG nova.network.neutron [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Updating instance_info_cache with network_info: [{"id": "8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d", "address": "fa:16:3e:4f:49:11", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c03d5ae-dc", "ovs_interfaceid": "8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.710 183079 DEBUG oslo_concurrency.lockutils [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Releasing lock "refresh_cache-c4708d03-a0cd-40d7-be06-e97b6a4b45b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.710 183079 DEBUG nova.compute.manager [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Instance network_info: |[{"id": "8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d", "address": "fa:16:3e:4f:49:11", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c03d5ae-dc", "ovs_interfaceid": "8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.711 183079 DEBUG oslo_concurrency.lockutils [req-418324c7-e253-4686-a7c5-75c944e43877 req-02e3ce01-cce6-4b7d-a3fe-345bd9747737 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-c4708d03-a0cd-40d7-be06-e97b6a4b45b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.711 183079 DEBUG nova.network.neutron [req-418324c7-e253-4686-a7c5-75c944e43877 req-02e3ce01-cce6-4b7d-a3fe-345bd9747737 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Refreshing network info cache for port 8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.717 183079 DEBUG nova.virt.libvirt.driver [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Start _get_guest_xml network_info=[{"id": "8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d", "address": "fa:16:3e:4f:49:11", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c03d5ae-dc", "ovs_interfaceid": "8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.725 183079 WARNING nova.virt.libvirt.driver [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.737 183079 DEBUG nova.virt.libvirt.host [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.737 183079 DEBUG nova.virt.libvirt.host [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.740 183079 DEBUG nova.virt.libvirt.host [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.740 183079 DEBUG nova.virt.libvirt.host [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.741 183079 DEBUG nova.virt.libvirt.driver [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.741 183079 DEBUG nova.virt.hardware [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.741 183079 DEBUG nova.virt.hardware [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.742 183079 DEBUG nova.virt.hardware [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.742 183079 DEBUG nova.virt.hardware [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.742 183079 DEBUG nova.virt.hardware [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.742 183079 DEBUG nova.virt.hardware [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.743 183079 DEBUG nova.virt.hardware [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.743 183079 DEBUG nova.virt.hardware [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.743 183079 DEBUG nova.virt.hardware [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.743 183079 DEBUG nova.virt.hardware [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.744 183079 DEBUG nova.virt.hardware [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.747 183079 DEBUG nova.virt.libvirt.vif [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:23:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1395753074',display_name='tempest-server-test-1395753074',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1395753074',id=42,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBZ1addGiefngOIZky2bZbl/LpQHm8ydIPPcYG47RZ8D1pquJ/FsAcVn7mNe0QyfiQPXvuSs1eSw/jGOy+x3K8tzxt5N8fIelXKcTizhDHr81ws2uYn+dW6F9Hiy6pJp3A==',key_name='tempest-keypair-test-453437320',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='02818155e7af4645bc909d4ba671f11f',ramdisk_id='',reservation_id='r-hkqxuw3r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='
virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSameNetwork-953620552',owner_user_name='tempest-FloatingIpSameNetwork-953620552-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:23:50Z,user_data=None,user_id='1148a46489e842e6a0c7660c54567798',uuid=c4708d03-a0cd-40d7-be06-e97b6a4b45b7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d", "address": "fa:16:3e:4f:49:11", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c03d5ae-dc", "ovs_interfaceid": "8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.748 183079 DEBUG nova.network.os_vif_util [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converting VIF {"id": "8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d", "address": "fa:16:3e:4f:49:11", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c03d5ae-dc", "ovs_interfaceid": "8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.748 183079 DEBUG nova.network.os_vif_util [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:49:11,bridge_name='br-int',has_traffic_filtering=True,id=8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8c03d5ae-dc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.749 183079 DEBUG nova.objects.instance [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lazy-loading 'pci_devices' on Instance uuid c4708d03-a0cd-40d7-be06-e97b6a4b45b7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.763 183079 DEBUG nova.virt.libvirt.driver [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:23:54 compute-0 nova_compute[183075]:   <uuid>c4708d03-a0cd-40d7-be06-e97b6a4b45b7</uuid>
Jan 22 17:23:54 compute-0 nova_compute[183075]:   <name>instance-0000002a</name>
Jan 22 17:23:54 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:23:54 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:23:54 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:23:54 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1395753074</nova:name>
Jan 22 17:23:54 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:23:54</nova:creationTime>
Jan 22 17:23:54 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:23:54 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:23:54 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:23:54 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:23:54 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:23:54 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:23:54 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:23:54 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:23:54 compute-0 nova_compute[183075]:         <nova:user uuid="1148a46489e842e6a0c7660c54567798">tempest-FloatingIpSameNetwork-953620552-project-member</nova:user>
Jan 22 17:23:54 compute-0 nova_compute[183075]:         <nova:project uuid="02818155e7af4645bc909d4ba671f11f">tempest-FloatingIpSameNetwork-953620552</nova:project>
Jan 22 17:23:54 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:23:54 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:23:54 compute-0 nova_compute[183075]:         <nova:port uuid="8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d">
Jan 22 17:23:54 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:23:54 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:23:54 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:23:54 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <system>
Jan 22 17:23:54 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:23:54 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:23:54 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:23:54 compute-0 nova_compute[183075]:       <entry name="serial">c4708d03-a0cd-40d7-be06-e97b6a4b45b7</entry>
Jan 22 17:23:54 compute-0 nova_compute[183075]:       <entry name="uuid">c4708d03-a0cd-40d7-be06-e97b6a4b45b7</entry>
Jan 22 17:23:54 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     </system>
Jan 22 17:23:54 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:23:54 compute-0 nova_compute[183075]:   <os>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:   </os>
Jan 22 17:23:54 compute-0 nova_compute[183075]:   <features>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:   </features>
Jan 22 17:23:54 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:23:54 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:23:54 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:23:54 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/c4708d03-a0cd-40d7-be06-e97b6a4b45b7/disk"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:23:54 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:4f:49:11"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:       <target dev="tap8c03d5ae-dc"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:23:54 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/c4708d03-a0cd-40d7-be06-e97b6a4b45b7/console.log" append="off"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <video>
Jan 22 17:23:54 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     </video>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:23:54 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:23:54 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:23:54 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:23:54 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:23:54 compute-0 nova_compute[183075]: </domain>
Jan 22 17:23:54 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.764 183079 DEBUG nova.compute.manager [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Preparing to wait for external event network-vif-plugged-8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.764 183079 DEBUG oslo_concurrency.lockutils [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "c4708d03-a0cd-40d7-be06-e97b6a4b45b7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.765 183079 DEBUG oslo_concurrency.lockutils [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "c4708d03-a0cd-40d7-be06-e97b6a4b45b7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.765 183079 DEBUG oslo_concurrency.lockutils [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "c4708d03-a0cd-40d7-be06-e97b6a4b45b7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.766 183079 DEBUG nova.virt.libvirt.vif [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:23:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1395753074',display_name='tempest-server-test-1395753074',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1395753074',id=42,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBZ1addGiefngOIZky2bZbl/LpQHm8ydIPPcYG47RZ8D1pquJ/FsAcVn7mNe0QyfiQPXvuSs1eSw/jGOy+x3K8tzxt5N8fIelXKcTizhDHr81ws2uYn+dW6F9Hiy6pJp3A==',key_name='tempest-keypair-test-453437320',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='02818155e7af4645bc909d4ba671f11f',ramdisk_id='',reservation_id='r-hkqxuw3r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSameNetwork-953620552',owner_user_name='tempest-FloatingIpSameNetwork-953620552-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:23:50Z,user_data=None,user_id='1148a46489e842e6a0c7660c54567798',uuid=c4708d03-a0cd-40d7-be06-e97b6a4b45b7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d", "address": "fa:16:3e:4f:49:11", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c03d5ae-dc", "ovs_interfaceid": "8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.766 183079 DEBUG nova.network.os_vif_util [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converting VIF {"id": "8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d", "address": "fa:16:3e:4f:49:11", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c03d5ae-dc", "ovs_interfaceid": "8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.767 183079 DEBUG nova.network.os_vif_util [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:49:11,bridge_name='br-int',has_traffic_filtering=True,id=8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8c03d5ae-dc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.767 183079 DEBUG os_vif [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:49:11,bridge_name='br-int',has_traffic_filtering=True,id=8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8c03d5ae-dc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.768 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.768 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.769 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.772 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.773 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8c03d5ae-dc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.774 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8c03d5ae-dc, col_values=(('external_ids', {'iface-id': '8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4f:49:11', 'vm-uuid': 'c4708d03-a0cd-40d7-be06-e97b6a4b45b7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.776 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:54 compute-0 NetworkManager[55454]: <info>  [1769102634.7778] manager: (tap8c03d5ae-dc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/206)
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.779 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.784 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.785 183079 INFO os_vif [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:49:11,bridge_name='br-int',has_traffic_filtering=True,id=8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8c03d5ae-dc')
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.857 183079 DEBUG nova.virt.libvirt.driver [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.858 183079 DEBUG nova.virt.libvirt.driver [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] No VIF found with MAC fa:16:3e:4f:49:11, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:23:54 compute-0 kernel: tap8c03d5ae-dc: entered promiscuous mode
Jan 22 17:23:54 compute-0 NetworkManager[55454]: <info>  [1769102634.9534] manager: (tap8c03d5ae-dc): new Tun device (/org/freedesktop/NetworkManager/Devices/207)
Jan 22 17:23:54 compute-0 ovn_controller[95372]: 2026-01-22T17:23:54Z|00474|binding|INFO|Claiming lport 8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d for this chassis.
Jan 22 17:23:54 compute-0 ovn_controller[95372]: 2026-01-22T17:23:54Z|00475|binding|INFO|8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d: Claiming fa:16:3e:4f:49:11 10.100.0.14
Jan 22 17:23:54 compute-0 nova_compute[183075]: 2026-01-22 17:23:54.958 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:54.970 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:49:11 10.100.0.14'], port_security=['fa:16:3e:4f:49:11 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'c4708d03-a0cd-40d7-be06-e97b6a4b45b7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eee918a6-66b2-47ae-b702-620a23ef395b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02818155e7af4645bc909d4ba671f11f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c960fc2f-fbcd-48ff-b6bd-d3b900f07332', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.232'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bad11b1d-0e62-4467-b3f8-4f80f0fd75da, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=9, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:23:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:54.971 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d in datapath eee918a6-66b2-47ae-b702-620a23ef395b bound to our chassis
Jan 22 17:23:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:54.974 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network eee918a6-66b2-47ae-b702-620a23ef395b
Jan 22 17:23:54 compute-0 systemd-udevd[228765]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:23:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:54.993 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[816501a8-01b6-4d0d-b889-5a0e630e43d6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:23:54 compute-0 ovn_controller[95372]: 2026-01-22T17:23:54Z|00476|binding|INFO|Setting lport 8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d ovn-installed in OVS
Jan 22 17:23:54 compute-0 ovn_controller[95372]: 2026-01-22T17:23:54Z|00477|binding|INFO|Setting lport 8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d up in Southbound
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.000 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:55 compute-0 NetworkManager[55454]: <info>  [1769102635.0108] device (tap8c03d5ae-dc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:23:55 compute-0 NetworkManager[55454]: <info>  [1769102635.0117] device (tap8c03d5ae-dc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:23:55 compute-0 systemd-machined[154382]: New machine qemu-42-instance-0000002a.
Jan 22 17:23:55 compute-0 systemd[1]: Started Virtual Machine qemu-42-instance-0000002a.
Jan 22 17:23:55 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:55.035 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[5b445b4b-bb67-4334-b0c8-7cfa3ff6c8c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:23:55 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:55.038 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[f931297a-96cf-41bd-ae79-75c29c01ca43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:23:55 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:55.078 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[7d10a2c6-6017-435f-b109-bcf2b6d75aa1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:23:55 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:55.107 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[eaa5cf1a-e825-4b87-aadc-34aa9b594459]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeee918a6-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:e2:7e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 202, 'tx_packets': 105, 'rx_bytes': 17308, 'tx_bytes': 12007, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 202, 'tx_packets': 105, 'rx_bytes': 17308, 'tx_bytes': 12007, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 130], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492684, 'reachable_time': 18835, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228779, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:23:55 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:55.130 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[125188da-7f3c-4448-9f92-6f2a3eab0dda]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapeee918a6-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 492699, 'tstamp': 492699}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228780, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapeee918a6-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 492702, 'tstamp': 492702}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228780, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:23:55 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:55.132 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeee918a6-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.134 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.135 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:55 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:55.136 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeee918a6-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:23:55 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:55.137 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:23:55 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:55.138 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapeee918a6-60, col_values=(('external_ids', {'iface-id': '15d4de90-41f4-4532-aebd-197c2a33c6d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:23:55 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:23:55.138 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.214 183079 INFO nova.compute.manager [None req-a54bd87e-be7b-4634-8add-d7a9292748c3 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Get console output
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.220 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.456 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '89784883-b435-428a-8936-a513f9e65fe0', 'name': 'tempest-server-test-1551955303', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000029', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '02818155e7af4645bc909d4ba671f11f', 'user_id': '1148a46489e842e6a0c7660c54567798', 'hostId': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.459 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a7440e72-b977-4601-88ad-ce8a4c72e883', 'name': 'tempest-server-test-1599866618', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000027', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '02818155e7af4645bc909d4ba671f11f', 'user_id': '1148a46489e842e6a0c7660c54567798', 'hostId': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.462 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '7c4cc341-c93c-4077-a541-31a8487482f0', 'name': 'tempest-server-0-352913770', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000028', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e4c0bb18013747dfad2e25b2495090eb', 'user_id': '852aea4e08344f39ae07e6b57393c767', 'hostId': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.463 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c4708d03-a0cd-40d7-be06-e97b6a4b45b7', 'name': 'tempest-server-test-1395753074', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000002a', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'paused', 'tenant_id': '02818155e7af4645bc909d4ba671f11f', 'user_id': '1148a46489e842e6a0c7660c54567798', 'hostId': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'status': 'paused', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.464 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.479 12 DEBUG ceilometer.compute.pollsters [-] 89784883-b435-428a-8936-a513f9e65fe0/cpu volume: 11270000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.497 12 DEBUG ceilometer.compute.pollsters [-] a7440e72-b977-4601-88ad-ce8a4c72e883/cpu volume: 10870000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.498 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102635.4982886, c4708d03-a0cd-40d7-be06-e97b6a4b45b7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.499 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] VM Started (Lifecycle Event)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.513 12 DEBUG ceilometer.compute.pollsters [-] 7c4cc341-c93c-4077-a541-31a8487482f0/cpu volume: 11940000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.525 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.530 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102635.4984918, c4708d03-a0cd-40d7-be06-e97b6a4b45b7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.531 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] VM Paused (Lifecycle Event)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.541 12 DEBUG ceilometer.compute.pollsters [-] c4708d03-a0cd-40d7-be06-e97b6a4b45b7/cpu volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.541 183079 DEBUG nova.compute.manager [req-61b83316-3869-4a3f-bb57-8ea3b7b967f0 req-96a1845c-e8f4-470a-aa79-ff23ea9b93c0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Received event network-vif-plugged-8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.542 183079 DEBUG oslo_concurrency.lockutils [req-61b83316-3869-4a3f-bb57-8ea3b7b967f0 req-96a1845c-e8f4-470a-aa79-ff23ea9b93c0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c4708d03-a0cd-40d7-be06-e97b6a4b45b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.542 183079 DEBUG oslo_concurrency.lockutils [req-61b83316-3869-4a3f-bb57-8ea3b7b967f0 req-96a1845c-e8f4-470a-aa79-ff23ea9b93c0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c4708d03-a0cd-40d7-be06-e97b6a4b45b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.542 183079 DEBUG oslo_concurrency.lockutils [req-61b83316-3869-4a3f-bb57-8ea3b7b967f0 req-96a1845c-e8f4-470a-aa79-ff23ea9b93c0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c4708d03-a0cd-40d7-be06-e97b6a4b45b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.542 183079 DEBUG nova.compute.manager [req-61b83316-3869-4a3f-bb57-8ea3b7b967f0 req-96a1845c-e8f4-470a-aa79-ff23ea9b93c0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Processing event network-vif-plugged-8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.545 183079 DEBUG nova.compute.manager [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b3bf5344-06d1-4a16-a299-526dfdee84aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11270000000, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': '89784883-b435-428a-8936-a513f9e65fe0', 'timestamp': '2026-01-22T17:23:55.464158', 'resource_metadata': {'display_name': 'tempest-server-test-1551955303', 'name': 'instance-00000029', 'instance_id': '89784883-b435-428a-8936-a513f9e65fe0', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '211841ca-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.239966096, 'message_signature': '74ae921c1122c3942545b3bcc1cce22b1dbf3baf590458fe15b51bf9e0dee672'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10870000000, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'a7440e72-b977-4601-88ad-ce8a4c72e883', 'timestamp': '2026-01-22T17:23:55.464158', 'resource_metadata': {'display_name': 'tempest-server-test-1599866618', 'name': 'instance-00000027', 'instance_id': 'a7440e72-b977-4601-88ad-ce8a4c72e883', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '211aff00-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.257803213, 'message_signature': '752818613eec383a9d48326b0f44588d3615df6e2654c0dfe72c32a9ad048aa1'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11940000000, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': '7c4cc341-c93c-4077-a541-31a8487482f0', 'timestamp': '2026-01-22T17:23:55.464158', 'resource_metadata': {'display_name': 'tempest-server-0-352913770', 'name': 'instance-00000028', 'instance_id': '7c4cc341-c93c-4077-a541-31a8487482f0', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '211d8afe-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.274296303, 'message_signature': '25ac1fa878176996d17ef271a6b6efe510fce0407db1f96df479eb25e1161eda'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'c4708d03-a0cd-40d7-be06-e97b6a4b45b7', 'timestamp': '2026-01-22T17:23:55.464158', 'resource_metadata': {'display_name': 'tempest-server-test-1395753074', 'name': 'instance-0000002a', 'instance_id': 'c4708d03-a0cd-40d7-be06-e97b6a4b45b7', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '2121d032-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.302043804, 'message_signature': '93115f241501ba2a2c463075e8f33b9b1f30686353e3ab794703759327b7fbeb'}]}, 'timestamp': '2026-01-22 17:23:55.543224', '_unique_id': 'cee9d870d8bf4787a3442982a25aea74'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.544 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.547 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.549 183079 DEBUG nova.virt.libvirt.driver [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.551 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 89784883-b435-428a-8936-a513f9e65fe0 / tapb49f37fd-77 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.552 12 DEBUG ceilometer.compute.pollsters [-] 89784883-b435-428a-8936-a513f9e65fe0/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.554 183079 INFO nova.virt.libvirt.driver [-] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Instance spawned successfully.
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.555 183079 DEBUG nova.virt.libvirt.driver [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.555 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for a7440e72-b977-4601-88ad-ce8a4c72e883 / tap6b312169-d5 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.556 12 DEBUG ceilometer.compute.pollsters [-] a7440e72-b977-4601-88ad-ce8a4c72e883/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.559 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 7c4cc341-c93c-4077-a541-31a8487482f0 / tapdbad268a-40 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.560 12 DEBUG ceilometer.compute.pollsters [-] 7c4cc341-c93c-4077-a541-31a8487482f0/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.560 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.563 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for c4708d03-a0cd-40d7-be06-e97b6a4b45b7 / tap8c03d5ae-dc inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.563 12 DEBUG ceilometer.compute.pollsters [-] c4708d03-a0cd-40d7-be06-e97b6a4b45b7/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.564 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102635.5484984, c4708d03-a0cd-40d7-be06-e97b6a4b45b7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.564 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] VM Resumed (Lifecycle Event)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b5c0ca44-b782-401b-8a89-a714a20ca4d4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000029-89784883-b435-428a-8936-a513f9e65fe0-tapb49f37fd-77', 'timestamp': '2026-01-22T17:23:55.547912', 'resource_metadata': {'display_name': 'tempest-server-test-1551955303', 'name': 'tapb49f37fd-77', 'instance_id': '89784883-b435-428a-8936-a513f9e65fe0', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:2f:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb49f37fd-77'}, 'message_id': '2123662c-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.308406274, 'message_signature': 'b74ca814e231bdaa73542e3864915416d8c1237e6c50342da6fbd5fbd2842c63'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000027-a7440e72-b977-4601-88ad-ce8a4c72e883-tap6b312169-d5', 'timestamp': '2026-01-22T17:23:55.547912', 'resource_metadata': {'display_name': 'tempest-server-test-1599866618', 'name': 'tap6b312169-d5', 'instance_id': 'a7440e72-b977-4601-88ad-ce8a4c72e883', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:52:41', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b312169-d5'}, 'message_id': '2123f92a-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.313706616, 'message_signature': '8324a07be8af33cd4a0fdf3780d8fbf4f7c9ac96d6ee13213cfc5e74fbdf2d80'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-00000028-7c4cc341-c93c-4077-a541-31a8487482f0-tapdbad268a-40', 'timestamp': '2026-01-22T17:23:55.547912', 'resource_metadata': {'display_name': 'tempest-server-0-352913770', 'name': 'tapdbad268a-40', 'instance_id': '7c4cc341-c93c-4077-a541-31a8487482f0', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d2:a8:c8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdbad268a-40'}, 'message_id': '21248584-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.317432786, 'message_signature': '1a095e1c29975322b84bc1fe87413c65559f8cc85bfc0c427edef65b10590785'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-0000002a-c4708d03-a0cd-40d7-be06-e97b6a4b45b7-tap8c03d5ae-dc', 'timestamp': '2026-01-22T17:23:55.547912', 'resource_metadata': {'display_name': 'tempest-server-test-1395753074', 'name': 'tap8c03d5ae-dc', 'instance_id': 'c4708d03-a0cd-40d7-be06-e97b6a4b45b7', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4f:49:11', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8c03d5ae-dc'}, 'message_id': '21251558-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.320920659, 'message_signature': '22dd6dd4d8541991a68fbeff101259aaa531620b23731c59bdc8febce73bd966'}]}, 'timestamp': '2026-01-22 17:23:55.564297', '_unique_id': '0f541600de4046a99c83f8e0b41306e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.567 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.569 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.569 12 DEBUG ceilometer.compute.pollsters [-] 89784883-b435-428a-8936-a513f9e65fe0/network.outgoing.packets volume: 119 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.569 12 DEBUG ceilometer.compute.pollsters [-] a7440e72-b977-4601-88ad-ce8a4c72e883/network.outgoing.packets volume: 121 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.570 12 DEBUG ceilometer.compute.pollsters [-] 7c4cc341-c93c-4077-a541-31a8487482f0/network.outgoing.packets volume: 131 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.570 12 DEBUG ceilometer.compute.pollsters [-] c4708d03-a0cd-40d7-be06-e97b6a4b45b7/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '16a9d7dd-438e-43eb-bd1b-cd70945a30d0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 119, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000029-89784883-b435-428a-8936-a513f9e65fe0-tapb49f37fd-77', 'timestamp': '2026-01-22T17:23:55.569234', 'resource_metadata': {'display_name': 'tempest-server-test-1551955303', 'name': 'tapb49f37fd-77', 'instance_id': '89784883-b435-428a-8936-a513f9e65fe0', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:2f:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb49f37fd-77'}, 'message_id': '2125f004-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.308406274, 'message_signature': '53174f00feab57b48ac6d38a5ebad58051b71039d069eee686bbbcf272569695'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 121, 'user_id': 
'1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000027-a7440e72-b977-4601-88ad-ce8a4c72e883-tap6b312169-d5', 'timestamp': '2026-01-22T17:23:55.569234', 'resource_metadata': {'display_name': 'tempest-server-test-1599866618', 'name': 'tap6b312169-d5', 'instance_id': 'a7440e72-b977-4601-88ad-ce8a4c72e883', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:52:41', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b312169-d5'}, 'message_id': '21260562-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.313706616, 'message_signature': 'e33fbcf55071521ee6be4d962f01fd147f06ba3bef543e85226e03967dc45115'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 131, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-00000028-7c4cc341-c93c-4077-a541-31a8487482f0-tapdbad268a-40', 'timestamp': '2026-01-22T17:23:55.569234', 'resource_metadata': {'display_name': 'tempest-server-0-352913770', 'name': 'tapdbad268a-40', 'instance_id': '7c4cc341-c93c-4077-a541-31a8487482f0', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d2:a8:c8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdbad268a-40'}, 'message_id': '21261778-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.317432786, 'message_signature': '7297191996eb180dbe7d5e7976062b83c62f924ab9d6ef7bda8857720b2e4625'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-0000002a-c4708d03-a0cd-40d7-be06-e97b6a4b45b7-tap8c03d5ae-dc', 'timestamp': '2026-01-22T17:23:55.569234', 'resource_metadata': {'display_name': 'tempest-server-test-1395753074', 'name': 'tap8c03d5ae-dc', 'instance_id': 'c4708d03-a0cd-40d7-be06-e97b6a4b45b7', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4f:49:11', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8c03d5ae-dc'}, 'message_id': '21262a7e-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.320920659, 'message_signature': '2fa2e94e7b28d1407a051ac3c543a3f2dfd929616eeda392be15f02a4bf712f3'}]}, 'timestamp': '2026-01-22 17:23:55.571367', '_unique_id': 'f553517c07c642c69ebf46a92becfbf3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.572 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.574 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.574 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.574 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1551955303>, <NovaLikeServer: tempest-server-test-1599866618>, <NovaLikeServer: tempest-server-0-352913770>, <NovaLikeServer: tempest-server-test-1395753074>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1551955303>, <NovaLikeServer: tempest-server-test-1599866618>, <NovaLikeServer: tempest-server-0-352913770>, <NovaLikeServer: tempest-server-test-1395753074>]
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.575 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.582 12 DEBUG ceilometer.compute.pollsters [-] 89784883-b435-428a-8936-a513f9e65fe0/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.584 183079 DEBUG nova.virt.libvirt.driver [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.584 183079 DEBUG nova.virt.libvirt.driver [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.585 183079 DEBUG nova.virt.libvirt.driver [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.585 183079 DEBUG nova.virt.libvirt.driver [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.585 183079 DEBUG nova.virt.libvirt.driver [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.586 183079 DEBUG nova.virt.libvirt.driver [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.590 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.592 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.592 12 DEBUG ceilometer.compute.pollsters [-] a7440e72-b977-4601-88ad-ce8a4c72e883/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.601 12 DEBUG ceilometer.compute.pollsters [-] 7c4cc341-c93c-4077-a541-31a8487482f0/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.615 12 DEBUG ceilometer.compute.pollsters [-] c4708d03-a0cd-40d7-be06-e97b6a4b45b7/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8eb69310-2dd0-48b9-b799-71572c4bd7f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': '89784883-b435-428a-8936-a513f9e65fe0-vda', 'timestamp': '2026-01-22T17:23:55.575173', 'resource_metadata': {'display_name': 'tempest-server-test-1551955303', 'name': 'instance-00000029', 'instance_id': '89784883-b435-428a-8936-a513f9e65fe0', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2128051a-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.335642462, 'message_signature': 'bab2e50403092b277da4488b576377abfa1b039d93233660ed853d648bb046d0'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 
'a7440e72-b977-4601-88ad-ce8a4c72e883-vda', 'timestamp': '2026-01-22T17:23:55.575173', 'resource_metadata': {'display_name': 'tempest-server-test-1599866618', 'name': 'instance-00000027', 'instance_id': 'a7440e72-b977-4601-88ad-ce8a4c72e883', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2129853e-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.343847081, 'message_signature': 'ce6925e7b24fc274fc00c79372f6f1b3d74a526d1d04c470e58177067c3f0205'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': '7c4cc341-c93c-4077-a541-31a8487482f0-vda', 'timestamp': '2026-01-22T17:23:55.575173', 'resource_metadata': {'display_name': 'tempest-server-0-352913770', 'name': 'instance-00000028', 'instance_id': '7c4cc341-c93c-4077-a541-31a8487482f0', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '212ada74-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.353766006, 'message_signature': 'ffe6ed4dbe2188ebadceb39485ed732c87cf8c474717de9e0a8f4ef49dd6c661'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'c4708d03-a0cd-40d7-be06-e97b6a4b45b7-vda', 'timestamp': '2026-01-22T17:23:55.575173', 'resource_metadata': {'display_name': 'tempest-server-test-1395753074', 'name': 'instance-0000002a', 'instance_id': 'c4708d03-a0cd-40d7-be06-e97b6a4b45b7', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '212d0362-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.362431158, 'message_signature': '167645af04e3bd6aea2a987c06ab4bd8a7e9eea35fa5e0f1e8a6b1e4892f056d'}]}, 'timestamp': '2026-01-22 17:23:55.616296', '_unique_id': '1eaec31621df45b696b1f5aff123da25'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.618 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.619 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.620 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.620 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-1551955303>, <NovaLikeServer: tempest-server-test-1599866618>, <NovaLikeServer: tempest-server-0-352913770>, <NovaLikeServer: tempest-server-test-1395753074>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1551955303>, <NovaLikeServer: tempest-server-test-1599866618>, <NovaLikeServer: tempest-server-0-352913770>, <NovaLikeServer: tempest-server-test-1395753074>]
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.620 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.620 12 DEBUG ceilometer.compute.pollsters [-] 89784883-b435-428a-8936-a513f9e65fe0/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.621 12 DEBUG ceilometer.compute.pollsters [-] a7440e72-b977-4601-88ad-ce8a4c72e883/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.621 12 DEBUG ceilometer.compute.pollsters [-] 7c4cc341-c93c-4077-a541-31a8487482f0/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.622 12 DEBUG ceilometer.compute.pollsters [-] c4708d03-a0cd-40d7-be06-e97b6a4b45b7/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aaa70c87-4d98-4888-9992-b5eceddfb463', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000029-89784883-b435-428a-8936-a513f9e65fe0-tapb49f37fd-77', 'timestamp': '2026-01-22T17:23:55.620834', 'resource_metadata': {'display_name': 'tempest-server-test-1551955303', 'name': 'tapb49f37fd-77', 'instance_id': '89784883-b435-428a-8936-a513f9e65fe0', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:2f:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb49f37fd-77'}, 'message_id': '212dcdba-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.308406274, 'message_signature': '70659d9d87d64dcadf00ba12fa6476d532b473e2d21c86b35cfb8f1642fc98b1'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000027-a7440e72-b977-4601-88ad-ce8a4c72e883-tap6b312169-d5', 'timestamp': '2026-01-22T17:23:55.620834', 'resource_metadata': {'display_name': 'tempest-server-test-1599866618', 'name': 'tap6b312169-d5', 'instance_id': 'a7440e72-b977-4601-88ad-ce8a4c72e883', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:52:41', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b312169-d5'}, 'message_id': '212de016-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.313706616, 'message_signature': 'e27ac038cee84dfd7917a84858922baa4377e91b5a35fea011a45d04a23d5aa0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-00000028-7c4cc341-c93c-4077-a541-31a8487482f0-tapdbad268a-40', 'timestamp': '2026-01-22T17:23:55.620834', 'resource_metadata': {'display_name': 'tempest-server-0-352913770', 'name': 'tapdbad268a-40', 'instance_id': '7c4cc341-c93c-4077-a541-31a8487482f0', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d2:a8:c8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdbad268a-40'}, 'message_id': '212df2e0-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.317432786, 'message_signature': '8ca29df977d234b3dd1ad749d3a7d6af273e8e0a302d61ea07c20564b7508e9e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-0000002a-c4708d03-a0cd-40d7-be06-e97b6a4b45b7-tap8c03d5ae-dc', 'timestamp': '2026-01-22T17:23:55.620834', 'resource_metadata': {'display_name': 'tempest-server-test-1395753074', 'name': 'tap8c03d5ae-dc', 'instance_id': 'c4708d03-a0cd-40d7-be06-e97b6a4b45b7', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4f:49:11', 'fref': 
None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8c03d5ae-dc'}, 'message_id': '212e0384-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.320920659, 'message_signature': '33470e75942a42cf86fe3833e00f7060d017ee57d2b6a7818c5e4375f78a085d'}]}, 'timestamp': '2026-01-22 17:23:55.622738', '_unique_id': '3d1dea6049414dff83f206f31074c40a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.623 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.625 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.627 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.650 183079 INFO nova.compute.manager [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Took 5.27 seconds to spawn the instance on the hypervisor.
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.651 183079 DEBUG nova.compute.manager [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.651 12 DEBUG ceilometer.compute.pollsters [-] 89784883-b435-428a-8936-a513f9e65fe0/disk.device.read.requests volume: 1161 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.672 12 DEBUG ceilometer.compute.pollsters [-] a7440e72-b977-4601-88ad-ce8a4c72e883/disk.device.read.requests volume: 1146 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.694 12 DEBUG ceilometer.compute.pollsters [-] 7c4cc341-c93c-4077-a541-31a8487482f0/disk.device.read.requests volume: 1162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.717 12 DEBUG ceilometer.compute.pollsters [-] c4708d03-a0cd-40d7-be06-e97b6a4b45b7/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8a010599-8b93-4855-bb67-951a113f8e27', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1161, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': '89784883-b435-428a-8936-a513f9e65fe0-vda', 'timestamp': '2026-01-22T17:23:55.625446', 'resource_metadata': {'display_name': 'tempest-server-test-1551955303', 'name': 'instance-00000029', 'instance_id': '89784883-b435-428a-8936-a513f9e65fe0', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '21329a20-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.387280302, 'message_signature': '580a576d2399e7f18ffe853347e6074dccd9c5e973b918e8d776a14f7766621b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1146, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 
'resource_id': 'a7440e72-b977-4601-88ad-ce8a4c72e883-vda', 'timestamp': '2026-01-22T17:23:55.625446', 'resource_metadata': {'display_name': 'tempest-server-test-1599866618', 'name': 'instance-00000027', 'instance_id': 'a7440e72-b977-4601-88ad-ce8a4c72e883', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2135c574-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.413464831, 'message_signature': '9668d01000be61fd9011cd8c6036162a8741d974474267b14dad3c89d1464b12'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1162, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': '7c4cc341-c93c-4077-a541-31a8487482f0-vda', 'timestamp': '2026-01-22T17:23:55.625446', 'resource_metadata': {'display_name': 'tempest-server-0-352913770', 'name': 'instance-00000028', 'instance_id': '7c4cc341-c93c-4077-a541-31a8487482f0', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 
'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '21390e32-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.434101393, 'message_signature': '7193a619a622084807993aca9ffb62ccd1660af6038a01d3e5a8cb09cb45013a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'c4708d03-a0cd-40d7-be06-e97b6a4b45b7-vda', 'timestamp': '2026-01-22T17:23:55.625446', 'resource_metadata': {'display_name': 'tempest-server-test-1395753074', 'name': 'instance-0000002a', 'instance_id': 'c4708d03-a0cd-40d7-be06-e97b6a4b45b7', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '213cb2a8-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.455558926, 'message_signature': '391c0638e6bb36b5762a25132edcab44ce3a45ecba57f65b59daa19c79ef3b5a'}]}, 'timestamp': '2026-01-22 17:23:55.719035', '_unique_id': '84af90570e9142d7a5274b0266234820'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.721 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.723 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.723 12 DEBUG ceilometer.compute.pollsters [-] 89784883-b435-428a-8936-a513f9e65fe0/disk.device.write.requests volume: 290 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.723 183079 INFO nova.compute.manager [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Took 5.74 seconds to build instance.
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.724 12 DEBUG ceilometer.compute.pollsters [-] a7440e72-b977-4601-88ad-ce8a4c72e883/disk.device.write.requests volume: 334 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.725 12 DEBUG ceilometer.compute.pollsters [-] 7c4cc341-c93c-4077-a541-31a8487482f0/disk.device.write.requests volume: 325 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.726 12 DEBUG ceilometer.compute.pollsters [-] c4708d03-a0cd-40d7-be06-e97b6a4b45b7/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '414a30de-9491-4ab7-8abb-0ca1e2bcbc06', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 290, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': '89784883-b435-428a-8936-a513f9e65fe0-vda', 'timestamp': '2026-01-22T17:23:55.723918', 'resource_metadata': {'display_name': 'tempest-server-test-1551955303', 'name': 'instance-00000029', 'instance_id': '89784883-b435-428a-8936-a513f9e65fe0', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '213d8d7c-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.387280302, 'message_signature': 'd4633158caca75411101de8660486cdbe693fb1bbd9ba1a4e2ac4a469351c4bd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 334, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 
'resource_id': 'a7440e72-b977-4601-88ad-ce8a4c72e883-vda', 'timestamp': '2026-01-22T17:23:55.723918', 'resource_metadata': {'display_name': 'tempest-server-test-1599866618', 'name': 'instance-00000027', 'instance_id': 'a7440e72-b977-4601-88ad-ce8a4c72e883', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '213dada2-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.413464831, 'message_signature': '75fd0705e951e505bea0b9dfcee7de8e11567eebe27ef192dbec179009f123fc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 325, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': '7c4cc341-c93c-4077-a541-31a8487482f0-vda', 'timestamp': '2026-01-22T17:23:55.723918', 'resource_metadata': {'display_name': 'tempest-server-0-352913770', 'name': 'instance-00000028', 'instance_id': '7c4cc341-c93c-4077-a541-31a8487482f0', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 
'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '213dcb8e-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.434101393, 'message_signature': '4f572fc551fe674fa984f799579b411a3c2a3b70967c754030181199fe38c531'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'c4708d03-a0cd-40d7-be06-e97b6a4b45b7-vda', 'timestamp': '2026-01-22T17:23:55.723918', 'resource_metadata': {'display_name': 'tempest-server-test-1395753074', 'name': 'instance-0000002a', 'instance_id': 'c4708d03-a0cd-40d7-be06-e97b6a4b45b7', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '213de3da-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.455558926, 'message_signature': '41940a95e94ea43fb5d2f02d2b5aa7d07ce7a138ba8f430b13f51c6bfec52794'}]}, 'timestamp': '2026-01-22 17:23:55.727017', '_unique_id': '9a13d170083246028a75ca44783f6898'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.728 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.737 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.738 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.738 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1551955303>, <NovaLikeServer: tempest-server-test-1599866618>, <NovaLikeServer: tempest-server-0-352913770>, <NovaLikeServer: tempest-server-test-1395753074>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1551955303>, <NovaLikeServer: tempest-server-test-1599866618>, <NovaLikeServer: tempest-server-0-352913770>, <NovaLikeServer: tempest-server-test-1395753074>]
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.738 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.738 12 DEBUG ceilometer.compute.pollsters [-] 89784883-b435-428a-8936-a513f9e65fe0/network.outgoing.bytes volume: 10446 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.738 12 DEBUG ceilometer.compute.pollsters [-] a7440e72-b977-4601-88ad-ce8a4c72e883/network.outgoing.bytes volume: 10616 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.739 12 DEBUG ceilometer.compute.pollsters [-] 7c4cc341-c93c-4077-a541-31a8487482f0/network.outgoing.bytes volume: 11596 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.739 12 DEBUG ceilometer.compute.pollsters [-] c4708d03-a0cd-40d7-be06-e97b6a4b45b7/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'afb80a68-895e-43c3-80a7-080f7a91f435', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10446, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000029-89784883-b435-428a-8936-a513f9e65fe0-tapb49f37fd-77', 'timestamp': '2026-01-22T17:23:55.738479', 'resource_metadata': {'display_name': 'tempest-server-test-1551955303', 'name': 'tapb49f37fd-77', 'instance_id': '89784883-b435-428a-8936-a513f9e65fe0', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:2f:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb49f37fd-77'}, 'message_id': '213fbf70-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.308406274, 'message_signature': '7be717fd674ed4df6bf90c231ef9093d429a03fb9e103d6229cd2ef5637dd558'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10616, 'user_id': 
'1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000027-a7440e72-b977-4601-88ad-ce8a4c72e883-tap6b312169-d5', 'timestamp': '2026-01-22T17:23:55.738479', 'resource_metadata': {'display_name': 'tempest-server-test-1599866618', 'name': 'tap6b312169-d5', 'instance_id': 'a7440e72-b977-4601-88ad-ce8a4c72e883', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:52:41', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b312169-d5'}, 'message_id': '213fcd12-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.313706616, 'message_signature': '6196456740a233c073ffa03da61a86035edc6d08b07524edc90eb2e1c2812aa3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11596, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-00000028-7c4cc341-c93c-4077-a541-31a8487482f0-tapdbad268a-40', 'timestamp': '2026-01-22T17:23:55.738479', 'resource_metadata': {'display_name': 'tempest-server-0-352913770', 'name': 'tapdbad268a-40', 'instance_id': '7c4cc341-c93c-4077-a541-31a8487482f0', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d2:a8:c8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdbad268a-40'}, 'message_id': '213fd88e-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.317432786, 'message_signature': 'a7867209599f53fb60aec1fc849368c134e7dcac075ded0690e58a794721764f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-0000002a-c4708d03-a0cd-40d7-be06-e97b6a4b45b7-tap8c03d5ae-dc', 'timestamp': '2026-01-22T17:23:55.738479', 'resource_metadata': {'display_name': 'tempest-server-test-1395753074', 'name': 'tap8c03d5ae-dc', 'instance_id': 'c4708d03-a0cd-40d7-be06-e97b6a4b45b7', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4f:49:11', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8c03d5ae-dc'}, 'message_id': '213fe68a-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.320920659, 'message_signature': '4ee59af92a5ccb37075d23f8a4bf70210fc2d33f2a684c37628128bcb7e64369'}]}, 'timestamp': '2026-01-22 17:23:55.739877', '_unique_id': '2c62d4f5591e4c499ebcf9b2736bebd5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.740 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.741 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.741 12 DEBUG ceilometer.compute.pollsters [-] 89784883-b435-428a-8936-a513f9e65fe0/disk.device.read.bytes volume: 31181312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.742 12 DEBUG ceilometer.compute.pollsters [-] a7440e72-b977-4601-88ad-ce8a4c72e883/disk.device.read.bytes volume: 31119872 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.742 12 DEBUG ceilometer.compute.pollsters [-] 7c4cc341-c93c-4077-a541-31a8487482f0/disk.device.read.bytes volume: 31468032 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.742 12 DEBUG ceilometer.compute.pollsters [-] c4708d03-a0cd-40d7-be06-e97b6a4b45b7/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f3cc16a-b4c1-44b8-81e6-d8d7801abb77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31181312, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': '89784883-b435-428a-8936-a513f9e65fe0-vda', 'timestamp': '2026-01-22T17:23:55.741921', 'resource_metadata': {'display_name': 'tempest-server-test-1551955303', 'name': 'instance-00000029', 'instance_id': '89784883-b435-428a-8936-a513f9e65fe0', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '214043b4-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.387280302, 'message_signature': '5f83b973ea35e3636355f1f2352fe66a83437108deae33dd9ee948881a670862'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31119872, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 
'a7440e72-b977-4601-88ad-ce8a4c72e883-vda', 'timestamp': '2026-01-22T17:23:55.741921', 'resource_metadata': {'display_name': 'tempest-server-test-1599866618', 'name': 'instance-00000027', 'instance_id': 'a7440e72-b977-4601-88ad-ce8a4c72e883', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '21405098-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.413464831, 'message_signature': '47611992e9ddf1d95aaf3e3f7261cdae3d3513332f132fb3afa54392555e1b5c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31468032, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': '7c4cc341-c93c-4077-a541-31a8487482f0-vda', 'timestamp': '2026-01-22T17:23:55.741921', 'resource_metadata': {'display_name': 'tempest-server-0-352913770', 'name': 'instance-00000028', 'instance_id': '7c4cc341-c93c-4077-a541-31a8487482f0', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '21405db8-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.434101393, 'message_signature': '647377b12c0e60b6acd8368e106ce8305f1c0dbbcae180f60a72b5c8df107706'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'c4708d03-a0cd-40d7-be06-e97b6a4b45b7-vda', 'timestamp': '2026-01-22T17:23:55.741921', 'resource_metadata': {'display_name': 'tempest-server-test-1395753074', 'name': 'instance-0000002a', 'instance_id': 'c4708d03-a0cd-40d7-be06-e97b6a4b45b7', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '214068d0-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.455558926, 'message_signature': 'eaf3778877b652bdc60c2b26d77e2caa5580c66cc6739e722e6f276f677a77e7'}]}, 'timestamp': '2026-01-22 17:23:55.743367', '_unique_id': 'a805469b5efb43faa7568cb5d858b31d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.744 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.745 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.745 12 DEBUG ceilometer.compute.pollsters [-] 89784883-b435-428a-8936-a513f9e65fe0/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.745 12 DEBUG ceilometer.compute.pollsters [-] a7440e72-b977-4601-88ad-ce8a4c72e883/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.745 12 DEBUG ceilometer.compute.pollsters [-] 7c4cc341-c93c-4077-a541-31a8487482f0/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.746 12 DEBUG ceilometer.compute.pollsters [-] c4708d03-a0cd-40d7-be06-e97b6a4b45b7/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0faf581-8d6a-4df9-8228-164380f3a9f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000029-89784883-b435-428a-8936-a513f9e65fe0-tapb49f37fd-77', 'timestamp': '2026-01-22T17:23:55.745289', 'resource_metadata': {'display_name': 'tempest-server-test-1551955303', 'name': 'tapb49f37fd-77', 'instance_id': '89784883-b435-428a-8936-a513f9e65fe0', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:2f:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb49f37fd-77'}, 'message_id': '2140c708-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.308406274, 'message_signature': 'b7ddcb6653439d461e346028606ac0f2283b5300c1ae720d3e76fa1886ece067'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000027-a7440e72-b977-4601-88ad-ce8a4c72e883-tap6b312169-d5', 'timestamp': '2026-01-22T17:23:55.745289', 'resource_metadata': {'display_name': 'tempest-server-test-1599866618', 'name': 'tap6b312169-d5', 'instance_id': 'a7440e72-b977-4601-88ad-ce8a4c72e883', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:52:41', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b312169-d5'}, 'message_id': '2140d522-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.313706616, 'message_signature': '921f94da2b806e4180b0da66e03485fbade2ceef6a1172ffc36a15062c985ed9'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-00000028-7c4cc341-c93c-4077-a541-31a8487482f0-tapdbad268a-40', 'timestamp': '2026-01-22T17:23:55.745289', 'resource_metadata': {'display_name': 'tempest-server-0-352913770', 'name': 'tapdbad268a-40', 'instance_id': '7c4cc341-c93c-4077-a541-31a8487482f0', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d2:a8:c8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdbad268a-40'}, 'message_id': '2140e08a-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.317432786, 'message_signature': '965e9f638d9532602fec0547727d8ca57bdd36924816caa9e01394a340ef4c36'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-0000002a-c4708d03-a0cd-40d7-be06-e97b6a4b45b7-tap8c03d5ae-dc', 'timestamp': '2026-01-22T17:23:55.745289', 'resource_metadata': {'display_name': 'tempest-server-test-1395753074', 'name': 'tap8c03d5ae-dc', 'instance_id': 'c4708d03-a0cd-40d7-be06-e97b6a4b45b7', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4f:49:11', 'fref': 
None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8c03d5ae-dc'}, 'message_id': '2140eb98-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.320920659, 'message_signature': '806a59e071225e273a9bd641381b0df51c54678362acf1ff59e9e4be8b826864'}]}, 'timestamp': '2026-01-22 17:23:55.746539', '_unique_id': '966716a392bc47a688019df25eb20190'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.747 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.748 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.748 12 DEBUG ceilometer.compute.pollsters [-] 89784883-b435-428a-8936-a513f9e65fe0/disk.device.write.bytes volume: 73011200 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.748 12 DEBUG ceilometer.compute.pollsters [-] a7440e72-b977-4601-88ad-ce8a4c72e883/disk.device.write.bytes volume: 73134080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.748 12 DEBUG ceilometer.compute.pollsters [-] 7c4cc341-c93c-4077-a541-31a8487482f0/disk.device.write.bytes volume: 72986624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.749 12 DEBUG ceilometer.compute.pollsters [-] c4708d03-a0cd-40d7-be06-e97b6a4b45b7/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fbccf13e-da73-42f5-b1b8-243cd3555ccd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73011200, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': '89784883-b435-428a-8936-a513f9e65fe0-vda', 'timestamp': '2026-01-22T17:23:55.748243', 'resource_metadata': {'display_name': 'tempest-server-test-1551955303', 'name': 'instance-00000029', 'instance_id': '89784883-b435-428a-8936-a513f9e65fe0', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '21413a44-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.387280302, 'message_signature': '0090661a18dce601c47c28c0674cffba7ef34dd3e372987a90d07cb8cea38ed4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73134080, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 
'a7440e72-b977-4601-88ad-ce8a4c72e883-vda', 'timestamp': '2026-01-22T17:23:55.748243', 'resource_metadata': {'display_name': 'tempest-server-test-1599866618', 'name': 'instance-00000027', 'instance_id': 'a7440e72-b977-4601-88ad-ce8a4c72e883', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '214146f6-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.413464831, 'message_signature': 'b3fb6b3e63636ade7cdf29e3a9e15d84dd9e7e74259c9f04e817ec614b09e1d5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72986624, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': '7c4cc341-c93c-4077-a541-31a8487482f0-vda', 'timestamp': '2026-01-22T17:23:55.748243', 'resource_metadata': {'display_name': 'tempest-server-0-352913770', 'name': 'instance-00000028', 'instance_id': '7c4cc341-c93c-4077-a541-31a8487482f0', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '21415362-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.434101393, 'message_signature': 'bd092768657384026b713eb09962fd3e7b3d6afe75e39619fd952d1b15dc521a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'c4708d03-a0cd-40d7-be06-e97b6a4b45b7-vda', 'timestamp': '2026-01-22T17:23:55.748243', 'resource_metadata': {'display_name': 'tempest-server-test-1395753074', 'name': 'instance-0000002a', 'instance_id': 'c4708d03-a0cd-40d7-be06-e97b6a4b45b7', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '21416384-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.455558926, 'message_signature': 'a03ef6d019f7ea8b5a8d0dee0a6352cb7cb04ab6b3e37b74a6c5dff159eb5abd'}]}, 'timestamp': '2026-01-22 17:23:55.749616', '_unique_id': '89ed510166754c9ea17db953c995157a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.750 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.751 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.751 12 DEBUG ceilometer.compute.pollsters [-] 89784883-b435-428a-8936-a513f9e65fe0/disk.device.write.latency volume: 27518965048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.751 12 DEBUG ceilometer.compute.pollsters [-] a7440e72-b977-4601-88ad-ce8a4c72e883/disk.device.write.latency volume: 63843941664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.752 12 DEBUG ceilometer.compute.pollsters [-] 7c4cc341-c93c-4077-a541-31a8487482f0/disk.device.write.latency volume: 27013084168 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.752 12 DEBUG ceilometer.compute.pollsters [-] c4708d03-a0cd-40d7-be06-e97b6a4b45b7/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7880d8e1-a395-430c-83ea-72517f588465', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 27518965048, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': '89784883-b435-428a-8936-a513f9e65fe0-vda', 'timestamp': '2026-01-22T17:23:55.751389', 'resource_metadata': {'display_name': 'tempest-server-test-1551955303', 'name': 'instance-00000029', 'instance_id': '89784883-b435-428a-8936-a513f9e65fe0', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2141b50a-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.387280302, 'message_signature': '1603ea04c10ad9ea1e44f54848eef7c35c8f238aef7f7b21354f03debe9d5c9b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 63843941664, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 
'resource_id': 'a7440e72-b977-4601-88ad-ce8a4c72e883-vda', 'timestamp': '2026-01-22T17:23:55.751389', 'resource_metadata': {'display_name': 'tempest-server-test-1599866618', 'name': 'instance-00000027', 'instance_id': 'a7440e72-b977-4601-88ad-ce8a4c72e883', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2141c2b6-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.413464831, 'message_signature': 'a4fa62cc145d81a8e0b85c0d23bfcc8f11ea0a08f99e823232d1a3f1f551f4a2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 27013084168, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': '7c4cc341-c93c-4077-a541-31a8487482f0-vda', 'timestamp': '2026-01-22T17:23:55.751389', 'resource_metadata': {'display_name': 'tempest-server-0-352913770', 'name': 'instance-00000028', 'instance_id': '7c4cc341-c93c-4077-a541-31a8487482f0', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 
'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2141cd60-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.434101393, 'message_signature': '3814700238ee856177c523081416648d453f3e928f1e5f43636cd3ff52c0c7d3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'c4708d03-a0cd-40d7-be06-e97b6a4b45b7-vda', 'timestamp': '2026-01-22T17:23:55.751389', 'resource_metadata': {'display_name': 'tempest-server-test-1395753074', 'name': 'instance-0000002a', 'instance_id': 'c4708d03-a0cd-40d7-be06-e97b6a4b45b7', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2141d85a-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.455558926, 'message_signature': '354f2538f8f3471e2a10a121dfc86c3f5a38511ebaff7ac1af81bd1ebb01bc52'}]}, 'timestamp': '2026-01-22 17:23:55.752592', '_unique_id': '3559b7ecddee4906a61ffa1c57b5c027'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.753 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.754 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.754 12 DEBUG ceilometer.compute.pollsters [-] 89784883-b435-428a-8936-a513f9e65fe0/disk.device.allocation volume: 30220288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.754 12 DEBUG ceilometer.compute.pollsters [-] a7440e72-b977-4601-88ad-ce8a4c72e883/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.755 12 DEBUG ceilometer.compute.pollsters [-] 7c4cc341-c93c-4077-a541-31a8487482f0/disk.device.allocation volume: 30220288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.755 12 DEBUG ceilometer.compute.pollsters [-] c4708d03-a0cd-40d7-be06-e97b6a4b45b7/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 nova_compute[183075]: 2026-01-22 17:23:55.755 183079 DEBUG oslo_concurrency.lockutils [None req-4a81a00d-70a0-49ca-b821-5d54d8d13690 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "c4708d03-a0cd-40d7-be06-e97b6a4b45b7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.835s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '55a99f85-3d9e-49ca-a68e-7ccf98ae07e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30220288, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': '89784883-b435-428a-8936-a513f9e65fe0-vda', 'timestamp': '2026-01-22T17:23:55.754319', 'resource_metadata': {'display_name': 'tempest-server-test-1551955303', 'name': 'instance-00000029', 'instance_id': '89784883-b435-428a-8936-a513f9e65fe0', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '21422788-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.335642462, 'message_signature': '843876ab1a31b760ba833d06d26026a74ab7b60e7f2e8311a0b56c70044312e4'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 
'a7440e72-b977-4601-88ad-ce8a4c72e883-vda', 'timestamp': '2026-01-22T17:23:55.754319', 'resource_metadata': {'display_name': 'tempest-server-test-1599866618', 'name': 'instance-00000027', 'instance_id': 'a7440e72-b977-4601-88ad-ce8a4c72e883', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2142376e-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.343847081, 'message_signature': 'a508efca5d2090ed869e012b58d0ec11f341ea6f11eedd83f38ab8e68086a4d1'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30220288, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': '7c4cc341-c93c-4077-a541-31a8487482f0-vda', 'timestamp': '2026-01-22T17:23:55.754319', 'resource_metadata': {'display_name': 'tempest-server-0-352913770', 'name': 'instance-00000028', 'instance_id': '7c4cc341-c93c-4077-a541-31a8487482f0', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '21424286-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.353766006, 'message_signature': '76ed6772d87d321c68918a04b7df363e62c6bd344a7203536a3525ec3fe60b6c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'c4708d03-a0cd-40d7-be06-e97b6a4b45b7-vda', 'timestamp': '2026-01-22T17:23:55.754319', 'resource_metadata': {'display_name': 'tempest-server-test-1395753074', 'name': 'instance-0000002a', 'instance_id': 'c4708d03-a0cd-40d7-be06-e97b6a4b45b7', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '21424cf4-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.362431158, 'message_signature': '6a6044ea0a38c45cc5eb56e98bfbc378c6fd361bd238a33d05856ef35a65bc17'}]}, 'timestamp': '2026-01-22 17:23:55.755594', '_unique_id': 'c450e098c7ef4bddbbad0c0a8c8d0e4a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.756 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.757 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.757 12 DEBUG ceilometer.compute.pollsters [-] 89784883-b435-428a-8936-a513f9e65fe0/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.757 12 DEBUG ceilometer.compute.pollsters [-] a7440e72-b977-4601-88ad-ce8a4c72e883/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.758 12 DEBUG ceilometer.compute.pollsters [-] 7c4cc341-c93c-4077-a541-31a8487482f0/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.758 12 DEBUG ceilometer.compute.pollsters [-] c4708d03-a0cd-40d7-be06-e97b6a4b45b7/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd86a1b2-085b-4163-b71e-878a28bb3e2e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000029-89784883-b435-428a-8936-a513f9e65fe0-tapb49f37fd-77', 'timestamp': '2026-01-22T17:23:55.757348', 'resource_metadata': {'display_name': 'tempest-server-test-1551955303', 'name': 'tapb49f37fd-77', 'instance_id': '89784883-b435-428a-8936-a513f9e65fe0', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:2f:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb49f37fd-77'}, 'message_id': '21429df8-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.308406274, 'message_signature': 'fc08ee63220e20c4c7fda294fa5955aa4dfb449f06c33bbb9b00d3bcb44e955d'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000027-a7440e72-b977-4601-88ad-ce8a4c72e883-tap6b312169-d5', 'timestamp': '2026-01-22T17:23:55.757348', 'resource_metadata': {'display_name': 'tempest-server-test-1599866618', 'name': 'tap6b312169-d5', 'instance_id': 'a7440e72-b977-4601-88ad-ce8a4c72e883', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:52:41', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b312169-d5'}, 'message_id': '2142aaa0-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.313706616, 'message_signature': '6d567e90929db93c79898a9acde4834715c29cae4c9e1fe65fa757546e409faa'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-00000028-7c4cc341-c93c-4077-a541-31a8487482f0-tapdbad268a-40', 'timestamp': '2026-01-22T17:23:55.757348', 'resource_metadata': {'display_name': 'tempest-server-0-352913770', 'name': 'tapdbad268a-40', 'instance_id': '7c4cc341-c93c-4077-a541-31a8487482f0', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d2:a8:c8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdbad268a-40'}, 'message_id': '2142b680-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.317432786, 'message_signature': '60afd40977aaed5e13ac94a1c71d0f27c353f779c820e19fea4e1d6ad6678d26'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-0000002a-c4708d03-a0cd-40d7-be06-e97b6a4b45b7-tap8c03d5ae-dc', 'timestamp': '2026-01-22T17:23:55.757348', 'resource_metadata': {'display_name': 'tempest-server-test-1395753074', 'name': 'tap8c03d5ae-dc', 'instance_id': 'c4708d03-a0cd-40d7-be06-e97b6a4b45b7', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4f:49:11', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8c03d5ae-dc'}, 'message_id': '2142c184-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.320920659, 'message_signature': '9e27e6a9e7765269d7ad1da155d7ace8f647ab867c3d69405a64b24e9136a2e2'}]}, 'timestamp': '2026-01-22 17:23:55.758570', '_unique_id': 'e48858e6c9d64fd786accd387b5dcb17'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.759 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.760 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.760 12 DEBUG ceilometer.compute.pollsters [-] 89784883-b435-428a-8936-a513f9e65fe0/memory.usage volume: 42.76953125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.760 12 DEBUG ceilometer.compute.pollsters [-] a7440e72-b977-4601-88ad-ce8a4c72e883/memory.usage volume: 42.328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.760 12 DEBUG ceilometer.compute.pollsters [-] 7c4cc341-c93c-4077-a541-31a8487482f0/memory.usage volume: 43.32421875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.761 12 DEBUG ceilometer.compute.pollsters [-] c4708d03-a0cd-40d7-be06-e97b6a4b45b7/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.761 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance c4708d03-a0cd-40d7-be06-e97b6a4b45b7: ceilometer.compute.pollsters.NoVolumeException
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'acf4de1a-da09-4700-9744-90533e40421e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.76953125, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': '89784883-b435-428a-8936-a513f9e65fe0', 'timestamp': '2026-01-22T17:23:55.760287', 'resource_metadata': {'display_name': 'tempest-server-test-1551955303', 'name': 'instance-00000029', 'instance_id': '89784883-b435-428a-8936-a513f9e65fe0', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '21431116-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.239966096, 'message_signature': '9feac1df845ac96183de72417f24b172c7230e115f14e123d0b16abbf93ca8ee'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.328125, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'a7440e72-b977-4601-88ad-ce8a4c72e883', 'timestamp': 
'2026-01-22T17:23:55.760287', 'resource_metadata': {'display_name': 'tempest-server-test-1599866618', 'name': 'instance-00000027', 'instance_id': 'a7440e72-b977-4601-88ad-ce8a4c72e883', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '21431d0a-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.257803213, 'message_signature': '9261847bbae0ce51ce866211567fd21e84da9025051906fc8e2c252d3eaaf2fa'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.32421875, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': '7c4cc341-c93c-4077-a541-31a8487482f0', 'timestamp': '2026-01-22T17:23:55.760287', 'resource_metadata': {'display_name': 'tempest-server-0-352913770', 'name': 'instance-00000028', 'instance_id': '7c4cc341-c93c-4077-a541-31a8487482f0', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 
'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '21432926-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.274296303, 'message_signature': 'e27e14c6cf681ee54d50b5fbcb9a74fc00c3cf3bf185acc41a3d18a1ef61e062'}]}, 'timestamp': '2026-01-22 17:23:55.761448', '_unique_id': '97b775ee7f8f4cf29c6941e1b0411f1a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.762 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.763 12 DEBUG ceilometer.compute.pollsters [-] 89784883-b435-428a-8936-a513f9e65fe0/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.763 12 DEBUG ceilometer.compute.pollsters [-] a7440e72-b977-4601-88ad-ce8a4c72e883/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.763 12 DEBUG ceilometer.compute.pollsters [-] 7c4cc341-c93c-4077-a541-31a8487482f0/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.763 12 DEBUG ceilometer.compute.pollsters [-] c4708d03-a0cd-40d7-be06-e97b6a4b45b7/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '135761b9-e81a-44eb-be23-d2c5ed4ba4f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000029-89784883-b435-428a-8936-a513f9e65fe0-tapb49f37fd-77', 'timestamp': '2026-01-22T17:23:55.763069', 'resource_metadata': {'display_name': 'tempest-server-test-1551955303', 'name': 'tapb49f37fd-77', 'instance_id': '89784883-b435-428a-8936-a513f9e65fe0', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:2f:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb49f37fd-77'}, 'message_id': '21437de0-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.308406274, 'message_signature': '2f3b7c82a630ce2d06200f8e26c3c5e5a4adcec14e5b2b91dd67050867797cb5'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000027-a7440e72-b977-4601-88ad-ce8a4c72e883-tap6b312169-d5', 'timestamp': '2026-01-22T17:23:55.763069', 'resource_metadata': {'display_name': 'tempest-server-test-1599866618', 'name': 'tap6b312169-d5', 'instance_id': 'a7440e72-b977-4601-88ad-ce8a4c72e883', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:52:41', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b312169-d5'}, 'message_id': '2143897a-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.313706616, 'message_signature': '630a26dbee0d2b589b763e0de20811b1b253c41fd34c2fedb9e9dcd4e570debd'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-00000028-7c4cc341-c93c-4077-a541-31a8487482f0-tapdbad268a-40', 'timestamp': '2026-01-22T17:23:55.763069', 'resource_metadata': {'display_name': 'tempest-server-0-352913770', 'name': 'tapdbad268a-40', 'instance_id': '7c4cc341-c93c-4077-a541-31a8487482f0', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d2:a8:c8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdbad268a-40'}, 'message_id': '21439596-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.317432786, 'message_signature': 'b2277ad35a90e5c28d773d356002380067c744aad8a664c9687f652c9e590e0e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-0000002a-c4708d03-a0cd-40d7-be06-e97b6a4b45b7-tap8c03d5ae-dc', 'timestamp': '2026-01-22T17:23:55.763069', 'resource_metadata': {'display_name': 'tempest-server-test-1395753074', 'name': 'tap8c03d5ae-dc', 'instance_id': 'c4708d03-a0cd-40d7-be06-e97b6a4b45b7', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4f:49:11', 'fref': 
None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8c03d5ae-dc'}, 'message_id': '2143a0f4-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.320920659, 'message_signature': '35842f15998e6e6c10f3456034e22cd03dbbe6403ae32b6ba3f5e59d96b3dce3'}]}, 'timestamp': '2026-01-22 17:23:55.764291', '_unique_id': '7125483e46d248c5aee90d03ce7fedda'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.765 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.766 12 DEBUG ceilometer.compute.pollsters [-] 89784883-b435-428a-8936-a513f9e65fe0/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.766 12 DEBUG ceilometer.compute.pollsters [-] a7440e72-b977-4601-88ad-ce8a4c72e883/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.766 12 DEBUG ceilometer.compute.pollsters [-] 7c4cc341-c93c-4077-a541-31a8487482f0/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.766 12 DEBUG ceilometer.compute.pollsters [-] c4708d03-a0cd-40d7-be06-e97b6a4b45b7/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a7f3b1dd-62b4-4163-bede-c51db04549d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': '89784883-b435-428a-8936-a513f9e65fe0-vda', 'timestamp': '2026-01-22T17:23:55.766006', 'resource_metadata': {'display_name': 'tempest-server-test-1551955303', 'name': 'instance-00000029', 'instance_id': '89784883-b435-428a-8936-a513f9e65fe0', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2143f004-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.335642462, 'message_signature': '654d3289795a417d7566b86a93727d41673248201ae709fc00079fb544a97b21'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 
'a7440e72-b977-4601-88ad-ce8a4c72e883-vda', 'timestamp': '2026-01-22T17:23:55.766006', 'resource_metadata': {'display_name': 'tempest-server-test-1599866618', 'name': 'instance-00000027', 'instance_id': 'a7440e72-b977-4601-88ad-ce8a4c72e883', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2143fb9e-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.343847081, 'message_signature': 'cba9c6b3657395cc732a252a7ff2f9b1274a250f5c8d3da18edcf8cda7738087'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': '7c4cc341-c93c-4077-a541-31a8487482f0-vda', 'timestamp': '2026-01-22T17:23:55.766006', 'resource_metadata': {'display_name': 'tempest-server-0-352913770', 'name': 'instance-00000028', 'instance_id': '7c4cc341-c93c-4077-a541-31a8487482f0', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '214407d8-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.353766006, 'message_signature': '024e01ea2c0fcf6c83907144c76130e46c97eda614cfddd078791659c1ac47ee'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'c4708d03-a0cd-40d7-be06-e97b6a4b45b7-vda', 'timestamp': '2026-01-22T17:23:55.766006', 'resource_metadata': {'display_name': 'tempest-server-test-1395753074', 'name': 'instance-0000002a', 'instance_id': 'c4708d03-a0cd-40d7-be06-e97b6a4b45b7', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '21441250-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.362431158, 'message_signature': '98270957a5bff3b03594b109afff537515b293fc8488c15b180ce446f749a2f6'}]}, 'timestamp': '2026-01-22 17:23:55.767181', '_unique_id': '02ac67d29d894532a7991c5d1708cb64'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.767 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.768 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.768 12 DEBUG ceilometer.compute.pollsters [-] 89784883-b435-428a-8936-a513f9e65fe0/network.incoming.bytes volume: 7388 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.769 12 DEBUG ceilometer.compute.pollsters [-] a7440e72-b977-4601-88ad-ce8a4c72e883/network.incoming.bytes volume: 7579 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.769 12 DEBUG ceilometer.compute.pollsters [-] 7c4cc341-c93c-4077-a541-31a8487482f0/network.incoming.bytes volume: 7202 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.769 12 DEBUG ceilometer.compute.pollsters [-] c4708d03-a0cd-40d7-be06-e97b6a4b45b7/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8a78f1a3-c9b5-4c4a-9821-990a4eb9e7ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7388, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000029-89784883-b435-428a-8936-a513f9e65fe0-tapb49f37fd-77', 'timestamp': '2026-01-22T17:23:55.768927', 'resource_metadata': {'display_name': 'tempest-server-test-1551955303', 'name': 'tapb49f37fd-77', 'instance_id': '89784883-b435-428a-8936-a513f9e65fe0', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:2f:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb49f37fd-77'}, 'message_id': '21446278-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.308406274, 'message_signature': 'e3b6149fe9e1e70ae9907f17702dfa4c1f64dfb5cb85f2544584a46d34276dc0'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7579, 'user_id': 
'1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000027-a7440e72-b977-4601-88ad-ce8a4c72e883-tap6b312169-d5', 'timestamp': '2026-01-22T17:23:55.768927', 'resource_metadata': {'display_name': 'tempest-server-test-1599866618', 'name': 'tap6b312169-d5', 'instance_id': 'a7440e72-b977-4601-88ad-ce8a4c72e883', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:52:41', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b312169-d5'}, 'message_id': '21446ebc-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.313706616, 'message_signature': 'a26c09492953381d0ac9fee80339cf0fea5895103e55f855a8ecf15bfcbd9b46'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7202, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-00000028-7c4cc341-c93c-4077-a541-31a8487482f0-tapdbad268a-40', 'timestamp': '2026-01-22T17:23:55.768927', 'resource_metadata': {'display_name': 'tempest-server-0-352913770', 'name': 'tapdbad268a-40', 'instance_id': '7c4cc341-c93c-4077-a541-31a8487482f0', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d2:a8:c8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdbad268a-40'}, 'message_id': '21447ae2-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.317432786, 'message_signature': '00499c52c4e5fef3b2aec32cf32355e3b7a6967f8d4ed11f70dcbc1f8043dfc4'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-0000002a-c4708d03-a0cd-40d7-be06-e97b6a4b45b7-tap8c03d5ae-dc', 'timestamp': '2026-01-22T17:23:55.768927', 'resource_metadata': {'display_name': 'tempest-server-test-1395753074', 'name': 'tap8c03d5ae-dc', 'instance_id': 'c4708d03-a0cd-40d7-be06-e97b6a4b45b7', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4f:49:11', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8c03d5ae-dc'}, 'message_id': '2144869a-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.320920659, 'message_signature': '7f4bc40e6ec4818f966561e7af169356fb54c509244838609e0115b5b81cd83c'}]}, 'timestamp': '2026-01-22 17:23:55.770168', '_unique_id': 'd440b07215184b5f8041eca8c52e4c37'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.770 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.771 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.771 12 DEBUG ceilometer.compute.pollsters [-] 89784883-b435-428a-8936-a513f9e65fe0/disk.device.read.latency volume: 229100652 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.772 12 DEBUG ceilometer.compute.pollsters [-] a7440e72-b977-4601-88ad-ce8a4c72e883/disk.device.read.latency volume: 231203765 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.772 12 DEBUG ceilometer.compute.pollsters [-] 7c4cc341-c93c-4077-a541-31a8487482f0/disk.device.read.latency volume: 251057876 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.772 12 DEBUG ceilometer.compute.pollsters [-] c4708d03-a0cd-40d7-be06-e97b6a4b45b7/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b3edd8f-a38d-45ee-afaf-158a27161756', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 229100652, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': '89784883-b435-428a-8936-a513f9e65fe0-vda', 'timestamp': '2026-01-22T17:23:55.771893', 'resource_metadata': {'display_name': 'tempest-server-test-1551955303', 'name': 'instance-00000029', 'instance_id': '89784883-b435-428a-8936-a513f9e65fe0', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2144d5fa-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.387280302, 'message_signature': 'e018e4ac7740e50c90d3609ed23ddf684c381dac5ad600aa54db57dd6e55fa86'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 231203765, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 
'resource_id': 'a7440e72-b977-4601-88ad-ce8a4c72e883-vda', 'timestamp': '2026-01-22T17:23:55.771893', 'resource_metadata': {'display_name': 'tempest-server-test-1599866618', 'name': 'instance-00000027', 'instance_id': 'a7440e72-b977-4601-88ad-ce8a4c72e883', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2144e14e-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.413464831, 'message_signature': 'ba19aaa5b4293dd22c3c773e3ecc6842476668517f32c3d41e53178b2896993c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 251057876, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': '7c4cc341-c93c-4077-a541-31a8487482f0-vda', 'timestamp': '2026-01-22T17:23:55.771893', 'resource_metadata': {'display_name': 'tempest-server-0-352913770', 'name': 'instance-00000028', 'instance_id': '7c4cc341-c93c-4077-a541-31a8487482f0', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 
'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2144ed38-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.434101393, 'message_signature': 'b2aa347fea1592ee0ed77d8d57d1528848ccd60a03913407344cae57e1814de4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'c4708d03-a0cd-40d7-be06-e97b6a4b45b7-vda', 'timestamp': '2026-01-22T17:23:55.771893', 'resource_metadata': {'display_name': 'tempest-server-test-1395753074', 'name': 'instance-0000002a', 'instance_id': 'c4708d03-a0cd-40d7-be06-e97b6a4b45b7', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2144f8c8-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.455558926, 'message_signature': 'b47892b7372c46a92c2e8da752856112c4f55096d311a60f5013861ba538bb2b'}]}, 'timestamp': '2026-01-22 17:23:55.773084', '_unique_id': 'a0aa8967d4cb426f928936f8f3e989e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.773 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.774 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.774 12 DEBUG ceilometer.compute.pollsters [-] 89784883-b435-428a-8936-a513f9e65fe0/network.incoming.packets volume: 64 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.775 12 DEBUG ceilometer.compute.pollsters [-] a7440e72-b977-4601-88ad-ce8a4c72e883/network.incoming.packets volume: 67 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.775 12 DEBUG ceilometer.compute.pollsters [-] 7c4cc341-c93c-4077-a541-31a8487482f0/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.775 12 DEBUG ceilometer.compute.pollsters [-] c4708d03-a0cd-40d7-be06-e97b6a4b45b7/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a635816c-1c2e-4e09-8d8a-47d5c7fa8d14', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 64, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000029-89784883-b435-428a-8936-a513f9e65fe0-tapb49f37fd-77', 'timestamp': '2026-01-22T17:23:55.774759', 'resource_metadata': {'display_name': 'tempest-server-test-1551955303', 'name': 'tapb49f37fd-77', 'instance_id': '89784883-b435-428a-8936-a513f9e65fe0', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:2f:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb49f37fd-77'}, 'message_id': '21454e86-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.308406274, 'message_signature': '074495759599c1ed8ffc4549d5a2fc00cf9ee7a352691eb9166d0074f63fd482'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 67, 'user_id': 
'1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000027-a7440e72-b977-4601-88ad-ce8a4c72e883-tap6b312169-d5', 'timestamp': '2026-01-22T17:23:55.774759', 'resource_metadata': {'display_name': 'tempest-server-test-1599866618', 'name': 'tap6b312169-d5', 'instance_id': 'a7440e72-b977-4601-88ad-ce8a4c72e883', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:52:41', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b312169-d5'}, 'message_id': '21455a7a-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.313706616, 'message_signature': '6fb4292c76a1d7c383d339bd6a8bb8b908af4af474d818e95b6b06b25319865e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-00000028-7c4cc341-c93c-4077-a541-31a8487482f0-tapdbad268a-40', 'timestamp': '2026-01-22T17:23:55.774759', 'resource_metadata': {'display_name': 'tempest-server-0-352913770', 'name': 'tapdbad268a-40', 'instance_id': '7c4cc341-c93c-4077-a541-31a8487482f0', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d2:a8:c8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdbad268a-40'}, 'message_id': '2145675e-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.317432786, 'message_signature': '2a35ed30b9ecd27d632e07c1d160c4e7b3b0f12cf391748e048cf075625cf00c'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-0000002a-c4708d03-a0cd-40d7-be06-e97b6a4b45b7-tap8c03d5ae-dc', 'timestamp': '2026-01-22T17:23:55.774759', 'resource_metadata': {'display_name': 'tempest-server-test-1395753074', 'name': 'tap8c03d5ae-dc', 'instance_id': 'c4708d03-a0cd-40d7-be06-e97b6a4b45b7', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4f:49:11', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8c03d5ae-dc'}, 'message_id': '2145728a-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.320920659, 'message_signature': 'd3d6bf31eb9be350a7b851532efcc4cdbe02ba5d1ad6bc5f8e2809525e981657'}]}, 'timestamp': '2026-01-22 17:23:55.776206', '_unique_id': '1fdac18094f34c70a465f88196b74775'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.777 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.778 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-1551955303>, <NovaLikeServer: tempest-server-test-1599866618>, <NovaLikeServer: tempest-server-0-352913770>, <NovaLikeServer: tempest-server-test-1395753074>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1551955303>, <NovaLikeServer: tempest-server-test-1599866618>, <NovaLikeServer: tempest-server-0-352913770>, <NovaLikeServer: tempest-server-test-1395753074>]
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.778 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.778 12 DEBUG ceilometer.compute.pollsters [-] 89784883-b435-428a-8936-a513f9e65fe0/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.778 12 DEBUG ceilometer.compute.pollsters [-] a7440e72-b977-4601-88ad-ce8a4c72e883/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.778 12 DEBUG ceilometer.compute.pollsters [-] 7c4cc341-c93c-4077-a541-31a8487482f0/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.779 12 DEBUG ceilometer.compute.pollsters [-] c4708d03-a0cd-40d7-be06-e97b6a4b45b7/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c2cd1f75-308b-45ca-a5a5-1924ba9babdd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000029-89784883-b435-428a-8936-a513f9e65fe0-tapb49f37fd-77', 'timestamp': '2026-01-22T17:23:55.778357', 'resource_metadata': {'display_name': 'tempest-server-test-1551955303', 'name': 'tapb49f37fd-77', 'instance_id': '89784883-b435-428a-8936-a513f9e65fe0', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:2f:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb49f37fd-77'}, 'message_id': '2145d284-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.308406274, 'message_signature': '5753b57ef4f20c3b54a8201db2dc73277ca9657046580fd1ea5187c15a023fd5'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-00000027-a7440e72-b977-4601-88ad-ce8a4c72e883-tap6b312169-d5', 'timestamp': '2026-01-22T17:23:55.778357', 'resource_metadata': {'display_name': 'tempest-server-test-1599866618', 'name': 'tap6b312169-d5', 'instance_id': 'a7440e72-b977-4601-88ad-ce8a4c72e883', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:52:41', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b312169-d5'}, 'message_id': '2145df54-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.313706616, 'message_signature': '806c49c65153a5023f1c32ab52e6c70168d39fafb228a9c956a3764162050ecd'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-00000028-7c4cc341-c93c-4077-a541-31a8487482f0-tapdbad268a-40', 'timestamp': '2026-01-22T17:23:55.778357', 'resource_metadata': {'display_name': 'tempest-server-0-352913770', 'name': 'tapdbad268a-40', 'instance_id': '7c4cc341-c93c-4077-a541-31a8487482f0', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d2:a8:c8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdbad268a-40'}, 'message_id': '2145ea6c-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.317432786, 'message_signature': '96a62b05d50faaa20642213ca1f2975c762fd084285f37f03833a49d4308c04e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_name': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_name': None, 'resource_id': 'instance-0000002a-c4708d03-a0cd-40d7-be06-e97b6a4b45b7-tap8c03d5ae-dc', 'timestamp': '2026-01-22T17:23:55.778357', 'resource_metadata': {'display_name': 'tempest-server-test-1395753074', 'name': 'tap8c03d5ae-dc', 'instance_id': 'c4708d03-a0cd-40d7-be06-e97b6a4b45b7', 'instance_type': 'm1.nano', 'host': 'c395ce7f1cab7b0698ef9ed70b7a78f91427b8e516dcfd9f7fa53dec', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4f:49:11', 'fref': 
None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8c03d5ae-dc'}, 'message_id': '2145f548-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 4999.320920659, 'message_signature': 'ab59290b53f0bc7f79bea8c4cca8bbfc5c2a028e26cee64bd4be5bec32b71f84'}]}, 'timestamp': '2026-01-22 17:23:55.779555', '_unique_id': 'e3b131e6695e4eea9f33506ab440489c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:23:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:23:55.780 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:23:56 compute-0 nova_compute[183075]: 2026-01-22 17:23:56.483 183079 DEBUG nova.network.neutron [req-418324c7-e253-4686-a7c5-75c944e43877 req-02e3ce01-cce6-4b7d-a3fe-345bd9747737 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Updated VIF entry in instance network info cache for port 8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:23:56 compute-0 nova_compute[183075]: 2026-01-22 17:23:56.483 183079 DEBUG nova.network.neutron [req-418324c7-e253-4686-a7c5-75c944e43877 req-02e3ce01-cce6-4b7d-a3fe-345bd9747737 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Updating instance_info_cache with network_info: [{"id": "8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d", "address": "fa:16:3e:4f:49:11", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c03d5ae-dc", "ovs_interfaceid": "8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:23:56 compute-0 nova_compute[183075]: 2026-01-22 17:23:56.498 183079 DEBUG oslo_concurrency.lockutils [req-418324c7-e253-4686-a7c5-75c944e43877 req-02e3ce01-cce6-4b7d-a3fe-345bd9747737 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-c4708d03-a0cd-40d7-be06-e97b6a4b45b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:23:56 compute-0 nova_compute[183075]: 2026-01-22 17:23:56.738 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:57 compute-0 nova_compute[183075]: 2026-01-22 17:23:57.650 183079 DEBUG nova.compute.manager [req-4361114e-6784-4f7d-b93d-829286b20f13 req-4c6677e2-f93c-429e-ba84-a668a3844ef1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Received event network-vif-plugged-8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:23:57 compute-0 nova_compute[183075]: 2026-01-22 17:23:57.651 183079 DEBUG oslo_concurrency.lockutils [req-4361114e-6784-4f7d-b93d-829286b20f13 req-4c6677e2-f93c-429e-ba84-a668a3844ef1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c4708d03-a0cd-40d7-be06-e97b6a4b45b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:57 compute-0 nova_compute[183075]: 2026-01-22 17:23:57.652 183079 DEBUG oslo_concurrency.lockutils [req-4361114e-6784-4f7d-b93d-829286b20f13 req-4c6677e2-f93c-429e-ba84-a668a3844ef1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c4708d03-a0cd-40d7-be06-e97b6a4b45b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:57 compute-0 nova_compute[183075]: 2026-01-22 17:23:57.652 183079 DEBUG oslo_concurrency.lockutils [req-4361114e-6784-4f7d-b93d-829286b20f13 req-4c6677e2-f93c-429e-ba84-a668a3844ef1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c4708d03-a0cd-40d7-be06-e97b6a4b45b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:57 compute-0 nova_compute[183075]: 2026-01-22 17:23:57.653 183079 DEBUG nova.compute.manager [req-4361114e-6784-4f7d-b93d-829286b20f13 req-4c6677e2-f93c-429e-ba84-a668a3844ef1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] No waiting events found dispatching network-vif-plugged-8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:23:57 compute-0 nova_compute[183075]: 2026-01-22 17:23:57.653 183079 WARNING nova.compute.manager [req-4361114e-6784-4f7d-b93d-829286b20f13 req-4c6677e2-f93c-429e-ba84-a668a3844ef1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Received unexpected event network-vif-plugged-8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d for instance with vm_state active and task_state None.
Jan 22 17:23:58 compute-0 nova_compute[183075]: 2026-01-22 17:23:58.619 183079 INFO nova.compute.manager [None req-841e9c9a-9c01-4e62-9533-c6f3ccd5513e 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Get console output
Jan 22 17:23:59 compute-0 nova_compute[183075]: 2026-01-22 17:23:59.777 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:00 compute-0 nova_compute[183075]: 2026-01-22 17:24:00.332 183079 INFO nova.compute.manager [None req-13549efc-0959-4307-a71c-98cdf3b80862 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Get console output
Jan 22 17:24:00 compute-0 nova_compute[183075]: 2026-01-22 17:24:00.337 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:24:01 compute-0 podman[228789]: 2026-01-22 17:24:01.349610612 +0000 UTC m=+0.056374147 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 17:24:01 compute-0 nova_compute[183075]: 2026-01-22 17:24:01.741 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:02 compute-0 nova_compute[183075]: 2026-01-22 17:24:02.056 183079 INFO nova.compute.manager [None req-390dd599-a378-418e-8131-66a1c2fbb0ed 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Get console output
Jan 22 17:24:02 compute-0 nova_compute[183075]: 2026-01-22 17:24:02.063 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:24:03 compute-0 nova_compute[183075]: 2026-01-22 17:24:03.195 183079 DEBUG nova.compute.manager [req-47b60b49-0dad-40a9-b825-89e2b4427bc6 req-5e2d73b0-e81d-4a46-9c2f-be2033e2ef8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Received event network-changed-dbad268a-40fe-4d38-aab1-20fbfbcc0775 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:24:03 compute-0 nova_compute[183075]: 2026-01-22 17:24:03.195 183079 DEBUG nova.compute.manager [req-47b60b49-0dad-40a9-b825-89e2b4427bc6 req-5e2d73b0-e81d-4a46-9c2f-be2033e2ef8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Refreshing instance network info cache due to event network-changed-dbad268a-40fe-4d38-aab1-20fbfbcc0775. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:24:03 compute-0 nova_compute[183075]: 2026-01-22 17:24:03.196 183079 DEBUG oslo_concurrency.lockutils [req-47b60b49-0dad-40a9-b825-89e2b4427bc6 req-5e2d73b0-e81d-4a46-9c2f-be2033e2ef8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-7c4cc341-c93c-4077-a541-31a8487482f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:24:03 compute-0 nova_compute[183075]: 2026-01-22 17:24:03.196 183079 DEBUG oslo_concurrency.lockutils [req-47b60b49-0dad-40a9-b825-89e2b4427bc6 req-5e2d73b0-e81d-4a46-9c2f-be2033e2ef8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-7c4cc341-c93c-4077-a541-31a8487482f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:24:03 compute-0 nova_compute[183075]: 2026-01-22 17:24:03.196 183079 DEBUG nova.network.neutron [req-47b60b49-0dad-40a9-b825-89e2b4427bc6 req-5e2d73b0-e81d-4a46-9c2f-be2033e2ef8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Refreshing network info cache for port dbad268a-40fe-4d38-aab1-20fbfbcc0775 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:24:03 compute-0 nova_compute[183075]: 2026-01-22 17:24:03.757 183079 INFO nova.compute.manager [None req-feb8e4dc-8ee1-41e9-8e64-2b953a98481a 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Get console output
Jan 22 17:24:03 compute-0 nova_compute[183075]: 2026-01-22 17:24:03.764 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:24:04 compute-0 nova_compute[183075]: 2026-01-22 17:24:04.641 183079 DEBUG nova.network.neutron [req-47b60b49-0dad-40a9-b825-89e2b4427bc6 req-5e2d73b0-e81d-4a46-9c2f-be2033e2ef8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Updated VIF entry in instance network info cache for port dbad268a-40fe-4d38-aab1-20fbfbcc0775. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:24:04 compute-0 nova_compute[183075]: 2026-01-22 17:24:04.642 183079 DEBUG nova.network.neutron [req-47b60b49-0dad-40a9-b825-89e2b4427bc6 req-5e2d73b0-e81d-4a46-9c2f-be2033e2ef8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Updating instance_info_cache with network_info: [{"id": "dbad268a-40fe-4d38-aab1-20fbfbcc0775", "address": "fa:16:3e:d2:a8:c8", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}, {"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdbad268a-40", "ovs_interfaceid": "dbad268a-40fe-4d38-aab1-20fbfbcc0775", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:24:04 compute-0 nova_compute[183075]: 2026-01-22 17:24:04.662 183079 DEBUG oslo_concurrency.lockutils [req-47b60b49-0dad-40a9-b825-89e2b4427bc6 req-5e2d73b0-e81d-4a46-9c2f-be2033e2ef8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-7c4cc341-c93c-4077-a541-31a8487482f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:24:04 compute-0 nova_compute[183075]: 2026-01-22 17:24:04.828 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:06 compute-0 nova_compute[183075]: 2026-01-22 17:24:06.745 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:07 compute-0 podman[228832]: 2026-01-22 17:24:07.389130738 +0000 UTC m=+0.080444030 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 17:24:07 compute-0 ovn_controller[95372]: 2026-01-22T17:24:07Z|00086|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4f:49:11 10.100.0.14
Jan 22 17:24:07 compute-0 ovn_controller[95372]: 2026-01-22T17:24:07Z|00087|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4f:49:11 10.100.0.14
Jan 22 17:24:08 compute-0 nova_compute[183075]: 2026-01-22 17:24:08.949 183079 INFO nova.compute.manager [None req-1ca6c66e-433f-4414-be64-313b76cac169 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Get console output
Jan 22 17:24:08 compute-0 nova_compute[183075]: 2026-01-22 17:24:08.956 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:24:09 compute-0 nova_compute[183075]: 2026-01-22 17:24:09.831 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:11 compute-0 nova_compute[183075]: 2026-01-22 17:24:11.747 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:11 compute-0 nova_compute[183075]: 2026-01-22 17:24:11.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:24:11 compute-0 nova_compute[183075]: 2026-01-22 17:24:11.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:24:11 compute-0 nova_compute[183075]: 2026-01-22 17:24:11.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 17:24:14 compute-0 nova_compute[183075]: 2026-01-22 17:24:14.103 183079 INFO nova.compute.manager [None req-7cf6bdd7-5dd3-4ecf-94ee-51daf7685994 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Get console output
Jan 22 17:24:14 compute-0 nova_compute[183075]: 2026-01-22 17:24:14.112 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:24:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:14.349 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:24:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:14.352 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:24:14 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:24:14 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:24:14 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:24:14 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:24:14 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:24:14 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:24:14 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:24:14 compute-0 nova_compute[183075]: 2026-01-22 17:24:14.833 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.554 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.554 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 1.2031271
Jan 22 17:24:15 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.14:34728 [22/Jan/2026:17:24:14.348] listener listener/metadata 0/0/0/1205/1205 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.561 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.562 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.581 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.582 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 168 time: 0.0202823
Jan 22 17:24:15 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.14:34730 [22/Jan/2026:17:24:15.560] listener listener/metadata 0/0/0/21/21 200 152 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.587 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.587 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.609 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:24:15 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.14:34746 [22/Jan/2026:17:24:15.586] listener listener/metadata 0/0/0/23/23 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.609 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0222569
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.615 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.615 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.636 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.637 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0214210
Jan 22 17:24:15 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.14:34754 [22/Jan/2026:17:24:15.614] listener listener/metadata 0/0/0/22/22 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.643 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.645 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.665 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.665 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0206466
Jan 22 17:24:15 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.14:34756 [22/Jan/2026:17:24:15.643] listener listener/metadata 0/0/0/22/22 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.670 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.671 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.689 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.690 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0185590
Jan 22 17:24:15 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.14:34758 [22/Jan/2026:17:24:15.670] listener listener/metadata 0/0/0/19/19 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.698 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.698 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.721 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:24:15 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.14:34766 [22/Jan/2026:17:24:15.697] listener listener/metadata 0/0/0/24/24 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.722 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0236597
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.731 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.732 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.755 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:24:15 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.14:34776 [22/Jan/2026:17:24:15.731] listener listener/metadata 0/0/0/24/24 200 135 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.756 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 151 time: 0.0234532
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.764 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.765 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.791 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:24:15 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.14:34782 [22/Jan/2026:17:24:15.763] listener listener/metadata 0/0/0/27/27 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.791 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0264211
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.799 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.800 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.823 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:24:15 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.14:34788 [22/Jan/2026:17:24:15.799] listener listener/metadata 0/0/0/24/24 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.823 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0231059
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.831 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.832 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:24:15 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.14:34792 [22/Jan/2026:17:24:15.831] listener listener/metadata 0/0/0/26/26 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.857 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0244155
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.873 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.874 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.894 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:24:15 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.14:34794 [22/Jan/2026:17:24:15.872] listener listener/metadata 0/0/0/22/22 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.895 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0216291
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.902 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.903 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.927 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.927 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0241675
Jan 22 17:24:15 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.14:34810 [22/Jan/2026:17:24:15.901] listener listener/metadata 0/0/0/25/25 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.932 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.933 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.948 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.948 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0155783
Jan 22 17:24:15 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.14:34818 [22/Jan/2026:17:24:15.932] listener listener/metadata 0/0/0/16/16 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.956 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.957 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.978 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.979 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0218561
Jan 22 17:24:15 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.14:34822 [22/Jan/2026:17:24:15.955] listener listener/metadata 0/0/0/23/23 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.985 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:15.986 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:24:15 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:24:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:16.003 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:24:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:16.003 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0170012
Jan 22 17:24:16 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[228028]: 10.100.0.14:34828 [22/Jan/2026:17:24:15.985] listener listener/metadata 0/0/0/18/18 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:24:16 compute-0 nova_compute[183075]: 2026-01-22 17:24:16.751 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:17 compute-0 nova_compute[183075]: 2026-01-22 17:24:17.859 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:24:18 compute-0 podman[228859]: 2026-01-22 17:24:18.412824403 +0000 UTC m=+0.087992792 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.tags=minimal rhel9, release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, version=9.6)
Jan 22 17:24:18 compute-0 podman[228858]: 2026-01-22 17:24:18.414144798 +0000 UTC m=+0.095107502 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 22 17:24:18 compute-0 podman[228857]: 2026-01-22 17:24:18.4377972 +0000 UTC m=+0.124633071 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 17:24:18 compute-0 nova_compute[183075]: 2026-01-22 17:24:18.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:24:19 compute-0 nova_compute[183075]: 2026-01-22 17:24:19.277 183079 INFO nova.compute.manager [None req-9f92bcc4-8d8d-4688-9368-11fb1f520280 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Get console output
Jan 22 17:24:19 compute-0 nova_compute[183075]: 2026-01-22 17:24:19.282 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:24:19 compute-0 nova_compute[183075]: 2026-01-22 17:24:19.835 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:21 compute-0 nova_compute[183075]: 2026-01-22 17:24:21.754 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:21 compute-0 nova_compute[183075]: 2026-01-22 17:24:21.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.161 183079 DEBUG oslo_concurrency.lockutils [None req-23954fbd-19e7-4075-9dae-52753b24cc70 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "c4708d03-a0cd-40d7-be06-e97b6a4b45b7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.162 183079 DEBUG oslo_concurrency.lockutils [None req-23954fbd-19e7-4075-9dae-52753b24cc70 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "c4708d03-a0cd-40d7-be06-e97b6a4b45b7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.162 183079 DEBUG oslo_concurrency.lockutils [None req-23954fbd-19e7-4075-9dae-52753b24cc70 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "c4708d03-a0cd-40d7-be06-e97b6a4b45b7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.163 183079 DEBUG oslo_concurrency.lockutils [None req-23954fbd-19e7-4075-9dae-52753b24cc70 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "c4708d03-a0cd-40d7-be06-e97b6a4b45b7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.163 183079 DEBUG oslo_concurrency.lockutils [None req-23954fbd-19e7-4075-9dae-52753b24cc70 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "c4708d03-a0cd-40d7-be06-e97b6a4b45b7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.165 183079 INFO nova.compute.manager [None req-23954fbd-19e7-4075-9dae-52753b24cc70 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Terminating instance
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.168 183079 DEBUG nova.compute.manager [None req-23954fbd-19e7-4075-9dae-52753b24cc70 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:24:22 compute-0 kernel: tap8c03d5ae-dc (unregistering): left promiscuous mode
Jan 22 17:24:22 compute-0 NetworkManager[55454]: <info>  [1769102662.2007] device (tap8c03d5ae-dc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.209 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:22 compute-0 ovn_controller[95372]: 2026-01-22T17:24:22Z|00478|binding|INFO|Releasing lport 8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d from this chassis (sb_readonly=0)
Jan 22 17:24:22 compute-0 ovn_controller[95372]: 2026-01-22T17:24:22Z|00479|binding|INFO|Setting lport 8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d down in Southbound
Jan 22 17:24:22 compute-0 ovn_controller[95372]: 2026-01-22T17:24:22Z|00480|binding|INFO|Removing iface tap8c03d5ae-dc ovn-installed in OVS
Jan 22 17:24:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:22.220 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:49:11 10.100.0.14'], port_security=['fa:16:3e:4f:49:11 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'c4708d03-a0cd-40d7-be06-e97b6a4b45b7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eee918a6-66b2-47ae-b702-620a23ef395b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02818155e7af4645bc909d4ba671f11f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c960fc2f-fbcd-48ff-b6bd-d3b900f07332', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.232', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bad11b1d-0e62-4467-b3f8-4f80f0fd75da, chassis=[], tunnel_key=9, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:24:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:22.224 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d in datapath eee918a6-66b2-47ae-b702-620a23ef395b unbound from our chassis
Jan 22 17:24:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:22.228 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network eee918a6-66b2-47ae-b702-620a23ef395b
Jan 22 17:24:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:22.258 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a2345308-b49d-446f-a9d7-abbb20bc05b7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.289 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:22.300 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[5d062444-0732-4cfb-913d-f7d682d13fb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:22 compute-0 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d0000002a.scope: Deactivated successfully.
Jan 22 17:24:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:22.304 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[47edf98d-e790-4a42-ab79-292bf430cb3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:22 compute-0 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d0000002a.scope: Consumed 12.930s CPU time.
Jan 22 17:24:22 compute-0 systemd-machined[154382]: Machine qemu-42-instance-0000002a terminated.
Jan 22 17:24:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:22.342 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[ebbe0f9d-1cab-437f-a97c-011db09826c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:22.367 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[5295b05c-359c-417c-a83c-9bead7026606]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeee918a6-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:e2:7e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 301, 'tx_packets': 156, 'rx_bytes': 25738, 'tx_bytes': 17884, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 301, 'tx_packets': 156, 'rx_bytes': 25738, 'tx_bytes': 17884, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 130], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492684, 'reachable_time': 18835, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228932, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:22.391 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[866fb614-f0bd-4fb6-8bba-20d067911284]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapeee918a6-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 492699, 'tstamp': 492699}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228933, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapeee918a6-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 492702, 'tstamp': 492702}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228933, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:22.393 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeee918a6-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.396 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.404 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:22.406 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeee918a6-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:24:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:22.406 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:24:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:22.407 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapeee918a6-60, col_values=(('external_ids', {'iface-id': '15d4de90-41f4-4532-aebd-197c2a33c6d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:24:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:22.408 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.445 183079 INFO nova.virt.libvirt.driver [-] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Instance destroyed successfully.
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.447 183079 DEBUG nova.objects.instance [None req-23954fbd-19e7-4075-9dae-52753b24cc70 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lazy-loading 'resources' on Instance uuid c4708d03-a0cd-40d7-be06-e97b6a4b45b7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.463 183079 DEBUG nova.virt.libvirt.vif [None req-23954fbd-19e7-4075-9dae-52753b24cc70 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:23:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1395753074',display_name='tempest-server-test-1395753074',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1395753074',id=42,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBZ1addGiefngOIZky2bZbl/LpQHm8ydIPPcYG47RZ8D1pquJ/FsAcVn7mNe0QyfiQPXvuSs1eSw/jGOy+x3K8tzxt5N8fIelXKcTizhDHr81ws2uYn+dW6F9Hiy6pJp3A==',key_name='tempest-keypair-test-453437320',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:23:55Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='02818155e7af4645bc909d4ba671f11f',ramdisk_id='',reservation_id='r-hkqxuw3r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw
_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIpSameNetwork-953620552',owner_user_name='tempest-FloatingIpSameNetwork-953620552-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:23:55Z,user_data=None,user_id='1148a46489e842e6a0c7660c54567798',uuid=c4708d03-a0cd-40d7-be06-e97b6a4b45b7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d", "address": "fa:16:3e:4f:49:11", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c03d5ae-dc", "ovs_interfaceid": "8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.464 183079 DEBUG nova.network.os_vif_util [None req-23954fbd-19e7-4075-9dae-52753b24cc70 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converting VIF {"id": "8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d", "address": "fa:16:3e:4f:49:11", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c03d5ae-dc", "ovs_interfaceid": "8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.465 183079 DEBUG nova.network.os_vif_util [None req-23954fbd-19e7-4075-9dae-52753b24cc70 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:49:11,bridge_name='br-int',has_traffic_filtering=True,id=8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8c03d5ae-dc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.465 183079 DEBUG os_vif [None req-23954fbd-19e7-4075-9dae-52753b24cc70 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:49:11,bridge_name='br-int',has_traffic_filtering=True,id=8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8c03d5ae-dc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.467 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.467 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8c03d5ae-dc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
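The transaction above removes the instance's tap device from br-int through ovsdbapp's `DelPortCommand`, with `if_exists=True` making the deletion idempotent. A minimal sketch of the equivalent `ovs-vsctl` command line (a hypothetical helper for illustration; the real code path goes through the OVSDB IDL connection, not the CLI):

```python
def build_del_port_cmd(bridge: str, port: str, if_exists: bool = True) -> list:
    # CLI equivalent of DelPortCommand(port=..., bridge=..., if_exists=True):
    # "--if-exists" suppresses the error when the port is already gone,
    # matching the idempotent deletion seen in the log.
    cmd = ["ovs-vsctl"]
    if if_exists:
        cmd.append("--if-exists")
    cmd += ["del-port", bridge, port]
    return cmd

print(build_del_port_cmd("br-int", "tap8c03d5ae-dc"))
# → ['ovs-vsctl', '--if-exists', 'del-port', 'br-int', 'tap8c03d5ae-dc']
```

The tap device name (`tap8c03d5ae-dc`) is derived from the first 11 characters of the Neutron port UUID, which is why it matches the `devname` in the VIF dict above.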
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.469 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.472 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.474 183079 INFO os_vif [None req-23954fbd-19e7-4075-9dae-52753b24cc70 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:49:11,bridge_name='br-int',has_traffic_filtering=True,id=8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8c03d5ae-dc')
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.475 183079 INFO nova.virt.libvirt.driver [None req-23954fbd-19e7-4075-9dae-52753b24cc70 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Deleting instance files /var/lib/nova/instances/c4708d03-a0cd-40d7-be06-e97b6a4b45b7_del
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.475 183079 INFO nova.virt.libvirt.driver [None req-23954fbd-19e7-4075-9dae-52753b24cc70 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Deletion of /var/lib/nova/instances/c4708d03-a0cd-40d7-be06-e97b6a4b45b7_del complete
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.531 183079 INFO nova.compute.manager [None req-23954fbd-19e7-4075-9dae-52753b24cc70 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Took 0.36 seconds to destroy the instance on the hypervisor.
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.533 183079 DEBUG oslo.service.loopingcall [None req-23954fbd-19e7-4075-9dae-52753b24cc70 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.533 183079 DEBUG nova.compute.manager [-] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.533 183079 DEBUG nova.network.neutron [-] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.592 183079 DEBUG nova.compute.manager [req-42911873-b40f-44eb-83f2-5a58324fd3c0 req-e42319fd-9a92-4cf2-8ccf-141befbe06d8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Received event network-vif-unplugged-8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.593 183079 DEBUG oslo_concurrency.lockutils [req-42911873-b40f-44eb-83f2-5a58324fd3c0 req-e42319fd-9a92-4cf2-8ccf-141befbe06d8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c4708d03-a0cd-40d7-be06-e97b6a4b45b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.593 183079 DEBUG oslo_concurrency.lockutils [req-42911873-b40f-44eb-83f2-5a58324fd3c0 req-e42319fd-9a92-4cf2-8ccf-141befbe06d8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c4708d03-a0cd-40d7-be06-e97b6a4b45b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.594 183079 DEBUG oslo_concurrency.lockutils [req-42911873-b40f-44eb-83f2-5a58324fd3c0 req-e42319fd-9a92-4cf2-8ccf-141befbe06d8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c4708d03-a0cd-40d7-be06-e97b6a4b45b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.594 183079 DEBUG nova.compute.manager [req-42911873-b40f-44eb-83f2-5a58324fd3c0 req-e42319fd-9a92-4cf2-8ccf-141befbe06d8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] No waiting events found dispatching network-vif-unplugged-8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.594 183079 DEBUG nova.compute.manager [req-42911873-b40f-44eb-83f2-5a58324fd3c0 req-e42319fd-9a92-4cf2-8ccf-141befbe06d8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Received event network-vif-unplugged-8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:24:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:22.741 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:24:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:22.741 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.744 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:24:22 compute-0 nova_compute[183075]: 2026-01-22 17:24:22.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:24:23 compute-0 nova_compute[183075]: 2026-01-22 17:24:23.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:24:24 compute-0 nova_compute[183075]: 2026-01-22 17:24:24.681 183079 DEBUG nova.compute.manager [req-7a40fb24-c783-4b10-b9bd-be9348d0003a req-06c2f98c-7080-465f-b00e-ee96b3289de4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Received event network-vif-plugged-8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:24:24 compute-0 nova_compute[183075]: 2026-01-22 17:24:24.682 183079 DEBUG oslo_concurrency.lockutils [req-7a40fb24-c783-4b10-b9bd-be9348d0003a req-06c2f98c-7080-465f-b00e-ee96b3289de4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c4708d03-a0cd-40d7-be06-e97b6a4b45b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:24 compute-0 nova_compute[183075]: 2026-01-22 17:24:24.682 183079 DEBUG oslo_concurrency.lockutils [req-7a40fb24-c783-4b10-b9bd-be9348d0003a req-06c2f98c-7080-465f-b00e-ee96b3289de4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c4708d03-a0cd-40d7-be06-e97b6a4b45b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:24 compute-0 nova_compute[183075]: 2026-01-22 17:24:24.682 183079 DEBUG oslo_concurrency.lockutils [req-7a40fb24-c783-4b10-b9bd-be9348d0003a req-06c2f98c-7080-465f-b00e-ee96b3289de4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c4708d03-a0cd-40d7-be06-e97b6a4b45b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:24 compute-0 nova_compute[183075]: 2026-01-22 17:24:24.682 183079 DEBUG nova.compute.manager [req-7a40fb24-c783-4b10-b9bd-be9348d0003a req-06c2f98c-7080-465f-b00e-ee96b3289de4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] No waiting events found dispatching network-vif-plugged-8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:24:24 compute-0 nova_compute[183075]: 2026-01-22 17:24:24.683 183079 WARNING nova.compute.manager [req-7a40fb24-c783-4b10-b9bd-be9348d0003a req-06c2f98c-7080-465f-b00e-ee96b3289de4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Received unexpected event network-vif-plugged-8c03d5ae-dcc8-4eb7-ad07-f9c5268a4f3d for instance with vm_state active and task_state deleting.
Jan 22 17:24:24 compute-0 nova_compute[183075]: 2026-01-22 17:24:24.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:24:24 compute-0 nova_compute[183075]: 2026-01-22 17:24:24.817 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:24 compute-0 nova_compute[183075]: 2026-01-22 17:24:24.818 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:24 compute-0 nova_compute[183075]: 2026-01-22 17:24:24.818 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:24 compute-0 nova_compute[183075]: 2026-01-22 17:24:24.819 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:24:24 compute-0 nova_compute[183075]: 2026-01-22 17:24:24.919 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89784883-b435-428a-8936-a513f9e65fe0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:24:24 compute-0 podman[228951]: 2026-01-22 17:24:24.997125671 +0000 UTC m=+0.109816165 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 17:24:25 compute-0 nova_compute[183075]: 2026-01-22 17:24:25.002 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89784883-b435-428a-8936-a513f9e65fe0/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:24:25 compute-0 nova_compute[183075]: 2026-01-22 17:24:25.005 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89784883-b435-428a-8936-a513f9e65fe0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:24:25 compute-0 nova_compute[183075]: 2026-01-22 17:24:25.086 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89784883-b435-428a-8936-a513f9e65fe0/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:24:25 compute-0 nova_compute[183075]: 2026-01-22 17:24:25.097 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7440e72-b977-4601-88ad-ce8a4c72e883/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:24:25 compute-0 nova_compute[183075]: 2026-01-22 17:24:25.192 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7440e72-b977-4601-88ad-ce8a4c72e883/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:24:25 compute-0 nova_compute[183075]: 2026-01-22 17:24:25.193 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7440e72-b977-4601-88ad-ce8a4c72e883/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:24:25 compute-0 nova_compute[183075]: 2026-01-22 17:24:25.274 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7440e72-b977-4601-88ad-ce8a4c72e883/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:24:25 compute-0 nova_compute[183075]: 2026-01-22 17:24:25.283 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7c4cc341-c93c-4077-a541-31a8487482f0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:24:25 compute-0 nova_compute[183075]: 2026-01-22 17:24:25.338 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7c4cc341-c93c-4077-a541-31a8487482f0/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:24:25 compute-0 nova_compute[183075]: 2026-01-22 17:24:25.340 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7c4cc341-c93c-4077-a541-31a8487482f0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:24:25 compute-0 nova_compute[183075]: 2026-01-22 17:24:25.396 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7c4cc341-c93c-4077-a541-31a8487482f0/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
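The resource audit above probes each instance's disk with `qemu-img info --force-share --output=json`, wrapped in oslo's `prlimit` helper to cap the child's address space (1 GiB) and CPU time (30 s). A sketch of consuming that JSON output (the payload below is illustrative, with made-up values, not output captured from this host):

```python
import json

# Illustrative qemu-img info --output=json payload; field names are the
# standard qemu-img keys, the values here are invented for the example.
sample = """{
  "virtual-size": 1073741824,
  "filename": "/var/lib/nova/instances/example/disk",
  "format": "qcow2",
  "actual-size": 21474836
}"""

info = json.loads(sample)
virtual_gb = info["virtual-size"] / 1024**3   # nova reports disk in GiB
print(info["format"], virtual_gb)
# → qcow2 1.0
```

`--force-share` lets the audit read image metadata while the domain still holds the file open, which is why these probes can run safely against in-use disks.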
Jan 22 17:24:25 compute-0 nova_compute[183075]: 2026-01-22 17:24:25.610 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:24:25 compute-0 nova_compute[183075]: 2026-01-22 17:24:25.612 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5239MB free_disk=73.28149032592773GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:24:25 compute-0 nova_compute[183075]: 2026-01-22 17:24:25.613 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:25 compute-0 nova_compute[183075]: 2026-01-22 17:24:25.613 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:25 compute-0 nova_compute[183075]: 2026-01-22 17:24:25.721 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance a7440e72-b977-4601-88ad-ce8a4c72e883 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:24:25 compute-0 nova_compute[183075]: 2026-01-22 17:24:25.722 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 7c4cc341-c93c-4077-a541-31a8487482f0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:24:25 compute-0 nova_compute[183075]: 2026-01-22 17:24:25.722 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 89784883-b435-428a-8936-a513f9e65fe0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:24:25 compute-0 nova_compute[183075]: 2026-01-22 17:24:25.722 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance c4708d03-a0cd-40d7-be06-e97b6a4b45b7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:24:25 compute-0 nova_compute[183075]: 2026-01-22 17:24:25.722 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 4 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:24:25 compute-0 nova_compute[183075]: 2026-01-22 17:24:25.722 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1024MB phys_disk=79GB used_disk=4GB total_vcpus=8 used_vcpus=4 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:24:26 compute-0 nova_compute[183075]: 2026-01-22 17:24:26.005 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:24:26 compute-0 nova_compute[183075]: 2026-01-22 17:24:26.025 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:24:26 compute-0 nova_compute[183075]: 2026-01-22 17:24:26.051 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:24:26 compute-0 nova_compute[183075]: 2026-01-22 17:24:26.052 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.439s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:26 compute-0 nova_compute[183075]: 2026-01-22 17:24:26.574 183079 DEBUG nova.network.neutron [-] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:24:26 compute-0 nova_compute[183075]: 2026-01-22 17:24:26.593 183079 INFO nova.compute.manager [-] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Took 4.06 seconds to deallocate network for instance.
Jan 22 17:24:26 compute-0 nova_compute[183075]: 2026-01-22 17:24:26.634 183079 DEBUG oslo_concurrency.lockutils [None req-23954fbd-19e7-4075-9dae-52753b24cc70 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:26 compute-0 nova_compute[183075]: 2026-01-22 17:24:26.635 183079 DEBUG oslo_concurrency.lockutils [None req-23954fbd-19e7-4075-9dae-52753b24cc70 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:26 compute-0 nova_compute[183075]: 2026-01-22 17:24:26.734 183079 DEBUG nova.compute.provider_tree [None req-23954fbd-19e7-4075-9dae-52753b24cc70 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:24:26 compute-0 nova_compute[183075]: 2026-01-22 17:24:26.746 183079 DEBUG nova.scheduler.client.report [None req-23954fbd-19e7-4075-9dae-52753b24cc70 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:24:26 compute-0 nova_compute[183075]: 2026-01-22 17:24:26.757 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:26 compute-0 nova_compute[183075]: 2026-01-22 17:24:26.765 183079 DEBUG oslo_concurrency.lockutils [None req-23954fbd-19e7-4075-9dae-52753b24cc70 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:26 compute-0 nova_compute[183075]: 2026-01-22 17:24:26.783 183079 INFO nova.scheduler.client.report [None req-23954fbd-19e7-4075-9dae-52753b24cc70 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Deleted allocations for instance c4708d03-a0cd-40d7-be06-e97b6a4b45b7
Jan 22 17:24:26 compute-0 nova_compute[183075]: 2026-01-22 17:24:26.836 183079 DEBUG oslo_concurrency.lockutils [None req-23954fbd-19e7-4075-9dae-52753b24cc70 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "c4708d03-a0cd-40d7-be06-e97b6a4b45b7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.674s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:27 compute-0 nova_compute[183075]: 2026-01-22 17:24:27.053 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:24:27 compute-0 nova_compute[183075]: 2026-01-22 17:24:27.054 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:24:27 compute-0 nova_compute[183075]: 2026-01-22 17:24:27.240 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-7c4cc341-c93c-4077-a541-31a8487482f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:24:27 compute-0 nova_compute[183075]: 2026-01-22 17:24:27.240 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-7c4cc341-c93c-4077-a541-31a8487482f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:24:27 compute-0 nova_compute[183075]: 2026-01-22 17:24:27.240 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:24:27 compute-0 nova_compute[183075]: 2026-01-22 17:24:27.469 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.021 183079 DEBUG oslo_concurrency.lockutils [None req-4e2cb0da-d2c0-4444-aaaf-2db0bc4f1f85 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "89784883-b435-428a-8936-a513f9e65fe0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.021 183079 DEBUG oslo_concurrency.lockutils [None req-4e2cb0da-d2c0-4444-aaaf-2db0bc4f1f85 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "89784883-b435-428a-8936-a513f9e65fe0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.022 183079 DEBUG oslo_concurrency.lockutils [None req-4e2cb0da-d2c0-4444-aaaf-2db0bc4f1f85 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "89784883-b435-428a-8936-a513f9e65fe0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.022 183079 DEBUG oslo_concurrency.lockutils [None req-4e2cb0da-d2c0-4444-aaaf-2db0bc4f1f85 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "89784883-b435-428a-8936-a513f9e65fe0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.023 183079 DEBUG oslo_concurrency.lockutils [None req-4e2cb0da-d2c0-4444-aaaf-2db0bc4f1f85 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "89784883-b435-428a-8936-a513f9e65fe0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.025 183079 INFO nova.compute.manager [None req-4e2cb0da-d2c0-4444-aaaf-2db0bc4f1f85 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Terminating instance
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.026 183079 DEBUG nova.compute.manager [None req-4e2cb0da-d2c0-4444-aaaf-2db0bc4f1f85 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:24:28 compute-0 kernel: tapb49f37fd-77 (unregistering): left promiscuous mode
Jan 22 17:24:28 compute-0 NetworkManager[55454]: <info>  [1769102668.0561] device (tapb49f37fd-77): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:24:28 compute-0 ovn_controller[95372]: 2026-01-22T17:24:28Z|00481|binding|INFO|Releasing lport b49f37fd-778f-41bc-b520-547fbfd8002e from this chassis (sb_readonly=0)
Jan 22 17:24:28 compute-0 ovn_controller[95372]: 2026-01-22T17:24:28Z|00482|binding|INFO|Setting lport b49f37fd-778f-41bc-b520-547fbfd8002e down in Southbound
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.071 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:28 compute-0 ovn_controller[95372]: 2026-01-22T17:24:28Z|00483|binding|INFO|Removing iface tapb49f37fd-77 ovn-installed in OVS
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.075 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:28.082 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:2f:57 10.100.0.3'], port_security=['fa:16:3e:95:2f:57 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '89784883-b435-428a-8936-a513f9e65fe0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eee918a6-66b2-47ae-b702-620a23ef395b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02818155e7af4645bc909d4ba671f11f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c960fc2f-fbcd-48ff-b6bd-d3b900f07332', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bad11b1d-0e62-4467-b3f8-4f80f0fd75da, chassis=[], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=b49f37fd-778f-41bc-b520-547fbfd8002e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:24:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:28.084 104629 INFO neutron.agent.ovn.metadata.agent [-] Port b49f37fd-778f-41bc-b520-547fbfd8002e in datapath eee918a6-66b2-47ae-b702-620a23ef395b unbound from our chassis
Jan 22 17:24:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:28.087 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network eee918a6-66b2-47ae-b702-620a23ef395b
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.101 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:28.115 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[48f51a35-bb32-4683-a9ee-f008d7c8674d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:28 compute-0 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000029.scope: Deactivated successfully.
Jan 22 17:24:28 compute-0 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000029.scope: Consumed 15.242s CPU time.
Jan 22 17:24:28 compute-0 systemd-machined[154382]: Machine qemu-41-instance-00000029 terminated.
Jan 22 17:24:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:28.165 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[d689eb44-632a-4c91-b929-816ad9a6ad84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:28.170 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[fcbf5939-1cc8-47a7-8784-7f726453e1b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:28.212 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[a5a6172b-0702-40bb-8a34-2ca2679c96a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:28.236 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[45b222e7-99d0-44c7-bbbb-7bd8a2a5a823]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeee918a6-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:e2:7e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 301, 'tx_packets': 158, 'rx_bytes': 25738, 'tx_bytes': 17968, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 301, 'tx_packets': 158, 'rx_bytes': 25738, 'tx_bytes': 17968, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 130], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492684, 'reachable_time': 18835, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229001, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.255 183079 DEBUG nova.compute.manager [req-5caf29be-21e8-45e1-9664-443a6e1a38f4 req-b753f32b-f6c8-4167-9435-4037a0551735 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Received event network-vif-unplugged-b49f37fd-778f-41bc-b520-547fbfd8002e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.256 183079 DEBUG oslo_concurrency.lockutils [req-5caf29be-21e8-45e1-9664-443a6e1a38f4 req-b753f32b-f6c8-4167-9435-4037a0551735 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "89784883-b435-428a-8936-a513f9e65fe0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.257 183079 DEBUG oslo_concurrency.lockutils [req-5caf29be-21e8-45e1-9664-443a6e1a38f4 req-b753f32b-f6c8-4167-9435-4037a0551735 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "89784883-b435-428a-8936-a513f9e65fe0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:28.256 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f181efdf-e4a2-4720-8939-d5917c0be83b]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapeee918a6-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 492699, 'tstamp': 492699}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229003, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapeee918a6-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 492702, 'tstamp': 492702}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229003, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.257 183079 DEBUG oslo_concurrency.lockutils [req-5caf29be-21e8-45e1-9664-443a6e1a38f4 req-b753f32b-f6c8-4167-9435-4037a0551735 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "89784883-b435-428a-8936-a513f9e65fe0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.257 183079 DEBUG nova.compute.manager [req-5caf29be-21e8-45e1-9664-443a6e1a38f4 req-b753f32b-f6c8-4167-9435-4037a0551735 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] No waiting events found dispatching network-vif-unplugged-b49f37fd-778f-41bc-b520-547fbfd8002e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.258 183079 DEBUG nova.compute.manager [req-5caf29be-21e8-45e1-9664-443a6e1a38f4 req-b753f32b-f6c8-4167-9435-4037a0551735 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Received event network-vif-unplugged-b49f37fd-778f-41bc-b520-547fbfd8002e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:24:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:28.258 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeee918a6-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.259 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.268 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:28.268 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeee918a6-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:24:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:28.268 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:24:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:28.269 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapeee918a6-60, col_values=(('external_ids', {'iface-id': '15d4de90-41f4-4532-aebd-197c2a33c6d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:24:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:28.269 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.298 183079 INFO nova.virt.libvirt.driver [-] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Instance destroyed successfully.
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.299 183079 DEBUG nova.objects.instance [None req-4e2cb0da-d2c0-4444-aaaf-2db0bc4f1f85 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lazy-loading 'resources' on Instance uuid 89784883-b435-428a-8936-a513f9e65fe0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.316 183079 DEBUG nova.virt.libvirt.vif [None req-4e2cb0da-d2c0-4444-aaaf-2db0bc4f1f85 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:23:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1551955303',display_name='tempest-server-test-1551955303',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1551955303',id=41,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBZ1addGiefngOIZky2bZbl/LpQHm8ydIPPcYG47RZ8D1pquJ/FsAcVn7mNe0QyfiQPXvuSs1eSw/jGOy+x3K8tzxt5N8fIelXKcTizhDHr81ws2uYn+dW6F9Hiy6pJp3A==',key_name='tempest-keypair-test-453437320',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:23:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='02818155e7af4645bc909d4ba671f11f',ramdisk_id='',reservation_id='r-zmub7s9r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw
_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIpSameNetwork-953620552',owner_user_name='tempest-FloatingIpSameNetwork-953620552-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:23:18Z,user_data=None,user_id='1148a46489e842e6a0c7660c54567798',uuid=89784883-b435-428a-8936-a513f9e65fe0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b49f37fd-778f-41bc-b520-547fbfd8002e", "address": "fa:16:3e:95:2f:57", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb49f37fd-77", "ovs_interfaceid": "b49f37fd-778f-41bc-b520-547fbfd8002e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.316 183079 DEBUG nova.network.os_vif_util [None req-4e2cb0da-d2c0-4444-aaaf-2db0bc4f1f85 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converting VIF {"id": "b49f37fd-778f-41bc-b520-547fbfd8002e", "address": "fa:16:3e:95:2f:57", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb49f37fd-77", "ovs_interfaceid": "b49f37fd-778f-41bc-b520-547fbfd8002e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.317 183079 DEBUG nova.network.os_vif_util [None req-4e2cb0da-d2c0-4444-aaaf-2db0bc4f1f85 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:2f:57,bridge_name='br-int',has_traffic_filtering=True,id=b49f37fd-778f-41bc-b520-547fbfd8002e,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb49f37fd-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.318 183079 DEBUG os_vif [None req-4e2cb0da-d2c0-4444-aaaf-2db0bc4f1f85 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:2f:57,bridge_name='br-int',has_traffic_filtering=True,id=b49f37fd-778f-41bc-b520-547fbfd8002e,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb49f37fd-77') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.320 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.320 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb49f37fd-77, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.322 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.329 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.334 183079 INFO os_vif [None req-4e2cb0da-d2c0-4444-aaaf-2db0bc4f1f85 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:2f:57,bridge_name='br-int',has_traffic_filtering=True,id=b49f37fd-778f-41bc-b520-547fbfd8002e,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb49f37fd-77')
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.335 183079 INFO nova.virt.libvirt.driver [None req-4e2cb0da-d2c0-4444-aaaf-2db0bc4f1f85 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Deleting instance files /var/lib/nova/instances/89784883-b435-428a-8936-a513f9e65fe0_del
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.335 183079 INFO nova.virt.libvirt.driver [None req-4e2cb0da-d2c0-4444-aaaf-2db0bc4f1f85 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Deletion of /var/lib/nova/instances/89784883-b435-428a-8936-a513f9e65fe0_del complete
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.385 183079 INFO nova.compute.manager [None req-4e2cb0da-d2c0-4444-aaaf-2db0bc4f1f85 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Took 0.36 seconds to destroy the instance on the hypervisor.
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.387 183079 DEBUG oslo.service.loopingcall [None req-4e2cb0da-d2c0-4444-aaaf-2db0bc4f1f85 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.388 183079 DEBUG nova.compute.manager [-] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:24:28 compute-0 nova_compute[183075]: 2026-01-22 17:24:28.388 183079 DEBUG nova.network.neutron [-] [instance: 89784883-b435-428a-8936-a513f9e65fe0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:24:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:28.744 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:24:30 compute-0 nova_compute[183075]: 2026-01-22 17:24:30.464 183079 DEBUG nova.compute.manager [req-9bf92545-6a85-46c8-bd4e-f987661bb10c req-b4ba2250-3c82-4893-87ea-6d722d2b6021 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Received event network-vif-plugged-b49f37fd-778f-41bc-b520-547fbfd8002e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:24:30 compute-0 nova_compute[183075]: 2026-01-22 17:24:30.465 183079 DEBUG oslo_concurrency.lockutils [req-9bf92545-6a85-46c8-bd4e-f987661bb10c req-b4ba2250-3c82-4893-87ea-6d722d2b6021 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "89784883-b435-428a-8936-a513f9e65fe0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:30 compute-0 nova_compute[183075]: 2026-01-22 17:24:30.465 183079 DEBUG oslo_concurrency.lockutils [req-9bf92545-6a85-46c8-bd4e-f987661bb10c req-b4ba2250-3c82-4893-87ea-6d722d2b6021 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "89784883-b435-428a-8936-a513f9e65fe0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:30 compute-0 nova_compute[183075]: 2026-01-22 17:24:30.466 183079 DEBUG oslo_concurrency.lockutils [req-9bf92545-6a85-46c8-bd4e-f987661bb10c req-b4ba2250-3c82-4893-87ea-6d722d2b6021 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "89784883-b435-428a-8936-a513f9e65fe0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:30 compute-0 nova_compute[183075]: 2026-01-22 17:24:30.466 183079 DEBUG nova.compute.manager [req-9bf92545-6a85-46c8-bd4e-f987661bb10c req-b4ba2250-3c82-4893-87ea-6d722d2b6021 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] No waiting events found dispatching network-vif-plugged-b49f37fd-778f-41bc-b520-547fbfd8002e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:24:30 compute-0 nova_compute[183075]: 2026-01-22 17:24:30.467 183079 WARNING nova.compute.manager [req-9bf92545-6a85-46c8-bd4e-f987661bb10c req-b4ba2250-3c82-4893-87ea-6d722d2b6021 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Received unexpected event network-vif-plugged-b49f37fd-778f-41bc-b520-547fbfd8002e for instance with vm_state active and task_state deleting.
Jan 22 17:24:31 compute-0 nova_compute[183075]: 2026-01-22 17:24:31.758 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:32 compute-0 nova_compute[183075]: 2026-01-22 17:24:32.365 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Updating instance_info_cache with network_info: [{"id": "dbad268a-40fe-4d38-aab1-20fbfbcc0775", "address": "fa:16:3e:d2:a8:c8", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}, {"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdbad268a-40", "ovs_interfaceid": "dbad268a-40fe-4d38-aab1-20fbfbcc0775", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:24:32 compute-0 podman[229033]: 2026-01-22 17:24:32.378888876 +0000 UTC m=+0.075890009 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:24:32 compute-0 nova_compute[183075]: 2026-01-22 17:24:32.387 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-7c4cc341-c93c-4077-a541-31a8487482f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:24:32 compute-0 nova_compute[183075]: 2026-01-22 17:24:32.387 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:24:32 compute-0 nova_compute[183075]: 2026-01-22 17:24:32.388 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:24:32 compute-0 nova_compute[183075]: 2026-01-22 17:24:32.388 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:24:32 compute-0 nova_compute[183075]: 2026-01-22 17:24:32.389 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 17:24:32 compute-0 nova_compute[183075]: 2026-01-22 17:24:32.406 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 17:24:33 compute-0 nova_compute[183075]: 2026-01-22 17:24:33.323 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:33 compute-0 nova_compute[183075]: 2026-01-22 17:24:33.326 183079 DEBUG nova.network.neutron [-] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:24:33 compute-0 nova_compute[183075]: 2026-01-22 17:24:33.354 183079 INFO nova.compute.manager [-] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Took 4.97 seconds to deallocate network for instance.
Jan 22 17:24:33 compute-0 nova_compute[183075]: 2026-01-22 17:24:33.422 183079 DEBUG oslo_concurrency.lockutils [None req-4e2cb0da-d2c0-4444-aaaf-2db0bc4f1f85 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:33 compute-0 nova_compute[183075]: 2026-01-22 17:24:33.423 183079 DEBUG oslo_concurrency.lockutils [None req-4e2cb0da-d2c0-4444-aaaf-2db0bc4f1f85 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:33 compute-0 nova_compute[183075]: 2026-01-22 17:24:33.544 183079 DEBUG nova.compute.provider_tree [None req-4e2cb0da-d2c0-4444-aaaf-2db0bc4f1f85 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:24:33 compute-0 nova_compute[183075]: 2026-01-22 17:24:33.562 183079 DEBUG nova.scheduler.client.report [None req-4e2cb0da-d2c0-4444-aaaf-2db0bc4f1f85 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:24:33 compute-0 nova_compute[183075]: 2026-01-22 17:24:33.587 183079 DEBUG oslo_concurrency.lockutils [None req-4e2cb0da-d2c0-4444-aaaf-2db0bc4f1f85 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:33 compute-0 nova_compute[183075]: 2026-01-22 17:24:33.624 183079 INFO nova.scheduler.client.report [None req-4e2cb0da-d2c0-4444-aaaf-2db0bc4f1f85 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Deleted allocations for instance 89784883-b435-428a-8936-a513f9e65fe0
Jan 22 17:24:33 compute-0 nova_compute[183075]: 2026-01-22 17:24:33.695 183079 DEBUG oslo_concurrency.lockutils [None req-4e2cb0da-d2c0-4444-aaaf-2db0bc4f1f85 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "89784883-b435-428a-8936-a513f9e65fe0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.673s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:34 compute-0 ovn_controller[95372]: 2026-01-22T17:24:34Z|00484|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Jan 22 17:24:35 compute-0 nova_compute[183075]: 2026-01-22 17:24:35.914 183079 DEBUG oslo_concurrency.lockutils [None req-a67ad9c9-1ddd-45ee-b731-51895c57a4cd 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "a7440e72-b977-4601-88ad-ce8a4c72e883" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:35 compute-0 nova_compute[183075]: 2026-01-22 17:24:35.915 183079 DEBUG oslo_concurrency.lockutils [None req-a67ad9c9-1ddd-45ee-b731-51895c57a4cd 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "a7440e72-b977-4601-88ad-ce8a4c72e883" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:35 compute-0 nova_compute[183075]: 2026-01-22 17:24:35.916 183079 DEBUG oslo_concurrency.lockutils [None req-a67ad9c9-1ddd-45ee-b731-51895c57a4cd 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "a7440e72-b977-4601-88ad-ce8a4c72e883-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:35 compute-0 nova_compute[183075]: 2026-01-22 17:24:35.916 183079 DEBUG oslo_concurrency.lockutils [None req-a67ad9c9-1ddd-45ee-b731-51895c57a4cd 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "a7440e72-b977-4601-88ad-ce8a4c72e883-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:35 compute-0 nova_compute[183075]: 2026-01-22 17:24:35.917 183079 DEBUG oslo_concurrency.lockutils [None req-a67ad9c9-1ddd-45ee-b731-51895c57a4cd 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "a7440e72-b977-4601-88ad-ce8a4c72e883-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:35 compute-0 nova_compute[183075]: 2026-01-22 17:24:35.919 183079 INFO nova.compute.manager [None req-a67ad9c9-1ddd-45ee-b731-51895c57a4cd 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Terminating instance
Jan 22 17:24:35 compute-0 nova_compute[183075]: 2026-01-22 17:24:35.921 183079 DEBUG nova.compute.manager [None req-a67ad9c9-1ddd-45ee-b731-51895c57a4cd 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:24:35 compute-0 kernel: tap6b312169-d5 (unregistering): left promiscuous mode
Jan 22 17:24:35 compute-0 NetworkManager[55454]: <info>  [1769102675.9518] device (tap6b312169-d5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:24:35 compute-0 nova_compute[183075]: 2026-01-22 17:24:35.956 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:35 compute-0 ovn_controller[95372]: 2026-01-22T17:24:35Z|00485|binding|INFO|Releasing lport 6b312169-d575-48ac-b3c3-72634952d91f from this chassis (sb_readonly=0)
Jan 22 17:24:35 compute-0 ovn_controller[95372]: 2026-01-22T17:24:35Z|00486|binding|INFO|Setting lport 6b312169-d575-48ac-b3c3-72634952d91f down in Southbound
Jan 22 17:24:35 compute-0 ovn_controller[95372]: 2026-01-22T17:24:35Z|00487|binding|INFO|Removing iface tap6b312169-d5 ovn-installed in OVS
Jan 22 17:24:35 compute-0 nova_compute[183075]: 2026-01-22 17:24:35.959 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:35.967 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bf:52:41 10.100.0.10'], port_security=['fa:16:3e:bf:52:41 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'a7440e72-b977-4601-88ad-ce8a4c72e883', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eee918a6-66b2-47ae-b702-620a23ef395b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02818155e7af4645bc909d4ba671f11f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c960fc2f-fbcd-48ff-b6bd-d3b900f07332', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.213', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bad11b1d-0e62-4467-b3f8-4f80f0fd75da, chassis=[], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=6b312169-d575-48ac-b3c3-72634952d91f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:24:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:35.970 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 6b312169-d575-48ac-b3c3-72634952d91f in datapath eee918a6-66b2-47ae-b702-620a23ef395b unbound from our chassis
Jan 22 17:24:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:35.974 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network eee918a6-66b2-47ae-b702-620a23ef395b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:24:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:35.976 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[fb144f18-03f9-48f9-a717-7acb16d22a3a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:35.977 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b namespace which is not needed anymore
Jan 22 17:24:35 compute-0 nova_compute[183075]: 2026-01-22 17:24:35.978 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:36 compute-0 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000027.scope: Deactivated successfully.
Jan 22 17:24:36 compute-0 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000027.scope: Consumed 16.745s CPU time.
Jan 22 17:24:36 compute-0 systemd-machined[154382]: Machine qemu-39-instance-00000027 terminated.
Jan 22 17:24:36 compute-0 neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b[228022]: [NOTICE]   (228026) : haproxy version is 2.8.14-c23fe91
Jan 22 17:24:36 compute-0 neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b[228022]: [NOTICE]   (228026) : path to executable is /usr/sbin/haproxy
Jan 22 17:24:36 compute-0 neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b[228022]: [WARNING]  (228026) : Exiting Master process...
Jan 22 17:24:36 compute-0 neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b[228022]: [WARNING]  (228026) : Exiting Master process...
Jan 22 17:24:36 compute-0 neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b[228022]: [ALERT]    (228026) : Current worker (228028) exited with code 143 (Terminated)
Jan 22 17:24:36 compute-0 neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b[228022]: [WARNING]  (228026) : All workers exited. Exiting... (0)
Jan 22 17:24:36 compute-0 systemd[1]: libpod-965a6e6fb48931b041a0430e955e9d11408557368b5064c9e68f9e7227950093.scope: Deactivated successfully.
Jan 22 17:24:36 compute-0 podman[229083]: 2026-01-22 17:24:36.174165023 +0000 UTC m=+0.053836299 container died 965a6e6fb48931b041a0430e955e9d11408557368b5064c9e68f9e7227950093 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:24:36 compute-0 nova_compute[183075]: 2026-01-22 17:24:36.192 183079 INFO nova.virt.libvirt.driver [-] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Instance destroyed successfully.
Jan 22 17:24:36 compute-0 nova_compute[183075]: 2026-01-22 17:24:36.193 183079 DEBUG nova.objects.instance [None req-a67ad9c9-1ddd-45ee-b731-51895c57a4cd 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lazy-loading 'resources' on Instance uuid a7440e72-b977-4601-88ad-ce8a4c72e883 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:24:36 compute-0 nova_compute[183075]: 2026-01-22 17:24:36.208 183079 DEBUG nova.virt.libvirt.vif [None req-a67ad9c9-1ddd-45ee-b731-51895c57a4cd 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:22:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1599866618',display_name='tempest-server-test-1599866618',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1599866618',id=39,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBZ1addGiefngOIZky2bZbl/LpQHm8ydIPPcYG47RZ8D1pquJ/FsAcVn7mNe0QyfiQPXvuSs1eSw/jGOy+x3K8tzxt5N8fIelXKcTizhDHr81ws2uYn+dW6F9Hiy6pJp3A==',key_name='tempest-keypair-test-453437320',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:22:44Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='02818155e7af4645bc909d4ba671f11f',ramdisk_id='',reservation_id='r-pr2e3mbj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw
_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIpSameNetwork-953620552',owner_user_name='tempest-FloatingIpSameNetwork-953620552-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:22:44Z,user_data=None,user_id='1148a46489e842e6a0c7660c54567798',uuid=a7440e72-b977-4601-88ad-ce8a4c72e883,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6b312169-d575-48ac-b3c3-72634952d91f", "address": "fa:16:3e:bf:52:41", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b312169-d5", "ovs_interfaceid": "6b312169-d575-48ac-b3c3-72634952d91f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:24:36 compute-0 nova_compute[183075]: 2026-01-22 17:24:36.209 183079 DEBUG nova.network.os_vif_util [None req-a67ad9c9-1ddd-45ee-b731-51895c57a4cd 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converting VIF {"id": "6b312169-d575-48ac-b3c3-72634952d91f", "address": "fa:16:3e:bf:52:41", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b312169-d5", "ovs_interfaceid": "6b312169-d575-48ac-b3c3-72634952d91f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:24:36 compute-0 nova_compute[183075]: 2026-01-22 17:24:36.210 183079 DEBUG nova.network.os_vif_util [None req-a67ad9c9-1ddd-45ee-b731-51895c57a4cd 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:bf:52:41,bridge_name='br-int',has_traffic_filtering=True,id=6b312169-d575-48ac-b3c3-72634952d91f,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6b312169-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:24:36 compute-0 nova_compute[183075]: 2026-01-22 17:24:36.210 183079 DEBUG os_vif [None req-a67ad9c9-1ddd-45ee-b731-51895c57a4cd 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:bf:52:41,bridge_name='br-int',has_traffic_filtering=True,id=6b312169-d575-48ac-b3c3-72634952d91f,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6b312169-d5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:24:36 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-965a6e6fb48931b041a0430e955e9d11408557368b5064c9e68f9e7227950093-userdata-shm.mount: Deactivated successfully.
Jan 22 17:24:36 compute-0 nova_compute[183075]: 2026-01-22 17:24:36.213 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:36 compute-0 nova_compute[183075]: 2026-01-22 17:24:36.213 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6b312169-d5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:24:36 compute-0 systemd[1]: var-lib-containers-storage-overlay-4eedbd7cc9a36558f05b5597a3eb5eb97b510e492fcef83ea116135477665986-merged.mount: Deactivated successfully.
Jan 22 17:24:36 compute-0 nova_compute[183075]: 2026-01-22 17:24:36.215 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:36 compute-0 nova_compute[183075]: 2026-01-22 17:24:36.217 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:24:36 compute-0 nova_compute[183075]: 2026-01-22 17:24:36.219 183079 INFO os_vif [None req-a67ad9c9-1ddd-45ee-b731-51895c57a4cd 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:bf:52:41,bridge_name='br-int',has_traffic_filtering=True,id=6b312169-d575-48ac-b3c3-72634952d91f,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6b312169-d5')
Jan 22 17:24:36 compute-0 nova_compute[183075]: 2026-01-22 17:24:36.220 183079 INFO nova.virt.libvirt.driver [None req-a67ad9c9-1ddd-45ee-b731-51895c57a4cd 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Deleting instance files /var/lib/nova/instances/a7440e72-b977-4601-88ad-ce8a4c72e883_del
Jan 22 17:24:36 compute-0 nova_compute[183075]: 2026-01-22 17:24:36.220 183079 INFO nova.virt.libvirt.driver [None req-a67ad9c9-1ddd-45ee-b731-51895c57a4cd 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Deletion of /var/lib/nova/instances/a7440e72-b977-4601-88ad-ce8a4c72e883_del complete
Jan 22 17:24:36 compute-0 podman[229083]: 2026-01-22 17:24:36.222430833 +0000 UTC m=+0.102102139 container cleanup 965a6e6fb48931b041a0430e955e9d11408557368b5064c9e68f9e7227950093 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:24:36 compute-0 systemd[1]: libpod-conmon-965a6e6fb48931b041a0430e955e9d11408557368b5064c9e68f9e7227950093.scope: Deactivated successfully.
Jan 22 17:24:36 compute-0 nova_compute[183075]: 2026-01-22 17:24:36.267 183079 INFO nova.compute.manager [None req-a67ad9c9-1ddd-45ee-b731-51895c57a4cd 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Took 0.35 seconds to destroy the instance on the hypervisor.
Jan 22 17:24:36 compute-0 nova_compute[183075]: 2026-01-22 17:24:36.268 183079 DEBUG oslo.service.loopingcall [None req-a67ad9c9-1ddd-45ee-b731-51895c57a4cd 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:24:36 compute-0 nova_compute[183075]: 2026-01-22 17:24:36.268 183079 DEBUG nova.compute.manager [-] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:24:36 compute-0 nova_compute[183075]: 2026-01-22 17:24:36.268 183079 DEBUG nova.network.neutron [-] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:24:36 compute-0 podman[229127]: 2026-01-22 17:24:36.292344011 +0000 UTC m=+0.044810428 container remove 965a6e6fb48931b041a0430e955e9d11408557368b5064c9e68f9e7227950093 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:24:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:36.297 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[522bda12-d751-4ef5-891d-25133804390b]: (4, ('Thu Jan 22 05:24:36 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b (965a6e6fb48931b041a0430e955e9d11408557368b5064c9e68f9e7227950093)\n965a6e6fb48931b041a0430e955e9d11408557368b5064c9e68f9e7227950093\nThu Jan 22 05:24:36 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b (965a6e6fb48931b041a0430e955e9d11408557368b5064c9e68f9e7227950093)\n965a6e6fb48931b041a0430e955e9d11408557368b5064c9e68f9e7227950093\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:36.298 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f32e74a4-3f1b-476b-baa6-cfc5f34b4efd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:36.299 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeee918a6-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:24:36 compute-0 kernel: tapeee918a6-60: left promiscuous mode
Jan 22 17:24:36 compute-0 nova_compute[183075]: 2026-01-22 17:24:36.301 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:36 compute-0 nova_compute[183075]: 2026-01-22 17:24:36.302 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:36.305 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e8a45549-342b-442f-8001-ed97f7b6616e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:36 compute-0 nova_compute[183075]: 2026-01-22 17:24:36.315 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:36.325 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c427b209-9c30-4309-b106-988351f46d2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:36.326 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[634cc384-64d5-44f6-ba6f-13f540fcf28d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:36.341 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8613f567-bdab-4ca6-bc8e-b9c66f43b080]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492676, 'reachable_time': 43105, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229142, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:36.344 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:24:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:36.344 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[b65fdd5f-ec8c-40ba-abf2-014cb41b8931]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:36 compute-0 systemd[1]: run-netns-ovnmeta\x2deee918a6\x2d66b2\x2d47ae\x2db702\x2d620a23ef395b.mount: Deactivated successfully.
Jan 22 17:24:36 compute-0 nova_compute[183075]: 2026-01-22 17:24:36.534 183079 DEBUG nova.compute.manager [req-102f9814-1469-4376-b058-6e63f3d63632 req-6041bcdd-41ca-492b-a034-28dc5ad73ccd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Received event network-vif-unplugged-6b312169-d575-48ac-b3c3-72634952d91f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:24:36 compute-0 nova_compute[183075]: 2026-01-22 17:24:36.534 183079 DEBUG oslo_concurrency.lockutils [req-102f9814-1469-4376-b058-6e63f3d63632 req-6041bcdd-41ca-492b-a034-28dc5ad73ccd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "a7440e72-b977-4601-88ad-ce8a4c72e883-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:36 compute-0 nova_compute[183075]: 2026-01-22 17:24:36.535 183079 DEBUG oslo_concurrency.lockutils [req-102f9814-1469-4376-b058-6e63f3d63632 req-6041bcdd-41ca-492b-a034-28dc5ad73ccd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "a7440e72-b977-4601-88ad-ce8a4c72e883-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:36 compute-0 nova_compute[183075]: 2026-01-22 17:24:36.535 183079 DEBUG oslo_concurrency.lockutils [req-102f9814-1469-4376-b058-6e63f3d63632 req-6041bcdd-41ca-492b-a034-28dc5ad73ccd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "a7440e72-b977-4601-88ad-ce8a4c72e883-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:36 compute-0 nova_compute[183075]: 2026-01-22 17:24:36.535 183079 DEBUG nova.compute.manager [req-102f9814-1469-4376-b058-6e63f3d63632 req-6041bcdd-41ca-492b-a034-28dc5ad73ccd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] No waiting events found dispatching network-vif-unplugged-6b312169-d575-48ac-b3c3-72634952d91f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:24:36 compute-0 nova_compute[183075]: 2026-01-22 17:24:36.535 183079 DEBUG nova.compute.manager [req-102f9814-1469-4376-b058-6e63f3d63632 req-6041bcdd-41ca-492b-a034-28dc5ad73ccd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Received event network-vif-unplugged-6b312169-d575-48ac-b3c3-72634952d91f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:24:36 compute-0 nova_compute[183075]: 2026-01-22 17:24:36.761 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:37 compute-0 nova_compute[183075]: 2026-01-22 17:24:37.442 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769102662.4415002, c4708d03-a0cd-40d7-be06-e97b6a4b45b7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:24:37 compute-0 nova_compute[183075]: 2026-01-22 17:24:37.443 183079 INFO nova.compute.manager [-] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] VM Stopped (Lifecycle Event)
Jan 22 17:24:37 compute-0 nova_compute[183075]: 2026-01-22 17:24:37.485 183079 DEBUG nova.compute.manager [None req-34222a71-cf35-469b-a18e-fcb47394763b - - - - - -] [instance: c4708d03-a0cd-40d7-be06-e97b6a4b45b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:24:37 compute-0 nova_compute[183075]: 2026-01-22 17:24:37.855 183079 DEBUG nova.network.neutron [-] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:24:37 compute-0 nova_compute[183075]: 2026-01-22 17:24:37.874 183079 INFO nova.compute.manager [-] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Took 1.61 seconds to deallocate network for instance.
Jan 22 17:24:37 compute-0 nova_compute[183075]: 2026-01-22 17:24:37.921 183079 DEBUG oslo_concurrency.lockutils [None req-a67ad9c9-1ddd-45ee-b731-51895c57a4cd 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:37 compute-0 nova_compute[183075]: 2026-01-22 17:24:37.921 183079 DEBUG oslo_concurrency.lockutils [None req-a67ad9c9-1ddd-45ee-b731-51895c57a4cd 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:37 compute-0 nova_compute[183075]: 2026-01-22 17:24:37.984 183079 DEBUG nova.compute.provider_tree [None req-a67ad9c9-1ddd-45ee-b731-51895c57a4cd 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:24:38 compute-0 nova_compute[183075]: 2026-01-22 17:24:38.002 183079 DEBUG nova.scheduler.client.report [None req-a67ad9c9-1ddd-45ee-b731-51895c57a4cd 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:24:38 compute-0 nova_compute[183075]: 2026-01-22 17:24:38.021 183079 DEBUG oslo_concurrency.lockutils [None req-a67ad9c9-1ddd-45ee-b731-51895c57a4cd 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:38 compute-0 nova_compute[183075]: 2026-01-22 17:24:38.039 183079 INFO nova.scheduler.client.report [None req-a67ad9c9-1ddd-45ee-b731-51895c57a4cd 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Deleted allocations for instance a7440e72-b977-4601-88ad-ce8a4c72e883
Jan 22 17:24:38 compute-0 nova_compute[183075]: 2026-01-22 17:24:38.087 183079 DEBUG oslo_concurrency.lockutils [None req-a67ad9c9-1ddd-45ee-b731-51895c57a4cd 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "a7440e72-b977-4601-88ad-ce8a4c72e883" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:38 compute-0 podman[229143]: 2026-01-22 17:24:38.356739244 +0000 UTC m=+0.059449989 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:24:38 compute-0 nova_compute[183075]: 2026-01-22 17:24:38.644 183079 DEBUG nova.compute.manager [req-634fbf64-1b26-4151-ae47-e2f711bed7de req-f256c894-63ef-4692-b5f3-e64d9880961a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Received event network-vif-plugged-6b312169-d575-48ac-b3c3-72634952d91f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:24:38 compute-0 nova_compute[183075]: 2026-01-22 17:24:38.645 183079 DEBUG oslo_concurrency.lockutils [req-634fbf64-1b26-4151-ae47-e2f711bed7de req-f256c894-63ef-4692-b5f3-e64d9880961a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "a7440e72-b977-4601-88ad-ce8a4c72e883-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:38 compute-0 nova_compute[183075]: 2026-01-22 17:24:38.646 183079 DEBUG oslo_concurrency.lockutils [req-634fbf64-1b26-4151-ae47-e2f711bed7de req-f256c894-63ef-4692-b5f3-e64d9880961a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "a7440e72-b977-4601-88ad-ce8a4c72e883-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:38 compute-0 nova_compute[183075]: 2026-01-22 17:24:38.646 183079 DEBUG oslo_concurrency.lockutils [req-634fbf64-1b26-4151-ae47-e2f711bed7de req-f256c894-63ef-4692-b5f3-e64d9880961a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "a7440e72-b977-4601-88ad-ce8a4c72e883-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:38 compute-0 nova_compute[183075]: 2026-01-22 17:24:38.647 183079 DEBUG nova.compute.manager [req-634fbf64-1b26-4151-ae47-e2f711bed7de req-f256c894-63ef-4692-b5f3-e64d9880961a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] No waiting events found dispatching network-vif-plugged-6b312169-d575-48ac-b3c3-72634952d91f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:24:38 compute-0 nova_compute[183075]: 2026-01-22 17:24:38.647 183079 WARNING nova.compute.manager [req-634fbf64-1b26-4151-ae47-e2f711bed7de req-f256c894-63ef-4692-b5f3-e64d9880961a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Received unexpected event network-vif-plugged-6b312169-d575-48ac-b3c3-72634952d91f for instance with vm_state deleted and task_state None.
Jan 22 17:24:40 compute-0 nova_compute[183075]: 2026-01-22 17:24:40.612 183079 INFO nova.compute.manager [None req-000c0523-151c-4373-8216-ef56f78da4f7 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Get console output
Jan 22 17:24:40 compute-0 nova_compute[183075]: 2026-01-22 17:24:40.621 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.217 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.221 183079 DEBUG oslo_concurrency.lockutils [None req-93266290-8116-4846-855a-a1191d50292b 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "7c4cc341-c93c-4077-a541-31a8487482f0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.221 183079 DEBUG oslo_concurrency.lockutils [None req-93266290-8116-4846-855a-a1191d50292b 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "7c4cc341-c93c-4077-a541-31a8487482f0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.222 183079 DEBUG oslo_concurrency.lockutils [None req-93266290-8116-4846-855a-a1191d50292b 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "7c4cc341-c93c-4077-a541-31a8487482f0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.222 183079 DEBUG oslo_concurrency.lockutils [None req-93266290-8116-4846-855a-a1191d50292b 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "7c4cc341-c93c-4077-a541-31a8487482f0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.222 183079 DEBUG oslo_concurrency.lockutils [None req-93266290-8116-4846-855a-a1191d50292b 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "7c4cc341-c93c-4077-a541-31a8487482f0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.223 183079 INFO nova.compute.manager [None req-93266290-8116-4846-855a-a1191d50292b 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Terminating instance
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.224 183079 DEBUG nova.compute.manager [None req-93266290-8116-4846-855a-a1191d50292b 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:24:41 compute-0 kernel: tapdbad268a-40 (unregistering): left promiscuous mode
Jan 22 17:24:41 compute-0 NetworkManager[55454]: <info>  [1769102681.2503] device (tapdbad268a-40): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:24:41 compute-0 ovn_controller[95372]: 2026-01-22T17:24:41Z|00488|binding|INFO|Releasing lport dbad268a-40fe-4d38-aab1-20fbfbcc0775 from this chassis (sb_readonly=0)
Jan 22 17:24:41 compute-0 ovn_controller[95372]: 2026-01-22T17:24:41Z|00489|binding|INFO|Setting lport dbad268a-40fe-4d38-aab1-20fbfbcc0775 down in Southbound
Jan 22 17:24:41 compute-0 ovn_controller[95372]: 2026-01-22T17:24:41Z|00490|binding|INFO|Removing iface tapdbad268a-40 ovn-installed in OVS
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.264 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:41.272 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d2:a8:c8 10.100.0.11 10.100.0.4'], port_security=['fa:16:3e:d2:a8:c8 10.100.0.11 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28 10.100.0.4/28', 'neutron:device_id': '7c4cc341-c93c-4077-a541-31a8487482f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'neutron:revision_number': '5', 'neutron:security_group_ids': '94e3530f-8012-4817-a338-7919b109ef3e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12343ce0-7cef-4f7f-9439-6550d878d4ba, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=dbad268a-40fe-4d38-aab1-20fbfbcc0775) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:24:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:41.275 104629 INFO neutron.agent.ovn.metadata.agent [-] Port dbad268a-40fe-4d38-aab1-20fbfbcc0775 in datapath 44326f3c-1431-44d6-85ce-61ecbbb5ed7a unbound from our chassis
Jan 22 17:24:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:41.279 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 44326f3c-1431-44d6-85ce-61ecbbb5ed7a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:24:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:41.281 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[861ca907-4411-41ee-a996-67ada58d92ea]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:41.282 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a namespace which is not needed anymore
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.287 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:41 compute-0 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d00000028.scope: Deactivated successfully.
Jan 22 17:24:41 compute-0 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d00000028.scope: Consumed 17.729s CPU time.
Jan 22 17:24:41 compute-0 systemd-machined[154382]: Machine qemu-40-instance-00000028 terminated.
Jan 22 17:24:41 compute-0 neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[228396]: [NOTICE]   (228400) : haproxy version is 2.8.14-c23fe91
Jan 22 17:24:41 compute-0 neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[228396]: [NOTICE]   (228400) : path to executable is /usr/sbin/haproxy
Jan 22 17:24:41 compute-0 neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[228396]: [WARNING]  (228400) : Exiting Master process...
Jan 22 17:24:41 compute-0 neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[228396]: [ALERT]    (228400) : Current worker (228402) exited with code 143 (Terminated)
Jan 22 17:24:41 compute-0 neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[228396]: [WARNING]  (228400) : All workers exited. Exiting... (0)
Jan 22 17:24:41 compute-0 systemd[1]: libpod-916153c4b06663be31dca44d1d83090ae4fd70bc7c5370aace02e9a7aa5b4a74.scope: Deactivated successfully.
Jan 22 17:24:41 compute-0 conmon[228396]: conmon 916153c4b06663be31dc <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-916153c4b06663be31dca44d1d83090ae4fd70bc7c5370aace02e9a7aa5b4a74.scope/container/memory.events
Jan 22 17:24:41 compute-0 NetworkManager[55454]: <info>  [1769102681.4474] manager: (tapdbad268a-40): new Tun device (/org/freedesktop/NetworkManager/Devices/208)
Jan 22 17:24:41 compute-0 podman[229195]: 2026-01-22 17:24:41.449268006 +0000 UTC m=+0.057981240 container died 916153c4b06663be31dca44d1d83090ae4fd70bc7c5370aace02e9a7aa5b4a74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:24:41 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-916153c4b06663be31dca44d1d83090ae4fd70bc7c5370aace02e9a7aa5b4a74-userdata-shm.mount: Deactivated successfully.
Jan 22 17:24:41 compute-0 systemd[1]: var-lib-containers-storage-overlay-cfb125d12df97dbf39808d17fbefcdaef63f0fd816b75d3157f1d18bcc1e9450-merged.mount: Deactivated successfully.
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.506 183079 INFO nova.virt.libvirt.driver [-] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Instance destroyed successfully.
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.507 183079 DEBUG nova.objects.instance [None req-93266290-8116-4846-855a-a1191d50292b 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lazy-loading 'resources' on Instance uuid 7c4cc341-c93c-4077-a541-31a8487482f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:24:41 compute-0 podman[229195]: 2026-01-22 17:24:41.514467158 +0000 UTC m=+0.123180422 container cleanup 916153c4b06663be31dca44d1d83090ae4fd70bc7c5370aace02e9a7aa5b4a74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.521 183079 DEBUG nova.virt.libvirt.vif [None req-93266290-8116-4846-855a-a1191d50292b 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:22:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-0-352913770',display_name='tempest-server-0-352913770',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-0-352913770',id=40,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMynqvoXBI34qH0ZDQNq01ZPmk41Xdi4JxkvNO4nJ7GrdggfhpXkKQhhUZRV3fv3bSwcXMqi04ipV9xe7IsTm4GBRV+U9o6VN5JSMLulp+KIvjPuEmjq0h65ra09/zQB9w==',key_name='tempest-keypair-test-1762910261',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:23:06Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e4c0bb18013747dfad2e25b2495090eb',ramdisk_id='',reservation_id='r-xwmek31g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus=
'usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-PortForwardingTestJSON-1240706675',owner_user_name='tempest-PortForwardingTestJSON-1240706675-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:23:06Z,user_data=None,user_id='852aea4e08344f39ae07e6b57393c767',uuid=7c4cc341-c93c-4077-a541-31a8487482f0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dbad268a-40fe-4d38-aab1-20fbfbcc0775", "address": "fa:16:3e:d2:a8:c8", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}, {"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdbad268a-40", "ovs_interfaceid": "dbad268a-40fe-4d38-aab1-20fbfbcc0775", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.521 183079 DEBUG nova.network.os_vif_util [None req-93266290-8116-4846-855a-a1191d50292b 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Converting VIF {"id": "dbad268a-40fe-4d38-aab1-20fbfbcc0775", "address": "fa:16:3e:d2:a8:c8", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}, {"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdbad268a-40", "ovs_interfaceid": "dbad268a-40fe-4d38-aab1-20fbfbcc0775", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.522 183079 DEBUG nova.network.os_vif_util [None req-93266290-8116-4846-855a-a1191d50292b 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d2:a8:c8,bridge_name='br-int',has_traffic_filtering=True,id=dbad268a-40fe-4d38-aab1-20fbfbcc0775,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdbad268a-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.522 183079 DEBUG os_vif [None req-93266290-8116-4846-855a-a1191d50292b 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d2:a8:c8,bridge_name='br-int',has_traffic_filtering=True,id=dbad268a-40fe-4d38-aab1-20fbfbcc0775,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdbad268a-40') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.523 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.523 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdbad268a-40, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.525 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.526 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.527 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.528 183079 INFO os_vif [None req-93266290-8116-4846-855a-a1191d50292b 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d2:a8:c8,bridge_name='br-int',has_traffic_filtering=True,id=dbad268a-40fe-4d38-aab1-20fbfbcc0775,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdbad268a-40')
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.529 183079 INFO nova.virt.libvirt.driver [None req-93266290-8116-4846-855a-a1191d50292b 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Deleting instance files /var/lib/nova/instances/7c4cc341-c93c-4077-a541-31a8487482f0_del
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.529 183079 INFO nova.virt.libvirt.driver [None req-93266290-8116-4846-855a-a1191d50292b 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Deletion of /var/lib/nova/instances/7c4cc341-c93c-4077-a541-31a8487482f0_del complete
Jan 22 17:24:41 compute-0 systemd[1]: libpod-conmon-916153c4b06663be31dca44d1d83090ae4fd70bc7c5370aace02e9a7aa5b4a74.scope: Deactivated successfully.
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.589 183079 INFO nova.compute.manager [None req-93266290-8116-4846-855a-a1191d50292b 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Took 0.36 seconds to destroy the instance on the hypervisor.
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.590 183079 DEBUG oslo.service.loopingcall [None req-93266290-8116-4846-855a-a1191d50292b 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.590 183079 DEBUG nova.compute.manager [-] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.591 183079 DEBUG nova.network.neutron [-] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:24:41 compute-0 podman[229235]: 2026-01-22 17:24:41.600758834 +0000 UTC m=+0.053793728 container remove 916153c4b06663be31dca44d1d83090ae4fd70bc7c5370aace02e9a7aa5b4a74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 17:24:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:41.610 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4b5315c4-1937-4901-a0a9-942fa83667c2]: (4, ('Thu Jan 22 05:24:41 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a (916153c4b06663be31dca44d1d83090ae4fd70bc7c5370aace02e9a7aa5b4a74)\n916153c4b06663be31dca44d1d83090ae4fd70bc7c5370aace02e9a7aa5b4a74\nThu Jan 22 05:24:41 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a (916153c4b06663be31dca44d1d83090ae4fd70bc7c5370aace02e9a7aa5b4a74)\n916153c4b06663be31dca44d1d83090ae4fd70bc7c5370aace02e9a7aa5b4a74\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:41.613 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ea3a1a40-3d90-4b29-8aeb-452e3ba721d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:41.615 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44326f3c-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.617 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:41 compute-0 kernel: tap44326f3c-10: left promiscuous mode
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.643 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:41.647 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[794ee5e2-706b-4486-b4ae-94d6a252e6b8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:41.663 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7c556ae8-94e8-4f59-98fc-9afb0b717c9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:41.664 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[76551b68-061a-40e6-9a39-8983cd3c9d94]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:41.687 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ecb24bd2-3073-4c33-8ff3-7976993457de]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494933, 'reachable_time': 25399, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229253, 'error': None, 'target': 'ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:41.690 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:24:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:41.690 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[ece1892f-72b2-4ff9-8e04-21f50cee67ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:41 compute-0 systemd[1]: run-netns-ovnmeta\x2d44326f3c\x2d1431\x2d44d6\x2d85ce\x2d61ecbbb5ed7a.mount: Deactivated successfully.
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.763 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.802 183079 DEBUG nova.compute.manager [req-a620fc9a-ed6e-4d5d-9270-a043e9be01c2 req-20a60a49-d737-4d10-b9dc-7fd2996ddd7f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Received event network-vif-unplugged-dbad268a-40fe-4d38-aab1-20fbfbcc0775 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.803 183079 DEBUG oslo_concurrency.lockutils [req-a620fc9a-ed6e-4d5d-9270-a043e9be01c2 req-20a60a49-d737-4d10-b9dc-7fd2996ddd7f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "7c4cc341-c93c-4077-a541-31a8487482f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.804 183079 DEBUG oslo_concurrency.lockutils [req-a620fc9a-ed6e-4d5d-9270-a043e9be01c2 req-20a60a49-d737-4d10-b9dc-7fd2996ddd7f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7c4cc341-c93c-4077-a541-31a8487482f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.804 183079 DEBUG oslo_concurrency.lockutils [req-a620fc9a-ed6e-4d5d-9270-a043e9be01c2 req-20a60a49-d737-4d10-b9dc-7fd2996ddd7f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7c4cc341-c93c-4077-a541-31a8487482f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.805 183079 DEBUG nova.compute.manager [req-a620fc9a-ed6e-4d5d-9270-a043e9be01c2 req-20a60a49-d737-4d10-b9dc-7fd2996ddd7f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] No waiting events found dispatching network-vif-unplugged-dbad268a-40fe-4d38-aab1-20fbfbcc0775 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:24:41 compute-0 nova_compute[183075]: 2026-01-22 17:24:41.805 183079 DEBUG nova.compute.manager [req-a620fc9a-ed6e-4d5d-9270-a043e9be01c2 req-20a60a49-d737-4d10-b9dc-7fd2996ddd7f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Received event network-vif-unplugged-dbad268a-40fe-4d38-aab1-20fbfbcc0775 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:24:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:41.941 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:41.942 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:41.942 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:42 compute-0 nova_compute[183075]: 2026-01-22 17:24:42.621 183079 DEBUG nova.network.neutron [-] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:24:42 compute-0 nova_compute[183075]: 2026-01-22 17:24:42.638 183079 INFO nova.compute.manager [-] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Took 1.05 seconds to deallocate network for instance.
Jan 22 17:24:42 compute-0 nova_compute[183075]: 2026-01-22 17:24:42.689 183079 DEBUG oslo_concurrency.lockutils [None req-93266290-8116-4846-855a-a1191d50292b 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:42 compute-0 nova_compute[183075]: 2026-01-22 17:24:42.690 183079 DEBUG oslo_concurrency.lockutils [None req-93266290-8116-4846-855a-a1191d50292b 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:42 compute-0 nova_compute[183075]: 2026-01-22 17:24:42.732 183079 DEBUG nova.compute.provider_tree [None req-93266290-8116-4846-855a-a1191d50292b 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:24:42 compute-0 nova_compute[183075]: 2026-01-22 17:24:42.749 183079 DEBUG nova.scheduler.client.report [None req-93266290-8116-4846-855a-a1191d50292b 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:24:42 compute-0 nova_compute[183075]: 2026-01-22 17:24:42.779 183079 DEBUG oslo_concurrency.lockutils [None req-93266290-8116-4846-855a-a1191d50292b 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.089s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:42 compute-0 nova_compute[183075]: 2026-01-22 17:24:42.804 183079 INFO nova.scheduler.client.report [None req-93266290-8116-4846-855a-a1191d50292b 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Deleted allocations for instance 7c4cc341-c93c-4077-a541-31a8487482f0
Jan 22 17:24:42 compute-0 nova_compute[183075]: 2026-01-22 17:24:42.880 183079 DEBUG oslo_concurrency.lockutils [None req-93266290-8116-4846-855a-a1191d50292b 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "7c4cc341-c93c-4077-a541-31a8487482f0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.659s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:43 compute-0 nova_compute[183075]: 2026-01-22 17:24:43.296 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769102668.2953455, 89784883-b435-428a-8936-a513f9e65fe0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:24:43 compute-0 nova_compute[183075]: 2026-01-22 17:24:43.297 183079 INFO nova.compute.manager [-] [instance: 89784883-b435-428a-8936-a513f9e65fe0] VM Stopped (Lifecycle Event)
Jan 22 17:24:43 compute-0 nova_compute[183075]: 2026-01-22 17:24:43.315 183079 DEBUG nova.compute.manager [None req-ca2b3b45-a385-4254-b9ea-5bb616719a83 - - - - - -] [instance: 89784883-b435-428a-8936-a513f9e65fe0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:24:43 compute-0 nova_compute[183075]: 2026-01-22 17:24:43.532 183079 DEBUG oslo_concurrency.lockutils [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:43 compute-0 nova_compute[183075]: 2026-01-22 17:24:43.532 183079 DEBUG oslo_concurrency.lockutils [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:43 compute-0 nova_compute[183075]: 2026-01-22 17:24:43.550 183079 DEBUG nova.compute.manager [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:24:43 compute-0 nova_compute[183075]: 2026-01-22 17:24:43.677 183079 DEBUG oslo_concurrency.lockutils [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:43 compute-0 nova_compute[183075]: 2026-01-22 17:24:43.678 183079 DEBUG oslo_concurrency.lockutils [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:43 compute-0 nova_compute[183075]: 2026-01-22 17:24:43.686 183079 DEBUG nova.virt.hardware [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:24:43 compute-0 nova_compute[183075]: 2026-01-22 17:24:43.687 183079 INFO nova.compute.claims [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:24:43 compute-0 nova_compute[183075]: 2026-01-22 17:24:43.829 183079 DEBUG nova.compute.provider_tree [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:24:44 compute-0 nova_compute[183075]: 2026-01-22 17:24:44.140 183079 DEBUG nova.scheduler.client.report [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:24:44 compute-0 nova_compute[183075]: 2026-01-22 17:24:44.489 183079 DEBUG nova.compute.manager [req-44792dd5-d9c7-4ae9-b65a-dec4c7735414 req-92f5ccf7-2598-46d0-8b4e-998cd6d2aa48 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Received event network-vif-plugged-dbad268a-40fe-4d38-aab1-20fbfbcc0775 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:24:44 compute-0 nova_compute[183075]: 2026-01-22 17:24:44.490 183079 DEBUG oslo_concurrency.lockutils [req-44792dd5-d9c7-4ae9-b65a-dec4c7735414 req-92f5ccf7-2598-46d0-8b4e-998cd6d2aa48 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "7c4cc341-c93c-4077-a541-31a8487482f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:44 compute-0 nova_compute[183075]: 2026-01-22 17:24:44.490 183079 DEBUG oslo_concurrency.lockutils [req-44792dd5-d9c7-4ae9-b65a-dec4c7735414 req-92f5ccf7-2598-46d0-8b4e-998cd6d2aa48 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7c4cc341-c93c-4077-a541-31a8487482f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:44 compute-0 nova_compute[183075]: 2026-01-22 17:24:44.491 183079 DEBUG oslo_concurrency.lockutils [req-44792dd5-d9c7-4ae9-b65a-dec4c7735414 req-92f5ccf7-2598-46d0-8b4e-998cd6d2aa48 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7c4cc341-c93c-4077-a541-31a8487482f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:44 compute-0 nova_compute[183075]: 2026-01-22 17:24:44.491 183079 DEBUG nova.compute.manager [req-44792dd5-d9c7-4ae9-b65a-dec4c7735414 req-92f5ccf7-2598-46d0-8b4e-998cd6d2aa48 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] No waiting events found dispatching network-vif-plugged-dbad268a-40fe-4d38-aab1-20fbfbcc0775 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:24:44 compute-0 nova_compute[183075]: 2026-01-22 17:24:44.492 183079 WARNING nova.compute.manager [req-44792dd5-d9c7-4ae9-b65a-dec4c7735414 req-92f5ccf7-2598-46d0-8b4e-998cd6d2aa48 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Received unexpected event network-vif-plugged-dbad268a-40fe-4d38-aab1-20fbfbcc0775 for instance with vm_state deleted and task_state None.
Jan 22 17:24:44 compute-0 nova_compute[183075]: 2026-01-22 17:24:44.507 183079 DEBUG oslo_concurrency.lockutils [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.829s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:44 compute-0 nova_compute[183075]: 2026-01-22 17:24:44.507 183079 DEBUG nova.compute.manager [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:24:44 compute-0 nova_compute[183075]: 2026-01-22 17:24:44.578 183079 DEBUG nova.compute.manager [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:24:44 compute-0 nova_compute[183075]: 2026-01-22 17:24:44.579 183079 DEBUG nova.network.neutron [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:24:44 compute-0 nova_compute[183075]: 2026-01-22 17:24:44.603 183079 INFO nova.virt.libvirt.driver [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:24:44 compute-0 nova_compute[183075]: 2026-01-22 17:24:44.628 183079 DEBUG nova.compute.manager [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:24:44 compute-0 nova_compute[183075]: 2026-01-22 17:24:44.725 183079 DEBUG nova.compute.manager [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:24:44 compute-0 nova_compute[183075]: 2026-01-22 17:24:44.726 183079 DEBUG nova.virt.libvirt.driver [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:24:44 compute-0 nova_compute[183075]: 2026-01-22 17:24:44.727 183079 INFO nova.virt.libvirt.driver [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Creating image(s)
Jan 22 17:24:44 compute-0 nova_compute[183075]: 2026-01-22 17:24:44.727 183079 DEBUG oslo_concurrency.lockutils [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "/var/lib/nova/instances/bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:44 compute-0 nova_compute[183075]: 2026-01-22 17:24:44.729 183079 DEBUG oslo_concurrency.lockutils [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "/var/lib/nova/instances/bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:44 compute-0 nova_compute[183075]: 2026-01-22 17:24:44.730 183079 DEBUG oslo_concurrency.lockutils [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "/var/lib/nova/instances/bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:44 compute-0 nova_compute[183075]: 2026-01-22 17:24:44.750 183079 DEBUG oslo_concurrency.processutils [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:24:44 compute-0 nova_compute[183075]: 2026-01-22 17:24:44.809 183079 DEBUG oslo_concurrency.processutils [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:24:44 compute-0 nova_compute[183075]: 2026-01-22 17:24:44.811 183079 DEBUG oslo_concurrency.lockutils [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:44 compute-0 nova_compute[183075]: 2026-01-22 17:24:44.812 183079 DEBUG oslo_concurrency.lockutils [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:44 compute-0 nova_compute[183075]: 2026-01-22 17:24:44.829 183079 DEBUG oslo_concurrency.processutils [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:24:44 compute-0 nova_compute[183075]: 2026-01-22 17:24:44.895 183079 DEBUG oslo_concurrency.processutils [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:24:44 compute-0 nova_compute[183075]: 2026-01-22 17:24:44.896 183079 DEBUG oslo_concurrency.processutils [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:24:44 compute-0 nova_compute[183075]: 2026-01-22 17:24:44.926 183079 DEBUG oslo_concurrency.processutils [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:24:44 compute-0 nova_compute[183075]: 2026-01-22 17:24:44.927 183079 DEBUG oslo_concurrency.lockutils [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:44 compute-0 nova_compute[183075]: 2026-01-22 17:24:44.928 183079 DEBUG oslo_concurrency.processutils [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:24:44 compute-0 nova_compute[183075]: 2026-01-22 17:24:44.946 183079 DEBUG nova.policy [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:24:44 compute-0 nova_compute[183075]: 2026-01-22 17:24:44.984 183079 DEBUG oslo_concurrency.processutils [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:24:44 compute-0 nova_compute[183075]: 2026-01-22 17:24:44.984 183079 DEBUG nova.virt.disk.api [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Checking if we can resize image /var/lib/nova/instances/bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:24:44 compute-0 nova_compute[183075]: 2026-01-22 17:24:44.985 183079 DEBUG oslo_concurrency.processutils [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:24:45 compute-0 nova_compute[183075]: 2026-01-22 17:24:45.079 183079 DEBUG oslo_concurrency.processutils [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:24:45 compute-0 nova_compute[183075]: 2026-01-22 17:24:45.081 183079 DEBUG nova.virt.disk.api [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Cannot resize image /var/lib/nova/instances/bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:24:45 compute-0 nova_compute[183075]: 2026-01-22 17:24:45.082 183079 DEBUG nova.objects.instance [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lazy-loading 'migration_context' on Instance uuid bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:24:45 compute-0 nova_compute[183075]: 2026-01-22 17:24:45.098 183079 DEBUG nova.virt.libvirt.driver [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:24:45 compute-0 nova_compute[183075]: 2026-01-22 17:24:45.099 183079 DEBUG nova.virt.libvirt.driver [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Ensure instance console log exists: /var/lib/nova/instances/bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:24:45 compute-0 nova_compute[183075]: 2026-01-22 17:24:45.100 183079 DEBUG oslo_concurrency.lockutils [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:45 compute-0 nova_compute[183075]: 2026-01-22 17:24:45.100 183079 DEBUG oslo_concurrency.lockutils [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:45 compute-0 nova_compute[183075]: 2026-01-22 17:24:45.101 183079 DEBUG oslo_concurrency.lockutils [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:46 compute-0 nova_compute[183075]: 2026-01-22 17:24:46.218 183079 DEBUG nova.network.neutron [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Successfully updated port: 284d7527-ffe2-4ee7-bb76-65a68cce769e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:24:46 compute-0 nova_compute[183075]: 2026-01-22 17:24:46.327 183079 DEBUG nova.compute.manager [req-2d021cbf-7daf-4487-841d-247711f38ec8 req-de54fc8f-2ec5-47b6-b132-3d0101d55a57 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Received event network-changed-284d7527-ffe2-4ee7-bb76-65a68cce769e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:24:46 compute-0 nova_compute[183075]: 2026-01-22 17:24:46.328 183079 DEBUG nova.compute.manager [req-2d021cbf-7daf-4487-841d-247711f38ec8 req-de54fc8f-2ec5-47b6-b132-3d0101d55a57 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Refreshing instance network info cache due to event network-changed-284d7527-ffe2-4ee7-bb76-65a68cce769e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:24:46 compute-0 nova_compute[183075]: 2026-01-22 17:24:46.329 183079 DEBUG oslo_concurrency.lockutils [req-2d021cbf-7daf-4487-841d-247711f38ec8 req-de54fc8f-2ec5-47b6-b132-3d0101d55a57 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:24:46 compute-0 nova_compute[183075]: 2026-01-22 17:24:46.329 183079 DEBUG oslo_concurrency.lockutils [req-2d021cbf-7daf-4487-841d-247711f38ec8 req-de54fc8f-2ec5-47b6-b132-3d0101d55a57 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:24:46 compute-0 nova_compute[183075]: 2026-01-22 17:24:46.329 183079 DEBUG nova.network.neutron [req-2d021cbf-7daf-4487-841d-247711f38ec8 req-de54fc8f-2ec5-47b6-b132-3d0101d55a57 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Refreshing network info cache for port 284d7527-ffe2-4ee7-bb76-65a68cce769e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:24:46 compute-0 nova_compute[183075]: 2026-01-22 17:24:46.367 183079 DEBUG oslo_concurrency.lockutils [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "refresh_cache-bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:24:46 compute-0 nova_compute[183075]: 2026-01-22 17:24:46.490 183079 DEBUG nova.network.neutron [req-2d021cbf-7daf-4487-841d-247711f38ec8 req-de54fc8f-2ec5-47b6-b132-3d0101d55a57 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:24:46 compute-0 nova_compute[183075]: 2026-01-22 17:24:46.528 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:46 compute-0 nova_compute[183075]: 2026-01-22 17:24:46.765 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:46 compute-0 nova_compute[183075]: 2026-01-22 17:24:46.848 183079 DEBUG nova.network.neutron [req-2d021cbf-7daf-4487-841d-247711f38ec8 req-de54fc8f-2ec5-47b6-b132-3d0101d55a57 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:24:46 compute-0 nova_compute[183075]: 2026-01-22 17:24:46.868 183079 DEBUG oslo_concurrency.lockutils [req-2d021cbf-7daf-4487-841d-247711f38ec8 req-de54fc8f-2ec5-47b6-b132-3d0101d55a57 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:24:46 compute-0 nova_compute[183075]: 2026-01-22 17:24:46.868 183079 DEBUG oslo_concurrency.lockutils [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquired lock "refresh_cache-bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:24:46 compute-0 nova_compute[183075]: 2026-01-22 17:24:46.869 183079 DEBUG nova.network.neutron [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:24:47 compute-0 nova_compute[183075]: 2026-01-22 17:24:47.300 183079 DEBUG nova.network.neutron [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:24:47 compute-0 nova_compute[183075]: 2026-01-22 17:24:47.858 183079 DEBUG oslo_concurrency.lockutils [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "a6598da5-2e3d-4ca1-90ab-2a8db7241468" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:47 compute-0 nova_compute[183075]: 2026-01-22 17:24:47.858 183079 DEBUG oslo_concurrency.lockutils [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "a6598da5-2e3d-4ca1-90ab-2a8db7241468" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:47 compute-0 nova_compute[183075]: 2026-01-22 17:24:47.889 183079 DEBUG nova.compute.manager [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:24:47 compute-0 nova_compute[183075]: 2026-01-22 17:24:47.948 183079 DEBUG oslo_concurrency.lockutils [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:47 compute-0 nova_compute[183075]: 2026-01-22 17:24:47.949 183079 DEBUG oslo_concurrency.lockutils [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:47 compute-0 nova_compute[183075]: 2026-01-22 17:24:47.957 183079 DEBUG nova.virt.hardware [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:24:47 compute-0 nova_compute[183075]: 2026-01-22 17:24:47.957 183079 INFO nova.compute.claims [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.042 183079 DEBUG nova.network.neutron [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Updating instance_info_cache with network_info: [{"id": "284d7527-ffe2-4ee7-bb76-65a68cce769e", "address": "fa:16:3e:83:1a:da", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap284d7527-ff", "ovs_interfaceid": "284d7527-ffe2-4ee7-bb76-65a68cce769e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.072 183079 DEBUG oslo_concurrency.lockutils [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Releasing lock "refresh_cache-bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.073 183079 DEBUG nova.compute.manager [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Instance network_info: |[{"id": "284d7527-ffe2-4ee7-bb76-65a68cce769e", "address": "fa:16:3e:83:1a:da", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap284d7527-ff", "ovs_interfaceid": "284d7527-ffe2-4ee7-bb76-65a68cce769e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.078 183079 DEBUG nova.virt.libvirt.driver [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Start _get_guest_xml network_info=[{"id": "284d7527-ffe2-4ee7-bb76-65a68cce769e", "address": "fa:16:3e:83:1a:da", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap284d7527-ff", "ovs_interfaceid": "284d7527-ffe2-4ee7-bb76-65a68cce769e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.084 183079 WARNING nova.virt.libvirt.driver [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.091 183079 DEBUG nova.virt.libvirt.host [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.092 183079 DEBUG nova.virt.libvirt.host [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.096 183079 DEBUG nova.virt.libvirt.host [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.097 183079 DEBUG nova.virt.libvirt.host [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.098 183079 DEBUG nova.virt.libvirt.driver [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.098 183079 DEBUG nova.virt.hardware [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.099 183079 DEBUG nova.virt.hardware [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.099 183079 DEBUG nova.virt.hardware [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.100 183079 DEBUG nova.virt.hardware [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.100 183079 DEBUG nova.virt.hardware [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.101 183079 DEBUG nova.virt.hardware [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.101 183079 DEBUG nova.virt.hardware [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.101 183079 DEBUG nova.virt.hardware [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.102 183079 DEBUG nova.virt.hardware [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.102 183079 DEBUG nova.virt.hardware [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.103 183079 DEBUG nova.virt.hardware [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.109 183079 DEBUG nova.virt.libvirt.vif [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:24:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-992571725',display_name='tempest-server-test-992571725',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-992571725',id=43,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBZ1addGiefngOIZky2bZbl/LpQHm8ydIPPcYG47RZ8D1pquJ/FsAcVn7mNe0QyfiQPXvuSs1eSw/jGOy+x3K8tzxt5N8fIelXKcTizhDHr81ws2uYn+dW6F9Hiy6pJp3A==',key_name='tempest-keypair-test-453437320',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='02818155e7af4645bc909d4ba671f11f',ramdisk_id='',reservation_id='r-45nx0jeh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='vir
tio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSameNetwork-953620552',owner_user_name='tempest-FloatingIpSameNetwork-953620552-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:24:44Z,user_data=None,user_id='1148a46489e842e6a0c7660c54567798',uuid=bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "284d7527-ffe2-4ee7-bb76-65a68cce769e", "address": "fa:16:3e:83:1a:da", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap284d7527-ff", "ovs_interfaceid": "284d7527-ffe2-4ee7-bb76-65a68cce769e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.110 183079 DEBUG nova.network.os_vif_util [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converting VIF {"id": "284d7527-ffe2-4ee7-bb76-65a68cce769e", "address": "fa:16:3e:83:1a:da", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap284d7527-ff", "ovs_interfaceid": "284d7527-ffe2-4ee7-bb76-65a68cce769e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.111 183079 DEBUG nova.network.os_vif_util [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:1a:da,bridge_name='br-int',has_traffic_filtering=True,id=284d7527-ffe2-4ee7-bb76-65a68cce769e,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap284d7527-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.112 183079 DEBUG nova.objects.instance [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lazy-loading 'pci_devices' on Instance uuid bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.122 183079 DEBUG nova.compute.provider_tree [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.128 183079 DEBUG nova.virt.libvirt.driver [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:24:48 compute-0 nova_compute[183075]:   <uuid>bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1</uuid>
Jan 22 17:24:48 compute-0 nova_compute[183075]:   <name>instance-0000002b</name>
Jan 22 17:24:48 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:24:48 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:24:48 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:24:48 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-992571725</nova:name>
Jan 22 17:24:48 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:24:48</nova:creationTime>
Jan 22 17:24:48 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:24:48 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:24:48 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:24:48 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:24:48 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:24:48 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:24:48 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:24:48 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:24:48 compute-0 nova_compute[183075]:         <nova:user uuid="1148a46489e842e6a0c7660c54567798">tempest-FloatingIpSameNetwork-953620552-project-member</nova:user>
Jan 22 17:24:48 compute-0 nova_compute[183075]:         <nova:project uuid="02818155e7af4645bc909d4ba671f11f">tempest-FloatingIpSameNetwork-953620552</nova:project>
Jan 22 17:24:48 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:24:48 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:24:48 compute-0 nova_compute[183075]:         <nova:port uuid="284d7527-ffe2-4ee7-bb76-65a68cce769e">
Jan 22 17:24:48 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:24:48 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:24:48 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:24:48 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <system>
Jan 22 17:24:48 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:24:48 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:24:48 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:24:48 compute-0 nova_compute[183075]:       <entry name="serial">bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1</entry>
Jan 22 17:24:48 compute-0 nova_compute[183075]:       <entry name="uuid">bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1</entry>
Jan 22 17:24:48 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     </system>
Jan 22 17:24:48 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:24:48 compute-0 nova_compute[183075]:   <os>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:   </os>
Jan 22 17:24:48 compute-0 nova_compute[183075]:   <features>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:   </features>
Jan 22 17:24:48 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:24:48 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:24:48 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:24:48 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1/disk"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:24:48 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:83:1a:da"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:       <target dev="tap284d7527-ff"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:24:48 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1/console.log" append="off"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <video>
Jan 22 17:24:48 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     </video>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:24:48 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:24:48 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:24:48 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:24:48 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:24:48 compute-0 nova_compute[183075]: </domain>
Jan 22 17:24:48 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.130 183079 DEBUG nova.compute.manager [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Preparing to wait for external event network-vif-plugged-284d7527-ffe2-4ee7-bb76-65a68cce769e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.130 183079 DEBUG oslo_concurrency.lockutils [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.131 183079 DEBUG oslo_concurrency.lockutils [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.131 183079 DEBUG oslo_concurrency.lockutils [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.131 183079 DEBUG nova.virt.libvirt.vif [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:24:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-992571725',display_name='tempest-server-test-992571725',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-992571725',id=43,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBZ1addGiefngOIZky2bZbl/LpQHm8ydIPPcYG47RZ8D1pquJ/FsAcVn7mNe0QyfiQPXvuSs1eSw/jGOy+x3K8tzxt5N8fIelXKcTizhDHr81ws2uYn+dW6F9Hiy6pJp3A==',key_name='tempest-keypair-test-453437320',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='02818155e7af4645bc909d4ba671f11f',ramdisk_id='',reservation_id='r-45nx0jeh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSameNetwork-953620552',owner_user_name='tempest-FloatingIpSameNetwork-953620552-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:24:44Z,user_data=None,user_id='1148a46489e842e6a0c7660c54567798',uuid=bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "284d7527-ffe2-4ee7-bb76-65a68cce769e", "address": "fa:16:3e:83:1a:da", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap284d7527-ff", "ovs_interfaceid": "284d7527-ffe2-4ee7-bb76-65a68cce769e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.132 183079 DEBUG nova.network.os_vif_util [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converting VIF {"id": "284d7527-ffe2-4ee7-bb76-65a68cce769e", "address": "fa:16:3e:83:1a:da", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap284d7527-ff", "ovs_interfaceid": "284d7527-ffe2-4ee7-bb76-65a68cce769e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.132 183079 DEBUG nova.network.os_vif_util [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:1a:da,bridge_name='br-int',has_traffic_filtering=True,id=284d7527-ffe2-4ee7-bb76-65a68cce769e,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap284d7527-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.133 183079 DEBUG os_vif [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:1a:da,bridge_name='br-int',has_traffic_filtering=True,id=284d7527-ffe2-4ee7-bb76-65a68cce769e,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap284d7527-ff') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.134 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.134 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.134 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.137 183079 DEBUG nova.scheduler.client.report [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.141 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.141 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap284d7527-ff, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.141 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap284d7527-ff, col_values=(('external_ids', {'iface-id': '284d7527-ffe2-4ee7-bb76-65a68cce769e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:83:1a:da', 'vm-uuid': 'bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.143 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:48 compute-0 NetworkManager[55454]: <info>  [1769102688.1447] manager: (tap284d7527-ff): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/209)
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.144 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.151 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.152 183079 INFO os_vif [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:1a:da,bridge_name='br-int',has_traffic_filtering=True,id=284d7527-ffe2-4ee7-bb76-65a68cce769e,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap284d7527-ff')
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.157 183079 DEBUG oslo_concurrency.lockutils [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.157 183079 DEBUG nova.compute.manager [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.204 183079 DEBUG nova.compute.manager [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.204 183079 DEBUG nova.network.neutron [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.223 183079 INFO nova.virt.libvirt.driver [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.228 183079 DEBUG nova.virt.libvirt.driver [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.228 183079 DEBUG nova.virt.libvirt.driver [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] No VIF found with MAC fa:16:3e:83:1a:da, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.236 183079 DEBUG nova.compute.manager [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.315 183079 DEBUG nova.compute.manager [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.316 183079 DEBUG nova.virt.libvirt.driver [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.316 183079 INFO nova.virt.libvirt.driver [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Creating image(s)
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.317 183079 DEBUG oslo_concurrency.lockutils [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "/var/lib/nova/instances/a6598da5-2e3d-4ca1-90ab-2a8db7241468/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.317 183079 DEBUG oslo_concurrency.lockutils [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "/var/lib/nova/instances/a6598da5-2e3d-4ca1-90ab-2a8db7241468/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.317 183079 DEBUG oslo_concurrency.lockutils [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "/var/lib/nova/instances/a6598da5-2e3d-4ca1-90ab-2a8db7241468/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:48 compute-0 kernel: tap284d7527-ff: entered promiscuous mode
Jan 22 17:24:48 compute-0 NetworkManager[55454]: <info>  [1769102688.3309] manager: (tap284d7527-ff): new Tun device (/org/freedesktop/NetworkManager/Devices/210)
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.330 183079 DEBUG oslo_concurrency.processutils [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:24:48 compute-0 ovn_controller[95372]: 2026-01-22T17:24:48Z|00491|binding|INFO|Claiming lport 284d7527-ffe2-4ee7-bb76-65a68cce769e for this chassis.
Jan 22 17:24:48 compute-0 ovn_controller[95372]: 2026-01-22T17:24:48Z|00492|binding|INFO|284d7527-ffe2-4ee7-bb76-65a68cce769e: Claiming fa:16:3e:83:1a:da 10.100.0.11
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:48.339 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:1a:da 10.100.0.11'], port_security=['fa:16:3e:83:1a:da 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eee918a6-66b2-47ae-b702-620a23ef395b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02818155e7af4645bc909d4ba671f11f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c960fc2f-fbcd-48ff-b6bd-d3b900f07332', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.233'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bad11b1d-0e62-4467-b3f8-4f80f0fd75da, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=10, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=284d7527-ffe2-4ee7-bb76-65a68cce769e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:48.341 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 284d7527-ffe2-4ee7-bb76-65a68cce769e in datapath eee918a6-66b2-47ae-b702-620a23ef395b bound to our chassis
Jan 22 17:24:48 compute-0 ovn_controller[95372]: 2026-01-22T17:24:48Z|00493|binding|INFO|Setting lport 284d7527-ffe2-4ee7-bb76-65a68cce769e ovn-installed in OVS
Jan 22 17:24:48 compute-0 ovn_controller[95372]: 2026-01-22T17:24:48Z|00494|binding|INFO|Setting lport 284d7527-ffe2-4ee7-bb76-65a68cce769e up in Southbound
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:48.343 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network eee918a6-66b2-47ae-b702-620a23ef395b
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:48.361 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d8c66a8a-d12d-4e2d-ac33-cd64006e058f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:48.363 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapeee918a6-61 in ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.366 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:48.370 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapeee918a6-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:48.370 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[177d0ca8-7b39-4294-b831-b0bbf16a9a26]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:48.371 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[23bb6e2c-3f2d-4856-a3c2-d16e40107fd3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:48 compute-0 systemd-udevd[229290]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:24:48 compute-0 systemd-machined[154382]: New machine qemu-43-instance-0000002b.
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:48.394 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[c689a036-981e-4d87-826e-7fc4a9cdc460]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:48 compute-0 systemd[1]: Started Virtual Machine qemu-43-instance-0000002b.
Jan 22 17:24:48 compute-0 NetworkManager[55454]: <info>  [1769102688.4069] device (tap284d7527-ff): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:24:48 compute-0 NetworkManager[55454]: <info>  [1769102688.4079] device (tap284d7527-ff): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:48.427 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a574d5a2-4c87-4aac-ae67-1b3d8b5e01ea]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.435 183079 DEBUG oslo_concurrency.processutils [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.105s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.436 183079 DEBUG oslo_concurrency.lockutils [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.437 183079 DEBUG oslo_concurrency.lockutils [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:48.460 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[73ae94fe-b143-4bd0-be68-c8ff0a7e07ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.461 183079 DEBUG oslo_concurrency.processutils [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:48.468 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ad376a29-b5df-42a1-af35-8963065cfbfe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:48 compute-0 NetworkManager[55454]: <info>  [1769102688.4704] manager: (tapeee918a6-60): new Veth device (/org/freedesktop/NetworkManager/Devices/211)
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:48.509 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[bd2ee539-11b5-4f11-a9d0-1df12b6337b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:48.513 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[7b169e3c-208c-42e8-9b7c-aaea17d084ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:48 compute-0 NetworkManager[55454]: <info>  [1769102688.5451] device (tapeee918a6-60): carrier: link connected
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.549 183079 DEBUG oslo_concurrency.processutils [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.550 183079 DEBUG oslo_concurrency.processutils [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/a6598da5-2e3d-4ca1-90ab-2a8db7241468/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:48.552 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[d0d1de7f-e5b2-4a71-9f8c-f93e517a17bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:48.574 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[63ac2010-0dd6-4f36-bd91-267f0a105919]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeee918a6-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:e2:7e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 141], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 505224, 'reachable_time': 37699, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229366, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:48 compute-0 podman[229322]: 2026-01-22 17:24:48.583348365 +0000 UTC m=+0.068344127 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.587 183079 DEBUG nova.compute.manager [req-2ba92e79-5ce2-4316-bc77-f58f2b71bf17 req-4612a526-621e-4edc-9192-f8bc1c884cc7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Received event network-vif-plugged-284d7527-ffe2-4ee7-bb76-65a68cce769e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.587 183079 DEBUG oslo_concurrency.lockutils [req-2ba92e79-5ce2-4316-bc77-f58f2b71bf17 req-4612a526-621e-4edc-9192-f8bc1c884cc7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.588 183079 DEBUG oslo_concurrency.lockutils [req-2ba92e79-5ce2-4316-bc77-f58f2b71bf17 req-4612a526-621e-4edc-9192-f8bc1c884cc7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.588 183079 DEBUG oslo_concurrency.lockutils [req-2ba92e79-5ce2-4316-bc77-f58f2b71bf17 req-4612a526-621e-4edc-9192-f8bc1c884cc7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.589 183079 DEBUG nova.compute.manager [req-2ba92e79-5ce2-4316-bc77-f58f2b71bf17 req-4612a526-621e-4edc-9192-f8bc1c884cc7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Processing event network-vif-plugged-284d7527-ffe2-4ee7-bb76-65a68cce769e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.591 183079 DEBUG oslo_concurrency.processutils [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/a6598da5-2e3d-4ca1-90ab-2a8db7241468/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.591 183079 DEBUG oslo_concurrency.lockutils [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.592 183079 DEBUG oslo_concurrency.processutils [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:48.595 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f70b9718-be0a-4324-b80d-031495c33f29]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe68:e27e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 505224, 'tstamp': 505224}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229385, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.610 183079 DEBUG nova.policy [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:24:48 compute-0 podman[229314]: 2026-01-22 17:24:48.611222459 +0000 UTC m=+0.095111112 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:48.615 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f7677ed7-513e-4fce-971b-497fe2cab3c0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeee918a6-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:e2:7e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 141], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 505224, 'reachable_time': 37699, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229393, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:48 compute-0 podman[229324]: 2026-01-22 17:24:48.618517814 +0000 UTC m=+0.099199091 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9-minimal)
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.652 183079 DEBUG oslo_concurrency.processutils [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.653 183079 DEBUG nova.virt.disk.api [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Checking if we can resize image /var/lib/nova/instances/a6598da5-2e3d-4ca1-90ab-2a8db7241468/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.653 183079 DEBUG oslo_concurrency.processutils [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6598da5-2e3d-4ca1-90ab-2a8db7241468/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:48.655 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8c1cbb95-b3d8-483a-b8a0-e658767f2474]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.708 183079 DEBUG oslo_concurrency.processutils [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6598da5-2e3d-4ca1-90ab-2a8db7241468/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.708 183079 DEBUG nova.virt.disk.api [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Cannot resize image /var/lib/nova/instances/a6598da5-2e3d-4ca1-90ab-2a8db7241468/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.709 183079 DEBUG nova.objects.instance [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lazy-loading 'migration_context' on Instance uuid a6598da5-2e3d-4ca1-90ab-2a8db7241468 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:48.719 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[fb191f37-4f50-4ad1-aca6-a04666261547]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:48.720 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeee918a6-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:48.721 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:48.721 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeee918a6-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.722 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:48 compute-0 NetworkManager[55454]: <info>  [1769102688.7234] manager: (tapeee918a6-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/212)
Jan 22 17:24:48 compute-0 kernel: tapeee918a6-60: entered promiscuous mode
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:48.726 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapeee918a6-60, col_values=(('external_ids', {'iface-id': '15d4de90-41f4-4532-aebd-197c2a33c6d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.727 183079 DEBUG nova.virt.libvirt.driver [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.727 183079 DEBUG nova.virt.libvirt.driver [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Ensure instance console log exists: /var/lib/nova/instances/a6598da5-2e3d-4ca1-90ab-2a8db7241468/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:24:48 compute-0 ovn_controller[95372]: 2026-01-22T17:24:48Z|00495|binding|INFO|Releasing lport 15d4de90-41f4-4532-aebd-197c2a33c6d6 from this chassis (sb_readonly=0)
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.728 183079 DEBUG oslo_concurrency.lockutils [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.728 183079 DEBUG oslo_concurrency.lockutils [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:48.728 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/eee918a6-66b2-47ae-b702-620a23ef395b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/eee918a6-66b2-47ae-b702-620a23ef395b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.728 183079 DEBUG oslo_concurrency.lockutils [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.729 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:48.729 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e229bd0a-e5a5-400a-81e7-dded59f44c2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:48.730 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/eee918a6-66b2-47ae-b702-620a23ef395b.pid.haproxy
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID eee918a6-66b2-47ae-b702-620a23ef395b
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:24:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:48.730 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'env', 'PROCESS_TAG=haproxy-eee918a6-66b2-47ae-b702-620a23ef395b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/eee918a6-66b2-47ae-b702-620a23ef395b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.738 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.931 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102688.9312077, bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.932 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] VM Started (Lifecycle Event)
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.940 183079 DEBUG nova.compute.manager [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.943 183079 DEBUG nova.virt.libvirt.driver [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.947 183079 INFO nova.virt.libvirt.driver [-] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Instance spawned successfully.
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.947 183079 DEBUG nova.virt.libvirt.driver [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.963 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.969 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.974 183079 DEBUG nova.virt.libvirt.driver [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.974 183079 DEBUG nova.virt.libvirt.driver [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.975 183079 DEBUG nova.virt.libvirt.driver [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.976 183079 DEBUG nova.virt.libvirt.driver [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.976 183079 DEBUG nova.virt.libvirt.driver [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.977 183079 DEBUG nova.virt.libvirt.driver [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.990 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.991 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102688.9314184, bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:24:48 compute-0 nova_compute[183075]: 2026-01-22 17:24:48.991 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] VM Paused (Lifecycle Event)
Jan 22 17:24:49 compute-0 nova_compute[183075]: 2026-01-22 17:24:49.016 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:24:49 compute-0 nova_compute[183075]: 2026-01-22 17:24:49.019 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102688.9427018, bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:24:49 compute-0 nova_compute[183075]: 2026-01-22 17:24:49.019 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] VM Resumed (Lifecycle Event)
Jan 22 17:24:49 compute-0 nova_compute[183075]: 2026-01-22 17:24:49.047 183079 INFO nova.compute.manager [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Took 4.32 seconds to spawn the instance on the hypervisor.
Jan 22 17:24:49 compute-0 nova_compute[183075]: 2026-01-22 17:24:49.047 183079 DEBUG nova.compute.manager [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:24:49 compute-0 nova_compute[183075]: 2026-01-22 17:24:49.049 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:24:49 compute-0 nova_compute[183075]: 2026-01-22 17:24:49.054 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:24:49 compute-0 nova_compute[183075]: 2026-01-22 17:24:49.085 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:24:49 compute-0 nova_compute[183075]: 2026-01-22 17:24:49.115 183079 INFO nova.compute.manager [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Took 5.50 seconds to build instance.
Jan 22 17:24:49 compute-0 podman[229437]: 2026-01-22 17:24:49.116617972 +0000 UTC m=+0.049220646 container create 63ef95d51246a7972c49e5667b1a1bdd2d735676fb52121165191bff8b343e32 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 17:24:49 compute-0 nova_compute[183075]: 2026-01-22 17:24:49.133 183079 DEBUG oslo_concurrency.lockutils [None req-66dfbdfd-87c0-4ed0-89e2-2020af474522 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.600s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:49 compute-0 systemd[1]: Started libpod-conmon-63ef95d51246a7972c49e5667b1a1bdd2d735676fb52121165191bff8b343e32.scope.
Jan 22 17:24:49 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:24:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2f1474a7079b267a1c55d91def8af2b533876438a16430b09ff8a4609526ec9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:24:49 compute-0 podman[229437]: 2026-01-22 17:24:49.091590273 +0000 UTC m=+0.024192967 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:24:49 compute-0 podman[229437]: 2026-01-22 17:24:49.19814969 +0000 UTC m=+0.130752364 container init 63ef95d51246a7972c49e5667b1a1bdd2d735676fb52121165191bff8b343e32 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 17:24:49 compute-0 podman[229437]: 2026-01-22 17:24:49.208061315 +0000 UTC m=+0.140663989 container start 63ef95d51246a7972c49e5667b1a1bdd2d735676fb52121165191bff8b343e32 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 17:24:49 compute-0 neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b[229452]: [NOTICE]   (229456) : New worker (229458) forked
Jan 22 17:24:49 compute-0 neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b[229452]: [NOTICE]   (229456) : Loading success.
Jan 22 17:24:49 compute-0 nova_compute[183075]: 2026-01-22 17:24:49.479 183079 DEBUG nova.network.neutron [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Successfully updated port: ff4c20a1-cc0e-4a39-80b4-bb1426093c82 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:24:49 compute-0 nova_compute[183075]: 2026-01-22 17:24:49.497 183079 DEBUG oslo_concurrency.lockutils [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "refresh_cache-a6598da5-2e3d-4ca1-90ab-2a8db7241468" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:24:49 compute-0 nova_compute[183075]: 2026-01-22 17:24:49.498 183079 DEBUG oslo_concurrency.lockutils [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquired lock "refresh_cache-a6598da5-2e3d-4ca1-90ab-2a8db7241468" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:24:49 compute-0 nova_compute[183075]: 2026-01-22 17:24:49.498 183079 DEBUG nova.network.neutron [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:24:49 compute-0 nova_compute[183075]: 2026-01-22 17:24:49.544 183079 DEBUG nova.compute.manager [req-61c7b5fb-724f-4672-8d84-8f8a2698baf4 req-f4ae7069-f2e4-4209-abee-d01325a33316 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Received event network-changed-ff4c20a1-cc0e-4a39-80b4-bb1426093c82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:24:49 compute-0 nova_compute[183075]: 2026-01-22 17:24:49.544 183079 DEBUG nova.compute.manager [req-61c7b5fb-724f-4672-8d84-8f8a2698baf4 req-f4ae7069-f2e4-4209-abee-d01325a33316 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Refreshing instance network info cache due to event network-changed-ff4c20a1-cc0e-4a39-80b4-bb1426093c82. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:24:49 compute-0 nova_compute[183075]: 2026-01-22 17:24:49.544 183079 DEBUG oslo_concurrency.lockutils [req-61c7b5fb-724f-4672-8d84-8f8a2698baf4 req-f4ae7069-f2e4-4209-abee-d01325a33316 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-a6598da5-2e3d-4ca1-90ab-2a8db7241468" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:24:49 compute-0 nova_compute[183075]: 2026-01-22 17:24:49.634 183079 DEBUG nova.network.neutron [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.319 183079 DEBUG nova.network.neutron [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Updating instance_info_cache with network_info: [{"id": "ff4c20a1-cc0e-4a39-80b4-bb1426093c82", "address": "fa:16:3e:41:19:a3", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff4c20a1-cc", "ovs_interfaceid": "ff4c20a1-cc0e-4a39-80b4-bb1426093c82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.352 183079 DEBUG oslo_concurrency.lockutils [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Releasing lock "refresh_cache-a6598da5-2e3d-4ca1-90ab-2a8db7241468" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.352 183079 DEBUG nova.compute.manager [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Instance network_info: |[{"id": "ff4c20a1-cc0e-4a39-80b4-bb1426093c82", "address": "fa:16:3e:41:19:a3", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff4c20a1-cc", "ovs_interfaceid": "ff4c20a1-cc0e-4a39-80b4-bb1426093c82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.352 183079 DEBUG oslo_concurrency.lockutils [req-61c7b5fb-724f-4672-8d84-8f8a2698baf4 req-f4ae7069-f2e4-4209-abee-d01325a33316 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-a6598da5-2e3d-4ca1-90ab-2a8db7241468" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.353 183079 DEBUG nova.network.neutron [req-61c7b5fb-724f-4672-8d84-8f8a2698baf4 req-f4ae7069-f2e4-4209-abee-d01325a33316 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Refreshing network info cache for port ff4c20a1-cc0e-4a39-80b4-bb1426093c82 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.356 183079 DEBUG nova.virt.libvirt.driver [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Start _get_guest_xml network_info=[{"id": "ff4c20a1-cc0e-4a39-80b4-bb1426093c82", "address": "fa:16:3e:41:19:a3", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff4c20a1-cc", "ovs_interfaceid": "ff4c20a1-cc0e-4a39-80b4-bb1426093c82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.361 183079 WARNING nova.virt.libvirt.driver [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.366 183079 DEBUG nova.virt.libvirt.host [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.367 183079 DEBUG nova.virt.libvirt.host [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.374 183079 DEBUG nova.virt.libvirt.host [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.375 183079 DEBUG nova.virt.libvirt.host [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.375 183079 DEBUG nova.virt.libvirt.driver [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.375 183079 DEBUG nova.virt.hardware [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.376 183079 DEBUG nova.virt.hardware [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.376 183079 DEBUG nova.virt.hardware [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.376 183079 DEBUG nova.virt.hardware [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.377 183079 DEBUG nova.virt.hardware [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.377 183079 DEBUG nova.virt.hardware [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.377 183079 DEBUG nova.virt.hardware [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.377 183079 DEBUG nova.virt.hardware [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.378 183079 DEBUG nova.virt.hardware [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.378 183079 DEBUG nova.virt.hardware [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.378 183079 DEBUG nova.virt.hardware [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.383 183079 DEBUG nova.virt.libvirt.vif [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:24:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-1-468539916',display_name='tempest-server-1-468539916',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-1-468539916',id=44,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMynqvoXBI34qH0ZDQNq01ZPmk41Xdi4JxkvNO4nJ7GrdggfhpXkKQhhUZRV3fv3bSwcXMqi04ipV9xe7IsTm4GBRV+U9o6VN5JSMLulp+KIvjPuEmjq0h65ra09/zQB9w==',key_name='tempest-keypair-test-1762910261',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e4c0bb18013747dfad2e25b2495090eb',ramdisk_id='',reservation_id='r-tqk4zn7k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortForwardingTestJSON-1240706675',owner_user_name='tempest-PortForwardingTestJSON-1240706675-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:24:48Z,user_data=None,user_id='852aea4e08344f39ae07e6b57393c767',uuid=a6598da5-2e3d-4ca1-90ab-2a8db7241468,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ff4c20a1-cc0e-4a39-80b4-bb1426093c82", "address": "fa:16:3e:41:19:a3", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff4c20a1-cc", "ovs_interfaceid": "ff4c20a1-cc0e-4a39-80b4-bb1426093c82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.383 183079 DEBUG nova.network.os_vif_util [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Converting VIF {"id": "ff4c20a1-cc0e-4a39-80b4-bb1426093c82", "address": "fa:16:3e:41:19:a3", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff4c20a1-cc", "ovs_interfaceid": "ff4c20a1-cc0e-4a39-80b4-bb1426093c82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.384 183079 DEBUG nova.network.os_vif_util [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:41:19:a3,bridge_name='br-int',has_traffic_filtering=True,id=ff4c20a1-cc0e-4a39-80b4-bb1426093c82,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapff4c20a1-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.385 183079 DEBUG nova.objects.instance [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lazy-loading 'pci_devices' on Instance uuid a6598da5-2e3d-4ca1-90ab-2a8db7241468 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.400 183079 DEBUG nova.virt.libvirt.driver [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:24:50 compute-0 nova_compute[183075]:   <uuid>a6598da5-2e3d-4ca1-90ab-2a8db7241468</uuid>
Jan 22 17:24:50 compute-0 nova_compute[183075]:   <name>instance-0000002c</name>
Jan 22 17:24:50 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:24:50 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:24:50 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:24:50 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:       <nova:name>tempest-server-1-468539916</nova:name>
Jan 22 17:24:50 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:24:50</nova:creationTime>
Jan 22 17:24:50 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:24:50 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:24:50 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:24:50 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:24:50 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:24:50 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:24:50 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:24:50 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:24:50 compute-0 nova_compute[183075]:         <nova:user uuid="852aea4e08344f39ae07e6b57393c767">tempest-PortForwardingTestJSON-1240706675-project-member</nova:user>
Jan 22 17:24:50 compute-0 nova_compute[183075]:         <nova:project uuid="e4c0bb18013747dfad2e25b2495090eb">tempest-PortForwardingTestJSON-1240706675</nova:project>
Jan 22 17:24:50 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:24:50 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:24:50 compute-0 nova_compute[183075]:         <nova:port uuid="ff4c20a1-cc0e-4a39-80b4-bb1426093c82">
Jan 22 17:24:50 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:24:50 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:24:50 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:24:50 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <system>
Jan 22 17:24:50 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:24:50 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:24:50 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:24:50 compute-0 nova_compute[183075]:       <entry name="serial">a6598da5-2e3d-4ca1-90ab-2a8db7241468</entry>
Jan 22 17:24:50 compute-0 nova_compute[183075]:       <entry name="uuid">a6598da5-2e3d-4ca1-90ab-2a8db7241468</entry>
Jan 22 17:24:50 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     </system>
Jan 22 17:24:50 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:24:50 compute-0 nova_compute[183075]:   <os>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:   </os>
Jan 22 17:24:50 compute-0 nova_compute[183075]:   <features>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:   </features>
Jan 22 17:24:50 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:24:50 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:24:50 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:24:50 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/a6598da5-2e3d-4ca1-90ab-2a8db7241468/disk"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:24:50 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:41:19:a3"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:       <target dev="tapff4c20a1-cc"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:24:50 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/a6598da5-2e3d-4ca1-90ab-2a8db7241468/console.log" append="off"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <video>
Jan 22 17:24:50 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     </video>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:24:50 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:24:50 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:24:50 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:24:50 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:24:50 compute-0 nova_compute[183075]: </domain>
Jan 22 17:24:50 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.400 183079 DEBUG nova.compute.manager [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Preparing to wait for external event network-vif-plugged-ff4c20a1-cc0e-4a39-80b4-bb1426093c82 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.400 183079 DEBUG oslo_concurrency.lockutils [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "a6598da5-2e3d-4ca1-90ab-2a8db7241468-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.401 183079 DEBUG oslo_concurrency.lockutils [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "a6598da5-2e3d-4ca1-90ab-2a8db7241468-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.401 183079 DEBUG oslo_concurrency.lockutils [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "a6598da5-2e3d-4ca1-90ab-2a8db7241468-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:50 compute-0 NetworkManager[55454]: <info>  [1769102690.4092] manager: (tapff4c20a1-cc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/213)
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.402 183079 DEBUG nova.virt.libvirt.vif [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:24:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-1-468539916',display_name='tempest-server-1-468539916',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-1-468539916',id=44,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMynqvoXBI34qH0ZDQNq01ZPmk41Xdi4JxkvNO4nJ7GrdggfhpXkKQhhUZRV3fv3bSwcXMqi04ipV9xe7IsTm4GBRV+U9o6VN5JSMLulp+KIvjPuEmjq0h65ra09/zQB9w==',key_name='tempest-keypair-test-1762910261',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e4c0bb18013747dfad2e25b2495090eb',ramdisk_id='',reservation_id='r-tqk4zn7k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortForwardingTestJSON-1240706675',owner_user_name='tempest-PortForwardingTestJSON-1240706675-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:24:48Z,user_data=None,user_id='852aea4e08344f39ae07e6b57393c767',uuid=a6598da5-2e3d-4ca1-90ab-2a8db7241468,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ff4c20a1-cc0e-4a39-80b4-bb1426093c82", "address": "fa:16:3e:41:19:a3", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff4c20a1-cc", "ovs_interfaceid": "ff4c20a1-cc0e-4a39-80b4-bb1426093c82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.402 183079 DEBUG nova.network.os_vif_util [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Converting VIF {"id": "ff4c20a1-cc0e-4a39-80b4-bb1426093c82", "address": "fa:16:3e:41:19:a3", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff4c20a1-cc", "ovs_interfaceid": "ff4c20a1-cc0e-4a39-80b4-bb1426093c82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.402 183079 DEBUG nova.network.os_vif_util [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:41:19:a3,bridge_name='br-int',has_traffic_filtering=True,id=ff4c20a1-cc0e-4a39-80b4-bb1426093c82,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapff4c20a1-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.403 183079 DEBUG os_vif [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:19:a3,bridge_name='br-int',has_traffic_filtering=True,id=ff4c20a1-cc0e-4a39-80b4-bb1426093c82,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapff4c20a1-cc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.403 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.404 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.404 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.406 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.406 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapff4c20a1-cc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.407 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapff4c20a1-cc, col_values=(('external_ids', {'iface-id': 'ff4c20a1-cc0e-4a39-80b4-bb1426093c82', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:41:19:a3', 'vm-uuid': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.408 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.411 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.422 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.423 183079 INFO os_vif [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:19:a3,bridge_name='br-int',has_traffic_filtering=True,id=ff4c20a1-cc0e-4a39-80b4-bb1426093c82,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapff4c20a1-cc')
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.490 183079 DEBUG nova.virt.libvirt.driver [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.490 183079 DEBUG nova.virt.libvirt.driver [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] No VIF found with MAC fa:16:3e:41:19:a3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:24:50 compute-0 kernel: tapff4c20a1-cc: entered promiscuous mode
Jan 22 17:24:50 compute-0 NetworkManager[55454]: <info>  [1769102690.5546] manager: (tapff4c20a1-cc): new Tun device (/org/freedesktop/NetworkManager/Devices/214)
Jan 22 17:24:50 compute-0 ovn_controller[95372]: 2026-01-22T17:24:50Z|00496|binding|INFO|Claiming lport ff4c20a1-cc0e-4a39-80b4-bb1426093c82 for this chassis.
Jan 22 17:24:50 compute-0 ovn_controller[95372]: 2026-01-22T17:24:50Z|00497|binding|INFO|ff4c20a1-cc0e-4a39-80b4-bb1426093c82: Claiming fa:16:3e:41:19:a3 10.100.0.12
Jan 22 17:24:50 compute-0 systemd-udevd[229310]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.556 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:50.562 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:19:a3 10.100.0.12'], port_security=['fa:16:3e:41:19:a3 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'neutron:revision_number': '2', 'neutron:security_group_ids': '94e3530f-8012-4817-a338-7919b109ef3e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12343ce0-7cef-4f7f-9439-6550d878d4ba, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=ff4c20a1-cc0e-4a39-80b4-bb1426093c82) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:50.563 104629 INFO neutron.agent.ovn.metadata.agent [-] Port ff4c20a1-cc0e-4a39-80b4-bb1426093c82 in datapath 44326f3c-1431-44d6-85ce-61ecbbb5ed7a bound to our chassis
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:50.565 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 44326f3c-1431-44d6-85ce-61ecbbb5ed7a
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.576 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:50 compute-0 ovn_controller[95372]: 2026-01-22T17:24:50Z|00498|binding|INFO|Setting lport ff4c20a1-cc0e-4a39-80b4-bb1426093c82 ovn-installed in OVS
Jan 22 17:24:50 compute-0 ovn_controller[95372]: 2026-01-22T17:24:50Z|00499|binding|INFO|Setting lport ff4c20a1-cc0e-4a39-80b4-bb1426093c82 up in Southbound
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.578 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.580 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:50 compute-0 NetworkManager[55454]: <info>  [1769102690.5819] device (tapff4c20a1-cc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:24:50 compute-0 NetworkManager[55454]: <info>  [1769102690.5829] device (tapff4c20a1-cc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:50.593 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[941b750b-cabe-4729-8b3d-d05be5c135ca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:50.593 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap44326f3c-11 in ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:50.596 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap44326f3c-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:50.596 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[33f9cffc-7534-4551-9316-03fcdd51e00c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:50.597 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8de41f37-3829-4767-b1b7-169b0481bd98]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:50 compute-0 systemd-machined[154382]: New machine qemu-44-instance-0000002c.
Jan 22 17:24:50 compute-0 systemd[1]: Started Virtual Machine qemu-44-instance-0000002c.
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:50.620 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[901986a2-ddfb-40c7-a020-2caf3b6d039a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:50.637 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[03c05f66-af7f-4d42-8233-3656c2e74eb2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:50.678 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[d007a129-428f-4ccf-98ac-64c5b58734a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:50 compute-0 NetworkManager[55454]: <info>  [1769102690.6880] manager: (tap44326f3c-10): new Veth device (/org/freedesktop/NetworkManager/Devices/215)
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:50.685 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1595eba0-15e2-4a42-9144-13900dd3c9cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:50.735 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[23fe49fb-9f7c-412a-a330-1af57b810901]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:50.738 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[808b9a29-2cae-411c-9481-e5632f1c3aed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:50 compute-0 NetworkManager[55454]: <info>  [1769102690.7778] device (tap44326f3c-10): carrier: link connected
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:50.785 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[3072fc04-eab0-4518-8b83-d163d48a398d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:50.806 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[58dd5047-1efb-4a0a-afca-bd8a81916569]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap44326f3c-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:1b:89'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 143], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 505447, 'reachable_time': 18257, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229501, 'error': None, 'target': 'ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:50.829 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6871daf3-986e-41cf-b3c5-fc5551792670]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe37:1b89'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 505447, 'tstamp': 505447}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229502, 'error': None, 'target': 'ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:50.854 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b680e71e-6b4d-4a52-a740-239392718956]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap44326f3c-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:1b:89'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 143], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 505447, 'reachable_time': 18257, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229503, 'error': None, 'target': 'ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:50.896 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[19f0c9ac-583c-406f-8f54-29c76e38d857]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:50.966 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c87ee6af-d0f0-45b7-8039-8383990d856e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:50.968 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44326f3c-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:50.968 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:50.969 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap44326f3c-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:24:50 compute-0 NetworkManager[55454]: <info>  [1769102690.9721] manager: (tap44326f3c-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/216)
Jan 22 17:24:50 compute-0 kernel: tap44326f3c-10: entered promiscuous mode
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.975 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:50.983 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap44326f3c-10, col_values=(('external_ids', {'iface-id': '118957e0-7da0-4d87-b7d4-2c204e19e5b6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:24:50 compute-0 ovn_controller[95372]: 2026-01-22T17:24:50Z|00500|binding|INFO|Releasing lport 118957e0-7da0-4d87-b7d4-2c204e19e5b6 from this chassis (sb_readonly=0)
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:50.991 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/44326f3c-1431-44d6-85ce-61ecbbb5ed7a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/44326f3c-1431-44d6-85ce-61ecbbb5ed7a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:50.992 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[66b622df-f119-4ffb-a0cc-7c3d3db5c818]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:50.993 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/44326f3c-1431-44d6-85ce-61ecbbb5ed7a.pid.haproxy
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 44326f3c-1431-44d6-85ce-61ecbbb5ed7a
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:24:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:24:50.994 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'env', 'PROCESS_TAG=haproxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/44326f3c-1431-44d6-85ce-61ecbbb5ed7a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:24:50 compute-0 nova_compute[183075]: 2026-01-22 17:24:50.987 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:51 compute-0 nova_compute[183075]: 2026-01-22 17:24:51.000 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:51 compute-0 nova_compute[183075]: 2026-01-22 17:24:51.182 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769102676.181503, a7440e72-b977-4601-88ad-ce8a4c72e883 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:24:51 compute-0 nova_compute[183075]: 2026-01-22 17:24:51.183 183079 INFO nova.compute.manager [-] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] VM Stopped (Lifecycle Event)
Jan 22 17:24:51 compute-0 nova_compute[183075]: 2026-01-22 17:24:51.204 183079 DEBUG nova.compute.manager [None req-ac54daf0-8d16-4163-9851-7376326be094 - - - - - -] [instance: a7440e72-b977-4601-88ad-ce8a4c72e883] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:24:51 compute-0 nova_compute[183075]: 2026-01-22 17:24:51.277 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102691.2769406, a6598da5-2e3d-4ca1-90ab-2a8db7241468 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:24:51 compute-0 nova_compute[183075]: 2026-01-22 17:24:51.279 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] VM Started (Lifecycle Event)
Jan 22 17:24:51 compute-0 nova_compute[183075]: 2026-01-22 17:24:51.300 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:24:51 compute-0 nova_compute[183075]: 2026-01-22 17:24:51.305 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102691.27845, a6598da5-2e3d-4ca1-90ab-2a8db7241468 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:24:51 compute-0 nova_compute[183075]: 2026-01-22 17:24:51.305 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] VM Paused (Lifecycle Event)
Jan 22 17:24:51 compute-0 nova_compute[183075]: 2026-01-22 17:24:51.328 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:24:51 compute-0 nova_compute[183075]: 2026-01-22 17:24:51.332 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:24:51 compute-0 nova_compute[183075]: 2026-01-22 17:24:51.352 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:24:51 compute-0 podman[229542]: 2026-01-22 17:24:51.368040912 +0000 UTC m=+0.051017975 container create 279272a5e1fa0de3698abf8d8f92a1e99d0514fd276f8ddd35ac6051459072ae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 22 17:24:51 compute-0 systemd[1]: Started libpod-conmon-279272a5e1fa0de3698abf8d8f92a1e99d0514fd276f8ddd35ac6051459072ae.scope.
Jan 22 17:24:51 compute-0 podman[229542]: 2026-01-22 17:24:51.342704555 +0000 UTC m=+0.025681638 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:24:51 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:24:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e89833e8b875aa10e7cd22530c38c4380399c9bde1e944c194d71e9398d8822/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:24:51 compute-0 podman[229542]: 2026-01-22 17:24:51.46531354 +0000 UTC m=+0.148290623 container init 279272a5e1fa0de3698abf8d8f92a1e99d0514fd276f8ddd35ac6051459072ae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:24:51 compute-0 podman[229542]: 2026-01-22 17:24:51.475191134 +0000 UTC m=+0.158168217 container start 279272a5e1fa0de3698abf8d8f92a1e99d0514fd276f8ddd35ac6051459072ae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:24:51 compute-0 neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229558]: [NOTICE]   (229562) : New worker (229564) forked
Jan 22 17:24:51 compute-0 neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229558]: [NOTICE]   (229562) : Loading success.
Jan 22 17:24:51 compute-0 nova_compute[183075]: 2026-01-22 17:24:51.767 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:52 compute-0 nova_compute[183075]: 2026-01-22 17:24:52.448 183079 DEBUG nova.compute.manager [req-80d8bfa2-3e11-4f27-8d88-1db9555bc519 req-016e92ea-ec4d-40f7-ae39-c9019910a76e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Received event network-vif-plugged-284d7527-ffe2-4ee7-bb76-65a68cce769e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:24:52 compute-0 nova_compute[183075]: 2026-01-22 17:24:52.448 183079 DEBUG oslo_concurrency.lockutils [req-80d8bfa2-3e11-4f27-8d88-1db9555bc519 req-016e92ea-ec4d-40f7-ae39-c9019910a76e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:52 compute-0 nova_compute[183075]: 2026-01-22 17:24:52.448 183079 DEBUG oslo_concurrency.lockutils [req-80d8bfa2-3e11-4f27-8d88-1db9555bc519 req-016e92ea-ec4d-40f7-ae39-c9019910a76e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:52 compute-0 nova_compute[183075]: 2026-01-22 17:24:52.449 183079 DEBUG oslo_concurrency.lockutils [req-80d8bfa2-3e11-4f27-8d88-1db9555bc519 req-016e92ea-ec4d-40f7-ae39-c9019910a76e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:52 compute-0 nova_compute[183075]: 2026-01-22 17:24:52.449 183079 DEBUG nova.compute.manager [req-80d8bfa2-3e11-4f27-8d88-1db9555bc519 req-016e92ea-ec4d-40f7-ae39-c9019910a76e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] No waiting events found dispatching network-vif-plugged-284d7527-ffe2-4ee7-bb76-65a68cce769e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:24:52 compute-0 nova_compute[183075]: 2026-01-22 17:24:52.449 183079 WARNING nova.compute.manager [req-80d8bfa2-3e11-4f27-8d88-1db9555bc519 req-016e92ea-ec4d-40f7-ae39-c9019910a76e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Received unexpected event network-vif-plugged-284d7527-ffe2-4ee7-bb76-65a68cce769e for instance with vm_state active and task_state None.
Jan 22 17:24:52 compute-0 nova_compute[183075]: 2026-01-22 17:24:52.527 183079 INFO nova.compute.manager [None req-f51fef27-afd0-4d31-bc5f-da14068ee023 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Get console output
Jan 22 17:24:52 compute-0 nova_compute[183075]: 2026-01-22 17:24:52.725 183079 DEBUG nova.compute.manager [req-75bc4f26-a066-49ad-b83e-ad82c8e20111 req-a8c1e925-2c20-49e1-9f25-f8cb255b64a4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Received event network-vif-plugged-ff4c20a1-cc0e-4a39-80b4-bb1426093c82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:24:52 compute-0 nova_compute[183075]: 2026-01-22 17:24:52.726 183079 DEBUG oslo_concurrency.lockutils [req-75bc4f26-a066-49ad-b83e-ad82c8e20111 req-a8c1e925-2c20-49e1-9f25-f8cb255b64a4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "a6598da5-2e3d-4ca1-90ab-2a8db7241468-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:52 compute-0 nova_compute[183075]: 2026-01-22 17:24:52.727 183079 DEBUG oslo_concurrency.lockutils [req-75bc4f26-a066-49ad-b83e-ad82c8e20111 req-a8c1e925-2c20-49e1-9f25-f8cb255b64a4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "a6598da5-2e3d-4ca1-90ab-2a8db7241468-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:52 compute-0 nova_compute[183075]: 2026-01-22 17:24:52.727 183079 DEBUG oslo_concurrency.lockutils [req-75bc4f26-a066-49ad-b83e-ad82c8e20111 req-a8c1e925-2c20-49e1-9f25-f8cb255b64a4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "a6598da5-2e3d-4ca1-90ab-2a8db7241468-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:52 compute-0 nova_compute[183075]: 2026-01-22 17:24:52.728 183079 DEBUG nova.compute.manager [req-75bc4f26-a066-49ad-b83e-ad82c8e20111 req-a8c1e925-2c20-49e1-9f25-f8cb255b64a4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Processing event network-vif-plugged-ff4c20a1-cc0e-4a39-80b4-bb1426093c82 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:24:52 compute-0 nova_compute[183075]: 2026-01-22 17:24:52.729 183079 DEBUG nova.compute.manager [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:24:52 compute-0 nova_compute[183075]: 2026-01-22 17:24:52.734 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102692.7333674, a6598da5-2e3d-4ca1-90ab-2a8db7241468 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:24:52 compute-0 nova_compute[183075]: 2026-01-22 17:24:52.734 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] VM Resumed (Lifecycle Event)
Jan 22 17:24:52 compute-0 nova_compute[183075]: 2026-01-22 17:24:52.737 183079 DEBUG nova.virt.libvirt.driver [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:24:52 compute-0 nova_compute[183075]: 2026-01-22 17:24:52.743 183079 INFO nova.virt.libvirt.driver [-] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Instance spawned successfully.
Jan 22 17:24:52 compute-0 nova_compute[183075]: 2026-01-22 17:24:52.743 183079 DEBUG nova.virt.libvirt.driver [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:24:52 compute-0 nova_compute[183075]: 2026-01-22 17:24:52.769 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:24:52 compute-0 nova_compute[183075]: 2026-01-22 17:24:52.777 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:24:52 compute-0 nova_compute[183075]: 2026-01-22 17:24:52.783 183079 DEBUG nova.virt.libvirt.driver [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:24:52 compute-0 nova_compute[183075]: 2026-01-22 17:24:52.783 183079 DEBUG nova.virt.libvirt.driver [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:24:52 compute-0 nova_compute[183075]: 2026-01-22 17:24:52.785 183079 DEBUG nova.virt.libvirt.driver [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:24:52 compute-0 nova_compute[183075]: 2026-01-22 17:24:52.785 183079 DEBUG nova.virt.libvirt.driver [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:24:52 compute-0 nova_compute[183075]: 2026-01-22 17:24:52.786 183079 DEBUG nova.virt.libvirt.driver [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:24:52 compute-0 nova_compute[183075]: 2026-01-22 17:24:52.787 183079 DEBUG nova.virt.libvirt.driver [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:24:52 compute-0 nova_compute[183075]: 2026-01-22 17:24:52.796 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:24:52 compute-0 nova_compute[183075]: 2026-01-22 17:24:52.835 183079 INFO nova.compute.manager [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Took 4.52 seconds to spawn the instance on the hypervisor.
Jan 22 17:24:52 compute-0 nova_compute[183075]: 2026-01-22 17:24:52.836 183079 DEBUG nova.compute.manager [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:24:52 compute-0 nova_compute[183075]: 2026-01-22 17:24:52.927 183079 INFO nova.compute.manager [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Took 5.00 seconds to build instance.
Jan 22 17:24:52 compute-0 nova_compute[183075]: 2026-01-22 17:24:52.947 183079 DEBUG oslo_concurrency.lockutils [None req-b9a1f0e1-8422-4da5-af65-395839a7750e 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "a6598da5-2e3d-4ca1-90ab-2a8db7241468" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.089s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:53 compute-0 nova_compute[183075]: 2026-01-22 17:24:53.228 183079 DEBUG nova.network.neutron [req-61c7b5fb-724f-4672-8d84-8f8a2698baf4 req-f4ae7069-f2e4-4209-abee-d01325a33316 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Updated VIF entry in instance network info cache for port ff4c20a1-cc0e-4a39-80b4-bb1426093c82. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:24:53 compute-0 nova_compute[183075]: 2026-01-22 17:24:53.229 183079 DEBUG nova.network.neutron [req-61c7b5fb-724f-4672-8d84-8f8a2698baf4 req-f4ae7069-f2e4-4209-abee-d01325a33316 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Updating instance_info_cache with network_info: [{"id": "ff4c20a1-cc0e-4a39-80b4-bb1426093c82", "address": "fa:16:3e:41:19:a3", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff4c20a1-cc", "ovs_interfaceid": "ff4c20a1-cc0e-4a39-80b4-bb1426093c82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:24:53 compute-0 nova_compute[183075]: 2026-01-22 17:24:53.244 183079 DEBUG oslo_concurrency.lockutils [req-61c7b5fb-724f-4672-8d84-8f8a2698baf4 req-f4ae7069-f2e4-4209-abee-d01325a33316 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-a6598da5-2e3d-4ca1-90ab-2a8db7241468" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:24:54 compute-0 nova_compute[183075]: 2026-01-22 17:24:54.754 183079 INFO nova.compute.manager [None req-c6492bc1-aa3b-4077-b81c-34dee81a29ab 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Get console output
Jan 22 17:24:54 compute-0 nova_compute[183075]: 2026-01-22 17:24:54.760 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:24:54 compute-0 nova_compute[183075]: 2026-01-22 17:24:54.805 183079 DEBUG nova.compute.manager [req-776ff73c-a030-49e6-93e3-f8aace5453eb req-306acf79-5a4c-488e-bcc2-057f3104dd69 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Received event network-vif-plugged-ff4c20a1-cc0e-4a39-80b4-bb1426093c82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:24:54 compute-0 nova_compute[183075]: 2026-01-22 17:24:54.806 183079 DEBUG oslo_concurrency.lockutils [req-776ff73c-a030-49e6-93e3-f8aace5453eb req-306acf79-5a4c-488e-bcc2-057f3104dd69 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "a6598da5-2e3d-4ca1-90ab-2a8db7241468-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:24:54 compute-0 nova_compute[183075]: 2026-01-22 17:24:54.807 183079 DEBUG oslo_concurrency.lockutils [req-776ff73c-a030-49e6-93e3-f8aace5453eb req-306acf79-5a4c-488e-bcc2-057f3104dd69 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "a6598da5-2e3d-4ca1-90ab-2a8db7241468-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:24:54 compute-0 nova_compute[183075]: 2026-01-22 17:24:54.807 183079 DEBUG oslo_concurrency.lockutils [req-776ff73c-a030-49e6-93e3-f8aace5453eb req-306acf79-5a4c-488e-bcc2-057f3104dd69 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "a6598da5-2e3d-4ca1-90ab-2a8db7241468-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:24:54 compute-0 nova_compute[183075]: 2026-01-22 17:24:54.807 183079 DEBUG nova.compute.manager [req-776ff73c-a030-49e6-93e3-f8aace5453eb req-306acf79-5a4c-488e-bcc2-057f3104dd69 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] No waiting events found dispatching network-vif-plugged-ff4c20a1-cc0e-4a39-80b4-bb1426093c82 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:24:54 compute-0 nova_compute[183075]: 2026-01-22 17:24:54.807 183079 WARNING nova.compute.manager [req-776ff73c-a030-49e6-93e3-f8aace5453eb req-306acf79-5a4c-488e-bcc2-057f3104dd69 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Received unexpected event network-vif-plugged-ff4c20a1-cc0e-4a39-80b4-bb1426093c82 for instance with vm_state active and task_state None.
Jan 22 17:24:55 compute-0 podman[229573]: 2026-01-22 17:24:55.375880917 +0000 UTC m=+0.076276769 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:24:55 compute-0 nova_compute[183075]: 2026-01-22 17:24:55.409 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:56 compute-0 nova_compute[183075]: 2026-01-22 17:24:56.500 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769102681.4990592, 7c4cc341-c93c-4077-a541-31a8487482f0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:24:56 compute-0 nova_compute[183075]: 2026-01-22 17:24:56.501 183079 INFO nova.compute.manager [-] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] VM Stopped (Lifecycle Event)
Jan 22 17:24:56 compute-0 nova_compute[183075]: 2026-01-22 17:24:56.518 183079 DEBUG nova.compute.manager [None req-dd270fd8-a59f-4782-b12f-4360dfca5e65 - - - - - -] [instance: 7c4cc341-c93c-4077-a541-31a8487482f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:24:56 compute-0 nova_compute[183075]: 2026-01-22 17:24:56.769 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:57 compute-0 nova_compute[183075]: 2026-01-22 17:24:57.689 183079 INFO nova.compute.manager [None req-ff94a3e7-49f9-4c60-aca6-01824c334692 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Get console output
Jan 22 17:24:59 compute-0 nova_compute[183075]: 2026-01-22 17:24:59.904 183079 INFO nova.compute.manager [None req-e50ff2fb-c4c1-474f-b519-31b7cb00abc0 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Get console output
Jan 22 17:25:00 compute-0 nova_compute[183075]: 2026-01-22 17:25:00.413 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:01 compute-0 nova_compute[183075]: 2026-01-22 17:25:01.771 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:02 compute-0 ovn_controller[95372]: 2026-01-22T17:25:02Z|00088|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:83:1a:da 10.100.0.11
Jan 22 17:25:02 compute-0 ovn_controller[95372]: 2026-01-22T17:25:02Z|00089|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:83:1a:da 10.100.0.11
Jan 22 17:25:02 compute-0 nova_compute[183075]: 2026-01-22 17:25:02.870 183079 INFO nova.compute.manager [None req-a0a6e7ce-71e4-4373-abe3-44fa7af71ef5 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Get console output
Jan 22 17:25:02 compute-0 nova_compute[183075]: 2026-01-22 17:25:02.876 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:25:03 compute-0 podman[229626]: 2026-01-22 17:25:03.344781498 +0000 UTC m=+0.056725027 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 17:25:03 compute-0 ovn_controller[95372]: 2026-01-22T17:25:03Z|00090|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:41:19:a3 10.100.0.12
Jan 22 17:25:03 compute-0 ovn_controller[95372]: 2026-01-22T17:25:03Z|00091|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:41:19:a3 10.100.0.12
Jan 22 17:25:05 compute-0 nova_compute[183075]: 2026-01-22 17:25:05.072 183079 INFO nova.compute.manager [None req-446c89ca-41dc-41cd-9ec1-371789a10ed6 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Get console output
Jan 22 17:25:05 compute-0 nova_compute[183075]: 2026-01-22 17:25:05.415 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:06 compute-0 nova_compute[183075]: 2026-01-22 17:25:06.778 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:08 compute-0 nova_compute[183075]: 2026-01-22 17:25:08.094 183079 INFO nova.compute.manager [None req-f7c2da61-7285-4e97-8db7-ac6dcb7ee897 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Get console output
Jan 22 17:25:08 compute-0 nova_compute[183075]: 2026-01-22 17:25:08.101 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:25:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:08.401 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:08.402 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:25:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:25:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:09 compute-0 podman[229659]: 2026-01-22 17:25:09.35469781 +0000 UTC m=+0.056249224 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.487 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.487 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 1.0852005
Jan 22 17:25:09 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[229458]: 10.100.0.11:56904 [22/Jan/2026:17:25:08.400] listener listener/metadata 0/0/0/1087/1087 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.503 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.504 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.504 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.508 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.522 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.523 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 168 time: 0.0148213
Jan 22 17:25:09 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[229458]: 10.100.0.11:56920 [22/Jan/2026:17:25:09.503] listener listener/metadata 0/0/0/19/19 200 152 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.530 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.531 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.546 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:09 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[229458]: 10.100.0.11:56934 [22/Jan/2026:17:25:09.530] listener listener/metadata 0/0/0/16/16 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.546 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0155435
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.596 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.597 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.624 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.625 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0277634
Jan 22 17:25:09 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[229458]: 10.100.0.11:56940 [22/Jan/2026:17:25:09.595] listener listener/metadata 0/0/0/29/29 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.631 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.632 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.646 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.647 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0146604
Jan 22 17:25:09 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[229458]: 10.100.0.11:56950 [22/Jan/2026:17:25:09.631] listener listener/metadata 0/0/0/15/15 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.651 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.651 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.664 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.664 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0131316
Jan 22 17:25:09 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[229458]: 10.100.0.11:56958 [22/Jan/2026:17:25:09.651] listener listener/metadata 0/0/0/13/13 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.669 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.669 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.683 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.683 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0142422
Jan 22 17:25:09 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[229458]: 10.100.0.11:56974 [22/Jan/2026:17:25:09.668] listener listener/metadata 0/0/0/15/15 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.694 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.694 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.709 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.709 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 151 time: 0.0147955
Jan 22 17:25:09 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[229458]: 10.100.0.11:56978 [22/Jan/2026:17:25:09.693] listener listener/metadata 0/0/0/16/16 200 135 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.718 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.718 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.736 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.736 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 165 time: 0.0179563
Jan 22 17:25:09 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[229458]: 10.100.0.11:56984 [22/Jan/2026:17:25:09.717] listener listener/metadata 0/0/0/18/18 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.746 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.747 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.764 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.764 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 165 time: 0.0173647
Jan 22 17:25:09 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[229458]: 10.100.0.11:56990 [22/Jan/2026:17:25:09.745] listener listener/metadata 0/0/0/18/18 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.773 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.774 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:09 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[229458]: 10.100.0.11:57000 [22/Jan/2026:17:25:09.772] listener listener/metadata 0/0/0/18/18 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.791 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0170240
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.810 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.811 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.830 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.831 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0200460
Jan 22 17:25:09 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[229458]: 10.100.0.11:57006 [22/Jan/2026:17:25:09.810] listener listener/metadata 0/0/0/21/21 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.838 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.838 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.856 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.856 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0176458
Jan 22 17:25:09 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[229458]: 10.100.0.11:57008 [22/Jan/2026:17:25:09.837] listener listener/metadata 0/0/0/18/18 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.863 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.864 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.885 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.885 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0217378
Jan 22 17:25:09 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[229458]: 10.100.0.11:57024 [22/Jan/2026:17:25:09.863] listener listener/metadata 0/0/0/22/22 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.895 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.896 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.919 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:09 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[229458]: 10.100.0.11:57026 [22/Jan/2026:17:25:09.894] listener listener/metadata 0/0/0/24/24 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.919 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 165 time: 0.0229940
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.930 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.931 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:09 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[229458]: 10.100.0.11:57036 [22/Jan/2026:17:25:09.930] listener listener/metadata 0/0/0/23/23 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.953 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:09.953 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0216999
Jan 22 17:25:10 compute-0 nova_compute[183075]: 2026-01-22 17:25:10.266 183079 INFO nova.compute.manager [None req-c99f2430-df4c-4b2c-86aa-d8fd96f9b68a 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Get console output
Jan 22 17:25:10 compute-0 nova_compute[183075]: 2026-01-22 17:25:10.274 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.368 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.368 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.8640838
Jan 22 17:25:10 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229564]: 10.100.0.12:44536 [22/Jan/2026:17:25:09.503] listener listener/metadata 0/0/0/865/865 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.385 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.386 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.402 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.402 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0160456
Jan 22 17:25:10 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229564]: 10.100.0.12:44538 [22/Jan/2026:17:25:10.385] listener listener/metadata 0/0/0/17/17 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.408 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.408 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:10 compute-0 nova_compute[183075]: 2026-01-22 17:25:10.416 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.430 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.430 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0218632
Jan 22 17:25:10 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229564]: 10.100.0.12:44542 [22/Jan/2026:17:25:10.407] listener listener/metadata 0/0/0/23/23 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.437 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.437 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.461 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.462 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0246685
Jan 22 17:25:10 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229564]: 10.100.0.12:44556 [22/Jan/2026:17:25:10.436] listener listener/metadata 0/0/0/25/25 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.468 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.469 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.516 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:10 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229564]: 10.100.0.12:44568 [22/Jan/2026:17:25:10.468] listener listener/metadata 0/0/0/47/47 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.516 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0468166
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.524 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.524 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.647 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.647 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.1224723
Jan 22 17:25:10 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229564]: 10.100.0.12:44584 [22/Jan/2026:17:25:10.524] listener listener/metadata 0/0/0/123/123 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.652 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.652 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.665 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.666 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0137742
Jan 22 17:25:10 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229564]: 10.100.0.12:44592 [22/Jan/2026:17:25:10.651] listener listener/metadata 0/0/0/14/14 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.670 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.671 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.681 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:10 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229564]: 10.100.0.12:44598 [22/Jan/2026:17:25:10.670] listener listener/metadata 0/0/0/11/11 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.681 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0106378
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.685 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.686 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.696 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.697 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 162 time: 0.0110898
Jan 22 17:25:10 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229564]: 10.100.0.12:44614 [22/Jan/2026:17:25:10.685] listener listener/metadata 0/0/0/11/11 200 146 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.701 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.701 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.716 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.716 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 162 time: 0.0148771
Jan 22 17:25:10 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229564]: 10.100.0.12:44624 [22/Jan/2026:17:25:10.700] listener listener/metadata 0/0/0/15/15 200 146 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.726 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.726 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:10 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229564]: 10.100.0.12:44634 [22/Jan/2026:17:25:10.725] listener listener/metadata 0/0/0/16/16 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.742 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0155263
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.753 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.753 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.768 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:10 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229564]: 10.100.0.12:44636 [22/Jan/2026:17:25:10.752] listener listener/metadata 0/0/0/16/16 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.769 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0154550
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.774 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.774 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.787 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.787 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0126514
Jan 22 17:25:10 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229564]: 10.100.0.12:44648 [22/Jan/2026:17:25:10.774] listener listener/metadata 0/0/0/13/13 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.792 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.792 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.806 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:10 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229564]: 10.100.0.12:44658 [22/Jan/2026:17:25:10.792] listener listener/metadata 0/0/0/14/14 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.806 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0137522
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.812 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.812 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.826 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.827 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 162 time: 0.0144064
Jan 22 17:25:10 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229564]: 10.100.0.12:44662 [22/Jan/2026:17:25:10.811] listener listener/metadata 0/0/0/15/15 200 146 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.833 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.833 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.847 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:10.847 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0144124
Jan 22 17:25:10 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229564]: 10.100.0.12:44672 [22/Jan/2026:17:25:10.832] listener listener/metadata 0/0/0/15/15 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:25:11 compute-0 nova_compute[183075]: 2026-01-22 17:25:11.778 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:11 compute-0 nova_compute[183075]: 2026-01-22 17:25:11.801 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:25:13 compute-0 nova_compute[183075]: 2026-01-22 17:25:13.237 183079 INFO nova.compute.manager [None req-3ae4c5c5-a3c4-4c6d-8586-1d23c88f862f 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Get console output
Jan 22 17:25:13 compute-0 nova_compute[183075]: 2026-01-22 17:25:13.242 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:25:15 compute-0 nova_compute[183075]: 2026-01-22 17:25:15.381 183079 INFO nova.compute.manager [None req-191a9cad-939a-4b3f-b498-a8a550050abb 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Get console output
Jan 22 17:25:15 compute-0 nova_compute[183075]: 2026-01-22 17:25:15.385 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:25:15 compute-0 nova_compute[183075]: 2026-01-22 17:25:15.419 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:16 compute-0 nova_compute[183075]: 2026-01-22 17:25:16.782 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:18 compute-0 nova_compute[183075]: 2026-01-22 17:25:18.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:25:18 compute-0 nova_compute[183075]: 2026-01-22 17:25:18.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:25:19 compute-0 podman[229685]: 2026-01-22 17:25:19.359545937 +0000 UTC m=+0.061751460 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, version=9.6, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 22 17:25:19 compute-0 podman[229684]: 2026-01-22 17:25:19.379541202 +0000 UTC m=+0.084016686 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 17:25:19 compute-0 podman[229683]: 2026-01-22 17:25:19.38884641 +0000 UTC m=+0.093561700 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:25:20 compute-0 nova_compute[183075]: 2026-01-22 17:25:20.421 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:20 compute-0 nova_compute[183075]: 2026-01-22 17:25:20.532 183079 INFO nova.compute.manager [None req-415fb581-e87e-45d7-bef5-6a2955a00216 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Get console output
Jan 22 17:25:20 compute-0 nova_compute[183075]: 2026-01-22 17:25:20.537 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:25:20 compute-0 nova_compute[183075]: 2026-01-22 17:25:20.770 183079 DEBUG oslo_concurrency.lockutils [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "3e702ebd-3e54-4c9a-937e-f38331893ad1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:25:20 compute-0 nova_compute[183075]: 2026-01-22 17:25:20.771 183079 DEBUG oslo_concurrency.lockutils [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "3e702ebd-3e54-4c9a-937e-f38331893ad1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:25:20 compute-0 nova_compute[183075]: 2026-01-22 17:25:20.787 183079 DEBUG nova.compute.manager [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:25:20 compute-0 nova_compute[183075]: 2026-01-22 17:25:20.891 183079 DEBUG oslo_concurrency.lockutils [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:25:20 compute-0 nova_compute[183075]: 2026-01-22 17:25:20.892 183079 DEBUG oslo_concurrency.lockutils [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:25:20 compute-0 nova_compute[183075]: 2026-01-22 17:25:20.900 183079 DEBUG nova.virt.hardware [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:25:20 compute-0 nova_compute[183075]: 2026-01-22 17:25:20.900 183079 INFO nova.compute.claims [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.061 183079 DEBUG nova.compute.provider_tree [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.081 183079 DEBUG nova.scheduler.client.report [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.123 183079 DEBUG oslo_concurrency.lockutils [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.231s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.124 183079 DEBUG nova.compute.manager [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.178 183079 DEBUG nova.compute.manager [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.178 183079 DEBUG nova.network.neutron [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.200 183079 INFO nova.virt.libvirt.driver [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.215 183079 DEBUG nova.compute.manager [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.314 183079 DEBUG nova.compute.manager [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.315 183079 DEBUG nova.virt.libvirt.driver [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.316 183079 INFO nova.virt.libvirt.driver [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Creating image(s)
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.316 183079 DEBUG oslo_concurrency.lockutils [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "/var/lib/nova/instances/3e702ebd-3e54-4c9a-937e-f38331893ad1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.317 183079 DEBUG oslo_concurrency.lockutils [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "/var/lib/nova/instances/3e702ebd-3e54-4c9a-937e-f38331893ad1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.317 183079 DEBUG oslo_concurrency.lockutils [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "/var/lib/nova/instances/3e702ebd-3e54-4c9a-937e-f38331893ad1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.328 183079 DEBUG oslo_concurrency.processutils [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.384 183079 DEBUG oslo_concurrency.processutils [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.385 183079 DEBUG oslo_concurrency.lockutils [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.386 183079 DEBUG oslo_concurrency.lockutils [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.397 183079 DEBUG oslo_concurrency.processutils [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.451 183079 DEBUG oslo_concurrency.processutils [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.452 183079 DEBUG oslo_concurrency.processutils [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/3e702ebd-3e54-4c9a-937e-f38331893ad1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.483 183079 DEBUG oslo_concurrency.processutils [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/3e702ebd-3e54-4c9a-937e-f38331893ad1/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.484 183079 DEBUG oslo_concurrency.lockutils [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.485 183079 DEBUG oslo_concurrency.processutils [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.506 183079 DEBUG nova.policy [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1148a46489e842e6a0c7660c54567798', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '02818155e7af4645bc909d4ba671f11f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.548 183079 DEBUG oslo_concurrency.processutils [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.549 183079 DEBUG nova.virt.disk.api [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Checking if we can resize image /var/lib/nova/instances/3e702ebd-3e54-4c9a-937e-f38331893ad1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.550 183079 DEBUG oslo_concurrency.processutils [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3e702ebd-3e54-4c9a-937e-f38331893ad1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.604 183079 DEBUG oslo_concurrency.processutils [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3e702ebd-3e54-4c9a-937e-f38331893ad1/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.606 183079 DEBUG nova.virt.disk.api [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Cannot resize image /var/lib/nova/instances/3e702ebd-3e54-4c9a-937e-f38331893ad1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.606 183079 DEBUG nova.objects.instance [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lazy-loading 'migration_context' on Instance uuid 3e702ebd-3e54-4c9a-937e-f38331893ad1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.622 183079 DEBUG nova.virt.libvirt.driver [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.623 183079 DEBUG nova.virt.libvirt.driver [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Ensure instance console log exists: /var/lib/nova/instances/3e702ebd-3e54-4c9a-937e-f38331893ad1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.624 183079 DEBUG oslo_concurrency.lockutils [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.624 183079 DEBUG oslo_concurrency.lockutils [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.625 183079 DEBUG oslo_concurrency.lockutils [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:25:21 compute-0 nova_compute[183075]: 2026-01-22 17:25:21.785 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:22 compute-0 nova_compute[183075]: 2026-01-22 17:25:22.134 183079 DEBUG nova.network.neutron [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Successfully updated port: 6288bbd7-e25f-4e69-9a9c-37513d7c0d28 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:25:22 compute-0 nova_compute[183075]: 2026-01-22 17:25:22.149 183079 DEBUG oslo_concurrency.lockutils [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "refresh_cache-3e702ebd-3e54-4c9a-937e-f38331893ad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:25:22 compute-0 nova_compute[183075]: 2026-01-22 17:25:22.149 183079 DEBUG oslo_concurrency.lockutils [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquired lock "refresh_cache-3e702ebd-3e54-4c9a-937e-f38331893ad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:25:22 compute-0 nova_compute[183075]: 2026-01-22 17:25:22.150 183079 DEBUG nova.network.neutron [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:25:22 compute-0 nova_compute[183075]: 2026-01-22 17:25:22.239 183079 DEBUG nova.compute.manager [req-73f0e3d8-8377-4d23-aade-716d96b077e9 req-57cfc891-c463-4197-9dee-0620b51f5dda a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Received event network-changed-6288bbd7-e25f-4e69-9a9c-37513d7c0d28 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:25:22 compute-0 nova_compute[183075]: 2026-01-22 17:25:22.239 183079 DEBUG nova.compute.manager [req-73f0e3d8-8377-4d23-aade-716d96b077e9 req-57cfc891-c463-4197-9dee-0620b51f5dda a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Refreshing instance network info cache due to event network-changed-6288bbd7-e25f-4e69-9a9c-37513d7c0d28. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:25:22 compute-0 nova_compute[183075]: 2026-01-22 17:25:22.240 183079 DEBUG oslo_concurrency.lockutils [req-73f0e3d8-8377-4d23-aade-716d96b077e9 req-57cfc891-c463-4197-9dee-0620b51f5dda a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-3e702ebd-3e54-4c9a-937e-f38331893ad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:25:22 compute-0 nova_compute[183075]: 2026-01-22 17:25:22.385 183079 DEBUG nova.network.neutron [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:25:22 compute-0 nova_compute[183075]: 2026-01-22 17:25:22.784 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.033 183079 DEBUG nova.network.neutron [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Updating instance_info_cache with network_info: [{"id": "6288bbd7-e25f-4e69-9a9c-37513d7c0d28", "address": "fa:16:3e:8f:00:92", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6288bbd7-e2", "ovs_interfaceid": "6288bbd7-e25f-4e69-9a9c-37513d7c0d28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.050 183079 DEBUG oslo_concurrency.lockutils [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Releasing lock "refresh_cache-3e702ebd-3e54-4c9a-937e-f38331893ad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.050 183079 DEBUG nova.compute.manager [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Instance network_info: |[{"id": "6288bbd7-e25f-4e69-9a9c-37513d7c0d28", "address": "fa:16:3e:8f:00:92", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6288bbd7-e2", "ovs_interfaceid": "6288bbd7-e25f-4e69-9a9c-37513d7c0d28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.051 183079 DEBUG oslo_concurrency.lockutils [req-73f0e3d8-8377-4d23-aade-716d96b077e9 req-57cfc891-c463-4197-9dee-0620b51f5dda a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-3e702ebd-3e54-4c9a-937e-f38331893ad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.052 183079 DEBUG nova.network.neutron [req-73f0e3d8-8377-4d23-aade-716d96b077e9 req-57cfc891-c463-4197-9dee-0620b51f5dda a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Refreshing network info cache for port 6288bbd7-e25f-4e69-9a9c-37513d7c0d28 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.056 183079 DEBUG nova.virt.libvirt.driver [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Start _get_guest_xml network_info=[{"id": "6288bbd7-e25f-4e69-9a9c-37513d7c0d28", "address": "fa:16:3e:8f:00:92", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6288bbd7-e2", "ovs_interfaceid": "6288bbd7-e25f-4e69-9a9c-37513d7c0d28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.063 183079 WARNING nova.virt.libvirt.driver [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.077 183079 DEBUG nova.virt.libvirt.host [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.078 183079 DEBUG nova.virt.libvirt.host [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.083 183079 DEBUG nova.virt.libvirt.host [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.084 183079 DEBUG nova.virt.libvirt.host [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.084 183079 DEBUG nova.virt.libvirt.driver [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.085 183079 DEBUG nova.virt.hardware [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.086 183079 DEBUG nova.virt.hardware [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.086 183079 DEBUG nova.virt.hardware [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.087 183079 DEBUG nova.virt.hardware [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.087 183079 DEBUG nova.virt.hardware [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.088 183079 DEBUG nova.virt.hardware [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.088 183079 DEBUG nova.virt.hardware [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.089 183079 DEBUG nova.virt.hardware [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.089 183079 DEBUG nova.virt.hardware [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.090 183079 DEBUG nova.virt.hardware [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.090 183079 DEBUG nova.virt.hardware [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.097 183079 DEBUG nova.virt.libvirt.vif [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:25:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-944898772',display_name='tempest-server-test-944898772',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-944898772',id=45,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBZ1addGiefngOIZky2bZbl/LpQHm8ydIPPcYG47RZ8D1pquJ/FsAcVn7mNe0QyfiQPXvuSs1eSw/jGOy+x3K8tzxt5N8fIelXKcTizhDHr81ws2uYn+dW6F9Hiy6pJp3A==',key_name='tempest-keypair-test-453437320',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='02818155e7af4645bc909d4ba671f11f',ramdisk_id='',reservation_id='r-ymqul523',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSameNetwork-953620552',owner_user_name='tempest-FloatingIpSameNetwork-953620552-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:25:21Z,user_data=None,user_id='1148a46489e842e6a0c7660c54567798',uuid=3e702ebd-3e54-4c9a-937e-f38331893ad1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6288bbd7-e25f-4e69-9a9c-37513d7c0d28", "address": "fa:16:3e:8f:00:92", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6288bbd7-e2", "ovs_interfaceid": "6288bbd7-e25f-4e69-9a9c-37513d7c0d28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.097 183079 DEBUG nova.network.os_vif_util [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converting VIF {"id": "6288bbd7-e25f-4e69-9a9c-37513d7c0d28", "address": "fa:16:3e:8f:00:92", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6288bbd7-e2", "ovs_interfaceid": "6288bbd7-e25f-4e69-9a9c-37513d7c0d28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.099 183079 DEBUG nova.network.os_vif_util [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:00:92,bridge_name='br-int',has_traffic_filtering=True,id=6288bbd7-e25f-4e69-9a9c-37513d7c0d28,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6288bbd7-e2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.100 183079 DEBUG nova.objects.instance [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lazy-loading 'pci_devices' on Instance uuid 3e702ebd-3e54-4c9a-937e-f38331893ad1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.117 183079 DEBUG nova.virt.libvirt.driver [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:25:23 compute-0 nova_compute[183075]:   <uuid>3e702ebd-3e54-4c9a-937e-f38331893ad1</uuid>
Jan 22 17:25:23 compute-0 nova_compute[183075]:   <name>instance-0000002d</name>
Jan 22 17:25:23 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:25:23 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:25:23 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:25:23 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-944898772</nova:name>
Jan 22 17:25:23 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:25:23</nova:creationTime>
Jan 22 17:25:23 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:25:23 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:25:23 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:25:23 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:25:23 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:25:23 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:25:23 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:25:23 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:25:23 compute-0 nova_compute[183075]:         <nova:user uuid="1148a46489e842e6a0c7660c54567798">tempest-FloatingIpSameNetwork-953620552-project-member</nova:user>
Jan 22 17:25:23 compute-0 nova_compute[183075]:         <nova:project uuid="02818155e7af4645bc909d4ba671f11f">tempest-FloatingIpSameNetwork-953620552</nova:project>
Jan 22 17:25:23 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:25:23 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:25:23 compute-0 nova_compute[183075]:         <nova:port uuid="6288bbd7-e25f-4e69-9a9c-37513d7c0d28">
Jan 22 17:25:23 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:25:23 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:25:23 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:25:23 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <system>
Jan 22 17:25:23 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:25:23 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:25:23 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:25:23 compute-0 nova_compute[183075]:       <entry name="serial">3e702ebd-3e54-4c9a-937e-f38331893ad1</entry>
Jan 22 17:25:23 compute-0 nova_compute[183075]:       <entry name="uuid">3e702ebd-3e54-4c9a-937e-f38331893ad1</entry>
Jan 22 17:25:23 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     </system>
Jan 22 17:25:23 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:25:23 compute-0 nova_compute[183075]:   <os>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:   </os>
Jan 22 17:25:23 compute-0 nova_compute[183075]:   <features>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:   </features>
Jan 22 17:25:23 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:25:23 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:25:23 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:25:23 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/3e702ebd-3e54-4c9a-937e-f38331893ad1/disk"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:25:23 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:8f:00:92"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:       <target dev="tap6288bbd7-e2"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:25:23 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/3e702ebd-3e54-4c9a-937e-f38331893ad1/console.log" append="off"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <video>
Jan 22 17:25:23 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     </video>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:25:23 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:25:23 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:25:23 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:25:23 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:25:23 compute-0 nova_compute[183075]: </domain>
Jan 22 17:25:23 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.118 183079 DEBUG nova.compute.manager [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Preparing to wait for external event network-vif-plugged-6288bbd7-e25f-4e69-9a9c-37513d7c0d28 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.253 183079 DEBUG oslo_concurrency.lockutils [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "3e702ebd-3e54-4c9a-937e-f38331893ad1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.253 183079 DEBUG oslo_concurrency.lockutils [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "3e702ebd-3e54-4c9a-937e-f38331893ad1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.254 183079 DEBUG oslo_concurrency.lockutils [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "3e702ebd-3e54-4c9a-937e-f38331893ad1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.254 183079 DEBUG nova.virt.libvirt.vif [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:25:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-944898772',display_name='tempest-server-test-944898772',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-944898772',id=45,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBZ1addGiefngOIZky2bZbl/LpQHm8ydIPPcYG47RZ8D1pquJ/FsAcVn7mNe0QyfiQPXvuSs1eSw/jGOy+x3K8tzxt5N8fIelXKcTizhDHr81ws2uYn+dW6F9Hiy6pJp3A==',key_name='tempest-keypair-test-453437320',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='02818155e7af4645bc909d4ba671f11f',ramdisk_id='',reservation_id='r-ymqul523',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIpSameNetwork-953620552',owner_user_name='tempest-FloatingIpSameNetwork-953620552-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:25:21Z,user_data=None,user_id='1148a46489e842e6a0c7660c54567798',uuid=3e702ebd-3e54-4c9a-937e-f38331893ad1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6288bbd7-e25f-4e69-9a9c-37513d7c0d28", "address": "fa:16:3e:8f:00:92", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6288bbd7-e2", "ovs_interfaceid": "6288bbd7-e25f-4e69-9a9c-37513d7c0d28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.254 183079 DEBUG nova.network.os_vif_util [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converting VIF {"id": "6288bbd7-e25f-4e69-9a9c-37513d7c0d28", "address": "fa:16:3e:8f:00:92", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6288bbd7-e2", "ovs_interfaceid": "6288bbd7-e25f-4e69-9a9c-37513d7c0d28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.255 183079 DEBUG nova.network.os_vif_util [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:00:92,bridge_name='br-int',has_traffic_filtering=True,id=6288bbd7-e25f-4e69-9a9c-37513d7c0d28,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6288bbd7-e2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.255 183079 DEBUG os_vif [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:00:92,bridge_name='br-int',has_traffic_filtering=True,id=6288bbd7-e25f-4e69-9a9c-37513d7c0d28,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6288bbd7-e2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.258 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.258 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.258 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.260 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.260 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6288bbd7-e2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.261 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6288bbd7-e2, col_values=(('external_ids', {'iface-id': '6288bbd7-e25f-4e69-9a9c-37513d7c0d28', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8f:00:92', 'vm-uuid': '3e702ebd-3e54-4c9a-937e-f38331893ad1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.262 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:23 compute-0 NetworkManager[55454]: <info>  [1769102723.2635] manager: (tap6288bbd7-e2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/217)
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.264 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.271 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.272 183079 INFO os_vif [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:00:92,bridge_name='br-int',has_traffic_filtering=True,id=6288bbd7-e25f-4e69-9a9c-37513d7c0d28,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6288bbd7-e2')
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.317 183079 DEBUG nova.virt.libvirt.driver [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.318 183079 DEBUG nova.virt.libvirt.driver [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] No VIF found with MAC fa:16:3e:8f:00:92, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:25:23 compute-0 kernel: tap6288bbd7-e2: entered promiscuous mode
Jan 22 17:25:23 compute-0 NetworkManager[55454]: <info>  [1769102723.3678] manager: (tap6288bbd7-e2): new Tun device (/org/freedesktop/NetworkManager/Devices/218)
Jan 22 17:25:23 compute-0 ovn_controller[95372]: 2026-01-22T17:25:23Z|00501|binding|INFO|Claiming lport 6288bbd7-e25f-4e69-9a9c-37513d7c0d28 for this chassis.
Jan 22 17:25:23 compute-0 ovn_controller[95372]: 2026-01-22T17:25:23Z|00502|binding|INFO|6288bbd7-e25f-4e69-9a9c-37513d7c0d28: Claiming fa:16:3e:8f:00:92 10.100.0.12
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.478 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.479 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:23 compute-0 systemd-udevd[229772]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:25:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:23.483 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8f:00:92 10.100.0.12'], port_security=['fa:16:3e:8f:00:92 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '3e702ebd-3e54-4c9a-937e-f38331893ad1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eee918a6-66b2-47ae-b702-620a23ef395b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02818155e7af4645bc909d4ba671f11f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c960fc2f-fbcd-48ff-b6bd-d3b900f07332', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bad11b1d-0e62-4467-b3f8-4f80f0fd75da, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=11, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=6288bbd7-e25f-4e69-9a9c-37513d7c0d28) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:25:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:23.484 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 6288bbd7-e25f-4e69-9a9c-37513d7c0d28 in datapath eee918a6-66b2-47ae-b702-620a23ef395b bound to our chassis
Jan 22 17:25:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:23.485 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network eee918a6-66b2-47ae-b702-620a23ef395b
Jan 22 17:25:23 compute-0 ovn_controller[95372]: 2026-01-22T17:25:23Z|00503|binding|INFO|Setting lport 6288bbd7-e25f-4e69-9a9c-37513d7c0d28 ovn-installed in OVS
Jan 22 17:25:23 compute-0 ovn_controller[95372]: 2026-01-22T17:25:23Z|00504|binding|INFO|Setting lport 6288bbd7-e25f-4e69-9a9c-37513d7c0d28 up in Southbound
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.493 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:23 compute-0 NetworkManager[55454]: <info>  [1769102723.5031] device (tap6288bbd7-e2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:25:23 compute-0 NetworkManager[55454]: <info>  [1769102723.5036] device (tap6288bbd7-e2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:25:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:23.502 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[5a386750-096a-4481-9172-cf83c3b4b257]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:25:23 compute-0 systemd-machined[154382]: New machine qemu-45-instance-0000002d.
Jan 22 17:25:23 compute-0 systemd[1]: Started Virtual Machine qemu-45-instance-0000002d.
Jan 22 17:25:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:23.541 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[c3a7d033-35e2-47dc-b49d-f49f6544b743]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:25:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:23.545 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[900440e5-b538-4a96-ab67-c544954e5af4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:25:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:23.576 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[98f67873-9749-4516-b626-f7eb1923a65d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:25:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:23.593 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c6ab3e20-7626-4d82-9b09-5d47a3d20210]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeee918a6-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:e2:7e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 55, 'rx_bytes': 8920, 'tx_bytes': 6230, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 55, 'rx_bytes': 8920, 'tx_bytes': 6230, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 141], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 505224, 'reachable_time': 37699, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229789, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:25:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:23.610 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[cea222c2-ab0e-464c-8156-e3b1daa059e4]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapeee918a6-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 505238, 'tstamp': 505238}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229790, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapeee918a6-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 505241, 'tstamp': 505241}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229790, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:25:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:23.612 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeee918a6-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.613 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.614 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:23.614 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeee918a6-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:25:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:23.615 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:25:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:23.615 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapeee918a6-60, col_values=(('external_ids', {'iface-id': '15d4de90-41f4-4532-aebd-197c2a33c6d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:25:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:23.615 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.738 183079 DEBUG nova.compute.manager [req-d707d1ee-07a9-4724-907a-1396a47b3c94 req-d32c7e02-9341-4c9a-a73f-33ac28207ed2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Received event network-vif-plugged-6288bbd7-e25f-4e69-9a9c-37513d7c0d28 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.738 183079 DEBUG oslo_concurrency.lockutils [req-d707d1ee-07a9-4724-907a-1396a47b3c94 req-d32c7e02-9341-4c9a-a73f-33ac28207ed2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "3e702ebd-3e54-4c9a-937e-f38331893ad1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.739 183079 DEBUG oslo_concurrency.lockutils [req-d707d1ee-07a9-4724-907a-1396a47b3c94 req-d32c7e02-9341-4c9a-a73f-33ac28207ed2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "3e702ebd-3e54-4c9a-937e-f38331893ad1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.739 183079 DEBUG oslo_concurrency.lockutils [req-d707d1ee-07a9-4724-907a-1396a47b3c94 req-d32c7e02-9341-4c9a-a73f-33ac28207ed2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "3e702ebd-3e54-4c9a-937e-f38331893ad1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.739 183079 DEBUG nova.compute.manager [req-d707d1ee-07a9-4724-907a-1396a47b3c94 req-d32c7e02-9341-4c9a-a73f-33ac28207ed2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Processing event network-vif-plugged-6288bbd7-e25f-4e69-9a9c-37513d7c0d28 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.800 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102723.7997775, 3e702ebd-3e54-4c9a-937e-f38331893ad1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.800 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] VM Started (Lifecycle Event)
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.802 183079 DEBUG nova.compute.manager [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.804 183079 DEBUG nova.virt.libvirt.driver [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.807 183079 INFO nova.virt.libvirt.driver [-] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Instance spawned successfully.
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.808 183079 DEBUG nova.virt.libvirt.driver [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.826 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.832 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.834 183079 DEBUG nova.virt.libvirt.driver [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.834 183079 DEBUG nova.virt.libvirt.driver [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.835 183079 DEBUG nova.virt.libvirt.driver [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.835 183079 DEBUG nova.virt.libvirt.driver [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.836 183079 DEBUG nova.virt.libvirt.driver [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.836 183079 DEBUG nova.virt.libvirt.driver [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.861 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.861 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102723.79987, 3e702ebd-3e54-4c9a-937e-f38331893ad1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.862 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] VM Paused (Lifecycle Event)
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.885 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.887 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102723.8040993, 3e702ebd-3e54-4c9a-937e-f38331893ad1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.888 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] VM Resumed (Lifecycle Event)
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.897 183079 INFO nova.compute.manager [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Took 2.58 seconds to spawn the instance on the hypervisor.
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.897 183079 DEBUG nova.compute.manager [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.904 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.907 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.926 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.945 183079 INFO nova.compute.manager [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Took 3.11 seconds to build instance.
Jan 22 17:25:23 compute-0 nova_compute[183075]: 2026-01-22 17:25:23.961 183079 DEBUG oslo_concurrency.lockutils [None req-1af92d2c-d5dd-4d87-9730-2025a56b0548 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "3e702ebd-3e54-4c9a-937e-f38331893ad1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:25:24 compute-0 nova_compute[183075]: 2026-01-22 17:25:24.266 183079 DEBUG nova.network.neutron [req-73f0e3d8-8377-4d23-aade-716d96b077e9 req-57cfc891-c463-4197-9dee-0620b51f5dda a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Updated VIF entry in instance network info cache for port 6288bbd7-e25f-4e69-9a9c-37513d7c0d28. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:25:24 compute-0 nova_compute[183075]: 2026-01-22 17:25:24.266 183079 DEBUG nova.network.neutron [req-73f0e3d8-8377-4d23-aade-716d96b077e9 req-57cfc891-c463-4197-9dee-0620b51f5dda a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Updating instance_info_cache with network_info: [{"id": "6288bbd7-e25f-4e69-9a9c-37513d7c0d28", "address": "fa:16:3e:8f:00:92", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6288bbd7-e2", "ovs_interfaceid": "6288bbd7-e25f-4e69-9a9c-37513d7c0d28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:25:24 compute-0 nova_compute[183075]: 2026-01-22 17:25:24.281 183079 DEBUG oslo_concurrency.lockutils [req-73f0e3d8-8377-4d23-aade-716d96b077e9 req-57cfc891-c463-4197-9dee-0620b51f5dda a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-3e702ebd-3e54-4c9a-937e-f38331893ad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:25:24 compute-0 nova_compute[183075]: 2026-01-22 17:25:24.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:25:24 compute-0 nova_compute[183075]: 2026-01-22 17:25:24.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:25:25 compute-0 nova_compute[183075]: 2026-01-22 17:25:25.617 183079 INFO nova.compute.manager [None req-22788681-de54-476b-97e1-1452e6e4bdff 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Get console output
Jan 22 17:25:25 compute-0 nova_compute[183075]: 2026-01-22 17:25:25.622 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:25:25 compute-0 nova_compute[183075]: 2026-01-22 17:25:25.643 183079 INFO nova.compute.manager [None req-4c1de801-e509-4ee0-854a-b417cb4e8a80 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Get console output
Jan 22 17:25:25 compute-0 nova_compute[183075]: 2026-01-22 17:25:25.647 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:25:25 compute-0 nova_compute[183075]: 2026-01-22 17:25:25.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:25:25 compute-0 nova_compute[183075]: 2026-01-22 17:25:25.816 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:25:25 compute-0 nova_compute[183075]: 2026-01-22 17:25:25.817 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:25:25 compute-0 nova_compute[183075]: 2026-01-22 17:25:25.817 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:25:25 compute-0 nova_compute[183075]: 2026-01-22 17:25:25.817 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:25:25 compute-0 nova_compute[183075]: 2026-01-22 17:25:25.828 183079 DEBUG nova.compute.manager [req-71c81ef5-9944-4943-bf1b-1bbf006b4807 req-f83b9c57-0556-43a3-aa8f-39493659f527 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Received event network-vif-plugged-6288bbd7-e25f-4e69-9a9c-37513d7c0d28 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:25:25 compute-0 nova_compute[183075]: 2026-01-22 17:25:25.829 183079 DEBUG oslo_concurrency.lockutils [req-71c81ef5-9944-4943-bf1b-1bbf006b4807 req-f83b9c57-0556-43a3-aa8f-39493659f527 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "3e702ebd-3e54-4c9a-937e-f38331893ad1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:25:25 compute-0 nova_compute[183075]: 2026-01-22 17:25:25.829 183079 DEBUG oslo_concurrency.lockutils [req-71c81ef5-9944-4943-bf1b-1bbf006b4807 req-f83b9c57-0556-43a3-aa8f-39493659f527 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "3e702ebd-3e54-4c9a-937e-f38331893ad1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:25:25 compute-0 nova_compute[183075]: 2026-01-22 17:25:25.830 183079 DEBUG oslo_concurrency.lockutils [req-71c81ef5-9944-4943-bf1b-1bbf006b4807 req-f83b9c57-0556-43a3-aa8f-39493659f527 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "3e702ebd-3e54-4c9a-937e-f38331893ad1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:25:25 compute-0 nova_compute[183075]: 2026-01-22 17:25:25.830 183079 DEBUG nova.compute.manager [req-71c81ef5-9944-4943-bf1b-1bbf006b4807 req-f83b9c57-0556-43a3-aa8f-39493659f527 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] No waiting events found dispatching network-vif-plugged-6288bbd7-e25f-4e69-9a9c-37513d7c0d28 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:25:25 compute-0 nova_compute[183075]: 2026-01-22 17:25:25.830 183079 WARNING nova.compute.manager [req-71c81ef5-9944-4943-bf1b-1bbf006b4807 req-f83b9c57-0556-43a3-aa8f-39493659f527 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Received unexpected event network-vif-plugged-6288bbd7-e25f-4e69-9a9c-37513d7c0d28 for instance with vm_state active and task_state None.
Jan 22 17:25:25 compute-0 nova_compute[183075]: 2026-01-22 17:25:25.898 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6598da5-2e3d-4ca1-90ab-2a8db7241468/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:25:25 compute-0 podman[229799]: 2026-01-22 17:25:25.934428214 +0000 UTC m=+0.067989737 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Jan 22 17:25:25 compute-0 nova_compute[183075]: 2026-01-22 17:25:25.957 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6598da5-2e3d-4ca1-90ab-2a8db7241468/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:25:25 compute-0 nova_compute[183075]: 2026-01-22 17:25:25.958 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6598da5-2e3d-4ca1-90ab-2a8db7241468/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:25:26 compute-0 nova_compute[183075]: 2026-01-22 17:25:26.015 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6598da5-2e3d-4ca1-90ab-2a8db7241468/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:25:26 compute-0 nova_compute[183075]: 2026-01-22 17:25:26.021 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3e702ebd-3e54-4c9a-937e-f38331893ad1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:25:26 compute-0 nova_compute[183075]: 2026-01-22 17:25:26.073 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3e702ebd-3e54-4c9a-937e-f38331893ad1/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:25:26 compute-0 nova_compute[183075]: 2026-01-22 17:25:26.074 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3e702ebd-3e54-4c9a-937e-f38331893ad1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:25:26 compute-0 nova_compute[183075]: 2026-01-22 17:25:26.126 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3e702ebd-3e54-4c9a-937e-f38331893ad1/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:25:26 compute-0 nova_compute[183075]: 2026-01-22 17:25:26.131 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:25:26 compute-0 nova_compute[183075]: 2026-01-22 17:25:26.191 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:25:26 compute-0 nova_compute[183075]: 2026-01-22 17:25:26.192 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:25:26 compute-0 nova_compute[183075]: 2026-01-22 17:25:26.246 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:25:26 compute-0 nova_compute[183075]: 2026-01-22 17:25:26.399 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:25:26 compute-0 nova_compute[183075]: 2026-01-22 17:25:26.400 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5259MB free_disk=73.3090705871582GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:25:26 compute-0 nova_compute[183075]: 2026-01-22 17:25:26.401 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:25:26 compute-0 nova_compute[183075]: 2026-01-22 17:25:26.401 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:25:26 compute-0 nova_compute[183075]: 2026-01-22 17:25:26.587 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:25:26 compute-0 nova_compute[183075]: 2026-01-22 17:25:26.588 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance a6598da5-2e3d-4ca1-90ab-2a8db7241468 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:25:26 compute-0 nova_compute[183075]: 2026-01-22 17:25:26.588 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 3e702ebd-3e54-4c9a-937e-f38331893ad1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:25:26 compute-0 nova_compute[183075]: 2026-01-22 17:25:26.588 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:25:26 compute-0 nova_compute[183075]: 2026-01-22 17:25:26.589 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:25:26 compute-0 nova_compute[183075]: 2026-01-22 17:25:26.676 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:25:26 compute-0 nova_compute[183075]: 2026-01-22 17:25:26.787 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:26 compute-0 nova_compute[183075]: 2026-01-22 17:25:26.825 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:25:27 compute-0 nova_compute[183075]: 2026-01-22 17:25:27.097 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:25:27 compute-0 nova_compute[183075]: 2026-01-22 17:25:27.097 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.696s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:25:28 compute-0 nova_compute[183075]: 2026-01-22 17:25:28.264 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:29 compute-0 nova_compute[183075]: 2026-01-22 17:25:29.098 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:25:29 compute-0 nova_compute[183075]: 2026-01-22 17:25:29.099 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:25:29 compute-0 nova_compute[183075]: 2026-01-22 17:25:29.135 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 17:25:29 compute-0 nova_compute[183075]: 2026-01-22 17:25:29.136 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:25:30 compute-0 nova_compute[183075]: 2026-01-22 17:25:30.922 183079 INFO nova.compute.manager [None req-2b4919a4-f943-4e94-953f-54e8c7c09e7b 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Get console output
Jan 22 17:25:30 compute-0 nova_compute[183075]: 2026-01-22 17:25:30.928 183079 INFO nova.compute.manager [None req-6745dc93-e026-4943-bd72-217a07480309 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Get console output
Jan 22 17:25:30 compute-0 nova_compute[183075]: 2026-01-22 17:25:30.930 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:25:30 compute-0 nova_compute[183075]: 2026-01-22 17:25:30.933 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:25:31 compute-0 nova_compute[183075]: 2026-01-22 17:25:31.789 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:33 compute-0 nova_compute[183075]: 2026-01-22 17:25:33.315 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:34 compute-0 podman[229837]: 2026-01-22 17:25:34.407691609 +0000 UTC m=+0.089115921 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:25:35 compute-0 ovn_controller[95372]: 2026-01-22T17:25:35Z|00092|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8f:00:92 10.100.0.12
Jan 22 17:25:35 compute-0 ovn_controller[95372]: 2026-01-22T17:25:35Z|00093|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8f:00:92 10.100.0.12
Jan 22 17:25:36 compute-0 nova_compute[183075]: 2026-01-22 17:25:36.209 183079 INFO nova.compute.manager [None req-e2656aa7-4a9c-47be-b39c-5453c0657b21 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Get console output
Jan 22 17:25:36 compute-0 nova_compute[183075]: 2026-01-22 17:25:36.210 183079 INFO nova.compute.manager [None req-17961859-136f-40fe-88d2-14ebe7feb649 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Get console output
Jan 22 17:25:36 compute-0 nova_compute[183075]: 2026-01-22 17:25:36.216 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:25:36 compute-0 nova_compute[183075]: 2026-01-22 17:25:36.218 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:25:36 compute-0 nova_compute[183075]: 2026-01-22 17:25:36.791 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:38 compute-0 nova_compute[183075]: 2026-01-22 17:25:38.322 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:40 compute-0 podman[229881]: 2026-01-22 17:25:40.361385413 +0000 UTC m=+0.065542702 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:25:40 compute-0 nova_compute[183075]: 2026-01-22 17:25:40.822 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:25:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:40.835 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:40.837 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:25:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:25:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:41 compute-0 nova_compute[183075]: 2026-01-22 17:25:41.358 183079 INFO nova.compute.manager [None req-c01ee9ca-4288-41e6-9070-ad2bd8061398 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Get console output
Jan 22 17:25:41 compute-0 nova_compute[183075]: 2026-01-22 17:25:41.364 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:25:41 compute-0 nova_compute[183075]: 2026-01-22 17:25:41.376 183079 INFO nova.compute.manager [None req-aceec313-9a98-42db-82db-561352f82f62 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Get console output
Jan 22 17:25:41 compute-0 nova_compute[183075]: 2026-01-22 17:25:41.380 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:25:41 compute-0 nova_compute[183075]: 2026-01-22 17:25:41.795 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:41.942 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:25:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:41.943 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:25:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:41.944 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.391 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.391 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 1.5541515
Jan 22 17:25:42 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[229458]: 10.100.0.12:48116 [22/Jan/2026:17:25:40.835] listener listener/metadata 0/0/0/1556/1556 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.405 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.406 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.424 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:42 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[229458]: 10.100.0.12:48130 [22/Jan/2026:17:25:42.405] listener listener/metadata 0/0/0/18/18 200 152 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.424 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 168 time: 0.0181782
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.428 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.429 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.446 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.446 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0178413
Jan 22 17:25:42 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[229458]: 10.100.0.12:48140 [22/Jan/2026:17:25:42.428] listener listener/metadata 0/0/0/18/18 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.451 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.451 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.475 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.475 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0237699
Jan 22 17:25:42 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[229458]: 10.100.0.12:48156 [22/Jan/2026:17:25:42.450] listener listener/metadata 0/0/0/24/24 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.483 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.484 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.498 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:42 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[229458]: 10.100.0.12:48158 [22/Jan/2026:17:25:42.483] listener listener/metadata 0/0/0/15/15 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.499 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0146503
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.505 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.505 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.520 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.521 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0154045
Jan 22 17:25:42 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[229458]: 10.100.0.12:48162 [22/Jan/2026:17:25:42.504] listener listener/metadata 0/0/0/16/16 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.527 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.527 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.541 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.542 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0145314
Jan 22 17:25:42 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[229458]: 10.100.0.12:48176 [22/Jan/2026:17:25:42.526] listener listener/metadata 0/0/0/15/15 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.548 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.549 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.563 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.564 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0147092
Jan 22 17:25:42 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[229458]: 10.100.0.12:48184 [22/Jan/2026:17:25:42.548] listener listener/metadata 0/0/0/15/15 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.568 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.569 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.584 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.584 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 165 time: 0.0157208
Jan 22 17:25:42 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[229458]: 10.100.0.12:48188 [22/Jan/2026:17:25:42.568] listener listener/metadata 0/0/0/16/16 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.589 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.590 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.609 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:42 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[229458]: 10.100.0.12:48200 [22/Jan/2026:17:25:42.589] listener listener/metadata 0/0/0/20/20 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.609 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 165 time: 0.0197585
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.615 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.616 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:42 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[229458]: 10.100.0.12:48212 [22/Jan/2026:17:25:42.615] listener listener/metadata 0/0/0/15/15 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.630 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0145657
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.645 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.646 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.661 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.662 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0160062
Jan 22 17:25:42 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[229458]: 10.100.0.12:48216 [22/Jan/2026:17:25:42.644] listener listener/metadata 0/0/0/17/17 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.665 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.666 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.680 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:42 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[229458]: 10.100.0.12:48226 [22/Jan/2026:17:25:42.665] listener listener/metadata 0/0/0/15/15 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.680 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0146294
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.685 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.686 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.698 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.699 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0130479
Jan 22 17:25:42 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[229458]: 10.100.0.12:48228 [22/Jan/2026:17:25:42.685] listener listener/metadata 0/0/0/13/13 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.707 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.708 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.725 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.726 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 165 time: 0.0179489
Jan 22 17:25:42 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[229458]: 10.100.0.12:48244 [22/Jan/2026:17:25:42.707] listener listener/metadata 0/0/0/18/18 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.730 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.731 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eee918a6-66b2-47ae-b702-620a23ef395b __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.747 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:25:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:42.748 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0170205
Jan 22 17:25:42 compute-0 haproxy-metadata-proxy-eee918a6-66b2-47ae-b702-620a23ef395b[229458]: 10.100.0.12:48260 [22/Jan/2026:17:25:42.730] listener listener/metadata 0/0/0/17/17 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:25:43 compute-0 nova_compute[183075]: 2026-01-22 17:25:43.367 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:46 compute-0 nova_compute[183075]: 2026-01-22 17:25:46.492 183079 INFO nova.compute.manager [None req-669ff131-3df1-4e9a-aff3-bc408a77da32 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Get console output
Jan 22 17:25:46 compute-0 nova_compute[183075]: 2026-01-22 17:25:46.493 183079 INFO nova.compute.manager [None req-910d6ab7-734a-44ca-9360-f13905cc3efc 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Get console output
Jan 22 17:25:46 compute-0 nova_compute[183075]: 2026-01-22 17:25:46.498 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:25:46 compute-0 nova_compute[183075]: 2026-01-22 17:25:46.498 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:25:46 compute-0 nova_compute[183075]: 2026-01-22 17:25:46.797 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:47 compute-0 nova_compute[183075]: 2026-01-22 17:25:47.770 183079 INFO nova.compute.manager [None req-c8f6d716-84c6-4657-8dda-0bef40c7c60d 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Get console output
Jan 22 17:25:47 compute-0 nova_compute[183075]: 2026-01-22 17:25:47.775 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:25:47 compute-0 nova_compute[183075]: 2026-01-22 17:25:47.876 183079 DEBUG oslo_concurrency.lockutils [None req-a488d269-c59f-4987-8bc8-1d917c5c5a7e 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "3e702ebd-3e54-4c9a-937e-f38331893ad1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:25:47 compute-0 nova_compute[183075]: 2026-01-22 17:25:47.877 183079 DEBUG oslo_concurrency.lockutils [None req-a488d269-c59f-4987-8bc8-1d917c5c5a7e 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "3e702ebd-3e54-4c9a-937e-f38331893ad1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:25:47 compute-0 nova_compute[183075]: 2026-01-22 17:25:47.877 183079 DEBUG oslo_concurrency.lockutils [None req-a488d269-c59f-4987-8bc8-1d917c5c5a7e 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "3e702ebd-3e54-4c9a-937e-f38331893ad1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:25:47 compute-0 nova_compute[183075]: 2026-01-22 17:25:47.877 183079 DEBUG oslo_concurrency.lockutils [None req-a488d269-c59f-4987-8bc8-1d917c5c5a7e 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "3e702ebd-3e54-4c9a-937e-f38331893ad1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:25:47 compute-0 nova_compute[183075]: 2026-01-22 17:25:47.878 183079 DEBUG oslo_concurrency.lockutils [None req-a488d269-c59f-4987-8bc8-1d917c5c5a7e 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "3e702ebd-3e54-4c9a-937e-f38331893ad1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:25:47 compute-0 nova_compute[183075]: 2026-01-22 17:25:47.879 183079 INFO nova.compute.manager [None req-a488d269-c59f-4987-8bc8-1d917c5c5a7e 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Terminating instance
Jan 22 17:25:47 compute-0 nova_compute[183075]: 2026-01-22 17:25:47.880 183079 DEBUG nova.compute.manager [None req-a488d269-c59f-4987-8bc8-1d917c5c5a7e 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:25:47 compute-0 kernel: tap6288bbd7-e2 (unregistering): left promiscuous mode
Jan 22 17:25:47 compute-0 NetworkManager[55454]: <info>  [1769102747.8976] device (tap6288bbd7-e2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:25:47 compute-0 ovn_controller[95372]: 2026-01-22T17:25:47Z|00505|binding|INFO|Releasing lport 6288bbd7-e25f-4e69-9a9c-37513d7c0d28 from this chassis (sb_readonly=0)
Jan 22 17:25:47 compute-0 ovn_controller[95372]: 2026-01-22T17:25:47Z|00506|binding|INFO|Setting lport 6288bbd7-e25f-4e69-9a9c-37513d7c0d28 down in Southbound
Jan 22 17:25:47 compute-0 ovn_controller[95372]: 2026-01-22T17:25:47Z|00507|binding|INFO|Removing iface tap6288bbd7-e2 ovn-installed in OVS
Jan 22 17:25:47 compute-0 nova_compute[183075]: 2026-01-22 17:25:47.954 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:47 compute-0 nova_compute[183075]: 2026-01-22 17:25:47.955 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:47 compute-0 nova_compute[183075]: 2026-01-22 17:25:47.966 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:47.975 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8f:00:92 10.100.0.12'], port_security=['fa:16:3e:8f:00:92 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '3e702ebd-3e54-4c9a-937e-f38331893ad1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eee918a6-66b2-47ae-b702-620a23ef395b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02818155e7af4645bc909d4ba671f11f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c960fc2f-fbcd-48ff-b6bd-d3b900f07332', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bad11b1d-0e62-4467-b3f8-4f80f0fd75da, chassis=[], tunnel_key=11, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=6288bbd7-e25f-4e69-9a9c-37513d7c0d28) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:25:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:47.976 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 6288bbd7-e25f-4e69-9a9c-37513d7c0d28 in datapath eee918a6-66b2-47ae-b702-620a23ef395b unbound from our chassis
Jan 22 17:25:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:47.977 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network eee918a6-66b2-47ae-b702-620a23ef395b
Jan 22 17:25:47 compute-0 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d0000002d.scope: Deactivated successfully.
Jan 22 17:25:47 compute-0 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d0000002d.scope: Consumed 12.346s CPU time.
Jan 22 17:25:47 compute-0 systemd-machined[154382]: Machine qemu-45-instance-0000002d terminated.
Jan 22 17:25:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:47.993 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e1955b89-0a20-46db-b82d-4ad833194d95]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:25:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:48.020 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[c32be4d2-26bc-4349-a7e6-2468904efbcd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:25:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:48.023 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[2770e825-32ee-439b-bd1d-1498d3d5a40a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:25:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:48.049 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[3b8dedc7-ea54-446b-8743-eb294d32d4a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:25:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:48.066 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[554ed5a6-b53c-440c-942b-dc2a08cfe9f0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeee918a6-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:e2:7e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 202, 'tx_packets': 106, 'rx_bytes': 17308, 'tx_bytes': 12088, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 202, 'tx_packets': 106, 'rx_bytes': 17308, 'tx_bytes': 12088, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 141], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 505224, 'reachable_time': 37699, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229927, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:25:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:48.086 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f8ed428e-23e1-4225-857c-8900c63626ec]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapeee918a6-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 505238, 'tstamp': 505238}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229928, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapeee918a6-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 505241, 'tstamp': 505241}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229928, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:25:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:48.088 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeee918a6-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:25:48 compute-0 nova_compute[183075]: 2026-01-22 17:25:48.089 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:48 compute-0 nova_compute[183075]: 2026-01-22 17:25:48.094 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:48.095 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeee918a6-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:25:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:48.096 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:25:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:48.096 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapeee918a6-60, col_values=(('external_ids', {'iface-id': '15d4de90-41f4-4532-aebd-197c2a33c6d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:25:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:48.097 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:25:48 compute-0 nova_compute[183075]: 2026-01-22 17:25:48.142 183079 INFO nova.virt.libvirt.driver [-] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Instance destroyed successfully.
Jan 22 17:25:48 compute-0 nova_compute[183075]: 2026-01-22 17:25:48.142 183079 DEBUG nova.objects.instance [None req-a488d269-c59f-4987-8bc8-1d917c5c5a7e 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lazy-loading 'resources' on Instance uuid 3e702ebd-3e54-4c9a-937e-f38331893ad1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:25:48 compute-0 nova_compute[183075]: 2026-01-22 17:25:48.167 183079 DEBUG nova.virt.libvirt.vif [None req-a488d269-c59f-4987-8bc8-1d917c5c5a7e 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:25:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-944898772',display_name='tempest-server-test-944898772',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-944898772',id=45,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBZ1addGiefngOIZky2bZbl/LpQHm8ydIPPcYG47RZ8D1pquJ/FsAcVn7mNe0QyfiQPXvuSs1eSw/jGOy+x3K8tzxt5N8fIelXKcTizhDHr81ws2uYn+dW6F9Hiy6pJp3A==',key_name='tempest-keypair-test-453437320',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:25:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='02818155e7af4645bc909d4ba671f11f',ramdisk_id='',reservation_id='r-ymqul523',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_in
put_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIpSameNetwork-953620552',owner_user_name='tempest-FloatingIpSameNetwork-953620552-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:25:23Z,user_data=None,user_id='1148a46489e842e6a0c7660c54567798',uuid=3e702ebd-3e54-4c9a-937e-f38331893ad1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6288bbd7-e25f-4e69-9a9c-37513d7c0d28", "address": "fa:16:3e:8f:00:92", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6288bbd7-e2", "ovs_interfaceid": "6288bbd7-e25f-4e69-9a9c-37513d7c0d28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:25:48 compute-0 nova_compute[183075]: 2026-01-22 17:25:48.168 183079 DEBUG nova.network.os_vif_util [None req-a488d269-c59f-4987-8bc8-1d917c5c5a7e 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converting VIF {"id": "6288bbd7-e25f-4e69-9a9c-37513d7c0d28", "address": "fa:16:3e:8f:00:92", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6288bbd7-e2", "ovs_interfaceid": "6288bbd7-e25f-4e69-9a9c-37513d7c0d28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:25:48 compute-0 nova_compute[183075]: 2026-01-22 17:25:48.168 183079 DEBUG nova.network.os_vif_util [None req-a488d269-c59f-4987-8bc8-1d917c5c5a7e 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:00:92,bridge_name='br-int',has_traffic_filtering=True,id=6288bbd7-e25f-4e69-9a9c-37513d7c0d28,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6288bbd7-e2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:25:48 compute-0 nova_compute[183075]: 2026-01-22 17:25:48.169 183079 DEBUG os_vif [None req-a488d269-c59f-4987-8bc8-1d917c5c5a7e 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:00:92,bridge_name='br-int',has_traffic_filtering=True,id=6288bbd7-e25f-4e69-9a9c-37513d7c0d28,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6288bbd7-e2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:25:48 compute-0 nova_compute[183075]: 2026-01-22 17:25:48.170 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:48 compute-0 nova_compute[183075]: 2026-01-22 17:25:48.170 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6288bbd7-e2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:25:48 compute-0 nova_compute[183075]: 2026-01-22 17:25:48.172 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:48 compute-0 nova_compute[183075]: 2026-01-22 17:25:48.174 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:25:48 compute-0 nova_compute[183075]: 2026-01-22 17:25:48.176 183079 INFO os_vif [None req-a488d269-c59f-4987-8bc8-1d917c5c5a7e 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:00:92,bridge_name='br-int',has_traffic_filtering=True,id=6288bbd7-e25f-4e69-9a9c-37513d7c0d28,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6288bbd7-e2')
Jan 22 17:25:48 compute-0 nova_compute[183075]: 2026-01-22 17:25:48.177 183079 INFO nova.virt.libvirt.driver [None req-a488d269-c59f-4987-8bc8-1d917c5c5a7e 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Deleting instance files /var/lib/nova/instances/3e702ebd-3e54-4c9a-937e-f38331893ad1_del
Jan 22 17:25:48 compute-0 nova_compute[183075]: 2026-01-22 17:25:48.177 183079 INFO nova.virt.libvirt.driver [None req-a488d269-c59f-4987-8bc8-1d917c5c5a7e 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Deletion of /var/lib/nova/instances/3e702ebd-3e54-4c9a-937e-f38331893ad1_del complete
Jan 22 17:25:48 compute-0 nova_compute[183075]: 2026-01-22 17:25:48.231 183079 DEBUG nova.compute.manager [req-15ebd68f-f868-42dd-b745-db5a9b8b761e req-aacaad8e-14de-40f2-8806-7d2f8e020619 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Received event network-vif-unplugged-6288bbd7-e25f-4e69-9a9c-37513d7c0d28 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:25:48 compute-0 nova_compute[183075]: 2026-01-22 17:25:48.231 183079 DEBUG oslo_concurrency.lockutils [req-15ebd68f-f868-42dd-b745-db5a9b8b761e req-aacaad8e-14de-40f2-8806-7d2f8e020619 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "3e702ebd-3e54-4c9a-937e-f38331893ad1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:25:48 compute-0 nova_compute[183075]: 2026-01-22 17:25:48.231 183079 DEBUG oslo_concurrency.lockutils [req-15ebd68f-f868-42dd-b745-db5a9b8b761e req-aacaad8e-14de-40f2-8806-7d2f8e020619 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "3e702ebd-3e54-4c9a-937e-f38331893ad1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:25:48 compute-0 nova_compute[183075]: 2026-01-22 17:25:48.231 183079 DEBUG oslo_concurrency.lockutils [req-15ebd68f-f868-42dd-b745-db5a9b8b761e req-aacaad8e-14de-40f2-8806-7d2f8e020619 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "3e702ebd-3e54-4c9a-937e-f38331893ad1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:25:48 compute-0 nova_compute[183075]: 2026-01-22 17:25:48.232 183079 DEBUG nova.compute.manager [req-15ebd68f-f868-42dd-b745-db5a9b8b761e req-aacaad8e-14de-40f2-8806-7d2f8e020619 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] No waiting events found dispatching network-vif-unplugged-6288bbd7-e25f-4e69-9a9c-37513d7c0d28 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:25:48 compute-0 nova_compute[183075]: 2026-01-22 17:25:48.232 183079 DEBUG nova.compute.manager [req-15ebd68f-f868-42dd-b745-db5a9b8b761e req-aacaad8e-14de-40f2-8806-7d2f8e020619 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Received event network-vif-unplugged-6288bbd7-e25f-4e69-9a9c-37513d7c0d28 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:25:48 compute-0 nova_compute[183075]: 2026-01-22 17:25:48.256 183079 INFO nova.compute.manager [None req-a488d269-c59f-4987-8bc8-1d917c5c5a7e 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Took 0.38 seconds to destroy the instance on the hypervisor.
Jan 22 17:25:48 compute-0 nova_compute[183075]: 2026-01-22 17:25:48.257 183079 DEBUG oslo.service.loopingcall [None req-a488d269-c59f-4987-8bc8-1d917c5c5a7e 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:25:48 compute-0 nova_compute[183075]: 2026-01-22 17:25:48.257 183079 DEBUG nova.compute.manager [-] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:25:48 compute-0 nova_compute[183075]: 2026-01-22 17:25:48.257 183079 DEBUG nova.network.neutron [-] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:25:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:48.403 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:25:48 compute-0 nova_compute[183075]: 2026-01-22 17:25:48.403 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:48.405 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:25:49 compute-0 nova_compute[183075]: 2026-01-22 17:25:49.859 183079 DEBUG nova.network.neutron [-] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:25:49 compute-0 nova_compute[183075]: 2026-01-22 17:25:49.906 183079 INFO nova.compute.manager [-] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Took 1.65 seconds to deallocate network for instance.
Jan 22 17:25:50 compute-0 nova_compute[183075]: 2026-01-22 17:25:50.028 183079 DEBUG oslo_concurrency.lockutils [None req-a488d269-c59f-4987-8bc8-1d917c5c5a7e 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:25:50 compute-0 nova_compute[183075]: 2026-01-22 17:25:50.029 183079 DEBUG oslo_concurrency.lockutils [None req-a488d269-c59f-4987-8bc8-1d917c5c5a7e 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:25:50 compute-0 nova_compute[183075]: 2026-01-22 17:25:50.163 183079 DEBUG nova.compute.provider_tree [None req-a488d269-c59f-4987-8bc8-1d917c5c5a7e 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:25:50 compute-0 nova_compute[183075]: 2026-01-22 17:25:50.190 183079 DEBUG nova.scheduler.client.report [None req-a488d269-c59f-4987-8bc8-1d917c5c5a7e 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:25:50 compute-0 nova_compute[183075]: 2026-01-22 17:25:50.236 183079 DEBUG oslo_concurrency.lockutils [None req-a488d269-c59f-4987-8bc8-1d917c5c5a7e 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:25:50 compute-0 nova_compute[183075]: 2026-01-22 17:25:50.292 183079 INFO nova.scheduler.client.report [None req-a488d269-c59f-4987-8bc8-1d917c5c5a7e 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Deleted allocations for instance 3e702ebd-3e54-4c9a-937e-f38331893ad1
Jan 22 17:25:50 compute-0 podman[229946]: 2026-01-22 17:25:50.369329141 +0000 UTC m=+0.064397172 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:25:50 compute-0 podman[229947]: 2026-01-22 17:25:50.37378316 +0000 UTC m=+0.069056726 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., managed_by=edpm_ansible, container_name=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64)
Jan 22 17:25:50 compute-0 nova_compute[183075]: 2026-01-22 17:25:50.379 183079 DEBUG nova.compute.manager [req-ad8958e2-ef29-423c-9e25-a7ef522c696c req-001a3afc-3b2d-4f00-bbf7-78902da31387 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Received event network-vif-plugged-6288bbd7-e25f-4e69-9a9c-37513d7c0d28 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:25:50 compute-0 nova_compute[183075]: 2026-01-22 17:25:50.380 183079 DEBUG oslo_concurrency.lockutils [req-ad8958e2-ef29-423c-9e25-a7ef522c696c req-001a3afc-3b2d-4f00-bbf7-78902da31387 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "3e702ebd-3e54-4c9a-937e-f38331893ad1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:25:50 compute-0 nova_compute[183075]: 2026-01-22 17:25:50.380 183079 DEBUG oslo_concurrency.lockutils [req-ad8958e2-ef29-423c-9e25-a7ef522c696c req-001a3afc-3b2d-4f00-bbf7-78902da31387 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "3e702ebd-3e54-4c9a-937e-f38331893ad1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:25:50 compute-0 nova_compute[183075]: 2026-01-22 17:25:50.380 183079 DEBUG oslo_concurrency.lockutils [req-ad8958e2-ef29-423c-9e25-a7ef522c696c req-001a3afc-3b2d-4f00-bbf7-78902da31387 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "3e702ebd-3e54-4c9a-937e-f38331893ad1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:25:50 compute-0 nova_compute[183075]: 2026-01-22 17:25:50.380 183079 DEBUG nova.compute.manager [req-ad8958e2-ef29-423c-9e25-a7ef522c696c req-001a3afc-3b2d-4f00-bbf7-78902da31387 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] No waiting events found dispatching network-vif-plugged-6288bbd7-e25f-4e69-9a9c-37513d7c0d28 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:25:50 compute-0 nova_compute[183075]: 2026-01-22 17:25:50.380 183079 WARNING nova.compute.manager [req-ad8958e2-ef29-423c-9e25-a7ef522c696c req-001a3afc-3b2d-4f00-bbf7-78902da31387 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Received unexpected event network-vif-plugged-6288bbd7-e25f-4e69-9a9c-37513d7c0d28 for instance with vm_state deleted and task_state None.
Jan 22 17:25:50 compute-0 nova_compute[183075]: 2026-01-22 17:25:50.393 183079 DEBUG oslo_concurrency.lockutils [None req-a488d269-c59f-4987-8bc8-1d917c5c5a7e 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "3e702ebd-3e54-4c9a-937e-f38331893ad1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.516s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:25:50 compute-0 podman[229945]: 2026-01-22 17:25:50.401801408 +0000 UTC m=+0.095992745 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 17:25:51 compute-0 nova_compute[183075]: 2026-01-22 17:25:51.207 183079 DEBUG oslo_concurrency.lockutils [None req-2c2b89b1-2b9d-42f8-a4e8-e2a6d12475fc 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:25:51 compute-0 nova_compute[183075]: 2026-01-22 17:25:51.207 183079 DEBUG oslo_concurrency.lockutils [None req-2c2b89b1-2b9d-42f8-a4e8-e2a6d12475fc 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:25:51 compute-0 nova_compute[183075]: 2026-01-22 17:25:51.207 183079 DEBUG oslo_concurrency.lockutils [None req-2c2b89b1-2b9d-42f8-a4e8-e2a6d12475fc 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:25:51 compute-0 nova_compute[183075]: 2026-01-22 17:25:51.208 183079 DEBUG oslo_concurrency.lockutils [None req-2c2b89b1-2b9d-42f8-a4e8-e2a6d12475fc 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:25:51 compute-0 nova_compute[183075]: 2026-01-22 17:25:51.208 183079 DEBUG oslo_concurrency.lockutils [None req-2c2b89b1-2b9d-42f8-a4e8-e2a6d12475fc 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:25:51 compute-0 nova_compute[183075]: 2026-01-22 17:25:51.209 183079 INFO nova.compute.manager [None req-2c2b89b1-2b9d-42f8-a4e8-e2a6d12475fc 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Terminating instance
Jan 22 17:25:51 compute-0 nova_compute[183075]: 2026-01-22 17:25:51.210 183079 DEBUG nova.compute.manager [None req-2c2b89b1-2b9d-42f8-a4e8-e2a6d12475fc 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:25:51 compute-0 kernel: tap284d7527-ff (unregistering): left promiscuous mode
Jan 22 17:25:51 compute-0 NetworkManager[55454]: <info>  [1769102751.2371] device (tap284d7527-ff): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:25:51 compute-0 nova_compute[183075]: 2026-01-22 17:25:51.282 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:51 compute-0 ovn_controller[95372]: 2026-01-22T17:25:51Z|00508|binding|INFO|Releasing lport 284d7527-ffe2-4ee7-bb76-65a68cce769e from this chassis (sb_readonly=0)
Jan 22 17:25:51 compute-0 ovn_controller[95372]: 2026-01-22T17:25:51Z|00509|binding|INFO|Setting lport 284d7527-ffe2-4ee7-bb76-65a68cce769e down in Southbound
Jan 22 17:25:51 compute-0 ovn_controller[95372]: 2026-01-22T17:25:51Z|00510|binding|INFO|Removing iface tap284d7527-ff ovn-installed in OVS
Jan 22 17:25:51 compute-0 nova_compute[183075]: 2026-01-22 17:25:51.286 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:51.293 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:1a:da 10.100.0.11'], port_security=['fa:16:3e:83:1a:da 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eee918a6-66b2-47ae-b702-620a23ef395b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02818155e7af4645bc909d4ba671f11f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c960fc2f-fbcd-48ff-b6bd-d3b900f07332', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.233', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bad11b1d-0e62-4467-b3f8-4f80f0fd75da, chassis=[], tunnel_key=10, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=284d7527-ffe2-4ee7-bb76-65a68cce769e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:25:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:51.294 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 284d7527-ffe2-4ee7-bb76-65a68cce769e in datapath eee918a6-66b2-47ae-b702-620a23ef395b unbound from our chassis
Jan 22 17:25:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:51.295 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network eee918a6-66b2-47ae-b702-620a23ef395b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:25:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:51.296 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0d04a8bb-7266-4e11-bb57-2cdedb91422c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:25:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:51.296 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b namespace which is not needed anymore
Jan 22 17:25:51 compute-0 nova_compute[183075]: 2026-01-22 17:25:51.297 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:51 compute-0 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d0000002b.scope: Deactivated successfully.
Jan 22 17:25:51 compute-0 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d0000002b.scope: Consumed 15.088s CPU time.
Jan 22 17:25:51 compute-0 systemd-machined[154382]: Machine qemu-43-instance-0000002b terminated.
Jan 22 17:25:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:51.408 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:25:51 compute-0 neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b[229452]: [NOTICE]   (229456) : haproxy version is 2.8.14-c23fe91
Jan 22 17:25:51 compute-0 neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b[229452]: [NOTICE]   (229456) : path to executable is /usr/sbin/haproxy
Jan 22 17:25:51 compute-0 neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b[229452]: [WARNING]  (229456) : Exiting Master process...
Jan 22 17:25:51 compute-0 neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b[229452]: [ALERT]    (229456) : Current worker (229458) exited with code 143 (Terminated)
Jan 22 17:25:51 compute-0 neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b[229452]: [WARNING]  (229456) : All workers exited. Exiting... (0)
Jan 22 17:25:51 compute-0 systemd[1]: libpod-63ef95d51246a7972c49e5667b1a1bdd2d735676fb52121165191bff8b343e32.scope: Deactivated successfully.
Jan 22 17:25:51 compute-0 podman[230032]: 2026-01-22 17:25:51.444810884 +0000 UTC m=+0.058611277 container died 63ef95d51246a7972c49e5667b1a1bdd2d735676fb52121165191bff8b343e32 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 22 17:25:51 compute-0 nova_compute[183075]: 2026-01-22 17:25:51.459 183079 INFO nova.virt.libvirt.driver [-] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Instance destroyed successfully.
Jan 22 17:25:51 compute-0 nova_compute[183075]: 2026-01-22 17:25:51.459 183079 DEBUG nova.objects.instance [None req-2c2b89b1-2b9d-42f8-a4e8-e2a6d12475fc 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lazy-loading 'resources' on Instance uuid bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:25:51 compute-0 nova_compute[183075]: 2026-01-22 17:25:51.488 183079 DEBUG nova.virt.libvirt.vif [None req-2c2b89b1-2b9d-42f8-a4e8-e2a6d12475fc 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:24:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-992571725',display_name='tempest-server-test-992571725',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-992571725',id=43,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBZ1addGiefngOIZky2bZbl/LpQHm8ydIPPcYG47RZ8D1pquJ/FsAcVn7mNe0QyfiQPXvuSs1eSw/jGOy+x3K8tzxt5N8fIelXKcTizhDHr81ws2uYn+dW6F9Hiy6pJp3A==',key_name='tempest-keypair-test-453437320',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:24:49Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='02818155e7af4645bc909d4ba671f11f',ramdisk_id='',reservation_id='r-45nx0jeh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIpSameNetwork-953620552',owner_user_name='tempest-FloatingIpSameNetwork-953620552-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:24:49Z,user_data=None,user_id='1148a46489e842e6a0c7660c54567798',uuid=bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "284d7527-ffe2-4ee7-bb76-65a68cce769e", "address": "fa:16:3e:83:1a:da", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap284d7527-ff", "ovs_interfaceid": "284d7527-ffe2-4ee7-bb76-65a68cce769e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:25:51 compute-0 nova_compute[183075]: 2026-01-22 17:25:51.488 183079 DEBUG nova.network.os_vif_util [None req-2c2b89b1-2b9d-42f8-a4e8-e2a6d12475fc 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converting VIF {"id": "284d7527-ffe2-4ee7-bb76-65a68cce769e", "address": "fa:16:3e:83:1a:da", "network": {"id": "eee918a6-66b2-47ae-b702-620a23ef395b", "bridge": "br-int", "label": "tempest-test-network--1264727851", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02818155e7af4645bc909d4ba671f11f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap284d7527-ff", "ovs_interfaceid": "284d7527-ffe2-4ee7-bb76-65a68cce769e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:25:51 compute-0 nova_compute[183075]: 2026-01-22 17:25:51.489 183079 DEBUG nova.network.os_vif_util [None req-2c2b89b1-2b9d-42f8-a4e8-e2a6d12475fc 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:1a:da,bridge_name='br-int',has_traffic_filtering=True,id=284d7527-ffe2-4ee7-bb76-65a68cce769e,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap284d7527-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:25:51 compute-0 nova_compute[183075]: 2026-01-22 17:25:51.489 183079 DEBUG os_vif [None req-2c2b89b1-2b9d-42f8-a4e8-e2a6d12475fc 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:1a:da,bridge_name='br-int',has_traffic_filtering=True,id=284d7527-ffe2-4ee7-bb76-65a68cce769e,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap284d7527-ff') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:25:51 compute-0 nova_compute[183075]: 2026-01-22 17:25:51.491 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:51 compute-0 nova_compute[183075]: 2026-01-22 17:25:51.491 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap284d7527-ff, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:25:51 compute-0 nova_compute[183075]: 2026-01-22 17:25:51.492 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:51 compute-0 nova_compute[183075]: 2026-01-22 17:25:51.495 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:25:51 compute-0 nova_compute[183075]: 2026-01-22 17:25:51.496 183079 INFO os_vif [None req-2c2b89b1-2b9d-42f8-a4e8-e2a6d12475fc 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:1a:da,bridge_name='br-int',has_traffic_filtering=True,id=284d7527-ffe2-4ee7-bb76-65a68cce769e,network=Network(eee918a6-66b2-47ae-b702-620a23ef395b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap284d7527-ff')
Jan 22 17:25:51 compute-0 nova_compute[183075]: 2026-01-22 17:25:51.497 183079 INFO nova.virt.libvirt.driver [None req-2c2b89b1-2b9d-42f8-a4e8-e2a6d12475fc 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Deleting instance files /var/lib/nova/instances/bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1_del
Jan 22 17:25:51 compute-0 nova_compute[183075]: 2026-01-22 17:25:51.497 183079 INFO nova.virt.libvirt.driver [None req-2c2b89b1-2b9d-42f8-a4e8-e2a6d12475fc 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Deletion of /var/lib/nova/instances/bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1_del complete
Jan 22 17:25:51 compute-0 nova_compute[183075]: 2026-01-22 17:25:51.554 183079 INFO nova.compute.manager [None req-2c2b89b1-2b9d-42f8-a4e8-e2a6d12475fc 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Took 0.34 seconds to destroy the instance on the hypervisor.
Jan 22 17:25:51 compute-0 nova_compute[183075]: 2026-01-22 17:25:51.555 183079 DEBUG oslo.service.loopingcall [None req-2c2b89b1-2b9d-42f8-a4e8-e2a6d12475fc 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:25:51 compute-0 nova_compute[183075]: 2026-01-22 17:25:51.555 183079 DEBUG nova.compute.manager [-] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:25:51 compute-0 nova_compute[183075]: 2026-01-22 17:25:51.555 183079 DEBUG nova.network.neutron [-] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:25:51 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-63ef95d51246a7972c49e5667b1a1bdd2d735676fb52121165191bff8b343e32-userdata-shm.mount: Deactivated successfully.
Jan 22 17:25:51 compute-0 systemd[1]: var-lib-containers-storage-overlay-a2f1474a7079b267a1c55d91def8af2b533876438a16430b09ff8a4609526ec9-merged.mount: Deactivated successfully.
Jan 22 17:25:51 compute-0 podman[230032]: 2026-01-22 17:25:51.65575472 +0000 UTC m=+0.269555083 container cleanup 63ef95d51246a7972c49e5667b1a1bdd2d735676fb52121165191bff8b343e32 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:25:51 compute-0 systemd[1]: libpod-conmon-63ef95d51246a7972c49e5667b1a1bdd2d735676fb52121165191bff8b343e32.scope: Deactivated successfully.
Jan 22 17:25:51 compute-0 podman[230079]: 2026-01-22 17:25:51.783309997 +0000 UTC m=+0.108882790 container remove 63ef95d51246a7972c49e5667b1a1bdd2d735676fb52121165191bff8b343e32 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:25:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:51.788 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[96622c7b-8f14-4013-b518-d243da6ab32a]: (4, ('Thu Jan 22 05:25:51 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b (63ef95d51246a7972c49e5667b1a1bdd2d735676fb52121165191bff8b343e32)\n63ef95d51246a7972c49e5667b1a1bdd2d735676fb52121165191bff8b343e32\nThu Jan 22 05:25:51 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b (63ef95d51246a7972c49e5667b1a1bdd2d735676fb52121165191bff8b343e32)\n63ef95d51246a7972c49e5667b1a1bdd2d735676fb52121165191bff8b343e32\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:25:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:51.790 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6acc5db9-5c01-4f3d-a3a9-cb7cc12488af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:25:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:51.791 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeee918a6-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:25:51 compute-0 kernel: tapeee918a6-60: left promiscuous mode
Jan 22 17:25:51 compute-0 nova_compute[183075]: 2026-01-22 17:25:51.792 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:51 compute-0 nova_compute[183075]: 2026-01-22 17:25:51.804 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:51 compute-0 nova_compute[183075]: 2026-01-22 17:25:51.805 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:51.807 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0c83ff63-bb3a-4d4f-adb0-3c9eeb00f9fd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:25:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:51.829 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e091d96b-a940-4536-bc5b-df26db0b4b05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:25:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:51.830 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f5a12be5-062a-4e26-8b34-0afaab3952f9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:25:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:51.845 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[07867e69-0f4a-46b7-9517-c3c2baee71ca]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 505215, 'reachable_time': 37282, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230094, 'error': None, 'target': 'ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:25:51 compute-0 systemd[1]: run-netns-ovnmeta\x2deee918a6\x2d66b2\x2d47ae\x2db702\x2d620a23ef395b.mount: Deactivated successfully.
Jan 22 17:25:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:51.849 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-eee918a6-66b2-47ae-b702-620a23ef395b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:25:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:51.849 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[4e600ba4-b0f8-4e2a-81c0-2bd9af662d2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:25:52 compute-0 nova_compute[183075]: 2026-01-22 17:25:52.773 183079 DEBUG nova.compute.manager [req-e9a88f91-a5e2-4c6a-9db7-7ef32496b117 req-e3709cc4-645d-432c-ad5c-120084674e5e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Received event network-vif-unplugged-284d7527-ffe2-4ee7-bb76-65a68cce769e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:25:52 compute-0 nova_compute[183075]: 2026-01-22 17:25:52.773 183079 DEBUG oslo_concurrency.lockutils [req-e9a88f91-a5e2-4c6a-9db7-7ef32496b117 req-e3709cc4-645d-432c-ad5c-120084674e5e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:25:52 compute-0 nova_compute[183075]: 2026-01-22 17:25:52.774 183079 DEBUG oslo_concurrency.lockutils [req-e9a88f91-a5e2-4c6a-9db7-7ef32496b117 req-e3709cc4-645d-432c-ad5c-120084674e5e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:25:52 compute-0 nova_compute[183075]: 2026-01-22 17:25:52.774 183079 DEBUG oslo_concurrency.lockutils [req-e9a88f91-a5e2-4c6a-9db7-7ef32496b117 req-e3709cc4-645d-432c-ad5c-120084674e5e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:25:52 compute-0 nova_compute[183075]: 2026-01-22 17:25:52.774 183079 DEBUG nova.compute.manager [req-e9a88f91-a5e2-4c6a-9db7-7ef32496b117 req-e3709cc4-645d-432c-ad5c-120084674e5e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] No waiting events found dispatching network-vif-unplugged-284d7527-ffe2-4ee7-bb76-65a68cce769e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:25:52 compute-0 nova_compute[183075]: 2026-01-22 17:25:52.774 183079 DEBUG nova.compute.manager [req-e9a88f91-a5e2-4c6a-9db7-7ef32496b117 req-e3709cc4-645d-432c-ad5c-120084674e5e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Received event network-vif-unplugged-284d7527-ffe2-4ee7-bb76-65a68cce769e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:25:52 compute-0 nova_compute[183075]: 2026-01-22 17:25:52.774 183079 DEBUG nova.compute.manager [req-e9a88f91-a5e2-4c6a-9db7-7ef32496b117 req-e3709cc4-645d-432c-ad5c-120084674e5e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Received event network-vif-plugged-284d7527-ffe2-4ee7-bb76-65a68cce769e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:25:52 compute-0 nova_compute[183075]: 2026-01-22 17:25:52.774 183079 DEBUG oslo_concurrency.lockutils [req-e9a88f91-a5e2-4c6a-9db7-7ef32496b117 req-e3709cc4-645d-432c-ad5c-120084674e5e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:25:52 compute-0 nova_compute[183075]: 2026-01-22 17:25:52.775 183079 DEBUG oslo_concurrency.lockutils [req-e9a88f91-a5e2-4c6a-9db7-7ef32496b117 req-e3709cc4-645d-432c-ad5c-120084674e5e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:25:52 compute-0 nova_compute[183075]: 2026-01-22 17:25:52.775 183079 DEBUG oslo_concurrency.lockutils [req-e9a88f91-a5e2-4c6a-9db7-7ef32496b117 req-e3709cc4-645d-432c-ad5c-120084674e5e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:25:52 compute-0 nova_compute[183075]: 2026-01-22 17:25:52.775 183079 DEBUG nova.compute.manager [req-e9a88f91-a5e2-4c6a-9db7-7ef32496b117 req-e3709cc4-645d-432c-ad5c-120084674e5e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] No waiting events found dispatching network-vif-plugged-284d7527-ffe2-4ee7-bb76-65a68cce769e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:25:52 compute-0 nova_compute[183075]: 2026-01-22 17:25:52.775 183079 WARNING nova.compute.manager [req-e9a88f91-a5e2-4c6a-9db7-7ef32496b117 req-e3709cc4-645d-432c-ad5c-120084674e5e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Received unexpected event network-vif-plugged-284d7527-ffe2-4ee7-bb76-65a68cce769e for instance with vm_state active and task_state deleting.
Jan 22 17:25:52 compute-0 nova_compute[183075]: 2026-01-22 17:25:52.938 183079 DEBUG nova.network.neutron [-] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:25:52 compute-0 nova_compute[183075]: 2026-01-22 17:25:52.974 183079 INFO nova.compute.manager [-] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Took 1.42 seconds to deallocate network for instance.
Jan 22 17:25:53 compute-0 nova_compute[183075]: 2026-01-22 17:25:53.025 183079 DEBUG oslo_concurrency.lockutils [None req-2c2b89b1-2b9d-42f8-a4e8-e2a6d12475fc 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:25:53 compute-0 nova_compute[183075]: 2026-01-22 17:25:53.026 183079 DEBUG oslo_concurrency.lockutils [None req-2c2b89b1-2b9d-42f8-a4e8-e2a6d12475fc 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:25:53 compute-0 nova_compute[183075]: 2026-01-22 17:25:53.080 183079 DEBUG nova.compute.provider_tree [None req-2c2b89b1-2b9d-42f8-a4e8-e2a6d12475fc 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:25:53 compute-0 nova_compute[183075]: 2026-01-22 17:25:53.107 183079 DEBUG nova.scheduler.client.report [None req-2c2b89b1-2b9d-42f8-a4e8-e2a6d12475fc 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:25:53 compute-0 nova_compute[183075]: 2026-01-22 17:25:53.145 183079 DEBUG oslo_concurrency.lockutils [None req-2c2b89b1-2b9d-42f8-a4e8-e2a6d12475fc 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:25:53 compute-0 nova_compute[183075]: 2026-01-22 17:25:53.195 183079 INFO nova.scheduler.client.report [None req-2c2b89b1-2b9d-42f8-a4e8-e2a6d12475fc 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Deleted allocations for instance bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1
Jan 22 17:25:53 compute-0 nova_compute[183075]: 2026-01-22 17:25:53.288 183079 DEBUG oslo_concurrency.lockutils [None req-2c2b89b1-2b9d-42f8-a4e8-e2a6d12475fc 1148a46489e842e6a0c7660c54567798 02818155e7af4645bc909d4ba671f11f - - default default] Lock "bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.081s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:25:53 compute-0 nova_compute[183075]: 2026-01-22 17:25:53.763 183079 DEBUG oslo_concurrency.lockutils [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "7148e02d-0822-41cf-b2f4-41ccec2a2fe4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:25:53 compute-0 nova_compute[183075]: 2026-01-22 17:25:53.764 183079 DEBUG oslo_concurrency.lockutils [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "7148e02d-0822-41cf-b2f4-41ccec2a2fe4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:25:53 compute-0 nova_compute[183075]: 2026-01-22 17:25:53.809 183079 DEBUG nova.compute.manager [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:25:53 compute-0 nova_compute[183075]: 2026-01-22 17:25:53.893 183079 DEBUG oslo_concurrency.lockutils [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:25:53 compute-0 nova_compute[183075]: 2026-01-22 17:25:53.893 183079 DEBUG oslo_concurrency.lockutils [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:25:53 compute-0 nova_compute[183075]: 2026-01-22 17:25:53.900 183079 DEBUG nova.virt.hardware [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:25:53 compute-0 nova_compute[183075]: 2026-01-22 17:25:53.900 183079 INFO nova.compute.claims [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.025 183079 DEBUG nova.compute.provider_tree [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.043 183079 DEBUG nova.scheduler.client.report [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.077 183079 DEBUG oslo_concurrency.lockutils [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.184s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.077 183079 DEBUG nova.compute.manager [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.131 183079 DEBUG nova.compute.manager [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.131 183079 DEBUG nova.network.neutron [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.168 183079 INFO nova.virt.libvirt.driver [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.199 183079 DEBUG nova.compute.manager [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.328 183079 DEBUG nova.compute.manager [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.329 183079 DEBUG nova.virt.libvirt.driver [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.330 183079 INFO nova.virt.libvirt.driver [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Creating image(s)
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.330 183079 DEBUG oslo_concurrency.lockutils [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "/var/lib/nova/instances/7148e02d-0822-41cf-b2f4-41ccec2a2fe4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.330 183079 DEBUG oslo_concurrency.lockutils [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "/var/lib/nova/instances/7148e02d-0822-41cf-b2f4-41ccec2a2fe4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.331 183079 DEBUG oslo_concurrency.lockutils [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "/var/lib/nova/instances/7148e02d-0822-41cf-b2f4-41ccec2a2fe4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.342 183079 DEBUG oslo_concurrency.processutils [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.400 183079 DEBUG oslo_concurrency.processutils [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.401 183079 DEBUG oslo_concurrency.lockutils [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.402 183079 DEBUG oslo_concurrency.lockutils [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.412 183079 DEBUG oslo_concurrency.processutils [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.463 183079 DEBUG oslo_concurrency.processutils [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.464 183079 DEBUG oslo_concurrency.processutils [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/7148e02d-0822-41cf-b2f4-41ccec2a2fe4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.547 183079 DEBUG oslo_concurrency.processutils [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/7148e02d-0822-41cf-b2f4-41ccec2a2fe4/disk 1073741824" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.548 183079 DEBUG oslo_concurrency.lockutils [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.549 183079 DEBUG oslo_concurrency.processutils [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.595 183079 DEBUG nova.policy [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.611 183079 DEBUG oslo_concurrency.processutils [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.612 183079 DEBUG nova.virt.disk.api [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Checking if we can resize image /var/lib/nova/instances/7148e02d-0822-41cf-b2f4-41ccec2a2fe4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.613 183079 DEBUG oslo_concurrency.processutils [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7148e02d-0822-41cf-b2f4-41ccec2a2fe4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.667 183079 DEBUG oslo_concurrency.processutils [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7148e02d-0822-41cf-b2f4-41ccec2a2fe4/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.668 183079 DEBUG nova.virt.disk.api [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Cannot resize image /var/lib/nova/instances/7148e02d-0822-41cf-b2f4-41ccec2a2fe4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.668 183079 DEBUG nova.objects.instance [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lazy-loading 'migration_context' on Instance uuid 7148e02d-0822-41cf-b2f4-41ccec2a2fe4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.697 183079 DEBUG nova.virt.libvirt.driver [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.697 183079 DEBUG nova.virt.libvirt.driver [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Ensure instance console log exists: /var/lib/nova/instances/7148e02d-0822-41cf-b2f4-41ccec2a2fe4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.698 183079 DEBUG oslo_concurrency.lockutils [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.698 183079 DEBUG oslo_concurrency.lockutils [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:25:54 compute-0 nova_compute[183075]: 2026-01-22 17:25:54.698 183079 DEBUG oslo_concurrency.lockutils [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:25:55 compute-0 nova_compute[183075]: 2026-01-22 17:25:55.359 183079 DEBUG nova.network.neutron [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Successfully updated port: 7cdbd897-944c-4b6f-980e-c220cf2c2532 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:25:55 compute-0 nova_compute[183075]: 2026-01-22 17:25:55.391 183079 DEBUG oslo_concurrency.lockutils [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "refresh_cache-7148e02d-0822-41cf-b2f4-41ccec2a2fe4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:25:55 compute-0 nova_compute[183075]: 2026-01-22 17:25:55.392 183079 DEBUG oslo_concurrency.lockutils [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquired lock "refresh_cache-7148e02d-0822-41cf-b2f4-41ccec2a2fe4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:25:55 compute-0 nova_compute[183075]: 2026-01-22 17:25:55.392 183079 DEBUG nova.network.neutron [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.458 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468', 'name': 'tempest-server-1-468539916', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000002c', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e4c0bb18013747dfad2e25b2495090eb', 'user_id': '852aea4e08344f39ae07e6b57393c767', 'hostId': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:25:55 compute-0 nova_compute[183075]: 2026-01-22 17:25:55.459 183079 DEBUG nova.compute.manager [req-fa459ca2-71fb-4b44-adf1-aeb8afa868b9 req-1caa1539-c9a2-49d6-b075-25429a27c74c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Received event network-changed-7cdbd897-944c-4b6f-980e-c220cf2c2532 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:25:55 compute-0 nova_compute[183075]: 2026-01-22 17:25:55.459 183079 DEBUG nova.compute.manager [req-fa459ca2-71fb-4b44-adf1-aeb8afa868b9 req-1caa1539-c9a2-49d6-b075-25429a27c74c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Refreshing instance network info cache due to event network-changed-7cdbd897-944c-4b6f-980e-c220cf2c2532. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.459 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:25:55 compute-0 nova_compute[183075]: 2026-01-22 17:25:55.459 183079 DEBUG oslo_concurrency.lockutils [req-fa459ca2-71fb-4b44-adf1-aeb8afa868b9 req-1caa1539-c9a2-49d6-b075-25429a27c74c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-7148e02d-0822-41cf-b2f4-41ccec2a2fe4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.463 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for a6598da5-2e3d-4ca1-90ab-2a8db7241468 / tapff4c20a1-cc inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.463 12 DEBUG ceilometer.compute.pollsters [-] a6598da5-2e3d-4ca1-90ab-2a8db7241468/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fbe38485-eeeb-4cf9-b392-4ca7259ae18d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-0000002c-a6598da5-2e3d-4ca1-90ab-2a8db7241468-tapff4c20a1-cc', 'timestamp': '2026-01-22T17:25:55.459799', 'resource_metadata': {'display_name': 'tempest-server-1-468539916', 'name': 'tapff4c20a1-cc', 'instance_id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:41:19:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapff4c20a1-cc'}, 'message_id': '689c6d96-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5119.220245451, 'message_signature': '46b608eb17779210fba32732464e8776b6324adbf431d48cc263919f30841004'}]}, 'timestamp': '2026-01-22 17:25:55.464560', '_unique_id': 'ab201c6320a44709af1b7d1c1319ca49'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.465 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.466 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.467 12 DEBUG ceilometer.compute.pollsters [-] a6598da5-2e3d-4ca1-90ab-2a8db7241468/network.outgoing.bytes volume: 11596 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e74bd5c2-c342-43c0-a640-7e5f6c2250c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11596, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-0000002c-a6598da5-2e3d-4ca1-90ab-2a8db7241468-tapff4c20a1-cc', 'timestamp': '2026-01-22T17:25:55.467075', 'resource_metadata': {'display_name': 'tempest-server-1-468539916', 'name': 'tapff4c20a1-cc', 'instance_id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:41:19:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapff4c20a1-cc'}, 'message_id': '689ce186-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5119.220245451, 'message_signature': 'ce77203834bfead7157420a6de3388f6d861fed9822b95e71f5c099683289375'}]}, 'timestamp': '2026-01-22 17:25:55.467530', '_unique_id': 'bb83f1d563414709a0e22e64ffee83b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.468 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.469 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.485 12 DEBUG ceilometer.compute.pollsters [-] a6598da5-2e3d-4ca1-90ab-2a8db7241468/disk.device.write.requests volume: 327 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '64c721a9-07b8-44a5-bed8-024668a7cce1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 327, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468-vda', 'timestamp': '2026-01-22T17:25:55.469217', 'resource_metadata': {'display_name': 'tempest-server-1-468539916', 'name': 'instance-0000002c', 'instance_id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '689fae52-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5119.229664633, 'message_signature': '94b22ab692b9062051716f86fbe5422e59c62419e2e132903c0a6d106c9e42bf'}]}, 'timestamp': '2026-01-22 17:25:55.485779', '_unique_id': 'f65496d8080044899e844984f9fdf709'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.486 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 DEBUG ceilometer.compute.pollsters [-] a6598da5-2e3d-4ca1-90ab-2a8db7241468/disk.device.write.latency volume: 2461377937 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd26b3442-da0b-4cc7-a80a-60a42d31df58', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2461377937, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468-vda', 'timestamp': '2026-01-22T17:25:55.487052', 'resource_metadata': {'display_name': 'tempest-server-1-468539916', 'name': 'instance-0000002c', 'instance_id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '689fea5c-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5119.229664633, 'message_signature': 'c7036d9b57a8d9e7892818aaaf19c80796fc5560dd6a8e407a883a20c72a4ddb'}]}, 'timestamp': '2026-01-22 17:25:55.487298', '_unique_id': 'b189ec3ded6542cfa4821a6165bbfd04'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.487 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.488 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.488 12 DEBUG ceilometer.compute.pollsters [-] a6598da5-2e3d-4ca1-90ab-2a8db7241468/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '11f952e7-ff20-4b59-be1e-64f9bbe56e20', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-0000002c-a6598da5-2e3d-4ca1-90ab-2a8db7241468-tapff4c20a1-cc', 'timestamp': '2026-01-22T17:25:55.488368', 'resource_metadata': {'display_name': 'tempest-server-1-468539916', 'name': 'tapff4c20a1-cc', 'instance_id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:41:19:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapff4c20a1-cc'}, 'message_id': '68a01d24-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5119.220245451, 'message_signature': '6e81276ce8833e6bd80a36eec05786695a55daa9b1523c0f296ab5d940aa5d17'}]}, 'timestamp': '2026-01-22 17:25:55.488598', '_unique_id': '6251f9d4c15144ff8b518e211d6a94e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.489 12 DEBUG ceilometer.compute.pollsters [-] a6598da5-2e3d-4ca1-90ab-2a8db7241468/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '43560f7f-bd48-49da-a80a-71ff136a0abf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-0000002c-a6598da5-2e3d-4ca1-90ab-2a8db7241468-tapff4c20a1-cc', 'timestamp': '2026-01-22T17:25:55.489751', 'resource_metadata': {'display_name': 'tempest-server-1-468539916', 'name': 'tapff4c20a1-cc', 'instance_id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:41:19:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapff4c20a1-cc'}, 'message_id': '68a05316-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5119.220245451, 'message_signature': 'd694e36cc2e92cbe35666009a2da161df3fc33e01b748e4206b94b1da00b89cf'}]}, 'timestamp': '2026-01-22 17:25:55.489980', '_unique_id': '49b1399dd920444fba59beacc1c69930'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.490 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 DEBUG ceilometer.compute.pollsters [-] a6598da5-2e3d-4ca1-90ab-2a8db7241468/disk.device.read.latency volume: 197905783 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c756701-7eed-437e-b3f6-cca625a49652', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 197905783, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468-vda', 'timestamp': '2026-01-22T17:25:55.491072', 'resource_metadata': {'display_name': 'tempest-server-1-468539916', 'name': 'instance-0000002c', 'instance_id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '68a0867e-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5119.229664633, 'message_signature': '4e770bb2dcf21a54df4b5e4700980deb2a86e669156c5bd250aa088c7263b5ff'}]}, 'timestamp': '2026-01-22 17:25:55.491289', '_unique_id': '71930515c5f94ae4bdc539c56435fdbe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.491 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.492 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.492 12 DEBUG ceilometer.compute.pollsters [-] a6598da5-2e3d-4ca1-90ab-2a8db7241468/disk.device.read.bytes volume: 31283712 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e2c22ac0-5d8b-4559-b262-f5455a85d030', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31283712, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468-vda', 'timestamp': '2026-01-22T17:25:55.492412', 'resource_metadata': {'display_name': 'tempest-server-1-468539916', 'name': 'instance-0000002c', 'instance_id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '68a0bb1c-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5119.229664633, 'message_signature': '9adab62ea5a5cbafea35817dce543c3a3688ce6ccd3437b960fa16eaedc2b6e8'}]}, 'timestamp': '2026-01-22 17:25:55.492651', '_unique_id': '035bacea5daf40fea69d2e133407cc43'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.493 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-1-468539916>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-1-468539916>]
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.494 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.510 12 DEBUG ceilometer.compute.pollsters [-] a6598da5-2e3d-4ca1-90ab-2a8db7241468/cpu volume: 11250000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e9b8724f-bd15-427b-bf4a-f0870749c33d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11250000000, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468', 'timestamp': '2026-01-22T17:25:55.494114', 'resource_metadata': {'display_name': 'tempest-server-1-468539916', 'name': 'instance-0000002c', 'instance_id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '68a39076-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5119.270980897, 'message_signature': '1d9af13d049c5d3c9608ee29245cfbfe8256666f9847dc7eacb24dca2d0b77a6'}]}, 'timestamp': '2026-01-22 17:25:55.511287', '_unique_id': '92bbcd11dd4c43e69a1b08f8e76127e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.512 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.513 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.513 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-1-468539916>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-1-468539916>]
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.513 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.513 12 DEBUG ceilometer.compute.pollsters [-] a6598da5-2e3d-4ca1-90ab-2a8db7241468/network.incoming.packets volume: 62 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98f907a7-c465-4f3c-95dd-04734cf361a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 62, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-0000002c-a6598da5-2e3d-4ca1-90ab-2a8db7241468-tapff4c20a1-cc', 'timestamp': '2026-01-22T17:25:55.513361', 'resource_metadata': {'display_name': 'tempest-server-1-468539916', 'name': 'tapff4c20a1-cc', 'instance_id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:41:19:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapff4c20a1-cc'}, 'message_id': '68a3ef26-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5119.220245451, 'message_signature': '2d555eefd78fa8b7b43c722210016b1dd658963e55eb47667671d081c5d224ad'}]}, 'timestamp': '2026-01-22 17:25:55.513661', '_unique_id': '06c3808c93ed4bdfbde5559ca53fa588'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.514 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-1-468539916>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-1-468539916>]
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.515 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.520 12 DEBUG ceilometer.compute.pollsters [-] a6598da5-2e3d-4ca1-90ab-2a8db7241468/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1cc8f291-9b02-47f8-8164-6e90852c85f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468-vda', 'timestamp': '2026-01-22T17:25:55.515106', 'resource_metadata': {'display_name': 'tempest-server-1-468539916', 'name': 'instance-0000002c', 'instance_id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '68a51b3a-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5119.275538289, 'message_signature': 'e6d3b8e0b1dbe560941db0da48949c34e5cafb04cfaffe1f4cfd4cd390a3095b'}]}, 'timestamp': '2026-01-22 17:25:55.521351', '_unique_id': 'f0cc093cfcee4fbba447361f85753512'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.522 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 DEBUG ceilometer.compute.pollsters [-] a6598da5-2e3d-4ca1-90ab-2a8db7241468/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e00b4f18-50e4-4b51-a07e-efc6dd81a75d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468-vda', 'timestamp': '2026-01-22T17:25:55.522998', 'resource_metadata': {'display_name': 'tempest-server-1-468539916', 'name': 'instance-0000002c', 'instance_id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '68a5661c-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5119.275538289, 'message_signature': '7fc0543b8d16bf993f238133b55d893b5fd52dd814d34b3cf388e9ce52dd6988'}]}, 'timestamp': '2026-01-22 17:25:55.523233', '_unique_id': 'f0e5a47fdbd0436ab5777a10b187b9e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.523 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.524 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.524 12 DEBUG ceilometer.compute.pollsters [-] a6598da5-2e3d-4ca1-90ab-2a8db7241468/disk.device.read.requests volume: 1156 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea4aa0e6-9493-4245-9aee-40ac2d6fe4b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1156, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468-vda', 'timestamp': '2026-01-22T17:25:55.524365', 'resource_metadata': {'display_name': 'tempest-server-1-468539916', 'name': 'instance-0000002c', 'instance_id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '68a59b28-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5119.229664633, 'message_signature': 'f15321557133bac982aa3b50b8f9b86860a1989700f4760304c3c3f025eeb0ce'}]}, 'timestamp': '2026-01-22 17:25:55.524587', '_unique_id': '344d681224f2453ba60c495440f4c677'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.525 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-1-468539916>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-1-468539916>]
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 DEBUG ceilometer.compute.pollsters [-] a6598da5-2e3d-4ca1-90ab-2a8db7241468/disk.device.write.bytes volume: 73097216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2162f2c4-e0a1-45b7-9236-f2a0fc6dde1e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73097216, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468-vda', 'timestamp': '2026-01-22T17:25:55.526168', 'resource_metadata': {'display_name': 'tempest-server-1-468539916', 'name': 'instance-0000002c', 'instance_id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '68a5e196-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5119.229664633, 'message_signature': '84d88443ae99bf89911212be1dcae9f39f233f0e7d20a4b7e9246a2217792a44'}]}, 'timestamp': '2026-01-22 17:25:55.526391', '_unique_id': '1d43db711c7d4060b150d62800122b53'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.526 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.527 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.527 12 DEBUG ceilometer.compute.pollsters [-] a6598da5-2e3d-4ca1-90ab-2a8db7241468/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d48c78e-d70d-49de-8de5-def063e46a77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468-vda', 'timestamp': '2026-01-22T17:25:55.527485', 'resource_metadata': {'display_name': 'tempest-server-1-468539916', 'name': 'instance-0000002c', 'instance_id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '68a614f4-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5119.275538289, 'message_signature': '6fde16e9710d81bd6f1f77ef70fef2e071ad3a739061a7b6b58c62c75d65b3a0'}]}, 'timestamp': '2026-01-22 17:25:55.527722', '_unique_id': 'd61c86169a58462b820d1094933cde02'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.528 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.529 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.529 12 DEBUG ceilometer.compute.pollsters [-] a6598da5-2e3d-4ca1-90ab-2a8db7241468/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aab053b3-5827-43b8-bfe9-07edca758484', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-0000002c-a6598da5-2e3d-4ca1-90ab-2a8db7241468-tapff4c20a1-cc', 'timestamp': '2026-01-22T17:25:55.529191', 'resource_metadata': {'display_name': 'tempest-server-1-468539916', 'name': 'tapff4c20a1-cc', 'instance_id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:41:19:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapff4c20a1-cc'}, 'message_id': '68a65928-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5119.220245451, 'message_signature': 'afdb5c959589dede57f946603a6a898f52ddaeab27eb4474398d08937a7cd4a9'}]}, 'timestamp': '2026-01-22 17:25:55.529483', '_unique_id': '56836b4b9d204a9793c8f4cd99cd3408'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.530 12 DEBUG ceilometer.compute.pollsters [-] a6598da5-2e3d-4ca1-90ab-2a8db7241468/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7349b493-a53b-41a2-977a-99f30bdec2f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-0000002c-a6598da5-2e3d-4ca1-90ab-2a8db7241468-tapff4c20a1-cc', 'timestamp': '2026-01-22T17:25:55.530845', 'resource_metadata': {'display_name': 'tempest-server-1-468539916', 'name': 'tapff4c20a1-cc', 'instance_id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:41:19:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapff4c20a1-cc'}, 'message_id': '68a69988-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5119.220245451, 'message_signature': '8eb7ec501c49d59f45829577544efc59ee852df1f16fb5c9c1ec26d148452238'}]}, 'timestamp': '2026-01-22 17:25:55.531136', '_unique_id': '0d1314dcfb394a7793ae8f8460a54e20'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.531 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.532 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.532 12 DEBUG ceilometer.compute.pollsters [-] a6598da5-2e3d-4ca1-90ab-2a8db7241468/memory.usage volume: 42.046875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9562c73b-0b1f-4233-a0ef-16944f1e147a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.046875, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468', 'timestamp': '2026-01-22T17:25:55.532281', 'resource_metadata': {'display_name': 'tempest-server-1-468539916', 'name': 'instance-0000002c', 'instance_id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '68a6d06a-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5119.270980897, 'message_signature': 'e52b1ca1bbb46239692756fb98bb26e0c5d865b8e08030937261d408a31298b8'}]}, 'timestamp': '2026-01-22 17:25:55.532504', '_unique_id': 'f1067f097d1d49418ef3a1c8b831519c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.533 12 DEBUG ceilometer.compute.pollsters [-] a6598da5-2e3d-4ca1-90ab-2a8db7241468/network.outgoing.packets volume: 131 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '606b5010-0245-430a-a5d9-b815dd99f200', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 131, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-0000002c-a6598da5-2e3d-4ca1-90ab-2a8db7241468-tapff4c20a1-cc', 'timestamp': '2026-01-22T17:25:55.533966', 'resource_metadata': {'display_name': 'tempest-server-1-468539916', 'name': 'tapff4c20a1-cc', 'instance_id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:41:19:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapff4c20a1-cc'}, 'message_id': '68a71336-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5119.220245451, 'message_signature': '0786e7d8a5d96036e33c89e2b3be7e4a0c7333cc510b5ef9294027e812ae40d0'}]}, 'timestamp': '2026-01-22 17:25:55.534235', '_unique_id': '75a5e7f336fa441b9082b94fc9533b76'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.535 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.535 12 DEBUG ceilometer.compute.pollsters [-] a6598da5-2e3d-4ca1-90ab-2a8db7241468/network.incoming.bytes volume: 7334 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aaea99ce-1bfb-47b2-9c00-cb5471bbbfaf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7334, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-0000002c-a6598da5-2e3d-4ca1-90ab-2a8db7241468-tapff4c20a1-cc', 'timestamp': '2026-01-22T17:25:55.535436', 'resource_metadata': {'display_name': 'tempest-server-1-468539916', 'name': 'tapff4c20a1-cc', 'instance_id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:41:19:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapff4c20a1-cc'}, 'message_id': '68a74bc6-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5119.220245451, 'message_signature': '3947478b0ff9714564680f6c073750957e2da59b3de3a48c2bfa961d4af939f0'}]}, 'timestamp': '2026-01-22 17:25:55.535692', '_unique_id': '987af2cf64cb4d3fa12db1f9fb1fa5a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.536 12 DEBUG ceilometer.compute.pollsters [-] a6598da5-2e3d-4ca1-90ab-2a8db7241468/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8aa05c82-1212-4575-be58-7371ed9112b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '852aea4e08344f39ae07e6b57393c767', 'user_name': None, 'project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'project_name': None, 'resource_id': 'instance-0000002c-a6598da5-2e3d-4ca1-90ab-2a8db7241468-tapff4c20a1-cc', 'timestamp': '2026-01-22T17:25:55.536809', 'resource_metadata': {'display_name': 'tempest-server-1-468539916', 'name': 'tapff4c20a1-cc', 'instance_id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468', 'instance_type': 'm1.nano', 'host': 'cc4838dcb5607773b01a8638dbe007b5d3c8cdd5c0f743d2811d9fa9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:41:19:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapff4c20a1-cc'}, 'message_id': '68a781ea-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5119.220245451, 'message_signature': '722ec9dae699066bc4f60775889b55da9a632e1d8269104642b06a05d783bf0a'}]}, 'timestamp': '2026-01-22 17:25:55.537055', '_unique_id': 'ba3a35a3bb0a47239b5121bce614e038'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:25:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:25:55.537 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:25:55 compute-0 nova_compute[183075]: 2026-01-22 17:25:55.551 183079 DEBUG nova.network.neutron [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:25:56 compute-0 podman[230110]: 2026-01-22 17:25:56.351735719 +0000 UTC m=+0.057862207 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.425 183079 DEBUG nova.network.neutron [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Updating instance_info_cache with network_info: [{"id": "7cdbd897-944c-4b6f-980e-c220cf2c2532", "address": "fa:16:3e:0a:5f:4d", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cdbd897-94", "ovs_interfaceid": "7cdbd897-944c-4b6f-980e-c220cf2c2532", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.470 183079 DEBUG oslo_concurrency.lockutils [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Releasing lock "refresh_cache-7148e02d-0822-41cf-b2f4-41ccec2a2fe4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.470 183079 DEBUG nova.compute.manager [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Instance network_info: |[{"id": "7cdbd897-944c-4b6f-980e-c220cf2c2532", "address": "fa:16:3e:0a:5f:4d", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cdbd897-94", "ovs_interfaceid": "7cdbd897-944c-4b6f-980e-c220cf2c2532", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.471 183079 DEBUG oslo_concurrency.lockutils [req-fa459ca2-71fb-4b44-adf1-aeb8afa868b9 req-1caa1539-c9a2-49d6-b075-25429a27c74c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-7148e02d-0822-41cf-b2f4-41ccec2a2fe4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.471 183079 DEBUG nova.network.neutron [req-fa459ca2-71fb-4b44-adf1-aeb8afa868b9 req-1caa1539-c9a2-49d6-b075-25429a27c74c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Refreshing network info cache for port 7cdbd897-944c-4b6f-980e-c220cf2c2532 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.473 183079 DEBUG nova.virt.libvirt.driver [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Start _get_guest_xml network_info=[{"id": "7cdbd897-944c-4b6f-980e-c220cf2c2532", "address": "fa:16:3e:0a:5f:4d", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cdbd897-94", "ovs_interfaceid": "7cdbd897-944c-4b6f-980e-c220cf2c2532", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.477 183079 WARNING nova.virt.libvirt.driver [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.483 183079 DEBUG nova.virt.libvirt.host [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.485 183079 DEBUG nova.virt.libvirt.host [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.488 183079 DEBUG nova.virt.libvirt.host [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.489 183079 DEBUG nova.virt.libvirt.host [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.489 183079 DEBUG nova.virt.libvirt.driver [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.489 183079 DEBUG nova.virt.hardware [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.489 183079 DEBUG nova.virt.hardware [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.490 183079 DEBUG nova.virt.hardware [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.490 183079 DEBUG nova.virt.hardware [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.490 183079 DEBUG nova.virt.hardware [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.490 183079 DEBUG nova.virt.hardware [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.491 183079 DEBUG nova.virt.hardware [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.491 183079 DEBUG nova.virt.hardware [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.491 183079 DEBUG nova.virt.hardware [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.491 183079 DEBUG nova.virt.hardware [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.491 183079 DEBUG nova.virt.hardware [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.495 183079 DEBUG nova.virt.libvirt.vif [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:25:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-2-1964744070',display_name='tempest-server-2-1964744070',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-2-1964744070',id=46,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMynqvoXBI34qH0ZDQNq01ZPmk41Xdi4JxkvNO4nJ7GrdggfhpXkKQhhUZRV3fv3bSwcXMqi04ipV9xe7IsTm4GBRV+U9o6VN5JSMLulp+KIvjPuEmjq0h65ra09/zQB9w==',key_name='tempest-keypair-test-1762910261',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e4c0bb18013747dfad2e25b2495090eb',ramdisk_id='',reservation_id='r-p7dnx8q0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',
image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortForwardingTestJSON-1240706675',owner_user_name='tempest-PortForwardingTestJSON-1240706675-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:25:54Z,user_data=None,user_id='852aea4e08344f39ae07e6b57393c767',uuid=7148e02d-0822-41cf-b2f4-41ccec2a2fe4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7cdbd897-944c-4b6f-980e-c220cf2c2532", "address": "fa:16:3e:0a:5f:4d", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cdbd897-94", "ovs_interfaceid": "7cdbd897-944c-4b6f-980e-c220cf2c2532", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.495 183079 DEBUG nova.network.os_vif_util [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Converting VIF {"id": "7cdbd897-944c-4b6f-980e-c220cf2c2532", "address": "fa:16:3e:0a:5f:4d", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cdbd897-94", "ovs_interfaceid": "7cdbd897-944c-4b6f-980e-c220cf2c2532", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.495 183079 DEBUG nova.network.os_vif_util [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0a:5f:4d,bridge_name='br-int',has_traffic_filtering=True,id=7cdbd897-944c-4b6f-980e-c220cf2c2532,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7cdbd897-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.497 183079 DEBUG nova.objects.instance [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lazy-loading 'pci_devices' on Instance uuid 7148e02d-0822-41cf-b2f4-41ccec2a2fe4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.497 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.519 183079 DEBUG nova.virt.libvirt.driver [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:25:56 compute-0 nova_compute[183075]:   <uuid>7148e02d-0822-41cf-b2f4-41ccec2a2fe4</uuid>
Jan 22 17:25:56 compute-0 nova_compute[183075]:   <name>instance-0000002e</name>
Jan 22 17:25:56 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:25:56 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:25:56 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:25:56 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:       <nova:name>tempest-server-2-1964744070</nova:name>
Jan 22 17:25:56 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:25:56</nova:creationTime>
Jan 22 17:25:56 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:25:56 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:25:56 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:25:56 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:25:56 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:25:56 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:25:56 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:25:56 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:25:56 compute-0 nova_compute[183075]:         <nova:user uuid="852aea4e08344f39ae07e6b57393c767">tempest-PortForwardingTestJSON-1240706675-project-member</nova:user>
Jan 22 17:25:56 compute-0 nova_compute[183075]:         <nova:project uuid="e4c0bb18013747dfad2e25b2495090eb">tempest-PortForwardingTestJSON-1240706675</nova:project>
Jan 22 17:25:56 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:25:56 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:25:56 compute-0 nova_compute[183075]:         <nova:port uuid="7cdbd897-944c-4b6f-980e-c220cf2c2532">
Jan 22 17:25:56 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:25:56 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:25:56 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:25:56 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <system>
Jan 22 17:25:56 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:25:56 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:25:56 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:25:56 compute-0 nova_compute[183075]:       <entry name="serial">7148e02d-0822-41cf-b2f4-41ccec2a2fe4</entry>
Jan 22 17:25:56 compute-0 nova_compute[183075]:       <entry name="uuid">7148e02d-0822-41cf-b2f4-41ccec2a2fe4</entry>
Jan 22 17:25:56 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     </system>
Jan 22 17:25:56 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:25:56 compute-0 nova_compute[183075]:   <os>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:   </os>
Jan 22 17:25:56 compute-0 nova_compute[183075]:   <features>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:   </features>
Jan 22 17:25:56 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:25:56 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:25:56 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:25:56 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/7148e02d-0822-41cf-b2f4-41ccec2a2fe4/disk"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:25:56 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:0a:5f:4d"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:       <target dev="tap7cdbd897-94"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:25:56 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/7148e02d-0822-41cf-b2f4-41ccec2a2fe4/console.log" append="off"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <video>
Jan 22 17:25:56 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     </video>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:25:56 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:25:56 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:25:56 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:25:56 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:25:56 compute-0 nova_compute[183075]: </domain>
Jan 22 17:25:56 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.520 183079 DEBUG nova.compute.manager [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Preparing to wait for external event network-vif-plugged-7cdbd897-944c-4b6f-980e-c220cf2c2532 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.520 183079 DEBUG oslo_concurrency.lockutils [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "7148e02d-0822-41cf-b2f4-41ccec2a2fe4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.520 183079 DEBUG oslo_concurrency.lockutils [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "7148e02d-0822-41cf-b2f4-41ccec2a2fe4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.521 183079 DEBUG oslo_concurrency.lockutils [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "7148e02d-0822-41cf-b2f4-41ccec2a2fe4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.521 183079 DEBUG nova.virt.libvirt.vif [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:25:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-2-1964744070',display_name='tempest-server-2-1964744070',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-2-1964744070',id=46,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMynqvoXBI34qH0ZDQNq01ZPmk41Xdi4JxkvNO4nJ7GrdggfhpXkKQhhUZRV3fv3bSwcXMqi04ipV9xe7IsTm4GBRV+U9o6VN5JSMLulp+KIvjPuEmjq0h65ra09/zQB9w==',key_name='tempest-keypair-test-1762910261',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e4c0bb18013747dfad2e25b2495090eb',ramdisk_id='',reservation_id='r-p7dnx8q0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model
='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortForwardingTestJSON-1240706675',owner_user_name='tempest-PortForwardingTestJSON-1240706675-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:25:54Z,user_data=None,user_id='852aea4e08344f39ae07e6b57393c767',uuid=7148e02d-0822-41cf-b2f4-41ccec2a2fe4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7cdbd897-944c-4b6f-980e-c220cf2c2532", "address": "fa:16:3e:0a:5f:4d", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cdbd897-94", "ovs_interfaceid": "7cdbd897-944c-4b6f-980e-c220cf2c2532", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.521 183079 DEBUG nova.network.os_vif_util [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Converting VIF {"id": "7cdbd897-944c-4b6f-980e-c220cf2c2532", "address": "fa:16:3e:0a:5f:4d", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cdbd897-94", "ovs_interfaceid": "7cdbd897-944c-4b6f-980e-c220cf2c2532", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.522 183079 DEBUG nova.network.os_vif_util [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0a:5f:4d,bridge_name='br-int',has_traffic_filtering=True,id=7cdbd897-944c-4b6f-980e-c220cf2c2532,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7cdbd897-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.522 183079 DEBUG os_vif [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:5f:4d,bridge_name='br-int',has_traffic_filtering=True,id=7cdbd897-944c-4b6f-980e-c220cf2c2532,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7cdbd897-94') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.523 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.523 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.524 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.526 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.526 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7cdbd897-94, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.526 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7cdbd897-94, col_values=(('external_ids', {'iface-id': '7cdbd897-944c-4b6f-980e-c220cf2c2532', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0a:5f:4d', 'vm-uuid': '7148e02d-0822-41cf-b2f4-41ccec2a2fe4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.528 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:56 compute-0 NetworkManager[55454]: <info>  [1769102756.5291] manager: (tap7cdbd897-94): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/219)
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.530 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.532 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.533 183079 INFO os_vif [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:5f:4d,bridge_name='br-int',has_traffic_filtering=True,id=7cdbd897-944c-4b6f-980e-c220cf2c2532,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7cdbd897-94')
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.647 183079 DEBUG nova.virt.libvirt.driver [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.647 183079 DEBUG nova.virt.libvirt.driver [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] No VIF found with MAC fa:16:3e:0a:5f:4d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:25:56 compute-0 kernel: tap7cdbd897-94: entered promiscuous mode
Jan 22 17:25:56 compute-0 NetworkManager[55454]: <info>  [1769102756.7086] manager: (tap7cdbd897-94): new Tun device (/org/freedesktop/NetworkManager/Devices/220)
Jan 22 17:25:56 compute-0 ovn_controller[95372]: 2026-01-22T17:25:56Z|00511|binding|INFO|Claiming lport 7cdbd897-944c-4b6f-980e-c220cf2c2532 for this chassis.
Jan 22 17:25:56 compute-0 ovn_controller[95372]: 2026-01-22T17:25:56Z|00512|binding|INFO|7cdbd897-944c-4b6f-980e-c220cf2c2532: Claiming fa:16:3e:0a:5f:4d 10.100.0.13
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.710 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:56.726 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0a:5f:4d 10.100.0.13'], port_security=['fa:16:3e:0a:5f:4d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '7148e02d-0822-41cf-b2f4-41ccec2a2fe4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'neutron:revision_number': '2', 'neutron:security_group_ids': '94e3530f-8012-4817-a338-7919b109ef3e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12343ce0-7cef-4f7f-9439-6550d878d4ba, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=7cdbd897-944c-4b6f-980e-c220cf2c2532) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:25:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:56.727 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 7cdbd897-944c-4b6f-980e-c220cf2c2532 in datapath 44326f3c-1431-44d6-85ce-61ecbbb5ed7a bound to our chassis
Jan 22 17:25:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:56.729 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 44326f3c-1431-44d6-85ce-61ecbbb5ed7a
Jan 22 17:25:56 compute-0 systemd-udevd[230145]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:25:56 compute-0 ovn_controller[95372]: 2026-01-22T17:25:56Z|00513|binding|INFO|Setting lport 7cdbd897-944c-4b6f-980e-c220cf2c2532 ovn-installed in OVS
Jan 22 17:25:56 compute-0 ovn_controller[95372]: 2026-01-22T17:25:56Z|00514|binding|INFO|Setting lport 7cdbd897-944c-4b6f-980e-c220cf2c2532 up in Southbound
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.741 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:56.743 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0f5bd1a5-b70a-43d2-8dff-7da8b5bb800b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.747 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:56 compute-0 NetworkManager[55454]: <info>  [1769102756.7547] device (tap7cdbd897-94): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:25:56 compute-0 NetworkManager[55454]: <info>  [1769102756.7557] device (tap7cdbd897-94): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:25:56 compute-0 systemd-machined[154382]: New machine qemu-46-instance-0000002e.
Jan 22 17:25:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:56.773 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[ed75f59d-8919-4d5d-b277-56ae1bc1ad21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:25:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:56.776 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[82703015-df8e-449f-a4bf-638722ff831d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:25:56 compute-0 systemd[1]: Started Virtual Machine qemu-46-instance-0000002e.
Jan 22 17:25:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:56.800 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[94cb40ab-2620-449b-9bff-ad9e70e26cf5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.807 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:56.819 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6fe0949f-b07e-4398-92a7-ed8514b36a5b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap44326f3c-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:1b:89'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 56, 'rx_bytes': 8920, 'tx_bytes': 6252, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 56, 'rx_bytes': 8920, 'tx_bytes': 6252, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 143], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 505447, 'reachable_time': 18257, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230159, 'error': None, 'target': 'ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:25:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:56.836 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[defb5d5c-dc44-45e6-9849-bd5abb2cb876]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap44326f3c-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 505462, 'tstamp': 505462}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230162, 'error': None, 'target': 'ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap44326f3c-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 505465, 'tstamp': 505465}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230162, 'error': None, 'target': 'ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:25:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:56.837 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44326f3c-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.838 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:56 compute-0 nova_compute[183075]: 2026-01-22 17:25:56.839 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:25:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:56.839 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap44326f3c-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:25:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:56.840 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:25:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:56.840 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap44326f3c-10, col_values=(('external_ids', {'iface-id': '118957e0-7da0-4d87-b7d4-2c204e19e5b6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:25:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:25:56.840 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.005 183079 DEBUG nova.compute.manager [req-2dd2c518-7ee3-4787-a530-b7d684355ad7 req-ffbe8341-6d10-456b-8060-f2e5b1b6c5a3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Received event network-vif-plugged-7cdbd897-944c-4b6f-980e-c220cf2c2532 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.007 183079 DEBUG oslo_concurrency.lockutils [req-2dd2c518-7ee3-4787-a530-b7d684355ad7 req-ffbe8341-6d10-456b-8060-f2e5b1b6c5a3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "7148e02d-0822-41cf-b2f4-41ccec2a2fe4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.008 183079 DEBUG oslo_concurrency.lockutils [req-2dd2c518-7ee3-4787-a530-b7d684355ad7 req-ffbe8341-6d10-456b-8060-f2e5b1b6c5a3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7148e02d-0822-41cf-b2f4-41ccec2a2fe4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.009 183079 DEBUG oslo_concurrency.lockutils [req-2dd2c518-7ee3-4787-a530-b7d684355ad7 req-ffbe8341-6d10-456b-8060-f2e5b1b6c5a3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7148e02d-0822-41cf-b2f4-41ccec2a2fe4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.009 183079 DEBUG nova.compute.manager [req-2dd2c518-7ee3-4787-a530-b7d684355ad7 req-ffbe8341-6d10-456b-8060-f2e5b1b6c5a3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Processing event network-vif-plugged-7cdbd897-944c-4b6f-980e-c220cf2c2532 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.099 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102757.0987785, 7148e02d-0822-41cf-b2f4-41ccec2a2fe4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.100 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] VM Started (Lifecycle Event)
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.104 183079 DEBUG nova.compute.manager [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.108 183079 DEBUG nova.virt.libvirt.driver [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.114 183079 INFO nova.virt.libvirt.driver [-] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Instance spawned successfully.
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.114 183079 DEBUG nova.virt.libvirt.driver [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.124 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.135 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.142 183079 DEBUG nova.virt.libvirt.driver [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.143 183079 DEBUG nova.virt.libvirt.driver [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.144 183079 DEBUG nova.virt.libvirt.driver [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.145 183079 DEBUG nova.virt.libvirt.driver [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.146 183079 DEBUG nova.virt.libvirt.driver [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.147 183079 DEBUG nova.virt.libvirt.driver [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.156 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.157 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102757.0989323, 7148e02d-0822-41cf-b2f4-41ccec2a2fe4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.157 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] VM Paused (Lifecycle Event)
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.187 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.192 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102757.1073065, 7148e02d-0822-41cf-b2f4-41ccec2a2fe4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.192 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] VM Resumed (Lifecycle Event)
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.211 183079 INFO nova.compute.manager [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Took 2.88 seconds to spawn the instance on the hypervisor.
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.211 183079 DEBUG nova.compute.manager [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.213 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.222 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.265 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.287 183079 INFO nova.compute.manager [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Took 3.41 seconds to build instance.
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.304 183079 DEBUG oslo_concurrency.lockutils [None req-6988a600-93aa-4b74-a52d-ed034df10d5c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "7148e02d-0822-41cf-b2f4-41ccec2a2fe4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.541s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.889 183079 INFO nova.compute.manager [None req-46a55508-441e-4210-9455-dacf487f09ab 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Get console output
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.895 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.975 183079 DEBUG nova.network.neutron [req-fa459ca2-71fb-4b44-adf1-aeb8afa868b9 req-1caa1539-c9a2-49d6-b075-25429a27c74c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Updated VIF entry in instance network info cache for port 7cdbd897-944c-4b6f-980e-c220cf2c2532. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:25:57 compute-0 nova_compute[183075]: 2026-01-22 17:25:57.976 183079 DEBUG nova.network.neutron [req-fa459ca2-71fb-4b44-adf1-aeb8afa868b9 req-1caa1539-c9a2-49d6-b075-25429a27c74c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Updating instance_info_cache with network_info: [{"id": "7cdbd897-944c-4b6f-980e-c220cf2c2532", "address": "fa:16:3e:0a:5f:4d", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cdbd897-94", "ovs_interfaceid": "7cdbd897-944c-4b6f-980e-c220cf2c2532", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:25:58 compute-0 nova_compute[183075]: 2026-01-22 17:25:58.132 183079 DEBUG oslo_concurrency.lockutils [req-fa459ca2-71fb-4b44-adf1-aeb8afa868b9 req-1caa1539-c9a2-49d6-b075-25429a27c74c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-7148e02d-0822-41cf-b2f4-41ccec2a2fe4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:25:59 compute-0 nova_compute[183075]: 2026-01-22 17:25:59.121 183079 DEBUG nova.compute.manager [req-5c4ce4b3-778b-4a89-b5ca-3fa28857d0db req-2ff3c506-93ad-44f5-8ebc-42ae26ed08b0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Received event network-vif-plugged-7cdbd897-944c-4b6f-980e-c220cf2c2532 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:25:59 compute-0 nova_compute[183075]: 2026-01-22 17:25:59.121 183079 DEBUG oslo_concurrency.lockutils [req-5c4ce4b3-778b-4a89-b5ca-3fa28857d0db req-2ff3c506-93ad-44f5-8ebc-42ae26ed08b0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "7148e02d-0822-41cf-b2f4-41ccec2a2fe4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:25:59 compute-0 nova_compute[183075]: 2026-01-22 17:25:59.121 183079 DEBUG oslo_concurrency.lockutils [req-5c4ce4b3-778b-4a89-b5ca-3fa28857d0db req-2ff3c506-93ad-44f5-8ebc-42ae26ed08b0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7148e02d-0822-41cf-b2f4-41ccec2a2fe4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:25:59 compute-0 nova_compute[183075]: 2026-01-22 17:25:59.122 183079 DEBUG oslo_concurrency.lockutils [req-5c4ce4b3-778b-4a89-b5ca-3fa28857d0db req-2ff3c506-93ad-44f5-8ebc-42ae26ed08b0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7148e02d-0822-41cf-b2f4-41ccec2a2fe4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:25:59 compute-0 nova_compute[183075]: 2026-01-22 17:25:59.122 183079 DEBUG nova.compute.manager [req-5c4ce4b3-778b-4a89-b5ca-3fa28857d0db req-2ff3c506-93ad-44f5-8ebc-42ae26ed08b0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] No waiting events found dispatching network-vif-plugged-7cdbd897-944c-4b6f-980e-c220cf2c2532 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:25:59 compute-0 nova_compute[183075]: 2026-01-22 17:25:59.122 183079 WARNING nova.compute.manager [req-5c4ce4b3-778b-4a89-b5ca-3fa28857d0db req-2ff3c506-93ad-44f5-8ebc-42ae26ed08b0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Received unexpected event network-vif-plugged-7cdbd897-944c-4b6f-980e-c220cf2c2532 for instance with vm_state active and task_state None.
Jan 22 17:26:01 compute-0 nova_compute[183075]: 2026-01-22 17:26:01.531 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:01 compute-0 ovn_controller[95372]: 2026-01-22T17:26:01Z|00515|binding|INFO|Releasing lport 118957e0-7da0-4d87-b7d4-2c204e19e5b6 from this chassis (sb_readonly=0)
Jan 22 17:26:01 compute-0 nova_compute[183075]: 2026-01-22 17:26:01.794 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:01 compute-0 nova_compute[183075]: 2026-01-22 17:26:01.857 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:01 compute-0 ovn_controller[95372]: 2026-01-22T17:26:01Z|00516|binding|INFO|Releasing lport 118957e0-7da0-4d87-b7d4-2c204e19e5b6 from this chassis (sb_readonly=0)
Jan 22 17:26:01 compute-0 nova_compute[183075]: 2026-01-22 17:26:01.867 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:03 compute-0 nova_compute[183075]: 2026-01-22 17:26:03.017 183079 INFO nova.compute.manager [None req-05173e11-4ee3-4e18-8fd2-3418482ce61c 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Get console output
Jan 22 17:26:03 compute-0 nova_compute[183075]: 2026-01-22 17:26:03.024 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:26:03 compute-0 nova_compute[183075]: 2026-01-22 17:26:03.140 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769102748.139227, 3e702ebd-3e54-4c9a-937e-f38331893ad1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:26:03 compute-0 nova_compute[183075]: 2026-01-22 17:26:03.141 183079 INFO nova.compute.manager [-] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] VM Stopped (Lifecycle Event)
Jan 22 17:26:03 compute-0 nova_compute[183075]: 2026-01-22 17:26:03.156 183079 DEBUG nova.compute.manager [None req-e03e8173-c290-4dff-9a15-3c86fb902119 - - - - - -] [instance: 3e702ebd-3e54-4c9a-937e-f38331893ad1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:26:05 compute-0 podman[230173]: 2026-01-22 17:26:05.380765253 +0000 UTC m=+0.082698900 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 17:26:06 compute-0 nova_compute[183075]: 2026-01-22 17:26:06.458 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769102751.4572327, bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:26:06 compute-0 nova_compute[183075]: 2026-01-22 17:26:06.459 183079 INFO nova.compute.manager [-] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] VM Stopped (Lifecycle Event)
Jan 22 17:26:06 compute-0 nova_compute[183075]: 2026-01-22 17:26:06.480 183079 DEBUG nova.compute.manager [None req-d5b2dad9-a6a0-4cec-a1f3-3a3b50654893 - - - - - -] [instance: bf3e2d5d-90d9-4e1b-8c7f-bfc45374f1b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:26:06 compute-0 nova_compute[183075]: 2026-01-22 17:26:06.534 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:06 compute-0 nova_compute[183075]: 2026-01-22 17:26:06.861 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:10 compute-0 nova_compute[183075]: 2026-01-22 17:26:10.109 183079 INFO nova.compute.manager [None req-13292d76-cafe-4796-87f0-45973f574e9a 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Get console output
Jan 22 17:26:10 compute-0 nova_compute[183075]: 2026-01-22 17:26:10.114 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:26:11 compute-0 podman[230207]: 2026-01-22 17:26:11.341297198 +0000 UTC m=+0.044324145 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:26:11 compute-0 nova_compute[183075]: 2026-01-22 17:26:11.537 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:11 compute-0 nova_compute[183075]: 2026-01-22 17:26:11.863 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:12 compute-0 nova_compute[183075]: 2026-01-22 17:26:12.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:26:12 compute-0 ovn_controller[95372]: 2026-01-22T17:26:12Z|00094|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0a:5f:4d 10.100.0.13
Jan 22 17:26:12 compute-0 ovn_controller[95372]: 2026-01-22T17:26:12Z|00095|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0a:5f:4d 10.100.0.13
Jan 22 17:26:15 compute-0 nova_compute[183075]: 2026-01-22 17:26:15.396 183079 INFO nova.compute.manager [None req-ed1d9dd7-4353-4765-b141-68e9f9532809 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Get console output
Jan 22 17:26:15 compute-0 nova_compute[183075]: 2026-01-22 17:26:15.402 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:26:16 compute-0 nova_compute[183075]: 2026-01-22 17:26:16.540 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:16 compute-0 nova_compute[183075]: 2026-01-22 17:26:16.864 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:19 compute-0 nova_compute[183075]: 2026-01-22 17:26:19.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:26:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:19.847 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:26:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:19.848 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:26:19 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:26:19 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:26:19 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:26:19 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:26:19 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:26:19 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:26:19 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.611 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.612 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.7641509
Jan 22 17:26:20 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229564]: 10.100.0.13:38342 [22/Jan/2026:17:26:19.846] listener listener/metadata 0/0/0/765/765 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.620 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.622 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:26:20 compute-0 nova_compute[183075]: 2026-01-22 17:26:20.628 183079 INFO nova.compute.manager [None req-7f0ab430-b696-432d-8fdb-25417594f673 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Get console output
Jan 22 17:26:20 compute-0 nova_compute[183075]: 2026-01-22 17:26:20.633 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.639 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.639 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0175712
Jan 22 17:26:20 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229564]: 10.100.0.13:38352 [22/Jan/2026:17:26:20.620] listener listener/metadata 0/0/0/19/19 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.645 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.646 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.662 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.662 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0164313
Jan 22 17:26:20 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229564]: 10.100.0.13:59156 [22/Jan/2026:17:26:20.645] listener listener/metadata 0/0/0/17/17 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.666 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.666 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.678 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.678 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0119641
Jan 22 17:26:20 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229564]: 10.100.0.13:59158 [22/Jan/2026:17:26:20.666] listener listener/metadata 0/0/0/12/12 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.682 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.683 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.696 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.697 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0138824
Jan 22 17:26:20 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229564]: 10.100.0.13:59162 [22/Jan/2026:17:26:20.682] listener listener/metadata 0/0/0/14/14 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.700 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.701 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.725 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:26:20 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229564]: 10.100.0.13:59170 [22/Jan/2026:17:26:20.700] listener listener/metadata 0/0/0/24/24 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.725 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0240157
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.729 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.730 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.744 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.744 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0140212
Jan 22 17:26:20 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229564]: 10.100.0.13:59176 [22/Jan/2026:17:26:20.729] listener listener/metadata 0/0/0/15/15 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.749 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.750 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.770 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.770 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0203238
Jan 22 17:26:20 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229564]: 10.100.0.13:59180 [22/Jan/2026:17:26:20.749] listener listener/metadata 0/0/0/21/21 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.781 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.781 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:26:20 compute-0 nova_compute[183075]: 2026-01-22 17:26:20.789 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.799 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.800 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 163 time: 0.0185888
Jan 22 17:26:20 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229564]: 10.100.0.13:59186 [22/Jan/2026:17:26:20.780] listener listener/metadata 0/0/0/19/19 200 147 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.805 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.806 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.826 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.827 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 163 time: 0.0211341
Jan 22 17:26:20 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229564]: 10.100.0.13:59198 [22/Jan/2026:17:26:20.805] listener listener/metadata 0/0/0/22/22 200 147 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.833 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.833 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:26:20 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229564]: 10.100.0.13:59210 [22/Jan/2026:17:26:20.832] listener listener/metadata 0/0/0/16/16 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.848 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0146542
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.867 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.868 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.887 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.887 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0191786
Jan 22 17:26:20 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229564]: 10.100.0.13:59216 [22/Jan/2026:17:26:20.866] listener listener/metadata 0/0/0/21/21 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.892 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.892 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.908 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.909 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0162604
Jan 22 17:26:20 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229564]: 10.100.0.13:59222 [22/Jan/2026:17:26:20.891] listener listener/metadata 0/0/0/17/17 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.913 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.913 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.933 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.934 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0206203
Jan 22 17:26:20 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229564]: 10.100.0.13:59228 [22/Jan/2026:17:26:20.912] listener listener/metadata 0/0/0/21/21 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.938 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.939 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.954 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.954 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 163 time: 0.0155425
Jan 22 17:26:20 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229564]: 10.100.0.13:59232 [22/Jan/2026:17:26:20.938] listener listener/metadata 0/0/0/16/16 200 147 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.958 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.959 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 44326f3c-1431-44d6-85ce-61ecbbb5ed7a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.977 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:26:20 compute-0 haproxy-metadata-proxy-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229564]: 10.100.0.13:59248 [22/Jan/2026:17:26:20.958] listener listener/metadata 0/0/0/19/19 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:26:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:20.978 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0193615
Jan 22 17:26:21 compute-0 podman[230262]: 2026-01-22 17:26:21.39921369 +0000 UTC m=+0.078987532 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-type=git, release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, architecture=x86_64, name=ubi9-minimal, io.openshift.expose-services=, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 22 17:26:21 compute-0 podman[230261]: 2026-01-22 17:26:21.399361283 +0000 UTC m=+0.095518943 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 22 17:26:21 compute-0 podman[230260]: 2026-01-22 17:26:21.417356434 +0000 UTC m=+0.108227942 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 17:26:21 compute-0 nova_compute[183075]: 2026-01-22 17:26:21.541 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:21 compute-0 nova_compute[183075]: 2026-01-22 17:26:21.866 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:24 compute-0 nova_compute[183075]: 2026-01-22 17:26:24.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:26:25 compute-0 nova_compute[183075]: 2026-01-22 17:26:25.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:26:25 compute-0 nova_compute[183075]: 2026-01-22 17:26:25.828 183079 INFO nova.compute.manager [None req-37f21cae-0683-4534-967e-dd302d2b0843 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Get console output
Jan 22 17:26:25 compute-0 nova_compute[183075]: 2026-01-22 17:26:25.833 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:26:26 compute-0 nova_compute[183075]: 2026-01-22 17:26:26.545 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:26 compute-0 nova_compute[183075]: 2026-01-22 17:26:26.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:26:26 compute-0 nova_compute[183075]: 2026-01-22 17:26:26.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:26:26 compute-0 nova_compute[183075]: 2026-01-22 17:26:26.888 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:27 compute-0 podman[230325]: 2026-01-22 17:26:27.350781744 +0000 UTC m=+0.062376147 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:26:27 compute-0 nova_compute[183075]: 2026-01-22 17:26:27.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:26:27 compute-0 nova_compute[183075]: 2026-01-22 17:26:27.818 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:26:27 compute-0 nova_compute[183075]: 2026-01-22 17:26:27.819 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:26:27 compute-0 nova_compute[183075]: 2026-01-22 17:26:27.819 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:26:27 compute-0 nova_compute[183075]: 2026-01-22 17:26:27.820 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:26:27 compute-0 nova_compute[183075]: 2026-01-22 17:26:27.913 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6598da5-2e3d-4ca1-90ab-2a8db7241468/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:26:28 compute-0 nova_compute[183075]: 2026-01-22 17:26:28.010 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6598da5-2e3d-4ca1-90ab-2a8db7241468/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:26:28 compute-0 nova_compute[183075]: 2026-01-22 17:26:28.012 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6598da5-2e3d-4ca1-90ab-2a8db7241468/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:26:28 compute-0 nova_compute[183075]: 2026-01-22 17:26:28.072 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6598da5-2e3d-4ca1-90ab-2a8db7241468/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:26:28 compute-0 nova_compute[183075]: 2026-01-22 17:26:28.078 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7148e02d-0822-41cf-b2f4-41ccec2a2fe4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:26:28 compute-0 nova_compute[183075]: 2026-01-22 17:26:28.179 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7148e02d-0822-41cf-b2f4-41ccec2a2fe4/disk --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:26:28 compute-0 nova_compute[183075]: 2026-01-22 17:26:28.180 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7148e02d-0822-41cf-b2f4-41ccec2a2fe4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:26:28 compute-0 nova_compute[183075]: 2026-01-22 17:26:28.274 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7148e02d-0822-41cf-b2f4-41ccec2a2fe4/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:26:28 compute-0 nova_compute[183075]: 2026-01-22 17:26:28.497 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:26:28 compute-0 nova_compute[183075]: 2026-01-22 17:26:28.498 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5395MB free_disk=73.3099365234375GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:26:28 compute-0 nova_compute[183075]: 2026-01-22 17:26:28.499 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:26:28 compute-0 nova_compute[183075]: 2026-01-22 17:26:28.499 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:26:28 compute-0 nova_compute[183075]: 2026-01-22 17:26:28.679 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance a6598da5-2e3d-4ca1-90ab-2a8db7241468 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:26:28 compute-0 nova_compute[183075]: 2026-01-22 17:26:28.679 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 7148e02d-0822-41cf-b2f4-41ccec2a2fe4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:26:28 compute-0 nova_compute[183075]: 2026-01-22 17:26:28.680 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:26:28 compute-0 nova_compute[183075]: 2026-01-22 17:26:28.680 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:26:28 compute-0 nova_compute[183075]: 2026-01-22 17:26:28.763 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:26:28 compute-0 nova_compute[183075]: 2026-01-22 17:26:28.795 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:26:28 compute-0 nova_compute[183075]: 2026-01-22 17:26:28.836 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:26:28 compute-0 nova_compute[183075]: 2026-01-22 17:26:28.837 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.338s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:26:29 compute-0 nova_compute[183075]: 2026-01-22 17:26:29.838 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:26:29 compute-0 nova_compute[183075]: 2026-01-22 17:26:29.839 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:26:29 compute-0 nova_compute[183075]: 2026-01-22 17:26:29.839 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:26:30 compute-0 nova_compute[183075]: 2026-01-22 17:26:30.424 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-a6598da5-2e3d-4ca1-90ab-2a8db7241468" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:26:30 compute-0 nova_compute[183075]: 2026-01-22 17:26:30.425 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-a6598da5-2e3d-4ca1-90ab-2a8db7241468" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:26:30 compute-0 nova_compute[183075]: 2026-01-22 17:26:30.425 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:26:30 compute-0 nova_compute[183075]: 2026-01-22 17:26:30.426 183079 DEBUG nova.objects.instance [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a6598da5-2e3d-4ca1-90ab-2a8db7241468 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:26:30 compute-0 nova_compute[183075]: 2026-01-22 17:26:30.950 183079 INFO nova.compute.manager [None req-a966d0a9-8afb-45c4-86b4-e259c5f8654f 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Get console output
Jan 22 17:26:30 compute-0 nova_compute[183075]: 2026-01-22 17:26:30.959 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:26:31 compute-0 nova_compute[183075]: 2026-01-22 17:26:31.548 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:31 compute-0 nova_compute[183075]: 2026-01-22 17:26:31.890 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:32 compute-0 nova_compute[183075]: 2026-01-22 17:26:32.812 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Updating instance_info_cache with network_info: [{"id": "ff4c20a1-cc0e-4a39-80b4-bb1426093c82", "address": "fa:16:3e:41:19:a3", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff4c20a1-cc", "ovs_interfaceid": "ff4c20a1-cc0e-4a39-80b4-bb1426093c82", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:26:32 compute-0 nova_compute[183075]: 2026-01-22 17:26:32.845 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-a6598da5-2e3d-4ca1-90ab-2a8db7241468" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:26:32 compute-0 nova_compute[183075]: 2026-01-22 17:26:32.845 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:26:32 compute-0 nova_compute[183075]: 2026-01-22 17:26:32.846 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:26:36 compute-0 nova_compute[183075]: 2026-01-22 17:26:36.107 183079 INFO nova.compute.manager [None req-ddb09f61-f63c-47b0-a4b3-440039dc1dae 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Get console output
Jan 22 17:26:36 compute-0 nova_compute[183075]: 2026-01-22 17:26:36.112 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:26:36 compute-0 podman[230360]: 2026-01-22 17:26:36.368431344 +0000 UTC m=+0.064873364 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 17:26:36 compute-0 nova_compute[183075]: 2026-01-22 17:26:36.551 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:36 compute-0 nova_compute[183075]: 2026-01-22 17:26:36.893 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:41 compute-0 nova_compute[183075]: 2026-01-22 17:26:41.244 183079 INFO nova.compute.manager [None req-b878bb79-0abb-45d9-8989-780ff7380bec 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Get console output
Jan 22 17:26:41 compute-0 nova_compute[183075]: 2026-01-22 17:26:41.249 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:26:41 compute-0 nova_compute[183075]: 2026-01-22 17:26:41.555 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:41 compute-0 nova_compute[183075]: 2026-01-22 17:26:41.895 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:41.943 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:26:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:41.944 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:26:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:41.944 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:26:42 compute-0 podman[230385]: 2026-01-22 17:26:42.384148323 +0000 UTC m=+0.089574844 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 17:26:46 compute-0 nova_compute[183075]: 2026-01-22 17:26:46.440 183079 INFO nova.compute.manager [None req-a9fd42b4-426c-43ec-b36e-e25e3a019371 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Get console output
Jan 22 17:26:46 compute-0 nova_compute[183075]: 2026-01-22 17:26:46.445 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:26:46 compute-0 nova_compute[183075]: 2026-01-22 17:26:46.557 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:46 compute-0 nova_compute[183075]: 2026-01-22 17:26:46.927 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:51 compute-0 nova_compute[183075]: 2026-01-22 17:26:51.560 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:51 compute-0 nova_compute[183075]: 2026-01-22 17:26:51.599 183079 INFO nova.compute.manager [None req-4e22e156-9b99-4575-9166-57a4c613d112 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Get console output
Jan 22 17:26:51 compute-0 nova_compute[183075]: 2026-01-22 17:26:51.606 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:26:51 compute-0 nova_compute[183075]: 2026-01-22 17:26:51.773 183079 DEBUG oslo_concurrency.lockutils [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Acquiring lock "b3d0d846-1f10-43c4-8e0c-9ba93967fdc7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:26:51 compute-0 nova_compute[183075]: 2026-01-22 17:26:51.774 183079 DEBUG oslo_concurrency.lockutils [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "b3d0d846-1f10-43c4-8e0c-9ba93967fdc7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:26:51 compute-0 nova_compute[183075]: 2026-01-22 17:26:51.814 183079 DEBUG nova.compute.manager [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:26:51 compute-0 nova_compute[183075]: 2026-01-22 17:26:51.892 183079 DEBUG oslo_concurrency.lockutils [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:26:51 compute-0 nova_compute[183075]: 2026-01-22 17:26:51.892 183079 DEBUG oslo_concurrency.lockutils [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:26:51 compute-0 nova_compute[183075]: 2026-01-22 17:26:51.902 183079 DEBUG nova.virt.hardware [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:26:51 compute-0 nova_compute[183075]: 2026-01-22 17:26:51.902 183079 INFO nova.compute.claims [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:26:51 compute-0 nova_compute[183075]: 2026-01-22 17:26:51.928 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.057 183079 DEBUG nova.compute.provider_tree [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.083 183079 DEBUG nova.scheduler.client.report [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.109 183079 DEBUG oslo_concurrency.lockutils [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.217s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.110 183079 DEBUG nova.compute.manager [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.177 183079 DEBUG nova.compute.manager [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.178 183079 DEBUG nova.network.neutron [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.200 183079 INFO nova.virt.libvirt.driver [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.220 183079 DEBUG nova.compute.manager [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.294 183079 DEBUG nova.compute.manager [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.295 183079 DEBUG nova.virt.libvirt.driver [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.296 183079 INFO nova.virt.libvirt.driver [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Creating image(s)
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.296 183079 DEBUG oslo_concurrency.lockutils [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Acquiring lock "/var/lib/nova/instances/b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.296 183079 DEBUG oslo_concurrency.lockutils [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "/var/lib/nova/instances/b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.297 183079 DEBUG oslo_concurrency.lockutils [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "/var/lib/nova/instances/b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.308 183079 DEBUG oslo_concurrency.processutils [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:26:52 compute-0 podman[230411]: 2026-01-22 17:26:52.368749748 +0000 UTC m=+0.050662374 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.369 183079 DEBUG oslo_concurrency.processutils [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.378 183079 DEBUG oslo_concurrency.lockutils [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.380 183079 DEBUG oslo_concurrency.lockutils [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:26:52 compute-0 podman[230412]: 2026-01-22 17:26:52.382393593 +0000 UTC m=+0.061934646 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, distribution-scope=public, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., release=1755695350, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=)
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.392 183079 DEBUG oslo_concurrency.processutils [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:26:52 compute-0 podman[230410]: 2026-01-22 17:26:52.398483473 +0000 UTC m=+0.097029794 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.445 183079 DEBUG oslo_concurrency.processutils [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.445 183079 DEBUG oslo_concurrency.processutils [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.474 183079 DEBUG oslo_concurrency.processutils [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.475 183079 DEBUG oslo_concurrency.lockutils [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.476 183079 DEBUG oslo_concurrency.processutils [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.529 183079 DEBUG oslo_concurrency.processutils [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.531 183079 DEBUG nova.virt.disk.api [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Checking if we can resize image /var/lib/nova/instances/b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.532 183079 DEBUG oslo_concurrency.processutils [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.589 183079 DEBUG oslo_concurrency.processutils [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.591 183079 DEBUG nova.virt.disk.api [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Cannot resize image /var/lib/nova/instances/b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.592 183079 DEBUG nova.objects.instance [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lazy-loading 'migration_context' on Instance uuid b3d0d846-1f10-43c4-8e0c-9ba93967fdc7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.613 183079 DEBUG nova.virt.libvirt.driver [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.613 183079 DEBUG nova.virt.libvirt.driver [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Ensure instance console log exists: /var/lib/nova/instances/b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.614 183079 DEBUG oslo_concurrency.lockutils [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.615 183079 DEBUG oslo_concurrency.lockutils [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.615 183079 DEBUG oslo_concurrency.lockutils [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:26:52 compute-0 nova_compute[183075]: 2026-01-22 17:26:52.679 183079 DEBUG nova.policy [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:26:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:53.432 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:26:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:53.433 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:26:53 compute-0 nova_compute[183075]: 2026-01-22 17:26:53.472 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:53 compute-0 nova_compute[183075]: 2026-01-22 17:26:53.820 183079 DEBUG nova.network.neutron [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Successfully updated port: 06bfc5f7-2163-4a40-87a7-050edd036a92 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:26:53 compute-0 nova_compute[183075]: 2026-01-22 17:26:53.849 183079 DEBUG oslo_concurrency.lockutils [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Acquiring lock "refresh_cache-b3d0d846-1f10-43c4-8e0c-9ba93967fdc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:26:53 compute-0 nova_compute[183075]: 2026-01-22 17:26:53.850 183079 DEBUG oslo_concurrency.lockutils [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Acquired lock "refresh_cache-b3d0d846-1f10-43c4-8e0c-9ba93967fdc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:26:53 compute-0 nova_compute[183075]: 2026-01-22 17:26:53.850 183079 DEBUG nova.network.neutron [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:26:53 compute-0 nova_compute[183075]: 2026-01-22 17:26:53.994 183079 DEBUG nova.compute.manager [req-cce4e63c-d24c-4de2-8738-13d14167c881 req-bcd850af-bb72-4728-a054-1775e3254ac3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Received event network-changed-06bfc5f7-2163-4a40-87a7-050edd036a92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:26:53 compute-0 nova_compute[183075]: 2026-01-22 17:26:53.995 183079 DEBUG nova.compute.manager [req-cce4e63c-d24c-4de2-8738-13d14167c881 req-bcd850af-bb72-4728-a054-1775e3254ac3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Refreshing instance network info cache due to event network-changed-06bfc5f7-2163-4a40-87a7-050edd036a92. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:26:53 compute-0 nova_compute[183075]: 2026-01-22 17:26:53.995 183079 DEBUG oslo_concurrency.lockutils [req-cce4e63c-d24c-4de2-8738-13d14167c881 req-bcd850af-bb72-4728-a054-1775e3254ac3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-b3d0d846-1f10-43c4-8e0c-9ba93967fdc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:26:54 compute-0 nova_compute[183075]: 2026-01-22 17:26:54.396 183079 DEBUG nova.network.neutron [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:26:55 compute-0 nova_compute[183075]: 2026-01-22 17:26:55.629 183079 DEBUG nova.network.neutron [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Updating instance_info_cache with network_info: [{"id": "06bfc5f7-2163-4a40-87a7-050edd036a92", "address": "fa:16:3e:9e:13:c8", "network": {"id": "900bfc1d-c57a-4f7e-92bf-4e7a876cd570", "bridge": "br-int", "label": "tempest-test-network--1765746424", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cc38cd108414e729395073de19dceae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06bfc5f7-21", "ovs_interfaceid": "06bfc5f7-2163-4a40-87a7-050edd036a92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.342 183079 DEBUG oslo_concurrency.lockutils [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Releasing lock "refresh_cache-b3d0d846-1f10-43c4-8e0c-9ba93967fdc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.342 183079 DEBUG nova.compute.manager [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Instance network_info: |[{"id": "06bfc5f7-2163-4a40-87a7-050edd036a92", "address": "fa:16:3e:9e:13:c8", "network": {"id": "900bfc1d-c57a-4f7e-92bf-4e7a876cd570", "bridge": "br-int", "label": "tempest-test-network--1765746424", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cc38cd108414e729395073de19dceae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06bfc5f7-21", "ovs_interfaceid": "06bfc5f7-2163-4a40-87a7-050edd036a92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.342 183079 DEBUG oslo_concurrency.lockutils [req-cce4e63c-d24c-4de2-8738-13d14167c881 req-bcd850af-bb72-4728-a054-1775e3254ac3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-b3d0d846-1f10-43c4-8e0c-9ba93967fdc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.343 183079 DEBUG nova.network.neutron [req-cce4e63c-d24c-4de2-8738-13d14167c881 req-bcd850af-bb72-4728-a054-1775e3254ac3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Refreshing network info cache for port 06bfc5f7-2163-4a40-87a7-050edd036a92 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.345 183079 DEBUG nova.virt.libvirt.driver [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Start _get_guest_xml network_info=[{"id": "06bfc5f7-2163-4a40-87a7-050edd036a92", "address": "fa:16:3e:9e:13:c8", "network": {"id": "900bfc1d-c57a-4f7e-92bf-4e7a876cd570", "bridge": "br-int", "label": "tempest-test-network--1765746424", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cc38cd108414e729395073de19dceae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06bfc5f7-21", "ovs_interfaceid": "06bfc5f7-2163-4a40-87a7-050edd036a92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.352 183079 WARNING nova.virt.libvirt.driver [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.356 183079 DEBUG nova.virt.libvirt.host [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.357 183079 DEBUG nova.virt.libvirt.host [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.360 183079 DEBUG nova.virt.libvirt.host [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.360 183079 DEBUG nova.virt.libvirt.host [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.361 183079 DEBUG nova.virt.libvirt.driver [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.361 183079 DEBUG nova.virt.hardware [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.362 183079 DEBUG nova.virt.hardware [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.362 183079 DEBUG nova.virt.hardware [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.362 183079 DEBUG nova.virt.hardware [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.363 183079 DEBUG nova.virt.hardware [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.363 183079 DEBUG nova.virt.hardware [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.363 183079 DEBUG nova.virt.hardware [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.363 183079 DEBUG nova.virt.hardware [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.364 183079 DEBUG nova.virt.hardware [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.364 183079 DEBUG nova.virt.hardware [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.364 183079 DEBUG nova.virt.hardware [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.369 183079 DEBUG nova.virt.libvirt.vif [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:26:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-06bfc5f7-2022213617',display_name='tempest-server-06bfc5f7-2022213617',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-06bfc5f7-2022213617',id=47,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHCLHRqMwuIHHegtVN/dBsFuYJfMhVe3L9LmEAUU1mvb3zBmzXJa8cLxmVJrl+X3/Ox/s4/8DkIlSNEMGQ0/B6pNriTMnK4hV1OfgcB6Su2v76kqiKHqmCcfWjFp6l6YnA==',key_name='tempest-keypair-test-1032085623',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6cc38cd108414e729395073de19dceae',ramdisk_id='',reservation_id='r-bitt3llt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestFloatingIPUpdate-1973932938',owner_user_name='tempest-TestFloatingIPUpdate-1973932938-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:26:52Z,user_data=None,user_id='841f48f635f54f619a9de1d6bbc8f832',uuid=b3d0d846-1f10-43c4-8e0c-9ba93967fdc7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "06bfc5f7-2163-4a40-87a7-050edd036a92", "address": "fa:16:3e:9e:13:c8", "network": {"id": "900bfc1d-c57a-4f7e-92bf-4e7a876cd570", "bridge": "br-int", "label": "tempest-test-network--1765746424", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "6cc38cd108414e729395073de19dceae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06bfc5f7-21", "ovs_interfaceid": "06bfc5f7-2163-4a40-87a7-050edd036a92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.369 183079 DEBUG nova.network.os_vif_util [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Converting VIF {"id": "06bfc5f7-2163-4a40-87a7-050edd036a92", "address": "fa:16:3e:9e:13:c8", "network": {"id": "900bfc1d-c57a-4f7e-92bf-4e7a876cd570", "bridge": "br-int", "label": "tempest-test-network--1765746424", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cc38cd108414e729395073de19dceae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06bfc5f7-21", "ovs_interfaceid": "06bfc5f7-2163-4a40-87a7-050edd036a92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.370 183079 DEBUG nova.network.os_vif_util [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9e:13:c8,bridge_name='br-int',has_traffic_filtering=True,id=06bfc5f7-2163-4a40-87a7-050edd036a92,network=Network(900bfc1d-c57a-4f7e-92bf-4e7a876cd570),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap06bfc5f7-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.371 183079 DEBUG nova.objects.instance [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lazy-loading 'pci_devices' on Instance uuid b3d0d846-1f10-43c4-8e0c-9ba93967fdc7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.547 183079 DEBUG nova.virt.libvirt.driver [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:26:56 compute-0 nova_compute[183075]:   <uuid>b3d0d846-1f10-43c4-8e0c-9ba93967fdc7</uuid>
Jan 22 17:26:56 compute-0 nova_compute[183075]:   <name>instance-0000002f</name>
Jan 22 17:26:56 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:26:56 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:26:56 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:26:56 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:       <nova:name>tempest-server-06bfc5f7-2022213617</nova:name>
Jan 22 17:26:56 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:26:56</nova:creationTime>
Jan 22 17:26:56 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:26:56 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:26:56 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:26:56 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:26:56 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:26:56 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:26:56 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:26:56 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:26:56 compute-0 nova_compute[183075]:         <nova:user uuid="841f48f635f54f619a9de1d6bbc8f832">tempest-TestFloatingIPUpdate-1973932938-project-member</nova:user>
Jan 22 17:26:56 compute-0 nova_compute[183075]:         <nova:project uuid="6cc38cd108414e729395073de19dceae">tempest-TestFloatingIPUpdate-1973932938</nova:project>
Jan 22 17:26:56 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:26:56 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:26:56 compute-0 nova_compute[183075]:         <nova:port uuid="06bfc5f7-2163-4a40-87a7-050edd036a92">
Jan 22 17:26:56 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:26:56 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:26:56 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:26:56 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <system>
Jan 22 17:26:56 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:26:56 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:26:56 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:26:56 compute-0 nova_compute[183075]:       <entry name="serial">b3d0d846-1f10-43c4-8e0c-9ba93967fdc7</entry>
Jan 22 17:26:56 compute-0 nova_compute[183075]:       <entry name="uuid">b3d0d846-1f10-43c4-8e0c-9ba93967fdc7</entry>
Jan 22 17:26:56 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     </system>
Jan 22 17:26:56 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:26:56 compute-0 nova_compute[183075]:   <os>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:   </os>
Jan 22 17:26:56 compute-0 nova_compute[183075]:   <features>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:   </features>
Jan 22 17:26:56 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:26:56 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:26:56 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:26:56 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/disk"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:26:56 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:9e:13:c8"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:       <target dev="tap06bfc5f7-21"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:26:56 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/console.log" append="off"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <video>
Jan 22 17:26:56 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     </video>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:26:56 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:26:56 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:26:56 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:26:56 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:26:56 compute-0 nova_compute[183075]: </domain>
Jan 22 17:26:56 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.548 183079 DEBUG nova.compute.manager [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Preparing to wait for external event network-vif-plugged-06bfc5f7-2163-4a40-87a7-050edd036a92 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.548 183079 DEBUG oslo_concurrency.lockutils [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Acquiring lock "b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.548 183079 DEBUG oslo_concurrency.lockutils [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.548 183079 DEBUG oslo_concurrency.lockutils [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.549 183079 DEBUG nova.virt.libvirt.vif [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:26:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-06bfc5f7-2022213617',display_name='tempest-server-06bfc5f7-2022213617',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-06bfc5f7-2022213617',id=47,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHCLHRqMwuIHHegtVN/dBsFuYJfMhVe3L9LmEAUU1mvb3zBmzXJa8cLxmVJrl+X3/Ox/s4/8DkIlSNEMGQ0/B6pNriTMnK4hV1OfgcB6Su2v76kqiKHqmCcfWjFp6l6YnA==',key_name='tempest-keypair-test-1032085623',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6cc38cd108414e729395073de19dceae',ramdisk_id='',reservation_id='r-bitt3llt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestFloatingIPUpdate-1973932938',owner_user_name='tempest-TestFloatingIPUpdate-1973932938-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:26:52Z,user_data=None,user_id='841f48f635f54f619a9de1d6bbc8f832',uuid=b3d0d846-1f10-43c4-8e0c-9ba93967fdc7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "06bfc5f7-2163-4a40-87a7-050edd036a92", "address": "fa:16:3e:9e:13:c8", "network": {"id": "900bfc1d-c57a-4f7e-92bf-4e7a876cd570", "bridge": "br-int", "label": "tempest-test-network--1765746424", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "6cc38cd108414e729395073de19dceae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06bfc5f7-21", "ovs_interfaceid": "06bfc5f7-2163-4a40-87a7-050edd036a92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.549 183079 DEBUG nova.network.os_vif_util [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Converting VIF {"id": "06bfc5f7-2163-4a40-87a7-050edd036a92", "address": "fa:16:3e:9e:13:c8", "network": {"id": "900bfc1d-c57a-4f7e-92bf-4e7a876cd570", "bridge": "br-int", "label": "tempest-test-network--1765746424", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cc38cd108414e729395073de19dceae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06bfc5f7-21", "ovs_interfaceid": "06bfc5f7-2163-4a40-87a7-050edd036a92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.550 183079 DEBUG nova.network.os_vif_util [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9e:13:c8,bridge_name='br-int',has_traffic_filtering=True,id=06bfc5f7-2163-4a40-87a7-050edd036a92,network=Network(900bfc1d-c57a-4f7e-92bf-4e7a876cd570),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap06bfc5f7-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.550 183079 DEBUG os_vif [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:13:c8,bridge_name='br-int',has_traffic_filtering=True,id=06bfc5f7-2163-4a40-87a7-050edd036a92,network=Network(900bfc1d-c57a-4f7e-92bf-4e7a876cd570),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap06bfc5f7-21') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.551 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.551 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.551 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.554 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.554 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap06bfc5f7-21, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.555 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap06bfc5f7-21, col_values=(('external_ids', {'iface-id': '06bfc5f7-2163-4a40-87a7-050edd036a92', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9e:13:c8', 'vm-uuid': 'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.556 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:56 compute-0 NetworkManager[55454]: <info>  [1769102816.5588] manager: (tap06bfc5f7-21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/221)
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.559 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.568 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.569 183079 INFO os_vif [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:13:c8,bridge_name='br-int',has_traffic_filtering=True,id=06bfc5f7-2163-4a40-87a7-050edd036a92,network=Network(900bfc1d-c57a-4f7e-92bf-4e7a876cd570),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap06bfc5f7-21')
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.612 183079 DEBUG nova.virt.libvirt.driver [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.613 183079 DEBUG nova.virt.libvirt.driver [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] No VIF found with MAC fa:16:3e:9e:13:c8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:26:56 compute-0 kernel: tap06bfc5f7-21: entered promiscuous mode
Jan 22 17:26:56 compute-0 NetworkManager[55454]: <info>  [1769102816.6794] manager: (tap06bfc5f7-21): new Tun device (/org/freedesktop/NetworkManager/Devices/222)
Jan 22 17:26:56 compute-0 ovn_controller[95372]: 2026-01-22T17:26:56Z|00517|binding|INFO|Claiming lport 06bfc5f7-2163-4a40-87a7-050edd036a92 for this chassis.
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.682 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:56 compute-0 ovn_controller[95372]: 2026-01-22T17:26:56Z|00518|binding|INFO|06bfc5f7-2163-4a40-87a7-050edd036a92: Claiming fa:16:3e:9e:13:c8 10.100.0.9
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.686 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.689 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:56.702 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9e:13:c8 10.100.0.9'], port_security=['fa:16:3e:9e:13:c8 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-900bfc1d-c57a-4f7e-92bf-4e7a876cd570', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6cc38cd108414e729395073de19dceae', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ae587f87-81d1-4ff6-8e92-98dcd41d2886', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2c590c1d-a36d-48b3-bcde-1b6ee74771df, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=06bfc5f7-2163-4a40-87a7-050edd036a92) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:26:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:56.703 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 06bfc5f7-2163-4a40-87a7-050edd036a92 in datapath 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 bound to our chassis
Jan 22 17:26:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:56.705 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 900bfc1d-c57a-4f7e-92bf-4e7a876cd570
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.717 183079 INFO nova.compute.manager [None req-028e48d6-f542-4748-8655-f5340dff0ce9 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Get console output
Jan 22 17:26:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:56.721 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[99339da6-ab50-4de1-99dd-2202da9bcdc0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:26:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:56.722 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap900bfc1d-c1 in ovnmeta-900bfc1d-c57a-4f7e-92bf-4e7a876cd570 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:26:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:56.724 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap900bfc1d-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:26:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:56.724 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[24f8b4d9-ec21-444b-b2e0-25e6b3624fb2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:26:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:56.726 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[337395ee-52b4-4b2d-960c-8c08e8627ce8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.727 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:26:56 compute-0 systemd-udevd[230506]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:26:56 compute-0 systemd-machined[154382]: New machine qemu-47-instance-0000002f.
Jan 22 17:26:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:56.743 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[d6356d40-d6d8-4f15-9c22-2c1f227929c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:26:56 compute-0 NetworkManager[55454]: <info>  [1769102816.7505] device (tap06bfc5f7-21): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:26:56 compute-0 NetworkManager[55454]: <info>  [1769102816.7513] device (tap06bfc5f7-21): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:26:56 compute-0 systemd[1]: Started Virtual Machine qemu-47-instance-0000002f.
Jan 22 17:26:56 compute-0 ovn_controller[95372]: 2026-01-22T17:26:56Z|00519|binding|INFO|Setting lport 06bfc5f7-2163-4a40-87a7-050edd036a92 ovn-installed in OVS
Jan 22 17:26:56 compute-0 ovn_controller[95372]: 2026-01-22T17:26:56Z|00520|binding|INFO|Setting lport 06bfc5f7-2163-4a40-87a7-050edd036a92 up in Southbound
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.773 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:56.778 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[5de0c252-4f7c-4b54-bd66-829a5d1218dd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:26:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:56.815 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[8c906856-b1e6-4762-9018-2d17403265b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:26:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:56.821 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f28511e1-952e-430f-8043-9ec706aeec2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:26:56 compute-0 NetworkManager[55454]: <info>  [1769102816.8232] manager: (tap900bfc1d-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/223)
Jan 22 17:26:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:56.852 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[656e3bca-f431-4a98-9880-6175d1f4deb8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:26:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:56.857 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[1bd24942-91d1-4cbf-91dd-9f2e5f37450b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:26:56 compute-0 NetworkManager[55454]: <info>  [1769102816.8836] device (tap900bfc1d-c0): carrier: link connected
Jan 22 17:26:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:56.889 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[19893972-a923-4523-83c8-82036db5c40f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:26:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:56.912 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[29634150-da04-4e78-b037-a91d66e65c03]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap900bfc1d-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b7:1d:49'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 149], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 518058, 'reachable_time': 23458, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230538, 'error': None, 'target': 'ovnmeta-900bfc1d-c57a-4f7e-92bf-4e7a876cd570', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:26:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:56.929 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3561cd44-56fc-4232-915a-ef44725cadbe]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb7:1d49'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 518058, 'tstamp': 518058}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230539, 'error': None, 'target': 'ovnmeta-900bfc1d-c57a-4f7e-92bf-4e7a876cd570', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.930 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:56.947 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a612a080-6b68-4a98-bc6c-6f266340ab6d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap900bfc1d-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b7:1d:49'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 149], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 518058, 'reachable_time': 23458, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230540, 'error': None, 'target': 'ovnmeta-900bfc1d-c57a-4f7e-92bf-4e7a876cd570', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.960 183079 DEBUG nova.compute.manager [req-182f762e-6680-4a1f-9987-98b53917e746 req-9308c3e6-d8f8-43c7-a4ad-9a3497f23bac a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Received event network-vif-plugged-06bfc5f7-2163-4a40-87a7-050edd036a92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.960 183079 DEBUG oslo_concurrency.lockutils [req-182f762e-6680-4a1f-9987-98b53917e746 req-9308c3e6-d8f8-43c7-a4ad-9a3497f23bac a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.960 183079 DEBUG oslo_concurrency.lockutils [req-182f762e-6680-4a1f-9987-98b53917e746 req-9308c3e6-d8f8-43c7-a4ad-9a3497f23bac a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.960 183079 DEBUG oslo_concurrency.lockutils [req-182f762e-6680-4a1f-9987-98b53917e746 req-9308c3e6-d8f8-43c7-a4ad-9a3497f23bac a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:26:56 compute-0 nova_compute[183075]: 2026-01-22 17:26:56.961 183079 DEBUG nova.compute.manager [req-182f762e-6680-4a1f-9987-98b53917e746 req-9308c3e6-d8f8-43c7-a4ad-9a3497f23bac a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Processing event network-vif-plugged-06bfc5f7-2163-4a40-87a7-050edd036a92 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:26:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:56.982 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e813a759-5d5a-4e6f-8ee5-6480e0c0a962]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:57.052 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[5f5e0467-2c9f-4a89-a05c-e4a041e21764]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:57.054 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap900bfc1d-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:57.055 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:57.056 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap900bfc1d-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.111 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:57 compute-0 NetworkManager[55454]: <info>  [1769102817.1124] manager: (tap900bfc1d-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/224)
Jan 22 17:26:57 compute-0 kernel: tap900bfc1d-c0: entered promiscuous mode
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:57.115 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap900bfc1d-c0, col_values=(('external_ids', {'iface-id': '23a8c53c-68ec-4f6e-8f99-d6d830063975'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:26:57 compute-0 ovn_controller[95372]: 2026-01-22T17:26:57Z|00521|binding|INFO|Releasing lport 23a8c53c-68ec-4f6e-8f99-d6d830063975 from this chassis (sb_readonly=0)
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.118 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.128 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:57.129 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/900bfc1d-c57a-4f7e-92bf-4e7a876cd570.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/900bfc1d-c57a-4f7e-92bf-4e7a876cd570.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:57.130 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[16e91ab6-864e-4935-8f28-2f616888198c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:57.131 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-900bfc1d-c57a-4f7e-92bf-4e7a876cd570
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/900bfc1d-c57a-4f7e-92bf-4e7a876cd570.pid.haproxy
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 900bfc1d-c57a-4f7e-92bf-4e7a876cd570
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:57.131 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-900bfc1d-c57a-4f7e-92bf-4e7a876cd570', 'env', 'PROCESS_TAG=haproxy-900bfc1d-c57a-4f7e-92bf-4e7a876cd570', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/900bfc1d-c57a-4f7e-92bf-4e7a876cd570.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:26:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:26:57.435 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.485 183079 INFO nova.compute.manager [None req-721cad91-052b-4f4f-b54b-3f3b481c2c21 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Get console output
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.492 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:26:57 compute-0 podman[230573]: 2026-01-22 17:26:57.546985742 +0000 UTC m=+0.063534038 container create 1f55cc3755c88522aa06ec6e483372559c981a69e468b8dcaacf2188595136b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-900bfc1d-c57a-4f7e-92bf-4e7a876cd570, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.566 183079 DEBUG nova.network.neutron [req-cce4e63c-d24c-4de2-8738-13d14167c881 req-bcd850af-bb72-4728-a054-1775e3254ac3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Updated VIF entry in instance network info cache for port 06bfc5f7-2163-4a40-87a7-050edd036a92. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.566 183079 DEBUG nova.network.neutron [req-cce4e63c-d24c-4de2-8738-13d14167c881 req-bcd850af-bb72-4728-a054-1775e3254ac3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Updating instance_info_cache with network_info: [{"id": "06bfc5f7-2163-4a40-87a7-050edd036a92", "address": "fa:16:3e:9e:13:c8", "network": {"id": "900bfc1d-c57a-4f7e-92bf-4e7a876cd570", "bridge": "br-int", "label": "tempest-test-network--1765746424", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cc38cd108414e729395073de19dceae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06bfc5f7-21", "ovs_interfaceid": "06bfc5f7-2163-4a40-87a7-050edd036a92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:26:57 compute-0 systemd[1]: Started libpod-conmon-1f55cc3755c88522aa06ec6e483372559c981a69e468b8dcaacf2188595136b6.scope.
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.588 183079 DEBUG oslo_concurrency.lockutils [req-cce4e63c-d24c-4de2-8738-13d14167c881 req-bcd850af-bb72-4728-a054-1775e3254ac3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-b3d0d846-1f10-43c4-8e0c-9ba93967fdc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:26:57 compute-0 podman[230573]: 2026-01-22 17:26:57.51322752 +0000 UTC m=+0.029775856 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:26:57 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:26:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/05a8afac8db34f01db264fd3ed28aa86e59baff82ee583fd8bd64b17abd9488d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.620 183079 DEBUG nova.compute.manager [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.621 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102817.6201184, b3d0d846-1f10-43c4-8e0c-9ba93967fdc7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.622 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] VM Started (Lifecycle Event)
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.630 183079 DEBUG nova.virt.libvirt.driver [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:26:57 compute-0 podman[230573]: 2026-01-22 17:26:57.637028698 +0000 UTC m=+0.153577014 container init 1f55cc3755c88522aa06ec6e483372559c981a69e468b8dcaacf2188595136b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-900bfc1d-c57a-4f7e-92bf-4e7a876cd570, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.638 183079 INFO nova.virt.libvirt.driver [-] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Instance spawned successfully.
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.638 183079 DEBUG nova.virt.libvirt.driver [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:26:57 compute-0 podman[230573]: 2026-01-22 17:26:57.644419335 +0000 UTC m=+0.160967631 container start 1f55cc3755c88522aa06ec6e483372559c981a69e468b8dcaacf2188595136b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-900bfc1d-c57a-4f7e-92bf-4e7a876cd570, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.655 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.661 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.664 183079 DEBUG nova.virt.libvirt.driver [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:26:57 compute-0 podman[230592]: 2026-01-22 17:26:57.664866112 +0000 UTC m=+0.078724655 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.664 183079 DEBUG nova.virt.libvirt.driver [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.665 183079 DEBUG nova.virt.libvirt.driver [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.665 183079 DEBUG nova.virt.libvirt.driver [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.666 183079 DEBUG nova.virt.libvirt.driver [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.666 183079 DEBUG nova.virt.libvirt.driver [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:26:57 compute-0 neutron-haproxy-ovnmeta-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230595]: [NOTICE]   (230618) : New worker (230621) forked
Jan 22 17:26:57 compute-0 neutron-haproxy-ovnmeta-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230595]: [NOTICE]   (230618) : Loading success.
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.690 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.690 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102817.621485, b3d0d846-1f10-43c4-8e0c-9ba93967fdc7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.692 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] VM Paused (Lifecycle Event)
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.715 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.718 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102817.6287212, b3d0d846-1f10-43c4-8e0c-9ba93967fdc7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.718 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] VM Resumed (Lifecycle Event)
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.729 183079 INFO nova.compute.manager [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Took 5.43 seconds to spawn the instance on the hypervisor.
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.730 183079 DEBUG nova.compute.manager [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.737 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.740 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.769 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.786 183079 INFO nova.compute.manager [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Took 5.93 seconds to build instance.
Jan 22 17:26:57 compute-0 nova_compute[183075]: 2026-01-22 17:26:57.803 183079 DEBUG oslo_concurrency.lockutils [None req-46722477-408b-4dc0-88c4-aeaa4bdb3961 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "b3d0d846-1f10-43c4-8e0c-9ba93967fdc7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.030s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:26:59 compute-0 nova_compute[183075]: 2026-01-22 17:26:59.166 183079 DEBUG nova.compute.manager [req-5c34399d-d4b6-4907-9427-4e74beb1139b req-3a847871-ff59-4d3c-bd2d-978080b63057 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Received event network-vif-plugged-06bfc5f7-2163-4a40-87a7-050edd036a92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:26:59 compute-0 nova_compute[183075]: 2026-01-22 17:26:59.166 183079 DEBUG oslo_concurrency.lockutils [req-5c34399d-d4b6-4907-9427-4e74beb1139b req-3a847871-ff59-4d3c-bd2d-978080b63057 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:26:59 compute-0 nova_compute[183075]: 2026-01-22 17:26:59.166 183079 DEBUG oslo_concurrency.lockutils [req-5c34399d-d4b6-4907-9427-4e74beb1139b req-3a847871-ff59-4d3c-bd2d-978080b63057 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:26:59 compute-0 nova_compute[183075]: 2026-01-22 17:26:59.167 183079 DEBUG oslo_concurrency.lockutils [req-5c34399d-d4b6-4907-9427-4e74beb1139b req-3a847871-ff59-4d3c-bd2d-978080b63057 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:26:59 compute-0 nova_compute[183075]: 2026-01-22 17:26:59.167 183079 DEBUG nova.compute.manager [req-5c34399d-d4b6-4907-9427-4e74beb1139b req-3a847871-ff59-4d3c-bd2d-978080b63057 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] No waiting events found dispatching network-vif-plugged-06bfc5f7-2163-4a40-87a7-050edd036a92 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:26:59 compute-0 nova_compute[183075]: 2026-01-22 17:26:59.167 183079 WARNING nova.compute.manager [req-5c34399d-d4b6-4907-9427-4e74beb1139b req-3a847871-ff59-4d3c-bd2d-978080b63057 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Received unexpected event network-vif-plugged-06bfc5f7-2163-4a40-87a7-050edd036a92 for instance with vm_state active and task_state None.
Jan 22 17:26:59 compute-0 nova_compute[183075]: 2026-01-22 17:26:59.469 183079 INFO nova.compute.manager [None req-89d4c8ba-52c4-4899-8305-f27ecb9bd43a 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Get console output
Jan 22 17:27:01 compute-0 nova_compute[183075]: 2026-01-22 17:27:01.557 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:01 compute-0 nova_compute[183075]: 2026-01-22 17:27:01.932 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:04 compute-0 nova_compute[183075]: 2026-01-22 17:27:04.594 183079 INFO nova.compute.manager [None req-4f6ba740-6b03-4e37-91f0-81dd57a3ed2d 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Get console output
Jan 22 17:27:04 compute-0 nova_compute[183075]: 2026-01-22 17:27:04.601 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:27:05 compute-0 nova_compute[183075]: 2026-01-22 17:27:05.692 183079 DEBUG oslo_concurrency.lockutils [None req-06a3018f-aba2-4eee-9735-a7c0047f994f 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "7148e02d-0822-41cf-b2f4-41ccec2a2fe4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:27:05 compute-0 nova_compute[183075]: 2026-01-22 17:27:05.693 183079 DEBUG oslo_concurrency.lockutils [None req-06a3018f-aba2-4eee-9735-a7c0047f994f 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "7148e02d-0822-41cf-b2f4-41ccec2a2fe4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:27:05 compute-0 nova_compute[183075]: 2026-01-22 17:27:05.693 183079 DEBUG oslo_concurrency.lockutils [None req-06a3018f-aba2-4eee-9735-a7c0047f994f 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "7148e02d-0822-41cf-b2f4-41ccec2a2fe4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:27:05 compute-0 nova_compute[183075]: 2026-01-22 17:27:05.694 183079 DEBUG oslo_concurrency.lockutils [None req-06a3018f-aba2-4eee-9735-a7c0047f994f 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "7148e02d-0822-41cf-b2f4-41ccec2a2fe4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:27:05 compute-0 nova_compute[183075]: 2026-01-22 17:27:05.694 183079 DEBUG oslo_concurrency.lockutils [None req-06a3018f-aba2-4eee-9735-a7c0047f994f 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "7148e02d-0822-41cf-b2f4-41ccec2a2fe4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:27:05 compute-0 nova_compute[183075]: 2026-01-22 17:27:05.695 183079 INFO nova.compute.manager [None req-06a3018f-aba2-4eee-9735-a7c0047f994f 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Terminating instance
Jan 22 17:27:05 compute-0 nova_compute[183075]: 2026-01-22 17:27:05.696 183079 DEBUG nova.compute.manager [None req-06a3018f-aba2-4eee-9735-a7c0047f994f 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:27:05 compute-0 kernel: tap7cdbd897-94 (unregistering): left promiscuous mode
Jan 22 17:27:05 compute-0 NetworkManager[55454]: <info>  [1769102825.7174] device (tap7cdbd897-94): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:27:05 compute-0 ovn_controller[95372]: 2026-01-22T17:27:05Z|00522|binding|INFO|Releasing lport 7cdbd897-944c-4b6f-980e-c220cf2c2532 from this chassis (sb_readonly=0)
Jan 22 17:27:05 compute-0 ovn_controller[95372]: 2026-01-22T17:27:05Z|00523|binding|INFO|Setting lport 7cdbd897-944c-4b6f-980e-c220cf2c2532 down in Southbound
Jan 22 17:27:05 compute-0 nova_compute[183075]: 2026-01-22 17:27:05.725 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:05 compute-0 ovn_controller[95372]: 2026-01-22T17:27:05Z|00524|binding|INFO|Removing iface tap7cdbd897-94 ovn-installed in OVS
Jan 22 17:27:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:05.734 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0a:5f:4d 10.100.0.13'], port_security=['fa:16:3e:0a:5f:4d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '7148e02d-0822-41cf-b2f4-41ccec2a2fe4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'neutron:revision_number': '4', 'neutron:security_group_ids': '94e3530f-8012-4817-a338-7919b109ef3e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12343ce0-7cef-4f7f-9439-6550d878d4ba, chassis=[], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=7cdbd897-944c-4b6f-980e-c220cf2c2532) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:27:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:05.735 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 7cdbd897-944c-4b6f-980e-c220cf2c2532 in datapath 44326f3c-1431-44d6-85ce-61ecbbb5ed7a unbound from our chassis
Jan 22 17:27:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:05.737 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 44326f3c-1431-44d6-85ce-61ecbbb5ed7a
Jan 22 17:27:05 compute-0 nova_compute[183075]: 2026-01-22 17:27:05.736 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:05 compute-0 nova_compute[183075]: 2026-01-22 17:27:05.745 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:05.755 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[66f47033-941a-4774-91f7-932941e99c3f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:27:05 compute-0 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d0000002e.scope: Deactivated successfully.
Jan 22 17:27:05 compute-0 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d0000002e.scope: Consumed 15.090s CPU time.
Jan 22 17:27:05 compute-0 systemd-machined[154382]: Machine qemu-46-instance-0000002e terminated.
Jan 22 17:27:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:05.782 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[8a69b377-4e4d-4944-ba6d-8d8ddc0f2f39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:27:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:05.786 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[5b04ecf7-2ffc-4715-ba15-56aeea5f87be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:27:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:05.813 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[d33c043e-9cae-44d5-9810-32c7ca2fa415]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:27:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:05.831 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b1edf27c-a73f-4dde-9b38-cddaeb1589a5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap44326f3c-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:1b:89'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 202, 'tx_packets': 107, 'rx_bytes': 17308, 'tx_bytes': 12105, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 202, 'tx_packets': 107, 'rx_bytes': 17308, 'tx_bytes': 12105, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 143], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 505447, 'reachable_time': 18257, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230643, 'error': None, 'target': 'ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:27:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:05.844 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3aaebb9a-e24b-4840-915c-43bb7a13e629]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap44326f3c-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 505462, 'tstamp': 505462}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230644, 'error': None, 'target': 'ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap44326f3c-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 505465, 'tstamp': 505465}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230644, 'error': None, 'target': 'ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:27:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:05.846 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44326f3c-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:27:05 compute-0 nova_compute[183075]: 2026-01-22 17:27:05.879 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:05 compute-0 nova_compute[183075]: 2026-01-22 17:27:05.883 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:05.883 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap44326f3c-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:27:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:05.883 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:27:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:05.884 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap44326f3c-10, col_values=(('external_ids', {'iface-id': '118957e0-7da0-4d87-b7d4-2c204e19e5b6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:27:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:05.884 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:27:05 compute-0 nova_compute[183075]: 2026-01-22 17:27:05.919 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:05 compute-0 nova_compute[183075]: 2026-01-22 17:27:05.953 183079 INFO nova.virt.libvirt.driver [-] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Instance destroyed successfully.
Jan 22 17:27:05 compute-0 nova_compute[183075]: 2026-01-22 17:27:05.953 183079 DEBUG nova.objects.instance [None req-06a3018f-aba2-4eee-9735-a7c0047f994f 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lazy-loading 'resources' on Instance uuid 7148e02d-0822-41cf-b2f4-41ccec2a2fe4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:27:05 compute-0 nova_compute[183075]: 2026-01-22 17:27:05.968 183079 DEBUG nova.virt.libvirt.vif [None req-06a3018f-aba2-4eee-9735-a7c0047f994f 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:25:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-2-1964744070',display_name='tempest-server-2-1964744070',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-2-1964744070',id=46,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMynqvoXBI34qH0ZDQNq01ZPmk41Xdi4JxkvNO4nJ7GrdggfhpXkKQhhUZRV3fv3bSwcXMqi04ipV9xe7IsTm4GBRV+U9o6VN5JSMLulp+KIvjPuEmjq0h65ra09/zQB9w==',key_name='tempest-keypair-test-1762910261',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:25:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e4c0bb18013747dfad2e25b2495090eb',ramdisk_id='',reservation_id='r-p7dnx8q0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-PortForwardingTestJSON-1240706675',owner_user_name='tempest-PortForwardingTestJSON-1240706675-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:25:57Z,user_data=None,user_id='852aea4e08344f39ae07e6b57393c767',uuid=7148e02d-0822-41cf-b2f4-41ccec2a2fe4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7cdbd897-944c-4b6f-980e-c220cf2c2532", "address": "fa:16:3e:0a:5f:4d", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cdbd897-94", "ovs_interfaceid": "7cdbd897-944c-4b6f-980e-c220cf2c2532", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:27:05 compute-0 nova_compute[183075]: 2026-01-22 17:27:05.968 183079 DEBUG nova.network.os_vif_util [None req-06a3018f-aba2-4eee-9735-a7c0047f994f 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Converting VIF {"id": "7cdbd897-944c-4b6f-980e-c220cf2c2532", "address": "fa:16:3e:0a:5f:4d", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cdbd897-94", "ovs_interfaceid": "7cdbd897-944c-4b6f-980e-c220cf2c2532", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:27:05 compute-0 nova_compute[183075]: 2026-01-22 17:27:05.969 183079 DEBUG nova.network.os_vif_util [None req-06a3018f-aba2-4eee-9735-a7c0047f994f 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0a:5f:4d,bridge_name='br-int',has_traffic_filtering=True,id=7cdbd897-944c-4b6f-980e-c220cf2c2532,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7cdbd897-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:27:05 compute-0 nova_compute[183075]: 2026-01-22 17:27:05.969 183079 DEBUG os_vif [None req-06a3018f-aba2-4eee-9735-a7c0047f994f 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:5f:4d,bridge_name='br-int',has_traffic_filtering=True,id=7cdbd897-944c-4b6f-980e-c220cf2c2532,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7cdbd897-94') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:27:05 compute-0 nova_compute[183075]: 2026-01-22 17:27:05.970 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:05 compute-0 nova_compute[183075]: 2026-01-22 17:27:05.971 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7cdbd897-94, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:27:05 compute-0 nova_compute[183075]: 2026-01-22 17:27:05.972 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:05 compute-0 nova_compute[183075]: 2026-01-22 17:27:05.973 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:05 compute-0 nova_compute[183075]: 2026-01-22 17:27:05.975 183079 INFO os_vif [None req-06a3018f-aba2-4eee-9735-a7c0047f994f 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:5f:4d,bridge_name='br-int',has_traffic_filtering=True,id=7cdbd897-944c-4b6f-980e-c220cf2c2532,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7cdbd897-94')
Jan 22 17:27:05 compute-0 nova_compute[183075]: 2026-01-22 17:27:05.975 183079 INFO nova.virt.libvirt.driver [None req-06a3018f-aba2-4eee-9735-a7c0047f994f 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Deleting instance files /var/lib/nova/instances/7148e02d-0822-41cf-b2f4-41ccec2a2fe4_del
Jan 22 17:27:05 compute-0 nova_compute[183075]: 2026-01-22 17:27:05.976 183079 INFO nova.virt.libvirt.driver [None req-06a3018f-aba2-4eee-9735-a7c0047f994f 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Deletion of /var/lib/nova/instances/7148e02d-0822-41cf-b2f4-41ccec2a2fe4_del complete
Jan 22 17:27:06 compute-0 nova_compute[183075]: 2026-01-22 17:27:06.019 183079 INFO nova.compute.manager [None req-06a3018f-aba2-4eee-9735-a7c0047f994f 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Took 0.32 seconds to destroy the instance on the hypervisor.
Jan 22 17:27:06 compute-0 nova_compute[183075]: 2026-01-22 17:27:06.019 183079 DEBUG oslo.service.loopingcall [None req-06a3018f-aba2-4eee-9735-a7c0047f994f 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:27:06 compute-0 nova_compute[183075]: 2026-01-22 17:27:06.020 183079 DEBUG nova.compute.manager [-] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:27:06 compute-0 nova_compute[183075]: 2026-01-22 17:27:06.020 183079 DEBUG nova.network.neutron [-] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:27:06 compute-0 nova_compute[183075]: 2026-01-22 17:27:06.443 183079 DEBUG nova.compute.manager [req-0f738c61-4f5e-436e-ac4a-6a664c6063ec req-e70f8e16-3e57-4301-b5e3-132dab4d1bbf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Received event network-vif-unplugged-7cdbd897-944c-4b6f-980e-c220cf2c2532 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:27:06 compute-0 nova_compute[183075]: 2026-01-22 17:27:06.444 183079 DEBUG oslo_concurrency.lockutils [req-0f738c61-4f5e-436e-ac4a-6a664c6063ec req-e70f8e16-3e57-4301-b5e3-132dab4d1bbf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "7148e02d-0822-41cf-b2f4-41ccec2a2fe4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:27:06 compute-0 nova_compute[183075]: 2026-01-22 17:27:06.444 183079 DEBUG oslo_concurrency.lockutils [req-0f738c61-4f5e-436e-ac4a-6a664c6063ec req-e70f8e16-3e57-4301-b5e3-132dab4d1bbf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7148e02d-0822-41cf-b2f4-41ccec2a2fe4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:27:06 compute-0 nova_compute[183075]: 2026-01-22 17:27:06.445 183079 DEBUG oslo_concurrency.lockutils [req-0f738c61-4f5e-436e-ac4a-6a664c6063ec req-e70f8e16-3e57-4301-b5e3-132dab4d1bbf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7148e02d-0822-41cf-b2f4-41ccec2a2fe4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:27:06 compute-0 nova_compute[183075]: 2026-01-22 17:27:06.445 183079 DEBUG nova.compute.manager [req-0f738c61-4f5e-436e-ac4a-6a664c6063ec req-e70f8e16-3e57-4301-b5e3-132dab4d1bbf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] No waiting events found dispatching network-vif-unplugged-7cdbd897-944c-4b6f-980e-c220cf2c2532 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:27:06 compute-0 nova_compute[183075]: 2026-01-22 17:27:06.446 183079 DEBUG nova.compute.manager [req-0f738c61-4f5e-436e-ac4a-6a664c6063ec req-e70f8e16-3e57-4301-b5e3-132dab4d1bbf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Received event network-vif-unplugged-7cdbd897-944c-4b6f-980e-c220cf2c2532 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:27:06 compute-0 nova_compute[183075]: 2026-01-22 17:27:06.997 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:07 compute-0 podman[230660]: 2026-01-22 17:27:07.346389599 +0000 UTC m=+0.057408885 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 17:27:07 compute-0 nova_compute[183075]: 2026-01-22 17:27:07.616 183079 DEBUG nova.network.neutron [-] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:27:07 compute-0 nova_compute[183075]: 2026-01-22 17:27:07.652 183079 INFO nova.compute.manager [-] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Took 1.63 seconds to deallocate network for instance.
Jan 22 17:27:07 compute-0 nova_compute[183075]: 2026-01-22 17:27:07.731 183079 DEBUG oslo_concurrency.lockutils [None req-06a3018f-aba2-4eee-9735-a7c0047f994f 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:27:07 compute-0 nova_compute[183075]: 2026-01-22 17:27:07.732 183079 DEBUG oslo_concurrency.lockutils [None req-06a3018f-aba2-4eee-9735-a7c0047f994f 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:27:07 compute-0 nova_compute[183075]: 2026-01-22 17:27:07.811 183079 DEBUG nova.compute.provider_tree [None req-06a3018f-aba2-4eee-9735-a7c0047f994f 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:27:07 compute-0 nova_compute[183075]: 2026-01-22 17:27:07.832 183079 DEBUG nova.scheduler.client.report [None req-06a3018f-aba2-4eee-9735-a7c0047f994f 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:27:07 compute-0 nova_compute[183075]: 2026-01-22 17:27:07.923 183079 DEBUG oslo_concurrency.lockutils [None req-06a3018f-aba2-4eee-9735-a7c0047f994f 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.191s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:27:07 compute-0 nova_compute[183075]: 2026-01-22 17:27:07.947 183079 INFO nova.scheduler.client.report [None req-06a3018f-aba2-4eee-9735-a7c0047f994f 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Deleted allocations for instance 7148e02d-0822-41cf-b2f4-41ccec2a2fe4
Jan 22 17:27:08 compute-0 nova_compute[183075]: 2026-01-22 17:27:08.010 183079 DEBUG oslo_concurrency.lockutils [None req-06a3018f-aba2-4eee-9735-a7c0047f994f 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "7148e02d-0822-41cf-b2f4-41ccec2a2fe4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.317s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:27:08 compute-0 nova_compute[183075]: 2026-01-22 17:27:08.768 183079 DEBUG nova.compute.manager [req-36cb6e3e-9af8-47f4-81e6-046727f612f3 req-cd7d93b6-a987-4490-a7f3-50e84ff2f5ae a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Received event network-vif-plugged-7cdbd897-944c-4b6f-980e-c220cf2c2532 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:27:08 compute-0 nova_compute[183075]: 2026-01-22 17:27:08.768 183079 DEBUG oslo_concurrency.lockutils [req-36cb6e3e-9af8-47f4-81e6-046727f612f3 req-cd7d93b6-a987-4490-a7f3-50e84ff2f5ae a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "7148e02d-0822-41cf-b2f4-41ccec2a2fe4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:27:08 compute-0 nova_compute[183075]: 2026-01-22 17:27:08.769 183079 DEBUG oslo_concurrency.lockutils [req-36cb6e3e-9af8-47f4-81e6-046727f612f3 req-cd7d93b6-a987-4490-a7f3-50e84ff2f5ae a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7148e02d-0822-41cf-b2f4-41ccec2a2fe4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:27:08 compute-0 nova_compute[183075]: 2026-01-22 17:27:08.769 183079 DEBUG oslo_concurrency.lockutils [req-36cb6e3e-9af8-47f4-81e6-046727f612f3 req-cd7d93b6-a987-4490-a7f3-50e84ff2f5ae a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7148e02d-0822-41cf-b2f4-41ccec2a2fe4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:27:08 compute-0 nova_compute[183075]: 2026-01-22 17:27:08.769 183079 DEBUG nova.compute.manager [req-36cb6e3e-9af8-47f4-81e6-046727f612f3 req-cd7d93b6-a987-4490-a7f3-50e84ff2f5ae a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] No waiting events found dispatching network-vif-plugged-7cdbd897-944c-4b6f-980e-c220cf2c2532 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:27:08 compute-0 nova_compute[183075]: 2026-01-22 17:27:08.769 183079 WARNING nova.compute.manager [req-36cb6e3e-9af8-47f4-81e6-046727f612f3 req-cd7d93b6-a987-4490-a7f3-50e84ff2f5ae a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Received unexpected event network-vif-plugged-7cdbd897-944c-4b6f-980e-c220cf2c2532 for instance with vm_state deleted and task_state None.
Jan 22 17:27:08 compute-0 nova_compute[183075]: 2026-01-22 17:27:08.966 183079 DEBUG oslo_concurrency.lockutils [None req-fc6d463b-9e54-4e78-bf74-a35a087be477 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "a6598da5-2e3d-4ca1-90ab-2a8db7241468" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:27:08 compute-0 nova_compute[183075]: 2026-01-22 17:27:08.967 183079 DEBUG oslo_concurrency.lockutils [None req-fc6d463b-9e54-4e78-bf74-a35a087be477 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "a6598da5-2e3d-4ca1-90ab-2a8db7241468" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:27:08 compute-0 nova_compute[183075]: 2026-01-22 17:27:08.967 183079 DEBUG oslo_concurrency.lockutils [None req-fc6d463b-9e54-4e78-bf74-a35a087be477 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "a6598da5-2e3d-4ca1-90ab-2a8db7241468-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:27:08 compute-0 nova_compute[183075]: 2026-01-22 17:27:08.967 183079 DEBUG oslo_concurrency.lockutils [None req-fc6d463b-9e54-4e78-bf74-a35a087be477 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "a6598da5-2e3d-4ca1-90ab-2a8db7241468-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:27:08 compute-0 nova_compute[183075]: 2026-01-22 17:27:08.967 183079 DEBUG oslo_concurrency.lockutils [None req-fc6d463b-9e54-4e78-bf74-a35a087be477 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "a6598da5-2e3d-4ca1-90ab-2a8db7241468-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:27:08 compute-0 nova_compute[183075]: 2026-01-22 17:27:08.968 183079 INFO nova.compute.manager [None req-fc6d463b-9e54-4e78-bf74-a35a087be477 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Terminating instance
Jan 22 17:27:08 compute-0 nova_compute[183075]: 2026-01-22 17:27:08.969 183079 DEBUG nova.compute.manager [None req-fc6d463b-9e54-4e78-bf74-a35a087be477 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:27:08 compute-0 kernel: tapff4c20a1-cc (unregistering): left promiscuous mode
Jan 22 17:27:09 compute-0 NetworkManager[55454]: <info>  [1769102829.0058] device (tapff4c20a1-cc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:27:09 compute-0 ovn_controller[95372]: 2026-01-22T17:27:09Z|00525|binding|INFO|Releasing lport ff4c20a1-cc0e-4a39-80b4-bb1426093c82 from this chassis (sb_readonly=0)
Jan 22 17:27:09 compute-0 ovn_controller[95372]: 2026-01-22T17:27:09Z|00526|binding|INFO|Setting lport ff4c20a1-cc0e-4a39-80b4-bb1426093c82 down in Southbound
Jan 22 17:27:09 compute-0 ovn_controller[95372]: 2026-01-22T17:27:09Z|00527|binding|INFO|Removing iface tapff4c20a1-cc ovn-installed in OVS
Jan 22 17:27:09 compute-0 nova_compute[183075]: 2026-01-22 17:27:09.011 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:09.020 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:19:a3 10.100.0.12'], port_security=['fa:16:3e:41:19:a3 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'a6598da5-2e3d-4ca1-90ab-2a8db7241468', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e4c0bb18013747dfad2e25b2495090eb', 'neutron:revision_number': '4', 'neutron:security_group_ids': '94e3530f-8012-4817-a338-7919b109ef3e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12343ce0-7cef-4f7f-9439-6550d878d4ba, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=ff4c20a1-cc0e-4a39-80b4-bb1426093c82) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:27:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:09.021 104629 INFO neutron.agent.ovn.metadata.agent [-] Port ff4c20a1-cc0e-4a39-80b4-bb1426093c82 in datapath 44326f3c-1431-44d6-85ce-61ecbbb5ed7a unbound from our chassis
Jan 22 17:27:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:09.022 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 44326f3c-1431-44d6-85ce-61ecbbb5ed7a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:27:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:09.023 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[12498442-19f9-4e3c-90ec-47bf91edb7b3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:27:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:09.026 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a namespace which is not needed anymore
Jan 22 17:27:09 compute-0 nova_compute[183075]: 2026-01-22 17:27:09.045 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:09 compute-0 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d0000002c.scope: Deactivated successfully.
Jan 22 17:27:09 compute-0 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d0000002c.scope: Consumed 17.983s CPU time.
Jan 22 17:27:09 compute-0 systemd-machined[154382]: Machine qemu-44-instance-0000002c terminated.
Jan 22 17:27:09 compute-0 neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229558]: [NOTICE]   (229562) : haproxy version is 2.8.14-c23fe91
Jan 22 17:27:09 compute-0 neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229558]: [NOTICE]   (229562) : path to executable is /usr/sbin/haproxy
Jan 22 17:27:09 compute-0 neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229558]: [WARNING]  (229562) : Exiting Master process...
Jan 22 17:27:09 compute-0 neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229558]: [ALERT]    (229562) : Current worker (229564) exited with code 143 (Terminated)
Jan 22 17:27:09 compute-0 neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a[229558]: [WARNING]  (229562) : All workers exited. Exiting... (0)
Jan 22 17:27:09 compute-0 systemd[1]: libpod-279272a5e1fa0de3698abf8d8f92a1e99d0514fd276f8ddd35ac6051459072ae.scope: Deactivated successfully.
Jan 22 17:27:09 compute-0 nova_compute[183075]: 2026-01-22 17:27:09.192 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:09 compute-0 nova_compute[183075]: 2026-01-22 17:27:09.196 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:09 compute-0 podman[230717]: 2026-01-22 17:27:09.197454773 +0000 UTC m=+0.051096266 container died 279272a5e1fa0de3698abf8d8f92a1e99d0514fd276f8ddd35ac6051459072ae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:27:09 compute-0 nova_compute[183075]: 2026-01-22 17:27:09.228 183079 INFO nova.virt.libvirt.driver [-] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Instance destroyed successfully.
Jan 22 17:27:09 compute-0 nova_compute[183075]: 2026-01-22 17:27:09.228 183079 DEBUG nova.objects.instance [None req-fc6d463b-9e54-4e78-bf74-a35a087be477 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lazy-loading 'resources' on Instance uuid a6598da5-2e3d-4ca1-90ab-2a8db7241468 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:27:09 compute-0 nova_compute[183075]: 2026-01-22 17:27:09.246 183079 DEBUG nova.virt.libvirt.vif [None req-fc6d463b-9e54-4e78-bf74-a35a087be477 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:24:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-1-468539916',display_name='tempest-server-1-468539916',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-1-468539916',id=44,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMynqvoXBI34qH0ZDQNq01ZPmk41Xdi4JxkvNO4nJ7GrdggfhpXkKQhhUZRV3fv3bSwcXMqi04ipV9xe7IsTm4GBRV+U9o6VN5JSMLulp+KIvjPuEmjq0h65ra09/zQB9w==',key_name='tempest-keypair-test-1762910261',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:24:52Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e4c0bb18013747dfad2e25b2495090eb',ramdisk_id='',reservation_id='r-tqk4zn7k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus=
'usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-PortForwardingTestJSON-1240706675',owner_user_name='tempest-PortForwardingTestJSON-1240706675-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:24:52Z,user_data=None,user_id='852aea4e08344f39ae07e6b57393c767',uuid=a6598da5-2e3d-4ca1-90ab-2a8db7241468,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ff4c20a1-cc0e-4a39-80b4-bb1426093c82", "address": "fa:16:3e:41:19:a3", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff4c20a1-cc", "ovs_interfaceid": "ff4c20a1-cc0e-4a39-80b4-bb1426093c82", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:27:09 compute-0 nova_compute[183075]: 2026-01-22 17:27:09.247 183079 DEBUG nova.network.os_vif_util [None req-fc6d463b-9e54-4e78-bf74-a35a087be477 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Converting VIF {"id": "ff4c20a1-cc0e-4a39-80b4-bb1426093c82", "address": "fa:16:3e:41:19:a3", "network": {"id": "44326f3c-1431-44d6-85ce-61ecbbb5ed7a", "bridge": "br-int", "label": "tempest-test-network--1807771683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e4c0bb18013747dfad2e25b2495090eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff4c20a1-cc", "ovs_interfaceid": "ff4c20a1-cc0e-4a39-80b4-bb1426093c82", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:27:09 compute-0 nova_compute[183075]: 2026-01-22 17:27:09.247 183079 DEBUG nova.network.os_vif_util [None req-fc6d463b-9e54-4e78-bf74-a35a087be477 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:41:19:a3,bridge_name='br-int',has_traffic_filtering=True,id=ff4c20a1-cc0e-4a39-80b4-bb1426093c82,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapff4c20a1-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:27:09 compute-0 nova_compute[183075]: 2026-01-22 17:27:09.248 183079 DEBUG os_vif [None req-fc6d463b-9e54-4e78-bf74-a35a087be477 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:41:19:a3,bridge_name='br-int',has_traffic_filtering=True,id=ff4c20a1-cc0e-4a39-80b4-bb1426093c82,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapff4c20a1-cc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:27:09 compute-0 nova_compute[183075]: 2026-01-22 17:27:09.249 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:09 compute-0 nova_compute[183075]: 2026-01-22 17:27:09.249 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapff4c20a1-cc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:27:09 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-279272a5e1fa0de3698abf8d8f92a1e99d0514fd276f8ddd35ac6051459072ae-userdata-shm.mount: Deactivated successfully.
Jan 22 17:27:09 compute-0 nova_compute[183075]: 2026-01-22 17:27:09.253 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:09 compute-0 systemd[1]: var-lib-containers-storage-overlay-3e89833e8b875aa10e7cd22530c38c4380399c9bde1e944c194d71e9398d8822-merged.mount: Deactivated successfully.
Jan 22 17:27:09 compute-0 nova_compute[183075]: 2026-01-22 17:27:09.255 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:27:09 compute-0 nova_compute[183075]: 2026-01-22 17:27:09.258 183079 INFO os_vif [None req-fc6d463b-9e54-4e78-bf74-a35a087be477 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:41:19:a3,bridge_name='br-int',has_traffic_filtering=True,id=ff4c20a1-cc0e-4a39-80b4-bb1426093c82,network=Network(44326f3c-1431-44d6-85ce-61ecbbb5ed7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapff4c20a1-cc')
Jan 22 17:27:09 compute-0 nova_compute[183075]: 2026-01-22 17:27:09.258 183079 INFO nova.virt.libvirt.driver [None req-fc6d463b-9e54-4e78-bf74-a35a087be477 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Deleting instance files /var/lib/nova/instances/a6598da5-2e3d-4ca1-90ab-2a8db7241468_del
Jan 22 17:27:09 compute-0 nova_compute[183075]: 2026-01-22 17:27:09.259 183079 INFO nova.virt.libvirt.driver [None req-fc6d463b-9e54-4e78-bf74-a35a087be477 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Deletion of /var/lib/nova/instances/a6598da5-2e3d-4ca1-90ab-2a8db7241468_del complete
Jan 22 17:27:09 compute-0 podman[230717]: 2026-01-22 17:27:09.26283958 +0000 UTC m=+0.116481063 container cleanup 279272a5e1fa0de3698abf8d8f92a1e99d0514fd276f8ddd35ac6051459072ae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 17:27:09 compute-0 systemd[1]: libpod-conmon-279272a5e1fa0de3698abf8d8f92a1e99d0514fd276f8ddd35ac6051459072ae.scope: Deactivated successfully.
Jan 22 17:27:09 compute-0 podman[230761]: 2026-01-22 17:27:09.323887911 +0000 UTC m=+0.040389490 container remove 279272a5e1fa0de3698abf8d8f92a1e99d0514fd276f8ddd35ac6051459072ae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:27:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:09.330 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8b1b2f12-b9ac-4e55-861b-bdd5becf1ec8]: (4, ('Thu Jan 22 05:27:09 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a (279272a5e1fa0de3698abf8d8f92a1e99d0514fd276f8ddd35ac6051459072ae)\n279272a5e1fa0de3698abf8d8f92a1e99d0514fd276f8ddd35ac6051459072ae\nThu Jan 22 05:27:09 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a (279272a5e1fa0de3698abf8d8f92a1e99d0514fd276f8ddd35ac6051459072ae)\n279272a5e1fa0de3698abf8d8f92a1e99d0514fd276f8ddd35ac6051459072ae\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:27:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:09.331 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e2ee5c32-3b8d-4d20-b3a1-b9e003ea914e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:27:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:09.332 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44326f3c-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:27:09 compute-0 nova_compute[183075]: 2026-01-22 17:27:09.334 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:09 compute-0 kernel: tap44326f3c-10: left promiscuous mode
Jan 22 17:27:09 compute-0 nova_compute[183075]: 2026-01-22 17:27:09.341 183079 INFO nova.compute.manager [None req-fc6d463b-9e54-4e78-bf74-a35a087be477 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Took 0.37 seconds to destroy the instance on the hypervisor.
Jan 22 17:27:09 compute-0 nova_compute[183075]: 2026-01-22 17:27:09.342 183079 DEBUG oslo.service.loopingcall [None req-fc6d463b-9e54-4e78-bf74-a35a087be477 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:27:09 compute-0 nova_compute[183075]: 2026-01-22 17:27:09.343 183079 DEBUG nova.compute.manager [-] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:27:09 compute-0 nova_compute[183075]: 2026-01-22 17:27:09.343 183079 DEBUG nova.network.neutron [-] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:27:09 compute-0 nova_compute[183075]: 2026-01-22 17:27:09.351 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:09.354 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4eff544e-8e99-4084-8396-c47b8a51c59b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:27:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:09.381 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7196cde5-da66-4f40-ab4c-18792d018292]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:27:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:09.383 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d2ee1c00-4e91-404d-856e-8b8cfd2d04cd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:27:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:09.398 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[41d2f370-34d5-4907-852c-ea58a109890a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 505437, 'reachable_time': 41438, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230776, 'error': None, 'target': 'ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:27:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:09.401 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-44326f3c-1431-44d6-85ce-61ecbbb5ed7a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:27:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:09.401 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[002ad08e-5125-4116-9fc5-02bff82a1498]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:27:09 compute-0 systemd[1]: run-netns-ovnmeta\x2d44326f3c\x2d1431\x2d44d6\x2d85ce\x2d61ecbbb5ed7a.mount: Deactivated successfully.
Jan 22 17:27:09 compute-0 nova_compute[183075]: 2026-01-22 17:27:09.975 183079 INFO nova.compute.manager [None req-8df068d2-535d-43d1-ade1-ee57504d7e0e 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Get console output
Jan 22 17:27:09 compute-0 nova_compute[183075]: 2026-01-22 17:27:09.984 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:27:10 compute-0 ovn_controller[95372]: 2026-01-22T17:27:10Z|00096|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9e:13:c8 10.100.0.9
Jan 22 17:27:10 compute-0 ovn_controller[95372]: 2026-01-22T17:27:10Z|00097|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9e:13:c8 10.100.0.9
Jan 22 17:27:12 compute-0 nova_compute[183075]: 2026-01-22 17:27:12.036 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:12 compute-0 nova_compute[183075]: 2026-01-22 17:27:12.109 183079 DEBUG nova.compute.manager [req-0b6221d5-1d4e-47b9-b7a5-33e9bf5a9485 req-d25ca3ac-a196-49f9-91ee-2bff8e1069ee a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Received event network-vif-unplugged-ff4c20a1-cc0e-4a39-80b4-bb1426093c82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:27:12 compute-0 nova_compute[183075]: 2026-01-22 17:27:12.109 183079 DEBUG oslo_concurrency.lockutils [req-0b6221d5-1d4e-47b9-b7a5-33e9bf5a9485 req-d25ca3ac-a196-49f9-91ee-2bff8e1069ee a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "a6598da5-2e3d-4ca1-90ab-2a8db7241468-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:27:12 compute-0 nova_compute[183075]: 2026-01-22 17:27:12.109 183079 DEBUG oslo_concurrency.lockutils [req-0b6221d5-1d4e-47b9-b7a5-33e9bf5a9485 req-d25ca3ac-a196-49f9-91ee-2bff8e1069ee a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "a6598da5-2e3d-4ca1-90ab-2a8db7241468-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:27:12 compute-0 nova_compute[183075]: 2026-01-22 17:27:12.109 183079 DEBUG oslo_concurrency.lockutils [req-0b6221d5-1d4e-47b9-b7a5-33e9bf5a9485 req-d25ca3ac-a196-49f9-91ee-2bff8e1069ee a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "a6598da5-2e3d-4ca1-90ab-2a8db7241468-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:27:12 compute-0 nova_compute[183075]: 2026-01-22 17:27:12.110 183079 DEBUG nova.compute.manager [req-0b6221d5-1d4e-47b9-b7a5-33e9bf5a9485 req-d25ca3ac-a196-49f9-91ee-2bff8e1069ee a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] No waiting events found dispatching network-vif-unplugged-ff4c20a1-cc0e-4a39-80b4-bb1426093c82 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:27:12 compute-0 nova_compute[183075]: 2026-01-22 17:27:12.110 183079 DEBUG nova.compute.manager [req-0b6221d5-1d4e-47b9-b7a5-33e9bf5a9485 req-d25ca3ac-a196-49f9-91ee-2bff8e1069ee a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Received event network-vif-unplugged-ff4c20a1-cc0e-4a39-80b4-bb1426093c82 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:27:13 compute-0 nova_compute[183075]: 2026-01-22 17:27:13.281 183079 DEBUG nova.network.neutron [-] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:27:13 compute-0 nova_compute[183075]: 2026-01-22 17:27:13.320 183079 INFO nova.compute.manager [-] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Took 3.98 seconds to deallocate network for instance.
Jan 22 17:27:13 compute-0 podman[230777]: 2026-01-22 17:27:13.357189398 +0000 UTC m=+0.066574060 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 17:27:13 compute-0 nova_compute[183075]: 2026-01-22 17:27:13.466 183079 DEBUG oslo_concurrency.lockutils [None req-fc6d463b-9e54-4e78-bf74-a35a087be477 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:27:13 compute-0 nova_compute[183075]: 2026-01-22 17:27:13.466 183079 DEBUG oslo_concurrency.lockutils [None req-fc6d463b-9e54-4e78-bf74-a35a087be477 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:27:13 compute-0 nova_compute[183075]: 2026-01-22 17:27:13.536 183079 DEBUG nova.compute.provider_tree [None req-fc6d463b-9e54-4e78-bf74-a35a087be477 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:27:13 compute-0 nova_compute[183075]: 2026-01-22 17:27:13.550 183079 DEBUG nova.scheduler.client.report [None req-fc6d463b-9e54-4e78-bf74-a35a087be477 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:27:13 compute-0 nova_compute[183075]: 2026-01-22 17:27:13.572 183079 DEBUG oslo_concurrency.lockutils [None req-fc6d463b-9e54-4e78-bf74-a35a087be477 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:27:13 compute-0 nova_compute[183075]: 2026-01-22 17:27:13.602 183079 INFO nova.scheduler.client.report [None req-fc6d463b-9e54-4e78-bf74-a35a087be477 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Deleted allocations for instance a6598da5-2e3d-4ca1-90ab-2a8db7241468
Jan 22 17:27:13 compute-0 nova_compute[183075]: 2026-01-22 17:27:13.666 183079 DEBUG oslo_concurrency.lockutils [None req-fc6d463b-9e54-4e78-bf74-a35a087be477 852aea4e08344f39ae07e6b57393c767 e4c0bb18013747dfad2e25b2495090eb - - default default] Lock "a6598da5-2e3d-4ca1-90ab-2a8db7241468" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.699s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:27:13 compute-0 nova_compute[183075]: 2026-01-22 17:27:13.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:27:14 compute-0 nova_compute[183075]: 2026-01-22 17:27:14.255 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:15 compute-0 nova_compute[183075]: 2026-01-22 17:27:15.299 183079 DEBUG nova.compute.manager [req-5cc37c4f-3372-4c2a-bcd8-0403f5344bcb req-a1609000-af47-40b0-be07-6410c5292529 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Received event network-vif-plugged-ff4c20a1-cc0e-4a39-80b4-bb1426093c82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:27:15 compute-0 nova_compute[183075]: 2026-01-22 17:27:15.300 183079 DEBUG oslo_concurrency.lockutils [req-5cc37c4f-3372-4c2a-bcd8-0403f5344bcb req-a1609000-af47-40b0-be07-6410c5292529 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "a6598da5-2e3d-4ca1-90ab-2a8db7241468-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:27:15 compute-0 nova_compute[183075]: 2026-01-22 17:27:15.300 183079 DEBUG oslo_concurrency.lockutils [req-5cc37c4f-3372-4c2a-bcd8-0403f5344bcb req-a1609000-af47-40b0-be07-6410c5292529 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "a6598da5-2e3d-4ca1-90ab-2a8db7241468-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:27:15 compute-0 nova_compute[183075]: 2026-01-22 17:27:15.300 183079 DEBUG oslo_concurrency.lockutils [req-5cc37c4f-3372-4c2a-bcd8-0403f5344bcb req-a1609000-af47-40b0-be07-6410c5292529 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "a6598da5-2e3d-4ca1-90ab-2a8db7241468-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:27:15 compute-0 nova_compute[183075]: 2026-01-22 17:27:15.300 183079 DEBUG nova.compute.manager [req-5cc37c4f-3372-4c2a-bcd8-0403f5344bcb req-a1609000-af47-40b0-be07-6410c5292529 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] No waiting events found dispatching network-vif-plugged-ff4c20a1-cc0e-4a39-80b4-bb1426093c82 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:27:15 compute-0 nova_compute[183075]: 2026-01-22 17:27:15.301 183079 WARNING nova.compute.manager [req-5cc37c4f-3372-4c2a-bcd8-0403f5344bcb req-a1609000-af47-40b0-be07-6410c5292529 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Received unexpected event network-vif-plugged-ff4c20a1-cc0e-4a39-80b4-bb1426093c82 for instance with vm_state deleted and task_state None.
Jan 22 17:27:15 compute-0 nova_compute[183075]: 2026-01-22 17:27:15.618 183079 INFO nova.compute.manager [None req-0b0430db-521f-4592-9267-9cce3383e594 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Get console output
Jan 22 17:27:15 compute-0 nova_compute[183075]: 2026-01-22 17:27:15.624 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:27:17 compute-0 nova_compute[183075]: 2026-01-22 17:27:17.094 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:17.207 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:17.208 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:17.912 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:17.912 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.7038205
Jan 22 17:27:17 compute-0 haproxy-metadata-proxy-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230621]: 10.100.0.9:38488 [22/Jan/2026:17:27:17.206] listener listener/metadata 0/0/0/705/705 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:17.923 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:17.923 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:17.938 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:17.938 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0147433
Jan 22 17:27:17 compute-0 haproxy-metadata-proxy-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230621]: 10.100.0.9:38494 [22/Jan/2026:17:27:17.921] listener listener/metadata 0/0/0/16/16 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:17.943 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:17.943 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:17.963 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:17.963 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0202322
Jan 22 17:27:17 compute-0 haproxy-metadata-proxy-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230621]: 10.100.0.9:38506 [22/Jan/2026:17:27:17.942] listener listener/metadata 0/0/0/21/21 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:17.969 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:17.970 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:27:17 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.002 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.003 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0328386
Jan 22 17:27:18 compute-0 haproxy-metadata-proxy-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230621]: 10.100.0.9:38520 [22/Jan/2026:17:27:17.968] listener listener/metadata 0/0/0/34/34 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.009 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.010 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.033 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.034 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0239208
Jan 22 17:27:18 compute-0 haproxy-metadata-proxy-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230621]: 10.100.0.9:38534 [22/Jan/2026:17:27:18.008] listener listener/metadata 0/0/0/25/25 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.041 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.042 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.060 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.061 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0191069
Jan 22 17:27:18 compute-0 haproxy-metadata-proxy-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230621]: 10.100.0.9:38540 [22/Jan/2026:17:27:18.040] listener listener/metadata 0/0/0/20/20 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.066 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.067 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.086 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:27:18 compute-0 haproxy-metadata-proxy-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230621]: 10.100.0.9:38554 [22/Jan/2026:17:27:18.066] listener listener/metadata 0/0/0/20/20 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.087 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0198519
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.092 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.093 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.113 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.114 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0209682
Jan 22 17:27:18 compute-0 haproxy-metadata-proxy-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230621]: 10.100.0.9:38568 [22/Jan/2026:17:27:18.091] listener listener/metadata 0/0/0/22/22 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.120 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.120 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.133 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.134 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 170 time: 0.0132840
Jan 22 17:27:18 compute-0 haproxy-metadata-proxy-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230621]: 10.100.0.9:38576 [22/Jan/2026:17:27:18.119] listener listener/metadata 0/0/0/14/14 200 154 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.138 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.139 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.151 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.152 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 170 time: 0.0131557
Jan 22 17:27:18 compute-0 haproxy-metadata-proxy-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230621]: 10.100.0.9:38592 [22/Jan/2026:17:27:18.138] listener listener/metadata 0/0/0/13/13 200 154 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.156 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.156 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:27:18 compute-0 haproxy-metadata-proxy-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230621]: 10.100.0.9:38602 [22/Jan/2026:17:27:18.156] listener listener/metadata 0/0/0/15/15 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.171 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0145874
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.179 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.180 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.194 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.194 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0142021
Jan 22 17:27:18 compute-0 haproxy-metadata-proxy-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230621]: 10.100.0.9:38612 [22/Jan/2026:17:27:18.179] listener listener/metadata 0/0/0/14/14 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.198 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.198 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.209 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.209 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0109625
Jan 22 17:27:18 compute-0 haproxy-metadata-proxy-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230621]: 10.100.0.9:38618 [22/Jan/2026:17:27:18.197] listener listener/metadata 0/0/0/11/11 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.212 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.213 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.226 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.226 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0130975
Jan 22 17:27:18 compute-0 haproxy-metadata-proxy-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230621]: 10.100.0.9:38628 [22/Jan/2026:17:27:18.212] listener listener/metadata 0/0/0/13/13 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.231 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.232 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.248 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.248 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 170 time: 0.0165987
Jan 22 17:27:18 compute-0 haproxy-metadata-proxy-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230621]: 10.100.0.9:38630 [22/Jan/2026:17:27:18.230] listener listener/metadata 0/0/0/17/17 200 154 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.254 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.255 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.277 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:27:18 compute-0 haproxy-metadata-proxy-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230621]: 10.100.0.9:38640 [22/Jan/2026:17:27:18.254] listener listener/metadata 0/0/0/23/23 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:27:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:18.277 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0223238
Jan 22 17:27:19 compute-0 nova_compute[183075]: 2026-01-22 17:27:19.258 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:19 compute-0 nova_compute[183075]: 2026-01-22 17:27:19.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:27:20 compute-0 nova_compute[183075]: 2026-01-22 17:27:20.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:27:20 compute-0 nova_compute[183075]: 2026-01-22 17:27:20.952 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769102825.9516969, 7148e02d-0822-41cf-b2f4-41ccec2a2fe4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:27:20 compute-0 nova_compute[183075]: 2026-01-22 17:27:20.953 183079 INFO nova.compute.manager [-] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] VM Stopped (Lifecycle Event)
Jan 22 17:27:21 compute-0 nova_compute[183075]: 2026-01-22 17:27:21.320 183079 INFO nova.compute.manager [None req-00f5ed7a-ffea-4c12-99a9-b9cf4a4ff4d4 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Get console output
Jan 22 17:27:21 compute-0 nova_compute[183075]: 2026-01-22 17:27:21.325 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:27:22 compute-0 nova_compute[183075]: 2026-01-22 17:27:22.097 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:23 compute-0 nova_compute[183075]: 2026-01-22 17:27:23.355 183079 DEBUG nova.compute.manager [None req-084c1cae-d385-410c-a068-a3636e89581e - - - - - -] [instance: 7148e02d-0822-41cf-b2f4-41ccec2a2fe4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:27:23 compute-0 podman[230808]: 2026-01-22 17:27:23.389485086 +0000 UTC m=+0.078007955 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 22 17:27:23 compute-0 podman[230807]: 2026-01-22 17:27:23.404093996 +0000 UTC m=+0.108183121 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, 
org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 17:27:23 compute-0 podman[230809]: 2026-01-22 17:27:23.404094616 +0000 UTC m=+0.088404603 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, vendor=Red Hat, Inc., distribution-scope=public, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, architecture=x86_64, 
com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vcs-type=git)
Jan 22 17:27:24 compute-0 nova_compute[183075]: 2026-01-22 17:27:24.227 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769102829.2253351, a6598da5-2e3d-4ca1-90ab-2a8db7241468 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:27:24 compute-0 nova_compute[183075]: 2026-01-22 17:27:24.227 183079 INFO nova.compute.manager [-] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] VM Stopped (Lifecycle Event)
Jan 22 17:27:24 compute-0 nova_compute[183075]: 2026-01-22 17:27:24.260 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:24 compute-0 nova_compute[183075]: 2026-01-22 17:27:24.384 183079 DEBUG nova.compute.manager [None req-965677ad-97f1-4144-952f-cda7b9e329dc - - - - - -] [instance: a6598da5-2e3d-4ca1-90ab-2a8db7241468] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:27:26 compute-0 nova_compute[183075]: 2026-01-22 17:27:26.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:27:26 compute-0 nova_compute[183075]: 2026-01-22 17:27:26.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:27:26 compute-0 nova_compute[183075]: 2026-01-22 17:27:26.834 183079 DEBUG oslo_concurrency.lockutils [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Acquiring lock "0a31beaa-1978-4bdc-b51e-23750ba51b8a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:27:26 compute-0 nova_compute[183075]: 2026-01-22 17:27:26.834 183079 DEBUG oslo_concurrency.lockutils [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "0a31beaa-1978-4bdc-b51e-23750ba51b8a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:27:26 compute-0 nova_compute[183075]: 2026-01-22 17:27:26.852 183079 DEBUG nova.compute.manager [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:27:26 compute-0 nova_compute[183075]: 2026-01-22 17:27:26.933 183079 DEBUG oslo_concurrency.lockutils [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:27:26 compute-0 nova_compute[183075]: 2026-01-22 17:27:26.934 183079 DEBUG oslo_concurrency.lockutils [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:27:26 compute-0 nova_compute[183075]: 2026-01-22 17:27:26.942 183079 DEBUG nova.virt.hardware [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:27:26 compute-0 nova_compute[183075]: 2026-01-22 17:27:26.942 183079 INFO nova.compute.claims [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.100 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.118 183079 DEBUG nova.scheduler.client.report [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Refreshing inventories for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.134 183079 DEBUG nova.scheduler.client.report [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Updating ProviderTree inventory for provider 2513134c-f67c-4237-84bf-4ebe2450d610 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.135 183079 DEBUG nova.compute.provider_tree [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Updating inventory in ProviderTree for provider 2513134c-f67c-4237-84bf-4ebe2450d610 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.148 183079 DEBUG nova.scheduler.client.report [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Refreshing aggregate associations for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.170 183079 DEBUG nova.scheduler.client.report [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Refreshing trait associations for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610, traits: COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SVM,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE42,COMPUTE_NODE,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE4A,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_FMA3,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_AUTO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.233 183079 DEBUG nova.compute.provider_tree [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.257 183079 DEBUG nova.scheduler.client.report [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.278 183079 DEBUG oslo_concurrency.lockutils [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.344s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.279 183079 DEBUG nova.compute.manager [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.328 183079 DEBUG nova.compute.manager [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.329 183079 DEBUG nova.network.neutron [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.353 183079 INFO nova.virt.libvirt.driver [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.375 183079 DEBUG nova.compute.manager [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.472 183079 DEBUG nova.compute.manager [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.473 183079 DEBUG nova.virt.libvirt.driver [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.474 183079 INFO nova.virt.libvirt.driver [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Creating image(s)
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.474 183079 DEBUG oslo_concurrency.lockutils [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Acquiring lock "/var/lib/nova/instances/0a31beaa-1978-4bdc-b51e-23750ba51b8a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.474 183079 DEBUG oslo_concurrency.lockutils [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "/var/lib/nova/instances/0a31beaa-1978-4bdc-b51e-23750ba51b8a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.475 183079 DEBUG oslo_concurrency.lockutils [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "/var/lib/nova/instances/0a31beaa-1978-4bdc-b51e-23750ba51b8a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.486 183079 DEBUG oslo_concurrency.processutils [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.545 183079 DEBUG oslo_concurrency.processutils [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.547 183079 DEBUG oslo_concurrency.lockutils [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.547 183079 DEBUG oslo_concurrency.lockutils [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.560 183079 DEBUG oslo_concurrency.processutils [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.619 183079 DEBUG oslo_concurrency.processutils [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.620 183079 DEBUG oslo_concurrency.processutils [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/0a31beaa-1978-4bdc-b51e-23750ba51b8a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.655 183079 DEBUG oslo_concurrency.processutils [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/0a31beaa-1978-4bdc-b51e-23750ba51b8a/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.657 183079 DEBUG oslo_concurrency.lockutils [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.657 183079 DEBUG oslo_concurrency.processutils [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.717 183079 DEBUG oslo_concurrency.processutils [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.718 183079 DEBUG nova.virt.disk.api [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Checking if we can resize image /var/lib/nova/instances/0a31beaa-1978-4bdc-b51e-23750ba51b8a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.719 183079 DEBUG oslo_concurrency.processutils [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0a31beaa-1978-4bdc-b51e-23750ba51b8a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.774 183079 DEBUG nova.policy [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.777 183079 DEBUG oslo_concurrency.processutils [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0a31beaa-1978-4bdc-b51e-23750ba51b8a/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.777 183079 DEBUG nova.virt.disk.api [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Cannot resize image /var/lib/nova/instances/0a31beaa-1978-4bdc-b51e-23750ba51b8a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.778 183079 DEBUG nova.objects.instance [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lazy-loading 'migration_context' on Instance uuid 0a31beaa-1978-4bdc-b51e-23750ba51b8a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.799 183079 DEBUG nova.virt.libvirt.driver [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.800 183079 DEBUG nova.virt.libvirt.driver [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Ensure instance console log exists: /var/lib/nova/instances/0a31beaa-1978-4bdc-b51e-23750ba51b8a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.801 183079 DEBUG oslo_concurrency.lockutils [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.802 183079 DEBUG oslo_concurrency.lockutils [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:27:27 compute-0 nova_compute[183075]: 2026-01-22 17:27:27.802 183079 DEBUG oslo_concurrency.lockutils [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:27:28 compute-0 podman[230887]: 2026-01-22 17:27:28.389429376 +0000 UTC m=+0.095248915 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:27:28 compute-0 nova_compute[183075]: 2026-01-22 17:27:28.789 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:27:28 compute-0 nova_compute[183075]: 2026-01-22 17:27:28.790 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:27:28 compute-0 nova_compute[183075]: 2026-01-22 17:27:28.889 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 17:27:29 compute-0 nova_compute[183075]: 2026-01-22 17:27:29.060 183079 DEBUG nova.network.neutron [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Successfully updated port: 8d128323-81b9-4ffa-92c7-39f1389aa99e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:27:29 compute-0 nova_compute[183075]: 2026-01-22 17:27:29.088 183079 DEBUG oslo_concurrency.lockutils [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Acquiring lock "refresh_cache-0a31beaa-1978-4bdc-b51e-23750ba51b8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:27:29 compute-0 nova_compute[183075]: 2026-01-22 17:27:29.088 183079 DEBUG oslo_concurrency.lockutils [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Acquired lock "refresh_cache-0a31beaa-1978-4bdc-b51e-23750ba51b8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:27:29 compute-0 nova_compute[183075]: 2026-01-22 17:27:29.088 183079 DEBUG nova.network.neutron [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:27:29 compute-0 nova_compute[183075]: 2026-01-22 17:27:29.171 183079 DEBUG nova.compute.manager [req-4a267925-e193-43a3-b452-7847561da77e req-56333f3e-47a8-4c2c-9034-e99a0a3236e9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Received event network-changed-8d128323-81b9-4ffa-92c7-39f1389aa99e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:27:29 compute-0 nova_compute[183075]: 2026-01-22 17:27:29.172 183079 DEBUG nova.compute.manager [req-4a267925-e193-43a3-b452-7847561da77e req-56333f3e-47a8-4c2c-9034-e99a0a3236e9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Refreshing instance network info cache due to event network-changed-8d128323-81b9-4ffa-92c7-39f1389aa99e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:27:29 compute-0 nova_compute[183075]: 2026-01-22 17:27:29.173 183079 DEBUG oslo_concurrency.lockutils [req-4a267925-e193-43a3-b452-7847561da77e req-56333f3e-47a8-4c2c-9034-e99a0a3236e9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-0a31beaa-1978-4bdc-b51e-23750ba51b8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:27:29 compute-0 nova_compute[183075]: 2026-01-22 17:27:29.265 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:29 compute-0 nova_compute[183075]: 2026-01-22 17:27:29.285 183079 DEBUG nova.network.neutron [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:27:29 compute-0 nova_compute[183075]: 2026-01-22 17:27:29.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:27:29 compute-0 nova_compute[183075]: 2026-01-22 17:27:29.966 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:27:29 compute-0 nova_compute[183075]: 2026-01-22 17:27:29.968 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:27:29 compute-0 nova_compute[183075]: 2026-01-22 17:27:29.969 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:27:29 compute-0 nova_compute[183075]: 2026-01-22 17:27:29.969 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:27:30 compute-0 nova_compute[183075]: 2026-01-22 17:27:30.057 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:27:30 compute-0 nova_compute[183075]: 2026-01-22 17:27:30.140 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:27:30 compute-0 nova_compute[183075]: 2026-01-22 17:27:30.142 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:27:30 compute-0 nova_compute[183075]: 2026-01-22 17:27:30.208 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:27:30 compute-0 nova_compute[183075]: 2026-01-22 17:27:30.392 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:27:30 compute-0 nova_compute[183075]: 2026-01-22 17:27:30.393 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5529MB free_disk=73.33127975463867GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:27:30 compute-0 nova_compute[183075]: 2026-01-22 17:27:30.394 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:27:30 compute-0 nova_compute[183075]: 2026-01-22 17:27:30.394 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:27:30 compute-0 nova_compute[183075]: 2026-01-22 17:27:30.481 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance b3d0d846-1f10-43c4-8e0c-9ba93967fdc7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:27:30 compute-0 nova_compute[183075]: 2026-01-22 17:27:30.481 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 0a31beaa-1978-4bdc-b51e-23750ba51b8a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:27:30 compute-0 nova_compute[183075]: 2026-01-22 17:27:30.482 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:27:30 compute-0 nova_compute[183075]: 2026-01-22 17:27:30.482 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:27:30 compute-0 nova_compute[183075]: 2026-01-22 17:27:30.574 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:27:30 compute-0 nova_compute[183075]: 2026-01-22 17:27:30.600 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:27:30 compute-0 nova_compute[183075]: 2026-01-22 17:27:30.636 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:27:30 compute-0 nova_compute[183075]: 2026-01-22 17:27:30.637 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.243s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:27:31 compute-0 nova_compute[183075]: 2026-01-22 17:27:31.157 183079 DEBUG nova.network.neutron [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Updating instance_info_cache with network_info: [{"id": "8d128323-81b9-4ffa-92c7-39f1389aa99e", "address": "fa:16:3e:e5:fb:b4", "network": {"id": "900bfc1d-c57a-4f7e-92bf-4e7a876cd570", "bridge": "br-int", "label": "tempest-test-network--1765746424", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cc38cd108414e729395073de19dceae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d128323-81", "ovs_interfaceid": "8d128323-81b9-4ffa-92c7-39f1389aa99e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:27:31 compute-0 nova_compute[183075]: 2026-01-22 17:27:31.639 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.062 183079 DEBUG oslo_concurrency.lockutils [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Releasing lock "refresh_cache-0a31beaa-1978-4bdc-b51e-23750ba51b8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.063 183079 DEBUG nova.compute.manager [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Instance network_info: |[{"id": "8d128323-81b9-4ffa-92c7-39f1389aa99e", "address": "fa:16:3e:e5:fb:b4", "network": {"id": "900bfc1d-c57a-4f7e-92bf-4e7a876cd570", "bridge": "br-int", "label": "tempest-test-network--1765746424", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cc38cd108414e729395073de19dceae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d128323-81", "ovs_interfaceid": "8d128323-81b9-4ffa-92c7-39f1389aa99e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.064 183079 DEBUG oslo_concurrency.lockutils [req-4a267925-e193-43a3-b452-7847561da77e req-56333f3e-47a8-4c2c-9034-e99a0a3236e9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-0a31beaa-1978-4bdc-b51e-23750ba51b8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.065 183079 DEBUG nova.network.neutron [req-4a267925-e193-43a3-b452-7847561da77e req-56333f3e-47a8-4c2c-9034-e99a0a3236e9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Refreshing network info cache for port 8d128323-81b9-4ffa-92c7-39f1389aa99e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.070 183079 DEBUG nova.virt.libvirt.driver [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Start _get_guest_xml network_info=[{"id": "8d128323-81b9-4ffa-92c7-39f1389aa99e", "address": "fa:16:3e:e5:fb:b4", "network": {"id": "900bfc1d-c57a-4f7e-92bf-4e7a876cd570", "bridge": "br-int", "label": "tempest-test-network--1765746424", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cc38cd108414e729395073de19dceae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d128323-81", "ovs_interfaceid": "8d128323-81b9-4ffa-92c7-39f1389aa99e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.076 183079 WARNING nova.virt.libvirt.driver [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.083 183079 DEBUG nova.virt.libvirt.host [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.084 183079 DEBUG nova.virt.libvirt.host [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.094 183079 DEBUG nova.virt.libvirt.host [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.095 183079 DEBUG nova.virt.libvirt.host [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.096 183079 DEBUG nova.virt.libvirt.driver [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.096 183079 DEBUG nova.virt.hardware [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.097 183079 DEBUG nova.virt.hardware [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.098 183079 DEBUG nova.virt.hardware [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.098 183079 DEBUG nova.virt.hardware [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.099 183079 DEBUG nova.virt.hardware [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.099 183079 DEBUG nova.virt.hardware [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.100 183079 DEBUG nova.virt.hardware [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.100 183079 DEBUG nova.virt.hardware [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.101 183079 DEBUG nova.virt.hardware [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.101 183079 DEBUG nova.virt.hardware [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.102 183079 DEBUG nova.virt.hardware [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.109 183079 DEBUG nova.virt.libvirt.vif [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:27:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-8d128323-779265437',display_name='tempest-server-8d128323-779265437',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-8d128323-779265437',id=48,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHCLHRqMwuIHHegtVN/dBsFuYJfMhVe3L9LmEAUU1mvb3zBmzXJa8cLxmVJrl+X3/Ox/s4/8DkIlSNEMGQ0/B6pNriTMnK4hV1OfgcB6Su2v76kqiKHqmCcfWjFp6l6YnA==',key_name='tempest-keypair-test-1032085623',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6cc38cd108414e729395073de19dceae',ramdisk_id='',reservation_id='r-t00rivbr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestFloatingIPUpdate-1973932938',owner_user_name='tempest-TestFloatingIPUpdate-1973932938-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:27:27Z,user_data=None,user_id='841f48f635f54f619a9de1d6bbc8f832',uuid=0a31beaa-1978-4bdc-b51e-23750ba51b8a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8d128323-81b9-4ffa-92c7-39f1389aa99e", "address": "fa:16:3e:e5:fb:b4", "network": {"id": "900bfc1d-c57a-4f7e-92bf-4e7a876cd570", "bridge": "br-int", "label": "tempest-test-network--1765746424", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "6cc38cd108414e729395073de19dceae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d128323-81", "ovs_interfaceid": "8d128323-81b9-4ffa-92c7-39f1389aa99e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.110 183079 DEBUG nova.network.os_vif_util [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Converting VIF {"id": "8d128323-81b9-4ffa-92c7-39f1389aa99e", "address": "fa:16:3e:e5:fb:b4", "network": {"id": "900bfc1d-c57a-4f7e-92bf-4e7a876cd570", "bridge": "br-int", "label": "tempest-test-network--1765746424", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cc38cd108414e729395073de19dceae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d128323-81", "ovs_interfaceid": "8d128323-81b9-4ffa-92c7-39f1389aa99e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.111 183079 DEBUG nova.network.os_vif_util [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:fb:b4,bridge_name='br-int',has_traffic_filtering=True,id=8d128323-81b9-4ffa-92c7-39f1389aa99e,network=Network(900bfc1d-c57a-4f7e-92bf-4e7a876cd570),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8d128323-81') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.113 183079 DEBUG nova.objects.instance [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lazy-loading 'pci_devices' on Instance uuid 0a31beaa-1978-4bdc-b51e-23750ba51b8a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.115 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.146 183079 DEBUG nova.virt.libvirt.driver [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:27:32 compute-0 nova_compute[183075]:   <uuid>0a31beaa-1978-4bdc-b51e-23750ba51b8a</uuid>
Jan 22 17:27:32 compute-0 nova_compute[183075]:   <name>instance-00000030</name>
Jan 22 17:27:32 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:27:32 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:27:32 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:27:32 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:       <nova:name>tempest-server-8d128323-779265437</nova:name>
Jan 22 17:27:32 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:27:32</nova:creationTime>
Jan 22 17:27:32 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:27:32 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:27:32 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:27:32 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:27:32 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:27:32 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:27:32 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:27:32 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:27:32 compute-0 nova_compute[183075]:         <nova:user uuid="841f48f635f54f619a9de1d6bbc8f832">tempest-TestFloatingIPUpdate-1973932938-project-member</nova:user>
Jan 22 17:27:32 compute-0 nova_compute[183075]:         <nova:project uuid="6cc38cd108414e729395073de19dceae">tempest-TestFloatingIPUpdate-1973932938</nova:project>
Jan 22 17:27:32 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:27:32 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:27:32 compute-0 nova_compute[183075]:         <nova:port uuid="8d128323-81b9-4ffa-92c7-39f1389aa99e">
Jan 22 17:27:32 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:27:32 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:27:32 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:27:32 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <system>
Jan 22 17:27:32 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:27:32 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:27:32 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:27:32 compute-0 nova_compute[183075]:       <entry name="serial">0a31beaa-1978-4bdc-b51e-23750ba51b8a</entry>
Jan 22 17:27:32 compute-0 nova_compute[183075]:       <entry name="uuid">0a31beaa-1978-4bdc-b51e-23750ba51b8a</entry>
Jan 22 17:27:32 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     </system>
Jan 22 17:27:32 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:27:32 compute-0 nova_compute[183075]:   <os>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:   </os>
Jan 22 17:27:32 compute-0 nova_compute[183075]:   <features>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:   </features>
Jan 22 17:27:32 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:27:32 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:27:32 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:27:32 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/0a31beaa-1978-4bdc-b51e-23750ba51b8a/disk"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:27:32 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:e5:fb:b4"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:       <target dev="tap8d128323-81"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:27:32 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/0a31beaa-1978-4bdc-b51e-23750ba51b8a/console.log" append="off"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <video>
Jan 22 17:27:32 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     </video>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:27:32 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:27:32 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:27:32 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:27:32 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:27:32 compute-0 nova_compute[183075]: </domain>
Jan 22 17:27:32 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.148 183079 DEBUG nova.compute.manager [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Preparing to wait for external event network-vif-plugged-8d128323-81b9-4ffa-92c7-39f1389aa99e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.151 183079 DEBUG oslo_concurrency.lockutils [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Acquiring lock "0a31beaa-1978-4bdc-b51e-23750ba51b8a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.151 183079 DEBUG oslo_concurrency.lockutils [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "0a31beaa-1978-4bdc-b51e-23750ba51b8a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.152 183079 DEBUG oslo_concurrency.lockutils [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "0a31beaa-1978-4bdc-b51e-23750ba51b8a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.153 183079 DEBUG nova.virt.libvirt.vif [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:27:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-8d128323-779265437',display_name='tempest-server-8d128323-779265437',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-8d128323-779265437',id=48,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHCLHRqMwuIHHegtVN/dBsFuYJfMhVe3L9LmEAUU1mvb3zBmzXJa8cLxmVJrl+X3/Ox/s4/8DkIlSNEMGQ0/B6pNriTMnK4hV1OfgcB6Su2v76kqiKHqmCcfWjFp6l6YnA==',key_name='tempest-keypair-test-1032085623',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6cc38cd108414e729395073de19dceae',ramdisk_id='',reservation_id='r-t00rivbr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestFloatingIPUpdate-1973932938',owner_user_name='tempest-TestFloatingIPUpdate-1973932938-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:27:27Z,user_data=None,user_id='841f48f635f54f619a9de1d6bbc8f832',uuid=0a31beaa-1978-4bdc-b51e-23750ba51b8a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8d128323-81b9-4ffa-92c7-39f1389aa99e", "address": "fa:16:3e:e5:fb:b4", "network": {"id": "900bfc1d-c57a-4f7e-92bf-4e7a876cd570", "bridge": "br-int", "label": "tempest-test-network--1765746424", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "6cc38cd108414e729395073de19dceae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d128323-81", "ovs_interfaceid": "8d128323-81b9-4ffa-92c7-39f1389aa99e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.154 183079 DEBUG nova.network.os_vif_util [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Converting VIF {"id": "8d128323-81b9-4ffa-92c7-39f1389aa99e", "address": "fa:16:3e:e5:fb:b4", "network": {"id": "900bfc1d-c57a-4f7e-92bf-4e7a876cd570", "bridge": "br-int", "label": "tempest-test-network--1765746424", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cc38cd108414e729395073de19dceae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d128323-81", "ovs_interfaceid": "8d128323-81b9-4ffa-92c7-39f1389aa99e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.155 183079 DEBUG nova.network.os_vif_util [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:fb:b4,bridge_name='br-int',has_traffic_filtering=True,id=8d128323-81b9-4ffa-92c7-39f1389aa99e,network=Network(900bfc1d-c57a-4f7e-92bf-4e7a876cd570),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8d128323-81') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.156 183079 DEBUG os_vif [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:fb:b4,bridge_name='br-int',has_traffic_filtering=True,id=8d128323-81b9-4ffa-92c7-39f1389aa99e,network=Network(900bfc1d-c57a-4f7e-92bf-4e7a876cd570),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8d128323-81') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.157 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.157 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.158 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.163 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.163 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8d128323-81, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.165 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8d128323-81, col_values=(('external_ids', {'iface-id': '8d128323-81b9-4ffa-92c7-39f1389aa99e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e5:fb:b4', 'vm-uuid': '0a31beaa-1978-4bdc-b51e-23750ba51b8a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.167 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:32 compute-0 NetworkManager[55454]: <info>  [1769102852.1683] manager: (tap8d128323-81): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/225)
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.169 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.176 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.178 183079 INFO os_vif [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:fb:b4,bridge_name='br-int',has_traffic_filtering=True,id=8d128323-81b9-4ffa-92c7-39f1389aa99e,network=Network(900bfc1d-c57a-4f7e-92bf-4e7a876cd570),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8d128323-81')
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.475 183079 DEBUG nova.virt.libvirt.driver [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.475 183079 DEBUG nova.virt.libvirt.driver [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] No VIF found with MAC fa:16:3e:e5:fb:b4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:27:32 compute-0 kernel: tap8d128323-81: entered promiscuous mode
Jan 22 17:27:32 compute-0 NetworkManager[55454]: <info>  [1769102852.5383] manager: (tap8d128323-81): new Tun device (/org/freedesktop/NetworkManager/Devices/226)
Jan 22 17:27:32 compute-0 ovn_controller[95372]: 2026-01-22T17:27:32Z|00528|binding|INFO|Claiming lport 8d128323-81b9-4ffa-92c7-39f1389aa99e for this chassis.
Jan 22 17:27:32 compute-0 ovn_controller[95372]: 2026-01-22T17:27:32Z|00529|binding|INFO|8d128323-81b9-4ffa-92c7-39f1389aa99e: Claiming fa:16:3e:e5:fb:b4 10.100.0.7
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.539 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:32 compute-0 ovn_controller[95372]: 2026-01-22T17:27:32Z|00530|binding|INFO|Setting lport 8d128323-81b9-4ffa-92c7-39f1389aa99e ovn-installed in OVS
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.556 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:32 compute-0 ovn_controller[95372]: 2026-01-22T17:27:32Z|00531|binding|INFO|Setting lport 8d128323-81b9-4ffa-92c7-39f1389aa99e up in Southbound
Jan 22 17:27:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:32.558 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e5:fb:b4 10.100.0.7'], port_security=['fa:16:3e:e5:fb:b4 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-900bfc1d-c57a-4f7e-92bf-4e7a876cd570', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6cc38cd108414e729395073de19dceae', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ae587f87-81d1-4ff6-8e92-98dcd41d2886', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2c590c1d-a36d-48b3-bcde-1b6ee74771df, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=8d128323-81b9-4ffa-92c7-39f1389aa99e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:27:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:32.559 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 8d128323-81b9-4ffa-92c7-39f1389aa99e in datapath 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 bound to our chassis
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.560 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:32.561 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 900bfc1d-c57a-4f7e-92bf-4e7a876cd570
Jan 22 17:27:32 compute-0 systemd-machined[154382]: New machine qemu-48-instance-00000030.
Jan 22 17:27:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:32.579 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3dfdb071-e1fa-414d-b737-5d1788aabb14]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:27:32 compute-0 systemd[1]: Started Virtual Machine qemu-48-instance-00000030.
Jan 22 17:27:32 compute-0 systemd-udevd[230932]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:27:32 compute-0 NetworkManager[55454]: <info>  [1769102852.6036] device (tap8d128323-81): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:27:32 compute-0 NetworkManager[55454]: <info>  [1769102852.6044] device (tap8d128323-81): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:27:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:32.615 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[5f3c0bda-d2f1-458a-b8b4-e75cc70a2bd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:27:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:32.618 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[d97aea45-a61a-43e3-b33c-dec23e3b2ec9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:27:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:32.648 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[a0003feb-afb6-4a70-8412-5e332dd38292]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:27:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:32.664 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2b775588-7d9b-49a6-b5dd-57822684a0b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap900bfc1d-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b7:1d:49'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 54, 'rx_bytes': 8920, 'tx_bytes': 6143, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 54, 'rx_bytes': 8920, 'tx_bytes': 6143, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 149], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 518058, 'reachable_time': 23458, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230943, 'error': None, 'target': 'ovnmeta-900bfc1d-c57a-4f7e-92bf-4e7a876cd570', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:27:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:32.681 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6e242d8a-e8da-48c9-bd2b-36dd11e33bdc]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap900bfc1d-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 518071, 'tstamp': 518071}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230945, 'error': None, 'target': 'ovnmeta-900bfc1d-c57a-4f7e-92bf-4e7a876cd570', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap900bfc1d-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 518074, 'tstamp': 518074}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230945, 'error': None, 'target': 'ovnmeta-900bfc1d-c57a-4f7e-92bf-4e7a876cd570', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:27:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:32.682 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap900bfc1d-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.733 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:32.734 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap900bfc1d-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:27:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:32.735 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:27:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:32.735 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap900bfc1d-c0, col_values=(('external_ids', {'iface-id': '23a8c53c-68ec-4f6e-8f99-d6d830063975'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:27:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:32.736 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.845 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102852.8449206, 0a31beaa-1978-4bdc-b51e-23750ba51b8a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:27:32 compute-0 nova_compute[183075]: 2026-01-22 17:27:32.846 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] VM Started (Lifecycle Event)
Jan 22 17:27:33 compute-0 nova_compute[183075]: 2026-01-22 17:27:33.065 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:27:33 compute-0 nova_compute[183075]: 2026-01-22 17:27:33.071 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102852.8450496, 0a31beaa-1978-4bdc-b51e-23750ba51b8a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:27:33 compute-0 nova_compute[183075]: 2026-01-22 17:27:33.072 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] VM Paused (Lifecycle Event)
Jan 22 17:27:33 compute-0 nova_compute[183075]: 2026-01-22 17:27:33.125 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:27:33 compute-0 nova_compute[183075]: 2026-01-22 17:27:33.131 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:27:33 compute-0 nova_compute[183075]: 2026-01-22 17:27:33.156 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:27:33 compute-0 nova_compute[183075]: 2026-01-22 17:27:33.181 183079 DEBUG nova.compute.manager [req-8cb3d10a-6e71-44e4-a933-5dc0b48667f9 req-ec5daa69-dcf8-4e90-9e85-b3c0e4c988f1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Received event network-vif-plugged-8d128323-81b9-4ffa-92c7-39f1389aa99e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:27:33 compute-0 nova_compute[183075]: 2026-01-22 17:27:33.181 183079 DEBUG oslo_concurrency.lockutils [req-8cb3d10a-6e71-44e4-a933-5dc0b48667f9 req-ec5daa69-dcf8-4e90-9e85-b3c0e4c988f1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "0a31beaa-1978-4bdc-b51e-23750ba51b8a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:27:33 compute-0 nova_compute[183075]: 2026-01-22 17:27:33.182 183079 DEBUG oslo_concurrency.lockutils [req-8cb3d10a-6e71-44e4-a933-5dc0b48667f9 req-ec5daa69-dcf8-4e90-9e85-b3c0e4c988f1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "0a31beaa-1978-4bdc-b51e-23750ba51b8a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:27:33 compute-0 nova_compute[183075]: 2026-01-22 17:27:33.182 183079 DEBUG oslo_concurrency.lockutils [req-8cb3d10a-6e71-44e4-a933-5dc0b48667f9 req-ec5daa69-dcf8-4e90-9e85-b3c0e4c988f1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "0a31beaa-1978-4bdc-b51e-23750ba51b8a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:27:33 compute-0 nova_compute[183075]: 2026-01-22 17:27:33.182 183079 DEBUG nova.compute.manager [req-8cb3d10a-6e71-44e4-a933-5dc0b48667f9 req-ec5daa69-dcf8-4e90-9e85-b3c0e4c988f1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Processing event network-vif-plugged-8d128323-81b9-4ffa-92c7-39f1389aa99e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:27:33 compute-0 nova_compute[183075]: 2026-01-22 17:27:33.183 183079 DEBUG nova.compute.manager [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:27:33 compute-0 nova_compute[183075]: 2026-01-22 17:27:33.188 183079 DEBUG nova.virt.libvirt.driver [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:27:33 compute-0 nova_compute[183075]: 2026-01-22 17:27:33.189 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102853.1879494, 0a31beaa-1978-4bdc-b51e-23750ba51b8a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:27:33 compute-0 nova_compute[183075]: 2026-01-22 17:27:33.189 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] VM Resumed (Lifecycle Event)
Jan 22 17:27:33 compute-0 nova_compute[183075]: 2026-01-22 17:27:33.194 183079 INFO nova.virt.libvirt.driver [-] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Instance spawned successfully.
Jan 22 17:27:33 compute-0 nova_compute[183075]: 2026-01-22 17:27:33.194 183079 DEBUG nova.virt.libvirt.driver [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:27:33 compute-0 nova_compute[183075]: 2026-01-22 17:27:33.215 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:27:33 compute-0 nova_compute[183075]: 2026-01-22 17:27:33.221 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:27:33 compute-0 nova_compute[183075]: 2026-01-22 17:27:33.226 183079 DEBUG nova.virt.libvirt.driver [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:27:33 compute-0 nova_compute[183075]: 2026-01-22 17:27:33.227 183079 DEBUG nova.virt.libvirt.driver [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:27:33 compute-0 nova_compute[183075]: 2026-01-22 17:27:33.227 183079 DEBUG nova.virt.libvirt.driver [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:27:33 compute-0 nova_compute[183075]: 2026-01-22 17:27:33.228 183079 DEBUG nova.virt.libvirt.driver [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:27:33 compute-0 nova_compute[183075]: 2026-01-22 17:27:33.228 183079 DEBUG nova.virt.libvirt.driver [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:27:33 compute-0 nova_compute[183075]: 2026-01-22 17:27:33.229 183079 DEBUG nova.virt.libvirt.driver [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:27:33 compute-0 nova_compute[183075]: 2026-01-22 17:27:33.237 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:27:33 compute-0 nova_compute[183075]: 2026-01-22 17:27:33.273 183079 INFO nova.compute.manager [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Took 5.80 seconds to spawn the instance on the hypervisor.
Jan 22 17:27:33 compute-0 nova_compute[183075]: 2026-01-22 17:27:33.274 183079 DEBUG nova.compute.manager [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:27:33 compute-0 nova_compute[183075]: 2026-01-22 17:27:33.335 183079 INFO nova.compute.manager [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Took 6.44 seconds to build instance.
Jan 22 17:27:33 compute-0 nova_compute[183075]: 2026-01-22 17:27:33.348 183079 DEBUG oslo_concurrency.lockutils [None req-d1261ff7-72ab-49ac-9eea-75736c048786 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "0a31beaa-1978-4bdc-b51e-23750ba51b8a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.514s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:27:33 compute-0 ovn_controller[95372]: 2026-01-22T17:27:33Z|00532|binding|INFO|Releasing lport 23a8c53c-68ec-4f6e-8f99-d6d830063975 from this chassis (sb_readonly=0)
Jan 22 17:27:33 compute-0 nova_compute[183075]: 2026-01-22 17:27:33.712 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:34 compute-0 nova_compute[183075]: 2026-01-22 17:27:34.700 183079 DEBUG nova.network.neutron [req-4a267925-e193-43a3-b452-7847561da77e req-56333f3e-47a8-4c2c-9034-e99a0a3236e9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Updated VIF entry in instance network info cache for port 8d128323-81b9-4ffa-92c7-39f1389aa99e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:27:34 compute-0 nova_compute[183075]: 2026-01-22 17:27:34.701 183079 DEBUG nova.network.neutron [req-4a267925-e193-43a3-b452-7847561da77e req-56333f3e-47a8-4c2c-9034-e99a0a3236e9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Updating instance_info_cache with network_info: [{"id": "8d128323-81b9-4ffa-92c7-39f1389aa99e", "address": "fa:16:3e:e5:fb:b4", "network": {"id": "900bfc1d-c57a-4f7e-92bf-4e7a876cd570", "bridge": "br-int", "label": "tempest-test-network--1765746424", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cc38cd108414e729395073de19dceae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d128323-81", "ovs_interfaceid": "8d128323-81b9-4ffa-92c7-39f1389aa99e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:27:34 compute-0 nova_compute[183075]: 2026-01-22 17:27:34.730 183079 DEBUG oslo_concurrency.lockutils [req-4a267925-e193-43a3-b452-7847561da77e req-56333f3e-47a8-4c2c-9034-e99a0a3236e9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-0a31beaa-1978-4bdc-b51e-23750ba51b8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:27:34 compute-0 nova_compute[183075]: 2026-01-22 17:27:34.811 183079 INFO nova.compute.manager [None req-11637e61-d3a8-4130-bff7-a0fc37a62816 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Get console output
Jan 22 17:27:35 compute-0 nova_compute[183075]: 2026-01-22 17:27:35.266 183079 DEBUG nova.compute.manager [req-87b6dd87-1d0c-45d5-ac2d-904b0f36727d req-aab37605-aee9-4239-a609-10b8f8f19bb5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Received event network-vif-plugged-8d128323-81b9-4ffa-92c7-39f1389aa99e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:27:35 compute-0 nova_compute[183075]: 2026-01-22 17:27:35.266 183079 DEBUG oslo_concurrency.lockutils [req-87b6dd87-1d0c-45d5-ac2d-904b0f36727d req-aab37605-aee9-4239-a609-10b8f8f19bb5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "0a31beaa-1978-4bdc-b51e-23750ba51b8a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:27:35 compute-0 nova_compute[183075]: 2026-01-22 17:27:35.267 183079 DEBUG oslo_concurrency.lockutils [req-87b6dd87-1d0c-45d5-ac2d-904b0f36727d req-aab37605-aee9-4239-a609-10b8f8f19bb5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "0a31beaa-1978-4bdc-b51e-23750ba51b8a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:27:35 compute-0 nova_compute[183075]: 2026-01-22 17:27:35.267 183079 DEBUG oslo_concurrency.lockutils [req-87b6dd87-1d0c-45d5-ac2d-904b0f36727d req-aab37605-aee9-4239-a609-10b8f8f19bb5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "0a31beaa-1978-4bdc-b51e-23750ba51b8a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:27:35 compute-0 nova_compute[183075]: 2026-01-22 17:27:35.267 183079 DEBUG nova.compute.manager [req-87b6dd87-1d0c-45d5-ac2d-904b0f36727d req-aab37605-aee9-4239-a609-10b8f8f19bb5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] No waiting events found dispatching network-vif-plugged-8d128323-81b9-4ffa-92c7-39f1389aa99e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:27:35 compute-0 nova_compute[183075]: 2026-01-22 17:27:35.267 183079 WARNING nova.compute.manager [req-87b6dd87-1d0c-45d5-ac2d-904b0f36727d req-aab37605-aee9-4239-a609-10b8f8f19bb5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Received unexpected event network-vif-plugged-8d128323-81b9-4ffa-92c7-39f1389aa99e for instance with vm_state active and task_state None.
Jan 22 17:27:37 compute-0 nova_compute[183075]: 2026-01-22 17:27:37.134 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:37 compute-0 nova_compute[183075]: 2026-01-22 17:27:37.167 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:38 compute-0 podman[230953]: 2026-01-22 17:27:38.350499721 +0000 UTC m=+0.058903194 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 17:27:39 compute-0 nova_compute[183075]: 2026-01-22 17:27:39.935 183079 INFO nova.compute.manager [None req-7ca8f899-f527-42ef-afef-d3bbaf1e719e 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Get console output
Jan 22 17:27:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:41.944 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:27:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:41.944 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:27:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:41.945 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:27:42 compute-0 sshd-session[230977]: Received disconnect from 45.148.10.141 port 28184:11:  [preauth]
Jan 22 17:27:42 compute-0 sshd-session[230977]: Disconnected from authenticating user root 45.148.10.141 port 28184 [preauth]
Jan 22 17:27:42 compute-0 nova_compute[183075]: 2026-01-22 17:27:42.136 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:42 compute-0 nova_compute[183075]: 2026-01-22 17:27:42.169 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:43 compute-0 nova_compute[183075]: 2026-01-22 17:27:43.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:27:44 compute-0 podman[230993]: 2026-01-22 17:27:44.394199377 +0000 UTC m=+0.094312770 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:27:45 compute-0 nova_compute[183075]: 2026-01-22 17:27:45.158 183079 INFO nova.compute.manager [None req-b0d4e721-de77-454c-ac37-b007b1464de4 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Get console output
Jan 22 17:27:45 compute-0 nova_compute[183075]: 2026-01-22 17:27:45.164 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:27:46 compute-0 ovn_controller[95372]: 2026-01-22T17:27:46Z|00098|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e5:fb:b4 10.100.0.7
Jan 22 17:27:46 compute-0 ovn_controller[95372]: 2026-01-22T17:27:46Z|00099|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e5:fb:b4 10.100.0.7
Jan 22 17:27:47 compute-0 nova_compute[183075]: 2026-01-22 17:27:47.137 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:47 compute-0 nova_compute[183075]: 2026-01-22 17:27:47.172 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:50 compute-0 nova_compute[183075]: 2026-01-22 17:27:50.432 183079 INFO nova.compute.manager [None req-d1dcb689-b047-406f-95fa-3a0a0b17dc23 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Get console output
Jan 22 17:27:50 compute-0 nova_compute[183075]: 2026-01-22 17:27:50.436 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:27:52 compute-0 nova_compute[183075]: 2026-01-22 17:27:52.139 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:52 compute-0 nova_compute[183075]: 2026-01-22 17:27:52.174 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:52.755 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:27:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:52.755 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:27:52 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:27:52 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:27:52 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:27:52 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:27:52 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:27:52 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:27:52 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.221 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.222 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.4663439
Jan 22 17:27:53 compute-0 haproxy-metadata-proxy-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230621]: 10.100.0.7:50106 [22/Jan/2026:17:27:52.754] listener listener/metadata 0/0/0/468/468 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.229 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.230 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.245 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:27:53 compute-0 haproxy-metadata-proxy-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230621]: 10.100.0.7:50116 [22/Jan/2026:17:27:53.229] listener listener/metadata 0/0/0/16/16 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.245 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0155780
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.248 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.248 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.271 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:27:53 compute-0 haproxy-metadata-proxy-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230621]: 10.100.0.7:50122 [22/Jan/2026:17:27:53.248] listener listener/metadata 0/0/0/23/23 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.272 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0232100
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.276 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.276 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.291 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.291 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0146587
Jan 22 17:27:53 compute-0 haproxy-metadata-proxy-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230621]: 10.100.0.7:50134 [22/Jan/2026:17:27:53.275] listener listener/metadata 0/0/0/15/15 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.296 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.296 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.312 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.312 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0164146
Jan 22 17:27:53 compute-0 haproxy-metadata-proxy-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230621]: 10.100.0.7:50148 [22/Jan/2026:17:27:53.295] listener listener/metadata 0/0/0/17/17 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.317 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.317 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.332 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.332 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0152001
Jan 22 17:27:53 compute-0 haproxy-metadata-proxy-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230621]: 10.100.0.7:50154 [22/Jan/2026:17:27:53.316] listener listener/metadata 0/0/0/16/16 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.336 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.336 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.351 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:27:53 compute-0 haproxy-metadata-proxy-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230621]: 10.100.0.7:50156 [22/Jan/2026:17:27:53.336] listener listener/metadata 0/0/0/15/15 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.352 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0150630
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.355 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.356 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.376 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.376 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0201378
Jan 22 17:27:53 compute-0 haproxy-metadata-proxy-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230621]: 10.100.0.7:50158 [22/Jan/2026:17:27:53.355] listener listener/metadata 0/0/0/21/21 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.381 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.381 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.396 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:27:53 compute-0 haproxy-metadata-proxy-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230621]: 10.100.0.7:50170 [22/Jan/2026:17:27:53.380] listener listener/metadata 0/0/0/16/16 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.397 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 169 time: 0.0159702
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.401 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.402 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.416 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.416 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 169 time: 0.0144801
Jan 22 17:27:53 compute-0 haproxy-metadata-proxy-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230621]: 10.100.0.7:50186 [22/Jan/2026:17:27:53.401] listener listener/metadata 0/0/0/15/15 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.421 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.421 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:27:53 compute-0 haproxy-metadata-proxy-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230621]: 10.100.0.7:50202 [22/Jan/2026:17:27:53.421] listener listener/metadata 0/0/0/12/12 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.433 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0116954
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.440 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.441 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.455 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.455 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0141294
Jan 22 17:27:53 compute-0 haproxy-metadata-proxy-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230621]: 10.100.0.7:50212 [22/Jan/2026:17:27:53.440] listener listener/metadata 0/0/0/14/14 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.459 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.459 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.474 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.475 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0154531
Jan 22 17:27:53 compute-0 haproxy-metadata-proxy-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230621]: 10.100.0.7:50228 [22/Jan/2026:17:27:53.458] listener listener/metadata 0/0/0/16/16 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.478 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.479 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.494 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:27:53 compute-0 haproxy-metadata-proxy-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230621]: 10.100.0.7:50242 [22/Jan/2026:17:27:53.478] listener listener/metadata 0/0/0/16/16 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.494 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0152144
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.500 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.501 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.516 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:27:53 compute-0 haproxy-metadata-proxy-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230621]: 10.100.0.7:50248 [22/Jan/2026:17:27:53.500] listener listener/metadata 0/0/0/16/16 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.516 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 169 time: 0.0151608
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.524 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.525 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.539 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.539 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0142772
Jan 22 17:27:53 compute-0 haproxy-metadata-proxy-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230621]: 10.100.0.7:50258 [22/Jan/2026:17:27:53.524] listener listener/metadata 0/0/0/15/15 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.688 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:27:53 compute-0 nova_compute[183075]: 2026-01-22 17:27:53.689 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:27:53.690 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:27:54 compute-0 podman[231021]: 2026-01-22 17:27:54.338494375 +0000 UTC m=+0.047368846 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 22 17:27:54 compute-0 podman[231022]: 2026-01-22 17:27:54.349569511 +0000 UTC m=+0.055201856 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git)
Jan 22 17:27:54 compute-0 podman[231020]: 2026-01-22 17:27:54.371400014 +0000 UTC m=+0.081966980 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.458 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a', 'name': 'tempest-server-8d128323-779265437', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000030', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '6cc38cd108414e729395073de19dceae', 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'hostId': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.460 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7', 'name': 'tempest-server-06bfc5f7-2022213617', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000002f', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '6cc38cd108414e729395073de19dceae', 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'hostId': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.460 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.463 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 0a31beaa-1978-4bdc-b51e-23750ba51b8a / tap8d128323-81 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.464 12 DEBUG ceilometer.compute.pollsters [-] 0a31beaa-1978-4bdc-b51e-23750ba51b8a/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.466 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for b3d0d846-1f10-43c4-8e0c-9ba93967fdc7 / tap06bfc5f7-21 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.467 12 DEBUG ceilometer.compute.pollsters [-] b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '16be6d64-923a-40ea-85bd-c59761b47bf7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': 'instance-00000030-0a31beaa-1978-4bdc-b51e-23750ba51b8a-tap8d128323-81', 'timestamp': '2026-01-22T17:27:55.461076', 'resource_metadata': {'display_name': 'tempest-server-8d128323-779265437', 'name': 'tap8d128323-81', 'instance_id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e5:fb:b4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8d128323-81'}, 'message_id': 'b02305a8-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.221505208, 'message_signature': '00c7102936994d9ee8c3e77db4847794f521f0eb1249f719a0a6353fbc027bdb'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': 'instance-0000002f-b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-tap06bfc5f7-21', 'timestamp': '2026-01-22T17:27:55.461076', 'resource_metadata': {'display_name': 'tempest-server-06bfc5f7-2022213617', 'name': 'tap06bfc5f7-21', 'instance_id': 'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9e:13:c8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06bfc5f7-21'}, 'message_id': 'b023694e-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.225229778, 'message_signature': '2461c768f59c7fe02078ce314a3ad36e572adb919a8a95bbbf26f96b25f047ed'}]}, 'timestamp': '2026-01-22 17:27:55.467319', '_unique_id': 'b1e00adf55f54ea785ce694af4478ff3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.468 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.469 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.480 12 DEBUG ceilometer.compute.pollsters [-] 0a31beaa-1978-4bdc-b51e-23750ba51b8a/disk.device.read.requests volume: 1113 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.496 12 DEBUG ceilometer.compute.pollsters [-] b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/disk.device.read.requests volume: 1134 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '32f53f6c-4813-4908-8b77-df7a88872ad9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1113, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a-vda', 'timestamp': '2026-01-22T17:27:55.469190', 'resource_metadata': {'display_name': 'tempest-server-8d128323-779265437', 'name': 'instance-00000030', 'instance_id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b0257e28-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.229610835, 'message_signature': '9afcec8dd48cbf269f556b61c584dc8561a7d478ce7713b77245c487d67bb4ad'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1134, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 
'resource_id': 'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-vda', 'timestamp': '2026-01-22T17:27:55.469190', 'resource_metadata': {'display_name': 'tempest-server-06bfc5f7-2022213617', 'name': 'instance-0000002f', 'instance_id': 'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b02801ac-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.24142506, 'message_signature': '73f4f83ec1bb0cf824ffd3359e2815fd1e5fd4ad50f7f0d8c23bb26226d2a8b6'}]}, 'timestamp': '2026-01-22 17:27:55.497654', '_unique_id': 'e0d7c835c58c4043b50b9786b64860db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.499 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.500 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.510 12 DEBUG ceilometer.compute.pollsters [-] 0a31beaa-1978-4bdc-b51e-23750ba51b8a/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.518 12 DEBUG ceilometer.compute.pollsters [-] b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6f76578d-780d-4221-a67d-83c346064b03', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a-vda', 'timestamp': '2026-01-22T17:27:55.500884', 'resource_metadata': {'display_name': 'tempest-server-8d128323-779265437', 'name': 'instance-00000030', 'instance_id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b02a13de-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.261389714, 'message_signature': 'a845c43033bccceed30715a8c65699f41cb5b84451fc5b57e6af194788e4da62'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': 
'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-vda', 'timestamp': '2026-01-22T17:27:55.500884', 'resource_metadata': {'display_name': 'tempest-server-06bfc5f7-2022213617', 'name': 'instance-0000002f', 'instance_id': 'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b02b4394-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.271647838, 'message_signature': 'c811ea10a5423bdc01c24539a95b4f264b0eb089f3e1bfa5f6bd7a1a608b21d4'}]}, 'timestamp': '2026-01-22 17:27:55.518926', '_unique_id': '00b40d4e896b4bacb786197fddd9b4c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.520 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.521 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.521 12 DEBUG ceilometer.compute.pollsters [-] 0a31beaa-1978-4bdc-b51e-23750ba51b8a/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.521 12 DEBUG ceilometer.compute.pollsters [-] b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8ab152e7-de6f-4339-82bc-61ba4edf4299', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': 'instance-00000030-0a31beaa-1978-4bdc-b51e-23750ba51b8a-tap8d128323-81', 'timestamp': '2026-01-22T17:27:55.521353', 'resource_metadata': {'display_name': 'tempest-server-8d128323-779265437', 'name': 'tap8d128323-81', 'instance_id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e5:fb:b4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8d128323-81'}, 'message_id': 'b02bb4aa-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.221505208, 'message_signature': '45121375fa8c7031e76ce33cebcc4a6533257004dc1831e87b59448cc297ddb0'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': 'instance-0000002f-b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-tap06bfc5f7-21', 'timestamp': '2026-01-22T17:27:55.521353', 'resource_metadata': {'display_name': 'tempest-server-06bfc5f7-2022213617', 'name': 'tap06bfc5f7-21', 'instance_id': 'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9e:13:c8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06bfc5f7-21'}, 'message_id': 'b02bc0f8-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.225229778, 'message_signature': '84bc88472b980499f86be3edafbeccb92ff8fca0cb23db4cf9feafc8ac445a15'}]}, 'timestamp': '2026-01-22 17:27:55.522012', '_unique_id': 'b2ff2aeee7b24e2db4bf573bf65c845d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.522 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.523 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.523 12 DEBUG ceilometer.compute.pollsters [-] 0a31beaa-1978-4bdc-b51e-23750ba51b8a/disk.device.write.requests volume: 312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.523 12 DEBUG ceilometer.compute.pollsters [-] b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/disk.device.write.requests volume: 346 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab3d6e88-e0dd-4d0c-a38f-d95a4e2f2a29', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 312, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a-vda', 'timestamp': '2026-01-22T17:27:55.523583', 'resource_metadata': {'display_name': 'tempest-server-8d128323-779265437', 'name': 'instance-00000030', 'instance_id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b02c0c48-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.229610835, 'message_signature': 'b5ba2cd735778e4ada0ec596b9c1cbd6bc1741613ae593ed341d6c72c68ca129'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 346, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 
'resource_id': 'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-vda', 'timestamp': '2026-01-22T17:27:55.523583', 'resource_metadata': {'display_name': 'tempest-server-06bfc5f7-2022213617', 'name': 'instance-0000002f', 'instance_id': 'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b02c17d8-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.24142506, 'message_signature': '9575eca112ce364fc38288d8b99f643509951966d91828baf9ef0cfae69f920b'}]}, 'timestamp': '2026-01-22 17:27:55.524223', '_unique_id': '05e08cf7f3d642a391558dfa03fc60a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.525 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.526 12 DEBUG ceilometer.compute.pollsters [-] 0a31beaa-1978-4bdc-b51e-23750ba51b8a/network.incoming.bytes volume: 7314 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.526 12 DEBUG ceilometer.compute.pollsters [-] b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/network.incoming.bytes volume: 7491 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6aa06910-d98f-482e-96fd-a0bb96515b4e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7314, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': 'instance-00000030-0a31beaa-1978-4bdc-b51e-23750ba51b8a-tap8d128323-81', 'timestamp': '2026-01-22T17:27:55.526031', 'resource_metadata': {'display_name': 'tempest-server-8d128323-779265437', 'name': 'tap8d128323-81', 'instance_id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e5:fb:b4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8d128323-81'}, 'message_id': 'b02c6a44-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.221505208, 'message_signature': '975045039b5e84f800734a7ee4bd2d60858022518fa5644054b51b6d141b2711'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7491, 'user_id': 
'841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': 'instance-0000002f-b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-tap06bfc5f7-21', 'timestamp': '2026-01-22T17:27:55.526031', 'resource_metadata': {'display_name': 'tempest-server-06bfc5f7-2022213617', 'name': 'tap06bfc5f7-21', 'instance_id': 'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9e:13:c8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06bfc5f7-21'}, 'message_id': 'b02c764c-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.225229778, 'message_signature': 'bc2bb357e3f0c6894815be2f35eb61c88b67655c450c43b234bc65775dbb3afc'}]}, 'timestamp': '2026-01-22 17:27:55.526690', '_unique_id': '72fd9211d7c84b8a896b0b74cee8685e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.527 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.528 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.528 12 DEBUG ceilometer.compute.pollsters [-] 0a31beaa-1978-4bdc-b51e-23750ba51b8a/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.529 12 DEBUG ceilometer.compute.pollsters [-] b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f534a798-4baf-4d24-916b-4c1f42e19d37', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': 'instance-00000030-0a31beaa-1978-4bdc-b51e-23750ba51b8a-tap8d128323-81', 'timestamp': '2026-01-22T17:27:55.528594', 'resource_metadata': {'display_name': 'tempest-server-8d128323-779265437', 'name': 'tap8d128323-81', 'instance_id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e5:fb:b4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8d128323-81'}, 'message_id': 'b02ccffc-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.221505208, 'message_signature': '467e72760516ed1e2db262e2fc4a0fbc25adf67fa90782a8b763646fdcaac40a'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': 'instance-0000002f-b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-tap06bfc5f7-21', 'timestamp': '2026-01-22T17:27:55.528594', 'resource_metadata': {'display_name': 'tempest-server-06bfc5f7-2022213617', 'name': 'tap06bfc5f7-21', 'instance_id': 'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9e:13:c8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06bfc5f7-21'}, 'message_id': 'b02ce5b4-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.225229778, 'message_signature': 'd1c2102508c2818825f159961936def1e6095deed659836f7183bdd10952240c'}]}, 'timestamp': '2026-01-22 17:27:55.529522', '_unique_id': 'a638dacf30084c0e9e4c284685fe7629'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.530 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.531 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.531 12 DEBUG ceilometer.compute.pollsters [-] 0a31beaa-1978-4bdc-b51e-23750ba51b8a/disk.device.read.latency volume: 147440447 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.531 12 DEBUG ceilometer.compute.pollsters [-] b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/disk.device.read.latency volume: 210453256 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e5a3f99b-ae2d-4c27-81b9-1805be225a7b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 147440447, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a-vda', 'timestamp': '2026-01-22T17:27:55.531531', 'resource_metadata': {'display_name': 'tempest-server-8d128323-779265437', 'name': 'instance-00000030', 'instance_id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b02d422a-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.229610835, 'message_signature': '220fff043917fa1a30adba3f0eca82be64b4ec4cd7c47a6fd99ce004ad9abaa0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 210453256, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 
'resource_id': 'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-vda', 'timestamp': '2026-01-22T17:27:55.531531', 'resource_metadata': {'display_name': 'tempest-server-06bfc5f7-2022213617', 'name': 'instance-0000002f', 'instance_id': 'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b02d4dd8-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.24142506, 'message_signature': '9b368d1c933450d948724b653d1e102dc2f3996d46da31cfbbdbc964a25dba39'}]}, 'timestamp': '2026-01-22 17:27:55.532169', '_unique_id': 'e3d16a197b3649e18573d928e7fe7a45'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.533 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.534 12 DEBUG ceilometer.compute.pollsters [-] 0a31beaa-1978-4bdc-b51e-23750ba51b8a/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.534 12 DEBUG ceilometer.compute.pollsters [-] b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e2c1c67c-439f-4505-a9af-92da587765e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': 'instance-00000030-0a31beaa-1978-4bdc-b51e-23750ba51b8a-tap8d128323-81', 'timestamp': '2026-01-22T17:27:55.534013', 'resource_metadata': {'display_name': 'tempest-server-8d128323-779265437', 'name': 'tap8d128323-81', 'instance_id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e5:fb:b4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8d128323-81'}, 'message_id': 'b02da1fc-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.221505208, 'message_signature': 'd27e1da1b1966385f97be29e5975240581be5fb64edadab50fbe0a64223e0213'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': 'instance-0000002f-b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-tap06bfc5f7-21', 'timestamp': '2026-01-22T17:27:55.534013', 'resource_metadata': {'display_name': 'tempest-server-06bfc5f7-2022213617', 'name': 'tap06bfc5f7-21', 'instance_id': 'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9e:13:c8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06bfc5f7-21'}, 'message_id': 'b02dadbe-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.225229778, 'message_signature': '3c252f962101cbf1739e0d4bb53edc9c018bd9c6655caa0338ecfde595fe1745'}]}, 'timestamp': '2026-01-22 17:27:55.534649', '_unique_id': '57fae3e56cf84f53a4d07e99e099806f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.535 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.536 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.536 12 DEBUG ceilometer.compute.pollsters [-] 0a31beaa-1978-4bdc-b51e-23750ba51b8a/network.outgoing.bytes volume: 10121 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.536 12 DEBUG ceilometer.compute.pollsters [-] b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/network.outgoing.bytes volume: 10616 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fb569965-6bf3-4943-a07d-f7a3801f06e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10121, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': 'instance-00000030-0a31beaa-1978-4bdc-b51e-23750ba51b8a-tap8d128323-81', 'timestamp': '2026-01-22T17:27:55.536286', 'resource_metadata': {'display_name': 'tempest-server-8d128323-779265437', 'name': 'tap8d128323-81', 'instance_id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e5:fb:b4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8d128323-81'}, 'message_id': 'b02dfae4-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.221505208, 'message_signature': '03445a31dd1be68353eb367f45a088cfb2b7770a124b5abf79cfbc2848062d8c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10616, 'user_id': 
'841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': 'instance-0000002f-b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-tap06bfc5f7-21', 'timestamp': '2026-01-22T17:27:55.536286', 'resource_metadata': {'display_name': 'tempest-server-06bfc5f7-2022213617', 'name': 'tap06bfc5f7-21', 'instance_id': 'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9e:13:c8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06bfc5f7-21'}, 'message_id': 'b02e0872-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.225229778, 'message_signature': 'f2f720428f8584efbd37c0d8f042521639c00a51f0182b08c7070ee03069921f'}]}, 'timestamp': '2026-01-22 17:27:55.536964', '_unique_id': 'ad3b92b6bf6e43b3a47d2bde4c522009'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.537 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.538 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.538 12 DEBUG ceilometer.compute.pollsters [-] 0a31beaa-1978-4bdc-b51e-23750ba51b8a/disk.device.write.bytes volume: 72871936 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.539 12 DEBUG ceilometer.compute.pollsters [-] b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/disk.device.write.bytes volume: 73129984 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '48144e7b-fabc-40cf-a0bc-104c87115883', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72871936, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a-vda', 'timestamp': '2026-01-22T17:27:55.538803', 'resource_metadata': {'display_name': 'tempest-server-8d128323-779265437', 'name': 'instance-00000030', 'instance_id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b02e5d90-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.229610835, 'message_signature': '890f37ad908286f6ec782b686b1bc17e8f2e945059c025b728a76aa6eb7ed9dd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73129984, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': 'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-vda', 'timestamp': '2026-01-22T17:27:55.538803', 'resource_metadata': {'display_name': 'tempest-server-06bfc5f7-2022213617', 'name': 'instance-0000002f', 'instance_id': 'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b02e6952-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.24142506, 'message_signature': 'f3757460c9526ce498469c1664f0d8debcaa1cc5fad16639c7747f3ad5ed1677'}]}, 'timestamp': '2026-01-22 17:27:55.539424', '_unique_id': '935d55e132144192a64863515fa8d3d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.540 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.541 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.541 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.541 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-8d128323-779265437>, <NovaLikeServer: tempest-server-06bfc5f7-2022213617>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-8d128323-779265437>, <NovaLikeServer: tempest-server-06bfc5f7-2022213617>]
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.541 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.558 12 DEBUG ceilometer.compute.pollsters [-] 0a31beaa-1978-4bdc-b51e-23750ba51b8a/memory.usage volume: 42.625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.572 12 DEBUG ceilometer.compute.pollsters [-] b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/memory.usage volume: 42.84375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a20e136a-851e-4257-a744-8c33fdcbe129', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.625, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a', 'timestamp': '2026-01-22T17:27:55.541683', 'resource_metadata': {'display_name': 'tempest-server-8d128323-779265437', 'name': 'instance-00000030', 'instance_id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'b0317ade-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.319095145, 'message_signature': '75a7b3ba47f7f1b85e65fdfe88a6ff99e736d5487eea1f2aa71fc930c7515ecb'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.84375, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': 'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7', 'timestamp': '2026-01-22T17:27:55.541683', 'resource_metadata': {'display_name': 'tempest-server-06bfc5f7-2022213617', 'name': 'instance-0000002f', 'instance_id': 'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'b0338856-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.332845253, 'message_signature': '8bd27ab5c6a3763af178220db24b41ddceacee68090fa8ae389be1822a84e97c'}]}, 'timestamp': '2026-01-22 17:27:55.573056', '_unique_id': '557c4975f90b4020b37b30069c669064'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.574 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.575 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.575 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.575 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-8d128323-779265437>, <NovaLikeServer: tempest-server-06bfc5f7-2022213617>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-8d128323-779265437>, <NovaLikeServer: tempest-server-06bfc5f7-2022213617>]
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.575 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.575 12 DEBUG ceilometer.compute.pollsters [-] 0a31beaa-1978-4bdc-b51e-23750ba51b8a/disk.device.usage volume: 29818880 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.576 12 DEBUG ceilometer.compute.pollsters [-] b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f9092da-12c3-4f61-b7af-e398ca9357e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29818880, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a-vda', 'timestamp': '2026-01-22T17:27:55.575755', 'resource_metadata': {'display_name': 'tempest-server-8d128323-779265437', 'name': 'instance-00000030', 'instance_id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b0340006-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.261389714, 'message_signature': '4cfd36510c72e0a66cb483c03905bb9cced9cead924eb39a890805bf732b853b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': 
'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-vda', 'timestamp': '2026-01-22T17:27:55.575755', 'resource_metadata': {'display_name': 'tempest-server-06bfc5f7-2022213617', 'name': 'instance-0000002f', 'instance_id': 'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b0340d9e-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.271647838, 'message_signature': 'b26eb47db01fce6d83f8b888a2271633b61a447d004b06bee51188d548c93836'}]}, 'timestamp': '2026-01-22 17:27:55.576396', '_unique_id': '2d43f769691d43d7ae370bff5da527ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.577 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-8d128323-779265437>, <NovaLikeServer: tempest-server-06bfc5f7-2022213617>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-8d128323-779265437>, <NovaLikeServer: tempest-server-06bfc5f7-2022213617>]
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.578 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.578 12 DEBUG ceilometer.compute.pollsters [-] 0a31beaa-1978-4bdc-b51e-23750ba51b8a/network.outgoing.packets volume: 115 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.578 12 DEBUG ceilometer.compute.pollsters [-] b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/network.outgoing.packets volume: 121 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1474db9-589d-4f93-bfd8-4e65f59aec4e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 115, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': 'instance-00000030-0a31beaa-1978-4bdc-b51e-23750ba51b8a-tap8d128323-81', 'timestamp': '2026-01-22T17:27:55.578240', 'resource_metadata': {'display_name': 'tempest-server-8d128323-779265437', 'name': 'tap8d128323-81', 'instance_id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e5:fb:b4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8d128323-81'}, 'message_id': 'b0346172-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.221505208, 'message_signature': '9ac4895bbb2596edb46137dd20d750d9af5a5ebda787234965ed1c0b835cf3e0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 121, 'user_id': 
'841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': 'instance-0000002f-b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-tap06bfc5f7-21', 'timestamp': '2026-01-22T17:27:55.578240', 'resource_metadata': {'display_name': 'tempest-server-06bfc5f7-2022213617', 'name': 'tap06bfc5f7-21', 'instance_id': 'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9e:13:c8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06bfc5f7-21'}, 'message_id': 'b0346d8e-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.225229778, 'message_signature': '2230b08b958c1a4c51a32f0afa2f7e9f050cbe907f5bb27fd96db773b218db14'}]}, 'timestamp': '2026-01-22 17:27:55.578860', '_unique_id': '8119a10d337e4f23892a6c7096aeef59'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.580 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.581 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.581 12 DEBUG ceilometer.compute.pollsters [-] 0a31beaa-1978-4bdc-b51e-23750ba51b8a/disk.device.write.latency volume: 8271925146 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.581 12 DEBUG ceilometer.compute.pollsters [-] b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/disk.device.write.latency volume: 4093546795 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '895182fc-17fc-4663-b725-8bdb409698ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8271925146, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a-vda', 'timestamp': '2026-01-22T17:27:55.581420', 'resource_metadata': {'display_name': 'tempest-server-8d128323-779265437', 'name': 'instance-00000030', 'instance_id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b034ddaa-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.229610835, 'message_signature': '27ef89b1eb6f4930335311461320bd26e5f64154fc3cee51f40b8f68b4378f51'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4093546795, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 
'resource_id': 'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-vda', 'timestamp': '2026-01-22T17:27:55.581420', 'resource_metadata': {'display_name': 'tempest-server-06bfc5f7-2022213617', 'name': 'instance-0000002f', 'instance_id': 'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b034e8ae-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.24142506, 'message_signature': 'f0aecb4a115d2b5b6cfc40fbc4c0596ad4927fd7d904d847fbea748dd0ac3167'}]}, 'timestamp': '2026-01-22 17:27:55.581973', '_unique_id': '5acdbc511cdc4ecc9e1946fdca9230c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.582 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.583 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.583 12 DEBUG ceilometer.compute.pollsters [-] 0a31beaa-1978-4bdc-b51e-23750ba51b8a/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.583 12 DEBUG ceilometer.compute.pollsters [-] b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6819a8bf-8074-4ade-9ead-c2d7d68a4571', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': 'instance-00000030-0a31beaa-1978-4bdc-b51e-23750ba51b8a-tap8d128323-81', 'timestamp': '2026-01-22T17:27:55.583314', 'resource_metadata': {'display_name': 'tempest-server-8d128323-779265437', 'name': 'tap8d128323-81', 'instance_id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e5:fb:b4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8d128323-81'}, 'message_id': 'b0352634-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.221505208, 'message_signature': '4db5489569686ef403417f3227f7598c2d0f892ab95a90252e146ec3bc4a5e08'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': 'instance-0000002f-b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-tap06bfc5f7-21', 'timestamp': '2026-01-22T17:27:55.583314', 'resource_metadata': {'display_name': 'tempest-server-06bfc5f7-2022213617', 'name': 'tap06bfc5f7-21', 'instance_id': 'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9e:13:c8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06bfc5f7-21'}, 'message_id': 'b0352f6c-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.225229778, 'message_signature': '36873fffcf0e4003ec1da0ddd11be87fc400d34398bf7b64149aba2d97c2a68a'}]}, 'timestamp': '2026-01-22 17:27:55.583781', '_unique_id': '0b60da0e6c754edc9b60fc62f8cced83'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.584 12 DEBUG ceilometer.compute.pollsters [-] 0a31beaa-1978-4bdc-b51e-23750ba51b8a/cpu volume: 11100000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 DEBUG ceilometer.compute.pollsters [-] b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/cpu volume: 11640000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4e222d16-1c43-4d7b-8902-d13ae52fc12f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11100000000, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a', 'timestamp': '2026-01-22T17:27:55.584870', 'resource_metadata': {'display_name': 'tempest-server-8d128323-779265437', 'name': 'instance-00000030', 'instance_id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'b03562c0-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.319095145, 'message_signature': '94823ad556ce6f52d0f4058b89bf7f757e4ac6c59f6bcd73666103f70f2bd2e9'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11640000000, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': 'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7', 
'timestamp': '2026-01-22T17:27:55.584870', 'resource_metadata': {'display_name': 'tempest-server-06bfc5f7-2022213617', 'name': 'instance-0000002f', 'instance_id': 'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'b0356a7c-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.332845253, 'message_signature': '1694bc30fe5d36c02ab95ca652b977cda4690a4ddff7e7a3253660c5767c2070'}]}, 'timestamp': '2026-01-22 17:27:55.585305', '_unique_id': 'fbdc1594c6ab4ae792834e6cbb6f5e9e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.585 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.586 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.586 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.586 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-8d128323-779265437>, <NovaLikeServer: tempest-server-06bfc5f7-2022213617>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-8d128323-779265437>, <NovaLikeServer: tempest-server-06bfc5f7-2022213617>]
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.586 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.586 12 DEBUG ceilometer.compute.pollsters [-] 0a31beaa-1978-4bdc-b51e-23750ba51b8a/disk.device.read.bytes volume: 30050816 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 DEBUG ceilometer.compute.pollsters [-] b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/disk.device.read.bytes volume: 30808576 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bc39bd8d-47cf-46d7-aacf-22bce3148683', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30050816, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a-vda', 'timestamp': '2026-01-22T17:27:55.586945', 'resource_metadata': {'display_name': 'tempest-server-8d128323-779265437', 'name': 'instance-00000030', 'instance_id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b035b3b0-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.229610835, 'message_signature': '46cb52d0fd4fa012aabf56358df1f4625472b8facd2d39277782fab4cb4d7b70'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30808576, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': 
'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-vda', 'timestamp': '2026-01-22T17:27:55.586945', 'resource_metadata': {'display_name': 'tempest-server-06bfc5f7-2022213617', 'name': 'instance-0000002f', 'instance_id': 'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b035bba8-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.24142506, 'message_signature': '318db1ec93474bbf212880a91b063488c8a850a61b812c228b73fab6056516cf'}]}, 'timestamp': '2026-01-22 17:27:55.587362', '_unique_id': '21b2b1d1b37d4a908bb09be5b2d14059'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.587 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.588 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.588 12 DEBUG ceilometer.compute.pollsters [-] 0a31beaa-1978-4bdc-b51e-23750ba51b8a/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.588 12 DEBUG ceilometer.compute.pollsters [-] b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '70f67014-0878-4452-a12a-b404eee04094', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': 'instance-00000030-0a31beaa-1978-4bdc-b51e-23750ba51b8a-tap8d128323-81', 'timestamp': '2026-01-22T17:27:55.588447', 'resource_metadata': {'display_name': 'tempest-server-8d128323-779265437', 'name': 'tap8d128323-81', 'instance_id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e5:fb:b4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8d128323-81'}, 'message_id': 'b035ee48-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.221505208, 'message_signature': 'e68201da097953a4b2607769b2fdf12129c2ac7e3d317a891c8af6943fd805ff'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': 'instance-0000002f-b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-tap06bfc5f7-21', 'timestamp': '2026-01-22T17:27:55.588447', 'resource_metadata': {'display_name': 'tempest-server-06bfc5f7-2022213617', 'name': 'tap06bfc5f7-21', 'instance_id': 'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9e:13:c8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06bfc5f7-21'}, 'message_id': 'b035f776-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.225229778, 'message_signature': '9417a244490218df26eb828b2ce360c04cacbd112d8b0e06dc41c8c204ffe59c'}]}, 'timestamp': '2026-01-22 17:27:55.588899', '_unique_id': '8c4efac256344ea8ab8105ea6a1b7ede'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.589 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.590 12 DEBUG ceilometer.compute.pollsters [-] 0a31beaa-1978-4bdc-b51e-23750ba51b8a/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.590 12 DEBUG ceilometer.compute.pollsters [-] b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cf6284df-d67f-4fc7-87a9-f489ce22aa01', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a-vda', 'timestamp': '2026-01-22T17:27:55.590038', 'resource_metadata': {'display_name': 'tempest-server-8d128323-779265437', 'name': 'instance-00000030', 'instance_id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b0362db8-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.261389714, 'message_signature': 'ab3e5ebe8f76fa954c7fa52ec7af696f75682011e2532bb5d1933b08bfebc5dd'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': 'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-vda', 'timestamp': '2026-01-22T17:27:55.590038', 'resource_metadata': {'display_name': 'tempest-server-06bfc5f7-2022213617', 'name': 'instance-0000002f', 'instance_id': 'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b0363862-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.271647838, 'message_signature': '467dc273b4d06be0ef4e21f8826038731317e3f6311b589bd6b697874ff21788'}]}, 'timestamp': '2026-01-22 17:27:55.590565', '_unique_id': '7fcd034c2c3d4197996a7603e938b0cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.591 12 DEBUG ceilometer.compute.pollsters [-] 0a31beaa-1978-4bdc-b51e-23750ba51b8a/network.incoming.packets volume: 62 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 DEBUG ceilometer.compute.pollsters [-] b3d0d846-1f10-43c4-8e0c-9ba93967fdc7/network.incoming.packets volume: 65 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f2f6815-5f5d-4f1d-b933-fb6115872f87', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 62, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': 'instance-00000030-0a31beaa-1978-4bdc-b51e-23750ba51b8a-tap8d128323-81', 'timestamp': '2026-01-22T17:27:55.591793', 'resource_metadata': {'display_name': 'tempest-server-8d128323-779265437', 'name': 'tap8d128323-81', 'instance_id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e5:fb:b4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8d128323-81'}, 'message_id': 'b036728c-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.221505208, 'message_signature': '5af0a46218604d994217efc5ba6d023514c32e3e03b2c8a40fb2085c56a71d75'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 65, 'user_id': '841f48f635f54f619a9de1d6bbc8f832', 'user_name': None, 'project_id': '6cc38cd108414e729395073de19dceae', 'project_name': None, 'resource_id': 'instance-0000002f-b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-tap06bfc5f7-21', 'timestamp': '2026-01-22T17:27:55.591793', 'resource_metadata': {'display_name': 'tempest-server-06bfc5f7-2022213617', 'name': 'tap06bfc5f7-21', 'instance_id': 'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7', 'instance_type': 'm1.nano', 'host': '554cc96be986cb98bb9736f082d51bb0a8a19cabbd2f4c110f1715ca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9e:13:c8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06bfc5f7-21'}, 'message_id': 'b0367d4a-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5239.225229778, 'message_signature': '80e1198c125f85553b3daf6846531ff3bec5ee182236613f693ed2b3d3e77241'}]}, 'timestamp': '2026-01-22 17:27:55.592336', '_unique_id': '61e2152b74c8464bbf5b9603b05d4cab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:27:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:27:55.592 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:27:55 compute-0 nova_compute[183075]: 2026-01-22 17:27:55.621 183079 INFO nova.compute.manager [None req-969b7858-5747-4b5e-85d7-4a55e441a39e 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Get console output
Jan 22 17:27:55 compute-0 nova_compute[183075]: 2026-01-22 17:27:55.625 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:27:57 compute-0 nova_compute[183075]: 2026-01-22 17:27:57.213 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:27:58 compute-0 nova_compute[183075]: 2026-01-22 17:27:57.999 183079 INFO nova.compute.manager [None req-1abe3b00-bc17-4f10-901c-30bdfa32cfc3 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Get console output
Jan 22 17:27:58 compute-0 nova_compute[183075]: 2026-01-22 17:27:58.006 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:27:59 compute-0 podman[231083]: 2026-01-22 17:27:59.35692591 +0000 UTC m=+0.061351100 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 17:28:02 compute-0 nova_compute[183075]: 2026-01-22 17:28:02.226 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:28:02 compute-0 nova_compute[183075]: 2026-01-22 17:28:02.227 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:02 compute-0 nova_compute[183075]: 2026-01-22 17:28:02.227 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5013 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 17:28:02 compute-0 nova_compute[183075]: 2026-01-22 17:28:02.227 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 17:28:02 compute-0 nova_compute[183075]: 2026-01-22 17:28:02.227 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 17:28:02 compute-0 nova_compute[183075]: 2026-01-22 17:28:02.229 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:02 compute-0 nova_compute[183075]: 2026-01-22 17:28:02.564 183079 INFO nova.compute.manager [None req-f2dbd8fa-f04b-4f44-b163-213b3b30f440 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Get console output
Jan 22 17:28:02 compute-0 nova_compute[183075]: 2026-01-22 17:28:02.571 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:28:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:02.693 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:28:07 compute-0 nova_compute[183075]: 2026-01-22 17:28:07.231 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:28:07 compute-0 nova_compute[183075]: 2026-01-22 17:28:07.233 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:28:07 compute-0 nova_compute[183075]: 2026-01-22 17:28:07.233 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 17:28:07 compute-0 nova_compute[183075]: 2026-01-22 17:28:07.233 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 17:28:07 compute-0 nova_compute[183075]: 2026-01-22 17:28:07.269 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:07 compute-0 nova_compute[183075]: 2026-01-22 17:28:07.270 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 17:28:07 compute-0 nova_compute[183075]: 2026-01-22 17:28:07.532 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:07 compute-0 NetworkManager[55454]: <info>  [1769102887.5337] manager: (patch-br-int-to-provnet-c63dea44-3fe3-44c3-ba47-76ee1ea0ad94): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/227)
Jan 22 17:28:07 compute-0 NetworkManager[55454]: <info>  [1769102887.5357] manager: (patch-provnet-c63dea44-3fe3-44c3-ba47-76ee1ea0ad94-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/228)
Jan 22 17:28:07 compute-0 ovn_controller[95372]: 2026-01-22T17:28:07Z|00533|binding|INFO|Releasing lport 23a8c53c-68ec-4f6e-8f99-d6d830063975 from this chassis (sb_readonly=0)
Jan 22 17:28:07 compute-0 nova_compute[183075]: 2026-01-22 17:28:07.602 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:07 compute-0 nova_compute[183075]: 2026-01-22 17:28:07.609 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:08 compute-0 nova_compute[183075]: 2026-01-22 17:28:08.963 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:08 compute-0 nova_compute[183075]: 2026-01-22 17:28:08.967 183079 DEBUG nova.compute.manager [req-f442904c-669c-4850-b5d6-ba5ee82f089b req-d69f6751-4f5f-4bc2-8a94-0106e4594b0c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Received event network-changed-06bfc5f7-2163-4a40-87a7-050edd036a92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:28:08 compute-0 nova_compute[183075]: 2026-01-22 17:28:08.967 183079 DEBUG nova.compute.manager [req-f442904c-669c-4850-b5d6-ba5ee82f089b req-d69f6751-4f5f-4bc2-8a94-0106e4594b0c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Refreshing instance network info cache due to event network-changed-06bfc5f7-2163-4a40-87a7-050edd036a92. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:28:08 compute-0 nova_compute[183075]: 2026-01-22 17:28:08.967 183079 DEBUG oslo_concurrency.lockutils [req-f442904c-669c-4850-b5d6-ba5ee82f089b req-d69f6751-4f5f-4bc2-8a94-0106e4594b0c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-b3d0d846-1f10-43c4-8e0c-9ba93967fdc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:28:08 compute-0 nova_compute[183075]: 2026-01-22 17:28:08.968 183079 DEBUG oslo_concurrency.lockutils [req-f442904c-669c-4850-b5d6-ba5ee82f089b req-d69f6751-4f5f-4bc2-8a94-0106e4594b0c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-b3d0d846-1f10-43c4-8e0c-9ba93967fdc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:28:08 compute-0 nova_compute[183075]: 2026-01-22 17:28:08.968 183079 DEBUG nova.network.neutron [req-f442904c-669c-4850-b5d6-ba5ee82f089b req-d69f6751-4f5f-4bc2-8a94-0106e4594b0c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Refreshing network info cache for port 06bfc5f7-2163-4a40-87a7-050edd036a92 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:28:09 compute-0 podman[231105]: 2026-01-22 17:28:09.39641988 +0000 UTC m=+0.089060550 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 17:28:10 compute-0 nova_compute[183075]: 2026-01-22 17:28:10.575 183079 DEBUG oslo_concurrency.lockutils [None req-296e4e43-eed8-4eab-99d4-f02482bec741 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Acquiring lock "0a31beaa-1978-4bdc-b51e-23750ba51b8a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:28:10 compute-0 nova_compute[183075]: 2026-01-22 17:28:10.576 183079 DEBUG oslo_concurrency.lockutils [None req-296e4e43-eed8-4eab-99d4-f02482bec741 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "0a31beaa-1978-4bdc-b51e-23750ba51b8a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:28:10 compute-0 nova_compute[183075]: 2026-01-22 17:28:10.576 183079 DEBUG oslo_concurrency.lockutils [None req-296e4e43-eed8-4eab-99d4-f02482bec741 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Acquiring lock "0a31beaa-1978-4bdc-b51e-23750ba51b8a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:28:10 compute-0 nova_compute[183075]: 2026-01-22 17:28:10.576 183079 DEBUG oslo_concurrency.lockutils [None req-296e4e43-eed8-4eab-99d4-f02482bec741 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "0a31beaa-1978-4bdc-b51e-23750ba51b8a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:28:10 compute-0 nova_compute[183075]: 2026-01-22 17:28:10.576 183079 DEBUG oslo_concurrency.lockutils [None req-296e4e43-eed8-4eab-99d4-f02482bec741 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "0a31beaa-1978-4bdc-b51e-23750ba51b8a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:28:10 compute-0 nova_compute[183075]: 2026-01-22 17:28:10.577 183079 INFO nova.compute.manager [None req-296e4e43-eed8-4eab-99d4-f02482bec741 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Terminating instance
Jan 22 17:28:10 compute-0 nova_compute[183075]: 2026-01-22 17:28:10.578 183079 DEBUG nova.compute.manager [None req-296e4e43-eed8-4eab-99d4-f02482bec741 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:28:10 compute-0 kernel: tap8d128323-81 (unregistering): left promiscuous mode
Jan 22 17:28:10 compute-0 NetworkManager[55454]: <info>  [1769102890.6038] device (tap8d128323-81): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:28:10 compute-0 nova_compute[183075]: 2026-01-22 17:28:10.615 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:10 compute-0 ovn_controller[95372]: 2026-01-22T17:28:10Z|00534|binding|INFO|Releasing lport 8d128323-81b9-4ffa-92c7-39f1389aa99e from this chassis (sb_readonly=0)
Jan 22 17:28:10 compute-0 ovn_controller[95372]: 2026-01-22T17:28:10Z|00535|binding|INFO|Setting lport 8d128323-81b9-4ffa-92c7-39f1389aa99e down in Southbound
Jan 22 17:28:10 compute-0 ovn_controller[95372]: 2026-01-22T17:28:10Z|00536|binding|INFO|Removing iface tap8d128323-81 ovn-installed in OVS
Jan 22 17:28:10 compute-0 nova_compute[183075]: 2026-01-22 17:28:10.619 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:10.626 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e5:fb:b4 10.100.0.7'], port_security=['fa:16:3e:e5:fb:b4 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '0a31beaa-1978-4bdc-b51e-23750ba51b8a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-900bfc1d-c57a-4f7e-92bf-4e7a876cd570', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6cc38cd108414e729395073de19dceae', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ae587f87-81d1-4ff6-8e92-98dcd41d2886', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.240'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2c590c1d-a36d-48b3-bcde-1b6ee74771df, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=8d128323-81b9-4ffa-92c7-39f1389aa99e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:28:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:10.629 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 8d128323-81b9-4ffa-92c7-39f1389aa99e in datapath 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 unbound from our chassis
Jan 22 17:28:10 compute-0 nova_compute[183075]: 2026-01-22 17:28:10.630 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:10.632 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 900bfc1d-c57a-4f7e-92bf-4e7a876cd570
Jan 22 17:28:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:10.650 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0e41ee3e-cb93-4010-9a7a-e65a0dd816fa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:10 compute-0 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d00000030.scope: Deactivated successfully.
Jan 22 17:28:10 compute-0 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d00000030.scope: Consumed 13.172s CPU time.
Jan 22 17:28:10 compute-0 systemd-machined[154382]: Machine qemu-48-instance-00000030 terminated.
Jan 22 17:28:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:10.684 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[cb6f98e3-72aa-4766-975d-d02937de9c3c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:10.687 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[ba369135-7c59-4c40-b5b0-dd5f640b3911]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:10.719 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[ee2965cf-bb49-4817-9bb8-6be1467b0495]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:10.738 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[86281df9-41f3-4d54-9c7e-6d40c9803b06]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap900bfc1d-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b7:1d:49'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 202, 'tx_packets': 105, 'rx_bytes': 17308, 'tx_bytes': 12013, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 202, 'tx_packets': 105, 'rx_bytes': 17308, 'tx_bytes': 12013, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 149], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 518058, 'reachable_time': 23458, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231141, 'error': None, 'target': 'ovnmeta-900bfc1d-c57a-4f7e-92bf-4e7a876cd570', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:10.758 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[411f1796-51b1-469a-8cf8-a80c75929363]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap900bfc1d-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 518071, 'tstamp': 518071}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231142, 'error': None, 'target': 'ovnmeta-900bfc1d-c57a-4f7e-92bf-4e7a876cd570', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap900bfc1d-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 518074, 'tstamp': 518074}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231142, 'error': None, 'target': 'ovnmeta-900bfc1d-c57a-4f7e-92bf-4e7a876cd570', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:10.759 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap900bfc1d-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:28:10 compute-0 nova_compute[183075]: 2026-01-22 17:28:10.760 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:10 compute-0 nova_compute[183075]: 2026-01-22 17:28:10.766 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:10.766 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap900bfc1d-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:28:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:10.767 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:28:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:10.767 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap900bfc1d-c0, col_values=(('external_ids', {'iface-id': '23a8c53c-68ec-4f6e-8f99-d6d830063975'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:28:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:10.767 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:28:10 compute-0 nova_compute[183075]: 2026-01-22 17:28:10.839 183079 INFO nova.virt.libvirt.driver [-] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Instance destroyed successfully.
Jan 22 17:28:10 compute-0 nova_compute[183075]: 2026-01-22 17:28:10.840 183079 DEBUG nova.objects.instance [None req-296e4e43-eed8-4eab-99d4-f02482bec741 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lazy-loading 'resources' on Instance uuid 0a31beaa-1978-4bdc-b51e-23750ba51b8a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:28:10 compute-0 nova_compute[183075]: 2026-01-22 17:28:10.863 183079 DEBUG nova.virt.libvirt.vif [None req-296e4e43-eed8-4eab-99d4-f02482bec741 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:27:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-8d128323-779265437',display_name='tempest-server-8d128323-779265437',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-8d128323-779265437',id=48,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHCLHRqMwuIHHegtVN/dBsFuYJfMhVe3L9LmEAUU1mvb3zBmzXJa8cLxmVJrl+X3/Ox/s4/8DkIlSNEMGQ0/B6pNriTMnK4hV1OfgcB6Su2v76kqiKHqmCcfWjFp6l6YnA==',key_name='tempest-keypair-test-1032085623',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:27:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6cc38cd108414e729395073de19dceae',ramdisk_id='',reservation_id='r-t00rivbr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestFloatingIPUpdate-1973932938',owner_user_name='tempest-TestFloatingIPUpdate-1973932938-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:27:33Z,user_data=None,user_id='841f48f635f54f619a9de1d6bbc8f832',uuid=0a31beaa-1978-4bdc-b51e-23750ba51b8a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8d128323-81b9-4ffa-92c7-39f1389aa99e", "address": "fa:16:3e:e5:fb:b4", "network": {"id": "900bfc1d-c57a-4f7e-92bf-4e7a876cd570", "bridge": "br-int", "label": "tempest-test-network--1765746424", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cc38cd108414e729395073de19dceae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d128323-81", "ovs_interfaceid": "8d128323-81b9-4ffa-92c7-39f1389aa99e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:28:10 compute-0 nova_compute[183075]: 2026-01-22 17:28:10.864 183079 DEBUG nova.network.os_vif_util [None req-296e4e43-eed8-4eab-99d4-f02482bec741 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Converting VIF {"id": "8d128323-81b9-4ffa-92c7-39f1389aa99e", "address": "fa:16:3e:e5:fb:b4", "network": {"id": "900bfc1d-c57a-4f7e-92bf-4e7a876cd570", "bridge": "br-int", "label": "tempest-test-network--1765746424", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cc38cd108414e729395073de19dceae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d128323-81", "ovs_interfaceid": "8d128323-81b9-4ffa-92c7-39f1389aa99e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:28:10 compute-0 nova_compute[183075]: 2026-01-22 17:28:10.864 183079 DEBUG nova.network.os_vif_util [None req-296e4e43-eed8-4eab-99d4-f02482bec741 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:fb:b4,bridge_name='br-int',has_traffic_filtering=True,id=8d128323-81b9-4ffa-92c7-39f1389aa99e,network=Network(900bfc1d-c57a-4f7e-92bf-4e7a876cd570),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8d128323-81') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:28:10 compute-0 nova_compute[183075]: 2026-01-22 17:28:10.865 183079 DEBUG os_vif [None req-296e4e43-eed8-4eab-99d4-f02482bec741 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:fb:b4,bridge_name='br-int',has_traffic_filtering=True,id=8d128323-81b9-4ffa-92c7-39f1389aa99e,network=Network(900bfc1d-c57a-4f7e-92bf-4e7a876cd570),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8d128323-81') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:28:10 compute-0 nova_compute[183075]: 2026-01-22 17:28:10.866 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:10 compute-0 nova_compute[183075]: 2026-01-22 17:28:10.866 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8d128323-81, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:28:10 compute-0 nova_compute[183075]: 2026-01-22 17:28:10.868 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:10 compute-0 nova_compute[183075]: 2026-01-22 17:28:10.870 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:28:10 compute-0 nova_compute[183075]: 2026-01-22 17:28:10.872 183079 INFO os_vif [None req-296e4e43-eed8-4eab-99d4-f02482bec741 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:fb:b4,bridge_name='br-int',has_traffic_filtering=True,id=8d128323-81b9-4ffa-92c7-39f1389aa99e,network=Network(900bfc1d-c57a-4f7e-92bf-4e7a876cd570),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8d128323-81')
Jan 22 17:28:10 compute-0 nova_compute[183075]: 2026-01-22 17:28:10.872 183079 INFO nova.virt.libvirt.driver [None req-296e4e43-eed8-4eab-99d4-f02482bec741 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Deleting instance files /var/lib/nova/instances/0a31beaa-1978-4bdc-b51e-23750ba51b8a_del
Jan 22 17:28:10 compute-0 nova_compute[183075]: 2026-01-22 17:28:10.873 183079 INFO nova.virt.libvirt.driver [None req-296e4e43-eed8-4eab-99d4-f02482bec741 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Deletion of /var/lib/nova/instances/0a31beaa-1978-4bdc-b51e-23750ba51b8a_del complete
Jan 22 17:28:10 compute-0 nova_compute[183075]: 2026-01-22 17:28:10.931 183079 INFO nova.compute.manager [None req-296e4e43-eed8-4eab-99d4-f02482bec741 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Took 0.35 seconds to destroy the instance on the hypervisor.
Jan 22 17:28:10 compute-0 nova_compute[183075]: 2026-01-22 17:28:10.932 183079 DEBUG oslo.service.loopingcall [None req-296e4e43-eed8-4eab-99d4-f02482bec741 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:28:10 compute-0 nova_compute[183075]: 2026-01-22 17:28:10.932 183079 DEBUG nova.compute.manager [-] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:28:10 compute-0 nova_compute[183075]: 2026-01-22 17:28:10.932 183079 DEBUG nova.network.neutron [-] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:28:11 compute-0 nova_compute[183075]: 2026-01-22 17:28:11.058 183079 DEBUG nova.compute.manager [req-1fb1fdf2-bee9-442d-b1d6-5a9c83f0abb4 req-6f6c7310-2f74-499d-8bbb-5601d2855e54 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Received event network-changed-06bfc5f7-2163-4a40-87a7-050edd036a92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:28:11 compute-0 nova_compute[183075]: 2026-01-22 17:28:11.059 183079 DEBUG nova.compute.manager [req-1fb1fdf2-bee9-442d-b1d6-5a9c83f0abb4 req-6f6c7310-2f74-499d-8bbb-5601d2855e54 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Refreshing instance network info cache due to event network-changed-06bfc5f7-2163-4a40-87a7-050edd036a92. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:28:11 compute-0 nova_compute[183075]: 2026-01-22 17:28:11.059 183079 DEBUG oslo_concurrency.lockutils [req-1fb1fdf2-bee9-442d-b1d6-5a9c83f0abb4 req-6f6c7310-2f74-499d-8bbb-5601d2855e54 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-b3d0d846-1f10-43c4-8e0c-9ba93967fdc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:28:11 compute-0 nova_compute[183075]: 2026-01-22 17:28:11.544 183079 DEBUG nova.network.neutron [req-f442904c-669c-4850-b5d6-ba5ee82f089b req-d69f6751-4f5f-4bc2-8a94-0106e4594b0c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Updated VIF entry in instance network info cache for port 06bfc5f7-2163-4a40-87a7-050edd036a92. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:28:11 compute-0 nova_compute[183075]: 2026-01-22 17:28:11.544 183079 DEBUG nova.network.neutron [req-f442904c-669c-4850-b5d6-ba5ee82f089b req-d69f6751-4f5f-4bc2-8a94-0106e4594b0c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Updating instance_info_cache with network_info: [{"id": "06bfc5f7-2163-4a40-87a7-050edd036a92", "address": "fa:16:3e:9e:13:c8", "network": {"id": "900bfc1d-c57a-4f7e-92bf-4e7a876cd570", "bridge": "br-int", "label": "tempest-test-network--1765746424", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cc38cd108414e729395073de19dceae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06bfc5f7-21", "ovs_interfaceid": "06bfc5f7-2163-4a40-87a7-050edd036a92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:28:11 compute-0 nova_compute[183075]: 2026-01-22 17:28:11.562 183079 DEBUG oslo_concurrency.lockutils [req-f442904c-669c-4850-b5d6-ba5ee82f089b req-d69f6751-4f5f-4bc2-8a94-0106e4594b0c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-b3d0d846-1f10-43c4-8e0c-9ba93967fdc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:28:11 compute-0 nova_compute[183075]: 2026-01-22 17:28:11.562 183079 DEBUG oslo_concurrency.lockutils [req-1fb1fdf2-bee9-442d-b1d6-5a9c83f0abb4 req-6f6c7310-2f74-499d-8bbb-5601d2855e54 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-b3d0d846-1f10-43c4-8e0c-9ba93967fdc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:28:11 compute-0 nova_compute[183075]: 2026-01-22 17:28:11.563 183079 DEBUG nova.network.neutron [req-1fb1fdf2-bee9-442d-b1d6-5a9c83f0abb4 req-6f6c7310-2f74-499d-8bbb-5601d2855e54 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Refreshing network info cache for port 06bfc5f7-2163-4a40-87a7-050edd036a92 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:28:11 compute-0 nova_compute[183075]: 2026-01-22 17:28:11.645 183079 DEBUG nova.compute.manager [req-a04fa2aa-3855-4f2b-b340-16c6093978f9 req-e5366876-1277-444a-abe7-d6353553e5bb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Received event network-vif-unplugged-8d128323-81b9-4ffa-92c7-39f1389aa99e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:28:11 compute-0 nova_compute[183075]: 2026-01-22 17:28:11.645 183079 DEBUG oslo_concurrency.lockutils [req-a04fa2aa-3855-4f2b-b340-16c6093978f9 req-e5366876-1277-444a-abe7-d6353553e5bb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "0a31beaa-1978-4bdc-b51e-23750ba51b8a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:28:11 compute-0 nova_compute[183075]: 2026-01-22 17:28:11.646 183079 DEBUG oslo_concurrency.lockutils [req-a04fa2aa-3855-4f2b-b340-16c6093978f9 req-e5366876-1277-444a-abe7-d6353553e5bb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "0a31beaa-1978-4bdc-b51e-23750ba51b8a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:28:11 compute-0 nova_compute[183075]: 2026-01-22 17:28:11.646 183079 DEBUG oslo_concurrency.lockutils [req-a04fa2aa-3855-4f2b-b340-16c6093978f9 req-e5366876-1277-444a-abe7-d6353553e5bb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "0a31beaa-1978-4bdc-b51e-23750ba51b8a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:28:11 compute-0 nova_compute[183075]: 2026-01-22 17:28:11.646 183079 DEBUG nova.compute.manager [req-a04fa2aa-3855-4f2b-b340-16c6093978f9 req-e5366876-1277-444a-abe7-d6353553e5bb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] No waiting events found dispatching network-vif-unplugged-8d128323-81b9-4ffa-92c7-39f1389aa99e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:28:11 compute-0 nova_compute[183075]: 2026-01-22 17:28:11.646 183079 DEBUG nova.compute.manager [req-a04fa2aa-3855-4f2b-b340-16c6093978f9 req-e5366876-1277-444a-abe7-d6353553e5bb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Received event network-vif-unplugged-8d128323-81b9-4ffa-92c7-39f1389aa99e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:28:12 compute-0 nova_compute[183075]: 2026-01-22 17:28:12.275 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:14 compute-0 nova_compute[183075]: 2026-01-22 17:28:14.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:28:15 compute-0 podman[231159]: 2026-01-22 17:28:15.381251293 +0000 UTC m=+0.081102478 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:28:15 compute-0 nova_compute[183075]: 2026-01-22 17:28:15.870 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:17 compute-0 nova_compute[183075]: 2026-01-22 17:28:17.311 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:18 compute-0 nova_compute[183075]: 2026-01-22 17:28:18.570 183079 DEBUG nova.compute.manager [req-8eb229c1-99f9-43cf-ac89-6fa76ab8cec8 req-4da13f25-daeb-48be-b98e-43f03be573ec a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Received event network-vif-plugged-8d128323-81b9-4ffa-92c7-39f1389aa99e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:28:18 compute-0 nova_compute[183075]: 2026-01-22 17:28:18.571 183079 DEBUG oslo_concurrency.lockutils [req-8eb229c1-99f9-43cf-ac89-6fa76ab8cec8 req-4da13f25-daeb-48be-b98e-43f03be573ec a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "0a31beaa-1978-4bdc-b51e-23750ba51b8a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:28:18 compute-0 nova_compute[183075]: 2026-01-22 17:28:18.571 183079 DEBUG oslo_concurrency.lockutils [req-8eb229c1-99f9-43cf-ac89-6fa76ab8cec8 req-4da13f25-daeb-48be-b98e-43f03be573ec a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "0a31beaa-1978-4bdc-b51e-23750ba51b8a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:28:18 compute-0 nova_compute[183075]: 2026-01-22 17:28:18.572 183079 DEBUG oslo_concurrency.lockutils [req-8eb229c1-99f9-43cf-ac89-6fa76ab8cec8 req-4da13f25-daeb-48be-b98e-43f03be573ec a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "0a31beaa-1978-4bdc-b51e-23750ba51b8a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:28:18 compute-0 nova_compute[183075]: 2026-01-22 17:28:18.572 183079 DEBUG nova.compute.manager [req-8eb229c1-99f9-43cf-ac89-6fa76ab8cec8 req-4da13f25-daeb-48be-b98e-43f03be573ec a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] No waiting events found dispatching network-vif-plugged-8d128323-81b9-4ffa-92c7-39f1389aa99e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:28:18 compute-0 nova_compute[183075]: 2026-01-22 17:28:18.572 183079 WARNING nova.compute.manager [req-8eb229c1-99f9-43cf-ac89-6fa76ab8cec8 req-4da13f25-daeb-48be-b98e-43f03be573ec a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Received unexpected event network-vif-plugged-8d128323-81b9-4ffa-92c7-39f1389aa99e for instance with vm_state active and task_state deleting.
Jan 22 17:28:19 compute-0 nova_compute[183075]: 2026-01-22 17:28:19.541 183079 DEBUG nova.network.neutron [-] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:28:19 compute-0 nova_compute[183075]: 2026-01-22 17:28:19.570 183079 INFO nova.compute.manager [-] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Took 8.64 seconds to deallocate network for instance.
Jan 22 17:28:19 compute-0 nova_compute[183075]: 2026-01-22 17:28:19.612 183079 DEBUG oslo_concurrency.lockutils [None req-296e4e43-eed8-4eab-99d4-f02482bec741 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:28:19 compute-0 nova_compute[183075]: 2026-01-22 17:28:19.613 183079 DEBUG oslo_concurrency.lockutils [None req-296e4e43-eed8-4eab-99d4-f02482bec741 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:28:19 compute-0 nova_compute[183075]: 2026-01-22 17:28:19.695 183079 DEBUG nova.compute.provider_tree [None req-296e4e43-eed8-4eab-99d4-f02482bec741 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:28:19 compute-0 nova_compute[183075]: 2026-01-22 17:28:19.710 183079 DEBUG nova.scheduler.client.report [None req-296e4e43-eed8-4eab-99d4-f02482bec741 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:28:19 compute-0 nova_compute[183075]: 2026-01-22 17:28:19.730 183079 DEBUG nova.network.neutron [req-1fb1fdf2-bee9-442d-b1d6-5a9c83f0abb4 req-6f6c7310-2f74-499d-8bbb-5601d2855e54 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Updated VIF entry in instance network info cache for port 06bfc5f7-2163-4a40-87a7-050edd036a92. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:28:19 compute-0 nova_compute[183075]: 2026-01-22 17:28:19.730 183079 DEBUG nova.network.neutron [req-1fb1fdf2-bee9-442d-b1d6-5a9c83f0abb4 req-6f6c7310-2f74-499d-8bbb-5601d2855e54 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Updating instance_info_cache with network_info: [{"id": "06bfc5f7-2163-4a40-87a7-050edd036a92", "address": "fa:16:3e:9e:13:c8", "network": {"id": "900bfc1d-c57a-4f7e-92bf-4e7a876cd570", "bridge": "br-int", "label": "tempest-test-network--1765746424", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cc38cd108414e729395073de19dceae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06bfc5f7-21", "ovs_interfaceid": "06bfc5f7-2163-4a40-87a7-050edd036a92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:28:19 compute-0 nova_compute[183075]: 2026-01-22 17:28:19.740 183079 DEBUG oslo_concurrency.lockutils [None req-296e4e43-eed8-4eab-99d4-f02482bec741 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:28:19 compute-0 nova_compute[183075]: 2026-01-22 17:28:19.748 183079 DEBUG oslo_concurrency.lockutils [req-1fb1fdf2-bee9-442d-b1d6-5a9c83f0abb4 req-6f6c7310-2f74-499d-8bbb-5601d2855e54 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-b3d0d846-1f10-43c4-8e0c-9ba93967fdc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:28:19 compute-0 nova_compute[183075]: 2026-01-22 17:28:19.749 183079 DEBUG nova.compute.manager [req-1fb1fdf2-bee9-442d-b1d6-5a9c83f0abb4 req-6f6c7310-2f74-499d-8bbb-5601d2855e54 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Received event network-changed-8d128323-81b9-4ffa-92c7-39f1389aa99e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:28:19 compute-0 nova_compute[183075]: 2026-01-22 17:28:19.749 183079 DEBUG nova.compute.manager [req-1fb1fdf2-bee9-442d-b1d6-5a9c83f0abb4 req-6f6c7310-2f74-499d-8bbb-5601d2855e54 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Refreshing instance network info cache due to event network-changed-8d128323-81b9-4ffa-92c7-39f1389aa99e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:28:19 compute-0 nova_compute[183075]: 2026-01-22 17:28:19.750 183079 DEBUG oslo_concurrency.lockutils [req-1fb1fdf2-bee9-442d-b1d6-5a9c83f0abb4 req-6f6c7310-2f74-499d-8bbb-5601d2855e54 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-0a31beaa-1978-4bdc-b51e-23750ba51b8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:28:19 compute-0 nova_compute[183075]: 2026-01-22 17:28:19.750 183079 DEBUG oslo_concurrency.lockutils [req-1fb1fdf2-bee9-442d-b1d6-5a9c83f0abb4 req-6f6c7310-2f74-499d-8bbb-5601d2855e54 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-0a31beaa-1978-4bdc-b51e-23750ba51b8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:28:19 compute-0 nova_compute[183075]: 2026-01-22 17:28:19.750 183079 DEBUG nova.network.neutron [req-1fb1fdf2-bee9-442d-b1d6-5a9c83f0abb4 req-6f6c7310-2f74-499d-8bbb-5601d2855e54 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Refreshing network info cache for port 8d128323-81b9-4ffa-92c7-39f1389aa99e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:28:19 compute-0 nova_compute[183075]: 2026-01-22 17:28:19.778 183079 INFO nova.scheduler.client.report [None req-296e4e43-eed8-4eab-99d4-f02482bec741 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Deleted allocations for instance 0a31beaa-1978-4bdc-b51e-23750ba51b8a
Jan 22 17:28:19 compute-0 nova_compute[183075]: 2026-01-22 17:28:19.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:28:19 compute-0 nova_compute[183075]: 2026-01-22 17:28:19.852 183079 DEBUG oslo_concurrency.lockutils [None req-296e4e43-eed8-4eab-99d4-f02482bec741 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "0a31beaa-1978-4bdc-b51e-23750ba51b8a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 9.277s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:28:19 compute-0 nova_compute[183075]: 2026-01-22 17:28:19.923 183079 DEBUG nova.network.neutron [req-1fb1fdf2-bee9-442d-b1d6-5a9c83f0abb4 req-6f6c7310-2f74-499d-8bbb-5601d2855e54 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:28:20 compute-0 nova_compute[183075]: 2026-01-22 17:28:20.249 183079 DEBUG nova.network.neutron [req-1fb1fdf2-bee9-442d-b1d6-5a9c83f0abb4 req-6f6c7310-2f74-499d-8bbb-5601d2855e54 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:28:20 compute-0 nova_compute[183075]: 2026-01-22 17:28:20.268 183079 DEBUG oslo_concurrency.lockutils [req-1fb1fdf2-bee9-442d-b1d6-5a9c83f0abb4 req-6f6c7310-2f74-499d-8bbb-5601d2855e54 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-0a31beaa-1978-4bdc-b51e-23750ba51b8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:28:20 compute-0 nova_compute[183075]: 2026-01-22 17:28:20.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:28:20 compute-0 nova_compute[183075]: 2026-01-22 17:28:20.872 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.055 183079 DEBUG oslo_concurrency.lockutils [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Acquiring lock "63c5b8fd-b774-4470-872c-5e8c954d75e3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.056 183079 DEBUG oslo_concurrency.lockutils [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "63c5b8fd-b774-4470-872c-5e8c954d75e3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.073 183079 DEBUG nova.compute.manager [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.133 183079 DEBUG oslo_concurrency.lockutils [None req-6bc30570-994d-4419-ae0b-63a82c99fe4e 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Acquiring lock "b3d0d846-1f10-43c4-8e0c-9ba93967fdc7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.134 183079 DEBUG oslo_concurrency.lockutils [None req-6bc30570-994d-4419-ae0b-63a82c99fe4e 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "b3d0d846-1f10-43c4-8e0c-9ba93967fdc7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.134 183079 DEBUG oslo_concurrency.lockutils [None req-6bc30570-994d-4419-ae0b-63a82c99fe4e 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Acquiring lock "b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.134 183079 DEBUG oslo_concurrency.lockutils [None req-6bc30570-994d-4419-ae0b-63a82c99fe4e 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.134 183079 DEBUG oslo_concurrency.lockutils [None req-6bc30570-994d-4419-ae0b-63a82c99fe4e 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.135 183079 INFO nova.compute.manager [None req-6bc30570-994d-4419-ae0b-63a82c99fe4e 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Terminating instance
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.136 183079 DEBUG nova.compute.manager [None req-6bc30570-994d-4419-ae0b-63a82c99fe4e 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.151 183079 DEBUG oslo_concurrency.lockutils [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.151 183079 DEBUG oslo_concurrency.lockutils [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:28:21 compute-0 kernel: tap06bfc5f7-21 (unregistering): left promiscuous mode
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.157 183079 DEBUG nova.virt.hardware [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.157 183079 INFO nova.compute.claims [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:28:21 compute-0 NetworkManager[55454]: <info>  [1769102901.1609] device (tap06bfc5f7-21): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:28:21 compute-0 ovn_controller[95372]: 2026-01-22T17:28:21Z|00537|binding|INFO|Releasing lport 06bfc5f7-2163-4a40-87a7-050edd036a92 from this chassis (sb_readonly=0)
Jan 22 17:28:21 compute-0 ovn_controller[95372]: 2026-01-22T17:28:21Z|00538|binding|INFO|Setting lport 06bfc5f7-2163-4a40-87a7-050edd036a92 down in Southbound
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.169 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:21 compute-0 ovn_controller[95372]: 2026-01-22T17:28:21Z|00539|binding|INFO|Removing iface tap06bfc5f7-21 ovn-installed in OVS
Jan 22 17:28:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:21.176 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9e:13:c8 10.100.0.9'], port_security=['fa:16:3e:9e:13:c8 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'b3d0d846-1f10-43c4-8e0c-9ba93967fdc7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-900bfc1d-c57a-4f7e-92bf-4e7a876cd570', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6cc38cd108414e729395073de19dceae', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ae587f87-81d1-4ff6-8e92-98dcd41d2886', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2c590c1d-a36d-48b3-bcde-1b6ee74771df, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=06bfc5f7-2163-4a40-87a7-050edd036a92) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:28:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:21.177 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 06bfc5f7-2163-4a40-87a7-050edd036a92 in datapath 900bfc1d-c57a-4f7e-92bf-4e7a876cd570 unbound from our chassis
Jan 22 17:28:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:21.179 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 900bfc1d-c57a-4f7e-92bf-4e7a876cd570, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:28:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:21.181 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6aadbe38-2475-486f-91ad-fe3bb9f51317]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:21.181 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-900bfc1d-c57a-4f7e-92bf-4e7a876cd570 namespace which is not needed anymore
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.192 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:21 compute-0 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d0000002f.scope: Deactivated successfully.
Jan 22 17:28:21 compute-0 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d0000002f.scope: Consumed 16.227s CPU time.
Jan 22 17:28:21 compute-0 systemd-machined[154382]: Machine qemu-47-instance-0000002f terminated.
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.274 183079 DEBUG nova.compute.provider_tree [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.288 183079 DEBUG nova.scheduler.client.report [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.310 183079 DEBUG oslo_concurrency.lockutils [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.311 183079 DEBUG nova.compute.manager [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:28:21 compute-0 neutron-haproxy-ovnmeta-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230595]: [NOTICE]   (230618) : haproxy version is 2.8.14-c23fe91
Jan 22 17:28:21 compute-0 neutron-haproxy-ovnmeta-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230595]: [NOTICE]   (230618) : path to executable is /usr/sbin/haproxy
Jan 22 17:28:21 compute-0 neutron-haproxy-ovnmeta-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230595]: [WARNING]  (230618) : Exiting Master process...
Jan 22 17:28:21 compute-0 neutron-haproxy-ovnmeta-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230595]: [ALERT]    (230618) : Current worker (230621) exited with code 143 (Terminated)
Jan 22 17:28:21 compute-0 neutron-haproxy-ovnmeta-900bfc1d-c57a-4f7e-92bf-4e7a876cd570[230595]: [WARNING]  (230618) : All workers exited. Exiting... (0)
Jan 22 17:28:21 compute-0 systemd[1]: libpod-1f55cc3755c88522aa06ec6e483372559c981a69e468b8dcaacf2188595136b6.scope: Deactivated successfully.
Jan 22 17:28:21 compute-0 podman[231210]: 2026-01-22 17:28:21.323368346 +0000 UTC m=+0.043678948 container died 1f55cc3755c88522aa06ec6e483372559c981a69e468b8dcaacf2188595136b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-900bfc1d-c57a-4f7e-92bf-4e7a876cd570, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:28:21 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1f55cc3755c88522aa06ec6e483372559c981a69e468b8dcaacf2188595136b6-userdata-shm.mount: Deactivated successfully.
Jan 22 17:28:21 compute-0 systemd[1]: var-lib-containers-storage-overlay-05a8afac8db34f01db264fd3ed28aa86e59baff82ee583fd8bd64b17abd9488d-merged.mount: Deactivated successfully.
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.356 183079 DEBUG nova.compute.manager [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.357 183079 DEBUG nova.network.neutron [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:28:21 compute-0 podman[231210]: 2026-01-22 17:28:21.360602561 +0000 UTC m=+0.080913173 container cleanup 1f55cc3755c88522aa06ec6e483372559c981a69e468b8dcaacf2188595136b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-900bfc1d-c57a-4f7e-92bf-4e7a876cd570, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.360 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.366 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.375 183079 INFO nova.virt.libvirt.driver [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:28:21 compute-0 systemd[1]: libpod-conmon-1f55cc3755c88522aa06ec6e483372559c981a69e468b8dcaacf2188595136b6.scope: Deactivated successfully.
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.394 183079 DEBUG nova.compute.manager [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.404 183079 INFO nova.virt.libvirt.driver [-] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Instance destroyed successfully.
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.404 183079 DEBUG nova.objects.instance [None req-6bc30570-994d-4419-ae0b-63a82c99fe4e 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lazy-loading 'resources' on Instance uuid b3d0d846-1f10-43c4-8e0c-9ba93967fdc7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.421 183079 DEBUG nova.virt.libvirt.vif [None req-6bc30570-994d-4419-ae0b-63a82c99fe4e 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:26:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-06bfc5f7-2022213617',display_name='tempest-server-06bfc5f7-2022213617',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-06bfc5f7-2022213617',id=47,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHCLHRqMwuIHHegtVN/dBsFuYJfMhVe3L9LmEAUU1mvb3zBmzXJa8cLxmVJrl+X3/Ox/s4/8DkIlSNEMGQ0/B6pNriTMnK4hV1OfgcB6Su2v76kqiKHqmCcfWjFp6l6YnA==',key_name='tempest-keypair-test-1032085623',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:26:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6cc38cd108414e729395073de19dceae',ramdisk_id='',reservation_id='r-bitt3llt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestFloatingIPUpdate-1973932938',owner_user_name='tempest-TestFloatingIPUpdate-1973932938-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:26:57Z,user_data=None,user_id='841f48f635f54f619a9de1d6bbc8f832',uuid=b3d0d846-1f10-43c4-8e0c-9ba93967fdc7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "06bfc5f7-2163-4a40-87a7-050edd036a92", "address": "fa:16:3e:9e:13:c8", "network": {"id": "900bfc1d-c57a-4f7e-92bf-4e7a876cd570", "bridge": "br-int", "label": "tempest-test-network--1765746424", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cc38cd108414e729395073de19dceae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06bfc5f7-21", "ovs_interfaceid": "06bfc5f7-2163-4a40-87a7-050edd036a92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.422 183079 DEBUG nova.network.os_vif_util [None req-6bc30570-994d-4419-ae0b-63a82c99fe4e 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Converting VIF {"id": "06bfc5f7-2163-4a40-87a7-050edd036a92", "address": "fa:16:3e:9e:13:c8", "network": {"id": "900bfc1d-c57a-4f7e-92bf-4e7a876cd570", "bridge": "br-int", "label": "tempest-test-network--1765746424", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6cc38cd108414e729395073de19dceae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06bfc5f7-21", "ovs_interfaceid": "06bfc5f7-2163-4a40-87a7-050edd036a92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.422 183079 DEBUG nova.network.os_vif_util [None req-6bc30570-994d-4419-ae0b-63a82c99fe4e 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9e:13:c8,bridge_name='br-int',has_traffic_filtering=True,id=06bfc5f7-2163-4a40-87a7-050edd036a92,network=Network(900bfc1d-c57a-4f7e-92bf-4e7a876cd570),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap06bfc5f7-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.423 183079 DEBUG os_vif [None req-6bc30570-994d-4419-ae0b-63a82c99fe4e 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9e:13:c8,bridge_name='br-int',has_traffic_filtering=True,id=06bfc5f7-2163-4a40-87a7-050edd036a92,network=Network(900bfc1d-c57a-4f7e-92bf-4e7a876cd570),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap06bfc5f7-21') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.424 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.425 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06bfc5f7-21, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.426 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.429 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.430 183079 INFO os_vif [None req-6bc30570-994d-4419-ae0b-63a82c99fe4e 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9e:13:c8,bridge_name='br-int',has_traffic_filtering=True,id=06bfc5f7-2163-4a40-87a7-050edd036a92,network=Network(900bfc1d-c57a-4f7e-92bf-4e7a876cd570),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap06bfc5f7-21')
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.431 183079 INFO nova.virt.libvirt.driver [None req-6bc30570-994d-4419-ae0b-63a82c99fe4e 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Deleting instance files /var/lib/nova/instances/b3d0d846-1f10-43c4-8e0c-9ba93967fdc7_del
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.431 183079 INFO nova.virt.libvirt.driver [None req-6bc30570-994d-4419-ae0b-63a82c99fe4e 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Deletion of /var/lib/nova/instances/b3d0d846-1f10-43c4-8e0c-9ba93967fdc7_del complete
Jan 22 17:28:21 compute-0 podman[231247]: 2026-01-22 17:28:21.43504774 +0000 UTC m=+0.047747947 container remove 1f55cc3755c88522aa06ec6e483372559c981a69e468b8dcaacf2188595136b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-900bfc1d-c57a-4f7e-92bf-4e7a876cd570, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 17:28:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:21.439 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e9764aee-a362-4c03-b2ea-50d7628d6d2e]: (4, ('Thu Jan 22 05:28:21 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-900bfc1d-c57a-4f7e-92bf-4e7a876cd570 (1f55cc3755c88522aa06ec6e483372559c981a69e468b8dcaacf2188595136b6)\n1f55cc3755c88522aa06ec6e483372559c981a69e468b8dcaacf2188595136b6\nThu Jan 22 05:28:21 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-900bfc1d-c57a-4f7e-92bf-4e7a876cd570 (1f55cc3755c88522aa06ec6e483372559c981a69e468b8dcaacf2188595136b6)\n1f55cc3755c88522aa06ec6e483372559c981a69e468b8dcaacf2188595136b6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:21.441 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2836a83a-87b9-4a1d-b88d-08d9ea164b69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:21.442 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap900bfc1d-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.443 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:21 compute-0 kernel: tap900bfc1d-c0: left promiscuous mode
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.457 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:21.459 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6e3c30e2-ca31-48e9-ab5d-26cbaa6a19d7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:21.475 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[165a5f3c-f94d-4517-a7ef-d4b682ebbef1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:21.476 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1f31b625-84f9-444f-ae39-226dcc732f13]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:21.491 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1a449ef1-b5ac-42f7-b76a-04cb5137ec7a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 518051, 'reachable_time': 29882, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231268, 'error': None, 'target': 'ovnmeta-900bfc1d-c57a-4f7e-92bf-4e7a876cd570', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.492 183079 DEBUG nova.compute.manager [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.493 183079 DEBUG nova.virt.libvirt.driver [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.494 183079 INFO nova.virt.libvirt.driver [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Creating image(s)
Jan 22 17:28:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:21.494 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-900bfc1d-c57a-4f7e-92bf-4e7a876cd570 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.494 183079 DEBUG oslo_concurrency.lockutils [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Acquiring lock "/var/lib/nova/instances/63c5b8fd-b774-4470-872c-5e8c954d75e3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:28:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:21.494 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[9f6efe19-796d-44e5-9381-6b2a8fa4bc5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.495 183079 DEBUG oslo_concurrency.lockutils [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "/var/lib/nova/instances/63c5b8fd-b774-4470-872c-5e8c954d75e3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.495 183079 DEBUG oslo_concurrency.lockutils [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "/var/lib/nova/instances/63c5b8fd-b774-4470-872c-5e8c954d75e3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:28:21 compute-0 systemd[1]: run-netns-ovnmeta\x2d900bfc1d\x2dc57a\x2d4f7e\x2d92bf\x2d4e7a876cd570.mount: Deactivated successfully.
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.508 183079 DEBUG oslo_concurrency.processutils [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.531 183079 INFO nova.compute.manager [None req-6bc30570-994d-4419-ae0b-63a82c99fe4e 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Took 0.39 seconds to destroy the instance on the hypervisor.
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.532 183079 DEBUG oslo.service.loopingcall [None req-6bc30570-994d-4419-ae0b-63a82c99fe4e 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.533 183079 DEBUG nova.compute.manager [-] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.533 183079 DEBUG nova.network.neutron [-] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.590 183079 DEBUG oslo_concurrency.processutils [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.591 183079 DEBUG oslo_concurrency.lockutils [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.593 183079 DEBUG oslo_concurrency.lockutils [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.616 183079 DEBUG oslo_concurrency.processutils [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.672 183079 DEBUG oslo_concurrency.processutils [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.673 183079 DEBUG oslo_concurrency.processutils [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/63c5b8fd-b774-4470-872c-5e8c954d75e3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.703 183079 DEBUG nova.compute.manager [req-5a280283-41c6-4477-af9b-28ac6cdef2ec req-e357374f-bfe2-4972-ab2d-56da08ffa8cd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Received event network-vif-unplugged-06bfc5f7-2163-4a40-87a7-050edd036a92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.704 183079 DEBUG oslo_concurrency.lockutils [req-5a280283-41c6-4477-af9b-28ac6cdef2ec req-e357374f-bfe2-4972-ab2d-56da08ffa8cd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.704 183079 DEBUG oslo_concurrency.lockutils [req-5a280283-41c6-4477-af9b-28ac6cdef2ec req-e357374f-bfe2-4972-ab2d-56da08ffa8cd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.705 183079 DEBUG oslo_concurrency.lockutils [req-5a280283-41c6-4477-af9b-28ac6cdef2ec req-e357374f-bfe2-4972-ab2d-56da08ffa8cd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.705 183079 DEBUG nova.compute.manager [req-5a280283-41c6-4477-af9b-28ac6cdef2ec req-e357374f-bfe2-4972-ab2d-56da08ffa8cd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] No waiting events found dispatching network-vif-unplugged-06bfc5f7-2163-4a40-87a7-050edd036a92 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.705 183079 DEBUG nova.compute.manager [req-5a280283-41c6-4477-af9b-28ac6cdef2ec req-e357374f-bfe2-4972-ab2d-56da08ffa8cd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Received event network-vif-unplugged-06bfc5f7-2163-4a40-87a7-050edd036a92 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.716 183079 DEBUG oslo_concurrency.processutils [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/63c5b8fd-b774-4470-872c-5e8c954d75e3/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.717 183079 DEBUG oslo_concurrency.lockutils [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.717 183079 DEBUG oslo_concurrency.processutils [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.791 183079 DEBUG oslo_concurrency.processutils [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.791 183079 DEBUG nova.virt.disk.api [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Checking if we can resize image /var/lib/nova/instances/63c5b8fd-b774-4470-872c-5e8c954d75e3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.792 183079 DEBUG oslo_concurrency.processutils [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63c5b8fd-b774-4470-872c-5e8c954d75e3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.812 183079 DEBUG nova.policy [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3a896d4927d442ffba421873948034be', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7ff1e5ce4806445a8e463c71b6930bec', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.872 183079 DEBUG oslo_concurrency.processutils [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63c5b8fd-b774-4470-872c-5e8c954d75e3/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.873 183079 DEBUG nova.virt.disk.api [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Cannot resize image /var/lib/nova/instances/63c5b8fd-b774-4470-872c-5e8c954d75e3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.873 183079 DEBUG nova.objects.instance [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lazy-loading 'migration_context' on Instance uuid 63c5b8fd-b774-4470-872c-5e8c954d75e3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.888 183079 DEBUG nova.virt.libvirt.driver [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.888 183079 DEBUG nova.virt.libvirt.driver [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Ensure instance console log exists: /var/lib/nova/instances/63c5b8fd-b774-4470-872c-5e8c954d75e3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.889 183079 DEBUG oslo_concurrency.lockutils [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.889 183079 DEBUG oslo_concurrency.lockutils [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:28:21 compute-0 nova_compute[183075]: 2026-01-22 17:28:21.889 183079 DEBUG oslo_concurrency.lockutils [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:28:22 compute-0 nova_compute[183075]: 2026-01-22 17:28:22.315 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:23 compute-0 nova_compute[183075]: 2026-01-22 17:28:23.245 183079 DEBUG nova.network.neutron [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Successfully updated port: cd6c900d-6fa3-4cc6-bc5c-763ec54d8240 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:28:23 compute-0 nova_compute[183075]: 2026-01-22 17:28:23.271 183079 DEBUG oslo_concurrency.lockutils [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Acquiring lock "refresh_cache-63c5b8fd-b774-4470-872c-5e8c954d75e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:28:23 compute-0 nova_compute[183075]: 2026-01-22 17:28:23.271 183079 DEBUG oslo_concurrency.lockutils [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Acquired lock "refresh_cache-63c5b8fd-b774-4470-872c-5e8c954d75e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:28:23 compute-0 nova_compute[183075]: 2026-01-22 17:28:23.272 183079 DEBUG nova.network.neutron [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:28:23 compute-0 nova_compute[183075]: 2026-01-22 17:28:23.377 183079 DEBUG nova.compute.manager [req-e7ae988f-2039-401e-8ea2-4106934ff6bc req-d0f5388d-cc7b-445c-8ce3-c7435c01b9e0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Received event network-changed-cd6c900d-6fa3-4cc6-bc5c-763ec54d8240 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:28:23 compute-0 nova_compute[183075]: 2026-01-22 17:28:23.378 183079 DEBUG nova.compute.manager [req-e7ae988f-2039-401e-8ea2-4106934ff6bc req-d0f5388d-cc7b-445c-8ce3-c7435c01b9e0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Refreshing instance network info cache due to event network-changed-cd6c900d-6fa3-4cc6-bc5c-763ec54d8240. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:28:23 compute-0 nova_compute[183075]: 2026-01-22 17:28:23.378 183079 DEBUG oslo_concurrency.lockutils [req-e7ae988f-2039-401e-8ea2-4106934ff6bc req-d0f5388d-cc7b-445c-8ce3-c7435c01b9e0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-63c5b8fd-b774-4470-872c-5e8c954d75e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:28:23 compute-0 nova_compute[183075]: 2026-01-22 17:28:23.400 183079 DEBUG nova.network.neutron [-] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:28:23 compute-0 nova_compute[183075]: 2026-01-22 17:28:23.414 183079 INFO nova.compute.manager [-] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Took 1.88 seconds to deallocate network for instance.
Jan 22 17:28:23 compute-0 nova_compute[183075]: 2026-01-22 17:28:23.459 183079 DEBUG oslo_concurrency.lockutils [None req-6bc30570-994d-4419-ae0b-63a82c99fe4e 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:28:23 compute-0 nova_compute[183075]: 2026-01-22 17:28:23.459 183079 DEBUG oslo_concurrency.lockutils [None req-6bc30570-994d-4419-ae0b-63a82c99fe4e 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:28:23 compute-0 nova_compute[183075]: 2026-01-22 17:28:23.533 183079 DEBUG nova.compute.provider_tree [None req-6bc30570-994d-4419-ae0b-63a82c99fe4e 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:28:23 compute-0 nova_compute[183075]: 2026-01-22 17:28:23.547 183079 DEBUG nova.scheduler.client.report [None req-6bc30570-994d-4419-ae0b-63a82c99fe4e 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:28:23 compute-0 nova_compute[183075]: 2026-01-22 17:28:23.567 183079 DEBUG oslo_concurrency.lockutils [None req-6bc30570-994d-4419-ae0b-63a82c99fe4e 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:28:23 compute-0 nova_compute[183075]: 2026-01-22 17:28:23.597 183079 INFO nova.scheduler.client.report [None req-6bc30570-994d-4419-ae0b-63a82c99fe4e 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Deleted allocations for instance b3d0d846-1f10-43c4-8e0c-9ba93967fdc7
Jan 22 17:28:23 compute-0 nova_compute[183075]: 2026-01-22 17:28:23.614 183079 DEBUG nova.network.neutron [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:28:23 compute-0 nova_compute[183075]: 2026-01-22 17:28:23.650 183079 DEBUG oslo_concurrency.lockutils [None req-6bc30570-994d-4419-ae0b-63a82c99fe4e 841f48f635f54f619a9de1d6bbc8f832 6cc38cd108414e729395073de19dceae - - default default] Lock "b3d0d846-1f10-43c4-8e0c-9ba93967fdc7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.516s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:28:23 compute-0 nova_compute[183075]: 2026-01-22 17:28:23.764 183079 DEBUG nova.compute.manager [req-6b68092c-99f0-4dca-879b-946e8af6cfab req-3493eaa9-a409-4f6e-9a0a-66f09c10bc95 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Received event network-vif-plugged-06bfc5f7-2163-4a40-87a7-050edd036a92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:28:23 compute-0 nova_compute[183075]: 2026-01-22 17:28:23.764 183079 DEBUG oslo_concurrency.lockutils [req-6b68092c-99f0-4dca-879b-946e8af6cfab req-3493eaa9-a409-4f6e-9a0a-66f09c10bc95 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:28:23 compute-0 nova_compute[183075]: 2026-01-22 17:28:23.765 183079 DEBUG oslo_concurrency.lockutils [req-6b68092c-99f0-4dca-879b-946e8af6cfab req-3493eaa9-a409-4f6e-9a0a-66f09c10bc95 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:28:23 compute-0 nova_compute[183075]: 2026-01-22 17:28:23.765 183079 DEBUG oslo_concurrency.lockutils [req-6b68092c-99f0-4dca-879b-946e8af6cfab req-3493eaa9-a409-4f6e-9a0a-66f09c10bc95 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b3d0d846-1f10-43c4-8e0c-9ba93967fdc7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:28:23 compute-0 nova_compute[183075]: 2026-01-22 17:28:23.765 183079 DEBUG nova.compute.manager [req-6b68092c-99f0-4dca-879b-946e8af6cfab req-3493eaa9-a409-4f6e-9a0a-66f09c10bc95 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] No waiting events found dispatching network-vif-plugged-06bfc5f7-2163-4a40-87a7-050edd036a92 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:28:23 compute-0 nova_compute[183075]: 2026-01-22 17:28:23.765 183079 WARNING nova.compute.manager [req-6b68092c-99f0-4dca-879b-946e8af6cfab req-3493eaa9-a409-4f6e-9a0a-66f09c10bc95 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Received unexpected event network-vif-plugged-06bfc5f7-2163-4a40-87a7-050edd036a92 for instance with vm_state deleted and task_state None.
Jan 22 17:28:24 compute-0 nova_compute[183075]: 2026-01-22 17:28:24.251 183079 DEBUG nova.network.neutron [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Updating instance_info_cache with network_info: [{"id": "cd6c900d-6fa3-4cc6-bc5c-763ec54d8240", "address": "fa:16:3e:48:31:10", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd6c900d-6f", "ovs_interfaceid": "cd6c900d-6fa3-4cc6-bc5c-763ec54d8240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:28:25 compute-0 podman[231285]: 2026-01-22 17:28:25.361357086 +0000 UTC m=+0.059803398 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:28:25 compute-0 podman[231286]: 2026-01-22 17:28:25.382857681 +0000 UTC m=+0.071662026 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41, version=9.6, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, vendor=Red Hat, Inc.)
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.384 183079 DEBUG oslo_concurrency.lockutils [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Releasing lock "refresh_cache-63c5b8fd-b774-4470-872c-5e8c954d75e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.385 183079 DEBUG nova.compute.manager [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Instance network_info: |[{"id": "cd6c900d-6fa3-4cc6-bc5c-763ec54d8240", "address": "fa:16:3e:48:31:10", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd6c900d-6f", "ovs_interfaceid": "cd6c900d-6fa3-4cc6-bc5c-763ec54d8240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.386 183079 DEBUG oslo_concurrency.lockutils [req-e7ae988f-2039-401e-8ea2-4106934ff6bc req-d0f5388d-cc7b-445c-8ce3-c7435c01b9e0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-63c5b8fd-b774-4470-872c-5e8c954d75e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.386 183079 DEBUG nova.network.neutron [req-e7ae988f-2039-401e-8ea2-4106934ff6bc req-d0f5388d-cc7b-445c-8ce3-c7435c01b9e0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Refreshing network info cache for port cd6c900d-6fa3-4cc6-bc5c-763ec54d8240 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.389 183079 DEBUG nova.virt.libvirt.driver [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Start _get_guest_xml network_info=[{"id": "cd6c900d-6fa3-4cc6-bc5c-763ec54d8240", "address": "fa:16:3e:48:31:10", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd6c900d-6f", "ovs_interfaceid": "cd6c900d-6fa3-4cc6-bc5c-763ec54d8240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.395 183079 WARNING nova.virt.libvirt.driver [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:28:25 compute-0 podman[231284]: 2026-01-22 17:28:25.397958974 +0000 UTC m=+0.096825437 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.402 183079 DEBUG nova.virt.libvirt.host [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.402 183079 DEBUG nova.virt.libvirt.host [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.406 183079 DEBUG nova.virt.libvirt.host [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.407 183079 DEBUG nova.virt.libvirt.host [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.408 183079 DEBUG nova.virt.libvirt.driver [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.408 183079 DEBUG nova.virt.hardware [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.409 183079 DEBUG nova.virt.hardware [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.409 183079 DEBUG nova.virt.hardware [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.410 183079 DEBUG nova.virt.hardware [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.410 183079 DEBUG nova.virt.hardware [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.410 183079 DEBUG nova.virt.hardware [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.411 183079 DEBUG nova.virt.hardware [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.411 183079 DEBUG nova.virt.hardware [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.411 183079 DEBUG nova.virt.hardware [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.412 183079 DEBUG nova.virt.hardware [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.412 183079 DEBUG nova.virt.hardware [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.418 183079 DEBUG nova.virt.libvirt.vif [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:28:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1014501839',display_name='tempest-server-test-1014501839',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1014501839',id=49,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHQrJNwmVmi0v3CvDVkdf2ULfmIKW9OE2obw9UEIh0JilIeeueUzwA1cDH+T5CoOIGZz/satGSZDSgKqtLklRpNQ/Wm6QLNBLAjV/3q74U9Y8J0BPwM5hIfTkFkrKFKf2g==',key_name='tempest-keypair-test-1359386190',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ff1e5ce4806445a8e463c71b6930bec',ramdisk_id='',reservation_id='r-2jkf7qy7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortsTest-1337721110',owner_user_name='tempest-PortsTest-1337721110-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:28:21Z,user_data=None,user_id='3a896d4927d442ffba421873948034be',uuid=63c5b8fd-b774-4470-872c-5e8c954d75e3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cd6c900d-6fa3-4cc6-bc5c-763ec54d8240", "address": "fa:16:3e:48:31:10", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd6c900d-6f", "ovs_interfaceid": "cd6c900d-6fa3-4cc6-bc5c-763ec54d8240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.419 183079 DEBUG nova.network.os_vif_util [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Converting VIF {"id": "cd6c900d-6fa3-4cc6-bc5c-763ec54d8240", "address": "fa:16:3e:48:31:10", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd6c900d-6f", "ovs_interfaceid": "cd6c900d-6fa3-4cc6-bc5c-763ec54d8240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.420 183079 DEBUG nova.network.os_vif_util [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:48:31:10,bridge_name='br-int',has_traffic_filtering=True,id=cd6c900d-6fa3-4cc6-bc5c-763ec54d8240,network=Network(012007cf-673c-4f83-a4b9-f21a913a1ccf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcd6c900d-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.421 183079 DEBUG nova.objects.instance [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lazy-loading 'pci_devices' on Instance uuid 63c5b8fd-b774-4470-872c-5e8c954d75e3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.502 183079 DEBUG nova.virt.libvirt.driver [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:28:25 compute-0 nova_compute[183075]:   <uuid>63c5b8fd-b774-4470-872c-5e8c954d75e3</uuid>
Jan 22 17:28:25 compute-0 nova_compute[183075]:   <name>instance-00000031</name>
Jan 22 17:28:25 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:28:25 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:28:25 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:28:25 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1014501839</nova:name>
Jan 22 17:28:25 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:28:25</nova:creationTime>
Jan 22 17:28:25 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:28:25 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:28:25 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:28:25 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:28:25 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:28:25 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:28:25 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:28:25 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:28:25 compute-0 nova_compute[183075]:         <nova:user uuid="3a896d4927d442ffba421873948034be">tempest-PortsTest-1337721110-project-member</nova:user>
Jan 22 17:28:25 compute-0 nova_compute[183075]:         <nova:project uuid="7ff1e5ce4806445a8e463c71b6930bec">tempest-PortsTest-1337721110</nova:project>
Jan 22 17:28:25 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:28:25 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:28:25 compute-0 nova_compute[183075]:         <nova:port uuid="cd6c900d-6fa3-4cc6-bc5c-763ec54d8240">
Jan 22 17:28:25 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:28:25 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:28:25 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:28:25 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <system>
Jan 22 17:28:25 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:28:25 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:28:25 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:28:25 compute-0 nova_compute[183075]:       <entry name="serial">63c5b8fd-b774-4470-872c-5e8c954d75e3</entry>
Jan 22 17:28:25 compute-0 nova_compute[183075]:       <entry name="uuid">63c5b8fd-b774-4470-872c-5e8c954d75e3</entry>
Jan 22 17:28:25 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     </system>
Jan 22 17:28:25 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:28:25 compute-0 nova_compute[183075]:   <os>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:   </os>
Jan 22 17:28:25 compute-0 nova_compute[183075]:   <features>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:   </features>
Jan 22 17:28:25 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:28:25 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:28:25 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:28:25 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/63c5b8fd-b774-4470-872c-5e8c954d75e3/disk"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:28:25 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:48:31:10"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:       <target dev="tapcd6c900d-6f"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:28:25 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/63c5b8fd-b774-4470-872c-5e8c954d75e3/console.log" append="off"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <video>
Jan 22 17:28:25 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     </video>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:28:25 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:28:25 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:28:25 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:28:25 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:28:25 compute-0 nova_compute[183075]: </domain>
Jan 22 17:28:25 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.505 183079 DEBUG nova.compute.manager [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Preparing to wait for external event network-vif-plugged-cd6c900d-6fa3-4cc6-bc5c-763ec54d8240 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.505 183079 DEBUG oslo_concurrency.lockutils [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Acquiring lock "63c5b8fd-b774-4470-872c-5e8c954d75e3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.506 183079 DEBUG oslo_concurrency.lockutils [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "63c5b8fd-b774-4470-872c-5e8c954d75e3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.506 183079 DEBUG oslo_concurrency.lockutils [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "63c5b8fd-b774-4470-872c-5e8c954d75e3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.508 183079 DEBUG nova.virt.libvirt.vif [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:28:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1014501839',display_name='tempest-server-test-1014501839',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1014501839',id=49,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHQrJNwmVmi0v3CvDVkdf2ULfmIKW9OE2obw9UEIh0JilIeeueUzwA1cDH+T5CoOIGZz/satGSZDSgKqtLklRpNQ/Wm6QLNBLAjV/3q74U9Y8J0BPwM5hIfTkFkrKFKf2g==',key_name='tempest-keypair-test-1359386190',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ff1e5ce4806445a8e463c71b6930bec',ramdisk_id='',reservation_id='r-2jkf7qy7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortsTest-1337721110',owner_user_name='tempest-PortsTest-1337721110-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:28:21Z,user_data=None,user_id='3a896d4927d442ffba421873948034be',uuid=63c5b8fd-b774-4470-872c-5e8c954d75e3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cd6c900d-6fa3-4cc6-bc5c-763ec54d8240", "address": "fa:16:3e:48:31:10", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd6c900d-6f", "ovs_interfaceid": "cd6c900d-6fa3-4cc6-bc5c-763ec54d8240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.508 183079 DEBUG nova.network.os_vif_util [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Converting VIF {"id": "cd6c900d-6fa3-4cc6-bc5c-763ec54d8240", "address": "fa:16:3e:48:31:10", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd6c900d-6f", "ovs_interfaceid": "cd6c900d-6fa3-4cc6-bc5c-763ec54d8240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.509 183079 DEBUG nova.network.os_vif_util [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:48:31:10,bridge_name='br-int',has_traffic_filtering=True,id=cd6c900d-6fa3-4cc6-bc5c-763ec54d8240,network=Network(012007cf-673c-4f83-a4b9-f21a913a1ccf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcd6c900d-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.510 183079 DEBUG os_vif [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:31:10,bridge_name='br-int',has_traffic_filtering=True,id=cd6c900d-6fa3-4cc6-bc5c-763ec54d8240,network=Network(012007cf-673c-4f83-a4b9-f21a913a1ccf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcd6c900d-6f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.511 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.511 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.512 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.517 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.517 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcd6c900d-6f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.518 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcd6c900d-6f, col_values=(('external_ids', {'iface-id': 'cd6c900d-6fa3-4cc6-bc5c-763ec54d8240', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:48:31:10', 'vm-uuid': '63c5b8fd-b774-4470-872c-5e8c954d75e3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:28:25 compute-0 NetworkManager[55454]: <info>  [1769102905.5209] manager: (tapcd6c900d-6f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/229)
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.520 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.522 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.525 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.526 183079 INFO os_vif [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:31:10,bridge_name='br-int',has_traffic_filtering=True,id=cd6c900d-6fa3-4cc6-bc5c-763ec54d8240,network=Network(012007cf-673c-4f83-a4b9-f21a913a1ccf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcd6c900d-6f')
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.621 183079 DEBUG nova.virt.libvirt.driver [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.622 183079 DEBUG nova.virt.libvirt.driver [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] No VIF found with MAC fa:16:3e:48:31:10, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:28:25 compute-0 kernel: tapcd6c900d-6f: entered promiscuous mode
Jan 22 17:28:25 compute-0 NetworkManager[55454]: <info>  [1769102905.6994] manager: (tapcd6c900d-6f): new Tun device (/org/freedesktop/NetworkManager/Devices/230)
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.700 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:25 compute-0 ovn_controller[95372]: 2026-01-22T17:28:25Z|00540|binding|INFO|Claiming lport cd6c900d-6fa3-4cc6-bc5c-763ec54d8240 for this chassis.
Jan 22 17:28:25 compute-0 ovn_controller[95372]: 2026-01-22T17:28:25Z|00541|binding|INFO|cd6c900d-6fa3-4cc6-bc5c-763ec54d8240: Claiming fa:16:3e:48:31:10 10.100.0.14
Jan 22 17:28:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:25.708 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:48:31:10 10.100.0.14'], port_security=['fa:16:3e:48:31:10 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-fixed_ip_port-448850519', 'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '63c5b8fd-b774-4470-872c-5e8c954d75e3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-012007cf-673c-4f83-a4b9-f21a913a1ccf', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-fixed_ip_port-448850519', 'neutron:project_id': '7ff1e5ce4806445a8e463c71b6930bec', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a8d0872f-de3c-4404-94dc-7a328e5b8aa9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a510149a-49ad-47bc-a8ad-05908544b3cc, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=cd6c900d-6fa3-4cc6-bc5c-763ec54d8240) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:28:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:25.709 104629 INFO neutron.agent.ovn.metadata.agent [-] Port cd6c900d-6fa3-4cc6-bc5c-763ec54d8240 in datapath 012007cf-673c-4f83-a4b9-f21a913a1ccf bound to our chassis
Jan 22 17:28:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:25.711 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 012007cf-673c-4f83-a4b9-f21a913a1ccf
Jan 22 17:28:25 compute-0 systemd-udevd[231366]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:28:25 compute-0 ovn_controller[95372]: 2026-01-22T17:28:25Z|00542|binding|INFO|Setting lport cd6c900d-6fa3-4cc6-bc5c-763ec54d8240 ovn-installed in OVS
Jan 22 17:28:25 compute-0 ovn_controller[95372]: 2026-01-22T17:28:25Z|00543|binding|INFO|Setting lport cd6c900d-6fa3-4cc6-bc5c-763ec54d8240 up in Southbound
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.730 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.732 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:25.731 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[545a9a9c-a6f0-4186-a706-05b503247c37]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:25.733 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap012007cf-61 in ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:28:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:25.734 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap012007cf-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:28:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:25.735 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[506afb41-a1dc-4f79-b201-f8fc0f2106db]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:25.736 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ed93d193-20ea-4ffa-a769-5e65015b3c34]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:25 compute-0 NetworkManager[55454]: <info>  [1769102905.7412] device (tapcd6c900d-6f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:28:25 compute-0 NetworkManager[55454]: <info>  [1769102905.7422] device (tapcd6c900d-6f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:28:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:25.751 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[9e5b1c6c-0d8d-4800-bc44-9a5fae480454]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:25 compute-0 systemd-machined[154382]: New machine qemu-49-instance-00000031.
Jan 22 17:28:25 compute-0 systemd[1]: Started Virtual Machine qemu-49-instance-00000031.
Jan 22 17:28:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:25.775 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f9859208-ed9b-4b7b-b129-05da60532388]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:25.815 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[dc4a5578-e993-43f3-afb2-aa6ac37afb25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:25.820 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ee5c6e6b-d64f-4f04-9898-1fffdb2a8299]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:25 compute-0 NetworkManager[55454]: <info>  [1769102905.8219] manager: (tap012007cf-60): new Veth device (/org/freedesktop/NetworkManager/Devices/231)
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.835 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769102890.8351018, 0a31beaa-1978-4bdc-b51e-23750ba51b8a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.836 183079 INFO nova.compute.manager [-] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] VM Stopped (Lifecycle Event)
Jan 22 17:28:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:25.856 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[24e8b64e-c1c6-44c7-9e0a-a390fd3c4378]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:25 compute-0 nova_compute[183075]: 2026-01-22 17:28:25.862 183079 DEBUG nova.compute.manager [None req-a84e6ac1-4953-44af-a1a1-91af87524870 - - - - - -] [instance: 0a31beaa-1978-4bdc-b51e-23750ba51b8a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:28:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:25.864 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[5c2deea6-185c-434a-a2ee-10d657087781]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:25 compute-0 NetworkManager[55454]: <info>  [1769102905.8928] device (tap012007cf-60): carrier: link connected
Jan 22 17:28:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:25.903 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[2c38baa9-574f-490c-9781-292421281d88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:25.926 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3c461dcb-22a1-4efc-849b-d4b549fadbcb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap012007cf-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:ca:e2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 156], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 526959, 'reachable_time': 33022, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231401, 'error': None, 'target': 'ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:25.947 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c6d8a7a9-a6be-4af6-bbee-92fe25c986a8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee8:cae2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 526959, 'tstamp': 526959}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231402, 'error': None, 'target': 'ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:25.969 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7a74ef04-78ca-4379-8df8-15bee4aca8c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap012007cf-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:ca:e2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 156], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 526959, 'reachable_time': 33022, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231403, 'error': None, 'target': 'ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:26.001 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7bf5c4c3-0e9a-4f1a-b857-f8ae060b776d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:26.073 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2f4d1584-1787-451a-a421-84e205bb1a5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:26.075 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap012007cf-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:26.075 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:26.076 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap012007cf-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:28:26 compute-0 NetworkManager[55454]: <info>  [1769102906.0787] manager: (tap012007cf-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/232)
Jan 22 17:28:26 compute-0 nova_compute[183075]: 2026-01-22 17:28:26.078 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:26 compute-0 kernel: tap012007cf-60: entered promiscuous mode
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:26.083 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap012007cf-60, col_values=(('external_ids', {'iface-id': 'd7c95871-d767-4104-b2f1-75f5b06e0524'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:28:26 compute-0 nova_compute[183075]: 2026-01-22 17:28:26.084 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:26 compute-0 ovn_controller[95372]: 2026-01-22T17:28:26Z|00544|binding|INFO|Releasing lport d7c95871-d767-4104-b2f1-75f5b06e0524 from this chassis (sb_readonly=0)
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:26.085 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/012007cf-673c-4f83-a4b9-f21a913a1ccf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/012007cf-673c-4f83-a4b9-f21a913a1ccf.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:26.086 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f6ad8578-0bc6-41a2-a90b-496e7dc3bed7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:26.087 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/012007cf-673c-4f83-a4b9-f21a913a1ccf.pid.haproxy
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 012007cf-673c-4f83-a4b9-f21a913a1ccf
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:28:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:26.087 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf', 'env', 'PROCESS_TAG=haproxy-012007cf-673c-4f83-a4b9-f21a913a1ccf', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/012007cf-673c-4f83-a4b9-f21a913a1ccf.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:28:26 compute-0 nova_compute[183075]: 2026-01-22 17:28:26.097 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:26 compute-0 nova_compute[183075]: 2026-01-22 17:28:26.325 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102906.3240507, 63c5b8fd-b774-4470-872c-5e8c954d75e3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:28:26 compute-0 nova_compute[183075]: 2026-01-22 17:28:26.326 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] VM Started (Lifecycle Event)
Jan 22 17:28:26 compute-0 nova_compute[183075]: 2026-01-22 17:28:26.361 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:28:26 compute-0 nova_compute[183075]: 2026-01-22 17:28:26.367 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102906.3247461, 63c5b8fd-b774-4470-872c-5e8c954d75e3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:28:26 compute-0 nova_compute[183075]: 2026-01-22 17:28:26.367 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] VM Paused (Lifecycle Event)
Jan 22 17:28:26 compute-0 nova_compute[183075]: 2026-01-22 17:28:26.398 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:28:26 compute-0 nova_compute[183075]: 2026-01-22 17:28:26.404 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:28:26 compute-0 nova_compute[183075]: 2026-01-22 17:28:26.425 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:28:26 compute-0 podman[231442]: 2026-01-22 17:28:26.457420199 +0000 UTC m=+0.057747814 container create 78a9b321be2cc52cbf4e1b1d063fd3878ba4075bc0a98f2cdfe1fee328b061ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 22 17:28:26 compute-0 systemd[1]: Started libpod-conmon-78a9b321be2cc52cbf4e1b1d063fd3878ba4075bc0a98f2cdfe1fee328b061ed.scope.
Jan 22 17:28:26 compute-0 podman[231442]: 2026-01-22 17:28:26.427187171 +0000 UTC m=+0.027514876 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:28:26 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:28:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db8cb2ffd055694dba6406ff389fb600eb164060cddf50c5bcd7a262c0c8c1cb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:28:26 compute-0 podman[231442]: 2026-01-22 17:28:26.55442489 +0000 UTC m=+0.154752585 container init 78a9b321be2cc52cbf4e1b1d063fd3878ba4075bc0a98f2cdfe1fee328b061ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:28:26 compute-0 podman[231442]: 2026-01-22 17:28:26.567996153 +0000 UTC m=+0.168323778 container start 78a9b321be2cc52cbf4e1b1d063fd3878ba4075bc0a98f2cdfe1fee328b061ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 17:28:26 compute-0 nova_compute[183075]: 2026-01-22 17:28:26.589 183079 DEBUG nova.compute.manager [req-e4b10e1d-d593-461d-8b9d-52a9e5981db4 req-e50b8223-c87e-4eae-bee9-a3dbc360219d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Received event network-vif-plugged-cd6c900d-6fa3-4cc6-bc5c-763ec54d8240 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:28:26 compute-0 nova_compute[183075]: 2026-01-22 17:28:26.591 183079 DEBUG oslo_concurrency.lockutils [req-e4b10e1d-d593-461d-8b9d-52a9e5981db4 req-e50b8223-c87e-4eae-bee9-a3dbc360219d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "63c5b8fd-b774-4470-872c-5e8c954d75e3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:28:26 compute-0 nova_compute[183075]: 2026-01-22 17:28:26.592 183079 DEBUG oslo_concurrency.lockutils [req-e4b10e1d-d593-461d-8b9d-52a9e5981db4 req-e50b8223-c87e-4eae-bee9-a3dbc360219d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "63c5b8fd-b774-4470-872c-5e8c954d75e3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:28:26 compute-0 nova_compute[183075]: 2026-01-22 17:28:26.592 183079 DEBUG oslo_concurrency.lockutils [req-e4b10e1d-d593-461d-8b9d-52a9e5981db4 req-e50b8223-c87e-4eae-bee9-a3dbc360219d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "63c5b8fd-b774-4470-872c-5e8c954d75e3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:28:26 compute-0 nova_compute[183075]: 2026-01-22 17:28:26.593 183079 DEBUG nova.compute.manager [req-e4b10e1d-d593-461d-8b9d-52a9e5981db4 req-e50b8223-c87e-4eae-bee9-a3dbc360219d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Processing event network-vif-plugged-cd6c900d-6fa3-4cc6-bc5c-763ec54d8240 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:28:26 compute-0 nova_compute[183075]: 2026-01-22 17:28:26.594 183079 DEBUG nova.compute.manager [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:28:26 compute-0 neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf[231457]: [NOTICE]   (231461) : New worker (231463) forked
Jan 22 17:28:26 compute-0 neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf[231457]: [NOTICE]   (231461) : Loading success.
Jan 22 17:28:26 compute-0 nova_compute[183075]: 2026-01-22 17:28:26.599 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102906.5986953, 63c5b8fd-b774-4470-872c-5e8c954d75e3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:28:26 compute-0 nova_compute[183075]: 2026-01-22 17:28:26.600 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] VM Resumed (Lifecycle Event)
Jan 22 17:28:26 compute-0 nova_compute[183075]: 2026-01-22 17:28:26.602 183079 DEBUG nova.virt.libvirt.driver [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:28:26 compute-0 nova_compute[183075]: 2026-01-22 17:28:26.606 183079 INFO nova.virt.libvirt.driver [-] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Instance spawned successfully.
Jan 22 17:28:26 compute-0 nova_compute[183075]: 2026-01-22 17:28:26.606 183079 DEBUG nova.virt.libvirt.driver [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:28:26 compute-0 nova_compute[183075]: 2026-01-22 17:28:26.628 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:28:26 compute-0 nova_compute[183075]: 2026-01-22 17:28:26.634 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:28:26 compute-0 nova_compute[183075]: 2026-01-22 17:28:26.638 183079 DEBUG nova.virt.libvirt.driver [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:28:26 compute-0 nova_compute[183075]: 2026-01-22 17:28:26.639 183079 DEBUG nova.virt.libvirt.driver [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:28:26 compute-0 nova_compute[183075]: 2026-01-22 17:28:26.639 183079 DEBUG nova.virt.libvirt.driver [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:28:26 compute-0 nova_compute[183075]: 2026-01-22 17:28:26.640 183079 DEBUG nova.virt.libvirt.driver [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:28:26 compute-0 nova_compute[183075]: 2026-01-22 17:28:26.641 183079 DEBUG nova.virt.libvirt.driver [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:28:26 compute-0 nova_compute[183075]: 2026-01-22 17:28:26.641 183079 DEBUG nova.virt.libvirt.driver [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:28:26 compute-0 nova_compute[183075]: 2026-01-22 17:28:26.668 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:28:26 compute-0 nova_compute[183075]: 2026-01-22 17:28:26.710 183079 INFO nova.compute.manager [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Took 5.22 seconds to spawn the instance on the hypervisor.
Jan 22 17:28:26 compute-0 nova_compute[183075]: 2026-01-22 17:28:26.711 183079 DEBUG nova.compute.manager [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:28:26 compute-0 nova_compute[183075]: 2026-01-22 17:28:26.854 183079 INFO nova.compute.manager [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Took 5.72 seconds to build instance.
Jan 22 17:28:27 compute-0 nova_compute[183075]: 2026-01-22 17:28:27.149 183079 DEBUG oslo_concurrency.lockutils [None req-e3724aa6-2ce8-416d-a68a-04cde0544398 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "63c5b8fd-b774-4470-872c-5e8c954d75e3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:28:27 compute-0 nova_compute[183075]: 2026-01-22 17:28:27.370 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:27 compute-0 nova_compute[183075]: 2026-01-22 17:28:27.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:28:28 compute-0 nova_compute[183075]: 2026-01-22 17:28:28.784 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:28:28 compute-0 nova_compute[183075]: 2026-01-22 17:28:28.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:28:28 compute-0 nova_compute[183075]: 2026-01-22 17:28:28.787 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:28:29 compute-0 nova_compute[183075]: 2026-01-22 17:28:29.609 183079 DEBUG nova.compute.manager [req-7e568c06-f287-4d80-a905-321072c3ca16 req-50098c92-2db4-4b27-9e7d-00d94f63d1da a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Received event network-vif-plugged-cd6c900d-6fa3-4cc6-bc5c-763ec54d8240 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:28:29 compute-0 nova_compute[183075]: 2026-01-22 17:28:29.610 183079 DEBUG oslo_concurrency.lockutils [req-7e568c06-f287-4d80-a905-321072c3ca16 req-50098c92-2db4-4b27-9e7d-00d94f63d1da a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "63c5b8fd-b774-4470-872c-5e8c954d75e3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:28:29 compute-0 nova_compute[183075]: 2026-01-22 17:28:29.610 183079 DEBUG oslo_concurrency.lockutils [req-7e568c06-f287-4d80-a905-321072c3ca16 req-50098c92-2db4-4b27-9e7d-00d94f63d1da a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "63c5b8fd-b774-4470-872c-5e8c954d75e3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:28:29 compute-0 nova_compute[183075]: 2026-01-22 17:28:29.610 183079 DEBUG oslo_concurrency.lockutils [req-7e568c06-f287-4d80-a905-321072c3ca16 req-50098c92-2db4-4b27-9e7d-00d94f63d1da a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "63c5b8fd-b774-4470-872c-5e8c954d75e3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:28:29 compute-0 nova_compute[183075]: 2026-01-22 17:28:29.611 183079 DEBUG nova.compute.manager [req-7e568c06-f287-4d80-a905-321072c3ca16 req-50098c92-2db4-4b27-9e7d-00d94f63d1da a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] No waiting events found dispatching network-vif-plugged-cd6c900d-6fa3-4cc6-bc5c-763ec54d8240 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:28:29 compute-0 nova_compute[183075]: 2026-01-22 17:28:29.611 183079 WARNING nova.compute.manager [req-7e568c06-f287-4d80-a905-321072c3ca16 req-50098c92-2db4-4b27-9e7d-00d94f63d1da a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Received unexpected event network-vif-plugged-cd6c900d-6fa3-4cc6-bc5c-763ec54d8240 for instance with vm_state active and task_state None.
Jan 22 17:28:29 compute-0 nova_compute[183075]: 2026-01-22 17:28:29.613 183079 INFO nova.compute.manager [None req-0600fbb6-d3a4-42f4-8de7-9476eba645fc 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Get console output
Jan 22 17:28:29 compute-0 nova_compute[183075]: 2026-01-22 17:28:29.618 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:28:30 compute-0 podman[231472]: 2026-01-22 17:28:30.368652723 +0000 UTC m=+0.077689057 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202)
Jan 22 17:28:30 compute-0 nova_compute[183075]: 2026-01-22 17:28:30.497 183079 DEBUG nova.network.neutron [req-e7ae988f-2039-401e-8ea2-4106934ff6bc req-d0f5388d-cc7b-445c-8ce3-c7435c01b9e0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Updated VIF entry in instance network info cache for port cd6c900d-6fa3-4cc6-bc5c-763ec54d8240. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:28:30 compute-0 nova_compute[183075]: 2026-01-22 17:28:30.497 183079 DEBUG nova.network.neutron [req-e7ae988f-2039-401e-8ea2-4106934ff6bc req-d0f5388d-cc7b-445c-8ce3-c7435c01b9e0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Updating instance_info_cache with network_info: [{"id": "cd6c900d-6fa3-4cc6-bc5c-763ec54d8240", "address": "fa:16:3e:48:31:10", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd6c900d-6f", "ovs_interfaceid": "cd6c900d-6fa3-4cc6-bc5c-763ec54d8240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:28:30 compute-0 nova_compute[183075]: 2026-01-22 17:28:30.519 183079 DEBUG oslo_concurrency.lockutils [req-e7ae988f-2039-401e-8ea2-4106934ff6bc req-d0f5388d-cc7b-445c-8ce3-c7435c01b9e0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-63c5b8fd-b774-4470-872c-5e8c954d75e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:28:30 compute-0 nova_compute[183075]: 2026-01-22 17:28:30.521 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:30 compute-0 nova_compute[183075]: 2026-01-22 17:28:30.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:28:30 compute-0 nova_compute[183075]: 2026-01-22 17:28:30.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:28:30 compute-0 nova_compute[183075]: 2026-01-22 17:28:30.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:28:31 compute-0 nova_compute[183075]: 2026-01-22 17:28:31.460 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-63c5b8fd-b774-4470-872c-5e8c954d75e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:28:31 compute-0 nova_compute[183075]: 2026-01-22 17:28:31.461 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-63c5b8fd-b774-4470-872c-5e8c954d75e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:28:31 compute-0 nova_compute[183075]: 2026-01-22 17:28:31.461 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:28:31 compute-0 nova_compute[183075]: 2026-01-22 17:28:31.461 183079 DEBUG nova.objects.instance [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 63c5b8fd-b774-4470-872c-5e8c954d75e3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:28:32 compute-0 nova_compute[183075]: 2026-01-22 17:28:32.372 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:34 compute-0 nova_compute[183075]: 2026-01-22 17:28:34.479 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Updating instance_info_cache with network_info: [{"id": "cd6c900d-6fa3-4cc6-bc5c-763ec54d8240", "address": "fa:16:3e:48:31:10", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd6c900d-6f", "ovs_interfaceid": "cd6c900d-6fa3-4cc6-bc5c-763ec54d8240", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:28:34 compute-0 nova_compute[183075]: 2026-01-22 17:28:34.500 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-63c5b8fd-b774-4470-872c-5e8c954d75e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:28:34 compute-0 nova_compute[183075]: 2026-01-22 17:28:34.500 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:28:34 compute-0 nova_compute[183075]: 2026-01-22 17:28:34.500 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:28:34 compute-0 nova_compute[183075]: 2026-01-22 17:28:34.501 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:28:34 compute-0 nova_compute[183075]: 2026-01-22 17:28:34.527 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:28:34 compute-0 nova_compute[183075]: 2026-01-22 17:28:34.527 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:28:34 compute-0 nova_compute[183075]: 2026-01-22 17:28:34.528 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:28:34 compute-0 nova_compute[183075]: 2026-01-22 17:28:34.528 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:28:34 compute-0 nova_compute[183075]: 2026-01-22 17:28:34.595 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63c5b8fd-b774-4470-872c-5e8c954d75e3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:28:34 compute-0 nova_compute[183075]: 2026-01-22 17:28:34.653 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63c5b8fd-b774-4470-872c-5e8c954d75e3/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:28:34 compute-0 nova_compute[183075]: 2026-01-22 17:28:34.654 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63c5b8fd-b774-4470-872c-5e8c954d75e3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:28:34 compute-0 nova_compute[183075]: 2026-01-22 17:28:34.714 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63c5b8fd-b774-4470-872c-5e8c954d75e3/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:28:34 compute-0 nova_compute[183075]: 2026-01-22 17:28:34.723 183079 INFO nova.compute.manager [None req-b2985461-50f0-447b-95f6-1a1746c8ca29 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Get console output
Jan 22 17:28:34 compute-0 nova_compute[183075]: 2026-01-22 17:28:34.879 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:28:34 compute-0 nova_compute[183075]: 2026-01-22 17:28:34.880 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5508MB free_disk=73.35940551757812GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:28:34 compute-0 nova_compute[183075]: 2026-01-22 17:28:34.880 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:28:34 compute-0 nova_compute[183075]: 2026-01-22 17:28:34.880 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:28:34 compute-0 nova_compute[183075]: 2026-01-22 17:28:34.950 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 63c5b8fd-b774-4470-872c-5e8c954d75e3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:28:34 compute-0 nova_compute[183075]: 2026-01-22 17:28:34.951 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:28:34 compute-0 nova_compute[183075]: 2026-01-22 17:28:34.951 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:28:34 compute-0 nova_compute[183075]: 2026-01-22 17:28:34.990 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:28:35 compute-0 nova_compute[183075]: 2026-01-22 17:28:35.006 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:28:35 compute-0 nova_compute[183075]: 2026-01-22 17:28:35.029 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:28:35 compute-0 nova_compute[183075]: 2026-01-22 17:28:35.030 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:28:35 compute-0 nova_compute[183075]: 2026-01-22 17:28:35.522 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:36 compute-0 nova_compute[183075]: 2026-01-22 17:28:36.403 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769102901.4020042, b3d0d846-1f10-43c4-8e0c-9ba93967fdc7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:28:36 compute-0 nova_compute[183075]: 2026-01-22 17:28:36.403 183079 INFO nova.compute.manager [-] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] VM Stopped (Lifecycle Event)
Jan 22 17:28:36 compute-0 nova_compute[183075]: 2026-01-22 17:28:36.773 183079 DEBUG nova.compute.manager [None req-653f2aae-b460-43cf-865c-82dc5b1d8eb1 - - - - - -] [instance: b3d0d846-1f10-43c4-8e0c-9ba93967fdc7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:28:37 compute-0 nova_compute[183075]: 2026-01-22 17:28:37.375 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:38 compute-0 ovn_controller[95372]: 2026-01-22T17:28:38Z|00100|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:48:31:10 10.100.0.14
Jan 22 17:28:38 compute-0 ovn_controller[95372]: 2026-01-22T17:28:38Z|00101|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:48:31:10 10.100.0.14
Jan 22 17:28:39 compute-0 nova_compute[183075]: 2026-01-22 17:28:39.861 183079 INFO nova.compute.manager [None req-72ab42cf-66b5-4f59-ac05-7a99a3f2f90e 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Get console output
Jan 22 17:28:40 compute-0 podman[231513]: 2026-01-22 17:28:40.36541485 +0000 UTC m=+0.076679539 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 17:28:40 compute-0 nova_compute[183075]: 2026-01-22 17:28:40.525 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:41 compute-0 ovn_controller[95372]: 2026-01-22T17:28:41Z|00545|binding|INFO|Releasing lport d7c95871-d767-4104-b2f1-75f5b06e0524 from this chassis (sb_readonly=0)
Jan 22 17:28:41 compute-0 nova_compute[183075]: 2026-01-22 17:28:41.118 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:41 compute-0 ovn_controller[95372]: 2026-01-22T17:28:41Z|00546|binding|INFO|Releasing lport d7c95871-d767-4104-b2f1-75f5b06e0524 from this chassis (sb_readonly=0)
Jan 22 17:28:41 compute-0 nova_compute[183075]: 2026-01-22 17:28:41.233 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:41.945 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:28:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:41.945 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:28:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:41.946 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:28:42 compute-0 nova_compute[183075]: 2026-01-22 17:28:42.377 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.022 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.023 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.609 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.610 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.5868602
Jan 22 17:28:44 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[231463]: 10.100.0.14:37104 [22/Jan/2026:17:28:44.021] listener listener/metadata 0/0/0/588/588 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.620 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.621 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.648 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.648 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0274105
Jan 22 17:28:44 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[231463]: 10.100.0.14:37112 [22/Jan/2026:17:28:44.620] listener listener/metadata 0/0/0/28/28 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.652 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.653 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.665 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.666 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0131059
Jan 22 17:28:44 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[231463]: 10.100.0.14:37124 [22/Jan/2026:17:28:44.652] listener listener/metadata 0/0/0/13/13 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.670 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.671 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.685 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:28:44 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[231463]: 10.100.0.14:37126 [22/Jan/2026:17:28:44.670] listener listener/metadata 0/0/0/14/14 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.685 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0145104
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.689 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.690 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.702 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.703 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0130355
Jan 22 17:28:44 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[231463]: 10.100.0.14:37132 [22/Jan/2026:17:28:44.689] listener listener/metadata 0/0/0/13/13 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.707 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.709 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.726 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.727 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0184376
Jan 22 17:28:44 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[231463]: 10.100.0.14:37148 [22/Jan/2026:17:28:44.707] listener listener/metadata 0/0/0/20/20 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.731 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.732 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.749 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:28:44 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[231463]: 10.100.0.14:37158 [22/Jan/2026:17:28:44.731] listener listener/metadata 0/0/0/18/18 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.750 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0174534
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.754 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.755 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.771 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.771 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0165005
Jan 22 17:28:44 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[231463]: 10.100.0.14:37174 [22/Jan/2026:17:28:44.754] listener listener/metadata 0/0/0/17/17 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.780 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.781 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.803 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.804 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0226595
Jan 22 17:28:44 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[231463]: 10.100.0.14:37186 [22/Jan/2026:17:28:44.779] listener listener/metadata 0/0/0/24/24 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.813 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.813 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.830 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.830 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0168204
Jan 22 17:28:44 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[231463]: 10.100.0.14:37198 [22/Jan/2026:17:28:44.812] listener listener/metadata 0/0/0/18/18 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.838 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.839 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:28:44 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[231463]: 10.100.0.14:37200 [22/Jan/2026:17:28:44.838] listener listener/metadata 0/0/0/17/17 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.856 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0167642
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.876 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.877 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.902 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.903 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0256257
Jan 22 17:28:44 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[231463]: 10.100.0.14:37214 [22/Jan/2026:17:28:44.876] listener listener/metadata 0/0/0/27/27 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.907 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.908 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.924 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.924 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0159175
Jan 22 17:28:44 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[231463]: 10.100.0.14:37228 [22/Jan/2026:17:28:44.907] listener listener/metadata 0/0/0/17/17 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.929 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.929 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.942 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.943 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0135183
Jan 22 17:28:44 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[231463]: 10.100.0.14:37230 [22/Jan/2026:17:28:44.928] listener listener/metadata 0/0/0/14/14 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.947 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.947 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.962 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.963 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0154960
Jan 22 17:28:44 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[231463]: 10.100.0.14:37236 [22/Jan/2026:17:28:44.946] listener listener/metadata 0/0/0/16/16 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.967 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.968 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.984 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:28:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:44.984 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0163689
Jan 22 17:28:44 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[231463]: 10.100.0.14:37252 [22/Jan/2026:17:28:44.967] listener listener/metadata 0/0/0/17/17 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:28:45 compute-0 nova_compute[183075]: 2026-01-22 17:28:45.001 183079 INFO nova.compute.manager [None req-3c4885d7-4027-4063-adad-425a910cddcd 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Get console output
Jan 22 17:28:45 compute-0 nova_compute[183075]: 2026-01-22 17:28:45.006 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:28:45 compute-0 nova_compute[183075]: 2026-01-22 17:28:45.527 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:46 compute-0 podman[231538]: 2026-01-22 17:28:46.341365247 +0000 UTC m=+0.053220682 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 17:28:47 compute-0 nova_compute[183075]: 2026-01-22 17:28:47.378 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:50 compute-0 nova_compute[183075]: 2026-01-22 17:28:50.199 183079 INFO nova.compute.manager [None req-345cee60-9b45-4665-9200-6293fcb70153 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Get console output
Jan 22 17:28:50 compute-0 nova_compute[183075]: 2026-01-22 17:28:50.204 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:28:50 compute-0 nova_compute[183075]: 2026-01-22 17:28:50.580 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:51 compute-0 nova_compute[183075]: 2026-01-22 17:28:51.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:28:51 compute-0 nova_compute[183075]: 2026-01-22 17:28:51.788 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:28:51 compute-0 nova_compute[183075]: 2026-01-22 17:28:51.788 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:28:51 compute-0 nova_compute[183075]: 2026-01-22 17:28:51.789 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:28:51 compute-0 nova_compute[183075]: 2026-01-22 17:28:51.789 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:28:51 compute-0 nova_compute[183075]: 2026-01-22 17:28:51.789 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:28:51 compute-0 nova_compute[183075]: 2026-01-22 17:28:51.790 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:28:51 compute-0 nova_compute[183075]: 2026-01-22 17:28:51.836 183079 DEBUG nova.virt.libvirt.imagecache [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314
Jan 22 17:28:51 compute-0 nova_compute[183075]: 2026-01-22 17:28:51.836 183079 DEBUG nova.virt.libvirt.imagecache [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Image id e1b65bbe-5c14-4552-a5d9-d275c9dd42d3 yields fingerprint dc114733697ffcf2ceab5e1bcdf92e07c516f218 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319
Jan 22 17:28:51 compute-0 nova_compute[183075]: 2026-01-22 17:28:51.837 183079 INFO nova.virt.libvirt.imagecache [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] image e1b65bbe-5c14-4552-a5d9-d275c9dd42d3 at (/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218): checking
Jan 22 17:28:51 compute-0 nova_compute[183075]: 2026-01-22 17:28:51.837 183079 DEBUG nova.virt.libvirt.imagecache [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] image e1b65bbe-5c14-4552-a5d9-d275c9dd42d3 at (/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218): image is in use _mark_in_use /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:279
Jan 22 17:28:51 compute-0 nova_compute[183075]: 2026-01-22 17:28:51.838 183079 DEBUG nova.virt.libvirt.imagecache [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Image id  yields fingerprint da39a3ee5e6b4b0d3255bfef95601890afd80709 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319
Jan 22 17:28:51 compute-0 nova_compute[183075]: 2026-01-22 17:28:51.839 183079 DEBUG nova.virt.libvirt.imagecache [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] 63c5b8fd-b774-4470-872c-5e8c954d75e3 is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126
Jan 22 17:28:51 compute-0 nova_compute[183075]: 2026-01-22 17:28:51.839 183079 DEBUG nova.virt.libvirt.imagecache [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] 63c5b8fd-b774-4470-872c-5e8c954d75e3 has a disk file _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:129
Jan 22 17:28:51 compute-0 nova_compute[183075]: 2026-01-22 17:28:51.840 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63c5b8fd-b774-4470-872c-5e8c954d75e3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:28:51 compute-0 nova_compute[183075]: 2026-01-22 17:28:51.900 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63c5b8fd-b774-4470-872c-5e8c954d75e3/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:28:51 compute-0 nova_compute[183075]: 2026-01-22 17:28:51.901 183079 DEBUG nova.virt.libvirt.imagecache [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 63c5b8fd-b774-4470-872c-5e8c954d75e3 is backed by dc114733697ffcf2ceab5e1bcdf92e07c516f218 _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:141
Jan 22 17:28:51 compute-0 nova_compute[183075]: 2026-01-22 17:28:51.901 183079 INFO nova.virt.libvirt.imagecache [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Active base files: /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218
Jan 22 17:28:51 compute-0 nova_compute[183075]: 2026-01-22 17:28:51.901 183079 DEBUG nova.virt.libvirt.imagecache [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350
Jan 22 17:28:51 compute-0 nova_compute[183075]: 2026-01-22 17:28:51.902 183079 DEBUG nova.virt.libvirt.imagecache [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299
Jan 22 17:28:51 compute-0 nova_compute[183075]: 2026-01-22 17:28:51.902 183079 DEBUG nova.virt.libvirt.imagecache [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284
Jan 22 17:28:52 compute-0 ovn_controller[95372]: 2026-01-22T17:28:52Z|00547|binding|INFO|Releasing lport d7c95871-d767-4104-b2f1-75f5b06e0524 from this chassis (sb_readonly=0)
Jan 22 17:28:52 compute-0 nova_compute[183075]: 2026-01-22 17:28:52.368 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:52 compute-0 NetworkManager[55454]: <info>  [1769102932.3691] manager: (patch-provnet-c63dea44-3fe3-44c3-ba47-76ee1ea0ad94-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/233)
Jan 22 17:28:52 compute-0 NetworkManager[55454]: <info>  [1769102932.3700] manager: (patch-br-int-to-provnet-c63dea44-3fe3-44c3-ba47-76ee1ea0ad94): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/234)
Jan 22 17:28:52 compute-0 nova_compute[183075]: 2026-01-22 17:28:52.397 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:52 compute-0 ovn_controller[95372]: 2026-01-22T17:28:52Z|00548|binding|INFO|Releasing lport d7c95871-d767-4104-b2f1-75f5b06e0524 from this chassis (sb_readonly=0)
Jan 22 17:28:52 compute-0 nova_compute[183075]: 2026-01-22 17:28:52.405 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:52 compute-0 nova_compute[183075]: 2026-01-22 17:28:52.841 183079 DEBUG nova.compute.manager [req-c2a046bc-85df-4d36-8ade-3f7e7324815b req-f5e2b6c9-b9f0-4601-9749-b1ea6ab2e72a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Received event network-changed-cd6c900d-6fa3-4cc6-bc5c-763ec54d8240 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:28:52 compute-0 nova_compute[183075]: 2026-01-22 17:28:52.842 183079 DEBUG nova.compute.manager [req-c2a046bc-85df-4d36-8ade-3f7e7324815b req-f5e2b6c9-b9f0-4601-9749-b1ea6ab2e72a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Refreshing instance network info cache due to event network-changed-cd6c900d-6fa3-4cc6-bc5c-763ec54d8240. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:28:52 compute-0 nova_compute[183075]: 2026-01-22 17:28:52.842 183079 DEBUG oslo_concurrency.lockutils [req-c2a046bc-85df-4d36-8ade-3f7e7324815b req-f5e2b6c9-b9f0-4601-9749-b1ea6ab2e72a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-63c5b8fd-b774-4470-872c-5e8c954d75e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:28:52 compute-0 nova_compute[183075]: 2026-01-22 17:28:52.842 183079 DEBUG oslo_concurrency.lockutils [req-c2a046bc-85df-4d36-8ade-3f7e7324815b req-f5e2b6c9-b9f0-4601-9749-b1ea6ab2e72a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-63c5b8fd-b774-4470-872c-5e8c954d75e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:28:52 compute-0 nova_compute[183075]: 2026-01-22 17:28:52.842 183079 DEBUG nova.network.neutron [req-c2a046bc-85df-4d36-8ade-3f7e7324815b req-f5e2b6c9-b9f0-4601-9749-b1ea6ab2e72a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Refreshing network info cache for port cd6c900d-6fa3-4cc6-bc5c-763ec54d8240 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:28:52 compute-0 nova_compute[183075]: 2026-01-22 17:28:52.964 183079 DEBUG oslo_concurrency.lockutils [None req-937bf7f7-f937-4fd7-8fb7-85ed91146b65 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Acquiring lock "63c5b8fd-b774-4470-872c-5e8c954d75e3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:28:52 compute-0 nova_compute[183075]: 2026-01-22 17:28:52.965 183079 DEBUG oslo_concurrency.lockutils [None req-937bf7f7-f937-4fd7-8fb7-85ed91146b65 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "63c5b8fd-b774-4470-872c-5e8c954d75e3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:28:52 compute-0 nova_compute[183075]: 2026-01-22 17:28:52.965 183079 DEBUG oslo_concurrency.lockutils [None req-937bf7f7-f937-4fd7-8fb7-85ed91146b65 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Acquiring lock "63c5b8fd-b774-4470-872c-5e8c954d75e3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:28:52 compute-0 nova_compute[183075]: 2026-01-22 17:28:52.965 183079 DEBUG oslo_concurrency.lockutils [None req-937bf7f7-f937-4fd7-8fb7-85ed91146b65 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "63c5b8fd-b774-4470-872c-5e8c954d75e3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:28:52 compute-0 nova_compute[183075]: 2026-01-22 17:28:52.966 183079 DEBUG oslo_concurrency.lockutils [None req-937bf7f7-f937-4fd7-8fb7-85ed91146b65 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "63c5b8fd-b774-4470-872c-5e8c954d75e3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:28:52 compute-0 nova_compute[183075]: 2026-01-22 17:28:52.967 183079 INFO nova.compute.manager [None req-937bf7f7-f937-4fd7-8fb7-85ed91146b65 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Terminating instance
Jan 22 17:28:52 compute-0 nova_compute[183075]: 2026-01-22 17:28:52.968 183079 DEBUG nova.compute.manager [None req-937bf7f7-f937-4fd7-8fb7-85ed91146b65 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:28:52 compute-0 kernel: tapcd6c900d-6f (unregistering): left promiscuous mode
Jan 22 17:28:53 compute-0 NetworkManager[55454]: <info>  [1769102933.0006] device (tapcd6c900d-6f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:28:53 compute-0 ovn_controller[95372]: 2026-01-22T17:28:53Z|00549|binding|INFO|Releasing lport cd6c900d-6fa3-4cc6-bc5c-763ec54d8240 from this chassis (sb_readonly=0)
Jan 22 17:28:53 compute-0 ovn_controller[95372]: 2026-01-22T17:28:53Z|00550|binding|INFO|Setting lport cd6c900d-6fa3-4cc6-bc5c-763ec54d8240 down in Southbound
Jan 22 17:28:53 compute-0 nova_compute[183075]: 2026-01-22 17:28:53.012 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:53 compute-0 ovn_controller[95372]: 2026-01-22T17:28:53Z|00551|binding|INFO|Removing iface tapcd6c900d-6f ovn-installed in OVS
Jan 22 17:28:53 compute-0 nova_compute[183075]: 2026-01-22 17:28:53.015 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:53 compute-0 nova_compute[183075]: 2026-01-22 17:28:53.031 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:53.056 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:48:31:10 10.100.0.14'], port_security=['fa:16:3e:48:31:10 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-fixed_ip_port-448850519', 'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '63c5b8fd-b774-4470-872c-5e8c954d75e3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-012007cf-673c-4f83-a4b9-f21a913a1ccf', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-fixed_ip_port-448850519', 'neutron:project_id': '7ff1e5ce4806445a8e463c71b6930bec', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a8d0872f-de3c-4404-94dc-7a328e5b8aa9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.190'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a510149a-49ad-47bc-a8ad-05908544b3cc, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=cd6c900d-6fa3-4cc6-bc5c-763ec54d8240) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:28:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:53.057 104629 INFO neutron.agent.ovn.metadata.agent [-] Port cd6c900d-6fa3-4cc6-bc5c-763ec54d8240 in datapath 012007cf-673c-4f83-a4b9-f21a913a1ccf unbound from our chassis
Jan 22 17:28:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:53.058 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 012007cf-673c-4f83-a4b9-f21a913a1ccf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:28:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:53.059 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c75664ea-b683-452e-9b39-9bed961bac16]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:53.060 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf namespace which is not needed anymore
Jan 22 17:28:53 compute-0 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d00000031.scope: Deactivated successfully.
Jan 22 17:28:53 compute-0 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d00000031.scope: Consumed 12.990s CPU time.
Jan 22 17:28:53 compute-0 systemd-machined[154382]: Machine qemu-49-instance-00000031 terminated.
Jan 22 17:28:53 compute-0 neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf[231457]: [NOTICE]   (231461) : haproxy version is 2.8.14-c23fe91
Jan 22 17:28:53 compute-0 neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf[231457]: [NOTICE]   (231461) : path to executable is /usr/sbin/haproxy
Jan 22 17:28:53 compute-0 neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf[231457]: [WARNING]  (231461) : Exiting Master process...
Jan 22 17:28:53 compute-0 neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf[231457]: [ALERT]    (231461) : Current worker (231463) exited with code 143 (Terminated)
Jan 22 17:28:53 compute-0 neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf[231457]: [WARNING]  (231461) : All workers exited. Exiting... (0)
Jan 22 17:28:53 compute-0 systemd[1]: libpod-78a9b321be2cc52cbf4e1b1d063fd3878ba4075bc0a98f2cdfe1fee328b061ed.scope: Deactivated successfully.
Jan 22 17:28:53 compute-0 podman[231588]: 2026-01-22 17:28:53.189707013 +0000 UTC m=+0.043707169 container died 78a9b321be2cc52cbf4e1b1d063fd3878ba4075bc0a98f2cdfe1fee328b061ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:28:53 compute-0 systemd[1]: var-lib-containers-storage-overlay-db8cb2ffd055694dba6406ff389fb600eb164060cddf50c5bcd7a262c0c8c1cb-merged.mount: Deactivated successfully.
Jan 22 17:28:53 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-78a9b321be2cc52cbf4e1b1d063fd3878ba4075bc0a98f2cdfe1fee328b061ed-userdata-shm.mount: Deactivated successfully.
Jan 22 17:28:53 compute-0 nova_compute[183075]: 2026-01-22 17:28:53.220 183079 INFO nova.virt.libvirt.driver [-] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Instance destroyed successfully.
Jan 22 17:28:53 compute-0 nova_compute[183075]: 2026-01-22 17:28:53.220 183079 DEBUG nova.objects.instance [None req-937bf7f7-f937-4fd7-8fb7-85ed91146b65 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lazy-loading 'resources' on Instance uuid 63c5b8fd-b774-4470-872c-5e8c954d75e3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:28:53 compute-0 podman[231588]: 2026-01-22 17:28:53.223939337 +0000 UTC m=+0.077939493 container cleanup 78a9b321be2cc52cbf4e1b1d063fd3878ba4075bc0a98f2cdfe1fee328b061ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 17:28:53 compute-0 systemd[1]: libpod-conmon-78a9b321be2cc52cbf4e1b1d063fd3878ba4075bc0a98f2cdfe1fee328b061ed.scope: Deactivated successfully.
Jan 22 17:28:53 compute-0 nova_compute[183075]: 2026-01-22 17:28:53.241 183079 DEBUG nova.virt.libvirt.vif [None req-937bf7f7-f937-4fd7-8fb7-85ed91146b65 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:28:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1014501839',display_name='tempest-server-test-1014501839',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1014501839',id=49,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHQrJNwmVmi0v3CvDVkdf2ULfmIKW9OE2obw9UEIh0JilIeeueUzwA1cDH+T5CoOIGZz/satGSZDSgKqtLklRpNQ/Wm6QLNBLAjV/3q74U9Y8J0BPwM5hIfTkFkrKFKf2g==',key_name='tempest-keypair-test-1359386190',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:28:26Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7ff1e5ce4806445a8e463c71b6930bec',ramdisk_id='',reservation_id='r-2jkf7qy7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-PortsTest-1337721110',owner_user_name='tempest-PortsTest-1337721110-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:28:26Z,user_data=None,user_id='3a896d4927d442ffba421873948034be',uuid=63c5b8fd-b774-4470-872c-5e8c954d75e3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cd6c900d-6fa3-4cc6-bc5c-763ec54d8240", "address": "fa:16:3e:48:31:10", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd6c900d-6f", "ovs_interfaceid": "cd6c900d-6fa3-4cc6-bc5c-763ec54d8240", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:28:53 compute-0 nova_compute[183075]: 2026-01-22 17:28:53.242 183079 DEBUG nova.network.os_vif_util [None req-937bf7f7-f937-4fd7-8fb7-85ed91146b65 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Converting VIF {"id": "cd6c900d-6fa3-4cc6-bc5c-763ec54d8240", "address": "fa:16:3e:48:31:10", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd6c900d-6f", "ovs_interfaceid": "cd6c900d-6fa3-4cc6-bc5c-763ec54d8240", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:28:53 compute-0 nova_compute[183075]: 2026-01-22 17:28:53.242 183079 DEBUG nova.network.os_vif_util [None req-937bf7f7-f937-4fd7-8fb7-85ed91146b65 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:48:31:10,bridge_name='br-int',has_traffic_filtering=True,id=cd6c900d-6fa3-4cc6-bc5c-763ec54d8240,network=Network(012007cf-673c-4f83-a4b9-f21a913a1ccf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcd6c900d-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:28:53 compute-0 nova_compute[183075]: 2026-01-22 17:28:53.243 183079 DEBUG os_vif [None req-937bf7f7-f937-4fd7-8fb7-85ed91146b65 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:48:31:10,bridge_name='br-int',has_traffic_filtering=True,id=cd6c900d-6fa3-4cc6-bc5c-763ec54d8240,network=Network(012007cf-673c-4f83-a4b9-f21a913a1ccf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcd6c900d-6f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:28:53 compute-0 nova_compute[183075]: 2026-01-22 17:28:53.244 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:53 compute-0 nova_compute[183075]: 2026-01-22 17:28:53.244 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd6c900d-6f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:28:53 compute-0 nova_compute[183075]: 2026-01-22 17:28:53.248 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:28:53 compute-0 nova_compute[183075]: 2026-01-22 17:28:53.249 183079 INFO os_vif [None req-937bf7f7-f937-4fd7-8fb7-85ed91146b65 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:48:31:10,bridge_name='br-int',has_traffic_filtering=True,id=cd6c900d-6fa3-4cc6-bc5c-763ec54d8240,network=Network(012007cf-673c-4f83-a4b9-f21a913a1ccf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcd6c900d-6f')
Jan 22 17:28:53 compute-0 nova_compute[183075]: 2026-01-22 17:28:53.250 183079 INFO nova.virt.libvirt.driver [None req-937bf7f7-f937-4fd7-8fb7-85ed91146b65 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Deleting instance files /var/lib/nova/instances/63c5b8fd-b774-4470-872c-5e8c954d75e3_del
Jan 22 17:28:53 compute-0 nova_compute[183075]: 2026-01-22 17:28:53.250 183079 INFO nova.virt.libvirt.driver [None req-937bf7f7-f937-4fd7-8fb7-85ed91146b65 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Deletion of /var/lib/nova/instances/63c5b8fd-b774-4470-872c-5e8c954d75e3_del complete
Jan 22 17:28:53 compute-0 podman[231634]: 2026-01-22 17:28:53.283859748 +0000 UTC m=+0.039540737 container remove 78a9b321be2cc52cbf4e1b1d063fd3878ba4075bc0a98f2cdfe1fee328b061ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:28:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:53.288 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1424f457-4e9e-4855-869d-0a89a2ede2de]: (4, ('Thu Jan 22 05:28:53 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf (78a9b321be2cc52cbf4e1b1d063fd3878ba4075bc0a98f2cdfe1fee328b061ed)\n78a9b321be2cc52cbf4e1b1d063fd3878ba4075bc0a98f2cdfe1fee328b061ed\nThu Jan 22 05:28:53 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf (78a9b321be2cc52cbf4e1b1d063fd3878ba4075bc0a98f2cdfe1fee328b061ed)\n78a9b321be2cc52cbf4e1b1d063fd3878ba4075bc0a98f2cdfe1fee328b061ed\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:53.289 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6d296d53-e738-4b2b-8942-f7b16e9d21d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:53.290 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap012007cf-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:28:53 compute-0 nova_compute[183075]: 2026-01-22 17:28:53.292 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:53 compute-0 kernel: tap012007cf-60: left promiscuous mode
Jan 22 17:28:53 compute-0 nova_compute[183075]: 2026-01-22 17:28:53.303 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:53.306 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2768de5d-9297-41f8-8219-fb1f4fe71567]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:53 compute-0 nova_compute[183075]: 2026-01-22 17:28:53.318 183079 INFO nova.compute.manager [None req-937bf7f7-f937-4fd7-8fb7-85ed91146b65 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Took 0.35 seconds to destroy the instance on the hypervisor.
Jan 22 17:28:53 compute-0 nova_compute[183075]: 2026-01-22 17:28:53.318 183079 DEBUG oslo.service.loopingcall [None req-937bf7f7-f937-4fd7-8fb7-85ed91146b65 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:28:53 compute-0 nova_compute[183075]: 2026-01-22 17:28:53.319 183079 DEBUG nova.compute.manager [-] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:28:53 compute-0 nova_compute[183075]: 2026-01-22 17:28:53.319 183079 DEBUG nova.network.neutron [-] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:28:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:53.324 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[67deb78c-65f6-429b-b4ea-b7e0d9ed7f0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:53.325 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[48193e11-c361-4a5d-aa0b-f7356c5201c1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:53.342 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[48579f8e-5e67-4ba3-813d-b4122169ff67]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 526951, 'reachable_time': 17823, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231649, 'error': None, 'target': 'ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:53.344 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:28:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:53.344 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[ed373f4a-8340-45ca-812f-749c89a6d1c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:28:53 compute-0 systemd[1]: run-netns-ovnmeta\x2d012007cf\x2d673c\x2d4f83\x2da4b9\x2df21a913a1ccf.mount: Deactivated successfully.
Jan 22 17:28:54 compute-0 nova_compute[183075]: 2026-01-22 17:28:54.983 183079 DEBUG nova.compute.manager [req-46080a91-8456-4945-b099-5f9605c7484d req-d10ba8e9-8cec-4dd7-9ef5-71c94bdf0c11 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Received event network-vif-unplugged-cd6c900d-6fa3-4cc6-bc5c-763ec54d8240 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:28:54 compute-0 nova_compute[183075]: 2026-01-22 17:28:54.983 183079 DEBUG oslo_concurrency.lockutils [req-46080a91-8456-4945-b099-5f9605c7484d req-d10ba8e9-8cec-4dd7-9ef5-71c94bdf0c11 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "63c5b8fd-b774-4470-872c-5e8c954d75e3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:28:54 compute-0 nova_compute[183075]: 2026-01-22 17:28:54.984 183079 DEBUG oslo_concurrency.lockutils [req-46080a91-8456-4945-b099-5f9605c7484d req-d10ba8e9-8cec-4dd7-9ef5-71c94bdf0c11 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "63c5b8fd-b774-4470-872c-5e8c954d75e3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:28:54 compute-0 nova_compute[183075]: 2026-01-22 17:28:54.984 183079 DEBUG oslo_concurrency.lockutils [req-46080a91-8456-4945-b099-5f9605c7484d req-d10ba8e9-8cec-4dd7-9ef5-71c94bdf0c11 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "63c5b8fd-b774-4470-872c-5e8c954d75e3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:28:54 compute-0 nova_compute[183075]: 2026-01-22 17:28:54.985 183079 DEBUG nova.compute.manager [req-46080a91-8456-4945-b099-5f9605c7484d req-d10ba8e9-8cec-4dd7-9ef5-71c94bdf0c11 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] No waiting events found dispatching network-vif-unplugged-cd6c900d-6fa3-4cc6-bc5c-763ec54d8240 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:28:54 compute-0 nova_compute[183075]: 2026-01-22 17:28:54.985 183079 DEBUG nova.compute.manager [req-46080a91-8456-4945-b099-5f9605c7484d req-d10ba8e9-8cec-4dd7-9ef5-71c94bdf0c11 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Received event network-vif-unplugged-cd6c900d-6fa3-4cc6-bc5c-763ec54d8240 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:28:54 compute-0 nova_compute[183075]: 2026-01-22 17:28:54.985 183079 DEBUG nova.compute.manager [req-46080a91-8456-4945-b099-5f9605c7484d req-d10ba8e9-8cec-4dd7-9ef5-71c94bdf0c11 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Received event network-vif-plugged-cd6c900d-6fa3-4cc6-bc5c-763ec54d8240 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:28:54 compute-0 nova_compute[183075]: 2026-01-22 17:28:54.986 183079 DEBUG oslo_concurrency.lockutils [req-46080a91-8456-4945-b099-5f9605c7484d req-d10ba8e9-8cec-4dd7-9ef5-71c94bdf0c11 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "63c5b8fd-b774-4470-872c-5e8c954d75e3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:28:54 compute-0 nova_compute[183075]: 2026-01-22 17:28:54.986 183079 DEBUG oslo_concurrency.lockutils [req-46080a91-8456-4945-b099-5f9605c7484d req-d10ba8e9-8cec-4dd7-9ef5-71c94bdf0c11 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "63c5b8fd-b774-4470-872c-5e8c954d75e3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:28:54 compute-0 nova_compute[183075]: 2026-01-22 17:28:54.986 183079 DEBUG oslo_concurrency.lockutils [req-46080a91-8456-4945-b099-5f9605c7484d req-d10ba8e9-8cec-4dd7-9ef5-71c94bdf0c11 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "63c5b8fd-b774-4470-872c-5e8c954d75e3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:28:54 compute-0 nova_compute[183075]: 2026-01-22 17:28:54.987 183079 DEBUG nova.compute.manager [req-46080a91-8456-4945-b099-5f9605c7484d req-d10ba8e9-8cec-4dd7-9ef5-71c94bdf0c11 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] No waiting events found dispatching network-vif-plugged-cd6c900d-6fa3-4cc6-bc5c-763ec54d8240 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:28:54 compute-0 nova_compute[183075]: 2026-01-22 17:28:54.987 183079 WARNING nova.compute.manager [req-46080a91-8456-4945-b099-5f9605c7484d req-d10ba8e9-8cec-4dd7-9ef5-71c94bdf0c11 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Received unexpected event network-vif-plugged-cd6c900d-6fa3-4cc6-bc5c-763ec54d8240 for instance with vm_state active and task_state deleting.
Jan 22 17:28:55 compute-0 nova_compute[183075]: 2026-01-22 17:28:55.071 183079 DEBUG nova.network.neutron [req-c2a046bc-85df-4d36-8ade-3f7e7324815b req-f5e2b6c9-b9f0-4601-9749-b1ea6ab2e72a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Updated VIF entry in instance network info cache for port cd6c900d-6fa3-4cc6-bc5c-763ec54d8240. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:28:55 compute-0 nova_compute[183075]: 2026-01-22 17:28:55.072 183079 DEBUG nova.network.neutron [req-c2a046bc-85df-4d36-8ade-3f7e7324815b req-f5e2b6c9-b9f0-4601-9749-b1ea6ab2e72a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Updating instance_info_cache with network_info: [{"id": "cd6c900d-6fa3-4cc6-bc5c-763ec54d8240", "address": "fa:16:3e:48:31:10", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd6c900d-6f", "ovs_interfaceid": "cd6c900d-6fa3-4cc6-bc5c-763ec54d8240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:28:55 compute-0 nova_compute[183075]: 2026-01-22 17:28:55.091 183079 DEBUG oslo_concurrency.lockutils [req-c2a046bc-85df-4d36-8ade-3f7e7324815b req-f5e2b6c9-b9f0-4601-9749-b1ea6ab2e72a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-63c5b8fd-b774-4470-872c-5e8c954d75e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:28:55 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:55.458 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:28:55 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:55.459 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:28:55 compute-0 nova_compute[183075]: 2026-01-22 17:28:55.490 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:55 compute-0 nova_compute[183075]: 2026-01-22 17:28:55.595 183079 DEBUG nova.network.neutron [-] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:28:55 compute-0 nova_compute[183075]: 2026-01-22 17:28:55.609 183079 INFO nova.compute.manager [-] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Took 2.29 seconds to deallocate network for instance.
Jan 22 17:28:55 compute-0 nova_compute[183075]: 2026-01-22 17:28:55.673 183079 DEBUG oslo_concurrency.lockutils [None req-937bf7f7-f937-4fd7-8fb7-85ed91146b65 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:28:55 compute-0 nova_compute[183075]: 2026-01-22 17:28:55.673 183079 DEBUG oslo_concurrency.lockutils [None req-937bf7f7-f937-4fd7-8fb7-85ed91146b65 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:28:55 compute-0 nova_compute[183075]: 2026-01-22 17:28:55.727 183079 DEBUG nova.compute.provider_tree [None req-937bf7f7-f937-4fd7-8fb7-85ed91146b65 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:28:55 compute-0 nova_compute[183075]: 2026-01-22 17:28:55.747 183079 DEBUG nova.scheduler.client.report [None req-937bf7f7-f937-4fd7-8fb7-85ed91146b65 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:28:55 compute-0 nova_compute[183075]: 2026-01-22 17:28:55.768 183079 DEBUG oslo_concurrency.lockutils [None req-937bf7f7-f937-4fd7-8fb7-85ed91146b65 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:28:55 compute-0 nova_compute[183075]: 2026-01-22 17:28:55.814 183079 INFO nova.scheduler.client.report [None req-937bf7f7-f937-4fd7-8fb7-85ed91146b65 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Deleted allocations for instance 63c5b8fd-b774-4470-872c-5e8c954d75e3
Jan 22 17:28:55 compute-0 nova_compute[183075]: 2026-01-22 17:28:55.885 183079 DEBUG oslo_concurrency.lockutils [None req-937bf7f7-f937-4fd7-8fb7-85ed91146b65 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "63c5b8fd-b774-4470-872c-5e8c954d75e3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.920s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:28:56 compute-0 podman[231651]: 2026-01-22 17:28:56.372618728 +0000 UTC m=+0.075807786 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 17:28:56 compute-0 podman[231650]: 2026-01-22 17:28:56.378508216 +0000 UTC m=+0.084936080 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, container_name=ovn_controller)
Jan 22 17:28:56 compute-0 podman[231652]: 2026-01-22 17:28:56.385680867 +0000 UTC m=+0.073166335 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
distribution-scope=public, release=1755695350, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9)
Jan 22 17:28:57 compute-0 nova_compute[183075]: 2026-01-22 17:28:57.403 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:28:57.461 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:28:58 compute-0 nova_compute[183075]: 2026-01-22 17:28:58.247 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:01 compute-0 podman[231714]: 2026-01-22 17:29:01.799367452 +0000 UTC m=+0.499618387 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:29:02 compute-0 nova_compute[183075]: 2026-01-22 17:29:02.405 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:03 compute-0 nova_compute[183075]: 2026-01-22 17:29:03.250 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:07 compute-0 nova_compute[183075]: 2026-01-22 17:29:07.407 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:08 compute-0 nova_compute[183075]: 2026-01-22 17:29:08.220 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769102933.2196062, 63c5b8fd-b774-4470-872c-5e8c954d75e3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:29:08 compute-0 nova_compute[183075]: 2026-01-22 17:29:08.221 183079 INFO nova.compute.manager [-] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] VM Stopped (Lifecycle Event)
Jan 22 17:29:08 compute-0 nova_compute[183075]: 2026-01-22 17:29:08.252 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:09 compute-0 nova_compute[183075]: 2026-01-22 17:29:09.807 183079 DEBUG nova.compute.manager [None req-af223bda-8a45-480e-8f78-10e171ecb8a3 - - - - - -] [instance: 63c5b8fd-b774-4470-872c-5e8c954d75e3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:29:11 compute-0 nova_compute[183075]: 2026-01-22 17:29:11.112 183079 DEBUG oslo_concurrency.lockutils [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Acquiring lock "558df49c-4071-47cf-9f12-34cc70b1f266" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:29:11 compute-0 nova_compute[183075]: 2026-01-22 17:29:11.112 183079 DEBUG oslo_concurrency.lockutils [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "558df49c-4071-47cf-9f12-34cc70b1f266" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:29:11 compute-0 podman[231735]: 2026-01-22 17:29:11.353720651 +0000 UTC m=+0.062509911 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:29:11 compute-0 nova_compute[183075]: 2026-01-22 17:29:11.739 183079 DEBUG nova.compute.manager [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:29:11 compute-0 nova_compute[183075]: 2026-01-22 17:29:11.826 183079 DEBUG oslo_concurrency.lockutils [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:29:11 compute-0 nova_compute[183075]: 2026-01-22 17:29:11.827 183079 DEBUG oslo_concurrency.lockutils [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:29:11 compute-0 nova_compute[183075]: 2026-01-22 17:29:11.834 183079 DEBUG nova.virt.hardware [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:29:11 compute-0 nova_compute[183075]: 2026-01-22 17:29:11.835 183079 INFO nova.compute.claims [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:29:11 compute-0 nova_compute[183075]: 2026-01-22 17:29:11.964 183079 DEBUG nova.compute.provider_tree [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:29:11 compute-0 nova_compute[183075]: 2026-01-22 17:29:11.982 183079 DEBUG nova.scheduler.client.report [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.009 183079 DEBUG oslo_concurrency.lockutils [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.010 183079 DEBUG nova.compute.manager [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.066 183079 DEBUG nova.compute.manager [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.066 183079 DEBUG nova.network.neutron [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.086 183079 INFO nova.virt.libvirt.driver [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.110 183079 DEBUG nova.compute.manager [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.196 183079 DEBUG nova.compute.manager [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.197 183079 DEBUG nova.virt.libvirt.driver [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.197 183079 INFO nova.virt.libvirt.driver [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Creating image(s)
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.198 183079 DEBUG oslo_concurrency.lockutils [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Acquiring lock "/var/lib/nova/instances/558df49c-4071-47cf-9f12-34cc70b1f266/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.198 183079 DEBUG oslo_concurrency.lockutils [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "/var/lib/nova/instances/558df49c-4071-47cf-9f12-34cc70b1f266/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.199 183079 DEBUG oslo_concurrency.lockutils [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "/var/lib/nova/instances/558df49c-4071-47cf-9f12-34cc70b1f266/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.211 183079 DEBUG oslo_concurrency.processutils [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.279 183079 DEBUG oslo_concurrency.processutils [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.280 183079 DEBUG oslo_concurrency.lockutils [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.280 183079 DEBUG oslo_concurrency.lockutils [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.291 183079 DEBUG oslo_concurrency.processutils [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.350 183079 DEBUG oslo_concurrency.processutils [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.352 183079 DEBUG oslo_concurrency.processutils [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/558df49c-4071-47cf-9f12-34cc70b1f266/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.397 183079 DEBUG oslo_concurrency.processutils [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/558df49c-4071-47cf-9f12-34cc70b1f266/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.398 183079 DEBUG oslo_concurrency.lockutils [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.399 183079 DEBUG oslo_concurrency.processutils [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.415 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.455 183079 DEBUG oslo_concurrency.processutils [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.455 183079 DEBUG nova.virt.disk.api [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Checking if we can resize image /var/lib/nova/instances/558df49c-4071-47cf-9f12-34cc70b1f266/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.456 183079 DEBUG oslo_concurrency.processutils [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/558df49c-4071-47cf-9f12-34cc70b1f266/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.514 183079 DEBUG oslo_concurrency.processutils [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/558df49c-4071-47cf-9f12-34cc70b1f266/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.515 183079 DEBUG nova.virt.disk.api [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Cannot resize image /var/lib/nova/instances/558df49c-4071-47cf-9f12-34cc70b1f266/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.515 183079 DEBUG nova.objects.instance [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lazy-loading 'migration_context' on Instance uuid 558df49c-4071-47cf-9f12-34cc70b1f266 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.529 183079 DEBUG nova.virt.libvirt.driver [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.529 183079 DEBUG nova.virt.libvirt.driver [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Ensure instance console log exists: /var/lib/nova/instances/558df49c-4071-47cf-9f12-34cc70b1f266/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.529 183079 DEBUG oslo_concurrency.lockutils [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.530 183079 DEBUG oslo_concurrency.lockutils [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.530 183079 DEBUG oslo_concurrency.lockutils [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:29:12 compute-0 nova_compute[183075]: 2026-01-22 17:29:12.704 183079 DEBUG nova.policy [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3a896d4927d442ffba421873948034be', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7ff1e5ce4806445a8e463c71b6930bec', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:29:13 compute-0 nova_compute[183075]: 2026-01-22 17:29:13.254 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:13 compute-0 nova_compute[183075]: 2026-01-22 17:29:13.507 183079 DEBUG nova.network.neutron [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Successfully updated port: cd31a146-70f5-4610-88f1-ae4772887ce2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:29:13 compute-0 nova_compute[183075]: 2026-01-22 17:29:13.522 183079 DEBUG oslo_concurrency.lockutils [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Acquiring lock "refresh_cache-558df49c-4071-47cf-9f12-34cc70b1f266" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:29:13 compute-0 nova_compute[183075]: 2026-01-22 17:29:13.523 183079 DEBUG oslo_concurrency.lockutils [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Acquired lock "refresh_cache-558df49c-4071-47cf-9f12-34cc70b1f266" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:29:13 compute-0 nova_compute[183075]: 2026-01-22 17:29:13.523 183079 DEBUG nova.network.neutron [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:29:13 compute-0 nova_compute[183075]: 2026-01-22 17:29:13.628 183079 DEBUG nova.compute.manager [req-7006c33f-4f1b-4abb-b26b-6c378d38dbb0 req-0fd1701a-5a15-405a-a97b-32ea78679472 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Received event network-changed-cd31a146-70f5-4610-88f1-ae4772887ce2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:29:13 compute-0 nova_compute[183075]: 2026-01-22 17:29:13.628 183079 DEBUG nova.compute.manager [req-7006c33f-4f1b-4abb-b26b-6c378d38dbb0 req-0fd1701a-5a15-405a-a97b-32ea78679472 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Refreshing instance network info cache due to event network-changed-cd31a146-70f5-4610-88f1-ae4772887ce2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:29:13 compute-0 nova_compute[183075]: 2026-01-22 17:29:13.629 183079 DEBUG oslo_concurrency.lockutils [req-7006c33f-4f1b-4abb-b26b-6c378d38dbb0 req-0fd1701a-5a15-405a-a97b-32ea78679472 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-558df49c-4071-47cf-9f12-34cc70b1f266" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:29:13 compute-0 nova_compute[183075]: 2026-01-22 17:29:13.702 183079 DEBUG nova.network.neutron [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.157 183079 DEBUG nova.network.neutron [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Updating instance_info_cache with network_info: [{"id": "cd31a146-70f5-4610-88f1-ae4772887ce2", "address": "fa:16:3e:f2:f4:bc", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd31a146-70", "ovs_interfaceid": "cd31a146-70f5-4610-88f1-ae4772887ce2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.522 183079 DEBUG oslo_concurrency.lockutils [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Releasing lock "refresh_cache-558df49c-4071-47cf-9f12-34cc70b1f266" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.522 183079 DEBUG nova.compute.manager [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Instance network_info: |[{"id": "cd31a146-70f5-4610-88f1-ae4772887ce2", "address": "fa:16:3e:f2:f4:bc", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd31a146-70", "ovs_interfaceid": "cd31a146-70f5-4610-88f1-ae4772887ce2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.523 183079 DEBUG oslo_concurrency.lockutils [req-7006c33f-4f1b-4abb-b26b-6c378d38dbb0 req-0fd1701a-5a15-405a-a97b-32ea78679472 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-558df49c-4071-47cf-9f12-34cc70b1f266" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.523 183079 DEBUG nova.network.neutron [req-7006c33f-4f1b-4abb-b26b-6c378d38dbb0 req-0fd1701a-5a15-405a-a97b-32ea78679472 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Refreshing network info cache for port cd31a146-70f5-4610-88f1-ae4772887ce2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.528 183079 DEBUG nova.virt.libvirt.driver [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Start _get_guest_xml network_info=[{"id": "cd31a146-70f5-4610-88f1-ae4772887ce2", "address": "fa:16:3e:f2:f4:bc", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd31a146-70", "ovs_interfaceid": "cd31a146-70f5-4610-88f1-ae4772887ce2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.534 183079 WARNING nova.virt.libvirt.driver [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.541 183079 DEBUG nova.virt.libvirt.host [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.541 183079 DEBUG nova.virt.libvirt.host [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.550 183079 DEBUG nova.virt.libvirt.host [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.550 183079 DEBUG nova.virt.libvirt.host [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.551 183079 DEBUG nova.virt.libvirt.driver [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.551 183079 DEBUG nova.virt.hardware [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.552 183079 DEBUG nova.virt.hardware [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.552 183079 DEBUG nova.virt.hardware [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.553 183079 DEBUG nova.virt.hardware [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.553 183079 DEBUG nova.virt.hardware [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.553 183079 DEBUG nova.virt.hardware [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.554 183079 DEBUG nova.virt.hardware [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.554 183079 DEBUG nova.virt.hardware [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.554 183079 DEBUG nova.virt.hardware [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.554 183079 DEBUG nova.virt.hardware [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.555 183079 DEBUG nova.virt.hardware [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.560 183079 DEBUG nova.virt.libvirt.vif [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:29:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1076717562',display_name='tempest-server-test-1076717562',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1076717562',id=50,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHQrJNwmVmi0v3CvDVkdf2ULfmIKW9OE2obw9UEIh0JilIeeueUzwA1cDH+T5CoOIGZz/satGSZDSgKqtLklRpNQ/Wm6QLNBLAjV/3q74U9Y8J0BPwM5hIfTkFkrKFKf2g==',key_name='tempest-keypair-test-1359386190',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ff1e5ce4806445a8e463c71b6930bec',ramdisk_id='',reservation_id='r-74azlj4z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortsTest-1337721110',owner_user_name='tempest-PortsTest-1337721110-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:29:12Z,user_data=None,user_id='3a896d4927d442ffba421873948034be',uuid=558df49c-4071-47cf-9f12-34cc70b1f266,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cd31a146-70f5-4610-88f1-ae4772887ce2", "address": "fa:16:3e:f2:f4:bc", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd31a146-70", "ovs_interfaceid": "cd31a146-70f5-4610-88f1-ae4772887ce2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.560 183079 DEBUG nova.network.os_vif_util [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Converting VIF {"id": "cd31a146-70f5-4610-88f1-ae4772887ce2", "address": "fa:16:3e:f2:f4:bc", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd31a146-70", "ovs_interfaceid": "cd31a146-70f5-4610-88f1-ae4772887ce2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.561 183079 DEBUG nova.network.os_vif_util [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:f4:bc,bridge_name='br-int',has_traffic_filtering=True,id=cd31a146-70f5-4610-88f1-ae4772887ce2,network=Network(012007cf-673c-4f83-a4b9-f21a913a1ccf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcd31a146-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.562 183079 DEBUG nova.objects.instance [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lazy-loading 'pci_devices' on Instance uuid 558df49c-4071-47cf-9f12-34cc70b1f266 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.645 183079 DEBUG nova.virt.libvirt.driver [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:29:15 compute-0 nova_compute[183075]:   <uuid>558df49c-4071-47cf-9f12-34cc70b1f266</uuid>
Jan 22 17:29:15 compute-0 nova_compute[183075]:   <name>instance-00000032</name>
Jan 22 17:29:15 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:29:15 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:29:15 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:29:15 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1076717562</nova:name>
Jan 22 17:29:15 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:29:15</nova:creationTime>
Jan 22 17:29:15 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:29:15 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:29:15 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:29:15 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:29:15 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:29:15 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:29:15 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:29:15 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:29:15 compute-0 nova_compute[183075]:         <nova:user uuid="3a896d4927d442ffba421873948034be">tempest-PortsTest-1337721110-project-member</nova:user>
Jan 22 17:29:15 compute-0 nova_compute[183075]:         <nova:project uuid="7ff1e5ce4806445a8e463c71b6930bec">tempest-PortsTest-1337721110</nova:project>
Jan 22 17:29:15 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:29:15 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:29:15 compute-0 nova_compute[183075]:         <nova:port uuid="cd31a146-70f5-4610-88f1-ae4772887ce2">
Jan 22 17:29:15 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:29:15 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:29:15 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:29:15 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <system>
Jan 22 17:29:15 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:29:15 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:29:15 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:29:15 compute-0 nova_compute[183075]:       <entry name="serial">558df49c-4071-47cf-9f12-34cc70b1f266</entry>
Jan 22 17:29:15 compute-0 nova_compute[183075]:       <entry name="uuid">558df49c-4071-47cf-9f12-34cc70b1f266</entry>
Jan 22 17:29:15 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     </system>
Jan 22 17:29:15 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:29:15 compute-0 nova_compute[183075]:   <os>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:   </os>
Jan 22 17:29:15 compute-0 nova_compute[183075]:   <features>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:   </features>
Jan 22 17:29:15 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:29:15 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:29:15 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:29:15 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/558df49c-4071-47cf-9f12-34cc70b1f266/disk"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:29:15 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:f2:f4:bc"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:       <target dev="tapcd31a146-70"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:29:15 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/558df49c-4071-47cf-9f12-34cc70b1f266/console.log" append="off"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <video>
Jan 22 17:29:15 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     </video>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:29:15 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:29:15 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:29:15 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:29:15 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:29:15 compute-0 nova_compute[183075]: </domain>
Jan 22 17:29:15 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.646 183079 DEBUG nova.compute.manager [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Preparing to wait for external event network-vif-plugged-cd31a146-70f5-4610-88f1-ae4772887ce2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.646 183079 DEBUG oslo_concurrency.lockutils [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Acquiring lock "558df49c-4071-47cf-9f12-34cc70b1f266-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.647 183079 DEBUG oslo_concurrency.lockutils [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "558df49c-4071-47cf-9f12-34cc70b1f266-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.647 183079 DEBUG oslo_concurrency.lockutils [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "558df49c-4071-47cf-9f12-34cc70b1f266-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.648 183079 DEBUG nova.virt.libvirt.vif [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:29:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1076717562',display_name='tempest-server-test-1076717562',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1076717562',id=50,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHQrJNwmVmi0v3CvDVkdf2ULfmIKW9OE2obw9UEIh0JilIeeueUzwA1cDH+T5CoOIGZz/satGSZDSgKqtLklRpNQ/Wm6QLNBLAjV/3q74U9Y8J0BPwM5hIfTkFkrKFKf2g==',key_name='tempest-keypair-test-1359386190',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ff1e5ce4806445a8e463c71b6930bec',ramdisk_id='',reservation_id='r-74azlj4z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_
rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortsTest-1337721110',owner_user_name='tempest-PortsTest-1337721110-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:29:12Z,user_data=None,user_id='3a896d4927d442ffba421873948034be',uuid=558df49c-4071-47cf-9f12-34cc70b1f266,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cd31a146-70f5-4610-88f1-ae4772887ce2", "address": "fa:16:3e:f2:f4:bc", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd31a146-70", "ovs_interfaceid": "cd31a146-70f5-4610-88f1-ae4772887ce2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.648 183079 DEBUG nova.network.os_vif_util [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Converting VIF {"id": "cd31a146-70f5-4610-88f1-ae4772887ce2", "address": "fa:16:3e:f2:f4:bc", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd31a146-70", "ovs_interfaceid": "cd31a146-70f5-4610-88f1-ae4772887ce2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.649 183079 DEBUG nova.network.os_vif_util [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:f4:bc,bridge_name='br-int',has_traffic_filtering=True,id=cd31a146-70f5-4610-88f1-ae4772887ce2,network=Network(012007cf-673c-4f83-a4b9-f21a913a1ccf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcd31a146-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.649 183079 DEBUG os_vif [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:f4:bc,bridge_name='br-int',has_traffic_filtering=True,id=cd31a146-70f5-4610-88f1-ae4772887ce2,network=Network(012007cf-673c-4f83-a4b9-f21a913a1ccf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcd31a146-70') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.650 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.650 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.651 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.653 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.653 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcd31a146-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.654 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcd31a146-70, col_values=(('external_ids', {'iface-id': 'cd31a146-70f5-4610-88f1-ae4772887ce2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f2:f4:bc', 'vm-uuid': '558df49c-4071-47cf-9f12-34cc70b1f266'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.655 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:15 compute-0 NetworkManager[55454]: <info>  [1769102955.6570] manager: (tapcd31a146-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/235)
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.658 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.662 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.663 183079 INFO os_vif [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:f4:bc,bridge_name='br-int',has_traffic_filtering=True,id=cd31a146-70f5-4610-88f1-ae4772887ce2,network=Network(012007cf-673c-4f83-a4b9-f21a913a1ccf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcd31a146-70')
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.859 183079 DEBUG nova.virt.libvirt.driver [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.860 183079 DEBUG nova.virt.libvirt.driver [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] No VIF found with MAC fa:16:3e:f2:f4:bc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:29:15 compute-0 kernel: tapcd31a146-70: entered promiscuous mode
Jan 22 17:29:15 compute-0 NetworkManager[55454]: <info>  [1769102955.9388] manager: (tapcd31a146-70): new Tun device (/org/freedesktop/NetworkManager/Devices/236)
Jan 22 17:29:15 compute-0 ovn_controller[95372]: 2026-01-22T17:29:15Z|00552|binding|INFO|Claiming lport cd31a146-70f5-4610-88f1-ae4772887ce2 for this chassis.
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.939 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:15 compute-0 ovn_controller[95372]: 2026-01-22T17:29:15Z|00553|binding|INFO|cd31a146-70f5-4610-88f1-ae4772887ce2: Claiming fa:16:3e:f2:f4:bc 10.100.0.10
Jan 22 17:29:15 compute-0 ovn_controller[95372]: 2026-01-22T17:29:15Z|00554|binding|INFO|Setting lport cd31a146-70f5-4610-88f1-ae4772887ce2 ovn-installed in OVS
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.952 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:15 compute-0 nova_compute[183075]: 2026-01-22 17:29:15.957 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:15 compute-0 systemd-udevd[231791]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:29:15 compute-0 systemd-machined[154382]: New machine qemu-50-instance-00000032.
Jan 22 17:29:15 compute-0 NetworkManager[55454]: <info>  [1769102955.9867] device (tapcd31a146-70): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:29:15 compute-0 NetworkManager[55454]: <info>  [1769102955.9877] device (tapcd31a146-70): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:29:16 compute-0 systemd[1]: Started Virtual Machine qemu-50-instance-00000032.
Jan 22 17:29:16 compute-0 nova_compute[183075]: 2026-01-22 17:29:16.245 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102956.2445607, 558df49c-4071-47cf-9f12-34cc70b1f266 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:29:16 compute-0 nova_compute[183075]: 2026-01-22 17:29:16.245 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] VM Started (Lifecycle Event)
Jan 22 17:29:16 compute-0 ovn_controller[95372]: 2026-01-22T17:29:16Z|00555|binding|INFO|Setting lport cd31a146-70f5-4610-88f1-ae4772887ce2 up in Southbound
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:16.459 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:f4:bc 10.100.0.10'], port_security=['fa:16:3e:f2:f4:bc 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-port-288740050', 'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '558df49c-4071-47cf-9f12-34cc70b1f266', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-012007cf-673c-4f83-a4b9-f21a913a1ccf', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-port-288740050', 'neutron:project_id': '7ff1e5ce4806445a8e463c71b6930bec', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a8d0872f-de3c-4404-94dc-7a328e5b8aa9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a510149a-49ad-47bc-a8ad-05908544b3cc, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=cd31a146-70f5-4610-88f1-ae4772887ce2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:16.461 104629 INFO neutron.agent.ovn.metadata.agent [-] Port cd31a146-70f5-4610-88f1-ae4772887ce2 in datapath 012007cf-673c-4f83-a4b9-f21a913a1ccf bound to our chassis
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:16.462 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 012007cf-673c-4f83-a4b9-f21a913a1ccf
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:16.471 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[9bb5e987-3543-4d1a-ad65-b3f0c82365cf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:16.472 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap012007cf-61 in ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:16.474 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap012007cf-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:16.474 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8f3da74a-cfeb-4431-a5d9-150910545ce7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:16.475 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1be4c657-e1bd-4228-a288-ac002bc52f2f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:16.488 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[1be6e410-1bf7-4508-8891-f72d89d52f5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:16.509 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8502a197-c822-4e02-99fd-3710481531a2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:16.533 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[5dfd2563-978d-4ed6-9bd2-ea93f1b0449b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:16.537 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[fe89aa08-c17b-497e-8b4e-eac105f7103c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:29:16 compute-0 systemd-udevd[231794]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:29:16 compute-0 NetworkManager[55454]: <info>  [1769102956.5389] manager: (tap012007cf-60): new Veth device (/org/freedesktop/NetworkManager/Devices/237)
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:16.565 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[79850883-282f-4eb7-847b-0dec984bd5b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:16.568 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[d1c38f77-e9f7-41ab-b069-0aef4f2cffb1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:29:16 compute-0 podman[231810]: 2026-01-22 17:29:16.575415786 +0000 UTC m=+0.059252924 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 17:29:16 compute-0 NetworkManager[55454]: <info>  [1769102956.5901] device (tap012007cf-60): carrier: link connected
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:16.594 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[1284a647-6338-4a03-9a58-0b429d4f3417]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:16.610 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[66e3982b-4947-4641-88bc-d794dd5af039]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap012007cf-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:ca:e2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 159], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532029, 'reachable_time': 42420, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231856, 'error': None, 'target': 'ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:16.622 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e2c6554c-fe1d-497d-aaed-ecf07f5837bd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee8:cae2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 532029, 'tstamp': 532029}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231858, 'error': None, 'target': 'ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:16.640 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3e93a99c-6378-4ca0-8141-50d52cb85e6a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap012007cf-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:ca:e2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 159], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532029, 'reachable_time': 42420, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231859, 'error': None, 'target': 'ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:16.666 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[540faa94-af22-4942-8e00-1eb2c1f68a92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:16.722 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c576d281-3a30-4ac9-8479-a62217279a82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:16.723 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap012007cf-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:16.724 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:16.724 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap012007cf-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:29:16 compute-0 nova_compute[183075]: 2026-01-22 17:29:16.725 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:29:16 compute-0 nova_compute[183075]: 2026-01-22 17:29:16.726 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:16 compute-0 NetworkManager[55454]: <info>  [1769102956.7267] manager: (tap012007cf-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/238)
Jan 22 17:29:16 compute-0 kernel: tap012007cf-60: entered promiscuous mode
Jan 22 17:29:16 compute-0 nova_compute[183075]: 2026-01-22 17:29:16.728 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:16.730 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap012007cf-60, col_values=(('external_ids', {'iface-id': 'd7c95871-d767-4104-b2f1-75f5b06e0524'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:29:16 compute-0 nova_compute[183075]: 2026-01-22 17:29:16.732 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:16 compute-0 ovn_controller[95372]: 2026-01-22T17:29:16Z|00556|binding|INFO|Releasing lport d7c95871-d767-4104-b2f1-75f5b06e0524 from this chassis (sb_readonly=1)
Jan 22 17:29:16 compute-0 nova_compute[183075]: 2026-01-22 17:29:16.735 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102956.2447534, 558df49c-4071-47cf-9f12-34cc70b1f266 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:29:16 compute-0 nova_compute[183075]: 2026-01-22 17:29:16.737 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] VM Paused (Lifecycle Event)
Jan 22 17:29:16 compute-0 nova_compute[183075]: 2026-01-22 17:29:16.744 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:16.745 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/012007cf-673c-4f83-a4b9-f21a913a1ccf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/012007cf-673c-4f83-a4b9-f21a913a1ccf.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:16.746 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7e48bbf0-6f50-439a-a4a2-fa1e89a2a883]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:16.746 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/012007cf-673c-4f83-a4b9-f21a913a1ccf.pid.haproxy
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 012007cf-673c-4f83-a4b9-f21a913a1ccf
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:29:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:16.747 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf', 'env', 'PROCESS_TAG=haproxy-012007cf-673c-4f83-a4b9-f21a913a1ccf', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/012007cf-673c-4f83-a4b9-f21a913a1ccf.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:29:16 compute-0 nova_compute[183075]: 2026-01-22 17:29:16.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:29:16 compute-0 nova_compute[183075]: 2026-01-22 17:29:16.788 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:29:16 compute-0 nova_compute[183075]: 2026-01-22 17:29:16.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:29:16 compute-0 nova_compute[183075]: 2026-01-22 17:29:16.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 17:29:16 compute-0 nova_compute[183075]: 2026-01-22 17:29:16.792 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:29:16 compute-0 nova_compute[183075]: 2026-01-22 17:29:16.881 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:29:17 compute-0 podman[231891]: 2026-01-22 17:29:17.085802432 +0000 UTC m=+0.022899213 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:29:17 compute-0 nova_compute[183075]: 2026-01-22 17:29:17.410 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:17 compute-0 podman[231891]: 2026-01-22 17:29:17.57193595 +0000 UTC m=+0.509032701 container create 65367d4e527a226baad5edeb19ef248381bcf3fdab4f6102eaee16f9b44d027f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:29:17 compute-0 systemd[1]: Started libpod-conmon-65367d4e527a226baad5edeb19ef248381bcf3fdab4f6102eaee16f9b44d027f.scope.
Jan 22 17:29:17 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:29:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/254e9dcd2cf88a7cc5d32c3cbf3230ac62d236a9489264d7f822f955dc954c32/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:29:18 compute-0 podman[231891]: 2026-01-22 17:29:18.139317279 +0000 UTC m=+1.076414050 container init 65367d4e527a226baad5edeb19ef248381bcf3fdab4f6102eaee16f9b44d027f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 22 17:29:18 compute-0 podman[231891]: 2026-01-22 17:29:18.146734917 +0000 UTC m=+1.083831668 container start 65367d4e527a226baad5edeb19ef248381bcf3fdab4f6102eaee16f9b44d027f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 17:29:18 compute-0 neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf[231907]: [NOTICE]   (231911) : New worker (231913) forked
Jan 22 17:29:18 compute-0 neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf[231907]: [NOTICE]   (231911) : Loading success.
Jan 22 17:29:19 compute-0 nova_compute[183075]: 2026-01-22 17:29:19.752 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:20 compute-0 nova_compute[183075]: 2026-01-22 17:29:20.503 183079 DEBUG nova.network.neutron [req-7006c33f-4f1b-4abb-b26b-6c378d38dbb0 req-0fd1701a-5a15-405a-a97b-32ea78679472 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Updated VIF entry in instance network info cache for port cd31a146-70f5-4610-88f1-ae4772887ce2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:29:20 compute-0 nova_compute[183075]: 2026-01-22 17:29:20.503 183079 DEBUG nova.network.neutron [req-7006c33f-4f1b-4abb-b26b-6c378d38dbb0 req-0fd1701a-5a15-405a-a97b-32ea78679472 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Updating instance_info_cache with network_info: [{"id": "cd31a146-70f5-4610-88f1-ae4772887ce2", "address": "fa:16:3e:f2:f4:bc", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd31a146-70", "ovs_interfaceid": "cd31a146-70f5-4610-88f1-ae4772887ce2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:29:20 compute-0 nova_compute[183075]: 2026-01-22 17:29:20.525 183079 DEBUG oslo_concurrency.lockutils [req-7006c33f-4f1b-4abb-b26b-6c378d38dbb0 req-0fd1701a-5a15-405a-a97b-32ea78679472 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-558df49c-4071-47cf-9f12-34cc70b1f266" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:29:20 compute-0 nova_compute[183075]: 2026-01-22 17:29:20.704 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:20 compute-0 nova_compute[183075]: 2026-01-22 17:29:20.878 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:29:21 compute-0 nova_compute[183075]: 2026-01-22 17:29:21.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:29:22 compute-0 nova_compute[183075]: 2026-01-22 17:29:22.413 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:25 compute-0 nova_compute[183075]: 2026-01-22 17:29:25.770 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:27 compute-0 podman[231924]: 2026-01-22 17:29:27.359314124 +0000 UTC m=+0.063140078 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, architecture=x86_64, release=1755695350, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., distribution-scope=public)
Jan 22 17:29:27 compute-0 podman[231923]: 2026-01-22 17:29:27.364297857 +0000 UTC m=+0.070967017 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 17:29:27 compute-0 podman[231922]: 2026-01-22 17:29:27.405507038 +0000 UTC m=+0.114380967 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 22 17:29:27 compute-0 nova_compute[183075]: 2026-01-22 17:29:27.414 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:27 compute-0 nova_compute[183075]: 2026-01-22 17:29:27.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:29:29 compute-0 nova_compute[183075]: 2026-01-22 17:29:29.208 183079 DEBUG nova.compute.manager [req-eae02c68-6d89-48b2-b68c-14371497c6c8 req-3e24cc12-3107-426c-884c-348172cab9b8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Received event network-vif-plugged-cd31a146-70f5-4610-88f1-ae4772887ce2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:29:29 compute-0 nova_compute[183075]: 2026-01-22 17:29:29.209 183079 DEBUG oslo_concurrency.lockutils [req-eae02c68-6d89-48b2-b68c-14371497c6c8 req-3e24cc12-3107-426c-884c-348172cab9b8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "558df49c-4071-47cf-9f12-34cc70b1f266-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:29:29 compute-0 nova_compute[183075]: 2026-01-22 17:29:29.210 183079 DEBUG oslo_concurrency.lockutils [req-eae02c68-6d89-48b2-b68c-14371497c6c8 req-3e24cc12-3107-426c-884c-348172cab9b8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "558df49c-4071-47cf-9f12-34cc70b1f266-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:29:29 compute-0 nova_compute[183075]: 2026-01-22 17:29:29.210 183079 DEBUG oslo_concurrency.lockutils [req-eae02c68-6d89-48b2-b68c-14371497c6c8 req-3e24cc12-3107-426c-884c-348172cab9b8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "558df49c-4071-47cf-9f12-34cc70b1f266-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:29:29 compute-0 nova_compute[183075]: 2026-01-22 17:29:29.210 183079 DEBUG nova.compute.manager [req-eae02c68-6d89-48b2-b68c-14371497c6c8 req-3e24cc12-3107-426c-884c-348172cab9b8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Processing event network-vif-plugged-cd31a146-70f5-4610-88f1-ae4772887ce2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:29:29 compute-0 nova_compute[183075]: 2026-01-22 17:29:29.211 183079 DEBUG nova.compute.manager [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Instance event wait completed in 12 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:29:29 compute-0 nova_compute[183075]: 2026-01-22 17:29:29.219 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102969.2196975, 558df49c-4071-47cf-9f12-34cc70b1f266 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:29:29 compute-0 nova_compute[183075]: 2026-01-22 17:29:29.220 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] VM Resumed (Lifecycle Event)
Jan 22 17:29:29 compute-0 nova_compute[183075]: 2026-01-22 17:29:29.222 183079 DEBUG nova.virt.libvirt.driver [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:29:29 compute-0 nova_compute[183075]: 2026-01-22 17:29:29.228 183079 INFO nova.virt.libvirt.driver [-] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Instance spawned successfully.
Jan 22 17:29:29 compute-0 nova_compute[183075]: 2026-01-22 17:29:29.228 183079 DEBUG nova.virt.libvirt.driver [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:29:30 compute-0 nova_compute[183075]: 2026-01-22 17:29:30.350 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:29:30 compute-0 nova_compute[183075]: 2026-01-22 17:29:30.350 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:29:30 compute-0 nova_compute[183075]: 2026-01-22 17:29:30.350 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:29:30 compute-0 nova_compute[183075]: 2026-01-22 17:29:30.357 183079 DEBUG nova.virt.libvirt.driver [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:29:30 compute-0 nova_compute[183075]: 2026-01-22 17:29:30.357 183079 DEBUG nova.virt.libvirt.driver [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:29:30 compute-0 nova_compute[183075]: 2026-01-22 17:29:30.357 183079 DEBUG nova.virt.libvirt.driver [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:29:30 compute-0 nova_compute[183075]: 2026-01-22 17:29:30.358 183079 DEBUG nova.virt.libvirt.driver [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:29:30 compute-0 nova_compute[183075]: 2026-01-22 17:29:30.358 183079 DEBUG nova.virt.libvirt.driver [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:29:30 compute-0 nova_compute[183075]: 2026-01-22 17:29:30.359 183079 DEBUG nova.virt.libvirt.driver [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:29:30 compute-0 nova_compute[183075]: 2026-01-22 17:29:30.361 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:29:30 compute-0 nova_compute[183075]: 2026-01-22 17:29:30.764 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:29:30 compute-0 nova_compute[183075]: 2026-01-22 17:29:30.772 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:30 compute-0 nova_compute[183075]: 2026-01-22 17:29:30.784 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:29:31 compute-0 nova_compute[183075]: 2026-01-22 17:29:31.349 183079 DEBUG nova.compute.manager [req-d9bc7d83-b70e-4697-8c2d-4d99ff425f9f req-719a9f2b-e9ad-46f5-8e56-70d14f46e0ad a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Received event network-vif-plugged-cd31a146-70f5-4610-88f1-ae4772887ce2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:29:31 compute-0 nova_compute[183075]: 2026-01-22 17:29:31.349 183079 DEBUG oslo_concurrency.lockutils [req-d9bc7d83-b70e-4697-8c2d-4d99ff425f9f req-719a9f2b-e9ad-46f5-8e56-70d14f46e0ad a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "558df49c-4071-47cf-9f12-34cc70b1f266-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:29:31 compute-0 nova_compute[183075]: 2026-01-22 17:29:31.349 183079 DEBUG oslo_concurrency.lockutils [req-d9bc7d83-b70e-4697-8c2d-4d99ff425f9f req-719a9f2b-e9ad-46f5-8e56-70d14f46e0ad a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "558df49c-4071-47cf-9f12-34cc70b1f266-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:29:31 compute-0 nova_compute[183075]: 2026-01-22 17:29:31.350 183079 DEBUG oslo_concurrency.lockutils [req-d9bc7d83-b70e-4697-8c2d-4d99ff425f9f req-719a9f2b-e9ad-46f5-8e56-70d14f46e0ad a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "558df49c-4071-47cf-9f12-34cc70b1f266-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:29:31 compute-0 nova_compute[183075]: 2026-01-22 17:29:31.350 183079 DEBUG nova.compute.manager [req-d9bc7d83-b70e-4697-8c2d-4d99ff425f9f req-719a9f2b-e9ad-46f5-8e56-70d14f46e0ad a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] No waiting events found dispatching network-vif-plugged-cd31a146-70f5-4610-88f1-ae4772887ce2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:29:31 compute-0 nova_compute[183075]: 2026-01-22 17:29:31.350 183079 WARNING nova.compute.manager [req-d9bc7d83-b70e-4697-8c2d-4d99ff425f9f req-719a9f2b-e9ad-46f5-8e56-70d14f46e0ad a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Received unexpected event network-vif-plugged-cd31a146-70f5-4610-88f1-ae4772887ce2 for instance with vm_state building and task_state spawning.
Jan 22 17:29:31 compute-0 nova_compute[183075]: 2026-01-22 17:29:31.389 183079 INFO nova.compute.manager [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Took 19.19 seconds to spawn the instance on the hypervisor.
Jan 22 17:29:31 compute-0 nova_compute[183075]: 2026-01-22 17:29:31.389 183079 DEBUG nova.compute.manager [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:29:31 compute-0 nova_compute[183075]: 2026-01-22 17:29:31.540 183079 INFO nova.compute.manager [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Took 19.74 seconds to build instance.
Jan 22 17:29:31 compute-0 nova_compute[183075]: 2026-01-22 17:29:31.661 183079 DEBUG oslo_concurrency.lockutils [None req-669102f4-9054-4f95-86ff-b69ddf293417 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "558df49c-4071-47cf-9f12-34cc70b1f266" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.549s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:29:31 compute-0 nova_compute[183075]: 2026-01-22 17:29:31.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:29:31 compute-0 nova_compute[183075]: 2026-01-22 17:29:31.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:29:32 compute-0 nova_compute[183075]: 2026-01-22 17:29:32.037 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:29:32 compute-0 nova_compute[183075]: 2026-01-22 17:29:32.038 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:29:32 compute-0 nova_compute[183075]: 2026-01-22 17:29:32.038 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:29:32 compute-0 nova_compute[183075]: 2026-01-22 17:29:32.038 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:29:32 compute-0 nova_compute[183075]: 2026-01-22 17:29:32.151 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/558df49c-4071-47cf-9f12-34cc70b1f266/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:29:32 compute-0 podman[231990]: 2026-01-22 17:29:32.184997146 +0000 UTC m=+0.093794924 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 17:29:32 compute-0 nova_compute[183075]: 2026-01-22 17:29:32.206 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/558df49c-4071-47cf-9f12-34cc70b1f266/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:29:32 compute-0 nova_compute[183075]: 2026-01-22 17:29:32.207 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/558df49c-4071-47cf-9f12-34cc70b1f266/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:29:32 compute-0 nova_compute[183075]: 2026-01-22 17:29:32.265 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/558df49c-4071-47cf-9f12-34cc70b1f266/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:29:32 compute-0 nova_compute[183075]: 2026-01-22 17:29:32.417 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:32 compute-0 nova_compute[183075]: 2026-01-22 17:29:32.429 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:29:32 compute-0 nova_compute[183075]: 2026-01-22 17:29:32.430 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5571MB free_disk=73.35896682739258GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:29:32 compute-0 nova_compute[183075]: 2026-01-22 17:29:32.430 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:29:32 compute-0 nova_compute[183075]: 2026-01-22 17:29:32.430 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:29:33 compute-0 nova_compute[183075]: 2026-01-22 17:29:33.346 183079 INFO nova.compute.manager [None req-55fbffde-ee4c-4819-9416-6552865706ea 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Get console output
Jan 22 17:29:33 compute-0 nova_compute[183075]: 2026-01-22 17:29:33.350 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:29:33 compute-0 nova_compute[183075]: 2026-01-22 17:29:33.538 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 558df49c-4071-47cf-9f12-34cc70b1f266 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:29:33 compute-0 nova_compute[183075]: 2026-01-22 17:29:33.565 183079 DEBUG oslo_concurrency.lockutils [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Acquiring lock "ae41baf4-b0eb-4402-aebe-718f5c7f3ed9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:29:33 compute-0 nova_compute[183075]: 2026-01-22 17:29:33.565 183079 DEBUG oslo_concurrency.lockutils [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "ae41baf4-b0eb-4402-aebe-718f5c7f3ed9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:29:33 compute-0 nova_compute[183075]: 2026-01-22 17:29:33.767 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance ae41baf4-b0eb-4402-aebe-718f5c7f3ed9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1692
Jan 22 17:29:33 compute-0 nova_compute[183075]: 2026-01-22 17:29:33.767 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:29:33 compute-0 nova_compute[183075]: 2026-01-22 17:29:33.767 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:29:33 compute-0 nova_compute[183075]: 2026-01-22 17:29:33.898 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:29:34 compute-0 nova_compute[183075]: 2026-01-22 17:29:34.399 183079 DEBUG nova.compute.manager [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:29:34 compute-0 nova_compute[183075]: 2026-01-22 17:29:34.443 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:29:35 compute-0 nova_compute[183075]: 2026-01-22 17:29:35.177 183079 DEBUG oslo_concurrency.lockutils [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:29:35 compute-0 nova_compute[183075]: 2026-01-22 17:29:35.811 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:36 compute-0 nova_compute[183075]: 2026-01-22 17:29:36.985 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:29:36 compute-0 nova_compute[183075]: 2026-01-22 17:29:36.986 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.555s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:29:36 compute-0 nova_compute[183075]: 2026-01-22 17:29:36.987 183079 DEBUG oslo_concurrency.lockutils [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 1.810s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:29:36 compute-0 nova_compute[183075]: 2026-01-22 17:29:36.988 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:29:36 compute-0 nova_compute[183075]: 2026-01-22 17:29:36.988 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 17:29:37 compute-0 nova_compute[183075]: 2026-01-22 17:29:37.002 183079 DEBUG nova.virt.hardware [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:29:37 compute-0 nova_compute[183075]: 2026-01-22 17:29:37.002 183079 INFO nova.compute.claims [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:29:37 compute-0 nova_compute[183075]: 2026-01-22 17:29:37.142 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 17:29:37 compute-0 nova_compute[183075]: 2026-01-22 17:29:37.420 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:38 compute-0 nova_compute[183075]: 2026-01-22 17:29:38.143 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:29:38 compute-0 nova_compute[183075]: 2026-01-22 17:29:38.143 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:29:38 compute-0 nova_compute[183075]: 2026-01-22 17:29:38.143 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:29:38 compute-0 nova_compute[183075]: 2026-01-22 17:29:38.731 183079 INFO nova.compute.manager [None req-3444a3e9-68aa-44da-83a2-87116e4a7be7 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Get console output
Jan 22 17:29:38 compute-0 nova_compute[183075]: 2026-01-22 17:29:38.741 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 22 17:29:38 compute-0 nova_compute[183075]: 2026-01-22 17:29:38.739 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:29:38 compute-0 nova_compute[183075]: 2026-01-22 17:29:38.808 183079 DEBUG nova.compute.provider_tree [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:29:38 compute-0 nova_compute[183075]: 2026-01-22 17:29:38.887 183079 DEBUG nova.scheduler.client.report [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:29:39 compute-0 nova_compute[183075]: 2026-01-22 17:29:39.091 183079 DEBUG oslo_concurrency.lockutils [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:29:39 compute-0 nova_compute[183075]: 2026-01-22 17:29:39.091 183079 DEBUG nova.compute.manager [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:29:39 compute-0 nova_compute[183075]: 2026-01-22 17:29:39.198 183079 DEBUG nova.compute.manager [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:29:39 compute-0 nova_compute[183075]: 2026-01-22 17:29:39.199 183079 DEBUG nova.network.neutron [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:29:39 compute-0 nova_compute[183075]: 2026-01-22 17:29:39.259 183079 INFO nova.virt.libvirt.driver [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:29:39 compute-0 nova_compute[183075]: 2026-01-22 17:29:39.347 183079 DEBUG nova.compute.manager [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:29:39 compute-0 nova_compute[183075]: 2026-01-22 17:29:39.452 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-558df49c-4071-47cf-9f12-34cc70b1f266" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:29:39 compute-0 nova_compute[183075]: 2026-01-22 17:29:39.453 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-558df49c-4071-47cf-9f12-34cc70b1f266" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:29:39 compute-0 nova_compute[183075]: 2026-01-22 17:29:39.453 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:29:39 compute-0 nova_compute[183075]: 2026-01-22 17:29:39.453 183079 DEBUG nova.objects.instance [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 558df49c-4071-47cf-9f12-34cc70b1f266 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:29:39 compute-0 nova_compute[183075]: 2026-01-22 17:29:39.465 183079 DEBUG nova.compute.manager [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:29:39 compute-0 nova_compute[183075]: 2026-01-22 17:29:39.466 183079 DEBUG nova.virt.libvirt.driver [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:29:39 compute-0 nova_compute[183075]: 2026-01-22 17:29:39.466 183079 INFO nova.virt.libvirt.driver [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Creating image(s)
Jan 22 17:29:39 compute-0 nova_compute[183075]: 2026-01-22 17:29:39.467 183079 DEBUG oslo_concurrency.lockutils [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Acquiring lock "/var/lib/nova/instances/ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:29:39 compute-0 nova_compute[183075]: 2026-01-22 17:29:39.467 183079 DEBUG oslo_concurrency.lockutils [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "/var/lib/nova/instances/ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:29:39 compute-0 nova_compute[183075]: 2026-01-22 17:29:39.468 183079 DEBUG oslo_concurrency.lockutils [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "/var/lib/nova/instances/ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:29:39 compute-0 nova_compute[183075]: 2026-01-22 17:29:39.484 183079 DEBUG oslo_concurrency.processutils [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:29:39 compute-0 nova_compute[183075]: 2026-01-22 17:29:39.538 183079 DEBUG oslo_concurrency.processutils [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:29:39 compute-0 nova_compute[183075]: 2026-01-22 17:29:39.540 183079 DEBUG oslo_concurrency.lockutils [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:29:39 compute-0 nova_compute[183075]: 2026-01-22 17:29:39.540 183079 DEBUG oslo_concurrency.lockutils [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:29:39 compute-0 nova_compute[183075]: 2026-01-22 17:29:39.556 183079 DEBUG oslo_concurrency.processutils [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:29:39 compute-0 nova_compute[183075]: 2026-01-22 17:29:39.583 183079 DEBUG nova.policy [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:29:39 compute-0 nova_compute[183075]: 2026-01-22 17:29:39.614 183079 DEBUG oslo_concurrency.processutils [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:29:39 compute-0 nova_compute[183075]: 2026-01-22 17:29:39.615 183079 DEBUG oslo_concurrency.processutils [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:29:40 compute-0 nova_compute[183075]: 2026-01-22 17:29:40.097 183079 DEBUG oslo_concurrency.processutils [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/disk 1073741824" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:29:40 compute-0 nova_compute[183075]: 2026-01-22 17:29:40.097 183079 DEBUG oslo_concurrency.lockutils [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.557s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:29:40 compute-0 nova_compute[183075]: 2026-01-22 17:29:40.098 183079 DEBUG oslo_concurrency.processutils [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:29:40 compute-0 nova_compute[183075]: 2026-01-22 17:29:40.170 183079 DEBUG oslo_concurrency.processutils [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:29:40 compute-0 nova_compute[183075]: 2026-01-22 17:29:40.171 183079 DEBUG nova.virt.disk.api [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Checking if we can resize image /var/lib/nova/instances/ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:29:40 compute-0 nova_compute[183075]: 2026-01-22 17:29:40.171 183079 DEBUG oslo_concurrency.processutils [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:29:40 compute-0 nova_compute[183075]: 2026-01-22 17:29:40.223 183079 DEBUG oslo_concurrency.processutils [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:29:40 compute-0 nova_compute[183075]: 2026-01-22 17:29:40.224 183079 DEBUG nova.virt.disk.api [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Cannot resize image /var/lib/nova/instances/ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:29:40 compute-0 nova_compute[183075]: 2026-01-22 17:29:40.225 183079 DEBUG nova.objects.instance [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lazy-loading 'migration_context' on Instance uuid ae41baf4-b0eb-4402-aebe-718f5c7f3ed9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:29:40 compute-0 nova_compute[183075]: 2026-01-22 17:29:40.301 183079 DEBUG nova.virt.libvirt.driver [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:29:40 compute-0 nova_compute[183075]: 2026-01-22 17:29:40.302 183079 DEBUG nova.virt.libvirt.driver [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Ensure instance console log exists: /var/lib/nova/instances/ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:29:40 compute-0 nova_compute[183075]: 2026-01-22 17:29:40.303 183079 DEBUG oslo_concurrency.lockutils [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:29:40 compute-0 nova_compute[183075]: 2026-01-22 17:29:40.303 183079 DEBUG oslo_concurrency.lockutils [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:29:40 compute-0 nova_compute[183075]: 2026-01-22 17:29:40.304 183079 DEBUG oslo_concurrency.lockutils [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:29:40 compute-0 nova_compute[183075]: 2026-01-22 17:29:40.813 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:41.946 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:29:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:41.948 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:29:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:41.949 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:29:42 compute-0 podman[232053]: 2026-01-22 17:29:42.35031412 +0000 UTC m=+0.058130348 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 17:29:42 compute-0 nova_compute[183075]: 2026-01-22 17:29:42.422 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:42 compute-0 nova_compute[183075]: 2026-01-22 17:29:42.484 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Updating instance_info_cache with network_info: [{"id": "cd31a146-70f5-4610-88f1-ae4772887ce2", "address": "fa:16:3e:f2:f4:bc", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd31a146-70", "ovs_interfaceid": "cd31a146-70f5-4610-88f1-ae4772887ce2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:29:43 compute-0 ovn_controller[95372]: 2026-01-22T17:29:43Z|00102|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f2:f4:bc 10.100.0.10
Jan 22 17:29:43 compute-0 ovn_controller[95372]: 2026-01-22T17:29:43Z|00103|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f2:f4:bc 10.100.0.10
Jan 22 17:29:43 compute-0 nova_compute[183075]: 2026-01-22 17:29:43.897 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-558df49c-4071-47cf-9f12-34cc70b1f266" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:29:43 compute-0 nova_compute[183075]: 2026-01-22 17:29:43.898 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:29:44 compute-0 nova_compute[183075]: 2026-01-22 17:29:44.111 183079 INFO nova.compute.manager [None req-7c76dbca-b635-4569-9383-b8a1bf916cda 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Get console output
Jan 22 17:29:44 compute-0 nova_compute[183075]: 2026-01-22 17:29:44.119 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:29:44 compute-0 nova_compute[183075]: 2026-01-22 17:29:44.540 183079 DEBUG nova.network.neutron [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Successfully updated port: c9f56d85-9dbf-4ec0-8ee7-82cf5387b536 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:29:44 compute-0 nova_compute[183075]: 2026-01-22 17:29:44.658 183079 DEBUG nova.compute.manager [req-c633456f-d166-442c-ada5-b95d963af4b8 req-243075b5-13b5-4461-b2ea-6e637d5fea3d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Received event network-changed-c9f56d85-9dbf-4ec0-8ee7-82cf5387b536 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:29:44 compute-0 nova_compute[183075]: 2026-01-22 17:29:44.658 183079 DEBUG nova.compute.manager [req-c633456f-d166-442c-ada5-b95d963af4b8 req-243075b5-13b5-4461-b2ea-6e637d5fea3d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Refreshing instance network info cache due to event network-changed-c9f56d85-9dbf-4ec0-8ee7-82cf5387b536. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:29:44 compute-0 nova_compute[183075]: 2026-01-22 17:29:44.659 183079 DEBUG oslo_concurrency.lockutils [req-c633456f-d166-442c-ada5-b95d963af4b8 req-243075b5-13b5-4461-b2ea-6e637d5fea3d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-ae41baf4-b0eb-4402-aebe-718f5c7f3ed9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:29:44 compute-0 nova_compute[183075]: 2026-01-22 17:29:44.659 183079 DEBUG oslo_concurrency.lockutils [req-c633456f-d166-442c-ada5-b95d963af4b8 req-243075b5-13b5-4461-b2ea-6e637d5fea3d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-ae41baf4-b0eb-4402-aebe-718f5c7f3ed9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:29:44 compute-0 nova_compute[183075]: 2026-01-22 17:29:44.659 183079 DEBUG nova.network.neutron [req-c633456f-d166-442c-ada5-b95d963af4b8 req-243075b5-13b5-4461-b2ea-6e637d5fea3d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Refreshing network info cache for port c9f56d85-9dbf-4ec0-8ee7-82cf5387b536 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:29:44 compute-0 nova_compute[183075]: 2026-01-22 17:29:44.685 183079 DEBUG oslo_concurrency.lockutils [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Acquiring lock "refresh_cache-ae41baf4-b0eb-4402-aebe-718f5c7f3ed9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:29:44 compute-0 nova_compute[183075]: 2026-01-22 17:29:44.855 183079 DEBUG nova.network.neutron [req-c633456f-d166-442c-ada5-b95d963af4b8 req-243075b5-13b5-4461-b2ea-6e637d5fea3d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:29:45 compute-0 nova_compute[183075]: 2026-01-22 17:29:45.613 183079 DEBUG nova.network.neutron [req-c633456f-d166-442c-ada5-b95d963af4b8 req-243075b5-13b5-4461-b2ea-6e637d5fea3d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:29:45 compute-0 nova_compute[183075]: 2026-01-22 17:29:45.865 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:46 compute-0 nova_compute[183075]: 2026-01-22 17:29:46.360 183079 DEBUG oslo_concurrency.lockutils [req-c633456f-d166-442c-ada5-b95d963af4b8 req-243075b5-13b5-4461-b2ea-6e637d5fea3d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-ae41baf4-b0eb-4402-aebe-718f5c7f3ed9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:29:46 compute-0 nova_compute[183075]: 2026-01-22 17:29:46.361 183079 DEBUG oslo_concurrency.lockutils [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Acquired lock "refresh_cache-ae41baf4-b0eb-4402-aebe-718f5c7f3ed9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:29:46 compute-0 nova_compute[183075]: 2026-01-22 17:29:46.361 183079 DEBUG nova.network.neutron [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:29:47 compute-0 podman[232077]: 2026-01-22 17:29:47.33083458 +0000 UTC m=+0.041082709 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:29:47 compute-0 nova_compute[183075]: 2026-01-22 17:29:47.423 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:47 compute-0 nova_compute[183075]: 2026-01-22 17:29:47.450 183079 DEBUG nova.network.neutron [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:29:49 compute-0 nova_compute[183075]: 2026-01-22 17:29:49.217 183079 INFO nova.compute.manager [None req-f201a534-484d-4c2c-8ce7-982dfd40bcbe 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Get console output
Jan 22 17:29:49 compute-0 nova_compute[183075]: 2026-01-22 17:29:49.222 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:29:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:50.076 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:29:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:50.076 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:29:50 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:29:50 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:29:50 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:29:50 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:29:50 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:29:50 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:29:50 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:29:50 compute-0 nova_compute[183075]: 2026-01-22 17:29:50.538 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:29:50 compute-0 nova_compute[183075]: 2026-01-22 17:29:50.867 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.465 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.466 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 1.3892877
Jan 22 17:29:51 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[231913]: 10.100.0.10:53232 [22/Jan/2026:17:29:50.075] listener listener/metadata 0/0/0/1390/1390 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.473 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.473 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.492 183079 DEBUG nova.network.neutron [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Updating instance_info_cache with network_info: [{"id": "c9f56d85-9dbf-4ec0-8ee7-82cf5387b536", "address": "fa:16:3e:90:3e:82", "network": {"id": "eddf21f6-fbb7-4dbb-a219-f930432ddc71", "bridge": "br-int", "label": "tempest-internal-dns-test-network-1900776616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "89916c03f6f440f6ae7cf81f2ae99bad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9f56d85-9d", "ovs_interfaceid": "c9f56d85-9dbf-4ec0-8ee7-82cf5387b536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.504 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.504 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0311742
Jan 22 17:29:51 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[231913]: 10.100.0.10:53246 [22/Jan/2026:17:29:51.472] listener listener/metadata 0/0/0/32/32 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.510 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.510 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.518 183079 DEBUG oslo_concurrency.lockutils [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Releasing lock "refresh_cache-ae41baf4-b0eb-4402-aebe-718f5c7f3ed9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.519 183079 DEBUG nova.compute.manager [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Instance network_info: |[{"id": "c9f56d85-9dbf-4ec0-8ee7-82cf5387b536", "address": "fa:16:3e:90:3e:82", "network": {"id": "eddf21f6-fbb7-4dbb-a219-f930432ddc71", "bridge": "br-int", "label": "tempest-internal-dns-test-network-1900776616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "89916c03f6f440f6ae7cf81f2ae99bad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9f56d85-9d", "ovs_interfaceid": "c9f56d85-9dbf-4ec0-8ee7-82cf5387b536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.521 183079 DEBUG nova.virt.libvirt.driver [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Start _get_guest_xml network_info=[{"id": "c9f56d85-9dbf-4ec0-8ee7-82cf5387b536", "address": "fa:16:3e:90:3e:82", "network": {"id": "eddf21f6-fbb7-4dbb-a219-f930432ddc71", "bridge": "br-int", "label": "tempest-internal-dns-test-network-1900776616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "89916c03f6f440f6ae7cf81f2ae99bad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9f56d85-9d", "ovs_interfaceid": "c9f56d85-9dbf-4ec0-8ee7-82cf5387b536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.523 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.524 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0135503
Jan 22 17:29:51 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[231913]: 10.100.0.10:53248 [22/Jan/2026:17:29:51.509] listener listener/metadata 0/0/0/14/14 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.527 183079 WARNING nova.virt.libvirt.driver [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.528 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.529 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.531 183079 DEBUG nova.virt.libvirt.host [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.532 183079 DEBUG nova.virt.libvirt.host [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.535 183079 DEBUG nova.virt.libvirt.host [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.536 183079 DEBUG nova.virt.libvirt.host [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.536 183079 DEBUG nova.virt.libvirt.driver [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.536 183079 DEBUG nova.virt.hardware [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.537 183079 DEBUG nova.virt.hardware [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.537 183079 DEBUG nova.virt.hardware [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.537 183079 DEBUG nova.virt.hardware [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.537 183079 DEBUG nova.virt.hardware [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.538 183079 DEBUG nova.virt.hardware [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.538 183079 DEBUG nova.virt.hardware [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.538 183079 DEBUG nova.virt.hardware [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.538 183079 DEBUG nova.virt.hardware [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.539 183079 DEBUG nova.virt.hardware [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.539 183079 DEBUG nova.virt.hardware [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.542 183079 DEBUG nova.virt.libvirt.vif [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:29:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-internal-dns-test-vm-2137758858',display_name='tempest-internal-dns-test-vm-2137758858',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-internal-dns-test-vm-2137758858',id=51,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOLWbfUjAYIoN+tjNcC/qQrGiTRWkuWLPOSnVFATm/h1ZcSJk8EAZI4vKn1W/DBRn+mGMC9ealMQjIznUDRCwtGbPPBWKrg20ByYe0VtCSnmvWLZm4uyTyPBgNtKHGDVqg==',key_name='tempest-internal-dns-test-shared-keypair-1135515409',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='89916c03f6f440f6ae7cf81f2ae99bad',ramdisk_id='',reservation_id='r-fzp1nqoq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InternalDNSTest-38234021',owner_user_name='tempest-InternalDNSTest-38234021-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:29:39Z,user_data=None,user_id='1ddebe2a251e4b118d9469f7d6fdb2ce',uuid=ae41baf4-b0eb-4402-aebe-718f5c7f3ed9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c9f56d85-9dbf-4ec0-8ee7-82cf5387b536", "address": "fa:16:3e:90:3e:82", "network": {"id": "eddf21f6-fbb7-4dbb-a219-f930432ddc71", "bridge": "br-int", "label": "tempest-internal-dns-test-network-1900776616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "89916c03f6f440f6ae7cf81f2ae99bad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9f56d85-9d", "ovs_interfaceid": "c9f56d85-9dbf-4ec0-8ee7-82cf5387b536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.542 183079 DEBUG nova.network.os_vif_util [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Converting VIF {"id": "c9f56d85-9dbf-4ec0-8ee7-82cf5387b536", "address": "fa:16:3e:90:3e:82", "network": {"id": "eddf21f6-fbb7-4dbb-a219-f930432ddc71", "bridge": "br-int", "label": "tempest-internal-dns-test-network-1900776616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "89916c03f6f440f6ae7cf81f2ae99bad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9f56d85-9d", "ovs_interfaceid": "c9f56d85-9dbf-4ec0-8ee7-82cf5387b536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.543 183079 DEBUG nova.network.os_vif_util [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:90:3e:82,bridge_name='br-int',has_traffic_filtering=True,id=c9f56d85-9dbf-4ec0-8ee7-82cf5387b536,network=Network(eddf21f6-fbb7-4dbb-a219-f930432ddc71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc9f56d85-9d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.543 183079 DEBUG nova.objects.instance [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lazy-loading 'pci_devices' on Instance uuid ae41baf4-b0eb-4402-aebe-718f5c7f3ed9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.545 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.546 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0164616
Jan 22 17:29:51 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[231913]: 10.100.0.10:53260 [22/Jan/2026:17:29:51.528] listener listener/metadata 0/0/0/17/17 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.551 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.552 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:29:51 compute-0 ovn_controller[95372]: 2026-01-22T17:29:51Z|00557|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.562 183079 DEBUG nova.virt.libvirt.driver [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:29:51 compute-0 nova_compute[183075]:   <uuid>ae41baf4-b0eb-4402-aebe-718f5c7f3ed9</uuid>
Jan 22 17:29:51 compute-0 nova_compute[183075]:   <name>instance-00000033</name>
Jan 22 17:29:51 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:29:51 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:29:51 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:29:51 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:       <nova:name>tempest-internal-dns-test-vm-2137758858</nova:name>
Jan 22 17:29:51 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:29:51</nova:creationTime>
Jan 22 17:29:51 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:29:51 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:29:51 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:29:51 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:29:51 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:29:51 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:29:51 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:29:51 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:29:51 compute-0 nova_compute[183075]:         <nova:user uuid="1ddebe2a251e4b118d9469f7d6fdb2ce">tempest-InternalDNSTest-38234021-project-member</nova:user>
Jan 22 17:29:51 compute-0 nova_compute[183075]:         <nova:project uuid="89916c03f6f440f6ae7cf81f2ae99bad">tempest-InternalDNSTest-38234021</nova:project>
Jan 22 17:29:51 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:29:51 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:29:51 compute-0 nova_compute[183075]:         <nova:port uuid="c9f56d85-9dbf-4ec0-8ee7-82cf5387b536">
Jan 22 17:29:51 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:29:51 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:29:51 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:29:51 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <system>
Jan 22 17:29:51 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:29:51 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:29:51 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:29:51 compute-0 nova_compute[183075]:       <entry name="serial">ae41baf4-b0eb-4402-aebe-718f5c7f3ed9</entry>
Jan 22 17:29:51 compute-0 nova_compute[183075]:       <entry name="uuid">ae41baf4-b0eb-4402-aebe-718f5c7f3ed9</entry>
Jan 22 17:29:51 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     </system>
Jan 22 17:29:51 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:29:51 compute-0 nova_compute[183075]:   <os>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:   </os>
Jan 22 17:29:51 compute-0 nova_compute[183075]:   <features>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:   </features>
Jan 22 17:29:51 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:29:51 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:29:51 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:29:51 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/disk"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:29:51 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:90:3e:82"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:       <target dev="tapc9f56d85-9d"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:29:51 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/console.log" append="off"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <video>
Jan 22 17:29:51 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     </video>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:29:51 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:29:51 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:29:51 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:29:51 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:29:51 compute-0 nova_compute[183075]: </domain>
Jan 22 17:29:51 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.563 183079 DEBUG nova.compute.manager [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Preparing to wait for external event network-vif-plugged-c9f56d85-9dbf-4ec0-8ee7-82cf5387b536 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.563 183079 DEBUG oslo_concurrency.lockutils [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Acquiring lock "ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.563 183079 DEBUG oslo_concurrency.lockutils [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.564 183079 DEBUG oslo_concurrency.lockutils [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.564 183079 DEBUG nova.virt.libvirt.vif [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:29:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-internal-dns-test-vm-2137758858',display_name='tempest-internal-dns-test-vm-2137758858',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-internal-dns-test-vm-2137758858',id=51,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOLWbfUjAYIoN+tjNcC/qQrGiTRWkuWLPOSnVFATm/h1ZcSJk8EAZI4vKn1W/DBRn+mGMC9ealMQjIznUDRCwtGbPPBWKrg20ByYe0VtCSnmvWLZm4uyTyPBgNtKHGDVqg==',key_name='tempest-internal-dns-test-shared-keypair-1135515409',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='89916c03f6f440f6ae7cf81f2ae99bad',ramdisk_id='',reservation_id='r-fzp1nqoq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InternalDNSTest-38234021',owner_user_name='tempest-InternalDNSTest-38234021-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:29:39Z,user_data=None,user_id='1ddebe2a251e4b118d9469f7d6fdb2ce',uuid=ae41baf4-b0eb-4402-aebe-718f5c7f3ed9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c9f56d85-9dbf-4ec0-8ee7-82cf5387b536", "address": "fa:16:3e:90:3e:82", "network": {"id": "eddf21f6-fbb7-4dbb-a219-f930432ddc71", "bridge": "br-int", "label": "tempest-internal-dns-test-network-1900776616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "89916c03f6f440f6ae7cf81f2ae99bad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9f56d85-9d", "ovs_interfaceid": "c9f56d85-9dbf-4ec0-8ee7-82cf5387b536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.565 183079 DEBUG nova.network.os_vif_util [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Converting VIF {"id": "c9f56d85-9dbf-4ec0-8ee7-82cf5387b536", "address": "fa:16:3e:90:3e:82", "network": {"id": "eddf21f6-fbb7-4dbb-a219-f930432ddc71", "bridge": "br-int", "label": "tempest-internal-dns-test-network-1900776616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "89916c03f6f440f6ae7cf81f2ae99bad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9f56d85-9d", "ovs_interfaceid": "c9f56d85-9dbf-4ec0-8ee7-82cf5387b536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.565 183079 DEBUG nova.network.os_vif_util [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:90:3e:82,bridge_name='br-int',has_traffic_filtering=True,id=c9f56d85-9dbf-4ec0-8ee7-82cf5387b536,network=Network(eddf21f6-fbb7-4dbb-a219-f930432ddc71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc9f56d85-9d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.566 183079 DEBUG os_vif [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:90:3e:82,bridge_name='br-int',has_traffic_filtering=True,id=c9f56d85-9dbf-4ec0-8ee7-82cf5387b536,network=Network(eddf21f6-fbb7-4dbb-a219-f930432ddc71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc9f56d85-9d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.567 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.567 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.567 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.571 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.571 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc9f56d85-9d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.571 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc9f56d85-9d, col_values=(('external_ids', {'iface-id': 'c9f56d85-9dbf-4ec0-8ee7-82cf5387b536', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:90:3e:82', 'vm-uuid': 'ae41baf4-b0eb-4402-aebe-718f5c7f3ed9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.572 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.572 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0204442
Jan 22 17:29:51 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[231913]: 10.100.0.10:53276 [22/Jan/2026:17:29:51.551] listener listener/metadata 0/0/0/21/21 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.573 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:51 compute-0 NetworkManager[55454]: <info>  [1769102991.5746] manager: (tapc9f56d85-9d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/239)
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.575 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.580 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.581 183079 INFO os_vif [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:90:3e:82,bridge_name='br-int',has_traffic_filtering=True,id=c9f56d85-9dbf-4ec0-8ee7-82cf5387b536,network=Network(eddf21f6-fbb7-4dbb-a219-f930432ddc71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc9f56d85-9d')
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.582 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.583 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.602 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.603 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0196128
Jan 22 17:29:51 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[231913]: 10.100.0.10:53288 [22/Jan/2026:17:29:51.582] listener listener/metadata 0/0/0/20/20 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.611 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.611 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.624 183079 DEBUG nova.virt.libvirt.driver [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.624 183079 DEBUG nova.virt.libvirt.driver [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] No VIF found with MAC fa:16:3e:90:3e:82, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.625 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.626 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0145545
Jan 22 17:29:51 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[231913]: 10.100.0.10:53300 [22/Jan/2026:17:29:51.610] listener listener/metadata 0/0/0/15/15 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.631 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.632 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.647 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.648 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0159025
Jan 22 17:29:51 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[231913]: 10.100.0.10:53302 [22/Jan/2026:17:29:51.631] listener listener/metadata 0/0/0/17/17 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.652 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.653 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.667 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.667 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0149314
Jan 22 17:29:51 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[231913]: 10.100.0.10:53308 [22/Jan/2026:17:29:51.652] listener listener/metadata 0/0/0/15/15 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.672 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.672 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:29:51 compute-0 kernel: tapc9f56d85-9d: entered promiscuous mode
Jan 22 17:29:51 compute-0 ovn_controller[95372]: 2026-01-22T17:29:51Z|00558|binding|INFO|Claiming lport c9f56d85-9dbf-4ec0-8ee7-82cf5387b536 for this chassis.
Jan 22 17:29:51 compute-0 ovn_controller[95372]: 2026-01-22T17:29:51Z|00559|binding|INFO|c9f56d85-9dbf-4ec0-8ee7-82cf5387b536: Claiming fa:16:3e:90:3e:82 10.100.0.4
Jan 22 17:29:51 compute-0 NetworkManager[55454]: <info>  [1769102991.6804] manager: (tapc9f56d85-9d): new Tun device (/org/freedesktop/NetworkManager/Devices/240)
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.680 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.689 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:90:3e:82 10.100.0.4'], port_security=['fa:16:3e:90:3e:82 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-internal-dns-test-port-1425761620', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'ae41baf4-b0eb-4402-aebe-718f5c7f3ed9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eddf21f6-fbb7-4dbb-a219-f930432ddc71', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-internal-dns-test-port-1425761620', 'neutron:project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'neutron:revision_number': '3', 'neutron:security_group_ids': 'e5aee07e-e85a-4653-b472-28e019cafee4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e6e9bf76-88ee-4fb4-b07a-3ebfb5ef8df0, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=c9f56d85-9dbf-4ec0-8ee7-82cf5387b536) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.690 104629 INFO neutron.agent.ovn.metadata.agent [-] Port c9f56d85-9dbf-4ec0-8ee7-82cf5387b536 in datapath eddf21f6-fbb7-4dbb-a219-f930432ddc71 bound to our chassis
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.692 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network eddf21f6-fbb7-4dbb-a219-f930432ddc71
Jan 22 17:29:51 compute-0 ovn_controller[95372]: 2026-01-22T17:29:51Z|00560|binding|INFO|Setting lport c9f56d85-9dbf-4ec0-8ee7-82cf5387b536 ovn-installed in OVS
Jan 22 17:29:51 compute-0 ovn_controller[95372]: 2026-01-22T17:29:51Z|00561|binding|INFO|Setting lport c9f56d85-9dbf-4ec0-8ee7-82cf5387b536 up in Southbound
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.692 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.693 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:51 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[231913]: 10.100.0.10:53312 [22/Jan/2026:17:29:51.671] listener listener/metadata 0/0/0/23/23 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.695 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0222564
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.696 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.700 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.701 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.703 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[203a56f6-a504-44f7-88a8-15912ab4adcc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.704 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapeddf21f6-f1 in ovnmeta-eddf21f6-fbb7-4dbb-a219-f930432ddc71 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.706 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapeddf21f6-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.706 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ce2a9652-ef04-452a-b4da-6c3dc602945a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.707 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[9cd51830-4090-45c8-9732-e1f84a28afd2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:29:51 compute-0 systemd-machined[154382]: New machine qemu-51-instance-00000033.
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.719 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[7c1c1d10-1e2a-4d5d-97d9-5e7298ff8faa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:29:51 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[231913]: 10.100.0.10:53322 [22/Jan/2026:17:29:51.700] listener listener/metadata 0/0/0/20/20 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.721 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0199986
Jan 22 17:29:51 compute-0 systemd-udevd[232119]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:29:51 compute-0 systemd[1]: Started Virtual Machine qemu-51-instance-00000033.
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.730 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.731 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.733 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8216e979-1140-41af-93f4-abc2c60ba5db]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:29:51 compute-0 NetworkManager[55454]: <info>  [1769102991.7369] device (tapc9f56d85-9d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:29:51 compute-0 NetworkManager[55454]: <info>  [1769102991.7374] device (tapc9f56d85-9d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:29:51 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[231913]: 10.100.0.10:53324 [22/Jan/2026:17:29:51.730] listener listener/metadata 0/0/0/15/15 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.745 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.746 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0148187
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.749 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.750 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.761 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[1099d74f-0ef0-4567-a7c0-1439f0a9ecb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.763 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.764 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0141385
Jan 22 17:29:51 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[231913]: 10.100.0.10:53330 [22/Jan/2026:17:29:51.748] listener listener/metadata 0/0/0/15/15 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:29:51 compute-0 NetworkManager[55454]: <info>  [1769102991.7679] manager: (tapeddf21f6-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/241)
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.767 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b6d5f0ed-8296-45ff-8d5d-6c63362d486e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.769 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.770 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.784 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:29:51 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[231913]: 10.100.0.10:53334 [22/Jan/2026:17:29:51.769] listener listener/metadata 0/0/0/16/16 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.784 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0145314
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.797 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[8f7690b5-2148-4144-95ea-0dc6181775c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.800 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.801 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[ffced1e6-6119-425b-bfa7-a7cced8f867a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.801 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.815 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:29:51 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[231913]: 10.100.0.10:53344 [22/Jan/2026:17:29:51.800] listener listener/metadata 0/0/0/15/15 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.819 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0178499
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.819 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.820 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:29:51 compute-0 NetworkManager[55454]: <info>  [1769102991.8234] device (tapeddf21f6-f0): carrier: link connected
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.829 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[47e407eb-decb-4b32-979e-a53483afec6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.834 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.834 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0142627
Jan 22 17:29:51 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[231913]: 10.100.0.10:53346 [22/Jan/2026:17:29:51.819] listener listener/metadata 0/0/0/15/15 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.846 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6acf883b-f53a-4f34-87c9-dc26fdf1b4b1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeddf21f6-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:48:34:4d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 161], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 535552, 'reachable_time': 41059, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232150, 'error': None, 'target': 'ovnmeta-eddf21f6-fbb7-4dbb-a219-f930432ddc71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.860 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3606c1ae-e693-4184-8c3a-66787f842adc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe48:344d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 535552, 'tstamp': 535552}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232151, 'error': None, 'target': 'ovnmeta-eddf21f6-fbb7-4dbb-a219-f930432ddc71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.873 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[cd8f71ef-c8bc-45ff-9ad9-0fb5cc45a443]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeddf21f6-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:48:34:4d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 161], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 535552, 'reachable_time': 41059, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232152, 'error': None, 'target': 'ovnmeta-eddf21f6-fbb7-4dbb-a219-f930432ddc71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.900 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d61fc567-3cfd-4dda-adf8-012805dc10cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.953 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2c1c7b1b-8a48-4e87-94c5-9dc86b795664]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.954 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeddf21f6-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.955 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.955 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeddf21f6-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.956 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:51 compute-0 kernel: tapeddf21f6-f0: entered promiscuous mode
Jan 22 17:29:51 compute-0 NetworkManager[55454]: <info>  [1769102991.9578] manager: (tapeddf21f6-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/242)
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.959 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapeddf21f6-f0, col_values=(('external_ids', {'iface-id': '192144c7-f1f9-4f09-8051-4874b559739b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:29:51 compute-0 ovn_controller[95372]: 2026-01-22T17:29:51Z|00562|binding|INFO|Releasing lport 192144c7-f1f9-4f09-8051-4874b559739b from this chassis (sb_readonly=0)
Jan 22 17:29:51 compute-0 nova_compute[183075]: 2026-01-22 17:29:51.974 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.975 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/eddf21f6-fbb7-4dbb-a219-f930432ddc71.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/eddf21f6-fbb7-4dbb-a219-f930432ddc71.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.977 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0647c4dd-15ca-4152-aabd-3216a03f345f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.977 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-eddf21f6-fbb7-4dbb-a219-f930432ddc71
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/eddf21f6-fbb7-4dbb-a219-f930432ddc71.pid.haproxy
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID eddf21f6-fbb7-4dbb-a219-f930432ddc71
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:29:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:29:51.979 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-eddf21f6-fbb7-4dbb-a219-f930432ddc71', 'env', 'PROCESS_TAG=haproxy-eddf21f6-fbb7-4dbb-a219-f930432ddc71', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/eddf21f6-fbb7-4dbb-a219-f930432ddc71.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:29:52 compute-0 nova_compute[183075]: 2026-01-22 17:29:52.013 183079 DEBUG nova.compute.manager [req-c09beb57-4692-4cd2-9247-fc70ff90cac6 req-41d4d05b-dbe6-421d-bf6f-118332da02e2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Received event network-vif-plugged-c9f56d85-9dbf-4ec0-8ee7-82cf5387b536 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:29:52 compute-0 nova_compute[183075]: 2026-01-22 17:29:52.014 183079 DEBUG oslo_concurrency.lockutils [req-c09beb57-4692-4cd2-9247-fc70ff90cac6 req-41d4d05b-dbe6-421d-bf6f-118332da02e2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:29:52 compute-0 nova_compute[183075]: 2026-01-22 17:29:52.014 183079 DEBUG oslo_concurrency.lockutils [req-c09beb57-4692-4cd2-9247-fc70ff90cac6 req-41d4d05b-dbe6-421d-bf6f-118332da02e2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:29:52 compute-0 nova_compute[183075]: 2026-01-22 17:29:52.014 183079 DEBUG oslo_concurrency.lockutils [req-c09beb57-4692-4cd2-9247-fc70ff90cac6 req-41d4d05b-dbe6-421d-bf6f-118332da02e2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:29:52 compute-0 nova_compute[183075]: 2026-01-22 17:29:52.014 183079 DEBUG nova.compute.manager [req-c09beb57-4692-4cd2-9247-fc70ff90cac6 req-41d4d05b-dbe6-421d-bf6f-118332da02e2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Processing event network-vif-plugged-c9f56d85-9dbf-4ec0-8ee7-82cf5387b536 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:29:52 compute-0 nova_compute[183075]: 2026-01-22 17:29:52.054 183079 DEBUG nova.compute.manager [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:29:52 compute-0 nova_compute[183075]: 2026-01-22 17:29:52.055 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102992.0540376, ae41baf4-b0eb-4402-aebe-718f5c7f3ed9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:29:52 compute-0 nova_compute[183075]: 2026-01-22 17:29:52.055 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] VM Started (Lifecycle Event)
Jan 22 17:29:52 compute-0 nova_compute[183075]: 2026-01-22 17:29:52.059 183079 DEBUG nova.virt.libvirt.driver [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:29:52 compute-0 nova_compute[183075]: 2026-01-22 17:29:52.062 183079 INFO nova.virt.libvirt.driver [-] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Instance spawned successfully.
Jan 22 17:29:52 compute-0 nova_compute[183075]: 2026-01-22 17:29:52.063 183079 DEBUG nova.virt.libvirt.driver [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:29:52 compute-0 nova_compute[183075]: 2026-01-22 17:29:52.099 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:29:52 compute-0 nova_compute[183075]: 2026-01-22 17:29:52.107 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:29:52 compute-0 nova_compute[183075]: 2026-01-22 17:29:52.222 183079 DEBUG nova.virt.libvirt.driver [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:29:52 compute-0 nova_compute[183075]: 2026-01-22 17:29:52.222 183079 DEBUG nova.virt.libvirt.driver [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:29:52 compute-0 nova_compute[183075]: 2026-01-22 17:29:52.223 183079 DEBUG nova.virt.libvirt.driver [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:29:52 compute-0 nova_compute[183075]: 2026-01-22 17:29:52.223 183079 DEBUG nova.virt.libvirt.driver [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:29:52 compute-0 nova_compute[183075]: 2026-01-22 17:29:52.223 183079 DEBUG nova.virt.libvirt.driver [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:29:52 compute-0 nova_compute[183075]: 2026-01-22 17:29:52.223 183079 DEBUG nova.virt.libvirt.driver [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:29:52 compute-0 nova_compute[183075]: 2026-01-22 17:29:52.383 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:29:52 compute-0 nova_compute[183075]: 2026-01-22 17:29:52.383 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102992.054514, ae41baf4-b0eb-4402-aebe-718f5c7f3ed9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:29:52 compute-0 nova_compute[183075]: 2026-01-22 17:29:52.383 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] VM Paused (Lifecycle Event)
Jan 22 17:29:52 compute-0 podman[232188]: 2026-01-22 17:29:52.330961083 +0000 UTC m=+0.021414563 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:29:52 compute-0 nova_compute[183075]: 2026-01-22 17:29:52.426 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:52 compute-0 nova_compute[183075]: 2026-01-22 17:29:52.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:29:52 compute-0 nova_compute[183075]: 2026-01-22 17:29:52.946 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:29:52 compute-0 nova_compute[183075]: 2026-01-22 17:29:52.952 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769102992.0584712, ae41baf4-b0eb-4402-aebe-718f5c7f3ed9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:29:52 compute-0 nova_compute[183075]: 2026-01-22 17:29:52.952 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] VM Resumed (Lifecycle Event)
Jan 22 17:29:53 compute-0 nova_compute[183075]: 2026-01-22 17:29:53.789 183079 INFO nova.compute.manager [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Took 14.32 seconds to spawn the instance on the hypervisor.
Jan 22 17:29:53 compute-0 nova_compute[183075]: 2026-01-22 17:29:53.789 183079 DEBUG nova.compute.manager [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:29:54 compute-0 nova_compute[183075]: 2026-01-22 17:29:54.136 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:29:54 compute-0 nova_compute[183075]: 2026-01-22 17:29:54.139 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:29:54 compute-0 podman[232188]: 2026-01-22 17:29:54.161358359 +0000 UTC m=+1.851811820 container create 60121fef443868fd7b468d61883cc3073fdd7284d5dec6beae7517ea512d001f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eddf21f6-fbb7-4dbb-a219-f930432ddc71, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 17:29:54 compute-0 nova_compute[183075]: 2026-01-22 17:29:54.182 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:29:54 compute-0 nova_compute[183075]: 2026-01-22 17:29:54.201 183079 INFO nova.compute.manager [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Took 19.04 seconds to build instance.
Jan 22 17:29:54 compute-0 systemd[1]: Started libpod-conmon-60121fef443868fd7b468d61883cc3073fdd7284d5dec6beae7517ea512d001f.scope.
Jan 22 17:29:54 compute-0 nova_compute[183075]: 2026-01-22 17:29:54.214 183079 DEBUG oslo_concurrency.lockutils [None req-b083c6f9-1f56-4794-89f1-f4cc5e3a6817 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "ae41baf4-b0eb-4402-aebe-718f5c7f3ed9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.649s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:29:54 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:29:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5dea59e7573bc63901a98ac8e42558d454a75fc5fb8fd86c3d3a415ec1adfa0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:29:54 compute-0 podman[232188]: 2026-01-22 17:29:54.249262877 +0000 UTC m=+1.939716357 container init 60121fef443868fd7b468d61883cc3073fdd7284d5dec6beae7517ea512d001f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eddf21f6-fbb7-4dbb-a219-f930432ddc71, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:29:54 compute-0 podman[232188]: 2026-01-22 17:29:54.259832955 +0000 UTC m=+1.950286415 container start 60121fef443868fd7b468d61883cc3073fdd7284d5dec6beae7517ea512d001f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eddf21f6-fbb7-4dbb-a219-f930432ddc71, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:29:54 compute-0 neutron-haproxy-ovnmeta-eddf21f6-fbb7-4dbb-a219-f930432ddc71[232203]: [NOTICE]   (232207) : New worker (232209) forked
Jan 22 17:29:54 compute-0 neutron-haproxy-ovnmeta-eddf21f6-fbb7-4dbb-a219-f930432ddc71[232203]: [NOTICE]   (232207) : Loading success.
Jan 22 17:29:54 compute-0 nova_compute[183075]: 2026-01-22 17:29:54.320 183079 INFO nova.compute.manager [None req-b924748c-4fb2-439d-a128-17135c06ffb2 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Get console output
Jan 22 17:29:54 compute-0 nova_compute[183075]: 2026-01-22 17:29:54.327 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:29:54 compute-0 nova_compute[183075]: 2026-01-22 17:29:54.529 183079 DEBUG nova.compute.manager [req-93ffb19a-9a1c-44d9-af15-b00802e86da6 req-0ba0b140-1b7a-497f-9a30-371cae2d7456 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Received event network-vif-plugged-c9f56d85-9dbf-4ec0-8ee7-82cf5387b536 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:29:54 compute-0 nova_compute[183075]: 2026-01-22 17:29:54.529 183079 DEBUG oslo_concurrency.lockutils [req-93ffb19a-9a1c-44d9-af15-b00802e86da6 req-0ba0b140-1b7a-497f-9a30-371cae2d7456 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:29:54 compute-0 nova_compute[183075]: 2026-01-22 17:29:54.530 183079 DEBUG oslo_concurrency.lockutils [req-93ffb19a-9a1c-44d9-af15-b00802e86da6 req-0ba0b140-1b7a-497f-9a30-371cae2d7456 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:29:54 compute-0 nova_compute[183075]: 2026-01-22 17:29:54.530 183079 DEBUG oslo_concurrency.lockutils [req-93ffb19a-9a1c-44d9-af15-b00802e86da6 req-0ba0b140-1b7a-497f-9a30-371cae2d7456 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:29:54 compute-0 nova_compute[183075]: 2026-01-22 17:29:54.530 183079 DEBUG nova.compute.manager [req-93ffb19a-9a1c-44d9-af15-b00802e86da6 req-0ba0b140-1b7a-497f-9a30-371cae2d7456 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] No waiting events found dispatching network-vif-plugged-c9f56d85-9dbf-4ec0-8ee7-82cf5387b536 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:29:54 compute-0 nova_compute[183075]: 2026-01-22 17:29:54.530 183079 WARNING nova.compute.manager [req-93ffb19a-9a1c-44d9-af15-b00802e86da6 req-0ba0b140-1b7a-497f-9a30-371cae2d7456 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Received unexpected event network-vif-plugged-c9f56d85-9dbf-4ec0-8ee7-82cf5387b536 for instance with vm_state active and task_state None.
Jan 22 17:29:54 compute-0 nova_compute[183075]: 2026-01-22 17:29:54.890 183079 INFO nova.compute.manager [None req-f55be1b1-856b-49b2-a45b-acc873d73b59 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Get console output
Jan 22 17:29:54 compute-0 nova_compute[183075]: 2026-01-22 17:29:54.895 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.457 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '558df49c-4071-47cf-9f12-34cc70b1f266', 'name': 'tempest-server-test-1076717562', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000032', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '7ff1e5ce4806445a8e463c71b6930bec', 'user_id': '3a896d4927d442ffba421873948034be', 'hostId': '86f170f1058a5a3ada8f0089109122783cd07aceddaa6cda15aa7fda', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.460 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'ae41baf4-b0eb-4402-aebe-718f5c7f3ed9', 'name': 'tempest-internal-dns-test-vm-2137758858', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000033', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'hostId': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.460 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.463 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 558df49c-4071-47cf-9f12-34cc70b1f266 / tapcd31a146-70 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.464 12 DEBUG ceilometer.compute.pollsters [-] 558df49c-4071-47cf-9f12-34cc70b1f266/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.466 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for ae41baf4-b0eb-4402-aebe-718f5c7f3ed9 / tapc9f56d85-9d inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.467 12 DEBUG ceilometer.compute.pollsters [-] ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df2120f5-1a09-4faf-9a59-107431448edd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3a896d4927d442ffba421873948034be', 'user_name': None, 'project_id': '7ff1e5ce4806445a8e463c71b6930bec', 'project_name': None, 'resource_id': 'instance-00000032-558df49c-4071-47cf-9f12-34cc70b1f266-tapcd31a146-70', 'timestamp': '2026-01-22T17:29:55.460691', 'resource_metadata': {'display_name': 'tempest-server-test-1076717562', 'name': 'tapcd31a146-70', 'instance_id': '558df49c-4071-47cf-9f12-34cc70b1f266', 'instance_type': 'm1.nano', 'host': '86f170f1058a5a3ada8f0089109122783cd07aceddaa6cda15aa7fda', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f2:f4:bc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcd31a146-70'}, 'message_id': 'f7a98ae6-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.221118293, 'message_signature': '103742c5e3cf63981a0e8e82b9e0f760b450100fcb1161c3c5cbc2413be5d931'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'instance-00000033-ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-tapc9f56d85-9d', 'timestamp': '2026-01-22T17:29:55.460691', 'resource_metadata': {'display_name': 'tempest-internal-dns-test-vm-2137758858', 'name': 'tapc9f56d85-9d', 'instance_id': 'ae41baf4-b0eb-4402-aebe-718f5c7f3ed9', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:3e:82', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc9f56d85-9d'}, 'message_id': 'f7a9f8aa-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.225140939, 'message_signature': '1d57d898f3f002fc713f47512104fac38b5fcef6226e9a5419a44d89a8a5b875'}]}, 'timestamp': '2026-01-22 17:29:55.467428', '_unique_id': 'd03ef7f296c340f4a5441cae8bf69a43'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.468 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.469 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.469 12 DEBUG ceilometer.compute.pollsters [-] 558df49c-4071-47cf-9f12-34cc70b1f266/network.incoming.packets volume: 61 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.469 12 DEBUG ceilometer.compute.pollsters [-] ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f24d69a-6f17-4645-aded-3d14e89a4b65', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 61, 'user_id': '3a896d4927d442ffba421873948034be', 'user_name': None, 'project_id': '7ff1e5ce4806445a8e463c71b6930bec', 'project_name': None, 'resource_id': 'instance-00000032-558df49c-4071-47cf-9f12-34cc70b1f266-tapcd31a146-70', 'timestamp': '2026-01-22T17:29:55.469606', 'resource_metadata': {'display_name': 'tempest-server-test-1076717562', 'name': 'tapcd31a146-70', 'instance_id': '558df49c-4071-47cf-9f12-34cc70b1f266', 'instance_type': 'm1.nano', 'host': '86f170f1058a5a3ada8f0089109122783cd07aceddaa6cda15aa7fda', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f2:f4:bc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcd31a146-70'}, 'message_id': 'f7aa59da-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.221118293, 'message_signature': '71538032c0787ae6368284ad563c3b3857e75879aeef8cea21ab74ca366edb15'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 
'1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'instance-00000033-ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-tapc9f56d85-9d', 'timestamp': '2026-01-22T17:29:55.469606', 'resource_metadata': {'display_name': 'tempest-internal-dns-test-vm-2137758858', 'name': 'tapc9f56d85-9d', 'instance_id': 'ae41baf4-b0eb-4402-aebe-718f5c7f3ed9', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:3e:82', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc9f56d85-9d'}, 'message_id': 'f7aa624a-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.225140939, 'message_signature': '04fb8fc69eaf5c72495646ac7907a151d7e47b006cd7f7b0d74786eaaa2ebc38'}]}, 'timestamp': '2026-01-22 17:29:55.470087', '_unique_id': '07a22ec27b0f4796bece2cc95631870c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.470 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.471 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.483 12 DEBUG ceilometer.compute.pollsters [-] 558df49c-4071-47cf-9f12-34cc70b1f266/disk.device.read.latency volume: 825555832 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.505 12 DEBUG ceilometer.compute.pollsters [-] ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/disk.device.read.latency volume: 1007607965 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '37035839-e1f5-4940-aa3a-b0b5b7bea9c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 825555832, 'user_id': '3a896d4927d442ffba421873948034be', 'user_name': None, 'project_id': '7ff1e5ce4806445a8e463c71b6930bec', 'project_name': None, 'resource_id': '558df49c-4071-47cf-9f12-34cc70b1f266-vda', 'timestamp': '2026-01-22T17:29:55.471248', 'resource_metadata': {'display_name': 'tempest-server-test-1076717562', 'name': 'instance-00000032', 'instance_id': '558df49c-4071-47cf-9f12-34cc70b1f266', 'instance_type': 'm1.nano', 'host': '86f170f1058a5a3ada8f0089109122783cd07aceddaa6cda15aa7fda', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f7ac9218-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.23167657, 'message_signature': '2d46e5e00d337de92a5ce84e2a45102d9df3412938c54860ff2588c1877e5992'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1007607965, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 
'resource_id': 'ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-vda', 'timestamp': '2026-01-22T17:29:55.471248', 'resource_metadata': {'display_name': 'tempest-internal-dns-test-vm-2137758858', 'name': 'instance-00000033', 'instance_id': 'ae41baf4-b0eb-4402-aebe-718f5c7f3ed9', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f7afdaae-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.244947049, 'message_signature': '55329609500bef9773aa29da27fab82e0cd49cb2657e616d07990b5a2133a916'}]}, 'timestamp': '2026-01-22 17:29:55.506140', '_unique_id': '6e5187d21dd2474cb5b03f51d09fe506'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.507 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.509 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.509 12 DEBUG ceilometer.compute.pollsters [-] 558df49c-4071-47cf-9f12-34cc70b1f266/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.510 12 DEBUG ceilometer.compute.pollsters [-] ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f4405c8-a118-4ad9-aadb-768b521f4c63', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3a896d4927d442ffba421873948034be', 'user_name': None, 'project_id': '7ff1e5ce4806445a8e463c71b6930bec', 'project_name': None, 'resource_id': 'instance-00000032-558df49c-4071-47cf-9f12-34cc70b1f266-tapcd31a146-70', 'timestamp': '2026-01-22T17:29:55.509905', 'resource_metadata': {'display_name': 'tempest-server-test-1076717562', 'name': 'tapcd31a146-70', 'instance_id': '558df49c-4071-47cf-9f12-34cc70b1f266', 'instance_type': 'm1.nano', 'host': '86f170f1058a5a3ada8f0089109122783cd07aceddaa6cda15aa7fda', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f2:f4:bc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcd31a146-70'}, 'message_id': 'f7b08558-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.221118293, 'message_signature': '995d893215d952e95ca955eceb117cfc5b48ee3a2bc043cfade1d17fe47c27ac'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'instance-00000033-ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-tapc9f56d85-9d', 'timestamp': '2026-01-22T17:29:55.509905', 'resource_metadata': {'display_name': 'tempest-internal-dns-test-vm-2137758858', 'name': 'tapc9f56d85-9d', 'instance_id': 'ae41baf4-b0eb-4402-aebe-718f5c7f3ed9', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:3e:82', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc9f56d85-9d'}, 'message_id': 'f7b09b60-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.225140939, 'message_signature': '94fc77ba7671bb89efbc6a35ccf6736d921127139e4982b6f85cd45891770a21'}]}, 'timestamp': '2026-01-22 17:29:55.511001', '_unique_id': '133cbdaf6a0941be9a0f859ecc8f9a8d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.512 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.513 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.513 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.513 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-1076717562>, <NovaLikeServer: tempest-internal-dns-test-vm-2137758858>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1076717562>, <NovaLikeServer: tempest-internal-dns-test-vm-2137758858>]
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.514 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.514 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.514 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-1076717562>, <NovaLikeServer: tempest-internal-dns-test-vm-2137758858>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1076717562>, <NovaLikeServer: tempest-internal-dns-test-vm-2137758858>]
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.514 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.522 12 DEBUG ceilometer.compute.pollsters [-] 558df49c-4071-47cf-9f12-34cc70b1f266/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.531 12 DEBUG ceilometer.compute.pollsters [-] ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ffd3b56-5cc9-44b2-93cc-f49a50835a02', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '3a896d4927d442ffba421873948034be', 'user_name': None, 'project_id': '7ff1e5ce4806445a8e463c71b6930bec', 'project_name': None, 'resource_id': '558df49c-4071-47cf-9f12-34cc70b1f266-vda', 'timestamp': '2026-01-22T17:29:55.515097', 'resource_metadata': {'display_name': 'tempest-server-test-1076717562', 'name': 'instance-00000032', 'instance_id': '558df49c-4071-47cf-9f12-34cc70b1f266', 'instance_type': 'm1.nano', 'host': '86f170f1058a5a3ada8f0089109122783cd07aceddaa6cda15aa7fda', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f7b28aba-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.275562333, 'message_signature': '5096fb2444c9cd6ed94f67bfde4e7950727c7e89ed80d68a3b77790477bf6c33'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 
'ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-vda', 'timestamp': '2026-01-22T17:29:55.515097', 'resource_metadata': {'display_name': 'tempest-internal-dns-test-vm-2137758858', 'name': 'instance-00000033', 'instance_id': 'ae41baf4-b0eb-4402-aebe-718f5c7f3ed9', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f7b3e50e-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.28421328, 'message_signature': '8a7873a4fb322cc29edbb816b9e30f67062e1264192e7fefe281545cc6acbe24'}]}, 'timestamp': '2026-01-22 17:29:55.532840', '_unique_id': '5a28fac49deb4e71acad10a9d03e21f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.536 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.536 12 DEBUG ceilometer.compute.pollsters [-] 558df49c-4071-47cf-9f12-34cc70b1f266/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.537 12 DEBUG ceilometer.compute.pollsters [-] ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'efbbab9d-1695-4caf-b924-2afef78a82ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3a896d4927d442ffba421873948034be', 'user_name': None, 'project_id': '7ff1e5ce4806445a8e463c71b6930bec', 'project_name': None, 'resource_id': 'instance-00000032-558df49c-4071-47cf-9f12-34cc70b1f266-tapcd31a146-70', 'timestamp': '2026-01-22T17:29:55.536578', 'resource_metadata': {'display_name': 'tempest-server-test-1076717562', 'name': 'tapcd31a146-70', 'instance_id': '558df49c-4071-47cf-9f12-34cc70b1f266', 'instance_type': 'm1.nano', 'host': '86f170f1058a5a3ada8f0089109122783cd07aceddaa6cda15aa7fda', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f2:f4:bc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcd31a146-70'}, 'message_id': 'f7b49c7e-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.221118293, 'message_signature': 'd181b4bc49d580578ad20e66311ef137a9eacc35b8ab3eec80994a2ef55f6bab'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'instance-00000033-ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-tapc9f56d85-9d', 'timestamp': '2026-01-22T17:29:55.536578', 'resource_metadata': {'display_name': 'tempest-internal-dns-test-vm-2137758858', 'name': 'tapc9f56d85-9d', 'instance_id': 'ae41baf4-b0eb-4402-aebe-718f5c7f3ed9', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:3e:82', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc9f56d85-9d'}, 'message_id': 'f7b4af84-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.225140939, 'message_signature': 'c4abbd06a496e5d56e36ad6e166e6ad44855d8d98d5f6831cd8306532eaa2055'}]}, 'timestamp': '2026-01-22 17:29:55.537759', '_unique_id': '1e5b7a57687d4fadb8b0c2f3fe55e282'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.539 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.540 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.540 12 DEBUG ceilometer.compute.pollsters [-] 558df49c-4071-47cf-9f12-34cc70b1f266/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.541 12 DEBUG ceilometer.compute.pollsters [-] ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3742e5f-1f45-4e4a-92e5-37a59e823a53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3a896d4927d442ffba421873948034be', 'user_name': None, 'project_id': '7ff1e5ce4806445a8e463c71b6930bec', 'project_name': None, 'resource_id': 'instance-00000032-558df49c-4071-47cf-9f12-34cc70b1f266-tapcd31a146-70', 'timestamp': '2026-01-22T17:29:55.540793', 'resource_metadata': {'display_name': 'tempest-server-test-1076717562', 'name': 'tapcd31a146-70', 'instance_id': '558df49c-4071-47cf-9f12-34cc70b1f266', 'instance_type': 'm1.nano', 'host': '86f170f1058a5a3ada8f0089109122783cd07aceddaa6cda15aa7fda', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f2:f4:bc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcd31a146-70'}, 'message_id': 'f7b53f12-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.221118293, 'message_signature': '52aec63dc1f5f12703a5da0e941869cc2aaf82a4d2ea7a3da1c0303971647583'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'instance-00000033-ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-tapc9f56d85-9d', 'timestamp': '2026-01-22T17:29:55.540793', 'resource_metadata': {'display_name': 'tempest-internal-dns-test-vm-2137758858', 'name': 'tapc9f56d85-9d', 'instance_id': 'ae41baf4-b0eb-4402-aebe-718f5c7f3ed9', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:3e:82', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc9f56d85-9d'}, 'message_id': 'f7b552c2-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.225140939, 'message_signature': '88c79418f22b57483d78f288bc1946ac876b43b3d3b609d005bcac66e2265daa'}]}, 'timestamp': '2026-01-22 17:29:55.541923', '_unique_id': 'ae7c2542847f494eb8a3ff030b4d53b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.543 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.544 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.544 12 DEBUG ceilometer.compute.pollsters [-] 558df49c-4071-47cf-9f12-34cc70b1f266/network.incoming.bytes volume: 7312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.545 12 DEBUG ceilometer.compute.pollsters [-] ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '83a2f62f-5a81-4c58-a90e-917119ce195f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7312, 'user_id': '3a896d4927d442ffba421873948034be', 'user_name': None, 'project_id': '7ff1e5ce4806445a8e463c71b6930bec', 'project_name': None, 'resource_id': 'instance-00000032-558df49c-4071-47cf-9f12-34cc70b1f266-tapcd31a146-70', 'timestamp': '2026-01-22T17:29:55.544892', 'resource_metadata': {'display_name': 'tempest-server-test-1076717562', 'name': 'tapcd31a146-70', 'instance_id': '558df49c-4071-47cf-9f12-34cc70b1f266', 'instance_type': 'm1.nano', 'host': '86f170f1058a5a3ada8f0089109122783cd07aceddaa6cda15aa7fda', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f2:f4:bc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcd31a146-70'}, 'message_id': 'f7b5df76-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.221118293, 'message_signature': '5770c053e45b73651ba6777064034a915490f914b4b536ae0e6e646014df2ab1'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 
'1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'instance-00000033-ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-tapc9f56d85-9d', 'timestamp': '2026-01-22T17:29:55.544892', 'resource_metadata': {'display_name': 'tempest-internal-dns-test-vm-2137758858', 'name': 'tapc9f56d85-9d', 'instance_id': 'ae41baf4-b0eb-4402-aebe-718f5c7f3ed9', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:3e:82', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc9f56d85-9d'}, 'message_id': 'f7b5f466-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.225140939, 'message_signature': '4d3516de6e4d6728fb1663a900fc1eb24e1da8c903391cb9d140419c548c43c9'}]}, 'timestamp': '2026-01-22 17:29:55.546109', '_unique_id': '573cff348c9443eab3b434df79aff742'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.547 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.549 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.550 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.550 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1076717562>, <NovaLikeServer: tempest-internal-dns-test-vm-2137758858>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1076717562>, <NovaLikeServer: tempest-internal-dns-test-vm-2137758858>]
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.550 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.550 12 DEBUG ceilometer.compute.pollsters [-] 558df49c-4071-47cf-9f12-34cc70b1f266/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.551 12 DEBUG ceilometer.compute.pollsters [-] ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '181abdaf-a6ea-4ce5-a4a5-490a4a21030a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3a896d4927d442ffba421873948034be', 'user_name': None, 'project_id': '7ff1e5ce4806445a8e463c71b6930bec', 'project_name': None, 'resource_id': 'instance-00000032-558df49c-4071-47cf-9f12-34cc70b1f266-tapcd31a146-70', 'timestamp': '2026-01-22T17:29:55.550726', 'resource_metadata': {'display_name': 'tempest-server-test-1076717562', 'name': 'tapcd31a146-70', 'instance_id': '558df49c-4071-47cf-9f12-34cc70b1f266', 'instance_type': 'm1.nano', 'host': '86f170f1058a5a3ada8f0089109122783cd07aceddaa6cda15aa7fda', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f2:f4:bc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcd31a146-70'}, 'message_id': 'f7b6bf5e-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.221118293, 'message_signature': '40550494acc6b5a03d8be65a7e96dd853e4206e448bbc69c40efcadc4bda0f73'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'instance-00000033-ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-tapc9f56d85-9d', 'timestamp': '2026-01-22T17:29:55.550726', 'resource_metadata': {'display_name': 'tempest-internal-dns-test-vm-2137758858', 'name': 'tapc9f56d85-9d', 'instance_id': 'ae41baf4-b0eb-4402-aebe-718f5c7f3ed9', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:3e:82', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc9f56d85-9d'}, 'message_id': 'f7b6d1f6-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.225140939, 'message_signature': 'c416b1f6c414d2af95c83b33a1e11eba08641fedc0b45e502a1b62400b87293b'}]}, 'timestamp': '2026-01-22 17:29:55.551957', '_unique_id': '0d7427b956a44f1787c5bc86e2639501'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.553 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.554 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.554 12 DEBUG ceilometer.compute.pollsters [-] 558df49c-4071-47cf-9f12-34cc70b1f266/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.555 12 DEBUG ceilometer.compute.pollsters [-] ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f011562e-e6bd-47b0-bd59-903713b61408', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3a896d4927d442ffba421873948034be', 'user_name': None, 'project_id': '7ff1e5ce4806445a8e463c71b6930bec', 'project_name': None, 'resource_id': 'instance-00000032-558df49c-4071-47cf-9f12-34cc70b1f266-tapcd31a146-70', 'timestamp': '2026-01-22T17:29:55.554896', 'resource_metadata': {'display_name': 'tempest-server-test-1076717562', 'name': 'tapcd31a146-70', 'instance_id': '558df49c-4071-47cf-9f12-34cc70b1f266', 'instance_type': 'm1.nano', 'host': '86f170f1058a5a3ada8f0089109122783cd07aceddaa6cda15aa7fda', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f2:f4:bc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcd31a146-70'}, 'message_id': 'f7b762e2-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.221118293, 'message_signature': '6291aa38fb7d4044afb7c99596f19567bf2c456d8db8912d8e4c2be94e7108da'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'instance-00000033-ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-tapc9f56d85-9d', 'timestamp': '2026-01-22T17:29:55.554896', 'resource_metadata': {'display_name': 'tempest-internal-dns-test-vm-2137758858', 'name': 'tapc9f56d85-9d', 'instance_id': 'ae41baf4-b0eb-4402-aebe-718f5c7f3ed9', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:3e:82', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc9f56d85-9d'}, 'message_id': 'f7b77610-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.225140939, 'message_signature': '8d45d04294ca4ab10303f4210259c23cc6b9a3ed49694d29a910ab0abb35d584'}]}, 'timestamp': '2026-01-22 17:29:55.556019', '_unique_id': 'fd0ecc3f6bf84058a586c64084ef7c76'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.557 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.558 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.558 12 DEBUG ceilometer.compute.pollsters [-] 558df49c-4071-47cf-9f12-34cc70b1f266/disk.device.read.bytes volume: 31382016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.559 12 DEBUG ceilometer.compute.pollsters [-] ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f6a47798-e1e4-4128-9ceb-0860d9f70804', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31382016, 'user_id': '3a896d4927d442ffba421873948034be', 'user_name': None, 'project_id': '7ff1e5ce4806445a8e463c71b6930bec', 'project_name': None, 'resource_id': '558df49c-4071-47cf-9f12-34cc70b1f266-vda', 'timestamp': '2026-01-22T17:29:55.558744', 'resource_metadata': {'display_name': 'tempest-server-test-1076717562', 'name': 'instance-00000032', 'instance_id': '558df49c-4071-47cf-9f12-34cc70b1f266', 'instance_type': 'm1.nano', 'host': '86f170f1058a5a3ada8f0089109122783cd07aceddaa6cda15aa7fda', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f7b7fc66-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.23167657, 'message_signature': '9264930a36ee60d74ddb83322d70f4781200f1b7f6a65111fff228e7d05a510c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-vda', 'timestamp': '2026-01-22T17:29:55.558744', 'resource_metadata': {'display_name': 'tempest-internal-dns-test-vm-2137758858', 'name': 'instance-00000033', 'instance_id': 'ae41baf4-b0eb-4402-aebe-718f5c7f3ed9', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f7b809e0-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.244947049, 'message_signature': 'bdd59bc6a1b56fb3cf81b5949276f7933e39ed9861fe9962dc9a83e9b06bbcdc'}]}, 'timestamp': '2026-01-22 17:29:55.559692', '_unique_id': '202da12e2ac040e0b2a58c9e0454f204'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.560 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.561 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.575 12 DEBUG ceilometer.compute.pollsters [-] 558df49c-4071-47cf-9f12-34cc70b1f266/cpu volume: 11620000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.595 12 DEBUG ceilometer.compute.pollsters [-] ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/cpu volume: 3380000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d0f461e-2072-413c-9670-834f7337a04c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11620000000, 'user_id': '3a896d4927d442ffba421873948034be', 'user_name': None, 'project_id': '7ff1e5ce4806445a8e463c71b6930bec', 'project_name': None, 'resource_id': '558df49c-4071-47cf-9f12-34cc70b1f266', 'timestamp': '2026-01-22T17:29:55.561917', 'resource_metadata': {'display_name': 'tempest-server-test-1076717562', 'name': 'instance-00000032', 'instance_id': '558df49c-4071-47cf-9f12-34cc70b1f266', 'instance_type': 'm1.nano', 'host': '86f170f1058a5a3ada8f0089109122783cd07aceddaa6cda15aa7fda', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'f7ba734c-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.335358463, 'message_signature': 'ea6f76de81473cf800781f7bf8b0e5cb01a36a8b41a2ac336f52a582f2242561'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3380000000, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'ae41baf4-b0eb-4402-aebe-718f5c7f3ed9', 
'timestamp': '2026-01-22T17:29:55.561917', 'resource_metadata': {'display_name': 'tempest-internal-dns-test-vm-2137758858', 'name': 'instance-00000033', 'instance_id': 'ae41baf4-b0eb-4402-aebe-718f5c7f3ed9', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'f7bd9b76-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.355990374, 'message_signature': '1f01b3d2c332bcaf532196e42a876855146b16b15f3dfd6798ca923ddd869fa2'}]}, 'timestamp': '2026-01-22 17:29:55.596202', '_unique_id': 'aa4e675d02a64baca0a293deeee8e94e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.597 12 DEBUG ceilometer.compute.pollsters [-] 558df49c-4071-47cf-9f12-34cc70b1f266/disk.device.read.requests volume: 1163 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 DEBUG ceilometer.compute.pollsters [-] ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'be0ab9c9-ed3c-42d6-b3a4-0d01d7d92a82', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1163, 'user_id': '3a896d4927d442ffba421873948034be', 'user_name': None, 'project_id': '7ff1e5ce4806445a8e463c71b6930bec', 'project_name': None, 'resource_id': '558df49c-4071-47cf-9f12-34cc70b1f266-vda', 'timestamp': '2026-01-22T17:29:55.597847', 'resource_metadata': {'display_name': 'tempest-server-test-1076717562', 'name': 'instance-00000032', 'instance_id': '558df49c-4071-47cf-9f12-34cc70b1f266', 'instance_type': 'm1.nano', 'host': '86f170f1058a5a3ada8f0089109122783cd07aceddaa6cda15aa7fda', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f7bde9be-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.23167657, 'message_signature': '5edec34c5a464fd5d3abb4f3a63995c5d7dbe8df04e35ff7224f1e333be587ce'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 
'resource_id': 'ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-vda', 'timestamp': '2026-01-22T17:29:55.597847', 'resource_metadata': {'display_name': 'tempest-internal-dns-test-vm-2137758858', 'name': 'instance-00000033', 'instance_id': 'ae41baf4-b0eb-4402-aebe-718f5c7f3ed9', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f7bdf3c8-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.244947049, 'message_signature': '08a1d6b17f375b5d1f815eaa613252e92327f4f435a4d7c2490b70f7e388c1f7'}]}, 'timestamp': '2026-01-22 17:29:55.598325', '_unique_id': '31375f560d084693bb3b36b78ce77f53'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.598 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.599 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.599 12 DEBUG ceilometer.compute.pollsters [-] 558df49c-4071-47cf-9f12-34cc70b1f266/disk.device.write.bytes volume: 72871936 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.599 12 DEBUG ceilometer.compute.pollsters [-] ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8624ee59-3f78-4c91-98e6-c38c9dabe461', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72871936, 'user_id': '3a896d4927d442ffba421873948034be', 'user_name': None, 'project_id': '7ff1e5ce4806445a8e463c71b6930bec', 'project_name': None, 'resource_id': '558df49c-4071-47cf-9f12-34cc70b1f266-vda', 'timestamp': '2026-01-22T17:29:55.599522', 'resource_metadata': {'display_name': 'tempest-server-test-1076717562', 'name': 'instance-00000032', 'instance_id': '558df49c-4071-47cf-9f12-34cc70b1f266', 'instance_type': 'm1.nano', 'host': '86f170f1058a5a3ada8f0089109122783cd07aceddaa6cda15aa7fda', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f7be3108-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.23167657, 'message_signature': '82f61a105ec6b4bd0ab7887dbe1df187d3edb57c41ea593b151f2b31724dfd4b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 
'ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-vda', 'timestamp': '2026-01-22T17:29:55.599522', 'resource_metadata': {'display_name': 'tempest-internal-dns-test-vm-2137758858', 'name': 'instance-00000033', 'instance_id': 'ae41baf4-b0eb-4402-aebe-718f5c7f3ed9', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f7be3a4a-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.244947049, 'message_signature': '4b58beee8aa49b514a6f0a3576d29b253c973d19b86296fb2ee78e387f12efd0'}]}, 'timestamp': '2026-01-22 17:29:55.600131', '_unique_id': 'f79d6bf21c654c609fa6419a4733db98'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.600 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.601 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.601 12 DEBUG ceilometer.compute.pollsters [-] 558df49c-4071-47cf-9f12-34cc70b1f266/network.outgoing.packets volume: 115 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.601 12 DEBUG ceilometer.compute.pollsters [-] ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6f873a43-3a5c-42ae-af7f-c2d20cdc3ed5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 115, 'user_id': '3a896d4927d442ffba421873948034be', 'user_name': None, 'project_id': '7ff1e5ce4806445a8e463c71b6930bec', 'project_name': None, 'resource_id': 'instance-00000032-558df49c-4071-47cf-9f12-34cc70b1f266-tapcd31a146-70', 'timestamp': '2026-01-22T17:29:55.601327', 'resource_metadata': {'display_name': 'tempest-server-test-1076717562', 'name': 'tapcd31a146-70', 'instance_id': '558df49c-4071-47cf-9f12-34cc70b1f266', 'instance_type': 'm1.nano', 'host': '86f170f1058a5a3ada8f0089109122783cd07aceddaa6cda15aa7fda', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f2:f4:bc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcd31a146-70'}, 'message_id': 'f7be71cc-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.221118293, 'message_signature': 'a604c75669d53eea3ecb0e75a2f2af5bbfc14e4b060be71456eadf8706f35806'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'instance-00000033-ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-tapc9f56d85-9d', 'timestamp': '2026-01-22T17:29:55.601327', 'resource_metadata': {'display_name': 'tempest-internal-dns-test-vm-2137758858', 'name': 'tapc9f56d85-9d', 'instance_id': 'ae41baf4-b0eb-4402-aebe-718f5c7f3ed9', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:3e:82', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc9f56d85-9d'}, 'message_id': 'f7be7ac8-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.225140939, 'message_signature': '99b715e8b0cff9e3923d26af2ac189bf79e86f37811a71bb11c6d7ba63cd6785'}]}, 'timestamp': '2026-01-22 17:29:55.601786', '_unique_id': 'b5992382a4f94b099bd65fe1fd917a4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.602 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.603 12 DEBUG ceilometer.compute.pollsters [-] 558df49c-4071-47cf-9f12-34cc70b1f266/network.outgoing.bytes volume: 10121 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.603 12 DEBUG ceilometer.compute.pollsters [-] ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9fd01a8e-e4b5-43e4-a8ca-7fb6174d6507', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10121, 'user_id': '3a896d4927d442ffba421873948034be', 'user_name': None, 'project_id': '7ff1e5ce4806445a8e463c71b6930bec', 'project_name': None, 'resource_id': 'instance-00000032-558df49c-4071-47cf-9f12-34cc70b1f266-tapcd31a146-70', 'timestamp': '2026-01-22T17:29:55.602998', 'resource_metadata': {'display_name': 'tempest-server-test-1076717562', 'name': 'tapcd31a146-70', 'instance_id': '558df49c-4071-47cf-9f12-34cc70b1f266', 'instance_type': 'm1.nano', 'host': '86f170f1058a5a3ada8f0089109122783cd07aceddaa6cda15aa7fda', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f2:f4:bc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcd31a146-70'}, 'message_id': 'f7beb2d6-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.221118293, 'message_signature': '980606b5b68681424719448831512cbb5a26bc1897407c04c35092e69823755a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'instance-00000033-ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-tapc9f56d85-9d', 'timestamp': '2026-01-22T17:29:55.602998', 'resource_metadata': {'display_name': 'tempest-internal-dns-test-vm-2137758858', 'name': 'tapc9f56d85-9d', 'instance_id': 'ae41baf4-b0eb-4402-aebe-718f5c7f3ed9', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:3e:82', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc9f56d85-9d'}, 'message_id': 'f7bebd3a-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.225140939, 'message_signature': '0f3849c99212d2ab72d17c1980a896be06e2f14f96f4305a6b4e47e371838403'}]}, 'timestamp': '2026-01-22 17:29:55.603490', '_unique_id': '34392898fa0d4508afa8218ec8abd693'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.604 12 DEBUG ceilometer.compute.pollsters [-] 558df49c-4071-47cf-9f12-34cc70b1f266/disk.device.write.latency volume: 29709437629 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 DEBUG ceilometer.compute.pollsters [-] ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '37a6bd71-02be-4e07-a39a-549764bd3796', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 29709437629, 'user_id': '3a896d4927d442ffba421873948034be', 'user_name': None, 'project_id': '7ff1e5ce4806445a8e463c71b6930bec', 'project_name': None, 'resource_id': '558df49c-4071-47cf-9f12-34cc70b1f266-vda', 'timestamp': '2026-01-22T17:29:55.604853', 'resource_metadata': {'display_name': 'tempest-server-test-1076717562', 'name': 'instance-00000032', 'instance_id': '558df49c-4071-47cf-9f12-34cc70b1f266', 'instance_type': 'm1.nano', 'host': '86f170f1058a5a3ada8f0089109122783cd07aceddaa6cda15aa7fda', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f7befb2e-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.23167657, 'message_signature': '90aaa5cf50f6701e30f98747fd1c288c4a138819d4a5a0ba582445525e5ebc1e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-vda', 'timestamp': '2026-01-22T17:29:55.604853', 'resource_metadata': {'display_name': 'tempest-internal-dns-test-vm-2137758858', 'name': 'instance-00000033', 'instance_id': 'ae41baf4-b0eb-4402-aebe-718f5c7f3ed9', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f7bf0326-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.244947049, 'message_signature': '70f4def856aa4d4569ef9b5edd369f9abecd6b530a81955f7e1578e67163504c'}]}, 'timestamp': '2026-01-22 17:29:55.605268', '_unique_id': 'e57766b553ab4d8993dc65a11d1d3d77'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.605 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.606 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.606 12 DEBUG ceilometer.compute.pollsters [-] 558df49c-4071-47cf-9f12-34cc70b1f266/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.606 12 DEBUG ceilometer.compute.pollsters [-] ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c2246b8-a8d8-49c3-9487-97a2be656833', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '3a896d4927d442ffba421873948034be', 'user_name': None, 'project_id': '7ff1e5ce4806445a8e463c71b6930bec', 'project_name': None, 'resource_id': '558df49c-4071-47cf-9f12-34cc70b1f266-vda', 'timestamp': '2026-01-22T17:29:55.606429', 'resource_metadata': {'display_name': 'tempest-server-test-1076717562', 'name': 'instance-00000032', 'instance_id': '558df49c-4071-47cf-9f12-34cc70b1f266', 'instance_type': 'm1.nano', 'host': '86f170f1058a5a3ada8f0089109122783cd07aceddaa6cda15aa7fda', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f7bf397c-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.275562333, 'message_signature': '4cfc5cb570edd5f262b08d2c998fc852c5f567d4b18a578660cd2365c7e21c0a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-vda', 'timestamp': '2026-01-22T17:29:55.606429', 'resource_metadata': {'display_name': 'tempest-internal-dns-test-vm-2137758858', 'name': 'instance-00000033', 'instance_id': 'ae41baf4-b0eb-4402-aebe-718f5c7f3ed9', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f7bf423c-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.28421328, 'message_signature': '2c8e497564040e7341f130a432df49304eef3d0951b272cd6bc8acf826f3f6b9'}]}, 'timestamp': '2026-01-22 17:29:55.606884', '_unique_id': 'c15cd2d48e9a4d339c3d3b67d7a194dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.607 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.608 12 DEBUG ceilometer.compute.pollsters [-] 558df49c-4071-47cf-9f12-34cc70b1f266/disk.device.write.requests volume: 317 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.608 12 DEBUG ceilometer.compute.pollsters [-] ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a43640ac-7e0c-4355-891a-7100eb6e4f06', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 317, 'user_id': '3a896d4927d442ffba421873948034be', 'user_name': None, 'project_id': '7ff1e5ce4806445a8e463c71b6930bec', 'project_name': None, 'resource_id': '558df49c-4071-47cf-9f12-34cc70b1f266-vda', 'timestamp': '2026-01-22T17:29:55.607988', 'resource_metadata': {'display_name': 'tempest-server-test-1076717562', 'name': 'instance-00000032', 'instance_id': '558df49c-4071-47cf-9f12-34cc70b1f266', 'instance_type': 'm1.nano', 'host': '86f170f1058a5a3ada8f0089109122783cd07aceddaa6cda15aa7fda', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f7bf7590-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.23167657, 'message_signature': 'dde85a5526e028e63427506295aa6e1a96016cd301b523a56c37850eb414fdb8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 
'ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-vda', 'timestamp': '2026-01-22T17:29:55.607988', 'resource_metadata': {'display_name': 'tempest-internal-dns-test-vm-2137758858', 'name': 'instance-00000033', 'instance_id': 'ae41baf4-b0eb-4402-aebe-718f5c7f3ed9', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f7bf7f0e-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.244947049, 'message_signature': '1fb06699d3c0e2f9bf12f92ad84989887b8c9e61cc8fd50e962431df7f6841dd'}]}, 'timestamp': '2026-01-22 17:29:55.608528', '_unique_id': 'cd00bc2fc5e045b99ab7c20c329883e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.609 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.610 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.610 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1076717562>, <NovaLikeServer: tempest-internal-dns-test-vm-2137758858>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1076717562>, <NovaLikeServer: tempest-internal-dns-test-vm-2137758858>]
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.610 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.610 12 DEBUG ceilometer.compute.pollsters [-] 558df49c-4071-47cf-9f12-34cc70b1f266/memory.usage volume: 46.8203125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.610 12 DEBUG ceilometer.compute.pollsters [-] ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.610 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic is not available for instance ae41baf4-b0eb-4402-aebe-718f5c7f3ed9: ceilometer.compute.pollsters.NoVolumeException
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3af0c8df-3535-4ae5-b750-29ed98107e08', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 46.8203125, 'user_id': '3a896d4927d442ffba421873948034be', 'user_name': None, 'project_id': '7ff1e5ce4806445a8e463c71b6930bec', 'project_name': None, 'resource_id': '558df49c-4071-47cf-9f12-34cc70b1f266', 'timestamp': '2026-01-22T17:29:55.610284', 'resource_metadata': {'display_name': 'tempest-server-test-1076717562', 'name': 'instance-00000032', 'instance_id': '558df49c-4071-47cf-9f12-34cc70b1f266', 'instance_type': 'm1.nano', 'host': '86f170f1058a5a3ada8f0089109122783cd07aceddaa6cda15aa7fda', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'f7bfcf18-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.335358463, 'message_signature': '25d333a05ff288eae783b08c2dfda49cf14257e0ff43c7c5035f6593d9ad1bad'}]}, 'timestamp': '2026-01-22 17:29:55.610830', '_unique_id': 'a70ad4f53a5142c4a8242c35de79d354'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.611 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.612 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.612 12 DEBUG ceilometer.compute.pollsters [-] 558df49c-4071-47cf-9f12-34cc70b1f266/disk.device.usage volume: 29818880 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.612 12 DEBUG ceilometer.compute.pollsters [-] ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '41c41aad-440b-4c5b-be59-7d7139c4440f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29818880, 'user_id': '3a896d4927d442ffba421873948034be', 'user_name': None, 'project_id': '7ff1e5ce4806445a8e463c71b6930bec', 'project_name': None, 'resource_id': '558df49c-4071-47cf-9f12-34cc70b1f266-vda', 'timestamp': '2026-01-22T17:29:55.612160', 'resource_metadata': {'display_name': 'tempest-server-test-1076717562', 'name': 'instance-00000032', 'instance_id': '558df49c-4071-47cf-9f12-34cc70b1f266', 'instance_type': 'm1.nano', 'host': '86f170f1058a5a3ada8f0089109122783cd07aceddaa6cda15aa7fda', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f7c019c8-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.275562333, 'message_signature': '77444c7f23f953a5e1a9cb14a3c0383df80e9ed94b7597a4ba1260a316d89b96'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 
'ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-vda', 'timestamp': '2026-01-22T17:29:55.612160', 'resource_metadata': {'display_name': 'tempest-internal-dns-test-vm-2137758858', 'name': 'instance-00000033', 'instance_id': 'ae41baf4-b0eb-4402-aebe-718f5c7f3ed9', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f7c02738-f7b7-11f0-9e69-fa163eaea1db', 'monotonic_time': 5359.28421328, 'message_signature': 'bbc5edc16d8f6d2d002bf910e905589fbc53879009d487f46c4dd86071a9442f'}]}, 'timestamp': '2026-01-22 17:29:55.612767', '_unique_id': 'b313627b66254c98ac427b16b50e7c03'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:29:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:29:55.613 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:29:56 compute-0 nova_compute[183075]: 2026-01-22 17:29:56.575 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:57 compute-0 nova_compute[183075]: 2026-01-22 17:29:57.470 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:58 compute-0 podman[232219]: 2026-01-22 17:29:58.353025119 +0000 UTC m=+0.056116324 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:29:58 compute-0 podman[232220]: 2026-01-22 17:29:58.356022778 +0000 UTC m=+0.058847886 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.buildah.version=1.33.7, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 22 17:29:58 compute-0 podman[232218]: 2026-01-22 17:29:58.382750129 +0000 UTC m=+0.090743203 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 22 17:29:59 compute-0 nova_compute[183075]: 2026-01-22 17:29:59.435 183079 INFO nova.compute.manager [None req-0ff54db0-5542-48d7-bd71-07e0f6f59539 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Get console output
Jan 22 17:29:59 compute-0 nova_compute[183075]: 2026-01-22 17:29:59.440 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:30:00 compute-0 nova_compute[183075]: 2026-01-22 17:30:00.002 183079 INFO nova.compute.manager [None req-6defe914-f2a8-4dce-b585-b7c1ec19e1b4 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Get console output
Jan 22 17:30:00 compute-0 nova_compute[183075]: 2026-01-22 17:30:00.007 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:30:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:00.301 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:30:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:00.303 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:30:00 compute-0 nova_compute[183075]: 2026-01-22 17:30:00.301 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:01 compute-0 nova_compute[183075]: 2026-01-22 17:30:01.577 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:02 compute-0 podman[232286]: 2026-01-22 17:30:02.366785066 +0000 UTC m=+0.076450018 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute)
Jan 22 17:30:02 compute-0 nova_compute[183075]: 2026-01-22 17:30:02.471 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:02 compute-0 nova_compute[183075]: 2026-01-22 17:30:02.790 183079 DEBUG nova.compute.manager [req-ff0b4563-ff21-4cf2-b2b5-702567066e4b req-37df698b-ec1f-41cc-8c15-cfc71a38352b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Received event network-changed-cd31a146-70f5-4610-88f1-ae4772887ce2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:30:02 compute-0 nova_compute[183075]: 2026-01-22 17:30:02.790 183079 DEBUG nova.compute.manager [req-ff0b4563-ff21-4cf2-b2b5-702567066e4b req-37df698b-ec1f-41cc-8c15-cfc71a38352b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Refreshing instance network info cache due to event network-changed-cd31a146-70f5-4610-88f1-ae4772887ce2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:30:02 compute-0 nova_compute[183075]: 2026-01-22 17:30:02.791 183079 DEBUG oslo_concurrency.lockutils [req-ff0b4563-ff21-4cf2-b2b5-702567066e4b req-37df698b-ec1f-41cc-8c15-cfc71a38352b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-558df49c-4071-47cf-9f12-34cc70b1f266" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:30:02 compute-0 nova_compute[183075]: 2026-01-22 17:30:02.792 183079 DEBUG oslo_concurrency.lockutils [req-ff0b4563-ff21-4cf2-b2b5-702567066e4b req-37df698b-ec1f-41cc-8c15-cfc71a38352b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-558df49c-4071-47cf-9f12-34cc70b1f266" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:30:02 compute-0 nova_compute[183075]: 2026-01-22 17:30:02.792 183079 DEBUG nova.network.neutron [req-ff0b4563-ff21-4cf2-b2b5-702567066e4b req-37df698b-ec1f-41cc-8c15-cfc71a38352b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Refreshing network info cache for port cd31a146-70f5-4610-88f1-ae4772887ce2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:30:02 compute-0 nova_compute[183075]: 2026-01-22 17:30:02.906 183079 DEBUG oslo_concurrency.lockutils [None req-b6fa2c5a-1e68-4112-9c58-3da5b7d707e2 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Acquiring lock "558df49c-4071-47cf-9f12-34cc70b1f266" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:30:02 compute-0 nova_compute[183075]: 2026-01-22 17:30:02.907 183079 DEBUG oslo_concurrency.lockutils [None req-b6fa2c5a-1e68-4112-9c58-3da5b7d707e2 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "558df49c-4071-47cf-9f12-34cc70b1f266" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:30:02 compute-0 nova_compute[183075]: 2026-01-22 17:30:02.907 183079 DEBUG oslo_concurrency.lockutils [None req-b6fa2c5a-1e68-4112-9c58-3da5b7d707e2 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Acquiring lock "558df49c-4071-47cf-9f12-34cc70b1f266-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:30:02 compute-0 nova_compute[183075]: 2026-01-22 17:30:02.907 183079 DEBUG oslo_concurrency.lockutils [None req-b6fa2c5a-1e68-4112-9c58-3da5b7d707e2 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "558df49c-4071-47cf-9f12-34cc70b1f266-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:30:02 compute-0 nova_compute[183075]: 2026-01-22 17:30:02.907 183079 DEBUG oslo_concurrency.lockutils [None req-b6fa2c5a-1e68-4112-9c58-3da5b7d707e2 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "558df49c-4071-47cf-9f12-34cc70b1f266-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:30:02 compute-0 nova_compute[183075]: 2026-01-22 17:30:02.908 183079 INFO nova.compute.manager [None req-b6fa2c5a-1e68-4112-9c58-3da5b7d707e2 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Terminating instance
Jan 22 17:30:02 compute-0 nova_compute[183075]: 2026-01-22 17:30:02.909 183079 DEBUG nova.compute.manager [None req-b6fa2c5a-1e68-4112-9c58-3da5b7d707e2 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:30:02 compute-0 kernel: tapcd31a146-70 (unregistering): left promiscuous mode
Jan 22 17:30:02 compute-0 NetworkManager[55454]: <info>  [1769103002.9415] device (tapcd31a146-70): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:30:02 compute-0 ovn_controller[95372]: 2026-01-22T17:30:02Z|00563|binding|INFO|Releasing lport cd31a146-70f5-4610-88f1-ae4772887ce2 from this chassis (sb_readonly=0)
Jan 22 17:30:02 compute-0 ovn_controller[95372]: 2026-01-22T17:30:02Z|00564|binding|INFO|Setting lport cd31a146-70f5-4610-88f1-ae4772887ce2 down in Southbound
Jan 22 17:30:02 compute-0 ovn_controller[95372]: 2026-01-22T17:30:02Z|00565|binding|INFO|Removing iface tapcd31a146-70 ovn-installed in OVS
Jan 22 17:30:02 compute-0 nova_compute[183075]: 2026-01-22 17:30:02.984 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:02 compute-0 nova_compute[183075]: 2026-01-22 17:30:02.987 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:02.992 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:f4:bc 10.100.0.10'], port_security=['fa:16:3e:f2:f4:bc 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-port-288740050', 'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '558df49c-4071-47cf-9f12-34cc70b1f266', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-012007cf-673c-4f83-a4b9-f21a913a1ccf', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-port-288740050', 'neutron:project_id': '7ff1e5ce4806445a8e463c71b6930bec', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a8d0872f-de3c-4404-94dc-7a328e5b8aa9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.201'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a510149a-49ad-47bc-a8ad-05908544b3cc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=cd31a146-70f5-4610-88f1-ae4772887ce2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:30:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:02.996 104629 INFO neutron.agent.ovn.metadata.agent [-] Port cd31a146-70f5-4610-88f1-ae4772887ce2 in datapath 012007cf-673c-4f83-a4b9-f21a913a1ccf unbound from our chassis
Jan 22 17:30:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:02.997 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 012007cf-673c-4f83-a4b9-f21a913a1ccf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:30:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:02.998 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7d6f64ac-cfd9-4e24-abfe-6fd90bb4ce95]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:02.999 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf namespace which is not needed anymore
Jan 22 17:30:03 compute-0 nova_compute[183075]: 2026-01-22 17:30:03.000 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:03 compute-0 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d00000032.scope: Deactivated successfully.
Jan 22 17:30:03 compute-0 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d00000032.scope: Consumed 13.528s CPU time.
Jan 22 17:30:03 compute-0 systemd-machined[154382]: Machine qemu-50-instance-00000032 terminated.
Jan 22 17:30:03 compute-0 kernel: tapcd31a146-70: entered promiscuous mode
Jan 22 17:30:03 compute-0 kernel: tapcd31a146-70 (unregistering): left promiscuous mode
Jan 22 17:30:03 compute-0 ovn_controller[95372]: 2026-01-22T17:30:03Z|00566|binding|INFO|Claiming lport cd31a146-70f5-4610-88f1-ae4772887ce2 for this chassis.
Jan 22 17:30:03 compute-0 ovn_controller[95372]: 2026-01-22T17:30:03Z|00567|binding|INFO|cd31a146-70f5-4610-88f1-ae4772887ce2: Claiming fa:16:3e:f2:f4:bc 10.100.0.10
Jan 22 17:30:03 compute-0 nova_compute[183075]: 2026-01-22 17:30:03.135 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:03 compute-0 ovn_controller[95372]: 2026-01-22T17:30:03Z|00568|binding|INFO|Setting lport cd31a146-70f5-4610-88f1-ae4772887ce2 ovn-installed in OVS
Jan 22 17:30:03 compute-0 nova_compute[183075]: 2026-01-22 17:30:03.153 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:03 compute-0 ovn_controller[95372]: 2026-01-22T17:30:03Z|00569|if_status|INFO|Dropped 3 log messages in last 535 seconds (most recently, 535 seconds ago) due to excessive rate
Jan 22 17:30:03 compute-0 ovn_controller[95372]: 2026-01-22T17:30:03Z|00570|if_status|INFO|Not setting lport cd31a146-70f5-4610-88f1-ae4772887ce2 down as sb is readonly
Jan 22 17:30:03 compute-0 nova_compute[183075]: 2026-01-22 17:30:03.163 183079 INFO nova.virt.libvirt.driver [-] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Instance destroyed successfully.
Jan 22 17:30:03 compute-0 nova_compute[183075]: 2026-01-22 17:30:03.163 183079 DEBUG nova.objects.instance [None req-b6fa2c5a-1e68-4112-9c58-3da5b7d707e2 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lazy-loading 'resources' on Instance uuid 558df49c-4071-47cf-9f12-34cc70b1f266 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:30:03 compute-0 ovn_controller[95372]: 2026-01-22T17:30:03Z|00571|binding|INFO|Releasing lport cd31a146-70f5-4610-88f1-ae4772887ce2 from this chassis (sb_readonly=0)
Jan 22 17:30:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:03.254 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:f4:bc 10.100.0.10'], port_security=['fa:16:3e:f2:f4:bc 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-port-288740050', 'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '558df49c-4071-47cf-9f12-34cc70b1f266', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-012007cf-673c-4f83-a4b9-f21a913a1ccf', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-port-288740050', 'neutron:project_id': '7ff1e5ce4806445a8e463c71b6930bec', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a8d0872f-de3c-4404-94dc-7a328e5b8aa9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.201'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a510149a-49ad-47bc-a8ad-05908544b3cc, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=cd31a146-70f5-4610-88f1-ae4772887ce2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:30:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:03.264 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:f4:bc 10.100.0.10'], port_security=['fa:16:3e:f2:f4:bc 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-port-288740050', 'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '558df49c-4071-47cf-9f12-34cc70b1f266', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-012007cf-673c-4f83-a4b9-f21a913a1ccf', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-port-288740050', 'neutron:project_id': '7ff1e5ce4806445a8e463c71b6930bec', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a8d0872f-de3c-4404-94dc-7a328e5b8aa9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.201'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a510149a-49ad-47bc-a8ad-05908544b3cc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=cd31a146-70f5-4610-88f1-ae4772887ce2) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:30:03 compute-0 nova_compute[183075]: 2026-01-22 17:30:03.267 183079 DEBUG nova.compute.manager [req-44815f21-5ab0-442e-a323-383ba427b09a req-e6c7ea09-3ca6-43b7-a6b0-fe60319e37a1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Received event network-vif-unplugged-cd31a146-70f5-4610-88f1-ae4772887ce2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:30:03 compute-0 nova_compute[183075]: 2026-01-22 17:30:03.269 183079 DEBUG oslo_concurrency.lockutils [req-44815f21-5ab0-442e-a323-383ba427b09a req-e6c7ea09-3ca6-43b7-a6b0-fe60319e37a1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "558df49c-4071-47cf-9f12-34cc70b1f266-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:30:03 compute-0 nova_compute[183075]: 2026-01-22 17:30:03.270 183079 DEBUG oslo_concurrency.lockutils [req-44815f21-5ab0-442e-a323-383ba427b09a req-e6c7ea09-3ca6-43b7-a6b0-fe60319e37a1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "558df49c-4071-47cf-9f12-34cc70b1f266-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:30:03 compute-0 nova_compute[183075]: 2026-01-22 17:30:03.271 183079 DEBUG oslo_concurrency.lockutils [req-44815f21-5ab0-442e-a323-383ba427b09a req-e6c7ea09-3ca6-43b7-a6b0-fe60319e37a1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "558df49c-4071-47cf-9f12-34cc70b1f266-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:30:03 compute-0 nova_compute[183075]: 2026-01-22 17:30:03.271 183079 DEBUG nova.compute.manager [req-44815f21-5ab0-442e-a323-383ba427b09a req-e6c7ea09-3ca6-43b7-a6b0-fe60319e37a1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] No waiting events found dispatching network-vif-unplugged-cd31a146-70f5-4610-88f1-ae4772887ce2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:30:03 compute-0 nova_compute[183075]: 2026-01-22 17:30:03.272 183079 DEBUG nova.compute.manager [req-44815f21-5ab0-442e-a323-383ba427b09a req-e6c7ea09-3ca6-43b7-a6b0-fe60319e37a1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Received event network-vif-unplugged-cd31a146-70f5-4610-88f1-ae4772887ce2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:30:03 compute-0 neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf[231907]: [NOTICE]   (231911) : haproxy version is 2.8.14-c23fe91
Jan 22 17:30:03 compute-0 neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf[231907]: [NOTICE]   (231911) : path to executable is /usr/sbin/haproxy
Jan 22 17:30:03 compute-0 neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf[231907]: [WARNING]  (231911) : Exiting Master process...
Jan 22 17:30:03 compute-0 neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf[231907]: [WARNING]  (231911) : Exiting Master process...
Jan 22 17:30:03 compute-0 nova_compute[183075]: 2026-01-22 17:30:03.275 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:03 compute-0 neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf[231907]: [ALERT]    (231911) : Current worker (231913) exited with code 143 (Terminated)
Jan 22 17:30:03 compute-0 neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf[231907]: [WARNING]  (231911) : All workers exited. Exiting... (0)
Jan 22 17:30:03 compute-0 nova_compute[183075]: 2026-01-22 17:30:03.277 183079 DEBUG nova.virt.libvirt.vif [None req-b6fa2c5a-1e68-4112-9c58-3da5b7d707e2 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:29:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1076717562',display_name='tempest-server-test-1076717562',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1076717562',id=50,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHQrJNwmVmi0v3CvDVkdf2ULfmIKW9OE2obw9UEIh0JilIeeueUzwA1cDH+T5CoOIGZz/satGSZDSgKqtLklRpNQ/Wm6QLNBLAjV/3q74U9Y8J0BPwM5hIfTkFkrKFKf2g==',key_name='tempest-keypair-test-1359386190',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:29:31Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7ff1e5ce4806445a8e463c71b6930bec',ramdisk_id='',reservation_id='r-74azlj4z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_h
w_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-PortsTest-1337721110',owner_user_name='tempest-PortsTest-1337721110-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:29:31Z,user_data=None,user_id='3a896d4927d442ffba421873948034be',uuid=558df49c-4071-47cf-9f12-34cc70b1f266,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cd31a146-70f5-4610-88f1-ae4772887ce2", "address": "fa:16:3e:f2:f4:bc", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd31a146-70", "ovs_interfaceid": "cd31a146-70f5-4610-88f1-ae4772887ce2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:30:03 compute-0 nova_compute[183075]: 2026-01-22 17:30:03.277 183079 DEBUG nova.network.os_vif_util [None req-b6fa2c5a-1e68-4112-9c58-3da5b7d707e2 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Converting VIF {"id": "cd31a146-70f5-4610-88f1-ae4772887ce2", "address": "fa:16:3e:f2:f4:bc", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd31a146-70", "ovs_interfaceid": "cd31a146-70f5-4610-88f1-ae4772887ce2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:30:03 compute-0 systemd[1]: libpod-65367d4e527a226baad5edeb19ef248381bcf3fdab4f6102eaee16f9b44d027f.scope: Deactivated successfully.
Jan 22 17:30:03 compute-0 nova_compute[183075]: 2026-01-22 17:30:03.278 183079 DEBUG nova.network.os_vif_util [None req-b6fa2c5a-1e68-4112-9c58-3da5b7d707e2 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f2:f4:bc,bridge_name='br-int',has_traffic_filtering=True,id=cd31a146-70f5-4610-88f1-ae4772887ce2,network=Network(012007cf-673c-4f83-a4b9-f21a913a1ccf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcd31a146-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:30:03 compute-0 nova_compute[183075]: 2026-01-22 17:30:03.278 183079 DEBUG os_vif [None req-b6fa2c5a-1e68-4112-9c58-3da5b7d707e2 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f2:f4:bc,bridge_name='br-int',has_traffic_filtering=True,id=cd31a146-70f5-4610-88f1-ae4772887ce2,network=Network(012007cf-673c-4f83-a4b9-f21a913a1ccf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcd31a146-70') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:30:03 compute-0 nova_compute[183075]: 2026-01-22 17:30:03.280 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:03 compute-0 nova_compute[183075]: 2026-01-22 17:30:03.281 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd31a146-70, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:30:03 compute-0 nova_compute[183075]: 2026-01-22 17:30:03.283 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:03 compute-0 nova_compute[183075]: 2026-01-22 17:30:03.285 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:30:03 compute-0 podman[232326]: 2026-01-22 17:30:03.285399024 +0000 UTC m=+0.206562114 container died 65367d4e527a226baad5edeb19ef248381bcf3fdab4f6102eaee16f9b44d027f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:30:03 compute-0 nova_compute[183075]: 2026-01-22 17:30:03.287 183079 INFO os_vif [None req-b6fa2c5a-1e68-4112-9c58-3da5b7d707e2 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f2:f4:bc,bridge_name='br-int',has_traffic_filtering=True,id=cd31a146-70f5-4610-88f1-ae4772887ce2,network=Network(012007cf-673c-4f83-a4b9-f21a913a1ccf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcd31a146-70')
Jan 22 17:30:03 compute-0 nova_compute[183075]: 2026-01-22 17:30:03.287 183079 INFO nova.virt.libvirt.driver [None req-b6fa2c5a-1e68-4112-9c58-3da5b7d707e2 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Deleting instance files /var/lib/nova/instances/558df49c-4071-47cf-9f12-34cc70b1f266_del
Jan 22 17:30:03 compute-0 nova_compute[183075]: 2026-01-22 17:30:03.288 183079 INFO nova.virt.libvirt.driver [None req-b6fa2c5a-1e68-4112-9c58-3da5b7d707e2 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Deletion of /var/lib/nova/instances/558df49c-4071-47cf-9f12-34cc70b1f266_del complete
Jan 22 17:30:03 compute-0 nova_compute[183075]: 2026-01-22 17:30:03.370 183079 INFO nova.compute.manager [None req-b6fa2c5a-1e68-4112-9c58-3da5b7d707e2 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Took 0.46 seconds to destroy the instance on the hypervisor.
Jan 22 17:30:03 compute-0 nova_compute[183075]: 2026-01-22 17:30:03.370 183079 DEBUG oslo.service.loopingcall [None req-b6fa2c5a-1e68-4112-9c58-3da5b7d707e2 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:30:03 compute-0 nova_compute[183075]: 2026-01-22 17:30:03.371 183079 DEBUG nova.compute.manager [-] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:30:03 compute-0 nova_compute[183075]: 2026-01-22 17:30:03.371 183079 DEBUG nova.network.neutron [-] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:30:03 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-65367d4e527a226baad5edeb19ef248381bcf3fdab4f6102eaee16f9b44d027f-userdata-shm.mount: Deactivated successfully.
Jan 22 17:30:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-254e9dcd2cf88a7cc5d32c3cbf3230ac62d236a9489264d7f822f955dc954c32-merged.mount: Deactivated successfully.
Jan 22 17:30:03 compute-0 podman[232326]: 2026-01-22 17:30:03.678899526 +0000 UTC m=+0.600062656 container cleanup 65367d4e527a226baad5edeb19ef248381bcf3fdab4f6102eaee16f9b44d027f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:30:03 compute-0 systemd[1]: libpod-conmon-65367d4e527a226baad5edeb19ef248381bcf3fdab4f6102eaee16f9b44d027f.scope: Deactivated successfully.
Jan 22 17:30:03 compute-0 podman[232378]: 2026-01-22 17:30:03.863100492 +0000 UTC m=+0.157049554 container remove 65367d4e527a226baad5edeb19ef248381bcf3fdab4f6102eaee16f9b44d027f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 17:30:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:03.868 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[cd8ca276-0c11-4974-b5aa-961274440382]: (4, ('Thu Jan 22 05:30:03 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf (65367d4e527a226baad5edeb19ef248381bcf3fdab4f6102eaee16f9b44d027f)\n65367d4e527a226baad5edeb19ef248381bcf3fdab4f6102eaee16f9b44d027f\nThu Jan 22 05:30:03 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf (65367d4e527a226baad5edeb19ef248381bcf3fdab4f6102eaee16f9b44d027f)\n65367d4e527a226baad5edeb19ef248381bcf3fdab4f6102eaee16f9b44d027f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:03.871 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f0408abd-7aa4-4d35-a6ed-bad22af868db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:03.872 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap012007cf-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:30:03 compute-0 nova_compute[183075]: 2026-01-22 17:30:03.874 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:03 compute-0 kernel: tap012007cf-60: left promiscuous mode
Jan 22 17:30:03 compute-0 nova_compute[183075]: 2026-01-22 17:30:03.884 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:03.887 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b8a2f460-fde7-4d48-b6fb-c52a6db0dd00]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:03.901 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[83e26cd5-9777-4018-a984-fc34f622cc4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:03.902 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[24360116-6237-4214-8d43-906a73104b74]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:03.918 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[92e94322-5b01-452a-b59c-db617dc70967]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532023, 'reachable_time': 33142, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232400, 'error': None, 'target': 'ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:03 compute-0 systemd[1]: run-netns-ovnmeta\x2d012007cf\x2d673c\x2d4f83\x2da4b9\x2df21a913a1ccf.mount: Deactivated successfully.
Jan 22 17:30:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:03.922 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:30:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:03.922 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[7649fc2a-412d-4815-81c4-44db46229583]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:03.923 104629 INFO neutron.agent.ovn.metadata.agent [-] Port cd31a146-70f5-4610-88f1-ae4772887ce2 in datapath 012007cf-673c-4f83-a4b9-f21a913a1ccf unbound from our chassis
Jan 22 17:30:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:03.925 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 012007cf-673c-4f83-a4b9-f21a913a1ccf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:30:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:03.927 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7f591320-fb11-4951-a9c8-844b8f525bb3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:03.928 104629 INFO neutron.agent.ovn.metadata.agent [-] Port cd31a146-70f5-4610-88f1-ae4772887ce2 in datapath 012007cf-673c-4f83-a4b9-f21a913a1ccf unbound from our chassis
Jan 22 17:30:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:03.929 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 012007cf-673c-4f83-a4b9-f21a913a1ccf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:30:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:03.930 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7c1ebbb8-55a9-4bf6-b4d8-4f24701233eb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:04 compute-0 ovn_controller[95372]: 2026-01-22T17:30:04Z|00104|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:90:3e:82 10.100.0.4
Jan 22 17:30:04 compute-0 ovn_controller[95372]: 2026-01-22T17:30:04Z|00105|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:90:3e:82 10.100.0.4
Jan 22 17:30:05 compute-0 nova_compute[183075]: 2026-01-22 17:30:05.112 183079 INFO nova.compute.manager [None req-2eaf350a-85fb-43df-bc53-b201166a52f3 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Get console output
Jan 22 17:30:05 compute-0 nova_compute[183075]: 2026-01-22 17:30:05.118 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:30:05 compute-0 nova_compute[183075]: 2026-01-22 17:30:05.533 183079 DEBUG nova.compute.manager [req-b3275e1e-40a7-4acd-ad25-6c5cf44acfde req-4ab3045b-80f2-4b6a-86ce-cd54a92c6fe4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Received event network-vif-plugged-cd31a146-70f5-4610-88f1-ae4772887ce2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:30:05 compute-0 nova_compute[183075]: 2026-01-22 17:30:05.533 183079 DEBUG oslo_concurrency.lockutils [req-b3275e1e-40a7-4acd-ad25-6c5cf44acfde req-4ab3045b-80f2-4b6a-86ce-cd54a92c6fe4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "558df49c-4071-47cf-9f12-34cc70b1f266-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:30:05 compute-0 nova_compute[183075]: 2026-01-22 17:30:05.533 183079 DEBUG oslo_concurrency.lockutils [req-b3275e1e-40a7-4acd-ad25-6c5cf44acfde req-4ab3045b-80f2-4b6a-86ce-cd54a92c6fe4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "558df49c-4071-47cf-9f12-34cc70b1f266-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:30:05 compute-0 nova_compute[183075]: 2026-01-22 17:30:05.534 183079 DEBUG oslo_concurrency.lockutils [req-b3275e1e-40a7-4acd-ad25-6c5cf44acfde req-4ab3045b-80f2-4b6a-86ce-cd54a92c6fe4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "558df49c-4071-47cf-9f12-34cc70b1f266-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:30:05 compute-0 nova_compute[183075]: 2026-01-22 17:30:05.534 183079 DEBUG nova.compute.manager [req-b3275e1e-40a7-4acd-ad25-6c5cf44acfde req-4ab3045b-80f2-4b6a-86ce-cd54a92c6fe4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] No waiting events found dispatching network-vif-plugged-cd31a146-70f5-4610-88f1-ae4772887ce2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:30:05 compute-0 nova_compute[183075]: 2026-01-22 17:30:05.534 183079 WARNING nova.compute.manager [req-b3275e1e-40a7-4acd-ad25-6c5cf44acfde req-4ab3045b-80f2-4b6a-86ce-cd54a92c6fe4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Received unexpected event network-vif-plugged-cd31a146-70f5-4610-88f1-ae4772887ce2 for instance with vm_state active and task_state deleting.
Jan 22 17:30:06 compute-0 nova_compute[183075]: 2026-01-22 17:30:06.540 183079 DEBUG nova.network.neutron [req-ff0b4563-ff21-4cf2-b2b5-702567066e4b req-37df698b-ec1f-41cc-8c15-cfc71a38352b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Updated VIF entry in instance network info cache for port cd31a146-70f5-4610-88f1-ae4772887ce2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:30:06 compute-0 nova_compute[183075]: 2026-01-22 17:30:06.541 183079 DEBUG nova.network.neutron [req-ff0b4563-ff21-4cf2-b2b5-702567066e4b req-37df698b-ec1f-41cc-8c15-cfc71a38352b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Updating instance_info_cache with network_info: [{"id": "cd31a146-70f5-4610-88f1-ae4772887ce2", "address": "fa:16:3e:f2:f4:bc", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd31a146-70", "ovs_interfaceid": "cd31a146-70f5-4610-88f1-ae4772887ce2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:30:06 compute-0 nova_compute[183075]: 2026-01-22 17:30:06.563 183079 DEBUG oslo_concurrency.lockutils [req-ff0b4563-ff21-4cf2-b2b5-702567066e4b req-37df698b-ec1f-41cc-8c15-cfc71a38352b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-558df49c-4071-47cf-9f12-34cc70b1f266" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:30:07 compute-0 nova_compute[183075]: 2026-01-22 17:30:07.518 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:08 compute-0 nova_compute[183075]: 2026-01-22 17:30:08.282 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:08 compute-0 nova_compute[183075]: 2026-01-22 17:30:08.671 183079 DEBUG nova.network.neutron [-] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:30:08 compute-0 nova_compute[183075]: 2026-01-22 17:30:08.686 183079 INFO nova.compute.manager [-] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Took 5.31 seconds to deallocate network for instance.
Jan 22 17:30:09 compute-0 nova_compute[183075]: 2026-01-22 17:30:09.275 183079 DEBUG oslo_concurrency.lockutils [None req-b6fa2c5a-1e68-4112-9c58-3da5b7d707e2 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:30:09 compute-0 nova_compute[183075]: 2026-01-22 17:30:09.276 183079 DEBUG oslo_concurrency.lockutils [None req-b6fa2c5a-1e68-4112-9c58-3da5b7d707e2 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:30:09 compute-0 nova_compute[183075]: 2026-01-22 17:30:09.350 183079 DEBUG nova.compute.provider_tree [None req-b6fa2c5a-1e68-4112-9c58-3da5b7d707e2 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:30:09 compute-0 nova_compute[183075]: 2026-01-22 17:30:09.364 183079 DEBUG nova.scheduler.client.report [None req-b6fa2c5a-1e68-4112-9c58-3da5b7d707e2 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:30:09 compute-0 nova_compute[183075]: 2026-01-22 17:30:09.386 183079 DEBUG oslo_concurrency.lockutils [None req-b6fa2c5a-1e68-4112-9c58-3da5b7d707e2 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:30:09 compute-0 nova_compute[183075]: 2026-01-22 17:30:09.603 183079 INFO nova.scheduler.client.report [None req-b6fa2c5a-1e68-4112-9c58-3da5b7d707e2 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Deleted allocations for instance 558df49c-4071-47cf-9f12-34cc70b1f266
Jan 22 17:30:09 compute-0 nova_compute[183075]: 2026-01-22 17:30:09.713 183079 DEBUG oslo_concurrency.lockutils [None req-b6fa2c5a-1e68-4112-9c58-3da5b7d707e2 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "558df49c-4071-47cf-9f12-34cc70b1f266" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.806s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.047 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.047 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eddf21f6-fbb7-4dbb-a219-f930432ddc71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:30:10 compute-0 nova_compute[183075]: 2026-01-22 17:30:10.228 183079 INFO nova.compute.manager [None req-b10f5e9a-97db-48d9-8a90-f5746e754709 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Get console output
Jan 22 17:30:10 compute-0 nova_compute[183075]: 2026-01-22 17:30:10.232 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.304 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.605 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:30:10 compute-0 haproxy-metadata-proxy-eddf21f6-fbb7-4dbb-a219-f930432ddc71[232209]: 10.100.0.4:60534 [22/Jan/2026:17:30:10.046] listener listener/metadata 0/0/0/559/559 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.606 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.5582597
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.613 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.613 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eddf21f6-fbb7-4dbb-a219-f930432ddc71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.635 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.635 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 189 time: 0.0220339
Jan 22 17:30:10 compute-0 haproxy-metadata-proxy-eddf21f6-fbb7-4dbb-a219-f930432ddc71[232209]: 10.100.0.4:60538 [22/Jan/2026:17:30:10.612] listener listener/metadata 0/0/0/23/23 200 173 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.639 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.640 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eddf21f6-fbb7-4dbb-a219-f930432ddc71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.651 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.652 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0118017
Jan 22 17:30:10 compute-0 haproxy-metadata-proxy-eddf21f6-fbb7-4dbb-a219-f930432ddc71[232209]: 10.100.0.4:60544 [22/Jan/2026:17:30:10.639] listener listener/metadata 0/0/0/12/12 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.656 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.657 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eddf21f6-fbb7-4dbb-a219-f930432ddc71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.670 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.670 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0131052
Jan 22 17:30:10 compute-0 haproxy-metadata-proxy-eddf21f6-fbb7-4dbb-a219-f930432ddc71[232209]: 10.100.0.4:60560 [22/Jan/2026:17:30:10.656] listener listener/metadata 0/0/0/14/14 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.674 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.675 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eddf21f6-fbb7-4dbb-a219-f930432ddc71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.690 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.691 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0157518
Jan 22 17:30:10 compute-0 haproxy-metadata-proxy-eddf21f6-fbb7-4dbb-a219-f930432ddc71[232209]: 10.100.0.4:60568 [22/Jan/2026:17:30:10.674] listener listener/metadata 0/0/0/16/16 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.695 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.696 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eddf21f6-fbb7-4dbb-a219-f930432ddc71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.711 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.711 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0148952
Jan 22 17:30:10 compute-0 haproxy-metadata-proxy-eddf21f6-fbb7-4dbb-a219-f930432ddc71[232209]: 10.100.0.4:60576 [22/Jan/2026:17:30:10.695] listener listener/metadata 0/0/0/16/16 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.716 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.717 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eddf21f6-fbb7-4dbb-a219-f930432ddc71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.730 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:30:10 compute-0 haproxy-metadata-proxy-eddf21f6-fbb7-4dbb-a219-f930432ddc71[232209]: 10.100.0.4:60592 [22/Jan/2026:17:30:10.715] listener listener/metadata 0/0/0/14/14 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.731 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0140862
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.735 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.736 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eddf21f6-fbb7-4dbb-a219-f930432ddc71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.748 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.749 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0128727
Jan 22 17:30:10 compute-0 haproxy-metadata-proxy-eddf21f6-fbb7-4dbb-a219-f930432ddc71[232209]: 10.100.0.4:60596 [22/Jan/2026:17:30:10.735] listener listener/metadata 0/0/0/13/13 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.753 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.753 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eddf21f6-fbb7-4dbb-a219-f930432ddc71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.765 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.766 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 175 time: 0.0122895
Jan 22 17:30:10 compute-0 haproxy-metadata-proxy-eddf21f6-fbb7-4dbb-a219-f930432ddc71[232209]: 10.100.0.4:60612 [22/Jan/2026:17:30:10.752] listener listener/metadata 0/0/0/13/13 200 159 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.770 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.771 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eddf21f6-fbb7-4dbb-a219-f930432ddc71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.783 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.783 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 175 time: 0.0128012
Jan 22 17:30:10 compute-0 haproxy-metadata-proxy-eddf21f6-fbb7-4dbb-a219-f930432ddc71[232209]: 10.100.0.4:60624 [22/Jan/2026:17:30:10.770] listener listener/metadata 0/0/0/13/13 200 159 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.788 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.789 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eddf21f6-fbb7-4dbb-a219-f930432ddc71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:30:10 compute-0 haproxy-metadata-proxy-eddf21f6-fbb7-4dbb-a219-f930432ddc71[232209]: 10.100.0.4:60636 [22/Jan/2026:17:30:10.787] listener listener/metadata 0/0/0/14/14 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.802 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0139530
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.811 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.812 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eddf21f6-fbb7-4dbb-a219-f930432ddc71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.826 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.826 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0146141
Jan 22 17:30:10 compute-0 haproxy-metadata-proxy-eddf21f6-fbb7-4dbb-a219-f930432ddc71[232209]: 10.100.0.4:60648 [22/Jan/2026:17:30:10.811] listener listener/metadata 0/0/0/15/15 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.832 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.833 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eddf21f6-fbb7-4dbb-a219-f930432ddc71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.846 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.847 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0141299
Jan 22 17:30:10 compute-0 haproxy-metadata-proxy-eddf21f6-fbb7-4dbb-a219-f930432ddc71[232209]: 10.100.0.4:60656 [22/Jan/2026:17:30:10.831] listener listener/metadata 0/0/0/15/15 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.852 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.853 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eddf21f6-fbb7-4dbb-a219-f930432ddc71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.866 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:30:10 compute-0 haproxy-metadata-proxy-eddf21f6-fbb7-4dbb-a219-f930432ddc71[232209]: 10.100.0.4:60660 [22/Jan/2026:17:30:10.852] listener listener/metadata 0/0/0/14/14 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.867 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0136690
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.872 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.873 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eddf21f6-fbb7-4dbb-a219-f930432ddc71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.886 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:30:10 compute-0 haproxy-metadata-proxy-eddf21f6-fbb7-4dbb-a219-f930432ddc71[232209]: 10.100.0.4:60672 [22/Jan/2026:17:30:10.872] listener listener/metadata 0/0/0/14/14 200 159 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.887 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 175 time: 0.0133781
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.892 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.893 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: eddf21f6-fbb7-4dbb-a219-f930432ddc71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.910 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:30:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:10.910 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0173738
Jan 22 17:30:10 compute-0 haproxy-metadata-proxy-eddf21f6-fbb7-4dbb-a219-f930432ddc71[232209]: 10.100.0.4:60688 [22/Jan/2026:17:30:10.892] listener listener/metadata 0/0/0/18/18 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:30:12 compute-0 nova_compute[183075]: 2026-01-22 17:30:12.520 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:12 compute-0 nova_compute[183075]: 2026-01-22 17:30:12.906 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:30:12 compute-0 nova_compute[183075]: 2026-01-22 17:30:12.932 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Triggering sync for uuid ae41baf4-b0eb-4402-aebe-718f5c7f3ed9 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 22 17:30:12 compute-0 nova_compute[183075]: 2026-01-22 17:30:12.933 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "ae41baf4-b0eb-4402-aebe-718f5c7f3ed9" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:30:12 compute-0 nova_compute[183075]: 2026-01-22 17:30:12.933 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "ae41baf4-b0eb-4402-aebe-718f5c7f3ed9" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:30:12 compute-0 nova_compute[183075]: 2026-01-22 17:30:12.960 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "ae41baf4-b0eb-4402-aebe-718f5c7f3ed9" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.027s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:30:13 compute-0 nova_compute[183075]: 2026-01-22 17:30:13.311 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:13 compute-0 podman[232401]: 2026-01-22 17:30:13.353461823 +0000 UTC m=+0.058843116 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 17:30:15 compute-0 nova_compute[183075]: 2026-01-22 17:30:15.328 183079 INFO nova.compute.manager [None req-359c16aa-a201-47fb-ab73-293491426fe2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Get console output
Jan 22 17:30:15 compute-0 nova_compute[183075]: 2026-01-22 17:30:15.332 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:30:15 compute-0 nova_compute[183075]: 2026-01-22 17:30:15.906 183079 DEBUG oslo_concurrency.lockutils [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Acquiring lock "e8d327ba-899a-4ec7-8b78-0fb6dedf7371" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:30:15 compute-0 nova_compute[183075]: 2026-01-22 17:30:15.907 183079 DEBUG oslo_concurrency.lockutils [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "e8d327ba-899a-4ec7-8b78-0fb6dedf7371" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:30:15 compute-0 nova_compute[183075]: 2026-01-22 17:30:15.937 183079 DEBUG nova.compute.manager [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.020 183079 DEBUG oslo_concurrency.lockutils [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.020 183079 DEBUG oslo_concurrency.lockutils [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.024 183079 DEBUG nova.virt.hardware [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.024 183079 INFO nova.compute.claims [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.131 183079 DEBUG nova.compute.provider_tree [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.157 183079 DEBUG nova.scheduler.client.report [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.179 183079 DEBUG oslo_concurrency.lockutils [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.180 183079 DEBUG nova.compute.manager [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.237 183079 DEBUG nova.compute.manager [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.238 183079 DEBUG nova.network.neutron [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.271 183079 INFO nova.virt.libvirt.driver [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.306 183079 DEBUG nova.compute.manager [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.410 183079 DEBUG nova.compute.manager [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.411 183079 DEBUG nova.virt.libvirt.driver [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.412 183079 INFO nova.virt.libvirt.driver [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Creating image(s)
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.412 183079 DEBUG oslo_concurrency.lockutils [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Acquiring lock "/var/lib/nova/instances/e8d327ba-899a-4ec7-8b78-0fb6dedf7371/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.413 183079 DEBUG oslo_concurrency.lockutils [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "/var/lib/nova/instances/e8d327ba-899a-4ec7-8b78-0fb6dedf7371/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.413 183079 DEBUG oslo_concurrency.lockutils [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "/var/lib/nova/instances/e8d327ba-899a-4ec7-8b78-0fb6dedf7371/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.425 183079 DEBUG oslo_concurrency.processutils [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.481 183079 DEBUG oslo_concurrency.processutils [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.482 183079 DEBUG oslo_concurrency.lockutils [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.482 183079 DEBUG oslo_concurrency.lockutils [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.497 183079 DEBUG oslo_concurrency.processutils [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.548 183079 DEBUG oslo_concurrency.processutils [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.549 183079 DEBUG oslo_concurrency.processutils [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/e8d327ba-899a-4ec7-8b78-0fb6dedf7371/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.591 183079 DEBUG oslo_concurrency.processutils [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/e8d327ba-899a-4ec7-8b78-0fb6dedf7371/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.592 183079 DEBUG oslo_concurrency.lockutils [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.592 183079 DEBUG oslo_concurrency.processutils [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.627 183079 DEBUG nova.policy [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3a896d4927d442ffba421873948034be', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7ff1e5ce4806445a8e463c71b6930bec', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.645 183079 DEBUG oslo_concurrency.processutils [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.647 183079 DEBUG nova.virt.disk.api [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Checking if we can resize image /var/lib/nova/instances/e8d327ba-899a-4ec7-8b78-0fb6dedf7371/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.647 183079 DEBUG oslo_concurrency.processutils [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e8d327ba-899a-4ec7-8b78-0fb6dedf7371/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.711 183079 DEBUG oslo_concurrency.processutils [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e8d327ba-899a-4ec7-8b78-0fb6dedf7371/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.712 183079 DEBUG nova.virt.disk.api [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Cannot resize image /var/lib/nova/instances/e8d327ba-899a-4ec7-8b78-0fb6dedf7371/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.713 183079 DEBUG nova.objects.instance [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lazy-loading 'migration_context' on Instance uuid e8d327ba-899a-4ec7-8b78-0fb6dedf7371 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.728 183079 DEBUG nova.virt.libvirt.driver [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.728 183079 DEBUG nova.virt.libvirt.driver [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Ensure instance console log exists: /var/lib/nova/instances/e8d327ba-899a-4ec7-8b78-0fb6dedf7371/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.729 183079 DEBUG oslo_concurrency.lockutils [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.729 183079 DEBUG oslo_concurrency.lockutils [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.729 183079 DEBUG oslo_concurrency.lockutils [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:30:16 compute-0 nova_compute[183075]: 2026-01-22 17:30:16.816 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:30:17 compute-0 nova_compute[183075]: 2026-01-22 17:30:17.216 183079 DEBUG nova.network.neutron [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Successfully updated port: cd31a146-70f5-4610-88f1-ae4772887ce2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:30:17 compute-0 nova_compute[183075]: 2026-01-22 17:30:17.234 183079 DEBUG oslo_concurrency.lockutils [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Acquiring lock "refresh_cache-e8d327ba-899a-4ec7-8b78-0fb6dedf7371" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:30:17 compute-0 nova_compute[183075]: 2026-01-22 17:30:17.234 183079 DEBUG oslo_concurrency.lockutils [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Acquired lock "refresh_cache-e8d327ba-899a-4ec7-8b78-0fb6dedf7371" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:30:17 compute-0 nova_compute[183075]: 2026-01-22 17:30:17.235 183079 DEBUG nova.network.neutron [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:30:17 compute-0 nova_compute[183075]: 2026-01-22 17:30:17.294 183079 DEBUG nova.compute.manager [req-22bc9eef-ba95-495d-af29-978d28de185d req-75c74b92-52ae-4423-9e52-fd2ca2f811e8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Received event network-changed-cd31a146-70f5-4610-88f1-ae4772887ce2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:30:17 compute-0 nova_compute[183075]: 2026-01-22 17:30:17.295 183079 DEBUG nova.compute.manager [req-22bc9eef-ba95-495d-af29-978d28de185d req-75c74b92-52ae-4423-9e52-fd2ca2f811e8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Refreshing instance network info cache due to event network-changed-cd31a146-70f5-4610-88f1-ae4772887ce2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:30:17 compute-0 nova_compute[183075]: 2026-01-22 17:30:17.295 183079 DEBUG oslo_concurrency.lockutils [req-22bc9eef-ba95-495d-af29-978d28de185d req-75c74b92-52ae-4423-9e52-fd2ca2f811e8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-e8d327ba-899a-4ec7-8b78-0fb6dedf7371" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:30:17 compute-0 nova_compute[183075]: 2026-01-22 17:30:17.371 183079 DEBUG nova.network.neutron [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:30:17 compute-0 nova_compute[183075]: 2026-01-22 17:30:17.524 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:18 compute-0 nova_compute[183075]: 2026-01-22 17:30:18.162 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769103003.1615555, 558df49c-4071-47cf-9f12-34cc70b1f266 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:30:18 compute-0 nova_compute[183075]: 2026-01-22 17:30:18.163 183079 INFO nova.compute.manager [-] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] VM Stopped (Lifecycle Event)
Jan 22 17:30:18 compute-0 nova_compute[183075]: 2026-01-22 17:30:18.233 183079 DEBUG nova.compute.manager [None req-22339d6f-4b6a-46b2-8262-73dcb3599370 - - - - - -] [instance: 558df49c-4071-47cf-9f12-34cc70b1f266] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:30:18 compute-0 nova_compute[183075]: 2026-01-22 17:30:18.312 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:18 compute-0 podman[232440]: 2026-01-22 17:30:18.345754333 +0000 UTC m=+0.056814572 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 17:30:20 compute-0 nova_compute[183075]: 2026-01-22 17:30:20.515 183079 INFO nova.compute.manager [None req-9262698f-0596-4590-83ec-1e6097b70660 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Get console output
Jan 22 17:30:20 compute-0 nova_compute[183075]: 2026-01-22 17:30:20.523 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:30:21 compute-0 nova_compute[183075]: 2026-01-22 17:30:21.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:30:21 compute-0 nova_compute[183075]: 2026-01-22 17:30:21.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.490 183079 DEBUG nova.network.neutron [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Updating instance_info_cache with network_info: [{"id": "cd31a146-70f5-4610-88f1-ae4772887ce2", "address": "fa:16:3e:f2:f4:bc", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd31a146-70", "ovs_interfaceid": "cd31a146-70f5-4610-88f1-ae4772887ce2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.526 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.838 183079 DEBUG oslo_concurrency.lockutils [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Releasing lock "refresh_cache-e8d327ba-899a-4ec7-8b78-0fb6dedf7371" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.839 183079 DEBUG nova.compute.manager [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Instance network_info: |[{"id": "cd31a146-70f5-4610-88f1-ae4772887ce2", "address": "fa:16:3e:f2:f4:bc", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd31a146-70", "ovs_interfaceid": "cd31a146-70f5-4610-88f1-ae4772887ce2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.840 183079 DEBUG oslo_concurrency.lockutils [req-22bc9eef-ba95-495d-af29-978d28de185d req-75c74b92-52ae-4423-9e52-fd2ca2f811e8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-e8d327ba-899a-4ec7-8b78-0fb6dedf7371" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.841 183079 DEBUG nova.network.neutron [req-22bc9eef-ba95-495d-af29-978d28de185d req-75c74b92-52ae-4423-9e52-fd2ca2f811e8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Refreshing network info cache for port cd31a146-70f5-4610-88f1-ae4772887ce2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.846 183079 DEBUG nova.virt.libvirt.driver [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Start _get_guest_xml network_info=[{"id": "cd31a146-70f5-4610-88f1-ae4772887ce2", "address": "fa:16:3e:f2:f4:bc", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd31a146-70", "ovs_interfaceid": "cd31a146-70f5-4610-88f1-ae4772887ce2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.854 183079 WARNING nova.virt.libvirt.driver [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.859 183079 DEBUG nova.virt.libvirt.host [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.859 183079 DEBUG nova.virt.libvirt.host [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.863 183079 DEBUG nova.virt.libvirt.host [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.863 183079 DEBUG nova.virt.libvirt.host [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.864 183079 DEBUG nova.virt.libvirt.driver [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.864 183079 DEBUG nova.virt.hardware [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.864 183079 DEBUG nova.virt.hardware [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.864 183079 DEBUG nova.virt.hardware [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.865 183079 DEBUG nova.virt.hardware [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.865 183079 DEBUG nova.virt.hardware [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.865 183079 DEBUG nova.virt.hardware [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.865 183079 DEBUG nova.virt.hardware [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.865 183079 DEBUG nova.virt.hardware [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.866 183079 DEBUG nova.virt.hardware [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.866 183079 DEBUG nova.virt.hardware [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.866 183079 DEBUG nova.virt.hardware [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.869 183079 DEBUG nova.virt.libvirt.vif [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:30:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-691065204',display_name='tempest-server-test-691065204',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-691065204',id=52,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHQrJNwmVmi0v3CvDVkdf2ULfmIKW9OE2obw9UEIh0JilIeeueUzwA1cDH+T5CoOIGZz/satGSZDSgKqtLklRpNQ/Wm6QLNBLAjV/3q74U9Y8J0BPwM5hIfTkFkrKFKf2g==',key_name='tempest-keypair-test-1359386190',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ff1e5ce4806445a8e463c71b6930bec',ramdisk_id='',reservation_id='r-r8poukwg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='vi
rtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortsTest-1337721110',owner_user_name='tempest-PortsTest-1337721110-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:30:16Z,user_data=None,user_id='3a896d4927d442ffba421873948034be',uuid=e8d327ba-899a-4ec7-8b78-0fb6dedf7371,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cd31a146-70f5-4610-88f1-ae4772887ce2", "address": "fa:16:3e:f2:f4:bc", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd31a146-70", "ovs_interfaceid": "cd31a146-70f5-4610-88f1-ae4772887ce2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.869 183079 DEBUG nova.network.os_vif_util [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Converting VIF {"id": "cd31a146-70f5-4610-88f1-ae4772887ce2", "address": "fa:16:3e:f2:f4:bc", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd31a146-70", "ovs_interfaceid": "cd31a146-70f5-4610-88f1-ae4772887ce2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.870 183079 DEBUG nova.network.os_vif_util [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:f4:bc,bridge_name='br-int',has_traffic_filtering=True,id=cd31a146-70f5-4610-88f1-ae4772887ce2,network=Network(012007cf-673c-4f83-a4b9-f21a913a1ccf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcd31a146-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.871 183079 DEBUG nova.objects.instance [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lazy-loading 'pci_devices' on Instance uuid e8d327ba-899a-4ec7-8b78-0fb6dedf7371 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.903 183079 DEBUG nova.virt.libvirt.driver [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:30:22 compute-0 nova_compute[183075]:   <uuid>e8d327ba-899a-4ec7-8b78-0fb6dedf7371</uuid>
Jan 22 17:30:22 compute-0 nova_compute[183075]:   <name>instance-00000034</name>
Jan 22 17:30:22 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:30:22 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:30:22 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:30:22 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-691065204</nova:name>
Jan 22 17:30:22 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:30:22</nova:creationTime>
Jan 22 17:30:22 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:30:22 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:30:22 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:30:22 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:30:22 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:30:22 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:30:22 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:30:22 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:30:22 compute-0 nova_compute[183075]:         <nova:user uuid="3a896d4927d442ffba421873948034be">tempest-PortsTest-1337721110-project-member</nova:user>
Jan 22 17:30:22 compute-0 nova_compute[183075]:         <nova:project uuid="7ff1e5ce4806445a8e463c71b6930bec">tempest-PortsTest-1337721110</nova:project>
Jan 22 17:30:22 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:30:22 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:30:22 compute-0 nova_compute[183075]:         <nova:port uuid="cd31a146-70f5-4610-88f1-ae4772887ce2">
Jan 22 17:30:22 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:30:22 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:30:22 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:30:22 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <system>
Jan 22 17:30:22 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:30:22 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:30:22 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:30:22 compute-0 nova_compute[183075]:       <entry name="serial">e8d327ba-899a-4ec7-8b78-0fb6dedf7371</entry>
Jan 22 17:30:22 compute-0 nova_compute[183075]:       <entry name="uuid">e8d327ba-899a-4ec7-8b78-0fb6dedf7371</entry>
Jan 22 17:30:22 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     </system>
Jan 22 17:30:22 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:30:22 compute-0 nova_compute[183075]:   <os>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:   </os>
Jan 22 17:30:22 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 17:30:22 compute-0 nova_compute[183075]:   <features>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:   </features>
Jan 22 17:30:22 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:30:22 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:30:22 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:30:22 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/e8d327ba-899a-4ec7-8b78-0fb6dedf7371/disk"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:30:22 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:f2:f4:bc"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:       <target dev="tapcd31a146-70"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:30:22 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/e8d327ba-899a-4ec7-8b78-0fb6dedf7371/console.log" append="off"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <video>
Jan 22 17:30:22 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     </video>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:30:22 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:30:22 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:30:22 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:30:22 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:30:22 compute-0 nova_compute[183075]: </domain>
Jan 22 17:30:22 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.904 183079 DEBUG nova.compute.manager [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Preparing to wait for external event network-vif-plugged-cd31a146-70f5-4610-88f1-ae4772887ce2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.905 183079 DEBUG oslo_concurrency.lockutils [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Acquiring lock "e8d327ba-899a-4ec7-8b78-0fb6dedf7371-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.905 183079 DEBUG oslo_concurrency.lockutils [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "e8d327ba-899a-4ec7-8b78-0fb6dedf7371-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.905 183079 DEBUG oslo_concurrency.lockutils [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "e8d327ba-899a-4ec7-8b78-0fb6dedf7371-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.906 183079 DEBUG nova.virt.libvirt.vif [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:30:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-691065204',display_name='tempest-server-test-691065204',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-691065204',id=52,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHQrJNwmVmi0v3CvDVkdf2ULfmIKW9OE2obw9UEIh0JilIeeueUzwA1cDH+T5CoOIGZz/satGSZDSgKqtLklRpNQ/Wm6QLNBLAjV/3q74U9Y8J0BPwM5hIfTkFkrKFKf2g==',key_name='tempest-keypair-test-1359386190',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ff1e5ce4806445a8e463c71b6930bec',ramdisk_id='',reservation_id='r-r8poukwg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng
_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortsTest-1337721110',owner_user_name='tempest-PortsTest-1337721110-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:30:16Z,user_data=None,user_id='3a896d4927d442ffba421873948034be',uuid=e8d327ba-899a-4ec7-8b78-0fb6dedf7371,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cd31a146-70f5-4610-88f1-ae4772887ce2", "address": "fa:16:3e:f2:f4:bc", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd31a146-70", "ovs_interfaceid": "cd31a146-70f5-4610-88f1-ae4772887ce2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.906 183079 DEBUG nova.network.os_vif_util [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Converting VIF {"id": "cd31a146-70f5-4610-88f1-ae4772887ce2", "address": "fa:16:3e:f2:f4:bc", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd31a146-70", "ovs_interfaceid": "cd31a146-70f5-4610-88f1-ae4772887ce2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.907 183079 DEBUG nova.network.os_vif_util [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:f4:bc,bridge_name='br-int',has_traffic_filtering=True,id=cd31a146-70f5-4610-88f1-ae4772887ce2,network=Network(012007cf-673c-4f83-a4b9-f21a913a1ccf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcd31a146-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.908 183079 DEBUG os_vif [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:f4:bc,bridge_name='br-int',has_traffic_filtering=True,id=cd31a146-70f5-4610-88f1-ae4772887ce2,network=Network(012007cf-673c-4f83-a4b9-f21a913a1ccf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcd31a146-70') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.908 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.909 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.909 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.912 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.912 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcd31a146-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.912 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcd31a146-70, col_values=(('external_ids', {'iface-id': 'cd31a146-70f5-4610-88f1-ae4772887ce2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f2:f4:bc', 'vm-uuid': 'e8d327ba-899a-4ec7-8b78-0fb6dedf7371'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.914 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:22 compute-0 NetworkManager[55454]: <info>  [1769103022.9152] manager: (tapcd31a146-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/243)
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.916 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.920 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.920 183079 INFO os_vif [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:f4:bc,bridge_name='br-int',has_traffic_filtering=True,id=cd31a146-70f5-4610-88f1-ae4772887ce2,network=Network(012007cf-673c-4f83-a4b9-f21a913a1ccf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcd31a146-70')
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.981 183079 DEBUG nova.virt.libvirt.driver [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:30:22 compute-0 nova_compute[183075]: 2026-01-22 17:30:22.982 183079 DEBUG nova.virt.libvirt.driver [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] No VIF found with MAC fa:16:3e:f2:f4:bc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:30:23 compute-0 kernel: tapcd31a146-70: entered promiscuous mode
Jan 22 17:30:23 compute-0 NetworkManager[55454]: <info>  [1769103023.0425] manager: (tapcd31a146-70): new Tun device (/org/freedesktop/NetworkManager/Devices/244)
Jan 22 17:30:23 compute-0 ovn_controller[95372]: 2026-01-22T17:30:23Z|00572|binding|INFO|Claiming lport cd31a146-70f5-4610-88f1-ae4772887ce2 for this chassis.
Jan 22 17:30:23 compute-0 ovn_controller[95372]: 2026-01-22T17:30:23Z|00573|binding|INFO|cd31a146-70f5-4610-88f1-ae4772887ce2: Claiming fa:16:3e:f2:f4:bc 10.100.0.10
Jan 22 17:30:23 compute-0 nova_compute[183075]: 2026-01-22 17:30:23.043 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:23.050 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:f4:bc 10.100.0.10'], port_security=['fa:16:3e:f2:f4:bc 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-port-288740050', 'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'e8d327ba-899a-4ec7-8b78-0fb6dedf7371', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-012007cf-673c-4f83-a4b9-f21a913a1ccf', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-port-288740050', 'neutron:project_id': '7ff1e5ce4806445a8e463c71b6930bec', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'a8d0872f-de3c-4404-94dc-7a328e5b8aa9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a510149a-49ad-47bc-a8ad-05908544b3cc, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=cd31a146-70f5-4610-88f1-ae4772887ce2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:23.052 104629 INFO neutron.agent.ovn.metadata.agent [-] Port cd31a146-70f5-4610-88f1-ae4772887ce2 in datapath 012007cf-673c-4f83-a4b9-f21a913a1ccf bound to our chassis
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:23.053 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 012007cf-673c-4f83-a4b9-f21a913a1ccf
Jan 22 17:30:23 compute-0 ovn_controller[95372]: 2026-01-22T17:30:23Z|00574|binding|INFO|Setting lport cd31a146-70f5-4610-88f1-ae4772887ce2 ovn-installed in OVS
Jan 22 17:30:23 compute-0 ovn_controller[95372]: 2026-01-22T17:30:23Z|00575|binding|INFO|Setting lport cd31a146-70f5-4610-88f1-ae4772887ce2 up in Southbound
Jan 22 17:30:23 compute-0 nova_compute[183075]: 2026-01-22 17:30:23.062 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:23.063 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6f8e5ab1-31c5-4acf-aa9f-d565de23c540]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:23.064 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap012007cf-61 in ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:30:23 compute-0 nova_compute[183075]: 2026-01-22 17:30:23.065 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:23.070 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap012007cf-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:23.070 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[75c50e3e-5b8c-4624-b8ac-9a6aa77b635f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:23.071 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7abddf4b-cd16-4e4a-b433-834f7627b4f8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:23 compute-0 systemd-udevd[232484]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:30:23 compute-0 systemd-machined[154382]: New machine qemu-52-instance-00000034.
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:23.087 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[718a7871-53ea-434c-81e9-b3d650571dbc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:23 compute-0 NetworkManager[55454]: <info>  [1769103023.0965] device (tapcd31a146-70): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:30:23 compute-0 NetworkManager[55454]: <info>  [1769103023.0977] device (tapcd31a146-70): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:30:23 compute-0 systemd[1]: Started Virtual Machine qemu-52-instance-00000034.
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:23.112 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[697143ae-a8e3-42d9-8a95-08fc09247d90]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:23.141 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[94a9d231-c505-41be-91c1-51d08826d06b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:23.149 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c83ba7d1-5387-4015-99ab-33da69b96e52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:23 compute-0 NetworkManager[55454]: <info>  [1769103023.1505] manager: (tap012007cf-60): new Veth device (/org/freedesktop/NetworkManager/Devices/245)
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:23.174 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[ea57fc59-a238-47b1-9564-c62093910227]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:23.178 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[7f38116b-be09-4bfc-b154-160a5012f02e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:23 compute-0 NetworkManager[55454]: <info>  [1769103023.1977] device (tap012007cf-60): carrier: link connected
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:23.202 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[0e2947cb-72c7-4a33-9052-7df742db324f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:23.221 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[441feca5-ebb6-4909-8565-f87b05118426]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap012007cf-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:ca:e2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 164], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 538689, 'reachable_time': 34265, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232515, 'error': None, 'target': 'ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:23.237 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0e3716f5-3c1c-45bf-aa21-59684fa86ae0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee8:cae2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 538689, 'tstamp': 538689}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232516, 'error': None, 'target': 'ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:23.256 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[29306422-4f13-47a7-9687-ad2ffa67872b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap012007cf-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:ca:e2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 164], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 538689, 'reachable_time': 34265, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232517, 'error': None, 'target': 'ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:23.289 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a8bfee23-cddd-4524-a6f5-fabd30201b38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:23.355 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[63519080-0fff-4a16-bff7-6bac5dc79b68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:23.356 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap012007cf-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:23.357 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:23.357 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap012007cf-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:30:23 compute-0 kernel: tap012007cf-60: entered promiscuous mode
Jan 22 17:30:23 compute-0 NetworkManager[55454]: <info>  [1769103023.4126] manager: (tap012007cf-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/246)
Jan 22 17:30:23 compute-0 nova_compute[183075]: 2026-01-22 17:30:23.412 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:23 compute-0 nova_compute[183075]: 2026-01-22 17:30:23.414 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:23.417 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap012007cf-60, col_values=(('external_ids', {'iface-id': 'd7c95871-d767-4104-b2f1-75f5b06e0524'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:30:23 compute-0 ovn_controller[95372]: 2026-01-22T17:30:23Z|00576|binding|INFO|Releasing lport d7c95871-d767-4104-b2f1-75f5b06e0524 from this chassis (sb_readonly=0)
Jan 22 17:30:23 compute-0 nova_compute[183075]: 2026-01-22 17:30:23.419 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:23 compute-0 nova_compute[183075]: 2026-01-22 17:30:23.421 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:23.421 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/012007cf-673c-4f83-a4b9-f21a913a1ccf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/012007cf-673c-4f83-a4b9-f21a913a1ccf.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:23.422 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[adc119b1-702e-490b-9a2d-1e8307e67666]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:23.423 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/012007cf-673c-4f83-a4b9-f21a913a1ccf.pid.haproxy
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 012007cf-673c-4f83-a4b9-f21a913a1ccf
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:30:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:23.423 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf', 'env', 'PROCESS_TAG=haproxy-012007cf-673c-4f83-a4b9-f21a913a1ccf', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/012007cf-673c-4f83-a4b9-f21a913a1ccf.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:30:23 compute-0 nova_compute[183075]: 2026-01-22 17:30:23.431 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:23 compute-0 nova_compute[183075]: 2026-01-22 17:30:23.698 183079 DEBUG nova.compute.manager [req-0d898119-ad29-4de7-b6ab-f7e01ba4e68a req-4dbbf27d-cafb-4637-aca1-52aad7872e1a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Received event network-vif-plugged-cd31a146-70f5-4610-88f1-ae4772887ce2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:30:23 compute-0 nova_compute[183075]: 2026-01-22 17:30:23.698 183079 DEBUG oslo_concurrency.lockutils [req-0d898119-ad29-4de7-b6ab-f7e01ba4e68a req-4dbbf27d-cafb-4637-aca1-52aad7872e1a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "e8d327ba-899a-4ec7-8b78-0fb6dedf7371-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:30:23 compute-0 nova_compute[183075]: 2026-01-22 17:30:23.699 183079 DEBUG oslo_concurrency.lockutils [req-0d898119-ad29-4de7-b6ab-f7e01ba4e68a req-4dbbf27d-cafb-4637-aca1-52aad7872e1a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e8d327ba-899a-4ec7-8b78-0fb6dedf7371-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:30:23 compute-0 nova_compute[183075]: 2026-01-22 17:30:23.699 183079 DEBUG oslo_concurrency.lockutils [req-0d898119-ad29-4de7-b6ab-f7e01ba4e68a req-4dbbf27d-cafb-4637-aca1-52aad7872e1a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e8d327ba-899a-4ec7-8b78-0fb6dedf7371-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:30:23 compute-0 nova_compute[183075]: 2026-01-22 17:30:23.700 183079 DEBUG nova.compute.manager [req-0d898119-ad29-4de7-b6ab-f7e01ba4e68a req-4dbbf27d-cafb-4637-aca1-52aad7872e1a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Processing event network-vif-plugged-cd31a146-70f5-4610-88f1-ae4772887ce2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:30:23 compute-0 nova_compute[183075]: 2026-01-22 17:30:23.802 183079 DEBUG nova.compute.manager [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:30:23 compute-0 nova_compute[183075]: 2026-01-22 17:30:23.803 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103023.802023, e8d327ba-899a-4ec7-8b78-0fb6dedf7371 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:30:23 compute-0 nova_compute[183075]: 2026-01-22 17:30:23.803 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] VM Started (Lifecycle Event)
Jan 22 17:30:23 compute-0 nova_compute[183075]: 2026-01-22 17:30:23.806 183079 DEBUG nova.virt.libvirt.driver [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:30:23 compute-0 nova_compute[183075]: 2026-01-22 17:30:23.809 183079 INFO nova.virt.libvirt.driver [-] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Instance spawned successfully.
Jan 22 17:30:23 compute-0 nova_compute[183075]: 2026-01-22 17:30:23.809 183079 DEBUG nova.virt.libvirt.driver [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:30:23 compute-0 podman[232555]: 2026-01-22 17:30:23.767814907 +0000 UTC m=+0.020806158 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:30:23 compute-0 nova_compute[183075]: 2026-01-22 17:30:23.867 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:30:23 compute-0 nova_compute[183075]: 2026-01-22 17:30:23.869 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:30:23 compute-0 podman[232555]: 2026-01-22 17:30:23.92066061 +0000 UTC m=+0.173651841 container create 28d30c27807f19cfb0ffe9a4b80b1baed5e1e5d735c733d0636513a5c3b1862d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 22 17:30:23 compute-0 nova_compute[183075]: 2026-01-22 17:30:23.941 183079 DEBUG nova.virt.libvirt.driver [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:30:23 compute-0 nova_compute[183075]: 2026-01-22 17:30:23.941 183079 DEBUG nova.virt.libvirt.driver [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:30:23 compute-0 nova_compute[183075]: 2026-01-22 17:30:23.942 183079 DEBUG nova.virt.libvirt.driver [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:30:23 compute-0 nova_compute[183075]: 2026-01-22 17:30:23.942 183079 DEBUG nova.virt.libvirt.driver [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:30:23 compute-0 nova_compute[183075]: 2026-01-22 17:30:23.943 183079 DEBUG nova.virt.libvirt.driver [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:30:23 compute-0 nova_compute[183075]: 2026-01-22 17:30:23.944 183079 DEBUG nova.virt.libvirt.driver [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:30:23 compute-0 nova_compute[183075]: 2026-01-22 17:30:23.948 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:30:23 compute-0 nova_compute[183075]: 2026-01-22 17:30:23.948 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103023.8030503, e8d327ba-899a-4ec7-8b78-0fb6dedf7371 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:30:23 compute-0 nova_compute[183075]: 2026-01-22 17:30:23.949 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] VM Paused (Lifecycle Event)
Jan 22 17:30:23 compute-0 systemd[1]: Started libpod-conmon-28d30c27807f19cfb0ffe9a4b80b1baed5e1e5d735c733d0636513a5c3b1862d.scope.
Jan 22 17:30:23 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:30:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f2be381dabb6ae08debb8105cbeeb78f07eeefee4bed9784a017aacf683a33f8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:30:23 compute-0 nova_compute[183075]: 2026-01-22 17:30:23.985 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:30:23 compute-0 nova_compute[183075]: 2026-01-22 17:30:23.988 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103023.8060954, e8d327ba-899a-4ec7-8b78-0fb6dedf7371 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:30:23 compute-0 nova_compute[183075]: 2026-01-22 17:30:23.988 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] VM Resumed (Lifecycle Event)
Jan 22 17:30:24 compute-0 podman[232555]: 2026-01-22 17:30:24.029797945 +0000 UTC m=+0.282789206 container init 28d30c27807f19cfb0ffe9a4b80b1baed5e1e5d735c733d0636513a5c3b1862d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:30:24 compute-0 podman[232555]: 2026-01-22 17:30:24.034642772 +0000 UTC m=+0.287634003 container start 28d30c27807f19cfb0ffe9a4b80b1baed5e1e5d735c733d0636513a5c3b1862d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 22 17:30:24 compute-0 nova_compute[183075]: 2026-01-22 17:30:24.037 183079 INFO nova.compute.manager [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Took 7.63 seconds to spawn the instance on the hypervisor.
Jan 22 17:30:24 compute-0 nova_compute[183075]: 2026-01-22 17:30:24.037 183079 DEBUG nova.compute.manager [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:30:24 compute-0 nova_compute[183075]: 2026-01-22 17:30:24.047 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:30:24 compute-0 nova_compute[183075]: 2026-01-22 17:30:24.049 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:30:24 compute-0 neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf[232572]: [NOTICE]   (232576) : New worker (232578) forked
Jan 22 17:30:24 compute-0 neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf[232572]: [NOTICE]   (232576) : Loading success.
Jan 22 17:30:24 compute-0 nova_compute[183075]: 2026-01-22 17:30:24.081 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:30:24 compute-0 nova_compute[183075]: 2026-01-22 17:30:24.143 183079 INFO nova.compute.manager [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Took 8.14 seconds to build instance.
Jan 22 17:30:24 compute-0 nova_compute[183075]: 2026-01-22 17:30:24.161 183079 DEBUG oslo_concurrency.lockutils [None req-408937ab-57f5-4932-921b-945f7f914c09 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "e8d327ba-899a-4ec7-8b78-0fb6dedf7371" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.255s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:30:24 compute-0 nova_compute[183075]: 2026-01-22 17:30:24.822 183079 DEBUG nova.network.neutron [req-22bc9eef-ba95-495d-af29-978d28de185d req-75c74b92-52ae-4423-9e52-fd2ca2f811e8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Updated VIF entry in instance network info cache for port cd31a146-70f5-4610-88f1-ae4772887ce2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:30:24 compute-0 nova_compute[183075]: 2026-01-22 17:30:24.823 183079 DEBUG nova.network.neutron [req-22bc9eef-ba95-495d-af29-978d28de185d req-75c74b92-52ae-4423-9e52-fd2ca2f811e8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Updating instance_info_cache with network_info: [{"id": "cd31a146-70f5-4610-88f1-ae4772887ce2", "address": "fa:16:3e:f2:f4:bc", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd31a146-70", "ovs_interfaceid": "cd31a146-70f5-4610-88f1-ae4772887ce2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:30:25 compute-0 nova_compute[183075]: 2026-01-22 17:30:25.025 183079 DEBUG oslo_concurrency.lockutils [req-22bc9eef-ba95-495d-af29-978d28de185d req-75c74b92-52ae-4423-9e52-fd2ca2f811e8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-e8d327ba-899a-4ec7-8b78-0fb6dedf7371" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:30:25 compute-0 nova_compute[183075]: 2026-01-22 17:30:25.696 183079 INFO nova.compute.manager [None req-77482c15-84fe-4d6f-a2f9-e54a8b544a1d 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Get console output
Jan 22 17:30:25 compute-0 nova_compute[183075]: 2026-01-22 17:30:25.698 183079 INFO nova.compute.manager [None req-1a8f34b6-bfc0-41e8-8b8b-3863d88b5a48 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Get console output
Jan 22 17:30:25 compute-0 nova_compute[183075]: 2026-01-22 17:30:25.702 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:30:25 compute-0 nova_compute[183075]: 2026-01-22 17:30:25.703 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:30:25 compute-0 nova_compute[183075]: 2026-01-22 17:30:25.823 183079 DEBUG nova.compute.manager [req-0e32a1f1-d92d-4644-a962-dbf17ba12c86 req-ba33d1cf-433c-4065-8442-4194d36f88ad a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Received event network-vif-plugged-cd31a146-70f5-4610-88f1-ae4772887ce2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:30:25 compute-0 nova_compute[183075]: 2026-01-22 17:30:25.823 183079 DEBUG oslo_concurrency.lockutils [req-0e32a1f1-d92d-4644-a962-dbf17ba12c86 req-ba33d1cf-433c-4065-8442-4194d36f88ad a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "e8d327ba-899a-4ec7-8b78-0fb6dedf7371-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:30:25 compute-0 nova_compute[183075]: 2026-01-22 17:30:25.823 183079 DEBUG oslo_concurrency.lockutils [req-0e32a1f1-d92d-4644-a962-dbf17ba12c86 req-ba33d1cf-433c-4065-8442-4194d36f88ad a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e8d327ba-899a-4ec7-8b78-0fb6dedf7371-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:30:25 compute-0 nova_compute[183075]: 2026-01-22 17:30:25.824 183079 DEBUG oslo_concurrency.lockutils [req-0e32a1f1-d92d-4644-a962-dbf17ba12c86 req-ba33d1cf-433c-4065-8442-4194d36f88ad a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e8d327ba-899a-4ec7-8b78-0fb6dedf7371-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:30:25 compute-0 nova_compute[183075]: 2026-01-22 17:30:25.824 183079 DEBUG nova.compute.manager [req-0e32a1f1-d92d-4644-a962-dbf17ba12c86 req-ba33d1cf-433c-4065-8442-4194d36f88ad a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] No waiting events found dispatching network-vif-plugged-cd31a146-70f5-4610-88f1-ae4772887ce2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:30:25 compute-0 nova_compute[183075]: 2026-01-22 17:30:25.824 183079 WARNING nova.compute.manager [req-0e32a1f1-d92d-4644-a962-dbf17ba12c86 req-ba33d1cf-433c-4065-8442-4194d36f88ad a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Received unexpected event network-vif-plugged-cd31a146-70f5-4610-88f1-ae4772887ce2 for instance with vm_state active and task_state None.
Jan 22 17:30:27 compute-0 nova_compute[183075]: 2026-01-22 17:30:27.529 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:27 compute-0 nova_compute[183075]: 2026-01-22 17:30:27.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:30:27 compute-0 nova_compute[183075]: 2026-01-22 17:30:27.915 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:29 compute-0 podman[232588]: 2026-01-22 17:30:29.349441109 +0000 UTC m=+0.054867882 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 22 17:30:29 compute-0 podman[232589]: 2026-01-22 17:30:29.361085154 +0000 UTC m=+0.065024708 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, container_name=openstack_network_exporter, config_id=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git)
Jan 22 17:30:29 compute-0 podman[232587]: 2026-01-22 17:30:29.376046327 +0000 UTC m=+0.084088248 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:30:30 compute-0 nova_compute[183075]: 2026-01-22 17:30:30.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:30:30 compute-0 nova_compute[183075]: 2026-01-22 17:30:30.787 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:30:30 compute-0 nova_compute[183075]: 2026-01-22 17:30:30.852 183079 INFO nova.compute.manager [None req-2600f05e-d7bb-4350-b8bc-3210d651a8c2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Get console output
Jan 22 17:30:30 compute-0 nova_compute[183075]: 2026-01-22 17:30:30.856 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:30:30 compute-0 nova_compute[183075]: 2026-01-22 17:30:30.861 183079 INFO nova.compute.manager [None req-0a44bab7-9daa-4374-9c28-d701a5864a24 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Get console output
Jan 22 17:30:30 compute-0 nova_compute[183075]: 2026-01-22 17:30:30.865 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:30:31 compute-0 nova_compute[183075]: 2026-01-22 17:30:31.974 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:30:32 compute-0 nova_compute[183075]: 2026-01-22 17:30:32.531 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:32 compute-0 nova_compute[183075]: 2026-01-22 17:30:32.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:30:32 compute-0 nova_compute[183075]: 2026-01-22 17:30:32.917 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:33 compute-0 nova_compute[183075]: 2026-01-22 17:30:33.160 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:30:33 compute-0 nova_compute[183075]: 2026-01-22 17:30:33.160 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:30:33 compute-0 nova_compute[183075]: 2026-01-22 17:30:33.160 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:30:33 compute-0 nova_compute[183075]: 2026-01-22 17:30:33.161 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:30:33 compute-0 podman[232651]: 2026-01-22 17:30:33.260232664 +0000 UTC m=+0.061713472 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:30:33 compute-0 nova_compute[183075]: 2026-01-22 17:30:33.624 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e8d327ba-899a-4ec7-8b78-0fb6dedf7371/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:30:33 compute-0 nova_compute[183075]: 2026-01-22 17:30:33.684 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e8d327ba-899a-4ec7-8b78-0fb6dedf7371/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:30:33 compute-0 nova_compute[183075]: 2026-01-22 17:30:33.685 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e8d327ba-899a-4ec7-8b78-0fb6dedf7371/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:30:33 compute-0 nova_compute[183075]: 2026-01-22 17:30:33.740 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e8d327ba-899a-4ec7-8b78-0fb6dedf7371/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:30:33 compute-0 nova_compute[183075]: 2026-01-22 17:30:33.745 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:30:33 compute-0 nova_compute[183075]: 2026-01-22 17:30:33.810 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:30:33 compute-0 nova_compute[183075]: 2026-01-22 17:30:33.811 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:30:33 compute-0 nova_compute[183075]: 2026-01-22 17:30:33.863 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae41baf4-b0eb-4402-aebe-718f5c7f3ed9/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:30:34 compute-0 nova_compute[183075]: 2026-01-22 17:30:34.001 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:30:34 compute-0 nova_compute[183075]: 2026-01-22 17:30:34.003 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5364MB free_disk=73.33068084716797GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:30:34 compute-0 nova_compute[183075]: 2026-01-22 17:30:34.003 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:30:34 compute-0 nova_compute[183075]: 2026-01-22 17:30:34.004 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:30:34 compute-0 nova_compute[183075]: 2026-01-22 17:30:34.214 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance ae41baf4-b0eb-4402-aebe-718f5c7f3ed9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:30:34 compute-0 nova_compute[183075]: 2026-01-22 17:30:34.215 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance e8d327ba-899a-4ec7-8b78-0fb6dedf7371 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:30:34 compute-0 nova_compute[183075]: 2026-01-22 17:30:34.215 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:30:34 compute-0 nova_compute[183075]: 2026-01-22 17:30:34.215 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:30:34 compute-0 nova_compute[183075]: 2026-01-22 17:30:34.264 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:30:34 compute-0 nova_compute[183075]: 2026-01-22 17:30:34.493 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:30:34 compute-0 nova_compute[183075]: 2026-01-22 17:30:34.724 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:30:34 compute-0 nova_compute[183075]: 2026-01-22 17:30:34.725 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.721s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:30:35 compute-0 sshd-session[232695]: error: kex_exchange_identification: read: Connection reset by peer
Jan 22 17:30:35 compute-0 sshd-session[232695]: Connection reset by 176.120.22.52 port 36297
Jan 22 17:30:35 compute-0 nova_compute[183075]: 2026-01-22 17:30:35.725 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:30:35 compute-0 nova_compute[183075]: 2026-01-22 17:30:35.725 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:30:35 compute-0 nova_compute[183075]: 2026-01-22 17:30:35.726 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:30:36 compute-0 ovn_controller[95372]: 2026-01-22T17:30:36Z|00106|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f2:f4:bc 10.100.0.10
Jan 22 17:30:36 compute-0 ovn_controller[95372]: 2026-01-22T17:30:36Z|00107|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f2:f4:bc 10.100.0.10
Jan 22 17:30:36 compute-0 nova_compute[183075]: 2026-01-22 17:30:36.223 183079 INFO nova.compute.manager [None req-4885111f-fa8b-45b5-bdfd-ef10993148c1 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Get console output
Jan 22 17:30:36 compute-0 nova_compute[183075]: 2026-01-22 17:30:36.225 183079 INFO nova.compute.manager [None req-9057a096-4463-4b81-9da3-41d9627e38b2 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Get console output
Jan 22 17:30:36 compute-0 nova_compute[183075]: 2026-01-22 17:30:36.231 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:30:36 compute-0 nova_compute[183075]: 2026-01-22 17:30:36.232 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:30:36 compute-0 nova_compute[183075]: 2026-01-22 17:30:36.264 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-ae41baf4-b0eb-4402-aebe-718f5c7f3ed9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:30:36 compute-0 nova_compute[183075]: 2026-01-22 17:30:36.264 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-ae41baf4-b0eb-4402-aebe-718f5c7f3ed9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:30:36 compute-0 nova_compute[183075]: 2026-01-22 17:30:36.264 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:30:36 compute-0 nova_compute[183075]: 2026-01-22 17:30:36.265 183079 DEBUG nova.objects.instance [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ae41baf4-b0eb-4402-aebe-718f5c7f3ed9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:30:37 compute-0 nova_compute[183075]: 2026-01-22 17:30:37.482 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Updating instance_info_cache with network_info: [{"id": "c9f56d85-9dbf-4ec0-8ee7-82cf5387b536", "address": "fa:16:3e:90:3e:82", "network": {"id": "eddf21f6-fbb7-4dbb-a219-f930432ddc71", "bridge": "br-int", "label": "tempest-internal-dns-test-network-1900776616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "89916c03f6f440f6ae7cf81f2ae99bad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9f56d85-9d", "ovs_interfaceid": "c9f56d85-9dbf-4ec0-8ee7-82cf5387b536", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:30:37 compute-0 nova_compute[183075]: 2026-01-22 17:30:37.519 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-ae41baf4-b0eb-4402-aebe-718f5c7f3ed9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:30:37 compute-0 nova_compute[183075]: 2026-01-22 17:30:37.519 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:30:37 compute-0 nova_compute[183075]: 2026-01-22 17:30:37.520 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:30:37 compute-0 nova_compute[183075]: 2026-01-22 17:30:37.533 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:37 compute-0 nova_compute[183075]: 2026-01-22 17:30:37.919 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:41 compute-0 nova_compute[183075]: 2026-01-22 17:30:41.396 183079 INFO nova.compute.manager [None req-7d220011-cd01-4449-86bb-3819c917b308 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Get console output
Jan 22 17:30:41 compute-0 nova_compute[183075]: 2026-01-22 17:30:41.398 183079 INFO nova.compute.manager [None req-c79fc9de-6093-47b6-8d6f-afc062c3ce58 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Get console output
Jan 22 17:30:41 compute-0 nova_compute[183075]: 2026-01-22 17:30:41.402 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:30:41 compute-0 nova_compute[183075]: 2026-01-22 17:30:41.404 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:30:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:41.526 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:30:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:41.526 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:30:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:30:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:30:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:30:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:30:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:30:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:30:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:30:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:41.946 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:30:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:41.947 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:30:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:41.947 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:30:42 compute-0 nova_compute[183075]: 2026-01-22 17:30:42.535 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.589 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.590 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 1.0632651
Jan 22 17:30:42 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[232578]: 10.100.0.10:39326 [22/Jan/2026:17:30:41.525] listener listener/metadata 0/0/0/1064/1064 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.604 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.605 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.625 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:30:42 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[232578]: 10.100.0.10:39340 [22/Jan/2026:17:30:42.604] listener listener/metadata 0/0/0/22/22 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.626 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0206337
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.631 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.632 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.648 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.649 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0171633
Jan 22 17:30:42 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[232578]: 10.100.0.10:39350 [22/Jan/2026:17:30:42.630] listener listener/metadata 0/0/0/18/18 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.654 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.655 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.671 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.672 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0165644
Jan 22 17:30:42 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[232578]: 10.100.0.10:39354 [22/Jan/2026:17:30:42.654] listener listener/metadata 0/0/0/17/17 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.677 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.677 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.695 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.695 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0176153
Jan 22 17:30:42 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[232578]: 10.100.0.10:39370 [22/Jan/2026:17:30:42.676] listener listener/metadata 0/0/0/18/18 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.702 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.702 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.721 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:30:42 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[232578]: 10.100.0.10:39374 [22/Jan/2026:17:30:42.701] listener listener/metadata 0/0/0/20/20 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.722 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0195897
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.729 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.729 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.746 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:30:42 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[232578]: 10.100.0.10:39388 [22/Jan/2026:17:30:42.728] listener listener/metadata 0/0/0/18/18 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.747 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0173762
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.755 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.755 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.773 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.773 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0180252
Jan 22 17:30:42 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[232578]: 10.100.0.10:39404 [22/Jan/2026:17:30:42.754] listener listener/metadata 0/0/0/19/19 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.778 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.779 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.793 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.794 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 165 time: 0.0150256
Jan 22 17:30:42 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[232578]: 10.100.0.10:39406 [22/Jan/2026:17:30:42.778] listener listener/metadata 0/0/0/15/15 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.799 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.799 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.813 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.813 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 165 time: 0.0137920
Jan 22 17:30:42 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[232578]: 10.100.0.10:39422 [22/Jan/2026:17:30:42.798] listener listener/metadata 0/0/0/14/14 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.818 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.818 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.832 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0138655
Jan 22 17:30:42 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[232578]: 10.100.0.10:39428 [22/Jan/2026:17:30:42.818] listener listener/metadata 0/0/0/14/14 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.846 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.846 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.862 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.863 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0165424
Jan 22 17:30:42 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[232578]: 10.100.0.10:39432 [22/Jan/2026:17:30:42.846] listener listener/metadata 0/0/0/17/17 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.867 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.868 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.884 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:30:42 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[232578]: 10.100.0.10:39434 [22/Jan/2026:17:30:42.867] listener listener/metadata 0/0/0/17/17 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.885 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0172083
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.888 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.889 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.905 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.905 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0163798
Jan 22 17:30:42 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[232578]: 10.100.0.10:39436 [22/Jan/2026:17:30:42.888] listener listener/metadata 0/0/0/17/17 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.910 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.910 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:30:42 compute-0 nova_compute[183075]: 2026-01-22 17:30:42.921 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.924 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.924 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 165 time: 0.0143623
Jan 22 17:30:42 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[232578]: 10.100.0.10:39452 [22/Jan/2026:17:30:42.909] listener listener/metadata 0/0/0/15/15 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.929 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.929 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 012007cf-673c-4f83-a4b9-f21a913a1ccf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.950 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:30:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:42.950 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0207224
Jan 22 17:30:42 compute-0 haproxy-metadata-proxy-012007cf-673c-4f83-a4b9-f21a913a1ccf[232578]: 10.100.0.10:39458 [22/Jan/2026:17:30:42.928] listener listener/metadata 0/0/0/21/21 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:30:42 compute-0 ovn_controller[95372]: 2026-01-22T17:30:42Z|00577|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 22 17:30:44 compute-0 podman[232708]: 2026-01-22 17:30:44.339538574 +0000 UTC m=+0.050390374 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 17:30:46 compute-0 nova_compute[183075]: 2026-01-22 17:30:46.522 183079 INFO nova.compute.manager [None req-e2193480-1f65-4edf-aa83-ccd24cd887b6 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Get console output
Jan 22 17:30:46 compute-0 nova_compute[183075]: 2026-01-22 17:30:46.527 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:30:46 compute-0 nova_compute[183075]: 2026-01-22 17:30:46.547 183079 INFO nova.compute.manager [None req-5ebfd955-6aaf-4a38-aa0d-29dfbba4aab4 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Get console output
Jan 22 17:30:46 compute-0 nova_compute[183075]: 2026-01-22 17:30:46.550 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:30:47 compute-0 nova_compute[183075]: 2026-01-22 17:30:47.536 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:47 compute-0 nova_compute[183075]: 2026-01-22 17:30:47.923 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:48 compute-0 nova_compute[183075]: 2026-01-22 17:30:48.556 183079 DEBUG nova.compute.manager [req-41b4340d-a23b-4013-9914-79fabff937e2 req-6470f317-0d33-4229-adcc-04e4ba03b6c8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Received event network-changed-c9f56d85-9dbf-4ec0-8ee7-82cf5387b536 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:30:48 compute-0 nova_compute[183075]: 2026-01-22 17:30:48.557 183079 DEBUG nova.compute.manager [req-41b4340d-a23b-4013-9914-79fabff937e2 req-6470f317-0d33-4229-adcc-04e4ba03b6c8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Refreshing instance network info cache due to event network-changed-c9f56d85-9dbf-4ec0-8ee7-82cf5387b536. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:30:48 compute-0 nova_compute[183075]: 2026-01-22 17:30:48.557 183079 DEBUG oslo_concurrency.lockutils [req-41b4340d-a23b-4013-9914-79fabff937e2 req-6470f317-0d33-4229-adcc-04e4ba03b6c8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-ae41baf4-b0eb-4402-aebe-718f5c7f3ed9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:30:48 compute-0 nova_compute[183075]: 2026-01-22 17:30:48.557 183079 DEBUG oslo_concurrency.lockutils [req-41b4340d-a23b-4013-9914-79fabff937e2 req-6470f317-0d33-4229-adcc-04e4ba03b6c8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-ae41baf4-b0eb-4402-aebe-718f5c7f3ed9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:30:48 compute-0 nova_compute[183075]: 2026-01-22 17:30:48.558 183079 DEBUG nova.network.neutron [req-41b4340d-a23b-4013-9914-79fabff937e2 req-6470f317-0d33-4229-adcc-04e4ba03b6c8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Refreshing network info cache for port c9f56d85-9dbf-4ec0-8ee7-82cf5387b536 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:30:48 compute-0 nova_compute[183075]: 2026-01-22 17:30:48.733 183079 DEBUG oslo_concurrency.lockutils [None req-2ae0d16f-43c7-46ca-9802-c072b9bd585b 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Acquiring lock "ae41baf4-b0eb-4402-aebe-718f5c7f3ed9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:30:48 compute-0 nova_compute[183075]: 2026-01-22 17:30:48.733 183079 DEBUG oslo_concurrency.lockutils [None req-2ae0d16f-43c7-46ca-9802-c072b9bd585b 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "ae41baf4-b0eb-4402-aebe-718f5c7f3ed9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:30:48 compute-0 nova_compute[183075]: 2026-01-22 17:30:48.734 183079 DEBUG oslo_concurrency.lockutils [None req-2ae0d16f-43c7-46ca-9802-c072b9bd585b 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Acquiring lock "ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:30:48 compute-0 nova_compute[183075]: 2026-01-22 17:30:48.734 183079 DEBUG oslo_concurrency.lockutils [None req-2ae0d16f-43c7-46ca-9802-c072b9bd585b 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:30:48 compute-0 nova_compute[183075]: 2026-01-22 17:30:48.734 183079 DEBUG oslo_concurrency.lockutils [None req-2ae0d16f-43c7-46ca-9802-c072b9bd585b 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:30:48 compute-0 nova_compute[183075]: 2026-01-22 17:30:48.735 183079 INFO nova.compute.manager [None req-2ae0d16f-43c7-46ca-9802-c072b9bd585b 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Terminating instance
Jan 22 17:30:48 compute-0 nova_compute[183075]: 2026-01-22 17:30:48.736 183079 DEBUG nova.compute.manager [None req-2ae0d16f-43c7-46ca-9802-c072b9bd585b 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:30:48 compute-0 kernel: tapc9f56d85-9d (unregistering): left promiscuous mode
Jan 22 17:30:48 compute-0 NetworkManager[55454]: <info>  [1769103048.7585] device (tapc9f56d85-9d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:30:48 compute-0 ovn_controller[95372]: 2026-01-22T17:30:48Z|00578|binding|INFO|Releasing lport c9f56d85-9dbf-4ec0-8ee7-82cf5387b536 from this chassis (sb_readonly=0)
Jan 22 17:30:48 compute-0 ovn_controller[95372]: 2026-01-22T17:30:48Z|00579|binding|INFO|Setting lport c9f56d85-9dbf-4ec0-8ee7-82cf5387b536 down in Southbound
Jan 22 17:30:48 compute-0 nova_compute[183075]: 2026-01-22 17:30:48.766 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:48 compute-0 ovn_controller[95372]: 2026-01-22T17:30:48Z|00580|binding|INFO|Removing iface tapc9f56d85-9d ovn-installed in OVS
Jan 22 17:30:48 compute-0 nova_compute[183075]: 2026-01-22 17:30:48.768 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:48.778 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:90:3e:82 10.100.0.4'], port_security=['fa:16:3e:90:3e:82 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-internal-dns-test-port-1425761620', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'ae41baf4-b0eb-4402-aebe-718f5c7f3ed9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eddf21f6-fbb7-4dbb-a219-f930432ddc71', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-internal-dns-test-port-1425761620', 'neutron:project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'e5aee07e-e85a-4653-b472-28e019cafee4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.222'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e6e9bf76-88ee-4fb4-b07a-3ebfb5ef8df0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=c9f56d85-9dbf-4ec0-8ee7-82cf5387b536) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:30:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:48.779 104629 INFO neutron.agent.ovn.metadata.agent [-] Port c9f56d85-9dbf-4ec0-8ee7-82cf5387b536 in datapath eddf21f6-fbb7-4dbb-a219-f930432ddc71 unbound from our chassis
Jan 22 17:30:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:48.781 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network eddf21f6-fbb7-4dbb-a219-f930432ddc71, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:30:48 compute-0 nova_compute[183075]: 2026-01-22 17:30:48.781 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:48.783 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c71e537e-88b4-4f86-903e-71fe21cb5770]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:48.783 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-eddf21f6-fbb7-4dbb-a219-f930432ddc71 namespace which is not needed anymore
Jan 22 17:30:48 compute-0 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d00000033.scope: Deactivated successfully.
Jan 22 17:30:48 compute-0 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d00000033.scope: Consumed 14.486s CPU time.
Jan 22 17:30:48 compute-0 systemd-machined[154382]: Machine qemu-51-instance-00000033 terminated.
Jan 22 17:30:48 compute-0 podman[232732]: 2026-01-22 17:30:48.839505216 +0000 UTC m=+0.056181976 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:30:48 compute-0 neutron-haproxy-ovnmeta-eddf21f6-fbb7-4dbb-a219-f930432ddc71[232203]: [NOTICE]   (232207) : haproxy version is 2.8.14-c23fe91
Jan 22 17:30:48 compute-0 neutron-haproxy-ovnmeta-eddf21f6-fbb7-4dbb-a219-f930432ddc71[232203]: [NOTICE]   (232207) : path to executable is /usr/sbin/haproxy
Jan 22 17:30:48 compute-0 neutron-haproxy-ovnmeta-eddf21f6-fbb7-4dbb-a219-f930432ddc71[232203]: [WARNING]  (232207) : Exiting Master process...
Jan 22 17:30:48 compute-0 neutron-haproxy-ovnmeta-eddf21f6-fbb7-4dbb-a219-f930432ddc71[232203]: [ALERT]    (232207) : Current worker (232209) exited with code 143 (Terminated)
Jan 22 17:30:48 compute-0 neutron-haproxy-ovnmeta-eddf21f6-fbb7-4dbb-a219-f930432ddc71[232203]: [WARNING]  (232207) : All workers exited. Exiting... (0)
Jan 22 17:30:48 compute-0 systemd[1]: libpod-60121fef443868fd7b468d61883cc3073fdd7284d5dec6beae7517ea512d001f.scope: Deactivated successfully.
Jan 22 17:30:48 compute-0 conmon[232203]: conmon 60121fef443868fd7b46 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-60121fef443868fd7b468d61883cc3073fdd7284d5dec6beae7517ea512d001f.scope/container/memory.events
Jan 22 17:30:48 compute-0 podman[232780]: 2026-01-22 17:30:48.932862567 +0000 UTC m=+0.047578620 container died 60121fef443868fd7b468d61883cc3073fdd7284d5dec6beae7517ea512d001f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eddf21f6-fbb7-4dbb-a219-f930432ddc71, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 17:30:48 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-60121fef443868fd7b468d61883cc3073fdd7284d5dec6beae7517ea512d001f-userdata-shm.mount: Deactivated successfully.
Jan 22 17:30:48 compute-0 systemd[1]: var-lib-containers-storage-overlay-e5dea59e7573bc63901a98ac8e42558d454a75fc5fb8fd86c3d3a415ec1adfa0-merged.mount: Deactivated successfully.
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.003 183079 INFO nova.virt.libvirt.driver [-] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Instance destroyed successfully.
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.005 183079 DEBUG nova.objects.instance [None req-2ae0d16f-43c7-46ca-9802-c072b9bd585b 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lazy-loading 'resources' on Instance uuid ae41baf4-b0eb-4402-aebe-718f5c7f3ed9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:30:49 compute-0 podman[232780]: 2026-01-22 17:30:49.016508463 +0000 UTC m=+0.131224536 container cleanup 60121fef443868fd7b468d61883cc3073fdd7284d5dec6beae7517ea512d001f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eddf21f6-fbb7-4dbb-a219-f930432ddc71, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.020 183079 DEBUG nova.virt.libvirt.vif [None req-2ae0d16f-43c7-46ca-9802-c072b9bd585b 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:29:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-internal-dns-test-vm-2137758858',display_name='tempest-internal-dns-test-vm-2137758858',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-internal-dns-test-vm-2137758858',id=51,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOLWbfUjAYIoN+tjNcC/qQrGiTRWkuWLPOSnVFATm/h1ZcSJk8EAZI4vKn1W/DBRn+mGMC9ealMQjIznUDRCwtGbPPBWKrg20ByYe0VtCSnmvWLZm4uyTyPBgNtKHGDVqg==',key_name='tempest-internal-dns-test-shared-keypair-1135515409',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:29:53Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='89916c03f6f440f6ae7cf81f2ae99bad',ramdisk_id='',reservation_id='r-fzp1nqoq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-InternalDNSTest-38234021',owner_user_name='tempest-InternalDNSTest-38234021-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:29:54Z,user_data=None,user_id='1ddebe2a251e4b118d9469f7d6fdb2ce',uuid=ae41baf4-b0eb-4402-aebe-718f5c7f3ed9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c9f56d85-9dbf-4ec0-8ee7-82cf5387b536", "address": "fa:16:3e:90:3e:82", "network": {"id": "eddf21f6-fbb7-4dbb-a219-f930432ddc71", "bridge": "br-int", "label": "tempest-internal-dns-test-network-1900776616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "89916c03f6f440f6ae7cf81f2ae99bad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9f56d85-9d", "ovs_interfaceid": "c9f56d85-9dbf-4ec0-8ee7-82cf5387b536", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.021 183079 DEBUG nova.network.os_vif_util [None req-2ae0d16f-43c7-46ca-9802-c072b9bd585b 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Converting VIF {"id": "c9f56d85-9dbf-4ec0-8ee7-82cf5387b536", "address": "fa:16:3e:90:3e:82", "network": {"id": "eddf21f6-fbb7-4dbb-a219-f930432ddc71", "bridge": "br-int", "label": "tempest-internal-dns-test-network-1900776616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "89916c03f6f440f6ae7cf81f2ae99bad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9f56d85-9d", "ovs_interfaceid": "c9f56d85-9dbf-4ec0-8ee7-82cf5387b536", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.021 183079 DEBUG nova.network.os_vif_util [None req-2ae0d16f-43c7-46ca-9802-c072b9bd585b 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:90:3e:82,bridge_name='br-int',has_traffic_filtering=True,id=c9f56d85-9dbf-4ec0-8ee7-82cf5387b536,network=Network(eddf21f6-fbb7-4dbb-a219-f930432ddc71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc9f56d85-9d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.022 183079 DEBUG os_vif [None req-2ae0d16f-43c7-46ca-9802-c072b9bd585b 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:90:3e:82,bridge_name='br-int',has_traffic_filtering=True,id=c9f56d85-9dbf-4ec0-8ee7-82cf5387b536,network=Network(eddf21f6-fbb7-4dbb-a219-f930432ddc71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc9f56d85-9d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.024 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.024 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc9f56d85-9d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.026 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:49 compute-0 systemd[1]: libpod-conmon-60121fef443868fd7b468d61883cc3073fdd7284d5dec6beae7517ea512d001f.scope: Deactivated successfully.
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.028 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.030 183079 INFO os_vif [None req-2ae0d16f-43c7-46ca-9802-c072b9bd585b 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:90:3e:82,bridge_name='br-int',has_traffic_filtering=True,id=c9f56d85-9dbf-4ec0-8ee7-82cf5387b536,network=Network(eddf21f6-fbb7-4dbb-a219-f930432ddc71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc9f56d85-9d')
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.031 183079 INFO nova.virt.libvirt.driver [None req-2ae0d16f-43c7-46ca-9802-c072b9bd585b 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Deleting instance files /var/lib/nova/instances/ae41baf4-b0eb-4402-aebe-718f5c7f3ed9_del
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.032 183079 INFO nova.virt.libvirt.driver [None req-2ae0d16f-43c7-46ca-9802-c072b9bd585b 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Deletion of /var/lib/nova/instances/ae41baf4-b0eb-4402-aebe-718f5c7f3ed9_del complete
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.095 183079 INFO nova.compute.manager [None req-2ae0d16f-43c7-46ca-9802-c072b9bd585b 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Took 0.36 seconds to destroy the instance on the hypervisor.
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.096 183079 DEBUG oslo.service.loopingcall [None req-2ae0d16f-43c7-46ca-9802-c072b9bd585b 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.096 183079 DEBUG nova.compute.manager [-] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.096 183079 DEBUG nova.network.neutron [-] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:30:49 compute-0 podman[232823]: 2026-01-22 17:30:49.099323258 +0000 UTC m=+0.056674259 container remove 60121fef443868fd7b468d61883cc3073fdd7284d5dec6beae7517ea512d001f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eddf21f6-fbb7-4dbb-a219-f930432ddc71, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:30:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:49.106 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c7b286c9-200c-4c0d-a9a8-0eae174e9034]: (4, ('Thu Jan 22 05:30:48 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-eddf21f6-fbb7-4dbb-a219-f930432ddc71 (60121fef443868fd7b468d61883cc3073fdd7284d5dec6beae7517ea512d001f)\n60121fef443868fd7b468d61883cc3073fdd7284d5dec6beae7517ea512d001f\nThu Jan 22 05:30:49 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-eddf21f6-fbb7-4dbb-a219-f930432ddc71 (60121fef443868fd7b468d61883cc3073fdd7284d5dec6beae7517ea512d001f)\n60121fef443868fd7b468d61883cc3073fdd7284d5dec6beae7517ea512d001f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:49.107 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[31902ae1-293f-41a7-b2ea-04ad3ba470b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:49.108 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeddf21f6-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.109 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:49 compute-0 kernel: tapeddf21f6-f0: left promiscuous mode
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.122 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:49.126 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2a28fbac-43c3-4011-b2c3-fb362021a11a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:49.141 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[609f85a2-01d2-4ffe-8a58-bcd03afb06fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:49.143 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4b792369-3735-43bb-9ab0-6be6d50da532]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:49.157 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[40f24ee9-1bc0-4030-906e-2f60945bc983]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 535545, 'reachable_time': 35006, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232838, 'error': None, 'target': 'ovnmeta-eddf21f6-fbb7-4dbb-a219-f930432ddc71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:49.160 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-eddf21f6-fbb7-4dbb-a219-f930432ddc71 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:30:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:49.160 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[b61a3020-68c1-402d-9c36-a9ec9ef81401]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:49 compute-0 systemd[1]: run-netns-ovnmeta\x2deddf21f6\x2dfbb7\x2d4dbb\x2da219\x2df930432ddc71.mount: Deactivated successfully.
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.176 183079 DEBUG nova.compute.manager [req-f97e81eb-9b28-4ced-885b-b510200aaeae req-99d235de-5a1c-4223-89b9-5ba55b41322a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Received event network-vif-unplugged-c9f56d85-9dbf-4ec0-8ee7-82cf5387b536 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.176 183079 DEBUG oslo_concurrency.lockutils [req-f97e81eb-9b28-4ced-885b-b510200aaeae req-99d235de-5a1c-4223-89b9-5ba55b41322a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.176 183079 DEBUG oslo_concurrency.lockutils [req-f97e81eb-9b28-4ced-885b-b510200aaeae req-99d235de-5a1c-4223-89b9-5ba55b41322a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.177 183079 DEBUG oslo_concurrency.lockutils [req-f97e81eb-9b28-4ced-885b-b510200aaeae req-99d235de-5a1c-4223-89b9-5ba55b41322a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.177 183079 DEBUG nova.compute.manager [req-f97e81eb-9b28-4ced-885b-b510200aaeae req-99d235de-5a1c-4223-89b9-5ba55b41322a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] No waiting events found dispatching network-vif-unplugged-c9f56d85-9dbf-4ec0-8ee7-82cf5387b536 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.177 183079 DEBUG nova.compute.manager [req-f97e81eb-9b28-4ced-885b-b510200aaeae req-99d235de-5a1c-4223-89b9-5ba55b41322a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Received event network-vif-unplugged-c9f56d85-9dbf-4ec0-8ee7-82cf5387b536 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.515 183079 DEBUG oslo_concurrency.lockutils [None req-fb50d3ce-d737-4578-9f55-dbfe5d24b854 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Acquiring lock "e8d327ba-899a-4ec7-8b78-0fb6dedf7371" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.515 183079 DEBUG oslo_concurrency.lockutils [None req-fb50d3ce-d737-4578-9f55-dbfe5d24b854 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "e8d327ba-899a-4ec7-8b78-0fb6dedf7371" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.515 183079 DEBUG oslo_concurrency.lockutils [None req-fb50d3ce-d737-4578-9f55-dbfe5d24b854 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Acquiring lock "e8d327ba-899a-4ec7-8b78-0fb6dedf7371-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.516 183079 DEBUG oslo_concurrency.lockutils [None req-fb50d3ce-d737-4578-9f55-dbfe5d24b854 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "e8d327ba-899a-4ec7-8b78-0fb6dedf7371-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.516 183079 DEBUG oslo_concurrency.lockutils [None req-fb50d3ce-d737-4578-9f55-dbfe5d24b854 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "e8d327ba-899a-4ec7-8b78-0fb6dedf7371-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.517 183079 INFO nova.compute.manager [None req-fb50d3ce-d737-4578-9f55-dbfe5d24b854 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Terminating instance
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.518 183079 DEBUG nova.compute.manager [None req-fb50d3ce-d737-4578-9f55-dbfe5d24b854 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:30:49 compute-0 kernel: tapcd31a146-70 (unregistering): left promiscuous mode
Jan 22 17:30:49 compute-0 NetworkManager[55454]: <info>  [1769103049.5439] device (tapcd31a146-70): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.550 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:49 compute-0 ovn_controller[95372]: 2026-01-22T17:30:49Z|00581|binding|INFO|Releasing lport cd31a146-70f5-4610-88f1-ae4772887ce2 from this chassis (sb_readonly=0)
Jan 22 17:30:49 compute-0 ovn_controller[95372]: 2026-01-22T17:30:49Z|00582|binding|INFO|Setting lport cd31a146-70f5-4610-88f1-ae4772887ce2 down in Southbound
Jan 22 17:30:49 compute-0 ovn_controller[95372]: 2026-01-22T17:30:49Z|00583|binding|INFO|Removing iface tapcd31a146-70 ovn-installed in OVS
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.552 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:49.556 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:f4:bc 10.100.0.10'], port_security=['fa:16:3e:f2:f4:bc 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-port-288740050', 'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'e8d327ba-899a-4ec7-8b78-0fb6dedf7371', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-012007cf-673c-4f83-a4b9-f21a913a1ccf', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-port-288740050', 'neutron:project_id': '7ff1e5ce4806445a8e463c71b6930bec', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'a8d0872f-de3c-4404-94dc-7a328e5b8aa9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.175'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a510149a-49ad-47bc-a8ad-05908544b3cc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=cd31a146-70f5-4610-88f1-ae4772887ce2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:30:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:49.557 104629 INFO neutron.agent.ovn.metadata.agent [-] Port cd31a146-70f5-4610-88f1-ae4772887ce2 in datapath 012007cf-673c-4f83-a4b9-f21a913a1ccf unbound from our chassis
Jan 22 17:30:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:49.559 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 012007cf-673c-4f83-a4b9-f21a913a1ccf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:30:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:49.559 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f575105f-ed89-4b94-95e2-1c1fc92fb6ce]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:49.560 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf namespace which is not needed anymore
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.566 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:49 compute-0 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d00000034.scope: Deactivated successfully.
Jan 22 17:30:49 compute-0 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d00000034.scope: Consumed 12.969s CPU time.
Jan 22 17:30:49 compute-0 systemd-machined[154382]: Machine qemu-52-instance-00000034 terminated.
Jan 22 17:30:49 compute-0 neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf[232572]: [NOTICE]   (232576) : haproxy version is 2.8.14-c23fe91
Jan 22 17:30:49 compute-0 neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf[232572]: [NOTICE]   (232576) : path to executable is /usr/sbin/haproxy
Jan 22 17:30:49 compute-0 neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf[232572]: [WARNING]  (232576) : Exiting Master process...
Jan 22 17:30:49 compute-0 neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf[232572]: [ALERT]    (232576) : Current worker (232578) exited with code 143 (Terminated)
Jan 22 17:30:49 compute-0 neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf[232572]: [WARNING]  (232576) : All workers exited. Exiting... (0)
Jan 22 17:30:49 compute-0 systemd[1]: libpod-28d30c27807f19cfb0ffe9a4b80b1baed5e1e5d735c733d0636513a5c3b1862d.scope: Deactivated successfully.
Jan 22 17:30:49 compute-0 podman[232861]: 2026-01-22 17:30:49.679257174 +0000 UTC m=+0.041526631 container died 28d30c27807f19cfb0ffe9a4b80b1baed5e1e5d735c733d0636513a5c3b1862d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:30:49 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-28d30c27807f19cfb0ffe9a4b80b1baed5e1e5d735c733d0636513a5c3b1862d-userdata-shm.mount: Deactivated successfully.
Jan 22 17:30:49 compute-0 systemd[1]: var-lib-containers-storage-overlay-f2be381dabb6ae08debb8105cbeeb78f07eeefee4bed9784a017aacf683a33f8-merged.mount: Deactivated successfully.
Jan 22 17:30:49 compute-0 podman[232861]: 2026-01-22 17:30:49.707402073 +0000 UTC m=+0.069671530 container cleanup 28d30c27807f19cfb0ffe9a4b80b1baed5e1e5d735c733d0636513a5c3b1862d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:30:49 compute-0 systemd[1]: libpod-conmon-28d30c27807f19cfb0ffe9a4b80b1baed5e1e5d735c733d0636513a5c3b1862d.scope: Deactivated successfully.
Jan 22 17:30:49 compute-0 kernel: tapcd31a146-70: entered promiscuous mode
Jan 22 17:30:49 compute-0 NetworkManager[55454]: <info>  [1769103049.7369] manager: (tapcd31a146-70): new Tun device (/org/freedesktop/NetworkManager/Devices/247)
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.738 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:49 compute-0 ovn_controller[95372]: 2026-01-22T17:30:49Z|00584|binding|INFO|Claiming lport cd31a146-70f5-4610-88f1-ae4772887ce2 for this chassis.
Jan 22 17:30:49 compute-0 ovn_controller[95372]: 2026-01-22T17:30:49Z|00585|binding|INFO|cd31a146-70f5-4610-88f1-ae4772887ce2: Claiming fa:16:3e:f2:f4:bc 10.100.0.10
Jan 22 17:30:49 compute-0 kernel: tapcd31a146-70 (unregistering): left promiscuous mode
Jan 22 17:30:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:49.744 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:f4:bc 10.100.0.10'], port_security=['fa:16:3e:f2:f4:bc 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-port-288740050', 'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'e8d327ba-899a-4ec7-8b78-0fb6dedf7371', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-012007cf-673c-4f83-a4b9-f21a913a1ccf', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-port-288740050', 'neutron:project_id': '7ff1e5ce4806445a8e463c71b6930bec', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'a8d0872f-de3c-4404-94dc-7a328e5b8aa9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.175'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a510149a-49ad-47bc-a8ad-05908544b3cc, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=cd31a146-70f5-4610-88f1-ae4772887ce2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:30:49 compute-0 ovn_controller[95372]: 2026-01-22T17:30:49Z|00586|binding|INFO|Releasing lport cd31a146-70f5-4610-88f1-ae4772887ce2 from this chassis (sb_readonly=0)
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.760 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:49.766 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:f4:bc 10.100.0.10'], port_security=['fa:16:3e:f2:f4:bc 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-port-288740050', 'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'e8d327ba-899a-4ec7-8b78-0fb6dedf7371', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-012007cf-673c-4f83-a4b9-f21a913a1ccf', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-port-288740050', 'neutron:project_id': '7ff1e5ce4806445a8e463c71b6930bec', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'a8d0872f-de3c-4404-94dc-7a328e5b8aa9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.175'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a510149a-49ad-47bc-a8ad-05908544b3cc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=cd31a146-70f5-4610-88f1-ae4772887ce2) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:30:49 compute-0 podman[232892]: 2026-01-22 17:30:49.7754682 +0000 UTC m=+0.047310533 container remove 28d30c27807f19cfb0ffe9a4b80b1baed5e1e5d735c733d0636513a5c3b1862d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.777 183079 INFO nova.virt.libvirt.driver [-] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Instance destroyed successfully.
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.777 183079 DEBUG nova.objects.instance [None req-fb50d3ce-d737-4578-9f55-dbfe5d24b854 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lazy-loading 'resources' on Instance uuid e8d327ba-899a-4ec7-8b78-0fb6dedf7371 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:30:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:49.780 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[72d1177e-c084-4741-840c-42dc7050aaad]: (4, ('Thu Jan 22 05:30:49 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf (28d30c27807f19cfb0ffe9a4b80b1baed5e1e5d735c733d0636513a5c3b1862d)\n28d30c27807f19cfb0ffe9a4b80b1baed5e1e5d735c733d0636513a5c3b1862d\nThu Jan 22 05:30:49 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf (28d30c27807f19cfb0ffe9a4b80b1baed5e1e5d735c733d0636513a5c3b1862d)\n28d30c27807f19cfb0ffe9a4b80b1baed5e1e5d735c733d0636513a5c3b1862d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:49.781 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[242c563b-27db-4eaa-9d0f-b7e36d2b5524]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:49.782 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap012007cf-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.783 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:49 compute-0 kernel: tap012007cf-60: left promiscuous mode
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.798 183079 DEBUG nova.virt.libvirt.vif [None req-fb50d3ce-d737-4578-9f55-dbfe5d24b854 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:30:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-691065204',display_name='tempest-server-test-691065204',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-691065204',id=52,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHQrJNwmVmi0v3CvDVkdf2ULfmIKW9OE2obw9UEIh0JilIeeueUzwA1cDH+T5CoOIGZz/satGSZDSgKqtLklRpNQ/Wm6QLNBLAjV/3q74U9Y8J0BPwM5hIfTkFkrKFKf2g==',key_name='tempest-keypair-test-1359386190',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:30:24Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7ff1e5ce4806445a8e463c71b6930bec',ramdisk_id='',reservation_id='r-r8poukwg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-PortsTest-1337721110',owner_user_name='tempest-PortsTest-1337721110-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:30:24Z,user_data=None,user_id='3a896d4927d442ffba421873948034be',uuid=e8d327ba-899a-4ec7-8b78-0fb6dedf7371,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cd31a146-70f5-4610-88f1-ae4772887ce2", "address": "fa:16:3e:f2:f4:bc", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd31a146-70", "ovs_interfaceid": "cd31a146-70f5-4610-88f1-ae4772887ce2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.798 183079 DEBUG nova.network.os_vif_util [None req-fb50d3ce-d737-4578-9f55-dbfe5d24b854 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Converting VIF {"id": "cd31a146-70f5-4610-88f1-ae4772887ce2", "address": "fa:16:3e:f2:f4:bc", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd31a146-70", "ovs_interfaceid": "cd31a146-70f5-4610-88f1-ae4772887ce2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.799 183079 DEBUG nova.network.os_vif_util [None req-fb50d3ce-d737-4578-9f55-dbfe5d24b854 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:f4:bc,bridge_name='br-int',has_traffic_filtering=True,id=cd31a146-70f5-4610-88f1-ae4772887ce2,network=Network(012007cf-673c-4f83-a4b9-f21a913a1ccf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcd31a146-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.799 183079 DEBUG os_vif [None req-fb50d3ce-d737-4578-9f55-dbfe5d24b854 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:f4:bc,bridge_name='br-int',has_traffic_filtering=True,id=cd31a146-70f5-4610-88f1-ae4772887ce2,network=Network(012007cf-673c-4f83-a4b9-f21a913a1ccf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcd31a146-70') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.800 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:49.800 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[58fe677a-6665-4984-90ad-ecf62e40c78a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.801 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd31a146-70, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.802 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.804 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.806 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.807 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.808 183079 INFO os_vif [None req-fb50d3ce-d737-4578-9f55-dbfe5d24b854 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:f4:bc,bridge_name='br-int',has_traffic_filtering=True,id=cd31a146-70f5-4610-88f1-ae4772887ce2,network=Network(012007cf-673c-4f83-a4b9-f21a913a1ccf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcd31a146-70')
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.809 183079 INFO nova.virt.libvirt.driver [None req-fb50d3ce-d737-4578-9f55-dbfe5d24b854 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Deleting instance files /var/lib/nova/instances/e8d327ba-899a-4ec7-8b78-0fb6dedf7371_del
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.809 183079 INFO nova.virt.libvirt.driver [None req-fb50d3ce-d737-4578-9f55-dbfe5d24b854 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Deletion of /var/lib/nova/instances/e8d327ba-899a-4ec7-8b78-0fb6dedf7371_del complete
Jan 22 17:30:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:49.819 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[95266248-7944-4cc9-a3fd-fdc2c1cc6e35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:49.820 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[78ae4589-2d37-4195-b074-b6e34a669e4f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:49.834 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[457c576e-fac9-41c1-9370-cb9ecc7c1163]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 538683, 'reachable_time': 37735, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232920, 'error': None, 'target': 'ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:49.835 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-012007cf-673c-4f83-a4b9-f21a913a1ccf deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:30:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:49.836 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[96493975-7fd9-4cf7-814b-d4406f04d755]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:49.836 104629 INFO neutron.agent.ovn.metadata.agent [-] Port cd31a146-70f5-4610-88f1-ae4772887ce2 in datapath 012007cf-673c-4f83-a4b9-f21a913a1ccf unbound from our chassis
Jan 22 17:30:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:49.837 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 012007cf-673c-4f83-a4b9-f21a913a1ccf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:30:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:49.837 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a66c3448-ed45-43f7-a1d0-49d9f7b94f95]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:49.838 104629 INFO neutron.agent.ovn.metadata.agent [-] Port cd31a146-70f5-4610-88f1-ae4772887ce2 in datapath 012007cf-673c-4f83-a4b9-f21a913a1ccf unbound from our chassis
Jan 22 17:30:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:49.839 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 012007cf-673c-4f83-a4b9-f21a913a1ccf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:30:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:30:49.839 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[adc7007b-158e-46f4-82f9-c90678a4352e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.876 183079 INFO nova.compute.manager [None req-fb50d3ce-d737-4578-9f55-dbfe5d24b854 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Took 0.36 seconds to destroy the instance on the hypervisor.
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.876 183079 DEBUG oslo.service.loopingcall [None req-fb50d3ce-d737-4578-9f55-dbfe5d24b854 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.877 183079 DEBUG nova.compute.manager [-] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:30:49 compute-0 nova_compute[183075]: 2026-01-22 17:30:49.877 183079 DEBUG nova.network.neutron [-] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:30:49 compute-0 systemd[1]: run-netns-ovnmeta\x2d012007cf\x2d673c\x2d4f83\x2da4b9\x2df21a913a1ccf.mount: Deactivated successfully.
Jan 22 17:30:50 compute-0 nova_compute[183075]: 2026-01-22 17:30:50.925 183079 DEBUG nova.network.neutron [req-41b4340d-a23b-4013-9914-79fabff937e2 req-6470f317-0d33-4229-adcc-04e4ba03b6c8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Updated VIF entry in instance network info cache for port c9f56d85-9dbf-4ec0-8ee7-82cf5387b536. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:30:50 compute-0 nova_compute[183075]: 2026-01-22 17:30:50.926 183079 DEBUG nova.network.neutron [req-41b4340d-a23b-4013-9914-79fabff937e2 req-6470f317-0d33-4229-adcc-04e4ba03b6c8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Updating instance_info_cache with network_info: [{"id": "c9f56d85-9dbf-4ec0-8ee7-82cf5387b536", "address": "fa:16:3e:90:3e:82", "network": {"id": "eddf21f6-fbb7-4dbb-a219-f930432ddc71", "bridge": "br-int", "label": "tempest-internal-dns-test-network-1900776616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "89916c03f6f440f6ae7cf81f2ae99bad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9f56d85-9d", "ovs_interfaceid": "c9f56d85-9dbf-4ec0-8ee7-82cf5387b536", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:30:50 compute-0 nova_compute[183075]: 2026-01-22 17:30:50.948 183079 DEBUG oslo_concurrency.lockutils [req-41b4340d-a23b-4013-9914-79fabff937e2 req-6470f317-0d33-4229-adcc-04e4ba03b6c8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-ae41baf4-b0eb-4402-aebe-718f5c7f3ed9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:30:51 compute-0 nova_compute[183075]: 2026-01-22 17:30:51.675 183079 DEBUG nova.compute.manager [req-d58d186c-b5bc-4187-8794-6f8ca3a793e1 req-3c28765c-9473-4f7b-a45c-5bdfe40e389c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Received event network-changed-cd31a146-70f5-4610-88f1-ae4772887ce2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:30:51 compute-0 nova_compute[183075]: 2026-01-22 17:30:51.675 183079 DEBUG nova.compute.manager [req-d58d186c-b5bc-4187-8794-6f8ca3a793e1 req-3c28765c-9473-4f7b-a45c-5bdfe40e389c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Refreshing instance network info cache due to event network-changed-cd31a146-70f5-4610-88f1-ae4772887ce2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:30:51 compute-0 nova_compute[183075]: 2026-01-22 17:30:51.675 183079 DEBUG oslo_concurrency.lockutils [req-d58d186c-b5bc-4187-8794-6f8ca3a793e1 req-3c28765c-9473-4f7b-a45c-5bdfe40e389c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-e8d327ba-899a-4ec7-8b78-0fb6dedf7371" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:30:51 compute-0 nova_compute[183075]: 2026-01-22 17:30:51.676 183079 DEBUG oslo_concurrency.lockutils [req-d58d186c-b5bc-4187-8794-6f8ca3a793e1 req-3c28765c-9473-4f7b-a45c-5bdfe40e389c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-e8d327ba-899a-4ec7-8b78-0fb6dedf7371" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:30:51 compute-0 nova_compute[183075]: 2026-01-22 17:30:51.676 183079 DEBUG nova.network.neutron [req-d58d186c-b5bc-4187-8794-6f8ca3a793e1 req-3c28765c-9473-4f7b-a45c-5bdfe40e389c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Refreshing network info cache for port cd31a146-70f5-4610-88f1-ae4772887ce2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:30:52 compute-0 nova_compute[183075]: 2026-01-22 17:30:52.313 183079 DEBUG nova.network.neutron [-] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:30:52 compute-0 nova_compute[183075]: 2026-01-22 17:30:52.331 183079 INFO nova.compute.manager [-] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Took 3.23 seconds to deallocate network for instance.
Jan 22 17:30:52 compute-0 nova_compute[183075]: 2026-01-22 17:30:52.383 183079 DEBUG oslo_concurrency.lockutils [None req-2ae0d16f-43c7-46ca-9802-c072b9bd585b 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:30:52 compute-0 nova_compute[183075]: 2026-01-22 17:30:52.384 183079 DEBUG oslo_concurrency.lockutils [None req-2ae0d16f-43c7-46ca-9802-c072b9bd585b 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:30:52 compute-0 nova_compute[183075]: 2026-01-22 17:30:52.473 183079 DEBUG nova.compute.provider_tree [None req-2ae0d16f-43c7-46ca-9802-c072b9bd585b 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:30:52 compute-0 nova_compute[183075]: 2026-01-22 17:30:52.492 183079 DEBUG nova.scheduler.client.report [None req-2ae0d16f-43c7-46ca-9802-c072b9bd585b 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:30:52 compute-0 nova_compute[183075]: 2026-01-22 17:30:52.520 183079 DEBUG oslo_concurrency.lockutils [None req-2ae0d16f-43c7-46ca-9802-c072b9bd585b 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:30:52 compute-0 nova_compute[183075]: 2026-01-22 17:30:52.539 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:52 compute-0 nova_compute[183075]: 2026-01-22 17:30:52.547 183079 INFO nova.scheduler.client.report [None req-2ae0d16f-43c7-46ca-9802-c072b9bd585b 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Deleted allocations for instance ae41baf4-b0eb-4402-aebe-718f5c7f3ed9
Jan 22 17:30:52 compute-0 nova_compute[183075]: 2026-01-22 17:30:52.631 183079 DEBUG oslo_concurrency.lockutils [None req-2ae0d16f-43c7-46ca-9802-c072b9bd585b 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "ae41baf4-b0eb-4402-aebe-718f5c7f3ed9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.898s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:30:52 compute-0 nova_compute[183075]: 2026-01-22 17:30:52.641 183079 DEBUG nova.network.neutron [-] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:30:52 compute-0 nova_compute[183075]: 2026-01-22 17:30:52.666 183079 INFO nova.compute.manager [-] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Took 2.79 seconds to deallocate network for instance.
Jan 22 17:30:52 compute-0 nova_compute[183075]: 2026-01-22 17:30:52.712 183079 DEBUG oslo_concurrency.lockutils [None req-fb50d3ce-d737-4578-9f55-dbfe5d24b854 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:30:52 compute-0 nova_compute[183075]: 2026-01-22 17:30:52.713 183079 DEBUG oslo_concurrency.lockutils [None req-fb50d3ce-d737-4578-9f55-dbfe5d24b854 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:30:52 compute-0 nova_compute[183075]: 2026-01-22 17:30:52.778 183079 DEBUG nova.compute.provider_tree [None req-fb50d3ce-d737-4578-9f55-dbfe5d24b854 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:30:53 compute-0 nova_compute[183075]: 2026-01-22 17:30:53.254 183079 DEBUG nova.scheduler.client.report [None req-fb50d3ce-d737-4578-9f55-dbfe5d24b854 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:30:53 compute-0 nova_compute[183075]: 2026-01-22 17:30:53.290 183079 DEBUG oslo_concurrency.lockutils [None req-fb50d3ce-d737-4578-9f55-dbfe5d24b854 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.577s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:30:53 compute-0 nova_compute[183075]: 2026-01-22 17:30:53.311 183079 DEBUG nova.network.neutron [req-d58d186c-b5bc-4187-8794-6f8ca3a793e1 req-3c28765c-9473-4f7b-a45c-5bdfe40e389c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Updated VIF entry in instance network info cache for port cd31a146-70f5-4610-88f1-ae4772887ce2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:30:53 compute-0 nova_compute[183075]: 2026-01-22 17:30:53.311 183079 DEBUG nova.network.neutron [req-d58d186c-b5bc-4187-8794-6f8ca3a793e1 req-3c28765c-9473-4f7b-a45c-5bdfe40e389c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Updating instance_info_cache with network_info: [{"id": "cd31a146-70f5-4610-88f1-ae4772887ce2", "address": "fa:16:3e:f2:f4:bc", "network": {"id": "012007cf-673c-4f83-a4b9-f21a913a1ccf", "bridge": "br-int", "label": "tempest-test-network--33826421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ff1e5ce4806445a8e463c71b6930bec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd31a146-70", "ovs_interfaceid": "cd31a146-70f5-4610-88f1-ae4772887ce2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:30:53 compute-0 nova_compute[183075]: 2026-01-22 17:30:53.314 183079 INFO nova.scheduler.client.report [None req-fb50d3ce-d737-4578-9f55-dbfe5d24b854 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Deleted allocations for instance e8d327ba-899a-4ec7-8b78-0fb6dedf7371
Jan 22 17:30:53 compute-0 nova_compute[183075]: 2026-01-22 17:30:53.349 183079 DEBUG oslo_concurrency.lockutils [req-d58d186c-b5bc-4187-8794-6f8ca3a793e1 req-3c28765c-9473-4f7b-a45c-5bdfe40e389c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-e8d327ba-899a-4ec7-8b78-0fb6dedf7371" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:30:53 compute-0 nova_compute[183075]: 2026-01-22 17:30:53.349 183079 DEBUG nova.compute.manager [req-d58d186c-b5bc-4187-8794-6f8ca3a793e1 req-3c28765c-9473-4f7b-a45c-5bdfe40e389c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Received event network-vif-plugged-c9f56d85-9dbf-4ec0-8ee7-82cf5387b536 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:30:53 compute-0 nova_compute[183075]: 2026-01-22 17:30:53.350 183079 DEBUG oslo_concurrency.lockutils [req-d58d186c-b5bc-4187-8794-6f8ca3a793e1 req-3c28765c-9473-4f7b-a45c-5bdfe40e389c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:30:53 compute-0 nova_compute[183075]: 2026-01-22 17:30:53.350 183079 DEBUG oslo_concurrency.lockutils [req-d58d186c-b5bc-4187-8794-6f8ca3a793e1 req-3c28765c-9473-4f7b-a45c-5bdfe40e389c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:30:53 compute-0 nova_compute[183075]: 2026-01-22 17:30:53.350 183079 DEBUG oslo_concurrency.lockutils [req-d58d186c-b5bc-4187-8794-6f8ca3a793e1 req-3c28765c-9473-4f7b-a45c-5bdfe40e389c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "ae41baf4-b0eb-4402-aebe-718f5c7f3ed9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:30:53 compute-0 nova_compute[183075]: 2026-01-22 17:30:53.350 183079 DEBUG nova.compute.manager [req-d58d186c-b5bc-4187-8794-6f8ca3a793e1 req-3c28765c-9473-4f7b-a45c-5bdfe40e389c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] No waiting events found dispatching network-vif-plugged-c9f56d85-9dbf-4ec0-8ee7-82cf5387b536 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:30:53 compute-0 nova_compute[183075]: 2026-01-22 17:30:53.351 183079 WARNING nova.compute.manager [req-d58d186c-b5bc-4187-8794-6f8ca3a793e1 req-3c28765c-9473-4f7b-a45c-5bdfe40e389c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Received unexpected event network-vif-plugged-c9f56d85-9dbf-4ec0-8ee7-82cf5387b536 for instance with vm_state active and task_state deleting.
Jan 22 17:30:53 compute-0 nova_compute[183075]: 2026-01-22 17:30:53.351 183079 DEBUG nova.compute.manager [req-d58d186c-b5bc-4187-8794-6f8ca3a793e1 req-3c28765c-9473-4f7b-a45c-5bdfe40e389c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Received event network-vif-unplugged-cd31a146-70f5-4610-88f1-ae4772887ce2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:30:53 compute-0 nova_compute[183075]: 2026-01-22 17:30:53.351 183079 DEBUG oslo_concurrency.lockutils [req-d58d186c-b5bc-4187-8794-6f8ca3a793e1 req-3c28765c-9473-4f7b-a45c-5bdfe40e389c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "e8d327ba-899a-4ec7-8b78-0fb6dedf7371-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:30:53 compute-0 nova_compute[183075]: 2026-01-22 17:30:53.352 183079 DEBUG oslo_concurrency.lockutils [req-d58d186c-b5bc-4187-8794-6f8ca3a793e1 req-3c28765c-9473-4f7b-a45c-5bdfe40e389c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e8d327ba-899a-4ec7-8b78-0fb6dedf7371-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:30:53 compute-0 nova_compute[183075]: 2026-01-22 17:30:53.352 183079 DEBUG oslo_concurrency.lockutils [req-d58d186c-b5bc-4187-8794-6f8ca3a793e1 req-3c28765c-9473-4f7b-a45c-5bdfe40e389c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e8d327ba-899a-4ec7-8b78-0fb6dedf7371-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:30:53 compute-0 nova_compute[183075]: 2026-01-22 17:30:53.352 183079 DEBUG nova.compute.manager [req-d58d186c-b5bc-4187-8794-6f8ca3a793e1 req-3c28765c-9473-4f7b-a45c-5bdfe40e389c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] No waiting events found dispatching network-vif-unplugged-cd31a146-70f5-4610-88f1-ae4772887ce2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:30:53 compute-0 nova_compute[183075]: 2026-01-22 17:30:53.352 183079 DEBUG nova.compute.manager [req-d58d186c-b5bc-4187-8794-6f8ca3a793e1 req-3c28765c-9473-4f7b-a45c-5bdfe40e389c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Received event network-vif-unplugged-cd31a146-70f5-4610-88f1-ae4772887ce2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:30:53 compute-0 nova_compute[183075]: 2026-01-22 17:30:53.353 183079 DEBUG nova.compute.manager [req-d58d186c-b5bc-4187-8794-6f8ca3a793e1 req-3c28765c-9473-4f7b-a45c-5bdfe40e389c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Received event network-vif-plugged-cd31a146-70f5-4610-88f1-ae4772887ce2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:30:53 compute-0 nova_compute[183075]: 2026-01-22 17:30:53.353 183079 DEBUG oslo_concurrency.lockutils [req-d58d186c-b5bc-4187-8794-6f8ca3a793e1 req-3c28765c-9473-4f7b-a45c-5bdfe40e389c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "e8d327ba-899a-4ec7-8b78-0fb6dedf7371-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:30:53 compute-0 nova_compute[183075]: 2026-01-22 17:30:53.353 183079 DEBUG oslo_concurrency.lockutils [req-d58d186c-b5bc-4187-8794-6f8ca3a793e1 req-3c28765c-9473-4f7b-a45c-5bdfe40e389c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e8d327ba-899a-4ec7-8b78-0fb6dedf7371-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:30:53 compute-0 nova_compute[183075]: 2026-01-22 17:30:53.354 183079 DEBUG oslo_concurrency.lockutils [req-d58d186c-b5bc-4187-8794-6f8ca3a793e1 req-3c28765c-9473-4f7b-a45c-5bdfe40e389c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e8d327ba-899a-4ec7-8b78-0fb6dedf7371-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:30:53 compute-0 nova_compute[183075]: 2026-01-22 17:30:53.354 183079 DEBUG nova.compute.manager [req-d58d186c-b5bc-4187-8794-6f8ca3a793e1 req-3c28765c-9473-4f7b-a45c-5bdfe40e389c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] No waiting events found dispatching network-vif-plugged-cd31a146-70f5-4610-88f1-ae4772887ce2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:30:53 compute-0 nova_compute[183075]: 2026-01-22 17:30:53.354 183079 WARNING nova.compute.manager [req-d58d186c-b5bc-4187-8794-6f8ca3a793e1 req-3c28765c-9473-4f7b-a45c-5bdfe40e389c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Received unexpected event network-vif-plugged-cd31a146-70f5-4610-88f1-ae4772887ce2 for instance with vm_state active and task_state deleting.
Jan 22 17:30:53 compute-0 nova_compute[183075]: 2026-01-22 17:30:53.386 183079 DEBUG oslo_concurrency.lockutils [None req-fb50d3ce-d737-4578-9f55-dbfe5d24b854 3a896d4927d442ffba421873948034be 7ff1e5ce4806445a8e463c71b6930bec - - default default] Lock "e8d327ba-899a-4ec7-8b78-0fb6dedf7371" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.870s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:30:54 compute-0 nova_compute[183075]: 2026-01-22 17:30:54.802 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:57 compute-0 nova_compute[183075]: 2026-01-22 17:30:57.573 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:59 compute-0 nova_compute[183075]: 2026-01-22 17:30:59.103 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:59 compute-0 nova_compute[183075]: 2026-01-22 17:30:59.824 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:00 compute-0 podman[232923]: 2026-01-22 17:31:00.362844784 +0000 UTC m=+0.064903405 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 22 17:31:00 compute-0 podman[232924]: 2026-01-22 17:31:00.374881321 +0000 UTC m=+0.067327129 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, name=ubi9-minimal, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, managed_by=edpm_ansible, io.buildah.version=1.33.7, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 22 17:31:00 compute-0 podman[232922]: 2026-01-22 17:31:00.390268855 +0000 UTC m=+0.091720670 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:31:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:00.945 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:31:00 compute-0 nova_compute[183075]: 2026-01-22 17:31:00.945 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:00.947 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:31:02 compute-0 nova_compute[183075]: 2026-01-22 17:31:02.575 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:04 compute-0 nova_compute[183075]: 2026-01-22 17:31:04.002 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769103049.0006216, ae41baf4-b0eb-4402-aebe-718f5c7f3ed9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:31:04 compute-0 nova_compute[183075]: 2026-01-22 17:31:04.002 183079 INFO nova.compute.manager [-] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] VM Stopped (Lifecycle Event)
Jan 22 17:31:04 compute-0 nova_compute[183075]: 2026-01-22 17:31:04.023 183079 DEBUG nova.compute.manager [None req-e0e5f08c-81c1-4f02-b2eb-580cdc21896c - - - - - -] [instance: ae41baf4-b0eb-4402-aebe-718f5c7f3ed9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:31:04 compute-0 podman[232985]: 2026-01-22 17:31:04.355586832 +0000 UTC m=+0.060299974 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute)
Jan 22 17:31:04 compute-0 nova_compute[183075]: 2026-01-22 17:31:04.720 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:04 compute-0 nova_compute[183075]: 2026-01-22 17:31:04.777 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769103049.775701, e8d327ba-899a-4ec7-8b78-0fb6dedf7371 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:31:04 compute-0 nova_compute[183075]: 2026-01-22 17:31:04.777 183079 INFO nova.compute.manager [-] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] VM Stopped (Lifecycle Event)
Jan 22 17:31:04 compute-0 nova_compute[183075]: 2026-01-22 17:31:04.802 183079 DEBUG nova.compute.manager [None req-cb0ecb70-1533-46cd-9d86-1dda6ccb85d4 - - - - - -] [instance: e8d327ba-899a-4ec7-8b78-0fb6dedf7371] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:31:04 compute-0 nova_compute[183075]: 2026-01-22 17:31:04.824 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:04 compute-0 nova_compute[183075]: 2026-01-22 17:31:04.865 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:06.950 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:31:07 compute-0 nova_compute[183075]: 2026-01-22 17:31:07.577 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:09 compute-0 nova_compute[183075]: 2026-01-22 17:31:09.827 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:10 compute-0 nova_compute[183075]: 2026-01-22 17:31:10.455 183079 DEBUG oslo_concurrency.lockutils [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Acquiring lock "b0949fde-940d-495c-bdb0-e6c996b0274f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:31:10 compute-0 nova_compute[183075]: 2026-01-22 17:31:10.455 183079 DEBUG oslo_concurrency.lockutils [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "b0949fde-940d-495c-bdb0-e6c996b0274f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:31:10 compute-0 nova_compute[183075]: 2026-01-22 17:31:10.481 183079 DEBUG nova.compute.manager [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:31:10 compute-0 nova_compute[183075]: 2026-01-22 17:31:10.598 183079 DEBUG oslo_concurrency.lockutils [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:31:10 compute-0 nova_compute[183075]: 2026-01-22 17:31:10.598 183079 DEBUG oslo_concurrency.lockutils [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:31:10 compute-0 nova_compute[183075]: 2026-01-22 17:31:10.606 183079 DEBUG nova.virt.hardware [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:31:10 compute-0 nova_compute[183075]: 2026-01-22 17:31:10.607 183079 INFO nova.compute.claims [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:31:10 compute-0 nova_compute[183075]: 2026-01-22 17:31:10.733 183079 DEBUG nova.compute.provider_tree [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:31:10 compute-0 nova_compute[183075]: 2026-01-22 17:31:10.755 183079 DEBUG nova.scheduler.client.report [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:31:10 compute-0 nova_compute[183075]: 2026-01-22 17:31:10.788 183079 DEBUG oslo_concurrency.lockutils [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:31:10 compute-0 nova_compute[183075]: 2026-01-22 17:31:10.789 183079 DEBUG nova.compute.manager [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:31:10 compute-0 nova_compute[183075]: 2026-01-22 17:31:10.835 183079 DEBUG nova.compute.manager [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:31:10 compute-0 nova_compute[183075]: 2026-01-22 17:31:10.836 183079 DEBUG nova.network.neutron [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:31:10 compute-0 nova_compute[183075]: 2026-01-22 17:31:10.853 183079 INFO nova.virt.libvirt.driver [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:31:10 compute-0 nova_compute[183075]: 2026-01-22 17:31:10.870 183079 DEBUG nova.compute.manager [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:31:10 compute-0 nova_compute[183075]: 2026-01-22 17:31:10.957 183079 DEBUG nova.compute.manager [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:31:10 compute-0 nova_compute[183075]: 2026-01-22 17:31:10.958 183079 DEBUG nova.virt.libvirt.driver [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:31:10 compute-0 nova_compute[183075]: 2026-01-22 17:31:10.959 183079 INFO nova.virt.libvirt.driver [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Creating image(s)
Jan 22 17:31:10 compute-0 nova_compute[183075]: 2026-01-22 17:31:10.959 183079 DEBUG oslo_concurrency.lockutils [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Acquiring lock "/var/lib/nova/instances/b0949fde-940d-495c-bdb0-e6c996b0274f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:31:10 compute-0 nova_compute[183075]: 2026-01-22 17:31:10.959 183079 DEBUG oslo_concurrency.lockutils [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "/var/lib/nova/instances/b0949fde-940d-495c-bdb0-e6c996b0274f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:31:10 compute-0 nova_compute[183075]: 2026-01-22 17:31:10.960 183079 DEBUG oslo_concurrency.lockutils [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "/var/lib/nova/instances/b0949fde-940d-495c-bdb0-e6c996b0274f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:31:10 compute-0 nova_compute[183075]: 2026-01-22 17:31:10.972 183079 DEBUG oslo_concurrency.processutils [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:31:11 compute-0 nova_compute[183075]: 2026-01-22 17:31:11.030 183079 DEBUG oslo_concurrency.processutils [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:31:11 compute-0 nova_compute[183075]: 2026-01-22 17:31:11.031 183079 DEBUG oslo_concurrency.lockutils [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:31:11 compute-0 nova_compute[183075]: 2026-01-22 17:31:11.032 183079 DEBUG oslo_concurrency.lockutils [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:31:11 compute-0 nova_compute[183075]: 2026-01-22 17:31:11.043 183079 DEBUG oslo_concurrency.processutils [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:31:11 compute-0 nova_compute[183075]: 2026-01-22 17:31:11.100 183079 DEBUG oslo_concurrency.processutils [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:31:11 compute-0 nova_compute[183075]: 2026-01-22 17:31:11.101 183079 DEBUG oslo_concurrency.processutils [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/b0949fde-940d-495c-bdb0-e6c996b0274f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:31:11 compute-0 nova_compute[183075]: 2026-01-22 17:31:11.135 183079 DEBUG oslo_concurrency.processutils [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/b0949fde-940d-495c-bdb0-e6c996b0274f/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:31:11 compute-0 nova_compute[183075]: 2026-01-22 17:31:11.136 183079 DEBUG oslo_concurrency.lockutils [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:31:11 compute-0 nova_compute[183075]: 2026-01-22 17:31:11.136 183079 DEBUG oslo_concurrency.processutils [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:31:11 compute-0 nova_compute[183075]: 2026-01-22 17:31:11.188 183079 DEBUG oslo_concurrency.processutils [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:31:11 compute-0 nova_compute[183075]: 2026-01-22 17:31:11.189 183079 DEBUG nova.virt.disk.api [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Checking if we can resize image /var/lib/nova/instances/b0949fde-940d-495c-bdb0-e6c996b0274f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:31:11 compute-0 nova_compute[183075]: 2026-01-22 17:31:11.189 183079 DEBUG oslo_concurrency.processutils [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0949fde-940d-495c-bdb0-e6c996b0274f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:31:11 compute-0 nova_compute[183075]: 2026-01-22 17:31:11.246 183079 DEBUG oslo_concurrency.processutils [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0949fde-940d-495c-bdb0-e6c996b0274f/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:31:11 compute-0 nova_compute[183075]: 2026-01-22 17:31:11.247 183079 DEBUG nova.virt.disk.api [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Cannot resize image /var/lib/nova/instances/b0949fde-940d-495c-bdb0-e6c996b0274f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:31:11 compute-0 nova_compute[183075]: 2026-01-22 17:31:11.248 183079 DEBUG nova.objects.instance [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lazy-loading 'migration_context' on Instance uuid b0949fde-940d-495c-bdb0-e6c996b0274f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:31:11 compute-0 nova_compute[183075]: 2026-01-22 17:31:11.262 183079 DEBUG nova.virt.libvirt.driver [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:31:11 compute-0 nova_compute[183075]: 2026-01-22 17:31:11.263 183079 DEBUG nova.virt.libvirt.driver [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Ensure instance console log exists: /var/lib/nova/instances/b0949fde-940d-495c-bdb0-e6c996b0274f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:31:11 compute-0 nova_compute[183075]: 2026-01-22 17:31:11.263 183079 DEBUG oslo_concurrency.lockutils [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:31:11 compute-0 nova_compute[183075]: 2026-01-22 17:31:11.264 183079 DEBUG oslo_concurrency.lockutils [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:31:11 compute-0 nova_compute[183075]: 2026-01-22 17:31:11.264 183079 DEBUG oslo_concurrency.lockutils [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:31:11 compute-0 nova_compute[183075]: 2026-01-22 17:31:11.542 183079 DEBUG nova.policy [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:31:12 compute-0 nova_compute[183075]: 2026-01-22 17:31:12.579 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:13 compute-0 nova_compute[183075]: 2026-01-22 17:31:13.032 183079 DEBUG nova.network.neutron [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Successfully created port: 6aacbedb-6999-4006-9e77-6e540614dbea _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:31:14 compute-0 nova_compute[183075]: 2026-01-22 17:31:14.027 183079 DEBUG nova.network.neutron [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Successfully updated port: 6aacbedb-6999-4006-9e77-6e540614dbea _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:31:14 compute-0 nova_compute[183075]: 2026-01-22 17:31:14.210 183079 DEBUG nova.compute.manager [req-36f849ba-edf4-4df1-a247-05145cb204ce req-4a2bd5de-7637-4507-b13d-184289ac0058 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Received event network-changed-6aacbedb-6999-4006-9e77-6e540614dbea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:31:14 compute-0 nova_compute[183075]: 2026-01-22 17:31:14.211 183079 DEBUG nova.compute.manager [req-36f849ba-edf4-4df1-a247-05145cb204ce req-4a2bd5de-7637-4507-b13d-184289ac0058 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Refreshing instance network info cache due to event network-changed-6aacbedb-6999-4006-9e77-6e540614dbea. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:31:14 compute-0 nova_compute[183075]: 2026-01-22 17:31:14.211 183079 DEBUG oslo_concurrency.lockutils [req-36f849ba-edf4-4df1-a247-05145cb204ce req-4a2bd5de-7637-4507-b13d-184289ac0058 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-b0949fde-940d-495c-bdb0-e6c996b0274f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:31:14 compute-0 nova_compute[183075]: 2026-01-22 17:31:14.211 183079 DEBUG oslo_concurrency.lockutils [req-36f849ba-edf4-4df1-a247-05145cb204ce req-4a2bd5de-7637-4507-b13d-184289ac0058 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-b0949fde-940d-495c-bdb0-e6c996b0274f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:31:14 compute-0 nova_compute[183075]: 2026-01-22 17:31:14.212 183079 DEBUG nova.network.neutron [req-36f849ba-edf4-4df1-a247-05145cb204ce req-4a2bd5de-7637-4507-b13d-184289ac0058 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Refreshing network info cache for port 6aacbedb-6999-4006-9e77-6e540614dbea _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:31:14 compute-0 nova_compute[183075]: 2026-01-22 17:31:14.447 183079 DEBUG nova.network.neutron [req-36f849ba-edf4-4df1-a247-05145cb204ce req-4a2bd5de-7637-4507-b13d-184289ac0058 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:31:14 compute-0 nova_compute[183075]: 2026-01-22 17:31:14.830 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:14 compute-0 nova_compute[183075]: 2026-01-22 17:31:14.972 183079 DEBUG oslo_concurrency.lockutils [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Acquiring lock "refresh_cache-b0949fde-940d-495c-bdb0-e6c996b0274f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:31:15 compute-0 nova_compute[183075]: 2026-01-22 17:31:15.185 183079 DEBUG nova.network.neutron [req-36f849ba-edf4-4df1-a247-05145cb204ce req-4a2bd5de-7637-4507-b13d-184289ac0058 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:31:15 compute-0 nova_compute[183075]: 2026-01-22 17:31:15.205 183079 DEBUG oslo_concurrency.lockutils [req-36f849ba-edf4-4df1-a247-05145cb204ce req-4a2bd5de-7637-4507-b13d-184289ac0058 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-b0949fde-940d-495c-bdb0-e6c996b0274f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:31:15 compute-0 nova_compute[183075]: 2026-01-22 17:31:15.207 183079 DEBUG oslo_concurrency.lockutils [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Acquired lock "refresh_cache-b0949fde-940d-495c-bdb0-e6c996b0274f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:31:15 compute-0 nova_compute[183075]: 2026-01-22 17:31:15.207 183079 DEBUG nova.network.neutron [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:31:15 compute-0 podman[233023]: 2026-01-22 17:31:15.345569775 +0000 UTC m=+0.057597833 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:31:15 compute-0 nova_compute[183075]: 2026-01-22 17:31:15.387 183079 DEBUG nova.network.neutron [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:31:16 compute-0 nova_compute[183075]: 2026-01-22 17:31:16.397 183079 DEBUG nova.compute.manager [req-5bb87c05-14ab-4289-b8cf-a32bb7878b8c req-b5f516f0-9bd8-4a84-9736-dc52a48187f8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Received event network-changed-6aacbedb-6999-4006-9e77-6e540614dbea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:31:16 compute-0 nova_compute[183075]: 2026-01-22 17:31:16.397 183079 DEBUG nova.compute.manager [req-5bb87c05-14ab-4289-b8cf-a32bb7878b8c req-b5f516f0-9bd8-4a84-9736-dc52a48187f8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Refreshing instance network info cache due to event network-changed-6aacbedb-6999-4006-9e77-6e540614dbea. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:31:16 compute-0 nova_compute[183075]: 2026-01-22 17:31:16.397 183079 DEBUG oslo_concurrency.lockutils [req-5bb87c05-14ab-4289-b8cf-a32bb7878b8c req-b5f516f0-9bd8-4a84-9736-dc52a48187f8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-b0949fde-940d-495c-bdb0-e6c996b0274f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.537 183079 DEBUG nova.network.neutron [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Updating instance_info_cache with network_info: [{"id": "6aacbedb-6999-4006-9e77-6e540614dbea", "address": "fa:16:3e:35:c7:99", "network": {"id": "ed94e4f1-14ed-42c4-8c8e-db508a59bd2c", "bridge": "br-int", "label": "tempest-test-network--696544753", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "89916c03f6f440f6ae7cf81f2ae99bad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6aacbedb-69", "ovs_interfaceid": "6aacbedb-6999-4006-9e77-6e540614dbea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.554 183079 DEBUG oslo_concurrency.lockutils [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Releasing lock "refresh_cache-b0949fde-940d-495c-bdb0-e6c996b0274f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.555 183079 DEBUG nova.compute.manager [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Instance network_info: |[{"id": "6aacbedb-6999-4006-9e77-6e540614dbea", "address": "fa:16:3e:35:c7:99", "network": {"id": "ed94e4f1-14ed-42c4-8c8e-db508a59bd2c", "bridge": "br-int", "label": "tempest-test-network--696544753", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "89916c03f6f440f6ae7cf81f2ae99bad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6aacbedb-69", "ovs_interfaceid": "6aacbedb-6999-4006-9e77-6e540614dbea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.555 183079 DEBUG oslo_concurrency.lockutils [req-5bb87c05-14ab-4289-b8cf-a32bb7878b8c req-b5f516f0-9bd8-4a84-9736-dc52a48187f8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-b0949fde-940d-495c-bdb0-e6c996b0274f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.556 183079 DEBUG nova.network.neutron [req-5bb87c05-14ab-4289-b8cf-a32bb7878b8c req-b5f516f0-9bd8-4a84-9736-dc52a48187f8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Refreshing network info cache for port 6aacbedb-6999-4006-9e77-6e540614dbea _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.560 183079 DEBUG nova.virt.libvirt.driver [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Start _get_guest_xml network_info=[{"id": "6aacbedb-6999-4006-9e77-6e540614dbea", "address": "fa:16:3e:35:c7:99", "network": {"id": "ed94e4f1-14ed-42c4-8c8e-db508a59bd2c", "bridge": "br-int", "label": "tempest-test-network--696544753", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "89916c03f6f440f6ae7cf81f2ae99bad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6aacbedb-69", "ovs_interfaceid": "6aacbedb-6999-4006-9e77-6e540614dbea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.567 183079 WARNING nova.virt.libvirt.driver [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.579 183079 DEBUG nova.virt.libvirt.host [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.580 183079 DEBUG nova.virt.libvirt.host [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.582 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.587 183079 DEBUG nova.virt.libvirt.host [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.587 183079 DEBUG nova.virt.libvirt.host [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.589 183079 DEBUG nova.virt.libvirt.driver [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.589 183079 DEBUG nova.virt.hardware [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.590 183079 DEBUG nova.virt.hardware [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.590 183079 DEBUG nova.virt.hardware [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.591 183079 DEBUG nova.virt.hardware [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.591 183079 DEBUG nova.virt.hardware [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.592 183079 DEBUG nova.virt.hardware [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.592 183079 DEBUG nova.virt.hardware [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.592 183079 DEBUG nova.virt.hardware [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.593 183079 DEBUG nova.virt.hardware [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.593 183079 DEBUG nova.virt.hardware [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.594 183079 DEBUG nova.virt.hardware [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.599 183079 DEBUG nova.virt.libvirt.vif [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:31:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='luke',display_name='luke',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='luke',id=53,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP31lzQFM+bzgy2rILfPJtItaZEOm3vqO/5STzV+08atBuxP1/4YmUTZSa+vuUIur2j2kVkdN8zrzADLiGWPcuNSoFATd7+40/kloWBkWhl+JRfBOGEMv65jtGsYaSx1Fw==',key_name='tempest-keypair-test-1852532011',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='89916c03f6f440f6ae7cf81f2ae99bad',ramdisk_id='',reservation_id='r-73e4bgnl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_p
roject_name='tempest-InternalDNSTest-38234021',owner_user_name='tempest-InternalDNSTest-38234021-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:31:10Z,user_data=None,user_id='1ddebe2a251e4b118d9469f7d6fdb2ce',uuid=b0949fde-940d-495c-bdb0-e6c996b0274f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6aacbedb-6999-4006-9e77-6e540614dbea", "address": "fa:16:3e:35:c7:99", "network": {"id": "ed94e4f1-14ed-42c4-8c8e-db508a59bd2c", "bridge": "br-int", "label": "tempest-test-network--696544753", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "89916c03f6f440f6ae7cf81f2ae99bad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6aacbedb-69", "ovs_interfaceid": "6aacbedb-6999-4006-9e77-6e540614dbea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.600 183079 DEBUG nova.network.os_vif_util [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Converting VIF {"id": "6aacbedb-6999-4006-9e77-6e540614dbea", "address": "fa:16:3e:35:c7:99", "network": {"id": "ed94e4f1-14ed-42c4-8c8e-db508a59bd2c", "bridge": "br-int", "label": "tempest-test-network--696544753", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "89916c03f6f440f6ae7cf81f2ae99bad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6aacbedb-69", "ovs_interfaceid": "6aacbedb-6999-4006-9e77-6e540614dbea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.601 183079 DEBUG nova.network.os_vif_util [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:c7:99,bridge_name='br-int',has_traffic_filtering=True,id=6aacbedb-6999-4006-9e77-6e540614dbea,network=Network(ed94e4f1-14ed-42c4-8c8e-db508a59bd2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6aacbedb-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.602 183079 DEBUG nova.objects.instance [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lazy-loading 'pci_devices' on Instance uuid b0949fde-940d-495c-bdb0-e6c996b0274f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.618 183079 DEBUG nova.virt.libvirt.driver [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:31:17 compute-0 nova_compute[183075]:   <uuid>b0949fde-940d-495c-bdb0-e6c996b0274f</uuid>
Jan 22 17:31:17 compute-0 nova_compute[183075]:   <name>instance-00000035</name>
Jan 22 17:31:17 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:31:17 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:31:17 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:31:17 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:       <nova:name>luke</nova:name>
Jan 22 17:31:17 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:31:17</nova:creationTime>
Jan 22 17:31:17 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:31:17 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:31:17 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:31:17 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:31:17 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:31:17 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:31:17 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:31:17 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:31:17 compute-0 nova_compute[183075]:         <nova:user uuid="1ddebe2a251e4b118d9469f7d6fdb2ce">tempest-InternalDNSTest-38234021-project-member</nova:user>
Jan 22 17:31:17 compute-0 nova_compute[183075]:         <nova:project uuid="89916c03f6f440f6ae7cf81f2ae99bad">tempest-InternalDNSTest-38234021</nova:project>
Jan 22 17:31:17 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:31:17 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:31:17 compute-0 nova_compute[183075]:         <nova:port uuid="6aacbedb-6999-4006-9e77-6e540614dbea">
Jan 22 17:31:17 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.23" ipVersion="4"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:31:17 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:31:17 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:31:17 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <system>
Jan 22 17:31:17 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:31:17 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:31:17 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:31:17 compute-0 nova_compute[183075]:       <entry name="serial">b0949fde-940d-495c-bdb0-e6c996b0274f</entry>
Jan 22 17:31:17 compute-0 nova_compute[183075]:       <entry name="uuid">b0949fde-940d-495c-bdb0-e6c996b0274f</entry>
Jan 22 17:31:17 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     </system>
Jan 22 17:31:17 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:31:17 compute-0 nova_compute[183075]:   <os>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:   </os>
Jan 22 17:31:17 compute-0 nova_compute[183075]:   <features>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:   </features>
Jan 22 17:31:17 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:31:17 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:31:17 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:31:17 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/b0949fde-940d-495c-bdb0-e6c996b0274f/disk"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:31:17 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:35:c7:99"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:       <target dev="tap6aacbedb-69"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:31:17 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/b0949fde-940d-495c-bdb0-e6c996b0274f/console.log" append="off"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <video>
Jan 22 17:31:17 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     </video>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:31:17 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:31:17 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:31:17 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:31:17 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:31:17 compute-0 nova_compute[183075]: </domain>
Jan 22 17:31:17 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.620 183079 DEBUG nova.compute.manager [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Preparing to wait for external event network-vif-plugged-6aacbedb-6999-4006-9e77-6e540614dbea prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.620 183079 DEBUG oslo_concurrency.lockutils [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Acquiring lock "b0949fde-940d-495c-bdb0-e6c996b0274f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.620 183079 DEBUG oslo_concurrency.lockutils [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "b0949fde-940d-495c-bdb0-e6c996b0274f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.621 183079 DEBUG oslo_concurrency.lockutils [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "b0949fde-940d-495c-bdb0-e6c996b0274f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.622 183079 DEBUG nova.virt.libvirt.vif [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:31:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='luke',display_name='luke',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='luke',id=53,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP31lzQFM+bzgy2rILfPJtItaZEOm3vqO/5STzV+08atBuxP1/4YmUTZSa+vuUIur2j2kVkdN8zrzADLiGWPcuNSoFATd7+40/kloWBkWhl+JRfBOGEMv65jtGsYaSx1Fw==',key_name='tempest-keypair-test-1852532011',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='89916c03f6f440f6ae7cf81f2ae99bad',ramdisk_id='',reservation_id='r-73e4bgnl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='Tru
e',owner_project_name='tempest-InternalDNSTest-38234021',owner_user_name='tempest-InternalDNSTest-38234021-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:31:10Z,user_data=None,user_id='1ddebe2a251e4b118d9469f7d6fdb2ce',uuid=b0949fde-940d-495c-bdb0-e6c996b0274f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6aacbedb-6999-4006-9e77-6e540614dbea", "address": "fa:16:3e:35:c7:99", "network": {"id": "ed94e4f1-14ed-42c4-8c8e-db508a59bd2c", "bridge": "br-int", "label": "tempest-test-network--696544753", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "89916c03f6f440f6ae7cf81f2ae99bad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6aacbedb-69", "ovs_interfaceid": "6aacbedb-6999-4006-9e77-6e540614dbea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.622 183079 DEBUG nova.network.os_vif_util [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Converting VIF {"id": "6aacbedb-6999-4006-9e77-6e540614dbea", "address": "fa:16:3e:35:c7:99", "network": {"id": "ed94e4f1-14ed-42c4-8c8e-db508a59bd2c", "bridge": "br-int", "label": "tempest-test-network--696544753", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "89916c03f6f440f6ae7cf81f2ae99bad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6aacbedb-69", "ovs_interfaceid": "6aacbedb-6999-4006-9e77-6e540614dbea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.623 183079 DEBUG nova.network.os_vif_util [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:c7:99,bridge_name='br-int',has_traffic_filtering=True,id=6aacbedb-6999-4006-9e77-6e540614dbea,network=Network(ed94e4f1-14ed-42c4-8c8e-db508a59bd2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6aacbedb-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.623 183079 DEBUG os_vif [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:c7:99,bridge_name='br-int',has_traffic_filtering=True,id=6aacbedb-6999-4006-9e77-6e540614dbea,network=Network(ed94e4f1-14ed-42c4-8c8e-db508a59bd2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6aacbedb-69') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.624 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.624 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.625 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.629 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.630 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6aacbedb-69, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.630 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6aacbedb-69, col_values=(('external_ids', {'iface-id': '6aacbedb-6999-4006-9e77-6e540614dbea', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:35:c7:99', 'vm-uuid': 'b0949fde-940d-495c-bdb0-e6c996b0274f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.632 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:17 compute-0 NetworkManager[55454]: <info>  [1769103077.6332] manager: (tap6aacbedb-69): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/248)
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.634 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.639 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.640 183079 INFO os_vif [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:c7:99,bridge_name='br-int',has_traffic_filtering=True,id=6aacbedb-6999-4006-9e77-6e540614dbea,network=Network(ed94e4f1-14ed-42c4-8c8e-db508a59bd2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6aacbedb-69')
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.709 183079 DEBUG nova.virt.libvirt.driver [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.710 183079 DEBUG nova.virt.libvirt.driver [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] No VIF found with MAC fa:16:3e:35:c7:99, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:31:17 compute-0 kernel: tap6aacbedb-69: entered promiscuous mode
Jan 22 17:31:17 compute-0 NetworkManager[55454]: <info>  [1769103077.7626] manager: (tap6aacbedb-69): new Tun device (/org/freedesktop/NetworkManager/Devices/249)
Jan 22 17:31:17 compute-0 ovn_controller[95372]: 2026-01-22T17:31:17Z|00587|binding|INFO|Claiming lport 6aacbedb-6999-4006-9e77-6e540614dbea for this chassis.
Jan 22 17:31:17 compute-0 ovn_controller[95372]: 2026-01-22T17:31:17Z|00588|binding|INFO|6aacbedb-6999-4006-9e77-6e540614dbea: Claiming fa:16:3e:35:c7:99 10.100.0.23
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.803 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:17 compute-0 systemd-udevd[233062]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:31:17 compute-0 NetworkManager[55454]: <info>  [1769103077.8191] device (tap6aacbedb-69): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:31:17 compute-0 NetworkManager[55454]: <info>  [1769103077.8199] device (tap6aacbedb-69): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:31:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:17.827 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:c7:99 10.100.0.23'], port_security=['fa:16:3e:35:c7:99 10.100.0.23'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.23/28', 'neutron:device_id': 'b0949fde-940d-495c-bdb0-e6c996b0274f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6b35f822-4909-4e2a-a8bd-ef6e39136861', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2644338c-80e0-4701-bf75-229e5a6223da, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=6aacbedb-6999-4006-9e77-6e540614dbea) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:31:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:17.828 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 6aacbedb-6999-4006-9e77-6e540614dbea in datapath ed94e4f1-14ed-42c4-8c8e-db508a59bd2c bound to our chassis
Jan 22 17:31:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:17.830 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ed94e4f1-14ed-42c4-8c8e-db508a59bd2c
Jan 22 17:31:17 compute-0 systemd-machined[154382]: New machine qemu-53-instance-00000035.
Jan 22 17:31:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:17.845 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[298e3ac5-39dd-4db7-9f32-7c5cbf1ca011]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:31:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:17.848 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH taped94e4f1-11 in ovnmeta-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:31:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:17.849 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface taped94e4f1-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:31:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:17.849 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0a471c26-92b2-4835-aca4-ca6629d912f4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:31:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:17.850 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[02c7cb17-4087-409f-a81b-49ba36005562]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:31:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:17.863 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[5263b013-f902-4c91-b254-331cd41763c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.875 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:17 compute-0 systemd[1]: Started Virtual Machine qemu-53-instance-00000035.
Jan 22 17:31:17 compute-0 ovn_controller[95372]: 2026-01-22T17:31:17Z|00589|binding|INFO|Setting lport 6aacbedb-6999-4006-9e77-6e540614dbea ovn-installed in OVS
Jan 22 17:31:17 compute-0 ovn_controller[95372]: 2026-01-22T17:31:17Z|00590|binding|INFO|Setting lport 6aacbedb-6999-4006-9e77-6e540614dbea up in Southbound
Jan 22 17:31:17 compute-0 nova_compute[183075]: 2026-01-22 17:31:17.881 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:17.893 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[714ad78e-3dc7-42e6-b4fb-324f7bf8ee97]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:31:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:17.923 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[50eba930-bade-44b9-ae03-da6b155c1220]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:31:17 compute-0 systemd-udevd[233065]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:31:17 compute-0 NetworkManager[55454]: <info>  [1769103077.9312] manager: (taped94e4f1-10): new Veth device (/org/freedesktop/NetworkManager/Devices/250)
Jan 22 17:31:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:17.930 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[61a36844-7cef-4607-90de-c0e08a1f0c31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:31:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:17.963 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[f4eab986-6307-419e-a17a-4e76297b4736]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:31:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:17.967 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[ddda2ee4-022d-46c1-96b9-19fc253334aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:31:17 compute-0 NetworkManager[55454]: <info>  [1769103077.9883] device (taped94e4f1-10): carrier: link connected
Jan 22 17:31:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:17.993 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[8db3c272-39a7-49c0-8654-349bbf31c3f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:18.010 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[83589006-4522-4cb1-ac96-8dd5049addb2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'taped94e4f1-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1a:6f:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 168], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 544169, 'reachable_time': 15481, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233098, 'error': None, 'target': 'ovnmeta-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:18.029 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d5f3393d-95ee-4b11-a021-fda92052c907]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1a:6f63'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 544169, 'tstamp': 544169}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233099, 'error': None, 'target': 'ovnmeta-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:18.048 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ec202b85-0e7c-4f9c-8580-72be9878144e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'taped94e4f1-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1a:6f:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 168], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 544169, 'reachable_time': 15481, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233100, 'error': None, 'target': 'ovnmeta-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:18.084 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7b13e155-8f04-41d4-b170-9111b19c4de7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:18.137 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e472993c-14e0-4d01-969e-a317604747ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:18.139 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped94e4f1-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:18.139 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:31:18 compute-0 nova_compute[183075]: 2026-01-22 17:31:18.139 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103078.1388664, b0949fde-940d-495c-bdb0-e6c996b0274f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:18.139 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=taped94e4f1-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:31:18 compute-0 nova_compute[183075]: 2026-01-22 17:31:18.139 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] VM Started (Lifecycle Event)
Jan 22 17:31:18 compute-0 kernel: taped94e4f1-10: entered promiscuous mode
Jan 22 17:31:18 compute-0 NetworkManager[55454]: <info>  [1769103078.1415] manager: (taped94e4f1-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/251)
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:18.143 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=taped94e4f1-10, col_values=(('external_ids', {'iface-id': '77b4e93d-6708-4efb-b060-601be2ddc621'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:31:18 compute-0 nova_compute[183075]: 2026-01-22 17:31:18.142 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:18 compute-0 ovn_controller[95372]: 2026-01-22T17:31:18Z|00591|binding|INFO|Releasing lport 77b4e93d-6708-4efb-b060-601be2ddc621 from this chassis (sb_readonly=0)
Jan 22 17:31:18 compute-0 nova_compute[183075]: 2026-01-22 17:31:18.155 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:18.155 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ed94e4f1-14ed-42c4-8c8e-db508a59bd2c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ed94e4f1-14ed-42c4-8c8e-db508a59bd2c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:18.156 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[cb64c079-372d-4f35-901e-d6baa215f5ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:18.157 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/ed94e4f1-14ed-42c4-8c8e-db508a59bd2c.pid.haproxy
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID ed94e4f1-14ed-42c4-8c8e-db508a59bd2c
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:31:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:18.158 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c', 'env', 'PROCESS_TAG=haproxy-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ed94e4f1-14ed-42c4-8c8e-db508a59bd2c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:31:18 compute-0 nova_compute[183075]: 2026-01-22 17:31:18.164 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:31:18 compute-0 nova_compute[183075]: 2026-01-22 17:31:18.167 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103078.1390276, b0949fde-940d-495c-bdb0-e6c996b0274f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:31:18 compute-0 nova_compute[183075]: 2026-01-22 17:31:18.168 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] VM Paused (Lifecycle Event)
Jan 22 17:31:18 compute-0 nova_compute[183075]: 2026-01-22 17:31:18.192 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:31:18 compute-0 nova_compute[183075]: 2026-01-22 17:31:18.196 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:31:18 compute-0 nova_compute[183075]: 2026-01-22 17:31:18.214 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:31:18 compute-0 podman[233139]: 2026-01-22 17:31:18.501087311 +0000 UTC m=+0.047312384 container create b1ee1c00c4990dacd0863eb9a231ec1c1c2be81374bfc55badf636125a088602 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:31:18 compute-0 nova_compute[183075]: 2026-01-22 17:31:18.516 183079 DEBUG nova.compute.manager [req-31a24dd7-a516-4fcb-b5ca-9c643b430fcd req-acda33cd-af17-40a6-a535-b7495ed9c370 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Received event network-vif-plugged-6aacbedb-6999-4006-9e77-6e540614dbea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:31:18 compute-0 nova_compute[183075]: 2026-01-22 17:31:18.517 183079 DEBUG oslo_concurrency.lockutils [req-31a24dd7-a516-4fcb-b5ca-9c643b430fcd req-acda33cd-af17-40a6-a535-b7495ed9c370 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "b0949fde-940d-495c-bdb0-e6c996b0274f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:31:18 compute-0 nova_compute[183075]: 2026-01-22 17:31:18.517 183079 DEBUG oslo_concurrency.lockutils [req-31a24dd7-a516-4fcb-b5ca-9c643b430fcd req-acda33cd-af17-40a6-a535-b7495ed9c370 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b0949fde-940d-495c-bdb0-e6c996b0274f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:31:18 compute-0 nova_compute[183075]: 2026-01-22 17:31:18.517 183079 DEBUG oslo_concurrency.lockutils [req-31a24dd7-a516-4fcb-b5ca-9c643b430fcd req-acda33cd-af17-40a6-a535-b7495ed9c370 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b0949fde-940d-495c-bdb0-e6c996b0274f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:31:18 compute-0 nova_compute[183075]: 2026-01-22 17:31:18.518 183079 DEBUG nova.compute.manager [req-31a24dd7-a516-4fcb-b5ca-9c643b430fcd req-acda33cd-af17-40a6-a535-b7495ed9c370 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Processing event network-vif-plugged-6aacbedb-6999-4006-9e77-6e540614dbea _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:31:18 compute-0 nova_compute[183075]: 2026-01-22 17:31:18.518 183079 DEBUG nova.compute.manager [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:31:18 compute-0 nova_compute[183075]: 2026-01-22 17:31:18.521 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103078.5213168, b0949fde-940d-495c-bdb0-e6c996b0274f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:31:18 compute-0 nova_compute[183075]: 2026-01-22 17:31:18.521 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] VM Resumed (Lifecycle Event)
Jan 22 17:31:18 compute-0 nova_compute[183075]: 2026-01-22 17:31:18.523 183079 DEBUG nova.virt.libvirt.driver [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:31:18 compute-0 nova_compute[183075]: 2026-01-22 17:31:18.527 183079 INFO nova.virt.libvirt.driver [-] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Instance spawned successfully.
Jan 22 17:31:18 compute-0 nova_compute[183075]: 2026-01-22 17:31:18.528 183079 DEBUG nova.virt.libvirt.driver [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:31:18 compute-0 nova_compute[183075]: 2026-01-22 17:31:18.545 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:31:18 compute-0 systemd[1]: Started libpod-conmon-b1ee1c00c4990dacd0863eb9a231ec1c1c2be81374bfc55badf636125a088602.scope.
Jan 22 17:31:18 compute-0 nova_compute[183075]: 2026-01-22 17:31:18.556 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:31:18 compute-0 nova_compute[183075]: 2026-01-22 17:31:18.560 183079 DEBUG nova.virt.libvirt.driver [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:31:18 compute-0 nova_compute[183075]: 2026-01-22 17:31:18.561 183079 DEBUG nova.virt.libvirt.driver [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:31:18 compute-0 nova_compute[183075]: 2026-01-22 17:31:18.562 183079 DEBUG nova.virt.libvirt.driver [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:31:18 compute-0 nova_compute[183075]: 2026-01-22 17:31:18.562 183079 DEBUG nova.virt.libvirt.driver [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:31:18 compute-0 nova_compute[183075]: 2026-01-22 17:31:18.563 183079 DEBUG nova.virt.libvirt.driver [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:31:18 compute-0 nova_compute[183075]: 2026-01-22 17:31:18.564 183079 DEBUG nova.virt.libvirt.driver [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:31:18 compute-0 podman[233139]: 2026-01-22 17:31:18.478292372 +0000 UTC m=+0.024517445 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:31:18 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:31:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a958efb9fc80caf042ae349a4e216b7d5113e0ab0397650d7bc476ecaf08dde3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:31:18 compute-0 nova_compute[183075]: 2026-01-22 17:31:18.595 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:31:18 compute-0 nova_compute[183075]: 2026-01-22 17:31:18.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:31:19 compute-0 podman[233139]: 2026-01-22 17:31:19.335295422 +0000 UTC m=+0.881520575 container init b1ee1c00c4990dacd0863eb9a231ec1c1c2be81374bfc55badf636125a088602 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 22 17:31:19 compute-0 podman[233139]: 2026-01-22 17:31:19.346379023 +0000 UTC m=+0.892604126 container start b1ee1c00c4990dacd0863eb9a231ec1c1c2be81374bfc55badf636125a088602 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:31:19 compute-0 nova_compute[183075]: 2026-01-22 17:31:19.346 183079 INFO nova.compute.manager [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Took 8.39 seconds to spawn the instance on the hypervisor.
Jan 22 17:31:19 compute-0 nova_compute[183075]: 2026-01-22 17:31:19.347 183079 DEBUG nova.compute.manager [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:31:19 compute-0 podman[233158]: 2026-01-22 17:31:19.384029602 +0000 UTC m=+0.075863833 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 17:31:19 compute-0 neutron-haproxy-ovnmeta-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233155]: [NOTICE]   (233170) : New worker (233184) forked
Jan 22 17:31:19 compute-0 neutron-haproxy-ovnmeta-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233155]: [NOTICE]   (233170) : Loading success.
Jan 22 17:31:19 compute-0 nova_compute[183075]: 2026-01-22 17:31:19.452 183079 INFO nova.compute.manager [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Took 8.90 seconds to build instance.
Jan 22 17:31:19 compute-0 nova_compute[183075]: 2026-01-22 17:31:19.467 183079 DEBUG oslo_concurrency.lockutils [None req-f6681e1b-d4bf-4895-b55a-2fc8f5111b6a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "b0949fde-940d-495c-bdb0-e6c996b0274f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.012s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:31:20 compute-0 nova_compute[183075]: 2026-01-22 17:31:20.085 183079 DEBUG nova.network.neutron [req-5bb87c05-14ab-4289-b8cf-a32bb7878b8c req-b5f516f0-9bd8-4a84-9736-dc52a48187f8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Updated VIF entry in instance network info cache for port 6aacbedb-6999-4006-9e77-6e540614dbea. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:31:20 compute-0 nova_compute[183075]: 2026-01-22 17:31:20.087 183079 DEBUG nova.network.neutron [req-5bb87c05-14ab-4289-b8cf-a32bb7878b8c req-b5f516f0-9bd8-4a84-9736-dc52a48187f8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Updating instance_info_cache with network_info: [{"id": "6aacbedb-6999-4006-9e77-6e540614dbea", "address": "fa:16:3e:35:c7:99", "network": {"id": "ed94e4f1-14ed-42c4-8c8e-db508a59bd2c", "bridge": "br-int", "label": "tempest-test-network--696544753", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "89916c03f6f440f6ae7cf81f2ae99bad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6aacbedb-69", "ovs_interfaceid": "6aacbedb-6999-4006-9e77-6e540614dbea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:31:20 compute-0 nova_compute[183075]: 2026-01-22 17:31:20.107 183079 DEBUG oslo_concurrency.lockutils [req-5bb87c05-14ab-4289-b8cf-a32bb7878b8c req-b5f516f0-9bd8-4a84-9736-dc52a48187f8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-b0949fde-940d-495c-bdb0-e6c996b0274f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:31:20 compute-0 nova_compute[183075]: 2026-01-22 17:31:20.630 183079 DEBUG nova.compute.manager [req-fe436636-50c4-4a3e-8c1b-bd9f21936dfe req-99408b42-c887-43dd-baf2-4ba60cc311d8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Received event network-vif-plugged-6aacbedb-6999-4006-9e77-6e540614dbea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:31:20 compute-0 nova_compute[183075]: 2026-01-22 17:31:20.631 183079 DEBUG oslo_concurrency.lockutils [req-fe436636-50c4-4a3e-8c1b-bd9f21936dfe req-99408b42-c887-43dd-baf2-4ba60cc311d8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "b0949fde-940d-495c-bdb0-e6c996b0274f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:31:20 compute-0 nova_compute[183075]: 2026-01-22 17:31:20.631 183079 DEBUG oslo_concurrency.lockutils [req-fe436636-50c4-4a3e-8c1b-bd9f21936dfe req-99408b42-c887-43dd-baf2-4ba60cc311d8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b0949fde-940d-495c-bdb0-e6c996b0274f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:31:20 compute-0 nova_compute[183075]: 2026-01-22 17:31:20.632 183079 DEBUG oslo_concurrency.lockutils [req-fe436636-50c4-4a3e-8c1b-bd9f21936dfe req-99408b42-c887-43dd-baf2-4ba60cc311d8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b0949fde-940d-495c-bdb0-e6c996b0274f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:31:20 compute-0 nova_compute[183075]: 2026-01-22 17:31:20.632 183079 DEBUG nova.compute.manager [req-fe436636-50c4-4a3e-8c1b-bd9f21936dfe req-99408b42-c887-43dd-baf2-4ba60cc311d8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] No waiting events found dispatching network-vif-plugged-6aacbedb-6999-4006-9e77-6e540614dbea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:31:20 compute-0 nova_compute[183075]: 2026-01-22 17:31:20.633 183079 WARNING nova.compute.manager [req-fe436636-50c4-4a3e-8c1b-bd9f21936dfe req-99408b42-c887-43dd-baf2-4ba60cc311d8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Received unexpected event network-vif-plugged-6aacbedb-6999-4006-9e77-6e540614dbea for instance with vm_state active and task_state None.
Jan 22 17:31:21 compute-0 nova_compute[183075]: 2026-01-22 17:31:21.440 183079 INFO nova.compute.manager [None req-80ed65cf-a83d-4e43-b8fd-b214c319f032 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Get console output
Jan 22 17:31:22 compute-0 nova_compute[183075]: 2026-01-22 17:31:22.585 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:22 compute-0 nova_compute[183075]: 2026-01-22 17:31:22.632 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:22 compute-0 nova_compute[183075]: 2026-01-22 17:31:22.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:31:23 compute-0 nova_compute[183075]: 2026-01-22 17:31:23.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:31:26 compute-0 nova_compute[183075]: 2026-01-22 17:31:26.638 183079 INFO nova.compute.manager [None req-58849b3d-ff2c-453c-b508-43c420d3c1ca 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Get console output
Jan 22 17:31:26 compute-0 nova_compute[183075]: 2026-01-22 17:31:26.645 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:31:27 compute-0 nova_compute[183075]: 2026-01-22 17:31:27.586 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:27 compute-0 nova_compute[183075]: 2026-01-22 17:31:27.634 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:27 compute-0 nova_compute[183075]: 2026-01-22 17:31:27.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:31:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:28.514 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:bc:27 10.100.0.2 2001:db8::f816:3eff:fee7:bc27'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fee7:bc27/64', 'neutron:device_id': 'ovnmeta-2fd77df8-cf00-4afc-b4cf-75b5722c375c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2fd77df8-cf00-4afc-b4cf-75b5722c375c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6159b68b-3c7b-43be-9fb9-7f846f3d3eb8, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=0297f784-1a41-4744-b018-f503dfa93754) old=Port_Binding(mac=['fa:16:3e:e7:bc:27 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-2fd77df8-cf00-4afc-b4cf-75b5722c375c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2fd77df8-cf00-4afc-b4cf-75b5722c375c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:31:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:28.517 104629 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 0297f784-1a41-4744-b018-f503dfa93754 in datapath 2fd77df8-cf00-4afc-b4cf-75b5722c375c updated
Jan 22 17:31:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:28.520 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2fd77df8-cf00-4afc-b4cf-75b5722c375c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:31:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:28.522 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[43bfabeb-04c3-4bbc-ab27-e030e9d5c557]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:31:31 compute-0 ovn_controller[95372]: 2026-01-22T17:31:31Z|00108|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:35:c7:99 10.100.0.23
Jan 22 17:31:31 compute-0 ovn_controller[95372]: 2026-01-22T17:31:31Z|00109|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:35:c7:99 10.100.0.23
Jan 22 17:31:31 compute-0 podman[233211]: 2026-01-22 17:31:31.349144379 +0000 UTC m=+0.052743386 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.buildah.version=1.33.7, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., vcs-type=git, config_id=openstack_network_exporter, name=ubi9-minimal, build-date=2025-08-20T13:12:41)
Jan 22 17:31:31 compute-0 podman[233209]: 2026-01-22 17:31:31.372485162 +0000 UTC m=+0.081690006 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 17:31:31 compute-0 podman[233210]: 2026-01-22 17:31:31.372576674 +0000 UTC m=+0.079670892 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 22 17:31:31 compute-0 nova_compute[183075]: 2026-01-22 17:31:31.777 183079 INFO nova.compute.manager [None req-34ab57a3-1b7d-4f97-8aee-9acd46ecf4df 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Get console output
Jan 22 17:31:31 compute-0 nova_compute[183075]: 2026-01-22 17:31:31.782 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:31:31 compute-0 nova_compute[183075]: 2026-01-22 17:31:31.782 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:31:32 compute-0 nova_compute[183075]: 2026-01-22 17:31:32.635 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:31:32 compute-0 nova_compute[183075]: 2026-01-22 17:31:32.637 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:31:32 compute-0 nova_compute[183075]: 2026-01-22 17:31:32.637 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 17:31:32 compute-0 nova_compute[183075]: 2026-01-22 17:31:32.637 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 17:31:32 compute-0 nova_compute[183075]: 2026-01-22 17:31:32.643 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:32 compute-0 nova_compute[183075]: 2026-01-22 17:31:32.644 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 17:31:32 compute-0 nova_compute[183075]: 2026-01-22 17:31:32.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:31:32 compute-0 nova_compute[183075]: 2026-01-22 17:31:32.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:31:32 compute-0 nova_compute[183075]: 2026-01-22 17:31:32.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:31:32 compute-0 nova_compute[183075]: 2026-01-22 17:31:32.999 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:31:33 compute-0 nova_compute[183075]: 2026-01-22 17:31:32.999 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:31:33 compute-0 nova_compute[183075]: 2026-01-22 17:31:33.000 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:31:33 compute-0 nova_compute[183075]: 2026-01-22 17:31:33.000 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:31:33 compute-0 nova_compute[183075]: 2026-01-22 17:31:33.063 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0949fde-940d-495c-bdb0-e6c996b0274f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:31:33 compute-0 nova_compute[183075]: 2026-01-22 17:31:33.126 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0949fde-940d-495c-bdb0-e6c996b0274f/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:31:33 compute-0 nova_compute[183075]: 2026-01-22 17:31:33.128 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0949fde-940d-495c-bdb0-e6c996b0274f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:31:33 compute-0 nova_compute[183075]: 2026-01-22 17:31:33.187 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0949fde-940d-495c-bdb0-e6c996b0274f/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:31:33 compute-0 nova_compute[183075]: 2026-01-22 17:31:33.336 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:31:33 compute-0 nova_compute[183075]: 2026-01-22 17:31:33.337 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5539MB free_disk=73.33266067504883GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:31:33 compute-0 nova_compute[183075]: 2026-01-22 17:31:33.338 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:31:33 compute-0 nova_compute[183075]: 2026-01-22 17:31:33.338 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:31:33 compute-0 nova_compute[183075]: 2026-01-22 17:31:33.417 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance b0949fde-940d-495c-bdb0-e6c996b0274f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:31:33 compute-0 nova_compute[183075]: 2026-01-22 17:31:33.417 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:31:33 compute-0 nova_compute[183075]: 2026-01-22 17:31:33.417 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:31:33 compute-0 nova_compute[183075]: 2026-01-22 17:31:33.453 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:31:33 compute-0 nova_compute[183075]: 2026-01-22 17:31:33.465 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:31:33 compute-0 nova_compute[183075]: 2026-01-22 17:31:33.492 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:31:33 compute-0 nova_compute[183075]: 2026-01-22 17:31:33.492 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:31:35 compute-0 podman[233278]: 2026-01-22 17:31:35.340989131 +0000 UTC m=+0.052459918 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:31:35 compute-0 nova_compute[183075]: 2026-01-22 17:31:35.492 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:31:35 compute-0 nova_compute[183075]: 2026-01-22 17:31:35.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:31:35 compute-0 nova_compute[183075]: 2026-01-22 17:31:35.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:31:35 compute-0 nova_compute[183075]: 2026-01-22 17:31:35.806 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 17:31:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:36.288 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:31:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:36.289 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:31:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:31:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:31:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:31:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:31:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:31:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.23
Jan 22 17:31:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ed94e4f1-14ed-42c4-8c8e-db508a59bd2c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:31:36 compute-0 nova_compute[183075]: 2026-01-22 17:31:36.621 183079 DEBUG oslo_concurrency.lockutils [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Acquiring lock "81396ea9-a9a1-4a21-9808-608e45a7aa03" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:31:36 compute-0 nova_compute[183075]: 2026-01-22 17:31:36.621 183079 DEBUG oslo_concurrency.lockutils [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "81396ea9-a9a1-4a21-9808-608e45a7aa03" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:31:36 compute-0 nova_compute[183075]: 2026-01-22 17:31:36.640 183079 DEBUG nova.compute.manager [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:31:36 compute-0 nova_compute[183075]: 2026-01-22 17:31:36.734 183079 DEBUG oslo_concurrency.lockutils [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:31:36 compute-0 nova_compute[183075]: 2026-01-22 17:31:36.734 183079 DEBUG oslo_concurrency.lockutils [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:31:36 compute-0 nova_compute[183075]: 2026-01-22 17:31:36.743 183079 DEBUG nova.virt.hardware [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:31:36 compute-0 nova_compute[183075]: 2026-01-22 17:31:36.743 183079 INFO nova.compute.claims [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:31:36 compute-0 nova_compute[183075]: 2026-01-22 17:31:36.871 183079 DEBUG nova.compute.provider_tree [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:31:36 compute-0 nova_compute[183075]: 2026-01-22 17:31:36.885 183079 DEBUG nova.scheduler.client.report [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:31:36 compute-0 nova_compute[183075]: 2026-01-22 17:31:36.908 183079 DEBUG oslo_concurrency.lockutils [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:31:36 compute-0 nova_compute[183075]: 2026-01-22 17:31:36.909 183079 DEBUG nova.compute.manager [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:31:36 compute-0 nova_compute[183075]: 2026-01-22 17:31:36.924 183079 INFO nova.compute.manager [None req-5a1ad708-229e-4dcd-903f-75fa729611bd 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Get console output
Jan 22 17:31:36 compute-0 nova_compute[183075]: 2026-01-22 17:31:36.931 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:31:36 compute-0 nova_compute[183075]: 2026-01-22 17:31:36.956 183079 DEBUG nova.compute.manager [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:31:36 compute-0 nova_compute[183075]: 2026-01-22 17:31:36.957 183079 DEBUG nova.network.neutron [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:31:36 compute-0 nova_compute[183075]: 2026-01-22 17:31:36.976 183079 INFO nova.virt.libvirt.driver [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:31:36 compute-0 nova_compute[183075]: 2026-01-22 17:31:36.991 183079 DEBUG nova.compute.manager [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:31:37 compute-0 nova_compute[183075]: 2026-01-22 17:31:37.069 183079 DEBUG nova.compute.manager [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:31:37 compute-0 nova_compute[183075]: 2026-01-22 17:31:37.070 183079 DEBUG nova.virt.libvirt.driver [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:31:37 compute-0 nova_compute[183075]: 2026-01-22 17:31:37.071 183079 INFO nova.virt.libvirt.driver [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Creating image(s)
Jan 22 17:31:37 compute-0 nova_compute[183075]: 2026-01-22 17:31:37.071 183079 DEBUG oslo_concurrency.lockutils [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Acquiring lock "/var/lib/nova/instances/81396ea9-a9a1-4a21-9808-608e45a7aa03/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:31:37 compute-0 nova_compute[183075]: 2026-01-22 17:31:37.072 183079 DEBUG oslo_concurrency.lockutils [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "/var/lib/nova/instances/81396ea9-a9a1-4a21-9808-608e45a7aa03/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:31:37 compute-0 nova_compute[183075]: 2026-01-22 17:31:37.072 183079 DEBUG oslo_concurrency.lockutils [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "/var/lib/nova/instances/81396ea9-a9a1-4a21-9808-608e45a7aa03/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:31:37 compute-0 nova_compute[183075]: 2026-01-22 17:31:37.088 183079 DEBUG oslo_concurrency.processutils [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:31:37 compute-0 nova_compute[183075]: 2026-01-22 17:31:37.162 183079 DEBUG oslo_concurrency.processutils [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:31:37 compute-0 nova_compute[183075]: 2026-01-22 17:31:37.163 183079 DEBUG oslo_concurrency.lockutils [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:31:37 compute-0 nova_compute[183075]: 2026-01-22 17:31:37.164 183079 DEBUG oslo_concurrency.lockutils [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:31:37 compute-0 nova_compute[183075]: 2026-01-22 17:31:37.180 183079 DEBUG oslo_concurrency.processutils [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:31:37 compute-0 nova_compute[183075]: 2026-01-22 17:31:37.235 183079 DEBUG oslo_concurrency.processutils [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:31:37 compute-0 nova_compute[183075]: 2026-01-22 17:31:37.236 183079 DEBUG oslo_concurrency.processutils [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/81396ea9-a9a1-4a21-9808-608e45a7aa03/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:31:37 compute-0 nova_compute[183075]: 2026-01-22 17:31:37.279 183079 DEBUG oslo_concurrency.processutils [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/81396ea9-a9a1-4a21-9808-608e45a7aa03/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:31:37 compute-0 nova_compute[183075]: 2026-01-22 17:31:37.280 183079 DEBUG oslo_concurrency.lockutils [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:31:37 compute-0 nova_compute[183075]: 2026-01-22 17:31:37.281 183079 DEBUG oslo_concurrency.processutils [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:31:37 compute-0 nova_compute[183075]: 2026-01-22 17:31:37.332 183079 DEBUG oslo_concurrency.processutils [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:31:37 compute-0 nova_compute[183075]: 2026-01-22 17:31:37.333 183079 DEBUG nova.virt.disk.api [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Checking if we can resize image /var/lib/nova/instances/81396ea9-a9a1-4a21-9808-608e45a7aa03/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:31:37 compute-0 nova_compute[183075]: 2026-01-22 17:31:37.334 183079 DEBUG oslo_concurrency.processutils [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/81396ea9-a9a1-4a21-9808-608e45a7aa03/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:31:37 compute-0 nova_compute[183075]: 2026-01-22 17:31:37.402 183079 DEBUG oslo_concurrency.processutils [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/81396ea9-a9a1-4a21-9808-608e45a7aa03/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:31:37 compute-0 nova_compute[183075]: 2026-01-22 17:31:37.403 183079 DEBUG nova.virt.disk.api [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Cannot resize image /var/lib/nova/instances/81396ea9-a9a1-4a21-9808-608e45a7aa03/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:31:37 compute-0 nova_compute[183075]: 2026-01-22 17:31:37.403 183079 DEBUG nova.objects.instance [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lazy-loading 'migration_context' on Instance uuid 81396ea9-a9a1-4a21-9808-608e45a7aa03 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:31:37 compute-0 nova_compute[183075]: 2026-01-22 17:31:37.470 183079 DEBUG nova.virt.libvirt.driver [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:31:37 compute-0 nova_compute[183075]: 2026-01-22 17:31:37.470 183079 DEBUG nova.virt.libvirt.driver [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Ensure instance console log exists: /var/lib/nova/instances/81396ea9-a9a1-4a21-9808-608e45a7aa03/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:31:37 compute-0 nova_compute[183075]: 2026-01-22 17:31:37.471 183079 DEBUG oslo_concurrency.lockutils [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:31:37 compute-0 nova_compute[183075]: 2026-01-22 17:31:37.471 183079 DEBUG oslo_concurrency.lockutils [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:31:37 compute-0 nova_compute[183075]: 2026-01-22 17:31:37.471 183079 DEBUG oslo_concurrency.lockutils [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.545 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.546 104990 INFO eventlet.wsgi.server [-] 10.100.0.23,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 1.2563119
Jan 22 17:31:37 compute-0 haproxy-metadata-proxy-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233184]: 10.100.0.23:44080 [22/Jan/2026:17:31:36.287] listener listener/metadata 0/0/0/1258/1258 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.553 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.554 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.23
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ed94e4f1-14ed-42c4-8c8e-db508a59bd2c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:31:37 compute-0 nova_compute[183075]: 2026-01-22 17:31:37.556 183079 DEBUG nova.policy [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.573 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:31:37 compute-0 haproxy-metadata-proxy-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233184]: 10.100.0.23:44082 [22/Jan/2026:17:31:37.553] listener listener/metadata 0/0/0/20/20 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.574 104990 INFO eventlet.wsgi.server [-] 10.100.0.23,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0195305
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.578 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.579 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.23
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ed94e4f1-14ed-42c4-8c8e-db508a59bd2c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.595 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.596 104990 INFO eventlet.wsgi.server [-] 10.100.0.23,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0173054
Jan 22 17:31:37 compute-0 haproxy-metadata-proxy-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233184]: 10.100.0.23:44096 [22/Jan/2026:17:31:37.577] listener listener/metadata 0/0/0/19/19 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.601 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.601 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.23
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ed94e4f1-14ed-42c4-8c8e-db508a59bd2c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.618 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.618 104990 INFO eventlet.wsgi.server [-] 10.100.0.23,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0168624
Jan 22 17:31:37 compute-0 haproxy-metadata-proxy-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233184]: 10.100.0.23:44106 [22/Jan/2026:17:31:37.600] listener listener/metadata 0/0/0/17/17 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.623 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.624 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.23
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ed94e4f1-14ed-42c4-8c8e-db508a59bd2c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.641 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.641 104990 INFO eventlet.wsgi.server [-] 10.100.0.23,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0176778
Jan 22 17:31:37 compute-0 haproxy-metadata-proxy-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233184]: 10.100.0.23:44108 [22/Jan/2026:17:31:37.623] listener listener/metadata 0/0/0/18/18 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:31:37 compute-0 nova_compute[183075]: 2026-01-22 17:31:37.645 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.647 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.648 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.23
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ed94e4f1-14ed-42c4-8c8e-db508a59bd2c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.668 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.669 104990 INFO eventlet.wsgi.server [-] 10.100.0.23,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0207269
Jan 22 17:31:37 compute-0 haproxy-metadata-proxy-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233184]: 10.100.0.23:44118 [22/Jan/2026:17:31:37.647] listener listener/metadata 0/0/0/21/21 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.674 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.674 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.23
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ed94e4f1-14ed-42c4-8c8e-db508a59bd2c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.686 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.686 104990 INFO eventlet.wsgi.server [-] 10.100.0.23,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0115838
Jan 22 17:31:37 compute-0 haproxy-metadata-proxy-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233184]: 10.100.0.23:44132 [22/Jan/2026:17:31:37.674] listener listener/metadata 0/0/0/12/12 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.691 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.691 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.23
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ed94e4f1-14ed-42c4-8c8e-db508a59bd2c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.711 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.711 104990 INFO eventlet.wsgi.server [-] 10.100.0.23,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0200412
Jan 22 17:31:37 compute-0 haproxy-metadata-proxy-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233184]: 10.100.0.23:44134 [22/Jan/2026:17:31:37.690] listener listener/metadata 0/0/0/21/21 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.716 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.716 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.23
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ed94e4f1-14ed-42c4-8c8e-db508a59bd2c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.733 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:31:37 compute-0 haproxy-metadata-proxy-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233184]: 10.100.0.23:44150 [22/Jan/2026:17:31:37.715] listener listener/metadata 0/0/0/20/20 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.736 104990 INFO eventlet.wsgi.server [-] 10.100.0.23,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 139 time: 0.0191276
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.748 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.749 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.23
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ed94e4f1-14ed-42c4-8c8e-db508a59bd2c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.768 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.769 104990 INFO eventlet.wsgi.server [-] 10.100.0.23,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 139 time: 0.0198176
Jan 22 17:31:37 compute-0 haproxy-metadata-proxy-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233184]: 10.100.0.23:44158 [22/Jan/2026:17:31:37.747] listener listener/metadata 0/0/0/22/22 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.775 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.776 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.23
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ed94e4f1-14ed-42c4-8c8e-db508a59bd2c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:31:37 compute-0 haproxy-metadata-proxy-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233184]: 10.100.0.23:44166 [22/Jan/2026:17:31:37.775] listener listener/metadata 0/0/0/32/32 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.807 104990 INFO eventlet.wsgi.server [-] 10.100.0.23,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0311041
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.816 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.817 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.23
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ed94e4f1-14ed-42c4-8c8e-db508a59bd2c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.835 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:31:37 compute-0 haproxy-metadata-proxy-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233184]: 10.100.0.23:44174 [22/Jan/2026:17:31:37.816] listener listener/metadata 0/0/0/19/19 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.836 104990 INFO eventlet.wsgi.server [-] 10.100.0.23,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0188231
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.839 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.840 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.23
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ed94e4f1-14ed-42c4-8c8e-db508a59bd2c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.865 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.865 104990 INFO eventlet.wsgi.server [-] 10.100.0.23,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0254779
Jan 22 17:31:37 compute-0 haproxy-metadata-proxy-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233184]: 10.100.0.23:44190 [22/Jan/2026:17:31:37.839] listener listener/metadata 0/0/0/26/26 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.869 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.870 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.23
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ed94e4f1-14ed-42c4-8c8e-db508a59bd2c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.890 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:31:37 compute-0 haproxy-metadata-proxy-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233184]: 10.100.0.23:44192 [22/Jan/2026:17:31:37.869] listener listener/metadata 0/0/0/21/21 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.891 104990 INFO eventlet.wsgi.server [-] 10.100.0.23,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0205154
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.895 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.896 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.23
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ed94e4f1-14ed-42c4-8c8e-db508a59bd2c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.913 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.914 104990 INFO eventlet.wsgi.server [-] 10.100.0.23,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 139 time: 0.0182230
Jan 22 17:31:37 compute-0 haproxy-metadata-proxy-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233184]: 10.100.0.23:44206 [22/Jan/2026:17:31:37.895] listener listener/metadata 0/0/0/19/19 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.920 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.921 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.23
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ed94e4f1-14ed-42c4-8c8e-db508a59bd2c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.936 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:31:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:37.937 104990 INFO eventlet.wsgi.server [-] 10.100.0.23,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0160294
Jan 22 17:31:37 compute-0 haproxy-metadata-proxy-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233184]: 10.100.0.23:44214 [22/Jan/2026:17:31:37.920] listener listener/metadata 0/0/0/17/17 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:31:39 compute-0 nova_compute[183075]: 2026-01-22 17:31:39.243 183079 DEBUG nova.network.neutron [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Successfully created port: 0290774a-bbee-4523-b342-ddb24aca4826 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:31:40 compute-0 nova_compute[183075]: 2026-01-22 17:31:40.482 183079 DEBUG nova.network.neutron [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Successfully updated port: 0290774a-bbee-4523-b342-ddb24aca4826 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:31:40 compute-0 nova_compute[183075]: 2026-01-22 17:31:40.762 183079 DEBUG nova.compute.manager [req-956c4a10-e2fd-406f-8f93-2da4b1a8a5df req-ca86896c-20ca-4d2a-997b-9b27fed2f457 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Received event network-changed-0290774a-bbee-4523-b342-ddb24aca4826 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:31:40 compute-0 nova_compute[183075]: 2026-01-22 17:31:40.762 183079 DEBUG nova.compute.manager [req-956c4a10-e2fd-406f-8f93-2da4b1a8a5df req-ca86896c-20ca-4d2a-997b-9b27fed2f457 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Refreshing instance network info cache due to event network-changed-0290774a-bbee-4523-b342-ddb24aca4826. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:31:40 compute-0 nova_compute[183075]: 2026-01-22 17:31:40.763 183079 DEBUG oslo_concurrency.lockutils [req-956c4a10-e2fd-406f-8f93-2da4b1a8a5df req-ca86896c-20ca-4d2a-997b-9b27fed2f457 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-81396ea9-a9a1-4a21-9808-608e45a7aa03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:31:40 compute-0 nova_compute[183075]: 2026-01-22 17:31:40.763 183079 DEBUG oslo_concurrency.lockutils [req-956c4a10-e2fd-406f-8f93-2da4b1a8a5df req-ca86896c-20ca-4d2a-997b-9b27fed2f457 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-81396ea9-a9a1-4a21-9808-608e45a7aa03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:31:40 compute-0 nova_compute[183075]: 2026-01-22 17:31:40.763 183079 DEBUG nova.network.neutron [req-956c4a10-e2fd-406f-8f93-2da4b1a8a5df req-ca86896c-20ca-4d2a-997b-9b27fed2f457 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Refreshing network info cache for port 0290774a-bbee-4523-b342-ddb24aca4826 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:31:40 compute-0 nova_compute[183075]: 2026-01-22 17:31:40.768 183079 DEBUG oslo_concurrency.lockutils [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Acquiring lock "refresh_cache-81396ea9-a9a1-4a21-9808-608e45a7aa03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:31:41 compute-0 nova_compute[183075]: 2026-01-22 17:31:41.497 183079 DEBUG nova.network.neutron [req-956c4a10-e2fd-406f-8f93-2da4b1a8a5df req-ca86896c-20ca-4d2a-997b-9b27fed2f457 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:31:41 compute-0 nova_compute[183075]: 2026-01-22 17:31:41.852 183079 DEBUG nova.network.neutron [req-956c4a10-e2fd-406f-8f93-2da4b1a8a5df req-ca86896c-20ca-4d2a-997b-9b27fed2f457 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:31:41 compute-0 nova_compute[183075]: 2026-01-22 17:31:41.870 183079 DEBUG oslo_concurrency.lockutils [req-956c4a10-e2fd-406f-8f93-2da4b1a8a5df req-ca86896c-20ca-4d2a-997b-9b27fed2f457 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-81396ea9-a9a1-4a21-9808-608e45a7aa03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:31:41 compute-0 nova_compute[183075]: 2026-01-22 17:31:41.870 183079 DEBUG oslo_concurrency.lockutils [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Acquired lock "refresh_cache-81396ea9-a9a1-4a21-9808-608e45a7aa03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:31:41 compute-0 nova_compute[183075]: 2026-01-22 17:31:41.870 183079 DEBUG nova.network.neutron [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:31:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:41.947 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:31:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:41.948 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:31:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:41.949 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:31:42 compute-0 nova_compute[183075]: 2026-01-22 17:31:42.037 183079 INFO nova.compute.manager [None req-71d66b60-c8b2-4ce5-a621-11e28bd0fa17 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Get console output
Jan 22 17:31:42 compute-0 nova_compute[183075]: 2026-01-22 17:31:42.044 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:31:42 compute-0 nova_compute[183075]: 2026-01-22 17:31:42.495 183079 DEBUG nova.network.neutron [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:31:42 compute-0 nova_compute[183075]: 2026-01-22 17:31:42.648 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.818 183079 DEBUG nova.network.neutron [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Updating instance_info_cache with network_info: [{"id": "0290774a-bbee-4523-b342-ddb24aca4826", "address": "fa:16:3e:60:a0:be", "network": {"id": "2fd77df8-cf00-4afc-b4cf-75b5722c375c", "bridge": "br-int", "label": "tempest-test-network--158385692", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe60:a0be", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22f75e117e724f9aaadf5b8fd25a6ef6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0290774a-bb", "ovs_interfaceid": "0290774a-bbee-4523-b342-ddb24aca4826", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.845 183079 DEBUG oslo_concurrency.lockutils [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Releasing lock "refresh_cache-81396ea9-a9a1-4a21-9808-608e45a7aa03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.846 183079 DEBUG nova.compute.manager [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Instance network_info: |[{"id": "0290774a-bbee-4523-b342-ddb24aca4826", "address": "fa:16:3e:60:a0:be", "network": {"id": "2fd77df8-cf00-4afc-b4cf-75b5722c375c", "bridge": "br-int", "label": "tempest-test-network--158385692", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe60:a0be", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22f75e117e724f9aaadf5b8fd25a6ef6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0290774a-bb", "ovs_interfaceid": "0290774a-bbee-4523-b342-ddb24aca4826", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.852 183079 DEBUG nova.virt.libvirt.driver [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Start _get_guest_xml network_info=[{"id": "0290774a-bbee-4523-b342-ddb24aca4826", "address": "fa:16:3e:60:a0:be", "network": {"id": "2fd77df8-cf00-4afc-b4cf-75b5722c375c", "bridge": "br-int", "label": "tempest-test-network--158385692", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe60:a0be", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22f75e117e724f9aaadf5b8fd25a6ef6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0290774a-bb", "ovs_interfaceid": "0290774a-bbee-4523-b342-ddb24aca4826", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.859 183079 WARNING nova.virt.libvirt.driver [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.866 183079 DEBUG nova.virt.libvirt.host [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.867 183079 DEBUG nova.virt.libvirt.host [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.871 183079 DEBUG nova.virt.libvirt.host [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.872 183079 DEBUG nova.virt.libvirt.host [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.873 183079 DEBUG nova.virt.libvirt.driver [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.873 183079 DEBUG nova.virt.hardware [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.874 183079 DEBUG nova.virt.hardware [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.874 183079 DEBUG nova.virt.hardware [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.875 183079 DEBUG nova.virt.hardware [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.876 183079 DEBUG nova.virt.hardware [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.876 183079 DEBUG nova.virt.hardware [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.876 183079 DEBUG nova.virt.hardware [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.876 183079 DEBUG nova.virt.hardware [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.877 183079 DEBUG nova.virt.hardware [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.877 183079 DEBUG nova.virt.hardware [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.877 183079 DEBUG nova.virt.hardware [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.883 183079 DEBUG nova.virt.libvirt.vif [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:31:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-761314490',display_name='tempest-server-test-761314490',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-761314490',id=54,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI5QyK5QBIiC5rcI23u+Ha4rToriS54oXRHOciJ+8yg9OiIFQ5pHcofppLwzDPzqktD3JMTTskAgQadosoWLVCa44nM7NokRRlJk11u7nt0exfz9e0AepzmOn9wpcPYeVg==',key_name='tempest-keypair-test-1290435476',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='22f75e117e724f9aaadf5b8fd25a6ef6',ramdisk_id='',reservation_id='r-uvhq1w36',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='vi
rtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessSecGroupDualStackDHCPv6StatelessTest-80323303',owner_user_name='tempest-StatelessSecGroupDualStackDHCPv6StatelessTest-80323303-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:31:37Z,user_data=None,user_id='c4621e42483d4f49b9a97f2b7eb886dc',uuid=81396ea9-a9a1-4a21-9808-608e45a7aa03,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0290774a-bbee-4523-b342-ddb24aca4826", "address": "fa:16:3e:60:a0:be", "network": {"id": "2fd77df8-cf00-4afc-b4cf-75b5722c375c", "bridge": "br-int", "label": "tempest-test-network--158385692", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe60:a0be", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22f75e117e724f9aaadf5b8fd25a6ef6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0290774a-bb", "ovs_interfaceid": "0290774a-bbee-4523-b342-ddb24aca4826", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.883 183079 DEBUG nova.network.os_vif_util [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Converting VIF {"id": "0290774a-bbee-4523-b342-ddb24aca4826", "address": "fa:16:3e:60:a0:be", "network": {"id": "2fd77df8-cf00-4afc-b4cf-75b5722c375c", "bridge": "br-int", "label": "tempest-test-network--158385692", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe60:a0be", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22f75e117e724f9aaadf5b8fd25a6ef6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0290774a-bb", "ovs_interfaceid": "0290774a-bbee-4523-b342-ddb24aca4826", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.884 183079 DEBUG nova.network.os_vif_util [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:a0:be,bridge_name='br-int',has_traffic_filtering=True,id=0290774a-bbee-4523-b342-ddb24aca4826,network=Network(2fd77df8-cf00-4afc-b4cf-75b5722c375c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0290774a-bb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.885 183079 DEBUG nova.objects.instance [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 81396ea9-a9a1-4a21-9808-608e45a7aa03 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.899 183079 DEBUG nova.virt.libvirt.driver [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:31:43 compute-0 nova_compute[183075]:   <uuid>81396ea9-a9a1-4a21-9808-608e45a7aa03</uuid>
Jan 22 17:31:43 compute-0 nova_compute[183075]:   <name>instance-00000036</name>
Jan 22 17:31:43 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:31:43 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:31:43 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:31:43 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-761314490</nova:name>
Jan 22 17:31:43 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:31:43</nova:creationTime>
Jan 22 17:31:43 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:31:43 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:31:43 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:31:43 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:31:43 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:31:43 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:31:43 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:31:43 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:31:43 compute-0 nova_compute[183075]:         <nova:user uuid="c4621e42483d4f49b9a97f2b7eb886dc">tempest-StatelessSecGroupDualStackDHCPv6StatelessTest-80323303-project-member</nova:user>
Jan 22 17:31:43 compute-0 nova_compute[183075]:         <nova:project uuid="22f75e117e724f9aaadf5b8fd25a6ef6">tempest-StatelessSecGroupDualStackDHCPv6StatelessTest-80323303</nova:project>
Jan 22 17:31:43 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:31:43 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:31:43 compute-0 nova_compute[183075]:         <nova:port uuid="0290774a-bbee-4523-b342-ddb24aca4826">
Jan 22 17:31:43 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe60:a0be" ipVersion="6"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:31:43 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:31:43 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:31:43 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <system>
Jan 22 17:31:43 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:31:43 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:31:43 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:31:43 compute-0 nova_compute[183075]:       <entry name="serial">81396ea9-a9a1-4a21-9808-608e45a7aa03</entry>
Jan 22 17:31:43 compute-0 nova_compute[183075]:       <entry name="uuid">81396ea9-a9a1-4a21-9808-608e45a7aa03</entry>
Jan 22 17:31:43 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     </system>
Jan 22 17:31:43 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:31:43 compute-0 nova_compute[183075]:   <os>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:   </os>
Jan 22 17:31:43 compute-0 nova_compute[183075]:   <features>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:   </features>
Jan 22 17:31:43 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:31:43 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:31:43 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:31:43 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/81396ea9-a9a1-4a21-9808-608e45a7aa03/disk"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:31:43 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:60:a0:be"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:       <target dev="tap0290774a-bb"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:31:43 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/81396ea9-a9a1-4a21-9808-608e45a7aa03/console.log" append="off"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <video>
Jan 22 17:31:43 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     </video>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:31:43 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:31:43 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:31:43 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:31:43 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:31:43 compute-0 nova_compute[183075]: </domain>
Jan 22 17:31:43 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.900 183079 DEBUG nova.compute.manager [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Preparing to wait for external event network-vif-plugged-0290774a-bbee-4523-b342-ddb24aca4826 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.901 183079 DEBUG oslo_concurrency.lockutils [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Acquiring lock "81396ea9-a9a1-4a21-9808-608e45a7aa03-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.902 183079 DEBUG oslo_concurrency.lockutils [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "81396ea9-a9a1-4a21-9808-608e45a7aa03-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.902 183079 DEBUG oslo_concurrency.lockutils [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "81396ea9-a9a1-4a21-9808-608e45a7aa03-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.903 183079 DEBUG nova.virt.libvirt.vif [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:31:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-761314490',display_name='tempest-server-test-761314490',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-761314490',id=54,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI5QyK5QBIiC5rcI23u+Ha4rToriS54oXRHOciJ+8yg9OiIFQ5pHcofppLwzDPzqktD3JMTTskAgQadosoWLVCa44nM7NokRRlJk11u7nt0exfz9e0AepzmOn9wpcPYeVg==',key_name='tempest-keypair-test-1290435476',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='22f75e117e724f9aaadf5b8fd25a6ef6',ramdisk_id='',reservation_id='r-uvhq1w36',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng
_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessSecGroupDualStackDHCPv6StatelessTest-80323303',owner_user_name='tempest-StatelessSecGroupDualStackDHCPv6StatelessTest-80323303-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:31:37Z,user_data=None,user_id='c4621e42483d4f49b9a97f2b7eb886dc',uuid=81396ea9-a9a1-4a21-9808-608e45a7aa03,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0290774a-bbee-4523-b342-ddb24aca4826", "address": "fa:16:3e:60:a0:be", "network": {"id": "2fd77df8-cf00-4afc-b4cf-75b5722c375c", "bridge": "br-int", "label": "tempest-test-network--158385692", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe60:a0be", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22f75e117e724f9aaadf5b8fd25a6ef6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0290774a-bb", "ovs_interfaceid": "0290774a-bbee-4523-b342-ddb24aca4826", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.904 183079 DEBUG nova.network.os_vif_util [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Converting VIF {"id": "0290774a-bbee-4523-b342-ddb24aca4826", "address": "fa:16:3e:60:a0:be", "network": {"id": "2fd77df8-cf00-4afc-b4cf-75b5722c375c", "bridge": "br-int", "label": "tempest-test-network--158385692", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe60:a0be", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22f75e117e724f9aaadf5b8fd25a6ef6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0290774a-bb", "ovs_interfaceid": "0290774a-bbee-4523-b342-ddb24aca4826", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.905 183079 DEBUG nova.network.os_vif_util [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:a0:be,bridge_name='br-int',has_traffic_filtering=True,id=0290774a-bbee-4523-b342-ddb24aca4826,network=Network(2fd77df8-cf00-4afc-b4cf-75b5722c375c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0290774a-bb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.906 183079 DEBUG os_vif [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:a0:be,bridge_name='br-int',has_traffic_filtering=True,id=0290774a-bbee-4523-b342-ddb24aca4826,network=Network(2fd77df8-cf00-4afc-b4cf-75b5722c375c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0290774a-bb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.907 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.907 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.908 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.910 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.911 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0290774a-bb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.912 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0290774a-bb, col_values=(('external_ids', {'iface-id': '0290774a-bbee-4523-b342-ddb24aca4826', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:60:a0:be', 'vm-uuid': '81396ea9-a9a1-4a21-9808-608e45a7aa03'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.913 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:43 compute-0 NetworkManager[55454]: <info>  [1769103103.9146] manager: (tap0290774a-bb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/252)
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.917 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.922 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.923 183079 INFO os_vif [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:a0:be,bridge_name='br-int',has_traffic_filtering=True,id=0290774a-bbee-4523-b342-ddb24aca4826,network=Network(2fd77df8-cf00-4afc-b4cf-75b5722c375c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0290774a-bb')
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.970 183079 DEBUG nova.virt.libvirt.driver [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:31:43 compute-0 nova_compute[183075]: 2026-01-22 17:31:43.971 183079 DEBUG nova.virt.libvirt.driver [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] No VIF found with MAC fa:16:3e:60:a0:be, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:31:44 compute-0 kernel: tap0290774a-bb: entered promiscuous mode
Jan 22 17:31:44 compute-0 NetworkManager[55454]: <info>  [1769103104.0275] manager: (tap0290774a-bb): new Tun device (/org/freedesktop/NetworkManager/Devices/253)
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.028 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:44 compute-0 ovn_controller[95372]: 2026-01-22T17:31:44Z|00592|binding|INFO|Claiming lport 0290774a-bbee-4523-b342-ddb24aca4826 for this chassis.
Jan 22 17:31:44 compute-0 ovn_controller[95372]: 2026-01-22T17:31:44Z|00593|binding|INFO|0290774a-bbee-4523-b342-ddb24aca4826: Claiming fa:16:3e:60:a0:be 10.100.0.3 2001:db8::f816:3eff:fe60:a0be
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.033 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:44.039 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:a0:be 10.100.0.3 2001:db8::f816:3eff:fe60:a0be'], port_security=['fa:16:3e:60:a0:be 10.100.0.3 2001:db8::f816:3eff:fe60:a0be'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:fe60:a0be/64', 'neutron:device_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2fd77df8-cf00-4afc-b4cf-75b5722c375c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e838e551-4083-4143-b761-54a81d27a6c4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6159b68b-3c7b-43be-9fb9-7f846f3d3eb8, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=0290774a-bbee-4523-b342-ddb24aca4826) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:44.041 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 0290774a-bbee-4523-b342-ddb24aca4826 in datapath 2fd77df8-cf00-4afc-b4cf-75b5722c375c bound to our chassis
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:44.043 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2fd77df8-cf00-4afc-b4cf-75b5722c375c
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:44.054 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e23eefdc-34c5-4b11-8bb6-8ffcfb5fa493]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:44.055 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2fd77df8-c1 in ovnmeta-2fd77df8-cf00-4afc-b4cf-75b5722c375c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:44.057 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2fd77df8-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:44.057 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f0c055d5-fa13-406a-9d14-9a02184404a3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:44.057 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[bcebbd03-1508-4f5a-a256-fbe52d682397]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:31:44 compute-0 systemd-udevd[233329]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:44.071 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[3350d9ed-11cd-4a0e-95f1-3999711dbac2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:31:44 compute-0 NetworkManager[55454]: <info>  [1769103104.0793] device (tap0290774a-bb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:31:44 compute-0 NetworkManager[55454]: <info>  [1769103104.0804] device (tap0290774a-bb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:31:44 compute-0 systemd-machined[154382]: New machine qemu-54-instance-00000036.
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.084 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:44.087 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[08ac93f5-7109-4c31-a877-23e568f1ac18]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:31:44 compute-0 ovn_controller[95372]: 2026-01-22T17:31:44Z|00594|binding|INFO|Setting lport 0290774a-bbee-4523-b342-ddb24aca4826 ovn-installed in OVS
Jan 22 17:31:44 compute-0 ovn_controller[95372]: 2026-01-22T17:31:44Z|00595|binding|INFO|Setting lport 0290774a-bbee-4523-b342-ddb24aca4826 up in Southbound
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.089 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:44 compute-0 systemd[1]: Started Virtual Machine qemu-54-instance-00000036.
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:44.116 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[56d9f179-cde6-49df-bbd2-cf775d5b25f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:31:44 compute-0 NetworkManager[55454]: <info>  [1769103104.1225] manager: (tap2fd77df8-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/254)
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:44.121 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[bdb85e6d-e8f4-4ad2-9ab3-a3766a8dc0b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:44.152 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[ccd719d3-9edc-4dad-83fb-d70cce9a3abd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:44.155 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[6fdd942c-2552-4d06-ab56-a3c1006c8afb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:31:44 compute-0 NetworkManager[55454]: <info>  [1769103104.1796] device (tap2fd77df8-c0): carrier: link connected
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:44.185 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[c4e43e29-9706-4e43-98d9-aa0c3924044c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:44.202 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[547bb6d7-65a2-4dc7-b7fa-144e5318df57]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2fd77df8-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e7:bc:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 170], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 546788, 'reachable_time': 41475, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233362, 'error': None, 'target': 'ovnmeta-2fd77df8-cf00-4afc-b4cf-75b5722c375c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:44.217 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3babe2bb-207d-43a2-9e74-811dd3dac69d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee7:bc27'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 546788, 'tstamp': 546788}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233363, 'error': None, 'target': 'ovnmeta-2fd77df8-cf00-4afc-b4cf-75b5722c375c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:44.235 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4a1c2452-8595-4e86-b809-5672494719ee]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2fd77df8-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e7:bc:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 170], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 546788, 'reachable_time': 41475, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233364, 'error': None, 'target': 'ovnmeta-2fd77df8-cf00-4afc-b4cf-75b5722c375c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:44.263 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b8ffc8dd-1b7e-413a-a11b-34aea6c84dc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:44.320 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e1fbb2f2-dce4-4c4e-b1eb-b5784ad83160]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:44.322 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2fd77df8-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:44.322 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:44.323 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2fd77df8-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:31:44 compute-0 NetworkManager[55454]: <info>  [1769103104.3267] manager: (tap2fd77df8-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/255)
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.326 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:44 compute-0 kernel: tap2fd77df8-c0: entered promiscuous mode
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.329 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:44.331 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2fd77df8-c0, col_values=(('external_ids', {'iface-id': '0297f784-1a41-4744-b018-f503dfa93754'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.332 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:44 compute-0 ovn_controller[95372]: 2026-01-22T17:31:44Z|00596|binding|INFO|Releasing lport 0297f784-1a41-4744-b018-f503dfa93754 from this chassis (sb_readonly=0)
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.333 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:44.334 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2fd77df8-cf00-4afc-b4cf-75b5722c375c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2fd77df8-cf00-4afc-b4cf-75b5722c375c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:44.335 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0ddf53d2-a78a-4be1-9a4b-88c6378ae4ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:44.336 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-2fd77df8-cf00-4afc-b4cf-75b5722c375c
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/2fd77df8-cf00-4afc-b4cf-75b5722c375c.pid.haproxy
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 2fd77df8-cf00-4afc-b4cf-75b5722c375c
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:31:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:31:44.338 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2fd77df8-cf00-4afc-b4cf-75b5722c375c', 'env', 'PROCESS_TAG=haproxy-2fd77df8-cf00-4afc-b4cf-75b5722c375c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2fd77df8-cf00-4afc-b4cf-75b5722c375c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.343 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.417 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103104.416888, 81396ea9-a9a1-4a21-9808-608e45a7aa03 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.417 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] VM Started (Lifecycle Event)
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.439 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.442 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103104.4171555, 81396ea9-a9a1-4a21-9808-608e45a7aa03 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.442 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] VM Paused (Lifecycle Event)
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.462 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.465 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.488 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:31:44 compute-0 podman[233403]: 2026-01-22 17:31:44.681719256 +0000 UTC m=+0.025764687 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.776 183079 DEBUG nova.compute.manager [req-7e910bf4-6a0f-4ff9-8102-0a388c0cea6f req-13f3e06e-edc3-4927-b2b4-62c0b173b898 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Received event network-vif-plugged-0290774a-bbee-4523-b342-ddb24aca4826 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.778 183079 DEBUG oslo_concurrency.lockutils [req-7e910bf4-6a0f-4ff9-8102-0a388c0cea6f req-13f3e06e-edc3-4927-b2b4-62c0b173b898 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "81396ea9-a9a1-4a21-9808-608e45a7aa03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.778 183079 DEBUG oslo_concurrency.lockutils [req-7e910bf4-6a0f-4ff9-8102-0a388c0cea6f req-13f3e06e-edc3-4927-b2b4-62c0b173b898 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "81396ea9-a9a1-4a21-9808-608e45a7aa03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.779 183079 DEBUG oslo_concurrency.lockutils [req-7e910bf4-6a0f-4ff9-8102-0a388c0cea6f req-13f3e06e-edc3-4927-b2b4-62c0b173b898 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "81396ea9-a9a1-4a21-9808-608e45a7aa03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.779 183079 DEBUG nova.compute.manager [req-7e910bf4-6a0f-4ff9-8102-0a388c0cea6f req-13f3e06e-edc3-4927-b2b4-62c0b173b898 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Processing event network-vif-plugged-0290774a-bbee-4523-b342-ddb24aca4826 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.780 183079 DEBUG nova.compute.manager [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.784 183079 DEBUG nova.virt.libvirt.driver [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.785 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103104.7843046, 81396ea9-a9a1-4a21-9808-608e45a7aa03 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.785 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] VM Resumed (Lifecycle Event)
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.789 183079 INFO nova.virt.libvirt.driver [-] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Instance spawned successfully.
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.789 183079 DEBUG nova.virt.libvirt.driver [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:31:44 compute-0 podman[233403]: 2026-01-22 17:31:44.801234484 +0000 UTC m=+0.145279895 container create 0398fca3db2a42f4180be2b0d9abe8ff7436042cbc3c1e6d2a2aaf6a5516a2d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2fd77df8-cf00-4afc-b4cf-75b5722c375c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.822 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.826 183079 DEBUG nova.virt.libvirt.driver [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.826 183079 DEBUG nova.virt.libvirt.driver [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.827 183079 DEBUG nova.virt.libvirt.driver [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.827 183079 DEBUG nova.virt.libvirt.driver [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.827 183079 DEBUG nova.virt.libvirt.driver [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.828 183079 DEBUG nova.virt.libvirt.driver [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.831 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:31:44 compute-0 systemd[1]: Started libpod-conmon-0398fca3db2a42f4180be2b0d9abe8ff7436042cbc3c1e6d2a2aaf6a5516a2d1.scope.
Jan 22 17:31:44 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:31:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/effe2baf214300b88948e861438c671909962aa2332964dd7bf3065cfe4fe4c6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:31:44 compute-0 podman[233403]: 2026-01-22 17:31:44.866882437 +0000 UTC m=+0.210927848 container init 0398fca3db2a42f4180be2b0d9abe8ff7436042cbc3c1e6d2a2aaf6a5516a2d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2fd77df8-cf00-4afc-b4cf-75b5722c375c, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 17:31:44 compute-0 podman[233403]: 2026-01-22 17:31:44.87192342 +0000 UTC m=+0.215968831 container start 0398fca3db2a42f4180be2b0d9abe8ff7436042cbc3c1e6d2a2aaf6a5516a2d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2fd77df8-cf00-4afc-b4cf-75b5722c375c, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.875 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:31:44 compute-0 neutron-haproxy-ovnmeta-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233418]: [NOTICE]   (233422) : New worker (233424) forked
Jan 22 17:31:44 compute-0 neutron-haproxy-ovnmeta-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233418]: [NOTICE]   (233422) : Loading success.
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.897 183079 INFO nova.compute.manager [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Took 7.83 seconds to spawn the instance on the hypervisor.
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.898 183079 DEBUG nova.compute.manager [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:31:44 compute-0 nova_compute[183075]: 2026-01-22 17:31:44.957 183079 INFO nova.compute.manager [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Took 8.24 seconds to build instance.
Jan 22 17:31:45 compute-0 nova_compute[183075]: 2026-01-22 17:31:45.152 183079 DEBUG oslo_concurrency.lockutils [None req-ed16907c-7c8b-4f2f-b2af-ec3b57b97b9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "81396ea9-a9a1-4a21-9808-608e45a7aa03" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.531s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:31:46 compute-0 podman[233433]: 2026-01-22 17:31:46.345435835 +0000 UTC m=+0.058143677 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 17:31:46 compute-0 nova_compute[183075]: 2026-01-22 17:31:46.518 183079 INFO nova.compute.manager [None req-f9bc4842-b632-4541-ad59-3da84bbd9329 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Get console output
Jan 22 17:31:46 compute-0 nova_compute[183075]: 2026-01-22 17:31:46.868 183079 DEBUG nova.compute.manager [req-33eb0c87-47fe-4b43-93ca-4b1fd19bb49d req-ff83112b-9be2-4577-b721-3400755bd4d6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Received event network-vif-plugged-0290774a-bbee-4523-b342-ddb24aca4826 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:31:46 compute-0 nova_compute[183075]: 2026-01-22 17:31:46.868 183079 DEBUG oslo_concurrency.lockutils [req-33eb0c87-47fe-4b43-93ca-4b1fd19bb49d req-ff83112b-9be2-4577-b721-3400755bd4d6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "81396ea9-a9a1-4a21-9808-608e45a7aa03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:31:46 compute-0 nova_compute[183075]: 2026-01-22 17:31:46.869 183079 DEBUG oslo_concurrency.lockutils [req-33eb0c87-47fe-4b43-93ca-4b1fd19bb49d req-ff83112b-9be2-4577-b721-3400755bd4d6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "81396ea9-a9a1-4a21-9808-608e45a7aa03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:31:46 compute-0 nova_compute[183075]: 2026-01-22 17:31:46.869 183079 DEBUG oslo_concurrency.lockutils [req-33eb0c87-47fe-4b43-93ca-4b1fd19bb49d req-ff83112b-9be2-4577-b721-3400755bd4d6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "81396ea9-a9a1-4a21-9808-608e45a7aa03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:31:46 compute-0 nova_compute[183075]: 2026-01-22 17:31:46.869 183079 DEBUG nova.compute.manager [req-33eb0c87-47fe-4b43-93ca-4b1fd19bb49d req-ff83112b-9be2-4577-b721-3400755bd4d6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] No waiting events found dispatching network-vif-plugged-0290774a-bbee-4523-b342-ddb24aca4826 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:31:46 compute-0 nova_compute[183075]: 2026-01-22 17:31:46.869 183079 WARNING nova.compute.manager [req-33eb0c87-47fe-4b43-93ca-4b1fd19bb49d req-ff83112b-9be2-4577-b721-3400755bd4d6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Received unexpected event network-vif-plugged-0290774a-bbee-4523-b342-ddb24aca4826 for instance with vm_state active and task_state None.
Jan 22 17:31:47 compute-0 nova_compute[183075]: 2026-01-22 17:31:47.154 183079 INFO nova.compute.manager [None req-8dd5eb7c-8458-40f4-b7fc-7afffbfed6b7 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Get console output
Jan 22 17:31:47 compute-0 nova_compute[183075]: 2026-01-22 17:31:47.160 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:31:47 compute-0 nova_compute[183075]: 2026-01-22 17:31:47.649 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:47 compute-0 nova_compute[183075]: 2026-01-22 17:31:47.801 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:31:48 compute-0 nova_compute[183075]: 2026-01-22 17:31:48.914 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:50 compute-0 podman[233457]: 2026-01-22 17:31:50.353899365 +0000 UTC m=+0.063486238 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:31:51 compute-0 nova_compute[183075]: 2026-01-22 17:31:51.648 183079 INFO nova.compute.manager [None req-77456c6e-87e5-4cb4-ba82-c20282ccdc81 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Get console output
Jan 22 17:31:52 compute-0 nova_compute[183075]: 2026-01-22 17:31:52.652 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:52 compute-0 nova_compute[183075]: 2026-01-22 17:31:52.809 183079 INFO nova.compute.manager [None req-dfd3667f-26f0-4a2a-a034-66a03dd3814c 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Get console output
Jan 22 17:31:52 compute-0 nova_compute[183075]: 2026-01-22 17:31:52.816 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:31:53 compute-0 nova_compute[183075]: 2026-01-22 17:31:53.917 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.458 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b0949fde-940d-495c-bdb0-e6c996b0274f', 'name': 'luke', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000035', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'hostId': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.462 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'name': 'tempest-server-test-761314490', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000036', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'hostId': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.462 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.466 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for b0949fde-940d-495c-bdb0-e6c996b0274f / tap6aacbedb-69 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.467 12 DEBUG ceilometer.compute.pollsters [-] b0949fde-940d-495c-bdb0-e6c996b0274f/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.470 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 81396ea9-a9a1-4a21-9808-608e45a7aa03 / tap0290774a-bb inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.470 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bab468fd-87ad-45c4-91a3-c4b7565cc1c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'instance-00000035-b0949fde-940d-495c-bdb0-e6c996b0274f-tap6aacbedb-69', 'timestamp': '2026-01-22T17:31:55.463012', 'resource_metadata': {'display_name': 'luke', 'name': 'tap6aacbedb-69', 'instance_id': 'b0949fde-940d-495c-bdb0-e6c996b0274f', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:35:c7:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6aacbedb-69'}, 'message_id': '3f309166-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.223466343, 'message_signature': '3b9bd8a1822344c48b5ff01dce77deb2cde32e35f8bcb285c0a363549d725ed8'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 'instance-00000036-81396ea9-a9a1-4a21-9808-608e45a7aa03-tap0290774a-bb', 'timestamp': '2026-01-22T17:31:55.463012', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'tap0290774a-bb', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:a0:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0290774a-bb'}, 'message_id': '3f3103c6-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.228247139, 'message_signature': '3f66642bd20f5112c925126e2603baf85c189b7b588f59d9a1cb34898b4a7ce2'}]}, 'timestamp': '2026-01-22 17:31:55.470717', '_unique_id': '34a1a12ccdf041b88f29c4da6b420a20'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.471 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.472 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.499 12 DEBUG ceilometer.compute.pollsters [-] b0949fde-940d-495c-bdb0-e6c996b0274f/memory.usage volume: 43.47265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.514 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/memory.usage volume: 40.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4da38138-45fb-444f-8e2b-831e17762c48', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.47265625, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'b0949fde-940d-495c-bdb0-e6c996b0274f', 'timestamp': '2026-01-22T17:31:55.473048', 'resource_metadata': {'display_name': 'luke', 'name': 'instance-00000035', 'instance_id': 'b0949fde-940d-495c-bdb0-e6c996b0274f', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '3f358982-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.260113325, 'message_signature': '37027510ac933df483e0eabb44fa215f5e13f4bb05bdf74fd58e752225ccf787'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.38671875, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'timestamp': 
'2026-01-22T17:31:55.473048', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'instance-00000036', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '3f37d52a-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.275190541, 'message_signature': '507cf3f7d61db2de45c13c2867bd1f740663438bd99066bb7080c0503614d6b1'}]}, 'timestamp': '2026-01-22 17:31:55.515405', '_unique_id': '45b80e78a97c4334a39b42aea836a10a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.516 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.517 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.517 12 DEBUG ceilometer.compute.pollsters [-] b0949fde-940d-495c-bdb0-e6c996b0274f/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.517 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd3f55bdd-b45c-4791-bcb0-d9556d2e799d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'instance-00000035-b0949fde-940d-495c-bdb0-e6c996b0274f-tap6aacbedb-69', 'timestamp': '2026-01-22T17:31:55.517256', 'resource_metadata': {'display_name': 'luke', 'name': 'tap6aacbedb-69', 'instance_id': 'b0949fde-940d-495c-bdb0-e6c996b0274f', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:35:c7:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6aacbedb-69'}, 'message_id': '3f382ac0-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.223466343, 'message_signature': 'fc8b876e4ba800bc705d9c7d8496d599ece582bdaf6516d1b9c2b118056f3381'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 
'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 'instance-00000036-81396ea9-a9a1-4a21-9808-608e45a7aa03-tap0290774a-bb', 'timestamp': '2026-01-22T17:31:55.517256', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'tap0290774a-bb', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:a0:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0290774a-bb'}, 'message_id': '3f3835ce-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.228247139, 'message_signature': '288414ea7af45804f9d4476466a03a02efd42db0066c5888d2dd1c3e9ffca721'}]}, 'timestamp': '2026-01-22 17:31:55.517798', '_unique_id': 'd761d89649794cb7b8bd6a4dcb8ec92b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.518 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.526 12 DEBUG ceilometer.compute.pollsters [-] b0949fde-940d-495c-bdb0-e6c996b0274f/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.531 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f21af679-71a1-44c6-9247-0c1d06d2c71d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'b0949fde-940d-495c-bdb0-e6c996b0274f-vda', 'timestamp': '2026-01-22T17:31:55.518941', 'resource_metadata': {'display_name': 'luke', 'name': 'instance-00000035', 'instance_id': 'b0949fde-940d-495c-bdb0-e6c996b0274f', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3f3997ca-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.279359001, 'message_signature': 'fa6bcc614155bae1097e8892e275a9ec8590b989e4cfa23e42b29ccec0e8d040'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03-vda', 
'timestamp': '2026-01-22T17:31:55.518941', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'instance-00000036', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3f3a4f76-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.287247518, 'message_signature': 'a6d148d6068d62075fbf16fd672582861ee1c8f21f938f3f499430e59ede96b9'}]}, 'timestamp': '2026-01-22 17:31:55.531554', '_unique_id': 'ed30855caf7f4fa5985581303447aa13'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 DEBUG ceilometer.compute.pollsters [-] b0949fde-940d-495c-bdb0-e6c996b0274f/network.outgoing.bytes volume: 10936 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.532 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eaa8e0aa-e183-4d99-af89-cfbd1a0b2b51', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10936, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'instance-00000035-b0949fde-940d-495c-bdb0-e6c996b0274f-tap6aacbedb-69', 'timestamp': '2026-01-22T17:31:55.532693', 'resource_metadata': {'display_name': 'luke', 'name': 'tap6aacbedb-69', 'instance_id': 'b0949fde-940d-495c-bdb0-e6c996b0274f', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:35:c7:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6aacbedb-69'}, 'message_id': '3f3a8464-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.223466343, 'message_signature': '97251f64dbe2e87800dada221db11670917c75c239ef01dadb011c27e19bc550'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 'instance-00000036-81396ea9-a9a1-4a21-9808-608e45a7aa03-tap0290774a-bb', 'timestamp': '2026-01-22T17:31:55.532693', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'tap0290774a-bb', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:a0:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0290774a-bb'}, 'message_id': '3f3a8c66-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.228247139, 'message_signature': '1097d0e1d9006c64d3c8aa80f877f956b652c55a006f8e2a45c22d7b2d862b8d'}]}, 'timestamp': '2026-01-22 17:31:55.533115', '_unique_id': '031fce2bc54d43c4a6fbd243091e1395'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.533 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.534 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.549 12 DEBUG ceilometer.compute.pollsters [-] b0949fde-940d-495c-bdb0-e6c996b0274f/disk.device.write.latency volume: 14935715334 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.567 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e50bda05-c7d9-4149-83c6-f7693ed2add6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14935715334, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'b0949fde-940d-495c-bdb0-e6c996b0274f-vda', 'timestamp': '2026-01-22T17:31:55.534185', 'resource_metadata': {'display_name': 'luke', 'name': 'instance-00000035', 'instance_id': 'b0949fde-940d-495c-bdb0-e6c996b0274f', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3f3d23b8-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.294595271, 'message_signature': 'a4bcd8cd95b0958dbcf280fcea447c72d6e50642cb03404a58590dd6505c4fd5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03-vda', 'timestamp': '2026-01-22T17:31:55.534185', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'instance-00000036', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3f3fdb30-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.310506128, 'message_signature': '1b7711676ad809039efd369da6d463e841f98c5dfe4732c6de93f4156fb80564'}]}, 'timestamp': '2026-01-22 17:31:55.567960', '_unique_id': '5045d996e5a54faeae36f7bfe319fc9f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.568 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.569 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.569 12 DEBUG ceilometer.compute.pollsters [-] b0949fde-940d-495c-bdb0-e6c996b0274f/disk.device.read.latency volume: 224731544 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.569 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/disk.device.read.latency volume: 192508732 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b178cd77-7bda-4d69-9515-060682809e86', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 224731544, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'b0949fde-940d-495c-bdb0-e6c996b0274f-vda', 'timestamp': '2026-01-22T17:31:55.569707', 'resource_metadata': {'display_name': 'luke', 'name': 'instance-00000035', 'instance_id': 'b0949fde-940d-495c-bdb0-e6c996b0274f', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3f402bc6-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.294595271, 'message_signature': '19fefb1c1d225010db23b8691d775d6b7f87e8a0842aaf6ec9214c2a21520c5e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 192508732, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03-vda', 'timestamp': '2026-01-22T17:31:55.569707', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'instance-00000036', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3f40342c-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.310506128, 'message_signature': 'd4d913ba608edc37183c4e23255e8d7cb1baf1163d73a041c63ec58b833df0fa'}]}, 'timestamp': '2026-01-22 17:31:55.570171', '_unique_id': '7ba816469ca246948c1f7710b9167779'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.570 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.571 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.571 12 DEBUG ceilometer.compute.pollsters [-] b0949fde-940d-495c-bdb0-e6c996b0274f/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.571 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0641c084-93e9-44cf-ae63-1a1c5e05d5b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'instance-00000035-b0949fde-940d-495c-bdb0-e6c996b0274f-tap6aacbedb-69', 'timestamp': '2026-01-22T17:31:55.571558', 'resource_metadata': {'display_name': 'luke', 'name': 'tap6aacbedb-69', 'instance_id': 'b0949fde-940d-495c-bdb0-e6c996b0274f', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:35:c7:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6aacbedb-69'}, 'message_id': '3f4073c4-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.223466343, 'message_signature': '46c1f5b9910cc5eed25dc2fab693e8089413569d505c745310ffc8b03072d244'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 'instance-00000036-81396ea9-a9a1-4a21-9808-608e45a7aa03-tap0290774a-bb', 'timestamp': '2026-01-22T17:31:55.571558', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'tap0290774a-bb', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:a0:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0290774a-bb'}, 'message_id': '3f407c0c-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.228247139, 'message_signature': 'b448224102bef5c5fd47b912f2d0029b45c4dbefa376ef5cd04242d975cbced5'}]}, 'timestamp': '2026-01-22 17:31:55.572052', '_unique_id': '39db019d84fc4a4e820056f1b8e9d495'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.572 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.573 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.573 12 DEBUG ceilometer.compute.pollsters [-] b0949fde-940d-495c-bdb0-e6c996b0274f/cpu volume: 12150000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.573 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/cpu volume: 9990000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c43bf688-f7de-41b9-a261-1fd6fae9bee6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12150000000, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'b0949fde-940d-495c-bdb0-e6c996b0274f', 'timestamp': '2026-01-22T17:31:55.573448', 'resource_metadata': {'display_name': 'luke', 'name': 'instance-00000035', 'instance_id': 'b0949fde-940d-495c-bdb0-e6c996b0274f', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '3f40bcee-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.260113325, 'message_signature': '2bcb059c405fc491f568b01a7710d5ec3e9556567f5810660feacee920377430'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9990000000, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'timestamp': '2026-01-22T17:31:55.573448', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'instance-00000036', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '3f40c6f8-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.275190541, 'message_signature': '6e07c29c40210f261015101dc7dc76c3d820dae87571a952acc827c4d9432e76'}]}, 'timestamp': '2026-01-22 17:31:55.573931', '_unique_id': 'e224bb0fe3764ecd8225d8b7d78c4623'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.574 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.575 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.575 12 DEBUG ceilometer.compute.pollsters [-] b0949fde-940d-495c-bdb0-e6c996b0274f/disk.device.write.bytes volume: 72982528 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.575 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '937d77f6-5dc8-4b90-8c6a-0e83a16bd949', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72982528, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'b0949fde-940d-495c-bdb0-e6c996b0274f-vda', 'timestamp': '2026-01-22T17:31:55.575217', 'resource_metadata': {'display_name': 'luke', 'name': 'instance-00000035', 'instance_id': 'b0949fde-940d-495c-bdb0-e6c996b0274f', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3f410168-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.294595271, 'message_signature': '1d53122927112f1e8ea2d1d3e08db3713663a8d11f466a9733236c19bde37cb0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 
'81396ea9-a9a1-4a21-9808-608e45a7aa03-vda', 'timestamp': '2026-01-22T17:31:55.575217', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'instance-00000036', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3f410906-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.310506128, 'message_signature': 'd9d05e306fbfc0d023d84f3774cc75d1c65be8c86a0312ab9f54d37ec44ddb7c'}]}, 'timestamp': '2026-01-22 17:31:55.575619', '_unique_id': '2875f14e199e4d449e620debfc158f7a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.576 12 DEBUG ceilometer.compute.pollsters [-] b0949fde-940d-495c-bdb0-e6c996b0274f/network.outgoing.packets volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '74cfe8dc-ee57-419e-a0f9-a3dc510fe21b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 124, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'instance-00000035-b0949fde-940d-495c-bdb0-e6c996b0274f-tap6aacbedb-69', 'timestamp': '2026-01-22T17:31:55.576729', 'resource_metadata': {'display_name': 'luke', 'name': 'tap6aacbedb-69', 'instance_id': 'b0949fde-940d-495c-bdb0-e6c996b0274f', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:35:c7:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6aacbedb-69'}, 'message_id': '3f413d9a-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.223466343, 'message_signature': '8242cd2ce0cb4b8a4a842bda5460b9d8a5d0aa39c5acb8829c6b99edf53faf96'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 
'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 'instance-00000036-81396ea9-a9a1-4a21-9808-608e45a7aa03-tap0290774a-bb', 'timestamp': '2026-01-22T17:31:55.576729', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'tap0290774a-bb', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:a0:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0290774a-bb'}, 'message_id': '3f414704-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.228247139, 'message_signature': '17d89cbdeb23952370e41c27a7a4ea9de536d3c8ff229dd1b3010cc6776804cd'}]}, 'timestamp': '2026-01-22 17:31:55.577250', '_unique_id': '33f75a115ea842db9beac0cc8532a970'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.577 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.578 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.578 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.578 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: luke>, <NovaLikeServer: tempest-server-test-761314490>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: luke>, <NovaLikeServer: tempest-server-test-761314490>]
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.578 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.578 12 DEBUG ceilometer.compute.pollsters [-] b0949fde-940d-495c-bdb0-e6c996b0274f/network.incoming.bytes volume: 7133 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.578 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '710a0203-4498-4b1d-a854-79c214d24caa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7133, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'instance-00000035-b0949fde-940d-495c-bdb0-e6c996b0274f-tap6aacbedb-69', 'timestamp': '2026-01-22T17:31:55.578680', 'resource_metadata': {'display_name': 'luke', 'name': 'tap6aacbedb-69', 'instance_id': 'b0949fde-940d-495c-bdb0-e6c996b0274f', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:35:c7:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6aacbedb-69'}, 'message_id': '3f4188a4-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.223466343, 'message_signature': '646672ba1738d72e1bb75d530200fc5d405b63b83fccc25d19c0b83226b9ba19'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': 
None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 'instance-00000036-81396ea9-a9a1-4a21-9808-608e45a7aa03-tap0290774a-bb', 'timestamp': '2026-01-22T17:31:55.578680', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'tap0290774a-bb', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:a0:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0290774a-bb'}, 'message_id': '3f4190ec-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.228247139, 'message_signature': '6b9dfecfa7019d08ca78238f9355bab461b50f52b3136caf8da38664542bac80'}]}, 'timestamp': '2026-01-22 17:31:55.579121', '_unique_id': '89244ba361a24777b5be67fc212a4bd1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.579 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.580 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.580 12 DEBUG ceilometer.compute.pollsters [-] b0949fde-940d-495c-bdb0-e6c996b0274f/disk.device.allocation volume: 30220288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.580 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ae9a414-b915-4370-974d-086484c78921', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30220288, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'b0949fde-940d-495c-bdb0-e6c996b0274f-vda', 'timestamp': '2026-01-22T17:31:55.580177', 'resource_metadata': {'display_name': 'luke', 'name': 'instance-00000035', 'instance_id': 'b0949fde-940d-495c-bdb0-e6c996b0274f', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3f41c2f6-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.279359001, 'message_signature': '904f4a3c7847be7cef81e00b83eaafed723efc7b12863bbef255ff1063a57b9e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03-vda', 
'timestamp': '2026-01-22T17:31:55.580177', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'instance-00000036', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3f41ca6c-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.287247518, 'message_signature': '1828cd9b686894bf42b891c4d526d8c0ef78e81c09b4a9da647c658490c1e9e2'}]}, 'timestamp': '2026-01-22 17:31:55.580567', '_unique_id': '980680436ea248b2b68c68710ce214d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 DEBUG ceilometer.compute.pollsters [-] b0949fde-940d-495c-bdb0-e6c996b0274f/disk.device.read.requests volume: 1112 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.581 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/disk.device.read.requests volume: 834 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b4edac8-5fd0-49dd-a3ea-40f2a8440973', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1112, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'b0949fde-940d-495c-bdb0-e6c996b0274f-vda', 'timestamp': '2026-01-22T17:31:55.581606', 'resource_metadata': {'display_name': 'luke', 'name': 'instance-00000035', 'instance_id': 'b0949fde-940d-495c-bdb0-e6c996b0274f', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3f41fbea-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.294595271, 'message_signature': '5a143ddbe6dd27ab94ac2e849f2e2e88d5a2b13d0ce1b5b2d94c25ecf56ffc0e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 834, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 
'81396ea9-a9a1-4a21-9808-608e45a7aa03-vda', 'timestamp': '2026-01-22T17:31:55.581606', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'instance-00000036', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3f420360-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.310506128, 'message_signature': '0ea02000c30d9257541cea042638a92e20410bc4e098e441b641d1ddbfb4bd40'}]}, 'timestamp': '2026-01-22 17:31:55.582026', '_unique_id': 'e4dae8a9e21e430798823ca0c73acdd0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.582 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 DEBUG ceilometer.compute.pollsters [-] b0949fde-940d-495c-bdb0-e6c996b0274f/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b1f52e64-6921-4954-ac7c-32fb4478ac45', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'instance-00000035-b0949fde-940d-495c-bdb0-e6c996b0274f-tap6aacbedb-69', 'timestamp': '2026-01-22T17:31:55.583105', 'resource_metadata': {'display_name': 'luke', 'name': 'tap6aacbedb-69', 'instance_id': 'b0949fde-940d-495c-bdb0-e6c996b0274f', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:35:c7:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6aacbedb-69'}, 'message_id': '3f423588-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.223466343, 'message_signature': '4ac5262511d76ed5512ee051c8740184e984301ad4c869ba0111c7aa9da96cd0'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 'instance-00000036-81396ea9-a9a1-4a21-9808-608e45a7aa03-tap0290774a-bb', 'timestamp': '2026-01-22T17:31:55.583105', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'tap0290774a-bb', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:a0:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0290774a-bb'}, 'message_id': '3f423d4e-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.228247139, 'message_signature': '40dcd4ebac74e2469fbc4eb4ac4476cc6136c7be5513048c5e31b3203bcdd747'}]}, 'timestamp': '2026-01-22 17:31:55.583517', '_unique_id': 'c497c5a6ed80480c900028689ae519cc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.583 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.584 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.584 12 DEBUG ceilometer.compute.pollsters [-] b0949fde-940d-495c-bdb0-e6c996b0274f/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.584 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '85824c6c-b3e7-444b-aca6-21553187331d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'b0949fde-940d-495c-bdb0-e6c996b0274f-vda', 'timestamp': '2026-01-22T17:31:55.584711', 'resource_metadata': {'display_name': 'luke', 'name': 'instance-00000035', 'instance_id': 'b0949fde-940d-495c-bdb0-e6c996b0274f', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3f427480-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.279359001, 'message_signature': 'bd64914422bbe5cc5e0ab589befada6c0c8974443e5af4ee197c9bb8a0791bef'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 
'81396ea9-a9a1-4a21-9808-608e45a7aa03-vda', 'timestamp': '2026-01-22T17:31:55.584711', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'instance-00000036', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3f427c1e-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.287247518, 'message_signature': '2039bf29445374e23be281e96b082cdc680d05f9863eafe325c016749158d2d2'}]}, 'timestamp': '2026-01-22 17:31:55.585120', '_unique_id': '9a6ea8631a564f97baf9ae3288f9fba0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.585 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.586 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.586 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.586 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: luke>, <NovaLikeServer: tempest-server-test-761314490>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: luke>, <NovaLikeServer: tempest-server-test-761314490>]
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.586 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.586 12 DEBUG ceilometer.compute.pollsters [-] b0949fde-940d-495c-bdb0-e6c996b0274f/disk.device.read.bytes volume: 30046720 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.586 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/disk.device.read.bytes volume: 25333248 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bb4adf8a-1ddd-4086-b7ce-bc0258a248fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30046720, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'b0949fde-940d-495c-bdb0-e6c996b0274f-vda', 'timestamp': '2026-01-22T17:31:55.586453', 'resource_metadata': {'display_name': 'luke', 'name': 'instance-00000035', 'instance_id': 'b0949fde-940d-495c-bdb0-e6c996b0274f', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3f42b828-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.294595271, 'message_signature': 'a10037d53d3eb63dba694d917b6d1aab58e776c1c0296c428dbaae317aa45b0e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 25333248, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 
'81396ea9-a9a1-4a21-9808-608e45a7aa03-vda', 'timestamp': '2026-01-22T17:31:55.586453', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'instance-00000036', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3f42c0e8-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.310506128, 'message_signature': 'e52426363483db8369f20245ef56cbaa37cd2ba076ba6c6939758c8cccc6ac84'}]}, 'timestamp': '2026-01-22 17:31:55.586879', '_unique_id': '79f6c02134834ebe8a340029daa928b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.587 12 DEBUG ceilometer.compute.pollsters [-] b0949fde-940d-495c-bdb0-e6c996b0274f/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e7256eb9-4f8b-4c62-8837-51879e4155fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'instance-00000035-b0949fde-940d-495c-bdb0-e6c996b0274f-tap6aacbedb-69', 'timestamp': '2026-01-22T17:31:55.587921', 'resource_metadata': {'display_name': 'luke', 'name': 'tap6aacbedb-69', 'instance_id': 'b0949fde-940d-495c-bdb0-e6c996b0274f', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:35:c7:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6aacbedb-69'}, 'message_id': '3f42f194-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.223466343, 'message_signature': '95fd5f102ee64f5a391b424ec81adabc3be5e688921a25bbb4430f9400e67631'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 'instance-00000036-81396ea9-a9a1-4a21-9808-608e45a7aa03-tap0290774a-bb', 'timestamp': '2026-01-22T17:31:55.587921', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'tap0290774a-bb', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:a0:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0290774a-bb'}, 'message_id': '3f42f95a-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.228247139, 'message_signature': '0e5c5b047f0862fc783ceb64b1c1af7ea1cbb2a386520e111e8189ad813f7242'}]}, 'timestamp': '2026-01-22 17:31:55.588331', '_unique_id': '744ad6be135540d1b9e203b78514d978'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.588 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.589 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.589 12 DEBUG ceilometer.compute.pollsters [-] b0949fde-940d-495c-bdb0-e6c996b0274f/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.589 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a355ee8a-7ea5-4440-8622-0bceb5fb3bb6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'instance-00000035-b0949fde-940d-495c-bdb0-e6c996b0274f-tap6aacbedb-69', 'timestamp': '2026-01-22T17:31:55.589524', 'resource_metadata': {'display_name': 'luke', 'name': 'tap6aacbedb-69', 'instance_id': 'b0949fde-940d-495c-bdb0-e6c996b0274f', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:35:c7:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6aacbedb-69'}, 'message_id': '3f433276-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.223466343, 'message_signature': 'd2fa084c8323d796d746a1b43601060762fec289366cb3b52093e4ca1f6acf80'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 'instance-00000036-81396ea9-a9a1-4a21-9808-608e45a7aa03-tap0290774a-bb', 'timestamp': '2026-01-22T17:31:55.589524', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'tap0290774a-bb', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:a0:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0290774a-bb'}, 'message_id': '3f433d5c-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.228247139, 'message_signature': '3da4ccd0e06e42e57ea06bb0d11277c88a79b38b57aaed8d50b3af1b18a61d86'}]}, 'timestamp': '2026-01-22 17:31:55.590119', '_unique_id': '51ccffe0d193403a8dd49de1e2d725a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.590 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.591 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.591 12 DEBUG ceilometer.compute.pollsters [-] b0949fde-940d-495c-bdb0-e6c996b0274f/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.591 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '507f0bc1-8578-4c2d-bd24-0ab1a65ae9a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'instance-00000035-b0949fde-940d-495c-bdb0-e6c996b0274f-tap6aacbedb-69', 'timestamp': '2026-01-22T17:31:55.591372', 'resource_metadata': {'display_name': 'luke', 'name': 'tap6aacbedb-69', 'instance_id': 'b0949fde-940d-495c-bdb0-e6c996b0274f', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:35:c7:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6aacbedb-69'}, 'message_id': '3f437894-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.223466343, 'message_signature': '95e3b205d4cb5bc94d5ed8550cd3d4f6eb891b0c4231c4f7fb959c8cf3fe8ec6'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 
'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 'instance-00000036-81396ea9-a9a1-4a21-9808-608e45a7aa03-tap0290774a-bb', 'timestamp': '2026-01-22T17:31:55.591372', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'tap0290774a-bb', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:a0:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0290774a-bb'}, 'message_id': '3f43815e-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.228247139, 'message_signature': '99465d71bb370ec60e50f81524bf230af26aa5432bcd44a7b9f94f9c7cbdcb36'}]}, 'timestamp': '2026-01-22 17:31:55.591815', '_unique_id': '4868d53bcef54a73aca2e70adcefbf88'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.592 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: luke>, <NovaLikeServer: tempest-server-test-761314490>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: luke>, <NovaLikeServer: tempest-server-test-761314490>]
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.593 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.593 12 DEBUG ceilometer.compute.pollsters [-] b0949fde-940d-495c-bdb0-e6c996b0274f/disk.device.write.requests volume: 323 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.593 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a1b6f0d4-f264-47b9-99aa-241f33b33b57', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 323, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_name': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_name': None, 'resource_id': 'b0949fde-940d-495c-bdb0-e6c996b0274f-vda', 'timestamp': '2026-01-22T17:31:55.593135', 'resource_metadata': {'display_name': 'luke', 'name': 'instance-00000035', 'instance_id': 'b0949fde-940d-495c-bdb0-e6c996b0274f', 'instance_type': 'm1.nano', 'host': '5e8970dc7adfb3ddde204c2974a1bf0f4bf3acf136f907860f30bd05', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3f43bd18-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.294595271, 'message_signature': 'ec59a119250d76e345b166f03298865545a0a2f64c65892de633dde7111cf186'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 
'81396ea9-a9a1-4a21-9808-608e45a7aa03-vda', 'timestamp': '2026-01-22T17:31:55.593135', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'instance-00000036', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3f43c4ac-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5479.310506128, 'message_signature': 'ff9f71f019da44cae82cee3f0c9e2d0d4bfdec5f18b639b14bfa627cbcbac6fd'}]}, 'timestamp': '2026-01-22 17:31:55.593530', '_unique_id': '5029a1833055434a9c89111f987ef6c8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:31:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:31:55.594 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: luke>, <NovaLikeServer: tempest-server-test-761314490>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: luke>, <NovaLikeServer: tempest-server-test-761314490>]
Jan 22 17:31:56 compute-0 nova_compute[183075]: 2026-01-22 17:31:56.773 183079 INFO nova.compute.manager [None req-300a76a8-aa4b-470f-ad48-1054acd77d91 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Get console output
Jan 22 17:31:56 compute-0 nova_compute[183075]: 2026-01-22 17:31:56.779 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:31:57 compute-0 ovn_controller[95372]: 2026-01-22T17:31:57Z|00110|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:60:a0:be 10.100.0.3
Jan 22 17:31:57 compute-0 ovn_controller[95372]: 2026-01-22T17:31:57Z|00111|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:60:a0:be 10.100.0.3
Jan 22 17:31:57 compute-0 nova_compute[183075]: 2026-01-22 17:31:57.705 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:58 compute-0 nova_compute[183075]: 2026-01-22 17:31:58.102 183079 INFO nova.compute.manager [None req-219d0c27-5a48-438f-9f12-87e3c472e50a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Get console output
Jan 22 17:31:58 compute-0 nova_compute[183075]: 2026-01-22 17:31:58.109 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:31:58 compute-0 nova_compute[183075]: 2026-01-22 17:31:58.920 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:02 compute-0 podman[233502]: 2026-01-22 17:32:02.356333303 +0000 UTC m=+0.057108736 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 22 17:32:02 compute-0 podman[233503]: 2026-01-22 17:32:02.381082816 +0000 UTC m=+0.066947783 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, architecture=x86_64, managed_by=edpm_ansible)
Jan 22 17:32:02 compute-0 podman[233501]: 2026-01-22 17:32:02.412913783 +0000 UTC m=+0.104611719 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 17:32:02 compute-0 nova_compute[183075]: 2026-01-22 17:32:02.708 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:03 compute-0 nova_compute[183075]: 2026-01-22 17:32:03.163 183079 INFO nova.compute.manager [None req-505ea6af-60fe-4667-900d-87f6783eeadb c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Get console output
Jan 22 17:32:03 compute-0 nova_compute[183075]: 2026-01-22 17:32:03.170 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:32:03 compute-0 nova_compute[183075]: 2026-01-22 17:32:03.411 183079 INFO nova.compute.manager [None req-c85bb82d-0b04-4044-9d4d-ea8877766352 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Get console output
Jan 22 17:32:03 compute-0 nova_compute[183075]: 2026-01-22 17:32:03.419 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:32:03 compute-0 nova_compute[183075]: 2026-01-22 17:32:03.923 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:04.548 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:32:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:04.550 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:32:04 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:32:04 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:32:04 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:32:04 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:32:04 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:32:04 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:32:04 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2fd77df8-cf00-4afc-b4cf-75b5722c375c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.616 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.618 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 1.0677176
Jan 22 17:32:05 compute-0 haproxy-metadata-proxy-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233424]: 10.100.0.3:60768 [22/Jan/2026:17:32:04.547] listener listener/metadata 0/0/0/1071/1071 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.628 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.629 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2fd77df8-cf00-4afc-b4cf-75b5722c375c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.649 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.649 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0204194
Jan 22 17:32:05 compute-0 haproxy-metadata-proxy-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233424]: 10.100.0.3:60778 [22/Jan/2026:17:32:05.627] listener listener/metadata 0/0/0/21/21 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.656 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.657 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2fd77df8-cf00-4afc-b4cf-75b5722c375c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.671 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.671 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0144203
Jan 22 17:32:05 compute-0 haproxy-metadata-proxy-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233424]: 10.100.0.3:60790 [22/Jan/2026:17:32:05.655] listener listener/metadata 0/0/0/15/15 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.677 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.678 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2fd77df8-cf00-4afc-b4cf-75b5722c375c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.696 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.697 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0190957
Jan 22 17:32:05 compute-0 haproxy-metadata-proxy-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233424]: 10.100.0.3:60806 [22/Jan/2026:17:32:05.677] listener listener/metadata 0/0/0/20/20 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.702 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.703 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2fd77df8-cf00-4afc-b4cf-75b5722c375c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.719 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.719 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0167139
Jan 22 17:32:05 compute-0 haproxy-metadata-proxy-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233424]: 10.100.0.3:60816 [22/Jan/2026:17:32:05.702] listener listener/metadata 0/0/0/17/17 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.725 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.726 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2fd77df8-cf00-4afc-b4cf-75b5722c375c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.746 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.747 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0209687
Jan 22 17:32:05 compute-0 haproxy-metadata-proxy-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233424]: 10.100.0.3:60820 [22/Jan/2026:17:32:05.725] listener listener/metadata 0/0/0/21/21 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.751 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.752 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2fd77df8-cf00-4afc-b4cf-75b5722c375c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.770 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:32:05 compute-0 haproxy-metadata-proxy-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233424]: 10.100.0.3:60824 [22/Jan/2026:17:32:05.751] listener listener/metadata 0/0/0/19/19 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.770 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0184932
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.775 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.775 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2fd77df8-cf00-4afc-b4cf-75b5722c375c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.791 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.792 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0165017
Jan 22 17:32:05 compute-0 haproxy-metadata-proxy-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233424]: 10.100.0.3:60830 [22/Jan/2026:17:32:05.775] listener listener/metadata 0/0/0/17/17 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.797 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.797 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2fd77df8-cf00-4afc-b4cf-75b5722c375c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.815 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:32:05 compute-0 haproxy-metadata-proxy-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233424]: 10.100.0.3:60832 [22/Jan/2026:17:32:05.796] listener listener/metadata 0/0/0/19/19 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.816 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 165 time: 0.0185046
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.820 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.821 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2fd77df8-cf00-4afc-b4cf-75b5722c375c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.838 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:32:05 compute-0 haproxy-metadata-proxy-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233424]: 10.100.0.3:60836 [22/Jan/2026:17:32:05.820] listener listener/metadata 0/0/0/18/18 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.838 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 165 time: 0.0175815
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.842 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.843 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2fd77df8-cf00-4afc-b4cf-75b5722c375c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:32:05 compute-0 haproxy-metadata-proxy-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233424]: 10.100.0.3:60842 [22/Jan/2026:17:32:05.842] listener listener/metadata 0/0/0/18/18 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.861 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0178983
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.877 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.878 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2fd77df8-cf00-4afc-b4cf-75b5722c375c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.897 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:32:05 compute-0 haproxy-metadata-proxy-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233424]: 10.100.0.3:60854 [22/Jan/2026:17:32:05.876] listener listener/metadata 0/0/0/21/21 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.897 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0190060
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.903 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.904 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2fd77df8-cf00-4afc-b4cf-75b5722c375c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.921 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:32:05 compute-0 haproxy-metadata-proxy-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233424]: 10.100.0.3:60868 [22/Jan/2026:17:32:05.903] listener listener/metadata 0/0/0/19/19 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.922 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0177050
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.928 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.930 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2fd77df8-cf00-4afc-b4cf-75b5722c375c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.949 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.949 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0200028
Jan 22 17:32:05 compute-0 haproxy-metadata-proxy-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233424]: 10.100.0.3:60870 [22/Jan/2026:17:32:05.928] listener listener/metadata 0/0/0/21/21 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.958 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.959 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2fd77df8-cf00-4afc-b4cf-75b5722c375c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.980 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:32:05 compute-0 haproxy-metadata-proxy-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233424]: 10.100.0.3:60874 [22/Jan/2026:17:32:05.958] listener listener/metadata 0/0/0/23/23 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.981 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 165 time: 0.0216403
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.991 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:05.992 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:32:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2fd77df8-cf00-4afc-b4cf-75b5722c375c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:32:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:06.008 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:32:06 compute-0 haproxy-metadata-proxy-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233424]: 10.100.0.3:60880 [22/Jan/2026:17:32:05.990] listener listener/metadata 0/0/0/18/18 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:32:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:06.008 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0161097
Jan 22 17:32:06 compute-0 podman[233568]: 2026-01-22 17:32:06.353851434 +0000 UTC m=+0.057863216 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:32:07 compute-0 nova_compute[183075]: 2026-01-22 17:32:07.733 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:08 compute-0 nova_compute[183075]: 2026-01-22 17:32:08.303 183079 INFO nova.compute.manager [None req-8f637f68-17cb-4af3-9480-122d5a8d31be c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Get console output
Jan 22 17:32:08 compute-0 nova_compute[183075]: 2026-01-22 17:32:08.309 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:32:08 compute-0 nova_compute[183075]: 2026-01-22 17:32:08.556 183079 INFO nova.compute.manager [None req-23b63a22-4546-4816-9f60-2f10109b92a7 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Get console output
Jan 22 17:32:08 compute-0 nova_compute[183075]: 2026-01-22 17:32:08.559 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:32:08 compute-0 nova_compute[183075]: 2026-01-22 17:32:08.926 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:12 compute-0 nova_compute[183075]: 2026-01-22 17:32:12.735 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:13 compute-0 nova_compute[183075]: 2026-01-22 17:32:13.653 183079 INFO nova.compute.manager [None req-4f4830e3-87c7-4b4b-ad50-404dae93fae3 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Get console output
Jan 22 17:32:13 compute-0 nova_compute[183075]: 2026-01-22 17:32:13.658 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:32:13 compute-0 nova_compute[183075]: 2026-01-22 17:32:13.831 183079 INFO nova.compute.manager [None req-0cee39f4-1809-4dcb-80fa-585b704802b0 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Get console output
Jan 22 17:32:13 compute-0 nova_compute[183075]: 2026-01-22 17:32:13.842 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:32:13 compute-0 nova_compute[183075]: 2026-01-22 17:32:13.929 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:14 compute-0 ovn_controller[95372]: 2026-01-22T17:32:14Z|00597|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 22 17:32:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:14.837 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=38, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=37) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:32:14 compute-0 nova_compute[183075]: 2026-01-22 17:32:14.837 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:14.839 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:32:15 compute-0 nova_compute[183075]: 2026-01-22 17:32:15.807 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:15 compute-0 NetworkManager[55454]: <info>  [1769103135.8083] manager: (patch-provnet-c63dea44-3fe3-44c3-ba47-76ee1ea0ad94-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/256)
Jan 22 17:32:15 compute-0 NetworkManager[55454]: <info>  [1769103135.8091] manager: (patch-br-int-to-provnet-c63dea44-3fe3-44c3-ba47-76ee1ea0ad94): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/257)
Jan 22 17:32:15 compute-0 nova_compute[183075]: 2026-01-22 17:32:15.874 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:15 compute-0 ovn_controller[95372]: 2026-01-22T17:32:15Z|00598|binding|INFO|Releasing lport 77b4e93d-6708-4efb-b060-601be2ddc621 from this chassis (sb_readonly=0)
Jan 22 17:32:15 compute-0 ovn_controller[95372]: 2026-01-22T17:32:15Z|00599|binding|INFO|Releasing lport 0297f784-1a41-4744-b018-f503dfa93754 from this chassis (sb_readonly=0)
Jan 22 17:32:15 compute-0 nova_compute[183075]: 2026-01-22 17:32:15.891 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:16 compute-0 nova_compute[183075]: 2026-01-22 17:32:16.210 183079 DEBUG nova.compute.manager [req-e4ba0d44-9385-4089-86cc-6824db1a0745 req-c3bcc5cc-a383-4bfc-88c7-b47e649b4178 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Received event network-changed-6aacbedb-6999-4006-9e77-6e540614dbea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:32:16 compute-0 nova_compute[183075]: 2026-01-22 17:32:16.211 183079 DEBUG nova.compute.manager [req-e4ba0d44-9385-4089-86cc-6824db1a0745 req-c3bcc5cc-a383-4bfc-88c7-b47e649b4178 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Refreshing instance network info cache due to event network-changed-6aacbedb-6999-4006-9e77-6e540614dbea. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:32:16 compute-0 nova_compute[183075]: 2026-01-22 17:32:16.212 183079 DEBUG oslo_concurrency.lockutils [req-e4ba0d44-9385-4089-86cc-6824db1a0745 req-c3bcc5cc-a383-4bfc-88c7-b47e649b4178 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-b0949fde-940d-495c-bdb0-e6c996b0274f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:32:16 compute-0 nova_compute[183075]: 2026-01-22 17:32:16.212 183079 DEBUG oslo_concurrency.lockutils [req-e4ba0d44-9385-4089-86cc-6824db1a0745 req-c3bcc5cc-a383-4bfc-88c7-b47e649b4178 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-b0949fde-940d-495c-bdb0-e6c996b0274f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:32:16 compute-0 nova_compute[183075]: 2026-01-22 17:32:16.212 183079 DEBUG nova.network.neutron [req-e4ba0d44-9385-4089-86cc-6824db1a0745 req-c3bcc5cc-a383-4bfc-88c7-b47e649b4178 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Refreshing network info cache for port 6aacbedb-6999-4006-9e77-6e540614dbea _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:32:17 compute-0 podman[233588]: 2026-01-22 17:32:17.348618111 +0000 UTC m=+0.055920004 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 17:32:17 compute-0 nova_compute[183075]: 2026-01-22 17:32:17.378 183079 DEBUG nova.network.neutron [req-e4ba0d44-9385-4089-86cc-6824db1a0745 req-c3bcc5cc-a383-4bfc-88c7-b47e649b4178 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Updated VIF entry in instance network info cache for port 6aacbedb-6999-4006-9e77-6e540614dbea. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:32:17 compute-0 nova_compute[183075]: 2026-01-22 17:32:17.378 183079 DEBUG nova.network.neutron [req-e4ba0d44-9385-4089-86cc-6824db1a0745 req-c3bcc5cc-a383-4bfc-88c7-b47e649b4178 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Updating instance_info_cache with network_info: [{"id": "6aacbedb-6999-4006-9e77-6e540614dbea", "address": "fa:16:3e:35:c7:99", "network": {"id": "ed94e4f1-14ed-42c4-8c8e-db508a59bd2c", "bridge": "br-int", "label": "tempest-test-network--696544753", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "89916c03f6f440f6ae7cf81f2ae99bad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6aacbedb-69", "ovs_interfaceid": "6aacbedb-6999-4006-9e77-6e540614dbea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:32:17 compute-0 nova_compute[183075]: 2026-01-22 17:32:17.397 183079 DEBUG oslo_concurrency.lockutils [req-e4ba0d44-9385-4089-86cc-6824db1a0745 req-c3bcc5cc-a383-4bfc-88c7-b47e649b4178 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-b0949fde-940d-495c-bdb0-e6c996b0274f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:32:17 compute-0 nova_compute[183075]: 2026-01-22 17:32:17.780 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:18 compute-0 nova_compute[183075]: 2026-01-22 17:32:18.781 183079 INFO nova.compute.manager [None req-2c059653-3014-469a-a698-4d26899d3b88 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Get console output
Jan 22 17:32:18 compute-0 nova_compute[183075]: 2026-01-22 17:32:18.785 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:32:18 compute-0 nova_compute[183075]: 2026-01-22 17:32:18.934 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:20 compute-0 nova_compute[183075]: 2026-01-22 17:32:20.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:32:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:20.841 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '38'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:32:21 compute-0 podman[233612]: 2026-01-22 17:32:21.345422613 +0000 UTC m=+0.054609308 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:32:22 compute-0 nova_compute[183075]: 2026-01-22 17:32:22.783 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:23 compute-0 nova_compute[183075]: 2026-01-22 17:32:23.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:32:23 compute-0 nova_compute[183075]: 2026-01-22 17:32:23.936 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:24 compute-0 nova_compute[183075]: 2026-01-22 17:32:24.599 183079 INFO nova.compute.manager [None req-7ceaa3f3-f40d-4658-9db5-5db623bd8d78 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Get console output
Jan 22 17:32:24 compute-0 nova_compute[183075]: 2026-01-22 17:32:24.604 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:32:24 compute-0 nova_compute[183075]: 2026-01-22 17:32:24.705 183079 DEBUG oslo_concurrency.lockutils [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Acquiring lock "b09d9bed-19f3-4aae-8aa4-7a87468084e4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:32:24 compute-0 nova_compute[183075]: 2026-01-22 17:32:24.705 183079 DEBUG oslo_concurrency.lockutils [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "b09d9bed-19f3-4aae-8aa4-7a87468084e4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:32:24 compute-0 nova_compute[183075]: 2026-01-22 17:32:24.776 183079 DEBUG nova.compute.manager [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:32:24 compute-0 nova_compute[183075]: 2026-01-22 17:32:24.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:32:24 compute-0 nova_compute[183075]: 2026-01-22 17:32:24.912 183079 DEBUG oslo_concurrency.lockutils [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:32:24 compute-0 nova_compute[183075]: 2026-01-22 17:32:24.913 183079 DEBUG oslo_concurrency.lockutils [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:32:24 compute-0 nova_compute[183075]: 2026-01-22 17:32:24.922 183079 DEBUG nova.virt.hardware [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:32:24 compute-0 nova_compute[183075]: 2026-01-22 17:32:24.922 183079 INFO nova.compute.claims [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:32:25 compute-0 nova_compute[183075]: 2026-01-22 17:32:25.137 183079 DEBUG nova.compute.provider_tree [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:32:25 compute-0 nova_compute[183075]: 2026-01-22 17:32:25.165 183079 DEBUG nova.scheduler.client.report [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:32:25 compute-0 nova_compute[183075]: 2026-01-22 17:32:25.236 183079 DEBUG oslo_concurrency.lockutils [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.324s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:32:25 compute-0 nova_compute[183075]: 2026-01-22 17:32:25.237 183079 DEBUG nova.compute.manager [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:32:25 compute-0 nova_compute[183075]: 2026-01-22 17:32:25.323 183079 DEBUG nova.compute.manager [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:32:25 compute-0 nova_compute[183075]: 2026-01-22 17:32:25.324 183079 DEBUG nova.network.neutron [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:32:25 compute-0 nova_compute[183075]: 2026-01-22 17:32:25.381 183079 INFO nova.virt.libvirt.driver [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:32:25 compute-0 nova_compute[183075]: 2026-01-22 17:32:25.419 183079 DEBUG nova.compute.manager [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:32:25 compute-0 nova_compute[183075]: 2026-01-22 17:32:25.504 183079 DEBUG nova.policy [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1ddebe2a251e4b118d9469f7d6fdb2ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:32:25 compute-0 nova_compute[183075]: 2026-01-22 17:32:25.573 183079 DEBUG nova.compute.manager [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:32:25 compute-0 nova_compute[183075]: 2026-01-22 17:32:25.575 183079 DEBUG nova.virt.libvirt.driver [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:32:25 compute-0 nova_compute[183075]: 2026-01-22 17:32:25.575 183079 INFO nova.virt.libvirt.driver [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Creating image(s)
Jan 22 17:32:25 compute-0 nova_compute[183075]: 2026-01-22 17:32:25.575 183079 DEBUG oslo_concurrency.lockutils [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Acquiring lock "/var/lib/nova/instances/b09d9bed-19f3-4aae-8aa4-7a87468084e4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:32:25 compute-0 nova_compute[183075]: 2026-01-22 17:32:25.576 183079 DEBUG oslo_concurrency.lockutils [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "/var/lib/nova/instances/b09d9bed-19f3-4aae-8aa4-7a87468084e4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:32:25 compute-0 nova_compute[183075]: 2026-01-22 17:32:25.576 183079 DEBUG oslo_concurrency.lockutils [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "/var/lib/nova/instances/b09d9bed-19f3-4aae-8aa4-7a87468084e4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:32:25 compute-0 nova_compute[183075]: 2026-01-22 17:32:25.588 183079 DEBUG oslo_concurrency.processutils [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:32:25 compute-0 nova_compute[183075]: 2026-01-22 17:32:25.685 183079 DEBUG oslo_concurrency.processutils [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:32:25 compute-0 nova_compute[183075]: 2026-01-22 17:32:25.685 183079 DEBUG oslo_concurrency.lockutils [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:32:25 compute-0 nova_compute[183075]: 2026-01-22 17:32:25.686 183079 DEBUG oslo_concurrency.lockutils [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:32:25 compute-0 nova_compute[183075]: 2026-01-22 17:32:25.696 183079 DEBUG oslo_concurrency.processutils [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:32:25 compute-0 nova_compute[183075]: 2026-01-22 17:32:25.755 183079 DEBUG oslo_concurrency.processutils [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:32:25 compute-0 nova_compute[183075]: 2026-01-22 17:32:25.756 183079 DEBUG oslo_concurrency.processutils [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/b09d9bed-19f3-4aae-8aa4-7a87468084e4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:32:26 compute-0 nova_compute[183075]: 2026-01-22 17:32:26.164 183079 DEBUG oslo_concurrency.processutils [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/b09d9bed-19f3-4aae-8aa4-7a87468084e4/disk 1073741824" returned: 0 in 0.408s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:32:26 compute-0 nova_compute[183075]: 2026-01-22 17:32:26.165 183079 DEBUG oslo_concurrency.lockutils [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.479s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:32:26 compute-0 nova_compute[183075]: 2026-01-22 17:32:26.166 183079 DEBUG oslo_concurrency.processutils [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:32:26 compute-0 nova_compute[183075]: 2026-01-22 17:32:26.223 183079 DEBUG oslo_concurrency.processutils [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:32:26 compute-0 nova_compute[183075]: 2026-01-22 17:32:26.225 183079 DEBUG nova.virt.disk.api [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Checking if we can resize image /var/lib/nova/instances/b09d9bed-19f3-4aae-8aa4-7a87468084e4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:32:26 compute-0 nova_compute[183075]: 2026-01-22 17:32:26.225 183079 DEBUG oslo_concurrency.processutils [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b09d9bed-19f3-4aae-8aa4-7a87468084e4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:32:26 compute-0 nova_compute[183075]: 2026-01-22 17:32:26.280 183079 DEBUG oslo_concurrency.processutils [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b09d9bed-19f3-4aae-8aa4-7a87468084e4/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:32:26 compute-0 nova_compute[183075]: 2026-01-22 17:32:26.282 183079 DEBUG nova.virt.disk.api [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Cannot resize image /var/lib/nova/instances/b09d9bed-19f3-4aae-8aa4-7a87468084e4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:32:26 compute-0 nova_compute[183075]: 2026-01-22 17:32:26.282 183079 DEBUG nova.objects.instance [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lazy-loading 'migration_context' on Instance uuid b09d9bed-19f3-4aae-8aa4-7a87468084e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:32:26 compute-0 nova_compute[183075]: 2026-01-22 17:32:26.689 183079 DEBUG nova.virt.libvirt.driver [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:32:26 compute-0 nova_compute[183075]: 2026-01-22 17:32:26.690 183079 DEBUG nova.virt.libvirt.driver [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Ensure instance console log exists: /var/lib/nova/instances/b09d9bed-19f3-4aae-8aa4-7a87468084e4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:32:26 compute-0 nova_compute[183075]: 2026-01-22 17:32:26.691 183079 DEBUG oslo_concurrency.lockutils [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:32:26 compute-0 nova_compute[183075]: 2026-01-22 17:32:26.691 183079 DEBUG oslo_concurrency.lockutils [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:32:26 compute-0 nova_compute[183075]: 2026-01-22 17:32:26.692 183079 DEBUG oslo_concurrency.lockutils [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:32:27 compute-0 nova_compute[183075]: 2026-01-22 17:32:27.651 183079 DEBUG nova.network.neutron [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Successfully created port: a49c001e-85aa-4216-a7dd-ed52fa4c71ac _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:32:27 compute-0 nova_compute[183075]: 2026-01-22 17:32:27.828 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:28 compute-0 nova_compute[183075]: 2026-01-22 17:32:28.979 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:29 compute-0 nova_compute[183075]: 2026-01-22 17:32:29.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:32:32 compute-0 nova_compute[183075]: 2026-01-22 17:32:32.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:32:32 compute-0 nova_compute[183075]: 2026-01-22 17:32:32.831 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:33 compute-0 podman[233652]: 2026-01-22 17:32:33.339100678 +0000 UTC m=+0.043212377 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 17:32:33 compute-0 podman[233651]: 2026-01-22 17:32:33.367964584 +0000 UTC m=+0.076190255 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 17:32:33 compute-0 podman[233653]: 2026-01-22 17:32:33.378262014 +0000 UTC m=+0.074460228 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, architecture=x86_64, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, description=The Universal Base Image 
Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 22 17:32:33 compute-0 nova_compute[183075]: 2026-01-22 17:32:33.982 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:34 compute-0 nova_compute[183075]: 2026-01-22 17:32:34.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:32:34 compute-0 nova_compute[183075]: 2026-01-22 17:32:34.787 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:32:34 compute-0 nova_compute[183075]: 2026-01-22 17:32:34.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:32:37 compute-0 podman[233710]: 2026-01-22 17:32:37.390646669 +0000 UTC m=+0.099051227 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:32:37 compute-0 nova_compute[183075]: 2026-01-22 17:32:37.787 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:32:37 compute-0 nova_compute[183075]: 2026-01-22 17:32:37.787 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:32:37 compute-0 nova_compute[183075]: 2026-01-22 17:32:37.788 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:32:37 compute-0 nova_compute[183075]: 2026-01-22 17:32:37.788 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:32:37 compute-0 nova_compute[183075]: 2026-01-22 17:32:37.833 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:37 compute-0 nova_compute[183075]: 2026-01-22 17:32:37.894 183079 INFO nova.compute.manager [None req-cf6e65cd-0f2a-4f39-909b-5844e2abe670 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Get console output
Jan 22 17:32:37 compute-0 nova_compute[183075]: 2026-01-22 17:32:37.901 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:32:38 compute-0 nova_compute[183075]: 2026-01-22 17:32:38.333 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0949fde-940d-495c-bdb0-e6c996b0274f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:32:38 compute-0 nova_compute[183075]: 2026-01-22 17:32:38.403 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0949fde-940d-495c-bdb0-e6c996b0274f/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:32:38 compute-0 nova_compute[183075]: 2026-01-22 17:32:38.404 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0949fde-940d-495c-bdb0-e6c996b0274f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:32:38 compute-0 nova_compute[183075]: 2026-01-22 17:32:38.460 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0949fde-940d-495c-bdb0-e6c996b0274f/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:32:38 compute-0 nova_compute[183075]: 2026-01-22 17:32:38.468 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/81396ea9-a9a1-4a21-9808-608e45a7aa03/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:32:38 compute-0 nova_compute[183075]: 2026-01-22 17:32:38.524 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/81396ea9-a9a1-4a21-9808-608e45a7aa03/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:32:38 compute-0 nova_compute[183075]: 2026-01-22 17:32:38.525 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/81396ea9-a9a1-4a21-9808-608e45a7aa03/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:32:38 compute-0 nova_compute[183075]: 2026-01-22 17:32:38.591 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/81396ea9-a9a1-4a21-9808-608e45a7aa03/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:32:38 compute-0 nova_compute[183075]: 2026-01-22 17:32:38.790 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:32:38 compute-0 nova_compute[183075]: 2026-01-22 17:32:38.792 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5387MB free_disk=73.30316925048828GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:32:38 compute-0 nova_compute[183075]: 2026-01-22 17:32:38.792 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:32:38 compute-0 nova_compute[183075]: 2026-01-22 17:32:38.792 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:32:38 compute-0 nova_compute[183075]: 2026-01-22 17:32:38.983 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:39 compute-0 nova_compute[183075]: 2026-01-22 17:32:39.206 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance b0949fde-940d-495c-bdb0-e6c996b0274f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:32:39 compute-0 nova_compute[183075]: 2026-01-22 17:32:39.206 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 81396ea9-a9a1-4a21-9808-608e45a7aa03 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:32:39 compute-0 nova_compute[183075]: 2026-01-22 17:32:39.206 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance b09d9bed-19f3-4aae-8aa4-7a87468084e4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:32:39 compute-0 nova_compute[183075]: 2026-01-22 17:32:39.206 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:32:39 compute-0 nova_compute[183075]: 2026-01-22 17:32:39.206 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:32:39 compute-0 nova_compute[183075]: 2026-01-22 17:32:39.338 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Refreshing inventories for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 17:32:39 compute-0 nova_compute[183075]: 2026-01-22 17:32:39.356 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Updating ProviderTree inventory for provider 2513134c-f67c-4237-84bf-4ebe2450d610 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 17:32:39 compute-0 nova_compute[183075]: 2026-01-22 17:32:39.357 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Updating inventory in ProviderTree for provider 2513134c-f67c-4237-84bf-4ebe2450d610 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 17:32:39 compute-0 nova_compute[183075]: 2026-01-22 17:32:39.370 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Refreshing aggregate associations for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 17:32:39 compute-0 nova_compute[183075]: 2026-01-22 17:32:39.392 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Refreshing trait associations for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610, traits: COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SVM,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE42,COMPUTE_NODE,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE4A,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_FMA3,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_AUTO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 17:32:39 compute-0 nova_compute[183075]: 2026-01-22 17:32:39.459 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:32:39 compute-0 nova_compute[183075]: 2026-01-22 17:32:39.511 183079 DEBUG nova.network.neutron [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Successfully updated port: a49c001e-85aa-4216-a7dd-ed52fa4c71ac _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:32:39 compute-0 nova_compute[183075]: 2026-01-22 17:32:39.575 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:32:39 compute-0 nova_compute[183075]: 2026-01-22 17:32:39.706 183079 DEBUG nova.compute.manager [req-88a40a46-a27c-4e31-8b26-92e302b483dd req-3a8b4089-9e70-43e5-892c-afe8654a5f56 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Received event network-changed-a49c001e-85aa-4216-a7dd-ed52fa4c71ac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:32:39 compute-0 nova_compute[183075]: 2026-01-22 17:32:39.706 183079 DEBUG nova.compute.manager [req-88a40a46-a27c-4e31-8b26-92e302b483dd req-3a8b4089-9e70-43e5-892c-afe8654a5f56 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Refreshing instance network info cache due to event network-changed-a49c001e-85aa-4216-a7dd-ed52fa4c71ac. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:32:39 compute-0 nova_compute[183075]: 2026-01-22 17:32:39.706 183079 DEBUG oslo_concurrency.lockutils [req-88a40a46-a27c-4e31-8b26-92e302b483dd req-3a8b4089-9e70-43e5-892c-afe8654a5f56 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-b09d9bed-19f3-4aae-8aa4-7a87468084e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:32:39 compute-0 nova_compute[183075]: 2026-01-22 17:32:39.706 183079 DEBUG oslo_concurrency.lockutils [req-88a40a46-a27c-4e31-8b26-92e302b483dd req-3a8b4089-9e70-43e5-892c-afe8654a5f56 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-b09d9bed-19f3-4aae-8aa4-7a87468084e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:32:39 compute-0 nova_compute[183075]: 2026-01-22 17:32:39.707 183079 DEBUG nova.network.neutron [req-88a40a46-a27c-4e31-8b26-92e302b483dd req-3a8b4089-9e70-43e5-892c-afe8654a5f56 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Refreshing network info cache for port a49c001e-85aa-4216-a7dd-ed52fa4c71ac _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:32:39 compute-0 nova_compute[183075]: 2026-01-22 17:32:39.729 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:32:39 compute-0 nova_compute[183075]: 2026-01-22 17:32:39.729 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.937s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:32:40 compute-0 nova_compute[183075]: 2026-01-22 17:32:40.536 183079 DEBUG nova.network.neutron [req-88a40a46-a27c-4e31-8b26-92e302b483dd req-3a8b4089-9e70-43e5-892c-afe8654a5f56 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:32:41 compute-0 nova_compute[183075]: 2026-01-22 17:32:41.510 183079 DEBUG nova.network.neutron [req-88a40a46-a27c-4e31-8b26-92e302b483dd req-3a8b4089-9e70-43e5-892c-afe8654a5f56 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:32:41 compute-0 nova_compute[183075]: 2026-01-22 17:32:41.551 183079 DEBUG oslo_concurrency.lockutils [req-88a40a46-a27c-4e31-8b26-92e302b483dd req-3a8b4089-9e70-43e5-892c-afe8654a5f56 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-b09d9bed-19f3-4aae-8aa4-7a87468084e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:32:41 compute-0 nova_compute[183075]: 2026-01-22 17:32:41.730 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:32:41 compute-0 nova_compute[183075]: 2026-01-22 17:32:41.731 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:32:41 compute-0 nova_compute[183075]: 2026-01-22 17:32:41.731 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:32:41 compute-0 nova_compute[183075]: 2026-01-22 17:32:41.805 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 22 17:32:41 compute-0 nova_compute[183075]: 2026-01-22 17:32:41.830 183079 DEBUG oslo_concurrency.lockutils [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Acquiring lock "refresh_cache-b09d9bed-19f3-4aae-8aa4-7a87468084e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:32:41 compute-0 nova_compute[183075]: 2026-01-22 17:32:41.830 183079 DEBUG oslo_concurrency.lockutils [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Acquired lock "refresh_cache-b09d9bed-19f3-4aae-8aa4-7a87468084e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:32:41 compute-0 nova_compute[183075]: 2026-01-22 17:32:41.830 183079 DEBUG nova.network.neutron [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:32:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:41.949 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:32:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:41.950 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:32:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:41.951 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:32:41 compute-0 nova_compute[183075]: 2026-01-22 17:32:41.954 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-b0949fde-940d-495c-bdb0-e6c996b0274f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:32:41 compute-0 nova_compute[183075]: 2026-01-22 17:32:41.954 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-b0949fde-940d-495c-bdb0-e6c996b0274f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:32:41 compute-0 nova_compute[183075]: 2026-01-22 17:32:41.955 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:32:41 compute-0 nova_compute[183075]: 2026-01-22 17:32:41.955 183079 DEBUG nova.objects.instance [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b0949fde-940d-495c-bdb0-e6c996b0274f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:32:42 compute-0 nova_compute[183075]: 2026-01-22 17:32:42.222 183079 DEBUG nova.compute.manager [req-a9f5d948-cc51-47c3-86a6-28caf4cd3ee2 req-99a6d83a-ae5a-4377-a297-bfac6c71713b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Received event network-changed-a49c001e-85aa-4216-a7dd-ed52fa4c71ac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:32:42 compute-0 nova_compute[183075]: 2026-01-22 17:32:42.224 183079 DEBUG nova.compute.manager [req-a9f5d948-cc51-47c3-86a6-28caf4cd3ee2 req-99a6d83a-ae5a-4377-a297-bfac6c71713b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Refreshing instance network info cache due to event network-changed-a49c001e-85aa-4216-a7dd-ed52fa4c71ac. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:32:42 compute-0 nova_compute[183075]: 2026-01-22 17:32:42.224 183079 DEBUG oslo_concurrency.lockutils [req-a9f5d948-cc51-47c3-86a6-28caf4cd3ee2 req-99a6d83a-ae5a-4377-a297-bfac6c71713b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-b09d9bed-19f3-4aae-8aa4-7a87468084e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:32:42 compute-0 nova_compute[183075]: 2026-01-22 17:32:42.533 183079 DEBUG nova.network.neutron [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:32:42 compute-0 nova_compute[183075]: 2026-01-22 17:32:42.835 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:43 compute-0 nova_compute[183075]: 2026-01-22 17:32:43.315 183079 INFO nova.compute.manager [None req-e03f7bbe-24be-42a0-babd-cdd561b5c02d c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Get console output
Jan 22 17:32:43 compute-0 nova_compute[183075]: 2026-01-22 17:32:43.320 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:32:43 compute-0 nova_compute[183075]: 2026-01-22 17:32:43.859 183079 DEBUG nova.network.neutron [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Updating instance_info_cache with network_info: [{"id": "a49c001e-85aa-4216-a7dd-ed52fa4c71ac", "address": "fa:16:3e:a0:54:7f", "network": {"id": "ed94e4f1-14ed-42c4-8c8e-db508a59bd2c", "bridge": "br-int", "label": "tempest-test-network--696544753", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "89916c03f6f440f6ae7cf81f2ae99bad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa49c001e-85", "ovs_interfaceid": "a49c001e-85aa-4216-a7dd-ed52fa4c71ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:32:43 compute-0 nova_compute[183075]: 2026-01-22 17:32:43.988 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.018 183079 DEBUG oslo_concurrency.lockutils [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Releasing lock "refresh_cache-b09d9bed-19f3-4aae-8aa4-7a87468084e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.019 183079 DEBUG nova.compute.manager [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Instance network_info: |[{"id": "a49c001e-85aa-4216-a7dd-ed52fa4c71ac", "address": "fa:16:3e:a0:54:7f", "network": {"id": "ed94e4f1-14ed-42c4-8c8e-db508a59bd2c", "bridge": "br-int", "label": "tempest-test-network--696544753", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "89916c03f6f440f6ae7cf81f2ae99bad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa49c001e-85", "ovs_interfaceid": "a49c001e-85aa-4216-a7dd-ed52fa4c71ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.019 183079 DEBUG oslo_concurrency.lockutils [req-a9f5d948-cc51-47c3-86a6-28caf4cd3ee2 req-99a6d83a-ae5a-4377-a297-bfac6c71713b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-b09d9bed-19f3-4aae-8aa4-7a87468084e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.020 183079 DEBUG nova.network.neutron [req-a9f5d948-cc51-47c3-86a6-28caf4cd3ee2 req-99a6d83a-ae5a-4377-a297-bfac6c71713b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Refreshing network info cache for port a49c001e-85aa-4216-a7dd-ed52fa4c71ac _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.022 183079 DEBUG nova.virt.libvirt.driver [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Start _get_guest_xml network_info=[{"id": "a49c001e-85aa-4216-a7dd-ed52fa4c71ac", "address": "fa:16:3e:a0:54:7f", "network": {"id": "ed94e4f1-14ed-42c4-8c8e-db508a59bd2c", "bridge": "br-int", "label": "tempest-test-network--696544753", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "89916c03f6f440f6ae7cf81f2ae99bad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa49c001e-85", "ovs_interfaceid": "a49c001e-85aa-4216-a7dd-ed52fa4c71ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.026 183079 WARNING nova.virt.libvirt.driver [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.031 183079 DEBUG nova.virt.libvirt.host [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.031 183079 DEBUG nova.virt.libvirt.host [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.034 183079 DEBUG nova.virt.libvirt.host [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.034 183079 DEBUG nova.virt.libvirt.host [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.035 183079 DEBUG nova.virt.libvirt.driver [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.035 183079 DEBUG nova.virt.hardware [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.035 183079 DEBUG nova.virt.hardware [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.036 183079 DEBUG nova.virt.hardware [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.036 183079 DEBUG nova.virt.hardware [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.036 183079 DEBUG nova.virt.hardware [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.036 183079 DEBUG nova.virt.hardware [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.037 183079 DEBUG nova.virt.hardware [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.037 183079 DEBUG nova.virt.hardware [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.037 183079 DEBUG nova.virt.hardware [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.037 183079 DEBUG nova.virt.hardware [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.038 183079 DEBUG nova.virt.hardware [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.041 183079 DEBUG nova.virt.libvirt.vif [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:32:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='leia',display_name='leia',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='leia',id=55,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP31lzQFM+bzgy2rILfPJtItaZEOm3vqO/5STzV+08atBuxP1/4YmUTZSa+vuUIur2j2kVkdN8zrzADLiGWPcuNSoFATd7+40/kloWBkWhl+JRfBOGEMv65jtGsYaSx1Fw==',key_name='tempest-keypair-test-1852532011',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='89916c03f6f440f6ae7cf81f2ae99bad',ramdisk_id='',reservation_id='r-ttdhfphp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_p
roject_name='tempest-InternalDNSTest-38234021',owner_user_name='tempest-InternalDNSTest-38234021-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:32:25Z,user_data=None,user_id='1ddebe2a251e4b118d9469f7d6fdb2ce',uuid=b09d9bed-19f3-4aae-8aa4-7a87468084e4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a49c001e-85aa-4216-a7dd-ed52fa4c71ac", "address": "fa:16:3e:a0:54:7f", "network": {"id": "ed94e4f1-14ed-42c4-8c8e-db508a59bd2c", "bridge": "br-int", "label": "tempest-test-network--696544753", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "89916c03f6f440f6ae7cf81f2ae99bad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa49c001e-85", "ovs_interfaceid": "a49c001e-85aa-4216-a7dd-ed52fa4c71ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.041 183079 DEBUG nova.network.os_vif_util [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Converting VIF {"id": "a49c001e-85aa-4216-a7dd-ed52fa4c71ac", "address": "fa:16:3e:a0:54:7f", "network": {"id": "ed94e4f1-14ed-42c4-8c8e-db508a59bd2c", "bridge": "br-int", "label": "tempest-test-network--696544753", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "89916c03f6f440f6ae7cf81f2ae99bad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa49c001e-85", "ovs_interfaceid": "a49c001e-85aa-4216-a7dd-ed52fa4c71ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.042 183079 DEBUG nova.network.os_vif_util [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:54:7f,bridge_name='br-int',has_traffic_filtering=True,id=a49c001e-85aa-4216-a7dd-ed52fa4c71ac,network=Network(ed94e4f1-14ed-42c4-8c8e-db508a59bd2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa49c001e-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.043 183079 DEBUG nova.objects.instance [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lazy-loading 'pci_devices' on Instance uuid b09d9bed-19f3-4aae-8aa4-7a87468084e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.112 183079 DEBUG nova.virt.libvirt.driver [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:32:44 compute-0 nova_compute[183075]:   <uuid>b09d9bed-19f3-4aae-8aa4-7a87468084e4</uuid>
Jan 22 17:32:44 compute-0 nova_compute[183075]:   <name>instance-00000037</name>
Jan 22 17:32:44 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:32:44 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:32:44 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:32:44 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:       <nova:name>leia</nova:name>
Jan 22 17:32:44 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:32:44</nova:creationTime>
Jan 22 17:32:44 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:32:44 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:32:44 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:32:44 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:32:44 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:32:44 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:32:44 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:32:44 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:32:44 compute-0 nova_compute[183075]:         <nova:user uuid="1ddebe2a251e4b118d9469f7d6fdb2ce">tempest-InternalDNSTest-38234021-project-member</nova:user>
Jan 22 17:32:44 compute-0 nova_compute[183075]:         <nova:project uuid="89916c03f6f440f6ae7cf81f2ae99bad">tempest-InternalDNSTest-38234021</nova:project>
Jan 22 17:32:44 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:32:44 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:32:44 compute-0 nova_compute[183075]:         <nova:port uuid="a49c001e-85aa-4216-a7dd-ed52fa4c71ac">
Jan 22 17:32:44 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.26" ipVersion="4"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:32:44 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:32:44 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:32:44 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <system>
Jan 22 17:32:44 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:32:44 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:32:44 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:32:44 compute-0 nova_compute[183075]:       <entry name="serial">b09d9bed-19f3-4aae-8aa4-7a87468084e4</entry>
Jan 22 17:32:44 compute-0 nova_compute[183075]:       <entry name="uuid">b09d9bed-19f3-4aae-8aa4-7a87468084e4</entry>
Jan 22 17:32:44 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     </system>
Jan 22 17:32:44 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:32:44 compute-0 nova_compute[183075]:   <os>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:   </os>
Jan 22 17:32:44 compute-0 nova_compute[183075]:   <features>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:   </features>
Jan 22 17:32:44 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:32:44 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:32:44 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:32:44 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/b09d9bed-19f3-4aae-8aa4-7a87468084e4/disk"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:32:44 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:a0:54:7f"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:       <target dev="tapa49c001e-85"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:32:44 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/b09d9bed-19f3-4aae-8aa4-7a87468084e4/console.log" append="off"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <video>
Jan 22 17:32:44 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     </video>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:32:44 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:32:44 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:32:44 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:32:44 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:32:44 compute-0 nova_compute[183075]: </domain>
Jan 22 17:32:44 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.113 183079 DEBUG nova.compute.manager [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Preparing to wait for external event network-vif-plugged-a49c001e-85aa-4216-a7dd-ed52fa4c71ac prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.117 183079 DEBUG oslo_concurrency.lockutils [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Acquiring lock "b09d9bed-19f3-4aae-8aa4-7a87468084e4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.119 183079 DEBUG oslo_concurrency.lockutils [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "b09d9bed-19f3-4aae-8aa4-7a87468084e4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.120 183079 DEBUG oslo_concurrency.lockutils [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "b09d9bed-19f3-4aae-8aa4-7a87468084e4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.123 183079 DEBUG nova.virt.libvirt.vif [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:32:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='leia',display_name='leia',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='leia',id=55,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP31lzQFM+bzgy2rILfPJtItaZEOm3vqO/5STzV+08atBuxP1/4YmUTZSa+vuUIur2j2kVkdN8zrzADLiGWPcuNSoFATd7+40/kloWBkWhl+JRfBOGEMv65jtGsYaSx1Fw==',key_name='tempest-keypair-test-1852532011',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='89916c03f6f440f6ae7cf81f2ae99bad',ramdisk_id='',reservation_id='r-ttdhfphp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InternalDNSTest-38234021',owner_user_name='tempest-InternalDNSTest-38234021-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:32:25Z,user_data=None,user_id='1ddebe2a251e4b118d9469f7d6fdb2ce',uuid=b09d9bed-19f3-4aae-8aa4-7a87468084e4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a49c001e-85aa-4216-a7dd-ed52fa4c71ac", "address": "fa:16:3e:a0:54:7f", "network": {"id": "ed94e4f1-14ed-42c4-8c8e-db508a59bd2c", "bridge": "br-int", "label": "tempest-test-network--696544753", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "89916c03f6f440f6ae7cf81f2ae99bad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa49c001e-85", "ovs_interfaceid": "a49c001e-85aa-4216-a7dd-ed52fa4c71ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.124 183079 DEBUG nova.network.os_vif_util [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Converting VIF {"id": "a49c001e-85aa-4216-a7dd-ed52fa4c71ac", "address": "fa:16:3e:a0:54:7f", "network": {"id": "ed94e4f1-14ed-42c4-8c8e-db508a59bd2c", "bridge": "br-int", "label": "tempest-test-network--696544753", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "89916c03f6f440f6ae7cf81f2ae99bad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa49c001e-85", "ovs_interfaceid": "a49c001e-85aa-4216-a7dd-ed52fa4c71ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.126 183079 DEBUG nova.network.os_vif_util [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:54:7f,bridge_name='br-int',has_traffic_filtering=True,id=a49c001e-85aa-4216-a7dd-ed52fa4c71ac,network=Network(ed94e4f1-14ed-42c4-8c8e-db508a59bd2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa49c001e-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.127 183079 DEBUG os_vif [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:54:7f,bridge_name='br-int',has_traffic_filtering=True,id=a49c001e-85aa-4216-a7dd-ed52fa4c71ac,network=Network(ed94e4f1-14ed-42c4-8c8e-db508a59bd2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa49c001e-85') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.130 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.130 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.130 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.132 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.133 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa49c001e-85, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.133 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa49c001e-85, col_values=(('external_ids', {'iface-id': 'a49c001e-85aa-4216-a7dd-ed52fa4c71ac', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a0:54:7f', 'vm-uuid': 'b09d9bed-19f3-4aae-8aa4-7a87468084e4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.135 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:44 compute-0 NetworkManager[55454]: <info>  [1769103164.1361] manager: (tapa49c001e-85): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/258)
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.137 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.141 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.141 183079 INFO os_vif [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:54:7f,bridge_name='br-int',has_traffic_filtering=True,id=a49c001e-85aa-4216-a7dd-ed52fa4c71ac,network=Network(ed94e4f1-14ed-42c4-8c8e-db508a59bd2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa49c001e-85')
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.288 183079 DEBUG nova.virt.libvirt.driver [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.289 183079 DEBUG nova.virt.libvirt.driver [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] No VIF found with MAC fa:16:3e:a0:54:7f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:32:44 compute-0 kernel: tapa49c001e-85: entered promiscuous mode
Jan 22 17:32:44 compute-0 NetworkManager[55454]: <info>  [1769103164.3475] manager: (tapa49c001e-85): new Tun device (/org/freedesktop/NetworkManager/Devices/259)
Jan 22 17:32:44 compute-0 ovn_controller[95372]: 2026-01-22T17:32:44Z|00600|binding|INFO|Claiming lport a49c001e-85aa-4216-a7dd-ed52fa4c71ac for this chassis.
Jan 22 17:32:44 compute-0 ovn_controller[95372]: 2026-01-22T17:32:44Z|00601|binding|INFO|a49c001e-85aa-4216-a7dd-ed52fa4c71ac: Claiming fa:16:3e:a0:54:7f 10.100.0.26
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.349 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:44 compute-0 ovn_controller[95372]: 2026-01-22T17:32:44Z|00602|binding|INFO|Setting lport a49c001e-85aa-4216-a7dd-ed52fa4c71ac ovn-installed in OVS
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.361 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.364 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:44 compute-0 systemd-udevd[233765]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:32:44 compute-0 systemd-machined[154382]: New machine qemu-55-instance-00000037.
Jan 22 17:32:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:44.393 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:54:7f 10.100.0.26'], port_security=['fa:16:3e:a0:54:7f 10.100.0.26'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.26/28', 'neutron:device_id': 'b09d9bed-19f3-4aae-8aa4-7a87468084e4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6b35f822-4909-4e2a-a8bd-ef6e39136861', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2644338c-80e0-4701-bf75-229e5a6223da, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=a49c001e-85aa-4216-a7dd-ed52fa4c71ac) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:32:44 compute-0 ovn_controller[95372]: 2026-01-22T17:32:44Z|00603|binding|INFO|Setting lport a49c001e-85aa-4216-a7dd-ed52fa4c71ac up in Southbound
Jan 22 17:32:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:44.394 104629 INFO neutron.agent.ovn.metadata.agent [-] Port a49c001e-85aa-4216-a7dd-ed52fa4c71ac in datapath ed94e4f1-14ed-42c4-8c8e-db508a59bd2c bound to our chassis
Jan 22 17:32:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:44.395 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ed94e4f1-14ed-42c4-8c8e-db508a59bd2c
Jan 22 17:32:44 compute-0 NetworkManager[55454]: <info>  [1769103164.3973] device (tapa49c001e-85): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:32:44 compute-0 NetworkManager[55454]: <info>  [1769103164.3980] device (tapa49c001e-85): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:32:44 compute-0 systemd[1]: Started Virtual Machine qemu-55-instance-00000037.
Jan 22 17:32:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:44.410 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[05ab1099-c188-4230-ad84-792ff10def59]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:32:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:44.441 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[0aa629f4-c3d8-40d1-ae28-0f9fa7202dcb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:32:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:44.446 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[5241931a-57eb-4e6b-877d-cf0c4898e4ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:32:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:44.475 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[fdaa6e28-76fc-4e3a-af05-b0af795a617d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:32:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:44.494 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f497cbdf-ff72-46a2-9296-a270b1830a4c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'taped94e4f1-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1a:6f:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 54, 'rx_bytes': 8920, 'tx_bytes': 6051, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 54, 'rx_bytes': 8920, 'tx_bytes': 6051, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 168], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 544169, 'reachable_time': 15481, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233779, 'error': None, 'target': 'ovnmeta-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:32:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:44.510 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f86f6194-9883-4ee8-b7d1-ccb0f9a1fd4c]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'taped94e4f1-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 544180, 'tstamp': 544180}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233780, 'error': None, 'target': 'ovnmeta-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.18'], ['IFA_LOCAL', '10.100.0.18'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'taped94e4f1-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 544183, 'tstamp': 544183}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233780, 'error': None, 'target': 'ovnmeta-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:32:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:44.511 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped94e4f1-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.513 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.514 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:44.514 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=taped94e4f1-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:32:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:44.514 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:32:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:44.515 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=taped94e4f1-10, col_values=(('external_ids', {'iface-id': '77b4e93d-6708-4efb-b060-601be2ddc621'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:32:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:32:44.515 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.632 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Updating instance_info_cache with network_info: [{"id": "6aacbedb-6999-4006-9e77-6e540614dbea", "address": "fa:16:3e:35:c7:99", "network": {"id": "ed94e4f1-14ed-42c4-8c8e-db508a59bd2c", "bridge": "br-int", "label": "tempest-test-network--696544753", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "89916c03f6f440f6ae7cf81f2ae99bad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6aacbedb-69", "ovs_interfaceid": "6aacbedb-6999-4006-9e77-6e540614dbea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.799 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-b0949fde-940d-495c-bdb0-e6c996b0274f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.800 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.800 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.932 183079 DEBUG nova.compute.manager [req-94301968-f01e-4478-92a9-e4c319a141e6 req-2718cd6c-a0f4-4064-a717-83fe0f83dfbd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Received event network-vif-plugged-a49c001e-85aa-4216-a7dd-ed52fa4c71ac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.932 183079 DEBUG oslo_concurrency.lockutils [req-94301968-f01e-4478-92a9-e4c319a141e6 req-2718cd6c-a0f4-4064-a717-83fe0f83dfbd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "b09d9bed-19f3-4aae-8aa4-7a87468084e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.933 183079 DEBUG oslo_concurrency.lockutils [req-94301968-f01e-4478-92a9-e4c319a141e6 req-2718cd6c-a0f4-4064-a717-83fe0f83dfbd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b09d9bed-19f3-4aae-8aa4-7a87468084e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.934 183079 DEBUG oslo_concurrency.lockutils [req-94301968-f01e-4478-92a9-e4c319a141e6 req-2718cd6c-a0f4-4064-a717-83fe0f83dfbd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b09d9bed-19f3-4aae-8aa4-7a87468084e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:32:44 compute-0 nova_compute[183075]: 2026-01-22 17:32:44.934 183079 DEBUG nova.compute.manager [req-94301968-f01e-4478-92a9-e4c319a141e6 req-2718cd6c-a0f4-4064-a717-83fe0f83dfbd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Processing event network-vif-plugged-a49c001e-85aa-4216-a7dd-ed52fa4c71ac _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:32:45 compute-0 nova_compute[183075]: 2026-01-22 17:32:45.434 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103165.4337485, b09d9bed-19f3-4aae-8aa4-7a87468084e4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:32:45 compute-0 nova_compute[183075]: 2026-01-22 17:32:45.434 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] VM Started (Lifecycle Event)
Jan 22 17:32:45 compute-0 nova_compute[183075]: 2026-01-22 17:32:45.436 183079 DEBUG nova.compute.manager [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:32:45 compute-0 nova_compute[183075]: 2026-01-22 17:32:45.440 183079 DEBUG nova.virt.libvirt.driver [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:32:45 compute-0 nova_compute[183075]: 2026-01-22 17:32:45.443 183079 INFO nova.virt.libvirt.driver [-] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Instance spawned successfully.
Jan 22 17:32:45 compute-0 nova_compute[183075]: 2026-01-22 17:32:45.443 183079 DEBUG nova.virt.libvirt.driver [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:32:45 compute-0 nova_compute[183075]: 2026-01-22 17:32:45.548 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:32:45 compute-0 nova_compute[183075]: 2026-01-22 17:32:45.551 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:32:45 compute-0 nova_compute[183075]: 2026-01-22 17:32:45.731 183079 DEBUG nova.virt.libvirt.driver [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:32:45 compute-0 nova_compute[183075]: 2026-01-22 17:32:45.731 183079 DEBUG nova.virt.libvirt.driver [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:32:45 compute-0 nova_compute[183075]: 2026-01-22 17:32:45.731 183079 DEBUG nova.virt.libvirt.driver [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:32:45 compute-0 nova_compute[183075]: 2026-01-22 17:32:45.732 183079 DEBUG nova.virt.libvirt.driver [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:32:45 compute-0 nova_compute[183075]: 2026-01-22 17:32:45.732 183079 DEBUG nova.virt.libvirt.driver [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:32:45 compute-0 nova_compute[183075]: 2026-01-22 17:32:45.732 183079 DEBUG nova.virt.libvirt.driver [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:32:45 compute-0 nova_compute[183075]: 2026-01-22 17:32:45.783 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:32:45 compute-0 nova_compute[183075]: 2026-01-22 17:32:45.783 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103165.4371705, b09d9bed-19f3-4aae-8aa4-7a87468084e4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:32:45 compute-0 nova_compute[183075]: 2026-01-22 17:32:45.783 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] VM Paused (Lifecycle Event)
Jan 22 17:32:46 compute-0 nova_compute[183075]: 2026-01-22 17:32:46.487 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:32:46 compute-0 nova_compute[183075]: 2026-01-22 17:32:46.490 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103165.4391863, b09d9bed-19f3-4aae-8aa4-7a87468084e4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:32:46 compute-0 nova_compute[183075]: 2026-01-22 17:32:46.490 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] VM Resumed (Lifecycle Event)
Jan 22 17:32:46 compute-0 nova_compute[183075]: 2026-01-22 17:32:46.586 183079 INFO nova.compute.manager [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Took 21.01 seconds to spawn the instance on the hypervisor.
Jan 22 17:32:46 compute-0 nova_compute[183075]: 2026-01-22 17:32:46.586 183079 DEBUG nova.compute.manager [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:32:46 compute-0 nova_compute[183075]: 2026-01-22 17:32:46.595 183079 DEBUG nova.network.neutron [req-a9f5d948-cc51-47c3-86a6-28caf4cd3ee2 req-99a6d83a-ae5a-4377-a297-bfac6c71713b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Updated VIF entry in instance network info cache for port a49c001e-85aa-4216-a7dd-ed52fa4c71ac. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:32:46 compute-0 nova_compute[183075]: 2026-01-22 17:32:46.595 183079 DEBUG nova.network.neutron [req-a9f5d948-cc51-47c3-86a6-28caf4cd3ee2 req-99a6d83a-ae5a-4377-a297-bfac6c71713b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Updating instance_info_cache with network_info: [{"id": "a49c001e-85aa-4216-a7dd-ed52fa4c71ac", "address": "fa:16:3e:a0:54:7f", "network": {"id": "ed94e4f1-14ed-42c4-8c8e-db508a59bd2c", "bridge": "br-int", "label": "tempest-test-network--696544753", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "89916c03f6f440f6ae7cf81f2ae99bad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa49c001e-85", "ovs_interfaceid": "a49c001e-85aa-4216-a7dd-ed52fa4c71ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:32:46 compute-0 nova_compute[183075]: 2026-01-22 17:32:46.897 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:32:46 compute-0 nova_compute[183075]: 2026-01-22 17:32:46.900 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:32:47 compute-0 nova_compute[183075]: 2026-01-22 17:32:47.838 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:48 compute-0 podman[233788]: 2026-01-22 17:32:48.345397096 +0000 UTC m=+0.053180349 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 17:32:49 compute-0 nova_compute[183075]: 2026-01-22 17:32:49.137 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:52 compute-0 podman[233813]: 2026-01-22 17:32:52.345867848 +0000 UTC m=+0.058616687 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:32:52 compute-0 nova_compute[183075]: 2026-01-22 17:32:52.864 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:54 compute-0 nova_compute[183075]: 2026-01-22 17:32:54.139 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:57 compute-0 nova_compute[183075]: 2026-01-22 17:32:57.865 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:58 compute-0 ovn_controller[95372]: 2026-01-22T17:32:58Z|00112|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a0:54:7f 10.100.0.26
Jan 22 17:32:58 compute-0 ovn_controller[95372]: 2026-01-22T17:32:58Z|00113|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a0:54:7f 10.100.0.26
Jan 22 17:32:59 compute-0 nova_compute[183075]: 2026-01-22 17:32:59.143 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:59 compute-0 nova_compute[183075]: 2026-01-22 17:32:59.777 183079 DEBUG oslo_concurrency.lockutils [req-a9f5d948-cc51-47c3-86a6-28caf4cd3ee2 req-99a6d83a-ae5a-4377-a297-bfac6c71713b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-b09d9bed-19f3-4aae-8aa4-7a87468084e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:32:59 compute-0 nova_compute[183075]: 2026-01-22 17:32:59.997 183079 DEBUG nova.compute.manager [req-382d8329-fbf3-4660-bce5-ee1c44eae516 req-294669d3-2787-43e3-92cd-52430d5508d4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Received event network-vif-plugged-a49c001e-85aa-4216-a7dd-ed52fa4c71ac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:32:59 compute-0 nova_compute[183075]: 2026-01-22 17:32:59.997 183079 DEBUG oslo_concurrency.lockutils [req-382d8329-fbf3-4660-bce5-ee1c44eae516 req-294669d3-2787-43e3-92cd-52430d5508d4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "b09d9bed-19f3-4aae-8aa4-7a87468084e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:32:59 compute-0 nova_compute[183075]: 2026-01-22 17:32:59.998 183079 DEBUG oslo_concurrency.lockutils [req-382d8329-fbf3-4660-bce5-ee1c44eae516 req-294669d3-2787-43e3-92cd-52430d5508d4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b09d9bed-19f3-4aae-8aa4-7a87468084e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:32:59 compute-0 nova_compute[183075]: 2026-01-22 17:32:59.998 183079 DEBUG oslo_concurrency.lockutils [req-382d8329-fbf3-4660-bce5-ee1c44eae516 req-294669d3-2787-43e3-92cd-52430d5508d4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b09d9bed-19f3-4aae-8aa4-7a87468084e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:32:59 compute-0 nova_compute[183075]: 2026-01-22 17:32:59.998 183079 DEBUG nova.compute.manager [req-382d8329-fbf3-4660-bce5-ee1c44eae516 req-294669d3-2787-43e3-92cd-52430d5508d4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] No waiting events found dispatching network-vif-plugged-a49c001e-85aa-4216-a7dd-ed52fa4c71ac pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:32:59 compute-0 nova_compute[183075]: 2026-01-22 17:32:59.998 183079 WARNING nova.compute.manager [req-382d8329-fbf3-4660-bce5-ee1c44eae516 req-294669d3-2787-43e3-92cd-52430d5508d4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Received unexpected event network-vif-plugged-a49c001e-85aa-4216-a7dd-ed52fa4c71ac for instance with vm_state building and task_state spawning.
Jan 22 17:33:00 compute-0 nova_compute[183075]: 2026-01-22 17:33:00.036 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:33:00 compute-0 nova_compute[183075]: 2026-01-22 17:33:00.105 183079 INFO nova.compute.manager [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Took 35.22 seconds to build instance.
Jan 22 17:33:00 compute-0 nova_compute[183075]: 2026-01-22 17:33:00.229 183079 DEBUG oslo_concurrency.lockutils [None req-2df8e116-cd50-4056-998c-80f55b34a5d2 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "b09d9bed-19f3-4aae-8aa4-7a87468084e4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 35.524s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:33:00 compute-0 nova_compute[183075]: 2026-01-22 17:33:00.801 183079 DEBUG oslo_concurrency.lockutils [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Acquiring lock "eb04fe29-6d1c-4572-b219-f60350425077" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:33:00 compute-0 nova_compute[183075]: 2026-01-22 17:33:00.801 183079 DEBUG oslo_concurrency.lockutils [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "eb04fe29-6d1c-4572-b219-f60350425077" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:33:00 compute-0 nova_compute[183075]: 2026-01-22 17:33:00.834 183079 INFO nova.compute.manager [None req-0dc376e3-2cb1-4964-9de9-b5c0e331523a 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Get console output
Jan 22 17:33:00 compute-0 nova_compute[183075]: 2026-01-22 17:33:00.841 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:33:00 compute-0 nova_compute[183075]: 2026-01-22 17:33:00.908 183079 DEBUG nova.compute.manager [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:33:01 compute-0 nova_compute[183075]: 2026-01-22 17:33:01.248 183079 DEBUG oslo_concurrency.lockutils [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:33:01 compute-0 nova_compute[183075]: 2026-01-22 17:33:01.249 183079 DEBUG oslo_concurrency.lockutils [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:33:01 compute-0 nova_compute[183075]: 2026-01-22 17:33:01.258 183079 DEBUG nova.virt.hardware [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:33:01 compute-0 nova_compute[183075]: 2026-01-22 17:33:01.258 183079 INFO nova.compute.claims [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:33:01 compute-0 nova_compute[183075]: 2026-01-22 17:33:01.635 183079 DEBUG nova.compute.provider_tree [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:33:01 compute-0 nova_compute[183075]: 2026-01-22 17:33:01.679 183079 DEBUG nova.scheduler.client.report [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:33:02 compute-0 nova_compute[183075]: 2026-01-22 17:33:02.194 183079 DEBUG oslo_concurrency.lockutils [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.945s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:33:02 compute-0 nova_compute[183075]: 2026-01-22 17:33:02.195 183079 DEBUG nova.compute.manager [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:33:02 compute-0 nova_compute[183075]: 2026-01-22 17:33:02.435 183079 DEBUG nova.compute.manager [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:33:02 compute-0 nova_compute[183075]: 2026-01-22 17:33:02.436 183079 DEBUG nova.network.neutron [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:33:02 compute-0 nova_compute[183075]: 2026-01-22 17:33:02.613 183079 INFO nova.virt.libvirt.driver [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:33:02 compute-0 nova_compute[183075]: 2026-01-22 17:33:02.631 183079 DEBUG nova.compute.manager [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:33:02 compute-0 nova_compute[183075]: 2026-01-22 17:33:02.867 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:03 compute-0 nova_compute[183075]: 2026-01-22 17:33:03.214 183079 DEBUG nova.policy [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:33:04 compute-0 nova_compute[183075]: 2026-01-22 17:33:04.147 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:04 compute-0 podman[233855]: 2026-01-22 17:33:04.358449958 +0000 UTC m=+0.063590032 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 22 17:33:04 compute-0 podman[233856]: 2026-01-22 17:33:04.36329928 +0000 UTC m=+0.063527430 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 17:33:04 compute-0 nova_compute[183075]: 2026-01-22 17:33:04.425 183079 DEBUG nova.compute.manager [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:33:04 compute-0 nova_compute[183075]: 2026-01-22 17:33:04.426 183079 DEBUG nova.virt.libvirt.driver [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:33:04 compute-0 nova_compute[183075]: 2026-01-22 17:33:04.427 183079 INFO nova.virt.libvirt.driver [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Creating image(s)
Jan 22 17:33:04 compute-0 nova_compute[183075]: 2026-01-22 17:33:04.428 183079 DEBUG oslo_concurrency.lockutils [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Acquiring lock "/var/lib/nova/instances/eb04fe29-6d1c-4572-b219-f60350425077/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:33:04 compute-0 nova_compute[183075]: 2026-01-22 17:33:04.428 183079 DEBUG oslo_concurrency.lockutils [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "/var/lib/nova/instances/eb04fe29-6d1c-4572-b219-f60350425077/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:33:04 compute-0 nova_compute[183075]: 2026-01-22 17:33:04.429 183079 DEBUG oslo_concurrency.lockutils [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "/var/lib/nova/instances/eb04fe29-6d1c-4572-b219-f60350425077/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:33:04 compute-0 podman[233854]: 2026-01-22 17:33:04.443429394 +0000 UTC m=+0.154319794 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:33:04 compute-0 nova_compute[183075]: 2026-01-22 17:33:04.443 183079 DEBUG oslo_concurrency.processutils [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:33:04 compute-0 nova_compute[183075]: 2026-01-22 17:33:04.501 183079 DEBUG oslo_concurrency.processutils [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:33:04 compute-0 nova_compute[183075]: 2026-01-22 17:33:04.502 183079 DEBUG oslo_concurrency.lockutils [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:33:04 compute-0 nova_compute[183075]: 2026-01-22 17:33:04.502 183079 DEBUG oslo_concurrency.lockutils [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:33:04 compute-0 nova_compute[183075]: 2026-01-22 17:33:04.513 183079 DEBUG oslo_concurrency.processutils [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:33:04 compute-0 nova_compute[183075]: 2026-01-22 17:33:04.571 183079 DEBUG oslo_concurrency.processutils [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:33:04 compute-0 nova_compute[183075]: 2026-01-22 17:33:04.572 183079 DEBUG oslo_concurrency.processutils [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/eb04fe29-6d1c-4572-b219-f60350425077/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:33:04 compute-0 nova_compute[183075]: 2026-01-22 17:33:04.708 183079 DEBUG oslo_concurrency.processutils [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/eb04fe29-6d1c-4572-b219-f60350425077/disk 1073741824" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:33:04 compute-0 nova_compute[183075]: 2026-01-22 17:33:04.709 183079 DEBUG oslo_concurrency.lockutils [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:33:04 compute-0 nova_compute[183075]: 2026-01-22 17:33:04.710 183079 DEBUG oslo_concurrency.processutils [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:33:04 compute-0 nova_compute[183075]: 2026-01-22 17:33:04.781 183079 DEBUG oslo_concurrency.processutils [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:33:04 compute-0 nova_compute[183075]: 2026-01-22 17:33:04.782 183079 DEBUG nova.virt.disk.api [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Checking if we can resize image /var/lib/nova/instances/eb04fe29-6d1c-4572-b219-f60350425077/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:33:04 compute-0 nova_compute[183075]: 2026-01-22 17:33:04.783 183079 DEBUG oslo_concurrency.processutils [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb04fe29-6d1c-4572-b219-f60350425077/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:33:04 compute-0 nova_compute[183075]: 2026-01-22 17:33:04.843 183079 DEBUG oslo_concurrency.processutils [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb04fe29-6d1c-4572-b219-f60350425077/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:33:04 compute-0 nova_compute[183075]: 2026-01-22 17:33:04.843 183079 DEBUG nova.virt.disk.api [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Cannot resize image /var/lib/nova/instances/eb04fe29-6d1c-4572-b219-f60350425077/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:33:04 compute-0 nova_compute[183075]: 2026-01-22 17:33:04.844 183079 DEBUG nova.objects.instance [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lazy-loading 'migration_context' on Instance uuid eb04fe29-6d1c-4572-b219-f60350425077 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:33:04 compute-0 nova_compute[183075]: 2026-01-22 17:33:04.860 183079 DEBUG nova.virt.libvirt.driver [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:33:04 compute-0 nova_compute[183075]: 2026-01-22 17:33:04.861 183079 DEBUG nova.virt.libvirt.driver [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Ensure instance console log exists: /var/lib/nova/instances/eb04fe29-6d1c-4572-b219-f60350425077/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:33:04 compute-0 nova_compute[183075]: 2026-01-22 17:33:04.861 183079 DEBUG oslo_concurrency.lockutils [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:33:04 compute-0 nova_compute[183075]: 2026-01-22 17:33:04.862 183079 DEBUG oslo_concurrency.lockutils [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:33:04 compute-0 nova_compute[183075]: 2026-01-22 17:33:04.862 183079 DEBUG oslo_concurrency.lockutils [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:33:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:05.681 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:33:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:05.682 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:33:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:33:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:33:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:33:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:33:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:33:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.26
Jan 22 17:33:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ed94e4f1-14ed-42c4-8c8e-db508a59bd2c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:33:05 compute-0 nova_compute[183075]: 2026-01-22 17:33:05.972 183079 DEBUG nova.network.neutron [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Successfully created port: 7a42a264-5341-4ac6-8da3-c317a7b2c279 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:33:06 compute-0 nova_compute[183075]: 2026-01-22 17:33:06.078 183079 INFO nova.compute.manager [None req-828a0bbf-ca4c-4d34-9f82-25e70192eb90 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Get console output
Jan 22 17:33:06 compute-0 nova_compute[183075]: 2026-01-22 17:33:06.083 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.221 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.222 104990 INFO eventlet.wsgi.server [-] 10.100.0.26,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.5394499
Jan 22 17:33:06 compute-0 haproxy-metadata-proxy-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233184]: 10.100.0.26:37300 [22/Jan/2026:17:33:05.681] listener listener/metadata 0/0/0/541/541 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.230 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.231 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.26
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ed94e4f1-14ed-42c4-8c8e-db508a59bd2c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.257 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.258 104990 INFO eventlet.wsgi.server [-] 10.100.0.26,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0266259
Jan 22 17:33:06 compute-0 haproxy-metadata-proxy-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233184]: 10.100.0.26:37306 [22/Jan/2026:17:33:06.230] listener listener/metadata 0/0/0/27/27 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.262 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.263 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.26
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ed94e4f1-14ed-42c4-8c8e-db508a59bd2c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.278 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.279 104990 INFO eventlet.wsgi.server [-] 10.100.0.26,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0156515
Jan 22 17:33:06 compute-0 haproxy-metadata-proxy-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233184]: 10.100.0.26:37318 [22/Jan/2026:17:33:06.262] listener listener/metadata 0/0/0/16/16 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.283 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.284 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.26
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ed94e4f1-14ed-42c4-8c8e-db508a59bd2c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.459 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.460 104990 INFO eventlet.wsgi.server [-] 10.100.0.26,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.1760471
Jan 22 17:33:06 compute-0 haproxy-metadata-proxy-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233184]: 10.100.0.26:37332 [22/Jan/2026:17:33:06.283] listener listener/metadata 0/0/0/176/176 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.464 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.465 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.26
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ed94e4f1-14ed-42c4-8c8e-db508a59bd2c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.483 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.483 104990 INFO eventlet.wsgi.server [-] 10.100.0.26,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0183039
Jan 22 17:33:06 compute-0 haproxy-metadata-proxy-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233184]: 10.100.0.26:37334 [22/Jan/2026:17:33:06.464] listener listener/metadata 0/0/0/19/19 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.489 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.490 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.26
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ed94e4f1-14ed-42c4-8c8e-db508a59bd2c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.503 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.504 104990 INFO eventlet.wsgi.server [-] 10.100.0.26,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0138354
Jan 22 17:33:06 compute-0 haproxy-metadata-proxy-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233184]: 10.100.0.26:37344 [22/Jan/2026:17:33:06.489] listener listener/metadata 0/0/0/14/14 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.508 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.509 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.26
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ed94e4f1-14ed-42c4-8c8e-db508a59bd2c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.521 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:33:06 compute-0 haproxy-metadata-proxy-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233184]: 10.100.0.26:37354 [22/Jan/2026:17:33:06.508] listener listener/metadata 0/0/0/13/13 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.522 104990 INFO eventlet.wsgi.server [-] 10.100.0.26,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0130868
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.526 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.527 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.26
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ed94e4f1-14ed-42c4-8c8e-db508a59bd2c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.541 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.541 104990 INFO eventlet.wsgi.server [-] 10.100.0.26,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0145111
Jan 22 17:33:06 compute-0 haproxy-metadata-proxy-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233184]: 10.100.0.26:37362 [22/Jan/2026:17:33:06.526] listener listener/metadata 0/0/0/15/15 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.546 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.547 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.26
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ed94e4f1-14ed-42c4-8c8e-db508a59bd2c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.564 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.565 104990 INFO eventlet.wsgi.server [-] 10.100.0.26,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 139 time: 0.0183578
Jan 22 17:33:06 compute-0 haproxy-metadata-proxy-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233184]: 10.100.0.26:37372 [22/Jan/2026:17:33:06.546] listener listener/metadata 0/0/0/19/19 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.576 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.576 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.26
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ed94e4f1-14ed-42c4-8c8e-db508a59bd2c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.590 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.590 104990 INFO eventlet.wsgi.server [-] 10.100.0.26,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 139 time: 0.0140057
Jan 22 17:33:06 compute-0 haproxy-metadata-proxy-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233184]: 10.100.0.26:37382 [22/Jan/2026:17:33:06.575] listener listener/metadata 0/0/0/15/15 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.595 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.595 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.26
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ed94e4f1-14ed-42c4-8c8e-db508a59bd2c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.608 104990 INFO eventlet.wsgi.server [-] 10.100.0.26,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0127583
Jan 22 17:33:06 compute-0 haproxy-metadata-proxy-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233184]: 10.100.0.26:37392 [22/Jan/2026:17:33:06.594] listener listener/metadata 0/0/0/13/13 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.616 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.617 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.26
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ed94e4f1-14ed-42c4-8c8e-db508a59bd2c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.632 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:33:06 compute-0 haproxy-metadata-proxy-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233184]: 10.100.0.26:37408 [22/Jan/2026:17:33:06.616] listener listener/metadata 0/0/0/16/16 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.633 104990 INFO eventlet.wsgi.server [-] 10.100.0.26,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0163813
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.637 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.638 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.26
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ed94e4f1-14ed-42c4-8c8e-db508a59bd2c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.660 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.660 104990 INFO eventlet.wsgi.server [-] 10.100.0.26,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0224626
Jan 22 17:33:06 compute-0 haproxy-metadata-proxy-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233184]: 10.100.0.26:37422 [22/Jan/2026:17:33:06.637] listener listener/metadata 0/0/0/23/23 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.665 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.665 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.26
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ed94e4f1-14ed-42c4-8c8e-db508a59bd2c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.681 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.682 104990 INFO eventlet.wsgi.server [-] 10.100.0.26,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0163348
Jan 22 17:33:06 compute-0 haproxy-metadata-proxy-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233184]: 10.100.0.26:37430 [22/Jan/2026:17:33:06.665] listener listener/metadata 0/0/0/17/17 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.687 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.688 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.26
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ed94e4f1-14ed-42c4-8c8e-db508a59bd2c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.703 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.704 104990 INFO eventlet.wsgi.server [-] 10.100.0.26,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 139 time: 0.0162420
Jan 22 17:33:06 compute-0 haproxy-metadata-proxy-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233184]: 10.100.0.26:37440 [22/Jan/2026:17:33:06.686] listener listener/metadata 0/0/0/17/17 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.709 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.710 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.26
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ed94e4f1-14ed-42c4-8c8e-db508a59bd2c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.726 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:33:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:06.727 104990 INFO eventlet.wsgi.server [-] 10.100.0.26,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0168970
Jan 22 17:33:06 compute-0 haproxy-metadata-proxy-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233184]: 10.100.0.26:37444 [22/Jan/2026:17:33:06.709] listener listener/metadata 0/0/0/18/18 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:33:07 compute-0 nova_compute[183075]: 2026-01-22 17:33:07.869 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:07 compute-0 nova_compute[183075]: 2026-01-22 17:33:07.991 183079 DEBUG nova.network.neutron [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Successfully updated port: 7a42a264-5341-4ac6-8da3-c317a7b2c279 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:33:08 compute-0 podman[233930]: 2026-01-22 17:33:08.364691077 +0000 UTC m=+0.073917832 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 22 17:33:08 compute-0 nova_compute[183075]: 2026-01-22 17:33:08.475 183079 DEBUG nova.compute.manager [req-3e7d869d-0864-4ded-bc12-4352a04f52c4 req-9377ce02-baa5-49bb-ba0b-53bce6203229 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Received event network-changed-7a42a264-5341-4ac6-8da3-c317a7b2c279 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:33:08 compute-0 nova_compute[183075]: 2026-01-22 17:33:08.476 183079 DEBUG nova.compute.manager [req-3e7d869d-0864-4ded-bc12-4352a04f52c4 req-9377ce02-baa5-49bb-ba0b-53bce6203229 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Refreshing instance network info cache due to event network-changed-7a42a264-5341-4ac6-8da3-c317a7b2c279. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:33:08 compute-0 nova_compute[183075]: 2026-01-22 17:33:08.477 183079 DEBUG oslo_concurrency.lockutils [req-3e7d869d-0864-4ded-bc12-4352a04f52c4 req-9377ce02-baa5-49bb-ba0b-53bce6203229 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-eb04fe29-6d1c-4572-b219-f60350425077" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:33:08 compute-0 nova_compute[183075]: 2026-01-22 17:33:08.477 183079 DEBUG oslo_concurrency.lockutils [req-3e7d869d-0864-4ded-bc12-4352a04f52c4 req-9377ce02-baa5-49bb-ba0b-53bce6203229 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-eb04fe29-6d1c-4572-b219-f60350425077" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:33:08 compute-0 nova_compute[183075]: 2026-01-22 17:33:08.477 183079 DEBUG nova.network.neutron [req-3e7d869d-0864-4ded-bc12-4352a04f52c4 req-9377ce02-baa5-49bb-ba0b-53bce6203229 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Refreshing network info cache for port 7a42a264-5341-4ac6-8da3-c317a7b2c279 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:33:08 compute-0 nova_compute[183075]: 2026-01-22 17:33:08.630 183079 DEBUG oslo_concurrency.lockutils [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Acquiring lock "refresh_cache-eb04fe29-6d1c-4572-b219-f60350425077" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:33:09 compute-0 nova_compute[183075]: 2026-01-22 17:33:09.149 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:10 compute-0 nova_compute[183075]: 2026-01-22 17:33:10.554 183079 DEBUG nova.network.neutron [req-3e7d869d-0864-4ded-bc12-4352a04f52c4 req-9377ce02-baa5-49bb-ba0b-53bce6203229 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:33:11 compute-0 nova_compute[183075]: 2026-01-22 17:33:11.191 183079 INFO nova.compute.manager [None req-e253f3df-d73d-4e51-9071-4ac20baab5fb 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Get console output
Jan 22 17:33:11 compute-0 nova_compute[183075]: 2026-01-22 17:33:11.196 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:33:11 compute-0 nova_compute[183075]: 2026-01-22 17:33:11.600 183079 DEBUG nova.network.neutron [req-3e7d869d-0864-4ded-bc12-4352a04f52c4 req-9377ce02-baa5-49bb-ba0b-53bce6203229 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:33:11 compute-0 nova_compute[183075]: 2026-01-22 17:33:11.622 183079 DEBUG oslo_concurrency.lockutils [req-3e7d869d-0864-4ded-bc12-4352a04f52c4 req-9377ce02-baa5-49bb-ba0b-53bce6203229 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-eb04fe29-6d1c-4572-b219-f60350425077" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:33:11 compute-0 nova_compute[183075]: 2026-01-22 17:33:11.623 183079 DEBUG oslo_concurrency.lockutils [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Acquired lock "refresh_cache-eb04fe29-6d1c-4572-b219-f60350425077" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:33:11 compute-0 nova_compute[183075]: 2026-01-22 17:33:11.623 183079 DEBUG nova.network.neutron [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:33:11 compute-0 nova_compute[183075]: 2026-01-22 17:33:11.784 183079 DEBUG nova.network.neutron [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:33:12 compute-0 nova_compute[183075]: 2026-01-22 17:33:12.871 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:14 compute-0 nova_compute[183075]: 2026-01-22 17:33:14.151 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:14 compute-0 nova_compute[183075]: 2026-01-22 17:33:14.624 183079 DEBUG nova.network.neutron [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Updating instance_info_cache with network_info: [{"id": "7a42a264-5341-4ac6-8da3-c317a7b2c279", "address": "fa:16:3e:f5:0b:0f", "network": {"id": "2fd77df8-cf00-4afc-b4cf-75b5722c375c", "bridge": "br-int", "label": "tempest-test-network--158385692", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef5:b0f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22f75e117e724f9aaadf5b8fd25a6ef6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a42a264-53", "ovs_interfaceid": "7a42a264-5341-4ac6-8da3-c317a7b2c279", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:33:17 compute-0 nova_compute[183075]: 2026-01-22 17:33:17.874 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:17 compute-0 nova_compute[183075]: 2026-01-22 17:33:17.961 183079 DEBUG oslo_concurrency.lockutils [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Releasing lock "refresh_cache-eb04fe29-6d1c-4572-b219-f60350425077" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:33:17 compute-0 nova_compute[183075]: 2026-01-22 17:33:17.962 183079 DEBUG nova.compute.manager [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Instance network_info: |[{"id": "7a42a264-5341-4ac6-8da3-c317a7b2c279", "address": "fa:16:3e:f5:0b:0f", "network": {"id": "2fd77df8-cf00-4afc-b4cf-75b5722c375c", "bridge": "br-int", "label": "tempest-test-network--158385692", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef5:b0f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22f75e117e724f9aaadf5b8fd25a6ef6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a42a264-53", "ovs_interfaceid": "7a42a264-5341-4ac6-8da3-c317a7b2c279", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:33:17 compute-0 nova_compute[183075]: 2026-01-22 17:33:17.964 183079 DEBUG nova.virt.libvirt.driver [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Start _get_guest_xml network_info=[{"id": "7a42a264-5341-4ac6-8da3-c317a7b2c279", "address": "fa:16:3e:f5:0b:0f", "network": {"id": "2fd77df8-cf00-4afc-b4cf-75b5722c375c", "bridge": "br-int", "label": "tempest-test-network--158385692", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef5:b0f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22f75e117e724f9aaadf5b8fd25a6ef6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a42a264-53", "ovs_interfaceid": "7a42a264-5341-4ac6-8da3-c317a7b2c279", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:33:17 compute-0 nova_compute[183075]: 2026-01-22 17:33:17.968 183079 WARNING nova.virt.libvirt.driver [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:33:17 compute-0 nova_compute[183075]: 2026-01-22 17:33:17.980 183079 DEBUG nova.virt.libvirt.host [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:33:17 compute-0 nova_compute[183075]: 2026-01-22 17:33:17.981 183079 DEBUG nova.virt.libvirt.host [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:33:17 compute-0 nova_compute[183075]: 2026-01-22 17:33:17.984 183079 DEBUG nova.virt.libvirt.host [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:33:17 compute-0 nova_compute[183075]: 2026-01-22 17:33:17.985 183079 DEBUG nova.virt.libvirt.host [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:33:17 compute-0 nova_compute[183075]: 2026-01-22 17:33:17.985 183079 DEBUG nova.virt.libvirt.driver [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:33:17 compute-0 nova_compute[183075]: 2026-01-22 17:33:17.985 183079 DEBUG nova.virt.hardware [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:33:17 compute-0 nova_compute[183075]: 2026-01-22 17:33:17.986 183079 DEBUG nova.virt.hardware [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:33:17 compute-0 nova_compute[183075]: 2026-01-22 17:33:17.986 183079 DEBUG nova.virt.hardware [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:33:17 compute-0 nova_compute[183075]: 2026-01-22 17:33:17.986 183079 DEBUG nova.virt.hardware [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:33:17 compute-0 nova_compute[183075]: 2026-01-22 17:33:17.987 183079 DEBUG nova.virt.hardware [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:33:17 compute-0 nova_compute[183075]: 2026-01-22 17:33:17.987 183079 DEBUG nova.virt.hardware [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:33:17 compute-0 nova_compute[183075]: 2026-01-22 17:33:17.987 183079 DEBUG nova.virt.hardware [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:33:17 compute-0 nova_compute[183075]: 2026-01-22 17:33:17.987 183079 DEBUG nova.virt.hardware [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:33:17 compute-0 nova_compute[183075]: 2026-01-22 17:33:17.988 183079 DEBUG nova.virt.hardware [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:33:17 compute-0 nova_compute[183075]: 2026-01-22 17:33:17.988 183079 DEBUG nova.virt.hardware [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:33:17 compute-0 nova_compute[183075]: 2026-01-22 17:33:17.988 183079 DEBUG nova.virt.hardware [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:33:17 compute-0 nova_compute[183075]: 2026-01-22 17:33:17.992 183079 DEBUG nova.virt.libvirt.vif [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:32:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-887859950',display_name='tempest-server-test-887859950',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-887859950',id=56,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI5QyK5QBIiC5rcI23u+Ha4rToriS54oXRHOciJ+8yg9OiIFQ5pHcofppLwzDPzqktD3JMTTskAgQadosoWLVCa44nM7NokRRlJk11u7nt0exfz9e0AepzmOn9wpcPYeVg==',key_name='tempest-keypair-test-1290435476',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='22f75e117e724f9aaadf5b8fd25a6ef6',ramdisk_id='',reservation_id='r-kfvkuair',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessSecGroupDualStackDHCPv6StatelessTest-80323303',owner_user_name='tempest-StatelessSecGroupDualStackDHCPv6StatelessTest-80323303-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:33:03Z,user_data=None,user_id='c4621e42483d4f49b9a97f2b7eb886dc',uuid=eb04fe29-6d1c-4572-b219-f60350425077,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7a42a264-5341-4ac6-8da3-c317a7b2c279", "address": "fa:16:3e:f5:0b:0f", "network": {"id": "2fd77df8-cf00-4afc-b4cf-75b5722c375c", "bridge": "br-int", "label": "tempest-test-network--158385692", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef5:b0f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22f75e117e724f9aaadf5b8fd25a6ef6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a42a264-53", "ovs_interfaceid": "7a42a264-5341-4ac6-8da3-c317a7b2c279", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:33:17 compute-0 nova_compute[183075]: 2026-01-22 17:33:17.992 183079 DEBUG nova.network.os_vif_util [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Converting VIF {"id": "7a42a264-5341-4ac6-8da3-c317a7b2c279", "address": "fa:16:3e:f5:0b:0f", "network": {"id": "2fd77df8-cf00-4afc-b4cf-75b5722c375c", "bridge": "br-int", "label": "tempest-test-network--158385692", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef5:b0f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22f75e117e724f9aaadf5b8fd25a6ef6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a42a264-53", "ovs_interfaceid": "7a42a264-5341-4ac6-8da3-c317a7b2c279", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:33:17 compute-0 nova_compute[183075]: 2026-01-22 17:33:17.993 183079 DEBUG nova.network.os_vif_util [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f5:0b:0f,bridge_name='br-int',has_traffic_filtering=True,id=7a42a264-5341-4ac6-8da3-c317a7b2c279,network=Network(2fd77df8-cf00-4afc-b4cf-75b5722c375c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a42a264-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:33:17 compute-0 nova_compute[183075]: 2026-01-22 17:33:17.994 183079 DEBUG nova.objects.instance [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lazy-loading 'pci_devices' on Instance uuid eb04fe29-6d1c-4572-b219-f60350425077 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.042 183079 DEBUG nova.virt.libvirt.driver [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:33:18 compute-0 nova_compute[183075]:   <uuid>eb04fe29-6d1c-4572-b219-f60350425077</uuid>
Jan 22 17:33:18 compute-0 nova_compute[183075]:   <name>instance-00000038</name>
Jan 22 17:33:18 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:33:18 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:33:18 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:33:18 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-887859950</nova:name>
Jan 22 17:33:18 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:33:17</nova:creationTime>
Jan 22 17:33:18 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:33:18 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:33:18 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:33:18 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:33:18 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:33:18 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:33:18 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:33:18 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:33:18 compute-0 nova_compute[183075]:         <nova:user uuid="c4621e42483d4f49b9a97f2b7eb886dc">tempest-StatelessSecGroupDualStackDHCPv6StatelessTest-80323303-project-member</nova:user>
Jan 22 17:33:18 compute-0 nova_compute[183075]:         <nova:project uuid="22f75e117e724f9aaadf5b8fd25a6ef6">tempest-StatelessSecGroupDualStackDHCPv6StatelessTest-80323303</nova:project>
Jan 22 17:33:18 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:33:18 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:33:18 compute-0 nova_compute[183075]:         <nova:port uuid="7a42a264-5341-4ac6-8da3-c317a7b2c279">
Jan 22 17:33:18 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fef5:b0f" ipVersion="6"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:33:18 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:33:18 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:33:18 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <system>
Jan 22 17:33:18 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:33:18 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:33:18 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:33:18 compute-0 nova_compute[183075]:       <entry name="serial">eb04fe29-6d1c-4572-b219-f60350425077</entry>
Jan 22 17:33:18 compute-0 nova_compute[183075]:       <entry name="uuid">eb04fe29-6d1c-4572-b219-f60350425077</entry>
Jan 22 17:33:18 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     </system>
Jan 22 17:33:18 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:33:18 compute-0 nova_compute[183075]:   <os>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:   </os>
Jan 22 17:33:18 compute-0 nova_compute[183075]:   <features>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:   </features>
Jan 22 17:33:18 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:33:18 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:33:18 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:33:18 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/eb04fe29-6d1c-4572-b219-f60350425077/disk"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:33:18 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:f5:0b:0f"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:       <target dev="tap7a42a264-53"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:33:18 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/eb04fe29-6d1c-4572-b219-f60350425077/console.log" append="off"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <video>
Jan 22 17:33:18 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     </video>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:33:18 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:33:18 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:33:18 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:33:18 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:33:18 compute-0 nova_compute[183075]: </domain>
Jan 22 17:33:18 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.044 183079 DEBUG nova.compute.manager [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Preparing to wait for external event network-vif-plugged-7a42a264-5341-4ac6-8da3-c317a7b2c279 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.044 183079 DEBUG oslo_concurrency.lockutils [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Acquiring lock "eb04fe29-6d1c-4572-b219-f60350425077-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.044 183079 DEBUG oslo_concurrency.lockutils [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "eb04fe29-6d1c-4572-b219-f60350425077-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.045 183079 DEBUG oslo_concurrency.lockutils [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "eb04fe29-6d1c-4572-b219-f60350425077-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.045 183079 DEBUG nova.virt.libvirt.vif [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:32:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-887859950',display_name='tempest-server-test-887859950',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-887859950',id=56,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI5QyK5QBIiC5rcI23u+Ha4rToriS54oXRHOciJ+8yg9OiIFQ5pHcofppLwzDPzqktD3JMTTskAgQadosoWLVCa44nM7NokRRlJk11u7nt0exfz9e0AepzmOn9wpcPYeVg==',key_name='tempest-keypair-test-1290435476',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='22f75e117e724f9aaadf5b8fd25a6ef6',ramdisk_id='',reservation_id='r-kfvkuair',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessSecGroupDualStackDHCPv6StatelessTest-80323303',owner_user_name='tempest-StatelessSecGroupDualStackDHCPv6StatelessTest-80323303-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:33:03Z,user_data=None,user_id='c4621e42483d4f49b9a97f2b7eb886dc',uuid=eb04fe29-6d1c-4572-b219-f60350425077,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7a42a264-5341-4ac6-8da3-c317a7b2c279", "address": "fa:16:3e:f5:0b:0f", "network": {"id": "2fd77df8-cf00-4afc-b4cf-75b5722c375c", "bridge": "br-int", "label": "tempest-test-network--158385692", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef5:b0f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22f75e117e724f9aaadf5b8fd25a6ef6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a42a264-53", "ovs_interfaceid": "7a42a264-5341-4ac6-8da3-c317a7b2c279", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.046 183079 DEBUG nova.network.os_vif_util [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Converting VIF {"id": "7a42a264-5341-4ac6-8da3-c317a7b2c279", "address": "fa:16:3e:f5:0b:0f", "network": {"id": "2fd77df8-cf00-4afc-b4cf-75b5722c375c", "bridge": "br-int", "label": "tempest-test-network--158385692", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef5:b0f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22f75e117e724f9aaadf5b8fd25a6ef6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a42a264-53", "ovs_interfaceid": "7a42a264-5341-4ac6-8da3-c317a7b2c279", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.046 183079 DEBUG nova.network.os_vif_util [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f5:0b:0f,bridge_name='br-int',has_traffic_filtering=True,id=7a42a264-5341-4ac6-8da3-c317a7b2c279,network=Network(2fd77df8-cf00-4afc-b4cf-75b5722c375c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a42a264-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.047 183079 DEBUG os_vif [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f5:0b:0f,bridge_name='br-int',has_traffic_filtering=True,id=7a42a264-5341-4ac6-8da3-c317a7b2c279,network=Network(2fd77df8-cf00-4afc-b4cf-75b5722c375c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a42a264-53') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.047 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.047 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.048 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.050 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.050 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7a42a264-53, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.050 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7a42a264-53, col_values=(('external_ids', {'iface-id': '7a42a264-5341-4ac6-8da3-c317a7b2c279', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f5:0b:0f', 'vm-uuid': 'eb04fe29-6d1c-4572-b219-f60350425077'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:33:18 compute-0 NetworkManager[55454]: <info>  [1769103198.0537] manager: (tap7a42a264-53): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/260)
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.053 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.060 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.061 183079 INFO os_vif [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f5:0b:0f,bridge_name='br-int',has_traffic_filtering=True,id=7a42a264-5341-4ac6-8da3-c317a7b2c279,network=Network(2fd77df8-cf00-4afc-b4cf-75b5722c375c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a42a264-53')
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.181 183079 DEBUG nova.virt.libvirt.driver [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.181 183079 DEBUG nova.virt.libvirt.driver [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] No VIF found with MAC fa:16:3e:f5:0b:0f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:33:18 compute-0 kernel: tap7a42a264-53: entered promiscuous mode
Jan 22 17:33:18 compute-0 NetworkManager[55454]: <info>  [1769103198.2435] manager: (tap7a42a264-53): new Tun device (/org/freedesktop/NetworkManager/Devices/261)
Jan 22 17:33:18 compute-0 ovn_controller[95372]: 2026-01-22T17:33:18Z|00604|binding|INFO|Claiming lport 7a42a264-5341-4ac6-8da3-c317a7b2c279 for this chassis.
Jan 22 17:33:18 compute-0 ovn_controller[95372]: 2026-01-22T17:33:18Z|00605|binding|INFO|7a42a264-5341-4ac6-8da3-c317a7b2c279: Claiming fa:16:3e:f5:0b:0f 10.100.0.6 2001:db8::f816:3eff:fef5:b0f
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.244 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:18 compute-0 ovn_controller[95372]: 2026-01-22T17:33:18Z|00606|binding|INFO|Setting lport 7a42a264-5341-4ac6-8da3-c317a7b2c279 ovn-installed in OVS
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.258 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.263 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:18 compute-0 systemd-udevd[233974]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:33:18 compute-0 systemd-machined[154382]: New machine qemu-56-instance-00000038.
Jan 22 17:33:18 compute-0 NetworkManager[55454]: <info>  [1769103198.2864] device (tap7a42a264-53): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:33:18 compute-0 NetworkManager[55454]: <info>  [1769103198.2879] device (tap7a42a264-53): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:33:18 compute-0 systemd[1]: Started Virtual Machine qemu-56-instance-00000038.
Jan 22 17:33:18 compute-0 ovn_controller[95372]: 2026-01-22T17:33:18Z|00607|binding|INFO|Setting lport 7a42a264-5341-4ac6-8da3-c317a7b2c279 up in Southbound
Jan 22 17:33:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:18.349 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f5:0b:0f 10.100.0.6 2001:db8::f816:3eff:fef5:b0f'], port_security=['fa:16:3e:f5:0b:0f 10.100.0.6 2001:db8::f816:3eff:fef5:b0f'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28 2001:db8::f816:3eff:fef5:b0f/64', 'neutron:device_id': 'eb04fe29-6d1c-4572-b219-f60350425077', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2fd77df8-cf00-4afc-b4cf-75b5722c375c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e838e551-4083-4143-b761-54a81d27a6c4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6159b68b-3c7b-43be-9fb9-7f846f3d3eb8, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=7a42a264-5341-4ac6-8da3-c317a7b2c279) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:33:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:18.350 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 7a42a264-5341-4ac6-8da3-c317a7b2c279 in datapath 2fd77df8-cf00-4afc-b4cf-75b5722c375c bound to our chassis
Jan 22 17:33:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:18.352 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2fd77df8-cf00-4afc-b4cf-75b5722c375c
Jan 22 17:33:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:18.375 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[495c1275-3acc-4018-88e2-19c39a70667d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:33:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:18.421 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[71f8df5d-9b03-4768-af43-c16af62a803c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:33:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:18.425 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[7af036a2-7399-4cc8-8f6c-1e939dff0bb1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:33:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:18.460 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[ed704746-bc2b-4db9-99d0-b5453ab3ed62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:33:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:18.487 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a1948f0a-dc81-4ac4-a77d-386274a8fdb0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2fd77df8-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e7:bc:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 117, 'tx_packets': 54, 'rx_bytes': 10002, 'tx_bytes': 6128, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 117, 'tx_packets': 54, 'rx_bytes': 10002, 'tx_bytes': 6128, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 170], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 546788, 'reachable_time': 41475, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 18, 'inoctets': 1320, 'indelivers': 4, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 18, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 1320, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 18, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 4, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233989, 'error': None, 'target': 'ovnmeta-2fd77df8-cf00-4afc-b4cf-75b5722c375c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:33:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:18.512 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[908a051a-429d-422d-a512-6acc15a7a8aa]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap2fd77df8-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 546799, 'tstamp': 546799}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233990, 'error': None, 'target': 'ovnmeta-2fd77df8-cf00-4afc-b4cf-75b5722c375c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap2fd77df8-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 546801, 'tstamp': 546801}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233990, 'error': None, 'target': 'ovnmeta-2fd77df8-cf00-4afc-b4cf-75b5722c375c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:33:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:18.515 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2fd77df8-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.518 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.519 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:18.520 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2fd77df8-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:33:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:18.520 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:33:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:18.521 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2fd77df8-c0, col_values=(('external_ids', {'iface-id': '0297f784-1a41-4744-b018-f503dfa93754'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:33:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:18.521 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.747 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103198.7453258, eb04fe29-6d1c-4572-b219-f60350425077 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.748 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: eb04fe29-6d1c-4572-b219-f60350425077] VM Started (Lifecycle Event)
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.924 183079 DEBUG nova.compute.manager [req-ba37ce2e-b889-4d15-8d26-d34c0c234fa9 req-cdd8f037-c9c1-455a-8eb7-5e5074b38699 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Received event network-vif-plugged-7a42a264-5341-4ac6-8da3-c317a7b2c279 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.925 183079 DEBUG oslo_concurrency.lockutils [req-ba37ce2e-b889-4d15-8d26-d34c0c234fa9 req-cdd8f037-c9c1-455a-8eb7-5e5074b38699 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "eb04fe29-6d1c-4572-b219-f60350425077-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.925 183079 DEBUG oslo_concurrency.lockutils [req-ba37ce2e-b889-4d15-8d26-d34c0c234fa9 req-cdd8f037-c9c1-455a-8eb7-5e5074b38699 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "eb04fe29-6d1c-4572-b219-f60350425077-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.925 183079 DEBUG oslo_concurrency.lockutils [req-ba37ce2e-b889-4d15-8d26-d34c0c234fa9 req-cdd8f037-c9c1-455a-8eb7-5e5074b38699 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "eb04fe29-6d1c-4572-b219-f60350425077-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.926 183079 DEBUG nova.compute.manager [req-ba37ce2e-b889-4d15-8d26-d34c0c234fa9 req-cdd8f037-c9c1-455a-8eb7-5e5074b38699 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Processing event network-vif-plugged-7a42a264-5341-4ac6-8da3-c317a7b2c279 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.926 183079 DEBUG nova.compute.manager [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.931 183079 DEBUG nova.virt.libvirt.driver [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.934 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.937 183079 INFO nova.virt.libvirt.driver [-] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Instance spawned successfully.
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.937 183079 DEBUG nova.virt.libvirt.driver [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.941 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.985 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: eb04fe29-6d1c-4572-b219-f60350425077] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.985 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103198.745473, eb04fe29-6d1c-4572-b219-f60350425077 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.986 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: eb04fe29-6d1c-4572-b219-f60350425077] VM Paused (Lifecycle Event)
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.994 183079 DEBUG nova.virt.libvirt.driver [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.994 183079 DEBUG nova.virt.libvirt.driver [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.995 183079 DEBUG nova.virt.libvirt.driver [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.995 183079 DEBUG nova.virt.libvirt.driver [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.995 183079 DEBUG nova.virt.libvirt.driver [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:33:18 compute-0 nova_compute[183075]: 2026-01-22 17:33:18.996 183079 DEBUG nova.virt.libvirt.driver [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:33:19 compute-0 nova_compute[183075]: 2026-01-22 17:33:19.031 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:33:19 compute-0 nova_compute[183075]: 2026-01-22 17:33:19.035 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103198.930085, eb04fe29-6d1c-4572-b219-f60350425077 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:33:19 compute-0 nova_compute[183075]: 2026-01-22 17:33:19.035 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: eb04fe29-6d1c-4572-b219-f60350425077] VM Resumed (Lifecycle Event)
Jan 22 17:33:19 compute-0 nova_compute[183075]: 2026-01-22 17:33:19.133 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:33:19 compute-0 nova_compute[183075]: 2026-01-22 17:33:19.137 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:33:19 compute-0 nova_compute[183075]: 2026-01-22 17:33:19.154 183079 INFO nova.compute.manager [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Took 14.73 seconds to spawn the instance on the hypervisor.
Jan 22 17:33:19 compute-0 nova_compute[183075]: 2026-01-22 17:33:19.154 183079 DEBUG nova.compute.manager [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:33:19 compute-0 nova_compute[183075]: 2026-01-22 17:33:19.164 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: eb04fe29-6d1c-4572-b219-f60350425077] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:33:19 compute-0 nova_compute[183075]: 2026-01-22 17:33:19.219 183079 INFO nova.compute.manager [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Took 18.03 seconds to build instance.
Jan 22 17:33:19 compute-0 nova_compute[183075]: 2026-01-22 17:33:19.240 183079 DEBUG oslo_concurrency.lockutils [None req-55ba4b8c-8417-4bdd-831c-ed3328be28ef c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "eb04fe29-6d1c-4572-b219-f60350425077" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.438s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:33:19 compute-0 podman[233998]: 2026-01-22 17:33:19.374916301 +0000 UTC m=+0.072480644 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 17:33:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:19.706 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=39, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=38) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:33:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:19.707 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:33:19 compute-0 nova_compute[183075]: 2026-01-22 17:33:19.708 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:21 compute-0 nova_compute[183075]: 2026-01-22 17:33:21.010 183079 DEBUG nova.compute.manager [req-552d4478-084e-47a1-aa40-5567868bbc34 req-77330207-d0e6-403c-bfe6-23b4c7051983 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Received event network-vif-plugged-7a42a264-5341-4ac6-8da3-c317a7b2c279 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:33:21 compute-0 nova_compute[183075]: 2026-01-22 17:33:21.010 183079 DEBUG oslo_concurrency.lockutils [req-552d4478-084e-47a1-aa40-5567868bbc34 req-77330207-d0e6-403c-bfe6-23b4c7051983 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "eb04fe29-6d1c-4572-b219-f60350425077-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:33:21 compute-0 nova_compute[183075]: 2026-01-22 17:33:21.011 183079 DEBUG oslo_concurrency.lockutils [req-552d4478-084e-47a1-aa40-5567868bbc34 req-77330207-d0e6-403c-bfe6-23b4c7051983 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "eb04fe29-6d1c-4572-b219-f60350425077-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:33:21 compute-0 nova_compute[183075]: 2026-01-22 17:33:21.011 183079 DEBUG oslo_concurrency.lockutils [req-552d4478-084e-47a1-aa40-5567868bbc34 req-77330207-d0e6-403c-bfe6-23b4c7051983 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "eb04fe29-6d1c-4572-b219-f60350425077-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:33:21 compute-0 nova_compute[183075]: 2026-01-22 17:33:21.012 183079 DEBUG nova.compute.manager [req-552d4478-084e-47a1-aa40-5567868bbc34 req-77330207-d0e6-403c-bfe6-23b4c7051983 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] No waiting events found dispatching network-vif-plugged-7a42a264-5341-4ac6-8da3-c317a7b2c279 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:33:21 compute-0 nova_compute[183075]: 2026-01-22 17:33:21.012 183079 WARNING nova.compute.manager [req-552d4478-084e-47a1-aa40-5567868bbc34 req-77330207-d0e6-403c-bfe6-23b4c7051983 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Received unexpected event network-vif-plugged-7a42a264-5341-4ac6-8da3-c317a7b2c279 for instance with vm_state active and task_state None.
Jan 22 17:33:22 compute-0 nova_compute[183075]: 2026-01-22 17:33:22.675 183079 INFO nova.compute.manager [None req-3273d1f3-1da3-4f22-84b3-900ccb18410a c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Get console output
Jan 22 17:33:22 compute-0 nova_compute[183075]: 2026-01-22 17:33:22.681 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:33:22 compute-0 nova_compute[183075]: 2026-01-22 17:33:22.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:33:22 compute-0 nova_compute[183075]: 2026-01-22 17:33:22.877 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:23 compute-0 nova_compute[183075]: 2026-01-22 17:33:23.094 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:23 compute-0 podman[234024]: 2026-01-22 17:33:23.364864758 +0000 UTC m=+0.076500204 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:33:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:25.708 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '39'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:33:25 compute-0 nova_compute[183075]: 2026-01-22 17:33:25.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:33:25 compute-0 nova_compute[183075]: 2026-01-22 17:33:25.789 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:33:27 compute-0 nova_compute[183075]: 2026-01-22 17:33:27.778 183079 INFO nova.compute.manager [None req-bf867890-eac0-4f2b-8aed-374f066536d9 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Get console output
Jan 22 17:33:27 compute-0 nova_compute[183075]: 2026-01-22 17:33:27.783 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:33:27 compute-0 nova_compute[183075]: 2026-01-22 17:33:27.879 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:28 compute-0 nova_compute[183075]: 2026-01-22 17:33:28.097 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:29 compute-0 nova_compute[183075]: 2026-01-22 17:33:29.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:33:32 compute-0 nova_compute[183075]: 2026-01-22 17:33:32.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:33:32 compute-0 nova_compute[183075]: 2026-01-22 17:33:32.882 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:32 compute-0 nova_compute[183075]: 2026-01-22 17:33:32.926 183079 INFO nova.compute.manager [None req-e84b8c20-49a2-4805-b8a0-6ab76d6e128c c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Get console output
Jan 22 17:33:32 compute-0 nova_compute[183075]: 2026-01-22 17:33:32.932 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:33:33 compute-0 nova_compute[183075]: 2026-01-22 17:33:33.099 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:33 compute-0 ovn_controller[95372]: 2026-01-22T17:33:33Z|00114|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f5:0b:0f 10.100.0.6
Jan 22 17:33:33 compute-0 ovn_controller[95372]: 2026-01-22T17:33:33Z|00115|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f5:0b:0f 10.100.0.6
Jan 22 17:33:35 compute-0 podman[234071]: 2026-01-22 17:33:35.35399218 +0000 UTC m=+0.057054084 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 17:33:35 compute-0 podman[234072]: 2026-01-22 17:33:35.370581371 +0000 UTC m=+0.062645176 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_id=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 22 17:33:35 compute-0 podman[234070]: 2026-01-22 17:33:35.389419844 +0000 UTC m=+0.094953176 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 17:33:36 compute-0 nova_compute[183075]: 2026-01-22 17:33:36.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:33:36 compute-0 nova_compute[183075]: 2026-01-22 17:33:36.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:33:36 compute-0 nova_compute[183075]: 2026-01-22 17:33:36.789 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:33:36 compute-0 nova_compute[183075]: 2026-01-22 17:33:36.830 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:33:36 compute-0 nova_compute[183075]: 2026-01-22 17:33:36.831 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:33:36 compute-0 nova_compute[183075]: 2026-01-22 17:33:36.831 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:33:36 compute-0 nova_compute[183075]: 2026-01-22 17:33:36.831 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:33:37 compute-0 nova_compute[183075]: 2026-01-22 17:33:37.487 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b09d9bed-19f3-4aae-8aa4-7a87468084e4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:33:37 compute-0 nova_compute[183075]: 2026-01-22 17:33:37.564 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b09d9bed-19f3-4aae-8aa4-7a87468084e4/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:33:37 compute-0 nova_compute[183075]: 2026-01-22 17:33:37.565 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b09d9bed-19f3-4aae-8aa4-7a87468084e4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:33:37 compute-0 nova_compute[183075]: 2026-01-22 17:33:37.622 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b09d9bed-19f3-4aae-8aa4-7a87468084e4/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:33:37 compute-0 nova_compute[183075]: 2026-01-22 17:33:37.628 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0949fde-940d-495c-bdb0-e6c996b0274f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:33:37 compute-0 nova_compute[183075]: 2026-01-22 17:33:37.686 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0949fde-940d-495c-bdb0-e6c996b0274f/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:33:37 compute-0 nova_compute[183075]: 2026-01-22 17:33:37.687 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0949fde-940d-495c-bdb0-e6c996b0274f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:33:37 compute-0 nova_compute[183075]: 2026-01-22 17:33:37.741 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0949fde-940d-495c-bdb0-e6c996b0274f/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:33:37 compute-0 nova_compute[183075]: 2026-01-22 17:33:37.747 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/81396ea9-a9a1-4a21-9808-608e45a7aa03/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:33:37 compute-0 nova_compute[183075]: 2026-01-22 17:33:37.813 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/81396ea9-a9a1-4a21-9808-608e45a7aa03/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:33:37 compute-0 nova_compute[183075]: 2026-01-22 17:33:37.814 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/81396ea9-a9a1-4a21-9808-608e45a7aa03/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:33:37 compute-0 nova_compute[183075]: 2026-01-22 17:33:37.870 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/81396ea9-a9a1-4a21-9808-608e45a7aa03/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:33:37 compute-0 nova_compute[183075]: 2026-01-22 17:33:37.876 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb04fe29-6d1c-4572-b219-f60350425077/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:33:37 compute-0 nova_compute[183075]: 2026-01-22 17:33:37.895 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:37 compute-0 nova_compute[183075]: 2026-01-22 17:33:37.952 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb04fe29-6d1c-4572-b219-f60350425077/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:33:37 compute-0 nova_compute[183075]: 2026-01-22 17:33:37.953 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb04fe29-6d1c-4572-b219-f60350425077/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:33:38 compute-0 nova_compute[183075]: 2026-01-22 17:33:38.020 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb04fe29-6d1c-4572-b219-f60350425077/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:33:38 compute-0 nova_compute[183075]: 2026-01-22 17:33:38.051 183079 INFO nova.compute.manager [None req-2fd99390-0c08-41f0-8f3d-aa44f491580b c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Get console output
Jan 22 17:33:38 compute-0 nova_compute[183075]: 2026-01-22 17:33:38.056 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:33:38 compute-0 nova_compute[183075]: 2026-01-22 17:33:38.100 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:38 compute-0 nova_compute[183075]: 2026-01-22 17:33:38.202 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:33:38 compute-0 nova_compute[183075]: 2026-01-22 17:33:38.203 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=4996MB free_disk=73.24748229980469GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:33:38 compute-0 nova_compute[183075]: 2026-01-22 17:33:38.204 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:33:38 compute-0 nova_compute[183075]: 2026-01-22 17:33:38.204 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:33:38 compute-0 nova_compute[183075]: 2026-01-22 17:33:38.288 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance b0949fde-940d-495c-bdb0-e6c996b0274f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:33:38 compute-0 nova_compute[183075]: 2026-01-22 17:33:38.288 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 81396ea9-a9a1-4a21-9808-608e45a7aa03 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:33:38 compute-0 nova_compute[183075]: 2026-01-22 17:33:38.288 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance b09d9bed-19f3-4aae-8aa4-7a87468084e4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:33:38 compute-0 nova_compute[183075]: 2026-01-22 17:33:38.289 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance eb04fe29-6d1c-4572-b219-f60350425077 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:33:38 compute-0 nova_compute[183075]: 2026-01-22 17:33:38.289 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 4 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:33:38 compute-0 nova_compute[183075]: 2026-01-22 17:33:38.289 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1024MB phys_disk=79GB used_disk=4GB total_vcpus=8 used_vcpus=4 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:33:38 compute-0 nova_compute[183075]: 2026-01-22 17:33:38.380 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:33:38 compute-0 nova_compute[183075]: 2026-01-22 17:33:38.397 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:33:38 compute-0 nova_compute[183075]: 2026-01-22 17:33:38.423 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:33:38 compute-0 nova_compute[183075]: 2026-01-22 17:33:38.424 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.220s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:33:39 compute-0 podman[234161]: 2026-01-22 17:33:39.346481305 +0000 UTC m=+0.053518748 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute)
Jan 22 17:33:39 compute-0 nova_compute[183075]: 2026-01-22 17:33:39.424 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:33:39 compute-0 nova_compute[183075]: 2026-01-22 17:33:39.424 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:33:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:39.673 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:33:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:39.674 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:33:39 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:33:39 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:33:39 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:33:39 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:33:39 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:33:39 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:33:39 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2fd77df8-cf00-4afc-b4cf-75b5722c375c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.730 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.731 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 1.0571666
Jan 22 17:33:40 compute-0 haproxy-metadata-proxy-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233424]: 10.100.0.6:51334 [22/Jan/2026:17:33:39.673] listener listener/metadata 0/0/0/1058/1058 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.750 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.751 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2fd77df8-cf00-4afc-b4cf-75b5722c375c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.771 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.772 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0208879
Jan 22 17:33:40 compute-0 haproxy-metadata-proxy-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233424]: 10.100.0.6:51342 [22/Jan/2026:17:33:40.749] listener listener/metadata 0/0/0/22/22 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.779 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.780 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2fd77df8-cf00-4afc-b4cf-75b5722c375c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.806 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.807 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0267911
Jan 22 17:33:40 compute-0 haproxy-metadata-proxy-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233424]: 10.100.0.6:51344 [22/Jan/2026:17:33:40.778] listener listener/metadata 0/0/0/28/28 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.815 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.816 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2fd77df8-cf00-4afc-b4cf-75b5722c375c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.840 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:33:40 compute-0 haproxy-metadata-proxy-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233424]: 10.100.0.6:51360 [22/Jan/2026:17:33:40.815] listener listener/metadata 0/0/0/25/25 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.840 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0240586
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.850 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.851 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2fd77df8-cf00-4afc-b4cf-75b5722c375c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.871 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.872 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0214157
Jan 22 17:33:40 compute-0 haproxy-metadata-proxy-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233424]: 10.100.0.6:51372 [22/Jan/2026:17:33:40.849] listener listener/metadata 0/0/0/23/23 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.881 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.882 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2fd77df8-cf00-4afc-b4cf-75b5722c375c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.900 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.901 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0186188
Jan 22 17:33:40 compute-0 haproxy-metadata-proxy-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233424]: 10.100.0.6:51374 [22/Jan/2026:17:33:40.880] listener listener/metadata 0/0/0/20/20 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.911 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.912 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2fd77df8-cf00-4afc-b4cf-75b5722c375c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.927 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:33:40 compute-0 haproxy-metadata-proxy-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233424]: 10.100.0.6:51376 [22/Jan/2026:17:33:40.911] listener listener/metadata 0/0/0/16/16 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.927 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0153279
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.937 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.938 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2fd77df8-cf00-4afc-b4cf-75b5722c375c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.951 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:33:40 compute-0 haproxy-metadata-proxy-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233424]: 10.100.0.6:51384 [22/Jan/2026:17:33:40.936] listener listener/metadata 0/0/0/15/15 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.951 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0136640
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.959 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.960 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2fd77df8-cf00-4afc-b4cf-75b5722c375c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.974 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.975 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 165 time: 0.0151155
Jan 22 17:33:40 compute-0 haproxy-metadata-proxy-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233424]: 10.100.0.6:51390 [22/Jan/2026:17:33:40.958] listener listener/metadata 0/0/0/16/16 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.983 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:40.984 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:33:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2fd77df8-cf00-4afc-b4cf-75b5722c375c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:41.002 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:41.002 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 165 time: 0.0185866
Jan 22 17:33:41 compute-0 haproxy-metadata-proxy-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233424]: 10.100.0.6:51392 [22/Jan/2026:17:33:40.983] listener listener/metadata 0/0/0/19/19 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:41.010 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:41.010 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2fd77df8-cf00-4afc-b4cf-75b5722c375c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:33:41 compute-0 haproxy-metadata-proxy-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233424]: 10.100.0.6:51404 [22/Jan/2026:17:33:41.009] listener listener/metadata 0/0/0/16/16 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:41.025 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0152838
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:41.035 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:41.035 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2fd77df8-cf00-4afc-b4cf-75b5722c375c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:41.053 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:33:41 compute-0 haproxy-metadata-proxy-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233424]: 10.100.0.6:51414 [22/Jan/2026:17:33:41.034] listener listener/metadata 0/0/0/18/18 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:41.053 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0178695
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:41.056 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:41.056 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2fd77df8-cf00-4afc-b4cf-75b5722c375c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:41.072 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:41.073 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0163159
Jan 22 17:33:41 compute-0 haproxy-metadata-proxy-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233424]: 10.100.0.6:51416 [22/Jan/2026:17:33:41.056] listener listener/metadata 0/0/0/16/16 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:41.076 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:41.076 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2fd77df8-cf00-4afc-b4cf-75b5722c375c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:41.092 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:41.092 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0156870
Jan 22 17:33:41 compute-0 haproxy-metadata-proxy-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233424]: 10.100.0.6:51426 [22/Jan/2026:17:33:41.076] listener listener/metadata 0/0/0/16/16 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:41.096 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:41.097 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2fd77df8-cf00-4afc-b4cf-75b5722c375c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:41.110 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:41.111 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 165 time: 0.0144353
Jan 22 17:33:41 compute-0 haproxy-metadata-proxy-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233424]: 10.100.0.6:51436 [22/Jan/2026:17:33:41.096] listener listener/metadata 0/0/0/15/15 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:41.116 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:41.117 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 2fd77df8-cf00-4afc-b4cf-75b5722c375c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:41.133 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:41.133 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0161929
Jan 22 17:33:41 compute-0 haproxy-metadata-proxy-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233424]: 10.100.0.6:51452 [22/Jan/2026:17:33:41.115] listener listener/metadata 0/0/0/17/17 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:33:41 compute-0 nova_compute[183075]: 2026-01-22 17:33:41.550 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-81396ea9-a9a1-4a21-9808-608e45a7aa03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:33:41 compute-0 nova_compute[183075]: 2026-01-22 17:33:41.550 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-81396ea9-a9a1-4a21-9808-608e45a7aa03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:33:41 compute-0 nova_compute[183075]: 2026-01-22 17:33:41.551 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:41.949 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:41.950 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:33:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:41.950 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:33:42 compute-0 nova_compute[183075]: 2026-01-22 17:33:42.416 183079 DEBUG oslo_concurrency.lockutils [None req-b82425bb-24a5-4bbd-904f-d3d6b0b65e44 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Acquiring lock "b09d9bed-19f3-4aae-8aa4-7a87468084e4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:33:42 compute-0 nova_compute[183075]: 2026-01-22 17:33:42.418 183079 DEBUG oslo_concurrency.lockutils [None req-b82425bb-24a5-4bbd-904f-d3d6b0b65e44 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "b09d9bed-19f3-4aae-8aa4-7a87468084e4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:33:42 compute-0 nova_compute[183075]: 2026-01-22 17:33:42.418 183079 DEBUG oslo_concurrency.lockutils [None req-b82425bb-24a5-4bbd-904f-d3d6b0b65e44 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Acquiring lock "b09d9bed-19f3-4aae-8aa4-7a87468084e4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:33:42 compute-0 nova_compute[183075]: 2026-01-22 17:33:42.419 183079 DEBUG oslo_concurrency.lockutils [None req-b82425bb-24a5-4bbd-904f-d3d6b0b65e44 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "b09d9bed-19f3-4aae-8aa4-7a87468084e4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:33:42 compute-0 nova_compute[183075]: 2026-01-22 17:33:42.419 183079 DEBUG oslo_concurrency.lockutils [None req-b82425bb-24a5-4bbd-904f-d3d6b0b65e44 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "b09d9bed-19f3-4aae-8aa4-7a87468084e4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:33:42 compute-0 nova_compute[183075]: 2026-01-22 17:33:42.421 183079 INFO nova.compute.manager [None req-b82425bb-24a5-4bbd-904f-d3d6b0b65e44 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Terminating instance
Jan 22 17:33:42 compute-0 nova_compute[183075]: 2026-01-22 17:33:42.422 183079 DEBUG nova.compute.manager [None req-b82425bb-24a5-4bbd-904f-d3d6b0b65e44 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:33:42 compute-0 kernel: tapa49c001e-85 (unregistering): left promiscuous mode
Jan 22 17:33:42 compute-0 NetworkManager[55454]: <info>  [1769103222.4520] device (tapa49c001e-85): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:33:42 compute-0 nova_compute[183075]: 2026-01-22 17:33:42.455 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:42 compute-0 ovn_controller[95372]: 2026-01-22T17:33:42Z|00608|binding|INFO|Releasing lport a49c001e-85aa-4216-a7dd-ed52fa4c71ac from this chassis (sb_readonly=0)
Jan 22 17:33:42 compute-0 ovn_controller[95372]: 2026-01-22T17:33:42Z|00609|binding|INFO|Setting lport a49c001e-85aa-4216-a7dd-ed52fa4c71ac down in Southbound
Jan 22 17:33:42 compute-0 ovn_controller[95372]: 2026-01-22T17:33:42Z|00610|binding|INFO|Removing iface tapa49c001e-85 ovn-installed in OVS
Jan 22 17:33:42 compute-0 nova_compute[183075]: 2026-01-22 17:33:42.457 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:42.466 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:54:7f 10.100.0.26'], port_security=['fa:16:3e:a0:54:7f 10.100.0.26'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.26/28', 'neutron:device_id': 'b09d9bed-19f3-4aae-8aa4-7a87468084e4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'neutron:revision_number': '5', 'neutron:security_group_ids': '6b35f822-4909-4e2a-a8bd-ef6e39136861', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2644338c-80e0-4701-bf75-229e5a6223da, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=a49c001e-85aa-4216-a7dd-ed52fa4c71ac) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:33:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:42.467 104629 INFO neutron.agent.ovn.metadata.agent [-] Port a49c001e-85aa-4216-a7dd-ed52fa4c71ac in datapath ed94e4f1-14ed-42c4-8c8e-db508a59bd2c unbound from our chassis
Jan 22 17:33:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:42.468 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ed94e4f1-14ed-42c4-8c8e-db508a59bd2c
Jan 22 17:33:42 compute-0 nova_compute[183075]: 2026-01-22 17:33:42.475 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:42.486 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f93bc9e6-ecc8-4643-9d60-8279ebebefa0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:33:42 compute-0 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000037.scope: Deactivated successfully.
Jan 22 17:33:42 compute-0 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000037.scope: Consumed 14.449s CPU time.
Jan 22 17:33:42 compute-0 systemd-machined[154382]: Machine qemu-55-instance-00000037 terminated.
Jan 22 17:33:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:42.515 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[2fd62613-ba7e-44aa-b836-185462cc000a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:33:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:42.518 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[78e0b62d-6089-49aa-abc1-c6c8a5eb5fe3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:33:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:42.545 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[8240f759-f907-47eb-af0e-79479f94e499]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:33:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:42.561 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[df90671a-01bc-41ac-9aa5-a801de801c77]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'taped94e4f1-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1a:6f:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 202, 'tx_packets': 106, 'rx_bytes': 17308, 'tx_bytes': 11898, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 202, 'tx_packets': 106, 'rx_bytes': 17308, 'tx_bytes': 11898, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 168], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 544169, 'reachable_time': 15481, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234204, 'error': None, 'target': 'ovnmeta-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:33:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:42.578 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1efde8f5-20f5-43c0-b86a-fe6a369ad952]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'taped94e4f1-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 544180, 'tstamp': 544180}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234205, 'error': None, 'target': 'ovnmeta-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.18'], ['IFA_LOCAL', '10.100.0.18'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'taped94e4f1-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 544183, 'tstamp': 544183}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234205, 'error': None, 'target': 'ovnmeta-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:33:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:42.579 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped94e4f1-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:33:42 compute-0 nova_compute[183075]: 2026-01-22 17:33:42.581 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:42 compute-0 nova_compute[183075]: 2026-01-22 17:33:42.584 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:42.585 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=taped94e4f1-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:33:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:42.585 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:33:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:42.585 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=taped94e4f1-10, col_values=(('external_ids', {'iface-id': '77b4e93d-6708-4efb-b060-601be2ddc621'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:33:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:42.585 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:33:42 compute-0 nova_compute[183075]: 2026-01-22 17:33:42.691 183079 INFO nova.virt.libvirt.driver [-] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Instance destroyed successfully.
Jan 22 17:33:42 compute-0 nova_compute[183075]: 2026-01-22 17:33:42.691 183079 DEBUG nova.objects.instance [None req-b82425bb-24a5-4bbd-904f-d3d6b0b65e44 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lazy-loading 'resources' on Instance uuid b09d9bed-19f3-4aae-8aa4-7a87468084e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:33:42 compute-0 nova_compute[183075]: 2026-01-22 17:33:42.730 183079 DEBUG nova.virt.libvirt.vif [None req-b82425bb-24a5-4bbd-904f-d3d6b0b65e44 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:32:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='leia',display_name='leia',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='leia',id=55,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP31lzQFM+bzgy2rILfPJtItaZEOm3vqO/5STzV+08atBuxP1/4YmUTZSa+vuUIur2j2kVkdN8zrzADLiGWPcuNSoFATd7+40/kloWBkWhl+JRfBOGEMv65jtGsYaSx1Fw==',key_name='tempest-keypair-test-1852532011',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:32:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='89916c03f6f440f6ae7cf81f2ae99bad',ramdisk_id='',reservation_id='r-ttdhfphp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtable
t',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-InternalDNSTest-38234021',owner_user_name='tempest-InternalDNSTest-38234021-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:33:00Z,user_data=None,user_id='1ddebe2a251e4b118d9469f7d6fdb2ce',uuid=b09d9bed-19f3-4aae-8aa4-7a87468084e4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a49c001e-85aa-4216-a7dd-ed52fa4c71ac", "address": "fa:16:3e:a0:54:7f", "network": {"id": "ed94e4f1-14ed-42c4-8c8e-db508a59bd2c", "bridge": "br-int", "label": "tempest-test-network--696544753", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "89916c03f6f440f6ae7cf81f2ae99bad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa49c001e-85", "ovs_interfaceid": "a49c001e-85aa-4216-a7dd-ed52fa4c71ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:33:42 compute-0 nova_compute[183075]: 2026-01-22 17:33:42.730 183079 DEBUG nova.network.os_vif_util [None req-b82425bb-24a5-4bbd-904f-d3d6b0b65e44 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Converting VIF {"id": "a49c001e-85aa-4216-a7dd-ed52fa4c71ac", "address": "fa:16:3e:a0:54:7f", "network": {"id": "ed94e4f1-14ed-42c4-8c8e-db508a59bd2c", "bridge": "br-int", "label": "tempest-test-network--696544753", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "89916c03f6f440f6ae7cf81f2ae99bad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa49c001e-85", "ovs_interfaceid": "a49c001e-85aa-4216-a7dd-ed52fa4c71ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:33:42 compute-0 nova_compute[183075]: 2026-01-22 17:33:42.731 183079 DEBUG nova.network.os_vif_util [None req-b82425bb-24a5-4bbd-904f-d3d6b0b65e44 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:54:7f,bridge_name='br-int',has_traffic_filtering=True,id=a49c001e-85aa-4216-a7dd-ed52fa4c71ac,network=Network(ed94e4f1-14ed-42c4-8c8e-db508a59bd2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa49c001e-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:33:42 compute-0 nova_compute[183075]: 2026-01-22 17:33:42.731 183079 DEBUG os_vif [None req-b82425bb-24a5-4bbd-904f-d3d6b0b65e44 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:54:7f,bridge_name='br-int',has_traffic_filtering=True,id=a49c001e-85aa-4216-a7dd-ed52fa4c71ac,network=Network(ed94e4f1-14ed-42c4-8c8e-db508a59bd2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa49c001e-85') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:33:42 compute-0 nova_compute[183075]: 2026-01-22 17:33:42.732 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:42 compute-0 nova_compute[183075]: 2026-01-22 17:33:42.733 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa49c001e-85, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:33:42 compute-0 nova_compute[183075]: 2026-01-22 17:33:42.734 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:42 compute-0 nova_compute[183075]: 2026-01-22 17:33:42.737 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:33:42 compute-0 nova_compute[183075]: 2026-01-22 17:33:42.739 183079 INFO os_vif [None req-b82425bb-24a5-4bbd-904f-d3d6b0b65e44 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:54:7f,bridge_name='br-int',has_traffic_filtering=True,id=a49c001e-85aa-4216-a7dd-ed52fa4c71ac,network=Network(ed94e4f1-14ed-42c4-8c8e-db508a59bd2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa49c001e-85')
Jan 22 17:33:42 compute-0 nova_compute[183075]: 2026-01-22 17:33:42.739 183079 INFO nova.virt.libvirt.driver [None req-b82425bb-24a5-4bbd-904f-d3d6b0b65e44 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Deleting instance files /var/lib/nova/instances/b09d9bed-19f3-4aae-8aa4-7a87468084e4_del
Jan 22 17:33:42 compute-0 nova_compute[183075]: 2026-01-22 17:33:42.740 183079 INFO nova.virt.libvirt.driver [None req-b82425bb-24a5-4bbd-904f-d3d6b0b65e44 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Deletion of /var/lib/nova/instances/b09d9bed-19f3-4aae-8aa4-7a87468084e4_del complete
Jan 22 17:33:42 compute-0 nova_compute[183075]: 2026-01-22 17:33:42.793 183079 INFO nova.compute.manager [None req-b82425bb-24a5-4bbd-904f-d3d6b0b65e44 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Took 0.37 seconds to destroy the instance on the hypervisor.
Jan 22 17:33:42 compute-0 nova_compute[183075]: 2026-01-22 17:33:42.793 183079 DEBUG oslo.service.loopingcall [None req-b82425bb-24a5-4bbd-904f-d3d6b0b65e44 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:33:42 compute-0 nova_compute[183075]: 2026-01-22 17:33:42.794 183079 DEBUG nova.compute.manager [-] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:33:42 compute-0 nova_compute[183075]: 2026-01-22 17:33:42.794 183079 DEBUG nova.network.neutron [-] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:33:42 compute-0 nova_compute[183075]: 2026-01-22 17:33:42.886 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:43 compute-0 nova_compute[183075]: 2026-01-22 17:33:43.056 183079 DEBUG nova.compute.manager [req-c3f67747-2df3-41b8-89be-95a3cbaf55df req-ad4dfc78-7600-40b0-a3d8-86564357f95a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Received event network-vif-unplugged-a49c001e-85aa-4216-a7dd-ed52fa4c71ac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:33:43 compute-0 nova_compute[183075]: 2026-01-22 17:33:43.056 183079 DEBUG oslo_concurrency.lockutils [req-c3f67747-2df3-41b8-89be-95a3cbaf55df req-ad4dfc78-7600-40b0-a3d8-86564357f95a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "b09d9bed-19f3-4aae-8aa4-7a87468084e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:33:43 compute-0 nova_compute[183075]: 2026-01-22 17:33:43.057 183079 DEBUG oslo_concurrency.lockutils [req-c3f67747-2df3-41b8-89be-95a3cbaf55df req-ad4dfc78-7600-40b0-a3d8-86564357f95a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b09d9bed-19f3-4aae-8aa4-7a87468084e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:33:43 compute-0 nova_compute[183075]: 2026-01-22 17:33:43.057 183079 DEBUG oslo_concurrency.lockutils [req-c3f67747-2df3-41b8-89be-95a3cbaf55df req-ad4dfc78-7600-40b0-a3d8-86564357f95a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b09d9bed-19f3-4aae-8aa4-7a87468084e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:33:43 compute-0 nova_compute[183075]: 2026-01-22 17:33:43.057 183079 DEBUG nova.compute.manager [req-c3f67747-2df3-41b8-89be-95a3cbaf55df req-ad4dfc78-7600-40b0-a3d8-86564357f95a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] No waiting events found dispatching network-vif-unplugged-a49c001e-85aa-4216-a7dd-ed52fa4c71ac pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:33:43 compute-0 nova_compute[183075]: 2026-01-22 17:33:43.058 183079 DEBUG nova.compute.manager [req-c3f67747-2df3-41b8-89be-95a3cbaf55df req-ad4dfc78-7600-40b0-a3d8-86564357f95a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Received event network-vif-unplugged-a49c001e-85aa-4216-a7dd-ed52fa4c71ac for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:33:43 compute-0 nova_compute[183075]: 2026-01-22 17:33:43.179 183079 INFO nova.compute.manager [None req-81bda745-c94d-4781-8b94-4d8d215c58d8 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Get console output
Jan 22 17:33:43 compute-0 nova_compute[183075]: 2026-01-22 17:33:43.185 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:33:43 compute-0 nova_compute[183075]: 2026-01-22 17:33:43.880 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Updating instance_info_cache with network_info: [{"id": "0290774a-bbee-4523-b342-ddb24aca4826", "address": "fa:16:3e:60:a0:be", "network": {"id": "2fd77df8-cf00-4afc-b4cf-75b5722c375c", "bridge": "br-int", "label": "tempest-test-network--158385692", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe60:a0be", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22f75e117e724f9aaadf5b8fd25a6ef6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0290774a-bb", "ovs_interfaceid": "0290774a-bbee-4523-b342-ddb24aca4826", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:33:43 compute-0 nova_compute[183075]: 2026-01-22 17:33:43.900 183079 DEBUG nova.network.neutron [-] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:33:43 compute-0 nova_compute[183075]: 2026-01-22 17:33:43.901 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-81396ea9-a9a1-4a21-9808-608e45a7aa03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:33:43 compute-0 nova_compute[183075]: 2026-01-22 17:33:43.901 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:33:43 compute-0 nova_compute[183075]: 2026-01-22 17:33:43.901 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:33:43 compute-0 nova_compute[183075]: 2026-01-22 17:33:43.920 183079 INFO nova.compute.manager [-] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Took 1.13 seconds to deallocate network for instance.
Jan 22 17:33:44 compute-0 nova_compute[183075]: 2026-01-22 17:33:44.076 183079 DEBUG oslo_concurrency.lockutils [None req-b82425bb-24a5-4bbd-904f-d3d6b0b65e44 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:33:44 compute-0 nova_compute[183075]: 2026-01-22 17:33:44.076 183079 DEBUG oslo_concurrency.lockutils [None req-b82425bb-24a5-4bbd-904f-d3d6b0b65e44 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:33:44 compute-0 nova_compute[183075]: 2026-01-22 17:33:44.186 183079 DEBUG nova.compute.provider_tree [None req-b82425bb-24a5-4bbd-904f-d3d6b0b65e44 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:33:44 compute-0 nova_compute[183075]: 2026-01-22 17:33:44.507 183079 DEBUG nova.compute.manager [req-afd99c10-3936-41e3-9c3a-5baccfc648ad req-5636cee0-9395-4ed8-abe8-4cb9520cdf93 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Received event network-vif-deleted-a49c001e-85aa-4216-a7dd-ed52fa4c71ac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:33:44 compute-0 nova_compute[183075]: 2026-01-22 17:33:44.545 183079 DEBUG nova.scheduler.client.report [None req-b82425bb-24a5-4bbd-904f-d3d6b0b65e44 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:33:44 compute-0 nova_compute[183075]: 2026-01-22 17:33:44.597 183079 DEBUG oslo_concurrency.lockutils [None req-b82425bb-24a5-4bbd-904f-d3d6b0b65e44 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.521s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:33:44 compute-0 nova_compute[183075]: 2026-01-22 17:33:44.714 183079 INFO nova.scheduler.client.report [None req-b82425bb-24a5-4bbd-904f-d3d6b0b65e44 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Deleted allocations for instance b09d9bed-19f3-4aae-8aa4-7a87468084e4
Jan 22 17:33:44 compute-0 nova_compute[183075]: 2026-01-22 17:33:44.776 183079 DEBUG oslo_concurrency.lockutils [None req-b82425bb-24a5-4bbd-904f-d3d6b0b65e44 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "b09d9bed-19f3-4aae-8aa4-7a87468084e4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.358s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.138 183079 DEBUG nova.compute.manager [req-f601525c-0d2e-4c60-aa5b-a95c54689036 req-1cb8654f-1ead-49f9-922c-191da5e1e8bd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Received event network-vif-plugged-a49c001e-85aa-4216-a7dd-ed52fa4c71ac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.139 183079 DEBUG oslo_concurrency.lockutils [req-f601525c-0d2e-4c60-aa5b-a95c54689036 req-1cb8654f-1ead-49f9-922c-191da5e1e8bd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "b09d9bed-19f3-4aae-8aa4-7a87468084e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.139 183079 DEBUG oslo_concurrency.lockutils [req-f601525c-0d2e-4c60-aa5b-a95c54689036 req-1cb8654f-1ead-49f9-922c-191da5e1e8bd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b09d9bed-19f3-4aae-8aa4-7a87468084e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.140 183079 DEBUG oslo_concurrency.lockutils [req-f601525c-0d2e-4c60-aa5b-a95c54689036 req-1cb8654f-1ead-49f9-922c-191da5e1e8bd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b09d9bed-19f3-4aae-8aa4-7a87468084e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.140 183079 DEBUG nova.compute.manager [req-f601525c-0d2e-4c60-aa5b-a95c54689036 req-1cb8654f-1ead-49f9-922c-191da5e1e8bd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] No waiting events found dispatching network-vif-plugged-a49c001e-85aa-4216-a7dd-ed52fa4c71ac pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.140 183079 WARNING nova.compute.manager [req-f601525c-0d2e-4c60-aa5b-a95c54689036 req-1cb8654f-1ead-49f9-922c-191da5e1e8bd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Received unexpected event network-vif-plugged-a49c001e-85aa-4216-a7dd-ed52fa4c71ac for instance with vm_state deleted and task_state None.
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.451 183079 DEBUG oslo_concurrency.lockutils [None req-30110a3f-890b-402b-9f00-5145e3e21960 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Acquiring lock "b0949fde-940d-495c-bdb0-e6c996b0274f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.452 183079 DEBUG oslo_concurrency.lockutils [None req-30110a3f-890b-402b-9f00-5145e3e21960 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "b0949fde-940d-495c-bdb0-e6c996b0274f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.452 183079 DEBUG oslo_concurrency.lockutils [None req-30110a3f-890b-402b-9f00-5145e3e21960 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Acquiring lock "b0949fde-940d-495c-bdb0-e6c996b0274f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.452 183079 DEBUG oslo_concurrency.lockutils [None req-30110a3f-890b-402b-9f00-5145e3e21960 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "b0949fde-940d-495c-bdb0-e6c996b0274f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.452 183079 DEBUG oslo_concurrency.lockutils [None req-30110a3f-890b-402b-9f00-5145e3e21960 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "b0949fde-940d-495c-bdb0-e6c996b0274f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.453 183079 INFO nova.compute.manager [None req-30110a3f-890b-402b-9f00-5145e3e21960 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Terminating instance
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.454 183079 DEBUG nova.compute.manager [None req-30110a3f-890b-402b-9f00-5145e3e21960 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:33:45 compute-0 kernel: tap6aacbedb-69 (unregistering): left promiscuous mode
Jan 22 17:33:45 compute-0 NetworkManager[55454]: <info>  [1769103225.4767] device (tap6aacbedb-69): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:33:45 compute-0 ovn_controller[95372]: 2026-01-22T17:33:45Z|00611|binding|INFO|Releasing lport 6aacbedb-6999-4006-9e77-6e540614dbea from this chassis (sb_readonly=0)
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.488 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:45 compute-0 ovn_controller[95372]: 2026-01-22T17:33:45Z|00612|binding|INFO|Setting lport 6aacbedb-6999-4006-9e77-6e540614dbea down in Southbound
Jan 22 17:33:45 compute-0 ovn_controller[95372]: 2026-01-22T17:33:45Z|00613|binding|INFO|Removing iface tap6aacbedb-69 ovn-installed in OVS
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.491 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:45.495 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:c7:99 10.100.0.23'], port_security=['fa:16:3e:35:c7:99 10.100.0.23'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.23/28', 'neutron:device_id': 'b0949fde-940d-495c-bdb0-e6c996b0274f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89916c03f6f440f6ae7cf81f2ae99bad', 'neutron:revision_number': '5', 'neutron:security_group_ids': '6b35f822-4909-4e2a-a8bd-ef6e39136861', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.174'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2644338c-80e0-4701-bf75-229e5a6223da, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=6aacbedb-6999-4006-9e77-6e540614dbea) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:33:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:45.496 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 6aacbedb-6999-4006-9e77-6e540614dbea in datapath ed94e4f1-14ed-42c4-8c8e-db508a59bd2c unbound from our chassis
Jan 22 17:33:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:45.497 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ed94e4f1-14ed-42c4-8c8e-db508a59bd2c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:33:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:45.498 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[321ba2b9-428a-4f27-a2e5-e5613a7bd8ef]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:33:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:45.498 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c namespace which is not needed anymore
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.506 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:45 compute-0 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000035.scope: Deactivated successfully.
Jan 22 17:33:45 compute-0 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000035.scope: Consumed 19.035s CPU time.
Jan 22 17:33:45 compute-0 systemd-machined[154382]: Machine qemu-53-instance-00000035 terminated.
Jan 22 17:33:45 compute-0 neutron-haproxy-ovnmeta-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233155]: [NOTICE]   (233170) : haproxy version is 2.8.14-c23fe91
Jan 22 17:33:45 compute-0 neutron-haproxy-ovnmeta-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233155]: [NOTICE]   (233170) : path to executable is /usr/sbin/haproxy
Jan 22 17:33:45 compute-0 neutron-haproxy-ovnmeta-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233155]: [WARNING]  (233170) : Exiting Master process...
Jan 22 17:33:45 compute-0 neutron-haproxy-ovnmeta-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233155]: [WARNING]  (233170) : Exiting Master process...
Jan 22 17:33:45 compute-0 neutron-haproxy-ovnmeta-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233155]: [ALERT]    (233170) : Current worker (233184) exited with code 143 (Terminated)
Jan 22 17:33:45 compute-0 neutron-haproxy-ovnmeta-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c[233155]: [WARNING]  (233170) : All workers exited. Exiting... (0)
Jan 22 17:33:45 compute-0 systemd[1]: libpod-b1ee1c00c4990dacd0863eb9a231ec1c1c2be81374bfc55badf636125a088602.scope: Deactivated successfully.
Jan 22 17:33:45 compute-0 podman[234245]: 2026-01-22 17:33:45.633225424 +0000 UTC m=+0.043278879 container died b1ee1c00c4990dacd0863eb9a231ec1c1c2be81374bfc55badf636125a088602 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 17:33:45 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b1ee1c00c4990dacd0863eb9a231ec1c1c2be81374bfc55badf636125a088602-userdata-shm.mount: Deactivated successfully.
Jan 22 17:33:45 compute-0 systemd[1]: var-lib-containers-storage-overlay-a958efb9fc80caf042ae349a4e216b7d5113e0ab0397650d7bc476ecaf08dde3-merged.mount: Deactivated successfully.
Jan 22 17:33:45 compute-0 podman[234245]: 2026-01-22 17:33:45.674319433 +0000 UTC m=+0.084372888 container cleanup b1ee1c00c4990dacd0863eb9a231ec1c1c2be81374bfc55badf636125a088602 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 22 17:33:45 compute-0 systemd[1]: libpod-conmon-b1ee1c00c4990dacd0863eb9a231ec1c1c2be81374bfc55badf636125a088602.scope: Deactivated successfully.
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.707 183079 INFO nova.virt.libvirt.driver [-] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Instance destroyed successfully.
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.708 183079 DEBUG nova.objects.instance [None req-30110a3f-890b-402b-9f00-5145e3e21960 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lazy-loading 'resources' on Instance uuid b0949fde-940d-495c-bdb0-e6c996b0274f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.728 183079 DEBUG nova.virt.libvirt.vif [None req-30110a3f-890b-402b-9f00-5145e3e21960 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:31:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='luke',display_name='luke',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='luke',id=53,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP31lzQFM+bzgy2rILfPJtItaZEOm3vqO/5STzV+08atBuxP1/4YmUTZSa+vuUIur2j2kVkdN8zrzADLiGWPcuNSoFATd7+40/kloWBkWhl+JRfBOGEMv65jtGsYaSx1Fw==',key_name='tempest-keypair-test-1852532011',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:31:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='89916c03f6f440f6ae7cf81f2ae99bad',ramdisk_id='',reservation_id='r-73e4bgnl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-InternalDNSTest-38234021',owner_user_name='tempest-InternalDNSTest-38234021-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:31:19Z,user_data=None,user_id='1ddebe2a251e4b118d9469f7d6fdb2ce',uuid=b0949fde-940d-495c-bdb0-e6c996b0274f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6aacbedb-6999-4006-9e77-6e540614dbea", "address": "fa:16:3e:35:c7:99", "network": {"id": "ed94e4f1-14ed-42c4-8c8e-db508a59bd2c", "bridge": "br-int", "label": "tempest-test-network--696544753", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "89916c03f6f440f6ae7cf81f2ae99bad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6aacbedb-69", "ovs_interfaceid": "6aacbedb-6999-4006-9e77-6e540614dbea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.729 183079 DEBUG nova.network.os_vif_util [None req-30110a3f-890b-402b-9f00-5145e3e21960 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Converting VIF {"id": "6aacbedb-6999-4006-9e77-6e540614dbea", "address": "fa:16:3e:35:c7:99", "network": {"id": "ed94e4f1-14ed-42c4-8c8e-db508a59bd2c", "bridge": "br-int", "label": "tempest-test-network--696544753", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "89916c03f6f440f6ae7cf81f2ae99bad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6aacbedb-69", "ovs_interfaceid": "6aacbedb-6999-4006-9e77-6e540614dbea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.730 183079 DEBUG nova.network.os_vif_util [None req-30110a3f-890b-402b-9f00-5145e3e21960 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:35:c7:99,bridge_name='br-int',has_traffic_filtering=True,id=6aacbedb-6999-4006-9e77-6e540614dbea,network=Network(ed94e4f1-14ed-42c4-8c8e-db508a59bd2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6aacbedb-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.730 183079 DEBUG os_vif [None req-30110a3f-890b-402b-9f00-5145e3e21960 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:35:c7:99,bridge_name='br-int',has_traffic_filtering=True,id=6aacbedb-6999-4006-9e77-6e540614dbea,network=Network(ed94e4f1-14ed-42c4-8c8e-db508a59bd2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6aacbedb-69') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.731 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.731 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6aacbedb-69, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.732 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.734 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.735 183079 INFO os_vif [None req-30110a3f-890b-402b-9f00-5145e3e21960 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:35:c7:99,bridge_name='br-int',has_traffic_filtering=True,id=6aacbedb-6999-4006-9e77-6e540614dbea,network=Network(ed94e4f1-14ed-42c4-8c8e-db508a59bd2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6aacbedb-69')
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.736 183079 INFO nova.virt.libvirt.driver [None req-30110a3f-890b-402b-9f00-5145e3e21960 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Deleting instance files /var/lib/nova/instances/b0949fde-940d-495c-bdb0-e6c996b0274f_del
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.736 183079 INFO nova.virt.libvirt.driver [None req-30110a3f-890b-402b-9f00-5145e3e21960 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Deletion of /var/lib/nova/instances/b0949fde-940d-495c-bdb0-e6c996b0274f_del complete
Jan 22 17:33:45 compute-0 podman[234282]: 2026-01-22 17:33:45.745539592 +0000 UTC m=+0.049065587 container remove b1ee1c00c4990dacd0863eb9a231ec1c1c2be81374bfc55badf636125a088602 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:33:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:45.750 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ce49b60b-d9c9-4eb1-b2aa-e75cc219630c]: (4, ('Thu Jan 22 05:33:45 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c (b1ee1c00c4990dacd0863eb9a231ec1c1c2be81374bfc55badf636125a088602)\nb1ee1c00c4990dacd0863eb9a231ec1c1c2be81374bfc55badf636125a088602\nThu Jan 22 05:33:45 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c (b1ee1c00c4990dacd0863eb9a231ec1c1c2be81374bfc55badf636125a088602)\nb1ee1c00c4990dacd0863eb9a231ec1c1c2be81374bfc55badf636125a088602\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:33:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:45.752 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[5af5e8c6-805d-4719-932c-4bdecd18b860]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:33:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:45.752 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped94e4f1-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:33:45 compute-0 kernel: taped94e4f1-10: left promiscuous mode
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.754 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.767 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:45.769 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[36088cf3-1d64-48cc-aa54-249aeabedc4d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:33:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:45.782 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ed2cad7e-96be-48f8-86aa-5240a5e79369]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:33:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:45.783 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1a1da67f-8ec9-4548-8fd0-32d07ec7ddad]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:33:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:45.800 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b2817559-016c-4e17-82f6-ef52ff8d5cc7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 544161, 'reachable_time': 44519, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234304, 'error': None, 'target': 'ovnmeta-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:33:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:45.805 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ed94e4f1-14ed-42c4-8c8e-db508a59bd2c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:33:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:33:45.805 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[9574eb69-62d9-4826-9c4d-d827e73a3661]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:33:45 compute-0 systemd[1]: run-netns-ovnmeta\x2ded94e4f1\x2d14ed\x2d42c4\x2d8c8e\x2ddb508a59bd2c.mount: Deactivated successfully.
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.805 183079 INFO nova.compute.manager [None req-30110a3f-890b-402b-9f00-5145e3e21960 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Took 0.35 seconds to destroy the instance on the hypervisor.
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.806 183079 DEBUG oslo.service.loopingcall [None req-30110a3f-890b-402b-9f00-5145e3e21960 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.808 183079 DEBUG nova.compute.manager [-] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:33:45 compute-0 nova_compute[183075]: 2026-01-22 17:33:45.808 183079 DEBUG nova.network.neutron [-] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:33:47 compute-0 nova_compute[183075]: 2026-01-22 17:33:47.899 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:48 compute-0 nova_compute[183075]: 2026-01-22 17:33:48.494 183079 DEBUG nova.compute.manager [req-1c1a52fc-71c7-47c5-bc94-952d79226cd6 req-5fa52f2a-03bf-420e-bc7d-c1e7530ac38c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Received event network-vif-unplugged-6aacbedb-6999-4006-9e77-6e540614dbea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:33:48 compute-0 nova_compute[183075]: 2026-01-22 17:33:48.494 183079 DEBUG oslo_concurrency.lockutils [req-1c1a52fc-71c7-47c5-bc94-952d79226cd6 req-5fa52f2a-03bf-420e-bc7d-c1e7530ac38c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "b0949fde-940d-495c-bdb0-e6c996b0274f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:33:48 compute-0 nova_compute[183075]: 2026-01-22 17:33:48.495 183079 DEBUG oslo_concurrency.lockutils [req-1c1a52fc-71c7-47c5-bc94-952d79226cd6 req-5fa52f2a-03bf-420e-bc7d-c1e7530ac38c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b0949fde-940d-495c-bdb0-e6c996b0274f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:33:48 compute-0 nova_compute[183075]: 2026-01-22 17:33:48.495 183079 DEBUG oslo_concurrency.lockutils [req-1c1a52fc-71c7-47c5-bc94-952d79226cd6 req-5fa52f2a-03bf-420e-bc7d-c1e7530ac38c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b0949fde-940d-495c-bdb0-e6c996b0274f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:33:48 compute-0 nova_compute[183075]: 2026-01-22 17:33:48.495 183079 DEBUG nova.compute.manager [req-1c1a52fc-71c7-47c5-bc94-952d79226cd6 req-5fa52f2a-03bf-420e-bc7d-c1e7530ac38c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] No waiting events found dispatching network-vif-unplugged-6aacbedb-6999-4006-9e77-6e540614dbea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:33:48 compute-0 nova_compute[183075]: 2026-01-22 17:33:48.495 183079 DEBUG nova.compute.manager [req-1c1a52fc-71c7-47c5-bc94-952d79226cd6 req-5fa52f2a-03bf-420e-bc7d-c1e7530ac38c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Received event network-vif-unplugged-6aacbedb-6999-4006-9e77-6e540614dbea for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:33:48 compute-0 nova_compute[183075]: 2026-01-22 17:33:48.495 183079 DEBUG nova.compute.manager [req-1c1a52fc-71c7-47c5-bc94-952d79226cd6 req-5fa52f2a-03bf-420e-bc7d-c1e7530ac38c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Received event network-vif-plugged-6aacbedb-6999-4006-9e77-6e540614dbea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:33:48 compute-0 nova_compute[183075]: 2026-01-22 17:33:48.496 183079 DEBUG oslo_concurrency.lockutils [req-1c1a52fc-71c7-47c5-bc94-952d79226cd6 req-5fa52f2a-03bf-420e-bc7d-c1e7530ac38c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "b0949fde-940d-495c-bdb0-e6c996b0274f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:33:48 compute-0 nova_compute[183075]: 2026-01-22 17:33:48.496 183079 DEBUG oslo_concurrency.lockutils [req-1c1a52fc-71c7-47c5-bc94-952d79226cd6 req-5fa52f2a-03bf-420e-bc7d-c1e7530ac38c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b0949fde-940d-495c-bdb0-e6c996b0274f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:33:48 compute-0 nova_compute[183075]: 2026-01-22 17:33:48.496 183079 DEBUG oslo_concurrency.lockutils [req-1c1a52fc-71c7-47c5-bc94-952d79226cd6 req-5fa52f2a-03bf-420e-bc7d-c1e7530ac38c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b0949fde-940d-495c-bdb0-e6c996b0274f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:33:48 compute-0 nova_compute[183075]: 2026-01-22 17:33:48.496 183079 DEBUG nova.compute.manager [req-1c1a52fc-71c7-47c5-bc94-952d79226cd6 req-5fa52f2a-03bf-420e-bc7d-c1e7530ac38c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] No waiting events found dispatching network-vif-plugged-6aacbedb-6999-4006-9e77-6e540614dbea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:33:48 compute-0 nova_compute[183075]: 2026-01-22 17:33:48.496 183079 WARNING nova.compute.manager [req-1c1a52fc-71c7-47c5-bc94-952d79226cd6 req-5fa52f2a-03bf-420e-bc7d-c1e7530ac38c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Received unexpected event network-vif-plugged-6aacbedb-6999-4006-9e77-6e540614dbea for instance with vm_state active and task_state deleting.
Jan 22 17:33:49 compute-0 nova_compute[183075]: 2026-01-22 17:33:49.212 183079 INFO nova.compute.manager [None req-be3c1274-2101-400e-87fa-4eac6b3066cf c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Get console output
Jan 22 17:33:49 compute-0 nova_compute[183075]: 2026-01-22 17:33:49.218 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:33:49 compute-0 nova_compute[183075]: 2026-01-22 17:33:49.278 183079 DEBUG nova.network.neutron [-] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:33:49 compute-0 nova_compute[183075]: 2026-01-22 17:33:49.301 183079 INFO nova.compute.manager [-] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Took 3.49 seconds to deallocate network for instance.
Jan 22 17:33:49 compute-0 nova_compute[183075]: 2026-01-22 17:33:49.343 183079 DEBUG oslo_concurrency.lockutils [None req-30110a3f-890b-402b-9f00-5145e3e21960 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:33:49 compute-0 nova_compute[183075]: 2026-01-22 17:33:49.343 183079 DEBUG oslo_concurrency.lockutils [None req-30110a3f-890b-402b-9f00-5145e3e21960 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:33:49 compute-0 nova_compute[183075]: 2026-01-22 17:33:49.428 183079 DEBUG nova.compute.provider_tree [None req-30110a3f-890b-402b-9f00-5145e3e21960 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:33:49 compute-0 nova_compute[183075]: 2026-01-22 17:33:49.449 183079 DEBUG nova.scheduler.client.report [None req-30110a3f-890b-402b-9f00-5145e3e21960 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:33:49 compute-0 nova_compute[183075]: 2026-01-22 17:33:49.471 183079 DEBUG oslo_concurrency.lockutils [None req-30110a3f-890b-402b-9f00-5145e3e21960 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:33:49 compute-0 nova_compute[183075]: 2026-01-22 17:33:49.492 183079 INFO nova.scheduler.client.report [None req-30110a3f-890b-402b-9f00-5145e3e21960 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Deleted allocations for instance b0949fde-940d-495c-bdb0-e6c996b0274f
Jan 22 17:33:49 compute-0 nova_compute[183075]: 2026-01-22 17:33:49.558 183079 DEBUG oslo_concurrency.lockutils [None req-30110a3f-890b-402b-9f00-5145e3e21960 1ddebe2a251e4b118d9469f7d6fdb2ce 89916c03f6f440f6ae7cf81f2ae99bad - - default default] Lock "b0949fde-940d-495c-bdb0-e6c996b0274f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:33:50 compute-0 podman[234305]: 2026-01-22 17:33:50.373439544 +0000 UTC m=+0.064861146 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 17:33:50 compute-0 nova_compute[183075]: 2026-01-22 17:33:50.588 183079 DEBUG nova.compute.manager [req-6038934b-c8e4-404a-a7ec-881b21b929de req-107c8e49-1aa2-4e02-9c17-5b2925612cc8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Received event network-vif-deleted-6aacbedb-6999-4006-9e77-6e540614dbea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:33:50 compute-0 nova_compute[183075]: 2026-01-22 17:33:50.732 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:52 compute-0 nova_compute[183075]: 2026-01-22 17:33:52.900 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:54 compute-0 podman[234329]: 2026-01-22 17:33:54.353073121 +0000 UTC m=+0.055929274 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 17:33:54 compute-0 nova_compute[183075]: 2026-01-22 17:33:54.655 183079 INFO nova.compute.manager [None req-a477116a-9234-4e77-960c-ee030ca94d9f c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Get console output
Jan 22 17:33:54 compute-0 nova_compute[183075]: 2026-01-22 17:33:54.659 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.458 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'name': 'tempest-server-test-761314490', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000036', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'hostId': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.460 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'eb04fe29-6d1c-4572-b219-f60350425077', 'name': 'tempest-server-test-887859950', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000038', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'hostId': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.460 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.463 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/network.incoming.bytes.delta volume: 8520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.466 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for eb04fe29-6d1c-4572-b219-f60350425077 / tap7a42a264-53 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.466 12 DEBUG ceilometer.compute.pollsters [-] eb04fe29-6d1c-4572-b219-f60350425077/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '69399712-9959-4236-be74-9cfa33887d64', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 8520, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 'instance-00000036-81396ea9-a9a1-4a21-9808-608e45a7aa03-tap0290774a-bb', 'timestamp': '2026-01-22T17:33:55.460761', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'tap0290774a-bb', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:a0:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0290774a-bb'}, 'message_id': '86b67ec4-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.221187657, 'message_signature': '1dd6598c68c27236a3d0b85a94cba272facb0cc7e435bfcfc1e1fb347b317624'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 'instance-00000038-eb04fe29-6d1c-4572-b219-f60350425077-tap7a42a264-53', 'timestamp': '2026-01-22T17:33:55.460761', 'resource_metadata': {'display_name': 'tempest-server-test-887859950', 'name': 'tap7a42a264-53', 'instance_id': 'eb04fe29-6d1c-4572-b219-f60350425077', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:0b:0f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7a42a264-53'}, 'message_id': '86b6fce6-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.224149617, 'message_signature': '5761b3a77ca3433d4d24026cea54406582de0a5e98c1bb6901c696952e360270'}]}, 'timestamp': '2026-01-22 17:33:55.466906', '_unique_id': 'f0b05a8abc204b59a60896187f4e8854'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.467 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.468 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.485 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/memory.usage volume: 42.2734375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.502 12 DEBUG ceilometer.compute.pollsters [-] eb04fe29-6d1c-4572-b219-f60350425077/memory.usage volume: 43.30078125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4bb168b8-fe25-4a45-90f8-63ac412a321e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.2734375, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'timestamp': '2026-01-22T17:33:55.468940', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'instance-00000036', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '86b9ec30-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.245949471, 'message_signature': '3b9ff6de126cda5dd722ee2ba760e2aecb230e3781ed8b8aef82a115b4a77e3c'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.30078125, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 'eb04fe29-6d1c-4572-b219-f60350425077', 'timestamp': '2026-01-22T17:33:55.468940', 'resource_metadata': {'display_name': 'tempest-server-test-887859950', 'name': 'instance-00000038', 'instance_id': 'eb04fe29-6d1c-4572-b219-f60350425077', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '86bc8828-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.262983625, 'message_signature': '7081ca384538684337019db6b756dee18df34c4465b5a9b0164f84ce7ea15ea3'}]}, 'timestamp': '2026-01-22 17:33:55.503331', '_unique_id': 'e2a4d03782c849b8b08aaab4257a7545'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.504 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.505 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.505 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.505 12 DEBUG ceilometer.compute.pollsters [-] eb04fe29-6d1c-4572-b219-f60350425077/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a20030f5-c65a-4590-8e28-8c8574f9631b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 'instance-00000036-81396ea9-a9a1-4a21-9808-608e45a7aa03-tap0290774a-bb', 'timestamp': '2026-01-22T17:33:55.505336', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'tap0290774a-bb', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:a0:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0290774a-bb'}, 'message_id': '86bce4da-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.221187657, 'message_signature': '3bf7ea54195be35939f4dc2ec29b2d3abd1f11a8c0c9204e398c563bc1630cc1'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 'instance-00000038-eb04fe29-6d1c-4572-b219-f60350425077-tap7a42a264-53', 'timestamp': '2026-01-22T17:33:55.505336', 'resource_metadata': {'display_name': 'tempest-server-test-887859950', 'name': 'tap7a42a264-53', 'instance_id': 'eb04fe29-6d1c-4572-b219-f60350425077', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:0b:0f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7a42a264-53'}, 'message_id': '86bceed0-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.224149617, 'message_signature': '54246c42d5c8e5b53ab16696a0a86475340d0a51cbc9dc929e207a8d5a0d8827'}]}, 'timestamp': '2026-01-22 17:33:55.505841', '_unique_id': 'b397fd05b05948308c7c496577cff61d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.506 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.507 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.521 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/disk.device.read.requests volume: 1159 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.534 12 DEBUG ceilometer.compute.pollsters [-] eb04fe29-6d1c-4572-b219-f60350425077/disk.device.read.requests volume: 1112 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9b369e58-bf50-4ed8-9996-48859396f28c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1159, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03-vda', 'timestamp': '2026-01-22T17:33:55.507080', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'instance-00000036', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '86bf5436-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.267515698, 'message_signature': '75f0720864db77ba6fed29ad43e6b099e0a349b77ef7838eea8180097aa44dc6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1112, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 
'resource_id': 'eb04fe29-6d1c-4572-b219-f60350425077-vda', 'timestamp': '2026-01-22T17:33:55.507080', 'resource_metadata': {'display_name': 'tempest-server-test-887859950', 'name': 'instance-00000038', 'instance_id': 'eb04fe29-6d1c-4572-b219-f60350425077', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '86c14e08-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.282097745, 'message_signature': '3c150582f203890f9027250e9cbdb0742d35607bb31bd2777753dfa552d96cc1'}]}, 'timestamp': '2026-01-22 17:33:55.534562', '_unique_id': '11060c81d0d142fa8b6628c284c27a18'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.535 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.536 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.536 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/disk.device.write.latency volume: 6694202653 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.536 12 DEBUG ceilometer.compute.pollsters [-] eb04fe29-6d1c-4572-b219-f60350425077/disk.device.write.latency volume: 22212895632 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a3de02e-6001-4497-b712-f824fcf8f38b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6694202653, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03-vda', 'timestamp': '2026-01-22T17:33:55.536311', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'instance-00000036', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '86c19e30-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.267515698, 'message_signature': '57651f575efa4f060596bd3b6147e0f0f318b17e4f432b2091bbabb9cf966aaa'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22212895632, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 
'resource_id': 'eb04fe29-6d1c-4572-b219-f60350425077-vda', 'timestamp': '2026-01-22T17:33:55.536311', 'resource_metadata': {'display_name': 'tempest-server-test-887859950', 'name': 'instance-00000038', 'instance_id': 'eb04fe29-6d1c-4572-b219-f60350425077', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '86c1a7cc-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.282097745, 'message_signature': '5c19c9e0f5aabd30ac8b7bffccea33809cf9ff648d9631c7c45b9cbc11a73749'}]}, 'timestamp': '2026-01-22 17:33:55.536785', '_unique_id': '789c964f270b44cca0fac01ea07da9d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.537 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.538 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/disk.device.read.latency volume: 249613544 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.538 12 DEBUG ceilometer.compute.pollsters [-] eb04fe29-6d1c-4572-b219-f60350425077/disk.device.read.latency volume: 228251079 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b8c4966-72ad-4bd9-bc69-1300c956eb56', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 249613544, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03-vda', 'timestamp': '2026-01-22T17:33:55.538056', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'instance-00000036', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '86c1e1f6-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.267515698, 'message_signature': '2ae07daa3db43dc25bd38eaa43989ffa3bf040fd4f3442d26b3b11c656625192'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 228251079, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 
'eb04fe29-6d1c-4572-b219-f60350425077-vda', 'timestamp': '2026-01-22T17:33:55.538056', 'resource_metadata': {'display_name': 'tempest-server-test-887859950', 'name': 'instance-00000038', 'instance_id': 'eb04fe29-6d1c-4572-b219-f60350425077', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '86c1ea0c-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.282097745, 'message_signature': '7c8e6f21b096b0f7dcd5b66a430360575a6cf4fc4e97e51c64cfe86859a932bd'}]}, 'timestamp': '2026-01-22 17:33:55.538476', '_unique_id': '0ae067201b3f4be2a0b76dc17abcaea0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.539 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.540 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/network.incoming.packets volume: 78 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.540 12 DEBUG ceilometer.compute.pollsters [-] eb04fe29-6d1c-4572-b219-f60350425077/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d3138e7-171c-43a4-b61c-7643dcf3fbde', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 78, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 'instance-00000036-81396ea9-a9a1-4a21-9808-608e45a7aa03-tap0290774a-bb', 'timestamp': '2026-01-22T17:33:55.540026', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'tap0290774a-bb', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:a0:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0290774a-bb'}, 'message_id': '86c230ac-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.221187657, 'message_signature': 'a2125b7910a902ef978f5b87a5aa94c45f7ae0614496fc0be54d7bb4336d0120'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': 
'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 'instance-00000038-eb04fe29-6d1c-4572-b219-f60350425077-tap7a42a264-53', 'timestamp': '2026-01-22T17:33:55.540026', 'resource_metadata': {'display_name': 'tempest-server-test-887859950', 'name': 'tap7a42a264-53', 'instance_id': 'eb04fe29-6d1c-4572-b219-f60350425077', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:0b:0f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7a42a264-53'}, 'message_id': '86c23dae-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.224149617, 'message_signature': '3611b662be8bc493c4547eb4059f830e6c07c3230fe278e2059ee7316ffcc863'}]}, 'timestamp': '2026-01-22 17:33:55.540670', '_unique_id': '6fe8a35976714a91b9ebbd09a983b9f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.541 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.542 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.542 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/network.incoming.bytes volume: 8610 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.542 12 DEBUG ceilometer.compute.pollsters [-] eb04fe29-6d1c-4572-b219-f60350425077/network.incoming.bytes volume: 7424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1330e87f-7b7a-4a6c-91ba-f309012b24de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8610, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 'instance-00000036-81396ea9-a9a1-4a21-9808-608e45a7aa03-tap0290774a-bb', 'timestamp': '2026-01-22T17:33:55.542172', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'tap0290774a-bb', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:a0:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0290774a-bb'}, 'message_id': '86c2839a-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.221187657, 'message_signature': '4c47d1d131c62e52e9e43be5a49244f92041cd9461d57cc83a931c07c1877aa6'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7424, 'user_id': 
'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 'instance-00000038-eb04fe29-6d1c-4572-b219-f60350425077-tap7a42a264-53', 'timestamp': '2026-01-22T17:33:55.542172', 'resource_metadata': {'display_name': 'tempest-server-test-887859950', 'name': 'tap7a42a264-53', 'instance_id': 'eb04fe29-6d1c-4572-b219-f60350425077', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:0b:0f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7a42a264-53'}, 'message_id': '86c28e08-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.224149617, 'message_signature': 'af2a2433f2998aeaadac607e781d930fd12112d0b7bb8d56f3c2ce02be8fa455'}]}, 'timestamp': '2026-01-22 17:33:55.542759', '_unique_id': '7f7bdd744aa24fd7b41067e87d950973'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.543 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.544 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.544 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.544 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-887859950>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-887859950>]
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.544 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.544 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/network.outgoing.bytes volume: 12064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.544 12 DEBUG ceilometer.compute.pollsters [-] eb04fe29-6d1c-4572-b219-f60350425077/network.outgoing.bytes volume: 11277 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d3242e1-d52a-441b-8c1f-517917b92607', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 12064, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 'instance-00000036-81396ea9-a9a1-4a21-9808-608e45a7aa03-tap0290774a-bb', 'timestamp': '2026-01-22T17:33:55.544550', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'tap0290774a-bb', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:a0:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0290774a-bb'}, 'message_id': '86c2e20e-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.221187657, 'message_signature': 'ed8aa4d8b55f1a0b82692753c01138cf565757da8ee027ee16a13400708d4896'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11277, 'user_id': 
'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 'instance-00000038-eb04fe29-6d1c-4572-b219-f60350425077-tap7a42a264-53', 'timestamp': '2026-01-22T17:33:55.544550', 'resource_metadata': {'display_name': 'tempest-server-test-887859950', 'name': 'tap7a42a264-53', 'instance_id': 'eb04fe29-6d1c-4572-b219-f60350425077', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:0b:0f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7a42a264-53'}, 'message_id': '86c2eda8-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.224149617, 'message_signature': 'bf052ff80c87d9fa8fb8482fe633659a827e277d71e5a7d4773358d363133967'}]}, 'timestamp': '2026-01-22 17:33:55.545173', '_unique_id': 'e4ec74a364d34b4dbd892a4a25f5b2bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.545 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.546 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.546 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/disk.device.write.bytes volume: 73211904 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.547 12 DEBUG ceilometer.compute.pollsters [-] eb04fe29-6d1c-4572-b219-f60350425077/disk.device.write.bytes volume: 72974336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '32dd70a3-560b-4881-881f-f5b387021574', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73211904, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03-vda', 'timestamp': '2026-01-22T17:33:55.546920', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'instance-00000036', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '86c34078-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.267515698, 'message_signature': '8ecd5c2072d407b7c8544dbdac3d4eb54750254b9fb05445dd9a0a9de0e2a656'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72974336, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 
'eb04fe29-6d1c-4572-b219-f60350425077-vda', 'timestamp': '2026-01-22T17:33:55.546920', 'resource_metadata': {'display_name': 'tempest-server-test-887859950', 'name': 'instance-00000038', 'instance_id': 'eb04fe29-6d1c-4572-b219-f60350425077', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '86c34ba4-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.282097745, 'message_signature': '28a581474fb4fbb3a371f3690af3833345ae0535560eec8505d4e5baca0aa7fb'}]}, 'timestamp': '2026-01-22 17:33:55.547575', '_unique_id': '68454726bbff4500a26f9d11bd4135a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.548 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.549 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.549 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/disk.device.write.requests volume: 350 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.549 12 DEBUG ceilometer.compute.pollsters [-] eb04fe29-6d1c-4572-b219-f60350425077/disk.device.write.requests volume: 315 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa1f3c21-f848-42f4-bcf3-c81905ac9334', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 350, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03-vda', 'timestamp': '2026-01-22T17:33:55.549211', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'instance-00000036', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '86c395e6-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.267515698, 'message_signature': 'fb7adcf1ab8694cdd4cdff238f163d728faf947477cd8589320a867be3185530'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 315, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 
'resource_id': 'eb04fe29-6d1c-4572-b219-f60350425077-vda', 'timestamp': '2026-01-22T17:33:55.549211', 'resource_metadata': {'display_name': 'tempest-server-test-887859950', 'name': 'instance-00000038', 'instance_id': 'eb04fe29-6d1c-4572-b219-f60350425077', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '86c39f6e-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.282097745, 'message_signature': '3bbee3cbbe481339827806208ffa3ecd6e164057d224cb57a74cf0b4f7af64d9'}]}, 'timestamp': '2026-01-22 17:33:55.549693', '_unique_id': '1e1b4043e6b54b7f9e0d5a339ad771b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.550 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.551 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.551 12 DEBUG ceilometer.compute.pollsters [-] eb04fe29-6d1c-4572-b219-f60350425077/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cff11538-40f4-4320-8b58-1f236c03fed2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 'instance-00000036-81396ea9-a9a1-4a21-9808-608e45a7aa03-tap0290774a-bb', 'timestamp': '2026-01-22T17:33:55.551085', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'tap0290774a-bb', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:a0:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0290774a-bb'}, 'message_id': '86c3e00a-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.221187657, 'message_signature': 'f442a29819dd8df0eade07f901a2587314a75e9a14e16b341d1da98b50c2cc5c'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 'instance-00000038-eb04fe29-6d1c-4572-b219-f60350425077-tap7a42a264-53', 'timestamp': '2026-01-22T17:33:55.551085', 'resource_metadata': {'display_name': 'tempest-server-test-887859950', 'name': 'tap7a42a264-53', 'instance_id': 'eb04fe29-6d1c-4572-b219-f60350425077', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:0b:0f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7a42a264-53'}, 'message_id': '86c3e866-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.224149617, 'message_signature': 'e45a161ceda891e5a0103a83cfa2760edba52be2ac64a0f3eb136651e556ca02'}]}, 'timestamp': '2026-01-22 17:33:55.551548', '_unique_id': '30fe87dbe6314d0d930d43f0ac07948a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.552 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 DEBUG ceilometer.compute.pollsters [-] eb04fe29-6d1c-4572-b219-f60350425077/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '32e5f3d5-e5b5-4e0e-8580-dc5c626af7c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 'instance-00000036-81396ea9-a9a1-4a21-9808-608e45a7aa03-tap0290774a-bb', 'timestamp': '2026-01-22T17:33:55.552728', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'tap0290774a-bb', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:a0:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0290774a-bb'}, 'message_id': '86c41fde-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.221187657, 'message_signature': '9664b866da536d03c4bc9386b5677ab95bc55f2b6c5091a0079293fb4bdea442'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 'instance-00000038-eb04fe29-6d1c-4572-b219-f60350425077-tap7a42a264-53', 'timestamp': '2026-01-22T17:33:55.552728', 'resource_metadata': {'display_name': 'tempest-server-test-887859950', 'name': 'tap7a42a264-53', 'instance_id': 'eb04fe29-6d1c-4572-b219-f60350425077', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:0b:0f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7a42a264-53'}, 'message_id': '86c429ca-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.224149617, 'message_signature': '1ca74961579fc985b93ed7b07ad2c9555f7b436fae2ada1030fe34f43642945f'}]}, 'timestamp': '2026-01-22 17:33:55.553227', '_unique_id': '95d541c60548494a8b297dab673e4188'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.553 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.554 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.561 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.566 12 DEBUG ceilometer.compute.pollsters [-] eb04fe29-6d1c-4572-b219-f60350425077/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'acbf6feb-3fa0-4753-b5b7-20add644f504', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03-vda', 'timestamp': '2026-01-22T17:33:55.554419', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'instance-00000036', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '86c5705a-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.314859827, 'message_signature': '3d1764a349b8bbd11e7e4e4efa63aa49de58306ae3ab51a75a97ce22e1276a64'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 'eb04fe29-6d1c-4572-b219-f60350425077-vda', 'timestamp': '2026-01-22T17:33:55.554419', 'resource_metadata': {'display_name': 'tempest-server-test-887859950', 'name': 'instance-00000038', 'instance_id': 'eb04fe29-6d1c-4572-b219-f60350425077', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '86c64fa2-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.322124905, 'message_signature': '59e9d1d15a0d472b6b50fa693b4e79076fd6462f79440c0d183b65d57324f895'}]}, 'timestamp': '2026-01-22 17:33:55.567398', '_unique_id': 'cab1d165af07417b822f3e6c88cdb824'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.568 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.570 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.570 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.570 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-887859950>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-887859950>]
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.570 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.570 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.571 12 DEBUG ceilometer.compute.pollsters [-] eb04fe29-6d1c-4572-b219-f60350425077/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a1d3abf-8c1a-477d-8a24-95de7eb9facc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 'instance-00000036-81396ea9-a9a1-4a21-9808-608e45a7aa03-tap0290774a-bb', 'timestamp': '2026-01-22T17:33:55.570890', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'tap0290774a-bb', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:a0:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0290774a-bb'}, 'message_id': '86c6e6ec-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.221187657, 'message_signature': '1aa53adcd3c96483e0cf7d8f99ba59decafe41522db8d1233f3f7d2229bea85d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 'instance-00000038-eb04fe29-6d1c-4572-b219-f60350425077-tap7a42a264-53', 'timestamp': '2026-01-22T17:33:55.570890', 'resource_metadata': {'display_name': 'tempest-server-test-887859950', 'name': 'tap7a42a264-53', 'instance_id': 'eb04fe29-6d1c-4572-b219-f60350425077', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:0b:0f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7a42a264-53'}, 'message_id': '86c6f272-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.224149617, 'message_signature': '4c25d166d1c7f76bd28c13b9114410ea3fe698dc6291a8dc55619f3c2c4ab3ef'}]}, 'timestamp': '2026-01-22 17:33:55.571569', '_unique_id': '879af4f51f9f4eda971507867145c581'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.572 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.573 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.573 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/cpu volume: 11960000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.573 12 DEBUG ceilometer.compute.pollsters [-] eb04fe29-6d1c-4572-b219-f60350425077/cpu volume: 11180000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '02127186-528c-4cfd-ba4d-88f9ae08fef4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11960000000, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'timestamp': '2026-01-22T17:33:55.573643', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'instance-00000036', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '86c75244-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.245949471, 'message_signature': 'a627cf4065bf780c3894269c4fcce4d31e1242ea9c3b1ae87a4ad0ca176753d1'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11180000000, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 'eb04fe29-6d1c-4572-b219-f60350425077', 
'timestamp': '2026-01-22T17:33:55.573643', 'resource_metadata': {'display_name': 'tempest-server-test-887859950', 'name': 'instance-00000038', 'instance_id': 'eb04fe29-6d1c-4572-b219-f60350425077', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '86c75cbc-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.262983625, 'message_signature': '5e52aed83980e423ffb4c27c24a140b51cd0ffbdfe46ceee3585450d52851b1e'}]}, 'timestamp': '2026-01-22 17:33:55.574214', '_unique_id': 'f6d265b75d29480da5483a237eb46621'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.574 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.575 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.575 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.575 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-887859950>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-887859950>]
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.575 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.576 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/disk.device.allocation volume: 30220288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.576 12 DEBUG ceilometer.compute.pollsters [-] eb04fe29-6d1c-4572-b219-f60350425077/disk.device.allocation volume: 30220288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b08b38c-2007-43a8-a2ac-9bac2494d298', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30220288, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03-vda', 'timestamp': '2026-01-22T17:33:55.576037', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'instance-00000036', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '86c7ae56-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.314859827, 'message_signature': '819434df969af8af43cc7d8ea982b443c44b8df2330d6f993be31ab21546a784'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30220288, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 
'eb04fe29-6d1c-4572-b219-f60350425077-vda', 'timestamp': '2026-01-22T17:33:55.576037', 'resource_metadata': {'display_name': 'tempest-server-test-887859950', 'name': 'instance-00000038', 'instance_id': 'eb04fe29-6d1c-4572-b219-f60350425077', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '86c7b874-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.322124905, 'message_signature': '352b08b6b87873403fab5749d9dd9ccc82a03b8048c647a298eb936e0f7356ee'}]}, 'timestamp': '2026-01-22 17:33:55.576561', '_unique_id': 'd7fcdc8bf3c443039f2784cf0a5cff7e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.577 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.578 12 DEBUG ceilometer.compute.pollsters [-] eb04fe29-6d1c-4572-b219-f60350425077/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1f8a0bff-5ac4-4e6d-9842-8682cd2d854c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03-vda', 'timestamp': '2026-01-22T17:33:55.577947', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'instance-00000036', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '86c7f8f2-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.314859827, 'message_signature': '9744c75333ca72824f3786e4dd46447430f06ffd014ae5a991c48e2b24a03752'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 
'eb04fe29-6d1c-4572-b219-f60350425077-vda', 'timestamp': '2026-01-22T17:33:55.577947', 'resource_metadata': {'display_name': 'tempest-server-test-887859950', 'name': 'instance-00000038', 'instance_id': 'eb04fe29-6d1c-4572-b219-f60350425077', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '86c80310-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.322124905, 'message_signature': 'abac11d9472078fdf04bb87916119f1b8dd6c4a23d10f836bc5af28af518039f'}]}, 'timestamp': '2026-01-22 17:33:55.578473', '_unique_id': 'fefeea46870249bba6b4e194b4303773'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.579 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/network.outgoing.bytes.delta volume: 12064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.580 12 DEBUG ceilometer.compute.pollsters [-] eb04fe29-6d1c-4572-b219-f60350425077/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de5ac340-13dd-4642-a2e9-65471671ac1e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 12064, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 'instance-00000036-81396ea9-a9a1-4a21-9808-608e45a7aa03-tap0290774a-bb', 'timestamp': '2026-01-22T17:33:55.579905', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'tap0290774a-bb', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:a0:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0290774a-bb'}, 'message_id': '86c84618-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.221187657, 'message_signature': 'b3b88dc36488b03d2d244227c287d3fb6ba32110d220a7e76f690f24ce82348f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 'instance-00000038-eb04fe29-6d1c-4572-b219-f60350425077-tap7a42a264-53', 'timestamp': '2026-01-22T17:33:55.579905', 'resource_metadata': {'display_name': 'tempest-server-test-887859950', 'name': 'tap7a42a264-53', 'instance_id': 'eb04fe29-6d1c-4572-b219-f60350425077', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:0b:0f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7a42a264-53'}, 'message_id': '86c85158-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.224149617, 'message_signature': '47c3bb6382b0bb2ab95f36a32021876160377c3052067cfe0fb03665d159871b'}]}, 'timestamp': '2026-01-22 17:33:55.580495', '_unique_id': '9a76faeb9cc647298ca33927c79ca006'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.581 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/network.outgoing.packets volume: 136 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.582 12 DEBUG ceilometer.compute.pollsters [-] eb04fe29-6d1c-4572-b219-f60350425077/network.outgoing.packets volume: 127 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b56a22b-960b-4c17-8f31-466c3293a7c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 136, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 'instance-00000036-81396ea9-a9a1-4a21-9808-608e45a7aa03-tap0290774a-bb', 'timestamp': '2026-01-22T17:33:55.581934', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'tap0290774a-bb', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:a0:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0290774a-bb'}, 'message_id': '86c8955a-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.221187657, 'message_signature': 'd304154bad54386bcaac0e31ae594f3fce8cc951e59bd0b36b0a22b290d72557'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 127, 'user_id': 
'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 'instance-00000038-eb04fe29-6d1c-4572-b219-f60350425077-tap7a42a264-53', 'timestamp': '2026-01-22T17:33:55.581934', 'resource_metadata': {'display_name': 'tempest-server-test-887859950', 'name': 'tap7a42a264-53', 'instance_id': 'eb04fe29-6d1c-4572-b219-f60350425077', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:0b:0f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7a42a264-53'}, 'message_id': '86c8a018-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.224149617, 'message_signature': '9ca44cc5e05eeb786e5887a78197ca34041fa806500d42eed2273a058853110d'}]}, 'timestamp': '2026-01-22 17:33:55.582500', '_unique_id': 'd44dfd6725da4fa7a5ca68bb69a04281'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.583 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.584 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.584 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-887859950>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-887859950>]
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.584 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.584 12 DEBUG ceilometer.compute.pollsters [-] 81396ea9-a9a1-4a21-9808-608e45a7aa03/disk.device.read.bytes volume: 31173120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.584 12 DEBUG ceilometer.compute.pollsters [-] eb04fe29-6d1c-4572-b219-f60350425077/disk.device.read.bytes volume: 30046720 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'be249b2a-55dd-4ce4-81a5-7160697ff2bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31173120, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03-vda', 'timestamp': '2026-01-22T17:33:55.584343', 'resource_metadata': {'display_name': 'tempest-server-test-761314490', 'name': 'instance-00000036', 'instance_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '86c8f338-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.267515698, 'message_signature': '22d9fe90394bffefcac9b1836ccf18cb642e5517c2688d090fa40c188172323a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30046720, 'user_id': 'c4621e42483d4f49b9a97f2b7eb886dc', 'user_name': None, 'project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'project_name': None, 'resource_id': 
'eb04fe29-6d1c-4572-b219-f60350425077-vda', 'timestamp': '2026-01-22T17:33:55.584343', 'resource_metadata': {'display_name': 'tempest-server-test-887859950', 'name': 'instance-00000038', 'instance_id': 'eb04fe29-6d1c-4572-b219-f60350425077', 'instance_type': 'm1.nano', 'host': 'df6a9b36187d289823f00820d2ee9cb1a3913a68d4f14091f78b692a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '86c8ff22-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5599.282097745, 'message_signature': 'a53c5a872761a029b0db97d36da0d29e6fb45840b5fbbeb0b089a3c36848d4e6'}]}, 'timestamp': '2026-01-22 17:33:55.584925', '_unique_id': '99266f4d4b3b48259432482506d1f65d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:33:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:33:55.585 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:33:55 compute-0 nova_compute[183075]: 2026-01-22 17:33:55.733 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:56 compute-0 nova_compute[183075]: 2026-01-22 17:33:56.261 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:33:57 compute-0 nova_compute[183075]: 2026-01-22 17:33:57.691 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769103222.6898556, b09d9bed-19f3-4aae-8aa4-7a87468084e4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:33:57 compute-0 nova_compute[183075]: 2026-01-22 17:33:57.691 183079 INFO nova.compute.manager [-] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] VM Stopped (Lifecycle Event)
Jan 22 17:33:57 compute-0 nova_compute[183075]: 2026-01-22 17:33:57.742 183079 DEBUG nova.compute.manager [None req-499545cc-cb9f-4ef1-909f-52d9e6b507a8 - - - - - -] [instance: b09d9bed-19f3-4aae-8aa4-7a87468084e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:33:57 compute-0 nova_compute[183075]: 2026-01-22 17:33:57.909 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:58 compute-0 ovn_controller[95372]: 2026-01-22T17:33:58Z|00614|binding|INFO|Releasing lport 0297f784-1a41-4744-b018-f503dfa93754 from this chassis (sb_readonly=0)
Jan 22 17:33:58 compute-0 nova_compute[183075]: 2026-01-22 17:33:58.404 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:58 compute-0 ovn_controller[95372]: 2026-01-22T17:33:58Z|00615|binding|INFO|Releasing lport 0297f784-1a41-4744-b018-f503dfa93754 from this chassis (sb_readonly=0)
Jan 22 17:33:58 compute-0 nova_compute[183075]: 2026-01-22 17:33:58.535 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:59 compute-0 nova_compute[183075]: 2026-01-22 17:33:59.927 183079 INFO nova.compute.manager [None req-5ea1d0f9-d19f-4792-8360-a11ccbcc1ae9 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Get console output
Jan 22 17:33:59 compute-0 nova_compute[183075]: 2026-01-22 17:33:59.932 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:34:00 compute-0 nova_compute[183075]: 2026-01-22 17:34:00.706 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769103225.7046955, b0949fde-940d-495c-bdb0-e6c996b0274f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:34:00 compute-0 nova_compute[183075]: 2026-01-22 17:34:00.706 183079 INFO nova.compute.manager [-] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] VM Stopped (Lifecycle Event)
Jan 22 17:34:00 compute-0 nova_compute[183075]: 2026-01-22 17:34:00.735 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:00 compute-0 nova_compute[183075]: 2026-01-22 17:34:00.746 183079 DEBUG nova.compute.manager [None req-870d527e-6540-4174-9ff0-0dca63e183bd - - - - - -] [instance: b0949fde-940d-495c-bdb0-e6c996b0274f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:34:02 compute-0 nova_compute[183075]: 2026-01-22 17:34:02.911 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:05 compute-0 nova_compute[183075]: 2026-01-22 17:34:05.279 183079 INFO nova.compute.manager [None req-b73d2510-834c-4c9a-adb3-6642ca8e41af c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Get console output
Jan 22 17:34:05 compute-0 nova_compute[183075]: 2026-01-22 17:34:05.286 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:34:05 compute-0 nova_compute[183075]: 2026-01-22 17:34:05.737 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:06 compute-0 podman[234356]: 2026-01-22 17:34:06.363924716 +0000 UTC m=+0.063862609 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true)
Jan 22 17:34:06 compute-0 podman[234357]: 2026-01-22 17:34:06.364770469 +0000 UTC m=+0.065253967 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-type=git, architecture=x86_64, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter)
Jan 22 17:34:06 compute-0 podman[234355]: 2026-01-22 17:34:06.40335785 +0000 UTC m=+0.101571466 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:34:07 compute-0 nova_compute[183075]: 2026-01-22 17:34:07.913 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:10 compute-0 podman[234417]: 2026-01-22 17:34:10.337830225 +0000 UTC m=+0.049740785 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 22 17:34:10 compute-0 nova_compute[183075]: 2026-01-22 17:34:10.519 183079 INFO nova.compute.manager [None req-c7d695c9-1727-43f3-8ca7-9c82bbf45226 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Get console output
Jan 22 17:34:10 compute-0 nova_compute[183075]: 2026-01-22 17:34:10.523 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:34:10 compute-0 nova_compute[183075]: 2026-01-22 17:34:10.738 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:12 compute-0 nova_compute[183075]: 2026-01-22 17:34:12.916 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:15 compute-0 nova_compute[183075]: 2026-01-22 17:34:15.741 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:15 compute-0 nova_compute[183075]: 2026-01-22 17:34:15.763 183079 INFO nova.compute.manager [None req-a40b7aa3-6100-4f4b-b2f2-35aeb69c80d4 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Get console output
Jan 22 17:34:15 compute-0 nova_compute[183075]: 2026-01-22 17:34:15.768 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:34:17 compute-0 ovn_controller[95372]: 2026-01-22T17:34:17Z|00616|binding|INFO|Releasing lport 0297f784-1a41-4744-b018-f503dfa93754 from this chassis (sb_readonly=0)
Jan 22 17:34:17 compute-0 nova_compute[183075]: 2026-01-22 17:34:17.896 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:17 compute-0 NetworkManager[55454]: <info>  [1769103257.8993] manager: (patch-br-int-to-provnet-c63dea44-3fe3-44c3-ba47-76ee1ea0ad94): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/262)
Jan 22 17:34:17 compute-0 nova_compute[183075]: 2026-01-22 17:34:17.899 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:17 compute-0 NetworkManager[55454]: <info>  [1769103257.9015] manager: (patch-provnet-c63dea44-3fe3-44c3-ba47-76ee1ea0ad94-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/263)
Jan 22 17:34:17 compute-0 ovn_controller[95372]: 2026-01-22T17:34:17Z|00617|binding|INFO|Releasing lport 0297f784-1a41-4744-b018-f503dfa93754 from this chassis (sb_readonly=0)
Jan 22 17:34:17 compute-0 nova_compute[183075]: 2026-01-22 17:34:17.902 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:17 compute-0 nova_compute[183075]: 2026-01-22 17:34:17.917 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:18 compute-0 nova_compute[183075]: 2026-01-22 17:34:18.321 183079 DEBUG nova.compute.manager [req-c9aa44e4-9c3b-41b3-bd15-9e335ad74bcf req-dcf2b07e-17e3-4d81-b178-df9d16ac3d34 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Received event network-changed-0290774a-bbee-4523-b342-ddb24aca4826 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:34:18 compute-0 nova_compute[183075]: 2026-01-22 17:34:18.321 183079 DEBUG nova.compute.manager [req-c9aa44e4-9c3b-41b3-bd15-9e335ad74bcf req-dcf2b07e-17e3-4d81-b178-df9d16ac3d34 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Refreshing instance network info cache due to event network-changed-0290774a-bbee-4523-b342-ddb24aca4826. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:34:18 compute-0 nova_compute[183075]: 2026-01-22 17:34:18.322 183079 DEBUG oslo_concurrency.lockutils [req-c9aa44e4-9c3b-41b3-bd15-9e335ad74bcf req-dcf2b07e-17e3-4d81-b178-df9d16ac3d34 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-81396ea9-a9a1-4a21-9808-608e45a7aa03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:34:18 compute-0 nova_compute[183075]: 2026-01-22 17:34:18.322 183079 DEBUG oslo_concurrency.lockutils [req-c9aa44e4-9c3b-41b3-bd15-9e335ad74bcf req-dcf2b07e-17e3-4d81-b178-df9d16ac3d34 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-81396ea9-a9a1-4a21-9808-608e45a7aa03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:34:18 compute-0 nova_compute[183075]: 2026-01-22 17:34:18.323 183079 DEBUG nova.network.neutron [req-c9aa44e4-9c3b-41b3-bd15-9e335ad74bcf req-dcf2b07e-17e3-4d81-b178-df9d16ac3d34 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Refreshing network info cache for port 0290774a-bbee-4523-b342-ddb24aca4826 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:34:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:20.335 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=40, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=39) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:34:20 compute-0 nova_compute[183075]: 2026-01-22 17:34:20.336 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:20.337 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:34:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:20.338 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '40'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:34:20 compute-0 nova_compute[183075]: 2026-01-22 17:34:20.744 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:21 compute-0 nova_compute[183075]: 2026-01-22 17:34:21.209 183079 DEBUG nova.network.neutron [req-c9aa44e4-9c3b-41b3-bd15-9e335ad74bcf req-dcf2b07e-17e3-4d81-b178-df9d16ac3d34 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Updated VIF entry in instance network info cache for port 0290774a-bbee-4523-b342-ddb24aca4826. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:34:21 compute-0 nova_compute[183075]: 2026-01-22 17:34:21.209 183079 DEBUG nova.network.neutron [req-c9aa44e4-9c3b-41b3-bd15-9e335ad74bcf req-dcf2b07e-17e3-4d81-b178-df9d16ac3d34 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Updating instance_info_cache with network_info: [{"id": "0290774a-bbee-4523-b342-ddb24aca4826", "address": "fa:16:3e:60:a0:be", "network": {"id": "2fd77df8-cf00-4afc-b4cf-75b5722c375c", "bridge": "br-int", "label": "tempest-test-network--158385692", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe60:a0be", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22f75e117e724f9aaadf5b8fd25a6ef6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0290774a-bb", "ovs_interfaceid": "0290774a-bbee-4523-b342-ddb24aca4826", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:34:21 compute-0 nova_compute[183075]: 2026-01-22 17:34:21.285 183079 DEBUG oslo_concurrency.lockutils [req-c9aa44e4-9c3b-41b3-bd15-9e335ad74bcf req-dcf2b07e-17e3-4d81-b178-df9d16ac3d34 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-81396ea9-a9a1-4a21-9808-608e45a7aa03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:34:21 compute-0 podman[234438]: 2026-01-22 17:34:21.340130014 +0000 UTC m=+0.046882248 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 17:34:22 compute-0 nova_compute[183075]: 2026-01-22 17:34:22.176 183079 DEBUG nova.compute.manager [req-0aaed23d-f3ae-4d69-9a29-3ffc4e8387e8 req-6e788f44-7e6d-473e-af1a-721aaec6edee a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Received event network-changed-7a42a264-5341-4ac6-8da3-c317a7b2c279 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:34:22 compute-0 nova_compute[183075]: 2026-01-22 17:34:22.177 183079 DEBUG nova.compute.manager [req-0aaed23d-f3ae-4d69-9a29-3ffc4e8387e8 req-6e788f44-7e6d-473e-af1a-721aaec6edee a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Refreshing instance network info cache due to event network-changed-7a42a264-5341-4ac6-8da3-c317a7b2c279. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:34:22 compute-0 nova_compute[183075]: 2026-01-22 17:34:22.177 183079 DEBUG oslo_concurrency.lockutils [req-0aaed23d-f3ae-4d69-9a29-3ffc4e8387e8 req-6e788f44-7e6d-473e-af1a-721aaec6edee a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-eb04fe29-6d1c-4572-b219-f60350425077" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:34:22 compute-0 nova_compute[183075]: 2026-01-22 17:34:22.177 183079 DEBUG oslo_concurrency.lockutils [req-0aaed23d-f3ae-4d69-9a29-3ffc4e8387e8 req-6e788f44-7e6d-473e-af1a-721aaec6edee a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-eb04fe29-6d1c-4572-b219-f60350425077" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:34:22 compute-0 nova_compute[183075]: 2026-01-22 17:34:22.178 183079 DEBUG nova.network.neutron [req-0aaed23d-f3ae-4d69-9a29-3ffc4e8387e8 req-6e788f44-7e6d-473e-af1a-721aaec6edee a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Refreshing network info cache for port 7a42a264-5341-4ac6-8da3-c317a7b2c279 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:34:22 compute-0 nova_compute[183075]: 2026-01-22 17:34:22.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:34:22 compute-0 nova_compute[183075]: 2026-01-22 17:34:22.920 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:23 compute-0 nova_compute[183075]: 2026-01-22 17:34:23.163 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:24 compute-0 nova_compute[183075]: 2026-01-22 17:34:24.832 183079 DEBUG nova.network.neutron [req-0aaed23d-f3ae-4d69-9a29-3ffc4e8387e8 req-6e788f44-7e6d-473e-af1a-721aaec6edee a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Updated VIF entry in instance network info cache for port 7a42a264-5341-4ac6-8da3-c317a7b2c279. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:34:24 compute-0 nova_compute[183075]: 2026-01-22 17:34:24.833 183079 DEBUG nova.network.neutron [req-0aaed23d-f3ae-4d69-9a29-3ffc4e8387e8 req-6e788f44-7e6d-473e-af1a-721aaec6edee a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Updating instance_info_cache with network_info: [{"id": "7a42a264-5341-4ac6-8da3-c317a7b2c279", "address": "fa:16:3e:f5:0b:0f", "network": {"id": "2fd77df8-cf00-4afc-b4cf-75b5722c375c", "bridge": "br-int", "label": "tempest-test-network--158385692", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef5:b0f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22f75e117e724f9aaadf5b8fd25a6ef6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a42a264-53", "ovs_interfaceid": "7a42a264-5341-4ac6-8da3-c317a7b2c279", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:34:25 compute-0 nova_compute[183075]: 2026-01-22 17:34:25.136 183079 DEBUG oslo_concurrency.lockutils [req-0aaed23d-f3ae-4d69-9a29-3ffc4e8387e8 req-6e788f44-7e6d-473e-af1a-721aaec6edee a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-eb04fe29-6d1c-4572-b219-f60350425077" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:34:25 compute-0 podman[234463]: 2026-01-22 17:34:25.396559629 +0000 UTC m=+0.107270091 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 17:34:25 compute-0 nova_compute[183075]: 2026-01-22 17:34:25.747 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:25 compute-0 nova_compute[183075]: 2026-01-22 17:34:25.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:34:25 compute-0 nova_compute[183075]: 2026-01-22 17:34:25.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:34:27 compute-0 nova_compute[183075]: 2026-01-22 17:34:27.922 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:28 compute-0 nova_compute[183075]: 2026-01-22 17:34:28.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:34:28 compute-0 nova_compute[183075]: 2026-01-22 17:34:28.789 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.302 183079 DEBUG oslo_concurrency.lockutils [None req-14a82288-e132-48a2-b2c8-4edc45363269 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Acquiring lock "eb04fe29-6d1c-4572-b219-f60350425077" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.303 183079 DEBUG oslo_concurrency.lockutils [None req-14a82288-e132-48a2-b2c8-4edc45363269 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "eb04fe29-6d1c-4572-b219-f60350425077" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.303 183079 DEBUG oslo_concurrency.lockutils [None req-14a82288-e132-48a2-b2c8-4edc45363269 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Acquiring lock "eb04fe29-6d1c-4572-b219-f60350425077-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.303 183079 DEBUG oslo_concurrency.lockutils [None req-14a82288-e132-48a2-b2c8-4edc45363269 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "eb04fe29-6d1c-4572-b219-f60350425077-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.303 183079 DEBUG oslo_concurrency.lockutils [None req-14a82288-e132-48a2-b2c8-4edc45363269 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "eb04fe29-6d1c-4572-b219-f60350425077-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.306 183079 INFO nova.compute.manager [None req-14a82288-e132-48a2-b2c8-4edc45363269 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Terminating instance
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.307 183079 DEBUG nova.compute.manager [None req-14a82288-e132-48a2-b2c8-4edc45363269 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:34:30 compute-0 kernel: tap7a42a264-53 (unregistering): left promiscuous mode
Jan 22 17:34:30 compute-0 NetworkManager[55454]: <info>  [1769103270.3276] device (tap7a42a264-53): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.339 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:30 compute-0 ovn_controller[95372]: 2026-01-22T17:34:30Z|00618|binding|INFO|Releasing lport 7a42a264-5341-4ac6-8da3-c317a7b2c279 from this chassis (sb_readonly=0)
Jan 22 17:34:30 compute-0 ovn_controller[95372]: 2026-01-22T17:34:30Z|00619|binding|INFO|Setting lport 7a42a264-5341-4ac6-8da3-c317a7b2c279 down in Southbound
Jan 22 17:34:30 compute-0 ovn_controller[95372]: 2026-01-22T17:34:30Z|00620|binding|INFO|Removing iface tap7a42a264-53 ovn-installed in OVS
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.342 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:30.366 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f5:0b:0f 10.100.0.6 2001:db8::f816:3eff:fef5:b0f'], port_security=['fa:16:3e:f5:0b:0f 10.100.0.6 2001:db8::f816:3eff:fef5:b0f'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28 2001:db8::f816:3eff:fef5:b0f/64', 'neutron:device_id': 'eb04fe29-6d1c-4572-b219-f60350425077', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2fd77df8-cf00-4afc-b4cf-75b5722c375c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e838e551-4083-4143-b761-54a81d27a6c4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.231'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6159b68b-3c7b-43be-9fb9-7f846f3d3eb8, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=7a42a264-5341-4ac6-8da3-c317a7b2c279) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.367 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:30.370 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 7a42a264-5341-4ac6-8da3-c317a7b2c279 in datapath 2fd77df8-cf00-4afc-b4cf-75b5722c375c unbound from our chassis
Jan 22 17:34:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:30.373 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2fd77df8-cf00-4afc-b4cf-75b5722c375c
Jan 22 17:34:30 compute-0 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d00000038.scope: Deactivated successfully.
Jan 22 17:34:30 compute-0 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d00000038.scope: Consumed 14.932s CPU time.
Jan 22 17:34:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:30.394 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f6afa1f2-8503-4050-9ac0-fee0df542912]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:34:30 compute-0 systemd-machined[154382]: Machine qemu-56-instance-00000038 terminated.
Jan 22 17:34:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:30.436 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[635f2bb5-2634-4abf-85b4-ced6dbd26653]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:34:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:30.439 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[973130f9-4405-4cb7-be5e-a9748fce62db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:34:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:30.478 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[5b46fd54-38de-4fbf-92b0-e6de3ba170cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:34:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:30.498 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[484657ff-8e5c-4a51-a7cb-98a2d75fd981]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2fd77df8-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e7:bc:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 228, 'tx_packets': 105, 'rx_bytes': 19472, 'tx_bytes': 11986, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 228, 'tx_packets': 105, 'rx_bytes': 19472, 'tx_bytes': 11986, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 170], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 546788, 'reachable_time': 43674, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 30, 'inoctets': 2192, 'indelivers': 7, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 30, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 2192, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 30, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 7, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234498, 'error': None, 'target': 'ovnmeta-2fd77df8-cf00-4afc-b4cf-75b5722c375c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:34:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:30.518 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d8e2a55c-00cd-4752-8a0b-166e737db82a]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap2fd77df8-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 546799, 'tstamp': 546799}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234499, 'error': None, 'target': 'ovnmeta-2fd77df8-cf00-4afc-b4cf-75b5722c375c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap2fd77df8-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 546801, 'tstamp': 546801}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234499, 'error': None, 'target': 'ovnmeta-2fd77df8-cf00-4afc-b4cf-75b5722c375c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:34:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:30.520 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2fd77df8-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.522 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.530 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:30.531 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2fd77df8-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:34:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:30.531 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:34:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:30.532 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2fd77df8-c0, col_values=(('external_ids', {'iface-id': '0297f784-1a41-4744-b018-f503dfa93754'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:34:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:30.532 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.575 183079 INFO nova.virt.libvirt.driver [-] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Instance destroyed successfully.
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.576 183079 DEBUG nova.objects.instance [None req-14a82288-e132-48a2-b2c8-4edc45363269 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lazy-loading 'resources' on Instance uuid eb04fe29-6d1c-4572-b219-f60350425077 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.716 183079 DEBUG nova.virt.libvirt.vif [None req-14a82288-e132-48a2-b2c8-4edc45363269 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:32:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-887859950',display_name='tempest-server-test-887859950',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-887859950',id=56,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI5QyK5QBIiC5rcI23u+Ha4rToriS54oXRHOciJ+8yg9OiIFQ5pHcofppLwzDPzqktD3JMTTskAgQadosoWLVCa44nM7NokRRlJk11u7nt0exfz9e0AepzmOn9wpcPYeVg==',key_name='tempest-keypair-test-1290435476',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:33:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='22f75e117e724f9aaadf5b8fd25a6ef6',ramdisk_id='',reservation_id='r-kfvkuair',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-StatelessSecGroupDualStackDHCPv6StatelessTest-80323303',owner_user_name='tempest-StatelessSecGroupDualStackDHCPv6StatelessTest-80323303-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:33:19Z,user_data=None,user_id='c4621e42483d4f49b9a97f2b7eb886dc',uuid=eb04fe29-6d1c-4572-b219-f60350425077,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7a42a264-5341-4ac6-8da3-c317a7b2c279", "address": "fa:16:3e:f5:0b:0f", "network": {"id": "2fd77df8-cf00-4afc-b4cf-75b5722c375c", "bridge": "br-int", "label": "tempest-test-network--158385692", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef5:b0f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22f75e117e724f9aaadf5b8fd25a6ef6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a42a264-53", "ovs_interfaceid": "7a42a264-5341-4ac6-8da3-c317a7b2c279", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.716 183079 DEBUG nova.network.os_vif_util [None req-14a82288-e132-48a2-b2c8-4edc45363269 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Converting VIF {"id": "7a42a264-5341-4ac6-8da3-c317a7b2c279", "address": "fa:16:3e:f5:0b:0f", "network": {"id": "2fd77df8-cf00-4afc-b4cf-75b5722c375c", "bridge": "br-int", "label": "tempest-test-network--158385692", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef5:b0f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22f75e117e724f9aaadf5b8fd25a6ef6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a42a264-53", "ovs_interfaceid": "7a42a264-5341-4ac6-8da3-c317a7b2c279", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.717 183079 DEBUG nova.network.os_vif_util [None req-14a82288-e132-48a2-b2c8-4edc45363269 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f5:0b:0f,bridge_name='br-int',has_traffic_filtering=True,id=7a42a264-5341-4ac6-8da3-c317a7b2c279,network=Network(2fd77df8-cf00-4afc-b4cf-75b5722c375c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a42a264-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.717 183079 DEBUG os_vif [None req-14a82288-e132-48a2-b2c8-4edc45363269 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f5:0b:0f,bridge_name='br-int',has_traffic_filtering=True,id=7a42a264-5341-4ac6-8da3-c317a7b2c279,network=Network(2fd77df8-cf00-4afc-b4cf-75b5722c375c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a42a264-53') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.718 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.718 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7a42a264-53, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.719 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.720 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.721 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.723 183079 INFO os_vif [None req-14a82288-e132-48a2-b2c8-4edc45363269 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f5:0b:0f,bridge_name='br-int',has_traffic_filtering=True,id=7a42a264-5341-4ac6-8da3-c317a7b2c279,network=Network(2fd77df8-cf00-4afc-b4cf-75b5722c375c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a42a264-53')
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.724 183079 INFO nova.virt.libvirt.driver [None req-14a82288-e132-48a2-b2c8-4edc45363269 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Deleting instance files /var/lib/nova/instances/eb04fe29-6d1c-4572-b219-f60350425077_del
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.724 183079 INFO nova.virt.libvirt.driver [None req-14a82288-e132-48a2-b2c8-4edc45363269 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Deletion of /var/lib/nova/instances/eb04fe29-6d1c-4572-b219-f60350425077_del complete
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.778 183079 INFO nova.compute.manager [None req-14a82288-e132-48a2-b2c8-4edc45363269 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Took 0.47 seconds to destroy the instance on the hypervisor.
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.779 183079 DEBUG oslo.service.loopingcall [None req-14a82288-e132-48a2-b2c8-4edc45363269 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.779 183079 DEBUG nova.compute.manager [-] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.779 183079 DEBUG nova.network.neutron [-] [instance: eb04fe29-6d1c-4572-b219-f60350425077] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.990 183079 DEBUG nova.compute.manager [req-2a8c2059-2111-41ea-8288-38383355940c req-d0376835-3756-4c1a-847a-56ee8d4bfe40 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Received event network-vif-unplugged-7a42a264-5341-4ac6-8da3-c317a7b2c279 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.990 183079 DEBUG oslo_concurrency.lockutils [req-2a8c2059-2111-41ea-8288-38383355940c req-d0376835-3756-4c1a-847a-56ee8d4bfe40 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "eb04fe29-6d1c-4572-b219-f60350425077-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.990 183079 DEBUG oslo_concurrency.lockutils [req-2a8c2059-2111-41ea-8288-38383355940c req-d0376835-3756-4c1a-847a-56ee8d4bfe40 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "eb04fe29-6d1c-4572-b219-f60350425077-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.990 183079 DEBUG oslo_concurrency.lockutils [req-2a8c2059-2111-41ea-8288-38383355940c req-d0376835-3756-4c1a-847a-56ee8d4bfe40 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "eb04fe29-6d1c-4572-b219-f60350425077-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.990 183079 DEBUG nova.compute.manager [req-2a8c2059-2111-41ea-8288-38383355940c req-d0376835-3756-4c1a-847a-56ee8d4bfe40 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] No waiting events found dispatching network-vif-unplugged-7a42a264-5341-4ac6-8da3-c317a7b2c279 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:34:30 compute-0 nova_compute[183075]: 2026-01-22 17:34:30.991 183079 DEBUG nova.compute.manager [req-2a8c2059-2111-41ea-8288-38383355940c req-d0376835-3756-4c1a-847a-56ee8d4bfe40 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Received event network-vif-unplugged-7a42a264-5341-4ac6-8da3-c317a7b2c279 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:34:31 compute-0 nova_compute[183075]: 2026-01-22 17:34:31.016 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:34:32 compute-0 nova_compute[183075]: 2026-01-22 17:34:32.926 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:33 compute-0 nova_compute[183075]: 2026-01-22 17:34:33.108 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:33 compute-0 nova_compute[183075]: 2026-01-22 17:34:33.178 183079 DEBUG nova.network.neutron [-] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:34:33 compute-0 nova_compute[183075]: 2026-01-22 17:34:33.197 183079 DEBUG nova.compute.manager [req-5201565c-aec9-4f9e-a464-305f4f2756da req-9c428e87-fa19-48e8-af42-29dc2d0724a5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Received event network-vif-plugged-7a42a264-5341-4ac6-8da3-c317a7b2c279 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:34:33 compute-0 nova_compute[183075]: 2026-01-22 17:34:33.197 183079 DEBUG oslo_concurrency.lockutils [req-5201565c-aec9-4f9e-a464-305f4f2756da req-9c428e87-fa19-48e8-af42-29dc2d0724a5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "eb04fe29-6d1c-4572-b219-f60350425077-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:34:33 compute-0 nova_compute[183075]: 2026-01-22 17:34:33.197 183079 DEBUG oslo_concurrency.lockutils [req-5201565c-aec9-4f9e-a464-305f4f2756da req-9c428e87-fa19-48e8-af42-29dc2d0724a5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "eb04fe29-6d1c-4572-b219-f60350425077-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:34:33 compute-0 nova_compute[183075]: 2026-01-22 17:34:33.198 183079 DEBUG oslo_concurrency.lockutils [req-5201565c-aec9-4f9e-a464-305f4f2756da req-9c428e87-fa19-48e8-af42-29dc2d0724a5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "eb04fe29-6d1c-4572-b219-f60350425077-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:34:33 compute-0 nova_compute[183075]: 2026-01-22 17:34:33.198 183079 DEBUG nova.compute.manager [req-5201565c-aec9-4f9e-a464-305f4f2756da req-9c428e87-fa19-48e8-af42-29dc2d0724a5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] No waiting events found dispatching network-vif-plugged-7a42a264-5341-4ac6-8da3-c317a7b2c279 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:34:33 compute-0 nova_compute[183075]: 2026-01-22 17:34:33.198 183079 WARNING nova.compute.manager [req-5201565c-aec9-4f9e-a464-305f4f2756da req-9c428e87-fa19-48e8-af42-29dc2d0724a5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Received unexpected event network-vif-plugged-7a42a264-5341-4ac6-8da3-c317a7b2c279 for instance with vm_state active and task_state deleting.
Jan 22 17:34:33 compute-0 nova_compute[183075]: 2026-01-22 17:34:33.199 183079 INFO nova.compute.manager [-] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Took 2.42 seconds to deallocate network for instance.
Jan 22 17:34:33 compute-0 nova_compute[183075]: 2026-01-22 17:34:33.275 183079 DEBUG oslo_concurrency.lockutils [None req-14a82288-e132-48a2-b2c8-4edc45363269 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:34:33 compute-0 nova_compute[183075]: 2026-01-22 17:34:33.275 183079 DEBUG oslo_concurrency.lockutils [None req-14a82288-e132-48a2-b2c8-4edc45363269 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:34:33 compute-0 nova_compute[183075]: 2026-01-22 17:34:33.504 183079 DEBUG nova.compute.provider_tree [None req-14a82288-e132-48a2-b2c8-4edc45363269 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:34:33 compute-0 nova_compute[183075]: 2026-01-22 17:34:33.536 183079 DEBUG nova.scheduler.client.report [None req-14a82288-e132-48a2-b2c8-4edc45363269 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:34:33 compute-0 nova_compute[183075]: 2026-01-22 17:34:33.566 183079 DEBUG oslo_concurrency.lockutils [None req-14a82288-e132-48a2-b2c8-4edc45363269 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.291s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:34:34 compute-0 nova_compute[183075]: 2026-01-22 17:34:34.165 183079 INFO nova.scheduler.client.report [None req-14a82288-e132-48a2-b2c8-4edc45363269 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Deleted allocations for instance eb04fe29-6d1c-4572-b219-f60350425077
Jan 22 17:34:34 compute-0 nova_compute[183075]: 2026-01-22 17:34:34.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:34:35 compute-0 nova_compute[183075]: 2026-01-22 17:34:35.720 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:35 compute-0 nova_compute[183075]: 2026-01-22 17:34:35.832 183079 DEBUG oslo_concurrency.lockutils [None req-14a82288-e132-48a2-b2c8-4edc45363269 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "eb04fe29-6d1c-4572-b219-f60350425077" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.529s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:34:35 compute-0 nova_compute[183075]: 2026-01-22 17:34:35.970 183079 DEBUG nova.compute.manager [req-7cdd5b02-38b8-449c-a8ae-fbb7c28b8848 req-c6a70a44-cfc8-4ece-9318-52939961ba6d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Received event network-vif-deleted-7a42a264-5341-4ac6-8da3-c317a7b2c279 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:34:36 compute-0 nova_compute[183075]: 2026-01-22 17:34:36.971 183079 DEBUG oslo_concurrency.lockutils [None req-67bdbd38-be38-49ac-89dc-4c1e83426384 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Acquiring lock "81396ea9-a9a1-4a21-9808-608e45a7aa03" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:34:36 compute-0 nova_compute[183075]: 2026-01-22 17:34:36.971 183079 DEBUG oslo_concurrency.lockutils [None req-67bdbd38-be38-49ac-89dc-4c1e83426384 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "81396ea9-a9a1-4a21-9808-608e45a7aa03" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:34:36 compute-0 nova_compute[183075]: 2026-01-22 17:34:36.972 183079 DEBUG oslo_concurrency.lockutils [None req-67bdbd38-be38-49ac-89dc-4c1e83426384 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Acquiring lock "81396ea9-a9a1-4a21-9808-608e45a7aa03-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:34:36 compute-0 nova_compute[183075]: 2026-01-22 17:34:36.972 183079 DEBUG oslo_concurrency.lockutils [None req-67bdbd38-be38-49ac-89dc-4c1e83426384 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "81396ea9-a9a1-4a21-9808-608e45a7aa03-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:34:36 compute-0 nova_compute[183075]: 2026-01-22 17:34:36.972 183079 DEBUG oslo_concurrency.lockutils [None req-67bdbd38-be38-49ac-89dc-4c1e83426384 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "81396ea9-a9a1-4a21-9808-608e45a7aa03-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:34:36 compute-0 nova_compute[183075]: 2026-01-22 17:34:36.974 183079 INFO nova.compute.manager [None req-67bdbd38-be38-49ac-89dc-4c1e83426384 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Terminating instance
Jan 22 17:34:36 compute-0 nova_compute[183075]: 2026-01-22 17:34:36.975 183079 DEBUG nova.compute.manager [None req-67bdbd38-be38-49ac-89dc-4c1e83426384 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:34:36 compute-0 kernel: tap0290774a-bb (unregistering): left promiscuous mode
Jan 22 17:34:36 compute-0 NetworkManager[55454]: <info>  [1769103276.9967] device (tap0290774a-bb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:34:37 compute-0 ovn_controller[95372]: 2026-01-22T17:34:37Z|00621|binding|INFO|Releasing lport 0290774a-bbee-4523-b342-ddb24aca4826 from this chassis (sb_readonly=0)
Jan 22 17:34:37 compute-0 ovn_controller[95372]: 2026-01-22T17:34:37Z|00622|binding|INFO|Setting lport 0290774a-bbee-4523-b342-ddb24aca4826 down in Southbound
Jan 22 17:34:37 compute-0 nova_compute[183075]: 2026-01-22 17:34:37.002 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:37 compute-0 ovn_controller[95372]: 2026-01-22T17:34:37Z|00623|binding|INFO|Removing iface tap0290774a-bb ovn-installed in OVS
Jan 22 17:34:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:37.011 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:a0:be 10.100.0.3 2001:db8::f816:3eff:fe60:a0be'], port_security=['fa:16:3e:60:a0:be 10.100.0.3 2001:db8::f816:3eff:fe60:a0be'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:fe60:a0be/64', 'neutron:device_id': '81396ea9-a9a1-4a21-9808-608e45a7aa03', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2fd77df8-cf00-4afc-b4cf-75b5722c375c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '22f75e117e724f9aaadf5b8fd25a6ef6', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e838e551-4083-4143-b761-54a81d27a6c4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.236'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6159b68b-3c7b-43be-9fb9-7f846f3d3eb8, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=0290774a-bbee-4523-b342-ddb24aca4826) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:34:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:37.014 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 0290774a-bbee-4523-b342-ddb24aca4826 in datapath 2fd77df8-cf00-4afc-b4cf-75b5722c375c unbound from our chassis
Jan 22 17:34:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:37.017 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2fd77df8-cf00-4afc-b4cf-75b5722c375c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:34:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:37.019 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6d866d54-ea43-4af2-835c-9ed371fce8be]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:34:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:37.020 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2fd77df8-cf00-4afc-b4cf-75b5722c375c namespace which is not needed anymore
Jan 22 17:34:37 compute-0 nova_compute[183075]: 2026-01-22 17:34:37.022 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:37 compute-0 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000036.scope: Deactivated successfully.
Jan 22 17:34:37 compute-0 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000036.scope: Consumed 19.171s CPU time.
Jan 22 17:34:37 compute-0 systemd-machined[154382]: Machine qemu-54-instance-00000036 terminated.
Jan 22 17:34:37 compute-0 podman[234521]: 2026-01-22 17:34:37.117669278 +0000 UTC m=+0.072673760 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, release=1755695350, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, config_id=openstack_network_exporter, architecture=x86_64, vcs-type=git)
Jan 22 17:34:37 compute-0 podman[234519]: 2026-01-22 17:34:37.120076313 +0000 UTC m=+0.080944564 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 22 17:34:37 compute-0 podman[234517]: 2026-01-22 17:34:37.126462097 +0000 UTC m=+0.096043266 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, 
config_id=ovn_controller, container_name=ovn_controller)
Jan 22 17:34:37 compute-0 neutron-haproxy-ovnmeta-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233418]: [NOTICE]   (233422) : haproxy version is 2.8.14-c23fe91
Jan 22 17:34:37 compute-0 neutron-haproxy-ovnmeta-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233418]: [NOTICE]   (233422) : path to executable is /usr/sbin/haproxy
Jan 22 17:34:37 compute-0 neutron-haproxy-ovnmeta-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233418]: [WARNING]  (233422) : Exiting Master process...
Jan 22 17:34:37 compute-0 neutron-haproxy-ovnmeta-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233418]: [ALERT]    (233422) : Current worker (233424) exited with code 143 (Terminated)
Jan 22 17:34:37 compute-0 neutron-haproxy-ovnmeta-2fd77df8-cf00-4afc-b4cf-75b5722c375c[233418]: [WARNING]  (233422) : All workers exited. Exiting... (0)
Jan 22 17:34:37 compute-0 systemd[1]: libpod-0398fca3db2a42f4180be2b0d9abe8ff7436042cbc3c1e6d2a2aaf6a5516a2d1.scope: Deactivated successfully.
Jan 22 17:34:37 compute-0 podman[234589]: 2026-01-22 17:34:37.203594547 +0000 UTC m=+0.094827433 container died 0398fca3db2a42f4180be2b0d9abe8ff7436042cbc3c1e6d2a2aaf6a5516a2d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2fd77df8-cf00-4afc-b4cf-75b5722c375c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:34:37 compute-0 nova_compute[183075]: 2026-01-22 17:34:37.236 183079 INFO nova.virt.libvirt.driver [-] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Instance destroyed successfully.
Jan 22 17:34:37 compute-0 nova_compute[183075]: 2026-01-22 17:34:37.237 183079 DEBUG nova.objects.instance [None req-67bdbd38-be38-49ac-89dc-4c1e83426384 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lazy-loading 'resources' on Instance uuid 81396ea9-a9a1-4a21-9808-608e45a7aa03 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:34:37 compute-0 nova_compute[183075]: 2026-01-22 17:34:37.262 183079 DEBUG nova.virt.libvirt.vif [None req-67bdbd38-be38-49ac-89dc-4c1e83426384 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:31:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-761314490',display_name='tempest-server-test-761314490',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-761314490',id=54,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI5QyK5QBIiC5rcI23u+Ha4rToriS54oXRHOciJ+8yg9OiIFQ5pHcofppLwzDPzqktD3JMTTskAgQadosoWLVCa44nM7NokRRlJk11u7nt0exfz9e0AepzmOn9wpcPYeVg==',key_name='tempest-keypair-test-1290435476',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:31:44Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='22f75e117e724f9aaadf5b8fd25a6ef6',ramdisk_id='',reservation_id='r-uvhq1w36',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_i
nput_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-StatelessSecGroupDualStackDHCPv6StatelessTest-80323303',owner_user_name='tempest-StatelessSecGroupDualStackDHCPv6StatelessTest-80323303-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:31:44Z,user_data=None,user_id='c4621e42483d4f49b9a97f2b7eb886dc',uuid=81396ea9-a9a1-4a21-9808-608e45a7aa03,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0290774a-bbee-4523-b342-ddb24aca4826", "address": "fa:16:3e:60:a0:be", "network": {"id": "2fd77df8-cf00-4afc-b4cf-75b5722c375c", "bridge": "br-int", "label": "tempest-test-network--158385692", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe60:a0be", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22f75e117e724f9aaadf5b8fd25a6ef6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0290774a-bb", "ovs_interfaceid": "0290774a-bbee-4523-b342-ddb24aca4826", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": 
"normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:34:37 compute-0 nova_compute[183075]: 2026-01-22 17:34:37.262 183079 DEBUG nova.network.os_vif_util [None req-67bdbd38-be38-49ac-89dc-4c1e83426384 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Converting VIF {"id": "0290774a-bbee-4523-b342-ddb24aca4826", "address": "fa:16:3e:60:a0:be", "network": {"id": "2fd77df8-cf00-4afc-b4cf-75b5722c375c", "bridge": "br-int", "label": "tempest-test-network--158385692", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe60:a0be", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22f75e117e724f9aaadf5b8fd25a6ef6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0290774a-bb", "ovs_interfaceid": "0290774a-bbee-4523-b342-ddb24aca4826", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:34:37 compute-0 nova_compute[183075]: 2026-01-22 17:34:37.263 183079 DEBUG nova.network.os_vif_util [None req-67bdbd38-be38-49ac-89dc-4c1e83426384 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:60:a0:be,bridge_name='br-int',has_traffic_filtering=True,id=0290774a-bbee-4523-b342-ddb24aca4826,network=Network(2fd77df8-cf00-4afc-b4cf-75b5722c375c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0290774a-bb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:34:37 compute-0 nova_compute[183075]: 2026-01-22 17:34:37.263 183079 DEBUG os_vif [None req-67bdbd38-be38-49ac-89dc-4c1e83426384 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:60:a0:be,bridge_name='br-int',has_traffic_filtering=True,id=0290774a-bbee-4523-b342-ddb24aca4826,network=Network(2fd77df8-cf00-4afc-b4cf-75b5722c375c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0290774a-bb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:34:37 compute-0 nova_compute[183075]: 2026-01-22 17:34:37.265 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:37 compute-0 nova_compute[183075]: 2026-01-22 17:34:37.266 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0290774a-bb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:34:37 compute-0 nova_compute[183075]: 2026-01-22 17:34:37.268 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:37 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0398fca3db2a42f4180be2b0d9abe8ff7436042cbc3c1e6d2a2aaf6a5516a2d1-userdata-shm.mount: Deactivated successfully.
Jan 22 17:34:37 compute-0 nova_compute[183075]: 2026-01-22 17:34:37.270 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:37 compute-0 systemd[1]: var-lib-containers-storage-overlay-effe2baf214300b88948e861438c671909962aa2332964dd7bf3065cfe4fe4c6-merged.mount: Deactivated successfully.
Jan 22 17:34:37 compute-0 nova_compute[183075]: 2026-01-22 17:34:37.273 183079 INFO os_vif [None req-67bdbd38-be38-49ac-89dc-4c1e83426384 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:60:a0:be,bridge_name='br-int',has_traffic_filtering=True,id=0290774a-bbee-4523-b342-ddb24aca4826,network=Network(2fd77df8-cf00-4afc-b4cf-75b5722c375c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0290774a-bb')
Jan 22 17:34:37 compute-0 nova_compute[183075]: 2026-01-22 17:34:37.275 183079 INFO nova.virt.libvirt.driver [None req-67bdbd38-be38-49ac-89dc-4c1e83426384 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Deleting instance files /var/lib/nova/instances/81396ea9-a9a1-4a21-9808-608e45a7aa03_del
Jan 22 17:34:37 compute-0 nova_compute[183075]: 2026-01-22 17:34:37.275 183079 INFO nova.virt.libvirt.driver [None req-67bdbd38-be38-49ac-89dc-4c1e83426384 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Deletion of /var/lib/nova/instances/81396ea9-a9a1-4a21-9808-608e45a7aa03_del complete
Jan 22 17:34:37 compute-0 podman[234589]: 2026-01-22 17:34:37.282604508 +0000 UTC m=+0.173837394 container cleanup 0398fca3db2a42f4180be2b0d9abe8ff7436042cbc3c1e6d2a2aaf6a5516a2d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2fd77df8-cf00-4afc-b4cf-75b5722c375c, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 22 17:34:37 compute-0 systemd[1]: libpod-conmon-0398fca3db2a42f4180be2b0d9abe8ff7436042cbc3c1e6d2a2aaf6a5516a2d1.scope: Deactivated successfully.
Jan 22 17:34:37 compute-0 nova_compute[183075]: 2026-01-22 17:34:37.348 183079 INFO nova.compute.manager [None req-67bdbd38-be38-49ac-89dc-4c1e83426384 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Took 0.37 seconds to destroy the instance on the hypervisor.
Jan 22 17:34:37 compute-0 nova_compute[183075]: 2026-01-22 17:34:37.349 183079 DEBUG oslo.service.loopingcall [None req-67bdbd38-be38-49ac-89dc-4c1e83426384 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:34:37 compute-0 nova_compute[183075]: 2026-01-22 17:34:37.349 183079 DEBUG nova.compute.manager [-] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:34:37 compute-0 nova_compute[183075]: 2026-01-22 17:34:37.349 183079 DEBUG nova.network.neutron [-] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:34:37 compute-0 podman[234644]: 2026-01-22 17:34:37.36351169 +0000 UTC m=+0.061452544 container remove 0398fca3db2a42f4180be2b0d9abe8ff7436042cbc3c1e6d2a2aaf6a5516a2d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2fd77df8-cf00-4afc-b4cf-75b5722c375c, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:34:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:37.368 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[afa9e060-df05-41b1-954a-8d3008391134]: (4, ('Thu Jan 22 05:34:37 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-2fd77df8-cf00-4afc-b4cf-75b5722c375c (0398fca3db2a42f4180be2b0d9abe8ff7436042cbc3c1e6d2a2aaf6a5516a2d1)\n0398fca3db2a42f4180be2b0d9abe8ff7436042cbc3c1e6d2a2aaf6a5516a2d1\nThu Jan 22 05:34:37 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-2fd77df8-cf00-4afc-b4cf-75b5722c375c (0398fca3db2a42f4180be2b0d9abe8ff7436042cbc3c1e6d2a2aaf6a5516a2d1)\n0398fca3db2a42f4180be2b0d9abe8ff7436042cbc3c1e6d2a2aaf6a5516a2d1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:34:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:37.370 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2e2595ba-7022-477e-b2ec-99e7d86c738a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:34:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:37.370 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2fd77df8-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:34:37 compute-0 nova_compute[183075]: 2026-01-22 17:34:37.372 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:37 compute-0 kernel: tap2fd77df8-c0: left promiscuous mode
Jan 22 17:34:37 compute-0 nova_compute[183075]: 2026-01-22 17:34:37.384 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:37.387 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[68175d13-cb8c-407a-9f6e-122f644d5cf1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:34:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:37.410 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[06b0ce98-2fe0-453d-a2ba-3fe0a689fb7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:34:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:37.411 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[44664f7f-466e-4860-bbf9-90e9e99cd9a1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:34:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:37.426 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[46fdc897-eae0-4a6b-aa01-95e666857ce2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 546781, 'reachable_time': 38130, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234660, 'error': None, 'target': 'ovnmeta-2fd77df8-cf00-4afc-b4cf-75b5722c375c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:34:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:37.428 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2fd77df8-cf00-4afc-b4cf-75b5722c375c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:34:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:37.428 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[ecaf5376-0a41-4e2b-83a4-25a4267aa3c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:34:37 compute-0 systemd[1]: run-netns-ovnmeta\x2d2fd77df8\x2dcf00\x2d4afc\x2db4cf\x2d75b5722c375c.mount: Deactivated successfully.
Jan 22 17:34:37 compute-0 nova_compute[183075]: 2026-01-22 17:34:37.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:34:37 compute-0 nova_compute[183075]: 2026-01-22 17:34:37.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:34:37 compute-0 nova_compute[183075]: 2026-01-22 17:34:37.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:34:37 compute-0 nova_compute[183075]: 2026-01-22 17:34:37.918 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Jan 22 17:34:37 compute-0 nova_compute[183075]: 2026-01-22 17:34:37.918 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 17:34:37 compute-0 nova_compute[183075]: 2026-01-22 17:34:37.918 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:34:37 compute-0 nova_compute[183075]: 2026-01-22 17:34:37.919 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:34:37 compute-0 nova_compute[183075]: 2026-01-22 17:34:37.919 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:34:37 compute-0 nova_compute[183075]: 2026-01-22 17:34:37.949 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:38 compute-0 nova_compute[183075]: 2026-01-22 17:34:38.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:34:38 compute-0 nova_compute[183075]: 2026-01-22 17:34:38.829 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:34:38 compute-0 nova_compute[183075]: 2026-01-22 17:34:38.830 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:34:38 compute-0 nova_compute[183075]: 2026-01-22 17:34:38.830 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:34:38 compute-0 nova_compute[183075]: 2026-01-22 17:34:38.830 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:34:38 compute-0 nova_compute[183075]: 2026-01-22 17:34:38.888 183079 DEBUG oslo_concurrency.lockutils [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Acquiring lock "f62cb90c-e99d-43d4-bbac-06a79d9b1182" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:34:38 compute-0 nova_compute[183075]: 2026-01-22 17:34:38.888 183079 DEBUG oslo_concurrency.lockutils [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "f62cb90c-e99d-43d4-bbac-06a79d9b1182" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:34:38 compute-0 nova_compute[183075]: 2026-01-22 17:34:38.977 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:34:38 compute-0 nova_compute[183075]: 2026-01-22 17:34:38.978 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5697MB free_disk=73.35983657836914GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:34:38 compute-0 nova_compute[183075]: 2026-01-22 17:34:38.978 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:34:38 compute-0 nova_compute[183075]: 2026-01-22 17:34:38.978 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:34:38 compute-0 nova_compute[183075]: 2026-01-22 17:34:38.986 183079 DEBUG nova.compute.manager [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.071 183079 DEBUG nova.network.neutron [-] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.082 183079 DEBUG oslo_concurrency.lockutils [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.087 183079 INFO nova.compute.manager [-] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Took 1.74 seconds to deallocate network for instance.
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.099 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 81396ea9-a9a1-4a21-9808-608e45a7aa03 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.123 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance f62cb90c-e99d-43d4-bbac-06a79d9b1182 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1692
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.123 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.123 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.129 183079 DEBUG oslo_concurrency.lockutils [None req-67bdbd38-be38-49ac-89dc-4c1e83426384 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.166 183079 DEBUG nova.compute.manager [req-4f55169e-edbc-4849-8eac-ff255f377973 req-873444b0-94b0-40e0-973d-840a36fcfbc7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Received event network-vif-deleted-0290774a-bbee-4523-b342-ddb24aca4826 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.184 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.199 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.223 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.223 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.245s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.224 183079 DEBUG oslo_concurrency.lockutils [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.230 183079 DEBUG nova.virt.hardware [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.230 183079 INFO nova.compute.claims [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.333 183079 DEBUG nova.compute.provider_tree [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.347 183079 DEBUG nova.scheduler.client.report [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.366 183079 DEBUG oslo_concurrency.lockutils [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.366 183079 DEBUG nova.compute.manager [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.369 183079 DEBUG oslo_concurrency.lockutils [None req-67bdbd38-be38-49ac-89dc-4c1e83426384 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.240s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.411 183079 DEBUG nova.compute.manager [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.411 183079 DEBUG nova.network.neutron [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.432 183079 INFO nova.virt.libvirt.driver [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.436 183079 DEBUG nova.compute.provider_tree [None req-67bdbd38-be38-49ac-89dc-4c1e83426384 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.451 183079 DEBUG nova.scheduler.client.report [None req-67bdbd38-be38-49ac-89dc-4c1e83426384 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.455 183079 DEBUG nova.compute.manager [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.494 183079 DEBUG oslo_concurrency.lockutils [None req-67bdbd38-be38-49ac-89dc-4c1e83426384 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.525 183079 INFO nova.scheduler.client.report [None req-67bdbd38-be38-49ac-89dc-4c1e83426384 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Deleted allocations for instance 81396ea9-a9a1-4a21-9808-608e45a7aa03
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.566 183079 DEBUG nova.compute.manager [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.567 183079 DEBUG nova.virt.libvirt.driver [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.568 183079 INFO nova.virt.libvirt.driver [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Creating image(s)
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.568 183079 DEBUG oslo_concurrency.lockutils [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Acquiring lock "/var/lib/nova/instances/f62cb90c-e99d-43d4-bbac-06a79d9b1182/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.568 183079 DEBUG oslo_concurrency.lockutils [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "/var/lib/nova/instances/f62cb90c-e99d-43d4-bbac-06a79d9b1182/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.569 183079 DEBUG oslo_concurrency.lockutils [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "/var/lib/nova/instances/f62cb90c-e99d-43d4-bbac-06a79d9b1182/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.581 183079 DEBUG nova.policy [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bf68afec168c4aa2ba7e47fdb3b026af', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8cfd5f99a92142bd829974004d0e603e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.584 183079 DEBUG oslo_concurrency.processutils [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.604 183079 DEBUG oslo_concurrency.lockutils [None req-67bdbd38-be38-49ac-89dc-4c1e83426384 c4621e42483d4f49b9a97f2b7eb886dc 22f75e117e724f9aaadf5b8fd25a6ef6 - - default default] Lock "81396ea9-a9a1-4a21-9808-608e45a7aa03" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.632s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.642 183079 DEBUG oslo_concurrency.processutils [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.643 183079 DEBUG oslo_concurrency.lockutils [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.644 183079 DEBUG oslo_concurrency.lockutils [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.655 183079 DEBUG oslo_concurrency.processutils [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.721 183079 DEBUG oslo_concurrency.processutils [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.722 183079 DEBUG oslo_concurrency.processutils [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/f62cb90c-e99d-43d4-bbac-06a79d9b1182/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.758 183079 DEBUG oslo_concurrency.processutils [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/f62cb90c-e99d-43d4-bbac-06a79d9b1182/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.759 183079 DEBUG oslo_concurrency.lockutils [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.760 183079 DEBUG oslo_concurrency.processutils [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.863 183079 DEBUG oslo_concurrency.processutils [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.103s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.864 183079 DEBUG nova.virt.disk.api [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Checking if we can resize image /var/lib/nova/instances/f62cb90c-e99d-43d4-bbac-06a79d9b1182/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.864 183079 DEBUG oslo_concurrency.processutils [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f62cb90c-e99d-43d4-bbac-06a79d9b1182/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.927 183079 DEBUG oslo_concurrency.processutils [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f62cb90c-e99d-43d4-bbac-06a79d9b1182/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.928 183079 DEBUG nova.virt.disk.api [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Cannot resize image /var/lib/nova/instances/f62cb90c-e99d-43d4-bbac-06a79d9b1182/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.929 183079 DEBUG nova.objects.instance [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lazy-loading 'migration_context' on Instance uuid f62cb90c-e99d-43d4-bbac-06a79d9b1182 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.951 183079 DEBUG nova.virt.libvirt.driver [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.951 183079 DEBUG nova.virt.libvirt.driver [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Ensure instance console log exists: /var/lib/nova/instances/f62cb90c-e99d-43d4-bbac-06a79d9b1182/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.951 183079 DEBUG oslo_concurrency.lockutils [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.952 183079 DEBUG oslo_concurrency.lockutils [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:34:39 compute-0 nova_compute[183075]: 2026-01-22 17:34:39.952 183079 DEBUG oslo_concurrency.lockutils [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:34:40 compute-0 nova_compute[183075]: 2026-01-22 17:34:40.233 183079 DEBUG nova.network.neutron [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Successfully created port: 509507c8-2a57-4e8e-aece-de4608aa6284 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:34:41 compute-0 nova_compute[183075]: 2026-01-22 17:34:41.045 183079 DEBUG nova.network.neutron [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Successfully updated port: 509507c8-2a57-4e8e-aece-de4608aa6284 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:34:41 compute-0 nova_compute[183075]: 2026-01-22 17:34:41.071 183079 DEBUG oslo_concurrency.lockutils [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Acquiring lock "refresh_cache-f62cb90c-e99d-43d4-bbac-06a79d9b1182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:34:41 compute-0 nova_compute[183075]: 2026-01-22 17:34:41.072 183079 DEBUG oslo_concurrency.lockutils [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Acquired lock "refresh_cache-f62cb90c-e99d-43d4-bbac-06a79d9b1182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:34:41 compute-0 nova_compute[183075]: 2026-01-22 17:34:41.072 183079 DEBUG nova.network.neutron [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:34:41 compute-0 podman[234677]: 2026-01-22 17:34:41.364560578 +0000 UTC m=+0.064299391 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 22 17:34:41 compute-0 nova_compute[183075]: 2026-01-22 17:34:41.384 183079 DEBUG nova.compute.manager [req-e0bc5926-169e-4be2-899c-a3b1150f73f4 req-c243df2e-c06c-418a-b4f6-7ffb434d5f30 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Received event network-changed-509507c8-2a57-4e8e-aece-de4608aa6284 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:34:41 compute-0 nova_compute[183075]: 2026-01-22 17:34:41.385 183079 DEBUG nova.compute.manager [req-e0bc5926-169e-4be2-899c-a3b1150f73f4 req-c243df2e-c06c-418a-b4f6-7ffb434d5f30 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Refreshing instance network info cache due to event network-changed-509507c8-2a57-4e8e-aece-de4608aa6284. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:34:41 compute-0 nova_compute[183075]: 2026-01-22 17:34:41.385 183079 DEBUG oslo_concurrency.lockutils [req-e0bc5926-169e-4be2-899c-a3b1150f73f4 req-c243df2e-c06c-418a-b4f6-7ffb434d5f30 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-f62cb90c-e99d-43d4-bbac-06a79d9b1182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:34:41 compute-0 nova_compute[183075]: 2026-01-22 17:34:41.502 183079 DEBUG nova.network.neutron [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:34:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:41.951 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:34:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:41.951 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:34:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:41.951 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:34:42 compute-0 nova_compute[183075]: 2026-01-22 17:34:42.269 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:42 compute-0 nova_compute[183075]: 2026-01-22 17:34:42.952 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.423 183079 DEBUG nova.network.neutron [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Updating instance_info_cache with network_info: [{"id": "509507c8-2a57-4e8e-aece-de4608aa6284", "address": "fa:16:3e:04:a8:2d", "network": {"id": "66f028c0-4f7e-4541-a188-d929c83780ea", "bridge": "br-int", "label": "tempest-test-network--1307842537", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8cfd5f99a92142bd829974004d0e603e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap509507c8-2a", "ovs_interfaceid": "509507c8-2a57-4e8e-aece-de4608aa6284", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.440 183079 DEBUG oslo_concurrency.lockutils [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Releasing lock "refresh_cache-f62cb90c-e99d-43d4-bbac-06a79d9b1182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.441 183079 DEBUG nova.compute.manager [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Instance network_info: |[{"id": "509507c8-2a57-4e8e-aece-de4608aa6284", "address": "fa:16:3e:04:a8:2d", "network": {"id": "66f028c0-4f7e-4541-a188-d929c83780ea", "bridge": "br-int", "label": "tempest-test-network--1307842537", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8cfd5f99a92142bd829974004d0e603e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap509507c8-2a", "ovs_interfaceid": "509507c8-2a57-4e8e-aece-de4608aa6284", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.443 183079 DEBUG oslo_concurrency.lockutils [req-e0bc5926-169e-4be2-899c-a3b1150f73f4 req-c243df2e-c06c-418a-b4f6-7ffb434d5f30 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-f62cb90c-e99d-43d4-bbac-06a79d9b1182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.444 183079 DEBUG nova.network.neutron [req-e0bc5926-169e-4be2-899c-a3b1150f73f4 req-c243df2e-c06c-418a-b4f6-7ffb434d5f30 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Refreshing network info cache for port 509507c8-2a57-4e8e-aece-de4608aa6284 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.448 183079 DEBUG nova.virt.libvirt.driver [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Start _get_guest_xml network_info=[{"id": "509507c8-2a57-4e8e-aece-de4608aa6284", "address": "fa:16:3e:04:a8:2d", "network": {"id": "66f028c0-4f7e-4541-a188-d929c83780ea", "bridge": "br-int", "label": "tempest-test-network--1307842537", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8cfd5f99a92142bd829974004d0e603e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap509507c8-2a", "ovs_interfaceid": "509507c8-2a57-4e8e-aece-de4608aa6284", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.456 183079 WARNING nova.virt.libvirt.driver [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.461 183079 DEBUG nova.virt.libvirt.host [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.462 183079 DEBUG nova.virt.libvirt.host [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.468 183079 DEBUG nova.virt.libvirt.host [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.469 183079 DEBUG nova.virt.libvirt.host [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.470 183079 DEBUG nova.virt.libvirt.driver [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.470 183079 DEBUG nova.virt.hardware [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.471 183079 DEBUG nova.virt.hardware [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.471 183079 DEBUG nova.virt.hardware [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.471 183079 DEBUG nova.virt.hardware [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.471 183079 DEBUG nova.virt.hardware [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.472 183079 DEBUG nova.virt.hardware [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.472 183079 DEBUG nova.virt.hardware [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.472 183079 DEBUG nova.virt.hardware [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.472 183079 DEBUG nova.virt.hardware [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.473 183079 DEBUG nova.virt.hardware [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.473 183079 DEBUG nova.virt.hardware [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.476 183079 DEBUG nova.virt.libvirt.vif [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:34:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1825252520',display_name='tempest-server-test-1825252520',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1825252520',id=57,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGomFw5Z3VR9MENpYtKeaxDUxtNH6GlF1leYJJ98KvwYvEsTZM9n4Wa79zXCEZq0VbSC6BefWHPMVcmuZdoxNG4KqsiiHTSMIJygbk2qiUn1+nbdXOT2eC+L+LWWgtGW8A==',key_name='tempest-keypair-test-857257872',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8cfd5f99a92142bd829974004d0e603e',ramdisk_id='',reservation_id='r-0g4xaa0m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='
virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortSecurityTest-1796635121',owner_user_name='tempest-PortSecurityTest-1796635121-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:34:39Z,user_data=None,user_id='bf68afec168c4aa2ba7e47fdb3b026af',uuid=f62cb90c-e99d-43d4-bbac-06a79d9b1182,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "509507c8-2a57-4e8e-aece-de4608aa6284", "address": "fa:16:3e:04:a8:2d", "network": {"id": "66f028c0-4f7e-4541-a188-d929c83780ea", "bridge": "br-int", "label": "tempest-test-network--1307842537", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8cfd5f99a92142bd829974004d0e603e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap509507c8-2a", "ovs_interfaceid": "509507c8-2a57-4e8e-aece-de4608aa6284", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.477 183079 DEBUG nova.network.os_vif_util [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Converting VIF {"id": "509507c8-2a57-4e8e-aece-de4608aa6284", "address": "fa:16:3e:04:a8:2d", "network": {"id": "66f028c0-4f7e-4541-a188-d929c83780ea", "bridge": "br-int", "label": "tempest-test-network--1307842537", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8cfd5f99a92142bd829974004d0e603e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap509507c8-2a", "ovs_interfaceid": "509507c8-2a57-4e8e-aece-de4608aa6284", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.478 183079 DEBUG nova.network.os_vif_util [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:a8:2d,bridge_name='br-int',has_traffic_filtering=True,id=509507c8-2a57-4e8e-aece-de4608aa6284,network=Network(66f028c0-4f7e-4541-a188-d929c83780ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap509507c8-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.479 183079 DEBUG nova.objects.instance [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lazy-loading 'pci_devices' on Instance uuid f62cb90c-e99d-43d4-bbac-06a79d9b1182 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.497 183079 DEBUG nova.virt.libvirt.driver [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:34:43 compute-0 nova_compute[183075]:   <uuid>f62cb90c-e99d-43d4-bbac-06a79d9b1182</uuid>
Jan 22 17:34:43 compute-0 nova_compute[183075]:   <name>instance-00000039</name>
Jan 22 17:34:43 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:34:43 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:34:43 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:34:43 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1825252520</nova:name>
Jan 22 17:34:43 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:34:43</nova:creationTime>
Jan 22 17:34:43 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:34:43 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:34:43 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:34:43 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:34:43 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:34:43 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:34:43 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:34:43 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:34:43 compute-0 nova_compute[183075]:         <nova:user uuid="bf68afec168c4aa2ba7e47fdb3b026af">tempest-PortSecurityTest-1796635121-project-member</nova:user>
Jan 22 17:34:43 compute-0 nova_compute[183075]:         <nova:project uuid="8cfd5f99a92142bd829974004d0e603e">tempest-PortSecurityTest-1796635121</nova:project>
Jan 22 17:34:43 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:34:43 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:34:43 compute-0 nova_compute[183075]:         <nova:port uuid="509507c8-2a57-4e8e-aece-de4608aa6284">
Jan 22 17:34:43 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:34:43 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:34:43 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:34:43 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <system>
Jan 22 17:34:43 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:34:43 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:34:43 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:34:43 compute-0 nova_compute[183075]:       <entry name="serial">f62cb90c-e99d-43d4-bbac-06a79d9b1182</entry>
Jan 22 17:34:43 compute-0 nova_compute[183075]:       <entry name="uuid">f62cb90c-e99d-43d4-bbac-06a79d9b1182</entry>
Jan 22 17:34:43 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     </system>
Jan 22 17:34:43 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:34:43 compute-0 nova_compute[183075]:   <os>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:   </os>
Jan 22 17:34:43 compute-0 nova_compute[183075]:   <features>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:   </features>
Jan 22 17:34:43 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:34:43 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:34:43 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:34:43 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/f62cb90c-e99d-43d4-bbac-06a79d9b1182/disk"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:34:43 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:04:a8:2d"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:       <target dev="tap509507c8-2a"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:34:43 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/f62cb90c-e99d-43d4-bbac-06a79d9b1182/console.log" append="off"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <video>
Jan 22 17:34:43 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     </video>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:34:43 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:34:43 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:34:43 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:34:43 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:34:43 compute-0 nova_compute[183075]: </domain>
Jan 22 17:34:43 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.499 183079 DEBUG nova.compute.manager [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Preparing to wait for external event network-vif-plugged-509507c8-2a57-4e8e-aece-de4608aa6284 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.499 183079 DEBUG oslo_concurrency.lockutils [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Acquiring lock "f62cb90c-e99d-43d4-bbac-06a79d9b1182-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.499 183079 DEBUG oslo_concurrency.lockutils [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "f62cb90c-e99d-43d4-bbac-06a79d9b1182-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.499 183079 DEBUG oslo_concurrency.lockutils [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "f62cb90c-e99d-43d4-bbac-06a79d9b1182-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.500 183079 DEBUG nova.virt.libvirt.vif [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:34:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1825252520',display_name='tempest-server-test-1825252520',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1825252520',id=57,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGomFw5Z3VR9MENpYtKeaxDUxtNH6GlF1leYJJ98KvwYvEsTZM9n4Wa79zXCEZq0VbSC6BefWHPMVcmuZdoxNG4KqsiiHTSMIJygbk2qiUn1+nbdXOT2eC+L+LWWgtGW8A==',key_name='tempest-keypair-test-857257872',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8cfd5f99a92142bd829974004d0e603e',ramdisk_id='',reservation_id='r-0g4xaa0m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_r
ng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortSecurityTest-1796635121',owner_user_name='tempest-PortSecurityTest-1796635121-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:34:39Z,user_data=None,user_id='bf68afec168c4aa2ba7e47fdb3b026af',uuid=f62cb90c-e99d-43d4-bbac-06a79d9b1182,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "509507c8-2a57-4e8e-aece-de4608aa6284", "address": "fa:16:3e:04:a8:2d", "network": {"id": "66f028c0-4f7e-4541-a188-d929c83780ea", "bridge": "br-int", "label": "tempest-test-network--1307842537", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8cfd5f99a92142bd829974004d0e603e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap509507c8-2a", "ovs_interfaceid": "509507c8-2a57-4e8e-aece-de4608aa6284", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.500 183079 DEBUG nova.network.os_vif_util [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Converting VIF {"id": "509507c8-2a57-4e8e-aece-de4608aa6284", "address": "fa:16:3e:04:a8:2d", "network": {"id": "66f028c0-4f7e-4541-a188-d929c83780ea", "bridge": "br-int", "label": "tempest-test-network--1307842537", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8cfd5f99a92142bd829974004d0e603e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap509507c8-2a", "ovs_interfaceid": "509507c8-2a57-4e8e-aece-de4608aa6284", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.501 183079 DEBUG nova.network.os_vif_util [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:a8:2d,bridge_name='br-int',has_traffic_filtering=True,id=509507c8-2a57-4e8e-aece-de4608aa6284,network=Network(66f028c0-4f7e-4541-a188-d929c83780ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap509507c8-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.501 183079 DEBUG os_vif [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:a8:2d,bridge_name='br-int',has_traffic_filtering=True,id=509507c8-2a57-4e8e-aece-de4608aa6284,network=Network(66f028c0-4f7e-4541-a188-d929c83780ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap509507c8-2a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.502 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.502 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.502 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.504 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.504 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap509507c8-2a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.505 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap509507c8-2a, col_values=(('external_ids', {'iface-id': '509507c8-2a57-4e8e-aece-de4608aa6284', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:04:a8:2d', 'vm-uuid': 'f62cb90c-e99d-43d4-bbac-06a79d9b1182'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.506 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:43 compute-0 NetworkManager[55454]: <info>  [1769103283.5075] manager: (tap509507c8-2a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/264)
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.508 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.512 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.513 183079 INFO os_vif [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:a8:2d,bridge_name='br-int',has_traffic_filtering=True,id=509507c8-2a57-4e8e-aece-de4608aa6284,network=Network(66f028c0-4f7e-4541-a188-d929c83780ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap509507c8-2a')
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.566 183079 DEBUG nova.virt.libvirt.driver [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.567 183079 DEBUG nova.virt.libvirt.driver [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] No VIF found with MAC fa:16:3e:04:a8:2d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:34:43 compute-0 kernel: tap509507c8-2a: entered promiscuous mode
Jan 22 17:34:43 compute-0 NetworkManager[55454]: <info>  [1769103283.6228] manager: (tap509507c8-2a): new Tun device (/org/freedesktop/NetworkManager/Devices/265)
Jan 22 17:34:43 compute-0 ovn_controller[95372]: 2026-01-22T17:34:43Z|00624|binding|INFO|Claiming lport 509507c8-2a57-4e8e-aece-de4608aa6284 for this chassis.
Jan 22 17:34:43 compute-0 ovn_controller[95372]: 2026-01-22T17:34:43Z|00625|binding|INFO|509507c8-2a57-4e8e-aece-de4608aa6284: Claiming fa:16:3e:04:a8:2d 10.100.0.7
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.626 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:43.632 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:a8:2d 10.100.0.7'], port_security=['fa:16:3e:04:a8:2d 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'f62cb90c-e99d-43d4-bbac-06a79d9b1182', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66f028c0-4f7e-4541-a188-d929c83780ea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8cfd5f99a92142bd829974004d0e603e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '759464aa-5d78-4dff-baf2-2a2f16ceb397', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8e9a6e9-2a99-4df6-8e05-39265d2551ea, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=509507c8-2a57-4e8e-aece-de4608aa6284) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:43.633 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 509507c8-2a57-4e8e-aece-de4608aa6284 in datapath 66f028c0-4f7e-4541-a188-d929c83780ea bound to our chassis
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:43.634 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 66f028c0-4f7e-4541-a188-d929c83780ea
Jan 22 17:34:43 compute-0 ovn_controller[95372]: 2026-01-22T17:34:43Z|00626|binding|INFO|Setting lport 509507c8-2a57-4e8e-aece-de4608aa6284 ovn-installed in OVS
Jan 22 17:34:43 compute-0 ovn_controller[95372]: 2026-01-22T17:34:43Z|00627|binding|INFO|Setting lport 509507c8-2a57-4e8e-aece-de4608aa6284 up in Southbound
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.640 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:43.650 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c5f3b305-c5c5-41cc-a41a-8463900f7cf6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:43.651 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap66f028c0-41 in ovnmeta-66f028c0-4f7e-4541-a188-d929c83780ea namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:43.653 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap66f028c0-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:43.653 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[47dd9dac-5eea-4357-9959-7732b6860137]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:34:43 compute-0 systemd-udevd[234713]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:43.656 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b7cf630f-1481-4482-9315-1c3b97bb0447]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:34:43 compute-0 NetworkManager[55454]: <info>  [1769103283.6681] device (tap509507c8-2a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:34:43 compute-0 NetworkManager[55454]: <info>  [1769103283.6698] device (tap509507c8-2a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:34:43 compute-0 systemd-machined[154382]: New machine qemu-57-instance-00000039.
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:43.674 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[5c409a35-4944-425e-b3c8-41ee65c35481]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:34:43 compute-0 systemd[1]: Started Virtual Machine qemu-57-instance-00000039.
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:43.689 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e1e0b26b-13ba-4faf-9030-dd3fa443fbcb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:43.717 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[f3572216-fe6e-406c-ba57-d804a63a7765]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:43.721 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[64c152fa-ccf8-453c-960e-a8deb92616cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:34:43 compute-0 systemd-udevd[234718]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:34:43 compute-0 NetworkManager[55454]: <info>  [1769103283.7231] manager: (tap66f028c0-40): new Veth device (/org/freedesktop/NetworkManager/Devices/266)
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:43.752 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[4de8b26b-9269-465b-9100-5bafdb4f7b32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:43.755 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[da71fb6f-83cb-4694-a34d-c469fdf3d79c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:34:43 compute-0 NetworkManager[55454]: <info>  [1769103283.7768] device (tap66f028c0-40): carrier: link connected
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:43.783 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[d0125957-5bc0-4c2d-9070-8f6e256c664e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:43.799 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[dc35b5bd-30e3-4bce-856b-47aa3e32280a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap66f028c0-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5e:7a:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 178], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 564747, 'reachable_time': 41869, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234747, 'error': None, 'target': 'ovnmeta-66f028c0-4f7e-4541-a188-d929c83780ea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:43.813 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[5900f2fa-de5b-45b8-98f4-536508fbd23a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5e:7aff'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 564747, 'tstamp': 564747}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234748, 'error': None, 'target': 'ovnmeta-66f028c0-4f7e-4541-a188-d929c83780ea', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:43.829 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4393755b-7009-49ea-a67f-684edea85d41]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap66f028c0-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5e:7a:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 178], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 564747, 'reachable_time': 41869, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234749, 'error': None, 'target': 'ovnmeta-66f028c0-4f7e-4541-a188-d929c83780ea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:43.860 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1576294f-98e7-4106-8217-e1d741cd5df9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:43.910 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[09c750a1-0722-413d-bef2-8986942e494d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:43.911 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66f028c0-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:43.911 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:43.912 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66f028c0-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:34:43 compute-0 NetworkManager[55454]: <info>  [1769103283.9143] manager: (tap66f028c0-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/267)
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.913 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:43 compute-0 kernel: tap66f028c0-40: entered promiscuous mode
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.916 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:43.918 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap66f028c0-40, col_values=(('external_ids', {'iface-id': '836e8f86-4bb7-45af-98e0-ef8f86eadaf8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.919 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:43 compute-0 ovn_controller[95372]: 2026-01-22T17:34:43Z|00628|binding|INFO|Releasing lport 836e8f86-4bb7-45af-98e0-ef8f86eadaf8 from this chassis (sb_readonly=0)
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.919 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:43.920 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/66f028c0-4f7e-4541-a188-d929c83780ea.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/66f028c0-4f7e-4541-a188-d929c83780ea.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:43.921 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[53842766-97a9-4802-af92-78b155dbfb4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:43.923 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-66f028c0-4f7e-4541-a188-d929c83780ea
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/66f028c0-4f7e-4541-a188-d929c83780ea.pid.haproxy
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 66f028c0-4f7e-4541-a188-d929c83780ea
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:34:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:34:43.924 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-66f028c0-4f7e-4541-a188-d929c83780ea', 'env', 'PROCESS_TAG=haproxy-66f028c0-4f7e-4541-a188-d929c83780ea', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/66f028c0-4f7e-4541-a188-d929c83780ea.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:34:43 compute-0 nova_compute[183075]: 2026-01-22 17:34:43.930 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:44 compute-0 podman[234783]: 2026-01-22 17:34:44.280082355 +0000 UTC m=+0.050029203 container create 6e1315e81220306b6aa7b1216a98dfcd2f1e5da3570f0b4df8fac07374687f98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66f028c0-4f7e-4541-a188-d929c83780ea, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:34:44 compute-0 nova_compute[183075]: 2026-01-22 17:34:44.320 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103284.3190312, f62cb90c-e99d-43d4-bbac-06a79d9b1182 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:34:44 compute-0 nova_compute[183075]: 2026-01-22 17:34:44.321 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] VM Started (Lifecycle Event)
Jan 22 17:34:44 compute-0 systemd[1]: Started libpod-conmon-6e1315e81220306b6aa7b1216a98dfcd2f1e5da3570f0b4df8fac07374687f98.scope.
Jan 22 17:34:44 compute-0 podman[234783]: 2026-01-22 17:34:44.250403417 +0000 UTC m=+0.020350285 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:34:44 compute-0 nova_compute[183075]: 2026-01-22 17:34:44.356 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:34:44 compute-0 nova_compute[183075]: 2026-01-22 17:34:44.360 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103284.3193305, f62cb90c-e99d-43d4-bbac-06a79d9b1182 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:34:44 compute-0 nova_compute[183075]: 2026-01-22 17:34:44.360 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] VM Paused (Lifecycle Event)
Jan 22 17:34:44 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:34:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a7a0a85aebe7bbd1f4bcd3c320fa18891e7f5de1f49efe07c635bbefa9058c2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:34:44 compute-0 nova_compute[183075]: 2026-01-22 17:34:44.388 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:34:44 compute-0 podman[234783]: 2026-01-22 17:34:44.388412714 +0000 UTC m=+0.158359592 container init 6e1315e81220306b6aa7b1216a98dfcd2f1e5da3570f0b4df8fac07374687f98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66f028c0-4f7e-4541-a188-d929c83780ea, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 17:34:44 compute-0 nova_compute[183075]: 2026-01-22 17:34:44.393 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:34:44 compute-0 podman[234783]: 2026-01-22 17:34:44.395081595 +0000 UTC m=+0.165028443 container start 6e1315e81220306b6aa7b1216a98dfcd2f1e5da3570f0b4df8fac07374687f98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66f028c0-4f7e-4541-a188-d929c83780ea, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 17:34:44 compute-0 neutron-haproxy-ovnmeta-66f028c0-4f7e-4541-a188-d929c83780ea[234803]: [NOTICE]   (234807) : New worker (234809) forked
Jan 22 17:34:44 compute-0 neutron-haproxy-ovnmeta-66f028c0-4f7e-4541-a188-d929c83780ea[234803]: [NOTICE]   (234807) : Loading success.
Jan 22 17:34:44 compute-0 nova_compute[183075]: 2026-01-22 17:34:44.422 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:34:45 compute-0 nova_compute[183075]: 2026-01-22 17:34:45.575 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769103270.5736465, eb04fe29-6d1c-4572-b219-f60350425077 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:34:45 compute-0 nova_compute[183075]: 2026-01-22 17:34:45.575 183079 INFO nova.compute.manager [-] [instance: eb04fe29-6d1c-4572-b219-f60350425077] VM Stopped (Lifecycle Event)
Jan 22 17:34:45 compute-0 nova_compute[183075]: 2026-01-22 17:34:45.604 183079 DEBUG nova.compute.manager [None req-e8b6ba91-4e57-4787-8006-ede6992e65de - - - - - -] [instance: eb04fe29-6d1c-4572-b219-f60350425077] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:34:46 compute-0 nova_compute[183075]: 2026-01-22 17:34:46.419 183079 DEBUG nova.compute.manager [req-9a6795c2-221e-4475-86a7-ab63c491ae15 req-3ea97114-704a-4751-aee0-e8237af4081a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Received event network-vif-plugged-509507c8-2a57-4e8e-aece-de4608aa6284 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:34:46 compute-0 nova_compute[183075]: 2026-01-22 17:34:46.420 183079 DEBUG oslo_concurrency.lockutils [req-9a6795c2-221e-4475-86a7-ab63c491ae15 req-3ea97114-704a-4751-aee0-e8237af4081a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "f62cb90c-e99d-43d4-bbac-06a79d9b1182-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:34:46 compute-0 nova_compute[183075]: 2026-01-22 17:34:46.420 183079 DEBUG oslo_concurrency.lockutils [req-9a6795c2-221e-4475-86a7-ab63c491ae15 req-3ea97114-704a-4751-aee0-e8237af4081a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "f62cb90c-e99d-43d4-bbac-06a79d9b1182-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:34:46 compute-0 nova_compute[183075]: 2026-01-22 17:34:46.420 183079 DEBUG oslo_concurrency.lockutils [req-9a6795c2-221e-4475-86a7-ab63c491ae15 req-3ea97114-704a-4751-aee0-e8237af4081a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "f62cb90c-e99d-43d4-bbac-06a79d9b1182-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:34:46 compute-0 nova_compute[183075]: 2026-01-22 17:34:46.421 183079 DEBUG nova.compute.manager [req-9a6795c2-221e-4475-86a7-ab63c491ae15 req-3ea97114-704a-4751-aee0-e8237af4081a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Processing event network-vif-plugged-509507c8-2a57-4e8e-aece-de4608aa6284 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:34:46 compute-0 nova_compute[183075]: 2026-01-22 17:34:46.424 183079 DEBUG nova.compute.manager [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:34:46 compute-0 nova_compute[183075]: 2026-01-22 17:34:46.427 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103286.4271984, f62cb90c-e99d-43d4-bbac-06a79d9b1182 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:34:46 compute-0 nova_compute[183075]: 2026-01-22 17:34:46.427 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] VM Resumed (Lifecycle Event)
Jan 22 17:34:46 compute-0 nova_compute[183075]: 2026-01-22 17:34:46.429 183079 DEBUG nova.virt.libvirt.driver [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:34:46 compute-0 nova_compute[183075]: 2026-01-22 17:34:46.432 183079 INFO nova.virt.libvirt.driver [-] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Instance spawned successfully.
Jan 22 17:34:46 compute-0 nova_compute[183075]: 2026-01-22 17:34:46.433 183079 DEBUG nova.virt.libvirt.driver [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:34:46 compute-0 nova_compute[183075]: 2026-01-22 17:34:46.450 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:34:46 compute-0 nova_compute[183075]: 2026-01-22 17:34:46.456 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:34:46 compute-0 nova_compute[183075]: 2026-01-22 17:34:46.460 183079 DEBUG nova.virt.libvirt.driver [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:34:46 compute-0 nova_compute[183075]: 2026-01-22 17:34:46.461 183079 DEBUG nova.virt.libvirt.driver [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:34:46 compute-0 nova_compute[183075]: 2026-01-22 17:34:46.462 183079 DEBUG nova.virt.libvirt.driver [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:34:46 compute-0 nova_compute[183075]: 2026-01-22 17:34:46.462 183079 DEBUG nova.virt.libvirt.driver [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:34:46 compute-0 nova_compute[183075]: 2026-01-22 17:34:46.463 183079 DEBUG nova.virt.libvirt.driver [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:34:46 compute-0 nova_compute[183075]: 2026-01-22 17:34:46.463 183079 DEBUG nova.virt.libvirt.driver [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:34:46 compute-0 nova_compute[183075]: 2026-01-22 17:34:46.491 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:34:46 compute-0 nova_compute[183075]: 2026-01-22 17:34:46.520 183079 INFO nova.compute.manager [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Took 6.95 seconds to spawn the instance on the hypervisor.
Jan 22 17:34:46 compute-0 nova_compute[183075]: 2026-01-22 17:34:46.521 183079 DEBUG nova.compute.manager [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:34:46 compute-0 nova_compute[183075]: 2026-01-22 17:34:46.581 183079 INFO nova.compute.manager [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Took 7.52 seconds to build instance.
Jan 22 17:34:46 compute-0 nova_compute[183075]: 2026-01-22 17:34:46.600 183079 DEBUG oslo_concurrency.lockutils [None req-2c4223cd-cbb2-4123-996f-1f49b26c58e8 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "f62cb90c-e99d-43d4-bbac-06a79d9b1182" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.712s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:34:47 compute-0 nova_compute[183075]: 2026-01-22 17:34:47.610 183079 DEBUG nova.network.neutron [req-e0bc5926-169e-4be2-899c-a3b1150f73f4 req-c243df2e-c06c-418a-b4f6-7ffb434d5f30 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Updated VIF entry in instance network info cache for port 509507c8-2a57-4e8e-aece-de4608aa6284. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:34:47 compute-0 nova_compute[183075]: 2026-01-22 17:34:47.612 183079 DEBUG nova.network.neutron [req-e0bc5926-169e-4be2-899c-a3b1150f73f4 req-c243df2e-c06c-418a-b4f6-7ffb434d5f30 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Updating instance_info_cache with network_info: [{"id": "509507c8-2a57-4e8e-aece-de4608aa6284", "address": "fa:16:3e:04:a8:2d", "network": {"id": "66f028c0-4f7e-4541-a188-d929c83780ea", "bridge": "br-int", "label": "tempest-test-network--1307842537", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8cfd5f99a92142bd829974004d0e603e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap509507c8-2a", "ovs_interfaceid": "509507c8-2a57-4e8e-aece-de4608aa6284", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:34:47 compute-0 nova_compute[183075]: 2026-01-22 17:34:47.632 183079 DEBUG oslo_concurrency.lockutils [req-e0bc5926-169e-4be2-899c-a3b1150f73f4 req-c243df2e-c06c-418a-b4f6-7ffb434d5f30 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-f62cb90c-e99d-43d4-bbac-06a79d9b1182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:34:47 compute-0 nova_compute[183075]: 2026-01-22 17:34:47.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:34:47 compute-0 nova_compute[183075]: 2026-01-22 17:34:47.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 17:34:47 compute-0 nova_compute[183075]: 2026-01-22 17:34:47.801 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 17:34:47 compute-0 nova_compute[183075]: 2026-01-22 17:34:47.994 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:48 compute-0 nova_compute[183075]: 2026-01-22 17:34:48.507 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:48 compute-0 nova_compute[183075]: 2026-01-22 17:34:48.812 183079 INFO nova.compute.manager [None req-5fa2a2dc-9627-44d6-a1cb-9b5e31dc0faf bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Get console output
Jan 22 17:34:48 compute-0 nova_compute[183075]: 2026-01-22 17:34:48.820 183079 DEBUG nova.compute.manager [req-3cf03688-35a5-40dd-83ea-8e143c34c610 req-12895987-254f-4882-9331-983b91f7b877 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Received event network-vif-plugged-509507c8-2a57-4e8e-aece-de4608aa6284 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:34:48 compute-0 nova_compute[183075]: 2026-01-22 17:34:48.821 183079 DEBUG oslo_concurrency.lockutils [req-3cf03688-35a5-40dd-83ea-8e143c34c610 req-12895987-254f-4882-9331-983b91f7b877 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "f62cb90c-e99d-43d4-bbac-06a79d9b1182-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:34:48 compute-0 nova_compute[183075]: 2026-01-22 17:34:48.821 183079 DEBUG oslo_concurrency.lockutils [req-3cf03688-35a5-40dd-83ea-8e143c34c610 req-12895987-254f-4882-9331-983b91f7b877 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "f62cb90c-e99d-43d4-bbac-06a79d9b1182-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:34:48 compute-0 nova_compute[183075]: 2026-01-22 17:34:48.822 183079 DEBUG oslo_concurrency.lockutils [req-3cf03688-35a5-40dd-83ea-8e143c34c610 req-12895987-254f-4882-9331-983b91f7b877 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "f62cb90c-e99d-43d4-bbac-06a79d9b1182-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:34:48 compute-0 nova_compute[183075]: 2026-01-22 17:34:48.822 183079 DEBUG nova.compute.manager [req-3cf03688-35a5-40dd-83ea-8e143c34c610 req-12895987-254f-4882-9331-983b91f7b877 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] No waiting events found dispatching network-vif-plugged-509507c8-2a57-4e8e-aece-de4608aa6284 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:34:48 compute-0 nova_compute[183075]: 2026-01-22 17:34:48.822 183079 WARNING nova.compute.manager [req-3cf03688-35a5-40dd-83ea-8e143c34c610 req-12895987-254f-4882-9331-983b91f7b877 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Received unexpected event network-vif-plugged-509507c8-2a57-4e8e-aece-de4608aa6284 for instance with vm_state active and task_state None.
Jan 22 17:34:52 compute-0 nova_compute[183075]: 2026-01-22 17:34:52.235 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769103277.2335882, 81396ea9-a9a1-4a21-9808-608e45a7aa03 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:34:52 compute-0 nova_compute[183075]: 2026-01-22 17:34:52.236 183079 INFO nova.compute.manager [-] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] VM Stopped (Lifecycle Event)
Jan 22 17:34:52 compute-0 nova_compute[183075]: 2026-01-22 17:34:52.253 183079 DEBUG nova.compute.manager [None req-950fcfac-41a4-493d-a7e3-094d1176d267 - - - - - -] [instance: 81396ea9-a9a1-4a21-9808-608e45a7aa03] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:34:52 compute-0 podman[234818]: 2026-01-22 17:34:52.338664407 +0000 UTC m=+0.045298744 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:34:52 compute-0 nova_compute[183075]: 2026-01-22 17:34:52.997 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:53 compute-0 ovn_controller[95372]: 2026-01-22T17:34:53Z|00629|binding|INFO|Releasing lport 836e8f86-4bb7-45af-98e0-ef8f86eadaf8 from this chassis (sb_readonly=0)
Jan 22 17:34:53 compute-0 nova_compute[183075]: 2026-01-22 17:34:53.123 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:53 compute-0 ovn_controller[95372]: 2026-01-22T17:34:53Z|00630|binding|INFO|Releasing lport 836e8f86-4bb7-45af-98e0-ef8f86eadaf8 from this chassis (sb_readonly=0)
Jan 22 17:34:53 compute-0 nova_compute[183075]: 2026-01-22 17:34:53.202 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:53 compute-0 nova_compute[183075]: 2026-01-22 17:34:53.509 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:54 compute-0 nova_compute[183075]: 2026-01-22 17:34:54.010 183079 INFO nova.compute.manager [None req-9d4e374a-1b3f-45c7-8e23-fb4f252a046c bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Get console output
Jan 22 17:34:56 compute-0 podman[234843]: 2026-01-22 17:34:56.345695608 +0000 UTC m=+0.055915184 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:34:57 compute-0 nova_compute[183075]: 2026-01-22 17:34:57.999 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:58 compute-0 ovn_controller[95372]: 2026-01-22T17:34:58Z|00116|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:04:a8:2d 10.100.0.7
Jan 22 17:34:58 compute-0 ovn_controller[95372]: 2026-01-22T17:34:58Z|00117|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:04:a8:2d 10.100.0.7
Jan 22 17:34:58 compute-0 nova_compute[183075]: 2026-01-22 17:34:58.512 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:34:59 compute-0 sshd-session[234886]: Received disconnect from 45.148.10.152 port 33482:11:  [preauth]
Jan 22 17:34:59 compute-0 sshd-session[234886]: Disconnected from authenticating user root 45.148.10.152 port 33482 [preauth]
Jan 22 17:34:59 compute-0 nova_compute[183075]: 2026-01-22 17:34:59.228 183079 INFO nova.compute.manager [None req-39db7772-0f8f-44f0-b29e-a7596d857021 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Get console output
Jan 22 17:34:59 compute-0 nova_compute[183075]: 2026-01-22 17:34:59.233 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:34:59 compute-0 nova_compute[183075]: 2026-01-22 17:34:59.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:35:03 compute-0 nova_compute[183075]: 2026-01-22 17:35:03.000 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:03 compute-0 nova_compute[183075]: 2026-01-22 17:35:03.514 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:03.579 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:35:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:03.580 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:35:03 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:35:03 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:35:03 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:35:03 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:35:03 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:35:03 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:35:03 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 66f028c0-4f7e-4541-a188-d929c83780ea __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.009 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.010 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.4300163
Jan 22 17:35:04 compute-0 haproxy-metadata-proxy-66f028c0-4f7e-4541-a188-d929c83780ea[234809]: 10.100.0.7:35842 [22/Jan/2026:17:35:03.578] listener listener/metadata 0/0/0/431/431 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.022 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.023 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 66f028c0-4f7e-4541-a188-d929c83780ea __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.044 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.045 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 168 time: 0.0218377
Jan 22 17:35:04 compute-0 haproxy-metadata-proxy-66f028c0-4f7e-4541-a188-d929c83780ea[234809]: 10.100.0.7:35856 [22/Jan/2026:17:35:04.022] listener listener/metadata 0/0/0/23/23 200 152 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.050 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.051 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 66f028c0-4f7e-4541-a188-d929c83780ea __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.063 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.063 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0127792
Jan 22 17:35:04 compute-0 haproxy-metadata-proxy-66f028c0-4f7e-4541-a188-d929c83780ea[234809]: 10.100.0.7:35872 [22/Jan/2026:17:35:04.049] listener listener/metadata 0/0/0/13/13 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.069 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.070 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 66f028c0-4f7e-4541-a188-d929c83780ea __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.083 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.083 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0134809
Jan 22 17:35:04 compute-0 haproxy-metadata-proxy-66f028c0-4f7e-4541-a188-d929c83780ea[234809]: 10.100.0.7:35888 [22/Jan/2026:17:35:04.069] listener listener/metadata 0/0/0/14/14 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.093 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.093 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 66f028c0-4f7e-4541-a188-d929c83780ea __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.107 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.107 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0139432
Jan 22 17:35:04 compute-0 haproxy-metadata-proxy-66f028c0-4f7e-4541-a188-d929c83780ea[234809]: 10.100.0.7:35904 [22/Jan/2026:17:35:04.092] listener listener/metadata 0/0/0/15/15 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.117 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.117 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 66f028c0-4f7e-4541-a188-d929c83780ea __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.130 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.131 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0133154
Jan 22 17:35:04 compute-0 haproxy-metadata-proxy-66f028c0-4f7e-4541-a188-d929c83780ea[234809]: 10.100.0.7:35914 [22/Jan/2026:17:35:04.116] listener listener/metadata 0/0/0/14/14 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.136 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.136 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 66f028c0-4f7e-4541-a188-d929c83780ea __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.153 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.154 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0173223
Jan 22 17:35:04 compute-0 haproxy-metadata-proxy-66f028c0-4f7e-4541-a188-d929c83780ea[234809]: 10.100.0.7:35930 [22/Jan/2026:17:35:04.135] listener listener/metadata 0/0/0/18/18 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.162 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.163 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 66f028c0-4f7e-4541-a188-d929c83780ea __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.175 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.176 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0124826
Jan 22 17:35:04 compute-0 haproxy-metadata-proxy-66f028c0-4f7e-4541-a188-d929c83780ea[234809]: 10.100.0.7:35936 [22/Jan/2026:17:35:04.162] listener listener/metadata 0/0/0/14/14 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.185 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.186 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 66f028c0-4f7e-4541-a188-d929c83780ea __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.203 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.203 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0170863
Jan 22 17:35:04 compute-0 haproxy-metadata-proxy-66f028c0-4f7e-4541-a188-d929c83780ea[234809]: 10.100.0.7:35948 [22/Jan/2026:17:35:04.185] listener listener/metadata 0/0/0/18/18 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.207 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.207 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 66f028c0-4f7e-4541-a188-d929c83780ea __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.220 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:35:04 compute-0 haproxy-metadata-proxy-66f028c0-4f7e-4541-a188-d929c83780ea[234809]: 10.100.0.7:35960 [22/Jan/2026:17:35:04.207] listener listener/metadata 0/0/0/13/13 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.221 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0133116
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.225 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.225 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 66f028c0-4f7e-4541-a188-d929c83780ea __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:35:04 compute-0 haproxy-metadata-proxy-66f028c0-4f7e-4541-a188-d929c83780ea[234809]: 10.100.0.7:35974 [22/Jan/2026:17:35:04.224] listener listener/metadata 0/0/0/16/16 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.241 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0158296
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.252 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.252 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 66f028c0-4f7e-4541-a188-d929c83780ea __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.265 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.265 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0127168
Jan 22 17:35:04 compute-0 haproxy-metadata-proxy-66f028c0-4f7e-4541-a188-d929c83780ea[234809]: 10.100.0.7:35976 [22/Jan/2026:17:35:04.251] listener listener/metadata 0/0/0/13/13 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.271 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.271 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 66f028c0-4f7e-4541-a188-d929c83780ea __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.291 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.292 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0207160
Jan 22 17:35:04 compute-0 haproxy-metadata-proxy-66f028c0-4f7e-4541-a188-d929c83780ea[234809]: 10.100.0.7:35990 [22/Jan/2026:17:35:04.270] listener listener/metadata 0/0/0/21/21 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.297 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.297 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 66f028c0-4f7e-4541-a188-d929c83780ea __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.312 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.313 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0155935
Jan 22 17:35:04 compute-0 haproxy-metadata-proxy-66f028c0-4f7e-4541-a188-d929c83780ea[234809]: 10.100.0.7:36000 [22/Jan/2026:17:35:04.296] listener listener/metadata 0/0/0/16/16 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.319 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.320 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 66f028c0-4f7e-4541-a188-d929c83780ea __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.336 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.337 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0169239
Jan 22 17:35:04 compute-0 haproxy-metadata-proxy-66f028c0-4f7e-4541-a188-d929c83780ea[234809]: 10.100.0.7:36016 [22/Jan/2026:17:35:04.318] listener listener/metadata 0/0/0/18/18 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.343 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.344 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 66f028c0-4f7e-4541-a188-d929c83780ea __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.364 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:35:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:04.364 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0199952
Jan 22 17:35:04 compute-0 haproxy-metadata-proxy-66f028c0-4f7e-4541-a188-d929c83780ea[234809]: 10.100.0.7:36032 [22/Jan/2026:17:35:04.342] listener listener/metadata 0/0/0/21/21 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:35:04 compute-0 nova_compute[183075]: 2026-01-22 17:35:04.569 183079 INFO nova.compute.manager [None req-aa8767a2-d3c5-4a17-ba9f-db51dcc9305a bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Get console output
Jan 22 17:35:04 compute-0 nova_compute[183075]: 2026-01-22 17:35:04.578 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:35:07 compute-0 podman[234891]: 2026-01-22 17:35:07.354710368 +0000 UTC m=+0.057544778 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=openstack_network_exporter, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 17:35:07 compute-0 podman[234890]: 2026-01-22 17:35:07.383787219 +0000 UTC m=+0.080645186 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:35:07 compute-0 podman[234889]: 2026-01-22 17:35:07.402429577 +0000 UTC m=+0.112507434 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:35:08 compute-0 nova_compute[183075]: 2026-01-22 17:35:08.002 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:08 compute-0 nova_compute[183075]: 2026-01-22 17:35:08.516 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:09 compute-0 nova_compute[183075]: 2026-01-22 17:35:09.822 183079 INFO nova.compute.manager [None req-a1ce3a4e-87de-45d9-ad91-f0d13f5381ff bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Get console output
Jan 22 17:35:09 compute-0 nova_compute[183075]: 2026-01-22 17:35:09.831 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:35:12 compute-0 podman[234952]: 2026-01-22 17:35:12.347656119 +0000 UTC m=+0.059465640 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:35:13 compute-0 nova_compute[183075]: 2026-01-22 17:35:13.004 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:13 compute-0 nova_compute[183075]: 2026-01-22 17:35:13.518 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:15 compute-0 nova_compute[183075]: 2026-01-22 17:35:15.314 183079 INFO nova.compute.manager [None req-656ddd58-a1c6-4727-8095-4a4242f0e08a bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Get console output
Jan 22 17:35:15 compute-0 nova_compute[183075]: 2026-01-22 17:35:15.319 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:35:18 compute-0 nova_compute[183075]: 2026-01-22 17:35:18.006 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:18 compute-0 nova_compute[183075]: 2026-01-22 17:35:18.521 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:19 compute-0 nova_compute[183075]: 2026-01-22 17:35:19.208 183079 DEBUG oslo_concurrency.lockutils [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Acquiring lock "585cde24-8038-40b2-97ce-5d30e6ecfc03" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:35:19 compute-0 nova_compute[183075]: 2026-01-22 17:35:19.209 183079 DEBUG oslo_concurrency.lockutils [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "585cde24-8038-40b2-97ce-5d30e6ecfc03" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:35:19 compute-0 nova_compute[183075]: 2026-01-22 17:35:19.465 183079 DEBUG nova.compute.manager [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:35:19 compute-0 nova_compute[183075]: 2026-01-22 17:35:19.631 183079 DEBUG oslo_concurrency.lockutils [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:35:19 compute-0 nova_compute[183075]: 2026-01-22 17:35:19.632 183079 DEBUG oslo_concurrency.lockutils [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:35:19 compute-0 nova_compute[183075]: 2026-01-22 17:35:19.641 183079 DEBUG nova.virt.hardware [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:35:19 compute-0 nova_compute[183075]: 2026-01-22 17:35:19.641 183079 INFO nova.compute.claims [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:35:19 compute-0 nova_compute[183075]: 2026-01-22 17:35:19.910 183079 DEBUG nova.compute.provider_tree [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:35:20 compute-0 nova_compute[183075]: 2026-01-22 17:35:20.247 183079 DEBUG nova.scheduler.client.report [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:35:20 compute-0 nova_compute[183075]: 2026-01-22 17:35:20.269 183079 DEBUG oslo_concurrency.lockutils [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.638s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:35:20 compute-0 nova_compute[183075]: 2026-01-22 17:35:20.270 183079 DEBUG nova.compute.manager [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:35:20 compute-0 nova_compute[183075]: 2026-01-22 17:35:20.934 183079 INFO nova.compute.manager [None req-584dff79-8c72-4bbe-a9e4-b5f79742862f bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Get console output
Jan 22 17:35:20 compute-0 nova_compute[183075]: 2026-01-22 17:35:20.942 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:35:21 compute-0 nova_compute[183075]: 2026-01-22 17:35:21.092 183079 DEBUG nova.compute.manager [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:35:21 compute-0 nova_compute[183075]: 2026-01-22 17:35:21.092 183079 DEBUG nova.network.neutron [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:35:21 compute-0 nova_compute[183075]: 2026-01-22 17:35:21.153 183079 INFO nova.virt.libvirt.driver [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:35:21 compute-0 nova_compute[183075]: 2026-01-22 17:35:21.234 183079 DEBUG nova.compute.manager [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:35:21 compute-0 nova_compute[183075]: 2026-01-22 17:35:21.518 183079 DEBUG nova.compute.manager [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:35:21 compute-0 nova_compute[183075]: 2026-01-22 17:35:21.520 183079 DEBUG nova.virt.libvirt.driver [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:35:21 compute-0 nova_compute[183075]: 2026-01-22 17:35:21.520 183079 INFO nova.virt.libvirt.driver [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Creating image(s)
Jan 22 17:35:21 compute-0 nova_compute[183075]: 2026-01-22 17:35:21.521 183079 DEBUG oslo_concurrency.lockutils [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Acquiring lock "/var/lib/nova/instances/585cde24-8038-40b2-97ce-5d30e6ecfc03/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:35:21 compute-0 nova_compute[183075]: 2026-01-22 17:35:21.521 183079 DEBUG oslo_concurrency.lockutils [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "/var/lib/nova/instances/585cde24-8038-40b2-97ce-5d30e6ecfc03/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:35:21 compute-0 nova_compute[183075]: 2026-01-22 17:35:21.522 183079 DEBUG oslo_concurrency.lockutils [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "/var/lib/nova/instances/585cde24-8038-40b2-97ce-5d30e6ecfc03/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:35:21 compute-0 nova_compute[183075]: 2026-01-22 17:35:21.534 183079 DEBUG oslo_concurrency.processutils [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:35:21 compute-0 nova_compute[183075]: 2026-01-22 17:35:21.592 183079 DEBUG oslo_concurrency.processutils [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:35:21 compute-0 nova_compute[183075]: 2026-01-22 17:35:21.593 183079 DEBUG oslo_concurrency.lockutils [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:35:21 compute-0 nova_compute[183075]: 2026-01-22 17:35:21.594 183079 DEBUG oslo_concurrency.lockutils [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:35:21 compute-0 nova_compute[183075]: 2026-01-22 17:35:21.606 183079 DEBUG oslo_concurrency.processutils [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:35:21 compute-0 nova_compute[183075]: 2026-01-22 17:35:21.661 183079 DEBUG oslo_concurrency.processutils [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:35:21 compute-0 nova_compute[183075]: 2026-01-22 17:35:21.662 183079 DEBUG oslo_concurrency.processutils [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/585cde24-8038-40b2-97ce-5d30e6ecfc03/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:35:21 compute-0 nova_compute[183075]: 2026-01-22 17:35:21.702 183079 DEBUG oslo_concurrency.processutils [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/585cde24-8038-40b2-97ce-5d30e6ecfc03/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:35:21 compute-0 nova_compute[183075]: 2026-01-22 17:35:21.703 183079 DEBUG oslo_concurrency.lockutils [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:35:21 compute-0 nova_compute[183075]: 2026-01-22 17:35:21.703 183079 DEBUG oslo_concurrency.processutils [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:35:21 compute-0 nova_compute[183075]: 2026-01-22 17:35:21.760 183079 DEBUG oslo_concurrency.processutils [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:35:21 compute-0 nova_compute[183075]: 2026-01-22 17:35:21.761 183079 DEBUG nova.virt.disk.api [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Checking if we can resize image /var/lib/nova/instances/585cde24-8038-40b2-97ce-5d30e6ecfc03/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:35:21 compute-0 nova_compute[183075]: 2026-01-22 17:35:21.762 183079 DEBUG oslo_concurrency.processutils [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/585cde24-8038-40b2-97ce-5d30e6ecfc03/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:35:21 compute-0 nova_compute[183075]: 2026-01-22 17:35:21.790 183079 DEBUG nova.policy [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:35:21 compute-0 nova_compute[183075]: 2026-01-22 17:35:21.817 183079 DEBUG oslo_concurrency.processutils [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/585cde24-8038-40b2-97ce-5d30e6ecfc03/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:35:21 compute-0 nova_compute[183075]: 2026-01-22 17:35:21.818 183079 DEBUG nova.virt.disk.api [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Cannot resize image /var/lib/nova/instances/585cde24-8038-40b2-97ce-5d30e6ecfc03/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:35:21 compute-0 nova_compute[183075]: 2026-01-22 17:35:21.818 183079 DEBUG nova.objects.instance [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lazy-loading 'migration_context' on Instance uuid 585cde24-8038-40b2-97ce-5d30e6ecfc03 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:35:22 compute-0 nova_compute[183075]: 2026-01-22 17:35:22.091 183079 DEBUG nova.virt.libvirt.driver [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:35:22 compute-0 nova_compute[183075]: 2026-01-22 17:35:22.092 183079 DEBUG nova.virt.libvirt.driver [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Ensure instance console log exists: /var/lib/nova/instances/585cde24-8038-40b2-97ce-5d30e6ecfc03/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:35:22 compute-0 nova_compute[183075]: 2026-01-22 17:35:22.092 183079 DEBUG oslo_concurrency.lockutils [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:35:22 compute-0 nova_compute[183075]: 2026-01-22 17:35:22.092 183079 DEBUG oslo_concurrency.lockutils [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:35:22 compute-0 nova_compute[183075]: 2026-01-22 17:35:22.093 183079 DEBUG oslo_concurrency.lockutils [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:35:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:22.784 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=41, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=40) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:35:22 compute-0 nova_compute[183075]: 2026-01-22 17:35:22.785 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:22.785 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:35:22 compute-0 nova_compute[183075]: 2026-01-22 17:35:22.800 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:35:23 compute-0 nova_compute[183075]: 2026-01-22 17:35:23.008 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:23 compute-0 nova_compute[183075]: 2026-01-22 17:35:23.175 183079 DEBUG nova.network.neutron [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Successfully updated port: 524266a0-8b9b-42d0-9f33-913b53293292 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:35:23 compute-0 nova_compute[183075]: 2026-01-22 17:35:23.199 183079 DEBUG oslo_concurrency.lockutils [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Acquiring lock "refresh_cache-585cde24-8038-40b2-97ce-5d30e6ecfc03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:35:23 compute-0 nova_compute[183075]: 2026-01-22 17:35:23.199 183079 DEBUG oslo_concurrency.lockutils [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Acquired lock "refresh_cache-585cde24-8038-40b2-97ce-5d30e6ecfc03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:35:23 compute-0 nova_compute[183075]: 2026-01-22 17:35:23.199 183079 DEBUG nova.network.neutron [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:35:23 compute-0 nova_compute[183075]: 2026-01-22 17:35:23.305 183079 DEBUG nova.compute.manager [req-b4c88c53-8dc7-4ad7-bfd2-af40afe13084 req-5c6700f6-ebdc-48a5-acfb-b197f4c524cc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Received event network-changed-524266a0-8b9b-42d0-9f33-913b53293292 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:35:23 compute-0 nova_compute[183075]: 2026-01-22 17:35:23.305 183079 DEBUG nova.compute.manager [req-b4c88c53-8dc7-4ad7-bfd2-af40afe13084 req-5c6700f6-ebdc-48a5-acfb-b197f4c524cc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Refreshing instance network info cache due to event network-changed-524266a0-8b9b-42d0-9f33-913b53293292. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:35:23 compute-0 nova_compute[183075]: 2026-01-22 17:35:23.306 183079 DEBUG oslo_concurrency.lockutils [req-b4c88c53-8dc7-4ad7-bfd2-af40afe13084 req-5c6700f6-ebdc-48a5-acfb-b197f4c524cc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-585cde24-8038-40b2-97ce-5d30e6ecfc03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:35:23 compute-0 podman[234988]: 2026-01-22 17:35:23.337917598 +0000 UTC m=+0.051607586 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 17:35:23 compute-0 nova_compute[183075]: 2026-01-22 17:35:23.523 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:23 compute-0 nova_compute[183075]: 2026-01-22 17:35:23.612 183079 DEBUG nova.network.neutron [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.654 183079 DEBUG nova.network.neutron [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Updating instance_info_cache with network_info: [{"id": "524266a0-8b9b-42d0-9f33-913b53293292", "address": "fa:16:3e:7a:64:5f", "network": {"id": "f8ae4b18-347c-4ff3-b9f8-578518ecd408", "bridge": "br-int", "label": "tempest-TrunkTest-1480081563", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e92551896d2c49b5b149b1a5a0cc1761", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap524266a0-8b", "ovs_interfaceid": "524266a0-8b9b-42d0-9f33-913b53293292", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.688 183079 DEBUG oslo_concurrency.lockutils [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Releasing lock "refresh_cache-585cde24-8038-40b2-97ce-5d30e6ecfc03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.689 183079 DEBUG nova.compute.manager [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Instance network_info: |[{"id": "524266a0-8b9b-42d0-9f33-913b53293292", "address": "fa:16:3e:7a:64:5f", "network": {"id": "f8ae4b18-347c-4ff3-b9f8-578518ecd408", "bridge": "br-int", "label": "tempest-TrunkTest-1480081563", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e92551896d2c49b5b149b1a5a0cc1761", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap524266a0-8b", "ovs_interfaceid": "524266a0-8b9b-42d0-9f33-913b53293292", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.689 183079 DEBUG oslo_concurrency.lockutils [req-b4c88c53-8dc7-4ad7-bfd2-af40afe13084 req-5c6700f6-ebdc-48a5-acfb-b197f4c524cc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-585cde24-8038-40b2-97ce-5d30e6ecfc03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.690 183079 DEBUG nova.network.neutron [req-b4c88c53-8dc7-4ad7-bfd2-af40afe13084 req-5c6700f6-ebdc-48a5-acfb-b197f4c524cc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Refreshing network info cache for port 524266a0-8b9b-42d0-9f33-913b53293292 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.692 183079 DEBUG nova.virt.libvirt.driver [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Start _get_guest_xml network_info=[{"id": "524266a0-8b9b-42d0-9f33-913b53293292", "address": "fa:16:3e:7a:64:5f", "network": {"id": "f8ae4b18-347c-4ff3-b9f8-578518ecd408", "bridge": "br-int", "label": "tempest-TrunkTest-1480081563", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e92551896d2c49b5b149b1a5a0cc1761", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap524266a0-8b", "ovs_interfaceid": "524266a0-8b9b-42d0-9f33-913b53293292", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.697 183079 WARNING nova.virt.libvirt.driver [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.700 183079 DEBUG nova.virt.libvirt.host [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.701 183079 DEBUG nova.virt.libvirt.host [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.703 183079 DEBUG nova.virt.libvirt.host [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.704 183079 DEBUG nova.virt.libvirt.host [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.704 183079 DEBUG nova.virt.libvirt.driver [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.704 183079 DEBUG nova.virt.hardware [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.705 183079 DEBUG nova.virt.hardware [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.705 183079 DEBUG nova.virt.hardware [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.705 183079 DEBUG nova.virt.hardware [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.706 183079 DEBUG nova.virt.hardware [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.706 183079 DEBUG nova.virt.hardware [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.706 183079 DEBUG nova.virt.hardware [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.706 183079 DEBUG nova.virt.hardware [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.706 183079 DEBUG nova.virt.hardware [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.707 183079 DEBUG nova.virt.hardware [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.707 183079 DEBUG nova.virt.hardware [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.710 183079 DEBUG nova.virt.libvirt.vif [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:35:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-575952224',display_name='tempest-server-test-575952224',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-575952224',id=58,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMx6tQlM3uv2ZwDgQasADvvbGd5zBlSeFHyt6pKaXPuo5g1lBpnMysyabNjP8htj/tP0P4meLZoYHTsZRxp2O0FBGUiyAm9KZdR/DNDaP0hn5KYm00UnMkjIWdjBNqhB9Q==',key_name='tempest-TrunkTest-1480081563',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e92551896d2c49b5b149b1a5a0cc1761',ramdisk_id='',reservation_id='r-qr4fehoi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TrunkTest-252091256',owner_user_name='tempest-TrunkTest-252091256-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:35:21Z,user_data=None,user_id='7cc2886d6b0e400d8096a810a2159f3c',uuid=585cde24-8038-40b2-97ce-5d30e6ecfc03,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "524266a0-8b9b-42d0-9f33-913b53293292", "address": "fa:16:3e:7a:64:5f", "network": {"id": "f8ae4b18-347c-4ff3-b9f8-578518ecd408", "bridge": "br-int", "label": "tempest-TrunkTest-1480081563", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e92551896d2c49b5b149b1a5a0cc1761", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap524266a0-8b", "ovs_interfaceid": "524266a0-8b9b-42d0-9f33-913b53293292", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.711 183079 DEBUG nova.network.os_vif_util [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Converting VIF {"id": "524266a0-8b9b-42d0-9f33-913b53293292", "address": "fa:16:3e:7a:64:5f", "network": {"id": "f8ae4b18-347c-4ff3-b9f8-578518ecd408", "bridge": "br-int", "label": "tempest-TrunkTest-1480081563", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e92551896d2c49b5b149b1a5a0cc1761", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap524266a0-8b", "ovs_interfaceid": "524266a0-8b9b-42d0-9f33-913b53293292", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.711 183079 DEBUG nova.network.os_vif_util [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7a:64:5f,bridge_name='br-int',has_traffic_filtering=True,id=524266a0-8b9b-42d0-9f33-913b53293292,network=Network(f8ae4b18-347c-4ff3-b9f8-578518ecd408),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap524266a0-8b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.712 183079 DEBUG nova.objects.instance [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lazy-loading 'pci_devices' on Instance uuid 585cde24-8038-40b2-97ce-5d30e6ecfc03 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.724 183079 DEBUG nova.virt.libvirt.driver [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:35:24 compute-0 nova_compute[183075]:   <uuid>585cde24-8038-40b2-97ce-5d30e6ecfc03</uuid>
Jan 22 17:35:24 compute-0 nova_compute[183075]:   <name>instance-0000003a</name>
Jan 22 17:35:24 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:35:24 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:35:24 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:35:24 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-575952224</nova:name>
Jan 22 17:35:24 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:35:24</nova:creationTime>
Jan 22 17:35:24 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:35:24 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:35:24 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:35:24 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:35:24 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:35:24 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:35:24 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:35:24 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:35:24 compute-0 nova_compute[183075]:         <nova:user uuid="7cc2886d6b0e400d8096a810a2159f3c">tempest-TrunkTest-252091256-project-member</nova:user>
Jan 22 17:35:24 compute-0 nova_compute[183075]:         <nova:project uuid="e92551896d2c49b5b149b1a5a0cc1761">tempest-TrunkTest-252091256</nova:project>
Jan 22 17:35:24 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:35:24 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:35:24 compute-0 nova_compute[183075]:         <nova:port uuid="524266a0-8b9b-42d0-9f33-913b53293292">
Jan 22 17:35:24 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:35:24 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:35:24 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:35:24 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <system>
Jan 22 17:35:24 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:35:24 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:35:24 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:35:24 compute-0 nova_compute[183075]:       <entry name="serial">585cde24-8038-40b2-97ce-5d30e6ecfc03</entry>
Jan 22 17:35:24 compute-0 nova_compute[183075]:       <entry name="uuid">585cde24-8038-40b2-97ce-5d30e6ecfc03</entry>
Jan 22 17:35:24 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     </system>
Jan 22 17:35:24 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:35:24 compute-0 nova_compute[183075]:   <os>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:   </os>
Jan 22 17:35:24 compute-0 nova_compute[183075]:   <features>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:   </features>
Jan 22 17:35:24 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:35:24 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:35:24 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:35:24 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/585cde24-8038-40b2-97ce-5d30e6ecfc03/disk"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:35:24 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:7a:64:5f"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:       <target dev="tap524266a0-8b"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:35:24 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/585cde24-8038-40b2-97ce-5d30e6ecfc03/console.log" append="off"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <video>
Jan 22 17:35:24 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     </video>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:35:24 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:35:24 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:35:24 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:35:24 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:35:24 compute-0 nova_compute[183075]: </domain>
Jan 22 17:35:24 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.725 183079 DEBUG nova.compute.manager [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Preparing to wait for external event network-vif-plugged-524266a0-8b9b-42d0-9f33-913b53293292 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.725 183079 DEBUG oslo_concurrency.lockutils [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Acquiring lock "585cde24-8038-40b2-97ce-5d30e6ecfc03-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.726 183079 DEBUG oslo_concurrency.lockutils [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "585cde24-8038-40b2-97ce-5d30e6ecfc03-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.726 183079 DEBUG oslo_concurrency.lockutils [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "585cde24-8038-40b2-97ce-5d30e6ecfc03-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.727 183079 DEBUG nova.virt.libvirt.vif [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:35:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-575952224',display_name='tempest-server-test-575952224',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-575952224',id=58,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMx6tQlM3uv2ZwDgQasADvvbGd5zBlSeFHyt6pKaXPuo5g1lBpnMysyabNjP8htj/tP0P4meLZoYHTsZRxp2O0FBGUiyAm9KZdR/DNDaP0hn5KYm00UnMkjIWdjBNqhB9Q==',key_name='tempest-TrunkTest-1480081563',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e92551896d2c49b5b149b1a5a0cc1761',ramdisk_id='',reservation_id='r-qr4fehoi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TrunkTest-252091256',owner_user_name='tempest-TrunkTest-252091256-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:35:21Z,user_data=None,user_id='7cc2886d6b0e400d8096a810a2159f3c',uuid=585cde24-8038-40b2-97ce-5d30e6ecfc03,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "524266a0-8b9b-42d0-9f33-913b53293292", "address": "fa:16:3e:7a:64:5f", "network": {"id": "f8ae4b18-347c-4ff3-b9f8-578518ecd408", "bridge": "br-int", "label": "tempest-TrunkTest-1480081563", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e92551896d2c49b5b149b1a5a0cc1761", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap524266a0-8b", "ovs_interfaceid": "524266a0-8b9b-42d0-9f33-913b53293292", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.727 183079 DEBUG nova.network.os_vif_util [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Converting VIF {"id": "524266a0-8b9b-42d0-9f33-913b53293292", "address": "fa:16:3e:7a:64:5f", "network": {"id": "f8ae4b18-347c-4ff3-b9f8-578518ecd408", "bridge": "br-int", "label": "tempest-TrunkTest-1480081563", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e92551896d2c49b5b149b1a5a0cc1761", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap524266a0-8b", "ovs_interfaceid": "524266a0-8b9b-42d0-9f33-913b53293292", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.728 183079 DEBUG nova.network.os_vif_util [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7a:64:5f,bridge_name='br-int',has_traffic_filtering=True,id=524266a0-8b9b-42d0-9f33-913b53293292,network=Network(f8ae4b18-347c-4ff3-b9f8-578518ecd408),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap524266a0-8b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.728 183079 DEBUG os_vif [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7a:64:5f,bridge_name='br-int',has_traffic_filtering=True,id=524266a0-8b9b-42d0-9f33-913b53293292,network=Network(f8ae4b18-347c-4ff3-b9f8-578518ecd408),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap524266a0-8b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.728 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.729 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.729 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.732 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.732 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap524266a0-8b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.733 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap524266a0-8b, col_values=(('external_ids', {'iface-id': '524266a0-8b9b-42d0-9f33-913b53293292', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7a:64:5f', 'vm-uuid': '585cde24-8038-40b2-97ce-5d30e6ecfc03'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.734 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:24 compute-0 NetworkManager[55454]: <info>  [1769103324.7356] manager: (tap524266a0-8b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/268)
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.737 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.741 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.742 183079 INFO os_vif [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7a:64:5f,bridge_name='br-int',has_traffic_filtering=True,id=524266a0-8b9b-42d0-9f33-913b53293292,network=Network(f8ae4b18-347c-4ff3-b9f8-578518ecd408),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap524266a0-8b')
Jan 22 17:35:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:24.788 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '41'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.795 183079 DEBUG nova.virt.libvirt.driver [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.796 183079 DEBUG nova.virt.libvirt.driver [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] No VIF found with MAC fa:16:3e:7a:64:5f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:35:24 compute-0 kernel: tap524266a0-8b: entered promiscuous mode
Jan 22 17:35:24 compute-0 NetworkManager[55454]: <info>  [1769103324.8492] manager: (tap524266a0-8b): new Tun device (/org/freedesktop/NetworkManager/Devices/269)
Jan 22 17:35:24 compute-0 ovn_controller[95372]: 2026-01-22T17:35:24Z|00631|binding|INFO|Claiming lport 524266a0-8b9b-42d0-9f33-913b53293292 for this chassis.
Jan 22 17:35:24 compute-0 ovn_controller[95372]: 2026-01-22T17:35:24Z|00632|binding|INFO|524266a0-8b9b-42d0-9f33-913b53293292: Claiming fa:16:3e:7a:64:5f 10.100.0.6
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.851 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.853 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:24.864 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7a:64:5f 10.100.0.6'], port_security=['fa:16:3e:7a:64:5f 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TrunkTest-1480081563', 'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f8ae4b18-347c-4ff3-b9f8-578518ecd408', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TrunkTest-1480081563', 'neutron:project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'neutron:revision_number': '2', 'neutron:security_group_ids': '709a6363-b0f8-4821-8d51-15be786bf9b0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.231'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=43bd6944-fad9-4457-8b9b-34db3859c385, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=524266a0-8b9b-42d0-9f33-913b53293292) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:35:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:24.865 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 524266a0-8b9b-42d0-9f33-913b53293292 in datapath f8ae4b18-347c-4ff3-b9f8-578518ecd408 bound to our chassis
Jan 22 17:35:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:24.867 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f8ae4b18-347c-4ff3-b9f8-578518ecd408
Jan 22 17:35:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:24.878 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e4436dff-50c0-41d6-8a86-90a21b4b897b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:35:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:24.879 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf8ae4b18-31 in ovnmeta-f8ae4b18-347c-4ff3-b9f8-578518ecd408 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:35:24 compute-0 systemd-udevd[235029]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:35:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:24.881 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf8ae4b18-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:35:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:24.882 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[66eab895-e887-457e-86c0-8d802f42a06b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:35:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:24.883 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[78e84490-1c22-4ed7-bd53-a5b3f6eef7d5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:35:24 compute-0 NetworkManager[55454]: <info>  [1769103324.8976] device (tap524266a0-8b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:35:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:24.897 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[53b81381-11f0-4b63-ba9b-006d86387a0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:35:24 compute-0 NetworkManager[55454]: <info>  [1769103324.8989] device (tap524266a0-8b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:35:24 compute-0 systemd-machined[154382]: New machine qemu-58-instance-0000003a.
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.909 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:24 compute-0 ovn_controller[95372]: 2026-01-22T17:35:24Z|00633|binding|INFO|Setting lport 524266a0-8b9b-42d0-9f33-913b53293292 ovn-installed in OVS
Jan 22 17:35:24 compute-0 ovn_controller[95372]: 2026-01-22T17:35:24Z|00634|binding|INFO|Setting lport 524266a0-8b9b-42d0-9f33-913b53293292 up in Southbound
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.914 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:24.924 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b790ecb2-dc69-4f0c-a293-3b0d9b3e23b0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:35:24 compute-0 systemd[1]: Started Virtual Machine qemu-58-instance-0000003a.
Jan 22 17:35:24 compute-0 ovn_controller[95372]: 2026-01-22T17:35:24Z|00635|binding|INFO|Releasing lport 836e8f86-4bb7-45af-98e0-ef8f86eadaf8 from this chassis (sb_readonly=0)
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.945 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:24 compute-0 NetworkManager[55454]: <info>  [1769103324.9490] manager: (patch-br-int-to-provnet-c63dea44-3fe3-44c3-ba47-76ee1ea0ad94): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/270)
Jan 22 17:35:24 compute-0 NetworkManager[55454]: <info>  [1769103324.9500] manager: (patch-provnet-c63dea44-3fe3-44c3-ba47-76ee1ea0ad94-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/271)
Jan 22 17:35:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:24.956 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[be19c442-50ac-4269-829a-f071fb2f9fe7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:35:24 compute-0 systemd-udevd[235032]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:35:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:24.973 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[09988971-9304-47e6-bcbf-3302207e8f64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:35:24 compute-0 NetworkManager[55454]: <info>  [1769103324.9743] manager: (tapf8ae4b18-30): new Veth device (/org/freedesktop/NetworkManager/Devices/272)
Jan 22 17:35:24 compute-0 ovn_controller[95372]: 2026-01-22T17:35:24Z|00636|binding|INFO|Releasing lport 836e8f86-4bb7-45af-98e0-ef8f86eadaf8 from this chassis (sb_readonly=0)
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.975 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:24 compute-0 nova_compute[183075]: 2026-01-22 17:35:24.981 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:25.011 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[1335ede2-b990-4016-844b-e60050a56bdb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:25.014 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[20c0f1ac-b0d5-4f77-a979-9aa592e08df0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:35:25 compute-0 NetworkManager[55454]: <info>  [1769103325.0430] device (tapf8ae4b18-30): carrier: link connected
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:25.049 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[74334df3-3f24-4e20-8950-f438ec935216]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:25.068 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1453aba2-0849-42b8-a183-9567e9e0395d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf8ae4b18-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:05:69:38'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 180], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 568874, 'reachable_time': 28814, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235063, 'error': None, 'target': 'ovnmeta-f8ae4b18-347c-4ff3-b9f8-578518ecd408', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:25.085 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d4d0df5f-0b70-4918-bef2-25bee46617ee]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe05:6938'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 568874, 'tstamp': 568874}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235064, 'error': None, 'target': 'ovnmeta-f8ae4b18-347c-4ff3-b9f8-578518ecd408', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:25.103 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c457cb59-dfd1-423d-9661-3057d225251d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf8ae4b18-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:05:69:38'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 180], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 568874, 'reachable_time': 28814, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235065, 'error': None, 'target': 'ovnmeta-f8ae4b18-347c-4ff3-b9f8-578518ecd408', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:25.144 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f01b9b05-1c97-4aef-9172-370cb078eae1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.215 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103325.214573, 585cde24-8038-40b2-97ce-5d30e6ecfc03 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.215 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] VM Started (Lifecycle Event)
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:25.216 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[5463e093-c6b2-43fe-b29e-788d9f065376]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:25.217 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf8ae4b18-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:25.218 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:25.218 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf8ae4b18-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.220 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:25 compute-0 kernel: tapf8ae4b18-30: entered promiscuous mode
Jan 22 17:35:25 compute-0 NetworkManager[55454]: <info>  [1769103325.2213] manager: (tapf8ae4b18-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/273)
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.222 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:25.223 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf8ae4b18-30, col_values=(('external_ids', {'iface-id': 'ed3c9c99-2ee5-407d-8706-619270405578'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.224 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:25 compute-0 ovn_controller[95372]: 2026-01-22T17:35:25Z|00637|binding|INFO|Releasing lport ed3c9c99-2ee5-407d-8706-619270405578 from this chassis (sb_readonly=0)
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.225 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:25.226 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f8ae4b18-347c-4ff3-b9f8-578518ecd408.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f8ae4b18-347c-4ff3-b9f8-578518ecd408.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:25.227 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[36b48c18-4143-46f6-883b-de4a9a635bf0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:25.227 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-f8ae4b18-347c-4ff3-b9f8-578518ecd408
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/f8ae4b18-347c-4ff3-b9f8-578518ecd408.pid.haproxy
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID f8ae4b18-347c-4ff3-b9f8-578518ecd408
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:35:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:25.228 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f8ae4b18-347c-4ff3-b9f8-578518ecd408', 'env', 'PROCESS_TAG=haproxy-f8ae4b18-347c-4ff3-b9f8-578518ecd408', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f8ae4b18-347c-4ff3-b9f8-578518ecd408.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.236 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.243 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.253 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103325.2147179, 585cde24-8038-40b2-97ce-5d30e6ecfc03 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.253 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] VM Paused (Lifecycle Event)
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.277 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.283 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.302 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.382 183079 DEBUG nova.compute.manager [req-462f5056-4b99-4ee7-8425-5d7e26c2e9ed req-46d0e6b6-4d8e-4eab-ae27-87cb26d03963 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Received event network-vif-plugged-524266a0-8b9b-42d0-9f33-913b53293292 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.382 183079 DEBUG oslo_concurrency.lockutils [req-462f5056-4b99-4ee7-8425-5d7e26c2e9ed req-46d0e6b6-4d8e-4eab-ae27-87cb26d03963 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "585cde24-8038-40b2-97ce-5d30e6ecfc03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.383 183079 DEBUG oslo_concurrency.lockutils [req-462f5056-4b99-4ee7-8425-5d7e26c2e9ed req-46d0e6b6-4d8e-4eab-ae27-87cb26d03963 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "585cde24-8038-40b2-97ce-5d30e6ecfc03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.383 183079 DEBUG oslo_concurrency.lockutils [req-462f5056-4b99-4ee7-8425-5d7e26c2e9ed req-46d0e6b6-4d8e-4eab-ae27-87cb26d03963 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "585cde24-8038-40b2-97ce-5d30e6ecfc03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.383 183079 DEBUG nova.compute.manager [req-462f5056-4b99-4ee7-8425-5d7e26c2e9ed req-46d0e6b6-4d8e-4eab-ae27-87cb26d03963 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Processing event network-vif-plugged-524266a0-8b9b-42d0-9f33-913b53293292 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.385 183079 DEBUG nova.compute.manager [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.389 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103325.3895369, 585cde24-8038-40b2-97ce-5d30e6ecfc03 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.390 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] VM Resumed (Lifecycle Event)
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.394 183079 DEBUG nova.virt.libvirt.driver [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.398 183079 INFO nova.virt.libvirt.driver [-] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Instance spawned successfully.
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.399 183079 DEBUG nova.virt.libvirt.driver [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.418 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.423 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.425 183079 DEBUG nova.virt.libvirt.driver [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.426 183079 DEBUG nova.virt.libvirt.driver [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.426 183079 DEBUG nova.virt.libvirt.driver [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.426 183079 DEBUG nova.virt.libvirt.driver [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.427 183079 DEBUG nova.virt.libvirt.driver [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.427 183079 DEBUG nova.virt.libvirt.driver [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.449 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.480 183079 INFO nova.compute.manager [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Took 3.96 seconds to spawn the instance on the hypervisor.
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.481 183079 DEBUG nova.compute.manager [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.534 183079 INFO nova.compute.manager [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Took 5.93 seconds to build instance.
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.548 183079 DEBUG oslo_concurrency.lockutils [None req-09d8692e-ece8-4049-a30d-2be64fa4e650 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "585cde24-8038-40b2-97ce-5d30e6ecfc03" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.339s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:35:25 compute-0 podman[235104]: 2026-01-22 17:35:25.596671096 +0000 UTC m=+0.053783015 container create 08105ddfddf52ea2379f87a831c03264ce41b8b638a32a680756ac44900847f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8ae4b18-347c-4ff3-b9f8-578518ecd408, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 17:35:25 compute-0 systemd[1]: Started libpod-conmon-08105ddfddf52ea2379f87a831c03264ce41b8b638a32a680756ac44900847f6.scope.
Jan 22 17:35:25 compute-0 podman[235104]: 2026-01-22 17:35:25.566498885 +0000 UTC m=+0.023610804 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:35:25 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:35:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/26091b2e1adf4c1ad361c15ff1e47998acde14bd242e4517f5923975a4329923/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:35:25 compute-0 podman[235104]: 2026-01-22 17:35:25.68715913 +0000 UTC m=+0.144271069 container init 08105ddfddf52ea2379f87a831c03264ce41b8b638a32a680756ac44900847f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8ae4b18-347c-4ff3-b9f8-578518ecd408, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:35:25 compute-0 podman[235104]: 2026-01-22 17:35:25.694761096 +0000 UTC m=+0.151873015 container start 08105ddfddf52ea2379f87a831c03264ce41b8b638a32a680756ac44900847f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8ae4b18-347c-4ff3-b9f8-578518ecd408, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 17:35:25 compute-0 neutron-haproxy-ovnmeta-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235119]: [NOTICE]   (235123) : New worker (235125) forked
Jan 22 17:35:25 compute-0 neutron-haproxy-ovnmeta-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235119]: [NOTICE]   (235123) : Loading success.
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:35:25 compute-0 nova_compute[183075]: 2026-01-22 17:35:25.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:35:26 compute-0 nova_compute[183075]: 2026-01-22 17:35:26.069 183079 INFO nova.compute.manager [None req-70543404-54b4-4ab7-9c8e-4153cfc54d55 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Get console output
Jan 22 17:35:26 compute-0 nova_compute[183075]: 2026-01-22 17:35:26.074 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:35:26 compute-0 nova_compute[183075]: 2026-01-22 17:35:26.443 183079 INFO nova.compute.manager [None req-09815e3a-0aad-4afc-a05f-c90859afb3f8 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Get console output
Jan 22 17:35:26 compute-0 nova_compute[183075]: 2026-01-22 17:35:26.448 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:35:26 compute-0 nova_compute[183075]: 2026-01-22 17:35:26.637 183079 DEBUG nova.network.neutron [req-b4c88c53-8dc7-4ad7-bfd2-af40afe13084 req-5c6700f6-ebdc-48a5-acfb-b197f4c524cc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Updated VIF entry in instance network info cache for port 524266a0-8b9b-42d0-9f33-913b53293292. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:35:26 compute-0 nova_compute[183075]: 2026-01-22 17:35:26.637 183079 DEBUG nova.network.neutron [req-b4c88c53-8dc7-4ad7-bfd2-af40afe13084 req-5c6700f6-ebdc-48a5-acfb-b197f4c524cc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Updating instance_info_cache with network_info: [{"id": "524266a0-8b9b-42d0-9f33-913b53293292", "address": "fa:16:3e:7a:64:5f", "network": {"id": "f8ae4b18-347c-4ff3-b9f8-578518ecd408", "bridge": "br-int", "label": "tempest-TrunkTest-1480081563", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e92551896d2c49b5b149b1a5a0cc1761", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap524266a0-8b", "ovs_interfaceid": "524266a0-8b9b-42d0-9f33-913b53293292", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:35:26 compute-0 nova_compute[183075]: 2026-01-22 17:35:26.653 183079 DEBUG oslo_concurrency.lockutils [req-b4c88c53-8dc7-4ad7-bfd2-af40afe13084 req-5c6700f6-ebdc-48a5-acfb-b197f4c524cc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-585cde24-8038-40b2-97ce-5d30e6ecfc03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:35:27 compute-0 podman[235134]: 2026-01-22 17:35:27.365971931 +0000 UTC m=+0.063616693 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:35:27 compute-0 nova_compute[183075]: 2026-01-22 17:35:27.456 183079 DEBUG nova.compute.manager [req-288bdfab-dc3e-4d0f-974c-1106c0065e52 req-810c3922-d983-4c8b-aab9-f4d67c4d65ed a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Received event network-vif-plugged-524266a0-8b9b-42d0-9f33-913b53293292 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:35:27 compute-0 nova_compute[183075]: 2026-01-22 17:35:27.457 183079 DEBUG oslo_concurrency.lockutils [req-288bdfab-dc3e-4d0f-974c-1106c0065e52 req-810c3922-d983-4c8b-aab9-f4d67c4d65ed a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "585cde24-8038-40b2-97ce-5d30e6ecfc03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:35:27 compute-0 nova_compute[183075]: 2026-01-22 17:35:27.458 183079 DEBUG oslo_concurrency.lockutils [req-288bdfab-dc3e-4d0f-974c-1106c0065e52 req-810c3922-d983-4c8b-aab9-f4d67c4d65ed a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "585cde24-8038-40b2-97ce-5d30e6ecfc03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:35:27 compute-0 nova_compute[183075]: 2026-01-22 17:35:27.458 183079 DEBUG oslo_concurrency.lockutils [req-288bdfab-dc3e-4d0f-974c-1106c0065e52 req-810c3922-d983-4c8b-aab9-f4d67c4d65ed a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "585cde24-8038-40b2-97ce-5d30e6ecfc03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:35:27 compute-0 nova_compute[183075]: 2026-01-22 17:35:27.459 183079 DEBUG nova.compute.manager [req-288bdfab-dc3e-4d0f-974c-1106c0065e52 req-810c3922-d983-4c8b-aab9-f4d67c4d65ed a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] No waiting events found dispatching network-vif-plugged-524266a0-8b9b-42d0-9f33-913b53293292 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:35:27 compute-0 nova_compute[183075]: 2026-01-22 17:35:27.459 183079 WARNING nova.compute.manager [req-288bdfab-dc3e-4d0f-974c-1106c0065e52 req-810c3922-d983-4c8b-aab9-f4d67c4d65ed a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Received unexpected event network-vif-plugged-524266a0-8b9b-42d0-9f33-913b53293292 for instance with vm_state active and task_state None.
Jan 22 17:35:28 compute-0 nova_compute[183075]: 2026-01-22 17:35:28.010 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:29 compute-0 nova_compute[183075]: 2026-01-22 17:35:29.845 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:31 compute-0 nova_compute[183075]: 2026-01-22 17:35:31.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:35:32 compute-0 nova_compute[183075]: 2026-01-22 17:35:32.233 183079 INFO nova.compute.manager [None req-f4ccca2c-e4d8-4d6a-8d2a-111b22d3bcb5 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Get console output
Jan 22 17:35:32 compute-0 nova_compute[183075]: 2026-01-22 17:35:32.240 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:35:32 compute-0 nova_compute[183075]: 2026-01-22 17:35:32.247 183079 INFO nova.compute.manager [None req-4fa77157-6c51-4f78-a9d6-42bd524bf8cb bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Get console output
Jan 22 17:35:32 compute-0 nova_compute[183075]: 2026-01-22 17:35:32.253 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:35:33 compute-0 nova_compute[183075]: 2026-01-22 17:35:33.012 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:34 compute-0 nova_compute[183075]: 2026-01-22 17:35:34.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:35:34 compute-0 nova_compute[183075]: 2026-01-22 17:35:34.849 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:36 compute-0 ovn_controller[95372]: 2026-01-22T17:35:36Z|00118|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7a:64:5f 10.100.0.6
Jan 22 17:35:36 compute-0 ovn_controller[95372]: 2026-01-22T17:35:36Z|00119|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7a:64:5f 10.100.0.6
Jan 22 17:35:37 compute-0 nova_compute[183075]: 2026-01-22 17:35:37.356 183079 INFO nova.compute.manager [None req-e7471b33-a065-4696-bc20-6fed2bb0b9da 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Get console output
Jan 22 17:35:37 compute-0 nova_compute[183075]: 2026-01-22 17:35:37.361 183079 INFO nova.compute.manager [None req-2fa1ac2f-9037-41a2-a3d9-b1e1117f4d3c bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Get console output
Jan 22 17:35:37 compute-0 nova_compute[183075]: 2026-01-22 17:35:37.363 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:35:37 compute-0 nova_compute[183075]: 2026-01-22 17:35:37.367 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:35:37 compute-0 nova_compute[183075]: 2026-01-22 17:35:37.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:35:37 compute-0 nova_compute[183075]: 2026-01-22 17:35:37.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:35:38 compute-0 nova_compute[183075]: 2026-01-22 17:35:38.015 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:38 compute-0 podman[235182]: 2026-01-22 17:35:38.371388904 +0000 UTC m=+0.066240484 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., release=1755695350, maintainer=Red Hat, Inc., vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, name=ubi9-minimal)
Jan 22 17:35:38 compute-0 podman[235181]: 2026-01-22 17:35:38.372771172 +0000 UTC m=+0.062454921 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:35:38 compute-0 podman[235180]: 2026-01-22 17:35:38.394455872 +0000 UTC m=+0.095083789 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 17:35:38 compute-0 nova_compute[183075]: 2026-01-22 17:35:38.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:35:38 compute-0 nova_compute[183075]: 2026-01-22 17:35:38.789 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:35:38 compute-0 nova_compute[183075]: 2026-01-22 17:35:38.789 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:35:39 compute-0 nova_compute[183075]: 2026-01-22 17:35:39.149 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-f62cb90c-e99d-43d4-bbac-06a79d9b1182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:35:39 compute-0 nova_compute[183075]: 2026-01-22 17:35:39.149 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-f62cb90c-e99d-43d4-bbac-06a79d9b1182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:35:39 compute-0 nova_compute[183075]: 2026-01-22 17:35:39.149 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:35:39 compute-0 nova_compute[183075]: 2026-01-22 17:35:39.150 183079 DEBUG nova.objects.instance [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f62cb90c-e99d-43d4-bbac-06a79d9b1182 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:35:39 compute-0 nova_compute[183075]: 2026-01-22 17:35:39.852 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:41 compute-0 nova_compute[183075]: 2026-01-22 17:35:41.565 183079 DEBUG nova.compute.manager [req-bd347ffc-d7dc-42dc-9020-2f93bb18e43e req-87ddaa67-7e31-4622-9c56-a62660fa0f18 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Received event network-changed-509507c8-2a57-4e8e-aece-de4608aa6284 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:35:41 compute-0 nova_compute[183075]: 2026-01-22 17:35:41.566 183079 DEBUG nova.compute.manager [req-bd347ffc-d7dc-42dc-9020-2f93bb18e43e req-87ddaa67-7e31-4622-9c56-a62660fa0f18 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Refreshing instance network info cache due to event network-changed-509507c8-2a57-4e8e-aece-de4608aa6284. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:35:41 compute-0 nova_compute[183075]: 2026-01-22 17:35:41.566 183079 DEBUG oslo_concurrency.lockutils [req-bd347ffc-d7dc-42dc-9020-2f93bb18e43e req-87ddaa67-7e31-4622-9c56-a62660fa0f18 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-f62cb90c-e99d-43d4-bbac-06a79d9b1182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:35:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:41.951 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:35:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:41.952 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:35:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:41.953 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:35:42 compute-0 nova_compute[183075]: 2026-01-22 17:35:42.227 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Updating instance_info_cache with network_info: [{"id": "509507c8-2a57-4e8e-aece-de4608aa6284", "address": "fa:16:3e:04:a8:2d", "network": {"id": "66f028c0-4f7e-4541-a188-d929c83780ea", "bridge": "br-int", "label": "tempest-test-network--1307842537", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8cfd5f99a92142bd829974004d0e603e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap509507c8-2a", "ovs_interfaceid": "509507c8-2a57-4e8e-aece-de4608aa6284", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:35:42 compute-0 nova_compute[183075]: 2026-01-22 17:35:42.251 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-f62cb90c-e99d-43d4-bbac-06a79d9b1182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:35:42 compute-0 nova_compute[183075]: 2026-01-22 17:35:42.251 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:35:42 compute-0 nova_compute[183075]: 2026-01-22 17:35:42.251 183079 DEBUG oslo_concurrency.lockutils [req-bd347ffc-d7dc-42dc-9020-2f93bb18e43e req-87ddaa67-7e31-4622-9c56-a62660fa0f18 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-f62cb90c-e99d-43d4-bbac-06a79d9b1182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:35:42 compute-0 nova_compute[183075]: 2026-01-22 17:35:42.252 183079 DEBUG nova.network.neutron [req-bd347ffc-d7dc-42dc-9020-2f93bb18e43e req-87ddaa67-7e31-4622-9c56-a62660fa0f18 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Refreshing network info cache for port 509507c8-2a57-4e8e-aece-de4608aa6284 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:35:42 compute-0 nova_compute[183075]: 2026-01-22 17:35:42.252 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:35:42 compute-0 nova_compute[183075]: 2026-01-22 17:35:42.253 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:35:42 compute-0 nova_compute[183075]: 2026-01-22 17:35:42.274 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:35:42 compute-0 nova_compute[183075]: 2026-01-22 17:35:42.274 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:35:42 compute-0 nova_compute[183075]: 2026-01-22 17:35:42.274 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:35:42 compute-0 nova_compute[183075]: 2026-01-22 17:35:42.275 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:35:42 compute-0 nova_compute[183075]: 2026-01-22 17:35:42.355 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/585cde24-8038-40b2-97ce-5d30e6ecfc03/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:42.412 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:42.413 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: f8ae4b18-347c-4ff3-b9f8-578518ecd408 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:35:42 compute-0 nova_compute[183075]: 2026-01-22 17:35:42.417 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/585cde24-8038-40b2-97ce-5d30e6ecfc03/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:35:42 compute-0 nova_compute[183075]: 2026-01-22 17:35:42.418 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/585cde24-8038-40b2-97ce-5d30e6ecfc03/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:35:42 compute-0 nova_compute[183075]: 2026-01-22 17:35:42.483 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/585cde24-8038-40b2-97ce-5d30e6ecfc03/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:35:42 compute-0 nova_compute[183075]: 2026-01-22 17:35:42.492 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f62cb90c-e99d-43d4-bbac-06a79d9b1182/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:35:42 compute-0 nova_compute[183075]: 2026-01-22 17:35:42.539 183079 INFO nova.compute.manager [None req-6c88a70c-d4d6-4a13-9c1b-4fcdb498bafd 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Get console output
Jan 22 17:35:42 compute-0 nova_compute[183075]: 2026-01-22 17:35:42.547 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:35:42 compute-0 nova_compute[183075]: 2026-01-22 17:35:42.554 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f62cb90c-e99d-43d4-bbac-06a79d9b1182/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:35:42 compute-0 nova_compute[183075]: 2026-01-22 17:35:42.555 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f62cb90c-e99d-43d4-bbac-06a79d9b1182/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:35:42 compute-0 nova_compute[183075]: 2026-01-22 17:35:42.612 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f62cb90c-e99d-43d4-bbac-06a79d9b1182/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:35:42 compute-0 nova_compute[183075]: 2026-01-22 17:35:42.748 183079 DEBUG nova.compute.manager [req-3c3926d8-ef95-46ed-a9ce-c9b81f60aeb6 req-1ccb6cb1-6003-4385-ad89-526b5000de25 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Received event network-changed-509507c8-2a57-4e8e-aece-de4608aa6284 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:35:42 compute-0 nova_compute[183075]: 2026-01-22 17:35:42.748 183079 DEBUG nova.compute.manager [req-3c3926d8-ef95-46ed-a9ce-c9b81f60aeb6 req-1ccb6cb1-6003-4385-ad89-526b5000de25 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Refreshing instance network info cache due to event network-changed-509507c8-2a57-4e8e-aece-de4608aa6284. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:35:42 compute-0 nova_compute[183075]: 2026-01-22 17:35:42.749 183079 DEBUG oslo_concurrency.lockutils [req-3c3926d8-ef95-46ed-a9ce-c9b81f60aeb6 req-1ccb6cb1-6003-4385-ad89-526b5000de25 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-f62cb90c-e99d-43d4-bbac-06a79d9b1182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:35:42 compute-0 nova_compute[183075]: 2026-01-22 17:35:42.824 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:35:42 compute-0 nova_compute[183075]: 2026-01-22 17:35:42.825 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5376MB free_disk=73.3040542602539GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:35:42 compute-0 nova_compute[183075]: 2026-01-22 17:35:42.825 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:35:42 compute-0 nova_compute[183075]: 2026-01-22 17:35:42.825 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:42.846 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:42.846 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.4331608
Jan 22 17:35:42 compute-0 haproxy-metadata-proxy-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235125]: 10.100.0.6:60144 [22/Jan/2026:17:35:42.412] listener listener/metadata 0/0/0/434/434 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:42.856 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:42.857 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: f8ae4b18-347c-4ff3-b9f8-578518ecd408 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:42.876 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:42.877 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 166 time: 0.0197175
Jan 22 17:35:42 compute-0 haproxy-metadata-proxy-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235125]: 10.100.0.6:60152 [22/Jan/2026:17:35:42.855] listener listener/metadata 0/0/0/21/21 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:42.883 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:42.884 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: f8ae4b18-347c-4ff3-b9f8-578518ecd408 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:42.901 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:42.901 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0175376
Jan 22 17:35:42 compute-0 haproxy-metadata-proxy-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235125]: 10.100.0.6:60164 [22/Jan/2026:17:35:42.883] listener listener/metadata 0/0/0/18/18 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:42.906 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:35:42 compute-0 nova_compute[183075]: 2026-01-22 17:35:42.906 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance f62cb90c-e99d-43d4-bbac-06a79d9b1182 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:35:42 compute-0 nova_compute[183075]: 2026-01-22 17:35:42.907 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 585cde24-8038-40b2-97ce-5d30e6ecfc03 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:35:42 compute-0 nova_compute[183075]: 2026-01-22 17:35:42.907 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:42.907 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: f8ae4b18-347c-4ff3-b9f8-578518ecd408 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:35:42 compute-0 nova_compute[183075]: 2026-01-22 17:35:42.907 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:42.922 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:42.922 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0151129
Jan 22 17:35:42 compute-0 haproxy-metadata-proxy-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235125]: 10.100.0.6:60172 [22/Jan/2026:17:35:42.906] listener listener/metadata 0/0/0/16/16 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:42.927 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:42.927 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: f8ae4b18-347c-4ff3-b9f8-578518ecd408 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:42.942 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:42.942 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0150235
Jan 22 17:35:42 compute-0 haproxy-metadata-proxy-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235125]: 10.100.0.6:60188 [22/Jan/2026:17:35:42.926] listener listener/metadata 0/0/0/16/16 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:42.946 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:42.947 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: f8ae4b18-347c-4ff3-b9f8-578518ecd408 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:42.964 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:42.964 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0170994
Jan 22 17:35:42 compute-0 haproxy-metadata-proxy-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235125]: 10.100.0.6:60198 [22/Jan/2026:17:35:42.946] listener listener/metadata 0/0/0/18/18 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:35:42 compute-0 nova_compute[183075]: 2026-01-22 17:35:42.970 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:42.971 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:42.972 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: f8ae4b18-347c-4ff3-b9f8-578518ecd408 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:35:42 compute-0 nova_compute[183075]: 2026-01-22 17:35:42.985 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:42.988 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:42.989 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0176578
Jan 22 17:35:42 compute-0 haproxy-metadata-proxy-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235125]: 10.100.0.6:60210 [22/Jan/2026:17:35:42.970] listener listener/metadata 0/0/0/19/19 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:42.996 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:42.997 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:35:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: f8ae4b18-347c-4ff3-b9f8-578518ecd408 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.008 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.008 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.011 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.012 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 151 time: 0.0150228
Jan 22 17:35:43 compute-0 haproxy-metadata-proxy-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235125]: 10.100.0.6:60222 [22/Jan/2026:17:35:42.995] listener listener/metadata 0/0/0/16/16 200 135 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.018 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.022 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.023 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: f8ae4b18-347c-4ff3-b9f8-578518ecd408 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.046 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.046 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 165 time: 0.0234339
Jan 22 17:35:43 compute-0 haproxy-metadata-proxy-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235125]: 10.100.0.6:60232 [22/Jan/2026:17:35:43.021] listener listener/metadata 0/0/0/25/25 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.051 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.052 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: f8ae4b18-347c-4ff3-b9f8-578518ecd408 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.066 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.066 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 165 time: 0.0143242
Jan 22 17:35:43 compute-0 haproxy-metadata-proxy-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235125]: 10.100.0.6:60236 [22/Jan/2026:17:35:43.051] listener listener/metadata 0/0/0/15/15 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.070 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.071 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: f8ae4b18-347c-4ff3-b9f8-578518ecd408 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.087 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0155070
Jan 22 17:35:43 compute-0 haproxy-metadata-proxy-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235125]: 10.100.0.6:60240 [22/Jan/2026:17:35:43.070] listener listener/metadata 0/0/0/16/16 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.098 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.098 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: f8ae4b18-347c-4ff3-b9f8-578518ecd408 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.113 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.113 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0149128
Jan 22 17:35:43 compute-0 haproxy-metadata-proxy-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235125]: 10.100.0.6:60250 [22/Jan/2026:17:35:43.097] listener listener/metadata 0/0/0/16/16 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.117 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.118 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: f8ae4b18-347c-4ff3-b9f8-578518ecd408 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.132 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.132 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0144389
Jan 22 17:35:43 compute-0 haproxy-metadata-proxy-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235125]: 10.100.0.6:60252 [22/Jan/2026:17:35:43.117] listener listener/metadata 0/0/0/15/15 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.137 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.138 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: f8ae4b18-347c-4ff3-b9f8-578518ecd408 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.151 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.151 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0138071
Jan 22 17:35:43 compute-0 haproxy-metadata-proxy-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235125]: 10.100.0.6:60254 [22/Jan/2026:17:35:43.136] listener listener/metadata 0/0/0/14/14 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.156 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.157 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: f8ae4b18-347c-4ff3-b9f8-578518ecd408 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.174 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.175 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 165 time: 0.0183039
Jan 22 17:35:43 compute-0 haproxy-metadata-proxy-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235125]: 10.100.0.6:60262 [22/Jan/2026:17:35:43.155] listener listener/metadata 0/0/0/19/19 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.180 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.181 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: f8ae4b18-347c-4ff3-b9f8-578518ecd408 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.194 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.195 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0136802
Jan 22 17:35:43 compute-0 haproxy-metadata-proxy-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235125]: 10.100.0.6:60272 [22/Jan/2026:17:35:43.180] listener listener/metadata 0/0/0/14/14 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:35:43 compute-0 podman[235258]: 2026-01-22 17:35:43.338519911 +0000 UTC m=+0.048305896 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.419 183079 DEBUG nova.network.neutron [req-bd347ffc-d7dc-42dc-9020-2f93bb18e43e req-87ddaa67-7e31-4622-9c56-a62660fa0f18 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Updated VIF entry in instance network info cache for port 509507c8-2a57-4e8e-aece-de4608aa6284. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.420 183079 DEBUG nova.network.neutron [req-bd347ffc-d7dc-42dc-9020-2f93bb18e43e req-87ddaa67-7e31-4622-9c56-a62660fa0f18 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Updating instance_info_cache with network_info: [{"id": "509507c8-2a57-4e8e-aece-de4608aa6284", "address": "fa:16:3e:04:a8:2d", "network": {"id": "66f028c0-4f7e-4541-a188-d929c83780ea", "bridge": "br-int", "label": "tempest-test-network--1307842537", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8cfd5f99a92142bd829974004d0e603e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap509507c8-2a", "ovs_interfaceid": "509507c8-2a57-4e8e-aece-de4608aa6284", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.445 183079 DEBUG oslo_concurrency.lockutils [req-bd347ffc-d7dc-42dc-9020-2f93bb18e43e req-87ddaa67-7e31-4622-9c56-a62660fa0f18 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-f62cb90c-e99d-43d4-bbac-06a79d9b1182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.446 183079 DEBUG oslo_concurrency.lockutils [req-3c3926d8-ef95-46ed-a9ce-c9b81f60aeb6 req-1ccb6cb1-6003-4385-ad89-526b5000de25 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-f62cb90c-e99d-43d4-bbac-06a79d9b1182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.446 183079 DEBUG nova.network.neutron [req-3c3926d8-ef95-46ed-a9ce-c9b81f60aeb6 req-1ccb6cb1-6003-4385-ad89-526b5000de25 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Refreshing network info cache for port 509507c8-2a57-4e8e-aece-de4608aa6284 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.467 183079 DEBUG oslo_concurrency.lockutils [None req-f117b40b-eccb-4aa5-a441-b3b32d83b031 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Acquiring lock "f62cb90c-e99d-43d4-bbac-06a79d9b1182" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.468 183079 DEBUG oslo_concurrency.lockutils [None req-f117b40b-eccb-4aa5-a441-b3b32d83b031 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "f62cb90c-e99d-43d4-bbac-06a79d9b1182" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.468 183079 DEBUG oslo_concurrency.lockutils [None req-f117b40b-eccb-4aa5-a441-b3b32d83b031 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Acquiring lock "f62cb90c-e99d-43d4-bbac-06a79d9b1182-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.468 183079 DEBUG oslo_concurrency.lockutils [None req-f117b40b-eccb-4aa5-a441-b3b32d83b031 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "f62cb90c-e99d-43d4-bbac-06a79d9b1182-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.468 183079 DEBUG oslo_concurrency.lockutils [None req-f117b40b-eccb-4aa5-a441-b3b32d83b031 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "f62cb90c-e99d-43d4-bbac-06a79d9b1182-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.469 183079 INFO nova.compute.manager [None req-f117b40b-eccb-4aa5-a441-b3b32d83b031 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Terminating instance
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.470 183079 DEBUG nova.compute.manager [None req-f117b40b-eccb-4aa5-a441-b3b32d83b031 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:35:43 compute-0 kernel: tap509507c8-2a (unregistering): left promiscuous mode
Jan 22 17:35:43 compute-0 NetworkManager[55454]: <info>  [1769103343.4887] device (tap509507c8-2a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:35:43 compute-0 ovn_controller[95372]: 2026-01-22T17:35:43Z|00638|binding|INFO|Releasing lport 509507c8-2a57-4e8e-aece-de4608aa6284 from this chassis (sb_readonly=0)
Jan 22 17:35:43 compute-0 ovn_controller[95372]: 2026-01-22T17:35:43Z|00639|binding|INFO|Setting lport 509507c8-2a57-4e8e-aece-de4608aa6284 down in Southbound
Jan 22 17:35:43 compute-0 ovn_controller[95372]: 2026-01-22T17:35:43Z|00640|binding|INFO|Removing iface tap509507c8-2a ovn-installed in OVS
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.494 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.500 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:a8:2d 10.100.0.7'], port_security=['fa:16:3e:04:a8:2d 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'f62cb90c-e99d-43d4-bbac-06a79d9b1182', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66f028c0-4f7e-4541-a188-d929c83780ea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8cfd5f99a92142bd829974004d0e603e', 'neutron:revision_number': '6', 'neutron:security_group_ids': '759464aa-5d78-4dff-baf2-2a2f16ceb397', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.238'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8e9a6e9-2a99-4df6-8e05-39265d2551ea, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=509507c8-2a57-4e8e-aece-de4608aa6284) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.501 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 509507c8-2a57-4e8e-aece-de4608aa6284 in datapath 66f028c0-4f7e-4541-a188-d929c83780ea unbound from our chassis
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.503 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 66f028c0-4f7e-4541-a188-d929c83780ea, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.505 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7e1136bf-4b5b-429c-9cd6-d35bef4752ef]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.505 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-66f028c0-4f7e-4541-a188-d929c83780ea namespace which is not needed anymore
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.518 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:43 compute-0 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d00000039.scope: Deactivated successfully.
Jan 22 17:35:43 compute-0 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d00000039.scope: Consumed 14.672s CPU time.
Jan 22 17:35:43 compute-0 systemd-machined[154382]: Machine qemu-57-instance-00000039 terminated.
Jan 22 17:35:43 compute-0 neutron-haproxy-ovnmeta-66f028c0-4f7e-4541-a188-d929c83780ea[234803]: [NOTICE]   (234807) : haproxy version is 2.8.14-c23fe91
Jan 22 17:35:43 compute-0 neutron-haproxy-ovnmeta-66f028c0-4f7e-4541-a188-d929c83780ea[234803]: [NOTICE]   (234807) : path to executable is /usr/sbin/haproxy
Jan 22 17:35:43 compute-0 neutron-haproxy-ovnmeta-66f028c0-4f7e-4541-a188-d929c83780ea[234803]: [WARNING]  (234807) : Exiting Master process...
Jan 22 17:35:43 compute-0 neutron-haproxy-ovnmeta-66f028c0-4f7e-4541-a188-d929c83780ea[234803]: [WARNING]  (234807) : Exiting Master process...
Jan 22 17:35:43 compute-0 neutron-haproxy-ovnmeta-66f028c0-4f7e-4541-a188-d929c83780ea[234803]: [ALERT]    (234807) : Current worker (234809) exited with code 143 (Terminated)
Jan 22 17:35:43 compute-0 neutron-haproxy-ovnmeta-66f028c0-4f7e-4541-a188-d929c83780ea[234803]: [WARNING]  (234807) : All workers exited. Exiting... (0)
Jan 22 17:35:43 compute-0 systemd[1]: libpod-6e1315e81220306b6aa7b1216a98dfcd2f1e5da3570f0b4df8fac07374687f98.scope: Deactivated successfully.
Jan 22 17:35:43 compute-0 podman[235303]: 2026-01-22 17:35:43.646261509 +0000 UTC m=+0.049533310 container died 6e1315e81220306b6aa7b1216a98dfcd2f1e5da3570f0b4df8fac07374687f98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66f028c0-4f7e-4541-a188-d929c83780ea, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:35:43 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6e1315e81220306b6aa7b1216a98dfcd2f1e5da3570f0b4df8fac07374687f98-userdata-shm.mount: Deactivated successfully.
Jan 22 17:35:43 compute-0 systemd[1]: var-lib-containers-storage-overlay-1a7a0a85aebe7bbd1f4bcd3c320fa18891e7f5de1f49efe07c635bbefa9058c2-merged.mount: Deactivated successfully.
Jan 22 17:35:43 compute-0 podman[235303]: 2026-01-22 17:35:43.691178922 +0000 UTC m=+0.094450683 container cleanup 6e1315e81220306b6aa7b1216a98dfcd2f1e5da3570f0b4df8fac07374687f98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66f028c0-4f7e-4541-a188-d929c83780ea, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:35:43 compute-0 systemd[1]: libpod-conmon-6e1315e81220306b6aa7b1216a98dfcd2f1e5da3570f0b4df8fac07374687f98.scope: Deactivated successfully.
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.721 183079 INFO nova.virt.libvirt.driver [-] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Instance destroyed successfully.
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.723 183079 DEBUG nova.objects.instance [None req-f117b40b-eccb-4aa5-a441-b3b32d83b031 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lazy-loading 'resources' on Instance uuid f62cb90c-e99d-43d4-bbac-06a79d9b1182 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.739 183079 DEBUG nova.virt.libvirt.vif [None req-f117b40b-eccb-4aa5-a441-b3b32d83b031 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:34:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1825252520',display_name='tempest-server-test-1825252520',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1825252520',id=57,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGomFw5Z3VR9MENpYtKeaxDUxtNH6GlF1leYJJ98KvwYvEsTZM9n4Wa79zXCEZq0VbSC6BefWHPMVcmuZdoxNG4KqsiiHTSMIJygbk2qiUn1+nbdXOT2eC+L+LWWgtGW8A==',key_name='tempest-keypair-test-857257872',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:34:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8cfd5f99a92142bd829974004d0e603e',ramdisk_id='',reservation_id='r-0g4xaa0m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw
_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-PortSecurityTest-1796635121',owner_user_name='tempest-PortSecurityTest-1796635121-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:34:46Z,user_data=None,user_id='bf68afec168c4aa2ba7e47fdb3b026af',uuid=f62cb90c-e99d-43d4-bbac-06a79d9b1182,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "509507c8-2a57-4e8e-aece-de4608aa6284", "address": "fa:16:3e:04:a8:2d", "network": {"id": "66f028c0-4f7e-4541-a188-d929c83780ea", "bridge": "br-int", "label": "tempest-test-network--1307842537", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8cfd5f99a92142bd829974004d0e603e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap509507c8-2a", "ovs_interfaceid": "509507c8-2a57-4e8e-aece-de4608aa6284", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.740 183079 DEBUG nova.network.os_vif_util [None req-f117b40b-eccb-4aa5-a441-b3b32d83b031 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Converting VIF {"id": "509507c8-2a57-4e8e-aece-de4608aa6284", "address": "fa:16:3e:04:a8:2d", "network": {"id": "66f028c0-4f7e-4541-a188-d929c83780ea", "bridge": "br-int", "label": "tempest-test-network--1307842537", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8cfd5f99a92142bd829974004d0e603e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap509507c8-2a", "ovs_interfaceid": "509507c8-2a57-4e8e-aece-de4608aa6284", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.741 183079 DEBUG nova.network.os_vif_util [None req-f117b40b-eccb-4aa5-a441-b3b32d83b031 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:04:a8:2d,bridge_name='br-int',has_traffic_filtering=True,id=509507c8-2a57-4e8e-aece-de4608aa6284,network=Network(66f028c0-4f7e-4541-a188-d929c83780ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap509507c8-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.741 183079 DEBUG os_vif [None req-f117b40b-eccb-4aa5-a441-b3b32d83b031 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:04:a8:2d,bridge_name='br-int',has_traffic_filtering=True,id=509507c8-2a57-4e8e-aece-de4608aa6284,network=Network(66f028c0-4f7e-4541-a188-d929c83780ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap509507c8-2a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.743 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.743 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap509507c8-2a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.746 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.748 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.750 183079 INFO os_vif [None req-f117b40b-eccb-4aa5-a441-b3b32d83b031 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:04:a8:2d,bridge_name='br-int',has_traffic_filtering=True,id=509507c8-2a57-4e8e-aece-de4608aa6284,network=Network(66f028c0-4f7e-4541-a188-d929c83780ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap509507c8-2a')
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.750 183079 INFO nova.virt.libvirt.driver [None req-f117b40b-eccb-4aa5-a441-b3b32d83b031 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Deleting instance files /var/lib/nova/instances/f62cb90c-e99d-43d4-bbac-06a79d9b1182_del
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.751 183079 INFO nova.virt.libvirt.driver [None req-f117b40b-eccb-4aa5-a441-b3b32d83b031 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Deletion of /var/lib/nova/instances/f62cb90c-e99d-43d4-bbac-06a79d9b1182_del complete
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.807 183079 INFO nova.compute.manager [None req-f117b40b-eccb-4aa5-a441-b3b32d83b031 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Took 0.34 seconds to destroy the instance on the hypervisor.
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.807 183079 DEBUG oslo.service.loopingcall [None req-f117b40b-eccb-4aa5-a441-b3b32d83b031 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.807 183079 DEBUG nova.compute.manager [-] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.807 183079 DEBUG nova.network.neutron [-] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:35:43 compute-0 podman[235344]: 2026-01-22 17:35:43.823128213 +0000 UTC m=+0.105695688 container remove 6e1315e81220306b6aa7b1216a98dfcd2f1e5da3570f0b4df8fac07374687f98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66f028c0-4f7e-4541-a188-d929c83780ea, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.829 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4109ce03-f94b-4f7d-9e52-867fc4c45163]: (4, ('Thu Jan 22 05:35:43 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-66f028c0-4f7e-4541-a188-d929c83780ea (6e1315e81220306b6aa7b1216a98dfcd2f1e5da3570f0b4df8fac07374687f98)\n6e1315e81220306b6aa7b1216a98dfcd2f1e5da3570f0b4df8fac07374687f98\nThu Jan 22 05:35:43 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-66f028c0-4f7e-4541-a188-d929c83780ea (6e1315e81220306b6aa7b1216a98dfcd2f1e5da3570f0b4df8fac07374687f98)\n6e1315e81220306b6aa7b1216a98dfcd2f1e5da3570f0b4df8fac07374687f98\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.831 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[9eb2872d-8436-4d80-84ce-cc9ef86bf3f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.834 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66f028c0-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.837 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:43 compute-0 kernel: tap66f028c0-40: left promiscuous mode
Jan 22 17:35:43 compute-0 nova_compute[183075]: 2026-01-22 17:35:43.850 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.854 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[9e75d9f1-08d6-40a9-a21f-1673d6353599]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.868 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[89c3d04b-f862-468b-9eaf-55096fa6d4d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.869 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b761990b-c2a6-4550-9f6f-4067c99093ed]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.886 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[663beb7c-bc5e-49d8-bb46-9dbe37764f8c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 564741, 'reachable_time': 43586, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235361, 'error': None, 'target': 'ovnmeta-66f028c0-4f7e-4541-a188-d929c83780ea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:35:43 compute-0 systemd[1]: run-netns-ovnmeta\x2d66f028c0\x2d4f7e\x2d4541\x2da188\x2dd929c83780ea.mount: Deactivated successfully.
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.891 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-66f028c0-4f7e-4541-a188-d929c83780ea deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:35:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:35:43.891 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[e451ebd4-c3e8-4a33-879e-8e797b0025ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:35:45 compute-0 nova_compute[183075]: 2026-01-22 17:35:45.380 183079 DEBUG nova.compute.manager [req-44de41ff-331a-4aca-bbb8-9f5cd1969803 req-04e34869-43f1-4b5d-ade0-4ad0b4f26697 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Received event network-changed-509507c8-2a57-4e8e-aece-de4608aa6284 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:35:45 compute-0 nova_compute[183075]: 2026-01-22 17:35:45.380 183079 DEBUG nova.compute.manager [req-44de41ff-331a-4aca-bbb8-9f5cd1969803 req-04e34869-43f1-4b5d-ade0-4ad0b4f26697 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Refreshing instance network info cache due to event network-changed-509507c8-2a57-4e8e-aece-de4608aa6284. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:35:45 compute-0 nova_compute[183075]: 2026-01-22 17:35:45.380 183079 DEBUG oslo_concurrency.lockutils [req-44de41ff-331a-4aca-bbb8-9f5cd1969803 req-04e34869-43f1-4b5d-ade0-4ad0b4f26697 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-f62cb90c-e99d-43d4-bbac-06a79d9b1182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:35:46 compute-0 nova_compute[183075]: 2026-01-22 17:35:46.125 183079 DEBUG nova.network.neutron [-] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:35:46 compute-0 nova_compute[183075]: 2026-01-22 17:35:46.146 183079 INFO nova.compute.manager [-] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Took 2.34 seconds to deallocate network for instance.
Jan 22 17:35:46 compute-0 nova_compute[183075]: 2026-01-22 17:35:46.210 183079 DEBUG oslo_concurrency.lockutils [None req-f117b40b-eccb-4aa5-a441-b3b32d83b031 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:35:46 compute-0 nova_compute[183075]: 2026-01-22 17:35:46.211 183079 DEBUG oslo_concurrency.lockutils [None req-f117b40b-eccb-4aa5-a441-b3b32d83b031 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:35:46 compute-0 nova_compute[183075]: 2026-01-22 17:35:46.289 183079 DEBUG nova.compute.provider_tree [None req-f117b40b-eccb-4aa5-a441-b3b32d83b031 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:35:46 compute-0 nova_compute[183075]: 2026-01-22 17:35:46.484 183079 DEBUG nova.scheduler.client.report [None req-f117b40b-eccb-4aa5-a441-b3b32d83b031 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:35:46 compute-0 nova_compute[183075]: 2026-01-22 17:35:46.625 183079 DEBUG oslo_concurrency.lockutils [None req-f117b40b-eccb-4aa5-a441-b3b32d83b031 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.415s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:35:46 compute-0 nova_compute[183075]: 2026-01-22 17:35:46.629 183079 DEBUG nova.network.neutron [req-3c3926d8-ef95-46ed-a9ce-c9b81f60aeb6 req-1ccb6cb1-6003-4385-ad89-526b5000de25 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Updated VIF entry in instance network info cache for port 509507c8-2a57-4e8e-aece-de4608aa6284. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:35:46 compute-0 nova_compute[183075]: 2026-01-22 17:35:46.629 183079 DEBUG nova.network.neutron [req-3c3926d8-ef95-46ed-a9ce-c9b81f60aeb6 req-1ccb6cb1-6003-4385-ad89-526b5000de25 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Updating instance_info_cache with network_info: [{"id": "509507c8-2a57-4e8e-aece-de4608aa6284", "address": "fa:16:3e:04:a8:2d", "network": {"id": "66f028c0-4f7e-4541-a188-d929c83780ea", "bridge": "br-int", "label": "tempest-test-network--1307842537", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8cfd5f99a92142bd829974004d0e603e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap509507c8-2a", "ovs_interfaceid": "509507c8-2a57-4e8e-aece-de4608aa6284", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:35:46 compute-0 nova_compute[183075]: 2026-01-22 17:35:46.666 183079 INFO nova.scheduler.client.report [None req-f117b40b-eccb-4aa5-a441-b3b32d83b031 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Deleted allocations for instance f62cb90c-e99d-43d4-bbac-06a79d9b1182
Jan 22 17:35:46 compute-0 nova_compute[183075]: 2026-01-22 17:35:46.686 183079 DEBUG oslo_concurrency.lockutils [req-3c3926d8-ef95-46ed-a9ce-c9b81f60aeb6 req-1ccb6cb1-6003-4385-ad89-526b5000de25 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-f62cb90c-e99d-43d4-bbac-06a79d9b1182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:35:46 compute-0 nova_compute[183075]: 2026-01-22 17:35:46.689 183079 DEBUG oslo_concurrency.lockutils [req-44de41ff-331a-4aca-bbb8-9f5cd1969803 req-04e34869-43f1-4b5d-ade0-4ad0b4f26697 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-f62cb90c-e99d-43d4-bbac-06a79d9b1182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:35:46 compute-0 nova_compute[183075]: 2026-01-22 17:35:46.689 183079 DEBUG nova.network.neutron [req-44de41ff-331a-4aca-bbb8-9f5cd1969803 req-04e34869-43f1-4b5d-ade0-4ad0b4f26697 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Refreshing network info cache for port 509507c8-2a57-4e8e-aece-de4608aa6284 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:35:46 compute-0 nova_compute[183075]: 2026-01-22 17:35:46.739 183079 DEBUG oslo_concurrency.lockutils [None req-f117b40b-eccb-4aa5-a441-b3b32d83b031 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "f62cb90c-e99d-43d4-bbac-06a79d9b1182" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.272s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:35:47 compute-0 nova_compute[183075]: 2026-01-22 17:35:47.163 183079 INFO nova.network.neutron [req-44de41ff-331a-4aca-bbb8-9f5cd1969803 req-04e34869-43f1-4b5d-ade0-4ad0b4f26697 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Port 509507c8-2a57-4e8e-aece-de4608aa6284 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Jan 22 17:35:47 compute-0 nova_compute[183075]: 2026-01-22 17:35:47.164 183079 DEBUG nova.network.neutron [req-44de41ff-331a-4aca-bbb8-9f5cd1969803 req-04e34869-43f1-4b5d-ade0-4ad0b4f26697 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:35:47 compute-0 nova_compute[183075]: 2026-01-22 17:35:47.243 183079 DEBUG oslo_concurrency.lockutils [req-44de41ff-331a-4aca-bbb8-9f5cd1969803 req-04e34869-43f1-4b5d-ade0-4ad0b4f26697 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-f62cb90c-e99d-43d4-bbac-06a79d9b1182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:35:47 compute-0 nova_compute[183075]: 2026-01-22 17:35:47.244 183079 DEBUG nova.compute.manager [req-44de41ff-331a-4aca-bbb8-9f5cd1969803 req-04e34869-43f1-4b5d-ade0-4ad0b4f26697 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Received event network-vif-unplugged-509507c8-2a57-4e8e-aece-de4608aa6284 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:35:47 compute-0 nova_compute[183075]: 2026-01-22 17:35:47.244 183079 DEBUG oslo_concurrency.lockutils [req-44de41ff-331a-4aca-bbb8-9f5cd1969803 req-04e34869-43f1-4b5d-ade0-4ad0b4f26697 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "f62cb90c-e99d-43d4-bbac-06a79d9b1182-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:35:47 compute-0 nova_compute[183075]: 2026-01-22 17:35:47.244 183079 DEBUG oslo_concurrency.lockutils [req-44de41ff-331a-4aca-bbb8-9f5cd1969803 req-04e34869-43f1-4b5d-ade0-4ad0b4f26697 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "f62cb90c-e99d-43d4-bbac-06a79d9b1182-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:35:47 compute-0 nova_compute[183075]: 2026-01-22 17:35:47.244 183079 DEBUG oslo_concurrency.lockutils [req-44de41ff-331a-4aca-bbb8-9f5cd1969803 req-04e34869-43f1-4b5d-ade0-4ad0b4f26697 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "f62cb90c-e99d-43d4-bbac-06a79d9b1182-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:35:47 compute-0 nova_compute[183075]: 2026-01-22 17:35:47.245 183079 DEBUG nova.compute.manager [req-44de41ff-331a-4aca-bbb8-9f5cd1969803 req-04e34869-43f1-4b5d-ade0-4ad0b4f26697 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] No waiting events found dispatching network-vif-unplugged-509507c8-2a57-4e8e-aece-de4608aa6284 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:35:47 compute-0 nova_compute[183075]: 2026-01-22 17:35:47.245 183079 DEBUG nova.compute.manager [req-44de41ff-331a-4aca-bbb8-9f5cd1969803 req-04e34869-43f1-4b5d-ade0-4ad0b4f26697 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Received event network-vif-unplugged-509507c8-2a57-4e8e-aece-de4608aa6284 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:35:47 compute-0 nova_compute[183075]: 2026-01-22 17:35:47.710 183079 INFO nova.compute.manager [None req-add77f75-f03c-469d-b6b0-36107f534279 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Get console output
Jan 22 17:35:47 compute-0 nova_compute[183075]: 2026-01-22 17:35:47.715 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:35:47 compute-0 nova_compute[183075]: 2026-01-22 17:35:47.741 183079 DEBUG nova.compute.manager [req-2cf5ea7c-fa9f-44e3-925a-96346fa51e8f req-3497f8ae-f295-4af4-9bdb-a1053bc690b5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Received event network-vif-plugged-509507c8-2a57-4e8e-aece-de4608aa6284 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:35:47 compute-0 nova_compute[183075]: 2026-01-22 17:35:47.741 183079 DEBUG oslo_concurrency.lockutils [req-2cf5ea7c-fa9f-44e3-925a-96346fa51e8f req-3497f8ae-f295-4af4-9bdb-a1053bc690b5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "f62cb90c-e99d-43d4-bbac-06a79d9b1182-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:35:47 compute-0 nova_compute[183075]: 2026-01-22 17:35:47.742 183079 DEBUG oslo_concurrency.lockutils [req-2cf5ea7c-fa9f-44e3-925a-96346fa51e8f req-3497f8ae-f295-4af4-9bdb-a1053bc690b5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "f62cb90c-e99d-43d4-bbac-06a79d9b1182-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:35:47 compute-0 nova_compute[183075]: 2026-01-22 17:35:47.742 183079 DEBUG oslo_concurrency.lockutils [req-2cf5ea7c-fa9f-44e3-925a-96346fa51e8f req-3497f8ae-f295-4af4-9bdb-a1053bc690b5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "f62cb90c-e99d-43d4-bbac-06a79d9b1182-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:35:47 compute-0 nova_compute[183075]: 2026-01-22 17:35:47.742 183079 DEBUG nova.compute.manager [req-2cf5ea7c-fa9f-44e3-925a-96346fa51e8f req-3497f8ae-f295-4af4-9bdb-a1053bc690b5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] No waiting events found dispatching network-vif-plugged-509507c8-2a57-4e8e-aece-de4608aa6284 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:35:47 compute-0 nova_compute[183075]: 2026-01-22 17:35:47.742 183079 WARNING nova.compute.manager [req-2cf5ea7c-fa9f-44e3-925a-96346fa51e8f req-3497f8ae-f295-4af4-9bdb-a1053bc690b5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Received unexpected event network-vif-plugged-509507c8-2a57-4e8e-aece-de4608aa6284 for instance with vm_state deleted and task_state None.
Jan 22 17:35:47 compute-0 nova_compute[183075]: 2026-01-22 17:35:47.743 183079 DEBUG nova.compute.manager [req-2cf5ea7c-fa9f-44e3-925a-96346fa51e8f req-3497f8ae-f295-4af4-9bdb-a1053bc690b5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Received event network-vif-deleted-509507c8-2a57-4e8e-aece-de4608aa6284 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:35:47 compute-0 nova_compute[183075]: 2026-01-22 17:35:47.743 183079 INFO nova.compute.manager [req-2cf5ea7c-fa9f-44e3-925a-96346fa51e8f req-3497f8ae-f295-4af4-9bdb-a1053bc690b5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Neutron deleted interface 509507c8-2a57-4e8e-aece-de4608aa6284; detaching it from the instance and deleting it from the info cache
Jan 22 17:35:47 compute-0 nova_compute[183075]: 2026-01-22 17:35:47.743 183079 DEBUG nova.network.neutron [req-2cf5ea7c-fa9f-44e3-925a-96346fa51e8f req-3497f8ae-f295-4af4-9bdb-a1053bc690b5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Jan 22 17:35:47 compute-0 nova_compute[183075]: 2026-01-22 17:35:47.746 183079 DEBUG nova.compute.manager [req-2cf5ea7c-fa9f-44e3-925a-96346fa51e8f req-3497f8ae-f295-4af4-9bdb-a1053bc690b5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Detach interface failed, port_id=509507c8-2a57-4e8e-aece-de4608aa6284, reason: Instance f62cb90c-e99d-43d4-bbac-06a79d9b1182 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 22 17:35:48 compute-0 nova_compute[183075]: 2026-01-22 17:35:48.019 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:48 compute-0 nova_compute[183075]: 2026-01-22 17:35:48.746 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:53 compute-0 nova_compute[183075]: 2026-01-22 17:35:53.021 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:53 compute-0 nova_compute[183075]: 2026-01-22 17:35:53.749 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:54 compute-0 podman[235362]: 2026-01-22 17:35:54.352428127 +0000 UTC m=+0.060127288 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.461 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'name': 'tempest-server-test-575952224', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003a', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'hostId': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.462 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.471 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/disk.device.allocation volume: 30220288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e30184bf-886d-4424-a717-23e7f05fb235', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30220288, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03-vda', 'timestamp': '2026-01-22T17:35:55.463018', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'instance-0000003a', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ce3e51cc-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5719.223508194, 'message_signature': '788f5dac172f5b971290ba3a2b5533216a18ba7eeeb29c48346a98de3d5cd9c3'}]}, 'timestamp': '2026-01-22 17:35:55.472153', '_unique_id': 'cf2a58a57c544a999cd99ccb4a1e67f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.473 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.474 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.477 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 585cde24-8038-40b2-97ce-5d30e6ecfc03 / tap524266a0-8b inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.477 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '55f85b66-15b7-4b32-a396-fde102c7abd4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': 'instance-0000003a-585cde24-8038-40b2-97ce-5d30e6ecfc03-tap524266a0-8b', 'timestamp': '2026-01-22T17:35:55.474399', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'tap524266a0-8b', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7a:64:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap524266a0-8b'}, 'message_id': 'ce3f435c-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5719.234832582, 'message_signature': '9320ea1333e1fd789ed293cdc45b1add4bc8b132acfbdb502b2a2edab285c529'}]}, 'timestamp': '2026-01-22 17:35:55.478354', '_unique_id': 'c26ecba334144d1494e3fff7703de3a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.479 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.480 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.481 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.481 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-575952224>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-575952224>]
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.481 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.481 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4176b8ee-e404-4e00-b130-adb3715fb81d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': 'instance-0000003a-585cde24-8038-40b2-97ce-5d30e6ecfc03-tap524266a0-8b', 'timestamp': '2026-01-22T17:35:55.481944', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'tap524266a0-8b', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7a:64:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap524266a0-8b'}, 'message_id': 'ce3fe3f2-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5719.234832582, 'message_signature': '2624afc84f88789d13ac8aa37aec14028a630d75eb39b11c62a93d628de7a793'}]}, 'timestamp': '2026-01-22 17:35:55.482457', '_unique_id': 'e71ce35b9e324c88b1266fc582016859'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.483 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.484 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.485 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc71154d-c552-4a8e-9a84-74fe2fb9bb1c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03-vda', 'timestamp': '2026-01-22T17:35:55.485085', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'instance-0000003a', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ce406480-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5719.223508194, 'message_signature': 'eaea9c4bd5d22b8bfd48fc268c38ad6ad7ac66c377d4a01ef20ef16138cd9961'}]}, 'timestamp': '2026-01-22 17:35:55.485790', '_unique_id': 'ccdd500e8180441b95cf0774c7d25e30'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.486 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.488 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.504 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/disk.device.read.latency volume: 204576654 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6e6802d5-e5d5-407c-8092-262d4a1594e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 204576654, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03-vda', 'timestamp': '2026-01-22T17:35:55.488205', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'instance-0000003a', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ce4368e2-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5719.248698159, 'message_signature': '0a89b1e246973ee4e44f397ff5c525e8b401ee73dd47727eaf5094bd3ec465a3'}]}, 'timestamp': '2026-01-22 17:35:55.505530', '_unique_id': '57add4f5135d4c41b43b1cf25da86093'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.506 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.508 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.508 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b7caae10-f602-4a54-9d05-502f6ade57c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': 'instance-0000003a-585cde24-8038-40b2-97ce-5d30e6ecfc03-tap524266a0-8b', 'timestamp': '2026-01-22T17:35:55.508169', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'tap524266a0-8b', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7a:64:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap524266a0-8b'}, 'message_id': 'ce43e5d8-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5719.234832582, 'message_signature': 'e31014760da7d7d769c0ae885699b0a800e12109bdca3109080bd1774a13e029'}]}, 'timestamp': '2026-01-22 17:35:55.508753', '_unique_id': '8b5a1ff860954c0fab09697afabe6e62'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.509 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.510 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.511 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.511 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-575952224>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-575952224>]
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.511 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.511 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3bb0554-2cfa-4bd6-aa50-fc607fb72546', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': 'instance-0000003a-585cde24-8038-40b2-97ce-5d30e6ecfc03-tap524266a0-8b', 'timestamp': '2026-01-22T17:35:55.511745', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'tap524266a0-8b', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7a:64:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap524266a0-8b'}, 'message_id': 'ce446fee-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5719.234832582, 'message_signature': '1f1cb3ff0f768a11134581818e4846b0f8e0d13c1bd63ae72a526a9356aa9768'}]}, 'timestamp': '2026-01-22 17:35:55.512260', '_unique_id': 'ad7d9d0772c64209b67e0e3b38ec4b5c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.513 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.514 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.514 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd1dd5505-ece1-406c-9b1c-c65748b2b9fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': 'instance-0000003a-585cde24-8038-40b2-97ce-5d30e6ecfc03-tap524266a0-8b', 'timestamp': '2026-01-22T17:35:55.514672', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'tap524266a0-8b', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7a:64:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap524266a0-8b'}, 'message_id': 'ce44e35c-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5719.234832582, 'message_signature': 'dd84fe9c6d22ec19753518de0d569e8317ee92f5b84897ea7835519ce01b49a6'}]}, 'timestamp': '2026-01-22 17:35:55.515206', '_unique_id': 'ba0c86cfa32b4e0fa9fc3bbb6d38ec46'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.516 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.517 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.517 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '491438a9-96dd-4f1a-82c6-cc7c09f9a6d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03-vda', 'timestamp': '2026-01-22T17:35:55.517689', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'instance-0000003a', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ce45592c-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5719.223508194, 'message_signature': '804cd6b66df1347295f2f1ff52f2795e6ce66d45fa21232868840b5d0b5acb99'}]}, 'timestamp': '2026-01-22 17:35:55.518207', '_unique_id': '12664786c2c7479491d2de8c0568fd0b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.519 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.520 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.520 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/network.outgoing.bytes volume: 10851 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '841a0ba4-d2ef-4d62-b536-1639cc809dde', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10851, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': 'instance-0000003a-585cde24-8038-40b2-97ce-5d30e6ecfc03-tap524266a0-8b', 'timestamp': '2026-01-22T17:35:55.520596', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'tap524266a0-8b', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7a:64:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap524266a0-8b'}, 'message_id': 'ce45cb8c-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5719.234832582, 'message_signature': 'af6e7a806507f5e265140d0697fd1654dc77ffc045d46acec25fd73138fcfebc'}]}, 'timestamp': '2026-01-22 17:35:55.521148', '_unique_id': 'f63f72ae47ff47d8a67a70ce298115ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.522 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.523 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.523 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.523 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-575952224>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-575952224>]
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.524 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.524 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/network.incoming.bytes volume: 7223 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '75910a5b-0621-43ce-bb21-5f781677a267', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7223, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': 'instance-0000003a-585cde24-8038-40b2-97ce-5d30e6ecfc03-tap524266a0-8b', 'timestamp': '2026-01-22T17:35:55.524152', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'tap524266a0-8b', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7a:64:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap524266a0-8b'}, 'message_id': 'ce46564c-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5719.234832582, 'message_signature': '71b8f29bfaa8a833471d8cb508befc23b27f9c8bbb7f01d7446fb5098494caec'}]}, 'timestamp': '2026-01-22 17:35:55.524752', '_unique_id': 'b563f38e2c5b416598c8ba554ec51876'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.525 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.526 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.527 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/disk.device.write.bytes volume: 72978432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'edd41abe-e66d-40c3-abd4-421ac3dc6248', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72978432, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03-vda', 'timestamp': '2026-01-22T17:35:55.527093', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'instance-0000003a', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ce46c690-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5719.248698159, 'message_signature': 'c5b1462f9587f2b0f5ae5cd5c59533156ea6dc7fdb836375be98a86a7d322532'}]}, 'timestamp': '2026-01-22 17:35:55.527557', '_unique_id': '67167ccfc6cd4d8ca24edb9aff2813da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.528 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.529 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.529 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/disk.device.read.requests volume: 1112 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '60942276-3f18-4bde-9319-b89f1cee8daf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1112, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03-vda', 'timestamp': '2026-01-22T17:35:55.529914', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'instance-0000003a', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ce4734cc-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5719.248698159, 'message_signature': 'adef14e1ca8fdd943323d337c12127ac185be2d7ad5ebde046941aa36cf3ce46'}]}, 'timestamp': '2026-01-22 17:35:55.530416', '_unique_id': '3c7b89c008a04f4ead53bca2bee52330'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.531 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.532 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.532 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b9194c0-59bf-4d12-9b1a-7bb01ca8cb55', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': 'instance-0000003a-585cde24-8038-40b2-97ce-5d30e6ecfc03-tap524266a0-8b', 'timestamp': '2026-01-22T17:35:55.532834', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'tap524266a0-8b', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7a:64:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap524266a0-8b'}, 'message_id': 'ce47a722-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5719.234832582, 'message_signature': 'be1bb128b6a43791db598a5c246717ad2d4dcf1c2fc0609e198591fb097ea655'}]}, 'timestamp': '2026-01-22 17:35:55.533321', '_unique_id': '7cc6b2ae94af409dbfd4c323e7ee651a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.535 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.535 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/disk.device.write.latency volume: 3141211496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '51d4c3a8-d76b-4865-b83f-a2c1ca939867', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3141211496, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03-vda', 'timestamp': '2026-01-22T17:35:55.535957', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'instance-0000003a', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ce4820e4-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5719.248698159, 'message_signature': '2f0afd1d4f94e231cc349323b20fbed529b139bd07fa01e55e668fd6cbfb4857'}]}, 'timestamp': '2026-01-22 17:35:55.536463', '_unique_id': '460caab572694b3c86bab1d3fb2ad962'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.537 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.538 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.538 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a11666c3-bea4-4464-887e-ce2034409aa1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': 'instance-0000003a-585cde24-8038-40b2-97ce-5d30e6ecfc03-tap524266a0-8b', 'timestamp': '2026-01-22T17:35:55.538769', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'tap524266a0-8b', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7a:64:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap524266a0-8b'}, 'message_id': 'ce488ebc-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5719.234832582, 'message_signature': 'a5127243b3f9f32214dfc09a317ac642bef0eb5fd61636405fbf1e362591e4de'}]}, 'timestamp': '2026-01-22 17:35:55.539242', '_unique_id': 'bf5dcabaafc544699ed7452fa17180b2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.540 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.541 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.541 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/disk.device.write.requests volume: 322 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fa499137-bc28-46b2-9ad6-da460bc033c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 322, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03-vda', 'timestamp': '2026-01-22T17:35:55.541496', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'instance-0000003a', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ce48f73a-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5719.248698159, 'message_signature': 'e5d1981c8184fd484ac89f12e5d3194b666e8e40416f0c62fa04a6499f43c048'}]}, 'timestamp': '2026-01-22 17:35:55.541835', '_unique_id': 'db125bc7f3e74de99dc232a2c4f991e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.542 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.543 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.543 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/network.outgoing.packets volume: 123 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27a346f1-afd5-49ac-9b7e-b9c20160c6b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 123, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': 'instance-0000003a-585cde24-8038-40b2-97ce-5d30e6ecfc03-tap524266a0-8b', 'timestamp': '2026-01-22T17:35:55.543324', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'tap524266a0-8b', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7a:64:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap524266a0-8b'}, 'message_id': 'ce493d94-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5719.234832582, 'message_signature': '4de7315bd71bca73820e026629f487404ed9fafe96c64f42597f44e54b9d396e'}]}, 'timestamp': '2026-01-22 17:35:55.543666', '_unique_id': 'd52187bc32a3452d856211a4dfcf0738'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.544 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.545 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.545 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/disk.device.read.bytes volume: 30046720 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fffb8053-cc7b-44c5-b802-5b873fdbfb0f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30046720, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03-vda', 'timestamp': '2026-01-22T17:35:55.545182', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'instance-0000003a', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ce498600-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5719.248698159, 'message_signature': 'a3af73b2287bd2053ececfd6843234fad0e487a9cc06f56bf1688227e2405514'}]}, 'timestamp': '2026-01-22 17:35:55.545490', '_unique_id': '25d5b70ccfa34c6b8cf677928f80cd9d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.546 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.547 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.547 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.547 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-575952224>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-575952224>]
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.547 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.573 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/cpu volume: 10880000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5d03c154-4294-4cdd-bb7f-30b6b58b6835', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10880000000, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'timestamp': '2026-01-22T17:35:55.547590', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'instance-0000003a', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'ce4dde1c-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5719.333660662, 'message_signature': '53313c95aeaa9086842aaf2b7eab4a598f73431a1da660ffc5b1139bc754d1bc'}]}, 'timestamp': '2026-01-22 17:35:55.574102', '_unique_id': '1648dd16be9e44d6b2c48ce5f9365e24'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.575 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.576 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.576 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/memory.usage volume: 43.421875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fd80c7b1-6d8c-48e7-ba87-042306eac010', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.421875, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'timestamp': '2026-01-22T17:35:55.576951', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'instance-0000003a', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'ce4e6274-f7b8-11f0-9e69-fa163eaea1db', 'monotonic_time': 5719.333660662, 'message_signature': '47751657ffdc398be684be4caf52f2ea8f02c6de9b3f36aeadf3d0b2f63cb1fd'}]}, 'timestamp': '2026-01-22 17:35:55.577421', '_unique_id': 'cda12ddd42594cc0a066062d393d7467'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:35:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:35:55.578 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:35:56 compute-0 nova_compute[183075]: 2026-01-22 17:35:56.003 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:35:57 compute-0 nova_compute[183075]: 2026-01-22 17:35:57.059 183079 INFO nova.compute.manager [None req-a5652051-aa37-43d0-8b33-fa1ffb4827ef 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Get console output
Jan 22 17:35:57 compute-0 nova_compute[183075]: 2026-01-22 17:35:57.066 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:35:58 compute-0 nova_compute[183075]: 2026-01-22 17:35:58.024 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:58 compute-0 podman[235386]: 2026-01-22 17:35:58.377033037 +0000 UTC m=+0.076683449 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:35:58 compute-0 nova_compute[183075]: 2026-01-22 17:35:58.718 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769103343.7162137, f62cb90c-e99d-43d4-bbac-06a79d9b1182 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:35:58 compute-0 nova_compute[183075]: 2026-01-22 17:35:58.719 183079 INFO nova.compute.manager [-] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] VM Stopped (Lifecycle Event)
Jan 22 17:35:58 compute-0 nova_compute[183075]: 2026-01-22 17:35:58.751 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:58 compute-0 nova_compute[183075]: 2026-01-22 17:35:58.952 183079 DEBUG nova.compute.manager [None req-42fdb325-cc86-4a8d-b497-09586bd81299 - - - - - -] [instance: f62cb90c-e99d-43d4-bbac-06a79d9b1182] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:36:00 compute-0 nova_compute[183075]: 2026-01-22 17:36:00.291 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:01 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:01.219 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=42, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=41) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:36:01 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:01.219 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:36:01 compute-0 nova_compute[183075]: 2026-01-22 17:36:01.220 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:02 compute-0 nova_compute[183075]: 2026-01-22 17:36:02.323 183079 INFO nova.compute.manager [None req-66efcd1f-f7d4-486f-bb8f-a9420c366962 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Get console output
Jan 22 17:36:02 compute-0 nova_compute[183075]: 2026-01-22 17:36:02.331 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:36:03 compute-0 nova_compute[183075]: 2026-01-22 17:36:03.027 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:03 compute-0 nova_compute[183075]: 2026-01-22 17:36:03.754 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:03 compute-0 nova_compute[183075]: 2026-01-22 17:36:03.895 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:07 compute-0 nova_compute[183075]: 2026-01-22 17:36:07.442 183079 INFO nova.compute.manager [None req-8c1629f7-26f6-43a5-aeb6-40f63c857e68 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Get console output
Jan 22 17:36:07 compute-0 nova_compute[183075]: 2026-01-22 17:36:07.447 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:36:08 compute-0 nova_compute[183075]: 2026-01-22 17:36:08.028 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:08 compute-0 nova_compute[183075]: 2026-01-22 17:36:08.757 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:09 compute-0 podman[235413]: 2026-01-22 17:36:09.361700393 +0000 UTC m=+0.060448337 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 22 17:36:09 compute-0 podman[235414]: 2026-01-22 17:36:09.394571797 +0000 UTC m=+0.088488180 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, release=1755695350, io.buildah.version=1.33.7, config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible)
Jan 22 17:36:09 compute-0 podman[235412]: 2026-01-22 17:36:09.398531335 +0000 UTC m=+0.101405091 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 17:36:10 compute-0 nova_compute[183075]: 2026-01-22 17:36:10.880 183079 DEBUG oslo_concurrency.lockutils [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Acquiring lock "3fd8923a-65e6-42d0-a866-17500a9df5cf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:36:10 compute-0 nova_compute[183075]: 2026-01-22 17:36:10.880 183079 DEBUG oslo_concurrency.lockutils [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "3fd8923a-65e6-42d0-a866-17500a9df5cf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:36:10 compute-0 nova_compute[183075]: 2026-01-22 17:36:10.938 183079 DEBUG nova.compute.manager [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:36:11 compute-0 nova_compute[183075]: 2026-01-22 17:36:11.105 183079 DEBUG oslo_concurrency.lockutils [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:36:11 compute-0 nova_compute[183075]: 2026-01-22 17:36:11.106 183079 DEBUG oslo_concurrency.lockutils [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:36:11 compute-0 nova_compute[183075]: 2026-01-22 17:36:11.114 183079 DEBUG nova.virt.hardware [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:36:11 compute-0 nova_compute[183075]: 2026-01-22 17:36:11.114 183079 INFO nova.compute.claims [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:36:11 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:11.223 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '42'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:36:11 compute-0 nova_compute[183075]: 2026-01-22 17:36:11.449 183079 DEBUG nova.compute.provider_tree [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:36:11 compute-0 nova_compute[183075]: 2026-01-22 17:36:11.472 183079 DEBUG nova.scheduler.client.report [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:36:11 compute-0 nova_compute[183075]: 2026-01-22 17:36:11.564 183079 DEBUG oslo_concurrency.lockutils [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.458s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:36:11 compute-0 nova_compute[183075]: 2026-01-22 17:36:11.565 183079 DEBUG nova.compute.manager [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:36:11 compute-0 nova_compute[183075]: 2026-01-22 17:36:11.676 183079 DEBUG nova.compute.manager [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:36:11 compute-0 nova_compute[183075]: 2026-01-22 17:36:11.676 183079 DEBUG nova.network.neutron [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:36:11 compute-0 nova_compute[183075]: 2026-01-22 17:36:11.730 183079 INFO nova.virt.libvirt.driver [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:36:11 compute-0 nova_compute[183075]: 2026-01-22 17:36:11.805 183079 DEBUG nova.compute.manager [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:36:12 compute-0 nova_compute[183075]: 2026-01-22 17:36:12.034 183079 DEBUG nova.compute.manager [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:36:12 compute-0 nova_compute[183075]: 2026-01-22 17:36:12.035 183079 DEBUG nova.virt.libvirt.driver [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:36:12 compute-0 nova_compute[183075]: 2026-01-22 17:36:12.035 183079 INFO nova.virt.libvirt.driver [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Creating image(s)
Jan 22 17:36:12 compute-0 nova_compute[183075]: 2026-01-22 17:36:12.036 183079 DEBUG oslo_concurrency.lockutils [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Acquiring lock "/var/lib/nova/instances/3fd8923a-65e6-42d0-a866-17500a9df5cf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:36:12 compute-0 nova_compute[183075]: 2026-01-22 17:36:12.036 183079 DEBUG oslo_concurrency.lockutils [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "/var/lib/nova/instances/3fd8923a-65e6-42d0-a866-17500a9df5cf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:36:12 compute-0 nova_compute[183075]: 2026-01-22 17:36:12.037 183079 DEBUG oslo_concurrency.lockutils [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "/var/lib/nova/instances/3fd8923a-65e6-42d0-a866-17500a9df5cf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:36:12 compute-0 nova_compute[183075]: 2026-01-22 17:36:12.048 183079 DEBUG oslo_concurrency.processutils [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:36:12 compute-0 nova_compute[183075]: 2026-01-22 17:36:12.102 183079 DEBUG oslo_concurrency.processutils [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:36:12 compute-0 nova_compute[183075]: 2026-01-22 17:36:12.103 183079 DEBUG oslo_concurrency.lockutils [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:36:12 compute-0 nova_compute[183075]: 2026-01-22 17:36:12.104 183079 DEBUG oslo_concurrency.lockutils [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:36:12 compute-0 nova_compute[183075]: 2026-01-22 17:36:12.120 183079 DEBUG oslo_concurrency.processutils [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:36:12 compute-0 nova_compute[183075]: 2026-01-22 17:36:12.176 183079 DEBUG oslo_concurrency.processutils [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:36:12 compute-0 nova_compute[183075]: 2026-01-22 17:36:12.177 183079 DEBUG oslo_concurrency.processutils [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/3fd8923a-65e6-42d0-a866-17500a9df5cf/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:36:12 compute-0 nova_compute[183075]: 2026-01-22 17:36:12.221 183079 DEBUG oslo_concurrency.processutils [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/3fd8923a-65e6-42d0-a866-17500a9df5cf/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:36:12 compute-0 nova_compute[183075]: 2026-01-22 17:36:12.222 183079 DEBUG oslo_concurrency.lockutils [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:36:12 compute-0 nova_compute[183075]: 2026-01-22 17:36:12.223 183079 DEBUG oslo_concurrency.processutils [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:36:12 compute-0 nova_compute[183075]: 2026-01-22 17:36:12.290 183079 DEBUG oslo_concurrency.processutils [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:36:12 compute-0 nova_compute[183075]: 2026-01-22 17:36:12.291 183079 DEBUG nova.virt.disk.api [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Checking if we can resize image /var/lib/nova/instances/3fd8923a-65e6-42d0-a866-17500a9df5cf/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:36:12 compute-0 nova_compute[183075]: 2026-01-22 17:36:12.292 183079 DEBUG oslo_concurrency.processutils [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3fd8923a-65e6-42d0-a866-17500a9df5cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:36:12 compute-0 nova_compute[183075]: 2026-01-22 17:36:12.309 183079 DEBUG nova.policy [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bf68afec168c4aa2ba7e47fdb3b026af', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8cfd5f99a92142bd829974004d0e603e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:36:12 compute-0 nova_compute[183075]: 2026-01-22 17:36:12.345 183079 DEBUG oslo_concurrency.processutils [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3fd8923a-65e6-42d0-a866-17500a9df5cf/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:36:12 compute-0 nova_compute[183075]: 2026-01-22 17:36:12.346 183079 DEBUG nova.virt.disk.api [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Cannot resize image /var/lib/nova/instances/3fd8923a-65e6-42d0-a866-17500a9df5cf/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:36:12 compute-0 nova_compute[183075]: 2026-01-22 17:36:12.346 183079 DEBUG nova.objects.instance [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lazy-loading 'migration_context' on Instance uuid 3fd8923a-65e6-42d0-a866-17500a9df5cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:36:12 compute-0 nova_compute[183075]: 2026-01-22 17:36:12.418 183079 DEBUG nova.virt.libvirt.driver [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:36:12 compute-0 nova_compute[183075]: 2026-01-22 17:36:12.418 183079 DEBUG nova.virt.libvirt.driver [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Ensure instance console log exists: /var/lib/nova/instances/3fd8923a-65e6-42d0-a866-17500a9df5cf/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:36:12 compute-0 nova_compute[183075]: 2026-01-22 17:36:12.419 183079 DEBUG oslo_concurrency.lockutils [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:36:12 compute-0 nova_compute[183075]: 2026-01-22 17:36:12.419 183079 DEBUG oslo_concurrency.lockutils [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:36:12 compute-0 nova_compute[183075]: 2026-01-22 17:36:12.419 183079 DEBUG oslo_concurrency.lockutils [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:36:12 compute-0 nova_compute[183075]: 2026-01-22 17:36:12.582 183079 INFO nova.compute.manager [None req-29260561-787d-45cb-a7ae-edd547f80f2a 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Get console output
Jan 22 17:36:12 compute-0 nova_compute[183075]: 2026-01-22 17:36:12.588 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:36:13 compute-0 nova_compute[183075]: 2026-01-22 17:36:13.030 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:13 compute-0 nova_compute[183075]: 2026-01-22 17:36:13.758 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:14 compute-0 podman[235491]: 2026-01-22 17:36:14.35749444 +0000 UTC m=+0.051173764 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:36:15 compute-0 nova_compute[183075]: 2026-01-22 17:36:15.708 183079 DEBUG nova.network.neutron [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Successfully created port: 42eadaf5-c2f9-4e11-a3dc-3e5003423156 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:36:17 compute-0 nova_compute[183075]: 2026-01-22 17:36:17.235 183079 DEBUG nova.network.neutron [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Successfully updated port: 42eadaf5-c2f9-4e11-a3dc-3e5003423156 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:36:17 compute-0 nova_compute[183075]: 2026-01-22 17:36:17.625 183079 DEBUG nova.compute.manager [req-6f485ea1-ff5d-49dd-bc79-39aadff2acaa req-fd964180-c677-4529-a55f-c96b105ae78d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Received event network-changed-42eadaf5-c2f9-4e11-a3dc-3e5003423156 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:36:17 compute-0 nova_compute[183075]: 2026-01-22 17:36:17.625 183079 DEBUG nova.compute.manager [req-6f485ea1-ff5d-49dd-bc79-39aadff2acaa req-fd964180-c677-4529-a55f-c96b105ae78d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Refreshing instance network info cache due to event network-changed-42eadaf5-c2f9-4e11-a3dc-3e5003423156. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:36:17 compute-0 nova_compute[183075]: 2026-01-22 17:36:17.625 183079 DEBUG oslo_concurrency.lockutils [req-6f485ea1-ff5d-49dd-bc79-39aadff2acaa req-fd964180-c677-4529-a55f-c96b105ae78d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-3fd8923a-65e6-42d0-a866-17500a9df5cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:36:17 compute-0 nova_compute[183075]: 2026-01-22 17:36:17.626 183079 DEBUG oslo_concurrency.lockutils [req-6f485ea1-ff5d-49dd-bc79-39aadff2acaa req-fd964180-c677-4529-a55f-c96b105ae78d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-3fd8923a-65e6-42d0-a866-17500a9df5cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:36:17 compute-0 nova_compute[183075]: 2026-01-22 17:36:17.626 183079 DEBUG nova.network.neutron [req-6f485ea1-ff5d-49dd-bc79-39aadff2acaa req-fd964180-c677-4529-a55f-c96b105ae78d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Refreshing network info cache for port 42eadaf5-c2f9-4e11-a3dc-3e5003423156 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:36:17 compute-0 nova_compute[183075]: 2026-01-22 17:36:17.640 183079 DEBUG oslo_concurrency.lockutils [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Acquiring lock "refresh_cache-3fd8923a-65e6-42d0-a866-17500a9df5cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:36:17 compute-0 nova_compute[183075]: 2026-01-22 17:36:17.777 183079 DEBUG nova.network.neutron [req-6f485ea1-ff5d-49dd-bc79-39aadff2acaa req-fd964180-c677-4529-a55f-c96b105ae78d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:36:17 compute-0 nova_compute[183075]: 2026-01-22 17:36:17.794 183079 INFO nova.compute.manager [None req-9906b3a9-eb7f-43f1-a85e-525c5dc235f5 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Get console output
Jan 22 17:36:17 compute-0 nova_compute[183075]: 2026-01-22 17:36:17.798 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:36:18 compute-0 nova_compute[183075]: 2026-01-22 17:36:18.032 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:18 compute-0 nova_compute[183075]: 2026-01-22 17:36:18.396 183079 DEBUG nova.network.neutron [req-6f485ea1-ff5d-49dd-bc79-39aadff2acaa req-fd964180-c677-4529-a55f-c96b105ae78d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:36:18 compute-0 nova_compute[183075]: 2026-01-22 17:36:18.564 183079 DEBUG oslo_concurrency.lockutils [req-6f485ea1-ff5d-49dd-bc79-39aadff2acaa req-fd964180-c677-4529-a55f-c96b105ae78d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-3fd8923a-65e6-42d0-a866-17500a9df5cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:36:18 compute-0 nova_compute[183075]: 2026-01-22 17:36:18.565 183079 DEBUG oslo_concurrency.lockutils [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Acquired lock "refresh_cache-3fd8923a-65e6-42d0-a866-17500a9df5cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:36:18 compute-0 nova_compute[183075]: 2026-01-22 17:36:18.565 183079 DEBUG nova.network.neutron [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:36:18 compute-0 nova_compute[183075]: 2026-01-22 17:36:18.760 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:19 compute-0 nova_compute[183075]: 2026-01-22 17:36:19.245 183079 DEBUG nova.network.neutron [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:36:20 compute-0 nova_compute[183075]: 2026-01-22 17:36:20.479 183079 DEBUG nova.network.neutron [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Updating instance_info_cache with network_info: [{"id": "42eadaf5-c2f9-4e11-a3dc-3e5003423156", "address": "fa:16:3e:23:26:1f", "network": {"id": "3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5", "bridge": "br-int", "label": "tempest-test-network--2142514082", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8cfd5f99a92142bd829974004d0e603e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap42eadaf5-c2", "ovs_interfaceid": "42eadaf5-c2f9-4e11-a3dc-3e5003423156", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.196 183079 DEBUG oslo_concurrency.lockutils [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Releasing lock "refresh_cache-3fd8923a-65e6-42d0-a866-17500a9df5cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.196 183079 DEBUG nova.compute.manager [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Instance network_info: |[{"id": "42eadaf5-c2f9-4e11-a3dc-3e5003423156", "address": "fa:16:3e:23:26:1f", "network": {"id": "3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5", "bridge": "br-int", "label": "tempest-test-network--2142514082", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8cfd5f99a92142bd829974004d0e603e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap42eadaf5-c2", "ovs_interfaceid": "42eadaf5-c2f9-4e11-a3dc-3e5003423156", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.199 183079 DEBUG nova.virt.libvirt.driver [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Start _get_guest_xml network_info=[{"id": "42eadaf5-c2f9-4e11-a3dc-3e5003423156", "address": "fa:16:3e:23:26:1f", "network": {"id": "3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5", "bridge": "br-int", "label": "tempest-test-network--2142514082", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8cfd5f99a92142bd829974004d0e603e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap42eadaf5-c2", "ovs_interfaceid": "42eadaf5-c2f9-4e11-a3dc-3e5003423156", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.203 183079 WARNING nova.virt.libvirt.driver [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.208 183079 DEBUG nova.virt.libvirt.host [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.209 183079 DEBUG nova.virt.libvirt.host [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.212 183079 DEBUG nova.virt.libvirt.host [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.213 183079 DEBUG nova.virt.libvirt.host [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.213 183079 DEBUG nova.virt.libvirt.driver [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.214 183079 DEBUG nova.virt.hardware [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.214 183079 DEBUG nova.virt.hardware [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.214 183079 DEBUG nova.virt.hardware [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.215 183079 DEBUG nova.virt.hardware [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.215 183079 DEBUG nova.virt.hardware [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.215 183079 DEBUG nova.virt.hardware [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.216 183079 DEBUG nova.virt.hardware [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.216 183079 DEBUG nova.virt.hardware [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.216 183079 DEBUG nova.virt.hardware [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.216 183079 DEBUG nova.virt.hardware [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.217 183079 DEBUG nova.virt.hardware [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.220 183079 DEBUG nova.virt.libvirt.vif [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:36:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-842565147',display_name='tempest-server-test-842565147',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-842565147',id=59,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBkEoXBEO7Iq9oyrNfK3wd/+mUcVrdUpimNGP2S5PkNQ/qaeiJOV3d8bp+iClmxxrwIFhFFhsVSzK7EyUN3IdfHCQdVlCqDyzkANoAeuj9f9N66mRfaNpYK0SG4yFRFqGg==',key_name='tempest-keypair-test-192960344',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8cfd5f99a92142bd829974004d0e603e',ramdisk_id='',reservation_id='r-t2nywuua',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='vir
tio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortSecurityTest-1796635121',owner_user_name='tempest-PortSecurityTest-1796635121-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:36:11Z,user_data=None,user_id='bf68afec168c4aa2ba7e47fdb3b026af',uuid=3fd8923a-65e6-42d0-a866-17500a9df5cf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "42eadaf5-c2f9-4e11-a3dc-3e5003423156", "address": "fa:16:3e:23:26:1f", "network": {"id": "3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5", "bridge": "br-int", "label": "tempest-test-network--2142514082", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8cfd5f99a92142bd829974004d0e603e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap42eadaf5-c2", "ovs_interfaceid": "42eadaf5-c2f9-4e11-a3dc-3e5003423156", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.220 183079 DEBUG nova.network.os_vif_util [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Converting VIF {"id": "42eadaf5-c2f9-4e11-a3dc-3e5003423156", "address": "fa:16:3e:23:26:1f", "network": {"id": "3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5", "bridge": "br-int", "label": "tempest-test-network--2142514082", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8cfd5f99a92142bd829974004d0e603e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap42eadaf5-c2", "ovs_interfaceid": "42eadaf5-c2f9-4e11-a3dc-3e5003423156", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.221 183079 DEBUG nova.network.os_vif_util [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:23:26:1f,bridge_name='br-int',has_traffic_filtering=True,id=42eadaf5-c2f9-4e11-a3dc-3e5003423156,network=Network(3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap42eadaf5-c2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.222 183079 DEBUG nova.objects.instance [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lazy-loading 'pci_devices' on Instance uuid 3fd8923a-65e6-42d0-a866-17500a9df5cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.327 183079 DEBUG nova.virt.libvirt.driver [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:36:21 compute-0 nova_compute[183075]:   <uuid>3fd8923a-65e6-42d0-a866-17500a9df5cf</uuid>
Jan 22 17:36:21 compute-0 nova_compute[183075]:   <name>instance-0000003b</name>
Jan 22 17:36:21 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:36:21 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:36:21 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:36:21 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-842565147</nova:name>
Jan 22 17:36:21 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:36:21</nova:creationTime>
Jan 22 17:36:21 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:36:21 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:36:21 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:36:21 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:36:21 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:36:21 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:36:21 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:36:21 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:36:21 compute-0 nova_compute[183075]:         <nova:user uuid="bf68afec168c4aa2ba7e47fdb3b026af">tempest-PortSecurityTest-1796635121-project-member</nova:user>
Jan 22 17:36:21 compute-0 nova_compute[183075]:         <nova:project uuid="8cfd5f99a92142bd829974004d0e603e">tempest-PortSecurityTest-1796635121</nova:project>
Jan 22 17:36:21 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:36:21 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:36:21 compute-0 nova_compute[183075]:         <nova:port uuid="42eadaf5-c2f9-4e11-a3dc-3e5003423156">
Jan 22 17:36:21 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.23" ipVersion="4"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:36:21 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:36:21 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:36:21 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <system>
Jan 22 17:36:21 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:36:21 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:36:21 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:36:21 compute-0 nova_compute[183075]:       <entry name="serial">3fd8923a-65e6-42d0-a866-17500a9df5cf</entry>
Jan 22 17:36:21 compute-0 nova_compute[183075]:       <entry name="uuid">3fd8923a-65e6-42d0-a866-17500a9df5cf</entry>
Jan 22 17:36:21 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     </system>
Jan 22 17:36:21 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:36:21 compute-0 nova_compute[183075]:   <os>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:   </os>
Jan 22 17:36:21 compute-0 nova_compute[183075]:   <features>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:   </features>
Jan 22 17:36:21 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:36:21 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:36:21 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:36:21 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/3fd8923a-65e6-42d0-a866-17500a9df5cf/disk"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:36:21 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:23:26:1f"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:       <target dev="tap42eadaf5-c2"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:36:21 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/3fd8923a-65e6-42d0-a866-17500a9df5cf/console.log" append="off"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <video>
Jan 22 17:36:21 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     </video>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:36:21 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:36:21 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:36:21 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:36:21 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:36:21 compute-0 nova_compute[183075]: </domain>
Jan 22 17:36:21 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.329 183079 DEBUG nova.compute.manager [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Preparing to wait for external event network-vif-plugged-42eadaf5-c2f9-4e11-a3dc-3e5003423156 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.329 183079 DEBUG oslo_concurrency.lockutils [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Acquiring lock "3fd8923a-65e6-42d0-a866-17500a9df5cf-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.329 183079 DEBUG oslo_concurrency.lockutils [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "3fd8923a-65e6-42d0-a866-17500a9df5cf-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.329 183079 DEBUG oslo_concurrency.lockutils [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "3fd8923a-65e6-42d0-a866-17500a9df5cf-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.330 183079 DEBUG nova.virt.libvirt.vif [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:36:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-842565147',display_name='tempest-server-test-842565147',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-842565147',id=59,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBkEoXBEO7Iq9oyrNfK3wd/+mUcVrdUpimNGP2S5PkNQ/qaeiJOV3d8bp+iClmxxrwIFhFFhsVSzK7EyUN3IdfHCQdVlCqDyzkANoAeuj9f9N66mRfaNpYK0SG4yFRFqGg==',key_name='tempest-keypair-test-192960344',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8cfd5f99a92142bd829974004d0e603e',ramdisk_id='',reservation_id='r-t2nywuua',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_
model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortSecurityTest-1796635121',owner_user_name='tempest-PortSecurityTest-1796635121-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:36:11Z,user_data=None,user_id='bf68afec168c4aa2ba7e47fdb3b026af',uuid=3fd8923a-65e6-42d0-a866-17500a9df5cf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "42eadaf5-c2f9-4e11-a3dc-3e5003423156", "address": "fa:16:3e:23:26:1f", "network": {"id": "3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5", "bridge": "br-int", "label": "tempest-test-network--2142514082", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8cfd5f99a92142bd829974004d0e603e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap42eadaf5-c2", "ovs_interfaceid": "42eadaf5-c2f9-4e11-a3dc-3e5003423156", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.330 183079 DEBUG nova.network.os_vif_util [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Converting VIF {"id": "42eadaf5-c2f9-4e11-a3dc-3e5003423156", "address": "fa:16:3e:23:26:1f", "network": {"id": "3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5", "bridge": "br-int", "label": "tempest-test-network--2142514082", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8cfd5f99a92142bd829974004d0e603e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap42eadaf5-c2", "ovs_interfaceid": "42eadaf5-c2f9-4e11-a3dc-3e5003423156", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.331 183079 DEBUG nova.network.os_vif_util [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:23:26:1f,bridge_name='br-int',has_traffic_filtering=True,id=42eadaf5-c2f9-4e11-a3dc-3e5003423156,network=Network(3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap42eadaf5-c2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.331 183079 DEBUG os_vif [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:23:26:1f,bridge_name='br-int',has_traffic_filtering=True,id=42eadaf5-c2f9-4e11-a3dc-3e5003423156,network=Network(3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap42eadaf5-c2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.331 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.332 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.332 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.335 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.335 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap42eadaf5-c2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.335 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap42eadaf5-c2, col_values=(('external_ids', {'iface-id': '42eadaf5-c2f9-4e11-a3dc-3e5003423156', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:23:26:1f', 'vm-uuid': '3fd8923a-65e6-42d0-a866-17500a9df5cf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.337 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:21 compute-0 NetworkManager[55454]: <info>  [1769103381.3384] manager: (tap42eadaf5-c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/274)
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.339 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.346 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.347 183079 INFO os_vif [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:23:26:1f,bridge_name='br-int',has_traffic_filtering=True,id=42eadaf5-c2f9-4e11-a3dc-3e5003423156,network=Network(3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap42eadaf5-c2')
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.860 183079 DEBUG nova.virt.libvirt.driver [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.861 183079 DEBUG nova.virt.libvirt.driver [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] No VIF found with MAC fa:16:3e:23:26:1f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:36:21 compute-0 kernel: tap42eadaf5-c2: entered promiscuous mode
Jan 22 17:36:21 compute-0 NetworkManager[55454]: <info>  [1769103381.9190] manager: (tap42eadaf5-c2): new Tun device (/org/freedesktop/NetworkManager/Devices/275)
Jan 22 17:36:21 compute-0 ovn_controller[95372]: 2026-01-22T17:36:21Z|00641|binding|INFO|Claiming lport 42eadaf5-c2f9-4e11-a3dc-3e5003423156 for this chassis.
Jan 22 17:36:21 compute-0 ovn_controller[95372]: 2026-01-22T17:36:21Z|00642|binding|INFO|42eadaf5-c2f9-4e11-a3dc-3e5003423156: Claiming fa:16:3e:23:26:1f 10.100.0.23
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.919 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:21 compute-0 ovn_controller[95372]: 2026-01-22T17:36:21Z|00643|binding|INFO|Setting lport 42eadaf5-c2f9-4e11-a3dc-3e5003423156 ovn-installed in OVS
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.931 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:21 compute-0 nova_compute[183075]: 2026-01-22 17:36:21.935 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:21 compute-0 systemd-udevd[235526]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:36:21 compute-0 ovn_controller[95372]: 2026-01-22T17:36:21Z|00644|binding|INFO|Setting lport 42eadaf5-c2f9-4e11-a3dc-3e5003423156 up in Southbound
Jan 22 17:36:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:21.948 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:23:26:1f 10.100.0.23'], port_security=['fa:16:3e:23:26:1f 10.100.0.23'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.23/28', 'neutron:device_id': '3fd8923a-65e6-42d0-a866-17500a9df5cf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8cfd5f99a92142bd829974004d0e603e', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a41626e7-06f3-46e2-a5f9-4fdf587496ec', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fa2e84d0-9512-4c88-842e-9b3499a075d2, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=42eadaf5-c2f9-4e11-a3dc-3e5003423156) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:36:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:21.950 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 42eadaf5-c2f9-4e11-a3dc-3e5003423156 in datapath 3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5 bound to our chassis
Jan 22 17:36:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:21.952 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5
Jan 22 17:36:21 compute-0 NetworkManager[55454]: <info>  [1769103381.9562] device (tap42eadaf5-c2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:36:21 compute-0 NetworkManager[55454]: <info>  [1769103381.9567] device (tap42eadaf5-c2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:36:21 compute-0 systemd-machined[154382]: New machine qemu-59-instance-0000003b.
Jan 22 17:36:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:21.965 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[84e421fe-c135-4f2e-80da-cfd38d414cb8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:36:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:21.966 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3abe9ce6-b1 in ovnmeta-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:36:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:21.968 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3abe9ce6-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:36:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:21.969 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6924cead-2ade-4320-a19a-986db3cb1cae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:36:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:21.970 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[21b8b217-6919-4347-ad00-83617204986f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:36:21 compute-0 systemd[1]: Started Virtual Machine qemu-59-instance-0000003b.
Jan 22 17:36:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:21.983 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[ace6cdea-bc7f-4d7f-87b0-5b70f506f971]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:36:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:21.998 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[5d0efb89-51ff-4bc9-be78-5adf36816adf]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:22.036 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[f3db8fb6-07be-4c3d-815a-0e9327d86b2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:22.043 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[de2d1931-9ce9-4f50-a061-9d10b737e34f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:36:22 compute-0 NetworkManager[55454]: <info>  [1769103382.0451] manager: (tap3abe9ce6-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/276)
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:22.077 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[f2feca4c-669d-4e31-9fad-a5a1080f1ced]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:22.080 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[30fe2404-fe2f-4857-950b-b25d4ea48d02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:36:22 compute-0 NetworkManager[55454]: <info>  [1769103382.1085] device (tap3abe9ce6-b0): carrier: link connected
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:22.116 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[cc3844ef-571b-451b-825a-b860d5b31d60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:22.135 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c63e17f8-3e4c-44f3-8acd-e18ec835d994]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3abe9ce6-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a3:d9:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 183], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574581, 'reachable_time': 41289, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235562, 'error': None, 'target': 'ovnmeta-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:22.150 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[643ebda8-eb1b-4245-902f-01d298957929]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea3:d962'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 574581, 'tstamp': 574581}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235563, 'error': None, 'target': 'ovnmeta-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:22.176 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[5baf1e76-e5ae-42cf-8b4e-8752f180efd3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3abe9ce6-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a3:d9:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 183], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574581, 'reachable_time': 41289, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235564, 'error': None, 'target': 'ovnmeta-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:22.214 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8d952a40-1500-42b0-aaea-2aa9a3895d89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:22.295 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[14c31dda-aedb-4d73-9c19-af69ce809808]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:22.297 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3abe9ce6-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:22.297 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:22.297 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3abe9ce6-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:36:22 compute-0 kernel: tap3abe9ce6-b0: entered promiscuous mode
Jan 22 17:36:22 compute-0 NetworkManager[55454]: <info>  [1769103382.3011] manager: (tap3abe9ce6-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/277)
Jan 22 17:36:22 compute-0 nova_compute[183075]: 2026-01-22 17:36:22.299 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:22.302 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3abe9ce6-b0, col_values=(('external_ids', {'iface-id': '98901646-8ac9-42dd-b4c0-64ab9661d62b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:36:22 compute-0 nova_compute[183075]: 2026-01-22 17:36:22.303 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:22 compute-0 ovn_controller[95372]: 2026-01-22T17:36:22Z|00645|binding|INFO|Releasing lport 98901646-8ac9-42dd-b4c0-64ab9661d62b from this chassis (sb_readonly=0)
Jan 22 17:36:22 compute-0 nova_compute[183075]: 2026-01-22 17:36:22.315 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:22.316 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:22.317 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[37e521c4-2559-4314-ab5d-78ce0dd8ab11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:22.318 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5.pid.haproxy
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:36:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:22.318 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5', 'env', 'PROCESS_TAG=haproxy-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:36:22 compute-0 nova_compute[183075]: 2026-01-22 17:36:22.420 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103382.419678, 3fd8923a-65e6-42d0-a866-17500a9df5cf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:36:22 compute-0 nova_compute[183075]: 2026-01-22 17:36:22.420 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] VM Started (Lifecycle Event)
Jan 22 17:36:22 compute-0 nova_compute[183075]: 2026-01-22 17:36:22.527 183079 DEBUG nova.compute.manager [req-2363d1b1-f442-475e-b6f8-8369e71a63d0 req-893f0cab-4f8b-4b84-a0e9-16af89747692 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Received event network-vif-plugged-42eadaf5-c2f9-4e11-a3dc-3e5003423156 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:36:22 compute-0 nova_compute[183075]: 2026-01-22 17:36:22.527 183079 DEBUG oslo_concurrency.lockutils [req-2363d1b1-f442-475e-b6f8-8369e71a63d0 req-893f0cab-4f8b-4b84-a0e9-16af89747692 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "3fd8923a-65e6-42d0-a866-17500a9df5cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:36:22 compute-0 nova_compute[183075]: 2026-01-22 17:36:22.528 183079 DEBUG oslo_concurrency.lockutils [req-2363d1b1-f442-475e-b6f8-8369e71a63d0 req-893f0cab-4f8b-4b84-a0e9-16af89747692 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "3fd8923a-65e6-42d0-a866-17500a9df5cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:36:22 compute-0 nova_compute[183075]: 2026-01-22 17:36:22.528 183079 DEBUG oslo_concurrency.lockutils [req-2363d1b1-f442-475e-b6f8-8369e71a63d0 req-893f0cab-4f8b-4b84-a0e9-16af89747692 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "3fd8923a-65e6-42d0-a866-17500a9df5cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:36:22 compute-0 nova_compute[183075]: 2026-01-22 17:36:22.528 183079 DEBUG nova.compute.manager [req-2363d1b1-f442-475e-b6f8-8369e71a63d0 req-893f0cab-4f8b-4b84-a0e9-16af89747692 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Processing event network-vif-plugged-42eadaf5-c2f9-4e11-a3dc-3e5003423156 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:36:22 compute-0 nova_compute[183075]: 2026-01-22 17:36:22.529 183079 DEBUG nova.compute.manager [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:36:22 compute-0 nova_compute[183075]: 2026-01-22 17:36:22.534 183079 DEBUG nova.virt.libvirt.driver [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:36:22 compute-0 nova_compute[183075]: 2026-01-22 17:36:22.537 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:36:22 compute-0 nova_compute[183075]: 2026-01-22 17:36:22.540 183079 INFO nova.virt.libvirt.driver [-] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Instance spawned successfully.
Jan 22 17:36:22 compute-0 nova_compute[183075]: 2026-01-22 17:36:22.541 183079 DEBUG nova.virt.libvirt.driver [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:36:22 compute-0 nova_compute[183075]: 2026-01-22 17:36:22.543 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:36:22 compute-0 podman[235603]: 2026-01-22 17:36:22.722455952 +0000 UTC m=+0.027163470 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:36:22 compute-0 nova_compute[183075]: 2026-01-22 17:36:22.823 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:36:22 compute-0 nova_compute[183075]: 2026-01-22 17:36:22.823 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103382.4199667, 3fd8923a-65e6-42d0-a866-17500a9df5cf => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:36:22 compute-0 nova_compute[183075]: 2026-01-22 17:36:22.823 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] VM Paused (Lifecycle Event)
Jan 22 17:36:22 compute-0 nova_compute[183075]: 2026-01-22 17:36:22.827 183079 DEBUG nova.virt.libvirt.driver [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:36:22 compute-0 nova_compute[183075]: 2026-01-22 17:36:22.827 183079 DEBUG nova.virt.libvirt.driver [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:36:22 compute-0 nova_compute[183075]: 2026-01-22 17:36:22.828 183079 DEBUG nova.virt.libvirt.driver [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:36:22 compute-0 nova_compute[183075]: 2026-01-22 17:36:22.828 183079 DEBUG nova.virt.libvirt.driver [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:36:22 compute-0 nova_compute[183075]: 2026-01-22 17:36:22.829 183079 DEBUG nova.virt.libvirt.driver [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:36:22 compute-0 nova_compute[183075]: 2026-01-22 17:36:22.829 183079 DEBUG nova.virt.libvirt.driver [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:36:23 compute-0 nova_compute[183075]: 2026-01-22 17:36:23.034 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:23 compute-0 podman[235603]: 2026-01-22 17:36:23.557485774 +0000 UTC m=+0.862193272 container create c0f33adcd3a72a72f41f78a3cfc10a941e36145809accf9b38e5eb9e2b9e2c12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 22 17:36:23 compute-0 nova_compute[183075]: 2026-01-22 17:36:23.572 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:36:23 compute-0 nova_compute[183075]: 2026-01-22 17:36:23.577 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103382.5340858, 3fd8923a-65e6-42d0-a866-17500a9df5cf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:36:23 compute-0 nova_compute[183075]: 2026-01-22 17:36:23.577 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] VM Resumed (Lifecycle Event)
Jan 22 17:36:23 compute-0 nova_compute[183075]: 2026-01-22 17:36:23.836 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:36:23 compute-0 nova_compute[183075]: 2026-01-22 17:36:23.842 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:36:23 compute-0 systemd[1]: Started libpod-conmon-c0f33adcd3a72a72f41f78a3cfc10a941e36145809accf9b38e5eb9e2b9e2c12.scope.
Jan 22 17:36:23 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:36:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52939867bdca6d7b9c547d1d9885def44cb2216b37d4fe83ecb7a629597da92e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:36:24 compute-0 nova_compute[183075]: 2026-01-22 17:36:24.044 183079 INFO nova.compute.manager [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Took 12.01 seconds to spawn the instance on the hypervisor.
Jan 22 17:36:24 compute-0 nova_compute[183075]: 2026-01-22 17:36:24.046 183079 DEBUG nova.compute.manager [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:36:24 compute-0 podman[235603]: 2026-01-22 17:36:24.064061444 +0000 UTC m=+1.368768972 container init c0f33adcd3a72a72f41f78a3cfc10a941e36145809accf9b38e5eb9e2b9e2c12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:36:24 compute-0 podman[235603]: 2026-01-22 17:36:24.075181527 +0000 UTC m=+1.379889025 container start c0f33adcd3a72a72f41f78a3cfc10a941e36145809accf9b38e5eb9e2b9e2c12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 22 17:36:24 compute-0 nova_compute[183075]: 2026-01-22 17:36:24.095 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:36:24 compute-0 neutron-haproxy-ovnmeta-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5[235618]: [NOTICE]   (235622) : New worker (235624) forked
Jan 22 17:36:24 compute-0 neutron-haproxy-ovnmeta-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5[235618]: [NOTICE]   (235622) : Loading success.
Jan 22 17:36:24 compute-0 nova_compute[183075]: 2026-01-22 17:36:24.553 183079 INFO nova.compute.manager [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Took 13.50 seconds to build instance.
Jan 22 17:36:24 compute-0 nova_compute[183075]: 2026-01-22 17:36:24.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:36:25 compute-0 nova_compute[183075]: 2026-01-22 17:36:25.189 183079 DEBUG oslo_concurrency.lockutils [None req-bc97dd57-7055-4c66-b875-9976bbbf6296 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "3fd8923a-65e6-42d0-a866-17500a9df5cf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.309s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:36:25 compute-0 podman[235633]: 2026-01-22 17:36:25.390590005 +0000 UTC m=+0.090870575 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:36:25 compute-0 nova_compute[183075]: 2026-01-22 17:36:25.720 183079 DEBUG nova.compute.manager [req-1c8cff4f-8ac3-4ea7-9cac-7fa791631ca3 req-acb9fc82-7ad3-492a-bef3-bc293537690f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Received event network-vif-plugged-42eadaf5-c2f9-4e11-a3dc-3e5003423156 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:36:25 compute-0 nova_compute[183075]: 2026-01-22 17:36:25.721 183079 DEBUG oslo_concurrency.lockutils [req-1c8cff4f-8ac3-4ea7-9cac-7fa791631ca3 req-acb9fc82-7ad3-492a-bef3-bc293537690f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "3fd8923a-65e6-42d0-a866-17500a9df5cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:36:25 compute-0 nova_compute[183075]: 2026-01-22 17:36:25.721 183079 DEBUG oslo_concurrency.lockutils [req-1c8cff4f-8ac3-4ea7-9cac-7fa791631ca3 req-acb9fc82-7ad3-492a-bef3-bc293537690f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "3fd8923a-65e6-42d0-a866-17500a9df5cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:36:25 compute-0 nova_compute[183075]: 2026-01-22 17:36:25.721 183079 DEBUG oslo_concurrency.lockutils [req-1c8cff4f-8ac3-4ea7-9cac-7fa791631ca3 req-acb9fc82-7ad3-492a-bef3-bc293537690f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "3fd8923a-65e6-42d0-a866-17500a9df5cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:36:25 compute-0 nova_compute[183075]: 2026-01-22 17:36:25.722 183079 DEBUG nova.compute.manager [req-1c8cff4f-8ac3-4ea7-9cac-7fa791631ca3 req-acb9fc82-7ad3-492a-bef3-bc293537690f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] No waiting events found dispatching network-vif-plugged-42eadaf5-c2f9-4e11-a3dc-3e5003423156 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:36:25 compute-0 nova_compute[183075]: 2026-01-22 17:36:25.722 183079 WARNING nova.compute.manager [req-1c8cff4f-8ac3-4ea7-9cac-7fa791631ca3 req-acb9fc82-7ad3-492a-bef3-bc293537690f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Received unexpected event network-vif-plugged-42eadaf5-c2f9-4e11-a3dc-3e5003423156 for instance with vm_state active and task_state None.
Jan 22 17:36:26 compute-0 nova_compute[183075]: 2026-01-22 17:36:26.338 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:26 compute-0 nova_compute[183075]: 2026-01-22 17:36:26.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:36:26 compute-0 nova_compute[183075]: 2026-01-22 17:36:26.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:36:28 compute-0 nova_compute[183075]: 2026-01-22 17:36:28.037 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:28 compute-0 nova_compute[183075]: 2026-01-22 17:36:28.263 183079 INFO nova.compute.manager [None req-8fc14c14-16f6-4cbe-8e32-789f49ee5000 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Get console output
Jan 22 17:36:29 compute-0 podman[235657]: 2026-01-22 17:36:29.351409078 +0000 UTC m=+0.053973460 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 17:36:31 compute-0 nova_compute[183075]: 2026-01-22 17:36:31.188 183079 DEBUG oslo_concurrency.lockutils [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Acquiring lock "b7b02307-4481-472f-8bdc-707a9d19f350" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:36:31 compute-0 nova_compute[183075]: 2026-01-22 17:36:31.189 183079 DEBUG oslo_concurrency.lockutils [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "b7b02307-4481-472f-8bdc-707a9d19f350" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:36:31 compute-0 nova_compute[183075]: 2026-01-22 17:36:31.327 183079 DEBUG nova.compute.manager [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:36:31 compute-0 nova_compute[183075]: 2026-01-22 17:36:31.341 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:31 compute-0 nova_compute[183075]: 2026-01-22 17:36:31.394 183079 DEBUG oslo_concurrency.lockutils [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:36:31 compute-0 nova_compute[183075]: 2026-01-22 17:36:31.395 183079 DEBUG oslo_concurrency.lockutils [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:36:31 compute-0 nova_compute[183075]: 2026-01-22 17:36:31.401 183079 DEBUG nova.virt.hardware [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:36:31 compute-0 nova_compute[183075]: 2026-01-22 17:36:31.402 183079 INFO nova.compute.claims [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:36:31 compute-0 nova_compute[183075]: 2026-01-22 17:36:31.560 183079 DEBUG nova.compute.provider_tree [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:36:31 compute-0 nova_compute[183075]: 2026-01-22 17:36:31.579 183079 DEBUG nova.scheduler.client.report [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:36:31 compute-0 nova_compute[183075]: 2026-01-22 17:36:31.659 183079 DEBUG oslo_concurrency.lockutils [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.264s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:36:31 compute-0 nova_compute[183075]: 2026-01-22 17:36:31.660 183079 DEBUG nova.compute.manager [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:36:31 compute-0 nova_compute[183075]: 2026-01-22 17:36:31.752 183079 DEBUG nova.compute.manager [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:36:31 compute-0 nova_compute[183075]: 2026-01-22 17:36:31.753 183079 DEBUG nova.network.neutron [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:36:31 compute-0 nova_compute[183075]: 2026-01-22 17:36:31.827 183079 INFO nova.virt.libvirt.driver [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:36:31 compute-0 nova_compute[183075]: 2026-01-22 17:36:31.850 183079 DEBUG nova.compute.manager [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:36:31 compute-0 nova_compute[183075]: 2026-01-22 17:36:31.931 183079 DEBUG nova.compute.manager [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:36:31 compute-0 nova_compute[183075]: 2026-01-22 17:36:31.933 183079 DEBUG nova.virt.libvirt.driver [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:36:31 compute-0 nova_compute[183075]: 2026-01-22 17:36:31.933 183079 INFO nova.virt.libvirt.driver [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Creating image(s)
Jan 22 17:36:31 compute-0 nova_compute[183075]: 2026-01-22 17:36:31.934 183079 DEBUG oslo_concurrency.lockutils [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Acquiring lock "/var/lib/nova/instances/b7b02307-4481-472f-8bdc-707a9d19f350/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:36:31 compute-0 nova_compute[183075]: 2026-01-22 17:36:31.934 183079 DEBUG oslo_concurrency.lockutils [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "/var/lib/nova/instances/b7b02307-4481-472f-8bdc-707a9d19f350/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:36:31 compute-0 nova_compute[183075]: 2026-01-22 17:36:31.935 183079 DEBUG oslo_concurrency.lockutils [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "/var/lib/nova/instances/b7b02307-4481-472f-8bdc-707a9d19f350/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:36:31 compute-0 nova_compute[183075]: 2026-01-22 17:36:31.948 183079 DEBUG oslo_concurrency.processutils [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:36:32 compute-0 nova_compute[183075]: 2026-01-22 17:36:32.002 183079 DEBUG oslo_concurrency.processutils [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:36:32 compute-0 nova_compute[183075]: 2026-01-22 17:36:32.004 183079 DEBUG oslo_concurrency.lockutils [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:36:32 compute-0 nova_compute[183075]: 2026-01-22 17:36:32.004 183079 DEBUG oslo_concurrency.lockutils [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:36:32 compute-0 nova_compute[183075]: 2026-01-22 17:36:32.016 183079 DEBUG oslo_concurrency.processutils [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:36:32 compute-0 nova_compute[183075]: 2026-01-22 17:36:32.071 183079 DEBUG oslo_concurrency.processutils [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:36:32 compute-0 nova_compute[183075]: 2026-01-22 17:36:32.073 183079 DEBUG oslo_concurrency.processutils [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/b7b02307-4481-472f-8bdc-707a9d19f350/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:36:32 compute-0 nova_compute[183075]: 2026-01-22 17:36:32.133 183079 DEBUG oslo_concurrency.processutils [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/b7b02307-4481-472f-8bdc-707a9d19f350/disk 1073741824" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:36:32 compute-0 nova_compute[183075]: 2026-01-22 17:36:32.135 183079 DEBUG oslo_concurrency.lockutils [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:36:32 compute-0 nova_compute[183075]: 2026-01-22 17:36:32.135 183079 DEBUG oslo_concurrency.processutils [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:36:32 compute-0 nova_compute[183075]: 2026-01-22 17:36:32.191 183079 DEBUG oslo_concurrency.processutils [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:36:32 compute-0 nova_compute[183075]: 2026-01-22 17:36:32.192 183079 DEBUG nova.virt.disk.api [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Checking if we can resize image /var/lib/nova/instances/b7b02307-4481-472f-8bdc-707a9d19f350/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:36:32 compute-0 nova_compute[183075]: 2026-01-22 17:36:32.193 183079 DEBUG oslo_concurrency.processutils [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b7b02307-4481-472f-8bdc-707a9d19f350/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:36:32 compute-0 nova_compute[183075]: 2026-01-22 17:36:32.248 183079 DEBUG oslo_concurrency.processutils [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b7b02307-4481-472f-8bdc-707a9d19f350/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:36:32 compute-0 nova_compute[183075]: 2026-01-22 17:36:32.249 183079 DEBUG nova.virt.disk.api [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Cannot resize image /var/lib/nova/instances/b7b02307-4481-472f-8bdc-707a9d19f350/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:36:32 compute-0 nova_compute[183075]: 2026-01-22 17:36:32.249 183079 DEBUG nova.objects.instance [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lazy-loading 'migration_context' on Instance uuid b7b02307-4481-472f-8bdc-707a9d19f350 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:36:32 compute-0 nova_compute[183075]: 2026-01-22 17:36:32.261 183079 DEBUG nova.virt.libvirt.driver [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:36:32 compute-0 nova_compute[183075]: 2026-01-22 17:36:32.262 183079 DEBUG nova.virt.libvirt.driver [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Ensure instance console log exists: /var/lib/nova/instances/b7b02307-4481-472f-8bdc-707a9d19f350/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:36:32 compute-0 nova_compute[183075]: 2026-01-22 17:36:32.262 183079 DEBUG oslo_concurrency.lockutils [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:36:32 compute-0 nova_compute[183075]: 2026-01-22 17:36:32.263 183079 DEBUG oslo_concurrency.lockutils [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:36:32 compute-0 nova_compute[183075]: 2026-01-22 17:36:32.263 183079 DEBUG oslo_concurrency.lockutils [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:36:32 compute-0 nova_compute[183075]: 2026-01-22 17:36:32.357 183079 DEBUG nova.policy [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:36:32 compute-0 nova_compute[183075]: 2026-01-22 17:36:32.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:36:33 compute-0 nova_compute[183075]: 2026-01-22 17:36:33.039 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:34 compute-0 nova_compute[183075]: 2026-01-22 17:36:34.206 183079 INFO nova.compute.manager [None req-5dde837f-b3cb-4a0c-b0da-498e9c951436 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Get console output
Jan 22 17:36:34 compute-0 nova_compute[183075]: 2026-01-22 17:36:34.213 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:36:34 compute-0 nova_compute[183075]: 2026-01-22 17:36:34.409 183079 DEBUG nova.network.neutron [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Successfully updated port: 3a7e69a2-e6a0-4e5f-883e-d116981b58d9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:36:34 compute-0 nova_compute[183075]: 2026-01-22 17:36:34.423 183079 DEBUG oslo_concurrency.lockutils [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Acquiring lock "refresh_cache-b7b02307-4481-472f-8bdc-707a9d19f350" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:36:34 compute-0 nova_compute[183075]: 2026-01-22 17:36:34.424 183079 DEBUG oslo_concurrency.lockutils [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Acquired lock "refresh_cache-b7b02307-4481-472f-8bdc-707a9d19f350" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:36:34 compute-0 nova_compute[183075]: 2026-01-22 17:36:34.424 183079 DEBUG nova.network.neutron [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:36:34 compute-0 nova_compute[183075]: 2026-01-22 17:36:34.497 183079 DEBUG nova.compute.manager [req-4d1d14e7-2958-4ac7-b15d-e730f0e5eaa3 req-69212931-b904-44fa-8068-ba37619554ad a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Received event network-changed-3a7e69a2-e6a0-4e5f-883e-d116981b58d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:36:34 compute-0 nova_compute[183075]: 2026-01-22 17:36:34.497 183079 DEBUG nova.compute.manager [req-4d1d14e7-2958-4ac7-b15d-e730f0e5eaa3 req-69212931-b904-44fa-8068-ba37619554ad a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Refreshing instance network info cache due to event network-changed-3a7e69a2-e6a0-4e5f-883e-d116981b58d9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:36:34 compute-0 nova_compute[183075]: 2026-01-22 17:36:34.498 183079 DEBUG oslo_concurrency.lockutils [req-4d1d14e7-2958-4ac7-b15d-e730f0e5eaa3 req-69212931-b904-44fa-8068-ba37619554ad a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-b7b02307-4481-472f-8bdc-707a9d19f350" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:36:35 compute-0 ovn_controller[95372]: 2026-01-22T17:36:35Z|00120|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:23:26:1f 10.100.0.23
Jan 22 17:36:35 compute-0 ovn_controller[95372]: 2026-01-22T17:36:35Z|00121|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:23:26:1f 10.100.0.23
Jan 22 17:36:35 compute-0 nova_compute[183075]: 2026-01-22 17:36:35.171 183079 DEBUG nova.network.neutron [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:36:35 compute-0 nova_compute[183075]: 2026-01-22 17:36:35.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.343 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.425 183079 DEBUG nova.network.neutron [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Updating instance_info_cache with network_info: [{"id": "3a7e69a2-e6a0-4e5f-883e-d116981b58d9", "address": "fa:16:3e:ea:41:ca", "network": {"id": "f8ae4b18-347c-4ff3-b9f8-578518ecd408", "bridge": "br-int", "label": "tempest-TrunkTest-1480081563", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e92551896d2c49b5b149b1a5a0cc1761", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a7e69a2-e6", "ovs_interfaceid": "3a7e69a2-e6a0-4e5f-883e-d116981b58d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.447 183079 DEBUG oslo_concurrency.lockutils [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Releasing lock "refresh_cache-b7b02307-4481-472f-8bdc-707a9d19f350" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.447 183079 DEBUG nova.compute.manager [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Instance network_info: |[{"id": "3a7e69a2-e6a0-4e5f-883e-d116981b58d9", "address": "fa:16:3e:ea:41:ca", "network": {"id": "f8ae4b18-347c-4ff3-b9f8-578518ecd408", "bridge": "br-int", "label": "tempest-TrunkTest-1480081563", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e92551896d2c49b5b149b1a5a0cc1761", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a7e69a2-e6", "ovs_interfaceid": "3a7e69a2-e6a0-4e5f-883e-d116981b58d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.448 183079 DEBUG oslo_concurrency.lockutils [req-4d1d14e7-2958-4ac7-b15d-e730f0e5eaa3 req-69212931-b904-44fa-8068-ba37619554ad a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-b7b02307-4481-472f-8bdc-707a9d19f350" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.448 183079 DEBUG nova.network.neutron [req-4d1d14e7-2958-4ac7-b15d-e730f0e5eaa3 req-69212931-b904-44fa-8068-ba37619554ad a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Refreshing network info cache for port 3a7e69a2-e6a0-4e5f-883e-d116981b58d9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.450 183079 DEBUG nova.virt.libvirt.driver [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Start _get_guest_xml network_info=[{"id": "3a7e69a2-e6a0-4e5f-883e-d116981b58d9", "address": "fa:16:3e:ea:41:ca", "network": {"id": "f8ae4b18-347c-4ff3-b9f8-578518ecd408", "bridge": "br-int", "label": "tempest-TrunkTest-1480081563", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e92551896d2c49b5b149b1a5a0cc1761", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a7e69a2-e6", "ovs_interfaceid": "3a7e69a2-e6a0-4e5f-883e-d116981b58d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.454 183079 WARNING nova.virt.libvirt.driver [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.458 183079 DEBUG nova.virt.libvirt.host [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.459 183079 DEBUG nova.virt.libvirt.host [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.462 183079 DEBUG nova.virt.libvirt.host [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.462 183079 DEBUG nova.virt.libvirt.host [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.462 183079 DEBUG nova.virt.libvirt.driver [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.463 183079 DEBUG nova.virt.hardware [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.463 183079 DEBUG nova.virt.hardware [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.463 183079 DEBUG nova.virt.hardware [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.464 183079 DEBUG nova.virt.hardware [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.464 183079 DEBUG nova.virt.hardware [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.464 183079 DEBUG nova.virt.hardware [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.464 183079 DEBUG nova.virt.hardware [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.464 183079 DEBUG nova.virt.hardware [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.465 183079 DEBUG nova.virt.hardware [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.465 183079 DEBUG nova.virt.hardware [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.465 183079 DEBUG nova.virt.hardware [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.468 183079 DEBUG nova.virt.libvirt.vif [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:36:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-104321321',display_name='tempest-server-test-104321321',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-104321321',id=60,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMx6tQlM3uv2ZwDgQasADvvbGd5zBlSeFHyt6pKaXPuo5g1lBpnMysyabNjP8htj/tP0P4meLZoYHTsZRxp2O0FBGUiyAm9KZdR/DNDaP0hn5KYm00UnMkjIWdjBNqhB9Q==',key_name='tempest-TrunkTest-1480081563',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e92551896d2c49b5b149b1a5a0cc1761',ramdisk_id='',reservation_id='r-iwasr1w4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TrunkTest-252091256',owner_user_name='tempest-TrunkTest-252091256-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:36:31Z,user_data=None,user_id='7cc2886d6b0e400d8096a810a2159f3c',uuid=b7b02307-4481-472f-8bdc-707a9d19f350,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3a7e69a2-e6a0-4e5f-883e-d116981b58d9", "address": "fa:16:3e:ea:41:ca", "network": {"id": "f8ae4b18-347c-4ff3-b9f8-578518ecd408", "bridge": "br-int", "label": "tempest-TrunkTest-1480081563", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e92551896d2c49b5b149b1a5a0cc1761", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a7e69a2-e6", "ovs_interfaceid": "3a7e69a2-e6a0-4e5f-883e-d116981b58d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.468 183079 DEBUG nova.network.os_vif_util [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Converting VIF {"id": "3a7e69a2-e6a0-4e5f-883e-d116981b58d9", "address": "fa:16:3e:ea:41:ca", "network": {"id": "f8ae4b18-347c-4ff3-b9f8-578518ecd408", "bridge": "br-int", "label": "tempest-TrunkTest-1480081563", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e92551896d2c49b5b149b1a5a0cc1761", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a7e69a2-e6", "ovs_interfaceid": "3a7e69a2-e6a0-4e5f-883e-d116981b58d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.469 183079 DEBUG nova.network.os_vif_util [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ea:41:ca,bridge_name='br-int',has_traffic_filtering=True,id=3a7e69a2-e6a0-4e5f-883e-d116981b58d9,network=Network(f8ae4b18-347c-4ff3-b9f8-578518ecd408),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3a7e69a2-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.470 183079 DEBUG nova.objects.instance [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lazy-loading 'pci_devices' on Instance uuid b7b02307-4481-472f-8bdc-707a9d19f350 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.489 183079 DEBUG nova.virt.libvirt.driver [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:36:36 compute-0 nova_compute[183075]:   <uuid>b7b02307-4481-472f-8bdc-707a9d19f350</uuid>
Jan 22 17:36:36 compute-0 nova_compute[183075]:   <name>instance-0000003c</name>
Jan 22 17:36:36 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:36:36 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:36:36 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:36:36 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-104321321</nova:name>
Jan 22 17:36:36 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:36:36</nova:creationTime>
Jan 22 17:36:36 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:36:36 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:36:36 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:36:36 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:36:36 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:36:36 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:36:36 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:36:36 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:36:36 compute-0 nova_compute[183075]:         <nova:user uuid="7cc2886d6b0e400d8096a810a2159f3c">tempest-TrunkTest-252091256-project-member</nova:user>
Jan 22 17:36:36 compute-0 nova_compute[183075]:         <nova:project uuid="e92551896d2c49b5b149b1a5a0cc1761">tempest-TrunkTest-252091256</nova:project>
Jan 22 17:36:36 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:36:36 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:36:36 compute-0 nova_compute[183075]:         <nova:port uuid="3a7e69a2-e6a0-4e5f-883e-d116981b58d9">
Jan 22 17:36:36 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:36:36 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:36:36 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:36:36 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <system>
Jan 22 17:36:36 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:36:36 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:36:36 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:36:36 compute-0 nova_compute[183075]:       <entry name="serial">b7b02307-4481-472f-8bdc-707a9d19f350</entry>
Jan 22 17:36:36 compute-0 nova_compute[183075]:       <entry name="uuid">b7b02307-4481-472f-8bdc-707a9d19f350</entry>
Jan 22 17:36:36 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     </system>
Jan 22 17:36:36 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:36:36 compute-0 nova_compute[183075]:   <os>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:   </os>
Jan 22 17:36:36 compute-0 nova_compute[183075]:   <features>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:   </features>
Jan 22 17:36:36 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:36:36 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:36:36 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:36:36 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/b7b02307-4481-472f-8bdc-707a9d19f350/disk"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:36:36 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:ea:41:ca"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:       <target dev="tap3a7e69a2-e6"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:36:36 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/b7b02307-4481-472f-8bdc-707a9d19f350/console.log" append="off"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <video>
Jan 22 17:36:36 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     </video>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:36:36 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:36:36 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:36:36 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:36:36 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:36:36 compute-0 nova_compute[183075]: </domain>
Jan 22 17:36:36 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.491 183079 DEBUG nova.compute.manager [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Preparing to wait for external event network-vif-plugged-3a7e69a2-e6a0-4e5f-883e-d116981b58d9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.491 183079 DEBUG oslo_concurrency.lockutils [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Acquiring lock "b7b02307-4481-472f-8bdc-707a9d19f350-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.492 183079 DEBUG oslo_concurrency.lockutils [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "b7b02307-4481-472f-8bdc-707a9d19f350-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.492 183079 DEBUG oslo_concurrency.lockutils [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "b7b02307-4481-472f-8bdc-707a9d19f350-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.493 183079 DEBUG nova.virt.libvirt.vif [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:36:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-104321321',display_name='tempest-server-test-104321321',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-104321321',id=60,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMx6tQlM3uv2ZwDgQasADvvbGd5zBlSeFHyt6pKaXPuo5g1lBpnMysyabNjP8htj/tP0P4meLZoYHTsZRxp2O0FBGUiyAm9KZdR/DNDaP0hn5KYm00UnMkjIWdjBNqhB9Q==',key_name='tempest-TrunkTest-1480081563',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e92551896d2c49b5b149b1a5a0cc1761',ramdisk_id='',reservation_id='r-iwasr1w4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TrunkTest-252091256',owner_user_name='tempest-TrunkTest-252091256-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:36:31Z,user_data=None,user_id='7cc2886d6b0e400d8096a810a2159f3c',uuid=b7b02307-4481-472f-8bdc-707a9d19f350,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3a7e69a2-e6a0-4e5f-883e-d116981b58d9", "address": "fa:16:3e:ea:41:ca", "network": {"id": "f8ae4b18-347c-4ff3-b9f8-578518ecd408", "bridge": "br-int", "label": "tempest-TrunkTest-1480081563", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e92551896d2c49b5b149b1a5a0cc1761", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a7e69a2-e6", "ovs_interfaceid": "3a7e69a2-e6a0-4e5f-883e-d116981b58d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.493 183079 DEBUG nova.network.os_vif_util [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Converting VIF {"id": "3a7e69a2-e6a0-4e5f-883e-d116981b58d9", "address": "fa:16:3e:ea:41:ca", "network": {"id": "f8ae4b18-347c-4ff3-b9f8-578518ecd408", "bridge": "br-int", "label": "tempest-TrunkTest-1480081563", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e92551896d2c49b5b149b1a5a0cc1761", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a7e69a2-e6", "ovs_interfaceid": "3a7e69a2-e6a0-4e5f-883e-d116981b58d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.494 183079 DEBUG nova.network.os_vif_util [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ea:41:ca,bridge_name='br-int',has_traffic_filtering=True,id=3a7e69a2-e6a0-4e5f-883e-d116981b58d9,network=Network(f8ae4b18-347c-4ff3-b9f8-578518ecd408),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3a7e69a2-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.494 183079 DEBUG os_vif [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ea:41:ca,bridge_name='br-int',has_traffic_filtering=True,id=3a7e69a2-e6a0-4e5f-883e-d116981b58d9,network=Network(f8ae4b18-347c-4ff3-b9f8-578518ecd408),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3a7e69a2-e6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.495 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.495 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.495 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.499 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.499 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3a7e69a2-e6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.500 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3a7e69a2-e6, col_values=(('external_ids', {'iface-id': '3a7e69a2-e6a0-4e5f-883e-d116981b58d9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ea:41:ca', 'vm-uuid': 'b7b02307-4481-472f-8bdc-707a9d19f350'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.501 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:36 compute-0 NetworkManager[55454]: <info>  [1769103396.5027] manager: (tap3a7e69a2-e6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/278)
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.504 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.508 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.508 183079 INFO os_vif [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ea:41:ca,bridge_name='br-int',has_traffic_filtering=True,id=3a7e69a2-e6a0-4e5f-883e-d116981b58d9,network=Network(f8ae4b18-347c-4ff3-b9f8-578518ecd408),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3a7e69a2-e6')
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.596 183079 DEBUG nova.virt.libvirt.driver [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.597 183079 DEBUG nova.virt.libvirt.driver [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] No VIF found with MAC fa:16:3e:ea:41:ca, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:36:36 compute-0 kernel: tap3a7e69a2-e6: entered promiscuous mode
Jan 22 17:36:36 compute-0 NetworkManager[55454]: <info>  [1769103396.6541] manager: (tap3a7e69a2-e6): new Tun device (/org/freedesktop/NetworkManager/Devices/279)
Jan 22 17:36:36 compute-0 ovn_controller[95372]: 2026-01-22T17:36:36Z|00646|binding|INFO|Claiming lport 3a7e69a2-e6a0-4e5f-883e-d116981b58d9 for this chassis.
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.655 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:36 compute-0 ovn_controller[95372]: 2026-01-22T17:36:36Z|00647|binding|INFO|3a7e69a2-e6a0-4e5f-883e-d116981b58d9: Claiming fa:16:3e:ea:41:ca 10.100.0.7
Jan 22 17:36:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:36.663 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ea:41:ca 10.100.0.7'], port_security=['fa:16:3e:ea:41:ca 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TrunkTest-1480081563', 'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'b7b02307-4481-472f-8bdc-707a9d19f350', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f8ae4b18-347c-4ff3-b9f8-578518ecd408', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TrunkTest-1480081563', 'neutron:project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'neutron:revision_number': '2', 'neutron:security_group_ids': '709a6363-b0f8-4821-8d51-15be786bf9b0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.185'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=43bd6944-fad9-4457-8b9b-34db3859c385, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=3a7e69a2-e6a0-4e5f-883e-d116981b58d9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:36:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:36.665 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 3a7e69a2-e6a0-4e5f-883e-d116981b58d9 in datapath f8ae4b18-347c-4ff3-b9f8-578518ecd408 bound to our chassis
Jan 22 17:36:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:36.666 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f8ae4b18-347c-4ff3-b9f8-578518ecd408
Jan 22 17:36:36 compute-0 ovn_controller[95372]: 2026-01-22T17:36:36Z|00648|binding|INFO|Setting lport 3a7e69a2-e6a0-4e5f-883e-d116981b58d9 ovn-installed in OVS
Jan 22 17:36:36 compute-0 ovn_controller[95372]: 2026-01-22T17:36:36Z|00649|binding|INFO|Setting lport 3a7e69a2-e6a0-4e5f-883e-d116981b58d9 up in Southbound
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.672 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:36.682 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a6ef811d-8b07-4b0c-ae43-2eee8ac612de]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:36:36 compute-0 systemd-udevd[235737]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:36:36 compute-0 systemd-machined[154382]: New machine qemu-60-instance-0000003c.
Jan 22 17:36:36 compute-0 NetworkManager[55454]: <info>  [1769103396.7004] device (tap3a7e69a2-e6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:36:36 compute-0 NetworkManager[55454]: <info>  [1769103396.7013] device (tap3a7e69a2-e6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:36:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:36.711 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[67cef2d8-befc-4bc4-ad28-3567e37f4395]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:36:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:36.714 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[a352f587-523b-4261-8eee-f084d0b64dbd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:36:36 compute-0 systemd[1]: Started Virtual Machine qemu-60-instance-0000003c.
Jan 22 17:36:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:36.738 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[5df1a03b-e2a2-4083-8651-87ba6c946640]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:36:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:36.753 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b119e582-eaa1-4f3c-b0bb-e054d26be597]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf8ae4b18-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:05:69:38'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 54, 'rx_bytes': 8920, 'tx_bytes': 6141, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 54, 'rx_bytes': 8920, 'tx_bytes': 6141, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 180], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 568874, 'reachable_time': 28814, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235745, 'error': None, 'target': 'ovnmeta-f8ae4b18-347c-4ff3-b9f8-578518ecd408', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:36:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:36.768 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[75653987-62f8-4cf2-919d-b70cc671df8b]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf8ae4b18-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 568887, 'tstamp': 568887}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235751, 'error': None, 'target': 'ovnmeta-f8ae4b18-347c-4ff3-b9f8-578518ecd408', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf8ae4b18-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 568890, 'tstamp': 568890}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235751, 'error': None, 'target': 'ovnmeta-f8ae4b18-347c-4ff3-b9f8-578518ecd408', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:36:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:36.770 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf8ae4b18-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.771 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:36 compute-0 nova_compute[183075]: 2026-01-22 17:36:36.772 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:36.773 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf8ae4b18-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:36:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:36.773 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:36:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:36.773 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf8ae4b18-30, col_values=(('external_ids', {'iface-id': 'ed3c9c99-2ee5-407d-8706-619270405578'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:36:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:36.773 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:36:37 compute-0 nova_compute[183075]: 2026-01-22 17:36:37.431 183079 DEBUG nova.compute.manager [req-ac0cbb2c-f4f2-44c5-8eef-d53ded32f810 req-608c9766-cf80-4b5d-bdf3-027b559d9567 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Received event network-vif-plugged-3a7e69a2-e6a0-4e5f-883e-d116981b58d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:36:37 compute-0 nova_compute[183075]: 2026-01-22 17:36:37.431 183079 DEBUG oslo_concurrency.lockutils [req-ac0cbb2c-f4f2-44c5-8eef-d53ded32f810 req-608c9766-cf80-4b5d-bdf3-027b559d9567 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "b7b02307-4481-472f-8bdc-707a9d19f350-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:36:37 compute-0 nova_compute[183075]: 2026-01-22 17:36:37.432 183079 DEBUG oslo_concurrency.lockutils [req-ac0cbb2c-f4f2-44c5-8eef-d53ded32f810 req-608c9766-cf80-4b5d-bdf3-027b559d9567 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b7b02307-4481-472f-8bdc-707a9d19f350-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:36:37 compute-0 nova_compute[183075]: 2026-01-22 17:36:37.432 183079 DEBUG oslo_concurrency.lockutils [req-ac0cbb2c-f4f2-44c5-8eef-d53ded32f810 req-608c9766-cf80-4b5d-bdf3-027b559d9567 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b7b02307-4481-472f-8bdc-707a9d19f350-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:36:37 compute-0 nova_compute[183075]: 2026-01-22 17:36:37.432 183079 DEBUG nova.compute.manager [req-ac0cbb2c-f4f2-44c5-8eef-d53ded32f810 req-608c9766-cf80-4b5d-bdf3-027b559d9567 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Processing event network-vif-plugged-3a7e69a2-e6a0-4e5f-883e-d116981b58d9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:36:38 compute-0 nova_compute[183075]: 2026-01-22 17:36:38.044 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103398.044123, b7b02307-4481-472f-8bdc-707a9d19f350 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:36:38 compute-0 nova_compute[183075]: 2026-01-22 17:36:38.044 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] VM Started (Lifecycle Event)
Jan 22 17:36:38 compute-0 nova_compute[183075]: 2026-01-22 17:36:38.048 183079 DEBUG nova.compute.manager [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:36:38 compute-0 nova_compute[183075]: 2026-01-22 17:36:38.051 183079 DEBUG nova.virt.libvirt.driver [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:36:38 compute-0 nova_compute[183075]: 2026-01-22 17:36:38.054 183079 INFO nova.virt.libvirt.driver [-] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Instance spawned successfully.
Jan 22 17:36:38 compute-0 nova_compute[183075]: 2026-01-22 17:36:38.054 183079 DEBUG nova.virt.libvirt.driver [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:36:38 compute-0 nova_compute[183075]: 2026-01-22 17:36:38.066 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:38 compute-0 nova_compute[183075]: 2026-01-22 17:36:38.089 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:36:38 compute-0 nova_compute[183075]: 2026-01-22 17:36:38.094 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:36:38 compute-0 nova_compute[183075]: 2026-01-22 17:36:38.097 183079 DEBUG nova.virt.libvirt.driver [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:36:38 compute-0 nova_compute[183075]: 2026-01-22 17:36:38.097 183079 DEBUG nova.virt.libvirt.driver [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:36:38 compute-0 nova_compute[183075]: 2026-01-22 17:36:38.098 183079 DEBUG nova.virt.libvirt.driver [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:36:38 compute-0 nova_compute[183075]: 2026-01-22 17:36:38.098 183079 DEBUG nova.virt.libvirt.driver [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:36:38 compute-0 nova_compute[183075]: 2026-01-22 17:36:38.098 183079 DEBUG nova.virt.libvirt.driver [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:36:38 compute-0 nova_compute[183075]: 2026-01-22 17:36:38.099 183079 DEBUG nova.virt.libvirt.driver [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:36:38 compute-0 nova_compute[183075]: 2026-01-22 17:36:38.135 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:36:38 compute-0 nova_compute[183075]: 2026-01-22 17:36:38.135 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103398.0451303, b7b02307-4481-472f-8bdc-707a9d19f350 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:36:38 compute-0 nova_compute[183075]: 2026-01-22 17:36:38.135 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] VM Paused (Lifecycle Event)
Jan 22 17:36:38 compute-0 nova_compute[183075]: 2026-01-22 17:36:38.165 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:36:38 compute-0 nova_compute[183075]: 2026-01-22 17:36:38.169 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103398.0506387, b7b02307-4481-472f-8bdc-707a9d19f350 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:36:38 compute-0 nova_compute[183075]: 2026-01-22 17:36:38.169 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] VM Resumed (Lifecycle Event)
Jan 22 17:36:38 compute-0 nova_compute[183075]: 2026-01-22 17:36:38.238 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:36:38 compute-0 nova_compute[183075]: 2026-01-22 17:36:38.240 183079 INFO nova.compute.manager [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Took 6.31 seconds to spawn the instance on the hypervisor.
Jan 22 17:36:38 compute-0 nova_compute[183075]: 2026-01-22 17:36:38.240 183079 DEBUG nova.compute.manager [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:36:38 compute-0 nova_compute[183075]: 2026-01-22 17:36:38.244 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:36:38 compute-0 nova_compute[183075]: 2026-01-22 17:36:38.282 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:36:38 compute-0 nova_compute[183075]: 2026-01-22 17:36:38.316 183079 INFO nova.compute.manager [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Took 6.95 seconds to build instance.
Jan 22 17:36:38 compute-0 nova_compute[183075]: 2026-01-22 17:36:38.334 183079 DEBUG oslo_concurrency.lockutils [None req-80faccad-502b-4704-96cc-2b6ddebd0782 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "b7b02307-4481-472f-8bdc-707a9d19f350" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:36:38 compute-0 nova_compute[183075]: 2026-01-22 17:36:38.693 183079 INFO nova.compute.manager [None req-5505c0a3-d043-49de-8623-f5c2b1e5b737 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Get console output
Jan 22 17:36:38 compute-0 nova_compute[183075]: 2026-01-22 17:36:38.697 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:36:38 compute-0 nova_compute[183075]: 2026-01-22 17:36:38.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:36:38 compute-0 nova_compute[183075]: 2026-01-22 17:36:38.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:36:39 compute-0 nova_compute[183075]: 2026-01-22 17:36:39.199 183079 DEBUG nova.network.neutron [req-4d1d14e7-2958-4ac7-b15d-e730f0e5eaa3 req-69212931-b904-44fa-8068-ba37619554ad a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Updated VIF entry in instance network info cache for port 3a7e69a2-e6a0-4e5f-883e-d116981b58d9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:36:39 compute-0 nova_compute[183075]: 2026-01-22 17:36:39.199 183079 DEBUG nova.network.neutron [req-4d1d14e7-2958-4ac7-b15d-e730f0e5eaa3 req-69212931-b904-44fa-8068-ba37619554ad a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Updating instance_info_cache with network_info: [{"id": "3a7e69a2-e6a0-4e5f-883e-d116981b58d9", "address": "fa:16:3e:ea:41:ca", "network": {"id": "f8ae4b18-347c-4ff3-b9f8-578518ecd408", "bridge": "br-int", "label": "tempest-TrunkTest-1480081563", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e92551896d2c49b5b149b1a5a0cc1761", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a7e69a2-e6", "ovs_interfaceid": "3a7e69a2-e6a0-4e5f-883e-d116981b58d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:36:39 compute-0 nova_compute[183075]: 2026-01-22 17:36:39.216 183079 DEBUG oslo_concurrency.lockutils [req-4d1d14e7-2958-4ac7-b15d-e730f0e5eaa3 req-69212931-b904-44fa-8068-ba37619554ad a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-b7b02307-4481-472f-8bdc-707a9d19f350" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:36:39 compute-0 nova_compute[183075]: 2026-01-22 17:36:39.330 183079 INFO nova.compute.manager [None req-9fab1e30-32f3-4f75-90ab-fa5a95d8a4c9 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Get console output
Jan 22 17:36:39 compute-0 nova_compute[183075]: 2026-01-22 17:36:39.334 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:36:39 compute-0 nova_compute[183075]: 2026-01-22 17:36:39.521 183079 DEBUG nova.compute.manager [req-77221be5-e19f-4a11-a1bd-a85a5648fb83 req-77585f3d-d36a-48b6-9b03-9c6f7ded79f3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Received event network-vif-plugged-3a7e69a2-e6a0-4e5f-883e-d116981b58d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:36:39 compute-0 nova_compute[183075]: 2026-01-22 17:36:39.522 183079 DEBUG oslo_concurrency.lockutils [req-77221be5-e19f-4a11-a1bd-a85a5648fb83 req-77585f3d-d36a-48b6-9b03-9c6f7ded79f3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "b7b02307-4481-472f-8bdc-707a9d19f350-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:36:39 compute-0 nova_compute[183075]: 2026-01-22 17:36:39.522 183079 DEBUG oslo_concurrency.lockutils [req-77221be5-e19f-4a11-a1bd-a85a5648fb83 req-77585f3d-d36a-48b6-9b03-9c6f7ded79f3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b7b02307-4481-472f-8bdc-707a9d19f350-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:36:39 compute-0 nova_compute[183075]: 2026-01-22 17:36:39.522 183079 DEBUG oslo_concurrency.lockutils [req-77221be5-e19f-4a11-a1bd-a85a5648fb83 req-77585f3d-d36a-48b6-9b03-9c6f7ded79f3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b7b02307-4481-472f-8bdc-707a9d19f350-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:36:39 compute-0 nova_compute[183075]: 2026-01-22 17:36:39.522 183079 DEBUG nova.compute.manager [req-77221be5-e19f-4a11-a1bd-a85a5648fb83 req-77585f3d-d36a-48b6-9b03-9c6f7ded79f3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] No waiting events found dispatching network-vif-plugged-3a7e69a2-e6a0-4e5f-883e-d116981b58d9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:36:39 compute-0 nova_compute[183075]: 2026-01-22 17:36:39.522 183079 WARNING nova.compute.manager [req-77221be5-e19f-4a11-a1bd-a85a5648fb83 req-77585f3d-d36a-48b6-9b03-9c6f7ded79f3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Received unexpected event network-vif-plugged-3a7e69a2-e6a0-4e5f-883e-d116981b58d9 for instance with vm_state active and task_state None.
Jan 22 17:36:39 compute-0 nova_compute[183075]: 2026-01-22 17:36:39.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:36:39 compute-0 nova_compute[183075]: 2026-01-22 17:36:39.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.019 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.020 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.23
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:36:40 compute-0 nova_compute[183075]: 2026-01-22 17:36:40.182 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-585cde24-8038-40b2-97ce-5d30e6ecfc03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:36:40 compute-0 nova_compute[183075]: 2026-01-22 17:36:40.182 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-585cde24-8038-40b2-97ce-5d30e6ecfc03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:36:40 compute-0 nova_compute[183075]: 2026-01-22 17:36:40.183 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:36:40 compute-0 podman[235761]: 2026-01-22 17:36:40.349820442 +0000 UTC m=+0.045272163 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 22 17:36:40 compute-0 podman[235762]: 2026-01-22 17:36:40.36957626 +0000 UTC m=+0.063719565 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, release=1755695350, version=9.6)
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.390 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.391 104990 INFO eventlet.wsgi.server [-] 10.100.0.23,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.3710072
Jan 22 17:36:40 compute-0 haproxy-metadata-proxy-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5[235624]: 10.100.0.23:54628 [22/Jan/2026:17:36:40.018] listener listener/metadata 0/0/0/398/398 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.424 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.425 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.23
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:36:40 compute-0 podman[235760]: 2026-01-22 17:36:40.436959795 +0000 UTC m=+0.135000787 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.449 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.449 104990 INFO eventlet.wsgi.server [-] 10.100.0.23,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 168 time: 0.0242031
Jan 22 17:36:40 compute-0 haproxy-metadata-proxy-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5[235624]: 10.100.0.23:54630 [22/Jan/2026:17:36:40.424] listener listener/metadata 0/0/0/25/25 200 152 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.453 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.454 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.23
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.474 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.475 104990 INFO eventlet.wsgi.server [-] 10.100.0.23,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0212870
Jan 22 17:36:40 compute-0 haproxy-metadata-proxy-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5[235624]: 10.100.0.23:54646 [22/Jan/2026:17:36:40.453] listener listener/metadata 0/0/0/22/22 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.480 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.481 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.23
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.500 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.500 104990 INFO eventlet.wsgi.server [-] 10.100.0.23,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0198486
Jan 22 17:36:40 compute-0 haproxy-metadata-proxy-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5[235624]: 10.100.0.23:54648 [22/Jan/2026:17:36:40.479] listener listener/metadata 0/0/0/21/21 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.505 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.505 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.23
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.519 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.519 104990 INFO eventlet.wsgi.server [-] 10.100.0.23,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0139854
Jan 22 17:36:40 compute-0 haproxy-metadata-proxy-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5[235624]: 10.100.0.23:54656 [22/Jan/2026:17:36:40.504] listener listener/metadata 0/0/0/14/14 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.523 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.524 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.23
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.538 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.539 104990 INFO eventlet.wsgi.server [-] 10.100.0.23,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0150449
Jan 22 17:36:40 compute-0 haproxy-metadata-proxy-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5[235624]: 10.100.0.23:54660 [22/Jan/2026:17:36:40.523] listener listener/metadata 0/0/0/15/15 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.544 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.545 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.23
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.559 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.559 104990 INFO eventlet.wsgi.server [-] 10.100.0.23,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0147398
Jan 22 17:36:40 compute-0 haproxy-metadata-proxy-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5[235624]: 10.100.0.23:54662 [22/Jan/2026:17:36:40.543] listener listener/metadata 0/0/0/15/15 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.564 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.565 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.23
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.581 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.581 104990 INFO eventlet.wsgi.server [-] 10.100.0.23,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0160599
Jan 22 17:36:40 compute-0 haproxy-metadata-proxy-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5[235624]: 10.100.0.23:54666 [22/Jan/2026:17:36:40.564] listener listener/metadata 0/0/0/17/17 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.587 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.587 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.23
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.605 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.605 104990 INFO eventlet.wsgi.server [-] 10.100.0.23,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 165 time: 0.0182564
Jan 22 17:36:40 compute-0 haproxy-metadata-proxy-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5[235624]: 10.100.0.23:54674 [22/Jan/2026:17:36:40.586] listener listener/metadata 0/0/0/19/19 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.610 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.610 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.23
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.626 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.626 104990 INFO eventlet.wsgi.server [-] 10.100.0.23,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 165 time: 0.0158782
Jan 22 17:36:40 compute-0 haproxy-metadata-proxy-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5[235624]: 10.100.0.23:54680 [22/Jan/2026:17:36:40.609] listener listener/metadata 0/0/0/16/16 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.630 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.631 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.23
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:36:40 compute-0 haproxy-metadata-proxy-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5[235624]: 10.100.0.23:54690 [22/Jan/2026:17:36:40.630] listener listener/metadata 0/0/0/17/17 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.647 104990 INFO eventlet.wsgi.server [-] 10.100.0.23,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0164382
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.655 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.656 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.23
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.673 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:36:40 compute-0 haproxy-metadata-proxy-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5[235624]: 10.100.0.23:54698 [22/Jan/2026:17:36:40.655] listener listener/metadata 0/0/0/18/18 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.673 104990 INFO eventlet.wsgi.server [-] 10.100.0.23,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0174470
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.676 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.677 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.23
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.691 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.692 104990 INFO eventlet.wsgi.server [-] 10.100.0.23,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0152216
Jan 22 17:36:40 compute-0 haproxy-metadata-proxy-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5[235624]: 10.100.0.23:54706 [22/Jan/2026:17:36:40.676] listener listener/metadata 0/0/0/16/16 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.695 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.696 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.23
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.713 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.714 104990 INFO eventlet.wsgi.server [-] 10.100.0.23,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0183401
Jan 22 17:36:40 compute-0 haproxy-metadata-proxy-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5[235624]: 10.100.0.23:54720 [22/Jan/2026:17:36:40.695] listener listener/metadata 0/0/0/19/19 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.718 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.718 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.23
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.732 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.732 104990 INFO eventlet.wsgi.server [-] 10.100.0.23,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 165 time: 0.0138037
Jan 22 17:36:40 compute-0 haproxy-metadata-proxy-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5[235624]: 10.100.0.23:54724 [22/Jan/2026:17:36:40.717] listener listener/metadata 0/0/0/14/14 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.736 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.737 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.23
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.754 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:36:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:40.755 104990 INFO eventlet.wsgi.server [-] 10.100.0.23,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0182915
Jan 22 17:36:40 compute-0 haproxy-metadata-proxy-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5[235624]: 10.100.0.23:54730 [22/Jan/2026:17:36:40.736] listener listener/metadata 0/0/0/19/19 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:36:41 compute-0 nova_compute[183075]: 2026-01-22 17:36:41.375 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Updating instance_info_cache with network_info: [{"id": "524266a0-8b9b-42d0-9f33-913b53293292", "address": "fa:16:3e:7a:64:5f", "network": {"id": "f8ae4b18-347c-4ff3-b9f8-578518ecd408", "bridge": "br-int", "label": "tempest-TrunkTest-1480081563", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e92551896d2c49b5b149b1a5a0cc1761", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap524266a0-8b", "ovs_interfaceid": "524266a0-8b9b-42d0-9f33-913b53293292", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:36:41 compute-0 nova_compute[183075]: 2026-01-22 17:36:41.417 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-585cde24-8038-40b2-97ce-5d30e6ecfc03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:36:41 compute-0 nova_compute[183075]: 2026-01-22 17:36:41.418 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:36:41 compute-0 nova_compute[183075]: 2026-01-22 17:36:41.418 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:36:41 compute-0 nova_compute[183075]: 2026-01-22 17:36:41.419 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:36:41 compute-0 nova_compute[183075]: 2026-01-22 17:36:41.488 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:36:41 compute-0 nova_compute[183075]: 2026-01-22 17:36:41.489 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:36:41 compute-0 nova_compute[183075]: 2026-01-22 17:36:41.489 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:36:41 compute-0 nova_compute[183075]: 2026-01-22 17:36:41.489 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:36:41 compute-0 nova_compute[183075]: 2026-01-22 17:36:41.546 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:41 compute-0 nova_compute[183075]: 2026-01-22 17:36:41.657 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3fd8923a-65e6-42d0-a866-17500a9df5cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:36:41 compute-0 nova_compute[183075]: 2026-01-22 17:36:41.731 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3fd8923a-65e6-42d0-a866-17500a9df5cf/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:36:41 compute-0 nova_compute[183075]: 2026-01-22 17:36:41.732 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3fd8923a-65e6-42d0-a866-17500a9df5cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:36:41 compute-0 nova_compute[183075]: 2026-01-22 17:36:41.791 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3fd8923a-65e6-42d0-a866-17500a9df5cf/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:36:41 compute-0 nova_compute[183075]: 2026-01-22 17:36:41.798 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/585cde24-8038-40b2-97ce-5d30e6ecfc03/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:36:41 compute-0 nova_compute[183075]: 2026-01-22 17:36:41.860 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/585cde24-8038-40b2-97ce-5d30e6ecfc03/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:36:41 compute-0 nova_compute[183075]: 2026-01-22 17:36:41.861 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/585cde24-8038-40b2-97ce-5d30e6ecfc03/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:36:41 compute-0 nova_compute[183075]: 2026-01-22 17:36:41.916 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/585cde24-8038-40b2-97ce-5d30e6ecfc03/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:36:41 compute-0 nova_compute[183075]: 2026-01-22 17:36:41.922 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b7b02307-4481-472f-8bdc-707a9d19f350/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:36:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:41.953 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:36:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:41.953 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:36:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:41.954 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:36:41 compute-0 nova_compute[183075]: 2026-01-22 17:36:41.979 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b7b02307-4481-472f-8bdc-707a9d19f350/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:36:41 compute-0 nova_compute[183075]: 2026-01-22 17:36:41.980 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b7b02307-4481-472f-8bdc-707a9d19f350/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:36:42 compute-0 nova_compute[183075]: 2026-01-22 17:36:42.038 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b7b02307-4481-472f-8bdc-707a9d19f350/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:36:42 compute-0 nova_compute[183075]: 2026-01-22 17:36:42.191 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:36:42 compute-0 nova_compute[183075]: 2026-01-22 17:36:42.192 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5185MB free_disk=73.30297470092773GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:36:42 compute-0 nova_compute[183075]: 2026-01-22 17:36:42.193 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:36:42 compute-0 nova_compute[183075]: 2026-01-22 17:36:42.193 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:36:42 compute-0 nova_compute[183075]: 2026-01-22 17:36:42.279 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 585cde24-8038-40b2-97ce-5d30e6ecfc03 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:36:42 compute-0 nova_compute[183075]: 2026-01-22 17:36:42.280 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 3fd8923a-65e6-42d0-a866-17500a9df5cf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:36:42 compute-0 nova_compute[183075]: 2026-01-22 17:36:42.280 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance b7b02307-4481-472f-8bdc-707a9d19f350 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:36:42 compute-0 nova_compute[183075]: 2026-01-22 17:36:42.280 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:36:42 compute-0 nova_compute[183075]: 2026-01-22 17:36:42.281 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:36:42 compute-0 nova_compute[183075]: 2026-01-22 17:36:42.343 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:36:42 compute-0 nova_compute[183075]: 2026-01-22 17:36:42.359 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:36:42 compute-0 nova_compute[183075]: 2026-01-22 17:36:42.405 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:36:42 compute-0 nova_compute[183075]: 2026-01-22 17:36:42.407 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:36:43 compute-0 nova_compute[183075]: 2026-01-22 17:36:43.068 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:43 compute-0 nova_compute[183075]: 2026-01-22 17:36:43.911 183079 INFO nova.compute.manager [None req-e8c7af66-f9f9-4b1b-b669-9c4067b92ee4 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Get console output
Jan 22 17:36:43 compute-0 nova_compute[183075]: 2026-01-22 17:36:43.916 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:36:44 compute-0 nova_compute[183075]: 2026-01-22 17:36:44.466 183079 INFO nova.compute.manager [None req-b060a091-508f-4864-b2b0-6a72ae2e212d bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Get console output
Jan 22 17:36:44 compute-0 nova_compute[183075]: 2026-01-22 17:36:44.470 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:36:45 compute-0 podman[235840]: 2026-01-22 17:36:45.342018592 +0000 UTC m=+0.052528871 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:36:46 compute-0 nova_compute[183075]: 2026-01-22 17:36:46.548 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:48 compute-0 nova_compute[183075]: 2026-01-22 17:36:48.081 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:49 compute-0 nova_compute[183075]: 2026-01-22 17:36:49.024 183079 INFO nova.compute.manager [None req-e41cf285-048b-4a00-81eb-c127ea25f239 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Get console output
Jan 22 17:36:49 compute-0 nova_compute[183075]: 2026-01-22 17:36:49.028 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:36:50 compute-0 nova_compute[183075]: 2026-01-22 17:36:50.158 183079 INFO nova.compute.manager [None req-5eb07d5d-6eeb-4a48-972e-ab09a7d6cc0b bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Get console output
Jan 22 17:36:50 compute-0 nova_compute[183075]: 2026-01-22 17:36:50.165 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:36:50 compute-0 ovn_controller[95372]: 2026-01-22T17:36:50Z|00122|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ea:41:ca 10.100.0.7
Jan 22 17:36:50 compute-0 ovn_controller[95372]: 2026-01-22T17:36:50Z|00123|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ea:41:ca 10.100.0.7
Jan 22 17:36:51 compute-0 nova_compute[183075]: 2026-01-22 17:36:51.549 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:53 compute-0 nova_compute[183075]: 2026-01-22 17:36:53.083 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:54 compute-0 nova_compute[183075]: 2026-01-22 17:36:54.643 183079 INFO nova.compute.manager [None req-956cf635-4c7f-4eb4-bdbd-05a41218371c 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Get console output
Jan 22 17:36:54 compute-0 nova_compute[183075]: 2026-01-22 17:36:54.646 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:36:55 compute-0 nova_compute[183075]: 2026-01-22 17:36:55.369 183079 INFO nova.compute.manager [None req-772687cc-4ed6-474c-8358-1d0e486c6365 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Get console output
Jan 22 17:36:55 compute-0 nova_compute[183075]: 2026-01-22 17:36:55.375 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:36:56 compute-0 podman[235878]: 2026-01-22 17:36:56.345556994 +0000 UTC m=+0.051805761 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 17:36:56 compute-0 nova_compute[183075]: 2026-01-22 17:36:56.553 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:57.214 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:36:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:57.215 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:36:57 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:36:57 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:36:57 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:36:57 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:36:57 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:36:57 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:36:57 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: f8ae4b18-347c-4ff3-b9f8-578518ecd408 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:36:58 compute-0 nova_compute[183075]: 2026-01-22 17:36:58.085 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.374 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.374 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 1.1590252
Jan 22 17:36:58 compute-0 haproxy-metadata-proxy-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235125]: 10.100.0.7:59896 [22/Jan/2026:17:36:57.213] listener listener/metadata 0/0/0/1160/1160 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.384 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.385 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: f8ae4b18-347c-4ff3-b9f8-578518ecd408 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.408 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:36:58 compute-0 haproxy-metadata-proxy-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235125]: 10.100.0.7:59912 [22/Jan/2026:17:36:58.384] listener listener/metadata 0/0/0/24/24 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.409 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 166 time: 0.0238948
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.413 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.414 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: f8ae4b18-347c-4ff3-b9f8-578518ecd408 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.434 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:36:58 compute-0 haproxy-metadata-proxy-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235125]: 10.100.0.7:59916 [22/Jan/2026:17:36:58.413] listener listener/metadata 0/0/0/22/22 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.435 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0209904
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.441 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.442 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: f8ae4b18-347c-4ff3-b9f8-578518ecd408 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.468 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.468 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0263040
Jan 22 17:36:58 compute-0 haproxy-metadata-proxy-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235125]: 10.100.0.7:59930 [22/Jan/2026:17:36:58.440] listener listener/metadata 0/0/0/27/27 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.473 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.474 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: f8ae4b18-347c-4ff3-b9f8-578518ecd408 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.496 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:36:58 compute-0 haproxy-metadata-proxy-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235125]: 10.100.0.7:59944 [22/Jan/2026:17:36:58.473] listener listener/metadata 0/0/0/24/24 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.497 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0235741
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.501 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.502 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: f8ae4b18-347c-4ff3-b9f8-578518ecd408 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.536 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.536 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0346422
Jan 22 17:36:58 compute-0 haproxy-metadata-proxy-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235125]: 10.100.0.7:59960 [22/Jan/2026:17:36:58.501] listener listener/metadata 0/0/0/35/35 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.541 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.542 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: f8ae4b18-347c-4ff3-b9f8-578518ecd408 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.562 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:36:58 compute-0 haproxy-metadata-proxy-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235125]: 10.100.0.7:59972 [22/Jan/2026:17:36:58.540] listener listener/metadata 0/0/0/22/22 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.563 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0210185
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.568 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.568 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: f8ae4b18-347c-4ff3-b9f8-578518ecd408 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.587 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.588 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 151 time: 0.0194056
Jan 22 17:36:58 compute-0 haproxy-metadata-proxy-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235125]: 10.100.0.7:59978 [22/Jan/2026:17:36:58.567] listener listener/metadata 0/0/0/20/20 200 135 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.594 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.594 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: f8ae4b18-347c-4ff3-b9f8-578518ecd408 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.614 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.615 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 165 time: 0.0202208
Jan 22 17:36:58 compute-0 haproxy-metadata-proxy-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235125]: 10.100.0.7:59984 [22/Jan/2026:17:36:58.593] listener listener/metadata 0/0/0/21/21 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.619 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.619 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: f8ae4b18-347c-4ff3-b9f8-578518ecd408 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.638 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.638 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 165 time: 0.0192280
Jan 22 17:36:58 compute-0 haproxy-metadata-proxy-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235125]: 10.100.0.7:59996 [22/Jan/2026:17:36:58.618] listener listener/metadata 0/0/0/20/20 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.646 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.647 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: f8ae4b18-347c-4ff3-b9f8-578518ecd408 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:36:58 compute-0 haproxy-metadata-proxy-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235125]: 10.100.0.7:60012 [22/Jan/2026:17:36:58.645] listener listener/metadata 0/0/0/17/17 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.663 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0164366
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.680 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.682 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: f8ae4b18-347c-4ff3-b9f8-578518ecd408 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.709 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.710 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0281126
Jan 22 17:36:58 compute-0 haproxy-metadata-proxy-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235125]: 10.100.0.7:60026 [22/Jan/2026:17:36:58.680] listener listener/metadata 0/0/0/29/29 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.714 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.715 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: f8ae4b18-347c-4ff3-b9f8-578518ecd408 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.732 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.732 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0178277
Jan 22 17:36:58 compute-0 haproxy-metadata-proxy-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235125]: 10.100.0.7:60028 [22/Jan/2026:17:36:58.714] listener listener/metadata 0/0/0/18/18 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.736 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.737 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: f8ae4b18-347c-4ff3-b9f8-578518ecd408 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.754 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.754 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0172741
Jan 22 17:36:58 compute-0 haproxy-metadata-proxy-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235125]: 10.100.0.7:60030 [22/Jan/2026:17:36:58.736] listener listener/metadata 0/0/0/18/18 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.760 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.761 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: f8ae4b18-347c-4ff3-b9f8-578518ecd408 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.777 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.778 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 165 time: 0.0172446
Jan 22 17:36:58 compute-0 haproxy-metadata-proxy-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235125]: 10.100.0.7:60042 [22/Jan/2026:17:36:58.759] listener listener/metadata 0/0/0/18/18 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.782 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.783 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: f8ae4b18-347c-4ff3-b9f8-578518ecd408 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.800 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:36:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:36:58.801 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0176544
Jan 22 17:36:58 compute-0 haproxy-metadata-proxy-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235125]: 10.100.0.7:60044 [22/Jan/2026:17:36:58.782] listener listener/metadata 0/0/0/18/18 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:36:59 compute-0 nova_compute[183075]: 2026-01-22 17:36:59.801 183079 INFO nova.compute.manager [None req-e2dd65f1-3e0a-43ca-9c9a-2b8d17f8ea1c 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Get console output
Jan 22 17:36:59 compute-0 nova_compute[183075]: 2026-01-22 17:36:59.806 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:37:00 compute-0 podman[235904]: 2026-01-22 17:37:00.373881647 +0000 UTC m=+0.071162358 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:37:00 compute-0 nova_compute[183075]: 2026-01-22 17:37:00.773 183079 INFO nova.compute.manager [None req-1deceb71-3492-4f8a-869e-510781922896 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Get console output
Jan 22 17:37:00 compute-0 nova_compute[183075]: 2026-01-22 17:37:00.777 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:37:01 compute-0 nova_compute[183075]: 2026-01-22 17:37:01.555 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:03 compute-0 nova_compute[183075]: 2026-01-22 17:37:03.088 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:05 compute-0 nova_compute[183075]: 2026-01-22 17:37:05.038 183079 INFO nova.compute.manager [None req-cf9d9cf3-3d05-4fcf-b18c-f40a90a70dec 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Get console output
Jan 22 17:37:05 compute-0 nova_compute[183075]: 2026-01-22 17:37:05.042 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:37:05 compute-0 nova_compute[183075]: 2026-01-22 17:37:05.886 183079 INFO nova.compute.manager [None req-c371ab36-e237-49e6-9ce7-3964bb09001e bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Get console output
Jan 22 17:37:05 compute-0 nova_compute[183075]: 2026-01-22 17:37:05.890 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:37:06 compute-0 nova_compute[183075]: 2026-01-22 17:37:06.557 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:06 compute-0 ovn_controller[95372]: 2026-01-22T17:37:06Z|00650|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Jan 22 17:37:08 compute-0 nova_compute[183075]: 2026-01-22 17:37:08.090 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:10 compute-0 nova_compute[183075]: 2026-01-22 17:37:10.697 183079 INFO nova.compute.manager [None req-6cb87a40-b99a-4f16-89db-54392bb9d7ad 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Get console output
Jan 22 17:37:10 compute-0 nova_compute[183075]: 2026-01-22 17:37:10.702 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:37:11 compute-0 nova_compute[183075]: 2026-01-22 17:37:11.048 183079 INFO nova.compute.manager [None req-f72d678d-6734-4291-96b4-dc34f599d6fe bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Get console output
Jan 22 17:37:11 compute-0 nova_compute[183075]: 2026-01-22 17:37:11.054 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:37:11 compute-0 podman[235929]: 2026-01-22 17:37:11.352660848 +0000 UTC m=+0.055439510 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 22 17:37:11 compute-0 podman[235930]: 2026-01-22 17:37:11.354380265 +0000 UTC m=+0.054197957 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, distribution-scope=public, managed_by=edpm_ansible, release=1755695350)
Jan 22 17:37:11 compute-0 podman[235928]: 2026-01-22 17:37:11.379431937 +0000 UTC m=+0.083992758 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 17:37:11 compute-0 nova_compute[183075]: 2026-01-22 17:37:11.559 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:13 compute-0 nova_compute[183075]: 2026-01-22 17:37:13.093 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:15 compute-0 nova_compute[183075]: 2026-01-22 17:37:15.886 183079 INFO nova.compute.manager [None req-23ee1d60-cae4-464a-94ab-063cb6491084 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Get console output
Jan 22 17:37:15 compute-0 nova_compute[183075]: 2026-01-22 17:37:15.890 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:37:16 compute-0 nova_compute[183075]: 2026-01-22 17:37:16.166 183079 INFO nova.compute.manager [None req-153209f5-dca1-45e8-8893-be276705a592 bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Get console output
Jan 22 17:37:16 compute-0 nova_compute[183075]: 2026-01-22 17:37:16.170 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:37:16 compute-0 podman[235991]: 2026-01-22 17:37:16.335334128 +0000 UTC m=+0.051426591 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:37:16 compute-0 nova_compute[183075]: 2026-01-22 17:37:16.561 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:18 compute-0 nova_compute[183075]: 2026-01-22 17:37:18.094 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:21 compute-0 nova_compute[183075]: 2026-01-22 17:37:21.005 183079 INFO nova.compute.manager [None req-292284de-dfb7-487c-8148-7e3ae2061a37 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Get console output
Jan 22 17:37:21 compute-0 nova_compute[183075]: 2026-01-22 17:37:21.009 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:37:21 compute-0 nova_compute[183075]: 2026-01-22 17:37:21.563 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:21 compute-0 nova_compute[183075]: 2026-01-22 17:37:21.660 183079 DEBUG nova.compute.manager [req-f495d460-8545-400c-9698-b574bc5b4c94 req-fbdf1923-f992-401e-b72e-f3fb18bc4017 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Received event network-changed-42eadaf5-c2f9-4e11-a3dc-3e5003423156 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:37:21 compute-0 nova_compute[183075]: 2026-01-22 17:37:21.661 183079 DEBUG nova.compute.manager [req-f495d460-8545-400c-9698-b574bc5b4c94 req-fbdf1923-f992-401e-b72e-f3fb18bc4017 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Refreshing instance network info cache due to event network-changed-42eadaf5-c2f9-4e11-a3dc-3e5003423156. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:37:21 compute-0 nova_compute[183075]: 2026-01-22 17:37:21.661 183079 DEBUG oslo_concurrency.lockutils [req-f495d460-8545-400c-9698-b574bc5b4c94 req-fbdf1923-f992-401e-b72e-f3fb18bc4017 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-3fd8923a-65e6-42d0-a866-17500a9df5cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:37:21 compute-0 nova_compute[183075]: 2026-01-22 17:37:21.661 183079 DEBUG oslo_concurrency.lockutils [req-f495d460-8545-400c-9698-b574bc5b4c94 req-fbdf1923-f992-401e-b72e-f3fb18bc4017 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-3fd8923a-65e6-42d0-a866-17500a9df5cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:37:21 compute-0 nova_compute[183075]: 2026-01-22 17:37:21.662 183079 DEBUG nova.network.neutron [req-f495d460-8545-400c-9698-b574bc5b4c94 req-fbdf1923-f992-401e-b72e-f3fb18bc4017 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Refreshing network info cache for port 42eadaf5-c2f9-4e11-a3dc-3e5003423156 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:37:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:21.672 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=43, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=42) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:37:21 compute-0 nova_compute[183075]: 2026-01-22 17:37:21.673 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:21.674 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:37:22 compute-0 nova_compute[183075]: 2026-01-22 17:37:22.901 183079 DEBUG nova.network.neutron [req-f495d460-8545-400c-9698-b574bc5b4c94 req-fbdf1923-f992-401e-b72e-f3fb18bc4017 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Updated VIF entry in instance network info cache for port 42eadaf5-c2f9-4e11-a3dc-3e5003423156. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:37:22 compute-0 nova_compute[183075]: 2026-01-22 17:37:22.902 183079 DEBUG nova.network.neutron [req-f495d460-8545-400c-9698-b574bc5b4c94 req-fbdf1923-f992-401e-b72e-f3fb18bc4017 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Updating instance_info_cache with network_info: [{"id": "42eadaf5-c2f9-4e11-a3dc-3e5003423156", "address": "fa:16:3e:23:26:1f", "network": {"id": "3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5", "bridge": "br-int", "label": "tempest-test-network--2142514082", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8cfd5f99a92142bd829974004d0e603e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap42eadaf5-c2", "ovs_interfaceid": "42eadaf5-c2f9-4e11-a3dc-3e5003423156", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:37:22 compute-0 nova_compute[183075]: 2026-01-22 17:37:22.927 183079 DEBUG oslo_concurrency.lockutils [req-f495d460-8545-400c-9698-b574bc5b4c94 req-fbdf1923-f992-401e-b72e-f3fb18bc4017 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-3fd8923a-65e6-42d0-a866-17500a9df5cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:37:23 compute-0 nova_compute[183075]: 2026-01-22 17:37:23.096 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:23 compute-0 nova_compute[183075]: 2026-01-22 17:37:23.736 183079 DEBUG oslo_concurrency.lockutils [None req-c2a280ed-76cc-438e-a614-f3c063cb35cf bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Acquiring lock "3fd8923a-65e6-42d0-a866-17500a9df5cf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:37:23 compute-0 nova_compute[183075]: 2026-01-22 17:37:23.737 183079 DEBUG oslo_concurrency.lockutils [None req-c2a280ed-76cc-438e-a614-f3c063cb35cf bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "3fd8923a-65e6-42d0-a866-17500a9df5cf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:37:23 compute-0 nova_compute[183075]: 2026-01-22 17:37:23.737 183079 DEBUG oslo_concurrency.lockutils [None req-c2a280ed-76cc-438e-a614-f3c063cb35cf bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Acquiring lock "3fd8923a-65e6-42d0-a866-17500a9df5cf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:37:23 compute-0 nova_compute[183075]: 2026-01-22 17:37:23.737 183079 DEBUG oslo_concurrency.lockutils [None req-c2a280ed-76cc-438e-a614-f3c063cb35cf bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "3fd8923a-65e6-42d0-a866-17500a9df5cf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:37:23 compute-0 nova_compute[183075]: 2026-01-22 17:37:23.737 183079 DEBUG oslo_concurrency.lockutils [None req-c2a280ed-76cc-438e-a614-f3c063cb35cf bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "3fd8923a-65e6-42d0-a866-17500a9df5cf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:37:23 compute-0 nova_compute[183075]: 2026-01-22 17:37:23.739 183079 INFO nova.compute.manager [None req-c2a280ed-76cc-438e-a614-f3c063cb35cf bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Terminating instance
Jan 22 17:37:23 compute-0 nova_compute[183075]: 2026-01-22 17:37:23.740 183079 DEBUG nova.compute.manager [None req-c2a280ed-76cc-438e-a614-f3c063cb35cf bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:37:23 compute-0 nova_compute[183075]: 2026-01-22 17:37:23.742 183079 DEBUG nova.compute.manager [req-10f3c5de-3a64-4857-8608-18d6041a7f85 req-b83bcd84-e7b2-48e3-a8db-7c6d2ba8356f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Received event network-changed-42eadaf5-c2f9-4e11-a3dc-3e5003423156 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:37:23 compute-0 nova_compute[183075]: 2026-01-22 17:37:23.742 183079 DEBUG nova.compute.manager [req-10f3c5de-3a64-4857-8608-18d6041a7f85 req-b83bcd84-e7b2-48e3-a8db-7c6d2ba8356f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Refreshing instance network info cache due to event network-changed-42eadaf5-c2f9-4e11-a3dc-3e5003423156. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:37:23 compute-0 nova_compute[183075]: 2026-01-22 17:37:23.743 183079 DEBUG oslo_concurrency.lockutils [req-10f3c5de-3a64-4857-8608-18d6041a7f85 req-b83bcd84-e7b2-48e3-a8db-7c6d2ba8356f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-3fd8923a-65e6-42d0-a866-17500a9df5cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:37:23 compute-0 nova_compute[183075]: 2026-01-22 17:37:23.743 183079 DEBUG oslo_concurrency.lockutils [req-10f3c5de-3a64-4857-8608-18d6041a7f85 req-b83bcd84-e7b2-48e3-a8db-7c6d2ba8356f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-3fd8923a-65e6-42d0-a866-17500a9df5cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:37:23 compute-0 nova_compute[183075]: 2026-01-22 17:37:23.743 183079 DEBUG nova.network.neutron [req-10f3c5de-3a64-4857-8608-18d6041a7f85 req-b83bcd84-e7b2-48e3-a8db-7c6d2ba8356f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Refreshing network info cache for port 42eadaf5-c2f9-4e11-a3dc-3e5003423156 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:37:23 compute-0 kernel: tap42eadaf5-c2 (unregistering): left promiscuous mode
Jan 22 17:37:23 compute-0 NetworkManager[55454]: <info>  [1769103443.7809] device (tap42eadaf5-c2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:37:23 compute-0 ovn_controller[95372]: 2026-01-22T17:37:23Z|00651|binding|INFO|Releasing lport 42eadaf5-c2f9-4e11-a3dc-3e5003423156 from this chassis (sb_readonly=0)
Jan 22 17:37:23 compute-0 ovn_controller[95372]: 2026-01-22T17:37:23Z|00652|binding|INFO|Setting lport 42eadaf5-c2f9-4e11-a3dc-3e5003423156 down in Southbound
Jan 22 17:37:23 compute-0 nova_compute[183075]: 2026-01-22 17:37:23.791 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:23 compute-0 ovn_controller[95372]: 2026-01-22T17:37:23Z|00653|binding|INFO|Removing iface tap42eadaf5-c2 ovn-installed in OVS
Jan 22 17:37:23 compute-0 nova_compute[183075]: 2026-01-22 17:37:23.794 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:23.798 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:23:26:1f 10.100.0.23'], port_security=['fa:16:3e:23:26:1f 10.100.0.23'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.23/28', 'neutron:device_id': '3fd8923a-65e6-42d0-a866-17500a9df5cf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8cfd5f99a92142bd829974004d0e603e', 'neutron:revision_number': '6', 'neutron:security_group_ids': '759464aa-5d78-4dff-baf2-2a2f16ceb397', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.190'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fa2e84d0-9512-4c88-842e-9b3499a075d2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=42eadaf5-c2f9-4e11-a3dc-3e5003423156) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:37:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:23.800 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 42eadaf5-c2f9-4e11-a3dc-3e5003423156 in datapath 3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5 unbound from our chassis
Jan 22 17:37:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:23.801 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:37:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:23.802 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0d7c0e5c-9473-4dbd-8a93-fd84fc7a6c47]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:23.803 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5 namespace which is not needed anymore
Jan 22 17:37:23 compute-0 nova_compute[183075]: 2026-01-22 17:37:23.806 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:23 compute-0 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d0000003b.scope: Deactivated successfully.
Jan 22 17:37:23 compute-0 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d0000003b.scope: Consumed 14.718s CPU time.
Jan 22 17:37:23 compute-0 systemd-machined[154382]: Machine qemu-59-instance-0000003b terminated.
Jan 22 17:37:23 compute-0 neutron-haproxy-ovnmeta-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5[235618]: [NOTICE]   (235622) : haproxy version is 2.8.14-c23fe91
Jan 22 17:37:23 compute-0 neutron-haproxy-ovnmeta-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5[235618]: [NOTICE]   (235622) : path to executable is /usr/sbin/haproxy
Jan 22 17:37:23 compute-0 neutron-haproxy-ovnmeta-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5[235618]: [WARNING]  (235622) : Exiting Master process...
Jan 22 17:37:23 compute-0 neutron-haproxy-ovnmeta-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5[235618]: [ALERT]    (235622) : Current worker (235624) exited with code 143 (Terminated)
Jan 22 17:37:23 compute-0 neutron-haproxy-ovnmeta-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5[235618]: [WARNING]  (235622) : All workers exited. Exiting... (0)
Jan 22 17:37:23 compute-0 systemd[1]: libpod-c0f33adcd3a72a72f41f78a3cfc10a941e36145809accf9b38e5eb9e2b9e2c12.scope: Deactivated successfully.
Jan 22 17:37:23 compute-0 podman[236034]: 2026-01-22 17:37:23.944797544 +0000 UTC m=+0.044996206 container died c0f33adcd3a72a72f41f78a3cfc10a941e36145809accf9b38e5eb9e2b9e2c12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 17:37:23 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c0f33adcd3a72a72f41f78a3cfc10a941e36145809accf9b38e5eb9e2b9e2c12-userdata-shm.mount: Deactivated successfully.
Jan 22 17:37:23 compute-0 systemd[1]: var-lib-containers-storage-overlay-52939867bdca6d7b9c547d1d9885def44cb2216b37d4fe83ecb7a629597da92e-merged.mount: Deactivated successfully.
Jan 22 17:37:23 compute-0 nova_compute[183075]: 2026-01-22 17:37:23.992 183079 INFO nova.virt.libvirt.driver [-] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Instance destroyed successfully.
Jan 22 17:37:23 compute-0 nova_compute[183075]: 2026-01-22 17:37:23.992 183079 DEBUG nova.objects.instance [None req-c2a280ed-76cc-438e-a614-f3c063cb35cf bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lazy-loading 'resources' on Instance uuid 3fd8923a-65e6-42d0-a866-17500a9df5cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:37:24 compute-0 nova_compute[183075]: 2026-01-22 17:37:24.102 183079 DEBUG nova.virt.libvirt.vif [None req-c2a280ed-76cc-438e-a614-f3c063cb35cf bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:36:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-842565147',display_name='tempest-server-test-842565147',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-842565147',id=59,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBkEoXBEO7Iq9oyrNfK3wd/+mUcVrdUpimNGP2S5PkNQ/qaeiJOV3d8bp+iClmxxrwIFhFFhsVSzK7EyUN3IdfHCQdVlCqDyzkANoAeuj9f9N66mRfaNpYK0SG4yFRFqGg==',key_name='tempest-keypair-test-192960344',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:36:24Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8cfd5f99a92142bd829974004d0e603e',ramdisk_id='',reservation_id='r-t2nywuua',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_in
put_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-PortSecurityTest-1796635121',owner_user_name='tempest-PortSecurityTest-1796635121-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:36:24Z,user_data=None,user_id='bf68afec168c4aa2ba7e47fdb3b026af',uuid=3fd8923a-65e6-42d0-a866-17500a9df5cf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "42eadaf5-c2f9-4e11-a3dc-3e5003423156", "address": "fa:16:3e:23:26:1f", "network": {"id": "3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5", "bridge": "br-int", "label": "tempest-test-network--2142514082", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8cfd5f99a92142bd829974004d0e603e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap42eadaf5-c2", "ovs_interfaceid": "42eadaf5-c2f9-4e11-a3dc-3e5003423156", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:37:24 compute-0 nova_compute[183075]: 2026-01-22 17:37:24.102 183079 DEBUG nova.network.os_vif_util [None req-c2a280ed-76cc-438e-a614-f3c063cb35cf bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Converting VIF {"id": "42eadaf5-c2f9-4e11-a3dc-3e5003423156", "address": "fa:16:3e:23:26:1f", "network": {"id": "3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5", "bridge": "br-int", "label": "tempest-test-network--2142514082", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8cfd5f99a92142bd829974004d0e603e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap42eadaf5-c2", "ovs_interfaceid": "42eadaf5-c2f9-4e11-a3dc-3e5003423156", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:37:24 compute-0 nova_compute[183075]: 2026-01-22 17:37:24.103 183079 DEBUG nova.network.os_vif_util [None req-c2a280ed-76cc-438e-a614-f3c063cb35cf bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:23:26:1f,bridge_name='br-int',has_traffic_filtering=True,id=42eadaf5-c2f9-4e11-a3dc-3e5003423156,network=Network(3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap42eadaf5-c2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:37:24 compute-0 nova_compute[183075]: 2026-01-22 17:37:24.103 183079 DEBUG os_vif [None req-c2a280ed-76cc-438e-a614-f3c063cb35cf bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:23:26:1f,bridge_name='br-int',has_traffic_filtering=True,id=42eadaf5-c2f9-4e11-a3dc-3e5003423156,network=Network(3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap42eadaf5-c2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:37:24 compute-0 nova_compute[183075]: 2026-01-22 17:37:24.105 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:24 compute-0 nova_compute[183075]: 2026-01-22 17:37:24.105 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap42eadaf5-c2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:37:24 compute-0 nova_compute[183075]: 2026-01-22 17:37:24.107 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:24 compute-0 nova_compute[183075]: 2026-01-22 17:37:24.108 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:24 compute-0 nova_compute[183075]: 2026-01-22 17:37:24.111 183079 INFO os_vif [None req-c2a280ed-76cc-438e-a614-f3c063cb35cf bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:23:26:1f,bridge_name='br-int',has_traffic_filtering=True,id=42eadaf5-c2f9-4e11-a3dc-3e5003423156,network=Network(3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap42eadaf5-c2')
Jan 22 17:37:24 compute-0 nova_compute[183075]: 2026-01-22 17:37:24.111 183079 INFO nova.virt.libvirt.driver [None req-c2a280ed-76cc-438e-a614-f3c063cb35cf bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Deleting instance files /var/lib/nova/instances/3fd8923a-65e6-42d0-a866-17500a9df5cf_del
Jan 22 17:37:24 compute-0 nova_compute[183075]: 2026-01-22 17:37:24.112 183079 INFO nova.virt.libvirt.driver [None req-c2a280ed-76cc-438e-a614-f3c063cb35cf bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Deletion of /var/lib/nova/instances/3fd8923a-65e6-42d0-a866-17500a9df5cf_del complete
Jan 22 17:37:24 compute-0 podman[236034]: 2026-01-22 17:37:24.20286727 +0000 UTC m=+0.303065912 container cleanup c0f33adcd3a72a72f41f78a3cfc10a941e36145809accf9b38e5eb9e2b9e2c12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 17:37:24 compute-0 systemd[1]: libpod-conmon-c0f33adcd3a72a72f41f78a3cfc10a941e36145809accf9b38e5eb9e2b9e2c12.scope: Deactivated successfully.
Jan 22 17:37:24 compute-0 podman[236076]: 2026-01-22 17:37:24.322795625 +0000 UTC m=+0.096146109 container remove c0f33adcd3a72a72f41f78a3cfc10a941e36145809accf9b38e5eb9e2b9e2c12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 17:37:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:24.333 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[19e98fa5-277c-4e4d-b0d0-faf2bb77819f]: (4, ('Thu Jan 22 05:37:23 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5 (c0f33adcd3a72a72f41f78a3cfc10a941e36145809accf9b38e5eb9e2b9e2c12)\nc0f33adcd3a72a72f41f78a3cfc10a941e36145809accf9b38e5eb9e2b9e2c12\nThu Jan 22 05:37:24 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5 (c0f33adcd3a72a72f41f78a3cfc10a941e36145809accf9b38e5eb9e2b9e2c12)\nc0f33adcd3a72a72f41f78a3cfc10a941e36145809accf9b38e5eb9e2b9e2c12\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:24 compute-0 nova_compute[183075]: 2026-01-22 17:37:24.334 183079 INFO nova.compute.manager [None req-c2a280ed-76cc-438e-a614-f3c063cb35cf bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Took 0.59 seconds to destroy the instance on the hypervisor.
Jan 22 17:37:24 compute-0 nova_compute[183075]: 2026-01-22 17:37:24.334 183079 DEBUG oslo.service.loopingcall [None req-c2a280ed-76cc-438e-a614-f3c063cb35cf bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:37:24 compute-0 nova_compute[183075]: 2026-01-22 17:37:24.334 183079 DEBUG nova.compute.manager [-] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:37:24 compute-0 nova_compute[183075]: 2026-01-22 17:37:24.335 183079 DEBUG nova.network.neutron [-] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:37:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:24.335 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[68755c0a-3afd-45f3-828f-f2a46de8e2aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:24.337 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3abe9ce6-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:37:24 compute-0 kernel: tap3abe9ce6-b0: left promiscuous mode
Jan 22 17:37:24 compute-0 nova_compute[183075]: 2026-01-22 17:37:24.341 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:24 compute-0 nova_compute[183075]: 2026-01-22 17:37:24.352 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:24.356 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b82a88bf-7953-4dc1-a1f3-17c87e85b08e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:24.374 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[352f9602-0c0b-40a8-b82c-7e094b165cd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:24.375 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[da0ad428-17af-499f-9e99-dfa647c9a1fb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:24.390 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8b2eac79-3042-498a-8a13-998ac59ca246]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574572, 'reachable_time': 21960, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236091, 'error': None, 'target': 'ovnmeta-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:24.393 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:37:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:24.393 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[af580533-ba46-46b5-8e1c-906687292d54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:24 compute-0 systemd[1]: run-netns-ovnmeta\x2d3abe9ce6\x2db7c1\x2d4db4\x2da2eb\x2d895c1eeda6f5.mount: Deactivated successfully.
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.180 183079 DEBUG nova.network.neutron [-] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.197 183079 INFO nova.compute.manager [-] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Took 0.86 seconds to deallocate network for instance.
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.260 183079 DEBUG oslo_concurrency.lockutils [None req-c2a280ed-76cc-438e-a614-f3c063cb35cf bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.260 183079 DEBUG oslo_concurrency.lockutils [None req-c2a280ed-76cc-438e-a614-f3c063cb35cf bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.351 183079 DEBUG nova.compute.provider_tree [None req-c2a280ed-76cc-438e-a614-f3c063cb35cf bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.389 183079 DEBUG nova.scheduler.client.report [None req-c2a280ed-76cc-438e-a614-f3c063cb35cf bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.414 183079 DEBUG oslo_concurrency.lockutils [None req-c2a280ed-76cc-438e-a614-f3c063cb35cf bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.432 183079 DEBUG nova.network.neutron [req-10f3c5de-3a64-4857-8608-18d6041a7f85 req-b83bcd84-e7b2-48e3-a8db-7c6d2ba8356f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Updated VIF entry in instance network info cache for port 42eadaf5-c2f9-4e11-a3dc-3e5003423156. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.434 183079 DEBUG nova.network.neutron [req-10f3c5de-3a64-4857-8608-18d6041a7f85 req-b83bcd84-e7b2-48e3-a8db-7c6d2ba8356f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Updating instance_info_cache with network_info: [{"id": "42eadaf5-c2f9-4e11-a3dc-3e5003423156", "address": "fa:16:3e:23:26:1f", "network": {"id": "3abe9ce6-b7c1-4db4-a2eb-895c1eeda6f5", "bridge": "br-int", "label": "tempest-test-network--2142514082", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8cfd5f99a92142bd829974004d0e603e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap42eadaf5-c2", "ovs_interfaceid": "42eadaf5-c2f9-4e11-a3dc-3e5003423156", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.441 183079 INFO nova.scheduler.client.report [None req-c2a280ed-76cc-438e-a614-f3c063cb35cf bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Deleted allocations for instance 3fd8923a-65e6-42d0-a866-17500a9df5cf
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.456 183079 DEBUG oslo_concurrency.lockutils [req-10f3c5de-3a64-4857-8608-18d6041a7f85 req-b83bcd84-e7b2-48e3-a8db-7c6d2ba8356f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-3fd8923a-65e6-42d0-a866-17500a9df5cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.457 183079 DEBUG nova.compute.manager [req-10f3c5de-3a64-4857-8608-18d6041a7f85 req-b83bcd84-e7b2-48e3-a8db-7c6d2ba8356f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Received event network-changed-42eadaf5-c2f9-4e11-a3dc-3e5003423156 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.457 183079 DEBUG nova.compute.manager [req-10f3c5de-3a64-4857-8608-18d6041a7f85 req-b83bcd84-e7b2-48e3-a8db-7c6d2ba8356f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Refreshing instance network info cache due to event network-changed-42eadaf5-c2f9-4e11-a3dc-3e5003423156. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.457 183079 DEBUG oslo_concurrency.lockutils [req-10f3c5de-3a64-4857-8608-18d6041a7f85 req-b83bcd84-e7b2-48e3-a8db-7c6d2ba8356f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-3fd8923a-65e6-42d0-a866-17500a9df5cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.457 183079 DEBUG oslo_concurrency.lockutils [req-10f3c5de-3a64-4857-8608-18d6041a7f85 req-b83bcd84-e7b2-48e3-a8db-7c6d2ba8356f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-3fd8923a-65e6-42d0-a866-17500a9df5cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.457 183079 DEBUG nova.network.neutron [req-10f3c5de-3a64-4857-8608-18d6041a7f85 req-b83bcd84-e7b2-48e3-a8db-7c6d2ba8356f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Refreshing network info cache for port 42eadaf5-c2f9-4e11-a3dc-3e5003423156 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.510 183079 DEBUG oslo_concurrency.lockutils [None req-c2a280ed-76cc-438e-a614-f3c063cb35cf bf68afec168c4aa2ba7e47fdb3b026af 8cfd5f99a92142bd829974004d0e603e - - default default] Lock "3fd8923a-65e6-42d0-a866-17500a9df5cf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.773s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.624 183079 INFO nova.network.neutron [req-10f3c5de-3a64-4857-8608-18d6041a7f85 req-b83bcd84-e7b2-48e3-a8db-7c6d2ba8356f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Port 42eadaf5-c2f9-4e11-a3dc-3e5003423156 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.624 183079 DEBUG nova.network.neutron [req-10f3c5de-3a64-4857-8608-18d6041a7f85 req-b83bcd84-e7b2-48e3-a8db-7c6d2ba8356f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.643 183079 DEBUG oslo_concurrency.lockutils [req-10f3c5de-3a64-4857-8608-18d6041a7f85 req-b83bcd84-e7b2-48e3-a8db-7c6d2ba8356f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-3fd8923a-65e6-42d0-a866-17500a9df5cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.907 183079 DEBUG nova.compute.manager [req-c9c062f6-62b2-4a81-af14-7d59c9e87cd5 req-47004400-ebf0-45fe-8cd0-f45bf25b0918 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Received event network-vif-unplugged-42eadaf5-c2f9-4e11-a3dc-3e5003423156 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.908 183079 DEBUG oslo_concurrency.lockutils [req-c9c062f6-62b2-4a81-af14-7d59c9e87cd5 req-47004400-ebf0-45fe-8cd0-f45bf25b0918 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "3fd8923a-65e6-42d0-a866-17500a9df5cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.908 183079 DEBUG oslo_concurrency.lockutils [req-c9c062f6-62b2-4a81-af14-7d59c9e87cd5 req-47004400-ebf0-45fe-8cd0-f45bf25b0918 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "3fd8923a-65e6-42d0-a866-17500a9df5cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.908 183079 DEBUG oslo_concurrency.lockutils [req-c9c062f6-62b2-4a81-af14-7d59c9e87cd5 req-47004400-ebf0-45fe-8cd0-f45bf25b0918 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "3fd8923a-65e6-42d0-a866-17500a9df5cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.908 183079 DEBUG nova.compute.manager [req-c9c062f6-62b2-4a81-af14-7d59c9e87cd5 req-47004400-ebf0-45fe-8cd0-f45bf25b0918 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] No waiting events found dispatching network-vif-unplugged-42eadaf5-c2f9-4e11-a3dc-3e5003423156 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.908 183079 WARNING nova.compute.manager [req-c9c062f6-62b2-4a81-af14-7d59c9e87cd5 req-47004400-ebf0-45fe-8cd0-f45bf25b0918 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Received unexpected event network-vif-unplugged-42eadaf5-c2f9-4e11-a3dc-3e5003423156 for instance with vm_state deleted and task_state None.
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.908 183079 DEBUG nova.compute.manager [req-c9c062f6-62b2-4a81-af14-7d59c9e87cd5 req-47004400-ebf0-45fe-8cd0-f45bf25b0918 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Received event network-vif-plugged-42eadaf5-c2f9-4e11-a3dc-3e5003423156 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.909 183079 DEBUG oslo_concurrency.lockutils [req-c9c062f6-62b2-4a81-af14-7d59c9e87cd5 req-47004400-ebf0-45fe-8cd0-f45bf25b0918 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "3fd8923a-65e6-42d0-a866-17500a9df5cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.909 183079 DEBUG oslo_concurrency.lockutils [req-c9c062f6-62b2-4a81-af14-7d59c9e87cd5 req-47004400-ebf0-45fe-8cd0-f45bf25b0918 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "3fd8923a-65e6-42d0-a866-17500a9df5cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.909 183079 DEBUG oslo_concurrency.lockutils [req-c9c062f6-62b2-4a81-af14-7d59c9e87cd5 req-47004400-ebf0-45fe-8cd0-f45bf25b0918 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "3fd8923a-65e6-42d0-a866-17500a9df5cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.909 183079 DEBUG nova.compute.manager [req-c9c062f6-62b2-4a81-af14-7d59c9e87cd5 req-47004400-ebf0-45fe-8cd0-f45bf25b0918 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] No waiting events found dispatching network-vif-plugged-42eadaf5-c2f9-4e11-a3dc-3e5003423156 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.909 183079 WARNING nova.compute.manager [req-c9c062f6-62b2-4a81-af14-7d59c9e87cd5 req-47004400-ebf0-45fe-8cd0-f45bf25b0918 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Received unexpected event network-vif-plugged-42eadaf5-c2f9-4e11-a3dc-3e5003423156 for instance with vm_state deleted and task_state None.
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.909 183079 DEBUG nova.compute.manager [req-c9c062f6-62b2-4a81-af14-7d59c9e87cd5 req-47004400-ebf0-45fe-8cd0-f45bf25b0918 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Received event network-vif-deleted-42eadaf5-c2f9-4e11-a3dc-3e5003423156 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.909 183079 INFO nova.compute.manager [req-c9c062f6-62b2-4a81-af14-7d59c9e87cd5 req-47004400-ebf0-45fe-8cd0-f45bf25b0918 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Neutron deleted interface 42eadaf5-c2f9-4e11-a3dc-3e5003423156; detaching it from the instance and deleting it from the info cache
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.910 183079 DEBUG nova.network.neutron [req-c9c062f6-62b2-4a81-af14-7d59c9e87cd5 req-47004400-ebf0-45fe-8cd0-f45bf25b0918 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Jan 22 17:37:25 compute-0 nova_compute[183075]: 2026-01-22 17:37:25.913 183079 DEBUG nova.compute.manager [req-c9c062f6-62b2-4a81-af14-7d59c9e87cd5 req-47004400-ebf0-45fe-8cd0-f45bf25b0918 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Detach interface failed, port_id=42eadaf5-c2f9-4e11-a3dc-3e5003423156, reason: Instance 3fd8923a-65e6-42d0-a866-17500a9df5cf could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 22 17:37:26 compute-0 nova_compute[183075]: 2026-01-22 17:37:26.167 183079 INFO nova.compute.manager [None req-3307676f-d752-4eed-8de0-117822f27a4e 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Get console output
Jan 22 17:37:26 compute-0 nova_compute[183075]: 2026-01-22 17:37:26.172 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:37:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:26.677 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '43'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:37:26 compute-0 nova_compute[183075]: 2026-01-22 17:37:26.776 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:37:27 compute-0 podman[236092]: 2026-01-22 17:37:27.359910191 +0000 UTC m=+0.064011303 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:37:27 compute-0 nova_compute[183075]: 2026-01-22 17:37:27.786 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:37:27 compute-0 nova_compute[183075]: 2026-01-22 17:37:27.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:37:28 compute-0 nova_compute[183075]: 2026-01-22 17:37:28.098 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:29 compute-0 nova_compute[183075]: 2026-01-22 17:37:29.319 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:29 compute-0 ovn_controller[95372]: 2026-01-22T17:37:29Z|00654|binding|INFO|Releasing lport ed3c9c99-2ee5-407d-8706-619270405578 from this chassis (sb_readonly=0)
Jan 22 17:37:30 compute-0 nova_compute[183075]: 2026-01-22 17:37:30.040 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:31 compute-0 podman[236116]: 2026-01-22 17:37:31.34706846 +0000 UTC m=+0.057783954 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 17:37:31 compute-0 nova_compute[183075]: 2026-01-22 17:37:31.357 183079 INFO nova.compute.manager [None req-ea63a0fa-b667-4804-b38c-3491b2a8af42 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Get console output
Jan 22 17:37:31 compute-0 nova_compute[183075]: 2026-01-22 17:37:31.361 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:37:32 compute-0 nova_compute[183075]: 2026-01-22 17:37:32.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:37:33 compute-0 nova_compute[183075]: 2026-01-22 17:37:33.102 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:34 compute-0 nova_compute[183075]: 2026-01-22 17:37:34.321 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:36 compute-0 nova_compute[183075]: 2026-01-22 17:37:36.500 183079 INFO nova.compute.manager [None req-e184fea8-d294-4ffd-b4d0-c11c476c552b 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Get console output
Jan 22 17:37:36 compute-0 nova_compute[183075]: 2026-01-22 17:37:36.506 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:37:37 compute-0 ovn_controller[95372]: 2026-01-22T17:37:37Z|00655|binding|INFO|Releasing lport ed3c9c99-2ee5-407d-8706-619270405578 from this chassis (sb_readonly=0)
Jan 22 17:37:37 compute-0 nova_compute[183075]: 2026-01-22 17:37:37.470 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:37 compute-0 nova_compute[183075]: 2026-01-22 17:37:37.784 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:37:38 compute-0 nova_compute[183075]: 2026-01-22 17:37:38.103 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:38 compute-0 nova_compute[183075]: 2026-01-22 17:37:38.375 183079 INFO nova.compute.manager [None req-98e969b9-0069-4d79-81a4-bc8df83c4c1c 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Get console output
Jan 22 17:37:38 compute-0 nova_compute[183075]: 2026-01-22 17:37:38.379 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:37:38 compute-0 nova_compute[183075]: 2026-01-22 17:37:38.990 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769103443.9894218, 3fd8923a-65e6-42d0-a866-17500a9df5cf => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:37:38 compute-0 nova_compute[183075]: 2026-01-22 17:37:38.991 183079 INFO nova.compute.manager [-] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] VM Stopped (Lifecycle Event)
Jan 22 17:37:39 compute-0 nova_compute[183075]: 2026-01-22 17:37:39.021 183079 DEBUG nova.compute.manager [None req-e99b76a9-2466-4193-bda8-728f4fa6c413 - - - - - -] [instance: 3fd8923a-65e6-42d0-a866-17500a9df5cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:37:39 compute-0 nova_compute[183075]: 2026-01-22 17:37:39.323 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:39 compute-0 nova_compute[183075]: 2026-01-22 17:37:39.521 183079 INFO nova.compute.manager [None req-4a8c08ac-17a3-45a8-af9c-54b2fb51bd6a 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Get console output
Jan 22 17:37:39 compute-0 nova_compute[183075]: 2026-01-22 17:37:39.524 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:37:39 compute-0 nova_compute[183075]: 2026-01-22 17:37:39.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:37:39 compute-0 nova_compute[183075]: 2026-01-22 17:37:39.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:37:39 compute-0 nova_compute[183075]: 2026-01-22 17:37:39.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:37:40 compute-0 sshd-session[236142]: Connection reset by authenticating user root 176.120.22.47 port 35686 [preauth]
Jan 22 17:37:40 compute-0 nova_compute[183075]: 2026-01-22 17:37:40.254 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-585cde24-8038-40b2-97ce-5d30e6ecfc03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:37:40 compute-0 nova_compute[183075]: 2026-01-22 17:37:40.254 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-585cde24-8038-40b2-97ce-5d30e6ecfc03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:37:40 compute-0 nova_compute[183075]: 2026-01-22 17:37:40.254 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:37:40 compute-0 nova_compute[183075]: 2026-01-22 17:37:40.255 183079 DEBUG nova.objects.instance [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 585cde24-8038-40b2-97ce-5d30e6ecfc03 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:37:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:41.953 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:37:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:41.954 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:37:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:41.955 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:37:42 compute-0 podman[236148]: 2026-01-22 17:37:42.352066964 +0000 UTC m=+0.055666537 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Jan 22 17:37:42 compute-0 podman[236147]: 2026-01-22 17:37:42.367524984 +0000 UTC m=+0.077070179 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:37:42 compute-0 podman[236146]: 2026-01-22 17:37:42.370578417 +0000 UTC m=+0.080978725 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:37:42 compute-0 nova_compute[183075]: 2026-01-22 17:37:42.534 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Updating instance_info_cache with network_info: [{"id": "524266a0-8b9b-42d0-9f33-913b53293292", "address": "fa:16:3e:7a:64:5f", "network": {"id": "f8ae4b18-347c-4ff3-b9f8-578518ecd408", "bridge": "br-int", "label": "tempest-TrunkTest-1480081563", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e92551896d2c49b5b149b1a5a0cc1761", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap524266a0-8b", "ovs_interfaceid": "524266a0-8b9b-42d0-9f33-913b53293292", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:37:42 compute-0 nova_compute[183075]: 2026-01-22 17:37:42.553 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-585cde24-8038-40b2-97ce-5d30e6ecfc03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:37:42 compute-0 nova_compute[183075]: 2026-01-22 17:37:42.553 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:37:42 compute-0 nova_compute[183075]: 2026-01-22 17:37:42.553 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:37:42 compute-0 nova_compute[183075]: 2026-01-22 17:37:42.554 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:37:42 compute-0 nova_compute[183075]: 2026-01-22 17:37:42.554 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:37:42 compute-0 nova_compute[183075]: 2026-01-22 17:37:42.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:37:42 compute-0 nova_compute[183075]: 2026-01-22 17:37:42.815 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:37:42 compute-0 nova_compute[183075]: 2026-01-22 17:37:42.815 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:37:42 compute-0 nova_compute[183075]: 2026-01-22 17:37:42.815 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:37:42 compute-0 nova_compute[183075]: 2026-01-22 17:37:42.816 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:37:42 compute-0 nova_compute[183075]: 2026-01-22 17:37:42.901 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/585cde24-8038-40b2-97ce-5d30e6ecfc03/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:37:42 compute-0 nova_compute[183075]: 2026-01-22 17:37:42.961 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/585cde24-8038-40b2-97ce-5d30e6ecfc03/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:37:42 compute-0 nova_compute[183075]: 2026-01-22 17:37:42.963 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/585cde24-8038-40b2-97ce-5d30e6ecfc03/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:37:43 compute-0 nova_compute[183075]: 2026-01-22 17:37:43.021 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/585cde24-8038-40b2-97ce-5d30e6ecfc03/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:37:43 compute-0 nova_compute[183075]: 2026-01-22 17:37:43.027 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b7b02307-4481-472f-8bdc-707a9d19f350/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:37:43 compute-0 nova_compute[183075]: 2026-01-22 17:37:43.082 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b7b02307-4481-472f-8bdc-707a9d19f350/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:37:43 compute-0 nova_compute[183075]: 2026-01-22 17:37:43.083 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b7b02307-4481-472f-8bdc-707a9d19f350/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:37:43 compute-0 nova_compute[183075]: 2026-01-22 17:37:43.106 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:43 compute-0 nova_compute[183075]: 2026-01-22 17:37:43.139 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b7b02307-4481-472f-8bdc-707a9d19f350/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:37:43 compute-0 nova_compute[183075]: 2026-01-22 17:37:43.298 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:37:43 compute-0 nova_compute[183075]: 2026-01-22 17:37:43.299 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5362MB free_disk=73.30315780639648GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:37:43 compute-0 nova_compute[183075]: 2026-01-22 17:37:43.299 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:37:43 compute-0 nova_compute[183075]: 2026-01-22 17:37:43.300 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:37:43 compute-0 nova_compute[183075]: 2026-01-22 17:37:43.623 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 585cde24-8038-40b2-97ce-5d30e6ecfc03 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:37:43 compute-0 nova_compute[183075]: 2026-01-22 17:37:43.623 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance b7b02307-4481-472f-8bdc-707a9d19f350 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:37:43 compute-0 nova_compute[183075]: 2026-01-22 17:37:43.624 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:37:43 compute-0 nova_compute[183075]: 2026-01-22 17:37:43.624 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:37:43 compute-0 nova_compute[183075]: 2026-01-22 17:37:43.646 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Refreshing inventories for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 17:37:43 compute-0 nova_compute[183075]: 2026-01-22 17:37:43.671 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Updating ProviderTree inventory for provider 2513134c-f67c-4237-84bf-4ebe2450d610 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 17:37:43 compute-0 nova_compute[183075]: 2026-01-22 17:37:43.672 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Updating inventory in ProviderTree for provider 2513134c-f67c-4237-84bf-4ebe2450d610 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 17:37:43 compute-0 nova_compute[183075]: 2026-01-22 17:37:43.688 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Refreshing aggregate associations for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 17:37:43 compute-0 nova_compute[183075]: 2026-01-22 17:37:43.715 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Refreshing trait associations for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610, traits: COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SVM,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE42,COMPUTE_NODE,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE4A,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_FMA3,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_AUTO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 17:37:43 compute-0 nova_compute[183075]: 2026-01-22 17:37:43.774 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:37:43 compute-0 nova_compute[183075]: 2026-01-22 17:37:43.794 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:37:43 compute-0 nova_compute[183075]: 2026-01-22 17:37:43.824 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:37:43 compute-0 nova_compute[183075]: 2026-01-22 17:37:43.825 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.525s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:37:44 compute-0 nova_compute[183075]: 2026-01-22 17:37:44.327 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:47 compute-0 podman[236223]: 2026-01-22 17:37:47.356407405 +0000 UTC m=+0.060960590 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 22 17:37:47 compute-0 ovn_controller[95372]: 2026-01-22T17:37:47Z|00656|binding|INFO|Claiming lport 235780f4-6280-41cc-9ec8-72e939db96b0 for this chassis.
Jan 22 17:37:47 compute-0 ovn_controller[95372]: 2026-01-22T17:37:47Z|00657|binding|INFO|235780f4-6280-41cc-9ec8-72e939db96b0: Claiming fa:16:3e:5d:90:cf
Jan 22 17:37:47 compute-0 ovn_controller[95372]: 2026-01-22T17:37:47Z|00658|binding|INFO|Setting lport 235780f4-6280-41cc-9ec8-72e939db96b0 up in Southbound
Jan 22 17:37:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:47.727 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5d:90:cf'], port_security=['fa:16:3e:5d:90:cf'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['524266a0-8b9b-42d0-9f33-913b53293292'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a7f15ccf-0de3-41e5-96a8-e75836879c4f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'neutron:revision_number': '2', 'neutron:security_group_ids': '82e2a423-9fb1-4474-8e75-6c808c020203', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[3], additional_encap=[], encap=[], mirror_rules=[], datapath=47dd2de2-1929-4faf-a49d-e20d1924a299, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=235780f4-6280-41cc-9ec8-72e939db96b0) old=Port_Binding(up=[False], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:37:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:47.728 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 235780f4-6280-41cc-9ec8-72e939db96b0 in datapath a7f15ccf-0de3-41e5-96a8-e75836879c4f bound to our chassis
Jan 22 17:37:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:47.729 104629 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network a7f15ccf-0de3-41e5-96a8-e75836879c4f or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 17:37:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:47.730 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c7544d7f-91a9-4a01-9773-cd47ce7e463a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:47 compute-0 nova_compute[183075]: 2026-01-22 17:37:47.753 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:47 compute-0 ovn_controller[95372]: 2026-01-22T17:37:47Z|00659|binding|INFO|Claiming lport 0786767e-03b2-4160-a379-f2f8c1c1de47 for this chassis.
Jan 22 17:37:47 compute-0 ovn_controller[95372]: 2026-01-22T17:37:47Z|00660|binding|INFO|0786767e-03b2-4160-a379-f2f8c1c1de47: Claiming fa:16:3e:a3:db:72
Jan 22 17:37:47 compute-0 ovn_controller[95372]: 2026-01-22T17:37:47Z|00661|binding|INFO|Setting lport 0786767e-03b2-4160-a379-f2f8c1c1de47 up in Southbound
Jan 22 17:37:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:47.876 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a3:db:72'], port_security=['fa:16:3e:a3:db:72'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['524266a0-8b9b-42d0-9f33-913b53293292'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-46856498-b6e8-4882-88eb-0a7d176f3907', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'neutron:revision_number': '2', 'neutron:security_group_ids': '82e2a423-9fb1-4474-8e75-6c808c020203', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[4], additional_encap=[], encap=[], mirror_rules=[], datapath=c831000a-d6d5-4efa-a88a-83539962f3ee, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=0786767e-03b2-4160-a379-f2f8c1c1de47) old=Port_Binding(up=[False], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:37:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:47.877 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 0786767e-03b2-4160-a379-f2f8c1c1de47 in datapath 46856498-b6e8-4882-88eb-0a7d176f3907 bound to our chassis
Jan 22 17:37:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:47.878 104629 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 46856498-b6e8-4882-88eb-0a7d176f3907 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 17:37:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:47.879 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ffdd2938-b591-4d27-b510-d642d21fb312]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:47 compute-0 nova_compute[183075]: 2026-01-22 17:37:47.904 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:48 compute-0 ovn_controller[95372]: 2026-01-22T17:37:48Z|00662|binding|INFO|Claiming lport 1d9d797b-b7d1-4932-af4a-715af9ef18ed for this chassis.
Jan 22 17:37:48 compute-0 ovn_controller[95372]: 2026-01-22T17:37:48Z|00663|binding|INFO|1d9d797b-b7d1-4932-af4a-715af9ef18ed: Claiming fa:16:3e:34:e6:17
Jan 22 17:37:48 compute-0 ovn_controller[95372]: 2026-01-22T17:37:48Z|00664|binding|INFO|Setting lport 1d9d797b-b7d1-4932-af4a-715af9ef18ed up in Southbound
Jan 22 17:37:48 compute-0 nova_compute[183075]: 2026-01-22 17:37:48.047 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:48 compute-0 nova_compute[183075]: 2026-01-22 17:37:48.107 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:48.235 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:e6:17'], port_security=['fa:16:3e:34:e6:17'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['524266a0-8b9b-42d0-9f33-913b53293292'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ef142b85-db39-49be-8e2c-38028d0ff453', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'neutron:revision_number': '2', 'neutron:security_group_ids': '82e2a423-9fb1-4474-8e75-6c808c020203', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[5], additional_encap=[], encap=[], mirror_rules=[], datapath=d9940ce6-472d-40b4-9a94-20b67dad05d1, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1d9d797b-b7d1-4932-af4a-715af9ef18ed) old=Port_Binding(up=[False], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:37:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:48.237 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 1d9d797b-b7d1-4932-af4a-715af9ef18ed in datapath ef142b85-db39-49be-8e2c-38028d0ff453 bound to our chassis
Jan 22 17:37:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:48.238 104629 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ef142b85-db39-49be-8e2c-38028d0ff453 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 17:37:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:48.239 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[de2f45b6-56d3-432a-862e-9383c0eefabc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:48 compute-0 ovn_controller[95372]: 2026-01-22T17:37:48Z|00665|binding|INFO|Claiming lport e41a3e78-54fe-4a40-9b04-3ee7173a63c0 for this chassis.
Jan 22 17:37:48 compute-0 ovn_controller[95372]: 2026-01-22T17:37:48Z|00666|binding|INFO|e41a3e78-54fe-4a40-9b04-3ee7173a63c0: Claiming fa:16:3e:44:f2:2a
Jan 22 17:37:48 compute-0 ovn_controller[95372]: 2026-01-22T17:37:48Z|00667|binding|INFO|Setting lport e41a3e78-54fe-4a40-9b04-3ee7173a63c0 up in Southbound
Jan 22 17:37:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:48.320 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:f2:2a'], port_security=['fa:16:3e:44:f2:2a'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['524266a0-8b9b-42d0-9f33-913b53293292'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5a511daa-57a3-4f7c-bf7b-a390c6f4b274', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'neutron:revision_number': '2', 'neutron:security_group_ids': '82e2a423-9fb1-4474-8e75-6c808c020203', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[6], additional_encap=[], encap=[], mirror_rules=[], datapath=f21dfdf2-2283-4a8c-b6e2-e712a925b38c, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e41a3e78-54fe-4a40-9b04-3ee7173a63c0) old=Port_Binding(up=[False], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:37:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:48.321 104629 INFO neutron.agent.ovn.metadata.agent [-] Port e41a3e78-54fe-4a40-9b04-3ee7173a63c0 in datapath 5a511daa-57a3-4f7c-bf7b-a390c6f4b274 bound to our chassis
Jan 22 17:37:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:48.322 104629 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5a511daa-57a3-4f7c-bf7b-a390c6f4b274 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 17:37:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:48.323 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6b50a39e-b73e-400c-875d-d0b37130abfe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:48 compute-0 nova_compute[183075]: 2026-01-22 17:37:48.343 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:48 compute-0 sshd-session[236145]: Connection reset by authenticating user root 176.120.22.47 port 51628 [preauth]
Jan 22 17:37:49 compute-0 nova_compute[183075]: 2026-01-22 17:37:49.329 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:49 compute-0 ovn_controller[95372]: 2026-01-22T17:37:49Z|00668|binding|INFO|Removing iface tap524266a0-8b ovn-installed in OVS
Jan 22 17:37:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:49.518 104629 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 65b8008d-14b5-4784-8583-dfafc2443fa5 with type ""
Jan 22 17:37:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:49.519 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5d:90:cf'], port_security=['fa:16:3e:5d:90:cf'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['524266a0-8b9b-42d0-9f33-913b53293292'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a7f15ccf-0de3-41e5-96a8-e75836879c4f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'neutron:revision_number': '2', 'neutron:security_group_ids': '82e2a423-9fb1-4474-8e75-6c808c020203', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[3], additional_encap=[], encap=[], mirror_rules=[], datapath=47dd2de2-1929-4faf-a49d-e20d1924a299, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=235780f4-6280-41cc-9ec8-72e939db96b0) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:37:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:49.520 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 235780f4-6280-41cc-9ec8-72e939db96b0 in datapath a7f15ccf-0de3-41e5-96a8-e75836879c4f unbound from our chassis
Jan 22 17:37:49 compute-0 nova_compute[183075]: 2026-01-22 17:37:49.520 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:49.521 104629 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network a7f15ccf-0de3-41e5-96a8-e75836879c4f or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 17:37:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:49.522 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7c60970a-6c9f-4f8a-af46-87bda38c20b5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:49 compute-0 nova_compute[183075]: 2026-01-22 17:37:49.530 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:49.645 104629 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 2742e69c-9e0f-436f-9b38-2509c3041cad with type ""
Jan 22 17:37:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:49.646 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a3:db:72'], port_security=['fa:16:3e:a3:db:72'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['524266a0-8b9b-42d0-9f33-913b53293292'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-46856498-b6e8-4882-88eb-0a7d176f3907', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'neutron:revision_number': '2', 'neutron:security_group_ids': '82e2a423-9fb1-4474-8e75-6c808c020203', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[4], additional_encap=[], encap=[], mirror_rules=[], datapath=c831000a-d6d5-4efa-a88a-83539962f3ee, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=0786767e-03b2-4160-a379-f2f8c1c1de47) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:37:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:49.647 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 0786767e-03b2-4160-a379-f2f8c1c1de47 in datapath 46856498-b6e8-4882-88eb-0a7d176f3907 unbound from our chassis
Jan 22 17:37:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:49.648 104629 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 46856498-b6e8-4882-88eb-0a7d176f3907 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 17:37:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:49.648 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4c84a498-f54b-4e20-b01a-b47e6435641b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:49 compute-0 nova_compute[183075]: 2026-01-22 17:37:49.656 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:49.824 104629 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port e347973f-a430-4f05-8afd-2b75c9fa875b with type ""
Jan 22 17:37:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:49.825 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:e6:17'], port_security=['fa:16:3e:34:e6:17'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['524266a0-8b9b-42d0-9f33-913b53293292'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ef142b85-db39-49be-8e2c-38028d0ff453', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'neutron:revision_number': '2', 'neutron:security_group_ids': '82e2a423-9fb1-4474-8e75-6c808c020203', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[5], additional_encap=[], encap=[], mirror_rules=[], datapath=d9940ce6-472d-40b4-9a94-20b67dad05d1, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1d9d797b-b7d1-4932-af4a-715af9ef18ed) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:37:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:49.827 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 1d9d797b-b7d1-4932-af4a-715af9ef18ed in datapath ef142b85-db39-49be-8e2c-38028d0ff453 unbound from our chassis
Jan 22 17:37:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:49.829 104629 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ef142b85-db39-49be-8e2c-38028d0ff453 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 17:37:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:49.830 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[bd2b06a1-743b-40a6-9b3f-45e407127fa9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:49 compute-0 nova_compute[183075]: 2026-01-22 17:37:49.838 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:49.979 104629 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 4f4d3148-8bb1-4336-a9b4-883ab76975fb with type ""
Jan 22 17:37:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:49.980 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:f2:2a'], port_security=['fa:16:3e:44:f2:2a'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['524266a0-8b9b-42d0-9f33-913b53293292'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5a511daa-57a3-4f7c-bf7b-a390c6f4b274', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'neutron:revision_number': '2', 'neutron:security_group_ids': '82e2a423-9fb1-4474-8e75-6c808c020203', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[6], additional_encap=[], encap=[], mirror_rules=[], datapath=f21dfdf2-2283-4a8c-b6e2-e712a925b38c, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e41a3e78-54fe-4a40-9b04-3ee7173a63c0) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:37:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:49.981 104629 INFO neutron.agent.ovn.metadata.agent [-] Port e41a3e78-54fe-4a40-9b04-3ee7173a63c0 in datapath 5a511daa-57a3-4f7c-bf7b-a390c6f4b274 unbound from our chassis
Jan 22 17:37:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:49.982 104629 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5a511daa-57a3-4f7c-bf7b-a390c6f4b274 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 17:37:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:49.983 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[5dc8f08f-301a-4ef5-8aa2-0183357f4169]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:49 compute-0 nova_compute[183075]: 2026-01-22 17:37:49.991 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:51 compute-0 sshd-session[236221]: Connection reset by authenticating user root 176.120.22.47 port 51638 [preauth]
Jan 22 17:37:51 compute-0 ovn_controller[95372]: 2026-01-22T17:37:51Z|00669|binding|INFO|Claiming lport 235780f4-6280-41cc-9ec8-72e939db96b0 for this chassis.
Jan 22 17:37:51 compute-0 ovn_controller[95372]: 2026-01-22T17:37:51Z|00670|binding|INFO|235780f4-6280-41cc-9ec8-72e939db96b0: Claiming fa:16:3e:5d:90:cf
Jan 22 17:37:51 compute-0 ovn_controller[95372]: 2026-01-22T17:37:51Z|00671|binding|INFO|Setting lport 235780f4-6280-41cc-9ec8-72e939db96b0 up in Southbound
Jan 22 17:37:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:51.991 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5d:90:cf'], port_security=['fa:16:3e:5d:90:cf'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['3a7e69a2-e6a0-4e5f-883e-d116981b58d9'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a7f15ccf-0de3-41e5-96a8-e75836879c4f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'neutron:revision_number': '5', 'neutron:security_group_ids': '82e2a423-9fb1-4474-8e75-6c808c020203', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[3], additional_encap=[], encap=[], mirror_rules=[], datapath=47dd2de2-1929-4faf-a49d-e20d1924a299, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=235780f4-6280-41cc-9ec8-72e939db96b0) old=Port_Binding(up=[False], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:37:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:51.992 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 235780f4-6280-41cc-9ec8-72e939db96b0 in datapath a7f15ccf-0de3-41e5-96a8-e75836879c4f bound to our chassis
Jan 22 17:37:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:51.993 104629 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network a7f15ccf-0de3-41e5-96a8-e75836879c4f or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 17:37:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:51.993 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[82b07d79-4dc7-4073-9383-fcb06f9facf6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:52 compute-0 nova_compute[183075]: 2026-01-22 17:37:52.010 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:52 compute-0 ovn_controller[95372]: 2026-01-22T17:37:52Z|00672|binding|INFO|Claiming lport 0786767e-03b2-4160-a379-f2f8c1c1de47 for this chassis.
Jan 22 17:37:52 compute-0 ovn_controller[95372]: 2026-01-22T17:37:52Z|00673|binding|INFO|0786767e-03b2-4160-a379-f2f8c1c1de47: Claiming fa:16:3e:a3:db:72
Jan 22 17:37:52 compute-0 ovn_controller[95372]: 2026-01-22T17:37:52Z|00674|binding|INFO|Setting lport 0786767e-03b2-4160-a379-f2f8c1c1de47 up in Southbound
Jan 22 17:37:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:52.105 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a3:db:72'], port_security=['fa:16:3e:a3:db:72'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['3a7e69a2-e6a0-4e5f-883e-d116981b58d9'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-46856498-b6e8-4882-88eb-0a7d176f3907', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'neutron:revision_number': '5', 'neutron:security_group_ids': '82e2a423-9fb1-4474-8e75-6c808c020203', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[4], additional_encap=[], encap=[], mirror_rules=[], datapath=c831000a-d6d5-4efa-a88a-83539962f3ee, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=0786767e-03b2-4160-a379-f2f8c1c1de47) old=Port_Binding(up=[False], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:37:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:52.107 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 0786767e-03b2-4160-a379-f2f8c1c1de47 in datapath 46856498-b6e8-4882-88eb-0a7d176f3907 bound to our chassis
Jan 22 17:37:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:52.107 104629 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 46856498-b6e8-4882-88eb-0a7d176f3907 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 17:37:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:52.108 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[584d7451-7368-4e80-baae-573387fe7fc7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:52 compute-0 nova_compute[183075]: 2026-01-22 17:37:52.114 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:52 compute-0 ovn_controller[95372]: 2026-01-22T17:37:52Z|00675|binding|INFO|Claiming lport 1d9d797b-b7d1-4932-af4a-715af9ef18ed for this chassis.
Jan 22 17:37:52 compute-0 ovn_controller[95372]: 2026-01-22T17:37:52Z|00676|binding|INFO|1d9d797b-b7d1-4932-af4a-715af9ef18ed: Claiming fa:16:3e:34:e6:17
Jan 22 17:37:52 compute-0 ovn_controller[95372]: 2026-01-22T17:37:52Z|00677|binding|INFO|Setting lport 1d9d797b-b7d1-4932-af4a-715af9ef18ed up in Southbound
Jan 22 17:37:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:52.251 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:e6:17'], port_security=['fa:16:3e:34:e6:17'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['3a7e69a2-e6a0-4e5f-883e-d116981b58d9'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ef142b85-db39-49be-8e2c-38028d0ff453', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'neutron:revision_number': '5', 'neutron:security_group_ids': '82e2a423-9fb1-4474-8e75-6c808c020203', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[5], additional_encap=[], encap=[], mirror_rules=[], datapath=d9940ce6-472d-40b4-9a94-20b67dad05d1, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1d9d797b-b7d1-4932-af4a-715af9ef18ed) old=Port_Binding(up=[False], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:37:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:52.252 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 1d9d797b-b7d1-4932-af4a-715af9ef18ed in datapath ef142b85-db39-49be-8e2c-38028d0ff453 bound to our chassis
Jan 22 17:37:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:52.254 104629 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ef142b85-db39-49be-8e2c-38028d0ff453 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 17:37:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:52.255 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a8237db1-647d-4917-892e-0817237ac2d7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:52 compute-0 nova_compute[183075]: 2026-01-22 17:37:52.259 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:52 compute-0 ovn_controller[95372]: 2026-01-22T17:37:52Z|00678|binding|INFO|Claiming lport e41a3e78-54fe-4a40-9b04-3ee7173a63c0 for this chassis.
Jan 22 17:37:52 compute-0 ovn_controller[95372]: 2026-01-22T17:37:52Z|00679|binding|INFO|e41a3e78-54fe-4a40-9b04-3ee7173a63c0: Claiming fa:16:3e:44:f2:2a
Jan 22 17:37:52 compute-0 ovn_controller[95372]: 2026-01-22T17:37:52Z|00680|binding|INFO|Setting lport e41a3e78-54fe-4a40-9b04-3ee7173a63c0 up in Southbound
Jan 22 17:37:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:52.355 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:f2:2a'], port_security=['fa:16:3e:44:f2:2a'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['3a7e69a2-e6a0-4e5f-883e-d116981b58d9'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5a511daa-57a3-4f7c-bf7b-a390c6f4b274', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'neutron:revision_number': '5', 'neutron:security_group_ids': '82e2a423-9fb1-4474-8e75-6c808c020203', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[6], additional_encap=[], encap=[], mirror_rules=[], datapath=f21dfdf2-2283-4a8c-b6e2-e712a925b38c, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e41a3e78-54fe-4a40-9b04-3ee7173a63c0) old=Port_Binding(up=[False], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:37:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:52.357 104629 INFO neutron.agent.ovn.metadata.agent [-] Port e41a3e78-54fe-4a40-9b04-3ee7173a63c0 in datapath 5a511daa-57a3-4f7c-bf7b-a390c6f4b274 bound to our chassis
Jan 22 17:37:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:52.358 104629 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5a511daa-57a3-4f7c-bf7b-a390c6f4b274 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 17:37:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:52.359 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3b3e9ee8-598c-44eb-9845-fc4328fddeac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:52 compute-0 nova_compute[183075]: 2026-01-22 17:37:52.411 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:53 compute-0 nova_compute[183075]: 2026-01-22 17:37:53.110 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:53 compute-0 nova_compute[183075]: 2026-01-22 17:37:53.594 183079 DEBUG oslo_concurrency.lockutils [None req-d0744185-94d0-45f3-a3db-7a1bdb680db2 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Acquiring lock "b7b02307-4481-472f-8bdc-707a9d19f350" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:37:53 compute-0 nova_compute[183075]: 2026-01-22 17:37:53.595 183079 DEBUG oslo_concurrency.lockutils [None req-d0744185-94d0-45f3-a3db-7a1bdb680db2 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "b7b02307-4481-472f-8bdc-707a9d19f350" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:37:53 compute-0 nova_compute[183075]: 2026-01-22 17:37:53.595 183079 DEBUG oslo_concurrency.lockutils [None req-d0744185-94d0-45f3-a3db-7a1bdb680db2 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Acquiring lock "b7b02307-4481-472f-8bdc-707a9d19f350-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:37:53 compute-0 nova_compute[183075]: 2026-01-22 17:37:53.596 183079 DEBUG oslo_concurrency.lockutils [None req-d0744185-94d0-45f3-a3db-7a1bdb680db2 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "b7b02307-4481-472f-8bdc-707a9d19f350-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:37:53 compute-0 nova_compute[183075]: 2026-01-22 17:37:53.596 183079 DEBUG oslo_concurrency.lockutils [None req-d0744185-94d0-45f3-a3db-7a1bdb680db2 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "b7b02307-4481-472f-8bdc-707a9d19f350-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:37:53 compute-0 nova_compute[183075]: 2026-01-22 17:37:53.598 183079 INFO nova.compute.manager [None req-d0744185-94d0-45f3-a3db-7a1bdb680db2 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Terminating instance
Jan 22 17:37:53 compute-0 nova_compute[183075]: 2026-01-22 17:37:53.600 183079 DEBUG nova.compute.manager [None req-d0744185-94d0-45f3-a3db-7a1bdb680db2 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:37:53 compute-0 kernel: tap3a7e69a2-e6 (unregistering): left promiscuous mode
Jan 22 17:37:53 compute-0 NetworkManager[55454]: <info>  [1769103473.6286] device (tap3a7e69a2-e6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:37:53 compute-0 ovn_controller[95372]: 2026-01-22T17:37:53Z|00681|binding|INFO|Releasing lport 3a7e69a2-e6a0-4e5f-883e-d116981b58d9 from this chassis (sb_readonly=0)
Jan 22 17:37:53 compute-0 ovn_controller[95372]: 2026-01-22T17:37:53Z|00682|binding|INFO|Setting lport 3a7e69a2-e6a0-4e5f-883e-d116981b58d9 down in Southbound
Jan 22 17:37:53 compute-0 ovn_controller[95372]: 2026-01-22T17:37:53Z|00683|binding|INFO|Releasing lport 235780f4-6280-41cc-9ec8-72e939db96b0 from this chassis (sb_readonly=0)
Jan 22 17:37:53 compute-0 ovn_controller[95372]: 2026-01-22T17:37:53Z|00684|binding|INFO|Setting lport 235780f4-6280-41cc-9ec8-72e939db96b0 down in Southbound
Jan 22 17:37:53 compute-0 ovn_controller[95372]: 2026-01-22T17:37:53Z|00685|binding|INFO|Releasing lport 0786767e-03b2-4160-a379-f2f8c1c1de47 from this chassis (sb_readonly=0)
Jan 22 17:37:53 compute-0 ovn_controller[95372]: 2026-01-22T17:37:53Z|00686|binding|INFO|Setting lport 0786767e-03b2-4160-a379-f2f8c1c1de47 down in Southbound
Jan 22 17:37:53 compute-0 ovn_controller[95372]: 2026-01-22T17:37:53Z|00687|binding|INFO|Releasing lport 1d9d797b-b7d1-4932-af4a-715af9ef18ed from this chassis (sb_readonly=0)
Jan 22 17:37:53 compute-0 ovn_controller[95372]: 2026-01-22T17:37:53Z|00688|binding|INFO|Setting lport 1d9d797b-b7d1-4932-af4a-715af9ef18ed down in Southbound
Jan 22 17:37:53 compute-0 ovn_controller[95372]: 2026-01-22T17:37:53Z|00689|binding|INFO|Releasing lport e41a3e78-54fe-4a40-9b04-3ee7173a63c0 from this chassis (sb_readonly=0)
Jan 22 17:37:53 compute-0 ovn_controller[95372]: 2026-01-22T17:37:53Z|00690|binding|INFO|Setting lport e41a3e78-54fe-4a40-9b04-3ee7173a63c0 down in Southbound
Jan 22 17:37:53 compute-0 nova_compute[183075]: 2026-01-22 17:37:53.640 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:53 compute-0 ovn_controller[95372]: 2026-01-22T17:37:53Z|00691|binding|INFO|Removing iface tap3a7e69a2-e6 ovn-installed in OVS
Jan 22 17:37:53 compute-0 nova_compute[183075]: 2026-01-22 17:37:53.643 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:53 compute-0 ovn_controller[95372]: 2026-01-22T17:37:53Z|00692|binding|INFO|Releasing lport ed3c9c99-2ee5-407d-8706-619270405578 from this chassis (sb_readonly=0)
Jan 22 17:37:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:53.652 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ea:41:ca 10.100.0.7'], port_security=['fa:16:3e:ea:41:ca 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TrunkTest-1480081563', 'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'b7b02307-4481-472f-8bdc-707a9d19f350', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f8ae4b18-347c-4ff3-b9f8-578518ecd408', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TrunkTest-1480081563', 'neutron:project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'neutron:revision_number': '4', 'neutron:security_group_ids': '709a6363-b0f8-4821-8d51-15be786bf9b0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.185', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=43bd6944-fad9-4457-8b9b-34db3859c385, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=3a7e69a2-e6a0-4e5f-883e-d116981b58d9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:37:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:53.654 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5d:90:cf'], port_security=['fa:16:3e:5d:90:cf'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['3a7e69a2-e6a0-4e5f-883e-d116981b58d9'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a7f15ccf-0de3-41e5-96a8-e75836879c4f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'neutron:revision_number': '5', 'neutron:security_group_ids': '82e2a423-9fb1-4474-8e75-6c808c020203', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[3], additional_encap=[], encap=[], mirror_rules=[], datapath=47dd2de2-1929-4faf-a49d-e20d1924a299, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=235780f4-6280-41cc-9ec8-72e939db96b0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:37:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:53.655 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a3:db:72'], port_security=['fa:16:3e:a3:db:72'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['3a7e69a2-e6a0-4e5f-883e-d116981b58d9'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-46856498-b6e8-4882-88eb-0a7d176f3907', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'neutron:revision_number': '5', 'neutron:security_group_ids': '82e2a423-9fb1-4474-8e75-6c808c020203', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[4], additional_encap=[], encap=[], mirror_rules=[], datapath=c831000a-d6d5-4efa-a88a-83539962f3ee, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=0786767e-03b2-4160-a379-f2f8c1c1de47) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:37:53 compute-0 ovn_controller[95372]: 2026-01-22T17:37:53Z|00693|binding|INFO|Setting lport 524266a0-8b9b-42d0-9f33-913b53293292 down in Southbound
Jan 22 17:37:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:53.657 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:e6:17'], port_security=['fa:16:3e:34:e6:17'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['3a7e69a2-e6a0-4e5f-883e-d116981b58d9'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ef142b85-db39-49be-8e2c-38028d0ff453', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'neutron:revision_number': '5', 'neutron:security_group_ids': '82e2a423-9fb1-4474-8e75-6c808c020203', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[5], additional_encap=[], encap=[], mirror_rules=[], datapath=d9940ce6-472d-40b4-9a94-20b67dad05d1, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1d9d797b-b7d1-4932-af4a-715af9ef18ed) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:37:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:53.658 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:f2:2a'], port_security=['fa:16:3e:44:f2:2a'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['3a7e69a2-e6a0-4e5f-883e-d116981b58d9'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5a511daa-57a3-4f7c-bf7b-a390c6f4b274', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'neutron:revision_number': '5', 'neutron:security_group_ids': '82e2a423-9fb1-4474-8e75-6c808c020203', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[6], additional_encap=[], encap=[], mirror_rules=[], datapath=f21dfdf2-2283-4a8c-b6e2-e712a925b38c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e41a3e78-54fe-4a40-9b04-3ee7173a63c0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:37:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:53.659 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 3a7e69a2-e6a0-4e5f-883e-d116981b58d9 in datapath f8ae4b18-347c-4ff3-b9f8-578518ecd408 unbound from our chassis
Jan 22 17:37:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:53.661 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f8ae4b18-347c-4ff3-b9f8-578518ecd408
Jan 22 17:37:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:53.682 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ad0e888c-7a8c-4e48-b12e-8dff88ab7afb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:53 compute-0 nova_compute[183075]: 2026-01-22 17:37:53.698 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:53.715 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[2fd3fb43-8d2b-4c3a-8fb8-fe02656b0d5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:53.718 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[ffcc6180-4939-4820-a4be-1c5073da411d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:53.749 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[732ee0f9-6c61-428a-a50d-47ddd72b19f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:53.771 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[5e77feed-ac1c-4b4f-a5bb-a30cead585bc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf8ae4b18-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:05:69:38'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 204, 'tx_packets': 105, 'rx_bytes': 17392, 'tx_bytes': 12012, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 204, 'tx_packets': 105, 'rx_bytes': 17392, 'tx_bytes': 12012, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 180], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 568874, 'reachable_time': 28814, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236261, 'error': None, 'target': 'ovnmeta-f8ae4b18-347c-4ff3-b9f8-578518ecd408', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:53 compute-0 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d0000003c.scope: Deactivated successfully.
Jan 22 17:37:53 compute-0 ovn_controller[95372]: 2026-01-22T17:37:53Z|00694|binding|INFO|Setting lport 524266a0-8b9b-42d0-9f33-913b53293292 ovn-installed in OVS
Jan 22 17:37:53 compute-0 ovn_controller[95372]: 2026-01-22T17:37:53Z|00695|binding|INFO|Setting lport 524266a0-8b9b-42d0-9f33-913b53293292 up in Southbound
Jan 22 17:37:53 compute-0 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d0000003c.scope: Consumed 16.204s CPU time.
Jan 22 17:37:53 compute-0 nova_compute[183075]: 2026-01-22 17:37:53.784 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:53 compute-0 systemd-machined[154382]: Machine qemu-60-instance-0000003c terminated.
Jan 22 17:37:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:53.791 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8fdd054c-3ede-45fc-82f2-922814e472b6]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf8ae4b18-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 568887, 'tstamp': 568887}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236262, 'error': None, 'target': 'ovnmeta-f8ae4b18-347c-4ff3-b9f8-578518ecd408', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf8ae4b18-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 568890, 'tstamp': 568890}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236262, 'error': None, 'target': 'ovnmeta-f8ae4b18-347c-4ff3-b9f8-578518ecd408', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:53.792 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf8ae4b18-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:37:53 compute-0 nova_compute[183075]: 2026-01-22 17:37:53.794 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:53 compute-0 nova_compute[183075]: 2026-01-22 17:37:53.799 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:53.800 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf8ae4b18-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:37:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:53.800 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:37:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:53.800 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf8ae4b18-30, col_values=(('external_ids', {'iface-id': 'ed3c9c99-2ee5-407d-8706-619270405578'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:37:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:53.801 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:37:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:53.802 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 235780f4-6280-41cc-9ec8-72e939db96b0 in datapath a7f15ccf-0de3-41e5-96a8-e75836879c4f unbound from our chassis
Jan 22 17:37:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:53.803 104629 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network a7f15ccf-0de3-41e5-96a8-e75836879c4f or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 17:37:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:53.804 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[31adacf2-b0a4-45c1-a8cf-f099a0b2c2d8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:53.804 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 0786767e-03b2-4160-a379-f2f8c1c1de47 in datapath 46856498-b6e8-4882-88eb-0a7d176f3907 unbound from our chassis
Jan 22 17:37:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:53.805 104629 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 46856498-b6e8-4882-88eb-0a7d176f3907 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 17:37:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:53.805 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2e62bc70-ba87-4f71-afca-ec090fef3ea8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:53.806 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 1d9d797b-b7d1-4932-af4a-715af9ef18ed in datapath ef142b85-db39-49be-8e2c-38028d0ff453 unbound from our chassis
Jan 22 17:37:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:53.806 104629 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ef142b85-db39-49be-8e2c-38028d0ff453 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 17:37:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:53.807 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ae8518f4-454a-4a60-aa7c-56102a8250c7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:53.807 104629 INFO neutron.agent.ovn.metadata.agent [-] Port e41a3e78-54fe-4a40-9b04-3ee7173a63c0 in datapath 5a511daa-57a3-4f7c-bf7b-a390c6f4b274 unbound from our chassis
Jan 22 17:37:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:53.808 104629 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5a511daa-57a3-4f7c-bf7b-a390c6f4b274 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 17:37:53 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:53.808 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e60997f3-4546-413c-ab22-894810ad31d2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:53 compute-0 nova_compute[183075]: 2026-01-22 17:37:53.864 183079 INFO nova.virt.libvirt.driver [-] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Instance destroyed successfully.
Jan 22 17:37:53 compute-0 nova_compute[183075]: 2026-01-22 17:37:53.865 183079 DEBUG nova.objects.instance [None req-d0744185-94d0-45f3-a3db-7a1bdb680db2 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lazy-loading 'resources' on Instance uuid b7b02307-4481-472f-8bdc-707a9d19f350 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:37:53 compute-0 nova_compute[183075]: 2026-01-22 17:37:53.888 183079 DEBUG nova.virt.libvirt.vif [None req-d0744185-94d0-45f3-a3db-7a1bdb680db2 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:36:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-104321321',display_name='tempest-server-test-104321321',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-104321321',id=60,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMx6tQlM3uv2ZwDgQasADvvbGd5zBlSeFHyt6pKaXPuo5g1lBpnMysyabNjP8htj/tP0P4meLZoYHTsZRxp2O0FBGUiyAm9KZdR/DNDaP0hn5KYm00UnMkjIWdjBNqhB9Q==',key_name='tempest-TrunkTest-1480081563',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:36:38Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e92551896d2c49b5b149b1a5a0cc1761',ramdisk_id='',reservation_id='r-iwasr1w4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TrunkTest-252091256',owner_user_name='tempest-TrunkTest-252091256-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:36:38Z,user_data=None,user_id='7cc2886d6b0e400d8096a810a2159f3c',uuid=b7b02307-4481-472f-8bdc-707a9d19f350,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3a7e69a2-e6a0-4e5f-883e-d116981b58d9", "address": "fa:16:3e:ea:41:ca", "network": {"id": "f8ae4b18-347c-4ff3-b9f8-578518ecd408", "bridge": "br-int", "label": "tempest-TrunkTest-1480081563", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e92551896d2c49b5b149b1a5a0cc1761", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a7e69a2-e6", "ovs_interfaceid": "3a7e69a2-e6a0-4e5f-883e-d116981b58d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:37:53 compute-0 nova_compute[183075]: 2026-01-22 17:37:53.889 183079 DEBUG nova.network.os_vif_util [None req-d0744185-94d0-45f3-a3db-7a1bdb680db2 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Converting VIF {"id": "3a7e69a2-e6a0-4e5f-883e-d116981b58d9", "address": "fa:16:3e:ea:41:ca", "network": {"id": "f8ae4b18-347c-4ff3-b9f8-578518ecd408", "bridge": "br-int", "label": "tempest-TrunkTest-1480081563", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e92551896d2c49b5b149b1a5a0cc1761", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a7e69a2-e6", "ovs_interfaceid": "3a7e69a2-e6a0-4e5f-883e-d116981b58d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:37:53 compute-0 nova_compute[183075]: 2026-01-22 17:37:53.890 183079 DEBUG nova.network.os_vif_util [None req-d0744185-94d0-45f3-a3db-7a1bdb680db2 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ea:41:ca,bridge_name='br-int',has_traffic_filtering=True,id=3a7e69a2-e6a0-4e5f-883e-d116981b58d9,network=Network(f8ae4b18-347c-4ff3-b9f8-578518ecd408),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3a7e69a2-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:37:53 compute-0 nova_compute[183075]: 2026-01-22 17:37:53.890 183079 DEBUG os_vif [None req-d0744185-94d0-45f3-a3db-7a1bdb680db2 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ea:41:ca,bridge_name='br-int',has_traffic_filtering=True,id=3a7e69a2-e6a0-4e5f-883e-d116981b58d9,network=Network(f8ae4b18-347c-4ff3-b9f8-578518ecd408),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3a7e69a2-e6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:37:53 compute-0 nova_compute[183075]: 2026-01-22 17:37:53.892 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:53 compute-0 nova_compute[183075]: 2026-01-22 17:37:53.893 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3a7e69a2-e6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:37:53 compute-0 nova_compute[183075]: 2026-01-22 17:37:53.894 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:53 compute-0 nova_compute[183075]: 2026-01-22 17:37:53.895 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:53 compute-0 nova_compute[183075]: 2026-01-22 17:37:53.897 183079 INFO os_vif [None req-d0744185-94d0-45f3-a3db-7a1bdb680db2 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ea:41:ca,bridge_name='br-int',has_traffic_filtering=True,id=3a7e69a2-e6a0-4e5f-883e-d116981b58d9,network=Network(f8ae4b18-347c-4ff3-b9f8-578518ecd408),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3a7e69a2-e6')
Jan 22 17:37:53 compute-0 nova_compute[183075]: 2026-01-22 17:37:53.897 183079 INFO nova.virt.libvirt.driver [None req-d0744185-94d0-45f3-a3db-7a1bdb680db2 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Deleting instance files /var/lib/nova/instances/b7b02307-4481-472f-8bdc-707a9d19f350_del
Jan 22 17:37:53 compute-0 nova_compute[183075]: 2026-01-22 17:37:53.898 183079 INFO nova.virt.libvirt.driver [None req-d0744185-94d0-45f3-a3db-7a1bdb680db2 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Deletion of /var/lib/nova/instances/b7b02307-4481-472f-8bdc-707a9d19f350_del complete
Jan 22 17:37:53 compute-0 nova_compute[183075]: 2026-01-22 17:37:53.942 183079 INFO nova.compute.manager [None req-d0744185-94d0-45f3-a3db-7a1bdb680db2 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Took 0.34 seconds to destroy the instance on the hypervisor.
Jan 22 17:37:53 compute-0 nova_compute[183075]: 2026-01-22 17:37:53.942 183079 DEBUG oslo.service.loopingcall [None req-d0744185-94d0-45f3-a3db-7a1bdb680db2 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:37:53 compute-0 nova_compute[183075]: 2026-01-22 17:37:53.942 183079 DEBUG nova.compute.manager [-] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:37:53 compute-0 nova_compute[183075]: 2026-01-22 17:37:53.943 183079 DEBUG nova.network.neutron [-] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:37:54 compute-0 nova_compute[183075]: 2026-01-22 17:37:54.821 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.459 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'name': 'tempest-server-test-575952224', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003a', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'hostId': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.460 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.464 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/network.incoming.bytes.delta volume: 12066 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eb3edd39-7cd1-4428-a199-a6ca3df5842a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 12066, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': 'instance-0000003a-585cde24-8038-40b2-97ce-5d30e6ecfc03-tap524266a0-8b', 'timestamp': '2026-01-22T17:37:55.460905', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'tap524266a0-8b', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7a:64:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap524266a0-8b'}, 'message_id': '15c3c022-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5839.22133812, 'message_signature': 'fa42c779e3e370140e4d84b7cf884eb6e08374e351e53f718645c6324b8ee92f'}]}, 'timestamp': '2026-01-22 17:37:55.464875', '_unique_id': '59ba517a2a034b6c84398e8e196b6e3c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.466 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.481 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/disk.device.read.latency volume: 206288121 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73c7c590-f1b5-44ff-abf1-999846575406', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 206288121, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03-vda', 'timestamp': '2026-01-22T17:37:55.467035', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'instance-0000003a', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '15c672b8-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5839.227474747, 'message_signature': '47f9946eb70ca64335e0cf0e1a2d4ca901470eab21bea8038f8680d75cebff34'}]}, 'timestamp': '2026-01-22 17:37:55.482474', '_unique_id': '0881f187778b41f0a4be62bf15c5e562'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.483 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.484 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.484 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/network.outgoing.bytes.delta volume: 9907 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b26ce3e9-9ac1-49b3-8a91-2579dc92f243', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 9907, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': 'instance-0000003a-585cde24-8038-40b2-97ce-5d30e6ecfc03-tap524266a0-8b', 'timestamp': '2026-01-22T17:37:55.484245', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'tap524266a0-8b', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7a:64:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap524266a0-8b'}, 'message_id': '15c6c5c4-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5839.22133812, 'message_signature': '829e2fca5deb14440d91437ee4f8a4c126f0de7e354d4144f883608ac3530144'}]}, 'timestamp': '2026-01-22 17:37:55.484562', '_unique_id': '3666be540cdc4ad8b65733bca32da492'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.485 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.491 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/disk.device.usage volume: 30081024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '29065ca8-ba07-4fc1-827e-fdf6ab185e8f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30081024, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03-vda', 'timestamp': '2026-01-22T17:37:55.485812', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'instance-0000003a', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '15c7ea76-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5839.246267379, 'message_signature': 'a0b8ee30cc2359cd7eca784cff502ad962e9125728116ff8aafba0a264af1b79'}]}, 'timestamp': '2026-01-22 17:37:55.492112', '_unique_id': 'c2847225bfb54e32a826469de276a7c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.493 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.494 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.494 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/disk.device.read.requests volume: 1116 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '60fcaa90-d847-41ef-a53a-6ab719c5a24b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1116, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03-vda', 'timestamp': '2026-01-22T17:37:55.494387', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'instance-0000003a', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '15c851c8-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5839.227474747, 'message_signature': 'dc45a0b42abfee20511fd350108a8dc549c897c3242ad1638284564294f0693c'}]}, 'timestamp': '2026-01-22 17:37:55.494726', '_unique_id': '6623ebcb1ff345e594dfb00061c66e77'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.495 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.496 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.496 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/disk.device.read.bytes volume: 30063104 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '538904e7-e1cb-43e6-82f5-6b7e76714829', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30063104, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03-vda', 'timestamp': '2026-01-22T17:37:55.496325', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'instance-0000003a', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '15c89d22-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5839.227474747, 'message_signature': '7556b9cf44b8475c99dc88ec72c0e5a9499998b41a1d55bb84468cb2fab7cea6'}]}, 'timestamp': '2026-01-22 17:37:55.496618', '_unique_id': 'de8232f18c8948798873c21951031dd0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.497 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.498 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6462e214-88ef-4255-b1ab-fd2f3a8fecf2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': 'instance-0000003a-585cde24-8038-40b2-97ce-5d30e6ecfc03-tap524266a0-8b', 'timestamp': '2026-01-22T17:37:55.498075', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'tap524266a0-8b', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7a:64:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap524266a0-8b'}, 'message_id': '15c8e1a6-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5839.22133812, 'message_signature': 'dc983be0777c9c84a180bc29d4080fc95e6bc8593732293dbc982f8b929ae9d8'}]}, 'timestamp': '2026-01-22 17:37:55.498387', '_unique_id': '4dad05ff05be4c34a17ade127bcf1f88'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.499 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/network.outgoing.packets volume: 192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c847c3d-5499-4489-8036-966e7b77fd91', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 192, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': 'instance-0000003a-585cde24-8038-40b2-97ce-5d30e6ecfc03-tap524266a0-8b', 'timestamp': '2026-01-22T17:37:55.499837', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'tap524266a0-8b', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7a:64:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap524266a0-8b'}, 'message_id': '15c9262a-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5839.22133812, 'message_signature': '2ac4ea099b1157bbd05d182ba9adeb2f3c9473184b3d67b0d4504f450da943c3'}]}, 'timestamp': '2026-01-22 17:37:55.500132', '_unique_id': 'c1e4c3e189994e40b271ba1b5c0e28d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.500 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.501 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.501 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/disk.device.write.requests volume: 353 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '88460565-ea0b-453a-b400-00ea1a6999cd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 353, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03-vda', 'timestamp': '2026-01-22T17:37:55.501370', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'instance-0000003a', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '15c960ea-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5839.227474747, 'message_signature': '65890a2a4502b3dc0af883fe140f60df15d173d4274b293ed803505b1f24a19f'}]}, 'timestamp': '2026-01-22 17:37:55.501619', '_unique_id': 'cbe9a4f8663f4c8585c68a4ba135d4f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.502 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/disk.device.write.bytes volume: 73261056 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '41169c53-c429-4f1b-83a2-8e05306719d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73261056, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03-vda', 'timestamp': '2026-01-22T17:37:55.502943', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'instance-0000003a', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '15c99f42-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5839.227474747, 'message_signature': 'f86a6b10acc734ef5d79ee34cd3fe365cd814977867fd36c575aafc4c7ca369c'}]}, 'timestamp': '2026-01-22 17:37:55.503224', '_unique_id': '1912d090a7454f0589dbb4d3888ce81d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.503 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.504 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.504 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a9ae4cae-119e-4f09-b76a-7cd803658c0a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': 'instance-0000003a-585cde24-8038-40b2-97ce-5d30e6ecfc03-tap524266a0-8b', 'timestamp': '2026-01-22T17:37:55.504551', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'tap524266a0-8b', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7a:64:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap524266a0-8b'}, 'message_id': '15c9dea8-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5839.22133812, 'message_signature': '18876688dc32e4b0422a34513ef8bff43813ed1068715d9cafead2bf4bc7e073'}]}, 'timestamp': '2026-01-22 17:37:55.504818', '_unique_id': '4cc261a25c54476b945914352d619d01'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.505 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23d56b63-2193-4d03-b100-1ed878c97f52', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': 'instance-0000003a-585cde24-8038-40b2-97ce-5d30e6ecfc03-tap524266a0-8b', 'timestamp': '2026-01-22T17:37:55.506028', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'tap524266a0-8b', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7a:64:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap524266a0-8b'}, 'message_id': '15ca16de-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5839.22133812, 'message_signature': 'b03e08a56361ccc268c5914cd763e81c8dad7c1deff91cda27aeb144b247446e'}]}, 'timestamp': '2026-01-22 17:37:55.506256', '_unique_id': 'bef2c7510198495db817f454da805421'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.506 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.507 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.507 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '60d67dd3-bebd-4299-9ec7-df74f4d932b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': 'instance-0000003a-585cde24-8038-40b2-97ce-5d30e6ecfc03-tap524266a0-8b', 'timestamp': '2026-01-22T17:37:55.507406', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'tap524266a0-8b', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7a:64:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap524266a0-8b'}, 'message_id': '15ca4de8-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5839.22133812, 'message_signature': '82bd8786d01441f68597119b645536c2b32f61f2615784a7a77c7a2258f76953'}]}, 'timestamp': '2026-01-22 17:37:55.507723', '_unique_id': '24bc3a0b81ee4017a784e27b13f125ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.508 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/network.incoming.bytes volume: 19289 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15b589ef-8219-4fa2-b86c-b0737f318287', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 19289, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': 'instance-0000003a-585cde24-8038-40b2-97ce-5d30e6ecfc03-tap524266a0-8b', 'timestamp': '2026-01-22T17:37:55.509140', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'tap524266a0-8b', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7a:64:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap524266a0-8b'}, 'message_id': '15ca908c-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5839.22133812, 'message_signature': 'f7f04f5a213a9866ab8a0a9ab6847d2e6f6b3dc9eebe2c82509b59abc89a8700'}]}, 'timestamp': '2026-01-22 17:37:55.509373', '_unique_id': '125c7297c7404d1d8d3788af7ff43018'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.509 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.510 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.510 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/network.incoming.packets volume: 127 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e3d0cb2-1eaf-4b15-a220-bf8f3a18284f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 127, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': 'instance-0000003a-585cde24-8038-40b2-97ce-5d30e6ecfc03-tap524266a0-8b', 'timestamp': '2026-01-22T17:37:55.510777', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'tap524266a0-8b', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7a:64:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap524266a0-8b'}, 'message_id': '15cad0ce-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5839.22133812, 'message_signature': 'facb9d8be00f221e96fefce7bdfd4c7bba175d69d3f3d923ddfc4374320cf0f1'}]}, 'timestamp': '2026-01-22 17:37:55.511023', '_unique_id': 'cba186cfc18f4ccaaaba637eaa886bf0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.511 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/disk.device.allocation volume: 30220288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2c4a5a07-fd45-4796-9942-e3dc25eb01a4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30220288, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03-vda', 'timestamp': '2026-01-22T17:37:55.512237', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'instance-0000003a', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '15cb0922-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5839.246267379, 'message_signature': '292bd82cc7efbab57941a85f28af1be45b00c8964728ff43d78022b9c2727a0d'}]}, 'timestamp': '2026-01-22 17:37:55.512468', '_unique_id': '479e9ad960884d0da14ee20699835f66'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.512 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.513 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.527 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/memory.usage volume: 42.515625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6467635e-8d42-46e9-b2fd-d3a8b78ddf73', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.515625, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'timestamp': '2026-01-22T17:37:55.513553', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'instance-0000003a', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '15cd70d6-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5839.288067307, 'message_signature': '8ccb87c23cdd2c727e1200b384cebdac9c34cad4e908014622a01c1ea40f8e9e'}]}, 'timestamp': '2026-01-22 17:37:55.528285', '_unique_id': 'b792e0d09ba546c791dbb9aebc3d9144'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.529 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/disk.device.write.latency volume: 3161742743 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '64eed7b0-b4db-4808-ae93-bfbd8a501a01', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3161742743, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03-vda', 'timestamp': '2026-01-22T17:37:55.529993', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'instance-0000003a', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '15cdbfbe-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5839.227474747, 'message_signature': 'cbb89e4d8a1dd19d685ed54322938dfe2e74047bdbd46db8d71e03e874b9b19b'}]}, 'timestamp': '2026-01-22 17:37:55.530247', '_unique_id': 'a7ae30124b5c4659a2b4b0d1f6626cdc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.530 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.531 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.531 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '638320f9-1e32-444c-8506-e02d40fc440f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03-vda', 'timestamp': '2026-01-22T17:37:55.531480', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'instance-0000003a', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '15cdf93e-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5839.246267379, 'message_signature': '2a6e972f97cff2bb46deade7e55e45eea5041cb795635d029366cc06a3e1d5b0'}]}, 'timestamp': '2026-01-22 17:37:55.531737', '_unique_id': '1b7e8fd91af94bcbb8401a46d9a2a744'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.532 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/network.outgoing.bytes volume: 20758 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de285460-e322-44c2-9b62-b9b244002b93', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 20758, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': 'instance-0000003a-585cde24-8038-40b2-97ce-5d30e6ecfc03-tap524266a0-8b', 'timestamp': '2026-01-22T17:37:55.532872', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'tap524266a0-8b', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7a:64:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap524266a0-8b'}, 'message_id': '15ce2fda-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5839.22133812, 'message_signature': '86a12c14463f2f460929c7b9b1e891bd98d469b48c664e693a6d5ee1beaa83f6'}]}, 'timestamp': '2026-01-22 17:37:55.533114', '_unique_id': '1f49ea74825a4249a9049331dcd9078e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.533 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.534 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.534 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.534 12 DEBUG ceilometer.compute.pollsters [-] 585cde24-8038-40b2-97ce-5d30e6ecfc03/cpu volume: 11790000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '951e4204-655b-4db6-bde7-0fed48f8ec14', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11790000000, 'user_id': '7cc2886d6b0e400d8096a810a2159f3c', 'user_name': None, 'project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'project_name': None, 'resource_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'timestamp': '2026-01-22T17:37:55.534309', 'resource_metadata': {'display_name': 'tempest-server-test-575952224', 'name': 'instance-0000003a', 'instance_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'instance_type': 'm1.nano', 'host': 'c510ecfce02bd89641940d8b7b349bfdf40320a3713a71646aafda0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '15ce678e-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5839.288067307, 'message_signature': 'fcbd08a9d22818a1db363909c0a72614aac818f05ed8c049e65f097e02df3995'}]}, 'timestamp': '2026-01-22 17:37:55.534535', '_unique_id': '4a6a7178e095455e86f47eccbc114119'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:37:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:37:55.535 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:37:55 compute-0 nova_compute[183075]: 2026-01-22 17:37:55.843 183079 DEBUG nova.compute.manager [req-13e74aac-6cef-4db8-bcc2-2d3e4437406f req-7789c7b6-7f16-408b-b16f-483028fa219f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Received event network-vif-unplugged-524266a0-8b9b-42d0-9f33-913b53293292 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:37:55 compute-0 nova_compute[183075]: 2026-01-22 17:37:55.844 183079 DEBUG oslo_concurrency.lockutils [req-13e74aac-6cef-4db8-bcc2-2d3e4437406f req-7789c7b6-7f16-408b-b16f-483028fa219f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "585cde24-8038-40b2-97ce-5d30e6ecfc03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:37:55 compute-0 nova_compute[183075]: 2026-01-22 17:37:55.844 183079 DEBUG oslo_concurrency.lockutils [req-13e74aac-6cef-4db8-bcc2-2d3e4437406f req-7789c7b6-7f16-408b-b16f-483028fa219f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "585cde24-8038-40b2-97ce-5d30e6ecfc03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:37:55 compute-0 nova_compute[183075]: 2026-01-22 17:37:55.844 183079 DEBUG oslo_concurrency.lockutils [req-13e74aac-6cef-4db8-bcc2-2d3e4437406f req-7789c7b6-7f16-408b-b16f-483028fa219f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "585cde24-8038-40b2-97ce-5d30e6ecfc03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:37:55 compute-0 nova_compute[183075]: 2026-01-22 17:37:55.844 183079 DEBUG nova.compute.manager [req-13e74aac-6cef-4db8-bcc2-2d3e4437406f req-7789c7b6-7f16-408b-b16f-483028fa219f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] No waiting events found dispatching network-vif-unplugged-524266a0-8b9b-42d0-9f33-913b53293292 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:37:55 compute-0 nova_compute[183075]: 2026-01-22 17:37:55.845 183079 WARNING nova.compute.manager [req-13e74aac-6cef-4db8-bcc2-2d3e4437406f req-7789c7b6-7f16-408b-b16f-483028fa219f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Received unexpected event network-vif-unplugged-524266a0-8b9b-42d0-9f33-913b53293292 for instance with vm_state active and task_state None.
Jan 22 17:37:55 compute-0 nova_compute[183075]: 2026-01-22 17:37:55.986 183079 DEBUG nova.compute.manager [req-7505bb0a-3765-45b0-b541-aa6660716b5f req-984b76ca-bfc4-4ba2-8d3a-346eeb88ce62 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Received event network-vif-unplugged-3a7e69a2-e6a0-4e5f-883e-d116981b58d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:37:55 compute-0 nova_compute[183075]: 2026-01-22 17:37:55.987 183079 DEBUG oslo_concurrency.lockutils [req-7505bb0a-3765-45b0-b541-aa6660716b5f req-984b76ca-bfc4-4ba2-8d3a-346eeb88ce62 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "b7b02307-4481-472f-8bdc-707a9d19f350-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:37:55 compute-0 nova_compute[183075]: 2026-01-22 17:37:55.987 183079 DEBUG oslo_concurrency.lockutils [req-7505bb0a-3765-45b0-b541-aa6660716b5f req-984b76ca-bfc4-4ba2-8d3a-346eeb88ce62 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b7b02307-4481-472f-8bdc-707a9d19f350-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:37:55 compute-0 nova_compute[183075]: 2026-01-22 17:37:55.987 183079 DEBUG oslo_concurrency.lockutils [req-7505bb0a-3765-45b0-b541-aa6660716b5f req-984b76ca-bfc4-4ba2-8d3a-346eeb88ce62 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b7b02307-4481-472f-8bdc-707a9d19f350-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:37:55 compute-0 nova_compute[183075]: 2026-01-22 17:37:55.987 183079 DEBUG nova.compute.manager [req-7505bb0a-3765-45b0-b541-aa6660716b5f req-984b76ca-bfc4-4ba2-8d3a-346eeb88ce62 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] No waiting events found dispatching network-vif-unplugged-3a7e69a2-e6a0-4e5f-883e-d116981b58d9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:37:55 compute-0 nova_compute[183075]: 2026-01-22 17:37:55.988 183079 DEBUG nova.compute.manager [req-7505bb0a-3765-45b0-b541-aa6660716b5f req-984b76ca-bfc4-4ba2-8d3a-346eeb88ce62 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Received event network-vif-unplugged-3a7e69a2-e6a0-4e5f-883e-d116981b58d9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:37:57 compute-0 nova_compute[183075]: 2026-01-22 17:37:57.194 183079 DEBUG nova.network.neutron [-] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:37:57 compute-0 nova_compute[183075]: 2026-01-22 17:37:57.212 183079 INFO nova.compute.manager [-] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Took 3.27 seconds to deallocate network for instance.
Jan 22 17:37:57 compute-0 nova_compute[183075]: 2026-01-22 17:37:57.270 183079 DEBUG oslo_concurrency.lockutils [None req-d0744185-94d0-45f3-a3db-7a1bdb680db2 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:37:57 compute-0 nova_compute[183075]: 2026-01-22 17:37:57.271 183079 DEBUG oslo_concurrency.lockutils [None req-d0744185-94d0-45f3-a3db-7a1bdb680db2 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:37:57 compute-0 nova_compute[183075]: 2026-01-22 17:37:57.346 183079 DEBUG nova.compute.provider_tree [None req-d0744185-94d0-45f3-a3db-7a1bdb680db2 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:37:57 compute-0 nova_compute[183075]: 2026-01-22 17:37:57.364 183079 DEBUG nova.scheduler.client.report [None req-d0744185-94d0-45f3-a3db-7a1bdb680db2 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:37:57 compute-0 nova_compute[183075]: 2026-01-22 17:37:57.385 183079 DEBUG oslo_concurrency.lockutils [None req-d0744185-94d0-45f3-a3db-7a1bdb680db2 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:37:57 compute-0 nova_compute[183075]: 2026-01-22 17:37:57.421 183079 INFO nova.scheduler.client.report [None req-d0744185-94d0-45f3-a3db-7a1bdb680db2 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Deleted allocations for instance b7b02307-4481-472f-8bdc-707a9d19f350
Jan 22 17:37:57 compute-0 nova_compute[183075]: 2026-01-22 17:37:57.492 183079 DEBUG oslo_concurrency.lockutils [None req-d0744185-94d0-45f3-a3db-7a1bdb680db2 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "b7b02307-4481-472f-8bdc-707a9d19f350" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.897s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:37:57 compute-0 nova_compute[183075]: 2026-01-22 17:37:57.961 183079 DEBUG nova.compute.manager [req-95282378-a25d-4459-9aa2-09943a06a891 req-e1339121-7b09-4f27-8414-1fb12b1c5157 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Received event network-vif-plugged-524266a0-8b9b-42d0-9f33-913b53293292 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:37:57 compute-0 nova_compute[183075]: 2026-01-22 17:37:57.962 183079 DEBUG oslo_concurrency.lockutils [req-95282378-a25d-4459-9aa2-09943a06a891 req-e1339121-7b09-4f27-8414-1fb12b1c5157 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "585cde24-8038-40b2-97ce-5d30e6ecfc03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:37:57 compute-0 nova_compute[183075]: 2026-01-22 17:37:57.962 183079 DEBUG oslo_concurrency.lockutils [req-95282378-a25d-4459-9aa2-09943a06a891 req-e1339121-7b09-4f27-8414-1fb12b1c5157 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "585cde24-8038-40b2-97ce-5d30e6ecfc03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:37:57 compute-0 nova_compute[183075]: 2026-01-22 17:37:57.962 183079 DEBUG oslo_concurrency.lockutils [req-95282378-a25d-4459-9aa2-09943a06a891 req-e1339121-7b09-4f27-8414-1fb12b1c5157 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "585cde24-8038-40b2-97ce-5d30e6ecfc03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:37:57 compute-0 nova_compute[183075]: 2026-01-22 17:37:57.962 183079 DEBUG nova.compute.manager [req-95282378-a25d-4459-9aa2-09943a06a891 req-e1339121-7b09-4f27-8414-1fb12b1c5157 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] No waiting events found dispatching network-vif-plugged-524266a0-8b9b-42d0-9f33-913b53293292 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:37:57 compute-0 nova_compute[183075]: 2026-01-22 17:37:57.963 183079 WARNING nova.compute.manager [req-95282378-a25d-4459-9aa2-09943a06a891 req-e1339121-7b09-4f27-8414-1fb12b1c5157 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Received unexpected event network-vif-plugged-524266a0-8b9b-42d0-9f33-913b53293292 for instance with vm_state active and task_state None.
Jan 22 17:37:57 compute-0 nova_compute[183075]: 2026-01-22 17:37:57.963 183079 DEBUG nova.compute.manager [req-95282378-a25d-4459-9aa2-09943a06a891 req-e1339121-7b09-4f27-8414-1fb12b1c5157 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Received event network-vif-plugged-524266a0-8b9b-42d0-9f33-913b53293292 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:37:57 compute-0 nova_compute[183075]: 2026-01-22 17:37:57.963 183079 DEBUG oslo_concurrency.lockutils [req-95282378-a25d-4459-9aa2-09943a06a891 req-e1339121-7b09-4f27-8414-1fb12b1c5157 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "585cde24-8038-40b2-97ce-5d30e6ecfc03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:37:57 compute-0 nova_compute[183075]: 2026-01-22 17:37:57.963 183079 DEBUG oslo_concurrency.lockutils [req-95282378-a25d-4459-9aa2-09943a06a891 req-e1339121-7b09-4f27-8414-1fb12b1c5157 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "585cde24-8038-40b2-97ce-5d30e6ecfc03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:37:57 compute-0 nova_compute[183075]: 2026-01-22 17:37:57.963 183079 DEBUG oslo_concurrency.lockutils [req-95282378-a25d-4459-9aa2-09943a06a891 req-e1339121-7b09-4f27-8414-1fb12b1c5157 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "585cde24-8038-40b2-97ce-5d30e6ecfc03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:37:57 compute-0 nova_compute[183075]: 2026-01-22 17:37:57.964 183079 DEBUG nova.compute.manager [req-95282378-a25d-4459-9aa2-09943a06a891 req-e1339121-7b09-4f27-8414-1fb12b1c5157 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] No waiting events found dispatching network-vif-plugged-524266a0-8b9b-42d0-9f33-913b53293292 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:37:57 compute-0 nova_compute[183075]: 2026-01-22 17:37:57.964 183079 WARNING nova.compute.manager [req-95282378-a25d-4459-9aa2-09943a06a891 req-e1339121-7b09-4f27-8414-1fb12b1c5157 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Received unexpected event network-vif-plugged-524266a0-8b9b-42d0-9f33-913b53293292 for instance with vm_state active and task_state None.
Jan 22 17:37:57 compute-0 nova_compute[183075]: 2026-01-22 17:37:57.964 183079 DEBUG nova.compute.manager [req-95282378-a25d-4459-9aa2-09943a06a891 req-e1339121-7b09-4f27-8414-1fb12b1c5157 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Received event network-vif-plugged-524266a0-8b9b-42d0-9f33-913b53293292 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:37:57 compute-0 nova_compute[183075]: 2026-01-22 17:37:57.964 183079 DEBUG oslo_concurrency.lockutils [req-95282378-a25d-4459-9aa2-09943a06a891 req-e1339121-7b09-4f27-8414-1fb12b1c5157 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "585cde24-8038-40b2-97ce-5d30e6ecfc03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:37:57 compute-0 nova_compute[183075]: 2026-01-22 17:37:57.964 183079 DEBUG oslo_concurrency.lockutils [req-95282378-a25d-4459-9aa2-09943a06a891 req-e1339121-7b09-4f27-8414-1fb12b1c5157 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "585cde24-8038-40b2-97ce-5d30e6ecfc03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:37:57 compute-0 nova_compute[183075]: 2026-01-22 17:37:57.965 183079 DEBUG oslo_concurrency.lockutils [req-95282378-a25d-4459-9aa2-09943a06a891 req-e1339121-7b09-4f27-8414-1fb12b1c5157 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "585cde24-8038-40b2-97ce-5d30e6ecfc03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:37:57 compute-0 nova_compute[183075]: 2026-01-22 17:37:57.965 183079 DEBUG nova.compute.manager [req-95282378-a25d-4459-9aa2-09943a06a891 req-e1339121-7b09-4f27-8414-1fb12b1c5157 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] No waiting events found dispatching network-vif-plugged-524266a0-8b9b-42d0-9f33-913b53293292 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:37:57 compute-0 nova_compute[183075]: 2026-01-22 17:37:57.965 183079 WARNING nova.compute.manager [req-95282378-a25d-4459-9aa2-09943a06a891 req-e1339121-7b09-4f27-8414-1fb12b1c5157 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Received unexpected event network-vif-plugged-524266a0-8b9b-42d0-9f33-913b53293292 for instance with vm_state active and task_state None.
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.058 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.065 183079 DEBUG nova.compute.manager [req-6d532bf7-7835-4efa-9a7a-0263a45448f9 req-c3b17ad2-01ad-4e63-803c-a167bb7dfdb2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Received event network-vif-plugged-3a7e69a2-e6a0-4e5f-883e-d116981b58d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.065 183079 DEBUG oslo_concurrency.lockutils [req-6d532bf7-7835-4efa-9a7a-0263a45448f9 req-c3b17ad2-01ad-4e63-803c-a167bb7dfdb2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "b7b02307-4481-472f-8bdc-707a9d19f350-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.066 183079 DEBUG oslo_concurrency.lockutils [req-6d532bf7-7835-4efa-9a7a-0263a45448f9 req-c3b17ad2-01ad-4e63-803c-a167bb7dfdb2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b7b02307-4481-472f-8bdc-707a9d19f350-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.066 183079 DEBUG oslo_concurrency.lockutils [req-6d532bf7-7835-4efa-9a7a-0263a45448f9 req-c3b17ad2-01ad-4e63-803c-a167bb7dfdb2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "b7b02307-4481-472f-8bdc-707a9d19f350-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.066 183079 DEBUG nova.compute.manager [req-6d532bf7-7835-4efa-9a7a-0263a45448f9 req-c3b17ad2-01ad-4e63-803c-a167bb7dfdb2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] No waiting events found dispatching network-vif-plugged-3a7e69a2-e6a0-4e5f-883e-d116981b58d9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.066 183079 WARNING nova.compute.manager [req-6d532bf7-7835-4efa-9a7a-0263a45448f9 req-c3b17ad2-01ad-4e63-803c-a167bb7dfdb2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Received unexpected event network-vif-plugged-3a7e69a2-e6a0-4e5f-883e-d116981b58d9 for instance with vm_state deleted and task_state None.
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.115 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.132 183079 DEBUG oslo_concurrency.lockutils [None req-c31d59fb-6e45-4d7a-b179-a1a1f71d4c1e 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Acquiring lock "585cde24-8038-40b2-97ce-5d30e6ecfc03" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.132 183079 DEBUG oslo_concurrency.lockutils [None req-c31d59fb-6e45-4d7a-b179-a1a1f71d4c1e 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "585cde24-8038-40b2-97ce-5d30e6ecfc03" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.133 183079 DEBUG oslo_concurrency.lockutils [None req-c31d59fb-6e45-4d7a-b179-a1a1f71d4c1e 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Acquiring lock "585cde24-8038-40b2-97ce-5d30e6ecfc03-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.133 183079 DEBUG oslo_concurrency.lockutils [None req-c31d59fb-6e45-4d7a-b179-a1a1f71d4c1e 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "585cde24-8038-40b2-97ce-5d30e6ecfc03-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.133 183079 DEBUG oslo_concurrency.lockutils [None req-c31d59fb-6e45-4d7a-b179-a1a1f71d4c1e 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "585cde24-8038-40b2-97ce-5d30e6ecfc03-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.134 183079 INFO nova.compute.manager [None req-c31d59fb-6e45-4d7a-b179-a1a1f71d4c1e 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Terminating instance
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.135 183079 DEBUG nova.compute.manager [None req-c31d59fb-6e45-4d7a-b179-a1a1f71d4c1e 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:37:58 compute-0 kernel: tap524266a0-8b (unregistering): left promiscuous mode
Jan 22 17:37:58 compute-0 NetworkManager[55454]: <info>  [1769103478.1661] device (tap524266a0-8b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.174 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:58 compute-0 ovn_controller[95372]: 2026-01-22T17:37:58Z|00696|binding|INFO|Releasing lport 524266a0-8b9b-42d0-9f33-913b53293292 from this chassis (sb_readonly=0)
Jan 22 17:37:58 compute-0 ovn_controller[95372]: 2026-01-22T17:37:58Z|00697|binding|INFO|Setting lport 524266a0-8b9b-42d0-9f33-913b53293292 down in Southbound
Jan 22 17:37:58 compute-0 ovn_controller[95372]: 2026-01-22T17:37:58Z|00698|binding|INFO|Removing iface tap524266a0-8b ovn-installed in OVS
Jan 22 17:37:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:58.182 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7a:64:5f 10.100.0.6'], port_security=['fa:16:3e:7a:64:5f 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TrunkTest-1480081563', 'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '585cde24-8038-40b2-97ce-5d30e6ecfc03', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f8ae4b18-347c-4ff3-b9f8-578518ecd408', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TrunkTest-1480081563', 'neutron:project_id': 'e92551896d2c49b5b149b1a5a0cc1761', 'neutron:revision_number': '6', 'neutron:security_group_ids': '709a6363-b0f8-4821-8d51-15be786bf9b0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.231', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=43bd6944-fad9-4457-8b9b-34db3859c385, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=524266a0-8b9b-42d0-9f33-913b53293292) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:37:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:58.183 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 524266a0-8b9b-42d0-9f33-913b53293292 in datapath f8ae4b18-347c-4ff3-b9f8-578518ecd408 unbound from our chassis
Jan 22 17:37:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:58.184 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f8ae4b18-347c-4ff3-b9f8-578518ecd408, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:37:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:58.185 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[aa2ba431-725d-4c55-8310-fcf0e62745d7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:58.186 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f8ae4b18-347c-4ff3-b9f8-578518ecd408 namespace which is not needed anymore
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.195 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:58 compute-0 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d0000003a.scope: Deactivated successfully.
Jan 22 17:37:58 compute-0 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d0000003a.scope: Consumed 18.226s CPU time.
Jan 22 17:37:58 compute-0 systemd-machined[154382]: Machine qemu-58-instance-0000003a terminated.
Jan 22 17:37:58 compute-0 podman[236282]: 2026-01-22 17:37:58.268267341 +0000 UTC m=+0.060908768 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:37:58 compute-0 neutron-haproxy-ovnmeta-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235119]: [NOTICE]   (235123) : haproxy version is 2.8.14-c23fe91
Jan 22 17:37:58 compute-0 neutron-haproxy-ovnmeta-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235119]: [NOTICE]   (235123) : path to executable is /usr/sbin/haproxy
Jan 22 17:37:58 compute-0 neutron-haproxy-ovnmeta-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235119]: [WARNING]  (235123) : Exiting Master process...
Jan 22 17:37:58 compute-0 neutron-haproxy-ovnmeta-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235119]: [WARNING]  (235123) : Exiting Master process...
Jan 22 17:37:58 compute-0 neutron-haproxy-ovnmeta-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235119]: [ALERT]    (235123) : Current worker (235125) exited with code 143 (Terminated)
Jan 22 17:37:58 compute-0 neutron-haproxy-ovnmeta-f8ae4b18-347c-4ff3-b9f8-578518ecd408[235119]: [WARNING]  (235123) : All workers exited. Exiting... (0)
Jan 22 17:37:58 compute-0 systemd[1]: libpod-08105ddfddf52ea2379f87a831c03264ce41b8b638a32a680756ac44900847f6.scope: Deactivated successfully.
Jan 22 17:37:58 compute-0 podman[236327]: 2026-01-22 17:37:58.328753668 +0000 UTC m=+0.045578502 container died 08105ddfddf52ea2379f87a831c03264ce41b8b638a32a680756ac44900847f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8ae4b18-347c-4ff3-b9f8-578518ecd408, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:37:58 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-08105ddfddf52ea2379f87a831c03264ce41b8b638a32a680756ac44900847f6-userdata-shm.mount: Deactivated successfully.
Jan 22 17:37:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-26091b2e1adf4c1ad361c15ff1e47998acde14bd242e4517f5923975a4329923-merged.mount: Deactivated successfully.
Jan 22 17:37:58 compute-0 podman[236327]: 2026-01-22 17:37:58.372742905 +0000 UTC m=+0.089567739 container cleanup 08105ddfddf52ea2379f87a831c03264ce41b8b638a32a680756ac44900847f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8ae4b18-347c-4ff3-b9f8-578518ecd408, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 17:37:58 compute-0 systemd[1]: libpod-conmon-08105ddfddf52ea2379f87a831c03264ce41b8b638a32a680756ac44900847f6.scope: Deactivated successfully.
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.383 183079 INFO nova.virt.libvirt.driver [-] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Instance destroyed successfully.
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.384 183079 DEBUG nova.objects.instance [None req-c31d59fb-6e45-4d7a-b179-a1a1f71d4c1e 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lazy-loading 'resources' on Instance uuid 585cde24-8038-40b2-97ce-5d30e6ecfc03 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.397 183079 DEBUG nova.virt.libvirt.vif [None req-c31d59fb-6e45-4d7a-b179-a1a1f71d4c1e 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:35:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-575952224',display_name='tempest-server-test-575952224',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-575952224',id=58,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMx6tQlM3uv2ZwDgQasADvvbGd5zBlSeFHyt6pKaXPuo5g1lBpnMysyabNjP8htj/tP0P4meLZoYHTsZRxp2O0FBGUiyAm9KZdR/DNDaP0hn5KYm00UnMkjIWdjBNqhB9Q==',key_name='tempest-TrunkTest-1480081563',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:35:25Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e92551896d2c49b5b149b1a5a0cc1761',ramdisk_id='',reservation_id='r-qr4fehoi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TrunkTest-252091256',owner_user_name='tempest-TrunkTest-252091256-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:35:25Z,user_data=None,user_id='7cc2886d6b0e400d8096a810a2159f3c',uuid=585cde24-8038-40b2-97ce-5d30e6ecfc03,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "524266a0-8b9b-42d0-9f33-913b53293292", "address": "fa:16:3e:7a:64:5f", "network": {"id": "f8ae4b18-347c-4ff3-b9f8-578518ecd408", "bridge": "br-int", "label": "tempest-TrunkTest-1480081563", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e92551896d2c49b5b149b1a5a0cc1761", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap524266a0-8b", "ovs_interfaceid": "524266a0-8b9b-42d0-9f33-913b53293292", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.397 183079 DEBUG nova.network.os_vif_util [None req-c31d59fb-6e45-4d7a-b179-a1a1f71d4c1e 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Converting VIF {"id": "524266a0-8b9b-42d0-9f33-913b53293292", "address": "fa:16:3e:7a:64:5f", "network": {"id": "f8ae4b18-347c-4ff3-b9f8-578518ecd408", "bridge": "br-int", "label": "tempest-TrunkTest-1480081563", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e92551896d2c49b5b149b1a5a0cc1761", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap524266a0-8b", "ovs_interfaceid": "524266a0-8b9b-42d0-9f33-913b53293292", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.398 183079 DEBUG nova.network.os_vif_util [None req-c31d59fb-6e45-4d7a-b179-a1a1f71d4c1e 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7a:64:5f,bridge_name='br-int',has_traffic_filtering=True,id=524266a0-8b9b-42d0-9f33-913b53293292,network=Network(f8ae4b18-347c-4ff3-b9f8-578518ecd408),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap524266a0-8b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.398 183079 DEBUG os_vif [None req-c31d59fb-6e45-4d7a-b179-a1a1f71d4c1e 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7a:64:5f,bridge_name='br-int',has_traffic_filtering=True,id=524266a0-8b9b-42d0-9f33-913b53293292,network=Network(f8ae4b18-347c-4ff3-b9f8-578518ecd408),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap524266a0-8b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.399 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.399 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap524266a0-8b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.401 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.402 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.404 183079 INFO os_vif [None req-c31d59fb-6e45-4d7a-b179-a1a1f71d4c1e 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7a:64:5f,bridge_name='br-int',has_traffic_filtering=True,id=524266a0-8b9b-42d0-9f33-913b53293292,network=Network(f8ae4b18-347c-4ff3-b9f8-578518ecd408),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap524266a0-8b')
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.404 183079 INFO nova.virt.libvirt.driver [None req-c31d59fb-6e45-4d7a-b179-a1a1f71d4c1e 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Deleting instance files /var/lib/nova/instances/585cde24-8038-40b2-97ce-5d30e6ecfc03_del
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.405 183079 INFO nova.virt.libvirt.driver [None req-c31d59fb-6e45-4d7a-b179-a1a1f71d4c1e 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Deletion of /var/lib/nova/instances/585cde24-8038-40b2-97ce-5d30e6ecfc03_del complete
Jan 22 17:37:58 compute-0 podman[236370]: 2026-01-22 17:37:58.436055789 +0000 UTC m=+0.041078759 container remove 08105ddfddf52ea2379f87a831c03264ce41b8b638a32a680756ac44900847f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8ae4b18-347c-4ff3-b9f8-578518ecd408, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:37:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:58.440 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e00eb388-b96d-4100-94ee-cb15cd28bd83]: (4, ('Thu Jan 22 05:37:58 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f8ae4b18-347c-4ff3-b9f8-578518ecd408 (08105ddfddf52ea2379f87a831c03264ce41b8b638a32a680756ac44900847f6)\n08105ddfddf52ea2379f87a831c03264ce41b8b638a32a680756ac44900847f6\nThu Jan 22 05:37:58 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f8ae4b18-347c-4ff3-b9f8-578518ecd408 (08105ddfddf52ea2379f87a831c03264ce41b8b638a32a680756ac44900847f6)\n08105ddfddf52ea2379f87a831c03264ce41b8b638a32a680756ac44900847f6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:58.442 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ba25ae34-c571-4846-aa02-bb1c2b352840]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:58.443 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf8ae4b18-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.444 183079 INFO nova.compute.manager [None req-c31d59fb-6e45-4d7a-b179-a1a1f71d4c1e 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Took 0.31 seconds to destroy the instance on the hypervisor.
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.445 183079 DEBUG oslo.service.loopingcall [None req-c31d59fb-6e45-4d7a-b179-a1a1f71d4c1e 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:37:58 compute-0 kernel: tapf8ae4b18-30: left promiscuous mode
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.445 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.447 183079 DEBUG nova.compute.manager [-] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.447 183079 DEBUG nova.network.neutron [-] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:37:58 compute-0 nova_compute[183075]: 2026-01-22 17:37:58.455 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:37:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:58.458 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[853b43d1-cda0-44ad-a9f6-7d9bafbe55ee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:58.476 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[88440dee-96b0-4dbf-8589-99ee8910e63f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:58.477 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[dbca31f5-c86e-4412-940a-2e700a75c4e4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:58.492 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b1052a5b-536b-40c4-a3f5-686a73bf7026]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 568864, 'reachable_time': 39476, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236386, 'error': None, 'target': 'ovnmeta-f8ae4b18-347c-4ff3-b9f8-578518ecd408', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:58.495 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f8ae4b18-347c-4ff3-b9f8-578518ecd408 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:37:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:37:58.495 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[8604a390-4de2-4a91-b322-6533986bbe26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:37:58 compute-0 systemd[1]: run-netns-ovnmeta\x2df8ae4b18\x2d347c\x2d4ff3\x2db9f8\x2d578518ecd408.mount: Deactivated successfully.
Jan 22 17:38:00 compute-0 sshd-session[236245]: Connection reset by authenticating user root 176.120.22.47 port 54814 [preauth]
Jan 22 17:38:00 compute-0 sshd-session[236244]: Connection reset by authenticating user root 176.120.22.47 port 54804 [preauth]
Jan 22 17:38:02 compute-0 podman[236390]: 2026-01-22 17:38:02.336489857 +0000 UTC m=+0.045830648 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:38:03 compute-0 nova_compute[183075]: 2026-01-22 17:38:03.114 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:03 compute-0 nova_compute[183075]: 2026-01-22 17:38:03.322 183079 DEBUG nova.network.neutron [-] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:38:03 compute-0 nova_compute[183075]: 2026-01-22 17:38:03.340 183079 INFO nova.compute.manager [-] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Took 4.89 seconds to deallocate network for instance.
Jan 22 17:38:03 compute-0 nova_compute[183075]: 2026-01-22 17:38:03.383 183079 DEBUG oslo_concurrency.lockutils [None req-c31d59fb-6e45-4d7a-b179-a1a1f71d4c1e 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:38:03 compute-0 nova_compute[183075]: 2026-01-22 17:38:03.383 183079 DEBUG oslo_concurrency.lockutils [None req-c31d59fb-6e45-4d7a-b179-a1a1f71d4c1e 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:38:03 compute-0 nova_compute[183075]: 2026-01-22 17:38:03.401 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:03 compute-0 nova_compute[183075]: 2026-01-22 17:38:03.434 183079 DEBUG nova.compute.provider_tree [None req-c31d59fb-6e45-4d7a-b179-a1a1f71d4c1e 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:38:03 compute-0 nova_compute[183075]: 2026-01-22 17:38:03.448 183079 DEBUG nova.scheduler.client.report [None req-c31d59fb-6e45-4d7a-b179-a1a1f71d4c1e 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:38:03 compute-0 nova_compute[183075]: 2026-01-22 17:38:03.473 183079 DEBUG oslo_concurrency.lockutils [None req-c31d59fb-6e45-4d7a-b179-a1a1f71d4c1e 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.090s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:38:03 compute-0 nova_compute[183075]: 2026-01-22 17:38:03.499 183079 INFO nova.scheduler.client.report [None req-c31d59fb-6e45-4d7a-b179-a1a1f71d4c1e 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Deleted allocations for instance 585cde24-8038-40b2-97ce-5d30e6ecfc03
Jan 22 17:38:03 compute-0 nova_compute[183075]: 2026-01-22 17:38:03.580 183079 DEBUG oslo_concurrency.lockutils [None req-c31d59fb-6e45-4d7a-b179-a1a1f71d4c1e 7cc2886d6b0e400d8096a810a2159f3c e92551896d2c49b5b149b1a5a0cc1761 - - default default] Lock "585cde24-8038-40b2-97ce-5d30e6ecfc03" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.447s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:38:07 compute-0 nova_compute[183075]: 2026-01-22 17:38:07.326 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:07 compute-0 sshd-session[236388]: Connection reset by authenticating user root 176.120.22.47 port 35542 [preauth]
Jan 22 17:38:08 compute-0 nova_compute[183075]: 2026-01-22 17:38:08.116 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:08 compute-0 nova_compute[183075]: 2026-01-22 17:38:08.403 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:08 compute-0 nova_compute[183075]: 2026-01-22 17:38:08.863 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769103473.8611498, b7b02307-4481-472f-8bdc-707a9d19f350 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:38:08 compute-0 nova_compute[183075]: 2026-01-22 17:38:08.863 183079 INFO nova.compute.manager [-] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] VM Stopped (Lifecycle Event)
Jan 22 17:38:08 compute-0 nova_compute[183075]: 2026-01-22 17:38:08.915 183079 DEBUG nova.compute.manager [None req-96a21654-d002-4848-a3ff-6e4c3b5c40da - - - - - -] [instance: b7b02307-4481-472f-8bdc-707a9d19f350] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:38:13 compute-0 nova_compute[183075]: 2026-01-22 17:38:13.118 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:13 compute-0 podman[236418]: 2026-01-22 17:38:13.199322569 +0000 UTC m=+0.047905225 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 17:38:13 compute-0 podman[236419]: 2026-01-22 17:38:13.217540475 +0000 UTC m=+0.060239541 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, config_id=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-type=git, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.description=The Universal 
Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 17:38:13 compute-0 podman[236417]: 2026-01-22 17:38:13.266524299 +0000 UTC m=+0.116470472 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_controller, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:38:13 compute-0 nova_compute[183075]: 2026-01-22 17:38:13.381 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769103478.3805768, 585cde24-8038-40b2-97ce-5d30e6ecfc03 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:38:13 compute-0 nova_compute[183075]: 2026-01-22 17:38:13.382 183079 INFO nova.compute.manager [-] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] VM Stopped (Lifecycle Event)
Jan 22 17:38:13 compute-0 nova_compute[183075]: 2026-01-22 17:38:13.404 183079 DEBUG nova.compute.manager [None req-770ad9dd-e436-408b-b333-22dcfd9005c0 - - - - - -] [instance: 585cde24-8038-40b2-97ce-5d30e6ecfc03] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:38:13 compute-0 nova_compute[183075]: 2026-01-22 17:38:13.405 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:14 compute-0 nova_compute[183075]: 2026-01-22 17:38:14.360 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:14 compute-0 nova_compute[183075]: 2026-01-22 17:38:14.505 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.026 183079 DEBUG oslo_concurrency.lockutils [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "fb0d62d8-3d6f-4fa5-b342-612c69890cdf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.026 183079 DEBUG oslo_concurrency.lockutils [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "fb0d62d8-3d6f-4fa5-b342-612c69890cdf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.042 183079 DEBUG nova.compute.manager [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.115 183079 DEBUG oslo_concurrency.lockutils [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.116 183079 DEBUG oslo_concurrency.lockutils [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.124 183079 DEBUG nova.virt.hardware [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.124 183079 INFO nova.compute.claims [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.237 183079 DEBUG nova.compute.provider_tree [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.250 183079 DEBUG nova.scheduler.client.report [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.270 183079 DEBUG oslo_concurrency.lockutils [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.271 183079 DEBUG nova.compute.manager [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.313 183079 DEBUG nova.compute.manager [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.313 183079 DEBUG nova.network.neutron [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.329 183079 INFO nova.virt.libvirt.driver [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.343 183079 DEBUG nova.compute.manager [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.438 183079 DEBUG nova.compute.manager [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.439 183079 DEBUG nova.virt.libvirt.driver [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.440 183079 INFO nova.virt.libvirt.driver [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Creating image(s)
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.441 183079 DEBUG oslo_concurrency.lockutils [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "/var/lib/nova/instances/fb0d62d8-3d6f-4fa5-b342-612c69890cdf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.441 183079 DEBUG oslo_concurrency.lockutils [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/fb0d62d8-3d6f-4fa5-b342-612c69890cdf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.442 183079 DEBUG oslo_concurrency.lockutils [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/fb0d62d8-3d6f-4fa5-b342-612c69890cdf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.457 183079 DEBUG oslo_concurrency.processutils [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.511 183079 DEBUG nova.policy [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.542 183079 DEBUG oslo_concurrency.processutils [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.543 183079 DEBUG oslo_concurrency.lockutils [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.544 183079 DEBUG oslo_concurrency.lockutils [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.561 183079 DEBUG oslo_concurrency.processutils [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.616 183079 DEBUG oslo_concurrency.processutils [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.617 183079 DEBUG oslo_concurrency.processutils [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/fb0d62d8-3d6f-4fa5-b342-612c69890cdf/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.714 183079 DEBUG oslo_concurrency.processutils [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/fb0d62d8-3d6f-4fa5-b342-612c69890cdf/disk 1073741824" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.715 183079 DEBUG oslo_concurrency.lockutils [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.715 183079 DEBUG oslo_concurrency.processutils [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.770 183079 DEBUG oslo_concurrency.processutils [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.771 183079 DEBUG nova.virt.disk.api [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Checking if we can resize image /var/lib/nova/instances/fb0d62d8-3d6f-4fa5-b342-612c69890cdf/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.772 183079 DEBUG oslo_concurrency.processutils [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fb0d62d8-3d6f-4fa5-b342-612c69890cdf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.825 183079 DEBUG oslo_concurrency.processutils [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fb0d62d8-3d6f-4fa5-b342-612c69890cdf/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.827 183079 DEBUG nova.virt.disk.api [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Cannot resize image /var/lib/nova/instances/fb0d62d8-3d6f-4fa5-b342-612c69890cdf/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.827 183079 DEBUG nova.objects.instance [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'migration_context' on Instance uuid fb0d62d8-3d6f-4fa5-b342-612c69890cdf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.842 183079 DEBUG nova.virt.libvirt.driver [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.843 183079 DEBUG nova.virt.libvirt.driver [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Ensure instance console log exists: /var/lib/nova/instances/fb0d62d8-3d6f-4fa5-b342-612c69890cdf/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.844 183079 DEBUG oslo_concurrency.lockutils [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.844 183079 DEBUG oslo_concurrency.lockutils [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:38:15 compute-0 nova_compute[183075]: 2026-01-22 17:38:15.844 183079 DEBUG oslo_concurrency.lockutils [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:38:16 compute-0 sshd-session[236415]: Connection reset by authenticating user root 176.120.22.47 port 35578 [preauth]
Jan 22 17:38:17 compute-0 nova_compute[183075]: 2026-01-22 17:38:17.304 183079 DEBUG nova.network.neutron [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Successfully created port: 70a900f0-6d2e-40bb-92fa-e43967095d17 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:38:17 compute-0 sshd-session[236243]: error: kex_exchange_identification: read: Connection reset by peer
Jan 22 17:38:17 compute-0 sshd-session[236243]: Connection reset by 176.120.22.47 port 51652
Jan 22 17:38:18 compute-0 nova_compute[183075]: 2026-01-22 17:38:18.120 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:18 compute-0 nova_compute[183075]: 2026-01-22 17:38:18.327 183079 DEBUG nova.network.neutron [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Successfully updated port: 70a900f0-6d2e-40bb-92fa-e43967095d17 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:38:18 compute-0 nova_compute[183075]: 2026-01-22 17:38:18.343 183079 DEBUG oslo_concurrency.lockutils [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "refresh_cache-fb0d62d8-3d6f-4fa5-b342-612c69890cdf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:38:18 compute-0 nova_compute[183075]: 2026-01-22 17:38:18.343 183079 DEBUG oslo_concurrency.lockutils [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquired lock "refresh_cache-fb0d62d8-3d6f-4fa5-b342-612c69890cdf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:38:18 compute-0 nova_compute[183075]: 2026-01-22 17:38:18.344 183079 DEBUG nova.network.neutron [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:38:18 compute-0 podman[236499]: 2026-01-22 17:38:18.397916528 +0000 UTC m=+0.101242177 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 17:38:18 compute-0 nova_compute[183075]: 2026-01-22 17:38:18.407 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:18 compute-0 nova_compute[183075]: 2026-01-22 17:38:18.473 183079 DEBUG nova.compute.manager [req-ec9f20bb-aa4c-433f-b94e-fef22b3dd89b req-b73ae11a-7a4d-4ab0-bbe6-728c53f16f19 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Received event network-changed-70a900f0-6d2e-40bb-92fa-e43967095d17 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:38:18 compute-0 nova_compute[183075]: 2026-01-22 17:38:18.473 183079 DEBUG nova.compute.manager [req-ec9f20bb-aa4c-433f-b94e-fef22b3dd89b req-b73ae11a-7a4d-4ab0-bbe6-728c53f16f19 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Refreshing instance network info cache due to event network-changed-70a900f0-6d2e-40bb-92fa-e43967095d17. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:38:18 compute-0 nova_compute[183075]: 2026-01-22 17:38:18.474 183079 DEBUG oslo_concurrency.lockutils [req-ec9f20bb-aa4c-433f-b94e-fef22b3dd89b req-b73ae11a-7a4d-4ab0-bbe6-728c53f16f19 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-fb0d62d8-3d6f-4fa5-b342-612c69890cdf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:38:18 compute-0 nova_compute[183075]: 2026-01-22 17:38:18.576 183079 DEBUG nova.network.neutron [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.533 183079 DEBUG nova.network.neutron [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Updating instance_info_cache with network_info: [{"id": "70a900f0-6d2e-40bb-92fa-e43967095d17", "address": "fa:16:3e:6d:f4:ff", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70a900f0-6d", "ovs_interfaceid": "70a900f0-6d2e-40bb-92fa-e43967095d17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.558 183079 DEBUG oslo_concurrency.lockutils [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Releasing lock "refresh_cache-fb0d62d8-3d6f-4fa5-b342-612c69890cdf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.559 183079 DEBUG nova.compute.manager [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Instance network_info: |[{"id": "70a900f0-6d2e-40bb-92fa-e43967095d17", "address": "fa:16:3e:6d:f4:ff", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70a900f0-6d", "ovs_interfaceid": "70a900f0-6d2e-40bb-92fa-e43967095d17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.559 183079 DEBUG oslo_concurrency.lockutils [req-ec9f20bb-aa4c-433f-b94e-fef22b3dd89b req-b73ae11a-7a4d-4ab0-bbe6-728c53f16f19 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-fb0d62d8-3d6f-4fa5-b342-612c69890cdf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.559 183079 DEBUG nova.network.neutron [req-ec9f20bb-aa4c-433f-b94e-fef22b3dd89b req-b73ae11a-7a4d-4ab0-bbe6-728c53f16f19 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Refreshing network info cache for port 70a900f0-6d2e-40bb-92fa-e43967095d17 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.562 183079 DEBUG nova.virt.libvirt.driver [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Start _get_guest_xml network_info=[{"id": "70a900f0-6d2e-40bb-92fa-e43967095d17", "address": "fa:16:3e:6d:f4:ff", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70a900f0-6d", "ovs_interfaceid": "70a900f0-6d2e-40bb-92fa-e43967095d17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.567 183079 WARNING nova.virt.libvirt.driver [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.572 183079 DEBUG nova.virt.libvirt.host [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.573 183079 DEBUG nova.virt.libvirt.host [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.578 183079 DEBUG nova.virt.libvirt.host [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.579 183079 DEBUG nova.virt.libvirt.host [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.579 183079 DEBUG nova.virt.libvirt.driver [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.579 183079 DEBUG nova.virt.hardware [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.580 183079 DEBUG nova.virt.hardware [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.580 183079 DEBUG nova.virt.hardware [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.580 183079 DEBUG nova.virt.hardware [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.580 183079 DEBUG nova.virt.hardware [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.580 183079 DEBUG nova.virt.hardware [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.580 183079 DEBUG nova.virt.hardware [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.580 183079 DEBUG nova.virt.hardware [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.581 183079 DEBUG nova.virt.hardware [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.581 183079 DEBUG nova.virt.hardware [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.581 183079 DEBUG nova.virt.hardware [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.584 183079 DEBUG nova.virt.libvirt.vif [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:38:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1655858819',display_name='tempest-server-test-1655858819',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1655858819',id=61,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-oqt0h1xd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:38:15Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=fb0d62d8-3d6f-4fa5-b342-612c69890cdf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "70a900f0-6d2e-40bb-92fa-e43967095d17", "address": "fa:16:3e:6d:f4:ff", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70a900f0-6d", "ovs_interfaceid": "70a900f0-6d2e-40bb-92fa-e43967095d17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.585 183079 DEBUG nova.network.os_vif_util [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "70a900f0-6d2e-40bb-92fa-e43967095d17", "address": "fa:16:3e:6d:f4:ff", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70a900f0-6d", "ovs_interfaceid": "70a900f0-6d2e-40bb-92fa-e43967095d17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.585 183079 DEBUG nova.network.os_vif_util [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:f4:ff,bridge_name='br-int',has_traffic_filtering=True,id=70a900f0-6d2e-40bb-92fa-e43967095d17,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70a900f0-6d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.586 183079 DEBUG nova.objects.instance [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'pci_devices' on Instance uuid fb0d62d8-3d6f-4fa5-b342-612c69890cdf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.661 183079 DEBUG nova.virt.libvirt.driver [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:38:19 compute-0 nova_compute[183075]:   <uuid>fb0d62d8-3d6f-4fa5-b342-612c69890cdf</uuid>
Jan 22 17:38:19 compute-0 nova_compute[183075]:   <name>instance-0000003d</name>
Jan 22 17:38:19 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:38:19 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:38:19 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:38:19 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1655858819</nova:name>
Jan 22 17:38:19 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:38:19</nova:creationTime>
Jan 22 17:38:19 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:38:19 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:38:19 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:38:19 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:38:19 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:38:19 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:38:19 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:38:19 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:38:19 compute-0 nova_compute[183075]:         <nova:user uuid="66c24efd0cd24c57803ea8508e679d06">tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member</nova:user>
Jan 22 17:38:19 compute-0 nova_compute[183075]:         <nova:project uuid="cbff5cec4b9c4d2eb317124ca20fea9f">tempest-StatelessNetworkSecGroupIPv4Test-1348485316</nova:project>
Jan 22 17:38:19 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:38:19 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:38:19 compute-0 nova_compute[183075]:         <nova:port uuid="70a900f0-6d2e-40bb-92fa-e43967095d17">
Jan 22 17:38:19 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:38:19 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:38:19 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:38:19 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <system>
Jan 22 17:38:19 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:38:19 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:38:19 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:38:19 compute-0 nova_compute[183075]:       <entry name="serial">fb0d62d8-3d6f-4fa5-b342-612c69890cdf</entry>
Jan 22 17:38:19 compute-0 nova_compute[183075]:       <entry name="uuid">fb0d62d8-3d6f-4fa5-b342-612c69890cdf</entry>
Jan 22 17:38:19 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     </system>
Jan 22 17:38:19 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:38:19 compute-0 nova_compute[183075]:   <os>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:   </os>
Jan 22 17:38:19 compute-0 nova_compute[183075]:   <features>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:   </features>
Jan 22 17:38:19 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:38:19 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:38:19 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:38:19 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/fb0d62d8-3d6f-4fa5-b342-612c69890cdf/disk"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:38:19 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:6d:f4:ff"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:       <target dev="tap70a900f0-6d"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:38:19 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/fb0d62d8-3d6f-4fa5-b342-612c69890cdf/console.log" append="off"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <video>
Jan 22 17:38:19 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     </video>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:38:19 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:38:19 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:38:19 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:38:19 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:38:19 compute-0 nova_compute[183075]: </domain>
Jan 22 17:38:19 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.662 183079 DEBUG nova.compute.manager [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Preparing to wait for external event network-vif-plugged-70a900f0-6d2e-40bb-92fa-e43967095d17 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.662 183079 DEBUG oslo_concurrency.lockutils [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "fb0d62d8-3d6f-4fa5-b342-612c69890cdf-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.662 183079 DEBUG oslo_concurrency.lockutils [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "fb0d62d8-3d6f-4fa5-b342-612c69890cdf-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.662 183079 DEBUG oslo_concurrency.lockutils [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "fb0d62d8-3d6f-4fa5-b342-612c69890cdf-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.663 183079 DEBUG nova.virt.libvirt.vif [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:38:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1655858819',display_name='tempest-server-test-1655858819',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1655858819',id=61,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-oqt0h1xd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_
rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:38:15Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=fb0d62d8-3d6f-4fa5-b342-612c69890cdf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "70a900f0-6d2e-40bb-92fa-e43967095d17", "address": "fa:16:3e:6d:f4:ff", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70a900f0-6d", "ovs_interfaceid": "70a900f0-6d2e-40bb-92fa-e43967095d17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.663 183079 DEBUG nova.network.os_vif_util [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "70a900f0-6d2e-40bb-92fa-e43967095d17", "address": "fa:16:3e:6d:f4:ff", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70a900f0-6d", "ovs_interfaceid": "70a900f0-6d2e-40bb-92fa-e43967095d17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.663 183079 DEBUG nova.network.os_vif_util [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:f4:ff,bridge_name='br-int',has_traffic_filtering=True,id=70a900f0-6d2e-40bb-92fa-e43967095d17,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70a900f0-6d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.664 183079 DEBUG os_vif [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:f4:ff,bridge_name='br-int',has_traffic_filtering=True,id=70a900f0-6d2e-40bb-92fa-e43967095d17,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70a900f0-6d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.664 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.664 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.665 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.669 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.669 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap70a900f0-6d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.670 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap70a900f0-6d, col_values=(('external_ids', {'iface-id': '70a900f0-6d2e-40bb-92fa-e43967095d17', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6d:f4:ff', 'vm-uuid': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.671 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:19 compute-0 NetworkManager[55454]: <info>  [1769103499.6727] manager: (tap70a900f0-6d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/280)
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.673 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.681 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.682 183079 INFO os_vif [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:f4:ff,bridge_name='br-int',has_traffic_filtering=True,id=70a900f0-6d2e-40bb-92fa-e43967095d17,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70a900f0-6d')
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.822 183079 DEBUG nova.virt.libvirt.driver [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.823 183079 DEBUG nova.virt.libvirt.driver [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No VIF found with MAC fa:16:3e:6d:f4:ff, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:38:19 compute-0 kernel: tap70a900f0-6d: entered promiscuous mode
Jan 22 17:38:19 compute-0 NetworkManager[55454]: <info>  [1769103499.8944] manager: (tap70a900f0-6d): new Tun device (/org/freedesktop/NetworkManager/Devices/281)
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.894 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:19 compute-0 ovn_controller[95372]: 2026-01-22T17:38:19Z|00699|binding|INFO|Claiming lport 70a900f0-6d2e-40bb-92fa-e43967095d17 for this chassis.
Jan 22 17:38:19 compute-0 ovn_controller[95372]: 2026-01-22T17:38:19Z|00700|binding|INFO|70a900f0-6d2e-40bb-92fa-e43967095d17: Claiming fa:16:3e:6d:f4:ff 10.100.0.12
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.899 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:19.909 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:f4:ff 10.100.0.12'], port_security=['fa:16:3e:6d:f4:ff 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8dfc7f6d-f2b1-4fa9-a099-2dffcb456eca', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=70a900f0-6d2e-40bb-92fa-e43967095d17) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:38:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:19.910 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 70a900f0-6d2e-40bb-92fa-e43967095d17 in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 bound to our chassis
Jan 22 17:38:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:19.912 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:38:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:19.923 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[14db5f03-40c9-4978-81f4-bd920131ead4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:38:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:19.924 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap88ed9213-71 in ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:38:19 compute-0 systemd-udevd[236537]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:38:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:19.926 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap88ed9213-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:38:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:19.926 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[65b10bc5-3ae2-466e-af7c-8567d6aabacb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:38:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:19.927 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[97631c6b-8953-493a-829f-0ed0b287e634]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:38:19 compute-0 NetworkManager[55454]: <info>  [1769103499.9373] device (tap70a900f0-6d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:38:19 compute-0 NetworkManager[55454]: <info>  [1769103499.9386] device (tap70a900f0-6d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:38:19 compute-0 systemd-machined[154382]: New machine qemu-61-instance-0000003d.
Jan 22 17:38:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:19.944 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[eba7c618-dc46-4510-8f78-550f318d3907]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:38:19 compute-0 systemd[1]: Started Virtual Machine qemu-61-instance-0000003d.
Jan 22 17:38:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:19.979 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[dea9b7c6-98d5-4682-95ec-d2aa2e76bbba]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.979 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:19 compute-0 ovn_controller[95372]: 2026-01-22T17:38:19Z|00701|binding|INFO|Setting lport 70a900f0-6d2e-40bb-92fa-e43967095d17 ovn-installed in OVS
Jan 22 17:38:19 compute-0 ovn_controller[95372]: 2026-01-22T17:38:19Z|00702|binding|INFO|Setting lport 70a900f0-6d2e-40bb-92fa-e43967095d17 up in Southbound
Jan 22 17:38:19 compute-0 nova_compute[183075]: 2026-01-22 17:38:19.986 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:20.006 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[420d6688-c765-4f05-9587-9be6560b371c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:20.010 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7a475090-a3e5-432b-b03a-14df7e3e7ab5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:38:20 compute-0 NetworkManager[55454]: <info>  [1769103500.0117] manager: (tap88ed9213-70): new Veth device (/org/freedesktop/NetworkManager/Devices/282)
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:20.039 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[d850c5d9-ab5b-4f1d-be34-4cd57e7238ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:20.042 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[d261c206-8835-4600-adbe-1693c6120db2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:38:20 compute-0 NetworkManager[55454]: <info>  [1769103500.0694] device (tap88ed9213-70): carrier: link connected
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:20.075 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[e1731bb2-19ea-4bdd-90ee-be820e940ef6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:20.096 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[55d35e7e-aa77-4e01-8fca-afdbf00f1ef0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 189], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 586377, 'reachable_time': 34631, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236571, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:20.112 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[afa17e2e-3336-4260-908c-cbb665245cc4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0a:fa93'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 586377, 'tstamp': 586377}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236572, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:20.125 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[fed6a5d5-8569-4b58-8f0b-4da7d4f9a6f7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 189], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 586377, 'reachable_time': 34631, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236573, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:20.153 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[13ad5411-c8b6-421f-be2d-b7b5197d45c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:20.214 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2dceb636-8041-409b-bb57-4baf78e0ec0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:20.215 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:20.216 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:20.216 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88ed9213-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:38:20 compute-0 NetworkManager[55454]: <info>  [1769103500.2185] manager: (tap88ed9213-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/283)
Jan 22 17:38:20 compute-0 kernel: tap88ed9213-70: entered promiscuous mode
Jan 22 17:38:20 compute-0 nova_compute[183075]: 2026-01-22 17:38:20.218 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:20 compute-0 nova_compute[183075]: 2026-01-22 17:38:20.220 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:20.221 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88ed9213-70, col_values=(('external_ids', {'iface-id': 'da93a32e-eed6-4124-8f08-78b0bfe2cb69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:38:20 compute-0 ovn_controller[95372]: 2026-01-22T17:38:20Z|00703|binding|INFO|Releasing lport da93a32e-eed6-4124-8f08-78b0bfe2cb69 from this chassis (sb_readonly=0)
Jan 22 17:38:20 compute-0 nova_compute[183075]: 2026-01-22 17:38:20.222 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:20 compute-0 nova_compute[183075]: 2026-01-22 17:38:20.240 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:20.240 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:20.241 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ea387d0e-d815-4e48-a06d-374366804d24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:20.242 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:38:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:20.242 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'env', 'PROCESS_TAG=haproxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/88ed9213-7d48-4c42-ad60-ce3fcf104f71.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:38:20 compute-0 nova_compute[183075]: 2026-01-22 17:38:20.279 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103500.2790868, fb0d62d8-3d6f-4fa5-b342-612c69890cdf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:38:20 compute-0 nova_compute[183075]: 2026-01-22 17:38:20.279 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] VM Started (Lifecycle Event)
Jan 22 17:38:20 compute-0 nova_compute[183075]: 2026-01-22 17:38:20.318 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:38:20 compute-0 nova_compute[183075]: 2026-01-22 17:38:20.324 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103500.2792258, fb0d62d8-3d6f-4fa5-b342-612c69890cdf => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:38:20 compute-0 nova_compute[183075]: 2026-01-22 17:38:20.324 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] VM Paused (Lifecycle Event)
Jan 22 17:38:20 compute-0 nova_compute[183075]: 2026-01-22 17:38:20.359 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:38:20 compute-0 nova_compute[183075]: 2026-01-22 17:38:20.366 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:38:20 compute-0 nova_compute[183075]: 2026-01-22 17:38:20.553 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:38:20 compute-0 nova_compute[183075]: 2026-01-22 17:38:20.593 183079 DEBUG nova.compute.manager [req-1ace38be-6661-47bf-aeb7-a86656875f72 req-c22db1c1-0c0a-4b5a-a72d-a4303c66de50 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Received event network-vif-plugged-70a900f0-6d2e-40bb-92fa-e43967095d17 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:38:20 compute-0 nova_compute[183075]: 2026-01-22 17:38:20.593 183079 DEBUG oslo_concurrency.lockutils [req-1ace38be-6661-47bf-aeb7-a86656875f72 req-c22db1c1-0c0a-4b5a-a72d-a4303c66de50 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "fb0d62d8-3d6f-4fa5-b342-612c69890cdf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:38:20 compute-0 nova_compute[183075]: 2026-01-22 17:38:20.594 183079 DEBUG oslo_concurrency.lockutils [req-1ace38be-6661-47bf-aeb7-a86656875f72 req-c22db1c1-0c0a-4b5a-a72d-a4303c66de50 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "fb0d62d8-3d6f-4fa5-b342-612c69890cdf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:38:20 compute-0 nova_compute[183075]: 2026-01-22 17:38:20.594 183079 DEBUG oslo_concurrency.lockutils [req-1ace38be-6661-47bf-aeb7-a86656875f72 req-c22db1c1-0c0a-4b5a-a72d-a4303c66de50 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "fb0d62d8-3d6f-4fa5-b342-612c69890cdf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:38:20 compute-0 nova_compute[183075]: 2026-01-22 17:38:20.594 183079 DEBUG nova.compute.manager [req-1ace38be-6661-47bf-aeb7-a86656875f72 req-c22db1c1-0c0a-4b5a-a72d-a4303c66de50 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Processing event network-vif-plugged-70a900f0-6d2e-40bb-92fa-e43967095d17 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:38:20 compute-0 nova_compute[183075]: 2026-01-22 17:38:20.595 183079 DEBUG nova.compute.manager [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:38:20 compute-0 nova_compute[183075]: 2026-01-22 17:38:20.597 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103500.5978646, fb0d62d8-3d6f-4fa5-b342-612c69890cdf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:38:20 compute-0 nova_compute[183075]: 2026-01-22 17:38:20.598 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] VM Resumed (Lifecycle Event)
Jan 22 17:38:20 compute-0 nova_compute[183075]: 2026-01-22 17:38:20.599 183079 DEBUG nova.virt.libvirt.driver [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:38:20 compute-0 nova_compute[183075]: 2026-01-22 17:38:20.602 183079 INFO nova.virt.libvirt.driver [-] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Instance spawned successfully.
Jan 22 17:38:20 compute-0 nova_compute[183075]: 2026-01-22 17:38:20.602 183079 DEBUG nova.virt.libvirt.driver [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:38:20 compute-0 nova_compute[183075]: 2026-01-22 17:38:20.628 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:38:20 compute-0 nova_compute[183075]: 2026-01-22 17:38:20.632 183079 DEBUG nova.virt.libvirt.driver [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:38:20 compute-0 nova_compute[183075]: 2026-01-22 17:38:20.632 183079 DEBUG nova.virt.libvirt.driver [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:38:20 compute-0 nova_compute[183075]: 2026-01-22 17:38:20.633 183079 DEBUG nova.virt.libvirt.driver [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:38:20 compute-0 podman[236612]: 2026-01-22 17:38:20.633553927 +0000 UTC m=+0.073182573 container create 9290a8f058abc3dd1bf10919c643bd60ba83c50276b0be3d32ffb89040c85bca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:38:20 compute-0 nova_compute[183075]: 2026-01-22 17:38:20.633 183079 DEBUG nova.virt.libvirt.driver [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:38:20 compute-0 nova_compute[183075]: 2026-01-22 17:38:20.634 183079 DEBUG nova.virt.libvirt.driver [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:38:20 compute-0 nova_compute[183075]: 2026-01-22 17:38:20.634 183079 DEBUG nova.virt.libvirt.driver [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:38:20 compute-0 nova_compute[183075]: 2026-01-22 17:38:20.640 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:38:20 compute-0 podman[236612]: 2026-01-22 17:38:20.580684038 +0000 UTC m=+0.020312704 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:38:20 compute-0 systemd[1]: Started libpod-conmon-9290a8f058abc3dd1bf10919c643bd60ba83c50276b0be3d32ffb89040c85bca.scope.
Jan 22 17:38:20 compute-0 nova_compute[183075]: 2026-01-22 17:38:20.687 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:38:20 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:38:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3068633b8c20abed57d49122ccf32bc731bab3032fe29cdb7373c5cae0e15b7c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:38:20 compute-0 nova_compute[183075]: 2026-01-22 17:38:20.887 183079 INFO nova.compute.manager [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Took 5.45 seconds to spawn the instance on the hypervisor.
Jan 22 17:38:20 compute-0 nova_compute[183075]: 2026-01-22 17:38:20.887 183079 DEBUG nova.compute.manager [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:38:20 compute-0 podman[236612]: 2026-01-22 17:38:20.907778022 +0000 UTC m=+0.347406698 container init 9290a8f058abc3dd1bf10919c643bd60ba83c50276b0be3d32ffb89040c85bca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 17:38:20 compute-0 podman[236612]: 2026-01-22 17:38:20.913183009 +0000 UTC m=+0.352811655 container start 9290a8f058abc3dd1bf10919c643bd60ba83c50276b0be3d32ffb89040c85bca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 17:38:20 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[236627]: [NOTICE]   (236631) : New worker (236633) forked
Jan 22 17:38:20 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[236627]: [NOTICE]   (236631) : Loading success.
Jan 22 17:38:21 compute-0 nova_compute[183075]: 2026-01-22 17:38:21.040 183079 INFO nova.compute.manager [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Took 5.95 seconds to build instance.
Jan 22 17:38:21 compute-0 nova_compute[183075]: 2026-01-22 17:38:21.149 183079 DEBUG oslo_concurrency.lockutils [None req-1a6ff49d-abfb-41eb-9bdd-8ca2b77f1515 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "fb0d62d8-3d6f-4fa5-b342-612c69890cdf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:38:22 compute-0 nova_compute[183075]: 2026-01-22 17:38:22.183 183079 DEBUG nova.network.neutron [req-ec9f20bb-aa4c-433f-b94e-fef22b3dd89b req-b73ae11a-7a4d-4ab0-bbe6-728c53f16f19 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Updated VIF entry in instance network info cache for port 70a900f0-6d2e-40bb-92fa-e43967095d17. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:38:22 compute-0 nova_compute[183075]: 2026-01-22 17:38:22.184 183079 DEBUG nova.network.neutron [req-ec9f20bb-aa4c-433f-b94e-fef22b3dd89b req-b73ae11a-7a4d-4ab0-bbe6-728c53f16f19 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Updating instance_info_cache with network_info: [{"id": "70a900f0-6d2e-40bb-92fa-e43967095d17", "address": "fa:16:3e:6d:f4:ff", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70a900f0-6d", "ovs_interfaceid": "70a900f0-6d2e-40bb-92fa-e43967095d17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:38:22 compute-0 nova_compute[183075]: 2026-01-22 17:38:22.212 183079 DEBUG oslo_concurrency.lockutils [req-ec9f20bb-aa4c-433f-b94e-fef22b3dd89b req-b73ae11a-7a4d-4ab0-bbe6-728c53f16f19 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-fb0d62d8-3d6f-4fa5-b342-612c69890cdf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:38:22 compute-0 nova_compute[183075]: 2026-01-22 17:38:22.721 183079 DEBUG nova.compute.manager [req-4b88f3f0-0231-46fb-b6cd-47933e0007fd req-ba88c9df-e88d-48b1-a1f2-d11c7d327a4b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Received event network-vif-plugged-70a900f0-6d2e-40bb-92fa-e43967095d17 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:38:22 compute-0 nova_compute[183075]: 2026-01-22 17:38:22.722 183079 DEBUG oslo_concurrency.lockutils [req-4b88f3f0-0231-46fb-b6cd-47933e0007fd req-ba88c9df-e88d-48b1-a1f2-d11c7d327a4b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "fb0d62d8-3d6f-4fa5-b342-612c69890cdf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:38:22 compute-0 nova_compute[183075]: 2026-01-22 17:38:22.722 183079 DEBUG oslo_concurrency.lockutils [req-4b88f3f0-0231-46fb-b6cd-47933e0007fd req-ba88c9df-e88d-48b1-a1f2-d11c7d327a4b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "fb0d62d8-3d6f-4fa5-b342-612c69890cdf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:38:22 compute-0 nova_compute[183075]: 2026-01-22 17:38:22.722 183079 DEBUG oslo_concurrency.lockutils [req-4b88f3f0-0231-46fb-b6cd-47933e0007fd req-ba88c9df-e88d-48b1-a1f2-d11c7d327a4b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "fb0d62d8-3d6f-4fa5-b342-612c69890cdf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:38:22 compute-0 nova_compute[183075]: 2026-01-22 17:38:22.723 183079 DEBUG nova.compute.manager [req-4b88f3f0-0231-46fb-b6cd-47933e0007fd req-ba88c9df-e88d-48b1-a1f2-d11c7d327a4b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] No waiting events found dispatching network-vif-plugged-70a900f0-6d2e-40bb-92fa-e43967095d17 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:38:22 compute-0 nova_compute[183075]: 2026-01-22 17:38:22.723 183079 WARNING nova.compute.manager [req-4b88f3f0-0231-46fb-b6cd-47933e0007fd req-ba88c9df-e88d-48b1-a1f2-d11c7d327a4b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Received unexpected event network-vif-plugged-70a900f0-6d2e-40bb-92fa-e43967095d17 for instance with vm_state active and task_state None.
Jan 22 17:38:23 compute-0 nova_compute[183075]: 2026-01-22 17:38:23.122 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:23 compute-0 nova_compute[183075]: 2026-01-22 17:38:23.295 183079 INFO nova.compute.manager [None req-6b5b1a9c-666a-4766-b064-0658b6950055 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Get console output
Jan 22 17:38:23 compute-0 nova_compute[183075]: 2026-01-22 17:38:23.301 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:38:24 compute-0 nova_compute[183075]: 2026-01-22 17:38:24.672 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:25 compute-0 nova_compute[183075]: 2026-01-22 17:38:25.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:38:28 compute-0 nova_compute[183075]: 2026-01-22 17:38:28.124 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:28.221 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=44, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=43) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:38:28 compute-0 nova_compute[183075]: 2026-01-22 17:38:28.222 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:28.223 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:38:28 compute-0 nova_compute[183075]: 2026-01-22 17:38:28.510 183079 INFO nova.compute.manager [None req-b73a20a1-769d-4228-a12c-6c8cbf614875 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Get console output
Jan 22 17:38:28 compute-0 nova_compute[183075]: 2026-01-22 17:38:28.516 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:38:28 compute-0 nova_compute[183075]: 2026-01-22 17:38:28.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:38:28 compute-0 nova_compute[183075]: 2026-01-22 17:38:28.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:38:29 compute-0 podman[236642]: 2026-01-22 17:38:29.348489098 +0000 UTC m=+0.052559772 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 17:38:29 compute-0 nova_compute[183075]: 2026-01-22 17:38:29.675 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:31 compute-0 ovn_controller[95372]: 2026-01-22T17:38:31Z|00124|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6d:f4:ff 10.100.0.12
Jan 22 17:38:31 compute-0 ovn_controller[95372]: 2026-01-22T17:38:31Z|00125|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6d:f4:ff 10.100.0.12
Jan 22 17:38:32 compute-0 nova_compute[183075]: 2026-01-22 17:38:32.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:38:33 compute-0 nova_compute[183075]: 2026-01-22 17:38:33.126 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:33 compute-0 podman[236680]: 2026-01-22 17:38:33.365504801 +0000 UTC m=+0.073666726 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 17:38:33 compute-0 nova_compute[183075]: 2026-01-22 17:38:33.684 183079 INFO nova.compute.manager [None req-c0e31496-2a42-40e0-8cd0-6ac949d39420 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Get console output
Jan 22 17:38:34 compute-0 nova_compute[183075]: 2026-01-22 17:38:34.678 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:37.356 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:38:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:37.358 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:38:37 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:38:37 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:38:37 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:38:37 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:38:37 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:38:37 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:38:37 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:38:38 compute-0 nova_compute[183075]: 2026-01-22 17:38:38.129 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.226 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '44'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.403 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.404 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 1.0463533
Jan 22 17:38:38 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[236633]: 10.100.0.12:45310 [22/Jan/2026:17:38:37.355] listener listener/metadata 0/0/0/1049/1049 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.414 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.414 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.444 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.444 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0299821
Jan 22 17:38:38 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[236633]: 10.100.0.12:45312 [22/Jan/2026:17:38:38.413] listener listener/metadata 0/0/0/31/31 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.449 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.449 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.475 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.476 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0265300
Jan 22 17:38:38 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[236633]: 10.100.0.12:45318 [22/Jan/2026:17:38:38.448] listener listener/metadata 0/0/0/27/27 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.480 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.481 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.496 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.497 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0160213
Jan 22 17:38:38 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[236633]: 10.100.0.12:45320 [22/Jan/2026:17:38:38.480] listener listener/metadata 0/0/0/16/16 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.501 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.501 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.515 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.516 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0144131
Jan 22 17:38:38 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[236633]: 10.100.0.12:45324 [22/Jan/2026:17:38:38.500] listener listener/metadata 0/0/0/15/15 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.520 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.520 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.533 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.533 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0129585
Jan 22 17:38:38 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[236633]: 10.100.0.12:45336 [22/Jan/2026:17:38:38.519] listener listener/metadata 0/0/0/13/13 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.537 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.538 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.552 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:38:38 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[236633]: 10.100.0.12:45342 [22/Jan/2026:17:38:38.537] listener listener/metadata 0/0/0/15/15 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.553 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0151474
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.557 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.557 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.574 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.575 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0171158
Jan 22 17:38:38 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[236633]: 10.100.0.12:45346 [22/Jan/2026:17:38:38.556] listener listener/metadata 0/0/0/18/18 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.579 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.580 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.595 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:38:38 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[236633]: 10.100.0.12:45350 [22/Jan/2026:17:38:38.579] listener listener/metadata 0/0/0/17/17 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.596 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0161345
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.600 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.600 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.620 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.620 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0200865
Jan 22 17:38:38 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[236633]: 10.100.0.12:45362 [22/Jan/2026:17:38:38.599] listener listener/metadata 0/0/0/20/20 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.624 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.624 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:38:38 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[236633]: 10.100.0.12:45374 [22/Jan/2026:17:38:38.624] listener listener/metadata 0/0/0/16/16 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.641 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0160720
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.648 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.649 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.663 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:38:38 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[236633]: 10.100.0.12:45376 [22/Jan/2026:17:38:38.648] listener listener/metadata 0/0/0/15/15 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.664 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0143044
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.667 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.668 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.686 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.687 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0191581
Jan 22 17:38:38 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[236633]: 10.100.0.12:45392 [22/Jan/2026:17:38:38.666] listener listener/metadata 0/0/0/20/20 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.690 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.691 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.708 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:38:38 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[236633]: 10.100.0.12:45400 [22/Jan/2026:17:38:38.690] listener listener/metadata 0/0/0/18/18 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.708 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0173008
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.712 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.713 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.733 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.733 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0203218
Jan 22 17:38:38 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[236633]: 10.100.0.12:45412 [22/Jan/2026:17:38:38.712] listener listener/metadata 0/0/0/21/21 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.738 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.738 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.756 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:38:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:38.757 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0184267
Jan 22 17:38:38 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[236633]: 10.100.0.12:45418 [22/Jan/2026:17:38:38.737] listener listener/metadata 0/0/0/19/19 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:38:38 compute-0 nova_compute[183075]: 2026-01-22 17:38:38.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:38:38 compute-0 nova_compute[183075]: 2026-01-22 17:38:38.830 183079 INFO nova.compute.manager [None req-9dc43f08-0457-4af7-982e-dcf277b30055 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Get console output
Jan 22 17:38:38 compute-0 nova_compute[183075]: 2026-01-22 17:38:38.835 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:38:39 compute-0 nova_compute[183075]: 2026-01-22 17:38:39.681 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:39 compute-0 nova_compute[183075]: 2026-01-22 17:38:39.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:38:39 compute-0 nova_compute[183075]: 2026-01-22 17:38:39.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:38:39 compute-0 nova_compute[183075]: 2026-01-22 17:38:39.885 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 17:38:40 compute-0 nova_compute[183075]: 2026-01-22 17:38:40.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:38:40 compute-0 nova_compute[183075]: 2026-01-22 17:38:40.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:38:40 compute-0 nova_compute[183075]: 2026-01-22 17:38:40.787 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:38:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:41.955 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:38:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:41.956 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:38:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:38:41.957 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:38:42 compute-0 nova_compute[183075]: 2026-01-22 17:38:42.786 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:38:42 compute-0 nova_compute[183075]: 2026-01-22 17:38:42.837 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:38:42 compute-0 nova_compute[183075]: 2026-01-22 17:38:42.837 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:38:42 compute-0 nova_compute[183075]: 2026-01-22 17:38:42.837 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:38:42 compute-0 nova_compute[183075]: 2026-01-22 17:38:42.838 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:38:42 compute-0 nova_compute[183075]: 2026-01-22 17:38:42.903 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fb0d62d8-3d6f-4fa5-b342-612c69890cdf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:38:42 compute-0 nova_compute[183075]: 2026-01-22 17:38:42.962 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fb0d62d8-3d6f-4fa5-b342-612c69890cdf/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:38:42 compute-0 nova_compute[183075]: 2026-01-22 17:38:42.963 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fb0d62d8-3d6f-4fa5-b342-612c69890cdf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:38:43 compute-0 nova_compute[183075]: 2026-01-22 17:38:43.014 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fb0d62d8-3d6f-4fa5-b342-612c69890cdf/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:38:43 compute-0 nova_compute[183075]: 2026-01-22 17:38:43.131 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:43 compute-0 nova_compute[183075]: 2026-01-22 17:38:43.154 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:38:43 compute-0 nova_compute[183075]: 2026-01-22 17:38:43.156 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5520MB free_disk=73.3315200805664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:38:43 compute-0 nova_compute[183075]: 2026-01-22 17:38:43.156 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:38:43 compute-0 nova_compute[183075]: 2026-01-22 17:38:43.156 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:38:43 compute-0 nova_compute[183075]: 2026-01-22 17:38:43.266 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance fb0d62d8-3d6f-4fa5-b342-612c69890cdf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:38:43 compute-0 nova_compute[183075]: 2026-01-22 17:38:43.266 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:38:43 compute-0 nova_compute[183075]: 2026-01-22 17:38:43.266 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:38:43 compute-0 nova_compute[183075]: 2026-01-22 17:38:43.314 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:38:43 compute-0 nova_compute[183075]: 2026-01-22 17:38:43.332 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:38:43 compute-0 nova_compute[183075]: 2026-01-22 17:38:43.351 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:38:43 compute-0 nova_compute[183075]: 2026-01-22 17:38:43.352 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:38:43 compute-0 podman[236712]: 2026-01-22 17:38:43.352768828 +0000 UTC m=+0.065741040 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 22 17:38:43 compute-0 podman[236711]: 2026-01-22 17:38:43.354474875 +0000 UTC m=+0.057441035 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, tcib_managed=true)
Jan 22 17:38:43 compute-0 podman[236713]: 2026-01-22 17:38:43.399370397 +0000 UTC m=+0.107658212 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS 
Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 17:38:43 compute-0 nova_compute[183075]: 2026-01-22 17:38:43.990 183079 INFO nova.compute.manager [None req-26e9a3a7-b039-49d8-8c62-65c7e0841cf2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Get console output
Jan 22 17:38:43 compute-0 nova_compute[183075]: 2026-01-22 17:38:43.995 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:38:44 compute-0 nova_compute[183075]: 2026-01-22 17:38:44.684 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:48 compute-0 nova_compute[183075]: 2026-01-22 17:38:48.132 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:49 compute-0 nova_compute[183075]: 2026-01-22 17:38:49.117 183079 INFO nova.compute.manager [None req-8fba3aca-5a49-4aa1-bdfe-4b193f989c16 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Get console output
Jan 22 17:38:49 compute-0 nova_compute[183075]: 2026-01-22 17:38:49.122 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:38:49 compute-0 podman[236774]: 2026-01-22 17:38:49.385555233 +0000 UTC m=+0.086321991 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Jan 22 17:38:49 compute-0 nova_compute[183075]: 2026-01-22 17:38:49.687 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:52 compute-0 ovn_controller[95372]: 2026-01-22T17:38:52Z|00704|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Jan 22 17:38:53 compute-0 nova_compute[183075]: 2026-01-22 17:38:53.135 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:54 compute-0 nova_compute[183075]: 2026-01-22 17:38:54.389 183079 INFO nova.compute.manager [None req-1617b869-959c-4336-97f2-4e3043bb5ec7 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Get console output
Jan 22 17:38:54 compute-0 nova_compute[183075]: 2026-01-22 17:38:54.394 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:38:54 compute-0 nova_compute[183075]: 2026-01-22 17:38:54.689 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:58 compute-0 nova_compute[183075]: 2026-01-22 17:38:58.137 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:59 compute-0 nova_compute[183075]: 2026-01-22 17:38:59.512 183079 INFO nova.compute.manager [None req-e1cb1e12-ddeb-4fa7-b489-5aee2b3d2686 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Get console output
Jan 22 17:38:59 compute-0 nova_compute[183075]: 2026-01-22 17:38:59.516 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:38:59 compute-0 nova_compute[183075]: 2026-01-22 17:38:59.737 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:00 compute-0 podman[236796]: 2026-01-22 17:39:00.341176952 +0000 UTC m=+0.053169259 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 17:39:03 compute-0 nova_compute[183075]: 2026-01-22 17:39:03.139 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:04 compute-0 podman[236821]: 2026-01-22 17:39:04.360409735 +0000 UTC m=+0.064444206 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 17:39:04 compute-0 nova_compute[183075]: 2026-01-22 17:39:04.669 183079 INFO nova.compute.manager [None req-3131fa7d-82d5-45c8-8259-154d76e9d7c8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Get console output
Jan 22 17:39:04 compute-0 nova_compute[183075]: 2026-01-22 17:39:04.675 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:39:04 compute-0 nova_compute[183075]: 2026-01-22 17:39:04.784 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:08 compute-0 nova_compute[183075]: 2026-01-22 17:39:08.141 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:09 compute-0 nova_compute[183075]: 2026-01-22 17:39:09.802 183079 INFO nova.compute.manager [None req-99f0fa1d-22c5-4d88-a24f-f4821ffcb103 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Get console output
Jan 22 17:39:09 compute-0 nova_compute[183075]: 2026-01-22 17:39:09.807 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:39:09 compute-0 nova_compute[183075]: 2026-01-22 17:39:09.815 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:13 compute-0 nova_compute[183075]: 2026-01-22 17:39:13.143 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:14 compute-0 podman[236849]: 2026-01-22 17:39:14.359138713 +0000 UTC m=+0.065045561 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, container_name=openstack_network_exporter, architecture=x86_64, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 22 17:39:14 compute-0 podman[236848]: 2026-01-22 17:39:14.412875226 +0000 UTC m=+0.121181580 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:39:14 compute-0 podman[236847]: 2026-01-22 17:39:14.429517989 +0000 UTC m=+0.140418933 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:39:14 compute-0 nova_compute[183075]: 2026-01-22 17:39:14.818 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:14 compute-0 nova_compute[183075]: 2026-01-22 17:39:14.967 183079 INFO nova.compute.manager [None req-8530cae5-5f71-4a02-ae87-e0f58942ffbd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Get console output
Jan 22 17:39:14 compute-0 nova_compute[183075]: 2026-01-22 17:39:14.972 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:39:16 compute-0 ovn_controller[95372]: 2026-01-22T17:39:16Z|00705|binding|INFO|Releasing lport da93a32e-eed6-4124-8f08-78b0bfe2cb69 from this chassis (sb_readonly=0)
Jan 22 17:39:16 compute-0 nova_compute[183075]: 2026-01-22 17:39:16.513 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:16 compute-0 NetworkManager[55454]: <info>  [1769103556.5142] manager: (patch-br-int-to-provnet-c63dea44-3fe3-44c3-ba47-76ee1ea0ad94): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/284)
Jan 22 17:39:16 compute-0 NetworkManager[55454]: <info>  [1769103556.5162] manager: (patch-provnet-c63dea44-3fe3-44c3-ba47-76ee1ea0ad94-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/285)
Jan 22 17:39:16 compute-0 ovn_controller[95372]: 2026-01-22T17:39:16Z|00706|binding|INFO|Releasing lport da93a32e-eed6-4124-8f08-78b0bfe2cb69 from this chassis (sb_readonly=0)
Jan 22 17:39:16 compute-0 nova_compute[183075]: 2026-01-22 17:39:16.595 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:16.796 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=45, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=44) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:39:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:16.797 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:39:16 compute-0 nova_compute[183075]: 2026-01-22 17:39:16.797 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:16 compute-0 nova_compute[183075]: 2026-01-22 17:39:16.819 183079 DEBUG nova.compute.manager [req-dcc3a66a-79c4-4a57-96c0-a67bbc005c5e req-9937f8b9-b3cf-42b1-ae09-f44f86023de1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Received event network-changed-70a900f0-6d2e-40bb-92fa-e43967095d17 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:39:16 compute-0 nova_compute[183075]: 2026-01-22 17:39:16.820 183079 DEBUG nova.compute.manager [req-dcc3a66a-79c4-4a57-96c0-a67bbc005c5e req-9937f8b9-b3cf-42b1-ae09-f44f86023de1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Refreshing instance network info cache due to event network-changed-70a900f0-6d2e-40bb-92fa-e43967095d17. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:39:16 compute-0 nova_compute[183075]: 2026-01-22 17:39:16.820 183079 DEBUG oslo_concurrency.lockutils [req-dcc3a66a-79c4-4a57-96c0-a67bbc005c5e req-9937f8b9-b3cf-42b1-ae09-f44f86023de1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-fb0d62d8-3d6f-4fa5-b342-612c69890cdf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:39:16 compute-0 nova_compute[183075]: 2026-01-22 17:39:16.820 183079 DEBUG oslo_concurrency.lockutils [req-dcc3a66a-79c4-4a57-96c0-a67bbc005c5e req-9937f8b9-b3cf-42b1-ae09-f44f86023de1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-fb0d62d8-3d6f-4fa5-b342-612c69890cdf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:39:16 compute-0 nova_compute[183075]: 2026-01-22 17:39:16.821 183079 DEBUG nova.network.neutron [req-dcc3a66a-79c4-4a57-96c0-a67bbc005c5e req-9937f8b9-b3cf-42b1-ae09-f44f86023de1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Refreshing network info cache for port 70a900f0-6d2e-40bb-92fa-e43967095d17 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:39:18 compute-0 nova_compute[183075]: 2026-01-22 17:39:18.145 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:18 compute-0 nova_compute[183075]: 2026-01-22 17:39:18.293 183079 DEBUG nova.network.neutron [req-dcc3a66a-79c4-4a57-96c0-a67bbc005c5e req-9937f8b9-b3cf-42b1-ae09-f44f86023de1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Updated VIF entry in instance network info cache for port 70a900f0-6d2e-40bb-92fa-e43967095d17. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:39:18 compute-0 nova_compute[183075]: 2026-01-22 17:39:18.294 183079 DEBUG nova.network.neutron [req-dcc3a66a-79c4-4a57-96c0-a67bbc005c5e req-9937f8b9-b3cf-42b1-ae09-f44f86023de1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Updating instance_info_cache with network_info: [{"id": "70a900f0-6d2e-40bb-92fa-e43967095d17", "address": "fa:16:3e:6d:f4:ff", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70a900f0-6d", "ovs_interfaceid": "70a900f0-6d2e-40bb-92fa-e43967095d17", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:39:18 compute-0 nova_compute[183075]: 2026-01-22 17:39:18.310 183079 DEBUG oslo_concurrency.lockutils [req-dcc3a66a-79c4-4a57-96c0-a67bbc005c5e req-9937f8b9-b3cf-42b1-ae09-f44f86023de1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-fb0d62d8-3d6f-4fa5-b342-612c69890cdf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:39:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:18.800 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '45'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.134 183079 DEBUG oslo_concurrency.lockutils [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.135 183079 DEBUG oslo_concurrency.lockutils [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.153 183079 DEBUG nova.compute.manager [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.219 183079 DEBUG oslo_concurrency.lockutils [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.219 183079 DEBUG oslo_concurrency.lockutils [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.227 183079 DEBUG nova.virt.hardware [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.228 183079 INFO nova.compute.claims [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.346 183079 DEBUG nova.compute.provider_tree [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.361 183079 DEBUG nova.scheduler.client.report [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.378 183079 DEBUG oslo_concurrency.lockutils [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.379 183079 DEBUG nova.compute.manager [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.415 183079 DEBUG nova.compute.manager [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.416 183079 DEBUG nova.network.neutron [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.434 183079 INFO nova.virt.libvirt.driver [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.448 183079 DEBUG nova.compute.manager [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.676 183079 DEBUG nova.compute.manager [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.677 183079 DEBUG nova.virt.libvirt.driver [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.678 183079 INFO nova.virt.libvirt.driver [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Creating image(s)
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.678 183079 DEBUG oslo_concurrency.lockutils [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "/var/lib/nova/instances/4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.679 183079 DEBUG oslo_concurrency.lockutils [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.679 183079 DEBUG oslo_concurrency.lockutils [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.692 183079 DEBUG oslo_concurrency.processutils [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.748 183079 DEBUG oslo_concurrency.processutils [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.750 183079 DEBUG oslo_concurrency.lockutils [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.750 183079 DEBUG oslo_concurrency.lockutils [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.760 183079 DEBUG oslo_concurrency.processutils [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.820 183079 DEBUG oslo_concurrency.processutils [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.821 183079 DEBUG oslo_concurrency.processutils [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.837 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.854 183079 DEBUG oslo_concurrency.processutils [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.855 183079 DEBUG oslo_concurrency.lockutils [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.856 183079 DEBUG oslo_concurrency.processutils [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.909 183079 DEBUG oslo_concurrency.processutils [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.910 183079 DEBUG nova.virt.disk.api [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Checking if we can resize image /var/lib/nova/instances/4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.910 183079 DEBUG oslo_concurrency.processutils [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.964 183079 DEBUG oslo_concurrency.processutils [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.966 183079 DEBUG nova.virt.disk.api [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Cannot resize image /var/lib/nova/instances/4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.966 183079 DEBUG nova.objects.instance [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'migration_context' on Instance uuid 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.982 183079 DEBUG nova.virt.libvirt.driver [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.983 183079 DEBUG nova.virt.libvirt.driver [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Ensure instance console log exists: /var/lib/nova/instances/4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.983 183079 DEBUG oslo_concurrency.lockutils [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.984 183079 DEBUG oslo_concurrency.lockutils [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:39:19 compute-0 nova_compute[183075]: 2026-01-22 17:39:19.984 183079 DEBUG oslo_concurrency.lockutils [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:39:20 compute-0 nova_compute[183075]: 2026-01-22 17:39:20.261 183079 DEBUG nova.policy [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:39:20 compute-0 podman[236928]: 2026-01-22 17:39:20.341345674 +0000 UTC m=+0.049391795 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:39:22 compute-0 nova_compute[183075]: 2026-01-22 17:39:22.322 183079 DEBUG nova.network.neutron [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Successfully created port: 0a29f7a3-3b06-4447-a07b-2c171a583ec7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:39:23 compute-0 nova_compute[183075]: 2026-01-22 17:39:23.146 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:23 compute-0 nova_compute[183075]: 2026-01-22 17:39:23.163 183079 DEBUG nova.network.neutron [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Successfully updated port: 0a29f7a3-3b06-4447-a07b-2c171a583ec7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:39:23 compute-0 nova_compute[183075]: 2026-01-22 17:39:23.180 183079 DEBUG oslo_concurrency.lockutils [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "refresh_cache-4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:39:23 compute-0 nova_compute[183075]: 2026-01-22 17:39:23.180 183079 DEBUG oslo_concurrency.lockutils [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquired lock "refresh_cache-4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:39:23 compute-0 nova_compute[183075]: 2026-01-22 17:39:23.180 183079 DEBUG nova.network.neutron [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:39:23 compute-0 nova_compute[183075]: 2026-01-22 17:39:23.248 183079 DEBUG nova.compute.manager [req-fc37e347-8a64-4341-88e1-57f0b093f300 req-67548170-11a7-40fd-8f44-217bbe6b61b8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Received event network-changed-0a29f7a3-3b06-4447-a07b-2c171a583ec7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:39:23 compute-0 nova_compute[183075]: 2026-01-22 17:39:23.248 183079 DEBUG nova.compute.manager [req-fc37e347-8a64-4341-88e1-57f0b093f300 req-67548170-11a7-40fd-8f44-217bbe6b61b8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Refreshing instance network info cache due to event network-changed-0a29f7a3-3b06-4447-a07b-2c171a583ec7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:39:23 compute-0 nova_compute[183075]: 2026-01-22 17:39:23.248 183079 DEBUG oslo_concurrency.lockutils [req-fc37e347-8a64-4341-88e1-57f0b093f300 req-67548170-11a7-40fd-8f44-217bbe6b61b8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:39:23 compute-0 nova_compute[183075]: 2026-01-22 17:39:23.318 183079 DEBUG nova.network.neutron [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.555 183079 DEBUG nova.network.neutron [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Updating instance_info_cache with network_info: [{"id": "0a29f7a3-3b06-4447-a07b-2c171a583ec7", "address": "fa:16:3e:fc:c4:c4", "network": {"id": "ae93f685-c847-412d-9bac-109715e96a73", "bridge": "br-int", "label": "tempest-test-network--1900519672", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a29f7a3-3b", "ovs_interfaceid": "0a29f7a3-3b06-4447-a07b-2c171a583ec7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.588 183079 DEBUG oslo_concurrency.lockutils [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Releasing lock "refresh_cache-4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.588 183079 DEBUG nova.compute.manager [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Instance network_info: |[{"id": "0a29f7a3-3b06-4447-a07b-2c171a583ec7", "address": "fa:16:3e:fc:c4:c4", "network": {"id": "ae93f685-c847-412d-9bac-109715e96a73", "bridge": "br-int", "label": "tempest-test-network--1900519672", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a29f7a3-3b", "ovs_interfaceid": "0a29f7a3-3b06-4447-a07b-2c171a583ec7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.589 183079 DEBUG oslo_concurrency.lockutils [req-fc37e347-8a64-4341-88e1-57f0b093f300 req-67548170-11a7-40fd-8f44-217bbe6b61b8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.589 183079 DEBUG nova.network.neutron [req-fc37e347-8a64-4341-88e1-57f0b093f300 req-67548170-11a7-40fd-8f44-217bbe6b61b8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Refreshing network info cache for port 0a29f7a3-3b06-4447-a07b-2c171a583ec7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.592 183079 DEBUG nova.virt.libvirt.driver [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Start _get_guest_xml network_info=[{"id": "0a29f7a3-3b06-4447-a07b-2c171a583ec7", "address": "fa:16:3e:fc:c4:c4", "network": {"id": "ae93f685-c847-412d-9bac-109715e96a73", "bridge": "br-int", "label": "tempest-test-network--1900519672", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a29f7a3-3b", "ovs_interfaceid": "0a29f7a3-3b06-4447-a07b-2c171a583ec7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.597 183079 WARNING nova.virt.libvirt.driver [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.602 183079 DEBUG nova.virt.libvirt.host [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.603 183079 DEBUG nova.virt.libvirt.host [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.606 183079 DEBUG nova.virt.libvirt.host [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.606 183079 DEBUG nova.virt.libvirt.host [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.607 183079 DEBUG nova.virt.libvirt.driver [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.607 183079 DEBUG nova.virt.hardware [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.607 183079 DEBUG nova.virt.hardware [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.608 183079 DEBUG nova.virt.hardware [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.608 183079 DEBUG nova.virt.hardware [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.608 183079 DEBUG nova.virt.hardware [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.608 183079 DEBUG nova.virt.hardware [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.608 183079 DEBUG nova.virt.hardware [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.609 183079 DEBUG nova.virt.hardware [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.609 183079 DEBUG nova.virt.hardware [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.609 183079 DEBUG nova.virt.hardware [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.609 183079 DEBUG nova.virt.hardware [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.612 183079 DEBUG nova.virt.libvirt.vif [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:39:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-164795055',display_name='tempest-server-test-164795055',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-164795055',id=62,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-pt57t9fa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:39:19Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0a29f7a3-3b06-4447-a07b-2c171a583ec7", "address": "fa:16:3e:fc:c4:c4", "network": {"id": "ae93f685-c847-412d-9bac-109715e96a73", "bridge": "br-int", "label": "tempest-test-network--1900519672", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a29f7a3-3b", "ovs_interfaceid": "0a29f7a3-3b06-4447-a07b-2c171a583ec7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.613 183079 DEBUG nova.network.os_vif_util [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "0a29f7a3-3b06-4447-a07b-2c171a583ec7", "address": "fa:16:3e:fc:c4:c4", "network": {"id": "ae93f685-c847-412d-9bac-109715e96a73", "bridge": "br-int", "label": "tempest-test-network--1900519672", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a29f7a3-3b", "ovs_interfaceid": "0a29f7a3-3b06-4447-a07b-2c171a583ec7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.613 183079 DEBUG nova.network.os_vif_util [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fc:c4:c4,bridge_name='br-int',has_traffic_filtering=True,id=0a29f7a3-3b06-4447-a07b-2c171a583ec7,network=Network(ae93f685-c847-412d-9bac-109715e96a73),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a29f7a3-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.614 183079 DEBUG nova.objects.instance [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'pci_devices' on Instance uuid 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.638 183079 DEBUG nova.virt.libvirt.driver [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:39:24 compute-0 nova_compute[183075]:   <uuid>4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6</uuid>
Jan 22 17:39:24 compute-0 nova_compute[183075]:   <name>instance-0000003e</name>
Jan 22 17:39:24 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:39:24 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:39:24 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:39:24 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-164795055</nova:name>
Jan 22 17:39:24 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:39:24</nova:creationTime>
Jan 22 17:39:24 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:39:24 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:39:24 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:39:24 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:39:24 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:39:24 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:39:24 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:39:24 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:39:24 compute-0 nova_compute[183075]:         <nova:user uuid="66c24efd0cd24c57803ea8508e679d06">tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member</nova:user>
Jan 22 17:39:24 compute-0 nova_compute[183075]:         <nova:project uuid="cbff5cec4b9c4d2eb317124ca20fea9f">tempest-StatelessNetworkSecGroupIPv4Test-1348485316</nova:project>
Jan 22 17:39:24 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:39:24 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:39:24 compute-0 nova_compute[183075]:         <nova:port uuid="0a29f7a3-3b06-4447-a07b-2c171a583ec7">
Jan 22 17:39:24 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.22" ipVersion="4"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:39:24 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:39:24 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:39:24 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <system>
Jan 22 17:39:24 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:39:24 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:39:24 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:39:24 compute-0 nova_compute[183075]:       <entry name="serial">4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6</entry>
Jan 22 17:39:24 compute-0 nova_compute[183075]:       <entry name="uuid">4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6</entry>
Jan 22 17:39:24 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     </system>
Jan 22 17:39:24 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:39:24 compute-0 nova_compute[183075]:   <os>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:   </os>
Jan 22 17:39:24 compute-0 nova_compute[183075]:   <features>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:   </features>
Jan 22 17:39:24 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:39:24 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:39:24 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:39:24 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/disk"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:39:24 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:fc:c4:c4"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:       <target dev="tap0a29f7a3-3b"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:39:24 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/console.log" append="off"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <video>
Jan 22 17:39:24 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     </video>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:39:24 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:39:24 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:39:24 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:39:24 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:39:24 compute-0 nova_compute[183075]: </domain>
Jan 22 17:39:24 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.639 183079 DEBUG nova.compute.manager [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Preparing to wait for external event network-vif-plugged-0a29f7a3-3b06-4447-a07b-2c171a583ec7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.640 183079 DEBUG oslo_concurrency.lockutils [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.640 183079 DEBUG oslo_concurrency.lockutils [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.640 183079 DEBUG oslo_concurrency.lockutils [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.641 183079 DEBUG nova.virt.libvirt.vif [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:39:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-164795055',display_name='tempest-server-test-164795055',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-164795055',id=62,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-pt57t9fa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng
_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:39:19Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0a29f7a3-3b06-4447-a07b-2c171a583ec7", "address": "fa:16:3e:fc:c4:c4", "network": {"id": "ae93f685-c847-412d-9bac-109715e96a73", "bridge": "br-int", "label": "tempest-test-network--1900519672", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a29f7a3-3b", "ovs_interfaceid": "0a29f7a3-3b06-4447-a07b-2c171a583ec7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.641 183079 DEBUG nova.network.os_vif_util [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "0a29f7a3-3b06-4447-a07b-2c171a583ec7", "address": "fa:16:3e:fc:c4:c4", "network": {"id": "ae93f685-c847-412d-9bac-109715e96a73", "bridge": "br-int", "label": "tempest-test-network--1900519672", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a29f7a3-3b", "ovs_interfaceid": "0a29f7a3-3b06-4447-a07b-2c171a583ec7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.642 183079 DEBUG nova.network.os_vif_util [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fc:c4:c4,bridge_name='br-int',has_traffic_filtering=True,id=0a29f7a3-3b06-4447-a07b-2c171a583ec7,network=Network(ae93f685-c847-412d-9bac-109715e96a73),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a29f7a3-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.642 183079 DEBUG os_vif [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fc:c4:c4,bridge_name='br-int',has_traffic_filtering=True,id=0a29f7a3-3b06-4447-a07b-2c171a583ec7,network=Network(ae93f685-c847-412d-9bac-109715e96a73),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a29f7a3-3b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.643 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.643 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.644 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.646 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.646 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0a29f7a3-3b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.647 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0a29f7a3-3b, col_values=(('external_ids', {'iface-id': '0a29f7a3-3b06-4447-a07b-2c171a583ec7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fc:c4:c4', 'vm-uuid': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.648 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:24 compute-0 NetworkManager[55454]: <info>  [1769103564.6495] manager: (tap0a29f7a3-3b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/286)
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.650 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.657 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.658 183079 INFO os_vif [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fc:c4:c4,bridge_name='br-int',has_traffic_filtering=True,id=0a29f7a3-3b06-4447-a07b-2c171a583ec7,network=Network(ae93f685-c847-412d-9bac-109715e96a73),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a29f7a3-3b')
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.720 183079 DEBUG nova.virt.libvirt.driver [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.721 183079 DEBUG nova.virt.libvirt.driver [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No VIF found with MAC fa:16:3e:fc:c4:c4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:39:24 compute-0 kernel: tap0a29f7a3-3b: entered promiscuous mode
Jan 22 17:39:24 compute-0 NetworkManager[55454]: <info>  [1769103564.7872] manager: (tap0a29f7a3-3b): new Tun device (/org/freedesktop/NetworkManager/Devices/287)
Jan 22 17:39:24 compute-0 ovn_controller[95372]: 2026-01-22T17:39:24Z|00707|binding|INFO|Claiming lport 0a29f7a3-3b06-4447-a07b-2c171a583ec7 for this chassis.
Jan 22 17:39:24 compute-0 ovn_controller[95372]: 2026-01-22T17:39:24Z|00708|binding|INFO|0a29f7a3-3b06-4447-a07b-2c171a583ec7: Claiming fa:16:3e:fc:c4:c4 10.100.0.22
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.789 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:24.793 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fc:c4:c4 10.100.0.22'], port_security=['fa:16:3e:fc:c4:c4 10.100.0.22'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.22/28', 'neutron:device_id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ae93f685-c847-412d-9bac-109715e96a73', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9bfec5de-80e1-4e5c-80ad-8dfa1bbb473c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9f3a5ba1-b177-4437-8147-21864312aeb4, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=0a29f7a3-3b06-4447-a07b-2c171a583ec7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:39:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:24.795 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 0a29f7a3-3b06-4447-a07b-2c171a583ec7 in datapath ae93f685-c847-412d-9bac-109715e96a73 bound to our chassis
Jan 22 17:39:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:24.796 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ae93f685-c847-412d-9bac-109715e96a73
Jan 22 17:39:24 compute-0 ovn_controller[95372]: 2026-01-22T17:39:24Z|00709|binding|INFO|Setting lport 0a29f7a3-3b06-4447-a07b-2c171a583ec7 up in Southbound
Jan 22 17:39:24 compute-0 ovn_controller[95372]: 2026-01-22T17:39:24Z|00710|binding|INFO|Setting lport 0a29f7a3-3b06-4447-a07b-2c171a583ec7 ovn-installed in OVS
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.802 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:24 compute-0 nova_compute[183075]: 2026-01-22 17:39:24.804 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:24.810 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7640b275-572b-4376-907b-0f118e87b6e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:39:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:24.811 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapae93f685-c1 in ovnmeta-ae93f685-c847-412d-9bac-109715e96a73 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:39:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:24.813 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapae93f685-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:39:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:24.813 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1e09d0ef-cc78-47ad-9564-7ffe195559c9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:39:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:24.814 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[360ff37b-b332-489d-b9ea-d29e89602828]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:39:24 compute-0 systemd-udevd[236964]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:39:24 compute-0 systemd-machined[154382]: New machine qemu-62-instance-0000003e.
Jan 22 17:39:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:24.825 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[8dae31d7-461c-47fa-9770-3dda0d3b1566]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:39:24 compute-0 NetworkManager[55454]: <info>  [1769103564.8325] device (tap0a29f7a3-3b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:39:24 compute-0 NetworkManager[55454]: <info>  [1769103564.8333] device (tap0a29f7a3-3b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:39:24 compute-0 systemd[1]: Started Virtual Machine qemu-62-instance-0000003e.
Jan 22 17:39:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:24.842 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[716185e3-11ec-4931-af04-533f92dc6a16]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:39:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:24.871 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[4b83215e-ebf3-4240-8a9c-8b9955954742]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:39:24 compute-0 systemd-udevd[236969]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:39:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:24.877 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[94d271de-8733-477e-9066-c1d9caafb602]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:39:24 compute-0 NetworkManager[55454]: <info>  [1769103564.8784] manager: (tapae93f685-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/288)
Jan 22 17:39:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:24.904 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[d8de3c61-6a83-4829-bacf-3c519585913a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:39:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:24.907 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[abdaf619-24b0-4bed-ab1c-b46f9c7bc069]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:39:24 compute-0 NetworkManager[55454]: <info>  [1769103564.9302] device (tapae93f685-c0): carrier: link connected
Jan 22 17:39:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:24.935 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[6cca869f-46a8-4bae-baf1-ad37775816c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:39:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:24.952 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[40740cb3-428a-492e-a37e-23c996bfe538]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapae93f685-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:68:d0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 191], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 592863, 'reachable_time': 33670, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236997, 'error': None, 'target': 'ovnmeta-ae93f685-c847-412d-9bac-109715e96a73', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:39:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:24.972 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a63bbf76-066a-4454-8398-7a2bce66515b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe36:68d0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 592863, 'tstamp': 592863}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236999, 'error': None, 'target': 'ovnmeta-ae93f685-c847-412d-9bac-109715e96a73', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:39:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:24.987 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3145f192-689f-4f39-ba7f-614716ffdfdc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapae93f685-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:68:d0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 191], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 592863, 'reachable_time': 33670, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237004, 'error': None, 'target': 'ovnmeta-ae93f685-c847-412d-9bac-109715e96a73', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:25.018 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2e43d373-8044-4e36-b7a1-ff98b951e3c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.065 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103565.0647728, 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.065 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] VM Started (Lifecycle Event)
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:25.078 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[01d857b3-02e0-48d9-a588-88e767091e21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:25.079 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapae93f685-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:25.079 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:25.080 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapae93f685-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.081 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:25 compute-0 NetworkManager[55454]: <info>  [1769103565.0823] manager: (tapae93f685-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/289)
Jan 22 17:39:25 compute-0 kernel: tapae93f685-c0: entered promiscuous mode
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.083 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:25.086 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapae93f685-c0, col_values=(('external_ids', {'iface-id': '1367070d-c92c-4f77-b0cd-87e841c1d45b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.087 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:25 compute-0 ovn_controller[95372]: 2026-01-22T17:39:25Z|00711|binding|INFO|Releasing lport 1367070d-c92c-4f77-b0cd-87e841c1d45b from this chassis (sb_readonly=0)
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.088 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:25.088 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ae93f685-c847-412d-9bac-109715e96a73.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ae93f685-c847-412d-9bac-109715e96a73.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:25.089 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0b541351-0d7a-4d5b-94dc-f8b203f97b3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:25.089 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-ae93f685-c847-412d-9bac-109715e96a73
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/ae93f685-c847-412d-9bac-109715e96a73.pid.haproxy
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID ae93f685-c847-412d-9bac-109715e96a73
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:39:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:25.090 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ae93f685-c847-412d-9bac-109715e96a73', 'env', 'PROCESS_TAG=haproxy-ae93f685-c847-412d-9bac-109715e96a73', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ae93f685-c847-412d-9bac-109715e96a73.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.103 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.118 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.122 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103565.0649836, 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.122 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] VM Paused (Lifecycle Event)
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.165 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.169 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.222 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:39:25 compute-0 podman[237038]: 2026-01-22 17:39:25.472391506 +0000 UTC m=+0.063199107 container create 858e51a8dd9878d5748e3618089328b625db7905c7b0008d633c0f752189e1ea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ae93f685-c847-412d-9bac-109715e96a73, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.486 183079 DEBUG nova.compute.manager [req-4be6a0aa-d291-485a-8565-3536ad4ffdab req-7ed743b6-33be-4c1f-9613-15c07f5cecfd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Received event network-vif-plugged-0a29f7a3-3b06-4447-a07b-2c171a583ec7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.488 183079 DEBUG oslo_concurrency.lockutils [req-4be6a0aa-d291-485a-8565-3536ad4ffdab req-7ed743b6-33be-4c1f-9613-15c07f5cecfd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.489 183079 DEBUG oslo_concurrency.lockutils [req-4be6a0aa-d291-485a-8565-3536ad4ffdab req-7ed743b6-33be-4c1f-9613-15c07f5cecfd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.489 183079 DEBUG oslo_concurrency.lockutils [req-4be6a0aa-d291-485a-8565-3536ad4ffdab req-7ed743b6-33be-4c1f-9613-15c07f5cecfd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.489 183079 DEBUG nova.compute.manager [req-4be6a0aa-d291-485a-8565-3536ad4ffdab req-7ed743b6-33be-4c1f-9613-15c07f5cecfd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Processing event network-vif-plugged-0a29f7a3-3b06-4447-a07b-2c171a583ec7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.490 183079 DEBUG nova.compute.manager [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.495 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103565.4939418, 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.496 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] VM Resumed (Lifecycle Event)
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.500 183079 DEBUG nova.virt.libvirt.driver [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.504 183079 INFO nova.virt.libvirt.driver [-] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Instance spawned successfully.
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.505 183079 DEBUG nova.virt.libvirt.driver [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.514 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.522 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:39:25 compute-0 systemd[1]: Started libpod-conmon-858e51a8dd9878d5748e3618089328b625db7905c7b0008d633c0f752189e1ea.scope.
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.529 183079 DEBUG nova.virt.libvirt.driver [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.530 183079 DEBUG nova.virt.libvirt.driver [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.531 183079 DEBUG nova.virt.libvirt.driver [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.531 183079 DEBUG nova.virt.libvirt.driver [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:39:25 compute-0 podman[237038]: 2026-01-22 17:39:25.43878235 +0000 UTC m=+0.029589961 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.532 183079 DEBUG nova.virt.libvirt.driver [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.532 183079 DEBUG nova.virt.libvirt.driver [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.541 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:39:25 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:39:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b9aba0adf66bcff2bed9feace7800fa5d2e47643a5aa3963938a4cd993335cf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:39:25 compute-0 podman[237038]: 2026-01-22 17:39:25.567411619 +0000 UTC m=+0.158219230 container init 858e51a8dd9878d5748e3618089328b625db7905c7b0008d633c0f752189e1ea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ae93f685-c847-412d-9bac-109715e96a73, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 22 17:39:25 compute-0 podman[237038]: 2026-01-22 17:39:25.573431708 +0000 UTC m=+0.164239299 container start 858e51a8dd9878d5748e3618089328b625db7905c7b0008d633c0f752189e1ea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ae93f685-c847-412d-9bac-109715e96a73, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 17:39:25 compute-0 neutron-haproxy-ovnmeta-ae93f685-c847-412d-9bac-109715e96a73[237053]: [NOTICE]   (237057) : New worker (237059) forked
Jan 22 17:39:25 compute-0 neutron-haproxy-ovnmeta-ae93f685-c847-412d-9bac-109715e96a73[237053]: [NOTICE]   (237057) : Loading success.
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.597 183079 INFO nova.compute.manager [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Took 5.92 seconds to spawn the instance on the hypervisor.
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.599 183079 DEBUG nova.compute.manager [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.669 183079 INFO nova.compute.manager [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Took 6.47 seconds to build instance.
Jan 22 17:39:25 compute-0 nova_compute[183075]: 2026-01-22 17:39:25.696 183079 DEBUG oslo_concurrency.lockutils [None req-d006146f-2389-48e6-ab7b-763c4a481715 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.561s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:39:26 compute-0 nova_compute[183075]: 2026-01-22 17:39:26.285 183079 DEBUG nova.network.neutron [req-fc37e347-8a64-4341-88e1-57f0b093f300 req-67548170-11a7-40fd-8f44-217bbe6b61b8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Updated VIF entry in instance network info cache for port 0a29f7a3-3b06-4447-a07b-2c171a583ec7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:39:26 compute-0 nova_compute[183075]: 2026-01-22 17:39:26.286 183079 DEBUG nova.network.neutron [req-fc37e347-8a64-4341-88e1-57f0b093f300 req-67548170-11a7-40fd-8f44-217bbe6b61b8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Updating instance_info_cache with network_info: [{"id": "0a29f7a3-3b06-4447-a07b-2c171a583ec7", "address": "fa:16:3e:fc:c4:c4", "network": {"id": "ae93f685-c847-412d-9bac-109715e96a73", "bridge": "br-int", "label": "tempest-test-network--1900519672", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a29f7a3-3b", "ovs_interfaceid": "0a29f7a3-3b06-4447-a07b-2c171a583ec7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:39:26 compute-0 nova_compute[183075]: 2026-01-22 17:39:26.324 183079 DEBUG oslo_concurrency.lockutils [req-fc37e347-8a64-4341-88e1-57f0b093f300 req-67548170-11a7-40fd-8f44-217bbe6b61b8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:39:26 compute-0 nova_compute[183075]: 2026-01-22 17:39:26.354 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:39:26 compute-0 nova_compute[183075]: 2026-01-22 17:39:26.555 183079 INFO nova.compute.manager [None req-75392496-599b-4aac-8420-5f3bab69c353 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Get console output
Jan 22 17:39:26 compute-0 nova_compute[183075]: 2026-01-22 17:39:26.560 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:39:28 compute-0 nova_compute[183075]: 2026-01-22 17:39:28.149 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:29 compute-0 nova_compute[183075]: 2026-01-22 17:39:29.648 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:29 compute-0 nova_compute[183075]: 2026-01-22 17:39:29.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:39:29 compute-0 nova_compute[183075]: 2026-01-22 17:39:29.791 183079 DEBUG nova.compute.manager [req-6c51504c-e681-4abe-b295-274419a3041d req-b3a2321d-0c55-4c99-a85c-c411d9176668 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Received event network-vif-plugged-0a29f7a3-3b06-4447-a07b-2c171a583ec7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:39:29 compute-0 nova_compute[183075]: 2026-01-22 17:39:29.792 183079 DEBUG oslo_concurrency.lockutils [req-6c51504c-e681-4abe-b295-274419a3041d req-b3a2321d-0c55-4c99-a85c-c411d9176668 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:39:29 compute-0 nova_compute[183075]: 2026-01-22 17:39:29.792 183079 DEBUG oslo_concurrency.lockutils [req-6c51504c-e681-4abe-b295-274419a3041d req-b3a2321d-0c55-4c99-a85c-c411d9176668 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:39:29 compute-0 nova_compute[183075]: 2026-01-22 17:39:29.793 183079 DEBUG oslo_concurrency.lockutils [req-6c51504c-e681-4abe-b295-274419a3041d req-b3a2321d-0c55-4c99-a85c-c411d9176668 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:39:29 compute-0 nova_compute[183075]: 2026-01-22 17:39:29.793 183079 DEBUG nova.compute.manager [req-6c51504c-e681-4abe-b295-274419a3041d req-b3a2321d-0c55-4c99-a85c-c411d9176668 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] No waiting events found dispatching network-vif-plugged-0a29f7a3-3b06-4447-a07b-2c171a583ec7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:39:29 compute-0 nova_compute[183075]: 2026-01-22 17:39:29.793 183079 WARNING nova.compute.manager [req-6c51504c-e681-4abe-b295-274419a3041d req-b3a2321d-0c55-4c99-a85c-c411d9176668 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Received unexpected event network-vif-plugged-0a29f7a3-3b06-4447-a07b-2c171a583ec7 for instance with vm_state active and task_state None.
Jan 22 17:39:30 compute-0 nova_compute[183075]: 2026-01-22 17:39:30.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:39:31 compute-0 podman[237068]: 2026-01-22 17:39:31.341385564 +0000 UTC m=+0.049737811 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:39:31 compute-0 nova_compute[183075]: 2026-01-22 17:39:31.865 183079 INFO nova.compute.manager [None req-36e6fa14-2259-4509-8f8f-0ca044a57674 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Get console output
Jan 22 17:39:31 compute-0 nova_compute[183075]: 2026-01-22 17:39:31.871 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:39:33 compute-0 nova_compute[183075]: 2026-01-22 17:39:33.211 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:33 compute-0 nova_compute[183075]: 2026-01-22 17:39:33.789 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:39:33 compute-0 nova_compute[183075]: 2026-01-22 17:39:33.790 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:39:33 compute-0 nova_compute[183075]: 2026-01-22 17:39:33.790 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 17:39:34 compute-0 nova_compute[183075]: 2026-01-22 17:39:34.651 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:35 compute-0 podman[237093]: 2026-01-22 17:39:35.356435634 +0000 UTC m=+0.064052209 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:39:36 compute-0 ovn_controller[95372]: 2026-01-22T17:39:36Z|00126|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:fc:c4:c4 10.100.0.22
Jan 22 17:39:36 compute-0 ovn_controller[95372]: 2026-01-22T17:39:36Z|00127|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:fc:c4:c4 10.100.0.22
Jan 22 17:39:36 compute-0 nova_compute[183075]: 2026-01-22 17:39:36.998 183079 INFO nova.compute.manager [None req-80c15e6e-2cdd-440a-b3ec-3b5a5a8719ea 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Get console output
Jan 22 17:39:37 compute-0 nova_compute[183075]: 2026-01-22 17:39:37.003 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:39:38 compute-0 nova_compute[183075]: 2026-01-22 17:39:38.212 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:39 compute-0 nova_compute[183075]: 2026-01-22 17:39:39.700 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:39 compute-0 nova_compute[183075]: 2026-01-22 17:39:39.802 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:39:39 compute-0 nova_compute[183075]: 2026-01-22 17:39:39.803 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:39:39 compute-0 nova_compute[183075]: 2026-01-22 17:39:39.803 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:39:39 compute-0 nova_compute[183075]: 2026-01-22 17:39:39.803 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:39:40 compute-0 nova_compute[183075]: 2026-01-22 17:39:40.226 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-fb0d62d8-3d6f-4fa5-b342-612c69890cdf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:39:40 compute-0 nova_compute[183075]: 2026-01-22 17:39:40.226 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-fb0d62d8-3d6f-4fa5-b342-612c69890cdf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:39:40 compute-0 nova_compute[183075]: 2026-01-22 17:39:40.227 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:39:40 compute-0 nova_compute[183075]: 2026-01-22 17:39:40.227 183079 DEBUG nova.objects.instance [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lazy-loading 'info_cache' on Instance uuid fb0d62d8-3d6f-4fa5-b342-612c69890cdf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:39:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:41.957 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:39:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:41.958 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:39:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:41.958 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:39:42 compute-0 nova_compute[183075]: 2026-01-22 17:39:42.108 183079 INFO nova.compute.manager [None req-871ae37b-a8da-45e5-b2ad-a0a814743598 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Get console output
Jan 22 17:39:42 compute-0 nova_compute[183075]: 2026-01-22 17:39:42.113 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:39:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:42.652 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:39:42 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:42.653 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:39:42 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:39:42 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:39:42 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:39:42 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:39:42 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:39:42 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.22
Jan 22 17:39:42 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ae93f685-c847-412d-9bac-109715e96a73 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:39:42 compute-0 nova_compute[183075]: 2026-01-22 17:39:42.956 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Updating instance_info_cache with network_info: [{"id": "70a900f0-6d2e-40bb-92fa-e43967095d17", "address": "fa:16:3e:6d:f4:ff", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70a900f0-6d", "ovs_interfaceid": "70a900f0-6d2e-40bb-92fa-e43967095d17", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:39:42 compute-0 nova_compute[183075]: 2026-01-22 17:39:42.979 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-fb0d62d8-3d6f-4fa5-b342-612c69890cdf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:39:42 compute-0 nova_compute[183075]: 2026-01-22 17:39:42.980 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:39:42 compute-0 nova_compute[183075]: 2026-01-22 17:39:42.980 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:39:42 compute-0 nova_compute[183075]: 2026-01-22 17:39:42.981 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:39:42 compute-0 nova_compute[183075]: 2026-01-22 17:39:42.981 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.084 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.084 104990 INFO eventlet.wsgi.server [-] 10.100.0.22,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.4312086
Jan 22 17:39:43 compute-0 haproxy-metadata-proxy-ae93f685-c847-412d-9bac-109715e96a73[237059]: 10.100.0.22:52836 [22/Jan/2026:17:39:42.651] listener listener/metadata 0/0/0/432/432 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.093 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.094 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.22
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ae93f685-c847-412d-9bac-109715e96a73 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.114 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.115 104990 INFO eventlet.wsgi.server [-] 10.100.0.22,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0209599
Jan 22 17:39:43 compute-0 haproxy-metadata-proxy-ae93f685-c847-412d-9bac-109715e96a73[237059]: 10.100.0.22:52848 [22/Jan/2026:17:39:43.093] listener listener/metadata 0/0/0/22/22 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.120 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.121 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.22
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ae93f685-c847-412d-9bac-109715e96a73 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.136 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.136 104990 INFO eventlet.wsgi.server [-] 10.100.0.22,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0156157
Jan 22 17:39:43 compute-0 haproxy-metadata-proxy-ae93f685-c847-412d-9bac-109715e96a73[237059]: 10.100.0.22:52862 [22/Jan/2026:17:39:43.120] listener listener/metadata 0/0/0/16/16 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.145 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.146 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.22
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ae93f685-c847-412d-9bac-109715e96a73 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.164 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.164 104990 INFO eventlet.wsgi.server [-] 10.100.0.22,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0181448
Jan 22 17:39:43 compute-0 haproxy-metadata-proxy-ae93f685-c847-412d-9bac-109715e96a73[237059]: 10.100.0.22:52868 [22/Jan/2026:17:39:43.144] listener listener/metadata 0/0/0/20/20 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.170 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.171 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.22
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ae93f685-c847-412d-9bac-109715e96a73 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.186 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.187 104990 INFO eventlet.wsgi.server [-] 10.100.0.22,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0156977
Jan 22 17:39:43 compute-0 haproxy-metadata-proxy-ae93f685-c847-412d-9bac-109715e96a73[237059]: 10.100.0.22:52876 [22/Jan/2026:17:39:43.169] listener listener/metadata 0/0/0/17/17 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.193 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.194 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.22
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ae93f685-c847-412d-9bac-109715e96a73 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.208 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.209 104990 INFO eventlet.wsgi.server [-] 10.100.0.22,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0152993
Jan 22 17:39:43 compute-0 haproxy-metadata-proxy-ae93f685-c847-412d-9bac-109715e96a73[237059]: 10.100.0.22:52878 [22/Jan/2026:17:39:43.192] listener listener/metadata 0/0/0/16/16 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:39:43 compute-0 nova_compute[183075]: 2026-01-22 17:39:43.214 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.218 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.220 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.22
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ae93f685-c847-412d-9bac-109715e96a73 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.232 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.232 104990 INFO eventlet.wsgi.server [-] 10.100.0.22,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0129604
Jan 22 17:39:43 compute-0 haproxy-metadata-proxy-ae93f685-c847-412d-9bac-109715e96a73[237059]: 10.100.0.22:52888 [22/Jan/2026:17:39:43.218] listener listener/metadata 0/0/0/14/14 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.240 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.241 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.22
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ae93f685-c847-412d-9bac-109715e96a73 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.263 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:39:43 compute-0 haproxy-metadata-proxy-ae93f685-c847-412d-9bac-109715e96a73[237059]: 10.100.0.22:52896 [22/Jan/2026:17:39:43.239] listener listener/metadata 0/0/0/24/24 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.264 104990 INFO eventlet.wsgi.server [-] 10.100.0.22,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0227120
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.269 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.270 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.22
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ae93f685-c847-412d-9bac-109715e96a73 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.284 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.284 104990 INFO eventlet.wsgi.server [-] 10.100.0.22,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 165 time: 0.0142331
Jan 22 17:39:43 compute-0 haproxy-metadata-proxy-ae93f685-c847-412d-9bac-109715e96a73[237059]: 10.100.0.22:52906 [22/Jan/2026:17:39:43.269] listener listener/metadata 0/0/0/15/15 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.290 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.291 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.22
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ae93f685-c847-412d-9bac-109715e96a73 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.307 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.308 104990 INFO eventlet.wsgi.server [-] 10.100.0.22,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 165 time: 0.0167606
Jan 22 17:39:43 compute-0 haproxy-metadata-proxy-ae93f685-c847-412d-9bac-109715e96a73[237059]: 10.100.0.22:52914 [22/Jan/2026:17:39:43.290] listener listener/metadata 0/0/0/18/18 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.313 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.314 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.22
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ae93f685-c847-412d-9bac-109715e96a73 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:39:43 compute-0 haproxy-metadata-proxy-ae93f685-c847-412d-9bac-109715e96a73[237059]: 10.100.0.22:52926 [22/Jan/2026:17:39:43.313] listener listener/metadata 0/0/0/16/16 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.330 104990 INFO eventlet.wsgi.server [-] 10.100.0.22,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0157423
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.339 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.340 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.22
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ae93f685-c847-412d-9bac-109715e96a73 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.354 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.354 104990 INFO eventlet.wsgi.server [-] 10.100.0.22,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0146065
Jan 22 17:39:43 compute-0 haproxy-metadata-proxy-ae93f685-c847-412d-9bac-109715e96a73[237059]: 10.100.0.22:52936 [22/Jan/2026:17:39:43.338] listener listener/metadata 0/0/0/15/15 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.357 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.358 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.22
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ae93f685-c847-412d-9bac-109715e96a73 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.372 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.372 104990 INFO eventlet.wsgi.server [-] 10.100.0.22,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0140030
Jan 22 17:39:43 compute-0 haproxy-metadata-proxy-ae93f685-c847-412d-9bac-109715e96a73[237059]: 10.100.0.22:52952 [22/Jan/2026:17:39:43.357] listener listener/metadata 0/0/0/15/15 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.378 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.378 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.22
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ae93f685-c847-412d-9bac-109715e96a73 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.391 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:39:43 compute-0 haproxy-metadata-proxy-ae93f685-c847-412d-9bac-109715e96a73[237059]: 10.100.0.22:52962 [22/Jan/2026:17:39:43.377] listener listener/metadata 0/0/0/14/14 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.392 104990 INFO eventlet.wsgi.server [-] 10.100.0.22,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0132918
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.398 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.399 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.22
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ae93f685-c847-412d-9bac-109715e96a73 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.416 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.417 104990 INFO eventlet.wsgi.server [-] 10.100.0.22,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 165 time: 0.0176692
Jan 22 17:39:43 compute-0 haproxy-metadata-proxy-ae93f685-c847-412d-9bac-109715e96a73[237059]: 10.100.0.22:52976 [22/Jan/2026:17:39:43.397] listener listener/metadata 0/0/0/19/19 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.424 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.425 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.22
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: ae93f685-c847-412d-9bac-109715e96a73 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.444 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:39:43 compute-0 haproxy-metadata-proxy-ae93f685-c847-412d-9bac-109715e96a73[237059]: 10.100.0.22:52980 [22/Jan/2026:17:39:43.423] listener listener/metadata 0/0/0/21/21 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:39:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:39:43.445 104990 INFO eventlet.wsgi.server [-] 10.100.0.22,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0198874
Jan 22 17:39:43 compute-0 nova_compute[183075]: 2026-01-22 17:39:43.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:39:43 compute-0 nova_compute[183075]: 2026-01-22 17:39:43.824 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:39:43 compute-0 nova_compute[183075]: 2026-01-22 17:39:43.825 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:39:43 compute-0 nova_compute[183075]: 2026-01-22 17:39:43.825 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:39:43 compute-0 nova_compute[183075]: 2026-01-22 17:39:43.825 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:39:43 compute-0 nova_compute[183075]: 2026-01-22 17:39:43.890 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:39:43 compute-0 nova_compute[183075]: 2026-01-22 17:39:43.947 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:39:43 compute-0 nova_compute[183075]: 2026-01-22 17:39:43.948 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:39:44 compute-0 nova_compute[183075]: 2026-01-22 17:39:44.007 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:39:44 compute-0 nova_compute[183075]: 2026-01-22 17:39:44.013 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fb0d62d8-3d6f-4fa5-b342-612c69890cdf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:39:44 compute-0 nova_compute[183075]: 2026-01-22 17:39:44.071 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fb0d62d8-3d6f-4fa5-b342-612c69890cdf/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:39:44 compute-0 nova_compute[183075]: 2026-01-22 17:39:44.073 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fb0d62d8-3d6f-4fa5-b342-612c69890cdf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:39:44 compute-0 nova_compute[183075]: 2026-01-22 17:39:44.136 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fb0d62d8-3d6f-4fa5-b342-612c69890cdf/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:39:44 compute-0 nova_compute[183075]: 2026-01-22 17:39:44.305 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:39:44 compute-0 nova_compute[183075]: 2026-01-22 17:39:44.306 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5387MB free_disk=73.30478286743164GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:39:44 compute-0 nova_compute[183075]: 2026-01-22 17:39:44.306 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:39:44 compute-0 nova_compute[183075]: 2026-01-22 17:39:44.307 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:39:44 compute-0 nova_compute[183075]: 2026-01-22 17:39:44.382 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance fb0d62d8-3d6f-4fa5-b342-612c69890cdf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:39:44 compute-0 nova_compute[183075]: 2026-01-22 17:39:44.383 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:39:44 compute-0 nova_compute[183075]: 2026-01-22 17:39:44.383 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:39:44 compute-0 nova_compute[183075]: 2026-01-22 17:39:44.383 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:39:44 compute-0 nova_compute[183075]: 2026-01-22 17:39:44.577 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:39:44 compute-0 nova_compute[183075]: 2026-01-22 17:39:44.591 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:39:44 compute-0 nova_compute[183075]: 2026-01-22 17:39:44.610 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:39:44 compute-0 nova_compute[183075]: 2026-01-22 17:39:44.611 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.304s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:39:44 compute-0 nova_compute[183075]: 2026-01-22 17:39:44.703 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:45 compute-0 podman[237156]: 2026-01-22 17:39:45.360987638 +0000 UTC m=+0.048117699 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 22 17:39:45 compute-0 podman[237157]: 2026-01-22 17:39:45.361699187 +0000 UTC m=+0.061929173 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, version=9.6, release=1755695350, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, distribution-scope=public, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 22 17:39:45 compute-0 podman[237155]: 2026-01-22 17:39:45.400474359 +0000 UTC m=+0.109615880 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 17:39:47 compute-0 nova_compute[183075]: 2026-01-22 17:39:47.266 183079 INFO nova.compute.manager [None req-a54f5700-8478-4c91-af46-37565116256e 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Get console output
Jan 22 17:39:47 compute-0 nova_compute[183075]: 2026-01-22 17:39:47.271 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:39:48 compute-0 nova_compute[183075]: 2026-01-22 17:39:48.215 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:49 compute-0 nova_compute[183075]: 2026-01-22 17:39:49.706 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:51 compute-0 podman[237214]: 2026-01-22 17:39:51.354937648 +0000 UTC m=+0.057667060 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:39:52 compute-0 nova_compute[183075]: 2026-01-22 17:39:52.714 183079 INFO nova.compute.manager [None req-e6b92d10-97b9-4786-9e99-d84b0c5d9319 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Get console output
Jan 22 17:39:52 compute-0 nova_compute[183075]: 2026-01-22 17:39:52.718 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:39:53 compute-0 nova_compute[183075]: 2026-01-22 17:39:53.218 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:54 compute-0 nova_compute[183075]: 2026-01-22 17:39:54.708 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:55 compute-0 ovn_controller[95372]: 2026-01-22T17:39:55Z|00712|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.461 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6', 'name': 'tempest-server-test-164795055', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003e', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'hostId': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.464 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf', 'name': 'tempest-server-test-1655858819', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003d', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'hostId': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.464 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.487 12 DEBUG ceilometer.compute.pollsters [-] 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/disk.device.write.requests volume: 320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.508 12 DEBUG ceilometer.compute.pollsters [-] fb0d62d8-3d6f-4fa5-b342-612c69890cdf/disk.device.write.requests volume: 341 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6a849823-9371-4c2e-8806-be6518ac1272', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 320, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-vda', 'timestamp': '2026-01-22T17:39:55.465022', 'resource_metadata': {'display_name': 'tempest-server-test-164795055', 'name': 'instance-0000003e', 'instance_id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5d4df192-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.225480164, 'message_signature': '3cab798887d60abffe19a292c4ddc618e42f2e5d1a3a0be845009eeba85cf274'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 341, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 
'resource_id': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf-vda', 'timestamp': '2026-01-22T17:39:55.465022', 'resource_metadata': {'display_name': 'tempest-server-test-1655858819', 'name': 'instance-0000003d', 'instance_id': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5d510378-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.249295702, 'message_signature': '73e48faf6d3a25feab8a59c3c642bec7241b3ae180737594ee86ff3909561220'}]}, 'timestamp': '2026-01-22 17:39:55.508916', '_unique_id': 'ba08a96ab1e74b11b692d272e29fbe9a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.510 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.511 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.533 12 DEBUG ceilometer.compute.pollsters [-] 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/cpu volume: 11030000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.552 12 DEBUG ceilometer.compute.pollsters [-] fb0d62d8-3d6f-4fa5-b342-612c69890cdf/cpu volume: 11130000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9c135a90-f0c3-496d-a28f-34bc0deb31ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11030000000, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6', 'timestamp': '2026-01-22T17:39:55.511677', 'resource_metadata': {'display_name': 'tempest-server-test-164795055', 'name': 'instance-0000003e', 'instance_id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '5d54e4de-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.293896337, 'message_signature': 'dbb9a64ac99d761b75103ed0a7dec59377fda84f1b43580413950715dc4be2bc'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11130000000, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf', 
'timestamp': '2026-01-22T17:39:55.511677', 'resource_metadata': {'display_name': 'tempest-server-test-1655858819', 'name': 'instance-0000003d', 'instance_id': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '5d57ca14-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.312891358, 'message_signature': 'a047ab5257c57db123c6fc73ee4a51a913706b16542dff77e655a303e48454a3'}]}, 'timestamp': '2026-01-22 17:39:55.553220', '_unique_id': 'b631a5b3962f468284c34335fdd61046'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.554 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.555 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.555 12 DEBUG ceilometer.compute.pollsters [-] 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/memory.usage volume: 43.43359375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.555 12 DEBUG ceilometer.compute.pollsters [-] fb0d62d8-3d6f-4fa5-b342-612c69890cdf/memory.usage volume: 42.21484375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5d5069b-037e-4967-a9ce-89de34d35299', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.43359375, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6', 'timestamp': '2026-01-22T17:39:55.555073', 'resource_metadata': {'display_name': 'tempest-server-test-164795055', 'name': 'instance-0000003e', 'instance_id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '5d581f00-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.293896337, 'message_signature': 'b9949fd1e5549ee35f3c3f26d93a0610b7820c1e1471b2fa6a5da5f3a583adc9'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.21484375, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf', 'timestamp': 
'2026-01-22T17:39:55.555073', 'resource_metadata': {'display_name': 'tempest-server-test-1655858819', 'name': 'instance-0000003d', 'instance_id': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '5d5827ac-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.312891358, 'message_signature': '0ac303d1798f6e4cb7444bd52009bf755cab83dd0f863a51a94f4defaefe51e2'}]}, 'timestamp': '2026-01-22 17:39:55.555520', '_unique_id': '44bacfe935334496af458c127e65f512'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.556 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.558 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6 / tap0a29f7a3-3b inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.558 12 DEBUG ceilometer.compute.pollsters [-] 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.560 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for fb0d62d8-3d6f-4fa5-b342-612c69890cdf / tap70a900f0-6d inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 DEBUG ceilometer.compute.pollsters [-] fb0d62d8-3d6f-4fa5-b342-612c69890cdf/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a591ac3-6971-49dc-b4c8-5d9a6ee0995e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000003e-4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-tap0a29f7a3-3b', 'timestamp': '2026-01-22T17:39:55.556712', 'resource_metadata': {'display_name': 'tempest-server-test-164795055', 'name': 'tap0a29f7a3-3b', 'instance_id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fc:c4:c4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0a29f7a3-3b'}, 'message_id': '5d58b19a-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.317136269, 'message_signature': '0336f4c7f77cd13c8149ef1a607ebd1e7d8489417b173c6c04ddc132776fb195'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000003d-fb0d62d8-3d6f-4fa5-b342-612c69890cdf-tap70a900f0-6d', 'timestamp': '2026-01-22T17:39:55.556712', 'resource_metadata': {'display_name': 'tempest-server-test-1655858819', 'name': 'tap70a900f0-6d', 'instance_id': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6d:f4:ff', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap70a900f0-6d'}, 'message_id': '5d590ab4-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.31945269, 'message_signature': '8c103c53cafde5995546bb9e409febdaf36b4f3af05c0157247bd931dae9e22d'}]}, 'timestamp': '2026-01-22 17:39:55.561386', '_unique_id': '3ddce51935a74caf8d1746f1ceacd163'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.561 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.562 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.562 12 DEBUG ceilometer.compute.pollsters [-] 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/disk.device.read.requests volume: 1112 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.562 12 DEBUG ceilometer.compute.pollsters [-] fb0d62d8-3d6f-4fa5-b342-612c69890cdf/disk.device.read.requests volume: 1187 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e8e996c9-49b4-4335-94bf-0464453028e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1112, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-vda', 'timestamp': '2026-01-22T17:39:55.562687', 'resource_metadata': {'display_name': 'tempest-server-test-164795055', 'name': 'instance-0000003e', 'instance_id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5d59484e-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.225480164, 'message_signature': '57f371cc05c01cb988a0e39c9b2f2820f0ef69be819fd4be336a59d8a41a73fc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1187, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 
'resource_id': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf-vda', 'timestamp': '2026-01-22T17:39:55.562687', 'resource_metadata': {'display_name': 'tempest-server-test-1655858819', 'name': 'instance-0000003d', 'instance_id': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5d595028-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.249295702, 'message_signature': 'd7ab1760e51e4197f06b71de094683452f01a07d763e00ba5525e7de01da0926'}]}, 'timestamp': '2026-01-22 17:39:55.563106', '_unique_id': '218ea81607f9474bb6212128cac1bafa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.563 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.564 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.564 12 DEBUG ceilometer.compute.pollsters [-] 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/disk.device.write.latency volume: 3216961771 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.564 12 DEBUG ceilometer.compute.pollsters [-] fb0d62d8-3d6f-4fa5-b342-612c69890cdf/disk.device.write.latency volume: 4498474688 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7bea9155-76ea-4f18-8835-5373001c9368', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3216961771, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-vda', 'timestamp': '2026-01-22T17:39:55.564221', 'resource_metadata': {'display_name': 'tempest-server-test-164795055', 'name': 'instance-0000003e', 'instance_id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5d5983f4-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.225480164, 'message_signature': '7cc011b535bc1d11ff3f49ade39dcb54a0c39e0186f320f04c1467f241fcad84'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4498474688, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf-vda', 'timestamp': '2026-01-22T17:39:55.564221', 'resource_metadata': {'display_name': 'tempest-server-test-1655858819', 'name': 'instance-0000003d', 'instance_id': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5d598d9a-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.249295702, 'message_signature': '41879afcdb265d075e924770bb433c315b00f2691d6f0313ca8909d395aa5ba6'}]}, 'timestamp': '2026-01-22 17:39:55.564698', '_unique_id': 'a6560702bf704513903abf69c970d79b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.565 12 DEBUG ceilometer.compute.pollsters [-] 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/network.incoming.bytes volume: 7211 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.566 12 DEBUG ceilometer.compute.pollsters [-] fb0d62d8-3d6f-4fa5-b342-612c69890cdf/network.incoming.bytes volume: 7214 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd59f535d-aef2-40a2-80a2-ebab2d762752', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7211, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000003e-4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-tap0a29f7a3-3b', 'timestamp': '2026-01-22T17:39:55.565919', 'resource_metadata': {'display_name': 'tempest-server-test-164795055', 'name': 'tap0a29f7a3-3b', 'instance_id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fc:c4:c4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0a29f7a3-3b'}, 'message_id': '5d59c792-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.317136269, 'message_signature': '81a26198c58d93bd194bac67b61e4581fe0b2fe1b49771359f71a8f77195f19a'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7214, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000003d-fb0d62d8-3d6f-4fa5-b342-612c69890cdf-tap70a900f0-6d', 'timestamp': '2026-01-22T17:39:55.565919', 'resource_metadata': {'display_name': 'tempest-server-test-1655858819', 'name': 'tap70a900f0-6d', 'instance_id': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6d:f4:ff', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap70a900f0-6d'}, 'message_id': '5d59d2a0-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.31945269, 'message_signature': 'fd5c621c434ecd73a30befc81de1f31c4912f23b86c4c4de653a59a8b407d8aa'}]}, 'timestamp': '2026-01-22 17:39:55.566498', '_unique_id': '11d05388fb5d46cf9b356eaf45538277'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.567 12 DEBUG ceilometer.compute.pollsters [-] 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 DEBUG ceilometer.compute.pollsters [-] fb0d62d8-3d6f-4fa5-b342-612c69890cdf/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '88fcffaa-5508-4ce4-8368-ec1164d5c6e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000003e-4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-tap0a29f7a3-3b', 'timestamp': '2026-01-22T17:39:55.567922', 'resource_metadata': {'display_name': 'tempest-server-test-164795055', 'name': 'tap0a29f7a3-3b', 'instance_id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fc:c4:c4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0a29f7a3-3b'}, 'message_id': '5d5a14ae-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.317136269, 'message_signature': 'b57327e5e7674790f5514dfe1332f090b88466915427229a001c8b607131fdb8'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000003d-fb0d62d8-3d6f-4fa5-b342-612c69890cdf-tap70a900f0-6d', 'timestamp': '2026-01-22T17:39:55.567922', 'resource_metadata': {'display_name': 'tempest-server-test-1655858819', 'name': 'tap70a900f0-6d', 'instance_id': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6d:f4:ff', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap70a900f0-6d'}, 'message_id': '5d5a1cba-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.31945269, 'message_signature': '83f6cc150a3cb5cb7b83957e2869252378e82cf296cfe4f68f8913f6ea8b2105'}]}, 'timestamp': '2026-01-22 17:39:55.568351', '_unique_id': '32f3f0aade684490a136d255d529cccf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.568 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.569 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.569 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.569 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-164795055>, <NovaLikeServer: tempest-server-test-1655858819>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-164795055>, <NovaLikeServer: tempest-server-test-1655858819>]
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.569 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.575 12 DEBUG ceilometer.compute.pollsters [-] 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/disk.device.allocation volume: 30875648 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.581 12 DEBUG ceilometer.compute.pollsters [-] fb0d62d8-3d6f-4fa5-b342-612c69890cdf/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0340d93-1f81-4d0f-914f-40f930c57055', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30875648, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-vda', 'timestamp': '2026-01-22T17:39:55.569814', 'resource_metadata': {'display_name': 'tempest-server-test-164795055', 'name': 'instance-0000003e', 'instance_id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5d5b49a0-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.330242295, 'message_signature': 'a792248537e9a5ae0b15f1f7a4b50b7fb53d33b4a54e74c62fbdd20c7c3cb57a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 
'fb0d62d8-3d6f-4fa5-b342-612c69890cdf-vda', 'timestamp': '2026-01-22T17:39:55.569814', 'resource_metadata': {'display_name': 'tempest-server-test-1655858819', 'name': 'instance-0000003d', 'instance_id': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5d5c2708-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.336449618, 'message_signature': '777656b4258a776ade8851c46bfe056f550ccd0808c84f742e2547f3a6942ad8'}]}, 'timestamp': '2026-01-22 17:39:55.581772', '_unique_id': '27ea5fa7e03f443ca56e78b3f60617db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.582 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.583 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.583 12 DEBUG ceilometer.compute.pollsters [-] 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/disk.device.write.bytes volume: 72974336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.583 12 DEBUG ceilometer.compute.pollsters [-] fb0d62d8-3d6f-4fa5-b342-612c69890cdf/disk.device.write.bytes volume: 73216000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f11c273-e5b0-4f40-889b-5b07add6387f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72974336, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-vda', 'timestamp': '2026-01-22T17:39:55.583205', 'resource_metadata': {'display_name': 'tempest-server-test-164795055', 'name': 'instance-0000003e', 'instance_id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5d5c698e-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.225480164, 'message_signature': '135f2273504cd46e480a437a2cd624802b8436fe878e2fc7ff62bad9443a3cac'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73216000, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf-vda', 'timestamp': '2026-01-22T17:39:55.583205', 'resource_metadata': {'display_name': 'tempest-server-test-1655858819', 'name': 'instance-0000003d', 'instance_id': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5d5c71ea-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.249295702, 'message_signature': '55f9e507f86829385606e528518bea72fbc160beb262a410d3e0c99c25b84cd5'}]}, 'timestamp': '2026-01-22 17:39:55.583660', '_unique_id': '43a8a9142fea4697852a615ec59cb154'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.584 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 DEBUG ceilometer.compute.pollsters [-] 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 DEBUG ceilometer.compute.pollsters [-] fb0d62d8-3d6f-4fa5-b342-612c69890cdf/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd36d126e-36e2-46e4-88e0-36bb2dcb1f1e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-vda', 'timestamp': '2026-01-22T17:39:55.585014', 'resource_metadata': {'display_name': 'tempest-server-test-164795055', 'name': 'instance-0000003e', 'instance_id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5d5cb09c-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.330242295, 'message_signature': 'ea3bcc327242dbb94f2c3885eb5df6e034f720569c8ae740a4da1cebddc4799c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf-vda', 'timestamp': '2026-01-22T17:39:55.585014', 'resource_metadata': {'display_name': 'tempest-server-test-1655858819', 'name': 'instance-0000003d', 'instance_id': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5d5cb8d0-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.336449618, 'message_signature': 'e3acc0fc1e578493e36d586e5963752a31d3a32b750383f5842981447243a133'}]}, 'timestamp': '2026-01-22 17:39:55.585445', '_unique_id': 'e898c6fdde5c4280a971d61472cccf70'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.585 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.586 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.586 12 DEBUG ceilometer.compute.pollsters [-] 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.586 12 DEBUG ceilometer.compute.pollsters [-] fb0d62d8-3d6f-4fa5-b342-612c69890cdf/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e82249f7-b646-461d-952d-fa69435b6e8e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000003e-4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-tap0a29f7a3-3b', 'timestamp': '2026-01-22T17:39:55.586684', 'resource_metadata': {'display_name': 'tempest-server-test-164795055', 'name': 'tap0a29f7a3-3b', 'instance_id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fc:c4:c4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0a29f7a3-3b'}, 'message_id': '5d5cf1ce-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.317136269, 'message_signature': 'ef3d22736b3cbb99ecb0cc7a847763b41c6f0a8d7aaabf367329501b86095f6d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000003d-fb0d62d8-3d6f-4fa5-b342-612c69890cdf-tap70a900f0-6d', 'timestamp': '2026-01-22T17:39:55.586684', 'resource_metadata': {'display_name': 'tempest-server-test-1655858819', 'name': 'tap70a900f0-6d', 'instance_id': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6d:f4:ff', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap70a900f0-6d'}, 'message_id': '5d5cfa20-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.31945269, 'message_signature': '3a8c2a5fd5fcb7f2e3da1a17387fef5a1f648a05b46f350bcd249fcb3bbdc03a'}]}, 'timestamp': '2026-01-22 17:39:55.587145', '_unique_id': '440c16b7c5cc460f986843fec35c58bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.587 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.588 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.588 12 DEBUG ceilometer.compute.pollsters [-] 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.588 12 DEBUG ceilometer.compute.pollsters [-] fb0d62d8-3d6f-4fa5-b342-612c69890cdf/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c66c9bbe-239a-4019-8670-f23080976735', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000003e-4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-tap0a29f7a3-3b', 'timestamp': '2026-01-22T17:39:55.588322', 'resource_metadata': {'display_name': 'tempest-server-test-164795055', 'name': 'tap0a29f7a3-3b', 'instance_id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fc:c4:c4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0a29f7a3-3b'}, 'message_id': '5d5d3198-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.317136269, 'message_signature': 'af11ff5cd42c769b76e9eba1b6771925b970df7205880b9628f03bd53ea607a6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000003d-fb0d62d8-3d6f-4fa5-b342-612c69890cdf-tap70a900f0-6d', 'timestamp': '2026-01-22T17:39:55.588322', 'resource_metadata': {'display_name': 'tempest-server-test-1655858819', 'name': 'tap70a900f0-6d', 'instance_id': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6d:f4:ff', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap70a900f0-6d'}, 'message_id': '5d5d3a4e-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.31945269, 'message_signature': '31e6979c986ab2d289f74d8bb4e65306ece8ed1fd71afb04cd9f4943a3f9ab91'}]}, 'timestamp': '2026-01-22 17:39:55.588768', '_unique_id': '97fcd239c83d4272990c4db33f22fdfe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.589 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 DEBUG ceilometer.compute.pollsters [-] 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 DEBUG ceilometer.compute.pollsters [-] fb0d62d8-3d6f-4fa5-b342-612c69890cdf/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '180e3d8d-80f8-4282-a1a1-481eaf76897c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000003e-4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-tap0a29f7a3-3b', 'timestamp': '2026-01-22T17:39:55.590032', 'resource_metadata': {'display_name': 'tempest-server-test-164795055', 'name': 'tap0a29f7a3-3b', 'instance_id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fc:c4:c4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0a29f7a3-3b'}, 'message_id': '5d5d7450-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.317136269, 'message_signature': '41fcd4899caa731ad602f00507ccb0a584835f5240010ac040a6da5f8e93d37e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000003d-fb0d62d8-3d6f-4fa5-b342-612c69890cdf-tap70a900f0-6d', 'timestamp': '2026-01-22T17:39:55.590032', 'resource_metadata': {'display_name': 'tempest-server-test-1655858819', 'name': 'tap70a900f0-6d', 'instance_id': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6d:f4:ff', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap70a900f0-6d'}, 'message_id': '5d5d7cde-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.31945269, 'message_signature': '5e78faa4ab60ee5495f324a395801056c3ca2588bf71d2d3b5e95498f60cb8be'}]}, 'timestamp': '2026-01-22 17:39:55.590472', '_unique_id': 'f0bb04745bab49878e4951f98cabdfe6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.590 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.591 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.591 12 DEBUG ceilometer.compute.pollsters [-] 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.591 12 DEBUG ceilometer.compute.pollsters [-] fb0d62d8-3d6f-4fa5-b342-612c69890cdf/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fbc8926a-da65-42c6-8558-5ee330131e9e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000003e-4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-tap0a29f7a3-3b', 'timestamp': '2026-01-22T17:39:55.591560', 'resource_metadata': {'display_name': 'tempest-server-test-164795055', 'name': 'tap0a29f7a3-3b', 'instance_id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fc:c4:c4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0a29f7a3-3b'}, 'message_id': '5d5db0be-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.317136269, 'message_signature': 'df8098272329e25f87ee383531b4a9ed53b4804c9b85146791160deea03e5397'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000003d-fb0d62d8-3d6f-4fa5-b342-612c69890cdf-tap70a900f0-6d', 'timestamp': '2026-01-22T17:39:55.591560', 'resource_metadata': {'display_name': 'tempest-server-test-1655858819', 'name': 'tap70a900f0-6d', 'instance_id': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6d:f4:ff', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap70a900f0-6d'}, 'message_id': '5d5db8ca-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.31945269, 'message_signature': '7552437f2565ea39b874154c7d3e81a09aa653cd14a396cbe6d5f4e7f8d8cd63'}]}, 'timestamp': '2026-01-22 17:39:55.592004', '_unique_id': '98d14ae87f0e4dc984b36b4d07ed7e71'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.592 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.593 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.593 12 DEBUG ceilometer.compute.pollsters [-] 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/network.outgoing.packets volume: 123 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.593 12 DEBUG ceilometer.compute.pollsters [-] fb0d62d8-3d6f-4fa5-b342-612c69890cdf/network.outgoing.packets volume: 131 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c9df7f4-b8df-4e93-9195-01efc3c12953', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 123, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000003e-4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-tap0a29f7a3-3b', 'timestamp': '2026-01-22T17:39:55.593103', 'resource_metadata': {'display_name': 'tempest-server-test-164795055', 'name': 'tap0a29f7a3-3b', 'instance_id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fc:c4:c4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0a29f7a3-3b'}, 'message_id': '5d5dec28-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.317136269, 'message_signature': '8788a2b5e10fc6e068eb3531673965edb6156e3c51e3526d0aa796c80b72129f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 131, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000003d-fb0d62d8-3d6f-4fa5-b342-612c69890cdf-tap70a900f0-6d', 'timestamp': '2026-01-22T17:39:55.593103', 'resource_metadata': {'display_name': 'tempest-server-test-1655858819', 'name': 'tap70a900f0-6d', 'instance_id': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6d:f4:ff', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap70a900f0-6d'}, 'message_id': '5d5df59c-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.31945269, 'message_signature': '768e413085a39a8c1c96b59141855d42259ccf4f13479b0f6e3ea486650e2d8f'}]}, 'timestamp': '2026-01-22 17:39:55.593562', '_unique_id': '13026df22dec4141b95e9f2ec74dccd0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 DEBUG ceilometer.compute.pollsters [-] 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.594 12 DEBUG ceilometer.compute.pollsters [-] fb0d62d8-3d6f-4fa5-b342-612c69890cdf/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '540d748a-50b6-48d2-93b5-1c23475c4af0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-vda', 'timestamp': '2026-01-22T17:39:55.594709', 'resource_metadata': {'display_name': 'tempest-server-test-164795055', 'name': 'instance-0000003e', 'instance_id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5d5e2ae4-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.330242295, 'message_signature': '8ac2ee9d1b4956418d268e772d9071cc4ef5632ea6f987102afee2e91771957e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf-vda', 'timestamp': '2026-01-22T17:39:55.594709', 'resource_metadata': {'display_name': 'tempest-server-test-1655858819', 'name': 'instance-0000003d', 'instance_id': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5d5e3282-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.336449618, 'message_signature': 'b40d1ef90a90aaa44876600da117305ad508c2cee3a29c61ac4d68faedab6684'}]}, 'timestamp': '2026-01-22 17:39:55.595114', '_unique_id': '0c154dff65e9473e979fc09d4d653606'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.595 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.596 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.596 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.596 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-164795055>, <NovaLikeServer: tempest-server-test-1655858819>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-164795055>, <NovaLikeServer: tempest-server-test-1655858819>]
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.596 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.596 12 DEBUG ceilometer.compute.pollsters [-] 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/disk.device.read.bytes volume: 30046720 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.596 12 DEBUG ceilometer.compute.pollsters [-] fb0d62d8-3d6f-4fa5-b342-612c69890cdf/disk.device.read.bytes volume: 31963648 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '034ea041-9b57-4177-b5ff-4ac0265c30a2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30046720, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-vda', 'timestamp': '2026-01-22T17:39:55.596519', 'resource_metadata': {'display_name': 'tempest-server-test-164795055', 'name': 'instance-0000003e', 'instance_id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5d5e724c-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.225480164, 'message_signature': 'b6100007bb09a930c0fa475044a32cd3c744585a811a42074c1644b616b38fe9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31963648, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf-vda', 'timestamp': '2026-01-22T17:39:55.596519', 'resource_metadata': {'display_name': 'tempest-server-test-1655858819', 'name': 'instance-0000003d', 'instance_id': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5d5e7a62-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.249295702, 'message_signature': '680a460c11940ff3b3893a4dbf881b5753a8c9a58db075a0b9c7d94f9a088eb5'}]}, 'timestamp': '2026-01-22 17:39:55.596953', '_unique_id': 'a4cb079d54e24efd889e1c277d3f3668'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.597 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 DEBUG ceilometer.compute.pollsters [-] 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/disk.device.read.latency volume: 190711893 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 DEBUG ceilometer.compute.pollsters [-] fb0d62d8-3d6f-4fa5-b342-612c69890cdf/disk.device.read.latency volume: 226154414 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '74ad442f-9616-429d-ac67-dce0acc15b20', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 190711893, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-vda', 'timestamp': '2026-01-22T17:39:55.598036', 'resource_metadata': {'display_name': 'tempest-server-test-164795055', 'name': 'instance-0000003e', 'instance_id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5d5eacbc-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.225480164, 'message_signature': '2faac6a94dbd25cd49c27a198dea862548e5bd747ca843fb95aba5713268b652'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 226154414, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf-vda', 'timestamp': '2026-01-22T17:39:55.598036', 'resource_metadata': {'display_name': 'tempest-server-test-1655858819', 'name': 'instance-0000003d', 'instance_id': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5d5eb4be-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.249295702, 'message_signature': '09f68ddbbaaabb39feaebb4da8d0209d132f2604fc03d72f2bb1302c678fa46a'}]}, 'timestamp': '2026-01-22 17:39:55.598446', '_unique_id': '98fc94444bc34f20835c980154d96389'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.598 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.599 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.599 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.599 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-164795055>, <NovaLikeServer: tempest-server-test-1655858819>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-164795055>, <NovaLikeServer: tempest-server-test-1655858819>]
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.599 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.599 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.599 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-164795055>, <NovaLikeServer: tempest-server-test-1655858819>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-164795055>, <NovaLikeServer: tempest-server-test-1655858819>]
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 DEBUG ceilometer.compute.pollsters [-] 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/network.outgoing.bytes volume: 10851 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 DEBUG ceilometer.compute.pollsters [-] fb0d62d8-3d6f-4fa5-b342-612c69890cdf/network.outgoing.bytes volume: 11596 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '52a40d15-8043-4a96-9698-aef5f4bc5009', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10851, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000003e-4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-tap0a29f7a3-3b', 'timestamp': '2026-01-22T17:39:55.600080', 'resource_metadata': {'display_name': 'tempest-server-test-164795055', 'name': 'tap0a29f7a3-3b', 'instance_id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fc:c4:c4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0a29f7a3-3b'}, 'message_id': '5d5efcbc-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.317136269, 'message_signature': '940ec80b3553a455bf75407fb7b1c96ff42d54b8e46eebcfa79762daff3eb0f9'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11596, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000003d-fb0d62d8-3d6f-4fa5-b342-612c69890cdf-tap70a900f0-6d', 'timestamp': '2026-01-22T17:39:55.600080', 'resource_metadata': {'display_name': 'tempest-server-test-1655858819', 'name': 'tap70a900f0-6d', 'instance_id': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6d:f4:ff', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap70a900f0-6d'}, 'message_id': '5d5f04fa-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.31945269, 'message_signature': 'e4a9bf865041947af911d2d4d7e92ec25c54d7239c6452500f179090a1ba0762'}]}, 'timestamp': '2026-01-22 17:39:55.600508', '_unique_id': '37a583e562d94e999f6989d98964da9b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.600 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.601 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.601 12 DEBUG ceilometer.compute.pollsters [-] 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.601 12 DEBUG ceilometer.compute.pollsters [-] fb0d62d8-3d6f-4fa5-b342-612c69890cdf/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '304ef285-8a4a-43d6-b65a-c47bf261267c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000003e-4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-tap0a29f7a3-3b', 'timestamp': '2026-01-22T17:39:55.601588', 'resource_metadata': {'display_name': 'tempest-server-test-164795055', 'name': 'tap0a29f7a3-3b', 'instance_id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fc:c4:c4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0a29f7a3-3b'}, 'message_id': '5d5f3876-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.317136269, 'message_signature': 'a746d0969aa1addc337cfdb41cd4301c3cd7bb2655117ddc40f969431273ddd0'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000003d-fb0d62d8-3d6f-4fa5-b342-612c69890cdf-tap70a900f0-6d', 'timestamp': '2026-01-22T17:39:55.601588', 'resource_metadata': {'display_name': 'tempest-server-test-1655858819', 'name': 'tap70a900f0-6d', 'instance_id': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6d:f4:ff', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap70a900f0-6d'}, 'message_id': '5d5f4244-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 5959.31945269, 'message_signature': '238de4a3163d5da7bfb705d61972f99e55410652438a28b866655801f3f38672'}]}, 'timestamp': '2026-01-22 17:39:55.602107', '_unique_id': '5f22f071a93844bdb2ca9f2e67ff3e25'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:39:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:39:55.602 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:39:55 compute-0 nova_compute[183075]: 2026-01-22 17:39:55.606 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:39:55 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 17:39:55 compute-0 nova_compute[183075]: 2026-01-22 17:39:55.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:39:55 compute-0 nova_compute[183075]: 2026-01-22 17:39:55.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 17:39:55 compute-0 nova_compute[183075]: 2026-01-22 17:39:55.805 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 17:39:57 compute-0 nova_compute[183075]: 2026-01-22 17:39:57.833 183079 INFO nova.compute.manager [None req-a18170b4-5311-4ff0-a47a-69a267b0a068 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Get console output
Jan 22 17:39:57 compute-0 nova_compute[183075]: 2026-01-22 17:39:57.837 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:39:58 compute-0 nova_compute[183075]: 2026-01-22 17:39:58.219 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:39:59 compute-0 nova_compute[183075]: 2026-01-22 17:39:59.712 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:02 compute-0 podman[237236]: 2026-01-22 17:40:02.372097931 +0000 UTC m=+0.067668693 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 17:40:03 compute-0 nova_compute[183075]: 2026-01-22 17:40:03.089 183079 INFO nova.compute.manager [None req-b5e5f338-d18f-40f4-a1c0-c4ce61ec156e 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Get console output
Jan 22 17:40:03 compute-0 nova_compute[183075]: 2026-01-22 17:40:03.093 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:40:03 compute-0 nova_compute[183075]: 2026-01-22 17:40:03.222 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:04 compute-0 nova_compute[183075]: 2026-01-22 17:40:04.713 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:06 compute-0 podman[237260]: 2026-01-22 17:40:06.335567903 +0000 UTC m=+0.042392658 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 17:40:08 compute-0 nova_compute[183075]: 2026-01-22 17:40:08.223 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:08 compute-0 nova_compute[183075]: 2026-01-22 17:40:08.251 183079 INFO nova.compute.manager [None req-6751bad4-496f-4ebe-b634-e3f53d317104 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Get console output
Jan 22 17:40:08 compute-0 nova_compute[183075]: 2026-01-22 17:40:08.257 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:40:08 compute-0 nova_compute[183075]: 2026-01-22 17:40:08.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:40:09 compute-0 nova_compute[183075]: 2026-01-22 17:40:09.714 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:11 compute-0 nova_compute[183075]: 2026-01-22 17:40:11.773 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:40:12 compute-0 nova_compute[183075]: 2026-01-22 17:40:12.071 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Triggering sync for uuid fb0d62d8-3d6f-4fa5-b342-612c69890cdf _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 22 17:40:12 compute-0 nova_compute[183075]: 2026-01-22 17:40:12.071 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Triggering sync for uuid 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 22 17:40:12 compute-0 nova_compute[183075]: 2026-01-22 17:40:12.072 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "fb0d62d8-3d6f-4fa5-b342-612c69890cdf" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:40:12 compute-0 nova_compute[183075]: 2026-01-22 17:40:12.072 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "fb0d62d8-3d6f-4fa5-b342-612c69890cdf" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:40:12 compute-0 nova_compute[183075]: 2026-01-22 17:40:12.072 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:40:12 compute-0 nova_compute[183075]: 2026-01-22 17:40:12.073 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:40:12 compute-0 nova_compute[183075]: 2026-01-22 17:40:12.169 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "fb0d62d8-3d6f-4fa5-b342-612c69890cdf" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:40:12 compute-0 nova_compute[183075]: 2026-01-22 17:40:12.221 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:40:13 compute-0 nova_compute[183075]: 2026-01-22 17:40:13.226 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:13 compute-0 nova_compute[183075]: 2026-01-22 17:40:13.612 183079 INFO nova.compute.manager [None req-593c9550-2f1f-4c73-b82d-f6e307ad9047 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Get console output
Jan 22 17:40:13 compute-0 nova_compute[183075]: 2026-01-22 17:40:13.619 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:40:14 compute-0 nova_compute[183075]: 2026-01-22 17:40:14.715 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:16 compute-0 podman[237285]: 2026-01-22 17:40:16.347582044 +0000 UTC m=+0.051238121 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 22 17:40:16 compute-0 podman[237286]: 2026-01-22 17:40:16.355916934 +0000 UTC m=+0.056884860 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, name=ubi9-minimal, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, version=9.6, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, config_id=openstack_network_exporter, io.openshift.expose-services=)
Jan 22 17:40:16 compute-0 podman[237284]: 2026-01-22 17:40:16.387345522 +0000 UTC m=+0.090244119 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 17:40:18 compute-0 nova_compute[183075]: 2026-01-22 17:40:18.227 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:18 compute-0 nova_compute[183075]: 2026-01-22 17:40:18.760 183079 INFO nova.compute.manager [None req-56be47c6-788b-44f2-bcf4-b13742ef398e 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Get console output
Jan 22 17:40:18 compute-0 nova_compute[183075]: 2026-01-22 17:40:18.764 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:40:19 compute-0 nova_compute[183075]: 2026-01-22 17:40:19.718 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:20 compute-0 nova_compute[183075]: 2026-01-22 17:40:20.767 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:20.767 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=46, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=45) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:40:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:20.768 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:40:20 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:20.769 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '46'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:40:21 compute-0 nova_compute[183075]: 2026-01-22 17:40:21.553 183079 DEBUG nova.compute.manager [req-052194fd-30a8-4c49-b8d0-a48eaa25a922 req-48215971-7310-479b-9332-d500c4dd10c2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Received event network-changed-0a29f7a3-3b06-4447-a07b-2c171a583ec7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:40:21 compute-0 nova_compute[183075]: 2026-01-22 17:40:21.554 183079 DEBUG nova.compute.manager [req-052194fd-30a8-4c49-b8d0-a48eaa25a922 req-48215971-7310-479b-9332-d500c4dd10c2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Refreshing instance network info cache due to event network-changed-0a29f7a3-3b06-4447-a07b-2c171a583ec7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:40:21 compute-0 nova_compute[183075]: 2026-01-22 17:40:21.554 183079 DEBUG oslo_concurrency.lockutils [req-052194fd-30a8-4c49-b8d0-a48eaa25a922 req-48215971-7310-479b-9332-d500c4dd10c2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:40:21 compute-0 nova_compute[183075]: 2026-01-22 17:40:21.554 183079 DEBUG oslo_concurrency.lockutils [req-052194fd-30a8-4c49-b8d0-a48eaa25a922 req-48215971-7310-479b-9332-d500c4dd10c2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:40:21 compute-0 nova_compute[183075]: 2026-01-22 17:40:21.554 183079 DEBUG nova.network.neutron [req-052194fd-30a8-4c49-b8d0-a48eaa25a922 req-48215971-7310-479b-9332-d500c4dd10c2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Refreshing network info cache for port 0a29f7a3-3b06-4447-a07b-2c171a583ec7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:40:22 compute-0 podman[237348]: 2026-01-22 17:40:22.337049036 +0000 UTC m=+0.049041863 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 17:40:22 compute-0 nova_compute[183075]: 2026-01-22 17:40:22.792 183079 DEBUG nova.network.neutron [req-052194fd-30a8-4c49-b8d0-a48eaa25a922 req-48215971-7310-479b-9332-d500c4dd10c2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Updated VIF entry in instance network info cache for port 0a29f7a3-3b06-4447-a07b-2c171a583ec7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:40:22 compute-0 nova_compute[183075]: 2026-01-22 17:40:22.792 183079 DEBUG nova.network.neutron [req-052194fd-30a8-4c49-b8d0-a48eaa25a922 req-48215971-7310-479b-9332-d500c4dd10c2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Updating instance_info_cache with network_info: [{"id": "0a29f7a3-3b06-4447-a07b-2c171a583ec7", "address": "fa:16:3e:fc:c4:c4", "network": {"id": "ae93f685-c847-412d-9bac-109715e96a73", "bridge": "br-int", "label": "tempest-test-network--1900519672", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a29f7a3-3b", "ovs_interfaceid": "0a29f7a3-3b06-4447-a07b-2c171a583ec7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:40:22 compute-0 nova_compute[183075]: 2026-01-22 17:40:22.810 183079 DEBUG oslo_concurrency.lockutils [req-052194fd-30a8-4c49-b8d0-a48eaa25a922 req-48215971-7310-479b-9332-d500c4dd10c2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:40:23 compute-0 nova_compute[183075]: 2026-01-22 17:40:23.229 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:24 compute-0 nova_compute[183075]: 2026-01-22 17:40:24.720 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.092 183079 DEBUG oslo_concurrency.lockutils [None req-18cc4ac6-f828-40d0-98b0-fb86a6e44df9 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.093 183079 DEBUG oslo_concurrency.lockutils [None req-18cc4ac6-f828-40d0-98b0-fb86a6e44df9 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.093 183079 DEBUG oslo_concurrency.lockutils [None req-18cc4ac6-f828-40d0-98b0-fb86a6e44df9 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.094 183079 DEBUG oslo_concurrency.lockutils [None req-18cc4ac6-f828-40d0-98b0-fb86a6e44df9 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.094 183079 DEBUG oslo_concurrency.lockutils [None req-18cc4ac6-f828-40d0-98b0-fb86a6e44df9 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.096 183079 INFO nova.compute.manager [None req-18cc4ac6-f828-40d0-98b0-fb86a6e44df9 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Terminating instance
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.097 183079 DEBUG nova.compute.manager [None req-18cc4ac6-f828-40d0-98b0-fb86a6e44df9 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:40:26 compute-0 kernel: tap0a29f7a3-3b (unregistering): left promiscuous mode
Jan 22 17:40:26 compute-0 NetworkManager[55454]: <info>  [1769103626.2210] device (tap0a29f7a3-3b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.232 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:26 compute-0 ovn_controller[95372]: 2026-01-22T17:40:26Z|00713|binding|INFO|Releasing lport 0a29f7a3-3b06-4447-a07b-2c171a583ec7 from this chassis (sb_readonly=0)
Jan 22 17:40:26 compute-0 ovn_controller[95372]: 2026-01-22T17:40:26Z|00714|binding|INFO|Setting lport 0a29f7a3-3b06-4447-a07b-2c171a583ec7 down in Southbound
Jan 22 17:40:26 compute-0 ovn_controller[95372]: 2026-01-22T17:40:26Z|00715|binding|INFO|Removing iface tap0a29f7a3-3b ovn-installed in OVS
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.240 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:26.242 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fc:c4:c4 10.100.0.22'], port_security=['fa:16:3e:fc:c4:c4 10.100.0.22'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.22/28', 'neutron:device_id': '4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ae93f685-c847-412d-9bac-109715e96a73', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9bfec5de-80e1-4e5c-80ad-8dfa1bbb473c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.250'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9f3a5ba1-b177-4437-8147-21864312aeb4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=0a29f7a3-3b06-4447-a07b-2c171a583ec7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:40:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:26.244 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 0a29f7a3-3b06-4447-a07b-2c171a583ec7 in datapath ae93f685-c847-412d-9bac-109715e96a73 unbound from our chassis
Jan 22 17:40:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:26.245 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ae93f685-c847-412d-9bac-109715e96a73, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.247 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:26.247 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[82b18aa7-f560-44ef-bd97-ec1e62f85b38]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:40:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:26.248 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ae93f685-c847-412d-9bac-109715e96a73 namespace which is not needed anymore
Jan 22 17:40:26 compute-0 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d0000003e.scope: Deactivated successfully.
Jan 22 17:40:26 compute-0 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d0000003e.scope: Consumed 14.276s CPU time.
Jan 22 17:40:26 compute-0 systemd-machined[154382]: Machine qemu-62-instance-0000003e terminated.
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.363 183079 INFO nova.virt.libvirt.driver [-] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Instance destroyed successfully.
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.365 183079 DEBUG nova.objects.instance [None req-18cc4ac6-f828-40d0-98b0-fb86a6e44df9 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'resources' on Instance uuid 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.379 183079 DEBUG nova.virt.libvirt.vif [None req-18cc4ac6-f828-40d0-98b0-fb86a6e44df9 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:39:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-164795055',display_name='tempest-server-test-164795055',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-164795055',id=62,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:39:25Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-pt57t9fa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_i
nput_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:39:25Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0a29f7a3-3b06-4447-a07b-2c171a583ec7", "address": "fa:16:3e:fc:c4:c4", "network": {"id": "ae93f685-c847-412d-9bac-109715e96a73", "bridge": "br-int", "label": "tempest-test-network--1900519672", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a29f7a3-3b", "ovs_interfaceid": "0a29f7a3-3b06-4447-a07b-2c171a583ec7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.380 183079 DEBUG nova.network.os_vif_util [None req-18cc4ac6-f828-40d0-98b0-fb86a6e44df9 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "0a29f7a3-3b06-4447-a07b-2c171a583ec7", "address": "fa:16:3e:fc:c4:c4", "network": {"id": "ae93f685-c847-412d-9bac-109715e96a73", "bridge": "br-int", "label": "tempest-test-network--1900519672", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a29f7a3-3b", "ovs_interfaceid": "0a29f7a3-3b06-4447-a07b-2c171a583ec7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.381 183079 DEBUG nova.network.os_vif_util [None req-18cc4ac6-f828-40d0-98b0-fb86a6e44df9 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fc:c4:c4,bridge_name='br-int',has_traffic_filtering=True,id=0a29f7a3-3b06-4447-a07b-2c171a583ec7,network=Network(ae93f685-c847-412d-9bac-109715e96a73),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a29f7a3-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.381 183079 DEBUG os_vif [None req-18cc4ac6-f828-40d0-98b0-fb86a6e44df9 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fc:c4:c4,bridge_name='br-int',has_traffic_filtering=True,id=0a29f7a3-3b06-4447-a07b-2c171a583ec7,network=Network(ae93f685-c847-412d-9bac-109715e96a73),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a29f7a3-3b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.383 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.384 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a29f7a3-3b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.386 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.387 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.390 183079 INFO os_vif [None req-18cc4ac6-f828-40d0-98b0-fb86a6e44df9 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fc:c4:c4,bridge_name='br-int',has_traffic_filtering=True,id=0a29f7a3-3b06-4447-a07b-2c171a583ec7,network=Network(ae93f685-c847-412d-9bac-109715e96a73),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a29f7a3-3b')
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.390 183079 INFO nova.virt.libvirt.driver [None req-18cc4ac6-f828-40d0-98b0-fb86a6e44df9 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Deleting instance files /var/lib/nova/instances/4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6_del
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.391 183079 INFO nova.virt.libvirt.driver [None req-18cc4ac6-f828-40d0-98b0-fb86a6e44df9 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Deletion of /var/lib/nova/instances/4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6_del complete
Jan 22 17:40:26 compute-0 neutron-haproxy-ovnmeta-ae93f685-c847-412d-9bac-109715e96a73[237053]: [NOTICE]   (237057) : haproxy version is 2.8.14-c23fe91
Jan 22 17:40:26 compute-0 neutron-haproxy-ovnmeta-ae93f685-c847-412d-9bac-109715e96a73[237053]: [NOTICE]   (237057) : path to executable is /usr/sbin/haproxy
Jan 22 17:40:26 compute-0 neutron-haproxy-ovnmeta-ae93f685-c847-412d-9bac-109715e96a73[237053]: [WARNING]  (237057) : Exiting Master process...
Jan 22 17:40:26 compute-0 neutron-haproxy-ovnmeta-ae93f685-c847-412d-9bac-109715e96a73[237053]: [ALERT]    (237057) : Current worker (237059) exited with code 143 (Terminated)
Jan 22 17:40:26 compute-0 neutron-haproxy-ovnmeta-ae93f685-c847-412d-9bac-109715e96a73[237053]: [WARNING]  (237057) : All workers exited. Exiting... (0)
Jan 22 17:40:26 compute-0 systemd[1]: libpod-858e51a8dd9878d5748e3618089328b625db7905c7b0008d633c0f752189e1ea.scope: Deactivated successfully.
Jan 22 17:40:26 compute-0 conmon[237053]: conmon 858e51a8dd9878d5748e <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-858e51a8dd9878d5748e3618089328b625db7905c7b0008d633c0f752189e1ea.scope/container/memory.events
Jan 22 17:40:26 compute-0 podman[237402]: 2026-01-22 17:40:26.406538551 +0000 UTC m=+0.050737628 container died 858e51a8dd9878d5748e3618089328b625db7905c7b0008d633c0f752189e1ea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ae93f685-c847-412d-9bac-109715e96a73, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.423 183079 DEBUG nova.compute.manager [req-614a8021-4efe-4f2d-af0c-3e44b951e2c1 req-722e46cb-653e-4b29-ab9d-654d5a74384b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Received event network-vif-unplugged-0a29f7a3-3b06-4447-a07b-2c171a583ec7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.423 183079 DEBUG oslo_concurrency.lockutils [req-614a8021-4efe-4f2d-af0c-3e44b951e2c1 req-722e46cb-653e-4b29-ab9d-654d5a74384b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.424 183079 DEBUG oslo_concurrency.lockutils [req-614a8021-4efe-4f2d-af0c-3e44b951e2c1 req-722e46cb-653e-4b29-ab9d-654d5a74384b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.424 183079 DEBUG oslo_concurrency.lockutils [req-614a8021-4efe-4f2d-af0c-3e44b951e2c1 req-722e46cb-653e-4b29-ab9d-654d5a74384b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.424 183079 DEBUG nova.compute.manager [req-614a8021-4efe-4f2d-af0c-3e44b951e2c1 req-722e46cb-653e-4b29-ab9d-654d5a74384b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] No waiting events found dispatching network-vif-unplugged-0a29f7a3-3b06-4447-a07b-2c171a583ec7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.425 183079 DEBUG nova.compute.manager [req-614a8021-4efe-4f2d-af0c-3e44b951e2c1 req-722e46cb-653e-4b29-ab9d-654d5a74384b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Received event network-vif-unplugged-0a29f7a3-3b06-4447-a07b-2c171a583ec7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:40:26 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-858e51a8dd9878d5748e3618089328b625db7905c7b0008d633c0f752189e1ea-userdata-shm.mount: Deactivated successfully.
Jan 22 17:40:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-5b9aba0adf66bcff2bed9feace7800fa5d2e47643a5aa3963938a4cd993335cf-merged.mount: Deactivated successfully.
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.444 183079 INFO nova.compute.manager [None req-18cc4ac6-f828-40d0-98b0-fb86a6e44df9 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Took 0.35 seconds to destroy the instance on the hypervisor.
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.446 183079 DEBUG oslo.service.loopingcall [None req-18cc4ac6-f828-40d0-98b0-fb86a6e44df9 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.446 183079 DEBUG nova.compute.manager [-] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.446 183079 DEBUG nova.network.neutron [-] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:40:26 compute-0 podman[237402]: 2026-01-22 17:40:26.452137433 +0000 UTC m=+0.096336500 container cleanup 858e51a8dd9878d5748e3618089328b625db7905c7b0008d633c0f752189e1ea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ae93f685-c847-412d-9bac-109715e96a73, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 17:40:26 compute-0 systemd[1]: libpod-conmon-858e51a8dd9878d5748e3618089328b625db7905c7b0008d633c0f752189e1ea.scope: Deactivated successfully.
Jan 22 17:40:26 compute-0 podman[237437]: 2026-01-22 17:40:26.513183901 +0000 UTC m=+0.040719444 container remove 858e51a8dd9878d5748e3618089328b625db7905c7b0008d633c0f752189e1ea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ae93f685-c847-412d-9bac-109715e96a73, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:40:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:26.518 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[cb503e9a-e366-47d3-aea8-af1581db3d66]: (4, ('Thu Jan 22 05:40:26 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ae93f685-c847-412d-9bac-109715e96a73 (858e51a8dd9878d5748e3618089328b625db7905c7b0008d633c0f752189e1ea)\n858e51a8dd9878d5748e3618089328b625db7905c7b0008d633c0f752189e1ea\nThu Jan 22 05:40:26 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ae93f685-c847-412d-9bac-109715e96a73 (858e51a8dd9878d5748e3618089328b625db7905c7b0008d633c0f752189e1ea)\n858e51a8dd9878d5748e3618089328b625db7905c7b0008d633c0f752189e1ea\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:40:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:26.520 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3abb5ce8-6df2-45c5-ad04-7949b1219e6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:40:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:26.521 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapae93f685-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.522 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:26 compute-0 kernel: tapae93f685-c0: left promiscuous mode
Jan 22 17:40:26 compute-0 nova_compute[183075]: 2026-01-22 17:40:26.534 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:26.537 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2b316c5c-be41-4634-8cbf-71666c6a1ebc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:40:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:26.558 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e7ef7a89-b3d9-4d66-9949-6432421f81f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:40:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:26.558 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e1a6c0aa-0a7c-4767-8333-99fd1438a094]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:40:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:26.574 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[9a2a058e-3a69-4e61-bd11-cf766ab3b001]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 592856, 'reachable_time': 24003, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237453, 'error': None, 'target': 'ovnmeta-ae93f685-c847-412d-9bac-109715e96a73', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:40:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:26.576 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ae93f685-c847-412d-9bac-109715e96a73 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:40:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:26.576 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[ea863b39-e303-45a2-a3a1-010fc1f561db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:40:26 compute-0 systemd[1]: run-netns-ovnmeta\x2dae93f685\x2dc847\x2d412d\x2d9bac\x2d109715e96a73.mount: Deactivated successfully.
Jan 22 17:40:27 compute-0 nova_compute[183075]: 2026-01-22 17:40:27.087 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:40:27 compute-0 nova_compute[183075]: 2026-01-22 17:40:27.165 183079 DEBUG nova.network.neutron [-] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:40:27 compute-0 nova_compute[183075]: 2026-01-22 17:40:27.183 183079 INFO nova.compute.manager [-] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Took 0.74 seconds to deallocate network for instance.
Jan 22 17:40:27 compute-0 nova_compute[183075]: 2026-01-22 17:40:27.230 183079 DEBUG oslo_concurrency.lockutils [None req-18cc4ac6-f828-40d0-98b0-fb86a6e44df9 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:40:27 compute-0 nova_compute[183075]: 2026-01-22 17:40:27.231 183079 DEBUG oslo_concurrency.lockutils [None req-18cc4ac6-f828-40d0-98b0-fb86a6e44df9 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:40:27 compute-0 nova_compute[183075]: 2026-01-22 17:40:27.282 183079 DEBUG nova.compute.manager [req-26dbd672-1ce5-4f11-8066-ce8b403893a1 req-dffd8cb5-0b1f-4167-a860-3c989f84ed05 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Received event network-vif-deleted-0a29f7a3-3b06-4447-a07b-2c171a583ec7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:40:27 compute-0 nova_compute[183075]: 2026-01-22 17:40:27.312 183079 DEBUG nova.compute.provider_tree [None req-18cc4ac6-f828-40d0-98b0-fb86a6e44df9 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:40:27 compute-0 nova_compute[183075]: 2026-01-22 17:40:27.328 183079 DEBUG nova.scheduler.client.report [None req-18cc4ac6-f828-40d0-98b0-fb86a6e44df9 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:40:27 compute-0 nova_compute[183075]: 2026-01-22 17:40:27.353 183079 DEBUG oslo_concurrency.lockutils [None req-18cc4ac6-f828-40d0-98b0-fb86a6e44df9 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:40:27 compute-0 nova_compute[183075]: 2026-01-22 17:40:27.379 183079 INFO nova.scheduler.client.report [None req-18cc4ac6-f828-40d0-98b0-fb86a6e44df9 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Deleted allocations for instance 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6
Jan 22 17:40:27 compute-0 nova_compute[183075]: 2026-01-22 17:40:27.430 183079 DEBUG oslo_concurrency.lockutils [None req-18cc4ac6-f828-40d0-98b0-fb86a6e44df9 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.337s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:40:27 compute-0 nova_compute[183075]: 2026-01-22 17:40:27.789 183079 DEBUG oslo_concurrency.lockutils [None req-807db415-62c5-4f4e-b1e5-7d4332e8e692 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "fb0d62d8-3d6f-4fa5-b342-612c69890cdf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:40:27 compute-0 nova_compute[183075]: 2026-01-22 17:40:27.790 183079 DEBUG oslo_concurrency.lockutils [None req-807db415-62c5-4f4e-b1e5-7d4332e8e692 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "fb0d62d8-3d6f-4fa5-b342-612c69890cdf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:40:27 compute-0 nova_compute[183075]: 2026-01-22 17:40:27.790 183079 DEBUG oslo_concurrency.lockutils [None req-807db415-62c5-4f4e-b1e5-7d4332e8e692 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "fb0d62d8-3d6f-4fa5-b342-612c69890cdf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:40:27 compute-0 nova_compute[183075]: 2026-01-22 17:40:27.790 183079 DEBUG oslo_concurrency.lockutils [None req-807db415-62c5-4f4e-b1e5-7d4332e8e692 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "fb0d62d8-3d6f-4fa5-b342-612c69890cdf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:40:27 compute-0 nova_compute[183075]: 2026-01-22 17:40:27.791 183079 DEBUG oslo_concurrency.lockutils [None req-807db415-62c5-4f4e-b1e5-7d4332e8e692 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "fb0d62d8-3d6f-4fa5-b342-612c69890cdf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:40:27 compute-0 nova_compute[183075]: 2026-01-22 17:40:27.792 183079 INFO nova.compute.manager [None req-807db415-62c5-4f4e-b1e5-7d4332e8e692 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Terminating instance
Jan 22 17:40:27 compute-0 nova_compute[183075]: 2026-01-22 17:40:27.793 183079 DEBUG nova.compute.manager [None req-807db415-62c5-4f4e-b1e5-7d4332e8e692 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:40:27 compute-0 kernel: tap70a900f0-6d (unregistering): left promiscuous mode
Jan 22 17:40:27 compute-0 NetworkManager[55454]: <info>  [1769103627.8198] device (tap70a900f0-6d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:40:27 compute-0 nova_compute[183075]: 2026-01-22 17:40:27.823 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:27 compute-0 ovn_controller[95372]: 2026-01-22T17:40:27Z|00716|binding|INFO|Releasing lport 70a900f0-6d2e-40bb-92fa-e43967095d17 from this chassis (sb_readonly=0)
Jan 22 17:40:27 compute-0 ovn_controller[95372]: 2026-01-22T17:40:27Z|00717|binding|INFO|Setting lport 70a900f0-6d2e-40bb-92fa-e43967095d17 down in Southbound
Jan 22 17:40:27 compute-0 ovn_controller[95372]: 2026-01-22T17:40:27Z|00718|binding|INFO|Removing iface tap70a900f0-6d ovn-installed in OVS
Jan 22 17:40:27 compute-0 nova_compute[183075]: 2026-01-22 17:40:27.825 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:27 compute-0 nova_compute[183075]: 2026-01-22 17:40:27.840 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:27 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:27.843 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:f4:ff 10.100.0.12'], port_security=['fa:16:3e:6d:f4:ff 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'fb0d62d8-3d6f-4fa5-b342-612c69890cdf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8dfc7f6d-f2b1-4fa9-a099-2dffcb456eca', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.181'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=70a900f0-6d2e-40bb-92fa-e43967095d17) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:40:27 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:27.844 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 70a900f0-6d2e-40bb-92fa-e43967095d17 in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 unbound from our chassis
Jan 22 17:40:27 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:27.845 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:40:27 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:27.846 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[533e6b67-a111-40f1-9052-bb25a8500d25]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:40:27 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:27.846 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 namespace which is not needed anymore
Jan 22 17:40:27 compute-0 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d0000003d.scope: Deactivated successfully.
Jan 22 17:40:27 compute-0 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d0000003d.scope: Consumed 16.679s CPU time.
Jan 22 17:40:27 compute-0 systemd-machined[154382]: Machine qemu-61-instance-0000003d terminated.
Jan 22 17:40:27 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[236627]: [NOTICE]   (236631) : haproxy version is 2.8.14-c23fe91
Jan 22 17:40:27 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[236627]: [NOTICE]   (236631) : path to executable is /usr/sbin/haproxy
Jan 22 17:40:27 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[236627]: [WARNING]  (236631) : Exiting Master process...
Jan 22 17:40:27 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[236627]: [WARNING]  (236631) : Exiting Master process...
Jan 22 17:40:27 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[236627]: [ALERT]    (236631) : Current worker (236633) exited with code 143 (Terminated)
Jan 22 17:40:27 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[236627]: [WARNING]  (236631) : All workers exited. Exiting... (0)
Jan 22 17:40:27 compute-0 systemd[1]: libpod-9290a8f058abc3dd1bf10919c643bd60ba83c50276b0be3d32ffb89040c85bca.scope: Deactivated successfully.
Jan 22 17:40:27 compute-0 podman[237476]: 2026-01-22 17:40:27.97619086 +0000 UTC m=+0.048159350 container died 9290a8f058abc3dd1bf10919c643bd60ba83c50276b0be3d32ffb89040c85bca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 17:40:28 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9290a8f058abc3dd1bf10919c643bd60ba83c50276b0be3d32ffb89040c85bca-userdata-shm.mount: Deactivated successfully.
Jan 22 17:40:28 compute-0 NetworkManager[55454]: <info>  [1769103628.0110] manager: (tap70a900f0-6d): new Tun device (/org/freedesktop/NetworkManager/Devices/290)
Jan 22 17:40:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-3068633b8c20abed57d49122ccf32bc731bab3032fe29cdb7373c5cae0e15b7c-merged.mount: Deactivated successfully.
Jan 22 17:40:28 compute-0 podman[237476]: 2026-01-22 17:40:28.01796167 +0000 UTC m=+0.089930160 container cleanup 9290a8f058abc3dd1bf10919c643bd60ba83c50276b0be3d32ffb89040c85bca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 22 17:40:28 compute-0 systemd[1]: libpod-conmon-9290a8f058abc3dd1bf10919c643bd60ba83c50276b0be3d32ffb89040c85bca.scope: Deactivated successfully.
Jan 22 17:40:28 compute-0 nova_compute[183075]: 2026-01-22 17:40:28.049 183079 INFO nova.virt.libvirt.driver [-] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Instance destroyed successfully.
Jan 22 17:40:28 compute-0 nova_compute[183075]: 2026-01-22 17:40:28.049 183079 DEBUG nova.objects.instance [None req-807db415-62c5-4f4e-b1e5-7d4332e8e692 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'resources' on Instance uuid fb0d62d8-3d6f-4fa5-b342-612c69890cdf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:40:28 compute-0 nova_compute[183075]: 2026-01-22 17:40:28.062 183079 DEBUG nova.virt.libvirt.vif [None req-807db415-62c5-4f4e-b1e5-7d4332e8e692 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:38:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1655858819',display_name='tempest-server-test-1655858819',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1655858819',id=61,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:38:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-oqt0h1xd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_h
w_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:38:20Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=fb0d62d8-3d6f-4fa5-b342-612c69890cdf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "70a900f0-6d2e-40bb-92fa-e43967095d17", "address": "fa:16:3e:6d:f4:ff", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70a900f0-6d", "ovs_interfaceid": "70a900f0-6d2e-40bb-92fa-e43967095d17", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:40:28 compute-0 nova_compute[183075]: 2026-01-22 17:40:28.062 183079 DEBUG nova.network.os_vif_util [None req-807db415-62c5-4f4e-b1e5-7d4332e8e692 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "70a900f0-6d2e-40bb-92fa-e43967095d17", "address": "fa:16:3e:6d:f4:ff", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70a900f0-6d", "ovs_interfaceid": "70a900f0-6d2e-40bb-92fa-e43967095d17", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:40:28 compute-0 nova_compute[183075]: 2026-01-22 17:40:28.063 183079 DEBUG nova.network.os_vif_util [None req-807db415-62c5-4f4e-b1e5-7d4332e8e692 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6d:f4:ff,bridge_name='br-int',has_traffic_filtering=True,id=70a900f0-6d2e-40bb-92fa-e43967095d17,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70a900f0-6d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:40:28 compute-0 nova_compute[183075]: 2026-01-22 17:40:28.063 183079 DEBUG os_vif [None req-807db415-62c5-4f4e-b1e5-7d4332e8e692 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6d:f4:ff,bridge_name='br-int',has_traffic_filtering=True,id=70a900f0-6d2e-40bb-92fa-e43967095d17,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70a900f0-6d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:40:28 compute-0 nova_compute[183075]: 2026-01-22 17:40:28.065 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:28 compute-0 nova_compute[183075]: 2026-01-22 17:40:28.065 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap70a900f0-6d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:40:28 compute-0 nova_compute[183075]: 2026-01-22 17:40:28.066 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:28 compute-0 nova_compute[183075]: 2026-01-22 17:40:28.068 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:28 compute-0 nova_compute[183075]: 2026-01-22 17:40:28.070 183079 INFO os_vif [None req-807db415-62c5-4f4e-b1e5-7d4332e8e692 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6d:f4:ff,bridge_name='br-int',has_traffic_filtering=True,id=70a900f0-6d2e-40bb-92fa-e43967095d17,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70a900f0-6d')
Jan 22 17:40:28 compute-0 nova_compute[183075]: 2026-01-22 17:40:28.071 183079 INFO nova.virt.libvirt.driver [None req-807db415-62c5-4f4e-b1e5-7d4332e8e692 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Deleting instance files /var/lib/nova/instances/fb0d62d8-3d6f-4fa5-b342-612c69890cdf_del
Jan 22 17:40:28 compute-0 nova_compute[183075]: 2026-01-22 17:40:28.071 183079 INFO nova.virt.libvirt.driver [None req-807db415-62c5-4f4e-b1e5-7d4332e8e692 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Deletion of /var/lib/nova/instances/fb0d62d8-3d6f-4fa5-b342-612c69890cdf_del complete
Jan 22 17:40:28 compute-0 podman[237515]: 2026-01-22 17:40:28.08359831 +0000 UTC m=+0.044318819 container remove 9290a8f058abc3dd1bf10919c643bd60ba83c50276b0be3d32ffb89040c85bca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:40:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:28.088 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[805fce3e-e19c-4c43-b671-6a72615f1c2b]: (4, ('Thu Jan 22 05:40:27 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 (9290a8f058abc3dd1bf10919c643bd60ba83c50276b0be3d32ffb89040c85bca)\n9290a8f058abc3dd1bf10919c643bd60ba83c50276b0be3d32ffb89040c85bca\nThu Jan 22 05:40:28 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 (9290a8f058abc3dd1bf10919c643bd60ba83c50276b0be3d32ffb89040c85bca)\n9290a8f058abc3dd1bf10919c643bd60ba83c50276b0be3d32ffb89040c85bca\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:40:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:28.090 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[330ff971-8f77-4acf-9c89-f95f80de2261]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:40:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:28.091 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:40:28 compute-0 nova_compute[183075]: 2026-01-22 17:40:28.092 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:28 compute-0 kernel: tap88ed9213-70: left promiscuous mode
Jan 22 17:40:28 compute-0 nova_compute[183075]: 2026-01-22 17:40:28.103 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:28.105 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[502aba61-6816-4823-9537-09a2c41f8edb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:40:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:28.118 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ceb8ef51-a65d-4103-b908-4967d18b1a7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:40:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:28.119 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f2dbc028-8f4a-4f84-84a3-874b12847199]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:40:28 compute-0 nova_compute[183075]: 2026-01-22 17:40:28.132 183079 INFO nova.compute.manager [None req-807db415-62c5-4f4e-b1e5-7d4332e8e692 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Took 0.34 seconds to destroy the instance on the hypervisor.
Jan 22 17:40:28 compute-0 nova_compute[183075]: 2026-01-22 17:40:28.133 183079 DEBUG oslo.service.loopingcall [None req-807db415-62c5-4f4e-b1e5-7d4332e8e692 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:40:28 compute-0 nova_compute[183075]: 2026-01-22 17:40:28.133 183079 DEBUG nova.compute.manager [-] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:40:28 compute-0 nova_compute[183075]: 2026-01-22 17:40:28.133 183079 DEBUG nova.network.neutron [-] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:40:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:28.132 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ca9630f5-d0b1-4dee-8bba-12b6bcd92cd2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 586370, 'reachable_time': 36903, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237535, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:40:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:28.134 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:40:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:28.134 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[d4443005-089d-40a1-990b-7adac19c3916]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:40:28 compute-0 systemd[1]: run-netns-ovnmeta\x2d88ed9213\x2d7d48\x2d4c42\x2dad60\x2dce3fcf104f71.mount: Deactivated successfully.
Jan 22 17:40:28 compute-0 nova_compute[183075]: 2026-01-22 17:40:28.232 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:28 compute-0 nova_compute[183075]: 2026-01-22 17:40:28.535 183079 DEBUG nova.compute.manager [req-e10d8950-7def-4acd-8921-c44100cd27a4 req-9aa1a8a1-1bf7-4f23-b172-fd3d17c90592 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Received event network-vif-plugged-0a29f7a3-3b06-4447-a07b-2c171a583ec7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:40:28 compute-0 nova_compute[183075]: 2026-01-22 17:40:28.536 183079 DEBUG oslo_concurrency.lockutils [req-e10d8950-7def-4acd-8921-c44100cd27a4 req-9aa1a8a1-1bf7-4f23-b172-fd3d17c90592 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:40:28 compute-0 nova_compute[183075]: 2026-01-22 17:40:28.536 183079 DEBUG oslo_concurrency.lockutils [req-e10d8950-7def-4acd-8921-c44100cd27a4 req-9aa1a8a1-1bf7-4f23-b172-fd3d17c90592 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:40:28 compute-0 nova_compute[183075]: 2026-01-22 17:40:28.536 183079 DEBUG oslo_concurrency.lockutils [req-e10d8950-7def-4acd-8921-c44100cd27a4 req-9aa1a8a1-1bf7-4f23-b172-fd3d17c90592 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:40:28 compute-0 nova_compute[183075]: 2026-01-22 17:40:28.536 183079 DEBUG nova.compute.manager [req-e10d8950-7def-4acd-8921-c44100cd27a4 req-9aa1a8a1-1bf7-4f23-b172-fd3d17c90592 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] No waiting events found dispatching network-vif-plugged-0a29f7a3-3b06-4447-a07b-2c171a583ec7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:40:28 compute-0 nova_compute[183075]: 2026-01-22 17:40:28.537 183079 WARNING nova.compute.manager [req-e10d8950-7def-4acd-8921-c44100cd27a4 req-9aa1a8a1-1bf7-4f23-b172-fd3d17c90592 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Received unexpected event network-vif-plugged-0a29f7a3-3b06-4447-a07b-2c171a583ec7 for instance with vm_state deleted and task_state None.
Jan 22 17:40:29 compute-0 nova_compute[183075]: 2026-01-22 17:40:29.106 183079 DEBUG nova.network.neutron [-] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:40:29 compute-0 nova_compute[183075]: 2026-01-22 17:40:29.125 183079 INFO nova.compute.manager [-] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Took 0.99 seconds to deallocate network for instance.
Jan 22 17:40:29 compute-0 nova_compute[183075]: 2026-01-22 17:40:29.171 183079 DEBUG oslo_concurrency.lockutils [None req-807db415-62c5-4f4e-b1e5-7d4332e8e692 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:40:29 compute-0 nova_compute[183075]: 2026-01-22 17:40:29.171 183079 DEBUG oslo_concurrency.lockutils [None req-807db415-62c5-4f4e-b1e5-7d4332e8e692 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:40:29 compute-0 nova_compute[183075]: 2026-01-22 17:40:29.213 183079 DEBUG nova.compute.provider_tree [None req-807db415-62c5-4f4e-b1e5-7d4332e8e692 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:40:29 compute-0 nova_compute[183075]: 2026-01-22 17:40:29.228 183079 DEBUG nova.scheduler.client.report [None req-807db415-62c5-4f4e-b1e5-7d4332e8e692 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:40:29 compute-0 nova_compute[183075]: 2026-01-22 17:40:29.249 183079 DEBUG oslo_concurrency.lockutils [None req-807db415-62c5-4f4e-b1e5-7d4332e8e692 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.078s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:40:29 compute-0 nova_compute[183075]: 2026-01-22 17:40:29.275 183079 INFO nova.scheduler.client.report [None req-807db415-62c5-4f4e-b1e5-7d4332e8e692 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Deleted allocations for instance fb0d62d8-3d6f-4fa5-b342-612c69890cdf
Jan 22 17:40:29 compute-0 nova_compute[183075]: 2026-01-22 17:40:29.339 183079 DEBUG oslo_concurrency.lockutils [None req-807db415-62c5-4f4e-b1e5-7d4332e8e692 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "fb0d62d8-3d6f-4fa5-b342-612c69890cdf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.549s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:40:29 compute-0 nova_compute[183075]: 2026-01-22 17:40:29.387 183079 DEBUG nova.compute.manager [req-241189af-532f-4482-be84-d52f25de09af req-fe56c0a6-e686-4d72-853c-3d231fd0f052 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Received event network-vif-unplugged-70a900f0-6d2e-40bb-92fa-e43967095d17 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:40:29 compute-0 nova_compute[183075]: 2026-01-22 17:40:29.387 183079 DEBUG oslo_concurrency.lockutils [req-241189af-532f-4482-be84-d52f25de09af req-fe56c0a6-e686-4d72-853c-3d231fd0f052 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "fb0d62d8-3d6f-4fa5-b342-612c69890cdf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:40:29 compute-0 nova_compute[183075]: 2026-01-22 17:40:29.387 183079 DEBUG oslo_concurrency.lockutils [req-241189af-532f-4482-be84-d52f25de09af req-fe56c0a6-e686-4d72-853c-3d231fd0f052 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "fb0d62d8-3d6f-4fa5-b342-612c69890cdf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:40:29 compute-0 nova_compute[183075]: 2026-01-22 17:40:29.387 183079 DEBUG oslo_concurrency.lockutils [req-241189af-532f-4482-be84-d52f25de09af req-fe56c0a6-e686-4d72-853c-3d231fd0f052 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "fb0d62d8-3d6f-4fa5-b342-612c69890cdf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:40:29 compute-0 nova_compute[183075]: 2026-01-22 17:40:29.388 183079 DEBUG nova.compute.manager [req-241189af-532f-4482-be84-d52f25de09af req-fe56c0a6-e686-4d72-853c-3d231fd0f052 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] No waiting events found dispatching network-vif-unplugged-70a900f0-6d2e-40bb-92fa-e43967095d17 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:40:29 compute-0 nova_compute[183075]: 2026-01-22 17:40:29.388 183079 WARNING nova.compute.manager [req-241189af-532f-4482-be84-d52f25de09af req-fe56c0a6-e686-4d72-853c-3d231fd0f052 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Received unexpected event network-vif-unplugged-70a900f0-6d2e-40bb-92fa-e43967095d17 for instance with vm_state deleted and task_state None.
Jan 22 17:40:29 compute-0 nova_compute[183075]: 2026-01-22 17:40:29.388 183079 DEBUG nova.compute.manager [req-241189af-532f-4482-be84-d52f25de09af req-fe56c0a6-e686-4d72-853c-3d231fd0f052 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Received event network-vif-plugged-70a900f0-6d2e-40bb-92fa-e43967095d17 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:40:29 compute-0 nova_compute[183075]: 2026-01-22 17:40:29.388 183079 DEBUG oslo_concurrency.lockutils [req-241189af-532f-4482-be84-d52f25de09af req-fe56c0a6-e686-4d72-853c-3d231fd0f052 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "fb0d62d8-3d6f-4fa5-b342-612c69890cdf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:40:29 compute-0 nova_compute[183075]: 2026-01-22 17:40:29.389 183079 DEBUG oslo_concurrency.lockutils [req-241189af-532f-4482-be84-d52f25de09af req-fe56c0a6-e686-4d72-853c-3d231fd0f052 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "fb0d62d8-3d6f-4fa5-b342-612c69890cdf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:40:29 compute-0 nova_compute[183075]: 2026-01-22 17:40:29.389 183079 DEBUG oslo_concurrency.lockutils [req-241189af-532f-4482-be84-d52f25de09af req-fe56c0a6-e686-4d72-853c-3d231fd0f052 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "fb0d62d8-3d6f-4fa5-b342-612c69890cdf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:40:29 compute-0 nova_compute[183075]: 2026-01-22 17:40:29.389 183079 DEBUG nova.compute.manager [req-241189af-532f-4482-be84-d52f25de09af req-fe56c0a6-e686-4d72-853c-3d231fd0f052 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] No waiting events found dispatching network-vif-plugged-70a900f0-6d2e-40bb-92fa-e43967095d17 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:40:29 compute-0 nova_compute[183075]: 2026-01-22 17:40:29.389 183079 WARNING nova.compute.manager [req-241189af-532f-4482-be84-d52f25de09af req-fe56c0a6-e686-4d72-853c-3d231fd0f052 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Received unexpected event network-vif-plugged-70a900f0-6d2e-40bb-92fa-e43967095d17 for instance with vm_state deleted and task_state None.
Jan 22 17:40:29 compute-0 nova_compute[183075]: 2026-01-22 17:40:29.389 183079 DEBUG nova.compute.manager [req-241189af-532f-4482-be84-d52f25de09af req-fe56c0a6-e686-4d72-853c-3d231fd0f052 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Received event network-vif-deleted-70a900f0-6d2e-40bb-92fa-e43967095d17 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:40:29 compute-0 nova_compute[183075]: 2026-01-22 17:40:29.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:40:32 compute-0 nova_compute[183075]: 2026-01-22 17:40:32.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:40:33 compute-0 nova_compute[183075]: 2026-01-22 17:40:33.067 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:33 compute-0 nova_compute[183075]: 2026-01-22 17:40:33.233 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:33 compute-0 nova_compute[183075]: 2026-01-22 17:40:33.350 183079 DEBUG oslo_concurrency.lockutils [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "5bc95cf8-db79-4c62-95f8-ab8f0dbabd44" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:40:33 compute-0 nova_compute[183075]: 2026-01-22 17:40:33.350 183079 DEBUG oslo_concurrency.lockutils [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "5bc95cf8-db79-4c62-95f8-ab8f0dbabd44" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:40:33 compute-0 podman[237536]: 2026-01-22 17:40:33.350931975 +0000 UTC m=+0.057527986 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:40:33 compute-0 nova_compute[183075]: 2026-01-22 17:40:33.366 183079 DEBUG nova.compute.manager [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:40:33 compute-0 nova_compute[183075]: 2026-01-22 17:40:33.425 183079 DEBUG oslo_concurrency.lockutils [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:40:33 compute-0 nova_compute[183075]: 2026-01-22 17:40:33.425 183079 DEBUG oslo_concurrency.lockutils [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:40:33 compute-0 nova_compute[183075]: 2026-01-22 17:40:33.434 183079 DEBUG nova.virt.hardware [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:40:33 compute-0 nova_compute[183075]: 2026-01-22 17:40:33.435 183079 INFO nova.compute.claims [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:40:33 compute-0 nova_compute[183075]: 2026-01-22 17:40:33.547 183079 DEBUG nova.compute.provider_tree [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:40:33 compute-0 nova_compute[183075]: 2026-01-22 17:40:33.560 183079 DEBUG nova.scheduler.client.report [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:40:33 compute-0 nova_compute[183075]: 2026-01-22 17:40:33.578 183079 DEBUG oslo_concurrency.lockutils [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:40:33 compute-0 nova_compute[183075]: 2026-01-22 17:40:33.578 183079 DEBUG nova.compute.manager [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:40:33 compute-0 nova_compute[183075]: 2026-01-22 17:40:33.621 183079 DEBUG nova.compute.manager [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:40:33 compute-0 nova_compute[183075]: 2026-01-22 17:40:33.621 183079 DEBUG nova.network.neutron [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:40:33 compute-0 nova_compute[183075]: 2026-01-22 17:40:33.638 183079 INFO nova.virt.libvirt.driver [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:40:33 compute-0 nova_compute[183075]: 2026-01-22 17:40:33.652 183079 DEBUG nova.compute.manager [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:40:33 compute-0 nova_compute[183075]: 2026-01-22 17:40:33.741 183079 DEBUG nova.compute.manager [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:40:33 compute-0 nova_compute[183075]: 2026-01-22 17:40:33.742 183079 DEBUG nova.virt.libvirt.driver [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:40:33 compute-0 nova_compute[183075]: 2026-01-22 17:40:33.743 183079 INFO nova.virt.libvirt.driver [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Creating image(s)
Jan 22 17:40:33 compute-0 nova_compute[183075]: 2026-01-22 17:40:33.743 183079 DEBUG oslo_concurrency.lockutils [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "/var/lib/nova/instances/5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:40:33 compute-0 nova_compute[183075]: 2026-01-22 17:40:33.744 183079 DEBUG oslo_concurrency.lockutils [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:40:33 compute-0 nova_compute[183075]: 2026-01-22 17:40:33.744 183079 DEBUG oslo_concurrency.lockutils [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:40:33 compute-0 nova_compute[183075]: 2026-01-22 17:40:33.757 183079 DEBUG oslo_concurrency.processutils [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:40:33 compute-0 nova_compute[183075]: 2026-01-22 17:40:33.812 183079 DEBUG oslo_concurrency.processutils [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:40:33 compute-0 nova_compute[183075]: 2026-01-22 17:40:33.813 183079 DEBUG oslo_concurrency.lockutils [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:40:33 compute-0 nova_compute[183075]: 2026-01-22 17:40:33.814 183079 DEBUG oslo_concurrency.lockutils [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:40:33 compute-0 nova_compute[183075]: 2026-01-22 17:40:33.827 183079 DEBUG oslo_concurrency.processutils [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:40:33 compute-0 nova_compute[183075]: 2026-01-22 17:40:33.917 183079 DEBUG oslo_concurrency.processutils [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:40:33 compute-0 nova_compute[183075]: 2026-01-22 17:40:33.918 183079 DEBUG oslo_concurrency.processutils [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:40:33 compute-0 nova_compute[183075]: 2026-01-22 17:40:33.956 183079 DEBUG oslo_concurrency.processutils [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:40:33 compute-0 nova_compute[183075]: 2026-01-22 17:40:33.957 183079 DEBUG oslo_concurrency.lockutils [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:40:33 compute-0 nova_compute[183075]: 2026-01-22 17:40:33.957 183079 DEBUG oslo_concurrency.processutils [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:40:34 compute-0 nova_compute[183075]: 2026-01-22 17:40:34.033 183079 DEBUG oslo_concurrency.processutils [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:40:34 compute-0 nova_compute[183075]: 2026-01-22 17:40:34.034 183079 DEBUG nova.virt.disk.api [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Checking if we can resize image /var/lib/nova/instances/5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:40:34 compute-0 nova_compute[183075]: 2026-01-22 17:40:34.034 183079 DEBUG oslo_concurrency.processutils [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:40:34 compute-0 nova_compute[183075]: 2026-01-22 17:40:34.095 183079 DEBUG oslo_concurrency.processutils [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:40:34 compute-0 nova_compute[183075]: 2026-01-22 17:40:34.097 183079 DEBUG nova.virt.disk.api [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Cannot resize image /var/lib/nova/instances/5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:40:34 compute-0 nova_compute[183075]: 2026-01-22 17:40:34.097 183079 DEBUG nova.objects.instance [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'migration_context' on Instance uuid 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:40:34 compute-0 nova_compute[183075]: 2026-01-22 17:40:34.109 183079 DEBUG nova.virt.libvirt.driver [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:40:34 compute-0 nova_compute[183075]: 2026-01-22 17:40:34.109 183079 DEBUG nova.virt.libvirt.driver [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Ensure instance console log exists: /var/lib/nova/instances/5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:40:34 compute-0 nova_compute[183075]: 2026-01-22 17:40:34.110 183079 DEBUG oslo_concurrency.lockutils [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:40:34 compute-0 nova_compute[183075]: 2026-01-22 17:40:34.110 183079 DEBUG oslo_concurrency.lockutils [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:40:34 compute-0 nova_compute[183075]: 2026-01-22 17:40:34.110 183079 DEBUG oslo_concurrency.lockutils [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:40:34 compute-0 nova_compute[183075]: 2026-01-22 17:40:34.364 183079 DEBUG nova.policy [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:40:34 compute-0 nova_compute[183075]: 2026-01-22 17:40:34.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:40:35 compute-0 nova_compute[183075]: 2026-01-22 17:40:35.878 183079 DEBUG nova.network.neutron [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Successfully created port: 5b604d92-0d68-405f-b4e2-6a3f72fbabad _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:40:37 compute-0 podman[237575]: 2026-01-22 17:40:37.347143713 +0000 UTC m=+0.056701690 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:40:37 compute-0 nova_compute[183075]: 2026-01-22 17:40:37.378 183079 DEBUG nova.network.neutron [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Successfully updated port: 5b604d92-0d68-405f-b4e2-6a3f72fbabad _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:40:37 compute-0 nova_compute[183075]: 2026-01-22 17:40:37.393 183079 DEBUG oslo_concurrency.lockutils [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "refresh_cache-5bc95cf8-db79-4c62-95f8-ab8f0dbabd44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:40:37 compute-0 nova_compute[183075]: 2026-01-22 17:40:37.393 183079 DEBUG oslo_concurrency.lockutils [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquired lock "refresh_cache-5bc95cf8-db79-4c62-95f8-ab8f0dbabd44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:40:37 compute-0 nova_compute[183075]: 2026-01-22 17:40:37.393 183079 DEBUG nova.network.neutron [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:40:37 compute-0 nova_compute[183075]: 2026-01-22 17:40:37.461 183079 DEBUG nova.compute.manager [req-fff00852-ea1f-4be1-bdbc-41adf6badca5 req-66e7601c-45b5-4245-afac-3ad9a5282cf8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Received event network-changed-5b604d92-0d68-405f-b4e2-6a3f72fbabad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:40:37 compute-0 nova_compute[183075]: 2026-01-22 17:40:37.461 183079 DEBUG nova.compute.manager [req-fff00852-ea1f-4be1-bdbc-41adf6badca5 req-66e7601c-45b5-4245-afac-3ad9a5282cf8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Refreshing instance network info cache due to event network-changed-5b604d92-0d68-405f-b4e2-6a3f72fbabad. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:40:37 compute-0 nova_compute[183075]: 2026-01-22 17:40:37.462 183079 DEBUG oslo_concurrency.lockutils [req-fff00852-ea1f-4be1-bdbc-41adf6badca5 req-66e7601c-45b5-4245-afac-3ad9a5282cf8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-5bc95cf8-db79-4c62-95f8-ab8f0dbabd44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:40:37 compute-0 nova_compute[183075]: 2026-01-22 17:40:37.514 183079 DEBUG nova.network.neutron [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:40:38 compute-0 nova_compute[183075]: 2026-01-22 17:40:38.070 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:38 compute-0 nova_compute[183075]: 2026-01-22 17:40:38.234 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.487 183079 DEBUG nova.network.neutron [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Updating instance_info_cache with network_info: [{"id": "5b604d92-0d68-405f-b4e2-6a3f72fbabad", "address": "fa:16:3e:77:61:07", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b604d92-0d", "ovs_interfaceid": "5b604d92-0d68-405f-b4e2-6a3f72fbabad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.514 183079 DEBUG oslo_concurrency.lockutils [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Releasing lock "refresh_cache-5bc95cf8-db79-4c62-95f8-ab8f0dbabd44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.514 183079 DEBUG nova.compute.manager [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Instance network_info: |[{"id": "5b604d92-0d68-405f-b4e2-6a3f72fbabad", "address": "fa:16:3e:77:61:07", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b604d92-0d", "ovs_interfaceid": "5b604d92-0d68-405f-b4e2-6a3f72fbabad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.514 183079 DEBUG oslo_concurrency.lockutils [req-fff00852-ea1f-4be1-bdbc-41adf6badca5 req-66e7601c-45b5-4245-afac-3ad9a5282cf8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-5bc95cf8-db79-4c62-95f8-ab8f0dbabd44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.515 183079 DEBUG nova.network.neutron [req-fff00852-ea1f-4be1-bdbc-41adf6badca5 req-66e7601c-45b5-4245-afac-3ad9a5282cf8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Refreshing network info cache for port 5b604d92-0d68-405f-b4e2-6a3f72fbabad _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.517 183079 DEBUG nova.virt.libvirt.driver [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Start _get_guest_xml network_info=[{"id": "5b604d92-0d68-405f-b4e2-6a3f72fbabad", "address": "fa:16:3e:77:61:07", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b604d92-0d", "ovs_interfaceid": "5b604d92-0d68-405f-b4e2-6a3f72fbabad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.521 183079 WARNING nova.virt.libvirt.driver [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.531 183079 DEBUG nova.virt.libvirt.host [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.532 183079 DEBUG nova.virt.libvirt.host [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.535 183079 DEBUG nova.virt.libvirt.host [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.536 183079 DEBUG nova.virt.libvirt.host [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.536 183079 DEBUG nova.virt.libvirt.driver [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.536 183079 DEBUG nova.virt.hardware [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.537 183079 DEBUG nova.virt.hardware [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.537 183079 DEBUG nova.virt.hardware [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.537 183079 DEBUG nova.virt.hardware [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.538 183079 DEBUG nova.virt.hardware [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.538 183079 DEBUG nova.virt.hardware [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.538 183079 DEBUG nova.virt.hardware [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.538 183079 DEBUG nova.virt.hardware [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.539 183079 DEBUG nova.virt.hardware [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.539 183079 DEBUG nova.virt.hardware [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.539 183079 DEBUG nova.virt.hardware [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.542 183079 DEBUG nova.virt.libvirt.vif [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:40:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1916165929',display_name='tempest-server-test-1916165929',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1916165929',id=63,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-uy1v1f1q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:40:33Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=5bc95cf8-db79-4c62-95f8-ab8f0dbabd44,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5b604d92-0d68-405f-b4e2-6a3f72fbabad", "address": "fa:16:3e:77:61:07", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b604d92-0d", "ovs_interfaceid": "5b604d92-0d68-405f-b4e2-6a3f72fbabad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.543 183079 DEBUG nova.network.os_vif_util [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "5b604d92-0d68-405f-b4e2-6a3f72fbabad", "address": "fa:16:3e:77:61:07", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b604d92-0d", "ovs_interfaceid": "5b604d92-0d68-405f-b4e2-6a3f72fbabad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.543 183079 DEBUG nova.network.os_vif_util [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:61:07,bridge_name='br-int',has_traffic_filtering=True,id=5b604d92-0d68-405f-b4e2-6a3f72fbabad,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b604d92-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.544 183079 DEBUG nova.objects.instance [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'pci_devices' on Instance uuid 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.557 183079 DEBUG nova.virt.libvirt.driver [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:40:39 compute-0 nova_compute[183075]:   <uuid>5bc95cf8-db79-4c62-95f8-ab8f0dbabd44</uuid>
Jan 22 17:40:39 compute-0 nova_compute[183075]:   <name>instance-0000003f</name>
Jan 22 17:40:39 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:40:39 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:40:39 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:40:39 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1916165929</nova:name>
Jan 22 17:40:39 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:40:39</nova:creationTime>
Jan 22 17:40:39 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:40:39 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:40:39 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:40:39 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:40:39 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:40:39 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:40:39 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:40:39 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:40:39 compute-0 nova_compute[183075]:         <nova:user uuid="66c24efd0cd24c57803ea8508e679d06">tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member</nova:user>
Jan 22 17:40:39 compute-0 nova_compute[183075]:         <nova:project uuid="cbff5cec4b9c4d2eb317124ca20fea9f">tempest-StatelessNetworkSecGroupIPv4Test-1348485316</nova:project>
Jan 22 17:40:39 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:40:39 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:40:39 compute-0 nova_compute[183075]:         <nova:port uuid="5b604d92-0d68-405f-b4e2-6a3f72fbabad">
Jan 22 17:40:39 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:40:39 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:40:39 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:40:39 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <system>
Jan 22 17:40:39 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:40:39 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:40:39 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:40:39 compute-0 nova_compute[183075]:       <entry name="serial">5bc95cf8-db79-4c62-95f8-ab8f0dbabd44</entry>
Jan 22 17:40:39 compute-0 nova_compute[183075]:       <entry name="uuid">5bc95cf8-db79-4c62-95f8-ab8f0dbabd44</entry>
Jan 22 17:40:39 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     </system>
Jan 22 17:40:39 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:40:39 compute-0 nova_compute[183075]:   <os>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:   </os>
Jan 22 17:40:39 compute-0 nova_compute[183075]:   <features>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:   </features>
Jan 22 17:40:39 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:40:39 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:40:39 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:40:39 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/disk"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:40:39 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:77:61:07"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:       <target dev="tap5b604d92-0d"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:40:39 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/console.log" append="off"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <video>
Jan 22 17:40:39 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     </video>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:40:39 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:40:39 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:40:39 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:40:39 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:40:39 compute-0 nova_compute[183075]: </domain>
Jan 22 17:40:39 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.558 183079 DEBUG nova.compute.manager [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Preparing to wait for external event network-vif-plugged-5b604d92-0d68-405f-b4e2-6a3f72fbabad prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.559 183079 DEBUG oslo_concurrency.lockutils [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.559 183079 DEBUG oslo_concurrency.lockutils [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.559 183079 DEBUG oslo_concurrency.lockutils [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.560 183079 DEBUG nova.virt.libvirt.vif [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:40:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1916165929',display_name='tempest-server-test-1916165929',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1916165929',id=63,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-uy1v1f1q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_
rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:40:33Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=5bc95cf8-db79-4c62-95f8-ab8f0dbabd44,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5b604d92-0d68-405f-b4e2-6a3f72fbabad", "address": "fa:16:3e:77:61:07", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b604d92-0d", "ovs_interfaceid": "5b604d92-0d68-405f-b4e2-6a3f72fbabad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.560 183079 DEBUG nova.network.os_vif_util [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "5b604d92-0d68-405f-b4e2-6a3f72fbabad", "address": "fa:16:3e:77:61:07", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b604d92-0d", "ovs_interfaceid": "5b604d92-0d68-405f-b4e2-6a3f72fbabad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.561 183079 DEBUG nova.network.os_vif_util [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:61:07,bridge_name='br-int',has_traffic_filtering=True,id=5b604d92-0d68-405f-b4e2-6a3f72fbabad,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b604d92-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.561 183079 DEBUG os_vif [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:61:07,bridge_name='br-int',has_traffic_filtering=True,id=5b604d92-0d68-405f-b4e2-6a3f72fbabad,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b604d92-0d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.561 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.562 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.562 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.564 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.564 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5b604d92-0d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.564 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5b604d92-0d, col_values=(('external_ids', {'iface-id': '5b604d92-0d68-405f-b4e2-6a3f72fbabad', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:77:61:07', 'vm-uuid': '5bc95cf8-db79-4c62-95f8-ab8f0dbabd44'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.566 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:39 compute-0 NetworkManager[55454]: <info>  [1769103639.5670] manager: (tap5b604d92-0d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/291)
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.569 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.571 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.573 183079 INFO os_vif [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:61:07,bridge_name='br-int',has_traffic_filtering=True,id=5b604d92-0d68-405f-b4e2-6a3f72fbabad,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b604d92-0d')
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.625 183079 DEBUG nova.virt.libvirt.driver [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.626 183079 DEBUG nova.virt.libvirt.driver [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No VIF found with MAC fa:16:3e:77:61:07, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:40:39 compute-0 kernel: tap5b604d92-0d: entered promiscuous mode
Jan 22 17:40:39 compute-0 NetworkManager[55454]: <info>  [1769103639.6733] manager: (tap5b604d92-0d): new Tun device (/org/freedesktop/NetworkManager/Devices/292)
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.674 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:39 compute-0 ovn_controller[95372]: 2026-01-22T17:40:39Z|00719|binding|INFO|Claiming lport 5b604d92-0d68-405f-b4e2-6a3f72fbabad for this chassis.
Jan 22 17:40:39 compute-0 ovn_controller[95372]: 2026-01-22T17:40:39Z|00720|binding|INFO|5b604d92-0d68-405f-b4e2-6a3f72fbabad: Claiming fa:16:3e:77:61:07 10.100.0.9
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:39.681 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:61:07 10.100.0.9'], port_security=['fa:16:3e:77:61:07 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '5bc95cf8-db79-4c62-95f8-ab8f0dbabd44', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5d20a402-d35a-4a65-8bef-cd1a3a7097da', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=5b604d92-0d68-405f-b4e2-6a3f72fbabad) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:39.683 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 5b604d92-0d68-405f-b4e2-6a3f72fbabad in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 bound to our chassis
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:39.684 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:40:39 compute-0 ovn_controller[95372]: 2026-01-22T17:40:39Z|00721|binding|INFO|Setting lport 5b604d92-0d68-405f-b4e2-6a3f72fbabad up in Southbound
Jan 22 17:40:39 compute-0 ovn_controller[95372]: 2026-01-22T17:40:39Z|00722|binding|INFO|Setting lport 5b604d92-0d68-405f-b4e2-6a3f72fbabad ovn-installed in OVS
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.688 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.692 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:39.697 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c1b084d6-02df-475b-98ae-ca1596c17c5a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:39.697 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap88ed9213-71 in ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:39.699 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap88ed9213-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:39.699 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c68b8b59-70fb-4311-a989-563136d77baa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:39.700 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d9dda328-e8bd-44ab-bad9-6bb05a66a253]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:40:39 compute-0 systemd-udevd[237616]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:39.710 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[8484fc64-6c59-43fe-989b-2d0833bd7e14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:40:39 compute-0 systemd-machined[154382]: New machine qemu-63-instance-0000003f.
Jan 22 17:40:39 compute-0 NetworkManager[55454]: <info>  [1769103639.7149] device (tap5b604d92-0d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:40:39 compute-0 NetworkManager[55454]: <info>  [1769103639.7155] device (tap5b604d92-0d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:40:39 compute-0 systemd[1]: Started Virtual Machine qemu-63-instance-0000003f.
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:39.732 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[62865bc4-c347-48d9-a527-1cb437a497dc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:39.756 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[b8358c3c-9070-4bcb-92c2-599b3c9b336a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:39.760 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a2f6b89e-1da1-4978-9bd5-d1539d338ed6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:40:39 compute-0 NetworkManager[55454]: <info>  [1769103639.7614] manager: (tap88ed9213-70): new Veth device (/org/freedesktop/NetworkManager/Devices/293)
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:39.791 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[d6dea7fc-6efe-4090-9f1e-c3180e67254c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:39.793 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[430eb9f2-05d0-454a-b2e8-ce2f699617c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:40:39 compute-0 NetworkManager[55454]: <info>  [1769103639.8127] device (tap88ed9213-70): carrier: link connected
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:39.817 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[5d360c02-676e-44f1-9453-69200a3311dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:39.832 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2c364b52-3d55-443c-9cfd-2db4b0b6f122]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 195], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600351, 'reachable_time': 28750, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237649, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:39.848 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3b02a2ad-d49d-443e-bd5c-11aa987e1713]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0a:fa93'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600351, 'tstamp': 600351}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237650, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:39.865 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[95cec9f6-be86-4be3-9e1e-2c2b12e2de93]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 195], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600351, 'reachable_time': 28750, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237651, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:39.893 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[84bfccbc-cb6e-457c-bde4-1436b4c9ae5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:39.944 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6b91ed6e-e65d-4792-b75a-418a5b58acb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:39.946 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:39.946 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:39.946 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88ed9213-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.977 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:39 compute-0 kernel: tap88ed9213-70: entered promiscuous mode
Jan 22 17:40:39 compute-0 NetworkManager[55454]: <info>  [1769103639.9788] manager: (tap88ed9213-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/294)
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.980 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:39.981 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88ed9213-70, col_values=(('external_ids', {'iface-id': 'da93a32e-eed6-4124-8f08-78b0bfe2cb69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.982 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:39 compute-0 ovn_controller[95372]: 2026-01-22T17:40:39Z|00723|binding|INFO|Releasing lport da93a32e-eed6-4124-8f08-78b0bfe2cb69 from this chassis (sb_readonly=0)
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.983 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:39.984 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:39.985 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[55dd4a04-fde0-4516-8edb-cde1b733e217]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:39.986 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:40:39 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:39.987 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'env', 'PROCESS_TAG=haproxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/88ed9213-7d48-4c42-ad60-ce3fcf104f71.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:40:39 compute-0 nova_compute[183075]: 2026-01-22 17:40:39.994 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:40 compute-0 nova_compute[183075]: 2026-01-22 17:40:40.002 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103640.0022314, 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:40:40 compute-0 nova_compute[183075]: 2026-01-22 17:40:40.002 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] VM Started (Lifecycle Event)
Jan 22 17:40:40 compute-0 nova_compute[183075]: 2026-01-22 17:40:40.025 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:40:40 compute-0 nova_compute[183075]: 2026-01-22 17:40:40.028 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103640.0028126, 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:40:40 compute-0 nova_compute[183075]: 2026-01-22 17:40:40.028 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] VM Paused (Lifecycle Event)
Jan 22 17:40:40 compute-0 nova_compute[183075]: 2026-01-22 17:40:40.043 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:40:40 compute-0 nova_compute[183075]: 2026-01-22 17:40:40.046 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:40:40 compute-0 nova_compute[183075]: 2026-01-22 17:40:40.062 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:40:40 compute-0 podman[237690]: 2026-01-22 17:40:40.343019033 +0000 UTC m=+0.053873294 container create 1ce05d4696edaa5144b72d13bc4ec532a882443887f7a0f2b062a21ca6e0629c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:40:40 compute-0 systemd[1]: Started libpod-conmon-1ce05d4696edaa5144b72d13bc4ec532a882443887f7a0f2b062a21ca6e0629c.scope.
Jan 22 17:40:40 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:40:40 compute-0 podman[237690]: 2026-01-22 17:40:40.314031407 +0000 UTC m=+0.024885688 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:40:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3b52b4ca8ea92f634dea2c0bf0806bc158530ab7534af443d415dba6fc695dc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:40:40 compute-0 podman[237690]: 2026-01-22 17:40:40.423818218 +0000 UTC m=+0.134672489 container init 1ce05d4696edaa5144b72d13bc4ec532a882443887f7a0f2b062a21ca6e0629c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 17:40:40 compute-0 podman[237690]: 2026-01-22 17:40:40.429076709 +0000 UTC m=+0.139930970 container start 1ce05d4696edaa5144b72d13bc4ec532a882443887f7a0f2b062a21ca6e0629c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:40:40 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237705]: [NOTICE]   (237709) : New worker (237711) forked
Jan 22 17:40:40 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237705]: [NOTICE]   (237709) : Loading success.
Jan 22 17:40:40 compute-0 nova_compute[183075]: 2026-01-22 17:40:40.466 183079 DEBUG nova.compute.manager [req-9d8d7bee-19fb-4134-a307-3e6f361ea770 req-bca60287-816f-4f16-8b95-7fd7996cfb98 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Received event network-vif-plugged-5b604d92-0d68-405f-b4e2-6a3f72fbabad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:40:40 compute-0 nova_compute[183075]: 2026-01-22 17:40:40.467 183079 DEBUG oslo_concurrency.lockutils [req-9d8d7bee-19fb-4134-a307-3e6f361ea770 req-bca60287-816f-4f16-8b95-7fd7996cfb98 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:40:40 compute-0 nova_compute[183075]: 2026-01-22 17:40:40.467 183079 DEBUG oslo_concurrency.lockutils [req-9d8d7bee-19fb-4134-a307-3e6f361ea770 req-bca60287-816f-4f16-8b95-7fd7996cfb98 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:40:40 compute-0 nova_compute[183075]: 2026-01-22 17:40:40.467 183079 DEBUG oslo_concurrency.lockutils [req-9d8d7bee-19fb-4134-a307-3e6f361ea770 req-bca60287-816f-4f16-8b95-7fd7996cfb98 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:40:40 compute-0 nova_compute[183075]: 2026-01-22 17:40:40.467 183079 DEBUG nova.compute.manager [req-9d8d7bee-19fb-4134-a307-3e6f361ea770 req-bca60287-816f-4f16-8b95-7fd7996cfb98 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Processing event network-vif-plugged-5b604d92-0d68-405f-b4e2-6a3f72fbabad _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:40:40 compute-0 nova_compute[183075]: 2026-01-22 17:40:40.468 183079 DEBUG nova.compute.manager [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:40:40 compute-0 nova_compute[183075]: 2026-01-22 17:40:40.473 183079 DEBUG nova.virt.libvirt.driver [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:40:40 compute-0 nova_compute[183075]: 2026-01-22 17:40:40.474 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103640.4746172, 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:40:40 compute-0 nova_compute[183075]: 2026-01-22 17:40:40.475 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] VM Resumed (Lifecycle Event)
Jan 22 17:40:40 compute-0 nova_compute[183075]: 2026-01-22 17:40:40.478 183079 INFO nova.virt.libvirt.driver [-] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Instance spawned successfully.
Jan 22 17:40:40 compute-0 nova_compute[183075]: 2026-01-22 17:40:40.478 183079 DEBUG nova.virt.libvirt.driver [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:40:40 compute-0 nova_compute[183075]: 2026-01-22 17:40:40.497 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:40:40 compute-0 nova_compute[183075]: 2026-01-22 17:40:40.502 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:40:40 compute-0 nova_compute[183075]: 2026-01-22 17:40:40.504 183079 DEBUG nova.virt.libvirt.driver [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:40:40 compute-0 nova_compute[183075]: 2026-01-22 17:40:40.505 183079 DEBUG nova.virt.libvirt.driver [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:40:40 compute-0 nova_compute[183075]: 2026-01-22 17:40:40.505 183079 DEBUG nova.virt.libvirt.driver [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:40:40 compute-0 nova_compute[183075]: 2026-01-22 17:40:40.505 183079 DEBUG nova.virt.libvirt.driver [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:40:40 compute-0 nova_compute[183075]: 2026-01-22 17:40:40.506 183079 DEBUG nova.virt.libvirt.driver [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:40:40 compute-0 nova_compute[183075]: 2026-01-22 17:40:40.506 183079 DEBUG nova.virt.libvirt.driver [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:40:40 compute-0 nova_compute[183075]: 2026-01-22 17:40:40.523 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:40:40 compute-0 nova_compute[183075]: 2026-01-22 17:40:40.565 183079 INFO nova.compute.manager [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Took 6.82 seconds to spawn the instance on the hypervisor.
Jan 22 17:40:40 compute-0 nova_compute[183075]: 2026-01-22 17:40:40.565 183079 DEBUG nova.compute.manager [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:40:40 compute-0 nova_compute[183075]: 2026-01-22 17:40:40.620 183079 INFO nova.compute.manager [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Took 7.21 seconds to build instance.
Jan 22 17:40:40 compute-0 nova_compute[183075]: 2026-01-22 17:40:40.635 183079 DEBUG oslo_concurrency.lockutils [None req-76573e9b-9340-40e0-8167-df572ebb71e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "5bc95cf8-db79-4c62-95f8-ab8f0dbabd44" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.285s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:40:40 compute-0 nova_compute[183075]: 2026-01-22 17:40:40.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:40:41 compute-0 nova_compute[183075]: 2026-01-22 17:40:41.362 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769103626.3610055, 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:40:41 compute-0 nova_compute[183075]: 2026-01-22 17:40:41.363 183079 INFO nova.compute.manager [-] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] VM Stopped (Lifecycle Event)
Jan 22 17:40:41 compute-0 nova_compute[183075]: 2026-01-22 17:40:41.387 183079 DEBUG nova.compute.manager [None req-3d0ec7c8-8c84-4899-9264-5cd121eb4803 - - - - - -] [instance: 4aab4e50-d6b2-4e53-b7c5-f3a3a1019da6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:40:41 compute-0 nova_compute[183075]: 2026-01-22 17:40:41.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:40:41 compute-0 nova_compute[183075]: 2026-01-22 17:40:41.787 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:40:41 compute-0 nova_compute[183075]: 2026-01-22 17:40:41.809 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 17:40:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:41.959 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:40:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:41.961 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:40:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:41.962 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:40:42 compute-0 nova_compute[183075]: 2026-01-22 17:40:42.295 183079 DEBUG nova.network.neutron [req-fff00852-ea1f-4be1-bdbc-41adf6badca5 req-66e7601c-45b5-4245-afac-3ad9a5282cf8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Updated VIF entry in instance network info cache for port 5b604d92-0d68-405f-b4e2-6a3f72fbabad. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:40:42 compute-0 nova_compute[183075]: 2026-01-22 17:40:42.295 183079 DEBUG nova.network.neutron [req-fff00852-ea1f-4be1-bdbc-41adf6badca5 req-66e7601c-45b5-4245-afac-3ad9a5282cf8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Updating instance_info_cache with network_info: [{"id": "5b604d92-0d68-405f-b4e2-6a3f72fbabad", "address": "fa:16:3e:77:61:07", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b604d92-0d", "ovs_interfaceid": "5b604d92-0d68-405f-b4e2-6a3f72fbabad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:40:42 compute-0 nova_compute[183075]: 2026-01-22 17:40:42.421 183079 DEBUG oslo_concurrency.lockutils [req-fff00852-ea1f-4be1-bdbc-41adf6badca5 req-66e7601c-45b5-4245-afac-3ad9a5282cf8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-5bc95cf8-db79-4c62-95f8-ab8f0dbabd44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:40:42 compute-0 nova_compute[183075]: 2026-01-22 17:40:42.562 183079 DEBUG nova.compute.manager [req-d9f9f038-1ec5-4dc7-9a0b-a6cfcb63fbbd req-b385da0d-7222-456e-8c25-ecc7b7730d81 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Received event network-vif-plugged-5b604d92-0d68-405f-b4e2-6a3f72fbabad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:40:42 compute-0 nova_compute[183075]: 2026-01-22 17:40:42.562 183079 DEBUG oslo_concurrency.lockutils [req-d9f9f038-1ec5-4dc7-9a0b-a6cfcb63fbbd req-b385da0d-7222-456e-8c25-ecc7b7730d81 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:40:42 compute-0 nova_compute[183075]: 2026-01-22 17:40:42.562 183079 DEBUG oslo_concurrency.lockutils [req-d9f9f038-1ec5-4dc7-9a0b-a6cfcb63fbbd req-b385da0d-7222-456e-8c25-ecc7b7730d81 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:40:42 compute-0 nova_compute[183075]: 2026-01-22 17:40:42.563 183079 DEBUG oslo_concurrency.lockutils [req-d9f9f038-1ec5-4dc7-9a0b-a6cfcb63fbbd req-b385da0d-7222-456e-8c25-ecc7b7730d81 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:40:42 compute-0 nova_compute[183075]: 2026-01-22 17:40:42.563 183079 DEBUG nova.compute.manager [req-d9f9f038-1ec5-4dc7-9a0b-a6cfcb63fbbd req-b385da0d-7222-456e-8c25-ecc7b7730d81 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] No waiting events found dispatching network-vif-plugged-5b604d92-0d68-405f-b4e2-6a3f72fbabad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:40:42 compute-0 nova_compute[183075]: 2026-01-22 17:40:42.563 183079 WARNING nova.compute.manager [req-d9f9f038-1ec5-4dc7-9a0b-a6cfcb63fbbd req-b385da0d-7222-456e-8c25-ecc7b7730d81 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Received unexpected event network-vif-plugged-5b604d92-0d68-405f-b4e2-6a3f72fbabad for instance with vm_state active and task_state None.
Jan 22 17:40:43 compute-0 nova_compute[183075]: 2026-01-22 17:40:43.048 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769103628.0469596, fb0d62d8-3d6f-4fa5-b342-612c69890cdf => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:40:43 compute-0 nova_compute[183075]: 2026-01-22 17:40:43.048 183079 INFO nova.compute.manager [-] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] VM Stopped (Lifecycle Event)
Jan 22 17:40:43 compute-0 nova_compute[183075]: 2026-01-22 17:40:43.078 183079 DEBUG nova.compute.manager [None req-f658aa0e-cc4e-4984-a17f-ae90c61444e0 - - - - - -] [instance: fb0d62d8-3d6f-4fa5-b342-612c69890cdf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:40:43 compute-0 nova_compute[183075]: 2026-01-22 17:40:43.177 183079 INFO nova.compute.manager [None req-2aca1e93-c69d-4da2-a3f2-873718bac235 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Get console output
Jan 22 17:40:43 compute-0 nova_compute[183075]: 2026-01-22 17:40:43.182 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:40:43 compute-0 nova_compute[183075]: 2026-01-22 17:40:43.249 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:43 compute-0 nova_compute[183075]: 2026-01-22 17:40:43.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:40:43 compute-0 nova_compute[183075]: 2026-01-22 17:40:43.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:40:44 compute-0 nova_compute[183075]: 2026-01-22 17:40:44.568 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:44 compute-0 nova_compute[183075]: 2026-01-22 17:40:44.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:40:44 compute-0 nova_compute[183075]: 2026-01-22 17:40:44.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:40:44 compute-0 nova_compute[183075]: 2026-01-22 17:40:44.809 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:40:44 compute-0 nova_compute[183075]: 2026-01-22 17:40:44.809 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:40:44 compute-0 nova_compute[183075]: 2026-01-22 17:40:44.809 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:40:44 compute-0 nova_compute[183075]: 2026-01-22 17:40:44.810 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:40:44 compute-0 nova_compute[183075]: 2026-01-22 17:40:44.871 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:40:44 compute-0 nova_compute[183075]: 2026-01-22 17:40:44.925 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:40:44 compute-0 nova_compute[183075]: 2026-01-22 17:40:44.926 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:40:44 compute-0 nova_compute[183075]: 2026-01-22 17:40:44.979 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:40:45 compute-0 nova_compute[183075]: 2026-01-22 17:40:45.152 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:40:45 compute-0 nova_compute[183075]: 2026-01-22 17:40:45.153 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5550MB free_disk=73.35882949829102GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:40:45 compute-0 nova_compute[183075]: 2026-01-22 17:40:45.154 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:40:45 compute-0 nova_compute[183075]: 2026-01-22 17:40:45.154 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:40:45 compute-0 nova_compute[183075]: 2026-01-22 17:40:45.220 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:40:45 compute-0 nova_compute[183075]: 2026-01-22 17:40:45.221 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:40:45 compute-0 nova_compute[183075]: 2026-01-22 17:40:45.221 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:40:45 compute-0 nova_compute[183075]: 2026-01-22 17:40:45.292 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:40:45 compute-0 nova_compute[183075]: 2026-01-22 17:40:45.308 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:40:45 compute-0 nova_compute[183075]: 2026-01-22 17:40:45.333 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:40:45 compute-0 nova_compute[183075]: 2026-01-22 17:40:45.334 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:40:47 compute-0 podman[237728]: 2026-01-22 17:40:47.345323326 +0000 UTC m=+0.056228958 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 22 17:40:47 compute-0 podman[237729]: 2026-01-22 17:40:47.378492254 +0000 UTC m=+0.087528126 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, name=ubi9-minimal, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc.)
Jan 22 17:40:47 compute-0 podman[237727]: 2026-01-22 17:40:47.398663334 +0000 UTC m=+0.111819176 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:40:48 compute-0 nova_compute[183075]: 2026-01-22 17:40:48.250 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:48 compute-0 nova_compute[183075]: 2026-01-22 17:40:48.291 183079 INFO nova.compute.manager [None req-46262b02-8af4-4756-9c6b-89a37a46c1af 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Get console output
Jan 22 17:40:48 compute-0 nova_compute[183075]: 2026-01-22 17:40:48.295 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:40:49 compute-0 nova_compute[183075]: 2026-01-22 17:40:49.571 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:51 compute-0 ovn_controller[95372]: 2026-01-22T17:40:51Z|00128|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:77:61:07 10.100.0.9
Jan 22 17:40:51 compute-0 ovn_controller[95372]: 2026-01-22T17:40:51Z|00129|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:77:61:07 10.100.0.9
Jan 22 17:40:53 compute-0 nova_compute[183075]: 2026-01-22 17:40:53.300 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:53 compute-0 podman[237808]: 2026-01-22 17:40:53.384928721 +0000 UTC m=+0.061399366 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Jan 22 17:40:53 compute-0 nova_compute[183075]: 2026-01-22 17:40:53.423 183079 INFO nova.compute.manager [None req-e646496a-9402-4bf0-929c-d4a53fd134b7 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Get console output
Jan 22 17:40:53 compute-0 nova_compute[183075]: 2026-01-22 17:40:53.429 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:40:54 compute-0 nova_compute[183075]: 2026-01-22 17:40:54.573 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:57.681 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:40:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:57.682 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:40:57 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:40:57 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:40:57 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:40:57 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:40:57 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:40:57 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:40:57 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:40:58 compute-0 nova_compute[183075]: 2026-01-22 17:40:58.301 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.441 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.442 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.7597399
Jan 22 17:40:58 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237711]: 10.100.0.9:54840 [22/Jan/2026:17:40:57.680] listener listener/metadata 0/0/0/761/761 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.452 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.453 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.480 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.480 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0275681
Jan 22 17:40:58 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237711]: 10.100.0.9:54856 [22/Jan/2026:17:40:58.451] listener listener/metadata 0/0/0/29/29 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.485 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.485 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.503 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.503 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0179129
Jan 22 17:40:58 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237711]: 10.100.0.9:54860 [22/Jan/2026:17:40:58.484] listener listener/metadata 0/0/0/19/19 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.508 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.509 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.523 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:40:58 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237711]: 10.100.0.9:54876 [22/Jan/2026:17:40:58.508] listener listener/metadata 0/0/0/15/15 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.524 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0144784
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.527 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.528 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.541 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.541 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0133576
Jan 22 17:40:58 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237711]: 10.100.0.9:54878 [22/Jan/2026:17:40:58.527] listener listener/metadata 0/0/0/14/14 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.546 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.546 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:40:58 compute-0 nova_compute[183075]: 2026-01-22 17:40:58.551 183079 INFO nova.compute.manager [None req-b8c28f36-1732-4979-a61b-ab4f4d75b038 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Get console output
Jan 22 17:40:58 compute-0 nova_compute[183075]: 2026-01-22 17:40:58.555 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.558 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.558 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0121634
Jan 22 17:40:58 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237711]: 10.100.0.9:54890 [22/Jan/2026:17:40:58.545] listener listener/metadata 0/0/0/12/12 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.562 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.562 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.576 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.576 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0134487
Jan 22 17:40:58 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237711]: 10.100.0.9:54898 [22/Jan/2026:17:40:58.562] listener listener/metadata 0/0/0/14/14 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.580 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.580 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.593 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.594 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0134084
Jan 22 17:40:58 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237711]: 10.100.0.9:54904 [22/Jan/2026:17:40:58.580] listener listener/metadata 0/0/0/14/14 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.598 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.598 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.612 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:40:58 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237711]: 10.100.0.9:54908 [22/Jan/2026:17:40:58.597] listener listener/metadata 0/0/0/14/14 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.612 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0139630
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.617 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.617 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.630 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.630 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0131109
Jan 22 17:40:58 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237711]: 10.100.0.9:54918 [22/Jan/2026:17:40:58.616] listener listener/metadata 0/0/0/13/13 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.634 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.635 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:40:58 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237711]: 10.100.0.9:54934 [22/Jan/2026:17:40:58.634] listener listener/metadata 0/0/0/15/15 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.650 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0148115
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.661 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.661 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.673 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.673 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0123870
Jan 22 17:40:58 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237711]: 10.100.0.9:54938 [22/Jan/2026:17:40:58.660] listener listener/metadata 0/0/0/13/13 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.677 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.678 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.690 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.690 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0128129
Jan 22 17:40:58 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237711]: 10.100.0.9:54950 [22/Jan/2026:17:40:58.677] listener listener/metadata 0/0/0/13/13 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.694 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.695 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.705 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.705 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0108812
Jan 22 17:40:58 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237711]: 10.100.0.9:54966 [22/Jan/2026:17:40:58.694] listener listener/metadata 0/0/0/11/11 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.710 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.711 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.723 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.723 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0123615
Jan 22 17:40:58 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237711]: 10.100.0.9:54974 [22/Jan/2026:17:40:58.710] listener listener/metadata 0/0/0/13/13 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.732 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.732 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.9
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.743 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:40:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:40:58.744 104990 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0115278
Jan 22 17:40:58 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237711]: 10.100.0.9:54980 [22/Jan/2026:17:40:58.731] listener listener/metadata 0/0/0/12/12 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:40:59 compute-0 nova_compute[183075]: 2026-01-22 17:40:59.577 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:02 compute-0 sshd-session[237829]: Connection closed by 170.64.201.191 port 36106
Jan 22 17:41:03 compute-0 nova_compute[183075]: 2026-01-22 17:41:03.344 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:03 compute-0 nova_compute[183075]: 2026-01-22 17:41:03.774 183079 INFO nova.compute.manager [None req-9d873468-5960-4c84-9ff3-4e648eecea8b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Get console output
Jan 22 17:41:03 compute-0 nova_compute[183075]: 2026-01-22 17:41:03.780 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:41:04 compute-0 podman[237830]: 2026-01-22 17:41:04.348485193 +0000 UTC m=+0.059128125 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 17:41:04 compute-0 nova_compute[183075]: 2026-01-22 17:41:04.579 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:08 compute-0 podman[237854]: 2026-01-22 17:41:08.329471021 +0000 UTC m=+0.044440721 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 17:41:08 compute-0 nova_compute[183075]: 2026-01-22 17:41:08.345 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:08 compute-0 nova_compute[183075]: 2026-01-22 17:41:08.987 183079 INFO nova.compute.manager [None req-e69e316e-bdb5-41b8-8363-cea316de9912 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Get console output
Jan 22 17:41:08 compute-0 nova_compute[183075]: 2026-01-22 17:41:08.993 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:41:09 compute-0 nova_compute[183075]: 2026-01-22 17:41:09.582 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:09 compute-0 ovn_controller[95372]: 2026-01-22T17:41:09Z|00724|memory_trim|INFO|Detected inactivity (last active 30023 ms ago): trimming memory
Jan 22 17:41:13 compute-0 nova_compute[183075]: 2026-01-22 17:41:13.347 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:14 compute-0 nova_compute[183075]: 2026-01-22 17:41:14.116 183079 INFO nova.compute.manager [None req-9071127d-4901-4a20-b367-38e196b85091 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Get console output
Jan 22 17:41:14 compute-0 nova_compute[183075]: 2026-01-22 17:41:14.120 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:41:14 compute-0 nova_compute[183075]: 2026-01-22 17:41:14.622 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:18 compute-0 nova_compute[183075]: 2026-01-22 17:41:18.349 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:18 compute-0 podman[237882]: 2026-01-22 17:41:18.350920747 +0000 UTC m=+0.053801862 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:41:18 compute-0 podman[237883]: 2026-01-22 17:41:18.361556772 +0000 UTC m=+0.064458048 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.expose-services=, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., architecture=x86_64, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 17:41:18 compute-0 podman[237881]: 2026-01-22 17:41:18.389821789 +0000 UTC m=+0.096664950 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202)
Jan 22 17:41:19 compute-0 nova_compute[183075]: 2026-01-22 17:41:19.238 183079 INFO nova.compute.manager [None req-f0f60c8a-952f-4142-8066-9494e90b7ac6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Get console output
Jan 22 17:41:19 compute-0 nova_compute[183075]: 2026-01-22 17:41:19.242 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:41:19 compute-0 nova_compute[183075]: 2026-01-22 17:41:19.625 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:20 compute-0 sshd-session[237879]: Connection closed by authenticating user root 170.64.201.191 port 47700 [preauth]
Jan 22 17:41:23 compute-0 nova_compute[183075]: 2026-01-22 17:41:23.351 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:24 compute-0 podman[237944]: 2026-01-22 17:41:24.357180927 +0000 UTC m=+0.071896857 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:41:24 compute-0 nova_compute[183075]: 2026-01-22 17:41:24.380 183079 INFO nova.compute.manager [None req-743bc1d4-2814-495d-a4c4-33f06ee43b6e 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Get console output
Jan 22 17:41:24 compute-0 nova_compute[183075]: 2026-01-22 17:41:24.384 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:41:24 compute-0 nova_compute[183075]: 2026-01-22 17:41:24.629 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:28 compute-0 nova_compute[183075]: 2026-01-22 17:41:28.334 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:41:28 compute-0 nova_compute[183075]: 2026-01-22 17:41:28.352 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:29 compute-0 nova_compute[183075]: 2026-01-22 17:41:29.543 183079 INFO nova.compute.manager [None req-88b46062-7185-4277-82a8-8f9df478c5d6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Get console output
Jan 22 17:41:29 compute-0 nova_compute[183075]: 2026-01-22 17:41:29.548 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:41:29 compute-0 nova_compute[183075]: 2026-01-22 17:41:29.631 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:31 compute-0 nova_compute[183075]: 2026-01-22 17:41:31.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:41:32 compute-0 nova_compute[183075]: 2026-01-22 17:41:32.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:41:33 compute-0 nova_compute[183075]: 2026-01-22 17:41:33.355 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:34 compute-0 nova_compute[183075]: 2026-01-22 17:41:34.637 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:34 compute-0 nova_compute[183075]: 2026-01-22 17:41:34.712 183079 INFO nova.compute.manager [None req-cfa19073-f378-472f-bb3b-8a087934149c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Get console output
Jan 22 17:41:34 compute-0 nova_compute[183075]: 2026-01-22 17:41:34.717 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:41:35 compute-0 podman[237964]: 2026-01-22 17:41:35.339569285 +0000 UTC m=+0.050495524 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:41:35 compute-0 nova_compute[183075]: 2026-01-22 17:41:35.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:41:38 compute-0 nova_compute[183075]: 2026-01-22 17:41:38.357 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:38 compute-0 nova_compute[183075]: 2026-01-22 17:41:38.969 183079 DEBUG oslo_concurrency.lockutils [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "5abad643-2e22-47fc-bd1d-98ba4f7d6edd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:41:38 compute-0 nova_compute[183075]: 2026-01-22 17:41:38.969 183079 DEBUG oslo_concurrency.lockutils [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "5abad643-2e22-47fc-bd1d-98ba4f7d6edd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:41:38 compute-0 nova_compute[183075]: 2026-01-22 17:41:38.987 183079 DEBUG nova.compute.manager [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.063 183079 DEBUG oslo_concurrency.lockutils [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.064 183079 DEBUG oslo_concurrency.lockutils [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.073 183079 DEBUG nova.virt.hardware [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.073 183079 INFO nova.compute.claims [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.179 183079 DEBUG nova.compute.provider_tree [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.192 183079 DEBUG nova.scheduler.client.report [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.207 183079 DEBUG oslo_concurrency.lockutils [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.208 183079 DEBUG nova.compute.manager [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.260 183079 DEBUG nova.compute.manager [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.260 183079 DEBUG nova.network.neutron [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.276 183079 INFO nova.virt.libvirt.driver [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.291 183079 DEBUG nova.compute.manager [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:41:39 compute-0 podman[237988]: 2026-01-22 17:41:39.341299399 +0000 UTC m=+0.048644824 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.394 183079 DEBUG nova.compute.manager [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.395 183079 DEBUG nova.virt.libvirt.driver [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.396 183079 INFO nova.virt.libvirt.driver [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Creating image(s)
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.397 183079 DEBUG oslo_concurrency.lockutils [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "/var/lib/nova/instances/5abad643-2e22-47fc-bd1d-98ba4f7d6edd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.397 183079 DEBUG oslo_concurrency.lockutils [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/5abad643-2e22-47fc-bd1d-98ba4f7d6edd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.398 183079 DEBUG oslo_concurrency.lockutils [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/5abad643-2e22-47fc-bd1d-98ba4f7d6edd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.414 183079 DEBUG oslo_concurrency.processutils [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.477 183079 DEBUG oslo_concurrency.processutils [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.478 183079 DEBUG oslo_concurrency.lockutils [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.479 183079 DEBUG oslo_concurrency.lockutils [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.490 183079 DEBUG oslo_concurrency.processutils [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.546 183079 DEBUG oslo_concurrency.processutils [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.547 183079 DEBUG oslo_concurrency.processutils [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/5abad643-2e22-47fc-bd1d-98ba4f7d6edd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.581 183079 DEBUG oslo_concurrency.processutils [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/5abad643-2e22-47fc-bd1d-98ba4f7d6edd/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.582 183079 DEBUG oslo_concurrency.lockutils [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.582 183079 DEBUG oslo_concurrency.processutils [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.681 183079 DEBUG oslo_concurrency.processutils [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.682 183079 DEBUG nova.virt.disk.api [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Checking if we can resize image /var/lib/nova/instances/5abad643-2e22-47fc-bd1d-98ba4f7d6edd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.682 183079 DEBUG oslo_concurrency.processutils [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5abad643-2e22-47fc-bd1d-98ba4f7d6edd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.697 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.735 183079 DEBUG oslo_concurrency.processutils [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5abad643-2e22-47fc-bd1d-98ba4f7d6edd/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.736 183079 DEBUG nova.virt.disk.api [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Cannot resize image /var/lib/nova/instances/5abad643-2e22-47fc-bd1d-98ba4f7d6edd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.736 183079 DEBUG nova.objects.instance [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'migration_context' on Instance uuid 5abad643-2e22-47fc-bd1d-98ba4f7d6edd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.748 183079 DEBUG nova.virt.libvirt.driver [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.749 183079 DEBUG nova.virt.libvirt.driver [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Ensure instance console log exists: /var/lib/nova/instances/5abad643-2e22-47fc-bd1d-98ba4f7d6edd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.749 183079 DEBUG oslo_concurrency.lockutils [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.750 183079 DEBUG oslo_concurrency.lockutils [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:41:39 compute-0 nova_compute[183075]: 2026-01-22 17:41:39.750 183079 DEBUG oslo_concurrency.lockutils [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:41:40 compute-0 nova_compute[183075]: 2026-01-22 17:41:40.380 183079 DEBUG nova.policy [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:41:40 compute-0 nova_compute[183075]: 2026-01-22 17:41:40.782 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:41:41.719 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=47, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=46) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:41:41 compute-0 nova_compute[183075]: 2026-01-22 17:41:41.720 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:41:41.720 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:41:41 compute-0 nova_compute[183075]: 2026-01-22 17:41:41.876 183079 DEBUG nova.network.neutron [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Successfully created port: b9c988ec-665e-44cd-a682-5e207216eabc _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:41:41.960 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:41:41.960 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:41:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:41:41.961 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:41:42 compute-0 nova_compute[183075]: 2026-01-22 17:41:42.524 183079 DEBUG nova.network.neutron [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Successfully updated port: b9c988ec-665e-44cd-a682-5e207216eabc _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:41:42 compute-0 nova_compute[183075]: 2026-01-22 17:41:42.539 183079 DEBUG oslo_concurrency.lockutils [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "refresh_cache-5abad643-2e22-47fc-bd1d-98ba4f7d6edd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:41:42 compute-0 nova_compute[183075]: 2026-01-22 17:41:42.540 183079 DEBUG oslo_concurrency.lockutils [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquired lock "refresh_cache-5abad643-2e22-47fc-bd1d-98ba4f7d6edd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:41:42 compute-0 nova_compute[183075]: 2026-01-22 17:41:42.540 183079 DEBUG nova.network.neutron [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:41:42 compute-0 nova_compute[183075]: 2026-01-22 17:41:42.625 183079 DEBUG nova.compute.manager [req-080a5e43-aef2-45f7-9cd0-e8ba4403ca9e req-c43bb894-6aa4-42dc-a085-3f2ec41e5ebc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Received event network-changed-b9c988ec-665e-44cd-a682-5e207216eabc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:41:42 compute-0 nova_compute[183075]: 2026-01-22 17:41:42.625 183079 DEBUG nova.compute.manager [req-080a5e43-aef2-45f7-9cd0-e8ba4403ca9e req-c43bb894-6aa4-42dc-a085-3f2ec41e5ebc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Refreshing instance network info cache due to event network-changed-b9c988ec-665e-44cd-a682-5e207216eabc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:41:42 compute-0 nova_compute[183075]: 2026-01-22 17:41:42.626 183079 DEBUG oslo_concurrency.lockutils [req-080a5e43-aef2-45f7-9cd0-e8ba4403ca9e req-c43bb894-6aa4-42dc-a085-3f2ec41e5ebc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-5abad643-2e22-47fc-bd1d-98ba4f7d6edd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:41:42 compute-0 nova_compute[183075]: 2026-01-22 17:41:42.786 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:41:42 compute-0 nova_compute[183075]: 2026-01-22 17:41:42.787 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:41:42 compute-0 nova_compute[183075]: 2026-01-22 17:41:42.787 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:41:42 compute-0 nova_compute[183075]: 2026-01-22 17:41:42.806 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 22 17:41:43 compute-0 nova_compute[183075]: 2026-01-22 17:41:43.059 183079 DEBUG nova.network.neutron [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:41:43 compute-0 nova_compute[183075]: 2026-01-22 17:41:43.064 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-5bc95cf8-db79-4c62-95f8-ab8f0dbabd44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:41:43 compute-0 nova_compute[183075]: 2026-01-22 17:41:43.064 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-5bc95cf8-db79-4c62-95f8-ab8f0dbabd44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:41:43 compute-0 nova_compute[183075]: 2026-01-22 17:41:43.064 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:41:43 compute-0 nova_compute[183075]: 2026-01-22 17:41:43.064 183079 DEBUG nova.objects.instance [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:41:43 compute-0 nova_compute[183075]: 2026-01-22 17:41:43.359 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.541 183079 DEBUG nova.network.neutron [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Updating instance_info_cache with network_info: [{"id": "b9c988ec-665e-44cd-a682-5e207216eabc", "address": "fa:16:3e:e8:db:2c", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb9c988ec-66", "ovs_interfaceid": "b9c988ec-665e-44cd-a682-5e207216eabc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.645 183079 DEBUG oslo_concurrency.lockutils [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Releasing lock "refresh_cache-5abad643-2e22-47fc-bd1d-98ba4f7d6edd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.645 183079 DEBUG nova.compute.manager [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Instance network_info: |[{"id": "b9c988ec-665e-44cd-a682-5e207216eabc", "address": "fa:16:3e:e8:db:2c", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb9c988ec-66", "ovs_interfaceid": "b9c988ec-665e-44cd-a682-5e207216eabc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.646 183079 DEBUG oslo_concurrency.lockutils [req-080a5e43-aef2-45f7-9cd0-e8ba4403ca9e req-c43bb894-6aa4-42dc-a085-3f2ec41e5ebc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-5abad643-2e22-47fc-bd1d-98ba4f7d6edd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.646 183079 DEBUG nova.network.neutron [req-080a5e43-aef2-45f7-9cd0-e8ba4403ca9e req-c43bb894-6aa4-42dc-a085-3f2ec41e5ebc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Refreshing network info cache for port b9c988ec-665e-44cd-a682-5e207216eabc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.649 183079 DEBUG nova.virt.libvirt.driver [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Start _get_guest_xml network_info=[{"id": "b9c988ec-665e-44cd-a682-5e207216eabc", "address": "fa:16:3e:e8:db:2c", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb9c988ec-66", "ovs_interfaceid": "b9c988ec-665e-44cd-a682-5e207216eabc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.653 183079 WARNING nova.virt.libvirt.driver [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.658 183079 DEBUG nova.virt.libvirt.host [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.659 183079 DEBUG nova.virt.libvirt.host [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.661 183079 DEBUG nova.virt.libvirt.host [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.662 183079 DEBUG nova.virt.libvirt.host [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.662 183079 DEBUG nova.virt.libvirt.driver [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.663 183079 DEBUG nova.virt.hardware [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.663 183079 DEBUG nova.virt.hardware [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.663 183079 DEBUG nova.virt.hardware [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.663 183079 DEBUG nova.virt.hardware [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.664 183079 DEBUG nova.virt.hardware [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.664 183079 DEBUG nova.virt.hardware [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.665 183079 DEBUG nova.virt.hardware [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.666 183079 DEBUG nova.virt.hardware [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.666 183079 DEBUG nova.virt.hardware [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.666 183079 DEBUG nova.virt.hardware [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.667 183079 DEBUG nova.virt.hardware [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.673 183079 DEBUG nova.virt.libvirt.vif [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:41:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-270521243',display_name='tempest-server-test-270521243',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-270521243',id=64,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-43v38ray',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='vi
rtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:41:39Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=5abad643-2e22-47fc-bd1d-98ba4f7d6edd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b9c988ec-665e-44cd-a682-5e207216eabc", "address": "fa:16:3e:e8:db:2c", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb9c988ec-66", "ovs_interfaceid": "b9c988ec-665e-44cd-a682-5e207216eabc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.674 183079 DEBUG nova.network.os_vif_util [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "b9c988ec-665e-44cd-a682-5e207216eabc", "address": "fa:16:3e:e8:db:2c", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb9c988ec-66", "ovs_interfaceid": "b9c988ec-665e-44cd-a682-5e207216eabc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.675 183079 DEBUG nova.network.os_vif_util [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e8:db:2c,bridge_name='br-int',has_traffic_filtering=True,id=b9c988ec-665e-44cd-a682-5e207216eabc,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb9c988ec-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.676 183079 DEBUG nova.objects.instance [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'pci_devices' on Instance uuid 5abad643-2e22-47fc-bd1d-98ba4f7d6edd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.701 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.728 183079 DEBUG nova.virt.libvirt.driver [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:41:44 compute-0 nova_compute[183075]:   <uuid>5abad643-2e22-47fc-bd1d-98ba4f7d6edd</uuid>
Jan 22 17:41:44 compute-0 nova_compute[183075]:   <name>instance-00000040</name>
Jan 22 17:41:44 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:41:44 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:41:44 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:41:44 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-270521243</nova:name>
Jan 22 17:41:44 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:41:44</nova:creationTime>
Jan 22 17:41:44 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:41:44 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:41:44 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:41:44 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:41:44 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:41:44 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:41:44 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:41:44 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:41:44 compute-0 nova_compute[183075]:         <nova:user uuid="66c24efd0cd24c57803ea8508e679d06">tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member</nova:user>
Jan 22 17:41:44 compute-0 nova_compute[183075]:         <nova:project uuid="cbff5cec4b9c4d2eb317124ca20fea9f">tempest-StatelessNetworkSecGroupIPv4Test-1348485316</nova:project>
Jan 22 17:41:44 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:41:44 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:41:44 compute-0 nova_compute[183075]:         <nova:port uuid="b9c988ec-665e-44cd-a682-5e207216eabc">
Jan 22 17:41:44 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:41:44 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:41:44 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:41:44 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <system>
Jan 22 17:41:44 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:41:44 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:41:44 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:41:44 compute-0 nova_compute[183075]:       <entry name="serial">5abad643-2e22-47fc-bd1d-98ba4f7d6edd</entry>
Jan 22 17:41:44 compute-0 nova_compute[183075]:       <entry name="uuid">5abad643-2e22-47fc-bd1d-98ba4f7d6edd</entry>
Jan 22 17:41:44 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     </system>
Jan 22 17:41:44 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:41:44 compute-0 nova_compute[183075]:   <os>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:   </os>
Jan 22 17:41:44 compute-0 nova_compute[183075]:   <features>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:   </features>
Jan 22 17:41:44 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:41:44 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:41:44 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:41:44 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/5abad643-2e22-47fc-bd1d-98ba4f7d6edd/disk"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:41:44 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:e8:db:2c"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:       <target dev="tapb9c988ec-66"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:41:44 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/5abad643-2e22-47fc-bd1d-98ba4f7d6edd/console.log" append="off"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <video>
Jan 22 17:41:44 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     </video>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:41:44 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:41:44 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:41:44 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:41:44 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:41:44 compute-0 nova_compute[183075]: </domain>
Jan 22 17:41:44 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.729 183079 DEBUG nova.compute.manager [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Preparing to wait for external event network-vif-plugged-b9c988ec-665e-44cd-a682-5e207216eabc prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.729 183079 DEBUG oslo_concurrency.lockutils [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "5abad643-2e22-47fc-bd1d-98ba4f7d6edd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.730 183079 DEBUG oslo_concurrency.lockutils [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "5abad643-2e22-47fc-bd1d-98ba4f7d6edd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.730 183079 DEBUG oslo_concurrency.lockutils [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "5abad643-2e22-47fc-bd1d-98ba4f7d6edd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.731 183079 DEBUG nova.virt.libvirt.vif [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:41:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-270521243',display_name='tempest-server-test-270521243',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-270521243',id=64,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-43v38ray',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng
_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:41:39Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=5abad643-2e22-47fc-bd1d-98ba4f7d6edd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b9c988ec-665e-44cd-a682-5e207216eabc", "address": "fa:16:3e:e8:db:2c", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb9c988ec-66", "ovs_interfaceid": "b9c988ec-665e-44cd-a682-5e207216eabc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.731 183079 DEBUG nova.network.os_vif_util [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "b9c988ec-665e-44cd-a682-5e207216eabc", "address": "fa:16:3e:e8:db:2c", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb9c988ec-66", "ovs_interfaceid": "b9c988ec-665e-44cd-a682-5e207216eabc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.732 183079 DEBUG nova.network.os_vif_util [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e8:db:2c,bridge_name='br-int',has_traffic_filtering=True,id=b9c988ec-665e-44cd-a682-5e207216eabc,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb9c988ec-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.732 183079 DEBUG os_vif [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e8:db:2c,bridge_name='br-int',has_traffic_filtering=True,id=b9c988ec-665e-44cd-a682-5e207216eabc,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb9c988ec-66') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.733 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.733 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.733 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.737 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.737 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb9c988ec-66, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.738 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb9c988ec-66, col_values=(('external_ids', {'iface-id': 'b9c988ec-665e-44cd-a682-5e207216eabc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e8:db:2c', 'vm-uuid': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:41:44 compute-0 NetworkManager[55454]: <info>  [1769103704.7403] manager: (tapb9c988ec-66): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/295)
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.739 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.742 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.747 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.748 183079 INFO os_vif [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e8:db:2c,bridge_name='br-int',has_traffic_filtering=True,id=b9c988ec-665e-44cd-a682-5e207216eabc,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb9c988ec-66')
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.800 183079 DEBUG nova.virt.libvirt.driver [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.801 183079 DEBUG nova.virt.libvirt.driver [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No VIF found with MAC fa:16:3e:e8:db:2c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:41:44 compute-0 kernel: tapb9c988ec-66: entered promiscuous mode
Jan 22 17:41:44 compute-0 NetworkManager[55454]: <info>  [1769103704.8590] manager: (tapb9c988ec-66): new Tun device (/org/freedesktop/NetworkManager/Devices/296)
Jan 22 17:41:44 compute-0 ovn_controller[95372]: 2026-01-22T17:41:44Z|00725|binding|INFO|Claiming lport b9c988ec-665e-44cd-a682-5e207216eabc for this chassis.
Jan 22 17:41:44 compute-0 ovn_controller[95372]: 2026-01-22T17:41:44Z|00726|binding|INFO|b9c988ec-665e-44cd-a682-5e207216eabc: Claiming fa:16:3e:e8:db:2c 10.100.0.4
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.860 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:41:44.873 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:db:2c 10.100.0.4'], port_security=['fa:16:3e:e8:db:2c 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5d20a402-d35a-4a65-8bef-cd1a3a7097da', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=b9c988ec-665e-44cd-a682-5e207216eabc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.874 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:44 compute-0 ovn_controller[95372]: 2026-01-22T17:41:44Z|00727|binding|INFO|Setting lport b9c988ec-665e-44cd-a682-5e207216eabc ovn-installed in OVS
Jan 22 17:41:44 compute-0 ovn_controller[95372]: 2026-01-22T17:41:44Z|00728|binding|INFO|Setting lport b9c988ec-665e-44cd-a682-5e207216eabc up in Southbound
Jan 22 17:41:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:41:44.875 104629 INFO neutron.agent.ovn.metadata.agent [-] Port b9c988ec-665e-44cd-a682-5e207216eabc in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 bound to our chassis
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.876 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:41:44.876 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.877 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:44 compute-0 systemd-udevd[238044]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:41:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:41:44.893 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[81333a48-6ca2-4bcf-baa1-9fc27b2986ba]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:41:44 compute-0 systemd-machined[154382]: New machine qemu-64-instance-00000040.
Jan 22 17:41:44 compute-0 NetworkManager[55454]: <info>  [1769103704.9041] device (tapb9c988ec-66): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:41:44 compute-0 NetworkManager[55454]: <info>  [1769103704.9055] device (tapb9c988ec-66): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:41:44 compute-0 systemd[1]: Started Virtual Machine qemu-64-instance-00000040.
Jan 22 17:41:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:41:44.921 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[e44f6b6a-37a2-4a0a-9cfe-9fb86e20d6ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:41:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:41:44.925 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[f1fd68d1-12bc-4090-8ce7-f92b8e86658c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:41:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:41:44.953 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[ddb8d96e-602a-4259-b964-338fa933126c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:41:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:41:44.967 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4230b789-32c5-4319-8dda-8c805252665d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 54, 'rx_bytes': 8920, 'tx_bytes': 6131, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 54, 'rx_bytes': 8920, 'tx_bytes': 6131, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 195], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600351, 'reachable_time': 28750, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238057, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:41:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:41:44.980 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[95a4915d-ab4e-4b74-86e6-55fcee1e4e30]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600361, 'tstamp': 600361}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238058, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600364, 'tstamp': 600364}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238058, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:41:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:41:44.982 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.983 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:44 compute-0 nova_compute[183075]: 2026-01-22 17:41:44.984 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:41:44.985 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88ed9213-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:41:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:41:44.985 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:41:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:41:44.985 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88ed9213-70, col_values=(('external_ids', {'iface-id': 'da93a32e-eed6-4124-8f08-78b0bfe2cb69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:41:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:41:44.986 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:41:45 compute-0 nova_compute[183075]: 2026-01-22 17:41:45.131 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103705.1304832, 5abad643-2e22-47fc-bd1d-98ba4f7d6edd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:41:45 compute-0 nova_compute[183075]: 2026-01-22 17:41:45.131 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] VM Started (Lifecycle Event)
Jan 22 17:41:45 compute-0 nova_compute[183075]: 2026-01-22 17:41:45.147 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:41:45 compute-0 nova_compute[183075]: 2026-01-22 17:41:45.150 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103705.1308136, 5abad643-2e22-47fc-bd1d-98ba4f7d6edd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:41:45 compute-0 nova_compute[183075]: 2026-01-22 17:41:45.150 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] VM Paused (Lifecycle Event)
Jan 22 17:41:45 compute-0 nova_compute[183075]: 2026-01-22 17:41:45.164 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:41:45 compute-0 nova_compute[183075]: 2026-01-22 17:41:45.166 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:41:45 compute-0 nova_compute[183075]: 2026-01-22 17:41:45.184 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:41:45 compute-0 nova_compute[183075]: 2026-01-22 17:41:45.548 183079 DEBUG nova.compute.manager [req-20db5ca2-5d9f-46df-b293-083e4355b31d req-f432f211-1e6b-43ad-8448-f0b0ad17d11d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Received event network-vif-plugged-b9c988ec-665e-44cd-a682-5e207216eabc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:41:45 compute-0 nova_compute[183075]: 2026-01-22 17:41:45.549 183079 DEBUG oslo_concurrency.lockutils [req-20db5ca2-5d9f-46df-b293-083e4355b31d req-f432f211-1e6b-43ad-8448-f0b0ad17d11d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "5abad643-2e22-47fc-bd1d-98ba4f7d6edd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:41:45 compute-0 nova_compute[183075]: 2026-01-22 17:41:45.550 183079 DEBUG oslo_concurrency.lockutils [req-20db5ca2-5d9f-46df-b293-083e4355b31d req-f432f211-1e6b-43ad-8448-f0b0ad17d11d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "5abad643-2e22-47fc-bd1d-98ba4f7d6edd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:41:45 compute-0 nova_compute[183075]: 2026-01-22 17:41:45.551 183079 DEBUG oslo_concurrency.lockutils [req-20db5ca2-5d9f-46df-b293-083e4355b31d req-f432f211-1e6b-43ad-8448-f0b0ad17d11d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "5abad643-2e22-47fc-bd1d-98ba4f7d6edd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:41:45 compute-0 nova_compute[183075]: 2026-01-22 17:41:45.551 183079 DEBUG nova.compute.manager [req-20db5ca2-5d9f-46df-b293-083e4355b31d req-f432f211-1e6b-43ad-8448-f0b0ad17d11d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Processing event network-vif-plugged-b9c988ec-665e-44cd-a682-5e207216eabc _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:41:45 compute-0 nova_compute[183075]: 2026-01-22 17:41:45.552 183079 DEBUG nova.compute.manager [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:41:45 compute-0 nova_compute[183075]: 2026-01-22 17:41:45.556 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103705.5563138, 5abad643-2e22-47fc-bd1d-98ba4f7d6edd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:41:45 compute-0 nova_compute[183075]: 2026-01-22 17:41:45.556 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] VM Resumed (Lifecycle Event)
Jan 22 17:41:45 compute-0 nova_compute[183075]: 2026-01-22 17:41:45.558 183079 DEBUG nova.virt.libvirt.driver [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:41:45 compute-0 nova_compute[183075]: 2026-01-22 17:41:45.561 183079 INFO nova.virt.libvirt.driver [-] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Instance spawned successfully.
Jan 22 17:41:45 compute-0 nova_compute[183075]: 2026-01-22 17:41:45.561 183079 DEBUG nova.virt.libvirt.driver [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:41:45 compute-0 nova_compute[183075]: 2026-01-22 17:41:45.580 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:41:45 compute-0 nova_compute[183075]: 2026-01-22 17:41:45.584 183079 DEBUG nova.virt.libvirt.driver [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:41:45 compute-0 nova_compute[183075]: 2026-01-22 17:41:45.585 183079 DEBUG nova.virt.libvirt.driver [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:41:45 compute-0 nova_compute[183075]: 2026-01-22 17:41:45.585 183079 DEBUG nova.virt.libvirt.driver [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:41:45 compute-0 nova_compute[183075]: 2026-01-22 17:41:45.585 183079 DEBUG nova.virt.libvirt.driver [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:41:45 compute-0 nova_compute[183075]: 2026-01-22 17:41:45.586 183079 DEBUG nova.virt.libvirt.driver [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:41:45 compute-0 nova_compute[183075]: 2026-01-22 17:41:45.586 183079 DEBUG nova.virt.libvirt.driver [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:41:45 compute-0 nova_compute[183075]: 2026-01-22 17:41:45.590 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:41:45 compute-0 nova_compute[183075]: 2026-01-22 17:41:45.633 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:41:45 compute-0 nova_compute[183075]: 2026-01-22 17:41:45.657 183079 INFO nova.compute.manager [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Took 6.26 seconds to spawn the instance on the hypervisor.
Jan 22 17:41:45 compute-0 nova_compute[183075]: 2026-01-22 17:41:45.658 183079 DEBUG nova.compute.manager [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:41:45 compute-0 nova_compute[183075]: 2026-01-22 17:41:45.743 183079 INFO nova.compute.manager [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Took 6.72 seconds to build instance.
Jan 22 17:41:45 compute-0 nova_compute[183075]: 2026-01-22 17:41:45.762 183079 DEBUG oslo_concurrency.lockutils [None req-eb9af3aa-0670-4516-a032-4a04be825cec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "5abad643-2e22-47fc-bd1d-98ba4f7d6edd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.793s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:41:46 compute-0 nova_compute[183075]: 2026-01-22 17:41:46.444 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Updating instance_info_cache with network_info: [{"id": "5b604d92-0d68-405f-b4e2-6a3f72fbabad", "address": "fa:16:3e:77:61:07", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b604d92-0d", "ovs_interfaceid": "5b604d92-0d68-405f-b4e2-6a3f72fbabad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:41:46 compute-0 nova_compute[183075]: 2026-01-22 17:41:46.463 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-5bc95cf8-db79-4c62-95f8-ab8f0dbabd44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:41:46 compute-0 nova_compute[183075]: 2026-01-22 17:41:46.464 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:41:46 compute-0 nova_compute[183075]: 2026-01-22 17:41:46.464 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:41:46 compute-0 nova_compute[183075]: 2026-01-22 17:41:46.464 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:41:46 compute-0 nova_compute[183075]: 2026-01-22 17:41:46.464 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:41:46 compute-0 nova_compute[183075]: 2026-01-22 17:41:46.465 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:41:46 compute-0 nova_compute[183075]: 2026-01-22 17:41:46.489 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:41:46 compute-0 nova_compute[183075]: 2026-01-22 17:41:46.489 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:41:46 compute-0 nova_compute[183075]: 2026-01-22 17:41:46.490 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:41:46 compute-0 nova_compute[183075]: 2026-01-22 17:41:46.490 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:41:46 compute-0 nova_compute[183075]: 2026-01-22 17:41:46.557 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5abad643-2e22-47fc-bd1d-98ba4f7d6edd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:41:46 compute-0 nova_compute[183075]: 2026-01-22 17:41:46.614 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5abad643-2e22-47fc-bd1d-98ba4f7d6edd/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:41:46 compute-0 nova_compute[183075]: 2026-01-22 17:41:46.615 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5abad643-2e22-47fc-bd1d-98ba4f7d6edd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:41:46 compute-0 nova_compute[183075]: 2026-01-22 17:41:46.668 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5abad643-2e22-47fc-bd1d-98ba4f7d6edd/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:41:46 compute-0 nova_compute[183075]: 2026-01-22 17:41:46.674 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:41:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:41:46.723 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '47'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:41:46 compute-0 nova_compute[183075]: 2026-01-22 17:41:46.737 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:41:46 compute-0 nova_compute[183075]: 2026-01-22 17:41:46.738 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:41:46 compute-0 nova_compute[183075]: 2026-01-22 17:41:46.796 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:41:46 compute-0 nova_compute[183075]: 2026-01-22 17:41:46.957 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:41:46 compute-0 nova_compute[183075]: 2026-01-22 17:41:46.958 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5490MB free_disk=73.33053588867188GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:41:46 compute-0 nova_compute[183075]: 2026-01-22 17:41:46.959 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:41:46 compute-0 nova_compute[183075]: 2026-01-22 17:41:46.959 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:41:47 compute-0 nova_compute[183075]: 2026-01-22 17:41:47.056 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:41:47 compute-0 nova_compute[183075]: 2026-01-22 17:41:47.057 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 5abad643-2e22-47fc-bd1d-98ba4f7d6edd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:41:47 compute-0 nova_compute[183075]: 2026-01-22 17:41:47.057 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:41:47 compute-0 nova_compute[183075]: 2026-01-22 17:41:47.057 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:41:47 compute-0 nova_compute[183075]: 2026-01-22 17:41:47.118 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:41:47 compute-0 nova_compute[183075]: 2026-01-22 17:41:47.133 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:41:47 compute-0 nova_compute[183075]: 2026-01-22 17:41:47.156 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:41:47 compute-0 nova_compute[183075]: 2026-01-22 17:41:47.157 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:41:47 compute-0 nova_compute[183075]: 2026-01-22 17:41:47.365 183079 DEBUG nova.network.neutron [req-080a5e43-aef2-45f7-9cd0-e8ba4403ca9e req-c43bb894-6aa4-42dc-a085-3f2ec41e5ebc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Updated VIF entry in instance network info cache for port b9c988ec-665e-44cd-a682-5e207216eabc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:41:47 compute-0 nova_compute[183075]: 2026-01-22 17:41:47.366 183079 DEBUG nova.network.neutron [req-080a5e43-aef2-45f7-9cd0-e8ba4403ca9e req-c43bb894-6aa4-42dc-a085-3f2ec41e5ebc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Updating instance_info_cache with network_info: [{"id": "b9c988ec-665e-44cd-a682-5e207216eabc", "address": "fa:16:3e:e8:db:2c", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb9c988ec-66", "ovs_interfaceid": "b9c988ec-665e-44cd-a682-5e207216eabc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:41:47 compute-0 nova_compute[183075]: 2026-01-22 17:41:47.458 183079 DEBUG oslo_concurrency.lockutils [req-080a5e43-aef2-45f7-9cd0-e8ba4403ca9e req-c43bb894-6aa4-42dc-a085-3f2ec41e5ebc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-5abad643-2e22-47fc-bd1d-98ba4f7d6edd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:41:47 compute-0 nova_compute[183075]: 2026-01-22 17:41:47.755 183079 INFO nova.compute.manager [None req-e2dadb68-3bbc-41ef-a902-10250f54b7ba 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Get console output
Jan 22 17:41:47 compute-0 nova_compute[183075]: 2026-01-22 17:41:47.887 183079 DEBUG nova.compute.manager [req-2c7da8f0-4513-4e68-a0fe-4073466abdfd req-620f3ed1-b380-4d05-8041-67942990478a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Received event network-vif-plugged-b9c988ec-665e-44cd-a682-5e207216eabc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:41:47 compute-0 nova_compute[183075]: 2026-01-22 17:41:47.888 183079 DEBUG oslo_concurrency.lockutils [req-2c7da8f0-4513-4e68-a0fe-4073466abdfd req-620f3ed1-b380-4d05-8041-67942990478a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "5abad643-2e22-47fc-bd1d-98ba4f7d6edd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:41:47 compute-0 nova_compute[183075]: 2026-01-22 17:41:47.888 183079 DEBUG oslo_concurrency.lockutils [req-2c7da8f0-4513-4e68-a0fe-4073466abdfd req-620f3ed1-b380-4d05-8041-67942990478a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "5abad643-2e22-47fc-bd1d-98ba4f7d6edd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:41:47 compute-0 nova_compute[183075]: 2026-01-22 17:41:47.889 183079 DEBUG oslo_concurrency.lockutils [req-2c7da8f0-4513-4e68-a0fe-4073466abdfd req-620f3ed1-b380-4d05-8041-67942990478a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "5abad643-2e22-47fc-bd1d-98ba4f7d6edd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:41:47 compute-0 nova_compute[183075]: 2026-01-22 17:41:47.889 183079 DEBUG nova.compute.manager [req-2c7da8f0-4513-4e68-a0fe-4073466abdfd req-620f3ed1-b380-4d05-8041-67942990478a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] No waiting events found dispatching network-vif-plugged-b9c988ec-665e-44cd-a682-5e207216eabc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:41:47 compute-0 nova_compute[183075]: 2026-01-22 17:41:47.890 183079 WARNING nova.compute.manager [req-2c7da8f0-4513-4e68-a0fe-4073466abdfd req-620f3ed1-b380-4d05-8041-67942990478a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Received unexpected event network-vif-plugged-b9c988ec-665e-44cd-a682-5e207216eabc for instance with vm_state active and task_state None.
Jan 22 17:41:48 compute-0 nova_compute[183075]: 2026-01-22 17:41:48.361 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:49 compute-0 podman[238089]: 2026-01-22 17:41:49.353157239 +0000 UTC m=+0.055510028 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:41:49 compute-0 podman[238090]: 2026-01-22 17:41:49.358102071 +0000 UTC m=+0.055189529 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-type=git, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.buildah.version=1.33.7, name=ubi9-minimal, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Jan 22 17:41:49 compute-0 podman[238088]: 2026-01-22 17:41:49.384416996 +0000 UTC m=+0.093479615 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 17:41:49 compute-0 nova_compute[183075]: 2026-01-22 17:41:49.740 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:52 compute-0 nova_compute[183075]: 2026-01-22 17:41:52.863 183079 INFO nova.compute.manager [None req-6ad1801a-99fd-4b22-9f31-c04cd4773ca2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Get console output
Jan 22 17:41:52 compute-0 nova_compute[183075]: 2026-01-22 17:41:52.868 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:41:53 compute-0 nova_compute[183075]: 2026-01-22 17:41:53.364 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:54 compute-0 nova_compute[183075]: 2026-01-22 17:41:54.741 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:55 compute-0 podman[238148]: 2026-01-22 17:41:55.345670749 +0000 UTC m=+0.055615891 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.460 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd', 'name': 'tempest-server-test-270521243', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000040', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'hostId': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.462 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '5bc95cf8-db79-4c62-95f8-ab8f0dbabd44', 'name': 'tempest-server-test-1916165929', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003f', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'hostId': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.462 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.464 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 5abad643-2e22-47fc-bd1d-98ba4f7d6edd / tapb9c988ec-66 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.464 12 DEBUG ceilometer.compute.pollsters [-] 5abad643-2e22-47fc-bd1d-98ba4f7d6edd/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.466 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44 / tap5b604d92-0d inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.466 12 DEBUG ceilometer.compute.pollsters [-] 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/network.outgoing.bytes volume: 11596 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f98223b-32d7-40f7-8a30-883d8a1b9374', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000040-5abad643-2e22-47fc-bd1d-98ba4f7d6edd-tapb9c988ec-66', 'timestamp': '2026-01-22T17:41:55.462565', 'resource_metadata': {'display_name': 'tempest-server-test-270521243', 'name': 'tapb9c988ec-66', 'instance_id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e8:db:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb9c988ec-66'}, 'message_id': 'a4d0dd0e-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.222981151, 'message_signature': 'c9ac0a58db93be82c15990c936502401266fac26bfdc2ef10c341985f0aeb58e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11596, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000003f-5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-tap5b604d92-0d', 'timestamp': '2026-01-22T17:41:55.462565', 'resource_metadata': {'display_name': 'tempest-server-test-1916165929', 'name': 'tap5b604d92-0d', 'instance_id': '5bc95cf8-db79-4c62-95f8-ab8f0dbabd44', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:77:61:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5b604d92-0d'}, 'message_id': 'a4d12bc4-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.225276153, 'message_signature': '9f3d86224dffe36c6924afa48ae8549e95439a2b9f020f1532cf198c6a451570'}]}, 'timestamp': '2026-01-22 17:41:55.466907', '_unique_id': 'c8c4114f305a461099849a90702da5b2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.467 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.468 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.468 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.468 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-270521243>, <NovaLikeServer: tempest-server-test-1916165929>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-270521243>, <NovaLikeServer: tempest-server-test-1916165929>]
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.469 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.469 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.469 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-270521243>, <NovaLikeServer: tempest-server-test-1916165929>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-270521243>, <NovaLikeServer: tempest-server-test-1916165929>]
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.469 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.481 12 DEBUG ceilometer.compute.pollsters [-] 5abad643-2e22-47fc-bd1d-98ba4f7d6edd/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.495 12 DEBUG ceilometer.compute.pollsters [-] 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/disk.device.read.requests volume: 1155 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc2941cc-066b-4af4-91f7-39254ada9e3e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd-vda', 'timestamp': '2026-01-22T17:41:55.469465', 'resource_metadata': {'display_name': 'tempest-server-test-270521243', 'name': 'instance-00000040', 'instance_id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a4d37820-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.229881406, 'message_signature': 'be684741f90c1142a873117a858955962fadbef1c7add9ec3c3c87b7a114e658'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1155, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 
'resource_id': '5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-vda', 'timestamp': '2026-01-22T17:41:55.469465', 'resource_metadata': {'display_name': 'tempest-server-test-1916165929', 'name': 'instance-0000003f', 'instance_id': '5bc95cf8-db79-4c62-95f8-ab8f0dbabd44', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a4d58a70-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.242375701, 'message_signature': 'e3a24452ae1bfca46b64b6f8649243f5f556835eebf0d6faadc05a5bdd700e83'}]}, 'timestamp': '2026-01-22 17:41:55.495531', '_unique_id': '60e33bb249d3405dafbf5ae054d8a7a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.496 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.497 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.497 12 DEBUG ceilometer.compute.pollsters [-] 5abad643-2e22-47fc-bd1d-98ba4f7d6edd/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.497 12 DEBUG ceilometer.compute.pollsters [-] 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ca06a20a-aa53-4db5-a4c5-eae1ce21470a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000040-5abad643-2e22-47fc-bd1d-98ba4f7d6edd-tapb9c988ec-66', 'timestamp': '2026-01-22T17:41:55.497383', 'resource_metadata': {'display_name': 'tempest-server-test-270521243', 'name': 'tapb9c988ec-66', 'instance_id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e8:db:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb9c988ec-66'}, 'message_id': 'a4d5dde0-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.222981151, 'message_signature': '1a6cdcf79ad6ece1784d3a1b98ce6f2b15c74eebedd802a7e80413c303e69763'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000003f-5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-tap5b604d92-0d', 'timestamp': '2026-01-22T17:41:55.497383', 'resource_metadata': {'display_name': 'tempest-server-test-1916165929', 'name': 'tap5b604d92-0d', 'instance_id': '5bc95cf8-db79-4c62-95f8-ab8f0dbabd44', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:77:61:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5b604d92-0d'}, 'message_id': 'a4d5e772-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.225276153, 'message_signature': 'c75ef32606a55c653c989bdb198450bebd484d514094160ac49af2a7e839878e'}]}, 'timestamp': '2026-01-22 17:41:55.497872', '_unique_id': '3f9b415d5ef84a5b9e17f589f879fdb4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.498 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.499 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.499 12 DEBUG ceilometer.compute.pollsters [-] 5abad643-2e22-47fc-bd1d-98ba4f7d6edd/disk.device.read.latency volume: 135731090 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.499 12 DEBUG ceilometer.compute.pollsters [-] 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/disk.device.read.latency volume: 184669789 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '33f921c1-6415-4ad0-9aa7-f4596a874dc0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 135731090, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd-vda', 'timestamp': '2026-01-22T17:41:55.499533', 'resource_metadata': {'display_name': 'tempest-server-test-270521243', 'name': 'instance-00000040', 'instance_id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a4d6327c-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.229881406, 'message_signature': '9bb7df4b17a437c4e6353eb432791b76061fb6be38d64514108768e54b5343fa'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 184669789, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 
'5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-vda', 'timestamp': '2026-01-22T17:41:55.499533', 'resource_metadata': {'display_name': 'tempest-server-test-1916165929', 'name': 'instance-0000003f', 'instance_id': '5bc95cf8-db79-4c62-95f8-ab8f0dbabd44', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a4d63b1e-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.242375701, 'message_signature': '1bf8042e7859b966b622383633aa949dde78b9b24e8e0e9649bce70370fe276f'}]}, 'timestamp': '2026-01-22 17:41:55.500056', '_unique_id': '335db6b39a6743babc1347d3b19c96e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.500 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.501 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.501 12 DEBUG ceilometer.compute.pollsters [-] 5abad643-2e22-47fc-bd1d-98ba4f7d6edd/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.501 12 DEBUG ceilometer.compute.pollsters [-] 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/network.incoming.packets volume: 61 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f4d8e378-1300-4426-831e-0e7ddfbebc1b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000040-5abad643-2e22-47fc-bd1d-98ba4f7d6edd-tapb9c988ec-66', 'timestamp': '2026-01-22T17:41:55.501272', 'resource_metadata': {'display_name': 'tempest-server-test-270521243', 'name': 'tapb9c988ec-66', 'instance_id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e8:db:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb9c988ec-66'}, 'message_id': 'a4d67552-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.222981151, 'message_signature': '7faefe07369925bb7af690b8394587aa378f03e1465c54c2edd813059c555845'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 61, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000003f-5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-tap5b604d92-0d', 'timestamp': '2026-01-22T17:41:55.501272', 'resource_metadata': {'display_name': 'tempest-server-test-1916165929', 'name': 'tap5b604d92-0d', 'instance_id': '5bc95cf8-db79-4c62-95f8-ab8f0dbabd44', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:77:61:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5b604d92-0d'}, 'message_id': 'a4d67dcc-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.225276153, 'message_signature': '94e0a2459e259a67ed46c326bc94a42329d29b064c2f3b57b3fb9180eab07506'}]}, 'timestamp': '2026-01-22 17:41:55.501763', '_unique_id': 'ae2745fe0a1c42c19f2d1c6a5768448d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.502 12 DEBUG ceilometer.compute.pollsters [-] 5abad643-2e22-47fc-bd1d-98ba4f7d6edd/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 DEBUG ceilometer.compute.pollsters [-] 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1823502e-0d0d-491a-9ade-c1102ed0de5a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000040-5abad643-2e22-47fc-bd1d-98ba4f7d6edd-tapb9c988ec-66', 'timestamp': '2026-01-22T17:41:55.502964', 'resource_metadata': {'display_name': 'tempest-server-test-270521243', 'name': 'tapb9c988ec-66', 'instance_id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e8:db:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb9c988ec-66'}, 'message_id': 'a4d6b7ba-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.222981151, 'message_signature': 'c3307a6e29846c0d41ae1f6fa60f318db7bea134dcaf316c88fb30679217e0bc'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000003f-5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-tap5b604d92-0d', 'timestamp': '2026-01-22T17:41:55.502964', 'resource_metadata': {'display_name': 'tempest-server-test-1916165929', 'name': 'tap5b604d92-0d', 'instance_id': '5bc95cf8-db79-4c62-95f8-ab8f0dbabd44', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:77:61:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5b604d92-0d'}, 'message_id': 'a4d6c1f6-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.225276153, 'message_signature': '1f869e34d6bcdc37b4b5688a947c930fd4fafc48d276d18e0c90ec82dd04c957'}]}, 'timestamp': '2026-01-22 17:41:55.503471', '_unique_id': '59f24314d64e432ebbb69ec2422c935a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.503 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.504 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.504 12 DEBUG ceilometer.compute.pollsters [-] 5abad643-2e22-47fc-bd1d-98ba4f7d6edd/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.504 12 DEBUG ceilometer.compute.pollsters [-] 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/disk.device.write.requests volume: 347 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e5600fe1-e310-4fb2-bd49-efc3299140b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd-vda', 'timestamp': '2026-01-22T17:41:55.504662', 'resource_metadata': {'display_name': 'tempest-server-test-270521243', 'name': 'instance-00000040', 'instance_id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a4d6fa9a-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.229881406, 'message_signature': '819e39930969d0bedf28774230ea00fb8aee032133f461afddc535096f6ddc91'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 347, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 
'5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-vda', 'timestamp': '2026-01-22T17:41:55.504662', 'resource_metadata': {'display_name': 'tempest-server-test-1916165929', 'name': 'instance-0000003f', 'instance_id': '5bc95cf8-db79-4c62-95f8-ab8f0dbabd44', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a4d70526-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.242375701, 'message_signature': '05f3fed9db27f0e4929569cd349e55db531c4f3dbc7c968b4a186dabf0e74814'}]}, 'timestamp': '2026-01-22 17:41:55.505204', '_unique_id': '4dd164b4f2e7443baab29f544059391b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.505 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.506 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.506 12 DEBUG ceilometer.compute.pollsters [-] 5abad643-2e22-47fc-bd1d-98ba4f7d6edd/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.506 12 DEBUG ceilometer.compute.pollsters [-] 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/disk.device.write.latency volume: 3347241399 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '48b02b48-d2dc-411d-a527-a45bc7038666', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd-vda', 'timestamp': '2026-01-22T17:41:55.506452', 'resource_metadata': {'display_name': 'tempest-server-test-270521243', 'name': 'instance-00000040', 'instance_id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a4d73f46-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.229881406, 'message_signature': 'bf6d5c69798ba3060da843925a7c89bb35bcf6c99fb259c7aa4961b8bef9007c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3347241399, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-vda', 'timestamp': '2026-01-22T17:41:55.506452', 'resource_metadata': {'display_name': 'tempest-server-test-1916165929', 'name': 'instance-0000003f', 'instance_id': '5bc95cf8-db79-4c62-95f8-ab8f0dbabd44', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a4d74860-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.242375701, 'message_signature': 'b67eb8fc1894ae09fdf3e6e924d1fc1b6457d55ccfa650b782a766ab5eef8c05'}]}, 'timestamp': '2026-01-22 17:41:55.506892', '_unique_id': '189336615a624ab6a31aa605b97ff60b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.507 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 DEBUG ceilometer.compute.pollsters [-] 5abad643-2e22-47fc-bd1d-98ba4f7d6edd/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 DEBUG ceilometer.compute.pollsters [-] 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/network.outgoing.packets volume: 131 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '242f99e0-ae65-4d66-b2b1-ef22f10f9648', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000040-5abad643-2e22-47fc-bd1d-98ba4f7d6edd-tapb9c988ec-66', 'timestamp': '2026-01-22T17:41:55.507981', 'resource_metadata': {'display_name': 'tempest-server-test-270521243', 'name': 'tapb9c988ec-66', 'instance_id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e8:db:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb9c988ec-66'}, 'message_id': 'a4d77b50-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.222981151, 'message_signature': '922d8a36f2954932d95652d2d23c5bc7f197f0baa316147b02770050262693d1'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 131, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000003f-5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-tap5b604d92-0d', 'timestamp': '2026-01-22T17:41:55.507981', 'resource_metadata': {'display_name': 'tempest-server-test-1916165929', 'name': 'tap5b604d92-0d', 'instance_id': '5bc95cf8-db79-4c62-95f8-ab8f0dbabd44', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:77:61:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5b604d92-0d'}, 'message_id': 'a4d783b6-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.225276153, 'message_signature': 'eef5ac940ba350aea99a5376133a4065c9f0d6bcffbd1f93f85cfcc485140c84'}]}, 'timestamp': '2026-01-22 17:41:55.508418', '_unique_id': 'c10a5cbf870d4776bf87e62338f69d4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.508 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.509 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.509 12 DEBUG ceilometer.compute.pollsters [-] 5abad643-2e22-47fc-bd1d-98ba4f7d6edd/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.509 12 DEBUG ceilometer.compute.pollsters [-] 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5126e85a-fecd-4e60-9849-625b655a8464', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000040-5abad643-2e22-47fc-bd1d-98ba4f7d6edd-tapb9c988ec-66', 'timestamp': '2026-01-22T17:41:55.509566', 'resource_metadata': {'display_name': 'tempest-server-test-270521243', 'name': 'tapb9c988ec-66', 'instance_id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e8:db:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb9c988ec-66'}, 'message_id': 'a4d7ba84-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.222981151, 'message_signature': '9f67d5178c3c9efa6adb881960e3c6120a79b0e7999a867091a6cda2515bee01'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000003f-5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-tap5b604d92-0d', 'timestamp': '2026-01-22T17:41:55.509566', 'resource_metadata': {'display_name': 'tempest-server-test-1916165929', 'name': 'tap5b604d92-0d', 'instance_id': '5bc95cf8-db79-4c62-95f8-ab8f0dbabd44', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:77:61:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5b604d92-0d'}, 'message_id': 'a4d7c312-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.225276153, 'message_signature': '9c1dd0050bf84d1b0cdf80bd5bec8bf4db3c36b73b3b6a7d9e84168cc2af887c'}]}, 'timestamp': '2026-01-22 17:41:55.510040', '_unique_id': 'cc1e610946394164b6d57b8b5c885e76'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.510 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.511 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.511 12 DEBUG ceilometer.compute.pollsters [-] 5abad643-2e22-47fc-bd1d-98ba4f7d6edd/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.511 12 DEBUG ceilometer.compute.pollsters [-] 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3bcb3a7c-9cc5-45f9-9e7b-8fb3ad71f6ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000040-5abad643-2e22-47fc-bd1d-98ba4f7d6edd-tapb9c988ec-66', 'timestamp': '2026-01-22T17:41:55.511158', 'resource_metadata': {'display_name': 'tempest-server-test-270521243', 'name': 'tapb9c988ec-66', 'instance_id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e8:db:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb9c988ec-66'}, 'message_id': 'a4d7f742-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.222981151, 'message_signature': '4016629bacbee80e166f58519ff5f52965b04d10c3a8233b5c7102d9b0a47678'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000003f-5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-tap5b604d92-0d', 'timestamp': '2026-01-22T17:41:55.511158', 'resource_metadata': {'display_name': 'tempest-server-test-1916165929', 'name': 'tap5b604d92-0d', 'instance_id': '5bc95cf8-db79-4c62-95f8-ab8f0dbabd44', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:77:61:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5b604d92-0d'}, 'message_id': 'a4d7ff8a-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.225276153, 'message_signature': 'cb2eead6e0ee7c73659daf2efcec7f5f6740f5f2eccb66d1a651bfb569fdc1ee'}]}, 'timestamp': '2026-01-22 17:41:55.511588', '_unique_id': '7086cad738174833b27dc6d6af03f987'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 DEBUG ceilometer.compute.pollsters [-] 5abad643-2e22-47fc-bd1d-98ba4f7d6edd/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.512 12 DEBUG ceilometer.compute.pollsters [-] 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/disk.device.write.bytes volume: 73224192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45e2fb8b-063d-40fe-a046-212c7bf616c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd-vda', 'timestamp': '2026-01-22T17:41:55.512696', 'resource_metadata': {'display_name': 'tempest-server-test-270521243', 'name': 'instance-00000040', 'instance_id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a4d83338-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.229881406, 'message_signature': 'd8015acac727f7abc70f9edddcaf98a820a42a42bf9b4befd583749fa072d99b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73224192, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 
'5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-vda', 'timestamp': '2026-01-22T17:41:55.512696', 'resource_metadata': {'display_name': 'tempest-server-test-1916165929', 'name': 'instance-0000003f', 'instance_id': '5bc95cf8-db79-4c62-95f8-ab8f0dbabd44', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a4d83b30-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.242375701, 'message_signature': '9dc249a1919270864586790436a6aa0dec3d70ba0f64e0c1fcc229c39ba212f5'}]}, 'timestamp': '2026-01-22 17:41:55.513107', '_unique_id': '93f9f112c4884172ba4cf17ca50f5a25'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.513 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.514 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.526 12 DEBUG ceilometer.compute.pollsters [-] 5abad643-2e22-47fc-bd1d-98ba4f7d6edd/cpu volume: 9650000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.538 12 DEBUG ceilometer.compute.pollsters [-] 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/cpu volume: 11610000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d8b1102-d131-4448-b3d2-2d53d7af181f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9650000000, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd', 'timestamp': '2026-01-22T17:41:55.514242', 'resource_metadata': {'display_name': 'tempest-server-test-270521243', 'name': 'instance-00000040', 'instance_id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'a4da4d3a-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.286797911, 'message_signature': '8697a77fa170c52e50094b2a76cc32842cd5cfa336e405c22d850f3c26cc293a'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11610000000, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '5bc95cf8-db79-4c62-95f8-ab8f0dbabd44', 
'timestamp': '2026-01-22T17:41:55.514242', 'resource_metadata': {'display_name': 'tempest-server-test-1916165929', 'name': 'instance-0000003f', 'instance_id': '5bc95cf8-db79-4c62-95f8-ab8f0dbabd44', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'a4dc2772-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.298912256, 'message_signature': 'b0a00bdb5d29008790b975f97272459a6e7765374f427795af2db2c568e987d5'}]}, 'timestamp': '2026-01-22 17:41:55.538838', '_unique_id': 'ae9da393a0fd4880babb23de270e10aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.539 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.540 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.540 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.540 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-270521243>, <NovaLikeServer: tempest-server-test-1916165929>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-270521243>, <NovaLikeServer: tempest-server-test-1916165929>]
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.540 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.540 12 DEBUG ceilometer.compute.pollsters [-] 5abad643-2e22-47fc-bd1d-98ba4f7d6edd/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.540 12 DEBUG ceilometer.compute.pollsters [-] 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3ce7654c-a8e9-491b-aeba-5a474b210f93', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000040-5abad643-2e22-47fc-bd1d-98ba4f7d6edd-tapb9c988ec-66', 'timestamp': '2026-01-22T17:41:55.540654', 'resource_metadata': {'display_name': 'tempest-server-test-270521243', 'name': 'tapb9c988ec-66', 'instance_id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e8:db:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb9c988ec-66'}, 'message_id': 'a4dc77b8-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.222981151, 'message_signature': '59559080ac0af40ef24e834c9f954d47716d2003d3399a706723ad2954a4a5c8'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000003f-5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-tap5b604d92-0d', 'timestamp': '2026-01-22T17:41:55.540654', 'resource_metadata': {'display_name': 'tempest-server-test-1916165929', 'name': 'tap5b604d92-0d', 'instance_id': '5bc95cf8-db79-4c62-95f8-ab8f0dbabd44', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:77:61:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5b604d92-0d'}, 'message_id': 'a4dc7fce-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.225276153, 'message_signature': '489b4f9003a797a03cc6bcea8836a27de9c38c8bca8cf91be8168a529683b531'}]}, 'timestamp': '2026-01-22 17:41:55.541085', '_unique_id': 'a6f5a8d7edf14ba6920ec407b520d317'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.541 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.542 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.547 12 DEBUG ceilometer.compute.pollsters [-] 5abad643-2e22-47fc-bd1d-98ba4f7d6edd/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.552 12 DEBUG ceilometer.compute.pollsters [-] 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '279c3001-f323-4dd1-a82c-ec7e4cd8cdf2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd-vda', 'timestamp': '2026-01-22T17:41:55.542131', 'resource_metadata': {'display_name': 'tempest-server-test-270521243', 'name': 'instance-00000040', 'instance_id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a4dd7f50-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.302546493, 'message_signature': '44d261553444240b01d41e8e6918bbd62734e32a593e07b2872ca0c1056f6420'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 
'5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-vda', 'timestamp': '2026-01-22T17:41:55.542131', 'resource_metadata': {'display_name': 'tempest-server-test-1916165929', 'name': 'instance-0000003f', 'instance_id': '5bc95cf8-db79-4c62-95f8-ab8f0dbabd44', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a4de3bfc-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.308152933, 'message_signature': 'e3305819e6247b63dfb269f7eeaa0eb4eb9a11551289398ee7c750fdaa3f3c0d'}]}, 'timestamp': '2026-01-22 17:41:55.552537', '_unique_id': 'cac7d4931e0647e8bf068373a79a1781'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.553 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.554 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.554 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.554 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-270521243>, <NovaLikeServer: tempest-server-test-1916165929>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-270521243>, <NovaLikeServer: tempest-server-test-1916165929>]
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.555 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.555 12 DEBUG ceilometer.compute.pollsters [-] 5abad643-2e22-47fc-bd1d-98ba4f7d6edd/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.555 12 DEBUG ceilometer.compute.pollsters [-] 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6139599e-a188-415e-9e36-15dd04150ec2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000040-5abad643-2e22-47fc-bd1d-98ba4f7d6edd-tapb9c988ec-66', 'timestamp': '2026-01-22T17:41:55.555240', 'resource_metadata': {'display_name': 'tempest-server-test-270521243', 'name': 'tapb9c988ec-66', 'instance_id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e8:db:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb9c988ec-66'}, 'message_id': 'a4deb2e4-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.222981151, 'message_signature': '9ee4e4a60309d58659a51741b81db23042f9bf9f2fa338535f1f1c0b7f65ca49'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000003f-5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-tap5b604d92-0d', 'timestamp': '2026-01-22T17:41:55.555240', 'resource_metadata': {'display_name': 'tempest-server-test-1916165929', 'name': 'tap5b604d92-0d', 'instance_id': '5bc95cf8-db79-4c62-95f8-ab8f0dbabd44', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:77:61:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5b604d92-0d'}, 'message_id': 'a4debe6a-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.225276153, 'message_signature': 'e67519e646bb5bfd20c3a7c2476103e0ec5b3c09ac3285fef9e2dde87ab77689'}]}, 'timestamp': '2026-01-22 17:41:55.555831', '_unique_id': 'c3035c0f32f047ce88f0ee13d4bde5a9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.556 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.557 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.557 12 DEBUG ceilometer.compute.pollsters [-] 5abad643-2e22-47fc-bd1d-98ba4f7d6edd/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.557 12 DEBUG ceilometer.compute.pollsters [-] 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/disk.device.read.bytes volume: 31087104 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9ce1f6db-361c-46c8-bce6-46517b87d1ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd-vda', 'timestamp': '2026-01-22T17:41:55.557295', 'resource_metadata': {'display_name': 'tempest-server-test-270521243', 'name': 'instance-00000040', 'instance_id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a4df0262-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.229881406, 'message_signature': '883e81b232492c550ecb856115ada311be2e63a060c48e77ca92e16741240329'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31087104, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 
'5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-vda', 'timestamp': '2026-01-22T17:41:55.557295', 'resource_metadata': {'display_name': 'tempest-server-test-1916165929', 'name': 'instance-0000003f', 'instance_id': '5bc95cf8-db79-4c62-95f8-ab8f0dbabd44', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a4df0d16-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.242375701, 'message_signature': '4eaafecdc201208361694ba81cd67d1e5973c80306ddf9ca04b559682a5c996f'}]}, 'timestamp': '2026-01-22 17:41:55.557833', '_unique_id': '38489030e3de4ba89781fdb53d1f7847'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.558 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.559 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.559 12 DEBUG ceilometer.compute.pollsters [-] 5abad643-2e22-47fc-bd1d-98ba4f7d6edd/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.559 12 DEBUG ceilometer.compute.pollsters [-] 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/disk.device.allocation volume: 30220288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d6462d6-effc-4178-9946-18934fc491ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd-vda', 'timestamp': '2026-01-22T17:41:55.559199', 'resource_metadata': {'display_name': 'tempest-server-test-270521243', 'name': 'instance-00000040', 'instance_id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a4df4cb8-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.302546493, 'message_signature': '563ab211d23a08a60a63988bbde31f0fb0259120788774452d992dea85caad92'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30220288, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 
'5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-vda', 'timestamp': '2026-01-22T17:41:55.559199', 'resource_metadata': {'display_name': 'tempest-server-test-1916165929', 'name': 'instance-0000003f', 'instance_id': '5bc95cf8-db79-4c62-95f8-ab8f0dbabd44', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a4df56e0-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.308152933, 'message_signature': 'a148c673ee19aef2f68b929a4065a52ddec8302434d3fa58f47fc1b4a0a453e1'}]}, 'timestamp': '2026-01-22 17:41:55.559745', '_unique_id': '1aabf1b574c249f0b60831b613a6997d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.560 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.561 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.561 12 DEBUG ceilometer.compute.pollsters [-] 5abad643-2e22-47fc-bd1d-98ba4f7d6edd/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.561 12 DEBUG ceilometer.compute.pollsters [-] 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea4c18ba-3366-4c75-bf78-fb9a6824c6b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd-vda', 'timestamp': '2026-01-22T17:41:55.561133', 'resource_metadata': {'display_name': 'tempest-server-test-270521243', 'name': 'instance-00000040', 'instance_id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a4df9862-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.302546493, 'message_signature': '4df1f863ed3cbc1c4ca888d0b3e2fd8401d1b65af10667551d81d87159800d20'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 
'5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-vda', 'timestamp': '2026-01-22T17:41:55.561133', 'resource_metadata': {'display_name': 'tempest-server-test-1916165929', 'name': 'instance-0000003f', 'instance_id': '5bc95cf8-db79-4c62-95f8-ab8f0dbabd44', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a4dfa262-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.308152933, 'message_signature': '7037c29debe32a2f8616b55232a5f62d05cc21bd401295f56e2d21860caa1e72'}]}, 'timestamp': '2026-01-22 17:41:55.561676', '_unique_id': '5c6c014e8ee241ea8293ae9de0bdfb17'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.562 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.563 12 DEBUG ceilometer.compute.pollsters [-] 5abad643-2e22-47fc-bd1d-98ba4f7d6edd/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.563 12 DEBUG ceilometer.compute.pollsters [-] 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/network.incoming.bytes volume: 7207 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84e25317-5270-4a97-8ddb-a167ed6e5cad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000040-5abad643-2e22-47fc-bd1d-98ba4f7d6edd-tapb9c988ec-66', 'timestamp': '2026-01-22T17:41:55.563014', 'resource_metadata': {'display_name': 'tempest-server-test-270521243', 'name': 'tapb9c988ec-66', 'instance_id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e8:db:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb9c988ec-66'}, 'message_id': 'a4dfe1dc-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.222981151, 'message_signature': '9988d9ca6e449cec72f476affe39e0bc886306ad7e4b0e95142c940eee67abb6'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7207, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000003f-5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-tap5b604d92-0d', 'timestamp': '2026-01-22T17:41:55.563014', 'resource_metadata': {'display_name': 'tempest-server-test-1916165929', 'name': 'tap5b604d92-0d', 'instance_id': '5bc95cf8-db79-4c62-95f8-ab8f0dbabd44', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:77:61:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5b604d92-0d'}, 'message_id': 'a4dfebfa-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.225276153, 'message_signature': '0005a6ae0b38ec4cf672d85152ec4da99fbb8c4f138e0a89d4181ac19e5d7c8e'}]}, 'timestamp': '2026-01-22 17:41:55.563543', '_unique_id': '2481434980c9435382ef5413d0e9dba4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.564 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.565 12 DEBUG ceilometer.compute.pollsters [-] 5abad643-2e22-47fc-bd1d-98ba4f7d6edd/memory.usage volume: 40.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.565 12 DEBUG ceilometer.compute.pollsters [-] 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/memory.usage volume: 41.92578125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b59f4ee1-c0f7-4f28-b467-0247f44c7b36', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.38671875, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd', 'timestamp': '2026-01-22T17:41:55.565044', 'resource_metadata': {'display_name': 'tempest-server-test-270521243', 'name': 'instance-00000040', 'instance_id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'a4e03132-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.286797911, 'message_signature': 'c83ff8fa24a7f55b70e421bd100bb75fd90aa48b86a58f9547e6a8705c19cb1e'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 41.92578125, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '5bc95cf8-db79-4c62-95f8-ab8f0dbabd44', 'timestamp': '2026-01-22T17:41:55.565044', 'resource_metadata': {'display_name': 'tempest-server-test-1916165929', 'name': 'instance-0000003f', 'instance_id': '5bc95cf8-db79-4c62-95f8-ab8f0dbabd44', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'a4e03a9c-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6079.298912256, 'message_signature': '2127a071db589be0ea8160dfff6d7c16006018460c2dfb2be34debbef428ceb6'}]}, 'timestamp': '2026-01-22 17:41:55.565550', '_unique_id': 'c8a054d6685842529349b3b896f3547d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:41:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:41:55.566 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:41:58 compute-0 nova_compute[183075]: 2026-01-22 17:41:58.000 183079 INFO nova.compute.manager [None req-cefc2450-1a59-49ad-9245-26cefde1e7f2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Get console output
Jan 22 17:41:58 compute-0 nova_compute[183075]: 2026-01-22 17:41:58.004 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:41:58 compute-0 ovn_controller[95372]: 2026-01-22T17:41:58Z|00130|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e8:db:2c 10.100.0.4
Jan 22 17:41:58 compute-0 ovn_controller[95372]: 2026-01-22T17:41:58Z|00131|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e8:db:2c 10.100.0.4
Jan 22 17:41:58 compute-0 nova_compute[183075]: 2026-01-22 17:41:58.365 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:41:59 compute-0 nova_compute[183075]: 2026-01-22 17:41:59.760 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:00 compute-0 nova_compute[183075]: 2026-01-22 17:42:00.154 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:42:01 compute-0 sshd-session[238183]: Connection closed by 85.217.149.18 port 37922 [preauth]
Jan 22 17:42:03 compute-0 nova_compute[183075]: 2026-01-22 17:42:03.196 183079 INFO nova.compute.manager [None req-d58ada19-4c0a-4afe-88f0-a2ce62638c67 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Get console output
Jan 22 17:42:03 compute-0 nova_compute[183075]: 2026-01-22 17:42:03.201 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:42:03 compute-0 nova_compute[183075]: 2026-01-22 17:42:03.367 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:04.747 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:42:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:04.748 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:42:04 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:42:04 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:42:04 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:42:04 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:42:04 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:42:04 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:42:04 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:42:04 compute-0 nova_compute[183075]: 2026-01-22 17:42:04.762 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.441 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.441 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.6926789
Jan 22 17:42:05 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237711]: 10.100.0.4:50660 [22/Jan/2026:17:42:04.746] listener listener/metadata 0/0/0/694/694 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.450 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.451 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.473 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.473 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0225737
Jan 22 17:42:05 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237711]: 10.100.0.4:50676 [22/Jan/2026:17:42:05.449] listener listener/metadata 0/0/0/23/23 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.477 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.477 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.492 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.493 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0150840
Jan 22 17:42:05 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237711]: 10.100.0.4:50682 [22/Jan/2026:17:42:05.476] listener listener/metadata 0/0/0/16/16 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.497 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.497 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.510 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.511 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0134749
Jan 22 17:42:05 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237711]: 10.100.0.4:50686 [22/Jan/2026:17:42:05.497] listener listener/metadata 0/0/0/14/14 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.515 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.516 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.529 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.529 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0131693
Jan 22 17:42:05 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237711]: 10.100.0.4:50688 [22/Jan/2026:17:42:05.515] listener listener/metadata 0/0/0/14/14 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.533 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.534 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.546 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.547 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0131867
Jan 22 17:42:05 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237711]: 10.100.0.4:50704 [22/Jan/2026:17:42:05.533] listener listener/metadata 0/0/0/14/14 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.551 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.552 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.563 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.563 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0110636
Jan 22 17:42:05 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237711]: 10.100.0.4:50718 [22/Jan/2026:17:42:05.551] listener listener/metadata 0/0/0/12/12 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.566 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.567 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.578 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.578 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0116720
Jan 22 17:42:05 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237711]: 10.100.0.4:50724 [22/Jan/2026:17:42:05.566] listener listener/metadata 0/0/0/12/12 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.582 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.582 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.592 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.593 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 165 time: 0.0108879
Jan 22 17:42:05 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237711]: 10.100.0.4:50726 [22/Jan/2026:17:42:05.581] listener listener/metadata 0/0/0/11/11 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.596 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.597 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.609 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.609 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 165 time: 0.0121574
Jan 22 17:42:05 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237711]: 10.100.0.4:50742 [22/Jan/2026:17:42:05.596] listener listener/metadata 0/0/0/12/12 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.613 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.613 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:42:05 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237711]: 10.100.0.4:50752 [22/Jan/2026:17:42:05.613] listener listener/metadata 0/0/0/15/15 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.628 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0147414
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.637 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.638 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.652 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.653 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0148292
Jan 22 17:42:05 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237711]: 10.100.0.4:50754 [22/Jan/2026:17:42:05.637] listener listener/metadata 0/0/0/16/16 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.656 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.657 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.670 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:42:05 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237711]: 10.100.0.4:50764 [22/Jan/2026:17:42:05.656] listener listener/metadata 0/0/0/14/14 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.671 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0135634
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.674 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.674 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.686 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.687 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0123217
Jan 22 17:42:05 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237711]: 10.100.0.4:50780 [22/Jan/2026:17:42:05.673] listener listener/metadata 0/0/0/13/13 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.696 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.697 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.712 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.712 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 165 time: 0.0152929
Jan 22 17:42:05 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237711]: 10.100.0.4:50784 [22/Jan/2026:17:42:05.695] listener listener/metadata 0/0/0/16/16 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.717 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.717 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.729 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:42:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:05.729 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0122249
Jan 22 17:42:05 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237711]: 10.100.0.4:50800 [22/Jan/2026:17:42:05.716] listener listener/metadata 0/0/0/12/12 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:42:06 compute-0 podman[238185]: 2026-01-22 17:42:06.339253066 +0000 UTC m=+0.053153435 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:42:08 compute-0 nova_compute[183075]: 2026-01-22 17:42:08.370 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:08 compute-0 nova_compute[183075]: 2026-01-22 17:42:08.465 183079 INFO nova.compute.manager [None req-837066d2-227b-4b2a-8aa4-5dbe2fde14d8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Get console output
Jan 22 17:42:08 compute-0 nova_compute[183075]: 2026-01-22 17:42:08.470 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:42:09 compute-0 nova_compute[183075]: 2026-01-22 17:42:09.766 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:10 compute-0 podman[238209]: 2026-01-22 17:42:10.341401242 +0000 UTC m=+0.048155751 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:42:13 compute-0 nova_compute[183075]: 2026-01-22 17:42:13.372 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:13 compute-0 nova_compute[183075]: 2026-01-22 17:42:13.814 183079 INFO nova.compute.manager [None req-50e7ac2d-6a83-491d-8a75-c452d7b39a05 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Get console output
Jan 22 17:42:13 compute-0 nova_compute[183075]: 2026-01-22 17:42:13.819 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:42:14 compute-0 nova_compute[183075]: 2026-01-22 17:42:14.767 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:14 compute-0 ovn_controller[95372]: 2026-01-22T17:42:14Z|00729|memory_trim|INFO|Detected inactivity (last active 30025 ms ago): trimming memory
Jan 22 17:42:18 compute-0 nova_compute[183075]: 2026-01-22 17:42:18.403 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:18 compute-0 nova_compute[183075]: 2026-01-22 17:42:18.933 183079 INFO nova.compute.manager [None req-0c739acb-4836-4721-a6fc-60a7b7a528e5 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Get console output
Jan 22 17:42:18 compute-0 nova_compute[183075]: 2026-01-22 17:42:18.937 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:42:19 compute-0 nova_compute[183075]: 2026-01-22 17:42:19.770 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:20 compute-0 podman[238234]: 2026-01-22 17:42:20.355883401 +0000 UTC m=+0.060785990 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 22 17:42:20 compute-0 podman[238233]: 2026-01-22 17:42:20.364241784 +0000 UTC m=+0.076928211 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 17:42:20 compute-0 podman[238235]: 2026-01-22 17:42:20.395456491 +0000 UTC m=+0.093293070 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.buildah.version=1.33.7, name=ubi9-minimal, vendor=Red Hat, Inc., vcs-type=git, maintainer=Red Hat, Inc., io.openshift.expose-services=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public)
Jan 22 17:42:23 compute-0 nova_compute[183075]: 2026-01-22 17:42:23.405 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:24 compute-0 nova_compute[183075]: 2026-01-22 17:42:24.087 183079 INFO nova.compute.manager [None req-ba6cee9c-2aea-4147-91d0-e5e8163b50d5 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Get console output
Jan 22 17:42:24 compute-0 nova_compute[183075]: 2026-01-22 17:42:24.092 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:42:24 compute-0 nova_compute[183075]: 2026-01-22 17:42:24.771 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:26 compute-0 podman[238298]: 2026-01-22 17:42:26.3564816 +0000 UTC m=+0.062417713 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute)
Jan 22 17:42:27 compute-0 nova_compute[183075]: 2026-01-22 17:42:27.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:42:28 compute-0 nova_compute[183075]: 2026-01-22 17:42:28.406 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:29 compute-0 nova_compute[183075]: 2026-01-22 17:42:29.221 183079 INFO nova.compute.manager [None req-fb3f2f55-7da7-48d8-8fac-1f4dbb15507a 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Get console output
Jan 22 17:42:29 compute-0 nova_compute[183075]: 2026-01-22 17:42:29.225 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:42:29 compute-0 nova_compute[183075]: 2026-01-22 17:42:29.775 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:32 compute-0 nova_compute[183075]: 2026-01-22 17:42:32.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:42:33 compute-0 nova_compute[183075]: 2026-01-22 17:42:33.419 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:33 compute-0 nova_compute[183075]: 2026-01-22 17:42:33.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:42:34 compute-0 nova_compute[183075]: 2026-01-22 17:42:34.372 183079 INFO nova.compute.manager [None req-2697d5b9-29a1-4ae5-8d27-7b2cf1ae0939 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Get console output
Jan 22 17:42:34 compute-0 nova_compute[183075]: 2026-01-22 17:42:34.376 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:42:34 compute-0 nova_compute[183075]: 2026-01-22 17:42:34.777 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:36 compute-0 nova_compute[183075]: 2026-01-22 17:42:36.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:42:37 compute-0 podman[238319]: 2026-01-22 17:42:37.337434158 +0000 UTC m=+0.051022608 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 17:42:38 compute-0 nova_compute[183075]: 2026-01-22 17:42:38.420 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:39 compute-0 nova_compute[183075]: 2026-01-22 17:42:39.691 183079 INFO nova.compute.manager [None req-757b5636-8343-4216-acfe-c68608f828fa 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Get console output
Jan 22 17:42:39 compute-0 nova_compute[183075]: 2026-01-22 17:42:39.700 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:42:39 compute-0 nova_compute[183075]: 2026-01-22 17:42:39.779 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:41 compute-0 sshd-session[238343]: Received disconnect from 91.224.92.78 port 40336:11:  [preauth]
Jan 22 17:42:41 compute-0 sshd-session[238343]: Disconnected from authenticating user root 91.224.92.78 port 40336 [preauth]
Jan 22 17:42:41 compute-0 podman[238345]: 2026-01-22 17:42:41.336301445 +0000 UTC m=+0.043166068 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:42:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:41.960 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:42:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:41.961 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:42:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:41.961 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:42:42 compute-0 nova_compute[183075]: 2026-01-22 17:42:42.782 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:42:42 compute-0 nova_compute[183075]: 2026-01-22 17:42:42.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:42:42 compute-0 nova_compute[183075]: 2026-01-22 17:42:42.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:42:42 compute-0 nova_compute[183075]: 2026-01-22 17:42:42.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:42:43 compute-0 nova_compute[183075]: 2026-01-22 17:42:43.423 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:44 compute-0 nova_compute[183075]: 2026-01-22 17:42:44.782 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:45 compute-0 nova_compute[183075]: 2026-01-22 17:42:45.337 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-5bc95cf8-db79-4c62-95f8-ab8f0dbabd44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:42:45 compute-0 nova_compute[183075]: 2026-01-22 17:42:45.338 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-5bc95cf8-db79-4c62-95f8-ab8f0dbabd44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:42:45 compute-0 nova_compute[183075]: 2026-01-22 17:42:45.338 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:42:45 compute-0 nova_compute[183075]: 2026-01-22 17:42:45.338 183079 DEBUG nova.objects.instance [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:42:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:46.613 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=48, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=47) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:42:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:46.614 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:42:46 compute-0 nova_compute[183075]: 2026-01-22 17:42:46.614 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:47 compute-0 nova_compute[183075]: 2026-01-22 17:42:47.694 183079 DEBUG nova.compute.manager [req-3dcbe3cd-8c2e-49bf-8b54-636b60dff1b1 req-bb95b23b-a102-494f-87ad-36f8d15b7b04 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Received event network-changed-5b604d92-0d68-405f-b4e2-6a3f72fbabad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:42:47 compute-0 nova_compute[183075]: 2026-01-22 17:42:47.695 183079 DEBUG nova.compute.manager [req-3dcbe3cd-8c2e-49bf-8b54-636b60dff1b1 req-bb95b23b-a102-494f-87ad-36f8d15b7b04 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Refreshing instance network info cache due to event network-changed-5b604d92-0d68-405f-b4e2-6a3f72fbabad. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:42:47 compute-0 nova_compute[183075]: 2026-01-22 17:42:47.695 183079 DEBUG oslo_concurrency.lockutils [req-3dcbe3cd-8c2e-49bf-8b54-636b60dff1b1 req-bb95b23b-a102-494f-87ad-36f8d15b7b04 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-5bc95cf8-db79-4c62-95f8-ab8f0dbabd44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:42:48 compute-0 nova_compute[183075]: 2026-01-22 17:42:48.425 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:48 compute-0 nova_compute[183075]: 2026-01-22 17:42:48.576 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Updating instance_info_cache with network_info: [{"id": "5b604d92-0d68-405f-b4e2-6a3f72fbabad", "address": "fa:16:3e:77:61:07", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b604d92-0d", "ovs_interfaceid": "5b604d92-0d68-405f-b4e2-6a3f72fbabad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:42:49 compute-0 nova_compute[183075]: 2026-01-22 17:42:49.202 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-5bc95cf8-db79-4c62-95f8-ab8f0dbabd44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:42:49 compute-0 nova_compute[183075]: 2026-01-22 17:42:49.202 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:42:49 compute-0 nova_compute[183075]: 2026-01-22 17:42:49.203 183079 DEBUG oslo_concurrency.lockutils [req-3dcbe3cd-8c2e-49bf-8b54-636b60dff1b1 req-bb95b23b-a102-494f-87ad-36f8d15b7b04 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-5bc95cf8-db79-4c62-95f8-ab8f0dbabd44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:42:49 compute-0 nova_compute[183075]: 2026-01-22 17:42:49.204 183079 DEBUG nova.network.neutron [req-3dcbe3cd-8c2e-49bf-8b54-636b60dff1b1 req-bb95b23b-a102-494f-87ad-36f8d15b7b04 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Refreshing network info cache for port 5b604d92-0d68-405f-b4e2-6a3f72fbabad _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:42:49 compute-0 nova_compute[183075]: 2026-01-22 17:42:49.205 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:42:49 compute-0 nova_compute[183075]: 2026-01-22 17:42:49.207 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:42:49 compute-0 nova_compute[183075]: 2026-01-22 17:42:49.207 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:42:49 compute-0 nova_compute[183075]: 2026-01-22 17:42:49.207 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:42:49 compute-0 nova_compute[183075]: 2026-01-22 17:42:49.235 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:42:49 compute-0 nova_compute[183075]: 2026-01-22 17:42:49.236 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:42:49 compute-0 nova_compute[183075]: 2026-01-22 17:42:49.236 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:42:49 compute-0 nova_compute[183075]: 2026-01-22 17:42:49.236 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:42:49 compute-0 nova_compute[183075]: 2026-01-22 17:42:49.398 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5abad643-2e22-47fc-bd1d-98ba4f7d6edd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:42:49 compute-0 nova_compute[183075]: 2026-01-22 17:42:49.454 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5abad643-2e22-47fc-bd1d-98ba4f7d6edd/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:42:49 compute-0 nova_compute[183075]: 2026-01-22 17:42:49.455 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5abad643-2e22-47fc-bd1d-98ba4f7d6edd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:42:49 compute-0 nova_compute[183075]: 2026-01-22 17:42:49.519 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5abad643-2e22-47fc-bd1d-98ba4f7d6edd/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:42:49 compute-0 nova_compute[183075]: 2026-01-22 17:42:49.525 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:42:49 compute-0 nova_compute[183075]: 2026-01-22 17:42:49.581 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:42:49 compute-0 nova_compute[183075]: 2026-01-22 17:42:49.581 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:42:49 compute-0 nova_compute[183075]: 2026-01-22 17:42:49.640 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bc95cf8-db79-4c62-95f8-ab8f0dbabd44/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:42:49 compute-0 nova_compute[183075]: 2026-01-22 17:42:49.784 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:49 compute-0 nova_compute[183075]: 2026-01-22 17:42:49.789 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:42:49 compute-0 nova_compute[183075]: 2026-01-22 17:42:49.790 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5392MB free_disk=73.30254745483398GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:42:49 compute-0 nova_compute[183075]: 2026-01-22 17:42:49.790 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:42:49 compute-0 nova_compute[183075]: 2026-01-22 17:42:49.790 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:42:51 compute-0 podman[238382]: 2026-01-22 17:42:51.353665392 +0000 UTC m=+0.055389335 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent)
Jan 22 17:42:51 compute-0 podman[238383]: 2026-01-22 17:42:51.37002187 +0000 UTC m=+0.066937404 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-type=git, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, version=9.6, architecture=x86_64)
Jan 22 17:42:51 compute-0 podman[238381]: 2026-01-22 17:42:51.383385488 +0000 UTC m=+0.081947866 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:42:51 compute-0 nova_compute[183075]: 2026-01-22 17:42:51.388 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:42:51 compute-0 nova_compute[183075]: 2026-01-22 17:42:51.389 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 5abad643-2e22-47fc-bd1d-98ba4f7d6edd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:42:51 compute-0 nova_compute[183075]: 2026-01-22 17:42:51.389 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:42:51 compute-0 nova_compute[183075]: 2026-01-22 17:42:51.389 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:42:51 compute-0 nova_compute[183075]: 2026-01-22 17:42:51.400 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Refreshing inventories for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 17:42:51 compute-0 nova_compute[183075]: 2026-01-22 17:42:51.470 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Updating ProviderTree inventory for provider 2513134c-f67c-4237-84bf-4ebe2450d610 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 17:42:51 compute-0 nova_compute[183075]: 2026-01-22 17:42:51.470 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Updating inventory in ProviderTree for provider 2513134c-f67c-4237-84bf-4ebe2450d610 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 17:42:51 compute-0 nova_compute[183075]: 2026-01-22 17:42:51.482 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Refreshing aggregate associations for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 17:42:51 compute-0 nova_compute[183075]: 2026-01-22 17:42:51.502 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Refreshing trait associations for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610, traits: COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SVM,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE42,COMPUTE_NODE,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE4A,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_FMA3,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_AUTO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 17:42:51 compute-0 nova_compute[183075]: 2026-01-22 17:42:51.553 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:42:51 compute-0 nova_compute[183075]: 2026-01-22 17:42:51.585 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:42:51 compute-0 nova_compute[183075]: 2026-01-22 17:42:51.587 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:42:51 compute-0 nova_compute[183075]: 2026-01-22 17:42:51.587 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.796s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:42:52 compute-0 nova_compute[183075]: 2026-01-22 17:42:52.567 183079 DEBUG nova.network.neutron [req-3dcbe3cd-8c2e-49bf-8b54-636b60dff1b1 req-bb95b23b-a102-494f-87ad-36f8d15b7b04 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Updated VIF entry in instance network info cache for port 5b604d92-0d68-405f-b4e2-6a3f72fbabad. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:42:52 compute-0 nova_compute[183075]: 2026-01-22 17:42:52.568 183079 DEBUG nova.network.neutron [req-3dcbe3cd-8c2e-49bf-8b54-636b60dff1b1 req-bb95b23b-a102-494f-87ad-36f8d15b7b04 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Updating instance_info_cache with network_info: [{"id": "5b604d92-0d68-405f-b4e2-6a3f72fbabad", "address": "fa:16:3e:77:61:07", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b604d92-0d", "ovs_interfaceid": "5b604d92-0d68-405f-b4e2-6a3f72fbabad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:42:52 compute-0 nova_compute[183075]: 2026-01-22 17:42:52.664 183079 DEBUG oslo_concurrency.lockutils [req-3dcbe3cd-8c2e-49bf-8b54-636b60dff1b1 req-bb95b23b-a102-494f-87ad-36f8d15b7b04 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-5bc95cf8-db79-4c62-95f8-ab8f0dbabd44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:42:52 compute-0 nova_compute[183075]: 2026-01-22 17:42:52.674 183079 DEBUG nova.compute.manager [req-9bece60b-7299-4257-9577-a3820204eb17 req-da776710-9aab-4f37-a4e9-f54ac9302285 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Received event network-changed-b9c988ec-665e-44cd-a682-5e207216eabc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:42:52 compute-0 nova_compute[183075]: 2026-01-22 17:42:52.675 183079 DEBUG nova.compute.manager [req-9bece60b-7299-4257-9577-a3820204eb17 req-da776710-9aab-4f37-a4e9-f54ac9302285 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Refreshing instance network info cache due to event network-changed-b9c988ec-665e-44cd-a682-5e207216eabc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:42:52 compute-0 nova_compute[183075]: 2026-01-22 17:42:52.675 183079 DEBUG oslo_concurrency.lockutils [req-9bece60b-7299-4257-9577-a3820204eb17 req-da776710-9aab-4f37-a4e9-f54ac9302285 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-5abad643-2e22-47fc-bd1d-98ba4f7d6edd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:42:52 compute-0 nova_compute[183075]: 2026-01-22 17:42:52.676 183079 DEBUG oslo_concurrency.lockutils [req-9bece60b-7299-4257-9577-a3820204eb17 req-da776710-9aab-4f37-a4e9-f54ac9302285 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-5abad643-2e22-47fc-bd1d-98ba4f7d6edd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:42:52 compute-0 nova_compute[183075]: 2026-01-22 17:42:52.676 183079 DEBUG nova.network.neutron [req-9bece60b-7299-4257-9577-a3820204eb17 req-da776710-9aab-4f37-a4e9-f54ac9302285 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Refreshing network info cache for port b9c988ec-665e-44cd-a682-5e207216eabc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:42:53 compute-0 nova_compute[183075]: 2026-01-22 17:42:53.427 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:54 compute-0 nova_compute[183075]: 2026-01-22 17:42:54.682 183079 DEBUG nova.network.neutron [req-9bece60b-7299-4257-9577-a3820204eb17 req-da776710-9aab-4f37-a4e9-f54ac9302285 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Updated VIF entry in instance network info cache for port b9c988ec-665e-44cd-a682-5e207216eabc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:42:54 compute-0 nova_compute[183075]: 2026-01-22 17:42:54.683 183079 DEBUG nova.network.neutron [req-9bece60b-7299-4257-9577-a3820204eb17 req-da776710-9aab-4f37-a4e9-f54ac9302285 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Updating instance_info_cache with network_info: [{"id": "b9c988ec-665e-44cd-a682-5e207216eabc", "address": "fa:16:3e:e8:db:2c", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb9c988ec-66", "ovs_interfaceid": "b9c988ec-665e-44cd-a682-5e207216eabc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:42:54 compute-0 nova_compute[183075]: 2026-01-22 17:42:54.743 183079 DEBUG oslo_concurrency.lockutils [req-9bece60b-7299-4257-9577-a3820204eb17 req-da776710-9aab-4f37-a4e9-f54ac9302285 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-5abad643-2e22-47fc-bd1d-98ba4f7d6edd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:42:54 compute-0 nova_compute[183075]: 2026-01-22 17:42:54.787 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:55 compute-0 nova_compute[183075]: 2026-01-22 17:42:55.660 183079 DEBUG oslo_concurrency.lockutils [None req-205c7196-1d33-4b10-90df-420eb2d69e63 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "5abad643-2e22-47fc-bd1d-98ba4f7d6edd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:42:55 compute-0 nova_compute[183075]: 2026-01-22 17:42:55.660 183079 DEBUG oslo_concurrency.lockutils [None req-205c7196-1d33-4b10-90df-420eb2d69e63 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "5abad643-2e22-47fc-bd1d-98ba4f7d6edd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:42:55 compute-0 nova_compute[183075]: 2026-01-22 17:42:55.661 183079 DEBUG oslo_concurrency.lockutils [None req-205c7196-1d33-4b10-90df-420eb2d69e63 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "5abad643-2e22-47fc-bd1d-98ba4f7d6edd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:42:55 compute-0 nova_compute[183075]: 2026-01-22 17:42:55.661 183079 DEBUG oslo_concurrency.lockutils [None req-205c7196-1d33-4b10-90df-420eb2d69e63 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "5abad643-2e22-47fc-bd1d-98ba4f7d6edd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:42:55 compute-0 nova_compute[183075]: 2026-01-22 17:42:55.661 183079 DEBUG oslo_concurrency.lockutils [None req-205c7196-1d33-4b10-90df-420eb2d69e63 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "5abad643-2e22-47fc-bd1d-98ba4f7d6edd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:42:55 compute-0 nova_compute[183075]: 2026-01-22 17:42:55.662 183079 INFO nova.compute.manager [None req-205c7196-1d33-4b10-90df-420eb2d69e63 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Terminating instance
Jan 22 17:42:55 compute-0 nova_compute[183075]: 2026-01-22 17:42:55.663 183079 DEBUG nova.compute.manager [None req-205c7196-1d33-4b10-90df-420eb2d69e63 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:42:55 compute-0 kernel: tapb9c988ec-66 (unregistering): left promiscuous mode
Jan 22 17:42:55 compute-0 NetworkManager[55454]: <info>  [1769103775.6982] device (tapb9c988ec-66): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:42:55 compute-0 ovn_controller[95372]: 2026-01-22T17:42:55Z|00730|binding|INFO|Releasing lport b9c988ec-665e-44cd-a682-5e207216eabc from this chassis (sb_readonly=0)
Jan 22 17:42:55 compute-0 nova_compute[183075]: 2026-01-22 17:42:55.707 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:55 compute-0 ovn_controller[95372]: 2026-01-22T17:42:55Z|00731|binding|INFO|Setting lport b9c988ec-665e-44cd-a682-5e207216eabc down in Southbound
Jan 22 17:42:55 compute-0 ovn_controller[95372]: 2026-01-22T17:42:55Z|00732|binding|INFO|Removing iface tapb9c988ec-66 ovn-installed in OVS
Jan 22 17:42:55 compute-0 nova_compute[183075]: 2026-01-22 17:42:55.722 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:55 compute-0 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d00000040.scope: Deactivated successfully.
Jan 22 17:42:55 compute-0 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d00000040.scope: Consumed 14.569s CPU time.
Jan 22 17:42:55 compute-0 systemd-machined[154382]: Machine qemu-64-instance-00000040 terminated.
Jan 22 17:42:55 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:55.891 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:db:2c 10.100.0.4'], port_security=['fa:16:3e:e8:db:2c 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '5abad643-2e22-47fc-bd1d-98ba4f7d6edd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5d20a402-d35a-4a65-8bef-cd1a3a7097da', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.221'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=b9c988ec-665e-44cd-a682-5e207216eabc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:42:55 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:55.893 104629 INFO neutron.agent.ovn.metadata.agent [-] Port b9c988ec-665e-44cd-a682-5e207216eabc in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 unbound from our chassis
Jan 22 17:42:55 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:55.894 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:42:55 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:55.915 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d03bddf4-e6d8-4385-a167-a6e7232aeec2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:42:55 compute-0 nova_compute[183075]: 2026-01-22 17:42:55.927 183079 INFO nova.virt.libvirt.driver [-] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Instance destroyed successfully.
Jan 22 17:42:55 compute-0 nova_compute[183075]: 2026-01-22 17:42:55.928 183079 DEBUG nova.objects.instance [None req-205c7196-1d33-4b10-90df-420eb2d69e63 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'resources' on Instance uuid 5abad643-2e22-47fc-bd1d-98ba4f7d6edd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:42:55 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:55.947 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[bb5d74a7-302d-4a86-9bfc-3e290559158f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:42:55 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:55.951 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[d9848567-8943-46d4-b204-249aadb5f05a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:42:55 compute-0 nova_compute[183075]: 2026-01-22 17:42:55.953 183079 DEBUG nova.virt.libvirt.vif [None req-205c7196-1d33-4b10-90df-420eb2d69e63 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:41:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-270521243',display_name='tempest-server-test-270521243',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-270521243',id=64,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:41:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-43v38ray',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:41:45Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=5abad643-2e22-47fc-bd1d-98ba4f7d6edd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b9c988ec-665e-44cd-a682-5e207216eabc", "address": "fa:16:3e:e8:db:2c", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb9c988ec-66", "ovs_interfaceid": "b9c988ec-665e-44cd-a682-5e207216eabc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:42:55 compute-0 nova_compute[183075]: 2026-01-22 17:42:55.954 183079 DEBUG nova.network.os_vif_util [None req-205c7196-1d33-4b10-90df-420eb2d69e63 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "b9c988ec-665e-44cd-a682-5e207216eabc", "address": "fa:16:3e:e8:db:2c", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb9c988ec-66", "ovs_interfaceid": "b9c988ec-665e-44cd-a682-5e207216eabc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:42:55 compute-0 nova_compute[183075]: 2026-01-22 17:42:55.955 183079 DEBUG nova.network.os_vif_util [None req-205c7196-1d33-4b10-90df-420eb2d69e63 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e8:db:2c,bridge_name='br-int',has_traffic_filtering=True,id=b9c988ec-665e-44cd-a682-5e207216eabc,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb9c988ec-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:42:55 compute-0 nova_compute[183075]: 2026-01-22 17:42:55.955 183079 DEBUG os_vif [None req-205c7196-1d33-4b10-90df-420eb2d69e63 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e8:db:2c,bridge_name='br-int',has_traffic_filtering=True,id=b9c988ec-665e-44cd-a682-5e207216eabc,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb9c988ec-66') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:42:55 compute-0 nova_compute[183075]: 2026-01-22 17:42:55.956 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:55 compute-0 nova_compute[183075]: 2026-01-22 17:42:55.957 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb9c988ec-66, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:42:55 compute-0 nova_compute[183075]: 2026-01-22 17:42:55.959 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:55 compute-0 nova_compute[183075]: 2026-01-22 17:42:55.960 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:55 compute-0 nova_compute[183075]: 2026-01-22 17:42:55.963 183079 INFO os_vif [None req-205c7196-1d33-4b10-90df-420eb2d69e63 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e8:db:2c,bridge_name='br-int',has_traffic_filtering=True,id=b9c988ec-665e-44cd-a682-5e207216eabc,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb9c988ec-66')
Jan 22 17:42:55 compute-0 nova_compute[183075]: 2026-01-22 17:42:55.964 183079 INFO nova.virt.libvirt.driver [None req-205c7196-1d33-4b10-90df-420eb2d69e63 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Deleting instance files /var/lib/nova/instances/5abad643-2e22-47fc-bd1d-98ba4f7d6edd_del
Jan 22 17:42:55 compute-0 nova_compute[183075]: 2026-01-22 17:42:55.964 183079 INFO nova.virt.libvirt.driver [None req-205c7196-1d33-4b10-90df-420eb2d69e63 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Deletion of /var/lib/nova/instances/5abad643-2e22-47fc-bd1d-98ba4f7d6edd_del complete
Jan 22 17:42:55 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:55.982 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[65f40702-599a-44c3-804b-3efb1a60ee68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:42:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:56.000 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c622fe3d-9790-491a-85de-19197499f962]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 202, 'tx_packets': 105, 'rx_bytes': 17308, 'tx_bytes': 11989, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 202, 'tx_packets': 105, 'rx_bytes': 17308, 'tx_bytes': 11989, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 195], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600351, 'reachable_time': 28750, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238469, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:42:56 compute-0 nova_compute[183075]: 2026-01-22 17:42:56.010 183079 INFO nova.compute.manager [None req-205c7196-1d33-4b10-90df-420eb2d69e63 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Took 0.35 seconds to destroy the instance on the hypervisor.
Jan 22 17:42:56 compute-0 nova_compute[183075]: 2026-01-22 17:42:56.010 183079 DEBUG oslo.service.loopingcall [None req-205c7196-1d33-4b10-90df-420eb2d69e63 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:42:56 compute-0 nova_compute[183075]: 2026-01-22 17:42:56.010 183079 DEBUG nova.compute.manager [-] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:42:56 compute-0 nova_compute[183075]: 2026-01-22 17:42:56.010 183079 DEBUG nova.network.neutron [-] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:42:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:56.020 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2bdd83ab-5f9a-4ac4-a325-58ce58a7a5a4]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600361, 'tstamp': 600361}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238470, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600364, 'tstamp': 600364}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238470, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:42:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:56.023 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:42:56 compute-0 nova_compute[183075]: 2026-01-22 17:42:56.024 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:56 compute-0 nova_compute[183075]: 2026-01-22 17:42:56.025 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:56.026 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88ed9213-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:42:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:56.026 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:42:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:56.026 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88ed9213-70, col_values=(('external_ids', {'iface-id': 'da93a32e-eed6-4124-8f08-78b0bfe2cb69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:42:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:56.027 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:42:56 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:56.616 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '48'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:42:56 compute-0 nova_compute[183075]: 2026-01-22 17:42:56.854 183079 DEBUG nova.compute.manager [req-57245558-0cee-406e-837b-b01b951a8b89 req-b2f3e25a-8807-4235-977e-9c2be33b0952 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Received event network-vif-unplugged-b9c988ec-665e-44cd-a682-5e207216eabc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:42:56 compute-0 nova_compute[183075]: 2026-01-22 17:42:56.855 183079 DEBUG oslo_concurrency.lockutils [req-57245558-0cee-406e-837b-b01b951a8b89 req-b2f3e25a-8807-4235-977e-9c2be33b0952 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "5abad643-2e22-47fc-bd1d-98ba4f7d6edd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:42:56 compute-0 nova_compute[183075]: 2026-01-22 17:42:56.855 183079 DEBUG oslo_concurrency.lockutils [req-57245558-0cee-406e-837b-b01b951a8b89 req-b2f3e25a-8807-4235-977e-9c2be33b0952 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "5abad643-2e22-47fc-bd1d-98ba4f7d6edd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:42:56 compute-0 nova_compute[183075]: 2026-01-22 17:42:56.855 183079 DEBUG oslo_concurrency.lockutils [req-57245558-0cee-406e-837b-b01b951a8b89 req-b2f3e25a-8807-4235-977e-9c2be33b0952 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "5abad643-2e22-47fc-bd1d-98ba4f7d6edd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:42:56 compute-0 nova_compute[183075]: 2026-01-22 17:42:56.855 183079 DEBUG nova.compute.manager [req-57245558-0cee-406e-837b-b01b951a8b89 req-b2f3e25a-8807-4235-977e-9c2be33b0952 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] No waiting events found dispatching network-vif-unplugged-b9c988ec-665e-44cd-a682-5e207216eabc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:42:56 compute-0 nova_compute[183075]: 2026-01-22 17:42:56.855 183079 DEBUG nova.compute.manager [req-57245558-0cee-406e-837b-b01b951a8b89 req-b2f3e25a-8807-4235-977e-9c2be33b0952 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Received event network-vif-unplugged-b9c988ec-665e-44cd-a682-5e207216eabc for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:42:57 compute-0 podman[238471]: 2026-01-22 17:42:57.355012711 +0000 UTC m=+0.059212547 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202)
Jan 22 17:42:57 compute-0 nova_compute[183075]: 2026-01-22 17:42:57.644 183079 DEBUG nova.network.neutron [-] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:42:57 compute-0 nova_compute[183075]: 2026-01-22 17:42:57.885 183079 INFO nova.compute.manager [-] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Took 1.87 seconds to deallocate network for instance.
Jan 22 17:42:58 compute-0 nova_compute[183075]: 2026-01-22 17:42:58.044 183079 DEBUG oslo_concurrency.lockutils [None req-205c7196-1d33-4b10-90df-420eb2d69e63 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:42:58 compute-0 nova_compute[183075]: 2026-01-22 17:42:58.045 183079 DEBUG oslo_concurrency.lockutils [None req-205c7196-1d33-4b10-90df-420eb2d69e63 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:42:58 compute-0 nova_compute[183075]: 2026-01-22 17:42:58.112 183079 DEBUG nova.compute.provider_tree [None req-205c7196-1d33-4b10-90df-420eb2d69e63 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:42:58 compute-0 nova_compute[183075]: 2026-01-22 17:42:58.302 183079 DEBUG nova.scheduler.client.report [None req-205c7196-1d33-4b10-90df-420eb2d69e63 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:42:58 compute-0 nova_compute[183075]: 2026-01-22 17:42:58.431 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:58 compute-0 nova_compute[183075]: 2026-01-22 17:42:58.521 183079 DEBUG oslo_concurrency.lockutils [None req-205c7196-1d33-4b10-90df-420eb2d69e63 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.476s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:42:58 compute-0 nova_compute[183075]: 2026-01-22 17:42:58.552 183079 INFO nova.scheduler.client.report [None req-205c7196-1d33-4b10-90df-420eb2d69e63 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Deleted allocations for instance 5abad643-2e22-47fc-bd1d-98ba4f7d6edd
Jan 22 17:42:58 compute-0 nova_compute[183075]: 2026-01-22 17:42:58.823 183079 DEBUG oslo_concurrency.lockutils [None req-205c7196-1d33-4b10-90df-420eb2d69e63 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "5abad643-2e22-47fc-bd1d-98ba4f7d6edd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:42:59 compute-0 nova_compute[183075]: 2026-01-22 17:42:59.073 183079 DEBUG nova.compute.manager [req-5f772253-541d-44f9-bccb-8106ab7626af req-09be601e-3f71-48f6-b691-13394c01f5d2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Received event network-vif-plugged-b9c988ec-665e-44cd-a682-5e207216eabc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:42:59 compute-0 nova_compute[183075]: 2026-01-22 17:42:59.074 183079 DEBUG oslo_concurrency.lockutils [req-5f772253-541d-44f9-bccb-8106ab7626af req-09be601e-3f71-48f6-b691-13394c01f5d2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "5abad643-2e22-47fc-bd1d-98ba4f7d6edd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:42:59 compute-0 nova_compute[183075]: 2026-01-22 17:42:59.074 183079 DEBUG oslo_concurrency.lockutils [req-5f772253-541d-44f9-bccb-8106ab7626af req-09be601e-3f71-48f6-b691-13394c01f5d2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "5abad643-2e22-47fc-bd1d-98ba4f7d6edd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:42:59 compute-0 nova_compute[183075]: 2026-01-22 17:42:59.075 183079 DEBUG oslo_concurrency.lockutils [req-5f772253-541d-44f9-bccb-8106ab7626af req-09be601e-3f71-48f6-b691-13394c01f5d2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "5abad643-2e22-47fc-bd1d-98ba4f7d6edd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:42:59 compute-0 nova_compute[183075]: 2026-01-22 17:42:59.075 183079 DEBUG nova.compute.manager [req-5f772253-541d-44f9-bccb-8106ab7626af req-09be601e-3f71-48f6-b691-13394c01f5d2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] No waiting events found dispatching network-vif-plugged-b9c988ec-665e-44cd-a682-5e207216eabc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:42:59 compute-0 nova_compute[183075]: 2026-01-22 17:42:59.075 183079 WARNING nova.compute.manager [req-5f772253-541d-44f9-bccb-8106ab7626af req-09be601e-3f71-48f6-b691-13394c01f5d2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Received unexpected event network-vif-plugged-b9c988ec-665e-44cd-a682-5e207216eabc for instance with vm_state deleted and task_state None.
Jan 22 17:42:59 compute-0 nova_compute[183075]: 2026-01-22 17:42:59.076 183079 DEBUG nova.compute.manager [req-5f772253-541d-44f9-bccb-8106ab7626af req-09be601e-3f71-48f6-b691-13394c01f5d2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Received event network-vif-deleted-b9c988ec-665e-44cd-a682-5e207216eabc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:42:59 compute-0 nova_compute[183075]: 2026-01-22 17:42:59.348 183079 DEBUG oslo_concurrency.lockutils [None req-fce579f1-914e-4f24-beff-3533e69d1422 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "5bc95cf8-db79-4c62-95f8-ab8f0dbabd44" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:42:59 compute-0 nova_compute[183075]: 2026-01-22 17:42:59.349 183079 DEBUG oslo_concurrency.lockutils [None req-fce579f1-914e-4f24-beff-3533e69d1422 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "5bc95cf8-db79-4c62-95f8-ab8f0dbabd44" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:42:59 compute-0 nova_compute[183075]: 2026-01-22 17:42:59.349 183079 DEBUG oslo_concurrency.lockutils [None req-fce579f1-914e-4f24-beff-3533e69d1422 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:42:59 compute-0 nova_compute[183075]: 2026-01-22 17:42:59.349 183079 DEBUG oslo_concurrency.lockutils [None req-fce579f1-914e-4f24-beff-3533e69d1422 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:42:59 compute-0 nova_compute[183075]: 2026-01-22 17:42:59.350 183079 DEBUG oslo_concurrency.lockutils [None req-fce579f1-914e-4f24-beff-3533e69d1422 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:42:59 compute-0 nova_compute[183075]: 2026-01-22 17:42:59.351 183079 INFO nova.compute.manager [None req-fce579f1-914e-4f24-beff-3533e69d1422 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Terminating instance
Jan 22 17:42:59 compute-0 nova_compute[183075]: 2026-01-22 17:42:59.352 183079 DEBUG nova.compute.manager [None req-fce579f1-914e-4f24-beff-3533e69d1422 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:42:59 compute-0 kernel: tap5b604d92-0d (unregistering): left promiscuous mode
Jan 22 17:42:59 compute-0 NetworkManager[55454]: <info>  [1769103779.3719] device (tap5b604d92-0d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:42:59 compute-0 nova_compute[183075]: 2026-01-22 17:42:59.378 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:59 compute-0 ovn_controller[95372]: 2026-01-22T17:42:59Z|00733|binding|INFO|Releasing lport 5b604d92-0d68-405f-b4e2-6a3f72fbabad from this chassis (sb_readonly=0)
Jan 22 17:42:59 compute-0 ovn_controller[95372]: 2026-01-22T17:42:59Z|00734|binding|INFO|Setting lport 5b604d92-0d68-405f-b4e2-6a3f72fbabad down in Southbound
Jan 22 17:42:59 compute-0 ovn_controller[95372]: 2026-01-22T17:42:59Z|00735|binding|INFO|Removing iface tap5b604d92-0d ovn-installed in OVS
Jan 22 17:42:59 compute-0 nova_compute[183075]: 2026-01-22 17:42:59.381 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:59 compute-0 nova_compute[183075]: 2026-01-22 17:42:59.394 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:59 compute-0 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d0000003f.scope: Deactivated successfully.
Jan 22 17:42:59 compute-0 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d0000003f.scope: Consumed 17.571s CPU time.
Jan 22 17:42:59 compute-0 systemd-machined[154382]: Machine qemu-63-instance-0000003f terminated.
Jan 22 17:42:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:59.549 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:61:07 10.100.0.9'], port_security=['fa:16:3e:77:61:07 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '5bc95cf8-db79-4c62-95f8-ab8f0dbabd44', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5d20a402-d35a-4a65-8bef-cd1a3a7097da', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.231'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=5b604d92-0d68-405f-b4e2-6a3f72fbabad) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:42:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:59.550 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 5b604d92-0d68-405f-b4e2-6a3f72fbabad in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 unbound from our chassis
Jan 22 17:42:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:59.551 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:42:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:59.552 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[bd16b414-15d4-4444-b8b9-848ec01dd657]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:42:59 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:42:59.553 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 namespace which is not needed anymore
Jan 22 17:42:59 compute-0 nova_compute[183075]: 2026-01-22 17:42:59.615 183079 INFO nova.virt.libvirt.driver [-] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Instance destroyed successfully.
Jan 22 17:42:59 compute-0 nova_compute[183075]: 2026-01-22 17:42:59.615 183079 DEBUG nova.objects.instance [None req-fce579f1-914e-4f24-beff-3533e69d1422 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'resources' on Instance uuid 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:42:59 compute-0 nova_compute[183075]: 2026-01-22 17:42:59.754 183079 DEBUG nova.virt.libvirt.vif [None req-fce579f1-914e-4f24-beff-3533e69d1422 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:40:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1916165929',display_name='tempest-server-test-1916165929',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1916165929',id=63,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:40:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-uy1v1f1q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_h
w_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:40:40Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=5bc95cf8-db79-4c62-95f8-ab8f0dbabd44,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5b604d92-0d68-405f-b4e2-6a3f72fbabad", "address": "fa:16:3e:77:61:07", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b604d92-0d", "ovs_interfaceid": "5b604d92-0d68-405f-b4e2-6a3f72fbabad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:42:59 compute-0 nova_compute[183075]: 2026-01-22 17:42:59.755 183079 DEBUG nova.network.os_vif_util [None req-fce579f1-914e-4f24-beff-3533e69d1422 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "5b604d92-0d68-405f-b4e2-6a3f72fbabad", "address": "fa:16:3e:77:61:07", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b604d92-0d", "ovs_interfaceid": "5b604d92-0d68-405f-b4e2-6a3f72fbabad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:42:59 compute-0 nova_compute[183075]: 2026-01-22 17:42:59.756 183079 DEBUG nova.network.os_vif_util [None req-fce579f1-914e-4f24-beff-3533e69d1422 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:77:61:07,bridge_name='br-int',has_traffic_filtering=True,id=5b604d92-0d68-405f-b4e2-6a3f72fbabad,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b604d92-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:42:59 compute-0 nova_compute[183075]: 2026-01-22 17:42:59.756 183079 DEBUG os_vif [None req-fce579f1-914e-4f24-beff-3533e69d1422 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:77:61:07,bridge_name='br-int',has_traffic_filtering=True,id=5b604d92-0d68-405f-b4e2-6a3f72fbabad,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b604d92-0d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:42:59 compute-0 nova_compute[183075]: 2026-01-22 17:42:59.758 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:59 compute-0 nova_compute[183075]: 2026-01-22 17:42:59.759 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5b604d92-0d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:42:59 compute-0 nova_compute[183075]: 2026-01-22 17:42:59.761 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:59 compute-0 nova_compute[183075]: 2026-01-22 17:42:59.762 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:59 compute-0 nova_compute[183075]: 2026-01-22 17:42:59.766 183079 INFO os_vif [None req-fce579f1-914e-4f24-beff-3533e69d1422 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:77:61:07,bridge_name='br-int',has_traffic_filtering=True,id=5b604d92-0d68-405f-b4e2-6a3f72fbabad,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b604d92-0d')
Jan 22 17:42:59 compute-0 nova_compute[183075]: 2026-01-22 17:42:59.767 183079 INFO nova.virt.libvirt.driver [None req-fce579f1-914e-4f24-beff-3533e69d1422 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Deleting instance files /var/lib/nova/instances/5bc95cf8-db79-4c62-95f8-ab8f0dbabd44_del
Jan 22 17:42:59 compute-0 nova_compute[183075]: 2026-01-22 17:42:59.767 183079 INFO nova.virt.libvirt.driver [None req-fce579f1-914e-4f24-beff-3533e69d1422 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Deletion of /var/lib/nova/instances/5bc95cf8-db79-4c62-95f8-ab8f0dbabd44_del complete
Jan 22 17:42:59 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237705]: [NOTICE]   (237709) : haproxy version is 2.8.14-c23fe91
Jan 22 17:42:59 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237705]: [NOTICE]   (237709) : path to executable is /usr/sbin/haproxy
Jan 22 17:42:59 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237705]: [WARNING]  (237709) : Exiting Master process...
Jan 22 17:42:59 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237705]: [ALERT]    (237709) : Current worker (237711) exited with code 143 (Terminated)
Jan 22 17:42:59 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[237705]: [WARNING]  (237709) : All workers exited. Exiting... (0)
Jan 22 17:42:59 compute-0 systemd[1]: libpod-1ce05d4696edaa5144b72d13bc4ec532a882443887f7a0f2b062a21ca6e0629c.scope: Deactivated successfully.
Jan 22 17:42:59 compute-0 podman[238526]: 2026-01-22 17:42:59.778335855 +0000 UTC m=+0.136176599 container died 1ce05d4696edaa5144b72d13bc4ec532a882443887f7a0f2b062a21ca6e0629c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 17:43:00 compute-0 nova_compute[183075]: 2026-01-22 17:43:00.214 183079 INFO nova.compute.manager [None req-fce579f1-914e-4f24-beff-3533e69d1422 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Took 0.86 seconds to destroy the instance on the hypervisor.
Jan 22 17:43:00 compute-0 nova_compute[183075]: 2026-01-22 17:43:00.216 183079 DEBUG oslo.service.loopingcall [None req-fce579f1-914e-4f24-beff-3533e69d1422 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:43:00 compute-0 nova_compute[183075]: 2026-01-22 17:43:00.217 183079 DEBUG nova.compute.manager [-] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:43:00 compute-0 nova_compute[183075]: 2026-01-22 17:43:00.217 183079 DEBUG nova.network.neutron [-] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:43:00 compute-0 systemd[1]: var-lib-containers-storage-overlay-d3b52b4ca8ea92f634dea2c0bf0806bc158530ab7534af443d415dba6fc695dc-merged.mount: Deactivated successfully.
Jan 22 17:43:00 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1ce05d4696edaa5144b72d13bc4ec532a882443887f7a0f2b062a21ca6e0629c-userdata-shm.mount: Deactivated successfully.
Jan 22 17:43:02 compute-0 nova_compute[183075]: 2026-01-22 17:43:02.188 183079 DEBUG nova.compute.manager [req-c22c6e8e-c52c-4b67-bac3-0e2096b5baad req-90303c9f-ceb3-4f58-a6f7-78339c5d7ee0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Received event network-vif-unplugged-5b604d92-0d68-405f-b4e2-6a3f72fbabad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:43:02 compute-0 nova_compute[183075]: 2026-01-22 17:43:02.189 183079 DEBUG oslo_concurrency.lockutils [req-c22c6e8e-c52c-4b67-bac3-0e2096b5baad req-90303c9f-ceb3-4f58-a6f7-78339c5d7ee0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:43:02 compute-0 nova_compute[183075]: 2026-01-22 17:43:02.189 183079 DEBUG oslo_concurrency.lockutils [req-c22c6e8e-c52c-4b67-bac3-0e2096b5baad req-90303c9f-ceb3-4f58-a6f7-78339c5d7ee0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:43:02 compute-0 nova_compute[183075]: 2026-01-22 17:43:02.189 183079 DEBUG oslo_concurrency.lockutils [req-c22c6e8e-c52c-4b67-bac3-0e2096b5baad req-90303c9f-ceb3-4f58-a6f7-78339c5d7ee0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:43:02 compute-0 nova_compute[183075]: 2026-01-22 17:43:02.190 183079 DEBUG nova.compute.manager [req-c22c6e8e-c52c-4b67-bac3-0e2096b5baad req-90303c9f-ceb3-4f58-a6f7-78339c5d7ee0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] No waiting events found dispatching network-vif-unplugged-5b604d92-0d68-405f-b4e2-6a3f72fbabad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:43:02 compute-0 nova_compute[183075]: 2026-01-22 17:43:02.190 183079 DEBUG nova.compute.manager [req-c22c6e8e-c52c-4b67-bac3-0e2096b5baad req-90303c9f-ceb3-4f58-a6f7-78339c5d7ee0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Received event network-vif-unplugged-5b604d92-0d68-405f-b4e2-6a3f72fbabad for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:43:02 compute-0 podman[238526]: 2026-01-22 17:43:02.397547977 +0000 UTC m=+2.755388761 container cleanup 1ce05d4696edaa5144b72d13bc4ec532a882443887f7a0f2b062a21ca6e0629c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 22 17:43:02 compute-0 systemd[1]: libpod-conmon-1ce05d4696edaa5144b72d13bc4ec532a882443887f7a0f2b062a21ca6e0629c.scope: Deactivated successfully.
Jan 22 17:43:03 compute-0 podman[238557]: 2026-01-22 17:43:03.214781837 +0000 UTC m=+0.782840720 container remove 1ce05d4696edaa5144b72d13bc4ec532a882443887f7a0f2b062a21ca6e0629c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 22 17:43:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:03.219 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6409a314-1ba7-439c-912d-a4ec48069ff3]: (4, ('Thu Jan 22 05:42:59 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 (1ce05d4696edaa5144b72d13bc4ec532a882443887f7a0f2b062a21ca6e0629c)\n1ce05d4696edaa5144b72d13bc4ec532a882443887f7a0f2b062a21ca6e0629c\nThu Jan 22 05:43:02 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 (1ce05d4696edaa5144b72d13bc4ec532a882443887f7a0f2b062a21ca6e0629c)\n1ce05d4696edaa5144b72d13bc4ec532a882443887f7a0f2b062a21ca6e0629c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:43:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:03.222 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[76eceddf-0787-424a-b08b-af2d04ed72a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:43:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:03.223 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:43:03 compute-0 nova_compute[183075]: 2026-01-22 17:43:03.225 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:03 compute-0 kernel: tap88ed9213-70: left promiscuous mode
Jan 22 17:43:03 compute-0 nova_compute[183075]: 2026-01-22 17:43:03.229 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:03.232 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b9401b42-3db6-4d7e-9a90-d9599b8d126d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:43:03 compute-0 nova_compute[183075]: 2026-01-22 17:43:03.243 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:03.254 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[17952614-9425-4488-9e04-3b2e4ce0245b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:43:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:03.256 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b3f82d08-ea67-4d30-80e5-3fb5e5d832c1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:43:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:03.274 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[677054e6-2852-4b21-ac87-9c6f26a32647]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600345, 'reachable_time': 30164, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238571, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:43:03 compute-0 systemd[1]: run-netns-ovnmeta\x2d88ed9213\x2d7d48\x2d4c42\x2dad60\x2dce3fcf104f71.mount: Deactivated successfully.
Jan 22 17:43:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:03.278 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:43:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:03.278 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[6e0555fe-3354-4770-8b44-92f01a025f76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:43:03 compute-0 nova_compute[183075]: 2026-01-22 17:43:03.433 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:04 compute-0 nova_compute[183075]: 2026-01-22 17:43:04.762 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:04 compute-0 nova_compute[183075]: 2026-01-22 17:43:04.892 183079 DEBUG nova.network.neutron [-] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:43:04 compute-0 nova_compute[183075]: 2026-01-22 17:43:04.904 183079 DEBUG nova.compute.manager [req-0497fdb3-f7cf-4f18-b1f2-ab1a3e6f2a26 req-4ec6d86d-066e-46e8-afe3-7762d27e711a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Received event network-vif-deleted-5b604d92-0d68-405f-b4e2-6a3f72fbabad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:43:04 compute-0 nova_compute[183075]: 2026-01-22 17:43:04.904 183079 INFO nova.compute.manager [req-0497fdb3-f7cf-4f18-b1f2-ab1a3e6f2a26 req-4ec6d86d-066e-46e8-afe3-7762d27e711a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Neutron deleted interface 5b604d92-0d68-405f-b4e2-6a3f72fbabad; detaching it from the instance and deleting it from the info cache
Jan 22 17:43:04 compute-0 nova_compute[183075]: 2026-01-22 17:43:04.904 183079 DEBUG nova.network.neutron [req-0497fdb3-f7cf-4f18-b1f2-ab1a3e6f2a26 req-4ec6d86d-066e-46e8-afe3-7762d27e711a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:43:04 compute-0 nova_compute[183075]: 2026-01-22 17:43:04.906 183079 DEBUG nova.compute.manager [req-4f3af66a-f8ed-4f64-b129-1c4cbddf47d9 req-b2d44791-cf4c-4abc-ba18-6fb274e7ddf1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Received event network-vif-plugged-5b604d92-0d68-405f-b4e2-6a3f72fbabad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:43:04 compute-0 nova_compute[183075]: 2026-01-22 17:43:04.906 183079 DEBUG oslo_concurrency.lockutils [req-4f3af66a-f8ed-4f64-b129-1c4cbddf47d9 req-b2d44791-cf4c-4abc-ba18-6fb274e7ddf1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:43:04 compute-0 nova_compute[183075]: 2026-01-22 17:43:04.906 183079 DEBUG oslo_concurrency.lockutils [req-4f3af66a-f8ed-4f64-b129-1c4cbddf47d9 req-b2d44791-cf4c-4abc-ba18-6fb274e7ddf1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:43:04 compute-0 nova_compute[183075]: 2026-01-22 17:43:04.907 183079 DEBUG oslo_concurrency.lockutils [req-4f3af66a-f8ed-4f64-b129-1c4cbddf47d9 req-b2d44791-cf4c-4abc-ba18-6fb274e7ddf1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "5bc95cf8-db79-4c62-95f8-ab8f0dbabd44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:43:04 compute-0 nova_compute[183075]: 2026-01-22 17:43:04.907 183079 DEBUG nova.compute.manager [req-4f3af66a-f8ed-4f64-b129-1c4cbddf47d9 req-b2d44791-cf4c-4abc-ba18-6fb274e7ddf1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] No waiting events found dispatching network-vif-plugged-5b604d92-0d68-405f-b4e2-6a3f72fbabad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:43:04 compute-0 nova_compute[183075]: 2026-01-22 17:43:04.907 183079 WARNING nova.compute.manager [req-4f3af66a-f8ed-4f64-b129-1c4cbddf47d9 req-b2d44791-cf4c-4abc-ba18-6fb274e7ddf1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Received unexpected event network-vif-plugged-5b604d92-0d68-405f-b4e2-6a3f72fbabad for instance with vm_state active and task_state deleting.
Jan 22 17:43:05 compute-0 nova_compute[183075]: 2026-01-22 17:43:05.222 183079 INFO nova.compute.manager [-] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Took 5.00 seconds to deallocate network for instance.
Jan 22 17:43:05 compute-0 nova_compute[183075]: 2026-01-22 17:43:05.233 183079 DEBUG nova.compute.manager [req-0497fdb3-f7cf-4f18-b1f2-ab1a3e6f2a26 req-4ec6d86d-066e-46e8-afe3-7762d27e711a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Detach interface failed, port_id=5b604d92-0d68-405f-b4e2-6a3f72fbabad, reason: Instance 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 22 17:43:05 compute-0 nova_compute[183075]: 2026-01-22 17:43:05.577 183079 DEBUG oslo_concurrency.lockutils [None req-fce579f1-914e-4f24-beff-3533e69d1422 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:43:05 compute-0 nova_compute[183075]: 2026-01-22 17:43:05.578 183079 DEBUG oslo_concurrency.lockutils [None req-fce579f1-914e-4f24-beff-3533e69d1422 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:43:05 compute-0 nova_compute[183075]: 2026-01-22 17:43:05.625 183079 DEBUG nova.compute.provider_tree [None req-fce579f1-914e-4f24-beff-3533e69d1422 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:43:05 compute-0 nova_compute[183075]: 2026-01-22 17:43:05.836 183079 DEBUG nova.scheduler.client.report [None req-fce579f1-914e-4f24-beff-3533e69d1422 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:43:06 compute-0 nova_compute[183075]: 2026-01-22 17:43:06.272 183079 DEBUG oslo_concurrency.lockutils [None req-fce579f1-914e-4f24-beff-3533e69d1422 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.695s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:43:06 compute-0 nova_compute[183075]: 2026-01-22 17:43:06.441 183079 INFO nova.scheduler.client.report [None req-fce579f1-914e-4f24-beff-3533e69d1422 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Deleted allocations for instance 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44
Jan 22 17:43:06 compute-0 nova_compute[183075]: 2026-01-22 17:43:06.832 183079 DEBUG oslo_concurrency.lockutils [None req-fce579f1-914e-4f24-beff-3533e69d1422 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "5bc95cf8-db79-4c62-95f8-ab8f0dbabd44" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.484s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:43:08 compute-0 podman[238576]: 2026-01-22 17:43:08.335444495 +0000 UTC m=+0.045218132 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 17:43:08 compute-0 nova_compute[183075]: 2026-01-22 17:43:08.435 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:09 compute-0 nova_compute[183075]: 2026-01-22 17:43:09.766 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:10 compute-0 nova_compute[183075]: 2026-01-22 17:43:10.926 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769103775.9242158, 5abad643-2e22-47fc-bd1d-98ba4f7d6edd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:43:10 compute-0 nova_compute[183075]: 2026-01-22 17:43:10.926 183079 INFO nova.compute.manager [-] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] VM Stopped (Lifecycle Event)
Jan 22 17:43:10 compute-0 nova_compute[183075]: 2026-01-22 17:43:10.946 183079 DEBUG nova.compute.manager [None req-c2221f15-c110-41dc-94e7-5955efbfdf33 - - - - - -] [instance: 5abad643-2e22-47fc-bd1d-98ba4f7d6edd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:43:12 compute-0 podman[238601]: 2026-01-22 17:43:12.340266822 +0000 UTC m=+0.052941449 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 17:43:13 compute-0 nova_compute[183075]: 2026-01-22 17:43:13.437 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:14 compute-0 nova_compute[183075]: 2026-01-22 17:43:14.081 183079 DEBUG oslo_concurrency.lockutils [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:43:14 compute-0 nova_compute[183075]: 2026-01-22 17:43:14.081 183079 DEBUG oslo_concurrency.lockutils [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:43:14 compute-0 nova_compute[183075]: 2026-01-22 17:43:14.288 183079 DEBUG nova.compute.manager [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:43:14 compute-0 nova_compute[183075]: 2026-01-22 17:43:14.614 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769103779.613193, 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:43:14 compute-0 nova_compute[183075]: 2026-01-22 17:43:14.614 183079 INFO nova.compute.manager [-] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] VM Stopped (Lifecycle Event)
Jan 22 17:43:14 compute-0 nova_compute[183075]: 2026-01-22 17:43:14.769 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:14 compute-0 nova_compute[183075]: 2026-01-22 17:43:14.978 183079 DEBUG oslo_concurrency.lockutils [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:43:14 compute-0 nova_compute[183075]: 2026-01-22 17:43:14.978 183079 DEBUG oslo_concurrency.lockutils [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:43:14 compute-0 nova_compute[183075]: 2026-01-22 17:43:14.985 183079 DEBUG nova.virt.hardware [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:43:14 compute-0 nova_compute[183075]: 2026-01-22 17:43:14.985 183079 INFO nova.compute.claims [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:43:15 compute-0 nova_compute[183075]: 2026-01-22 17:43:15.054 183079 DEBUG nova.compute.manager [None req-ea2aa433-e1a9-4096-bd6d-7c1340c8716d - - - - - -] [instance: 5bc95cf8-db79-4c62-95f8-ab8f0dbabd44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:43:15 compute-0 nova_compute[183075]: 2026-01-22 17:43:15.489 183079 DEBUG nova.compute.provider_tree [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:43:15 compute-0 nova_compute[183075]: 2026-01-22 17:43:15.532 183079 DEBUG nova.scheduler.client.report [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:43:15 compute-0 nova_compute[183075]: 2026-01-22 17:43:15.704 183079 DEBUG oslo_concurrency.lockutils [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.726s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:43:15 compute-0 nova_compute[183075]: 2026-01-22 17:43:15.705 183079 DEBUG nova.compute.manager [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:43:16 compute-0 nova_compute[183075]: 2026-01-22 17:43:16.072 183079 DEBUG nova.compute.manager [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:43:16 compute-0 nova_compute[183075]: 2026-01-22 17:43:16.072 183079 DEBUG nova.network.neutron [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:43:16 compute-0 nova_compute[183075]: 2026-01-22 17:43:16.262 183079 INFO nova.virt.libvirt.driver [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:43:16 compute-0 nova_compute[183075]: 2026-01-22 17:43:16.642 183079 DEBUG nova.compute.manager [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:43:17 compute-0 nova_compute[183075]: 2026-01-22 17:43:17.335 183079 DEBUG nova.compute.manager [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:43:17 compute-0 nova_compute[183075]: 2026-01-22 17:43:17.336 183079 DEBUG nova.virt.libvirt.driver [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:43:17 compute-0 nova_compute[183075]: 2026-01-22 17:43:17.336 183079 INFO nova.virt.libvirt.driver [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Creating image(s)
Jan 22 17:43:17 compute-0 nova_compute[183075]: 2026-01-22 17:43:17.337 183079 DEBUG oslo_concurrency.lockutils [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "/var/lib/nova/instances/c570960f-e948-4456-9f9c-7b8afd2cf0ac/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:43:17 compute-0 nova_compute[183075]: 2026-01-22 17:43:17.337 183079 DEBUG oslo_concurrency.lockutils [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/c570960f-e948-4456-9f9c-7b8afd2cf0ac/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:43:17 compute-0 nova_compute[183075]: 2026-01-22 17:43:17.338 183079 DEBUG oslo_concurrency.lockutils [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/c570960f-e948-4456-9f9c-7b8afd2cf0ac/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:43:17 compute-0 nova_compute[183075]: 2026-01-22 17:43:17.351 183079 DEBUG oslo_concurrency.processutils [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:43:17 compute-0 nova_compute[183075]: 2026-01-22 17:43:17.401 183079 DEBUG nova.policy [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:43:17 compute-0 nova_compute[183075]: 2026-01-22 17:43:17.408 183079 DEBUG oslo_concurrency.processutils [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:43:17 compute-0 nova_compute[183075]: 2026-01-22 17:43:17.408 183079 DEBUG oslo_concurrency.lockutils [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:43:17 compute-0 nova_compute[183075]: 2026-01-22 17:43:17.409 183079 DEBUG oslo_concurrency.lockutils [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:43:17 compute-0 nova_compute[183075]: 2026-01-22 17:43:17.420 183079 DEBUG oslo_concurrency.processutils [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:43:17 compute-0 nova_compute[183075]: 2026-01-22 17:43:17.469 183079 DEBUG oslo_concurrency.processutils [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:43:17 compute-0 nova_compute[183075]: 2026-01-22 17:43:17.470 183079 DEBUG oslo_concurrency.processutils [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/c570960f-e948-4456-9f9c-7b8afd2cf0ac/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:43:17 compute-0 nova_compute[183075]: 2026-01-22 17:43:17.865 183079 DEBUG oslo_concurrency.processutils [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/c570960f-e948-4456-9f9c-7b8afd2cf0ac/disk 1073741824" returned: 0 in 0.395s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:43:17 compute-0 nova_compute[183075]: 2026-01-22 17:43:17.867 183079 DEBUG oslo_concurrency.lockutils [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.458s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:43:17 compute-0 nova_compute[183075]: 2026-01-22 17:43:17.867 183079 DEBUG oslo_concurrency.processutils [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:43:17 compute-0 nova_compute[183075]: 2026-01-22 17:43:17.925 183079 DEBUG oslo_concurrency.processutils [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:43:17 compute-0 nova_compute[183075]: 2026-01-22 17:43:17.926 183079 DEBUG nova.virt.disk.api [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Checking if we can resize image /var/lib/nova/instances/c570960f-e948-4456-9f9c-7b8afd2cf0ac/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:43:17 compute-0 nova_compute[183075]: 2026-01-22 17:43:17.927 183079 DEBUG oslo_concurrency.processutils [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c570960f-e948-4456-9f9c-7b8afd2cf0ac/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:43:17 compute-0 nova_compute[183075]: 2026-01-22 17:43:17.986 183079 DEBUG oslo_concurrency.processutils [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c570960f-e948-4456-9f9c-7b8afd2cf0ac/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:43:17 compute-0 nova_compute[183075]: 2026-01-22 17:43:17.987 183079 DEBUG nova.virt.disk.api [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Cannot resize image /var/lib/nova/instances/c570960f-e948-4456-9f9c-7b8afd2cf0ac/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:43:17 compute-0 nova_compute[183075]: 2026-01-22 17:43:17.987 183079 DEBUG nova.objects.instance [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'migration_context' on Instance uuid c570960f-e948-4456-9f9c-7b8afd2cf0ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:43:18 compute-0 nova_compute[183075]: 2026-01-22 17:43:18.013 183079 DEBUG nova.virt.libvirt.driver [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:43:18 compute-0 nova_compute[183075]: 2026-01-22 17:43:18.014 183079 DEBUG nova.virt.libvirt.driver [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Ensure instance console log exists: /var/lib/nova/instances/c570960f-e948-4456-9f9c-7b8afd2cf0ac/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:43:18 compute-0 nova_compute[183075]: 2026-01-22 17:43:18.015 183079 DEBUG oslo_concurrency.lockutils [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:43:18 compute-0 nova_compute[183075]: 2026-01-22 17:43:18.015 183079 DEBUG oslo_concurrency.lockutils [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:43:18 compute-0 nova_compute[183075]: 2026-01-22 17:43:18.015 183079 DEBUG oslo_concurrency.lockutils [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:43:18 compute-0 nova_compute[183075]: 2026-01-22 17:43:18.440 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:19 compute-0 nova_compute[183075]: 2026-01-22 17:43:19.772 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:20 compute-0 nova_compute[183075]: 2026-01-22 17:43:20.555 183079 DEBUG nova.network.neutron [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Successfully created port: eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:43:21 compute-0 nova_compute[183075]: 2026-01-22 17:43:21.354 183079 DEBUG nova.network.neutron [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Successfully updated port: eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:43:21 compute-0 nova_compute[183075]: 2026-01-22 17:43:21.375 183079 DEBUG oslo_concurrency.lockutils [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "refresh_cache-c570960f-e948-4456-9f9c-7b8afd2cf0ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:43:21 compute-0 nova_compute[183075]: 2026-01-22 17:43:21.375 183079 DEBUG oslo_concurrency.lockutils [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquired lock "refresh_cache-c570960f-e948-4456-9f9c-7b8afd2cf0ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:43:21 compute-0 nova_compute[183075]: 2026-01-22 17:43:21.375 183079 DEBUG nova.network.neutron [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:43:21 compute-0 nova_compute[183075]: 2026-01-22 17:43:21.435 183079 DEBUG nova.compute.manager [req-8ff43f89-518e-4369-8895-6ad8af6251ba req-f544a81e-346d-4ba5-b1df-076418e36d37 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Received event network-changed-eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:43:21 compute-0 nova_compute[183075]: 2026-01-22 17:43:21.435 183079 DEBUG nova.compute.manager [req-8ff43f89-518e-4369-8895-6ad8af6251ba req-f544a81e-346d-4ba5-b1df-076418e36d37 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Refreshing instance network info cache due to event network-changed-eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:43:21 compute-0 nova_compute[183075]: 2026-01-22 17:43:21.436 183079 DEBUG oslo_concurrency.lockutils [req-8ff43f89-518e-4369-8895-6ad8af6251ba req-f544a81e-346d-4ba5-b1df-076418e36d37 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-c570960f-e948-4456-9f9c-7b8afd2cf0ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:43:21 compute-0 nova_compute[183075]: 2026-01-22 17:43:21.518 183079 DEBUG nova.network.neutron [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:43:22 compute-0 podman[238643]: 2026-01-22 17:43:22.348676259 +0000 UTC m=+0.052393665 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 17:43:22 compute-0 podman[238644]: 2026-01-22 17:43:22.355951474 +0000 UTC m=+0.055880418 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.openshift.expose-services=, config_id=openstack_network_exporter, distribution-scope=public, build-date=2025-08-20T13:12:41, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350)
Jan 22 17:43:22 compute-0 podman[238642]: 2026-01-22 17:43:22.403379984 +0000 UTC m=+0.111866318 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.703 183079 DEBUG nova.network.neutron [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Updating instance_info_cache with network_info: [{"id": "eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef", "address": "fa:16:3e:d4:a5:66", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeff2a2a1-b5", "ovs_interfaceid": "eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.725 183079 DEBUG oslo_concurrency.lockutils [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Releasing lock "refresh_cache-c570960f-e948-4456-9f9c-7b8afd2cf0ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.726 183079 DEBUG nova.compute.manager [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Instance network_info: |[{"id": "eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef", "address": "fa:16:3e:d4:a5:66", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeff2a2a1-b5", "ovs_interfaceid": "eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.726 183079 DEBUG oslo_concurrency.lockutils [req-8ff43f89-518e-4369-8895-6ad8af6251ba req-f544a81e-346d-4ba5-b1df-076418e36d37 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-c570960f-e948-4456-9f9c-7b8afd2cf0ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.727 183079 DEBUG nova.network.neutron [req-8ff43f89-518e-4369-8895-6ad8af6251ba req-f544a81e-346d-4ba5-b1df-076418e36d37 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Refreshing network info cache for port eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.729 183079 DEBUG nova.virt.libvirt.driver [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Start _get_guest_xml network_info=[{"id": "eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef", "address": "fa:16:3e:d4:a5:66", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeff2a2a1-b5", "ovs_interfaceid": "eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.734 183079 WARNING nova.virt.libvirt.driver [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.740 183079 DEBUG nova.virt.libvirt.host [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.741 183079 DEBUG nova.virt.libvirt.host [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.749 183079 DEBUG nova.virt.libvirt.host [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.750 183079 DEBUG nova.virt.libvirt.host [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.751 183079 DEBUG nova.virt.libvirt.driver [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.751 183079 DEBUG nova.virt.hardware [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.752 183079 DEBUG nova.virt.hardware [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.752 183079 DEBUG nova.virt.hardware [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.752 183079 DEBUG nova.virt.hardware [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.753 183079 DEBUG nova.virt.hardware [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.753 183079 DEBUG nova.virt.hardware [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.753 183079 DEBUG nova.virt.hardware [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.754 183079 DEBUG nova.virt.hardware [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.754 183079 DEBUG nova.virt.hardware [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.754 183079 DEBUG nova.virt.hardware [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.755 183079 DEBUG nova.virt.hardware [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.758 183079 DEBUG nova.virt.libvirt.vif [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:43:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-738920808',display_name='tempest-server-test-738920808',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-738920808',id=65,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-abrrxh4i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='vi
rtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:43:17Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=c570960f-e948-4456-9f9c-7b8afd2cf0ac,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef", "address": "fa:16:3e:d4:a5:66", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeff2a2a1-b5", "ovs_interfaceid": "eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.759 183079 DEBUG nova.network.os_vif_util [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef", "address": "fa:16:3e:d4:a5:66", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeff2a2a1-b5", "ovs_interfaceid": "eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.760 183079 DEBUG nova.network.os_vif_util [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d4:a5:66,bridge_name='br-int',has_traffic_filtering=True,id=eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeff2a2a1-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.761 183079 DEBUG nova.objects.instance [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'pci_devices' on Instance uuid c570960f-e948-4456-9f9c-7b8afd2cf0ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.774 183079 DEBUG nova.virt.libvirt.driver [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:43:22 compute-0 nova_compute[183075]:   <uuid>c570960f-e948-4456-9f9c-7b8afd2cf0ac</uuid>
Jan 22 17:43:22 compute-0 nova_compute[183075]:   <name>instance-00000041</name>
Jan 22 17:43:22 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:43:22 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:43:22 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:43:22 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-738920808</nova:name>
Jan 22 17:43:22 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:43:22</nova:creationTime>
Jan 22 17:43:22 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:43:22 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:43:22 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:43:22 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:43:22 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:43:22 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:43:22 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:43:22 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:43:22 compute-0 nova_compute[183075]:         <nova:user uuid="66c24efd0cd24c57803ea8508e679d06">tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member</nova:user>
Jan 22 17:43:22 compute-0 nova_compute[183075]:         <nova:project uuid="cbff5cec4b9c4d2eb317124ca20fea9f">tempest-StatelessNetworkSecGroupIPv4Test-1348485316</nova:project>
Jan 22 17:43:22 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:43:22 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:43:22 compute-0 nova_compute[183075]:         <nova:port uuid="eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef">
Jan 22 17:43:22 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:43:22 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:43:22 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:43:22 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <system>
Jan 22 17:43:22 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:43:22 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:43:22 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:43:22 compute-0 nova_compute[183075]:       <entry name="serial">c570960f-e948-4456-9f9c-7b8afd2cf0ac</entry>
Jan 22 17:43:22 compute-0 nova_compute[183075]:       <entry name="uuid">c570960f-e948-4456-9f9c-7b8afd2cf0ac</entry>
Jan 22 17:43:22 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     </system>
Jan 22 17:43:22 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:43:22 compute-0 nova_compute[183075]:   <os>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:   </os>
Jan 22 17:43:22 compute-0 nova_compute[183075]:   <features>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:   </features>
Jan 22 17:43:22 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:43:22 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:43:22 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:43:22 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/c570960f-e948-4456-9f9c-7b8afd2cf0ac/disk"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:43:22 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:d4:a5:66"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:       <target dev="tapeff2a2a1-b5"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:43:22 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/c570960f-e948-4456-9f9c-7b8afd2cf0ac/console.log" append="off"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <video>
Jan 22 17:43:22 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     </video>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:43:22 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:43:22 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:43:22 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:43:22 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:43:22 compute-0 nova_compute[183075]: </domain>
Jan 22 17:43:22 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.776 183079 DEBUG nova.compute.manager [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Preparing to wait for external event network-vif-plugged-eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.776 183079 DEBUG oslo_concurrency.lockutils [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.777 183079 DEBUG oslo_concurrency.lockutils [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.777 183079 DEBUG oslo_concurrency.lockutils [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.778 183079 DEBUG nova.virt.libvirt.vif [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:43:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-738920808',display_name='tempest-server-test-738920808',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-738920808',id=65,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-abrrxh4i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:43:17Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=c570960f-e948-4456-9f9c-7b8afd2cf0ac,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef", "address": "fa:16:3e:d4:a5:66", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeff2a2a1-b5", "ovs_interfaceid": "eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.778 183079 DEBUG nova.network.os_vif_util [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef", "address": "fa:16:3e:d4:a5:66", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeff2a2a1-b5", "ovs_interfaceid": "eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.779 183079 DEBUG nova.network.os_vif_util [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d4:a5:66,bridge_name='br-int',has_traffic_filtering=True,id=eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeff2a2a1-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.780 183079 DEBUG os_vif [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d4:a5:66,bridge_name='br-int',has_traffic_filtering=True,id=eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeff2a2a1-b5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.781 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.782 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.782 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.787 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.787 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeff2a2a1-b5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.788 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapeff2a2a1-b5, col_values=(('external_ids', {'iface-id': 'eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d4:a5:66', 'vm-uuid': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.790 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:22 compute-0 NetworkManager[55454]: <info>  [1769103802.7913] manager: (tapeff2a2a1-b5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/297)
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.793 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.796 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:22 compute-0 nova_compute[183075]: 2026-01-22 17:43:22.797 183079 INFO os_vif [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d4:a5:66,bridge_name='br-int',has_traffic_filtering=True,id=eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeff2a2a1-b5')
Jan 22 17:43:23 compute-0 nova_compute[183075]: 2026-01-22 17:43:23.167 183079 DEBUG nova.virt.libvirt.driver [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:43:23 compute-0 nova_compute[183075]: 2026-01-22 17:43:23.167 183079 DEBUG nova.virt.libvirt.driver [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No VIF found with MAC fa:16:3e:d4:a5:66, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:43:23 compute-0 kernel: tapeff2a2a1-b5: entered promiscuous mode
Jan 22 17:43:23 compute-0 NetworkManager[55454]: <info>  [1769103803.2466] manager: (tapeff2a2a1-b5): new Tun device (/org/freedesktop/NetworkManager/Devices/298)
Jan 22 17:43:23 compute-0 nova_compute[183075]: 2026-01-22 17:43:23.245 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:23 compute-0 ovn_controller[95372]: 2026-01-22T17:43:23Z|00736|binding|INFO|Claiming lport eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef for this chassis.
Jan 22 17:43:23 compute-0 ovn_controller[95372]: 2026-01-22T17:43:23Z|00737|binding|INFO|eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef: Claiming fa:16:3e:d4:a5:66 10.100.0.13
Jan 22 17:43:23 compute-0 ovn_controller[95372]: 2026-01-22T17:43:23Z|00738|binding|INFO|Setting lport eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef ovn-installed in OVS
Jan 22 17:43:23 compute-0 nova_compute[183075]: 2026-01-22 17:43:23.259 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:23 compute-0 nova_compute[183075]: 2026-01-22 17:43:23.262 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:23 compute-0 systemd-udevd[238721]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:43:23 compute-0 systemd-machined[154382]: New machine qemu-65-instance-00000041.
Jan 22 17:43:23 compute-0 NetworkManager[55454]: <info>  [1769103803.2988] device (tapeff2a2a1-b5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:43:23 compute-0 NetworkManager[55454]: <info>  [1769103803.2996] device (tapeff2a2a1-b5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:43:23 compute-0 systemd[1]: Started Virtual Machine qemu-65-instance-00000041.
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:23.377 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:a5:66 10.100.0.13'], port_security=['fa:16:3e:d4:a5:66 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '54e9e10b-dcc5-478e-80d1-2fa424b9bf22 b9fe05a4-23df-4029-b1fa-68aa53e00156', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:23.378 104629 INFO neutron.agent.ovn.metadata.agent [-] Port eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 bound to our chassis
Jan 22 17:43:23 compute-0 ovn_controller[95372]: 2026-01-22T17:43:23Z|00739|binding|INFO|Setting lport eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef up in Southbound
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:23.379 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:23.398 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3c8f9c4c-6c81-4743-8484-60a22365f9ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:23.399 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap88ed9213-71 in ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:23.401 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap88ed9213-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:23.401 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0e7bde2d-0595-4dc8-9211-c3f86aa7866c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:23.402 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[90e3baab-2582-42bf-9baf-33901411a83d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:23.417 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[a3ab1f02-c418-4f01-b806-86a83c22b3ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:23.430 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[785d0dcc-d66c-4cd7-831c-71852ed8b3d7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:43:23 compute-0 nova_compute[183075]: 2026-01-22 17:43:23.440 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:23.458 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[85c1661a-c3c3-406d-bd23-5a37053b7c81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:23.464 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[35362db6-bb92-47e3-b3e0-00ce4aea7ce8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:43:23 compute-0 NetworkManager[55454]: <info>  [1769103803.4652] manager: (tap88ed9213-70): new Veth device (/org/freedesktop/NetworkManager/Devices/299)
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:23.501 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[730d140c-3755-4276-b06a-d7cc06fcb1f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:23.505 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[07367d3d-8848-48b7-a26c-8449333e71f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:43:23 compute-0 NetworkManager[55454]: <info>  [1769103803.5260] device (tap88ed9213-70): carrier: link connected
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:23.531 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[29ab7cd6-f9cb-40aa-95e8-70e48cc15768]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:23.547 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ff28756a-628b-46b6-ab06-b402429c2c26]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 200], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 616722, 'reachable_time': 15422, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238755, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:23.562 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e5352e1e-fa99-4464-973e-f928004eb999]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0a:fa93'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 616722, 'tstamp': 616722}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238756, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:23.585 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[09b07d35-e950-4d9c-91f5-6fa7b5636be5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 200], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 616722, 'reachable_time': 15422, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238757, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:23.614 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0fb898ae-963b-44a6-b192-34c2a025c17e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:23.692 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[fde7e578-a56a-4142-a0a1-fd01ecf7c1f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:23.695 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:23.696 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:23.697 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88ed9213-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:43:23 compute-0 NetworkManager[55454]: <info>  [1769103803.6995] manager: (tap88ed9213-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/300)
Jan 22 17:43:23 compute-0 kernel: tap88ed9213-70: entered promiscuous mode
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:23.702 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88ed9213-70, col_values=(('external_ids', {'iface-id': 'da93a32e-eed6-4124-8f08-78b0bfe2cb69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:43:23 compute-0 ovn_controller[95372]: 2026-01-22T17:43:23Z|00740|binding|INFO|Releasing lport da93a32e-eed6-4124-8f08-78b0bfe2cb69 from this chassis (sb_readonly=0)
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:23.704 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:43:23 compute-0 nova_compute[183075]: 2026-01-22 17:43:23.712 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:23.715 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e9044f3f-4db7-4834-9815-2434ae5d0faf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:23.716 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:43:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:23.717 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'env', 'PROCESS_TAG=haproxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/88ed9213-7d48-4c42-ad60-ce3fcf104f71.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:43:23 compute-0 nova_compute[183075]: 2026-01-22 17:43:23.718 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:23 compute-0 nova_compute[183075]: 2026-01-22 17:43:23.758 183079 DEBUG nova.compute.manager [req-7e6173d0-0c8d-4eda-8022-5dce26798b38 req-240a80cd-37fe-407b-a2ea-813bbe182270 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Received event network-vif-plugged-eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:43:23 compute-0 nova_compute[183075]: 2026-01-22 17:43:23.759 183079 DEBUG oslo_concurrency.lockutils [req-7e6173d0-0c8d-4eda-8022-5dce26798b38 req-240a80cd-37fe-407b-a2ea-813bbe182270 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:43:23 compute-0 nova_compute[183075]: 2026-01-22 17:43:23.760 183079 DEBUG oslo_concurrency.lockutils [req-7e6173d0-0c8d-4eda-8022-5dce26798b38 req-240a80cd-37fe-407b-a2ea-813bbe182270 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:43:23 compute-0 nova_compute[183075]: 2026-01-22 17:43:23.760 183079 DEBUG oslo_concurrency.lockutils [req-7e6173d0-0c8d-4eda-8022-5dce26798b38 req-240a80cd-37fe-407b-a2ea-813bbe182270 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:43:23 compute-0 nova_compute[183075]: 2026-01-22 17:43:23.760 183079 DEBUG nova.compute.manager [req-7e6173d0-0c8d-4eda-8022-5dce26798b38 req-240a80cd-37fe-407b-a2ea-813bbe182270 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Processing event network-vif-plugged-eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:43:23 compute-0 nova_compute[183075]: 2026-01-22 17:43:23.892 183079 DEBUG nova.network.neutron [req-8ff43f89-518e-4369-8895-6ad8af6251ba req-f544a81e-346d-4ba5-b1df-076418e36d37 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Updated VIF entry in instance network info cache for port eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:43:23 compute-0 nova_compute[183075]: 2026-01-22 17:43:23.892 183079 DEBUG nova.network.neutron [req-8ff43f89-518e-4369-8895-6ad8af6251ba req-f544a81e-346d-4ba5-b1df-076418e36d37 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Updating instance_info_cache with network_info: [{"id": "eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef", "address": "fa:16:3e:d4:a5:66", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeff2a2a1-b5", "ovs_interfaceid": "eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:43:23 compute-0 nova_compute[183075]: 2026-01-22 17:43:23.907 183079 DEBUG oslo_concurrency.lockutils [req-8ff43f89-518e-4369-8895-6ad8af6251ba req-f544a81e-346d-4ba5-b1df-076418e36d37 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-c570960f-e948-4456-9f9c-7b8afd2cf0ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:43:24 compute-0 podman[238789]: 2026-01-22 17:43:24.097669889 +0000 UTC m=+0.047080682 container create 3907e30560daef0f4176c513bc1747cc2bb9cdc4e6713cd7031660683c7398c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 22 17:43:24 compute-0 systemd[1]: Started libpod-conmon-3907e30560daef0f4176c513bc1747cc2bb9cdc4e6713cd7031660683c7398c6.scope.
Jan 22 17:43:24 compute-0 podman[238789]: 2026-01-22 17:43:24.071272232 +0000 UTC m=+0.020683055 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:43:24 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:43:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03abbd08c70c21a9f5403a59a983775d43964d1a14afe1985847a6494a290f70/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:43:24 compute-0 nova_compute[183075]: 2026-01-22 17:43:24.190 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103804.1902406, c570960f-e948-4456-9f9c-7b8afd2cf0ac => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:43:24 compute-0 nova_compute[183075]: 2026-01-22 17:43:24.192 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] VM Started (Lifecycle Event)
Jan 22 17:43:24 compute-0 nova_compute[183075]: 2026-01-22 17:43:24.194 183079 DEBUG nova.compute.manager [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:43:24 compute-0 podman[238789]: 2026-01-22 17:43:24.197171904 +0000 UTC m=+0.146582727 container init 3907e30560daef0f4176c513bc1747cc2bb9cdc4e6713cd7031660683c7398c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 22 17:43:24 compute-0 nova_compute[183075]: 2026-01-22 17:43:24.197 183079 DEBUG nova.virt.libvirt.driver [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:43:24 compute-0 nova_compute[183075]: 2026-01-22 17:43:24.200 183079 INFO nova.virt.libvirt.driver [-] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Instance spawned successfully.
Jan 22 17:43:24 compute-0 nova_compute[183075]: 2026-01-22 17:43:24.200 183079 DEBUG nova.virt.libvirt.driver [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:43:24 compute-0 podman[238789]: 2026-01-22 17:43:24.202393444 +0000 UTC m=+0.151804237 container start 3907e30560daef0f4176c513bc1747cc2bb9cdc4e6713cd7031660683c7398c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 17:43:24 compute-0 nova_compute[183075]: 2026-01-22 17:43:24.221 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:43:24 compute-0 nova_compute[183075]: 2026-01-22 17:43:24.226 183079 DEBUG nova.virt.libvirt.driver [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:43:24 compute-0 nova_compute[183075]: 2026-01-22 17:43:24.227 183079 DEBUG nova.virt.libvirt.driver [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:43:24 compute-0 nova_compute[183075]: 2026-01-22 17:43:24.228 183079 DEBUG nova.virt.libvirt.driver [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:43:24 compute-0 nova_compute[183075]: 2026-01-22 17:43:24.228 183079 DEBUG nova.virt.libvirt.driver [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:43:24 compute-0 nova_compute[183075]: 2026-01-22 17:43:24.229 183079 DEBUG nova.virt.libvirt.driver [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:43:24 compute-0 nova_compute[183075]: 2026-01-22 17:43:24.229 183079 DEBUG nova.virt.libvirt.driver [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:43:24 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238810]: [NOTICE]   (238815) : New worker (238817) forked
Jan 22 17:43:24 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238810]: [NOTICE]   (238815) : Loading success.
Jan 22 17:43:24 compute-0 nova_compute[183075]: 2026-01-22 17:43:24.238 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:43:24 compute-0 nova_compute[183075]: 2026-01-22 17:43:24.276 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:43:24 compute-0 nova_compute[183075]: 2026-01-22 17:43:24.277 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103804.1904864, c570960f-e948-4456-9f9c-7b8afd2cf0ac => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:43:24 compute-0 nova_compute[183075]: 2026-01-22 17:43:24.277 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] VM Paused (Lifecycle Event)
Jan 22 17:43:24 compute-0 nova_compute[183075]: 2026-01-22 17:43:24.298 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:43:24 compute-0 nova_compute[183075]: 2026-01-22 17:43:24.302 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103804.196365, c570960f-e948-4456-9f9c-7b8afd2cf0ac => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:43:24 compute-0 nova_compute[183075]: 2026-01-22 17:43:24.303 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] VM Resumed (Lifecycle Event)
Jan 22 17:43:24 compute-0 nova_compute[183075]: 2026-01-22 17:43:24.372 183079 INFO nova.compute.manager [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Took 7.04 seconds to spawn the instance on the hypervisor.
Jan 22 17:43:24 compute-0 nova_compute[183075]: 2026-01-22 17:43:24.374 183079 DEBUG nova.compute.manager [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:43:24 compute-0 nova_compute[183075]: 2026-01-22 17:43:24.499 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:43:24 compute-0 nova_compute[183075]: 2026-01-22 17:43:24.503 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:43:24 compute-0 nova_compute[183075]: 2026-01-22 17:43:24.528 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:43:24 compute-0 nova_compute[183075]: 2026-01-22 17:43:24.530 183079 INFO nova.compute.manager [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Took 9.59 seconds to build instance.
Jan 22 17:43:24 compute-0 nova_compute[183075]: 2026-01-22 17:43:24.683 183079 DEBUG oslo_concurrency.lockutils [None req-cd1f4a43-cf43-4f93-b799-c8a633fb64ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.602s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:43:25 compute-0 nova_compute[183075]: 2026-01-22 17:43:25.834 183079 DEBUG nova.compute.manager [req-a075182f-83e0-4d9d-9ef8-ef4880fd1132 req-7be2b631-b6ba-4778-9686-f1b31d29f4d9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Received event network-vif-plugged-eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:43:25 compute-0 nova_compute[183075]: 2026-01-22 17:43:25.835 183079 DEBUG oslo_concurrency.lockutils [req-a075182f-83e0-4d9d-9ef8-ef4880fd1132 req-7be2b631-b6ba-4778-9686-f1b31d29f4d9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:43:25 compute-0 nova_compute[183075]: 2026-01-22 17:43:25.835 183079 DEBUG oslo_concurrency.lockutils [req-a075182f-83e0-4d9d-9ef8-ef4880fd1132 req-7be2b631-b6ba-4778-9686-f1b31d29f4d9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:43:25 compute-0 nova_compute[183075]: 2026-01-22 17:43:25.836 183079 DEBUG oslo_concurrency.lockutils [req-a075182f-83e0-4d9d-9ef8-ef4880fd1132 req-7be2b631-b6ba-4778-9686-f1b31d29f4d9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:43:25 compute-0 nova_compute[183075]: 2026-01-22 17:43:25.836 183079 DEBUG nova.compute.manager [req-a075182f-83e0-4d9d-9ef8-ef4880fd1132 req-7be2b631-b6ba-4778-9686-f1b31d29f4d9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] No waiting events found dispatching network-vif-plugged-eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:43:25 compute-0 nova_compute[183075]: 2026-01-22 17:43:25.836 183079 WARNING nova.compute.manager [req-a075182f-83e0-4d9d-9ef8-ef4880fd1132 req-7be2b631-b6ba-4778-9686-f1b31d29f4d9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Received unexpected event network-vif-plugged-eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef for instance with vm_state active and task_state None.
Jan 22 17:43:27 compute-0 nova_compute[183075]: 2026-01-22 17:43:27.481 183079 INFO nova.compute.manager [None req-ee76c29c-4428-4652-b1a6-1bb4725dc022 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Get console output
Jan 22 17:43:27 compute-0 nova_compute[183075]: 2026-01-22 17:43:27.487 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:43:27 compute-0 nova_compute[183075]: 2026-01-22 17:43:27.789 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:28 compute-0 podman[238826]: 2026-01-22 17:43:28.372448478 +0000 UTC m=+0.070862389 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 22 17:43:28 compute-0 nova_compute[183075]: 2026-01-22 17:43:28.445 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:31 compute-0 nova_compute[183075]: 2026-01-22 17:43:31.169 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:43:32 compute-0 nova_compute[183075]: 2026-01-22 17:43:32.710 183079 INFO nova.compute.manager [None req-71aff81a-5b16-432e-b814-adee00be9814 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Get console output
Jan 22 17:43:32 compute-0 nova_compute[183075]: 2026-01-22 17:43:32.717 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:43:32 compute-0 nova_compute[183075]: 2026-01-22 17:43:32.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:43:32 compute-0 nova_compute[183075]: 2026-01-22 17:43:32.792 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:33 compute-0 nova_compute[183075]: 2026-01-22 17:43:33.444 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:35 compute-0 nova_compute[183075]: 2026-01-22 17:43:35.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:43:36 compute-0 ovn_controller[95372]: 2026-01-22T17:43:36Z|00132|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d4:a5:66 10.100.0.13
Jan 22 17:43:36 compute-0 ovn_controller[95372]: 2026-01-22T17:43:36Z|00133|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d4:a5:66 10.100.0.13
Jan 22 17:43:37 compute-0 nova_compute[183075]: 2026-01-22 17:43:37.793 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:37 compute-0 nova_compute[183075]: 2026-01-22 17:43:37.845 183079 INFO nova.compute.manager [None req-3138df82-0d5d-4589-aa1d-23c7f2d483dd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Get console output
Jan 22 17:43:37 compute-0 nova_compute[183075]: 2026-01-22 17:43:37.850 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:43:38 compute-0 nova_compute[183075]: 2026-01-22 17:43:38.446 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:38 compute-0 nova_compute[183075]: 2026-01-22 17:43:38.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:43:39 compute-0 podman[238868]: 2026-01-22 17:43:39.344583991 +0000 UTC m=+0.054711597 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 17:43:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:41.961 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:43:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:41.962 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:43:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:41.962 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:43:42 compute-0 nova_compute[183075]: 2026-01-22 17:43:42.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:43:42 compute-0 nova_compute[183075]: 2026-01-22 17:43:42.795 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:42 compute-0 nova_compute[183075]: 2026-01-22 17:43:42.959 183079 INFO nova.compute.manager [None req-edb5280b-0523-4f06-8d84-7185f8614ddd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Get console output
Jan 22 17:43:42 compute-0 nova_compute[183075]: 2026-01-22 17:43:42.965 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:43:43 compute-0 podman[238893]: 2026-01-22 17:43:43.345992497 +0000 UTC m=+0.051656224 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 17:43:43 compute-0 nova_compute[183075]: 2026-01-22 17:43:43.448 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:43.595 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:43:43 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:43.597 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:43:43 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:43:43 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:43:43 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:43:43 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:43:43 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:43:43 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:43:43 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.545 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.545 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.9483969
Jan 22 17:43:44 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238817]: 10.100.0.13:40966 [22/Jan/2026:17:43:43.594] listener listener/metadata 0/0/0/950/950 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.553 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.554 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.576 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.576 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0225811
Jan 22 17:43:44 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238817]: 10.100.0.13:40970 [22/Jan/2026:17:43:44.553] listener listener/metadata 0/0/0/23/23 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.580 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.581 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.596 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.596 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0151098
Jan 22 17:43:44 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238817]: 10.100.0.13:40972 [22/Jan/2026:17:43:44.580] listener listener/metadata 0/0/0/16/16 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.601 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.602 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.616 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.617 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0145748
Jan 22 17:43:44 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238817]: 10.100.0.13:40988 [22/Jan/2026:17:43:44.601] listener listener/metadata 0/0/0/15/15 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.622 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.622 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.634 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.635 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0126865
Jan 22 17:43:44 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238817]: 10.100.0.13:40998 [22/Jan/2026:17:43:44.621] listener listener/metadata 0/0/0/13/13 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.641 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.641 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.658 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.659 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0176754
Jan 22 17:43:44 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238817]: 10.100.0.13:41004 [22/Jan/2026:17:43:44.640] listener listener/metadata 0/0/0/18/18 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.664 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.665 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.676 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.676 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0119517
Jan 22 17:43:44 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238817]: 10.100.0.13:41018 [22/Jan/2026:17:43:44.664] listener listener/metadata 0/0/0/12/12 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.681 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.682 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.694 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.694 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0126324
Jan 22 17:43:44 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238817]: 10.100.0.13:41032 [22/Jan/2026:17:43:44.681] listener listener/metadata 0/0/0/13/13 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.698 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.699 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.711 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.712 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 165 time: 0.0131383
Jan 22 17:43:44 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238817]: 10.100.0.13:41044 [22/Jan/2026:17:43:44.698] listener listener/metadata 0/0/0/13/13 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.716 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.716 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.730 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.730 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 165 time: 0.0138102
Jan 22 17:43:44 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238817]: 10.100.0.13:41046 [22/Jan/2026:17:43:44.715] listener listener/metadata 0/0/0/14/14 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.734 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.734 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.747 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0131903
Jan 22 17:43:44 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238817]: 10.100.0.13:41056 [22/Jan/2026:17:43:44.733] listener listener/metadata 0/0/0/14/14 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.756 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.757 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.772 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.772 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0154717
Jan 22 17:43:44 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238817]: 10.100.0.13:41066 [22/Jan/2026:17:43:44.756] listener listener/metadata 0/0/0/16/16 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.777 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.777 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:43:44 compute-0 nova_compute[183075]: 2026-01-22 17:43:44.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:43:44 compute-0 nova_compute[183075]: 2026-01-22 17:43:44.787 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.789 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:43:44 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238817]: 10.100.0.13:41072 [22/Jan/2026:17:43:44.776] listener listener/metadata 0/0/0/13/13 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.790 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0126374
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.794 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.794 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:43:44 compute-0 nova_compute[183075]: 2026-01-22 17:43:44.809 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 17:43:44 compute-0 nova_compute[183075]: 2026-01-22 17:43:44.809 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.810 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.810 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0163648
Jan 22 17:43:44 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238817]: 10.100.0.13:41086 [22/Jan/2026:17:43:44.793] listener listener/metadata 0/0/0/17/17 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.815 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.816 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.830 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.830 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 165 time: 0.0139637
Jan 22 17:43:44 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238817]: 10.100.0.13:41094 [22/Jan/2026:17:43:44.815] listener listener/metadata 0/0/0/15/15 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:43:44 compute-0 nova_compute[183075]: 2026-01-22 17:43:44.832 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:43:44 compute-0 nova_compute[183075]: 2026-01-22 17:43:44.833 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:43:44 compute-0 nova_compute[183075]: 2026-01-22 17:43:44.833 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:43:44 compute-0 nova_compute[183075]: 2026-01-22 17:43:44.833 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.834 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.835 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.852 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:43:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:44.853 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0178735
Jan 22 17:43:44 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238817]: 10.100.0.13:41110 [22/Jan/2026:17:43:44.834] listener listener/metadata 0/0/0/19/19 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:43:44 compute-0 nova_compute[183075]: 2026-01-22 17:43:44.902 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c570960f-e948-4456-9f9c-7b8afd2cf0ac/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:43:45 compute-0 nova_compute[183075]: 2026-01-22 17:43:44.999 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c570960f-e948-4456-9f9c-7b8afd2cf0ac/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:43:45 compute-0 nova_compute[183075]: 2026-01-22 17:43:45.000 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c570960f-e948-4456-9f9c-7b8afd2cf0ac/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:43:45 compute-0 nova_compute[183075]: 2026-01-22 17:43:45.061 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c570960f-e948-4456-9f9c-7b8afd2cf0ac/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:43:45 compute-0 nova_compute[183075]: 2026-01-22 17:43:45.236 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:43:45 compute-0 nova_compute[183075]: 2026-01-22 17:43:45.237 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5529MB free_disk=73.33184814453125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:43:45 compute-0 nova_compute[183075]: 2026-01-22 17:43:45.238 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:43:45 compute-0 nova_compute[183075]: 2026-01-22 17:43:45.238 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:43:45 compute-0 nova_compute[183075]: 2026-01-22 17:43:45.313 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance c570960f-e948-4456-9f9c-7b8afd2cf0ac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:43:45 compute-0 nova_compute[183075]: 2026-01-22 17:43:45.314 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:43:45 compute-0 nova_compute[183075]: 2026-01-22 17:43:45.315 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:43:45 compute-0 nova_compute[183075]: 2026-01-22 17:43:45.356 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:43:45 compute-0 nova_compute[183075]: 2026-01-22 17:43:45.374 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:43:45 compute-0 nova_compute[183075]: 2026-01-22 17:43:45.403 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:43:45 compute-0 nova_compute[183075]: 2026-01-22 17:43:45.404 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:43:47 compute-0 nova_compute[183075]: 2026-01-22 17:43:47.798 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:48 compute-0 nova_compute[183075]: 2026-01-22 17:43:48.095 183079 INFO nova.compute.manager [None req-0919aa89-f33d-450c-b016-4ab96a85a4a6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Get console output
Jan 22 17:43:48 compute-0 nova_compute[183075]: 2026-01-22 17:43:48.100 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:43:48 compute-0 nova_compute[183075]: 2026-01-22 17:43:48.383 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:43:48 compute-0 nova_compute[183075]: 2026-01-22 17:43:48.383 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:43:48 compute-0 nova_compute[183075]: 2026-01-22 17:43:48.451 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:48 compute-0 nova_compute[183075]: 2026-01-22 17:43:48.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:43:52 compute-0 nova_compute[183075]: 2026-01-22 17:43:52.852 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:53 compute-0 ovn_controller[95372]: 2026-01-22T17:43:53Z|00741|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Jan 22 17:43:53 compute-0 nova_compute[183075]: 2026-01-22 17:43:53.353 183079 DEBUG oslo_concurrency.lockutils [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "8881a120-c63d-43dd-8135-7596ef1f460c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:43:53 compute-0 nova_compute[183075]: 2026-01-22 17:43:53.354 183079 DEBUG oslo_concurrency.lockutils [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "8881a120-c63d-43dd-8135-7596ef1f460c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:43:53 compute-0 nova_compute[183075]: 2026-01-22 17:43:53.370 183079 DEBUG nova.compute.manager [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:43:53 compute-0 podman[238926]: 2026-01-22 17:43:53.387251033 +0000 UTC m=+0.071265690 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vcs-type=git, architecture=x86_64)
Jan 22 17:43:53 compute-0 podman[238925]: 2026-01-22 17:43:53.40208052 +0000 UTC m=+0.088211864 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 17:43:53 compute-0 podman[238924]: 2026-01-22 17:43:53.411682658 +0000 UTC m=+0.104322996 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 17:43:53 compute-0 nova_compute[183075]: 2026-01-22 17:43:53.453 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:53 compute-0 nova_compute[183075]: 2026-01-22 17:43:53.458 183079 DEBUG oslo_concurrency.lockutils [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:43:53 compute-0 nova_compute[183075]: 2026-01-22 17:43:53.458 183079 DEBUG oslo_concurrency.lockutils [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:43:53 compute-0 nova_compute[183075]: 2026-01-22 17:43:53.466 183079 DEBUG nova.virt.hardware [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:43:53 compute-0 nova_compute[183075]: 2026-01-22 17:43:53.466 183079 INFO nova.compute.claims [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:43:53 compute-0 nova_compute[183075]: 2026-01-22 17:43:53.607 183079 DEBUG nova.compute.provider_tree [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:43:53 compute-0 nova_compute[183075]: 2026-01-22 17:43:53.624 183079 DEBUG nova.scheduler.client.report [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:43:53 compute-0 nova_compute[183075]: 2026-01-22 17:43:53.644 183079 DEBUG oslo_concurrency.lockutils [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.186s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:43:53 compute-0 nova_compute[183075]: 2026-01-22 17:43:53.645 183079 DEBUG nova.compute.manager [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:43:53 compute-0 nova_compute[183075]: 2026-01-22 17:43:53.688 183079 DEBUG nova.compute.manager [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:43:53 compute-0 nova_compute[183075]: 2026-01-22 17:43:53.689 183079 DEBUG nova.network.neutron [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:43:53 compute-0 nova_compute[183075]: 2026-01-22 17:43:53.708 183079 INFO nova.virt.libvirt.driver [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:43:53 compute-0 nova_compute[183075]: 2026-01-22 17:43:53.724 183079 DEBUG nova.compute.manager [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:43:53 compute-0 nova_compute[183075]: 2026-01-22 17:43:53.800 183079 DEBUG nova.compute.manager [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:43:53 compute-0 nova_compute[183075]: 2026-01-22 17:43:53.801 183079 DEBUG nova.virt.libvirt.driver [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:43:53 compute-0 nova_compute[183075]: 2026-01-22 17:43:53.802 183079 INFO nova.virt.libvirt.driver [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Creating image(s)
Jan 22 17:43:53 compute-0 nova_compute[183075]: 2026-01-22 17:43:53.802 183079 DEBUG oslo_concurrency.lockutils [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "/var/lib/nova/instances/8881a120-c63d-43dd-8135-7596ef1f460c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:43:53 compute-0 nova_compute[183075]: 2026-01-22 17:43:53.802 183079 DEBUG oslo_concurrency.lockutils [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/8881a120-c63d-43dd-8135-7596ef1f460c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:43:53 compute-0 nova_compute[183075]: 2026-01-22 17:43:53.803 183079 DEBUG oslo_concurrency.lockutils [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/8881a120-c63d-43dd-8135-7596ef1f460c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:43:53 compute-0 nova_compute[183075]: 2026-01-22 17:43:53.816 183079 DEBUG oslo_concurrency.processutils [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:43:53 compute-0 nova_compute[183075]: 2026-01-22 17:43:53.849 183079 DEBUG nova.policy [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:43:53 compute-0 nova_compute[183075]: 2026-01-22 17:43:53.894 183079 DEBUG oslo_concurrency.processutils [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:43:53 compute-0 nova_compute[183075]: 2026-01-22 17:43:53.895 183079 DEBUG oslo_concurrency.lockutils [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:43:53 compute-0 nova_compute[183075]: 2026-01-22 17:43:53.896 183079 DEBUG oslo_concurrency.lockutils [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:43:53 compute-0 nova_compute[183075]: 2026-01-22 17:43:53.912 183079 DEBUG oslo_concurrency.processutils [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:43:53 compute-0 nova_compute[183075]: 2026-01-22 17:43:53.967 183079 DEBUG oslo_concurrency.processutils [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:43:53 compute-0 nova_compute[183075]: 2026-01-22 17:43:53.968 183079 DEBUG oslo_concurrency.processutils [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/8881a120-c63d-43dd-8135-7596ef1f460c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:43:54 compute-0 nova_compute[183075]: 2026-01-22 17:43:54.009 183079 DEBUG oslo_concurrency.processutils [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/8881a120-c63d-43dd-8135-7596ef1f460c/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:43:54 compute-0 nova_compute[183075]: 2026-01-22 17:43:54.010 183079 DEBUG oslo_concurrency.lockutils [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:43:54 compute-0 nova_compute[183075]: 2026-01-22 17:43:54.011 183079 DEBUG oslo_concurrency.processutils [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:43:54 compute-0 nova_compute[183075]: 2026-01-22 17:43:54.061 183079 DEBUG oslo_concurrency.processutils [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:43:54 compute-0 nova_compute[183075]: 2026-01-22 17:43:54.062 183079 DEBUG nova.virt.disk.api [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Checking if we can resize image /var/lib/nova/instances/8881a120-c63d-43dd-8135-7596ef1f460c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:43:54 compute-0 nova_compute[183075]: 2026-01-22 17:43:54.062 183079 DEBUG oslo_concurrency.processutils [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8881a120-c63d-43dd-8135-7596ef1f460c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:43:54 compute-0 nova_compute[183075]: 2026-01-22 17:43:54.115 183079 DEBUG oslo_concurrency.processutils [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8881a120-c63d-43dd-8135-7596ef1f460c/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:43:54 compute-0 nova_compute[183075]: 2026-01-22 17:43:54.117 183079 DEBUG nova.virt.disk.api [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Cannot resize image /var/lib/nova/instances/8881a120-c63d-43dd-8135-7596ef1f460c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:43:54 compute-0 nova_compute[183075]: 2026-01-22 17:43:54.117 183079 DEBUG nova.objects.instance [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'migration_context' on Instance uuid 8881a120-c63d-43dd-8135-7596ef1f460c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:43:54 compute-0 nova_compute[183075]: 2026-01-22 17:43:54.133 183079 DEBUG nova.virt.libvirt.driver [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:43:54 compute-0 nova_compute[183075]: 2026-01-22 17:43:54.134 183079 DEBUG nova.virt.libvirt.driver [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Ensure instance console log exists: /var/lib/nova/instances/8881a120-c63d-43dd-8135-7596ef1f460c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:43:54 compute-0 nova_compute[183075]: 2026-01-22 17:43:54.134 183079 DEBUG oslo_concurrency.lockutils [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:43:54 compute-0 nova_compute[183075]: 2026-01-22 17:43:54.134 183079 DEBUG oslo_concurrency.lockutils [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:43:54 compute-0 nova_compute[183075]: 2026-01-22 17:43:54.135 183079 DEBUG oslo_concurrency.lockutils [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:43:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:54.778 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=49, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=48) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:43:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:54.780 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:43:54 compute-0 nova_compute[183075]: 2026-01-22 17:43:54.779 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:54 compute-0 nova_compute[183075]: 2026-01-22 17:43:54.870 183079 DEBUG nova.network.neutron [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Successfully created port: 52cabc04-1d39-486d-bd16-256d2140713c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.462 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac', 'name': 'tempest-server-test-738920808', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000041', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'hostId': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.462 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.475 12 DEBUG ceilometer.compute.pollsters [-] c570960f-e948-4456-9f9c-7b8afd2cf0ac/disk.device.write.bytes volume: 72998912 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0bd445f6-2041-4b1b-983a-427433c073d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72998912, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac-vda', 'timestamp': '2026-01-22T17:43:55.463018', 'resource_metadata': {'display_name': 'tempest-server-test-738920808', 'name': 'instance-00000041', 'instance_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ec5909c6-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6199.223444869, 'message_signature': 'dc1d5ec126ddb8d6b1e27e02c2ecb49314de8adb4d004f195106da9bc8aaa7d8'}]}, 'timestamp': '2026-01-22 17:43:55.475601', '_unique_id': 'a48af475aac3443aba0c184dd01ab9bd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.476 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.477 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.477 12 DEBUG ceilometer.compute.pollsters [-] c570960f-e948-4456-9f9c-7b8afd2cf0ac/disk.device.read.bytes volume: 31283712 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'be78c4d7-aa14-4b4d-920f-d9914c258f91', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31283712, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac-vda', 'timestamp': '2026-01-22T17:43:55.477913', 'resource_metadata': {'display_name': 'tempest-server-test-738920808', 'name': 'instance-00000041', 'instance_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ec59742e-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6199.223444869, 'message_signature': '1c6953ea806e33bec2a32f65c287000a1e5a5cd4ab4c3895751e65daf738f6e1'}]}, 'timestamp': '2026-01-22 17:43:55.478255', '_unique_id': '217bab4baf5c4289883ed2b82cf2da0d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.478 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.479 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.479 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.479 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-738920808>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-738920808>]
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.480 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.482 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for c570960f-e948-4456-9f9c-7b8afd2cf0ac / tapeff2a2a1-b5 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.482 12 DEBUG ceilometer.compute.pollsters [-] c570960f-e948-4456-9f9c-7b8afd2cf0ac/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d5670b2-68c9-479e-8188-d2ad576afcc6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000041-c570960f-e948-4456-9f9c-7b8afd2cf0ac-tapeff2a2a1-b5', 'timestamp': '2026-01-22T17:43:55.480064', 'resource_metadata': {'display_name': 'tempest-server-test-738920808', 'name': 'tapeff2a2a1-b5', 'instance_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:a5:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeff2a2a1-b5'}, 'message_id': 'ec5a3148-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6199.240508726, 'message_signature': 'd261fb64cca4f477b47128cfd8a0f79d0a0cdd61aa04f5c1182b0557126c0918'}]}, 'timestamp': '2026-01-22 17:43:55.483134', '_unique_id': 'a4c5e06e8d704a31bfd9da06f7e2cebd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.484 12 DEBUG ceilometer.compute.pollsters [-] c570960f-e948-4456-9f9c-7b8afd2cf0ac/disk.device.read.latency volume: 216585279 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1cc6a93a-d717-44ab-871a-27f96946dd71', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 216585279, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac-vda', 'timestamp': '2026-01-22T17:43:55.484921', 'resource_metadata': {'display_name': 'tempest-server-test-738920808', 'name': 'instance-00000041', 'instance_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ec5a82c4-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6199.223444869, 'message_signature': '1b5e66722fa0e50f2a582c19c1ea81f7375dda266ca7615547b7784fc2c9c8a2'}]}, 'timestamp': '2026-01-22 17:43:55.485162', '_unique_id': '1b57c35c263745e8a42a993791ba93e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.485 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.486 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.486 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.486 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-738920808>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-738920808>]
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.486 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.492 12 DEBUG ceilometer.compute.pollsters [-] c570960f-e948-4456-9f9c-7b8afd2cf0ac/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '013acfde-9f24-4572-a15a-67790b2e3cc1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac-vda', 'timestamp': '2026-01-22T17:43:55.486803', 'resource_metadata': {'display_name': 'tempest-server-test-738920808', 'name': 'instance-00000041', 'instance_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ec5bc274-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6199.247255017, 'message_signature': 'a1829c2fb99526ad2880fb048b3bf18397071d21651f4492792db2a296af5d05'}]}, 'timestamp': '2026-01-22 17:43:55.493429', '_unique_id': '204d7459297f41c0af79eaa10f49f266'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.494 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.495 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.495 12 DEBUG ceilometer.compute.pollsters [-] c570960f-e948-4456-9f9c-7b8afd2cf0ac/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'efa790cf-2566-4d67-a39f-4a026e61ba63', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000041-c570960f-e948-4456-9f9c-7b8afd2cf0ac-tapeff2a2a1-b5', 'timestamp': '2026-01-22T17:43:55.495331', 'resource_metadata': {'display_name': 'tempest-server-test-738920808', 'name': 'tapeff2a2a1-b5', 'instance_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:a5:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeff2a2a1-b5'}, 'message_id': 'ec5c1af8-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6199.240508726, 'message_signature': '2dd776a3334c596a35708dfd828f04f14fae5a020777a7f01102895885fbc525'}]}, 'timestamp': '2026-01-22 17:43:55.495682', '_unique_id': 'a0f0405d370a40fab1a6eb783d23f6c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.496 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.497 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.497 12 DEBUG ceilometer.compute.pollsters [-] c570960f-e948-4456-9f9c-7b8afd2cf0ac/network.incoming.bytes volume: 7219 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aeb3e551-ed81-4387-bd6e-0d032ef52b2e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7219, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000041-c570960f-e948-4456-9f9c-7b8afd2cf0ac-tapeff2a2a1-b5', 'timestamp': '2026-01-22T17:43:55.497247', 'resource_metadata': {'display_name': 'tempest-server-test-738920808', 'name': 'tapeff2a2a1-b5', 'instance_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:a5:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeff2a2a1-b5'}, 'message_id': 'ec5c6508-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6199.240508726, 'message_signature': 'ada771a2dc4db62790150ae0c57cad3ae0b100ea7c6e7eef3883538587d73dfa'}]}, 'timestamp': '2026-01-22 17:43:55.497543', '_unique_id': 'd587a7b5c13d4c078038f54b498302b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.498 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 DEBUG ceilometer.compute.pollsters [-] c570960f-e948-4456-9f9c-7b8afd2cf0ac/network.outgoing.bytes volume: 10361 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '861a57b3-045d-4dd1-aabe-b720241e9adf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10361, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000041-c570960f-e948-4456-9f9c-7b8afd2cf0ac-tapeff2a2a1-b5', 'timestamp': '2026-01-22T17:43:55.499085', 'resource_metadata': {'display_name': 'tempest-server-test-738920808', 'name': 'tapeff2a2a1-b5', 'instance_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:a5:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeff2a2a1-b5'}, 'message_id': 'ec5caca2-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6199.240508726, 'message_signature': '17d3027a4cf4e8e6226a8e31ec302b0186acc9eb669aff3adc384d5fa0881d00'}]}, 'timestamp': '2026-01-22 17:43:55.499373', '_unique_id': '31c65e0be4d64e5f8e632bd22a8d5b87'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.499 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.500 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.500 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.500 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-738920808>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-738920808>]
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.501 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.501 12 DEBUG ceilometer.compute.pollsters [-] c570960f-e948-4456-9f9c-7b8afd2cf0ac/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad949b52-205a-4c46-9122-01cbccaf059c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000041-c570960f-e948-4456-9f9c-7b8afd2cf0ac-tapeff2a2a1-b5', 'timestamp': '2026-01-22T17:43:55.501168', 'resource_metadata': {'display_name': 'tempest-server-test-738920808', 'name': 'tapeff2a2a1-b5', 'instance_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:a5:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeff2a2a1-b5'}, 'message_id': 'ec5cfe5a-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6199.240508726, 'message_signature': '65539335ef7cd1448c1a6afae2828b97834d405a4095f49276a019e2d49b9584'}]}, 'timestamp': '2026-01-22 17:43:55.501464', '_unique_id': 'abb3621e54544128b67df45ad279cb4c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.502 12 DEBUG ceilometer.compute.pollsters [-] c570960f-e948-4456-9f9c-7b8afd2cf0ac/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e04b4f5b-7adf-4c90-9924-7100f983feeb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000041-c570960f-e948-4456-9f9c-7b8afd2cf0ac-tapeff2a2a1-b5', 'timestamp': '2026-01-22T17:43:55.502860', 'resource_metadata': {'display_name': 'tempest-server-test-738920808', 'name': 'tapeff2a2a1-b5', 'instance_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:a5:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeff2a2a1-b5'}, 'message_id': 'ec5d404a-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6199.240508726, 'message_signature': 'bac5dbde4006ff2c100b6972fbc4b41ff126320415a5583663f6196aca46f4dc'}]}, 'timestamp': '2026-01-22 17:43:55.503155', '_unique_id': '5cc4223a8a444be2b191ce39ab4a2c30'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.503 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.504 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.504 12 DEBUG ceilometer.compute.pollsters [-] c570960f-e948-4456-9f9c-7b8afd2cf0ac/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e8f4eb1-1fc2-4896-9c73-be3b362677c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000041-c570960f-e948-4456-9f9c-7b8afd2cf0ac-tapeff2a2a1-b5', 'timestamp': '2026-01-22T17:43:55.504481', 'resource_metadata': {'display_name': 'tempest-server-test-738920808', 'name': 'tapeff2a2a1-b5', 'instance_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:a5:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeff2a2a1-b5'}, 'message_id': 'ec5d803c-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6199.240508726, 'message_signature': 'ca7b913ca9695157fef9075e93fb327e27d857636a3244d6bd2a0dbb947165be'}]}, 'timestamp': '2026-01-22 17:43:55.504797', '_unique_id': '290a4a53ddbf4b7f9e42e245c3698b20'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.505 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.506 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.506 12 DEBUG ceilometer.compute.pollsters [-] c570960f-e948-4456-9f9c-7b8afd2cf0ac/disk.device.write.latency volume: 2054950470 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f62aa316-c859-4a91-ba27-170289c7196c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2054950470, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac-vda', 'timestamp': '2026-01-22T17:43:55.506163', 'resource_metadata': {'display_name': 'tempest-server-test-738920808', 'name': 'instance-00000041', 'instance_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ec5dc132-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6199.223444869, 'message_signature': 'e6a193fb862561f2c6e09521f0017b8edf9bec3e95ffaee999a99aa0d5db392a'}]}, 'timestamp': '2026-01-22 17:43:55.506443', '_unique_id': 'baa2de05dc834dcfa6f0c6d8c2838a22'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.507 12 DEBUG ceilometer.compute.pollsters [-] c570960f-e948-4456-9f9c-7b8afd2cf0ac/disk.device.write.requests volume: 320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98c49b09-b040-43b3-aad6-dc14c3a43c9e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 320, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac-vda', 'timestamp': '2026-01-22T17:43:55.507792', 'resource_metadata': {'display_name': 'tempest-server-test-738920808', 'name': 'instance-00000041', 'instance_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ec5e005c-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6199.223444869, 'message_signature': '95e6ab7fa80254bce867089e71e5fb4cb2b6a1e634932bf79217f2763eba9c75'}]}, 'timestamp': '2026-01-22 17:43:55.508058', '_unique_id': '0b72e33b5afe4eaaa6c7d27812f4fc5a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.508 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.509 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.509 12 DEBUG ceilometer.compute.pollsters [-] c570960f-e948-4456-9f9c-7b8afd2cf0ac/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0cc057d9-a0df-4e15-9028-fa6ce03470de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000041-c570960f-e948-4456-9f9c-7b8afd2cf0ac-tapeff2a2a1-b5', 'timestamp': '2026-01-22T17:43:55.509521', 'resource_metadata': {'display_name': 'tempest-server-test-738920808', 'name': 'tapeff2a2a1-b5', 'instance_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:a5:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeff2a2a1-b5'}, 'message_id': 'ec5e4508-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6199.240508726, 'message_signature': 'b5ae6c9a2c4075f1707e09a8ed770852135066a1408df4646096c952e6fb1f13'}]}, 'timestamp': '2026-01-22 17:43:55.509822', '_unique_id': '75838e4a20824cf294074784c66ac4a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.510 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 DEBUG ceilometer.compute.pollsters [-] c570960f-e948-4456-9f9c-7b8afd2cf0ac/disk.device.read.requests volume: 1156 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9e3d4680-2255-452c-8206-c270ac44dfe7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1156, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac-vda', 'timestamp': '2026-01-22T17:43:55.511141', 'resource_metadata': {'display_name': 'tempest-server-test-738920808', 'name': 'instance-00000041', 'instance_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ec5e836a-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6199.223444869, 'message_signature': '110a08fce9a29cfae7b0b7bc415c07d6aa4be0decf82bf797b9e06f8f48aa9d5'}]}, 'timestamp': '2026-01-22 17:43:55.511416', '_unique_id': 'd1516e5b83734c5da01a2a958e16097c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.511 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.512 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.512 12 DEBUG ceilometer.compute.pollsters [-] c570960f-e948-4456-9f9c-7b8afd2cf0ac/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '255c0e50-46fa-4622-8149-1ff901ca7a6a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000041-c570960f-e948-4456-9f9c-7b8afd2cf0ac-tapeff2a2a1-b5', 'timestamp': '2026-01-22T17:43:55.512766', 'resource_metadata': {'display_name': 'tempest-server-test-738920808', 'name': 'tapeff2a2a1-b5', 'instance_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:a5:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeff2a2a1-b5'}, 'message_id': 'ec5ec302-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6199.240508726, 'message_signature': '8cc6c3a7bf20f327275f69698d2d8d9bdffc7af7f4217539ccbb1657d5308cee'}]}, 'timestamp': '2026-01-22 17:43:55.513052', '_unique_id': '7caa325560574680815bdffeb840b880'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.513 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.514 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.528 12 DEBUG ceilometer.compute.pollsters [-] c570960f-e948-4456-9f9c-7b8afd2cf0ac/memory.usage volume: 42.7890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9188501d-0dae-4189-92c5-f0aa7fcad1b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.7890625, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac', 'timestamp': '2026-01-22T17:43:55.514409', 'resource_metadata': {'display_name': 'tempest-server-test-738920808', 'name': 'instance-00000041', 'instance_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'ec613448-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6199.288864681, 'message_signature': '4469033c6c4abd1d8b09a7024b73c1d8d41d546e77ea4ed4751ea1a84948b681'}]}, 'timestamp': '2026-01-22 17:43:55.529130', '_unique_id': '9b7aa9f42f944c76acfc343c84c673b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.530 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 DEBUG ceilometer.compute.pollsters [-] c570960f-e948-4456-9f9c-7b8afd2cf0ac/cpu volume: 11410000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e9d6989c-efe8-4b1f-93a7-b7e4557ea69b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11410000000, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac', 'timestamp': '2026-01-22T17:43:55.531061', 'resource_metadata': {'display_name': 'tempest-server-test-738920808', 'name': 'instance-00000041', 'instance_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'ec618de4-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6199.288864681, 'message_signature': '05871e434d83ee947b8ea4e0971781819c7eea917fd33824b2b2aeddd5ef3a81'}]}, 'timestamp': '2026-01-22 17:43:55.531342', '_unique_id': '37f661b7251842de9fb912bd9a3e5d69'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.531 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.532 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.532 12 DEBUG ceilometer.compute.pollsters [-] c570960f-e948-4456-9f9c-7b8afd2cf0ac/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f2b205a-39c0-4491-ad9c-2fb0170e9ab2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac-vda', 'timestamp': '2026-01-22T17:43:55.532680', 'resource_metadata': {'display_name': 'tempest-server-test-738920808', 'name': 'instance-00000041', 'instance_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ec61cd18-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6199.247255017, 'message_signature': 'cf5927ab4fb900b48986f27047bbe02fc87d0626b8826fb8be3ac7c232fc3035'}]}, 'timestamp': '2026-01-22 17:43:55.532959', '_unique_id': '5039b8b3c011406c838a6956b284f1bd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.533 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.534 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.534 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.534 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-738920808>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-738920808>]
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.534 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.534 12 DEBUG ceilometer.compute.pollsters [-] c570960f-e948-4456-9f9c-7b8afd2cf0ac/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7bc40173-0708-48ce-b5cd-d72875315559', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000041-c570960f-e948-4456-9f9c-7b8afd2cf0ac-tapeff2a2a1-b5', 'timestamp': '2026-01-22T17:43:55.534914', 'resource_metadata': {'display_name': 'tempest-server-test-738920808', 'name': 'tapeff2a2a1-b5', 'instance_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:a5:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeff2a2a1-b5'}, 'message_id': 'ec6224f2-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6199.240508726, 'message_signature': '08474f2f764770e1731ad27cef25e0a4e91d316de8414c85ce80cd50b36af7d4'}]}, 'timestamp': '2026-01-22 17:43:55.535232', '_unique_id': '96e3c20f4fa14109af4cd287fe3b65b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.535 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.536 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.536 12 DEBUG ceilometer.compute.pollsters [-] c570960f-e948-4456-9f9c-7b8afd2cf0ac/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '397d0cfe-6de2-4420-bb1b-8503941ee149', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac-vda', 'timestamp': '2026-01-22T17:43:55.536613', 'resource_metadata': {'display_name': 'tempest-server-test-738920808', 'name': 'instance-00000041', 'instance_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ec62676e-f7b9-11f0-9e69-fa163eaea1db', 'monotonic_time': 6199.247255017, 'message_signature': '54ef61fadb8f2eb61d79ee7b3a707adb1803147fb36f52d992013370762c7cd5'}]}, 'timestamp': '2026-01-22 17:43:55.536910', '_unique_id': '2612673fd91c4919a40636dadeaa1f09'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:43:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:43:55.537 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:43:55 compute-0 nova_compute[183075]: 2026-01-22 17:43:55.971 183079 DEBUG nova.network.neutron [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Successfully updated port: 52cabc04-1d39-486d-bd16-256d2140713c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:43:55 compute-0 nova_compute[183075]: 2026-01-22 17:43:55.988 183079 DEBUG oslo_concurrency.lockutils [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "refresh_cache-8881a120-c63d-43dd-8135-7596ef1f460c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:43:55 compute-0 nova_compute[183075]: 2026-01-22 17:43:55.989 183079 DEBUG oslo_concurrency.lockutils [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquired lock "refresh_cache-8881a120-c63d-43dd-8135-7596ef1f460c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:43:55 compute-0 nova_compute[183075]: 2026-01-22 17:43:55.989 183079 DEBUG nova.network.neutron [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:43:56 compute-0 nova_compute[183075]: 2026-01-22 17:43:56.051 183079 DEBUG nova.compute.manager [req-e0bfc71b-3bd6-407b-98d4-16a0ab4b4309 req-5dc349f3-b88c-40f3-8f0d-30b4fa49beef a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Received event network-changed-52cabc04-1d39-486d-bd16-256d2140713c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:43:56 compute-0 nova_compute[183075]: 2026-01-22 17:43:56.051 183079 DEBUG nova.compute.manager [req-e0bfc71b-3bd6-407b-98d4-16a0ab4b4309 req-5dc349f3-b88c-40f3-8f0d-30b4fa49beef a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Refreshing instance network info cache due to event network-changed-52cabc04-1d39-486d-bd16-256d2140713c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:43:56 compute-0 nova_compute[183075]: 2026-01-22 17:43:56.052 183079 DEBUG oslo_concurrency.lockutils [req-e0bfc71b-3bd6-407b-98d4-16a0ab4b4309 req-5dc349f3-b88c-40f3-8f0d-30b4fa49beef a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-8881a120-c63d-43dd-8135-7596ef1f460c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:43:56 compute-0 nova_compute[183075]: 2026-01-22 17:43:56.325 183079 DEBUG nova.network.neutron [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:43:56 compute-0 nova_compute[183075]: 2026-01-22 17:43:56.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.291 183079 DEBUG nova.network.neutron [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Updating instance_info_cache with network_info: [{"id": "52cabc04-1d39-486d-bd16-256d2140713c", "address": "fa:16:3e:f3:e4:dc", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52cabc04-1d", "ovs_interfaceid": "52cabc04-1d39-486d-bd16-256d2140713c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.697 183079 DEBUG oslo_concurrency.lockutils [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Releasing lock "refresh_cache-8881a120-c63d-43dd-8135-7596ef1f460c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.697 183079 DEBUG nova.compute.manager [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Instance network_info: |[{"id": "52cabc04-1d39-486d-bd16-256d2140713c", "address": "fa:16:3e:f3:e4:dc", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52cabc04-1d", "ovs_interfaceid": "52cabc04-1d39-486d-bd16-256d2140713c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.698 183079 DEBUG oslo_concurrency.lockutils [req-e0bfc71b-3bd6-407b-98d4-16a0ab4b4309 req-5dc349f3-b88c-40f3-8f0d-30b4fa49beef a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-8881a120-c63d-43dd-8135-7596ef1f460c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.698 183079 DEBUG nova.network.neutron [req-e0bfc71b-3bd6-407b-98d4-16a0ab4b4309 req-5dc349f3-b88c-40f3-8f0d-30b4fa49beef a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Refreshing network info cache for port 52cabc04-1d39-486d-bd16-256d2140713c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.701 183079 DEBUG nova.virt.libvirt.driver [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Start _get_guest_xml network_info=[{"id": "52cabc04-1d39-486d-bd16-256d2140713c", "address": "fa:16:3e:f3:e4:dc", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52cabc04-1d", "ovs_interfaceid": "52cabc04-1d39-486d-bd16-256d2140713c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.708 183079 WARNING nova.virt.libvirt.driver [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.717 183079 DEBUG nova.virt.libvirt.host [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.718 183079 DEBUG nova.virt.libvirt.host [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.722 183079 DEBUG nova.virt.libvirt.host [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.723 183079 DEBUG nova.virt.libvirt.host [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.723 183079 DEBUG nova.virt.libvirt.driver [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.723 183079 DEBUG nova.virt.hardware [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.724 183079 DEBUG nova.virt.hardware [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.724 183079 DEBUG nova.virt.hardware [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.725 183079 DEBUG nova.virt.hardware [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.725 183079 DEBUG nova.virt.hardware [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.725 183079 DEBUG nova.virt.hardware [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.725 183079 DEBUG nova.virt.hardware [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.726 183079 DEBUG nova.virt.hardware [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.726 183079 DEBUG nova.virt.hardware [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.727 183079 DEBUG nova.virt.hardware [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.727 183079 DEBUG nova.virt.hardware [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.731 183079 DEBUG nova.virt.libvirt.vif [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:43:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-756250740',display_name='tempest-server-test-756250740',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-756250740',id=66,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-fg69o9kh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:43:53Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=8881a120-c63d-43dd-8135-7596ef1f460c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "52cabc04-1d39-486d-bd16-256d2140713c", "address": "fa:16:3e:f3:e4:dc", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52cabc04-1d", "ovs_interfaceid": "52cabc04-1d39-486d-bd16-256d2140713c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.732 183079 DEBUG nova.network.os_vif_util [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "52cabc04-1d39-486d-bd16-256d2140713c", "address": "fa:16:3e:f3:e4:dc", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52cabc04-1d", "ovs_interfaceid": "52cabc04-1d39-486d-bd16-256d2140713c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.733 183079 DEBUG nova.network.os_vif_util [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:e4:dc,bridge_name='br-int',has_traffic_filtering=True,id=52cabc04-1d39-486d-bd16-256d2140713c,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52cabc04-1d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.735 183079 DEBUG nova.objects.instance [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'pci_devices' on Instance uuid 8881a120-c63d-43dd-8135-7596ef1f460c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.756 183079 DEBUG nova.virt.libvirt.driver [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:43:57 compute-0 nova_compute[183075]:   <uuid>8881a120-c63d-43dd-8135-7596ef1f460c</uuid>
Jan 22 17:43:57 compute-0 nova_compute[183075]:   <name>instance-00000042</name>
Jan 22 17:43:57 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:43:57 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:43:57 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:43:57 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-756250740</nova:name>
Jan 22 17:43:57 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:43:57</nova:creationTime>
Jan 22 17:43:57 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:43:57 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:43:57 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:43:57 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:43:57 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:43:57 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:43:57 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:43:57 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:43:57 compute-0 nova_compute[183075]:         <nova:user uuid="66c24efd0cd24c57803ea8508e679d06">tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member</nova:user>
Jan 22 17:43:57 compute-0 nova_compute[183075]:         <nova:project uuid="cbff5cec4b9c4d2eb317124ca20fea9f">tempest-StatelessNetworkSecGroupIPv4Test-1348485316</nova:project>
Jan 22 17:43:57 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:43:57 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:43:57 compute-0 nova_compute[183075]:         <nova:port uuid="52cabc04-1d39-486d-bd16-256d2140713c">
Jan 22 17:43:57 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:43:57 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:43:57 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:43:57 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <system>
Jan 22 17:43:57 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:43:57 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:43:57 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:43:57 compute-0 nova_compute[183075]:       <entry name="serial">8881a120-c63d-43dd-8135-7596ef1f460c</entry>
Jan 22 17:43:57 compute-0 nova_compute[183075]:       <entry name="uuid">8881a120-c63d-43dd-8135-7596ef1f460c</entry>
Jan 22 17:43:57 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     </system>
Jan 22 17:43:57 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:43:57 compute-0 nova_compute[183075]:   <os>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:   </os>
Jan 22 17:43:57 compute-0 nova_compute[183075]:   <features>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:   </features>
Jan 22 17:43:57 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:43:57 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:43:57 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:43:57 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/8881a120-c63d-43dd-8135-7596ef1f460c/disk"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:43:57 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:f3:e4:dc"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:       <target dev="tap52cabc04-1d"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:43:57 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/8881a120-c63d-43dd-8135-7596ef1f460c/console.log" append="off"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <video>
Jan 22 17:43:57 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     </video>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:43:57 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:43:57 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:43:57 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:43:57 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:43:57 compute-0 nova_compute[183075]: </domain>
Jan 22 17:43:57 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.757 183079 DEBUG nova.compute.manager [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Preparing to wait for external event network-vif-plugged-52cabc04-1d39-486d-bd16-256d2140713c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.757 183079 DEBUG oslo_concurrency.lockutils [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "8881a120-c63d-43dd-8135-7596ef1f460c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.757 183079 DEBUG oslo_concurrency.lockutils [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "8881a120-c63d-43dd-8135-7596ef1f460c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.758 183079 DEBUG oslo_concurrency.lockutils [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "8881a120-c63d-43dd-8135-7596ef1f460c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.758 183079 DEBUG nova.virt.libvirt.vif [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:43:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-756250740',display_name='tempest-server-test-756250740',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-756250740',id=66,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-fg69o9kh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:43:53Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=8881a120-c63d-43dd-8135-7596ef1f460c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "52cabc04-1d39-486d-bd16-256d2140713c", "address": "fa:16:3e:f3:e4:dc", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52cabc04-1d", "ovs_interfaceid": "52cabc04-1d39-486d-bd16-256d2140713c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.759 183079 DEBUG nova.network.os_vif_util [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "52cabc04-1d39-486d-bd16-256d2140713c", "address": "fa:16:3e:f3:e4:dc", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52cabc04-1d", "ovs_interfaceid": "52cabc04-1d39-486d-bd16-256d2140713c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.760 183079 DEBUG nova.network.os_vif_util [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:e4:dc,bridge_name='br-int',has_traffic_filtering=True,id=52cabc04-1d39-486d-bd16-256d2140713c,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52cabc04-1d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.760 183079 DEBUG os_vif [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:e4:dc,bridge_name='br-int',has_traffic_filtering=True,id=52cabc04-1d39-486d-bd16-256d2140713c,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52cabc04-1d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.761 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.762 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.762 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.766 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.766 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap52cabc04-1d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.767 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap52cabc04-1d, col_values=(('external_ids', {'iface-id': '52cabc04-1d39-486d-bd16-256d2140713c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f3:e4:dc', 'vm-uuid': '8881a120-c63d-43dd-8135-7596ef1f460c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.769 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:57 compute-0 NetworkManager[55454]: <info>  [1769103837.7711] manager: (tap52cabc04-1d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/301)
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.773 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.776 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.777 183079 INFO os_vif [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:e4:dc,bridge_name='br-int',has_traffic_filtering=True,id=52cabc04-1d39-486d-bd16-256d2140713c,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52cabc04-1d')
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.819 183079 DEBUG nova.virt.libvirt.driver [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.820 183079 DEBUG nova.virt.libvirt.driver [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No VIF found with MAC fa:16:3e:f3:e4:dc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:43:57 compute-0 kernel: tap52cabc04-1d: entered promiscuous mode
Jan 22 17:43:57 compute-0 NetworkManager[55454]: <info>  [1769103837.8962] manager: (tap52cabc04-1d): new Tun device (/org/freedesktop/NetworkManager/Devices/302)
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.897 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:57 compute-0 ovn_controller[95372]: 2026-01-22T17:43:57Z|00742|binding|INFO|Claiming lport 52cabc04-1d39-486d-bd16-256d2140713c for this chassis.
Jan 22 17:43:57 compute-0 ovn_controller[95372]: 2026-01-22T17:43:57Z|00743|binding|INFO|52cabc04-1d39-486d-bd16-256d2140713c: Claiming fa:16:3e:f3:e4:dc 10.100.0.5
Jan 22 17:43:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:57.905 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:e4:dc 10.100.0.5'], port_security=['fa:16:3e:f3:e4:dc 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '8881a120-c63d-43dd-8135-7596ef1f460c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '54e9e10b-dcc5-478e-80d1-2fa424b9bf22 b9fe05a4-23df-4029-b1fa-68aa53e00156', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=52cabc04-1d39-486d-bd16-256d2140713c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:43:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:57.907 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 52cabc04-1d39-486d-bd16-256d2140713c in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 bound to our chassis
Jan 22 17:43:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:57.909 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:43:57 compute-0 ovn_controller[95372]: 2026-01-22T17:43:57Z|00744|binding|INFO|Setting lport 52cabc04-1d39-486d-bd16-256d2140713c up in Southbound
Jan 22 17:43:57 compute-0 ovn_controller[95372]: 2026-01-22T17:43:57Z|00745|binding|INFO|Setting lport 52cabc04-1d39-486d-bd16-256d2140713c ovn-installed in OVS
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.911 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:57 compute-0 nova_compute[183075]: 2026-01-22 17:43:57.915 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:57.930 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d31610d6-f6bc-4f84-ad06-da844f6f28cc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:43:57 compute-0 systemd-udevd[239018]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:43:57 compute-0 systemd-machined[154382]: New machine qemu-66-instance-00000042.
Jan 22 17:43:57 compute-0 NetworkManager[55454]: <info>  [1769103837.9513] device (tap52cabc04-1d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:43:57 compute-0 NetworkManager[55454]: <info>  [1769103837.9519] device (tap52cabc04-1d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:43:57 compute-0 systemd[1]: Started Virtual Machine qemu-66-instance-00000042.
Jan 22 17:43:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:57.969 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[c5d94546-9d36-4d14-ab2e-57bbb22c813e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:43:57 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:57.974 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[b9192331-67e6-4474-a44b-f3e226c48cf5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:43:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:58.010 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[c6b92641-fb2d-4599-8b7d-00e0570e1e72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:43:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:58.030 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b81d5324-9575-4383-884b-41b5cf3c6d9b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 54, 'rx_bytes': 8920, 'tx_bytes': 6129, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 54, 'rx_bytes': 8920, 'tx_bytes': 6129, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 200], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 616722, 'reachable_time': 15422, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239030, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:43:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:58.053 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4416ed42-b8a4-4db3-b8e3-89e3d4a0fa44]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 616735, 'tstamp': 616735}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239032, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 616738, 'tstamp': 616738}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239032, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:43:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:58.055 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:43:58 compute-0 nova_compute[183075]: 2026-01-22 17:43:58.056 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:58.059 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88ed9213-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:43:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:58.059 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:43:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:58.060 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88ed9213-70, col_values=(('external_ids', {'iface-id': 'da93a32e-eed6-4124-8f08-78b0bfe2cb69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:43:58 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:43:58.060 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:43:58 compute-0 nova_compute[183075]: 2026-01-22 17:43:58.450 183079 DEBUG nova.compute.manager [req-06f3b7fd-4749-46e4-aaba-35652548e1cc req-72310a60-b55e-400c-a50a-22b2e871d897 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Received event network-vif-plugged-52cabc04-1d39-486d-bd16-256d2140713c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:43:58 compute-0 nova_compute[183075]: 2026-01-22 17:43:58.450 183079 DEBUG oslo_concurrency.lockutils [req-06f3b7fd-4749-46e4-aaba-35652548e1cc req-72310a60-b55e-400c-a50a-22b2e871d897 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "8881a120-c63d-43dd-8135-7596ef1f460c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:43:58 compute-0 nova_compute[183075]: 2026-01-22 17:43:58.451 183079 DEBUG oslo_concurrency.lockutils [req-06f3b7fd-4749-46e4-aaba-35652548e1cc req-72310a60-b55e-400c-a50a-22b2e871d897 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "8881a120-c63d-43dd-8135-7596ef1f460c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:43:58 compute-0 nova_compute[183075]: 2026-01-22 17:43:58.452 183079 DEBUG oslo_concurrency.lockutils [req-06f3b7fd-4749-46e4-aaba-35652548e1cc req-72310a60-b55e-400c-a50a-22b2e871d897 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "8881a120-c63d-43dd-8135-7596ef1f460c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:43:58 compute-0 nova_compute[183075]: 2026-01-22 17:43:58.452 183079 DEBUG nova.compute.manager [req-06f3b7fd-4749-46e4-aaba-35652548e1cc req-72310a60-b55e-400c-a50a-22b2e871d897 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Processing event network-vif-plugged-52cabc04-1d39-486d-bd16-256d2140713c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:43:58 compute-0 nova_compute[183075]: 2026-01-22 17:43:58.457 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:58 compute-0 nova_compute[183075]: 2026-01-22 17:43:58.785 183079 DEBUG nova.network.neutron [req-e0bfc71b-3bd6-407b-98d4-16a0ab4b4309 req-5dc349f3-b88c-40f3-8f0d-30b4fa49beef a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Updated VIF entry in instance network info cache for port 52cabc04-1d39-486d-bd16-256d2140713c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:43:58 compute-0 nova_compute[183075]: 2026-01-22 17:43:58.786 183079 DEBUG nova.network.neutron [req-e0bfc71b-3bd6-407b-98d4-16a0ab4b4309 req-5dc349f3-b88c-40f3-8f0d-30b4fa49beef a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Updating instance_info_cache with network_info: [{"id": "52cabc04-1d39-486d-bd16-256d2140713c", "address": "fa:16:3e:f3:e4:dc", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52cabc04-1d", "ovs_interfaceid": "52cabc04-1d39-486d-bd16-256d2140713c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:43:58 compute-0 nova_compute[183075]: 2026-01-22 17:43:58.883 183079 DEBUG oslo_concurrency.lockutils [req-e0bfc71b-3bd6-407b-98d4-16a0ab4b4309 req-5dc349f3-b88c-40f3-8f0d-30b4fa49beef a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-8881a120-c63d-43dd-8135-7596ef1f460c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:43:59 compute-0 podman[239033]: 2026-01-22 17:43:59.352920136 +0000 UTC m=+0.064491298 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:43:59 compute-0 nova_compute[183075]: 2026-01-22 17:43:59.706 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103839.705618, 8881a120-c63d-43dd-8135-7596ef1f460c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:43:59 compute-0 nova_compute[183075]: 2026-01-22 17:43:59.706 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] VM Started (Lifecycle Event)
Jan 22 17:43:59 compute-0 nova_compute[183075]: 2026-01-22 17:43:59.708 183079 DEBUG nova.compute.manager [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:43:59 compute-0 nova_compute[183075]: 2026-01-22 17:43:59.712 183079 DEBUG nova.virt.libvirt.driver [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:43:59 compute-0 nova_compute[183075]: 2026-01-22 17:43:59.715 183079 INFO nova.virt.libvirt.driver [-] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Instance spawned successfully.
Jan 22 17:43:59 compute-0 nova_compute[183075]: 2026-01-22 17:43:59.715 183079 DEBUG nova.virt.libvirt.driver [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:43:59 compute-0 nova_compute[183075]: 2026-01-22 17:43:59.727 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:43:59 compute-0 nova_compute[183075]: 2026-01-22 17:43:59.733 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:43:59 compute-0 nova_compute[183075]: 2026-01-22 17:43:59.737 183079 DEBUG nova.virt.libvirt.driver [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:43:59 compute-0 nova_compute[183075]: 2026-01-22 17:43:59.738 183079 DEBUG nova.virt.libvirt.driver [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:43:59 compute-0 nova_compute[183075]: 2026-01-22 17:43:59.738 183079 DEBUG nova.virt.libvirt.driver [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:43:59 compute-0 nova_compute[183075]: 2026-01-22 17:43:59.739 183079 DEBUG nova.virt.libvirt.driver [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:43:59 compute-0 nova_compute[183075]: 2026-01-22 17:43:59.739 183079 DEBUG nova.virt.libvirt.driver [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:43:59 compute-0 nova_compute[183075]: 2026-01-22 17:43:59.740 183079 DEBUG nova.virt.libvirt.driver [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:43:59 compute-0 nova_compute[183075]: 2026-01-22 17:43:59.776 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:43:59 compute-0 nova_compute[183075]: 2026-01-22 17:43:59.776 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103839.705972, 8881a120-c63d-43dd-8135-7596ef1f460c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:43:59 compute-0 nova_compute[183075]: 2026-01-22 17:43:59.776 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] VM Paused (Lifecycle Event)
Jan 22 17:43:59 compute-0 nova_compute[183075]: 2026-01-22 17:43:59.804 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:43:59 compute-0 nova_compute[183075]: 2026-01-22 17:43:59.808 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103839.7111306, 8881a120-c63d-43dd-8135-7596ef1f460c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:43:59 compute-0 nova_compute[183075]: 2026-01-22 17:43:59.808 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] VM Resumed (Lifecycle Event)
Jan 22 17:43:59 compute-0 nova_compute[183075]: 2026-01-22 17:43:59.814 183079 INFO nova.compute.manager [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Took 6.01 seconds to spawn the instance on the hypervisor.
Jan 22 17:43:59 compute-0 nova_compute[183075]: 2026-01-22 17:43:59.814 183079 DEBUG nova.compute.manager [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:44:00 compute-0 nova_compute[183075]: 2026-01-22 17:44:00.149 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:44:00 compute-0 nova_compute[183075]: 2026-01-22 17:44:00.152 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:44:00 compute-0 nova_compute[183075]: 2026-01-22 17:44:00.671 183079 DEBUG nova.compute.manager [req-50c3b2af-d87a-4696-9b53-7dbbbc137153 req-30174261-be91-4e94-a4c6-b1830a4950b3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Received event network-vif-plugged-52cabc04-1d39-486d-bd16-256d2140713c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:44:00 compute-0 nova_compute[183075]: 2026-01-22 17:44:00.671 183079 DEBUG oslo_concurrency.lockutils [req-50c3b2af-d87a-4696-9b53-7dbbbc137153 req-30174261-be91-4e94-a4c6-b1830a4950b3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "8881a120-c63d-43dd-8135-7596ef1f460c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:44:00 compute-0 nova_compute[183075]: 2026-01-22 17:44:00.672 183079 DEBUG oslo_concurrency.lockutils [req-50c3b2af-d87a-4696-9b53-7dbbbc137153 req-30174261-be91-4e94-a4c6-b1830a4950b3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "8881a120-c63d-43dd-8135-7596ef1f460c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:44:00 compute-0 nova_compute[183075]: 2026-01-22 17:44:00.672 183079 DEBUG oslo_concurrency.lockutils [req-50c3b2af-d87a-4696-9b53-7dbbbc137153 req-30174261-be91-4e94-a4c6-b1830a4950b3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "8881a120-c63d-43dd-8135-7596ef1f460c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:44:00 compute-0 nova_compute[183075]: 2026-01-22 17:44:00.672 183079 DEBUG nova.compute.manager [req-50c3b2af-d87a-4696-9b53-7dbbbc137153 req-30174261-be91-4e94-a4c6-b1830a4950b3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] No waiting events found dispatching network-vif-plugged-52cabc04-1d39-486d-bd16-256d2140713c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:44:00 compute-0 nova_compute[183075]: 2026-01-22 17:44:00.673 183079 WARNING nova.compute.manager [req-50c3b2af-d87a-4696-9b53-7dbbbc137153 req-30174261-be91-4e94-a4c6-b1830a4950b3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Received unexpected event network-vif-plugged-52cabc04-1d39-486d-bd16-256d2140713c for instance with vm_state building and task_state spawning.
Jan 22 17:44:00 compute-0 nova_compute[183075]: 2026-01-22 17:44:00.690 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:44:00 compute-0 nova_compute[183075]: 2026-01-22 17:44:00.708 183079 INFO nova.compute.manager [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Took 7.28 seconds to build instance.
Jan 22 17:44:00 compute-0 nova_compute[183075]: 2026-01-22 17:44:00.723 183079 DEBUG oslo_concurrency.lockutils [None req-516adfcd-82f6-4eaf-bfab-c94954b2352b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "8881a120-c63d-43dd-8135-7596ef1f460c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.369s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:44:02 compute-0 nova_compute[183075]: 2026-01-22 17:44:02.643 183079 INFO nova.compute.manager [None req-4bcfcb0f-12dd-4bf3-b2bc-5826c9fc87b6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Get console output
Jan 22 17:44:02 compute-0 nova_compute[183075]: 2026-01-22 17:44:02.650 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:44:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:02.782 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '49'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:44:02 compute-0 nova_compute[183075]: 2026-01-22 17:44:02.810 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:03 compute-0 nova_compute[183075]: 2026-01-22 17:44:03.458 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:07 compute-0 nova_compute[183075]: 2026-01-22 17:44:07.804 183079 INFO nova.compute.manager [None req-bdfdcec0-34e8-4c31-89dc-eb5f31671aae 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Get console output
Jan 22 17:44:07 compute-0 nova_compute[183075]: 2026-01-22 17:44:07.809 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:44:07 compute-0 nova_compute[183075]: 2026-01-22 17:44:07.813 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:08 compute-0 nova_compute[183075]: 2026-01-22 17:44:08.460 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:10 compute-0 podman[239067]: 2026-01-22 17:44:10.345108845 +0000 UTC m=+0.050820452 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:44:11 compute-0 ovn_controller[95372]: 2026-01-22T17:44:11Z|00134|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f3:e4:dc 10.100.0.5
Jan 22 17:44:11 compute-0 ovn_controller[95372]: 2026-01-22T17:44:11Z|00135|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f3:e4:dc 10.100.0.5
Jan 22 17:44:12 compute-0 nova_compute[183075]: 2026-01-22 17:44:12.814 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:12 compute-0 nova_compute[183075]: 2026-01-22 17:44:12.929 183079 INFO nova.compute.manager [None req-b7557654-799a-4e64-a209-86e2dbc86828 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Get console output
Jan 22 17:44:12 compute-0 nova_compute[183075]: 2026-01-22 17:44:12.934 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:44:13 compute-0 nova_compute[183075]: 2026-01-22 17:44:13.462 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:14 compute-0 podman[239105]: 2026-01-22 17:44:14.363503616 +0000 UTC m=+0.071743623 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:44:17 compute-0 nova_compute[183075]: 2026-01-22 17:44:17.816 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:18 compute-0 nova_compute[183075]: 2026-01-22 17:44:18.091 183079 INFO nova.compute.manager [None req-08e488a5-a8d8-43bc-9309-d73bcf46d129 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Get console output
Jan 22 17:44:18 compute-0 nova_compute[183075]: 2026-01-22 17:44:18.096 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:44:18 compute-0 nova_compute[183075]: 2026-01-22 17:44:18.465 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:18.616 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:44:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:18.618 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:44:18 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:44:18 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:44:18 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:44:18 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:44:18 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:44:18 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:44:18 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.513 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:44:19 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238817]: 10.100.0.5:47038 [22/Jan/2026:17:44:18.615] listener listener/metadata 0/0/0/898/898 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.514 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.8967021
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.521 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.522 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.534 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.534 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0124500
Jan 22 17:44:19 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238817]: 10.100.0.5:47048 [22/Jan/2026:17:44:19.521] listener listener/metadata 0/0/0/13/13 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.538 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.538 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.551 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.551 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0131562
Jan 22 17:44:19 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238817]: 10.100.0.5:47052 [22/Jan/2026:17:44:19.537] listener listener/metadata 0/0/0/14/14 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.555 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.556 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.571 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:44:19 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238817]: 10.100.0.5:47054 [22/Jan/2026:17:44:19.555] listener listener/metadata 0/0/0/16/16 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.572 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0160656
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.576 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.577 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.590 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.590 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0135713
Jan 22 17:44:19 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238817]: 10.100.0.5:47056 [22/Jan/2026:17:44:19.576] listener listener/metadata 0/0/0/14/14 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.595 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.595 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.611 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.611 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0160847
Jan 22 17:44:19 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238817]: 10.100.0.5:47070 [22/Jan/2026:17:44:19.594] listener listener/metadata 0/0/0/17/17 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.616 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.616 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.629 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.630 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0132813
Jan 22 17:44:19 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238817]: 10.100.0.5:47082 [22/Jan/2026:17:44:19.615] listener listener/metadata 0/0/0/14/14 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.634 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.634 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.647 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:44:19 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238817]: 10.100.0.5:47094 [22/Jan/2026:17:44:19.633] listener listener/metadata 0/0/0/14/14 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.648 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0137622
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.652 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.653 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.668 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.669 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 165 time: 0.0161726
Jan 22 17:44:19 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238817]: 10.100.0.5:47104 [22/Jan/2026:17:44:19.652] listener listener/metadata 0/0/0/17/17 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.674 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.675 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.688 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.688 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 165 time: 0.0132678
Jan 22 17:44:19 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238817]: 10.100.0.5:47120 [22/Jan/2026:17:44:19.674] listener listener/metadata 0/0/0/14/14 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.693 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.694 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.711 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0171483
Jan 22 17:44:19 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238817]: 10.100.0.5:47126 [22/Jan/2026:17:44:19.693] listener listener/metadata 0/0/0/18/18 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.721 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.722 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.737 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:44:19 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238817]: 10.100.0.5:47136 [22/Jan/2026:17:44:19.721] listener listener/metadata 0/0/0/16/16 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.737 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0152369
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.741 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.741 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.761 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.762 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0204732
Jan 22 17:44:19 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238817]: 10.100.0.5:47138 [22/Jan/2026:17:44:19.740] listener listener/metadata 0/0/0/21/21 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.765 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.766 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.780 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.780 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0140791
Jan 22 17:44:19 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238817]: 10.100.0.5:47142 [22/Jan/2026:17:44:19.765] listener listener/metadata 0/0/0/15/15 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.784 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.785 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.798 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:44:19 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238817]: 10.100.0.5:47148 [22/Jan/2026:17:44:19.784] listener listener/metadata 0/0/0/14/14 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.798 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 165 time: 0.0132871
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.802 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.803 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.814 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:44:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:19.814 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0115492
Jan 22 17:44:19 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238817]: 10.100.0.5:47154 [22/Jan/2026:17:44:19.802] listener listener/metadata 0/0/0/12/12 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:44:22 compute-0 nova_compute[183075]: 2026-01-22 17:44:22.818 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:23 compute-0 nova_compute[183075]: 2026-01-22 17:44:23.224 183079 INFO nova.compute.manager [None req-5d3f870f-e747-4c94-89fe-bd874d12018b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Get console output
Jan 22 17:44:23 compute-0 nova_compute[183075]: 2026-01-22 17:44:23.229 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:44:23 compute-0 nova_compute[183075]: 2026-01-22 17:44:23.467 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:24 compute-0 podman[239144]: 2026-01-22 17:44:24.347348984 +0000 UTC m=+0.047846962 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 17:44:24 compute-0 podman[239145]: 2026-01-22 17:44:24.376647499 +0000 UTC m=+0.063634385 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, release=1755695350)
Jan 22 17:44:24 compute-0 podman[239143]: 2026-01-22 17:44:24.39532227 +0000 UTC m=+0.099129947 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:44:27 compute-0 nova_compute[183075]: 2026-01-22 17:44:27.820 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:27 compute-0 ovn_controller[95372]: 2026-01-22T17:44:27Z|00746|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 22 17:44:28 compute-0 nova_compute[183075]: 2026-01-22 17:44:28.471 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:28 compute-0 nova_compute[183075]: 2026-01-22 17:44:28.880 183079 DEBUG nova.compute.manager [req-b61e020c-6e3b-46b9-962d-0a865694840b req-4cefbb96-a0c0-4e55-9cce-e6c3513174a5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Received event network-changed-eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:44:28 compute-0 nova_compute[183075]: 2026-01-22 17:44:28.881 183079 DEBUG nova.compute.manager [req-b61e020c-6e3b-46b9-962d-0a865694840b req-4cefbb96-a0c0-4e55-9cce-e6c3513174a5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Refreshing instance network info cache due to event network-changed-eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:44:28 compute-0 nova_compute[183075]: 2026-01-22 17:44:28.881 183079 DEBUG oslo_concurrency.lockutils [req-b61e020c-6e3b-46b9-962d-0a865694840b req-4cefbb96-a0c0-4e55-9cce-e6c3513174a5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-c570960f-e948-4456-9f9c-7b8afd2cf0ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:44:28 compute-0 nova_compute[183075]: 2026-01-22 17:44:28.881 183079 DEBUG oslo_concurrency.lockutils [req-b61e020c-6e3b-46b9-962d-0a865694840b req-4cefbb96-a0c0-4e55-9cce-e6c3513174a5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-c570960f-e948-4456-9f9c-7b8afd2cf0ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:44:28 compute-0 nova_compute[183075]: 2026-01-22 17:44:28.882 183079 DEBUG nova.network.neutron [req-b61e020c-6e3b-46b9-962d-0a865694840b req-4cefbb96-a0c0-4e55-9cce-e6c3513174a5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Refreshing network info cache for port eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:44:29 compute-0 nova_compute[183075]: 2026-01-22 17:44:29.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:44:30 compute-0 podman[239207]: 2026-01-22 17:44:30.352435762 +0000 UTC m=+0.062652980 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 22 17:44:31 compute-0 nova_compute[183075]: 2026-01-22 17:44:31.459 183079 DEBUG nova.network.neutron [req-b61e020c-6e3b-46b9-962d-0a865694840b req-4cefbb96-a0c0-4e55-9cce-e6c3513174a5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Updated VIF entry in instance network info cache for port eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:44:31 compute-0 nova_compute[183075]: 2026-01-22 17:44:31.460 183079 DEBUG nova.network.neutron [req-b61e020c-6e3b-46b9-962d-0a865694840b req-4cefbb96-a0c0-4e55-9cce-e6c3513174a5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Updating instance_info_cache with network_info: [{"id": "eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef", "address": "fa:16:3e:d4:a5:66", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeff2a2a1-b5", "ovs_interfaceid": "eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:44:31 compute-0 nova_compute[183075]: 2026-01-22 17:44:31.481 183079 DEBUG oslo_concurrency.lockutils [req-b61e020c-6e3b-46b9-962d-0a865694840b req-4cefbb96-a0c0-4e55-9cce-e6c3513174a5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-c570960f-e948-4456-9f9c-7b8afd2cf0ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:44:32 compute-0 nova_compute[183075]: 2026-01-22 17:44:32.773 183079 DEBUG nova.compute.manager [req-081fe260-c462-419c-882e-e19673396354 req-0e572be5-3f78-4937-9365-34d3d185d71c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Received event network-changed-52cabc04-1d39-486d-bd16-256d2140713c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:44:32 compute-0 nova_compute[183075]: 2026-01-22 17:44:32.773 183079 DEBUG nova.compute.manager [req-081fe260-c462-419c-882e-e19673396354 req-0e572be5-3f78-4937-9365-34d3d185d71c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Refreshing instance network info cache due to event network-changed-52cabc04-1d39-486d-bd16-256d2140713c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:44:32 compute-0 nova_compute[183075]: 2026-01-22 17:44:32.773 183079 DEBUG oslo_concurrency.lockutils [req-081fe260-c462-419c-882e-e19673396354 req-0e572be5-3f78-4937-9365-34d3d185d71c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-8881a120-c63d-43dd-8135-7596ef1f460c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:44:32 compute-0 nova_compute[183075]: 2026-01-22 17:44:32.773 183079 DEBUG oslo_concurrency.lockutils [req-081fe260-c462-419c-882e-e19673396354 req-0e572be5-3f78-4937-9365-34d3d185d71c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-8881a120-c63d-43dd-8135-7596ef1f460c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:44:32 compute-0 nova_compute[183075]: 2026-01-22 17:44:32.773 183079 DEBUG nova.network.neutron [req-081fe260-c462-419c-882e-e19673396354 req-0e572be5-3f78-4937-9365-34d3d185d71c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Refreshing network info cache for port 52cabc04-1d39-486d-bd16-256d2140713c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:44:32 compute-0 nova_compute[183075]: 2026-01-22 17:44:32.821 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:32 compute-0 nova_compute[183075]: 2026-01-22 17:44:32.953 183079 DEBUG oslo_concurrency.lockutils [None req-44c906c6-e90f-4956-9711-6ee89bfae316 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "8881a120-c63d-43dd-8135-7596ef1f460c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:44:32 compute-0 nova_compute[183075]: 2026-01-22 17:44:32.954 183079 DEBUG oslo_concurrency.lockutils [None req-44c906c6-e90f-4956-9711-6ee89bfae316 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "8881a120-c63d-43dd-8135-7596ef1f460c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:44:32 compute-0 nova_compute[183075]: 2026-01-22 17:44:32.954 183079 DEBUG oslo_concurrency.lockutils [None req-44c906c6-e90f-4956-9711-6ee89bfae316 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "8881a120-c63d-43dd-8135-7596ef1f460c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:44:32 compute-0 nova_compute[183075]: 2026-01-22 17:44:32.955 183079 DEBUG oslo_concurrency.lockutils [None req-44c906c6-e90f-4956-9711-6ee89bfae316 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "8881a120-c63d-43dd-8135-7596ef1f460c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:44:32 compute-0 nova_compute[183075]: 2026-01-22 17:44:32.955 183079 DEBUG oslo_concurrency.lockutils [None req-44c906c6-e90f-4956-9711-6ee89bfae316 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "8881a120-c63d-43dd-8135-7596ef1f460c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:44:32 compute-0 nova_compute[183075]: 2026-01-22 17:44:32.956 183079 INFO nova.compute.manager [None req-44c906c6-e90f-4956-9711-6ee89bfae316 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Terminating instance
Jan 22 17:44:32 compute-0 nova_compute[183075]: 2026-01-22 17:44:32.958 183079 DEBUG nova.compute.manager [None req-44c906c6-e90f-4956-9711-6ee89bfae316 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:44:32 compute-0 kernel: tap52cabc04-1d (unregistering): left promiscuous mode
Jan 22 17:44:32 compute-0 NetworkManager[55454]: <info>  [1769103872.9869] device (tap52cabc04-1d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:44:32 compute-0 ovn_controller[95372]: 2026-01-22T17:44:32Z|00747|binding|INFO|Releasing lport 52cabc04-1d39-486d-bd16-256d2140713c from this chassis (sb_readonly=0)
Jan 22 17:44:32 compute-0 ovn_controller[95372]: 2026-01-22T17:44:32Z|00748|binding|INFO|Setting lport 52cabc04-1d39-486d-bd16-256d2140713c down in Southbound
Jan 22 17:44:32 compute-0 nova_compute[183075]: 2026-01-22 17:44:32.994 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:32 compute-0 ovn_controller[95372]: 2026-01-22T17:44:32Z|00749|binding|INFO|Removing iface tap52cabc04-1d ovn-installed in OVS
Jan 22 17:44:32 compute-0 nova_compute[183075]: 2026-01-22 17:44:32.996 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:33.002 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:e4:dc 10.100.0.5'], port_security=['fa:16:3e:f3:e4:dc 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '8881a120-c63d-43dd-8135-7596ef1f460c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '54e9e10b-dcc5-478e-80d1-2fa424b9bf22 b9fe05a4-23df-4029-b1fa-68aa53e00156', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.192'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=52cabc04-1d39-486d-bd16-256d2140713c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:44:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:33.003 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 52cabc04-1d39-486d-bd16-256d2140713c in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 unbound from our chassis
Jan 22 17:44:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:33.004 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:44:33 compute-0 nova_compute[183075]: 2026-01-22 17:44:33.006 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:33.027 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[739a1c7b-fc6a-4134-86a4-cccd60b86b64]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:44:33 compute-0 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d00000042.scope: Deactivated successfully.
Jan 22 17:44:33 compute-0 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d00000042.scope: Consumed 13.872s CPU time.
Jan 22 17:44:33 compute-0 systemd-machined[154382]: Machine qemu-66-instance-00000042 terminated.
Jan 22 17:44:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:33.056 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[8217cca6-f4c3-4679-9228-d6e8c8e498c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:44:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:33.059 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[81e22546-02b3-40e2-9865-4d39a362a86b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:44:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:33.088 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[331d94ad-0492-4002-81df-18deee4bda0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:44:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:33.105 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[34248d99-3fbd-4055-bd88-8d16c51b4a22]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 202, 'tx_packets': 105, 'rx_bytes': 17308, 'tx_bytes': 11987, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 202, 'tx_packets': 105, 'rx_bytes': 17308, 'tx_bytes': 11987, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 200], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 616722, 'reachable_time': 32499, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239241, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:44:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:33.119 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[17403956-a070-44e5-aae6-560c13bd089d]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 616735, 'tstamp': 616735}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239242, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 616738, 'tstamp': 616738}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239242, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:44:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:33.120 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:44:33 compute-0 nova_compute[183075]: 2026-01-22 17:44:33.122 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:33.127 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88ed9213-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:44:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:33.128 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:44:33 compute-0 nova_compute[183075]: 2026-01-22 17:44:33.127 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:33.128 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88ed9213-70, col_values=(('external_ids', {'iface-id': 'da93a32e-eed6-4124-8f08-78b0bfe2cb69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:44:33 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:33.128 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:44:33 compute-0 nova_compute[183075]: 2026-01-22 17:44:33.229 183079 INFO nova.virt.libvirt.driver [-] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Instance destroyed successfully.
Jan 22 17:44:33 compute-0 nova_compute[183075]: 2026-01-22 17:44:33.230 183079 DEBUG nova.objects.instance [None req-44c906c6-e90f-4956-9711-6ee89bfae316 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'resources' on Instance uuid 8881a120-c63d-43dd-8135-7596ef1f460c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:44:33 compute-0 nova_compute[183075]: 2026-01-22 17:44:33.242 183079 DEBUG nova.virt.libvirt.vif [None req-44c906c6-e90f-4956-9711-6ee89bfae316 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:43:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-756250740',display_name='tempest-server-test-756250740',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-756250740',id=66,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:43:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-fg69o9kh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:44:00Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=8881a120-c63d-43dd-8135-7596ef1f460c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "52cabc04-1d39-486d-bd16-256d2140713c", "address": "fa:16:3e:f3:e4:dc", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52cabc04-1d", "ovs_interfaceid": "52cabc04-1d39-486d-bd16-256d2140713c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:44:33 compute-0 nova_compute[183075]: 2026-01-22 17:44:33.242 183079 DEBUG nova.network.os_vif_util [None req-44c906c6-e90f-4956-9711-6ee89bfae316 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "52cabc04-1d39-486d-bd16-256d2140713c", "address": "fa:16:3e:f3:e4:dc", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52cabc04-1d", "ovs_interfaceid": "52cabc04-1d39-486d-bd16-256d2140713c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:44:33 compute-0 nova_compute[183075]: 2026-01-22 17:44:33.243 183079 DEBUG nova.network.os_vif_util [None req-44c906c6-e90f-4956-9711-6ee89bfae316 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:e4:dc,bridge_name='br-int',has_traffic_filtering=True,id=52cabc04-1d39-486d-bd16-256d2140713c,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52cabc04-1d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:44:33 compute-0 nova_compute[183075]: 2026-01-22 17:44:33.243 183079 DEBUG os_vif [None req-44c906c6-e90f-4956-9711-6ee89bfae316 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:e4:dc,bridge_name='br-int',has_traffic_filtering=True,id=52cabc04-1d39-486d-bd16-256d2140713c,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52cabc04-1d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:44:33 compute-0 nova_compute[183075]: 2026-01-22 17:44:33.245 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:33 compute-0 nova_compute[183075]: 2026-01-22 17:44:33.245 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap52cabc04-1d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:44:33 compute-0 nova_compute[183075]: 2026-01-22 17:44:33.246 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:33 compute-0 nova_compute[183075]: 2026-01-22 17:44:33.248 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:33 compute-0 nova_compute[183075]: 2026-01-22 17:44:33.250 183079 INFO os_vif [None req-44c906c6-e90f-4956-9711-6ee89bfae316 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:e4:dc,bridge_name='br-int',has_traffic_filtering=True,id=52cabc04-1d39-486d-bd16-256d2140713c,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52cabc04-1d')
Jan 22 17:44:33 compute-0 nova_compute[183075]: 2026-01-22 17:44:33.251 183079 INFO nova.virt.libvirt.driver [None req-44c906c6-e90f-4956-9711-6ee89bfae316 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Deleting instance files /var/lib/nova/instances/8881a120-c63d-43dd-8135-7596ef1f460c_del
Jan 22 17:44:33 compute-0 nova_compute[183075]: 2026-01-22 17:44:33.251 183079 INFO nova.virt.libvirt.driver [None req-44c906c6-e90f-4956-9711-6ee89bfae316 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Deletion of /var/lib/nova/instances/8881a120-c63d-43dd-8135-7596ef1f460c_del complete
Jan 22 17:44:33 compute-0 nova_compute[183075]: 2026-01-22 17:44:33.304 183079 INFO nova.compute.manager [None req-44c906c6-e90f-4956-9711-6ee89bfae316 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Took 0.35 seconds to destroy the instance on the hypervisor.
Jan 22 17:44:33 compute-0 nova_compute[183075]: 2026-01-22 17:44:33.305 183079 DEBUG oslo.service.loopingcall [None req-44c906c6-e90f-4956-9711-6ee89bfae316 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:44:33 compute-0 nova_compute[183075]: 2026-01-22 17:44:33.305 183079 DEBUG nova.compute.manager [-] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:44:33 compute-0 nova_compute[183075]: 2026-01-22 17:44:33.305 183079 DEBUG nova.network.neutron [-] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:44:33 compute-0 nova_compute[183075]: 2026-01-22 17:44:33.477 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:33 compute-0 nova_compute[183075]: 2026-01-22 17:44:33.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:44:34 compute-0 nova_compute[183075]: 2026-01-22 17:44:34.439 183079 DEBUG nova.network.neutron [req-081fe260-c462-419c-882e-e19673396354 req-0e572be5-3f78-4937-9365-34d3d185d71c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Updated VIF entry in instance network info cache for port 52cabc04-1d39-486d-bd16-256d2140713c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:44:34 compute-0 nova_compute[183075]: 2026-01-22 17:44:34.440 183079 DEBUG nova.network.neutron [req-081fe260-c462-419c-882e-e19673396354 req-0e572be5-3f78-4937-9365-34d3d185d71c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Updating instance_info_cache with network_info: [{"id": "52cabc04-1d39-486d-bd16-256d2140713c", "address": "fa:16:3e:f3:e4:dc", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52cabc04-1d", "ovs_interfaceid": "52cabc04-1d39-486d-bd16-256d2140713c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:44:34 compute-0 nova_compute[183075]: 2026-01-22 17:44:34.461 183079 DEBUG nova.network.neutron [-] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:44:34 compute-0 nova_compute[183075]: 2026-01-22 17:44:34.466 183079 DEBUG oslo_concurrency.lockutils [req-081fe260-c462-419c-882e-e19673396354 req-0e572be5-3f78-4937-9365-34d3d185d71c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-8881a120-c63d-43dd-8135-7596ef1f460c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:44:34 compute-0 nova_compute[183075]: 2026-01-22 17:44:34.476 183079 INFO nova.compute.manager [-] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Took 1.17 seconds to deallocate network for instance.
Jan 22 17:44:34 compute-0 nova_compute[183075]: 2026-01-22 17:44:34.516 183079 DEBUG oslo_concurrency.lockutils [None req-44c906c6-e90f-4956-9711-6ee89bfae316 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:44:34 compute-0 nova_compute[183075]: 2026-01-22 17:44:34.516 183079 DEBUG oslo_concurrency.lockutils [None req-44c906c6-e90f-4956-9711-6ee89bfae316 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:44:34 compute-0 nova_compute[183075]: 2026-01-22 17:44:34.601 183079 DEBUG nova.compute.provider_tree [None req-44c906c6-e90f-4956-9711-6ee89bfae316 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:44:34 compute-0 nova_compute[183075]: 2026-01-22 17:44:34.615 183079 DEBUG nova.scheduler.client.report [None req-44c906c6-e90f-4956-9711-6ee89bfae316 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:44:34 compute-0 nova_compute[183075]: 2026-01-22 17:44:34.640 183079 DEBUG oslo_concurrency.lockutils [None req-44c906c6-e90f-4956-9711-6ee89bfae316 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:44:34 compute-0 nova_compute[183075]: 2026-01-22 17:44:34.676 183079 INFO nova.scheduler.client.report [None req-44c906c6-e90f-4956-9711-6ee89bfae316 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Deleted allocations for instance 8881a120-c63d-43dd-8135-7596ef1f460c
Jan 22 17:44:34 compute-0 nova_compute[183075]: 2026-01-22 17:44:34.732 183079 DEBUG oslo_concurrency.lockutils [None req-44c906c6-e90f-4956-9711-6ee89bfae316 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "8881a120-c63d-43dd-8135-7596ef1f460c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.778s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:44:34 compute-0 nova_compute[183075]: 2026-01-22 17:44:34.890 183079 DEBUG nova.compute.manager [req-621d94ad-c55e-4926-9862-64a85fff2fb5 req-5714c283-108e-40ea-b658-1eed3a3b2a8a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Received event network-vif-unplugged-52cabc04-1d39-486d-bd16-256d2140713c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:44:34 compute-0 nova_compute[183075]: 2026-01-22 17:44:34.891 183079 DEBUG oslo_concurrency.lockutils [req-621d94ad-c55e-4926-9862-64a85fff2fb5 req-5714c283-108e-40ea-b658-1eed3a3b2a8a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "8881a120-c63d-43dd-8135-7596ef1f460c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:44:34 compute-0 nova_compute[183075]: 2026-01-22 17:44:34.891 183079 DEBUG oslo_concurrency.lockutils [req-621d94ad-c55e-4926-9862-64a85fff2fb5 req-5714c283-108e-40ea-b658-1eed3a3b2a8a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "8881a120-c63d-43dd-8135-7596ef1f460c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:44:34 compute-0 nova_compute[183075]: 2026-01-22 17:44:34.891 183079 DEBUG oslo_concurrency.lockutils [req-621d94ad-c55e-4926-9862-64a85fff2fb5 req-5714c283-108e-40ea-b658-1eed3a3b2a8a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "8881a120-c63d-43dd-8135-7596ef1f460c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:44:34 compute-0 nova_compute[183075]: 2026-01-22 17:44:34.892 183079 DEBUG nova.compute.manager [req-621d94ad-c55e-4926-9862-64a85fff2fb5 req-5714c283-108e-40ea-b658-1eed3a3b2a8a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] No waiting events found dispatching network-vif-unplugged-52cabc04-1d39-486d-bd16-256d2140713c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:44:34 compute-0 nova_compute[183075]: 2026-01-22 17:44:34.892 183079 WARNING nova.compute.manager [req-621d94ad-c55e-4926-9862-64a85fff2fb5 req-5714c283-108e-40ea-b658-1eed3a3b2a8a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Received unexpected event network-vif-unplugged-52cabc04-1d39-486d-bd16-256d2140713c for instance with vm_state deleted and task_state None.
Jan 22 17:44:34 compute-0 nova_compute[183075]: 2026-01-22 17:44:34.892 183079 DEBUG nova.compute.manager [req-621d94ad-c55e-4926-9862-64a85fff2fb5 req-5714c283-108e-40ea-b658-1eed3a3b2a8a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Received event network-vif-plugged-52cabc04-1d39-486d-bd16-256d2140713c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:44:34 compute-0 nova_compute[183075]: 2026-01-22 17:44:34.892 183079 DEBUG oslo_concurrency.lockutils [req-621d94ad-c55e-4926-9862-64a85fff2fb5 req-5714c283-108e-40ea-b658-1eed3a3b2a8a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "8881a120-c63d-43dd-8135-7596ef1f460c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:44:34 compute-0 nova_compute[183075]: 2026-01-22 17:44:34.893 183079 DEBUG oslo_concurrency.lockutils [req-621d94ad-c55e-4926-9862-64a85fff2fb5 req-5714c283-108e-40ea-b658-1eed3a3b2a8a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "8881a120-c63d-43dd-8135-7596ef1f460c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:44:34 compute-0 nova_compute[183075]: 2026-01-22 17:44:34.893 183079 DEBUG oslo_concurrency.lockutils [req-621d94ad-c55e-4926-9862-64a85fff2fb5 req-5714c283-108e-40ea-b658-1eed3a3b2a8a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "8881a120-c63d-43dd-8135-7596ef1f460c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:44:34 compute-0 nova_compute[183075]: 2026-01-22 17:44:34.893 183079 DEBUG nova.compute.manager [req-621d94ad-c55e-4926-9862-64a85fff2fb5 req-5714c283-108e-40ea-b658-1eed3a3b2a8a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] No waiting events found dispatching network-vif-plugged-52cabc04-1d39-486d-bd16-256d2140713c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:44:34 compute-0 nova_compute[183075]: 2026-01-22 17:44:34.893 183079 WARNING nova.compute.manager [req-621d94ad-c55e-4926-9862-64a85fff2fb5 req-5714c283-108e-40ea-b658-1eed3a3b2a8a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Received unexpected event network-vif-plugged-52cabc04-1d39-486d-bd16-256d2140713c for instance with vm_state deleted and task_state None.
Jan 22 17:44:34 compute-0 nova_compute[183075]: 2026-01-22 17:44:34.893 183079 DEBUG nova.compute.manager [req-621d94ad-c55e-4926-9862-64a85fff2fb5 req-5714c283-108e-40ea-b658-1eed3a3b2a8a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Received event network-vif-deleted-52cabc04-1d39-486d-bd16-256d2140713c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:44:35 compute-0 nova_compute[183075]: 2026-01-22 17:44:35.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.067 183079 DEBUG oslo_concurrency.lockutils [None req-36dbd963-983b-4df1-8583-0db7d24401e6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.068 183079 DEBUG oslo_concurrency.lockutils [None req-36dbd963-983b-4df1-8583-0db7d24401e6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.069 183079 DEBUG oslo_concurrency.lockutils [None req-36dbd963-983b-4df1-8583-0db7d24401e6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.069 183079 DEBUG oslo_concurrency.lockutils [None req-36dbd963-983b-4df1-8583-0db7d24401e6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.070 183079 DEBUG oslo_concurrency.lockutils [None req-36dbd963-983b-4df1-8583-0db7d24401e6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.072 183079 INFO nova.compute.manager [None req-36dbd963-983b-4df1-8583-0db7d24401e6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Terminating instance
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.074 183079 DEBUG nova.compute.manager [None req-36dbd963-983b-4df1-8583-0db7d24401e6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:44:36 compute-0 kernel: tapeff2a2a1-b5 (unregistering): left promiscuous mode
Jan 22 17:44:36 compute-0 NetworkManager[55454]: <info>  [1769103876.0999] device (tapeff2a2a1-b5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:44:36 compute-0 ovn_controller[95372]: 2026-01-22T17:44:36Z|00750|binding|INFO|Releasing lport eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef from this chassis (sb_readonly=0)
Jan 22 17:44:36 compute-0 ovn_controller[95372]: 2026-01-22T17:44:36Z|00751|binding|INFO|Setting lport eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef down in Southbound
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.105 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:36 compute-0 ovn_controller[95372]: 2026-01-22T17:44:36Z|00752|binding|INFO|Removing iface tapeff2a2a1-b5 ovn-installed in OVS
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.110 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:36.113 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:a5:66 10.100.0.13'], port_security=['fa:16:3e:d4:a5:66 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '54e9e10b-dcc5-478e-80d1-2fa424b9bf22 b9fe05a4-23df-4029-b1fa-68aa53e00156', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.201'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:44:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:36.114 104629 INFO neutron.agent.ovn.metadata.agent [-] Port eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 unbound from our chassis
Jan 22 17:44:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:36.115 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:44:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:36.116 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[14b02f6f-dae9-42c7-8f67-a02b56ae5518]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:44:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:36.116 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 namespace which is not needed anymore
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.124 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:36 compute-0 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d00000041.scope: Deactivated successfully.
Jan 22 17:44:36 compute-0 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d00000041.scope: Consumed 15.335s CPU time.
Jan 22 17:44:36 compute-0 systemd-machined[154382]: Machine qemu-65-instance-00000041 terminated.
Jan 22 17:44:36 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238810]: [NOTICE]   (238815) : haproxy version is 2.8.14-c23fe91
Jan 22 17:44:36 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238810]: [NOTICE]   (238815) : path to executable is /usr/sbin/haproxy
Jan 22 17:44:36 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238810]: [WARNING]  (238815) : Exiting Master process...
Jan 22 17:44:36 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238810]: [ALERT]    (238815) : Current worker (238817) exited with code 143 (Terminated)
Jan 22 17:44:36 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[238810]: [WARNING]  (238815) : All workers exited. Exiting... (0)
Jan 22 17:44:36 compute-0 systemd[1]: libpod-3907e30560daef0f4176c513bc1747cc2bb9cdc4e6713cd7031660683c7398c6.scope: Deactivated successfully.
Jan 22 17:44:36 compute-0 podman[239282]: 2026-01-22 17:44:36.254468359 +0000 UTC m=+0.043133727 container died 3907e30560daef0f4176c513bc1747cc2bb9cdc4e6713cd7031660683c7398c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 22 17:44:36 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3907e30560daef0f4176c513bc1747cc2bb9cdc4e6713cd7031660683c7398c6-userdata-shm.mount: Deactivated successfully.
Jan 22 17:44:36 compute-0 systemd[1]: var-lib-containers-storage-overlay-03abbd08c70c21a9f5403a59a983775d43964d1a14afe1985847a6494a290f70-merged.mount: Deactivated successfully.
Jan 22 17:44:36 compute-0 kernel: tapeff2a2a1-b5: entered promiscuous mode
Jan 22 17:44:36 compute-0 kernel: tapeff2a2a1-b5 (unregistering): left promiscuous mode
Jan 22 17:44:36 compute-0 ovn_controller[95372]: 2026-01-22T17:44:36Z|00753|binding|INFO|Claiming lport eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef for this chassis.
Jan 22 17:44:36 compute-0 ovn_controller[95372]: 2026-01-22T17:44:36Z|00754|binding|INFO|eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef: Claiming fa:16:3e:d4:a5:66 10.100.0.13
Jan 22 17:44:36 compute-0 podman[239282]: 2026-01-22 17:44:36.295338814 +0000 UTC m=+0.084004192 container cleanup 3907e30560daef0f4176c513bc1747cc2bb9cdc4e6713cd7031660683c7398c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:44:36 compute-0 NetworkManager[55454]: <info>  [1769103876.2963] manager: (tapeff2a2a1-b5): new Tun device (/org/freedesktop/NetworkManager/Devices/303)
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.295 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:36 compute-0 systemd[1]: libpod-conmon-3907e30560daef0f4176c513bc1747cc2bb9cdc4e6713cd7031660683c7398c6.scope: Deactivated successfully.
Jan 22 17:44:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:36.302 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:a5:66 10.100.0.13'], port_security=['fa:16:3e:d4:a5:66 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '54e9e10b-dcc5-478e-80d1-2fa424b9bf22 b9fe05a4-23df-4029-b1fa-68aa53e00156', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.201'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.313 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:36 compute-0 ovn_controller[95372]: 2026-01-22T17:44:36Z|00755|binding|INFO|Setting lport eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef ovn-installed in OVS
Jan 22 17:44:36 compute-0 ovn_controller[95372]: 2026-01-22T17:44:36Z|00756|binding|INFO|Setting lport eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef up in Southbound
Jan 22 17:44:36 compute-0 ovn_controller[95372]: 2026-01-22T17:44:36Z|00757|binding|INFO|Releasing lport eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef from this chassis (sb_readonly=1)
Jan 22 17:44:36 compute-0 ovn_controller[95372]: 2026-01-22T17:44:36Z|00758|if_status|INFO|Dropped 1 log messages in last 874 seconds (most recently, 874 seconds ago) due to excessive rate
Jan 22 17:44:36 compute-0 ovn_controller[95372]: 2026-01-22T17:44:36Z|00759|if_status|INFO|Not setting lport eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef down as sb is readonly
Jan 22 17:44:36 compute-0 ovn_controller[95372]: 2026-01-22T17:44:36Z|00760|binding|INFO|Removing iface tapeff2a2a1-b5 ovn-installed in OVS
Jan 22 17:44:36 compute-0 ovn_controller[95372]: 2026-01-22T17:44:36Z|00761|binding|INFO|Releasing lport eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef from this chassis (sb_readonly=0)
Jan 22 17:44:36 compute-0 ovn_controller[95372]: 2026-01-22T17:44:36Z|00762|binding|INFO|Setting lport eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef down in Southbound
Jan 22 17:44:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:36.323 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:a5:66 10.100.0.13'], port_security=['fa:16:3e:d4:a5:66 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'c570960f-e948-4456-9f9c-7b8afd2cf0ac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '54e9e10b-dcc5-478e-80d1-2fa424b9bf22 b9fe05a4-23df-4029-b1fa-68aa53e00156', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.201'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.327 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.336 183079 INFO nova.virt.libvirt.driver [-] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Instance destroyed successfully.
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.336 183079 DEBUG nova.objects.instance [None req-36dbd963-983b-4df1-8583-0db7d24401e6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'resources' on Instance uuid c570960f-e948-4456-9f9c-7b8afd2cf0ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.349 183079 DEBUG nova.virt.libvirt.vif [None req-36dbd963-983b-4df1-8583-0db7d24401e6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:43:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-738920808',display_name='tempest-server-test-738920808',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-738920808',id=65,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:43:24Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-abrrxh4i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:43:24Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=c570960f-e948-4456-9f9c-7b8afd2cf0ac,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef", "address": "fa:16:3e:d4:a5:66", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeff2a2a1-b5", "ovs_interfaceid": "eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.349 183079 DEBUG nova.network.os_vif_util [None req-36dbd963-983b-4df1-8583-0db7d24401e6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef", "address": "fa:16:3e:d4:a5:66", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeff2a2a1-b5", "ovs_interfaceid": "eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.350 183079 DEBUG nova.network.os_vif_util [None req-36dbd963-983b-4df1-8583-0db7d24401e6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d4:a5:66,bridge_name='br-int',has_traffic_filtering=True,id=eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeff2a2a1-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.350 183079 DEBUG os_vif [None req-36dbd963-983b-4df1-8583-0db7d24401e6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d4:a5:66,bridge_name='br-int',has_traffic_filtering=True,id=eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeff2a2a1-b5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.352 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.352 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeff2a2a1-b5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.354 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.355 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.357 183079 INFO os_vif [None req-36dbd963-983b-4df1-8583-0db7d24401e6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d4:a5:66,bridge_name='br-int',has_traffic_filtering=True,id=eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeff2a2a1-b5')
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.358 183079 INFO nova.virt.libvirt.driver [None req-36dbd963-983b-4df1-8583-0db7d24401e6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Deleting instance files /var/lib/nova/instances/c570960f-e948-4456-9f9c-7b8afd2cf0ac_del
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.359 183079 INFO nova.virt.libvirt.driver [None req-36dbd963-983b-4df1-8583-0db7d24401e6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Deletion of /var/lib/nova/instances/c570960f-e948-4456-9f9c-7b8afd2cf0ac_del complete
Jan 22 17:44:36 compute-0 podman[239317]: 2026-01-22 17:44:36.37248643 +0000 UTC m=+0.052315832 container remove 3907e30560daef0f4176c513bc1747cc2bb9cdc4e6713cd7031660683c7398c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 22 17:44:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:36.377 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2814fc40-4ca7-4b76-afe7-69ee2df3af70]: (4, ('Thu Jan 22 05:44:36 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 (3907e30560daef0f4176c513bc1747cc2bb9cdc4e6713cd7031660683c7398c6)\n3907e30560daef0f4176c513bc1747cc2bb9cdc4e6713cd7031660683c7398c6\nThu Jan 22 05:44:36 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 (3907e30560daef0f4176c513bc1747cc2bb9cdc4e6713cd7031660683c7398c6)\n3907e30560daef0f4176c513bc1747cc2bb9cdc4e6713cd7031660683c7398c6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:44:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:36.379 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[de822b62-360e-4443-988b-dc1371a8440f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:44:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:36.380 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.381 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:36 compute-0 kernel: tap88ed9213-70: left promiscuous mode
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.393 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:36.397 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[9d75e749-f4d0-4b5d-a531-d2dc186675cb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:44:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:36.411 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f30b83e3-3066-4631-b177-b0f6a2c7c084]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:44:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:36.412 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7b480bfc-3c03-4e66-883f-3cb46e42c821]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.415 183079 INFO nova.compute.manager [None req-36dbd963-983b-4df1-8583-0db7d24401e6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Took 0.34 seconds to destroy the instance on the hypervisor.
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.415 183079 DEBUG oslo.service.loopingcall [None req-36dbd963-983b-4df1-8583-0db7d24401e6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.416 183079 DEBUG nova.compute.manager [-] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.416 183079 DEBUG nova.network.neutron [-] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:44:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:36.427 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4ea698c1-98b2-407a-babe-ab0fa6d04110]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 616715, 'reachable_time': 44368, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239334, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:44:36 compute-0 systemd[1]: run-netns-ovnmeta\x2d88ed9213\x2d7d48\x2d4c42\x2dad60\x2dce3fcf104f71.mount: Deactivated successfully.
Jan 22 17:44:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:36.431 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:44:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:36.431 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[9efbf389-c6f5-4421-895b-0ccbd9f0d424]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:44:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:36.432 104629 INFO neutron.agent.ovn.metadata.agent [-] Port eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 unbound from our chassis
Jan 22 17:44:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:36.433 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:44:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:36.434 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e1fa0d56-1877-44b6-ba70-841b63edd0a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:44:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:36.434 104629 INFO neutron.agent.ovn.metadata.agent [-] Port eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 unbound from our chassis
Jan 22 17:44:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:36.435 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:44:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:36.436 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e30f2f88-5386-4288-a858-ce90b93798a3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.513 183079 DEBUG nova.compute.manager [req-549531cb-07e8-4b87-9ac2-aee64a958b0a req-c7d4b8ab-0784-4d5f-9148-2f1e7d58afff a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Received event network-vif-unplugged-eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.513 183079 DEBUG oslo_concurrency.lockutils [req-549531cb-07e8-4b87-9ac2-aee64a958b0a req-c7d4b8ab-0784-4d5f-9148-2f1e7d58afff a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.513 183079 DEBUG oslo_concurrency.lockutils [req-549531cb-07e8-4b87-9ac2-aee64a958b0a req-c7d4b8ab-0784-4d5f-9148-2f1e7d58afff a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.514 183079 DEBUG oslo_concurrency.lockutils [req-549531cb-07e8-4b87-9ac2-aee64a958b0a req-c7d4b8ab-0784-4d5f-9148-2f1e7d58afff a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.514 183079 DEBUG nova.compute.manager [req-549531cb-07e8-4b87-9ac2-aee64a958b0a req-c7d4b8ab-0784-4d5f-9148-2f1e7d58afff a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] No waiting events found dispatching network-vif-unplugged-eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:44:36 compute-0 nova_compute[183075]: 2026-01-22 17:44:36.514 183079 DEBUG nova.compute.manager [req-549531cb-07e8-4b87-9ac2-aee64a958b0a req-c7d4b8ab-0784-4d5f-9148-2f1e7d58afff a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Received event network-vif-unplugged-eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:44:37 compute-0 nova_compute[183075]: 2026-01-22 17:44:37.125 183079 DEBUG nova.network.neutron [-] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:44:37 compute-0 nova_compute[183075]: 2026-01-22 17:44:37.142 183079 INFO nova.compute.manager [-] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Took 0.73 seconds to deallocate network for instance.
Jan 22 17:44:37 compute-0 nova_compute[183075]: 2026-01-22 17:44:37.192 183079 DEBUG oslo_concurrency.lockutils [None req-36dbd963-983b-4df1-8583-0db7d24401e6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:44:37 compute-0 nova_compute[183075]: 2026-01-22 17:44:37.192 183079 DEBUG oslo_concurrency.lockutils [None req-36dbd963-983b-4df1-8583-0db7d24401e6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:44:37 compute-0 nova_compute[183075]: 2026-01-22 17:44:37.233 183079 DEBUG nova.compute.manager [req-27927120-8110-4ce8-8b08-046f5933901f req-b350e78a-0e91-467d-b2d0-861a4955613c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Received event network-vif-deleted-eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:44:37 compute-0 nova_compute[183075]: 2026-01-22 17:44:37.260 183079 DEBUG nova.compute.provider_tree [None req-36dbd963-983b-4df1-8583-0db7d24401e6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:44:37 compute-0 nova_compute[183075]: 2026-01-22 17:44:37.279 183079 DEBUG nova.scheduler.client.report [None req-36dbd963-983b-4df1-8583-0db7d24401e6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:44:37 compute-0 nova_compute[183075]: 2026-01-22 17:44:37.301 183079 DEBUG oslo_concurrency.lockutils [None req-36dbd963-983b-4df1-8583-0db7d24401e6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:44:37 compute-0 nova_compute[183075]: 2026-01-22 17:44:37.323 183079 INFO nova.scheduler.client.report [None req-36dbd963-983b-4df1-8583-0db7d24401e6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Deleted allocations for instance c570960f-e948-4456-9f9c-7b8afd2cf0ac
Jan 22 17:44:37 compute-0 nova_compute[183075]: 2026-01-22 17:44:37.391 183079 DEBUG oslo_concurrency.lockutils [None req-36dbd963-983b-4df1-8583-0db7d24401e6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.323s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:44:38 compute-0 nova_compute[183075]: 2026-01-22 17:44:38.502 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:38 compute-0 nova_compute[183075]: 2026-01-22 17:44:38.629 183079 DEBUG nova.compute.manager [req-9adc74ce-977c-45ba-948a-395ce880b000 req-b60c863f-39f8-4a78-ba28-41965427fbc5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Received event network-vif-plugged-eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:44:38 compute-0 nova_compute[183075]: 2026-01-22 17:44:38.630 183079 DEBUG oslo_concurrency.lockutils [req-9adc74ce-977c-45ba-948a-395ce880b000 req-b60c863f-39f8-4a78-ba28-41965427fbc5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:44:38 compute-0 nova_compute[183075]: 2026-01-22 17:44:38.630 183079 DEBUG oslo_concurrency.lockutils [req-9adc74ce-977c-45ba-948a-395ce880b000 req-b60c863f-39f8-4a78-ba28-41965427fbc5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:44:38 compute-0 nova_compute[183075]: 2026-01-22 17:44:38.630 183079 DEBUG oslo_concurrency.lockutils [req-9adc74ce-977c-45ba-948a-395ce880b000 req-b60c863f-39f8-4a78-ba28-41965427fbc5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:44:38 compute-0 nova_compute[183075]: 2026-01-22 17:44:38.630 183079 DEBUG nova.compute.manager [req-9adc74ce-977c-45ba-948a-395ce880b000 req-b60c863f-39f8-4a78-ba28-41965427fbc5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] No waiting events found dispatching network-vif-plugged-eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:44:38 compute-0 nova_compute[183075]: 2026-01-22 17:44:38.631 183079 WARNING nova.compute.manager [req-9adc74ce-977c-45ba-948a-395ce880b000 req-b60c863f-39f8-4a78-ba28-41965427fbc5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Received unexpected event network-vif-plugged-eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef for instance with vm_state deleted and task_state None.
Jan 22 17:44:38 compute-0 nova_compute[183075]: 2026-01-22 17:44:38.631 183079 DEBUG nova.compute.manager [req-9adc74ce-977c-45ba-948a-395ce880b000 req-b60c863f-39f8-4a78-ba28-41965427fbc5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Received event network-vif-plugged-eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:44:38 compute-0 nova_compute[183075]: 2026-01-22 17:44:38.631 183079 DEBUG oslo_concurrency.lockutils [req-9adc74ce-977c-45ba-948a-395ce880b000 req-b60c863f-39f8-4a78-ba28-41965427fbc5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:44:38 compute-0 nova_compute[183075]: 2026-01-22 17:44:38.632 183079 DEBUG oslo_concurrency.lockutils [req-9adc74ce-977c-45ba-948a-395ce880b000 req-b60c863f-39f8-4a78-ba28-41965427fbc5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:44:38 compute-0 nova_compute[183075]: 2026-01-22 17:44:38.632 183079 DEBUG oslo_concurrency.lockutils [req-9adc74ce-977c-45ba-948a-395ce880b000 req-b60c863f-39f8-4a78-ba28-41965427fbc5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:44:38 compute-0 nova_compute[183075]: 2026-01-22 17:44:38.632 183079 DEBUG nova.compute.manager [req-9adc74ce-977c-45ba-948a-395ce880b000 req-b60c863f-39f8-4a78-ba28-41965427fbc5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] No waiting events found dispatching network-vif-plugged-eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:44:38 compute-0 nova_compute[183075]: 2026-01-22 17:44:38.633 183079 WARNING nova.compute.manager [req-9adc74ce-977c-45ba-948a-395ce880b000 req-b60c863f-39f8-4a78-ba28-41965427fbc5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Received unexpected event network-vif-plugged-eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef for instance with vm_state deleted and task_state None.
Jan 22 17:44:38 compute-0 nova_compute[183075]: 2026-01-22 17:44:38.633 183079 DEBUG nova.compute.manager [req-9adc74ce-977c-45ba-948a-395ce880b000 req-b60c863f-39f8-4a78-ba28-41965427fbc5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Received event network-vif-plugged-eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:44:38 compute-0 nova_compute[183075]: 2026-01-22 17:44:38.633 183079 DEBUG oslo_concurrency.lockutils [req-9adc74ce-977c-45ba-948a-395ce880b000 req-b60c863f-39f8-4a78-ba28-41965427fbc5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:44:38 compute-0 nova_compute[183075]: 2026-01-22 17:44:38.634 183079 DEBUG oslo_concurrency.lockutils [req-9adc74ce-977c-45ba-948a-395ce880b000 req-b60c863f-39f8-4a78-ba28-41965427fbc5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:44:38 compute-0 nova_compute[183075]: 2026-01-22 17:44:38.634 183079 DEBUG oslo_concurrency.lockutils [req-9adc74ce-977c-45ba-948a-395ce880b000 req-b60c863f-39f8-4a78-ba28-41965427fbc5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:44:38 compute-0 nova_compute[183075]: 2026-01-22 17:44:38.634 183079 DEBUG nova.compute.manager [req-9adc74ce-977c-45ba-948a-395ce880b000 req-b60c863f-39f8-4a78-ba28-41965427fbc5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] No waiting events found dispatching network-vif-plugged-eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:44:38 compute-0 nova_compute[183075]: 2026-01-22 17:44:38.634 183079 WARNING nova.compute.manager [req-9adc74ce-977c-45ba-948a-395ce880b000 req-b60c863f-39f8-4a78-ba28-41965427fbc5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Received unexpected event network-vif-plugged-eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef for instance with vm_state deleted and task_state None.
Jan 22 17:44:38 compute-0 nova_compute[183075]: 2026-01-22 17:44:38.635 183079 DEBUG nova.compute.manager [req-9adc74ce-977c-45ba-948a-395ce880b000 req-b60c863f-39f8-4a78-ba28-41965427fbc5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Received event network-vif-unplugged-eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:44:38 compute-0 nova_compute[183075]: 2026-01-22 17:44:38.635 183079 DEBUG oslo_concurrency.lockutils [req-9adc74ce-977c-45ba-948a-395ce880b000 req-b60c863f-39f8-4a78-ba28-41965427fbc5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:44:38 compute-0 nova_compute[183075]: 2026-01-22 17:44:38.635 183079 DEBUG oslo_concurrency.lockutils [req-9adc74ce-977c-45ba-948a-395ce880b000 req-b60c863f-39f8-4a78-ba28-41965427fbc5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:44:38 compute-0 nova_compute[183075]: 2026-01-22 17:44:38.635 183079 DEBUG oslo_concurrency.lockutils [req-9adc74ce-977c-45ba-948a-395ce880b000 req-b60c863f-39f8-4a78-ba28-41965427fbc5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:44:38 compute-0 nova_compute[183075]: 2026-01-22 17:44:38.635 183079 DEBUG nova.compute.manager [req-9adc74ce-977c-45ba-948a-395ce880b000 req-b60c863f-39f8-4a78-ba28-41965427fbc5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] No waiting events found dispatching network-vif-unplugged-eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:44:38 compute-0 nova_compute[183075]: 2026-01-22 17:44:38.636 183079 WARNING nova.compute.manager [req-9adc74ce-977c-45ba-948a-395ce880b000 req-b60c863f-39f8-4a78-ba28-41965427fbc5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Received unexpected event network-vif-unplugged-eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef for instance with vm_state deleted and task_state None.
Jan 22 17:44:38 compute-0 nova_compute[183075]: 2026-01-22 17:44:38.636 183079 DEBUG nova.compute.manager [req-9adc74ce-977c-45ba-948a-395ce880b000 req-b60c863f-39f8-4a78-ba28-41965427fbc5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Received event network-vif-plugged-eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:44:38 compute-0 nova_compute[183075]: 2026-01-22 17:44:38.636 183079 DEBUG oslo_concurrency.lockutils [req-9adc74ce-977c-45ba-948a-395ce880b000 req-b60c863f-39f8-4a78-ba28-41965427fbc5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:44:38 compute-0 nova_compute[183075]: 2026-01-22 17:44:38.636 183079 DEBUG oslo_concurrency.lockutils [req-9adc74ce-977c-45ba-948a-395ce880b000 req-b60c863f-39f8-4a78-ba28-41965427fbc5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:44:38 compute-0 nova_compute[183075]: 2026-01-22 17:44:38.637 183079 DEBUG oslo_concurrency.lockutils [req-9adc74ce-977c-45ba-948a-395ce880b000 req-b60c863f-39f8-4a78-ba28-41965427fbc5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c570960f-e948-4456-9f9c-7b8afd2cf0ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:44:38 compute-0 nova_compute[183075]: 2026-01-22 17:44:38.637 183079 DEBUG nova.compute.manager [req-9adc74ce-977c-45ba-948a-395ce880b000 req-b60c863f-39f8-4a78-ba28-41965427fbc5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] No waiting events found dispatching network-vif-plugged-eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:44:38 compute-0 nova_compute[183075]: 2026-01-22 17:44:38.637 183079 WARNING nova.compute.manager [req-9adc74ce-977c-45ba-948a-395ce880b000 req-b60c863f-39f8-4a78-ba28-41965427fbc5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Received unexpected event network-vif-plugged-eff2a2a1-b5c5-4ee3-92d0-09b232b7c9ef for instance with vm_state deleted and task_state None.
Jan 22 17:44:38 compute-0 nova_compute[183075]: 2026-01-22 17:44:38.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:44:41 compute-0 podman[239335]: 2026-01-22 17:44:41.353653112 +0000 UTC m=+0.055474097 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:44:41 compute-0 nova_compute[183075]: 2026-01-22 17:44:41.355 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:41.963 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:44:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:41.963 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:44:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:41.963 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:44:43 compute-0 nova_compute[183075]: 2026-01-22 17:44:43.505 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:43 compute-0 nova_compute[183075]: 2026-01-22 17:44:43.685 183079 DEBUG oslo_concurrency.lockutils [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "12ded08f-c0e2-4d03-967b-3436626dbbb2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:44:43 compute-0 nova_compute[183075]: 2026-01-22 17:44:43.685 183079 DEBUG oslo_concurrency.lockutils [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "12ded08f-c0e2-4d03-967b-3436626dbbb2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:44:43 compute-0 nova_compute[183075]: 2026-01-22 17:44:43.699 183079 DEBUG nova.compute.manager [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:44:43 compute-0 nova_compute[183075]: 2026-01-22 17:44:43.765 183079 DEBUG oslo_concurrency.lockutils [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:44:43 compute-0 nova_compute[183075]: 2026-01-22 17:44:43.766 183079 DEBUG oslo_concurrency.lockutils [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:44:43 compute-0 nova_compute[183075]: 2026-01-22 17:44:43.772 183079 DEBUG nova.virt.hardware [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:44:43 compute-0 nova_compute[183075]: 2026-01-22 17:44:43.773 183079 INFO nova.compute.claims [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:44:43 compute-0 nova_compute[183075]: 2026-01-22 17:44:43.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:44:43 compute-0 nova_compute[183075]: 2026-01-22 17:44:43.912 183079 DEBUG nova.compute.provider_tree [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:44:43 compute-0 nova_compute[183075]: 2026-01-22 17:44:43.926 183079 DEBUG nova.scheduler.client.report [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:44:43 compute-0 nova_compute[183075]: 2026-01-22 17:44:43.948 183079 DEBUG oslo_concurrency.lockutils [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:44:43 compute-0 nova_compute[183075]: 2026-01-22 17:44:43.949 183079 DEBUG nova.compute.manager [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:44:43 compute-0 nova_compute[183075]: 2026-01-22 17:44:43.990 183079 DEBUG nova.compute.manager [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:44:43 compute-0 nova_compute[183075]: 2026-01-22 17:44:43.990 183079 DEBUG nova.network.neutron [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:44:44 compute-0 nova_compute[183075]: 2026-01-22 17:44:44.007 183079 INFO nova.virt.libvirt.driver [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:44:44 compute-0 nova_compute[183075]: 2026-01-22 17:44:44.023 183079 DEBUG nova.compute.manager [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:44:44 compute-0 nova_compute[183075]: 2026-01-22 17:44:44.100 183079 DEBUG nova.compute.manager [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:44:44 compute-0 nova_compute[183075]: 2026-01-22 17:44:44.101 183079 DEBUG nova.virt.libvirt.driver [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:44:44 compute-0 nova_compute[183075]: 2026-01-22 17:44:44.102 183079 INFO nova.virt.libvirt.driver [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Creating image(s)
Jan 22 17:44:44 compute-0 nova_compute[183075]: 2026-01-22 17:44:44.102 183079 DEBUG oslo_concurrency.lockutils [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "/var/lib/nova/instances/12ded08f-c0e2-4d03-967b-3436626dbbb2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:44:44 compute-0 nova_compute[183075]: 2026-01-22 17:44:44.103 183079 DEBUG oslo_concurrency.lockutils [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/12ded08f-c0e2-4d03-967b-3436626dbbb2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:44:44 compute-0 nova_compute[183075]: 2026-01-22 17:44:44.103 183079 DEBUG oslo_concurrency.lockutils [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/12ded08f-c0e2-4d03-967b-3436626dbbb2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:44:44 compute-0 nova_compute[183075]: 2026-01-22 17:44:44.115 183079 DEBUG oslo_concurrency.processutils [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:44:44 compute-0 nova_compute[183075]: 2026-01-22 17:44:44.186 183079 DEBUG oslo_concurrency.processutils [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:44:44 compute-0 nova_compute[183075]: 2026-01-22 17:44:44.187 183079 DEBUG oslo_concurrency.lockutils [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:44:44 compute-0 nova_compute[183075]: 2026-01-22 17:44:44.187 183079 DEBUG oslo_concurrency.lockutils [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:44:44 compute-0 nova_compute[183075]: 2026-01-22 17:44:44.199 183079 DEBUG oslo_concurrency.processutils [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:44:44 compute-0 nova_compute[183075]: 2026-01-22 17:44:44.266 183079 DEBUG oslo_concurrency.processutils [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:44:44 compute-0 nova_compute[183075]: 2026-01-22 17:44:44.267 183079 DEBUG oslo_concurrency.processutils [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/12ded08f-c0e2-4d03-967b-3436626dbbb2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:44:44 compute-0 nova_compute[183075]: 2026-01-22 17:44:44.308 183079 DEBUG oslo_concurrency.processutils [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/12ded08f-c0e2-4d03-967b-3436626dbbb2/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:44:44 compute-0 nova_compute[183075]: 2026-01-22 17:44:44.309 183079 DEBUG oslo_concurrency.lockutils [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:44:44 compute-0 nova_compute[183075]: 2026-01-22 17:44:44.310 183079 DEBUG oslo_concurrency.processutils [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:44:44 compute-0 nova_compute[183075]: 2026-01-22 17:44:44.367 183079 DEBUG oslo_concurrency.processutils [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:44:44 compute-0 nova_compute[183075]: 2026-01-22 17:44:44.368 183079 DEBUG nova.virt.disk.api [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Checking if we can resize image /var/lib/nova/instances/12ded08f-c0e2-4d03-967b-3436626dbbb2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:44:44 compute-0 nova_compute[183075]: 2026-01-22 17:44:44.369 183079 DEBUG oslo_concurrency.processutils [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/12ded08f-c0e2-4d03-967b-3436626dbbb2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:44:44 compute-0 nova_compute[183075]: 2026-01-22 17:44:44.429 183079 DEBUG nova.policy [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:44:44 compute-0 nova_compute[183075]: 2026-01-22 17:44:44.438 183079 DEBUG oslo_concurrency.processutils [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/12ded08f-c0e2-4d03-967b-3436626dbbb2/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:44:44 compute-0 nova_compute[183075]: 2026-01-22 17:44:44.439 183079 DEBUG nova.virt.disk.api [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Cannot resize image /var/lib/nova/instances/12ded08f-c0e2-4d03-967b-3436626dbbb2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:44:44 compute-0 nova_compute[183075]: 2026-01-22 17:44:44.439 183079 DEBUG nova.objects.instance [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'migration_context' on Instance uuid 12ded08f-c0e2-4d03-967b-3436626dbbb2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:44:44 compute-0 nova_compute[183075]: 2026-01-22 17:44:44.455 183079 DEBUG nova.virt.libvirt.driver [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:44:44 compute-0 nova_compute[183075]: 2026-01-22 17:44:44.455 183079 DEBUG nova.virt.libvirt.driver [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Ensure instance console log exists: /var/lib/nova/instances/12ded08f-c0e2-4d03-967b-3436626dbbb2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:44:44 compute-0 nova_compute[183075]: 2026-01-22 17:44:44.456 183079 DEBUG oslo_concurrency.lockutils [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:44:44 compute-0 nova_compute[183075]: 2026-01-22 17:44:44.456 183079 DEBUG oslo_concurrency.lockutils [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:44:44 compute-0 nova_compute[183075]: 2026-01-22 17:44:44.456 183079 DEBUG oslo_concurrency.lockutils [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:44:44 compute-0 nova_compute[183075]: 2026-01-22 17:44:44.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:44:45 compute-0 nova_compute[183075]: 2026-01-22 17:44:45.298 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:44:45 compute-0 nova_compute[183075]: 2026-01-22 17:44:45.299 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:44:45 compute-0 nova_compute[183075]: 2026-01-22 17:44:45.299 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:44:45 compute-0 nova_compute[183075]: 2026-01-22 17:44:45.299 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:44:45 compute-0 podman[239374]: 2026-01-22 17:44:45.341419132 +0000 UTC m=+0.054708916 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:44:45 compute-0 nova_compute[183075]: 2026-01-22 17:44:45.458 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:44:45 compute-0 nova_compute[183075]: 2026-01-22 17:44:45.459 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5697MB free_disk=73.35900115966797GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:44:45 compute-0 nova_compute[183075]: 2026-01-22 17:44:45.459 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:44:45 compute-0 nova_compute[183075]: 2026-01-22 17:44:45.459 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:44:45 compute-0 nova_compute[183075]: 2026-01-22 17:44:45.583 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 12ded08f-c0e2-4d03-967b-3436626dbbb2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:44:45 compute-0 nova_compute[183075]: 2026-01-22 17:44:45.584 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:44:45 compute-0 nova_compute[183075]: 2026-01-22 17:44:45.584 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:44:45 compute-0 nova_compute[183075]: 2026-01-22 17:44:45.694 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:44:45 compute-0 nova_compute[183075]: 2026-01-22 17:44:45.711 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:44:45 compute-0 nova_compute[183075]: 2026-01-22 17:44:45.733 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:44:45 compute-0 nova_compute[183075]: 2026-01-22 17:44:45.733 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.274s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:44:45 compute-0 nova_compute[183075]: 2026-01-22 17:44:45.748 183079 DEBUG nova.network.neutron [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Successfully created port: d562087b-b585-4eeb-9ef3-31bd96de01f4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:44:45 compute-0 nova_compute[183075]: 2026-01-22 17:44:45.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:44:45 compute-0 nova_compute[183075]: 2026-01-22 17:44:45.787 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:44:45 compute-0 nova_compute[183075]: 2026-01-22 17:44:45.787 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:44:45 compute-0 nova_compute[183075]: 2026-01-22 17:44:45.807 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 22 17:44:45 compute-0 nova_compute[183075]: 2026-01-22 17:44:45.808 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 17:44:45 compute-0 nova_compute[183075]: 2026-01-22 17:44:45.808 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:44:45 compute-0 nova_compute[183075]: 2026-01-22 17:44:45.808 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 17:44:46 compute-0 nova_compute[183075]: 2026-01-22 17:44:46.357 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:46 compute-0 nova_compute[183075]: 2026-01-22 17:44:46.604 183079 DEBUG nova.network.neutron [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Successfully updated port: d562087b-b585-4eeb-9ef3-31bd96de01f4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:44:46 compute-0 nova_compute[183075]: 2026-01-22 17:44:46.619 183079 DEBUG oslo_concurrency.lockutils [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "refresh_cache-12ded08f-c0e2-4d03-967b-3436626dbbb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:44:46 compute-0 nova_compute[183075]: 2026-01-22 17:44:46.620 183079 DEBUG oslo_concurrency.lockutils [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquired lock "refresh_cache-12ded08f-c0e2-4d03-967b-3436626dbbb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:44:46 compute-0 nova_compute[183075]: 2026-01-22 17:44:46.620 183079 DEBUG nova.network.neutron [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:44:46 compute-0 nova_compute[183075]: 2026-01-22 17:44:46.703 183079 DEBUG nova.compute.manager [req-627d91b9-be99-4894-9e91-7c7d717d1c9f req-57194d95-0305-42f1-92fc-4e6528f095ff a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Received event network-changed-d562087b-b585-4eeb-9ef3-31bd96de01f4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:44:46 compute-0 nova_compute[183075]: 2026-01-22 17:44:46.703 183079 DEBUG nova.compute.manager [req-627d91b9-be99-4894-9e91-7c7d717d1c9f req-57194d95-0305-42f1-92fc-4e6528f095ff a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Refreshing instance network info cache due to event network-changed-d562087b-b585-4eeb-9ef3-31bd96de01f4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:44:46 compute-0 nova_compute[183075]: 2026-01-22 17:44:46.703 183079 DEBUG oslo_concurrency.lockutils [req-627d91b9-be99-4894-9e91-7c7d717d1c9f req-57194d95-0305-42f1-92fc-4e6528f095ff a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-12ded08f-c0e2-4d03-967b-3436626dbbb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:44:46 compute-0 nova_compute[183075]: 2026-01-22 17:44:46.756 183079 DEBUG nova.network.neutron [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.583 183079 DEBUG nova.network.neutron [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Updating instance_info_cache with network_info: [{"id": "d562087b-b585-4eeb-9ef3-31bd96de01f4", "address": "fa:16:3e:8b:ad:02", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd562087b-b5", "ovs_interfaceid": "d562087b-b585-4eeb-9ef3-31bd96de01f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.603 183079 DEBUG oslo_concurrency.lockutils [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Releasing lock "refresh_cache-12ded08f-c0e2-4d03-967b-3436626dbbb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.604 183079 DEBUG nova.compute.manager [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Instance network_info: |[{"id": "d562087b-b585-4eeb-9ef3-31bd96de01f4", "address": "fa:16:3e:8b:ad:02", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd562087b-b5", "ovs_interfaceid": "d562087b-b585-4eeb-9ef3-31bd96de01f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.604 183079 DEBUG oslo_concurrency.lockutils [req-627d91b9-be99-4894-9e91-7c7d717d1c9f req-57194d95-0305-42f1-92fc-4e6528f095ff a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-12ded08f-c0e2-4d03-967b-3436626dbbb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.604 183079 DEBUG nova.network.neutron [req-627d91b9-be99-4894-9e91-7c7d717d1c9f req-57194d95-0305-42f1-92fc-4e6528f095ff a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Refreshing network info cache for port d562087b-b585-4eeb-9ef3-31bd96de01f4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.606 183079 DEBUG nova.virt.libvirt.driver [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Start _get_guest_xml network_info=[{"id": "d562087b-b585-4eeb-9ef3-31bd96de01f4", "address": "fa:16:3e:8b:ad:02", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd562087b-b5", "ovs_interfaceid": "d562087b-b585-4eeb-9ef3-31bd96de01f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.610 183079 WARNING nova.virt.libvirt.driver [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.614 183079 DEBUG nova.virt.libvirt.host [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.614 183079 DEBUG nova.virt.libvirt.host [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.620 183079 DEBUG nova.virt.libvirt.host [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.621 183079 DEBUG nova.virt.libvirt.host [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.621 183079 DEBUG nova.virt.libvirt.driver [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.621 183079 DEBUG nova.virt.hardware [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.622 183079 DEBUG nova.virt.hardware [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.622 183079 DEBUG nova.virt.hardware [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.622 183079 DEBUG nova.virt.hardware [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.623 183079 DEBUG nova.virt.hardware [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.623 183079 DEBUG nova.virt.hardware [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.623 183079 DEBUG nova.virt.hardware [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.623 183079 DEBUG nova.virt.hardware [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.623 183079 DEBUG nova.virt.hardware [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.624 183079 DEBUG nova.virt.hardware [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.624 183079 DEBUG nova.virt.hardware [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.629 183079 DEBUG nova.virt.libvirt.vif [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:44:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1658329936',display_name='tempest-server-test-1658329936',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1658329936',id=67,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-0n0qchr8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:44:44Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=12ded08f-c0e2-4d03-967b-3436626dbbb2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d562087b-b585-4eeb-9ef3-31bd96de01f4", "address": "fa:16:3e:8b:ad:02", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd562087b-b5", "ovs_interfaceid": "d562087b-b585-4eeb-9ef3-31bd96de01f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.629 183079 DEBUG nova.network.os_vif_util [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "d562087b-b585-4eeb-9ef3-31bd96de01f4", "address": "fa:16:3e:8b:ad:02", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd562087b-b5", "ovs_interfaceid": "d562087b-b585-4eeb-9ef3-31bd96de01f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.630 183079 DEBUG nova.network.os_vif_util [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:ad:02,bridge_name='br-int',has_traffic_filtering=True,id=d562087b-b585-4eeb-9ef3-31bd96de01f4,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd562087b-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.630 183079 DEBUG nova.objects.instance [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'pci_devices' on Instance uuid 12ded08f-c0e2-4d03-967b-3436626dbbb2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.649 183079 DEBUG nova.virt.libvirt.driver [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:44:47 compute-0 nova_compute[183075]:   <uuid>12ded08f-c0e2-4d03-967b-3436626dbbb2</uuid>
Jan 22 17:44:47 compute-0 nova_compute[183075]:   <name>instance-00000043</name>
Jan 22 17:44:47 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:44:47 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:44:47 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:44:47 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1658329936</nova:name>
Jan 22 17:44:47 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:44:47</nova:creationTime>
Jan 22 17:44:47 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:44:47 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:44:47 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:44:47 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:44:47 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:44:47 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:44:47 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:44:47 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:44:47 compute-0 nova_compute[183075]:         <nova:user uuid="66c24efd0cd24c57803ea8508e679d06">tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member</nova:user>
Jan 22 17:44:47 compute-0 nova_compute[183075]:         <nova:project uuid="cbff5cec4b9c4d2eb317124ca20fea9f">tempest-StatelessNetworkSecGroupIPv4Test-1348485316</nova:project>
Jan 22 17:44:47 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:44:47 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:44:47 compute-0 nova_compute[183075]:         <nova:port uuid="d562087b-b585-4eeb-9ef3-31bd96de01f4">
Jan 22 17:44:47 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:44:47 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:44:47 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:44:47 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <system>
Jan 22 17:44:47 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:44:47 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:44:47 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:44:47 compute-0 nova_compute[183075]:       <entry name="serial">12ded08f-c0e2-4d03-967b-3436626dbbb2</entry>
Jan 22 17:44:47 compute-0 nova_compute[183075]:       <entry name="uuid">12ded08f-c0e2-4d03-967b-3436626dbbb2</entry>
Jan 22 17:44:47 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     </system>
Jan 22 17:44:47 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:44:47 compute-0 nova_compute[183075]:   <os>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:   </os>
Jan 22 17:44:47 compute-0 nova_compute[183075]:   <features>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:   </features>
Jan 22 17:44:47 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:44:47 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:44:47 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:44:47 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/12ded08f-c0e2-4d03-967b-3436626dbbb2/disk"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:44:47 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:8b:ad:02"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:       <target dev="tapd562087b-b5"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:44:47 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/12ded08f-c0e2-4d03-967b-3436626dbbb2/console.log" append="off"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <video>
Jan 22 17:44:47 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     </video>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:44:47 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:44:47 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:44:47 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:44:47 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:44:47 compute-0 nova_compute[183075]: </domain>
Jan 22 17:44:47 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.650 183079 DEBUG nova.compute.manager [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Preparing to wait for external event network-vif-plugged-d562087b-b585-4eeb-9ef3-31bd96de01f4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.650 183079 DEBUG oslo_concurrency.lockutils [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "12ded08f-c0e2-4d03-967b-3436626dbbb2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.651 183079 DEBUG oslo_concurrency.lockutils [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "12ded08f-c0e2-4d03-967b-3436626dbbb2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.651 183079 DEBUG oslo_concurrency.lockutils [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "12ded08f-c0e2-4d03-967b-3436626dbbb2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.651 183079 DEBUG nova.virt.libvirt.vif [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:44:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1658329936',display_name='tempest-server-test-1658329936',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1658329936',id=67,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-0n0qchr8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_
rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:44:44Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=12ded08f-c0e2-4d03-967b-3436626dbbb2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d562087b-b585-4eeb-9ef3-31bd96de01f4", "address": "fa:16:3e:8b:ad:02", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd562087b-b5", "ovs_interfaceid": "d562087b-b585-4eeb-9ef3-31bd96de01f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.652 183079 DEBUG nova.network.os_vif_util [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "d562087b-b585-4eeb-9ef3-31bd96de01f4", "address": "fa:16:3e:8b:ad:02", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd562087b-b5", "ovs_interfaceid": "d562087b-b585-4eeb-9ef3-31bd96de01f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.653 183079 DEBUG nova.network.os_vif_util [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:ad:02,bridge_name='br-int',has_traffic_filtering=True,id=d562087b-b585-4eeb-9ef3-31bd96de01f4,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd562087b-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.653 183079 DEBUG os_vif [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:ad:02,bridge_name='br-int',has_traffic_filtering=True,id=d562087b-b585-4eeb-9ef3-31bd96de01f4,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd562087b-b5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.653 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.654 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.654 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.657 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.657 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd562087b-b5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.657 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd562087b-b5, col_values=(('external_ids', {'iface-id': 'd562087b-b585-4eeb-9ef3-31bd96de01f4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8b:ad:02', 'vm-uuid': '12ded08f-c0e2-4d03-967b-3436626dbbb2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.659 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:47 compute-0 NetworkManager[55454]: <info>  [1769103887.6601] manager: (tapd562087b-b5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/304)
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.661 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.664 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.664 183079 INFO os_vif [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:ad:02,bridge_name='br-int',has_traffic_filtering=True,id=d562087b-b585-4eeb-9ef3-31bd96de01f4,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd562087b-b5')
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.705 183079 DEBUG nova.virt.libvirt.driver [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.706 183079 DEBUG nova.virt.libvirt.driver [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No VIF found with MAC fa:16:3e:8b:ad:02, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:44:47 compute-0 kernel: tapd562087b-b5: entered promiscuous mode
Jan 22 17:44:47 compute-0 NetworkManager[55454]: <info>  [1769103887.7754] manager: (tapd562087b-b5): new Tun device (/org/freedesktop/NetworkManager/Devices/305)
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.776 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:47 compute-0 ovn_controller[95372]: 2026-01-22T17:44:47Z|00763|binding|INFO|Claiming lport d562087b-b585-4eeb-9ef3-31bd96de01f4 for this chassis.
Jan 22 17:44:47 compute-0 ovn_controller[95372]: 2026-01-22T17:44:47Z|00764|binding|INFO|d562087b-b585-4eeb-9ef3-31bd96de01f4: Claiming fa:16:3e:8b:ad:02 10.100.0.13
Jan 22 17:44:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:47.783 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:ad:02 10.100.0.13'], port_security=['fa:16:3e:8b:ad:02 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '12ded08f-c0e2-4d03-967b-3436626dbbb2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '211927a5-364f-4b76-9332-16507814b750 f983fa10-a163-4ba4-97be-44ccb35ba95f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=d562087b-b585-4eeb-9ef3-31bd96de01f4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:44:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:47.784 104629 INFO neutron.agent.ovn.metadata.agent [-] Port d562087b-b585-4eeb-9ef3-31bd96de01f4 in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 bound to our chassis
Jan 22 17:44:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:47.785 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.792 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:47 compute-0 ovn_controller[95372]: 2026-01-22T17:44:47Z|00765|binding|INFO|Setting lport d562087b-b585-4eeb-9ef3-31bd96de01f4 up in Southbound
Jan 22 17:44:47 compute-0 ovn_controller[95372]: 2026-01-22T17:44:47Z|00766|binding|INFO|Setting lport d562087b-b585-4eeb-9ef3-31bd96de01f4 ovn-installed in OVS
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.794 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.798 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:47.800 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7648ea8e-c469-485b-ba48-b297002981f8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:44:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:47.801 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap88ed9213-71 in ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:44:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:47.803 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap88ed9213-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:44:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:47.803 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8f2e76f1-bbcd-4aac-af27-f73ef70975e8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:44:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:47.804 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[de0d5429-1add-4c49-bd9a-b454e3a66759]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:44:47 compute-0 systemd-udevd[239415]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.809 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.809 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:44:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:47.816 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[3eef1395-20d1-46e3-9014-ac56bc98c7d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:44:47 compute-0 NetworkManager[55454]: <info>  [1769103887.8240] device (tapd562087b-b5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:44:47 compute-0 NetworkManager[55454]: <info>  [1769103887.8253] device (tapd562087b-b5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:44:47 compute-0 systemd-machined[154382]: New machine qemu-67-instance-00000043.
Jan 22 17:44:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:47.835 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c1dd4d81-993e-40fa-8ee8-e534d6c45f5f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:44:47 compute-0 systemd[1]: Started Virtual Machine qemu-67-instance-00000043.
Jan 22 17:44:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:47.870 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[ae4aad55-2c27-494f-9d56-f942a5234f90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:44:47 compute-0 systemd-udevd[239421]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:44:47 compute-0 NetworkManager[55454]: <info>  [1769103887.8783] manager: (tap88ed9213-70): new Veth device (/org/freedesktop/NetworkManager/Devices/306)
Jan 22 17:44:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:47.877 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ef329f59-a98d-4909-9165-639db4d1c842]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:44:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:47.923 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[2eebf8cc-898c-44d5-8fba-89c84c7c640c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:44:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:47.927 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[aaeadd59-7eb6-4f09-a04b-e8fb192d55f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:44:47 compute-0 NetworkManager[55454]: <info>  [1769103887.9535] device (tap88ed9213-70): carrier: link connected
Jan 22 17:44:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:47.958 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[f1fdf4c4-0a83-407c-ab0c-92daf9296fb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.965 183079 DEBUG nova.compute.manager [req-870a166c-1114-4207-aff1-1c67c9f57b77 req-b9694d15-e184-4a40-90a2-5a07f0244f65 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Received event network-vif-plugged-d562087b-b585-4eeb-9ef3-31bd96de01f4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.965 183079 DEBUG oslo_concurrency.lockutils [req-870a166c-1114-4207-aff1-1c67c9f57b77 req-b9694d15-e184-4a40-90a2-5a07f0244f65 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "12ded08f-c0e2-4d03-967b-3436626dbbb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.965 183079 DEBUG oslo_concurrency.lockutils [req-870a166c-1114-4207-aff1-1c67c9f57b77 req-b9694d15-e184-4a40-90a2-5a07f0244f65 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "12ded08f-c0e2-4d03-967b-3436626dbbb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.965 183079 DEBUG oslo_concurrency.lockutils [req-870a166c-1114-4207-aff1-1c67c9f57b77 req-b9694d15-e184-4a40-90a2-5a07f0244f65 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "12ded08f-c0e2-4d03-967b-3436626dbbb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:44:47 compute-0 nova_compute[183075]: 2026-01-22 17:44:47.965 183079 DEBUG nova.compute.manager [req-870a166c-1114-4207-aff1-1c67c9f57b77 req-b9694d15-e184-4a40-90a2-5a07f0244f65 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Processing event network-vif-plugged-d562087b-b585-4eeb-9ef3-31bd96de01f4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:44:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:47.976 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0bebca30-8d8b-499a-b194-b6c911260465]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 205], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 625165, 'reachable_time': 16968, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239449, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:44:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:47.995 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ffce1333-60a4-4189-8460-dab2775f99ae]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0a:fa93'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 625165, 'tstamp': 625165}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239450, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:48.015 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d8328398-cda8-4bfe-9d7e-63fba332bf23]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 205], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 625165, 'reachable_time': 16968, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 239451, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:48.047 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[fa6661e2-576b-4020-8390-c281153232aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:48.112 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[44c8484c-dacb-46b6-90ad-06a6399c4eb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:48.114 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:48.114 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:48.115 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88ed9213-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:44:48 compute-0 NetworkManager[55454]: <info>  [1769103888.1179] manager: (tap88ed9213-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/307)
Jan 22 17:44:48 compute-0 kernel: tap88ed9213-70: entered promiscuous mode
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.117 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:48.121 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88ed9213-70, col_values=(('external_ids', {'iface-id': 'da93a32e-eed6-4124-8f08-78b0bfe2cb69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:44:48 compute-0 ovn_controller[95372]: 2026-01-22T17:44:48Z|00767|binding|INFO|Releasing lport da93a32e-eed6-4124-8f08-78b0bfe2cb69 from this chassis (sb_readonly=0)
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.122 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:48.125 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:48.130 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[eb25ed0d-61ab-49ec-ae19-56f9160bcb48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:48.130 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:44:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:44:48.131 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'env', 'PROCESS_TAG=haproxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/88ed9213-7d48-4c42-ad60-ce3fcf104f71.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.133 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.195 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103888.195291, 12ded08f-c0e2-4d03-967b-3436626dbbb2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.196 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] VM Started (Lifecycle Event)
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.198 183079 DEBUG nova.compute.manager [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.200 183079 DEBUG nova.virt.libvirt.driver [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.203 183079 INFO nova.virt.libvirt.driver [-] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Instance spawned successfully.
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.204 183079 DEBUG nova.virt.libvirt.driver [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.228 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.229 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769103873.2272272, 8881a120-c63d-43dd-8135-7596ef1f460c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.229 183079 INFO nova.compute.manager [-] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] VM Stopped (Lifecycle Event)
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.234 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.236 183079 DEBUG nova.virt.libvirt.driver [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.237 183079 DEBUG nova.virt.libvirt.driver [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.237 183079 DEBUG nova.virt.libvirt.driver [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.238 183079 DEBUG nova.virt.libvirt.driver [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.238 183079 DEBUG nova.virt.libvirt.driver [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.238 183079 DEBUG nova.virt.libvirt.driver [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.264 183079 DEBUG nova.compute.manager [None req-1f9ea9fa-9d7b-458e-90c6-474aea8c7cfb - - - - - -] [instance: 8881a120-c63d-43dd-8135-7596ef1f460c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.270 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.271 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103888.1955638, 12ded08f-c0e2-4d03-967b-3436626dbbb2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.271 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] VM Paused (Lifecycle Event)
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.298 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.301 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103888.2000868, 12ded08f-c0e2-4d03-967b-3436626dbbb2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.302 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] VM Resumed (Lifecycle Event)
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.310 183079 INFO nova.compute.manager [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Took 4.21 seconds to spawn the instance on the hypervisor.
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.311 183079 DEBUG nova.compute.manager [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.319 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.322 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.357 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.371 183079 INFO nova.compute.manager [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Took 4.63 seconds to build instance.
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.390 183079 DEBUG oslo_concurrency.lockutils [None req-b345e08a-a8e3-40c4-8f68-abc0e6625ee8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "12ded08f-c0e2-4d03-967b-3436626dbbb2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.705s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.509 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:48 compute-0 podman[239490]: 2026-01-22 17:44:48.510479352 +0000 UTC m=+0.056597837 container create 9ef748973fac02fecebc83d5ab3691929723bb88d4f618f5679b8bddaea1ef1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 22 17:44:48 compute-0 systemd[1]: Started libpod-conmon-9ef748973fac02fecebc83d5ab3691929723bb88d4f618f5679b8bddaea1ef1e.scope.
Jan 22 17:44:48 compute-0 podman[239490]: 2026-01-22 17:44:48.479698508 +0000 UTC m=+0.025817013 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:44:48 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:44:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/646e3662c6d677120c479f4bc7a903e0243fccd3ba345ebdd09d79ad7110378f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:44:48 compute-0 podman[239490]: 2026-01-22 17:44:48.615308831 +0000 UTC m=+0.161427336 container init 9ef748973fac02fecebc83d5ab3691929723bb88d4f618f5679b8bddaea1ef1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 22 17:44:48 compute-0 podman[239490]: 2026-01-22 17:44:48.620431068 +0000 UTC m=+0.166549553 container start 9ef748973fac02fecebc83d5ab3691929723bb88d4f618f5679b8bddaea1ef1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 22 17:44:48 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239505]: [NOTICE]   (239509) : New worker (239511) forked
Jan 22 17:44:48 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239505]: [NOTICE]   (239509) : Loading success.
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.739 183079 DEBUG nova.network.neutron [req-627d91b9-be99-4894-9e91-7c7d717d1c9f req-57194d95-0305-42f1-92fc-4e6528f095ff a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Updated VIF entry in instance network info cache for port d562087b-b585-4eeb-9ef3-31bd96de01f4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.739 183079 DEBUG nova.network.neutron [req-627d91b9-be99-4894-9e91-7c7d717d1c9f req-57194d95-0305-42f1-92fc-4e6528f095ff a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Updating instance_info_cache with network_info: [{"id": "d562087b-b585-4eeb-9ef3-31bd96de01f4", "address": "fa:16:3e:8b:ad:02", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd562087b-b5", "ovs_interfaceid": "d562087b-b585-4eeb-9ef3-31bd96de01f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:44:48 compute-0 nova_compute[183075]: 2026-01-22 17:44:48.753 183079 DEBUG oslo_concurrency.lockutils [req-627d91b9-be99-4894-9e91-7c7d717d1c9f req-57194d95-0305-42f1-92fc-4e6528f095ff a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-12ded08f-c0e2-4d03-967b-3436626dbbb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:44:49 compute-0 nova_compute[183075]: 2026-01-22 17:44:49.037 183079 INFO nova.compute.manager [None req-72c4cd0c-d44a-4b4f-85c4-3b74ca6a75a6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Get console output
Jan 22 17:44:49 compute-0 nova_compute[183075]: 2026-01-22 17:44:49.043 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:44:49 compute-0 nova_compute[183075]: 2026-01-22 17:44:49.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:44:50 compute-0 nova_compute[183075]: 2026-01-22 17:44:50.038 183079 DEBUG nova.compute.manager [req-4b631a55-f5e0-4b4f-9782-cabde6935c9e req-f83c630f-cac7-4438-bf60-c89f990c3011 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Received event network-vif-plugged-d562087b-b585-4eeb-9ef3-31bd96de01f4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:44:50 compute-0 nova_compute[183075]: 2026-01-22 17:44:50.039 183079 DEBUG oslo_concurrency.lockutils [req-4b631a55-f5e0-4b4f-9782-cabde6935c9e req-f83c630f-cac7-4438-bf60-c89f990c3011 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "12ded08f-c0e2-4d03-967b-3436626dbbb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:44:50 compute-0 nova_compute[183075]: 2026-01-22 17:44:50.039 183079 DEBUG oslo_concurrency.lockutils [req-4b631a55-f5e0-4b4f-9782-cabde6935c9e req-f83c630f-cac7-4438-bf60-c89f990c3011 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "12ded08f-c0e2-4d03-967b-3436626dbbb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:44:50 compute-0 nova_compute[183075]: 2026-01-22 17:44:50.039 183079 DEBUG oslo_concurrency.lockutils [req-4b631a55-f5e0-4b4f-9782-cabde6935c9e req-f83c630f-cac7-4438-bf60-c89f990c3011 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "12ded08f-c0e2-4d03-967b-3436626dbbb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:44:50 compute-0 nova_compute[183075]: 2026-01-22 17:44:50.040 183079 DEBUG nova.compute.manager [req-4b631a55-f5e0-4b4f-9782-cabde6935c9e req-f83c630f-cac7-4438-bf60-c89f990c3011 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] No waiting events found dispatching network-vif-plugged-d562087b-b585-4eeb-9ef3-31bd96de01f4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:44:50 compute-0 nova_compute[183075]: 2026-01-22 17:44:50.040 183079 WARNING nova.compute.manager [req-4b631a55-f5e0-4b4f-9782-cabde6935c9e req-f83c630f-cac7-4438-bf60-c89f990c3011 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Received unexpected event network-vif-plugged-d562087b-b585-4eeb-9ef3-31bd96de01f4 for instance with vm_state active and task_state None.
Jan 22 17:44:51 compute-0 nova_compute[183075]: 2026-01-22 17:44:51.335 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769103876.334154, c570960f-e948-4456-9f9c-7b8afd2cf0ac => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:44:51 compute-0 nova_compute[183075]: 2026-01-22 17:44:51.335 183079 INFO nova.compute.manager [-] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] VM Stopped (Lifecycle Event)
Jan 22 17:44:51 compute-0 nova_compute[183075]: 2026-01-22 17:44:51.364 183079 DEBUG nova.compute.manager [None req-692d79ef-c81b-4abc-9606-0ad7bbecbdf7 - - - - - -] [instance: c570960f-e948-4456-9f9c-7b8afd2cf0ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:44:52 compute-0 nova_compute[183075]: 2026-01-22 17:44:52.659 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:53 compute-0 nova_compute[183075]: 2026-01-22 17:44:53.566 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:54 compute-0 nova_compute[183075]: 2026-01-22 17:44:54.158 183079 INFO nova.compute.manager [None req-4a7b6d75-6ce8-4839-b7f1-8879440d06dc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Get console output
Jan 22 17:44:54 compute-0 nova_compute[183075]: 2026-01-22 17:44:54.163 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:44:55 compute-0 podman[239522]: 2026-01-22 17:44:55.353440676 +0000 UTC m=+0.057902053 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.openshift.expose-services=, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, name=ubi9-minimal, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 22 17:44:55 compute-0 podman[239521]: 2026-01-22 17:44:55.372735872 +0000 UTC m=+0.079616463 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:44:55 compute-0 podman[239520]: 2026-01-22 17:44:55.384494127 +0000 UTC m=+0.093578827 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 17:44:57 compute-0 nova_compute[183075]: 2026-01-22 17:44:57.661 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:58 compute-0 nova_compute[183075]: 2026-01-22 17:44:58.567 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:59 compute-0 nova_compute[183075]: 2026-01-22 17:44:59.260 183079 INFO nova.compute.manager [None req-24220af3-4b64-4584-8f84-320d9ec0741b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Get console output
Jan 22 17:44:59 compute-0 nova_compute[183075]: 2026-01-22 17:44:59.267 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:45:00 compute-0 ovn_controller[95372]: 2026-01-22T17:45:00Z|00136|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8b:ad:02 10.100.0.13
Jan 22 17:45:00 compute-0 ovn_controller[95372]: 2026-01-22T17:45:00Z|00137|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8b:ad:02 10.100.0.13
Jan 22 17:45:01 compute-0 podman[239594]: 2026-01-22 17:45:01.364700029 +0000 UTC m=+0.080622960 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:45:02 compute-0 nova_compute[183075]: 2026-01-22 17:45:02.664 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:02 compute-0 nova_compute[183075]: 2026-01-22 17:45:02.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:45:02 compute-0 nova_compute[183075]: 2026-01-22 17:45:02.789 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 17:45:02 compute-0 nova_compute[183075]: 2026-01-22 17:45:02.808 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 17:45:03 compute-0 nova_compute[183075]: 2026-01-22 17:45:03.569 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:05 compute-0 nova_compute[183075]: 2026-01-22 17:45:05.044 183079 INFO nova.compute.manager [None req-e54e5668-71b5-452d-8c06-a75c6c5c11df 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Get console output
Jan 22 17:45:05 compute-0 nova_compute[183075]: 2026-01-22 17:45:05.049 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.456 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.457 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.778 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.779 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.3220737
Jan 22 17:45:05 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239511]: 10.100.0.13:34290 [22/Jan/2026:17:45:05.455] listener listener/metadata 0/0/0/323/323 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.788 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.789 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.807 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.808 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0186605
Jan 22 17:45:05 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239511]: 10.100.0.13:34298 [22/Jan/2026:17:45:05.788] listener listener/metadata 0/0/0/20/20 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.813 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.814 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.829 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.829 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0154610
Jan 22 17:45:05 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239511]: 10.100.0.13:34300 [22/Jan/2026:17:45:05.813] listener listener/metadata 0/0/0/16/16 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.835 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.836 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.853 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.854 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0184264
Jan 22 17:45:05 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239511]: 10.100.0.13:34302 [22/Jan/2026:17:45:05.835] listener listener/metadata 0/0/0/19/19 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.861 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.862 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.876 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.877 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0150597
Jan 22 17:45:05 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239511]: 10.100.0.13:34316 [22/Jan/2026:17:45:05.860] listener listener/metadata 0/0/0/16/16 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.882 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.883 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.896 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.896 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0133204
Jan 22 17:45:05 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239511]: 10.100.0.13:34328 [22/Jan/2026:17:45:05.881] listener listener/metadata 0/0/0/14/14 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.902 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.902 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.915 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.916 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0132363
Jan 22 17:45:05 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239511]: 10.100.0.13:34334 [22/Jan/2026:17:45:05.901] listener listener/metadata 0/0/0/14/14 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.921 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.922 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.936 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.936 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0143480
Jan 22 17:45:05 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239511]: 10.100.0.13:34346 [22/Jan/2026:17:45:05.921] listener listener/metadata 0/0/0/15/15 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.941 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.941 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.956 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.957 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0153141
Jan 22 17:45:05 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239511]: 10.100.0.13:34350 [22/Jan/2026:17:45:05.940] listener listener/metadata 0/0/0/16/16 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.961 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.962 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.977 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.977 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0157926
Jan 22 17:45:05 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239511]: 10.100.0.13:34356 [22/Jan/2026:17:45:05.960] listener listener/metadata 0/0/0/17/17 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.982 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.982 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:45:05 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239511]: 10.100.0.13:34360 [22/Jan/2026:17:45:05.981] listener listener/metadata 0/0/0/15/15 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:45:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:05.997 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0147991
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:06.005 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:06.006 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:06.019 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:06.019 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0137620
Jan 22 17:45:06 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239511]: 10.100.0.13:34368 [22/Jan/2026:17:45:06.005] listener listener/metadata 0/0/0/14/14 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:06.023 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:06.024 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:06.035 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:06.036 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0122685
Jan 22 17:45:06 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239511]: 10.100.0.13:34378 [22/Jan/2026:17:45:06.022] listener listener/metadata 0/0/0/13/13 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:06.040 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:06.040 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:06.054 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:06.055 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0144114
Jan 22 17:45:06 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239511]: 10.100.0.13:34388 [22/Jan/2026:17:45:06.039] listener listener/metadata 0/0/0/15/15 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:06.059 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:06.060 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:06.076 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:06.077 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0172350
Jan 22 17:45:06 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239511]: 10.100.0.13:34398 [22/Jan/2026:17:45:06.059] listener listener/metadata 0/0/0/18/18 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:06.082 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:06.083 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:06.096 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:45:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:06.096 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0135210
Jan 22 17:45:06 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239511]: 10.100.0.13:34410 [22/Jan/2026:17:45:06.081] listener listener/metadata 0/0/0/14/14 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:45:07 compute-0 nova_compute[183075]: 2026-01-22 17:45:07.666 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:08 compute-0 nova_compute[183075]: 2026-01-22 17:45:08.572 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:08 compute-0 nova_compute[183075]: 2026-01-22 17:45:08.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:45:10 compute-0 nova_compute[183075]: 2026-01-22 17:45:10.182 183079 INFO nova.compute.manager [None req-93849972-b69f-4258-a5de-44e2bd2328ca 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Get console output
Jan 22 17:45:10 compute-0 nova_compute[183075]: 2026-01-22 17:45:10.186 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:45:12 compute-0 podman[239615]: 2026-01-22 17:45:12.370419258 +0000 UTC m=+0.066960464 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 17:45:12 compute-0 nova_compute[183075]: 2026-01-22 17:45:12.667 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:13 compute-0 nova_compute[183075]: 2026-01-22 17:45:13.575 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:15 compute-0 nova_compute[183075]: 2026-01-22 17:45:15.326 183079 INFO nova.compute.manager [None req-5bd6ab3e-4a65-4d3d-b1b3-5597b4a64938 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Get console output
Jan 22 17:45:15 compute-0 nova_compute[183075]: 2026-01-22 17:45:15.331 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:45:16 compute-0 podman[239640]: 2026-01-22 17:45:16.340651319 +0000 UTC m=+0.049578270 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 17:45:17 compute-0 nova_compute[183075]: 2026-01-22 17:45:17.668 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:17 compute-0 ovn_controller[95372]: 2026-01-22T17:45:17Z|00768|memory_trim|INFO|Detected inactivity (last active 30019 ms ago): trimming memory
Jan 22 17:45:18 compute-0 nova_compute[183075]: 2026-01-22 17:45:18.577 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:20 compute-0 nova_compute[183075]: 2026-01-22 17:45:20.453 183079 INFO nova.compute.manager [None req-0cf17b33-bcbe-433a-943c-65af517c02ac 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Get console output
Jan 22 17:45:20 compute-0 nova_compute[183075]: 2026-01-22 17:45:20.457 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:45:22 compute-0 nova_compute[183075]: 2026-01-22 17:45:22.672 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:23 compute-0 nova_compute[183075]: 2026-01-22 17:45:23.579 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:25 compute-0 nova_compute[183075]: 2026-01-22 17:45:25.586 183079 INFO nova.compute.manager [None req-f4df972a-d6d7-47d4-b579-e30ff7c3f838 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Get console output
Jan 22 17:45:25 compute-0 nova_compute[183075]: 2026-01-22 17:45:25.591 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:45:26 compute-0 podman[239666]: 2026-01-22 17:45:26.347847571 +0000 UTC m=+0.051353226 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, distribution-scope=public, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 17:45:26 compute-0 podman[239665]: 2026-01-22 17:45:26.370238381 +0000 UTC m=+0.076363036 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:45:26 compute-0 podman[239664]: 2026-01-22 17:45:26.379816938 +0000 UTC m=+0.086916200 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 17:45:27 compute-0 nova_compute[183075]: 2026-01-22 17:45:27.675 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:28 compute-0 nova_compute[183075]: 2026-01-22 17:45:28.581 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:30 compute-0 nova_compute[183075]: 2026-01-22 17:45:30.713 183079 INFO nova.compute.manager [None req-1611c0cc-5ad6-4fc7-be5c-85e3616c3a50 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Get console output
Jan 22 17:45:30 compute-0 nova_compute[183075]: 2026-01-22 17:45:30.719 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:45:31 compute-0 nova_compute[183075]: 2026-01-22 17:45:31.804 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:45:32 compute-0 podman[239725]: 2026-01-22 17:45:32.374416644 +0000 UTC m=+0.084233997 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 22 17:45:32 compute-0 nova_compute[183075]: 2026-01-22 17:45:32.679 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:33 compute-0 nova_compute[183075]: 2026-01-22 17:45:33.583 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:34 compute-0 nova_compute[183075]: 2026-01-22 17:45:34.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:45:35 compute-0 nova_compute[183075]: 2026-01-22 17:45:35.858 183079 INFO nova.compute.manager [None req-3f6713a8-5b8f-43c5-8bea-4cbf4fdd0bbe 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Get console output
Jan 22 17:45:35 compute-0 nova_compute[183075]: 2026-01-22 17:45:35.863 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:45:36 compute-0 nova_compute[183075]: 2026-01-22 17:45:36.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:45:37 compute-0 nova_compute[183075]: 2026-01-22 17:45:37.681 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:38 compute-0 nova_compute[183075]: 2026-01-22 17:45:38.584 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:38 compute-0 nova_compute[183075]: 2026-01-22 17:45:38.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:45:40 compute-0 nova_compute[183075]: 2026-01-22 17:45:40.987 183079 INFO nova.compute.manager [None req-e1c2b1e2-67ee-445c-a96e-1c66fe622bb2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Get console output
Jan 22 17:45:40 compute-0 nova_compute[183075]: 2026-01-22 17:45:40.992 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:45:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:41.964 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:45:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:41.965 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:45:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:41.966 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:45:42 compute-0 nova_compute[183075]: 2026-01-22 17:45:42.685 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:43 compute-0 podman[239747]: 2026-01-22 17:45:43.325438512 +0000 UTC m=+0.040828395 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 17:45:43 compute-0 nova_compute[183075]: 2026-01-22 17:45:43.586 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:44 compute-0 nova_compute[183075]: 2026-01-22 17:45:44.733 183079 DEBUG oslo_concurrency.lockutils [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "cff4e488-d0bf-4df8-900e-e9f61f4309ac" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:45:44 compute-0 nova_compute[183075]: 2026-01-22 17:45:44.734 183079 DEBUG oslo_concurrency.lockutils [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "cff4e488-d0bf-4df8-900e-e9f61f4309ac" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:45:44 compute-0 nova_compute[183075]: 2026-01-22 17:45:44.751 183079 DEBUG nova.compute.manager [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:45:44 compute-0 nova_compute[183075]: 2026-01-22 17:45:44.826 183079 DEBUG oslo_concurrency.lockutils [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:45:44 compute-0 nova_compute[183075]: 2026-01-22 17:45:44.827 183079 DEBUG oslo_concurrency.lockutils [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:45:44 compute-0 nova_compute[183075]: 2026-01-22 17:45:44.836 183079 DEBUG nova.virt.hardware [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:45:44 compute-0 nova_compute[183075]: 2026-01-22 17:45:44.836 183079 INFO nova.compute.claims [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:45:44 compute-0 nova_compute[183075]: 2026-01-22 17:45:44.980 183079 DEBUG nova.compute.provider_tree [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:45:44 compute-0 nova_compute[183075]: 2026-01-22 17:45:44.993 183079 DEBUG nova.scheduler.client.report [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.015 183079 DEBUG oslo_concurrency.lockutils [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.016 183079 DEBUG nova.compute.manager [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.060 183079 DEBUG nova.compute.manager [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.061 183079 DEBUG nova.network.neutron [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.080 183079 INFO nova.virt.libvirt.driver [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.097 183079 DEBUG nova.compute.manager [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.185 183079 DEBUG nova.compute.manager [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.186 183079 DEBUG nova.virt.libvirt.driver [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.186 183079 INFO nova.virt.libvirt.driver [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Creating image(s)
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.187 183079 DEBUG oslo_concurrency.lockutils [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "/var/lib/nova/instances/cff4e488-d0bf-4df8-900e-e9f61f4309ac/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.187 183079 DEBUG oslo_concurrency.lockutils [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/cff4e488-d0bf-4df8-900e-e9f61f4309ac/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.188 183079 DEBUG oslo_concurrency.lockutils [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/cff4e488-d0bf-4df8-900e-e9f61f4309ac/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.199 183079 DEBUG oslo_concurrency.processutils [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.259 183079 DEBUG oslo_concurrency.processutils [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.260 183079 DEBUG oslo_concurrency.lockutils [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.261 183079 DEBUG oslo_concurrency.lockutils [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.273 183079 DEBUG oslo_concurrency.processutils [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.331 183079 DEBUG oslo_concurrency.processutils [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.332 183079 DEBUG oslo_concurrency.processutils [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/cff4e488-d0bf-4df8-900e-e9f61f4309ac/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.362 183079 DEBUG oslo_concurrency.processutils [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/cff4e488-d0bf-4df8-900e-e9f61f4309ac/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.363 183079 DEBUG oslo_concurrency.lockutils [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.363 183079 DEBUG oslo_concurrency.processutils [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.415 183079 DEBUG oslo_concurrency.processutils [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.416 183079 DEBUG nova.virt.disk.api [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Checking if we can resize image /var/lib/nova/instances/cff4e488-d0bf-4df8-900e-e9f61f4309ac/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.416 183079 DEBUG oslo_concurrency.processutils [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cff4e488-d0bf-4df8-900e-e9f61f4309ac/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.465 183079 DEBUG nova.policy [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.469 183079 DEBUG oslo_concurrency.processutils [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cff4e488-d0bf-4df8-900e-e9f61f4309ac/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.469 183079 DEBUG nova.virt.disk.api [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Cannot resize image /var/lib/nova/instances/cff4e488-d0bf-4df8-900e-e9f61f4309ac/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.470 183079 DEBUG nova.objects.instance [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'migration_context' on Instance uuid cff4e488-d0bf-4df8-900e-e9f61f4309ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.487 183079 DEBUG nova.virt.libvirt.driver [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.488 183079 DEBUG nova.virt.libvirt.driver [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Ensure instance console log exists: /var/lib/nova/instances/cff4e488-d0bf-4df8-900e-e9f61f4309ac/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.488 183079 DEBUG oslo_concurrency.lockutils [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.488 183079 DEBUG oslo_concurrency.lockutils [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.489 183079 DEBUG oslo_concurrency.lockutils [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:45:45 compute-0 nova_compute[183075]: 2026-01-22 17:45:45.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:45:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:46.042 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=50, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=49) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:45:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:46.043 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:45:46 compute-0 nova_compute[183075]: 2026-01-22 17:45:46.043 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:46.044 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '50'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:45:46 compute-0 nova_compute[183075]: 2026-01-22 17:45:46.143 183079 DEBUG nova.network.neutron [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Successfully created port: 9df171fe-9cf9-4e33-b0f6-8054d4fb76c3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:45:46 compute-0 nova_compute[183075]: 2026-01-22 17:45:46.789 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:45:46 compute-0 nova_compute[183075]: 2026-01-22 17:45:46.792 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:45:46 compute-0 nova_compute[183075]: 2026-01-22 17:45:46.792 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:45:46 compute-0 nova_compute[183075]: 2026-01-22 17:45:46.809 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 22 17:45:46 compute-0 nova_compute[183075]: 2026-01-22 17:45:46.951 183079 DEBUG nova.network.neutron [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Successfully updated port: 9df171fe-9cf9-4e33-b0f6-8054d4fb76c3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:45:46 compute-0 nova_compute[183075]: 2026-01-22 17:45:46.968 183079 DEBUG oslo_concurrency.lockutils [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "refresh_cache-cff4e488-d0bf-4df8-900e-e9f61f4309ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:45:46 compute-0 nova_compute[183075]: 2026-01-22 17:45:46.969 183079 DEBUG oslo_concurrency.lockutils [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquired lock "refresh_cache-cff4e488-d0bf-4df8-900e-e9f61f4309ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:45:46 compute-0 nova_compute[183075]: 2026-01-22 17:45:46.969 183079 DEBUG nova.network.neutron [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:45:47 compute-0 nova_compute[183075]: 2026-01-22 17:45:47.011 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-12ded08f-c0e2-4d03-967b-3436626dbbb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:45:47 compute-0 nova_compute[183075]: 2026-01-22 17:45:47.011 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-12ded08f-c0e2-4d03-967b-3436626dbbb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:45:47 compute-0 nova_compute[183075]: 2026-01-22 17:45:47.011 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:45:47 compute-0 nova_compute[183075]: 2026-01-22 17:45:47.012 183079 DEBUG nova.objects.instance [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 12ded08f-c0e2-4d03-967b-3436626dbbb2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:45:47 compute-0 nova_compute[183075]: 2026-01-22 17:45:47.049 183079 DEBUG nova.compute.manager [req-1c03ef48-b57a-4f97-8314-4364f584426f req-1733f567-385b-4ba9-a93e-6ec33b7d84a9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Received event network-changed-9df171fe-9cf9-4e33-b0f6-8054d4fb76c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:45:47 compute-0 nova_compute[183075]: 2026-01-22 17:45:47.050 183079 DEBUG nova.compute.manager [req-1c03ef48-b57a-4f97-8314-4364f584426f req-1733f567-385b-4ba9-a93e-6ec33b7d84a9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Refreshing instance network info cache due to event network-changed-9df171fe-9cf9-4e33-b0f6-8054d4fb76c3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:45:47 compute-0 nova_compute[183075]: 2026-01-22 17:45:47.050 183079 DEBUG oslo_concurrency.lockutils [req-1c03ef48-b57a-4f97-8314-4364f584426f req-1733f567-385b-4ba9-a93e-6ec33b7d84a9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-cff4e488-d0bf-4df8-900e-e9f61f4309ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:45:47 compute-0 nova_compute[183075]: 2026-01-22 17:45:47.144 183079 DEBUG nova.network.neutron [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:45:47 compute-0 podman[239787]: 2026-01-22 17:45:47.341570351 +0000 UTC m=+0.054960033 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 17:45:47 compute-0 nova_compute[183075]: 2026-01-22 17:45:47.688 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.290 183079 DEBUG nova.network.neutron [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Updating instance_info_cache with network_info: [{"id": "9df171fe-9cf9-4e33-b0f6-8054d4fb76c3", "address": "fa:16:3e:78:62:dd", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9df171fe-9c", "ovs_interfaceid": "9df171fe-9cf9-4e33-b0f6-8054d4fb76c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.315 183079 DEBUG oslo_concurrency.lockutils [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Releasing lock "refresh_cache-cff4e488-d0bf-4df8-900e-e9f61f4309ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.316 183079 DEBUG nova.compute.manager [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Instance network_info: |[{"id": "9df171fe-9cf9-4e33-b0f6-8054d4fb76c3", "address": "fa:16:3e:78:62:dd", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9df171fe-9c", "ovs_interfaceid": "9df171fe-9cf9-4e33-b0f6-8054d4fb76c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.316 183079 DEBUG oslo_concurrency.lockutils [req-1c03ef48-b57a-4f97-8314-4364f584426f req-1733f567-385b-4ba9-a93e-6ec33b7d84a9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-cff4e488-d0bf-4df8-900e-e9f61f4309ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.317 183079 DEBUG nova.network.neutron [req-1c03ef48-b57a-4f97-8314-4364f584426f req-1733f567-385b-4ba9-a93e-6ec33b7d84a9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Refreshing network info cache for port 9df171fe-9cf9-4e33-b0f6-8054d4fb76c3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.324 183079 DEBUG nova.virt.libvirt.driver [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Start _get_guest_xml network_info=[{"id": "9df171fe-9cf9-4e33-b0f6-8054d4fb76c3", "address": "fa:16:3e:78:62:dd", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9df171fe-9c", "ovs_interfaceid": "9df171fe-9cf9-4e33-b0f6-8054d4fb76c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.330 183079 WARNING nova.virt.libvirt.driver [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.334 183079 DEBUG nova.virt.libvirt.host [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.335 183079 DEBUG nova.virt.libvirt.host [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.343 183079 DEBUG nova.virt.libvirt.host [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.344 183079 DEBUG nova.virt.libvirt.host [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.344 183079 DEBUG nova.virt.libvirt.driver [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.344 183079 DEBUG nova.virt.hardware [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.345 183079 DEBUG nova.virt.hardware [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.345 183079 DEBUG nova.virt.hardware [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.345 183079 DEBUG nova.virt.hardware [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.346 183079 DEBUG nova.virt.hardware [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.346 183079 DEBUG nova.virt.hardware [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.346 183079 DEBUG nova.virt.hardware [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.346 183079 DEBUG nova.virt.hardware [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.347 183079 DEBUG nova.virt.hardware [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.347 183079 DEBUG nova.virt.hardware [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.347 183079 DEBUG nova.virt.hardware [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.350 183079 DEBUG nova.virt.libvirt.vif [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:45:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1515844450',display_name='tempest-server-test-1515844450',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1515844450',id=68,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-0t85eya6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:45:45Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=cff4e488-d0bf-4df8-900e-e9f61f4309ac,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9df171fe-9cf9-4e33-b0f6-8054d4fb76c3", "address": "fa:16:3e:78:62:dd", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9df171fe-9c", "ovs_interfaceid": "9df171fe-9cf9-4e33-b0f6-8054d4fb76c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.350 183079 DEBUG nova.network.os_vif_util [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "9df171fe-9cf9-4e33-b0f6-8054d4fb76c3", "address": "fa:16:3e:78:62:dd", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9df171fe-9c", "ovs_interfaceid": "9df171fe-9cf9-4e33-b0f6-8054d4fb76c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.351 183079 DEBUG nova.network.os_vif_util [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:62:dd,bridge_name='br-int',has_traffic_filtering=True,id=9df171fe-9cf9-4e33-b0f6-8054d4fb76c3,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9df171fe-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.352 183079 DEBUG nova.objects.instance [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'pci_devices' on Instance uuid cff4e488-d0bf-4df8-900e-e9f61f4309ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.368 183079 DEBUG nova.virt.libvirt.driver [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:45:48 compute-0 nova_compute[183075]:   <uuid>cff4e488-d0bf-4df8-900e-e9f61f4309ac</uuid>
Jan 22 17:45:48 compute-0 nova_compute[183075]:   <name>instance-00000044</name>
Jan 22 17:45:48 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:45:48 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:45:48 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:45:48 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1515844450</nova:name>
Jan 22 17:45:48 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:45:48</nova:creationTime>
Jan 22 17:45:48 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:45:48 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:45:48 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:45:48 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:45:48 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:45:48 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:45:48 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:45:48 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:45:48 compute-0 nova_compute[183075]:         <nova:user uuid="66c24efd0cd24c57803ea8508e679d06">tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member</nova:user>
Jan 22 17:45:48 compute-0 nova_compute[183075]:         <nova:project uuid="cbff5cec4b9c4d2eb317124ca20fea9f">tempest-StatelessNetworkSecGroupIPv4Test-1348485316</nova:project>
Jan 22 17:45:48 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:45:48 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:45:48 compute-0 nova_compute[183075]:         <nova:port uuid="9df171fe-9cf9-4e33-b0f6-8054d4fb76c3">
Jan 22 17:45:48 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:45:48 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:45:48 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:45:48 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <system>
Jan 22 17:45:48 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:45:48 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:45:48 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:45:48 compute-0 nova_compute[183075]:       <entry name="serial">cff4e488-d0bf-4df8-900e-e9f61f4309ac</entry>
Jan 22 17:45:48 compute-0 nova_compute[183075]:       <entry name="uuid">cff4e488-d0bf-4df8-900e-e9f61f4309ac</entry>
Jan 22 17:45:48 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     </system>
Jan 22 17:45:48 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:45:48 compute-0 nova_compute[183075]:   <os>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:   </os>
Jan 22 17:45:48 compute-0 nova_compute[183075]:   <features>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:   </features>
Jan 22 17:45:48 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:45:48 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:45:48 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:45:48 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/cff4e488-d0bf-4df8-900e-e9f61f4309ac/disk"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:45:48 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:78:62:dd"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:       <target dev="tap9df171fe-9c"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:45:48 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/cff4e488-d0bf-4df8-900e-e9f61f4309ac/console.log" append="off"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <video>
Jan 22 17:45:48 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     </video>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:45:48 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:45:48 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:45:48 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:45:48 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:45:48 compute-0 nova_compute[183075]: </domain>
Jan 22 17:45:48 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.369 183079 DEBUG nova.compute.manager [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Preparing to wait for external event network-vif-plugged-9df171fe-9cf9-4e33-b0f6-8054d4fb76c3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.370 183079 DEBUG oslo_concurrency.lockutils [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "cff4e488-d0bf-4df8-900e-e9f61f4309ac-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.370 183079 DEBUG oslo_concurrency.lockutils [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "cff4e488-d0bf-4df8-900e-e9f61f4309ac-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.370 183079 DEBUG oslo_concurrency.lockutils [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "cff4e488-d0bf-4df8-900e-e9f61f4309ac-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.371 183079 DEBUG nova.virt.libvirt.vif [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:45:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1515844450',display_name='tempest-server-test-1515844450',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1515844450',id=68,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-0t85eya6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:45:45Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=cff4e488-d0bf-4df8-900e-e9f61f4309ac,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9df171fe-9cf9-4e33-b0f6-8054d4fb76c3", "address": "fa:16:3e:78:62:dd", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9df171fe-9c", "ovs_interfaceid": "9df171fe-9cf9-4e33-b0f6-8054d4fb76c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.371 183079 DEBUG nova.network.os_vif_util [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "9df171fe-9cf9-4e33-b0f6-8054d4fb76c3", "address": "fa:16:3e:78:62:dd", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9df171fe-9c", "ovs_interfaceid": "9df171fe-9cf9-4e33-b0f6-8054d4fb76c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.372 183079 DEBUG nova.network.os_vif_util [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:62:dd,bridge_name='br-int',has_traffic_filtering=True,id=9df171fe-9cf9-4e33-b0f6-8054d4fb76c3,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9df171fe-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.372 183079 DEBUG os_vif [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:62:dd,bridge_name='br-int',has_traffic_filtering=True,id=9df171fe-9cf9-4e33-b0f6-8054d4fb76c3,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9df171fe-9c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.373 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.373 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.374 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.377 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.377 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9df171fe-9c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.378 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9df171fe-9c, col_values=(('external_ids', {'iface-id': '9df171fe-9cf9-4e33-b0f6-8054d4fb76c3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:78:62:dd', 'vm-uuid': 'cff4e488-d0bf-4df8-900e-e9f61f4309ac'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.380 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:48 compute-0 NetworkManager[55454]: <info>  [1769103948.3808] manager: (tap9df171fe-9c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/308)
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.382 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.387 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.388 183079 INFO os_vif [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:62:dd,bridge_name='br-int',has_traffic_filtering=True,id=9df171fe-9cf9-4e33-b0f6-8054d4fb76c3,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9df171fe-9c')
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.456 183079 DEBUG nova.virt.libvirt.driver [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.457 183079 DEBUG nova.virt.libvirt.driver [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No VIF found with MAC fa:16:3e:78:62:dd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:45:48 compute-0 kernel: tap9df171fe-9c: entered promiscuous mode
Jan 22 17:45:48 compute-0 NetworkManager[55454]: <info>  [1769103948.5300] manager: (tap9df171fe-9c): new Tun device (/org/freedesktop/NetworkManager/Devices/309)
Jan 22 17:45:48 compute-0 ovn_controller[95372]: 2026-01-22T17:45:48Z|00769|binding|INFO|Claiming lport 9df171fe-9cf9-4e33-b0f6-8054d4fb76c3 for this chassis.
Jan 22 17:45:48 compute-0 ovn_controller[95372]: 2026-01-22T17:45:48Z|00770|binding|INFO|9df171fe-9cf9-4e33-b0f6-8054d4fb76c3: Claiming fa:16:3e:78:62:dd 10.100.0.5
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.538 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Updating instance_info_cache with network_info: [{"id": "d562087b-b585-4eeb-9ef3-31bd96de01f4", "address": "fa:16:3e:8b:ad:02", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd562087b-b5", "ovs_interfaceid": "d562087b-b585-4eeb-9ef3-31bd96de01f4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.540 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:48.541 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:62:dd 10.100.0.5'], port_security=['fa:16:3e:78:62:dd 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'cff4e488-d0bf-4df8-900e-e9f61f4309ac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '211927a5-364f-4b76-9332-16507814b750 f983fa10-a163-4ba4-97be-44ccb35ba95f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=9df171fe-9cf9-4e33-b0f6-8054d4fb76c3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:45:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:48.542 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 9df171fe-9cf9-4e33-b0f6-8054d4fb76c3 in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 bound to our chassis
Jan 22 17:45:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:48.545 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:45:48 compute-0 ovn_controller[95372]: 2026-01-22T17:45:48Z|00771|binding|INFO|Setting lport 9df171fe-9cf9-4e33-b0f6-8054d4fb76c3 up in Southbound
Jan 22 17:45:48 compute-0 ovn_controller[95372]: 2026-01-22T17:45:48Z|00772|binding|INFO|Setting lport 9df171fe-9cf9-4e33-b0f6-8054d4fb76c3 ovn-installed in OVS
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.552 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.554 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.596 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-12ded08f-c0e2-4d03-967b-3436626dbbb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.597 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.598 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.599 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:45:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:48.600 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[640e3b45-64e4-4908-87be-906f269a6a29]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:45:48 compute-0 systemd-udevd[239832]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.624 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.624 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.625 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.625 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:45:48 compute-0 systemd-machined[154382]: New machine qemu-68-instance-00000044.
Jan 22 17:45:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:48.631 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[dc8d857d-e827-4d6f-9c56-8aeb304602bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:45:48 compute-0 NetworkManager[55454]: <info>  [1769103948.6355] device (tap9df171fe-9c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:45:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:48.635 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[ecf78350-84f4-4acd-82ea-df6fbd25047e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:45:48 compute-0 NetworkManager[55454]: <info>  [1769103948.6369] device (tap9df171fe-9c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:45:48 compute-0 systemd[1]: Started Virtual Machine qemu-68-instance-00000044.
Jan 22 17:45:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:48.664 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[cac801f4-839d-4ece-a20f-642d5cad90ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:45:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:48.685 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3626bb60-57e8-4aa0-84cd-50d4eab97196]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 105, 'tx_packets': 55, 'rx_bytes': 8962, 'tx_bytes': 6222, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 105, 'tx_packets': 55, 'rx_bytes': 8962, 'tx_bytes': 6222, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 205], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 625165, 'reachable_time': 16968, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239839, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:45:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:48.706 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[53b33031-9d09-4174-999c-4e1d33668072]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 625177, 'tstamp': 625177}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239844, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 625180, 'tstamp': 625180}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239844, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:45:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:48.708 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.710 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.711 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:48.712 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88ed9213-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:45:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:48.712 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:45:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:48.713 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88ed9213-70, col_values=(('external_ids', {'iface-id': 'da93a32e-eed6-4124-8f08-78b0bfe2cb69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:45:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:45:48.713 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.751 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cff4e488-d0bf-4df8-900e-e9f61f4309ac/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.773 183079 DEBUG nova.compute.manager [req-1d18908a-268e-4112-9f48-d18a00ebdba0 req-f3c25b6f-d4ee-475e-aed5-bc9cb8af8610 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Received event network-vif-plugged-9df171fe-9cf9-4e33-b0f6-8054d4fb76c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.774 183079 DEBUG oslo_concurrency.lockutils [req-1d18908a-268e-4112-9f48-d18a00ebdba0 req-f3c25b6f-d4ee-475e-aed5-bc9cb8af8610 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "cff4e488-d0bf-4df8-900e-e9f61f4309ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.775 183079 DEBUG oslo_concurrency.lockutils [req-1d18908a-268e-4112-9f48-d18a00ebdba0 req-f3c25b6f-d4ee-475e-aed5-bc9cb8af8610 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "cff4e488-d0bf-4df8-900e-e9f61f4309ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.775 183079 DEBUG oslo_concurrency.lockutils [req-1d18908a-268e-4112-9f48-d18a00ebdba0 req-f3c25b6f-d4ee-475e-aed5-bc9cb8af8610 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "cff4e488-d0bf-4df8-900e-e9f61f4309ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.776 183079 DEBUG nova.compute.manager [req-1d18908a-268e-4112-9f48-d18a00ebdba0 req-f3c25b6f-d4ee-475e-aed5-bc9cb8af8610 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Processing event network-vif-plugged-9df171fe-9cf9-4e33-b0f6-8054d4fb76c3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.819 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cff4e488-d0bf-4df8-900e-e9f61f4309ac/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.821 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cff4e488-d0bf-4df8-900e-e9f61f4309ac/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.874 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cff4e488-d0bf-4df8-900e-e9f61f4309ac/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.881 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/12ded08f-c0e2-4d03-967b-3436626dbbb2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.936 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/12ded08f-c0e2-4d03-967b-3436626dbbb2/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:45:48 compute-0 nova_compute[183075]: 2026-01-22 17:45:48.937 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/12ded08f-c0e2-4d03-967b-3436626dbbb2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.005 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/12ded08f-c0e2-4d03-967b-3436626dbbb2/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.172 183079 DEBUG nova.compute.manager [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.173 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103949.1715593, cff4e488-d0bf-4df8-900e-e9f61f4309ac => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.173 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] VM Started (Lifecycle Event)
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.177 183079 DEBUG nova.virt.libvirt.driver [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.182 183079 INFO nova.virt.libvirt.driver [-] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Instance spawned successfully.
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.182 183079 DEBUG nova.virt.libvirt.driver [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.197 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.209 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.212 183079 DEBUG nova.virt.libvirt.driver [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.212 183079 DEBUG nova.virt.libvirt.driver [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.213 183079 DEBUG nova.virt.libvirt.driver [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.213 183079 DEBUG nova.virt.libvirt.driver [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.213 183079 DEBUG nova.virt.libvirt.driver [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.214 183079 DEBUG nova.virt.libvirt.driver [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.240 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.241 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5541MB free_disk=73.33052062988281GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.241 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.241 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.243 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.243 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103949.171721, cff4e488-d0bf-4df8-900e-e9f61f4309ac => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.243 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] VM Paused (Lifecycle Event)
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.278 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.282 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769103949.176665, cff4e488-d0bf-4df8-900e-e9f61f4309ac => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.282 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] VM Resumed (Lifecycle Event)
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.294 183079 INFO nova.compute.manager [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Took 4.11 seconds to spawn the instance on the hypervisor.
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.295 183079 DEBUG nova.compute.manager [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.315 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.323 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.335 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 12ded08f-c0e2-4d03-967b-3436626dbbb2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.335 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance cff4e488-d0bf-4df8-900e-e9f61f4309ac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.335 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.335 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.355 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.367 183079 INFO nova.compute.manager [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Took 4.57 seconds to build instance.
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.385 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.395 183079 DEBUG oslo_concurrency.lockutils [None req-76cc4298-1ab3-4152-b243-5bf19385e9b8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "cff4e488-d0bf-4df8-900e-e9f61f4309ac" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.661s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.404 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.434 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:45:49 compute-0 nova_compute[183075]: 2026-01-22 17:45:49.435 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.193s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:45:50 compute-0 nova_compute[183075]: 2026-01-22 17:45:50.624 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:45:50 compute-0 nova_compute[183075]: 2026-01-22 17:45:50.624 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:45:50 compute-0 nova_compute[183075]: 2026-01-22 17:45:50.624 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:45:50 compute-0 nova_compute[183075]: 2026-01-22 17:45:50.834 183079 DEBUG nova.compute.manager [req-0359fcb3-e7e2-4e74-8ebf-d1d5b6d2ecef req-bdd15bb4-a0c0-48a4-9d61-02027ebbfbb8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Received event network-vif-plugged-9df171fe-9cf9-4e33-b0f6-8054d4fb76c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:45:50 compute-0 nova_compute[183075]: 2026-01-22 17:45:50.834 183079 DEBUG oslo_concurrency.lockutils [req-0359fcb3-e7e2-4e74-8ebf-d1d5b6d2ecef req-bdd15bb4-a0c0-48a4-9d61-02027ebbfbb8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "cff4e488-d0bf-4df8-900e-e9f61f4309ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:45:50 compute-0 nova_compute[183075]: 2026-01-22 17:45:50.834 183079 DEBUG oslo_concurrency.lockutils [req-0359fcb3-e7e2-4e74-8ebf-d1d5b6d2ecef req-bdd15bb4-a0c0-48a4-9d61-02027ebbfbb8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "cff4e488-d0bf-4df8-900e-e9f61f4309ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:45:50 compute-0 nova_compute[183075]: 2026-01-22 17:45:50.835 183079 DEBUG oslo_concurrency.lockutils [req-0359fcb3-e7e2-4e74-8ebf-d1d5b6d2ecef req-bdd15bb4-a0c0-48a4-9d61-02027ebbfbb8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "cff4e488-d0bf-4df8-900e-e9f61f4309ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:45:50 compute-0 nova_compute[183075]: 2026-01-22 17:45:50.836 183079 DEBUG nova.compute.manager [req-0359fcb3-e7e2-4e74-8ebf-d1d5b6d2ecef req-bdd15bb4-a0c0-48a4-9d61-02027ebbfbb8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] No waiting events found dispatching network-vif-plugged-9df171fe-9cf9-4e33-b0f6-8054d4fb76c3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:45:50 compute-0 nova_compute[183075]: 2026-01-22 17:45:50.836 183079 WARNING nova.compute.manager [req-0359fcb3-e7e2-4e74-8ebf-d1d5b6d2ecef req-bdd15bb4-a0c0-48a4-9d61-02027ebbfbb8 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Received unexpected event network-vif-plugged-9df171fe-9cf9-4e33-b0f6-8054d4fb76c3 for instance with vm_state active and task_state None.
Jan 22 17:45:51 compute-0 nova_compute[183075]: 2026-01-22 17:45:51.469 183079 DEBUG nova.network.neutron [req-1c03ef48-b57a-4f97-8314-4364f584426f req-1733f567-385b-4ba9-a93e-6ec33b7d84a9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Updated VIF entry in instance network info cache for port 9df171fe-9cf9-4e33-b0f6-8054d4fb76c3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:45:51 compute-0 nova_compute[183075]: 2026-01-22 17:45:51.469 183079 DEBUG nova.network.neutron [req-1c03ef48-b57a-4f97-8314-4364f584426f req-1733f567-385b-4ba9-a93e-6ec33b7d84a9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Updating instance_info_cache with network_info: [{"id": "9df171fe-9cf9-4e33-b0f6-8054d4fb76c3", "address": "fa:16:3e:78:62:dd", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9df171fe-9c", "ovs_interfaceid": "9df171fe-9cf9-4e33-b0f6-8054d4fb76c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:45:51 compute-0 nova_compute[183075]: 2026-01-22 17:45:51.490 183079 DEBUG oslo_concurrency.lockutils [req-1c03ef48-b57a-4f97-8314-4364f584426f req-1733f567-385b-4ba9-a93e-6ec33b7d84a9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-cff4e488-d0bf-4df8-900e-e9f61f4309ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:45:51 compute-0 nova_compute[183075]: 2026-01-22 17:45:51.562 183079 INFO nova.compute.manager [None req-14eb845e-721a-48f1-ba52-0b27ef8cd148 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Get console output
Jan 22 17:45:51 compute-0 nova_compute[183075]: 2026-01-22 17:45:51.569 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:45:53 compute-0 nova_compute[183075]: 2026-01-22 17:45:53.380 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:53 compute-0 nova_compute[183075]: 2026-01-22 17:45:53.599 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.461 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'cff4e488-d0bf-4df8-900e-e9f61f4309ac', 'name': 'tempest-server-test-1515844450', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000044', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'hostId': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.464 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '12ded08f-c0e2-4d03-967b-3436626dbbb2', 'name': 'tempest-server-test-1658329936', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000043', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'hostId': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.464 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.466 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for cff4e488-d0bf-4df8-900e-e9f61f4309ac / tap9df171fe-9c inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.466 12 DEBUG ceilometer.compute.pollsters [-] cff4e488-d0bf-4df8-900e-e9f61f4309ac/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.469 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 12ded08f-c0e2-4d03-967b-3436626dbbb2 / tapd562087b-b5 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.469 12 DEBUG ceilometer.compute.pollsters [-] 12ded08f-c0e2-4d03-967b-3436626dbbb2/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a2b424be-29c8-4c0f-9e2e-35587749cbf9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000044-cff4e488-d0bf-4df8-900e-e9f61f4309ac-tap9df171fe-9c', 'timestamp': '2026-01-22T17:45:55.464486', 'resource_metadata': {'display_name': 'tempest-server-test-1515844450', 'name': 'tap9df171fe-9c', 'instance_id': 'cff4e488-d0bf-4df8-900e-e9f61f4309ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:78:62:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9df171fe-9c'}, 'message_id': '33de6052-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.224903892, 'message_signature': 'd600b95cd2fc3ea1c290a5c1de3a27a3dcbf3144be966314547ba4e6cc0924a7'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000043-12ded08f-c0e2-4d03-967b-3436626dbbb2-tapd562087b-b5', 'timestamp': '2026-01-22T17:45:55.464486', 'resource_metadata': {'display_name': 'tempest-server-test-1658329936', 'name': 'tapd562087b-b5', 'instance_id': '12ded08f-c0e2-4d03-967b-3436626dbbb2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:ad:02', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd562087b-b5'}, 'message_id': '33deb976-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.228083277, 'message_signature': '82077e3443b28e7f422bbc3776f3cf10d067bef48e6d98168cd6aabdcdb91691'}]}, 'timestamp': '2026-01-22 17:45:55.469889', '_unique_id': 'f308a7c273f142afab283c39d3d93b8a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.471 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.484 12 DEBUG ceilometer.compute.pollsters [-] cff4e488-d0bf-4df8-900e-e9f61f4309ac/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.500 12 DEBUG ceilometer.compute.pollsters [-] 12ded08f-c0e2-4d03-967b-3436626dbbb2/disk.device.read.bytes volume: 31361536 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bdec8948-f6cf-4663-ba5d-b2c3ecf64658', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'cff4e488-d0bf-4df8-900e-e9f61f4309ac-vda', 'timestamp': '2026-01-22T17:45:55.472058', 'resource_metadata': {'display_name': 'tempest-server-test-1515844450', 'name': 'instance-00000044', 'instance_id': 'cff4e488-d0bf-4df8-900e-e9f61f4309ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '33e10ca8-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.232492695, 'message_signature': 'd053ff347973b50d14f50e833bfa350d9693d4e8107cc0762eb4844d8f9accf0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31361536, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '12ded08f-c0e2-4d03-967b-3436626dbbb2-vda', 'timestamp': '2026-01-22T17:45:55.472058', 'resource_metadata': {'display_name': 'tempest-server-test-1658329936', 'name': 'instance-00000043', 'instance_id': '12ded08f-c0e2-4d03-967b-3436626dbbb2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '33e37f88-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.245617337, 'message_signature': '6fa6b924900acc83646b156f65c315eebd0e495b1036ebea0c4eedb37d894150'}]}, 'timestamp': '2026-01-22 17:45:55.501445', '_unique_id': 'e03978dc9b674641b01bd43103c50775'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.503 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.504 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.505 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.506 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1515844450>, <NovaLikeServer: tempest-server-test-1658329936>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1515844450>, <NovaLikeServer: tempest-server-test-1658329936>]
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.507 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.507 12 DEBUG ceilometer.compute.pollsters [-] cff4e488-d0bf-4df8-900e-e9f61f4309ac/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.507 12 DEBUG ceilometer.compute.pollsters [-] 12ded08f-c0e2-4d03-967b-3436626dbbb2/disk.device.read.requests volume: 1161 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6090a80e-5ece-4a48-ac93-520046705634', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'cff4e488-d0bf-4df8-900e-e9f61f4309ac-vda', 'timestamp': '2026-01-22T17:45:55.507337', 'resource_metadata': {'display_name': 'tempest-server-test-1515844450', 'name': 'instance-00000044', 'instance_id': 'cff4e488-d0bf-4df8-900e-e9f61f4309ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '33e47c44-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.232492695, 'message_signature': '8a9c6f74d3ee67dc4af9598903e4349bdef9c352328acc674335766103afb077'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1161, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '12ded08f-c0e2-4d03-967b-3436626dbbb2-vda', 'timestamp': '2026-01-22T17:45:55.507337', 'resource_metadata': {'display_name': 'tempest-server-test-1658329936', 'name': 'instance-00000043', 'instance_id': '12ded08f-c0e2-4d03-967b-3436626dbbb2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '33e48b80-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.245617337, 'message_signature': '2507de26e3bbed8d113eb6a0f9142d4f10372ff7ea66c426c08764dd2ccabb31'}]}, 'timestamp': '2026-01-22 17:45:55.508005', '_unique_id': '4b22015a271f423ea9101c375fa2e014'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.509 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.511 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.511 12 DEBUG ceilometer.compute.pollsters [-] cff4e488-d0bf-4df8-900e-e9f61f4309ac/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.512 12 DEBUG ceilometer.compute.pollsters [-] 12ded08f-c0e2-4d03-967b-3436626dbbb2/network.incoming.bytes volume: 7340 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8cb3ea80-20f2-40ca-b18c-a4e10c8c9e96', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000044-cff4e488-d0bf-4df8-900e-e9f61f4309ac-tap9df171fe-9c', 'timestamp': '2026-01-22T17:45:55.511922', 'resource_metadata': {'display_name': 'tempest-server-test-1515844450', 'name': 'tap9df171fe-9c', 'instance_id': 'cff4e488-d0bf-4df8-900e-e9f61f4309ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:78:62:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9df171fe-9c'}, 'message_id': '33e52edc-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.224903892, 'message_signature': 'f2fba66d6188948de241efb301101dfa64a4e9fe18178ef127ef9288de6499f5'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7340, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000043-12ded08f-c0e2-4d03-967b-3436626dbbb2-tapd562087b-b5', 'timestamp': '2026-01-22T17:45:55.511922', 'resource_metadata': {'display_name': 'tempest-server-test-1658329936', 'name': 'tapd562087b-b5', 'instance_id': '12ded08f-c0e2-4d03-967b-3436626dbbb2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:ad:02', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd562087b-b5'}, 'message_id': '33e53a12-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.228083277, 'message_signature': '38233995ea239415e44dc3a6d7a6499e5d5aa4e22bbb9623b986d7034d20c5a1'}]}, 'timestamp': '2026-01-22 17:45:55.512510', '_unique_id': 'f1ddfec9626a481aaa055c576d5d49ed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.513 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.516 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.516 12 DEBUG ceilometer.compute.pollsters [-] cff4e488-d0bf-4df8-900e-e9f61f4309ac/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.516 12 DEBUG ceilometer.compute.pollsters [-] 12ded08f-c0e2-4d03-967b-3436626dbbb2/disk.device.write.latency volume: 2744392420 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b816d21b-da09-4a6a-8cf8-cd6ad107b013', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'cff4e488-d0bf-4df8-900e-e9f61f4309ac-vda', 'timestamp': '2026-01-22T17:45:55.516433', 'resource_metadata': {'display_name': 'tempest-server-test-1515844450', 'name': 'instance-00000044', 'instance_id': 'cff4e488-d0bf-4df8-900e-e9f61f4309ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '33e5df76-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.232492695, 'message_signature': 'dbc9742dee831669e66f57da64184f2783c060b745ef0a0762c7573922cc8360'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2744392420, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '12ded08f-c0e2-4d03-967b-3436626dbbb2-vda', 'timestamp': '2026-01-22T17:45:55.516433', 'resource_metadata': {'display_name': 'tempest-server-test-1658329936', 'name': 'instance-00000043', 'instance_id': '12ded08f-c0e2-4d03-967b-3436626dbbb2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '33e5ee12-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.245617337, 'message_signature': 'ee24f7fe8a6cf2cf60c1c339b0fc09d56d4b1b70cdac700eaa0bcbdb67af6297'}]}, 'timestamp': '2026-01-22 17:45:55.517122', '_unique_id': '425391ef5f5c43afbeaeebb2d19d3211'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.519 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.521 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.539 12 DEBUG ceilometer.compute.pollsters [-] cff4e488-d0bf-4df8-900e-e9f61f4309ac/cpu volume: 6120000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.556 12 DEBUG ceilometer.compute.pollsters [-] 12ded08f-c0e2-4d03-967b-3436626dbbb2/cpu volume: 11690000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '456017bd-b5d9-4f6e-a8c5-4a54e8455e1b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6120000000, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'cff4e488-d0bf-4df8-900e-e9f61f4309ac', 'timestamp': '2026-01-22T17:45:55.521500', 'resource_metadata': {'display_name': 'tempest-server-test-1515844450', 'name': 'instance-00000044', 'instance_id': 'cff4e488-d0bf-4df8-900e-e9f61f4309ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '33e97c80-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.299670395, 'message_signature': '86301a7f8ec37c96fe3049d17381f02e9da6653117d19041d4edc301c2c48d33'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11690000000, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '12ded08f-c0e2-4d03-967b-3436626dbbb2', 
'timestamp': '2026-01-22T17:45:55.521500', 'resource_metadata': {'display_name': 'tempest-server-test-1658329936', 'name': 'instance-00000043', 'instance_id': '12ded08f-c0e2-4d03-967b-3436626dbbb2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '33ec1c6a-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.316843585, 'message_signature': '167d67b72f3a19cc02633fb0136bb34c2f4bc4a2cde254d0b24e9745f9d73b09'}]}, 'timestamp': '2026-01-22 17:45:55.557776', '_unique_id': '04689cc6f4f14b8fb7e829fcc822f526'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.559 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.562 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.562 12 DEBUG ceilometer.compute.pollsters [-] cff4e488-d0bf-4df8-900e-e9f61f4309ac/disk.device.read.latency volume: 97820649 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.562 12 DEBUG ceilometer.compute.pollsters [-] 12ded08f-c0e2-4d03-967b-3436626dbbb2/disk.device.read.latency volume: 182802253 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1350d53-a8fd-402e-8fcc-e7d61764cbf0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 97820649, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'cff4e488-d0bf-4df8-900e-e9f61f4309ac-vda', 'timestamp': '2026-01-22T17:45:55.562110', 'resource_metadata': {'display_name': 'tempest-server-test-1515844450', 'name': 'instance-00000044', 'instance_id': 'cff4e488-d0bf-4df8-900e-e9f61f4309ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '33ecd754-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.232492695, 'message_signature': '266fc3b9d4f2b2f1a262a4625e90231180dd02027f101071350ee1a0e2f96ccd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 182802253, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 
'12ded08f-c0e2-4d03-967b-3436626dbbb2-vda', 'timestamp': '2026-01-22T17:45:55.562110', 'resource_metadata': {'display_name': 'tempest-server-test-1658329936', 'name': 'instance-00000043', 'instance_id': '12ded08f-c0e2-4d03-967b-3436626dbbb2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '33ece23a-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.245617337, 'message_signature': '702d6997dec972f004b4947d104f2b775f6a62e7dabc8b659800b2c178cfdfd7'}]}, 'timestamp': '2026-01-22 17:45:55.562707', '_unique_id': 'f6974e0c99da4c26ae39303eb2b6b1ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.565 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.572 12 DEBUG ceilometer.compute.pollsters [-] cff4e488-d0bf-4df8-900e-e9f61f4309ac/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.580 12 DEBUG ceilometer.compute.pollsters [-] 12ded08f-c0e2-4d03-967b-3436626dbbb2/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d1f972a-a97a-4f50-9a8c-3088404e9aca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'cff4e488-d0bf-4df8-900e-e9f61f4309ac-vda', 'timestamp': '2026-01-22T17:45:55.565922', 'resource_metadata': {'display_name': 'tempest-server-test-1515844450', 'name': 'instance-00000044', 'instance_id': 'cff4e488-d0bf-4df8-900e-e9f61f4309ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '33ee8e0a-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.32635986, 'message_signature': '7391178d75270686c252bb6e56dbc0f8dc548390a6957a4ea5acc3b2c7db2b3e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 
'12ded08f-c0e2-4d03-967b-3436626dbbb2-vda', 'timestamp': '2026-01-22T17:45:55.565922', 'resource_metadata': {'display_name': 'tempest-server-test-1658329936', 'name': 'instance-00000043', 'instance_id': '12ded08f-c0e2-4d03-967b-3436626dbbb2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '33efa65a-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.334429226, 'message_signature': 'd35979933d25298ad3ccaf83d5abb37a36c6058da6654bb319804d1effe951a9'}]}, 'timestamp': '2026-01-22 17:45:55.580885', '_unique_id': '13e98fca088c44719f67296bbf15ebc0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.582 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.583 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.583 12 DEBUG ceilometer.compute.pollsters [-] cff4e488-d0bf-4df8-900e-e9f61f4309ac/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 DEBUG ceilometer.compute.pollsters [-] 12ded08f-c0e2-4d03-967b-3436626dbbb2/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7cef8a00-94dd-441a-8174-c8eb8ed1599b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000044-cff4e488-d0bf-4df8-900e-e9f61f4309ac-tap9df171fe-9c', 'timestamp': '2026-01-22T17:45:55.583774', 'resource_metadata': {'display_name': 'tempest-server-test-1515844450', 'name': 'tap9df171fe-9c', 'instance_id': 'cff4e488-d0bf-4df8-900e-e9f61f4309ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:78:62:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9df171fe-9c'}, 'message_id': '33f026ca-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.224903892, 'message_signature': '38c81ea1f9cf95b98aa850abb38e8ed0f351a0f1066e9bd4267fbab911d62979'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000043-12ded08f-c0e2-4d03-967b-3436626dbbb2-tapd562087b-b5', 'timestamp': '2026-01-22T17:45:55.583774', 'resource_metadata': {'display_name': 'tempest-server-test-1658329936', 'name': 'tapd562087b-b5', 'instance_id': '12ded08f-c0e2-4d03-967b-3436626dbbb2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:ad:02', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd562087b-b5'}, 'message_id': '33f02fee-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.228083277, 'message_signature': '5e0d2e6071a18a03e3c71548bd291a251ee2dc3c6ffbb2177237454b19eca956'}]}, 'timestamp': '2026-01-22 17:45:55.584308', '_unique_id': '1f0a7ca248ca448986ed2c66f791d1aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.584 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.585 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.586 12 DEBUG ceilometer.compute.pollsters [-] cff4e488-d0bf-4df8-900e-e9f61f4309ac/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.586 12 DEBUG ceilometer.compute.pollsters [-] 12ded08f-c0e2-4d03-967b-3436626dbbb2/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '529a5890-a725-4e2e-aced-f419ce03d325', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000044-cff4e488-d0bf-4df8-900e-e9f61f4309ac-tap9df171fe-9c', 'timestamp': '2026-01-22T17:45:55.586074', 'resource_metadata': {'display_name': 'tempest-server-test-1515844450', 'name': 'tap9df171fe-9c', 'instance_id': 'cff4e488-d0bf-4df8-900e-e9f61f4309ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:78:62:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9df171fe-9c'}, 'message_id': '33f07db4-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.224903892, 'message_signature': '3f9e381231d3cc2b1ae278b79ebd4a82d3e99f388cfbede1511869e54034914a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000043-12ded08f-c0e2-4d03-967b-3436626dbbb2-tapd562087b-b5', 'timestamp': '2026-01-22T17:45:55.586074', 'resource_metadata': {'display_name': 'tempest-server-test-1658329936', 'name': 'tapd562087b-b5', 'instance_id': '12ded08f-c0e2-4d03-967b-3436626dbbb2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:ad:02', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd562087b-b5'}, 'message_id': '33f08610-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.228083277, 'message_signature': '1d64414acdd29aa858d722e265be0fd719b188e80a8899cff7dd61d0cb1aa68c'}]}, 'timestamp': '2026-01-22 17:45:55.586508', '_unique_id': '8b21fa30ecb146e492e2d25d44848bd7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.587 12 DEBUG ceilometer.compute.pollsters [-] cff4e488-d0bf-4df8-900e-e9f61f4309ac/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 DEBUG ceilometer.compute.pollsters [-] 12ded08f-c0e2-4d03-967b-3436626dbbb2/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f6da755-fb4c-4e51-9661-d00ccf2ba294', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000044-cff4e488-d0bf-4df8-900e-e9f61f4309ac-tap9df171fe-9c', 'timestamp': '2026-01-22T17:45:55.587827', 'resource_metadata': {'display_name': 'tempest-server-test-1515844450', 'name': 'tap9df171fe-9c', 'instance_id': 'cff4e488-d0bf-4df8-900e-e9f61f4309ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:78:62:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9df171fe-9c'}, 'message_id': '33f0c210-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.224903892, 'message_signature': '34178a42496fee3d07235ce3f55eda80ea8c74a32d90df47364459c9416c7f13'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000043-12ded08f-c0e2-4d03-967b-3436626dbbb2-tapd562087b-b5', 'timestamp': '2026-01-22T17:45:55.587827', 'resource_metadata': {'display_name': 'tempest-server-test-1658329936', 'name': 'tapd562087b-b5', 'instance_id': '12ded08f-c0e2-4d03-967b-3436626dbbb2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:ad:02', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd562087b-b5'}, 'message_id': '33f0cc74-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.228083277, 'message_signature': '738760310c8c7dbdd225fe75742787b61ed0922e143ad158df505e9982e46b29'}]}, 'timestamp': '2026-01-22 17:45:55.588312', '_unique_id': 'cd7ce742591243ec8e06c77fe920cccc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.588 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.590 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.590 12 DEBUG ceilometer.compute.pollsters [-] cff4e488-d0bf-4df8-900e-e9f61f4309ac/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.590 12 DEBUG ceilometer.compute.pollsters [-] 12ded08f-c0e2-4d03-967b-3436626dbbb2/disk.device.allocation volume: 30220288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '56ebaf4d-d86d-4693-ad01-c226b3722960', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'cff4e488-d0bf-4df8-900e-e9f61f4309ac-vda', 'timestamp': '2026-01-22T17:45:55.590479', 'resource_metadata': {'display_name': 'tempest-server-test-1515844450', 'name': 'instance-00000044', 'instance_id': 'cff4e488-d0bf-4df8-900e-e9f61f4309ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '33f130b0-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.32635986, 'message_signature': '5359594d5e7fb11c37889a48b9850947b709a5673aff5be32e18fdd6df8bc9f6'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30220288, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '12ded08f-c0e2-4d03-967b-3436626dbbb2-vda', 'timestamp': '2026-01-22T17:45:55.590479', 'resource_metadata': {'display_name': 'tempest-server-test-1658329936', 'name': 'instance-00000043', 'instance_id': '12ded08f-c0e2-4d03-967b-3436626dbbb2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '33f138f8-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.334429226, 'message_signature': '5a456f4667f5d15a08f9b04160966869f7746123ad86b049f99968ee5f8cc3a4'}]}, 'timestamp': '2026-01-22 17:45:55.591083', '_unique_id': '0179d2e6074a4811b4780d095ed7617d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.591 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.593 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.593 12 DEBUG ceilometer.compute.pollsters [-] cff4e488-d0bf-4df8-900e-e9f61f4309ac/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.594 12 DEBUG ceilometer.compute.pollsters [-] 12ded08f-c0e2-4d03-967b-3436626dbbb2/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2627a10f-a317-4970-bc09-4c6cf2176825', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000044-cff4e488-d0bf-4df8-900e-e9f61f4309ac-tap9df171fe-9c', 'timestamp': '2026-01-22T17:45:55.593670', 'resource_metadata': {'display_name': 'tempest-server-test-1515844450', 'name': 'tap9df171fe-9c', 'instance_id': 'cff4e488-d0bf-4df8-900e-e9f61f4309ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:78:62:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9df171fe-9c'}, 'message_id': '33f1b76a-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.224903892, 'message_signature': '73a48021428ddfda0908324cd3d533014d2c6a196b1d5f9b78dfe0f6dbe1281b'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000043-12ded08f-c0e2-4d03-967b-3436626dbbb2-tapd562087b-b5', 'timestamp': '2026-01-22T17:45:55.593670', 'resource_metadata': {'display_name': 'tempest-server-test-1658329936', 'name': 'tapd562087b-b5', 'instance_id': '12ded08f-c0e2-4d03-967b-3436626dbbb2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:ad:02', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd562087b-b5'}, 'message_id': '33f1c214-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.228083277, 'message_signature': 'ec8c378b21dee6eccb481b6a89ea34444f1f6732fea6f6bd1ed83e9f86db525a'}]}, 'timestamp': '2026-01-22 17:45:55.594600', '_unique_id': '81fc558758324fcd88aa74d48bafbc3e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.595 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.596 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.596 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.597 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-1515844450>, <NovaLikeServer: tempest-server-test-1658329936>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1515844450>, <NovaLikeServer: tempest-server-test-1658329936>]
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.597 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.598 12 DEBUG ceilometer.compute.pollsters [-] cff4e488-d0bf-4df8-900e-e9f61f4309ac/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.598 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance cff4e488-d0bf-4df8-900e-e9f61f4309ac: ceilometer.compute.pollsters.NoVolumeException
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.598 12 DEBUG ceilometer.compute.pollsters [-] 12ded08f-c0e2-4d03-967b-3436626dbbb2/memory.usage volume: 42.0703125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f4c173c-7fa0-47b5-8981-63837c3a1a5e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.0703125, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '12ded08f-c0e2-4d03-967b-3436626dbbb2', 'timestamp': '2026-01-22T17:45:55.598072', 'resource_metadata': {'display_name': 'tempest-server-test-1658329936', 'name': 'instance-00000043', 'instance_id': '12ded08f-c0e2-4d03-967b-3436626dbbb2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '33f2650c-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.316843585, 'message_signature': 'f09896a9e4547ba9aa35f0a8739dae3e951787e728252cf359eb5d9c7b100f9f'}]}, 'timestamp': '2026-01-22 17:45:55.598773', '_unique_id': '4f61f3c06fec44de8e30b2868fcc4ec3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.599 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.600 12 DEBUG ceilometer.compute.pollsters [-] cff4e488-d0bf-4df8-900e-e9f61f4309ac/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.601 12 DEBUG ceilometer.compute.pollsters [-] 12ded08f-c0e2-4d03-967b-3436626dbbb2/network.outgoing.bytes volume: 11638 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e7fff159-d86b-43b8-a25c-f786452c01a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000044-cff4e488-d0bf-4df8-900e-e9f61f4309ac-tap9df171fe-9c', 'timestamp': '2026-01-22T17:45:55.600337', 'resource_metadata': {'display_name': 'tempest-server-test-1515844450', 'name': 'tap9df171fe-9c', 'instance_id': 'cff4e488-d0bf-4df8-900e-e9f61f4309ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:78:62:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9df171fe-9c'}, 'message_id': '33f2c5ce-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.224903892, 'message_signature': '93a1271db31da6bc8b580221ca0d87f56d0dd60b8904cd8077757fea3adfd06a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11638, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000043-12ded08f-c0e2-4d03-967b-3436626dbbb2-tapd562087b-b5', 'timestamp': '2026-01-22T17:45:55.600337', 'resource_metadata': {'display_name': 'tempest-server-test-1658329936', 'name': 'tapd562087b-b5', 'instance_id': '12ded08f-c0e2-4d03-967b-3436626dbbb2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:ad:02', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd562087b-b5'}, 'message_id': '33f2ed60-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.228083277, 'message_signature': 'a17a3410834ecc5e8e3781fc0aeb76b2e172264a94e07da8b9dc321a7c0c33c4'}]}, 'timestamp': '2026-01-22 17:45:55.602725', '_unique_id': '2982b643e207497297188a7d80876346'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.603 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.604 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.604 12 DEBUG ceilometer.compute.pollsters [-] cff4e488-d0bf-4df8-900e-e9f61f4309ac/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.605 12 DEBUG ceilometer.compute.pollsters [-] 12ded08f-c0e2-4d03-967b-3436626dbbb2/disk.device.write.requests volume: 348 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8d935a2f-9c5a-407e-b8a8-ecb988e8daf9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'cff4e488-d0bf-4df8-900e-e9f61f4309ac-vda', 'timestamp': '2026-01-22T17:45:55.604552', 'resource_metadata': {'display_name': 'tempest-server-test-1515844450', 'name': 'instance-00000044', 'instance_id': 'cff4e488-d0bf-4df8-900e-e9f61f4309ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '33f36614-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.232492695, 'message_signature': '4b19aeb9098be79d75d2a9c70027e98b753e476e2c3ea7efaa3a6690069f9e47'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 348, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 
'resource_id': '12ded08f-c0e2-4d03-967b-3436626dbbb2-vda', 'timestamp': '2026-01-22T17:45:55.604552', 'resource_metadata': {'display_name': 'tempest-server-test-1658329936', 'name': 'instance-00000043', 'instance_id': '12ded08f-c0e2-4d03-967b-3436626dbbb2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '33f39652-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.245617337, 'message_signature': '2c5ff6095a54f08371f4b71295e9d04a22382dfe245d0c32f034f675069e8b20'}]}, 'timestamp': '2026-01-22 17:45:55.606641', '_unique_id': '408ab80e634d4490b2d3c10e07d0ce5e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.607 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.608 12 DEBUG ceilometer.compute.pollsters [-] cff4e488-d0bf-4df8-900e-e9f61f4309ac/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.608 12 DEBUG ceilometer.compute.pollsters [-] 12ded08f-c0e2-4d03-967b-3436626dbbb2/network.outgoing.packets volume: 132 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e587f0e4-6dcd-4a49-80ab-7786be7021d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000044-cff4e488-d0bf-4df8-900e-e9f61f4309ac-tap9df171fe-9c', 'timestamp': '2026-01-22T17:45:55.608060', 'resource_metadata': {'display_name': 'tempest-server-test-1515844450', 'name': 'tap9df171fe-9c', 'instance_id': 'cff4e488-d0bf-4df8-900e-e9f61f4309ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:78:62:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9df171fe-9c'}, 'message_id': '33f3d9fa-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.224903892, 'message_signature': 'f991528c2781b92e97bbc4a687e87488d208bae6337c6208544f2c7168311da9'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 132, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000043-12ded08f-c0e2-4d03-967b-3436626dbbb2-tapd562087b-b5', 'timestamp': '2026-01-22T17:45:55.608060', 'resource_metadata': {'display_name': 'tempest-server-test-1658329936', 'name': 'tapd562087b-b5', 'instance_id': '12ded08f-c0e2-4d03-967b-3436626dbbb2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:ad:02', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd562087b-b5'}, 'message_id': '33f3e544-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.228083277, 'message_signature': '00b2b2edaed1d600e9b6c503d9fec90fae63368b8c30a8b19bf2dd3436c80252'}]}, 'timestamp': '2026-01-22 17:45:55.608691', '_unique_id': 'ec9f4258b8f6401a88597e30241dba93'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.609 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.610 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.610 12 DEBUG ceilometer.compute.pollsters [-] cff4e488-d0bf-4df8-900e-e9f61f4309ac/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.610 12 DEBUG ceilometer.compute.pollsters [-] 12ded08f-c0e2-4d03-967b-3436626dbbb2/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5c957b03-bdfc-43af-8a33-b4fab9e9d918', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000044-cff4e488-d0bf-4df8-900e-e9f61f4309ac-tap9df171fe-9c', 'timestamp': '2026-01-22T17:45:55.610285', 'resource_metadata': {'display_name': 'tempest-server-test-1515844450', 'name': 'tap9df171fe-9c', 'instance_id': 'cff4e488-d0bf-4df8-900e-e9f61f4309ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:78:62:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9df171fe-9c'}, 'message_id': '33f430c6-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.224903892, 'message_signature': 'bd8e340e3934f2194ef0d93b277d1ab3fbdafb4528bb80937b0e1b24fb079bc2'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000043-12ded08f-c0e2-4d03-967b-3436626dbbb2-tapd562087b-b5', 'timestamp': '2026-01-22T17:45:55.610285', 'resource_metadata': {'display_name': 'tempest-server-test-1658329936', 'name': 'tapd562087b-b5', 'instance_id': '12ded08f-c0e2-4d03-967b-3436626dbbb2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:ad:02', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd562087b-b5'}, 'message_id': '33f43d14-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.228083277, 'message_signature': 'ce4dcf9e1e64e85ab4fb5f1f529a40b935927c2e5a33bdf766bb2b3402b7da81'}]}, 'timestamp': '2026-01-22 17:45:55.610898', '_unique_id': '1de489d86eed44e9bfd8ad579242d43a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.611 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.612 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.612 12 DEBUG ceilometer.compute.pollsters [-] cff4e488-d0bf-4df8-900e-e9f61f4309ac/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.612 12 DEBUG ceilometer.compute.pollsters [-] 12ded08f-c0e2-4d03-967b-3436626dbbb2/disk.device.write.bytes volume: 73228288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b7ccce54-b799-49a7-8a25-0ae0f73d74d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'cff4e488-d0bf-4df8-900e-e9f61f4309ac-vda', 'timestamp': '2026-01-22T17:45:55.612409', 'resource_metadata': {'display_name': 'tempest-server-test-1515844450', 'name': 'instance-00000044', 'instance_id': 'cff4e488-d0bf-4df8-900e-e9f61f4309ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '33f48364-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.232492695, 'message_signature': '23d489bb3f62d06653270f24df3718c643c803b72615810b08cf35c39e58373e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73228288, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 
'12ded08f-c0e2-4d03-967b-3436626dbbb2-vda', 'timestamp': '2026-01-22T17:45:55.612409', 'resource_metadata': {'display_name': 'tempest-server-test-1658329936', 'name': 'instance-00000043', 'instance_id': '12ded08f-c0e2-4d03-967b-3436626dbbb2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '33f48fda-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.245617337, 'message_signature': '3b8dfa30adc0c53155b0e7795891def1bd7c849cc0615f57b0ab859fd0adc36e'}]}, 'timestamp': '2026-01-22 17:45:55.613052', '_unique_id': '21dbbad378774c10a682066a925863be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.613 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.614 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.614 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.614 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-1515844450>, <NovaLikeServer: tempest-server-test-1658329936>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1515844450>, <NovaLikeServer: tempest-server-test-1658329936>]
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.614 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.614 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.615 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1515844450>, <NovaLikeServer: tempest-server-test-1658329936>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1515844450>, <NovaLikeServer: tempest-server-test-1658329936>]
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.615 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.615 12 DEBUG ceilometer.compute.pollsters [-] cff4e488-d0bf-4df8-900e-e9f61f4309ac/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.615 12 DEBUG ceilometer.compute.pollsters [-] 12ded08f-c0e2-4d03-967b-3436626dbbb2/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '855c2e15-081f-441e-a39d-62d95b869f24', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000044-cff4e488-d0bf-4df8-900e-e9f61f4309ac-tap9df171fe-9c', 'timestamp': '2026-01-22T17:45:55.615382', 'resource_metadata': {'display_name': 'tempest-server-test-1515844450', 'name': 'tap9df171fe-9c', 'instance_id': 'cff4e488-d0bf-4df8-900e-e9f61f4309ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:78:62:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9df171fe-9c'}, 'message_id': '33f4f830-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.224903892, 'message_signature': 'ec10668368f857ba57e2f11c72e2ce770d6dfccd157aef82c4f42463c724f96b'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000043-12ded08f-c0e2-4d03-967b-3436626dbbb2-tapd562087b-b5', 'timestamp': '2026-01-22T17:45:55.615382', 'resource_metadata': {'display_name': 'tempest-server-test-1658329936', 'name': 'tapd562087b-b5', 'instance_id': '12ded08f-c0e2-4d03-967b-3436626dbbb2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:ad:02', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd562087b-b5'}, 'message_id': '33f504a6-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.228083277, 'message_signature': '137869cffb57eed074dbfc154efef0dd6413fd8ca38c6fa7ddb15954daa73425'}]}, 'timestamp': '2026-01-22 17:45:55.616007', '_unique_id': '10338bee510247a6ad04aac235af9fa3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.616 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.617 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.617 12 DEBUG ceilometer.compute.pollsters [-] cff4e488-d0bf-4df8-900e-e9f61f4309ac/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.617 12 DEBUG ceilometer.compute.pollsters [-] 12ded08f-c0e2-4d03-967b-3436626dbbb2/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0b28f48-6786-4a69-b64a-e8573c82cabf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'cff4e488-d0bf-4df8-900e-e9f61f4309ac-vda', 'timestamp': '2026-01-22T17:45:55.617246', 'resource_metadata': {'display_name': 'tempest-server-test-1515844450', 'name': 'instance-00000044', 'instance_id': 'cff4e488-d0bf-4df8-900e-e9f61f4309ac', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '33f53f48-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.32635986, 'message_signature': 'f022f7bb321b3e151130444e08578fecf202af29213c8009410a968fe51e2855'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 
'12ded08f-c0e2-4d03-967b-3436626dbbb2-vda', 'timestamp': '2026-01-22T17:45:55.617246', 'resource_metadata': {'display_name': 'tempest-server-test-1658329936', 'name': 'instance-00000043', 'instance_id': '12ded08f-c0e2-4d03-967b-3436626dbbb2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '33f5474a-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6319.334429226, 'message_signature': '98615aab9fee450a7d1caed43f27f4b9da1aa7789e4fd0a55a6e0e69e7a50a45'}]}, 'timestamp': '2026-01-22 17:45:55.617693', '_unique_id': '9e1d1dbd5dde4f0b80cb1c5f00d12c54'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:45:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:45:55.618 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:45:56 compute-0 nova_compute[183075]: 2026-01-22 17:45:56.671 183079 INFO nova.compute.manager [None req-29ee4f6a-77e2-47ea-854f-3bfabe486e5f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Get console output
Jan 22 17:45:57 compute-0 podman[239867]: 2026-01-22 17:45:57.362547485 +0000 UTC m=+0.062808623 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, vcs-type=git, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Jan 22 17:45:57 compute-0 podman[239866]: 2026-01-22 17:45:57.382676134 +0000 UTC m=+0.085807369 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 22 17:45:57 compute-0 podman[239865]: 2026-01-22 17:45:57.458536246 +0000 UTC m=+0.163747687 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 17:45:58 compute-0 nova_compute[183075]: 2026-01-22 17:45:58.383 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:58 compute-0 nova_compute[183075]: 2026-01-22 17:45:58.602 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:59 compute-0 nova_compute[183075]: 2026-01-22 17:45:59.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:46:01 compute-0 ovn_controller[95372]: 2026-01-22T17:46:01Z|00138|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:78:62:dd 10.100.0.5
Jan 22 17:46:01 compute-0 ovn_controller[95372]: 2026-01-22T17:46:01Z|00139|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:78:62:dd 10.100.0.5
Jan 22 17:46:01 compute-0 nova_compute[183075]: 2026-01-22 17:46:01.937 183079 INFO nova.compute.manager [None req-12caa981-e700-43ab-99c8-98b9d5236d43 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Get console output
Jan 22 17:46:01 compute-0 nova_compute[183075]: 2026-01-22 17:46:01.942 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:46:03 compute-0 podman[239946]: 2026-01-22 17:46:03.363883583 +0000 UTC m=+0.065352612 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:46:03 compute-0 nova_compute[183075]: 2026-01-22 17:46:03.385 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:03 compute-0 nova_compute[183075]: 2026-01-22 17:46:03.603 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:06.316 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:46:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:06.317 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:46:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:46:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:46:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:46:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:46:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:46:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:46:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:46:07 compute-0 nova_compute[183075]: 2026-01-22 17:46:07.070 183079 INFO nova.compute.manager [None req-2d98068b-cede-44ea-8160-63777e835f58 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Get console output
Jan 22 17:46:07 compute-0 nova_compute[183075]: 2026-01-22 17:46:07.076 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:46:08 compute-0 nova_compute[183075]: 2026-01-22 17:46:08.387 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.438 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.439 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 2.1218367
Jan 22 17:46:08 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239511]: 10.100.0.5:34354 [22/Jan/2026:17:46:06.316] listener listener/metadata 0/0/0/2123/2123 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.446 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.446 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.467 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.468 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0214753
Jan 22 17:46:08 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239511]: 10.100.0.5:34356 [22/Jan/2026:17:46:08.445] listener listener/metadata 0/0/0/22/22 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.472 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.472 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.484 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.484 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0118589
Jan 22 17:46:08 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239511]: 10.100.0.5:34372 [22/Jan/2026:17:46:08.472] listener listener/metadata 0/0/0/12/12 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.489 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.490 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.503 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.503 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0138116
Jan 22 17:46:08 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239511]: 10.100.0.5:34388 [22/Jan/2026:17:46:08.489] listener listener/metadata 0/0/0/14/14 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.509 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.510 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.525 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:46:08 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239511]: 10.100.0.5:34404 [22/Jan/2026:17:46:08.508] listener listener/metadata 0/0/0/17/17 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.526 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0159998
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.531 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.531 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.544 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:46:08 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239511]: 10.100.0.5:34416 [22/Jan/2026:17:46:08.530] listener listener/metadata 0/0/0/14/14 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.545 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0133281
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.549 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.549 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.562 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:46:08 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239511]: 10.100.0.5:34424 [22/Jan/2026:17:46:08.549] listener listener/metadata 0/0/0/13/13 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.562 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0128779
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.568 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.569 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.590 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.590 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0213602
Jan 22 17:46:08 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239511]: 10.100.0.5:34434 [22/Jan/2026:17:46:08.568] listener listener/metadata 0/0/0/22/22 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.596 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.596 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:46:08 compute-0 nova_compute[183075]: 2026-01-22 17:46:08.604 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.610 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:46:08 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239511]: 10.100.0.5:34436 [22/Jan/2026:17:46:08.595] listener listener/metadata 0/0/0/15/15 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.611 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0148852
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.616 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.617 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.634 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:46:08 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239511]: 10.100.0.5:34440 [22/Jan/2026:17:46:08.616] listener listener/metadata 0/0/0/18/18 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.634 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0170937
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.640 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.641 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:46:08 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239511]: 10.100.0.5:34448 [22/Jan/2026:17:46:08.640] listener listener/metadata 0/0/0/19/19 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.659 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0184543
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.668 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.669 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.684 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:46:08 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239511]: 10.100.0.5:34458 [22/Jan/2026:17:46:08.668] listener listener/metadata 0/0/0/16/16 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.684 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0155802
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.689 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.689 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.703 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.704 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0146577
Jan 22 17:46:08 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239511]: 10.100.0.5:34460 [22/Jan/2026:17:46:08.688] listener listener/metadata 0/0/0/15/15 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.708 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.709 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.727 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:46:08 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239511]: 10.100.0.5:34476 [22/Jan/2026:17:46:08.708] listener listener/metadata 0/0/0/19/19 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.728 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0187469
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.734 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.734 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.748 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.748 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0141904
Jan 22 17:46:08 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239511]: 10.100.0.5:34488 [22/Jan/2026:17:46:08.733] listener listener/metadata 0/0/0/15/15 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.754 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.755 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.771 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:46:08 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239511]: 10.100.0.5:34496 [22/Jan/2026:17:46:08.754] listener listener/metadata 0/0/0/17/17 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:46:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:08.771 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0163326
Jan 22 17:46:12 compute-0 nova_compute[183075]: 2026-01-22 17:46:12.224 183079 INFO nova.compute.manager [None req-0f690270-2108-40a9-a582-4502d4f734d3 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Get console output
Jan 22 17:46:12 compute-0 nova_compute[183075]: 2026-01-22 17:46:12.229 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:46:13 compute-0 nova_compute[183075]: 2026-01-22 17:46:13.389 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:13 compute-0 nova_compute[183075]: 2026-01-22 17:46:13.606 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:14 compute-0 podman[239966]: 2026-01-22 17:46:14.355076057 +0000 UTC m=+0.064472838 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:46:17 compute-0 nova_compute[183075]: 2026-01-22 17:46:17.345 183079 INFO nova.compute.manager [None req-2ec3e86d-d21e-4beb-8ba0-b9ab281b50de 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Get console output
Jan 22 17:46:17 compute-0 nova_compute[183075]: 2026-01-22 17:46:17.351 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:46:18 compute-0 podman[239991]: 2026-01-22 17:46:18.338448781 +0000 UTC m=+0.052885927 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 17:46:18 compute-0 nova_compute[183075]: 2026-01-22 17:46:18.390 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:18 compute-0 ovn_controller[95372]: 2026-01-22T17:46:18Z|00773|memory_trim|INFO|Detected inactivity (last active 30021 ms ago): trimming memory
Jan 22 17:46:18 compute-0 nova_compute[183075]: 2026-01-22 17:46:18.608 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:22 compute-0 nova_compute[183075]: 2026-01-22 17:46:22.629 183079 INFO nova.compute.manager [None req-5a4a701a-4b56-43ea-99f3-f058321e9031 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Get console output
Jan 22 17:46:22 compute-0 nova_compute[183075]: 2026-01-22 17:46:22.635 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:46:23 compute-0 nova_compute[183075]: 2026-01-22 17:46:23.392 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:23 compute-0 nova_compute[183075]: 2026-01-22 17:46:23.610 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:27 compute-0 nova_compute[183075]: 2026-01-22 17:46:27.850 183079 INFO nova.compute.manager [None req-079f8804-26af-45c6-938a-e01ed3b58570 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Get console output
Jan 22 17:46:27 compute-0 nova_compute[183075]: 2026-01-22 17:46:27.853 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:46:28 compute-0 podman[240017]: 2026-01-22 17:46:28.364417838 +0000 UTC m=+0.062218808 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350, architecture=x86_64, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter)
Jan 22 17:46:28 compute-0 podman[240016]: 2026-01-22 17:46:28.365324962 +0000 UTC m=+0.052838047 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 22 17:46:28 compute-0 podman[240015]: 2026-01-22 17:46:28.387451165 +0000 UTC m=+0.092120959 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.vendor=CentOS, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:46:28 compute-0 nova_compute[183075]: 2026-01-22 17:46:28.395 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:28 compute-0 nova_compute[183075]: 2026-01-22 17:46:28.612 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:31 compute-0 nova_compute[183075]: 2026-01-22 17:46:31.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:46:33 compute-0 nova_compute[183075]: 2026-01-22 17:46:33.396 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:33 compute-0 nova_compute[183075]: 2026-01-22 17:46:33.614 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:34 compute-0 nova_compute[183075]: 2026-01-22 17:46:34.101 183079 INFO nova.compute.manager [None req-8a21f302-63fc-4aa3-9e88-6f36c8897e42 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Get console output
Jan 22 17:46:34 compute-0 nova_compute[183075]: 2026-01-22 17:46:34.107 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:46:34 compute-0 podman[240082]: 2026-01-22 17:46:34.364522863 +0000 UTC m=+0.060717938 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Jan 22 17:46:34 compute-0 nova_compute[183075]: 2026-01-22 17:46:34.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:46:36 compute-0 nova_compute[183075]: 2026-01-22 17:46:36.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:46:38 compute-0 nova_compute[183075]: 2026-01-22 17:46:38.398 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:38 compute-0 nova_compute[183075]: 2026-01-22 17:46:38.616 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:38 compute-0 nova_compute[183075]: 2026-01-22 17:46:38.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:46:39 compute-0 nova_compute[183075]: 2026-01-22 17:46:39.462 183079 INFO nova.compute.manager [None req-5917eb08-f6b6-433f-b266-89f6a1a04fbb 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Get console output
Jan 22 17:46:39 compute-0 nova_compute[183075]: 2026-01-22 17:46:39.466 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:46:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:41.965 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:46:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:41.965 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:46:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:41.966 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:46:43 compute-0 nova_compute[183075]: 2026-01-22 17:46:43.401 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:43 compute-0 nova_compute[183075]: 2026-01-22 17:46:43.618 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:44 compute-0 nova_compute[183075]: 2026-01-22 17:46:44.643 183079 INFO nova.compute.manager [None req-e2f0c34b-806c-4be5-bd4c-c212a3076a10 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Get console output
Jan 22 17:46:44 compute-0 nova_compute[183075]: 2026-01-22 17:46:44.647 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:46:45 compute-0 podman[240102]: 2026-01-22 17:46:45.350990979 +0000 UTC m=+0.055829107 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:46:45 compute-0 nova_compute[183075]: 2026-01-22 17:46:45.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:46:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:46.434 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=51, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=50) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:46:46 compute-0 nova_compute[183075]: 2026-01-22 17:46:46.435 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:46.435 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:46:46 compute-0 nova_compute[183075]: 2026-01-22 17:46:46.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:46:46 compute-0 nova_compute[183075]: 2026-01-22 17:46:46.810 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:46:46 compute-0 nova_compute[183075]: 2026-01-22 17:46:46.811 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:46:46 compute-0 nova_compute[183075]: 2026-01-22 17:46:46.811 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:46:46 compute-0 nova_compute[183075]: 2026-01-22 17:46:46.811 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:46:47 compute-0 nova_compute[183075]: 2026-01-22 17:46:47.278 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cff4e488-d0bf-4df8-900e-e9f61f4309ac/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:46:47 compute-0 nova_compute[183075]: 2026-01-22 17:46:47.336 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cff4e488-d0bf-4df8-900e-e9f61f4309ac/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:46:47 compute-0 nova_compute[183075]: 2026-01-22 17:46:47.337 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cff4e488-d0bf-4df8-900e-e9f61f4309ac/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:46:47 compute-0 nova_compute[183075]: 2026-01-22 17:46:47.393 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cff4e488-d0bf-4df8-900e-e9f61f4309ac/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:46:47 compute-0 nova_compute[183075]: 2026-01-22 17:46:47.399 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/12ded08f-c0e2-4d03-967b-3436626dbbb2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:46:47 compute-0 nova_compute[183075]: 2026-01-22 17:46:47.462 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/12ded08f-c0e2-4d03-967b-3436626dbbb2/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:46:47 compute-0 nova_compute[183075]: 2026-01-22 17:46:47.463 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/12ded08f-c0e2-4d03-967b-3436626dbbb2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:46:47 compute-0 nova_compute[183075]: 2026-01-22 17:46:47.520 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/12ded08f-c0e2-4d03-967b-3436626dbbb2/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:46:47 compute-0 nova_compute[183075]: 2026-01-22 17:46:47.682 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:46:47 compute-0 nova_compute[183075]: 2026-01-22 17:46:47.684 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5400MB free_disk=73.30258560180664GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:46:47 compute-0 nova_compute[183075]: 2026-01-22 17:46:47.684 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:46:47 compute-0 nova_compute[183075]: 2026-01-22 17:46:47.684 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:46:47 compute-0 nova_compute[183075]: 2026-01-22 17:46:47.809 183079 DEBUG nova.compute.manager [req-0ac878db-328a-41f8-b62f-aeafcb337175 req-e6ae49c9-0c70-4a82-9540-5639f7a9090c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Received event network-changed-d562087b-b585-4eeb-9ef3-31bd96de01f4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:46:47 compute-0 nova_compute[183075]: 2026-01-22 17:46:47.809 183079 DEBUG nova.compute.manager [req-0ac878db-328a-41f8-b62f-aeafcb337175 req-e6ae49c9-0c70-4a82-9540-5639f7a9090c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Refreshing instance network info cache due to event network-changed-d562087b-b585-4eeb-9ef3-31bd96de01f4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:46:47 compute-0 nova_compute[183075]: 2026-01-22 17:46:47.810 183079 DEBUG oslo_concurrency.lockutils [req-0ac878db-328a-41f8-b62f-aeafcb337175 req-e6ae49c9-0c70-4a82-9540-5639f7a9090c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-12ded08f-c0e2-4d03-967b-3436626dbbb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:46:47 compute-0 nova_compute[183075]: 2026-01-22 17:46:47.810 183079 DEBUG oslo_concurrency.lockutils [req-0ac878db-328a-41f8-b62f-aeafcb337175 req-e6ae49c9-0c70-4a82-9540-5639f7a9090c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-12ded08f-c0e2-4d03-967b-3436626dbbb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:46:47 compute-0 nova_compute[183075]: 2026-01-22 17:46:47.810 183079 DEBUG nova.network.neutron [req-0ac878db-328a-41f8-b62f-aeafcb337175 req-e6ae49c9-0c70-4a82-9540-5639f7a9090c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Refreshing network info cache for port d562087b-b585-4eeb-9ef3-31bd96de01f4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:46:47 compute-0 nova_compute[183075]: 2026-01-22 17:46:47.898 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 12ded08f-c0e2-4d03-967b-3436626dbbb2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:46:47 compute-0 nova_compute[183075]: 2026-01-22 17:46:47.899 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance cff4e488-d0bf-4df8-900e-e9f61f4309ac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:46:47 compute-0 nova_compute[183075]: 2026-01-22 17:46:47.899 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:46:47 compute-0 nova_compute[183075]: 2026-01-22 17:46:47.899 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:46:47 compute-0 nova_compute[183075]: 2026-01-22 17:46:47.956 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:46:47 compute-0 nova_compute[183075]: 2026-01-22 17:46:47.984 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:46:48 compute-0 nova_compute[183075]: 2026-01-22 17:46:48.030 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:46:48 compute-0 nova_compute[183075]: 2026-01-22 17:46:48.030 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.346s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:46:48 compute-0 nova_compute[183075]: 2026-01-22 17:46:48.403 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:48 compute-0 nova_compute[183075]: 2026-01-22 17:46:48.619 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:49 compute-0 nova_compute[183075]: 2026-01-22 17:46:49.013 183079 DEBUG nova.network.neutron [req-0ac878db-328a-41f8-b62f-aeafcb337175 req-e6ae49c9-0c70-4a82-9540-5639f7a9090c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Updated VIF entry in instance network info cache for port d562087b-b585-4eeb-9ef3-31bd96de01f4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:46:49 compute-0 nova_compute[183075]: 2026-01-22 17:46:49.014 183079 DEBUG nova.network.neutron [req-0ac878db-328a-41f8-b62f-aeafcb337175 req-e6ae49c9-0c70-4a82-9540-5639f7a9090c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Updating instance_info_cache with network_info: [{"id": "d562087b-b585-4eeb-9ef3-31bd96de01f4", "address": "fa:16:3e:8b:ad:02", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd562087b-b5", "ovs_interfaceid": "d562087b-b585-4eeb-9ef3-31bd96de01f4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:46:49 compute-0 nova_compute[183075]: 2026-01-22 17:46:49.031 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:46:49 compute-0 nova_compute[183075]: 2026-01-22 17:46:49.032 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:46:49 compute-0 nova_compute[183075]: 2026-01-22 17:46:49.032 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:46:49 compute-0 nova_compute[183075]: 2026-01-22 17:46:49.035 183079 DEBUG oslo_concurrency.lockutils [req-0ac878db-328a-41f8-b62f-aeafcb337175 req-e6ae49c9-0c70-4a82-9540-5639f7a9090c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-12ded08f-c0e2-4d03-967b-3436626dbbb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:46:49 compute-0 nova_compute[183075]: 2026-01-22 17:46:49.181 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-12ded08f-c0e2-4d03-967b-3436626dbbb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:46:49 compute-0 nova_compute[183075]: 2026-01-22 17:46:49.181 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-12ded08f-c0e2-4d03-967b-3436626dbbb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:46:49 compute-0 nova_compute[183075]: 2026-01-22 17:46:49.182 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:46:49 compute-0 nova_compute[183075]: 2026-01-22 17:46:49.182 183079 DEBUG nova.objects.instance [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 12ded08f-c0e2-4d03-967b-3436626dbbb2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:46:49 compute-0 podman[240139]: 2026-01-22 17:46:49.341554654 +0000 UTC m=+0.049195919 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:46:49 compute-0 nova_compute[183075]: 2026-01-22 17:46:49.890 183079 DEBUG nova.compute.manager [req-4a370f65-1e30-4e1d-87ec-e7693a17770c req-4efc54d3-ee06-4f05-9c5a-e85b3743c0ce a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Received event network-changed-9df171fe-9cf9-4e33-b0f6-8054d4fb76c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:46:49 compute-0 nova_compute[183075]: 2026-01-22 17:46:49.891 183079 DEBUG nova.compute.manager [req-4a370f65-1e30-4e1d-87ec-e7693a17770c req-4efc54d3-ee06-4f05-9c5a-e85b3743c0ce a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Refreshing instance network info cache due to event network-changed-9df171fe-9cf9-4e33-b0f6-8054d4fb76c3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:46:49 compute-0 nova_compute[183075]: 2026-01-22 17:46:49.891 183079 DEBUG oslo_concurrency.lockutils [req-4a370f65-1e30-4e1d-87ec-e7693a17770c req-4efc54d3-ee06-4f05-9c5a-e85b3743c0ce a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-cff4e488-d0bf-4df8-900e-e9f61f4309ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:46:49 compute-0 nova_compute[183075]: 2026-01-22 17:46:49.892 183079 DEBUG oslo_concurrency.lockutils [req-4a370f65-1e30-4e1d-87ec-e7693a17770c req-4efc54d3-ee06-4f05-9c5a-e85b3743c0ce a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-cff4e488-d0bf-4df8-900e-e9f61f4309ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:46:49 compute-0 nova_compute[183075]: 2026-01-22 17:46:49.892 183079 DEBUG nova.network.neutron [req-4a370f65-1e30-4e1d-87ec-e7693a17770c req-4efc54d3-ee06-4f05-9c5a-e85b3743c0ce a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Refreshing network info cache for port 9df171fe-9cf9-4e33-b0f6-8054d4fb76c3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:46:50 compute-0 nova_compute[183075]: 2026-01-22 17:46:50.248 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Updating instance_info_cache with network_info: [{"id": "d562087b-b585-4eeb-9ef3-31bd96de01f4", "address": "fa:16:3e:8b:ad:02", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd562087b-b5", "ovs_interfaceid": "d562087b-b585-4eeb-9ef3-31bd96de01f4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:46:50 compute-0 nova_compute[183075]: 2026-01-22 17:46:50.264 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-12ded08f-c0e2-4d03-967b-3436626dbbb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:46:50 compute-0 nova_compute[183075]: 2026-01-22 17:46:50.265 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:46:50 compute-0 nova_compute[183075]: 2026-01-22 17:46:50.265 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:46:50 compute-0 nova_compute[183075]: 2026-01-22 17:46:50.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:46:50 compute-0 nova_compute[183075]: 2026-01-22 17:46:50.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.024 183079 DEBUG oslo_concurrency.lockutils [None req-7b44ff3b-c1f7-451a-84ec-da60353afedc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "cff4e488-d0bf-4df8-900e-e9f61f4309ac" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.024 183079 DEBUG oslo_concurrency.lockutils [None req-7b44ff3b-c1f7-451a-84ec-da60353afedc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "cff4e488-d0bf-4df8-900e-e9f61f4309ac" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.025 183079 DEBUG oslo_concurrency.lockutils [None req-7b44ff3b-c1f7-451a-84ec-da60353afedc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "cff4e488-d0bf-4df8-900e-e9f61f4309ac-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.025 183079 DEBUG oslo_concurrency.lockutils [None req-7b44ff3b-c1f7-451a-84ec-da60353afedc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "cff4e488-d0bf-4df8-900e-e9f61f4309ac-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.026 183079 DEBUG oslo_concurrency.lockutils [None req-7b44ff3b-c1f7-451a-84ec-da60353afedc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "cff4e488-d0bf-4df8-900e-e9f61f4309ac-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.027 183079 INFO nova.compute.manager [None req-7b44ff3b-c1f7-451a-84ec-da60353afedc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Terminating instance
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.028 183079 DEBUG nova.compute.manager [None req-7b44ff3b-c1f7-451a-84ec-da60353afedc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:46:51 compute-0 kernel: tap9df171fe-9c (unregistering): left promiscuous mode
Jan 22 17:46:51 compute-0 NetworkManager[55454]: <info>  [1769104011.0560] device (tap9df171fe-9c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:46:51 compute-0 ovn_controller[95372]: 2026-01-22T17:46:51Z|00774|binding|INFO|Releasing lport 9df171fe-9cf9-4e33-b0f6-8054d4fb76c3 from this chassis (sb_readonly=0)
Jan 22 17:46:51 compute-0 ovn_controller[95372]: 2026-01-22T17:46:51Z|00775|binding|INFO|Setting lport 9df171fe-9cf9-4e33-b0f6-8054d4fb76c3 down in Southbound
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.064 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:51 compute-0 ovn_controller[95372]: 2026-01-22T17:46:51Z|00776|binding|INFO|Removing iface tap9df171fe-9c ovn-installed in OVS
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.066 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:51.073 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:62:dd 10.100.0.5'], port_security=['fa:16:3e:78:62:dd 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'cff4e488-d0bf-4df8-900e-e9f61f4309ac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '211927a5-364f-4b76-9332-16507814b750 f983fa10-a163-4ba4-97be-44ccb35ba95f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.224'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=9df171fe-9cf9-4e33-b0f6-8054d4fb76c3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:46:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:51.074 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 9df171fe-9cf9-4e33-b0f6-8054d4fb76c3 in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 unbound from our chassis
Jan 22 17:46:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:51.076 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.077 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:51.093 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[fb9da731-25ee-465a-990f-b2a88c0c88e9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:46:51 compute-0 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d00000044.scope: Deactivated successfully.
Jan 22 17:46:51 compute-0 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d00000044.scope: Consumed 14.395s CPU time.
Jan 22 17:46:51 compute-0 systemd-machined[154382]: Machine qemu-68-instance-00000044 terminated.
Jan 22 17:46:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:51.125 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[cc45a1bf-f379-4ccb-9b96-d56b3744dafd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:46:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:51.129 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[006ce58c-b017-4c6c-b809-236236c232c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:46:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:51.154 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[bfe5fcff-7563-45d9-85ee-185b9ef34a09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:46:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:51.171 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f95ece77-8b93-41a7-acdd-7b98c45d1a21]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 204, 'tx_packets': 106, 'rx_bytes': 17392, 'tx_bytes': 12083, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 204, 'tx_packets': 106, 'rx_bytes': 17392, 'tx_bytes': 12083, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 205], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 625165, 'reachable_time': 16968, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240176, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:46:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:51.187 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b74cc7dc-4621-4af7-85ed-d80709d71be6]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 625177, 'tstamp': 625177}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240177, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 625180, 'tstamp': 625180}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240177, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:46:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:51.189 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.191 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.195 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:51.195 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88ed9213-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:46:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:51.196 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:46:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:51.196 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88ed9213-70, col_values=(('external_ids', {'iface-id': 'da93a32e-eed6-4124-8f08-78b0bfe2cb69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:46:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:51.196 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:46:51 compute-0 kernel: tap9df171fe-9c: entered promiscuous mode
Jan 22 17:46:51 compute-0 kernel: tap9df171fe-9c (unregistering): left promiscuous mode
Jan 22 17:46:51 compute-0 NetworkManager[55454]: <info>  [1769104011.2543] manager: (tap9df171fe-9c): new Tun device (/org/freedesktop/NetworkManager/Devices/310)
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.258 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.269 183079 DEBUG nova.compute.manager [req-c3d03ac2-31ad-4c5c-b913-d87287f78e08 req-72e84fd1-fb3c-4e51-8f53-40657a8bc518 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Received event network-vif-unplugged-9df171fe-9cf9-4e33-b0f6-8054d4fb76c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.270 183079 DEBUG oslo_concurrency.lockutils [req-c3d03ac2-31ad-4c5c-b913-d87287f78e08 req-72e84fd1-fb3c-4e51-8f53-40657a8bc518 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "cff4e488-d0bf-4df8-900e-e9f61f4309ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.270 183079 DEBUG oslo_concurrency.lockutils [req-c3d03ac2-31ad-4c5c-b913-d87287f78e08 req-72e84fd1-fb3c-4e51-8f53-40657a8bc518 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "cff4e488-d0bf-4df8-900e-e9f61f4309ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.270 183079 DEBUG oslo_concurrency.lockutils [req-c3d03ac2-31ad-4c5c-b913-d87287f78e08 req-72e84fd1-fb3c-4e51-8f53-40657a8bc518 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "cff4e488-d0bf-4df8-900e-e9f61f4309ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.270 183079 DEBUG nova.compute.manager [req-c3d03ac2-31ad-4c5c-b913-d87287f78e08 req-72e84fd1-fb3c-4e51-8f53-40657a8bc518 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] No waiting events found dispatching network-vif-unplugged-9df171fe-9cf9-4e33-b0f6-8054d4fb76c3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.270 183079 DEBUG nova.compute.manager [req-c3d03ac2-31ad-4c5c-b913-d87287f78e08 req-72e84fd1-fb3c-4e51-8f53-40657a8bc518 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Received event network-vif-unplugged-9df171fe-9cf9-4e33-b0f6-8054d4fb76c3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.286 183079 INFO nova.virt.libvirt.driver [-] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Instance destroyed successfully.
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.287 183079 DEBUG nova.objects.instance [None req-7b44ff3b-c1f7-451a-84ec-da60353afedc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'resources' on Instance uuid cff4e488-d0bf-4df8-900e-e9f61f4309ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.305 183079 DEBUG nova.virt.libvirt.vif [None req-7b44ff3b-c1f7-451a-84ec-da60353afedc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:45:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1515844450',display_name='tempest-server-test-1515844450',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1515844450',id=68,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:45:49Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-0t85eya6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_h
w_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:45:49Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=cff4e488-d0bf-4df8-900e-e9f61f4309ac,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9df171fe-9cf9-4e33-b0f6-8054d4fb76c3", "address": "fa:16:3e:78:62:dd", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9df171fe-9c", "ovs_interfaceid": "9df171fe-9cf9-4e33-b0f6-8054d4fb76c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.305 183079 DEBUG nova.network.os_vif_util [None req-7b44ff3b-c1f7-451a-84ec-da60353afedc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "9df171fe-9cf9-4e33-b0f6-8054d4fb76c3", "address": "fa:16:3e:78:62:dd", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9df171fe-9c", "ovs_interfaceid": "9df171fe-9cf9-4e33-b0f6-8054d4fb76c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.306 183079 DEBUG nova.network.os_vif_util [None req-7b44ff3b-c1f7-451a-84ec-da60353afedc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:62:dd,bridge_name='br-int',has_traffic_filtering=True,id=9df171fe-9cf9-4e33-b0f6-8054d4fb76c3,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9df171fe-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.306 183079 DEBUG os_vif [None req-7b44ff3b-c1f7-451a-84ec-da60353afedc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:62:dd,bridge_name='br-int',has_traffic_filtering=True,id=9df171fe-9cf9-4e33-b0f6-8054d4fb76c3,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9df171fe-9c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.308 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.308 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9df171fe-9c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.309 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.311 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.314 183079 INFO os_vif [None req-7b44ff3b-c1f7-451a-84ec-da60353afedc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:62:dd,bridge_name='br-int',has_traffic_filtering=True,id=9df171fe-9cf9-4e33-b0f6-8054d4fb76c3,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9df171fe-9c')
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.314 183079 INFO nova.virt.libvirt.driver [None req-7b44ff3b-c1f7-451a-84ec-da60353afedc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Deleting instance files /var/lib/nova/instances/cff4e488-d0bf-4df8-900e-e9f61f4309ac_del
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.315 183079 INFO nova.virt.libvirt.driver [None req-7b44ff3b-c1f7-451a-84ec-da60353afedc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Deletion of /var/lib/nova/instances/cff4e488-d0bf-4df8-900e-e9f61f4309ac_del complete
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.377 183079 INFO nova.compute.manager [None req-7b44ff3b-c1f7-451a-84ec-da60353afedc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Took 0.35 seconds to destroy the instance on the hypervisor.
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.377 183079 DEBUG oslo.service.loopingcall [None req-7b44ff3b-c1f7-451a-84ec-da60353afedc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.378 183079 DEBUG nova.compute.manager [-] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.378 183079 DEBUG nova.network.neutron [-] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.418 183079 DEBUG nova.network.neutron [req-4a370f65-1e30-4e1d-87ec-e7693a17770c req-4efc54d3-ee06-4f05-9c5a-e85b3743c0ce a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Updated VIF entry in instance network info cache for port 9df171fe-9cf9-4e33-b0f6-8054d4fb76c3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.418 183079 DEBUG nova.network.neutron [req-4a370f65-1e30-4e1d-87ec-e7693a17770c req-4efc54d3-ee06-4f05-9c5a-e85b3743c0ce a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Updating instance_info_cache with network_info: [{"id": "9df171fe-9cf9-4e33-b0f6-8054d4fb76c3", "address": "fa:16:3e:78:62:dd", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9df171fe-9c", "ovs_interfaceid": "9df171fe-9cf9-4e33-b0f6-8054d4fb76c3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:46:51 compute-0 nova_compute[183075]: 2026-01-22 17:46:51.440 183079 DEBUG oslo_concurrency.lockutils [req-4a370f65-1e30-4e1d-87ec-e7693a17770c req-4efc54d3-ee06-4f05-9c5a-e85b3743c0ce a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-cff4e488-d0bf-4df8-900e-e9f61f4309ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:46:53 compute-0 nova_compute[183075]: 2026-01-22 17:46:53.264 183079 DEBUG nova.network.neutron [-] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:46:53 compute-0 nova_compute[183075]: 2026-01-22 17:46:53.289 183079 INFO nova.compute.manager [-] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Took 1.91 seconds to deallocate network for instance.
Jan 22 17:46:53 compute-0 nova_compute[183075]: 2026-01-22 17:46:53.307 183079 DEBUG nova.compute.manager [req-53ecd9d2-1d8b-42c2-a5a3-6ffdebd62e31 req-0a8ee727-4f3c-4630-b952-a3ea04b28fb9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Received event network-vif-deleted-9df171fe-9cf9-4e33-b0f6-8054d4fb76c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:46:53 compute-0 nova_compute[183075]: 2026-01-22 17:46:53.328 183079 DEBUG oslo_concurrency.lockutils [None req-7b44ff3b-c1f7-451a-84ec-da60353afedc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:46:53 compute-0 nova_compute[183075]: 2026-01-22 17:46:53.329 183079 DEBUG oslo_concurrency.lockutils [None req-7b44ff3b-c1f7-451a-84ec-da60353afedc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:46:53 compute-0 nova_compute[183075]: 2026-01-22 17:46:53.371 183079 DEBUG nova.compute.manager [req-b4dff85f-6d84-48ca-b025-95a1eba4aafd req-7eac93ba-6709-43bb-94a2-2e7667d99f87 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Received event network-vif-plugged-9df171fe-9cf9-4e33-b0f6-8054d4fb76c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:46:53 compute-0 nova_compute[183075]: 2026-01-22 17:46:53.371 183079 DEBUG oslo_concurrency.lockutils [req-b4dff85f-6d84-48ca-b025-95a1eba4aafd req-7eac93ba-6709-43bb-94a2-2e7667d99f87 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "cff4e488-d0bf-4df8-900e-e9f61f4309ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:46:53 compute-0 nova_compute[183075]: 2026-01-22 17:46:53.372 183079 DEBUG oslo_concurrency.lockutils [req-b4dff85f-6d84-48ca-b025-95a1eba4aafd req-7eac93ba-6709-43bb-94a2-2e7667d99f87 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "cff4e488-d0bf-4df8-900e-e9f61f4309ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:46:53 compute-0 nova_compute[183075]: 2026-01-22 17:46:53.372 183079 DEBUG oslo_concurrency.lockutils [req-b4dff85f-6d84-48ca-b025-95a1eba4aafd req-7eac93ba-6709-43bb-94a2-2e7667d99f87 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "cff4e488-d0bf-4df8-900e-e9f61f4309ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:46:53 compute-0 nova_compute[183075]: 2026-01-22 17:46:53.372 183079 DEBUG nova.compute.manager [req-b4dff85f-6d84-48ca-b025-95a1eba4aafd req-7eac93ba-6709-43bb-94a2-2e7667d99f87 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] No waiting events found dispatching network-vif-plugged-9df171fe-9cf9-4e33-b0f6-8054d4fb76c3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:46:53 compute-0 nova_compute[183075]: 2026-01-22 17:46:53.372 183079 WARNING nova.compute.manager [req-b4dff85f-6d84-48ca-b025-95a1eba4aafd req-7eac93ba-6709-43bb-94a2-2e7667d99f87 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Received unexpected event network-vif-plugged-9df171fe-9cf9-4e33-b0f6-8054d4fb76c3 for instance with vm_state deleted and task_state None.
Jan 22 17:46:53 compute-0 nova_compute[183075]: 2026-01-22 17:46:53.623 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.132 183079 DEBUG nova.compute.provider_tree [None req-7b44ff3b-c1f7-451a-84ec-da60353afedc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.150 183079 DEBUG nova.scheduler.client.report [None req-7b44ff3b-c1f7-451a-84ec-da60353afedc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.176 183079 DEBUG oslo_concurrency.lockutils [None req-7b44ff3b-c1f7-451a-84ec-da60353afedc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.847s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.197 183079 INFO nova.scheduler.client.report [None req-7b44ff3b-c1f7-451a-84ec-da60353afedc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Deleted allocations for instance cff4e488-d0bf-4df8-900e-e9f61f4309ac
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.253 183079 DEBUG oslo_concurrency.lockutils [None req-7b44ff3b-c1f7-451a-84ec-da60353afedc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "cff4e488-d0bf-4df8-900e-e9f61f4309ac" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.229s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:46:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:54.438 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '51'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.450 183079 DEBUG oslo_concurrency.lockutils [None req-8e0b9236-74cf-4148-ac79-bb2511edd302 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "12ded08f-c0e2-4d03-967b-3436626dbbb2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.450 183079 DEBUG oslo_concurrency.lockutils [None req-8e0b9236-74cf-4148-ac79-bb2511edd302 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "12ded08f-c0e2-4d03-967b-3436626dbbb2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.450 183079 DEBUG oslo_concurrency.lockutils [None req-8e0b9236-74cf-4148-ac79-bb2511edd302 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "12ded08f-c0e2-4d03-967b-3436626dbbb2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.451 183079 DEBUG oslo_concurrency.lockutils [None req-8e0b9236-74cf-4148-ac79-bb2511edd302 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "12ded08f-c0e2-4d03-967b-3436626dbbb2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.451 183079 DEBUG oslo_concurrency.lockutils [None req-8e0b9236-74cf-4148-ac79-bb2511edd302 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "12ded08f-c0e2-4d03-967b-3436626dbbb2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.452 183079 INFO nova.compute.manager [None req-8e0b9236-74cf-4148-ac79-bb2511edd302 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Terminating instance
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.453 183079 DEBUG nova.compute.manager [None req-8e0b9236-74cf-4148-ac79-bb2511edd302 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:46:54 compute-0 kernel: tapd562087b-b5 (unregistering): left promiscuous mode
Jan 22 17:46:54 compute-0 NetworkManager[55454]: <info>  [1769104014.4801] device (tapd562087b-b5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:46:54 compute-0 ovn_controller[95372]: 2026-01-22T17:46:54Z|00777|binding|INFO|Releasing lport d562087b-b585-4eeb-9ef3-31bd96de01f4 from this chassis (sb_readonly=0)
Jan 22 17:46:54 compute-0 ovn_controller[95372]: 2026-01-22T17:46:54Z|00778|binding|INFO|Setting lport d562087b-b585-4eeb-9ef3-31bd96de01f4 down in Southbound
Jan 22 17:46:54 compute-0 ovn_controller[95372]: 2026-01-22T17:46:54Z|00779|binding|INFO|Removing iface tapd562087b-b5 ovn-installed in OVS
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.485 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.487 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:54.498 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:ad:02 10.100.0.13'], port_security=['fa:16:3e:8b:ad:02 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '12ded08f-c0e2-4d03-967b-3436626dbbb2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '211927a5-364f-4b76-9332-16507814b750 f983fa10-a163-4ba4-97be-44ccb35ba95f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.189'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=d562087b-b585-4eeb-9ef3-31bd96de01f4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:46:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:54.499 104629 INFO neutron.agent.ovn.metadata.agent [-] Port d562087b-b585-4eeb-9ef3-31bd96de01f4 in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 unbound from our chassis
Jan 22 17:46:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:54.501 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:46:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:54.501 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ba16270e-72f5-47c0-b7e4-24c515afdafa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:46:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:54.502 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 namespace which is not needed anymore
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.504 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:54 compute-0 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d00000043.scope: Deactivated successfully.
Jan 22 17:46:54 compute-0 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d00000043.scope: Consumed 17.134s CPU time.
Jan 22 17:46:54 compute-0 systemd-machined[154382]: Machine qemu-67-instance-00000043 terminated.
Jan 22 17:46:54 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239505]: [NOTICE]   (239509) : haproxy version is 2.8.14-c23fe91
Jan 22 17:46:54 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239505]: [NOTICE]   (239509) : path to executable is /usr/sbin/haproxy
Jan 22 17:46:54 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239505]: [WARNING]  (239509) : Exiting Master process...
Jan 22 17:46:54 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239505]: [WARNING]  (239509) : Exiting Master process...
Jan 22 17:46:54 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239505]: [ALERT]    (239509) : Current worker (239511) exited with code 143 (Terminated)
Jan 22 17:46:54 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[239505]: [WARNING]  (239509) : All workers exited. Exiting... (0)
Jan 22 17:46:54 compute-0 systemd[1]: libpod-9ef748973fac02fecebc83d5ab3691929723bb88d4f618f5679b8bddaea1ef1e.scope: Deactivated successfully.
Jan 22 17:46:54 compute-0 podman[240217]: 2026-01-22 17:46:54.637842525 +0000 UTC m=+0.046020334 container died 9ef748973fac02fecebc83d5ab3691929723bb88d4f618f5679b8bddaea1ef1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:46:54 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9ef748973fac02fecebc83d5ab3691929723bb88d4f618f5679b8bddaea1ef1e-userdata-shm.mount: Deactivated successfully.
Jan 22 17:46:54 compute-0 systemd[1]: var-lib-containers-storage-overlay-646e3662c6d677120c479f4bc7a903e0243fccd3ba345ebdd09d79ad7110378f-merged.mount: Deactivated successfully.
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.676 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.681 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:54 compute-0 podman[240217]: 2026-01-22 17:46:54.68692556 +0000 UTC m=+0.095103369 container cleanup 9ef748973fac02fecebc83d5ab3691929723bb88d4f618f5679b8bddaea1ef1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:46:54 compute-0 systemd[1]: libpod-conmon-9ef748973fac02fecebc83d5ab3691929723bb88d4f618f5679b8bddaea1ef1e.scope: Deactivated successfully.
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.719 183079 INFO nova.virt.libvirt.driver [-] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Instance destroyed successfully.
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.720 183079 DEBUG nova.objects.instance [None req-8e0b9236-74cf-4148-ac79-bb2511edd302 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'resources' on Instance uuid 12ded08f-c0e2-4d03-967b-3436626dbbb2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.734 183079 DEBUG nova.virt.libvirt.vif [None req-8e0b9236-74cf-4148-ac79-bb2511edd302 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:44:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1658329936',display_name='tempest-server-test-1658329936',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1658329936',id=67,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:44:48Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-0n0qchr8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_h
w_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:44:48Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=12ded08f-c0e2-4d03-967b-3436626dbbb2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d562087b-b585-4eeb-9ef3-31bd96de01f4", "address": "fa:16:3e:8b:ad:02", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd562087b-b5", "ovs_interfaceid": "d562087b-b585-4eeb-9ef3-31bd96de01f4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.734 183079 DEBUG nova.network.os_vif_util [None req-8e0b9236-74cf-4148-ac79-bb2511edd302 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "d562087b-b585-4eeb-9ef3-31bd96de01f4", "address": "fa:16:3e:8b:ad:02", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd562087b-b5", "ovs_interfaceid": "d562087b-b585-4eeb-9ef3-31bd96de01f4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.735 183079 DEBUG nova.network.os_vif_util [None req-8e0b9236-74cf-4148-ac79-bb2511edd302 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8b:ad:02,bridge_name='br-int',has_traffic_filtering=True,id=d562087b-b585-4eeb-9ef3-31bd96de01f4,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd562087b-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.735 183079 DEBUG os_vif [None req-8e0b9236-74cf-4148-ac79-bb2511edd302 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8b:ad:02,bridge_name='br-int',has_traffic_filtering=True,id=d562087b-b585-4eeb-9ef3-31bd96de01f4,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd562087b-b5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.737 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.737 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd562087b-b5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.739 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.740 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.742 183079 INFO os_vif [None req-8e0b9236-74cf-4148-ac79-bb2511edd302 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8b:ad:02,bridge_name='br-int',has_traffic_filtering=True,id=d562087b-b585-4eeb-9ef3-31bd96de01f4,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd562087b-b5')
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.743 183079 INFO nova.virt.libvirt.driver [None req-8e0b9236-74cf-4148-ac79-bb2511edd302 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Deleting instance files /var/lib/nova/instances/12ded08f-c0e2-4d03-967b-3436626dbbb2_del
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.743 183079 INFO nova.virt.libvirt.driver [None req-8e0b9236-74cf-4148-ac79-bb2511edd302 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Deletion of /var/lib/nova/instances/12ded08f-c0e2-4d03-967b-3436626dbbb2_del complete
Jan 22 17:46:54 compute-0 podman[240257]: 2026-01-22 17:46:54.773089318 +0000 UTC m=+0.058704964 container remove 9ef748973fac02fecebc83d5ab3691929723bb88d4f618f5679b8bddaea1ef1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 22 17:46:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:54.778 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3b7357e2-06ab-44b4-8468-72fe63b9a359]: (4, ('Thu Jan 22 05:46:54 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 (9ef748973fac02fecebc83d5ab3691929723bb88d4f618f5679b8bddaea1ef1e)\n9ef748973fac02fecebc83d5ab3691929723bb88d4f618f5679b8bddaea1ef1e\nThu Jan 22 05:46:54 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 (9ef748973fac02fecebc83d5ab3691929723bb88d4f618f5679b8bddaea1ef1e)\n9ef748973fac02fecebc83d5ab3691929723bb88d4f618f5679b8bddaea1ef1e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:46:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:54.780 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6a380fcf-0cdb-44f8-8b21-2b48706f23d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:46:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:54.781 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.784 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:54 compute-0 kernel: tap88ed9213-70: left promiscuous mode
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.785 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:54.789 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e2fb522e-9621-4541-a208-2edc8e709378]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.793 183079 INFO nova.compute.manager [None req-8e0b9236-74cf-4148-ac79-bb2511edd302 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Took 0.34 seconds to destroy the instance on the hypervisor.
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.793 183079 DEBUG oslo.service.loopingcall [None req-8e0b9236-74cf-4148-ac79-bb2511edd302 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.793 183079 DEBUG nova.compute.manager [-] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.794 183079 DEBUG nova.network.neutron [-] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:46:54 compute-0 nova_compute[183075]: 2026-01-22 17:46:54.799 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:54.809 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[907e1c86-ce6f-45d6-b773-fd4fa8b826a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:46:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:54.810 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[5eae4e72-30a7-48ca-80df-b4dfb9d378f3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:46:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:54.827 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[93e2c1a4-73f0-4069-8b14-ffa986ebacad]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 625156, 'reachable_time': 29890, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240272, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:46:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:54.830 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:46:54 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:46:54.830 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[b4f8732f-3d53-4ea7-96b5-bb2aec4854cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:46:54 compute-0 systemd[1]: run-netns-ovnmeta\x2d88ed9213\x2d7d48\x2d4c42\x2dad60\x2dce3fcf104f71.mount: Deactivated successfully.
Jan 22 17:46:56 compute-0 nova_compute[183075]: 2026-01-22 17:46:56.499 183079 DEBUG nova.compute.manager [req-4e9d7477-b7c7-433a-becd-78272998471f req-d6f37a40-8098-4f71-926f-a12cc125fbd1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Received event network-vif-unplugged-d562087b-b585-4eeb-9ef3-31bd96de01f4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:46:56 compute-0 nova_compute[183075]: 2026-01-22 17:46:56.499 183079 DEBUG oslo_concurrency.lockutils [req-4e9d7477-b7c7-433a-becd-78272998471f req-d6f37a40-8098-4f71-926f-a12cc125fbd1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "12ded08f-c0e2-4d03-967b-3436626dbbb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:46:56 compute-0 nova_compute[183075]: 2026-01-22 17:46:56.499 183079 DEBUG oslo_concurrency.lockutils [req-4e9d7477-b7c7-433a-becd-78272998471f req-d6f37a40-8098-4f71-926f-a12cc125fbd1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "12ded08f-c0e2-4d03-967b-3436626dbbb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:46:56 compute-0 nova_compute[183075]: 2026-01-22 17:46:56.500 183079 DEBUG oslo_concurrency.lockutils [req-4e9d7477-b7c7-433a-becd-78272998471f req-d6f37a40-8098-4f71-926f-a12cc125fbd1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "12ded08f-c0e2-4d03-967b-3436626dbbb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:46:56 compute-0 nova_compute[183075]: 2026-01-22 17:46:56.500 183079 DEBUG nova.compute.manager [req-4e9d7477-b7c7-433a-becd-78272998471f req-d6f37a40-8098-4f71-926f-a12cc125fbd1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] No waiting events found dispatching network-vif-unplugged-d562087b-b585-4eeb-9ef3-31bd96de01f4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:46:56 compute-0 nova_compute[183075]: 2026-01-22 17:46:56.500 183079 DEBUG nova.compute.manager [req-4e9d7477-b7c7-433a-becd-78272998471f req-d6f37a40-8098-4f71-926f-a12cc125fbd1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Received event network-vif-unplugged-d562087b-b585-4eeb-9ef3-31bd96de01f4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:46:56 compute-0 nova_compute[183075]: 2026-01-22 17:46:56.816 183079 DEBUG nova.network.neutron [-] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:46:56 compute-0 nova_compute[183075]: 2026-01-22 17:46:56.834 183079 INFO nova.compute.manager [-] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Took 2.04 seconds to deallocate network for instance.
Jan 22 17:46:56 compute-0 nova_compute[183075]: 2026-01-22 17:46:56.878 183079 DEBUG oslo_concurrency.lockutils [None req-8e0b9236-74cf-4148-ac79-bb2511edd302 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:46:56 compute-0 nova_compute[183075]: 2026-01-22 17:46:56.878 183079 DEBUG oslo_concurrency.lockutils [None req-8e0b9236-74cf-4148-ac79-bb2511edd302 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:46:56 compute-0 nova_compute[183075]: 2026-01-22 17:46:56.909 183079 DEBUG nova.compute.manager [req-bb14058f-9d04-44ef-b6ee-bf8b34c92620 req-f5a0facd-ad09-491b-a284-c052514aaa80 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Received event network-vif-deleted-d562087b-b585-4eeb-9ef3-31bd96de01f4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:46:56 compute-0 nova_compute[183075]: 2026-01-22 17:46:56.929 183079 DEBUG nova.compute.provider_tree [None req-8e0b9236-74cf-4148-ac79-bb2511edd302 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:46:56 compute-0 nova_compute[183075]: 2026-01-22 17:46:56.943 183079 DEBUG nova.scheduler.client.report [None req-8e0b9236-74cf-4148-ac79-bb2511edd302 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:46:56 compute-0 nova_compute[183075]: 2026-01-22 17:46:56.966 183079 DEBUG oslo_concurrency.lockutils [None req-8e0b9236-74cf-4148-ac79-bb2511edd302 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.087s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:46:56 compute-0 nova_compute[183075]: 2026-01-22 17:46:56.989 183079 INFO nova.scheduler.client.report [None req-8e0b9236-74cf-4148-ac79-bb2511edd302 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Deleted allocations for instance 12ded08f-c0e2-4d03-967b-3436626dbbb2
Jan 22 17:46:57 compute-0 nova_compute[183075]: 2026-01-22 17:46:57.069 183079 DEBUG oslo_concurrency.lockutils [None req-8e0b9236-74cf-4148-ac79-bb2511edd302 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "12ded08f-c0e2-4d03-967b-3436626dbbb2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:46:58 compute-0 nova_compute[183075]: 2026-01-22 17:46:58.595 183079 DEBUG nova.compute.manager [req-10e55913-3216-4810-806b-f573e52dfc87 req-0619fd45-3652-465c-972f-076d15b81c11 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Received event network-vif-plugged-d562087b-b585-4eeb-9ef3-31bd96de01f4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:46:58 compute-0 nova_compute[183075]: 2026-01-22 17:46:58.595 183079 DEBUG oslo_concurrency.lockutils [req-10e55913-3216-4810-806b-f573e52dfc87 req-0619fd45-3652-465c-972f-076d15b81c11 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "12ded08f-c0e2-4d03-967b-3436626dbbb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:46:58 compute-0 nova_compute[183075]: 2026-01-22 17:46:58.596 183079 DEBUG oslo_concurrency.lockutils [req-10e55913-3216-4810-806b-f573e52dfc87 req-0619fd45-3652-465c-972f-076d15b81c11 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "12ded08f-c0e2-4d03-967b-3436626dbbb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:46:58 compute-0 nova_compute[183075]: 2026-01-22 17:46:58.596 183079 DEBUG oslo_concurrency.lockutils [req-10e55913-3216-4810-806b-f573e52dfc87 req-0619fd45-3652-465c-972f-076d15b81c11 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "12ded08f-c0e2-4d03-967b-3436626dbbb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:46:58 compute-0 nova_compute[183075]: 2026-01-22 17:46:58.596 183079 DEBUG nova.compute.manager [req-10e55913-3216-4810-806b-f573e52dfc87 req-0619fd45-3652-465c-972f-076d15b81c11 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] No waiting events found dispatching network-vif-plugged-d562087b-b585-4eeb-9ef3-31bd96de01f4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:46:58 compute-0 nova_compute[183075]: 2026-01-22 17:46:58.596 183079 WARNING nova.compute.manager [req-10e55913-3216-4810-806b-f573e52dfc87 req-0619fd45-3652-465c-972f-076d15b81c11 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Received unexpected event network-vif-plugged-d562087b-b585-4eeb-9ef3-31bd96de01f4 for instance with vm_state deleted and task_state None.
Jan 22 17:46:58 compute-0 nova_compute[183075]: 2026-01-22 17:46:58.625 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:59 compute-0 podman[240274]: 2026-01-22 17:46:59.369422969 +0000 UTC m=+0.069689428 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Jan 22 17:46:59 compute-0 podman[240275]: 2026-01-22 17:46:59.384235466 +0000 UTC m=+0.089549791 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, managed_by=edpm_ansible, io.buildah.version=1.33.7, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 22 17:46:59 compute-0 podman[240273]: 2026-01-22 17:46:59.405595528 +0000 UTC m=+0.106591127 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 17:46:59 compute-0 nova_compute[183075]: 2026-01-22 17:46:59.739 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:03 compute-0 nova_compute[183075]: 2026-01-22 17:47:03.628 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:04 compute-0 nova_compute[183075]: 2026-01-22 17:47:04.663 183079 DEBUG oslo_concurrency.lockutils [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "df3ded3b-e065-4dee-93d3-e1ced39c8619" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:47:04 compute-0 nova_compute[183075]: 2026-01-22 17:47:04.663 183079 DEBUG oslo_concurrency.lockutils [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "df3ded3b-e065-4dee-93d3-e1ced39c8619" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:47:04 compute-0 nova_compute[183075]: 2026-01-22 17:47:04.682 183079 DEBUG nova.compute.manager [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:47:04 compute-0 nova_compute[183075]: 2026-01-22 17:47:04.742 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:04 compute-0 nova_compute[183075]: 2026-01-22 17:47:04.762 183079 DEBUG oslo_concurrency.lockutils [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:47:04 compute-0 nova_compute[183075]: 2026-01-22 17:47:04.763 183079 DEBUG oslo_concurrency.lockutils [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:47:04 compute-0 nova_compute[183075]: 2026-01-22 17:47:04.770 183079 DEBUG nova.virt.hardware [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:47:04 compute-0 nova_compute[183075]: 2026-01-22 17:47:04.770 183079 INFO nova.compute.claims [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:47:04 compute-0 nova_compute[183075]: 2026-01-22 17:47:04.881 183079 DEBUG nova.compute.provider_tree [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:47:04 compute-0 nova_compute[183075]: 2026-01-22 17:47:04.895 183079 DEBUG nova.scheduler.client.report [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:47:04 compute-0 nova_compute[183075]: 2026-01-22 17:47:04.921 183079 DEBUG oslo_concurrency.lockutils [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:47:04 compute-0 nova_compute[183075]: 2026-01-22 17:47:04.922 183079 DEBUG nova.compute.manager [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:47:04 compute-0 nova_compute[183075]: 2026-01-22 17:47:04.976 183079 DEBUG nova.compute.manager [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:47:04 compute-0 nova_compute[183075]: 2026-01-22 17:47:04.977 183079 DEBUG nova.network.neutron [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:47:05 compute-0 nova_compute[183075]: 2026-01-22 17:47:05.014 183079 INFO nova.virt.libvirt.driver [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:47:05 compute-0 nova_compute[183075]: 2026-01-22 17:47:05.031 183079 DEBUG nova.compute.manager [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:47:05 compute-0 nova_compute[183075]: 2026-01-22 17:47:05.120 183079 DEBUG nova.compute.manager [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:47:05 compute-0 nova_compute[183075]: 2026-01-22 17:47:05.121 183079 DEBUG nova.virt.libvirt.driver [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:47:05 compute-0 nova_compute[183075]: 2026-01-22 17:47:05.122 183079 INFO nova.virt.libvirt.driver [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Creating image(s)
Jan 22 17:47:05 compute-0 nova_compute[183075]: 2026-01-22 17:47:05.122 183079 DEBUG oslo_concurrency.lockutils [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "/var/lib/nova/instances/df3ded3b-e065-4dee-93d3-e1ced39c8619/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:47:05 compute-0 nova_compute[183075]: 2026-01-22 17:47:05.123 183079 DEBUG oslo_concurrency.lockutils [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/df3ded3b-e065-4dee-93d3-e1ced39c8619/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:47:05 compute-0 nova_compute[183075]: 2026-01-22 17:47:05.123 183079 DEBUG oslo_concurrency.lockutils [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/df3ded3b-e065-4dee-93d3-e1ced39c8619/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:47:05 compute-0 nova_compute[183075]: 2026-01-22 17:47:05.134 183079 DEBUG oslo_concurrency.processutils [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:47:05 compute-0 nova_compute[183075]: 2026-01-22 17:47:05.207 183079 DEBUG oslo_concurrency.processutils [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:47:05 compute-0 nova_compute[183075]: 2026-01-22 17:47:05.208 183079 DEBUG oslo_concurrency.lockutils [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:47:05 compute-0 nova_compute[183075]: 2026-01-22 17:47:05.209 183079 DEBUG oslo_concurrency.lockutils [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:47:05 compute-0 nova_compute[183075]: 2026-01-22 17:47:05.224 183079 DEBUG oslo_concurrency.processutils [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:47:05 compute-0 nova_compute[183075]: 2026-01-22 17:47:05.302 183079 DEBUG oslo_concurrency.processutils [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:47:05 compute-0 nova_compute[183075]: 2026-01-22 17:47:05.303 183079 DEBUG oslo_concurrency.processutils [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/df3ded3b-e065-4dee-93d3-e1ced39c8619/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:47:05 compute-0 nova_compute[183075]: 2026-01-22 17:47:05.346 183079 DEBUG oslo_concurrency.processutils [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/df3ded3b-e065-4dee-93d3-e1ced39c8619/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:47:05 compute-0 nova_compute[183075]: 2026-01-22 17:47:05.347 183079 DEBUG oslo_concurrency.lockutils [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:47:05 compute-0 nova_compute[183075]: 2026-01-22 17:47:05.348 183079 DEBUG oslo_concurrency.processutils [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:47:05 compute-0 podman[240343]: 2026-01-22 17:47:05.358644662 +0000 UTC m=+0.067967541 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 22 17:47:05 compute-0 nova_compute[183075]: 2026-01-22 17:47:05.401 183079 DEBUG oslo_concurrency.processutils [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:47:05 compute-0 nova_compute[183075]: 2026-01-22 17:47:05.402 183079 DEBUG nova.virt.disk.api [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Checking if we can resize image /var/lib/nova/instances/df3ded3b-e065-4dee-93d3-e1ced39c8619/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:47:05 compute-0 nova_compute[183075]: 2026-01-22 17:47:05.402 183079 DEBUG oslo_concurrency.processutils [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df3ded3b-e065-4dee-93d3-e1ced39c8619/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:47:05 compute-0 nova_compute[183075]: 2026-01-22 17:47:05.452 183079 DEBUG oslo_concurrency.processutils [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df3ded3b-e065-4dee-93d3-e1ced39c8619/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:47:05 compute-0 nova_compute[183075]: 2026-01-22 17:47:05.453 183079 DEBUG nova.virt.disk.api [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Cannot resize image /var/lib/nova/instances/df3ded3b-e065-4dee-93d3-e1ced39c8619/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:47:05 compute-0 nova_compute[183075]: 2026-01-22 17:47:05.453 183079 DEBUG nova.objects.instance [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'migration_context' on Instance uuid df3ded3b-e065-4dee-93d3-e1ced39c8619 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:47:05 compute-0 nova_compute[183075]: 2026-01-22 17:47:05.466 183079 DEBUG nova.virt.libvirt.driver [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:47:05 compute-0 nova_compute[183075]: 2026-01-22 17:47:05.467 183079 DEBUG nova.virt.libvirt.driver [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Ensure instance console log exists: /var/lib/nova/instances/df3ded3b-e065-4dee-93d3-e1ced39c8619/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:47:05 compute-0 nova_compute[183075]: 2026-01-22 17:47:05.467 183079 DEBUG oslo_concurrency.lockutils [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:47:05 compute-0 nova_compute[183075]: 2026-01-22 17:47:05.468 183079 DEBUG oslo_concurrency.lockutils [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:47:05 compute-0 nova_compute[183075]: 2026-01-22 17:47:05.468 183079 DEBUG oslo_concurrency.lockutils [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:47:05 compute-0 nova_compute[183075]: 2026-01-22 17:47:05.562 183079 DEBUG nova.policy [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:47:06 compute-0 nova_compute[183075]: 2026-01-22 17:47:06.286 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769104011.2845979, cff4e488-d0bf-4df8-900e-e9f61f4309ac => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:47:06 compute-0 nova_compute[183075]: 2026-01-22 17:47:06.286 183079 INFO nova.compute.manager [-] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] VM Stopped (Lifecycle Event)
Jan 22 17:47:06 compute-0 nova_compute[183075]: 2026-01-22 17:47:06.333 183079 DEBUG nova.compute.manager [None req-dfa9728b-41f6-43b9-bdf8-ff6008d275e1 - - - - - -] [instance: cff4e488-d0bf-4df8-900e-e9f61f4309ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:47:07 compute-0 nova_compute[183075]: 2026-01-22 17:47:07.668 183079 DEBUG nova.network.neutron [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Successfully created port: 11393b0e-74ee-456c-8793-6b2a6cb69a8c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:47:08 compute-0 nova_compute[183075]: 2026-01-22 17:47:08.388 183079 DEBUG nova.network.neutron [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Successfully updated port: 11393b0e-74ee-456c-8793-6b2a6cb69a8c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:47:08 compute-0 nova_compute[183075]: 2026-01-22 17:47:08.405 183079 DEBUG oslo_concurrency.lockutils [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "refresh_cache-df3ded3b-e065-4dee-93d3-e1ced39c8619" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:47:08 compute-0 nova_compute[183075]: 2026-01-22 17:47:08.405 183079 DEBUG oslo_concurrency.lockutils [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquired lock "refresh_cache-df3ded3b-e065-4dee-93d3-e1ced39c8619" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:47:08 compute-0 nova_compute[183075]: 2026-01-22 17:47:08.405 183079 DEBUG nova.network.neutron [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:47:08 compute-0 nova_compute[183075]: 2026-01-22 17:47:08.483 183079 DEBUG nova.compute.manager [req-120ee282-80fb-457b-abc9-0efc0feb7b6e req-73ef8b36-ee5e-4970-81d2-1db4fcc70d95 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Received event network-changed-11393b0e-74ee-456c-8793-6b2a6cb69a8c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:47:08 compute-0 nova_compute[183075]: 2026-01-22 17:47:08.483 183079 DEBUG nova.compute.manager [req-120ee282-80fb-457b-abc9-0efc0feb7b6e req-73ef8b36-ee5e-4970-81d2-1db4fcc70d95 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Refreshing instance network info cache due to event network-changed-11393b0e-74ee-456c-8793-6b2a6cb69a8c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:47:08 compute-0 nova_compute[183075]: 2026-01-22 17:47:08.483 183079 DEBUG oslo_concurrency.lockutils [req-120ee282-80fb-457b-abc9-0efc0feb7b6e req-73ef8b36-ee5e-4970-81d2-1db4fcc70d95 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-df3ded3b-e065-4dee-93d3-e1ced39c8619" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:47:08 compute-0 nova_compute[183075]: 2026-01-22 17:47:08.550 183079 DEBUG nova.network.neutron [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:47:08 compute-0 nova_compute[183075]: 2026-01-22 17:47:08.630 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.593 183079 DEBUG nova.network.neutron [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Updating instance_info_cache with network_info: [{"id": "11393b0e-74ee-456c-8793-6b2a6cb69a8c", "address": "fa:16:3e:af:50:24", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11393b0e-74", "ovs_interfaceid": "11393b0e-74ee-456c-8793-6b2a6cb69a8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.630 183079 DEBUG oslo_concurrency.lockutils [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Releasing lock "refresh_cache-df3ded3b-e065-4dee-93d3-e1ced39c8619" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.631 183079 DEBUG nova.compute.manager [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Instance network_info: |[{"id": "11393b0e-74ee-456c-8793-6b2a6cb69a8c", "address": "fa:16:3e:af:50:24", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11393b0e-74", "ovs_interfaceid": "11393b0e-74ee-456c-8793-6b2a6cb69a8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.631 183079 DEBUG oslo_concurrency.lockutils [req-120ee282-80fb-457b-abc9-0efc0feb7b6e req-73ef8b36-ee5e-4970-81d2-1db4fcc70d95 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-df3ded3b-e065-4dee-93d3-e1ced39c8619" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.632 183079 DEBUG nova.network.neutron [req-120ee282-80fb-457b-abc9-0efc0feb7b6e req-73ef8b36-ee5e-4970-81d2-1db4fcc70d95 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Refreshing network info cache for port 11393b0e-74ee-456c-8793-6b2a6cb69a8c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.635 183079 DEBUG nova.virt.libvirt.driver [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Start _get_guest_xml network_info=[{"id": "11393b0e-74ee-456c-8793-6b2a6cb69a8c", "address": "fa:16:3e:af:50:24", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11393b0e-74", "ovs_interfaceid": "11393b0e-74ee-456c-8793-6b2a6cb69a8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.640 183079 WARNING nova.virt.libvirt.driver [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.646 183079 DEBUG nova.virt.libvirt.host [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.649 183079 DEBUG nova.virt.libvirt.host [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.656 183079 DEBUG nova.virt.libvirt.host [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.657 183079 DEBUG nova.virt.libvirt.host [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.658 183079 DEBUG nova.virt.libvirt.driver [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.658 183079 DEBUG nova.virt.hardware [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.659 183079 DEBUG nova.virt.hardware [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.659 183079 DEBUG nova.virt.hardware [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.659 183079 DEBUG nova.virt.hardware [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.660 183079 DEBUG nova.virt.hardware [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.660 183079 DEBUG nova.virt.hardware [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.660 183079 DEBUG nova.virt.hardware [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.661 183079 DEBUG nova.virt.hardware [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.661 183079 DEBUG nova.virt.hardware [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.661 183079 DEBUG nova.virt.hardware [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.662 183079 DEBUG nova.virt.hardware [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.666 183079 DEBUG nova.virt.libvirt.vif [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:47:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1131995147',display_name='tempest-server-test-1131995147',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1131995147',id=69,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-m3xyla0g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:47:05Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=df3ded3b-e065-4dee-93d3-e1ced39c8619,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "11393b0e-74ee-456c-8793-6b2a6cb69a8c", "address": "fa:16:3e:af:50:24", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11393b0e-74", "ovs_interfaceid": "11393b0e-74ee-456c-8793-6b2a6cb69a8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.667 183079 DEBUG nova.network.os_vif_util [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "11393b0e-74ee-456c-8793-6b2a6cb69a8c", "address": "fa:16:3e:af:50:24", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11393b0e-74", "ovs_interfaceid": "11393b0e-74ee-456c-8793-6b2a6cb69a8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.668 183079 DEBUG nova.network.os_vif_util [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:af:50:24,bridge_name='br-int',has_traffic_filtering=True,id=11393b0e-74ee-456c-8793-6b2a6cb69a8c,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11393b0e-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.669 183079 DEBUG nova.objects.instance [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'pci_devices' on Instance uuid df3ded3b-e065-4dee-93d3-e1ced39c8619 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.694 183079 DEBUG nova.virt.libvirt.driver [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:47:09 compute-0 nova_compute[183075]:   <uuid>df3ded3b-e065-4dee-93d3-e1ced39c8619</uuid>
Jan 22 17:47:09 compute-0 nova_compute[183075]:   <name>instance-00000045</name>
Jan 22 17:47:09 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:47:09 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:47:09 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:47:09 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1131995147</nova:name>
Jan 22 17:47:09 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:47:09</nova:creationTime>
Jan 22 17:47:09 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:47:09 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:47:09 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:47:09 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:47:09 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:47:09 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:47:09 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:47:09 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:47:09 compute-0 nova_compute[183075]:         <nova:user uuid="66c24efd0cd24c57803ea8508e679d06">tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member</nova:user>
Jan 22 17:47:09 compute-0 nova_compute[183075]:         <nova:project uuid="cbff5cec4b9c4d2eb317124ca20fea9f">tempest-StatelessNetworkSecGroupIPv4Test-1348485316</nova:project>
Jan 22 17:47:09 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:47:09 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:47:09 compute-0 nova_compute[183075]:         <nova:port uuid="11393b0e-74ee-456c-8793-6b2a6cb69a8c">
Jan 22 17:47:09 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:47:09 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:47:09 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:47:09 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <system>
Jan 22 17:47:09 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:47:09 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:47:09 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:47:09 compute-0 nova_compute[183075]:       <entry name="serial">df3ded3b-e065-4dee-93d3-e1ced39c8619</entry>
Jan 22 17:47:09 compute-0 nova_compute[183075]:       <entry name="uuid">df3ded3b-e065-4dee-93d3-e1ced39c8619</entry>
Jan 22 17:47:09 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     </system>
Jan 22 17:47:09 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:47:09 compute-0 nova_compute[183075]:   <os>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:   </os>
Jan 22 17:47:09 compute-0 nova_compute[183075]:   <features>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:   </features>
Jan 22 17:47:09 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:47:09 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:47:09 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:47:09 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/df3ded3b-e065-4dee-93d3-e1ced39c8619/disk"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:47:09 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:af:50:24"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:       <target dev="tap11393b0e-74"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:47:09 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/df3ded3b-e065-4dee-93d3-e1ced39c8619/console.log" append="off"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <video>
Jan 22 17:47:09 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     </video>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:47:09 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:47:09 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:47:09 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:47:09 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:47:09 compute-0 nova_compute[183075]: </domain>
Jan 22 17:47:09 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.695 183079 DEBUG nova.compute.manager [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Preparing to wait for external event network-vif-plugged-11393b0e-74ee-456c-8793-6b2a6cb69a8c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.696 183079 DEBUG oslo_concurrency.lockutils [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "df3ded3b-e065-4dee-93d3-e1ced39c8619-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.696 183079 DEBUG oslo_concurrency.lockutils [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "df3ded3b-e065-4dee-93d3-e1ced39c8619-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.696 183079 DEBUG oslo_concurrency.lockutils [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "df3ded3b-e065-4dee-93d3-e1ced39c8619-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.696 183079 DEBUG nova.virt.libvirt.vif [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:47:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1131995147',display_name='tempest-server-test-1131995147',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1131995147',id=69,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-m3xyla0g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_
rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:47:05Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=df3ded3b-e065-4dee-93d3-e1ced39c8619,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "11393b0e-74ee-456c-8793-6b2a6cb69a8c", "address": "fa:16:3e:af:50:24", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11393b0e-74", "ovs_interfaceid": "11393b0e-74ee-456c-8793-6b2a6cb69a8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.697 183079 DEBUG nova.network.os_vif_util [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "11393b0e-74ee-456c-8793-6b2a6cb69a8c", "address": "fa:16:3e:af:50:24", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11393b0e-74", "ovs_interfaceid": "11393b0e-74ee-456c-8793-6b2a6cb69a8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.697 183079 DEBUG nova.network.os_vif_util [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:af:50:24,bridge_name='br-int',has_traffic_filtering=True,id=11393b0e-74ee-456c-8793-6b2a6cb69a8c,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11393b0e-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.698 183079 DEBUG os_vif [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:50:24,bridge_name='br-int',has_traffic_filtering=True,id=11393b0e-74ee-456c-8793-6b2a6cb69a8c,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11393b0e-74') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.698 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.698 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.699 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.701 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.701 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap11393b0e-74, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.701 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap11393b0e-74, col_values=(('external_ids', {'iface-id': '11393b0e-74ee-456c-8793-6b2a6cb69a8c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:af:50:24', 'vm-uuid': 'df3ded3b-e065-4dee-93d3-e1ced39c8619'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:47:09 compute-0 NetworkManager[55454]: <info>  [1769104029.7038] manager: (tap11393b0e-74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/311)
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.703 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.705 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.709 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.709 183079 INFO os_vif [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:50:24,bridge_name='br-int',has_traffic_filtering=True,id=11393b0e-74ee-456c-8793-6b2a6cb69a8c,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11393b0e-74')
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.717 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769104014.717106, 12ded08f-c0e2-4d03-967b-3436626dbbb2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.718 183079 INFO nova.compute.manager [-] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] VM Stopped (Lifecycle Event)
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.736 183079 DEBUG nova.compute.manager [None req-8b71d4e7-f8a8-453c-afe4-b3beee352604 - - - - - -] [instance: 12ded08f-c0e2-4d03-967b-3436626dbbb2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.778 183079 DEBUG nova.virt.libvirt.driver [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.778 183079 DEBUG nova.virt.libvirt.driver [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No VIF found with MAC fa:16:3e:af:50:24, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:47:09 compute-0 kernel: tap11393b0e-74: entered promiscuous mode
Jan 22 17:47:09 compute-0 NetworkManager[55454]: <info>  [1769104029.8406] manager: (tap11393b0e-74): new Tun device (/org/freedesktop/NetworkManager/Devices/312)
Jan 22 17:47:09 compute-0 ovn_controller[95372]: 2026-01-22T17:47:09Z|00780|binding|INFO|Claiming lport 11393b0e-74ee-456c-8793-6b2a6cb69a8c for this chassis.
Jan 22 17:47:09 compute-0 ovn_controller[95372]: 2026-01-22T17:47:09Z|00781|binding|INFO|11393b0e-74ee-456c-8793-6b2a6cb69a8c: Claiming fa:16:3e:af:50:24 10.100.0.12
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.842 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:09.852 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:50:24 10.100.0.12'], port_security=['fa:16:3e:af:50:24 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'bdd883a6-7421-4241-bb4f-a9657593b878', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=11393b0e-74ee-456c-8793-6b2a6cb69a8c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.854 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:09.854 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 11393b0e-74ee-456c-8793-6b2a6cb69a8c in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 bound to our chassis
Jan 22 17:47:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:09.855 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:47:09 compute-0 ovn_controller[95372]: 2026-01-22T17:47:09Z|00782|binding|INFO|Setting lport 11393b0e-74ee-456c-8793-6b2a6cb69a8c ovn-installed in OVS
Jan 22 17:47:09 compute-0 ovn_controller[95372]: 2026-01-22T17:47:09Z|00783|binding|INFO|Setting lport 11393b0e-74ee-456c-8793-6b2a6cb69a8c up in Southbound
Jan 22 17:47:09 compute-0 nova_compute[183075]: 2026-01-22 17:47:09.857 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:09.867 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[94b675d2-7412-4501-a340-f45c9d35e718]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:47:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:09.868 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap88ed9213-71 in ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:47:09 compute-0 systemd-udevd[240392]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:47:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:09.870 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap88ed9213-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:47:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:09.871 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e2b58aad-ea8b-4766-b7d5-e7be4a315f47]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:47:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:09.872 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ac392cf9-6ab0-4604-bde6-e47ac9305ae6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:47:09 compute-0 systemd-machined[154382]: New machine qemu-69-instance-00000045.
Jan 22 17:47:09 compute-0 NetworkManager[55454]: <info>  [1769104029.8819] device (tap11393b0e-74): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:47:09 compute-0 NetworkManager[55454]: <info>  [1769104029.8826] device (tap11393b0e-74): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:47:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:09.886 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[1c47ee00-bd9c-4a8c-820b-4066d8cad679]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:47:09 compute-0 systemd[1]: Started Virtual Machine qemu-69-instance-00000045.
Jan 22 17:47:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:09.902 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[bb95d375-7a99-47b9-9bb9-9c6584484728]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:47:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:09.930 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[da2a5c13-b04a-4c75-8359-9fe2141a5fb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:47:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:09.934 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8cf9572b-14cb-4637-a1ef-43aee405c632]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:47:09 compute-0 NetworkManager[55454]: <info>  [1769104029.9358] manager: (tap88ed9213-70): new Veth device (/org/freedesktop/NetworkManager/Devices/313)
Jan 22 17:47:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:09.961 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[c42cb74f-fadd-44a9-a344-5449a3181bd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:47:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:09.965 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[f06b121a-0f84-41c1-80e4-16b2390e1860]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:47:09 compute-0 NetworkManager[55454]: <info>  [1769104029.9843] device (tap88ed9213-70): carrier: link connected
Jan 22 17:47:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:09.990 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[dc7e1188-9235-40c2-b46f-500d9495e228]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:10.009 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1fa63a11-095e-42b7-afab-a8cd4d321854]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 210], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 639368, 'reachable_time': 24278, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240425, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:10.025 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[74e9a49f-5453-474e-80e3-58b8a5bcf1e2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0a:fa93'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 639368, 'tstamp': 639368}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240426, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:10.040 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[358abc20-0045-4163-86bf-a097f5639c0f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 210], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 639368, 'reachable_time': 24278, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240427, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:10.069 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[92a37dd8-40d0-4cac-8b8d-1de9e7e45b31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:10.114 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a6ec1fa1-2f6d-428c-b62e-94b4b1a81532]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:10.115 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:10.115 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:10.116 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88ed9213-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:47:10 compute-0 NetworkManager[55454]: <info>  [1769104030.1188] manager: (tap88ed9213-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/314)
Jan 22 17:47:10 compute-0 kernel: tap88ed9213-70: entered promiscuous mode
Jan 22 17:47:10 compute-0 nova_compute[183075]: 2026-01-22 17:47:10.119 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:10 compute-0 nova_compute[183075]: 2026-01-22 17:47:10.121 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:10.122 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88ed9213-70, col_values=(('external_ids', {'iface-id': 'da93a32e-eed6-4124-8f08-78b0bfe2cb69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:47:10 compute-0 ovn_controller[95372]: 2026-01-22T17:47:10Z|00784|binding|INFO|Releasing lport da93a32e-eed6-4124-8f08-78b0bfe2cb69 from this chassis (sb_readonly=0)
Jan 22 17:47:10 compute-0 nova_compute[183075]: 2026-01-22 17:47:10.123 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:10 compute-0 nova_compute[183075]: 2026-01-22 17:47:10.136 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:10 compute-0 nova_compute[183075]: 2026-01-22 17:47:10.136 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:10.137 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:10.138 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e7c2e376-5d6d-4e62-b11d-5f2397614be9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:10.138 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:47:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:10.139 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'env', 'PROCESS_TAG=haproxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/88ed9213-7d48-4c42-ad60-ce3fcf104f71.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:47:10 compute-0 nova_compute[183075]: 2026-01-22 17:47:10.342 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104030.3422468, df3ded3b-e065-4dee-93d3-e1ced39c8619 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:47:10 compute-0 nova_compute[183075]: 2026-01-22 17:47:10.343 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] VM Started (Lifecycle Event)
Jan 22 17:47:10 compute-0 nova_compute[183075]: 2026-01-22 17:47:10.369 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:47:10 compute-0 nova_compute[183075]: 2026-01-22 17:47:10.374 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104030.3423998, df3ded3b-e065-4dee-93d3-e1ced39c8619 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:47:10 compute-0 nova_compute[183075]: 2026-01-22 17:47:10.375 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] VM Paused (Lifecycle Event)
Jan 22 17:47:10 compute-0 nova_compute[183075]: 2026-01-22 17:47:10.392 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:47:10 compute-0 nova_compute[183075]: 2026-01-22 17:47:10.396 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:47:10 compute-0 nova_compute[183075]: 2026-01-22 17:47:10.415 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:47:10 compute-0 podman[240466]: 2026-01-22 17:47:10.497005054 +0000 UTC m=+0.049506187 container create 86b7ddc8fb11ca0d8d513aa3bd0c09fd24d0db577008358da0229169bb1faf4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:47:10 compute-0 systemd[1]: Started libpod-conmon-86b7ddc8fb11ca0d8d513aa3bd0c09fd24d0db577008358da0229169bb1faf4e.scope.
Jan 22 17:47:10 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:47:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb8ea020f48aebd88578bf0755c742772153b6daf58fcdfb6d33204e874fca13/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:47:10 compute-0 podman[240466]: 2026-01-22 17:47:10.470951756 +0000 UTC m=+0.023452919 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:47:10 compute-0 nova_compute[183075]: 2026-01-22 17:47:10.621 183079 DEBUG nova.compute.manager [req-d6ef3867-59ad-496f-be1a-8685f5ff64dc req-503111ac-7095-426f-acdb-1f6cf9aa7326 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Received event network-vif-plugged-11393b0e-74ee-456c-8793-6b2a6cb69a8c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:47:10 compute-0 nova_compute[183075]: 2026-01-22 17:47:10.623 183079 DEBUG oslo_concurrency.lockutils [req-d6ef3867-59ad-496f-be1a-8685f5ff64dc req-503111ac-7095-426f-acdb-1f6cf9aa7326 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "df3ded3b-e065-4dee-93d3-e1ced39c8619-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:47:10 compute-0 nova_compute[183075]: 2026-01-22 17:47:10.624 183079 DEBUG oslo_concurrency.lockutils [req-d6ef3867-59ad-496f-be1a-8685f5ff64dc req-503111ac-7095-426f-acdb-1f6cf9aa7326 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "df3ded3b-e065-4dee-93d3-e1ced39c8619-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:47:10 compute-0 nova_compute[183075]: 2026-01-22 17:47:10.624 183079 DEBUG oslo_concurrency.lockutils [req-d6ef3867-59ad-496f-be1a-8685f5ff64dc req-503111ac-7095-426f-acdb-1f6cf9aa7326 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "df3ded3b-e065-4dee-93d3-e1ced39c8619-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:47:10 compute-0 nova_compute[183075]: 2026-01-22 17:47:10.624 183079 DEBUG nova.compute.manager [req-d6ef3867-59ad-496f-be1a-8685f5ff64dc req-503111ac-7095-426f-acdb-1f6cf9aa7326 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Processing event network-vif-plugged-11393b0e-74ee-456c-8793-6b2a6cb69a8c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:47:10 compute-0 nova_compute[183075]: 2026-01-22 17:47:10.625 183079 DEBUG nova.compute.manager [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:47:10 compute-0 nova_compute[183075]: 2026-01-22 17:47:10.630 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104030.6298625, df3ded3b-e065-4dee-93d3-e1ced39c8619 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:47:10 compute-0 nova_compute[183075]: 2026-01-22 17:47:10.631 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] VM Resumed (Lifecycle Event)
Jan 22 17:47:10 compute-0 nova_compute[183075]: 2026-01-22 17:47:10.632 183079 DEBUG nova.virt.libvirt.driver [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:47:10 compute-0 nova_compute[183075]: 2026-01-22 17:47:10.636 183079 INFO nova.virt.libvirt.driver [-] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Instance spawned successfully.
Jan 22 17:47:10 compute-0 nova_compute[183075]: 2026-01-22 17:47:10.637 183079 DEBUG nova.virt.libvirt.driver [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:47:10 compute-0 podman[240466]: 2026-01-22 17:47:10.741617126 +0000 UTC m=+0.294118319 container init 86b7ddc8fb11ca0d8d513aa3bd0c09fd24d0db577008358da0229169bb1faf4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:47:10 compute-0 podman[240466]: 2026-01-22 17:47:10.748218633 +0000 UTC m=+0.300719796 container start 86b7ddc8fb11ca0d8d513aa3bd0c09fd24d0db577008358da0229169bb1faf4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 17:47:10 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240481]: [NOTICE]   (240485) : New worker (240487) forked
Jan 22 17:47:10 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240481]: [NOTICE]   (240485) : Loading success.
Jan 22 17:47:10 compute-0 nova_compute[183075]: 2026-01-22 17:47:10.969 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:47:10 compute-0 nova_compute[183075]: 2026-01-22 17:47:10.975 183079 DEBUG nova.virt.libvirt.driver [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:47:10 compute-0 nova_compute[183075]: 2026-01-22 17:47:10.975 183079 DEBUG nova.virt.libvirt.driver [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:47:10 compute-0 nova_compute[183075]: 2026-01-22 17:47:10.976 183079 DEBUG nova.virt.libvirt.driver [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:47:10 compute-0 nova_compute[183075]: 2026-01-22 17:47:10.976 183079 DEBUG nova.virt.libvirt.driver [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:47:10 compute-0 nova_compute[183075]: 2026-01-22 17:47:10.977 183079 DEBUG nova.virt.libvirt.driver [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:47:10 compute-0 nova_compute[183075]: 2026-01-22 17:47:10.977 183079 DEBUG nova.virt.libvirt.driver [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:47:10 compute-0 nova_compute[183075]: 2026-01-22 17:47:10.981 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:47:11 compute-0 nova_compute[183075]: 2026-01-22 17:47:11.094 183079 DEBUG nova.network.neutron [req-120ee282-80fb-457b-abc9-0efc0feb7b6e req-73ef8b36-ee5e-4970-81d2-1db4fcc70d95 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Updated VIF entry in instance network info cache for port 11393b0e-74ee-456c-8793-6b2a6cb69a8c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:47:11 compute-0 nova_compute[183075]: 2026-01-22 17:47:11.096 183079 DEBUG nova.network.neutron [req-120ee282-80fb-457b-abc9-0efc0feb7b6e req-73ef8b36-ee5e-4970-81d2-1db4fcc70d95 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Updating instance_info_cache with network_info: [{"id": "11393b0e-74ee-456c-8793-6b2a6cb69a8c", "address": "fa:16:3e:af:50:24", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11393b0e-74", "ovs_interfaceid": "11393b0e-74ee-456c-8793-6b2a6cb69a8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:47:11 compute-0 nova_compute[183075]: 2026-01-22 17:47:11.273 183079 DEBUG oslo_concurrency.lockutils [req-120ee282-80fb-457b-abc9-0efc0feb7b6e req-73ef8b36-ee5e-4970-81d2-1db4fcc70d95 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-df3ded3b-e065-4dee-93d3-e1ced39c8619" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:47:11 compute-0 nova_compute[183075]: 2026-01-22 17:47:11.276 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:47:11 compute-0 nova_compute[183075]: 2026-01-22 17:47:11.331 183079 INFO nova.compute.manager [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Took 6.21 seconds to spawn the instance on the hypervisor.
Jan 22 17:47:11 compute-0 nova_compute[183075]: 2026-01-22 17:47:11.331 183079 DEBUG nova.compute.manager [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:47:11 compute-0 nova_compute[183075]: 2026-01-22 17:47:11.409 183079 INFO nova.compute.manager [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Took 6.67 seconds to build instance.
Jan 22 17:47:11 compute-0 nova_compute[183075]: 2026-01-22 17:47:11.428 183079 DEBUG oslo_concurrency.lockutils [None req-88a73a79-6b58-4b79-bb94-a8e43048905f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "df3ded3b-e065-4dee-93d3-e1ced39c8619" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.764s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:47:12 compute-0 nova_compute[183075]: 2026-01-22 17:47:12.788 183079 DEBUG nova.compute.manager [req-f638a0e3-638a-4eba-b309-01d5d372aec5 req-84cdfe44-d9b7-4936-b31b-b3d24ed64462 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Received event network-vif-plugged-11393b0e-74ee-456c-8793-6b2a6cb69a8c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:47:12 compute-0 nova_compute[183075]: 2026-01-22 17:47:12.789 183079 DEBUG oslo_concurrency.lockutils [req-f638a0e3-638a-4eba-b309-01d5d372aec5 req-84cdfe44-d9b7-4936-b31b-b3d24ed64462 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "df3ded3b-e065-4dee-93d3-e1ced39c8619-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:47:12 compute-0 nova_compute[183075]: 2026-01-22 17:47:12.789 183079 DEBUG oslo_concurrency.lockutils [req-f638a0e3-638a-4eba-b309-01d5d372aec5 req-84cdfe44-d9b7-4936-b31b-b3d24ed64462 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "df3ded3b-e065-4dee-93d3-e1ced39c8619-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:47:12 compute-0 nova_compute[183075]: 2026-01-22 17:47:12.790 183079 DEBUG oslo_concurrency.lockutils [req-f638a0e3-638a-4eba-b309-01d5d372aec5 req-84cdfe44-d9b7-4936-b31b-b3d24ed64462 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "df3ded3b-e065-4dee-93d3-e1ced39c8619-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:47:12 compute-0 nova_compute[183075]: 2026-01-22 17:47:12.790 183079 DEBUG nova.compute.manager [req-f638a0e3-638a-4eba-b309-01d5d372aec5 req-84cdfe44-d9b7-4936-b31b-b3d24ed64462 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] No waiting events found dispatching network-vif-plugged-11393b0e-74ee-456c-8793-6b2a6cb69a8c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:47:12 compute-0 nova_compute[183075]: 2026-01-22 17:47:12.790 183079 WARNING nova.compute.manager [req-f638a0e3-638a-4eba-b309-01d5d372aec5 req-84cdfe44-d9b7-4936-b31b-b3d24ed64462 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Received unexpected event network-vif-plugged-11393b0e-74ee-456c-8793-6b2a6cb69a8c for instance with vm_state active and task_state None.
Jan 22 17:47:13 compute-0 nova_compute[183075]: 2026-01-22 17:47:13.318 183079 INFO nova.compute.manager [None req-9098117f-bb2f-4be1-b2bb-c74c5c257e3a 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Get console output
Jan 22 17:47:13 compute-0 nova_compute[183075]: 2026-01-22 17:47:13.326 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:47:13 compute-0 nova_compute[183075]: 2026-01-22 17:47:13.634 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:14 compute-0 nova_compute[183075]: 2026-01-22 17:47:14.705 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:16 compute-0 podman[240496]: 2026-01-22 17:47:16.352713372 +0000 UTC m=+0.057823720 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 17:47:18 compute-0 nova_compute[183075]: 2026-01-22 17:47:18.449 183079 INFO nova.compute.manager [None req-6b6d5f31-853b-475a-9f29-d120a9decd4d 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Get console output
Jan 22 17:47:18 compute-0 nova_compute[183075]: 2026-01-22 17:47:18.635 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:19 compute-0 nova_compute[183075]: 2026-01-22 17:47:19.707 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:20 compute-0 podman[240521]: 2026-01-22 17:47:20.352745021 +0000 UTC m=+0.061138259 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 17:47:23 compute-0 nova_compute[183075]: 2026-01-22 17:47:23.638 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:23 compute-0 nova_compute[183075]: 2026-01-22 17:47:23.691 183079 INFO nova.compute.manager [None req-838d9535-e8f6-42c6-84f8-353093ed2f40 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Get console output
Jan 22 17:47:23 compute-0 ovn_controller[95372]: 2026-01-22T17:47:23Z|00140|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:af:50:24 10.100.0.12
Jan 22 17:47:23 compute-0 ovn_controller[95372]: 2026-01-22T17:47:23Z|00141|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:af:50:24 10.100.0.12
Jan 22 17:47:24 compute-0 nova_compute[183075]: 2026-01-22 17:47:24.710 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:28 compute-0 nova_compute[183075]: 2026-01-22 17:47:28.641 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:28 compute-0 nova_compute[183075]: 2026-01-22 17:47:28.826 183079 INFO nova.compute.manager [None req-60e4f378-a50c-4b26-b5b1-57b89fba65f4 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Get console output
Jan 22 17:47:28 compute-0 nova_compute[183075]: 2026-01-22 17:47:28.831 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.078 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.079 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.645 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.645 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.5668600
Jan 22 17:47:29 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.12:59430 [22/Jan/2026:17:47:29.077] listener listener/metadata 0/0/0/568/568 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.653 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.654 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.701 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.702 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0476079
Jan 22 17:47:29 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.12:59446 [22/Jan/2026:17:47:29.653] listener listener/metadata 0/0/0/48/48 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.706 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.707 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:47:29 compute-0 nova_compute[183075]: 2026-01-22 17:47:29.712 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.723 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.724 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0164328
Jan 22 17:47:29 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.12:59448 [22/Jan/2026:17:47:29.706] listener listener/metadata 0/0/0/18/18 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.728 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.729 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.744 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.745 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0156345
Jan 22 17:47:29 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.12:59464 [22/Jan/2026:17:47:29.728] listener listener/metadata 0/0/0/16/16 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.749 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.750 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.762 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.762 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0125523
Jan 22 17:47:29 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.12:59472 [22/Jan/2026:17:47:29.749] listener listener/metadata 0/0/0/13/13 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.767 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.768 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.785 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:47:29 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.12:59474 [22/Jan/2026:17:47:29.767] listener listener/metadata 0/0/0/19/19 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.786 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0183370
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.791 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.791 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.803 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.804 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0125737
Jan 22 17:47:29 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.12:59482 [22/Jan/2026:17:47:29.791] listener listener/metadata 0/0/0/13/13 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.808 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.809 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.826 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.827 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0176749
Jan 22 17:47:29 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.12:59484 [22/Jan/2026:17:47:29.808] listener listener/metadata 0/0/0/18/18 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.831 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.832 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.847 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.848 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0163083
Jan 22 17:47:29 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.12:59486 [22/Jan/2026:17:47:29.831] listener listener/metadata 0/0/0/17/17 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.852 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.852 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.868 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.868 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0156343
Jan 22 17:47:29 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.12:59494 [22/Jan/2026:17:47:29.851] listener listener/metadata 0/0/0/16/16 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.872 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.872 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:47:29 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.12:59508 [22/Jan/2026:17:47:29.871] listener listener/metadata 0/0/0/14/14 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.886 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0136878
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.893 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.893 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.908 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.909 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0152943
Jan 22 17:47:29 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.12:59514 [22/Jan/2026:17:47:29.893] listener listener/metadata 0/0/0/16/16 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.913 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.913 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.924 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.924 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0109644
Jan 22 17:47:29 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.12:59528 [22/Jan/2026:17:47:29.913] listener listener/metadata 0/0/0/11/11 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.927 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.927 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.942 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:47:29 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.12:59530 [22/Jan/2026:17:47:29.926] listener listener/metadata 0/0/0/16/16 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.942 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0154500
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.946 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.946 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.962 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.962 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0161777
Jan 22 17:47:29 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.12:59534 [22/Jan/2026:17:47:29.945] listener listener/metadata 0/0/0/17/17 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.967 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.968 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.981 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:47:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:29.981 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0129755
Jan 22 17:47:29 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.12:59544 [22/Jan/2026:17:47:29.967] listener listener/metadata 0/0/0/13/13 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:47:30 compute-0 podman[240558]: 2026-01-22 17:47:30.360349478 +0000 UTC m=+0.062205498 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc.)
Jan 22 17:47:30 compute-0 podman[240557]: 2026-01-22 17:47:30.375509924 +0000 UTC m=+0.078478953 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Jan 22 17:47:30 compute-0 podman[240556]: 2026-01-22 17:47:30.388257845 +0000 UTC m=+0.093762332 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, 
org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 17:47:32 compute-0 nova_compute[183075]: 2026-01-22 17:47:32.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:47:33 compute-0 nova_compute[183075]: 2026-01-22 17:47:33.644 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:33 compute-0 nova_compute[183075]: 2026-01-22 17:47:33.964 183079 INFO nova.compute.manager [None req-85382a0c-db83-4ed5-a2ef-3625b64d8459 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Get console output
Jan 22 17:47:33 compute-0 nova_compute[183075]: 2026-01-22 17:47:33.968 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:47:34 compute-0 nova_compute[183075]: 2026-01-22 17:47:34.714 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:35 compute-0 nova_compute[183075]: 2026-01-22 17:47:35.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:47:36 compute-0 podman[240620]: 2026-01-22 17:47:36.365338214 +0000 UTC m=+0.072560485 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:47:36 compute-0 nova_compute[183075]: 2026-01-22 17:47:36.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:47:38 compute-0 nova_compute[183075]: 2026-01-22 17:47:38.648 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:38 compute-0 nova_compute[183075]: 2026-01-22 17:47:38.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:47:39 compute-0 nova_compute[183075]: 2026-01-22 17:47:39.145 183079 INFO nova.compute.manager [None req-8919095d-07ac-4312-b2de-95ba5a7fc0b9 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Get console output
Jan 22 17:47:39 compute-0 nova_compute[183075]: 2026-01-22 17:47:39.149 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:47:39 compute-0 nova_compute[183075]: 2026-01-22 17:47:39.716 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:39 compute-0 ovn_controller[95372]: 2026-01-22T17:47:39Z|00785|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Jan 22 17:47:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:41.966 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:47:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:41.967 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:47:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:47:41.967 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:47:43 compute-0 nova_compute[183075]: 2026-01-22 17:47:43.673 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:44 compute-0 nova_compute[183075]: 2026-01-22 17:47:44.336 183079 INFO nova.compute.manager [None req-c73add35-bd36-4ff1-9109-1dfba8bfc9af 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Get console output
Jan 22 17:47:44 compute-0 nova_compute[183075]: 2026-01-22 17:47:44.341 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:47:44 compute-0 nova_compute[183075]: 2026-01-22 17:47:44.748 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:45 compute-0 nova_compute[183075]: 2026-01-22 17:47:45.782 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:47:47 compute-0 podman[240640]: 2026-01-22 17:47:47.339726595 +0000 UTC m=+0.054520632 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 17:47:47 compute-0 nova_compute[183075]: 2026-01-22 17:47:47.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:47:47 compute-0 nova_compute[183075]: 2026-01-22 17:47:47.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:47:47 compute-0 nova_compute[183075]: 2026-01-22 17:47:47.810 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 17:47:47 compute-0 nova_compute[183075]: 2026-01-22 17:47:47.810 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:47:47 compute-0 nova_compute[183075]: 2026-01-22 17:47:47.839 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:47:47 compute-0 nova_compute[183075]: 2026-01-22 17:47:47.839 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:47:47 compute-0 nova_compute[183075]: 2026-01-22 17:47:47.840 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:47:47 compute-0 nova_compute[183075]: 2026-01-22 17:47:47.840 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:47:47 compute-0 nova_compute[183075]: 2026-01-22 17:47:47.910 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df3ded3b-e065-4dee-93d3-e1ced39c8619/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:47:47 compute-0 nova_compute[183075]: 2026-01-22 17:47:47.974 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df3ded3b-e065-4dee-93d3-e1ced39c8619/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:47:47 compute-0 nova_compute[183075]: 2026-01-22 17:47:47.975 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df3ded3b-e065-4dee-93d3-e1ced39c8619/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:47:48 compute-0 nova_compute[183075]: 2026-01-22 17:47:48.059 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df3ded3b-e065-4dee-93d3-e1ced39c8619/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:47:48 compute-0 nova_compute[183075]: 2026-01-22 17:47:48.224 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:47:48 compute-0 nova_compute[183075]: 2026-01-22 17:47:48.225 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5516MB free_disk=73.32317733764648GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:47:48 compute-0 nova_compute[183075]: 2026-01-22 17:47:48.225 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:47:48 compute-0 nova_compute[183075]: 2026-01-22 17:47:48.225 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:47:48 compute-0 nova_compute[183075]: 2026-01-22 17:47:48.310 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance df3ded3b-e065-4dee-93d3-e1ced39c8619 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:47:48 compute-0 nova_compute[183075]: 2026-01-22 17:47:48.310 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:47:48 compute-0 nova_compute[183075]: 2026-01-22 17:47:48.311 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:47:48 compute-0 nova_compute[183075]: 2026-01-22 17:47:48.361 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:47:48 compute-0 nova_compute[183075]: 2026-01-22 17:47:48.429 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:47:48 compute-0 nova_compute[183075]: 2026-01-22 17:47:48.463 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:47:48 compute-0 nova_compute[183075]: 2026-01-22 17:47:48.463 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.238s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:47:48 compute-0 nova_compute[183075]: 2026-01-22 17:47:48.676 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:49 compute-0 nova_compute[183075]: 2026-01-22 17:47:49.458 183079 INFO nova.compute.manager [None req-d8b8ceee-e68a-4e96-8304-41b46c9c37a6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Get console output
Jan 22 17:47:49 compute-0 nova_compute[183075]: 2026-01-22 17:47:49.464 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:47:49 compute-0 nova_compute[183075]: 2026-01-22 17:47:49.751 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:51 compute-0 podman[240671]: 2026-01-22 17:47:51.338801688 +0000 UTC m=+0.046125556 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:47:51 compute-0 nova_compute[183075]: 2026-01-22 17:47:51.442 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:47:51 compute-0 nova_compute[183075]: 2026-01-22 17:47:51.443 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:47:51 compute-0 nova_compute[183075]: 2026-01-22 17:47:51.443 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:47:53 compute-0 nova_compute[183075]: 2026-01-22 17:47:53.677 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:54 compute-0 nova_compute[183075]: 2026-01-22 17:47:54.601 183079 INFO nova.compute.manager [None req-c496f126-0ba7-46e3-8cbe-e3c08dab9755 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Get console output
Jan 22 17:47:54 compute-0 nova_compute[183075]: 2026-01-22 17:47:54.606 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:47:54 compute-0 nova_compute[183075]: 2026-01-22 17:47:54.754 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.462 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'name': 'tempest-server-test-1131995147', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000045', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'hostId': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.462 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.465 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for df3ded3b-e065-4dee-93d3-e1ced39c8619 / tap11393b0e-74 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.465 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e2ed3ee2-787f-4efe-88d8-53a0cfad9f98', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000045-df3ded3b-e065-4dee-93d3-e1ced39c8619-tap11393b0e-74', 'timestamp': '2026-01-22T17:47:55.462799', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'tap11393b0e-74', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:af:50:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11393b0e-74'}, 'message_id': '7b64a0c6-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6439.223218149, 'message_signature': '07ece3af03ec33c57f9222364af5154603baddf1bad0b9edf5fb5dc490827528'}]}, 'timestamp': '2026-01-22 17:47:55.465729', '_unique_id': '7aa0ae9fa9c849a9b94ef6f80d77477f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.466 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.467 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.467 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ebe9a9d-5102-4d07-9db3-60cac5b6fccf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000045-df3ded3b-e065-4dee-93d3-e1ced39c8619-tap11393b0e-74', 'timestamp': '2026-01-22T17:47:55.467523', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'tap11393b0e-74', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:af:50:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11393b0e-74'}, 'message_id': '7b64f38c-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6439.223218149, 'message_signature': '54fe568e4657a4468554ff1be240ed7c3e79c95d3760ae9b0fb9fa54e81df053'}]}, 'timestamp': '2026-01-22 17:47:55.467776', '_unique_id': '68552fd591354ac6b2f036a5dc19d274'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.468 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '53b8c2c0-76e2-4d51-919f-f87690c353ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000045-df3ded3b-e065-4dee-93d3-e1ced39c8619-tap11393b0e-74', 'timestamp': '2026-01-22T17:47:55.468949', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'tap11393b0e-74', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:af:50:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11393b0e-74'}, 'message_id': '7b652ac8-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6439.223218149, 'message_signature': 'ef488a02b6d61dc8f4f83b99d64e24945f11288b9d2d16c1a8caf385f910a8fd'}]}, 'timestamp': '2026-01-22 17:47:55.469186', '_unique_id': '0821c41033e94e7c9287b41e0b597ab0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.469 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.470 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.470 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/network.incoming.packets volume: 61 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2cbcf4b4-d0e8-4632-b4de-73d4b1c4ca2e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 61, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000045-df3ded3b-e065-4dee-93d3-e1ced39c8619-tap11393b0e-74', 'timestamp': '2026-01-22T17:47:55.470271', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'tap11393b0e-74', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:af:50:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11393b0e-74'}, 'message_id': '7b655e44-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6439.223218149, 'message_signature': 'e1f6c59826848245aa6c852cc0d2d1302f84dd717b78b47e9b28b1408afe87b1'}]}, 'timestamp': '2026-01-22 17:47:55.470521', '_unique_id': 'e01190b3206b4758b1e911ae62b66b36'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.471 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-1131995147>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1131995147>]
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.472 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.472 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.472 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-1131995147>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1131995147>]
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.472 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.486 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/cpu volume: 11210000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1fcb6d0a-f3e5-4a90-8fbb-3b8ef4826039', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11210000000, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'timestamp': '2026-01-22T17:47:55.472515', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'instance-00000045', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '7b67dbc4-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6439.246653746, 'message_signature': '66e4118b6a82196748cfc10b81a3c2ff7929583165effd0dae4dc0024b6834c9'}]}, 'timestamp': '2026-01-22 17:47:55.486934', '_unique_id': '02f243b0b345484abd5b5fcfb37f0377'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.487 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.488 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.499 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/disk.device.write.requests volume: 328 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c2f79534-0de0-48e3-99d6-e17fd4a55689', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 328, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619-vda', 'timestamp': '2026-01-22T17:47:55.488757', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'instance-00000045', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7b69d640-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6439.249223365, 'message_signature': 'a7a5d98e9e530bfcbbab78c9d0b92955a764dd4ec7cf5ea06ec04e27c0f92429'}]}, 'timestamp': '2026-01-22 17:47:55.499847', '_unique_id': '69df2f180c9e4e938d8261339a6001d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.500 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.501 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.501 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/disk.device.write.latency volume: 25179848617 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '10c5d9cf-c119-4174-b7cb-429534a40954', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 25179848617, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619-vda', 'timestamp': '2026-01-22T17:47:55.501872', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'instance-00000045', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7b6a30f4-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6439.249223365, 'message_signature': '4f236acfdd51f9c8ba6f1eec2d1df26dc1621f2053289787319cbed3ced67a81'}]}, 'timestamp': '2026-01-22 17:47:55.502171', '_unique_id': '873357eeb2d64747a2f91310699ebc34'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.502 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.503 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.503 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/memory.usage volume: 43.421875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '17742b7b-80dd-4d1f-a933-174bc6ad8248', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.421875, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'timestamp': '2026-01-22T17:47:55.503459', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'instance-00000045', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '7b6a6eac-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6439.246653746, 'message_signature': '1babac0f93bafd5a8e403863c908f54e70f858c2d4c58babee940ab45af9c9ae'}]}, 'timestamp': '2026-01-22 17:47:55.503703', '_unique_id': '36e2dd59879b45089d0038f570ff89ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.504 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1131995147>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1131995147>]
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/network.incoming.bytes volume: 7280 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '304795dd-c477-49ad-8083-a4cce20966d0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7280, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000045-df3ded3b-e065-4dee-93d3-e1ced39c8619-tap11393b0e-74', 'timestamp': '2026-01-22T17:47:55.505204', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'tap11393b0e-74', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:af:50:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11393b0e-74'}, 'message_id': '7b6ab34e-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6439.223218149, 'message_signature': '1253c5969c71c141a712d73bac6a737c1aaf1306398100b01ccb951fb9447f1e'}]}, 'timestamp': '2026-01-22 17:47:55.505449', '_unique_id': '61d20b9a965141288e8bdffa71b8df90'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.505 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.506 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.506 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/network.outgoing.packets volume: 131 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '67681f68-a9dd-400e-8e68-a1cee79f906f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 131, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000045-df3ded3b-e065-4dee-93d3-e1ced39c8619-tap11393b0e-74', 'timestamp': '2026-01-22T17:47:55.506568', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'tap11393b0e-74', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:af:50:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11393b0e-74'}, 'message_id': '7b6ae8a0-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6439.223218149, 'message_signature': 'febbc58b15b13962c493378229754cb1e11dd339f44c815bfae5e6e241e5f2de'}]}, 'timestamp': '2026-01-22 17:47:55.506814', '_unique_id': '7780bd9453a3411f800ed684fa51bc2a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.507 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/disk.device.read.bytes volume: 30812672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e67e41b-b5e4-4336-b5ba-0703af56a52f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30812672, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619-vda', 'timestamp': '2026-01-22T17:47:55.507923', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'instance-00000045', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7b6b1cb2-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6439.249223365, 'message_signature': '4cccf0a75b5e102d183fe8f5a1d8bfca436051098b474ff51a30bbd87290365e'}]}, 'timestamp': '2026-01-22 17:47:55.508140', '_unique_id': 'a381bd0511884e469fd12151d651b428'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.508 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.509 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.509 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6a711dec-fbd2-4426-982b-970b38a14894', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000045-df3ded3b-e065-4dee-93d3-e1ced39c8619-tap11393b0e-74', 'timestamp': '2026-01-22T17:47:55.509256', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'tap11393b0e-74', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:af:50:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11393b0e-74'}, 'message_id': '7b6b533a-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6439.223218149, 'message_signature': '9700bfac1420dfb382681556d0b3b15eedce497eaeb36bb05bb40482613cb31d'}]}, 'timestamp': '2026-01-22 17:47:55.509556', '_unique_id': '51dd7d5ca8654892952240396efaaa97'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.510 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.515 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f75a631c-c468-4b4e-811b-a9c3a1f8d465', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619-vda', 'timestamp': '2026-01-22T17:47:55.510856', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'instance-00000045', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7b6c56c2-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6439.271289926, 'message_signature': '3ecb17e6f0029f4705e393620a0f73b073859dc5805cbbe91923c5a789cbef37'}]}, 'timestamp': '2026-01-22 17:47:55.516203', '_unique_id': '23ab69e5a0d94490a6f7fa6348db30cc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.516 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.517 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.517 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/network.outgoing.bytes volume: 11596 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '623646bf-cf7e-4ff9-94bf-db2cc1e74b95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11596, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000045-df3ded3b-e065-4dee-93d3-e1ced39c8619-tap11393b0e-74', 'timestamp': '2026-01-22T17:47:55.517682', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'tap11393b0e-74', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:af:50:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11393b0e-74'}, 'message_id': '7b6c9ac4-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6439.223218149, 'message_signature': '879af3555141fd6a922620862565d103c40c18598c6362a439bb82f819903a83'}]}, 'timestamp': '2026-01-22 17:47:55.517930', '_unique_id': '6e1fed0e626e42c5903b529fa7c108f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.518 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '917dc5cb-b1fd-4e92-9a22-f83d99eaae6d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000045-df3ded3b-e065-4dee-93d3-e1ced39c8619-tap11393b0e-74', 'timestamp': '2026-01-22T17:47:55.519072', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'tap11393b0e-74', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:af:50:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11393b0e-74'}, 'message_id': '7b6cd070-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6439.223218149, 'message_signature': '4237ef4ff48abc12e9b4d6168b2f886b6ca98457a99d2acd7ab4d9c943ab9acb'}]}, 'timestamp': '2026-01-22 17:47:55.519301', '_unique_id': 'f85635b313b843378c37a32ab2620a33'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.519 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.520 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.520 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.520 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1131995147>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1131995147>]
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.520 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '16f91ac7-4b52-4081-ac92-e4df9785f901', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000045-df3ded3b-e065-4dee-93d3-e1ced39c8619-tap11393b0e-74', 'timestamp': '2026-01-22T17:47:55.520999', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'tap11393b0e-74', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:af:50:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11393b0e-74'}, 'message_id': '7b6d1d82-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6439.223218149, 'message_signature': '163413fd934a6a432bc9d333a68f10ca98b2205a42bd2bdab61270d723380549'}]}, 'timestamp': '2026-01-22 17:47:55.521285', '_unique_id': '419cd39886364a0ba3ebb9c86895fd07'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.521 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.522 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.522 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/disk.device.read.latency volume: 265135836 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19b48301-f35f-4197-91f7-bf4a0a1257ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 265135836, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619-vda', 'timestamp': '2026-01-22T17:47:55.522553', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'instance-00000045', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7b6d5c0c-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6439.249223365, 'message_signature': '9ec705dd366aaef2d94a32c3e284df203bdf551eca4c770abd1cdad2e11353ed'}]}, 'timestamp': '2026-01-22 17:47:55.522903', '_unique_id': 'd3f431c465ee4e3d88a4a77cc6a5bad2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.523 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.524 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.524 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0d6387f-3f98-4e6f-9aeb-2c0083a03a72', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619-vda', 'timestamp': '2026-01-22T17:47:55.524280', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'instance-00000045', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7b6d9d3e-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6439.271289926, 'message_signature': '7aaf8b0ae8087843a21822688d394719df4d1b813aedf40564f26fc0c7482019'}]}, 'timestamp': '2026-01-22 17:47:55.524579', '_unique_id': 'c7a1ee81da574b0bbc0c17de8fdbbe44'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.525 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba8779f9-e1f0-4514-8ef2-b5863f07b1f4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619-vda', 'timestamp': '2026-01-22T17:47:55.525904', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'instance-00000045', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7b6ddc72-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6439.271289926, 'message_signature': '1d0aaefcb0b526edc441f42fc842dc05fd5800c48c1556253b9a1cd36525b4b4'}]}, 'timestamp': '2026-01-22 17:47:55.526191', '_unique_id': 'c276a16baabb430c9f917ad9e9ed3e9f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.526 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.527 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.527 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/disk.device.read.requests volume: 1134 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ccad408d-ec8d-41d8-9887-517d1498396a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1134, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619-vda', 'timestamp': '2026-01-22T17:47:55.527368', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'instance-00000045', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7b6e15a2-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6439.249223365, 'message_signature': '81ee3af0bde96032d8e9b49b2f561c36d15256f9a84bb29831caf26a0ea6f1ab'}]}, 'timestamp': '2026-01-22 17:47:55.527678', '_unique_id': '75e94ab8a5934754a0540976e3110bb4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.528 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/disk.device.write.bytes volume: 72986624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b50ca482-cd05-4eba-8c03-6ee38af1678d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72986624, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619-vda', 'timestamp': '2026-01-22T17:47:55.528957', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'instance-00000045', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7b6e53be-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6439.249223365, 'message_signature': 'b179cb9e262fcc98f7ae07e63bafa68c40e1c628d2ba62a54121d9a14c059e6f'}]}, 'timestamp': '2026-01-22 17:47:55.529247', '_unique_id': 'e94b76b8f8b0493889908e407f383f0e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:47:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:47:55.529 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:47:58 compute-0 nova_compute[183075]: 2026-01-22 17:47:58.678 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:59 compute-0 nova_compute[183075]: 2026-01-22 17:47:59.715 183079 INFO nova.compute.manager [None req-758a437d-4a9a-49e3-84ca-1baba3e5fa98 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Get console output
Jan 22 17:47:59 compute-0 nova_compute[183075]: 2026-01-22 17:47:59.720 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:47:59 compute-0 nova_compute[183075]: 2026-01-22 17:47:59.757 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:48:01 compute-0 podman[240695]: 2026-01-22 17:48:01.381147223 +0000 UTC m=+0.056074683 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 17:48:01 compute-0 podman[240696]: 2026-01-22 17:48:01.380577358 +0000 UTC m=+0.052917309 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 22 17:48:01 compute-0 podman[240694]: 2026-01-22 17:48:01.397412639 +0000 UTC m=+0.078605407 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:48:03 compute-0 nova_compute[183075]: 2026-01-22 17:48:03.682 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:48:03 compute-0 nova_compute[183075]: 2026-01-22 17:48:03.784 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:48:04 compute-0 nova_compute[183075]: 2026-01-22 17:48:04.759 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:48:04 compute-0 nova_compute[183075]: 2026-01-22 17:48:04.861 183079 INFO nova.compute.manager [None req-c3881e5d-382e-4fd3-80a5-a37695c23648 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Get console output
Jan 22 17:48:04 compute-0 nova_compute[183075]: 2026-01-22 17:48:04.868 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:48:06 compute-0 nova_compute[183075]: 2026-01-22 17:48:06.849 183079 DEBUG oslo_concurrency.lockutils [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "38456dbd-001e-46c3-ae2d-5e1765611833" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:48:06 compute-0 nova_compute[183075]: 2026-01-22 17:48:06.850 183079 DEBUG oslo_concurrency.lockutils [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "38456dbd-001e-46c3-ae2d-5e1765611833" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:48:06 compute-0 nova_compute[183075]: 2026-01-22 17:48:06.868 183079 DEBUG nova.compute.manager [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:48:06 compute-0 nova_compute[183075]: 2026-01-22 17:48:06.956 183079 DEBUG oslo_concurrency.lockutils [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:48:06 compute-0 nova_compute[183075]: 2026-01-22 17:48:06.956 183079 DEBUG oslo_concurrency.lockutils [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:48:06 compute-0 nova_compute[183075]: 2026-01-22 17:48:06.967 183079 DEBUG nova.virt.hardware [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:48:06 compute-0 nova_compute[183075]: 2026-01-22 17:48:06.967 183079 INFO nova.compute.claims [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.077 183079 DEBUG nova.scheduler.client.report [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Refreshing inventories for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.108 183079 DEBUG nova.scheduler.client.report [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Updating ProviderTree inventory for provider 2513134c-f67c-4237-84bf-4ebe2450d610 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.108 183079 DEBUG nova.compute.provider_tree [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Updating inventory in ProviderTree for provider 2513134c-f67c-4237-84bf-4ebe2450d610 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.122 183079 DEBUG nova.scheduler.client.report [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Refreshing aggregate associations for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.159 183079 DEBUG nova.scheduler.client.report [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Refreshing trait associations for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610, traits: COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SVM,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE42,COMPUTE_NODE,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE4A,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_FMA3,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_AUTO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.221 183079 DEBUG nova.compute.provider_tree [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.239 183079 DEBUG nova.scheduler.client.report [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.265 183079 DEBUG oslo_concurrency.lockutils [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.309s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.266 183079 DEBUG nova.compute.manager [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.344 183079 DEBUG nova.compute.manager [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.345 183079 DEBUG nova.network.neutron [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:48:07 compute-0 podman[240754]: 2026-01-22 17:48:07.362959556 +0000 UTC m=+0.067958441 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.368 183079 INFO nova.virt.libvirt.driver [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.383 183079 DEBUG nova.compute.manager [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.460 183079 DEBUG nova.compute.manager [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.461 183079 DEBUG nova.virt.libvirt.driver [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.461 183079 INFO nova.virt.libvirt.driver [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Creating image(s)
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.462 183079 DEBUG oslo_concurrency.lockutils [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "/var/lib/nova/instances/38456dbd-001e-46c3-ae2d-5e1765611833/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.463 183079 DEBUG oslo_concurrency.lockutils [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/38456dbd-001e-46c3-ae2d-5e1765611833/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.463 183079 DEBUG oslo_concurrency.lockutils [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/38456dbd-001e-46c3-ae2d-5e1765611833/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.478 183079 DEBUG oslo_concurrency.processutils [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.521 183079 DEBUG nova.policy [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.555 183079 DEBUG oslo_concurrency.processutils [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.556 183079 DEBUG oslo_concurrency.lockutils [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.557 183079 DEBUG oslo_concurrency.lockutils [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.583 183079 DEBUG oslo_concurrency.processutils [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.652 183079 DEBUG oslo_concurrency.processutils [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.653 183079 DEBUG oslo_concurrency.processutils [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/38456dbd-001e-46c3-ae2d-5e1765611833/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.685 183079 DEBUG oslo_concurrency.processutils [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/38456dbd-001e-46c3-ae2d-5e1765611833/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.687 183079 DEBUG oslo_concurrency.lockutils [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.687 183079 DEBUG oslo_concurrency.processutils [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.750 183079 DEBUG oslo_concurrency.processutils [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.751 183079 DEBUG nova.virt.disk.api [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Checking if we can resize image /var/lib/nova/instances/38456dbd-001e-46c3-ae2d-5e1765611833/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.751 183079 DEBUG oslo_concurrency.processutils [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38456dbd-001e-46c3-ae2d-5e1765611833/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.806 183079 DEBUG oslo_concurrency.processutils [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38456dbd-001e-46c3-ae2d-5e1765611833/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.807 183079 DEBUG nova.virt.disk.api [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Cannot resize image /var/lib/nova/instances/38456dbd-001e-46c3-ae2d-5e1765611833/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.807 183079 DEBUG nova.objects.instance [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'migration_context' on Instance uuid 38456dbd-001e-46c3-ae2d-5e1765611833 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.824 183079 DEBUG nova.virt.libvirt.driver [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.825 183079 DEBUG nova.virt.libvirt.driver [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Ensure instance console log exists: /var/lib/nova/instances/38456dbd-001e-46c3-ae2d-5e1765611833/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.825 183079 DEBUG oslo_concurrency.lockutils [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.825 183079 DEBUG oslo_concurrency.lockutils [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:48:07 compute-0 nova_compute[183075]: 2026-01-22 17:48:07.825 183079 DEBUG oslo_concurrency.lockutils [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:48:08 compute-0 nova_compute[183075]: 2026-01-22 17:48:08.685 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:48:09 compute-0 nova_compute[183075]: 2026-01-22 17:48:09.793 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:48:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:10.141 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=52, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=51) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:48:10 compute-0 nova_compute[183075]: 2026-01-22 17:48:10.142 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:48:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:10.143 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:48:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:10.145 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '52'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:48:10 compute-0 nova_compute[183075]: 2026-01-22 17:48:10.432 183079 DEBUG nova.network.neutron [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Successfully created port: f7f0a8be-56b9-4a12-8c48-0b7a66239107 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:48:12 compute-0 nova_compute[183075]: 2026-01-22 17:48:12.641 183079 DEBUG nova.network.neutron [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Successfully updated port: f7f0a8be-56b9-4a12-8c48-0b7a66239107 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:48:12 compute-0 nova_compute[183075]: 2026-01-22 17:48:12.658 183079 DEBUG oslo_concurrency.lockutils [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "refresh_cache-38456dbd-001e-46c3-ae2d-5e1765611833" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:48:12 compute-0 nova_compute[183075]: 2026-01-22 17:48:12.659 183079 DEBUG oslo_concurrency.lockutils [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquired lock "refresh_cache-38456dbd-001e-46c3-ae2d-5e1765611833" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:48:12 compute-0 nova_compute[183075]: 2026-01-22 17:48:12.659 183079 DEBUG nova.network.neutron [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:48:12 compute-0 nova_compute[183075]: 2026-01-22 17:48:12.738 183079 DEBUG nova.compute.manager [req-a402e61e-7602-49f7-8102-17366772d234 req-981aa8c4-b3d7-4b43-9bb3-3cd81c17ca8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Received event network-changed-f7f0a8be-56b9-4a12-8c48-0b7a66239107 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:48:12 compute-0 nova_compute[183075]: 2026-01-22 17:48:12.739 183079 DEBUG nova.compute.manager [req-a402e61e-7602-49f7-8102-17366772d234 req-981aa8c4-b3d7-4b43-9bb3-3cd81c17ca8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Refreshing instance network info cache due to event network-changed-f7f0a8be-56b9-4a12-8c48-0b7a66239107. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:48:12 compute-0 nova_compute[183075]: 2026-01-22 17:48:12.739 183079 DEBUG oslo_concurrency.lockutils [req-a402e61e-7602-49f7-8102-17366772d234 req-981aa8c4-b3d7-4b43-9bb3-3cd81c17ca8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-38456dbd-001e-46c3-ae2d-5e1765611833" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:48:12 compute-0 nova_compute[183075]: 2026-01-22 17:48:12.802 183079 DEBUG nova.network.neutron [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.630 183079 DEBUG nova.network.neutron [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Updating instance_info_cache with network_info: [{"id": "f7f0a8be-56b9-4a12-8c48-0b7a66239107", "address": "fa:16:3e:d4:5d:bb", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7f0a8be-56", "ovs_interfaceid": "f7f0a8be-56b9-4a12-8c48-0b7a66239107", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.651 183079 DEBUG oslo_concurrency.lockutils [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Releasing lock "refresh_cache-38456dbd-001e-46c3-ae2d-5e1765611833" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.651 183079 DEBUG nova.compute.manager [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Instance network_info: |[{"id": "f7f0a8be-56b9-4a12-8c48-0b7a66239107", "address": "fa:16:3e:d4:5d:bb", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7f0a8be-56", "ovs_interfaceid": "f7f0a8be-56b9-4a12-8c48-0b7a66239107", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.652 183079 DEBUG oslo_concurrency.lockutils [req-a402e61e-7602-49f7-8102-17366772d234 req-981aa8c4-b3d7-4b43-9bb3-3cd81c17ca8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-38456dbd-001e-46c3-ae2d-5e1765611833" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.652 183079 DEBUG nova.network.neutron [req-a402e61e-7602-49f7-8102-17366772d234 req-981aa8c4-b3d7-4b43-9bb3-3cd81c17ca8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Refreshing network info cache for port f7f0a8be-56b9-4a12-8c48-0b7a66239107 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.655 183079 DEBUG nova.virt.libvirt.driver [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Start _get_guest_xml network_info=[{"id": "f7f0a8be-56b9-4a12-8c48-0b7a66239107", "address": "fa:16:3e:d4:5d:bb", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7f0a8be-56", "ovs_interfaceid": "f7f0a8be-56b9-4a12-8c48-0b7a66239107", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.660 183079 WARNING nova.virt.libvirt.driver [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.663 183079 DEBUG nova.virt.libvirt.host [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.664 183079 DEBUG nova.virt.libvirt.host [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.668 183079 DEBUG nova.virt.libvirt.host [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.669 183079 DEBUG nova.virt.libvirt.host [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.669 183079 DEBUG nova.virt.libvirt.driver [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.670 183079 DEBUG nova.virt.hardware [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.670 183079 DEBUG nova.virt.hardware [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.670 183079 DEBUG nova.virt.hardware [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.671 183079 DEBUG nova.virt.hardware [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.671 183079 DEBUG nova.virt.hardware [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.671 183079 DEBUG nova.virt.hardware [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.672 183079 DEBUG nova.virt.hardware [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.672 183079 DEBUG nova.virt.hardware [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.673 183079 DEBUG nova.virt.hardware [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.673 183079 DEBUG nova.virt.hardware [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.673 183079 DEBUG nova.virt.hardware [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.679 183079 DEBUG nova.virt.libvirt.vif [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:48:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-7188366',display_name='tempest-server-test-7188366',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-7188366',id=70,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-jlj121jw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',
image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:48:07Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=38456dbd-001e-46c3-ae2d-5e1765611833,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f7f0a8be-56b9-4a12-8c48-0b7a66239107", "address": "fa:16:3e:d4:5d:bb", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7f0a8be-56", "ovs_interfaceid": "f7f0a8be-56b9-4a12-8c48-0b7a66239107", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.680 183079 DEBUG nova.network.os_vif_util [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "f7f0a8be-56b9-4a12-8c48-0b7a66239107", "address": "fa:16:3e:d4:5d:bb", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7f0a8be-56", "ovs_interfaceid": "f7f0a8be-56b9-4a12-8c48-0b7a66239107", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.680 183079 DEBUG nova.network.os_vif_util [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d4:5d:bb,bridge_name='br-int',has_traffic_filtering=True,id=f7f0a8be-56b9-4a12-8c48-0b7a66239107,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7f0a8be-56') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.681 183079 DEBUG nova.objects.instance [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'pci_devices' on Instance uuid 38456dbd-001e-46c3-ae2d-5e1765611833 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.686 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.697 183079 DEBUG nova.virt.libvirt.driver [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:48:13 compute-0 nova_compute[183075]:   <uuid>38456dbd-001e-46c3-ae2d-5e1765611833</uuid>
Jan 22 17:48:13 compute-0 nova_compute[183075]:   <name>instance-00000046</name>
Jan 22 17:48:13 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:48:13 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:48:13 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:48:13 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-7188366</nova:name>
Jan 22 17:48:13 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:48:13</nova:creationTime>
Jan 22 17:48:13 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:48:13 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:48:13 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:48:13 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:48:13 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:48:13 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:48:13 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:48:13 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:48:13 compute-0 nova_compute[183075]:         <nova:user uuid="66c24efd0cd24c57803ea8508e679d06">tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member</nova:user>
Jan 22 17:48:13 compute-0 nova_compute[183075]:         <nova:project uuid="cbff5cec4b9c4d2eb317124ca20fea9f">tempest-StatelessNetworkSecGroupIPv4Test-1348485316</nova:project>
Jan 22 17:48:13 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:48:13 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:48:13 compute-0 nova_compute[183075]:         <nova:port uuid="f7f0a8be-56b9-4a12-8c48-0b7a66239107">
Jan 22 17:48:13 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:48:13 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:48:13 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:48:13 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <system>
Jan 22 17:48:13 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:48:13 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:48:13 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:48:13 compute-0 nova_compute[183075]:       <entry name="serial">38456dbd-001e-46c3-ae2d-5e1765611833</entry>
Jan 22 17:48:13 compute-0 nova_compute[183075]:       <entry name="uuid">38456dbd-001e-46c3-ae2d-5e1765611833</entry>
Jan 22 17:48:13 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     </system>
Jan 22 17:48:13 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:48:13 compute-0 nova_compute[183075]:   <os>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:   </os>
Jan 22 17:48:13 compute-0 nova_compute[183075]:   <features>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:   </features>
Jan 22 17:48:13 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:48:13 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:48:13 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:48:13 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/38456dbd-001e-46c3-ae2d-5e1765611833/disk"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:48:13 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:d4:5d:bb"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:       <target dev="tapf7f0a8be-56"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:48:13 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/38456dbd-001e-46c3-ae2d-5e1765611833/console.log" append="off"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <video>
Jan 22 17:48:13 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     </video>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:48:13 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:48:13 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:48:13 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:48:13 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:48:13 compute-0 nova_compute[183075]: </domain>
Jan 22 17:48:13 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.698 183079 DEBUG nova.compute.manager [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Preparing to wait for external event network-vif-plugged-f7f0a8be-56b9-4a12-8c48-0b7a66239107 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.699 183079 DEBUG oslo_concurrency.lockutils [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "38456dbd-001e-46c3-ae2d-5e1765611833-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.700 183079 DEBUG oslo_concurrency.lockutils [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "38456dbd-001e-46c3-ae2d-5e1765611833-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.700 183079 DEBUG oslo_concurrency.lockutils [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "38456dbd-001e-46c3-ae2d-5e1765611833-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.701 183079 DEBUG nova.virt.libvirt.vif [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:48:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-7188366',display_name='tempest-server-test-7188366',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-7188366',id=70,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-jlj121jw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model
='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:48:07Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=38456dbd-001e-46c3-ae2d-5e1765611833,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f7f0a8be-56b9-4a12-8c48-0b7a66239107", "address": "fa:16:3e:d4:5d:bb", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7f0a8be-56", "ovs_interfaceid": "f7f0a8be-56b9-4a12-8c48-0b7a66239107", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.701 183079 DEBUG nova.network.os_vif_util [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "f7f0a8be-56b9-4a12-8c48-0b7a66239107", "address": "fa:16:3e:d4:5d:bb", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7f0a8be-56", "ovs_interfaceid": "f7f0a8be-56b9-4a12-8c48-0b7a66239107", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.702 183079 DEBUG nova.network.os_vif_util [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d4:5d:bb,bridge_name='br-int',has_traffic_filtering=True,id=f7f0a8be-56b9-4a12-8c48-0b7a66239107,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7f0a8be-56') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.703 183079 DEBUG os_vif [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d4:5d:bb,bridge_name='br-int',has_traffic_filtering=True,id=f7f0a8be-56b9-4a12-8c48-0b7a66239107,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7f0a8be-56') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.703 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.704 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.704 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.708 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.708 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7f0a8be-56, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.709 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf7f0a8be-56, col_values=(('external_ids', {'iface-id': 'f7f0a8be-56b9-4a12-8c48-0b7a66239107', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d4:5d:bb', 'vm-uuid': '38456dbd-001e-46c3-ae2d-5e1765611833'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.711 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:48:13 compute-0 NetworkManager[55454]: <info>  [1769104093.7118] manager: (tapf7f0a8be-56): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/315)
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.713 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.717 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.718 183079 INFO os_vif [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d4:5d:bb,bridge_name='br-int',has_traffic_filtering=True,id=f7f0a8be-56b9-4a12-8c48-0b7a66239107,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7f0a8be-56')
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.765 183079 DEBUG nova.virt.libvirt.driver [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.766 183079 DEBUG nova.virt.libvirt.driver [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No VIF found with MAC fa:16:3e:d4:5d:bb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:48:13 compute-0 kernel: tapf7f0a8be-56: entered promiscuous mode
Jan 22 17:48:13 compute-0 NetworkManager[55454]: <info>  [1769104093.8349] manager: (tapf7f0a8be-56): new Tun device (/org/freedesktop/NetworkManager/Devices/316)
Jan 22 17:48:13 compute-0 ovn_controller[95372]: 2026-01-22T17:48:13Z|00786|binding|INFO|Claiming lport f7f0a8be-56b9-4a12-8c48-0b7a66239107 for this chassis.
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.837 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:48:13 compute-0 ovn_controller[95372]: 2026-01-22T17:48:13Z|00787|binding|INFO|f7f0a8be-56b9-4a12-8c48-0b7a66239107: Claiming fa:16:3e:d4:5d:bb 10.100.0.10
Jan 22 17:48:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:13.850 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:5d:bb 10.100.0.10'], port_security=['fa:16:3e:d4:5d:bb 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '38456dbd-001e-46c3-ae2d-5e1765611833', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'bdd883a6-7421-4241-bb4f-a9657593b878', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=f7f0a8be-56b9-4a12-8c48-0b7a66239107) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:48:13 compute-0 ovn_controller[95372]: 2026-01-22T17:48:13Z|00788|binding|INFO|Setting lport f7f0a8be-56b9-4a12-8c48-0b7a66239107 ovn-installed in OVS
Jan 22 17:48:13 compute-0 ovn_controller[95372]: 2026-01-22T17:48:13Z|00789|binding|INFO|Setting lport f7f0a8be-56b9-4a12-8c48-0b7a66239107 up in Southbound
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.851 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:48:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:13.852 104629 INFO neutron.agent.ovn.metadata.agent [-] Port f7f0a8be-56b9-4a12-8c48-0b7a66239107 in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 bound to our chassis
Jan 22 17:48:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:13.854 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.856 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:48:13 compute-0 systemd-udevd[240805]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:48:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:13.871 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4c039b79-dd1a-406d-88a3-c57241318afa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:48:13 compute-0 systemd-machined[154382]: New machine qemu-70-instance-00000046.
Jan 22 17:48:13 compute-0 NetworkManager[55454]: <info>  [1769104093.8780] device (tapf7f0a8be-56): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:48:13 compute-0 NetworkManager[55454]: <info>  [1769104093.8791] device (tapf7f0a8be-56): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:48:13 compute-0 systemd[1]: Started Virtual Machine qemu-70-instance-00000046.
Jan 22 17:48:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:13.903 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[193c4dd4-3f35-443c-a077-f363eaa0977d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:48:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:13.905 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[e1892b71-3ad9-4506-928d-4a04aa6450ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:48:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:13.929 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[9ef4926e-22ba-4e50-9bf5-dfc6f8681341]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:48:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:13.945 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4a474c60-0ac4-4c12-8568-9bf7c2d75988]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 56, 'rx_bytes': 8920, 'tx_bytes': 6284, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 56, 'rx_bytes': 8920, 'tx_bytes': 6284, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 210], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 639368, 'reachable_time': 24278, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240819, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:48:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:13.960 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3b6698a1-4c01-4a31-8077-d29ecb1336dc]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 639379, 'tstamp': 639379}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240820, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 639381, 'tstamp': 639381}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240820, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:48:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:13.961 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.963 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:48:13 compute-0 nova_compute[183075]: 2026-01-22 17:48:13.963 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:48:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:13.964 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88ed9213-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:48:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:13.964 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:48:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:13.964 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88ed9213-70, col_values=(('external_ids', {'iface-id': 'da93a32e-eed6-4124-8f08-78b0bfe2cb69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:48:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:13.965 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:48:14 compute-0 nova_compute[183075]: 2026-01-22 17:48:14.720 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104094.7192874, 38456dbd-001e-46c3-ae2d-5e1765611833 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:48:14 compute-0 nova_compute[183075]: 2026-01-22 17:48:14.720 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] VM Started (Lifecycle Event)
Jan 22 17:48:14 compute-0 nova_compute[183075]: 2026-01-22 17:48:14.742 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:48:14 compute-0 nova_compute[183075]: 2026-01-22 17:48:14.747 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104094.7195277, 38456dbd-001e-46c3-ae2d-5e1765611833 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:48:14 compute-0 nova_compute[183075]: 2026-01-22 17:48:14.748 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] VM Paused (Lifecycle Event)
Jan 22 17:48:14 compute-0 nova_compute[183075]: 2026-01-22 17:48:14.770 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:48:14 compute-0 nova_compute[183075]: 2026-01-22 17:48:14.774 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:48:14 compute-0 nova_compute[183075]: 2026-01-22 17:48:14.801 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:48:14 compute-0 nova_compute[183075]: 2026-01-22 17:48:14.853 183079 DEBUG nova.compute.manager [req-20511e3c-6193-4e1b-8156-f405a208b7dc req-7bf56ddc-e7ec-486f-9e63-6cd7c8dd2dd6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Received event network-vif-plugged-f7f0a8be-56b9-4a12-8c48-0b7a66239107 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:48:14 compute-0 nova_compute[183075]: 2026-01-22 17:48:14.854 183079 DEBUG oslo_concurrency.lockutils [req-20511e3c-6193-4e1b-8156-f405a208b7dc req-7bf56ddc-e7ec-486f-9e63-6cd7c8dd2dd6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "38456dbd-001e-46c3-ae2d-5e1765611833-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:48:14 compute-0 nova_compute[183075]: 2026-01-22 17:48:14.854 183079 DEBUG oslo_concurrency.lockutils [req-20511e3c-6193-4e1b-8156-f405a208b7dc req-7bf56ddc-e7ec-486f-9e63-6cd7c8dd2dd6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "38456dbd-001e-46c3-ae2d-5e1765611833-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:48:14 compute-0 nova_compute[183075]: 2026-01-22 17:48:14.854 183079 DEBUG oslo_concurrency.lockutils [req-20511e3c-6193-4e1b-8156-f405a208b7dc req-7bf56ddc-e7ec-486f-9e63-6cd7c8dd2dd6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "38456dbd-001e-46c3-ae2d-5e1765611833-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:48:14 compute-0 nova_compute[183075]: 2026-01-22 17:48:14.854 183079 DEBUG nova.compute.manager [req-20511e3c-6193-4e1b-8156-f405a208b7dc req-7bf56ddc-e7ec-486f-9e63-6cd7c8dd2dd6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Processing event network-vif-plugged-f7f0a8be-56b9-4a12-8c48-0b7a66239107 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:48:14 compute-0 nova_compute[183075]: 2026-01-22 17:48:14.855 183079 DEBUG nova.compute.manager [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:48:14 compute-0 nova_compute[183075]: 2026-01-22 17:48:14.858 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104094.8581183, 38456dbd-001e-46c3-ae2d-5e1765611833 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:48:14 compute-0 nova_compute[183075]: 2026-01-22 17:48:14.858 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] VM Resumed (Lifecycle Event)
Jan 22 17:48:14 compute-0 nova_compute[183075]: 2026-01-22 17:48:14.860 183079 DEBUG nova.virt.libvirt.driver [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:48:14 compute-0 nova_compute[183075]: 2026-01-22 17:48:14.862 183079 INFO nova.virt.libvirt.driver [-] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Instance spawned successfully.
Jan 22 17:48:14 compute-0 nova_compute[183075]: 2026-01-22 17:48:14.863 183079 DEBUG nova.virt.libvirt.driver [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:48:14 compute-0 nova_compute[183075]: 2026-01-22 17:48:14.876 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:48:14 compute-0 nova_compute[183075]: 2026-01-22 17:48:14.882 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:48:14 compute-0 nova_compute[183075]: 2026-01-22 17:48:14.886 183079 DEBUG nova.virt.libvirt.driver [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:48:14 compute-0 nova_compute[183075]: 2026-01-22 17:48:14.887 183079 DEBUG nova.virt.libvirt.driver [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:48:14 compute-0 nova_compute[183075]: 2026-01-22 17:48:14.887 183079 DEBUG nova.virt.libvirt.driver [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:48:14 compute-0 nova_compute[183075]: 2026-01-22 17:48:14.888 183079 DEBUG nova.virt.libvirt.driver [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:48:14 compute-0 nova_compute[183075]: 2026-01-22 17:48:14.888 183079 DEBUG nova.virt.libvirt.driver [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:48:14 compute-0 nova_compute[183075]: 2026-01-22 17:48:14.889 183079 DEBUG nova.virt.libvirt.driver [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:48:14 compute-0 nova_compute[183075]: 2026-01-22 17:48:14.910 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:48:14 compute-0 nova_compute[183075]: 2026-01-22 17:48:14.941 183079 INFO nova.compute.manager [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Took 7.48 seconds to spawn the instance on the hypervisor.
Jan 22 17:48:14 compute-0 nova_compute[183075]: 2026-01-22 17:48:14.942 183079 DEBUG nova.compute.manager [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:48:15 compute-0 nova_compute[183075]: 2026-01-22 17:48:15.031 183079 INFO nova.compute.manager [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Took 8.12 seconds to build instance.
Jan 22 17:48:15 compute-0 nova_compute[183075]: 2026-01-22 17:48:15.056 183079 DEBUG oslo_concurrency.lockutils [None req-86b49ff5-4120-47fc-8a31-d8d881ba1ed8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "38456dbd-001e-46c3-ae2d-5e1765611833" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:48:15 compute-0 nova_compute[183075]: 2026-01-22 17:48:15.859 183079 DEBUG nova.network.neutron [req-a402e61e-7602-49f7-8102-17366772d234 req-981aa8c4-b3d7-4b43-9bb3-3cd81c17ca8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Updated VIF entry in instance network info cache for port f7f0a8be-56b9-4a12-8c48-0b7a66239107. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:48:15 compute-0 nova_compute[183075]: 2026-01-22 17:48:15.859 183079 DEBUG nova.network.neutron [req-a402e61e-7602-49f7-8102-17366772d234 req-981aa8c4-b3d7-4b43-9bb3-3cd81c17ca8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Updating instance_info_cache with network_info: [{"id": "f7f0a8be-56b9-4a12-8c48-0b7a66239107", "address": "fa:16:3e:d4:5d:bb", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7f0a8be-56", "ovs_interfaceid": "f7f0a8be-56b9-4a12-8c48-0b7a66239107", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:48:15 compute-0 nova_compute[183075]: 2026-01-22 17:48:15.876 183079 DEBUG oslo_concurrency.lockutils [req-a402e61e-7602-49f7-8102-17366772d234 req-981aa8c4-b3d7-4b43-9bb3-3cd81c17ca8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-38456dbd-001e-46c3-ae2d-5e1765611833" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:48:16 compute-0 nova_compute[183075]: 2026-01-22 17:48:16.503 183079 INFO nova.compute.manager [None req-ce2c307f-c89f-4369-a9f7-8a04c71e23a1 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Get console output
Jan 22 17:48:16 compute-0 nova_compute[183075]: 2026-01-22 17:48:16.970 183079 DEBUG nova.compute.manager [req-b0065f21-af7e-4ef9-acce-9fa5c4102c12 req-11e4877f-90bc-455f-b53d-4aaae962eda6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Received event network-vif-plugged-f7f0a8be-56b9-4a12-8c48-0b7a66239107 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:48:16 compute-0 nova_compute[183075]: 2026-01-22 17:48:16.970 183079 DEBUG oslo_concurrency.lockutils [req-b0065f21-af7e-4ef9-acce-9fa5c4102c12 req-11e4877f-90bc-455f-b53d-4aaae962eda6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "38456dbd-001e-46c3-ae2d-5e1765611833-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:48:16 compute-0 nova_compute[183075]: 2026-01-22 17:48:16.971 183079 DEBUG oslo_concurrency.lockutils [req-b0065f21-af7e-4ef9-acce-9fa5c4102c12 req-11e4877f-90bc-455f-b53d-4aaae962eda6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "38456dbd-001e-46c3-ae2d-5e1765611833-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:48:16 compute-0 nova_compute[183075]: 2026-01-22 17:48:16.971 183079 DEBUG oslo_concurrency.lockutils [req-b0065f21-af7e-4ef9-acce-9fa5c4102c12 req-11e4877f-90bc-455f-b53d-4aaae962eda6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "38456dbd-001e-46c3-ae2d-5e1765611833-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:48:16 compute-0 nova_compute[183075]: 2026-01-22 17:48:16.971 183079 DEBUG nova.compute.manager [req-b0065f21-af7e-4ef9-acce-9fa5c4102c12 req-11e4877f-90bc-455f-b53d-4aaae962eda6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] No waiting events found dispatching network-vif-plugged-f7f0a8be-56b9-4a12-8c48-0b7a66239107 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:48:16 compute-0 nova_compute[183075]: 2026-01-22 17:48:16.971 183079 WARNING nova.compute.manager [req-b0065f21-af7e-4ef9-acce-9fa5c4102c12 req-11e4877f-90bc-455f-b53d-4aaae962eda6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Received unexpected event network-vif-plugged-f7f0a8be-56b9-4a12-8c48-0b7a66239107 for instance with vm_state active and task_state None.
Jan 22 17:48:18 compute-0 podman[240829]: 2026-01-22 17:48:18.362256116 +0000 UTC m=+0.062054434 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:48:18 compute-0 nova_compute[183075]: 2026-01-22 17:48:18.689 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:48:18 compute-0 nova_compute[183075]: 2026-01-22 17:48:18.712 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:48:21 compute-0 nova_compute[183075]: 2026-01-22 17:48:21.627 183079 INFO nova.compute.manager [None req-6d3b3b8b-2a28-4485-b847-72c76226fbad 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Get console output
Jan 22 17:48:21 compute-0 nova_compute[183075]: 2026-01-22 17:48:21.634 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:48:22 compute-0 podman[240853]: 2026-01-22 17:48:22.34118309 +0000 UTC m=+0.054733717 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:48:23 compute-0 nova_compute[183075]: 2026-01-22 17:48:23.691 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:48:23 compute-0 nova_compute[183075]: 2026-01-22 17:48:23.713 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:48:26 compute-0 nova_compute[183075]: 2026-01-22 17:48:26.779 183079 INFO nova.compute.manager [None req-eedc8e1b-04e8-4cdf-a526-504f00b04a97 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Get console output
Jan 22 17:48:26 compute-0 nova_compute[183075]: 2026-01-22 17:48:26.784 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:48:27 compute-0 ovn_controller[95372]: 2026-01-22T17:48:27Z|00142|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d4:5d:bb 10.100.0.10
Jan 22 17:48:27 compute-0 ovn_controller[95372]: 2026-01-22T17:48:27Z|00143|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d4:5d:bb 10.100.0.10
Jan 22 17:48:28 compute-0 nova_compute[183075]: 2026-01-22 17:48:28.695 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:48:28 compute-0 nova_compute[183075]: 2026-01-22 17:48:28.715 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:48:32 compute-0 nova_compute[183075]: 2026-01-22 17:48:32.275 183079 INFO nova.compute.manager [None req-53438346-41ef-419f-9f44-2510fab3834a 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Get console output
Jan 22 17:48:32 compute-0 nova_compute[183075]: 2026-01-22 17:48:32.281 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:48:32 compute-0 podman[240909]: 2026-01-22 17:48:32.341504291 +0000 UTC m=+0.050220737 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 17:48:32 compute-0 podman[240910]: 2026-01-22 17:48:32.358180207 +0000 UTC m=+0.058762275 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The 
Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.buildah.version=1.33.7)
Jan 22 17:48:32 compute-0 podman[240908]: 2026-01-22 17:48:32.364815045 +0000 UTC m=+0.076514161 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
container_name=ovn_controller, org.label-schema.vendor=CentOS)
Jan 22 17:48:32 compute-0 nova_compute[183075]: 2026-01-22 17:48:32.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:48:33 compute-0 nova_compute[183075]: 2026-01-22 17:48:33.697 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:48:33 compute-0 nova_compute[183075]: 2026-01-22 17:48:33.716 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.033 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.034 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.480 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.481 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.4466836
Jan 22 17:48:34 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.10:35936 [22/Jan/2026:17:48:34.033] listener listener/metadata 0/0/0/447/447 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.488 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.488 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.507 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:48:34 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.10:35942 [22/Jan/2026:17:48:34.487] listener listener/metadata 0/0/0/20/20 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.508 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0196636
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.513 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.514 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.527 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.527 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0135400
Jan 22 17:48:34 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.10:35956 [22/Jan/2026:17:48:34.513] listener listener/metadata 0/0/0/14/14 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.532 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.533 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.548 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:48:34 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.10:35972 [22/Jan/2026:17:48:34.532] listener listener/metadata 0/0/0/16/16 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.549 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0162003
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.552 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.553 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.568 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:48:34 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.10:35978 [22/Jan/2026:17:48:34.552] listener listener/metadata 0/0/0/15/15 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.568 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0150654
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.574 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.575 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.587 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.588 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0130095
Jan 22 17:48:34 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.10:35992 [22/Jan/2026:17:48:34.574] listener listener/metadata 0/0/0/13/13 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.592 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.593 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.608 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.609 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0155869
Jan 22 17:48:34 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.10:35996 [22/Jan/2026:17:48:34.592] listener listener/metadata 0/0/0/16/16 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.613 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.613 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.630 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:48:34 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.10:36008 [22/Jan/2026:17:48:34.612] listener listener/metadata 0/0/0/17/17 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.630 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0170543
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.634 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.634 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.656 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.656 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 163 time: 0.0214443
Jan 22 17:48:34 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.10:36012 [22/Jan/2026:17:48:34.634] listener listener/metadata 0/0/0/22/22 200 147 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.660 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.661 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.673 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.673 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 163 time: 0.0126448
Jan 22 17:48:34 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.10:36028 [22/Jan/2026:17:48:34.660] listener listener/metadata 0/0/0/13/13 200 147 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.677 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.677 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:48:34 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.10:36034 [22/Jan/2026:17:48:34.676] listener listener/metadata 0/0/0/14/14 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.691 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0140572
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.699 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.700 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.715 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:48:34 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.10:36048 [22/Jan/2026:17:48:34.699] listener listener/metadata 0/0/0/16/16 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.715 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0158048
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.718 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.718 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.736 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.736 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0175855
Jan 22 17:48:34 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.10:36052 [22/Jan/2026:17:48:34.718] listener listener/metadata 0/0/0/18/18 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.739 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.740 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.757 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.757 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0178220
Jan 22 17:48:34 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.10:36058 [22/Jan/2026:17:48:34.739] listener listener/metadata 0/0/0/18/18 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.761 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.762 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.779 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:48:34 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.10:36062 [22/Jan/2026:17:48:34.761] listener listener/metadata 0/0/0/17/17 200 147 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.779 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 163 time: 0.0176959
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.783 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.783 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.10
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.798 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:48:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:34.799 104990 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0156419
Jan 22 17:48:34 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.10:36076 [22/Jan/2026:17:48:34.782] listener listener/metadata 0/0/0/16/16 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:48:36 compute-0 nova_compute[183075]: 2026-01-22 17:48:36.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:48:37 compute-0 nova_compute[183075]: 2026-01-22 17:48:37.789 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:48:38 compute-0 nova_compute[183075]: 2026-01-22 17:48:38.146 183079 INFO nova.compute.manager [None req-983a3154-4353-4a40-ad4f-5ff358e81a43 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Get console output
Jan 22 17:48:38 compute-0 nova_compute[183075]: 2026-01-22 17:48:38.152 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:48:38 compute-0 podman[240970]: 2026-01-22 17:48:38.355712664 +0000 UTC m=+0.057708817 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:48:38 compute-0 nova_compute[183075]: 2026-01-22 17:48:38.699 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:48:38 compute-0 nova_compute[183075]: 2026-01-22 17:48:38.717 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:48:38 compute-0 nova_compute[183075]: 2026-01-22 17:48:38.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:48:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:41.967 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:48:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:41.968 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:48:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:48:41.969 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:48:43 compute-0 nova_compute[183075]: 2026-01-22 17:48:43.263 183079 INFO nova.compute.manager [None req-fa1bc01f-b57f-48c5-a545-e6af9ac0ad7a 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Get console output
Jan 22 17:48:43 compute-0 nova_compute[183075]: 2026-01-22 17:48:43.269 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:48:43 compute-0 nova_compute[183075]: 2026-01-22 17:48:43.700 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:48:43 compute-0 nova_compute[183075]: 2026-01-22 17:48:43.718 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:48:43 compute-0 ovn_controller[95372]: 2026-01-22T17:48:43Z|00790|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 22 17:48:45 compute-0 nova_compute[183075]: 2026-01-22 17:48:45.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:48:48 compute-0 nova_compute[183075]: 2026-01-22 17:48:48.410 183079 INFO nova.compute.manager [None req-215ee037-92c6-4e39-9ee6-b4bcd850fd7f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Get console output
Jan 22 17:48:48 compute-0 nova_compute[183075]: 2026-01-22 17:48:48.414 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:48:48 compute-0 nova_compute[183075]: 2026-01-22 17:48:48.720 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:48:48 compute-0 nova_compute[183075]: 2026-01-22 17:48:48.722 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:48:48 compute-0 nova_compute[183075]: 2026-01-22 17:48:48.723 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 17:48:48 compute-0 nova_compute[183075]: 2026-01-22 17:48:48.723 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 17:48:48 compute-0 nova_compute[183075]: 2026-01-22 17:48:48.727 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:48:48 compute-0 nova_compute[183075]: 2026-01-22 17:48:48.728 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 17:48:49 compute-0 podman[240990]: 2026-01-22 17:48:49.341556473 +0000 UTC m=+0.053771062 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:48:49 compute-0 nova_compute[183075]: 2026-01-22 17:48:49.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:48:49 compute-0 nova_compute[183075]: 2026-01-22 17:48:49.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:48:49 compute-0 nova_compute[183075]: 2026-01-22 17:48:49.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:48:50 compute-0 nova_compute[183075]: 2026-01-22 17:48:50.574 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-df3ded3b-e065-4dee-93d3-e1ced39c8619" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:48:50 compute-0 nova_compute[183075]: 2026-01-22 17:48:50.575 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-df3ded3b-e065-4dee-93d3-e1ced39c8619" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:48:50 compute-0 nova_compute[183075]: 2026-01-22 17:48:50.575 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:48:50 compute-0 nova_compute[183075]: 2026-01-22 17:48:50.575 183079 DEBUG nova.objects.instance [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lazy-loading 'info_cache' on Instance uuid df3ded3b-e065-4dee-93d3-e1ced39c8619 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:48:52 compute-0 nova_compute[183075]: 2026-01-22 17:48:52.594 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Updating instance_info_cache with network_info: [{"id": "11393b0e-74ee-456c-8793-6b2a6cb69a8c", "address": "fa:16:3e:af:50:24", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11393b0e-74", "ovs_interfaceid": "11393b0e-74ee-456c-8793-6b2a6cb69a8c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:48:52 compute-0 nova_compute[183075]: 2026-01-22 17:48:52.609 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-df3ded3b-e065-4dee-93d3-e1ced39c8619" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:48:52 compute-0 nova_compute[183075]: 2026-01-22 17:48:52.609 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:48:52 compute-0 nova_compute[183075]: 2026-01-22 17:48:52.610 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_shelved_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:48:52 compute-0 nova_compute[183075]: 2026-01-22 17:48:52.611 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:48:52 compute-0 nova_compute[183075]: 2026-01-22 17:48:52.611 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:48:52 compute-0 nova_compute[183075]: 2026-01-22 17:48:52.611 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:48:52 compute-0 nova_compute[183075]: 2026-01-22 17:48:52.631 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:48:52 compute-0 nova_compute[183075]: 2026-01-22 17:48:52.632 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:48:52 compute-0 nova_compute[183075]: 2026-01-22 17:48:52.632 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:48:52 compute-0 nova_compute[183075]: 2026-01-22 17:48:52.632 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:48:52 compute-0 nova_compute[183075]: 2026-01-22 17:48:52.697 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38456dbd-001e-46c3-ae2d-5e1765611833/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:48:52 compute-0 podman[241017]: 2026-01-22 17:48:52.715881381 +0000 UTC m=+0.049561139 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:48:52 compute-0 nova_compute[183075]: 2026-01-22 17:48:52.756 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38456dbd-001e-46c3-ae2d-5e1765611833/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:48:52 compute-0 nova_compute[183075]: 2026-01-22 17:48:52.757 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38456dbd-001e-46c3-ae2d-5e1765611833/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:48:52 compute-0 nova_compute[183075]: 2026-01-22 17:48:52.811 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38456dbd-001e-46c3-ae2d-5e1765611833/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:48:52 compute-0 nova_compute[183075]: 2026-01-22 17:48:52.818 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df3ded3b-e065-4dee-93d3-e1ced39c8619/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:48:52 compute-0 nova_compute[183075]: 2026-01-22 17:48:52.873 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df3ded3b-e065-4dee-93d3-e1ced39c8619/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:48:52 compute-0 nova_compute[183075]: 2026-01-22 17:48:52.874 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df3ded3b-e065-4dee-93d3-e1ced39c8619/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:48:52 compute-0 nova_compute[183075]: 2026-01-22 17:48:52.930 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df3ded3b-e065-4dee-93d3-e1ced39c8619/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:48:53 compute-0 nova_compute[183075]: 2026-01-22 17:48:53.088 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:48:53 compute-0 nova_compute[183075]: 2026-01-22 17:48:53.089 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5417MB free_disk=73.29502487182617GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:48:53 compute-0 nova_compute[183075]: 2026-01-22 17:48:53.090 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:48:53 compute-0 nova_compute[183075]: 2026-01-22 17:48:53.090 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:48:53 compute-0 nova_compute[183075]: 2026-01-22 17:48:53.182 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance df3ded3b-e065-4dee-93d3-e1ced39c8619 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:48:53 compute-0 nova_compute[183075]: 2026-01-22 17:48:53.182 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 38456dbd-001e-46c3-ae2d-5e1765611833 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:48:53 compute-0 nova_compute[183075]: 2026-01-22 17:48:53.183 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:48:53 compute-0 nova_compute[183075]: 2026-01-22 17:48:53.183 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:48:53 compute-0 nova_compute[183075]: 2026-01-22 17:48:53.242 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:48:53 compute-0 nova_compute[183075]: 2026-01-22 17:48:53.257 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:48:53 compute-0 nova_compute[183075]: 2026-01-22 17:48:53.278 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:48:53 compute-0 nova_compute[183075]: 2026-01-22 17:48:53.278 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:48:53 compute-0 nova_compute[183075]: 2026-01-22 17:48:53.456 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:48:53 compute-0 nova_compute[183075]: 2026-01-22 17:48:53.528 183079 INFO nova.compute.manager [None req-3f09599c-d1b4-4f9c-9675-ad2034e0f3bc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Get console output
Jan 22 17:48:53 compute-0 nova_compute[183075]: 2026-01-22 17:48:53.532 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:48:53 compute-0 nova_compute[183075]: 2026-01-22 17:48:53.729 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:48:53 compute-0 nova_compute[183075]: 2026-01-22 17:48:53.732 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:48:58 compute-0 nova_compute[183075]: 2026-01-22 17:48:58.650 183079 INFO nova.compute.manager [None req-33f51e3c-bf1a-4c9a-aacf-fd0da8f57498 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Get console output
Jan 22 17:48:58 compute-0 nova_compute[183075]: 2026-01-22 17:48:58.656 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:48:58 compute-0 nova_compute[183075]: 2026-01-22 17:48:58.731 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:49:03 compute-0 podman[241056]: 2026-01-22 17:49:03.376989321 +0000 UTC m=+0.065546557 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41)
Jan 22 17:49:03 compute-0 podman[241054]: 2026-01-22 17:49:03.402381211 +0000 UTC m=+0.100275997 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 22 17:49:03 compute-0 podman[241055]: 2026-01-22 17:49:03.407775386 +0000 UTC m=+0.091894383 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, tcib_managed=true)
Jan 22 17:49:03 compute-0 nova_compute[183075]: 2026-01-22 17:49:03.733 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:49:03 compute-0 nova_compute[183075]: 2026-01-22 17:49:03.770 183079 INFO nova.compute.manager [None req-1d87aa40-3e92-4160-8ae7-e509c2249d8c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Get console output
Jan 22 17:49:03 compute-0 nova_compute[183075]: 2026-01-22 17:49:03.774 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:49:08 compute-0 nova_compute[183075]: 2026-01-22 17:49:08.735 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:49:08 compute-0 nova_compute[183075]: 2026-01-22 17:49:08.934 183079 INFO nova.compute.manager [None req-c4a791fd-d92c-4a7b-9c25-8238e8b426c7 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Get console output
Jan 22 17:49:08 compute-0 nova_compute[183075]: 2026-01-22 17:49:08.945 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:49:09 compute-0 podman[241118]: 2026-01-22 17:49:09.371487768 +0000 UTC m=+0.082936306 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.307 183079 DEBUG oslo_concurrency.lockutils [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.309 183079 DEBUG oslo_concurrency.lockutils [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.324 183079 DEBUG nova.compute.manager [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.390 183079 DEBUG oslo_concurrency.lockutils [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.391 183079 DEBUG oslo_concurrency.lockutils [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.397 183079 DEBUG nova.virt.hardware [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.397 183079 INFO nova.compute.claims [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.529 183079 DEBUG nova.compute.provider_tree [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.544 183079 DEBUG nova.scheduler.client.report [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.563 183079 DEBUG oslo_concurrency.lockutils [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.563 183079 DEBUG nova.compute.manager [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.601 183079 DEBUG nova.compute.manager [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.602 183079 DEBUG nova.network.neutron [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.632 183079 INFO nova.virt.libvirt.driver [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.652 183079 DEBUG nova.compute.manager [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.736 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.738 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.754 183079 DEBUG nova.compute.manager [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.756 183079 DEBUG nova.virt.libvirt.driver [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.756 183079 INFO nova.virt.libvirt.driver [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Creating image(s)
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.757 183079 DEBUG oslo_concurrency.lockutils [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "/var/lib/nova/instances/d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.757 183079 DEBUG oslo_concurrency.lockutils [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.758 183079 DEBUG oslo_concurrency.lockutils [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.770 183079 DEBUG nova.policy [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.773 183079 DEBUG oslo_concurrency.processutils [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.825 183079 DEBUG oslo_concurrency.processutils [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.826 183079 DEBUG oslo_concurrency.lockutils [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.827 183079 DEBUG oslo_concurrency.lockutils [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.837 183079 DEBUG oslo_concurrency.processutils [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.890 183079 DEBUG oslo_concurrency.processutils [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.891 183079 DEBUG oslo_concurrency.processutils [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.922 183079 DEBUG oslo_concurrency.processutils [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.923 183079 DEBUG oslo_concurrency.lockutils [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.924 183079 DEBUG oslo_concurrency.processutils [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.983 183079 DEBUG oslo_concurrency.processutils [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.984 183079 DEBUG nova.virt.disk.api [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Checking if we can resize image /var/lib/nova/instances/d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:49:13 compute-0 nova_compute[183075]: 2026-01-22 17:49:13.984 183079 DEBUG oslo_concurrency.processutils [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:49:14 compute-0 nova_compute[183075]: 2026-01-22 17:49:14.041 183079 DEBUG oslo_concurrency.processutils [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:49:14 compute-0 nova_compute[183075]: 2026-01-22 17:49:14.042 183079 DEBUG nova.virt.disk.api [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Cannot resize image /var/lib/nova/instances/d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:49:14 compute-0 nova_compute[183075]: 2026-01-22 17:49:14.042 183079 DEBUG nova.objects.instance [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'migration_context' on Instance uuid d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:49:14 compute-0 nova_compute[183075]: 2026-01-22 17:49:14.057 183079 DEBUG nova.virt.libvirt.driver [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:49:14 compute-0 nova_compute[183075]: 2026-01-22 17:49:14.083 183079 DEBUG nova.virt.libvirt.driver [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Ensure instance console log exists: /var/lib/nova/instances/d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:49:14 compute-0 nova_compute[183075]: 2026-01-22 17:49:14.083 183079 DEBUG oslo_concurrency.lockutils [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:49:14 compute-0 nova_compute[183075]: 2026-01-22 17:49:14.084 183079 DEBUG oslo_concurrency.lockutils [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:49:14 compute-0 nova_compute[183075]: 2026-01-22 17:49:14.084 183079 DEBUG oslo_concurrency.lockutils [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:49:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:15.005 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=53, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=52) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:49:15 compute-0 nova_compute[183075]: 2026-01-22 17:49:15.006 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:49:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:15.007 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:49:15 compute-0 nova_compute[183075]: 2026-01-22 17:49:15.586 183079 DEBUG nova.network.neutron [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Successfully created port: 598a2a6c-cdfc-40ec-9ff8-b57bac7c6063 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:49:16 compute-0 nova_compute[183075]: 2026-01-22 17:49:16.587 183079 DEBUG nova.network.neutron [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Successfully updated port: 598a2a6c-cdfc-40ec-9ff8-b57bac7c6063 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:49:16 compute-0 nova_compute[183075]: 2026-01-22 17:49:16.600 183079 DEBUG oslo_concurrency.lockutils [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "refresh_cache-d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:49:16 compute-0 nova_compute[183075]: 2026-01-22 17:49:16.601 183079 DEBUG oslo_concurrency.lockutils [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquired lock "refresh_cache-d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:49:16 compute-0 nova_compute[183075]: 2026-01-22 17:49:16.601 183079 DEBUG nova.network.neutron [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:49:16 compute-0 nova_compute[183075]: 2026-01-22 17:49:16.702 183079 DEBUG nova.compute.manager [req-d851f3bd-5cad-499f-810c-605b6b505b56 req-d58e9a9a-b20d-45b6-a78d-43a00a3634e2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Received event network-changed-598a2a6c-cdfc-40ec-9ff8-b57bac7c6063 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:49:16 compute-0 nova_compute[183075]: 2026-01-22 17:49:16.702 183079 DEBUG nova.compute.manager [req-d851f3bd-5cad-499f-810c-605b6b505b56 req-d58e9a9a-b20d-45b6-a78d-43a00a3634e2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Refreshing instance network info cache due to event network-changed-598a2a6c-cdfc-40ec-9ff8-b57bac7c6063. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:49:16 compute-0 nova_compute[183075]: 2026-01-22 17:49:16.702 183079 DEBUG oslo_concurrency.lockutils [req-d851f3bd-5cad-499f-810c-605b6b505b56 req-d58e9a9a-b20d-45b6-a78d-43a00a3634e2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:49:16 compute-0 nova_compute[183075]: 2026-01-22 17:49:16.752 183079 DEBUG nova.network.neutron [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.851 183079 DEBUG nova.network.neutron [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Updating instance_info_cache with network_info: [{"id": "598a2a6c-cdfc-40ec-9ff8-b57bac7c6063", "address": "fa:16:3e:9e:71:1f", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap598a2a6c-cd", "ovs_interfaceid": "598a2a6c-cdfc-40ec-9ff8-b57bac7c6063", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.876 183079 DEBUG oslo_concurrency.lockutils [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Releasing lock "refresh_cache-d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.877 183079 DEBUG nova.compute.manager [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Instance network_info: |[{"id": "598a2a6c-cdfc-40ec-9ff8-b57bac7c6063", "address": "fa:16:3e:9e:71:1f", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap598a2a6c-cd", "ovs_interfaceid": "598a2a6c-cdfc-40ec-9ff8-b57bac7c6063", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.877 183079 DEBUG oslo_concurrency.lockutils [req-d851f3bd-5cad-499f-810c-605b6b505b56 req-d58e9a9a-b20d-45b6-a78d-43a00a3634e2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.878 183079 DEBUG nova.network.neutron [req-d851f3bd-5cad-499f-810c-605b6b505b56 req-d58e9a9a-b20d-45b6-a78d-43a00a3634e2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Refreshing network info cache for port 598a2a6c-cdfc-40ec-9ff8-b57bac7c6063 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.881 183079 DEBUG nova.virt.libvirt.driver [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Start _get_guest_xml network_info=[{"id": "598a2a6c-cdfc-40ec-9ff8-b57bac7c6063", "address": "fa:16:3e:9e:71:1f", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap598a2a6c-cd", "ovs_interfaceid": "598a2a6c-cdfc-40ec-9ff8-b57bac7c6063", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.886 183079 WARNING nova.virt.libvirt.driver [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.894 183079 DEBUG nova.virt.libvirt.host [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.895 183079 DEBUG nova.virt.libvirt.host [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.898 183079 DEBUG nova.virt.libvirt.host [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.899 183079 DEBUG nova.virt.libvirt.host [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.899 183079 DEBUG nova.virt.libvirt.driver [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.899 183079 DEBUG nova.virt.hardware [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.900 183079 DEBUG nova.virt.hardware [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.900 183079 DEBUG nova.virt.hardware [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.901 183079 DEBUG nova.virt.hardware [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.901 183079 DEBUG nova.virt.hardware [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.901 183079 DEBUG nova.virt.hardware [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.902 183079 DEBUG nova.virt.hardware [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.902 183079 DEBUG nova.virt.hardware [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.902 183079 DEBUG nova.virt.hardware [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.902 183079 DEBUG nova.virt.hardware [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.903 183079 DEBUG nova.virt.hardware [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.907 183079 DEBUG nova.virt.libvirt.vif [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:49:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1547454978',display_name='tempest-server-test-1547454978',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1547454978',id=71,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-6n6k9xyp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:49:13Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "598a2a6c-cdfc-40ec-9ff8-b57bac7c6063", "address": "fa:16:3e:9e:71:1f", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap598a2a6c-cd", "ovs_interfaceid": "598a2a6c-cdfc-40ec-9ff8-b57bac7c6063", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.908 183079 DEBUG nova.network.os_vif_util [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "598a2a6c-cdfc-40ec-9ff8-b57bac7c6063", "address": "fa:16:3e:9e:71:1f", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap598a2a6c-cd", "ovs_interfaceid": "598a2a6c-cdfc-40ec-9ff8-b57bac7c6063", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.908 183079 DEBUG nova.network.os_vif_util [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9e:71:1f,bridge_name='br-int',has_traffic_filtering=True,id=598a2a6c-cdfc-40ec-9ff8-b57bac7c6063,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap598a2a6c-cd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.909 183079 DEBUG nova.objects.instance [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'pci_devices' on Instance uuid d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.926 183079 DEBUG nova.virt.libvirt.driver [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:49:17 compute-0 nova_compute[183075]:   <uuid>d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2</uuid>
Jan 22 17:49:17 compute-0 nova_compute[183075]:   <name>instance-00000047</name>
Jan 22 17:49:17 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:49:17 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:49:17 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:49:17 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1547454978</nova:name>
Jan 22 17:49:17 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:49:17</nova:creationTime>
Jan 22 17:49:17 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:49:17 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:49:17 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:49:17 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:49:17 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:49:17 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:49:17 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:49:17 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:49:17 compute-0 nova_compute[183075]:         <nova:user uuid="66c24efd0cd24c57803ea8508e679d06">tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member</nova:user>
Jan 22 17:49:17 compute-0 nova_compute[183075]:         <nova:project uuid="cbff5cec4b9c4d2eb317124ca20fea9f">tempest-StatelessNetworkSecGroupIPv4Test-1348485316</nova:project>
Jan 22 17:49:17 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:49:17 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:49:17 compute-0 nova_compute[183075]:         <nova:port uuid="598a2a6c-cdfc-40ec-9ff8-b57bac7c6063">
Jan 22 17:49:17 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:49:17 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:49:17 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:49:17 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <system>
Jan 22 17:49:17 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:49:17 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:49:17 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:49:17 compute-0 nova_compute[183075]:       <entry name="serial">d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2</entry>
Jan 22 17:49:17 compute-0 nova_compute[183075]:       <entry name="uuid">d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2</entry>
Jan 22 17:49:17 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     </system>
Jan 22 17:49:17 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:49:17 compute-0 nova_compute[183075]:   <os>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:   </os>
Jan 22 17:49:17 compute-0 nova_compute[183075]:   <features>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:   </features>
Jan 22 17:49:17 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:49:17 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:49:17 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:49:17 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/disk"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:49:17 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:9e:71:1f"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:       <target dev="tap598a2a6c-cd"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:49:17 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/console.log" append="off"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <video>
Jan 22 17:49:17 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     </video>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:49:17 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:49:17 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:49:17 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:49:17 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:49:17 compute-0 nova_compute[183075]: </domain>
Jan 22 17:49:17 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.927 183079 DEBUG nova.compute.manager [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Preparing to wait for external event network-vif-plugged-598a2a6c-cdfc-40ec-9ff8-b57bac7c6063 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.928 183079 DEBUG oslo_concurrency.lockutils [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.928 183079 DEBUG oslo_concurrency.lockutils [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.928 183079 DEBUG oslo_concurrency.lockutils [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.930 183079 DEBUG nova.virt.libvirt.vif [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:49:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1547454978',display_name='tempest-server-test-1547454978',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1547454978',id=71,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-6n6k9xyp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:49:13Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "598a2a6c-cdfc-40ec-9ff8-b57bac7c6063", "address": "fa:16:3e:9e:71:1f", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap598a2a6c-cd", "ovs_interfaceid": "598a2a6c-cdfc-40ec-9ff8-b57bac7c6063", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.930 183079 DEBUG nova.network.os_vif_util [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "598a2a6c-cdfc-40ec-9ff8-b57bac7c6063", "address": "fa:16:3e:9e:71:1f", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap598a2a6c-cd", "ovs_interfaceid": "598a2a6c-cdfc-40ec-9ff8-b57bac7c6063", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.931 183079 DEBUG nova.network.os_vif_util [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9e:71:1f,bridge_name='br-int',has_traffic_filtering=True,id=598a2a6c-cdfc-40ec-9ff8-b57bac7c6063,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap598a2a6c-cd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.932 183079 DEBUG os_vif [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:71:1f,bridge_name='br-int',has_traffic_filtering=True,id=598a2a6c-cdfc-40ec-9ff8-b57bac7c6063,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap598a2a6c-cd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.933 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.933 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.934 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.937 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.937 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap598a2a6c-cd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.938 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap598a2a6c-cd, col_values=(('external_ids', {'iface-id': '598a2a6c-cdfc-40ec-9ff8-b57bac7c6063', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9e:71:1f', 'vm-uuid': 'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.939 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:49:17 compute-0 NetworkManager[55454]: <info>  [1769104157.9412] manager: (tap598a2a6c-cd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/317)
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.941 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.947 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.948 183079 INFO os_vif [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:71:1f,bridge_name='br-int',has_traffic_filtering=True,id=598a2a6c-cdfc-40ec-9ff8-b57bac7c6063,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap598a2a6c-cd')
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.997 183079 DEBUG nova.virt.libvirt.driver [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:49:17 compute-0 nova_compute[183075]: 2026-01-22 17:49:17.998 183079 DEBUG nova.virt.libvirt.driver [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No VIF found with MAC fa:16:3e:9e:71:1f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:49:18 compute-0 kernel: tap598a2a6c-cd: entered promiscuous mode
Jan 22 17:49:18 compute-0 NetworkManager[55454]: <info>  [1769104158.0624] manager: (tap598a2a6c-cd): new Tun device (/org/freedesktop/NetworkManager/Devices/318)
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.062 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:49:18 compute-0 ovn_controller[95372]: 2026-01-22T17:49:18Z|00791|binding|INFO|Claiming lport 598a2a6c-cdfc-40ec-9ff8-b57bac7c6063 for this chassis.
Jan 22 17:49:18 compute-0 ovn_controller[95372]: 2026-01-22T17:49:18Z|00792|binding|INFO|598a2a6c-cdfc-40ec-9ff8-b57bac7c6063: Claiming fa:16:3e:9e:71:1f 10.100.0.14
Jan 22 17:49:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:18.074 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9e:71:1f 10.100.0.14'], port_security=['fa:16:3e:9e:71:1f 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'bdd883a6-7421-4241-bb4f-a9657593b878', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=598a2a6c-cdfc-40ec-9ff8-b57bac7c6063) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:49:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:18.076 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 598a2a6c-cdfc-40ec-9ff8-b57bac7c6063 in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 bound to our chassis
Jan 22 17:49:18 compute-0 ovn_controller[95372]: 2026-01-22T17:49:18Z|00793|binding|INFO|Setting lport 598a2a6c-cdfc-40ec-9ff8-b57bac7c6063 up in Southbound
Jan 22 17:49:18 compute-0 ovn_controller[95372]: 2026-01-22T17:49:18Z|00794|binding|INFO|Setting lport 598a2a6c-cdfc-40ec-9ff8-b57bac7c6063 ovn-installed in OVS
Jan 22 17:49:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:18.077 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.078 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.084 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:49:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:18.098 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[36f4ffd2-c71a-4aa2-a9af-c0205b09c1fb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:49:18 compute-0 systemd-udevd[241169]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:49:18 compute-0 systemd-machined[154382]: New machine qemu-71-instance-00000047.
Jan 22 17:49:18 compute-0 NetworkManager[55454]: <info>  [1769104158.1164] device (tap598a2a6c-cd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:49:18 compute-0 NetworkManager[55454]: <info>  [1769104158.1172] device (tap598a2a6c-cd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:49:18 compute-0 systemd[1]: Started Virtual Machine qemu-71-instance-00000047.
Jan 22 17:49:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:18.137 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[b2dde1d1-7254-4c5f-a886-1cc9f4bf103e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:49:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:18.140 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[55af691d-eebc-4b5f-a0b5-cbadac227d73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:49:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:18.173 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[84848b64-9675-4779-9df2-183f52e456b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:49:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:18.191 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b58856ac-ce90-46b9-8975-85734d9c9aea]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 202, 'tx_packets': 107, 'rx_bytes': 17308, 'tx_bytes': 12137, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 202, 'tx_packets': 107, 'rx_bytes': 17308, 'tx_bytes': 12137, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 210], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 639368, 'reachable_time': 24278, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241182, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:49:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:18.208 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[13362e35-75aa-49a4-a836-311c5619d930]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 639379, 'tstamp': 639379}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241184, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 639381, 'tstamp': 639381}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241184, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:49:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:18.209 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.268 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:49:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:18.268 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88ed9213-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:49:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:18.268 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:49:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:18.268 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88ed9213-70, col_values=(('external_ids', {'iface-id': 'da93a32e-eed6-4124-8f08-78b0bfe2cb69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:49:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:18.268 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.451 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104158.4510114, d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.452 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] VM Started (Lifecycle Event)
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.568 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.571 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104158.4542718, d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.572 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] VM Paused (Lifecycle Event)
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.597 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.600 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.618 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.720 183079 DEBUG nova.compute.manager [req-56942788-9cbf-4ba9-a4b0-fb08b71cbc1d req-28d3b4a0-566b-40c6-956d-40e39bd97e60 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Received event network-vif-plugged-598a2a6c-cdfc-40ec-9ff8-b57bac7c6063 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.721 183079 DEBUG oslo_concurrency.lockutils [req-56942788-9cbf-4ba9-a4b0-fb08b71cbc1d req-28d3b4a0-566b-40c6-956d-40e39bd97e60 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.721 183079 DEBUG oslo_concurrency.lockutils [req-56942788-9cbf-4ba9-a4b0-fb08b71cbc1d req-28d3b4a0-566b-40c6-956d-40e39bd97e60 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.722 183079 DEBUG oslo_concurrency.lockutils [req-56942788-9cbf-4ba9-a4b0-fb08b71cbc1d req-28d3b4a0-566b-40c6-956d-40e39bd97e60 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.722 183079 DEBUG nova.compute.manager [req-56942788-9cbf-4ba9-a4b0-fb08b71cbc1d req-28d3b4a0-566b-40c6-956d-40e39bd97e60 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Processing event network-vif-plugged-598a2a6c-cdfc-40ec-9ff8-b57bac7c6063 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.723 183079 DEBUG nova.compute.manager [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.726 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104158.725601, d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.726 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] VM Resumed (Lifecycle Event)
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.728 183079 DEBUG nova.virt.libvirt.driver [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.731 183079 INFO nova.virt.libvirt.driver [-] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Instance spawned successfully.
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.731 183079 DEBUG nova.virt.libvirt.driver [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.738 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.744 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.748 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.751 183079 DEBUG nova.virt.libvirt.driver [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.752 183079 DEBUG nova.virt.libvirt.driver [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.752 183079 DEBUG nova.virt.libvirt.driver [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.752 183079 DEBUG nova.virt.libvirt.driver [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.753 183079 DEBUG nova.virt.libvirt.driver [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.753 183079 DEBUG nova.virt.libvirt.driver [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.778 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.805 183079 INFO nova.compute.manager [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Took 5.05 seconds to spawn the instance on the hypervisor.
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.806 183079 DEBUG nova.compute.manager [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.863 183079 INFO nova.compute.manager [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Took 5.50 seconds to build instance.
Jan 22 17:49:18 compute-0 nova_compute[183075]: 2026-01-22 17:49:18.878 183079 DEBUG oslo_concurrency.lockutils [None req-b77d9a5e-e10c-4113-b404-43ada242ee03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.569s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:49:19 compute-0 nova_compute[183075]: 2026-01-22 17:49:19.587 183079 DEBUG nova.network.neutron [req-d851f3bd-5cad-499f-810c-605b6b505b56 req-d58e9a9a-b20d-45b6-a78d-43a00a3634e2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Updated VIF entry in instance network info cache for port 598a2a6c-cdfc-40ec-9ff8-b57bac7c6063. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:49:19 compute-0 nova_compute[183075]: 2026-01-22 17:49:19.587 183079 DEBUG nova.network.neutron [req-d851f3bd-5cad-499f-810c-605b6b505b56 req-d58e9a9a-b20d-45b6-a78d-43a00a3634e2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Updating instance_info_cache with network_info: [{"id": "598a2a6c-cdfc-40ec-9ff8-b57bac7c6063", "address": "fa:16:3e:9e:71:1f", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap598a2a6c-cd", "ovs_interfaceid": "598a2a6c-cdfc-40ec-9ff8-b57bac7c6063", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:49:19 compute-0 nova_compute[183075]: 2026-01-22 17:49:19.599 183079 DEBUG oslo_concurrency.lockutils [req-d851f3bd-5cad-499f-810c-605b6b505b56 req-d58e9a9a-b20d-45b6-a78d-43a00a3634e2 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:49:20 compute-0 podman[241192]: 2026-01-22 17:49:20.373942636 +0000 UTC m=+0.071356135 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:49:20 compute-0 nova_compute[183075]: 2026-01-22 17:49:20.825 183079 DEBUG nova.compute.manager [req-9b30a811-311d-41c2-9558-918b4d07f778 req-187131f3-a313-41bc-91a2-a645bf8933a7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Received event network-vif-plugged-598a2a6c-cdfc-40ec-9ff8-b57bac7c6063 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:49:20 compute-0 nova_compute[183075]: 2026-01-22 17:49:20.825 183079 DEBUG oslo_concurrency.lockutils [req-9b30a811-311d-41c2-9558-918b4d07f778 req-187131f3-a313-41bc-91a2-a645bf8933a7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:49:20 compute-0 nova_compute[183075]: 2026-01-22 17:49:20.825 183079 DEBUG oslo_concurrency.lockutils [req-9b30a811-311d-41c2-9558-918b4d07f778 req-187131f3-a313-41bc-91a2-a645bf8933a7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:49:20 compute-0 nova_compute[183075]: 2026-01-22 17:49:20.826 183079 DEBUG oslo_concurrency.lockutils [req-9b30a811-311d-41c2-9558-918b4d07f778 req-187131f3-a313-41bc-91a2-a645bf8933a7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:49:20 compute-0 nova_compute[183075]: 2026-01-22 17:49:20.826 183079 DEBUG nova.compute.manager [req-9b30a811-311d-41c2-9558-918b4d07f778 req-187131f3-a313-41bc-91a2-a645bf8933a7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] No waiting events found dispatching network-vif-plugged-598a2a6c-cdfc-40ec-9ff8-b57bac7c6063 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:49:20 compute-0 nova_compute[183075]: 2026-01-22 17:49:20.826 183079 WARNING nova.compute.manager [req-9b30a811-311d-41c2-9558-918b4d07f778 req-187131f3-a313-41bc-91a2-a645bf8933a7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Received unexpected event network-vif-plugged-598a2a6c-cdfc-40ec-9ff8-b57bac7c6063 for instance with vm_state active and task_state None.
Jan 22 17:49:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:21.009 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '53'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:49:21 compute-0 nova_compute[183075]: 2026-01-22 17:49:21.851 183079 INFO nova.compute.manager [None req-7c7e9387-38f3-420a-b315-18e995273531 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Get console output
Jan 22 17:49:21 compute-0 nova_compute[183075]: 2026-01-22 17:49:21.859 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:49:22 compute-0 nova_compute[183075]: 2026-01-22 17:49:22.941 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:49:23 compute-0 podman[241217]: 2026-01-22 17:49:23.358883189 +0000 UTC m=+0.066328189 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 17:49:23 compute-0 nova_compute[183075]: 2026-01-22 17:49:23.740 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:49:27 compute-0 nova_compute[183075]: 2026-01-22 17:49:27.004 183079 INFO nova.compute.manager [None req-ed8a3978-34a1-4540-bc8b-97805f59e9c2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Get console output
Jan 22 17:49:27 compute-0 nova_compute[183075]: 2026-01-22 17:49:27.946 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:49:28 compute-0 nova_compute[183075]: 2026-01-22 17:49:28.742 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:49:32 compute-0 nova_compute[183075]: 2026-01-22 17:49:32.180 183079 INFO nova.compute.manager [None req-776fec91-f062-403f-9e1f-7be8167f23f1 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Get console output
Jan 22 17:49:32 compute-0 ovn_controller[95372]: 2026-01-22T17:49:32Z|00144|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9e:71:1f 10.100.0.14
Jan 22 17:49:32 compute-0 ovn_controller[95372]: 2026-01-22T17:49:32Z|00145|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9e:71:1f 10.100.0.14
Jan 22 17:49:32 compute-0 nova_compute[183075]: 2026-01-22 17:49:32.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:49:32 compute-0 nova_compute[183075]: 2026-01-22 17:49:32.948 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:49:33 compute-0 nova_compute[183075]: 2026-01-22 17:49:33.743 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:49:34 compute-0 podman[241256]: 2026-01-22 17:49:34.380161822 +0000 UTC m=+0.075022763 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, managed_by=edpm_ansible, version=9.6, release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.buildah.version=1.33.7)
Jan 22 17:49:34 compute-0 podman[241255]: 2026-01-22 17:49:34.383955654 +0000 UTC m=+0.085379611 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:49:34 compute-0 podman[241254]: 2026-01-22 17:49:34.410317611 +0000 UTC m=+0.106213170 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.schema-version=1.0)
Jan 22 17:49:37 compute-0 nova_compute[183075]: 2026-01-22 17:49:37.419 183079 INFO nova.compute.manager [None req-aa036cc0-b962-45ab-8419-214b5fb43b7f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Get console output
Jan 22 17:49:37 compute-0 nova_compute[183075]: 2026-01-22 17:49:37.424 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:49:37 compute-0 nova_compute[183075]: 2026-01-22 17:49:37.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:49:37 compute-0 nova_compute[183075]: 2026-01-22 17:49:37.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:49:37 compute-0 nova_compute[183075]: 2026-01-22 17:49:37.950 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:49:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:38.445 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:49:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:38.446 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:49:38 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:49:38 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:49:38 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:49:38 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:49:38 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:49:38 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:49:38 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:49:38 compute-0 nova_compute[183075]: 2026-01-22 17:49:38.745 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:49:38 compute-0 nova_compute[183075]: 2026-01-22 17:49:38.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:49:40 compute-0 podman[241336]: 2026-01-22 17:49:40.377927367 +0000 UTC m=+0.080100239 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.567 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.568 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 2.1211588
Jan 22 17:49:40 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.14:46574 [22/Jan/2026:17:49:38.444] listener listener/metadata 0/0/0/2123/2123 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.582 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.583 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.610 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:49:40 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.14:46588 [22/Jan/2026:17:49:40.582] listener listener/metadata 0/0/0/28/28 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.610 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0274911
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.614 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.615 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.630 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.630 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0148916
Jan 22 17:49:40 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.14:46594 [22/Jan/2026:17:49:40.614] listener listener/metadata 0/0/0/15/15 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.640 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.641 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.660 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:49:40 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.14:46606 [22/Jan/2026:17:49:40.640] listener listener/metadata 0/0/0/20/20 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.660 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0192549
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.670 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.670 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.686 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.687 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0168681
Jan 22 17:49:40 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.14:46608 [22/Jan/2026:17:49:40.669] listener listener/metadata 0/0/0/17/17 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.693 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.693 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.706 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.706 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0132987
Jan 22 17:49:40 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.14:46620 [22/Jan/2026:17:49:40.692] listener listener/metadata 0/0/0/14/14 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.711 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.712 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.728 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.729 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0170825
Jan 22 17:49:40 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.14:46624 [22/Jan/2026:17:49:40.711] listener listener/metadata 0/0/0/17/17 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.734 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.735 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.749 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:49:40 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.14:46638 [22/Jan/2026:17:49:40.734] listener listener/metadata 0/0/0/15/15 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.749 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0143795
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.754 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.754 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.769 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.770 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0156729
Jan 22 17:49:40 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.14:46642 [22/Jan/2026:17:49:40.753] listener listener/metadata 0/0/0/16/16 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.779 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.779 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.794 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:49:40 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.14:46648 [22/Jan/2026:17:49:40.778] listener listener/metadata 0/0/0/16/16 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.794 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0153477
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.801 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.801 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:49:40 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.14:46660 [22/Jan/2026:17:49:40.800] listener listener/metadata 0/0/0/19/19 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.820 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0188279
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.852 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.853 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.867 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:49:40 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.14:46666 [22/Jan/2026:17:49:40.851] listener listener/metadata 0/0/0/15/15 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.867 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0146954
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.871 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.871 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.889 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:49:40 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.14:46676 [22/Jan/2026:17:49:40.870] listener listener/metadata 0/0/0/19/19 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.890 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0185547
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.894 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.895 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.913 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.913 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0182908
Jan 22 17:49:40 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.14:46680 [22/Jan/2026:17:49:40.894] listener listener/metadata 0/0/0/19/19 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.918 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.919 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.933 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:49:40 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.14:46682 [22/Jan/2026:17:49:40.918] listener listener/metadata 0/0/0/15/15 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.934 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0150645
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.942 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.943 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.973 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:49:40 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:40.973 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0305924
Jan 22 17:49:40 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240487]: 10.100.0.14:46698 [22/Jan/2026:17:49:40.941] listener listener/metadata 0/0/0/31/31 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:49:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:41.968 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:49:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:41.969 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:49:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:49:41.970 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:49:42 compute-0 nova_compute[183075]: 2026-01-22 17:49:42.717 183079 INFO nova.compute.manager [None req-0cebd274-95fb-4bba-ada7-3cf3e96b9c61 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Get console output
Jan 22 17:49:42 compute-0 nova_compute[183075]: 2026-01-22 17:49:42.722 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:49:42 compute-0 nova_compute[183075]: 2026-01-22 17:49:42.997 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:49:43 compute-0 nova_compute[183075]: 2026-01-22 17:49:43.747 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:49:43 compute-0 nova_compute[183075]: 2026-01-22 17:49:43.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:49:45 compute-0 nova_compute[183075]: 2026-01-22 17:49:45.811 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:49:47 compute-0 nova_compute[183075]: 2026-01-22 17:49:47.999 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:49:48 compute-0 nova_compute[183075]: 2026-01-22 17:49:48.042 183079 INFO nova.compute.manager [None req-7fcacdbf-7779-4538-8423-ae812625a24c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Get console output
Jan 22 17:49:48 compute-0 nova_compute[183075]: 2026-01-22 17:49:48.047 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:49:48 compute-0 ovn_controller[95372]: 2026-01-22T17:49:48Z|00795|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 22 17:49:48 compute-0 nova_compute[183075]: 2026-01-22 17:49:48.750 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:49:49 compute-0 nova_compute[183075]: 2026-01-22 17:49:49.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:49:49 compute-0 nova_compute[183075]: 2026-01-22 17:49:49.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:49:50 compute-0 nova_compute[183075]: 2026-01-22 17:49:50.573 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-38456dbd-001e-46c3-ae2d-5e1765611833" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:49:50 compute-0 nova_compute[183075]: 2026-01-22 17:49:50.573 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-38456dbd-001e-46c3-ae2d-5e1765611833" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:49:50 compute-0 nova_compute[183075]: 2026-01-22 17:49:50.573 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:49:51 compute-0 podman[241357]: 2026-01-22 17:49:51.366525564 +0000 UTC m=+0.060987947 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 17:49:52 compute-0 nova_compute[183075]: 2026-01-22 17:49:52.763 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Updating instance_info_cache with network_info: [{"id": "f7f0a8be-56b9-4a12-8c48-0b7a66239107", "address": "fa:16:3e:d4:5d:bb", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7f0a8be-56", "ovs_interfaceid": "f7f0a8be-56b9-4a12-8c48-0b7a66239107", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:49:52 compute-0 nova_compute[183075]: 2026-01-22 17:49:52.778 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-38456dbd-001e-46c3-ae2d-5e1765611833" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:49:52 compute-0 nova_compute[183075]: 2026-01-22 17:49:52.779 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:49:52 compute-0 nova_compute[183075]: 2026-01-22 17:49:52.779 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:49:52 compute-0 nova_compute[183075]: 2026-01-22 17:49:52.800 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:49:52 compute-0 nova_compute[183075]: 2026-01-22 17:49:52.802 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:49:52 compute-0 nova_compute[183075]: 2026-01-22 17:49:52.802 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:49:52 compute-0 nova_compute[183075]: 2026-01-22 17:49:52.803 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:49:52 compute-0 nova_compute[183075]: 2026-01-22 17:49:52.881 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38456dbd-001e-46c3-ae2d-5e1765611833/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:49:52 compute-0 nova_compute[183075]: 2026-01-22 17:49:52.960 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38456dbd-001e-46c3-ae2d-5e1765611833/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:49:52 compute-0 nova_compute[183075]: 2026-01-22 17:49:52.962 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38456dbd-001e-46c3-ae2d-5e1765611833/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:49:53 compute-0 nova_compute[183075]: 2026-01-22 17:49:53.001 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:49:53 compute-0 nova_compute[183075]: 2026-01-22 17:49:53.016 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38456dbd-001e-46c3-ae2d-5e1765611833/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:49:53 compute-0 nova_compute[183075]: 2026-01-22 17:49:53.022 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:49:53 compute-0 nova_compute[183075]: 2026-01-22 17:49:53.077 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:49:53 compute-0 nova_compute[183075]: 2026-01-22 17:49:53.078 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:49:53 compute-0 nova_compute[183075]: 2026-01-22 17:49:53.142 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:49:53 compute-0 nova_compute[183075]: 2026-01-22 17:49:53.148 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df3ded3b-e065-4dee-93d3-e1ced39c8619/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:49:53 compute-0 nova_compute[183075]: 2026-01-22 17:49:53.166 183079 INFO nova.compute.manager [None req-3d04b188-f246-4184-b0e9-48a07d430d63 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Get console output
Jan 22 17:49:53 compute-0 nova_compute[183075]: 2026-01-22 17:49:53.172 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:49:53 compute-0 nova_compute[183075]: 2026-01-22 17:49:53.203 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df3ded3b-e065-4dee-93d3-e1ced39c8619/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:49:53 compute-0 nova_compute[183075]: 2026-01-22 17:49:53.203 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df3ded3b-e065-4dee-93d3-e1ced39c8619/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:49:53 compute-0 nova_compute[183075]: 2026-01-22 17:49:53.259 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df3ded3b-e065-4dee-93d3-e1ced39c8619/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:49:53 compute-0 nova_compute[183075]: 2026-01-22 17:49:53.496 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:49:53 compute-0 nova_compute[183075]: 2026-01-22 17:49:53.497 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5255MB free_disk=73.26605987548828GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:49:53 compute-0 nova_compute[183075]: 2026-01-22 17:49:53.497 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:49:53 compute-0 nova_compute[183075]: 2026-01-22 17:49:53.498 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:49:53 compute-0 nova_compute[183075]: 2026-01-22 17:49:53.592 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance df3ded3b-e065-4dee-93d3-e1ced39c8619 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:49:53 compute-0 nova_compute[183075]: 2026-01-22 17:49:53.593 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 38456dbd-001e-46c3-ae2d-5e1765611833 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:49:53 compute-0 nova_compute[183075]: 2026-01-22 17:49:53.593 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:49:53 compute-0 nova_compute[183075]: 2026-01-22 17:49:53.594 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:49:53 compute-0 nova_compute[183075]: 2026-01-22 17:49:53.594 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:49:53 compute-0 nova_compute[183075]: 2026-01-22 17:49:53.672 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:49:53 compute-0 nova_compute[183075]: 2026-01-22 17:49:53.689 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:49:53 compute-0 nova_compute[183075]: 2026-01-22 17:49:53.714 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:49:53 compute-0 nova_compute[183075]: 2026-01-22 17:49:53.714 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.217s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:49:53 compute-0 nova_compute[183075]: 2026-01-22 17:49:53.715 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:49:53 compute-0 nova_compute[183075]: 2026-01-22 17:49:53.715 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 17:49:53 compute-0 nova_compute[183075]: 2026-01-22 17:49:53.738 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:49:53 compute-0 nova_compute[183075]: 2026-01-22 17:49:53.738 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:49:53 compute-0 nova_compute[183075]: 2026-01-22 17:49:53.752 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:49:53 compute-0 nova_compute[183075]: 2026-01-22 17:49:53.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:49:54 compute-0 podman[241400]: 2026-01-22 17:49:54.339231729 +0000 UTC m=+0.046614061 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:49:55 compute-0 sshd-session[241427]: Received disconnect from 91.224.92.54 port 58936:11:  [preauth]
Jan 22 17:49:55 compute-0 sshd-session[241427]: Disconnected from authenticating user root 91.224.92.54 port 58936 [preauth]
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.463 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '38456dbd-001e-46c3-ae2d-5e1765611833', 'name': 'tempest-server-test-7188366', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000046', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'hostId': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.466 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2', 'name': 'tempest-server-test-1547454978', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000047', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'hostId': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.468 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'name': 'tempest-server-test-1131995147', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000045', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'hostId': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.469 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.471 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 38456dbd-001e-46c3-ae2d-5e1765611833 / tapf7f0a8be-56 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.472 12 DEBUG ceilometer.compute.pollsters [-] 38456dbd-001e-46c3-ae2d-5e1765611833/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.474 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2 / tap598a2a6c-cd inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.474 12 DEBUG ceilometer.compute.pollsters [-] d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.476 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/network.incoming.bytes.delta volume: 336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1fe49885-c0ec-427b-853b-6ba10dcc1315', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000046-38456dbd-001e-46c3-ae2d-5e1765611833-tapf7f0a8be-56', 'timestamp': '2026-01-22T17:49:55.469522', 'resource_metadata': {'display_name': 'tempest-server-test-7188366', 'name': 'tapf7f0a8be-56', 'instance_id': '38456dbd-001e-46c3-ae2d-5e1765611833', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:5d:bb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7f0a8be-56'}, 'message_id': 'c2ec3454-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.229955555, 'message_signature': '09cab5571a844306972b6f2387930c11f1e49ac6945d6832e4673f5da4cf18f9'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000047-d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-tap598a2a6c-cd', 'timestamp': '2026-01-22T17:49:55.469522', 'resource_metadata': {'display_name': 'tempest-server-test-1547454978', 'name': 'tap598a2a6c-cd', 'instance_id': 'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9e:71:1f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap598a2a6c-cd'}, 'message_id': 'c2ec9412-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.232844872, 'message_signature': '8f13028928f0b28f5cf4342b0dd2e4fc219e1183be04bd6690247e8d40f81c8a'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 336, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000045-df3ded3b-e065-4dee-93d3-e1ced39c8619-tap11393b0e-74', 'timestamp': '2026-01-22T17:49:55.469522', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'tap11393b0e-74', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:af:50:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11393b0e-74'}, 'message_id': 'c2ecf362-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.235244187, 'message_signature': '6de4a1cd666b480b85ba7ccf0415767a8a6f4b44c15099aa2762ac8eac447bbf'}]}, 'timestamp': '2026-01-22 17:49:55.477340', '_unique_id': 'dfc50b13c7e94026885e536eb91f0ffa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.478 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.479 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.490 12 DEBUG ceilometer.compute.pollsters [-] 38456dbd-001e-46c3-ae2d-5e1765611833/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.499 12 DEBUG ceilometer.compute.pollsters [-] d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.508 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a07a4b68-23cd-4527-979d-e3712efe70f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '38456dbd-001e-46c3-ae2d-5e1765611833-vda', 'timestamp': '2026-01-22T17:49:55.479878', 'resource_metadata': {'display_name': 'tempest-server-test-7188366', 'name': 'instance-00000046', 'instance_id': '38456dbd-001e-46c3-ae2d-5e1765611833', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c2eef90a-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.240300322, 'message_signature': 'd58a289a4083665078a7ce80e3534734f547148b5ef135ead6771a07f808cd59'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 
'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-vda', 'timestamp': '2026-01-22T17:49:55.479878', 'resource_metadata': {'display_name': 'tempest-server-test-1547454978', 'name': 'instance-00000047', 'instance_id': 'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c2f05340-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.250994909, 'message_signature': 'b574028e0fc09e645122e03d9f9eac5037b368100522f8e0d33c4419f8bc9bdf'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619-vda', 'timestamp': '2026-01-22T17:49:55.479878', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'instance-00000045', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c2f1cbd0-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.259944799, 'message_signature': '34fd764c0d2732e64a57c6595f748c54a5c01a10e4da9b2560a7a7f11bc026c8'}]}, 'timestamp': '2026-01-22 17:49:55.509111', '_unique_id': '6921d53c1efc45e9ae8aee5f6bf18bd3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.510 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.511 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.523 12 DEBUG ceilometer.compute.pollsters [-] 38456dbd-001e-46c3-ae2d-5e1765611833/disk.device.write.requests volume: 346 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.536 12 DEBUG ceilometer.compute.pollsters [-] d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/disk.device.write.requests volume: 321 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.549 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/disk.device.write.requests volume: 353 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c1986390-9a29-4161-9bd5-fc22cdcf3c3b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 346, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '38456dbd-001e-46c3-ae2d-5e1765611833-vda', 'timestamp': '2026-01-22T17:49:55.511074', 'resource_metadata': {'display_name': 'tempest-server-test-7188366', 'name': 'instance-00000046', 'instance_id': '38456dbd-001e-46c3-ae2d-5e1765611833', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c2f41868-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.271517379, 'message_signature': 'af9d9bd81cdebe8cd95225aa7954604847aa6c9050b85dad9d24acc7be07b941'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 321, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 
'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-vda', 'timestamp': '2026-01-22T17:49:55.511074', 'resource_metadata': {'display_name': 'tempest-server-test-1547454978', 'name': 'instance-00000047', 'instance_id': 'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c2f60f6a-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.284652482, 'message_signature': '2e1280f3ae556dd557ae2da367ada9e832dd77a03e831ba3dad12a751337f178'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 353, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619-vda', 'timestamp': '2026-01-22T17:49:55.511074', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'instance-00000045', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c2f80748-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.297449025, 'message_signature': '9e9c8b1f84afc49d07396f14ca22cb9ac0a1d43c46326405cd41fc8d183261aa'}]}, 'timestamp': '2026-01-22 17:49:55.549950', '_unique_id': 'a48926b5469e4d848c5300062aa58e7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.551 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.552 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.566 12 DEBUG ceilometer.compute.pollsters [-] 38456dbd-001e-46c3-ae2d-5e1765611833/cpu volume: 11560000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.578 12 DEBUG ceilometer.compute.pollsters [-] d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/cpu volume: 11340000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.591 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/cpu volume: 11970000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8a083be8-b7eb-4dab-9f42-627b9beb6e3d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11560000000, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '38456dbd-001e-46c3-ae2d-5e1765611833', 'timestamp': '2026-01-22T17:49:55.552091', 'resource_metadata': {'display_name': 'tempest-server-test-7188366', 'name': 'instance-00000046', 'instance_id': '38456dbd-001e-46c3-ae2d-5e1765611833', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'c2fa9350-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.326472763, 'message_signature': '6ee9f81de874c7fbd9456b52ca53a7209d9a7078992bf8083a76e786686edb8c'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11340000000, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2', 
'timestamp': '2026-01-22T17:49:55.552091', 'resource_metadata': {'display_name': 'tempest-server-test-1547454978', 'name': 'instance-00000047', 'instance_id': 'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'c2fc6cca-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.338707501, 'message_signature': 'f22a67fccce5e43cc7271e872c3d8c2924ac97084e3aff57bec7aae736e857d9'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11970000000, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'timestamp': '2026-01-22T17:49:55.552091', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'instance-00000045', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'c2fe6fb6-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.35171082, 'message_signature': '7eb0d90ec1c7f30e797da65f1387680e60b6fddb0e3f3e808b885f469347ff30'}]}, 'timestamp': '2026-01-22 17:49:55.591990', '_unique_id': '5764f7ab655d45c795afa5bc05e26782'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.592 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.593 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.593 12 DEBUG ceilometer.compute.pollsters [-] 38456dbd-001e-46c3-ae2d-5e1765611833/network.incoming.packets volume: 65 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.594 12 DEBUG ceilometer.compute.pollsters [-] d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/network.incoming.packets volume: 61 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.594 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/network.incoming.packets volume: 69 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bcccf032-3535-4824-9629-bf2590992d2c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 65, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000046-38456dbd-001e-46c3-ae2d-5e1765611833-tapf7f0a8be-56', 'timestamp': '2026-01-22T17:49:55.593962', 'resource_metadata': {'display_name': 'tempest-server-test-7188366', 'name': 'tapf7f0a8be-56', 'instance_id': '38456dbd-001e-46c3-ae2d-5e1765611833', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:5d:bb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7f0a8be-56'}, 'message_id': 'c2fecbd2-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.229955555, 'message_signature': 'c6f46d3097ec597b22953a60948716826b0ae3660eef14673560fd21accdceb4'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 61, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000047-d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-tap598a2a6c-cd', 'timestamp': '2026-01-22T17:49:55.593962', 'resource_metadata': {'display_name': 'tempest-server-test-1547454978', 'name': 'tap598a2a6c-cd', 'instance_id': 'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9e:71:1f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap598a2a6c-cd'}, 'message_id': 'c2fed6fe-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.232844872, 'message_signature': '021bc887a0518d735d8e87519cc57a9df61cb0ca89910aa257d6d2a1219ba05e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 69, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000045-df3ded3b-e065-4dee-93d3-e1ced39c8619-tap11393b0e-74', 'timestamp': '2026-01-22T17:49:55.593962', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'tap11393b0e-74', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:af:50:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11393b0e-74'}, 'message_id': 'c2fee2de-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.235244187, 'message_signature': '5b2951f6bea1b8110683938015d4a16fc62ce291a4fda06d55642ede2476c29b'}]}, 'timestamp': '2026-01-22 17:49:55.594847', '_unique_id': '6b05bab4926f44cba6c88839efe1240b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.595 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.596 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.596 12 DEBUG ceilometer.compute.pollsters [-] 38456dbd-001e-46c3-ae2d-5e1765611833/network.outgoing.packets volume: 131 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.596 12 DEBUG ceilometer.compute.pollsters [-] d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/network.outgoing.packets volume: 123 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.596 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/network.outgoing.packets volume: 131 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d6da1f6-dbdc-4979-9ccb-b75b012f53c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 131, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000046-38456dbd-001e-46c3-ae2d-5e1765611833-tapf7f0a8be-56', 'timestamp': '2026-01-22T17:49:55.596303', 'resource_metadata': {'display_name': 'tempest-server-test-7188366', 'name': 'tapf7f0a8be-56', 'instance_id': '38456dbd-001e-46c3-ae2d-5e1765611833', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:5d:bb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7f0a8be-56'}, 'message_id': 'c2ff2654-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.229955555, 'message_signature': 'c05394ffb2f0d0bb6632eb0dc530e5c2c4fc3cd149696c53f5ad1082197e316d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 123, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000047-d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-tap598a2a6c-cd', 'timestamp': '2026-01-22T17:49:55.596303', 'resource_metadata': {'display_name': 'tempest-server-test-1547454978', 'name': 'tap598a2a6c-cd', 'instance_id': 'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9e:71:1f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap598a2a6c-cd'}, 'message_id': 'c2ff2f78-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.232844872, 'message_signature': 'ffb57da9016cd60dbf892f951bf545ee9d2fc217f7e47a7d3badaa077b49733a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 131, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000045-df3ded3b-e065-4dee-93d3-e1ced39c8619-tap11393b0e-74', 'timestamp': '2026-01-22T17:49:55.596303', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'tap11393b0e-74', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:af:50:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11393b0e-74'}, 'message_id': 'c2ff3784-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.235244187, 'message_signature': '8d54648972d0fc6210e255adac01f2e8bfa19d096cb954ef66c51fdcc50afa07'}]}, 'timestamp': '2026-01-22 17:49:55.596991', '_unique_id': '974ed4e9c5544deb8da8dbe7fad59adf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.597 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.598 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.598 12 DEBUG ceilometer.compute.pollsters [-] 38456dbd-001e-46c3-ae2d-5e1765611833/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.598 12 DEBUG ceilometer.compute.pollsters [-] d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.598 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '13e022d7-567e-4115-9f7d-a7fe40434577', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '38456dbd-001e-46c3-ae2d-5e1765611833-vda', 'timestamp': '2026-01-22T17:49:55.598123', 'resource_metadata': {'display_name': 'tempest-server-test-7188366', 'name': 'instance-00000046', 'instance_id': '38456dbd-001e-46c3-ae2d-5e1765611833', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c2ff6c7c-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.240300322, 'message_signature': 'a9ce5c78626fc58bd4d004fa4ce955776ba08645a71e4c9d62b8369cc80c857d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 
'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-vda', 'timestamp': '2026-01-22T17:49:55.598123', 'resource_metadata': {'display_name': 'tempest-server-test-1547454978', 'name': 'instance-00000047', 'instance_id': 'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c2ff74ec-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.250994909, 'message_signature': 'f591f0c8f40079b803a816ccad203e792b2ff5b656620a5d0ed28f52672db36b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619-vda', 'timestamp': '2026-01-22T17:49:55.598123', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'instance-00000045', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c2ff7d20-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.259944799, 'message_signature': '928d5e9cd66c5675fc8fd0b41c426890ceb98d04c0bd3b2f73d64f56a34e6731'}]}, 'timestamp': '2026-01-22 17:49:55.598766', '_unique_id': '3e3e62788adb4368ba64a38e56c61edb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.599 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.600 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.600 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.600 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-7188366>, <NovaLikeServer: tempest-server-test-1547454978>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-7188366>, <NovaLikeServer: tempest-server-test-1547454978>]
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.600 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.600 12 DEBUG ceilometer.compute.pollsters [-] 38456dbd-001e-46c3-ae2d-5e1765611833/memory.usage volume: 42.0546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.600 12 DEBUG ceilometer.compute.pollsters [-] d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/memory.usage volume: 43.31640625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.600 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/memory.usage volume: 42.58203125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aff007fe-fe22-4f3d-ae06-bd91c36e5727', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.0546875, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '38456dbd-001e-46c3-ae2d-5e1765611833', 'timestamp': '2026-01-22T17:49:55.600542', 'resource_metadata': {'display_name': 'tempest-server-test-7188366', 'name': 'instance-00000046', 'instance_id': '38456dbd-001e-46c3-ae2d-5e1765611833', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'c2ffcba4-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.326472763, 'message_signature': 'b3abbfcebfb1676c95008095dc78997a8cba55dea2a064447d9b16f8f3a2bdf6'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.31640625, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2', 'timestamp': 
'2026-01-22T17:49:55.600542', 'resource_metadata': {'display_name': 'tempest-server-test-1547454978', 'name': 'instance-00000047', 'instance_id': 'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'c2ffd3ce-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.338707501, 'message_signature': '7fe85e6981f60863ebb208451cc46821bc74279c0a81c54333577cda5ffdca7e'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.58203125, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'timestamp': '2026-01-22T17:49:55.600542', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'instance-00000045', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'c2ffdb6c-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.35171082, 'message_signature': '46d5277823efa75cc8fb20ef1214e9e946e07402ede13bde173b88fe56724fe6'}]}, 'timestamp': '2026-01-22 17:49:55.601178', '_unique_id': 'bd7460f8b2cf4ad19b679ebf022ca1fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.601 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.602 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.602 12 DEBUG ceilometer.compute.pollsters [-] 38456dbd-001e-46c3-ae2d-5e1765611833/disk.device.read.bytes volume: 30820864 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.602 12 DEBUG ceilometer.compute.pollsters [-] d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/disk.device.read.bytes volume: 30824960 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.602 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/disk.device.read.bytes volume: 30824960 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '08842b89-0b97-4377-9d4e-71d21a4ed430', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30820864, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '38456dbd-001e-46c3-ae2d-5e1765611833-vda', 'timestamp': '2026-01-22T17:49:55.602312', 'resource_metadata': {'display_name': 'tempest-server-test-7188366', 'name': 'instance-00000046', 'instance_id': '38456dbd-001e-46c3-ae2d-5e1765611833', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c30010dc-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.271517379, 'message_signature': 'd6ea27a56eecfe4eada91f80b64977f1859eb7cba421459793bc1f1380a3cf53'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30824960, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 
'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-vda', 'timestamp': '2026-01-22T17:49:55.602312', 'resource_metadata': {'display_name': 'tempest-server-test-1547454978', 'name': 'instance-00000047', 'instance_id': 'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c3001a8c-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.284652482, 'message_signature': 'cce1fd83be056977b5cfaa455062bd99047410abdd30c127e26edf93bbd03ed7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30824960, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619-vda', 'timestamp': '2026-01-22T17:49:55.602312', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'instance-00000045', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c30022ca-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.297449025, 'message_signature': '5eec0ab1473d9c1334fc0fc3a31886e0a75e2886fc055ca51f4cfe0af7e3d2ba'}]}, 'timestamp': '2026-01-22 17:49:55.603007', '_unique_id': 'ad59a55b68814e2db9197c8967885259'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.603 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.604 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.604 12 DEBUG ceilometer.compute.pollsters [-] 38456dbd-001e-46c3-ae2d-5e1765611833/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.604 12 DEBUG ceilometer.compute.pollsters [-] d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.604 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '18971952-4481-4314-89bd-039c37a01076', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000046-38456dbd-001e-46c3-ae2d-5e1765611833-tapf7f0a8be-56', 'timestamp': '2026-01-22T17:49:55.604136', 'resource_metadata': {'display_name': 'tempest-server-test-7188366', 'name': 'tapf7f0a8be-56', 'instance_id': '38456dbd-001e-46c3-ae2d-5e1765611833', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:5d:bb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7f0a8be-56'}, 'message_id': 'c3005740-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.229955555, 'message_signature': 'd69886845e0a9325f720950372850dece1b78e967db4c52c2963ac3d5437e1fa'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000047-d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-tap598a2a6c-cd', 'timestamp': '2026-01-22T17:49:55.604136', 'resource_metadata': {'display_name': 'tempest-server-test-1547454978', 'name': 'tap598a2a6c-cd', 'instance_id': 'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9e:71:1f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap598a2a6c-cd'}, 'message_id': 'c300606e-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.232844872, 'message_signature': '936a80c6ae7cd847840be1008f09d23af5f222af117271581e3855a2b0bd2d44'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000045-df3ded3b-e065-4dee-93d3-e1ced39c8619-tap11393b0e-74', 'timestamp': '2026-01-22T17:49:55.604136', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'tap11393b0e-74', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:af:50:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11393b0e-74'}, 'message_id': 'c30069a6-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.235244187, 'message_signature': 'c50a272fd7a4c6409dab4915bbd8e145e21e44dc03dae07400057e9652aa1695'}]}, 'timestamp': '2026-01-22 17:49:55.604828', '_unique_id': '9101e8a6d14c49a1b1e99ae593a1d7bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.605 12 DEBUG ceilometer.compute.pollsters [-] 38456dbd-001e-46c3-ae2d-5e1765611833/disk.device.write.latency volume: 2791540947 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.606 12 DEBUG ceilometer.compute.pollsters [-] d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/disk.device.write.latency volume: 2833092480 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.606 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/disk.device.write.latency volume: 25211898842 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e9114713-2657-46a7-98c7-03337292edcb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2791540947, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '38456dbd-001e-46c3-ae2d-5e1765611833-vda', 'timestamp': '2026-01-22T17:49:55.605969', 'resource_metadata': {'display_name': 'tempest-server-test-7188366', 'name': 'instance-00000046', 'instance_id': '38456dbd-001e-46c3-ae2d-5e1765611833', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c3009eb2-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.271517379, 'message_signature': '37a5d6fb3d6f35427d486e8e3013782bfb2d02572a896214a78dd3fd0c5934f6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2833092480, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 
'resource_id': 'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-vda', 'timestamp': '2026-01-22T17:49:55.605969', 'resource_metadata': {'display_name': 'tempest-server-test-1547454978', 'name': 'instance-00000047', 'instance_id': 'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c300a678-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.284652482, 'message_signature': '6a9d9cd240e90ac3583233f93b61dd4ec3a159c97fa1335bb3abe0cb748365f0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 25211898842, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619-vda', 'timestamp': '2026-01-22T17:49:55.605969', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'instance-00000045', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c300adda-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.297449025, 'message_signature': '15e5d7f24c50c3b2a9c191b9d799cbaa25b4b13bf73695246c378c366f789981'}]}, 'timestamp': '2026-01-22 17:49:55.606566', '_unique_id': '8fdbe2caac414ebca23af4c30d73dca6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.607 12 DEBUG ceilometer.compute.pollsters [-] 38456dbd-001e-46c3-ae2d-5e1765611833/disk.device.write.bytes volume: 73211904 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.608 12 DEBUG ceilometer.compute.pollsters [-] d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/disk.device.write.bytes volume: 72986624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.608 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/disk.device.write.bytes volume: 73220096 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '458600ec-0075-443c-b9e4-f7532c7ecf81', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73211904, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '38456dbd-001e-46c3-ae2d-5e1765611833-vda', 'timestamp': '2026-01-22T17:49:55.607909', 'resource_metadata': {'display_name': 'tempest-server-test-7188366', 'name': 'instance-00000046', 'instance_id': '38456dbd-001e-46c3-ae2d-5e1765611833', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c300eb2e-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.271517379, 'message_signature': '4263c31059195f97ffa692792d2ee571df9806b7a3f2ff28b6e4086a49c820b2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72986624, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 
'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-vda', 'timestamp': '2026-01-22T17:49:55.607909', 'resource_metadata': {'display_name': 'tempest-server-test-1547454978', 'name': 'instance-00000047', 'instance_id': 'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c300f326-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.284652482, 'message_signature': '07dfd2ca5ef371009ad03ec8a32551f43dd1c43f7d4f47e970408af8eb691e94'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73220096, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619-vda', 'timestamp': '2026-01-22T17:49:55.607909', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'instance-00000045', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c300fa88-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.297449025, 'message_signature': '5b26884e55821baa9474a091ecc1cb663902cdceb5092dc34b73f030f0d39963'}]}, 'timestamp': '2026-01-22 17:49:55.608527', '_unique_id': '48c11a95181b48189785257ed19b23f8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.609 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-7188366>, <NovaLikeServer: tempest-server-test-1547454978>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-7188366>, <NovaLikeServer: tempest-server-test-1547454978>]
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.610 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.610 12 DEBUG ceilometer.compute.pollsters [-] 38456dbd-001e-46c3-ae2d-5e1765611833/disk.device.read.requests volume: 1136 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.610 12 DEBUG ceilometer.compute.pollsters [-] d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/disk.device.read.requests volume: 1137 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.610 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/disk.device.read.requests volume: 1137 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96a8344f-eb90-4cce-a1fc-1888fb779f07', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1136, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '38456dbd-001e-46c3-ae2d-5e1765611833-vda', 'timestamp': '2026-01-22T17:49:55.610131', 'resource_metadata': {'display_name': 'tempest-server-test-7188366', 'name': 'instance-00000046', 'instance_id': '38456dbd-001e-46c3-ae2d-5e1765611833', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c3014268-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.271517379, 'message_signature': '2ea3ac7b572d50e7bd45896cf5e33f7ee45ddbe358bd764aabdf32c55343ea19'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1137, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 
'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-vda', 'timestamp': '2026-01-22T17:49:55.610131', 'resource_metadata': {'display_name': 'tempest-server-test-1547454978', 'name': 'instance-00000047', 'instance_id': 'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c3014cb8-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.284652482, 'message_signature': '63a6b9ff110c3e5bdf8937600bb982990e13d9b2bf3f1854715f2bad1bad5e35'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1137, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619-vda', 'timestamp': '2026-01-22T17:49:55.610131', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'instance-00000045', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c30157da-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.297449025, 'message_signature': 'a1fbdf3e52c1e94efb72cdad9d603683a2a1b5ae3b84abf44f38c6e7243979ba'}]}, 'timestamp': '2026-01-22 17:49:55.610953', '_unique_id': '43e4d1541ab242989b8fbf9e41c19bcf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.611 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.612 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.612 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.612 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-7188366>, <NovaLikeServer: tempest-server-test-1547454978>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-7188366>, <NovaLikeServer: tempest-server-test-1547454978>]
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.612 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.612 12 DEBUG ceilometer.compute.pollsters [-] 38456dbd-001e-46c3-ae2d-5e1765611833/network.incoming.bytes volume: 7367 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.612 12 DEBUG ceilometer.compute.pollsters [-] d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/network.incoming.bytes volume: 7208 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/network.incoming.bytes volume: 7616 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8bad7fb9-0b05-4559-a9ed-3be59866bcd0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7367, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000046-38456dbd-001e-46c3-ae2d-5e1765611833-tapf7f0a8be-56', 'timestamp': '2026-01-22T17:49:55.612599', 'resource_metadata': {'display_name': 'tempest-server-test-7188366', 'name': 'tapf7f0a8be-56', 'instance_id': '38456dbd-001e-46c3-ae2d-5e1765611833', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:5d:bb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7f0a8be-56'}, 'message_id': 'c301a2a8-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.229955555, 'message_signature': 'f2cb2a631a9bb0df612d7fb6df158e6ffa8749f50dea53189434ffe763d78ab8'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7208, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000047-d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-tap598a2a6c-cd', 'timestamp': '2026-01-22T17:49:55.612599', 'resource_metadata': {'display_name': 'tempest-server-test-1547454978', 'name': 'tap598a2a6c-cd', 'instance_id': 'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9e:71:1f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap598a2a6c-cd'}, 'message_id': 'c301ab90-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.232844872, 'message_signature': '1eb279ba09544fff15283d3c38f4cdfe54ba251667dab6fd32c47c5d2fd4a56f'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7616, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000045-df3ded3b-e065-4dee-93d3-e1ced39c8619-tap11393b0e-74', 'timestamp': '2026-01-22T17:49:55.612599', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'tap11393b0e-74', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:af:50:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11393b0e-74'}, 'message_id': 'c301b342-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.235244187, 'message_signature': 'd569164edb6db978a58ede8728ac11f97e05534029818670d532c28aa1bebac3'}]}, 'timestamp': '2026-01-22 17:49:55.613265', '_unique_id': '4a91a902c6aa4494a25a6fcd11306a72'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.613 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.614 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.614 12 DEBUG ceilometer.compute.pollsters [-] 38456dbd-001e-46c3-ae2d-5e1765611833/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.614 12 DEBUG ceilometer.compute.pollsters [-] d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.614 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '170333eb-4cb8-408b-99f7-968054199128', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000046-38456dbd-001e-46c3-ae2d-5e1765611833-tapf7f0a8be-56', 'timestamp': '2026-01-22T17:49:55.614390', 'resource_metadata': {'display_name': 'tempest-server-test-7188366', 'name': 'tapf7f0a8be-56', 'instance_id': '38456dbd-001e-46c3-ae2d-5e1765611833', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:5d:bb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7f0a8be-56'}, 'message_id': 'c301e81c-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.229955555, 'message_signature': '9144df691566b268c84c717aa5c6baaf61b04567886e82faf09dadd344c0d183'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000047-d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-tap598a2a6c-cd', 'timestamp': '2026-01-22T17:49:55.614390', 'resource_metadata': {'display_name': 'tempest-server-test-1547454978', 'name': 'tap598a2a6c-cd', 'instance_id': 'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9e:71:1f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap598a2a6c-cd'}, 'message_id': 'c301f19a-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.232844872, 'message_signature': '4b3b4ad95b1f6bc113307971dcc972c65c1d15019f518b4186a0c4565a2a243b'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000045-df3ded3b-e065-4dee-93d3-e1ced39c8619-tap11393b0e-74', 'timestamp': '2026-01-22T17:49:55.614390', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'tap11393b0e-74', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:af:50:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11393b0e-74'}, 'message_id': 'c301f960-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.235244187, 'message_signature': 'bd3a158348caf18be237e237e5e7f7c5a04cf4bfb549402453da8d20d060cbc1'}]}, 'timestamp': '2026-01-22 17:49:55.615058', '_unique_id': '17d8926622b447bca31629aabaa302ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.615 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.616 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.616 12 DEBUG ceilometer.compute.pollsters [-] 38456dbd-001e-46c3-ae2d-5e1765611833/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.616 12 DEBUG ceilometer.compute.pollsters [-] d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/disk.device.allocation volume: 30875648 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.616 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95d0239b-d50f-4660-afe6-1c2a7b848e77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '38456dbd-001e-46c3-ae2d-5e1765611833-vda', 'timestamp': '2026-01-22T17:49:55.616206', 'resource_metadata': {'display_name': 'tempest-server-test-7188366', 'name': 'instance-00000046', 'instance_id': '38456dbd-001e-46c3-ae2d-5e1765611833', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c3022e76-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.240300322, 'message_signature': 'c988b4b929a01d43c098f425281355ec21fc939367edb25d716ada8642e72917'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30875648, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 
'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-vda', 'timestamp': '2026-01-22T17:49:55.616206', 'resource_metadata': {'display_name': 'tempest-server-test-1547454978', 'name': 'instance-00000047', 'instance_id': 'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c3023628-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.250994909, 'message_signature': '6dbda114a2637dabc3e4b78ef4d6886f982833c54c603d040d592c0ade7655fb'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619-vda', 'timestamp': '2026-01-22T17:49:55.616206', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'instance-00000045', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c3023e2a-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.259944799, 'message_signature': '7123f0c42528b4ebcbcc73e7aefe262269b482d3135f8a11de571b72967b0dc3'}]}, 'timestamp': '2026-01-22 17:49:55.616814', '_unique_id': '074199fc06284e4bb019db04d766de2d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.617 12 DEBUG ceilometer.compute.pollsters [-] 38456dbd-001e-46c3-ae2d-5e1765611833/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.618 12 DEBUG ceilometer.compute.pollsters [-] d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.618 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '18d80816-7894-47fc-8483-e3c19aa28073', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000046-38456dbd-001e-46c3-ae2d-5e1765611833-tapf7f0a8be-56', 'timestamp': '2026-01-22T17:49:55.617907', 'resource_metadata': {'display_name': 'tempest-server-test-7188366', 'name': 'tapf7f0a8be-56', 'instance_id': '38456dbd-001e-46c3-ae2d-5e1765611833', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:5d:bb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7f0a8be-56'}, 'message_id': 'c3027110-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.229955555, 'message_signature': '215618eff85d47bbbeb40598c4385098f62faec715a47f2cde30195ad671869e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000047-d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-tap598a2a6c-cd', 'timestamp': '2026-01-22T17:49:55.617907', 'resource_metadata': {'display_name': 'tempest-server-test-1547454978', 'name': 'tap598a2a6c-cd', 'instance_id': 'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9e:71:1f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap598a2a6c-cd'}, 'message_id': 'c30278ea-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.232844872, 'message_signature': '89072f7b6b714184c0ef32bad441fe7c9e9b3bc0a7339bfbec39d36f9bd8660f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000045-df3ded3b-e065-4dee-93d3-e1ced39c8619-tap11393b0e-74', 'timestamp': '2026-01-22T17:49:55.617907', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'tap11393b0e-74', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:af:50:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11393b0e-74'}, 'message_id': 'c30280a6-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.235244187, 'message_signature': 'fded3db6cc7975d027120284a36096c644f368a776aa6ddbfb6fc292ef23b706'}]}, 'timestamp': '2026-01-22 17:49:55.618523', '_unique_id': 'cc106d09efef4d50bb05c4283dbbba88'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 DEBUG ceilometer.compute.pollsters [-] 38456dbd-001e-46c3-ae2d-5e1765611833/network.outgoing.bytes volume: 11596 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.619 12 DEBUG ceilometer.compute.pollsters [-] d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/network.outgoing.bytes volume: 10851 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/network.outgoing.bytes volume: 11596 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bece761c-1657-40d5-869a-3253c5d1433f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11596, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000046-38456dbd-001e-46c3-ae2d-5e1765611833-tapf7f0a8be-56', 'timestamp': '2026-01-22T17:49:55.619655', 'resource_metadata': {'display_name': 'tempest-server-test-7188366', 'name': 'tapf7f0a8be-56', 'instance_id': '38456dbd-001e-46c3-ae2d-5e1765611833', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:5d:bb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7f0a8be-56'}, 'message_id': 'c302b56c-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.229955555, 'message_signature': 'dc71a08fcc65fa0e2a123201b3593f819f0710bef95f95d9c2f833cfa6e0528a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10851, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000047-d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-tap598a2a6c-cd', 'timestamp': '2026-01-22T17:49:55.619655', 'resource_metadata': {'display_name': 'tempest-server-test-1547454978', 'name': 'tap598a2a6c-cd', 'instance_id': 'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9e:71:1f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap598a2a6c-cd'}, 'message_id': 'c302be40-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.232844872, 'message_signature': '37ade4a1bb7e9dc72fddc2fab6dbaf9deb870d71f340d6cc42aa5a3b35da624d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11596, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000045-df3ded3b-e065-4dee-93d3-e1ced39c8619-tap11393b0e-74', 'timestamp': '2026-01-22T17:49:55.619655', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'tap11393b0e-74', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:af:50:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11393b0e-74'}, 'message_id': 'c302c66a-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.235244187, 'message_signature': 'b39bef3add5d9058a28e5d0f73a316cb3b4ee1f684018e028e6bcfa4e6e416de'}]}, 'timestamp': '2026-01-22 17:49:55.620317', '_unique_id': '258289e2a22e4a3eac1a3d91f8edee67'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.620 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.621 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.621 12 DEBUG ceilometer.compute.pollsters [-] 38456dbd-001e-46c3-ae2d-5e1765611833/disk.device.read.latency volume: 153082246 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.621 12 DEBUG ceilometer.compute.pollsters [-] d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/disk.device.read.latency volume: 167336917 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.621 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/disk.device.read.latency volume: 266223895 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73e14caf-bee2-4a05-9a46-06399ddcff95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 153082246, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '38456dbd-001e-46c3-ae2d-5e1765611833-vda', 'timestamp': '2026-01-22T17:49:55.621429', 'resource_metadata': {'display_name': 'tempest-server-test-7188366', 'name': 'instance-00000046', 'instance_id': '38456dbd-001e-46c3-ae2d-5e1765611833', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c302faa4-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.271517379, 'message_signature': '43734bc0e4d078eb0a905b2a55f38fdc3f940c3ff3f9bad9172f0420bdcdc139'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 167336917, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 
'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-vda', 'timestamp': '2026-01-22T17:49:55.621429', 'resource_metadata': {'display_name': 'tempest-server-test-1547454978', 'name': 'instance-00000047', 'instance_id': 'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c3030346-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.284652482, 'message_signature': '2aaa0e3812f6f00169bea985ae44704a42ad7892ac0db965dce6d7c91b4ec813'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 266223895, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619-vda', 'timestamp': '2026-01-22T17:49:55.621429', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'instance-00000045', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c3030a8a-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.297449025, 'message_signature': 'f64e91072d6c15273cc7cb2cbb9e0abbe4cfb35a7e7bffc29cd2cf8ef1221cef'}]}, 'timestamp': '2026-01-22 17:49:55.622053', '_unique_id': '8a342ad8ee3646288a74c53e7fd19f7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.622 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.623 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.623 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.623 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-7188366>, <NovaLikeServer: tempest-server-test-1547454978>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-7188366>, <NovaLikeServer: tempest-server-test-1547454978>]
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.623 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.623 12 DEBUG ceilometer.compute.pollsters [-] 38456dbd-001e-46c3-ae2d-5e1765611833/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.623 12 DEBUG ceilometer.compute.pollsters [-] d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cb49410f-9ad2-41c1-bec8-752ed835753d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000046-38456dbd-001e-46c3-ae2d-5e1765611833-tapf7f0a8be-56', 'timestamp': '2026-01-22T17:49:55.623622', 'resource_metadata': {'display_name': 'tempest-server-test-7188366', 'name': 'tapf7f0a8be-56', 'instance_id': '38456dbd-001e-46c3-ae2d-5e1765611833', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:5d:bb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7f0a8be-56'}, 'message_id': 'c30351d4-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.229955555, 'message_signature': '02984819dabb63d77134c4837fd192be4af8e4c51055e30543406e18138fa2f3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000047-d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-tap598a2a6c-cd', 'timestamp': '2026-01-22T17:49:55.623622', 'resource_metadata': {'display_name': 'tempest-server-test-1547454978', 'name': 'tap598a2a6c-cd', 'instance_id': 'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9e:71:1f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap598a2a6c-cd'}, 'message_id': 'c3035e72-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.232844872, 'message_signature': 'de5c63bc69c9931c3b787178797b93e710dc8a7d553ee1a4e901e5ee1d66c4c0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000045-df3ded3b-e065-4dee-93d3-e1ced39c8619-tap11393b0e-74', 'timestamp': '2026-01-22T17:49:55.623622', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'tap11393b0e-74', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:af:50:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11393b0e-74'}, 'message_id': 'c303673c-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.235244187, 'message_signature': 'e552af5b1bf5522c7fd7029c94b5bcce002391c816928bff378e2f99e0bb8815'}]}, 'timestamp': '2026-01-22 17:49:55.624433', '_unique_id': 'f4065bdf02d04d1d97aa1ec8cd92f918'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.624 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.625 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.625 12 DEBUG ceilometer.compute.pollsters [-] 38456dbd-001e-46c3-ae2d-5e1765611833/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.625 12 DEBUG ceilometer.compute.pollsters [-] d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 DEBUG ceilometer.compute.pollsters [-] df3ded3b-e065-4dee-93d3-e1ced39c8619/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '374af3e7-a63e-451e-a775-03b67cd30898', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000046-38456dbd-001e-46c3-ae2d-5e1765611833-tapf7f0a8be-56', 'timestamp': '2026-01-22T17:49:55.625599', 'resource_metadata': {'display_name': 'tempest-server-test-7188366', 'name': 'tapf7f0a8be-56', 'instance_id': '38456dbd-001e-46c3-ae2d-5e1765611833', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:5d:bb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7f0a8be-56'}, 'message_id': 'c3039e46-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.229955555, 'message_signature': '76235f1bb1f589a6fc5e7f1b02380382e2f166e1bdf44c0b9690e257702cd479'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000047-d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-tap598a2a6c-cd', 'timestamp': '2026-01-22T17:49:55.625599', 'resource_metadata': {'display_name': 'tempest-server-test-1547454978', 'name': 'tap598a2a6c-cd', 'instance_id': 'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9e:71:1f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap598a2a6c-cd'}, 'message_id': 'c303a620-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.232844872, 'message_signature': 'cf786cf9416f0bc89e241c753a50a51b34987aaeb7ad56ffc88f76eb467a21a1'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000045-df3ded3b-e065-4dee-93d3-e1ced39c8619-tap11393b0e-74', 'timestamp': '2026-01-22T17:49:55.625599', 'resource_metadata': {'display_name': 'tempest-server-test-1131995147', 'name': 'tap11393b0e-74', 'instance_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:af:50:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11393b0e-74'}, 'message_id': 'c303adbe-f7ba-11f0-9e69-fa163eaea1db', 'monotonic_time': 6559.235244187, 'message_signature': 'c22c5d12e41276b3f7c299aeed1504470081b7233e90e6e831a977835081f26c'}]}, 'timestamp': '2026-01-22 17:49:55.626259', '_unique_id': '75e889bd5c5544b5bec39c8799d887a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:49:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:49:55.626 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:49:58 compute-0 nova_compute[183075]: 2026-01-22 17:49:58.004 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:49:58 compute-0 nova_compute[183075]: 2026-01-22 17:49:58.294 183079 INFO nova.compute.manager [None req-6b105ab3-91c4-461e-ab8f-430285151e59 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Get console output
Jan 22 17:49:58 compute-0 nova_compute[183075]: 2026-01-22 17:49:58.300 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:49:58 compute-0 nova_compute[183075]: 2026-01-22 17:49:58.755 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:03 compute-0 nova_compute[183075]: 2026-01-22 17:50:03.005 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:03 compute-0 nova_compute[183075]: 2026-01-22 17:50:03.410 183079 INFO nova.compute.manager [None req-c6fe16b3-0d37-40e6-80c1-f84e4d3d5867 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Get console output
Jan 22 17:50:03 compute-0 nova_compute[183075]: 2026-01-22 17:50:03.413 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:50:03 compute-0 nova_compute[183075]: 2026-01-22 17:50:03.757 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:05 compute-0 podman[241431]: 2026-01-22 17:50:05.336420306 +0000 UTC m=+0.046362724 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:50:05 compute-0 podman[241430]: 2026-01-22 17:50:05.366240996 +0000 UTC m=+0.077756396 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, managed_by=edpm_ansible)
Jan 22 17:50:05 compute-0 podman[241432]: 2026-01-22 17:50:05.369432412 +0000 UTC m=+0.075944028 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=)
Jan 22 17:50:06 compute-0 nova_compute[183075]: 2026-01-22 17:50:06.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:50:08 compute-0 nova_compute[183075]: 2026-01-22 17:50:08.028 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:08 compute-0 nova_compute[183075]: 2026-01-22 17:50:08.559 183079 INFO nova.compute.manager [None req-17d6ea59-d34b-49b3-9948-2038768d4a14 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Get console output
Jan 22 17:50:08 compute-0 nova_compute[183075]: 2026-01-22 17:50:08.565 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:50:08 compute-0 nova_compute[183075]: 2026-01-22 17:50:08.759 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:08 compute-0 nova_compute[183075]: 2026-01-22 17:50:08.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:50:11 compute-0 podman[241489]: 2026-01-22 17:50:11.351768884 +0000 UTC m=+0.056558338 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 17:50:11 compute-0 nova_compute[183075]: 2026-01-22 17:50:11.803 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:50:11 compute-0 nova_compute[183075]: 2026-01-22 17:50:11.803 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 17:50:11 compute-0 nova_compute[183075]: 2026-01-22 17:50:11.820 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 17:50:13 compute-0 nova_compute[183075]: 2026-01-22 17:50:13.030 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:13 compute-0 nova_compute[183075]: 2026-01-22 17:50:13.690 183079 INFO nova.compute.manager [None req-c347acd9-355a-433d-ba12-34e52e830004 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Get console output
Jan 22 17:50:13 compute-0 nova_compute[183075]: 2026-01-22 17:50:13.695 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:50:13 compute-0 nova_compute[183075]: 2026-01-22 17:50:13.761 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:18 compute-0 nova_compute[183075]: 2026-01-22 17:50:18.033 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:18 compute-0 nova_compute[183075]: 2026-01-22 17:50:18.762 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:18 compute-0 nova_compute[183075]: 2026-01-22 17:50:18.886 183079 INFO nova.compute.manager [None req-ab7ddb94-a0bc-4e4e-9d7f-ff051cb32522 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Get console output
Jan 22 17:50:18 compute-0 nova_compute[183075]: 2026-01-22 17:50:18.892 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:50:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:21.165 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=54, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=53) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:50:21 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:21.166 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:50:21 compute-0 nova_compute[183075]: 2026-01-22 17:50:21.166 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:21 compute-0 nova_compute[183075]: 2026-01-22 17:50:21.941 183079 DEBUG nova.compute.manager [req-8a1fd0bb-04e6-42d6-aa3c-bdbaf70e0745 req-f554ac2e-6be0-4269-87b3-ad5cf681f8aa a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Received event network-changed-11393b0e-74ee-456c-8793-6b2a6cb69a8c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:50:21 compute-0 nova_compute[183075]: 2026-01-22 17:50:21.941 183079 DEBUG nova.compute.manager [req-8a1fd0bb-04e6-42d6-aa3c-bdbaf70e0745 req-f554ac2e-6be0-4269-87b3-ad5cf681f8aa a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Refreshing instance network info cache due to event network-changed-11393b0e-74ee-456c-8793-6b2a6cb69a8c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:50:21 compute-0 nova_compute[183075]: 2026-01-22 17:50:21.941 183079 DEBUG oslo_concurrency.lockutils [req-8a1fd0bb-04e6-42d6-aa3c-bdbaf70e0745 req-f554ac2e-6be0-4269-87b3-ad5cf681f8aa a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-df3ded3b-e065-4dee-93d3-e1ced39c8619" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:50:21 compute-0 nova_compute[183075]: 2026-01-22 17:50:21.942 183079 DEBUG oslo_concurrency.lockutils [req-8a1fd0bb-04e6-42d6-aa3c-bdbaf70e0745 req-f554ac2e-6be0-4269-87b3-ad5cf681f8aa a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-df3ded3b-e065-4dee-93d3-e1ced39c8619" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:50:21 compute-0 nova_compute[183075]: 2026-01-22 17:50:21.942 183079 DEBUG nova.network.neutron [req-8a1fd0bb-04e6-42d6-aa3c-bdbaf70e0745 req-f554ac2e-6be0-4269-87b3-ad5cf681f8aa a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Refreshing network info cache for port 11393b0e-74ee-456c-8793-6b2a6cb69a8c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:50:22 compute-0 podman[241509]: 2026-01-22 17:50:22.340846543 +0000 UTC m=+0.048168623 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:50:23 compute-0 nova_compute[183075]: 2026-01-22 17:50:23.037 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:23 compute-0 nova_compute[183075]: 2026-01-22 17:50:23.713 183079 DEBUG nova.network.neutron [req-8a1fd0bb-04e6-42d6-aa3c-bdbaf70e0745 req-f554ac2e-6be0-4269-87b3-ad5cf681f8aa a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Updated VIF entry in instance network info cache for port 11393b0e-74ee-456c-8793-6b2a6cb69a8c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:50:23 compute-0 nova_compute[183075]: 2026-01-22 17:50:23.713 183079 DEBUG nova.network.neutron [req-8a1fd0bb-04e6-42d6-aa3c-bdbaf70e0745 req-f554ac2e-6be0-4269-87b3-ad5cf681f8aa a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Updating instance_info_cache with network_info: [{"id": "11393b0e-74ee-456c-8793-6b2a6cb69a8c", "address": "fa:16:3e:af:50:24", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11393b0e-74", "ovs_interfaceid": "11393b0e-74ee-456c-8793-6b2a6cb69a8c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:50:23 compute-0 nova_compute[183075]: 2026-01-22 17:50:23.747 183079 DEBUG oslo_concurrency.lockutils [req-8a1fd0bb-04e6-42d6-aa3c-bdbaf70e0745 req-f554ac2e-6be0-4269-87b3-ad5cf681f8aa a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-df3ded3b-e065-4dee-93d3-e1ced39c8619" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:50:23 compute-0 nova_compute[183075]: 2026-01-22 17:50:23.764 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:24 compute-0 nova_compute[183075]: 2026-01-22 17:50:24.262 183079 DEBUG nova.compute.manager [req-f80bb369-f6d4-41ea-81b2-e9e412b3b4ca req-3af5abd5-ea50-40ee-9a2a-868c332e4d37 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Received event network-changed-f7f0a8be-56b9-4a12-8c48-0b7a66239107 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:50:24 compute-0 nova_compute[183075]: 2026-01-22 17:50:24.262 183079 DEBUG nova.compute.manager [req-f80bb369-f6d4-41ea-81b2-e9e412b3b4ca req-3af5abd5-ea50-40ee-9a2a-868c332e4d37 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Refreshing instance network info cache due to event network-changed-f7f0a8be-56b9-4a12-8c48-0b7a66239107. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:50:24 compute-0 nova_compute[183075]: 2026-01-22 17:50:24.262 183079 DEBUG oslo_concurrency.lockutils [req-f80bb369-f6d4-41ea-81b2-e9e412b3b4ca req-3af5abd5-ea50-40ee-9a2a-868c332e4d37 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-38456dbd-001e-46c3-ae2d-5e1765611833" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:50:24 compute-0 nova_compute[183075]: 2026-01-22 17:50:24.263 183079 DEBUG oslo_concurrency.lockutils [req-f80bb369-f6d4-41ea-81b2-e9e412b3b4ca req-3af5abd5-ea50-40ee-9a2a-868c332e4d37 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-38456dbd-001e-46c3-ae2d-5e1765611833" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:50:24 compute-0 nova_compute[183075]: 2026-01-22 17:50:24.263 183079 DEBUG nova.network.neutron [req-f80bb369-f6d4-41ea-81b2-e9e412b3b4ca req-3af5abd5-ea50-40ee-9a2a-868c332e4d37 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Refreshing network info cache for port f7f0a8be-56b9-4a12-8c48-0b7a66239107 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:50:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:25.168 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '54'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:50:25 compute-0 podman[241534]: 2026-01-22 17:50:25.327529393 +0000 UTC m=+0.043329383 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 17:50:25 compute-0 nova_compute[183075]: 2026-01-22 17:50:25.942 183079 DEBUG nova.network.neutron [req-f80bb369-f6d4-41ea-81b2-e9e412b3b4ca req-3af5abd5-ea50-40ee-9a2a-868c332e4d37 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Updated VIF entry in instance network info cache for port f7f0a8be-56b9-4a12-8c48-0b7a66239107. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:50:25 compute-0 nova_compute[183075]: 2026-01-22 17:50:25.942 183079 DEBUG nova.network.neutron [req-f80bb369-f6d4-41ea-81b2-e9e412b3b4ca req-3af5abd5-ea50-40ee-9a2a-868c332e4d37 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Updating instance_info_cache with network_info: [{"id": "f7f0a8be-56b9-4a12-8c48-0b7a66239107", "address": "fa:16:3e:d4:5d:bb", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7f0a8be-56", "ovs_interfaceid": "f7f0a8be-56b9-4a12-8c48-0b7a66239107", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:50:25 compute-0 nova_compute[183075]: 2026-01-22 17:50:25.975 183079 DEBUG oslo_concurrency.lockutils [req-f80bb369-f6d4-41ea-81b2-e9e412b3b4ca req-3af5abd5-ea50-40ee-9a2a-868c332e4d37 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-38456dbd-001e-46c3-ae2d-5e1765611833" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:50:26 compute-0 nova_compute[183075]: 2026-01-22 17:50:26.796 183079 DEBUG nova.compute.manager [req-8c17ad58-e3d0-4401-9828-0e30d13990fd req-5587715d-4b72-49a3-ac7c-a60aef4243cf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Received event network-changed-598a2a6c-cdfc-40ec-9ff8-b57bac7c6063 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:50:26 compute-0 nova_compute[183075]: 2026-01-22 17:50:26.796 183079 DEBUG nova.compute.manager [req-8c17ad58-e3d0-4401-9828-0e30d13990fd req-5587715d-4b72-49a3-ac7c-a60aef4243cf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Refreshing instance network info cache due to event network-changed-598a2a6c-cdfc-40ec-9ff8-b57bac7c6063. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:50:26 compute-0 nova_compute[183075]: 2026-01-22 17:50:26.797 183079 DEBUG oslo_concurrency.lockutils [req-8c17ad58-e3d0-4401-9828-0e30d13990fd req-5587715d-4b72-49a3-ac7c-a60aef4243cf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:50:26 compute-0 nova_compute[183075]: 2026-01-22 17:50:26.797 183079 DEBUG oslo_concurrency.lockutils [req-8c17ad58-e3d0-4401-9828-0e30d13990fd req-5587715d-4b72-49a3-ac7c-a60aef4243cf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:50:26 compute-0 nova_compute[183075]: 2026-01-22 17:50:26.797 183079 DEBUG nova.network.neutron [req-8c17ad58-e3d0-4401-9828-0e30d13990fd req-5587715d-4b72-49a3-ac7c-a60aef4243cf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Refreshing network info cache for port 598a2a6c-cdfc-40ec-9ff8-b57bac7c6063 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:50:27 compute-0 nova_compute[183075]: 2026-01-22 17:50:27.764 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:50:27 compute-0 nova_compute[183075]: 2026-01-22 17:50:27.798 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Triggering sync for uuid df3ded3b-e065-4dee-93d3-e1ced39c8619 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 22 17:50:27 compute-0 nova_compute[183075]: 2026-01-22 17:50:27.799 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Triggering sync for uuid 38456dbd-001e-46c3-ae2d-5e1765611833 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 22 17:50:27 compute-0 nova_compute[183075]: 2026-01-22 17:50:27.799 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Triggering sync for uuid d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 22 17:50:27 compute-0 nova_compute[183075]: 2026-01-22 17:50:27.800 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "df3ded3b-e065-4dee-93d3-e1ced39c8619" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:50:27 compute-0 nova_compute[183075]: 2026-01-22 17:50:27.801 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "df3ded3b-e065-4dee-93d3-e1ced39c8619" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:50:27 compute-0 nova_compute[183075]: 2026-01-22 17:50:27.802 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "38456dbd-001e-46c3-ae2d-5e1765611833" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:50:27 compute-0 nova_compute[183075]: 2026-01-22 17:50:27.802 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "38456dbd-001e-46c3-ae2d-5e1765611833" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:50:27 compute-0 nova_compute[183075]: 2026-01-22 17:50:27.803 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:50:27 compute-0 nova_compute[183075]: 2026-01-22 17:50:27.804 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:50:27 compute-0 nova_compute[183075]: 2026-01-22 17:50:27.844 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.040s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:50:27 compute-0 nova_compute[183075]: 2026-01-22 17:50:27.845 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "38456dbd-001e-46c3-ae2d-5e1765611833" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.042s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:50:27 compute-0 nova_compute[183075]: 2026-01-22 17:50:27.845 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "df3ded3b-e065-4dee-93d3-e1ced39c8619" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.044s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:50:27 compute-0 nova_compute[183075]: 2026-01-22 17:50:27.933 183079 DEBUG nova.network.neutron [req-8c17ad58-e3d0-4401-9828-0e30d13990fd req-5587715d-4b72-49a3-ac7c-a60aef4243cf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Updated VIF entry in instance network info cache for port 598a2a6c-cdfc-40ec-9ff8-b57bac7c6063. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:50:27 compute-0 nova_compute[183075]: 2026-01-22 17:50:27.933 183079 DEBUG nova.network.neutron [req-8c17ad58-e3d0-4401-9828-0e30d13990fd req-5587715d-4b72-49a3-ac7c-a60aef4243cf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Updating instance_info_cache with network_info: [{"id": "598a2a6c-cdfc-40ec-9ff8-b57bac7c6063", "address": "fa:16:3e:9e:71:1f", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap598a2a6c-cd", "ovs_interfaceid": "598a2a6c-cdfc-40ec-9ff8-b57bac7c6063", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:50:27 compute-0 nova_compute[183075]: 2026-01-22 17:50:27.952 183079 DEBUG oslo_concurrency.lockutils [req-8c17ad58-e3d0-4401-9828-0e30d13990fd req-5587715d-4b72-49a3-ac7c-a60aef4243cf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:50:28 compute-0 nova_compute[183075]: 2026-01-22 17:50:28.041 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:28 compute-0 nova_compute[183075]: 2026-01-22 17:50:28.765 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:28 compute-0 nova_compute[183075]: 2026-01-22 17:50:28.883 183079 DEBUG nova.compute.manager [req-34d044ae-ace3-4506-b0c2-ab957444c5b4 req-983e3e47-8757-4465-8f4c-ee7886a05928 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Received event network-changed-11393b0e-74ee-456c-8793-6b2a6cb69a8c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:50:28 compute-0 nova_compute[183075]: 2026-01-22 17:50:28.883 183079 DEBUG nova.compute.manager [req-34d044ae-ace3-4506-b0c2-ab957444c5b4 req-983e3e47-8757-4465-8f4c-ee7886a05928 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Refreshing instance network info cache due to event network-changed-11393b0e-74ee-456c-8793-6b2a6cb69a8c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:50:28 compute-0 nova_compute[183075]: 2026-01-22 17:50:28.884 183079 DEBUG oslo_concurrency.lockutils [req-34d044ae-ace3-4506-b0c2-ab957444c5b4 req-983e3e47-8757-4465-8f4c-ee7886a05928 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-df3ded3b-e065-4dee-93d3-e1ced39c8619" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:50:28 compute-0 nova_compute[183075]: 2026-01-22 17:50:28.884 183079 DEBUG oslo_concurrency.lockutils [req-34d044ae-ace3-4506-b0c2-ab957444c5b4 req-983e3e47-8757-4465-8f4c-ee7886a05928 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-df3ded3b-e065-4dee-93d3-e1ced39c8619" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:50:28 compute-0 nova_compute[183075]: 2026-01-22 17:50:28.884 183079 DEBUG nova.network.neutron [req-34d044ae-ace3-4506-b0c2-ab957444c5b4 req-983e3e47-8757-4465-8f4c-ee7886a05928 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Refreshing network info cache for port 11393b0e-74ee-456c-8793-6b2a6cb69a8c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:50:29 compute-0 nova_compute[183075]: 2026-01-22 17:50:29.826 183079 DEBUG nova.network.neutron [req-34d044ae-ace3-4506-b0c2-ab957444c5b4 req-983e3e47-8757-4465-8f4c-ee7886a05928 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Updated VIF entry in instance network info cache for port 11393b0e-74ee-456c-8793-6b2a6cb69a8c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:50:29 compute-0 nova_compute[183075]: 2026-01-22 17:50:29.826 183079 DEBUG nova.network.neutron [req-34d044ae-ace3-4506-b0c2-ab957444c5b4 req-983e3e47-8757-4465-8f4c-ee7886a05928 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Updating instance_info_cache with network_info: [{"id": "11393b0e-74ee-456c-8793-6b2a6cb69a8c", "address": "fa:16:3e:af:50:24", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11393b0e-74", "ovs_interfaceid": "11393b0e-74ee-456c-8793-6b2a6cb69a8c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:50:29 compute-0 nova_compute[183075]: 2026-01-22 17:50:29.849 183079 DEBUG oslo_concurrency.lockutils [req-34d044ae-ace3-4506-b0c2-ab957444c5b4 req-983e3e47-8757-4465-8f4c-ee7886a05928 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-df3ded3b-e065-4dee-93d3-e1ced39c8619" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:50:29 compute-0 nova_compute[183075]: 2026-01-22 17:50:29.850 183079 DEBUG nova.compute.manager [req-34d044ae-ace3-4506-b0c2-ab957444c5b4 req-983e3e47-8757-4465-8f4c-ee7886a05928 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Received event network-changed-f7f0a8be-56b9-4a12-8c48-0b7a66239107 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:50:29 compute-0 nova_compute[183075]: 2026-01-22 17:50:29.850 183079 DEBUG nova.compute.manager [req-34d044ae-ace3-4506-b0c2-ab957444c5b4 req-983e3e47-8757-4465-8f4c-ee7886a05928 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Refreshing instance network info cache due to event network-changed-f7f0a8be-56b9-4a12-8c48-0b7a66239107. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:50:29 compute-0 nova_compute[183075]: 2026-01-22 17:50:29.851 183079 DEBUG oslo_concurrency.lockutils [req-34d044ae-ace3-4506-b0c2-ab957444c5b4 req-983e3e47-8757-4465-8f4c-ee7886a05928 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-38456dbd-001e-46c3-ae2d-5e1765611833" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:50:29 compute-0 nova_compute[183075]: 2026-01-22 17:50:29.851 183079 DEBUG oslo_concurrency.lockutils [req-34d044ae-ace3-4506-b0c2-ab957444c5b4 req-983e3e47-8757-4465-8f4c-ee7886a05928 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-38456dbd-001e-46c3-ae2d-5e1765611833" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:50:29 compute-0 nova_compute[183075]: 2026-01-22 17:50:29.852 183079 DEBUG nova.network.neutron [req-34d044ae-ace3-4506-b0c2-ab957444c5b4 req-983e3e47-8757-4465-8f4c-ee7886a05928 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Refreshing network info cache for port f7f0a8be-56b9-4a12-8c48-0b7a66239107 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:50:30 compute-0 podman[198895]: time="2026-01-22T17:50:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 22 17:50:30 compute-0 podman[198895]: @ - - [22/Jan/2026:17:50:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 22788 "" "Go-http-client/1.1"
Jan 22 17:50:30 compute-0 nova_compute[183075]: 2026-01-22 17:50:30.758 183079 DEBUG nova.network.neutron [req-34d044ae-ace3-4506-b0c2-ab957444c5b4 req-983e3e47-8757-4465-8f4c-ee7886a05928 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Updated VIF entry in instance network info cache for port f7f0a8be-56b9-4a12-8c48-0b7a66239107. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:50:30 compute-0 nova_compute[183075]: 2026-01-22 17:50:30.759 183079 DEBUG nova.network.neutron [req-34d044ae-ace3-4506-b0c2-ab957444c5b4 req-983e3e47-8757-4465-8f4c-ee7886a05928 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Updating instance_info_cache with network_info: [{"id": "f7f0a8be-56b9-4a12-8c48-0b7a66239107", "address": "fa:16:3e:d4:5d:bb", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7f0a8be-56", "ovs_interfaceid": "f7f0a8be-56b9-4a12-8c48-0b7a66239107", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:50:30 compute-0 nova_compute[183075]: 2026-01-22 17:50:30.781 183079 DEBUG oslo_concurrency.lockutils [req-34d044ae-ace3-4506-b0c2-ab957444c5b4 req-983e3e47-8757-4465-8f4c-ee7886a05928 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-38456dbd-001e-46c3-ae2d-5e1765611833" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:50:30 compute-0 nova_compute[183075]: 2026-01-22 17:50:30.984 183079 DEBUG nova.compute.manager [req-eff686cd-14cb-42fb-a0d0-f283f42a9021 req-e3d12971-8f19-4cd9-b4ae-8ec183dcc32a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Received event network-changed-598a2a6c-cdfc-40ec-9ff8-b57bac7c6063 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:50:30 compute-0 nova_compute[183075]: 2026-01-22 17:50:30.986 183079 DEBUG nova.compute.manager [req-eff686cd-14cb-42fb-a0d0-f283f42a9021 req-e3d12971-8f19-4cd9-b4ae-8ec183dcc32a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Refreshing instance network info cache due to event network-changed-598a2a6c-cdfc-40ec-9ff8-b57bac7c6063. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:50:30 compute-0 nova_compute[183075]: 2026-01-22 17:50:30.986 183079 DEBUG oslo_concurrency.lockutils [req-eff686cd-14cb-42fb-a0d0-f283f42a9021 req-e3d12971-8f19-4cd9-b4ae-8ec183dcc32a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:50:30 compute-0 nova_compute[183075]: 2026-01-22 17:50:30.987 183079 DEBUG oslo_concurrency.lockutils [req-eff686cd-14cb-42fb-a0d0-f283f42a9021 req-e3d12971-8f19-4cd9-b4ae-8ec183dcc32a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:50:30 compute-0 nova_compute[183075]: 2026-01-22 17:50:30.987 183079 DEBUG nova.network.neutron [req-eff686cd-14cb-42fb-a0d0-f283f42a9021 req-e3d12971-8f19-4cd9-b4ae-8ec183dcc32a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Refreshing network info cache for port 598a2a6c-cdfc-40ec-9ff8-b57bac7c6063 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:50:33 compute-0 nova_compute[183075]: 2026-01-22 17:50:33.045 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:33 compute-0 nova_compute[183075]: 2026-01-22 17:50:33.188 183079 DEBUG nova.network.neutron [req-eff686cd-14cb-42fb-a0d0-f283f42a9021 req-e3d12971-8f19-4cd9-b4ae-8ec183dcc32a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Updated VIF entry in instance network info cache for port 598a2a6c-cdfc-40ec-9ff8-b57bac7c6063. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:50:33 compute-0 nova_compute[183075]: 2026-01-22 17:50:33.189 183079 DEBUG nova.network.neutron [req-eff686cd-14cb-42fb-a0d0-f283f42a9021 req-e3d12971-8f19-4cd9-b4ae-8ec183dcc32a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Updating instance_info_cache with network_info: [{"id": "598a2a6c-cdfc-40ec-9ff8-b57bac7c6063", "address": "fa:16:3e:9e:71:1f", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap598a2a6c-cd", "ovs_interfaceid": "598a2a6c-cdfc-40ec-9ff8-b57bac7c6063", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:50:33 compute-0 nova_compute[183075]: 2026-01-22 17:50:33.209 183079 DEBUG oslo_concurrency.lockutils [req-eff686cd-14cb-42fb-a0d0-f283f42a9021 req-e3d12971-8f19-4cd9-b4ae-8ec183dcc32a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:50:33 compute-0 nova_compute[183075]: 2026-01-22 17:50:33.767 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:34 compute-0 nova_compute[183075]: 2026-01-22 17:50:34.828 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:50:36 compute-0 podman[241568]: 2026-01-22 17:50:36.353000258 +0000 UTC m=+0.057662237 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Jan 22 17:50:36 compute-0 podman[241569]: 2026-01-22 17:50:36.382397337 +0000 UTC m=+0.074983392 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.6, release=1755695350, vcs-type=git, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc.)
Jan 22 17:50:36 compute-0 podman[241567]: 2026-01-22 17:50:36.391300005 +0000 UTC m=+0.101061871 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, 
container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 22 17:50:37 compute-0 nova_compute[183075]: 2026-01-22 17:50:37.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:50:38 compute-0 nova_compute[183075]: 2026-01-22 17:50:38.048 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:38 compute-0 nova_compute[183075]: 2026-01-22 17:50:38.768 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:38 compute-0 nova_compute[183075]: 2026-01-22 17:50:38.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:50:39 compute-0 nova_compute[183075]: 2026-01-22 17:50:39.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:50:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:41.969 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:50:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:41.970 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:50:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:41.971 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:50:42 compute-0 podman[241633]: 2026-01-22 17:50:42.379509354 +0000 UTC m=+0.082022161 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:50:43 compute-0 nova_compute[183075]: 2026-01-22 17:50:43.051 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:43 compute-0 nova_compute[183075]: 2026-01-22 17:50:43.813 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.103 183079 DEBUG oslo_concurrency.lockutils [None req-2de3830f-8dfe-476b-9119-ed347427380c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.103 183079 DEBUG oslo_concurrency.lockutils [None req-2de3830f-8dfe-476b-9119-ed347427380c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.104 183079 DEBUG oslo_concurrency.lockutils [None req-2de3830f-8dfe-476b-9119-ed347427380c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.104 183079 DEBUG oslo_concurrency.lockutils [None req-2de3830f-8dfe-476b-9119-ed347427380c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.104 183079 DEBUG oslo_concurrency.lockutils [None req-2de3830f-8dfe-476b-9119-ed347427380c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.106 183079 INFO nova.compute.manager [None req-2de3830f-8dfe-476b-9119-ed347427380c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Terminating instance
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.107 183079 DEBUG nova.compute.manager [None req-2de3830f-8dfe-476b-9119-ed347427380c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:50:44 compute-0 kernel: tap598a2a6c-cd (unregistering): left promiscuous mode
Jan 22 17:50:44 compute-0 NetworkManager[55454]: <info>  [1769104244.1298] device (tap598a2a6c-cd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.136 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:44 compute-0 ovn_controller[95372]: 2026-01-22T17:50:44Z|00796|binding|INFO|Releasing lport 598a2a6c-cdfc-40ec-9ff8-b57bac7c6063 from this chassis (sb_readonly=0)
Jan 22 17:50:44 compute-0 ovn_controller[95372]: 2026-01-22T17:50:44Z|00797|binding|INFO|Setting lport 598a2a6c-cdfc-40ec-9ff8-b57bac7c6063 down in Southbound
Jan 22 17:50:44 compute-0 ovn_controller[95372]: 2026-01-22T17:50:44Z|00798|binding|INFO|Removing iface tap598a2a6c-cd ovn-installed in OVS
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.139 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:44.143 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9e:71:1f 10.100.0.14'], port_security=['fa:16:3e:9e:71:1f 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'd6dc2bc3-625d-4ff7-a390-ae19df6cdfc2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '5', 'neutron:security_group_ids': '6287aaea-e1ec-4da1-8661-f9d695351d5e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.191'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=598a2a6c-cdfc-40ec-9ff8-b57bac7c6063) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:50:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:44.145 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 598a2a6c-cdfc-40ec-9ff8-b57bac7c6063 in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 unbound from our chassis
Jan 22 17:50:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:44.146 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.150 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:44.163 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e8e3d966-8e54-47e8-ab51-412689d1698b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:50:44 compute-0 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d00000047.scope: Deactivated successfully.
Jan 22 17:50:44 compute-0 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d00000047.scope: Consumed 16.783s CPU time.
Jan 22 17:50:44 compute-0 systemd-machined[154382]: Machine qemu-71-instance-00000047 terminated.
Jan 22 17:50:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:44.191 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[fee0ab61-e945-4a6b-a0c3-a3e41a64df3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:50:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:44.194 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[a7e2969e-fb57-42d8-93c9-c6fa8df6daa1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:50:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:44.220 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[9d507254-e8d2-4c90-bddf-1a6bfb5846e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:50:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:44.236 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[aa942506-b8c7-4557-ab3e-b9dbc60a1ca7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 300, 'tx_packets': 158, 'rx_bytes': 25696, 'tx_bytes': 17999, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 300, 'tx_packets': 158, 'rx_bytes': 25696, 'tx_bytes': 17999, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 210], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 639368, 'reachable_time': 28357, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241665, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:50:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:44.252 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[065eb9c6-2164-45b9-a386-161380ab0cc7]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 639379, 'tstamp': 639379}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241666, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 639381, 'tstamp': 639381}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241666, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:50:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:44.253 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.255 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:44.258 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88ed9213-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.258 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:44.259 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:50:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:44.259 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88ed9213-70, col_values=(('external_ids', {'iface-id': 'da93a32e-eed6-4124-8f08-78b0bfe2cb69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:50:44 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:44.259 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.331 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.335 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.407 183079 INFO nova.virt.libvirt.driver [-] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Instance destroyed successfully.
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.407 183079 DEBUG nova.objects.instance [None req-2de3830f-8dfe-476b-9119-ed347427380c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'resources' on Instance uuid d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.419 183079 DEBUG nova.virt.libvirt.vif [None req-2de3830f-8dfe-476b-9119-ed347427380c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:49:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1547454978',display_name='tempest-server-test-1547454978',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1547454978',id=71,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:49:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-6n6k9xyp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_h
w_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:49:18Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "598a2a6c-cdfc-40ec-9ff8-b57bac7c6063", "address": "fa:16:3e:9e:71:1f", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap598a2a6c-cd", "ovs_interfaceid": "598a2a6c-cdfc-40ec-9ff8-b57bac7c6063", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.420 183079 DEBUG nova.network.os_vif_util [None req-2de3830f-8dfe-476b-9119-ed347427380c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "598a2a6c-cdfc-40ec-9ff8-b57bac7c6063", "address": "fa:16:3e:9e:71:1f", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap598a2a6c-cd", "ovs_interfaceid": "598a2a6c-cdfc-40ec-9ff8-b57bac7c6063", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.420 183079 DEBUG nova.network.os_vif_util [None req-2de3830f-8dfe-476b-9119-ed347427380c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9e:71:1f,bridge_name='br-int',has_traffic_filtering=True,id=598a2a6c-cdfc-40ec-9ff8-b57bac7c6063,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap598a2a6c-cd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.421 183079 DEBUG os_vif [None req-2de3830f-8dfe-476b-9119-ed347427380c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9e:71:1f,bridge_name='br-int',has_traffic_filtering=True,id=598a2a6c-cdfc-40ec-9ff8-b57bac7c6063,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap598a2a6c-cd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.422 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.423 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap598a2a6c-cd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.426 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.429 183079 INFO os_vif [None req-2de3830f-8dfe-476b-9119-ed347427380c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9e:71:1f,bridge_name='br-int',has_traffic_filtering=True,id=598a2a6c-cdfc-40ec-9ff8-b57bac7c6063,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap598a2a6c-cd')
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.430 183079 INFO nova.virt.libvirt.driver [None req-2de3830f-8dfe-476b-9119-ed347427380c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Deleting instance files /var/lib/nova/instances/d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2_del
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.430 183079 INFO nova.virt.libvirt.driver [None req-2de3830f-8dfe-476b-9119-ed347427380c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Deletion of /var/lib/nova/instances/d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2_del complete
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.480 183079 INFO nova.compute.manager [None req-2de3830f-8dfe-476b-9119-ed347427380c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Took 0.37 seconds to destroy the instance on the hypervisor.
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.481 183079 DEBUG oslo.service.loopingcall [None req-2de3830f-8dfe-476b-9119-ed347427380c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.481 183079 DEBUG nova.compute.manager [-] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.481 183079 DEBUG nova.network.neutron [-] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.850 183079 DEBUG nova.compute.manager [req-66d7ad47-bdcb-40bf-b418-3d0f14ba5400 req-fe20c702-4f81-46ef-a0f6-0d2a97e94916 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Received event network-vif-unplugged-598a2a6c-cdfc-40ec-9ff8-b57bac7c6063 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.850 183079 DEBUG oslo_concurrency.lockutils [req-66d7ad47-bdcb-40bf-b418-3d0f14ba5400 req-fe20c702-4f81-46ef-a0f6-0d2a97e94916 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.851 183079 DEBUG oslo_concurrency.lockutils [req-66d7ad47-bdcb-40bf-b418-3d0f14ba5400 req-fe20c702-4f81-46ef-a0f6-0d2a97e94916 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.851 183079 DEBUG oslo_concurrency.lockutils [req-66d7ad47-bdcb-40bf-b418-3d0f14ba5400 req-fe20c702-4f81-46ef-a0f6-0d2a97e94916 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.851 183079 DEBUG nova.compute.manager [req-66d7ad47-bdcb-40bf-b418-3d0f14ba5400 req-fe20c702-4f81-46ef-a0f6-0d2a97e94916 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] No waiting events found dispatching network-vif-unplugged-598a2a6c-cdfc-40ec-9ff8-b57bac7c6063 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:50:44 compute-0 nova_compute[183075]: 2026-01-22 17:50:44.851 183079 DEBUG nova.compute.manager [req-66d7ad47-bdcb-40bf-b418-3d0f14ba5400 req-fe20c702-4f81-46ef-a0f6-0d2a97e94916 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Received event network-vif-unplugged-598a2a6c-cdfc-40ec-9ff8-b57bac7c6063 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:50:45 compute-0 nova_compute[183075]: 2026-01-22 17:50:45.680 183079 DEBUG nova.network.neutron [-] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:50:45 compute-0 nova_compute[183075]: 2026-01-22 17:50:45.698 183079 INFO nova.compute.manager [-] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Took 1.22 seconds to deallocate network for instance.
Jan 22 17:50:45 compute-0 nova_compute[183075]: 2026-01-22 17:50:45.737 183079 DEBUG oslo_concurrency.lockutils [None req-2de3830f-8dfe-476b-9119-ed347427380c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:50:45 compute-0 nova_compute[183075]: 2026-01-22 17:50:45.737 183079 DEBUG oslo_concurrency.lockutils [None req-2de3830f-8dfe-476b-9119-ed347427380c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:50:45 compute-0 nova_compute[183075]: 2026-01-22 17:50:45.757 183079 DEBUG nova.compute.manager [req-0da44253-edfd-4b4f-85ee-b03d4f58a4e9 req-e859ef34-20b1-4611-8e0d-b3c0f5ca5b72 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Received event network-vif-deleted-598a2a6c-cdfc-40ec-9ff8-b57bac7c6063 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:50:45 compute-0 nova_compute[183075]: 2026-01-22 17:50:45.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:50:45 compute-0 nova_compute[183075]: 2026-01-22 17:50:45.834 183079 DEBUG nova.compute.provider_tree [None req-2de3830f-8dfe-476b-9119-ed347427380c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:50:45 compute-0 nova_compute[183075]: 2026-01-22 17:50:45.846 183079 DEBUG nova.scheduler.client.report [None req-2de3830f-8dfe-476b-9119-ed347427380c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:50:45 compute-0 nova_compute[183075]: 2026-01-22 17:50:45.864 183079 DEBUG oslo_concurrency.lockutils [None req-2de3830f-8dfe-476b-9119-ed347427380c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:50:45 compute-0 nova_compute[183075]: 2026-01-22 17:50:45.891 183079 INFO nova.scheduler.client.report [None req-2de3830f-8dfe-476b-9119-ed347427380c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Deleted allocations for instance d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2
Jan 22 17:50:45 compute-0 nova_compute[183075]: 2026-01-22 17:50:45.954 183079 DEBUG oslo_concurrency.lockutils [None req-2de3830f-8dfe-476b-9119-ed347427380c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.850s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.161 183079 DEBUG oslo_concurrency.lockutils [None req-ca09bd04-7949-4dfa-b022-2347812247ee 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "38456dbd-001e-46c3-ae2d-5e1765611833" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.162 183079 DEBUG oslo_concurrency.lockutils [None req-ca09bd04-7949-4dfa-b022-2347812247ee 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "38456dbd-001e-46c3-ae2d-5e1765611833" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.162 183079 DEBUG oslo_concurrency.lockutils [None req-ca09bd04-7949-4dfa-b022-2347812247ee 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "38456dbd-001e-46c3-ae2d-5e1765611833-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.162 183079 DEBUG oslo_concurrency.lockutils [None req-ca09bd04-7949-4dfa-b022-2347812247ee 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "38456dbd-001e-46c3-ae2d-5e1765611833-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.162 183079 DEBUG oslo_concurrency.lockutils [None req-ca09bd04-7949-4dfa-b022-2347812247ee 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "38456dbd-001e-46c3-ae2d-5e1765611833-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.163 183079 INFO nova.compute.manager [None req-ca09bd04-7949-4dfa-b022-2347812247ee 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Terminating instance
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.164 183079 DEBUG nova.compute.manager [None req-ca09bd04-7949-4dfa-b022-2347812247ee 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:50:46 compute-0 kernel: tapf7f0a8be-56 (unregistering): left promiscuous mode
Jan 22 17:50:46 compute-0 NetworkManager[55454]: <info>  [1769104246.1808] device (tapf7f0a8be-56): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.224 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:46 compute-0 ovn_controller[95372]: 2026-01-22T17:50:46Z|00799|binding|INFO|Releasing lport f7f0a8be-56b9-4a12-8c48-0b7a66239107 from this chassis (sb_readonly=0)
Jan 22 17:50:46 compute-0 ovn_controller[95372]: 2026-01-22T17:50:46Z|00800|binding|INFO|Setting lport f7f0a8be-56b9-4a12-8c48-0b7a66239107 down in Southbound
Jan 22 17:50:46 compute-0 ovn_controller[95372]: 2026-01-22T17:50:46Z|00801|binding|INFO|Removing iface tapf7f0a8be-56 ovn-installed in OVS
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.230 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:46.240 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:5d:bb 10.100.0.10'], port_security=['fa:16:3e:d4:5d:bb 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '38456dbd-001e-46c3-ae2d-5e1765611833', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '5', 'neutron:security_group_ids': '574fccba-75c6-4dd6-8b13-040f91d4cf29', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.219'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=f7f0a8be-56b9-4a12-8c48-0b7a66239107) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:50:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:46.242 104629 INFO neutron.agent.ovn.metadata.agent [-] Port f7f0a8be-56b9-4a12-8c48-0b7a66239107 in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 unbound from our chassis
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.242 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:46.244 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:50:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:46.264 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[34495b85-fa28-4ac1-a050-24fdaedae31b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:50:46 compute-0 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000046.scope: Deactivated successfully.
Jan 22 17:50:46 compute-0 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000046.scope: Consumed 18.430s CPU time.
Jan 22 17:50:46 compute-0 systemd-machined[154382]: Machine qemu-70-instance-00000046 terminated.
Jan 22 17:50:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:46.305 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[c762fb35-1763-4a65-ad6f-f5b2b0be6c0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:50:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:46.309 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[cec17709-3a5d-4869-82f6-6a49e29bccd2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:50:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:46.333 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[c1f1954d-5489-4672-a477-b8ac6d1fcd8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:50:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:46.349 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[12f939f7-baf9-476c-8085-535c89459bf9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 300, 'tx_packets': 160, 'rx_bytes': 25696, 'tx_bytes': 18083, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 300, 'tx_packets': 160, 'rx_bytes': 25696, 'tx_bytes': 18083, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 210], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 639368, 'reachable_time': 28357, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241691, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:50:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:46.363 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1a1c976c-393c-40c2-971d-fe7cef0056c3]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 639379, 'tstamp': 639379}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241692, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 639381, 'tstamp': 639381}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241692, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:50:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:46.365 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.368 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.372 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:46.372 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88ed9213-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:50:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:46.372 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:50:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:46.372 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88ed9213-70, col_values=(('external_ids', {'iface-id': 'da93a32e-eed6-4124-8f08-78b0bfe2cb69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:50:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:46.373 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.423 183079 INFO nova.virt.libvirt.driver [-] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Instance destroyed successfully.
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.423 183079 DEBUG nova.objects.instance [None req-ca09bd04-7949-4dfa-b022-2347812247ee 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'resources' on Instance uuid 38456dbd-001e-46c3-ae2d-5e1765611833 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.437 183079 DEBUG nova.virt.libvirt.vif [None req-ca09bd04-7949-4dfa-b022-2347812247ee 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:48:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-7188366',display_name='tempest-server-test-7188366',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-7188366',id=70,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:48:14Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-jlj121jw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_b
us='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:48:15Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=38456dbd-001e-46c3-ae2d-5e1765611833,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f7f0a8be-56b9-4a12-8c48-0b7a66239107", "address": "fa:16:3e:d4:5d:bb", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7f0a8be-56", "ovs_interfaceid": "f7f0a8be-56b9-4a12-8c48-0b7a66239107", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.438 183079 DEBUG nova.network.os_vif_util [None req-ca09bd04-7949-4dfa-b022-2347812247ee 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "f7f0a8be-56b9-4a12-8c48-0b7a66239107", "address": "fa:16:3e:d4:5d:bb", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7f0a8be-56", "ovs_interfaceid": "f7f0a8be-56b9-4a12-8c48-0b7a66239107", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.438 183079 DEBUG nova.network.os_vif_util [None req-ca09bd04-7949-4dfa-b022-2347812247ee 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d4:5d:bb,bridge_name='br-int',has_traffic_filtering=True,id=f7f0a8be-56b9-4a12-8c48-0b7a66239107,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7f0a8be-56') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.439 183079 DEBUG os_vif [None req-ca09bd04-7949-4dfa-b022-2347812247ee 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d4:5d:bb,bridge_name='br-int',has_traffic_filtering=True,id=f7f0a8be-56b9-4a12-8c48-0b7a66239107,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7f0a8be-56') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.440 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.440 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7f0a8be-56, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.441 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.442 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.444 183079 INFO os_vif [None req-ca09bd04-7949-4dfa-b022-2347812247ee 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d4:5d:bb,bridge_name='br-int',has_traffic_filtering=True,id=f7f0a8be-56b9-4a12-8c48-0b7a66239107,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7f0a8be-56')
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.444 183079 INFO nova.virt.libvirt.driver [None req-ca09bd04-7949-4dfa-b022-2347812247ee 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Deleting instance files /var/lib/nova/instances/38456dbd-001e-46c3-ae2d-5e1765611833_del
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.445 183079 INFO nova.virt.libvirt.driver [None req-ca09bd04-7949-4dfa-b022-2347812247ee 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Deletion of /var/lib/nova/instances/38456dbd-001e-46c3-ae2d-5e1765611833_del complete
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.491 183079 INFO nova.compute.manager [None req-ca09bd04-7949-4dfa-b022-2347812247ee 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Took 0.33 seconds to destroy the instance on the hypervisor.
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.491 183079 DEBUG oslo.service.loopingcall [None req-ca09bd04-7949-4dfa-b022-2347812247ee 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.492 183079 DEBUG nova.compute.manager [-] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.492 183079 DEBUG nova.network.neutron [-] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.946 183079 DEBUG nova.compute.manager [req-7808649b-5948-4379-8f39-6a11040c9f9d req-e1d78add-d9f3-47d0-86cb-806f7a6824d1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Received event network-vif-plugged-598a2a6c-cdfc-40ec-9ff8-b57bac7c6063 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.946 183079 DEBUG oslo_concurrency.lockutils [req-7808649b-5948-4379-8f39-6a11040c9f9d req-e1d78add-d9f3-47d0-86cb-806f7a6824d1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.946 183079 DEBUG oslo_concurrency.lockutils [req-7808649b-5948-4379-8f39-6a11040c9f9d req-e1d78add-d9f3-47d0-86cb-806f7a6824d1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.946 183079 DEBUG oslo_concurrency.lockutils [req-7808649b-5948-4379-8f39-6a11040c9f9d req-e1d78add-d9f3-47d0-86cb-806f7a6824d1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.946 183079 DEBUG nova.compute.manager [req-7808649b-5948-4379-8f39-6a11040c9f9d req-e1d78add-d9f3-47d0-86cb-806f7a6824d1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] No waiting events found dispatching network-vif-plugged-598a2a6c-cdfc-40ec-9ff8-b57bac7c6063 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:50:46 compute-0 nova_compute[183075]: 2026-01-22 17:50:46.947 183079 WARNING nova.compute.manager [req-7808649b-5948-4379-8f39-6a11040c9f9d req-e1d78add-d9f3-47d0-86cb-806f7a6824d1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Received unexpected event network-vif-plugged-598a2a6c-cdfc-40ec-9ff8-b57bac7c6063 for instance with vm_state deleted and task_state None.
Jan 22 17:50:47 compute-0 nova_compute[183075]: 2026-01-22 17:50:47.150 183079 DEBUG nova.network.neutron [-] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:50:47 compute-0 nova_compute[183075]: 2026-01-22 17:50:47.169 183079 INFO nova.compute.manager [-] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Took 0.68 seconds to deallocate network for instance.
Jan 22 17:50:47 compute-0 nova_compute[183075]: 2026-01-22 17:50:47.210 183079 DEBUG oslo_concurrency.lockutils [None req-ca09bd04-7949-4dfa-b022-2347812247ee 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:50:47 compute-0 nova_compute[183075]: 2026-01-22 17:50:47.211 183079 DEBUG oslo_concurrency.lockutils [None req-ca09bd04-7949-4dfa-b022-2347812247ee 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:50:47 compute-0 nova_compute[183075]: 2026-01-22 17:50:47.279 183079 DEBUG nova.compute.provider_tree [None req-ca09bd04-7949-4dfa-b022-2347812247ee 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:50:47 compute-0 nova_compute[183075]: 2026-01-22 17:50:47.299 183079 DEBUG nova.scheduler.client.report [None req-ca09bd04-7949-4dfa-b022-2347812247ee 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:50:47 compute-0 nova_compute[183075]: 2026-01-22 17:50:47.319 183079 DEBUG oslo_concurrency.lockutils [None req-ca09bd04-7949-4dfa-b022-2347812247ee 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:50:47 compute-0 nova_compute[183075]: 2026-01-22 17:50:47.340 183079 INFO nova.scheduler.client.report [None req-ca09bd04-7949-4dfa-b022-2347812247ee 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Deleted allocations for instance 38456dbd-001e-46c3-ae2d-5e1765611833
Jan 22 17:50:47 compute-0 nova_compute[183075]: 2026-01-22 17:50:47.411 183079 DEBUG oslo_concurrency.lockutils [None req-ca09bd04-7949-4dfa-b022-2347812247ee 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "38456dbd-001e-46c3-ae2d-5e1765611833" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.249s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:50:47 compute-0 nova_compute[183075]: 2026-01-22 17:50:47.838 183079 DEBUG nova.compute.manager [req-305ce212-4d6b-40bb-b1af-883d359d3bba req-c86e1493-9ad6-4698-8c45-34461a9b5e26 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Received event network-vif-unplugged-f7f0a8be-56b9-4a12-8c48-0b7a66239107 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:50:47 compute-0 nova_compute[183075]: 2026-01-22 17:50:47.838 183079 DEBUG oslo_concurrency.lockutils [req-305ce212-4d6b-40bb-b1af-883d359d3bba req-c86e1493-9ad6-4698-8c45-34461a9b5e26 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "38456dbd-001e-46c3-ae2d-5e1765611833-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:50:47 compute-0 nova_compute[183075]: 2026-01-22 17:50:47.839 183079 DEBUG oslo_concurrency.lockutils [req-305ce212-4d6b-40bb-b1af-883d359d3bba req-c86e1493-9ad6-4698-8c45-34461a9b5e26 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "38456dbd-001e-46c3-ae2d-5e1765611833-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:50:47 compute-0 nova_compute[183075]: 2026-01-22 17:50:47.839 183079 DEBUG oslo_concurrency.lockutils [req-305ce212-4d6b-40bb-b1af-883d359d3bba req-c86e1493-9ad6-4698-8c45-34461a9b5e26 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "38456dbd-001e-46c3-ae2d-5e1765611833-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:50:47 compute-0 nova_compute[183075]: 2026-01-22 17:50:47.839 183079 DEBUG nova.compute.manager [req-305ce212-4d6b-40bb-b1af-883d359d3bba req-c86e1493-9ad6-4698-8c45-34461a9b5e26 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] No waiting events found dispatching network-vif-unplugged-f7f0a8be-56b9-4a12-8c48-0b7a66239107 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:50:47 compute-0 nova_compute[183075]: 2026-01-22 17:50:47.839 183079 WARNING nova.compute.manager [req-305ce212-4d6b-40bb-b1af-883d359d3bba req-c86e1493-9ad6-4698-8c45-34461a9b5e26 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Received unexpected event network-vif-unplugged-f7f0a8be-56b9-4a12-8c48-0b7a66239107 for instance with vm_state deleted and task_state None.
Jan 22 17:50:47 compute-0 nova_compute[183075]: 2026-01-22 17:50:47.839 183079 DEBUG nova.compute.manager [req-305ce212-4d6b-40bb-b1af-883d359d3bba req-c86e1493-9ad6-4698-8c45-34461a9b5e26 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Received event network-vif-plugged-f7f0a8be-56b9-4a12-8c48-0b7a66239107 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:50:47 compute-0 nova_compute[183075]: 2026-01-22 17:50:47.840 183079 DEBUG oslo_concurrency.lockutils [req-305ce212-4d6b-40bb-b1af-883d359d3bba req-c86e1493-9ad6-4698-8c45-34461a9b5e26 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "38456dbd-001e-46c3-ae2d-5e1765611833-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:50:47 compute-0 nova_compute[183075]: 2026-01-22 17:50:47.840 183079 DEBUG oslo_concurrency.lockutils [req-305ce212-4d6b-40bb-b1af-883d359d3bba req-c86e1493-9ad6-4698-8c45-34461a9b5e26 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "38456dbd-001e-46c3-ae2d-5e1765611833-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:50:47 compute-0 nova_compute[183075]: 2026-01-22 17:50:47.840 183079 DEBUG oslo_concurrency.lockutils [req-305ce212-4d6b-40bb-b1af-883d359d3bba req-c86e1493-9ad6-4698-8c45-34461a9b5e26 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "38456dbd-001e-46c3-ae2d-5e1765611833-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:50:47 compute-0 nova_compute[183075]: 2026-01-22 17:50:47.840 183079 DEBUG nova.compute.manager [req-305ce212-4d6b-40bb-b1af-883d359d3bba req-c86e1493-9ad6-4698-8c45-34461a9b5e26 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] No waiting events found dispatching network-vif-plugged-f7f0a8be-56b9-4a12-8c48-0b7a66239107 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:50:47 compute-0 nova_compute[183075]: 2026-01-22 17:50:47.841 183079 WARNING nova.compute.manager [req-305ce212-4d6b-40bb-b1af-883d359d3bba req-c86e1493-9ad6-4698-8c45-34461a9b5e26 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Received unexpected event network-vif-plugged-f7f0a8be-56b9-4a12-8c48-0b7a66239107 for instance with vm_state deleted and task_state None.
Jan 22 17:50:47 compute-0 nova_compute[183075]: 2026-01-22 17:50:47.841 183079 DEBUG nova.compute.manager [req-305ce212-4d6b-40bb-b1af-883d359d3bba req-c86e1493-9ad6-4698-8c45-34461a9b5e26 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Received event network-vif-deleted-f7f0a8be-56b9-4a12-8c48-0b7a66239107 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:50:47 compute-0 nova_compute[183075]: 2026-01-22 17:50:47.878 183079 DEBUG oslo_concurrency.lockutils [None req-5f21e8ee-8f49-4ad2-96c5-f60ed3830fe3 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "df3ded3b-e065-4dee-93d3-e1ced39c8619" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:50:47 compute-0 nova_compute[183075]: 2026-01-22 17:50:47.878 183079 DEBUG oslo_concurrency.lockutils [None req-5f21e8ee-8f49-4ad2-96c5-f60ed3830fe3 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "df3ded3b-e065-4dee-93d3-e1ced39c8619" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:50:47 compute-0 nova_compute[183075]: 2026-01-22 17:50:47.878 183079 DEBUG oslo_concurrency.lockutils [None req-5f21e8ee-8f49-4ad2-96c5-f60ed3830fe3 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "df3ded3b-e065-4dee-93d3-e1ced39c8619-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:50:47 compute-0 nova_compute[183075]: 2026-01-22 17:50:47.878 183079 DEBUG oslo_concurrency.lockutils [None req-5f21e8ee-8f49-4ad2-96c5-f60ed3830fe3 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "df3ded3b-e065-4dee-93d3-e1ced39c8619-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:50:47 compute-0 nova_compute[183075]: 2026-01-22 17:50:47.879 183079 DEBUG oslo_concurrency.lockutils [None req-5f21e8ee-8f49-4ad2-96c5-f60ed3830fe3 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "df3ded3b-e065-4dee-93d3-e1ced39c8619-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:50:47 compute-0 nova_compute[183075]: 2026-01-22 17:50:47.880 183079 INFO nova.compute.manager [None req-5f21e8ee-8f49-4ad2-96c5-f60ed3830fe3 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Terminating instance
Jan 22 17:50:47 compute-0 nova_compute[183075]: 2026-01-22 17:50:47.880 183079 DEBUG nova.compute.manager [None req-5f21e8ee-8f49-4ad2-96c5-f60ed3830fe3 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:50:47 compute-0 kernel: tap11393b0e-74 (unregistering): left promiscuous mode
Jan 22 17:50:47 compute-0 NetworkManager[55454]: <info>  [1769104247.9065] device (tap11393b0e-74): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:50:47 compute-0 ovn_controller[95372]: 2026-01-22T17:50:47Z|00802|binding|INFO|Releasing lport 11393b0e-74ee-456c-8793-6b2a6cb69a8c from this chassis (sb_readonly=0)
Jan 22 17:50:47 compute-0 nova_compute[183075]: 2026-01-22 17:50:47.913 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:47 compute-0 ovn_controller[95372]: 2026-01-22T17:50:47Z|00803|binding|INFO|Setting lport 11393b0e-74ee-456c-8793-6b2a6cb69a8c down in Southbound
Jan 22 17:50:47 compute-0 ovn_controller[95372]: 2026-01-22T17:50:47Z|00804|binding|INFO|Removing iface tap11393b0e-74 ovn-installed in OVS
Jan 22 17:50:47 compute-0 nova_compute[183075]: 2026-01-22 17:50:47.915 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:47.920 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:50:24 10.100.0.12'], port_security=['fa:16:3e:af:50:24 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'df3ded3b-e065-4dee-93d3-e1ced39c8619', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '5', 'neutron:security_group_ids': '574fccba-75c6-4dd6-8b13-040f91d4cf29', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.243'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=11393b0e-74ee-456c-8793-6b2a6cb69a8c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:50:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:47.922 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 11393b0e-74ee-456c-8793-6b2a6cb69a8c in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 unbound from our chassis
Jan 22 17:50:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:47.923 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:50:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:47.924 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c084c588-77cc-451c-9822-daa0ca1747c8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:50:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:47.925 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 namespace which is not needed anymore
Jan 22 17:50:47 compute-0 nova_compute[183075]: 2026-01-22 17:50:47.929 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:47 compute-0 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000045.scope: Deactivated successfully.
Jan 22 17:50:47 compute-0 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000045.scope: Consumed 21.996s CPU time.
Jan 22 17:50:47 compute-0 systemd-machined[154382]: Machine qemu-69-instance-00000045 terminated.
Jan 22 17:50:48 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240481]: [NOTICE]   (240485) : haproxy version is 2.8.14-c23fe91
Jan 22 17:50:48 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240481]: [NOTICE]   (240485) : path to executable is /usr/sbin/haproxy
Jan 22 17:50:48 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240481]: [WARNING]  (240485) : Exiting Master process...
Jan 22 17:50:48 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240481]: [ALERT]    (240485) : Current worker (240487) exited with code 143 (Terminated)
Jan 22 17:50:48 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[240481]: [WARNING]  (240485) : All workers exited. Exiting... (0)
Jan 22 17:50:48 compute-0 systemd[1]: libpod-86b7ddc8fb11ca0d8d513aa3bd0c09fd24d0db577008358da0229169bb1faf4e.scope: Deactivated successfully.
Jan 22 17:50:48 compute-0 podman[241735]: 2026-01-22 17:50:48.052214882 +0000 UTC m=+0.043723184 container died 86b7ddc8fb11ca0d8d513aa3bd0c09fd24d0db577008358da0229169bb1faf4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 17:50:48 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-86b7ddc8fb11ca0d8d513aa3bd0c09fd24d0db577008358da0229169bb1faf4e-userdata-shm.mount: Deactivated successfully.
Jan 22 17:50:48 compute-0 systemd[1]: var-lib-containers-storage-overlay-fb8ea020f48aebd88578bf0755c742772153b6daf58fcdfb6d33204e874fca13-merged.mount: Deactivated successfully.
Jan 22 17:50:48 compute-0 podman[241735]: 2026-01-22 17:50:48.097766044 +0000 UTC m=+0.089274336 container cleanup 86b7ddc8fb11ca0d8d513aa3bd0c09fd24d0db577008358da0229169bb1faf4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:50:48 compute-0 systemd[1]: libpod-conmon-86b7ddc8fb11ca0d8d513aa3bd0c09fd24d0db577008358da0229169bb1faf4e.scope: Deactivated successfully.
Jan 22 17:50:48 compute-0 nova_compute[183075]: 2026-01-22 17:50:48.135 183079 INFO nova.virt.libvirt.driver [-] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Instance destroyed successfully.
Jan 22 17:50:48 compute-0 nova_compute[183075]: 2026-01-22 17:50:48.135 183079 DEBUG nova.objects.instance [None req-5f21e8ee-8f49-4ad2-96c5-f60ed3830fe3 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'resources' on Instance uuid df3ded3b-e065-4dee-93d3-e1ced39c8619 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:50:48 compute-0 nova_compute[183075]: 2026-01-22 17:50:48.148 183079 DEBUG nova.virt.libvirt.vif [None req-5f21e8ee-8f49-4ad2-96c5-f60ed3830fe3 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:47:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1131995147',display_name='tempest-server-test-1131995147',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1131995147',id=69,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:47:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-m3xyla0g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_h
w_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:47:11Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=df3ded3b-e065-4dee-93d3-e1ced39c8619,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "11393b0e-74ee-456c-8793-6b2a6cb69a8c", "address": "fa:16:3e:af:50:24", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11393b0e-74", "ovs_interfaceid": "11393b0e-74ee-456c-8793-6b2a6cb69a8c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:50:48 compute-0 nova_compute[183075]: 2026-01-22 17:50:48.148 183079 DEBUG nova.network.os_vif_util [None req-5f21e8ee-8f49-4ad2-96c5-f60ed3830fe3 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "11393b0e-74ee-456c-8793-6b2a6cb69a8c", "address": "fa:16:3e:af:50:24", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11393b0e-74", "ovs_interfaceid": "11393b0e-74ee-456c-8793-6b2a6cb69a8c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:50:48 compute-0 nova_compute[183075]: 2026-01-22 17:50:48.149 183079 DEBUG nova.network.os_vif_util [None req-5f21e8ee-8f49-4ad2-96c5-f60ed3830fe3 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:af:50:24,bridge_name='br-int',has_traffic_filtering=True,id=11393b0e-74ee-456c-8793-6b2a6cb69a8c,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11393b0e-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:50:48 compute-0 nova_compute[183075]: 2026-01-22 17:50:48.149 183079 DEBUG os_vif [None req-5f21e8ee-8f49-4ad2-96c5-f60ed3830fe3 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:af:50:24,bridge_name='br-int',has_traffic_filtering=True,id=11393b0e-74ee-456c-8793-6b2a6cb69a8c,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11393b0e-74') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:50:48 compute-0 nova_compute[183075]: 2026-01-22 17:50:48.150 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:48 compute-0 nova_compute[183075]: 2026-01-22 17:50:48.151 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap11393b0e-74, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:50:48 compute-0 nova_compute[183075]: 2026-01-22 17:50:48.152 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:48 compute-0 nova_compute[183075]: 2026-01-22 17:50:48.153 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:48 compute-0 nova_compute[183075]: 2026-01-22 17:50:48.156 183079 INFO os_vif [None req-5f21e8ee-8f49-4ad2-96c5-f60ed3830fe3 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:af:50:24,bridge_name='br-int',has_traffic_filtering=True,id=11393b0e-74ee-456c-8793-6b2a6cb69a8c,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11393b0e-74')
Jan 22 17:50:48 compute-0 nova_compute[183075]: 2026-01-22 17:50:48.156 183079 INFO nova.virt.libvirt.driver [None req-5f21e8ee-8f49-4ad2-96c5-f60ed3830fe3 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Deleting instance files /var/lib/nova/instances/df3ded3b-e065-4dee-93d3-e1ced39c8619_del
Jan 22 17:50:48 compute-0 nova_compute[183075]: 2026-01-22 17:50:48.157 183079 INFO nova.virt.libvirt.driver [None req-5f21e8ee-8f49-4ad2-96c5-f60ed3830fe3 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Deletion of /var/lib/nova/instances/df3ded3b-e065-4dee-93d3-e1ced39c8619_del complete
Jan 22 17:50:48 compute-0 podman[241771]: 2026-01-22 17:50:48.163369463 +0000 UTC m=+0.044527985 container remove 86b7ddc8fb11ca0d8d513aa3bd0c09fd24d0db577008358da0229169bb1faf4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 22 17:50:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:48.168 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3b9d0c5f-b629-4a4f-b891-afb1f6b5d734]: (4, ('Thu Jan 22 05:50:47 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 (86b7ddc8fb11ca0d8d513aa3bd0c09fd24d0db577008358da0229169bb1faf4e)\n86b7ddc8fb11ca0d8d513aa3bd0c09fd24d0db577008358da0229169bb1faf4e\nThu Jan 22 05:50:48 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 (86b7ddc8fb11ca0d8d513aa3bd0c09fd24d0db577008358da0229169bb1faf4e)\n86b7ddc8fb11ca0d8d513aa3bd0c09fd24d0db577008358da0229169bb1faf4e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:50:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:48.170 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f4eab27e-07c0-4058-8339-87cd53544d31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:50:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:48.171 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:50:48 compute-0 nova_compute[183075]: 2026-01-22 17:50:48.173 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:48 compute-0 kernel: tap88ed9213-70: left promiscuous mode
Jan 22 17:50:48 compute-0 nova_compute[183075]: 2026-01-22 17:50:48.187 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:48.191 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[82bcda1b-9799-484f-8802-634d75350990]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:50:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:48.204 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c2de2da2-5812-4d05-9290-870959d684de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:50:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:48.205 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[9722d59c-5c60-4c75-9904-7978a2954ebb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:50:48 compute-0 nova_compute[183075]: 2026-01-22 17:50:48.210 183079 INFO nova.compute.manager [None req-5f21e8ee-8f49-4ad2-96c5-f60ed3830fe3 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Took 0.33 seconds to destroy the instance on the hypervisor.
Jan 22 17:50:48 compute-0 nova_compute[183075]: 2026-01-22 17:50:48.211 183079 DEBUG oslo.service.loopingcall [None req-5f21e8ee-8f49-4ad2-96c5-f60ed3830fe3 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:50:48 compute-0 nova_compute[183075]: 2026-01-22 17:50:48.211 183079 DEBUG nova.compute.manager [-] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:50:48 compute-0 nova_compute[183075]: 2026-01-22 17:50:48.212 183079 DEBUG nova.network.neutron [-] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:50:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:48.221 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8b418116-cb55-47fe-9466-99aedaa3df5a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 639362, 'reachable_time': 34019, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241797, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:50:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:48.224 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:50:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:50:48.224 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[3a496577-f684-4de2-a0be-7714f404199b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:50:48 compute-0 systemd[1]: run-netns-ovnmeta\x2d88ed9213\x2d7d48\x2d4c42\x2dad60\x2dce3fcf104f71.mount: Deactivated successfully.
Jan 22 17:50:48 compute-0 nova_compute[183075]: 2026-01-22 17:50:48.863 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:49 compute-0 nova_compute[183075]: 2026-01-22 17:50:49.269 183079 DEBUG nova.network.neutron [-] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:50:49 compute-0 nova_compute[183075]: 2026-01-22 17:50:49.286 183079 INFO nova.compute.manager [-] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Took 1.07 seconds to deallocate network for instance.
Jan 22 17:50:49 compute-0 nova_compute[183075]: 2026-01-22 17:50:49.331 183079 DEBUG oslo_concurrency.lockutils [None req-5f21e8ee-8f49-4ad2-96c5-f60ed3830fe3 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:50:49 compute-0 nova_compute[183075]: 2026-01-22 17:50:49.331 183079 DEBUG oslo_concurrency.lockutils [None req-5f21e8ee-8f49-4ad2-96c5-f60ed3830fe3 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:50:49 compute-0 nova_compute[183075]: 2026-01-22 17:50:49.373 183079 DEBUG nova.compute.provider_tree [None req-5f21e8ee-8f49-4ad2-96c5-f60ed3830fe3 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:50:49 compute-0 nova_compute[183075]: 2026-01-22 17:50:49.386 183079 DEBUG nova.scheduler.client.report [None req-5f21e8ee-8f49-4ad2-96c5-f60ed3830fe3 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:50:49 compute-0 nova_compute[183075]: 2026-01-22 17:50:49.406 183079 DEBUG oslo_concurrency.lockutils [None req-5f21e8ee-8f49-4ad2-96c5-f60ed3830fe3 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.075s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:50:49 compute-0 nova_compute[183075]: 2026-01-22 17:50:49.431 183079 INFO nova.scheduler.client.report [None req-5f21e8ee-8f49-4ad2-96c5-f60ed3830fe3 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Deleted allocations for instance df3ded3b-e065-4dee-93d3-e1ced39c8619
Jan 22 17:50:49 compute-0 nova_compute[183075]: 2026-01-22 17:50:49.491 183079 DEBUG oslo_concurrency.lockutils [None req-5f21e8ee-8f49-4ad2-96c5-f60ed3830fe3 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "df3ded3b-e065-4dee-93d3-e1ced39c8619" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.613s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:50:50 compute-0 nova_compute[183075]: 2026-01-22 17:50:50.170 183079 DEBUG nova.compute.manager [req-ae67daca-7594-467a-a998-07944130d774 req-278e8299-e2e2-4ef4-bd70-13a91265abaf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Received event network-vif-unplugged-11393b0e-74ee-456c-8793-6b2a6cb69a8c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:50:50 compute-0 nova_compute[183075]: 2026-01-22 17:50:50.171 183079 DEBUG oslo_concurrency.lockutils [req-ae67daca-7594-467a-a998-07944130d774 req-278e8299-e2e2-4ef4-bd70-13a91265abaf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "df3ded3b-e065-4dee-93d3-e1ced39c8619-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:50:50 compute-0 nova_compute[183075]: 2026-01-22 17:50:50.171 183079 DEBUG oslo_concurrency.lockutils [req-ae67daca-7594-467a-a998-07944130d774 req-278e8299-e2e2-4ef4-bd70-13a91265abaf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "df3ded3b-e065-4dee-93d3-e1ced39c8619-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:50:50 compute-0 nova_compute[183075]: 2026-01-22 17:50:50.171 183079 DEBUG oslo_concurrency.lockutils [req-ae67daca-7594-467a-a998-07944130d774 req-278e8299-e2e2-4ef4-bd70-13a91265abaf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "df3ded3b-e065-4dee-93d3-e1ced39c8619-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:50:50 compute-0 nova_compute[183075]: 2026-01-22 17:50:50.172 183079 DEBUG nova.compute.manager [req-ae67daca-7594-467a-a998-07944130d774 req-278e8299-e2e2-4ef4-bd70-13a91265abaf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] No waiting events found dispatching network-vif-unplugged-11393b0e-74ee-456c-8793-6b2a6cb69a8c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:50:50 compute-0 nova_compute[183075]: 2026-01-22 17:50:50.172 183079 WARNING nova.compute.manager [req-ae67daca-7594-467a-a998-07944130d774 req-278e8299-e2e2-4ef4-bd70-13a91265abaf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Received unexpected event network-vif-unplugged-11393b0e-74ee-456c-8793-6b2a6cb69a8c for instance with vm_state deleted and task_state None.
Jan 22 17:50:50 compute-0 nova_compute[183075]: 2026-01-22 17:50:50.173 183079 DEBUG nova.compute.manager [req-ae67daca-7594-467a-a998-07944130d774 req-278e8299-e2e2-4ef4-bd70-13a91265abaf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Received event network-vif-plugged-11393b0e-74ee-456c-8793-6b2a6cb69a8c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:50:50 compute-0 nova_compute[183075]: 2026-01-22 17:50:50.173 183079 DEBUG oslo_concurrency.lockutils [req-ae67daca-7594-467a-a998-07944130d774 req-278e8299-e2e2-4ef4-bd70-13a91265abaf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "df3ded3b-e065-4dee-93d3-e1ced39c8619-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:50:50 compute-0 nova_compute[183075]: 2026-01-22 17:50:50.174 183079 DEBUG oslo_concurrency.lockutils [req-ae67daca-7594-467a-a998-07944130d774 req-278e8299-e2e2-4ef4-bd70-13a91265abaf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "df3ded3b-e065-4dee-93d3-e1ced39c8619-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:50:50 compute-0 nova_compute[183075]: 2026-01-22 17:50:50.174 183079 DEBUG oslo_concurrency.lockutils [req-ae67daca-7594-467a-a998-07944130d774 req-278e8299-e2e2-4ef4-bd70-13a91265abaf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "df3ded3b-e065-4dee-93d3-e1ced39c8619-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:50:50 compute-0 nova_compute[183075]: 2026-01-22 17:50:50.174 183079 DEBUG nova.compute.manager [req-ae67daca-7594-467a-a998-07944130d774 req-278e8299-e2e2-4ef4-bd70-13a91265abaf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] No waiting events found dispatching network-vif-plugged-11393b0e-74ee-456c-8793-6b2a6cb69a8c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:50:50 compute-0 nova_compute[183075]: 2026-01-22 17:50:50.175 183079 WARNING nova.compute.manager [req-ae67daca-7594-467a-a998-07944130d774 req-278e8299-e2e2-4ef4-bd70-13a91265abaf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Received unexpected event network-vif-plugged-11393b0e-74ee-456c-8793-6b2a6cb69a8c for instance with vm_state deleted and task_state None.
Jan 22 17:50:50 compute-0 nova_compute[183075]: 2026-01-22 17:50:50.175 183079 DEBUG nova.compute.manager [req-ae67daca-7594-467a-a998-07944130d774 req-278e8299-e2e2-4ef4-bd70-13a91265abaf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Received event network-vif-deleted-11393b0e-74ee-456c-8793-6b2a6cb69a8c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:50:50 compute-0 nova_compute[183075]: 2026-01-22 17:50:50.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:50:50 compute-0 nova_compute[183075]: 2026-01-22 17:50:50.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:50:50 compute-0 nova_compute[183075]: 2026-01-22 17:50:50.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:50:50 compute-0 nova_compute[183075]: 2026-01-22 17:50:50.804 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 17:50:51 compute-0 nova_compute[183075]: 2026-01-22 17:50:51.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:50:51 compute-0 nova_compute[183075]: 2026-01-22 17:50:51.820 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:50:51 compute-0 nova_compute[183075]: 2026-01-22 17:50:51.820 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:50:51 compute-0 nova_compute[183075]: 2026-01-22 17:50:51.820 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:50:51 compute-0 nova_compute[183075]: 2026-01-22 17:50:51.820 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:50:51 compute-0 nova_compute[183075]: 2026-01-22 17:50:51.962 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:50:51 compute-0 nova_compute[183075]: 2026-01-22 17:50:51.962 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5724MB free_disk=73.35136032104492GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:50:51 compute-0 nova_compute[183075]: 2026-01-22 17:50:51.963 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:50:51 compute-0 nova_compute[183075]: 2026-01-22 17:50:51.963 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:50:52 compute-0 nova_compute[183075]: 2026-01-22 17:50:52.008 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:50:52 compute-0 nova_compute[183075]: 2026-01-22 17:50:52.008 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:50:52 compute-0 nova_compute[183075]: 2026-01-22 17:50:52.026 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:50:52 compute-0 nova_compute[183075]: 2026-01-22 17:50:52.041 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:50:52 compute-0 nova_compute[183075]: 2026-01-22 17:50:52.066 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:50:52 compute-0 nova_compute[183075]: 2026-01-22 17:50:52.066 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:50:53 compute-0 nova_compute[183075]: 2026-01-22 17:50:53.066 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:50:53 compute-0 nova_compute[183075]: 2026-01-22 17:50:53.067 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:50:53 compute-0 nova_compute[183075]: 2026-01-22 17:50:53.154 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:53 compute-0 podman[241800]: 2026-01-22 17:50:53.341807053 +0000 UTC m=+0.051318038 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:50:53 compute-0 nova_compute[183075]: 2026-01-22 17:50:53.909 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:54 compute-0 nova_compute[183075]: 2026-01-22 17:50:54.789 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:50:56 compute-0 podman[241824]: 2026-01-22 17:50:56.332816082 +0000 UTC m=+0.048311657 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:50:56 compute-0 nova_compute[183075]: 2026-01-22 17:50:56.804 183079 DEBUG oslo_concurrency.lockutils [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "c9f0a876-68d5-4c5d-b4cf-62a36101777d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:50:56 compute-0 nova_compute[183075]: 2026-01-22 17:50:56.805 183079 DEBUG oslo_concurrency.lockutils [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "c9f0a876-68d5-4c5d-b4cf-62a36101777d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:50:56 compute-0 nova_compute[183075]: 2026-01-22 17:50:56.818 183079 DEBUG nova.compute.manager [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:50:56 compute-0 nova_compute[183075]: 2026-01-22 17:50:56.887 183079 DEBUG oslo_concurrency.lockutils [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:50:56 compute-0 nova_compute[183075]: 2026-01-22 17:50:56.887 183079 DEBUG oslo_concurrency.lockutils [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:50:56 compute-0 nova_compute[183075]: 2026-01-22 17:50:56.893 183079 DEBUG nova.virt.hardware [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:50:56 compute-0 nova_compute[183075]: 2026-01-22 17:50:56.894 183079 INFO nova.compute.claims [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.020 183079 DEBUG nova.compute.provider_tree [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.036 183079 DEBUG nova.scheduler.client.report [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.053 183079 DEBUG oslo_concurrency.lockutils [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.053 183079 DEBUG nova.compute.manager [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.099 183079 DEBUG nova.compute.manager [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.099 183079 DEBUG nova.network.neutron [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.127 183079 INFO nova.virt.libvirt.driver [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.146 183079 DEBUG nova.compute.manager [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.246 183079 DEBUG nova.compute.manager [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.248 183079 DEBUG nova.virt.libvirt.driver [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.248 183079 INFO nova.virt.libvirt.driver [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Creating image(s)
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.249 183079 DEBUG oslo_concurrency.lockutils [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "/var/lib/nova/instances/c9f0a876-68d5-4c5d-b4cf-62a36101777d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.249 183079 DEBUG oslo_concurrency.lockutils [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/c9f0a876-68d5-4c5d-b4cf-62a36101777d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.249 183079 DEBUG oslo_concurrency.lockutils [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/c9f0a876-68d5-4c5d-b4cf-62a36101777d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.265 183079 DEBUG oslo_concurrency.processutils [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.328 183079 DEBUG oslo_concurrency.processutils [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.330 183079 DEBUG oslo_concurrency.lockutils [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.330 183079 DEBUG oslo_concurrency.lockutils [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.341 183079 DEBUG oslo_concurrency.processutils [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.394 183079 DEBUG oslo_concurrency.processutils [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.395 183079 DEBUG oslo_concurrency.processutils [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/c9f0a876-68d5-4c5d-b4cf-62a36101777d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.429 183079 DEBUG oslo_concurrency.processutils [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/c9f0a876-68d5-4c5d-b4cf-62a36101777d/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.430 183079 DEBUG oslo_concurrency.lockutils [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.431 183079 DEBUG oslo_concurrency.processutils [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.485 183079 DEBUG oslo_concurrency.processutils [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.486 183079 DEBUG nova.virt.disk.api [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Checking if we can resize image /var/lib/nova/instances/c9f0a876-68d5-4c5d-b4cf-62a36101777d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.486 183079 DEBUG oslo_concurrency.processutils [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9f0a876-68d5-4c5d-b4cf-62a36101777d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.536 183079 DEBUG oslo_concurrency.processutils [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9f0a876-68d5-4c5d-b4cf-62a36101777d/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.537 183079 DEBUG nova.virt.disk.api [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Cannot resize image /var/lib/nova/instances/c9f0a876-68d5-4c5d-b4cf-62a36101777d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.538 183079 DEBUG nova.objects.instance [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'migration_context' on Instance uuid c9f0a876-68d5-4c5d-b4cf-62a36101777d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.554 183079 DEBUG nova.virt.libvirt.driver [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.554 183079 DEBUG nova.virt.libvirt.driver [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Ensure instance console log exists: /var/lib/nova/instances/c9f0a876-68d5-4c5d-b4cf-62a36101777d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.555 183079 DEBUG oslo_concurrency.lockutils [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.555 183079 DEBUG oslo_concurrency.lockutils [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.555 183079 DEBUG oslo_concurrency.lockutils [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:50:57 compute-0 nova_compute[183075]: 2026-01-22 17:50:57.825 183079 DEBUG nova.policy [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:50:58 compute-0 nova_compute[183075]: 2026-01-22 17:50:58.158 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:58 compute-0 nova_compute[183075]: 2026-01-22 17:50:58.867 183079 DEBUG nova.network.neutron [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Successfully updated port: 1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:50:58 compute-0 nova_compute[183075]: 2026-01-22 17:50:58.891 183079 DEBUG oslo_concurrency.lockutils [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "refresh_cache-c9f0a876-68d5-4c5d-b4cf-62a36101777d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:50:58 compute-0 nova_compute[183075]: 2026-01-22 17:50:58.892 183079 DEBUG oslo_concurrency.lockutils [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquired lock "refresh_cache-c9f0a876-68d5-4c5d-b4cf-62a36101777d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:50:58 compute-0 nova_compute[183075]: 2026-01-22 17:50:58.892 183079 DEBUG nova.network.neutron [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:50:58 compute-0 nova_compute[183075]: 2026-01-22 17:50:58.910 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:50:59 compute-0 nova_compute[183075]: 2026-01-22 17:50:59.005 183079 DEBUG nova.compute.manager [req-570a2481-4366-4b78-8482-60e8d569fc7b req-0ee7d98a-6c77-4a62-b079-f1b6a9212cab a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Received event network-changed-1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:50:59 compute-0 nova_compute[183075]: 2026-01-22 17:50:59.005 183079 DEBUG nova.compute.manager [req-570a2481-4366-4b78-8482-60e8d569fc7b req-0ee7d98a-6c77-4a62-b079-f1b6a9212cab a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Refreshing instance network info cache due to event network-changed-1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:50:59 compute-0 nova_compute[183075]: 2026-01-22 17:50:59.006 183079 DEBUG oslo_concurrency.lockutils [req-570a2481-4366-4b78-8482-60e8d569fc7b req-0ee7d98a-6c77-4a62-b079-f1b6a9212cab a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-c9f0a876-68d5-4c5d-b4cf-62a36101777d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:50:59 compute-0 nova_compute[183075]: 2026-01-22 17:50:59.076 183079 DEBUG nova.network.neutron [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:50:59 compute-0 nova_compute[183075]: 2026-01-22 17:50:59.406 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769104244.4046988, d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:50:59 compute-0 nova_compute[183075]: 2026-01-22 17:50:59.407 183079 INFO nova.compute.manager [-] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] VM Stopped (Lifecycle Event)
Jan 22 17:50:59 compute-0 nova_compute[183075]: 2026-01-22 17:50:59.428 183079 DEBUG nova.compute.manager [None req-dd7de32b-ac0a-41bc-a1f5-94c8b31183d7 - - - - - -] [instance: d6dc2bc3-625d-4ff7-a390-ae19df6cdfc2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.026 183079 DEBUG nova.network.neutron [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Updating instance_info_cache with network_info: [{"id": "1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0", "address": "fa:16:3e:2d:00:4b", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dcdc02a-a9", "ovs_interfaceid": "1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.042 183079 DEBUG oslo_concurrency.lockutils [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Releasing lock "refresh_cache-c9f0a876-68d5-4c5d-b4cf-62a36101777d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.042 183079 DEBUG nova.compute.manager [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Instance network_info: |[{"id": "1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0", "address": "fa:16:3e:2d:00:4b", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dcdc02a-a9", "ovs_interfaceid": "1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.043 183079 DEBUG oslo_concurrency.lockutils [req-570a2481-4366-4b78-8482-60e8d569fc7b req-0ee7d98a-6c77-4a62-b079-f1b6a9212cab a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-c9f0a876-68d5-4c5d-b4cf-62a36101777d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.043 183079 DEBUG nova.network.neutron [req-570a2481-4366-4b78-8482-60e8d569fc7b req-0ee7d98a-6c77-4a62-b079-f1b6a9212cab a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Refreshing network info cache for port 1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.045 183079 DEBUG nova.virt.libvirt.driver [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Start _get_guest_xml network_info=[{"id": "1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0", "address": "fa:16:3e:2d:00:4b", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dcdc02a-a9", "ovs_interfaceid": "1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.049 183079 WARNING nova.virt.libvirt.driver [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.053 183079 DEBUG nova.virt.libvirt.host [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.053 183079 DEBUG nova.virt.libvirt.host [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.055 183079 DEBUG nova.virt.libvirt.host [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.056 183079 DEBUG nova.virt.libvirt.host [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.056 183079 DEBUG nova.virt.libvirt.driver [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.056 183079 DEBUG nova.virt.hardware [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.057 183079 DEBUG nova.virt.hardware [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.057 183079 DEBUG nova.virt.hardware [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.057 183079 DEBUG nova.virt.hardware [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.057 183079 DEBUG nova.virt.hardware [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.057 183079 DEBUG nova.virt.hardware [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.058 183079 DEBUG nova.virt.hardware [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.058 183079 DEBUG nova.virt.hardware [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.058 183079 DEBUG nova.virt.hardware [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.058 183079 DEBUG nova.virt.hardware [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.058 183079 DEBUG nova.virt.hardware [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.062 183079 DEBUG nova.virt.libvirt.vif [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:50:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1080468316',display_name='tempest-server-test-1080468316',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1080468316',id=72,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-sgonrbi7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:50:57Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=c9f0a876-68d5-4c5d-b4cf-62a36101777d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0", "address": "fa:16:3e:2d:00:4b", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dcdc02a-a9", "ovs_interfaceid": "1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.062 183079 DEBUG nova.network.os_vif_util [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0", "address": "fa:16:3e:2d:00:4b", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dcdc02a-a9", "ovs_interfaceid": "1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.063 183079 DEBUG nova.network.os_vif_util [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:00:4b,bridge_name='br-int',has_traffic_filtering=True,id=1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1dcdc02a-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.063 183079 DEBUG nova.objects.instance [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'pci_devices' on Instance uuid c9f0a876-68d5-4c5d-b4cf-62a36101777d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.108 183079 DEBUG nova.virt.libvirt.driver [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:51:00 compute-0 nova_compute[183075]:   <uuid>c9f0a876-68d5-4c5d-b4cf-62a36101777d</uuid>
Jan 22 17:51:00 compute-0 nova_compute[183075]:   <name>instance-00000048</name>
Jan 22 17:51:00 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:51:00 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:51:00 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:51:00 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1080468316</nova:name>
Jan 22 17:51:00 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:51:00</nova:creationTime>
Jan 22 17:51:00 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:51:00 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:51:00 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:51:00 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:51:00 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:51:00 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:51:00 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:51:00 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:51:00 compute-0 nova_compute[183075]:         <nova:user uuid="66c24efd0cd24c57803ea8508e679d06">tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member</nova:user>
Jan 22 17:51:00 compute-0 nova_compute[183075]:         <nova:project uuid="cbff5cec4b9c4d2eb317124ca20fea9f">tempest-StatelessNetworkSecGroupIPv4Test-1348485316</nova:project>
Jan 22 17:51:00 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:51:00 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:51:00 compute-0 nova_compute[183075]:         <nova:port uuid="1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0">
Jan 22 17:51:00 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:51:00 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:51:00 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:51:00 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <system>
Jan 22 17:51:00 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:51:00 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:51:00 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:51:00 compute-0 nova_compute[183075]:       <entry name="serial">c9f0a876-68d5-4c5d-b4cf-62a36101777d</entry>
Jan 22 17:51:00 compute-0 nova_compute[183075]:       <entry name="uuid">c9f0a876-68d5-4c5d-b4cf-62a36101777d</entry>
Jan 22 17:51:00 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     </system>
Jan 22 17:51:00 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:51:00 compute-0 nova_compute[183075]:   <os>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:   </os>
Jan 22 17:51:00 compute-0 nova_compute[183075]:   <features>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:   </features>
Jan 22 17:51:00 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:51:00 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:51:00 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:51:00 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/c9f0a876-68d5-4c5d-b4cf-62a36101777d/disk"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:51:00 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:2d:00:4b"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:       <target dev="tap1dcdc02a-a9"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:51:00 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/c9f0a876-68d5-4c5d-b4cf-62a36101777d/console.log" append="off"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <video>
Jan 22 17:51:00 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     </video>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:51:00 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:51:00 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:51:00 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:51:00 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:51:00 compute-0 nova_compute[183075]: </domain>
Jan 22 17:51:00 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.110 183079 DEBUG nova.compute.manager [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Preparing to wait for external event network-vif-plugged-1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.110 183079 DEBUG oslo_concurrency.lockutils [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "c9f0a876-68d5-4c5d-b4cf-62a36101777d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.110 183079 DEBUG oslo_concurrency.lockutils [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "c9f0a876-68d5-4c5d-b4cf-62a36101777d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.110 183079 DEBUG oslo_concurrency.lockutils [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "c9f0a876-68d5-4c5d-b4cf-62a36101777d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.111 183079 DEBUG nova.virt.libvirt.vif [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:50:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1080468316',display_name='tempest-server-test-1080468316',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1080468316',id=72,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-sgonrbi7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:50:57Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=c9f0a876-68d5-4c5d-b4cf-62a36101777d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0", "address": "fa:16:3e:2d:00:4b", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dcdc02a-a9", "ovs_interfaceid": "1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.111 183079 DEBUG nova.network.os_vif_util [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0", "address": "fa:16:3e:2d:00:4b", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dcdc02a-a9", "ovs_interfaceid": "1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.112 183079 DEBUG nova.network.os_vif_util [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:00:4b,bridge_name='br-int',has_traffic_filtering=True,id=1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1dcdc02a-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.112 183079 DEBUG os_vif [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:00:4b,bridge_name='br-int',has_traffic_filtering=True,id=1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1dcdc02a-a9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.113 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.114 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.115 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.117 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.117 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1dcdc02a-a9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.118 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1dcdc02a-a9, col_values=(('external_ids', {'iface-id': '1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2d:00:4b', 'vm-uuid': 'c9f0a876-68d5-4c5d-b4cf-62a36101777d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.119 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:00 compute-0 NetworkManager[55454]: <info>  [1769104260.1221] manager: (tap1dcdc02a-a9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/319)
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.122 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.127 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.127 183079 INFO os_vif [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:00:4b,bridge_name='br-int',has_traffic_filtering=True,id=1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1dcdc02a-a9')
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.249 183079 DEBUG nova.virt.libvirt.driver [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.250 183079 DEBUG nova.virt.libvirt.driver [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No VIF found with MAC fa:16:3e:2d:00:4b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:51:00 compute-0 kernel: tap1dcdc02a-a9: entered promiscuous mode
Jan 22 17:51:00 compute-0 NetworkManager[55454]: <info>  [1769104260.3032] manager: (tap1dcdc02a-a9): new Tun device (/org/freedesktop/NetworkManager/Devices/320)
Jan 22 17:51:00 compute-0 ovn_controller[95372]: 2026-01-22T17:51:00Z|00805|binding|INFO|Claiming lport 1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0 for this chassis.
Jan 22 17:51:00 compute-0 ovn_controller[95372]: 2026-01-22T17:51:00Z|00806|binding|INFO|1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0: Claiming fa:16:3e:2d:00:4b 10.100.0.3
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.304 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:00.311 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:00:4b 10.100.0.3'], port_security=['fa:16:3e:2d:00:4b 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'c9f0a876-68d5-4c5d-b4cf-62a36101777d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f61e4b27-bb7c-42d2-a372-c8137640f8ec', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:00.312 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0 in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 bound to our chassis
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:00.313 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:51:00 compute-0 ovn_controller[95372]: 2026-01-22T17:51:00Z|00807|binding|INFO|Setting lport 1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0 up in Southbound
Jan 22 17:51:00 compute-0 ovn_controller[95372]: 2026-01-22T17:51:00Z|00808|binding|INFO|Setting lport 1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0 ovn-installed in OVS
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.317 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.318 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.324 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:00.326 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0b2718d3-fac8-4317-9b7a-c1b7c609aa19]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:00.326 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap88ed9213-71 in ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:00.328 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap88ed9213-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:00.328 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[5d65048d-f7b3-46bd-960f-7bad3a15735a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:00.329 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[281c8a01-3125-4739-86a6-b25bbf68053c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:51:00 compute-0 systemd-udevd[241879]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:00.339 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[2087ce32-adfe-410e-b310-b7cfc83150d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:51:00 compute-0 systemd-machined[154382]: New machine qemu-72-instance-00000048.
Jan 22 17:51:00 compute-0 NetworkManager[55454]: <info>  [1769104260.3479] device (tap1dcdc02a-a9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:51:00 compute-0 NetworkManager[55454]: <info>  [1769104260.3487] device (tap1dcdc02a-a9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:51:00 compute-0 systemd[1]: Started Virtual Machine qemu-72-instance-00000048.
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:00.363 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e56bcbfc-0354-4206-8803-2e7d49877b33]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:00.391 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[001d8550-23d8-4665-b6bf-391b0b5f9fbe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:00.396 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f4d92bd5-ed3a-42f5-b480-d8696821084d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:51:00 compute-0 NetworkManager[55454]: <info>  [1769104260.3974] manager: (tap88ed9213-70): new Veth device (/org/freedesktop/NetworkManager/Devices/321)
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:00.424 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[1b75aeb1-87e0-4ddc-8fb1-a71a36392bde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:00.427 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[e377e93e-d9c4-4fa2-8a40-38d1fea7b34e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:51:00 compute-0 NetworkManager[55454]: <info>  [1769104260.4477] device (tap88ed9213-70): carrier: link connected
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:00.453 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[b1bb5734-b343-43cb-b508-106fd609e97d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:00.471 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8ae86dbc-f161-4476-bc1e-b48aa1a45668]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 217], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 662414, 'reachable_time': 31076, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241912, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:00.492 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f45a291f-2ab2-4738-b5ac-fa0a841e4934]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0a:fa93'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 662414, 'tstamp': 662414}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241913, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:00.510 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[01e640a1-6628-41ff-9a09-c3f316282043]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 217], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 662414, 'reachable_time': 31076, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241914, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:00.546 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b2a07a9a-c791-405b-be0d-ee9df81c3f05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:00.601 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[70fecbdf-e888-4cfa-98f3-80b798b748c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:00.603 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:00.603 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:00.604 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88ed9213-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:51:00 compute-0 NetworkManager[55454]: <info>  [1769104260.6067] manager: (tap88ed9213-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/322)
Jan 22 17:51:00 compute-0 kernel: tap88ed9213-70: entered promiscuous mode
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.606 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:00.610 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88ed9213-70, col_values=(('external_ids', {'iface-id': 'da93a32e-eed6-4124-8f08-78b0bfe2cb69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.611 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:00 compute-0 ovn_controller[95372]: 2026-01-22T17:51:00Z|00809|binding|INFO|Releasing lport da93a32e-eed6-4124-8f08-78b0bfe2cb69 from this chassis (sb_readonly=0)
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.611 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:00.612 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:00.612 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[08e1e330-6bca-4e2b-8c18-320480b0431a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:00.613 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:51:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:00.613 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'env', 'PROCESS_TAG=haproxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/88ed9213-7d48-4c42-ad60-ce3fcf104f71.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.625 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.632 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104260.6319704, c9f0a876-68d5-4c5d-b4cf-62a36101777d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.633 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] VM Started (Lifecycle Event)
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.652 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.655 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104260.6321268, c9f0a876-68d5-4c5d-b4cf-62a36101777d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.656 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] VM Paused (Lifecycle Event)
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.674 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.678 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:51:00 compute-0 nova_compute[183075]: 2026-01-22 17:51:00.695 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:51:00 compute-0 podman[241954]: 2026-01-22 17:51:00.967413839 +0000 UTC m=+0.048816530 container create f3df2a8422e17d4c67e5c57ad760c65a8ddd7ed62aae9781365514a97ea51b25 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 17:51:01 compute-0 systemd[1]: Started libpod-conmon-f3df2a8422e17d4c67e5c57ad760c65a8ddd7ed62aae9781365514a97ea51b25.scope.
Jan 22 17:51:01 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:51:01 compute-0 podman[241954]: 2026-01-22 17:51:00.942318216 +0000 UTC m=+0.023720937 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:51:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a02df10c0bfff98e76b6cbd052239ed02779ddd2e3bb9f5e5294c4af2de3a08/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:51:01 compute-0 podman[241954]: 2026-01-22 17:51:01.049051848 +0000 UTC m=+0.130454559 container init f3df2a8422e17d4c67e5c57ad760c65a8ddd7ed62aae9781365514a97ea51b25 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:51:01 compute-0 podman[241954]: 2026-01-22 17:51:01.054235257 +0000 UTC m=+0.135637948 container start f3df2a8422e17d4c67e5c57ad760c65a8ddd7ed62aae9781365514a97ea51b25 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 22 17:51:01 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241968]: [NOTICE]   (241972) : New worker (241974) forked
Jan 22 17:51:01 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241968]: [NOTICE]   (241972) : Loading success.
Jan 22 17:51:01 compute-0 nova_compute[183075]: 2026-01-22 17:51:01.121 183079 DEBUG nova.compute.manager [req-818effe4-f559-4dd6-bd5e-31ba5d7e6812 req-330d6397-a5bd-44f3-9740-ee45c328b9fc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Received event network-vif-plugged-1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:51:01 compute-0 nova_compute[183075]: 2026-01-22 17:51:01.122 183079 DEBUG oslo_concurrency.lockutils [req-818effe4-f559-4dd6-bd5e-31ba5d7e6812 req-330d6397-a5bd-44f3-9740-ee45c328b9fc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c9f0a876-68d5-4c5d-b4cf-62a36101777d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:51:01 compute-0 nova_compute[183075]: 2026-01-22 17:51:01.122 183079 DEBUG oslo_concurrency.lockutils [req-818effe4-f559-4dd6-bd5e-31ba5d7e6812 req-330d6397-a5bd-44f3-9740-ee45c328b9fc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c9f0a876-68d5-4c5d-b4cf-62a36101777d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:51:01 compute-0 nova_compute[183075]: 2026-01-22 17:51:01.122 183079 DEBUG oslo_concurrency.lockutils [req-818effe4-f559-4dd6-bd5e-31ba5d7e6812 req-330d6397-a5bd-44f3-9740-ee45c328b9fc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c9f0a876-68d5-4c5d-b4cf-62a36101777d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:51:01 compute-0 nova_compute[183075]: 2026-01-22 17:51:01.122 183079 DEBUG nova.compute.manager [req-818effe4-f559-4dd6-bd5e-31ba5d7e6812 req-330d6397-a5bd-44f3-9740-ee45c328b9fc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Processing event network-vif-plugged-1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:51:01 compute-0 nova_compute[183075]: 2026-01-22 17:51:01.123 183079 DEBUG nova.compute.manager [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:51:01 compute-0 nova_compute[183075]: 2026-01-22 17:51:01.126 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104261.1266572, c9f0a876-68d5-4c5d-b4cf-62a36101777d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:51:01 compute-0 nova_compute[183075]: 2026-01-22 17:51:01.127 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] VM Resumed (Lifecycle Event)
Jan 22 17:51:01 compute-0 nova_compute[183075]: 2026-01-22 17:51:01.128 183079 DEBUG nova.virt.libvirt.driver [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:51:01 compute-0 nova_compute[183075]: 2026-01-22 17:51:01.131 183079 INFO nova.virt.libvirt.driver [-] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Instance spawned successfully.
Jan 22 17:51:01 compute-0 nova_compute[183075]: 2026-01-22 17:51:01.131 183079 DEBUG nova.virt.libvirt.driver [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:51:01 compute-0 nova_compute[183075]: 2026-01-22 17:51:01.150 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:51:01 compute-0 nova_compute[183075]: 2026-01-22 17:51:01.156 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:51:01 compute-0 nova_compute[183075]: 2026-01-22 17:51:01.159 183079 DEBUG nova.virt.libvirt.driver [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:51:01 compute-0 nova_compute[183075]: 2026-01-22 17:51:01.160 183079 DEBUG nova.virt.libvirt.driver [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:51:01 compute-0 nova_compute[183075]: 2026-01-22 17:51:01.160 183079 DEBUG nova.virt.libvirt.driver [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:51:01 compute-0 nova_compute[183075]: 2026-01-22 17:51:01.160 183079 DEBUG nova.virt.libvirt.driver [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:51:01 compute-0 nova_compute[183075]: 2026-01-22 17:51:01.161 183079 DEBUG nova.virt.libvirt.driver [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:51:01 compute-0 nova_compute[183075]: 2026-01-22 17:51:01.161 183079 DEBUG nova.virt.libvirt.driver [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:51:01 compute-0 nova_compute[183075]: 2026-01-22 17:51:01.185 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:51:01 compute-0 nova_compute[183075]: 2026-01-22 17:51:01.214 183079 INFO nova.compute.manager [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Took 3.97 seconds to spawn the instance on the hypervisor.
Jan 22 17:51:01 compute-0 nova_compute[183075]: 2026-01-22 17:51:01.214 183079 DEBUG nova.compute.manager [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:51:01 compute-0 nova_compute[183075]: 2026-01-22 17:51:01.284 183079 INFO nova.compute.manager [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Took 4.41 seconds to build instance.
Jan 22 17:51:01 compute-0 nova_compute[183075]: 2026-01-22 17:51:01.316 183079 DEBUG oslo_concurrency.lockutils [None req-f7e4bdf7-5510-4a68-87cb-125bc5665200 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "c9f0a876-68d5-4c5d-b4cf-62a36101777d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.511s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:51:01 compute-0 nova_compute[183075]: 2026-01-22 17:51:01.423 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769104246.4217536, 38456dbd-001e-46c3-ae2d-5e1765611833 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:51:01 compute-0 nova_compute[183075]: 2026-01-22 17:51:01.423 183079 INFO nova.compute.manager [-] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] VM Stopped (Lifecycle Event)
Jan 22 17:51:01 compute-0 nova_compute[183075]: 2026-01-22 17:51:01.447 183079 DEBUG nova.compute.manager [None req-f7c4fd4e-2c4a-4f18-a42d-be6cd7b98749 - - - - - -] [instance: 38456dbd-001e-46c3-ae2d-5e1765611833] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:51:01 compute-0 nova_compute[183075]: 2026-01-22 17:51:01.787 183079 DEBUG nova.network.neutron [req-570a2481-4366-4b78-8482-60e8d569fc7b req-0ee7d98a-6c77-4a62-b079-f1b6a9212cab a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Updated VIF entry in instance network info cache for port 1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:51:01 compute-0 nova_compute[183075]: 2026-01-22 17:51:01.788 183079 DEBUG nova.network.neutron [req-570a2481-4366-4b78-8482-60e8d569fc7b req-0ee7d98a-6c77-4a62-b079-f1b6a9212cab a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Updating instance_info_cache with network_info: [{"id": "1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0", "address": "fa:16:3e:2d:00:4b", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dcdc02a-a9", "ovs_interfaceid": "1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:51:01 compute-0 nova_compute[183075]: 2026-01-22 17:51:01.809 183079 DEBUG oslo_concurrency.lockutils [req-570a2481-4366-4b78-8482-60e8d569fc7b req-0ee7d98a-6c77-4a62-b079-f1b6a9212cab a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-c9f0a876-68d5-4c5d-b4cf-62a36101777d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:51:02 compute-0 nova_compute[183075]: 2026-01-22 17:51:02.715 183079 INFO nova.compute.manager [None req-bd08b186-7dd4-4a48-b536-667f709b120c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Get console output
Jan 22 17:51:03 compute-0 nova_compute[183075]: 2026-01-22 17:51:03.133 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769104248.1320322, df3ded3b-e065-4dee-93d3-e1ced39c8619 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:51:03 compute-0 nova_compute[183075]: 2026-01-22 17:51:03.133 183079 INFO nova.compute.manager [-] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] VM Stopped (Lifecycle Event)
Jan 22 17:51:03 compute-0 nova_compute[183075]: 2026-01-22 17:51:03.155 183079 DEBUG nova.compute.manager [None req-a84abf6e-8735-488a-9627-1207cab3d1fe - - - - - -] [instance: df3ded3b-e065-4dee-93d3-e1ced39c8619] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:51:03 compute-0 nova_compute[183075]: 2026-01-22 17:51:03.188 183079 DEBUG nova.compute.manager [req-9f4878e6-f09e-4578-b8fb-9e371b256740 req-fb4635a3-5285-4a6e-86e0-5c155a24d28e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Received event network-vif-plugged-1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:51:03 compute-0 nova_compute[183075]: 2026-01-22 17:51:03.188 183079 DEBUG oslo_concurrency.lockutils [req-9f4878e6-f09e-4578-b8fb-9e371b256740 req-fb4635a3-5285-4a6e-86e0-5c155a24d28e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c9f0a876-68d5-4c5d-b4cf-62a36101777d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:51:03 compute-0 nova_compute[183075]: 2026-01-22 17:51:03.189 183079 DEBUG oslo_concurrency.lockutils [req-9f4878e6-f09e-4578-b8fb-9e371b256740 req-fb4635a3-5285-4a6e-86e0-5c155a24d28e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c9f0a876-68d5-4c5d-b4cf-62a36101777d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:51:03 compute-0 nova_compute[183075]: 2026-01-22 17:51:03.189 183079 DEBUG oslo_concurrency.lockutils [req-9f4878e6-f09e-4578-b8fb-9e371b256740 req-fb4635a3-5285-4a6e-86e0-5c155a24d28e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c9f0a876-68d5-4c5d-b4cf-62a36101777d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:51:03 compute-0 nova_compute[183075]: 2026-01-22 17:51:03.189 183079 DEBUG nova.compute.manager [req-9f4878e6-f09e-4578-b8fb-9e371b256740 req-fb4635a3-5285-4a6e-86e0-5c155a24d28e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] No waiting events found dispatching network-vif-plugged-1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:51:03 compute-0 nova_compute[183075]: 2026-01-22 17:51:03.189 183079 WARNING nova.compute.manager [req-9f4878e6-f09e-4578-b8fb-9e371b256740 req-fb4635a3-5285-4a6e-86e0-5c155a24d28e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Received unexpected event network-vif-plugged-1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0 for instance with vm_state active and task_state None.
Jan 22 17:51:03 compute-0 nova_compute[183075]: 2026-01-22 17:51:03.913 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:05 compute-0 nova_compute[183075]: 2026-01-22 17:51:05.121 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:07 compute-0 podman[241985]: 2026-01-22 17:51:07.359107239 +0000 UTC m=+0.063115283 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, architecture=x86_64, config_id=openstack_network_exporter, name=ubi9-minimal, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 22 17:51:07 compute-0 podman[241984]: 2026-01-22 17:51:07.368772429 +0000 UTC m=+0.075915357 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Jan 22 17:51:07 compute-0 podman[241983]: 2026-01-22 17:51:07.380437171 +0000 UTC m=+0.089518871 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible)
Jan 22 17:51:07 compute-0 nova_compute[183075]: 2026-01-22 17:51:07.854 183079 INFO nova.compute.manager [None req-11c3a7c6-3153-46cc-925e-cc185174a199 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Get console output
Jan 22 17:51:07 compute-0 nova_compute[183075]: 2026-01-22 17:51:07.859 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:51:08 compute-0 nova_compute[183075]: 2026-01-22 17:51:08.951 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:10 compute-0 nova_compute[183075]: 2026-01-22 17:51:10.169 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:12 compute-0 ovn_controller[95372]: 2026-01-22T17:51:12Z|00146|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2d:00:4b 10.100.0.3
Jan 22 17:51:12 compute-0 ovn_controller[95372]: 2026-01-22T17:51:12Z|00147|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2d:00:4b 10.100.0.3
Jan 22 17:51:12 compute-0 nova_compute[183075]: 2026-01-22 17:51:12.979 183079 INFO nova.compute.manager [None req-5924992e-5521-4ab8-b0b4-729dcdcea0ec 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Get console output
Jan 22 17:51:12 compute-0 nova_compute[183075]: 2026-01-22 17:51:12.984 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:51:13 compute-0 podman[242069]: 2026-01-22 17:51:13.354711977 +0000 UTC m=+0.060195855 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 17:51:13 compute-0 nova_compute[183075]: 2026-01-22 17:51:13.953 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:15 compute-0 nova_compute[183075]: 2026-01-22 17:51:15.172 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:18 compute-0 nova_compute[183075]: 2026-01-22 17:51:18.103 183079 INFO nova.compute.manager [None req-5f8a1da7-0dc4-47ec-8509-d7250b3510d1 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Get console output
Jan 22 17:51:18 compute-0 nova_compute[183075]: 2026-01-22 17:51:18.106 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.377 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.378 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.744 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.745 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.3671041
Jan 22 17:51:18 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241974]: 10.100.0.3:40104 [22/Jan/2026:17:51:18.377] listener listener/metadata 0/0/0/368/368 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.754 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.755 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.778 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.778 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0230689
Jan 22 17:51:18 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241974]: 10.100.0.3:40114 [22/Jan/2026:17:51:18.754] listener listener/metadata 0/0/0/24/24 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.784 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.784 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.800 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.801 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0162561
Jan 22 17:51:18 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241974]: 10.100.0.3:40120 [22/Jan/2026:17:51:18.782] listener listener/metadata 0/0/0/18/18 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.805 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.806 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.824 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.825 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0184433
Jan 22 17:51:18 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241974]: 10.100.0.3:40132 [22/Jan/2026:17:51:18.805] listener listener/metadata 0/0/0/19/19 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.829 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.830 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.845 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.846 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0153379
Jan 22 17:51:18 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241974]: 10.100.0.3:40136 [22/Jan/2026:17:51:18.829] listener listener/metadata 0/0/0/16/16 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.850 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.851 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.871 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:51:18 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241974]: 10.100.0.3:40148 [22/Jan/2026:17:51:18.850] listener listener/metadata 0/0/0/21/21 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.871 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0205255
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.875 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.876 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.892 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.893 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0165589
Jan 22 17:51:18 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241974]: 10.100.0.3:40150 [22/Jan/2026:17:51:18.875] listener listener/metadata 0/0/0/17/17 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.897 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.898 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.912 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.912 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0147870
Jan 22 17:51:18 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241974]: 10.100.0.3:40154 [22/Jan/2026:17:51:18.897] listener listener/metadata 0/0/0/15/15 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.916 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.917 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.930 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.931 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0139544
Jan 22 17:51:18 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241974]: 10.100.0.3:40164 [22/Jan/2026:17:51:18.916] listener listener/metadata 0/0/0/14/14 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.934 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.934 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.950 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:51:18 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241974]: 10.100.0.3:40176 [22/Jan/2026:17:51:18.933] listener listener/metadata 0/0/0/16/16 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.950 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0160437
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.955 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:51:18 compute-0 nova_compute[183075]: 2026-01-22 17:51:18.955 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.956 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:51:18 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241974]: 10.100.0.3:40180 [22/Jan/2026:17:51:18.954] listener listener/metadata 0/0/0/15/15 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.969 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0133150
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.977 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.978 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.998 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:51:18 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241974]: 10.100.0.3:40192 [22/Jan/2026:17:51:18.976] listener listener/metadata 0/0/0/22/22 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:51:18 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:18.999 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0208561
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:19.001 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:19.002 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:19.014 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:51:19 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241974]: 10.100.0.3:40196 [22/Jan/2026:17:51:19.001] listener listener/metadata 0/0/0/13/13 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:19.014 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0121937
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:19.017 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:19.017 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:19.031 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:19.031 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0136616
Jan 22 17:51:19 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241974]: 10.100.0.3:40208 [22/Jan/2026:17:51:19.017] listener listener/metadata 0/0/0/14/14 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:19.035 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:19.035 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:19.046 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:51:19 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241974]: 10.100.0.3:40224 [22/Jan/2026:17:51:19.035] listener listener/metadata 0/0/0/11/11 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:19.047 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0111225
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:19.050 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:19.051 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.3
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:19.063 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:51:19 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241974]: 10.100.0.3:40238 [22/Jan/2026:17:51:19.050] listener listener/metadata 0/0/0/13/13 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:51:19 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:19.063 104990 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0126171
Jan 22 17:51:20 compute-0 nova_compute[183075]: 2026-01-22 17:51:20.173 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:23 compute-0 nova_compute[183075]: 2026-01-22 17:51:23.237 183079 INFO nova.compute.manager [None req-380cea82-701d-43ac-9fc8-7fc73e85e927 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Get console output
Jan 22 17:51:23 compute-0 nova_compute[183075]: 2026-01-22 17:51:23.241 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:51:24 compute-0 nova_compute[183075]: 2026-01-22 17:51:24.005 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:24 compute-0 podman[242089]: 2026-01-22 17:51:24.350449815 +0000 UTC m=+0.055175771 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:51:25 compute-0 nova_compute[183075]: 2026-01-22 17:51:25.199 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:27 compute-0 podman[242114]: 2026-01-22 17:51:27.350655428 +0000 UTC m=+0.062384314 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 17:51:27 compute-0 nova_compute[183075]: 2026-01-22 17:51:27.932 183079 DEBUG oslo_concurrency.lockutils [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "e0a2ef50-95ae-4e45-bedb-f385cf225914" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:51:27 compute-0 nova_compute[183075]: 2026-01-22 17:51:27.932 183079 DEBUG oslo_concurrency.lockutils [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "e0a2ef50-95ae-4e45-bedb-f385cf225914" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:51:27 compute-0 nova_compute[183075]: 2026-01-22 17:51:27.950 183079 DEBUG nova.compute.manager [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.015 183079 DEBUG oslo_concurrency.lockutils [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.016 183079 DEBUG oslo_concurrency.lockutils [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.025 183079 DEBUG nova.virt.hardware [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.025 183079 INFO nova.compute.claims [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.139 183079 DEBUG nova.compute.provider_tree [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.153 183079 DEBUG nova.scheduler.client.report [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.174 183079 DEBUG oslo_concurrency.lockutils [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.175 183079 DEBUG nova.compute.manager [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.232 183079 DEBUG nova.compute.manager [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.233 183079 DEBUG nova.network.neutron [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.254 183079 INFO nova.virt.libvirt.driver [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.273 183079 DEBUG nova.compute.manager [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.370 183079 DEBUG nova.compute.manager [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.371 183079 DEBUG nova.virt.libvirt.driver [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.371 183079 INFO nova.virt.libvirt.driver [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Creating image(s)
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.372 183079 DEBUG oslo_concurrency.lockutils [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "/var/lib/nova/instances/e0a2ef50-95ae-4e45-bedb-f385cf225914/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.372 183079 DEBUG oslo_concurrency.lockutils [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/e0a2ef50-95ae-4e45-bedb-f385cf225914/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.373 183079 DEBUG oslo_concurrency.lockutils [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/e0a2ef50-95ae-4e45-bedb-f385cf225914/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.385 183079 DEBUG oslo_concurrency.processutils [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.442 183079 DEBUG oslo_concurrency.processutils [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.443 183079 DEBUG oslo_concurrency.lockutils [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.443 183079 DEBUG oslo_concurrency.lockutils [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.455 183079 DEBUG oslo_concurrency.processutils [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.508 183079 DEBUG oslo_concurrency.processutils [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.509 183079 DEBUG oslo_concurrency.processutils [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/e0a2ef50-95ae-4e45-bedb-f385cf225914/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.548 183079 DEBUG oslo_concurrency.processutils [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/e0a2ef50-95ae-4e45-bedb-f385cf225914/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.549 183079 DEBUG oslo_concurrency.lockutils [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.550 183079 DEBUG oslo_concurrency.processutils [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.609 183079 DEBUG oslo_concurrency.processutils [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.610 183079 DEBUG nova.virt.disk.api [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Checking if we can resize image /var/lib/nova/instances/e0a2ef50-95ae-4e45-bedb-f385cf225914/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.611 183079 DEBUG oslo_concurrency.processutils [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e0a2ef50-95ae-4e45-bedb-f385cf225914/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.668 183079 DEBUG oslo_concurrency.processutils [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e0a2ef50-95ae-4e45-bedb-f385cf225914/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.668 183079 DEBUG nova.virt.disk.api [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Cannot resize image /var/lib/nova/instances/e0a2ef50-95ae-4e45-bedb-f385cf225914/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.669 183079 DEBUG nova.objects.instance [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'migration_context' on Instance uuid e0a2ef50-95ae-4e45-bedb-f385cf225914 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.760 183079 DEBUG nova.policy [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.926 183079 DEBUG nova.virt.libvirt.driver [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.926 183079 DEBUG nova.virt.libvirt.driver [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Ensure instance console log exists: /var/lib/nova/instances/e0a2ef50-95ae-4e45-bedb-f385cf225914/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.927 183079 DEBUG oslo_concurrency.lockutils [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.927 183079 DEBUG oslo_concurrency.lockutils [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:51:28 compute-0 nova_compute[183075]: 2026-01-22 17:51:28.927 183079 DEBUG oslo_concurrency.lockutils [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:51:29 compute-0 nova_compute[183075]: 2026-01-22 17:51:29.009 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:29 compute-0 nova_compute[183075]: 2026-01-22 17:51:29.856 183079 DEBUG nova.network.neutron [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Successfully updated port: fdc3f392-ef15-4cba-9920-303c2d328978 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:51:29 compute-0 nova_compute[183075]: 2026-01-22 17:51:29.870 183079 DEBUG oslo_concurrency.lockutils [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "refresh_cache-e0a2ef50-95ae-4e45-bedb-f385cf225914" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:51:29 compute-0 nova_compute[183075]: 2026-01-22 17:51:29.871 183079 DEBUG oslo_concurrency.lockutils [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquired lock "refresh_cache-e0a2ef50-95ae-4e45-bedb-f385cf225914" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:51:29 compute-0 nova_compute[183075]: 2026-01-22 17:51:29.871 183079 DEBUG nova.network.neutron [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:51:29 compute-0 nova_compute[183075]: 2026-01-22 17:51:29.901 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:29.902 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=55, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=54) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:51:29 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:29.904 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:51:29 compute-0 nova_compute[183075]: 2026-01-22 17:51:29.940 183079 DEBUG nova.compute.manager [req-3b0d9114-9003-4fb2-8fc9-c837778dd4de req-c107dc9f-068b-4aba-9656-79a8b59d27d4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Received event network-changed-fdc3f392-ef15-4cba-9920-303c2d328978 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:51:29 compute-0 nova_compute[183075]: 2026-01-22 17:51:29.941 183079 DEBUG nova.compute.manager [req-3b0d9114-9003-4fb2-8fc9-c837778dd4de req-c107dc9f-068b-4aba-9656-79a8b59d27d4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Refreshing instance network info cache due to event network-changed-fdc3f392-ef15-4cba-9920-303c2d328978. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:51:29 compute-0 nova_compute[183075]: 2026-01-22 17:51:29.941 183079 DEBUG oslo_concurrency.lockutils [req-3b0d9114-9003-4fb2-8fc9-c837778dd4de req-c107dc9f-068b-4aba-9656-79a8b59d27d4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-e0a2ef50-95ae-4e45-bedb-f385cf225914" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.004 183079 DEBUG nova.network.neutron [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.202 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:30 compute-0 ovn_controller[95372]: 2026-01-22T17:51:30Z|00810|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.912 183079 DEBUG nova.network.neutron [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Updating instance_info_cache with network_info: [{"id": "fdc3f392-ef15-4cba-9920-303c2d328978", "address": "fa:16:3e:99:ce:f5", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdc3f392-ef", "ovs_interfaceid": "fdc3f392-ef15-4cba-9920-303c2d328978", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.934 183079 DEBUG oslo_concurrency.lockutils [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Releasing lock "refresh_cache-e0a2ef50-95ae-4e45-bedb-f385cf225914" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.934 183079 DEBUG nova.compute.manager [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Instance network_info: |[{"id": "fdc3f392-ef15-4cba-9920-303c2d328978", "address": "fa:16:3e:99:ce:f5", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdc3f392-ef", "ovs_interfaceid": "fdc3f392-ef15-4cba-9920-303c2d328978", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.934 183079 DEBUG oslo_concurrency.lockutils [req-3b0d9114-9003-4fb2-8fc9-c837778dd4de req-c107dc9f-068b-4aba-9656-79a8b59d27d4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-e0a2ef50-95ae-4e45-bedb-f385cf225914" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.935 183079 DEBUG nova.network.neutron [req-3b0d9114-9003-4fb2-8fc9-c837778dd4de req-c107dc9f-068b-4aba-9656-79a8b59d27d4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Refreshing network info cache for port fdc3f392-ef15-4cba-9920-303c2d328978 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.938 183079 DEBUG nova.virt.libvirt.driver [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Start _get_guest_xml network_info=[{"id": "fdc3f392-ef15-4cba-9920-303c2d328978", "address": "fa:16:3e:99:ce:f5", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdc3f392-ef", "ovs_interfaceid": "fdc3f392-ef15-4cba-9920-303c2d328978", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.944 183079 WARNING nova.virt.libvirt.driver [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.948 183079 DEBUG nova.virt.libvirt.host [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.949 183079 DEBUG nova.virt.libvirt.host [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.951 183079 DEBUG nova.virt.libvirt.host [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.952 183079 DEBUG nova.virt.libvirt.host [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.952 183079 DEBUG nova.virt.libvirt.driver [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.952 183079 DEBUG nova.virt.hardware [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.953 183079 DEBUG nova.virt.hardware [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.953 183079 DEBUG nova.virt.hardware [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.953 183079 DEBUG nova.virt.hardware [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.954 183079 DEBUG nova.virt.hardware [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.954 183079 DEBUG nova.virt.hardware [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.954 183079 DEBUG nova.virt.hardware [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.954 183079 DEBUG nova.virt.hardware [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.955 183079 DEBUG nova.virt.hardware [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.955 183079 DEBUG nova.virt.hardware [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.955 183079 DEBUG nova.virt.hardware [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.959 183079 DEBUG nova.virt.libvirt.vif [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:51:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1351617608',display_name='tempest-server-test-1351617608',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1351617608',id=73,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-qcrtjop2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:51:28Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=e0a2ef50-95ae-4e45-bedb-f385cf225914,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fdc3f392-ef15-4cba-9920-303c2d328978", "address": "fa:16:3e:99:ce:f5", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdc3f392-ef", "ovs_interfaceid": "fdc3f392-ef15-4cba-9920-303c2d328978", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.959 183079 DEBUG nova.network.os_vif_util [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "fdc3f392-ef15-4cba-9920-303c2d328978", "address": "fa:16:3e:99:ce:f5", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdc3f392-ef", "ovs_interfaceid": "fdc3f392-ef15-4cba-9920-303c2d328978", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.960 183079 DEBUG nova.network.os_vif_util [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:99:ce:f5,bridge_name='br-int',has_traffic_filtering=True,id=fdc3f392-ef15-4cba-9920-303c2d328978,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapfdc3f392-ef') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.961 183079 DEBUG nova.objects.instance [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'pci_devices' on Instance uuid e0a2ef50-95ae-4e45-bedb-f385cf225914 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.978 183079 DEBUG nova.virt.libvirt.driver [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:51:30 compute-0 nova_compute[183075]:   <uuid>e0a2ef50-95ae-4e45-bedb-f385cf225914</uuid>
Jan 22 17:51:30 compute-0 nova_compute[183075]:   <name>instance-00000049</name>
Jan 22 17:51:30 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:51:30 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:51:30 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:51:30 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1351617608</nova:name>
Jan 22 17:51:30 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:51:30</nova:creationTime>
Jan 22 17:51:30 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:51:30 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:51:30 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:51:30 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:51:30 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:51:30 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:51:30 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:51:30 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:51:30 compute-0 nova_compute[183075]:         <nova:user uuid="66c24efd0cd24c57803ea8508e679d06">tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member</nova:user>
Jan 22 17:51:30 compute-0 nova_compute[183075]:         <nova:project uuid="cbff5cec4b9c4d2eb317124ca20fea9f">tempest-StatelessNetworkSecGroupIPv4Test-1348485316</nova:project>
Jan 22 17:51:30 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:51:30 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:51:30 compute-0 nova_compute[183075]:         <nova:port uuid="fdc3f392-ef15-4cba-9920-303c2d328978">
Jan 22 17:51:30 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:51:30 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:51:30 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:51:30 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <system>
Jan 22 17:51:30 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:51:30 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:51:30 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:51:30 compute-0 nova_compute[183075]:       <entry name="serial">e0a2ef50-95ae-4e45-bedb-f385cf225914</entry>
Jan 22 17:51:30 compute-0 nova_compute[183075]:       <entry name="uuid">e0a2ef50-95ae-4e45-bedb-f385cf225914</entry>
Jan 22 17:51:30 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     </system>
Jan 22 17:51:30 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:51:30 compute-0 nova_compute[183075]:   <os>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:   </os>
Jan 22 17:51:30 compute-0 nova_compute[183075]:   <features>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:   </features>
Jan 22 17:51:30 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:51:30 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:51:30 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:51:30 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/e0a2ef50-95ae-4e45-bedb-f385cf225914/disk"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:51:30 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:99:ce:f5"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:       <target dev="tapfdc3f392-ef"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:51:30 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/e0a2ef50-95ae-4e45-bedb-f385cf225914/console.log" append="off"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <video>
Jan 22 17:51:30 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     </video>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:51:30 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:51:30 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:51:30 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:51:30 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:51:30 compute-0 nova_compute[183075]: </domain>
Jan 22 17:51:30 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.979 183079 DEBUG nova.compute.manager [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Preparing to wait for external event network-vif-plugged-fdc3f392-ef15-4cba-9920-303c2d328978 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.979 183079 DEBUG oslo_concurrency.lockutils [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "e0a2ef50-95ae-4e45-bedb-f385cf225914-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.979 183079 DEBUG oslo_concurrency.lockutils [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "e0a2ef50-95ae-4e45-bedb-f385cf225914-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.980 183079 DEBUG oslo_concurrency.lockutils [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "e0a2ef50-95ae-4e45-bedb-f385cf225914-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.980 183079 DEBUG nova.virt.libvirt.vif [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:51:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1351617608',display_name='tempest-server-test-1351617608',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1351617608',id=73,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-qcrtjop2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_
rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:51:28Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=e0a2ef50-95ae-4e45-bedb-f385cf225914,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fdc3f392-ef15-4cba-9920-303c2d328978", "address": "fa:16:3e:99:ce:f5", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdc3f392-ef", "ovs_interfaceid": "fdc3f392-ef15-4cba-9920-303c2d328978", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.981 183079 DEBUG nova.network.os_vif_util [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "fdc3f392-ef15-4cba-9920-303c2d328978", "address": "fa:16:3e:99:ce:f5", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdc3f392-ef", "ovs_interfaceid": "fdc3f392-ef15-4cba-9920-303c2d328978", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.981 183079 DEBUG nova.network.os_vif_util [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:99:ce:f5,bridge_name='br-int',has_traffic_filtering=True,id=fdc3f392-ef15-4cba-9920-303c2d328978,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapfdc3f392-ef') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.981 183079 DEBUG os_vif [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:99:ce:f5,bridge_name='br-int',has_traffic_filtering=True,id=fdc3f392-ef15-4cba-9920-303c2d328978,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapfdc3f392-ef') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.982 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.982 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.983 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.984 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.985 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfdc3f392-ef, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.985 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfdc3f392-ef, col_values=(('external_ids', {'iface-id': 'fdc3f392-ef15-4cba-9920-303c2d328978', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:99:ce:f5', 'vm-uuid': 'e0a2ef50-95ae-4e45-bedb-f385cf225914'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.986 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:30 compute-0 NetworkManager[55454]: <info>  [1769104290.9877] manager: (tapfdc3f392-ef): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/323)
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.989 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.993 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:30 compute-0 nova_compute[183075]: 2026-01-22 17:51:30.993 183079 INFO os_vif [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:99:ce:f5,bridge_name='br-int',has_traffic_filtering=True,id=fdc3f392-ef15-4cba-9920-303c2d328978,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapfdc3f392-ef')
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.047 183079 DEBUG nova.virt.libvirt.driver [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.048 183079 DEBUG nova.virt.libvirt.driver [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No VIF found with MAC fa:16:3e:99:ce:f5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:51:31 compute-0 kernel: tapfdc3f392-ef: entered promiscuous mode
Jan 22 17:51:31 compute-0 NetworkManager[55454]: <info>  [1769104291.1105] manager: (tapfdc3f392-ef): new Tun device (/org/freedesktop/NetworkManager/Devices/324)
Jan 22 17:51:31 compute-0 ovn_controller[95372]: 2026-01-22T17:51:31Z|00811|binding|INFO|Claiming lport fdc3f392-ef15-4cba-9920-303c2d328978 for this chassis.
Jan 22 17:51:31 compute-0 ovn_controller[95372]: 2026-01-22T17:51:31Z|00812|binding|INFO|fdc3f392-ef15-4cba-9920-303c2d328978: Claiming fa:16:3e:99:ce:f5 10.100.0.6
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.113 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:31.117 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:99:ce:f5 10.100.0.6'], port_security=['fa:16:3e:99:ce:f5 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f61e4b27-bb7c-42d2-a372-c8137640f8ec', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=fdc3f392-ef15-4cba-9920-303c2d328978) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:51:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:31.119 104629 INFO neutron.agent.ovn.metadata.agent [-] Port fdc3f392-ef15-4cba-9920-303c2d328978 in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 bound to our chassis
Jan 22 17:51:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:31.121 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:51:31 compute-0 ovn_controller[95372]: 2026-01-22T17:51:31Z|00813|binding|INFO|Setting lport fdc3f392-ef15-4cba-9920-303c2d328978 ovn-installed in OVS
Jan 22 17:51:31 compute-0 ovn_controller[95372]: 2026-01-22T17:51:31Z|00814|binding|INFO|Setting lport fdc3f392-ef15-4cba-9920-303c2d328978 up in Southbound
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.127 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.128 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.132 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:31.137 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[004d8fbd-fe24-4f2c-9310-e748239d417e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:51:31 compute-0 systemd-udevd[242169]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:51:31 compute-0 systemd-machined[154382]: New machine qemu-73-instance-00000049.
Jan 22 17:51:31 compute-0 NetworkManager[55454]: <info>  [1769104291.1563] device (tapfdc3f392-ef): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:51:31 compute-0 NetworkManager[55454]: <info>  [1769104291.1572] device (tapfdc3f392-ef): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:51:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:31.165 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[a09c127a-0d8a-4f5e-a36f-2dd175afd282]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:51:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:31.167 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[02064ec1-54ea-4d7f-b879-cbc89bc823bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:51:31 compute-0 systemd[1]: Started Virtual Machine qemu-73-instance-00000049.
Jan 22 17:51:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:31.195 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[b99bf7db-5b95-4018-9877-f94c8506f806]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:51:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:31.211 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[41aa4528-025f-47c0-8a94-8ea2c9f96d51]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 54, 'rx_bytes': 8920, 'tx_bytes': 6131, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 54, 'rx_bytes': 8920, 'tx_bytes': 6131, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 217], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 662414, 'reachable_time': 31076, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242179, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:51:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:31.227 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[cd16d776-6b89-4adf-ac7f-58d846cf0b00]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 662427, 'tstamp': 662427}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242183, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 662429, 'tstamp': 662429}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242183, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:51:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:31.229 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.230 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:31.234 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88ed9213-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:51:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:31.234 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:51:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:31.235 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88ed9213-70, col_values=(('external_ids', {'iface-id': 'da93a32e-eed6-4124-8f08-78b0bfe2cb69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:51:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:31.235 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.438 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104291.4380848, e0a2ef50-95ae-4e45-bedb-f385cf225914 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.439 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] VM Started (Lifecycle Event)
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.571 183079 DEBUG nova.compute.manager [req-6d6eb9eb-40d2-457e-b0aa-a824a60e3ce4 req-61de9383-2fec-40de-b6ec-f0405194ac8e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Received event network-vif-plugged-fdc3f392-ef15-4cba-9920-303c2d328978 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.571 183079 DEBUG oslo_concurrency.lockutils [req-6d6eb9eb-40d2-457e-b0aa-a824a60e3ce4 req-61de9383-2fec-40de-b6ec-f0405194ac8e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "e0a2ef50-95ae-4e45-bedb-f385cf225914-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.572 183079 DEBUG oslo_concurrency.lockutils [req-6d6eb9eb-40d2-457e-b0aa-a824a60e3ce4 req-61de9383-2fec-40de-b6ec-f0405194ac8e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e0a2ef50-95ae-4e45-bedb-f385cf225914-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.572 183079 DEBUG oslo_concurrency.lockutils [req-6d6eb9eb-40d2-457e-b0aa-a824a60e3ce4 req-61de9383-2fec-40de-b6ec-f0405194ac8e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e0a2ef50-95ae-4e45-bedb-f385cf225914-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.572 183079 DEBUG nova.compute.manager [req-6d6eb9eb-40d2-457e-b0aa-a824a60e3ce4 req-61de9383-2fec-40de-b6ec-f0405194ac8e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Processing event network-vif-plugged-fdc3f392-ef15-4cba-9920-303c2d328978 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.574 183079 DEBUG nova.compute.manager [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.579 183079 DEBUG nova.virt.libvirt.driver [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.584 183079 INFO nova.virt.libvirt.driver [-] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Instance spawned successfully.
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.584 183079 DEBUG nova.virt.libvirt.driver [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.591 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.595 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.602 183079 DEBUG nova.virt.libvirt.driver [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.603 183079 DEBUG nova.virt.libvirt.driver [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.603 183079 DEBUG nova.virt.libvirt.driver [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.604 183079 DEBUG nova.virt.libvirt.driver [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.604 183079 DEBUG nova.virt.libvirt.driver [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.604 183079 DEBUG nova.virt.libvirt.driver [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.611 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.611 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104291.4383008, e0a2ef50-95ae-4e45-bedb-f385cf225914 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.611 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] VM Paused (Lifecycle Event)
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.634 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.637 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104291.5778358, e0a2ef50-95ae-4e45-bedb-f385cf225914 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.637 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] VM Resumed (Lifecycle Event)
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.672 183079 INFO nova.compute.manager [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Took 3.30 seconds to spawn the instance on the hypervisor.
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.673 183079 DEBUG nova.compute.manager [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.674 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.683 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.710 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.742 183079 INFO nova.compute.manager [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Took 3.75 seconds to build instance.
Jan 22 17:51:31 compute-0 nova_compute[183075]: 2026-01-22 17:51:31.761 183079 DEBUG oslo_concurrency.lockutils [None req-146b99c0-de1e-4e7e-ae4d-90d5d430e30b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "e0a2ef50-95ae-4e45-bedb-f385cf225914" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.829s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:51:32 compute-0 nova_compute[183075]: 2026-01-22 17:51:32.205 183079 DEBUG nova.network.neutron [req-3b0d9114-9003-4fb2-8fc9-c837778dd4de req-c107dc9f-068b-4aba-9656-79a8b59d27d4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Updated VIF entry in instance network info cache for port fdc3f392-ef15-4cba-9920-303c2d328978. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:51:32 compute-0 nova_compute[183075]: 2026-01-22 17:51:32.205 183079 DEBUG nova.network.neutron [req-3b0d9114-9003-4fb2-8fc9-c837778dd4de req-c107dc9f-068b-4aba-9656-79a8b59d27d4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Updating instance_info_cache with network_info: [{"id": "fdc3f392-ef15-4cba-9920-303c2d328978", "address": "fa:16:3e:99:ce:f5", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdc3f392-ef", "ovs_interfaceid": "fdc3f392-ef15-4cba-9920-303c2d328978", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:51:32 compute-0 nova_compute[183075]: 2026-01-22 17:51:32.222 183079 DEBUG oslo_concurrency.lockutils [req-3b0d9114-9003-4fb2-8fc9-c837778dd4de req-c107dc9f-068b-4aba-9656-79a8b59d27d4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-e0a2ef50-95ae-4e45-bedb-f385cf225914" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:51:32 compute-0 nova_compute[183075]: 2026-01-22 17:51:32.385 183079 INFO nova.compute.manager [None req-3dbee03a-9880-4916-bfd0-0a96d912a601 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Get console output
Jan 22 17:51:32 compute-0 nova_compute[183075]: 2026-01-22 17:51:32.390 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:51:33 compute-0 nova_compute[183075]: 2026-01-22 17:51:33.668 183079 DEBUG nova.compute.manager [req-2ecbf7bb-ef51-48d8-a693-32c8a37c72d5 req-499b5c23-6c13-4bcb-95d8-7138785243dc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Received event network-vif-plugged-fdc3f392-ef15-4cba-9920-303c2d328978 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:51:33 compute-0 nova_compute[183075]: 2026-01-22 17:51:33.668 183079 DEBUG oslo_concurrency.lockutils [req-2ecbf7bb-ef51-48d8-a693-32c8a37c72d5 req-499b5c23-6c13-4bcb-95d8-7138785243dc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "e0a2ef50-95ae-4e45-bedb-f385cf225914-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:51:33 compute-0 nova_compute[183075]: 2026-01-22 17:51:33.668 183079 DEBUG oslo_concurrency.lockutils [req-2ecbf7bb-ef51-48d8-a693-32c8a37c72d5 req-499b5c23-6c13-4bcb-95d8-7138785243dc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e0a2ef50-95ae-4e45-bedb-f385cf225914-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:51:33 compute-0 nova_compute[183075]: 2026-01-22 17:51:33.669 183079 DEBUG oslo_concurrency.lockutils [req-2ecbf7bb-ef51-48d8-a693-32c8a37c72d5 req-499b5c23-6c13-4bcb-95d8-7138785243dc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e0a2ef50-95ae-4e45-bedb-f385cf225914-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:51:33 compute-0 nova_compute[183075]: 2026-01-22 17:51:33.669 183079 DEBUG nova.compute.manager [req-2ecbf7bb-ef51-48d8-a693-32c8a37c72d5 req-499b5c23-6c13-4bcb-95d8-7138785243dc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] No waiting events found dispatching network-vif-plugged-fdc3f392-ef15-4cba-9920-303c2d328978 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:51:33 compute-0 nova_compute[183075]: 2026-01-22 17:51:33.669 183079 WARNING nova.compute.manager [req-2ecbf7bb-ef51-48d8-a693-32c8a37c72d5 req-499b5c23-6c13-4bcb-95d8-7138785243dc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Received unexpected event network-vif-plugged-fdc3f392-ef15-4cba-9920-303c2d328978 for instance with vm_state active and task_state None.
Jan 22 17:51:34 compute-0 nova_compute[183075]: 2026-01-22 17:51:34.014 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:34 compute-0 nova_compute[183075]: 2026-01-22 17:51:34.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:51:35 compute-0 nova_compute[183075]: 2026-01-22 17:51:35.986 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:37 compute-0 nova_compute[183075]: 2026-01-22 17:51:37.486 183079 INFO nova.compute.manager [None req-81d198a1-1862-493c-aa43-b4aa8f9532e5 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Get console output
Jan 22 17:51:38 compute-0 podman[242194]: 2026-01-22 17:51:38.362352224 +0000 UTC m=+0.059385524 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, distribution-scope=public, container_name=openstack_network_exporter, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 
'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 17:51:38 compute-0 podman[242193]: 2026-01-22 17:51:38.369967018 +0000 UTC m=+0.060893824 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 17:51:38 compute-0 podman[242192]: 2026-01-22 17:51:38.399522101 +0000 UTC m=+0.103788795 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:51:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:38.906 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '55'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:51:39 compute-0 nova_compute[183075]: 2026-01-22 17:51:39.014 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:39 compute-0 nova_compute[183075]: 2026-01-22 17:51:39.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:51:39 compute-0 nova_compute[183075]: 2026-01-22 17:51:39.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:51:39 compute-0 nova_compute[183075]: 2026-01-22 17:51:39.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:51:40 compute-0 nova_compute[183075]: 2026-01-22 17:51:40.988 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:41.971 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:51:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:41.971 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:51:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:41.972 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:51:42 compute-0 nova_compute[183075]: 2026-01-22 17:51:42.634 183079 INFO nova.compute.manager [None req-3b2bb5c6-d292-4dd3-8dac-b8094cb93996 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Get console output
Jan 22 17:51:42 compute-0 nova_compute[183075]: 2026-01-22 17:51:42.641 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:51:44 compute-0 nova_compute[183075]: 2026-01-22 17:51:44.016 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:44 compute-0 podman[242267]: 2026-01-22 17:51:44.353128143 +0000 UTC m=+0.063916295 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Jan 22 17:51:44 compute-0 ovn_controller[95372]: 2026-01-22T17:51:44Z|00148|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:99:ce:f5 10.100.0.6
Jan 22 17:51:44 compute-0 ovn_controller[95372]: 2026-01-22T17:51:44Z|00149|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:99:ce:f5 10.100.0.6
Jan 22 17:51:45 compute-0 nova_compute[183075]: 2026-01-22 17:51:45.784 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:51:45 compute-0 nova_compute[183075]: 2026-01-22 17:51:45.992 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:47 compute-0 nova_compute[183075]: 2026-01-22 17:51:47.774 183079 INFO nova.compute.manager [None req-77341764-94d5-4177-b093-add222999aa8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Get console output
Jan 22 17:51:47 compute-0 nova_compute[183075]: 2026-01-22 17:51:47.778 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:51:49 compute-0 nova_compute[183075]: 2026-01-22 17:51:49.019 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:50.888 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:51:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:50.888 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:51:50 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:51:50 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:51:50 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:51:50 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:51:50 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:51:50 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:51:50 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:51:50 compute-0 nova_compute[183075]: 2026-01-22 17:51:50.993 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.772 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.772 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.8838310
Jan 22 17:51:51 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241974]: 10.100.0.6:56242 [22/Jan/2026:17:51:50.887] listener listener/metadata 0/0/0/885/885 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.784 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.785 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:51:51 compute-0 nova_compute[183075]: 2026-01-22 17:51:51.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.800 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.801 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0158217
Jan 22 17:51:51 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241974]: 10.100.0.6:56258 [22/Jan/2026:17:51:51.784] listener listener/metadata 0/0/0/17/17 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.807 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.807 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:51:51 compute-0 nova_compute[183075]: 2026-01-22 17:51:51.811 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:51:51 compute-0 nova_compute[183075]: 2026-01-22 17:51:51.812 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:51:51 compute-0 nova_compute[183075]: 2026-01-22 17:51:51.812 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:51:51 compute-0 nova_compute[183075]: 2026-01-22 17:51:51.812 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.823 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.823 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0158234
Jan 22 17:51:51 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241974]: 10.100.0.6:56274 [22/Jan/2026:17:51:51.806] listener listener/metadata 0/0/0/16/16 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.830 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.831 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:51:51 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.846 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.847 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0161376
Jan 22 17:51:51 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241974]: 10.100.0.6:56282 [22/Jan/2026:17:51:51.830] listener listener/metadata 0/0/0/17/17 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.854 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.855 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.871 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.872 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0172427
Jan 22 17:51:51 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241974]: 10.100.0.6:56298 [22/Jan/2026:17:51:51.853] listener listener/metadata 0/0/0/18/18 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.881 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.882 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:51:51 compute-0 nova_compute[183075]: 2026-01-22 17:51:51.896 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e0a2ef50-95ae-4e45-bedb-f385cf225914/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.901 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.902 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0197606
Jan 22 17:51:51 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241974]: 10.100.0.6:56306 [22/Jan/2026:17:51:51.880] listener listener/metadata 0/0/0/21/21 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.910 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.910 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.922 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.922 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0120480
Jan 22 17:51:51 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241974]: 10.100.0.6:56320 [22/Jan/2026:17:51:51.909] listener listener/metadata 0/0/0/13/13 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.928 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.929 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.943 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.944 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0152051
Jan 22 17:51:51 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241974]: 10.100.0.6:56328 [22/Jan/2026:17:51:51.928] listener listener/metadata 0/0/0/15/15 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.950 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.950 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:51:51 compute-0 nova_compute[183075]: 2026-01-22 17:51:51.956 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e0a2ef50-95ae-4e45-bedb-f385cf225914/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:51:51 compute-0 nova_compute[183075]: 2026-01-22 17:51:51.957 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e0a2ef50-95ae-4e45-bedb-f385cf225914/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.964 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:51:51 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241974]: 10.100.0.6:56332 [22/Jan/2026:17:51:51.949] listener listener/metadata 0/0/0/14/14 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.964 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0143549
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.970 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.970 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.982 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.982 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0124259
Jan 22 17:51:51 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241974]: 10.100.0.6:56344 [22/Jan/2026:17:51:51.969] listener listener/metadata 0/0/0/13/13 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.987 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:51.988 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:51:51 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:51:52 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241974]: 10.100.0.6:56358 [22/Jan/2026:17:51:51.987] listener listener/metadata 0/0/0/16/16 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:52.003 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0150216
Jan 22 17:51:52 compute-0 nova_compute[183075]: 2026-01-22 17:51:52.014 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e0a2ef50-95ae-4e45-bedb-f385cf225914/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:52.018 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:52.020 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:51:52 compute-0 nova_compute[183075]: 2026-01-22 17:51:52.021 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9f0a876-68d5-4c5d-b4cf-62a36101777d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:52.042 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:51:52 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241974]: 10.100.0.6:56370 [22/Jan/2026:17:51:52.017] listener listener/metadata 0/0/0/24/24 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:52.042 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0224655
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:52.047 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:52.048 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:52.069 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:51:52 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241974]: 10.100.0.6:56382 [22/Jan/2026:17:51:52.046] listener listener/metadata 0/0/0/23/23 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:52.070 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0225596
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:52.075 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:52.076 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:52.096 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:52.097 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0207739
Jan 22 17:51:52 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241974]: 10.100.0.6:56390 [22/Jan/2026:17:51:52.074] listener listener/metadata 0/0/0/22/22 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:52.102 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:52.103 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:52.119 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:52.120 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0172169
Jan 22 17:51:52 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241974]: 10.100.0.6:56406 [22/Jan/2026:17:51:52.101] listener listener/metadata 0/0/0/18/18 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:51:52 compute-0 nova_compute[183075]: 2026-01-22 17:51:52.120 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9f0a876-68d5-4c5d-b4cf-62a36101777d/disk --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:51:52 compute-0 nova_compute[183075]: 2026-01-22 17:51:52.122 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9f0a876-68d5-4c5d-b4cf-62a36101777d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:52.127 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:52.128 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.6
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:52.148 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:51:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:51:52.149 104990 INFO eventlet.wsgi.server [-] 10.100.0.6,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0204849
Jan 22 17:51:52 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241974]: 10.100.0.6:56412 [22/Jan/2026:17:51:52.126] listener listener/metadata 0/0/0/22/22 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:51:52 compute-0 nova_compute[183075]: 2026-01-22 17:51:52.180 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9f0a876-68d5-4c5d-b4cf-62a36101777d/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:51:52 compute-0 nova_compute[183075]: 2026-01-22 17:51:52.353 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:51:52 compute-0 nova_compute[183075]: 2026-01-22 17:51:52.354 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5396MB free_disk=73.2955322265625GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:51:52 compute-0 nova_compute[183075]: 2026-01-22 17:51:52.355 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:51:52 compute-0 nova_compute[183075]: 2026-01-22 17:51:52.355 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:51:52 compute-0 nova_compute[183075]: 2026-01-22 17:51:52.429 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance c9f0a876-68d5-4c5d-b4cf-62a36101777d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:51:52 compute-0 nova_compute[183075]: 2026-01-22 17:51:52.430 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance e0a2ef50-95ae-4e45-bedb-f385cf225914 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:51:52 compute-0 nova_compute[183075]: 2026-01-22 17:51:52.430 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:51:52 compute-0 nova_compute[183075]: 2026-01-22 17:51:52.430 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:51:52 compute-0 nova_compute[183075]: 2026-01-22 17:51:52.494 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:51:52 compute-0 nova_compute[183075]: 2026-01-22 17:51:52.508 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:51:52 compute-0 nova_compute[183075]: 2026-01-22 17:51:52.525 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:51:52 compute-0 nova_compute[183075]: 2026-01-22 17:51:52.526 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:51:52 compute-0 nova_compute[183075]: 2026-01-22 17:51:52.967 183079 INFO nova.compute.manager [None req-34ea3962-289b-49f6-823b-eb478e863afb 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Get console output
Jan 22 17:51:52 compute-0 nova_compute[183075]: 2026-01-22 17:51:52.972 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:51:53 compute-0 nova_compute[183075]: 2026-01-22 17:51:53.526 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:51:53 compute-0 nova_compute[183075]: 2026-01-22 17:51:53.527 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:51:53 compute-0 nova_compute[183075]: 2026-01-22 17:51:53.527 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:51:54 compute-0 nova_compute[183075]: 2026-01-22 17:51:54.062 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:54 compute-0 nova_compute[183075]: 2026-01-22 17:51:54.632 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-c9f0a876-68d5-4c5d-b4cf-62a36101777d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:51:54 compute-0 nova_compute[183075]: 2026-01-22 17:51:54.632 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-c9f0a876-68d5-4c5d-b4cf-62a36101777d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:51:54 compute-0 nova_compute[183075]: 2026-01-22 17:51:54.632 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:51:54 compute-0 nova_compute[183075]: 2026-01-22 17:51:54.633 183079 DEBUG nova.objects.instance [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c9f0a876-68d5-4c5d-b4cf-62a36101777d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:51:55 compute-0 podman[242307]: 2026-01-22 17:51:55.340556027 +0000 UTC m=+0.055547171 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.465 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914', 'name': 'tempest-server-test-1351617608', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000049', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'hostId': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.468 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c9f0a876-68d5-4c5d-b4cf-62a36101777d', 'name': 'tempest-server-test-1080468316', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000048', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'hostId': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.468 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.475 12 DEBUG ceilometer.compute.pollsters [-] e0a2ef50-95ae-4e45-bedb-f385cf225914/disk.device.usage volume: 29818880 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.484 12 DEBUG ceilometer.compute.pollsters [-] c9f0a876-68d5-4c5d-b4cf-62a36101777d/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b7191153-ca87-419f-ac52-12bbbcc640c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29818880, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914-vda', 'timestamp': '2026-01-22T17:51:55.468357', 'resource_metadata': {'display_name': 'tempest-server-test-1351617608', 'name': 'instance-00000049', 'instance_id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0a735582-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.228776076, 'message_signature': 'e2078a59f4e438860dbe1bada39a98e9673a168beff4922be6961500b956fb08'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 
'c9f0a876-68d5-4c5d-b4cf-62a36101777d-vda', 'timestamp': '2026-01-22T17:51:55.468357', 'resource_metadata': {'display_name': 'tempest-server-test-1080468316', 'name': 'instance-00000048', 'instance_id': 'c9f0a876-68d5-4c5d-b4cf-62a36101777d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0a74b58a-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.2367605, 'message_signature': '0e54b3e0b179dd060444c665686b823b75d2acde4525d52099ee2a2f03d00f26'}]}, 'timestamp': '2026-01-22 17:51:55.485305', '_unique_id': 'b47e7b28591846d1b49d46df67cab30e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.486 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.487 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.490 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for e0a2ef50-95ae-4e45-bedb-f385cf225914 / tapfdc3f392-ef inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.491 12 DEBUG ceilometer.compute.pollsters [-] e0a2ef50-95ae-4e45-bedb-f385cf225914/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.493 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for c9f0a876-68d5-4c5d-b4cf-62a36101777d / tap1dcdc02a-a9 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.493 12 DEBUG ceilometer.compute.pollsters [-] c9f0a876-68d5-4c5d-b4cf-62a36101777d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24074380-3def-40a3-bcfe-428757ca4426', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000049-e0a2ef50-95ae-4e45-bedb-f385cf225914-tapfdc3f392-ef', 'timestamp': '2026-01-22T17:51:55.487540', 'resource_metadata': {'display_name': 'tempest-server-test-1351617608', 'name': 'tapfdc3f392-ef', 'instance_id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:99:ce:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfdc3f392-ef'}, 'message_id': '0a75a8d2-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.247992111, 'message_signature': '06682571f1809f7cb92adf3b545397da3c82edc88edc455dd95391ccd95d4829'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000048-c9f0a876-68d5-4c5d-b4cf-62a36101777d-tap1dcdc02a-a9', 'timestamp': '2026-01-22T17:51:55.487540', 'resource_metadata': {'display_name': 'tempest-server-test-1080468316', 'name': 'tap1dcdc02a-a9', 'instance_id': 'c9f0a876-68d5-4c5d-b4cf-62a36101777d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2d:00:4b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1dcdc02a-a9'}, 'message_id': '0a75fb52-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.251935817, 'message_signature': '17cd7a15b814903f88b340c6b0d2cc255bd6876a88a06ca2b6360edcbcb92763'}]}, 'timestamp': '2026-01-22 17:51:55.493575', '_unique_id': 'b484805c2b424aa1b9a5ecf107f241f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.494 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.495 12 DEBUG ceilometer.compute.pollsters [-] e0a2ef50-95ae-4e45-bedb-f385cf225914/network.incoming.bytes volume: 7305 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.495 12 DEBUG ceilometer.compute.pollsters [-] c9f0a876-68d5-4c5d-b4cf-62a36101777d/network.incoming.bytes volume: 7521 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23ab9a13-8fff-46bf-ad2d-a8d8a489a6c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7305, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000049-e0a2ef50-95ae-4e45-bedb-f385cf225914-tapfdc3f392-ef', 'timestamp': '2026-01-22T17:51:55.495030', 'resource_metadata': {'display_name': 'tempest-server-test-1351617608', 'name': 'tapfdc3f392-ef', 'instance_id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:99:ce:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfdc3f392-ef'}, 'message_id': '0a763d6a-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.247992111, 'message_signature': '4e2dafb8e4c55bd906614504196973b4fc537c40ee4b22e638840a95c430841f'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7521, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000048-c9f0a876-68d5-4c5d-b4cf-62a36101777d-tap1dcdc02a-a9', 'timestamp': '2026-01-22T17:51:55.495030', 'resource_metadata': {'display_name': 'tempest-server-test-1080468316', 'name': 'tap1dcdc02a-a9', 'instance_id': 'c9f0a876-68d5-4c5d-b4cf-62a36101777d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2d:00:4b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1dcdc02a-a9'}, 'message_id': '0a764580-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.251935817, 'message_signature': '29e0f76006b6dc21f1661772f0a24eaeeb2c1075ad12f2f29766d05368e90a94'}]}, 'timestamp': '2026-01-22 17:51:55.495488', '_unique_id': '5433b728641b4134ad4226ef7302a121'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1351617608>, <NovaLikeServer: tempest-server-test-1080468316>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1351617608>, <NovaLikeServer: tempest-server-test-1080468316>]
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.496 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.510 12 DEBUG ceilometer.compute.pollsters [-] e0a2ef50-95ae-4e45-bedb-f385cf225914/cpu volume: 11610000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.525 12 DEBUG ceilometer.compute.pollsters [-] c9f0a876-68d5-4c5d-b4cf-62a36101777d/cpu volume: 11130000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1e5aa3e-4170-45d7-afd2-73ad92255146', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11610000000, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914', 'timestamp': '2026-01-22T17:51:55.496982', 'resource_metadata': {'display_name': 'tempest-server-test-1351617608', 'name': 'instance-00000049', 'instance_id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '0a789e98-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.27067922, 'message_signature': '9dac8d6900f9d7aba88686c2e2a49a7ebca3ea2b5573e27e6af2456ba96e37bb'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11130000000, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'c9f0a876-68d5-4c5d-b4cf-62a36101777d', 
'timestamp': '2026-01-22T17:51:55.496982', 'resource_metadata': {'display_name': 'tempest-server-test-1080468316', 'name': 'instance-00000048', 'instance_id': 'c9f0a876-68d5-4c5d-b4cf-62a36101777d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '0a7af788-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.286115824, 'message_signature': '9d03d5041e55fbbf93b9d75edebe62d3053364b81c1bd4309b22e07050a0c081'}]}, 'timestamp': '2026-01-22 17:51:55.526313', '_unique_id': '9ee7a35f9b93436e8cb6a13a00304fcb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.527 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.528 12 DEBUG ceilometer.compute.pollsters [-] e0a2ef50-95ae-4e45-bedb-f385cf225914/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.528 12 DEBUG ceilometer.compute.pollsters [-] c9f0a876-68d5-4c5d-b4cf-62a36101777d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89132e30-d3e8-4222-a6d8-7fe87025fd4f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000049-e0a2ef50-95ae-4e45-bedb-f385cf225914-tapfdc3f392-ef', 'timestamp': '2026-01-22T17:51:55.528062', 'resource_metadata': {'display_name': 'tempest-server-test-1351617608', 'name': 'tapfdc3f392-ef', 'instance_id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:99:ce:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfdc3f392-ef'}, 'message_id': '0a7b495e-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.247992111, 'message_signature': '708d21a65aba66d119c9b400a5a29dadceb21e3294d834bf3367588b90ac2dc8'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000048-c9f0a876-68d5-4c5d-b4cf-62a36101777d-tap1dcdc02a-a9', 'timestamp': '2026-01-22T17:51:55.528062', 'resource_metadata': {'display_name': 'tempest-server-test-1080468316', 'name': 'tap1dcdc02a-a9', 'instance_id': 'c9f0a876-68d5-4c5d-b4cf-62a36101777d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2d:00:4b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1dcdc02a-a9'}, 'message_id': '0a7b5494-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.251935817, 'message_signature': '5b0c8ed0f08abecd10d3a74545718dffc1c451219a81b4504673e24997935b40'}]}, 'timestamp': '2026-01-22 17:51:55.528662', '_unique_id': '5f9acf1ebd1d456bb3e966bd620aaf89'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.529 12 DEBUG ceilometer.compute.pollsters [-] e0a2ef50-95ae-4e45-bedb-f385cf225914/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 DEBUG ceilometer.compute.pollsters [-] c9f0a876-68d5-4c5d-b4cf-62a36101777d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5e3ebc5-01ae-4b6d-bb95-d8fdfb2aa56e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000049-e0a2ef50-95ae-4e45-bedb-f385cf225914-tapfdc3f392-ef', 'timestamp': '2026-01-22T17:51:55.529826', 'resource_metadata': {'display_name': 'tempest-server-test-1351617608', 'name': 'tapfdc3f392-ef', 'instance_id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:99:ce:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfdc3f392-ef'}, 'message_id': '0a7b8da6-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.247992111, 'message_signature': '78136247d24517fa8533e1e360c315228c8dcdf867804e51ff87bdc078de9a1d'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000048-c9f0a876-68d5-4c5d-b4cf-62a36101777d-tap1dcdc02a-a9', 'timestamp': '2026-01-22T17:51:55.529826', 'resource_metadata': {'display_name': 'tempest-server-test-1080468316', 'name': 'tap1dcdc02a-a9', 'instance_id': 'c9f0a876-68d5-4c5d-b4cf-62a36101777d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2d:00:4b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1dcdc02a-a9'}, 'message_id': '0a7b9940-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.251935817, 'message_signature': 'c476012fe67dbb51e81a136ef4df0c8aaf459b1ade46f7ccf4d05d32db311ce8'}]}, 'timestamp': '2026-01-22 17:51:55.530381', '_unique_id': '227a064e04734eb8bbdf9d7fee91022e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.530 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.531 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.541 12 DEBUG ceilometer.compute.pollsters [-] e0a2ef50-95ae-4e45-bedb-f385cf225914/disk.device.write.latency volume: 2764769918 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.552 12 DEBUG ceilometer.compute.pollsters [-] c9f0a876-68d5-4c5d-b4cf-62a36101777d/disk.device.write.latency volume: 2684339448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '008b6a18-84d9-4854-95b6-ea5c15f25859', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2764769918, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914-vda', 'timestamp': '2026-01-22T17:51:55.531603', 'resource_metadata': {'display_name': 'tempest-server-test-1351617608', 'name': 'instance-00000049', 'instance_id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0a7d67ac-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.292058313, 'message_signature': 'bc8edad991874c53345aa63fefde1e3f5d5a5c7352dd2565d260cd0c01595b86'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2684339448, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 
'resource_id': 'c9f0a876-68d5-4c5d-b4cf-62a36101777d-vda', 'timestamp': '2026-01-22T17:51:55.531603', 'resource_metadata': {'display_name': 'tempest-server-test-1080468316', 'name': 'instance-00000048', 'instance_id': 'c9f0a876-68d5-4c5d-b4cf-62a36101777d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0a7f15ca-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.302651297, 'message_signature': '1a081673fe82923aaca9a8bbc297846b40140e4eacc4315088ffc89ad536e9ee'}]}, 'timestamp': '2026-01-22 17:51:55.553294', '_unique_id': '78836e80781f4ced9a351671d30c59e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.554 12 DEBUG ceilometer.compute.pollsters [-] e0a2ef50-95ae-4e45-bedb-f385cf225914/network.incoming.packets volume: 62 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.555 12 DEBUG ceilometer.compute.pollsters [-] c9f0a876-68d5-4c5d-b4cf-62a36101777d/network.incoming.packets volume: 66 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5d408d1f-ff83-488b-8981-fb46a952b3b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 62, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000049-e0a2ef50-95ae-4e45-bedb-f385cf225914-tapfdc3f392-ef', 'timestamp': '2026-01-22T17:51:55.554959', 'resource_metadata': {'display_name': 'tempest-server-test-1351617608', 'name': 'tapfdc3f392-ef', 'instance_id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:99:ce:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfdc3f392-ef'}, 'message_id': '0a7f63fe-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.247992111, 'message_signature': '1d6fd091e49edaa5903508efcf2fee55ff7000e2f6c86eb6b8f39426479f19a3'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 66, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000048-c9f0a876-68d5-4c5d-b4cf-62a36101777d-tap1dcdc02a-a9', 'timestamp': '2026-01-22T17:51:55.554959', 'resource_metadata': {'display_name': 'tempest-server-test-1080468316', 'name': 'tap1dcdc02a-a9', 'instance_id': 'c9f0a876-68d5-4c5d-b4cf-62a36101777d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2d:00:4b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1dcdc02a-a9'}, 'message_id': '0a7f8014-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.251935817, 'message_signature': '843e30f4182b4d6ec278f2d1e7594b83d5663d9d5a96ca2dbf7c0c46ce7cffa0'}]}, 'timestamp': '2026-01-22 17:51:55.555968', '_unique_id': '45606555298b40b6a06901e4e49d8642'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.556 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.557 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.557 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.557 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-1351617608>, <NovaLikeServer: tempest-server-test-1080468316>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1351617608>, <NovaLikeServer: tempest-server-test-1080468316>]
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.557 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.557 12 DEBUG ceilometer.compute.pollsters [-] e0a2ef50-95ae-4e45-bedb-f385cf225914/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.557 12 DEBUG ceilometer.compute.pollsters [-] c9f0a876-68d5-4c5d-b4cf-62a36101777d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7bc63344-57eb-4847-9950-578038cb4455', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000049-e0a2ef50-95ae-4e45-bedb-f385cf225914-tapfdc3f392-ef', 'timestamp': '2026-01-22T17:51:55.557649', 'resource_metadata': {'display_name': 'tempest-server-test-1351617608', 'name': 'tapfdc3f392-ef', 'instance_id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:99:ce:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfdc3f392-ef'}, 'message_id': '0a7fccf4-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.247992111, 'message_signature': '7627d587f20475abf4e2b2460a877fffd6fbb6185c543d349f5d85339a798e1a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000048-c9f0a876-68d5-4c5d-b4cf-62a36101777d-tap1dcdc02a-a9', 'timestamp': '2026-01-22T17:51:55.557649', 'resource_metadata': {'display_name': 'tempest-server-test-1080468316', 'name': 'tap1dcdc02a-a9', 'instance_id': 'c9f0a876-68d5-4c5d-b4cf-62a36101777d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2d:00:4b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1dcdc02a-a9'}, 'message_id': '0a7fd640-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.251935817, 'message_signature': '96b1176e4afb05ab9b67baaf1e5f49fa3a72551b8075bec1a23cf8ee1101a259'}]}, 'timestamp': '2026-01-22 17:51:55.558165', '_unique_id': 'd6be3d02462348bcb3a22d254437ee0c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.558 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.559 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.559 12 DEBUG ceilometer.compute.pollsters [-] e0a2ef50-95ae-4e45-bedb-f385cf225914/disk.device.allocation volume: 30220288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.559 12 DEBUG ceilometer.compute.pollsters [-] c9f0a876-68d5-4c5d-b4cf-62a36101777d/disk.device.allocation volume: 30220288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0001fa9-b70a-47be-8600-148d343cb740', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30220288, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914-vda', 'timestamp': '2026-01-22T17:51:55.559272', 'resource_metadata': {'display_name': 'tempest-server-test-1351617608', 'name': 'instance-00000049', 'instance_id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0a800d0e-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.228776076, 'message_signature': '318950306413ffc646313fca17da078a52071de7d9727507e1921bc519a2cbe1'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30220288, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'c9f0a876-68d5-4c5d-b4cf-62a36101777d-vda', 'timestamp': '2026-01-22T17:51:55.559272', 'resource_metadata': {'display_name': 'tempest-server-test-1080468316', 'name': 'instance-00000048', 'instance_id': 'c9f0a876-68d5-4c5d-b4cf-62a36101777d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0a80188a-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.2367605, 'message_signature': '9cd4329dddc723267e741b924b10fd118cd75c8c512c04c8f210f0ec666d8bfa'}]}, 'timestamp': '2026-01-22 17:51:55.559853', '_unique_id': 'f31c32e6679a43b5bbe66a0927667796'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.560 12 DEBUG ceilometer.compute.pollsters [-] e0a2ef50-95ae-4e45-bedb-f385cf225914/disk.device.read.bytes volume: 30050816 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 DEBUG ceilometer.compute.pollsters [-] c9f0a876-68d5-4c5d-b4cf-62a36101777d/disk.device.read.bytes volume: 30059008 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '54cc98e7-002b-4046-bec5-7cc292498949', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30050816, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914-vda', 'timestamp': '2026-01-22T17:51:55.560965', 'resource_metadata': {'display_name': 'tempest-server-test-1351617608', 'name': 'instance-00000049', 'instance_id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0a804df0-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.292058313, 'message_signature': '00a3bd3dd23e432253937843842535f115fdb639f9a9f1b4ad252bb3e19b8bd8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30059008, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 
'c9f0a876-68d5-4c5d-b4cf-62a36101777d-vda', 'timestamp': '2026-01-22T17:51:55.560965', 'resource_metadata': {'display_name': 'tempest-server-test-1080468316', 'name': 'instance-00000048', 'instance_id': 'c9f0a876-68d5-4c5d-b4cf-62a36101777d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0a80587c-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.302651297, 'message_signature': '177034a4dc848cbefb0ebd026a97c92a6fafa053afc603fcfd50c62736b83ce6'}]}, 'timestamp': '2026-01-22 17:51:55.561487', '_unique_id': 'd7b27b8155db4f98963e8b1f9a64037c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.561 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.562 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.562 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.562 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-1351617608>, <NovaLikeServer: tempest-server-test-1080468316>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1351617608>, <NovaLikeServer: tempest-server-test-1080468316>]
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.563 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.563 12 DEBUG ceilometer.compute.pollsters [-] e0a2ef50-95ae-4e45-bedb-f385cf225914/network.outgoing.bytes volume: 10121 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.563 12 DEBUG ceilometer.compute.pollsters [-] c9f0a876-68d5-4c5d-b4cf-62a36101777d/network.outgoing.bytes volume: 10658 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a30022c-a9d6-4e60-82b7-fb660e62cb76', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10121, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000049-e0a2ef50-95ae-4e45-bedb-f385cf225914-tapfdc3f392-ef', 'timestamp': '2026-01-22T17:51:55.563212', 'resource_metadata': {'display_name': 'tempest-server-test-1351617608', 'name': 'tapfdc3f392-ef', 'instance_id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:99:ce:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfdc3f392-ef'}, 'message_id': '0a80a6ec-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.247992111, 'message_signature': 'a7101c322024c4f923c4c5d4927ef4b6b975d18e9069f267e110a4c1fa88b72e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10658, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000048-c9f0a876-68d5-4c5d-b4cf-62a36101777d-tap1dcdc02a-a9', 'timestamp': '2026-01-22T17:51:55.563212', 'resource_metadata': {'display_name': 'tempest-server-test-1080468316', 'name': 'tap1dcdc02a-a9', 'instance_id': 'c9f0a876-68d5-4c5d-b4cf-62a36101777d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2d:00:4b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1dcdc02a-a9'}, 'message_id': '0a80b286-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.251935817, 'message_signature': 'c160ce1ddd66ae6622d907c502d94a077c0db5e3bed348e76521af101d931fdb'}]}, 'timestamp': '2026-01-22 17:51:55.563792', '_unique_id': '8fec3c51c878475591011e407f5cf68a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.564 12 DEBUG ceilometer.compute.pollsters [-] e0a2ef50-95ae-4e45-bedb-f385cf225914/disk.device.read.requests volume: 1113 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 DEBUG ceilometer.compute.pollsters [-] c9f0a876-68d5-4c5d-b4cf-62a36101777d/disk.device.read.requests volume: 1115 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '86dbd901-6ce0-4b3f-a1ed-96244f8e96ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1113, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914-vda', 'timestamp': '2026-01-22T17:51:55.564889', 'resource_metadata': {'display_name': 'tempest-server-test-1351617608', 'name': 'instance-00000049', 'instance_id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0a80e5e4-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.292058313, 'message_signature': 'a79178453c1cd63c3a2a29e76130eb60f75e3abb8c8566047ba1f441ae374461'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1115, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 
'resource_id': 'c9f0a876-68d5-4c5d-b4cf-62a36101777d-vda', 'timestamp': '2026-01-22T17:51:55.564889', 'resource_metadata': {'display_name': 'tempest-server-test-1080468316', 'name': 'instance-00000048', 'instance_id': 'c9f0a876-68d5-4c5d-b4cf-62a36101777d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0a80edb4-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.302651297, 'message_signature': '59f1917c80864c6be539c8940db1227bbac0f348922ebb4fac082bc4dbeb2f79'}]}, 'timestamp': '2026-01-22 17:51:55.565295', '_unique_id': '626f70488d1a423d8cbb7909d144ffc6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.565 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.566 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.566 12 DEBUG ceilometer.compute.pollsters [-] e0a2ef50-95ae-4e45-bedb-f385cf225914/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.566 12 DEBUG ceilometer.compute.pollsters [-] c9f0a876-68d5-4c5d-b4cf-62a36101777d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad98171a-f8d5-40a5-82d8-8d7af401c461', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000049-e0a2ef50-95ae-4e45-bedb-f385cf225914-tapfdc3f392-ef', 'timestamp': '2026-01-22T17:51:55.566427', 'resource_metadata': {'display_name': 'tempest-server-test-1351617608', 'name': 'tapfdc3f392-ef', 'instance_id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:99:ce:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfdc3f392-ef'}, 'message_id': '0a812220-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.247992111, 'message_signature': '4cf44729e1b6612f432cd924a348b059ca25bb25621a3208ea060c2df63ab497'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000048-c9f0a876-68d5-4c5d-b4cf-62a36101777d-tap1dcdc02a-a9', 'timestamp': '2026-01-22T17:51:55.566427', 'resource_metadata': {'display_name': 'tempest-server-test-1080468316', 'name': 'tap1dcdc02a-a9', 'instance_id': 'c9f0a876-68d5-4c5d-b4cf-62a36101777d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2d:00:4b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1dcdc02a-a9'}, 'message_id': '0a812b4e-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.251935817, 'message_signature': 'b11ea88478e596973de68d113f2fb97e64b25fec588516e22b6baf3830dd9213'}]}, 'timestamp': '2026-01-22 17:51:55.566879', '_unique_id': '6b8473c777e6489282b9a0c0844eb075'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.567 12 DEBUG ceilometer.compute.pollsters [-] e0a2ef50-95ae-4e45-bedb-f385cf225914/disk.device.read.latency volume: 143698523 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 DEBUG ceilometer.compute.pollsters [-] c9f0a876-68d5-4c5d-b4cf-62a36101777d/disk.device.read.latency volume: 168791204 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd8d42d1b-f71b-482c-ab5e-bb9ac05c8b06', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 143698523, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914-vda', 'timestamp': '2026-01-22T17:51:55.567970', 'resource_metadata': {'display_name': 'tempest-server-test-1351617608', 'name': 'instance-00000049', 'instance_id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0a815e3e-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.292058313, 'message_signature': '660068eb515d4c703cb126624fe50f1898cee8cdeee2d19a147a4f5930ba41a2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 168791204, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 
'resource_id': 'c9f0a876-68d5-4c5d-b4cf-62a36101777d-vda', 'timestamp': '2026-01-22T17:51:55.567970', 'resource_metadata': {'display_name': 'tempest-server-test-1080468316', 'name': 'instance-00000048', 'instance_id': 'c9f0a876-68d5-4c5d-b4cf-62a36101777d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0a816604-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.302651297, 'message_signature': 'a42bc7bea8be8dc1859c23c1f9fc82df83db37b41a0afd5e0c3f870590736481'}]}, 'timestamp': '2026-01-22 17:51:55.568376', '_unique_id': '219d9a2ed92c4f7da73166d69261ff6b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.568 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.569 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.569 12 DEBUG ceilometer.compute.pollsters [-] e0a2ef50-95ae-4e45-bedb-f385cf225914/disk.device.write.requests volume: 317 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.569 12 DEBUG ceilometer.compute.pollsters [-] c9f0a876-68d5-4c5d-b4cf-62a36101777d/disk.device.write.requests volume: 344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6300a3e1-6d11-4145-9ae6-3cf8c03aa6e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 317, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914-vda', 'timestamp': '2026-01-22T17:51:55.569450', 'resource_metadata': {'display_name': 'tempest-server-test-1351617608', 'name': 'instance-00000049', 'instance_id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0a81996c-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.292058313, 'message_signature': 'bbed1bcdb7123d3d897bac7fa4f4d805bac0ba6b6b6a9004812053fb301f5181'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 344, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 
'resource_id': 'c9f0a876-68d5-4c5d-b4cf-62a36101777d-vda', 'timestamp': '2026-01-22T17:51:55.569450', 'resource_metadata': {'display_name': 'tempest-server-test-1080468316', 'name': 'instance-00000048', 'instance_id': 'c9f0a876-68d5-4c5d-b4cf-62a36101777d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0a81a380-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.302651297, 'message_signature': 'be4464b6dcbc49d62b8407a17b8faf82a06981ce17bc8cbfdee667148ee2c921'}]}, 'timestamp': '2026-01-22 17:51:55.569950', '_unique_id': 'aaee7d5a79f742dba6b854afa292c1c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.570 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 DEBUG ceilometer.compute.pollsters [-] e0a2ef50-95ae-4e45-bedb-f385cf225914/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 DEBUG ceilometer.compute.pollsters [-] c9f0a876-68d5-4c5d-b4cf-62a36101777d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5856f809-7487-4f6f-9fce-222f299ff33e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914-vda', 'timestamp': '2026-01-22T17:51:55.571013', 'resource_metadata': {'display_name': 'tempest-server-test-1351617608', 'name': 'instance-00000049', 'instance_id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0a81d51c-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.228776076, 'message_signature': '7ea75b49ccb3625762dac4058b74ab0f6c31ee5f43203923227ac74f2ea4c7fe'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 
'c9f0a876-68d5-4c5d-b4cf-62a36101777d-vda', 'timestamp': '2026-01-22T17:51:55.571013', 'resource_metadata': {'display_name': 'tempest-server-test-1080468316', 'name': 'instance-00000048', 'instance_id': 'c9f0a876-68d5-4c5d-b4cf-62a36101777d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0a81dcec-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.2367605, 'message_signature': '7ff9697acee069c924e60e0c8af318630f5a7d79cb599cc51aaa73a400543388'}]}, 'timestamp': '2026-01-22 17:51:55.571447', '_unique_id': 'ef5f3c9a03fd4a07b622b75d4fcb496c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.571 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.572 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.572 12 DEBUG ceilometer.compute.pollsters [-] e0a2ef50-95ae-4e45-bedb-f385cf225914/network.outgoing.packets volume: 115 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.572 12 DEBUG ceilometer.compute.pollsters [-] c9f0a876-68d5-4c5d-b4cf-62a36101777d/network.outgoing.packets volume: 122 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8502c4ef-5c7f-4421-a7b1-4439996b6279', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 115, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000049-e0a2ef50-95ae-4e45-bedb-f385cf225914-tapfdc3f392-ef', 'timestamp': '2026-01-22T17:51:55.572500', 'resource_metadata': {'display_name': 'tempest-server-test-1351617608', 'name': 'tapfdc3f392-ef', 'instance_id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:99:ce:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfdc3f392-ef'}, 'message_id': '0a820f46-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.247992111, 'message_signature': '4d42624aff6ead3a1391157b33dcf2c4d441acde1b997fb5edb2ae775e443694'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 122, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000048-c9f0a876-68d5-4c5d-b4cf-62a36101777d-tap1dcdc02a-a9', 'timestamp': '2026-01-22T17:51:55.572500', 'resource_metadata': {'display_name': 'tempest-server-test-1080468316', 'name': 'tap1dcdc02a-a9', 'instance_id': 'c9f0a876-68d5-4c5d-b4cf-62a36101777d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2d:00:4b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1dcdc02a-a9'}, 'message_id': '0a82182e-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.251935817, 'message_signature': 'f00103c28aac35f8bee50015844b56edfa244b48a0cf0391c5bbbc1747a1bd64'}]}, 'timestamp': '2026-01-22 17:51:55.572943', '_unique_id': 'caf06df1715c46158c96228d0467ec8f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.573 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 DEBUG ceilometer.compute.pollsters [-] e0a2ef50-95ae-4e45-bedb-f385cf225914/memory.usage volume: 42.53515625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 DEBUG ceilometer.compute.pollsters [-] c9f0a876-68d5-4c5d-b4cf-62a36101777d/memory.usage volume: 42.578125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e28b029-9675-4836-8de9-359526423a70', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.53515625, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914', 'timestamp': '2026-01-22T17:51:55.573988', 'resource_metadata': {'display_name': 'tempest-server-test-1351617608', 'name': 'instance-00000049', 'instance_id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '0a82495c-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.27067922, 'message_signature': '1bc09102076e4684f3c426c3a3b6e7777eea6b8365e093d9e691839777b1a572'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.578125, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'c9f0a876-68d5-4c5d-b4cf-62a36101777d', 'timestamp': 
'2026-01-22T17:51:55.573988', 'resource_metadata': {'display_name': 'tempest-server-test-1080468316', 'name': 'instance-00000048', 'instance_id': 'c9f0a876-68d5-4c5d-b4cf-62a36101777d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '0a825118-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.286115824, 'message_signature': 'e5cfda4bb4e6556ed008f9bd09be8de40fbc1ec236f1a4acfedae12d656cd19d'}]}, 'timestamp': '2026-01-22 17:51:55.574391', '_unique_id': 'ac4240778fd84a7f87b13ed84061acc7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.574 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.575 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.575 12 DEBUG ceilometer.compute.pollsters [-] e0a2ef50-95ae-4e45-bedb-f385cf225914/disk.device.write.bytes volume: 72871936 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.575 12 DEBUG ceilometer.compute.pollsters [-] c9f0a876-68d5-4c5d-b4cf-62a36101777d/disk.device.write.bytes volume: 73138176 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2c27c7a9-a4c0-4fe1-815c-98ac7a47452e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72871936, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914-vda', 'timestamp': '2026-01-22T17:51:55.575459', 'resource_metadata': {'display_name': 'tempest-server-test-1351617608', 'name': 'instance-00000049', 'instance_id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0a8282be-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.292058313, 'message_signature': '15be39ba9348dafb5a49f679036894ac3af6d0352158b35fd1391c48587c1f8b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73138176, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 
'c9f0a876-68d5-4c5d-b4cf-62a36101777d-vda', 'timestamp': '2026-01-22T17:51:55.575459', 'resource_metadata': {'display_name': 'tempest-server-test-1080468316', 'name': 'instance-00000048', 'instance_id': 'c9f0a876-68d5-4c5d-b4cf-62a36101777d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0a828b7e-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.302651297, 'message_signature': 'c174665ce71303648b3a6ea5e291df8a35ae6b204a679606aa473d704dd2ffec'}]}, 'timestamp': '2026-01-22 17:51:55.575888', '_unique_id': 'af359c463a0747e29fa9b214c214cbb6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.576 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.577 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.577 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1351617608>, <NovaLikeServer: tempest-server-test-1080468316>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1351617608>, <NovaLikeServer: tempest-server-test-1080468316>]
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.577 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.577 12 DEBUG ceilometer.compute.pollsters [-] e0a2ef50-95ae-4e45-bedb-f385cf225914/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.577 12 DEBUG ceilometer.compute.pollsters [-] c9f0a876-68d5-4c5d-b4cf-62a36101777d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '333705dd-b7f8-4a7d-9f60-2f67f63c2fcf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000049-e0a2ef50-95ae-4e45-bedb-f385cf225914-tapfdc3f392-ef', 'timestamp': '2026-01-22T17:51:55.577278', 'resource_metadata': {'display_name': 'tempest-server-test-1351617608', 'name': 'tapfdc3f392-ef', 'instance_id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:99:ce:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfdc3f392-ef'}, 'message_id': '0a82c9fe-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.247992111, 'message_signature': 'd176b0190613091f2ee31d94fe99ba5f2dfac45f4e8f7453cfac5b6c084ff73d'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000048-c9f0a876-68d5-4c5d-b4cf-62a36101777d-tap1dcdc02a-a9', 'timestamp': '2026-01-22T17:51:55.577278', 'resource_metadata': {'display_name': 'tempest-server-test-1080468316', 'name': 'tap1dcdc02a-a9', 'instance_id': 'c9f0a876-68d5-4c5d-b4cf-62a36101777d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2d:00:4b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1dcdc02a-a9'}, 'message_id': '0a82d214-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6679.251935817, 'message_signature': '68aedb5c7a2ec30b78adf92160bebeba711cc849c69c572aa567a371b0a529af'}]}, 'timestamp': '2026-01-22 17:51:55.577729', '_unique_id': '254eddebe0724ddbb4ee977aee3395e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:51:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:51:55.578 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:51:55 compute-0 nova_compute[183075]: 2026-01-22 17:51:55.995 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:51:56 compute-0 nova_compute[183075]: 2026-01-22 17:51:56.802 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Updating instance_info_cache with network_info: [{"id": "1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0", "address": "fa:16:3e:2d:00:4b", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dcdc02a-a9", "ovs_interfaceid": "1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:51:56 compute-0 nova_compute[183075]: 2026-01-22 17:51:56.820 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-c9f0a876-68d5-4c5d-b4cf-62a36101777d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:51:56 compute-0 nova_compute[183075]: 2026-01-22 17:51:56.820 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:51:56 compute-0 nova_compute[183075]: 2026-01-22 17:51:56.820 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:51:56 compute-0 nova_compute[183075]: 2026-01-22 17:51:56.821 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:51:56 compute-0 nova_compute[183075]: 2026-01-22 17:51:56.821 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:51:58 compute-0 nova_compute[183075]: 2026-01-22 17:51:58.117 183079 INFO nova.compute.manager [None req-5d4b57a2-1aba-4473-815e-2d5b6861837a 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Get console output
Jan 22 17:51:58 compute-0 nova_compute[183075]: 2026-01-22 17:51:58.121 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:51:58 compute-0 podman[242332]: 2026-01-22 17:51:58.357487639 +0000 UTC m=+0.052480979 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:51:59 compute-0 nova_compute[183075]: 2026-01-22 17:51:59.064 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:00 compute-0 nova_compute[183075]: 2026-01-22 17:52:00.345 183079 DEBUG nova.compute.manager [req-02ac1831-5f2b-49c0-bdb5-2c6671bf9a72 req-e8df4895-665a-42c9-8f86-956e4a99c135 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Received event network-changed-1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:52:00 compute-0 nova_compute[183075]: 2026-01-22 17:52:00.346 183079 DEBUG nova.compute.manager [req-02ac1831-5f2b-49c0-bdb5-2c6671bf9a72 req-e8df4895-665a-42c9-8f86-956e4a99c135 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Refreshing instance network info cache due to event network-changed-1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:52:00 compute-0 nova_compute[183075]: 2026-01-22 17:52:00.346 183079 DEBUG oslo_concurrency.lockutils [req-02ac1831-5f2b-49c0-bdb5-2c6671bf9a72 req-e8df4895-665a-42c9-8f86-956e4a99c135 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-c9f0a876-68d5-4c5d-b4cf-62a36101777d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:52:00 compute-0 nova_compute[183075]: 2026-01-22 17:52:00.346 183079 DEBUG oslo_concurrency.lockutils [req-02ac1831-5f2b-49c0-bdb5-2c6671bf9a72 req-e8df4895-665a-42c9-8f86-956e4a99c135 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-c9f0a876-68d5-4c5d-b4cf-62a36101777d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:52:00 compute-0 nova_compute[183075]: 2026-01-22 17:52:00.347 183079 DEBUG nova.network.neutron [req-02ac1831-5f2b-49c0-bdb5-2c6671bf9a72 req-e8df4895-665a-42c9-8f86-956e4a99c135 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Refreshing network info cache for port 1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:52:00 compute-0 nova_compute[183075]: 2026-01-22 17:52:00.998 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:01 compute-0 nova_compute[183075]: 2026-01-22 17:52:01.448 183079 DEBUG nova.network.neutron [req-02ac1831-5f2b-49c0-bdb5-2c6671bf9a72 req-e8df4895-665a-42c9-8f86-956e4a99c135 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Updated VIF entry in instance network info cache for port 1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:52:01 compute-0 nova_compute[183075]: 2026-01-22 17:52:01.449 183079 DEBUG nova.network.neutron [req-02ac1831-5f2b-49c0-bdb5-2c6671bf9a72 req-e8df4895-665a-42c9-8f86-956e4a99c135 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Updating instance_info_cache with network_info: [{"id": "1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0", "address": "fa:16:3e:2d:00:4b", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dcdc02a-a9", "ovs_interfaceid": "1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:52:01 compute-0 nova_compute[183075]: 2026-01-22 17:52:01.499 183079 DEBUG oslo_concurrency.lockutils [req-02ac1831-5f2b-49c0-bdb5-2c6671bf9a72 req-e8df4895-665a-42c9-8f86-956e4a99c135 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-c9f0a876-68d5-4c5d-b4cf-62a36101777d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.135 183079 DEBUG oslo_concurrency.lockutils [None req-54fa1f6c-1109-4262-ada6-91e965f53f22 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "e0a2ef50-95ae-4e45-bedb-f385cf225914" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.136 183079 DEBUG oslo_concurrency.lockutils [None req-54fa1f6c-1109-4262-ada6-91e965f53f22 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "e0a2ef50-95ae-4e45-bedb-f385cf225914" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.136 183079 DEBUG oslo_concurrency.lockutils [None req-54fa1f6c-1109-4262-ada6-91e965f53f22 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "e0a2ef50-95ae-4e45-bedb-f385cf225914-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.136 183079 DEBUG oslo_concurrency.lockutils [None req-54fa1f6c-1109-4262-ada6-91e965f53f22 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "e0a2ef50-95ae-4e45-bedb-f385cf225914-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.136 183079 DEBUG oslo_concurrency.lockutils [None req-54fa1f6c-1109-4262-ada6-91e965f53f22 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "e0a2ef50-95ae-4e45-bedb-f385cf225914-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.137 183079 INFO nova.compute.manager [None req-54fa1f6c-1109-4262-ada6-91e965f53f22 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Terminating instance
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.138 183079 DEBUG nova.compute.manager [None req-54fa1f6c-1109-4262-ada6-91e965f53f22 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:52:02 compute-0 kernel: tapfdc3f392-ef (unregistering): left promiscuous mode
Jan 22 17:52:02 compute-0 NetworkManager[55454]: <info>  [1769104322.1590] device (tapfdc3f392-ef): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:52:02 compute-0 ovn_controller[95372]: 2026-01-22T17:52:02Z|00815|binding|INFO|Releasing lport fdc3f392-ef15-4cba-9920-303c2d328978 from this chassis (sb_readonly=0)
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.168 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:02 compute-0 ovn_controller[95372]: 2026-01-22T17:52:02Z|00816|binding|INFO|Setting lport fdc3f392-ef15-4cba-9920-303c2d328978 down in Southbound
Jan 22 17:52:02 compute-0 ovn_controller[95372]: 2026-01-22T17:52:02Z|00817|binding|INFO|Removing iface tapfdc3f392-ef ovn-installed in OVS
Jan 22 17:52:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:02.180 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:99:ce:f5 10.100.0.6'], port_security=['fa:16:3e:99:ce:f5 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'e0a2ef50-95ae-4e45-bedb-f385cf225914', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f61e4b27-bb7c-42d2-a372-c8137640f8ec', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.207'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=fdc3f392-ef15-4cba-9920-303c2d328978) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.181 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:02.182 104629 INFO neutron.agent.ovn.metadata.agent [-] Port fdc3f392-ef15-4cba-9920-303c2d328978 in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 unbound from our chassis
Jan 22 17:52:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:02.183 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:52:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:02.197 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[205343c8-66c0-438d-a592-b89cb77ab7af]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:52:02 compute-0 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d00000049.scope: Deactivated successfully.
Jan 22 17:52:02 compute-0 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d00000049.scope: Consumed 13.336s CPU time.
Jan 22 17:52:02 compute-0 systemd-machined[154382]: Machine qemu-73-instance-00000049 terminated.
Jan 22 17:52:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:02.226 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[09c61785-1a7d-4a7a-9db2-0cd09b1cb8bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:52:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:02.229 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[685c9afc-4f2e-4717-8f66-3c92264f742d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:52:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:02.251 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[cc16fcbf-e4b4-4228-b37b-43879064af8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:52:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:02.266 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2dd578e4-8e08-4607-993a-8c5682e62ce7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 203, 'tx_packets': 105, 'rx_bytes': 17350, 'tx_bytes': 11992, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 203, 'tx_packets': 105, 'rx_bytes': 17350, 'tx_bytes': 11992, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 217], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 662414, 'reachable_time': 31076, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242370, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:52:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:02.278 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e6b2b234-2683-4367-ba68-94a3de11b634]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 662427, 'tstamp': 662427}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242371, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 662429, 'tstamp': 662429}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242371, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:52:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:02.279 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.280 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.284 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:02.285 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88ed9213-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:52:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:02.285 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:52:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:02.285 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88ed9213-70, col_values=(('external_ids', {'iface-id': 'da93a32e-eed6-4124-8f08-78b0bfe2cb69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:52:02 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:02.286 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.356 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.360 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.386 183079 INFO nova.virt.libvirt.driver [-] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Instance destroyed successfully.
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.387 183079 DEBUG nova.objects.instance [None req-54fa1f6c-1109-4262-ada6-91e965f53f22 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'resources' on Instance uuid e0a2ef50-95ae-4e45-bedb-f385cf225914 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.400 183079 DEBUG nova.virt.libvirt.vif [None req-54fa1f6c-1109-4262-ada6-91e965f53f22 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:51:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1351617608',display_name='tempest-server-test-1351617608',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1351617608',id=73,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:51:31Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-qcrtjop2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_h
w_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:51:31Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=e0a2ef50-95ae-4e45-bedb-f385cf225914,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fdc3f392-ef15-4cba-9920-303c2d328978", "address": "fa:16:3e:99:ce:f5", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdc3f392-ef", "ovs_interfaceid": "fdc3f392-ef15-4cba-9920-303c2d328978", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.401 183079 DEBUG nova.network.os_vif_util [None req-54fa1f6c-1109-4262-ada6-91e965f53f22 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "fdc3f392-ef15-4cba-9920-303c2d328978", "address": "fa:16:3e:99:ce:f5", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdc3f392-ef", "ovs_interfaceid": "fdc3f392-ef15-4cba-9920-303c2d328978", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.401 183079 DEBUG nova.network.os_vif_util [None req-54fa1f6c-1109-4262-ada6-91e965f53f22 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:99:ce:f5,bridge_name='br-int',has_traffic_filtering=True,id=fdc3f392-ef15-4cba-9920-303c2d328978,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapfdc3f392-ef') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.402 183079 DEBUG os_vif [None req-54fa1f6c-1109-4262-ada6-91e965f53f22 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:99:ce:f5,bridge_name='br-int',has_traffic_filtering=True,id=fdc3f392-ef15-4cba-9920-303c2d328978,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapfdc3f392-ef') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.403 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.403 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfdc3f392-ef, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.429 183079 DEBUG nova.compute.manager [req-be0014aa-c626-4e2e-ae95-61e869edc1c6 req-2cdcbdaf-8ea9-4abe-8166-c43329088039 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Received event network-changed-fdc3f392-ef15-4cba-9920-303c2d328978 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.430 183079 DEBUG nova.compute.manager [req-be0014aa-c626-4e2e-ae95-61e869edc1c6 req-2cdcbdaf-8ea9-4abe-8166-c43329088039 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Refreshing instance network info cache due to event network-changed-fdc3f392-ef15-4cba-9920-303c2d328978. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.430 183079 DEBUG oslo_concurrency.lockutils [req-be0014aa-c626-4e2e-ae95-61e869edc1c6 req-2cdcbdaf-8ea9-4abe-8166-c43329088039 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-e0a2ef50-95ae-4e45-bedb-f385cf225914" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.431 183079 DEBUG oslo_concurrency.lockutils [req-be0014aa-c626-4e2e-ae95-61e869edc1c6 req-2cdcbdaf-8ea9-4abe-8166-c43329088039 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-e0a2ef50-95ae-4e45-bedb-f385cf225914" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.431 183079 DEBUG nova.network.neutron [req-be0014aa-c626-4e2e-ae95-61e869edc1c6 req-2cdcbdaf-8ea9-4abe-8166-c43329088039 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Refreshing network info cache for port fdc3f392-ef15-4cba-9920-303c2d328978 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.444 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.446 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.448 183079 INFO os_vif [None req-54fa1f6c-1109-4262-ada6-91e965f53f22 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:99:ce:f5,bridge_name='br-int',has_traffic_filtering=True,id=fdc3f392-ef15-4cba-9920-303c2d328978,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapfdc3f392-ef')
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.449 183079 INFO nova.virt.libvirt.driver [None req-54fa1f6c-1109-4262-ada6-91e965f53f22 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Deleting instance files /var/lib/nova/instances/e0a2ef50-95ae-4e45-bedb-f385cf225914_del
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.450 183079 INFO nova.virt.libvirt.driver [None req-54fa1f6c-1109-4262-ada6-91e965f53f22 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Deletion of /var/lib/nova/instances/e0a2ef50-95ae-4e45-bedb-f385cf225914_del complete
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.510 183079 INFO nova.compute.manager [None req-54fa1f6c-1109-4262-ada6-91e965f53f22 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Took 0.37 seconds to destroy the instance on the hypervisor.
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.511 183079 DEBUG oslo.service.loopingcall [None req-54fa1f6c-1109-4262-ada6-91e965f53f22 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.511 183079 DEBUG nova.compute.manager [-] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.511 183079 DEBUG nova.network.neutron [-] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.711 183079 DEBUG nova.compute.manager [req-773179d6-d0ef-4041-9218-14dd633a8b47 req-248760bf-54af-414b-afb6-e6ccf20b76bc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Received event network-vif-unplugged-fdc3f392-ef15-4cba-9920-303c2d328978 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.712 183079 DEBUG oslo_concurrency.lockutils [req-773179d6-d0ef-4041-9218-14dd633a8b47 req-248760bf-54af-414b-afb6-e6ccf20b76bc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "e0a2ef50-95ae-4e45-bedb-f385cf225914-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.712 183079 DEBUG oslo_concurrency.lockutils [req-773179d6-d0ef-4041-9218-14dd633a8b47 req-248760bf-54af-414b-afb6-e6ccf20b76bc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e0a2ef50-95ae-4e45-bedb-f385cf225914-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.712 183079 DEBUG oslo_concurrency.lockutils [req-773179d6-d0ef-4041-9218-14dd633a8b47 req-248760bf-54af-414b-afb6-e6ccf20b76bc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e0a2ef50-95ae-4e45-bedb-f385cf225914-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.713 183079 DEBUG nova.compute.manager [req-773179d6-d0ef-4041-9218-14dd633a8b47 req-248760bf-54af-414b-afb6-e6ccf20b76bc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] No waiting events found dispatching network-vif-unplugged-fdc3f392-ef15-4cba-9920-303c2d328978 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:52:02 compute-0 nova_compute[183075]: 2026-01-22 17:52:02.713 183079 DEBUG nova.compute.manager [req-773179d6-d0ef-4041-9218-14dd633a8b47 req-248760bf-54af-414b-afb6-e6ccf20b76bc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Received event network-vif-unplugged-fdc3f392-ef15-4cba-9920-303c2d328978 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:52:03 compute-0 nova_compute[183075]: 2026-01-22 17:52:03.376 183079 DEBUG nova.network.neutron [-] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:52:03 compute-0 nova_compute[183075]: 2026-01-22 17:52:03.393 183079 INFO nova.compute.manager [-] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Took 0.88 seconds to deallocate network for instance.
Jan 22 17:52:03 compute-0 nova_compute[183075]: 2026-01-22 17:52:03.436 183079 DEBUG oslo_concurrency.lockutils [None req-54fa1f6c-1109-4262-ada6-91e965f53f22 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:52:03 compute-0 nova_compute[183075]: 2026-01-22 17:52:03.436 183079 DEBUG oslo_concurrency.lockutils [None req-54fa1f6c-1109-4262-ada6-91e965f53f22 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:52:03 compute-0 nova_compute[183075]: 2026-01-22 17:52:03.645 183079 DEBUG nova.network.neutron [req-be0014aa-c626-4e2e-ae95-61e869edc1c6 req-2cdcbdaf-8ea9-4abe-8166-c43329088039 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Updated VIF entry in instance network info cache for port fdc3f392-ef15-4cba-9920-303c2d328978. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:52:03 compute-0 nova_compute[183075]: 2026-01-22 17:52:03.646 183079 DEBUG nova.network.neutron [req-be0014aa-c626-4e2e-ae95-61e869edc1c6 req-2cdcbdaf-8ea9-4abe-8166-c43329088039 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Updating instance_info_cache with network_info: [{"id": "fdc3f392-ef15-4cba-9920-303c2d328978", "address": "fa:16:3e:99:ce:f5", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdc3f392-ef", "ovs_interfaceid": "fdc3f392-ef15-4cba-9920-303c2d328978", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:52:03 compute-0 nova_compute[183075]: 2026-01-22 17:52:03.667 183079 DEBUG oslo_concurrency.lockutils [req-be0014aa-c626-4e2e-ae95-61e869edc1c6 req-2cdcbdaf-8ea9-4abe-8166-c43329088039 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-e0a2ef50-95ae-4e45-bedb-f385cf225914" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:52:03 compute-0 nova_compute[183075]: 2026-01-22 17:52:03.702 183079 DEBUG nova.compute.provider_tree [None req-54fa1f6c-1109-4262-ada6-91e965f53f22 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:52:03 compute-0 nova_compute[183075]: 2026-01-22 17:52:03.715 183079 DEBUG nova.scheduler.client.report [None req-54fa1f6c-1109-4262-ada6-91e965f53f22 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:52:03 compute-0 nova_compute[183075]: 2026-01-22 17:52:03.735 183079 DEBUG oslo_concurrency.lockutils [None req-54fa1f6c-1109-4262-ada6-91e965f53f22 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.299s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:52:03 compute-0 nova_compute[183075]: 2026-01-22 17:52:03.760 183079 INFO nova.scheduler.client.report [None req-54fa1f6c-1109-4262-ada6-91e965f53f22 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Deleted allocations for instance e0a2ef50-95ae-4e45-bedb-f385cf225914
Jan 22 17:52:03 compute-0 nova_compute[183075]: 2026-01-22 17:52:03.816 183079 DEBUG oslo_concurrency.lockutils [None req-54fa1f6c-1109-4262-ada6-91e965f53f22 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "e0a2ef50-95ae-4e45-bedb-f385cf225914" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.680s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:52:04 compute-0 nova_compute[183075]: 2026-01-22 17:52:04.067 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:04 compute-0 nova_compute[183075]: 2026-01-22 17:52:04.783 183079 DEBUG nova.compute.manager [req-1f1c8a15-63e0-4a24-a210-18b32512d7e4 req-66660bfe-330f-453e-819f-2ef26f949add a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Received event network-vif-plugged-fdc3f392-ef15-4cba-9920-303c2d328978 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:52:04 compute-0 nova_compute[183075]: 2026-01-22 17:52:04.783 183079 DEBUG oslo_concurrency.lockutils [req-1f1c8a15-63e0-4a24-a210-18b32512d7e4 req-66660bfe-330f-453e-819f-2ef26f949add a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "e0a2ef50-95ae-4e45-bedb-f385cf225914-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:52:04 compute-0 nova_compute[183075]: 2026-01-22 17:52:04.783 183079 DEBUG oslo_concurrency.lockutils [req-1f1c8a15-63e0-4a24-a210-18b32512d7e4 req-66660bfe-330f-453e-819f-2ef26f949add a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e0a2ef50-95ae-4e45-bedb-f385cf225914-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:52:04 compute-0 nova_compute[183075]: 2026-01-22 17:52:04.784 183079 DEBUG oslo_concurrency.lockutils [req-1f1c8a15-63e0-4a24-a210-18b32512d7e4 req-66660bfe-330f-453e-819f-2ef26f949add a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e0a2ef50-95ae-4e45-bedb-f385cf225914-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:52:04 compute-0 nova_compute[183075]: 2026-01-22 17:52:04.784 183079 DEBUG nova.compute.manager [req-1f1c8a15-63e0-4a24-a210-18b32512d7e4 req-66660bfe-330f-453e-819f-2ef26f949add a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] No waiting events found dispatching network-vif-plugged-fdc3f392-ef15-4cba-9920-303c2d328978 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:52:04 compute-0 nova_compute[183075]: 2026-01-22 17:52:04.784 183079 WARNING nova.compute.manager [req-1f1c8a15-63e0-4a24-a210-18b32512d7e4 req-66660bfe-330f-453e-819f-2ef26f949add a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Received unexpected event network-vif-plugged-fdc3f392-ef15-4cba-9920-303c2d328978 for instance with vm_state deleted and task_state None.
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.047 183079 DEBUG oslo_concurrency.lockutils [None req-7261a3f8-ce15-40ec-aa84-f40929980cf2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "c9f0a876-68d5-4c5d-b4cf-62a36101777d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.048 183079 DEBUG oslo_concurrency.lockutils [None req-7261a3f8-ce15-40ec-aa84-f40929980cf2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "c9f0a876-68d5-4c5d-b4cf-62a36101777d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.048 183079 DEBUG oslo_concurrency.lockutils [None req-7261a3f8-ce15-40ec-aa84-f40929980cf2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "c9f0a876-68d5-4c5d-b4cf-62a36101777d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.049 183079 DEBUG oslo_concurrency.lockutils [None req-7261a3f8-ce15-40ec-aa84-f40929980cf2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "c9f0a876-68d5-4c5d-b4cf-62a36101777d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.049 183079 DEBUG oslo_concurrency.lockutils [None req-7261a3f8-ce15-40ec-aa84-f40929980cf2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "c9f0a876-68d5-4c5d-b4cf-62a36101777d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.050 183079 INFO nova.compute.manager [None req-7261a3f8-ce15-40ec-aa84-f40929980cf2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Terminating instance
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.051 183079 DEBUG nova.compute.manager [None req-7261a3f8-ce15-40ec-aa84-f40929980cf2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:52:05 compute-0 kernel: tap1dcdc02a-a9 (unregistering): left promiscuous mode
Jan 22 17:52:05 compute-0 NetworkManager[55454]: <info>  [1769104325.0796] device (tap1dcdc02a-a9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:52:05 compute-0 ovn_controller[95372]: 2026-01-22T17:52:05Z|00818|binding|INFO|Releasing lport 1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0 from this chassis (sb_readonly=0)
Jan 22 17:52:05 compute-0 ovn_controller[95372]: 2026-01-22T17:52:05Z|00819|binding|INFO|Setting lport 1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0 down in Southbound
Jan 22 17:52:05 compute-0 ovn_controller[95372]: 2026-01-22T17:52:05Z|00820|binding|INFO|Removing iface tap1dcdc02a-a9 ovn-installed in OVS
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.084 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.087 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:05.091 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:00:4b 10.100.0.3'], port_security=['fa:16:3e:2d:00:4b 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'c9f0a876-68d5-4c5d-b4cf-62a36101777d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f61e4b27-bb7c-42d2-a372-c8137640f8ec', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.208'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:52:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:05.092 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0 in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 unbound from our chassis
Jan 22 17:52:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:05.093 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:52:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:05.093 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ab623a4d-40fe-471c-870a-2ca3e8d17195]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:52:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:05.094 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 namespace which is not needed anymore
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.100 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:05 compute-0 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d00000048.scope: Deactivated successfully.
Jan 22 17:52:05 compute-0 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d00000048.scope: Consumed 14.009s CPU time.
Jan 22 17:52:05 compute-0 systemd-machined[154382]: Machine qemu-72-instance-00000048 terminated.
Jan 22 17:52:05 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241968]: [NOTICE]   (241972) : haproxy version is 2.8.14-c23fe91
Jan 22 17:52:05 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241968]: [NOTICE]   (241972) : path to executable is /usr/sbin/haproxy
Jan 22 17:52:05 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241968]: [WARNING]  (241972) : Exiting Master process...
Jan 22 17:52:05 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241968]: [WARNING]  (241972) : Exiting Master process...
Jan 22 17:52:05 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241968]: [ALERT]    (241972) : Current worker (241974) exited with code 143 (Terminated)
Jan 22 17:52:05 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[241968]: [WARNING]  (241972) : All workers exited. Exiting... (0)
Jan 22 17:52:05 compute-0 systemd[1]: libpod-f3df2a8422e17d4c67e5c57ad760c65a8ddd7ed62aae9781365514a97ea51b25.scope: Deactivated successfully.
Jan 22 17:52:05 compute-0 podman[242409]: 2026-01-22 17:52:05.223888381 +0000 UTC m=+0.045492041 container died f3df2a8422e17d4c67e5c57ad760c65a8ddd7ed62aae9781365514a97ea51b25 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 17:52:05 compute-0 systemd[1]: var-lib-containers-storage-overlay-8a02df10c0bfff98e76b6cbd052239ed02779ddd2e3bb9f5e5294c4af2de3a08-merged.mount: Deactivated successfully.
Jan 22 17:52:05 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f3df2a8422e17d4c67e5c57ad760c65a8ddd7ed62aae9781365514a97ea51b25-userdata-shm.mount: Deactivated successfully.
Jan 22 17:52:05 compute-0 podman[242409]: 2026-01-22 17:52:05.264032997 +0000 UTC m=+0.085636657 container cleanup f3df2a8422e17d4c67e5c57ad760c65a8ddd7ed62aae9781365514a97ea51b25 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 22 17:52:05 compute-0 NetworkManager[55454]: <info>  [1769104325.2716] manager: (tap1dcdc02a-a9): new Tun device (/org/freedesktop/NetworkManager/Devices/325)
Jan 22 17:52:05 compute-0 systemd[1]: libpod-conmon-f3df2a8422e17d4c67e5c57ad760c65a8ddd7ed62aae9781365514a97ea51b25.scope: Deactivated successfully.
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.273 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.279 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.304 183079 INFO nova.virt.libvirt.driver [-] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Instance destroyed successfully.
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.305 183079 DEBUG nova.objects.instance [None req-7261a3f8-ce15-40ec-aa84-f40929980cf2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'resources' on Instance uuid c9f0a876-68d5-4c5d-b4cf-62a36101777d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.318 183079 DEBUG nova.virt.libvirt.vif [None req-7261a3f8-ce15-40ec-aa84-f40929980cf2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:50:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1080468316',display_name='tempest-server-test-1080468316',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1080468316',id=72,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:51:01Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-sgonrbi7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:51:01Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=c9f0a876-68d5-4c5d-b4cf-62a36101777d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0", "address": "fa:16:3e:2d:00:4b", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dcdc02a-a9", "ovs_interfaceid": "1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.319 183079 DEBUG nova.network.os_vif_util [None req-7261a3f8-ce15-40ec-aa84-f40929980cf2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0", "address": "fa:16:3e:2d:00:4b", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dcdc02a-a9", "ovs_interfaceid": "1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.319 183079 DEBUG nova.network.os_vif_util [None req-7261a3f8-ce15-40ec-aa84-f40929980cf2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2d:00:4b,bridge_name='br-int',has_traffic_filtering=True,id=1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1dcdc02a-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.320 183079 DEBUG os_vif [None req-7261a3f8-ce15-40ec-aa84-f40929980cf2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2d:00:4b,bridge_name='br-int',has_traffic_filtering=True,id=1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1dcdc02a-a9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.321 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.322 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1dcdc02a-a9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.323 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.325 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.327 183079 INFO os_vif [None req-7261a3f8-ce15-40ec-aa84-f40929980cf2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2d:00:4b,bridge_name='br-int',has_traffic_filtering=True,id=1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1dcdc02a-a9')
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.328 183079 INFO nova.virt.libvirt.driver [None req-7261a3f8-ce15-40ec-aa84-f40929980cf2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Deleting instance files /var/lib/nova/instances/c9f0a876-68d5-4c5d-b4cf-62a36101777d_del
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.329 183079 INFO nova.virt.libvirt.driver [None req-7261a3f8-ce15-40ec-aa84-f40929980cf2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Deletion of /var/lib/nova/instances/c9f0a876-68d5-4c5d-b4cf-62a36101777d_del complete
Jan 22 17:52:05 compute-0 podman[242441]: 2026-01-22 17:52:05.337993391 +0000 UTC m=+0.046748875 container remove f3df2a8422e17d4c67e5c57ad760c65a8ddd7ed62aae9781365514a97ea51b25 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:52:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:05.345 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2213b69c-bdea-469c-a3ac-47610df0c6ed]: (4, ('Thu Jan 22 05:52:05 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 (f3df2a8422e17d4c67e5c57ad760c65a8ddd7ed62aae9781365514a97ea51b25)\nf3df2a8422e17d4c67e5c57ad760c65a8ddd7ed62aae9781365514a97ea51b25\nThu Jan 22 05:52:05 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 (f3df2a8422e17d4c67e5c57ad760c65a8ddd7ed62aae9781365514a97ea51b25)\nf3df2a8422e17d4c67e5c57ad760c65a8ddd7ed62aae9781365514a97ea51b25\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:52:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:05.346 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d53fd612-c0a0-40f2-9046-9f9cab60be9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:52:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:05.347 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.349 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:05 compute-0 kernel: tap88ed9213-70: left promiscuous mode
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.362 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:05.365 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[686f9387-6df9-4c84-a768-44d39ccfe0c7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:52:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:05.377 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[9c139d64-dc96-4252-8349-f758baff247b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.378 183079 INFO nova.compute.manager [None req-7261a3f8-ce15-40ec-aa84-f40929980cf2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Took 0.33 seconds to destroy the instance on the hypervisor.
Jan 22 17:52:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:05.378 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c7fd4a30-e5e0-4264-8e65-753730c8d7d7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.378 183079 DEBUG oslo.service.loopingcall [None req-7261a3f8-ce15-40ec-aa84-f40929980cf2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.379 183079 DEBUG nova.compute.manager [-] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.379 183079 DEBUG nova.network.neutron [-] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:52:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:05.394 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3fe47863-f102-424d-b787-199ae50c5bfa]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 662408, 'reachable_time': 36716, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242469, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:52:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:05.397 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:52:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:05.397 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[369f7c9f-39c0-4ff8-a6cf-ee737ff53550]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:52:05 compute-0 systemd[1]: run-netns-ovnmeta\x2d88ed9213\x2d7d48\x2d4c42\x2dad60\x2dce3fcf104f71.mount: Deactivated successfully.
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.835 183079 DEBUG nova.compute.manager [req-045abd1f-c7d7-4d23-b7ed-0670b39fbc36 req-5e135ff1-1474-47ea-9ce1-d32cdafb705b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Received event network-vif-unplugged-1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.836 183079 DEBUG oslo_concurrency.lockutils [req-045abd1f-c7d7-4d23-b7ed-0670b39fbc36 req-5e135ff1-1474-47ea-9ce1-d32cdafb705b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c9f0a876-68d5-4c5d-b4cf-62a36101777d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.836 183079 DEBUG oslo_concurrency.lockutils [req-045abd1f-c7d7-4d23-b7ed-0670b39fbc36 req-5e135ff1-1474-47ea-9ce1-d32cdafb705b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c9f0a876-68d5-4c5d-b4cf-62a36101777d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.837 183079 DEBUG oslo_concurrency.lockutils [req-045abd1f-c7d7-4d23-b7ed-0670b39fbc36 req-5e135ff1-1474-47ea-9ce1-d32cdafb705b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c9f0a876-68d5-4c5d-b4cf-62a36101777d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.837 183079 DEBUG nova.compute.manager [req-045abd1f-c7d7-4d23-b7ed-0670b39fbc36 req-5e135ff1-1474-47ea-9ce1-d32cdafb705b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] No waiting events found dispatching network-vif-unplugged-1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:52:05 compute-0 nova_compute[183075]: 2026-01-22 17:52:05.837 183079 DEBUG nova.compute.manager [req-045abd1f-c7d7-4d23-b7ed-0670b39fbc36 req-5e135ff1-1474-47ea-9ce1-d32cdafb705b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Received event network-vif-unplugged-1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:52:06 compute-0 nova_compute[183075]: 2026-01-22 17:52:06.502 183079 DEBUG nova.network.neutron [-] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:52:06 compute-0 nova_compute[183075]: 2026-01-22 17:52:06.519 183079 INFO nova.compute.manager [-] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Took 1.14 seconds to deallocate network for instance.
Jan 22 17:52:06 compute-0 nova_compute[183075]: 2026-01-22 17:52:06.570 183079 DEBUG oslo_concurrency.lockutils [None req-7261a3f8-ce15-40ec-aa84-f40929980cf2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:52:06 compute-0 nova_compute[183075]: 2026-01-22 17:52:06.570 183079 DEBUG oslo_concurrency.lockutils [None req-7261a3f8-ce15-40ec-aa84-f40929980cf2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:52:06 compute-0 nova_compute[183075]: 2026-01-22 17:52:06.616 183079 DEBUG nova.compute.provider_tree [None req-7261a3f8-ce15-40ec-aa84-f40929980cf2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:52:06 compute-0 nova_compute[183075]: 2026-01-22 17:52:06.637 183079 DEBUG nova.scheduler.client.report [None req-7261a3f8-ce15-40ec-aa84-f40929980cf2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:52:06 compute-0 nova_compute[183075]: 2026-01-22 17:52:06.778 183079 DEBUG oslo_concurrency.lockutils [None req-7261a3f8-ce15-40ec-aa84-f40929980cf2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:52:06 compute-0 nova_compute[183075]: 2026-01-22 17:52:06.801 183079 INFO nova.scheduler.client.report [None req-7261a3f8-ce15-40ec-aa84-f40929980cf2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Deleted allocations for instance c9f0a876-68d5-4c5d-b4cf-62a36101777d
Jan 22 17:52:06 compute-0 nova_compute[183075]: 2026-01-22 17:52:06.885 183079 DEBUG oslo_concurrency.lockutils [None req-7261a3f8-ce15-40ec-aa84-f40929980cf2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "c9f0a876-68d5-4c5d-b4cf-62a36101777d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.837s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:52:07 compute-0 nova_compute[183075]: 2026-01-22 17:52:07.963 183079 DEBUG nova.compute.manager [req-f7e355f2-8002-42a6-a3a0-4805ef543e0d req-9111968d-e7e0-4c3f-a4cd-187027d6485c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Received event network-vif-plugged-1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:52:07 compute-0 nova_compute[183075]: 2026-01-22 17:52:07.963 183079 DEBUG oslo_concurrency.lockutils [req-f7e355f2-8002-42a6-a3a0-4805ef543e0d req-9111968d-e7e0-4c3f-a4cd-187027d6485c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "c9f0a876-68d5-4c5d-b4cf-62a36101777d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:52:07 compute-0 nova_compute[183075]: 2026-01-22 17:52:07.964 183079 DEBUG oslo_concurrency.lockutils [req-f7e355f2-8002-42a6-a3a0-4805ef543e0d req-9111968d-e7e0-4c3f-a4cd-187027d6485c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c9f0a876-68d5-4c5d-b4cf-62a36101777d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:52:07 compute-0 nova_compute[183075]: 2026-01-22 17:52:07.964 183079 DEBUG oslo_concurrency.lockutils [req-f7e355f2-8002-42a6-a3a0-4805ef543e0d req-9111968d-e7e0-4c3f-a4cd-187027d6485c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "c9f0a876-68d5-4c5d-b4cf-62a36101777d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:52:07 compute-0 nova_compute[183075]: 2026-01-22 17:52:07.964 183079 DEBUG nova.compute.manager [req-f7e355f2-8002-42a6-a3a0-4805ef543e0d req-9111968d-e7e0-4c3f-a4cd-187027d6485c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] No waiting events found dispatching network-vif-plugged-1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:52:07 compute-0 nova_compute[183075]: 2026-01-22 17:52:07.965 183079 WARNING nova.compute.manager [req-f7e355f2-8002-42a6-a3a0-4805ef543e0d req-9111968d-e7e0-4c3f-a4cd-187027d6485c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Received unexpected event network-vif-plugged-1dcdc02a-a93d-49f3-9e2b-8c4ade1238f0 for instance with vm_state deleted and task_state None.
Jan 22 17:52:09 compute-0 nova_compute[183075]: 2026-01-22 17:52:09.067 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:09 compute-0 podman[242472]: 2026-01-22 17:52:09.352134216 +0000 UTC m=+0.056568428 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_id=openstack_network_exporter, vcs-type=git, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.buildah.version=1.33.7, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public)
Jan 22 17:52:09 compute-0 podman[242471]: 2026-01-22 17:52:09.370724335 +0000 UTC m=+0.077403727 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:52:09 compute-0 podman[242470]: 2026-01-22 17:52:09.397101272 +0000 UTC m=+0.107003681 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 22 17:52:10 compute-0 nova_compute[183075]: 2026-01-22 17:52:10.325 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:10 compute-0 nova_compute[183075]: 2026-01-22 17:52:10.628 183079 DEBUG oslo_concurrency.lockutils [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "eadc13fe-ef8b-46e5-af17-40b408aa5aff" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:52:10 compute-0 nova_compute[183075]: 2026-01-22 17:52:10.630 183079 DEBUG oslo_concurrency.lockutils [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "eadc13fe-ef8b-46e5-af17-40b408aa5aff" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:52:10 compute-0 nova_compute[183075]: 2026-01-22 17:52:10.650 183079 DEBUG nova.compute.manager [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:52:10 compute-0 nova_compute[183075]: 2026-01-22 17:52:10.721 183079 DEBUG oslo_concurrency.lockutils [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:52:10 compute-0 nova_compute[183075]: 2026-01-22 17:52:10.722 183079 DEBUG oslo_concurrency.lockutils [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:52:10 compute-0 nova_compute[183075]: 2026-01-22 17:52:10.728 183079 DEBUG nova.virt.hardware [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:52:10 compute-0 nova_compute[183075]: 2026-01-22 17:52:10.729 183079 INFO nova.compute.claims [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:52:10 compute-0 nova_compute[183075]: 2026-01-22 17:52:10.818 183079 DEBUG nova.compute.provider_tree [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:52:10 compute-0 nova_compute[183075]: 2026-01-22 17:52:10.835 183079 DEBUG nova.scheduler.client.report [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:52:10 compute-0 nova_compute[183075]: 2026-01-22 17:52:10.862 183079 DEBUG oslo_concurrency.lockutils [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:52:10 compute-0 nova_compute[183075]: 2026-01-22 17:52:10.864 183079 DEBUG nova.compute.manager [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:52:10 compute-0 nova_compute[183075]: 2026-01-22 17:52:10.980 183079 DEBUG nova.compute.manager [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:52:10 compute-0 nova_compute[183075]: 2026-01-22 17:52:10.981 183079 DEBUG nova.network.neutron [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:52:11 compute-0 nova_compute[183075]: 2026-01-22 17:52:11.005 183079 INFO nova.virt.libvirt.driver [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:52:11 compute-0 nova_compute[183075]: 2026-01-22 17:52:11.024 183079 DEBUG nova.compute.manager [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:52:11 compute-0 nova_compute[183075]: 2026-01-22 17:52:11.109 183079 DEBUG nova.compute.manager [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:52:11 compute-0 nova_compute[183075]: 2026-01-22 17:52:11.110 183079 DEBUG nova.virt.libvirt.driver [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:52:11 compute-0 nova_compute[183075]: 2026-01-22 17:52:11.111 183079 INFO nova.virt.libvirt.driver [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Creating image(s)
Jan 22 17:52:11 compute-0 nova_compute[183075]: 2026-01-22 17:52:11.112 183079 DEBUG oslo_concurrency.lockutils [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "/var/lib/nova/instances/eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:52:11 compute-0 nova_compute[183075]: 2026-01-22 17:52:11.112 183079 DEBUG oslo_concurrency.lockutils [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:52:11 compute-0 nova_compute[183075]: 2026-01-22 17:52:11.113 183079 DEBUG oslo_concurrency.lockutils [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:52:11 compute-0 nova_compute[183075]: 2026-01-22 17:52:11.130 183079 DEBUG oslo_concurrency.processutils [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:52:11 compute-0 nova_compute[183075]: 2026-01-22 17:52:11.206 183079 DEBUG oslo_concurrency.processutils [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:52:11 compute-0 nova_compute[183075]: 2026-01-22 17:52:11.207 183079 DEBUG oslo_concurrency.lockutils [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:52:11 compute-0 nova_compute[183075]: 2026-01-22 17:52:11.207 183079 DEBUG oslo_concurrency.lockutils [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:52:11 compute-0 nova_compute[183075]: 2026-01-22 17:52:11.220 183079 DEBUG oslo_concurrency.processutils [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:52:11 compute-0 nova_compute[183075]: 2026-01-22 17:52:11.283 183079 DEBUG oslo_concurrency.processutils [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:52:11 compute-0 nova_compute[183075]: 2026-01-22 17:52:11.284 183079 DEBUG oslo_concurrency.processutils [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:52:11 compute-0 nova_compute[183075]: 2026-01-22 17:52:11.320 183079 DEBUG oslo_concurrency.processutils [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:52:11 compute-0 nova_compute[183075]: 2026-01-22 17:52:11.321 183079 DEBUG oslo_concurrency.lockutils [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:52:11 compute-0 nova_compute[183075]: 2026-01-22 17:52:11.321 183079 DEBUG oslo_concurrency.processutils [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:52:11 compute-0 nova_compute[183075]: 2026-01-22 17:52:11.382 183079 DEBUG oslo_concurrency.processutils [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:52:11 compute-0 nova_compute[183075]: 2026-01-22 17:52:11.383 183079 DEBUG nova.virt.disk.api [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Checking if we can resize image /var/lib/nova/instances/eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:52:11 compute-0 nova_compute[183075]: 2026-01-22 17:52:11.384 183079 DEBUG oslo_concurrency.processutils [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:52:11 compute-0 nova_compute[183075]: 2026-01-22 17:52:11.446 183079 DEBUG oslo_concurrency.processutils [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:52:11 compute-0 nova_compute[183075]: 2026-01-22 17:52:11.447 183079 DEBUG nova.virt.disk.api [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Cannot resize image /var/lib/nova/instances/eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:52:11 compute-0 nova_compute[183075]: 2026-01-22 17:52:11.448 183079 DEBUG nova.objects.instance [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'migration_context' on Instance uuid eadc13fe-ef8b-46e5-af17-40b408aa5aff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:52:11 compute-0 nova_compute[183075]: 2026-01-22 17:52:11.462 183079 DEBUG nova.virt.libvirt.driver [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:52:11 compute-0 nova_compute[183075]: 2026-01-22 17:52:11.462 183079 DEBUG nova.virt.libvirt.driver [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Ensure instance console log exists: /var/lib/nova/instances/eadc13fe-ef8b-46e5-af17-40b408aa5aff/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:52:11 compute-0 nova_compute[183075]: 2026-01-22 17:52:11.463 183079 DEBUG oslo_concurrency.lockutils [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:52:11 compute-0 nova_compute[183075]: 2026-01-22 17:52:11.463 183079 DEBUG oslo_concurrency.lockutils [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:52:11 compute-0 nova_compute[183075]: 2026-01-22 17:52:11.464 183079 DEBUG oslo_concurrency.lockutils [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:52:11 compute-0 nova_compute[183075]: 2026-01-22 17:52:11.660 183079 DEBUG nova.policy [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:52:12 compute-0 nova_compute[183075]: 2026-01-22 17:52:12.757 183079 DEBUG nova.network.neutron [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Successfully created port: 6f89fabe-02ec-4340-9703-82e93c101ebc _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:52:13 compute-0 nova_compute[183075]: 2026-01-22 17:52:13.078 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:52:13 compute-0 nova_compute[183075]: 2026-01-22 17:52:13.395 183079 DEBUG nova.network.neutron [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Successfully updated port: 6f89fabe-02ec-4340-9703-82e93c101ebc _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:52:13 compute-0 nova_compute[183075]: 2026-01-22 17:52:13.409 183079 DEBUG oslo_concurrency.lockutils [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "refresh_cache-eadc13fe-ef8b-46e5-af17-40b408aa5aff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:52:13 compute-0 nova_compute[183075]: 2026-01-22 17:52:13.410 183079 DEBUG oslo_concurrency.lockutils [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquired lock "refresh_cache-eadc13fe-ef8b-46e5-af17-40b408aa5aff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:52:13 compute-0 nova_compute[183075]: 2026-01-22 17:52:13.410 183079 DEBUG nova.network.neutron [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:52:13 compute-0 nova_compute[183075]: 2026-01-22 17:52:13.474 183079 DEBUG nova.compute.manager [req-11768433-0a17-4789-a8f6-8d8ec44e5b3e req-889dfb4e-976d-4a3d-ae74-307b1e14c064 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Received event network-changed-6f89fabe-02ec-4340-9703-82e93c101ebc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:52:13 compute-0 nova_compute[183075]: 2026-01-22 17:52:13.475 183079 DEBUG nova.compute.manager [req-11768433-0a17-4789-a8f6-8d8ec44e5b3e req-889dfb4e-976d-4a3d-ae74-307b1e14c064 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Refreshing instance network info cache due to event network-changed-6f89fabe-02ec-4340-9703-82e93c101ebc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:52:13 compute-0 nova_compute[183075]: 2026-01-22 17:52:13.475 183079 DEBUG oslo_concurrency.lockutils [req-11768433-0a17-4789-a8f6-8d8ec44e5b3e req-889dfb4e-976d-4a3d-ae74-307b1e14c064 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-eadc13fe-ef8b-46e5-af17-40b408aa5aff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:52:13 compute-0 nova_compute[183075]: 2026-01-22 17:52:13.517 183079 DEBUG nova.network.neutron [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.065 183079 DEBUG nova.network.neutron [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Updating instance_info_cache with network_info: [{"id": "6f89fabe-02ec-4340-9703-82e93c101ebc", "address": "fa:16:3e:45:45:58", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f89fabe-02", "ovs_interfaceid": "6f89fabe-02ec-4340-9703-82e93c101ebc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.070 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.095 183079 DEBUG oslo_concurrency.lockutils [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Releasing lock "refresh_cache-eadc13fe-ef8b-46e5-af17-40b408aa5aff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.096 183079 DEBUG nova.compute.manager [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Instance network_info: |[{"id": "6f89fabe-02ec-4340-9703-82e93c101ebc", "address": "fa:16:3e:45:45:58", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f89fabe-02", "ovs_interfaceid": "6f89fabe-02ec-4340-9703-82e93c101ebc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.097 183079 DEBUG oslo_concurrency.lockutils [req-11768433-0a17-4789-a8f6-8d8ec44e5b3e req-889dfb4e-976d-4a3d-ae74-307b1e14c064 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-eadc13fe-ef8b-46e5-af17-40b408aa5aff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.098 183079 DEBUG nova.network.neutron [req-11768433-0a17-4789-a8f6-8d8ec44e5b3e req-889dfb4e-976d-4a3d-ae74-307b1e14c064 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Refreshing network info cache for port 6f89fabe-02ec-4340-9703-82e93c101ebc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.103 183079 DEBUG nova.virt.libvirt.driver [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Start _get_guest_xml network_info=[{"id": "6f89fabe-02ec-4340-9703-82e93c101ebc", "address": "fa:16:3e:45:45:58", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f89fabe-02", "ovs_interfaceid": "6f89fabe-02ec-4340-9703-82e93c101ebc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.112 183079 WARNING nova.virt.libvirt.driver [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.120 183079 DEBUG nova.virt.libvirt.host [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.121 183079 DEBUG nova.virt.libvirt.host [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.125 183079 DEBUG nova.virt.libvirt.host [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.125 183079 DEBUG nova.virt.libvirt.host [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.126 183079 DEBUG nova.virt.libvirt.driver [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.126 183079 DEBUG nova.virt.hardware [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.126 183079 DEBUG nova.virt.hardware [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.127 183079 DEBUG nova.virt.hardware [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.127 183079 DEBUG nova.virt.hardware [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.127 183079 DEBUG nova.virt.hardware [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.127 183079 DEBUG nova.virt.hardware [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.127 183079 DEBUG nova.virt.hardware [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.128 183079 DEBUG nova.virt.hardware [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.128 183079 DEBUG nova.virt.hardware [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.128 183079 DEBUG nova.virt.hardware [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.128 183079 DEBUG nova.virt.hardware [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.132 183079 DEBUG nova.virt.libvirt.vif [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:52:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1171118669',display_name='tempest-server-test-1171118669',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1171118669',id=74,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-csh87x03',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:52:11Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=eadc13fe-ef8b-46e5-af17-40b408aa5aff,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6f89fabe-02ec-4340-9703-82e93c101ebc", "address": "fa:16:3e:45:45:58", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f89fabe-02", "ovs_interfaceid": "6f89fabe-02ec-4340-9703-82e93c101ebc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.132 183079 DEBUG nova.network.os_vif_util [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "6f89fabe-02ec-4340-9703-82e93c101ebc", "address": "fa:16:3e:45:45:58", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f89fabe-02", "ovs_interfaceid": "6f89fabe-02ec-4340-9703-82e93c101ebc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.133 183079 DEBUG nova.network.os_vif_util [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:45:58,bridge_name='br-int',has_traffic_filtering=True,id=6f89fabe-02ec-4340-9703-82e93c101ebc,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f89fabe-02') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.133 183079 DEBUG nova.objects.instance [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'pci_devices' on Instance uuid eadc13fe-ef8b-46e5-af17-40b408aa5aff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.147 183079 DEBUG nova.virt.libvirt.driver [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:52:14 compute-0 nova_compute[183075]:   <uuid>eadc13fe-ef8b-46e5-af17-40b408aa5aff</uuid>
Jan 22 17:52:14 compute-0 nova_compute[183075]:   <name>instance-0000004a</name>
Jan 22 17:52:14 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:52:14 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:52:14 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:52:14 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1171118669</nova:name>
Jan 22 17:52:14 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:52:14</nova:creationTime>
Jan 22 17:52:14 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:52:14 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:52:14 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:52:14 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:52:14 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:52:14 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:52:14 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:52:14 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:52:14 compute-0 nova_compute[183075]:         <nova:user uuid="66c24efd0cd24c57803ea8508e679d06">tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member</nova:user>
Jan 22 17:52:14 compute-0 nova_compute[183075]:         <nova:project uuid="cbff5cec4b9c4d2eb317124ca20fea9f">tempest-StatelessNetworkSecGroupIPv4Test-1348485316</nova:project>
Jan 22 17:52:14 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:52:14 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:52:14 compute-0 nova_compute[183075]:         <nova:port uuid="6f89fabe-02ec-4340-9703-82e93c101ebc">
Jan 22 17:52:14 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:52:14 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:52:14 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:52:14 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <system>
Jan 22 17:52:14 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:52:14 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:52:14 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:52:14 compute-0 nova_compute[183075]:       <entry name="serial">eadc13fe-ef8b-46e5-af17-40b408aa5aff</entry>
Jan 22 17:52:14 compute-0 nova_compute[183075]:       <entry name="uuid">eadc13fe-ef8b-46e5-af17-40b408aa5aff</entry>
Jan 22 17:52:14 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     </system>
Jan 22 17:52:14 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:52:14 compute-0 nova_compute[183075]:   <os>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:   </os>
Jan 22 17:52:14 compute-0 nova_compute[183075]:   <features>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:   </features>
Jan 22 17:52:14 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:52:14 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:52:14 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:52:14 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:52:14 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:45:45:58"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:       <target dev="tap6f89fabe-02"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:52:14 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/eadc13fe-ef8b-46e5-af17-40b408aa5aff/console.log" append="off"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <video>
Jan 22 17:52:14 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     </video>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:52:14 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:52:14 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:52:14 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:52:14 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:52:14 compute-0 nova_compute[183075]: </domain>
Jan 22 17:52:14 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.149 183079 DEBUG nova.compute.manager [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Preparing to wait for external event network-vif-plugged-6f89fabe-02ec-4340-9703-82e93c101ebc prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.149 183079 DEBUG oslo_concurrency.lockutils [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "eadc13fe-ef8b-46e5-af17-40b408aa5aff-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.150 183079 DEBUG oslo_concurrency.lockutils [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "eadc13fe-ef8b-46e5-af17-40b408aa5aff-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.150 183079 DEBUG oslo_concurrency.lockutils [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "eadc13fe-ef8b-46e5-af17-40b408aa5aff-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.151 183079 DEBUG nova.virt.libvirt.vif [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:52:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1171118669',display_name='tempest-server-test-1171118669',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1171118669',id=74,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-csh87x03',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_
rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:52:11Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=eadc13fe-ef8b-46e5-af17-40b408aa5aff,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6f89fabe-02ec-4340-9703-82e93c101ebc", "address": "fa:16:3e:45:45:58", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f89fabe-02", "ovs_interfaceid": "6f89fabe-02ec-4340-9703-82e93c101ebc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.151 183079 DEBUG nova.network.os_vif_util [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "6f89fabe-02ec-4340-9703-82e93c101ebc", "address": "fa:16:3e:45:45:58", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f89fabe-02", "ovs_interfaceid": "6f89fabe-02ec-4340-9703-82e93c101ebc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.152 183079 DEBUG nova.network.os_vif_util [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:45:58,bridge_name='br-int',has_traffic_filtering=True,id=6f89fabe-02ec-4340-9703-82e93c101ebc,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f89fabe-02') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.152 183079 DEBUG os_vif [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:45:58,bridge_name='br-int',has_traffic_filtering=True,id=6f89fabe-02ec-4340-9703-82e93c101ebc,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f89fabe-02') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.153 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.153 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.153 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.156 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.156 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6f89fabe-02, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.157 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6f89fabe-02, col_values=(('external_ids', {'iface-id': '6f89fabe-02ec-4340-9703-82e93c101ebc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:45:45:58', 'vm-uuid': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.159 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:14 compute-0 NetworkManager[55454]: <info>  [1769104334.1598] manager: (tap6f89fabe-02): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/326)
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.161 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.165 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.167 183079 INFO os_vif [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:45:58,bridge_name='br-int',has_traffic_filtering=True,id=6f89fabe-02ec-4340-9703-82e93c101ebc,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f89fabe-02')
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.219 183079 DEBUG nova.virt.libvirt.driver [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.220 183079 DEBUG nova.virt.libvirt.driver [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No VIF found with MAC fa:16:3e:45:45:58, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:52:14 compute-0 kernel: tap6f89fabe-02: entered promiscuous mode
Jan 22 17:52:14 compute-0 NetworkManager[55454]: <info>  [1769104334.2901] manager: (tap6f89fabe-02): new Tun device (/org/freedesktop/NetworkManager/Devices/327)
Jan 22 17:52:14 compute-0 ovn_controller[95372]: 2026-01-22T17:52:14Z|00821|binding|INFO|Claiming lport 6f89fabe-02ec-4340-9703-82e93c101ebc for this chassis.
Jan 22 17:52:14 compute-0 ovn_controller[95372]: 2026-01-22T17:52:14Z|00822|binding|INFO|6f89fabe-02ec-4340-9703-82e93c101ebc: Claiming fa:16:3e:45:45:58 10.100.0.4
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.291 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:14.304 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:45:58 10.100.0.4'], port_security=['fa:16:3e:45:45:58 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '865177eb-37df-4b01-a85a-fea79abf013c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=6f89fabe-02ec-4340-9703-82e93c101ebc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:14.306 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 6f89fabe-02ec-4340-9703-82e93c101ebc in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 bound to our chassis
Jan 22 17:52:14 compute-0 ovn_controller[95372]: 2026-01-22T17:52:14Z|00823|binding|INFO|Setting lport 6f89fabe-02ec-4340-9703-82e93c101ebc up in Southbound
Jan 22 17:52:14 compute-0 ovn_controller[95372]: 2026-01-22T17:52:14Z|00824|binding|INFO|Setting lport 6f89fabe-02ec-4340-9703-82e93c101ebc ovn-installed in OVS
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.307 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:14.307 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.314 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:14.321 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3ae9572f-328c-48e7-9471-d3f4212bed22]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:14.322 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap88ed9213-71 in ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:52:14 compute-0 systemd-udevd[242566]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:14.324 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap88ed9213-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:14.324 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8638bb3a-683d-42cd-ba00-f46d3bbcffd4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:14.325 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6195f1b3-c079-48d9-b519-ee10d0a82422]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:52:14 compute-0 NetworkManager[55454]: <info>  [1769104334.3389] device (tap6f89fabe-02): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:14.337 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[52601d4f-dab7-4921-bc27-cf0abdfc6b62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:52:14 compute-0 NetworkManager[55454]: <info>  [1769104334.3399] device (tap6f89fabe-02): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:52:14 compute-0 systemd-machined[154382]: New machine qemu-74-instance-0000004a.
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:14.355 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d3c4fc77-e6c9-479f-bd4f-c87c8ce6e6b2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:52:14 compute-0 systemd[1]: Started Virtual Machine qemu-74-instance-0000004a.
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:14.385 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[653ae748-d633-46e3-8db1-7f59c0e2b499]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:14.391 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[876a8f8c-5cd6-4df3-872d-4c9792560483]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:52:14 compute-0 NetworkManager[55454]: <info>  [1769104334.3923] manager: (tap88ed9213-70): new Veth device (/org/freedesktop/NetworkManager/Devices/328)
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:14.431 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[9d45107e-254f-40d7-9213-f0829d0caf9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:14.435 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[34bcf758-effb-4f1a-ba33-6069d9be4c0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:52:14 compute-0 podman[242573]: 2026-01-22 17:52:14.465235427 +0000 UTC m=+0.081093366 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 17:52:14 compute-0 NetworkManager[55454]: <info>  [1769104334.4722] device (tap88ed9213-70): carrier: link connected
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:14.475 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[b8687ca8-6e57-4e34-9001-167dfbe99f6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:14.494 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a9f348e4-9fd4-441e-82f7-febc13cc9453]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 222], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 669817, 'reachable_time': 23193, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242616, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:14.511 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[46bc640a-4276-4fb6-96ac-a2214b6a4ff1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0a:fa93'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 669817, 'tstamp': 669817}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242617, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:14.528 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[913cb9d3-3ec2-45b0-a4b9-3414d89cefa4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 222], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 669817, 'reachable_time': 23193, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242618, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:14.559 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[548bd5ce-e934-4bf0-9d4d-6979f80bdf93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:14.743 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[92aadef0-d6cc-46cd-8872-280a241d6b0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:14.745 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:14.745 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:14.746 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88ed9213-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.748 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:14 compute-0 NetworkManager[55454]: <info>  [1769104334.7490] manager: (tap88ed9213-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/329)
Jan 22 17:52:14 compute-0 kernel: tap88ed9213-70: entered promiscuous mode
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.752 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:14.753 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88ed9213-70, col_values=(('external_ids', {'iface-id': 'da93a32e-eed6-4124-8f08-78b0bfe2cb69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:52:14 compute-0 ovn_controller[95372]: 2026-01-22T17:52:14Z|00825|binding|INFO|Releasing lport da93a32e-eed6-4124-8f08-78b0bfe2cb69 from this chassis (sb_readonly=0)
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.754 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.756 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:14.757 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:52:14 compute-0 nova_compute[183075]: 2026-01-22 17:52:14.769 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:14.769 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4eb43c87-e492-40e3-98d6-09c7065bbe55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:14.772 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:52:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:14.773 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'env', 'PROCESS_TAG=haproxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/88ed9213-7d48-4c42-ad60-ce3fcf104f71.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:52:15 compute-0 podman[242650]: 2026-01-22 17:52:15.128571247 +0000 UTC m=+0.047297820 container create 5f29505b917845c1b55edd6e7c18effb6aa9abc30bc109186cf7135b336cb0a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:52:15 compute-0 systemd[1]: Started libpod-conmon-5f29505b917845c1b55edd6e7c18effb6aa9abc30bc109186cf7135b336cb0a3.scope.
Jan 22 17:52:15 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:52:15 compute-0 podman[242650]: 2026-01-22 17:52:15.102478927 +0000 UTC m=+0.021205520 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:52:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a14daccc3a656e38221d6c677b14d4f3c45249cbd5830a61370d07ea94ab5ff6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:52:15 compute-0 podman[242650]: 2026-01-22 17:52:15.211335627 +0000 UTC m=+0.130062200 container init 5f29505b917845c1b55edd6e7c18effb6aa9abc30bc109186cf7135b336cb0a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 17:52:15 compute-0 podman[242650]: 2026-01-22 17:52:15.216816444 +0000 UTC m=+0.135543027 container start 5f29505b917845c1b55edd6e7c18effb6aa9abc30bc109186cf7135b336cb0a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 22 17:52:15 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242666]: [NOTICE]   (242670) : New worker (242672) forked
Jan 22 17:52:15 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242666]: [NOTICE]   (242670) : Loading success.
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.555 183079 DEBUG nova.compute.manager [req-72a903c1-f8f5-42c6-b721-3a01785f623b req-405962ca-963e-4cb1-afb0-fa0b724e08f9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Received event network-vif-plugged-6f89fabe-02ec-4340-9703-82e93c101ebc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.555 183079 DEBUG oslo_concurrency.lockutils [req-72a903c1-f8f5-42c6-b721-3a01785f623b req-405962ca-963e-4cb1-afb0-fa0b724e08f9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "eadc13fe-ef8b-46e5-af17-40b408aa5aff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.555 183079 DEBUG oslo_concurrency.lockutils [req-72a903c1-f8f5-42c6-b721-3a01785f623b req-405962ca-963e-4cb1-afb0-fa0b724e08f9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "eadc13fe-ef8b-46e5-af17-40b408aa5aff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.556 183079 DEBUG oslo_concurrency.lockutils [req-72a903c1-f8f5-42c6-b721-3a01785f623b req-405962ca-963e-4cb1-afb0-fa0b724e08f9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "eadc13fe-ef8b-46e5-af17-40b408aa5aff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.556 183079 DEBUG nova.compute.manager [req-72a903c1-f8f5-42c6-b721-3a01785f623b req-405962ca-963e-4cb1-afb0-fa0b724e08f9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Processing event network-vif-plugged-6f89fabe-02ec-4340-9703-82e93c101ebc _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.556 183079 DEBUG nova.compute.manager [req-72a903c1-f8f5-42c6-b721-3a01785f623b req-405962ca-963e-4cb1-afb0-fa0b724e08f9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Received event network-vif-plugged-6f89fabe-02ec-4340-9703-82e93c101ebc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.556 183079 DEBUG oslo_concurrency.lockutils [req-72a903c1-f8f5-42c6-b721-3a01785f623b req-405962ca-963e-4cb1-afb0-fa0b724e08f9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "eadc13fe-ef8b-46e5-af17-40b408aa5aff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.556 183079 DEBUG oslo_concurrency.lockutils [req-72a903c1-f8f5-42c6-b721-3a01785f623b req-405962ca-963e-4cb1-afb0-fa0b724e08f9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "eadc13fe-ef8b-46e5-af17-40b408aa5aff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.557 183079 DEBUG oslo_concurrency.lockutils [req-72a903c1-f8f5-42c6-b721-3a01785f623b req-405962ca-963e-4cb1-afb0-fa0b724e08f9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "eadc13fe-ef8b-46e5-af17-40b408aa5aff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.557 183079 DEBUG nova.compute.manager [req-72a903c1-f8f5-42c6-b721-3a01785f623b req-405962ca-963e-4cb1-afb0-fa0b724e08f9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] No waiting events found dispatching network-vif-plugged-6f89fabe-02ec-4340-9703-82e93c101ebc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.557 183079 WARNING nova.compute.manager [req-72a903c1-f8f5-42c6-b721-3a01785f623b req-405962ca-963e-4cb1-afb0-fa0b724e08f9 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Received unexpected event network-vif-plugged-6f89fabe-02ec-4340-9703-82e93c101ebc for instance with vm_state building and task_state spawning.
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.593 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104335.5927203, eadc13fe-ef8b-46e5-af17-40b408aa5aff => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.593 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] VM Started (Lifecycle Event)
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.595 183079 DEBUG nova.compute.manager [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.598 183079 DEBUG nova.virt.libvirt.driver [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.601 183079 INFO nova.virt.libvirt.driver [-] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Instance spawned successfully.
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.601 183079 DEBUG nova.virt.libvirt.driver [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.614 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.618 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.624 183079 DEBUG nova.virt.libvirt.driver [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.624 183079 DEBUG nova.virt.libvirt.driver [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.624 183079 DEBUG nova.virt.libvirt.driver [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.625 183079 DEBUG nova.virt.libvirt.driver [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.625 183079 DEBUG nova.virt.libvirt.driver [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.626 183079 DEBUG nova.virt.libvirt.driver [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.652 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.652 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104335.594766, eadc13fe-ef8b-46e5-af17-40b408aa5aff => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.652 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] VM Paused (Lifecycle Event)
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.675 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.678 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104335.5974917, eadc13fe-ef8b-46e5-af17-40b408aa5aff => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.678 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] VM Resumed (Lifecycle Event)
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.686 183079 INFO nova.compute.manager [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Took 4.58 seconds to spawn the instance on the hypervisor.
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.686 183079 DEBUG nova.compute.manager [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.712 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.715 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.738 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.751 183079 INFO nova.compute.manager [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Took 5.05 seconds to build instance.
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.773 183079 DEBUG oslo_concurrency.lockutils [None req-8f7a876a-f265-442c-af77-76ec95880602 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "eadc13fe-ef8b-46e5-af17-40b408aa5aff" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.809 183079 DEBUG nova.network.neutron [req-11768433-0a17-4789-a8f6-8d8ec44e5b3e req-889dfb4e-976d-4a3d-ae74-307b1e14c064 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Updated VIF entry in instance network info cache for port 6f89fabe-02ec-4340-9703-82e93c101ebc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.809 183079 DEBUG nova.network.neutron [req-11768433-0a17-4789-a8f6-8d8ec44e5b3e req-889dfb4e-976d-4a3d-ae74-307b1e14c064 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Updating instance_info_cache with network_info: [{"id": "6f89fabe-02ec-4340-9703-82e93c101ebc", "address": "fa:16:3e:45:45:58", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f89fabe-02", "ovs_interfaceid": "6f89fabe-02ec-4340-9703-82e93c101ebc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:52:15 compute-0 nova_compute[183075]: 2026-01-22 17:52:15.826 183079 DEBUG oslo_concurrency.lockutils [req-11768433-0a17-4789-a8f6-8d8ec44e5b3e req-889dfb4e-976d-4a3d-ae74-307b1e14c064 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-eadc13fe-ef8b-46e5-af17-40b408aa5aff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:52:16 compute-0 nova_compute[183075]: 2026-01-22 17:52:16.168 183079 INFO nova.compute.manager [None req-979d7c4b-5733-4671-a61d-e9b34a2e12f4 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Get console output
Jan 22 17:52:16 compute-0 nova_compute[183075]: 2026-01-22 17:52:16.174 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:52:17 compute-0 nova_compute[183075]: 2026-01-22 17:52:17.386 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769104322.385408, e0a2ef50-95ae-4e45-bedb-f385cf225914 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:52:17 compute-0 nova_compute[183075]: 2026-01-22 17:52:17.386 183079 INFO nova.compute.manager [-] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] VM Stopped (Lifecycle Event)
Jan 22 17:52:17 compute-0 nova_compute[183075]: 2026-01-22 17:52:17.403 183079 DEBUG nova.compute.manager [None req-cc56aa86-63b8-4cb9-b439-be9aa7abb309 - - - - - -] [instance: e0a2ef50-95ae-4e45-bedb-f385cf225914] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:52:19 compute-0 nova_compute[183075]: 2026-01-22 17:52:19.071 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:19 compute-0 nova_compute[183075]: 2026-01-22 17:52:19.158 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:20 compute-0 nova_compute[183075]: 2026-01-22 17:52:20.302 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769104325.3018732, c9f0a876-68d5-4c5d-b4cf-62a36101777d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:52:20 compute-0 nova_compute[183075]: 2026-01-22 17:52:20.303 183079 INFO nova.compute.manager [-] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] VM Stopped (Lifecycle Event)
Jan 22 17:52:20 compute-0 nova_compute[183075]: 2026-01-22 17:52:20.320 183079 DEBUG nova.compute.manager [None req-c514ecfd-8efc-463b-81fc-d480ed0e7a09 - - - - - -] [instance: c9f0a876-68d5-4c5d-b4cf-62a36101777d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:52:21 compute-0 nova_compute[183075]: 2026-01-22 17:52:21.319 183079 INFO nova.compute.manager [None req-e4fc56b2-f97d-4346-a614-a2d64b12b1a1 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Get console output
Jan 22 17:52:21 compute-0 nova_compute[183075]: 2026-01-22 17:52:21.326 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:52:24 compute-0 nova_compute[183075]: 2026-01-22 17:52:24.106 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:24 compute-0 nova_compute[183075]: 2026-01-22 17:52:24.159 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:26 compute-0 podman[242688]: 2026-01-22 17:52:26.342223429 +0000 UTC m=+0.051017629 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:52:26 compute-0 nova_compute[183075]: 2026-01-22 17:52:26.450 183079 INFO nova.compute.manager [None req-27f80b0e-3175-4b0c-9d90-d57f257921b2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Get console output
Jan 22 17:52:28 compute-0 ovn_controller[95372]: 2026-01-22T17:52:28Z|00150|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:45:45:58 10.100.0.4
Jan 22 17:52:28 compute-0 ovn_controller[95372]: 2026-01-22T17:52:28Z|00151|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:45:45:58 10.100.0.4
Jan 22 17:52:29 compute-0 nova_compute[183075]: 2026-01-22 17:52:29.109 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:29 compute-0 nova_compute[183075]: 2026-01-22 17:52:29.161 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:29 compute-0 podman[242725]: 2026-01-22 17:52:29.337247273 +0000 UTC m=+0.048802010 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:52:31 compute-0 nova_compute[183075]: 2026-01-22 17:52:31.597 183079 INFO nova.compute.manager [None req-41212f2a-b6fd-4fc7-b280-56ea077aa6d2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Get console output
Jan 22 17:52:31 compute-0 nova_compute[183075]: 2026-01-22 17:52:31.602 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:52:34 compute-0 nova_compute[183075]: 2026-01-22 17:52:34.111 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:34 compute-0 nova_compute[183075]: 2026-01-22 17:52:34.162 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:34.873 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:52:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:34.874 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:52:34 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:52:34 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:52:34 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:52:34 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:52:34 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:52:34 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:52:34 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.774 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.775 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.9001074
Jan 22 17:52:35 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.4:46104 [22/Jan/2026:17:52:34.872] listener listener/metadata 0/0/0/902/902 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.783 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.784 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:52:35 compute-0 nova_compute[183075]: 2026-01-22 17:52:35.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.802 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.803 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0190725
Jan 22 17:52:35 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.4:46108 [22/Jan/2026:17:52:35.783] listener listener/metadata 0/0/0/20/20 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.807 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.808 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.824 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:52:35 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.4:46120 [22/Jan/2026:17:52:35.806] listener listener/metadata 0/0/0/18/18 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.825 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0169575
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.829 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.830 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.850 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.851 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0213070
Jan 22 17:52:35 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.4:46128 [22/Jan/2026:17:52:35.829] listener listener/metadata 0/0/0/22/22 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.858 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.858 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.874 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.875 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0165610
Jan 22 17:52:35 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.4:46142 [22/Jan/2026:17:52:35.857] listener listener/metadata 0/0/0/17/17 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.879 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.880 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.894 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.895 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0148544
Jan 22 17:52:35 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.4:46148 [22/Jan/2026:17:52:35.879] listener listener/metadata 0/0/0/15/15 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.899 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.899 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.912 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.912 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0133719
Jan 22 17:52:35 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.4:46160 [22/Jan/2026:17:52:35.898] listener listener/metadata 0/0/0/14/14 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.916 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.917 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.931 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.932 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0150054
Jan 22 17:52:35 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.4:46164 [22/Jan/2026:17:52:35.916] listener listener/metadata 0/0/0/15/15 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.936 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.937 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.950 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:52:35 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.4:46174 [22/Jan/2026:17:52:35.936] listener listener/metadata 0/0/0/14/14 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.950 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0134666
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.954 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.955 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.968 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.969 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0134816
Jan 22 17:52:35 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.4:46180 [22/Jan/2026:17:52:35.954] listener listener/metadata 0/0/0/14/14 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.974 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.975 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:52:35 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.4:46190 [22/Jan/2026:17:52:35.974] listener listener/metadata 0/0/0/22/22 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:52:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:35.996 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0215077
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:36.005 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:36.006 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:36.050 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:52:36 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.4:46200 [22/Jan/2026:17:52:36.005] listener listener/metadata 0/0/0/47/47 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:36.052 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0460906
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:36.055 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:36.056 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:36.070 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:36.071 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0144961
Jan 22 17:52:36 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.4:46204 [22/Jan/2026:17:52:36.055] listener listener/metadata 0/0/0/15/15 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:36.074 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:36.074 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:36.089 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:36.089 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0149577
Jan 22 17:52:36 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.4:46208 [22/Jan/2026:17:52:36.074] listener listener/metadata 0/0/0/15/15 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:36.093 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:36.094 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:36.110 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:36.110 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0164125
Jan 22 17:52:36 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.4:46216 [22/Jan/2026:17:52:36.093] listener listener/metadata 0/0/0/17/17 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:36.114 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:36.114 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:36.128 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:52:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:36.128 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0141826
Jan 22 17:52:36 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.4:46222 [22/Jan/2026:17:52:36.113] listener listener/metadata 0/0/0/14/14 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:52:36 compute-0 nova_compute[183075]: 2026-01-22 17:52:36.750 183079 INFO nova.compute.manager [None req-056e9c7f-8cb5-4a9e-b1fb-cb2eaecec00e 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Get console output
Jan 22 17:52:36 compute-0 nova_compute[183075]: 2026-01-22 17:52:36.756 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:52:39 compute-0 nova_compute[183075]: 2026-01-22 17:52:39.114 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:39 compute-0 nova_compute[183075]: 2026-01-22 17:52:39.163 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:39 compute-0 nova_compute[183075]: 2026-01-22 17:52:39.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:52:39 compute-0 nova_compute[183075]: 2026-01-22 17:52:39.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:52:40 compute-0 podman[242751]: 2026-01-22 17:52:40.381549914 +0000 UTC m=+0.065082737 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, version=9.6, com.redhat.component=ubi9-minimal-container, architecture=x86_64, distribution-scope=public)
Jan 22 17:52:40 compute-0 podman[242750]: 2026-01-22 17:52:40.381556734 +0000 UTC m=+0.079216766 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 17:52:40 compute-0 podman[242749]: 2026-01-22 17:52:40.391346736 +0000 UTC m=+0.091861504 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 22 17:52:40 compute-0 nova_compute[183075]: 2026-01-22 17:52:40.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:52:41 compute-0 nova_compute[183075]: 2026-01-22 17:52:41.896 183079 INFO nova.compute.manager [None req-97cd9329-7674-40f8-b344-2b9d02e6d817 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Get console output
Jan 22 17:52:41 compute-0 nova_compute[183075]: 2026-01-22 17:52:41.902 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:52:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:41.972 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:52:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:41.972 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:52:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:52:41.973 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:52:44 compute-0 nova_compute[183075]: 2026-01-22 17:52:44.116 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:44 compute-0 nova_compute[183075]: 2026-01-22 17:52:44.165 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:44 compute-0 ovn_controller[95372]: 2026-01-22T17:52:44Z|00826|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 22 17:52:45 compute-0 podman[242813]: 2026-01-22 17:52:45.344400614 +0000 UTC m=+0.053637470 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:52:45 compute-0 nova_compute[183075]: 2026-01-22 17:52:45.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:52:47 compute-0 nova_compute[183075]: 2026-01-22 17:52:47.042 183079 INFO nova.compute.manager [None req-c9fe1800-e138-45b1-81ae-e03ecf7db477 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Get console output
Jan 22 17:52:47 compute-0 nova_compute[183075]: 2026-01-22 17:52:47.047 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:52:49 compute-0 nova_compute[183075]: 2026-01-22 17:52:49.120 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:49 compute-0 nova_compute[183075]: 2026-01-22 17:52:49.166 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:52 compute-0 nova_compute[183075]: 2026-01-22 17:52:52.196 183079 INFO nova.compute.manager [None req-71bee47b-ba66-4889-8b70-296b08037ed9 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Get console output
Jan 22 17:52:52 compute-0 nova_compute[183075]: 2026-01-22 17:52:52.202 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:52:52 compute-0 nova_compute[183075]: 2026-01-22 17:52:52.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:52:52 compute-0 nova_compute[183075]: 2026-01-22 17:52:52.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:52:52 compute-0 nova_compute[183075]: 2026-01-22 17:52:52.804 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 17:52:52 compute-0 nova_compute[183075]: 2026-01-22 17:52:52.805 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:52:52 compute-0 nova_compute[183075]: 2026-01-22 17:52:52.828 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:52:52 compute-0 nova_compute[183075]: 2026-01-22 17:52:52.828 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:52:52 compute-0 nova_compute[183075]: 2026-01-22 17:52:52.829 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:52:52 compute-0 nova_compute[183075]: 2026-01-22 17:52:52.829 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:52:52 compute-0 nova_compute[183075]: 2026-01-22 17:52:52.884 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:52:52 compute-0 nova_compute[183075]: 2026-01-22 17:52:52.939 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:52:52 compute-0 nova_compute[183075]: 2026-01-22 17:52:52.940 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:52:52 compute-0 nova_compute[183075]: 2026-01-22 17:52:52.995 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:52:53 compute-0 nova_compute[183075]: 2026-01-22 17:52:53.136 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:52:53 compute-0 nova_compute[183075]: 2026-01-22 17:52:53.137 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5524MB free_disk=73.32297897338867GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:52:53 compute-0 nova_compute[183075]: 2026-01-22 17:52:53.137 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:52:53 compute-0 nova_compute[183075]: 2026-01-22 17:52:53.138 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:52:53 compute-0 nova_compute[183075]: 2026-01-22 17:52:53.210 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance eadc13fe-ef8b-46e5-af17-40b408aa5aff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:52:53 compute-0 nova_compute[183075]: 2026-01-22 17:52:53.211 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:52:53 compute-0 nova_compute[183075]: 2026-01-22 17:52:53.211 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:52:53 compute-0 nova_compute[183075]: 2026-01-22 17:52:53.251 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:52:53 compute-0 nova_compute[183075]: 2026-01-22 17:52:53.267 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:52:53 compute-0 nova_compute[183075]: 2026-01-22 17:52:53.285 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:52:53 compute-0 nova_compute[183075]: 2026-01-22 17:52:53.285 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:52:54 compute-0 nova_compute[183075]: 2026-01-22 17:52:54.162 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:54 compute-0 nova_compute[183075]: 2026-01-22 17:52:54.167 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:55 compute-0 nova_compute[183075]: 2026-01-22 17:52:55.268 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:52:55 compute-0 nova_compute[183075]: 2026-01-22 17:52:55.269 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:52:55 compute-0 nova_compute[183075]: 2026-01-22 17:52:55.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:52:57 compute-0 nova_compute[183075]: 2026-01-22 17:52:57.326 183079 INFO nova.compute.manager [None req-7bf847fa-0d8b-41bb-b71c-6fcbdc16cb0c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Get console output
Jan 22 17:52:57 compute-0 nova_compute[183075]: 2026-01-22 17:52:57.332 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:52:57 compute-0 podman[242841]: 2026-01-22 17:52:57.349970284 +0000 UTC m=+0.053699541 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 17:52:59 compute-0 nova_compute[183075]: 2026-01-22 17:52:59.164 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:52:59 compute-0 nova_compute[183075]: 2026-01-22 17:52:59.168 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:00 compute-0 podman[242867]: 2026-01-22 17:53:00.341622129 +0000 UTC m=+0.051282747 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 17:53:02 compute-0 nova_compute[183075]: 2026-01-22 17:53:02.450 183079 INFO nova.compute.manager [None req-a7f21ed6-fa21-4286-a609-18aba60d3cd2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Get console output
Jan 22 17:53:02 compute-0 nova_compute[183075]: 2026-01-22 17:53:02.453 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:53:04 compute-0 nova_compute[183075]: 2026-01-22 17:53:04.166 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:04 compute-0 nova_compute[183075]: 2026-01-22 17:53:04.169 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:07 compute-0 nova_compute[183075]: 2026-01-22 17:53:07.583 183079 INFO nova.compute.manager [None req-dc0f9642-07d1-47ba-a2c2-4b0359a17cb2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Get console output
Jan 22 17:53:07 compute-0 nova_compute[183075]: 2026-01-22 17:53:07.589 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:53:09 compute-0 nova_compute[183075]: 2026-01-22 17:53:09.168 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:09 compute-0 nova_compute[183075]: 2026-01-22 17:53:09.170 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:11 compute-0 podman[242893]: 2026-01-22 17:53:11.349706548 +0000 UTC m=+0.054252576 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 22 17:53:11 compute-0 podman[242892]: 2026-01-22 17:53:11.374818332 +0000 UTC m=+0.082627947 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 17:53:11 compute-0 podman[242891]: 2026-01-22 17:53:11.384581023 +0000 UTC m=+0.095392159 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 17:53:12 compute-0 nova_compute[183075]: 2026-01-22 17:53:12.878 183079 INFO nova.compute.manager [None req-0fe5a326-9e3c-4962-88f4-795d017a690e 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Get console output
Jan 22 17:53:12 compute-0 nova_compute[183075]: 2026-01-22 17:53:12.883 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:53:14 compute-0 nova_compute[183075]: 2026-01-22 17:53:14.169 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:14 compute-0 nova_compute[183075]: 2026-01-22 17:53:14.171 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:16 compute-0 podman[242955]: 2026-01-22 17:53:16.350471535 +0000 UTC m=+0.062011534 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:53:18 compute-0 nova_compute[183075]: 2026-01-22 17:53:18.960 183079 DEBUG oslo_concurrency.lockutils [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "42bc2eb6-0654-413e-bf4f-7926c9f3efb5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:53:18 compute-0 nova_compute[183075]: 2026-01-22 17:53:18.961 183079 DEBUG oslo_concurrency.lockutils [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "42bc2eb6-0654-413e-bf4f-7926c9f3efb5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:53:18 compute-0 nova_compute[183075]: 2026-01-22 17:53:18.981 183079 DEBUG nova.compute.manager [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:53:19 compute-0 nova_compute[183075]: 2026-01-22 17:53:19.093 183079 DEBUG oslo_concurrency.lockutils [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:53:19 compute-0 nova_compute[183075]: 2026-01-22 17:53:19.093 183079 DEBUG oslo_concurrency.lockutils [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:53:19 compute-0 nova_compute[183075]: 2026-01-22 17:53:19.108 183079 DEBUG nova.virt.hardware [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:53:19 compute-0 nova_compute[183075]: 2026-01-22 17:53:19.108 183079 INFO nova.compute.claims [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:53:19 compute-0 nova_compute[183075]: 2026-01-22 17:53:19.171 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:19 compute-0 nova_compute[183075]: 2026-01-22 17:53:19.423 183079 DEBUG nova.scheduler.client.report [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Refreshing inventories for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 17:53:19 compute-0 nova_compute[183075]: 2026-01-22 17:53:19.438 183079 DEBUG nova.scheduler.client.report [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Updating ProviderTree inventory for provider 2513134c-f67c-4237-84bf-4ebe2450d610 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 17:53:19 compute-0 nova_compute[183075]: 2026-01-22 17:53:19.439 183079 DEBUG nova.compute.provider_tree [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Updating inventory in ProviderTree for provider 2513134c-f67c-4237-84bf-4ebe2450d610 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 17:53:19 compute-0 nova_compute[183075]: 2026-01-22 17:53:19.461 183079 DEBUG nova.scheduler.client.report [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Refreshing aggregate associations for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 17:53:19 compute-0 nova_compute[183075]: 2026-01-22 17:53:19.481 183079 DEBUG nova.scheduler.client.report [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Refreshing trait associations for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610, traits: COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SVM,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE42,COMPUTE_NODE,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE4A,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_FMA3,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_AUTO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 17:53:19 compute-0 nova_compute[183075]: 2026-01-22 17:53:19.539 183079 DEBUG nova.compute.provider_tree [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:53:19 compute-0 nova_compute[183075]: 2026-01-22 17:53:19.635 183079 DEBUG nova.scheduler.client.report [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:53:19 compute-0 nova_compute[183075]: 2026-01-22 17:53:19.854 183079 DEBUG oslo_concurrency.lockutils [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.761s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:53:19 compute-0 nova_compute[183075]: 2026-01-22 17:53:19.855 183079 DEBUG nova.compute.manager [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:53:19 compute-0 nova_compute[183075]: 2026-01-22 17:53:19.985 183079 DEBUG nova.compute.manager [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:53:19 compute-0 nova_compute[183075]: 2026-01-22 17:53:19.986 183079 DEBUG nova.network.neutron [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:53:20 compute-0 nova_compute[183075]: 2026-01-22 17:53:20.040 183079 INFO nova.virt.libvirt.driver [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:53:20 compute-0 nova_compute[183075]: 2026-01-22 17:53:20.069 183079 DEBUG nova.compute.manager [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:53:20 compute-0 nova_compute[183075]: 2026-01-22 17:53:20.164 183079 DEBUG nova.compute.manager [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:53:20 compute-0 nova_compute[183075]: 2026-01-22 17:53:20.165 183079 DEBUG nova.virt.libvirt.driver [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:53:20 compute-0 nova_compute[183075]: 2026-01-22 17:53:20.166 183079 INFO nova.virt.libvirt.driver [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Creating image(s)
Jan 22 17:53:20 compute-0 nova_compute[183075]: 2026-01-22 17:53:20.166 183079 DEBUG oslo_concurrency.lockutils [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "/var/lib/nova/instances/42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:53:20 compute-0 nova_compute[183075]: 2026-01-22 17:53:20.167 183079 DEBUG oslo_concurrency.lockutils [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:53:20 compute-0 nova_compute[183075]: 2026-01-22 17:53:20.167 183079 DEBUG oslo_concurrency.lockutils [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:53:20 compute-0 nova_compute[183075]: 2026-01-22 17:53:20.182 183079 DEBUG oslo_concurrency.processutils [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:53:20 compute-0 nova_compute[183075]: 2026-01-22 17:53:20.238 183079 DEBUG oslo_concurrency.processutils [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:53:20 compute-0 nova_compute[183075]: 2026-01-22 17:53:20.239 183079 DEBUG oslo_concurrency.lockutils [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:53:20 compute-0 nova_compute[183075]: 2026-01-22 17:53:20.240 183079 DEBUG oslo_concurrency.lockutils [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:53:20 compute-0 nova_compute[183075]: 2026-01-22 17:53:20.250 183079 DEBUG oslo_concurrency.processutils [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:53:20 compute-0 nova_compute[183075]: 2026-01-22 17:53:20.304 183079 DEBUG oslo_concurrency.processutils [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:53:20 compute-0 nova_compute[183075]: 2026-01-22 17:53:20.305 183079 DEBUG oslo_concurrency.processutils [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:53:20 compute-0 nova_compute[183075]: 2026-01-22 17:53:20.338 183079 DEBUG oslo_concurrency.processutils [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:53:20 compute-0 nova_compute[183075]: 2026-01-22 17:53:20.338 183079 DEBUG oslo_concurrency.lockutils [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:53:20 compute-0 nova_compute[183075]: 2026-01-22 17:53:20.339 183079 DEBUG oslo_concurrency.processutils [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:53:20 compute-0 nova_compute[183075]: 2026-01-22 17:53:20.392 183079 DEBUG oslo_concurrency.processutils [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:53:20 compute-0 nova_compute[183075]: 2026-01-22 17:53:20.393 183079 DEBUG nova.virt.disk.api [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Checking if we can resize image /var/lib/nova/instances/42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:53:20 compute-0 nova_compute[183075]: 2026-01-22 17:53:20.394 183079 DEBUG oslo_concurrency.processutils [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:53:20 compute-0 nova_compute[183075]: 2026-01-22 17:53:20.448 183079 DEBUG oslo_concurrency.processutils [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:53:20 compute-0 nova_compute[183075]: 2026-01-22 17:53:20.449 183079 DEBUG nova.virt.disk.api [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Cannot resize image /var/lib/nova/instances/42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:53:20 compute-0 nova_compute[183075]: 2026-01-22 17:53:20.450 183079 DEBUG nova.objects.instance [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'migration_context' on Instance uuid 42bc2eb6-0654-413e-bf4f-7926c9f3efb5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:53:20 compute-0 nova_compute[183075]: 2026-01-22 17:53:20.470 183079 DEBUG nova.virt.libvirt.driver [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:53:20 compute-0 nova_compute[183075]: 2026-01-22 17:53:20.471 183079 DEBUG nova.virt.libvirt.driver [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Ensure instance console log exists: /var/lib/nova/instances/42bc2eb6-0654-413e-bf4f-7926c9f3efb5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:53:20 compute-0 nova_compute[183075]: 2026-01-22 17:53:20.471 183079 DEBUG oslo_concurrency.lockutils [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:53:20 compute-0 nova_compute[183075]: 2026-01-22 17:53:20.472 183079 DEBUG oslo_concurrency.lockutils [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:53:20 compute-0 nova_compute[183075]: 2026-01-22 17:53:20.472 183079 DEBUG oslo_concurrency.lockutils [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:53:20 compute-0 nova_compute[183075]: 2026-01-22 17:53:20.671 183079 DEBUG nova.policy [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:53:24 compute-0 nova_compute[183075]: 2026-01-22 17:53:24.171 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:24 compute-0 nova_compute[183075]: 2026-01-22 17:53:24.173 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:24.673 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=56, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=55) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:53:24 compute-0 nova_compute[183075]: 2026-01-22 17:53:24.674 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:24.674 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:53:24 compute-0 nova_compute[183075]: 2026-01-22 17:53:24.773 183079 DEBUG nova.network.neutron [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Successfully created port: 9c1da312-2c1b-451b-9ea1-34ff96520bb9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:53:26 compute-0 nova_compute[183075]: 2026-01-22 17:53:26.388 183079 DEBUG nova.network.neutron [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Successfully updated port: 9c1da312-2c1b-451b-9ea1-34ff96520bb9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:53:26 compute-0 nova_compute[183075]: 2026-01-22 17:53:26.404 183079 DEBUG oslo_concurrency.lockutils [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "refresh_cache-42bc2eb6-0654-413e-bf4f-7926c9f3efb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:53:26 compute-0 nova_compute[183075]: 2026-01-22 17:53:26.405 183079 DEBUG oslo_concurrency.lockutils [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquired lock "refresh_cache-42bc2eb6-0654-413e-bf4f-7926c9f3efb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:53:26 compute-0 nova_compute[183075]: 2026-01-22 17:53:26.405 183079 DEBUG nova.network.neutron [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:53:26 compute-0 nova_compute[183075]: 2026-01-22 17:53:26.529 183079 DEBUG nova.network.neutron [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:53:26 compute-0 nova_compute[183075]: 2026-01-22 17:53:26.807 183079 DEBUG nova.compute.manager [req-5fd8582f-875d-4e03-b330-46868f52db27 req-5dbe435b-81e9-462a-8ebe-ed8bf73b11af a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Received event network-changed-9c1da312-2c1b-451b-9ea1-34ff96520bb9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:53:26 compute-0 nova_compute[183075]: 2026-01-22 17:53:26.807 183079 DEBUG nova.compute.manager [req-5fd8582f-875d-4e03-b330-46868f52db27 req-5dbe435b-81e9-462a-8ebe-ed8bf73b11af a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Refreshing instance network info cache due to event network-changed-9c1da312-2c1b-451b-9ea1-34ff96520bb9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:53:26 compute-0 nova_compute[183075]: 2026-01-22 17:53:26.808 183079 DEBUG oslo_concurrency.lockutils [req-5fd8582f-875d-4e03-b330-46868f52db27 req-5dbe435b-81e9-462a-8ebe-ed8bf73b11af a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-42bc2eb6-0654-413e-bf4f-7926c9f3efb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.118 183079 DEBUG nova.network.neutron [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Updating instance_info_cache with network_info: [{"id": "9c1da312-2c1b-451b-9ea1-34ff96520bb9", "address": "fa:16:3e:6a:a8:74", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c1da312-2c", "ovs_interfaceid": "9c1da312-2c1b-451b-9ea1-34ff96520bb9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.140 183079 DEBUG oslo_concurrency.lockutils [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Releasing lock "refresh_cache-42bc2eb6-0654-413e-bf4f-7926c9f3efb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.141 183079 DEBUG nova.compute.manager [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Instance network_info: |[{"id": "9c1da312-2c1b-451b-9ea1-34ff96520bb9", "address": "fa:16:3e:6a:a8:74", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c1da312-2c", "ovs_interfaceid": "9c1da312-2c1b-451b-9ea1-34ff96520bb9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.141 183079 DEBUG oslo_concurrency.lockutils [req-5fd8582f-875d-4e03-b330-46868f52db27 req-5dbe435b-81e9-462a-8ebe-ed8bf73b11af a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-42bc2eb6-0654-413e-bf4f-7926c9f3efb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.142 183079 DEBUG nova.network.neutron [req-5fd8582f-875d-4e03-b330-46868f52db27 req-5dbe435b-81e9-462a-8ebe-ed8bf73b11af a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Refreshing network info cache for port 9c1da312-2c1b-451b-9ea1-34ff96520bb9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.145 183079 DEBUG nova.virt.libvirt.driver [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Start _get_guest_xml network_info=[{"id": "9c1da312-2c1b-451b-9ea1-34ff96520bb9", "address": "fa:16:3e:6a:a8:74", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c1da312-2c", "ovs_interfaceid": "9c1da312-2c1b-451b-9ea1-34ff96520bb9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.150 183079 WARNING nova.virt.libvirt.driver [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.156 183079 DEBUG nova.virt.libvirt.host [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.157 183079 DEBUG nova.virt.libvirt.host [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.165 183079 DEBUG nova.virt.libvirt.host [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.165 183079 DEBUG nova.virt.libvirt.host [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.166 183079 DEBUG nova.virt.libvirt.driver [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.166 183079 DEBUG nova.virt.hardware [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.167 183079 DEBUG nova.virt.hardware [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.167 183079 DEBUG nova.virt.hardware [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.167 183079 DEBUG nova.virt.hardware [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.167 183079 DEBUG nova.virt.hardware [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.167 183079 DEBUG nova.virt.hardware [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.168 183079 DEBUG nova.virt.hardware [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.168 183079 DEBUG nova.virt.hardware [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.168 183079 DEBUG nova.virt.hardware [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.168 183079 DEBUG nova.virt.hardware [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.168 183079 DEBUG nova.virt.hardware [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.172 183079 DEBUG nova.virt.libvirt.vif [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:53:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1316386013',display_name='tempest-server-test-1316386013',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1316386013',id=75,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-tht8ezry',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:53:20Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=42bc2eb6-0654-413e-bf4f-7926c9f3efb5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9c1da312-2c1b-451b-9ea1-34ff96520bb9", "address": "fa:16:3e:6a:a8:74", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c1da312-2c", "ovs_interfaceid": "9c1da312-2c1b-451b-9ea1-34ff96520bb9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.172 183079 DEBUG nova.network.os_vif_util [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "9c1da312-2c1b-451b-9ea1-34ff96520bb9", "address": "fa:16:3e:6a:a8:74", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c1da312-2c", "ovs_interfaceid": "9c1da312-2c1b-451b-9ea1-34ff96520bb9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.173 183079 DEBUG nova.network.os_vif_util [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:a8:74,bridge_name='br-int',has_traffic_filtering=True,id=9c1da312-2c1b-451b-9ea1-34ff96520bb9,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c1da312-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.173 183079 DEBUG nova.objects.instance [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'pci_devices' on Instance uuid 42bc2eb6-0654-413e-bf4f-7926c9f3efb5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.186 183079 DEBUG nova.virt.libvirt.driver [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:53:27 compute-0 nova_compute[183075]:   <uuid>42bc2eb6-0654-413e-bf4f-7926c9f3efb5</uuid>
Jan 22 17:53:27 compute-0 nova_compute[183075]:   <name>instance-0000004b</name>
Jan 22 17:53:27 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:53:27 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:53:27 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:53:27 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1316386013</nova:name>
Jan 22 17:53:27 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:53:27</nova:creationTime>
Jan 22 17:53:27 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:53:27 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:53:27 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:53:27 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:53:27 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:53:27 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:53:27 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:53:27 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:53:27 compute-0 nova_compute[183075]:         <nova:user uuid="66c24efd0cd24c57803ea8508e679d06">tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member</nova:user>
Jan 22 17:53:27 compute-0 nova_compute[183075]:         <nova:project uuid="cbff5cec4b9c4d2eb317124ca20fea9f">tempest-StatelessNetworkSecGroupIPv4Test-1348485316</nova:project>
Jan 22 17:53:27 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:53:27 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:53:27 compute-0 nova_compute[183075]:         <nova:port uuid="9c1da312-2c1b-451b-9ea1-34ff96520bb9">
Jan 22 17:53:27 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:53:27 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:53:27 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:53:27 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <system>
Jan 22 17:53:27 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:53:27 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:53:27 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:53:27 compute-0 nova_compute[183075]:       <entry name="serial">42bc2eb6-0654-413e-bf4f-7926c9f3efb5</entry>
Jan 22 17:53:27 compute-0 nova_compute[183075]:       <entry name="uuid">42bc2eb6-0654-413e-bf4f-7926c9f3efb5</entry>
Jan 22 17:53:27 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     </system>
Jan 22 17:53:27 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:53:27 compute-0 nova_compute[183075]:   <os>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:   </os>
Jan 22 17:53:27 compute-0 nova_compute[183075]:   <features>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:   </features>
Jan 22 17:53:27 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:53:27 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:53:27 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:53:27 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:53:27 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:6a:a8:74"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:       <target dev="tap9c1da312-2c"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:53:27 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/42bc2eb6-0654-413e-bf4f-7926c9f3efb5/console.log" append="off"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <video>
Jan 22 17:53:27 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     </video>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:53:27 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:53:27 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:53:27 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:53:27 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:53:27 compute-0 nova_compute[183075]: </domain>
Jan 22 17:53:27 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.188 183079 DEBUG nova.compute.manager [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Preparing to wait for external event network-vif-plugged-9c1da312-2c1b-451b-9ea1-34ff96520bb9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.188 183079 DEBUG oslo_concurrency.lockutils [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "42bc2eb6-0654-413e-bf4f-7926c9f3efb5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.189 183079 DEBUG oslo_concurrency.lockutils [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "42bc2eb6-0654-413e-bf4f-7926c9f3efb5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.189 183079 DEBUG oslo_concurrency.lockutils [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "42bc2eb6-0654-413e-bf4f-7926c9f3efb5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.189 183079 DEBUG nova.virt.libvirt.vif [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:53:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1316386013',display_name='tempest-server-test-1316386013',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1316386013',id=75,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-tht8ezry',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_
rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:53:20Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=42bc2eb6-0654-413e-bf4f-7926c9f3efb5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9c1da312-2c1b-451b-9ea1-34ff96520bb9", "address": "fa:16:3e:6a:a8:74", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c1da312-2c", "ovs_interfaceid": "9c1da312-2c1b-451b-9ea1-34ff96520bb9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.190 183079 DEBUG nova.network.os_vif_util [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "9c1da312-2c1b-451b-9ea1-34ff96520bb9", "address": "fa:16:3e:6a:a8:74", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c1da312-2c", "ovs_interfaceid": "9c1da312-2c1b-451b-9ea1-34ff96520bb9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.190 183079 DEBUG nova.network.os_vif_util [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:a8:74,bridge_name='br-int',has_traffic_filtering=True,id=9c1da312-2c1b-451b-9ea1-34ff96520bb9,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c1da312-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.191 183079 DEBUG os_vif [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:a8:74,bridge_name='br-int',has_traffic_filtering=True,id=9c1da312-2c1b-451b-9ea1-34ff96520bb9,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c1da312-2c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.191 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.192 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.192 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.195 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.196 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9c1da312-2c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.196 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9c1da312-2c, col_values=(('external_ids', {'iface-id': '9c1da312-2c1b-451b-9ea1-34ff96520bb9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6a:a8:74', 'vm-uuid': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.198 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:27 compute-0 NetworkManager[55454]: <info>  [1769104407.2000] manager: (tap9c1da312-2c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/330)
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.200 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.208 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.210 183079 INFO os_vif [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:a8:74,bridge_name='br-int',has_traffic_filtering=True,id=9c1da312-2c1b-451b-9ea1-34ff96520bb9,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c1da312-2c')
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.364 183079 DEBUG nova.virt.libvirt.driver [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.365 183079 DEBUG nova.virt.libvirt.driver [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No VIF found with MAC fa:16:3e:6a:a8:74, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:53:27 compute-0 kernel: tap9c1da312-2c: entered promiscuous mode
Jan 22 17:53:27 compute-0 NetworkManager[55454]: <info>  [1769104407.4442] manager: (tap9c1da312-2c): new Tun device (/org/freedesktop/NetworkManager/Devices/331)
Jan 22 17:53:27 compute-0 systemd-udevd[243015]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.490 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:27 compute-0 ovn_controller[95372]: 2026-01-22T17:53:27Z|00827|binding|INFO|Claiming lport 9c1da312-2c1b-451b-9ea1-34ff96520bb9 for this chassis.
Jan 22 17:53:27 compute-0 ovn_controller[95372]: 2026-01-22T17:53:27Z|00828|binding|INFO|9c1da312-2c1b-451b-9ea1-34ff96520bb9: Claiming fa:16:3e:6a:a8:74 10.100.0.7
Jan 22 17:53:27 compute-0 NetworkManager[55454]: <info>  [1769104407.4998] device (tap9c1da312-2c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:53:27 compute-0 NetworkManager[55454]: <info>  [1769104407.5005] device (tap9c1da312-2c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:53:27 compute-0 ovn_controller[95372]: 2026-01-22T17:53:27Z|00829|binding|INFO|Setting lport 9c1da312-2c1b-451b-9ea1-34ff96520bb9 ovn-installed in OVS
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.505 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:27 compute-0 podman[242998]: 2026-01-22 17:53:27.508529416 +0000 UTC m=+0.071806337 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 17:53:27 compute-0 systemd-machined[154382]: New machine qemu-75-instance-0000004b.
Jan 22 17:53:27 compute-0 systemd[1]: Started Virtual Machine qemu-75-instance-0000004b.
Jan 22 17:53:27 compute-0 ovn_controller[95372]: 2026-01-22T17:53:27Z|00830|binding|INFO|Setting lport 9c1da312-2c1b-451b-9ea1-34ff96520bb9 up in Southbound
Jan 22 17:53:27 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:27.544 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:a8:74 10.100.0.7'], port_security=['fa:16:3e:6a:a8:74 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '865177eb-37df-4b01-a85a-fea79abf013c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=9c1da312-2c1b-451b-9ea1-34ff96520bb9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:53:27 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:27.546 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 9c1da312-2c1b-451b-9ea1-34ff96520bb9 in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 bound to our chassis
Jan 22 17:53:27 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:27.548 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:53:27 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:27.566 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[05b37730-5eda-44f4-aaa6-244b9359ed78]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:53:27 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:27.598 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[20acd83f-6c54-4cc4-a063-f2a21d2c2e8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:53:27 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:27.601 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[e9a74b63-426a-471e-ae8d-3028bb342001]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:53:27 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:27.629 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[2d4263c1-7a82-46a6-8dcf-e8ae138cbc6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:53:27 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:27.645 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[36ee7413-456f-48d6-ae68-c8a492e43611]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 55, 'rx_bytes': 8920, 'tx_bytes': 6197, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 55, 'rx_bytes': 8920, 'tx_bytes': 6197, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 222], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 669817, 'reachable_time': 23193, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243044, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:53:27 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:27.661 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0432e2e6-b9f9-47bd-a3c7-a836ebd3cb86]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 669828, 'tstamp': 669828}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243045, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 669843, 'tstamp': 669843}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243045, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:53:27 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:27.663 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:53:27 compute-0 nova_compute[183075]: 2026-01-22 17:53:27.665 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:27 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:27.666 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88ed9213-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:53:27 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:27.666 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:53:27 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:27.667 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88ed9213-70, col_values=(('external_ids', {'iface-id': 'da93a32e-eed6-4124-8f08-78b0bfe2cb69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:53:27 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:27.667 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.105 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104408.1049225, 42bc2eb6-0654-413e-bf4f-7926c9f3efb5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.106 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] VM Started (Lifecycle Event)
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.173 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.179 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104408.1050928, 42bc2eb6-0654-413e-bf4f-7926c9f3efb5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.179 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] VM Paused (Lifecycle Event)
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.200 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.204 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.280 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:53:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:28.689 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '56'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.787 183079 DEBUG nova.network.neutron [req-5fd8582f-875d-4e03-b330-46868f52db27 req-5dbe435b-81e9-462a-8ebe-ed8bf73b11af a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Updated VIF entry in instance network info cache for port 9c1da312-2c1b-451b-9ea1-34ff96520bb9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.787 183079 DEBUG nova.network.neutron [req-5fd8582f-875d-4e03-b330-46868f52db27 req-5dbe435b-81e9-462a-8ebe-ed8bf73b11af a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Updating instance_info_cache with network_info: [{"id": "9c1da312-2c1b-451b-9ea1-34ff96520bb9", "address": "fa:16:3e:6a:a8:74", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c1da312-2c", "ovs_interfaceid": "9c1da312-2c1b-451b-9ea1-34ff96520bb9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.801 183079 DEBUG oslo_concurrency.lockutils [req-5fd8582f-875d-4e03-b330-46868f52db27 req-5dbe435b-81e9-462a-8ebe-ed8bf73b11af a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-42bc2eb6-0654-413e-bf4f-7926c9f3efb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.886 183079 DEBUG nova.compute.manager [req-85a11408-d213-4149-800c-29ed9b63c585 req-c9c53e6e-d1f6-4652-b625-51ba8c0892e6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Received event network-vif-plugged-9c1da312-2c1b-451b-9ea1-34ff96520bb9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.886 183079 DEBUG oslo_concurrency.lockutils [req-85a11408-d213-4149-800c-29ed9b63c585 req-c9c53e6e-d1f6-4652-b625-51ba8c0892e6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "42bc2eb6-0654-413e-bf4f-7926c9f3efb5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.887 183079 DEBUG oslo_concurrency.lockutils [req-85a11408-d213-4149-800c-29ed9b63c585 req-c9c53e6e-d1f6-4652-b625-51ba8c0892e6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "42bc2eb6-0654-413e-bf4f-7926c9f3efb5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.887 183079 DEBUG oslo_concurrency.lockutils [req-85a11408-d213-4149-800c-29ed9b63c585 req-c9c53e6e-d1f6-4652-b625-51ba8c0892e6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "42bc2eb6-0654-413e-bf4f-7926c9f3efb5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.887 183079 DEBUG nova.compute.manager [req-85a11408-d213-4149-800c-29ed9b63c585 req-c9c53e6e-d1f6-4652-b625-51ba8c0892e6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Processing event network-vif-plugged-9c1da312-2c1b-451b-9ea1-34ff96520bb9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.887 183079 DEBUG nova.compute.manager [req-85a11408-d213-4149-800c-29ed9b63c585 req-c9c53e6e-d1f6-4652-b625-51ba8c0892e6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Received event network-vif-plugged-9c1da312-2c1b-451b-9ea1-34ff96520bb9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.888 183079 DEBUG oslo_concurrency.lockutils [req-85a11408-d213-4149-800c-29ed9b63c585 req-c9c53e6e-d1f6-4652-b625-51ba8c0892e6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "42bc2eb6-0654-413e-bf4f-7926c9f3efb5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.888 183079 DEBUG oslo_concurrency.lockutils [req-85a11408-d213-4149-800c-29ed9b63c585 req-c9c53e6e-d1f6-4652-b625-51ba8c0892e6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "42bc2eb6-0654-413e-bf4f-7926c9f3efb5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.888 183079 DEBUG oslo_concurrency.lockutils [req-85a11408-d213-4149-800c-29ed9b63c585 req-c9c53e6e-d1f6-4652-b625-51ba8c0892e6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "42bc2eb6-0654-413e-bf4f-7926c9f3efb5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.888 183079 DEBUG nova.compute.manager [req-85a11408-d213-4149-800c-29ed9b63c585 req-c9c53e6e-d1f6-4652-b625-51ba8c0892e6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] No waiting events found dispatching network-vif-plugged-9c1da312-2c1b-451b-9ea1-34ff96520bb9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.889 183079 WARNING nova.compute.manager [req-85a11408-d213-4149-800c-29ed9b63c585 req-c9c53e6e-d1f6-4652-b625-51ba8c0892e6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Received unexpected event network-vif-plugged-9c1da312-2c1b-451b-9ea1-34ff96520bb9 for instance with vm_state building and task_state spawning.
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.889 183079 DEBUG nova.compute.manager [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.892 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104408.8924935, 42bc2eb6-0654-413e-bf4f-7926c9f3efb5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.893 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] VM Resumed (Lifecycle Event)
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.894 183079 DEBUG nova.virt.libvirt.driver [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.897 183079 INFO nova.virt.libvirt.driver [-] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Instance spawned successfully.
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.898 183079 DEBUG nova.virt.libvirt.driver [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.913 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.917 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.925 183079 DEBUG nova.virt.libvirt.driver [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.925 183079 DEBUG nova.virt.libvirt.driver [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.926 183079 DEBUG nova.virt.libvirt.driver [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.926 183079 DEBUG nova.virt.libvirt.driver [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.927 183079 DEBUG nova.virt.libvirt.driver [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.927 183079 DEBUG nova.virt.libvirt.driver [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.933 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.973 183079 INFO nova.compute.manager [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Took 8.81 seconds to spawn the instance on the hypervisor.
Jan 22 17:53:28 compute-0 nova_compute[183075]: 2026-01-22 17:53:28.974 183079 DEBUG nova.compute.manager [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:53:29 compute-0 nova_compute[183075]: 2026-01-22 17:53:29.040 183079 INFO nova.compute.manager [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Took 9.98 seconds to build instance.
Jan 22 17:53:29 compute-0 nova_compute[183075]: 2026-01-22 17:53:29.061 183079 DEBUG oslo_concurrency.lockutils [None req-b026474c-fa98-4056-b45e-7e4d3be61ffd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "42bc2eb6-0654-413e-bf4f-7926c9f3efb5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:53:29 compute-0 nova_compute[183075]: 2026-01-22 17:53:29.173 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:31 compute-0 podman[243053]: 2026-01-22 17:53:31.346472877 +0000 UTC m=+0.052906790 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:53:32 compute-0 nova_compute[183075]: 2026-01-22 17:53:32.199 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:32 compute-0 nova_compute[183075]: 2026-01-22 17:53:32.718 183079 INFO nova.compute.manager [None req-60bfdc26-1333-4ef6-8c51-706345e3efca 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Get console output
Jan 22 17:53:32 compute-0 nova_compute[183075]: 2026-01-22 17:53:32.723 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:53:34 compute-0 nova_compute[183075]: 2026-01-22 17:53:34.175 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:37 compute-0 nova_compute[183075]: 2026-01-22 17:53:37.202 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:37 compute-0 nova_compute[183075]: 2026-01-22 17:53:37.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:53:38 compute-0 nova_compute[183075]: 2026-01-22 17:53:38.145 183079 INFO nova.compute.manager [None req-7f54e8cb-ad32-4d3d-885a-85cee2e8be03 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Get console output
Jan 22 17:53:38 compute-0 nova_compute[183075]: 2026-01-22 17:53:38.150 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:53:39 compute-0 nova_compute[183075]: 2026-01-22 17:53:39.176 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:39 compute-0 nova_compute[183075]: 2026-01-22 17:53:39.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:53:41 compute-0 ovn_controller[95372]: 2026-01-22T17:53:41Z|00152|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6a:a8:74 10.100.0.7
Jan 22 17:53:41 compute-0 ovn_controller[95372]: 2026-01-22T17:53:41Z|00153|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6a:a8:74 10.100.0.7
Jan 22 17:53:41 compute-0 nova_compute[183075]: 2026-01-22 17:53:41.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:53:41 compute-0 nova_compute[183075]: 2026-01-22 17:53:41.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:53:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:41.973 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:53:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:41.974 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:53:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:41.974 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:53:42 compute-0 nova_compute[183075]: 2026-01-22 17:53:42.205 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:42 compute-0 podman[243117]: 2026-01-22 17:53:42.348791011 +0000 UTC m=+0.052430497 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 22 17:53:42 compute-0 podman[243118]: 2026-01-22 17:53:42.381687204 +0000 UTC m=+0.084259571 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350)
Jan 22 17:53:42 compute-0 podman[243116]: 2026-01-22 17:53:42.402412919 +0000 UTC m=+0.109434315 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Jan 22 17:53:43 compute-0 nova_compute[183075]: 2026-01-22 17:53:43.690 183079 INFO nova.compute.manager [None req-29458bb6-6d76-49ba-93f4-a871100feed4 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Get console output
Jan 22 17:53:43 compute-0 nova_compute[183075]: 2026-01-22 17:53:43.695 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:53:44 compute-0 nova_compute[183075]: 2026-01-22 17:53:44.179 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:46 compute-0 nova_compute[183075]: 2026-01-22 17:53:46.784 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:53:47 compute-0 nova_compute[183075]: 2026-01-22 17:53:47.207 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:47 compute-0 podman[243178]: 2026-01-22 17:53:47.346176427 +0000 UTC m=+0.062625901 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.212 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.212 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.763 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.763 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.5509932
Jan 22 17:53:48 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.7:46466 [22/Jan/2026:17:53:48.211] listener listener/metadata 0/0/0/552/552 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.771 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.772 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.792 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:53:48 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.7:46480 [22/Jan/2026:17:53:48.771] listener listener/metadata 0/0/0/21/21 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.792 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0203381
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.796 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.796 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.813 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.814 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0176394
Jan 22 17:53:48 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.7:46496 [22/Jan/2026:17:53:48.795] listener listener/metadata 0/0/0/18/18 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.819 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.819 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.832 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.832 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0130126
Jan 22 17:53:48 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.7:46510 [22/Jan/2026:17:53:48.818] listener listener/metadata 0/0/0/14/14 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.837 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.837 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.854 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.855 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0170786
Jan 22 17:53:48 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.7:46520 [22/Jan/2026:17:53:48.836] listener listener/metadata 0/0/0/18/18 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.859 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.859 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:53:48 compute-0 nova_compute[183075]: 2026-01-22 17:53:48.867 183079 INFO nova.compute.manager [None req-09b49644-9331-42d0-a6a7-1cb460a5a179 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Get console output
Jan 22 17:53:48 compute-0 nova_compute[183075]: 2026-01-22 17:53:48.871 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.874 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:53:48 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.7:46530 [22/Jan/2026:17:53:48.858] listener listener/metadata 0/0/0/16/16 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.875 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0157094
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.879 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.880 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.900 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.900 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0205443
Jan 22 17:53:48 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.7:46546 [22/Jan/2026:17:53:48.879] listener listener/metadata 0/0/0/21/21 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.905 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.906 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.920 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.921 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0152850
Jan 22 17:53:48 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.7:46560 [22/Jan/2026:17:53:48.905] listener listener/metadata 0/0/0/16/16 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.927 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.927 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.944 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.944 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0169158
Jan 22 17:53:48 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.7:46568 [22/Jan/2026:17:53:48.926] listener listener/metadata 0/0/0/17/17 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.950 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.951 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.963 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.965 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0143392
Jan 22 17:53:48 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.7:46580 [22/Jan/2026:17:53:48.950] listener listener/metadata 0/0/0/16/16 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.973 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.974 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:53:48 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.7:46594 [22/Jan/2026:17:53:48.973] listener listener/metadata 0/0/0/17/17 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:53:48 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.990 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0161092
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:48.999 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:49.000 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:49.015 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:53:49 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.7:46596 [22/Jan/2026:17:53:48.999] listener listener/metadata 0/0/0/16/16 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:49.015 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0152168
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:49.019 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:49.019 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:49.039 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:49.040 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0204182
Jan 22 17:53:49 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.7:46598 [22/Jan/2026:17:53:49.018] listener listener/metadata 0/0/0/21/21 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:49.044 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:49.045 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:49.058 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:49.058 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0135264
Jan 22 17:53:49 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.7:46612 [22/Jan/2026:17:53:49.044] listener listener/metadata 0/0/0/14/14 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:49.063 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:49.063 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:49.077 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:49.078 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0145831
Jan 22 17:53:49 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.7:46628 [22/Jan/2026:17:53:49.062] listener listener/metadata 0/0/0/15/15 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:49.082 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:49.082 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.7
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:49.093 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:53:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:53:49.093 104990 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0106015
Jan 22 17:53:49 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.7:46632 [22/Jan/2026:17:53:49.082] listener listener/metadata 0/0/0/11/11 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:53:49 compute-0 nova_compute[183075]: 2026-01-22 17:53:49.181 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:52 compute-0 nova_compute[183075]: 2026-01-22 17:53:52.211 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:52 compute-0 nova_compute[183075]: 2026-01-22 17:53:52.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:53:52 compute-0 nova_compute[183075]: 2026-01-22 17:53:52.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:53:52 compute-0 nova_compute[183075]: 2026-01-22 17:53:52.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:53:54 compute-0 nova_compute[183075]: 2026-01-22 17:53:54.130 183079 INFO nova.compute.manager [None req-fb1c520c-5f22-4d03-9b03-47fafb0fb0fd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Get console output
Jan 22 17:53:54 compute-0 nova_compute[183075]: 2026-01-22 17:53:54.134 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:53:54 compute-0 nova_compute[183075]: 2026-01-22 17:53:54.183 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:54 compute-0 nova_compute[183075]: 2026-01-22 17:53:54.647 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-eadc13fe-ef8b-46e5-af17-40b408aa5aff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:53:54 compute-0 nova_compute[183075]: 2026-01-22 17:53:54.647 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-eadc13fe-ef8b-46e5-af17-40b408aa5aff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:53:54 compute-0 nova_compute[183075]: 2026-01-22 17:53:54.648 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:53:54 compute-0 nova_compute[183075]: 2026-01-22 17:53:54.648 183079 DEBUG nova.objects.instance [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lazy-loading 'info_cache' on Instance uuid eadc13fe-ef8b-46e5-af17-40b408aa5aff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.465 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'name': 'tempest-server-test-1316386013', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000004b', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'hostId': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.468 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'name': 'tempest-server-test-1171118669', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000004a', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'hostId': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.468 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.485 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/cpu volume: 11150000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.509 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/cpu volume: 11990000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f63c2af7-2059-43bb-b218-b4b1540e7907', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11150000000, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'timestamp': '2026-01-22T17:53:55.468790', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'instance-0000004b', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '51fb6f70-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.246229787, 'message_signature': '1c180e414e2fb44e5af96178cbc348ef044f1805c362681d1355d1bfd712465d'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11990000000, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 
'timestamp': '2026-01-22T17:53:55.468790', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'instance-0000004a', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '51ff0860-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.269785609, 'message_signature': 'dbfc30ececcd69debcbc347427fd9407b0f89561b18678d5f9ccad847187ec82'}]}, 'timestamp': '2026-01-22 17:53:55.510087', '_unique_id': '042c36fc1da648c0a5a02809fbb9fdda'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.511 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.512 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.514 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 42bc2eb6-0654-413e-bf4f-7926c9f3efb5 / tap9c1da312-2c inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.514 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.517 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for eadc13fe-ef8b-46e5-af17-40b408aa5aff / tap6f89fabe-02 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.517 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8548ff9e-788f-46eb-8ea2-3b68f5fa4df1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004b-42bc2eb6-0654-413e-bf4f-7926c9f3efb5-tap9c1da312-2c', 'timestamp': '2026-01-22T17:53:55.512590', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'tap9c1da312-2c', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:a8:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9c1da312-2c'}, 'message_id': '51ffd43e-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.273070167, 'message_signature': '41be4667cad20186fc34100e00306c2386f9058b632fb56c873d0b8e97bf4902'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004a-eadc13fe-ef8b-46e5-af17-40b408aa5aff-tap6f89fabe-02', 'timestamp': '2026-01-22T17:53:55.512590', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'tap6f89fabe-02', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:45:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6f89fabe-02'}, 'message_id': '52002fa6-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.275685837, 'message_signature': 'f075e8f0d54c6cbe2e429a7d03aee42d25fed4c45b34fb4562b56062b396a3db'}]}, 'timestamp': '2026-01-22 17:53:55.517570', '_unique_id': '259cfb0e85354e8380ab0fd277c6d04b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.518 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.519 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.519 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.519 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15027e4b-ca71-4db9-9d0f-2aee06cfe69f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004b-42bc2eb6-0654-413e-bf4f-7926c9f3efb5-tap9c1da312-2c', 'timestamp': '2026-01-22T17:53:55.519238', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'tap9c1da312-2c', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:a8:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9c1da312-2c'}, 'message_id': '52007bb4-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.273070167, 'message_signature': 'de0937166045793419504385ff06be96ab654988b241b2c3611dfe29ae679f3e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004a-eadc13fe-ef8b-46e5-af17-40b408aa5aff-tap6f89fabe-02', 'timestamp': '2026-01-22T17:53:55.519238', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'tap6f89fabe-02', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:45:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6f89fabe-02'}, 'message_id': '52008460-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.275685837, 'message_signature': 'ca41172a059ee0e2ab24fbac00a6e768398da16eae8c2b68a4cdb7f2af44e164'}]}, 'timestamp': '2026-01-22 17:53:55.519782', '_unique_id': '948d51e65ac044febae3de3e3a545d04'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.520 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.526 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk.device.usage volume: 29818880 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.531 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f92548ed-4463-47e0-b6c3-780c789959f1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29818880, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5-vda', 'timestamp': '2026-01-22T17:53:55.520898', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'instance-0000004b', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '52019bca-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.281326178, 'message_signature': '3f37b33763662877f7f1309ce01e9afdb5ca737fef2d71743c0e2560133d35d6'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 
'eadc13fe-ef8b-46e5-af17-40b408aa5aff-vda', 'timestamp': '2026-01-22T17:53:55.520898', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'instance-0000004a', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '52026c1c-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.287263548, 'message_signature': '4ccade6228606fdb6ef824d4188259bf0b36ecc4c230c1ffe04a4a45361c1284'}]}, 'timestamp': '2026-01-22 17:53:55.532203', '_unique_id': '2f1c236fd5ce4d738106b0d1cc46185c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.532 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.533 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.533 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.533 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fa35dcc1-ca03-4730-b3b5-b2c87f329cd4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004b-42bc2eb6-0654-413e-bf4f-7926c9f3efb5-tap9c1da312-2c', 'timestamp': '2026-01-22T17:53:55.533646', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'tap9c1da312-2c', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:a8:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9c1da312-2c'}, 'message_id': '5202ae2a-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.273070167, 'message_signature': '66a8ecb1e2ce14fc1b4c8d38b3b82b66ccb7024b0f563ae2319377ce64e7db9e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004a-eadc13fe-ef8b-46e5-af17-40b408aa5aff-tap6f89fabe-02', 'timestamp': '2026-01-22T17:53:55.533646', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'tap6f89fabe-02', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:45:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6f89fabe-02'}, 'message_id': '5202b64a-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.275685837, 'message_signature': 'dd9d22bb8df299ff524bbd363b43863e987bed3bc25458de7b7d2641f1a74050'}]}, 'timestamp': '2026-01-22 17:53:55.534085', '_unique_id': '7debb7c24235464b8a35adfd1aa8647e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.535 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.535 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/network.outgoing.bytes volume: 10570 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.535 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/network.outgoing.bytes volume: 11596 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8322738-5a3b-4264-9408-d1d477fa61eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10570, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004b-42bc2eb6-0654-413e-bf4f-7926c9f3efb5-tap9c1da312-2c', 'timestamp': '2026-01-22T17:53:55.535194', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'tap9c1da312-2c', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:a8:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9c1da312-2c'}, 'message_id': '5202ea48-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.273070167, 'message_signature': '1543a5ddb0470303ed40e38a8c0828a444c69014fd7d1a3ee28afc7c5b70d76f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11596, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004a-eadc13fe-ef8b-46e5-af17-40b408aa5aff-tap6f89fabe-02', 'timestamp': '2026-01-22T17:53:55.535194', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'tap6f89fabe-02', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:45:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6f89fabe-02'}, 'message_id': '5202f268-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.275685837, 'message_signature': '30c8d1327f5cb58bd43297a1f15faac122183f7ca797c5f41ef078a578e9ffac'}]}, 'timestamp': '2026-01-22 17:53:55.535680', '_unique_id': '7cf7faebb6e0413a8bd9b6697084ae1c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.536 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.550 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk.device.read.latency volume: 153122306 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.565 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk.device.read.latency volume: 162196742 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5ff425df-7645-433e-9da4-cfba4f243e17', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 153122306, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5-vda', 'timestamp': '2026-01-22T17:53:55.536988', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'instance-0000004b', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '520554b8-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.29741845, 'message_signature': '746a642000dfa3636446f6f569a7590938341983a2d5e6c72fbec88b10c3bf62'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 162196742, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff-vda', 'timestamp': '2026-01-22T17:53:55.536988', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'instance-0000004a', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '52078760-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.311796556, 'message_signature': 'bfcb7f0df81a6bc86df4aa19e55730cebfbefad6396b58606ae80f9f57361cb2'}]}, 'timestamp': '2026-01-22 17:53:55.565753', '_unique_id': '6da85c2663b04f1dad316c8d3c9cf5f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.566 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.567 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.567 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/memory.usage volume: 42.79296875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.568 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/memory.usage volume: 42.0703125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e401d301-d53a-4010-b4a0-a862fe78f2f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.79296875, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'timestamp': '2026-01-22T17:53:55.567922', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'instance-0000004b', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '5207eb60-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.246229787, 'message_signature': 'c374c1721070a230acd6456e38622dfb457f5034cb5e218cf63bdb5d51cf9ee6'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.0703125, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'timestamp': '2026-01-22T17:53:55.567922', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'instance-0000004a', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '5207f6dc-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.269785609, 'message_signature': 'd8f3ac220d0b2f5110f93f6b5b62195b0484723d2a5aa761dcb2589ecd8901d4'}]}, 'timestamp': '2026-01-22 17:53:55.568536', '_unique_id': 'a00088ef23c74fdb84ced8e73d06bdc8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.569 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.570 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.570 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.570 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1316386013>, <NovaLikeServer: tempest-server-test-1171118669>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1316386013>, <NovaLikeServer: tempest-server-test-1171118669>]
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.570 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.570 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk.device.read.bytes volume: 30050816 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.571 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk.device.read.bytes volume: 31209984 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '164ba28d-0de7-4c8c-b05a-19474294ccc2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30050816, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5-vda', 'timestamp': '2026-01-22T17:53:55.570884', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'instance-0000004b', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '52085d98-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.29741845, 'message_signature': 'cdca6d6cf1df038ee4d0283183ec21dd23b21c772197bdfbb812d51fdf558e2b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31209984, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff-vda', 'timestamp': '2026-01-22T17:53:55.570884', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'instance-0000004a', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5208684c-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.311796556, 'message_signature': '7aff2921e203d01b93113f8451996d8e125353176cc67056e0910547e0eec7d9'}]}, 'timestamp': '2026-01-22 17:53:55.571440', '_unique_id': '87d1e6f79ee346038e0023f4bd9b2796'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.572 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.573 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.573 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk.device.allocation volume: 29892608 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.573 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk.device.allocation volume: 30220288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f69e8727-c104-47c7-af0f-5a4b2c194aac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29892608, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5-vda', 'timestamp': '2026-01-22T17:53:55.573227', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'instance-0000004b', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5208b93c-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.281326178, 'message_signature': '19d0e3c3a4686fe713db08ae26c5785f7c9b899d04653f8bebae671d476d655b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30220288, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 
'eadc13fe-ef8b-46e5-af17-40b408aa5aff-vda', 'timestamp': '2026-01-22T17:53:55.573227', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'instance-0000004a', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5208c526-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.287263548, 'message_signature': '152861954a089feca95c506440f70d50e438e62a9702d40b23e6202dd174c569'}]}, 'timestamp': '2026-01-22 17:53:55.573822', '_unique_id': '6360573d6df649678dea2a772becab8e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.574 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.575 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.575 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk.device.write.bytes volume: 72871936 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.575 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk.device.write.bytes volume: 73220096 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '608d140e-93d3-41ee-9f50-3b671396b35b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72871936, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5-vda', 'timestamp': '2026-01-22T17:53:55.575507', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'instance-0000004b', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '52091364-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.29741845, 'message_signature': '200eed9e9cffadeae82a6ba2bfd3f89a16d0f4640a37deb48ad74b29db2cd69f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73220096, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 
'eadc13fe-ef8b-46e5-af17-40b408aa5aff-vda', 'timestamp': '2026-01-22T17:53:55.575507', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'instance-0000004a', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '52091ddc-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.311796556, 'message_signature': '6a807b00c2243d88983fd0ac8317e392e7b08b3f1fb3da06e8af312a0e451582'}]}, 'timestamp': '2026-01-22 17:53:55.576087', '_unique_id': 'eaf54fedc0e54788944d806d22d0a1a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.576 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.577 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.577 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.577 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-1316386013>, <NovaLikeServer: tempest-server-test-1171118669>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1316386013>, <NovaLikeServer: tempest-server-test-1171118669>]
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.578 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.578 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk.device.write.requests volume: 315 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.578 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk.device.write.requests volume: 346 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '478d3647-7a66-4169-83e9-0812991bd479', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 315, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5-vda', 'timestamp': '2026-01-22T17:53:55.578134', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'instance-0000004b', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '520979f8-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.29741845, 'message_signature': 'd1968639b0cd23b6343e71c9e63030bdfe75169c335ed7251880993d6582561e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 346, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 
'resource_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff-vda', 'timestamp': '2026-01-22T17:53:55.578134', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'instance-0000004a', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '52098574-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.311796556, 'message_signature': 'd1932086809e42db6dce7cc0c82b3060287245708f8f94b4c404b88cb366ca84'}]}, 'timestamp': '2026-01-22 17:53:55.578761', '_unique_id': 'c0a8137851be4dc0ba21656545f52522'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.579 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.580 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.580 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk.device.write.latency volume: 3150419303 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.580 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk.device.write.latency volume: 2227812578 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26369a77-40e3-446b-959f-88c1f80fa27d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3150419303, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5-vda', 'timestamp': '2026-01-22T17:53:55.580368', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'instance-0000004b', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5209d01a-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.29741845, 'message_signature': '5a26f6c91916cafab55ca69f4ccbb36aab4ff0b19d61de4aeb999b3b9d1ad2a2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2227812578, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 
'resource_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff-vda', 'timestamp': '2026-01-22T17:53:55.580368', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'instance-0000004a', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5209dbbe-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.311796556, 'message_signature': 'add79783962a4f255c0009d60cab09a5495a999629317cb0e99a762384db2749'}]}, 'timestamp': '2026-01-22 17:53:55.580950', '_unique_id': '054d6287d0f64af9b35eac1354890a21'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.581 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.582 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.582 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk.device.read.requests volume: 1113 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.582 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk.device.read.requests volume: 1153 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dbd4991a-a234-46a1-aa6a-0a002dc11a0c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1113, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5-vda', 'timestamp': '2026-01-22T17:53:55.582427', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'instance-0000004b', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '520a207e-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.29741845, 'message_signature': '50ee5424ee315a15fd9b93c211d951f59f1f1d58fdb1c7cb85da3157f37960cd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1153, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 
'resource_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff-vda', 'timestamp': '2026-01-22T17:53:55.582427', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'instance-0000004a', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '520a2bbe-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.311796556, 'message_signature': 'e47c9c5cd36deac6bc5acc70735cbaac85b7cace835f4717e560dbe7ad5c9ca4'}]}, 'timestamp': '2026-01-22 17:53:55.582993', '_unique_id': '85d76d910ee24f8098a99cd99c572bc0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.583 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.584 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.584 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1f2d5b4a-de46-41ad-9d67-ddb22173dbd6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5-vda', 'timestamp': '2026-01-22T17:53:55.584810', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'instance-0000004b', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '520a7c7c-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.281326178, 'message_signature': '6ce7dc4a8aa4ec350de390c6d135e320c494f87426b84b3d14e7a3ad36ad2976'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 
'eadc13fe-ef8b-46e5-af17-40b408aa5aff-vda', 'timestamp': '2026-01-22T17:53:55.584810', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'instance-0000004a', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '520a84ec-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.287263548, 'message_signature': 'c91cd553ba162369ecaf6c97e0d07816a8e14f6eb116d38dc0df0a167b921f28'}]}, 'timestamp': '2026-01-22 17:53:55.585247', '_unique_id': '6f94344274464e52803e52d69b65732c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.585 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.586 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.586 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.586 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'beb8eb79-cb40-4c1d-aa08-62494106f1d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004b-42bc2eb6-0654-413e-bf4f-7926c9f3efb5-tap9c1da312-2c', 'timestamp': '2026-01-22T17:53:55.586460', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'tap9c1da312-2c', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:a8:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9c1da312-2c'}, 'message_id': '520abcf0-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.273070167, 'message_signature': '98fe1d9eac2fa8aa141cceeb76259572cb8970b2f0ddea8951c5f130313e2737'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004a-eadc13fe-ef8b-46e5-af17-40b408aa5aff-tap6f89fabe-02', 'timestamp': '2026-01-22T17:53:55.586460', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'tap6f89fabe-02', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:45:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6f89fabe-02'}, 'message_id': '520ac772-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.275685837, 'message_signature': '98c9a56685c3d0a287a12b6612ce183a2bd84cdd84b66e331f40f5ee42eaac82'}]}, 'timestamp': '2026-01-22 17:53:55.586956', '_unique_id': 'd4aad1e7b6dd4d58ac74585802051f45'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.587 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.588 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.588 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/network.outgoing.packets volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.588 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/network.outgoing.packets volume: 131 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4d43552-21c6-48f0-a4ec-a9d583580de8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 120, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004b-42bc2eb6-0654-413e-bf4f-7926c9f3efb5-tap9c1da312-2c', 'timestamp': '2026-01-22T17:53:55.588129', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'tap9c1da312-2c', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:a8:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9c1da312-2c'}, 'message_id': '520afe22-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.273070167, 'message_signature': 'cf98f867fd7d455a6728938b97f7eff21a5f9401ff7a2b9ba9e2d2964916c354'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 131, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004a-eadc13fe-ef8b-46e5-af17-40b408aa5aff-tap6f89fabe-02', 'timestamp': '2026-01-22T17:53:55.588129', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'tap6f89fabe-02', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:45:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6f89fabe-02'}, 'message_id': '520b06ce-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.275685837, 'message_signature': '39215198ec1d967bbc47adf85eea82a2586cef3857df2e75e53f8f2484adbc2b'}]}, 'timestamp': '2026-01-22 17:53:55.588576', '_unique_id': '6b8cab1aab8f4a668ecc07e51c904871'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.589 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '35c4bff4-843e-4dae-b79d-bf518dd1a839', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004b-42bc2eb6-0654-413e-bf4f-7926c9f3efb5-tap9c1da312-2c', 'timestamp': '2026-01-22T17:53:55.589690', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'tap9c1da312-2c', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:a8:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9c1da312-2c'}, 'message_id': '520b3aea-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.273070167, 'message_signature': '30e7601f5e334344d5b6f1d987a81618a9ada6c8614af9a321aa4335bc145c28'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004a-eadc13fe-ef8b-46e5-af17-40b408aa5aff-tap6f89fabe-02', 'timestamp': '2026-01-22T17:53:55.589690', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'tap6f89fabe-02', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:45:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6f89fabe-02'}, 'message_id': '520b4332-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.275685837, 'message_signature': 'fd76557ee73ace3b1fa35e83dfc50458182098979f610f0768ac58bba21456ac'}]}, 'timestamp': '2026-01-22 17:53:55.590120', '_unique_id': 'd678039668394938a725f99b8237571d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.590 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.591 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.591 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.591 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97f69529-0223-4c59-88b6-21688087fc61', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004b-42bc2eb6-0654-413e-bf4f-7926c9f3efb5-tap9c1da312-2c', 'timestamp': '2026-01-22T17:53:55.591216', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'tap9c1da312-2c', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:a8:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9c1da312-2c'}, 'message_id': '520b7726-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.273070167, 'message_signature': 'd22b7770cedfc3b26d64d9963d95adf2ddf0bd5e0c6704975cf9a8c97dc6140c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004a-eadc13fe-ef8b-46e5-af17-40b408aa5aff-tap6f89fabe-02', 'timestamp': '2026-01-22T17:53:55.591216', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'tap6f89fabe-02', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:45:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6f89fabe-02'}, 'message_id': '520b7f32-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.275685837, 'message_signature': '28239d6a89e824d80bfe5101ff3a8ea7c59bb24f84d890fc033bf4694dcafe3b'}]}, 'timestamp': '2026-01-22 17:53:55.591687', '_unique_id': '6d3def5fb3004a518fd942240a9731f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.592 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1316386013>, <NovaLikeServer: tempest-server-test-1171118669>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1316386013>, <NovaLikeServer: tempest-server-test-1171118669>]
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.593 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.593 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/network.incoming.bytes volume: 7207 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.593 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/network.incoming.bytes volume: 7447 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '853d2c80-7a34-4d1c-aab7-6218ee579c9e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7207, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004b-42bc2eb6-0654-413e-bf4f-7926c9f3efb5-tap9c1da312-2c', 'timestamp': '2026-01-22T17:53:55.593075', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'tap9c1da312-2c', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:a8:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9c1da312-2c'}, 'message_id': '520bbf1a-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.273070167, 'message_signature': 'd0733a99eced4af3ed52478cca86195b21da447acdc72d00f9b906182a01d71d'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7447, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004a-eadc13fe-ef8b-46e5-af17-40b408aa5aff-tap6f89fabe-02', 'timestamp': '2026-01-22T17:53:55.593075', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'tap6f89fabe-02', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:45:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6f89fabe-02'}, 'message_id': '520bc99c-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.275685837, 'message_signature': 'c84c03779c22c00b05df99ad9fe0fa61b7f63acc30173bfe14651776c8f45f4a'}]}, 'timestamp': '2026-01-22 17:53:55.593602', '_unique_id': '511207784589439b8a1c8d284f633ad8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.594 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.595 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.595 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/network.incoming.packets volume: 61 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.595 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/network.incoming.packets volume: 65 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '399a7bc6-62d3-4ea7-bd95-abdec6c4ca9b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 61, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004b-42bc2eb6-0654-413e-bf4f-7926c9f3efb5-tap9c1da312-2c', 'timestamp': '2026-01-22T17:53:55.595162', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'tap9c1da312-2c', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:a8:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9c1da312-2c'}, 'message_id': '520c12e4-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.273070167, 'message_signature': '25f216540621bd9ffc741043b25d429f82b6888f7a93f963384ece057348eb36'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 65, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004a-eadc13fe-ef8b-46e5-af17-40b408aa5aff-tap6f89fabe-02', 'timestamp': '2026-01-22T17:53:55.595162', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'tap6f89fabe-02', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:45:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6f89fabe-02'}, 'message_id': '520c1fbe-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6799.275685837, 'message_signature': '7b63b7881fa8908b04558cdbd56def3f15e199cc2ded4b90a9aed01a90a5ad27'}]}, 'timestamp': '2026-01-22 17:53:55.595813', '_unique_id': '9748c9f888114db3bdfa1b2e4a8cbb99'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.596 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.597 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.597 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:53:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:53:55.597 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-1316386013>, <NovaLikeServer: tempest-server-test-1171118669>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1316386013>, <NovaLikeServer: tempest-server-test-1171118669>]
Jan 22 17:53:56 compute-0 nova_compute[183075]: 2026-01-22 17:53:56.974 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Updating instance_info_cache with network_info: [{"id": "6f89fabe-02ec-4340-9703-82e93c101ebc", "address": "fa:16:3e:45:45:58", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f89fabe-02", "ovs_interfaceid": "6f89fabe-02ec-4340-9703-82e93c101ebc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:53:56 compute-0 nova_compute[183075]: 2026-01-22 17:53:56.995 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-eadc13fe-ef8b-46e5-af17-40b408aa5aff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:53:56 compute-0 nova_compute[183075]: 2026-01-22 17:53:56.996 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:53:56 compute-0 nova_compute[183075]: 2026-01-22 17:53:56.996 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:53:56 compute-0 nova_compute[183075]: 2026-01-22 17:53:56.997 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:53:56 compute-0 nova_compute[183075]: 2026-01-22 17:53:56.997 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:53:56 compute-0 nova_compute[183075]: 2026-01-22 17:53:56.997 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:53:57 compute-0 nova_compute[183075]: 2026-01-22 17:53:57.023 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:53:57 compute-0 nova_compute[183075]: 2026-01-22 17:53:57.024 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:53:57 compute-0 nova_compute[183075]: 2026-01-22 17:53:57.024 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:53:57 compute-0 nova_compute[183075]: 2026-01-22 17:53:57.024 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:53:57 compute-0 nova_compute[183075]: 2026-01-22 17:53:57.090 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:53:57 compute-0 nova_compute[183075]: 2026-01-22 17:53:57.146 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:53:57 compute-0 nova_compute[183075]: 2026-01-22 17:53:57.147 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:53:57 compute-0 nova_compute[183075]: 2026-01-22 17:53:57.204 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:53:57 compute-0 nova_compute[183075]: 2026-01-22 17:53:57.209 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:53:57 compute-0 nova_compute[183075]: 2026-01-22 17:53:57.226 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:57 compute-0 nova_compute[183075]: 2026-01-22 17:53:57.265 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:53:57 compute-0 nova_compute[183075]: 2026-01-22 17:53:57.266 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:53:57 compute-0 nova_compute[183075]: 2026-01-22 17:53:57.323 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:53:57 compute-0 nova_compute[183075]: 2026-01-22 17:53:57.473 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:53:57 compute-0 nova_compute[183075]: 2026-01-22 17:53:57.474 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5374MB free_disk=73.29404067993164GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:53:57 compute-0 nova_compute[183075]: 2026-01-22 17:53:57.474 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:53:57 compute-0 nova_compute[183075]: 2026-01-22 17:53:57.474 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:53:57 compute-0 ovn_controller[95372]: 2026-01-22T17:53:57Z|00831|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 22 17:53:57 compute-0 nova_compute[183075]: 2026-01-22 17:53:57.551 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance eadc13fe-ef8b-46e5-af17-40b408aa5aff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:53:57 compute-0 nova_compute[183075]: 2026-01-22 17:53:57.551 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 42bc2eb6-0654-413e-bf4f-7926c9f3efb5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:53:57 compute-0 nova_compute[183075]: 2026-01-22 17:53:57.552 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:53:57 compute-0 nova_compute[183075]: 2026-01-22 17:53:57.552 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:53:57 compute-0 nova_compute[183075]: 2026-01-22 17:53:57.599 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:53:57 compute-0 nova_compute[183075]: 2026-01-22 17:53:57.627 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:53:57 compute-0 nova_compute[183075]: 2026-01-22 17:53:57.647 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:53:57 compute-0 nova_compute[183075]: 2026-01-22 17:53:57.648 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:53:58 compute-0 podman[243211]: 2026-01-22 17:53:58.345204173 +0000 UTC m=+0.048041090 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:53:59 compute-0 nova_compute[183075]: 2026-01-22 17:53:59.184 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:59 compute-0 nova_compute[183075]: 2026-01-22 17:53:59.272 183079 INFO nova.compute.manager [None req-9a2fd986-fa19-4348-a9d6-e42ccc0d9d0f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Get console output
Jan 22 17:53:59 compute-0 nova_compute[183075]: 2026-01-22 17:53:59.277 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:54:02 compute-0 nova_compute[183075]: 2026-01-22 17:54:02.229 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:54:02 compute-0 podman[243236]: 2026-01-22 17:54:02.332687954 +0000 UTC m=+0.047486905 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 17:54:04 compute-0 nova_compute[183075]: 2026-01-22 17:54:04.186 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:54:04 compute-0 nova_compute[183075]: 2026-01-22 17:54:04.411 183079 INFO nova.compute.manager [None req-f3925c60-e477-481a-98c7-796315c38881 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Get console output
Jan 22 17:54:04 compute-0 nova_compute[183075]: 2026-01-22 17:54:04.416 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:54:07 compute-0 nova_compute[183075]: 2026-01-22 17:54:07.233 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:54:09 compute-0 nova_compute[183075]: 2026-01-22 17:54:09.188 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:54:09 compute-0 nova_compute[183075]: 2026-01-22 17:54:09.548 183079 INFO nova.compute.manager [None req-edfd2580-5d25-42dc-89e3-3e2994842901 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Get console output
Jan 22 17:54:09 compute-0 nova_compute[183075]: 2026-01-22 17:54:09.552 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:54:12 compute-0 nova_compute[183075]: 2026-01-22 17:54:12.236 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:54:13 compute-0 podman[243261]: 2026-01-22 17:54:13.350689829 +0000 UTC m=+0.049792556 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:54:13 compute-0 podman[243260]: 2026-01-22 17:54:13.379611965 +0000 UTC m=+0.081999580 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, 
container_name=ovn_controller, org.label-schema.license=GPLv2)
Jan 22 17:54:13 compute-0 podman[243262]: 2026-01-22 17:54:13.444464144 +0000 UTC m=+0.138543576 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, io.openshift.expose-services=, container_name=openstack_network_exporter, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, managed_by=edpm_ansible, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 22 17:54:14 compute-0 nova_compute[183075]: 2026-01-22 17:54:14.189 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:54:14 compute-0 nova_compute[183075]: 2026-01-22 17:54:14.668 183079 INFO nova.compute.manager [None req-a04c6f11-c0bd-4044-818f-8f495108c651 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Get console output
Jan 22 17:54:14 compute-0 nova_compute[183075]: 2026-01-22 17:54:14.673 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:54:16 compute-0 nova_compute[183075]: 2026-01-22 17:54:16.644 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:54:17 compute-0 nova_compute[183075]: 2026-01-22 17:54:17.238 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:54:18 compute-0 podman[243323]: 2026-01-22 17:54:18.348643261 +0000 UTC m=+0.058195412 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:54:19 compute-0 nova_compute[183075]: 2026-01-22 17:54:19.192 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:54:19 compute-0 nova_compute[183075]: 2026-01-22 17:54:19.858 183079 INFO nova.compute.manager [None req-29b4f077-5b89-44b0-a2aa-39734976eaa1 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Get console output
Jan 22 17:54:19 compute-0 nova_compute[183075]: 2026-01-22 17:54:19.862 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:54:22 compute-0 nova_compute[183075]: 2026-01-22 17:54:22.241 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:54:24 compute-0 nova_compute[183075]: 2026-01-22 17:54:24.193 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:54:24 compute-0 nova_compute[183075]: 2026-01-22 17:54:24.976 183079 INFO nova.compute.manager [None req-5fc41b49-537b-40cc-90b8-bec2612a079e 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Get console output
Jan 22 17:54:24 compute-0 nova_compute[183075]: 2026-01-22 17:54:24.983 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:54:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:54:26.442 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=57, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=56) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:54:26 compute-0 nova_compute[183075]: 2026-01-22 17:54:26.443 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:54:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:54:26.443 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:54:27 compute-0 nova_compute[183075]: 2026-01-22 17:54:27.243 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:54:27 compute-0 nova_compute[183075]: 2026-01-22 17:54:27.981 183079 DEBUG nova.compute.manager [req-4b3b0587-5cb3-4f18-bb22-aab7686a6106 req-dcbfae78-1a0a-4d63-bf17-4d0191c3b0a7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Received event network-changed-6f89fabe-02ec-4340-9703-82e93c101ebc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:54:27 compute-0 nova_compute[183075]: 2026-01-22 17:54:27.982 183079 DEBUG nova.compute.manager [req-4b3b0587-5cb3-4f18-bb22-aab7686a6106 req-dcbfae78-1a0a-4d63-bf17-4d0191c3b0a7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Refreshing instance network info cache due to event network-changed-6f89fabe-02ec-4340-9703-82e93c101ebc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:54:27 compute-0 nova_compute[183075]: 2026-01-22 17:54:27.982 183079 DEBUG oslo_concurrency.lockutils [req-4b3b0587-5cb3-4f18-bb22-aab7686a6106 req-dcbfae78-1a0a-4d63-bf17-4d0191c3b0a7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-eadc13fe-ef8b-46e5-af17-40b408aa5aff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:54:27 compute-0 nova_compute[183075]: 2026-01-22 17:54:27.982 183079 DEBUG oslo_concurrency.lockutils [req-4b3b0587-5cb3-4f18-bb22-aab7686a6106 req-dcbfae78-1a0a-4d63-bf17-4d0191c3b0a7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-eadc13fe-ef8b-46e5-af17-40b408aa5aff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:54:27 compute-0 nova_compute[183075]: 2026-01-22 17:54:27.982 183079 DEBUG nova.network.neutron [req-4b3b0587-5cb3-4f18-bb22-aab7686a6106 req-dcbfae78-1a0a-4d63-bf17-4d0191c3b0a7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Refreshing network info cache for port 6f89fabe-02ec-4340-9703-82e93c101ebc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:54:28 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:54:28.445 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '57'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:54:29 compute-0 nova_compute[183075]: 2026-01-22 17:54:29.244 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:54:29 compute-0 podman[243343]: 2026-01-22 17:54:29.342588921 +0000 UTC m=+0.049810217 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 17:54:29 compute-0 nova_compute[183075]: 2026-01-22 17:54:29.740 183079 DEBUG nova.network.neutron [req-4b3b0587-5cb3-4f18-bb22-aab7686a6106 req-dcbfae78-1a0a-4d63-bf17-4d0191c3b0a7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Updated VIF entry in instance network info cache for port 6f89fabe-02ec-4340-9703-82e93c101ebc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:54:29 compute-0 nova_compute[183075]: 2026-01-22 17:54:29.741 183079 DEBUG nova.network.neutron [req-4b3b0587-5cb3-4f18-bb22-aab7686a6106 req-dcbfae78-1a0a-4d63-bf17-4d0191c3b0a7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Updating instance_info_cache with network_info: [{"id": "6f89fabe-02ec-4340-9703-82e93c101ebc", "address": "fa:16:3e:45:45:58", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f89fabe-02", "ovs_interfaceid": "6f89fabe-02ec-4340-9703-82e93c101ebc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:54:29 compute-0 nova_compute[183075]: 2026-01-22 17:54:29.760 183079 DEBUG oslo_concurrency.lockutils [req-4b3b0587-5cb3-4f18-bb22-aab7686a6106 req-dcbfae78-1a0a-4d63-bf17-4d0191c3b0a7 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-eadc13fe-ef8b-46e5-af17-40b408aa5aff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:54:30 compute-0 nova_compute[183075]: 2026-01-22 17:54:30.796 183079 DEBUG nova.compute.manager [req-b1f388e4-7659-46ea-a7e9-13f9c6b39cd5 req-bc01ab8d-6c11-4f2f-b141-831505539b5d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Received event network-changed-9c1da312-2c1b-451b-9ea1-34ff96520bb9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:54:30 compute-0 nova_compute[183075]: 2026-01-22 17:54:30.796 183079 DEBUG nova.compute.manager [req-b1f388e4-7659-46ea-a7e9-13f9c6b39cd5 req-bc01ab8d-6c11-4f2f-b141-831505539b5d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Refreshing instance network info cache due to event network-changed-9c1da312-2c1b-451b-9ea1-34ff96520bb9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:54:30 compute-0 nova_compute[183075]: 2026-01-22 17:54:30.797 183079 DEBUG oslo_concurrency.lockutils [req-b1f388e4-7659-46ea-a7e9-13f9c6b39cd5 req-bc01ab8d-6c11-4f2f-b141-831505539b5d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-42bc2eb6-0654-413e-bf4f-7926c9f3efb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:54:30 compute-0 nova_compute[183075]: 2026-01-22 17:54:30.797 183079 DEBUG oslo_concurrency.lockutils [req-b1f388e4-7659-46ea-a7e9-13f9c6b39cd5 req-bc01ab8d-6c11-4f2f-b141-831505539b5d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-42bc2eb6-0654-413e-bf4f-7926c9f3efb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:54:30 compute-0 nova_compute[183075]: 2026-01-22 17:54:30.797 183079 DEBUG nova.network.neutron [req-b1f388e4-7659-46ea-a7e9-13f9c6b39cd5 req-bc01ab8d-6c11-4f2f-b141-831505539b5d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Refreshing network info cache for port 9c1da312-2c1b-451b-9ea1-34ff96520bb9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:54:32 compute-0 nova_compute[183075]: 2026-01-22 17:54:32.092 183079 DEBUG nova.network.neutron [req-b1f388e4-7659-46ea-a7e9-13f9c6b39cd5 req-bc01ab8d-6c11-4f2f-b141-831505539b5d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Updated VIF entry in instance network info cache for port 9c1da312-2c1b-451b-9ea1-34ff96520bb9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:54:32 compute-0 nova_compute[183075]: 2026-01-22 17:54:32.092 183079 DEBUG nova.network.neutron [req-b1f388e4-7659-46ea-a7e9-13f9c6b39cd5 req-bc01ab8d-6c11-4f2f-b141-831505539b5d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Updating instance_info_cache with network_info: [{"id": "9c1da312-2c1b-451b-9ea1-34ff96520bb9", "address": "fa:16:3e:6a:a8:74", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c1da312-2c", "ovs_interfaceid": "9c1da312-2c1b-451b-9ea1-34ff96520bb9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:54:32 compute-0 nova_compute[183075]: 2026-01-22 17:54:32.116 183079 DEBUG oslo_concurrency.lockutils [req-b1f388e4-7659-46ea-a7e9-13f9c6b39cd5 req-bc01ab8d-6c11-4f2f-b141-831505539b5d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-42bc2eb6-0654-413e-bf4f-7926c9f3efb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:54:32 compute-0 nova_compute[183075]: 2026-01-22 17:54:32.245 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:54:33 compute-0 nova_compute[183075]: 2026-01-22 17:54:33.032 183079 DEBUG oslo_concurrency.lockutils [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "f7112f58-59da-486c-9079-e0bc3fa6a3e6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:54:33 compute-0 nova_compute[183075]: 2026-01-22 17:54:33.033 183079 DEBUG oslo_concurrency.lockutils [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "f7112f58-59da-486c-9079-e0bc3fa6a3e6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:54:33 compute-0 nova_compute[183075]: 2026-01-22 17:54:33.049 183079 DEBUG nova.compute.manager [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:54:33 compute-0 nova_compute[183075]: 2026-01-22 17:54:33.145 183079 DEBUG oslo_concurrency.lockutils [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:54:33 compute-0 nova_compute[183075]: 2026-01-22 17:54:33.146 183079 DEBUG oslo_concurrency.lockutils [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:54:33 compute-0 nova_compute[183075]: 2026-01-22 17:54:33.153 183079 DEBUG nova.virt.hardware [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:54:33 compute-0 nova_compute[183075]: 2026-01-22 17:54:33.153 183079 INFO nova.compute.claims [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:54:33 compute-0 podman[243367]: 2026-01-22 17:54:33.345453404 +0000 UTC m=+0.052785857 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:54:33 compute-0 nova_compute[183075]: 2026-01-22 17:54:33.520 183079 DEBUG nova.compute.provider_tree [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:54:33 compute-0 nova_compute[183075]: 2026-01-22 17:54:33.540 183079 DEBUG nova.scheduler.client.report [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:54:33 compute-0 nova_compute[183075]: 2026-01-22 17:54:33.561 183079 DEBUG oslo_concurrency.lockutils [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.415s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:54:33 compute-0 nova_compute[183075]: 2026-01-22 17:54:33.562 183079 DEBUG nova.compute.manager [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:54:33 compute-0 nova_compute[183075]: 2026-01-22 17:54:33.606 183079 DEBUG nova.compute.manager [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:54:33 compute-0 nova_compute[183075]: 2026-01-22 17:54:33.606 183079 DEBUG nova.network.neutron [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:54:33 compute-0 nova_compute[183075]: 2026-01-22 17:54:33.626 183079 INFO nova.virt.libvirt.driver [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:54:33 compute-0 nova_compute[183075]: 2026-01-22 17:54:33.653 183079 DEBUG nova.compute.manager [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:54:33 compute-0 nova_compute[183075]: 2026-01-22 17:54:33.766 183079 DEBUG nova.compute.manager [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:54:33 compute-0 nova_compute[183075]: 2026-01-22 17:54:33.767 183079 DEBUG nova.virt.libvirt.driver [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:54:33 compute-0 nova_compute[183075]: 2026-01-22 17:54:33.768 183079 INFO nova.virt.libvirt.driver [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Creating image(s)
Jan 22 17:54:33 compute-0 nova_compute[183075]: 2026-01-22 17:54:33.768 183079 DEBUG oslo_concurrency.lockutils [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "/var/lib/nova/instances/f7112f58-59da-486c-9079-e0bc3fa6a3e6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:54:33 compute-0 nova_compute[183075]: 2026-01-22 17:54:33.768 183079 DEBUG oslo_concurrency.lockutils [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/f7112f58-59da-486c-9079-e0bc3fa6a3e6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:54:33 compute-0 nova_compute[183075]: 2026-01-22 17:54:33.769 183079 DEBUG oslo_concurrency.lockutils [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/f7112f58-59da-486c-9079-e0bc3fa6a3e6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:54:33 compute-0 nova_compute[183075]: 2026-01-22 17:54:33.781 183079 DEBUG oslo_concurrency.processutils [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:54:33 compute-0 nova_compute[183075]: 2026-01-22 17:54:33.839 183079 DEBUG oslo_concurrency.processutils [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:54:33 compute-0 nova_compute[183075]: 2026-01-22 17:54:33.840 183079 DEBUG oslo_concurrency.lockutils [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:54:33 compute-0 nova_compute[183075]: 2026-01-22 17:54:33.841 183079 DEBUG oslo_concurrency.lockutils [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:54:33 compute-0 nova_compute[183075]: 2026-01-22 17:54:33.852 183079 DEBUG oslo_concurrency.processutils [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:54:33 compute-0 nova_compute[183075]: 2026-01-22 17:54:33.906 183079 DEBUG oslo_concurrency.processutils [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:54:33 compute-0 nova_compute[183075]: 2026-01-22 17:54:33.907 183079 DEBUG oslo_concurrency.processutils [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/f7112f58-59da-486c-9079-e0bc3fa6a3e6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:54:34 compute-0 nova_compute[183075]: 2026-01-22 17:54:34.007 183079 DEBUG oslo_concurrency.processutils [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/f7112f58-59da-486c-9079-e0bc3fa6a3e6/disk 1073741824" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:54:34 compute-0 nova_compute[183075]: 2026-01-22 17:54:34.008 183079 DEBUG oslo_concurrency.lockutils [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:54:34 compute-0 nova_compute[183075]: 2026-01-22 17:54:34.009 183079 DEBUG oslo_concurrency.processutils [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:54:34 compute-0 nova_compute[183075]: 2026-01-22 17:54:34.077 183079 DEBUG oslo_concurrency.processutils [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:54:34 compute-0 nova_compute[183075]: 2026-01-22 17:54:34.078 183079 DEBUG nova.virt.disk.api [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Checking if we can resize image /var/lib/nova/instances/f7112f58-59da-486c-9079-e0bc3fa6a3e6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:54:34 compute-0 nova_compute[183075]: 2026-01-22 17:54:34.078 183079 DEBUG oslo_concurrency.processutils [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7112f58-59da-486c-9079-e0bc3fa6a3e6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:54:34 compute-0 nova_compute[183075]: 2026-01-22 17:54:34.140 183079 DEBUG oslo_concurrency.processutils [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7112f58-59da-486c-9079-e0bc3fa6a3e6/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:54:34 compute-0 nova_compute[183075]: 2026-01-22 17:54:34.141 183079 DEBUG nova.virt.disk.api [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Cannot resize image /var/lib/nova/instances/f7112f58-59da-486c-9079-e0bc3fa6a3e6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:54:34 compute-0 nova_compute[183075]: 2026-01-22 17:54:34.142 183079 DEBUG nova.objects.instance [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'migration_context' on Instance uuid f7112f58-59da-486c-9079-e0bc3fa6a3e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:54:34 compute-0 nova_compute[183075]: 2026-01-22 17:54:34.155 183079 DEBUG nova.virt.libvirt.driver [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:54:34 compute-0 nova_compute[183075]: 2026-01-22 17:54:34.156 183079 DEBUG nova.virt.libvirt.driver [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Ensure instance console log exists: /var/lib/nova/instances/f7112f58-59da-486c-9079-e0bc3fa6a3e6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:54:34 compute-0 nova_compute[183075]: 2026-01-22 17:54:34.156 183079 DEBUG oslo_concurrency.lockutils [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:54:34 compute-0 nova_compute[183075]: 2026-01-22 17:54:34.156 183079 DEBUG oslo_concurrency.lockutils [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:54:34 compute-0 nova_compute[183075]: 2026-01-22 17:54:34.157 183079 DEBUG oslo_concurrency.lockutils [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:54:34 compute-0 nova_compute[183075]: 2026-01-22 17:54:34.247 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:54:34 compute-0 nova_compute[183075]: 2026-01-22 17:54:34.696 183079 DEBUG nova.policy [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:54:37 compute-0 nova_compute[183075]: 2026-01-22 17:54:37.247 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:54:37 compute-0 nova_compute[183075]: 2026-01-22 17:54:37.818 183079 DEBUG nova.network.neutron [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Successfully created port: b10e0a25-0ce5-49eb-ac39-9b770e42a3fc _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:54:38 compute-0 nova_compute[183075]: 2026-01-22 17:54:38.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:54:39 compute-0 nova_compute[183075]: 2026-01-22 17:54:39.250 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:54:40 compute-0 nova_compute[183075]: 2026-01-22 17:54:40.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:54:41 compute-0 nova_compute[183075]: 2026-01-22 17:54:41.686 183079 DEBUG nova.network.neutron [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Successfully updated port: b10e0a25-0ce5-49eb-ac39-9b770e42a3fc _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:54:41 compute-0 nova_compute[183075]: 2026-01-22 17:54:41.925 183079 DEBUG nova.compute.manager [req-4ad2a669-9b0e-4c94-a2d7-e10bd9319d1d req-b5568e97-b15f-4bf3-85fc-3d17f2022278 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Received event network-changed-b10e0a25-0ce5-49eb-ac39-9b770e42a3fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:54:41 compute-0 nova_compute[183075]: 2026-01-22 17:54:41.925 183079 DEBUG nova.compute.manager [req-4ad2a669-9b0e-4c94-a2d7-e10bd9319d1d req-b5568e97-b15f-4bf3-85fc-3d17f2022278 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Refreshing instance network info cache due to event network-changed-b10e0a25-0ce5-49eb-ac39-9b770e42a3fc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:54:41 compute-0 nova_compute[183075]: 2026-01-22 17:54:41.926 183079 DEBUG oslo_concurrency.lockutils [req-4ad2a669-9b0e-4c94-a2d7-e10bd9319d1d req-b5568e97-b15f-4bf3-85fc-3d17f2022278 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-f7112f58-59da-486c-9079-e0bc3fa6a3e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:54:41 compute-0 nova_compute[183075]: 2026-01-22 17:54:41.926 183079 DEBUG oslo_concurrency.lockutils [req-4ad2a669-9b0e-4c94-a2d7-e10bd9319d1d req-b5568e97-b15f-4bf3-85fc-3d17f2022278 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-f7112f58-59da-486c-9079-e0bc3fa6a3e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:54:41 compute-0 nova_compute[183075]: 2026-01-22 17:54:41.926 183079 DEBUG nova.network.neutron [req-4ad2a669-9b0e-4c94-a2d7-e10bd9319d1d req-b5568e97-b15f-4bf3-85fc-3d17f2022278 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Refreshing network info cache for port b10e0a25-0ce5-49eb-ac39-9b770e42a3fc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:54:41 compute-0 nova_compute[183075]: 2026-01-22 17:54:41.940 183079 DEBUG oslo_concurrency.lockutils [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "refresh_cache-f7112f58-59da-486c-9079-e0bc3fa6a3e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:54:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:54:41.975 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:54:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:54:41.976 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:54:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:54:41.976 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:54:42 compute-0 nova_compute[183075]: 2026-01-22 17:54:42.249 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:54:42 compute-0 nova_compute[183075]: 2026-01-22 17:54:42.786 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:54:42 compute-0 nova_compute[183075]: 2026-01-22 17:54:42.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:54:42 compute-0 nova_compute[183075]: 2026-01-22 17:54:42.908 183079 DEBUG nova.network.neutron [req-4ad2a669-9b0e-4c94-a2d7-e10bd9319d1d req-b5568e97-b15f-4bf3-85fc-3d17f2022278 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:54:43 compute-0 nova_compute[183075]: 2026-01-22 17:54:43.661 183079 DEBUG nova.network.neutron [req-4ad2a669-9b0e-4c94-a2d7-e10bd9319d1d req-b5568e97-b15f-4bf3-85fc-3d17f2022278 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:54:43 compute-0 nova_compute[183075]: 2026-01-22 17:54:43.678 183079 DEBUG oslo_concurrency.lockutils [req-4ad2a669-9b0e-4c94-a2d7-e10bd9319d1d req-b5568e97-b15f-4bf3-85fc-3d17f2022278 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-f7112f58-59da-486c-9079-e0bc3fa6a3e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:54:43 compute-0 nova_compute[183075]: 2026-01-22 17:54:43.678 183079 DEBUG oslo_concurrency.lockutils [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquired lock "refresh_cache-f7112f58-59da-486c-9079-e0bc3fa6a3e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:54:43 compute-0 nova_compute[183075]: 2026-01-22 17:54:43.679 183079 DEBUG nova.network.neutron [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:54:43 compute-0 nova_compute[183075]: 2026-01-22 17:54:43.798 183079 DEBUG nova.network.neutron [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:54:44 compute-0 nova_compute[183075]: 2026-01-22 17:54:44.252 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:54:44 compute-0 podman[243408]: 2026-01-22 17:54:44.353645016 +0000 UTC m=+0.049899929 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, name=ubi9-minimal, release=1755695350, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 22 17:54:44 compute-0 podman[243407]: 2026-01-22 17:54:44.353738468 +0000 UTC m=+0.052858818 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 17:54:44 compute-0 podman[243406]: 2026-01-22 17:54:44.367483467 +0000 UTC m=+0.070610975 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 22 17:54:44 compute-0 nova_compute[183075]: 2026-01-22 17:54:44.978 183079 DEBUG nova.network.neutron [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Updating instance_info_cache with network_info: [{"id": "b10e0a25-0ce5-49eb-ac39-9b770e42a3fc", "address": "fa:16:3e:db:07:5c", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb10e0a25-0c", "ovs_interfaceid": "b10e0a25-0ce5-49eb-ac39-9b770e42a3fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.001 183079 DEBUG oslo_concurrency.lockutils [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Releasing lock "refresh_cache-f7112f58-59da-486c-9079-e0bc3fa6a3e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.002 183079 DEBUG nova.compute.manager [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Instance network_info: |[{"id": "b10e0a25-0ce5-49eb-ac39-9b770e42a3fc", "address": "fa:16:3e:db:07:5c", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb10e0a25-0c", "ovs_interfaceid": "b10e0a25-0ce5-49eb-ac39-9b770e42a3fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.004 183079 DEBUG nova.virt.libvirt.driver [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Start _get_guest_xml network_info=[{"id": "b10e0a25-0ce5-49eb-ac39-9b770e42a3fc", "address": "fa:16:3e:db:07:5c", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb10e0a25-0c", "ovs_interfaceid": "b10e0a25-0ce5-49eb-ac39-9b770e42a3fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.009 183079 WARNING nova.virt.libvirt.driver [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.018 183079 DEBUG nova.virt.libvirt.host [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.019 183079 DEBUG nova.virt.libvirt.host [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.023 183079 DEBUG nova.virt.libvirt.host [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.023 183079 DEBUG nova.virt.libvirt.host [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.024 183079 DEBUG nova.virt.libvirt.driver [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.024 183079 DEBUG nova.virt.hardware [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.024 183079 DEBUG nova.virt.hardware [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.025 183079 DEBUG nova.virt.hardware [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.025 183079 DEBUG nova.virt.hardware [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.025 183079 DEBUG nova.virt.hardware [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.025 183079 DEBUG nova.virt.hardware [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.025 183079 DEBUG nova.virt.hardware [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.026 183079 DEBUG nova.virt.hardware [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.026 183079 DEBUG nova.virt.hardware [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.026 183079 DEBUG nova.virt.hardware [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.026 183079 DEBUG nova.virt.hardware [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.029 183079 DEBUG nova.virt.libvirt.vif [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:54:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-832225797',display_name='tempest-server-test-832225797',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-832225797',id=76,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-x4ls3ch2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:54:33Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=f7112f58-59da-486c-9079-e0bc3fa6a3e6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b10e0a25-0ce5-49eb-ac39-9b770e42a3fc", "address": "fa:16:3e:db:07:5c", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb10e0a25-0c", "ovs_interfaceid": "b10e0a25-0ce5-49eb-ac39-9b770e42a3fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.030 183079 DEBUG nova.network.os_vif_util [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "b10e0a25-0ce5-49eb-ac39-9b770e42a3fc", "address": "fa:16:3e:db:07:5c", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb10e0a25-0c", "ovs_interfaceid": "b10e0a25-0ce5-49eb-ac39-9b770e42a3fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.030 183079 DEBUG nova.network.os_vif_util [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:07:5c,bridge_name='br-int',has_traffic_filtering=True,id=b10e0a25-0ce5-49eb-ac39-9b770e42a3fc,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb10e0a25-0c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.031 183079 DEBUG nova.objects.instance [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'pci_devices' on Instance uuid f7112f58-59da-486c-9079-e0bc3fa6a3e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.050 183079 DEBUG nova.virt.libvirt.driver [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:54:45 compute-0 nova_compute[183075]:   <uuid>f7112f58-59da-486c-9079-e0bc3fa6a3e6</uuid>
Jan 22 17:54:45 compute-0 nova_compute[183075]:   <name>instance-0000004c</name>
Jan 22 17:54:45 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:54:45 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:54:45 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:54:45 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-832225797</nova:name>
Jan 22 17:54:45 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:54:45</nova:creationTime>
Jan 22 17:54:45 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:54:45 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:54:45 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:54:45 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:54:45 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:54:45 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:54:45 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:54:45 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:54:45 compute-0 nova_compute[183075]:         <nova:user uuid="66c24efd0cd24c57803ea8508e679d06">tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member</nova:user>
Jan 22 17:54:45 compute-0 nova_compute[183075]:         <nova:project uuid="cbff5cec4b9c4d2eb317124ca20fea9f">tempest-StatelessNetworkSecGroupIPv4Test-1348485316</nova:project>
Jan 22 17:54:45 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:54:45 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:54:45 compute-0 nova_compute[183075]:         <nova:port uuid="b10e0a25-0ce5-49eb-ac39-9b770e42a3fc">
Jan 22 17:54:45 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:54:45 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:54:45 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:54:45 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <system>
Jan 22 17:54:45 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:54:45 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:54:45 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:54:45 compute-0 nova_compute[183075]:       <entry name="serial">f7112f58-59da-486c-9079-e0bc3fa6a3e6</entry>
Jan 22 17:54:45 compute-0 nova_compute[183075]:       <entry name="uuid">f7112f58-59da-486c-9079-e0bc3fa6a3e6</entry>
Jan 22 17:54:45 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     </system>
Jan 22 17:54:45 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:54:45 compute-0 nova_compute[183075]:   <os>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:   </os>
Jan 22 17:54:45 compute-0 nova_compute[183075]:   <features>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:   </features>
Jan 22 17:54:45 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:54:45 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:54:45 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:54:45 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/f7112f58-59da-486c-9079-e0bc3fa6a3e6/disk"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:54:45 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:db:07:5c"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:       <target dev="tapb10e0a25-0c"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:54:45 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/f7112f58-59da-486c-9079-e0bc3fa6a3e6/console.log" append="off"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <video>
Jan 22 17:54:45 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     </video>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:54:45 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:54:45 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:54:45 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:54:45 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:54:45 compute-0 nova_compute[183075]: </domain>
Jan 22 17:54:45 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.051 183079 DEBUG nova.compute.manager [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Preparing to wait for external event network-vif-plugged-b10e0a25-0ce5-49eb-ac39-9b770e42a3fc prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.052 183079 DEBUG oslo_concurrency.lockutils [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "f7112f58-59da-486c-9079-e0bc3fa6a3e6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.052 183079 DEBUG oslo_concurrency.lockutils [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "f7112f58-59da-486c-9079-e0bc3fa6a3e6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.053 183079 DEBUG oslo_concurrency.lockutils [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "f7112f58-59da-486c-9079-e0bc3fa6a3e6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.053 183079 DEBUG nova.virt.libvirt.vif [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:54:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-832225797',display_name='tempest-server-test-832225797',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-832225797',id=76,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-x4ls3ch2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng
_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:54:33Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=f7112f58-59da-486c-9079-e0bc3fa6a3e6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b10e0a25-0ce5-49eb-ac39-9b770e42a3fc", "address": "fa:16:3e:db:07:5c", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb10e0a25-0c", "ovs_interfaceid": "b10e0a25-0ce5-49eb-ac39-9b770e42a3fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.054 183079 DEBUG nova.network.os_vif_util [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "b10e0a25-0ce5-49eb-ac39-9b770e42a3fc", "address": "fa:16:3e:db:07:5c", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb10e0a25-0c", "ovs_interfaceid": "b10e0a25-0ce5-49eb-ac39-9b770e42a3fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.054 183079 DEBUG nova.network.os_vif_util [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:07:5c,bridge_name='br-int',has_traffic_filtering=True,id=b10e0a25-0ce5-49eb-ac39-9b770e42a3fc,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb10e0a25-0c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.055 183079 DEBUG os_vif [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:07:5c,bridge_name='br-int',has_traffic_filtering=True,id=b10e0a25-0ce5-49eb-ac39-9b770e42a3fc,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb10e0a25-0c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.055 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.056 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.056 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.060 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.061 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb10e0a25-0c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.061 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb10e0a25-0c, col_values=(('external_ids', {'iface-id': 'b10e0a25-0ce5-49eb-ac39-9b770e42a3fc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:db:07:5c', 'vm-uuid': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.063 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:54:45 compute-0 NetworkManager[55454]: <info>  [1769104485.0642] manager: (tapb10e0a25-0c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/332)
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.066 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.072 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.075 183079 INFO os_vif [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:07:5c,bridge_name='br-int',has_traffic_filtering=True,id=b10e0a25-0ce5-49eb-ac39-9b770e42a3fc,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb10e0a25-0c')
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.118 183079 DEBUG nova.virt.libvirt.driver [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.119 183079 DEBUG nova.virt.libvirt.driver [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No VIF found with MAC fa:16:3e:db:07:5c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:54:45 compute-0 NetworkManager[55454]: <info>  [1769104485.1719] manager: (tapb10e0a25-0c): new Tun device (/org/freedesktop/NetworkManager/Devices/333)
Jan 22 17:54:45 compute-0 kernel: tapb10e0a25-0c: entered promiscuous mode
Jan 22 17:54:45 compute-0 ovn_controller[95372]: 2026-01-22T17:54:45Z|00832|binding|INFO|Claiming lport b10e0a25-0ce5-49eb-ac39-9b770e42a3fc for this chassis.
Jan 22 17:54:45 compute-0 ovn_controller[95372]: 2026-01-22T17:54:45Z|00833|binding|INFO|b10e0a25-0ce5-49eb-ac39-9b770e42a3fc: Claiming fa:16:3e:db:07:5c 10.100.0.12
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.176 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:54:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:54:45.184 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:07:5c 10.100.0.12'], port_security=['fa:16:3e:db:07:5c 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '865177eb-37df-4b01-a85a-fea79abf013c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=b10e0a25-0ce5-49eb-ac39-9b770e42a3fc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:54:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:54:45.185 104629 INFO neutron.agent.ovn.metadata.agent [-] Port b10e0a25-0ce5-49eb-ac39-9b770e42a3fc in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 bound to our chassis
Jan 22 17:54:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:54:45.186 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:54:45 compute-0 ovn_controller[95372]: 2026-01-22T17:54:45Z|00834|binding|INFO|Setting lport b10e0a25-0ce5-49eb-ac39-9b770e42a3fc up in Southbound
Jan 22 17:54:45 compute-0 ovn_controller[95372]: 2026-01-22T17:54:45Z|00835|binding|INFO|Setting lport b10e0a25-0ce5-49eb-ac39-9b770e42a3fc ovn-installed in OVS
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.191 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:54:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:54:45.209 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[99e77289-a254-4918-bfdf-d7a92e42506b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:54:45 compute-0 systemd-udevd[243488]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:54:45 compute-0 systemd-machined[154382]: New machine qemu-76-instance-0000004c.
Jan 22 17:54:45 compute-0 NetworkManager[55454]: <info>  [1769104485.2271] device (tapb10e0a25-0c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:54:45 compute-0 NetworkManager[55454]: <info>  [1769104485.2282] device (tapb10e0a25-0c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:54:45 compute-0 systemd[1]: Started Virtual Machine qemu-76-instance-0000004c.
Jan 22 17:54:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:54:45.265 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[1773eca4-ee5d-4f1a-8f65-b986f4ea00ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:54:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:54:45.268 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[3eae3e04-cda3-4b61-b417-4980fa96ab8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:54:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:54:45.302 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[4d255538-1d6c-4b21-a590-978e7ef3b895]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:54:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:54:45.322 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ea4acf91-9f08-4369-bc80-d47903b153d5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 202, 'tx_packets': 106, 'rx_bytes': 17308, 'tx_bytes': 12058, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 202, 'tx_packets': 106, 'rx_bytes': 17308, 'tx_bytes': 12058, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 222], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 669817, 'reachable_time': 41616, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243502, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:54:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:54:45.340 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ba7634d4-8d86-4cdd-85d5-e2dae10e6ce7]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 669828, 'tstamp': 669828}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243503, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 669843, 'tstamp': 669843}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243503, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:54:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:54:45.342 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.343 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.344 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:54:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:54:45.345 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88ed9213-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:54:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:54:45.345 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:54:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:54:45.345 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88ed9213-70, col_values=(('external_ids', {'iface-id': 'da93a32e-eed6-4124-8f08-78b0bfe2cb69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:54:45 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:54:45.345 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.561 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104485.5612555, f7112f58-59da-486c-9079-e0bc3fa6a3e6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.562 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] VM Started (Lifecycle Event)
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.590 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.595 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104485.561503, f7112f58-59da-486c-9079-e0bc3fa6a3e6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.595 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] VM Paused (Lifecycle Event)
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.614 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.618 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.637 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.821 183079 DEBUG nova.compute.manager [req-9ae95695-8ee5-4f90-8875-a507aefc5f69 req-ad0962e5-c518-4c8f-9d1f-8145cd11296e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Received event network-vif-plugged-b10e0a25-0ce5-49eb-ac39-9b770e42a3fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.821 183079 DEBUG oslo_concurrency.lockutils [req-9ae95695-8ee5-4f90-8875-a507aefc5f69 req-ad0962e5-c518-4c8f-9d1f-8145cd11296e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "f7112f58-59da-486c-9079-e0bc3fa6a3e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.822 183079 DEBUG oslo_concurrency.lockutils [req-9ae95695-8ee5-4f90-8875-a507aefc5f69 req-ad0962e5-c518-4c8f-9d1f-8145cd11296e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "f7112f58-59da-486c-9079-e0bc3fa6a3e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.822 183079 DEBUG oslo_concurrency.lockutils [req-9ae95695-8ee5-4f90-8875-a507aefc5f69 req-ad0962e5-c518-4c8f-9d1f-8145cd11296e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "f7112f58-59da-486c-9079-e0bc3fa6a3e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.823 183079 DEBUG nova.compute.manager [req-9ae95695-8ee5-4f90-8875-a507aefc5f69 req-ad0962e5-c518-4c8f-9d1f-8145cd11296e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Processing event network-vif-plugged-b10e0a25-0ce5-49eb-ac39-9b770e42a3fc _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.823 183079 DEBUG nova.compute.manager [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.827 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104485.8263674, f7112f58-59da-486c-9079-e0bc3fa6a3e6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.827 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] VM Resumed (Lifecycle Event)
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.828 183079 DEBUG nova.virt.libvirt.driver [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.832 183079 INFO nova.virt.libvirt.driver [-] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Instance spawned successfully.
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.832 183079 DEBUG nova.virt.libvirt.driver [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.861 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.863 183079 DEBUG nova.virt.libvirt.driver [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.864 183079 DEBUG nova.virt.libvirt.driver [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.864 183079 DEBUG nova.virt.libvirt.driver [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.865 183079 DEBUG nova.virt.libvirt.driver [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.865 183079 DEBUG nova.virt.libvirt.driver [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.865 183079 DEBUG nova.virt.libvirt.driver [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.869 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.901 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.925 183079 INFO nova.compute.manager [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Took 12.16 seconds to spawn the instance on the hypervisor.
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.925 183079 DEBUG nova.compute.manager [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:54:45 compute-0 nova_compute[183075]: 2026-01-22 17:54:45.987 183079 INFO nova.compute.manager [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Took 12.89 seconds to build instance.
Jan 22 17:54:46 compute-0 nova_compute[183075]: 2026-01-22 17:54:46.002 183079 DEBUG oslo_concurrency.lockutils [None req-266c3586-cfc5-46a0-bd6e-0b64fbeb1ba8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "f7112f58-59da-486c-9079-e0bc3fa6a3e6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.969s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:54:46 compute-0 nova_compute[183075]: 2026-01-22 17:54:46.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:54:47 compute-0 nova_compute[183075]: 2026-01-22 17:54:47.106 183079 INFO nova.compute.manager [None req-9b40fb67-bbcb-4496-a7a2-00c9b9a73beb 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Get console output
Jan 22 17:54:47 compute-0 nova_compute[183075]: 2026-01-22 17:54:47.111 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:54:47 compute-0 nova_compute[183075]: 2026-01-22 17:54:47.907 183079 DEBUG nova.compute.manager [req-037ca5b0-9868-4b03-b48a-057a5661a5ed req-5e29835a-4ca6-484c-b2a2-3c11e446b836 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Received event network-vif-plugged-b10e0a25-0ce5-49eb-ac39-9b770e42a3fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:54:47 compute-0 nova_compute[183075]: 2026-01-22 17:54:47.907 183079 DEBUG oslo_concurrency.lockutils [req-037ca5b0-9868-4b03-b48a-057a5661a5ed req-5e29835a-4ca6-484c-b2a2-3c11e446b836 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "f7112f58-59da-486c-9079-e0bc3fa6a3e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:54:47 compute-0 nova_compute[183075]: 2026-01-22 17:54:47.908 183079 DEBUG oslo_concurrency.lockutils [req-037ca5b0-9868-4b03-b48a-057a5661a5ed req-5e29835a-4ca6-484c-b2a2-3c11e446b836 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "f7112f58-59da-486c-9079-e0bc3fa6a3e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:54:47 compute-0 nova_compute[183075]: 2026-01-22 17:54:47.908 183079 DEBUG oslo_concurrency.lockutils [req-037ca5b0-9868-4b03-b48a-057a5661a5ed req-5e29835a-4ca6-484c-b2a2-3c11e446b836 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "f7112f58-59da-486c-9079-e0bc3fa6a3e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:54:47 compute-0 nova_compute[183075]: 2026-01-22 17:54:47.908 183079 DEBUG nova.compute.manager [req-037ca5b0-9868-4b03-b48a-057a5661a5ed req-5e29835a-4ca6-484c-b2a2-3c11e446b836 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] No waiting events found dispatching network-vif-plugged-b10e0a25-0ce5-49eb-ac39-9b770e42a3fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:54:47 compute-0 nova_compute[183075]: 2026-01-22 17:54:47.909 183079 WARNING nova.compute.manager [req-037ca5b0-9868-4b03-b48a-057a5661a5ed req-5e29835a-4ca6-484c-b2a2-3c11e446b836 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Received unexpected event network-vif-plugged-b10e0a25-0ce5-49eb-ac39-9b770e42a3fc for instance with vm_state active and task_state None.
Jan 22 17:54:49 compute-0 nova_compute[183075]: 2026-01-22 17:54:49.254 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:54:49 compute-0 podman[243511]: 2026-01-22 17:54:49.354798044 +0000 UTC m=+0.065789496 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute)
Jan 22 17:54:50 compute-0 nova_compute[183075]: 2026-01-22 17:54:50.065 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:54:51 compute-0 nova_compute[183075]: 2026-01-22 17:54:51.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:54:51 compute-0 nova_compute[183075]: 2026-01-22 17:54:51.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 17:54:52 compute-0 nova_compute[183075]: 2026-01-22 17:54:52.216 183079 INFO nova.compute.manager [None req-9b9b4e75-9daa-4780-aec2-7c562d0ba8da 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Get console output
Jan 22 17:54:52 compute-0 nova_compute[183075]: 2026-01-22 17:54:52.221 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:54:52 compute-0 nova_compute[183075]: 2026-01-22 17:54:52.840 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:54:52 compute-0 nova_compute[183075]: 2026-01-22 17:54:52.841 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:54:53 compute-0 nova_compute[183075]: 2026-01-22 17:54:53.673 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-42bc2eb6-0654-413e-bf4f-7926c9f3efb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:54:53 compute-0 nova_compute[183075]: 2026-01-22 17:54:53.674 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-42bc2eb6-0654-413e-bf4f-7926c9f3efb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:54:53 compute-0 nova_compute[183075]: 2026-01-22 17:54:53.674 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:54:54 compute-0 nova_compute[183075]: 2026-01-22 17:54:54.255 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:54:55 compute-0 nova_compute[183075]: 2026-01-22 17:54:55.067 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:54:55 compute-0 nova_compute[183075]: 2026-01-22 17:54:55.115 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Updating instance_info_cache with network_info: [{"id": "9c1da312-2c1b-451b-9ea1-34ff96520bb9", "address": "fa:16:3e:6a:a8:74", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c1da312-2c", "ovs_interfaceid": "9c1da312-2c1b-451b-9ea1-34ff96520bb9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:54:55 compute-0 nova_compute[183075]: 2026-01-22 17:54:55.134 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-42bc2eb6-0654-413e-bf4f-7926c9f3efb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:54:55 compute-0 nova_compute[183075]: 2026-01-22 17:54:55.134 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:54:55 compute-0 nova_compute[183075]: 2026-01-22 17:54:55.134 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:54:55 compute-0 nova_compute[183075]: 2026-01-22 17:54:55.134 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:54:55 compute-0 nova_compute[183075]: 2026-01-22 17:54:55.135 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:54:55 compute-0 nova_compute[183075]: 2026-01-22 17:54:55.156 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:54:55 compute-0 nova_compute[183075]: 2026-01-22 17:54:55.157 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:54:55 compute-0 nova_compute[183075]: 2026-01-22 17:54:55.157 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:54:55 compute-0 nova_compute[183075]: 2026-01-22 17:54:55.157 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:54:55 compute-0 nova_compute[183075]: 2026-01-22 17:54:55.232 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7112f58-59da-486c-9079-e0bc3fa6a3e6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:54:55 compute-0 nova_compute[183075]: 2026-01-22 17:54:55.300 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7112f58-59da-486c-9079-e0bc3fa6a3e6/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:54:55 compute-0 nova_compute[183075]: 2026-01-22 17:54:55.301 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7112f58-59da-486c-9079-e0bc3fa6a3e6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:54:55 compute-0 nova_compute[183075]: 2026-01-22 17:54:55.358 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7112f58-59da-486c-9079-e0bc3fa6a3e6/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:54:55 compute-0 nova_compute[183075]: 2026-01-22 17:54:55.367 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:54:55 compute-0 nova_compute[183075]: 2026-01-22 17:54:55.424 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:54:55 compute-0 nova_compute[183075]: 2026-01-22 17:54:55.426 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:54:55 compute-0 nova_compute[183075]: 2026-01-22 17:54:55.486 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:54:55 compute-0 nova_compute[183075]: 2026-01-22 17:54:55.494 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:54:55 compute-0 nova_compute[183075]: 2026-01-22 17:54:55.554 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:54:55 compute-0 nova_compute[183075]: 2026-01-22 17:54:55.555 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:54:55 compute-0 nova_compute[183075]: 2026-01-22 17:54:55.616 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:54:55 compute-0 nova_compute[183075]: 2026-01-22 17:54:55.808 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:54:55 compute-0 nova_compute[183075]: 2026-01-22 17:54:55.810 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5211MB free_disk=73.29375076293945GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:54:55 compute-0 nova_compute[183075]: 2026-01-22 17:54:55.811 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:54:55 compute-0 nova_compute[183075]: 2026-01-22 17:54:55.811 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:54:55 compute-0 nova_compute[183075]: 2026-01-22 17:54:55.901 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance eadc13fe-ef8b-46e5-af17-40b408aa5aff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:54:55 compute-0 nova_compute[183075]: 2026-01-22 17:54:55.902 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 42bc2eb6-0654-413e-bf4f-7926c9f3efb5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:54:55 compute-0 nova_compute[183075]: 2026-01-22 17:54:55.902 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance f7112f58-59da-486c-9079-e0bc3fa6a3e6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:54:55 compute-0 nova_compute[183075]: 2026-01-22 17:54:55.902 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:54:55 compute-0 nova_compute[183075]: 2026-01-22 17:54:55.902 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:54:55 compute-0 nova_compute[183075]: 2026-01-22 17:54:55.965 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:54:55 compute-0 nova_compute[183075]: 2026-01-22 17:54:55.981 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:54:56 compute-0 nova_compute[183075]: 2026-01-22 17:54:56.007 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:54:56 compute-0 nova_compute[183075]: 2026-01-22 17:54:56.008 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:54:56 compute-0 nova_compute[183075]: 2026-01-22 17:54:56.661 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:54:57 compute-0 nova_compute[183075]: 2026-01-22 17:54:57.357 183079 INFO nova.compute.manager [None req-74e20946-fabe-482e-9ff8-34d783cf0f6c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Get console output
Jan 22 17:54:57 compute-0 nova_compute[183075]: 2026-01-22 17:54:57.363 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:54:58 compute-0 ovn_controller[95372]: 2026-01-22T17:54:58Z|00154|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:db:07:5c 10.100.0.12
Jan 22 17:54:58 compute-0 ovn_controller[95372]: 2026-01-22T17:54:58Z|00155|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:db:07:5c 10.100.0.12
Jan 22 17:54:59 compute-0 nova_compute[183075]: 2026-01-22 17:54:59.256 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:55:00 compute-0 nova_compute[183075]: 2026-01-22 17:55:00.070 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:55:00 compute-0 podman[243589]: 2026-01-22 17:55:00.338950891 +0000 UTC m=+0.049014096 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 17:55:02 compute-0 nova_compute[183075]: 2026-01-22 17:55:02.479 183079 INFO nova.compute.manager [None req-9be16929-f282-4130-af89-181135f8b4c7 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Get console output
Jan 22 17:55:02 compute-0 nova_compute[183075]: 2026-01-22 17:55:02.485 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:55:04 compute-0 nova_compute[183075]: 2026-01-22 17:55:04.257 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:55:04 compute-0 podman[243615]: 2026-01-22 17:55:04.336925214 +0000 UTC m=+0.047457014 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:55:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:04.918 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:55:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:04.919 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:55:04 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:55:04 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:55:04 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:55:04 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:55:04 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:55:04 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:55:04 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:55:05 compute-0 nova_compute[183075]: 2026-01-22 17:55:05.129 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.518 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.519 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 1.5997152
Jan 22 17:55:06 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.12:46986 [22/Jan/2026:17:55:04.918] listener listener/metadata 0/0/0/1601/1601 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.526 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.527 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.545 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.545 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0185447
Jan 22 17:55:06 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.12:46996 [22/Jan/2026:17:55:06.526] listener listener/metadata 0/0/0/19/19 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.549 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.550 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.565 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.565 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0152519
Jan 22 17:55:06 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.12:47002 [22/Jan/2026:17:55:06.549] listener listener/metadata 0/0/0/16/16 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.570 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.570 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.583 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.583 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0130041
Jan 22 17:55:06 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.12:47016 [22/Jan/2026:17:55:06.569] listener listener/metadata 0/0/0/14/14 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.588 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.588 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.600 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.600 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0116637
Jan 22 17:55:06 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.12:47032 [22/Jan/2026:17:55:06.587] listener listener/metadata 0/0/0/12/12 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.604 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.604 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.618 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.618 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0136588
Jan 22 17:55:06 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.12:47042 [22/Jan/2026:17:55:06.604] listener listener/metadata 0/0/0/14/14 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.622 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.623 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.635 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:55:06 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.12:47058 [22/Jan/2026:17:55:06.622] listener listener/metadata 0/0/0/13/13 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.636 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0129344
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.642 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.643 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.661 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.661 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0177848
Jan 22 17:55:06 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.12:47060 [22/Jan/2026:17:55:06.641] listener listener/metadata 0/0/0/19/19 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.666 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.667 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.680 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.680 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 165 time: 0.0134773
Jan 22 17:55:06 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.12:47066 [22/Jan/2026:17:55:06.666] listener listener/metadata 0/0/0/14/14 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.684 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.685 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.698 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.698 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 165 time: 0.0132828
Jan 22 17:55:06 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.12:47080 [22/Jan/2026:17:55:06.684] listener listener/metadata 0/0/0/14/14 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.703 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.704 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:55:06 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.12:47082 [22/Jan/2026:17:55:06.702] listener listener/metadata 0/0/0/19/19 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.722 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0180001
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.744 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.745 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.760 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.760 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0155408
Jan 22 17:55:06 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.12:47096 [22/Jan/2026:17:55:06.743] listener listener/metadata 0/0/0/16/16 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.765 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.766 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.781 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:55:06 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.12:47106 [22/Jan/2026:17:55:06.765] listener listener/metadata 0/0/0/16/16 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.782 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0158067
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.788 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.788 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.802 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:55:06 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.12:47114 [22/Jan/2026:17:55:06.787] listener listener/metadata 0/0/0/14/14 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.802 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0138292
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.807 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.807 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.823 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:55:06 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.12:47120 [22/Jan/2026:17:55:06.806] listener listener/metadata 0/0/0/16/16 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.823 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 165 time: 0.0161011
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.827 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.827 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.12
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.839 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:55:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:06.840 104990 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0122626
Jan 22 17:55:06 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242672]: 10.100.0.12:47126 [22/Jan/2026:17:55:06.826] listener listener/metadata 0/0/0/13/13 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:55:07 compute-0 nova_compute[183075]: 2026-01-22 17:55:07.607 183079 INFO nova.compute.manager [None req-112ec85e-2602-480c-88fa-0a3d89d92b83 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Get console output
Jan 22 17:55:07 compute-0 nova_compute[183075]: 2026-01-22 17:55:07.612 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:55:09 compute-0 nova_compute[183075]: 2026-01-22 17:55:09.259 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:55:10 compute-0 nova_compute[183075]: 2026-01-22 17:55:10.131 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:55:10 compute-0 nova_compute[183075]: 2026-01-22 17:55:10.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:55:12 compute-0 nova_compute[183075]: 2026-01-22 17:55:12.763 183079 INFO nova.compute.manager [None req-0f95f868-bdfb-4e10-9e74-444d0488ef44 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Get console output
Jan 22 17:55:12 compute-0 nova_compute[183075]: 2026-01-22 17:55:12.767 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:55:14 compute-0 nova_compute[183075]: 2026-01-22 17:55:14.260 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:55:15 compute-0 nova_compute[183075]: 2026-01-22 17:55:15.151 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:55:15 compute-0 ovn_controller[95372]: 2026-01-22T17:55:15Z|00836|memory_trim|INFO|Detected inactivity (last active 30028 ms ago): trimming memory
Jan 22 17:55:15 compute-0 podman[243641]: 2026-01-22 17:55:15.35573306 +0000 UTC m=+0.057144533 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.openshift.expose-services=)
Jan 22 17:55:15 compute-0 podman[243640]: 2026-01-22 17:55:15.371797411 +0000 UTC m=+0.076751319 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 17:55:15 compute-0 podman[243639]: 2026-01-22 17:55:15.378390288 +0000 UTC m=+0.087515428 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 22 17:55:17 compute-0 nova_compute[183075]: 2026-01-22 17:55:17.875 183079 INFO nova.compute.manager [None req-66dc59aa-6745-41c8-8139-d53ff10614b7 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Get console output
Jan 22 17:55:17 compute-0 nova_compute[183075]: 2026-01-22 17:55:17.879 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:55:19 compute-0 nova_compute[183075]: 2026-01-22 17:55:19.263 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:55:20 compute-0 nova_compute[183075]: 2026-01-22 17:55:20.153 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:55:20 compute-0 podman[243705]: 2026-01-22 17:55:20.344403782 +0000 UTC m=+0.049503698 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute)
Jan 22 17:55:21 compute-0 nova_compute[183075]: 2026-01-22 17:55:21.800 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:55:21 compute-0 nova_compute[183075]: 2026-01-22 17:55:21.801 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 17:55:21 compute-0 nova_compute[183075]: 2026-01-22 17:55:21.826 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 17:55:23 compute-0 nova_compute[183075]: 2026-01-22 17:55:23.010 183079 INFO nova.compute.manager [None req-685251d1-b64b-4b46-8c0a-df28c17d931f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Get console output
Jan 22 17:55:23 compute-0 nova_compute[183075]: 2026-01-22 17:55:23.015 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:55:24 compute-0 nova_compute[183075]: 2026-01-22 17:55:24.264 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:55:25 compute-0 nova_compute[183075]: 2026-01-22 17:55:25.155 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:55:28 compute-0 nova_compute[183075]: 2026-01-22 17:55:28.140 183079 INFO nova.compute.manager [None req-39480553-2d3c-4d49-aedd-7c9e58088eea 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Get console output
Jan 22 17:55:28 compute-0 nova_compute[183075]: 2026-01-22 17:55:28.144 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:55:29 compute-0 nova_compute[183075]: 2026-01-22 17:55:29.266 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:55:30 compute-0 nova_compute[183075]: 2026-01-22 17:55:30.158 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:55:31 compute-0 podman[243725]: 2026-01-22 17:55:31.349569593 +0000 UTC m=+0.049873598 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:55:33 compute-0 nova_compute[183075]: 2026-01-22 17:55:33.259 183079 INFO nova.compute.manager [None req-4f720ee4-ee9b-461d-8295-4cad1ab451d8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Get console output
Jan 22 17:55:33 compute-0 nova_compute[183075]: 2026-01-22 17:55:33.264 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:55:34 compute-0 nova_compute[183075]: 2026-01-22 17:55:34.270 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:55:35 compute-0 nova_compute[183075]: 2026-01-22 17:55:35.160 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:55:35 compute-0 podman[243749]: 2026-01-22 17:55:35.333441718 +0000 UTC m=+0.042268565 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:55:38 compute-0 nova_compute[183075]: 2026-01-22 17:55:38.385 183079 INFO nova.compute.manager [None req-e0167a7b-04cb-4b66-a833-00930f7732e3 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Get console output
Jan 22 17:55:38 compute-0 nova_compute[183075]: 2026-01-22 17:55:38.390 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:55:39 compute-0 nova_compute[183075]: 2026-01-22 17:55:39.271 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:55:40 compute-0 nova_compute[183075]: 2026-01-22 17:55:40.161 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:55:40 compute-0 nova_compute[183075]: 2026-01-22 17:55:40.814 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:55:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:41.977 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:55:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:41.977 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:55:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:41.978 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:55:42 compute-0 nova_compute[183075]: 2026-01-22 17:55:42.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:55:43 compute-0 nova_compute[183075]: 2026-01-22 17:55:43.519 183079 INFO nova.compute.manager [None req-ce1bb3d3-ae77-49c9-b320-ca00200a6acf 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Get console output
Jan 22 17:55:43 compute-0 nova_compute[183075]: 2026-01-22 17:55:43.525 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:55:43 compute-0 nova_compute[183075]: 2026-01-22 17:55:43.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:55:43 compute-0 nova_compute[183075]: 2026-01-22 17:55:43.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:55:44 compute-0 nova_compute[183075]: 2026-01-22 17:55:44.273 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:55:45 compute-0 nova_compute[183075]: 2026-01-22 17:55:45.163 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:55:46 compute-0 podman[243775]: 2026-01-22 17:55:46.345479352 +0000 UTC m=+0.048301006 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 22 17:55:46 compute-0 podman[243776]: 2026-01-22 17:55:46.362124019 +0000 UTC m=+0.061005057 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, vcs-type=git, config_id=openstack_network_exporter, version=9.6, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9)
Jan 22 17:55:46 compute-0 podman[243774]: 2026-01-22 17:55:46.397266711 +0000 UTC m=+0.102889470 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 17:55:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:46.818 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=58, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=57) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:55:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:46.819 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:55:46 compute-0 nova_compute[183075]: 2026-01-22 17:55:46.819 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:55:47 compute-0 nova_compute[183075]: 2026-01-22 17:55:47.873 183079 DEBUG nova.compute.manager [req-2056e4e4-4fe9-4030-9003-7fba204aca7b req-306d4343-5ab0-4660-9d93-0ba1ceb5755c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Received event network-changed-b10e0a25-0ce5-49eb-ac39-9b770e42a3fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:55:47 compute-0 nova_compute[183075]: 2026-01-22 17:55:47.873 183079 DEBUG nova.compute.manager [req-2056e4e4-4fe9-4030-9003-7fba204aca7b req-306d4343-5ab0-4660-9d93-0ba1ceb5755c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Refreshing instance network info cache due to event network-changed-b10e0a25-0ce5-49eb-ac39-9b770e42a3fc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:55:47 compute-0 nova_compute[183075]: 2026-01-22 17:55:47.874 183079 DEBUG oslo_concurrency.lockutils [req-2056e4e4-4fe9-4030-9003-7fba204aca7b req-306d4343-5ab0-4660-9d93-0ba1ceb5755c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-f7112f58-59da-486c-9079-e0bc3fa6a3e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:55:47 compute-0 nova_compute[183075]: 2026-01-22 17:55:47.874 183079 DEBUG oslo_concurrency.lockutils [req-2056e4e4-4fe9-4030-9003-7fba204aca7b req-306d4343-5ab0-4660-9d93-0ba1ceb5755c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-f7112f58-59da-486c-9079-e0bc3fa6a3e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:55:47 compute-0 nova_compute[183075]: 2026-01-22 17:55:47.874 183079 DEBUG nova.network.neutron [req-2056e4e4-4fe9-4030-9003-7fba204aca7b req-306d4343-5ab0-4660-9d93-0ba1ceb5755c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Refreshing network info cache for port b10e0a25-0ce5-49eb-ac39-9b770e42a3fc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:55:48 compute-0 nova_compute[183075]: 2026-01-22 17:55:48.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:55:49 compute-0 nova_compute[183075]: 2026-01-22 17:55:49.275 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:55:49 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:55:49.821 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '58'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:55:50 compute-0 nova_compute[183075]: 2026-01-22 17:55:50.165 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:55:51 compute-0 podman[243836]: 2026-01-22 17:55:51.337537546 +0000 UTC m=+0.052885659 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 17:55:51 compute-0 nova_compute[183075]: 2026-01-22 17:55:51.743 183079 DEBUG nova.network.neutron [req-2056e4e4-4fe9-4030-9003-7fba204aca7b req-306d4343-5ab0-4660-9d93-0ba1ceb5755c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Updated VIF entry in instance network info cache for port b10e0a25-0ce5-49eb-ac39-9b770e42a3fc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:55:51 compute-0 nova_compute[183075]: 2026-01-22 17:55:51.743 183079 DEBUG nova.network.neutron [req-2056e4e4-4fe9-4030-9003-7fba204aca7b req-306d4343-5ab0-4660-9d93-0ba1ceb5755c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Updating instance_info_cache with network_info: [{"id": "b10e0a25-0ce5-49eb-ac39-9b770e42a3fc", "address": "fa:16:3e:db:07:5c", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb10e0a25-0c", "ovs_interfaceid": "b10e0a25-0ce5-49eb-ac39-9b770e42a3fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:55:51 compute-0 nova_compute[183075]: 2026-01-22 17:55:51.761 183079 DEBUG oslo_concurrency.lockutils [req-2056e4e4-4fe9-4030-9003-7fba204aca7b req-306d4343-5ab0-4660-9d93-0ba1ceb5755c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-f7112f58-59da-486c-9079-e0bc3fa6a3e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:55:52 compute-0 nova_compute[183075]: 2026-01-22 17:55:52.034 183079 DEBUG nova.compute.manager [req-124c5ae4-50ac-4d03-b154-c6262b7ff60c req-8885d495-891c-45ce-abff-8cd0496199b1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Received event network-changed-6f89fabe-02ec-4340-9703-82e93c101ebc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:55:52 compute-0 nova_compute[183075]: 2026-01-22 17:55:52.034 183079 DEBUG nova.compute.manager [req-124c5ae4-50ac-4d03-b154-c6262b7ff60c req-8885d495-891c-45ce-abff-8cd0496199b1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Refreshing instance network info cache due to event network-changed-6f89fabe-02ec-4340-9703-82e93c101ebc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:55:52 compute-0 nova_compute[183075]: 2026-01-22 17:55:52.035 183079 DEBUG oslo_concurrency.lockutils [req-124c5ae4-50ac-4d03-b154-c6262b7ff60c req-8885d495-891c-45ce-abff-8cd0496199b1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-eadc13fe-ef8b-46e5-af17-40b408aa5aff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:55:52 compute-0 nova_compute[183075]: 2026-01-22 17:55:52.035 183079 DEBUG oslo_concurrency.lockutils [req-124c5ae4-50ac-4d03-b154-c6262b7ff60c req-8885d495-891c-45ce-abff-8cd0496199b1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-eadc13fe-ef8b-46e5-af17-40b408aa5aff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:55:52 compute-0 nova_compute[183075]: 2026-01-22 17:55:52.035 183079 DEBUG nova.network.neutron [req-124c5ae4-50ac-4d03-b154-c6262b7ff60c req-8885d495-891c-45ce-abff-8cd0496199b1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Refreshing network info cache for port 6f89fabe-02ec-4340-9703-82e93c101ebc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:55:54 compute-0 nova_compute[183075]: 2026-01-22 17:55:54.276 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:55:54 compute-0 nova_compute[183075]: 2026-01-22 17:55:54.784 183079 DEBUG nova.network.neutron [req-124c5ae4-50ac-4d03-b154-c6262b7ff60c req-8885d495-891c-45ce-abff-8cd0496199b1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Updated VIF entry in instance network info cache for port 6f89fabe-02ec-4340-9703-82e93c101ebc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:55:54 compute-0 nova_compute[183075]: 2026-01-22 17:55:54.785 183079 DEBUG nova.network.neutron [req-124c5ae4-50ac-4d03-b154-c6262b7ff60c req-8885d495-891c-45ce-abff-8cd0496199b1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Updating instance_info_cache with network_info: [{"id": "6f89fabe-02ec-4340-9703-82e93c101ebc", "address": "fa:16:3e:45:45:58", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f89fabe-02", "ovs_interfaceid": "6f89fabe-02ec-4340-9703-82e93c101ebc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:55:54 compute-0 nova_compute[183075]: 2026-01-22 17:55:54.786 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:55:54 compute-0 nova_compute[183075]: 2026-01-22 17:55:54.787 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:55:54 compute-0 nova_compute[183075]: 2026-01-22 17:55:54.787 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:55:54 compute-0 nova_compute[183075]: 2026-01-22 17:55:54.814 183079 DEBUG oslo_concurrency.lockutils [req-124c5ae4-50ac-4d03-b154-c6262b7ff60c req-8885d495-891c-45ce-abff-8cd0496199b1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-eadc13fe-ef8b-46e5-af17-40b408aa5aff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:55:54 compute-0 nova_compute[183075]: 2026-01-22 17:55:54.991 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-eadc13fe-ef8b-46e5-af17-40b408aa5aff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:55:54 compute-0 nova_compute[183075]: 2026-01-22 17:55:54.991 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-eadc13fe-ef8b-46e5-af17-40b408aa5aff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:55:54 compute-0 nova_compute[183075]: 2026-01-22 17:55:54.991 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:55:54 compute-0 nova_compute[183075]: 2026-01-22 17:55:54.991 183079 DEBUG nova.objects.instance [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lazy-loading 'info_cache' on Instance uuid eadc13fe-ef8b-46e5-af17-40b408aa5aff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:55:55 compute-0 nova_compute[183075]: 2026-01-22 17:55:55.168 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.465 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6', 'name': 'tempest-server-test-832225797', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000004c', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'hostId': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.468 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'name': 'tempest-server-test-1316386013', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000004b', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'hostId': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.469 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'name': 'tempest-server-test-1171118669', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000004a', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'hostId': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.470 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.470 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.470 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-832225797>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-832225797>]
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.470 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.474 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f7112f58-59da-486c-9079-e0bc3fa6a3e6 / tapb10e0a25-0c inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.475 12 DEBUG ceilometer.compute.pollsters [-] f7112f58-59da-486c-9079-e0bc3fa6a3e6/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.478 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/network.incoming.bytes.delta volume: 168 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.480 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/network.incoming.bytes.delta volume: 168 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1db4962f-6d78-47b2-9043-1bfca88e8e02', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004c-f7112f58-59da-486c-9079-e0bc3fa6a3e6-tapb10e0a25-0c', 'timestamp': '2026-01-22T17:55:55.470841', 'resource_metadata': {'display_name': 'tempest-server-test-832225797', 'name': 'tapb10e0a25-0c', 'instance_id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:db:07:5c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb10e0a25-0c'}, 'message_id': '99805798-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.23125465, 'message_signature': '86fc8dae1d0c9bf9350b93ff4e232955540f387c6dfc78acc003a47ae38c192b'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 168, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004b-42bc2eb6-0654-413e-bf4f-7926c9f3efb5-tap9c1da312-2c', 'timestamp': '2026-01-22T17:55:55.470841', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'tap9c1da312-2c', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:a8:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9c1da312-2c'}, 'message_id': '9980c3cc-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.236299225, 'message_signature': 'af8322c2492d8ab68b9d066298412a100d31419df70396105682fb527c80a62f'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 168, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004a-eadc13fe-ef8b-46e5-af17-40b408aa5aff-tap6f89fabe-02', 'timestamp': '2026-01-22T17:55:55.470841', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'tap6f89fabe-02', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:45:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6f89fabe-02'}, 'message_id': '99811c0a-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.238845383, 'message_signature': 'ea2988e37b6a2702eef0128a879abc5bff01d6600e1c47857498f82a3bf6fb25'}]}, 'timestamp': '2026-01-22 17:55:55.480751', '_unique_id': '09b4a771f9db4d43a050741a191cbe96'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.482 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.483 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.483 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.483 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-832225797>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-832225797>]
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.484 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.484 12 DEBUG ceilometer.compute.pollsters [-] f7112f58-59da-486c-9079-e0bc3fa6a3e6/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.484 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.484 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19f1b490-a186-4827-a509-6e2667744379', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004c-f7112f58-59da-486c-9079-e0bc3fa6a3e6-tapb10e0a25-0c', 'timestamp': '2026-01-22T17:55:55.484076', 'resource_metadata': {'display_name': 'tempest-server-test-832225797', 'name': 'tapb10e0a25-0c', 'instance_id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:db:07:5c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb10e0a25-0c'}, 'message_id': '9981aa6c-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.23125465, 'message_signature': 'd48214679d6df8c129f5ea955508340de90379641063937b3f88c1455e9099e7'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004b-42bc2eb6-0654-413e-bf4f-7926c9f3efb5-tap9c1da312-2c', 'timestamp': '2026-01-22T17:55:55.484076', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'tap9c1da312-2c', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:a8:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9c1da312-2c'}, 'message_id': '9981b462-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.236299225, 'message_signature': 'e4f6571efcb5872268b4071f60fe4213222be254ef1e53a39634d9fbe038f85c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004a-eadc13fe-ef8b-46e5-af17-40b408aa5aff-tap6f89fabe-02', 'timestamp': '2026-01-22T17:55:55.484076', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'tap6f89fabe-02', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:45:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6f89fabe-02'}, 'message_id': '9981bd68-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.238845383, 'message_signature': '2537f4e7658fcd62f41220acfdf908c5db2255c12444bfee67ded5058ed17c41'}]}, 'timestamp': '2026-01-22 17:55:55.484833', '_unique_id': '13bb5b87ed9e4cd2b3560570ed51488e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.485 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.486 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.486 12 DEBUG ceilometer.compute.pollsters [-] f7112f58-59da-486c-9079-e0bc3fa6a3e6/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.486 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/network.outgoing.bytes.delta volume: 1026 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.486 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bfca9eee-3289-43a3-81d4-451ddaa29db3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004c-f7112f58-59da-486c-9079-e0bc3fa6a3e6-tapb10e0a25-0c', 'timestamp': '2026-01-22T17:55:55.486190', 'resource_metadata': {'display_name': 'tempest-server-test-832225797', 'name': 'tapb10e0a25-0c', 'instance_id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:db:07:5c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb10e0a25-0c'}, 'message_id': '9981fca6-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.23125465, 'message_signature': 'efce52795e3943943dbb5805d555cbdfcea984ca2b8895a6027588494e347a1b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 1026, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004b-42bc2eb6-0654-413e-bf4f-7926c9f3efb5-tap9c1da312-2c', 'timestamp': '2026-01-22T17:55:55.486190', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'tap9c1da312-2c', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:a8:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9c1da312-2c'}, 'message_id': '9982055c-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.236299225, 'message_signature': 'e9bbe09b7f9ca55783191d4f5a7f5a12465203fca0ff832007a7bb9eed3dc00f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004a-eadc13fe-ef8b-46e5-af17-40b408aa5aff-tap6f89fabe-02', 'timestamp': '2026-01-22T17:55:55.486190', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'tap6f89fabe-02', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:45:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6f89fabe-02'}, 'message_id': '99820eee-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.238845383, 'message_signature': '2d80e1bfa7b7094bb29f4dc87d581c091088c97013c6c7ecb605a6a75c0d4b0b'}]}, 'timestamp': '2026-01-22 17:55:55.486898', '_unique_id': 'c8f091ef0a5f406f95be0b8c73c8adc0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.487 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.488 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.488 12 DEBUG ceilometer.compute.pollsters [-] f7112f58-59da-486c-9079-e0bc3fa6a3e6/network.outgoing.bytes volume: 11638 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.488 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/network.outgoing.bytes volume: 11596 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.488 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/network.outgoing.bytes volume: 11596 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f08c08dd-c7fd-4ea4-b83a-06553c9b3269', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11638, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004c-f7112f58-59da-486c-9079-e0bc3fa6a3e6-tapb10e0a25-0c', 'timestamp': '2026-01-22T17:55:55.488286', 'resource_metadata': {'display_name': 'tempest-server-test-832225797', 'name': 'tapb10e0a25-0c', 'instance_id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:db:07:5c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb10e0a25-0c'}, 'message_id': '99824f76-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.23125465, 'message_signature': '57dae2a472a8f4905e343b4cdb1507fe78528c8141621fcb9d2d5e4f86406748'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11596, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004b-42bc2eb6-0654-413e-bf4f-7926c9f3efb5-tap9c1da312-2c', 'timestamp': '2026-01-22T17:55:55.488286', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'tap9c1da312-2c', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:a8:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9c1da312-2c'}, 'message_id': '99825c5a-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.236299225, 'message_signature': '216811ce9f64f9f5e90c088fc103a093ff775892aa9f422929eeede3fc67b2ac'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11596, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004a-eadc13fe-ef8b-46e5-af17-40b408aa5aff-tap6f89fabe-02', 'timestamp': '2026-01-22T17:55:55.488286', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'tap6f89fabe-02', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:45:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6f89fabe-02'}, 'message_id': '998267f4-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.238845383, 'message_signature': 'becbad692223cd8f4439f5704bc3aeb29e086c101962384d1760d9da9c64f729'}]}, 'timestamp': '2026-01-22 17:55:55.489218', '_unique_id': '9a3abc49c61f4616a8cd8fb36cf220b7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.489 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.490 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.506 12 DEBUG ceilometer.compute.pollsters [-] f7112f58-59da-486c-9079-e0bc3fa6a3e6/disk.device.read.bytes volume: 31283712 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.521 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk.device.read.bytes volume: 30063104 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.536 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk.device.read.bytes volume: 31209984 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6c69d59-f93d-4002-b03f-a4747f253e83', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31283712, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6-vda', 'timestamp': '2026-01-22T17:55:55.490871', 'resource_metadata': {'display_name': 'tempest-server-test-832225797', 'name': 'instance-0000004c', 'instance_id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '99852606-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.251351059, 'message_signature': 'bcbfb7880548b9e5be6c750510591b93b956114a578854e8fb1ea103243bf023'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30063104, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 
'42bc2eb6-0654-413e-bf4f-7926c9f3efb5-vda', 'timestamp': '2026-01-22T17:55:55.490871', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'instance-0000004b', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '99877f82-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.267727418, 'message_signature': 'afd90f3a5c385b7354f4c48fe7d1f0810f508dfc09eb87d34d9ed5b807bffea7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31209984, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff-vda', 'timestamp': '2026-01-22T17:55:55.490871', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'instance-0000004a', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9989cc7e-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.283115191, 'message_signature': 'c6b37ebff8406d98b927bccd1db7c4de3e3aed9df20c1a11feaeeac5821a6cab'}]}, 'timestamp': '2026-01-22 17:55:55.537822', '_unique_id': '07a07828e0614b42af52220d4acdaffb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.539 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.540 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.540 12 DEBUG ceilometer.compute.pollsters [-] f7112f58-59da-486c-9079-e0bc3fa6a3e6/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.540 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.540 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9def6a47-37f4-4c82-823f-cc8eea2111e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004c-f7112f58-59da-486c-9079-e0bc3fa6a3e6-tapb10e0a25-0c', 'timestamp': '2026-01-22T17:55:55.540159', 'resource_metadata': {'display_name': 'tempest-server-test-832225797', 'name': 'tapb10e0a25-0c', 'instance_id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:db:07:5c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb10e0a25-0c'}, 'message_id': '998a395c-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.23125465, 'message_signature': '39ba13f05e17e039efbe96434c8614834d9f2137ab029b4e33453b13d4e9aa38'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004b-42bc2eb6-0654-413e-bf4f-7926c9f3efb5-tap9c1da312-2c', 'timestamp': '2026-01-22T17:55:55.540159', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'tap9c1da312-2c', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:a8:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9c1da312-2c'}, 'message_id': '998a4294-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.236299225, 'message_signature': '14dfe50689b03dd321216b856020e4f967e5ed20d52b5c1758f93217fa06d3b4'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004a-eadc13fe-ef8b-46e5-af17-40b408aa5aff-tap6f89fabe-02', 'timestamp': '2026-01-22T17:55:55.540159', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'tap6f89fabe-02', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:45:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6f89fabe-02'}, 'message_id': '998a4c3a-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.238845383, 'message_signature': '8d3292873683ec28bb8154ae71ab9600d551b48fe8feaddbf1e7e05a45a97fe9'}]}, 'timestamp': '2026-01-22 17:55:55.540898', '_unique_id': '2268e763c61240f6bf76f2672e5edaf5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.541 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.542 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.542 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.542 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-832225797>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-832225797>]
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.542 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.542 12 DEBUG ceilometer.compute.pollsters [-] f7112f58-59da-486c-9079-e0bc3fa6a3e6/network.incoming.bytes volume: 7247 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.542 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/network.incoming.bytes volume: 7375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.543 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/network.incoming.bytes volume: 7615 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '634bc0e4-763d-4245-863b-6eca44a30bd0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7247, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004c-f7112f58-59da-486c-9079-e0bc3fa6a3e6-tapb10e0a25-0c', 'timestamp': '2026-01-22T17:55:55.542585', 'resource_metadata': {'display_name': 'tempest-server-test-832225797', 'name': 'tapb10e0a25-0c', 'instance_id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:db:07:5c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb10e0a25-0c'}, 'message_id': '998a99e2-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.23125465, 'message_signature': 'be83ee71c1b50027c8a30b5eb9fc7e6d5492f964adb94f65d741ce8e6848c117'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7375, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004b-42bc2eb6-0654-413e-bf4f-7926c9f3efb5-tap9c1da312-2c', 'timestamp': '2026-01-22T17:55:55.542585', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'tap9c1da312-2c', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:a8:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9c1da312-2c'}, 'message_id': '998aa4aa-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.236299225, 'message_signature': '82823afbeb3aa74fc1927a360d82adcdcb4761c5126668ba7e232977e4f70705'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7615, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004a-eadc13fe-ef8b-46e5-af17-40b408aa5aff-tap6f89fabe-02', 'timestamp': '2026-01-22T17:55:55.542585', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'tap6f89fabe-02', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:45:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6f89fabe-02'}, 'message_id': '998aad10-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.238845383, 'message_signature': 'ba7ad8046b5af52b70a6a56eb828347c53e48a2c8860d292b070ff79358968d4'}]}, 'timestamp': '2026-01-22 17:55:55.543408', '_unique_id': '14e8e12b13824c6db52ffbd57d4b2e91'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.544 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.545 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.545 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-832225797>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-832225797>]
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.545 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.545 12 DEBUG ceilometer.compute.pollsters [-] f7112f58-59da-486c-9079-e0bc3fa6a3e6/disk.device.write.bytes volume: 73101312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.545 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk.device.write.bytes volume: 73220096 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.545 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk.device.write.bytes volume: 73220096 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f2425904-703b-4fcf-b1ec-caa64873daa7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73101312, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6-vda', 'timestamp': '2026-01-22T17:55:55.545304', 'resource_metadata': {'display_name': 'tempest-server-test-832225797', 'name': 'instance-0000004c', 'instance_id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '998b02b0-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.251351059, 'message_signature': 'a064c415e88b9156f531a432d7254a7f5184d3f21a4276b8a72767a7a811df51'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73220096, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 
'42bc2eb6-0654-413e-bf4f-7926c9f3efb5-vda', 'timestamp': '2026-01-22T17:55:55.545304', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'instance-0000004b', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '998b0ed6-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.267727418, 'message_signature': 'f2bbef25030ecd34d5e2e387176cf7cf5665ef39b90b4a9e496b9e2870cde9ce'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73220096, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff-vda', 'timestamp': '2026-01-22T17:55:55.545304', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'instance-0000004a', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '998b196c-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.283115191, 'message_signature': 'e10bbe17bb78e4fe5764cb32ff5702f68e481567ef709986a617498d70bfdc59'}]}, 'timestamp': '2026-01-22 17:55:55.546182', '_unique_id': 'c4ba369c945343319f7dc1cf815b8bbb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.547 12 DEBUG ceilometer.compute.pollsters [-] f7112f58-59da-486c-9079-e0bc3fa6a3e6/disk.device.read.latency volume: 161026391 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.548 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk.device.read.latency volume: 154131603 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.548 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk.device.read.latency volume: 162196742 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c330a4ee-d389-4118-82d7-aa3c932cbb31', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 161026391, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6-vda', 'timestamp': '2026-01-22T17:55:55.547939', 'resource_metadata': {'display_name': 'tempest-server-test-832225797', 'name': 'instance-0000004c', 'instance_id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '998b68c2-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.251351059, 'message_signature': '7ceef3383f4e222d94361760fdbdc7d92be49780b47f54a73910ce9bdcb76d4a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 154131603, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 
'42bc2eb6-0654-413e-bf4f-7926c9f3efb5-vda', 'timestamp': '2026-01-22T17:55:55.547939', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'instance-0000004b', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '998b72ae-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.267727418, 'message_signature': 'c5480519a5111a7d26c77fa210b3c97becd255df9a0faf3d3e83381dbd811653'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 162196742, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff-vda', 'timestamp': '2026-01-22T17:55:55.547939', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'instance-0000004a', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '998b7cae-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.283115191, 'message_signature': 'd428f23d534595d88100dc78e524f6b837155cdff7ac4317473d0adae681be5c'}]}, 'timestamp': '2026-01-22 17:55:55.548704', '_unique_id': 'ad7bb39cce554b50a1fdf30db65a6635'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.549 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.550 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.550 12 DEBUG ceilometer.compute.pollsters [-] f7112f58-59da-486c-9079-e0bc3fa6a3e6/disk.device.write.requests volume: 330 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.551 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk.device.write.requests volume: 345 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.551 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk.device.write.requests volume: 346 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b4d052f4-de0c-43b7-8afa-ecc8e16202d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 330, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6-vda', 'timestamp': '2026-01-22T17:55:55.550785', 'resource_metadata': {'display_name': 'tempest-server-test-832225797', 'name': 'instance-0000004c', 'instance_id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '998bda50-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.251351059, 'message_signature': 'a223b60c747703965515bb6b19c075f2735664c457028c8cb66eeb60003b3e14'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 345, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 
'resource_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5-vda', 'timestamp': '2026-01-22T17:55:55.550785', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'instance-0000004b', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '998be6bc-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.267727418, 'message_signature': '3905d417a1b3520dd4cb105dabfb1c2774c72d23cd454bf1e2704d2944c8b833'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 346, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff-vda', 'timestamp': '2026-01-22T17:55:55.550785', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'instance-0000004a', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '998bf2c4-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.283115191, 'message_signature': '57ed5cba071d70a5503e644d9cf9e19a6a1322593ce45012de7374cc6d78f61a'}]}, 'timestamp': '2026-01-22 17:55:55.551788', '_unique_id': 'e5b29957b6d244d28597caf8f89f5da6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.553 12 DEBUG ceilometer.compute.pollsters [-] f7112f58-59da-486c-9079-e0bc3fa6a3e6/disk.device.read.requests volume: 1156 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.554 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk.device.read.requests volume: 1116 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.554 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk.device.read.requests volume: 1153 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d919dbf-243c-49cb-963b-c998634c068b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1156, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6-vda', 'timestamp': '2026-01-22T17:55:55.553904', 'resource_metadata': {'display_name': 'tempest-server-test-832225797', 'name': 'instance-0000004c', 'instance_id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '998c51b0-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.251351059, 'message_signature': '62571a6153e9148edc53b4c49cb5f292f9b94b0079b41c4c4017d7c4415311be'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1116, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 
'resource_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5-vda', 'timestamp': '2026-01-22T17:55:55.553904', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'instance-0000004b', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '998c5a3e-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.267727418, 'message_signature': '1e33a9245d606c7422ca7dc9d740e3a260615925844ac26fd62631aab2941d2b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1153, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff-vda', 'timestamp': '2026-01-22T17:55:55.553904', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'instance-0000004a', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '998c629a-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.283115191, 'message_signature': '47203870afedc19cfa800a0cff905e3b046893b047a0ccea1650d35605fd0081'}]}, 'timestamp': '2026-01-22 17:55:55.554567', '_unique_id': 'ecdd19b398f94bfcbc7b5b06d9d53b1e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.555 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.556 12 DEBUG ceilometer.compute.pollsters [-] f7112f58-59da-486c-9079-e0bc3fa6a3e6/network.incoming.packets volume: 62 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.556 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/network.incoming.packets volume: 65 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.556 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/network.incoming.packets volume: 69 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6187b8b2-f232-49af-bda3-9a9dea4556da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 62, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004c-f7112f58-59da-486c-9079-e0bc3fa6a3e6-tapb10e0a25-0c', 'timestamp': '2026-01-22T17:55:55.556013', 'resource_metadata': {'display_name': 'tempest-server-test-832225797', 'name': 'tapb10e0a25-0c', 'instance_id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:db:07:5c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb10e0a25-0c'}, 'message_id': '998ca3f4-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.23125465, 'message_signature': '4f61974afc48532ff8cee4698792c0487a0a3c17681a138536f4bb9cdc7aafd3'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 65, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004b-42bc2eb6-0654-413e-bf4f-7926c9f3efb5-tap9c1da312-2c', 'timestamp': '2026-01-22T17:55:55.556013', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'tap9c1da312-2c', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:a8:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9c1da312-2c'}, 'message_id': '998cac78-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.236299225, 'message_signature': 'fdd8f14a90d71806d0148bbff1ba4f7b65e247c2355fa6fa30514675ec69f650'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 69, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004a-eadc13fe-ef8b-46e5-af17-40b408aa5aff-tap6f89fabe-02', 'timestamp': '2026-01-22T17:55:55.556013', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'tap6f89fabe-02', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:45:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6f89fabe-02'}, 'message_id': '998cb4a2-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.238845383, 'message_signature': '036726946f9a4c6f2af08e2f762ca715879cf6facf105466a441d893702274e6'}]}, 'timestamp': '2026-01-22 17:55:55.556692', '_unique_id': 'b46b3fcd4d744eee9542b3139df9f9fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.557 12 DEBUG ceilometer.compute.pollsters [-] f7112f58-59da-486c-9079-e0bc3fa6a3e6/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.558 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.558 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b9fa037-e414-474c-b468-6e349443c882', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004c-f7112f58-59da-486c-9079-e0bc3fa6a3e6-tapb10e0a25-0c', 'timestamp': '2026-01-22T17:55:55.557962', 'resource_metadata': {'display_name': 'tempest-server-test-832225797', 'name': 'tapb10e0a25-0c', 'instance_id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:db:07:5c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb10e0a25-0c'}, 'message_id': '998cf174-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.23125465, 'message_signature': '26cbb05190298bf02590ca601fb55936daeffcfc6aa0040491b110b37c969bf7'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004b-42bc2eb6-0654-413e-bf4f-7926c9f3efb5-tap9c1da312-2c', 'timestamp': '2026-01-22T17:55:55.557962', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'tap9c1da312-2c', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:a8:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9c1da312-2c'}, 'message_id': '998cfb9c-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.236299225, 'message_signature': '2bbb568ed23eacb66bc0428e4a0c46c94a846c739251089f6d53158e035e20b4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004a-eadc13fe-ef8b-46e5-af17-40b408aa5aff-tap6f89fabe-02', 'timestamp': '2026-01-22T17:55:55.557962', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'tap6f89fabe-02', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:45:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6f89fabe-02'}, 'message_id': '998d03f8-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.238845383, 'message_signature': 'de9b81bc96955fe90aefa9b8e535dabdfbea71c0b4ecb99979bca6db6372dda0'}]}, 'timestamp': '2026-01-22 17:55:55.558725', '_unique_id': '28585b0f6e49405582e58ce1609a6379'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.559 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.560 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.560 12 DEBUG ceilometer.compute.pollsters [-] f7112f58-59da-486c-9079-e0bc3fa6a3e6/network.outgoing.packets volume: 132 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.560 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/network.outgoing.packets volume: 131 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.560 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/network.outgoing.packets volume: 131 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f419759c-1387-427d-82c6-e4374d5bfe58', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 132, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004c-f7112f58-59da-486c-9079-e0bc3fa6a3e6-tapb10e0a25-0c', 'timestamp': '2026-01-22T17:55:55.560120', 'resource_metadata': {'display_name': 'tempest-server-test-832225797', 'name': 'tapb10e0a25-0c', 'instance_id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:db:07:5c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb10e0a25-0c'}, 'message_id': '998d4412-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.23125465, 'message_signature': '170fb76d405916c82c3867e60ff7f944d5c15895fb4a0a3dab999c8158e25956'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 131, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004b-42bc2eb6-0654-413e-bf4f-7926c9f3efb5-tap9c1da312-2c', 'timestamp': '2026-01-22T17:55:55.560120', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'tap9c1da312-2c', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:a8:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9c1da312-2c'}, 'message_id': '998d4d9a-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.236299225, 'message_signature': 'c0778f91b80d302df3da4ad0ba3f9ed5a0bb8203ae77674d498e302216f484a9'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 131, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004a-eadc13fe-ef8b-46e5-af17-40b408aa5aff-tap6f89fabe-02', 'timestamp': '2026-01-22T17:55:55.560120', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'tap6f89fabe-02', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:45:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6f89fabe-02'}, 'message_id': '998d565a-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.238845383, 'message_signature': '7ed863f695ea5c221a913a7ad270e9c98f7a924f0cfe0b44408ca119deeb2f52'}]}, 'timestamp': '2026-01-22 17:55:55.560813', '_unique_id': 'ef3d63bc614f4573be58235fd7bc570e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.561 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.562 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.562 12 DEBUG ceilometer.compute.pollsters [-] f7112f58-59da-486c-9079-e0bc3fa6a3e6/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.562 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.562 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b08b1659-723f-4a19-8fe2-aa1852990f46', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004c-f7112f58-59da-486c-9079-e0bc3fa6a3e6-tapb10e0a25-0c', 'timestamp': '2026-01-22T17:55:55.562139', 'resource_metadata': {'display_name': 'tempest-server-test-832225797', 'name': 'tapb10e0a25-0c', 'instance_id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:db:07:5c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb10e0a25-0c'}, 'message_id': '998d92e6-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.23125465, 'message_signature': 'a2bf8389a7c948c0fad9e2bdb028476a261339b9d74d8eff0316eb85c33a5590'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004b-42bc2eb6-0654-413e-bf4f-7926c9f3efb5-tap9c1da312-2c', 'timestamp': '2026-01-22T17:55:55.562139', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'tap9c1da312-2c', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:a8:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9c1da312-2c'}, 'message_id': '998d9c1e-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.236299225, 'message_signature': '18ac869d43ada3e55cfac43665f59bb19a78d4fb77887179b4398e87cd5266b8'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004a-eadc13fe-ef8b-46e5-af17-40b408aa5aff-tap6f89fabe-02', 'timestamp': '2026-01-22T17:55:55.562139', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'tap6f89fabe-02', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:45:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6f89fabe-02'}, 'message_id': '998da63c-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.238845383, 'message_signature': '1574bc820d8ae8e699533a9c1dcaf14c9703801f9ef9b922e4bf0c8baf8398f7'}]}, 'timestamp': '2026-01-22 17:55:55.562862', '_unique_id': 'e740042881264d43bd5e4946a9595437'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.563 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.564 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.572 12 DEBUG ceilometer.compute.pollsters [-] f7112f58-59da-486c-9079-e0bc3fa6a3e6/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.578 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.584 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a375e469-6ad7-4521-b094-39fe3c0e446d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6-vda', 'timestamp': '2026-01-22T17:55:55.564105', 'resource_metadata': {'display_name': 'tempest-server-test-832225797', 'name': 'instance-0000004c', 'instance_id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '998f28ae-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.324559212, 'message_signature': 'f3eaf7bd7c9eb630c1ad4c91beb5f15a6b78dcdcd3ecec2bdd41aba8290d725f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 
'42bc2eb6-0654-413e-bf4f-7926c9f3efb5-vda', 'timestamp': '2026-01-22T17:55:55.564105', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'instance-0000004b', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '999025b0-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.333391109, 'message_signature': '9c0893c2f34ecdc41a453d615eb546f1f9da7d52be06e814ec726c6174fcf894'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff-vda', 'timestamp': '2026-01-22T17:55:55.564105', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'instance-0000004a', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '99910f48-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.339711879, 'message_signature': '22a602f103783a9a7afa0e201087301caecd471de463cb9f5715964aade7909c'}]}, 'timestamp': '2026-01-22 17:55:55.585300', '_unique_id': '5d7c8e5ad974491189d863848d777047'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.586 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.587 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.587 12 DEBUG ceilometer.compute.pollsters [-] f7112f58-59da-486c-9079-e0bc3fa6a3e6/disk.device.allocation volume: 30220288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.587 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk.device.allocation volume: 30875648 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.587 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk.device.allocation volume: 30220288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4d32e1e-fc05-4211-a409-1f74e12adba5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30220288, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6-vda', 'timestamp': '2026-01-22T17:55:55.587373', 'resource_metadata': {'display_name': 'tempest-server-test-832225797', 'name': 'instance-0000004c', 'instance_id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '99916e5c-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.324559212, 'message_signature': '213263babe98767693cb893ff39da10dcda403907aaacd51cb6dfe7ae8b67ef3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30875648, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 
'42bc2eb6-0654-413e-bf4f-7926c9f3efb5-vda', 'timestamp': '2026-01-22T17:55:55.587373', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'instance-0000004b', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '99917794-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.333391109, 'message_signature': '225003f19c447a6f1b801e6e3129a0a019392449d4a50101394fc3c9aba0702a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30220288, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff-vda', 'timestamp': '2026-01-22T17:55:55.587373', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'instance-0000004a', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '99917ef6-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.339711879, 'message_signature': '96991b99882534e0c1939cf1090fdcfb032dca27ae4a8241f0b8067544dc71ad'}]}, 'timestamp': '2026-01-22 17:55:55.588058', '_unique_id': 'f54977499d26465cb8c9a8c67b2708e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.588 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.589 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.589 12 DEBUG ceilometer.compute.pollsters [-] f7112f58-59da-486c-9079-e0bc3fa6a3e6/disk.device.write.latency volume: 2122537544 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.589 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk.device.write.latency volume: 3187515059 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.589 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk.device.write.latency volume: 2227812578 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ff62d1a3-6e45-41df-9480-86ac9947d123', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2122537544, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6-vda', 'timestamp': '2026-01-22T17:55:55.589283', 'resource_metadata': {'display_name': 'tempest-server-test-832225797', 'name': 'instance-0000004c', 'instance_id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9991b740-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.251351059, 'message_signature': '9aa9bfae178cdbd50aff38d438ee07c8e87af7992d0d6fc5cad3e00f09acede0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3187515059, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 
'resource_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5-vda', 'timestamp': '2026-01-22T17:55:55.589283', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'instance-0000004b', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9991bf56-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.267727418, 'message_signature': '26a422ac0ad687d8155c234c944010cfb233531fa5c8b429fdcc9f14bd66ed9a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2227812578, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff-vda', 'timestamp': '2026-01-22T17:55:55.589283', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'instance-0000004a', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9991c780-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.283115191, 'message_signature': '2e7657c2e67642a78b0a2d9ecf43e9c9e60010913e0a84ced3da8bcd9b599c4c'}]}, 'timestamp': '2026-01-22 17:55:55.589916', '_unique_id': '2a02c4d7539c4a6191e2d3231bb7fcda'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.590 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.591 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.610 12 DEBUG ceilometer.compute.pollsters [-] f7112f58-59da-486c-9079-e0bc3fa6a3e6/cpu volume: 11300000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.624 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/cpu volume: 11900000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.640 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/cpu volume: 12490000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '59e0b926-b420-48b8-9a11-beea79e31cd9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11300000000, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6', 'timestamp': '2026-01-22T17:55:55.591150', 'resource_metadata': {'display_name': 'tempest-server-test-832225797', 'name': 'instance-0000004c', 'instance_id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '9994fe28-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.370641737, 'message_signature': 'fafa0fa8bcbfde932181d13546a27dbf02df58c3e67f15b0717e5e475acff3f0'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11900000000, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 
'timestamp': '2026-01-22T17:55:55.591150', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'instance-0000004b', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '99973300-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.385122035, 'message_signature': 'a7c0c16bb1e5dbc916c6604d2c85545cd780ad3df61c204e201b47df804f6e14'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12490000000, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'timestamp': '2026-01-22T17:55:55.591150', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'instance-0000004a', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '9999a07c-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.401069563, 'message_signature': '3bcf8f1ded13c9d18c76758b77a4e2359f1050158dd66f2c018ee48b3e42891e'}]}, 'timestamp': '2026-01-22 17:55:55.641474', '_unique_id': '6e2af4d65aeb414ab24dc4b118cb78fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.642 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.643 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.643 12 DEBUG ceilometer.compute.pollsters [-] f7112f58-59da-486c-9079-e0bc3fa6a3e6/memory.usage volume: 42.1953125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.644 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/memory.usage volume: 42.4296875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.644 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/memory.usage volume: 42.1328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '87af521d-90f3-48a5-aafa-73267c3dbd19', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.1953125, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6', 'timestamp': '2026-01-22T17:55:55.643768', 'resource_metadata': {'display_name': 'tempest-server-test-832225797', 'name': 'instance-0000004c', 'instance_id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '999a090e-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.370641737, 'message_signature': 'd761c9eeb4bbcb457d8d5ed3ed5c107412aeb9e1b6616d8f841e3308da16ac0e'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.4296875, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'timestamp': 
'2026-01-22T17:55:55.643768', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'instance-0000004b', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '999a11d8-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.385122035, 'message_signature': 'b9b35f849a57ae66536b6ef4f02d7fbfcb73faf831ee47d640ccbc07b415c80c'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.1328125, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'timestamp': '2026-01-22T17:55:55.643768', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'instance-0000004a', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '999a19ee-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.401069563, 'message_signature': 'cb686d98567adc8d604153c6bfba7142f696925f08c75697994b7c0ae47c0f47'}]}, 'timestamp': '2026-01-22 17:55:55.644475', '_unique_id': 'ba78907afb21433ba0c0ec5306a4b5ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.645 12 DEBUG ceilometer.compute.pollsters [-] f7112f58-59da-486c-9079-e0bc3fa6a3e6/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.646 12 DEBUG ceilometer.compute.pollsters [-] 42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.646 12 DEBUG ceilometer.compute.pollsters [-] eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9c2670bb-48bd-4b08-81c1-0905f4fd4bcb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6-vda', 'timestamp': '2026-01-22T17:55:55.645845', 'resource_metadata': {'display_name': 'tempest-server-test-832225797', 'name': 'instance-0000004c', 'instance_id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '999a5ad0-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.324559212, 'message_signature': '6114f1cc6da15bb903762ea93a39f310141062715bc183c7664e843cd9b8075a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 
'42bc2eb6-0654-413e-bf4f-7926c9f3efb5-vda', 'timestamp': '2026-01-22T17:55:55.645845', 'resource_metadata': {'display_name': 'tempest-server-test-1316386013', 'name': 'instance-0000004b', 'instance_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '999a6476-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.333391109, 'message_signature': '7dbe40c042f93565465635c28d091966ac52a981504fb8d468386e9153485c24'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff-vda', 'timestamp': '2026-01-22T17:55:55.645845', 'resource_metadata': {'display_name': 'tempest-server-test-1171118669', 'name': 'instance-0000004a', 'instance_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 
'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '999a6dcc-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 6919.339711879, 'message_signature': 'bbe19a33244f74fb40eebe0304d86f372f9773826223131895d0244ffb94205d'}]}, 'timestamp': '2026-01-22 17:55:55.646651', '_unique_id': 'a281bef3571f4c77a5af19dd2647a13d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:55:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:55:55.647 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:55:55 compute-0 nova_compute[183075]: 2026-01-22 17:55:55.824 183079 DEBUG nova.compute.manager [req-7e456a85-7fa2-469e-a677-cd23791c1f7f req-2e5fd7c0-83ee-4c84-bd92-a7d1513c4b65 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Received event network-changed-9c1da312-2c1b-451b-9ea1-34ff96520bb9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:55:55 compute-0 nova_compute[183075]: 2026-01-22 17:55:55.825 183079 DEBUG nova.compute.manager [req-7e456a85-7fa2-469e-a677-cd23791c1f7f req-2e5fd7c0-83ee-4c84-bd92-a7d1513c4b65 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Refreshing instance network info cache due to event network-changed-9c1da312-2c1b-451b-9ea1-34ff96520bb9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:55:55 compute-0 nova_compute[183075]: 2026-01-22 17:55:55.825 183079 DEBUG oslo_concurrency.lockutils [req-7e456a85-7fa2-469e-a677-cd23791c1f7f req-2e5fd7c0-83ee-4c84-bd92-a7d1513c4b65 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-42bc2eb6-0654-413e-bf4f-7926c9f3efb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:55:55 compute-0 nova_compute[183075]: 2026-01-22 17:55:55.825 183079 DEBUG oslo_concurrency.lockutils [req-7e456a85-7fa2-469e-a677-cd23791c1f7f req-2e5fd7c0-83ee-4c84-bd92-a7d1513c4b65 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-42bc2eb6-0654-413e-bf4f-7926c9f3efb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:55:55 compute-0 nova_compute[183075]: 2026-01-22 17:55:55.825 183079 DEBUG nova.network.neutron [req-7e456a85-7fa2-469e-a677-cd23791c1f7f req-2e5fd7c0-83ee-4c84-bd92-a7d1513c4b65 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Refreshing network info cache for port 9c1da312-2c1b-451b-9ea1-34ff96520bb9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.000 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Updating instance_info_cache with network_info: [{"id": "6f89fabe-02ec-4340-9703-82e93c101ebc", "address": "fa:16:3e:45:45:58", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f89fabe-02", "ovs_interfaceid": "6f89fabe-02ec-4340-9703-82e93c101ebc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.019 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-eadc13fe-ef8b-46e5-af17-40b408aa5aff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.020 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.020 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.021 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.021 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.021 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.046 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.046 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.047 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.047 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.060 183079 DEBUG nova.network.neutron [req-7e456a85-7fa2-469e-a677-cd23791c1f7f req-2e5fd7c0-83ee-4c84-bd92-a7d1513c4b65 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Updated VIF entry in instance network info cache for port 9c1da312-2c1b-451b-9ea1-34ff96520bb9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.061 183079 DEBUG nova.network.neutron [req-7e456a85-7fa2-469e-a677-cd23791c1f7f req-2e5fd7c0-83ee-4c84-bd92-a7d1513c4b65 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Updating instance_info_cache with network_info: [{"id": "9c1da312-2c1b-451b-9ea1-34ff96520bb9", "address": "fa:16:3e:6a:a8:74", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c1da312-2c", "ovs_interfaceid": "9c1da312-2c1b-451b-9ea1-34ff96520bb9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.076 183079 DEBUG oslo_concurrency.lockutils [req-7e456a85-7fa2-469e-a677-cd23791c1f7f req-2e5fd7c0-83ee-4c84-bd92-a7d1513c4b65 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-42bc2eb6-0654-413e-bf4f-7926c9f3efb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.123 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7112f58-59da-486c-9079-e0bc3fa6a3e6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.190 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7112f58-59da-486c-9079-e0bc3fa6a3e6/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.191 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7112f58-59da-486c-9079-e0bc3fa6a3e6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.243 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7112f58-59da-486c-9079-e0bc3fa6a3e6/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.248 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.303 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.304 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.358 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42bc2eb6-0654-413e-bf4f-7926c9f3efb5/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.363 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.420 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.421 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.481 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eadc13fe-ef8b-46e5-af17-40b408aa5aff/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.637 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.639 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5241MB free_disk=73.2657699584961GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.639 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.640 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.705 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance eadc13fe-ef8b-46e5-af17-40b408aa5aff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.706 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 42bc2eb6-0654-413e-bf4f-7926c9f3efb5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.706 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance f7112f58-59da-486c-9079-e0bc3fa6a3e6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.706 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.706 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.775 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.790 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.791 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.791 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.906 183079 DEBUG nova.compute.manager [req-223d5acd-529f-4514-be1f-42fc22b2a048 req-7bebf4dc-b67f-4a1b-aa7e-f634ebbccd83 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Received event network-changed-b10e0a25-0ce5-49eb-ac39-9b770e42a3fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.906 183079 DEBUG nova.compute.manager [req-223d5acd-529f-4514-be1f-42fc22b2a048 req-7bebf4dc-b67f-4a1b-aa7e-f634ebbccd83 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Refreshing instance network info cache due to event network-changed-b10e0a25-0ce5-49eb-ac39-9b770e42a3fc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.907 183079 DEBUG oslo_concurrency.lockutils [req-223d5acd-529f-4514-be1f-42fc22b2a048 req-7bebf4dc-b67f-4a1b-aa7e-f634ebbccd83 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-f7112f58-59da-486c-9079-e0bc3fa6a3e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.908 183079 DEBUG oslo_concurrency.lockutils [req-223d5acd-529f-4514-be1f-42fc22b2a048 req-7bebf4dc-b67f-4a1b-aa7e-f634ebbccd83 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-f7112f58-59da-486c-9079-e0bc3fa6a3e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:55:57 compute-0 nova_compute[183075]: 2026-01-22 17:55:57.910 183079 DEBUG nova.network.neutron [req-223d5acd-529f-4514-be1f-42fc22b2a048 req-7bebf4dc-b67f-4a1b-aa7e-f634ebbccd83 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Refreshing network info cache for port b10e0a25-0ce5-49eb-ac39-9b770e42a3fc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:55:59 compute-0 nova_compute[183075]: 2026-01-22 17:55:59.278 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:55:59 compute-0 nova_compute[183075]: 2026-01-22 17:55:59.956 183079 DEBUG nova.network.neutron [req-223d5acd-529f-4514-be1f-42fc22b2a048 req-7bebf4dc-b67f-4a1b-aa7e-f634ebbccd83 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Updated VIF entry in instance network info cache for port b10e0a25-0ce5-49eb-ac39-9b770e42a3fc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:55:59 compute-0 nova_compute[183075]: 2026-01-22 17:55:59.956 183079 DEBUG nova.network.neutron [req-223d5acd-529f-4514-be1f-42fc22b2a048 req-7bebf4dc-b67f-4a1b-aa7e-f634ebbccd83 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Updating instance_info_cache with network_info: [{"id": "b10e0a25-0ce5-49eb-ac39-9b770e42a3fc", "address": "fa:16:3e:db:07:5c", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb10e0a25-0c", "ovs_interfaceid": "b10e0a25-0ce5-49eb-ac39-9b770e42a3fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:55:59 compute-0 nova_compute[183075]: 2026-01-22 17:55:59.970 183079 DEBUG oslo_concurrency.lockutils [req-223d5acd-529f-4514-be1f-42fc22b2a048 req-7bebf4dc-b67f-4a1b-aa7e-f634ebbccd83 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-f7112f58-59da-486c-9079-e0bc3fa6a3e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:56:00 compute-0 nova_compute[183075]: 2026-01-22 17:56:00.170 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:02 compute-0 podman[243878]: 2026-01-22 17:56:02.34243881 +0000 UTC m=+0.050507055 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 17:56:04 compute-0 nova_compute[183075]: 2026-01-22 17:56:04.281 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:05 compute-0 nova_compute[183075]: 2026-01-22 17:56:05.172 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:06 compute-0 podman[243907]: 2026-01-22 17:56:06.368698181 +0000 UTC m=+0.077786436 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:56:08 compute-0 nova_compute[183075]: 2026-01-22 17:56:08.408 183079 DEBUG oslo_concurrency.lockutils [None req-4d895036-603e-48af-89ff-d1486358097a 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "f7112f58-59da-486c-9079-e0bc3fa6a3e6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:56:08 compute-0 nova_compute[183075]: 2026-01-22 17:56:08.408 183079 DEBUG oslo_concurrency.lockutils [None req-4d895036-603e-48af-89ff-d1486358097a 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "f7112f58-59da-486c-9079-e0bc3fa6a3e6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:56:08 compute-0 nova_compute[183075]: 2026-01-22 17:56:08.409 183079 DEBUG oslo_concurrency.lockutils [None req-4d895036-603e-48af-89ff-d1486358097a 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "f7112f58-59da-486c-9079-e0bc3fa6a3e6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:56:08 compute-0 nova_compute[183075]: 2026-01-22 17:56:08.409 183079 DEBUG oslo_concurrency.lockutils [None req-4d895036-603e-48af-89ff-d1486358097a 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "f7112f58-59da-486c-9079-e0bc3fa6a3e6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:56:08 compute-0 nova_compute[183075]: 2026-01-22 17:56:08.409 183079 DEBUG oslo_concurrency.lockutils [None req-4d895036-603e-48af-89ff-d1486358097a 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "f7112f58-59da-486c-9079-e0bc3fa6a3e6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:56:08 compute-0 nova_compute[183075]: 2026-01-22 17:56:08.410 183079 INFO nova.compute.manager [None req-4d895036-603e-48af-89ff-d1486358097a 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Terminating instance
Jan 22 17:56:08 compute-0 nova_compute[183075]: 2026-01-22 17:56:08.411 183079 DEBUG nova.compute.manager [None req-4d895036-603e-48af-89ff-d1486358097a 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:56:08 compute-0 kernel: tapb10e0a25-0c (unregistering): left promiscuous mode
Jan 22 17:56:08 compute-0 NetworkManager[55454]: <info>  [1769104568.4568] device (tapb10e0a25-0c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:56:08 compute-0 ovn_controller[95372]: 2026-01-22T17:56:08Z|00837|binding|INFO|Releasing lport b10e0a25-0ce5-49eb-ac39-9b770e42a3fc from this chassis (sb_readonly=0)
Jan 22 17:56:08 compute-0 nova_compute[183075]: 2026-01-22 17:56:08.466 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:08 compute-0 ovn_controller[95372]: 2026-01-22T17:56:08Z|00838|binding|INFO|Setting lport b10e0a25-0ce5-49eb-ac39-9b770e42a3fc down in Southbound
Jan 22 17:56:08 compute-0 ovn_controller[95372]: 2026-01-22T17:56:08Z|00839|binding|INFO|Removing iface tapb10e0a25-0c ovn-installed in OVS
Jan 22 17:56:08 compute-0 nova_compute[183075]: 2026-01-22 17:56:08.469 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:08.473 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:07:5c 10.100.0.12'], port_security=['fa:16:3e:db:07:5c 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'f7112f58-59da-486c-9079-e0bc3fa6a3e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '5', 'neutron:security_group_ids': '68147189-d12b-4a79-b81a-a70d0e93d075 e384bd4b-9ab1-4663-8bdf-170c928c64ff', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.226'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=b10e0a25-0ce5-49eb-ac39-9b770e42a3fc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:56:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:08.475 104629 INFO neutron.agent.ovn.metadata.agent [-] Port b10e0a25-0ce5-49eb-ac39-9b770e42a3fc in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 unbound from our chassis
Jan 22 17:56:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:08.476 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:56:08 compute-0 nova_compute[183075]: 2026-01-22 17:56:08.481 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:08.496 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[796a1415-968a-4b9e-8530-5e3d82810796]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:08 compute-0 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d0000004c.scope: Deactivated successfully.
Jan 22 17:56:08 compute-0 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d0000004c.scope: Consumed 15.858s CPU time.
Jan 22 17:56:08 compute-0 systemd-machined[154382]: Machine qemu-76-instance-0000004c terminated.
Jan 22 17:56:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:08.529 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[1ac66112-e8b7-42be-ac79-84103f5023ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:08.532 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[6aafad06-8ce7-41cc-baba-2367eb72a78a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:08.558 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[a6b4052a-2b00-406c-a7c4-f2d6c0608488]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:08.577 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[74be2bd3-047a-4e79-bec9-8963c6d06de0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 301, 'tx_packets': 157, 'rx_bytes': 25738, 'tx_bytes': 17917, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 301, 'tx_packets': 157, 'rx_bytes': 25738, 'tx_bytes': 17917, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 222], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 669817, 'reachable_time': 41616, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243943, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:08.594 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c2576801-2fbd-401b-8f89-0f653ffb4a02]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 669828, 'tstamp': 669828}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243944, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 669843, 'tstamp': 669843}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243944, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:08.596 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:56:08 compute-0 nova_compute[183075]: 2026-01-22 17:56:08.597 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:08 compute-0 nova_compute[183075]: 2026-01-22 17:56:08.601 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:08.602 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88ed9213-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:56:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:08.602 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:56:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:08.603 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88ed9213-70, col_values=(('external_ids', {'iface-id': 'da93a32e-eed6-4124-8f08-78b0bfe2cb69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:56:08 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:08.603 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:56:08 compute-0 nova_compute[183075]: 2026-01-22 17:56:08.669 183079 INFO nova.virt.libvirt.driver [-] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Instance destroyed successfully.
Jan 22 17:56:08 compute-0 nova_compute[183075]: 2026-01-22 17:56:08.670 183079 DEBUG nova.objects.instance [None req-4d895036-603e-48af-89ff-d1486358097a 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'resources' on Instance uuid f7112f58-59da-486c-9079-e0bc3fa6a3e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:56:08 compute-0 nova_compute[183075]: 2026-01-22 17:56:08.682 183079 DEBUG nova.virt.libvirt.vif [None req-4d895036-603e-48af-89ff-d1486358097a 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:54:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-832225797',display_name='tempest-server-test-832225797',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-832225797',id=76,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:54:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-x4ls3ch2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:54:45Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=f7112f58-59da-486c-9079-e0bc3fa6a3e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b10e0a25-0ce5-49eb-ac39-9b770e42a3fc", "address": "fa:16:3e:db:07:5c", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb10e0a25-0c", "ovs_interfaceid": "b10e0a25-0ce5-49eb-ac39-9b770e42a3fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:56:08 compute-0 nova_compute[183075]: 2026-01-22 17:56:08.682 183079 DEBUG nova.network.os_vif_util [None req-4d895036-603e-48af-89ff-d1486358097a 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "b10e0a25-0ce5-49eb-ac39-9b770e42a3fc", "address": "fa:16:3e:db:07:5c", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb10e0a25-0c", "ovs_interfaceid": "b10e0a25-0ce5-49eb-ac39-9b770e42a3fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:56:08 compute-0 nova_compute[183075]: 2026-01-22 17:56:08.683 183079 DEBUG nova.network.os_vif_util [None req-4d895036-603e-48af-89ff-d1486358097a 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:db:07:5c,bridge_name='br-int',has_traffic_filtering=True,id=b10e0a25-0ce5-49eb-ac39-9b770e42a3fc,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb10e0a25-0c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:56:08 compute-0 nova_compute[183075]: 2026-01-22 17:56:08.683 183079 DEBUG os_vif [None req-4d895036-603e-48af-89ff-d1486358097a 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:db:07:5c,bridge_name='br-int',has_traffic_filtering=True,id=b10e0a25-0ce5-49eb-ac39-9b770e42a3fc,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb10e0a25-0c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:56:08 compute-0 nova_compute[183075]: 2026-01-22 17:56:08.685 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:08 compute-0 nova_compute[183075]: 2026-01-22 17:56:08.685 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb10e0a25-0c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:56:08 compute-0 nova_compute[183075]: 2026-01-22 17:56:08.687 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:08 compute-0 nova_compute[183075]: 2026-01-22 17:56:08.690 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:56:08 compute-0 nova_compute[183075]: 2026-01-22 17:56:08.691 183079 INFO os_vif [None req-4d895036-603e-48af-89ff-d1486358097a 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:db:07:5c,bridge_name='br-int',has_traffic_filtering=True,id=b10e0a25-0ce5-49eb-ac39-9b770e42a3fc,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb10e0a25-0c')
Jan 22 17:56:08 compute-0 nova_compute[183075]: 2026-01-22 17:56:08.692 183079 INFO nova.virt.libvirt.driver [None req-4d895036-603e-48af-89ff-d1486358097a 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Deleting instance files /var/lib/nova/instances/f7112f58-59da-486c-9079-e0bc3fa6a3e6_del
Jan 22 17:56:08 compute-0 nova_compute[183075]: 2026-01-22 17:56:08.693 183079 INFO nova.virt.libvirt.driver [None req-4d895036-603e-48af-89ff-d1486358097a 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Deletion of /var/lib/nova/instances/f7112f58-59da-486c-9079-e0bc3fa6a3e6_del complete
Jan 22 17:56:08 compute-0 nova_compute[183075]: 2026-01-22 17:56:08.737 183079 INFO nova.compute.manager [None req-4d895036-603e-48af-89ff-d1486358097a 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Took 0.33 seconds to destroy the instance on the hypervisor.
Jan 22 17:56:08 compute-0 nova_compute[183075]: 2026-01-22 17:56:08.738 183079 DEBUG oslo.service.loopingcall [None req-4d895036-603e-48af-89ff-d1486358097a 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:56:08 compute-0 nova_compute[183075]: 2026-01-22 17:56:08.738 183079 DEBUG nova.compute.manager [-] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:56:08 compute-0 nova_compute[183075]: 2026-01-22 17:56:08.738 183079 DEBUG nova.network.neutron [-] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:56:09 compute-0 nova_compute[183075]: 2026-01-22 17:56:09.283 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:09 compute-0 nova_compute[183075]: 2026-01-22 17:56:09.443 183079 DEBUG nova.compute.manager [req-3011b1b1-150f-4dd7-932c-03d480939e79 req-a5cd3f64-acd8-4298-a86b-33ea986886e3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Received event network-vif-unplugged-b10e0a25-0ce5-49eb-ac39-9b770e42a3fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:56:09 compute-0 nova_compute[183075]: 2026-01-22 17:56:09.444 183079 DEBUG oslo_concurrency.lockutils [req-3011b1b1-150f-4dd7-932c-03d480939e79 req-a5cd3f64-acd8-4298-a86b-33ea986886e3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "f7112f58-59da-486c-9079-e0bc3fa6a3e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:56:09 compute-0 nova_compute[183075]: 2026-01-22 17:56:09.444 183079 DEBUG oslo_concurrency.lockutils [req-3011b1b1-150f-4dd7-932c-03d480939e79 req-a5cd3f64-acd8-4298-a86b-33ea986886e3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "f7112f58-59da-486c-9079-e0bc3fa6a3e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:56:09 compute-0 nova_compute[183075]: 2026-01-22 17:56:09.444 183079 DEBUG oslo_concurrency.lockutils [req-3011b1b1-150f-4dd7-932c-03d480939e79 req-a5cd3f64-acd8-4298-a86b-33ea986886e3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "f7112f58-59da-486c-9079-e0bc3fa6a3e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:56:09 compute-0 nova_compute[183075]: 2026-01-22 17:56:09.445 183079 DEBUG nova.compute.manager [req-3011b1b1-150f-4dd7-932c-03d480939e79 req-a5cd3f64-acd8-4298-a86b-33ea986886e3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] No waiting events found dispatching network-vif-unplugged-b10e0a25-0ce5-49eb-ac39-9b770e42a3fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:56:09 compute-0 nova_compute[183075]: 2026-01-22 17:56:09.445 183079 DEBUG nova.compute.manager [req-3011b1b1-150f-4dd7-932c-03d480939e79 req-a5cd3f64-acd8-4298-a86b-33ea986886e3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Received event network-vif-unplugged-b10e0a25-0ce5-49eb-ac39-9b770e42a3fc for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:56:09 compute-0 nova_compute[183075]: 2026-01-22 17:56:09.699 183079 DEBUG nova.network.neutron [-] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:56:09 compute-0 nova_compute[183075]: 2026-01-22 17:56:09.717 183079 INFO nova.compute.manager [-] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Took 0.98 seconds to deallocate network for instance.
Jan 22 17:56:09 compute-0 nova_compute[183075]: 2026-01-22 17:56:09.758 183079 DEBUG oslo_concurrency.lockutils [None req-4d895036-603e-48af-89ff-d1486358097a 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:56:09 compute-0 nova_compute[183075]: 2026-01-22 17:56:09.759 183079 DEBUG oslo_concurrency.lockutils [None req-4d895036-603e-48af-89ff-d1486358097a 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:56:09 compute-0 nova_compute[183075]: 2026-01-22 17:56:09.847 183079 DEBUG nova.compute.provider_tree [None req-4d895036-603e-48af-89ff-d1486358097a 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:56:09 compute-0 nova_compute[183075]: 2026-01-22 17:56:09.863 183079 DEBUG nova.scheduler.client.report [None req-4d895036-603e-48af-89ff-d1486358097a 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:56:09 compute-0 nova_compute[183075]: 2026-01-22 17:56:09.886 183079 DEBUG oslo_concurrency.lockutils [None req-4d895036-603e-48af-89ff-d1486358097a 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:56:09 compute-0 nova_compute[183075]: 2026-01-22 17:56:09.912 183079 INFO nova.scheduler.client.report [None req-4d895036-603e-48af-89ff-d1486358097a 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Deleted allocations for instance f7112f58-59da-486c-9079-e0bc3fa6a3e6
Jan 22 17:56:09 compute-0 nova_compute[183075]: 2026-01-22 17:56:09.975 183079 DEBUG oslo_concurrency.lockutils [None req-4d895036-603e-48af-89ff-d1486358097a 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "f7112f58-59da-486c-9079-e0bc3fa6a3e6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.566s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:56:10 compute-0 nova_compute[183075]: 2026-01-22 17:56:10.057 183079 DEBUG nova.compute.manager [req-c5521cec-8d0c-4d92-87d3-f461fd2e4a2d req-bc2ec659-52d9-4fae-bd7b-63f848a9ffe4 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Received event network-vif-deleted-b10e0a25-0ce5-49eb-ac39-9b770e42a3fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:56:10 compute-0 nova_compute[183075]: 2026-01-22 17:56:10.489 183079 DEBUG oslo_concurrency.lockutils [None req-7a4e148f-197b-4f08-b124-624508cbafa8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "42bc2eb6-0654-413e-bf4f-7926c9f3efb5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:56:10 compute-0 nova_compute[183075]: 2026-01-22 17:56:10.489 183079 DEBUG oslo_concurrency.lockutils [None req-7a4e148f-197b-4f08-b124-624508cbafa8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "42bc2eb6-0654-413e-bf4f-7926c9f3efb5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:56:10 compute-0 nova_compute[183075]: 2026-01-22 17:56:10.489 183079 DEBUG oslo_concurrency.lockutils [None req-7a4e148f-197b-4f08-b124-624508cbafa8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "42bc2eb6-0654-413e-bf4f-7926c9f3efb5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:56:10 compute-0 nova_compute[183075]: 2026-01-22 17:56:10.490 183079 DEBUG oslo_concurrency.lockutils [None req-7a4e148f-197b-4f08-b124-624508cbafa8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "42bc2eb6-0654-413e-bf4f-7926c9f3efb5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:56:10 compute-0 nova_compute[183075]: 2026-01-22 17:56:10.490 183079 DEBUG oslo_concurrency.lockutils [None req-7a4e148f-197b-4f08-b124-624508cbafa8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "42bc2eb6-0654-413e-bf4f-7926c9f3efb5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:56:10 compute-0 nova_compute[183075]: 2026-01-22 17:56:10.491 183079 INFO nova.compute.manager [None req-7a4e148f-197b-4f08-b124-624508cbafa8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Terminating instance
Jan 22 17:56:10 compute-0 nova_compute[183075]: 2026-01-22 17:56:10.492 183079 DEBUG nova.compute.manager [None req-7a4e148f-197b-4f08-b124-624508cbafa8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:56:10 compute-0 kernel: tap9c1da312-2c (unregistering): left promiscuous mode
Jan 22 17:56:10 compute-0 NetworkManager[55454]: <info>  [1769104570.5165] device (tap9c1da312-2c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:56:10 compute-0 ovn_controller[95372]: 2026-01-22T17:56:10Z|00840|binding|INFO|Releasing lport 9c1da312-2c1b-451b-9ea1-34ff96520bb9 from this chassis (sb_readonly=0)
Jan 22 17:56:10 compute-0 ovn_controller[95372]: 2026-01-22T17:56:10Z|00841|binding|INFO|Setting lport 9c1da312-2c1b-451b-9ea1-34ff96520bb9 down in Southbound
Jan 22 17:56:10 compute-0 nova_compute[183075]: 2026-01-22 17:56:10.523 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:10 compute-0 ovn_controller[95372]: 2026-01-22T17:56:10Z|00842|binding|INFO|Removing iface tap9c1da312-2c ovn-installed in OVS
Jan 22 17:56:10 compute-0 nova_compute[183075]: 2026-01-22 17:56:10.525 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:10.531 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:a8:74 10.100.0.7'], port_security=['fa:16:3e:6a:a8:74 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '42bc2eb6-0654-413e-bf4f-7926c9f3efb5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '5', 'neutron:security_group_ids': '68147189-d12b-4a79-b81a-a70d0e93d075', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.235'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=9c1da312-2c1b-451b-9ea1-34ff96520bb9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:56:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:10.532 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 9c1da312-2c1b-451b-9ea1-34ff96520bb9 in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 unbound from our chassis
Jan 22 17:56:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:10.533 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:56:10 compute-0 nova_compute[183075]: 2026-01-22 17:56:10.536 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:10.548 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[392b264e-b8ad-4c8b-b0a7-92ebe2ff0748]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:10.578 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[ccb0cdf7-1adc-489b-a366-a3d80e4709aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:10 compute-0 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d0000004b.scope: Deactivated successfully.
Jan 22 17:56:10 compute-0 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d0000004b.scope: Consumed 19.160s CPU time.
Jan 22 17:56:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:10.582 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[d15cbed8-c772-45b2-98ee-d4f0754315b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:10 compute-0 systemd-machined[154382]: Machine qemu-75-instance-0000004b terminated.
Jan 22 17:56:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:10.608 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[0ff7f05b-0f56-4152-912c-cfab7c7f3296]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:10.624 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ca01c467-482e-41f1-9eed-c124ca656f0a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 301, 'tx_packets': 159, 'rx_bytes': 25738, 'tx_bytes': 18001, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 301, 'tx_packets': 159, 'rx_bytes': 25738, 'tx_bytes': 18001, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 222], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 669817, 'reachable_time': 41616, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243971, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:10.640 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[89c9866c-543d-4da7-9cb5-c619656d5ce6]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 669828, 'tstamp': 669828}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243972, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 669843, 'tstamp': 669843}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243972, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:10.641 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:56:10 compute-0 nova_compute[183075]: 2026-01-22 17:56:10.643 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:10.647 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88ed9213-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:56:10 compute-0 nova_compute[183075]: 2026-01-22 17:56:10.647 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:10.647 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:56:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:10.647 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88ed9213-70, col_values=(('external_ids', {'iface-id': 'da93a32e-eed6-4124-8f08-78b0bfe2cb69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:56:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:10.648 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:56:10 compute-0 kernel: tap9c1da312-2c: entered promiscuous mode
Jan 22 17:56:10 compute-0 kernel: tap9c1da312-2c (unregistering): left promiscuous mode
Jan 22 17:56:10 compute-0 NetworkManager[55454]: <info>  [1769104570.7094] manager: (tap9c1da312-2c): new Tun device (/org/freedesktop/NetworkManager/Devices/334)
Jan 22 17:56:10 compute-0 nova_compute[183075]: 2026-01-22 17:56:10.714 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:10 compute-0 nova_compute[183075]: 2026-01-22 17:56:10.739 183079 INFO nova.virt.libvirt.driver [-] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Instance destroyed successfully.
Jan 22 17:56:10 compute-0 nova_compute[183075]: 2026-01-22 17:56:10.740 183079 DEBUG nova.objects.instance [None req-7a4e148f-197b-4f08-b124-624508cbafa8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'resources' on Instance uuid 42bc2eb6-0654-413e-bf4f-7926c9f3efb5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:56:10 compute-0 nova_compute[183075]: 2026-01-22 17:56:10.752 183079 DEBUG nova.virt.libvirt.vif [None req-7a4e148f-197b-4f08-b124-624508cbafa8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:53:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1316386013',display_name='tempest-server-test-1316386013',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1316386013',id=75,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:53:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-tht8ezry',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:53:29Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=42bc2eb6-0654-413e-bf4f-7926c9f3efb5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9c1da312-2c1b-451b-9ea1-34ff96520bb9", "address": "fa:16:3e:6a:a8:74", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c1da312-2c", "ovs_interfaceid": "9c1da312-2c1b-451b-9ea1-34ff96520bb9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:56:10 compute-0 nova_compute[183075]: 2026-01-22 17:56:10.752 183079 DEBUG nova.network.os_vif_util [None req-7a4e148f-197b-4f08-b124-624508cbafa8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "9c1da312-2c1b-451b-9ea1-34ff96520bb9", "address": "fa:16:3e:6a:a8:74", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c1da312-2c", "ovs_interfaceid": "9c1da312-2c1b-451b-9ea1-34ff96520bb9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:56:10 compute-0 nova_compute[183075]: 2026-01-22 17:56:10.753 183079 DEBUG nova.network.os_vif_util [None req-7a4e148f-197b-4f08-b124-624508cbafa8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6a:a8:74,bridge_name='br-int',has_traffic_filtering=True,id=9c1da312-2c1b-451b-9ea1-34ff96520bb9,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c1da312-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:56:10 compute-0 nova_compute[183075]: 2026-01-22 17:56:10.753 183079 DEBUG os_vif [None req-7a4e148f-197b-4f08-b124-624508cbafa8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6a:a8:74,bridge_name='br-int',has_traffic_filtering=True,id=9c1da312-2c1b-451b-9ea1-34ff96520bb9,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c1da312-2c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:56:10 compute-0 nova_compute[183075]: 2026-01-22 17:56:10.754 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:10 compute-0 nova_compute[183075]: 2026-01-22 17:56:10.754 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9c1da312-2c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:56:10 compute-0 nova_compute[183075]: 2026-01-22 17:56:10.756 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:10 compute-0 nova_compute[183075]: 2026-01-22 17:56:10.757 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:56:10 compute-0 nova_compute[183075]: 2026-01-22 17:56:10.759 183079 INFO os_vif [None req-7a4e148f-197b-4f08-b124-624508cbafa8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6a:a8:74,bridge_name='br-int',has_traffic_filtering=True,id=9c1da312-2c1b-451b-9ea1-34ff96520bb9,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c1da312-2c')
Jan 22 17:56:10 compute-0 nova_compute[183075]: 2026-01-22 17:56:10.759 183079 INFO nova.virt.libvirt.driver [None req-7a4e148f-197b-4f08-b124-624508cbafa8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Deleting instance files /var/lib/nova/instances/42bc2eb6-0654-413e-bf4f-7926c9f3efb5_del
Jan 22 17:56:10 compute-0 nova_compute[183075]: 2026-01-22 17:56:10.760 183079 INFO nova.virt.libvirt.driver [None req-7a4e148f-197b-4f08-b124-624508cbafa8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Deletion of /var/lib/nova/instances/42bc2eb6-0654-413e-bf4f-7926c9f3efb5_del complete
Jan 22 17:56:10 compute-0 nova_compute[183075]: 2026-01-22 17:56:10.814 183079 INFO nova.compute.manager [None req-7a4e148f-197b-4f08-b124-624508cbafa8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Took 0.32 seconds to destroy the instance on the hypervisor.
Jan 22 17:56:10 compute-0 nova_compute[183075]: 2026-01-22 17:56:10.814 183079 DEBUG oslo.service.loopingcall [None req-7a4e148f-197b-4f08-b124-624508cbafa8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:56:10 compute-0 nova_compute[183075]: 2026-01-22 17:56:10.815 183079 DEBUG nova.compute.manager [-] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:56:10 compute-0 nova_compute[183075]: 2026-01-22 17:56:10.815 183079 DEBUG nova.network.neutron [-] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:56:11 compute-0 nova_compute[183075]: 2026-01-22 17:56:11.520 183079 DEBUG nova.compute.manager [req-8f27bdfc-58c1-47e2-9b00-868ace4f9e74 req-643e8960-6535-4da7-9aa5-703a8ef380b3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Received event network-vif-plugged-b10e0a25-0ce5-49eb-ac39-9b770e42a3fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:56:11 compute-0 nova_compute[183075]: 2026-01-22 17:56:11.520 183079 DEBUG oslo_concurrency.lockutils [req-8f27bdfc-58c1-47e2-9b00-868ace4f9e74 req-643e8960-6535-4da7-9aa5-703a8ef380b3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "f7112f58-59da-486c-9079-e0bc3fa6a3e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:56:11 compute-0 nova_compute[183075]: 2026-01-22 17:56:11.520 183079 DEBUG oslo_concurrency.lockutils [req-8f27bdfc-58c1-47e2-9b00-868ace4f9e74 req-643e8960-6535-4da7-9aa5-703a8ef380b3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "f7112f58-59da-486c-9079-e0bc3fa6a3e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:56:11 compute-0 nova_compute[183075]: 2026-01-22 17:56:11.520 183079 DEBUG oslo_concurrency.lockutils [req-8f27bdfc-58c1-47e2-9b00-868ace4f9e74 req-643e8960-6535-4da7-9aa5-703a8ef380b3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "f7112f58-59da-486c-9079-e0bc3fa6a3e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:56:11 compute-0 nova_compute[183075]: 2026-01-22 17:56:11.520 183079 DEBUG nova.compute.manager [req-8f27bdfc-58c1-47e2-9b00-868ace4f9e74 req-643e8960-6535-4da7-9aa5-703a8ef380b3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] No waiting events found dispatching network-vif-plugged-b10e0a25-0ce5-49eb-ac39-9b770e42a3fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:56:11 compute-0 nova_compute[183075]: 2026-01-22 17:56:11.521 183079 WARNING nova.compute.manager [req-8f27bdfc-58c1-47e2-9b00-868ace4f9e74 req-643e8960-6535-4da7-9aa5-703a8ef380b3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Received unexpected event network-vif-plugged-b10e0a25-0ce5-49eb-ac39-9b770e42a3fc for instance with vm_state deleted and task_state None.
Jan 22 17:56:11 compute-0 nova_compute[183075]: 2026-01-22 17:56:11.521 183079 DEBUG nova.compute.manager [req-8f27bdfc-58c1-47e2-9b00-868ace4f9e74 req-643e8960-6535-4da7-9aa5-703a8ef380b3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Received event network-vif-unplugged-9c1da312-2c1b-451b-9ea1-34ff96520bb9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:56:11 compute-0 nova_compute[183075]: 2026-01-22 17:56:11.521 183079 DEBUG oslo_concurrency.lockutils [req-8f27bdfc-58c1-47e2-9b00-868ace4f9e74 req-643e8960-6535-4da7-9aa5-703a8ef380b3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "42bc2eb6-0654-413e-bf4f-7926c9f3efb5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:56:11 compute-0 nova_compute[183075]: 2026-01-22 17:56:11.521 183079 DEBUG oslo_concurrency.lockutils [req-8f27bdfc-58c1-47e2-9b00-868ace4f9e74 req-643e8960-6535-4da7-9aa5-703a8ef380b3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "42bc2eb6-0654-413e-bf4f-7926c9f3efb5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:56:11 compute-0 nova_compute[183075]: 2026-01-22 17:56:11.522 183079 DEBUG oslo_concurrency.lockutils [req-8f27bdfc-58c1-47e2-9b00-868ace4f9e74 req-643e8960-6535-4da7-9aa5-703a8ef380b3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "42bc2eb6-0654-413e-bf4f-7926c9f3efb5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:56:11 compute-0 nova_compute[183075]: 2026-01-22 17:56:11.522 183079 DEBUG nova.compute.manager [req-8f27bdfc-58c1-47e2-9b00-868ace4f9e74 req-643e8960-6535-4da7-9aa5-703a8ef380b3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] No waiting events found dispatching network-vif-unplugged-9c1da312-2c1b-451b-9ea1-34ff96520bb9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:56:11 compute-0 nova_compute[183075]: 2026-01-22 17:56:11.522 183079 DEBUG nova.compute.manager [req-8f27bdfc-58c1-47e2-9b00-868ace4f9e74 req-643e8960-6535-4da7-9aa5-703a8ef380b3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Received event network-vif-unplugged-9c1da312-2c1b-451b-9ea1-34ff96520bb9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:56:11 compute-0 nova_compute[183075]: 2026-01-22 17:56:11.522 183079 DEBUG nova.compute.manager [req-8f27bdfc-58c1-47e2-9b00-868ace4f9e74 req-643e8960-6535-4da7-9aa5-703a8ef380b3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Received event network-vif-plugged-9c1da312-2c1b-451b-9ea1-34ff96520bb9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:56:11 compute-0 nova_compute[183075]: 2026-01-22 17:56:11.522 183079 DEBUG oslo_concurrency.lockutils [req-8f27bdfc-58c1-47e2-9b00-868ace4f9e74 req-643e8960-6535-4da7-9aa5-703a8ef380b3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "42bc2eb6-0654-413e-bf4f-7926c9f3efb5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:56:11 compute-0 nova_compute[183075]: 2026-01-22 17:56:11.523 183079 DEBUG oslo_concurrency.lockutils [req-8f27bdfc-58c1-47e2-9b00-868ace4f9e74 req-643e8960-6535-4da7-9aa5-703a8ef380b3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "42bc2eb6-0654-413e-bf4f-7926c9f3efb5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:56:11 compute-0 nova_compute[183075]: 2026-01-22 17:56:11.523 183079 DEBUG oslo_concurrency.lockutils [req-8f27bdfc-58c1-47e2-9b00-868ace4f9e74 req-643e8960-6535-4da7-9aa5-703a8ef380b3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "42bc2eb6-0654-413e-bf4f-7926c9f3efb5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:56:11 compute-0 nova_compute[183075]: 2026-01-22 17:56:11.523 183079 DEBUG nova.compute.manager [req-8f27bdfc-58c1-47e2-9b00-868ace4f9e74 req-643e8960-6535-4da7-9aa5-703a8ef380b3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] No waiting events found dispatching network-vif-plugged-9c1da312-2c1b-451b-9ea1-34ff96520bb9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:56:11 compute-0 nova_compute[183075]: 2026-01-22 17:56:11.523 183079 WARNING nova.compute.manager [req-8f27bdfc-58c1-47e2-9b00-868ace4f9e74 req-643e8960-6535-4da7-9aa5-703a8ef380b3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Received unexpected event network-vif-plugged-9c1da312-2c1b-451b-9ea1-34ff96520bb9 for instance with vm_state active and task_state deleting.
Jan 22 17:56:12 compute-0 nova_compute[183075]: 2026-01-22 17:56:12.639 183079 DEBUG nova.network.neutron [-] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:56:12 compute-0 nova_compute[183075]: 2026-01-22 17:56:12.654 183079 INFO nova.compute.manager [-] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Took 1.84 seconds to deallocate network for instance.
Jan 22 17:56:12 compute-0 nova_compute[183075]: 2026-01-22 17:56:12.689 183079 DEBUG oslo_concurrency.lockutils [None req-7a4e148f-197b-4f08-b124-624508cbafa8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:56:12 compute-0 nova_compute[183075]: 2026-01-22 17:56:12.689 183079 DEBUG oslo_concurrency.lockutils [None req-7a4e148f-197b-4f08-b124-624508cbafa8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:56:12 compute-0 nova_compute[183075]: 2026-01-22 17:56:12.699 183079 DEBUG nova.compute.manager [req-7323bef0-9407-41eb-8fe0-1f395a88fe4b req-eda5ab69-c0c5-47db-8139-1d490ff498d0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Received event network-vif-deleted-9c1da312-2c1b-451b-9ea1-34ff96520bb9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:56:12 compute-0 nova_compute[183075]: 2026-01-22 17:56:12.741 183079 DEBUG nova.compute.provider_tree [None req-7a4e148f-197b-4f08-b124-624508cbafa8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:56:12 compute-0 nova_compute[183075]: 2026-01-22 17:56:12.753 183079 DEBUG nova.scheduler.client.report [None req-7a4e148f-197b-4f08-b124-624508cbafa8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:56:12 compute-0 nova_compute[183075]: 2026-01-22 17:56:12.770 183079 DEBUG oslo_concurrency.lockutils [None req-7a4e148f-197b-4f08-b124-624508cbafa8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.081s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:56:12 compute-0 nova_compute[183075]: 2026-01-22 17:56:12.790 183079 INFO nova.scheduler.client.report [None req-7a4e148f-197b-4f08-b124-624508cbafa8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Deleted allocations for instance 42bc2eb6-0654-413e-bf4f-7926c9f3efb5
Jan 22 17:56:12 compute-0 nova_compute[183075]: 2026-01-22 17:56:12.841 183079 DEBUG oslo_concurrency.lockutils [None req-7a4e148f-197b-4f08-b124-624508cbafa8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "42bc2eb6-0654-413e-bf4f-7926c9f3efb5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.351s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:56:13 compute-0 nova_compute[183075]: 2026-01-22 17:56:13.828 183079 DEBUG oslo_concurrency.lockutils [None req-61f73311-546b-4cf2-98fd-10c747c6dae8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "eadc13fe-ef8b-46e5-af17-40b408aa5aff" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:56:13 compute-0 nova_compute[183075]: 2026-01-22 17:56:13.829 183079 DEBUG oslo_concurrency.lockutils [None req-61f73311-546b-4cf2-98fd-10c747c6dae8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "eadc13fe-ef8b-46e5-af17-40b408aa5aff" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:56:13 compute-0 nova_compute[183075]: 2026-01-22 17:56:13.829 183079 DEBUG oslo_concurrency.lockutils [None req-61f73311-546b-4cf2-98fd-10c747c6dae8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "eadc13fe-ef8b-46e5-af17-40b408aa5aff-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:56:13 compute-0 nova_compute[183075]: 2026-01-22 17:56:13.829 183079 DEBUG oslo_concurrency.lockutils [None req-61f73311-546b-4cf2-98fd-10c747c6dae8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "eadc13fe-ef8b-46e5-af17-40b408aa5aff-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:56:13 compute-0 nova_compute[183075]: 2026-01-22 17:56:13.829 183079 DEBUG oslo_concurrency.lockutils [None req-61f73311-546b-4cf2-98fd-10c747c6dae8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "eadc13fe-ef8b-46e5-af17-40b408aa5aff-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:56:13 compute-0 nova_compute[183075]: 2026-01-22 17:56:13.830 183079 INFO nova.compute.manager [None req-61f73311-546b-4cf2-98fd-10c747c6dae8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Terminating instance
Jan 22 17:56:13 compute-0 nova_compute[183075]: 2026-01-22 17:56:13.831 183079 DEBUG nova.compute.manager [None req-61f73311-546b-4cf2-98fd-10c747c6dae8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:56:13 compute-0 kernel: tap6f89fabe-02 (unregistering): left promiscuous mode
Jan 22 17:56:13 compute-0 NetworkManager[55454]: <info>  [1769104573.8524] device (tap6f89fabe-02): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:56:13 compute-0 ovn_controller[95372]: 2026-01-22T17:56:13Z|00843|binding|INFO|Releasing lport 6f89fabe-02ec-4340-9703-82e93c101ebc from this chassis (sb_readonly=0)
Jan 22 17:56:13 compute-0 nova_compute[183075]: 2026-01-22 17:56:13.857 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:13 compute-0 ovn_controller[95372]: 2026-01-22T17:56:13Z|00844|binding|INFO|Setting lport 6f89fabe-02ec-4340-9703-82e93c101ebc down in Southbound
Jan 22 17:56:13 compute-0 ovn_controller[95372]: 2026-01-22T17:56:13Z|00845|binding|INFO|Removing iface tap6f89fabe-02 ovn-installed in OVS
Jan 22 17:56:13 compute-0 nova_compute[183075]: 2026-01-22 17:56:13.860 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:13.864 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:45:58 10.100.0.4'], port_security=['fa:16:3e:45:45:58 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'eadc13fe-ef8b-46e5-af17-40b408aa5aff', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'e384bd4b-9ab1-4663-8bdf-170c928c64ff', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.203'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=6f89fabe-02ec-4340-9703-82e93c101ebc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:56:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:13.866 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 6f89fabe-02ec-4340-9703-82e93c101ebc in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 unbound from our chassis
Jan 22 17:56:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:13.867 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:56:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:13.868 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[90851169-0011-4cca-a1d7-ffcfae159e1d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:13.868 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 namespace which is not needed anymore
Jan 22 17:56:13 compute-0 nova_compute[183075]: 2026-01-22 17:56:13.874 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:13 compute-0 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d0000004a.scope: Deactivated successfully.
Jan 22 17:56:13 compute-0 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d0000004a.scope: Consumed 22.727s CPU time.
Jan 22 17:56:13 compute-0 systemd-machined[154382]: Machine qemu-74-instance-0000004a terminated.
Jan 22 17:56:14 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242666]: [NOTICE]   (242670) : haproxy version is 2.8.14-c23fe91
Jan 22 17:56:14 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242666]: [NOTICE]   (242670) : path to executable is /usr/sbin/haproxy
Jan 22 17:56:14 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242666]: [WARNING]  (242670) : Exiting Master process...
Jan 22 17:56:14 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242666]: [ALERT]    (242670) : Current worker (242672) exited with code 143 (Terminated)
Jan 22 17:56:14 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[242666]: [WARNING]  (242670) : All workers exited. Exiting... (0)
Jan 22 17:56:14 compute-0 systemd[1]: libpod-5f29505b917845c1b55edd6e7c18effb6aa9abc30bc109186cf7135b336cb0a3.scope: Deactivated successfully.
Jan 22 17:56:14 compute-0 podman[244007]: 2026-01-22 17:56:14.012970005 +0000 UTC m=+0.050110185 container died 5f29505b917845c1b55edd6e7c18effb6aa9abc30bc109186cf7135b336cb0a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 17:56:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-a14daccc3a656e38221d6c677b14d4f3c45249cbd5830a61370d07ea94ab5ff6-merged.mount: Deactivated successfully.
Jan 22 17:56:14 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5f29505b917845c1b55edd6e7c18effb6aa9abc30bc109186cf7135b336cb0a3-userdata-shm.mount: Deactivated successfully.
Jan 22 17:56:14 compute-0 podman[244007]: 2026-01-22 17:56:14.053270596 +0000 UTC m=+0.090410776 container cleanup 5f29505b917845c1b55edd6e7c18effb6aa9abc30bc109186cf7135b336cb0a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:56:14 compute-0 nova_compute[183075]: 2026-01-22 17:56:14.053 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:14 compute-0 nova_compute[183075]: 2026-01-22 17:56:14.057 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:14 compute-0 systemd[1]: libpod-conmon-5f29505b917845c1b55edd6e7c18effb6aa9abc30bc109186cf7135b336cb0a3.scope: Deactivated successfully.
Jan 22 17:56:14 compute-0 nova_compute[183075]: 2026-01-22 17:56:14.085 183079 INFO nova.virt.libvirt.driver [-] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Instance destroyed successfully.
Jan 22 17:56:14 compute-0 nova_compute[183075]: 2026-01-22 17:56:14.086 183079 DEBUG nova.objects.instance [None req-61f73311-546b-4cf2-98fd-10c747c6dae8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'resources' on Instance uuid eadc13fe-ef8b-46e5-af17-40b408aa5aff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:56:14 compute-0 nova_compute[183075]: 2026-01-22 17:56:14.097 183079 DEBUG nova.virt.libvirt.vif [None req-61f73311-546b-4cf2-98fd-10c747c6dae8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:52:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1171118669',display_name='tempest-server-test-1171118669',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1171118669',id=74,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:52:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-csh87x03',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:52:15Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=eadc13fe-ef8b-46e5-af17-40b408aa5aff,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6f89fabe-02ec-4340-9703-82e93c101ebc", "address": "fa:16:3e:45:45:58", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f89fabe-02", "ovs_interfaceid": "6f89fabe-02ec-4340-9703-82e93c101ebc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:56:14 compute-0 nova_compute[183075]: 2026-01-22 17:56:14.098 183079 DEBUG nova.network.os_vif_util [None req-61f73311-546b-4cf2-98fd-10c747c6dae8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "6f89fabe-02ec-4340-9703-82e93c101ebc", "address": "fa:16:3e:45:45:58", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f89fabe-02", "ovs_interfaceid": "6f89fabe-02ec-4340-9703-82e93c101ebc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:56:14 compute-0 nova_compute[183075]: 2026-01-22 17:56:14.098 183079 DEBUG nova.network.os_vif_util [None req-61f73311-546b-4cf2-98fd-10c747c6dae8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:45:45:58,bridge_name='br-int',has_traffic_filtering=True,id=6f89fabe-02ec-4340-9703-82e93c101ebc,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f89fabe-02') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:56:14 compute-0 nova_compute[183075]: 2026-01-22 17:56:14.099 183079 DEBUG os_vif [None req-61f73311-546b-4cf2-98fd-10c747c6dae8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:45:45:58,bridge_name='br-int',has_traffic_filtering=True,id=6f89fabe-02ec-4340-9703-82e93c101ebc,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f89fabe-02') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:56:14 compute-0 nova_compute[183075]: 2026-01-22 17:56:14.100 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:14 compute-0 nova_compute[183075]: 2026-01-22 17:56:14.100 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6f89fabe-02, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:56:14 compute-0 nova_compute[183075]: 2026-01-22 17:56:14.103 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:14 compute-0 nova_compute[183075]: 2026-01-22 17:56:14.106 183079 INFO os_vif [None req-61f73311-546b-4cf2-98fd-10c747c6dae8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:45:45:58,bridge_name='br-int',has_traffic_filtering=True,id=6f89fabe-02ec-4340-9703-82e93c101ebc,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f89fabe-02')
Jan 22 17:56:14 compute-0 nova_compute[183075]: 2026-01-22 17:56:14.107 183079 INFO nova.virt.libvirt.driver [None req-61f73311-546b-4cf2-98fd-10c747c6dae8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Deleting instance files /var/lib/nova/instances/eadc13fe-ef8b-46e5-af17-40b408aa5aff_del
Jan 22 17:56:14 compute-0 nova_compute[183075]: 2026-01-22 17:56:14.107 183079 INFO nova.virt.libvirt.driver [None req-61f73311-546b-4cf2-98fd-10c747c6dae8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Deletion of /var/lib/nova/instances/eadc13fe-ef8b-46e5-af17-40b408aa5aff_del complete
Jan 22 17:56:14 compute-0 podman[244046]: 2026-01-22 17:56:14.131456582 +0000 UTC m=+0.052274323 container remove 5f29505b917845c1b55edd6e7c18effb6aa9abc30bc109186cf7135b336cb0a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:56:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:14.137 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[5fd1b294-4382-4cb3-b894-d057ad2641be]: (4, ('Thu Jan 22 05:56:13 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 (5f29505b917845c1b55edd6e7c18effb6aa9abc30bc109186cf7135b336cb0a3)\n5f29505b917845c1b55edd6e7c18effb6aa9abc30bc109186cf7135b336cb0a3\nThu Jan 22 05:56:14 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 (5f29505b917845c1b55edd6e7c18effb6aa9abc30bc109186cf7135b336cb0a3)\n5f29505b917845c1b55edd6e7c18effb6aa9abc30bc109186cf7135b336cb0a3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:14.139 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[59d56319-1089-4910-9c06-9509c1119115]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:14.139 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:56:14 compute-0 nova_compute[183075]: 2026-01-22 17:56:14.141 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:14 compute-0 kernel: tap88ed9213-70: left promiscuous mode
Jan 22 17:56:14 compute-0 nova_compute[183075]: 2026-01-22 17:56:14.150 183079 INFO nova.compute.manager [None req-61f73311-546b-4cf2-98fd-10c747c6dae8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Took 0.32 seconds to destroy the instance on the hypervisor.
Jan 22 17:56:14 compute-0 nova_compute[183075]: 2026-01-22 17:56:14.152 183079 DEBUG oslo.service.loopingcall [None req-61f73311-546b-4cf2-98fd-10c747c6dae8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:56:14 compute-0 nova_compute[183075]: 2026-01-22 17:56:14.152 183079 DEBUG nova.compute.manager [-] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:56:14 compute-0 nova_compute[183075]: 2026-01-22 17:56:14.152 183079 DEBUG nova.network.neutron [-] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:56:14 compute-0 nova_compute[183075]: 2026-01-22 17:56:14.155 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:14.157 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d52e277d-afec-49e5-8e35-97ee90d88384]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:14.178 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c27e3cfc-d4c1-433c-9701-44db0cf188af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:14.180 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[93c79a52-7184-4165-9898-63683dfdb0d2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:14.200 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[cbfc9afe-86bf-489e-ab6b-b09af745a8f9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 669808, 'reachable_time': 22583, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244065, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:14 compute-0 systemd[1]: run-netns-ovnmeta\x2d88ed9213\x2d7d48\x2d4c42\x2dad60\x2dce3fcf104f71.mount: Deactivated successfully.
Jan 22 17:56:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:14.204 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:56:14 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:14.204 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[649f0e40-9b23-4a9c-a3e2-0f8ee3e5b0de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:14 compute-0 nova_compute[183075]: 2026-01-22 17:56:14.284 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:14 compute-0 nova_compute[183075]: 2026-01-22 17:56:14.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:56:15 compute-0 nova_compute[183075]: 2026-01-22 17:56:15.090 183079 DEBUG nova.compute.manager [req-2ceeaa05-89db-4b35-85e0-c8ceecc385a7 req-d3c0a7ec-7049-4a22-b089-c77fa2a3dcae a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Received event network-vif-unplugged-6f89fabe-02ec-4340-9703-82e93c101ebc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:56:15 compute-0 nova_compute[183075]: 2026-01-22 17:56:15.090 183079 DEBUG oslo_concurrency.lockutils [req-2ceeaa05-89db-4b35-85e0-c8ceecc385a7 req-d3c0a7ec-7049-4a22-b089-c77fa2a3dcae a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "eadc13fe-ef8b-46e5-af17-40b408aa5aff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:56:15 compute-0 nova_compute[183075]: 2026-01-22 17:56:15.091 183079 DEBUG oslo_concurrency.lockutils [req-2ceeaa05-89db-4b35-85e0-c8ceecc385a7 req-d3c0a7ec-7049-4a22-b089-c77fa2a3dcae a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "eadc13fe-ef8b-46e5-af17-40b408aa5aff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:56:15 compute-0 nova_compute[183075]: 2026-01-22 17:56:15.091 183079 DEBUG oslo_concurrency.lockutils [req-2ceeaa05-89db-4b35-85e0-c8ceecc385a7 req-d3c0a7ec-7049-4a22-b089-c77fa2a3dcae a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "eadc13fe-ef8b-46e5-af17-40b408aa5aff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:56:15 compute-0 nova_compute[183075]: 2026-01-22 17:56:15.091 183079 DEBUG nova.compute.manager [req-2ceeaa05-89db-4b35-85e0-c8ceecc385a7 req-d3c0a7ec-7049-4a22-b089-c77fa2a3dcae a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] No waiting events found dispatching network-vif-unplugged-6f89fabe-02ec-4340-9703-82e93c101ebc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:56:15 compute-0 nova_compute[183075]: 2026-01-22 17:56:15.091 183079 DEBUG nova.compute.manager [req-2ceeaa05-89db-4b35-85e0-c8ceecc385a7 req-d3c0a7ec-7049-4a22-b089-c77fa2a3dcae a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Received event network-vif-unplugged-6f89fabe-02ec-4340-9703-82e93c101ebc for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:56:16 compute-0 nova_compute[183075]: 2026-01-22 17:56:16.221 183079 DEBUG nova.network.neutron [-] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:56:16 compute-0 nova_compute[183075]: 2026-01-22 17:56:16.235 183079 INFO nova.compute.manager [-] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Took 2.08 seconds to deallocate network for instance.
Jan 22 17:56:16 compute-0 nova_compute[183075]: 2026-01-22 17:56:16.272 183079 DEBUG oslo_concurrency.lockutils [None req-61f73311-546b-4cf2-98fd-10c747c6dae8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:56:16 compute-0 nova_compute[183075]: 2026-01-22 17:56:16.272 183079 DEBUG oslo_concurrency.lockutils [None req-61f73311-546b-4cf2-98fd-10c747c6dae8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:56:16 compute-0 nova_compute[183075]: 2026-01-22 17:56:16.328 183079 DEBUG nova.compute.provider_tree [None req-61f73311-546b-4cf2-98fd-10c747c6dae8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:56:16 compute-0 nova_compute[183075]: 2026-01-22 17:56:16.342 183079 DEBUG nova.scheduler.client.report [None req-61f73311-546b-4cf2-98fd-10c747c6dae8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:56:16 compute-0 nova_compute[183075]: 2026-01-22 17:56:16.361 183079 DEBUG oslo_concurrency.lockutils [None req-61f73311-546b-4cf2-98fd-10c747c6dae8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.088s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:56:16 compute-0 nova_compute[183075]: 2026-01-22 17:56:16.434 183079 INFO nova.scheduler.client.report [None req-61f73311-546b-4cf2-98fd-10c747c6dae8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Deleted allocations for instance eadc13fe-ef8b-46e5-af17-40b408aa5aff
Jan 22 17:56:16 compute-0 nova_compute[183075]: 2026-01-22 17:56:16.491 183079 DEBUG oslo_concurrency.lockutils [None req-61f73311-546b-4cf2-98fd-10c747c6dae8 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "eadc13fe-ef8b-46e5-af17-40b408aa5aff" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.662s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:56:17 compute-0 nova_compute[183075]: 2026-01-22 17:56:17.186 183079 DEBUG nova.compute.manager [req-2c1e5f1f-6ac5-4652-99bc-5b1f61aac602 req-753b32df-f6c8-492f-a5b1-583bf4923e9e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Received event network-vif-plugged-6f89fabe-02ec-4340-9703-82e93c101ebc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:56:17 compute-0 nova_compute[183075]: 2026-01-22 17:56:17.187 183079 DEBUG oslo_concurrency.lockutils [req-2c1e5f1f-6ac5-4652-99bc-5b1f61aac602 req-753b32df-f6c8-492f-a5b1-583bf4923e9e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "eadc13fe-ef8b-46e5-af17-40b408aa5aff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:56:17 compute-0 nova_compute[183075]: 2026-01-22 17:56:17.187 183079 DEBUG oslo_concurrency.lockutils [req-2c1e5f1f-6ac5-4652-99bc-5b1f61aac602 req-753b32df-f6c8-492f-a5b1-583bf4923e9e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "eadc13fe-ef8b-46e5-af17-40b408aa5aff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:56:17 compute-0 nova_compute[183075]: 2026-01-22 17:56:17.188 183079 DEBUG oslo_concurrency.lockutils [req-2c1e5f1f-6ac5-4652-99bc-5b1f61aac602 req-753b32df-f6c8-492f-a5b1-583bf4923e9e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "eadc13fe-ef8b-46e5-af17-40b408aa5aff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:56:17 compute-0 nova_compute[183075]: 2026-01-22 17:56:17.188 183079 DEBUG nova.compute.manager [req-2c1e5f1f-6ac5-4652-99bc-5b1f61aac602 req-753b32df-f6c8-492f-a5b1-583bf4923e9e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] No waiting events found dispatching network-vif-plugged-6f89fabe-02ec-4340-9703-82e93c101ebc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:56:17 compute-0 nova_compute[183075]: 2026-01-22 17:56:17.188 183079 WARNING nova.compute.manager [req-2c1e5f1f-6ac5-4652-99bc-5b1f61aac602 req-753b32df-f6c8-492f-a5b1-583bf4923e9e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Received unexpected event network-vif-plugged-6f89fabe-02ec-4340-9703-82e93c101ebc for instance with vm_state deleted and task_state None.
Jan 22 17:56:17 compute-0 nova_compute[183075]: 2026-01-22 17:56:17.189 183079 DEBUG nova.compute.manager [req-2c1e5f1f-6ac5-4652-99bc-5b1f61aac602 req-753b32df-f6c8-492f-a5b1-583bf4923e9e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Received event network-vif-deleted-6f89fabe-02ec-4340-9703-82e93c101ebc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:56:17 compute-0 podman[244067]: 2026-01-22 17:56:17.362836715 +0000 UTC m=+0.062335113 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent)
Jan 22 17:56:17 compute-0 podman[244068]: 2026-01-22 17:56:17.374449877 +0000 UTC m=+0.069565777 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, maintainer=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., container_name=openstack_network_exporter, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 22 17:56:17 compute-0 podman[244066]: 2026-01-22 17:56:17.436755498 +0000 UTC m=+0.136792720 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 17:56:19 compute-0 nova_compute[183075]: 2026-01-22 17:56:19.102 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:19 compute-0 nova_compute[183075]: 2026-01-22 17:56:19.287 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:22 compute-0 podman[244132]: 2026-01-22 17:56:22.343540062 +0000 UTC m=+0.056189558 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 17:56:23 compute-0 nova_compute[183075]: 2026-01-22 17:56:23.533 183079 DEBUG oslo_concurrency.lockutils [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "4e7d63e0-335d-4cc1-9701-ba3728ed3e2d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:56:23 compute-0 nova_compute[183075]: 2026-01-22 17:56:23.533 183079 DEBUG oslo_concurrency.lockutils [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "4e7d63e0-335d-4cc1-9701-ba3728ed3e2d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:56:23 compute-0 nova_compute[183075]: 2026-01-22 17:56:23.546 183079 DEBUG nova.compute.manager [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:56:23 compute-0 nova_compute[183075]: 2026-01-22 17:56:23.620 183079 DEBUG oslo_concurrency.lockutils [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:56:23 compute-0 nova_compute[183075]: 2026-01-22 17:56:23.621 183079 DEBUG oslo_concurrency.lockutils [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:56:23 compute-0 nova_compute[183075]: 2026-01-22 17:56:23.627 183079 DEBUG nova.virt.hardware [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:56:23 compute-0 nova_compute[183075]: 2026-01-22 17:56:23.628 183079 INFO nova.compute.claims [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:56:23 compute-0 nova_compute[183075]: 2026-01-22 17:56:23.668 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769104568.6678967, f7112f58-59da-486c-9079-e0bc3fa6a3e6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:56:23 compute-0 nova_compute[183075]: 2026-01-22 17:56:23.669 183079 INFO nova.compute.manager [-] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] VM Stopped (Lifecycle Event)
Jan 22 17:56:23 compute-0 nova_compute[183075]: 2026-01-22 17:56:23.703 183079 DEBUG nova.compute.manager [None req-9b84821b-ab67-40ce-9c47-c74b53526054 - - - - - -] [instance: f7112f58-59da-486c-9079-e0bc3fa6a3e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:56:23 compute-0 nova_compute[183075]: 2026-01-22 17:56:23.748 183079 DEBUG nova.compute.provider_tree [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:56:23 compute-0 nova_compute[183075]: 2026-01-22 17:56:23.766 183079 DEBUG nova.scheduler.client.report [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:56:23 compute-0 nova_compute[183075]: 2026-01-22 17:56:23.790 183079 DEBUG oslo_concurrency.lockutils [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:56:23 compute-0 nova_compute[183075]: 2026-01-22 17:56:23.791 183079 DEBUG nova.compute.manager [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:56:23 compute-0 nova_compute[183075]: 2026-01-22 17:56:23.828 183079 DEBUG nova.compute.manager [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:56:23 compute-0 nova_compute[183075]: 2026-01-22 17:56:23.829 183079 DEBUG nova.network.neutron [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:56:23 compute-0 nova_compute[183075]: 2026-01-22 17:56:23.845 183079 INFO nova.virt.libvirt.driver [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:56:23 compute-0 nova_compute[183075]: 2026-01-22 17:56:23.864 183079 DEBUG nova.compute.manager [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:56:23 compute-0 nova_compute[183075]: 2026-01-22 17:56:23.953 183079 DEBUG nova.compute.manager [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:56:23 compute-0 nova_compute[183075]: 2026-01-22 17:56:23.955 183079 DEBUG nova.virt.libvirt.driver [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:56:23 compute-0 nova_compute[183075]: 2026-01-22 17:56:23.955 183079 INFO nova.virt.libvirt.driver [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Creating image(s)
Jan 22 17:56:23 compute-0 nova_compute[183075]: 2026-01-22 17:56:23.956 183079 DEBUG oslo_concurrency.lockutils [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "/var/lib/nova/instances/4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:56:23 compute-0 nova_compute[183075]: 2026-01-22 17:56:23.956 183079 DEBUG oslo_concurrency.lockutils [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:56:23 compute-0 nova_compute[183075]: 2026-01-22 17:56:23.957 183079 DEBUG oslo_concurrency.lockutils [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:56:23 compute-0 nova_compute[183075]: 2026-01-22 17:56:23.974 183079 DEBUG oslo_concurrency.processutils [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:56:24 compute-0 nova_compute[183075]: 2026-01-22 17:56:24.037 183079 DEBUG oslo_concurrency.processutils [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:56:24 compute-0 nova_compute[183075]: 2026-01-22 17:56:24.039 183079 DEBUG oslo_concurrency.lockutils [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:56:24 compute-0 nova_compute[183075]: 2026-01-22 17:56:24.040 183079 DEBUG oslo_concurrency.lockutils [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:56:24 compute-0 nova_compute[183075]: 2026-01-22 17:56:24.056 183079 DEBUG oslo_concurrency.processutils [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:56:24 compute-0 nova_compute[183075]: 2026-01-22 17:56:24.110 183079 DEBUG oslo_concurrency.processutils [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:56:24 compute-0 nova_compute[183075]: 2026-01-22 17:56:24.111 183079 DEBUG oslo_concurrency.processutils [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:56:24 compute-0 nova_compute[183075]: 2026-01-22 17:56:24.140 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:24 compute-0 nova_compute[183075]: 2026-01-22 17:56:24.145 183079 DEBUG oslo_concurrency.processutils [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:56:24 compute-0 nova_compute[183075]: 2026-01-22 17:56:24.146 183079 DEBUG oslo_concurrency.lockutils [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:56:24 compute-0 nova_compute[183075]: 2026-01-22 17:56:24.146 183079 DEBUG oslo_concurrency.processutils [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:56:24 compute-0 nova_compute[183075]: 2026-01-22 17:56:24.208 183079 DEBUG oslo_concurrency.processutils [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:56:24 compute-0 nova_compute[183075]: 2026-01-22 17:56:24.209 183079 DEBUG nova.virt.disk.api [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Checking if we can resize image /var/lib/nova/instances/4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:56:24 compute-0 nova_compute[183075]: 2026-01-22 17:56:24.209 183079 DEBUG oslo_concurrency.processutils [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:56:24 compute-0 nova_compute[183075]: 2026-01-22 17:56:24.261 183079 DEBUG oslo_concurrency.processutils [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:56:24 compute-0 nova_compute[183075]: 2026-01-22 17:56:24.262 183079 DEBUG nova.virt.disk.api [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Cannot resize image /var/lib/nova/instances/4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:56:24 compute-0 nova_compute[183075]: 2026-01-22 17:56:24.262 183079 DEBUG nova.objects.instance [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'migration_context' on Instance uuid 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:56:24 compute-0 nova_compute[183075]: 2026-01-22 17:56:24.276 183079 DEBUG nova.virt.libvirt.driver [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:56:24 compute-0 nova_compute[183075]: 2026-01-22 17:56:24.276 183079 DEBUG nova.virt.libvirt.driver [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Ensure instance console log exists: /var/lib/nova/instances/4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:56:24 compute-0 nova_compute[183075]: 2026-01-22 17:56:24.277 183079 DEBUG oslo_concurrency.lockutils [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:56:24 compute-0 nova_compute[183075]: 2026-01-22 17:56:24.277 183079 DEBUG oslo_concurrency.lockutils [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:56:24 compute-0 nova_compute[183075]: 2026-01-22 17:56:24.277 183079 DEBUG oslo_concurrency.lockutils [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:56:24 compute-0 nova_compute[183075]: 2026-01-22 17:56:24.290 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:24 compute-0 nova_compute[183075]: 2026-01-22 17:56:24.887 183079 DEBUG nova.policy [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:56:25 compute-0 nova_compute[183075]: 2026-01-22 17:56:25.739 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769104570.7371244, 42bc2eb6-0654-413e-bf4f-7926c9f3efb5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:56:25 compute-0 nova_compute[183075]: 2026-01-22 17:56:25.740 183079 INFO nova.compute.manager [-] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] VM Stopped (Lifecycle Event)
Jan 22 17:56:25 compute-0 nova_compute[183075]: 2026-01-22 17:56:25.766 183079 DEBUG nova.compute.manager [None req-0ac15ba2-5098-4523-8917-55735d76b592 - - - - - -] [instance: 42bc2eb6-0654-413e-bf4f-7926c9f3efb5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:56:26 compute-0 nova_compute[183075]: 2026-01-22 17:56:26.379 183079 DEBUG nova.network.neutron [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Successfully updated port: 09c606a4-b433-44a9-9c14-690255317cf3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:56:26 compute-0 nova_compute[183075]: 2026-01-22 17:56:26.479 183079 DEBUG nova.compute.manager [req-50473022-d2dd-4edb-b424-61ecb9247a6d req-021a7d05-1a1b-4114-bedb-6b395de9715d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Received event network-changed-09c606a4-b433-44a9-9c14-690255317cf3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:56:26 compute-0 nova_compute[183075]: 2026-01-22 17:56:26.480 183079 DEBUG nova.compute.manager [req-50473022-d2dd-4edb-b424-61ecb9247a6d req-021a7d05-1a1b-4114-bedb-6b395de9715d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Refreshing instance network info cache due to event network-changed-09c606a4-b433-44a9-9c14-690255317cf3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:56:26 compute-0 nova_compute[183075]: 2026-01-22 17:56:26.480 183079 DEBUG oslo_concurrency.lockutils [req-50473022-d2dd-4edb-b424-61ecb9247a6d req-021a7d05-1a1b-4114-bedb-6b395de9715d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-4e7d63e0-335d-4cc1-9701-ba3728ed3e2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:56:26 compute-0 nova_compute[183075]: 2026-01-22 17:56:26.480 183079 DEBUG oslo_concurrency.lockutils [req-50473022-d2dd-4edb-b424-61ecb9247a6d req-021a7d05-1a1b-4114-bedb-6b395de9715d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-4e7d63e0-335d-4cc1-9701-ba3728ed3e2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:56:26 compute-0 nova_compute[183075]: 2026-01-22 17:56:26.481 183079 DEBUG nova.network.neutron [req-50473022-d2dd-4edb-b424-61ecb9247a6d req-021a7d05-1a1b-4114-bedb-6b395de9715d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Refreshing network info cache for port 09c606a4-b433-44a9-9c14-690255317cf3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:56:26 compute-0 nova_compute[183075]: 2026-01-22 17:56:26.493 183079 DEBUG oslo_concurrency.lockutils [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "refresh_cache-4e7d63e0-335d-4cc1-9701-ba3728ed3e2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:56:26 compute-0 nova_compute[183075]: 2026-01-22 17:56:26.665 183079 DEBUG nova.network.neutron [req-50473022-d2dd-4edb-b424-61ecb9247a6d req-021a7d05-1a1b-4114-bedb-6b395de9715d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:56:26 compute-0 nova_compute[183075]: 2026-01-22 17:56:26.874 183079 DEBUG nova.network.neutron [req-50473022-d2dd-4edb-b424-61ecb9247a6d req-021a7d05-1a1b-4114-bedb-6b395de9715d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:56:26 compute-0 nova_compute[183075]: 2026-01-22 17:56:26.890 183079 DEBUG oslo_concurrency.lockutils [req-50473022-d2dd-4edb-b424-61ecb9247a6d req-021a7d05-1a1b-4114-bedb-6b395de9715d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-4e7d63e0-335d-4cc1-9701-ba3728ed3e2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:56:26 compute-0 nova_compute[183075]: 2026-01-22 17:56:26.892 183079 DEBUG oslo_concurrency.lockutils [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquired lock "refresh_cache-4e7d63e0-335d-4cc1-9701-ba3728ed3e2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:56:26 compute-0 nova_compute[183075]: 2026-01-22 17:56:26.893 183079 DEBUG nova.network.neutron [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:56:27 compute-0 nova_compute[183075]: 2026-01-22 17:56:27.763 183079 DEBUG nova.network.neutron [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.082 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769104574.0816085, eadc13fe-ef8b-46e5-af17-40b408aa5aff => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.083 183079 INFO nova.compute.manager [-] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] VM Stopped (Lifecycle Event)
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.111 183079 DEBUG nova.compute.manager [None req-a4260892-a85e-4f48-8aec-eabefd52d1ad - - - - - -] [instance: eadc13fe-ef8b-46e5-af17-40b408aa5aff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.146 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.292 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.773 183079 DEBUG nova.network.neutron [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Updating instance_info_cache with network_info: [{"id": "09c606a4-b433-44a9-9c14-690255317cf3", "address": "fa:16:3e:6d:7a:4c", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09c606a4-b4", "ovs_interfaceid": "09c606a4-b433-44a9-9c14-690255317cf3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.796 183079 DEBUG oslo_concurrency.lockutils [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Releasing lock "refresh_cache-4e7d63e0-335d-4cc1-9701-ba3728ed3e2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.796 183079 DEBUG nova.compute.manager [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Instance network_info: |[{"id": "09c606a4-b433-44a9-9c14-690255317cf3", "address": "fa:16:3e:6d:7a:4c", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09c606a4-b4", "ovs_interfaceid": "09c606a4-b433-44a9-9c14-690255317cf3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.799 183079 DEBUG nova.virt.libvirt.driver [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Start _get_guest_xml network_info=[{"id": "09c606a4-b433-44a9-9c14-690255317cf3", "address": "fa:16:3e:6d:7a:4c", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09c606a4-b4", "ovs_interfaceid": "09c606a4-b433-44a9-9c14-690255317cf3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.805 183079 WARNING nova.virt.libvirt.driver [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.811 183079 DEBUG nova.virt.libvirt.host [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.812 183079 DEBUG nova.virt.libvirt.host [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.817 183079 DEBUG nova.virt.libvirt.host [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.818 183079 DEBUG nova.virt.libvirt.host [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.818 183079 DEBUG nova.virt.libvirt.driver [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.819 183079 DEBUG nova.virt.hardware [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.819 183079 DEBUG nova.virt.hardware [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.819 183079 DEBUG nova.virt.hardware [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.820 183079 DEBUG nova.virt.hardware [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.820 183079 DEBUG nova.virt.hardware [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.820 183079 DEBUG nova.virt.hardware [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.820 183079 DEBUG nova.virt.hardware [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.820 183079 DEBUG nova.virt.hardware [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.821 183079 DEBUG nova.virt.hardware [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.821 183079 DEBUG nova.virt.hardware [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.821 183079 DEBUG nova.virt.hardware [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.825 183079 DEBUG nova.virt.libvirt.vif [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:56:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1286187819',display_name='tempest-server-test-1286187819',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1286187819',id=77,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-b5y1dca2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:56:23Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=4e7d63e0-335d-4cc1-9701-ba3728ed3e2d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "09c606a4-b433-44a9-9c14-690255317cf3", "address": "fa:16:3e:6d:7a:4c", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09c606a4-b4", "ovs_interfaceid": "09c606a4-b433-44a9-9c14-690255317cf3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.826 183079 DEBUG nova.network.os_vif_util [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "09c606a4-b433-44a9-9c14-690255317cf3", "address": "fa:16:3e:6d:7a:4c", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09c606a4-b4", "ovs_interfaceid": "09c606a4-b433-44a9-9c14-690255317cf3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.826 183079 DEBUG nova.network.os_vif_util [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:7a:4c,bridge_name='br-int',has_traffic_filtering=True,id=09c606a4-b433-44a9-9c14-690255317cf3,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap09c606a4-b4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.827 183079 DEBUG nova.objects.instance [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'pci_devices' on Instance uuid 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.842 183079 DEBUG nova.virt.libvirt.driver [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:56:29 compute-0 nova_compute[183075]:   <uuid>4e7d63e0-335d-4cc1-9701-ba3728ed3e2d</uuid>
Jan 22 17:56:29 compute-0 nova_compute[183075]:   <name>instance-0000004d</name>
Jan 22 17:56:29 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:56:29 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:56:29 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:56:29 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1286187819</nova:name>
Jan 22 17:56:29 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:56:29</nova:creationTime>
Jan 22 17:56:29 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:56:29 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:56:29 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:56:29 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:56:29 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:56:29 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:56:29 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:56:29 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:56:29 compute-0 nova_compute[183075]:         <nova:user uuid="66c24efd0cd24c57803ea8508e679d06">tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member</nova:user>
Jan 22 17:56:29 compute-0 nova_compute[183075]:         <nova:project uuid="cbff5cec4b9c4d2eb317124ca20fea9f">tempest-StatelessNetworkSecGroupIPv4Test-1348485316</nova:project>
Jan 22 17:56:29 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:56:29 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:56:29 compute-0 nova_compute[183075]:         <nova:port uuid="09c606a4-b433-44a9-9c14-690255317cf3">
Jan 22 17:56:29 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:56:29 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:56:29 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:56:29 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <system>
Jan 22 17:56:29 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:56:29 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:56:29 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:56:29 compute-0 nova_compute[183075]:       <entry name="serial">4e7d63e0-335d-4cc1-9701-ba3728ed3e2d</entry>
Jan 22 17:56:29 compute-0 nova_compute[183075]:       <entry name="uuid">4e7d63e0-335d-4cc1-9701-ba3728ed3e2d</entry>
Jan 22 17:56:29 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     </system>
Jan 22 17:56:29 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:56:29 compute-0 nova_compute[183075]:   <os>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:   </os>
Jan 22 17:56:29 compute-0 nova_compute[183075]:   <features>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:   </features>
Jan 22 17:56:29 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:56:29 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:56:29 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:56:29 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/disk"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:56:29 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:6d:7a:4c"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:       <target dev="tap09c606a4-b4"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:56:29 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/console.log" append="off"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <video>
Jan 22 17:56:29 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     </video>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:56:29 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:56:29 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:56:29 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:56:29 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:56:29 compute-0 nova_compute[183075]: </domain>
Jan 22 17:56:29 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.844 183079 DEBUG nova.compute.manager [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Preparing to wait for external event network-vif-plugged-09c606a4-b433-44a9-9c14-690255317cf3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.845 183079 DEBUG oslo_concurrency.lockutils [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.845 183079 DEBUG oslo_concurrency.lockutils [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.845 183079 DEBUG oslo_concurrency.lockutils [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.846 183079 DEBUG nova.virt.libvirt.vif [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:56:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1286187819',display_name='tempest-server-test-1286187819',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1286187819',id=77,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-b5y1dca2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_
rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:56:23Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=4e7d63e0-335d-4cc1-9701-ba3728ed3e2d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "09c606a4-b433-44a9-9c14-690255317cf3", "address": "fa:16:3e:6d:7a:4c", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09c606a4-b4", "ovs_interfaceid": "09c606a4-b433-44a9-9c14-690255317cf3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.847 183079 DEBUG nova.network.os_vif_util [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "09c606a4-b433-44a9-9c14-690255317cf3", "address": "fa:16:3e:6d:7a:4c", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09c606a4-b4", "ovs_interfaceid": "09c606a4-b433-44a9-9c14-690255317cf3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.848 183079 DEBUG nova.network.os_vif_util [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:7a:4c,bridge_name='br-int',has_traffic_filtering=True,id=09c606a4-b433-44a9-9c14-690255317cf3,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap09c606a4-b4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.848 183079 DEBUG os_vif [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:7a:4c,bridge_name='br-int',has_traffic_filtering=True,id=09c606a4-b433-44a9-9c14-690255317cf3,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap09c606a4-b4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.849 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.849 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.850 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.854 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.855 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09c606a4-b4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.855 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap09c606a4-b4, col_values=(('external_ids', {'iface-id': '09c606a4-b433-44a9-9c14-690255317cf3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6d:7a:4c', 'vm-uuid': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.856 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:29 compute-0 NetworkManager[55454]: <info>  [1769104589.8588] manager: (tap09c606a4-b4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/335)
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.859 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.866 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.867 183079 INFO os_vif [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:7a:4c,bridge_name='br-int',has_traffic_filtering=True,id=09c606a4-b433-44a9-9c14-690255317cf3,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap09c606a4-b4')
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.928 183079 DEBUG nova.virt.libvirt.driver [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:56:29 compute-0 nova_compute[183075]: 2026-01-22 17:56:29.928 183079 DEBUG nova.virt.libvirt.driver [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No VIF found with MAC fa:16:3e:6d:7a:4c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:56:30 compute-0 kernel: tap09c606a4-b4: entered promiscuous mode
Jan 22 17:56:30 compute-0 NetworkManager[55454]: <info>  [1769104590.0067] manager: (tap09c606a4-b4): new Tun device (/org/freedesktop/NetworkManager/Devices/336)
Jan 22 17:56:30 compute-0 ovn_controller[95372]: 2026-01-22T17:56:30Z|00846|binding|INFO|Claiming lport 09c606a4-b433-44a9-9c14-690255317cf3 for this chassis.
Jan 22 17:56:30 compute-0 ovn_controller[95372]: 2026-01-22T17:56:30Z|00847|binding|INFO|09c606a4-b433-44a9-9c14-690255317cf3: Claiming fa:16:3e:6d:7a:4c 10.100.0.9
Jan 22 17:56:30 compute-0 nova_compute[183075]: 2026-01-22 17:56:30.006 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:30.015 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:7a:4c 10.100.0.9'], port_security=['fa:16:3e:6d:7a:4c 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-test_protocol_number_rule-587073029', 'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-test_protocol_number_rule-587073029', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ff9e54bc-1b37-40fb-a3e6-dc01342b1d06', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=09c606a4-b433-44a9-9c14-690255317cf3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:30.016 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 09c606a4-b433-44a9-9c14-690255317cf3 in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 bound to our chassis
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:30.017 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:56:30 compute-0 nova_compute[183075]: 2026-01-22 17:56:30.023 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:30 compute-0 ovn_controller[95372]: 2026-01-22T17:56:30Z|00848|binding|INFO|Setting lport 09c606a4-b433-44a9-9c14-690255317cf3 ovn-installed in OVS
Jan 22 17:56:30 compute-0 ovn_controller[95372]: 2026-01-22T17:56:30Z|00849|binding|INFO|Setting lport 09c606a4-b433-44a9-9c14-690255317cf3 up in Southbound
Jan 22 17:56:30 compute-0 nova_compute[183075]: 2026-01-22 17:56:30.025 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:30 compute-0 nova_compute[183075]: 2026-01-22 17:56:30.028 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:30.034 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4a975a09-5d57-4eb4-a327-cf4e6b622a6e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:30.035 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap88ed9213-71 in ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:30.036 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap88ed9213-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:30.036 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[16733265-1818-40ab-be92-b6b3b8f700a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:30.037 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6e8226fc-eaea-4953-9a22-61f3a402a034]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:30 compute-0 systemd-udevd[244186]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:30.051 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[b5c2cbb1-f968-4524-9e2f-5808e1785da1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:30 compute-0 systemd-machined[154382]: New machine qemu-77-instance-0000004d.
Jan 22 17:56:30 compute-0 NetworkManager[55454]: <info>  [1769104590.0611] device (tap09c606a4-b4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:56:30 compute-0 NetworkManager[55454]: <info>  [1769104590.0621] device (tap09c606a4-b4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:56:30 compute-0 systemd[1]: Started Virtual Machine qemu-77-instance-0000004d.
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:30.078 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7daf010f-ecd4-4a32-a77f-9b52a4be42ef]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:30.104 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[d1e18a76-2501-461e-8e41-70473b4638fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:30.109 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3a99d1aa-7741-467a-95e5-ce5a9f201c47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:30 compute-0 NetworkManager[55454]: <info>  [1769104590.1108] manager: (tap88ed9213-70): new Veth device (/org/freedesktop/NetworkManager/Devices/337)
Jan 22 17:56:30 compute-0 systemd-udevd[244190]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:30.141 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[28998324-56b2-4f30-bcb1-7660075462a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:30.144 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[d413b1fd-9f25-4d53-8461-1fcbccefa702]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:30 compute-0 NetworkManager[55454]: <info>  [1769104590.1715] device (tap88ed9213-70): carrier: link connected
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:30.178 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[066de5c4-39b9-409d-89a2-dee706a7288a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:30.199 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[401066b6-7a3c-4acd-9c1d-e4e83a8cb573]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 229], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 695387, 'reachable_time': 20150, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244218, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:30.217 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b926eabe-e925-4f3b-b601-4134a469ae30]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0a:fa93'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 695387, 'tstamp': 695387}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244219, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:30.234 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a301cd45-74a6-40fd-8589-68ef5908a009]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 229], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 695387, 'reachable_time': 20150, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 244220, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:30.271 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b23ba430-5f4b-4f8a-85bb-c4caf00697d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:30.324 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[22ce862f-87c9-4e3c-a798-2c022f6419bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:30.325 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:30.326 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:30.326 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88ed9213-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:56:30 compute-0 kernel: tap88ed9213-70: entered promiscuous mode
Jan 22 17:56:30 compute-0 NetworkManager[55454]: <info>  [1769104590.3284] manager: (tap88ed9213-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/338)
Jan 22 17:56:30 compute-0 nova_compute[183075]: 2026-01-22 17:56:30.329 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:30.331 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88ed9213-70, col_values=(('external_ids', {'iface-id': 'da93a32e-eed6-4124-8f08-78b0bfe2cb69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:56:30 compute-0 ovn_controller[95372]: 2026-01-22T17:56:30Z|00850|binding|INFO|Releasing lport da93a32e-eed6-4124-8f08-78b0bfe2cb69 from this chassis (sb_readonly=1)
Jan 22 17:56:30 compute-0 nova_compute[183075]: 2026-01-22 17:56:30.344 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:30.345 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:30.346 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7161f4f5-8748-4a89-a9e5-d222f13a1423]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:30.346 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:56:30 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:30.347 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'env', 'PROCESS_TAG=haproxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/88ed9213-7d48-4c42-ad60-ce3fcf104f71.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:56:30 compute-0 podman[244252]: 2026-01-22 17:56:30.720883347 +0000 UTC m=+0.043869907 container create e8a0036f265f341458871343ab2f6ca2cba969eeb9084a4d11c8a4cbc05470ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 17:56:30 compute-0 systemd[1]: Started libpod-conmon-e8a0036f265f341458871343ab2f6ca2cba969eeb9084a4d11c8a4cbc05470ad.scope.
Jan 22 17:56:30 compute-0 podman[244252]: 2026-01-22 17:56:30.697177112 +0000 UTC m=+0.020163652 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:56:30 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:56:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62d027dd0e0f46d9ed60192c6f4101eba46e37650383b96eb40fba91a2b5bff5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:56:30 compute-0 podman[244252]: 2026-01-22 17:56:30.822604386 +0000 UTC m=+0.145590916 container init e8a0036f265f341458871343ab2f6ca2cba969eeb9084a4d11c8a4cbc05470ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:56:30 compute-0 podman[244252]: 2026-01-22 17:56:30.827954109 +0000 UTC m=+0.150940629 container start e8a0036f265f341458871343ab2f6ca2cba969eeb9084a4d11c8a4cbc05470ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 22 17:56:30 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[244267]: [NOTICE]   (244271) : New worker (244273) forked
Jan 22 17:56:30 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[244267]: [NOTICE]   (244271) : Loading success.
Jan 22 17:56:31 compute-0 nova_compute[183075]: 2026-01-22 17:56:31.318 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104591.3177307, 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:56:31 compute-0 nova_compute[183075]: 2026-01-22 17:56:31.318 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] VM Started (Lifecycle Event)
Jan 22 17:56:31 compute-0 nova_compute[183075]: 2026-01-22 17:56:31.335 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:56:31 compute-0 nova_compute[183075]: 2026-01-22 17:56:31.339 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104591.3179724, 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:56:31 compute-0 nova_compute[183075]: 2026-01-22 17:56:31.339 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] VM Paused (Lifecycle Event)
Jan 22 17:56:31 compute-0 nova_compute[183075]: 2026-01-22 17:56:31.355 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:56:31 compute-0 nova_compute[183075]: 2026-01-22 17:56:31.358 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:56:31 compute-0 nova_compute[183075]: 2026-01-22 17:56:31.378 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:56:32 compute-0 nova_compute[183075]: 2026-01-22 17:56:32.691 183079 DEBUG nova.compute.manager [req-1149471d-c92a-4767-a6cd-081ed6786a43 req-c0802130-3e50-47d8-8e31-f48f7a7014cf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Received event network-vif-plugged-09c606a4-b433-44a9-9c14-690255317cf3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:56:32 compute-0 nova_compute[183075]: 2026-01-22 17:56:32.691 183079 DEBUG oslo_concurrency.lockutils [req-1149471d-c92a-4767-a6cd-081ed6786a43 req-c0802130-3e50-47d8-8e31-f48f7a7014cf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:56:32 compute-0 nova_compute[183075]: 2026-01-22 17:56:32.692 183079 DEBUG oslo_concurrency.lockutils [req-1149471d-c92a-4767-a6cd-081ed6786a43 req-c0802130-3e50-47d8-8e31-f48f7a7014cf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:56:32 compute-0 nova_compute[183075]: 2026-01-22 17:56:32.692 183079 DEBUG oslo_concurrency.lockutils [req-1149471d-c92a-4767-a6cd-081ed6786a43 req-c0802130-3e50-47d8-8e31-f48f7a7014cf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:56:32 compute-0 nova_compute[183075]: 2026-01-22 17:56:32.692 183079 DEBUG nova.compute.manager [req-1149471d-c92a-4767-a6cd-081ed6786a43 req-c0802130-3e50-47d8-8e31-f48f7a7014cf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Processing event network-vif-plugged-09c606a4-b433-44a9-9c14-690255317cf3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:56:32 compute-0 nova_compute[183075]: 2026-01-22 17:56:32.693 183079 DEBUG nova.compute.manager [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:56:32 compute-0 nova_compute[183075]: 2026-01-22 17:56:32.696 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104592.6960354, 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:56:32 compute-0 nova_compute[183075]: 2026-01-22 17:56:32.697 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] VM Resumed (Lifecycle Event)
Jan 22 17:56:32 compute-0 nova_compute[183075]: 2026-01-22 17:56:32.698 183079 DEBUG nova.virt.libvirt.driver [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:56:32 compute-0 nova_compute[183075]: 2026-01-22 17:56:32.702 183079 INFO nova.virt.libvirt.driver [-] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Instance spawned successfully.
Jan 22 17:56:32 compute-0 nova_compute[183075]: 2026-01-22 17:56:32.702 183079 DEBUG nova.virt.libvirt.driver [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:56:32 compute-0 nova_compute[183075]: 2026-01-22 17:56:32.722 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:56:32 compute-0 nova_compute[183075]: 2026-01-22 17:56:32.729 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:56:32 compute-0 nova_compute[183075]: 2026-01-22 17:56:32.734 183079 DEBUG nova.virt.libvirt.driver [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:56:32 compute-0 nova_compute[183075]: 2026-01-22 17:56:32.734 183079 DEBUG nova.virt.libvirt.driver [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:56:32 compute-0 nova_compute[183075]: 2026-01-22 17:56:32.735 183079 DEBUG nova.virt.libvirt.driver [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:56:32 compute-0 nova_compute[183075]: 2026-01-22 17:56:32.735 183079 DEBUG nova.virt.libvirt.driver [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:56:32 compute-0 nova_compute[183075]: 2026-01-22 17:56:32.736 183079 DEBUG nova.virt.libvirt.driver [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:56:32 compute-0 nova_compute[183075]: 2026-01-22 17:56:32.736 183079 DEBUG nova.virt.libvirt.driver [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:56:32 compute-0 nova_compute[183075]: 2026-01-22 17:56:32.767 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:56:32 compute-0 nova_compute[183075]: 2026-01-22 17:56:32.801 183079 INFO nova.compute.manager [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Took 8.85 seconds to spawn the instance on the hypervisor.
Jan 22 17:56:32 compute-0 nova_compute[183075]: 2026-01-22 17:56:32.801 183079 DEBUG nova.compute.manager [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:56:32 compute-0 nova_compute[183075]: 2026-01-22 17:56:32.858 183079 INFO nova.compute.manager [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Took 9.26 seconds to build instance.
Jan 22 17:56:32 compute-0 nova_compute[183075]: 2026-01-22 17:56:32.874 183079 DEBUG oslo_concurrency.lockutils [None req-54476b78-3fa1-4878-9a56-c0d2bdc7456c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "4e7d63e0-335d-4cc1-9701-ba3728ed3e2d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.341s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:56:33 compute-0 podman[244289]: 2026-01-22 17:56:33.350881844 +0000 UTC m=+0.058989554 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 17:56:34 compute-0 nova_compute[183075]: 2026-01-22 17:56:34.063 183079 INFO nova.compute.manager [None req-91998eb6-a2f8-40a6-91bb-9827c877a2f4 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Get console output
Jan 22 17:56:34 compute-0 nova_compute[183075]: 2026-01-22 17:56:34.068 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:56:34 compute-0 nova_compute[183075]: 2026-01-22 17:56:34.295 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:34 compute-0 nova_compute[183075]: 2026-01-22 17:56:34.785 183079 DEBUG nova.compute.manager [req-193eea0c-37d1-4281-9317-927288bbf8ca req-0d281b1c-de11-4c75-9613-9d0a52f265ef a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Received event network-vif-plugged-09c606a4-b433-44a9-9c14-690255317cf3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:56:34 compute-0 nova_compute[183075]: 2026-01-22 17:56:34.785 183079 DEBUG oslo_concurrency.lockutils [req-193eea0c-37d1-4281-9317-927288bbf8ca req-0d281b1c-de11-4c75-9613-9d0a52f265ef a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:56:34 compute-0 nova_compute[183075]: 2026-01-22 17:56:34.785 183079 DEBUG oslo_concurrency.lockutils [req-193eea0c-37d1-4281-9317-927288bbf8ca req-0d281b1c-de11-4c75-9613-9d0a52f265ef a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:56:34 compute-0 nova_compute[183075]: 2026-01-22 17:56:34.786 183079 DEBUG oslo_concurrency.lockutils [req-193eea0c-37d1-4281-9317-927288bbf8ca req-0d281b1c-de11-4c75-9613-9d0a52f265ef a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:56:34 compute-0 nova_compute[183075]: 2026-01-22 17:56:34.786 183079 DEBUG nova.compute.manager [req-193eea0c-37d1-4281-9317-927288bbf8ca req-0d281b1c-de11-4c75-9613-9d0a52f265ef a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] No waiting events found dispatching network-vif-plugged-09c606a4-b433-44a9-9c14-690255317cf3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:56:34 compute-0 nova_compute[183075]: 2026-01-22 17:56:34.786 183079 WARNING nova.compute.manager [req-193eea0c-37d1-4281-9317-927288bbf8ca req-0d281b1c-de11-4c75-9613-9d0a52f265ef a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Received unexpected event network-vif-plugged-09c606a4-b433-44a9-9c14-690255317cf3 for instance with vm_state active and task_state None.
Jan 22 17:56:34 compute-0 nova_compute[183075]: 2026-01-22 17:56:34.857 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:37 compute-0 podman[244314]: 2026-01-22 17:56:37.386525867 +0000 UTC m=+0.095876412 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 17:56:39 compute-0 nova_compute[183075]: 2026-01-22 17:56:39.297 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:39 compute-0 nova_compute[183075]: 2026-01-22 17:56:39.356 183079 INFO nova.compute.manager [None req-e361ea92-328e-4de6-a231-96ea20c1d6fd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Get console output
Jan 22 17:56:39 compute-0 nova_compute[183075]: 2026-01-22 17:56:39.360 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:56:39 compute-0 nova_compute[183075]: 2026-01-22 17:56:39.859 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:41.978 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:56:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:41.979 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:56:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:56:41.980 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:56:42 compute-0 nova_compute[183075]: 2026-01-22 17:56:42.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:56:43 compute-0 nova_compute[183075]: 2026-01-22 17:56:43.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:56:44 compute-0 nova_compute[183075]: 2026-01-22 17:56:44.299 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:44 compute-0 nova_compute[183075]: 2026-01-22 17:56:44.481 183079 INFO nova.compute.manager [None req-02b50e30-804e-42c4-a166-a2376bbd4521 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Get console output
Jan 22 17:56:44 compute-0 nova_compute[183075]: 2026-01-22 17:56:44.485 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:56:44 compute-0 nova_compute[183075]: 2026-01-22 17:56:44.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:56:44 compute-0 nova_compute[183075]: 2026-01-22 17:56:44.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:56:44 compute-0 nova_compute[183075]: 2026-01-22 17:56:44.861 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:45 compute-0 ovn_controller[95372]: 2026-01-22T17:56:45Z|00156|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6d:7a:4c 10.100.0.9
Jan 22 17:56:45 compute-0 ovn_controller[95372]: 2026-01-22T17:56:45Z|00157|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6d:7a:4c 10.100.0.9
Jan 22 17:56:48 compute-0 podman[244352]: 2026-01-22 17:56:48.339416315 +0000 UTC m=+0.050925306 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 17:56:48 compute-0 podman[244353]: 2026-01-22 17:56:48.355471006 +0000 UTC m=+0.063383281 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=openstack_network_exporter, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, release=1755695350, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 22 17:56:48 compute-0 podman[244351]: 2026-01-22 17:56:48.404684005 +0000 UTC m=+0.115958400 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:56:49 compute-0 nova_compute[183075]: 2026-01-22 17:56:49.301 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:49 compute-0 nova_compute[183075]: 2026-01-22 17:56:49.626 183079 INFO nova.compute.manager [None req-0215cbde-5725-4d1e-b26c-077ced4fdda6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Get console output
Jan 22 17:56:49 compute-0 nova_compute[183075]: 2026-01-22 17:56:49.630 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:56:49 compute-0 nova_compute[183075]: 2026-01-22 17:56:49.863 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:50 compute-0 nova_compute[183075]: 2026-01-22 17:56:50.782 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:56:53 compute-0 podman[244417]: 2026-01-22 17:56:53.36249417 +0000 UTC m=+0.069160216 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:56:54 compute-0 nova_compute[183075]: 2026-01-22 17:56:54.302 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:54 compute-0 nova_compute[183075]: 2026-01-22 17:56:54.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:56:54 compute-0 nova_compute[183075]: 2026-01-22 17:56:54.864 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:55 compute-0 nova_compute[183075]: 2026-01-22 17:56:55.101 183079 INFO nova.compute.manager [None req-1abb7ec6-48fc-47f4-af62-dbb673142d44 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Get console output
Jan 22 17:56:55 compute-0 nova_compute[183075]: 2026-01-22 17:56:55.111 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:56:55 compute-0 nova_compute[183075]: 2026-01-22 17:56:55.112 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:56:55 compute-0 nova_compute[183075]: 2026-01-22 17:56:55.112 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:56:55 compute-0 nova_compute[183075]: 2026-01-22 17:56:55.112 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:56:55 compute-0 nova_compute[183075]: 2026-01-22 17:56:55.109 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:56:55 compute-0 nova_compute[183075]: 2026-01-22 17:56:55.190 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:56:55 compute-0 nova_compute[183075]: 2026-01-22 17:56:55.273 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:56:55 compute-0 nova_compute[183075]: 2026-01-22 17:56:55.274 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:56:55 compute-0 nova_compute[183075]: 2026-01-22 17:56:55.346 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:56:55 compute-0 nova_compute[183075]: 2026-01-22 17:56:55.488 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:56:55 compute-0 nova_compute[183075]: 2026-01-22 17:56:55.489 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5524MB free_disk=73.32399368286133GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:56:55 compute-0 nova_compute[183075]: 2026-01-22 17:56:55.490 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:56:55 compute-0 nova_compute[183075]: 2026-01-22 17:56:55.490 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:56:55 compute-0 nova_compute[183075]: 2026-01-22 17:56:55.562 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:56:55 compute-0 nova_compute[183075]: 2026-01-22 17:56:55.563 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:56:55 compute-0 nova_compute[183075]: 2026-01-22 17:56:55.563 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:56:55 compute-0 nova_compute[183075]: 2026-01-22 17:56:55.595 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:56:55 compute-0 nova_compute[183075]: 2026-01-22 17:56:55.616 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:56:55 compute-0 nova_compute[183075]: 2026-01-22 17:56:55.640 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:56:55 compute-0 nova_compute[183075]: 2026-01-22 17:56:55.641 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:56:56 compute-0 nova_compute[183075]: 2026-01-22 17:56:56.642 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:56:56 compute-0 nova_compute[183075]: 2026-01-22 17:56:56.643 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:56:56 compute-0 nova_compute[183075]: 2026-01-22 17:56:56.703 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 17:56:56 compute-0 nova_compute[183075]: 2026-01-22 17:56:56.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:56:56 compute-0 nova_compute[183075]: 2026-01-22 17:56:56.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:56:56 compute-0 nova_compute[183075]: 2026-01-22 17:56:56.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:56:59 compute-0 nova_compute[183075]: 2026-01-22 17:56:59.305 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:59 compute-0 nova_compute[183075]: 2026-01-22 17:56:59.873 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:57:00 compute-0 ovn_controller[95372]: 2026-01-22T17:57:00Z|00851|memory_trim|INFO|Detected inactivity (last active 30012 ms ago): trimming memory
Jan 22 17:57:00 compute-0 nova_compute[183075]: 2026-01-22 17:57:00.462 183079 INFO nova.compute.manager [None req-306f9447-3274-4ef9-bde4-25c971797560 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Get console output
Jan 22 17:57:00 compute-0 nova_compute[183075]: 2026-01-22 17:57:00.466 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:57:04 compute-0 nova_compute[183075]: 2026-01-22 17:57:04.308 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:57:04 compute-0 podman[244447]: 2026-01-22 17:57:04.382673063 +0000 UTC m=+0.091458573 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 17:57:04 compute-0 nova_compute[183075]: 2026-01-22 17:57:04.920 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:57:05 compute-0 nova_compute[183075]: 2026-01-22 17:57:05.621 183079 INFO nova.compute.manager [None req-7f1b1d7d-b8c1-403b-aa4c-aee3d390b411 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Get console output
Jan 22 17:57:05 compute-0 nova_compute[183075]: 2026-01-22 17:57:05.626 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:57:08 compute-0 podman[244471]: 2026-01-22 17:57:08.364734479 +0000 UTC m=+0.076036070 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 17:57:09 compute-0 nova_compute[183075]: 2026-01-22 17:57:09.310 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:57:09 compute-0 nova_compute[183075]: 2026-01-22 17:57:09.922 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:57:10 compute-0 nova_compute[183075]: 2026-01-22 17:57:10.746 183079 INFO nova.compute.manager [None req-d8c27057-4c24-44fd-9779-eddb45359a95 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Get console output
Jan 22 17:57:10 compute-0 nova_compute[183075]: 2026-01-22 17:57:10.751 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:57:14 compute-0 nova_compute[183075]: 2026-01-22 17:57:14.312 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:57:14 compute-0 nova_compute[183075]: 2026-01-22 17:57:14.924 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:57:15 compute-0 nova_compute[183075]: 2026-01-22 17:57:15.863 183079 INFO nova.compute.manager [None req-148df399-3056-41d5-853e-6832d0015f08 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Get console output
Jan 22 17:57:15 compute-0 nova_compute[183075]: 2026-01-22 17:57:15.867 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:57:19 compute-0 nova_compute[183075]: 2026-01-22 17:57:19.313 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:57:19 compute-0 podman[244496]: 2026-01-22 17:57:19.349479453 +0000 UTC m=+0.049900970 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 22 17:57:19 compute-0 podman[244497]: 2026-01-22 17:57:19.364732122 +0000 UTC m=+0.059781965 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 22 17:57:19 compute-0 podman[244495]: 2026-01-22 17:57:19.378519842 +0000 UTC m=+0.083490591 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:57:19 compute-0 nova_compute[183075]: 2026-01-22 17:57:19.927 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:57:20 compute-0 nova_compute[183075]: 2026-01-22 17:57:20.976 183079 INFO nova.compute.manager [None req-99c915f0-ada9-4fbc-b141-8d63b1a6c81e 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Get console output
Jan 22 17:57:20 compute-0 nova_compute[183075]: 2026-01-22 17:57:20.981 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:57:24 compute-0 nova_compute[183075]: 2026-01-22 17:57:24.316 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:57:24 compute-0 podman[244560]: 2026-01-22 17:57:24.342002489 +0000 UTC m=+0.054922174 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute)
Jan 22 17:57:24 compute-0 nova_compute[183075]: 2026-01-22 17:57:24.930 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:57:26 compute-0 nova_compute[183075]: 2026-01-22 17:57:26.099 183079 INFO nova.compute.manager [None req-3db6602f-db69-41d0-acc8-3e5741805004 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Get console output
Jan 22 17:57:26 compute-0 nova_compute[183075]: 2026-01-22 17:57:26.103 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:57:29 compute-0 nova_compute[183075]: 2026-01-22 17:57:29.317 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:57:29 compute-0 nova_compute[183075]: 2026-01-22 17:57:29.932 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:57:31 compute-0 nova_compute[183075]: 2026-01-22 17:57:31.229 183079 INFO nova.compute.manager [None req-fd17a6dd-ac78-468a-bf8a-ab4ca155503d 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Get console output
Jan 22 17:57:31 compute-0 nova_compute[183075]: 2026-01-22 17:57:31.233 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:57:34 compute-0 nova_compute[183075]: 2026-01-22 17:57:34.319 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:57:34 compute-0 nova_compute[183075]: 2026-01-22 17:57:34.935 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:57:35 compute-0 podman[244580]: 2026-01-22 17:57:35.339067372 +0000 UTC m=+0.047855644 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:57:36 compute-0 nova_compute[183075]: 2026-01-22 17:57:36.507 183079 INFO nova.compute.manager [None req-5ee3bae5-3d37-4f6b-87fa-3f5db9be5ec4 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Get console output
Jan 22 17:57:36 compute-0 nova_compute[183075]: 2026-01-22 17:57:36.514 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:57:39 compute-0 nova_compute[183075]: 2026-01-22 17:57:39.321 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:57:39 compute-0 podman[244604]: 2026-01-22 17:57:39.3588602 +0000 UTC m=+0.062928829 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:57:39 compute-0 nova_compute[183075]: 2026-01-22 17:57:39.937 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:57:41 compute-0 nova_compute[183075]: 2026-01-22 17:57:41.667 183079 INFO nova.compute.manager [None req-deadc048-a435-4dc5-84fb-79e4cfee8af2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Get console output
Jan 22 17:57:41 compute-0 nova_compute[183075]: 2026-01-22 17:57:41.672 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:57:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:57:41.980 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:57:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:57:41.980 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:57:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:57:41.981 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:57:44 compute-0 nova_compute[183075]: 2026-01-22 17:57:44.322 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:57:44 compute-0 nova_compute[183075]: 2026-01-22 17:57:44.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:57:44 compute-0 nova_compute[183075]: 2026-01-22 17:57:44.789 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:57:44 compute-0 nova_compute[183075]: 2026-01-22 17:57:44.789 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:57:44 compute-0 nova_compute[183075]: 2026-01-22 17:57:44.939 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:57:45 compute-0 nova_compute[183075]: 2026-01-22 17:57:45.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:57:46 compute-0 nova_compute[183075]: 2026-01-22 17:57:46.811 183079 INFO nova.compute.manager [None req-c61c7ce0-afc9-447a-9028-48a89ad5a4c7 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Get console output
Jan 22 17:57:46 compute-0 nova_compute[183075]: 2026-01-22 17:57:46.817 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:57:49 compute-0 nova_compute[183075]: 2026-01-22 17:57:49.324 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:57:49 compute-0 nova_compute[183075]: 2026-01-22 17:57:49.942 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:57:50 compute-0 podman[244644]: 2026-01-22 17:57:50.347030433 +0000 UTC m=+0.053334721 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:57:50 compute-0 podman[244645]: 2026-01-22 17:57:50.362590711 +0000 UTC m=+0.062819266 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=openstack_network_exporter, vcs-type=git, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, build-date=2025-08-20T13:12:41, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350)
Jan 22 17:57:50 compute-0 podman[244643]: 2026-01-22 17:57:50.416437955 +0000 UTC m=+0.125472436 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, 
config_id=ovn_controller, managed_by=edpm_ansible)
Jan 22 17:57:50 compute-0 sshd-session[244641]: Received disconnect from 45.148.10.157 port 22128:11:  [preauth]
Jan 22 17:57:50 compute-0 sshd-session[244641]: Disconnected from authenticating user root 45.148.10.157 port 22128 [preauth]
Jan 22 17:57:51 compute-0 nova_compute[183075]: 2026-01-22 17:57:51.784 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:57:51 compute-0 nova_compute[183075]: 2026-01-22 17:57:51.960 183079 INFO nova.compute.manager [None req-1ffe33be-1659-4e8d-aef5-d9c2f44fa879 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Get console output
Jan 22 17:57:51 compute-0 nova_compute[183075]: 2026-01-22 17:57:51.964 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:57:54 compute-0 nova_compute[183075]: 2026-01-22 17:57:54.325 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:57:54 compute-0 nova_compute[183075]: 2026-01-22 17:57:54.943 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:57:55 compute-0 podman[244704]: 2026-01-22 17:57:55.344787131 +0000 UTC m=+0.056410094 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.466 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d', 'name': 'tempest-server-test-1286187819', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000004d', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'hostId': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.466 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.469 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d / tap09c606a4-b4 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.469 12 DEBUG ceilometer.compute.pollsters [-] 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '90a3539b-37f2-4593-8c1d-0d70a21f8f24', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004d-4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-tap09c606a4-b4', 'timestamp': '2026-01-22T17:57:55.466598', 'resource_metadata': {'display_name': 'tempest-server-test-1286187819', 'name': 'tap09c606a4-b4', 'instance_id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6d:7a:4c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09c606a4-b4'}, 'message_id': 'e105fdfc-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 7039.227039581, 'message_signature': '487d4a8b90773aaf94ce30f177bdfd67195f78d1bea856e5589bfb64baaf8d69'}]}, 'timestamp': '2026-01-22 17:57:55.469842', '_unique_id': '43f63d8ce56545cb8c70c295051902f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.471 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.483 12 DEBUG ceilometer.compute.pollsters [-] 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/disk.device.write.latency volume: 2274523787 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cf95766e-4a62-44cb-833c-ee7939754452', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2274523787, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-vda', 'timestamp': '2026-01-22T17:57:55.471956', 'resource_metadata': {'display_name': 'tempest-server-test-1286187819', 'name': 'instance-0000004d', 'instance_id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e10832f2-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 7039.232373755, 'message_signature': '1f1730a1b0cf5e79ffc49fc29224cab35160502d8eb9496d95cc208ec478ccd4'}]}, 'timestamp': '2026-01-22 17:57:55.484299', '_unique_id': '4908c22a28f14dbba6e485879f9099bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.485 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.486 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-1286187819>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1286187819>]
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.486 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.501 12 DEBUG ceilometer.compute.pollsters [-] 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/cpu volume: 10820000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cfa180b7-c178-4e12-ace7-909514ece146', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10820000000, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d', 'timestamp': '2026-01-22T17:57:55.486300', 'resource_metadata': {'display_name': 'tempest-server-test-1286187819', 'name': 'instance-0000004d', 'instance_id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'e10ae600-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 7039.261696896, 'message_signature': '78d4e93a2ba89410240c8eacee172bc75402fca4e9420328533c65f352fba7bc'}]}, 'timestamp': '2026-01-22 17:57:55.502051', '_unique_id': 'ed2000bb701f40aeace7fee52a5cc891'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.503 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 DEBUG ceilometer.compute.pollsters [-] 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/network.outgoing.packets volume: 44 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e7672ac-c7cd-42b1-9cf9-c98544583197', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 44, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004d-4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-tap09c606a4-b4', 'timestamp': '2026-01-22T17:57:55.503997', 'resource_metadata': {'display_name': 'tempest-server-test-1286187819', 'name': 'tap09c606a4-b4', 'instance_id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6d:7a:4c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09c606a4-b4'}, 'message_id': 'e10b403c-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 7039.227039581, 'message_signature': '5e21bc232292e438748959ea5fb785060b349f658974e798ea0483b01163e143'}]}, 'timestamp': '2026-01-22 17:57:55.504248', '_unique_id': 'e87ca829d7604f5e89b67002e6d79881'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.504 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.505 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.505 12 DEBUG ceilometer.compute.pollsters [-] 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3befb95d-c732-4df2-8a7f-fe619f5b68e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004d-4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-tap09c606a4-b4', 'timestamp': '2026-01-22T17:57:55.505339', 'resource_metadata': {'display_name': 'tempest-server-test-1286187819', 'name': 'tap09c606a4-b4', 'instance_id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6d:7a:4c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09c606a4-b4'}, 'message_id': 'e10b73fe-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 7039.227039581, 'message_signature': '2a5e7afb20b532f528c4773198a3ad2e9b2addcdfe97d5f80baeec02318937a9'}]}, 'timestamp': '2026-01-22 17:57:55.505568', '_unique_id': '4131f75eea3e48a895c2ac576600c7c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.506 12 DEBUG ceilometer.compute.pollsters [-] 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/disk.device.write.bytes volume: 26091520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f5e034e-831b-42ed-b078-99c024fcab80', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 26091520, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-vda', 'timestamp': '2026-01-22T17:57:55.506823', 'resource_metadata': {'display_name': 'tempest-server-test-1286187819', 'name': 'instance-0000004d', 'instance_id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e10bae14-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 7039.232373755, 'message_signature': '0d43f1cd10538bddb7fd2dbf0cee23f9482182eee8700ef22a644ad3e1ebeb46'}]}, 'timestamp': '2026-01-22 17:57:55.507049', '_unique_id': '870f869bd5e44d13803f97a817b72bd3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.507 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 DEBUG ceilometer.compute.pollsters [-] 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c3e3c0e9-2b90-41b4-b103-2574805683c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004d-4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-tap09c606a4-b4', 'timestamp': '2026-01-22T17:57:55.508141', 'resource_metadata': {'display_name': 'tempest-server-test-1286187819', 'name': 'tap09c606a4-b4', 'instance_id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6d:7a:4c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09c606a4-b4'}, 'message_id': 'e10be17c-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 7039.227039581, 'message_signature': 'd6f4fa5dcf7b0a34d1360274cbd0afd14b0b576f78a5bd18586ad23959ae11d5'}]}, 'timestamp': '2026-01-22 17:57:55.508371', '_unique_id': '42671cab033b4463a346ace392f40431'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.508 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.509 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.509 12 DEBUG ceilometer.compute.pollsters [-] 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/network.outgoing.bytes volume: 3532 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '91b6362c-b897-4641-b712-344cff409311', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3532, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004d-4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-tap09c606a4-b4', 'timestamp': '2026-01-22T17:57:55.509447', 'resource_metadata': {'display_name': 'tempest-server-test-1286187819', 'name': 'tap09c606a4-b4', 'instance_id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6d:7a:4c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09c606a4-b4'}, 'message_id': 'e10c1462-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 7039.227039581, 'message_signature': 'e53c2c4e613e13f74edff22eeb1379044dc425f6d248f6c0b60e76437b683043'}]}, 'timestamp': '2026-01-22 17:57:55.509697', '_unique_id': 'cd16bea7a990409a9f48687468520d4b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.510 12 DEBUG ceilometer.compute.pollsters [-] 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fadf4df9-c509-40ef-82cd-6ce1f0a4f38e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004d-4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-tap09c606a4-b4', 'timestamp': '2026-01-22T17:57:55.510781', 'resource_metadata': {'display_name': 'tempest-server-test-1286187819', 'name': 'tap09c606a4-b4', 'instance_id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6d:7a:4c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09c606a4-b4'}, 'message_id': 'e10c4888-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 7039.227039581, 'message_signature': '6d0a8e7bba42876f1d652e3d831e3a52c5e8f1c764ea687948cb075d5a9be447'}]}, 'timestamp': '2026-01-22 17:57:55.511006', '_unique_id': '58d38f6c8441458ba2ecac50b79f7132'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.511 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.512 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.517 12 DEBUG ceilometer.compute.pollsters [-] 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/disk.device.usage volume: 28704768 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '02594e0f-ca79-44bd-92c4-3ca5fcd12ddc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 28704768, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-vda', 'timestamp': '2026-01-22T17:57:55.512336', 'resource_metadata': {'display_name': 'tempest-server-test-1286187819', 'name': 'instance-0000004d', 'instance_id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e10d66f0-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 7039.272767245, 'message_signature': 'c12290c24e81ce214ed05349f3fbac79fd5ae79f8b5a8f120909bd8ba17a9dec'}]}, 'timestamp': '2026-01-22 17:57:55.518416', '_unique_id': '5858b04651904cd9933550e5a44b06f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.519 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.520 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.520 12 DEBUG ceilometer.compute.pollsters [-] 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/disk.device.write.requests volume: 252 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95704268-a490-4503-b6d4-569e5ceaad47', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 252, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-vda', 'timestamp': '2026-01-22T17:57:55.520280', 'resource_metadata': {'display_name': 'tempest-server-test-1286187819', 'name': 'instance-0000004d', 'instance_id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e10dbcb8-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 7039.232373755, 'message_signature': '3fc3b57c7f7054342b5407de449cb0380ae1dc1eccab48613150fd3527e77aab'}]}, 'timestamp': '2026-01-22 17:57:55.520540', '_unique_id': '514ee49097f94bce9506b5cae11610e9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.521 12 DEBUG ceilometer.compute.pollsters [-] 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/disk.device.read.latency volume: 191228816 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '92cd981c-b095-428c-b5f6-03bc46e44ad3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 191228816, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-vda', 'timestamp': '2026-01-22T17:57:55.521676', 'resource_metadata': {'display_name': 'tempest-server-test-1286187819', 'name': 'instance-0000004d', 'instance_id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e10df21e-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 7039.232373755, 'message_signature': '6cfeae17f5ecfdb1b3250c5aab7dcdd2e869f9c9f281813ad55888a4bb316fd7'}]}, 'timestamp': '2026-01-22 17:57:55.521903', '_unique_id': 'e2aa85eecd274e0d91fdc838d48064b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.522 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.523 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.523 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1286187819>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1286187819>]
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.523 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.523 12 DEBUG ceilometer.compute.pollsters [-] 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/network.incoming.bytes volume: 1562 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'afa9c930-fbe1-4de5-b9ff-c9391f7fae92', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1562, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004d-4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-tap09c606a4-b4', 'timestamp': '2026-01-22T17:57:55.523395', 'resource_metadata': {'display_name': 'tempest-server-test-1286187819', 'name': 'tap09c606a4-b4', 'instance_id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6d:7a:4c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09c606a4-b4'}, 'message_id': 'e10e35da-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 7039.227039581, 'message_signature': 'f222ae7918b8000f1263809a30f8f9790adbd256fbbef6ecf8c0998d69d7dc87'}]}, 'timestamp': '2026-01-22 17:57:55.523682', '_unique_id': 'ab08f0b40aa24f1191e0c759afb0defc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.524 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1286187819>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1286187819>]
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.525 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.525 12 DEBUG ceilometer.compute.pollsters [-] 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/network.incoming.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '161f4b91-539a-4ccc-95ce-a4deaa39d47d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004d-4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-tap09c606a4-b4', 'timestamp': '2026-01-22T17:57:55.525196', 'resource_metadata': {'display_name': 'tempest-server-test-1286187819', 'name': 'tap09c606a4-b4', 'instance_id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6d:7a:4c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09c606a4-b4'}, 'message_id': 'e10e7d7e-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 7039.227039581, 'message_signature': 'a937445aab211d36fe98df6d4ae12e839695432726f633a21a540790f502ec0f'}]}, 'timestamp': '2026-01-22 17:57:55.525515', '_unique_id': 'a64670f23d4544f1b38cae29e31b1ddf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.526 12 DEBUG ceilometer.compute.pollsters [-] 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/disk.device.read.requests volume: 978 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5998fdf3-ea28-4fad-8568-3aab6f035578', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 978, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-vda', 'timestamp': '2026-01-22T17:57:55.526693', 'resource_metadata': {'display_name': 'tempest-server-test-1286187819', 'name': 'instance-0000004d', 'instance_id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e10eb622-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 7039.232373755, 'message_signature': '6d755a773467df523c06b9dec494cda7e5d9afa196189f4936a3ef8034d7e795'}]}, 'timestamp': '2026-01-22 17:57:55.526912', '_unique_id': 'ebfb4390e5ab4884a30fb32741df809c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.527 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 DEBUG ceilometer.compute.pollsters [-] 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/disk.device.allocation volume: 29106176 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c9a06ba7-5c59-4e78-ba26-fbd664058cf4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29106176, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-vda', 'timestamp': '2026-01-22T17:57:55.527989', 'resource_metadata': {'display_name': 'tempest-server-test-1286187819', 'name': 'instance-0000004d', 'instance_id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e10ee8c2-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 7039.272767245, 'message_signature': '13ccb10cda00290c69c500faf702a2136ecef863b11749a754eff1513e78b72d'}]}, 'timestamp': '2026-01-22 17:57:55.528207', '_unique_id': 'ed14fd003881452283997a79250fee53'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.528 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 DEBUG ceilometer.compute.pollsters [-] 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd94b1fba-00f8-41ef-865a-78078d5c2c03', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004d-4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-tap09c606a4-b4', 'timestamp': '2026-01-22T17:57:55.529267', 'resource_metadata': {'display_name': 'tempest-server-test-1286187819', 'name': 'tap09c606a4-b4', 'instance_id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6d:7a:4c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09c606a4-b4'}, 'message_id': 'e10f1aea-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 7039.227039581, 'message_signature': '23d33d6031e5d5ee64a77ab176d3f743a3a4efe53c3ca18baa791b73a8e07bb5'}]}, 'timestamp': '2026-01-22 17:57:55.529505', '_unique_id': 'a93c375f8a4d44808bc59cb174a55b81'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.529 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.530 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.530 12 DEBUG ceilometer.compute.pollsters [-] 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/disk.device.read.bytes volume: 28105728 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '68fd677a-d26e-4b0f-93a6-01e5123840bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 28105728, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-vda', 'timestamp': '2026-01-22T17:57:55.530658', 'resource_metadata': {'display_name': 'tempest-server-test-1286187819', 'name': 'instance-0000004d', 'instance_id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e10f5104-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 7039.232373755, 'message_signature': 'ee5d615286deaf341de714938c94d5de3016d1f0a5d61d4c4bdf335a0f922508'}]}, 'timestamp': '2026-01-22 17:57:55.530878', '_unique_id': 'b5c9a6254ffa4ed296ea4c26b1084980'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.531 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 DEBUG ceilometer.compute.pollsters [-] 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/memory.usage volume: 42.71484375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ace2ce2b-1836-4bee-87e4-545c30bcda78', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.71484375, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d', 'timestamp': '2026-01-22T17:57:55.531990', 'resource_metadata': {'display_name': 'tempest-server-test-1286187819', 'name': 'instance-0000004d', 'instance_id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'e10f84ee-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 7039.261696896, 'message_signature': '07489e449a0dbfe46d1512a290723a0adf9fdb88a7f0188714688a21cb1b18ee'}]}, 'timestamp': '2026-01-22 17:57:55.532206', '_unique_id': '8b6ffc2fba1f41aea92ade8ef43d8ffe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.532 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.533 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.533 12 DEBUG ceilometer.compute.pollsters [-] 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd3644c80-29cb-47cd-924d-65be1d13d78d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-vda', 'timestamp': '2026-01-22T17:57:55.533275', 'resource_metadata': {'display_name': 'tempest-server-test-1286187819', 'name': 'instance-0000004d', 'instance_id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e10fb874-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 7039.272767245, 'message_signature': '564a1a677ef0f8b117eddb7c6aa0a22804991f952e108f2b78d76b098a0e11e5'}]}, 'timestamp': '2026-01-22 17:57:55.533530', '_unique_id': '0b1f2aed9d6c46ffb85d9d33873082c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-1286187819>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1286187819>]
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.534 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 DEBUG ceilometer.compute.pollsters [-] 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '164e02d2-c888-4fa8-be73-361a38414c88', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004d-4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-tap09c606a4-b4', 'timestamp': '2026-01-22T17:57:55.535029', 'resource_metadata': {'display_name': 'tempest-server-test-1286187819', 'name': 'tap09c606a4-b4', 'instance_id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6d:7a:4c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09c606a4-b4'}, 'message_id': 'e10ffdc0-f7bb-11f0-9e69-fa163eaea1db', 'monotonic_time': 7039.227039581, 'message_signature': '07f0aadba80bfaea9b111aa0457bfceab4e855f892c90683f743a128cce3d7d8'}]}, 'timestamp': '2026-01-22 17:57:55.535353', '_unique_id': 'd3300d7b840c4f0797952e2f66c9eb89'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:57:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:57:55.535 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:57:55 compute-0 nova_compute[183075]: 2026-01-22 17:57:55.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:57:55 compute-0 nova_compute[183075]: 2026-01-22 17:57:55.787 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:57:55 compute-0 nova_compute[183075]: 2026-01-22 17:57:55.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:57:56 compute-0 nova_compute[183075]: 2026-01-22 17:57:56.433 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-4e7d63e0-335d-4cc1-9701-ba3728ed3e2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:57:56 compute-0 nova_compute[183075]: 2026-01-22 17:57:56.434 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-4e7d63e0-335d-4cc1-9701-ba3728ed3e2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:57:56 compute-0 nova_compute[183075]: 2026-01-22 17:57:56.434 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:57:56 compute-0 nova_compute[183075]: 2026-01-22 17:57:56.434 183079 DEBUG nova.objects.instance [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:57:57 compute-0 nova_compute[183075]: 2026-01-22 17:57:57.087 183079 INFO nova.compute.manager [None req-32cb45be-5ca4-4999-9c21-e6d55a14a7d6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Get console output
Jan 22 17:57:57 compute-0 nova_compute[183075]: 2026-01-22 17:57:57.094 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:57:58 compute-0 nova_compute[183075]: 2026-01-22 17:57:58.018 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Updating instance_info_cache with network_info: [{"id": "09c606a4-b433-44a9-9c14-690255317cf3", "address": "fa:16:3e:6d:7a:4c", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09c606a4-b4", "ovs_interfaceid": "09c606a4-b433-44a9-9c14-690255317cf3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:57:58 compute-0 nova_compute[183075]: 2026-01-22 17:57:58.041 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-4e7d63e0-335d-4cc1-9701-ba3728ed3e2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:57:58 compute-0 nova_compute[183075]: 2026-01-22 17:57:58.042 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:57:58 compute-0 nova_compute[183075]: 2026-01-22 17:57:58.043 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:57:58 compute-0 nova_compute[183075]: 2026-01-22 17:57:58.043 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:57:58 compute-0 nova_compute[183075]: 2026-01-22 17:57:58.043 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:57:58 compute-0 nova_compute[183075]: 2026-01-22 17:57:58.069 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:57:58 compute-0 nova_compute[183075]: 2026-01-22 17:57:58.069 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:57:58 compute-0 nova_compute[183075]: 2026-01-22 17:57:58.070 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:57:58 compute-0 nova_compute[183075]: 2026-01-22 17:57:58.070 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:57:58 compute-0 nova_compute[183075]: 2026-01-22 17:57:58.153 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:57:58 compute-0 nova_compute[183075]: 2026-01-22 17:57:58.220 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:57:58 compute-0 nova_compute[183075]: 2026-01-22 17:57:58.222 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:57:58 compute-0 nova_compute[183075]: 2026-01-22 17:57:58.288 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e7d63e0-335d-4cc1-9701-ba3728ed3e2d/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:57:58 compute-0 nova_compute[183075]: 2026-01-22 17:57:58.454 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:57:58 compute-0 nova_compute[183075]: 2026-01-22 17:57:58.455 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5557MB free_disk=73.3239631652832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:57:58 compute-0 nova_compute[183075]: 2026-01-22 17:57:58.456 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:57:58 compute-0 nova_compute[183075]: 2026-01-22 17:57:58.456 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:57:58 compute-0 nova_compute[183075]: 2026-01-22 17:57:58.592 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:57:58 compute-0 nova_compute[183075]: 2026-01-22 17:57:58.593 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:57:58 compute-0 nova_compute[183075]: 2026-01-22 17:57:58.593 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:57:58 compute-0 nova_compute[183075]: 2026-01-22 17:57:58.731 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:57:58 compute-0 nova_compute[183075]: 2026-01-22 17:57:58.744 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:57:58 compute-0 nova_compute[183075]: 2026-01-22 17:57:58.746 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:57:58 compute-0 nova_compute[183075]: 2026-01-22 17:57:58.746 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.290s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:57:59 compute-0 nova_compute[183075]: 2026-01-22 17:57:59.328 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:57:59 compute-0 nova_compute[183075]: 2026-01-22 17:57:59.491 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:57:59 compute-0 nova_compute[183075]: 2026-01-22 17:57:59.946 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:02 compute-0 nova_compute[183075]: 2026-01-22 17:58:02.329 183079 INFO nova.compute.manager [None req-245d0292-1168-4e00-8cee-4cfbc223e380 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Get console output
Jan 22 17:58:02 compute-0 nova_compute[183075]: 2026-01-22 17:58:02.335 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:58:04 compute-0 nova_compute[183075]: 2026-01-22 17:58:04.329 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:04 compute-0 nova_compute[183075]: 2026-01-22 17:58:04.987 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:05.047 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=59, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=58) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:58:05 compute-0 nova_compute[183075]: 2026-01-22 17:58:05.047 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:05 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:05.048 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:58:05 compute-0 nova_compute[183075]: 2026-01-22 17:58:05.889 183079 DEBUG nova.compute.manager [req-3ce74a5e-6201-4a5d-bdc9-718dc45c7b95 req-9982c8ef-21d1-47c9-bbb8-6b92038810fe a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Received event network-changed-09c606a4-b433-44a9-9c14-690255317cf3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:58:05 compute-0 nova_compute[183075]: 2026-01-22 17:58:05.890 183079 DEBUG nova.compute.manager [req-3ce74a5e-6201-4a5d-bdc9-718dc45c7b95 req-9982c8ef-21d1-47c9-bbb8-6b92038810fe a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Refreshing instance network info cache due to event network-changed-09c606a4-b433-44a9-9c14-690255317cf3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:58:05 compute-0 nova_compute[183075]: 2026-01-22 17:58:05.890 183079 DEBUG oslo_concurrency.lockutils [req-3ce74a5e-6201-4a5d-bdc9-718dc45c7b95 req-9982c8ef-21d1-47c9-bbb8-6b92038810fe a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-4e7d63e0-335d-4cc1-9701-ba3728ed3e2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:58:05 compute-0 nova_compute[183075]: 2026-01-22 17:58:05.890 183079 DEBUG oslo_concurrency.lockutils [req-3ce74a5e-6201-4a5d-bdc9-718dc45c7b95 req-9982c8ef-21d1-47c9-bbb8-6b92038810fe a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-4e7d63e0-335d-4cc1-9701-ba3728ed3e2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:58:05 compute-0 nova_compute[183075]: 2026-01-22 17:58:05.890 183079 DEBUG nova.network.neutron [req-3ce74a5e-6201-4a5d-bdc9-718dc45c7b95 req-9982c8ef-21d1-47c9-bbb8-6b92038810fe a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Refreshing network info cache for port 09c606a4-b433-44a9-9c14-690255317cf3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:58:06 compute-0 podman[244731]: 2026-01-22 17:58:06.34870912 +0000 UTC m=+0.053928076 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.198 183079 DEBUG nova.network.neutron [req-3ce74a5e-6201-4a5d-bdc9-718dc45c7b95 req-9982c8ef-21d1-47c9-bbb8-6b92038810fe a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Updated VIF entry in instance network info cache for port 09c606a4-b433-44a9-9c14-690255317cf3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.199 183079 DEBUG nova.network.neutron [req-3ce74a5e-6201-4a5d-bdc9-718dc45c7b95 req-9982c8ef-21d1-47c9-bbb8-6b92038810fe a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Updating instance_info_cache with network_info: [{"id": "09c606a4-b433-44a9-9c14-690255317cf3", "address": "fa:16:3e:6d:7a:4c", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09c606a4-b4", "ovs_interfaceid": "09c606a4-b433-44a9-9c14-690255317cf3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.233 183079 DEBUG oslo_concurrency.lockutils [req-3ce74a5e-6201-4a5d-bdc9-718dc45c7b95 req-9982c8ef-21d1-47c9-bbb8-6b92038810fe a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-4e7d63e0-335d-4cc1-9701-ba3728ed3e2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.299 183079 DEBUG oslo_concurrency.lockutils [None req-1a28054a-5c62-4f29-ab5b-9a45680a6e2f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "4e7d63e0-335d-4cc1-9701-ba3728ed3e2d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.300 183079 DEBUG oslo_concurrency.lockutils [None req-1a28054a-5c62-4f29-ab5b-9a45680a6e2f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "4e7d63e0-335d-4cc1-9701-ba3728ed3e2d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.300 183079 DEBUG oslo_concurrency.lockutils [None req-1a28054a-5c62-4f29-ab5b-9a45680a6e2f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.301 183079 DEBUG oslo_concurrency.lockutils [None req-1a28054a-5c62-4f29-ab5b-9a45680a6e2f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.301 183079 DEBUG oslo_concurrency.lockutils [None req-1a28054a-5c62-4f29-ab5b-9a45680a6e2f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.303 183079 INFO nova.compute.manager [None req-1a28054a-5c62-4f29-ab5b-9a45680a6e2f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Terminating instance
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.305 183079 DEBUG nova.compute.manager [None req-1a28054a-5c62-4f29-ab5b-9a45680a6e2f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:58:07 compute-0 kernel: tap09c606a4-b4 (unregistering): left promiscuous mode
Jan 22 17:58:07 compute-0 NetworkManager[55454]: <info>  [1769104687.3279] device (tap09c606a4-b4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:58:07 compute-0 ovn_controller[95372]: 2026-01-22T17:58:07Z|00852|binding|INFO|Releasing lport 09c606a4-b433-44a9-9c14-690255317cf3 from this chassis (sb_readonly=0)
Jan 22 17:58:07 compute-0 ovn_controller[95372]: 2026-01-22T17:58:07Z|00853|binding|INFO|Setting lport 09c606a4-b433-44a9-9c14-690255317cf3 down in Southbound
Jan 22 17:58:07 compute-0 ovn_controller[95372]: 2026-01-22T17:58:07Z|00854|binding|INFO|Removing iface tap09c606a4-b4 ovn-installed in OVS
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.339 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:07 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:07.344 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:7a:4c 10.100.0.9'], port_security=['fa:16:3e:6d:7a:4c 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-test_protocol_number_rule-587073029', 'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '4e7d63e0-335d-4cc1-9701-ba3728ed3e2d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-test_protocol_number_rule-587073029', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ff9e54bc-1b37-40fb-a3e6-dc01342b1d06', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.227'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=09c606a4-b433-44a9-9c14-690255317cf3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:58:07 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:07.346 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 09c606a4-b433-44a9-9c14-690255317cf3 in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 unbound from our chassis
Jan 22 17:58:07 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:07.347 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:58:07 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:07.348 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[071668b8-3925-418a-95e4-094170ac4da6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:58:07 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:07.349 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 namespace which is not needed anymore
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.359 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:07 compute-0 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d0000004d.scope: Deactivated successfully.
Jan 22 17:58:07 compute-0 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d0000004d.scope: Consumed 15.723s CPU time.
Jan 22 17:58:07 compute-0 systemd-machined[154382]: Machine qemu-77-instance-0000004d terminated.
Jan 22 17:58:07 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[244267]: [NOTICE]   (244271) : haproxy version is 2.8.14-c23fe91
Jan 22 17:58:07 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[244267]: [NOTICE]   (244271) : path to executable is /usr/sbin/haproxy
Jan 22 17:58:07 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[244267]: [WARNING]  (244271) : Exiting Master process...
Jan 22 17:58:07 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[244267]: [ALERT]    (244271) : Current worker (244273) exited with code 143 (Terminated)
Jan 22 17:58:07 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[244267]: [WARNING]  (244271) : All workers exited. Exiting... (0)
Jan 22 17:58:07 compute-0 systemd[1]: libpod-e8a0036f265f341458871343ab2f6ca2cba969eeb9084a4d11c8a4cbc05470ad.scope: Deactivated successfully.
Jan 22 17:58:07 compute-0 podman[244782]: 2026-01-22 17:58:07.465426154 +0000 UTC m=+0.041203773 container died e8a0036f265f341458871343ab2f6ca2cba969eeb9084a4d11c8a4cbc05470ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 17:58:07 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e8a0036f265f341458871343ab2f6ca2cba969eeb9084a4d11c8a4cbc05470ad-userdata-shm.mount: Deactivated successfully.
Jan 22 17:58:07 compute-0 systemd[1]: var-lib-containers-storage-overlay-62d027dd0e0f46d9ed60192c6f4101eba46e37650383b96eb40fba91a2b5bff5-merged.mount: Deactivated successfully.
Jan 22 17:58:07 compute-0 podman[244782]: 2026-01-22 17:58:07.495258469 +0000 UTC m=+0.071036078 container cleanup e8a0036f265f341458871343ab2f6ca2cba969eeb9084a4d11c8a4cbc05470ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 17:58:07 compute-0 systemd[1]: libpod-conmon-e8a0036f265f341458871343ab2f6ca2cba969eeb9084a4d11c8a4cbc05470ad.scope: Deactivated successfully.
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.553 183079 INFO nova.virt.libvirt.driver [-] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Instance destroyed successfully.
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.553 183079 DEBUG nova.objects.instance [None req-1a28054a-5c62-4f29-ab5b-9a45680a6e2f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'resources' on Instance uuid 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:58:07 compute-0 podman[244810]: 2026-01-22 17:58:07.556349538 +0000 UTC m=+0.043405082 container remove e8a0036f265f341458871343ab2f6ca2cba969eeb9084a4d11c8a4cbc05470ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:58:07 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:07.561 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[9d306fc1-ab40-491a-98da-cf5ae5abe715]: (4, ('Thu Jan 22 05:58:07 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 (e8a0036f265f341458871343ab2f6ca2cba969eeb9084a4d11c8a4cbc05470ad)\ne8a0036f265f341458871343ab2f6ca2cba969eeb9084a4d11c8a4cbc05470ad\nThu Jan 22 05:58:07 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 (e8a0036f265f341458871343ab2f6ca2cba969eeb9084a4d11c8a4cbc05470ad)\ne8a0036f265f341458871343ab2f6ca2cba969eeb9084a4d11c8a4cbc05470ad\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:58:07 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:07.563 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0896bdde-af46-42f8-8c8b-d0a6a123460e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:58:07 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:07.565 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.567 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:07 compute-0 kernel: tap88ed9213-70: left promiscuous mode
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.577 183079 DEBUG nova.virt.libvirt.vif [None req-1a28054a-5c62-4f29-ab5b-9a45680a6e2f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:56:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1286187819',display_name='tempest-server-test-1286187819',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1286187819',id=77,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:56:32Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-b5y1dca2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:56:32Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=4e7d63e0-335d-4cc1-9701-ba3728ed3e2d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "09c606a4-b433-44a9-9c14-690255317cf3", "address": "fa:16:3e:6d:7a:4c", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09c606a4-b4", "ovs_interfaceid": "09c606a4-b433-44a9-9c14-690255317cf3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.578 183079 DEBUG nova.network.os_vif_util [None req-1a28054a-5c62-4f29-ab5b-9a45680a6e2f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "09c606a4-b433-44a9-9c14-690255317cf3", "address": "fa:16:3e:6d:7a:4c", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09c606a4-b4", "ovs_interfaceid": "09c606a4-b433-44a9-9c14-690255317cf3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.578 183079 DEBUG nova.network.os_vif_util [None req-1a28054a-5c62-4f29-ab5b-9a45680a6e2f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6d:7a:4c,bridge_name='br-int',has_traffic_filtering=True,id=09c606a4-b433-44a9-9c14-690255317cf3,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap09c606a4-b4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.579 183079 DEBUG os_vif [None req-1a28054a-5c62-4f29-ab5b-9a45680a6e2f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6d:7a:4c,bridge_name='br-int',has_traffic_filtering=True,id=09c606a4-b433-44a9-9c14-690255317cf3,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap09c606a4-b4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.581 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.582 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09c606a4-b4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.582 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:07 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:07.583 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4d43fc92-d19f-44aa-bb43-7c48ef9bca78]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.584 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.585 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.589 183079 INFO os_vif [None req-1a28054a-5c62-4f29-ab5b-9a45680a6e2f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6d:7a:4c,bridge_name='br-int',has_traffic_filtering=True,id=09c606a4-b433-44a9-9c14-690255317cf3,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap09c606a4-b4')
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.590 183079 INFO nova.virt.libvirt.driver [None req-1a28054a-5c62-4f29-ab5b-9a45680a6e2f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Deleting instance files /var/lib/nova/instances/4e7d63e0-335d-4cc1-9701-ba3728ed3e2d_del
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.590 183079 INFO nova.virt.libvirt.driver [None req-1a28054a-5c62-4f29-ab5b-9a45680a6e2f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Deletion of /var/lib/nova/instances/4e7d63e0-335d-4cc1-9701-ba3728ed3e2d_del complete
Jan 22 17:58:07 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:07.600 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f70976f8-a9d9-471f-8ab1-e85b6b137fae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:58:07 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:07.601 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[8846c20f-e3b3-4c02-8f8c-e4b49ff969eb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:58:07 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:07.615 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4ebf5e93-c393-480c-ac9d-1f847a7af5e6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 695380, 'reachable_time': 35513, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244845, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:58:07 compute-0 systemd[1]: run-netns-ovnmeta\x2d88ed9213\x2d7d48\x2d4c42\x2dad60\x2dce3fcf104f71.mount: Deactivated successfully.
Jan 22 17:58:07 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:07.618 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:58:07 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:07.619 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[9c2af8a8-d9cf-4b95-b92d-d362c8bf9e7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.655 183079 INFO nova.compute.manager [None req-1a28054a-5c62-4f29-ab5b-9a45680a6e2f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Took 0.35 seconds to destroy the instance on the hypervisor.
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.656 183079 DEBUG oslo.service.loopingcall [None req-1a28054a-5c62-4f29-ab5b-9a45680a6e2f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.656 183079 DEBUG nova.compute.manager [-] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.656 183079 DEBUG nova.network.neutron [-] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.966 183079 DEBUG nova.compute.manager [req-9f05263d-9119-4e63-8c96-985135e32335 req-b70171ff-0ef3-4c20-bdb5-6a041f3c73b1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Received event network-vif-unplugged-09c606a4-b433-44a9-9c14-690255317cf3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.967 183079 DEBUG oslo_concurrency.lockutils [req-9f05263d-9119-4e63-8c96-985135e32335 req-b70171ff-0ef3-4c20-bdb5-6a041f3c73b1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.967 183079 DEBUG oslo_concurrency.lockutils [req-9f05263d-9119-4e63-8c96-985135e32335 req-b70171ff-0ef3-4c20-bdb5-6a041f3c73b1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.967 183079 DEBUG oslo_concurrency.lockutils [req-9f05263d-9119-4e63-8c96-985135e32335 req-b70171ff-0ef3-4c20-bdb5-6a041f3c73b1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.967 183079 DEBUG nova.compute.manager [req-9f05263d-9119-4e63-8c96-985135e32335 req-b70171ff-0ef3-4c20-bdb5-6a041f3c73b1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] No waiting events found dispatching network-vif-unplugged-09c606a4-b433-44a9-9c14-690255317cf3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.968 183079 DEBUG nova.compute.manager [req-9f05263d-9119-4e63-8c96-985135e32335 req-b70171ff-0ef3-4c20-bdb5-6a041f3c73b1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Received event network-vif-unplugged-09c606a4-b433-44a9-9c14-690255317cf3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.968 183079 DEBUG nova.compute.manager [req-9f05263d-9119-4e63-8c96-985135e32335 req-b70171ff-0ef3-4c20-bdb5-6a041f3c73b1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Received event network-vif-plugged-09c606a4-b433-44a9-9c14-690255317cf3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.968 183079 DEBUG oslo_concurrency.lockutils [req-9f05263d-9119-4e63-8c96-985135e32335 req-b70171ff-0ef3-4c20-bdb5-6a041f3c73b1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.968 183079 DEBUG oslo_concurrency.lockutils [req-9f05263d-9119-4e63-8c96-985135e32335 req-b70171ff-0ef3-4c20-bdb5-6a041f3c73b1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.969 183079 DEBUG oslo_concurrency.lockutils [req-9f05263d-9119-4e63-8c96-985135e32335 req-b70171ff-0ef3-4c20-bdb5-6a041f3c73b1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "4e7d63e0-335d-4cc1-9701-ba3728ed3e2d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.969 183079 DEBUG nova.compute.manager [req-9f05263d-9119-4e63-8c96-985135e32335 req-b70171ff-0ef3-4c20-bdb5-6a041f3c73b1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] No waiting events found dispatching network-vif-plugged-09c606a4-b433-44a9-9c14-690255317cf3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:58:07 compute-0 nova_compute[183075]: 2026-01-22 17:58:07.969 183079 WARNING nova.compute.manager [req-9f05263d-9119-4e63-8c96-985135e32335 req-b70171ff-0ef3-4c20-bdb5-6a041f3c73b1 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Received unexpected event network-vif-plugged-09c606a4-b433-44a9-9c14-690255317cf3 for instance with vm_state active and task_state deleting.
Jan 22 17:58:08 compute-0 nova_compute[183075]: 2026-01-22 17:58:08.446 183079 DEBUG nova.network.neutron [-] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:58:08 compute-0 nova_compute[183075]: 2026-01-22 17:58:08.465 183079 INFO nova.compute.manager [-] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Took 0.81 seconds to deallocate network for instance.
Jan 22 17:58:08 compute-0 nova_compute[183075]: 2026-01-22 17:58:08.504 183079 DEBUG oslo_concurrency.lockutils [None req-1a28054a-5c62-4f29-ab5b-9a45680a6e2f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:58:08 compute-0 nova_compute[183075]: 2026-01-22 17:58:08.504 183079 DEBUG oslo_concurrency.lockutils [None req-1a28054a-5c62-4f29-ab5b-9a45680a6e2f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:58:08 compute-0 nova_compute[183075]: 2026-01-22 17:58:08.562 183079 DEBUG nova.compute.provider_tree [None req-1a28054a-5c62-4f29-ab5b-9a45680a6e2f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:58:08 compute-0 nova_compute[183075]: 2026-01-22 17:58:08.577 183079 DEBUG nova.scheduler.client.report [None req-1a28054a-5c62-4f29-ab5b-9a45680a6e2f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:58:08 compute-0 nova_compute[183075]: 2026-01-22 17:58:08.596 183079 DEBUG oslo_concurrency.lockutils [None req-1a28054a-5c62-4f29-ab5b-9a45680a6e2f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:58:08 compute-0 nova_compute[183075]: 2026-01-22 17:58:08.617 183079 INFO nova.scheduler.client.report [None req-1a28054a-5c62-4f29-ab5b-9a45680a6e2f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Deleted allocations for instance 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d
Jan 22 17:58:08 compute-0 nova_compute[183075]: 2026-01-22 17:58:08.673 183079 DEBUG oslo_concurrency.lockutils [None req-1a28054a-5c62-4f29-ab5b-9a45680a6e2f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "4e7d63e0-335d-4cc1-9701-ba3728ed3e2d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.373s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:58:09 compute-0 nova_compute[183075]: 2026-01-22 17:58:09.331 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:10 compute-0 podman[244846]: 2026-01-22 17:58:10.341544188 +0000 UTC m=+0.051324347 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 17:58:12 compute-0 nova_compute[183075]: 2026-01-22 17:58:12.583 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:13 compute-0 nova_compute[183075]: 2026-01-22 17:58:13.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.221 183079 DEBUG oslo_concurrency.lockutils [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "d98d7460-4481-41d1-8ad7-97b85cb698ad" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.222 183079 DEBUG oslo_concurrency.lockutils [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "d98d7460-4481-41d1-8ad7-97b85cb698ad" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.236 183079 DEBUG nova.compute.manager [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.306 183079 DEBUG oslo_concurrency.lockutils [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.307 183079 DEBUG oslo_concurrency.lockutils [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.313 183079 DEBUG nova.virt.hardware [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.313 183079 INFO nova.compute.claims [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.333 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.430 183079 DEBUG nova.compute.provider_tree [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.446 183079 DEBUG nova.scheduler.client.report [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.469 183079 DEBUG oslo_concurrency.lockutils [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.471 183079 DEBUG nova.compute.manager [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.534 183079 DEBUG nova.compute.manager [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.535 183079 DEBUG nova.network.neutron [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.563 183079 INFO nova.virt.libvirt.driver [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.591 183079 DEBUG nova.compute.manager [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.690 183079 DEBUG nova.compute.manager [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.691 183079 DEBUG nova.virt.libvirt.driver [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.692 183079 INFO nova.virt.libvirt.driver [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Creating image(s)
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.692 183079 DEBUG oslo_concurrency.lockutils [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "/var/lib/nova/instances/d98d7460-4481-41d1-8ad7-97b85cb698ad/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.693 183079 DEBUG oslo_concurrency.lockutils [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/d98d7460-4481-41d1-8ad7-97b85cb698ad/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.694 183079 DEBUG oslo_concurrency.lockutils [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/d98d7460-4481-41d1-8ad7-97b85cb698ad/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.705 183079 DEBUG oslo_concurrency.processutils [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.791 183079 DEBUG nova.policy [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.807 183079 DEBUG oslo_concurrency.processutils [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.102s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.808 183079 DEBUG oslo_concurrency.lockutils [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.809 183079 DEBUG oslo_concurrency.lockutils [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.823 183079 DEBUG oslo_concurrency.processutils [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.889 183079 DEBUG oslo_concurrency.processutils [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.890 183079 DEBUG oslo_concurrency.processutils [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/d98d7460-4481-41d1-8ad7-97b85cb698ad/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.922 183079 DEBUG oslo_concurrency.processutils [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/d98d7460-4481-41d1-8ad7-97b85cb698ad/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.923 183079 DEBUG oslo_concurrency.lockutils [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.923 183079 DEBUG oslo_concurrency.processutils [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.976 183079 DEBUG oslo_concurrency.processutils [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.977 183079 DEBUG nova.virt.disk.api [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Checking if we can resize image /var/lib/nova/instances/d98d7460-4481-41d1-8ad7-97b85cb698ad/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:58:14 compute-0 nova_compute[183075]: 2026-01-22 17:58:14.977 183079 DEBUG oslo_concurrency.processutils [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d98d7460-4481-41d1-8ad7-97b85cb698ad/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:58:15 compute-0 nova_compute[183075]: 2026-01-22 17:58:15.034 183079 DEBUG oslo_concurrency.processutils [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d98d7460-4481-41d1-8ad7-97b85cb698ad/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:58:15 compute-0 nova_compute[183075]: 2026-01-22 17:58:15.035 183079 DEBUG nova.virt.disk.api [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Cannot resize image /var/lib/nova/instances/d98d7460-4481-41d1-8ad7-97b85cb698ad/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:58:15 compute-0 nova_compute[183075]: 2026-01-22 17:58:15.036 183079 DEBUG nova.objects.instance [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'migration_context' on Instance uuid d98d7460-4481-41d1-8ad7-97b85cb698ad obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:58:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:15.051 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '59'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:58:15 compute-0 nova_compute[183075]: 2026-01-22 17:58:15.051 183079 DEBUG nova.virt.libvirt.driver [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:58:15 compute-0 nova_compute[183075]: 2026-01-22 17:58:15.052 183079 DEBUG nova.virt.libvirt.driver [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Ensure instance console log exists: /var/lib/nova/instances/d98d7460-4481-41d1-8ad7-97b85cb698ad/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:58:15 compute-0 nova_compute[183075]: 2026-01-22 17:58:15.052 183079 DEBUG oslo_concurrency.lockutils [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:58:15 compute-0 nova_compute[183075]: 2026-01-22 17:58:15.052 183079 DEBUG oslo_concurrency.lockutils [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:58:15 compute-0 nova_compute[183075]: 2026-01-22 17:58:15.053 183079 DEBUG oslo_concurrency.lockutils [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:58:15 compute-0 nova_compute[183075]: 2026-01-22 17:58:15.584 183079 DEBUG nova.network.neutron [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Successfully created port: 0b4e1e83-23e3-4ee1-8291-e1197b83e358 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:58:16 compute-0 nova_compute[183075]: 2026-01-22 17:58:16.357 183079 DEBUG nova.network.neutron [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Successfully updated port: 0b4e1e83-23e3-4ee1-8291-e1197b83e358 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:58:16 compute-0 nova_compute[183075]: 2026-01-22 17:58:16.381 183079 DEBUG oslo_concurrency.lockutils [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "refresh_cache-d98d7460-4481-41d1-8ad7-97b85cb698ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:58:16 compute-0 nova_compute[183075]: 2026-01-22 17:58:16.382 183079 DEBUG oslo_concurrency.lockutils [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquired lock "refresh_cache-d98d7460-4481-41d1-8ad7-97b85cb698ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:58:16 compute-0 nova_compute[183075]: 2026-01-22 17:58:16.382 183079 DEBUG nova.network.neutron [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:58:16 compute-0 nova_compute[183075]: 2026-01-22 17:58:16.465 183079 DEBUG nova.compute.manager [req-3f82e083-ff26-46b6-9f78-aa9ca9b9eca8 req-a78523c8-10d5-4c04-a286-9a7927ba17bb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Received event network-changed-0b4e1e83-23e3-4ee1-8291-e1197b83e358 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:58:16 compute-0 nova_compute[183075]: 2026-01-22 17:58:16.466 183079 DEBUG nova.compute.manager [req-3f82e083-ff26-46b6-9f78-aa9ca9b9eca8 req-a78523c8-10d5-4c04-a286-9a7927ba17bb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Refreshing instance network info cache due to event network-changed-0b4e1e83-23e3-4ee1-8291-e1197b83e358. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:58:16 compute-0 nova_compute[183075]: 2026-01-22 17:58:16.466 183079 DEBUG oslo_concurrency.lockutils [req-3f82e083-ff26-46b6-9f78-aa9ca9b9eca8 req-a78523c8-10d5-4c04-a286-9a7927ba17bb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-d98d7460-4481-41d1-8ad7-97b85cb698ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:58:16 compute-0 nova_compute[183075]: 2026-01-22 17:58:16.776 183079 DEBUG nova.network.neutron [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.437 183079 DEBUG nova.network.neutron [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Updating instance_info_cache with network_info: [{"id": "0b4e1e83-23e3-4ee1-8291-e1197b83e358", "address": "fa:16:3e:c2:1a:74", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b4e1e83-23", "ovs_interfaceid": "0b4e1e83-23e3-4ee1-8291-e1197b83e358", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.461 183079 DEBUG oslo_concurrency.lockutils [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Releasing lock "refresh_cache-d98d7460-4481-41d1-8ad7-97b85cb698ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.461 183079 DEBUG nova.compute.manager [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Instance network_info: |[{"id": "0b4e1e83-23e3-4ee1-8291-e1197b83e358", "address": "fa:16:3e:c2:1a:74", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b4e1e83-23", "ovs_interfaceid": "0b4e1e83-23e3-4ee1-8291-e1197b83e358", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.462 183079 DEBUG oslo_concurrency.lockutils [req-3f82e083-ff26-46b6-9f78-aa9ca9b9eca8 req-a78523c8-10d5-4c04-a286-9a7927ba17bb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-d98d7460-4481-41d1-8ad7-97b85cb698ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.462 183079 DEBUG nova.network.neutron [req-3f82e083-ff26-46b6-9f78-aa9ca9b9eca8 req-a78523c8-10d5-4c04-a286-9a7927ba17bb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Refreshing network info cache for port 0b4e1e83-23e3-4ee1-8291-e1197b83e358 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.464 183079 DEBUG nova.virt.libvirt.driver [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Start _get_guest_xml network_info=[{"id": "0b4e1e83-23e3-4ee1-8291-e1197b83e358", "address": "fa:16:3e:c2:1a:74", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b4e1e83-23", "ovs_interfaceid": "0b4e1e83-23e3-4ee1-8291-e1197b83e358", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.469 183079 WARNING nova.virt.libvirt.driver [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.473 183079 DEBUG nova.virt.libvirt.host [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.473 183079 DEBUG nova.virt.libvirt.host [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.479 183079 DEBUG nova.virt.libvirt.host [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.480 183079 DEBUG nova.virt.libvirt.host [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.480 183079 DEBUG nova.virt.libvirt.driver [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.480 183079 DEBUG nova.virt.hardware [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.481 183079 DEBUG nova.virt.hardware [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.481 183079 DEBUG nova.virt.hardware [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.482 183079 DEBUG nova.virt.hardware [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.482 183079 DEBUG nova.virt.hardware [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.482 183079 DEBUG nova.virt.hardware [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.482 183079 DEBUG nova.virt.hardware [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.483 183079 DEBUG nova.virt.hardware [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.483 183079 DEBUG nova.virt.hardware [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.483 183079 DEBUG nova.virt.hardware [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.483 183079 DEBUG nova.virt.hardware [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.487 183079 DEBUG nova.virt.libvirt.vif [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:58:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-263526970',display_name='tempest-server-test-263526970',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-263526970',id=78,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-00duqttj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='vi
rtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:58:14Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=d98d7460-4481-41d1-8ad7-97b85cb698ad,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0b4e1e83-23e3-4ee1-8291-e1197b83e358", "address": "fa:16:3e:c2:1a:74", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b4e1e83-23", "ovs_interfaceid": "0b4e1e83-23e3-4ee1-8291-e1197b83e358", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.487 183079 DEBUG nova.network.os_vif_util [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "0b4e1e83-23e3-4ee1-8291-e1197b83e358", "address": "fa:16:3e:c2:1a:74", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b4e1e83-23", "ovs_interfaceid": "0b4e1e83-23e3-4ee1-8291-e1197b83e358", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.487 183079 DEBUG nova.network.os_vif_util [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c2:1a:74,bridge_name='br-int',has_traffic_filtering=True,id=0b4e1e83-23e3-4ee1-8291-e1197b83e358,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b4e1e83-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.488 183079 DEBUG nova.objects.instance [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'pci_devices' on Instance uuid d98d7460-4481-41d1-8ad7-97b85cb698ad obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.505 183079 DEBUG nova.virt.libvirt.driver [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:58:17 compute-0 nova_compute[183075]:   <uuid>d98d7460-4481-41d1-8ad7-97b85cb698ad</uuid>
Jan 22 17:58:17 compute-0 nova_compute[183075]:   <name>instance-0000004e</name>
Jan 22 17:58:17 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:58:17 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:58:17 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:58:17 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-263526970</nova:name>
Jan 22 17:58:17 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:58:17</nova:creationTime>
Jan 22 17:58:17 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:58:17 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:58:17 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:58:17 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:58:17 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:58:17 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:58:17 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:58:17 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:58:17 compute-0 nova_compute[183075]:         <nova:user uuid="66c24efd0cd24c57803ea8508e679d06">tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member</nova:user>
Jan 22 17:58:17 compute-0 nova_compute[183075]:         <nova:project uuid="cbff5cec4b9c4d2eb317124ca20fea9f">tempest-StatelessNetworkSecGroupIPv4Test-1348485316</nova:project>
Jan 22 17:58:17 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:58:17 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:58:17 compute-0 nova_compute[183075]:         <nova:port uuid="0b4e1e83-23e3-4ee1-8291-e1197b83e358">
Jan 22 17:58:17 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:58:17 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:58:17 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:58:17 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <system>
Jan 22 17:58:17 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:58:17 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:58:17 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:58:17 compute-0 nova_compute[183075]:       <entry name="serial">d98d7460-4481-41d1-8ad7-97b85cb698ad</entry>
Jan 22 17:58:17 compute-0 nova_compute[183075]:       <entry name="uuid">d98d7460-4481-41d1-8ad7-97b85cb698ad</entry>
Jan 22 17:58:17 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     </system>
Jan 22 17:58:17 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:58:17 compute-0 nova_compute[183075]:   <os>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:   </os>
Jan 22 17:58:17 compute-0 nova_compute[183075]:   <features>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:   </features>
Jan 22 17:58:17 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:58:17 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:58:17 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:58:17 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/d98d7460-4481-41d1-8ad7-97b85cb698ad/disk"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:58:17 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:c2:1a:74"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:       <target dev="tap0b4e1e83-23"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:58:17 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/d98d7460-4481-41d1-8ad7-97b85cb698ad/console.log" append="off"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <video>
Jan 22 17:58:17 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     </video>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:58:17 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:58:17 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:58:17 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:58:17 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:58:17 compute-0 nova_compute[183075]: </domain>
Jan 22 17:58:17 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.507 183079 DEBUG nova.compute.manager [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Preparing to wait for external event network-vif-plugged-0b4e1e83-23e3-4ee1-8291-e1197b83e358 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.508 183079 DEBUG oslo_concurrency.lockutils [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "d98d7460-4481-41d1-8ad7-97b85cb698ad-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.509 183079 DEBUG oslo_concurrency.lockutils [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "d98d7460-4481-41d1-8ad7-97b85cb698ad-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.509 183079 DEBUG oslo_concurrency.lockutils [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "d98d7460-4481-41d1-8ad7-97b85cb698ad-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.511 183079 DEBUG nova.virt.libvirt.vif [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:58:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-263526970',display_name='tempest-server-test-263526970',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-263526970',id=78,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-00duqttj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng
_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:58:14Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=d98d7460-4481-41d1-8ad7-97b85cb698ad,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0b4e1e83-23e3-4ee1-8291-e1197b83e358", "address": "fa:16:3e:c2:1a:74", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b4e1e83-23", "ovs_interfaceid": "0b4e1e83-23e3-4ee1-8291-e1197b83e358", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.512 183079 DEBUG nova.network.os_vif_util [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "0b4e1e83-23e3-4ee1-8291-e1197b83e358", "address": "fa:16:3e:c2:1a:74", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b4e1e83-23", "ovs_interfaceid": "0b4e1e83-23e3-4ee1-8291-e1197b83e358", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.513 183079 DEBUG nova.network.os_vif_util [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c2:1a:74,bridge_name='br-int',has_traffic_filtering=True,id=0b4e1e83-23e3-4ee1-8291-e1197b83e358,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b4e1e83-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.514 183079 DEBUG os_vif [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c2:1a:74,bridge_name='br-int',has_traffic_filtering=True,id=0b4e1e83-23e3-4ee1-8291-e1197b83e358,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b4e1e83-23') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.516 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.517 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.518 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.523 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.524 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0b4e1e83-23, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.525 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0b4e1e83-23, col_values=(('external_ids', {'iface-id': '0b4e1e83-23e3-4ee1-8291-e1197b83e358', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c2:1a:74', 'vm-uuid': 'd98d7460-4481-41d1-8ad7-97b85cb698ad'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.528 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:17 compute-0 NetworkManager[55454]: <info>  [1769104697.5291] manager: (tap0b4e1e83-23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/339)
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.532 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.534 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.535 183079 INFO os_vif [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c2:1a:74,bridge_name='br-int',has_traffic_filtering=True,id=0b4e1e83-23e3-4ee1-8291-e1197b83e358,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b4e1e83-23')
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.604 183079 DEBUG nova.virt.libvirt.driver [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.606 183079 DEBUG nova.virt.libvirt.driver [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No VIF found with MAC fa:16:3e:c2:1a:74, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:58:17 compute-0 kernel: tap0b4e1e83-23: entered promiscuous mode
Jan 22 17:58:17 compute-0 NetworkManager[55454]: <info>  [1769104697.6575] manager: (tap0b4e1e83-23): new Tun device (/org/freedesktop/NetworkManager/Devices/340)
Jan 22 17:58:17 compute-0 ovn_controller[95372]: 2026-01-22T17:58:17Z|00855|binding|INFO|Claiming lport 0b4e1e83-23e3-4ee1-8291-e1197b83e358 for this chassis.
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.658 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:17 compute-0 ovn_controller[95372]: 2026-01-22T17:58:17Z|00856|binding|INFO|0b4e1e83-23e3-4ee1-8291-e1197b83e358: Claiming fa:16:3e:c2:1a:74 10.100.0.5
Jan 22 17:58:17 compute-0 ovn_controller[95372]: 2026-01-22T17:58:17Z|00857|binding|INFO|Setting lport 0b4e1e83-23e3-4ee1-8291-e1197b83e358 ovn-installed in OVS
Jan 22 17:58:17 compute-0 ovn_controller[95372]: 2026-01-22T17:58:17Z|00858|binding|INFO|Setting lport 0b4e1e83-23e3-4ee1-8291-e1197b83e358 up in Southbound
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.671 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:17.671 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c2:1a:74 10.100.0.5'], port_security=['fa:16:3e:c2:1a:74 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'd98d7460-4481-41d1-8ad7-97b85cb698ad', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '63a5bff5-793d-4ac2-8d85-9660aed4c099', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=0b4e1e83-23e3-4ee1-8291-e1197b83e358) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:17.673 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 0b4e1e83-23e3-4ee1-8291-e1197b83e358 in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 bound to our chassis
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.673 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:17.674 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:17.685 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7db2d46b-cc1c-489a-8a1a-80624386089f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:17.686 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap88ed9213-71 in ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:17.688 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap88ed9213-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:17.688 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[df7b6ad5-3da2-4b8d-bad8-7ad4e55b884d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:17.689 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c241cf33-e065-4e7a-a2ff-9dba1b908a3c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:58:17 compute-0 systemd-machined[154382]: New machine qemu-78-instance-0000004e.
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:17.699 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[3cf75ef3-5cf3-47fd-8a2e-1ebb38e3f2ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:58:17 compute-0 systemd-udevd[244906]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:58:17 compute-0 NetworkManager[55454]: <info>  [1769104697.7112] device (tap0b4e1e83-23): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:58:17 compute-0 systemd[1]: Started Virtual Machine qemu-78-instance-0000004e.
Jan 22 17:58:17 compute-0 NetworkManager[55454]: <info>  [1769104697.7127] device (tap0b4e1e83-23): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:17.721 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[031c607d-a160-4e53-960d-943d28a0d2e0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:17.744 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[b1ca4cb8-341c-41eb-a435-5083721617e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:17.749 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[79f92c4b-6a59-4251-a4df-24f5ecb0ca12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:58:17 compute-0 systemd-udevd[244909]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:58:17 compute-0 NetworkManager[55454]: <info>  [1769104697.7501] manager: (tap88ed9213-70): new Veth device (/org/freedesktop/NetworkManager/Devices/341)
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:17.777 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[924b282f-bf1e-464f-9f26-696e24f67cf8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:17.780 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[56fbe56a-7df5-4f01-94e7-e9d36991a078]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:58:17 compute-0 NetworkManager[55454]: <info>  [1769104697.7993] device (tap88ed9213-70): carrier: link connected
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:17.803 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[eaa41e01-15cd-4ce3-9d3b-f3de991d9ec7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:17.822 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f0d4d795-21a3-4b1b-bfae-3613cdc6b4c2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 232], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 706150, 'reachable_time': 37615, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244937, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:17.838 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e12beee1-d2b6-47e9-9fa9-993ab8a52fe1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0a:fa93'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 706150, 'tstamp': 706150}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244938, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:17.860 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6ba91922-c602-45b5-bfdb-3725bf944afe]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 232], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 706150, 'reachable_time': 37615, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 244939, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:17.889 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[da9cd8d2-1f15-454b-b513-1c07d84ffaea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.916 183079 DEBUG nova.compute.manager [req-a4c16b25-9308-430b-b41c-f8be1571ba7b req-eda9673d-0b43-48aa-87cb-57e060810f69 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Received event network-vif-plugged-0b4e1e83-23e3-4ee1-8291-e1197b83e358 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.917 183079 DEBUG oslo_concurrency.lockutils [req-a4c16b25-9308-430b-b41c-f8be1571ba7b req-eda9673d-0b43-48aa-87cb-57e060810f69 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "d98d7460-4481-41d1-8ad7-97b85cb698ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.917 183079 DEBUG oslo_concurrency.lockutils [req-a4c16b25-9308-430b-b41c-f8be1571ba7b req-eda9673d-0b43-48aa-87cb-57e060810f69 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "d98d7460-4481-41d1-8ad7-97b85cb698ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.917 183079 DEBUG oslo_concurrency.lockutils [req-a4c16b25-9308-430b-b41c-f8be1571ba7b req-eda9673d-0b43-48aa-87cb-57e060810f69 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "d98d7460-4481-41d1-8ad7-97b85cb698ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.917 183079 DEBUG nova.compute.manager [req-a4c16b25-9308-430b-b41c-f8be1571ba7b req-eda9673d-0b43-48aa-87cb-57e060810f69 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Processing event network-vif-plugged-0b4e1e83-23e3-4ee1-8291-e1197b83e358 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:17.939 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[09ad7335-aa4b-471d-a003-a0d3e97a2e54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:17.941 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:17.941 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:17.942 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88ed9213-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.944 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:17 compute-0 kernel: tap88ed9213-70: entered promiscuous mode
Jan 22 17:58:17 compute-0 NetworkManager[55454]: <info>  [1769104697.9448] manager: (tap88ed9213-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/342)
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.947 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:17.948 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88ed9213-70, col_values=(('external_ids', {'iface-id': 'da93a32e-eed6-4124-8f08-78b0bfe2cb69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.949 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:17 compute-0 ovn_controller[95372]: 2026-01-22T17:58:17Z|00859|binding|INFO|Releasing lport da93a32e-eed6-4124-8f08-78b0bfe2cb69 from this chassis (sb_readonly=0)
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.950 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:17.951 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:17.952 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[73b3d1ef-87bd-4476-b307-e0da167966fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:17.952 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:58:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:17.953 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'env', 'PROCESS_TAG=haproxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/88ed9213-7d48-4c42-ad60-ce3fcf104f71.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:58:17 compute-0 nova_compute[183075]: 2026-01-22 17:58:17.961 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:18 compute-0 nova_compute[183075]: 2026-01-22 17:58:18.063 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104698.0633547, d98d7460-4481-41d1-8ad7-97b85cb698ad => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:58:18 compute-0 nova_compute[183075]: 2026-01-22 17:58:18.064 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] VM Started (Lifecycle Event)
Jan 22 17:58:18 compute-0 nova_compute[183075]: 2026-01-22 17:58:18.067 183079 DEBUG nova.compute.manager [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:58:18 compute-0 nova_compute[183075]: 2026-01-22 17:58:18.071 183079 DEBUG nova.virt.libvirt.driver [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:58:18 compute-0 nova_compute[183075]: 2026-01-22 17:58:18.074 183079 INFO nova.virt.libvirt.driver [-] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Instance spawned successfully.
Jan 22 17:58:18 compute-0 nova_compute[183075]: 2026-01-22 17:58:18.075 183079 DEBUG nova.virt.libvirt.driver [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:58:18 compute-0 nova_compute[183075]: 2026-01-22 17:58:18.109 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:58:18 compute-0 nova_compute[183075]: 2026-01-22 17:58:18.114 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:58:18 compute-0 nova_compute[183075]: 2026-01-22 17:58:18.118 183079 DEBUG nova.virt.libvirt.driver [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:58:18 compute-0 nova_compute[183075]: 2026-01-22 17:58:18.118 183079 DEBUG nova.virt.libvirt.driver [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:58:18 compute-0 nova_compute[183075]: 2026-01-22 17:58:18.119 183079 DEBUG nova.virt.libvirt.driver [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:58:18 compute-0 nova_compute[183075]: 2026-01-22 17:58:18.120 183079 DEBUG nova.virt.libvirt.driver [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:58:18 compute-0 nova_compute[183075]: 2026-01-22 17:58:18.120 183079 DEBUG nova.virt.libvirt.driver [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:58:18 compute-0 nova_compute[183075]: 2026-01-22 17:58:18.121 183079 DEBUG nova.virt.libvirt.driver [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:58:18 compute-0 nova_compute[183075]: 2026-01-22 17:58:18.152 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:58:18 compute-0 nova_compute[183075]: 2026-01-22 17:58:18.152 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104698.0634592, d98d7460-4481-41d1-8ad7-97b85cb698ad => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:58:18 compute-0 nova_compute[183075]: 2026-01-22 17:58:18.153 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] VM Paused (Lifecycle Event)
Jan 22 17:58:18 compute-0 nova_compute[183075]: 2026-01-22 17:58:18.181 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:58:18 compute-0 nova_compute[183075]: 2026-01-22 17:58:18.190 183079 INFO nova.compute.manager [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Took 3.50 seconds to spawn the instance on the hypervisor.
Jan 22 17:58:18 compute-0 nova_compute[183075]: 2026-01-22 17:58:18.191 183079 DEBUG nova.compute.manager [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:58:18 compute-0 nova_compute[183075]: 2026-01-22 17:58:18.192 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104698.0705903, d98d7460-4481-41d1-8ad7-97b85cb698ad => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:58:18 compute-0 nova_compute[183075]: 2026-01-22 17:58:18.193 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] VM Resumed (Lifecycle Event)
Jan 22 17:58:18 compute-0 nova_compute[183075]: 2026-01-22 17:58:18.219 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:58:18 compute-0 nova_compute[183075]: 2026-01-22 17:58:18.222 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:58:18 compute-0 nova_compute[183075]: 2026-01-22 17:58:18.244 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:58:18 compute-0 nova_compute[183075]: 2026-01-22 17:58:18.253 183079 INFO nova.compute.manager [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Took 3.97 seconds to build instance.
Jan 22 17:58:18 compute-0 nova_compute[183075]: 2026-01-22 17:58:18.268 183079 DEBUG oslo_concurrency.lockutils [None req-59777f14-6d7e-4387-badf-467e61bc684c 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "d98d7460-4481-41d1-8ad7-97b85cb698ad" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.046s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:58:18 compute-0 podman[244978]: 2026-01-22 17:58:18.300201022 +0000 UTC m=+0.052425396 container create 4dd3a4d975f00934ea32e1963c2386f1e9c71781266bac796c248aa19d53a2b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:58:18 compute-0 systemd[1]: Started libpod-conmon-4dd3a4d975f00934ea32e1963c2386f1e9c71781266bac796c248aa19d53a2b1.scope.
Jan 22 17:58:18 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:58:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d0870b1da9ab0907d4c6a023c91b7e6cfbc78074f74b1ac3823a459d00e47bef/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:58:18 compute-0 podman[244978]: 2026-01-22 17:58:18.272891495 +0000 UTC m=+0.025115959 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:58:18 compute-0 podman[244978]: 2026-01-22 17:58:18.376527443 +0000 UTC m=+0.128751837 container init 4dd3a4d975f00934ea32e1963c2386f1e9c71781266bac796c248aa19d53a2b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:58:18 compute-0 podman[244978]: 2026-01-22 17:58:18.383766808 +0000 UTC m=+0.135991182 container start 4dd3a4d975f00934ea32e1963c2386f1e9c71781266bac796c248aa19d53a2b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:58:18 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[244993]: [NOTICE]   (244997) : New worker (244999) forked
Jan 22 17:58:18 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[244993]: [NOTICE]   (244997) : Loading success.
Jan 22 17:58:19 compute-0 nova_compute[183075]: 2026-01-22 17:58:19.335 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:19 compute-0 nova_compute[183075]: 2026-01-22 17:58:19.828 183079 DEBUG nova.network.neutron [req-3f82e083-ff26-46b6-9f78-aa9ca9b9eca8 req-a78523c8-10d5-4c04-a286-9a7927ba17bb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Updated VIF entry in instance network info cache for port 0b4e1e83-23e3-4ee1-8291-e1197b83e358. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:58:19 compute-0 nova_compute[183075]: 2026-01-22 17:58:19.828 183079 DEBUG nova.network.neutron [req-3f82e083-ff26-46b6-9f78-aa9ca9b9eca8 req-a78523c8-10d5-4c04-a286-9a7927ba17bb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Updating instance_info_cache with network_info: [{"id": "0b4e1e83-23e3-4ee1-8291-e1197b83e358", "address": "fa:16:3e:c2:1a:74", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b4e1e83-23", "ovs_interfaceid": "0b4e1e83-23e3-4ee1-8291-e1197b83e358", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:58:19 compute-0 nova_compute[183075]: 2026-01-22 17:58:19.843 183079 DEBUG oslo_concurrency.lockutils [req-3f82e083-ff26-46b6-9f78-aa9ca9b9eca8 req-a78523c8-10d5-4c04-a286-9a7927ba17bb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-d98d7460-4481-41d1-8ad7-97b85cb698ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:58:20 compute-0 nova_compute[183075]: 2026-01-22 17:58:19.999 183079 DEBUG nova.compute.manager [req-bb54abef-22fb-4627-b82d-a556d1bd8667 req-e664687b-ee27-4375-a225-ad6bc1216944 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Received event network-vif-plugged-0b4e1e83-23e3-4ee1-8291-e1197b83e358 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:58:20 compute-0 nova_compute[183075]: 2026-01-22 17:58:19.999 183079 DEBUG oslo_concurrency.lockutils [req-bb54abef-22fb-4627-b82d-a556d1bd8667 req-e664687b-ee27-4375-a225-ad6bc1216944 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "d98d7460-4481-41d1-8ad7-97b85cb698ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:58:20 compute-0 nova_compute[183075]: 2026-01-22 17:58:19.999 183079 DEBUG oslo_concurrency.lockutils [req-bb54abef-22fb-4627-b82d-a556d1bd8667 req-e664687b-ee27-4375-a225-ad6bc1216944 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "d98d7460-4481-41d1-8ad7-97b85cb698ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:58:20 compute-0 nova_compute[183075]: 2026-01-22 17:58:20.000 183079 DEBUG oslo_concurrency.lockutils [req-bb54abef-22fb-4627-b82d-a556d1bd8667 req-e664687b-ee27-4375-a225-ad6bc1216944 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "d98d7460-4481-41d1-8ad7-97b85cb698ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:58:20 compute-0 nova_compute[183075]: 2026-01-22 17:58:20.000 183079 DEBUG nova.compute.manager [req-bb54abef-22fb-4627-b82d-a556d1bd8667 req-e664687b-ee27-4375-a225-ad6bc1216944 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] No waiting events found dispatching network-vif-plugged-0b4e1e83-23e3-4ee1-8291-e1197b83e358 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:58:20 compute-0 nova_compute[183075]: 2026-01-22 17:58:20.000 183079 WARNING nova.compute.manager [req-bb54abef-22fb-4627-b82d-a556d1bd8667 req-e664687b-ee27-4375-a225-ad6bc1216944 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Received unexpected event network-vif-plugged-0b4e1e83-23e3-4ee1-8291-e1197b83e358 for instance with vm_state active and task_state None.
Jan 22 17:58:20 compute-0 nova_compute[183075]: 2026-01-22 17:58:20.867 183079 INFO nova.compute.manager [None req-075c57f0-8e53-4dec-9ca6-a534723eedf6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Get console output
Jan 22 17:58:20 compute-0 nova_compute[183075]: 2026-01-22 17:58:20.874 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:58:21 compute-0 podman[245009]: 2026-01-22 17:58:21.357738103 +0000 UTC m=+0.054959404 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:58:21 compute-0 podman[245010]: 2026-01-22 17:58:21.364423224 +0000 UTC m=+0.055979372 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, managed_by=edpm_ansible, config_id=openstack_network_exporter, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 22 17:58:21 compute-0 podman[245008]: 2026-01-22 17:58:21.372909963 +0000 UTC m=+0.075438667 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_controller, 
org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 17:58:22 compute-0 nova_compute[183075]: 2026-01-22 17:58:22.529 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:22 compute-0 nova_compute[183075]: 2026-01-22 17:58:22.551 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769104687.5497816, 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:58:22 compute-0 nova_compute[183075]: 2026-01-22 17:58:22.552 183079 INFO nova.compute.manager [-] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] VM Stopped (Lifecycle Event)
Jan 22 17:58:22 compute-0 nova_compute[183075]: 2026-01-22 17:58:22.574 183079 DEBUG nova.compute.manager [None req-8791e03b-1254-4ecf-94fe-6cdf5d837265 - - - - - -] [instance: 4e7d63e0-335d-4cc1-9701-ba3728ed3e2d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:58:24 compute-0 nova_compute[183075]: 2026-01-22 17:58:24.338 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:26 compute-0 nova_compute[183075]: 2026-01-22 17:58:26.010 183079 INFO nova.compute.manager [None req-c894d34c-6229-4c3e-88b3-92ced0f16d2f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Get console output
Jan 22 17:58:26 compute-0 nova_compute[183075]: 2026-01-22 17:58:26.016 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:58:26 compute-0 podman[245073]: 2026-01-22 17:58:26.345906677 +0000 UTC m=+0.056837535 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 17:58:27 compute-0 nova_compute[183075]: 2026-01-22 17:58:27.531 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:29 compute-0 nova_compute[183075]: 2026-01-22 17:58:29.340 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:29 compute-0 ovn_controller[95372]: 2026-01-22T17:58:29Z|00158|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c2:1a:74 10.100.0.5
Jan 22 17:58:29 compute-0 ovn_controller[95372]: 2026-01-22T17:58:29Z|00159|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c2:1a:74 10.100.0.5
Jan 22 17:58:31 compute-0 nova_compute[183075]: 2026-01-22 17:58:31.141 183079 INFO nova.compute.manager [None req-2eb79d1a-0f74-4370-96c6-8747e4e4722b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Get console output
Jan 22 17:58:31 compute-0 nova_compute[183075]: 2026-01-22 17:58:31.145 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:58:32 compute-0 nova_compute[183075]: 2026-01-22 17:58:32.534 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:34 compute-0 nova_compute[183075]: 2026-01-22 17:58:34.341 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:34.866 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:58:34 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:34.867 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:58:34 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:58:34 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:58:34 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:58:34 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:58:34 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:58:34 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:58:34 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:35.876 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:35.876 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 1.0095606
Jan 22 17:58:35 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[244999]: 10.100.0.5:43740 [22/Jan/2026:17:58:34.865] listener listener/metadata 0/0/0/1010/1010 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:35.886 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:35.886 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:35.905 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:58:35 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[244999]: 10.100.0.5:43744 [22/Jan/2026:17:58:35.885] listener listener/metadata 0/0/0/20/20 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:35.905 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0193963
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:35.910 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:35.910 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:35.922 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:35.923 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0128059
Jan 22 17:58:35 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[244999]: 10.100.0.5:43748 [22/Jan/2026:17:58:35.909] listener listener/metadata 0/0/0/13/13 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:35.929 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:35.930 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:35.967 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:35.968 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0378520
Jan 22 17:58:35 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[244999]: 10.100.0.5:43764 [22/Jan/2026:17:58:35.929] listener listener/metadata 0/0/0/39/39 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:35.975 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:35.976 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:35.988 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:35.989 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0128748
Jan 22 17:58:35 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[244999]: 10.100.0.5:43780 [22/Jan/2026:17:58:35.975] listener listener/metadata 0/0/0/14/14 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:35.995 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:35.996 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:58:35 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.009 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.010 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0141668
Jan 22 17:58:36 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[244999]: 10.100.0.5:43786 [22/Jan/2026:17:58:35.995] listener listener/metadata 0/0/0/15/15 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.014 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.015 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.028 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.028 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0132442
Jan 22 17:58:36 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[244999]: 10.100.0.5:43794 [22/Jan/2026:17:58:36.014] listener listener/metadata 0/0/0/14/14 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.033 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.034 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.052 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.052 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0180683
Jan 22 17:58:36 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[244999]: 10.100.0.5:43796 [22/Jan/2026:17:58:36.033] listener listener/metadata 0/0/0/19/19 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.057 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.058 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.071 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.072 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 165 time: 0.0141413
Jan 22 17:58:36 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[244999]: 10.100.0.5:43802 [22/Jan/2026:17:58:36.056] listener listener/metadata 0/0/0/15/15 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.076 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.077 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.090 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:58:36 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[244999]: 10.100.0.5:43804 [22/Jan/2026:17:58:36.076] listener listener/metadata 0/0/0/14/14 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.090 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 165 time: 0.0132511
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.094 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.095 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:58:36 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[244999]: 10.100.0.5:43810 [22/Jan/2026:17:58:36.094] listener listener/metadata 0/0/0/12/12 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.106 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0118163
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.114 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.115 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.192 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.192 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0774901
Jan 22 17:58:36 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[244999]: 10.100.0.5:43822 [22/Jan/2026:17:58:36.114] listener listener/metadata 0/0/0/78/78 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.196 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.197 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:58:36 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[244999]: 10.100.0.5:43838 [22/Jan/2026:17:58:36.196] listener listener/metadata 0/0/0/14/14 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.210 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.211 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0139973
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.214 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.214 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.231 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.232 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0174656
Jan 22 17:58:36 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[244999]: 10.100.0.5:43846 [22/Jan/2026:17:58:36.214] listener listener/metadata 0/0/0/18/18 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.236 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.237 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.250 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:58:36 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[244999]: 10.100.0.5:43858 [22/Jan/2026:17:58:36.236] listener listener/metadata 0/0/0/14/14 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.250 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 165 time: 0.0133905
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.254 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.255 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.267 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:58:36 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[244999]: 10.100.0.5:43868 [22/Jan/2026:17:58:36.254] listener listener/metadata 0/0/0/13/13 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:58:36 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:36.268 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0128736
Jan 22 17:58:36 compute-0 nova_compute[183075]: 2026-01-22 17:58:36.357 183079 INFO nova.compute.manager [None req-9b41a0b1-2274-47c8-b075-d74cdf531d29 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Get console output
Jan 22 17:58:36 compute-0 nova_compute[183075]: 2026-01-22 17:58:36.362 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:58:37 compute-0 podman[245115]: 2026-01-22 17:58:37.342562108 +0000 UTC m=+0.055418117 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 17:58:37 compute-0 nova_compute[183075]: 2026-01-22 17:58:37.576 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:39 compute-0 nova_compute[183075]: 2026-01-22 17:58:39.343 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:41 compute-0 podman[245140]: 2026-01-22 17:58:41.348199733 +0000 UTC m=+0.053186527 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:58:41 compute-0 nova_compute[183075]: 2026-01-22 17:58:41.516 183079 INFO nova.compute.manager [None req-f80c7573-e4b7-4bdd-9c74-040de5962935 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Get console output
Jan 22 17:58:41 compute-0 nova_compute[183075]: 2026-01-22 17:58:41.522 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:58:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:41.981 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:58:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:41.982 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:58:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:41.983 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:58:42 compute-0 nova_compute[183075]: 2026-01-22 17:58:42.579 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:44 compute-0 nova_compute[183075]: 2026-01-22 17:58:44.345 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:44 compute-0 nova_compute[183075]: 2026-01-22 17:58:44.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:58:44 compute-0 nova_compute[183075]: 2026-01-22 17:58:44.989 183079 DEBUG nova.compute.manager [req-dc9a31fe-aff4-49b7-bca0-e9e32d3abd6e req-a007b827-f5cd-46fc-9fa5-ccc64bee1f82 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Received event network-changed-0b4e1e83-23e3-4ee1-8291-e1197b83e358 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:58:44 compute-0 nova_compute[183075]: 2026-01-22 17:58:44.989 183079 DEBUG nova.compute.manager [req-dc9a31fe-aff4-49b7-bca0-e9e32d3abd6e req-a007b827-f5cd-46fc-9fa5-ccc64bee1f82 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Refreshing instance network info cache due to event network-changed-0b4e1e83-23e3-4ee1-8291-e1197b83e358. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:58:44 compute-0 nova_compute[183075]: 2026-01-22 17:58:44.990 183079 DEBUG oslo_concurrency.lockutils [req-dc9a31fe-aff4-49b7-bca0-e9e32d3abd6e req-a007b827-f5cd-46fc-9fa5-ccc64bee1f82 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-d98d7460-4481-41d1-8ad7-97b85cb698ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:58:44 compute-0 nova_compute[183075]: 2026-01-22 17:58:44.990 183079 DEBUG oslo_concurrency.lockutils [req-dc9a31fe-aff4-49b7-bca0-e9e32d3abd6e req-a007b827-f5cd-46fc-9fa5-ccc64bee1f82 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-d98d7460-4481-41d1-8ad7-97b85cb698ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:58:44 compute-0 nova_compute[183075]: 2026-01-22 17:58:44.990 183079 DEBUG nova.network.neutron [req-dc9a31fe-aff4-49b7-bca0-e9e32d3abd6e req-a007b827-f5cd-46fc-9fa5-ccc64bee1f82 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Refreshing network info cache for port 0b4e1e83-23e3-4ee1-8291-e1197b83e358 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:58:45 compute-0 nova_compute[183075]: 2026-01-22 17:58:45.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:58:46 compute-0 nova_compute[183075]: 2026-01-22 17:58:46.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:58:47 compute-0 nova_compute[183075]: 2026-01-22 17:58:47.084 183079 DEBUG nova.compute.manager [req-b55b7b03-8f64-446d-b17d-cffce5633c08 req-fdcee025-a9ed-4b96-a262-02eb3d891334 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Received event network-changed-0b4e1e83-23e3-4ee1-8291-e1197b83e358 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:58:47 compute-0 nova_compute[183075]: 2026-01-22 17:58:47.085 183079 DEBUG nova.compute.manager [req-b55b7b03-8f64-446d-b17d-cffce5633c08 req-fdcee025-a9ed-4b96-a262-02eb3d891334 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Refreshing instance network info cache due to event network-changed-0b4e1e83-23e3-4ee1-8291-e1197b83e358. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:58:47 compute-0 nova_compute[183075]: 2026-01-22 17:58:47.085 183079 DEBUG oslo_concurrency.lockutils [req-b55b7b03-8f64-446d-b17d-cffce5633c08 req-fdcee025-a9ed-4b96-a262-02eb3d891334 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-d98d7460-4481-41d1-8ad7-97b85cb698ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:58:47 compute-0 nova_compute[183075]: 2026-01-22 17:58:47.623 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:47 compute-0 nova_compute[183075]: 2026-01-22 17:58:47.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:58:47 compute-0 nova_compute[183075]: 2026-01-22 17:58:47.831 183079 DEBUG nova.network.neutron [req-dc9a31fe-aff4-49b7-bca0-e9e32d3abd6e req-a007b827-f5cd-46fc-9fa5-ccc64bee1f82 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Updated VIF entry in instance network info cache for port 0b4e1e83-23e3-4ee1-8291-e1197b83e358. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:58:47 compute-0 nova_compute[183075]: 2026-01-22 17:58:47.831 183079 DEBUG nova.network.neutron [req-dc9a31fe-aff4-49b7-bca0-e9e32d3abd6e req-a007b827-f5cd-46fc-9fa5-ccc64bee1f82 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Updating instance_info_cache with network_info: [{"id": "0b4e1e83-23e3-4ee1-8291-e1197b83e358", "address": "fa:16:3e:c2:1a:74", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b4e1e83-23", "ovs_interfaceid": "0b4e1e83-23e3-4ee1-8291-e1197b83e358", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:58:47 compute-0 nova_compute[183075]: 2026-01-22 17:58:47.904 183079 DEBUG oslo_concurrency.lockutils [req-dc9a31fe-aff4-49b7-bca0-e9e32d3abd6e req-a007b827-f5cd-46fc-9fa5-ccc64bee1f82 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-d98d7460-4481-41d1-8ad7-97b85cb698ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:58:47 compute-0 nova_compute[183075]: 2026-01-22 17:58:47.905 183079 DEBUG oslo_concurrency.lockutils [req-b55b7b03-8f64-446d-b17d-cffce5633c08 req-fdcee025-a9ed-4b96-a262-02eb3d891334 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-d98d7460-4481-41d1-8ad7-97b85cb698ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:58:47 compute-0 nova_compute[183075]: 2026-01-22 17:58:47.906 183079 DEBUG nova.network.neutron [req-b55b7b03-8f64-446d-b17d-cffce5633c08 req-fdcee025-a9ed-4b96-a262-02eb3d891334 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Refreshing network info cache for port 0b4e1e83-23e3-4ee1-8291-e1197b83e358 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:58:49 compute-0 nova_compute[183075]: 2026-01-22 17:58:49.347 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:51 compute-0 nova_compute[183075]: 2026-01-22 17:58:51.889 183079 DEBUG nova.network.neutron [req-b55b7b03-8f64-446d-b17d-cffce5633c08 req-fdcee025-a9ed-4b96-a262-02eb3d891334 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Updated VIF entry in instance network info cache for port 0b4e1e83-23e3-4ee1-8291-e1197b83e358. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:58:51 compute-0 nova_compute[183075]: 2026-01-22 17:58:51.890 183079 DEBUG nova.network.neutron [req-b55b7b03-8f64-446d-b17d-cffce5633c08 req-fdcee025-a9ed-4b96-a262-02eb3d891334 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Updating instance_info_cache with network_info: [{"id": "0b4e1e83-23e3-4ee1-8291-e1197b83e358", "address": "fa:16:3e:c2:1a:74", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b4e1e83-23", "ovs_interfaceid": "0b4e1e83-23e3-4ee1-8291-e1197b83e358", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:58:51 compute-0 nova_compute[183075]: 2026-01-22 17:58:51.913 183079 DEBUG oslo_concurrency.lockutils [req-b55b7b03-8f64-446d-b17d-cffce5633c08 req-fdcee025-a9ed-4b96-a262-02eb3d891334 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-d98d7460-4481-41d1-8ad7-97b85cb698ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.036 183079 DEBUG nova.compute.manager [req-a561e873-082c-497a-a0dd-479cef76604e req-fc6ae976-dfc8-4256-892d-9b4ece786458 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Received event network-changed-0b4e1e83-23e3-4ee1-8291-e1197b83e358 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.036 183079 DEBUG nova.compute.manager [req-a561e873-082c-497a-a0dd-479cef76604e req-fc6ae976-dfc8-4256-892d-9b4ece786458 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Refreshing instance network info cache due to event network-changed-0b4e1e83-23e3-4ee1-8291-e1197b83e358. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.036 183079 DEBUG oslo_concurrency.lockutils [req-a561e873-082c-497a-a0dd-479cef76604e req-fc6ae976-dfc8-4256-892d-9b4ece786458 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-d98d7460-4481-41d1-8ad7-97b85cb698ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.037 183079 DEBUG oslo_concurrency.lockutils [req-a561e873-082c-497a-a0dd-479cef76604e req-fc6ae976-dfc8-4256-892d-9b4ece786458 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-d98d7460-4481-41d1-8ad7-97b85cb698ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.037 183079 DEBUG nova.network.neutron [req-a561e873-082c-497a-a0dd-479cef76604e req-fc6ae976-dfc8-4256-892d-9b4ece786458 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Refreshing network info cache for port 0b4e1e83-23e3-4ee1-8291-e1197b83e358 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.207 183079 DEBUG oslo_concurrency.lockutils [None req-2e9319cf-d557-403f-aabd-1c68d59b03dc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "d98d7460-4481-41d1-8ad7-97b85cb698ad" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.208 183079 DEBUG oslo_concurrency.lockutils [None req-2e9319cf-d557-403f-aabd-1c68d59b03dc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "d98d7460-4481-41d1-8ad7-97b85cb698ad" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.208 183079 DEBUG oslo_concurrency.lockutils [None req-2e9319cf-d557-403f-aabd-1c68d59b03dc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "d98d7460-4481-41d1-8ad7-97b85cb698ad-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.209 183079 DEBUG oslo_concurrency.lockutils [None req-2e9319cf-d557-403f-aabd-1c68d59b03dc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "d98d7460-4481-41d1-8ad7-97b85cb698ad-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.209 183079 DEBUG oslo_concurrency.lockutils [None req-2e9319cf-d557-403f-aabd-1c68d59b03dc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "d98d7460-4481-41d1-8ad7-97b85cb698ad-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.210 183079 INFO nova.compute.manager [None req-2e9319cf-d557-403f-aabd-1c68d59b03dc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Terminating instance
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.211 183079 DEBUG nova.compute.manager [None req-2e9319cf-d557-403f-aabd-1c68d59b03dc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:58:52 compute-0 kernel: tap0b4e1e83-23 (unregistering): left promiscuous mode
Jan 22 17:58:52 compute-0 NetworkManager[55454]: <info>  [1769104732.2397] device (tap0b4e1e83-23): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:58:52 compute-0 ovn_controller[95372]: 2026-01-22T17:58:52Z|00860|binding|INFO|Releasing lport 0b4e1e83-23e3-4ee1-8291-e1197b83e358 from this chassis (sb_readonly=0)
Jan 22 17:58:52 compute-0 ovn_controller[95372]: 2026-01-22T17:58:52Z|00861|binding|INFO|Setting lport 0b4e1e83-23e3-4ee1-8291-e1197b83e358 down in Southbound
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.249 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:52 compute-0 ovn_controller[95372]: 2026-01-22T17:58:52Z|00862|binding|INFO|Removing iface tap0b4e1e83-23 ovn-installed in OVS
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.251 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.267 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:52.270 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c2:1a:74 10.100.0.5'], port_security=['fa:16:3e:c2:1a:74 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'd98d7460-4481-41d1-8ad7-97b85cb698ad', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '63a5bff5-793d-4ac2-8d85-9660aed4c099', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.223'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=0b4e1e83-23e3-4ee1-8291-e1197b83e358) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:58:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:52.271 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 0b4e1e83-23e3-4ee1-8291-e1197b83e358 in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 unbound from our chassis
Jan 22 17:58:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:52.273 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:58:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:52.274 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c606a76f-98b9-4457-a4ac-628efc420075]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:58:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:52.275 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 namespace which is not needed anymore
Jan 22 17:58:52 compute-0 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d0000004e.scope: Deactivated successfully.
Jan 22 17:58:52 compute-0 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d0000004e.scope: Consumed 12.472s CPU time.
Jan 22 17:58:52 compute-0 systemd-machined[154382]: Machine qemu-78-instance-0000004e terminated.
Jan 22 17:58:52 compute-0 podman[245167]: 2026-01-22 17:58:52.328602305 +0000 UTC m=+0.062556019 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 22 17:58:52 compute-0 podman[245168]: 2026-01-22 17:58:52.328938284 +0000 UTC m=+0.059885827 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, architecture=x86_64, version=9.6, build-date=2025-08-20T13:12:41, release=1755695350, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 
'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Jan 22 17:58:52 compute-0 podman[245164]: 2026-01-22 17:58:52.37657407 +0000 UTC m=+0.112666632 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:58:52 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[244993]: [NOTICE]   (244997) : haproxy version is 2.8.14-c23fe91
Jan 22 17:58:52 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[244993]: [NOTICE]   (244997) : path to executable is /usr/sbin/haproxy
Jan 22 17:58:52 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[244993]: [WARNING]  (244997) : Exiting Master process...
Jan 22 17:58:52 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[244993]: [ALERT]    (244997) : Current worker (244999) exited with code 143 (Terminated)
Jan 22 17:58:52 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[244993]: [WARNING]  (244997) : All workers exited. Exiting... (0)
Jan 22 17:58:52 compute-0 systemd[1]: libpod-4dd3a4d975f00934ea32e1963c2386f1e9c71781266bac796c248aa19d53a2b1.scope: Deactivated successfully.
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.436 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:52 compute-0 podman[245252]: 2026-01-22 17:58:52.438676337 +0000 UTC m=+0.079361934 container died 4dd3a4d975f00934ea32e1963c2386f1e9c71781266bac796c248aa19d53a2b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.440 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.465 183079 INFO nova.virt.libvirt.driver [-] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Instance destroyed successfully.
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.466 183079 DEBUG nova.objects.instance [None req-2e9319cf-d557-403f-aabd-1c68d59b03dc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'resources' on Instance uuid d98d7460-4481-41d1-8ad7-97b85cb698ad obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.491 183079 DEBUG nova.virt.libvirt.vif [None req-2e9319cf-d557-403f-aabd-1c68d59b03dc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:58:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-263526970',display_name='tempest-server-test-263526970',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-263526970',id=78,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:58:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-00duqttj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_i
nput_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:58:18Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=d98d7460-4481-41d1-8ad7-97b85cb698ad,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b4e1e83-23e3-4ee1-8291-e1197b83e358", "address": "fa:16:3e:c2:1a:74", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b4e1e83-23", "ovs_interfaceid": "0b4e1e83-23e3-4ee1-8291-e1197b83e358", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.491 183079 DEBUG nova.network.os_vif_util [None req-2e9319cf-d557-403f-aabd-1c68d59b03dc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "0b4e1e83-23e3-4ee1-8291-e1197b83e358", "address": "fa:16:3e:c2:1a:74", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b4e1e83-23", "ovs_interfaceid": "0b4e1e83-23e3-4ee1-8291-e1197b83e358", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.491 183079 DEBUG nova.network.os_vif_util [None req-2e9319cf-d557-403f-aabd-1c68d59b03dc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c2:1a:74,bridge_name='br-int',has_traffic_filtering=True,id=0b4e1e83-23e3-4ee1-8291-e1197b83e358,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b4e1e83-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.492 183079 DEBUG os_vif [None req-2e9319cf-d557-403f-aabd-1c68d59b03dc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c2:1a:74,bridge_name='br-int',has_traffic_filtering=True,id=0b4e1e83-23e3-4ee1-8291-e1197b83e358,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b4e1e83-23') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.493 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.493 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0b4e1e83-23, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.496 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.498 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.501 183079 INFO os_vif [None req-2e9319cf-d557-403f-aabd-1c68d59b03dc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c2:1a:74,bridge_name='br-int',has_traffic_filtering=True,id=0b4e1e83-23e3-4ee1-8291-e1197b83e358,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b4e1e83-23')
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.501 183079 INFO nova.virt.libvirt.driver [None req-2e9319cf-d557-403f-aabd-1c68d59b03dc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Deleting instance files /var/lib/nova/instances/d98d7460-4481-41d1-8ad7-97b85cb698ad_del
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.502 183079 INFO nova.virt.libvirt.driver [None req-2e9319cf-d557-403f-aabd-1c68d59b03dc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Deletion of /var/lib/nova/instances/d98d7460-4481-41d1-8ad7-97b85cb698ad_del complete
Jan 22 17:58:52 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4dd3a4d975f00934ea32e1963c2386f1e9c71781266bac796c248aa19d53a2b1-userdata-shm.mount: Deactivated successfully.
Jan 22 17:58:52 compute-0 systemd[1]: var-lib-containers-storage-overlay-d0870b1da9ab0907d4c6a023c91b7e6cfbc78074f74b1ac3823a459d00e47bef-merged.mount: Deactivated successfully.
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.579 183079 INFO nova.compute.manager [None req-2e9319cf-d557-403f-aabd-1c68d59b03dc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Took 0.37 seconds to destroy the instance on the hypervisor.
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.581 183079 DEBUG oslo.service.loopingcall [None req-2e9319cf-d557-403f-aabd-1c68d59b03dc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.581 183079 DEBUG nova.compute.manager [-] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.582 183079 DEBUG nova.network.neutron [-] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:58:52 compute-0 podman[245252]: 2026-01-22 17:58:52.63404963 +0000 UTC m=+0.274735207 container cleanup 4dd3a4d975f00934ea32e1963c2386f1e9c71781266bac796c248aa19d53a2b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 17:58:52 compute-0 systemd[1]: libpod-conmon-4dd3a4d975f00934ea32e1963c2386f1e9c71781266bac796c248aa19d53a2b1.scope: Deactivated successfully.
Jan 22 17:58:52 compute-0 podman[245293]: 2026-01-22 17:58:52.796716851 +0000 UTC m=+0.142786365 container remove 4dd3a4d975f00934ea32e1963c2386f1e9c71781266bac796c248aa19d53a2b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:58:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:52.801 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0488062c-0456-47c0-890c-b8a7464491e0]: (4, ('Thu Jan 22 05:58:52 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 (4dd3a4d975f00934ea32e1963c2386f1e9c71781266bac796c248aa19d53a2b1)\n4dd3a4d975f00934ea32e1963c2386f1e9c71781266bac796c248aa19d53a2b1\nThu Jan 22 05:58:52 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 (4dd3a4d975f00934ea32e1963c2386f1e9c71781266bac796c248aa19d53a2b1)\n4dd3a4d975f00934ea32e1963c2386f1e9c71781266bac796c248aa19d53a2b1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:58:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:52.803 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[bf6b607f-8808-4553-8fbc-66ee3aa733ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:58:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:52.804 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.805 183079 DEBUG nova.compute.manager [req-f5b7af1f-9696-45a7-9fd9-280b0c5bc38c req-2ee70a73-a792-4070-96bf-4382b2246093 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Received event network-vif-unplugged-0b4e1e83-23e3-4ee1-8291-e1197b83e358 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.805 183079 DEBUG oslo_concurrency.lockutils [req-f5b7af1f-9696-45a7-9fd9-280b0c5bc38c req-2ee70a73-a792-4070-96bf-4382b2246093 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "d98d7460-4481-41d1-8ad7-97b85cb698ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.805 183079 DEBUG oslo_concurrency.lockutils [req-f5b7af1f-9696-45a7-9fd9-280b0c5bc38c req-2ee70a73-a792-4070-96bf-4382b2246093 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "d98d7460-4481-41d1-8ad7-97b85cb698ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.806 183079 DEBUG oslo_concurrency.lockutils [req-f5b7af1f-9696-45a7-9fd9-280b0c5bc38c req-2ee70a73-a792-4070-96bf-4382b2246093 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "d98d7460-4481-41d1-8ad7-97b85cb698ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.806 183079 DEBUG nova.compute.manager [req-f5b7af1f-9696-45a7-9fd9-280b0c5bc38c req-2ee70a73-a792-4070-96bf-4382b2246093 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] No waiting events found dispatching network-vif-unplugged-0b4e1e83-23e3-4ee1-8291-e1197b83e358 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.806 183079 DEBUG nova.compute.manager [req-f5b7af1f-9696-45a7-9fd9-280b0c5bc38c req-2ee70a73-a792-4070-96bf-4382b2246093 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Received event network-vif-unplugged-0b4e1e83-23e3-4ee1-8291-e1197b83e358 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:58:52 compute-0 kernel: tap88ed9213-70: left promiscuous mode
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.806 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:52 compute-0 nova_compute[183075]: 2026-01-22 17:58:52.817 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:52.820 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ded70791-f24d-41b1-8bb6-20daa0cfd609]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:58:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:52.839 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1bcf05d5-a06b-4a59-ae52-92dfcefe94e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:58:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:52.841 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[9daf7522-27e0-4e50-87b4-d8eb3fa3eb37]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:58:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:52.855 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2be15082-1d13-4bb8-a7b9-b43578b0da84]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 706144, 'reachable_time': 33276, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245308, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:58:52 compute-0 systemd[1]: run-netns-ovnmeta\x2d88ed9213\x2d7d48\x2d4c42\x2dad60\x2dce3fcf104f71.mount: Deactivated successfully.
Jan 22 17:58:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:52.858 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:58:52 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:58:52.859 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[677fe592-1c55-44b8-ac58-0a23c306b202]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:58:53 compute-0 nova_compute[183075]: 2026-01-22 17:58:53.539 183079 DEBUG nova.network.neutron [-] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:58:53 compute-0 nova_compute[183075]: 2026-01-22 17:58:53.586 183079 INFO nova.compute.manager [-] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Took 1.00 seconds to deallocate network for instance.
Jan 22 17:58:53 compute-0 nova_compute[183075]: 2026-01-22 17:58:53.674 183079 DEBUG oslo_concurrency.lockutils [None req-2e9319cf-d557-403f-aabd-1c68d59b03dc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:58:53 compute-0 nova_compute[183075]: 2026-01-22 17:58:53.675 183079 DEBUG oslo_concurrency.lockutils [None req-2e9319cf-d557-403f-aabd-1c68d59b03dc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:58:53 compute-0 nova_compute[183075]: 2026-01-22 17:58:53.709 183079 DEBUG nova.scheduler.client.report [None req-2e9319cf-d557-403f-aabd-1c68d59b03dc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Refreshing inventories for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 17:58:53 compute-0 nova_compute[183075]: 2026-01-22 17:58:53.716 183079 DEBUG nova.network.neutron [req-a561e873-082c-497a-a0dd-479cef76604e req-fc6ae976-dfc8-4256-892d-9b4ece786458 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Updated VIF entry in instance network info cache for port 0b4e1e83-23e3-4ee1-8291-e1197b83e358. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:58:53 compute-0 nova_compute[183075]: 2026-01-22 17:58:53.716 183079 DEBUG nova.network.neutron [req-a561e873-082c-497a-a0dd-479cef76604e req-fc6ae976-dfc8-4256-892d-9b4ece786458 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Updating instance_info_cache with network_info: [{"id": "0b4e1e83-23e3-4ee1-8291-e1197b83e358", "address": "fa:16:3e:c2:1a:74", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b4e1e83-23", "ovs_interfaceid": "0b4e1e83-23e3-4ee1-8291-e1197b83e358", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:58:53 compute-0 nova_compute[183075]: 2026-01-22 17:58:53.733 183079 DEBUG nova.scheduler.client.report [None req-2e9319cf-d557-403f-aabd-1c68d59b03dc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Updating ProviderTree inventory for provider 2513134c-f67c-4237-84bf-4ebe2450d610 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 17:58:53 compute-0 nova_compute[183075]: 2026-01-22 17:58:53.733 183079 DEBUG nova.compute.provider_tree [None req-2e9319cf-d557-403f-aabd-1c68d59b03dc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Updating inventory in ProviderTree for provider 2513134c-f67c-4237-84bf-4ebe2450d610 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 17:58:53 compute-0 nova_compute[183075]: 2026-01-22 17:58:53.752 183079 DEBUG nova.scheduler.client.report [None req-2e9319cf-d557-403f-aabd-1c68d59b03dc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Refreshing aggregate associations for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 17:58:53 compute-0 nova_compute[183075]: 2026-01-22 17:58:53.767 183079 DEBUG oslo_concurrency.lockutils [req-a561e873-082c-497a-a0dd-479cef76604e req-fc6ae976-dfc8-4256-892d-9b4ece786458 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-d98d7460-4481-41d1-8ad7-97b85cb698ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:58:53 compute-0 nova_compute[183075]: 2026-01-22 17:58:53.771 183079 DEBUG nova.scheduler.client.report [None req-2e9319cf-d557-403f-aabd-1c68d59b03dc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Refreshing trait associations for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610, traits: COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SVM,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE42,COMPUTE_NODE,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE4A,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_FMA3,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_AUTO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 17:58:53 compute-0 nova_compute[183075]: 2026-01-22 17:58:53.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:58:53 compute-0 nova_compute[183075]: 2026-01-22 17:58:53.815 183079 DEBUG nova.compute.provider_tree [None req-2e9319cf-d557-403f-aabd-1c68d59b03dc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:58:53 compute-0 nova_compute[183075]: 2026-01-22 17:58:53.841 183079 DEBUG nova.scheduler.client.report [None req-2e9319cf-d557-403f-aabd-1c68d59b03dc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:58:53 compute-0 nova_compute[183075]: 2026-01-22 17:58:53.880 183079 DEBUG oslo_concurrency.lockutils [None req-2e9319cf-d557-403f-aabd-1c68d59b03dc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:58:53 compute-0 nova_compute[183075]: 2026-01-22 17:58:53.909 183079 INFO nova.scheduler.client.report [None req-2e9319cf-d557-403f-aabd-1c68d59b03dc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Deleted allocations for instance d98d7460-4481-41d1-8ad7-97b85cb698ad
Jan 22 17:58:54 compute-0 nova_compute[183075]: 2026-01-22 17:58:54.029 183079 DEBUG oslo_concurrency.lockutils [None req-2e9319cf-d557-403f-aabd-1c68d59b03dc 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "d98d7460-4481-41d1-8ad7-97b85cb698ad" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.821s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:58:54 compute-0 nova_compute[183075]: 2026-01-22 17:58:54.129 183079 DEBUG nova.compute.manager [req-aa871a95-21f9-4647-91fb-575994bd871c req-41fa11e6-0b2f-4685-8196-90295727e310 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Received event network-vif-deleted-0b4e1e83-23e3-4ee1-8291-e1197b83e358 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:58:54 compute-0 nova_compute[183075]: 2026-01-22 17:58:54.130 183079 INFO nova.compute.manager [req-aa871a95-21f9-4647-91fb-575994bd871c req-41fa11e6-0b2f-4685-8196-90295727e310 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Neutron deleted interface 0b4e1e83-23e3-4ee1-8291-e1197b83e358; detaching it from the instance and deleting it from the info cache
Jan 22 17:58:54 compute-0 nova_compute[183075]: 2026-01-22 17:58:54.130 183079 DEBUG nova.network.neutron [req-aa871a95-21f9-4647-91fb-575994bd871c req-41fa11e6-0b2f-4685-8196-90295727e310 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Jan 22 17:58:54 compute-0 nova_compute[183075]: 2026-01-22 17:58:54.132 183079 DEBUG nova.compute.manager [req-aa871a95-21f9-4647-91fb-575994bd871c req-41fa11e6-0b2f-4685-8196-90295727e310 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Detach interface failed, port_id=0b4e1e83-23e3-4ee1-8291-e1197b83e358, reason: Instance d98d7460-4481-41d1-8ad7-97b85cb698ad could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 22 17:58:54 compute-0 nova_compute[183075]: 2026-01-22 17:58:54.348 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:54 compute-0 nova_compute[183075]: 2026-01-22 17:58:54.895 183079 DEBUG nova.compute.manager [req-e504af17-6c1c-4196-a86e-965be6f8b3fb req-242180cc-423a-4c9c-9b70-5ab9f2aac782 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Received event network-vif-plugged-0b4e1e83-23e3-4ee1-8291-e1197b83e358 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:58:54 compute-0 nova_compute[183075]: 2026-01-22 17:58:54.895 183079 DEBUG oslo_concurrency.lockutils [req-e504af17-6c1c-4196-a86e-965be6f8b3fb req-242180cc-423a-4c9c-9b70-5ab9f2aac782 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "d98d7460-4481-41d1-8ad7-97b85cb698ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:58:54 compute-0 nova_compute[183075]: 2026-01-22 17:58:54.896 183079 DEBUG oslo_concurrency.lockutils [req-e504af17-6c1c-4196-a86e-965be6f8b3fb req-242180cc-423a-4c9c-9b70-5ab9f2aac782 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "d98d7460-4481-41d1-8ad7-97b85cb698ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:58:54 compute-0 nova_compute[183075]: 2026-01-22 17:58:54.896 183079 DEBUG oslo_concurrency.lockutils [req-e504af17-6c1c-4196-a86e-965be6f8b3fb req-242180cc-423a-4c9c-9b70-5ab9f2aac782 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "d98d7460-4481-41d1-8ad7-97b85cb698ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:58:54 compute-0 nova_compute[183075]: 2026-01-22 17:58:54.896 183079 DEBUG nova.compute.manager [req-e504af17-6c1c-4196-a86e-965be6f8b3fb req-242180cc-423a-4c9c-9b70-5ab9f2aac782 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] No waiting events found dispatching network-vif-plugged-0b4e1e83-23e3-4ee1-8291-e1197b83e358 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:58:54 compute-0 nova_compute[183075]: 2026-01-22 17:58:54.896 183079 WARNING nova.compute.manager [req-e504af17-6c1c-4196-a86e-965be6f8b3fb req-242180cc-423a-4c9c-9b70-5ab9f2aac782 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Received unexpected event network-vif-plugged-0b4e1e83-23e3-4ee1-8291-e1197b83e358 for instance with vm_state deleted and task_state None.
Jan 22 17:58:56 compute-0 nova_compute[183075]: 2026-01-22 17:58:56.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:58:56 compute-0 nova_compute[183075]: 2026-01-22 17:58:56.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:58:56 compute-0 nova_compute[183075]: 2026-01-22 17:58:56.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:58:56 compute-0 nova_compute[183075]: 2026-01-22 17:58:56.804 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.064 183079 DEBUG oslo_concurrency.lockutils [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "cf09548c-3631-48db-b474-279b88fc113d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.065 183079 DEBUG oslo_concurrency.lockutils [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "cf09548c-3631-48db-b474-279b88fc113d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.086 183079 DEBUG nova.compute.manager [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.156 183079 DEBUG oslo_concurrency.lockutils [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.156 183079 DEBUG oslo_concurrency.lockutils [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.164 183079 DEBUG nova.virt.hardware [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.164 183079 INFO nova.compute.claims [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Claim successful on node compute-0.ctlplane.example.com
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.289 183079 DEBUG nova.compute.provider_tree [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.307 183079 DEBUG nova.scheduler.client.report [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.330 183079 DEBUG oslo_concurrency.lockutils [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.331 183079 DEBUG nova.compute.manager [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:58:57 compute-0 podman[245309]: 2026-01-22 17:58:57.338393833 +0000 UTC m=+0.053166566 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.389 183079 DEBUG nova.compute.manager [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.391 183079 DEBUG nova.network.neutron [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.413 183079 INFO nova.virt.libvirt.driver [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.434 183079 DEBUG nova.compute.manager [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.496 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.524 183079 DEBUG nova.compute.manager [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.525 183079 DEBUG nova.virt.libvirt.driver [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.525 183079 INFO nova.virt.libvirt.driver [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Creating image(s)
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.526 183079 DEBUG oslo_concurrency.lockutils [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "/var/lib/nova/instances/cf09548c-3631-48db-b474-279b88fc113d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.526 183079 DEBUG oslo_concurrency.lockutils [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/cf09548c-3631-48db-b474-279b88fc113d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.527 183079 DEBUG oslo_concurrency.lockutils [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/cf09548c-3631-48db-b474-279b88fc113d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.543 183079 DEBUG nova.policy [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.547 183079 DEBUG oslo_concurrency.processutils [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.610 183079 DEBUG oslo_concurrency.processutils [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.611 183079 DEBUG oslo_concurrency.lockutils [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.612 183079 DEBUG oslo_concurrency.lockutils [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.624 183079 DEBUG oslo_concurrency.processutils [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.680 183079 DEBUG oslo_concurrency.processutils [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.681 183079 DEBUG oslo_concurrency.processutils [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/cf09548c-3631-48db-b474-279b88fc113d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.719 183079 DEBUG oslo_concurrency.processutils [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/cf09548c-3631-48db-b474-279b88fc113d/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.720 183079 DEBUG oslo_concurrency.lockutils [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.721 183079 DEBUG oslo_concurrency.processutils [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.773 183079 DEBUG oslo_concurrency.processutils [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.775 183079 DEBUG nova.virt.disk.api [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Checking if we can resize image /var/lib/nova/instances/cf09548c-3631-48db-b474-279b88fc113d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.775 183079 DEBUG oslo_concurrency.processutils [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cf09548c-3631-48db-b474-279b88fc113d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.791 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.811 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.812 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.812 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.812 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.851 183079 DEBUG oslo_concurrency.processutils [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cf09548c-3631-48db-b474-279b88fc113d/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.852 183079 DEBUG nova.virt.disk.api [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Cannot resize image /var/lib/nova/instances/cf09548c-3631-48db-b474-279b88fc113d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.852 183079 DEBUG nova.objects.instance [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'migration_context' on Instance uuid cf09548c-3631-48db-b474-279b88fc113d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.867 183079 DEBUG nova.virt.libvirt.driver [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.867 183079 DEBUG nova.virt.libvirt.driver [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Ensure instance console log exists: /var/lib/nova/instances/cf09548c-3631-48db-b474-279b88fc113d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.868 183079 DEBUG oslo_concurrency.lockutils [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.868 183079 DEBUG oslo_concurrency.lockutils [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.868 183079 DEBUG oslo_concurrency.lockutils [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.971 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.972 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5645MB free_disk=73.35103607177734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.972 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:58:57 compute-0 nova_compute[183075]: 2026-01-22 17:58:57.973 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:58:58 compute-0 nova_compute[183075]: 2026-01-22 17:58:58.023 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance cf09548c-3631-48db-b474-279b88fc113d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 17:58:58 compute-0 nova_compute[183075]: 2026-01-22 17:58:58.023 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:58:58 compute-0 nova_compute[183075]: 2026-01-22 17:58:58.023 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:58:58 compute-0 nova_compute[183075]: 2026-01-22 17:58:58.057 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:58:58 compute-0 nova_compute[183075]: 2026-01-22 17:58:58.069 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:58:58 compute-0 nova_compute[183075]: 2026-01-22 17:58:58.086 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:58:58 compute-0 nova_compute[183075]: 2026-01-22 17:58:58.086 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:58:59 compute-0 nova_compute[183075]: 2026-01-22 17:58:59.083 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:58:59 compute-0 nova_compute[183075]: 2026-01-22 17:58:59.083 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:58:59 compute-0 nova_compute[183075]: 2026-01-22 17:58:59.350 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:00 compute-0 nova_compute[183075]: 2026-01-22 17:59:00.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:59:01 compute-0 nova_compute[183075]: 2026-01-22 17:59:01.797 183079 DEBUG nova.network.neutron [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Successfully created port: dc533cce-5ac7-469b-ac7f-effa9e40a838 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:59:02 compute-0 nova_compute[183075]: 2026-01-22 17:59:02.548 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:02 compute-0 nova_compute[183075]: 2026-01-22 17:59:02.720 183079 DEBUG nova.network.neutron [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Successfully updated port: dc533cce-5ac7-469b-ac7f-effa9e40a838 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:59:02 compute-0 nova_compute[183075]: 2026-01-22 17:59:02.732 183079 DEBUG oslo_concurrency.lockutils [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "refresh_cache-cf09548c-3631-48db-b474-279b88fc113d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:59:02 compute-0 nova_compute[183075]: 2026-01-22 17:59:02.732 183079 DEBUG oslo_concurrency.lockutils [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquired lock "refresh_cache-cf09548c-3631-48db-b474-279b88fc113d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:59:02 compute-0 nova_compute[183075]: 2026-01-22 17:59:02.733 183079 DEBUG nova.network.neutron [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:59:02 compute-0 nova_compute[183075]: 2026-01-22 17:59:02.794 183079 DEBUG nova.compute.manager [req-182fbde5-b6e9-472e-b38f-d20c938fc2a1 req-a5efd043-ae44-4a25-82f0-dcf767c677b6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Received event network-changed-dc533cce-5ac7-469b-ac7f-effa9e40a838 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:59:02 compute-0 nova_compute[183075]: 2026-01-22 17:59:02.794 183079 DEBUG nova.compute.manager [req-182fbde5-b6e9-472e-b38f-d20c938fc2a1 req-a5efd043-ae44-4a25-82f0-dcf767c677b6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Refreshing instance network info cache due to event network-changed-dc533cce-5ac7-469b-ac7f-effa9e40a838. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:59:02 compute-0 nova_compute[183075]: 2026-01-22 17:59:02.795 183079 DEBUG oslo_concurrency.lockutils [req-182fbde5-b6e9-472e-b38f-d20c938fc2a1 req-a5efd043-ae44-4a25-82f0-dcf767c677b6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-cf09548c-3631-48db-b474-279b88fc113d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:59:02 compute-0 nova_compute[183075]: 2026-01-22 17:59:02.850 183079 DEBUG nova.network.neutron [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:59:03 compute-0 nova_compute[183075]: 2026-01-22 17:59:03.940 183079 DEBUG nova.network.neutron [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Updating instance_info_cache with network_info: [{"id": "dc533cce-5ac7-469b-ac7f-effa9e40a838", "address": "fa:16:3e:65:51:bc", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc533cce-5a", "ovs_interfaceid": "dc533cce-5ac7-469b-ac7f-effa9e40a838", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:59:03 compute-0 nova_compute[183075]: 2026-01-22 17:59:03.959 183079 DEBUG oslo_concurrency.lockutils [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Releasing lock "refresh_cache-cf09548c-3631-48db-b474-279b88fc113d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:59:03 compute-0 nova_compute[183075]: 2026-01-22 17:59:03.959 183079 DEBUG nova.compute.manager [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Instance network_info: |[{"id": "dc533cce-5ac7-469b-ac7f-effa9e40a838", "address": "fa:16:3e:65:51:bc", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc533cce-5a", "ovs_interfaceid": "dc533cce-5ac7-469b-ac7f-effa9e40a838", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:59:03 compute-0 nova_compute[183075]: 2026-01-22 17:59:03.960 183079 DEBUG oslo_concurrency.lockutils [req-182fbde5-b6e9-472e-b38f-d20c938fc2a1 req-a5efd043-ae44-4a25-82f0-dcf767c677b6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-cf09548c-3631-48db-b474-279b88fc113d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:59:03 compute-0 nova_compute[183075]: 2026-01-22 17:59:03.960 183079 DEBUG nova.network.neutron [req-182fbde5-b6e9-472e-b38f-d20c938fc2a1 req-a5efd043-ae44-4a25-82f0-dcf767c677b6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Refreshing network info cache for port dc533cce-5ac7-469b-ac7f-effa9e40a838 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:59:03 compute-0 nova_compute[183075]: 2026-01-22 17:59:03.963 183079 DEBUG nova.virt.libvirt.driver [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Start _get_guest_xml network_info=[{"id": "dc533cce-5ac7-469b-ac7f-effa9e40a838", "address": "fa:16:3e:65:51:bc", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc533cce-5a", "ovs_interfaceid": "dc533cce-5ac7-469b-ac7f-effa9e40a838", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:59:03 compute-0 nova_compute[183075]: 2026-01-22 17:59:03.969 183079 WARNING nova.virt.libvirt.driver [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:59:03 compute-0 nova_compute[183075]: 2026-01-22 17:59:03.976 183079 DEBUG nova.virt.libvirt.host [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:59:03 compute-0 nova_compute[183075]: 2026-01-22 17:59:03.977 183079 DEBUG nova.virt.libvirt.host [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:59:03 compute-0 nova_compute[183075]: 2026-01-22 17:59:03.985 183079 DEBUG nova.virt.libvirt.host [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:59:03 compute-0 nova_compute[183075]: 2026-01-22 17:59:03.986 183079 DEBUG nova.virt.libvirt.host [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:59:03 compute-0 nova_compute[183075]: 2026-01-22 17:59:03.987 183079 DEBUG nova.virt.libvirt.driver [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:59:03 compute-0 nova_compute[183075]: 2026-01-22 17:59:03.987 183079 DEBUG nova.virt.hardware [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:59:03 compute-0 nova_compute[183075]: 2026-01-22 17:59:03.987 183079 DEBUG nova.virt.hardware [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:59:03 compute-0 nova_compute[183075]: 2026-01-22 17:59:03.987 183079 DEBUG nova.virt.hardware [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:59:03 compute-0 nova_compute[183075]: 2026-01-22 17:59:03.988 183079 DEBUG nova.virt.hardware [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:59:03 compute-0 nova_compute[183075]: 2026-01-22 17:59:03.988 183079 DEBUG nova.virt.hardware [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:59:03 compute-0 nova_compute[183075]: 2026-01-22 17:59:03.988 183079 DEBUG nova.virt.hardware [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:59:03 compute-0 nova_compute[183075]: 2026-01-22 17:59:03.988 183079 DEBUG nova.virt.hardware [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:59:03 compute-0 nova_compute[183075]: 2026-01-22 17:59:03.989 183079 DEBUG nova.virt.hardware [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:59:03 compute-0 nova_compute[183075]: 2026-01-22 17:59:03.989 183079 DEBUG nova.virt.hardware [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:59:03 compute-0 nova_compute[183075]: 2026-01-22 17:59:03.989 183079 DEBUG nova.virt.hardware [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:59:03 compute-0 nova_compute[183075]: 2026-01-22 17:59:03.989 183079 DEBUG nova.virt.hardware [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:59:03 compute-0 nova_compute[183075]: 2026-01-22 17:59:03.993 183079 DEBUG nova.virt.libvirt.vif [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:58:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1105403482',display_name='tempest-server-test-1105403482',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1105403482',id=79,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-qz3but4j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:58:57Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=cf09548c-3631-48db-b474-279b88fc113d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dc533cce-5ac7-469b-ac7f-effa9e40a838", "address": "fa:16:3e:65:51:bc", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc533cce-5a", "ovs_interfaceid": "dc533cce-5ac7-469b-ac7f-effa9e40a838", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:59:03 compute-0 nova_compute[183075]: 2026-01-22 17:59:03.993 183079 DEBUG nova.network.os_vif_util [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "dc533cce-5ac7-469b-ac7f-effa9e40a838", "address": "fa:16:3e:65:51:bc", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc533cce-5a", "ovs_interfaceid": "dc533cce-5ac7-469b-ac7f-effa9e40a838", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:59:03 compute-0 nova_compute[183075]: 2026-01-22 17:59:03.994 183079 DEBUG nova.network.os_vif_util [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:51:bc,bridge_name='br-int',has_traffic_filtering=True,id=dc533cce-5ac7-469b-ac7f-effa9e40a838,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc533cce-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:59:03 compute-0 nova_compute[183075]: 2026-01-22 17:59:03.995 183079 DEBUG nova.objects.instance [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'pci_devices' on Instance uuid cf09548c-3631-48db-b474-279b88fc113d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.009 183079 DEBUG nova.virt.libvirt.driver [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:59:04 compute-0 nova_compute[183075]:   <uuid>cf09548c-3631-48db-b474-279b88fc113d</uuid>
Jan 22 17:59:04 compute-0 nova_compute[183075]:   <name>instance-0000004f</name>
Jan 22 17:59:04 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 17:59:04 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 17:59:04 compute-0 nova_compute[183075]:   <metadata>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:59:04 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1105403482</nova:name>
Jan 22 17:59:04 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 17:59:03</nova:creationTime>
Jan 22 17:59:04 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 17:59:04 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 17:59:04 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 17:59:04 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 17:59:04 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:59:04 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 17:59:04 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 17:59:04 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 17:59:04 compute-0 nova_compute[183075]:         <nova:user uuid="66c24efd0cd24c57803ea8508e679d06">tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member</nova:user>
Jan 22 17:59:04 compute-0 nova_compute[183075]:         <nova:project uuid="cbff5cec4b9c4d2eb317124ca20fea9f">tempest-StatelessNetworkSecGroupIPv4Test-1348485316</nova:project>
Jan 22 17:59:04 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 17:59:04 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 17:59:04 compute-0 nova_compute[183075]:         <nova:port uuid="dc533cce-5ac7-469b-ac7f-effa9e40a838">
Jan 22 17:59:04 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 17:59:04 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 17:59:04 compute-0 nova_compute[183075]:   </metadata>
Jan 22 17:59:04 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <system>
Jan 22 17:59:04 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 17:59:04 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 17:59:04 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:59:04 compute-0 nova_compute[183075]:       <entry name="serial">cf09548c-3631-48db-b474-279b88fc113d</entry>
Jan 22 17:59:04 compute-0 nova_compute[183075]:       <entry name="uuid">cf09548c-3631-48db-b474-279b88fc113d</entry>
Jan 22 17:59:04 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     </system>
Jan 22 17:59:04 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 17:59:04 compute-0 nova_compute[183075]:   <os>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:   </os>
Jan 22 17:59:04 compute-0 nova_compute[183075]:   <features>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <apic/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:   </features>
Jan 22 17:59:04 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:   </clock>
Jan 22 17:59:04 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:   </cpu>
Jan 22 17:59:04 compute-0 nova_compute[183075]:   <devices>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 17:59:04 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/cf09548c-3631-48db-b474-279b88fc113d/disk"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     </disk>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 17:59:04 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:65:51:bc"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:       <target dev="tapdc533cce-5a"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     </interface>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 17:59:04 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/cf09548c-3631-48db-b474-279b88fc113d/console.log" append="off"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     </serial>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <video>
Jan 22 17:59:04 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     </video>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 17:59:04 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     </rng>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 17:59:04 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 17:59:04 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 17:59:04 compute-0 nova_compute[183075]:   </devices>
Jan 22 17:59:04 compute-0 nova_compute[183075]: </domain>
Jan 22 17:59:04 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.011 183079 DEBUG nova.compute.manager [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Preparing to wait for external event network-vif-plugged-dc533cce-5ac7-469b-ac7f-effa9e40a838 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.011 183079 DEBUG oslo_concurrency.lockutils [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "cf09548c-3631-48db-b474-279b88fc113d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.012 183079 DEBUG oslo_concurrency.lockutils [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "cf09548c-3631-48db-b474-279b88fc113d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.012 183079 DEBUG oslo_concurrency.lockutils [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "cf09548c-3631-48db-b474-279b88fc113d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.013 183079 DEBUG nova.virt.libvirt.vif [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:58:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1105403482',display_name='tempest-server-test-1105403482',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1105403482',id=79,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-qz3but4j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_
rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T17:58:57Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=cf09548c-3631-48db-b474-279b88fc113d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dc533cce-5ac7-469b-ac7f-effa9e40a838", "address": "fa:16:3e:65:51:bc", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc533cce-5a", "ovs_interfaceid": "dc533cce-5ac7-469b-ac7f-effa9e40a838", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.013 183079 DEBUG nova.network.os_vif_util [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "dc533cce-5ac7-469b-ac7f-effa9e40a838", "address": "fa:16:3e:65:51:bc", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc533cce-5a", "ovs_interfaceid": "dc533cce-5ac7-469b-ac7f-effa9e40a838", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.014 183079 DEBUG nova.network.os_vif_util [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:51:bc,bridge_name='br-int',has_traffic_filtering=True,id=dc533cce-5ac7-469b-ac7f-effa9e40a838,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc533cce-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.014 183079 DEBUG os_vif [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:51:bc,bridge_name='br-int',has_traffic_filtering=True,id=dc533cce-5ac7-469b-ac7f-effa9e40a838,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc533cce-5a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.015 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.015 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.015 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.018 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.018 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdc533cce-5a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.018 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdc533cce-5a, col_values=(('external_ids', {'iface-id': 'dc533cce-5ac7-469b-ac7f-effa9e40a838', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:65:51:bc', 'vm-uuid': 'cf09548c-3631-48db-b474-279b88fc113d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.020 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:04 compute-0 NetworkManager[55454]: <info>  [1769104744.0217] manager: (tapdc533cce-5a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/343)
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.022 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.032 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.032 183079 INFO os_vif [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:51:bc,bridge_name='br-int',has_traffic_filtering=True,id=dc533cce-5ac7-469b-ac7f-effa9e40a838,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc533cce-5a')
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.082 183079 DEBUG nova.virt.libvirt.driver [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.082 183079 DEBUG nova.virt.libvirt.driver [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No VIF found with MAC fa:16:3e:65:51:bc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:59:04 compute-0 kernel: tapdc533cce-5a: entered promiscuous mode
Jan 22 17:59:04 compute-0 NetworkManager[55454]: <info>  [1769104744.1630] manager: (tapdc533cce-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/344)
Jan 22 17:59:04 compute-0 ovn_controller[95372]: 2026-01-22T17:59:04Z|00863|binding|INFO|Claiming lport dc533cce-5ac7-469b-ac7f-effa9e40a838 for this chassis.
Jan 22 17:59:04 compute-0 ovn_controller[95372]: 2026-01-22T17:59:04Z|00864|binding|INFO|dc533cce-5ac7-469b-ac7f-effa9e40a838: Claiming fa:16:3e:65:51:bc 10.100.0.11
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.163 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:04.171 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:51:bc 10.100.0.11'], port_security=['fa:16:3e:65:51:bc 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'cf09548c-3631-48db-b474-279b88fc113d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '93cc952c-d4f7-47c9-94ed-c14dd990188b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=dc533cce-5ac7-469b-ac7f-effa9e40a838) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:04.173 104629 INFO neutron.agent.ovn.metadata.agent [-] Port dc533cce-5ac7-469b-ac7f-effa9e40a838 in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 bound to our chassis
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:04.175 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:59:04 compute-0 ovn_controller[95372]: 2026-01-22T17:59:04Z|00865|binding|INFO|Setting lport dc533cce-5ac7-469b-ac7f-effa9e40a838 up in Southbound
Jan 22 17:59:04 compute-0 ovn_controller[95372]: 2026-01-22T17:59:04Z|00866|binding|INFO|Setting lport dc533cce-5ac7-469b-ac7f-effa9e40a838 ovn-installed in OVS
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.178 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.181 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:04.186 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1ab18273-7070-45ac-ab09-c79fe215e6ce]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:04.187 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap88ed9213-71 in ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:04.189 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap88ed9213-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:04.189 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2521bc30-c3e7-4330-8398-565378fd06ab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:04.191 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1896a334-4d01-41c6-abba-304332d3dcaf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:59:04 compute-0 systemd-udevd[245362]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:04.203 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[718ed40b-5e7a-409a-b714-bce4808044b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:59:04 compute-0 systemd-machined[154382]: New machine qemu-79-instance-0000004f.
Jan 22 17:59:04 compute-0 NetworkManager[55454]: <info>  [1769104744.2179] device (tapdc533cce-5a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:59:04 compute-0 NetworkManager[55454]: <info>  [1769104744.2186] device (tapdc533cce-5a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:59:04 compute-0 systemd[1]: Started Virtual Machine qemu-79-instance-0000004f.
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:04.228 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ae824e6d-030e-4319-aa3a-2fc96310a4fe]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:04.255 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[ff7fec0c-3eef-48e4-8401-8a4f4c98b836]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:04.260 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[807f8358-4308-49c6-bc72-9ab04bdd0c82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:59:04 compute-0 NetworkManager[55454]: <info>  [1769104744.2610] manager: (tap88ed9213-70): new Veth device (/org/freedesktop/NetworkManager/Devices/345)
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:04.289 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[8f400a40-2d44-499c-912f-599449f796d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:04.292 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[892eb022-8532-49f1-a9ab-53ce4b6672ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:59:04 compute-0 NetworkManager[55454]: <info>  [1769104744.3174] device (tap88ed9213-70): carrier: link connected
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:04.325 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[48f3b450-13b7-46f3-b03f-00cf85de4d60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:04.344 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d5b819f3-6bdc-4ba9-a564-789781f102c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 235], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 710801, 'reachable_time': 21320, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245394, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.352 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.359 183079 DEBUG nova.compute.manager [req-5dda8240-4635-4fbb-bf57-66e9377cc7d9 req-7184144f-4458-4877-94c7-787a84dbdb80 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Received event network-vif-plugged-dc533cce-5ac7-469b-ac7f-effa9e40a838 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.359 183079 DEBUG oslo_concurrency.lockutils [req-5dda8240-4635-4fbb-bf57-66e9377cc7d9 req-7184144f-4458-4877-94c7-787a84dbdb80 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "cf09548c-3631-48db-b474-279b88fc113d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.359 183079 DEBUG oslo_concurrency.lockutils [req-5dda8240-4635-4fbb-bf57-66e9377cc7d9 req-7184144f-4458-4877-94c7-787a84dbdb80 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "cf09548c-3631-48db-b474-279b88fc113d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.359 183079 DEBUG oslo_concurrency.lockutils [req-5dda8240-4635-4fbb-bf57-66e9377cc7d9 req-7184144f-4458-4877-94c7-787a84dbdb80 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "cf09548c-3631-48db-b474-279b88fc113d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.359 183079 DEBUG nova.compute.manager [req-5dda8240-4635-4fbb-bf57-66e9377cc7d9 req-7184144f-4458-4877-94c7-787a84dbdb80 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Processing event network-vif-plugged-dc533cce-5ac7-469b-ac7f-effa9e40a838 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:04.364 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6c8465b0-22df-45f0-96ef-5aa479d8dc9b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0a:fa93'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 710801, 'tstamp': 710801}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245395, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:04.393 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1f43ee74-cc49-4f65-8ef9-60b58e720380]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 235], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 710801, 'reachable_time': 21320, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 245396, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:04.436 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[17829056-8db6-4b00-ac3f-5f4bad2e89b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:04.502 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[85c20bc7-d3b4-43aa-92bd-d5e8400cd9e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:04.504 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:04.504 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:04.505 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88ed9213-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.507 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:04 compute-0 NetworkManager[55454]: <info>  [1769104744.5078] manager: (tap88ed9213-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/346)
Jan 22 17:59:04 compute-0 kernel: tap88ed9213-70: entered promiscuous mode
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.510 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:04.511 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88ed9213-70, col_values=(('external_ids', {'iface-id': 'da93a32e-eed6-4124-8f08-78b0bfe2cb69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.511 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:04 compute-0 ovn_controller[95372]: 2026-01-22T17:59:04Z|00867|binding|INFO|Releasing lport da93a32e-eed6-4124-8f08-78b0bfe2cb69 from this chassis (sb_readonly=0)
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.513 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:04.513 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:59:04 compute-0 nova_compute[183075]: 2026-01-22 17:59:04.523 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:04.522 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f64d7332-5b9c-4e86-9985-55568ace112c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:04.525 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]: global
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]: 
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:59:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:04.526 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'env', 'PROCESS_TAG=haproxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/88ed9213-7d48-4c42-ad60-ce3fcf104f71.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:59:04 compute-0 podman[245427]: 2026-01-22 17:59:04.906900768 +0000 UTC m=+0.068524861 container create 04a32a677217d4ab22e6e54d2d52f9ec2922348febea7da56820c714c00ee980 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 22 17:59:04 compute-0 systemd[1]: Started libpod-conmon-04a32a677217d4ab22e6e54d2d52f9ec2922348febea7da56820c714c00ee980.scope.
Jan 22 17:59:04 compute-0 systemd[1]: Started libcrun container.
Jan 22 17:59:04 compute-0 podman[245427]: 2026-01-22 17:59:04.873948469 +0000 UTC m=+0.035572592 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:59:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b13dd48e0da30bd204a7b7877a583e39df262b17dac74447d84ad5f6a35ae668/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:59:04 compute-0 podman[245427]: 2026-01-22 17:59:04.985886 +0000 UTC m=+0.147510123 container init 04a32a677217d4ab22e6e54d2d52f9ec2922348febea7da56820c714c00ee980 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:59:04 compute-0 podman[245427]: 2026-01-22 17:59:04.991771809 +0000 UTC m=+0.153395902 container start 04a32a677217d4ab22e6e54d2d52f9ec2922348febea7da56820c714c00ee980 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:59:05 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245443]: [NOTICE]   (245447) : New worker (245451) forked
Jan 22 17:59:05 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245443]: [NOTICE]   (245447) : Loading success.
Jan 22 17:59:05 compute-0 nova_compute[183075]: 2026-01-22 17:59:05.108 183079 DEBUG nova.compute.manager [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:59:05 compute-0 nova_compute[183075]: 2026-01-22 17:59:05.110 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104745.1070516, cf09548c-3631-48db-b474-279b88fc113d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:59:05 compute-0 nova_compute[183075]: 2026-01-22 17:59:05.110 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: cf09548c-3631-48db-b474-279b88fc113d] VM Started (Lifecycle Event)
Jan 22 17:59:05 compute-0 nova_compute[183075]: 2026-01-22 17:59:05.112 183079 DEBUG nova.virt.libvirt.driver [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:59:05 compute-0 nova_compute[183075]: 2026-01-22 17:59:05.116 183079 INFO nova.virt.libvirt.driver [-] [instance: cf09548c-3631-48db-b474-279b88fc113d] Instance spawned successfully.
Jan 22 17:59:05 compute-0 nova_compute[183075]: 2026-01-22 17:59:05.117 183079 DEBUG nova.virt.libvirt.driver [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:59:05 compute-0 nova_compute[183075]: 2026-01-22 17:59:05.130 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: cf09548c-3631-48db-b474-279b88fc113d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:59:05 compute-0 nova_compute[183075]: 2026-01-22 17:59:05.137 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: cf09548c-3631-48db-b474-279b88fc113d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:59:05 compute-0 nova_compute[183075]: 2026-01-22 17:59:05.140 183079 DEBUG nova.virt.libvirt.driver [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:59:05 compute-0 nova_compute[183075]: 2026-01-22 17:59:05.141 183079 DEBUG nova.virt.libvirt.driver [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:59:05 compute-0 nova_compute[183075]: 2026-01-22 17:59:05.142 183079 DEBUG nova.virt.libvirt.driver [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:59:05 compute-0 nova_compute[183075]: 2026-01-22 17:59:05.142 183079 DEBUG nova.virt.libvirt.driver [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:59:05 compute-0 nova_compute[183075]: 2026-01-22 17:59:05.143 183079 DEBUG nova.virt.libvirt.driver [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:59:05 compute-0 nova_compute[183075]: 2026-01-22 17:59:05.144 183079 DEBUG nova.virt.libvirt.driver [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:59:05 compute-0 nova_compute[183075]: 2026-01-22 17:59:05.168 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: cf09548c-3631-48db-b474-279b88fc113d] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:59:05 compute-0 nova_compute[183075]: 2026-01-22 17:59:05.170 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104745.1074402, cf09548c-3631-48db-b474-279b88fc113d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:59:05 compute-0 nova_compute[183075]: 2026-01-22 17:59:05.170 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: cf09548c-3631-48db-b474-279b88fc113d] VM Paused (Lifecycle Event)
Jan 22 17:59:05 compute-0 nova_compute[183075]: 2026-01-22 17:59:05.191 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: cf09548c-3631-48db-b474-279b88fc113d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:59:05 compute-0 nova_compute[183075]: 2026-01-22 17:59:05.195 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104745.11236, cf09548c-3631-48db-b474-279b88fc113d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:59:05 compute-0 nova_compute[183075]: 2026-01-22 17:59:05.195 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: cf09548c-3631-48db-b474-279b88fc113d] VM Resumed (Lifecycle Event)
Jan 22 17:59:05 compute-0 nova_compute[183075]: 2026-01-22 17:59:05.200 183079 INFO nova.compute.manager [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Took 7.68 seconds to spawn the instance on the hypervisor.
Jan 22 17:59:05 compute-0 nova_compute[183075]: 2026-01-22 17:59:05.200 183079 DEBUG nova.compute.manager [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:59:05 compute-0 nova_compute[183075]: 2026-01-22 17:59:05.209 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: cf09548c-3631-48db-b474-279b88fc113d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:59:05 compute-0 nova_compute[183075]: 2026-01-22 17:59:05.214 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: cf09548c-3631-48db-b474-279b88fc113d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:59:05 compute-0 nova_compute[183075]: 2026-01-22 17:59:05.242 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: cf09548c-3631-48db-b474-279b88fc113d] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:59:05 compute-0 nova_compute[183075]: 2026-01-22 17:59:05.259 183079 INFO nova.compute.manager [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Took 8.12 seconds to build instance.
Jan 22 17:59:05 compute-0 nova_compute[183075]: 2026-01-22 17:59:05.275 183079 DEBUG oslo_concurrency.lockutils [None req-dc993309-892f-48bf-8ab2-f30d862e613b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "cf09548c-3631-48db-b474-279b88fc113d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.210s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:59:05 compute-0 nova_compute[183075]: 2026-01-22 17:59:05.807 183079 DEBUG nova.network.neutron [req-182fbde5-b6e9-472e-b38f-d20c938fc2a1 req-a5efd043-ae44-4a25-82f0-dcf767c677b6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Updated VIF entry in instance network info cache for port dc533cce-5ac7-469b-ac7f-effa9e40a838. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:59:05 compute-0 nova_compute[183075]: 2026-01-22 17:59:05.807 183079 DEBUG nova.network.neutron [req-182fbde5-b6e9-472e-b38f-d20c938fc2a1 req-a5efd043-ae44-4a25-82f0-dcf767c677b6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Updating instance_info_cache with network_info: [{"id": "dc533cce-5ac7-469b-ac7f-effa9e40a838", "address": "fa:16:3e:65:51:bc", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc533cce-5a", "ovs_interfaceid": "dc533cce-5ac7-469b-ac7f-effa9e40a838", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:59:05 compute-0 nova_compute[183075]: 2026-01-22 17:59:05.825 183079 DEBUG oslo_concurrency.lockutils [req-182fbde5-b6e9-472e-b38f-d20c938fc2a1 req-a5efd043-ae44-4a25-82f0-dcf767c677b6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-cf09548c-3631-48db-b474-279b88fc113d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:59:06 compute-0 nova_compute[183075]: 2026-01-22 17:59:06.356 183079 INFO nova.compute.manager [None req-7cd4e374-11e5-440a-96b9-c60f2727d62e 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Get console output
Jan 22 17:59:06 compute-0 nova_compute[183075]: 2026-01-22 17:59:06.360 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:59:06 compute-0 nova_compute[183075]: 2026-01-22 17:59:06.419 183079 DEBUG nova.compute.manager [req-e35b5b87-d729-4af1-8dc5-abc206e57840 req-568b068b-3ec7-4fd0-ac01-89d3e0b37c8f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Received event network-vif-plugged-dc533cce-5ac7-469b-ac7f-effa9e40a838 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:59:06 compute-0 nova_compute[183075]: 2026-01-22 17:59:06.420 183079 DEBUG oslo_concurrency.lockutils [req-e35b5b87-d729-4af1-8dc5-abc206e57840 req-568b068b-3ec7-4fd0-ac01-89d3e0b37c8f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "cf09548c-3631-48db-b474-279b88fc113d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:59:06 compute-0 nova_compute[183075]: 2026-01-22 17:59:06.420 183079 DEBUG oslo_concurrency.lockutils [req-e35b5b87-d729-4af1-8dc5-abc206e57840 req-568b068b-3ec7-4fd0-ac01-89d3e0b37c8f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "cf09548c-3631-48db-b474-279b88fc113d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:59:06 compute-0 nova_compute[183075]: 2026-01-22 17:59:06.420 183079 DEBUG oslo_concurrency.lockutils [req-e35b5b87-d729-4af1-8dc5-abc206e57840 req-568b068b-3ec7-4fd0-ac01-89d3e0b37c8f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "cf09548c-3631-48db-b474-279b88fc113d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:59:06 compute-0 nova_compute[183075]: 2026-01-22 17:59:06.420 183079 DEBUG nova.compute.manager [req-e35b5b87-d729-4af1-8dc5-abc206e57840 req-568b068b-3ec7-4fd0-ac01-89d3e0b37c8f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] No waiting events found dispatching network-vif-plugged-dc533cce-5ac7-469b-ac7f-effa9e40a838 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:59:06 compute-0 nova_compute[183075]: 2026-01-22 17:59:06.421 183079 WARNING nova.compute.manager [req-e35b5b87-d729-4af1-8dc5-abc206e57840 req-568b068b-3ec7-4fd0-ac01-89d3e0b37c8f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Received unexpected event network-vif-plugged-dc533cce-5ac7-469b-ac7f-effa9e40a838 for instance with vm_state active and task_state None.
Jan 22 17:59:07 compute-0 nova_compute[183075]: 2026-01-22 17:59:07.464 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769104732.463316, d98d7460-4481-41d1-8ad7-97b85cb698ad => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:59:07 compute-0 nova_compute[183075]: 2026-01-22 17:59:07.464 183079 INFO nova.compute.manager [-] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] VM Stopped (Lifecycle Event)
Jan 22 17:59:07 compute-0 nova_compute[183075]: 2026-01-22 17:59:07.481 183079 DEBUG nova.compute.manager [None req-394b4d0e-be21-4d09-afa1-6968acd970b6 - - - - - -] [instance: d98d7460-4481-41d1-8ad7-97b85cb698ad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:59:08 compute-0 podman[245465]: 2026-01-22 17:59:08.3475445 +0000 UTC m=+0.057267356 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:59:09 compute-0 nova_compute[183075]: 2026-01-22 17:59:09.021 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:09 compute-0 nova_compute[183075]: 2026-01-22 17:59:09.354 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:11 compute-0 nova_compute[183075]: 2026-01-22 17:59:11.472 183079 INFO nova.compute.manager [None req-d8cc3b13-788b-4b1b-9fb5-cbd35a50c4b1 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Get console output
Jan 22 17:59:11 compute-0 nova_compute[183075]: 2026-01-22 17:59:11.477 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:59:11 compute-0 podman[245490]: 2026-01-22 17:59:11.848564092 +0000 UTC m=+0.050174335 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:59:14 compute-0 nova_compute[183075]: 2026-01-22 17:59:14.024 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:14 compute-0 nova_compute[183075]: 2026-01-22 17:59:14.356 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:16 compute-0 ovn_controller[95372]: 2026-01-22T17:59:16Z|00160|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:65:51:bc 10.100.0.11
Jan 22 17:59:16 compute-0 ovn_controller[95372]: 2026-01-22T17:59:16Z|00161|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:65:51:bc 10.100.0.11
Jan 22 17:59:16 compute-0 nova_compute[183075]: 2026-01-22 17:59:16.631 183079 INFO nova.compute.manager [None req-2505a869-fbad-442c-a762-25c06fc7934e 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Get console output
Jan 22 17:59:16 compute-0 nova_compute[183075]: 2026-01-22 17:59:16.637 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:59:19 compute-0 nova_compute[183075]: 2026-01-22 17:59:19.029 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:19 compute-0 nova_compute[183075]: 2026-01-22 17:59:19.360 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:21 compute-0 nova_compute[183075]: 2026-01-22 17:59:21.754 183079 INFO nova.compute.manager [None req-b3eb33f7-698d-4bbe-8a04-8552fa01e8ad 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Get console output
Jan 22 17:59:21 compute-0 nova_compute[183075]: 2026-01-22 17:59:21.759 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.137 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.139 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.729 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.729 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.5908206
Jan 22 17:59:22 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245451]: 10.100.0.11:47634 [22/Jan/2026:17:59:22.137] listener listener/metadata 0/0/0/592/592 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.738 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.738 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.750 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.750 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0123746
Jan 22 17:59:22 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245451]: 10.100.0.11:47640 [22/Jan/2026:17:59:22.737] listener listener/metadata 0/0/0/13/13 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.754 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.754 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.770 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.770 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0159998
Jan 22 17:59:22 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245451]: 10.100.0.11:47652 [22/Jan/2026:17:59:22.753] listener listener/metadata 0/0/0/16/16 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.775 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.775 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.790 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.790 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0151062
Jan 22 17:59:22 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245451]: 10.100.0.11:47666 [22/Jan/2026:17:59:22.774] listener listener/metadata 0/0/0/15/15 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.794 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.795 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.810 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.810 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0156078
Jan 22 17:59:22 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245451]: 10.100.0.11:47682 [22/Jan/2026:17:59:22.794] listener listener/metadata 0/0/0/16/16 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.815 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.816 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.830 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.830 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0147741
Jan 22 17:59:22 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245451]: 10.100.0.11:47686 [22/Jan/2026:17:59:22.815] listener listener/metadata 0/0/0/15/15 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.835 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.836 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.848 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:59:22 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245451]: 10.100.0.11:47698 [22/Jan/2026:17:59:22.835] listener listener/metadata 0/0/0/13/13 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.848 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0126040
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.852 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.853 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.864 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:59:22 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245451]: 10.100.0.11:47710 [22/Jan/2026:17:59:22.852] listener listener/metadata 0/0/0/12/12 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.865 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0119193
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.870 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.871 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.884 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:59:22 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245451]: 10.100.0.11:47716 [22/Jan/2026:17:59:22.870] listener listener/metadata 0/0/0/14/14 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.884 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0133622
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.888 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.889 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.900 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.901 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0120935
Jan 22 17:59:22 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245451]: 10.100.0.11:47728 [22/Jan/2026:17:59:22.888] listener listener/metadata 0/0/0/13/13 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.905 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.905 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.919 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0135133
Jan 22 17:59:22 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245451]: 10.100.0.11:47736 [22/Jan/2026:17:59:22.904] listener listener/metadata 0/0/0/14/14 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.927 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.928 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.942 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:59:22 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245451]: 10.100.0.11:47748 [22/Jan/2026:17:59:22.927] listener listener/metadata 0/0/0/16/16 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.943 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0149381
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.947 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.948 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.960 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:59:22 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245451]: 10.100.0.11:47752 [22/Jan/2026:17:59:22.947] listener listener/metadata 0/0/0/13/13 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.960 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0120065
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.964 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.964 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.976 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.976 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0124755
Jan 22 17:59:22 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245451]: 10.100.0.11:47754 [22/Jan/2026:17:59:22.963] listener listener/metadata 0/0/0/13/13 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.981 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.981 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.993 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.993 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0117409
Jan 22 17:59:22 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245451]: 10.100.0.11:47756 [22/Jan/2026:17:59:22.981] listener listener/metadata 0/0/0/12/12 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.997 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:22.998 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.11
Jan 22 17:59:22 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:59:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:23.008 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:59:23 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245451]: 10.100.0.11:47764 [22/Jan/2026:17:59:22.997] listener listener/metadata 0/0/0/11/11 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 17:59:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:23.008 104990 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0107257
Jan 22 17:59:23 compute-0 podman[245537]: 2026-01-22 17:59:23.355784572 +0000 UTC m=+0.059823086 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, release=1755695350, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 17:59:23 compute-0 podman[245536]: 2026-01-22 17:59:23.378700161 +0000 UTC m=+0.086112736 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 22 17:59:23 compute-0 podman[245535]: 2026-01-22 17:59:23.390385186 +0000 UTC m=+0.100824582 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller)
Jan 22 17:59:24 compute-0 nova_compute[183075]: 2026-01-22 17:59:24.032 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:24 compute-0 nova_compute[183075]: 2026-01-22 17:59:24.362 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:26 compute-0 nova_compute[183075]: 2026-01-22 17:59:26.888 183079 INFO nova.compute.manager [None req-717e5e4a-f2c4-4710-bd8b-5fc4c1bbb7ac 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Get console output
Jan 22 17:59:26 compute-0 nova_compute[183075]: 2026-01-22 17:59:26.895 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:59:28 compute-0 podman[245600]: 2026-01-22 17:59:28.37144549 +0000 UTC m=+0.080854923 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:59:29 compute-0 nova_compute[183075]: 2026-01-22 17:59:29.036 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:29 compute-0 nova_compute[183075]: 2026-01-22 17:59:29.364 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:32 compute-0 nova_compute[183075]: 2026-01-22 17:59:32.034 183079 INFO nova.compute.manager [None req-254064fa-cde2-48f7-b867-24abc3a99c26 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Get console output
Jan 22 17:59:32 compute-0 nova_compute[183075]: 2026-01-22 17:59:32.039 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:59:34 compute-0 nova_compute[183075]: 2026-01-22 17:59:34.038 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:34 compute-0 ovn_controller[95372]: 2026-01-22T17:59:34Z|00868|memory_trim|INFO|Detected inactivity (last active 30011 ms ago): trimming memory
Jan 22 17:59:34 compute-0 nova_compute[183075]: 2026-01-22 17:59:34.367 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:37 compute-0 nova_compute[183075]: 2026-01-22 17:59:37.252 183079 INFO nova.compute.manager [None req-5ea6c3b0-c271-42a3-be5b-30510e9eea07 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Get console output
Jan 22 17:59:37 compute-0 nova_compute[183075]: 2026-01-22 17:59:37.257 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:59:39 compute-0 nova_compute[183075]: 2026-01-22 17:59:39.043 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:39 compute-0 podman[245620]: 2026-01-22 17:59:39.342772227 +0000 UTC m=+0.056734982 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 17:59:39 compute-0 nova_compute[183075]: 2026-01-22 17:59:39.368 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:41.983 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:59:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:41.984 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:59:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 17:59:41.985 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:59:42 compute-0 podman[245644]: 2026-01-22 17:59:42.349733415 +0000 UTC m=+0.059038485 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:59:42 compute-0 nova_compute[183075]: 2026-01-22 17:59:42.383 183079 INFO nova.compute.manager [None req-84695e25-4bd2-4a79-92d8-c957c3a7cf14 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Get console output
Jan 22 17:59:42 compute-0 nova_compute[183075]: 2026-01-22 17:59:42.390 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:59:44 compute-0 nova_compute[183075]: 2026-01-22 17:59:44.046 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:44 compute-0 nova_compute[183075]: 2026-01-22 17:59:44.371 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:44 compute-0 nova_compute[183075]: 2026-01-22 17:59:44.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:59:46 compute-0 nova_compute[183075]: 2026-01-22 17:59:46.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:59:47 compute-0 nova_compute[183075]: 2026-01-22 17:59:47.516 183079 INFO nova.compute.manager [None req-d6711f77-2c41-4dd9-bc06-16a6c41af224 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Get console output
Jan 22 17:59:47 compute-0 nova_compute[183075]: 2026-01-22 17:59:47.521 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:59:47 compute-0 nova_compute[183075]: 2026-01-22 17:59:47.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:59:48 compute-0 nova_compute[183075]: 2026-01-22 17:59:48.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:59:49 compute-0 nova_compute[183075]: 2026-01-22 17:59:49.051 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:49 compute-0 nova_compute[183075]: 2026-01-22 17:59:49.373 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:52 compute-0 nova_compute[183075]: 2026-01-22 17:59:52.645 183079 INFO nova.compute.manager [None req-6f6d2c84-2a9d-496c-9bba-593d90882140 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Get console output
Jan 22 17:59:52 compute-0 nova_compute[183075]: 2026-01-22 17:59:52.649 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:59:54 compute-0 nova_compute[183075]: 2026-01-22 17:59:54.054 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:54 compute-0 podman[245670]: 2026-01-22 17:59:54.358774431 +0000 UTC m=+0.060641298 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-type=git, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, distribution-scope=public, container_name=openstack_network_exporter, version=9.6, vendor=Red Hat, Inc.)
Jan 22 17:59:54 compute-0 podman[245669]: 2026-01-22 17:59:54.366923241 +0000 UTC m=+0.067153714 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Jan 22 17:59:54 compute-0 nova_compute[183075]: 2026-01-22 17:59:54.375 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:54 compute-0 podman[245668]: 2026-01-22 17:59:54.38355767 +0000 UTC m=+0.094258575 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, 
maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.466 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'cf09548c-3631-48db-b474-279b88fc113d', 'name': 'tempest-server-test-1105403482', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000004f', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'hostId': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.467 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.484 12 DEBUG ceilometer.compute.pollsters [-] cf09548c-3631-48db-b474-279b88fc113d/memory.usage volume: 42.859375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '58ac07d2-2b05-49ea-bff3-4dbc080f9cc2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.859375, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'cf09548c-3631-48db-b474-279b88fc113d', 'timestamp': '2026-01-22T17:59:55.467159', 'resource_metadata': {'display_name': 'tempest-server-test-1105403482', 'name': 'instance-0000004f', 'instance_id': 'cf09548c-3631-48db-b474-279b88fc113d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '288ed022-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7159.244698441, 'message_signature': '2e8eb27362f5727d4c4bd31b0a62accc0e9e498ca2094470344393c79e62a12b'}]}, 'timestamp': '2026-01-22 17:59:55.484751', '_unique_id': '011ffe6536e84054949c325b3e0e6ede'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.485 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.486 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.492 12 DEBUG ceilometer.compute.pollsters [-] cf09548c-3631-48db-b474-279b88fc113d/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '327683b4-8f52-4d40-8728-041496db7a4d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'cf09548c-3631-48db-b474-279b88fc113d-vda', 'timestamp': '2026-01-22T17:59:55.486596', 'resource_metadata': {'display_name': 'tempest-server-test-1105403482', 'name': 'instance-0000004f', 'instance_id': 'cf09548c-3631-48db-b474-279b88fc113d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '289018ce-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7159.247033334, 'message_signature': 'd55d4f61911183bc083598382f123d2dd64b03f5c32e16a7ecf042d3bf9fbd50'}]}, 'timestamp': '2026-01-22 17:59:55.493105', '_unique_id': '36ddc582e75c467ba7f28737a45a317f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.493 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.494 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.495 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for cf09548c-3631-48db-b474-279b88fc113d / tapdc533cce-5a inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.496 12 DEBUG ceilometer.compute.pollsters [-] cf09548c-3631-48db-b474-279b88fc113d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '17fb86a1-79c4-4f9a-825a-543e2db006b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004f-cf09548c-3631-48db-b474-279b88fc113d-tapdc533cce-5a', 'timestamp': '2026-01-22T17:59:55.494335', 'resource_metadata': {'display_name': 'tempest-server-test-1105403482', 'name': 'tapdc533cce-5a', 'instance_id': 'cf09548c-3631-48db-b474-279b88fc113d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:51:bc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc533cce-5a'}, 'message_id': '289098c6-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7159.254758763, 'message_signature': 'eb73cb605b2f7612a335ab20ebcb0e59000daa27b864a769e303d30e4459a037'}]}, 'timestamp': '2026-01-22 17:59:55.496409', '_unique_id': 'bcc0cc2bd4d1445daec29d19e8aa2747'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-1105403482>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1105403482>]
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.497 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.512 12 DEBUG ceilometer.compute.pollsters [-] cf09548c-3631-48db-b474-279b88fc113d/disk.device.write.requests volume: 324 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2e4f34f8-d427-497b-b84d-cbc1de79a353', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 324, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'cf09548c-3631-48db-b474-279b88fc113d-vda', 'timestamp': '2026-01-22T17:59:55.498028', 'resource_metadata': {'display_name': 'tempest-server-test-1105403482', 'name': 'instance-0000004f', 'instance_id': 'cf09548c-3631-48db-b474-279b88fc113d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '28931b28-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7159.258511784, 'message_signature': 'b76addbd680f8eaf658024b3acfa3d2cd9fe3e8b2928d0057480a830d656152e'}]}, 'timestamp': '2026-01-22 17:59:55.512989', '_unique_id': '8bec9b4f19aa4fdeb8e96580522c9fc8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.514 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.515 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.516 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.516 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1105403482>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1105403482>]
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.516 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.516 12 DEBUG ceilometer.compute.pollsters [-] cf09548c-3631-48db-b474-279b88fc113d/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '94835027-6bbb-401e-9681-26b8a21accf8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004f-cf09548c-3631-48db-b474-279b88fc113d-tapdc533cce-5a', 'timestamp': '2026-01-22T17:59:55.516680', 'resource_metadata': {'display_name': 'tempest-server-test-1105403482', 'name': 'tapdc533cce-5a', 'instance_id': 'cf09548c-3631-48db-b474-279b88fc113d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:51:bc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc533cce-5a'}, 'message_id': '2893c3fc-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7159.254758763, 'message_signature': 'ba6dd9b188b7c7f73777f2e00374d3c25be1bf0810c46db17653240314883764'}]}, 'timestamp': '2026-01-22 17:59:55.517318', '_unique_id': '13ac48c66e954fa4b38436b19bb63480'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.518 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.519 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.520 12 DEBUG ceilometer.compute.pollsters [-] cf09548c-3631-48db-b474-279b88fc113d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9caf7bb8-90ad-4f59-987b-5e1688db22d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004f-cf09548c-3631-48db-b474-279b88fc113d-tapdc533cce-5a', 'timestamp': '2026-01-22T17:59:55.520024', 'resource_metadata': {'display_name': 'tempest-server-test-1105403482', 'name': 'tapdc533cce-5a', 'instance_id': 'cf09548c-3631-48db-b474-279b88fc113d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:51:bc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc533cce-5a'}, 'message_id': '28944066-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7159.254758763, 'message_signature': '1ef95bf9000935095846a5b5ca7ede037873211e25f44eafab89649cec7539e7'}]}, 'timestamp': '2026-01-22 17:59:55.520384', '_unique_id': '4ca2dfc109f940ba8a5fa6963e41bbb2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.521 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.522 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.522 12 DEBUG ceilometer.compute.pollsters [-] cf09548c-3631-48db-b474-279b88fc113d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '10607fa9-8083-4041-84ec-47a60ebf779e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004f-cf09548c-3631-48db-b474-279b88fc113d-tapdc533cce-5a', 'timestamp': '2026-01-22T17:59:55.522168', 'resource_metadata': {'display_name': 'tempest-server-test-1105403482', 'name': 'tapdc533cce-5a', 'instance_id': 'cf09548c-3631-48db-b474-279b88fc113d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:51:bc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc533cce-5a'}, 'message_id': '289493f4-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7159.254758763, 'message_signature': '6435299586dba498e7efdbc80d66cd21af2a99ed1419001a4e54e438ed72dcd4'}]}, 'timestamp': '2026-01-22 17:59:55.522539', '_unique_id': '1635b31700c543a484329e29d49bd051'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.523 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.524 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.524 12 DEBUG ceilometer.compute.pollsters [-] cf09548c-3631-48db-b474-279b88fc113d/disk.device.read.bytes volume: 30808576 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'decfbaea-580c-4655-b10d-7238ea7088fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30808576, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'cf09548c-3631-48db-b474-279b88fc113d-vda', 'timestamp': '2026-01-22T17:59:55.524295', 'resource_metadata': {'display_name': 'tempest-server-test-1105403482', 'name': 'instance-0000004f', 'instance_id': 'cf09548c-3631-48db-b474-279b88fc113d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2894e6ce-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7159.258511784, 'message_signature': 'a0265b52be14dcec406f4777ef07fa3bc78d2f0fd3e9b986dbd3f9101ef53297'}]}, 'timestamp': '2026-01-22 17:59:55.524651', '_unique_id': 'fa7efcdd23064c23917473ec1110f9f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.525 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.526 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.526 12 DEBUG ceilometer.compute.pollsters [-] cf09548c-3631-48db-b474-279b88fc113d/disk.device.write.bytes volume: 73011200 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e591f97c-e0e6-455b-912f-71b5c96ead48', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73011200, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'cf09548c-3631-48db-b474-279b88fc113d-vda', 'timestamp': '2026-01-22T17:59:55.526531', 'resource_metadata': {'display_name': 'tempest-server-test-1105403482', 'name': 'instance-0000004f', 'instance_id': 'cf09548c-3631-48db-b474-279b88fc113d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '28953e9e-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7159.258511784, 'message_signature': '835f74908c49bf9d89f49488b16824a867f15e2290176394798f859ed3272499'}]}, 'timestamp': '2026-01-22 17:59:55.526872', '_unique_id': '2598b86326e14619a7e133e0c52a5898'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.527 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.528 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.528 12 DEBUG ceilometer.compute.pollsters [-] cf09548c-3631-48db-b474-279b88fc113d/network.outgoing.packets volume: 131 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c1f5100a-844c-4d18-a677-3cd53e6ec0e0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 131, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004f-cf09548c-3631-48db-b474-279b88fc113d-tapdc533cce-5a', 'timestamp': '2026-01-22T17:59:55.528495', 'resource_metadata': {'display_name': 'tempest-server-test-1105403482', 'name': 'tapdc533cce-5a', 'instance_id': 'cf09548c-3631-48db-b474-279b88fc113d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:51:bc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc533cce-5a'}, 'message_id': '289594fc-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7159.254758763, 'message_signature': '53b30a0a8c96d7d67b6db7f8e014b27e75a5d0788997e1cc334039701f605b54'}]}, 'timestamp': '2026-01-22 17:59:55.529130', '_unique_id': 'edd1090e800a4629be662b4e8e4db522'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.529 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 DEBUG ceilometer.compute.pollsters [-] cf09548c-3631-48db-b474-279b88fc113d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6ef1fc0b-4eb8-4975-b854-aeea25d24afa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004f-cf09548c-3631-48db-b474-279b88fc113d-tapdc533cce-5a', 'timestamp': '2026-01-22T17:59:55.530260', 'resource_metadata': {'display_name': 'tempest-server-test-1105403482', 'name': 'tapdc533cce-5a', 'instance_id': 'cf09548c-3631-48db-b474-279b88fc113d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:51:bc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc533cce-5a'}, 'message_id': '2895cd82-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7159.254758763, 'message_signature': '0d25aadf7b6aeffeef4533f5a89c93e703e0225acc87312ba40b086ff4995815'}]}, 'timestamp': '2026-01-22 17:59:55.530487', '_unique_id': 'd19ad66c2beb49b7b4cdf14fac0b0ec1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.530 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.531 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.531 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.531 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1105403482>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1105403482>]
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.531 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.531 12 DEBUG ceilometer.compute.pollsters [-] cf09548c-3631-48db-b474-279b88fc113d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0856ebce-13d9-4c03-b563-b673f0b1619a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004f-cf09548c-3631-48db-b474-279b88fc113d-tapdc533cce-5a', 'timestamp': '2026-01-22T17:59:55.531954', 'resource_metadata': {'display_name': 'tempest-server-test-1105403482', 'name': 'tapdc533cce-5a', 'instance_id': 'cf09548c-3631-48db-b474-279b88fc113d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:51:bc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc533cce-5a'}, 'message_id': '28960fb8-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7159.254758763, 'message_signature': 'bdf8f732ce54add52e6e212b8f2b98a9d293f10e5e9129cfde75865231bb92ae'}]}, 'timestamp': '2026-01-22 17:59:55.532183', '_unique_id': '01e7c28b128d43f5b2340f94d78a46ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.532 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 DEBUG ceilometer.compute.pollsters [-] cf09548c-3631-48db-b474-279b88fc113d/disk.device.allocation volume: 30220288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e2df924b-1db1-4144-9aa3-f9f39d55649c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30220288, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'cf09548c-3631-48db-b474-279b88fc113d-vda', 'timestamp': '2026-01-22T17:59:55.533267', 'resource_metadata': {'display_name': 'tempest-server-test-1105403482', 'name': 'instance-0000004f', 'instance_id': 'cf09548c-3631-48db-b474-279b88fc113d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '289642e4-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7159.247033334, 'message_signature': '995f940669acdd8246b85baadefee06ecd19a28a48ae25c2b0db7a32ebe4d3cd'}]}, 'timestamp': '2026-01-22 17:59:55.533486', '_unique_id': '25ecf5de8d49438ea751e3d7adfa3256'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.533 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.534 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.534 12 DEBUG ceilometer.compute.pollsters [-] cf09548c-3631-48db-b474-279b88fc113d/disk.device.read.latency volume: 181697017 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c9c4260-3a9c-4d3a-910e-a31bf3b77a33', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 181697017, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'cf09548c-3631-48db-b474-279b88fc113d-vda', 'timestamp': '2026-01-22T17:59:55.534537', 'resource_metadata': {'display_name': 'tempest-server-test-1105403482', 'name': 'instance-0000004f', 'instance_id': 'cf09548c-3631-48db-b474-279b88fc113d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2896750c-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7159.258511784, 'message_signature': '7aa472e8fb599f6daa52ccf561ed87f0dc5e5dc0f90297957f7ca68a9c9aba10'}]}, 'timestamp': '2026-01-22 17:59:55.534782', '_unique_id': '1f1157cb4398455ebcbee68cf1547dda'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.535 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.536 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.536 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-1105403482>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1105403482>]
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.536 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.536 12 DEBUG ceilometer.compute.pollsters [-] cf09548c-3631-48db-b474-279b88fc113d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8f46654a-809f-4662-a906-1459200e0935', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004f-cf09548c-3631-48db-b474-279b88fc113d-tapdc533cce-5a', 'timestamp': '2026-01-22T17:59:55.536383', 'resource_metadata': {'display_name': 'tempest-server-test-1105403482', 'name': 'tapdc533cce-5a', 'instance_id': 'cf09548c-3631-48db-b474-279b88fc113d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:51:bc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc533cce-5a'}, 'message_id': '2896bdaa-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7159.254758763, 'message_signature': 'd0ce45a6d88eb70c20d893884dbbefb5a8236ce2ded90206f02f5c37306942f6'}]}, 'timestamp': '2026-01-22 17:59:55.536662', '_unique_id': '7110f36346bb4ad2b2d23f41bdd7f5fa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.537 12 DEBUG ceilometer.compute.pollsters [-] cf09548c-3631-48db-b474-279b88fc113d/cpu volume: 11540000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a150a602-e190-4c5c-941f-24618be9a275', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11540000000, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'cf09548c-3631-48db-b474-279b88fc113d', 'timestamp': '2026-01-22T17:59:55.537791', 'resource_metadata': {'display_name': 'tempest-server-test-1105403482', 'name': 'instance-0000004f', 'instance_id': 'cf09548c-3631-48db-b474-279b88fc113d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '2896f3b0-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7159.244698441, 'message_signature': 'c41dfae715db2aeaa27f3ab6cc6819fe1a9fdb0ff6341740d67ec6022098eddf'}]}, 'timestamp': '2026-01-22 17:59:55.538014', '_unique_id': 'c671dc99d0d9472fb1eab98a8cfae91d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.538 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 DEBUG ceilometer.compute.pollsters [-] cf09548c-3631-48db-b474-279b88fc113d/network.outgoing.bytes volume: 11596 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f2c05b2c-4e61-4769-a0b7-81b89e1d17f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11596, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004f-cf09548c-3631-48db-b474-279b88fc113d-tapdc533cce-5a', 'timestamp': '2026-01-22T17:59:55.539104', 'resource_metadata': {'display_name': 'tempest-server-test-1105403482', 'name': 'tapdc533cce-5a', 'instance_id': 'cf09548c-3631-48db-b474-279b88fc113d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:51:bc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc533cce-5a'}, 'message_id': '28972722-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7159.254758763, 'message_signature': '929a8f42336547b83f0f0128662af73d27e2da7aacf7f6fc3e020069184e1081'}]}, 'timestamp': '2026-01-22 17:59:55.539340', '_unique_id': '900be19105a44c2d9e3e20c153d4db0d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.539 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.540 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.540 12 DEBUG ceilometer.compute.pollsters [-] cf09548c-3631-48db-b474-279b88fc113d/disk.device.read.requests volume: 1133 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0205848d-961e-4157-a654-bb359a103ce5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1133, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'cf09548c-3631-48db-b474-279b88fc113d-vda', 'timestamp': '2026-01-22T17:59:55.540478', 'resource_metadata': {'display_name': 'tempest-server-test-1105403482', 'name': 'instance-0000004f', 'instance_id': 'cf09548c-3631-48db-b474-279b88fc113d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '28975c92-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7159.258511784, 'message_signature': '2015653b23b1691285e8fe0a9082ceaa0aba9f4c298d2dab4595c425a67f1b20'}]}, 'timestamp': '2026-01-22 17:59:55.540736', '_unique_id': '19f5e4db11bb4061b0da3c5a6f2e47e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.541 12 DEBUG ceilometer.compute.pollsters [-] cf09548c-3631-48db-b474-279b88fc113d/disk.device.write.latency volume: 2646123985 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fe0c6913-1bc8-4533-81ea-63dedc391fc1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2646123985, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'cf09548c-3631-48db-b474-279b88fc113d-vda', 'timestamp': '2026-01-22T17:59:55.541876', 'resource_metadata': {'display_name': 'tempest-server-test-1105403482', 'name': 'instance-0000004f', 'instance_id': 'cf09548c-3631-48db-b474-279b88fc113d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '289794a0-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7159.258511784, 'message_signature': '30718c72ad041b18bdb0c91b33df1e8d716705aec1c73189e60807a4667a2f35'}]}, 'timestamp': '2026-01-22 17:59:55.542164', '_unique_id': 'ca2ac4eb56264361a294b8c2053d11a8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.542 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.543 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.543 12 DEBUG ceilometer.compute.pollsters [-] cf09548c-3631-48db-b474-279b88fc113d/network.incoming.bytes volume: 7214 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '043124ad-bf02-4235-9f2c-9f1964fdec2a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7214, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-0000004f-cf09548c-3631-48db-b474-279b88fc113d-tapdc533cce-5a', 'timestamp': '2026-01-22T17:59:55.543385', 'resource_metadata': {'display_name': 'tempest-server-test-1105403482', 'name': 'tapdc533cce-5a', 'instance_id': 'cf09548c-3631-48db-b474-279b88fc113d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:51:bc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc533cce-5a'}, 'message_id': '2897ce20-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7159.254758763, 'message_signature': 'e1c91a5339ff9140a9cb7e9da8a5e73153eabadbd6dd7315f755e5f2bc24ec35'}]}, 'timestamp': '2026-01-22 17:59:55.543610', '_unique_id': '3037c611897c493ea97052ff48450cc5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.544 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 DEBUG ceilometer.compute.pollsters [-] cf09548c-3631-48db-b474-279b88fc113d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cfb87021-5e58-4bba-a159-f86e3f850a30', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'cf09548c-3631-48db-b474-279b88fc113d-vda', 'timestamp': '2026-01-22T17:59:55.545014', 'resource_metadata': {'display_name': 'tempest-server-test-1105403482', 'name': 'instance-0000004f', 'instance_id': 'cf09548c-3631-48db-b474-279b88fc113d', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '28980df4-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7159.247033334, 'message_signature': 'bcc899d0520574ae90b239013a036c8dbd7ccb4842e5a30384f78cf5a4ce710b'}]}, 'timestamp': '2026-01-22 17:59:55.545255', '_unique_id': 'f18db0f710b64b71874aa9535e96979a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:59:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 17:59:55.545 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:59:55 compute-0 nova_compute[183075]: 2026-01-22 17:59:55.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:59:56 compute-0 nova_compute[183075]: 2026-01-22 17:59:56.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:59:56 compute-0 nova_compute[183075]: 2026-01-22 17:59:56.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 17:59:57 compute-0 nova_compute[183075]: 2026-01-22 17:59:57.782 183079 INFO nova.compute.manager [None req-3eea5f82-e2dc-41f8-9171-a860ece7765b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Get console output
Jan 22 17:59:57 compute-0 nova_compute[183075]: 2026-01-22 17:59:57.787 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 17:59:57 compute-0 nova_compute[183075]: 2026-01-22 17:59:57.806 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:59:57 compute-0 nova_compute[183075]: 2026-01-22 17:59:57.807 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:59:57 compute-0 nova_compute[183075]: 2026-01-22 17:59:57.807 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:59:58 compute-0 nova_compute[183075]: 2026-01-22 17:59:58.784 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-cf09548c-3631-48db-b474-279b88fc113d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:59:58 compute-0 nova_compute[183075]: 2026-01-22 17:59:58.784 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-cf09548c-3631-48db-b474-279b88fc113d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:59:58 compute-0 nova_compute[183075]: 2026-01-22 17:59:58.784 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: cf09548c-3631-48db-b474-279b88fc113d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:59:58 compute-0 nova_compute[183075]: 2026-01-22 17:59:58.784 183079 DEBUG nova.objects.instance [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lazy-loading 'info_cache' on Instance uuid cf09548c-3631-48db-b474-279b88fc113d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:59:59 compute-0 nova_compute[183075]: 2026-01-22 17:59:59.056 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:59:59 compute-0 podman[245733]: 2026-01-22 17:59:59.345419334 +0000 UTC m=+0.056063224 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 17:59:59 compute-0 nova_compute[183075]: 2026-01-22 17:59:59.376 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:00:01 compute-0 nova_compute[183075]: 2026-01-22 18:00:01.968 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: cf09548c-3631-48db-b474-279b88fc113d] Updating instance_info_cache with network_info: [{"id": "dc533cce-5ac7-469b-ac7f-effa9e40a838", "address": "fa:16:3e:65:51:bc", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc533cce-5a", "ovs_interfaceid": "dc533cce-5ac7-469b-ac7f-effa9e40a838", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 18:00:01 compute-0 nova_compute[183075]: 2026-01-22 18:00:01.985 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-cf09548c-3631-48db-b474-279b88fc113d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 18:00:01 compute-0 nova_compute[183075]: 2026-01-22 18:00:01.985 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: cf09548c-3631-48db-b474-279b88fc113d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 18:00:01 compute-0 nova_compute[183075]: 2026-01-22 18:00:01.986 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:00:01 compute-0 nova_compute[183075]: 2026-01-22 18:00:01.986 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:00:01 compute-0 nova_compute[183075]: 2026-01-22 18:00:01.986 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 18:00:01 compute-0 nova_compute[183075]: 2026-01-22 18:00:01.986 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:00:02 compute-0 nova_compute[183075]: 2026-01-22 18:00:02.009 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:00:02 compute-0 nova_compute[183075]: 2026-01-22 18:00:02.010 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:00:02 compute-0 nova_compute[183075]: 2026-01-22 18:00:02.010 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:00:02 compute-0 nova_compute[183075]: 2026-01-22 18:00:02.010 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 18:00:02 compute-0 nova_compute[183075]: 2026-01-22 18:00:02.076 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cf09548c-3631-48db-b474-279b88fc113d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:00:02 compute-0 nova_compute[183075]: 2026-01-22 18:00:02.131 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cf09548c-3631-48db-b474-279b88fc113d/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:00:02 compute-0 nova_compute[183075]: 2026-01-22 18:00:02.132 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cf09548c-3631-48db-b474-279b88fc113d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:00:02 compute-0 nova_compute[183075]: 2026-01-22 18:00:02.187 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cf09548c-3631-48db-b474-279b88fc113d/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:00:02 compute-0 nova_compute[183075]: 2026-01-22 18:00:02.349 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 18:00:02 compute-0 nova_compute[183075]: 2026-01-22 18:00:02.350 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5556MB free_disk=73.32295989990234GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 18:00:02 compute-0 nova_compute[183075]: 2026-01-22 18:00:02.350 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:00:02 compute-0 nova_compute[183075]: 2026-01-22 18:00:02.350 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:00:02 compute-0 nova_compute[183075]: 2026-01-22 18:00:02.420 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance cf09548c-3631-48db-b474-279b88fc113d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 18:00:02 compute-0 nova_compute[183075]: 2026-01-22 18:00:02.420 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 18:00:02 compute-0 nova_compute[183075]: 2026-01-22 18:00:02.420 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 18:00:02 compute-0 nova_compute[183075]: 2026-01-22 18:00:02.462 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 18:00:02 compute-0 nova_compute[183075]: 2026-01-22 18:00:02.480 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 18:00:02 compute-0 nova_compute[183075]: 2026-01-22 18:00:02.509 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 18:00:02 compute-0 nova_compute[183075]: 2026-01-22 18:00:02.509 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.013 183079 DEBUG oslo_concurrency.lockutils [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "7fef7daf-622f-4f8a-ba6a-25fac7fd68ed" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.013 183079 DEBUG oslo_concurrency.lockutils [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "7fef7daf-622f-4f8a-ba6a-25fac7fd68ed" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.028 183079 DEBUG nova.compute.manager [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.107 183079 DEBUG oslo_concurrency.lockutils [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.108 183079 DEBUG oslo_concurrency.lockutils [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.115 183079 DEBUG nova.virt.hardware [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.115 183079 INFO nova.compute.claims [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Claim successful on node compute-0.ctlplane.example.com
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.239 183079 DEBUG nova.compute.provider_tree [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.256 183079 DEBUG nova.scheduler.client.report [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.277 183079 DEBUG oslo_concurrency.lockutils [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.278 183079 DEBUG nova.compute.manager [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.336 183079 DEBUG nova.compute.manager [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.336 183079 DEBUG nova.network.neutron [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.358 183079 INFO nova.virt.libvirt.driver [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.372 183079 DEBUG nova.compute.manager [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.476 183079 DEBUG nova.compute.manager [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.477 183079 DEBUG nova.virt.libvirt.driver [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.477 183079 INFO nova.virt.libvirt.driver [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Creating image(s)
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.478 183079 DEBUG oslo_concurrency.lockutils [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "/var/lib/nova/instances/7fef7daf-622f-4f8a-ba6a-25fac7fd68ed/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.478 183079 DEBUG oslo_concurrency.lockutils [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/7fef7daf-622f-4f8a-ba6a-25fac7fd68ed/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.479 183079 DEBUG oslo_concurrency.lockutils [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/7fef7daf-622f-4f8a-ba6a-25fac7fd68ed/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.491 183079 DEBUG oslo_concurrency.processutils [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.549 183079 DEBUG oslo_concurrency.processutils [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.550 183079 DEBUG oslo_concurrency.lockutils [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.550 183079 DEBUG oslo_concurrency.lockutils [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.565 183079 DEBUG oslo_concurrency.processutils [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.628 183079 DEBUG oslo_concurrency.processutils [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.629 183079 DEBUG oslo_concurrency.processutils [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/7fef7daf-622f-4f8a-ba6a-25fac7fd68ed/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.662 183079 DEBUG oslo_concurrency.processutils [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/7fef7daf-622f-4f8a-ba6a-25fac7fd68ed/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.663 183079 DEBUG oslo_concurrency.lockutils [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.664 183079 DEBUG oslo_concurrency.processutils [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.726 183079 DEBUG oslo_concurrency.processutils [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.727 183079 DEBUG nova.virt.disk.api [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Checking if we can resize image /var/lib/nova/instances/7fef7daf-622f-4f8a-ba6a-25fac7fd68ed/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.727 183079 DEBUG oslo_concurrency.processutils [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7fef7daf-622f-4f8a-ba6a-25fac7fd68ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.787 183079 DEBUG oslo_concurrency.processutils [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7fef7daf-622f-4f8a-ba6a-25fac7fd68ed/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.788 183079 DEBUG nova.virt.disk.api [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Cannot resize image /var/lib/nova/instances/7fef7daf-622f-4f8a-ba6a-25fac7fd68ed/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.788 183079 DEBUG nova.objects.instance [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'migration_context' on Instance uuid 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.804 183079 DEBUG nova.virt.libvirt.driver [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.804 183079 DEBUG nova.virt.libvirt.driver [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Ensure instance console log exists: /var/lib/nova/instances/7fef7daf-622f-4f8a-ba6a-25fac7fd68ed/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.805 183079 DEBUG oslo_concurrency.lockutils [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.805 183079 DEBUG oslo_concurrency.lockutils [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.805 183079 DEBUG oslo_concurrency.lockutils [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:00:03 compute-0 nova_compute[183075]: 2026-01-22 18:00:03.829 183079 DEBUG nova.policy [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 18:00:04 compute-0 nova_compute[183075]: 2026-01-22 18:00:04.060 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:00:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:04.376 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=60, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=59) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 18:00:04 compute-0 nova_compute[183075]: 2026-01-22 18:00:04.376 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:00:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:04.379 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 18:00:04 compute-0 nova_compute[183075]: 2026-01-22 18:00:04.780 183079 DEBUG nova.network.neutron [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Successfully created port: 69ac495b-e1bf-41f5-94f0-2e829df4fc35 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 18:00:05 compute-0 nova_compute[183075]: 2026-01-22 18:00:05.466 183079 DEBUG nova.network.neutron [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Successfully updated port: 69ac495b-e1bf-41f5-94f0-2e829df4fc35 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 18:00:05 compute-0 nova_compute[183075]: 2026-01-22 18:00:05.480 183079 DEBUG oslo_concurrency.lockutils [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "refresh_cache-7fef7daf-622f-4f8a-ba6a-25fac7fd68ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 18:00:05 compute-0 nova_compute[183075]: 2026-01-22 18:00:05.480 183079 DEBUG oslo_concurrency.lockutils [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquired lock "refresh_cache-7fef7daf-622f-4f8a-ba6a-25fac7fd68ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 18:00:05 compute-0 nova_compute[183075]: 2026-01-22 18:00:05.480 183079 DEBUG nova.network.neutron [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 18:00:05 compute-0 nova_compute[183075]: 2026-01-22 18:00:05.564 183079 DEBUG nova.compute.manager [req-01fd9f93-4b97-46df-ab24-b4553a41c6d1 req-9d67fb4a-2f9a-4f92-804d-ae385b7254f3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Received event network-changed-69ac495b-e1bf-41f5-94f0-2e829df4fc35 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:00:05 compute-0 nova_compute[183075]: 2026-01-22 18:00:05.565 183079 DEBUG nova.compute.manager [req-01fd9f93-4b97-46df-ab24-b4553a41c6d1 req-9d67fb4a-2f9a-4f92-804d-ae385b7254f3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Refreshing instance network info cache due to event network-changed-69ac495b-e1bf-41f5-94f0-2e829df4fc35. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 18:00:05 compute-0 nova_compute[183075]: 2026-01-22 18:00:05.565 183079 DEBUG oslo_concurrency.lockutils [req-01fd9f93-4b97-46df-ab24-b4553a41c6d1 req-9d67fb4a-2f9a-4f92-804d-ae385b7254f3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-7fef7daf-622f-4f8a-ba6a-25fac7fd68ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 18:00:05 compute-0 nova_compute[183075]: 2026-01-22 18:00:05.789 183079 DEBUG nova.network.neutron [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.503 183079 DEBUG nova.network.neutron [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Updating instance_info_cache with network_info: [{"id": "69ac495b-e1bf-41f5-94f0-2e829df4fc35", "address": "fa:16:3e:49:9e:1a", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69ac495b-e1", "ovs_interfaceid": "69ac495b-e1bf-41f5-94f0-2e829df4fc35", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.522 183079 DEBUG oslo_concurrency.lockutils [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Releasing lock "refresh_cache-7fef7daf-622f-4f8a-ba6a-25fac7fd68ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.522 183079 DEBUG nova.compute.manager [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Instance network_info: |[{"id": "69ac495b-e1bf-41f5-94f0-2e829df4fc35", "address": "fa:16:3e:49:9e:1a", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69ac495b-e1", "ovs_interfaceid": "69ac495b-e1bf-41f5-94f0-2e829df4fc35", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.523 183079 DEBUG oslo_concurrency.lockutils [req-01fd9f93-4b97-46df-ab24-b4553a41c6d1 req-9d67fb4a-2f9a-4f92-804d-ae385b7254f3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-7fef7daf-622f-4f8a-ba6a-25fac7fd68ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.523 183079 DEBUG nova.network.neutron [req-01fd9f93-4b97-46df-ab24-b4553a41c6d1 req-9d67fb4a-2f9a-4f92-804d-ae385b7254f3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Refreshing network info cache for port 69ac495b-e1bf-41f5-94f0-2e829df4fc35 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.525 183079 DEBUG nova.virt.libvirt.driver [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Start _get_guest_xml network_info=[{"id": "69ac495b-e1bf-41f5-94f0-2e829df4fc35", "address": "fa:16:3e:49:9e:1a", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69ac495b-e1", "ovs_interfaceid": "69ac495b-e1bf-41f5-94f0-2e829df4fc35", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.531 183079 WARNING nova.virt.libvirt.driver [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.534 183079 DEBUG nova.virt.libvirt.host [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.535 183079 DEBUG nova.virt.libvirt.host [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.537 183079 DEBUG nova.virt.libvirt.host [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.538 183079 DEBUG nova.virt.libvirt.host [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.538 183079 DEBUG nova.virt.libvirt.driver [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.538 183079 DEBUG nova.virt.hardware [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.539 183079 DEBUG nova.virt.hardware [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.539 183079 DEBUG nova.virt.hardware [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.539 183079 DEBUG nova.virt.hardware [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.539 183079 DEBUG nova.virt.hardware [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.539 183079 DEBUG nova.virt.hardware [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.540 183079 DEBUG nova.virt.hardware [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.540 183079 DEBUG nova.virt.hardware [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.540 183079 DEBUG nova.virt.hardware [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.540 183079 DEBUG nova.virt.hardware [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.540 183079 DEBUG nova.virt.hardware [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.543 183079 DEBUG nova.virt.libvirt.vif [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T18:00:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1402958561',display_name='tempest-server-test-1402958561',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1402958561',id=80,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-ejlisxlb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T18:00:03Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=7fef7daf-622f-4f8a-ba6a-25fac7fd68ed,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "69ac495b-e1bf-41f5-94f0-2e829df4fc35", "address": "fa:16:3e:49:9e:1a", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69ac495b-e1", "ovs_interfaceid": "69ac495b-e1bf-41f5-94f0-2e829df4fc35", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.544 183079 DEBUG nova.network.os_vif_util [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "69ac495b-e1bf-41f5-94f0-2e829df4fc35", "address": "fa:16:3e:49:9e:1a", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69ac495b-e1", "ovs_interfaceid": "69ac495b-e1bf-41f5-94f0-2e829df4fc35", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.544 183079 DEBUG nova.network.os_vif_util [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:49:9e:1a,bridge_name='br-int',has_traffic_filtering=True,id=69ac495b-e1bf-41f5-94f0-2e829df4fc35,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69ac495b-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.545 183079 DEBUG nova.objects.instance [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'pci_devices' on Instance uuid 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.556 183079 DEBUG nova.virt.libvirt.driver [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] End _get_guest_xml xml=<domain type="kvm">
Jan 22 18:00:06 compute-0 nova_compute[183075]:   <uuid>7fef7daf-622f-4f8a-ba6a-25fac7fd68ed</uuid>
Jan 22 18:00:06 compute-0 nova_compute[183075]:   <name>instance-00000050</name>
Jan 22 18:00:06 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 18:00:06 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 18:00:06 compute-0 nova_compute[183075]:   <metadata>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 18:00:06 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1402958561</nova:name>
Jan 22 18:00:06 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 18:00:06</nova:creationTime>
Jan 22 18:00:06 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 18:00:06 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 18:00:06 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 18:00:06 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 18:00:06 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 18:00:06 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 18:00:06 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 18:00:06 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 18:00:06 compute-0 nova_compute[183075]:         <nova:user uuid="66c24efd0cd24c57803ea8508e679d06">tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member</nova:user>
Jan 22 18:00:06 compute-0 nova_compute[183075]:         <nova:project uuid="cbff5cec4b9c4d2eb317124ca20fea9f">tempest-StatelessNetworkSecGroupIPv4Test-1348485316</nova:project>
Jan 22 18:00:06 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 18:00:06 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 18:00:06 compute-0 nova_compute[183075]:         <nova:port uuid="69ac495b-e1bf-41f5-94f0-2e829df4fc35">
Jan 22 18:00:06 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 18:00:06 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 18:00:06 compute-0 nova_compute[183075]:   </metadata>
Jan 22 18:00:06 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <system>
Jan 22 18:00:06 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 18:00:06 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 18:00:06 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 18:00:06 compute-0 nova_compute[183075]:       <entry name="serial">7fef7daf-622f-4f8a-ba6a-25fac7fd68ed</entry>
Jan 22 18:00:06 compute-0 nova_compute[183075]:       <entry name="uuid">7fef7daf-622f-4f8a-ba6a-25fac7fd68ed</entry>
Jan 22 18:00:06 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     </system>
Jan 22 18:00:06 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 18:00:06 compute-0 nova_compute[183075]:   <os>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:   </os>
Jan 22 18:00:06 compute-0 nova_compute[183075]:   <features>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <apic/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:   </features>
Jan 22 18:00:06 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:   </clock>
Jan 22 18:00:06 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:   </cpu>
Jan 22 18:00:06 compute-0 nova_compute[183075]:   <devices>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 18:00:06 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/7fef7daf-622f-4f8a-ba6a-25fac7fd68ed/disk"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     </disk>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 18:00:06 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:49:9e:1a"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:       <target dev="tap69ac495b-e1"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     </interface>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 18:00:06 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/7fef7daf-622f-4f8a-ba6a-25fac7fd68ed/console.log" append="off"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     </serial>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <video>
Jan 22 18:00:06 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     </video>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 18:00:06 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     </rng>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 18:00:06 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 18:00:06 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 18:00:06 compute-0 nova_compute[183075]:   </devices>
Jan 22 18:00:06 compute-0 nova_compute[183075]: </domain>
Jan 22 18:00:06 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.557 183079 DEBUG nova.compute.manager [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Preparing to wait for external event network-vif-plugged-69ac495b-e1bf-41f5-94f0-2e829df4fc35 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.558 183079 DEBUG oslo_concurrency.lockutils [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "7fef7daf-622f-4f8a-ba6a-25fac7fd68ed-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.558 183079 DEBUG oslo_concurrency.lockutils [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "7fef7daf-622f-4f8a-ba6a-25fac7fd68ed-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.558 183079 DEBUG oslo_concurrency.lockutils [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "7fef7daf-622f-4f8a-ba6a-25fac7fd68ed-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.559 183079 DEBUG nova.virt.libvirt.vif [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T18:00:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1402958561',display_name='tempest-server-test-1402958561',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1402958561',id=80,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-ejlisxlb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_
rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T18:00:03Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=7fef7daf-622f-4f8a-ba6a-25fac7fd68ed,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "69ac495b-e1bf-41f5-94f0-2e829df4fc35", "address": "fa:16:3e:49:9e:1a", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69ac495b-e1", "ovs_interfaceid": "69ac495b-e1bf-41f5-94f0-2e829df4fc35", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.559 183079 DEBUG nova.network.os_vif_util [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "69ac495b-e1bf-41f5-94f0-2e829df4fc35", "address": "fa:16:3e:49:9e:1a", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69ac495b-e1", "ovs_interfaceid": "69ac495b-e1bf-41f5-94f0-2e829df4fc35", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.559 183079 DEBUG nova.network.os_vif_util [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:49:9e:1a,bridge_name='br-int',has_traffic_filtering=True,id=69ac495b-e1bf-41f5-94f0-2e829df4fc35,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69ac495b-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.560 183079 DEBUG os_vif [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:9e:1a,bridge_name='br-int',has_traffic_filtering=True,id=69ac495b-e1bf-41f5-94f0-2e829df4fc35,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69ac495b-e1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.560 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.561 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.561 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.564 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.565 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap69ac495b-e1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.565 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap69ac495b-e1, col_values=(('external_ids', {'iface-id': '69ac495b-e1bf-41f5-94f0-2e829df4fc35', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:49:9e:1a', 'vm-uuid': '7fef7daf-622f-4f8a-ba6a-25fac7fd68ed'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.567 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.569 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 18:00:06 compute-0 NetworkManager[55454]: <info>  [1769104806.5695] manager: (tap69ac495b-e1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/347)
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.574 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.575 183079 INFO os_vif [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:9e:1a,bridge_name='br-int',has_traffic_filtering=True,id=69ac495b-e1bf-41f5-94f0-2e829df4fc35,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69ac495b-e1')
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.624 183079 DEBUG nova.virt.libvirt.driver [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.624 183079 DEBUG nova.virt.libvirt.driver [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No VIF found with MAC fa:16:3e:49:9e:1a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 18:00:06 compute-0 kernel: tap69ac495b-e1: entered promiscuous mode
Jan 22 18:00:06 compute-0 NetworkManager[55454]: <info>  [1769104806.6938] manager: (tap69ac495b-e1): new Tun device (/org/freedesktop/NetworkManager/Devices/348)
Jan 22 18:00:06 compute-0 ovn_controller[95372]: 2026-01-22T18:00:06Z|00869|binding|INFO|Claiming lport 69ac495b-e1bf-41f5-94f0-2e829df4fc35 for this chassis.
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.693 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:00:06 compute-0 ovn_controller[95372]: 2026-01-22T18:00:06Z|00870|binding|INFO|69ac495b-e1bf-41f5-94f0-2e829df4fc35: Claiming fa:16:3e:49:9e:1a 10.100.0.5
Jan 22 18:00:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:06.703 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:49:9e:1a 10.100.0.5'], port_security=['fa:16:3e:49:9e:1a 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '7fef7daf-622f-4f8a-ba6a-25fac7fd68ed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '93cc952c-d4f7-47c9-94ed-c14dd990188b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=69ac495b-e1bf-41f5-94f0-2e829df4fc35) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 18:00:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:06.704 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 69ac495b-e1bf-41f5-94f0-2e829df4fc35 in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 bound to our chassis
Jan 22 18:00:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:06.705 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 18:00:06 compute-0 ovn_controller[95372]: 2026-01-22T18:00:06Z|00871|binding|INFO|Setting lport 69ac495b-e1bf-41f5-94f0-2e829df4fc35 up in Southbound
Jan 22 18:00:06 compute-0 ovn_controller[95372]: 2026-01-22T18:00:06Z|00872|binding|INFO|Setting lport 69ac495b-e1bf-41f5-94f0-2e829df4fc35 ovn-installed in OVS
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.707 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.710 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:00:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:06.726 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6eab32e2-bad6-49a4-abeb-524a60823c15]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:00:06 compute-0 systemd-machined[154382]: New machine qemu-80-instance-00000050.
Jan 22 18:00:06 compute-0 systemd-udevd[245793]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 18:00:06 compute-0 NetworkManager[55454]: <info>  [1769104806.7448] device (tap69ac495b-e1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 18:00:06 compute-0 NetworkManager[55454]: <info>  [1769104806.7452] device (tap69ac495b-e1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 18:00:06 compute-0 systemd[1]: Started Virtual Machine qemu-80-instance-00000050.
Jan 22 18:00:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:06.762 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[178e2c7f-4a88-472e-b66f-eddc4a119c86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:00:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:06.766 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[ce98c03e-42e0-4739-be9e-ce2488cf49fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:00:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:06.792 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[965ecfc8-ccac-425f-8f65-0a4a05d4e6ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:00:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:06.807 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[11d5c624-b5b6-43d9-a32a-07ccb7a4145b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 54, 'rx_bytes': 8920, 'tx_bytes': 6132, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 54, 'rx_bytes': 8920, 'tx_bytes': 6132, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 235], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 710801, 'reachable_time': 15312, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245803, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:00:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:06.823 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d2772005-f341-45f4-a84b-da3458e88c26]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 710816, 'tstamp': 710816}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245806, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 710819, 'tstamp': 710819}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245806, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:00:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:06.825 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.826 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:00:06 compute-0 nova_compute[183075]: 2026-01-22 18:00:06.827 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:00:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:06.827 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88ed9213-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:00:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:06.828 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 18:00:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:06.828 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88ed9213-70, col_values=(('external_ids', {'iface-id': 'da93a32e-eed6-4124-8f08-78b0bfe2cb69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:00:06 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:06.828 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.260 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104807.2597647, 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.261 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] VM Started (Lifecycle Event)
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.280 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.286 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104807.260905, 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.286 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] VM Paused (Lifecycle Event)
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.311 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.315 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.335 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.633 183079 DEBUG nova.compute.manager [req-6de6fbbb-e3fa-486b-a90d-983275c94cbb req-e72cda86-c012-4736-9f68-f946038568cd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Received event network-vif-plugged-69ac495b-e1bf-41f5-94f0-2e829df4fc35 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.634 183079 DEBUG oslo_concurrency.lockutils [req-6de6fbbb-e3fa-486b-a90d-983275c94cbb req-e72cda86-c012-4736-9f68-f946038568cd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "7fef7daf-622f-4f8a-ba6a-25fac7fd68ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.635 183079 DEBUG oslo_concurrency.lockutils [req-6de6fbbb-e3fa-486b-a90d-983275c94cbb req-e72cda86-c012-4736-9f68-f946038568cd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7fef7daf-622f-4f8a-ba6a-25fac7fd68ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.635 183079 DEBUG oslo_concurrency.lockutils [req-6de6fbbb-e3fa-486b-a90d-983275c94cbb req-e72cda86-c012-4736-9f68-f946038568cd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7fef7daf-622f-4f8a-ba6a-25fac7fd68ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.636 183079 DEBUG nova.compute.manager [req-6de6fbbb-e3fa-486b-a90d-983275c94cbb req-e72cda86-c012-4736-9f68-f946038568cd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Processing event network-vif-plugged-69ac495b-e1bf-41f5-94f0-2e829df4fc35 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.637 183079 DEBUG nova.compute.manager [req-6de6fbbb-e3fa-486b-a90d-983275c94cbb req-e72cda86-c012-4736-9f68-f946038568cd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Received event network-vif-plugged-69ac495b-e1bf-41f5-94f0-2e829df4fc35 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.637 183079 DEBUG oslo_concurrency.lockutils [req-6de6fbbb-e3fa-486b-a90d-983275c94cbb req-e72cda86-c012-4736-9f68-f946038568cd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "7fef7daf-622f-4f8a-ba6a-25fac7fd68ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.638 183079 DEBUG oslo_concurrency.lockutils [req-6de6fbbb-e3fa-486b-a90d-983275c94cbb req-e72cda86-c012-4736-9f68-f946038568cd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7fef7daf-622f-4f8a-ba6a-25fac7fd68ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.638 183079 DEBUG oslo_concurrency.lockutils [req-6de6fbbb-e3fa-486b-a90d-983275c94cbb req-e72cda86-c012-4736-9f68-f946038568cd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7fef7daf-622f-4f8a-ba6a-25fac7fd68ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.639 183079 DEBUG nova.compute.manager [req-6de6fbbb-e3fa-486b-a90d-983275c94cbb req-e72cda86-c012-4736-9f68-f946038568cd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] No waiting events found dispatching network-vif-plugged-69ac495b-e1bf-41f5-94f0-2e829df4fc35 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.640 183079 WARNING nova.compute.manager [req-6de6fbbb-e3fa-486b-a90d-983275c94cbb req-e72cda86-c012-4736-9f68-f946038568cd a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Received unexpected event network-vif-plugged-69ac495b-e1bf-41f5-94f0-2e829df4fc35 for instance with vm_state building and task_state spawning.
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.641 183079 DEBUG nova.compute.manager [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.646 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104807.645749, 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.646 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] VM Resumed (Lifecycle Event)
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.648 183079 DEBUG nova.virt.libvirt.driver [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.651 183079 INFO nova.virt.libvirt.driver [-] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Instance spawned successfully.
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.651 183079 DEBUG nova.virt.libvirt.driver [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.668 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.672 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.675 183079 DEBUG nova.virt.libvirt.driver [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.676 183079 DEBUG nova.virt.libvirt.driver [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.676 183079 DEBUG nova.virt.libvirt.driver [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.677 183079 DEBUG nova.virt.libvirt.driver [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.677 183079 DEBUG nova.virt.libvirt.driver [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.677 183079 DEBUG nova.virt.libvirt.driver [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.705 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.733 183079 INFO nova.compute.manager [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Took 4.26 seconds to spawn the instance on the hypervisor.
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.734 183079 DEBUG nova.compute.manager [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.799 183079 INFO nova.compute.manager [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Took 4.72 seconds to build instance.
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.806 183079 DEBUG nova.network.neutron [req-01fd9f93-4b97-46df-ab24-b4553a41c6d1 req-9d67fb4a-2f9a-4f92-804d-ae385b7254f3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Updated VIF entry in instance network info cache for port 69ac495b-e1bf-41f5-94f0-2e829df4fc35. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.807 183079 DEBUG nova.network.neutron [req-01fd9f93-4b97-46df-ab24-b4553a41c6d1 req-9d67fb4a-2f9a-4f92-804d-ae385b7254f3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Updating instance_info_cache with network_info: [{"id": "69ac495b-e1bf-41f5-94f0-2e829df4fc35", "address": "fa:16:3e:49:9e:1a", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69ac495b-e1", "ovs_interfaceid": "69ac495b-e1bf-41f5-94f0-2e829df4fc35", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.816 183079 DEBUG oslo_concurrency.lockutils [None req-e9811f67-8aaf-4bb7-8dbf-f0e79eab2621 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "7fef7daf-622f-4f8a-ba6a-25fac7fd68ed" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.803s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:00:07 compute-0 nova_compute[183075]: 2026-01-22 18:00:07.819 183079 DEBUG oslo_concurrency.lockutils [req-01fd9f93-4b97-46df-ab24-b4553a41c6d1 req-9d67fb4a-2f9a-4f92-804d-ae385b7254f3 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-7fef7daf-622f-4f8a-ba6a-25fac7fd68ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 18:00:08 compute-0 nova_compute[183075]: 2026-01-22 18:00:08.443 183079 INFO nova.compute.manager [None req-7d7ed784-6d0e-4824-870d-b959c79a39ed 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Get console output
Jan 22 18:00:08 compute-0 nova_compute[183075]: 2026-01-22 18:00:08.448 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:00:09 compute-0 nova_compute[183075]: 2026-01-22 18:00:09.417 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:00:10 compute-0 podman[245815]: 2026-01-22 18:00:10.352081614 +0000 UTC m=+0.062294682 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 18:00:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:10.383 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '60'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:00:11 compute-0 nova_compute[183075]: 2026-01-22 18:00:11.569 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:00:13 compute-0 podman[245839]: 2026-01-22 18:00:13.350761497 +0000 UTC m=+0.064497781 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 18:00:13 compute-0 nova_compute[183075]: 2026-01-22 18:00:13.568 183079 INFO nova.compute.manager [None req-40be76c2-c523-45ba-9d7f-f6c65a7f1114 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Get console output
Jan 22 18:00:13 compute-0 nova_compute[183075]: 2026-01-22 18:00:13.575 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:00:14 compute-0 nova_compute[183075]: 2026-01-22 18:00:14.420 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:00:16 compute-0 nova_compute[183075]: 2026-01-22 18:00:16.572 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:00:18 compute-0 nova_compute[183075]: 2026-01-22 18:00:18.699 183079 INFO nova.compute.manager [None req-e5899561-1fc0-4e43-9a33-9cc1daca7c5d 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Get console output
Jan 22 18:00:18 compute-0 nova_compute[183075]: 2026-01-22 18:00:18.704 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:00:19 compute-0 nova_compute[183075]: 2026-01-22 18:00:19.423 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:00:19 compute-0 ovn_controller[95372]: 2026-01-22T18:00:19Z|00162|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:49:9e:1a 10.100.0.5
Jan 22 18:00:19 compute-0 ovn_controller[95372]: 2026-01-22T18:00:19Z|00163|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:49:9e:1a 10.100.0.5
Jan 22 18:00:20 compute-0 nova_compute[183075]: 2026-01-22 18:00:20.487 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:00:20 compute-0 nova_compute[183075]: 2026-01-22 18:00:20.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:00:21 compute-0 nova_compute[183075]: 2026-01-22 18:00:21.576 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:00:23 compute-0 nova_compute[183075]: 2026-01-22 18:00:23.844 183079 INFO nova.compute.manager [None req-351c8b87-bcc0-44ef-8aeb-d3f9b403e7f2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Get console output
Jan 22 18:00:23 compute-0 nova_compute[183075]: 2026-01-22 18:00:23.849 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:00:24 compute-0 nova_compute[183075]: 2026-01-22 18:00:24.425 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:00:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:24.904 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:00:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:24.905 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 18:00:24 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:00:24 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:00:24 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:00:24 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:00:24 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:00:24 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 18:00:24 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:00:25 compute-0 podman[245875]: 2026-01-22 18:00:25.377822521 +0000 UTC m=+0.079918059 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 22 18:00:25 compute-0 podman[245877]: 2026-01-22 18:00:25.378706394 +0000 UTC m=+0.073124104 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, name=ubi9-minimal, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 22 18:00:25 compute-0 podman[245876]: 2026-01-22 18:00:25.389355972 +0000 UTC m=+0.076750293 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:25.904 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:25.905 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.9997990
Jan 22 18:00:25 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245451]: 10.100.0.5:46692 [22/Jan/2026:18:00:24.903] listener listener/metadata 0/0/0/1001/1001 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:25.917 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:25.917 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:25.947 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:25.948 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0303929
Jan 22 18:00:25 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245451]: 10.100.0.5:46696 [22/Jan/2026:18:00:25.916] listener listener/metadata 0/0/0/31/31 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:25.952 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:25.952 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:25.967 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:25.968 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0154781
Jan 22 18:00:25 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245451]: 10.100.0.5:46704 [22/Jan/2026:18:00:25.951] listener listener/metadata 0/0/0/16/16 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:25.972 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:25.973 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:25.985 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:25.986 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0129387
Jan 22 18:00:25 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245451]: 10.100.0.5:46708 [22/Jan/2026:18:00:25.972] listener listener/metadata 0/0/0/13/13 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:25.990 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:25.991 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 18:00:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.004 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:00:26 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245451]: 10.100.0.5:46722 [22/Jan/2026:18:00:25.990] listener listener/metadata 0/0/0/14/14 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.004 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0135584
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.009 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.010 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.022 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.022 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0127358
Jan 22 18:00:26 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245451]: 10.100.0.5:46734 [22/Jan/2026:18:00:26.008] listener listener/metadata 0/0/0/13/13 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.027 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.027 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.048 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.048 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0207303
Jan 22 18:00:26 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245451]: 10.100.0.5:46738 [22/Jan/2026:18:00:26.026] listener listener/metadata 0/0/0/21/21 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.052 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.053 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.067 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.068 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0150650
Jan 22 18:00:26 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245451]: 10.100.0.5:46752 [22/Jan/2026:18:00:26.052] listener listener/metadata 0/0/0/15/15 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.074 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.074 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.086 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.086 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0118546
Jan 22 18:00:26 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245451]: 10.100.0.5:46766 [22/Jan/2026:18:00:26.073] listener listener/metadata 0/0/0/12/12 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.091 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.092 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.106 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:00:26 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245451]: 10.100.0.5:46782 [22/Jan/2026:18:00:26.091] listener listener/metadata 0/0/0/14/14 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.106 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0142865
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.110 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.111 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:00:26 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245451]: 10.100.0.5:46794 [22/Jan/2026:18:00:26.110] listener listener/metadata 0/0/0/11/11 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.122 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0113332
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.131 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.131 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.157 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.158 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0268400
Jan 22 18:00:26 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245451]: 10.100.0.5:46802 [22/Jan/2026:18:00:26.130] listener listener/metadata 0/0/0/27/27 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.163 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.163 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.176 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:00:26 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245451]: 10.100.0.5:46810 [22/Jan/2026:18:00:26.162] listener listener/metadata 0/0/0/14/14 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.177 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0133829
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.180 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.180 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.196 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:00:26 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245451]: 10.100.0.5:46818 [22/Jan/2026:18:00:26.180] listener listener/metadata 0/0/0/16/16 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.196 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0161412
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.200 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.201 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.213 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:00:26 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245451]: 10.100.0.5:46828 [22/Jan/2026:18:00:26.200] listener listener/metadata 0/0/0/12/12 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.213 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0123084
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.218 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.218 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.5
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.233 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:00:26 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245451]: 10.100.0.5:46842 [22/Jan/2026:18:00:26.217] listener listener/metadata 0/0/0/16/16 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 18:00:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:26.234 104990 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0155122
Jan 22 18:00:26 compute-0 nova_compute[183075]: 2026-01-22 18:00:26.580 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:00:26 compute-0 nova_compute[183075]: 2026-01-22 18:00:26.801 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:00:26 compute-0 nova_compute[183075]: 2026-01-22 18:00:26.802 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 18:00:26 compute-0 nova_compute[183075]: 2026-01-22 18:00:26.822 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 18:00:29 compute-0 nova_compute[183075]: 2026-01-22 18:00:29.017 183079 INFO nova.compute.manager [None req-629dae06-51b7-4a50-a9f3-6b72978e16d6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Get console output
Jan 22 18:00:29 compute-0 nova_compute[183075]: 2026-01-22 18:00:29.022 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:00:29 compute-0 nova_compute[183075]: 2026-01-22 18:00:29.427 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:00:30 compute-0 podman[245940]: 2026-01-22 18:00:30.354421064 +0000 UTC m=+0.057619556 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 18:00:31 compute-0 nova_compute[183075]: 2026-01-22 18:00:31.585 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:00:34 compute-0 nova_compute[183075]: 2026-01-22 18:00:34.145 183079 INFO nova.compute.manager [None req-318778a9-f0bf-4290-8eac-96ffd41c36c0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Get console output
Jan 22 18:00:34 compute-0 nova_compute[183075]: 2026-01-22 18:00:34.151 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:00:34 compute-0 nova_compute[183075]: 2026-01-22 18:00:34.428 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:00:36 compute-0 nova_compute[183075]: 2026-01-22 18:00:36.589 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:00:36 compute-0 ovn_controller[95372]: 2026-01-22T18:00:36Z|00873|memory_trim|INFO|Detected inactivity (last active 30012 ms ago): trimming memory
Jan 22 18:00:39 compute-0 nova_compute[183075]: 2026-01-22 18:00:39.431 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:00:39 compute-0 nova_compute[183075]: 2026-01-22 18:00:39.667 183079 INFO nova.compute.manager [None req-4b960475-b0a8-430f-ab5a-2568cef61d18 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Get console output
Jan 22 18:00:39 compute-0 nova_compute[183075]: 2026-01-22 18:00:39.672 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:00:41 compute-0 podman[245960]: 2026-01-22 18:00:41.382091142 +0000 UTC m=+0.083052083 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 18:00:41 compute-0 nova_compute[183075]: 2026-01-22 18:00:41.593 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:00:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:41.984 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:00:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:41.984 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:00:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:00:41.985 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:00:44 compute-0 podman[245984]: 2026-01-22 18:00:44.339473499 +0000 UTC m=+0.047745750 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 18:00:44 compute-0 nova_compute[183075]: 2026-01-22 18:00:44.433 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:00:44 compute-0 nova_compute[183075]: 2026-01-22 18:00:44.788 183079 INFO nova.compute.manager [None req-e32cc08f-5b9c-4975-8c6d-0a6636ea1a0a 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Get console output
Jan 22 18:00:44 compute-0 nova_compute[183075]: 2026-01-22 18:00:44.791 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:00:46 compute-0 nova_compute[183075]: 2026-01-22 18:00:46.596 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:00:46 compute-0 nova_compute[183075]: 2026-01-22 18:00:46.809 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:00:48 compute-0 nova_compute[183075]: 2026-01-22 18:00:48.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:00:48 compute-0 nova_compute[183075]: 2026-01-22 18:00:48.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:00:48 compute-0 nova_compute[183075]: 2026-01-22 18:00:48.789 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:00:49 compute-0 nova_compute[183075]: 2026-01-22 18:00:49.436 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:00:50 compute-0 nova_compute[183075]: 2026-01-22 18:00:50.036 183079 INFO nova.compute.manager [None req-ed905635-a972-4c64-a20f-bf08e1f5f9be 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Get console output
Jan 22 18:00:50 compute-0 nova_compute[183075]: 2026-01-22 18:00:50.039 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:00:51 compute-0 nova_compute[183075]: 2026-01-22 18:00:51.599 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:00:54 compute-0 nova_compute[183075]: 2026-01-22 18:00:54.444 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:00:54 compute-0 nova_compute[183075]: 2026-01-22 18:00:54.747 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:00:54 compute-0 nova_compute[183075]: 2026-01-22 18:00:54.822 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Triggering sync for uuid cf09548c-3631-48db-b474-279b88fc113d _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 22 18:00:54 compute-0 nova_compute[183075]: 2026-01-22 18:00:54.822 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Triggering sync for uuid 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 22 18:00:54 compute-0 nova_compute[183075]: 2026-01-22 18:00:54.822 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "cf09548c-3631-48db-b474-279b88fc113d" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:00:54 compute-0 nova_compute[183075]: 2026-01-22 18:00:54.823 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "cf09548c-3631-48db-b474-279b88fc113d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:00:54 compute-0 nova_compute[183075]: 2026-01-22 18:00:54.823 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "7fef7daf-622f-4f8a-ba6a-25fac7fd68ed" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:00:54 compute-0 nova_compute[183075]: 2026-01-22 18:00:54.823 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "7fef7daf-622f-4f8a-ba6a-25fac7fd68ed" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:00:54 compute-0 nova_compute[183075]: 2026-01-22 18:00:54.851 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "cf09548c-3631-48db-b474-279b88fc113d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.028s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:00:54 compute-0 nova_compute[183075]: 2026-01-22 18:00:54.852 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "7fef7daf-622f-4f8a-ba6a-25fac7fd68ed" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.029s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:00:55 compute-0 nova_compute[183075]: 2026-01-22 18:00:55.141 183079 INFO nova.compute.manager [None req-e5650c39-66c1-4b49-969c-9f501b16e871 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Get console output
Jan 22 18:00:55 compute-0 nova_compute[183075]: 2026-01-22 18:00:55.145 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:00:56 compute-0 podman[246011]: 2026-01-22 18:00:56.345569539 +0000 UTC m=+0.048964843 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 22 18:00:56 compute-0 podman[246012]: 2026-01-22 18:00:56.354581682 +0000 UTC m=+0.053718051 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-type=git, config_id=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 22 18:00:56 compute-0 podman[246010]: 2026-01-22 18:00:56.375336662 +0000 UTC m=+0.081590033 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 18:00:56 compute-0 nova_compute[183075]: 2026-01-22 18:00:56.600 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:00:57 compute-0 nova_compute[183075]: 2026-01-22 18:00:57.859 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:00:58 compute-0 nova_compute[183075]: 2026-01-22 18:00:58.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:00:58 compute-0 nova_compute[183075]: 2026-01-22 18:00:58.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 18:00:58 compute-0 nova_compute[183075]: 2026-01-22 18:00:58.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 18:00:58 compute-0 nova_compute[183075]: 2026-01-22 18:00:58.993 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-cf09548c-3631-48db-b474-279b88fc113d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 18:00:58 compute-0 nova_compute[183075]: 2026-01-22 18:00:58.993 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-cf09548c-3631-48db-b474-279b88fc113d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 18:00:58 compute-0 nova_compute[183075]: 2026-01-22 18:00:58.993 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: cf09548c-3631-48db-b474-279b88fc113d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 18:00:58 compute-0 nova_compute[183075]: 2026-01-22 18:00:58.993 183079 DEBUG nova.objects.instance [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lazy-loading 'info_cache' on Instance uuid cf09548c-3631-48db-b474-279b88fc113d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 18:00:59 compute-0 nova_compute[183075]: 2026-01-22 18:00:59.447 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:00 compute-0 nova_compute[183075]: 2026-01-22 18:01:00.277 183079 INFO nova.compute.manager [None req-58ccfe58-6a0e-4cef-91f9-8c4a83378fa3 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Get console output
Jan 22 18:01:00 compute-0 nova_compute[183075]: 2026-01-22 18:01:00.284 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:01:00 compute-0 nova_compute[183075]: 2026-01-22 18:01:00.852 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: cf09548c-3631-48db-b474-279b88fc113d] Updating instance_info_cache with network_info: [{"id": "dc533cce-5ac7-469b-ac7f-effa9e40a838", "address": "fa:16:3e:65:51:bc", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc533cce-5a", "ovs_interfaceid": "dc533cce-5ac7-469b-ac7f-effa9e40a838", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 18:01:00 compute-0 nova_compute[183075]: 2026-01-22 18:01:00.867 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-cf09548c-3631-48db-b474-279b88fc113d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 18:01:00 compute-0 nova_compute[183075]: 2026-01-22 18:01:00.868 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: cf09548c-3631-48db-b474-279b88fc113d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 18:01:00 compute-0 nova_compute[183075]: 2026-01-22 18:01:00.868 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:01:00 compute-0 nova_compute[183075]: 2026-01-22 18:01:00.868 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 18:01:00 compute-0 nova_compute[183075]: 2026-01-22 18:01:00.869 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:01:00 compute-0 nova_compute[183075]: 2026-01-22 18:01:00.891 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:01:00 compute-0 nova_compute[183075]: 2026-01-22 18:01:00.891 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:01:00 compute-0 nova_compute[183075]: 2026-01-22 18:01:00.892 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:01:00 compute-0 nova_compute[183075]: 2026-01-22 18:01:00.892 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 18:01:00 compute-0 podman[246076]: 2026-01-22 18:01:00.979449759 +0000 UTC m=+0.050726400 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 18:01:01 compute-0 nova_compute[183075]: 2026-01-22 18:01:01.018 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cf09548c-3631-48db-b474-279b88fc113d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:01:01 compute-0 nova_compute[183075]: 2026-01-22 18:01:01.073 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cf09548c-3631-48db-b474-279b88fc113d/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:01:01 compute-0 nova_compute[183075]: 2026-01-22 18:01:01.075 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cf09548c-3631-48db-b474-279b88fc113d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:01:01 compute-0 nova_compute[183075]: 2026-01-22 18:01:01.128 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cf09548c-3631-48db-b474-279b88fc113d/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:01:01 compute-0 nova_compute[183075]: 2026-01-22 18:01:01.135 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7fef7daf-622f-4f8a-ba6a-25fac7fd68ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:01:01 compute-0 nova_compute[183075]: 2026-01-22 18:01:01.199 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7fef7daf-622f-4f8a-ba6a-25fac7fd68ed/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:01:01 compute-0 nova_compute[183075]: 2026-01-22 18:01:01.200 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7fef7daf-622f-4f8a-ba6a-25fac7fd68ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:01:01 compute-0 nova_compute[183075]: 2026-01-22 18:01:01.263 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7fef7daf-622f-4f8a-ba6a-25fac7fd68ed/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:01:01 compute-0 CROND[246110]: (root) CMD (run-parts /etc/cron.hourly)
Jan 22 18:01:01 compute-0 run-parts[246113]: (/etc/cron.hourly) starting 0anacron
Jan 22 18:01:01 compute-0 run-parts[246119]: (/etc/cron.hourly) finished 0anacron
Jan 22 18:01:01 compute-0 CROND[246109]: (root) CMDEND (run-parts /etc/cron.hourly)
Jan 22 18:01:01 compute-0 nova_compute[183075]: 2026-01-22 18:01:01.435 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 18:01:01 compute-0 nova_compute[183075]: 2026-01-22 18:01:01.436 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5416MB free_disk=73.29372787475586GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 18:01:01 compute-0 nova_compute[183075]: 2026-01-22 18:01:01.437 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:01:01 compute-0 nova_compute[183075]: 2026-01-22 18:01:01.437 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:01:01 compute-0 nova_compute[183075]: 2026-01-22 18:01:01.526 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance cf09548c-3631-48db-b474-279b88fc113d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 18:01:01 compute-0 nova_compute[183075]: 2026-01-22 18:01:01.526 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 18:01:01 compute-0 nova_compute[183075]: 2026-01-22 18:01:01.526 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 18:01:01 compute-0 nova_compute[183075]: 2026-01-22 18:01:01.527 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 18:01:01 compute-0 nova_compute[183075]: 2026-01-22 18:01:01.596 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 18:01:01 compute-0 nova_compute[183075]: 2026-01-22 18:01:01.604 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:01 compute-0 nova_compute[183075]: 2026-01-22 18:01:01.609 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 18:01:01 compute-0 nova_compute[183075]: 2026-01-22 18:01:01.641 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 18:01:01 compute-0 nova_compute[183075]: 2026-01-22 18:01:01.641 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:01:01 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:01.844 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=61, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=60) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 18:01:01 compute-0 nova_compute[183075]: 2026-01-22 18:01:01.844 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:01 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:01.845 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 18:01:02 compute-0 nova_compute[183075]: 2026-01-22 18:01:02.560 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:01:03 compute-0 nova_compute[183075]: 2026-01-22 18:01:03.089 183079 DEBUG nova.compute.manager [req-af904120-09ec-41f9-8f89-b0af2e48aeb8 req-44bc75dd-f601-435d-9ca3-03d83b4023fc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Received event network-changed-dc533cce-5ac7-469b-ac7f-effa9e40a838 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:01:03 compute-0 nova_compute[183075]: 2026-01-22 18:01:03.089 183079 DEBUG nova.compute.manager [req-af904120-09ec-41f9-8f89-b0af2e48aeb8 req-44bc75dd-f601-435d-9ca3-03d83b4023fc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Refreshing instance network info cache due to event network-changed-dc533cce-5ac7-469b-ac7f-effa9e40a838. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 18:01:03 compute-0 nova_compute[183075]: 2026-01-22 18:01:03.090 183079 DEBUG oslo_concurrency.lockutils [req-af904120-09ec-41f9-8f89-b0af2e48aeb8 req-44bc75dd-f601-435d-9ca3-03d83b4023fc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-cf09548c-3631-48db-b474-279b88fc113d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 18:01:03 compute-0 nova_compute[183075]: 2026-01-22 18:01:03.090 183079 DEBUG oslo_concurrency.lockutils [req-af904120-09ec-41f9-8f89-b0af2e48aeb8 req-44bc75dd-f601-435d-9ca3-03d83b4023fc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-cf09548c-3631-48db-b474-279b88fc113d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 18:01:03 compute-0 nova_compute[183075]: 2026-01-22 18:01:03.090 183079 DEBUG nova.network.neutron [req-af904120-09ec-41f9-8f89-b0af2e48aeb8 req-44bc75dd-f601-435d-9ca3-03d83b4023fc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Refreshing network info cache for port dc533cce-5ac7-469b-ac7f-effa9e40a838 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 18:01:04 compute-0 nova_compute[183075]: 2026-01-22 18:01:04.449 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:05 compute-0 nova_compute[183075]: 2026-01-22 18:01:05.170 183079 DEBUG nova.network.neutron [req-af904120-09ec-41f9-8f89-b0af2e48aeb8 req-44bc75dd-f601-435d-9ca3-03d83b4023fc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Updated VIF entry in instance network info cache for port dc533cce-5ac7-469b-ac7f-effa9e40a838. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 18:01:05 compute-0 nova_compute[183075]: 2026-01-22 18:01:05.171 183079 DEBUG nova.network.neutron [req-af904120-09ec-41f9-8f89-b0af2e48aeb8 req-44bc75dd-f601-435d-9ca3-03d83b4023fc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Updating instance_info_cache with network_info: [{"id": "dc533cce-5ac7-469b-ac7f-effa9e40a838", "address": "fa:16:3e:65:51:bc", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc533cce-5a", "ovs_interfaceid": "dc533cce-5ac7-469b-ac7f-effa9e40a838", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 18:01:05 compute-0 nova_compute[183075]: 2026-01-22 18:01:05.194 183079 DEBUG oslo_concurrency.lockutils [req-af904120-09ec-41f9-8f89-b0af2e48aeb8 req-44bc75dd-f601-435d-9ca3-03d83b4023fc a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-cf09548c-3631-48db-b474-279b88fc113d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 18:01:05 compute-0 nova_compute[183075]: 2026-01-22 18:01:05.753 183079 DEBUG nova.compute.manager [req-e387b6a8-d725-446f-9284-94cc664c333f req-18c73864-cad1-488a-a4cd-ca7545b3b979 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Received event network-changed-69ac495b-e1bf-41f5-94f0-2e829df4fc35 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:01:05 compute-0 nova_compute[183075]: 2026-01-22 18:01:05.753 183079 DEBUG nova.compute.manager [req-e387b6a8-d725-446f-9284-94cc664c333f req-18c73864-cad1-488a-a4cd-ca7545b3b979 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Refreshing instance network info cache due to event network-changed-69ac495b-e1bf-41f5-94f0-2e829df4fc35. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 18:01:05 compute-0 nova_compute[183075]: 2026-01-22 18:01:05.754 183079 DEBUG oslo_concurrency.lockutils [req-e387b6a8-d725-446f-9284-94cc664c333f req-18c73864-cad1-488a-a4cd-ca7545b3b979 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-7fef7daf-622f-4f8a-ba6a-25fac7fd68ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 18:01:05 compute-0 nova_compute[183075]: 2026-01-22 18:01:05.754 183079 DEBUG oslo_concurrency.lockutils [req-e387b6a8-d725-446f-9284-94cc664c333f req-18c73864-cad1-488a-a4cd-ca7545b3b979 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-7fef7daf-622f-4f8a-ba6a-25fac7fd68ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 18:01:05 compute-0 nova_compute[183075]: 2026-01-22 18:01:05.755 183079 DEBUG nova.network.neutron [req-e387b6a8-d725-446f-9284-94cc664c333f req-18c73864-cad1-488a-a4cd-ca7545b3b979 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Refreshing network info cache for port 69ac495b-e1bf-41f5-94f0-2e829df4fc35 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 18:01:06 compute-0 nova_compute[183075]: 2026-01-22 18:01:06.607 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:07 compute-0 nova_compute[183075]: 2026-01-22 18:01:07.857 183079 DEBUG nova.network.neutron [req-e387b6a8-d725-446f-9284-94cc664c333f req-18c73864-cad1-488a-a4cd-ca7545b3b979 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Updated VIF entry in instance network info cache for port 69ac495b-e1bf-41f5-94f0-2e829df4fc35. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 18:01:07 compute-0 nova_compute[183075]: 2026-01-22 18:01:07.858 183079 DEBUG nova.network.neutron [req-e387b6a8-d725-446f-9284-94cc664c333f req-18c73864-cad1-488a-a4cd-ca7545b3b979 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Updating instance_info_cache with network_info: [{"id": "69ac495b-e1bf-41f5-94f0-2e829df4fc35", "address": "fa:16:3e:49:9e:1a", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69ac495b-e1", "ovs_interfaceid": "69ac495b-e1bf-41f5-94f0-2e829df4fc35", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 18:01:07 compute-0 nova_compute[183075]: 2026-01-22 18:01:07.879 183079 DEBUG oslo_concurrency.lockutils [req-e387b6a8-d725-446f-9284-94cc664c333f req-18c73864-cad1-488a-a4cd-ca7545b3b979 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-7fef7daf-622f-4f8a-ba6a-25fac7fd68ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 18:01:09 compute-0 nova_compute[183075]: 2026-01-22 18:01:09.289 183079 DEBUG oslo_concurrency.lockutils [None req-0d031274-c488-4273-b234-dc909dfd349b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "7fef7daf-622f-4f8a-ba6a-25fac7fd68ed" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:01:09 compute-0 nova_compute[183075]: 2026-01-22 18:01:09.290 183079 DEBUG oslo_concurrency.lockutils [None req-0d031274-c488-4273-b234-dc909dfd349b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "7fef7daf-622f-4f8a-ba6a-25fac7fd68ed" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:01:09 compute-0 nova_compute[183075]: 2026-01-22 18:01:09.290 183079 DEBUG oslo_concurrency.lockutils [None req-0d031274-c488-4273-b234-dc909dfd349b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "7fef7daf-622f-4f8a-ba6a-25fac7fd68ed-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:01:09 compute-0 nova_compute[183075]: 2026-01-22 18:01:09.290 183079 DEBUG oslo_concurrency.lockutils [None req-0d031274-c488-4273-b234-dc909dfd349b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "7fef7daf-622f-4f8a-ba6a-25fac7fd68ed-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:01:09 compute-0 nova_compute[183075]: 2026-01-22 18:01:09.291 183079 DEBUG oslo_concurrency.lockutils [None req-0d031274-c488-4273-b234-dc909dfd349b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "7fef7daf-622f-4f8a-ba6a-25fac7fd68ed-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:01:09 compute-0 nova_compute[183075]: 2026-01-22 18:01:09.292 183079 INFO nova.compute.manager [None req-0d031274-c488-4273-b234-dc909dfd349b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Terminating instance
Jan 22 18:01:09 compute-0 nova_compute[183075]: 2026-01-22 18:01:09.293 183079 DEBUG nova.compute.manager [None req-0d031274-c488-4273-b234-dc909dfd349b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 18:01:09 compute-0 kernel: tap69ac495b-e1 (unregistering): left promiscuous mode
Jan 22 18:01:09 compute-0 NetworkManager[55454]: <info>  [1769104869.3244] device (tap69ac495b-e1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 18:01:09 compute-0 ovn_controller[95372]: 2026-01-22T18:01:09Z|00874|binding|INFO|Releasing lport 69ac495b-e1bf-41f5-94f0-2e829df4fc35 from this chassis (sb_readonly=0)
Jan 22 18:01:09 compute-0 ovn_controller[95372]: 2026-01-22T18:01:09Z|00875|binding|INFO|Setting lport 69ac495b-e1bf-41f5-94f0-2e829df4fc35 down in Southbound
Jan 22 18:01:09 compute-0 nova_compute[183075]: 2026-01-22 18:01:09.330 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:09 compute-0 ovn_controller[95372]: 2026-01-22T18:01:09Z|00876|binding|INFO|Removing iface tap69ac495b-e1 ovn-installed in OVS
Jan 22 18:01:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:09.339 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:49:9e:1a 10.100.0.5'], port_security=['fa:16:3e:49:9e:1a 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '7fef7daf-622f-4f8a-ba6a-25fac7fd68ed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '93cc952c-d4f7-47c9-94ed-c14dd990188b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.242'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=69ac495b-e1bf-41f5-94f0-2e829df4fc35) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 18:01:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:09.341 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 69ac495b-e1bf-41f5-94f0-2e829df4fc35 in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 unbound from our chassis
Jan 22 18:01:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:09.342 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 18:01:09 compute-0 nova_compute[183075]: 2026-01-22 18:01:09.349 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:09.363 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ea25ce67-2c98-4cb6-98c7-559688ad5419]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:01:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:09.392 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[27dc62fd-4cf4-45a6-9b34-1628e8c7e139]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:01:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:09.396 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[cf9813b1-27c5-49e8-83db-03c0e29d1213]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:01:09 compute-0 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d00000050.scope: Deactivated successfully.
Jan 22 18:01:09 compute-0 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d00000050.scope: Consumed 14.547s CPU time.
Jan 22 18:01:09 compute-0 systemd-machined[154382]: Machine qemu-80-instance-00000050 terminated.
Jan 22 18:01:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:09.428 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[38f8ac0f-af6e-469a-8994-588db914d288]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:01:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:09.446 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1c03ae20-c0c8-4010-831b-943e153ea01f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 202, 'tx_packets': 105, 'rx_bytes': 17308, 'tx_bytes': 11993, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 202, 'tx_packets': 105, 'rx_bytes': 17308, 'tx_bytes': 11993, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 235], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 710801, 'reachable_time': 15312, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246133, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:01:09 compute-0 nova_compute[183075]: 2026-01-22 18:01:09.451 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:09.463 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c0e835e2-031d-4069-b208-1e5882a84f41]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 710816, 'tstamp': 710816}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246134, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap88ed9213-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 710819, 'tstamp': 710819}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246134, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:01:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:09.465 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:01:09 compute-0 nova_compute[183075]: 2026-01-22 18:01:09.467 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:09 compute-0 nova_compute[183075]: 2026-01-22 18:01:09.471 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:09.472 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88ed9213-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:01:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:09.473 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 18:01:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:09.473 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88ed9213-70, col_values=(('external_ids', {'iface-id': 'da93a32e-eed6-4124-8f08-78b0bfe2cb69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:01:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:09.473 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 18:01:09 compute-0 nova_compute[183075]: 2026-01-22 18:01:09.516 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:09 compute-0 nova_compute[183075]: 2026-01-22 18:01:09.521 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:09 compute-0 nova_compute[183075]: 2026-01-22 18:01:09.545 183079 INFO nova.virt.libvirt.driver [-] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Instance destroyed successfully.
Jan 22 18:01:09 compute-0 nova_compute[183075]: 2026-01-22 18:01:09.546 183079 DEBUG nova.objects.instance [None req-0d031274-c488-4273-b234-dc909dfd349b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'resources' on Instance uuid 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 18:01:09 compute-0 nova_compute[183075]: 2026-01-22 18:01:09.563 183079 DEBUG nova.virt.libvirt.vif [None req-0d031274-c488-4273-b234-dc909dfd349b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T18:00:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1402958561',display_name='tempest-server-test-1402958561',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1402958561',id=80,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=<?>,launch_index=0,launched_at=2026-01-22T18:00:07Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-ejlisxlb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T18:00:07Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=7fef7daf-622f-4f8a-ba6a-25fac7fd68ed,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "69ac495b-e1bf-41f5-94f0-2e829df4fc35", "address": "fa:16:3e:49:9e:1a", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69ac495b-e1", "ovs_interfaceid": "69ac495b-e1bf-41f5-94f0-2e829df4fc35", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 18:01:09 compute-0 nova_compute[183075]: 2026-01-22 18:01:09.564 183079 DEBUG nova.network.os_vif_util [None req-0d031274-c488-4273-b234-dc909dfd349b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "69ac495b-e1bf-41f5-94f0-2e829df4fc35", "address": "fa:16:3e:49:9e:1a", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69ac495b-e1", "ovs_interfaceid": "69ac495b-e1bf-41f5-94f0-2e829df4fc35", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 18:01:09 compute-0 nova_compute[183075]: 2026-01-22 18:01:09.565 183079 DEBUG nova.network.os_vif_util [None req-0d031274-c488-4273-b234-dc909dfd349b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:49:9e:1a,bridge_name='br-int',has_traffic_filtering=True,id=69ac495b-e1bf-41f5-94f0-2e829df4fc35,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69ac495b-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 18:01:09 compute-0 nova_compute[183075]: 2026-01-22 18:01:09.565 183079 DEBUG os_vif [None req-0d031274-c488-4273-b234-dc909dfd349b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:49:9e:1a,bridge_name='br-int',has_traffic_filtering=True,id=69ac495b-e1bf-41f5-94f0-2e829df4fc35,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69ac495b-e1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 18:01:09 compute-0 nova_compute[183075]: 2026-01-22 18:01:09.566 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:09 compute-0 nova_compute[183075]: 2026-01-22 18:01:09.567 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap69ac495b-e1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:01:09 compute-0 nova_compute[183075]: 2026-01-22 18:01:09.568 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:09 compute-0 nova_compute[183075]: 2026-01-22 18:01:09.569 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:09 compute-0 nova_compute[183075]: 2026-01-22 18:01:09.572 183079 INFO os_vif [None req-0d031274-c488-4273-b234-dc909dfd349b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:49:9e:1a,bridge_name='br-int',has_traffic_filtering=True,id=69ac495b-e1bf-41f5-94f0-2e829df4fc35,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69ac495b-e1')
Jan 22 18:01:09 compute-0 nova_compute[183075]: 2026-01-22 18:01:09.572 183079 INFO nova.virt.libvirt.driver [None req-0d031274-c488-4273-b234-dc909dfd349b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Deleting instance files /var/lib/nova/instances/7fef7daf-622f-4f8a-ba6a-25fac7fd68ed_del
Jan 22 18:01:09 compute-0 nova_compute[183075]: 2026-01-22 18:01:09.572 183079 INFO nova.virt.libvirt.driver [None req-0d031274-c488-4273-b234-dc909dfd349b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Deletion of /var/lib/nova/instances/7fef7daf-622f-4f8a-ba6a-25fac7fd68ed_del complete
Jan 22 18:01:09 compute-0 nova_compute[183075]: 2026-01-22 18:01:09.621 183079 INFO nova.compute.manager [None req-0d031274-c488-4273-b234-dc909dfd349b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Took 0.33 seconds to destroy the instance on the hypervisor.
Jan 22 18:01:09 compute-0 nova_compute[183075]: 2026-01-22 18:01:09.622 183079 DEBUG oslo.service.loopingcall [None req-0d031274-c488-4273-b234-dc909dfd349b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 18:01:09 compute-0 nova_compute[183075]: 2026-01-22 18:01:09.622 183079 DEBUG nova.compute.manager [-] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 18:01:09 compute-0 nova_compute[183075]: 2026-01-22 18:01:09.623 183079 DEBUG nova.network.neutron [-] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 18:01:10 compute-0 nova_compute[183075]: 2026-01-22 18:01:10.296 183079 DEBUG nova.compute.manager [req-8c314c46-cae9-463a-bc26-7e1208f73dcc req-6e3aabce-da23-4c93-9296-c72b25663ac6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Received event network-vif-unplugged-69ac495b-e1bf-41f5-94f0-2e829df4fc35 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:01:10 compute-0 nova_compute[183075]: 2026-01-22 18:01:10.296 183079 DEBUG oslo_concurrency.lockutils [req-8c314c46-cae9-463a-bc26-7e1208f73dcc req-6e3aabce-da23-4c93-9296-c72b25663ac6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "7fef7daf-622f-4f8a-ba6a-25fac7fd68ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:01:10 compute-0 nova_compute[183075]: 2026-01-22 18:01:10.296 183079 DEBUG oslo_concurrency.lockutils [req-8c314c46-cae9-463a-bc26-7e1208f73dcc req-6e3aabce-da23-4c93-9296-c72b25663ac6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7fef7daf-622f-4f8a-ba6a-25fac7fd68ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:01:10 compute-0 nova_compute[183075]: 2026-01-22 18:01:10.297 183079 DEBUG oslo_concurrency.lockutils [req-8c314c46-cae9-463a-bc26-7e1208f73dcc req-6e3aabce-da23-4c93-9296-c72b25663ac6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7fef7daf-622f-4f8a-ba6a-25fac7fd68ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:01:10 compute-0 nova_compute[183075]: 2026-01-22 18:01:10.297 183079 DEBUG nova.compute.manager [req-8c314c46-cae9-463a-bc26-7e1208f73dcc req-6e3aabce-da23-4c93-9296-c72b25663ac6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] No waiting events found dispatching network-vif-unplugged-69ac495b-e1bf-41f5-94f0-2e829df4fc35 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 18:01:10 compute-0 nova_compute[183075]: 2026-01-22 18:01:10.297 183079 DEBUG nova.compute.manager [req-8c314c46-cae9-463a-bc26-7e1208f73dcc req-6e3aabce-da23-4c93-9296-c72b25663ac6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Received event network-vif-unplugged-69ac495b-e1bf-41f5-94f0-2e829df4fc35 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 18:01:10 compute-0 nova_compute[183075]: 2026-01-22 18:01:10.842 183079 DEBUG nova.network.neutron [-] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 18:01:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:10.848 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '61'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:01:10 compute-0 nova_compute[183075]: 2026-01-22 18:01:10.863 183079 INFO nova.compute.manager [-] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Took 1.24 seconds to deallocate network for instance.
Jan 22 18:01:10 compute-0 nova_compute[183075]: 2026-01-22 18:01:10.927 183079 DEBUG nova.compute.manager [req-eec44d1e-e8ad-4e21-981e-1fcd5e5b41b3 req-56728261-80d7-43d9-a335-ebcbfdf5ef6b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Received event network-vif-deleted-69ac495b-e1bf-41f5-94f0-2e829df4fc35 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:01:10 compute-0 nova_compute[183075]: 2026-01-22 18:01:10.941 183079 DEBUG oslo_concurrency.lockutils [None req-0d031274-c488-4273-b234-dc909dfd349b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:01:10 compute-0 nova_compute[183075]: 2026-01-22 18:01:10.941 183079 DEBUG oslo_concurrency.lockutils [None req-0d031274-c488-4273-b234-dc909dfd349b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:01:11 compute-0 nova_compute[183075]: 2026-01-22 18:01:11.028 183079 DEBUG nova.compute.provider_tree [None req-0d031274-c488-4273-b234-dc909dfd349b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 18:01:11 compute-0 nova_compute[183075]: 2026-01-22 18:01:11.047 183079 DEBUG nova.scheduler.client.report [None req-0d031274-c488-4273-b234-dc909dfd349b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 18:01:11 compute-0 nova_compute[183075]: 2026-01-22 18:01:11.072 183079 DEBUG oslo_concurrency.lockutils [None req-0d031274-c488-4273-b234-dc909dfd349b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:01:11 compute-0 nova_compute[183075]: 2026-01-22 18:01:11.094 183079 INFO nova.scheduler.client.report [None req-0d031274-c488-4273-b234-dc909dfd349b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Deleted allocations for instance 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed
Jan 22 18:01:11 compute-0 nova_compute[183075]: 2026-01-22 18:01:11.156 183079 DEBUG oslo_concurrency.lockutils [None req-0d031274-c488-4273-b234-dc909dfd349b 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "7fef7daf-622f-4f8a-ba6a-25fac7fd68ed" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.866s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:01:11 compute-0 nova_compute[183075]: 2026-01-22 18:01:11.903 183079 DEBUG oslo_concurrency.lockutils [None req-513e00b8-b2b4-40f0-8c60-c14e4b950192 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "cf09548c-3631-48db-b474-279b88fc113d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:01:11 compute-0 nova_compute[183075]: 2026-01-22 18:01:11.904 183079 DEBUG oslo_concurrency.lockutils [None req-513e00b8-b2b4-40f0-8c60-c14e4b950192 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "cf09548c-3631-48db-b474-279b88fc113d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:01:11 compute-0 nova_compute[183075]: 2026-01-22 18:01:11.904 183079 DEBUG oslo_concurrency.lockutils [None req-513e00b8-b2b4-40f0-8c60-c14e4b950192 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "cf09548c-3631-48db-b474-279b88fc113d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:01:11 compute-0 nova_compute[183075]: 2026-01-22 18:01:11.904 183079 DEBUG oslo_concurrency.lockutils [None req-513e00b8-b2b4-40f0-8c60-c14e4b950192 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "cf09548c-3631-48db-b474-279b88fc113d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:01:11 compute-0 nova_compute[183075]: 2026-01-22 18:01:11.905 183079 DEBUG oslo_concurrency.lockutils [None req-513e00b8-b2b4-40f0-8c60-c14e4b950192 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "cf09548c-3631-48db-b474-279b88fc113d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:01:11 compute-0 nova_compute[183075]: 2026-01-22 18:01:11.906 183079 INFO nova.compute.manager [None req-513e00b8-b2b4-40f0-8c60-c14e4b950192 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Terminating instance
Jan 22 18:01:11 compute-0 nova_compute[183075]: 2026-01-22 18:01:11.907 183079 DEBUG nova.compute.manager [None req-513e00b8-b2b4-40f0-8c60-c14e4b950192 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 18:01:11 compute-0 kernel: tapdc533cce-5a (unregistering): left promiscuous mode
Jan 22 18:01:11 compute-0 NetworkManager[55454]: <info>  [1769104871.9347] device (tapdc533cce-5a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 18:01:11 compute-0 ovn_controller[95372]: 2026-01-22T18:01:11Z|00877|binding|INFO|Releasing lport dc533cce-5ac7-469b-ac7f-effa9e40a838 from this chassis (sb_readonly=0)
Jan 22 18:01:11 compute-0 ovn_controller[95372]: 2026-01-22T18:01:11Z|00878|binding|INFO|Setting lport dc533cce-5ac7-469b-ac7f-effa9e40a838 down in Southbound
Jan 22 18:01:11 compute-0 ovn_controller[95372]: 2026-01-22T18:01:11Z|00879|binding|INFO|Removing iface tapdc533cce-5a ovn-installed in OVS
Jan 22 18:01:11 compute-0 nova_compute[183075]: 2026-01-22 18:01:11.946 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:11 compute-0 nova_compute[183075]: 2026-01-22 18:01:11.948 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:11 compute-0 nova_compute[183075]: 2026-01-22 18:01:11.962 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:11 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:11.983 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:51:bc 10.100.0.11'], port_security=['fa:16:3e:65:51:bc 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'cf09548c-3631-48db-b474-279b88fc113d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '93cc952c-d4f7-47c9-94ed-c14dd990188b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.186'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=dc533cce-5ac7-469b-ac7f-effa9e40a838) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 18:01:11 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:11.984 104629 INFO neutron.agent.ovn.metadata.agent [-] Port dc533cce-5ac7-469b-ac7f-effa9e40a838 in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 unbound from our chassis
Jan 22 18:01:11 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:11.985 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 18:01:11 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:11.986 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b3611163-84fc-44f8-96e1-34d93f7dd43a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:01:11 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:11.987 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 namespace which is not needed anymore
Jan 22 18:01:11 compute-0 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d0000004f.scope: Deactivated successfully.
Jan 22 18:01:11 compute-0 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d0000004f.scope: Consumed 17.772s CPU time.
Jan 22 18:01:11 compute-0 systemd-machined[154382]: Machine qemu-79-instance-0000004f terminated.
Jan 22 18:01:12 compute-0 podman[246153]: 2026-01-22 18:01:12.076579498 +0000 UTC m=+0.095234112 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 18:01:12 compute-0 NetworkManager[55454]: <info>  [1769104872.1293] manager: (tapdc533cce-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/349)
Jan 22 18:01:12 compute-0 nova_compute[183075]: 2026-01-22 18:01:12.131 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:12 compute-0 nova_compute[183075]: 2026-01-22 18:01:12.135 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:12 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245443]: [NOTICE]   (245447) : haproxy version is 2.8.14-c23fe91
Jan 22 18:01:12 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245443]: [NOTICE]   (245447) : path to executable is /usr/sbin/haproxy
Jan 22 18:01:12 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245443]: [WARNING]  (245447) : Exiting Master process...
Jan 22 18:01:12 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245443]: [WARNING]  (245447) : Exiting Master process...
Jan 22 18:01:12 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245443]: [ALERT]    (245447) : Current worker (245451) exited with code 143 (Terminated)
Jan 22 18:01:12 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[245443]: [WARNING]  (245447) : All workers exited. Exiting... (0)
Jan 22 18:01:12 compute-0 systemd[1]: libpod-04a32a677217d4ab22e6e54d2d52f9ec2922348febea7da56820c714c00ee980.scope: Deactivated successfully.
Jan 22 18:01:12 compute-0 podman[246194]: 2026-01-22 18:01:12.147756219 +0000 UTC m=+0.054309687 container died 04a32a677217d4ab22e6e54d2d52f9ec2922348febea7da56820c714c00ee980 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 22 18:01:12 compute-0 nova_compute[183075]: 2026-01-22 18:01:12.166 183079 INFO nova.virt.libvirt.driver [-] [instance: cf09548c-3631-48db-b474-279b88fc113d] Instance destroyed successfully.
Jan 22 18:01:12 compute-0 nova_compute[183075]: 2026-01-22 18:01:12.167 183079 DEBUG nova.objects.instance [None req-513e00b8-b2b4-40f0-8c60-c14e4b950192 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'resources' on Instance uuid cf09548c-3631-48db-b474-279b88fc113d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 18:01:12 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-04a32a677217d4ab22e6e54d2d52f9ec2922348febea7da56820c714c00ee980-userdata-shm.mount: Deactivated successfully.
Jan 22 18:01:12 compute-0 nova_compute[183075]: 2026-01-22 18:01:12.180 183079 DEBUG nova.virt.libvirt.vif [None req-513e00b8-b2b4-40f0-8c60-c14e4b950192 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T17:58:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1105403482',display_name='tempest-server-test-1105403482',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1105403482',id=79,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=<?>,launch_index=0,launched_at=2026-01-22T17:59:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-qz3but4j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T17:59:05Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=cf09548c-3631-48db-b474-279b88fc113d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dc533cce-5ac7-469b-ac7f-effa9e40a838", "address": "fa:16:3e:65:51:bc", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc533cce-5a", "ovs_interfaceid": "dc533cce-5ac7-469b-ac7f-effa9e40a838", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 18:01:12 compute-0 nova_compute[183075]: 2026-01-22 18:01:12.181 183079 DEBUG nova.network.os_vif_util [None req-513e00b8-b2b4-40f0-8c60-c14e4b950192 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "dc533cce-5ac7-469b-ac7f-effa9e40a838", "address": "fa:16:3e:65:51:bc", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc533cce-5a", "ovs_interfaceid": "dc533cce-5ac7-469b-ac7f-effa9e40a838", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 18:01:12 compute-0 systemd[1]: var-lib-containers-storage-overlay-b13dd48e0da30bd204a7b7877a583e39df262b17dac74447d84ad5f6a35ae668-merged.mount: Deactivated successfully.
Jan 22 18:01:12 compute-0 nova_compute[183075]: 2026-01-22 18:01:12.182 183079 DEBUG nova.network.os_vif_util [None req-513e00b8-b2b4-40f0-8c60-c14e4b950192 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:65:51:bc,bridge_name='br-int',has_traffic_filtering=True,id=dc533cce-5ac7-469b-ac7f-effa9e40a838,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc533cce-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 18:01:12 compute-0 nova_compute[183075]: 2026-01-22 18:01:12.182 183079 DEBUG os_vif [None req-513e00b8-b2b4-40f0-8c60-c14e4b950192 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:65:51:bc,bridge_name='br-int',has_traffic_filtering=True,id=dc533cce-5ac7-469b-ac7f-effa9e40a838,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc533cce-5a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 18:01:12 compute-0 nova_compute[183075]: 2026-01-22 18:01:12.183 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:12 compute-0 nova_compute[183075]: 2026-01-22 18:01:12.184 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdc533cce-5a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:01:12 compute-0 nova_compute[183075]: 2026-01-22 18:01:12.185 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:12 compute-0 nova_compute[183075]: 2026-01-22 18:01:12.186 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:12 compute-0 podman[246194]: 2026-01-22 18:01:12.188481668 +0000 UTC m=+0.095035136 container cleanup 04a32a677217d4ab22e6e54d2d52f9ec2922348febea7da56820c714c00ee980 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 18:01:12 compute-0 nova_compute[183075]: 2026-01-22 18:01:12.189 183079 INFO os_vif [None req-513e00b8-b2b4-40f0-8c60-c14e4b950192 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:65:51:bc,bridge_name='br-int',has_traffic_filtering=True,id=dc533cce-5ac7-469b-ac7f-effa9e40a838,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc533cce-5a')
Jan 22 18:01:12 compute-0 nova_compute[183075]: 2026-01-22 18:01:12.190 183079 INFO nova.virt.libvirt.driver [None req-513e00b8-b2b4-40f0-8c60-c14e4b950192 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Deleting instance files /var/lib/nova/instances/cf09548c-3631-48db-b474-279b88fc113d_del
Jan 22 18:01:12 compute-0 nova_compute[183075]: 2026-01-22 18:01:12.191 183079 INFO nova.virt.libvirt.driver [None req-513e00b8-b2b4-40f0-8c60-c14e4b950192 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Deletion of /var/lib/nova/instances/cf09548c-3631-48db-b474-279b88fc113d_del complete
Jan 22 18:01:12 compute-0 systemd[1]: libpod-conmon-04a32a677217d4ab22e6e54d2d52f9ec2922348febea7da56820c714c00ee980.scope: Deactivated successfully.
Jan 22 18:01:12 compute-0 nova_compute[183075]: 2026-01-22 18:01:12.241 183079 INFO nova.compute.manager [None req-513e00b8-b2b4-40f0-8c60-c14e4b950192 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Took 0.33 seconds to destroy the instance on the hypervisor.
Jan 22 18:01:12 compute-0 nova_compute[183075]: 2026-01-22 18:01:12.242 183079 DEBUG oslo.service.loopingcall [None req-513e00b8-b2b4-40f0-8c60-c14e4b950192 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 18:01:12 compute-0 nova_compute[183075]: 2026-01-22 18:01:12.242 183079 DEBUG nova.compute.manager [-] [instance: cf09548c-3631-48db-b474-279b88fc113d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 18:01:12 compute-0 nova_compute[183075]: 2026-01-22 18:01:12.242 183079 DEBUG nova.network.neutron [-] [instance: cf09548c-3631-48db-b474-279b88fc113d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 18:01:12 compute-0 podman[246235]: 2026-01-22 18:01:12.270746829 +0000 UTC m=+0.056726212 container remove 04a32a677217d4ab22e6e54d2d52f9ec2922348febea7da56820c714c00ee980 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 18:01:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:12.276 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[439a7b02-50f7-425d-946e-66e72321cee1]: (4, ('Thu Jan 22 06:01:12 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 (04a32a677217d4ab22e6e54d2d52f9ec2922348febea7da56820c714c00ee980)\n04a32a677217d4ab22e6e54d2d52f9ec2922348febea7da56820c714c00ee980\nThu Jan 22 06:01:12 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 (04a32a677217d4ab22e6e54d2d52f9ec2922348febea7da56820c714c00ee980)\n04a32a677217d4ab22e6e54d2d52f9ec2922348febea7da56820c714c00ee980\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:01:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:12.278 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[9e55c3fc-51eb-437a-80c2-fe458755d89b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:01:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:12.279 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:01:12 compute-0 kernel: tap88ed9213-70: left promiscuous mode
Jan 22 18:01:12 compute-0 nova_compute[183075]: 2026-01-22 18:01:12.281 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:12 compute-0 nova_compute[183075]: 2026-01-22 18:01:12.293 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:12.295 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d352273b-153f-4c36-ab6f-c99fb30b8729]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:01:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:12.308 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e003b2eb-9d98-40f4-9683-559d938fb67f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:01:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:12.310 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[cdf7d1ef-9055-4bbe-9f07-24e19c2fa211]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:01:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:12.332 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c09e5901-7def-40ec-a15e-b78d03270c1c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 710795, 'reachable_time': 20259, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246250, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:01:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:12.337 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 18:01:12 compute-0 systemd[1]: run-netns-ovnmeta\x2d88ed9213\x2d7d48\x2d4c42\x2dad60\x2dce3fcf104f71.mount: Deactivated successfully.
Jan 22 18:01:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:12.337 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[92fe4c3a-609a-41ce-bb48-d38fce0677a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:01:12 compute-0 nova_compute[183075]: 2026-01-22 18:01:12.357 183079 DEBUG nova.compute.manager [req-f71958e9-5b7e-4c43-af1a-4978287dc840 req-97af8f95-7afe-4a6a-9932-333ff20aa871 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Received event network-vif-plugged-69ac495b-e1bf-41f5-94f0-2e829df4fc35 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:01:12 compute-0 nova_compute[183075]: 2026-01-22 18:01:12.358 183079 DEBUG oslo_concurrency.lockutils [req-f71958e9-5b7e-4c43-af1a-4978287dc840 req-97af8f95-7afe-4a6a-9932-333ff20aa871 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "7fef7daf-622f-4f8a-ba6a-25fac7fd68ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:01:12 compute-0 nova_compute[183075]: 2026-01-22 18:01:12.358 183079 DEBUG oslo_concurrency.lockutils [req-f71958e9-5b7e-4c43-af1a-4978287dc840 req-97af8f95-7afe-4a6a-9932-333ff20aa871 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7fef7daf-622f-4f8a-ba6a-25fac7fd68ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:01:12 compute-0 nova_compute[183075]: 2026-01-22 18:01:12.358 183079 DEBUG oslo_concurrency.lockutils [req-f71958e9-5b7e-4c43-af1a-4978287dc840 req-97af8f95-7afe-4a6a-9932-333ff20aa871 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "7fef7daf-622f-4f8a-ba6a-25fac7fd68ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:01:12 compute-0 nova_compute[183075]: 2026-01-22 18:01:12.358 183079 DEBUG nova.compute.manager [req-f71958e9-5b7e-4c43-af1a-4978287dc840 req-97af8f95-7afe-4a6a-9932-333ff20aa871 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] No waiting events found dispatching network-vif-plugged-69ac495b-e1bf-41f5-94f0-2e829df4fc35 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 18:01:12 compute-0 nova_compute[183075]: 2026-01-22 18:01:12.358 183079 WARNING nova.compute.manager [req-f71958e9-5b7e-4c43-af1a-4978287dc840 req-97af8f95-7afe-4a6a-9932-333ff20aa871 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Received unexpected event network-vif-plugged-69ac495b-e1bf-41f5-94f0-2e829df4fc35 for instance with vm_state deleted and task_state None.
Jan 22 18:01:13 compute-0 nova_compute[183075]: 2026-01-22 18:01:13.058 183079 DEBUG nova.compute.manager [req-3663b91b-141c-4bb4-8ff9-48df68af4a29 req-fee4fab5-e021-4e0c-b18f-84d1294aa670 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Received event network-vif-unplugged-dc533cce-5ac7-469b-ac7f-effa9e40a838 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:01:13 compute-0 nova_compute[183075]: 2026-01-22 18:01:13.059 183079 DEBUG oslo_concurrency.lockutils [req-3663b91b-141c-4bb4-8ff9-48df68af4a29 req-fee4fab5-e021-4e0c-b18f-84d1294aa670 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "cf09548c-3631-48db-b474-279b88fc113d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:01:13 compute-0 nova_compute[183075]: 2026-01-22 18:01:13.059 183079 DEBUG oslo_concurrency.lockutils [req-3663b91b-141c-4bb4-8ff9-48df68af4a29 req-fee4fab5-e021-4e0c-b18f-84d1294aa670 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "cf09548c-3631-48db-b474-279b88fc113d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:01:13 compute-0 nova_compute[183075]: 2026-01-22 18:01:13.059 183079 DEBUG oslo_concurrency.lockutils [req-3663b91b-141c-4bb4-8ff9-48df68af4a29 req-fee4fab5-e021-4e0c-b18f-84d1294aa670 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "cf09548c-3631-48db-b474-279b88fc113d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:01:13 compute-0 nova_compute[183075]: 2026-01-22 18:01:13.060 183079 DEBUG nova.compute.manager [req-3663b91b-141c-4bb4-8ff9-48df68af4a29 req-fee4fab5-e021-4e0c-b18f-84d1294aa670 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] No waiting events found dispatching network-vif-unplugged-dc533cce-5ac7-469b-ac7f-effa9e40a838 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 18:01:13 compute-0 nova_compute[183075]: 2026-01-22 18:01:13.060 183079 DEBUG nova.compute.manager [req-3663b91b-141c-4bb4-8ff9-48df68af4a29 req-fee4fab5-e021-4e0c-b18f-84d1294aa670 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Received event network-vif-unplugged-dc533cce-5ac7-469b-ac7f-effa9e40a838 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 18:01:13 compute-0 nova_compute[183075]: 2026-01-22 18:01:13.060 183079 DEBUG nova.compute.manager [req-3663b91b-141c-4bb4-8ff9-48df68af4a29 req-fee4fab5-e021-4e0c-b18f-84d1294aa670 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Received event network-vif-plugged-dc533cce-5ac7-469b-ac7f-effa9e40a838 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:01:13 compute-0 nova_compute[183075]: 2026-01-22 18:01:13.060 183079 DEBUG oslo_concurrency.lockutils [req-3663b91b-141c-4bb4-8ff9-48df68af4a29 req-fee4fab5-e021-4e0c-b18f-84d1294aa670 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "cf09548c-3631-48db-b474-279b88fc113d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:01:13 compute-0 nova_compute[183075]: 2026-01-22 18:01:13.060 183079 DEBUG oslo_concurrency.lockutils [req-3663b91b-141c-4bb4-8ff9-48df68af4a29 req-fee4fab5-e021-4e0c-b18f-84d1294aa670 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "cf09548c-3631-48db-b474-279b88fc113d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:01:13 compute-0 nova_compute[183075]: 2026-01-22 18:01:13.061 183079 DEBUG oslo_concurrency.lockutils [req-3663b91b-141c-4bb4-8ff9-48df68af4a29 req-fee4fab5-e021-4e0c-b18f-84d1294aa670 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "cf09548c-3631-48db-b474-279b88fc113d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:01:13 compute-0 nova_compute[183075]: 2026-01-22 18:01:13.061 183079 DEBUG nova.compute.manager [req-3663b91b-141c-4bb4-8ff9-48df68af4a29 req-fee4fab5-e021-4e0c-b18f-84d1294aa670 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] No waiting events found dispatching network-vif-plugged-dc533cce-5ac7-469b-ac7f-effa9e40a838 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 18:01:13 compute-0 nova_compute[183075]: 2026-01-22 18:01:13.061 183079 WARNING nova.compute.manager [req-3663b91b-141c-4bb4-8ff9-48df68af4a29 req-fee4fab5-e021-4e0c-b18f-84d1294aa670 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Received unexpected event network-vif-plugged-dc533cce-5ac7-469b-ac7f-effa9e40a838 for instance with vm_state active and task_state deleting.
Jan 22 18:01:13 compute-0 nova_compute[183075]: 2026-01-22 18:01:13.860 183079 DEBUG nova.network.neutron [-] [instance: cf09548c-3631-48db-b474-279b88fc113d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 18:01:13 compute-0 nova_compute[183075]: 2026-01-22 18:01:13.883 183079 INFO nova.compute.manager [-] [instance: cf09548c-3631-48db-b474-279b88fc113d] Took 1.64 seconds to deallocate network for instance.
Jan 22 18:01:13 compute-0 nova_compute[183075]: 2026-01-22 18:01:13.939 183079 DEBUG oslo_concurrency.lockutils [None req-513e00b8-b2b4-40f0-8c60-c14e4b950192 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:01:13 compute-0 nova_compute[183075]: 2026-01-22 18:01:13.940 183079 DEBUG oslo_concurrency.lockutils [None req-513e00b8-b2b4-40f0-8c60-c14e4b950192 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:01:14 compute-0 nova_compute[183075]: 2026-01-22 18:01:14.010 183079 DEBUG nova.compute.provider_tree [None req-513e00b8-b2b4-40f0-8c60-c14e4b950192 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 18:01:14 compute-0 nova_compute[183075]: 2026-01-22 18:01:14.030 183079 DEBUG nova.scheduler.client.report [None req-513e00b8-b2b4-40f0-8c60-c14e4b950192 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 18:01:14 compute-0 nova_compute[183075]: 2026-01-22 18:01:14.053 183079 DEBUG oslo_concurrency.lockutils [None req-513e00b8-b2b4-40f0-8c60-c14e4b950192 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:01:14 compute-0 nova_compute[183075]: 2026-01-22 18:01:14.076 183079 INFO nova.scheduler.client.report [None req-513e00b8-b2b4-40f0-8c60-c14e4b950192 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Deleted allocations for instance cf09548c-3631-48db-b474-279b88fc113d
Jan 22 18:01:14 compute-0 nova_compute[183075]: 2026-01-22 18:01:14.141 183079 DEBUG oslo_concurrency.lockutils [None req-513e00b8-b2b4-40f0-8c60-c14e4b950192 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "cf09548c-3631-48db-b474-279b88fc113d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.238s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:01:14 compute-0 nova_compute[183075]: 2026-01-22 18:01:14.454 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:15 compute-0 nova_compute[183075]: 2026-01-22 18:01:15.169 183079 DEBUG nova.compute.manager [req-fb2b7e95-1698-4bcc-948c-033360cb1f0e req-a3599535-fab8-49e3-aedb-826c9f393cf6 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: cf09548c-3631-48db-b474-279b88fc113d] Received event network-vif-deleted-dc533cce-5ac7-469b-ac7f-effa9e40a838 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:01:15 compute-0 podman[246251]: 2026-01-22 18:01:15.355042853 +0000 UTC m=+0.056974429 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 18:01:17 compute-0 nova_compute[183075]: 2026-01-22 18:01:17.186 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:19 compute-0 nova_compute[183075]: 2026-01-22 18:01:19.457 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.260 183079 DEBUG oslo_concurrency.lockutils [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "afb51767-98e4-4f27-bf80-d54f23cd06c6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.261 183079 DEBUG oslo_concurrency.lockutils [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "afb51767-98e4-4f27-bf80-d54f23cd06c6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.280 183079 DEBUG nova.compute.manager [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.364 183079 DEBUG oslo_concurrency.lockutils [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.364 183079 DEBUG oslo_concurrency.lockutils [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.370 183079 DEBUG nova.virt.hardware [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.371 183079 INFO nova.compute.claims [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Claim successful on node compute-0.ctlplane.example.com
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.469 183079 DEBUG nova.compute.provider_tree [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.490 183079 DEBUG nova.scheduler.client.report [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.520 183079 DEBUG oslo_concurrency.lockutils [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.520 183079 DEBUG nova.compute.manager [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.574 183079 DEBUG nova.compute.manager [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.574 183079 DEBUG nova.network.neutron [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.596 183079 INFO nova.virt.libvirt.driver [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.612 183079 DEBUG nova.compute.manager [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.724 183079 DEBUG nova.compute.manager [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.725 183079 DEBUG nova.virt.libvirt.driver [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.726 183079 INFO nova.virt.libvirt.driver [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Creating image(s)
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.726 183079 DEBUG oslo_concurrency.lockutils [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "/var/lib/nova/instances/afb51767-98e4-4f27-bf80-d54f23cd06c6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.726 183079 DEBUG oslo_concurrency.lockutils [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/afb51767-98e4-4f27-bf80-d54f23cd06c6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.727 183079 DEBUG oslo_concurrency.lockutils [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/afb51767-98e4-4f27-bf80-d54f23cd06c6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.739 183079 DEBUG oslo_concurrency.processutils [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.811 183079 DEBUG oslo_concurrency.processutils [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.813 183079 DEBUG oslo_concurrency.lockutils [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.814 183079 DEBUG oslo_concurrency.lockutils [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.824 183079 DEBUG oslo_concurrency.processutils [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.873 183079 DEBUG nova.policy [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.886 183079 DEBUG oslo_concurrency.processutils [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.887 183079 DEBUG oslo_concurrency.processutils [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/afb51767-98e4-4f27-bf80-d54f23cd06c6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.928 183079 DEBUG oslo_concurrency.processutils [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/afb51767-98e4-4f27-bf80-d54f23cd06c6/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.929 183079 DEBUG oslo_concurrency.lockutils [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.930 183079 DEBUG oslo_concurrency.processutils [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.983 183079 DEBUG oslo_concurrency.processutils [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.984 183079 DEBUG nova.virt.disk.api [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Checking if we can resize image /var/lib/nova/instances/afb51767-98e4-4f27-bf80-d54f23cd06c6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 18:01:21 compute-0 nova_compute[183075]: 2026-01-22 18:01:21.984 183079 DEBUG oslo_concurrency.processutils [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/afb51767-98e4-4f27-bf80-d54f23cd06c6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:01:22 compute-0 nova_compute[183075]: 2026-01-22 18:01:22.037 183079 DEBUG oslo_concurrency.processutils [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/afb51767-98e4-4f27-bf80-d54f23cd06c6/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:01:22 compute-0 nova_compute[183075]: 2026-01-22 18:01:22.038 183079 DEBUG nova.virt.disk.api [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Cannot resize image /var/lib/nova/instances/afb51767-98e4-4f27-bf80-d54f23cd06c6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 18:01:22 compute-0 nova_compute[183075]: 2026-01-22 18:01:22.039 183079 DEBUG nova.objects.instance [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'migration_context' on Instance uuid afb51767-98e4-4f27-bf80-d54f23cd06c6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 18:01:22 compute-0 nova_compute[183075]: 2026-01-22 18:01:22.055 183079 DEBUG nova.virt.libvirt.driver [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 18:01:22 compute-0 nova_compute[183075]: 2026-01-22 18:01:22.056 183079 DEBUG nova.virt.libvirt.driver [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Ensure instance console log exists: /var/lib/nova/instances/afb51767-98e4-4f27-bf80-d54f23cd06c6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 18:01:22 compute-0 nova_compute[183075]: 2026-01-22 18:01:22.056 183079 DEBUG oslo_concurrency.lockutils [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:01:22 compute-0 nova_compute[183075]: 2026-01-22 18:01:22.057 183079 DEBUG oslo_concurrency.lockutils [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:01:22 compute-0 nova_compute[183075]: 2026-01-22 18:01:22.057 183079 DEBUG oslo_concurrency.lockutils [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:01:22 compute-0 nova_compute[183075]: 2026-01-22 18:01:22.198 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:22 compute-0 nova_compute[183075]: 2026-01-22 18:01:22.747 183079 DEBUG nova.network.neutron [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Successfully updated port: 5991e52e-d36a-4639-b0c2-6e456926f678 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 18:01:22 compute-0 nova_compute[183075]: 2026-01-22 18:01:22.777 183079 DEBUG oslo_concurrency.lockutils [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "refresh_cache-afb51767-98e4-4f27-bf80-d54f23cd06c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 18:01:22 compute-0 nova_compute[183075]: 2026-01-22 18:01:22.778 183079 DEBUG oslo_concurrency.lockutils [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquired lock "refresh_cache-afb51767-98e4-4f27-bf80-d54f23cd06c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 18:01:22 compute-0 nova_compute[183075]: 2026-01-22 18:01:22.778 183079 DEBUG nova.network.neutron [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 18:01:22 compute-0 nova_compute[183075]: 2026-01-22 18:01:22.850 183079 DEBUG nova.compute.manager [req-540d2e00-34cf-45f8-b083-61f3c9002a5d req-2ae3f026-5b8d-4c10-afce-da26dcd6fa0e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Received event network-changed-5991e52e-d36a-4639-b0c2-6e456926f678 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:01:22 compute-0 nova_compute[183075]: 2026-01-22 18:01:22.851 183079 DEBUG nova.compute.manager [req-540d2e00-34cf-45f8-b083-61f3c9002a5d req-2ae3f026-5b8d-4c10-afce-da26dcd6fa0e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Refreshing instance network info cache due to event network-changed-5991e52e-d36a-4639-b0c2-6e456926f678. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 18:01:22 compute-0 nova_compute[183075]: 2026-01-22 18:01:22.852 183079 DEBUG oslo_concurrency.lockutils [req-540d2e00-34cf-45f8-b083-61f3c9002a5d req-2ae3f026-5b8d-4c10-afce-da26dcd6fa0e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-afb51767-98e4-4f27-bf80-d54f23cd06c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 18:01:22 compute-0 nova_compute[183075]: 2026-01-22 18:01:22.975 183079 DEBUG nova.network.neutron [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.583 183079 DEBUG nova.network.neutron [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Updating instance_info_cache with network_info: [{"id": "5991e52e-d36a-4639-b0c2-6e456926f678", "address": "fa:16:3e:73:94:61", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5991e52e-d3", "ovs_interfaceid": "5991e52e-d36a-4639-b0c2-6e456926f678", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.603 183079 DEBUG oslo_concurrency.lockutils [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Releasing lock "refresh_cache-afb51767-98e4-4f27-bf80-d54f23cd06c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.603 183079 DEBUG nova.compute.manager [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Instance network_info: |[{"id": "5991e52e-d36a-4639-b0c2-6e456926f678", "address": "fa:16:3e:73:94:61", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5991e52e-d3", "ovs_interfaceid": "5991e52e-d36a-4639-b0c2-6e456926f678", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.604 183079 DEBUG oslo_concurrency.lockutils [req-540d2e00-34cf-45f8-b083-61f3c9002a5d req-2ae3f026-5b8d-4c10-afce-da26dcd6fa0e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-afb51767-98e4-4f27-bf80-d54f23cd06c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.604 183079 DEBUG nova.network.neutron [req-540d2e00-34cf-45f8-b083-61f3c9002a5d req-2ae3f026-5b8d-4c10-afce-da26dcd6fa0e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Refreshing network info cache for port 5991e52e-d36a-4639-b0c2-6e456926f678 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.608 183079 DEBUG nova.virt.libvirt.driver [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Start _get_guest_xml network_info=[{"id": "5991e52e-d36a-4639-b0c2-6e456926f678", "address": "fa:16:3e:73:94:61", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5991e52e-d3", "ovs_interfaceid": "5991e52e-d36a-4639-b0c2-6e456926f678", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.616 183079 WARNING nova.virt.libvirt.driver [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.629 183079 DEBUG nova.virt.libvirt.host [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.630 183079 DEBUG nova.virt.libvirt.host [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.636 183079 DEBUG nova.virt.libvirt.host [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.637 183079 DEBUG nova.virt.libvirt.host [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.638 183079 DEBUG nova.virt.libvirt.driver [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.638 183079 DEBUG nova.virt.hardware [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.638 183079 DEBUG nova.virt.hardware [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.639 183079 DEBUG nova.virt.hardware [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.639 183079 DEBUG nova.virt.hardware [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.639 183079 DEBUG nova.virt.hardware [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.640 183079 DEBUG nova.virt.hardware [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.640 183079 DEBUG nova.virt.hardware [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.640 183079 DEBUG nova.virt.hardware [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.641 183079 DEBUG nova.virt.hardware [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.641 183079 DEBUG nova.virt.hardware [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.641 183079 DEBUG nova.virt.hardware [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.647 183079 DEBUG nova.virt.libvirt.vif [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T18:01:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-2101676201',display_name='tempest-server-test-2101676201',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-2101676201',id=81,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-zhpa4cyh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T18:01:21Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=afb51767-98e4-4f27-bf80-d54f23cd06c6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5991e52e-d36a-4639-b0c2-6e456926f678", "address": "fa:16:3e:73:94:61", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5991e52e-d3", "ovs_interfaceid": "5991e52e-d36a-4639-b0c2-6e456926f678", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.648 183079 DEBUG nova.network.os_vif_util [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "5991e52e-d36a-4639-b0c2-6e456926f678", "address": "fa:16:3e:73:94:61", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5991e52e-d3", "ovs_interfaceid": "5991e52e-d36a-4639-b0c2-6e456926f678", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.649 183079 DEBUG nova.network.os_vif_util [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:73:94:61,bridge_name='br-int',has_traffic_filtering=True,id=5991e52e-d36a-4639-b0c2-6e456926f678,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5991e52e-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.650 183079 DEBUG nova.objects.instance [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'pci_devices' on Instance uuid afb51767-98e4-4f27-bf80-d54f23cd06c6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.665 183079 DEBUG nova.virt.libvirt.driver [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] End _get_guest_xml xml=<domain type="kvm">
Jan 22 18:01:23 compute-0 nova_compute[183075]:   <uuid>afb51767-98e4-4f27-bf80-d54f23cd06c6</uuid>
Jan 22 18:01:23 compute-0 nova_compute[183075]:   <name>instance-00000051</name>
Jan 22 18:01:23 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 18:01:23 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 18:01:23 compute-0 nova_compute[183075]:   <metadata>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 18:01:23 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-2101676201</nova:name>
Jan 22 18:01:23 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 18:01:23</nova:creationTime>
Jan 22 18:01:23 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 18:01:23 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 18:01:23 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 18:01:23 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 18:01:23 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 18:01:23 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 18:01:23 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 18:01:23 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 18:01:23 compute-0 nova_compute[183075]:         <nova:user uuid="66c24efd0cd24c57803ea8508e679d06">tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member</nova:user>
Jan 22 18:01:23 compute-0 nova_compute[183075]:         <nova:project uuid="cbff5cec4b9c4d2eb317124ca20fea9f">tempest-StatelessNetworkSecGroupIPv4Test-1348485316</nova:project>
Jan 22 18:01:23 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 18:01:23 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 18:01:23 compute-0 nova_compute[183075]:         <nova:port uuid="5991e52e-d36a-4639-b0c2-6e456926f678">
Jan 22 18:01:23 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 18:01:23 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 18:01:23 compute-0 nova_compute[183075]:   </metadata>
Jan 22 18:01:23 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <system>
Jan 22 18:01:23 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 18:01:23 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 18:01:23 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 18:01:23 compute-0 nova_compute[183075]:       <entry name="serial">afb51767-98e4-4f27-bf80-d54f23cd06c6</entry>
Jan 22 18:01:23 compute-0 nova_compute[183075]:       <entry name="uuid">afb51767-98e4-4f27-bf80-d54f23cd06c6</entry>
Jan 22 18:01:23 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     </system>
Jan 22 18:01:23 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 18:01:23 compute-0 nova_compute[183075]:   <os>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:   </os>
Jan 22 18:01:23 compute-0 nova_compute[183075]:   <features>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <apic/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:   </features>
Jan 22 18:01:23 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:   </clock>
Jan 22 18:01:23 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:   </cpu>
Jan 22 18:01:23 compute-0 nova_compute[183075]:   <devices>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 18:01:23 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/afb51767-98e4-4f27-bf80-d54f23cd06c6/disk"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     </disk>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 18:01:23 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:73:94:61"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:       <target dev="tap5991e52e-d3"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     </interface>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 18:01:23 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/afb51767-98e4-4f27-bf80-d54f23cd06c6/console.log" append="off"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     </serial>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <video>
Jan 22 18:01:23 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     </video>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 18:01:23 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     </rng>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 18:01:23 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 18:01:23 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 18:01:23 compute-0 nova_compute[183075]:   </devices>
Jan 22 18:01:23 compute-0 nova_compute[183075]: </domain>
Jan 22 18:01:23 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.666 183079 DEBUG nova.compute.manager [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Preparing to wait for external event network-vif-plugged-5991e52e-d36a-4639-b0c2-6e456926f678 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.667 183079 DEBUG oslo_concurrency.lockutils [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "afb51767-98e4-4f27-bf80-d54f23cd06c6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.667 183079 DEBUG oslo_concurrency.lockutils [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "afb51767-98e4-4f27-bf80-d54f23cd06c6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.667 183079 DEBUG oslo_concurrency.lockutils [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "afb51767-98e4-4f27-bf80-d54f23cd06c6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.668 183079 DEBUG nova.virt.libvirt.vif [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T18:01:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-2101676201',display_name='tempest-server-test-2101676201',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-2101676201',id=81,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-zhpa4cyh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T18:01:21Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=afb51767-98e4-4f27-bf80-d54f23cd06c6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5991e52e-d36a-4639-b0c2-6e456926f678", "address": "fa:16:3e:73:94:61", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5991e52e-d3", "ovs_interfaceid": "5991e52e-d36a-4639-b0c2-6e456926f678", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.668 183079 DEBUG nova.network.os_vif_util [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "5991e52e-d36a-4639-b0c2-6e456926f678", "address": "fa:16:3e:73:94:61", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5991e52e-d3", "ovs_interfaceid": "5991e52e-d36a-4639-b0c2-6e456926f678", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.669 183079 DEBUG nova.network.os_vif_util [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:73:94:61,bridge_name='br-int',has_traffic_filtering=True,id=5991e52e-d36a-4639-b0c2-6e456926f678,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5991e52e-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.670 183079 DEBUG os_vif [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:73:94:61,bridge_name='br-int',has_traffic_filtering=True,id=5991e52e-d36a-4639-b0c2-6e456926f678,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5991e52e-d3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.670 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.671 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.671 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.675 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.675 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5991e52e-d3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.676 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5991e52e-d3, col_values=(('external_ids', {'iface-id': '5991e52e-d36a-4639-b0c2-6e456926f678', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:73:94:61', 'vm-uuid': 'afb51767-98e4-4f27-bf80-d54f23cd06c6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.678 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:23 compute-0 NetworkManager[55454]: <info>  [1769104883.6807] manager: (tap5991e52e-d3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/350)
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.680 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.686 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.687 183079 INFO os_vif [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:73:94:61,bridge_name='br-int',has_traffic_filtering=True,id=5991e52e-d36a-4639-b0c2-6e456926f678,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5991e52e-d3')
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.737 183079 DEBUG nova.virt.libvirt.driver [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.738 183079 DEBUG nova.virt.libvirt.driver [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No VIF found with MAC fa:16:3e:73:94:61, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 18:01:23 compute-0 NetworkManager[55454]: <info>  [1769104883.7950] manager: (tap5991e52e-d3): new Tun device (/org/freedesktop/NetworkManager/Devices/351)
Jan 22 18:01:23 compute-0 kernel: tap5991e52e-d3: entered promiscuous mode
Jan 22 18:01:23 compute-0 ovn_controller[95372]: 2026-01-22T18:01:23Z|00880|binding|INFO|Claiming lport 5991e52e-d36a-4639-b0c2-6e456926f678 for this chassis.
Jan 22 18:01:23 compute-0 ovn_controller[95372]: 2026-01-22T18:01:23Z|00881|binding|INFO|5991e52e-d36a-4639-b0c2-6e456926f678: Claiming fa:16:3e:73:94:61 10.100.0.5
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.798 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:23.804 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:73:94:61 10.100.0.5'], port_security=['fa:16:3e:73:94:61 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '75e8acf6-f0c9-4e6e-a3dc-bba78925f5d3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=5991e52e-d36a-4639-b0c2-6e456926f678) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 18:01:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:23.806 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 5991e52e-d36a-4639-b0c2-6e456926f678 in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 bound to our chassis
Jan 22 18:01:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:23.808 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 18:01:23 compute-0 ovn_controller[95372]: 2026-01-22T18:01:23Z|00882|binding|INFO|Setting lport 5991e52e-d36a-4639-b0c2-6e456926f678 up in Southbound
Jan 22 18:01:23 compute-0 ovn_controller[95372]: 2026-01-22T18:01:23Z|00883|binding|INFO|Setting lport 5991e52e-d36a-4639-b0c2-6e456926f678 ovn-installed in OVS
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.811 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.818 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:23.825 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[cc7bbf5f-1bc6-4ead-a726-1aefdf7f7993]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:01:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:23.827 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap88ed9213-71 in ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 18:01:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:23.829 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap88ed9213-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 18:01:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:23.829 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[01ca1139-4199-4e64-8188-919a45bcb2d4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:01:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:23.830 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[cda5b82a-2c01-4416-b356-1dfc24b42d56]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:01:23 compute-0 systemd-udevd[246310]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 18:01:23 compute-0 systemd-machined[154382]: New machine qemu-81-instance-00000051.
Jan 22 18:01:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:23.843 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[7e831587-1503-43ba-b4dc-4f51972239f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:01:23 compute-0 NetworkManager[55454]: <info>  [1769104883.8457] device (tap5991e52e-d3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 18:01:23 compute-0 NetworkManager[55454]: <info>  [1769104883.8467] device (tap5991e52e-d3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 18:01:23 compute-0 systemd[1]: Started Virtual Machine qemu-81-instance-00000051.
Jan 22 18:01:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:23.858 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1be907f3-9436-40a2-a48e-f9c4a6c01012]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:01:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:23.890 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[a3705c5b-967c-4d77-9892-88c8684f48de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:01:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:23.895 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2dba49f0-800b-4fe7-a833-059f8c46cd6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:01:23 compute-0 NetworkManager[55454]: <info>  [1769104883.8972] manager: (tap88ed9213-70): new Veth device (/org/freedesktop/NetworkManager/Devices/352)
Jan 22 18:01:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:23.932 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[4b3ca154-df13-44f6-8631-a9f8b2aeeb62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:01:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:23.935 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[5263e499-a666-4b9c-b8b1-f0f287f6b4af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:01:23 compute-0 NetworkManager[55454]: <info>  [1769104883.9590] device (tap88ed9213-70): carrier: link connected
Jan 22 18:01:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:23.964 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[681bba8a-09f0-4e53-a99b-34ea139ebccf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:01:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:23.981 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[77157069-2287-4701-9fd3-c7d149a79aeb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 240], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 724766, 'reachable_time': 21668, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246343, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.998 183079 DEBUG nova.compute.manager [req-4df44009-15e9-46be-9dac-df2e975a44a7 req-4022bd9c-6374-45dd-8b86-bb2dd7c676df a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Received event network-vif-plugged-5991e52e-d36a-4639-b0c2-6e456926f678 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.998 183079 DEBUG oslo_concurrency.lockutils [req-4df44009-15e9-46be-9dac-df2e975a44a7 req-4022bd9c-6374-45dd-8b86-bb2dd7c676df a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "afb51767-98e4-4f27-bf80-d54f23cd06c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.998 183079 DEBUG oslo_concurrency.lockutils [req-4df44009-15e9-46be-9dac-df2e975a44a7 req-4022bd9c-6374-45dd-8b86-bb2dd7c676df a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "afb51767-98e4-4f27-bf80-d54f23cd06c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.999 183079 DEBUG oslo_concurrency.lockutils [req-4df44009-15e9-46be-9dac-df2e975a44a7 req-4022bd9c-6374-45dd-8b86-bb2dd7c676df a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "afb51767-98e4-4f27-bf80-d54f23cd06c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:01:23 compute-0 nova_compute[183075]: 2026-01-22 18:01:23.999 183079 DEBUG nova.compute.manager [req-4df44009-15e9-46be-9dac-df2e975a44a7 req-4022bd9c-6374-45dd-8b86-bb2dd7c676df a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Processing event network-vif-plugged-5991e52e-d36a-4639-b0c2-6e456926f678 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:23.999 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e1696c4b-8ee9-4c76-a202-529995f5d5b2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0a:fa93'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 724766, 'tstamp': 724766}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246344, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:24.020 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[a1fe4176-27b4-4c46-8a69-b0487129931b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 240], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 724766, 'reachable_time': 21668, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 246345, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:24.068 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[880ed792-19f6-4fbe-a237-ef5c2f300b37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:24.135 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3e4b7d99-4daf-47ec-ad25-cd05a5d73355]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:24.137 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:24.137 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:24.137 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88ed9213-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.139 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:24 compute-0 NetworkManager[55454]: <info>  [1769104884.1399] manager: (tap88ed9213-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/353)
Jan 22 18:01:24 compute-0 kernel: tap88ed9213-70: entered promiscuous mode
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.141 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:24.142 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88ed9213-70, col_values=(('external_ids', {'iface-id': 'da93a32e-eed6-4124-8f08-78b0bfe2cb69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.142 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:24 compute-0 ovn_controller[95372]: 2026-01-22T18:01:24Z|00884|binding|INFO|Releasing lport da93a32e-eed6-4124-8f08-78b0bfe2cb69 from this chassis (sb_readonly=0)
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.153 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:24.154 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:24.155 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[228b6d0b-508c-4b57-8561-c1049b7d7c65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:24.156 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]: global
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]: 
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]: 
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]: 
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 18:01:24 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:24.157 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'env', 'PROCESS_TAG=haproxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/88ed9213-7d48-4c42-ad60-ce3fcf104f71.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.284 183079 DEBUG nova.compute.manager [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.285 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104884.2836654, afb51767-98e4-4f27-bf80-d54f23cd06c6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.285 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] VM Started (Lifecycle Event)
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.288 183079 DEBUG nova.virt.libvirt.driver [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.292 183079 INFO nova.virt.libvirt.driver [-] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Instance spawned successfully.
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.292 183079 DEBUG nova.virt.libvirt.driver [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.309 183079 DEBUG nova.virt.libvirt.driver [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.309 183079 DEBUG nova.virt.libvirt.driver [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.310 183079 DEBUG nova.virt.libvirt.driver [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.310 183079 DEBUG nova.virt.libvirt.driver [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.310 183079 DEBUG nova.virt.libvirt.driver [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.311 183079 DEBUG nova.virt.libvirt.driver [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.314 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.317 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.344 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.344 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104884.2838979, afb51767-98e4-4f27-bf80-d54f23cd06c6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.344 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] VM Paused (Lifecycle Event)
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.366 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.369 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104884.287017, afb51767-98e4-4f27-bf80-d54f23cd06c6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.369 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] VM Resumed (Lifecycle Event)
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.375 183079 INFO nova.compute.manager [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Took 2.65 seconds to spawn the instance on the hypervisor.
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.375 183079 DEBUG nova.compute.manager [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.388 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.391 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.411 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.443 183079 INFO nova.compute.manager [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Took 3.10 seconds to build instance.
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.463 183079 DEBUG oslo_concurrency.lockutils [None req-a606553f-f842-4456-bfae-ac7252c4b36f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "afb51767-98e4-4f27-bf80-d54f23cd06c6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.201s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.495 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.544 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769104869.5431259, 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.544 183079 INFO nova.compute.manager [-] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] VM Stopped (Lifecycle Event)
Jan 22 18:01:24 compute-0 nova_compute[183075]: 2026-01-22 18:01:24.563 183079 DEBUG nova.compute.manager [None req-a7d4429e-9929-4eb5-bb8a-67d026857766 - - - - - -] [instance: 7fef7daf-622f-4f8a-ba6a-25fac7fd68ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 18:01:24 compute-0 podman[246384]: 2026-01-22 18:01:24.596819074 +0000 UTC m=+0.054486952 container create d94defef588d84aec236d5962bba79b2dabdde6a389260928b5ceb8d51b9feb3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 18:01:24 compute-0 systemd[1]: Started libpod-conmon-d94defef588d84aec236d5962bba79b2dabdde6a389260928b5ceb8d51b9feb3.scope.
Jan 22 18:01:24 compute-0 podman[246384]: 2026-01-22 18:01:24.566554827 +0000 UTC m=+0.024222725 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 18:01:24 compute-0 systemd[1]: Started libcrun container.
Jan 22 18:01:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f65283b4daddb7e7ba10b6b572a7946cfdf8ed8178a820f811d8285c58d33dc7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 18:01:24 compute-0 podman[246384]: 2026-01-22 18:01:24.697556013 +0000 UTC m=+0.155223911 container init d94defef588d84aec236d5962bba79b2dabdde6a389260928b5ceb8d51b9feb3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 22 18:01:24 compute-0 podman[246384]: 2026-01-22 18:01:24.702679911 +0000 UTC m=+0.160347819 container start d94defef588d84aec236d5962bba79b2dabdde6a389260928b5ceb8d51b9feb3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 22 18:01:24 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[246399]: [NOTICE]   (246403) : New worker (246405) forked
Jan 22 18:01:24 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[246399]: [NOTICE]   (246403) : Loading success.
Jan 22 18:01:26 compute-0 nova_compute[183075]: 2026-01-22 18:01:26.065 183079 DEBUG nova.compute.manager [req-650e0f62-2501-4d29-a44b-dd45ac6f431d req-5fb84463-474a-4240-b90a-615f49f95cb0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Received event network-vif-plugged-5991e52e-d36a-4639-b0c2-6e456926f678 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:01:26 compute-0 nova_compute[183075]: 2026-01-22 18:01:26.065 183079 DEBUG oslo_concurrency.lockutils [req-650e0f62-2501-4d29-a44b-dd45ac6f431d req-5fb84463-474a-4240-b90a-615f49f95cb0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "afb51767-98e4-4f27-bf80-d54f23cd06c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:01:26 compute-0 nova_compute[183075]: 2026-01-22 18:01:26.067 183079 DEBUG oslo_concurrency.lockutils [req-650e0f62-2501-4d29-a44b-dd45ac6f431d req-5fb84463-474a-4240-b90a-615f49f95cb0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "afb51767-98e4-4f27-bf80-d54f23cd06c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:01:26 compute-0 nova_compute[183075]: 2026-01-22 18:01:26.068 183079 DEBUG oslo_concurrency.lockutils [req-650e0f62-2501-4d29-a44b-dd45ac6f431d req-5fb84463-474a-4240-b90a-615f49f95cb0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "afb51767-98e4-4f27-bf80-d54f23cd06c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:01:26 compute-0 nova_compute[183075]: 2026-01-22 18:01:26.068 183079 DEBUG nova.compute.manager [req-650e0f62-2501-4d29-a44b-dd45ac6f431d req-5fb84463-474a-4240-b90a-615f49f95cb0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] No waiting events found dispatching network-vif-plugged-5991e52e-d36a-4639-b0c2-6e456926f678 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 18:01:26 compute-0 nova_compute[183075]: 2026-01-22 18:01:26.069 183079 WARNING nova.compute.manager [req-650e0f62-2501-4d29-a44b-dd45ac6f431d req-5fb84463-474a-4240-b90a-615f49f95cb0 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Received unexpected event network-vif-plugged-5991e52e-d36a-4639-b0c2-6e456926f678 for instance with vm_state active and task_state None.
Jan 22 18:01:26 compute-0 nova_compute[183075]: 2026-01-22 18:01:26.664 183079 DEBUG nova.network.neutron [req-540d2e00-34cf-45f8-b083-61f3c9002a5d req-2ae3f026-5b8d-4c10-afce-da26dcd6fa0e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Updated VIF entry in instance network info cache for port 5991e52e-d36a-4639-b0c2-6e456926f678. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 18:01:26 compute-0 nova_compute[183075]: 2026-01-22 18:01:26.664 183079 DEBUG nova.network.neutron [req-540d2e00-34cf-45f8-b083-61f3c9002a5d req-2ae3f026-5b8d-4c10-afce-da26dcd6fa0e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Updating instance_info_cache with network_info: [{"id": "5991e52e-d36a-4639-b0c2-6e456926f678", "address": "fa:16:3e:73:94:61", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5991e52e-d3", "ovs_interfaceid": "5991e52e-d36a-4639-b0c2-6e456926f678", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 18:01:26 compute-0 nova_compute[183075]: 2026-01-22 18:01:26.680 183079 DEBUG oslo_concurrency.lockutils [req-540d2e00-34cf-45f8-b083-61f3c9002a5d req-2ae3f026-5b8d-4c10-afce-da26dcd6fa0e a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-afb51767-98e4-4f27-bf80-d54f23cd06c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 18:01:27 compute-0 nova_compute[183075]: 2026-01-22 18:01:27.164 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769104872.1625085, cf09548c-3631-48db-b474-279b88fc113d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 18:01:27 compute-0 nova_compute[183075]: 2026-01-22 18:01:27.164 183079 INFO nova.compute.manager [-] [instance: cf09548c-3631-48db-b474-279b88fc113d] VM Stopped (Lifecycle Event)
Jan 22 18:01:27 compute-0 nova_compute[183075]: 2026-01-22 18:01:27.185 183079 DEBUG nova.compute.manager [None req-6eca9931-ceb6-49e6-a2ac-de844b2fff9f - - - - - -] [instance: cf09548c-3631-48db-b474-279b88fc113d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 18:01:27 compute-0 podman[246415]: 2026-01-22 18:01:27.359617009 +0000 UTC m=+0.057246886 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 18:01:27 compute-0 podman[246416]: 2026-01-22 18:01:27.390154203 +0000 UTC m=+0.079899147 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, architecture=x86_64, name=ubi9-minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 22 18:01:27 compute-0 podman[246414]: 2026-01-22 18:01:27.394222213 +0000 UTC m=+0.099095646 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 22 18:01:28 compute-0 nova_compute[183075]: 2026-01-22 18:01:28.071 183079 INFO nova.compute.manager [None req-814a6a3d-d98f-4407-87bc-7bdc8959fc93 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Get console output
Jan 22 18:01:28 compute-0 nova_compute[183075]: 2026-01-22 18:01:28.080 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:01:28 compute-0 nova_compute[183075]: 2026-01-22 18:01:28.679 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:29 compute-0 nova_compute[183075]: 2026-01-22 18:01:29.496 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:31 compute-0 podman[246477]: 2026-01-22 18:01:31.346764563 +0000 UTC m=+0.058315605 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 18:01:33 compute-0 nova_compute[183075]: 2026-01-22 18:01:33.211 183079 INFO nova.compute.manager [None req-d44238c8-68ed-45c9-8b6a-bc75a935a254 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Get console output
Jan 22 18:01:33 compute-0 nova_compute[183075]: 2026-01-22 18:01:33.683 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:34 compute-0 nova_compute[183075]: 2026-01-22 18:01:34.498 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:36 compute-0 ovn_controller[95372]: 2026-01-22T18:01:36Z|00164|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:73:94:61 10.100.0.5
Jan 22 18:01:36 compute-0 ovn_controller[95372]: 2026-01-22T18:01:36Z|00165|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:73:94:61 10.100.0.5
Jan 22 18:01:38 compute-0 nova_compute[183075]: 2026-01-22 18:01:38.359 183079 INFO nova.compute.manager [None req-b2e7e770-a506-4ae7-83c8-7dc79f2e6f11 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Get console output
Jan 22 18:01:38 compute-0 nova_compute[183075]: 2026-01-22 18:01:38.365 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:01:38 compute-0 nova_compute[183075]: 2026-01-22 18:01:38.686 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:39 compute-0 nova_compute[183075]: 2026-01-22 18:01:39.539 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:41.985 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:01:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:41.986 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:01:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:01:41.986 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:01:42 compute-0 podman[246508]: 2026-01-22 18:01:42.370597666 +0000 UTC m=+0.075598101 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 18:01:43 compute-0 nova_compute[183075]: 2026-01-22 18:01:43.495 183079 INFO nova.compute.manager [None req-894d9fbf-91dc-4823-9e72-69afd54eac5f 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Get console output
Jan 22 18:01:43 compute-0 nova_compute[183075]: 2026-01-22 18:01:43.500 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:01:43 compute-0 nova_compute[183075]: 2026-01-22 18:01:43.690 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:44 compute-0 nova_compute[183075]: 2026-01-22 18:01:44.540 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:46 compute-0 podman[246533]: 2026-01-22 18:01:46.352754997 +0000 UTC m=+0.057356549 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 18:01:46 compute-0 nova_compute[183075]: 2026-01-22 18:01:46.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:01:48 compute-0 nova_compute[183075]: 2026-01-22 18:01:48.609 183079 INFO nova.compute.manager [None req-62c5d66d-c1b2-4114-b930-fd573dabe4fe 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Get console output
Jan 22 18:01:48 compute-0 nova_compute[183075]: 2026-01-22 18:01:48.615 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:01:48 compute-0 nova_compute[183075]: 2026-01-22 18:01:48.693 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:48 compute-0 nova_compute[183075]: 2026-01-22 18:01:48.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:01:48 compute-0 nova_compute[183075]: 2026-01-22 18:01:48.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:01:49 compute-0 nova_compute[183075]: 2026-01-22 18:01:49.542 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:49 compute-0 nova_compute[183075]: 2026-01-22 18:01:49.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:01:53 compute-0 nova_compute[183075]: 2026-01-22 18:01:53.697 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:53 compute-0 nova_compute[183075]: 2026-01-22 18:01:53.748 183079 INFO nova.compute.manager [None req-2258b096-6dfa-495a-a157-d36f2b0d9ee7 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Get console output
Jan 22 18:01:53 compute-0 nova_compute[183075]: 2026-01-22 18:01:53.752 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:01:53 compute-0 ovn_controller[95372]: 2026-01-22T18:01:53Z|00885|memory_trim|INFO|Detected inactivity (last active 30012 ms ago): trimming memory
Jan 22 18:01:54 compute-0 nova_compute[183075]: 2026-01-22 18:01:54.543 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.466 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6', 'name': 'tempest-server-test-2101676201', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000051', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'hostId': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.467 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.470 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for afb51767-98e4-4f27-bf80-d54f23cd06c6 / tap5991e52e-d3 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.471 12 DEBUG ceilometer.compute.pollsters [-] afb51767-98e4-4f27-bf80-d54f23cd06c6/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e38bc442-7e3e-4181-a1da-b34bcbc197a1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000051-afb51767-98e4-4f27-bf80-d54f23cd06c6-tap5991e52e-d3', 'timestamp': '2026-01-22T18:01:55.467515', 'resource_metadata': {'display_name': 'tempest-server-test-2101676201', 'name': 'tap5991e52e-d3', 'instance_id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:73:94:61', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5991e52e-d3'}, 'message_id': '701356e8-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7279.227937662, 'message_signature': 'd3b4a1086161af71b02f059fa48e0c044d8193f1e101ccd57c0b15c09a801f69'}]}, 'timestamp': '2026-01-22 18:01:55.471514', '_unique_id': 'ac0521e0c0e341648c354a282ca5edd3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.472 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.473 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.486 12 DEBUG ceilometer.compute.pollsters [-] afb51767-98e4-4f27-bf80-d54f23cd06c6/memory.usage volume: 42.7578125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5f240af-d5b3-40c8-ad82-65c85f7dcee6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.7578125, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6', 'timestamp': '2026-01-22T18:01:55.473450', 'resource_metadata': {'display_name': 'tempest-server-test-2101676201', 'name': 'instance-00000051', 'instance_id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '7015c43c-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7279.247213032, 'message_signature': '463d057540f583e7401ac0fa4b0cc2b6c7b116817f3d64b5c04311a88b3a87ae'}]}, 'timestamp': '2026-01-22 18:01:55.487411', '_unique_id': '954ed17624d34d4bb529cc4eaf9f9891'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.488 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.489 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.489 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.489 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-2101676201>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-2101676201>]
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.489 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.489 12 DEBUG ceilometer.compute.pollsters [-] afb51767-98e4-4f27-bf80-d54f23cd06c6/network.outgoing.packets volume: 24 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19814181-ed38-4a32-ae7c-49ce8dd346ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 24, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000051-afb51767-98e4-4f27-bf80-d54f23cd06c6-tap5991e52e-d3', 'timestamp': '2026-01-22T18:01:55.489541', 'resource_metadata': {'display_name': 'tempest-server-test-2101676201', 'name': 'tap5991e52e-d3', 'instance_id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:73:94:61', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5991e52e-d3'}, 'message_id': '701625da-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7279.227937662, 'message_signature': '9607ae4213c7acc99000afb0a3013a2b0abdba961fd107bff84d06c44ef8b8f1'}]}, 'timestamp': '2026-01-22 18:01:55.489855', '_unique_id': '979eb4c9ce7b4107bc2c335594e11731'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.490 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.491 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.491 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-2101676201>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-2101676201>]
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.491 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.501 12 DEBUG ceilometer.compute.pollsters [-] afb51767-98e4-4f27-bf80-d54f23cd06c6/disk.device.read.requests volume: 984 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b9760dc-97e2-455a-bd35-ec3690b8dbe9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 984, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6-vda', 'timestamp': '2026-01-22T18:01:55.491392', 'resource_metadata': {'display_name': 'tempest-server-test-2101676201', 'name': 'instance-00000051', 'instance_id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7017f59a-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7279.251825697, 'message_signature': '4a656a324283d91adb04c9643a35df356b5d88485fd1e707b8165c76bf337291'}]}, 'timestamp': '2026-01-22 18:01:55.501791', '_unique_id': '50aa061d4475453e8fffe9b6d9a6a930'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.502 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.503 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.503 12 DEBUG ceilometer.compute.pollsters [-] afb51767-98e4-4f27-bf80-d54f23cd06c6/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bb543ce9-2942-4327-9fa4-3100768aff0d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000051-afb51767-98e4-4f27-bf80-d54f23cd06c6-tap5991e52e-d3', 'timestamp': '2026-01-22T18:01:55.503157', 'resource_metadata': {'display_name': 'tempest-server-test-2101676201', 'name': 'tap5991e52e-d3', 'instance_id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:73:94:61', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5991e52e-d3'}, 'message_id': '701838ac-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7279.227937662, 'message_signature': '8d65de9e5e6e4dbc5ba7d753bb5f1de787ff69642a1d0e584c12f770e27ccdba'}]}, 'timestamp': '2026-01-22 18:01:55.503485', '_unique_id': '3a394f288e5f4e66a53ce543df03352a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.504 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.505 12 DEBUG ceilometer.compute.pollsters [-] afb51767-98e4-4f27-bf80-d54f23cd06c6/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8808c981-005d-475c-b68b-de03a9e79baf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000051-afb51767-98e4-4f27-bf80-d54f23cd06c6-tap5991e52e-d3', 'timestamp': '2026-01-22T18:01:55.504992', 'resource_metadata': {'display_name': 'tempest-server-test-2101676201', 'name': 'tap5991e52e-d3', 'instance_id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:73:94:61', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5991e52e-d3'}, 'message_id': '70188212-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7279.227937662, 'message_signature': '47bdc280fe479503b50b5de866b3b5187d03d3a8a05c1ae3574dd7a15ef78b12'}]}, 'timestamp': '2026-01-22 18:01:55.505366', '_unique_id': '67886e1b2593404f8d76dd759a09238d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.506 12 DEBUG ceilometer.compute.pollsters [-] afb51767-98e4-4f27-bf80-d54f23cd06c6/disk.device.write.bytes volume: 25899008 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9c5c002e-e2a5-4fd8-b7ac-112a032d5725', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 25899008, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6-vda', 'timestamp': '2026-01-22T18:01:55.506905', 'resource_metadata': {'display_name': 'tempest-server-test-2101676201', 'name': 'instance-00000051', 'instance_id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7018caba-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7279.251825697, 'message_signature': '45d35b6d515535392239081b472a2ed5e74c0bb034680e09deab1e5d17d91209'}]}, 'timestamp': '2026-01-22 18:01:55.507210', '_unique_id': 'aa52be8d10d04d7b81127cf825a6ff22'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.507 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.508 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.508 12 DEBUG ceilometer.compute.pollsters [-] afb51767-98e4-4f27-bf80-d54f23cd06c6/network.incoming.bytes volume: 1436 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea9dc199-912a-4c67-ad5d-93bd03d907d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1436, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000051-afb51767-98e4-4f27-bf80-d54f23cd06c6-tap5991e52e-d3', 'timestamp': '2026-01-22T18:01:55.508761', 'resource_metadata': {'display_name': 'tempest-server-test-2101676201', 'name': 'tap5991e52e-d3', 'instance_id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:73:94:61', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5991e52e-d3'}, 'message_id': '7019134e-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7279.227937662, 'message_signature': '5714e80804814eba7eac123e96f96352047f4d644a06c6bfcbf589b32c5121ef'}]}, 'timestamp': '2026-01-22 18:01:55.509078', '_unique_id': 'bc3f6191ab6f43c5b435a436e9664c10'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.509 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.510 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.518 12 DEBUG ceilometer.compute.pollsters [-] afb51767-98e4-4f27-bf80-d54f23cd06c6/disk.device.usage volume: 28573696 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '353b230e-62c7-4991-954b-b820c9e3433c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 28573696, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6-vda', 'timestamp': '2026-01-22T18:01:55.510756', 'resource_metadata': {'display_name': 'tempest-server-test-2101676201', 'name': 'instance-00000051', 'instance_id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '701a90a2-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7279.27119794, 'message_signature': '86fb46ba13b2047c617004a1721dd2452eb254bdbeb9b0cb782680ebdfa0222f'}]}, 'timestamp': '2026-01-22 18:01:55.518879', '_unique_id': '5bb07ff267b348c4b83c54bd5134979c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.519 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.520 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.520 12 DEBUG ceilometer.compute.pollsters [-] afb51767-98e4-4f27-bf80-d54f23cd06c6/cpu volume: 10460000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da42c0e8-9962-4936-8ad7-bb4fef1ab278', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10460000000, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6', 'timestamp': '2026-01-22T18:01:55.520522', 'resource_metadata': {'display_name': 'tempest-server-test-2101676201', 'name': 'instance-00000051', 'instance_id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '701adfa8-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7279.247213032, 'message_signature': '5671ad50f3aea41c541fb50e74e03cf77e7f7dfe64e51dcc2ff143e3fb81571d'}]}, 'timestamp': '2026-01-22 18:01:55.520857', '_unique_id': 'e5e3f70d13924018a3e7ec4adc3ccb71'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.521 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.522 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.522 12 DEBUG ceilometer.compute.pollsters [-] afb51767-98e4-4f27-bf80-d54f23cd06c6/disk.device.read.latency volume: 146602287 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4ef23df-40a4-408b-8b6a-de1ccd107085', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 146602287, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6-vda', 'timestamp': '2026-01-22T18:01:55.522391', 'resource_metadata': {'display_name': 'tempest-server-test-2101676201', 'name': 'instance-00000051', 'instance_id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '701b27b0-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7279.251825697, 'message_signature': '65235c64d6ac91668fade11286a55223f9974a04a9116f4ae58589f35652b3b3'}]}, 'timestamp': '2026-01-22 18:01:55.522714', '_unique_id': 'c50acc8d5356417e955d8f2cf044b458'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.523 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.524 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.524 12 DEBUG ceilometer.compute.pollsters [-] afb51767-98e4-4f27-bf80-d54f23cd06c6/network.incoming.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '43843215-218b-4ba0-be03-7b8719920516', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 11, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000051-afb51767-98e4-4f27-bf80-d54f23cd06c6-tap5991e52e-d3', 'timestamp': '2026-01-22T18:01:55.524249', 'resource_metadata': {'display_name': 'tempest-server-test-2101676201', 'name': 'tap5991e52e-d3', 'instance_id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:73:94:61', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5991e52e-d3'}, 'message_id': '701b7076-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7279.227937662, 'message_signature': '1754112c8e448df764057afd212c4ec54134ad63937f824d1eda9993cd70d23d'}]}, 'timestamp': '2026-01-22 18:01:55.524568', '_unique_id': '1d6c03d7d06942ef82fb7d07fb32e208'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.525 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 DEBUG ceilometer.compute.pollsters [-] afb51767-98e4-4f27-bf80-d54f23cd06c6/disk.device.read.bytes volume: 28441600 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e6988328-4d24-427d-b405-7de209b14a1b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 28441600, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6-vda', 'timestamp': '2026-01-22T18:01:55.526022', 'resource_metadata': {'display_name': 'tempest-server-test-2101676201', 'name': 'instance-00000051', 'instance_id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '701bb568-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7279.251825697, 'message_signature': 'ea2de7e32351da729f2a5827f4ecb81209f8fe1fca77cef04d80107ad06bfdf7'}]}, 'timestamp': '2026-01-22 18:01:55.526342', '_unique_id': 'b4fb356bae304bea914cd341eaae6684'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.526 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.527 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.527 12 DEBUG ceilometer.compute.pollsters [-] afb51767-98e4-4f27-bf80-d54f23cd06c6/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a0fc733-3198-4809-b37e-4e602e066920', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000051-afb51767-98e4-4f27-bf80-d54f23cd06c6-tap5991e52e-d3', 'timestamp': '2026-01-22T18:01:55.527820', 'resource_metadata': {'display_name': 'tempest-server-test-2101676201', 'name': 'tap5991e52e-d3', 'instance_id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:73:94:61', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5991e52e-d3'}, 'message_id': '701bfc08-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7279.227937662, 'message_signature': '66c7e361c1461ebd87f575eb25029905f6a79ee44b1785b3ddeaa6085400d561'}]}, 'timestamp': '2026-01-22 18:01:55.528142', '_unique_id': '08598eed1ee346efb9380f022e49f5d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.528 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.529 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.529 12 DEBUG ceilometer.compute.pollsters [-] afb51767-98e4-4f27-bf80-d54f23cd06c6/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '736cfa48-11e0-4fd2-b43b-8025951cd2d4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000051-afb51767-98e4-4f27-bf80-d54f23cd06c6-tap5991e52e-d3', 'timestamp': '2026-01-22T18:01:55.529699', 'resource_metadata': {'display_name': 'tempest-server-test-2101676201', 'name': 'tap5991e52e-d3', 'instance_id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:73:94:61', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5991e52e-d3'}, 'message_id': '701c4578-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7279.227937662, 'message_signature': '4f8229780989826d812ef07695c7ab2163e9524f33aefdeb402c726a356e5822'}]}, 'timestamp': '2026-01-22 18:01:55.530023', '_unique_id': 'cc6bcc31b7ed4e2e83f6d30306bde96c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.530 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.531 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.531 12 DEBUG ceilometer.compute.pollsters [-] afb51767-98e4-4f27-bf80-d54f23cd06c6/network.outgoing.bytes volume: 2148 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a6f01a6e-ab67-47c6-b9e5-10b109e4c1e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2148, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000051-afb51767-98e4-4f27-bf80-d54f23cd06c6-tap5991e52e-d3', 'timestamp': '2026-01-22T18:01:55.531562', 'resource_metadata': {'display_name': 'tempest-server-test-2101676201', 'name': 'tap5991e52e-d3', 'instance_id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:73:94:61', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5991e52e-d3'}, 'message_id': '701c8eac-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7279.227937662, 'message_signature': 'd875f977799aecf65106c0cddae213b120981ecf0d4806bdf32e87500122ed72'}]}, 'timestamp': '2026-01-22 18:01:55.531895', '_unique_id': '85bde820602145229af1a01a6ab9dfb5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.532 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.533 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.533 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.533 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-2101676201>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-2101676201>]
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.533 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.533 12 DEBUG ceilometer.compute.pollsters [-] afb51767-98e4-4f27-bf80-d54f23cd06c6/disk.device.allocation volume: 28647424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ec7a1b48-c93f-40f8-8898-37a8d2c3769f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 28647424, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6-vda', 'timestamp': '2026-01-22T18:01:55.533818', 'resource_metadata': {'display_name': 'tempest-server-test-2101676201', 'name': 'instance-00000051', 'instance_id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '701ce622-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7279.27119794, 'message_signature': 'ed54afa4dd1c15d021c9f66f7ade10a9df97275b79b5ac9bd5bb192dfe39e617'}]}, 'timestamp': '2026-01-22 18:01:55.534129', '_unique_id': 'b5bb6166407040d7b85db6852de31afa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.535 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.535 12 DEBUG ceilometer.compute.pollsters [-] afb51767-98e4-4f27-bf80-d54f23cd06c6/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '37c4833b-3982-493a-994a-f61fb91aa55e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'instance-00000051-afb51767-98e4-4f27-bf80-d54f23cd06c6-tap5991e52e-d3', 'timestamp': '2026-01-22T18:01:55.535645', 'resource_metadata': {'display_name': 'tempest-server-test-2101676201', 'name': 'tap5991e52e-d3', 'instance_id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:73:94:61', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5991e52e-d3'}, 'message_id': '701d2dda-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7279.227937662, 'message_signature': 'f1fbbd74ae29e925930a2cba02ad47f13d173a927fef7adeb807c72087b8e3fe'}]}, 'timestamp': '2026-01-22 18:01:55.535973', '_unique_id': 'd83aa88de5d24746b346b77347725226'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.536 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.537 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.537 12 DEBUG ceilometer.compute.pollsters [-] afb51767-98e4-4f27-bf80-d54f23cd06c6/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '49ea4338-c842-4b84-9d67-07a40793e507', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6-vda', 'timestamp': '2026-01-22T18:01:55.537419', 'resource_metadata': {'display_name': 'tempest-server-test-2101676201', 'name': 'instance-00000051', 'instance_id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '701d7286-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7279.27119794, 'message_signature': '7c115ce4b1e5e15048888afe73b25794107affb1f874c1f95e5205d631240dd2'}]}, 'timestamp': '2026-01-22 18:01:55.537786', '_unique_id': '1b8ef88f23b340b6ab2ca921a66e2979'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.538 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.539 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.539 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.539 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-2101676201>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-2101676201>]
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.539 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.539 12 DEBUG ceilometer.compute.pollsters [-] afb51767-98e4-4f27-bf80-d54f23cd06c6/disk.device.write.requests volume: 250 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b8ae2949-d995-4dd5-8a73-2d59d7bf638d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 250, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6-vda', 'timestamp': '2026-01-22T18:01:55.539723', 'resource_metadata': {'display_name': 'tempest-server-test-2101676201', 'name': 'instance-00000051', 'instance_id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '701dccf4-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7279.251825697, 'message_signature': '3782d9a9ad5d432e28d2adb8af8d6829362c23e49b4c5926ddee30125dad9d4e'}]}, 'timestamp': '2026-01-22 18:01:55.540035', '_unique_id': '281f386cd54443549a5dced53b89b82b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.540 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.541 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.541 12 DEBUG ceilometer.compute.pollsters [-] afb51767-98e4-4f27-bf80-d54f23cd06c6/disk.device.write.latency volume: 2950363806 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f8a6c9c-595f-453b-96a1-312d5a8e3fff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2950363806, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_name': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_name': None, 'resource_id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6-vda', 'timestamp': '2026-01-22T18:01:55.541568', 'resource_metadata': {'display_name': 'tempest-server-test-2101676201', 'name': 'instance-00000051', 'instance_id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6', 'instance_type': 'm1.nano', 'host': '481ca6528745db8b4e9a8a8ae6e404b926a143d63ca9385d17cded74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '701e1574-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7279.251825697, 'message_signature': '1f6033e40b2ffa07626c19983543472f072a98d55e9c3c9422137e8347803004'}]}, 'timestamp': '2026-01-22 18:01:55.541889', '_unique_id': 'b7fa0b85a9454fa6887913b08353fbc0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:01:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:01:55.542 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:01:58 compute-0 podman[246561]: 2026-01-22 18:01:58.368515697 +0000 UTC m=+0.061958214 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 22 18:01:58 compute-0 podman[246560]: 2026-01-22 18:01:58.383537112 +0000 UTC m=+0.087742279 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 18:01:58 compute-0 podman[246559]: 2026-01-22 18:01:58.390811089 +0000 UTC m=+0.097557445 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20251202, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 18:01:58 compute-0 nova_compute[183075]: 2026-01-22 18:01:58.729 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:58 compute-0 nova_compute[183075]: 2026-01-22 18:01:58.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:01:58 compute-0 nova_compute[183075]: 2026-01-22 18:01:58.789 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 18:01:58 compute-0 nova_compute[183075]: 2026-01-22 18:01:58.811 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 18:01:58 compute-0 nova_compute[183075]: 2026-01-22 18:01:58.896 183079 INFO nova.compute.manager [None req-235ff398-6bd1-4f49-89a3-8b67dc34696d 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Get console output
Jan 22 18:01:58 compute-0 nova_compute[183075]: 2026-01-22 18:01:58.904 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:01:59 compute-0 nova_compute[183075]: 2026-01-22 18:01:59.545 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:01:59 compute-0 nova_compute[183075]: 2026-01-22 18:01:59.806 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:02:00 compute-0 nova_compute[183075]: 2026-01-22 18:02:00.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:02:00 compute-0 nova_compute[183075]: 2026-01-22 18:02:00.825 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:02:00 compute-0 nova_compute[183075]: 2026-01-22 18:02:00.825 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:02:00 compute-0 nova_compute[183075]: 2026-01-22 18:02:00.826 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:02:00 compute-0 nova_compute[183075]: 2026-01-22 18:02:00.826 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 18:02:00 compute-0 nova_compute[183075]: 2026-01-22 18:02:00.903 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/afb51767-98e4-4f27-bf80-d54f23cd06c6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:02:00 compute-0 nova_compute[183075]: 2026-01-22 18:02:00.963 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/afb51767-98e4-4f27-bf80-d54f23cd06c6/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:02:00 compute-0 nova_compute[183075]: 2026-01-22 18:02:00.964 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/afb51767-98e4-4f27-bf80-d54f23cd06c6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:02:01 compute-0 nova_compute[183075]: 2026-01-22 18:02:01.020 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/afb51767-98e4-4f27-bf80-d54f23cd06c6/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:02:01 compute-0 nova_compute[183075]: 2026-01-22 18:02:01.191 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 18:02:01 compute-0 nova_compute[183075]: 2026-01-22 18:02:01.193 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5541MB free_disk=73.32389450073242GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 18:02:01 compute-0 nova_compute[183075]: 2026-01-22 18:02:01.194 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:02:01 compute-0 nova_compute[183075]: 2026-01-22 18:02:01.195 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:02:01 compute-0 nova_compute[183075]: 2026-01-22 18:02:01.275 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance afb51767-98e4-4f27-bf80-d54f23cd06c6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 18:02:01 compute-0 nova_compute[183075]: 2026-01-22 18:02:01.275 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 18:02:01 compute-0 nova_compute[183075]: 2026-01-22 18:02:01.276 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 18:02:01 compute-0 nova_compute[183075]: 2026-01-22 18:02:01.327 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 18:02:01 compute-0 nova_compute[183075]: 2026-01-22 18:02:01.340 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 18:02:01 compute-0 nova_compute[183075]: 2026-01-22 18:02:01.362 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 18:02:01 compute-0 nova_compute[183075]: 2026-01-22 18:02:01.362 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:02:02 compute-0 podman[246631]: 2026-01-22 18:02:02.358447167 +0000 UTC m=+0.070410162 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Jan 22 18:02:02 compute-0 nova_compute[183075]: 2026-01-22 18:02:02.363 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:02:02 compute-0 nova_compute[183075]: 2026-01-22 18:02:02.363 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 18:02:02 compute-0 nova_compute[183075]: 2026-01-22 18:02:02.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:02:03 compute-0 nova_compute[183075]: 2026-01-22 18:02:03.733 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:02:04 compute-0 nova_compute[183075]: 2026-01-22 18:02:04.035 183079 INFO nova.compute.manager [None req-76374470-5866-466d-859d-dc362db6c2e0 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Get console output
Jan 22 18:02:04 compute-0 nova_compute[183075]: 2026-01-22 18:02:04.041 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:02:04 compute-0 nova_compute[183075]: 2026-01-22 18:02:04.626 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:02:08 compute-0 nova_compute[183075]: 2026-01-22 18:02:08.736 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:02:09 compute-0 nova_compute[183075]: 2026-01-22 18:02:09.172 183079 INFO nova.compute.manager [None req-383968d3-e2b9-4e27-9cfe-95951e7ba799 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Get console output
Jan 22 18:02:09 compute-0 nova_compute[183075]: 2026-01-22 18:02:09.177 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:02:09 compute-0 nova_compute[183075]: 2026-01-22 18:02:09.629 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:02:13 compute-0 podman[246651]: 2026-01-22 18:02:13.341324464 +0000 UTC m=+0.052960350 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 18:02:13 compute-0 nova_compute[183075]: 2026-01-22 18:02:13.740 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:02:14 compute-0 nova_compute[183075]: 2026-01-22 18:02:14.295 183079 INFO nova.compute.manager [None req-36227987-70c5-484c-95f2-76b105878dda 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Get console output
Jan 22 18:02:14 compute-0 nova_compute[183075]: 2026-01-22 18:02:14.300 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:02:14 compute-0 nova_compute[183075]: 2026-01-22 18:02:14.631 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:02:17 compute-0 podman[246678]: 2026-01-22 18:02:17.337985666 +0000 UTC m=+0.051488521 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 18:02:18 compute-0 nova_compute[183075]: 2026-01-22 18:02:18.777 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:02:19 compute-0 nova_compute[183075]: 2026-01-22 18:02:19.404 183079 INFO nova.compute.manager [None req-b4258d2a-feca-4ad9-b6a0-135260c6bf9e 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Get console output
Jan 22 18:02:19 compute-0 nova_compute[183075]: 2026-01-22 18:02:19.408 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:02:19 compute-0 nova_compute[183075]: 2026-01-22 18:02:19.632 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:02:19 compute-0 nova_compute[183075]: 2026-01-22 18:02:19.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:02:23 compute-0 nova_compute[183075]: 2026-01-22 18:02:23.781 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:02:24 compute-0 nova_compute[183075]: 2026-01-22 18:02:24.551 183079 INFO nova.compute.manager [None req-1475b3fd-f419-4de1-b9f9-432088fb6219 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Get console output
Jan 22 18:02:24 compute-0 nova_compute[183075]: 2026-01-22 18:02:24.555 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:02:24 compute-0 nova_compute[183075]: 2026-01-22 18:02:24.634 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:02:28 compute-0 nova_compute[183075]: 2026-01-22 18:02:28.783 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:02:29 compute-0 podman[246703]: 2026-01-22 18:02:29.381494664 +0000 UTC m=+0.055181901 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 18:02:29 compute-0 podman[246704]: 2026-01-22 18:02:29.401412911 +0000 UTC m=+0.067816941 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, container_name=openstack_network_exporter, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal)
Jan 22 18:02:29 compute-0 podman[246702]: 2026-01-22 18:02:29.435505432 +0000 UTC m=+0.114043520 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 18:02:29 compute-0 nova_compute[183075]: 2026-01-22 18:02:29.636 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:02:29 compute-0 nova_compute[183075]: 2026-01-22 18:02:29.649 183079 INFO nova.compute.manager [None req-85f5ceaf-df12-413b-85c7-22428195065a 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Get console output
Jan 22 18:02:29 compute-0 nova_compute[183075]: 2026-01-22 18:02:29.653 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:02:33 compute-0 podman[246769]: 2026-01-22 18:02:33.362576833 +0000 UTC m=+0.066586928 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 22 18:02:33 compute-0 nova_compute[183075]: 2026-01-22 18:02:33.786 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:02:34 compute-0 nova_compute[183075]: 2026-01-22 18:02:34.638 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:02:34 compute-0 nova_compute[183075]: 2026-01-22 18:02:34.865 183079 INFO nova.compute.manager [None req-ca307704-d0ea-4e9c-80d2-c1d6cffbf7d9 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Get console output
Jan 22 18:02:34 compute-0 nova_compute[183075]: 2026-01-22 18:02:34.870 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:02:38 compute-0 nova_compute[183075]: 2026-01-22 18:02:38.789 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:02:39 compute-0 nova_compute[183075]: 2026-01-22 18:02:39.684 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:02:39 compute-0 nova_compute[183075]: 2026-01-22 18:02:39.991 183079 INFO nova.compute.manager [None req-a8041ada-baa3-450e-bdff-ba708cd63435 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Get console output
Jan 22 18:02:39 compute-0 nova_compute[183075]: 2026-01-22 18:02:39.995 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:02:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:02:41.985 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:02:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:02:41.986 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:02:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:02:41.986 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:02:43 compute-0 nova_compute[183075]: 2026-01-22 18:02:43.791 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:02:44 compute-0 podman[246804]: 2026-01-22 18:02:44.364400204 +0000 UTC m=+0.066097445 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 18:02:44 compute-0 nova_compute[183075]: 2026-01-22 18:02:44.685 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:02:45 compute-0 nova_compute[183075]: 2026-01-22 18:02:45.120 183079 INFO nova.compute.manager [None req-60f6af46-5d02-41d6-830e-a2004b9787ce 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Get console output
Jan 22 18:02:45 compute-0 nova_compute[183075]: 2026-01-22 18:02:45.125 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:02:48 compute-0 podman[246827]: 2026-01-22 18:02:48.346710147 +0000 UTC m=+0.053359411 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 18:02:48 compute-0 nova_compute[183075]: 2026-01-22 18:02:48.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:02:48 compute-0 nova_compute[183075]: 2026-01-22 18:02:48.793 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:02:49 compute-0 nova_compute[183075]: 2026-01-22 18:02:49.711 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:02:49 compute-0 nova_compute[183075]: 2026-01-22 18:02:49.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:02:50 compute-0 nova_compute[183075]: 2026-01-22 18:02:50.251 183079 INFO nova.compute.manager [None req-f1c12184-08f6-4d66-b6cd-30124b46f6dd 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Get console output
Jan 22 18:02:50 compute-0 nova_compute[183075]: 2026-01-22 18:02:50.255 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:02:50 compute-0 nova_compute[183075]: 2026-01-22 18:02:50.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:02:50 compute-0 nova_compute[183075]: 2026-01-22 18:02:50.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:02:53 compute-0 nova_compute[183075]: 2026-01-22 18:02:53.796 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:02:54 compute-0 nova_compute[183075]: 2026-01-22 18:02:54.713 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:02:55 compute-0 nova_compute[183075]: 2026-01-22 18:02:55.369 183079 INFO nova.compute.manager [None req-d0b4592e-0acb-4433-ac7b-84846de93d07 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Get console output
Jan 22 18:02:55 compute-0 nova_compute[183075]: 2026-01-22 18:02:55.375 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:02:58 compute-0 nova_compute[183075]: 2026-01-22 18:02:58.799 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:02:59 compute-0 nova_compute[183075]: 2026-01-22 18:02:59.716 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:02:59 compute-0 nova_compute[183075]: 2026-01-22 18:02:59.784 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:03:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:00.060 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=62, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=61) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 18:03:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:00.061 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 18:03:00 compute-0 nova_compute[183075]: 2026-01-22 18:03:00.061 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:00 compute-0 podman[246853]: 2026-01-22 18:03:00.341912633 +0000 UTC m=+0.044941164 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 22 18:03:00 compute-0 podman[246854]: 2026-01-22 18:03:00.353878156 +0000 UTC m=+0.054421160 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, version=9.6, architecture=x86_64, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, container_name=openstack_network_exporter)
Jan 22 18:03:00 compute-0 podman[246852]: 2026-01-22 18:03:00.396661911 +0000 UTC m=+0.103211768 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 22 18:03:00 compute-0 nova_compute[183075]: 2026-01-22 18:03:00.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:03:00 compute-0 nova_compute[183075]: 2026-01-22 18:03:00.787 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 18:03:00 compute-0 nova_compute[183075]: 2026-01-22 18:03:00.787 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 18:03:00 compute-0 nova_compute[183075]: 2026-01-22 18:03:00.949 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-afb51767-98e4-4f27-bf80-d54f23cd06c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 18:03:00 compute-0 nova_compute[183075]: 2026-01-22 18:03:00.950 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-afb51767-98e4-4f27-bf80-d54f23cd06c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 18:03:00 compute-0 nova_compute[183075]: 2026-01-22 18:03:00.950 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 18:03:00 compute-0 nova_compute[183075]: 2026-01-22 18:03:00.950 183079 DEBUG nova.objects.instance [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lazy-loading 'info_cache' on Instance uuid afb51767-98e4-4f27-bf80-d54f23cd06c6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 18:03:02 compute-0 nova_compute[183075]: 2026-01-22 18:03:02.088 183079 DEBUG nova.compute.manager [req-ae789511-7f34-46a4-a227-19f977aba87a req-68361f48-caad-4427-859a-85a402e3ec5a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Received event network-changed-5991e52e-d36a-4639-b0c2-6e456926f678 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:03:02 compute-0 nova_compute[183075]: 2026-01-22 18:03:02.089 183079 DEBUG nova.compute.manager [req-ae789511-7f34-46a4-a227-19f977aba87a req-68361f48-caad-4427-859a-85a402e3ec5a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Refreshing instance network info cache due to event network-changed-5991e52e-d36a-4639-b0c2-6e456926f678. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 18:03:02 compute-0 nova_compute[183075]: 2026-01-22 18:03:02.089 183079 DEBUG oslo_concurrency.lockutils [req-ae789511-7f34-46a4-a227-19f977aba87a req-68361f48-caad-4427-859a-85a402e3ec5a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-afb51767-98e4-4f27-bf80-d54f23cd06c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 18:03:02 compute-0 nova_compute[183075]: 2026-01-22 18:03:02.517 183079 DEBUG nova.compute.manager [req-6ab85e00-d794-4685-bfd8-e13eb67ec8b4 req-7b29b803-c8d5-45f7-b4ca-e438ddbffc8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Received event network-changed-5991e52e-d36a-4639-b0c2-6e456926f678 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:03:02 compute-0 nova_compute[183075]: 2026-01-22 18:03:02.517 183079 DEBUG nova.compute.manager [req-6ab85e00-d794-4685-bfd8-e13eb67ec8b4 req-7b29b803-c8d5-45f7-b4ca-e438ddbffc8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Refreshing instance network info cache due to event network-changed-5991e52e-d36a-4639-b0c2-6e456926f678. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 18:03:02 compute-0 nova_compute[183075]: 2026-01-22 18:03:02.518 183079 DEBUG oslo_concurrency.lockutils [req-6ab85e00-d794-4685-bfd8-e13eb67ec8b4 req-7b29b803-c8d5-45f7-b4ca-e438ddbffc8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-afb51767-98e4-4f27-bf80-d54f23cd06c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 18:03:02 compute-0 nova_compute[183075]: 2026-01-22 18:03:02.530 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Updating instance_info_cache with network_info: [{"id": "5991e52e-d36a-4639-b0c2-6e456926f678", "address": "fa:16:3e:73:94:61", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5991e52e-d3", "ovs_interfaceid": "5991e52e-d36a-4639-b0c2-6e456926f678", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 18:03:02 compute-0 nova_compute[183075]: 2026-01-22 18:03:02.546 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-afb51767-98e4-4f27-bf80-d54f23cd06c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 18:03:02 compute-0 nova_compute[183075]: 2026-01-22 18:03:02.546 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 18:03:02 compute-0 nova_compute[183075]: 2026-01-22 18:03:02.546 183079 DEBUG oslo_concurrency.lockutils [req-ae789511-7f34-46a4-a227-19f977aba87a req-68361f48-caad-4427-859a-85a402e3ec5a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-afb51767-98e4-4f27-bf80-d54f23cd06c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 18:03:02 compute-0 nova_compute[183075]: 2026-01-22 18:03:02.547 183079 DEBUG nova.network.neutron [req-ae789511-7f34-46a4-a227-19f977aba87a req-68361f48-caad-4427-859a-85a402e3ec5a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Refreshing network info cache for port 5991e52e-d36a-4639-b0c2-6e456926f678 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 18:03:02 compute-0 nova_compute[183075]: 2026-01-22 18:03:02.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:03:02 compute-0 nova_compute[183075]: 2026-01-22 18:03:02.809 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:03:02 compute-0 nova_compute[183075]: 2026-01-22 18:03:02.809 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:03:02 compute-0 nova_compute[183075]: 2026-01-22 18:03:02.809 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:03:02 compute-0 nova_compute[183075]: 2026-01-22 18:03:02.810 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 18:03:02 compute-0 nova_compute[183075]: 2026-01-22 18:03:02.897 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/afb51767-98e4-4f27-bf80-d54f23cd06c6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:03:02 compute-0 nova_compute[183075]: 2026-01-22 18:03:02.953 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/afb51767-98e4-4f27-bf80-d54f23cd06c6/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:03:02 compute-0 nova_compute[183075]: 2026-01-22 18:03:02.954 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/afb51767-98e4-4f27-bf80-d54f23cd06c6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.013 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/afb51767-98e4-4f27-bf80-d54f23cd06c6/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.172 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.174 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5551MB free_disk=73.32297897338867GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.174 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.174 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.335 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance afb51767-98e4-4f27-bf80-d54f23cd06c6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.335 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.336 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.442 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.458 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.459 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.460 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.285s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.564 183079 DEBUG oslo_concurrency.lockutils [None req-ad329d69-1922-4040-819f-f5eca4e0dfd1 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "afb51767-98e4-4f27-bf80-d54f23cd06c6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.565 183079 DEBUG oslo_concurrency.lockutils [None req-ad329d69-1922-4040-819f-f5eca4e0dfd1 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "afb51767-98e4-4f27-bf80-d54f23cd06c6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.566 183079 DEBUG oslo_concurrency.lockutils [None req-ad329d69-1922-4040-819f-f5eca4e0dfd1 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "afb51767-98e4-4f27-bf80-d54f23cd06c6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.566 183079 DEBUG oslo_concurrency.lockutils [None req-ad329d69-1922-4040-819f-f5eca4e0dfd1 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "afb51767-98e4-4f27-bf80-d54f23cd06c6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.567 183079 DEBUG oslo_concurrency.lockutils [None req-ad329d69-1922-4040-819f-f5eca4e0dfd1 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "afb51767-98e4-4f27-bf80-d54f23cd06c6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.570 183079 INFO nova.compute.manager [None req-ad329d69-1922-4040-819f-f5eca4e0dfd1 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Terminating instance
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.572 183079 DEBUG nova.compute.manager [None req-ad329d69-1922-4040-819f-f5eca4e0dfd1 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 18:03:03 compute-0 kernel: tap5991e52e-d3 (unregistering): left promiscuous mode
Jan 22 18:03:03 compute-0 NetworkManager[55454]: <info>  [1769104983.6022] device (tap5991e52e-d3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.608 183079 DEBUG nova.network.neutron [req-ae789511-7f34-46a4-a227-19f977aba87a req-68361f48-caad-4427-859a-85a402e3ec5a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Updated VIF entry in instance network info cache for port 5991e52e-d36a-4639-b0c2-6e456926f678. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.608 183079 DEBUG nova.network.neutron [req-ae789511-7f34-46a4-a227-19f977aba87a req-68361f48-caad-4427-859a-85a402e3ec5a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Updating instance_info_cache with network_info: [{"id": "5991e52e-d36a-4639-b0c2-6e456926f678", "address": "fa:16:3e:73:94:61", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5991e52e-d3", "ovs_interfaceid": "5991e52e-d36a-4639-b0c2-6e456926f678", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 18:03:03 compute-0 ovn_controller[95372]: 2026-01-22T18:03:03Z|00886|binding|INFO|Releasing lport 5991e52e-d36a-4639-b0c2-6e456926f678 from this chassis (sb_readonly=0)
Jan 22 18:03:03 compute-0 ovn_controller[95372]: 2026-01-22T18:03:03Z|00887|binding|INFO|Setting lport 5991e52e-d36a-4639-b0c2-6e456926f678 down in Southbound
Jan 22 18:03:03 compute-0 ovn_controller[95372]: 2026-01-22T18:03:03Z|00888|binding|INFO|Removing iface tap5991e52e-d3 ovn-installed in OVS
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.612 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:03.616 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:73:94:61 10.100.0.5'], port_security=['fa:16:3e:73:94:61 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'afb51767-98e4-4f27-bf80-d54f23cd06c6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.195'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=5991e52e-d36a-4639-b0c2-6e456926f678) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 18:03:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:03.617 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 5991e52e-d36a-4639-b0c2-6e456926f678 in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 unbound from our chassis
Jan 22 18:03:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:03.618 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 18:03:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:03.620 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d06f4a85-9d88-4a4d-beca-a99c7a09551d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:03:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:03.620 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 namespace which is not needed anymore
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.629 183079 DEBUG oslo_concurrency.lockutils [req-ae789511-7f34-46a4-a227-19f977aba87a req-68361f48-caad-4427-859a-85a402e3ec5a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-afb51767-98e4-4f27-bf80-d54f23cd06c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.630 183079 DEBUG oslo_concurrency.lockutils [req-6ab85e00-d794-4685-bfd8-e13eb67ec8b4 req-7b29b803-c8d5-45f7-b4ca-e438ddbffc8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-afb51767-98e4-4f27-bf80-d54f23cd06c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.630 183079 DEBUG nova.network.neutron [req-6ab85e00-d794-4685-bfd8-e13eb67ec8b4 req-7b29b803-c8d5-45f7-b4ca-e438ddbffc8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Refreshing network info cache for port 5991e52e-d36a-4639-b0c2-6e456926f678 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.636 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:03 compute-0 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d00000051.scope: Deactivated successfully.
Jan 22 18:03:03 compute-0 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d00000051.scope: Consumed 15.006s CPU time.
Jan 22 18:03:03 compute-0 systemd-machined[154382]: Machine qemu-81-instance-00000051 terminated.
Jan 22 18:03:03 compute-0 podman[246924]: 2026-01-22 18:03:03.719993156 +0000 UTC m=+0.090565965 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 18:03:03 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[246399]: [NOTICE]   (246403) : haproxy version is 2.8.14-c23fe91
Jan 22 18:03:03 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[246399]: [NOTICE]   (246403) : path to executable is /usr/sbin/haproxy
Jan 22 18:03:03 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[246399]: [WARNING]  (246403) : Exiting Master process...
Jan 22 18:03:03 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[246399]: [WARNING]  (246403) : Exiting Master process...
Jan 22 18:03:03 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[246399]: [ALERT]    (246403) : Current worker (246405) exited with code 143 (Terminated)
Jan 22 18:03:03 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[246399]: [WARNING]  (246403) : All workers exited. Exiting... (0)
Jan 22 18:03:03 compute-0 systemd[1]: libpod-d94defef588d84aec236d5962bba79b2dabdde6a389260928b5ceb8d51b9feb3.scope: Deactivated successfully.
Jan 22 18:03:03 compute-0 podman[246965]: 2026-01-22 18:03:03.764191049 +0000 UTC m=+0.046088085 container died d94defef588d84aec236d5962bba79b2dabdde6a389260928b5ceb8d51b9feb3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 18:03:03 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d94defef588d84aec236d5962bba79b2dabdde6a389260928b5ceb8d51b9feb3-userdata-shm.mount: Deactivated successfully.
Jan 22 18:03:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-f65283b4daddb7e7ba10b6b572a7946cfdf8ed8178a820f811d8285c58d33dc7-merged.mount: Deactivated successfully.
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.801 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:03 compute-0 podman[246965]: 2026-01-22 18:03:03.802661978 +0000 UTC m=+0.084559024 container cleanup d94defef588d84aec236d5962bba79b2dabdde6a389260928b5ceb8d51b9feb3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.805 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.810 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:03 compute-0 systemd[1]: libpod-conmon-d94defef588d84aec236d5962bba79b2dabdde6a389260928b5ceb8d51b9feb3.scope: Deactivated successfully.
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.841 183079 INFO nova.virt.libvirt.driver [-] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Instance destroyed successfully.
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.843 183079 DEBUG nova.objects.instance [None req-ad329d69-1922-4040-819f-f5eca4e0dfd1 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'resources' on Instance uuid afb51767-98e4-4f27-bf80-d54f23cd06c6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.856 183079 DEBUG nova.virt.libvirt.vif [None req-ad329d69-1922-4040-819f-f5eca4e0dfd1 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T18:01:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-2101676201',display_name='tempest-server-test-2101676201',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-2101676201',id=81,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=<?>,launch_index=0,launched_at=2026-01-22T18:01:24Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-zhpa4cyh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_h
w_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T18:01:24Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=afb51767-98e4-4f27-bf80-d54f23cd06c6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5991e52e-d36a-4639-b0c2-6e456926f678", "address": "fa:16:3e:73:94:61", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5991e52e-d3", "ovs_interfaceid": "5991e52e-d36a-4639-b0c2-6e456926f678", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.857 183079 DEBUG nova.network.os_vif_util [None req-ad329d69-1922-4040-819f-f5eca4e0dfd1 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "5991e52e-d36a-4639-b0c2-6e456926f678", "address": "fa:16:3e:73:94:61", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5991e52e-d3", "ovs_interfaceid": "5991e52e-d36a-4639-b0c2-6e456926f678", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.858 183079 DEBUG nova.network.os_vif_util [None req-ad329d69-1922-4040-819f-f5eca4e0dfd1 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:73:94:61,bridge_name='br-int',has_traffic_filtering=True,id=5991e52e-d36a-4639-b0c2-6e456926f678,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5991e52e-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.858 183079 DEBUG os_vif [None req-ad329d69-1922-4040-819f-f5eca4e0dfd1 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:73:94:61,bridge_name='br-int',has_traffic_filtering=True,id=5991e52e-d36a-4639-b0c2-6e456926f678,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5991e52e-d3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.861 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.862 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5991e52e-d3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.865 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.870 183079 INFO os_vif [None req-ad329d69-1922-4040-819f-f5eca4e0dfd1 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:73:94:61,bridge_name='br-int',has_traffic_filtering=True,id=5991e52e-d36a-4639-b0c2-6e456926f678,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5991e52e-d3')
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.871 183079 INFO nova.virt.libvirt.driver [None req-ad329d69-1922-4040-819f-f5eca4e0dfd1 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Deleting instance files /var/lib/nova/instances/afb51767-98e4-4f27-bf80-d54f23cd06c6_del
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.871 183079 INFO nova.virt.libvirt.driver [None req-ad329d69-1922-4040-819f-f5eca4e0dfd1 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Deletion of /var/lib/nova/instances/afb51767-98e4-4f27-bf80-d54f23cd06c6_del complete
Jan 22 18:03:03 compute-0 podman[246998]: 2026-01-22 18:03:03.876791449 +0000 UTC m=+0.051676396 container remove d94defef588d84aec236d5962bba79b2dabdde6a389260928b5ceb8d51b9feb3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 18:03:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:03.884 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[ac99a064-bfd7-49f0-a0d6-e53e285d34cb]: (4, ('Thu Jan 22 06:03:03 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 (d94defef588d84aec236d5962bba79b2dabdde6a389260928b5ceb8d51b9feb3)\nd94defef588d84aec236d5962bba79b2dabdde6a389260928b5ceb8d51b9feb3\nThu Jan 22 06:03:03 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 (d94defef588d84aec236d5962bba79b2dabdde6a389260928b5ceb8d51b9feb3)\nd94defef588d84aec236d5962bba79b2dabdde6a389260928b5ceb8d51b9feb3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:03:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:03.886 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[fb166e26-e4ba-438d-bf4b-42413a8bc5f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:03:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:03.887 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.888 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:03 compute-0 kernel: tap88ed9213-70: left promiscuous mode
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.900 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:03.902 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1209e2cf-057b-42f2-966e-de43cc32b406]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:03:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:03.913 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4a6da893-904d-489b-881d-6e5dad14c0f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:03:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:03.914 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[afc91892-8a9f-4b9f-86f8-69e027643414]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.922 183079 INFO nova.compute.manager [None req-ad329d69-1922-4040-819f-f5eca4e0dfd1 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Took 0.35 seconds to destroy the instance on the hypervisor.
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.922 183079 DEBUG oslo.service.loopingcall [None req-ad329d69-1922-4040-819f-f5eca4e0dfd1 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.922 183079 DEBUG nova.compute.manager [-] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 18:03:03 compute-0 nova_compute[183075]: 2026-01-22 18:03:03.923 183079 DEBUG nova.network.neutron [-] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 18:03:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:03.928 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c3444937-a18f-4c95-9b43-9437dbf4e46e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 724758, 'reachable_time': 39309, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247022, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:03:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:03.930 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 18:03:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:03.930 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[a458748a-352f-4d5e-b920-a78c8d17626b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:03:03 compute-0 systemd[1]: run-netns-ovnmeta\x2d88ed9213\x2d7d48\x2d4c42\x2dad60\x2dce3fcf104f71.mount: Deactivated successfully.
Jan 22 18:03:04 compute-0 nova_compute[183075]: 2026-01-22 18:03:04.168 183079 DEBUG nova.compute.manager [req-e7739e37-a1c2-4c96-8699-7f6dfaf9a5c3 req-d8a89883-3360-468a-8234-6fd5a6ba19cb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Received event network-vif-unplugged-5991e52e-d36a-4639-b0c2-6e456926f678 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:03:04 compute-0 nova_compute[183075]: 2026-01-22 18:03:04.169 183079 DEBUG oslo_concurrency.lockutils [req-e7739e37-a1c2-4c96-8699-7f6dfaf9a5c3 req-d8a89883-3360-468a-8234-6fd5a6ba19cb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "afb51767-98e4-4f27-bf80-d54f23cd06c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:03:04 compute-0 nova_compute[183075]: 2026-01-22 18:03:04.169 183079 DEBUG oslo_concurrency.lockutils [req-e7739e37-a1c2-4c96-8699-7f6dfaf9a5c3 req-d8a89883-3360-468a-8234-6fd5a6ba19cb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "afb51767-98e4-4f27-bf80-d54f23cd06c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:03:04 compute-0 nova_compute[183075]: 2026-01-22 18:03:04.169 183079 DEBUG oslo_concurrency.lockutils [req-e7739e37-a1c2-4c96-8699-7f6dfaf9a5c3 req-d8a89883-3360-468a-8234-6fd5a6ba19cb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "afb51767-98e4-4f27-bf80-d54f23cd06c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:03:04 compute-0 nova_compute[183075]: 2026-01-22 18:03:04.169 183079 DEBUG nova.compute.manager [req-e7739e37-a1c2-4c96-8699-7f6dfaf9a5c3 req-d8a89883-3360-468a-8234-6fd5a6ba19cb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] No waiting events found dispatching network-vif-unplugged-5991e52e-d36a-4639-b0c2-6e456926f678 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 18:03:04 compute-0 nova_compute[183075]: 2026-01-22 18:03:04.170 183079 DEBUG nova.compute.manager [req-e7739e37-a1c2-4c96-8699-7f6dfaf9a5c3 req-d8a89883-3360-468a-8234-6fd5a6ba19cb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Received event network-vif-unplugged-5991e52e-d36a-4639-b0c2-6e456926f678 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 18:03:04 compute-0 nova_compute[183075]: 2026-01-22 18:03:04.170 183079 DEBUG nova.compute.manager [req-e7739e37-a1c2-4c96-8699-7f6dfaf9a5c3 req-d8a89883-3360-468a-8234-6fd5a6ba19cb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Received event network-vif-plugged-5991e52e-d36a-4639-b0c2-6e456926f678 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:03:04 compute-0 nova_compute[183075]: 2026-01-22 18:03:04.170 183079 DEBUG oslo_concurrency.lockutils [req-e7739e37-a1c2-4c96-8699-7f6dfaf9a5c3 req-d8a89883-3360-468a-8234-6fd5a6ba19cb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "afb51767-98e4-4f27-bf80-d54f23cd06c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:03:04 compute-0 nova_compute[183075]: 2026-01-22 18:03:04.170 183079 DEBUG oslo_concurrency.lockutils [req-e7739e37-a1c2-4c96-8699-7f6dfaf9a5c3 req-d8a89883-3360-468a-8234-6fd5a6ba19cb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "afb51767-98e4-4f27-bf80-d54f23cd06c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:03:04 compute-0 nova_compute[183075]: 2026-01-22 18:03:04.171 183079 DEBUG oslo_concurrency.lockutils [req-e7739e37-a1c2-4c96-8699-7f6dfaf9a5c3 req-d8a89883-3360-468a-8234-6fd5a6ba19cb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "afb51767-98e4-4f27-bf80-d54f23cd06c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:03:04 compute-0 nova_compute[183075]: 2026-01-22 18:03:04.171 183079 DEBUG nova.compute.manager [req-e7739e37-a1c2-4c96-8699-7f6dfaf9a5c3 req-d8a89883-3360-468a-8234-6fd5a6ba19cb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] No waiting events found dispatching network-vif-plugged-5991e52e-d36a-4639-b0c2-6e456926f678 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 18:03:04 compute-0 nova_compute[183075]: 2026-01-22 18:03:04.171 183079 WARNING nova.compute.manager [req-e7739e37-a1c2-4c96-8699-7f6dfaf9a5c3 req-d8a89883-3360-468a-8234-6fd5a6ba19cb a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Received unexpected event network-vif-plugged-5991e52e-d36a-4639-b0c2-6e456926f678 for instance with vm_state active and task_state deleting.
Jan 22 18:03:04 compute-0 nova_compute[183075]: 2026-01-22 18:03:04.461 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:03:04 compute-0 nova_compute[183075]: 2026-01-22 18:03:04.461 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 18:03:04 compute-0 nova_compute[183075]: 2026-01-22 18:03:04.718 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:04 compute-0 nova_compute[183075]: 2026-01-22 18:03:04.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:03:05 compute-0 nova_compute[183075]: 2026-01-22 18:03:05.051 183079 DEBUG nova.network.neutron [req-6ab85e00-d794-4685-bfd8-e13eb67ec8b4 req-7b29b803-c8d5-45f7-b4ca-e438ddbffc8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Updated VIF entry in instance network info cache for port 5991e52e-d36a-4639-b0c2-6e456926f678. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 18:03:05 compute-0 nova_compute[183075]: 2026-01-22 18:03:05.051 183079 DEBUG nova.network.neutron [req-6ab85e00-d794-4685-bfd8-e13eb67ec8b4 req-7b29b803-c8d5-45f7-b4ca-e438ddbffc8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Updating instance_info_cache with network_info: [{"id": "5991e52e-d36a-4639-b0c2-6e456926f678", "address": "fa:16:3e:73:94:61", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5991e52e-d3", "ovs_interfaceid": "5991e52e-d36a-4639-b0c2-6e456926f678", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 18:03:05 compute-0 nova_compute[183075]: 2026-01-22 18:03:05.078 183079 DEBUG oslo_concurrency.lockutils [req-6ab85e00-d794-4685-bfd8-e13eb67ec8b4 req-7b29b803-c8d5-45f7-b4ca-e438ddbffc8b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-afb51767-98e4-4f27-bf80-d54f23cd06c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 18:03:05 compute-0 nova_compute[183075]: 2026-01-22 18:03:05.267 183079 DEBUG nova.network.neutron [-] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 18:03:05 compute-0 nova_compute[183075]: 2026-01-22 18:03:05.285 183079 INFO nova.compute.manager [-] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Took 1.36 seconds to deallocate network for instance.
Jan 22 18:03:05 compute-0 nova_compute[183075]: 2026-01-22 18:03:05.328 183079 DEBUG oslo_concurrency.lockutils [None req-ad329d69-1922-4040-819f-f5eca4e0dfd1 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:03:05 compute-0 nova_compute[183075]: 2026-01-22 18:03:05.328 183079 DEBUG oslo_concurrency.lockutils [None req-ad329d69-1922-4040-819f-f5eca4e0dfd1 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:03:05 compute-0 nova_compute[183075]: 2026-01-22 18:03:05.367 183079 DEBUG nova.compute.provider_tree [None req-ad329d69-1922-4040-819f-f5eca4e0dfd1 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 18:03:05 compute-0 nova_compute[183075]: 2026-01-22 18:03:05.381 183079 DEBUG nova.scheduler.client.report [None req-ad329d69-1922-4040-819f-f5eca4e0dfd1 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 18:03:05 compute-0 nova_compute[183075]: 2026-01-22 18:03:05.400 183079 DEBUG oslo_concurrency.lockutils [None req-ad329d69-1922-4040-819f-f5eca4e0dfd1 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.071s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:03:05 compute-0 nova_compute[183075]: 2026-01-22 18:03:05.421 183079 INFO nova.scheduler.client.report [None req-ad329d69-1922-4040-819f-f5eca4e0dfd1 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Deleted allocations for instance afb51767-98e4-4f27-bf80-d54f23cd06c6
Jan 22 18:03:05 compute-0 nova_compute[183075]: 2026-01-22 18:03:05.471 183079 DEBUG oslo_concurrency.lockutils [None req-ad329d69-1922-4040-819f-f5eca4e0dfd1 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "afb51767-98e4-4f27-bf80-d54f23cd06c6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.906s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:03:08 compute-0 nova_compute[183075]: 2026-01-22 18:03:08.866 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:09 compute-0 nova_compute[183075]: 2026-01-22 18:03:09.720 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:10 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:10.065 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '62'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:03:10 compute-0 nova_compute[183075]: 2026-01-22 18:03:10.699 183079 DEBUG oslo_concurrency.lockutils [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "bdce5eb5-c99f-4f11-8f8e-12182942b087" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:03:10 compute-0 nova_compute[183075]: 2026-01-22 18:03:10.699 183079 DEBUG oslo_concurrency.lockutils [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "bdce5eb5-c99f-4f11-8f8e-12182942b087" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:03:10 compute-0 nova_compute[183075]: 2026-01-22 18:03:10.712 183079 DEBUG nova.compute.manager [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 18:03:10 compute-0 nova_compute[183075]: 2026-01-22 18:03:10.806 183079 DEBUG oslo_concurrency.lockutils [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:03:10 compute-0 nova_compute[183075]: 2026-01-22 18:03:10.806 183079 DEBUG oslo_concurrency.lockutils [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:03:10 compute-0 nova_compute[183075]: 2026-01-22 18:03:10.816 183079 DEBUG nova.virt.hardware [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 18:03:10 compute-0 nova_compute[183075]: 2026-01-22 18:03:10.816 183079 INFO nova.compute.claims [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Claim successful on node compute-0.ctlplane.example.com
Jan 22 18:03:10 compute-0 nova_compute[183075]: 2026-01-22 18:03:10.905 183079 DEBUG nova.compute.provider_tree [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 18:03:10 compute-0 nova_compute[183075]: 2026-01-22 18:03:10.919 183079 DEBUG nova.scheduler.client.report [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 18:03:10 compute-0 nova_compute[183075]: 2026-01-22 18:03:10.936 183079 DEBUG oslo_concurrency.lockutils [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:03:10 compute-0 nova_compute[183075]: 2026-01-22 18:03:10.937 183079 DEBUG nova.compute.manager [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 18:03:10 compute-0 nova_compute[183075]: 2026-01-22 18:03:10.974 183079 DEBUG nova.compute.manager [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 18:03:10 compute-0 nova_compute[183075]: 2026-01-22 18:03:10.974 183079 DEBUG nova.network.neutron [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 18:03:10 compute-0 nova_compute[183075]: 2026-01-22 18:03:10.992 183079 INFO nova.virt.libvirt.driver [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 18:03:11 compute-0 nova_compute[183075]: 2026-01-22 18:03:11.008 183079 DEBUG nova.compute.manager [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 18:03:11 compute-0 nova_compute[183075]: 2026-01-22 18:03:11.108 183079 DEBUG nova.compute.manager [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 18:03:11 compute-0 nova_compute[183075]: 2026-01-22 18:03:11.109 183079 DEBUG nova.virt.libvirt.driver [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 18:03:11 compute-0 nova_compute[183075]: 2026-01-22 18:03:11.110 183079 INFO nova.virt.libvirt.driver [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Creating image(s)
Jan 22 18:03:11 compute-0 nova_compute[183075]: 2026-01-22 18:03:11.110 183079 DEBUG oslo_concurrency.lockutils [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "/var/lib/nova/instances/bdce5eb5-c99f-4f11-8f8e-12182942b087/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:03:11 compute-0 nova_compute[183075]: 2026-01-22 18:03:11.110 183079 DEBUG oslo_concurrency.lockutils [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/bdce5eb5-c99f-4f11-8f8e-12182942b087/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:03:11 compute-0 nova_compute[183075]: 2026-01-22 18:03:11.111 183079 DEBUG oslo_concurrency.lockutils [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "/var/lib/nova/instances/bdce5eb5-c99f-4f11-8f8e-12182942b087/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:03:11 compute-0 nova_compute[183075]: 2026-01-22 18:03:11.123 183079 DEBUG oslo_concurrency.processutils [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:03:11 compute-0 nova_compute[183075]: 2026-01-22 18:03:11.183 183079 DEBUG oslo_concurrency.processutils [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:03:11 compute-0 nova_compute[183075]: 2026-01-22 18:03:11.184 183079 DEBUG oslo_concurrency.lockutils [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:03:11 compute-0 nova_compute[183075]: 2026-01-22 18:03:11.184 183079 DEBUG oslo_concurrency.lockutils [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:03:11 compute-0 nova_compute[183075]: 2026-01-22 18:03:11.196 183079 DEBUG oslo_concurrency.processutils [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:03:11 compute-0 nova_compute[183075]: 2026-01-22 18:03:11.250 183079 DEBUG oslo_concurrency.processutils [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:03:11 compute-0 nova_compute[183075]: 2026-01-22 18:03:11.251 183079 DEBUG oslo_concurrency.processutils [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/bdce5eb5-c99f-4f11-8f8e-12182942b087/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:03:11 compute-0 nova_compute[183075]: 2026-01-22 18:03:11.286 183079 DEBUG oslo_concurrency.processutils [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/bdce5eb5-c99f-4f11-8f8e-12182942b087/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:03:11 compute-0 nova_compute[183075]: 2026-01-22 18:03:11.287 183079 DEBUG oslo_concurrency.lockutils [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:03:11 compute-0 nova_compute[183075]: 2026-01-22 18:03:11.287 183079 DEBUG oslo_concurrency.processutils [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:03:11 compute-0 nova_compute[183075]: 2026-01-22 18:03:11.343 183079 DEBUG oslo_concurrency.processutils [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:03:11 compute-0 nova_compute[183075]: 2026-01-22 18:03:11.344 183079 DEBUG nova.virt.disk.api [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Checking if we can resize image /var/lib/nova/instances/bdce5eb5-c99f-4f11-8f8e-12182942b087/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 18:03:11 compute-0 nova_compute[183075]: 2026-01-22 18:03:11.345 183079 DEBUG oslo_concurrency.processutils [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bdce5eb5-c99f-4f11-8f8e-12182942b087/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:03:11 compute-0 nova_compute[183075]: 2026-01-22 18:03:11.401 183079 DEBUG oslo_concurrency.processutils [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bdce5eb5-c99f-4f11-8f8e-12182942b087/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:03:11 compute-0 nova_compute[183075]: 2026-01-22 18:03:11.402 183079 DEBUG nova.virt.disk.api [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Cannot resize image /var/lib/nova/instances/bdce5eb5-c99f-4f11-8f8e-12182942b087/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 18:03:11 compute-0 nova_compute[183075]: 2026-01-22 18:03:11.403 183079 DEBUG nova.objects.instance [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'migration_context' on Instance uuid bdce5eb5-c99f-4f11-8f8e-12182942b087 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 18:03:11 compute-0 nova_compute[183075]: 2026-01-22 18:03:11.419 183079 DEBUG nova.virt.libvirt.driver [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 18:03:11 compute-0 nova_compute[183075]: 2026-01-22 18:03:11.420 183079 DEBUG nova.virt.libvirt.driver [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Ensure instance console log exists: /var/lib/nova/instances/bdce5eb5-c99f-4f11-8f8e-12182942b087/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 18:03:11 compute-0 nova_compute[183075]: 2026-01-22 18:03:11.420 183079 DEBUG oslo_concurrency.lockutils [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:03:11 compute-0 nova_compute[183075]: 2026-01-22 18:03:11.421 183079 DEBUG oslo_concurrency.lockutils [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:03:11 compute-0 nova_compute[183075]: 2026-01-22 18:03:11.421 183079 DEBUG oslo_concurrency.lockutils [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:03:11 compute-0 nova_compute[183075]: 2026-01-22 18:03:11.862 183079 DEBUG nova.policy [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66c24efd0cd24c57803ea8508e679d06', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 18:03:13 compute-0 nova_compute[183075]: 2026-01-22 18:03:13.869 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:13 compute-0 nova_compute[183075]: 2026-01-22 18:03:13.940 183079 DEBUG nova.network.neutron [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Successfully created port: 44d6a309-5606-49d3-9f67-bab5a1644db8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 18:03:14 compute-0 nova_compute[183075]: 2026-01-22 18:03:14.722 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:15 compute-0 podman[247038]: 2026-01-22 18:03:15.342737537 +0000 UTC m=+0.050287929 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 18:03:15 compute-0 nova_compute[183075]: 2026-01-22 18:03:15.912 183079 DEBUG nova.network.neutron [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Successfully updated port: 44d6a309-5606-49d3-9f67-bab5a1644db8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.002 183079 DEBUG oslo_concurrency.lockutils [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "refresh_cache-bdce5eb5-c99f-4f11-8f8e-12182942b087" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.002 183079 DEBUG oslo_concurrency.lockutils [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquired lock "refresh_cache-bdce5eb5-c99f-4f11-8f8e-12182942b087" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.002 183079 DEBUG nova.network.neutron [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.006 183079 DEBUG nova.compute.manager [req-11ffacc4-3e56-4922-8d4e-7dde58428891 req-cf67775a-7fcd-4231-bfd9-fa0830887d5c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Received event network-changed-44d6a309-5606-49d3-9f67-bab5a1644db8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.006 183079 DEBUG nova.compute.manager [req-11ffacc4-3e56-4922-8d4e-7dde58428891 req-cf67775a-7fcd-4231-bfd9-fa0830887d5c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Refreshing instance network info cache due to event network-changed-44d6a309-5606-49d3-9f67-bab5a1644db8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.007 183079 DEBUG oslo_concurrency.lockutils [req-11ffacc4-3e56-4922-8d4e-7dde58428891 req-cf67775a-7fcd-4231-bfd9-fa0830887d5c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-bdce5eb5-c99f-4f11-8f8e-12182942b087" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.128 183079 DEBUG nova.network.neutron [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.854 183079 DEBUG nova.network.neutron [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Updating instance_info_cache with network_info: [{"id": "44d6a309-5606-49d3-9f67-bab5a1644db8", "address": "fa:16:3e:cd:96:1e", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44d6a309-56", "ovs_interfaceid": "44d6a309-5606-49d3-9f67-bab5a1644db8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.873 183079 DEBUG oslo_concurrency.lockutils [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Releasing lock "refresh_cache-bdce5eb5-c99f-4f11-8f8e-12182942b087" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.874 183079 DEBUG nova.compute.manager [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Instance network_info: |[{"id": "44d6a309-5606-49d3-9f67-bab5a1644db8", "address": "fa:16:3e:cd:96:1e", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44d6a309-56", "ovs_interfaceid": "44d6a309-5606-49d3-9f67-bab5a1644db8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.874 183079 DEBUG oslo_concurrency.lockutils [req-11ffacc4-3e56-4922-8d4e-7dde58428891 req-cf67775a-7fcd-4231-bfd9-fa0830887d5c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-bdce5eb5-c99f-4f11-8f8e-12182942b087" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.875 183079 DEBUG nova.network.neutron [req-11ffacc4-3e56-4922-8d4e-7dde58428891 req-cf67775a-7fcd-4231-bfd9-fa0830887d5c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Refreshing network info cache for port 44d6a309-5606-49d3-9f67-bab5a1644db8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.877 183079 DEBUG nova.virt.libvirt.driver [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Start _get_guest_xml network_info=[{"id": "44d6a309-5606-49d3-9f67-bab5a1644db8", "address": "fa:16:3e:cd:96:1e", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44d6a309-56", "ovs_interfaceid": "44d6a309-5606-49d3-9f67-bab5a1644db8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.882 183079 WARNING nova.virt.libvirt.driver [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.886 183079 DEBUG nova.virt.libvirt.host [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.887 183079 DEBUG nova.virt.libvirt.host [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.892 183079 DEBUG nova.virt.libvirt.host [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.893 183079 DEBUG nova.virt.libvirt.host [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.894 183079 DEBUG nova.virt.libvirt.driver [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.894 183079 DEBUG nova.virt.hardware [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.894 183079 DEBUG nova.virt.hardware [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.895 183079 DEBUG nova.virt.hardware [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.895 183079 DEBUG nova.virt.hardware [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.895 183079 DEBUG nova.virt.hardware [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.895 183079 DEBUG nova.virt.hardware [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.895 183079 DEBUG nova.virt.hardware [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.896 183079 DEBUG nova.virt.hardware [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.896 183079 DEBUG nova.virt.hardware [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.896 183079 DEBUG nova.virt.hardware [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.896 183079 DEBUG nova.virt.hardware [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.900 183079 DEBUG nova.virt.libvirt.vif [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T18:03:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-313588033',display_name='tempest-server-test-313588033',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-313588033',id=82,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-7yfsp4f5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T18:03:11Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=bdce5eb5-c99f-4f11-8f8e-12182942b087,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "44d6a309-5606-49d3-9f67-bab5a1644db8", "address": "fa:16:3e:cd:96:1e", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44d6a309-56", "ovs_interfaceid": "44d6a309-5606-49d3-9f67-bab5a1644db8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.900 183079 DEBUG nova.network.os_vif_util [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "44d6a309-5606-49d3-9f67-bab5a1644db8", "address": "fa:16:3e:cd:96:1e", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44d6a309-56", "ovs_interfaceid": "44d6a309-5606-49d3-9f67-bab5a1644db8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.901 183079 DEBUG nova.network.os_vif_util [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cd:96:1e,bridge_name='br-int',has_traffic_filtering=True,id=44d6a309-5606-49d3-9f67-bab5a1644db8,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44d6a309-56') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.901 183079 DEBUG nova.objects.instance [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'pci_devices' on Instance uuid bdce5eb5-c99f-4f11-8f8e-12182942b087 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.915 183079 DEBUG nova.virt.libvirt.driver [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] End _get_guest_xml xml=<domain type="kvm">
Jan 22 18:03:16 compute-0 nova_compute[183075]:   <uuid>bdce5eb5-c99f-4f11-8f8e-12182942b087</uuid>
Jan 22 18:03:16 compute-0 nova_compute[183075]:   <name>instance-00000052</name>
Jan 22 18:03:16 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 18:03:16 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 18:03:16 compute-0 nova_compute[183075]:   <metadata>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 18:03:16 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-313588033</nova:name>
Jan 22 18:03:16 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 18:03:16</nova:creationTime>
Jan 22 18:03:16 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 18:03:16 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 18:03:16 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 18:03:16 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 18:03:16 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 18:03:16 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 18:03:16 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 18:03:16 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 18:03:16 compute-0 nova_compute[183075]:         <nova:user uuid="66c24efd0cd24c57803ea8508e679d06">tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member</nova:user>
Jan 22 18:03:16 compute-0 nova_compute[183075]:         <nova:project uuid="cbff5cec4b9c4d2eb317124ca20fea9f">tempest-StatelessNetworkSecGroupIPv4Test-1348485316</nova:project>
Jan 22 18:03:16 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 18:03:16 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 18:03:16 compute-0 nova_compute[183075]:         <nova:port uuid="44d6a309-5606-49d3-9f67-bab5a1644db8">
Jan 22 18:03:16 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 18:03:16 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 18:03:16 compute-0 nova_compute[183075]:   </metadata>
Jan 22 18:03:16 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <system>
Jan 22 18:03:16 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 18:03:16 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 18:03:16 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 18:03:16 compute-0 nova_compute[183075]:       <entry name="serial">bdce5eb5-c99f-4f11-8f8e-12182942b087</entry>
Jan 22 18:03:16 compute-0 nova_compute[183075]:       <entry name="uuid">bdce5eb5-c99f-4f11-8f8e-12182942b087</entry>
Jan 22 18:03:16 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     </system>
Jan 22 18:03:16 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 18:03:16 compute-0 nova_compute[183075]:   <os>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:   </os>
Jan 22 18:03:16 compute-0 nova_compute[183075]:   <features>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <apic/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:   </features>
Jan 22 18:03:16 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:   </clock>
Jan 22 18:03:16 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:   </cpu>
Jan 22 18:03:16 compute-0 nova_compute[183075]:   <devices>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 18:03:16 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/bdce5eb5-c99f-4f11-8f8e-12182942b087/disk"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     </disk>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 18:03:16 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:cd:96:1e"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:       <target dev="tap44d6a309-56"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     </interface>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 18:03:16 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/bdce5eb5-c99f-4f11-8f8e-12182942b087/console.log" append="off"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     </serial>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <video>
Jan 22 18:03:16 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     </video>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 18:03:16 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     </rng>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 18:03:16 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 18:03:16 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 18:03:16 compute-0 nova_compute[183075]:   </devices>
Jan 22 18:03:16 compute-0 nova_compute[183075]: </domain>
Jan 22 18:03:16 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.917 183079 DEBUG nova.compute.manager [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Preparing to wait for external event network-vif-plugged-44d6a309-5606-49d3-9f67-bab5a1644db8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.917 183079 DEBUG oslo_concurrency.lockutils [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "bdce5eb5-c99f-4f11-8f8e-12182942b087-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.917 183079 DEBUG oslo_concurrency.lockutils [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "bdce5eb5-c99f-4f11-8f8e-12182942b087-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.918 183079 DEBUG oslo_concurrency.lockutils [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "bdce5eb5-c99f-4f11-8f8e-12182942b087-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.918 183079 DEBUG nova.virt.libvirt.vif [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T18:03:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-313588033',display_name='tempest-server-test-313588033',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-313588033',id=82,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-7yfsp4f5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T18:03:11Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=bdce5eb5-c99f-4f11-8f8e-12182942b087,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "44d6a309-5606-49d3-9f67-bab5a1644db8", "address": "fa:16:3e:cd:96:1e", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44d6a309-56", "ovs_interfaceid": "44d6a309-5606-49d3-9f67-bab5a1644db8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.919 183079 DEBUG nova.network.os_vif_util [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "44d6a309-5606-49d3-9f67-bab5a1644db8", "address": "fa:16:3e:cd:96:1e", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44d6a309-56", "ovs_interfaceid": "44d6a309-5606-49d3-9f67-bab5a1644db8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.919 183079 DEBUG nova.network.os_vif_util [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cd:96:1e,bridge_name='br-int',has_traffic_filtering=True,id=44d6a309-5606-49d3-9f67-bab5a1644db8,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44d6a309-56') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.920 183079 DEBUG os_vif [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:96:1e,bridge_name='br-int',has_traffic_filtering=True,id=44d6a309-5606-49d3-9f67-bab5a1644db8,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44d6a309-56') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.920 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.920 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.921 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.924 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.924 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap44d6a309-56, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.924 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap44d6a309-56, col_values=(('external_ids', {'iface-id': '44d6a309-5606-49d3-9f67-bab5a1644db8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cd:96:1e', 'vm-uuid': 'bdce5eb5-c99f-4f11-8f8e-12182942b087'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.926 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:16 compute-0 NetworkManager[55454]: <info>  [1769104996.9272] manager: (tap44d6a309-56): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/354)
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.930 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.934 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.935 183079 INFO os_vif [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:96:1e,bridge_name='br-int',has_traffic_filtering=True,id=44d6a309-5606-49d3-9f67-bab5a1644db8,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44d6a309-56')
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.985 183079 DEBUG nova.virt.libvirt.driver [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 18:03:16 compute-0 nova_compute[183075]: 2026-01-22 18:03:16.985 183079 DEBUG nova.virt.libvirt.driver [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] No VIF found with MAC fa:16:3e:cd:96:1e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 18:03:17 compute-0 kernel: tap44d6a309-56: entered promiscuous mode
Jan 22 18:03:17 compute-0 NetworkManager[55454]: <info>  [1769104997.0699] manager: (tap44d6a309-56): new Tun device (/org/freedesktop/NetworkManager/Devices/355)
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.071 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:17 compute-0 ovn_controller[95372]: 2026-01-22T18:03:17Z|00889|binding|INFO|Claiming lport 44d6a309-5606-49d3-9f67-bab5a1644db8 for this chassis.
Jan 22 18:03:17 compute-0 ovn_controller[95372]: 2026-01-22T18:03:17Z|00890|binding|INFO|44d6a309-5606-49d3-9f67-bab5a1644db8: Claiming fa:16:3e:cd:96:1e 10.100.0.14
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:17.078 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cd:96:1e 10.100.0.14'], port_security=['fa:16:3e:cd:96:1e 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'bdce5eb5-c99f-4f11-8f8e-12182942b087', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '45b38a2d-f91f-4949-954f-9f963f5e478e ca06cc05-16f3-4515-9c94-ff0a817b9cf0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=44d6a309-5606-49d3-9f67-bab5a1644db8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:17.080 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 44d6a309-5606-49d3-9f67-bab5a1644db8 in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 bound to our chassis
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:17.084 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.084 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:17 compute-0 ovn_controller[95372]: 2026-01-22T18:03:17Z|00891|binding|INFO|Setting lport 44d6a309-5606-49d3-9f67-bab5a1644db8 up in Southbound
Jan 22 18:03:17 compute-0 ovn_controller[95372]: 2026-01-22T18:03:17Z|00892|binding|INFO|Setting lport 44d6a309-5606-49d3-9f67-bab5a1644db8 ovn-installed in OVS
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.088 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:17.096 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[c8f61295-f6a4-432c-b80e-8e565ba94950]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:17.097 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap88ed9213-71 in ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:17.100 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap88ed9213-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:17.100 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[9160272b-b991-4103-9f93-554345168680]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:17.101 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[3d12f99c-d12f-44ff-aad3-a57983fa9604]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:03:17 compute-0 systemd-udevd[247081]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 18:03:17 compute-0 NetworkManager[55454]: <info>  [1769104997.1193] device (tap44d6a309-56): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 18:03:17 compute-0 NetworkManager[55454]: <info>  [1769104997.1198] device (tap44d6a309-56): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:17.122 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[3edf0136-3985-4abf-884d-12196ea80131]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:03:17 compute-0 systemd-machined[154382]: New machine qemu-82-instance-00000052.
Jan 22 18:03:17 compute-0 systemd[1]: Started Virtual Machine qemu-82-instance-00000052.
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:17.149 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b3ef1421-d5e6-40d2-8f3b-2402a2a7b180]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:17.175 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[1854c33c-5987-4f34-bef2-da4a0d5401ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:17.180 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[895ac3af-a0fe-4c6e-bbb1-633b79982852]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:03:17 compute-0 systemd-udevd[247085]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 18:03:17 compute-0 NetworkManager[55454]: <info>  [1769104997.1832] manager: (tap88ed9213-70): new Veth device (/org/freedesktop/NetworkManager/Devices/356)
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:17.213 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[4e7a769f-8228-45cd-979f-7766ed86b937]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:17.216 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[57da4889-d96b-481f-a194-13c0a3c5837a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:03:17 compute-0 NetworkManager[55454]: <info>  [1769104997.2390] device (tap88ed9213-70): carrier: link connected
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:17.243 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[8e42a23b-0757-4f70-af24-333694370f89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:17.261 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7d8d5d3e-94b8-46b0-8347-3ecb22c365ef]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 243], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 736094, 'reachable_time': 34989, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247114, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:17.276 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[72019541-26ba-48ad-aa9d-8da8df5ccdc2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0a:fa93'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 736094, 'tstamp': 736094}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 247115, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.285 183079 DEBUG nova.compute.manager [req-1aca7ca9-f3e3-40bc-ba31-b41856d6148f req-5fbd7d8e-c79d-472c-9892-3c4dce1bcc1d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Received event network-vif-plugged-44d6a309-5606-49d3-9f67-bab5a1644db8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.285 183079 DEBUG oslo_concurrency.lockutils [req-1aca7ca9-f3e3-40bc-ba31-b41856d6148f req-5fbd7d8e-c79d-472c-9892-3c4dce1bcc1d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "bdce5eb5-c99f-4f11-8f8e-12182942b087-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.286 183079 DEBUG oslo_concurrency.lockutils [req-1aca7ca9-f3e3-40bc-ba31-b41856d6148f req-5fbd7d8e-c79d-472c-9892-3c4dce1bcc1d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "bdce5eb5-c99f-4f11-8f8e-12182942b087-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.286 183079 DEBUG oslo_concurrency.lockutils [req-1aca7ca9-f3e3-40bc-ba31-b41856d6148f req-5fbd7d8e-c79d-472c-9892-3c4dce1bcc1d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "bdce5eb5-c99f-4f11-8f8e-12182942b087-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.286 183079 DEBUG nova.compute.manager [req-1aca7ca9-f3e3-40bc-ba31-b41856d6148f req-5fbd7d8e-c79d-472c-9892-3c4dce1bcc1d a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Processing event network-vif-plugged-44d6a309-5606-49d3-9f67-bab5a1644db8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:17.291 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0b0ef5a4-6c1f-429a-a245-3bb16b26ece2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88ed9213-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:fa:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 243], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 736094, 'reachable_time': 34989, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 247116, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:17.334 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[98c69890-63fe-4e1b-8bbb-521e43b2fab8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:17.414 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[5b1d868f-0977-49ae-87d3-5e94093a652d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:17.416 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:17.417 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:17.417 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88ed9213-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:03:17 compute-0 NetworkManager[55454]: <info>  [1769104997.4976] manager: (tap88ed9213-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/357)
Jan 22 18:03:17 compute-0 kernel: tap88ed9213-70: entered promiscuous mode
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:17.504 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88ed9213-70, col_values=(('external_ids', {'iface-id': 'da93a32e-eed6-4124-8f08-78b0bfe2cb69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.504 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:17 compute-0 ovn_controller[95372]: 2026-01-22T18:03:17Z|00893|binding|INFO|Releasing lport da93a32e-eed6-4124-8f08-78b0bfe2cb69 from this chassis (sb_readonly=0)
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:17.509 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.508 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:17.510 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[93ce67e6-f26f-4006-a347-3e1d77ea8a0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:17.511 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]: global
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/88ed9213-7d48-4c42-ad60-ce3fcf104f71.pid.haproxy
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]: 
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]: 
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]: 
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID 88ed9213-7d48-4c42-ad60-ce3fcf104f71
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 18:03:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:17.513 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'env', 'PROCESS_TAG=haproxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/88ed9213-7d48-4c42-ad60-ce3fcf104f71.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.518 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.614 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104997.613926, bdce5eb5-c99f-4f11-8f8e-12182942b087 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.615 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] VM Started (Lifecycle Event)
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.616 183079 DEBUG nova.compute.manager [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.621 183079 DEBUG nova.virt.libvirt.driver [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.625 183079 INFO nova.virt.libvirt.driver [-] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Instance spawned successfully.
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.625 183079 DEBUG nova.virt.libvirt.driver [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.632 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.636 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.643 183079 DEBUG nova.virt.libvirt.driver [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.644 183079 DEBUG nova.virt.libvirt.driver [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.644 183079 DEBUG nova.virt.libvirt.driver [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.644 183079 DEBUG nova.virt.libvirt.driver [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.645 183079 DEBUG nova.virt.libvirt.driver [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.645 183079 DEBUG nova.virt.libvirt.driver [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.653 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.654 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104997.6141002, bdce5eb5-c99f-4f11-8f8e-12182942b087 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.654 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] VM Paused (Lifecycle Event)
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.674 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.679 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769104997.6214552, bdce5eb5-c99f-4f11-8f8e-12182942b087 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.679 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] VM Resumed (Lifecycle Event)
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.697 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.700 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.703 183079 INFO nova.compute.manager [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Took 6.59 seconds to spawn the instance on the hypervisor.
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.703 183079 DEBUG nova.compute.manager [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.727 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.755 183079 INFO nova.compute.manager [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Took 6.98 seconds to build instance.
Jan 22 18:03:17 compute-0 nova_compute[183075]: 2026-01-22 18:03:17.772 183079 DEBUG oslo_concurrency.lockutils [None req-5be08f97-f12f-4d22-9f3b-711f50712d19 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "bdce5eb5-c99f-4f11-8f8e-12182942b087" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.074s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:03:17 compute-0 podman[247155]: 2026-01-22 18:03:17.889713396 +0000 UTC m=+0.052427257 container create 8f7aa140e20e62a23c1d928c54b30a4d7e13e20e19bf9b107c2acb2a35658dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 22 18:03:17 compute-0 systemd[1]: Started libpod-conmon-8f7aa140e20e62a23c1d928c54b30a4d7e13e20e19bf9b107c2acb2a35658dba.scope.
Jan 22 18:03:17 compute-0 podman[247155]: 2026-01-22 18:03:17.859238973 +0000 UTC m=+0.021952854 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 18:03:17 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 18:03:17 compute-0 systemd[1]: Started libcrun container.
Jan 22 18:03:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58c419952728f617eda8b6fa19805b14a1ef726dd715849350d32b8e309d5a5d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 18:03:17 compute-0 podman[247155]: 2026-01-22 18:03:17.989270973 +0000 UTC m=+0.151984854 container init 8f7aa140e20e62a23c1d928c54b30a4d7e13e20e19bf9b107c2acb2a35658dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 22 18:03:17 compute-0 podman[247155]: 2026-01-22 18:03:17.995182353 +0000 UTC m=+0.157896214 container start 8f7aa140e20e62a23c1d928c54b30a4d7e13e20e19bf9b107c2acb2a35658dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 22 18:03:18 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[247170]: [NOTICE]   (247175) : New worker (247177) forked
Jan 22 18:03:18 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[247170]: [NOTICE]   (247175) : Loading success.
Jan 22 18:03:18 compute-0 nova_compute[183075]: 2026-01-22 18:03:18.101 183079 DEBUG nova.network.neutron [req-11ffacc4-3e56-4922-8d4e-7dde58428891 req-cf67775a-7fcd-4231-bfd9-fa0830887d5c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Updated VIF entry in instance network info cache for port 44d6a309-5606-49d3-9f67-bab5a1644db8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 18:03:18 compute-0 nova_compute[183075]: 2026-01-22 18:03:18.101 183079 DEBUG nova.network.neutron [req-11ffacc4-3e56-4922-8d4e-7dde58428891 req-cf67775a-7fcd-4231-bfd9-fa0830887d5c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Updating instance_info_cache with network_info: [{"id": "44d6a309-5606-49d3-9f67-bab5a1644db8", "address": "fa:16:3e:cd:96:1e", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44d6a309-56", "ovs_interfaceid": "44d6a309-5606-49d3-9f67-bab5a1644db8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 18:03:18 compute-0 nova_compute[183075]: 2026-01-22 18:03:18.115 183079 DEBUG oslo_concurrency.lockutils [req-11ffacc4-3e56-4922-8d4e-7dde58428891 req-cf67775a-7fcd-4231-bfd9-fa0830887d5c a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-bdce5eb5-c99f-4f11-8f8e-12182942b087" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 18:03:18 compute-0 nova_compute[183075]: 2026-01-22 18:03:18.838 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769104983.837281, afb51767-98e4-4f27-bf80-d54f23cd06c6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 18:03:18 compute-0 nova_compute[183075]: 2026-01-22 18:03:18.839 183079 INFO nova.compute.manager [-] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] VM Stopped (Lifecycle Event)
Jan 22 18:03:18 compute-0 nova_compute[183075]: 2026-01-22 18:03:18.943 183079 DEBUG nova.compute.manager [None req-00a16b51-081e-4cf1-bc54-84c0b950579b - - - - - -] [instance: afb51767-98e4-4f27-bf80-d54f23cd06c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 18:03:19 compute-0 podman[247186]: 2026-01-22 18:03:19.334133005 +0000 UTC m=+0.044534243 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 18:03:19 compute-0 nova_compute[183075]: 2026-01-22 18:03:19.366 183079 DEBUG nova.compute.manager [req-f0f05e00-2ea2-4cef-9072-c093393cd7b3 req-5e0ab24a-cc8a-49e6-ae1a-7094736862e5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Received event network-vif-plugged-44d6a309-5606-49d3-9f67-bab5a1644db8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:03:19 compute-0 nova_compute[183075]: 2026-01-22 18:03:19.367 183079 DEBUG oslo_concurrency.lockutils [req-f0f05e00-2ea2-4cef-9072-c093393cd7b3 req-5e0ab24a-cc8a-49e6-ae1a-7094736862e5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "bdce5eb5-c99f-4f11-8f8e-12182942b087-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:03:19 compute-0 nova_compute[183075]: 2026-01-22 18:03:19.367 183079 DEBUG oslo_concurrency.lockutils [req-f0f05e00-2ea2-4cef-9072-c093393cd7b3 req-5e0ab24a-cc8a-49e6-ae1a-7094736862e5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "bdce5eb5-c99f-4f11-8f8e-12182942b087-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:03:19 compute-0 nova_compute[183075]: 2026-01-22 18:03:19.367 183079 DEBUG oslo_concurrency.lockutils [req-f0f05e00-2ea2-4cef-9072-c093393cd7b3 req-5e0ab24a-cc8a-49e6-ae1a-7094736862e5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "bdce5eb5-c99f-4f11-8f8e-12182942b087-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:03:19 compute-0 nova_compute[183075]: 2026-01-22 18:03:19.367 183079 DEBUG nova.compute.manager [req-f0f05e00-2ea2-4cef-9072-c093393cd7b3 req-5e0ab24a-cc8a-49e6-ae1a-7094736862e5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] No waiting events found dispatching network-vif-plugged-44d6a309-5606-49d3-9f67-bab5a1644db8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 18:03:19 compute-0 nova_compute[183075]: 2026-01-22 18:03:19.367 183079 WARNING nova.compute.manager [req-f0f05e00-2ea2-4cef-9072-c093393cd7b3 req-5e0ab24a-cc8a-49e6-ae1a-7094736862e5 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Received unexpected event network-vif-plugged-44d6a309-5606-49d3-9f67-bab5a1644db8 for instance with vm_state active and task_state None.
Jan 22 18:03:19 compute-0 nova_compute[183075]: 2026-01-22 18:03:19.725 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:21 compute-0 nova_compute[183075]: 2026-01-22 18:03:21.326 183079 INFO nova.compute.manager [None req-84b91043-c413-4b64-882c-0cb31ec757a1 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Get console output
Jan 22 18:03:21 compute-0 nova_compute[183075]: 2026-01-22 18:03:21.331 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:03:21 compute-0 nova_compute[183075]: 2026-01-22 18:03:21.927 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:24 compute-0 nova_compute[183075]: 2026-01-22 18:03:24.727 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:26 compute-0 nova_compute[183075]: 2026-01-22 18:03:26.708 183079 INFO nova.compute.manager [None req-2c352969-67f2-426d-83e3-6b343c2375ff 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Get console output
Jan 22 18:03:26 compute-0 nova_compute[183075]: 2026-01-22 18:03:26.713 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:03:26 compute-0 nova_compute[183075]: 2026-01-22 18:03:26.931 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:29 compute-0 nova_compute[183075]: 2026-01-22 18:03:29.729 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:30 compute-0 ovn_controller[95372]: 2026-01-22T18:03:30Z|00166|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:cd:96:1e 10.100.0.14
Jan 22 18:03:30 compute-0 ovn_controller[95372]: 2026-01-22T18:03:30Z|00167|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:cd:96:1e 10.100.0.14
Jan 22 18:03:31 compute-0 podman[247225]: 2026-01-22 18:03:31.357658782 +0000 UTC m=+0.059231010 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 18:03:31 compute-0 podman[247226]: 2026-01-22 18:03:31.367766075 +0000 UTC m=+0.065829658 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, io.openshift.tags=minimal rhel9, release=1755695350, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.33.7, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 22 18:03:31 compute-0 podman[247224]: 2026-01-22 18:03:31.416532981 +0000 UTC m=+0.120818462 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 18:03:31 compute-0 nova_compute[183075]: 2026-01-22 18:03:31.827 183079 INFO nova.compute.manager [None req-0452050f-4417-443b-98a2-456c733333f6 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Get console output
Jan 22 18:03:31 compute-0 nova_compute[183075]: 2026-01-22 18:03:31.833 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:03:31 compute-0 nova_compute[183075]: 2026-01-22 18:03:31.933 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:34 compute-0 podman[247290]: 2026-01-22 18:03:34.375753878 +0000 UTC m=+0.082514018 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 22 18:03:34 compute-0 nova_compute[183075]: 2026-01-22 18:03:34.731 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:36 compute-0 nova_compute[183075]: 2026-01-22 18:03:36.936 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:37 compute-0 nova_compute[183075]: 2026-01-22 18:03:37.062 183079 INFO nova.compute.manager [None req-75094447-5176-4f04-81f6-7e964c2c0af2 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Get console output
Jan 22 18:03:37 compute-0 nova_compute[183075]: 2026-01-22 18:03:37.068 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:03:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:37.480 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:03:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:37.481 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 18:03:37 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:03:37 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:03:37 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:03:37 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:03:37 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:03:37 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 18:03:37 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:03:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:37.956 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:03:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:37.957 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.4761367
Jan 22 18:03:37 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[247177]: 10.100.0.14:35656 [22/Jan/2026:18:03:37.479] listener listener/metadata 0/0/0/478/478 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 18:03:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:37.966 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:03:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:37.967 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 18:03:37 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:03:37 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:03:37 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:03:37 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:03:37 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:03:37 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 18:03:37 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:03:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:37.986 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:03:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:37.986 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0193305
Jan 22 18:03:37 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[247177]: 10.100.0.14:35668 [22/Jan/2026:18:03:37.966] listener listener/metadata 0/0/0/20/20 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 18:03:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:37.993 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:03:37 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:37.993 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 18:03:37 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:03:37 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:03:37 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:03:37 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:03:37 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:03:37 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 18:03:37 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.008 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.008 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0145020
Jan 22 18:03:38 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[247177]: 10.100.0.14:35682 [22/Jan/2026:18:03:37.992] listener listener/metadata 0/0/0/15/15 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.016 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.017 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.030 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.031 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0136650
Jan 22 18:03:38 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[247177]: 10.100.0.14:35692 [22/Jan/2026:18:03:38.016] listener listener/metadata 0/0/0/14/14 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.038 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.039 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.053 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.054 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0142915
Jan 22 18:03:38 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[247177]: 10.100.0.14:35706 [22/Jan/2026:18:03:38.038] listener listener/metadata 0/0/0/15/15 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.059 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.060 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.071 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.071 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0114803
Jan 22 18:03:38 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[247177]: 10.100.0.14:35718 [22/Jan/2026:18:03:38.059] listener listener/metadata 0/0/0/12/12 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.077 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.077 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.090 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.090 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0130317
Jan 22 18:03:38 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[247177]: 10.100.0.14:35722 [22/Jan/2026:18:03:38.076] listener listener/metadata 0/0/0/13/13 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.096 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.096 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.111 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.111 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0151772
Jan 22 18:03:38 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[247177]: 10.100.0.14:35736 [22/Jan/2026:18:03:38.095] listener listener/metadata 0/0/0/16/16 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.117 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.117 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.131 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.131 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 165 time: 0.0142658
Jan 22 18:03:38 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[247177]: 10.100.0.14:35748 [22/Jan/2026:18:03:38.116] listener listener/metadata 0/0/0/15/15 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.136 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.137 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.149 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.150 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 165 time: 0.0127542
Jan 22 18:03:38 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[247177]: 10.100.0.14:35752 [22/Jan/2026:18:03:38.136] listener listener/metadata 0/0/0/13/13 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.155 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.156 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.171 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0154352
Jan 22 18:03:38 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[247177]: 10.100.0.14:35760 [22/Jan/2026:18:03:38.155] listener listener/metadata 0/0/0/16/16 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.189 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.189 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.202 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:03:38 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[247177]: 10.100.0.14:35762 [22/Jan/2026:18:03:38.188] listener listener/metadata 0/0/0/14/14 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.203 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0134866
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.207 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.208 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.222 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.223 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0150471
Jan 22 18:03:38 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[247177]: 10.100.0.14:35772 [22/Jan/2026:18:03:38.207] listener listener/metadata 0/0/0/15/15 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.227 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.228 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.243 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:03:38 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[247177]: 10.100.0.14:35788 [22/Jan/2026:18:03:38.227] listener listener/metadata 0/0/0/16/16 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.243 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0156186
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.248 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.249 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.263 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:03:38 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[247177]: 10.100.0.14:35802 [22/Jan/2026:18:03:38.247] listener listener/metadata 0/0/0/15/15 200 149 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.263 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 165 time: 0.0148084
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.270 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.270 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.14
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: 88ed9213-7d48-4c42-ad60-ce3fcf104f71 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.284 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:03:38 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:38.285 104990 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0147104
Jan 22 18:03:38 compute-0 haproxy-metadata-proxy-88ed9213-7d48-4c42-ad60-ce3fcf104f71[247177]: 10.100.0.14:35818 [22/Jan/2026:18:03:38.269] listener listener/metadata 0/0/0/15/15 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 18:03:39 compute-0 nova_compute[183075]: 2026-01-22 18:03:39.733 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:41 compute-0 nova_compute[183075]: 2026-01-22 18:03:41.940 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:41.986 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:03:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:41.987 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:03:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:41.987 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:03:42 compute-0 nova_compute[183075]: 2026-01-22 18:03:42.230 183079 INFO nova.compute.manager [None req-a6256d1c-9fac-4afb-9626-e36648bce534 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Get console output
Jan 22 18:03:42 compute-0 nova_compute[183075]: 2026-01-22 18:03:42.238 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:03:44 compute-0 nova_compute[183075]: 2026-01-22 18:03:44.640 183079 DEBUG nova.compute.manager [req-f5ad6fb2-60c2-4bf2-8ade-0b7e4b9a601a req-96bdfcd8-d79d-4040-8509-ed0c1de91b01 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Received event network-changed-44d6a309-5606-49d3-9f67-bab5a1644db8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:03:44 compute-0 nova_compute[183075]: 2026-01-22 18:03:44.640 183079 DEBUG nova.compute.manager [req-f5ad6fb2-60c2-4bf2-8ade-0b7e4b9a601a req-96bdfcd8-d79d-4040-8509-ed0c1de91b01 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Refreshing instance network info cache due to event network-changed-44d6a309-5606-49d3-9f67-bab5a1644db8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 18:03:44 compute-0 nova_compute[183075]: 2026-01-22 18:03:44.640 183079 DEBUG oslo_concurrency.lockutils [req-f5ad6fb2-60c2-4bf2-8ade-0b7e4b9a601a req-96bdfcd8-d79d-4040-8509-ed0c1de91b01 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-bdce5eb5-c99f-4f11-8f8e-12182942b087" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 18:03:44 compute-0 nova_compute[183075]: 2026-01-22 18:03:44.641 183079 DEBUG oslo_concurrency.lockutils [req-f5ad6fb2-60c2-4bf2-8ade-0b7e4b9a601a req-96bdfcd8-d79d-4040-8509-ed0c1de91b01 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-bdce5eb5-c99f-4f11-8f8e-12182942b087" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 18:03:44 compute-0 nova_compute[183075]: 2026-01-22 18:03:44.641 183079 DEBUG nova.network.neutron [req-f5ad6fb2-60c2-4bf2-8ade-0b7e4b9a601a req-96bdfcd8-d79d-4040-8509-ed0c1de91b01 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Refreshing network info cache for port 44d6a309-5606-49d3-9f67-bab5a1644db8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 18:03:44 compute-0 nova_compute[183075]: 2026-01-22 18:03:44.736 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:45 compute-0 nova_compute[183075]: 2026-01-22 18:03:45.535 183079 DEBUG nova.network.neutron [req-f5ad6fb2-60c2-4bf2-8ade-0b7e4b9a601a req-96bdfcd8-d79d-4040-8509-ed0c1de91b01 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Updated VIF entry in instance network info cache for port 44d6a309-5606-49d3-9f67-bab5a1644db8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 18:03:45 compute-0 nova_compute[183075]: 2026-01-22 18:03:45.536 183079 DEBUG nova.network.neutron [req-f5ad6fb2-60c2-4bf2-8ade-0b7e4b9a601a req-96bdfcd8-d79d-4040-8509-ed0c1de91b01 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Updating instance_info_cache with network_info: [{"id": "44d6a309-5606-49d3-9f67-bab5a1644db8", "address": "fa:16:3e:cd:96:1e", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44d6a309-56", "ovs_interfaceid": "44d6a309-5606-49d3-9f67-bab5a1644db8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 18:03:45 compute-0 nova_compute[183075]: 2026-01-22 18:03:45.558 183079 DEBUG oslo_concurrency.lockutils [req-f5ad6fb2-60c2-4bf2-8ade-0b7e4b9a601a req-96bdfcd8-d79d-4040-8509-ed0c1de91b01 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-bdce5eb5-c99f-4f11-8f8e-12182942b087" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 18:03:46 compute-0 podman[247310]: 2026-01-22 18:03:46.330596673 +0000 UTC m=+0.044986645 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 18:03:46 compute-0 nova_compute[183075]: 2026-01-22 18:03:46.741 183079 DEBUG nova.compute.manager [req-c46c6779-6072-4704-b45d-6b910d80db31 req-29fc42e5-011a-4fac-8923-43ccc4b63880 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Received event network-changed-44d6a309-5606-49d3-9f67-bab5a1644db8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:03:46 compute-0 nova_compute[183075]: 2026-01-22 18:03:46.742 183079 DEBUG nova.compute.manager [req-c46c6779-6072-4704-b45d-6b910d80db31 req-29fc42e5-011a-4fac-8923-43ccc4b63880 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Refreshing instance network info cache due to event network-changed-44d6a309-5606-49d3-9f67-bab5a1644db8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 18:03:46 compute-0 nova_compute[183075]: 2026-01-22 18:03:46.742 183079 DEBUG oslo_concurrency.lockutils [req-c46c6779-6072-4704-b45d-6b910d80db31 req-29fc42e5-011a-4fac-8923-43ccc4b63880 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-bdce5eb5-c99f-4f11-8f8e-12182942b087" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 18:03:46 compute-0 nova_compute[183075]: 2026-01-22 18:03:46.743 183079 DEBUG oslo_concurrency.lockutils [req-c46c6779-6072-4704-b45d-6b910d80db31 req-29fc42e5-011a-4fac-8923-43ccc4b63880 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-bdce5eb5-c99f-4f11-8f8e-12182942b087" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 18:03:46 compute-0 nova_compute[183075]: 2026-01-22 18:03:46.743 183079 DEBUG nova.network.neutron [req-c46c6779-6072-4704-b45d-6b910d80db31 req-29fc42e5-011a-4fac-8923-43ccc4b63880 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Refreshing network info cache for port 44d6a309-5606-49d3-9f67-bab5a1644db8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 18:03:46 compute-0 nova_compute[183075]: 2026-01-22 18:03:46.921 183079 DEBUG oslo_concurrency.lockutils [None req-b8ad4ca7-c785-42fd-a93f-38f89c800489 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "bdce5eb5-c99f-4f11-8f8e-12182942b087" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:03:46 compute-0 nova_compute[183075]: 2026-01-22 18:03:46.921 183079 DEBUG oslo_concurrency.lockutils [None req-b8ad4ca7-c785-42fd-a93f-38f89c800489 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "bdce5eb5-c99f-4f11-8f8e-12182942b087" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:03:46 compute-0 nova_compute[183075]: 2026-01-22 18:03:46.921 183079 DEBUG oslo_concurrency.lockutils [None req-b8ad4ca7-c785-42fd-a93f-38f89c800489 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "bdce5eb5-c99f-4f11-8f8e-12182942b087-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:03:46 compute-0 nova_compute[183075]: 2026-01-22 18:03:46.921 183079 DEBUG oslo_concurrency.lockutils [None req-b8ad4ca7-c785-42fd-a93f-38f89c800489 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "bdce5eb5-c99f-4f11-8f8e-12182942b087-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:03:46 compute-0 nova_compute[183075]: 2026-01-22 18:03:46.922 183079 DEBUG oslo_concurrency.lockutils [None req-b8ad4ca7-c785-42fd-a93f-38f89c800489 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "bdce5eb5-c99f-4f11-8f8e-12182942b087-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:03:46 compute-0 nova_compute[183075]: 2026-01-22 18:03:46.923 183079 INFO nova.compute.manager [None req-b8ad4ca7-c785-42fd-a93f-38f89c800489 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Terminating instance
Jan 22 18:03:46 compute-0 nova_compute[183075]: 2026-01-22 18:03:46.923 183079 DEBUG nova.compute.manager [None req-b8ad4ca7-c785-42fd-a93f-38f89c800489 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 18:03:46 compute-0 nova_compute[183075]: 2026-01-22 18:03:46.945 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:46 compute-0 kernel: tap44d6a309-56 (unregistering): left promiscuous mode
Jan 22 18:03:46 compute-0 NetworkManager[55454]: <info>  [1769105026.9527] device (tap44d6a309-56): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 18:03:46 compute-0 nova_compute[183075]: 2026-01-22 18:03:46.961 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:46 compute-0 ovn_controller[95372]: 2026-01-22T18:03:46Z|00894|binding|INFO|Releasing lport 44d6a309-5606-49d3-9f67-bab5a1644db8 from this chassis (sb_readonly=0)
Jan 22 18:03:46 compute-0 ovn_controller[95372]: 2026-01-22T18:03:46Z|00895|binding|INFO|Setting lport 44d6a309-5606-49d3-9f67-bab5a1644db8 down in Southbound
Jan 22 18:03:46 compute-0 ovn_controller[95372]: 2026-01-22T18:03:46Z|00896|binding|INFO|Removing iface tap44d6a309-56 ovn-installed in OVS
Jan 22 18:03:46 compute-0 nova_compute[183075]: 2026-01-22 18:03:46.966 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:46.976 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cd:96:1e 10.100.0.14'], port_security=['fa:16:3e:cd:96:1e 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'bdce5eb5-c99f-4f11-8f8e-12182942b087', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbff5cec4b9c4d2eb317124ca20fea9f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '45b38a2d-f91f-4949-954f-9f963f5e478e ca06cc05-16f3-4515-9c94-ff0a817b9cf0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.225'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da66e895-2221-40ac-9c60-818222bd5a75, chassis=[], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=44d6a309-5606-49d3-9f67-bab5a1644db8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 18:03:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:46.978 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 44d6a309-5606-49d3-9f67-bab5a1644db8 in datapath 88ed9213-7d48-4c42-ad60-ce3fcf104f71 unbound from our chassis
Jan 22 18:03:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:46.979 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 88ed9213-7d48-4c42-ad60-ce3fcf104f71, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 18:03:46 compute-0 nova_compute[183075]: 2026-01-22 18:03:46.980 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:46.981 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[09b734f9-b097-4a1d-82b9-693044c623c3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:03:46 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:46.984 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 namespace which is not needed anymore
Jan 22 18:03:47 compute-0 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d00000052.scope: Deactivated successfully.
Jan 22 18:03:47 compute-0 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d00000052.scope: Consumed 12.992s CPU time.
Jan 22 18:03:47 compute-0 systemd-machined[154382]: Machine qemu-82-instance-00000052 terminated.
Jan 22 18:03:47 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[247170]: [NOTICE]   (247175) : haproxy version is 2.8.14-c23fe91
Jan 22 18:03:47 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[247170]: [NOTICE]   (247175) : path to executable is /usr/sbin/haproxy
Jan 22 18:03:47 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[247170]: [WARNING]  (247175) : Exiting Master process...
Jan 22 18:03:47 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[247170]: [WARNING]  (247175) : Exiting Master process...
Jan 22 18:03:47 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[247170]: [ALERT]    (247175) : Current worker (247177) exited with code 143 (Terminated)
Jan 22 18:03:47 compute-0 neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71[247170]: [WARNING]  (247175) : All workers exited. Exiting... (0)
Jan 22 18:03:47 compute-0 systemd[1]: libpod-8f7aa140e20e62a23c1d928c54b30a4d7e13e20e19bf9b107c2acb2a35658dba.scope: Deactivated successfully.
Jan 22 18:03:47 compute-0 podman[247359]: 2026-01-22 18:03:47.123878216 +0000 UTC m=+0.045218021 container died 8f7aa140e20e62a23c1d928c54b30a4d7e13e20e19bf9b107c2acb2a35658dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 22 18:03:47 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8f7aa140e20e62a23c1d928c54b30a4d7e13e20e19bf9b107c2acb2a35658dba-userdata-shm.mount: Deactivated successfully.
Jan 22 18:03:47 compute-0 systemd[1]: var-lib-containers-storage-overlay-58c419952728f617eda8b6fa19805b14a1ef726dd715849350d32b8e309d5a5d-merged.mount: Deactivated successfully.
Jan 22 18:03:47 compute-0 podman[247359]: 2026-01-22 18:03:47.166832346 +0000 UTC m=+0.088172151 container cleanup 8f7aa140e20e62a23c1d928c54b30a4d7e13e20e19bf9b107c2acb2a35658dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 22 18:03:47 compute-0 systemd[1]: libpod-conmon-8f7aa140e20e62a23c1d928c54b30a4d7e13e20e19bf9b107c2acb2a35658dba.scope: Deactivated successfully.
Jan 22 18:03:47 compute-0 nova_compute[183075]: 2026-01-22 18:03:47.181 183079 INFO nova.virt.libvirt.driver [-] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Instance destroyed successfully.
Jan 22 18:03:47 compute-0 nova_compute[183075]: 2026-01-22 18:03:47.182 183079 DEBUG nova.objects.instance [None req-b8ad4ca7-c785-42fd-a93f-38f89c800489 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lazy-loading 'resources' on Instance uuid bdce5eb5-c99f-4f11-8f8e-12182942b087 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 18:03:47 compute-0 nova_compute[183075]: 2026-01-22 18:03:47.258 183079 DEBUG nova.virt.libvirt.vif [None req-b8ad4ca7-c785-42fd-a93f-38f89c800489 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T18:03:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-313588033',display_name='tempest-server-test-313588033',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-313588033',id=82,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHO8Yk4jkBgYN0j3vkMUna5n1/0pam7uUgT0M4hwSF33mPjOrl+CX7QBoQyJxqZfHd82Wc4zuVWHkXPKVwuW6xqd8w95UWfcq48R9i1fzq8gge9aTW05uGs9+kS9CPGzQ==',key_name='tempest-keypair-test-1220643046',keypairs=<?>,launch_index=0,launched_at=2026-01-22T18:03:17Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cbff5cec4b9c4d2eb317124ca20fea9f',ramdisk_id='',reservation_id='r-7yfsp4f5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316',owner_user_name='tempest-StatelessNetworkSecGroupIPv4Test-1348485316-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T18:03:17Z,user_data=None,user_id='66c24efd0cd24c57803ea8508e679d06',uuid=bdce5eb5-c99f-4f11-8f8e-12182942b087,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "44d6a309-5606-49d3-9f67-bab5a1644db8", "address": "fa:16:3e:cd:96:1e", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44d6a309-56", "ovs_interfaceid": "44d6a309-5606-49d3-9f67-bab5a1644db8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 18:03:47 compute-0 nova_compute[183075]: 2026-01-22 18:03:47.259 183079 DEBUG nova.network.os_vif_util [None req-b8ad4ca7-c785-42fd-a93f-38f89c800489 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converting VIF {"id": "44d6a309-5606-49d3-9f67-bab5a1644db8", "address": "fa:16:3e:cd:96:1e", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44d6a309-56", "ovs_interfaceid": "44d6a309-5606-49d3-9f67-bab5a1644db8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 18:03:47 compute-0 nova_compute[183075]: 2026-01-22 18:03:47.259 183079 DEBUG nova.network.os_vif_util [None req-b8ad4ca7-c785-42fd-a93f-38f89c800489 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cd:96:1e,bridge_name='br-int',has_traffic_filtering=True,id=44d6a309-5606-49d3-9f67-bab5a1644db8,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44d6a309-56') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 18:03:47 compute-0 nova_compute[183075]: 2026-01-22 18:03:47.259 183079 DEBUG os_vif [None req-b8ad4ca7-c785-42fd-a93f-38f89c800489 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cd:96:1e,bridge_name='br-int',has_traffic_filtering=True,id=44d6a309-5606-49d3-9f67-bab5a1644db8,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44d6a309-56') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 18:03:47 compute-0 nova_compute[183075]: 2026-01-22 18:03:47.261 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:47 compute-0 nova_compute[183075]: 2026-01-22 18:03:47.261 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44d6a309-56, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:03:47 compute-0 nova_compute[183075]: 2026-01-22 18:03:47.262 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:47 compute-0 nova_compute[183075]: 2026-01-22 18:03:47.263 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:47 compute-0 nova_compute[183075]: 2026-01-22 18:03:47.266 183079 INFO os_vif [None req-b8ad4ca7-c785-42fd-a93f-38f89c800489 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cd:96:1e,bridge_name='br-int',has_traffic_filtering=True,id=44d6a309-5606-49d3-9f67-bab5a1644db8,network=Network(88ed9213-7d48-4c42-ad60-ce3fcf104f71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44d6a309-56')
Jan 22 18:03:47 compute-0 nova_compute[183075]: 2026-01-22 18:03:47.267 183079 INFO nova.virt.libvirt.driver [None req-b8ad4ca7-c785-42fd-a93f-38f89c800489 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Deleting instance files /var/lib/nova/instances/bdce5eb5-c99f-4f11-8f8e-12182942b087_del
Jan 22 18:03:47 compute-0 nova_compute[183075]: 2026-01-22 18:03:47.267 183079 INFO nova.virt.libvirt.driver [None req-b8ad4ca7-c785-42fd-a93f-38f89c800489 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Deletion of /var/lib/nova/instances/bdce5eb5-c99f-4f11-8f8e-12182942b087_del complete
Jan 22 18:03:47 compute-0 podman[247404]: 2026-01-22 18:03:47.269101236 +0000 UTC m=+0.076017033 container remove 8f7aa140e20e62a23c1d928c54b30a4d7e13e20e19bf9b107c2acb2a35658dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 18:03:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:47.273 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[928cd687-08b2-4aa3-8054-47b037883ef4]: (4, ('Thu Jan 22 06:03:47 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 (8f7aa140e20e62a23c1d928c54b30a4d7e13e20e19bf9b107c2acb2a35658dba)\n8f7aa140e20e62a23c1d928c54b30a4d7e13e20e19bf9b107c2acb2a35658dba\nThu Jan 22 06:03:47 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 (8f7aa140e20e62a23c1d928c54b30a4d7e13e20e19bf9b107c2acb2a35658dba)\n8f7aa140e20e62a23c1d928c54b30a4d7e13e20e19bf9b107c2acb2a35658dba\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:03:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:47.275 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[561a20f0-711b-407e-9978-7a03eb28cc56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:03:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:47.276 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88ed9213-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:03:47 compute-0 nova_compute[183075]: 2026-01-22 18:03:47.278 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:47 compute-0 kernel: tap88ed9213-70: left promiscuous mode
Jan 22 18:03:47 compute-0 nova_compute[183075]: 2026-01-22 18:03:47.290 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:47.293 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6a85af3c-46b2-49f5-aea8-78baa422d3e6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:03:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:47.310 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[59a8ac3e-17bd-48ea-ae6a-32473231248c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:03:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:47.311 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d8e0efa2-b584-41d9-ba7e-6f909d6afe6b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:03:47 compute-0 nova_compute[183075]: 2026-01-22 18:03:47.316 183079 INFO nova.compute.manager [None req-b8ad4ca7-c785-42fd-a93f-38f89c800489 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Took 0.39 seconds to destroy the instance on the hypervisor.
Jan 22 18:03:47 compute-0 nova_compute[183075]: 2026-01-22 18:03:47.316 183079 DEBUG oslo.service.loopingcall [None req-b8ad4ca7-c785-42fd-a93f-38f89c800489 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 18:03:47 compute-0 nova_compute[183075]: 2026-01-22 18:03:47.317 183079 DEBUG nova.compute.manager [-] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 18:03:47 compute-0 nova_compute[183075]: 2026-01-22 18:03:47.317 183079 DEBUG nova.network.neutron [-] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 18:03:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:47.327 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b9692ace-6749-4b94-bcfe-aa9133100035]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 736087, 'reachable_time': 41939, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247419, 'error': None, 'target': 'ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:03:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:47.330 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-88ed9213-7d48-4c42-ad60-ce3fcf104f71 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 18:03:47 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:03:47.330 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[e56aba8b-f699-4cd7-8fa9-8cadb8c66f2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:03:47 compute-0 systemd[1]: run-netns-ovnmeta\x2d88ed9213\x2d7d48\x2d4c42\x2dad60\x2dce3fcf104f71.mount: Deactivated successfully.
Jan 22 18:03:48 compute-0 nova_compute[183075]: 2026-01-22 18:03:48.112 183079 DEBUG nova.network.neutron [req-c46c6779-6072-4704-b45d-6b910d80db31 req-29fc42e5-011a-4fac-8923-43ccc4b63880 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Updated VIF entry in instance network info cache for port 44d6a309-5606-49d3-9f67-bab5a1644db8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 18:03:48 compute-0 nova_compute[183075]: 2026-01-22 18:03:48.112 183079 DEBUG nova.network.neutron [req-c46c6779-6072-4704-b45d-6b910d80db31 req-29fc42e5-011a-4fac-8923-43ccc4b63880 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Updating instance_info_cache with network_info: [{"id": "44d6a309-5606-49d3-9f67-bab5a1644db8", "address": "fa:16:3e:cd:96:1e", "network": {"id": "88ed9213-7d48-4c42-ad60-ce3fcf104f71", "bridge": "br-int", "label": "tempest-test-network--364650395", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbff5cec4b9c4d2eb317124ca20fea9f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44d6a309-56", "ovs_interfaceid": "44d6a309-5606-49d3-9f67-bab5a1644db8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 18:03:48 compute-0 nova_compute[183075]: 2026-01-22 18:03:48.126 183079 DEBUG oslo_concurrency.lockutils [req-c46c6779-6072-4704-b45d-6b910d80db31 req-29fc42e5-011a-4fac-8923-43ccc4b63880 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-bdce5eb5-c99f-4f11-8f8e-12182942b087" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 18:03:48 compute-0 nova_compute[183075]: 2026-01-22 18:03:48.835 183079 DEBUG nova.compute.manager [req-e1d1a5f4-c8f4-459c-b4a4-44f7a3be921c req-71900617-5bb4-4be0-bc1e-046d0f444f93 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Received event network-changed-44d6a309-5606-49d3-9f67-bab5a1644db8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:03:48 compute-0 nova_compute[183075]: 2026-01-22 18:03:48.835 183079 DEBUG nova.compute.manager [req-e1d1a5f4-c8f4-459c-b4a4-44f7a3be921c req-71900617-5bb4-4be0-bc1e-046d0f444f93 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Refreshing instance network info cache due to event network-changed-44d6a309-5606-49d3-9f67-bab5a1644db8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 18:03:48 compute-0 nova_compute[183075]: 2026-01-22 18:03:48.836 183079 DEBUG oslo_concurrency.lockutils [req-e1d1a5f4-c8f4-459c-b4a4-44f7a3be921c req-71900617-5bb4-4be0-bc1e-046d0f444f93 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-bdce5eb5-c99f-4f11-8f8e-12182942b087" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 18:03:48 compute-0 nova_compute[183075]: 2026-01-22 18:03:48.836 183079 DEBUG oslo_concurrency.lockutils [req-e1d1a5f4-c8f4-459c-b4a4-44f7a3be921c req-71900617-5bb4-4be0-bc1e-046d0f444f93 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-bdce5eb5-c99f-4f11-8f8e-12182942b087" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 18:03:48 compute-0 nova_compute[183075]: 2026-01-22 18:03:48.836 183079 DEBUG nova.network.neutron [req-e1d1a5f4-c8f4-459c-b4a4-44f7a3be921c req-71900617-5bb4-4be0-bc1e-046d0f444f93 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Refreshing network info cache for port 44d6a309-5606-49d3-9f67-bab5a1644db8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 18:03:48 compute-0 nova_compute[183075]: 2026-01-22 18:03:48.912 183079 DEBUG nova.network.neutron [-] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 18:03:48 compute-0 nova_compute[183075]: 2026-01-22 18:03:48.934 183079 INFO nova.compute.manager [-] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Took 1.62 seconds to deallocate network for instance.
Jan 22 18:03:48 compute-0 nova_compute[183075]: 2026-01-22 18:03:48.974 183079 DEBUG oslo_concurrency.lockutils [None req-b8ad4ca7-c785-42fd-a93f-38f89c800489 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:03:48 compute-0 nova_compute[183075]: 2026-01-22 18:03:48.975 183079 DEBUG oslo_concurrency.lockutils [None req-b8ad4ca7-c785-42fd-a93f-38f89c800489 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:03:48 compute-0 nova_compute[183075]: 2026-01-22 18:03:48.978 183079 INFO nova.network.neutron [req-e1d1a5f4-c8f4-459c-b4a4-44f7a3be921c req-71900617-5bb4-4be0-bc1e-046d0f444f93 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Port 44d6a309-5606-49d3-9f67-bab5a1644db8 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Jan 22 18:03:48 compute-0 nova_compute[183075]: 2026-01-22 18:03:48.979 183079 DEBUG nova.network.neutron [req-e1d1a5f4-c8f4-459c-b4a4-44f7a3be921c req-71900617-5bb4-4be0-bc1e-046d0f444f93 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 18:03:48 compute-0 nova_compute[183075]: 2026-01-22 18:03:48.992 183079 DEBUG oslo_concurrency.lockutils [req-e1d1a5f4-c8f4-459c-b4a4-44f7a3be921c req-71900617-5bb4-4be0-bc1e-046d0f444f93 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-bdce5eb5-c99f-4f11-8f8e-12182942b087" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 18:03:48 compute-0 nova_compute[183075]: 2026-01-22 18:03:48.992 183079 DEBUG nova.compute.manager [req-e1d1a5f4-c8f4-459c-b4a4-44f7a3be921c req-71900617-5bb4-4be0-bc1e-046d0f444f93 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Received event network-vif-unplugged-44d6a309-5606-49d3-9f67-bab5a1644db8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:03:48 compute-0 nova_compute[183075]: 2026-01-22 18:03:48.993 183079 DEBUG oslo_concurrency.lockutils [req-e1d1a5f4-c8f4-459c-b4a4-44f7a3be921c req-71900617-5bb4-4be0-bc1e-046d0f444f93 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "bdce5eb5-c99f-4f11-8f8e-12182942b087-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:03:48 compute-0 nova_compute[183075]: 2026-01-22 18:03:48.993 183079 DEBUG oslo_concurrency.lockutils [req-e1d1a5f4-c8f4-459c-b4a4-44f7a3be921c req-71900617-5bb4-4be0-bc1e-046d0f444f93 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "bdce5eb5-c99f-4f11-8f8e-12182942b087-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:03:48 compute-0 nova_compute[183075]: 2026-01-22 18:03:48.993 183079 DEBUG oslo_concurrency.lockutils [req-e1d1a5f4-c8f4-459c-b4a4-44f7a3be921c req-71900617-5bb4-4be0-bc1e-046d0f444f93 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "bdce5eb5-c99f-4f11-8f8e-12182942b087-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:03:48 compute-0 nova_compute[183075]: 2026-01-22 18:03:48.993 183079 DEBUG nova.compute.manager [req-e1d1a5f4-c8f4-459c-b4a4-44f7a3be921c req-71900617-5bb4-4be0-bc1e-046d0f444f93 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] No waiting events found dispatching network-vif-unplugged-44d6a309-5606-49d3-9f67-bab5a1644db8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 18:03:48 compute-0 nova_compute[183075]: 2026-01-22 18:03:48.993 183079 DEBUG nova.compute.manager [req-e1d1a5f4-c8f4-459c-b4a4-44f7a3be921c req-71900617-5bb4-4be0-bc1e-046d0f444f93 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Received event network-vif-unplugged-44d6a309-5606-49d3-9f67-bab5a1644db8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 18:03:48 compute-0 nova_compute[183075]: 2026-01-22 18:03:48.993 183079 DEBUG nova.compute.manager [req-e1d1a5f4-c8f4-459c-b4a4-44f7a3be921c req-71900617-5bb4-4be0-bc1e-046d0f444f93 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Received event network-vif-plugged-44d6a309-5606-49d3-9f67-bab5a1644db8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:03:48 compute-0 nova_compute[183075]: 2026-01-22 18:03:48.993 183079 DEBUG oslo_concurrency.lockutils [req-e1d1a5f4-c8f4-459c-b4a4-44f7a3be921c req-71900617-5bb4-4be0-bc1e-046d0f444f93 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "bdce5eb5-c99f-4f11-8f8e-12182942b087-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:03:48 compute-0 nova_compute[183075]: 2026-01-22 18:03:48.994 183079 DEBUG oslo_concurrency.lockutils [req-e1d1a5f4-c8f4-459c-b4a4-44f7a3be921c req-71900617-5bb4-4be0-bc1e-046d0f444f93 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "bdce5eb5-c99f-4f11-8f8e-12182942b087-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:03:48 compute-0 nova_compute[183075]: 2026-01-22 18:03:48.994 183079 DEBUG oslo_concurrency.lockutils [req-e1d1a5f4-c8f4-459c-b4a4-44f7a3be921c req-71900617-5bb4-4be0-bc1e-046d0f444f93 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "bdce5eb5-c99f-4f11-8f8e-12182942b087-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:03:48 compute-0 nova_compute[183075]: 2026-01-22 18:03:48.994 183079 DEBUG nova.compute.manager [req-e1d1a5f4-c8f4-459c-b4a4-44f7a3be921c req-71900617-5bb4-4be0-bc1e-046d0f444f93 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] No waiting events found dispatching network-vif-plugged-44d6a309-5606-49d3-9f67-bab5a1644db8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 18:03:48 compute-0 nova_compute[183075]: 2026-01-22 18:03:48.994 183079 WARNING nova.compute.manager [req-e1d1a5f4-c8f4-459c-b4a4-44f7a3be921c req-71900617-5bb4-4be0-bc1e-046d0f444f93 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Received unexpected event network-vif-plugged-44d6a309-5606-49d3-9f67-bab5a1644db8 for instance with vm_state active and task_state deleting.
Jan 22 18:03:49 compute-0 nova_compute[183075]: 2026-01-22 18:03:49.022 183079 DEBUG nova.compute.provider_tree [None req-b8ad4ca7-c785-42fd-a93f-38f89c800489 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 18:03:49 compute-0 nova_compute[183075]: 2026-01-22 18:03:49.034 183079 DEBUG nova.scheduler.client.report [None req-b8ad4ca7-c785-42fd-a93f-38f89c800489 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 18:03:49 compute-0 nova_compute[183075]: 2026-01-22 18:03:49.052 183079 DEBUG oslo_concurrency.lockutils [None req-b8ad4ca7-c785-42fd-a93f-38f89c800489 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.078s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:03:49 compute-0 nova_compute[183075]: 2026-01-22 18:03:49.077 183079 INFO nova.scheduler.client.report [None req-b8ad4ca7-c785-42fd-a93f-38f89c800489 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Deleted allocations for instance bdce5eb5-c99f-4f11-8f8e-12182942b087
Jan 22 18:03:49 compute-0 nova_compute[183075]: 2026-01-22 18:03:49.141 183079 DEBUG oslo_concurrency.lockutils [None req-b8ad4ca7-c785-42fd-a93f-38f89c800489 66c24efd0cd24c57803ea8508e679d06 cbff5cec4b9c4d2eb317124ca20fea9f - - default default] Lock "bdce5eb5-c99f-4f11-8f8e-12182942b087" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.220s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:03:49 compute-0 nova_compute[183075]: 2026-01-22 18:03:49.737 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:49 compute-0 nova_compute[183075]: 2026-01-22 18:03:49.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:03:50 compute-0 podman[247420]: 2026-01-22 18:03:50.338404814 +0000 UTC m=+0.048517661 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 18:03:50 compute-0 nova_compute[183075]: 2026-01-22 18:03:50.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:03:50 compute-0 nova_compute[183075]: 2026-01-22 18:03:50.906 183079 DEBUG nova.compute.manager [req-7c6e32d6-6f93-4c12-abb3-affc55458c12 req-ab8885f0-68c9-4f83-a075-8eb4b16bd307 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Received event network-vif-deleted-44d6a309-5606-49d3-9f67-bab5a1644db8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:03:51 compute-0 nova_compute[183075]: 2026-01-22 18:03:51.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:03:52 compute-0 nova_compute[183075]: 2026-01-22 18:03:52.265 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:52 compute-0 nova_compute[183075]: 2026-01-22 18:03:52.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:03:54 compute-0 nova_compute[183075]: 2026-01-22 18:03:54.740 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:03:55.466 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:03:55.467 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:03:55.467 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:03:55.467 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:03:55.467 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:03:55.467 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:03:55.468 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:03:55.468 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:03:55.468 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:03:55.468 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:03:55.468 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:03:55.468 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:03:55.468 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:03:55.468 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:03:55.468 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:03:55.468 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:03:55.468 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:03:55.469 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:03:55.469 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:03:55.469 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:03:55.469 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:03:55.469 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:03:55.469 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:03:55.469 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:03:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:03:55.469 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:03:57 compute-0 nova_compute[183075]: 2026-01-22 18:03:57.269 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:03:59 compute-0 nova_compute[183075]: 2026-01-22 18:03:59.742 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:04:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:04:00.449 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=63, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=62) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 18:04:00 compute-0 nova_compute[183075]: 2026-01-22 18:04:00.449 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:04:00 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:04:00.451 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 18:04:00 compute-0 nova_compute[183075]: 2026-01-22 18:04:00.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:04:02 compute-0 nova_compute[183075]: 2026-01-22 18:04:02.181 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769105027.1795013, bdce5eb5-c99f-4f11-8f8e-12182942b087 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 18:04:02 compute-0 nova_compute[183075]: 2026-01-22 18:04:02.181 183079 INFO nova.compute.manager [-] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] VM Stopped (Lifecycle Event)
Jan 22 18:04:02 compute-0 nova_compute[183075]: 2026-01-22 18:04:02.201 183079 DEBUG nova.compute.manager [None req-e4eac286-6813-4c73-a61e-1cbd8f054ca9 - - - - - -] [instance: bdce5eb5-c99f-4f11-8f8e-12182942b087] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 18:04:02 compute-0 nova_compute[183075]: 2026-01-22 18:04:02.272 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:04:02 compute-0 podman[247446]: 2026-01-22 18:04:02.352698913 +0000 UTC m=+0.059371843 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, vcs-type=git, architecture=x86_64, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 22 18:04:02 compute-0 podman[247445]: 2026-01-22 18:04:02.371130441 +0000 UTC m=+0.080323599 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 22 18:04:02 compute-0 podman[247444]: 2026-01-22 18:04:02.399918817 +0000 UTC m=+0.112022644 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 22 18:04:02 compute-0 nova_compute[183075]: 2026-01-22 18:04:02.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:04:02 compute-0 nova_compute[183075]: 2026-01-22 18:04:02.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 18:04:02 compute-0 nova_compute[183075]: 2026-01-22 18:04:02.789 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 18:04:02 compute-0 nova_compute[183075]: 2026-01-22 18:04:02.810 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 18:04:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:04:03.453 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '63'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:04:03 compute-0 nova_compute[183075]: 2026-01-22 18:04:03.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:04:03 compute-0 nova_compute[183075]: 2026-01-22 18:04:03.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 18:04:04 compute-0 nova_compute[183075]: 2026-01-22 18:04:04.743 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:04:04 compute-0 nova_compute[183075]: 2026-01-22 18:04:04.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:04:04 compute-0 nova_compute[183075]: 2026-01-22 18:04:04.829 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:04:04 compute-0 nova_compute[183075]: 2026-01-22 18:04:04.830 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:04:04 compute-0 nova_compute[183075]: 2026-01-22 18:04:04.830 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:04:04 compute-0 nova_compute[183075]: 2026-01-22 18:04:04.830 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 18:04:04 compute-0 podman[247512]: 2026-01-22 18:04:04.935827919 +0000 UTC m=+0.066599239 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 22 18:04:05 compute-0 nova_compute[183075]: 2026-01-22 18:04:05.008 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 18:04:05 compute-0 nova_compute[183075]: 2026-01-22 18:04:05.009 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5724MB free_disk=73.34983444213867GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 18:04:05 compute-0 nova_compute[183075]: 2026-01-22 18:04:05.010 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:04:05 compute-0 nova_compute[183075]: 2026-01-22 18:04:05.010 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:04:05 compute-0 nova_compute[183075]: 2026-01-22 18:04:05.079 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 18:04:05 compute-0 nova_compute[183075]: 2026-01-22 18:04:05.080 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 18:04:05 compute-0 nova_compute[183075]: 2026-01-22 18:04:05.094 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Refreshing inventories for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 18:04:05 compute-0 nova_compute[183075]: 2026-01-22 18:04:05.109 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Updating ProviderTree inventory for provider 2513134c-f67c-4237-84bf-4ebe2450d610 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 18:04:05 compute-0 nova_compute[183075]: 2026-01-22 18:04:05.110 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Updating inventory in ProviderTree for provider 2513134c-f67c-4237-84bf-4ebe2450d610 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 18:04:05 compute-0 nova_compute[183075]: 2026-01-22 18:04:05.124 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Refreshing aggregate associations for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 18:04:05 compute-0 nova_compute[183075]: 2026-01-22 18:04:05.146 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Refreshing trait associations for resource provider 2513134c-f67c-4237-84bf-4ebe2450d610, traits: COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SVM,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE42,COMPUTE_NODE,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE4A,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_FMA3,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_AUTO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 18:04:05 compute-0 nova_compute[183075]: 2026-01-22 18:04:05.167 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 18:04:05 compute-0 nova_compute[183075]: 2026-01-22 18:04:05.181 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 18:04:05 compute-0 nova_compute[183075]: 2026-01-22 18:04:05.202 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 18:04:05 compute-0 nova_compute[183075]: 2026-01-22 18:04:05.203 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.193s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:04:06 compute-0 nova_compute[183075]: 2026-01-22 18:04:06.204 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:04:07 compute-0 nova_compute[183075]: 2026-01-22 18:04:07.278 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:04:09 compute-0 nova_compute[183075]: 2026-01-22 18:04:09.746 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:04:12 compute-0 nova_compute[183075]: 2026-01-22 18:04:12.223 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:04:12 compute-0 nova_compute[183075]: 2026-01-22 18:04:12.279 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:04:12 compute-0 nova_compute[183075]: 2026-01-22 18:04:12.310 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:04:14 compute-0 nova_compute[183075]: 2026-01-22 18:04:14.748 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:04:17 compute-0 nova_compute[183075]: 2026-01-22 18:04:17.284 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:04:17 compute-0 podman[247534]: 2026-01-22 18:04:17.331751058 +0000 UTC m=+0.042823497 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 18:04:19 compute-0 nova_compute[183075]: 2026-01-22 18:04:19.751 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:04:21 compute-0 podman[247559]: 2026-01-22 18:04:21.33067517 +0000 UTC m=+0.046862796 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 18:04:22 compute-0 nova_compute[183075]: 2026-01-22 18:04:22.288 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:04:24 compute-0 nova_compute[183075]: 2026-01-22 18:04:24.754 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:04:24 compute-0 nova_compute[183075]: 2026-01-22 18:04:24.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:04:27 compute-0 nova_compute[183075]: 2026-01-22 18:04:27.291 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:04:29 compute-0 nova_compute[183075]: 2026-01-22 18:04:29.755 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:04:32 compute-0 nova_compute[183075]: 2026-01-22 18:04:32.294 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:04:33 compute-0 podman[247586]: 2026-01-22 18:04:33.334711921 +0000 UTC m=+0.044298086 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 22 18:04:33 compute-0 podman[247587]: 2026-01-22 18:04:33.351664449 +0000 UTC m=+0.057819562 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, maintainer=Red Hat, Inc., 
summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, name=ubi9-minimal)
Jan 22 18:04:33 compute-0 podman[247585]: 2026-01-22 18:04:33.363266582 +0000 UTC m=+0.077056651 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 18:04:34 compute-0 nova_compute[183075]: 2026-01-22 18:04:34.757 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:04:35 compute-0 podman[247648]: 2026-01-22 18:04:35.345790995 +0000 UTC m=+0.055266713 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Jan 22 18:04:36 compute-0 sshd-session[247668]: Received disconnect from 45.148.10.152 port 60234:11:  [preauth]
Jan 22 18:04:36 compute-0 sshd-session[247668]: Disconnected from authenticating user root 45.148.10.152 port 60234 [preauth]
Jan 22 18:04:37 compute-0 nova_compute[183075]: 2026-01-22 18:04:37.297 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:04:39 compute-0 nova_compute[183075]: 2026-01-22 18:04:39.758 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:04:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:04:41.987 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:04:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:04:41.988 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:04:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:04:41.988 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:04:42 compute-0 nova_compute[183075]: 2026-01-22 18:04:42.300 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:04:44 compute-0 nova_compute[183075]: 2026-01-22 18:04:44.761 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:04:47 compute-0 nova_compute[183075]: 2026-01-22 18:04:47.303 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:04:48 compute-0 podman[247670]: 2026-01-22 18:04:48.350719145 +0000 UTC m=+0.054549444 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 18:04:49 compute-0 nova_compute[183075]: 2026-01-22 18:04:49.839 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:04:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:04:50.968 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:ef:11 10.100.0.2 2001:db8::f816:3eff:fe34:ef11'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe34:ef11/64', 'neutron:device_id': 'ovnmeta-cd33fee9-f283-4792-8796-9cc0f4021aaf', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cd33fee9-f283-4792-8796-9cc0f4021aaf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bfeeb28f5b97478fac3f61cc12827bb3', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7c020e6d-68ca-4390-a036-a6097b05932f, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=4c6b8b92-89d6-4150-a6bb-444d2d5ca88c) old=Port_Binding(mac=['fa:16:3e:34:ef:11 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-cd33fee9-f283-4792-8796-9cc0f4021aaf', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cd33fee9-f283-4792-8796-9cc0f4021aaf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bfeeb28f5b97478fac3f61cc12827bb3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 18:04:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:04:50.969 104629 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 4c6b8b92-89d6-4150-a6bb-444d2d5ca88c in datapath cd33fee9-f283-4792-8796-9cc0f4021aaf updated
Jan 22 18:04:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:04:50.971 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cd33fee9-f283-4792-8796-9cc0f4021aaf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 18:04:50 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:04:50.973 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f4c0e63d-a15d-417a-8624-4359a5eec406]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:04:51 compute-0 nova_compute[183075]: 2026-01-22 18:04:51.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:04:51 compute-0 nova_compute[183075]: 2026-01-22 18:04:51.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:04:52 compute-0 nova_compute[183075]: 2026-01-22 18:04:52.306 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:04:52 compute-0 podman[247695]: 2026-01-22 18:04:52.372739151 +0000 UTC m=+0.077289768 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 18:04:52 compute-0 nova_compute[183075]: 2026-01-22 18:04:52.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:04:53 compute-0 nova_compute[183075]: 2026-01-22 18:04:53.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:04:54 compute-0 nova_compute[183075]: 2026-01-22 18:04:54.840 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:04:57 compute-0 nova_compute[183075]: 2026-01-22 18:04:57.311 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:04:57 compute-0 nova_compute[183075]: 2026-01-22 18:04:57.777 183079 DEBUG oslo_concurrency.lockutils [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Acquiring lock "dd2cb045-1122-4834-9fea-6294fc690f67" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:04:57 compute-0 nova_compute[183075]: 2026-01-22 18:04:57.778 183079 DEBUG oslo_concurrency.lockutils [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "dd2cb045-1122-4834-9fea-6294fc690f67" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:04:57 compute-0 nova_compute[183075]: 2026-01-22 18:04:57.797 183079 DEBUG nova.compute.manager [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 18:04:57 compute-0 nova_compute[183075]: 2026-01-22 18:04:57.866 183079 DEBUG oslo_concurrency.lockutils [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:04:57 compute-0 nova_compute[183075]: 2026-01-22 18:04:57.866 183079 DEBUG oslo_concurrency.lockutils [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:04:57 compute-0 nova_compute[183075]: 2026-01-22 18:04:57.877 183079 DEBUG nova.virt.hardware [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 18:04:57 compute-0 nova_compute[183075]: 2026-01-22 18:04:57.878 183079 INFO nova.compute.claims [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Claim successful on node compute-0.ctlplane.example.com
Jan 22 18:04:57 compute-0 nova_compute[183075]: 2026-01-22 18:04:57.965 183079 DEBUG nova.compute.provider_tree [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 18:04:57 compute-0 nova_compute[183075]: 2026-01-22 18:04:57.978 183079 DEBUG nova.scheduler.client.report [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 18:04:58 compute-0 nova_compute[183075]: 2026-01-22 18:04:58.000 183079 DEBUG oslo_concurrency.lockutils [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:04:58 compute-0 nova_compute[183075]: 2026-01-22 18:04:58.001 183079 DEBUG nova.compute.manager [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 18:04:58 compute-0 nova_compute[183075]: 2026-01-22 18:04:58.042 183079 DEBUG nova.compute.manager [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 18:04:58 compute-0 nova_compute[183075]: 2026-01-22 18:04:58.042 183079 DEBUG nova.network.neutron [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 18:04:58 compute-0 nova_compute[183075]: 2026-01-22 18:04:58.057 183079 INFO nova.virt.libvirt.driver [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 18:04:58 compute-0 nova_compute[183075]: 2026-01-22 18:04:58.073 183079 DEBUG nova.compute.manager [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 18:04:58 compute-0 nova_compute[183075]: 2026-01-22 18:04:58.177 183079 DEBUG nova.compute.manager [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 18:04:58 compute-0 nova_compute[183075]: 2026-01-22 18:04:58.178 183079 DEBUG nova.virt.libvirt.driver [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 18:04:58 compute-0 nova_compute[183075]: 2026-01-22 18:04:58.179 183079 INFO nova.virt.libvirt.driver [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Creating image(s)
Jan 22 18:04:58 compute-0 nova_compute[183075]: 2026-01-22 18:04:58.180 183079 DEBUG oslo_concurrency.lockutils [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Acquiring lock "/var/lib/nova/instances/dd2cb045-1122-4834-9fea-6294fc690f67/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:04:58 compute-0 nova_compute[183075]: 2026-01-22 18:04:58.180 183079 DEBUG oslo_concurrency.lockutils [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "/var/lib/nova/instances/dd2cb045-1122-4834-9fea-6294fc690f67/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:04:58 compute-0 nova_compute[183075]: 2026-01-22 18:04:58.181 183079 DEBUG oslo_concurrency.lockutils [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "/var/lib/nova/instances/dd2cb045-1122-4834-9fea-6294fc690f67/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:04:58 compute-0 nova_compute[183075]: 2026-01-22 18:04:58.205 183079 DEBUG oslo_concurrency.processutils [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:04:58 compute-0 nova_compute[183075]: 2026-01-22 18:04:58.281 183079 DEBUG oslo_concurrency.processutils [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:04:58 compute-0 nova_compute[183075]: 2026-01-22 18:04:58.283 183079 DEBUG oslo_concurrency.lockutils [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:04:58 compute-0 nova_compute[183075]: 2026-01-22 18:04:58.283 183079 DEBUG oslo_concurrency.lockutils [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:04:58 compute-0 nova_compute[183075]: 2026-01-22 18:04:58.295 183079 DEBUG oslo_concurrency.processutils [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:04:58 compute-0 nova_compute[183075]: 2026-01-22 18:04:58.357 183079 DEBUG oslo_concurrency.processutils [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:04:58 compute-0 nova_compute[183075]: 2026-01-22 18:04:58.358 183079 DEBUG oslo_concurrency.processutils [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/dd2cb045-1122-4834-9fea-6294fc690f67/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:04:58 compute-0 nova_compute[183075]: 2026-01-22 18:04:58.395 183079 DEBUG oslo_concurrency.processutils [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/dd2cb045-1122-4834-9fea-6294fc690f67/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:04:58 compute-0 nova_compute[183075]: 2026-01-22 18:04:58.396 183079 DEBUG oslo_concurrency.lockutils [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:04:58 compute-0 nova_compute[183075]: 2026-01-22 18:04:58.397 183079 DEBUG oslo_concurrency.processutils [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:04:58 compute-0 nova_compute[183075]: 2026-01-22 18:04:58.455 183079 DEBUG oslo_concurrency.processutils [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:04:58 compute-0 nova_compute[183075]: 2026-01-22 18:04:58.456 183079 DEBUG nova.virt.disk.api [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Checking if we can resize image /var/lib/nova/instances/dd2cb045-1122-4834-9fea-6294fc690f67/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 18:04:58 compute-0 nova_compute[183075]: 2026-01-22 18:04:58.457 183079 DEBUG oslo_concurrency.processutils [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd2cb045-1122-4834-9fea-6294fc690f67/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:04:58 compute-0 nova_compute[183075]: 2026-01-22 18:04:58.512 183079 DEBUG oslo_concurrency.processutils [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd2cb045-1122-4834-9fea-6294fc690f67/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:04:58 compute-0 nova_compute[183075]: 2026-01-22 18:04:58.514 183079 DEBUG nova.virt.disk.api [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Cannot resize image /var/lib/nova/instances/dd2cb045-1122-4834-9fea-6294fc690f67/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 18:04:58 compute-0 nova_compute[183075]: 2026-01-22 18:04:58.514 183079 DEBUG nova.objects.instance [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lazy-loading 'migration_context' on Instance uuid dd2cb045-1122-4834-9fea-6294fc690f67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 18:04:58 compute-0 nova_compute[183075]: 2026-01-22 18:04:58.538 183079 DEBUG nova.virt.libvirt.driver [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 18:04:58 compute-0 nova_compute[183075]: 2026-01-22 18:04:58.539 183079 DEBUG nova.virt.libvirt.driver [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Ensure instance console log exists: /var/lib/nova/instances/dd2cb045-1122-4834-9fea-6294fc690f67/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 18:04:58 compute-0 nova_compute[183075]: 2026-01-22 18:04:58.540 183079 DEBUG oslo_concurrency.lockutils [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:04:58 compute-0 nova_compute[183075]: 2026-01-22 18:04:58.540 183079 DEBUG oslo_concurrency.lockutils [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:04:58 compute-0 nova_compute[183075]: 2026-01-22 18:04:58.540 183079 DEBUG oslo_concurrency.lockutils [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:04:58 compute-0 nova_compute[183075]: 2026-01-22 18:04:58.928 183079 DEBUG nova.policy [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '42329d6b6bc04d9daacda0eb41f36019', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bfeeb28f5b97478fac3f61cc12827bb3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 18:04:59 compute-0 nova_compute[183075]: 2026-01-22 18:04:59.496 183079 DEBUG nova.network.neutron [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Successfully created port: cba5e4c6-9f20-4080-a0b3-f1ce8a74528e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 18:04:59 compute-0 nova_compute[183075]: 2026-01-22 18:04:59.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:04:59 compute-0 nova_compute[183075]: 2026-01-22 18:04:59.789 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 18:04:59 compute-0 nova_compute[183075]: 2026-01-22 18:04:59.841 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:00 compute-0 nova_compute[183075]: 2026-01-22 18:05:00.115 183079 DEBUG nova.network.neutron [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Successfully updated port: cba5e4c6-9f20-4080-a0b3-f1ce8a74528e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 18:05:00 compute-0 nova_compute[183075]: 2026-01-22 18:05:00.130 183079 DEBUG oslo_concurrency.lockutils [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Acquiring lock "refresh_cache-dd2cb045-1122-4834-9fea-6294fc690f67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 18:05:00 compute-0 nova_compute[183075]: 2026-01-22 18:05:00.131 183079 DEBUG oslo_concurrency.lockutils [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Acquired lock "refresh_cache-dd2cb045-1122-4834-9fea-6294fc690f67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 18:05:00 compute-0 nova_compute[183075]: 2026-01-22 18:05:00.131 183079 DEBUG nova.network.neutron [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 18:05:00 compute-0 nova_compute[183075]: 2026-01-22 18:05:00.203 183079 DEBUG nova.compute.manager [req-c7c6a019-6186-456a-a108-8c112f8d8af2 req-3da5a251-9369-4388-ad68-f8281f46856f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Received event network-changed-cba5e4c6-9f20-4080-a0b3-f1ce8a74528e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:05:00 compute-0 nova_compute[183075]: 2026-01-22 18:05:00.203 183079 DEBUG nova.compute.manager [req-c7c6a019-6186-456a-a108-8c112f8d8af2 req-3da5a251-9369-4388-ad68-f8281f46856f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Refreshing instance network info cache due to event network-changed-cba5e4c6-9f20-4080-a0b3-f1ce8a74528e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 18:05:00 compute-0 nova_compute[183075]: 2026-01-22 18:05:00.204 183079 DEBUG oslo_concurrency.lockutils [req-c7c6a019-6186-456a-a108-8c112f8d8af2 req-3da5a251-9369-4388-ad68-f8281f46856f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-dd2cb045-1122-4834-9fea-6294fc690f67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 18:05:00 compute-0 nova_compute[183075]: 2026-01-22 18:05:00.890 183079 DEBUG nova.network.neutron [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 18:05:02 compute-0 nova_compute[183075]: 2026-01-22 18:05:02.315 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:02 compute-0 nova_compute[183075]: 2026-01-22 18:05:02.768 183079 DEBUG nova.network.neutron [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Updating instance_info_cache with network_info: [{"id": "cba5e4c6-9f20-4080-a0b3-f1ce8a74528e", "address": "fa:16:3e:29:3d:20", "network": {"id": "cd33fee9-f283-4792-8796-9cc0f4021aaf", "bridge": "br-int", "label": "tempest-test-network--1302050819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe29:3d20", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "bfeeb28f5b97478fac3f61cc12827bb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcba5e4c6-9f", "ovs_interfaceid": "cba5e4c6-9f20-4080-a0b3-f1ce8a74528e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 18:05:02 compute-0 nova_compute[183075]: 2026-01-22 18:05:02.800 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:05:02 compute-0 nova_compute[183075]: 2026-01-22 18:05:02.800 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:05:02 compute-0 nova_compute[183075]: 2026-01-22 18:05:02.800 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 18:05:02 compute-0 nova_compute[183075]: 2026-01-22 18:05:02.801 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 18:05:02 compute-0 nova_compute[183075]: 2026-01-22 18:05:02.935 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 22 18:05:02 compute-0 nova_compute[183075]: 2026-01-22 18:05:02.936 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 18:05:02 compute-0 nova_compute[183075]: 2026-01-22 18:05:02.937 183079 DEBUG oslo_concurrency.lockutils [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Releasing lock "refresh_cache-dd2cb045-1122-4834-9fea-6294fc690f67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 18:05:02 compute-0 nova_compute[183075]: 2026-01-22 18:05:02.937 183079 DEBUG nova.compute.manager [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Instance network_info: |[{"id": "cba5e4c6-9f20-4080-a0b3-f1ce8a74528e", "address": "fa:16:3e:29:3d:20", "network": {"id": "cd33fee9-f283-4792-8796-9cc0f4021aaf", "bridge": "br-int", "label": "tempest-test-network--1302050819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe29:3d20", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "bfeeb28f5b97478fac3f61cc12827bb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcba5e4c6-9f", "ovs_interfaceid": "cba5e4c6-9f20-4080-a0b3-f1ce8a74528e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 18:05:02 compute-0 nova_compute[183075]: 2026-01-22 18:05:02.938 183079 DEBUG oslo_concurrency.lockutils [req-c7c6a019-6186-456a-a108-8c112f8d8af2 req-3da5a251-9369-4388-ad68-f8281f46856f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-dd2cb045-1122-4834-9fea-6294fc690f67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 18:05:02 compute-0 nova_compute[183075]: 2026-01-22 18:05:02.938 183079 DEBUG nova.network.neutron [req-c7c6a019-6186-456a-a108-8c112f8d8af2 req-3da5a251-9369-4388-ad68-f8281f46856f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Refreshing network info cache for port cba5e4c6-9f20-4080-a0b3-f1ce8a74528e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 18:05:02 compute-0 nova_compute[183075]: 2026-01-22 18:05:02.942 183079 DEBUG nova.virt.libvirt.driver [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Start _get_guest_xml network_info=[{"id": "cba5e4c6-9f20-4080-a0b3-f1ce8a74528e", "address": "fa:16:3e:29:3d:20", "network": {"id": "cd33fee9-f283-4792-8796-9cc0f4021aaf", "bridge": "br-int", "label": "tempest-test-network--1302050819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe29:3d20", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "bfeeb28f5b97478fac3f61cc12827bb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcba5e4c6-9f", "ovs_interfaceid": "cba5e4c6-9f20-4080-a0b3-f1ce8a74528e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 18:05:02 compute-0 nova_compute[183075]: 2026-01-22 18:05:02.947 183079 WARNING nova.virt.libvirt.driver [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 18:05:02 compute-0 nova_compute[183075]: 2026-01-22 18:05:02.951 183079 DEBUG nova.virt.libvirt.host [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 18:05:02 compute-0 nova_compute[183075]: 2026-01-22 18:05:02.952 183079 DEBUG nova.virt.libvirt.host [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 18:05:02 compute-0 nova_compute[183075]: 2026-01-22 18:05:02.958 183079 DEBUG nova.virt.libvirt.host [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 18:05:02 compute-0 nova_compute[183075]: 2026-01-22 18:05:02.958 183079 DEBUG nova.virt.libvirt.host [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 18:05:02 compute-0 nova_compute[183075]: 2026-01-22 18:05:02.959 183079 DEBUG nova.virt.libvirt.driver [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 18:05:02 compute-0 nova_compute[183075]: 2026-01-22 18:05:02.959 183079 DEBUG nova.virt.hardware [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 18:05:02 compute-0 nova_compute[183075]: 2026-01-22 18:05:02.960 183079 DEBUG nova.virt.hardware [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 18:05:02 compute-0 nova_compute[183075]: 2026-01-22 18:05:02.960 183079 DEBUG nova.virt.hardware [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 18:05:02 compute-0 nova_compute[183075]: 2026-01-22 18:05:02.960 183079 DEBUG nova.virt.hardware [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 18:05:02 compute-0 nova_compute[183075]: 2026-01-22 18:05:02.960 183079 DEBUG nova.virt.hardware [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 18:05:02 compute-0 nova_compute[183075]: 2026-01-22 18:05:02.960 183079 DEBUG nova.virt.hardware [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 18:05:02 compute-0 nova_compute[183075]: 2026-01-22 18:05:02.961 183079 DEBUG nova.virt.hardware [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 18:05:02 compute-0 nova_compute[183075]: 2026-01-22 18:05:02.961 183079 DEBUG nova.virt.hardware [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 18:05:02 compute-0 nova_compute[183075]: 2026-01-22 18:05:02.961 183079 DEBUG nova.virt.hardware [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 18:05:02 compute-0 nova_compute[183075]: 2026-01-22 18:05:02.961 183079 DEBUG nova.virt.hardware [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 18:05:02 compute-0 nova_compute[183075]: 2026-01-22 18:05:02.962 183079 DEBUG nova.virt.hardware [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 18:05:02 compute-0 nova_compute[183075]: 2026-01-22 18:05:02.966 183079 DEBUG nova.virt.libvirt.vif [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T18:04:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1479257049',display_name='tempest-server-test-1479257049',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1479257049',id=83,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN4nUgc+oV7ByTq9qVVAwVt7Z5aS2u6R2o/Luqd0qC5fBtJq9ShHu4J7y3Jz2xuF9QZoBf9pscb0QblSJ2VeHfwaE0UkbJGn/O5SM0HSKnGhFZW4WRYEXRQdnsqUpTi/Qw==',key_name='tempest-keypair-test-2068520959',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bfeeb28f5b97478fac3f61cc12827bb3',ramdisk_id='',reservation_id='r-iommjg3l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessSecGroupDualStackSlaacTest-795693713',owner_user_name='tempest-StatelessSecGroupDualStackSlaacTest-795693713-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T18:04:58Z,user_data=None,user_id='42329d6b6bc04d9daacda0eb41f36019',uuid=dd2cb045-1122-4834-9fea-6294fc690f67,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cba5e4c6-9f20-4080-a0b3-f1ce8a74528e", "address": "fa:16:3e:29:3d:20", "network": {"id": "cd33fee9-f283-4792-8796-9cc0f4021aaf", "bridge": "br-int", "label": "tempest-test-network--1302050819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe29:3d20", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "bfeeb28f5b97478fac3f61cc12827bb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcba5e4c6-9f", "ovs_interfaceid": "cba5e4c6-9f20-4080-a0b3-f1ce8a74528e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 18:05:02 compute-0 nova_compute[183075]: 2026-01-22 18:05:02.966 183079 DEBUG nova.network.os_vif_util [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Converting VIF {"id": "cba5e4c6-9f20-4080-a0b3-f1ce8a74528e", "address": "fa:16:3e:29:3d:20", "network": {"id": "cd33fee9-f283-4792-8796-9cc0f4021aaf", "bridge": "br-int", "label": "tempest-test-network--1302050819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe29:3d20", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "bfeeb28f5b97478fac3f61cc12827bb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcba5e4c6-9f", "ovs_interfaceid": "cba5e4c6-9f20-4080-a0b3-f1ce8a74528e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 18:05:02 compute-0 nova_compute[183075]: 2026-01-22 18:05:02.967 183079 DEBUG nova.network.os_vif_util [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:29:3d:20,bridge_name='br-int',has_traffic_filtering=True,id=cba5e4c6-9f20-4080-a0b3-f1ce8a74528e,network=Network(cd33fee9-f283-4792-8796-9cc0f4021aaf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcba5e4c6-9f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 18:05:02 compute-0 nova_compute[183075]: 2026-01-22 18:05:02.968 183079 DEBUG nova.objects.instance [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lazy-loading 'pci_devices' on Instance uuid dd2cb045-1122-4834-9fea-6294fc690f67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 18:05:03 compute-0 nova_compute[183075]: 2026-01-22 18:05:03.105 183079 DEBUG nova.virt.libvirt.driver [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] End _get_guest_xml xml=<domain type="kvm">
Jan 22 18:05:03 compute-0 nova_compute[183075]:   <uuid>dd2cb045-1122-4834-9fea-6294fc690f67</uuid>
Jan 22 18:05:03 compute-0 nova_compute[183075]:   <name>instance-00000053</name>
Jan 22 18:05:03 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 18:05:03 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 18:05:03 compute-0 nova_compute[183075]:   <metadata>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 18:05:03 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1479257049</nova:name>
Jan 22 18:05:03 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 18:05:02</nova:creationTime>
Jan 22 18:05:03 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 18:05:03 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 18:05:03 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 18:05:03 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 18:05:03 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 18:05:03 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 18:05:03 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 18:05:03 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 18:05:03 compute-0 nova_compute[183075]:         <nova:user uuid="42329d6b6bc04d9daacda0eb41f36019">tempest-StatelessSecGroupDualStackSlaacTest-795693713-project-member</nova:user>
Jan 22 18:05:03 compute-0 nova_compute[183075]:         <nova:project uuid="bfeeb28f5b97478fac3f61cc12827bb3">tempest-StatelessSecGroupDualStackSlaacTest-795693713</nova:project>
Jan 22 18:05:03 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 18:05:03 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 18:05:03 compute-0 nova_compute[183075]:         <nova:port uuid="cba5e4c6-9f20-4080-a0b3-f1ce8a74528e">
Jan 22 18:05:03 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe29:3d20" ipVersion="6"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 18:05:03 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 18:05:03 compute-0 nova_compute[183075]:   </metadata>
Jan 22 18:05:03 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <system>
Jan 22 18:05:03 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 18:05:03 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 18:05:03 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 18:05:03 compute-0 nova_compute[183075]:       <entry name="serial">dd2cb045-1122-4834-9fea-6294fc690f67</entry>
Jan 22 18:05:03 compute-0 nova_compute[183075]:       <entry name="uuid">dd2cb045-1122-4834-9fea-6294fc690f67</entry>
Jan 22 18:05:03 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     </system>
Jan 22 18:05:03 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 18:05:03 compute-0 nova_compute[183075]:   <os>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:   </os>
Jan 22 18:05:03 compute-0 nova_compute[183075]:   <features>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <apic/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:   </features>
Jan 22 18:05:03 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:   </clock>
Jan 22 18:05:03 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:   </cpu>
Jan 22 18:05:03 compute-0 nova_compute[183075]:   <devices>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 18:05:03 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/dd2cb045-1122-4834-9fea-6294fc690f67/disk"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     </disk>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 18:05:03 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:29:3d:20"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:       <target dev="tapcba5e4c6-9f"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     </interface>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 18:05:03 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/dd2cb045-1122-4834-9fea-6294fc690f67/console.log" append="off"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     </serial>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <video>
Jan 22 18:05:03 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     </video>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 18:05:03 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     </rng>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 18:05:03 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 18:05:03 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 18:05:03 compute-0 nova_compute[183075]:   </devices>
Jan 22 18:05:03 compute-0 nova_compute[183075]: </domain>
Jan 22 18:05:03 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 18:05:03 compute-0 nova_compute[183075]: 2026-01-22 18:05:03.107 183079 DEBUG nova.compute.manager [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Preparing to wait for external event network-vif-plugged-cba5e4c6-9f20-4080-a0b3-f1ce8a74528e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 18:05:03 compute-0 nova_compute[183075]: 2026-01-22 18:05:03.107 183079 DEBUG oslo_concurrency.lockutils [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Acquiring lock "dd2cb045-1122-4834-9fea-6294fc690f67-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:05:03 compute-0 nova_compute[183075]: 2026-01-22 18:05:03.107 183079 DEBUG oslo_concurrency.lockutils [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "dd2cb045-1122-4834-9fea-6294fc690f67-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:05:03 compute-0 nova_compute[183075]: 2026-01-22 18:05:03.107 183079 DEBUG oslo_concurrency.lockutils [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "dd2cb045-1122-4834-9fea-6294fc690f67-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:05:03 compute-0 nova_compute[183075]: 2026-01-22 18:05:03.108 183079 DEBUG nova.virt.libvirt.vif [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T18:04:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1479257049',display_name='tempest-server-test-1479257049',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1479257049',id=83,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN4nUgc+oV7ByTq9qVVAwVt7Z5aS2u6R2o/Luqd0qC5fBtJq9ShHu4J7y3Jz2xuF9QZoBf9pscb0QblSJ2VeHfwaE0UkbJGn/O5SM0HSKnGhFZW4WRYEXRQdnsqUpTi/Qw==',key_name='tempest-keypair-test-2068520959',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bfeeb28f5b97478fac3f61cc12827bb3',ramdisk_id='',reservation_id='r-iommjg3l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessSecGroupDualStackSlaacTest-795693713',owner_user_name='tempest-StatelessSecGroupDualStackSlaacTest-795693713-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T18:04:58Z,user_data=None,user_id='42329d6b6bc04d9daacda0eb41f36019',uuid=dd2cb045-1122-4834-9fea-6294fc690f67,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cba5e4c6-9f20-4080-a0b3-f1ce8a74528e", "address": "fa:16:3e:29:3d:20", "network": {"id": "cd33fee9-f283-4792-8796-9cc0f4021aaf", "bridge": "br-int", "label": "tempest-test-network--1302050819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe29:3d20", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "bfeeb28f5b97478fac3f61cc12827bb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcba5e4c6-9f", "ovs_interfaceid": "cba5e4c6-9f20-4080-a0b3-f1ce8a74528e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 18:05:03 compute-0 nova_compute[183075]: 2026-01-22 18:05:03.108 183079 DEBUG nova.network.os_vif_util [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Converting VIF {"id": "cba5e4c6-9f20-4080-a0b3-f1ce8a74528e", "address": "fa:16:3e:29:3d:20", "network": {"id": "cd33fee9-f283-4792-8796-9cc0f4021aaf", "bridge": "br-int", "label": "tempest-test-network--1302050819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe29:3d20", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "bfeeb28f5b97478fac3f61cc12827bb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcba5e4c6-9f", "ovs_interfaceid": "cba5e4c6-9f20-4080-a0b3-f1ce8a74528e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 18:05:03 compute-0 nova_compute[183075]: 2026-01-22 18:05:03.109 183079 DEBUG nova.network.os_vif_util [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:29:3d:20,bridge_name='br-int',has_traffic_filtering=True,id=cba5e4c6-9f20-4080-a0b3-f1ce8a74528e,network=Network(cd33fee9-f283-4792-8796-9cc0f4021aaf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcba5e4c6-9f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 18:05:03 compute-0 nova_compute[183075]: 2026-01-22 18:05:03.109 183079 DEBUG os_vif [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:29:3d:20,bridge_name='br-int',has_traffic_filtering=True,id=cba5e4c6-9f20-4080-a0b3-f1ce8a74528e,network=Network(cd33fee9-f283-4792-8796-9cc0f4021aaf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcba5e4c6-9f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 18:05:03 compute-0 nova_compute[183075]: 2026-01-22 18:05:03.110 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:03 compute-0 nova_compute[183075]: 2026-01-22 18:05:03.110 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:05:03 compute-0 nova_compute[183075]: 2026-01-22 18:05:03.110 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 18:05:03 compute-0 nova_compute[183075]: 2026-01-22 18:05:03.113 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:03 compute-0 nova_compute[183075]: 2026-01-22 18:05:03.113 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcba5e4c6-9f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:05:03 compute-0 nova_compute[183075]: 2026-01-22 18:05:03.113 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcba5e4c6-9f, col_values=(('external_ids', {'iface-id': 'cba5e4c6-9f20-4080-a0b3-f1ce8a74528e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:29:3d:20', 'vm-uuid': 'dd2cb045-1122-4834-9fea-6294fc690f67'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:05:03 compute-0 nova_compute[183075]: 2026-01-22 18:05:03.115 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:03 compute-0 NetworkManager[55454]: <info>  [1769105103.1162] manager: (tapcba5e4c6-9f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/358)
Jan 22 18:05:03 compute-0 nova_compute[183075]: 2026-01-22 18:05:03.118 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 18:05:03 compute-0 nova_compute[183075]: 2026-01-22 18:05:03.121 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:03 compute-0 nova_compute[183075]: 2026-01-22 18:05:03.122 183079 INFO os_vif [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:29:3d:20,bridge_name='br-int',has_traffic_filtering=True,id=cba5e4c6-9f20-4080-a0b3-f1ce8a74528e,network=Network(cd33fee9-f283-4792-8796-9cc0f4021aaf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcba5e4c6-9f')
Jan 22 18:05:03 compute-0 nova_compute[183075]: 2026-01-22 18:05:03.226 183079 DEBUG nova.virt.libvirt.driver [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 18:05:03 compute-0 nova_compute[183075]: 2026-01-22 18:05:03.226 183079 DEBUG nova.virt.libvirt.driver [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] No VIF found with MAC fa:16:3e:29:3d:20, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 18:05:03 compute-0 kernel: tapcba5e4c6-9f: entered promiscuous mode
Jan 22 18:05:03 compute-0 NetworkManager[55454]: <info>  [1769105103.2853] manager: (tapcba5e4c6-9f): new Tun device (/org/freedesktop/NetworkManager/Devices/359)
Jan 22 18:05:03 compute-0 nova_compute[183075]: 2026-01-22 18:05:03.286 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:03 compute-0 ovn_controller[95372]: 2026-01-22T18:05:03Z|00897|binding|INFO|Claiming lport cba5e4c6-9f20-4080-a0b3-f1ce8a74528e for this chassis.
Jan 22 18:05:03 compute-0 ovn_controller[95372]: 2026-01-22T18:05:03Z|00898|binding|INFO|cba5e4c6-9f20-4080-a0b3-f1ce8a74528e: Claiming fa:16:3e:29:3d:20 10.100.0.4 2001:db8::f816:3eff:fe29:3d20
Jan 22 18:05:03 compute-0 nova_compute[183075]: 2026-01-22 18:05:03.293 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:03.304 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:29:3d:20 10.100.0.4 2001:db8::f816:3eff:fe29:3d20'], port_security=['fa:16:3e:29:3d:20 10.100.0.4 2001:db8::f816:3eff:fe29:3d20'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28 2001:db8::f816:3eff:fe29:3d20/64', 'neutron:device_id': 'dd2cb045-1122-4834-9fea-6294fc690f67', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cd33fee9-f283-4792-8796-9cc0f4021aaf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bfeeb28f5b97478fac3f61cc12827bb3', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ad003613-25d7-4407-87d3-e15c431e7689', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7c020e6d-68ca-4390-a036-a6097b05932f, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=cba5e4c6-9f20-4080-a0b3-f1ce8a74528e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:03.307 104629 INFO neutron.agent.ovn.metadata.agent [-] Port cba5e4c6-9f20-4080-a0b3-f1ce8a74528e in datapath cd33fee9-f283-4792-8796-9cc0f4021aaf bound to our chassis
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:03.309 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cd33fee9-f283-4792-8796-9cc0f4021aaf
Jan 22 18:05:03 compute-0 systemd-udevd[247751]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:03.323 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0614ad16-3f1b-476e-ac41-ee7c1c827861]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:03.324 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcd33fee9-f1 in ovnmeta-cd33fee9-f283-4792-8796-9cc0f4021aaf namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:03.326 211630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcd33fee9-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:03.326 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[60c509f2-adac-4b36-abe0-df16b694fa07]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:05:03 compute-0 systemd-machined[154382]: New machine qemu-83-instance-00000053.
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:03.327 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[57f0a5a4-b173-4262-a602-a480acd45647]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:05:03 compute-0 NetworkManager[55454]: <info>  [1769105103.3339] device (tapcba5e4c6-9f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 18:05:03 compute-0 NetworkManager[55454]: <info>  [1769105103.3346] device (tapcba5e4c6-9f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:03.341 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[37b3e149-c0e7-4414-a17a-c14a0afe9072]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:05:03 compute-0 nova_compute[183075]: 2026-01-22 18:05:03.346 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:03 compute-0 systemd[1]: Started Virtual Machine qemu-83-instance-00000053.
Jan 22 18:05:03 compute-0 ovn_controller[95372]: 2026-01-22T18:05:03Z|00899|binding|INFO|Setting lport cba5e4c6-9f20-4080-a0b3-f1ce8a74528e ovn-installed in OVS
Jan 22 18:05:03 compute-0 ovn_controller[95372]: 2026-01-22T18:05:03Z|00900|binding|INFO|Setting lport cba5e4c6-9f20-4080-a0b3-f1ce8a74528e up in Southbound
Jan 22 18:05:03 compute-0 nova_compute[183075]: 2026-01-22 18:05:03.353 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:03.358 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[14ff6fef-92f2-4abd-afc3-a4b1b3a5a4d3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:03.395 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[2938424c-0fa0-450b-8b33-4679ffab2f8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:05:03 compute-0 NetworkManager[55454]: <info>  [1769105103.4018] manager: (tapcd33fee9-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/360)
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:03.400 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[d24405eb-d67d-40d2-ab7f-cf4684bc2156]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:03.438 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[ea2e960e-c9df-4a6a-93e8-abf5a97a7c34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:03.442 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[deabb40b-027d-4a08-ae45-0ba627afe633]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:05:03 compute-0 podman[247758]: 2026-01-22 18:05:03.458157645 +0000 UTC m=+0.081773828 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 18:05:03 compute-0 podman[247759]: 2026-01-22 18:05:03.46502289 +0000 UTC m=+0.088695355 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, release=1755695350, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64)
Jan 22 18:05:03 compute-0 NetworkManager[55454]: <info>  [1769105103.4715] device (tapcd33fee9-f0): carrier: link connected
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:03.480 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[d60addc5-1c31-4a03-9e0f-a114011ad0b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:05:03 compute-0 podman[247762]: 2026-01-22 18:05:03.489859341 +0000 UTC m=+0.103831124 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:03.501 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[515a55d5-f8c2-4f30-8a9b-ef8d7c4e35f3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcd33fee9-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:34:ef:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 246], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 746717, 'reachable_time': 40874, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247846, 'error': None, 'target': 'ovnmeta-cd33fee9-f283-4792-8796-9cc0f4021aaf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:03.518 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[215d4103-795d-4f82-a5fb-c8ba8beaf52e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe34:ef11'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 746717, 'tstamp': 746717}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 247847, 'error': None, 'target': 'ovnmeta-cd33fee9-f283-4792-8796-9cc0f4021aaf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:03.540 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[173ca4d9-4a03-4deb-873e-1fdf034af9c4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcd33fee9-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:34:ef:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 246], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 746717, 'reachable_time': 40874, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 247848, 'error': None, 'target': 'ovnmeta-cd33fee9-f283-4792-8796-9cc0f4021aaf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:03.581 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[50ba71fa-275b-4d7c-bb60-9fd2476eee94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:03.656 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[1fe11d31-1df9-4e28-b0ff-cefd8fb636e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:03.659 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd33fee9-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:03.659 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:03.660 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcd33fee9-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:05:03 compute-0 nova_compute[183075]: 2026-01-22 18:05:03.662 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:03 compute-0 NetworkManager[55454]: <info>  [1769105103.6627] manager: (tapcd33fee9-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/361)
Jan 22 18:05:03 compute-0 kernel: tapcd33fee9-f0: entered promiscuous mode
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:03.666 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcd33fee9-f0, col_values=(('external_ids', {'iface-id': '4c6b8b92-89d6-4150-a6bb-444d2d5ca88c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:05:03 compute-0 ovn_controller[95372]: 2026-01-22T18:05:03Z|00901|binding|INFO|Releasing lport 4c6b8b92-89d6-4150-a6bb-444d2d5ca88c from this chassis (sb_readonly=0)
Jan 22 18:05:03 compute-0 nova_compute[183075]: 2026-01-22 18:05:03.678 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:03 compute-0 nova_compute[183075]: 2026-01-22 18:05:03.679 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:03.681 104629 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cd33fee9-f283-4792-8796-9cc0f4021aaf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cd33fee9-f283-4792-8796-9cc0f4021aaf.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:03.682 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b5d4e578-ca7f-41f7-93ac-36aa602eb248]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:03.682 104629 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]: global
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]:     log         /dev/log local0 debug
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]:     log-tag     haproxy-metadata-proxy-cd33fee9-f283-4792-8796-9cc0f4021aaf
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]:     user        root
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]:     group       root
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]:     maxconn     1024
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]:     pidfile     /var/lib/neutron/external/pids/cd33fee9-f283-4792-8796-9cc0f4021aaf.pid.haproxy
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]:     daemon
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]: 
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]: defaults
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]:     log global
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]:     mode http
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]:     option httplog
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]:     option dontlognull
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]:     option http-server-close
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]:     option forwardfor
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]:     retries                 3
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]:     timeout http-request    30s
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]:     timeout connect         30s
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]:     timeout client          32s
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]:     timeout server          32s
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]:     timeout http-keep-alive 30s
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]: 
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]: 
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]: listen listener
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]:     bind 169.254.169.254:80
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]:     http-request add-header X-OVN-Network-ID cd33fee9-f283-4792-8796-9cc0f4021aaf
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 18:05:03 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:03.683 104629 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cd33fee9-f283-4792-8796-9cc0f4021aaf', 'env', 'PROCESS_TAG=haproxy-cd33fee9-f283-4792-8796-9cc0f4021aaf', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cd33fee9-f283-4792-8796-9cc0f4021aaf.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 18:05:04 compute-0 podman[247880]: 2026-01-22 18:05:04.018733017 +0000 UTC m=+0.052581461 container create 612aeafed49e1c0708b67dd5cbf4552d27a9cbfe9bdea022d7dafca64af2dc9b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd33fee9-f283-4792-8796-9cc0f4021aaf, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 22 18:05:04 compute-0 systemd[1]: Started libpod-conmon-612aeafed49e1c0708b67dd5cbf4552d27a9cbfe9bdea022d7dafca64af2dc9b.scope.
Jan 22 18:05:04 compute-0 systemd[1]: Started libcrun container.
Jan 22 18:05:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f264e97b6c1ea68ae92434cdc69afa24e06bcdbc67e718e964538da7258a37a1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 18:05:04 compute-0 podman[247880]: 2026-01-22 18:05:03.987554195 +0000 UTC m=+0.021402649 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 18:05:04 compute-0 podman[247880]: 2026-01-22 18:05:04.090668378 +0000 UTC m=+0.124516822 container init 612aeafed49e1c0708b67dd5cbf4552d27a9cbfe9bdea022d7dafca64af2dc9b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd33fee9-f283-4792-8796-9cc0f4021aaf, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 18:05:04 compute-0 podman[247880]: 2026-01-22 18:05:04.096341281 +0000 UTC m=+0.130189705 container start 612aeafed49e1c0708b67dd5cbf4552d27a9cbfe9bdea022d7dafca64af2dc9b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd33fee9-f283-4792-8796-9cc0f4021aaf, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.107 183079 DEBUG nova.compute.manager [req-b03b525f-9261-4551-8211-8c5be0c9b9fd req-b81914c3-7fd0-4270-926f-ea7b26e38801 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Received event network-vif-plugged-cba5e4c6-9f20-4080-a0b3-f1ce8a74528e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.108 183079 DEBUG oslo_concurrency.lockutils [req-b03b525f-9261-4551-8211-8c5be0c9b9fd req-b81914c3-7fd0-4270-926f-ea7b26e38801 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "dd2cb045-1122-4834-9fea-6294fc690f67-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.108 183079 DEBUG oslo_concurrency.lockutils [req-b03b525f-9261-4551-8211-8c5be0c9b9fd req-b81914c3-7fd0-4270-926f-ea7b26e38801 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "dd2cb045-1122-4834-9fea-6294fc690f67-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.108 183079 DEBUG oslo_concurrency.lockutils [req-b03b525f-9261-4551-8211-8c5be0c9b9fd req-b81914c3-7fd0-4270-926f-ea7b26e38801 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "dd2cb045-1122-4834-9fea-6294fc690f67-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.108 183079 DEBUG nova.compute.manager [req-b03b525f-9261-4551-8211-8c5be0c9b9fd req-b81914c3-7fd0-4270-926f-ea7b26e38801 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Processing event network-vif-plugged-cba5e4c6-9f20-4080-a0b3-f1ce8a74528e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 18:05:04 compute-0 neutron-haproxy-ovnmeta-cd33fee9-f283-4792-8796-9cc0f4021aaf[247895]: [NOTICE]   (247899) : New worker (247901) forked
Jan 22 18:05:04 compute-0 neutron-haproxy-ovnmeta-cd33fee9-f283-4792-8796-9cc0f4021aaf[247895]: [NOTICE]   (247899) : Loading success.
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.296 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:04.296 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=64, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=63) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 18:05:04 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:04.298 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.524 183079 DEBUG nova.compute.manager [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.526 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769105104.5235057, dd2cb045-1122-4834-9fea-6294fc690f67 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.526 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] VM Started (Lifecycle Event)
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.530 183079 DEBUG nova.virt.libvirt.driver [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.533 183079 INFO nova.virt.libvirt.driver [-] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Instance spawned successfully.
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.534 183079 DEBUG nova.virt.libvirt.driver [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.554 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.559 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.563 183079 DEBUG nova.virt.libvirt.driver [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.563 183079 DEBUG nova.virt.libvirt.driver [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.564 183079 DEBUG nova.virt.libvirt.driver [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.564 183079 DEBUG nova.virt.libvirt.driver [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.565 183079 DEBUG nova.virt.libvirt.driver [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.565 183079 DEBUG nova.virt.libvirt.driver [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.597 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.597 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769105104.5245802, dd2cb045-1122-4834-9fea-6294fc690f67 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.597 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] VM Paused (Lifecycle Event)
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.626 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.630 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769105104.5294228, dd2cb045-1122-4834-9fea-6294fc690f67 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.630 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] VM Resumed (Lifecycle Event)
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.638 183079 INFO nova.compute.manager [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Took 6.46 seconds to spawn the instance on the hypervisor.
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.639 183079 DEBUG nova.compute.manager [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.648 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.651 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.682 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.700 183079 INFO nova.compute.manager [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Took 6.86 seconds to build instance.
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.714 183079 DEBUG oslo_concurrency.lockutils [None req-b3c2bccc-3e9c-42b4-97fe-8b0c200cb3b5 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "dd2cb045-1122-4834-9fea-6294fc690f67" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.936s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.787 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 18:05:04 compute-0 nova_compute[183075]: 2026-01-22 18:05:04.843 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:05 compute-0 nova_compute[183075]: 2026-01-22 18:05:05.028 183079 DEBUG nova.network.neutron [req-c7c6a019-6186-456a-a108-8c112f8d8af2 req-3da5a251-9369-4388-ad68-f8281f46856f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Updated VIF entry in instance network info cache for port cba5e4c6-9f20-4080-a0b3-f1ce8a74528e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 18:05:05 compute-0 nova_compute[183075]: 2026-01-22 18:05:05.030 183079 DEBUG nova.network.neutron [req-c7c6a019-6186-456a-a108-8c112f8d8af2 req-3da5a251-9369-4388-ad68-f8281f46856f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Updating instance_info_cache with network_info: [{"id": "cba5e4c6-9f20-4080-a0b3-f1ce8a74528e", "address": "fa:16:3e:29:3d:20", "network": {"id": "cd33fee9-f283-4792-8796-9cc0f4021aaf", "bridge": "br-int", "label": "tempest-test-network--1302050819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe29:3d20", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "bfeeb28f5b97478fac3f61cc12827bb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcba5e4c6-9f", "ovs_interfaceid": "cba5e4c6-9f20-4080-a0b3-f1ce8a74528e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 18:05:05 compute-0 nova_compute[183075]: 2026-01-22 18:05:05.046 183079 DEBUG oslo_concurrency.lockutils [req-c7c6a019-6186-456a-a108-8c112f8d8af2 req-3da5a251-9369-4388-ad68-f8281f46856f a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-dd2cb045-1122-4834-9fea-6294fc690f67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 18:05:05 compute-0 nova_compute[183075]: 2026-01-22 18:05:05.518 183079 INFO nova.compute.manager [None req-999daa5a-a386-4d71-bd3e-91d594c2cc8d 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Get console output
Jan 22 18:05:05 compute-0 nova_compute[183075]: 2026-01-22 18:05:05.522 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:05:06 compute-0 nova_compute[183075]: 2026-01-22 18:05:06.173 183079 DEBUG nova.compute.manager [req-280b0271-08e2-4c72-b05c-3fb95514d10e req-73c16795-6ede-4659-ae28-46e663b02553 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Received event network-vif-plugged-cba5e4c6-9f20-4080-a0b3-f1ce8a74528e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:05:06 compute-0 nova_compute[183075]: 2026-01-22 18:05:06.174 183079 DEBUG oslo_concurrency.lockutils [req-280b0271-08e2-4c72-b05c-3fb95514d10e req-73c16795-6ede-4659-ae28-46e663b02553 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "dd2cb045-1122-4834-9fea-6294fc690f67-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:05:06 compute-0 nova_compute[183075]: 2026-01-22 18:05:06.174 183079 DEBUG oslo_concurrency.lockutils [req-280b0271-08e2-4c72-b05c-3fb95514d10e req-73c16795-6ede-4659-ae28-46e663b02553 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "dd2cb045-1122-4834-9fea-6294fc690f67-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:05:06 compute-0 nova_compute[183075]: 2026-01-22 18:05:06.174 183079 DEBUG oslo_concurrency.lockutils [req-280b0271-08e2-4c72-b05c-3fb95514d10e req-73c16795-6ede-4659-ae28-46e663b02553 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "dd2cb045-1122-4834-9fea-6294fc690f67-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:05:06 compute-0 nova_compute[183075]: 2026-01-22 18:05:06.174 183079 DEBUG nova.compute.manager [req-280b0271-08e2-4c72-b05c-3fb95514d10e req-73c16795-6ede-4659-ae28-46e663b02553 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] No waiting events found dispatching network-vif-plugged-cba5e4c6-9f20-4080-a0b3-f1ce8a74528e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 18:05:06 compute-0 nova_compute[183075]: 2026-01-22 18:05:06.174 183079 WARNING nova.compute.manager [req-280b0271-08e2-4c72-b05c-3fb95514d10e req-73c16795-6ede-4659-ae28-46e663b02553 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Received unexpected event network-vif-plugged-cba5e4c6-9f20-4080-a0b3-f1ce8a74528e for instance with vm_state active and task_state None.
Jan 22 18:05:06 compute-0 podman[247918]: 2026-01-22 18:05:06.351020752 +0000 UTC m=+0.059842056 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 18:05:06 compute-0 nova_compute[183075]: 2026-01-22 18:05:06.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:05:06 compute-0 nova_compute[183075]: 2026-01-22 18:05:06.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:05:06 compute-0 nova_compute[183075]: 2026-01-22 18:05:06.810 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:05:06 compute-0 nova_compute[183075]: 2026-01-22 18:05:06.810 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:05:06 compute-0 nova_compute[183075]: 2026-01-22 18:05:06.810 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:05:06 compute-0 nova_compute[183075]: 2026-01-22 18:05:06.810 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 18:05:06 compute-0 nova_compute[183075]: 2026-01-22 18:05:06.877 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd2cb045-1122-4834-9fea-6294fc690f67/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:05:06 compute-0 nova_compute[183075]: 2026-01-22 18:05:06.939 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd2cb045-1122-4834-9fea-6294fc690f67/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:05:06 compute-0 nova_compute[183075]: 2026-01-22 18:05:06.940 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd2cb045-1122-4834-9fea-6294fc690f67/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:05:06 compute-0 nova_compute[183075]: 2026-01-22 18:05:06.993 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd2cb045-1122-4834-9fea-6294fc690f67/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:05:07 compute-0 nova_compute[183075]: 2026-01-22 18:05:07.135 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 18:05:07 compute-0 nova_compute[183075]: 2026-01-22 18:05:07.136 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5534MB free_disk=73.34938430786133GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 18:05:07 compute-0 nova_compute[183075]: 2026-01-22 18:05:07.137 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:05:07 compute-0 nova_compute[183075]: 2026-01-22 18:05:07.137 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:05:07 compute-0 nova_compute[183075]: 2026-01-22 18:05:07.260 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance dd2cb045-1122-4834-9fea-6294fc690f67 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 18:05:07 compute-0 nova_compute[183075]: 2026-01-22 18:05:07.260 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 18:05:07 compute-0 nova_compute[183075]: 2026-01-22 18:05:07.260 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 18:05:07 compute-0 nova_compute[183075]: 2026-01-22 18:05:07.296 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 18:05:07 compute-0 nova_compute[183075]: 2026-01-22 18:05:07.311 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 18:05:07 compute-0 nova_compute[183075]: 2026-01-22 18:05:07.333 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 18:05:07 compute-0 nova_compute[183075]: 2026-01-22 18:05:07.333 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:05:08 compute-0 nova_compute[183075]: 2026-01-22 18:05:08.117 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:09 compute-0 nova_compute[183075]: 2026-01-22 18:05:09.879 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:10 compute-0 nova_compute[183075]: 2026-01-22 18:05:10.936 183079 INFO nova.compute.manager [None req-4a90a549-f23d-4ab3-b0d7-cfbbcdc1766d 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Get console output
Jan 22 18:05:10 compute-0 nova_compute[183075]: 2026-01-22 18:05:10.941 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:05:11 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:11.300 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '64'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:05:13 compute-0 nova_compute[183075]: 2026-01-22 18:05:13.156 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:14 compute-0 nova_compute[183075]: 2026-01-22 18:05:14.880 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:16 compute-0 nova_compute[183075]: 2026-01-22 18:05:16.050 183079 INFO nova.compute.manager [None req-0284104b-c8a4-423c-92ea-4d3b7302cf4a 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Get console output
Jan 22 18:05:16 compute-0 nova_compute[183075]: 2026-01-22 18:05:16.054 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:05:17 compute-0 ovn_controller[95372]: 2026-01-22T18:05:17Z|00168|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:29:3d:20 10.100.0.4
Jan 22 18:05:17 compute-0 ovn_controller[95372]: 2026-01-22T18:05:17Z|00169|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:29:3d:20 10.100.0.4
Jan 22 18:05:18 compute-0 nova_compute[183075]: 2026-01-22 18:05:18.158 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:19 compute-0 systemd[1]: Starting dnf makecache...
Jan 22 18:05:19 compute-0 podman[247964]: 2026-01-22 18:05:19.349235018 +0000 UTC m=+0.053411053 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 18:05:19 compute-0 dnf[247965]: Metadata cache refreshed recently.
Jan 22 18:05:19 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 22 18:05:19 compute-0 systemd[1]: Finished dnf makecache.
Jan 22 18:05:19 compute-0 nova_compute[183075]: 2026-01-22 18:05:19.883 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:21 compute-0 nova_compute[183075]: 2026-01-22 18:05:21.656 183079 INFO nova.compute.manager [None req-13ad10d0-0a94-4886-bfa8-380fe0fe8398 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Get console output
Jan 22 18:05:21 compute-0 nova_compute[183075]: 2026-01-22 18:05:21.663 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:05:23 compute-0 nova_compute[183075]: 2026-01-22 18:05:23.161 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:23 compute-0 podman[247989]: 2026-01-22 18:05:23.353922355 +0000 UTC m=+0.058524791 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 18:05:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:23.650 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:05:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:23.651 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 18:05:23 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:05:23 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:05:23 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:05:23 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:05:23 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:05:23 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 18:05:23 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cd33fee9-f283-4792-8796-9cc0f4021aaf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:05:24 compute-0 nova_compute[183075]: 2026-01-22 18:05:24.884 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:25.899 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:25.899 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 2.2484510
Jan 22 18:05:25 compute-0 haproxy-metadata-proxy-cd33fee9-f283-4792-8796-9cc0f4021aaf[247901]: 10.100.0.4:33116 [22/Jan/2026:18:05:23.649] listener listener/metadata 0/0/0/2250/2250 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:25.907 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:25.908 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cd33fee9-f283-4792-8796-9cc0f4021aaf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:25.931 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:25.931 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0228963
Jan 22 18:05:25 compute-0 haproxy-metadata-proxy-cd33fee9-f283-4792-8796-9cc0f4021aaf[247901]: 10.100.0.4:33120 [22/Jan/2026:18:05:25.907] listener listener/metadata 0/0/0/23/23 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:25.935 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:25.936 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cd33fee9-f283-4792-8796-9cc0f4021aaf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:25.950 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:25.950 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0145540
Jan 22 18:05:25 compute-0 haproxy-metadata-proxy-cd33fee9-f283-4792-8796-9cc0f4021aaf[247901]: 10.100.0.4:33136 [22/Jan/2026:18:05:25.935] listener listener/metadata 0/0/0/15/15 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:25.956 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:25.956 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cd33fee9-f283-4792-8796-9cc0f4021aaf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:25.973 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:25.974 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0173364
Jan 22 18:05:25 compute-0 haproxy-metadata-proxy-cd33fee9-f283-4792-8796-9cc0f4021aaf[247901]: 10.100.0.4:33148 [22/Jan/2026:18:05:25.955] listener listener/metadata 0/0/0/18/18 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:25.980 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:25.981 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cd33fee9-f283-4792-8796-9cc0f4021aaf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:25.996 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:05:25 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:25.996 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0154550
Jan 22 18:05:25 compute-0 haproxy-metadata-proxy-cd33fee9-f283-4792-8796-9cc0f4021aaf[247901]: 10.100.0.4:33156 [22/Jan/2026:18:05:25.980] listener listener/metadata 0/0/0/16/16 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.001 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.001 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cd33fee9-f283-4792-8796-9cc0f4021aaf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.018 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.018 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0170372
Jan 22 18:05:26 compute-0 haproxy-metadata-proxy-cd33fee9-f283-4792-8796-9cc0f4021aaf[247901]: 10.100.0.4:33162 [22/Jan/2026:18:05:26.000] listener listener/metadata 0/0/0/18/18 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.024 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.025 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cd33fee9-f283-4792-8796-9cc0f4021aaf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.046 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.046 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 146 time: 0.0217142
Jan 22 18:05:26 compute-0 haproxy-metadata-proxy-cd33fee9-f283-4792-8796-9cc0f4021aaf[247901]: 10.100.0.4:33174 [22/Jan/2026:18:05:26.024] listener listener/metadata 0/0/0/22/22 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.052 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.053 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cd33fee9-f283-4792-8796-9cc0f4021aaf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.069 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.069 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0168712
Jan 22 18:05:26 compute-0 haproxy-metadata-proxy-cd33fee9-f283-4792-8796-9cc0f4021aaf[247901]: 10.100.0.4:33176 [22/Jan/2026:18:05:26.051] listener listener/metadata 0/0/0/18/18 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.074 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.075 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cd33fee9-f283-4792-8796-9cc0f4021aaf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.088 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.089 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0135202
Jan 22 18:05:26 compute-0 haproxy-metadata-proxy-cd33fee9-f283-4792-8796-9cc0f4021aaf[247901]: 10.100.0.4:33178 [22/Jan/2026:18:05:26.074] listener listener/metadata 0/0/0/14/14 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.093 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.094 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cd33fee9-f283-4792-8796-9cc0f4021aaf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.106 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:05:26 compute-0 haproxy-metadata-proxy-cd33fee9-f283-4792-8796-9cc0f4021aaf[247901]: 10.100.0.4:33180 [22/Jan/2026:18:05:26.093] listener listener/metadata 0/0/0/13/13 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.107 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0125630
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.111 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.112 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cd33fee9-f283-4792-8796-9cc0f4021aaf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:05:26 compute-0 haproxy-metadata-proxy-cd33fee9-f283-4792-8796-9cc0f4021aaf[247901]: 10.100.0.4:33182 [22/Jan/2026:18:05:26.111] listener listener/metadata 0/0/0/13/13 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.124 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0123572
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.134 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.134 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cd33fee9-f283-4792-8796-9cc0f4021aaf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.152 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.152 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0183499
Jan 22 18:05:26 compute-0 haproxy-metadata-proxy-cd33fee9-f283-4792-8796-9cc0f4021aaf[247901]: 10.100.0.4:33196 [22/Jan/2026:18:05:26.133] listener listener/metadata 0/0/0/19/19 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.157 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.158 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cd33fee9-f283-4792-8796-9cc0f4021aaf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.171 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.172 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0142398
Jan 22 18:05:26 compute-0 haproxy-metadata-proxy-cd33fee9-f283-4792-8796-9cc0f4021aaf[247901]: 10.100.0.4:33210 [22/Jan/2026:18:05:26.156] listener listener/metadata 0/0/0/15/15 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.175 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.176 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cd33fee9-f283-4792-8796-9cc0f4021aaf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.190 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.191 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0148649
Jan 22 18:05:26 compute-0 haproxy-metadata-proxy-cd33fee9-f283-4792-8796-9cc0f4021aaf[247901]: 10.100.0.4:33212 [22/Jan/2026:18:05:26.175] listener listener/metadata 0/0/0/15/15 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.195 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.195 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cd33fee9-f283-4792-8796-9cc0f4021aaf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.207 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.207 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0114994
Jan 22 18:05:26 compute-0 haproxy-metadata-proxy-cd33fee9-f283-4792-8796-9cc0f4021aaf[247901]: 10.100.0.4:33224 [22/Jan/2026:18:05:26.195] listener listener/metadata 0/0/0/12/12 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.211 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.211 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.4
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cd33fee9-f283-4792-8796-9cc0f4021aaf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.228 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:05:26 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:26.228 104990 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0169594
Jan 22 18:05:26 compute-0 haproxy-metadata-proxy-cd33fee9-f283-4792-8796-9cc0f4021aaf[247901]: 10.100.0.4:33232 [22/Jan/2026:18:05:26.210] listener listener/metadata 0/0/0/17/17 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 18:05:26 compute-0 nova_compute[183075]: 2026-01-22 18:05:26.861 183079 INFO nova.compute.manager [None req-c208ffe4-ff5f-477f-a9e7-0f545a335dff 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Get console output
Jan 22 18:05:26 compute-0 nova_compute[183075]: 2026-01-22 18:05:26.864 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:05:28 compute-0 nova_compute[183075]: 2026-01-22 18:05:28.163 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:29 compute-0 nova_compute[183075]: 2026-01-22 18:05:29.889 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:32 compute-0 nova_compute[183075]: 2026-01-22 18:05:32.089 183079 INFO nova.compute.manager [None req-4d55ee88-bd61-40fe-9a24-0b3825fe3565 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Get console output
Jan 22 18:05:32 compute-0 nova_compute[183075]: 2026-01-22 18:05:32.093 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:05:33 compute-0 nova_compute[183075]: 2026-01-22 18:05:33.165 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:33 compute-0 ovn_controller[95372]: 2026-01-22T18:05:33Z|00902|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Jan 22 18:05:33 compute-0 nova_compute[183075]: 2026-01-22 18:05:33.789 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:05:34 compute-0 podman[248014]: 2026-01-22 18:05:34.380659266 +0000 UTC m=+0.069486146 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 22 18:05:34 compute-0 podman[248015]: 2026-01-22 18:05:34.393532224 +0000 UTC m=+0.085380946 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, release=1755695350, vcs-type=git, version=9.6, name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter)
Jan 22 18:05:34 compute-0 podman[248013]: 2026-01-22 18:05:34.395869037 +0000 UTC m=+0.096439824 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, 
config_id=ovn_controller, org.label-schema.build-date=20251202)
Jan 22 18:05:34 compute-0 nova_compute[183075]: 2026-01-22 18:05:34.891 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:37 compute-0 nova_compute[183075]: 2026-01-22 18:05:37.218 183079 INFO nova.compute.manager [None req-2b97dc23-e07c-438c-9dc3-e6ef0bcc4286 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Get console output
Jan 22 18:05:37 compute-0 nova_compute[183075]: 2026-01-22 18:05:37.223 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:05:37 compute-0 podman[248077]: 2026-01-22 18:05:37.341478146 +0000 UTC m=+0.054926033 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 22 18:05:38 compute-0 nova_compute[183075]: 2026-01-22 18:05:38.168 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:39 compute-0 nova_compute[183075]: 2026-01-22 18:05:39.802 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:05:39 compute-0 nova_compute[183075]: 2026-01-22 18:05:39.802 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 18:05:39 compute-0 nova_compute[183075]: 2026-01-22 18:05:39.834 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 18:05:39 compute-0 nova_compute[183075]: 2026-01-22 18:05:39.893 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:41.989 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:05:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:41.990 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:05:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:05:41.991 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:05:42 compute-0 nova_compute[183075]: 2026-01-22 18:05:42.326 183079 INFO nova.compute.manager [None req-f536f6bd-5227-4017-b78a-f145bae8fed3 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Get console output
Jan 22 18:05:42 compute-0 nova_compute[183075]: 2026-01-22 18:05:42.330 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:05:43 compute-0 nova_compute[183075]: 2026-01-22 18:05:43.171 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:44 compute-0 nova_compute[183075]: 2026-01-22 18:05:44.894 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:47 compute-0 nova_compute[183075]: 2026-01-22 18:05:47.461 183079 INFO nova.compute.manager [None req-cc1c81b6-7f3d-4fb5-b088-35e8844f3172 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Get console output
Jan 22 18:05:47 compute-0 nova_compute[183075]: 2026-01-22 18:05:47.466 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:05:48 compute-0 nova_compute[183075]: 2026-01-22 18:05:48.173 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:49 compute-0 nova_compute[183075]: 2026-01-22 18:05:49.896 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:50 compute-0 podman[248096]: 2026-01-22 18:05:50.339071887 +0000 UTC m=+0.042265722 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 18:05:52 compute-0 nova_compute[183075]: 2026-01-22 18:05:52.590 183079 INFO nova.compute.manager [None req-eff0f9cd-fe60-4bc8-b20b-b68f19165831 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Get console output
Jan 22 18:05:52 compute-0 nova_compute[183075]: 2026-01-22 18:05:52.595 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:05:52 compute-0 nova_compute[183075]: 2026-01-22 18:05:52.820 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:05:53 compute-0 nova_compute[183075]: 2026-01-22 18:05:53.175 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:53 compute-0 nova_compute[183075]: 2026-01-22 18:05:53.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:05:53 compute-0 nova_compute[183075]: 2026-01-22 18:05:53.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:05:54 compute-0 podman[248121]: 2026-01-22 18:05:54.362066098 +0000 UTC m=+0.073869655 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 18:05:54 compute-0 nova_compute[183075]: 2026-01-22 18:05:54.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:05:54 compute-0 nova_compute[183075]: 2026-01-22 18:05:54.899 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.469 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'dd2cb045-1122-4834-9fea-6294fc690f67', 'name': 'tempest-server-test-1479257049', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000053', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'bfeeb28f5b97478fac3f61cc12827bb3', 'user_id': '42329d6b6bc04d9daacda0eb41f36019', 'hostId': '9609ec74ca2f8dac30350a40c32d6040073b98501f5870039b1471af', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.469 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.483 12 DEBUG ceilometer.compute.pollsters [-] dd2cb045-1122-4834-9fea-6294fc690f67/disk.device.read.bytes volume: 30161408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '954a6d7f-bc19-4ef5-87e5-0cefa47d313a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30161408, 'user_id': '42329d6b6bc04d9daacda0eb41f36019', 'user_name': None, 'project_id': 'bfeeb28f5b97478fac3f61cc12827bb3', 'project_name': None, 'resource_id': 'dd2cb045-1122-4834-9fea-6294fc690f67-vda', 'timestamp': '2026-01-22T18:05:55.470057', 'resource_metadata': {'display_name': 'tempest-server-test-1479257049', 'name': 'instance-00000053', 'instance_id': 'dd2cb045-1122-4834-9fea-6294fc690f67', 'instance_type': 'm1.nano', 'host': '9609ec74ca2f8dac30350a40c32d6040073b98501f5870039b1471af', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ff225ed8-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7519.230496148, 'message_signature': 'a17b4ebcdc008fb095a97bb7801d0568743cbc0a2221e3dd775d3d3b10ba72bd'}]}, 'timestamp': '2026-01-22 18:05:55.484309', '_unique_id': 'd2512e7245b04711a5fa02dade7c6b86'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.485 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.486 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.489 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for dd2cb045-1122-4834-9fea-6294fc690f67 / tapcba5e4c6-9f inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.489 12 DEBUG ceilometer.compute.pollsters [-] dd2cb045-1122-4834-9fea-6294fc690f67/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd099cb3b-dedd-47ce-b2e0-eaf57b5c6fc3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '42329d6b6bc04d9daacda0eb41f36019', 'user_name': None, 'project_id': 'bfeeb28f5b97478fac3f61cc12827bb3', 'project_name': None, 'resource_id': 'instance-00000053-dd2cb045-1122-4834-9fea-6294fc690f67-tapcba5e4c6-9f', 'timestamp': '2026-01-22T18:05:55.486783', 'resource_metadata': {'display_name': 'tempest-server-test-1479257049', 'name': 'tapcba5e4c6-9f', 'instance_id': 'dd2cb045-1122-4834-9fea-6294fc690f67', 'instance_type': 'm1.nano', 'host': '9609ec74ca2f8dac30350a40c32d6040073b98501f5870039b1471af', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:29:3d:20', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcba5e4c6-9f'}, 'message_id': 'ff234014-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7519.24724577, 'message_signature': 'e2b37120c8fbc28f47507ec076aa823e34a29a80041943283edad09727f65165'}]}, 'timestamp': '2026-01-22 18:05:55.489990', '_unique_id': 'f24a3873583e4dcba777fbf7b0c29bb2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.490 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.491 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.491 12 DEBUG ceilometer.compute.pollsters [-] dd2cb045-1122-4834-9fea-6294fc690f67/network.outgoing.packets volume: 134 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dce15c02-0c85-472a-b912-6278be9515bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 134, 'user_id': '42329d6b6bc04d9daacda0eb41f36019', 'user_name': None, 'project_id': 'bfeeb28f5b97478fac3f61cc12827bb3', 'project_name': None, 'resource_id': 'instance-00000053-dd2cb045-1122-4834-9fea-6294fc690f67-tapcba5e4c6-9f', 'timestamp': '2026-01-22T18:05:55.491706', 'resource_metadata': {'display_name': 'tempest-server-test-1479257049', 'name': 'tapcba5e4c6-9f', 'instance_id': 'dd2cb045-1122-4834-9fea-6294fc690f67', 'instance_type': 'm1.nano', 'host': '9609ec74ca2f8dac30350a40c32d6040073b98501f5870039b1471af', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:29:3d:20', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcba5e4c6-9f'}, 'message_id': 'ff2390f0-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7519.24724577, 'message_signature': '175f597cfd271911809c5acd45b59728f88663d44b71c681bd2504a41c25bb5d'}]}, 'timestamp': '2026-01-22 18:05:55.492013', '_unique_id': '1337259e9ca54d7ebad51f76ca1f9ad8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.492 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.493 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.493 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.493 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1479257049>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1479257049>]
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.493 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.493 12 DEBUG ceilometer.compute.pollsters [-] dd2cb045-1122-4834-9fea-6294fc690f67/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '92d91e71-cb47-4fa7-9813-eb2571d4528b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '42329d6b6bc04d9daacda0eb41f36019', 'user_name': None, 'project_id': 'bfeeb28f5b97478fac3f61cc12827bb3', 'project_name': None, 'resource_id': 'instance-00000053-dd2cb045-1122-4834-9fea-6294fc690f67-tapcba5e4c6-9f', 'timestamp': '2026-01-22T18:05:55.493902', 'resource_metadata': {'display_name': 'tempest-server-test-1479257049', 'name': 'tapcba5e4c6-9f', 'instance_id': 'dd2cb045-1122-4834-9fea-6294fc690f67', 'instance_type': 'm1.nano', 'host': '9609ec74ca2f8dac30350a40c32d6040073b98501f5870039b1471af', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:29:3d:20', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcba5e4c6-9f'}, 'message_id': 'ff23e6f4-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7519.24724577, 'message_signature': '3387aaca4ae943c64bd4472eb47dd2773b2bbf9b864b53c41f1130c2a748ec8b'}]}, 'timestamp': '2026-01-22 18:05:55.494218', '_unique_id': '09fac420ed9744cead0070da4c73b642'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.494 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.495 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.495 12 DEBUG ceilometer.compute.pollsters [-] dd2cb045-1122-4834-9fea-6294fc690f67/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a664555-b965-4be3-8651-23294dabe984', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '42329d6b6bc04d9daacda0eb41f36019', 'user_name': None, 'project_id': 'bfeeb28f5b97478fac3f61cc12827bb3', 'project_name': None, 'resource_id': 'instance-00000053-dd2cb045-1122-4834-9fea-6294fc690f67-tapcba5e4c6-9f', 'timestamp': '2026-01-22T18:05:55.495514', 'resource_metadata': {'display_name': 'tempest-server-test-1479257049', 'name': 'tapcba5e4c6-9f', 'instance_id': 'dd2cb045-1122-4834-9fea-6294fc690f67', 'instance_type': 'm1.nano', 'host': '9609ec74ca2f8dac30350a40c32d6040073b98501f5870039b1471af', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:29:3d:20', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcba5e4c6-9f'}, 'message_id': 'ff2424e8-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7519.24724577, 'message_signature': 'd37645bfbf7d883d0c9842a8aa7ea6b4227009b1f2f078252fb554197de2b18a'}]}, 'timestamp': '2026-01-22 18:05:55.495762', '_unique_id': 'a6f41547995645c49ac52d5ea6ea2f7e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.496 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.497 12 DEBUG ceilometer.compute.pollsters [-] dd2cb045-1122-4834-9fea-6294fc690f67/disk.device.write.bytes volume: 72998912 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3bf6bb7-049f-4baa-ae55-c850651ec702', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72998912, 'user_id': '42329d6b6bc04d9daacda0eb41f36019', 'user_name': None, 'project_id': 'bfeeb28f5b97478fac3f61cc12827bb3', 'project_name': None, 'resource_id': 'dd2cb045-1122-4834-9fea-6294fc690f67-vda', 'timestamp': '2026-01-22T18:05:55.496982', 'resource_metadata': {'display_name': 'tempest-server-test-1479257049', 'name': 'instance-00000053', 'instance_id': 'dd2cb045-1122-4834-9fea-6294fc690f67', 'instance_type': 'm1.nano', 'host': '9609ec74ca2f8dac30350a40c32d6040073b98501f5870039b1471af', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ff245ecc-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7519.230496148, 'message_signature': 'ca0e979f5e865bf11d5e2b9885558e7e76eb5ded37655d73b85c478148757ebd'}]}, 'timestamp': '2026-01-22 18:05:55.497270', '_unique_id': '8b92211ea2b44e84ab6ae952f7e4f6d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.498 12 DEBUG ceilometer.compute.pollsters [-] dd2cb045-1122-4834-9fea-6294fc690f67/network.incoming.bytes volume: 7331 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6e1f64b0-cd99-4ba7-aefb-54961a156c5b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7331, 'user_id': '42329d6b6bc04d9daacda0eb41f36019', 'user_name': None, 'project_id': 'bfeeb28f5b97478fac3f61cc12827bb3', 'project_name': None, 'resource_id': 'instance-00000053-dd2cb045-1122-4834-9fea-6294fc690f67-tapcba5e4c6-9f', 'timestamp': '2026-01-22T18:05:55.498846', 'resource_metadata': {'display_name': 'tempest-server-test-1479257049', 'name': 'tapcba5e4c6-9f', 'instance_id': 'dd2cb045-1122-4834-9fea-6294fc690f67', 'instance_type': 'm1.nano', 'host': '9609ec74ca2f8dac30350a40c32d6040073b98501f5870039b1471af', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:29:3d:20', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcba5e4c6-9f'}, 'message_id': 'ff24a792-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7519.24724577, 'message_signature': 'd0c145c0a4aab320ef8d7abe1b4b547c585214ce365f4251042a5342f4d51874'}]}, 'timestamp': '2026-01-22 18:05:55.499106', '_unique_id': '1179e012339542edbf8669e55541ea88'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.499 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.500 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.506 12 DEBUG ceilometer.compute.pollsters [-] dd2cb045-1122-4834-9fea-6294fc690f67/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '25f828f5-6f37-4819-b9ef-9d9322f28165', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '42329d6b6bc04d9daacda0eb41f36019', 'user_name': None, 'project_id': 'bfeeb28f5b97478fac3f61cc12827bb3', 'project_name': None, 'resource_id': 'dd2cb045-1122-4834-9fea-6294fc690f67-vda', 'timestamp': '2026-01-22T18:05:55.500401', 'resource_metadata': {'display_name': 'tempest-server-test-1479257049', 'name': 'instance-00000053', 'instance_id': 'dd2cb045-1122-4834-9fea-6294fc690f67', 'instance_type': 'm1.nano', 'host': '9609ec74ca2f8dac30350a40c32d6040073b98501f5870039b1471af', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ff25c910-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7519.260853287, 'message_signature': 'afd8111e1be769f5cc715ad3b9e9d19d48a0f5e143a78dc87c35e8075e498bc1'}]}, 'timestamp': '2026-01-22 18:05:55.506651', '_unique_id': 'e19470b97a5d43b08e79c141e23ba752'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.507 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.508 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.509 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.509 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1479257049>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1479257049>]
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.509 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.523 12 DEBUG ceilometer.compute.pollsters [-] dd2cb045-1122-4834-9fea-6294fc690f67/cpu volume: 11100000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '09bebd7f-af45-4d92-b239-6fe92ac25a71', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11100000000, 'user_id': '42329d6b6bc04d9daacda0eb41f36019', 'user_name': None, 'project_id': 'bfeeb28f5b97478fac3f61cc12827bb3', 'project_name': None, 'resource_id': 'dd2cb045-1122-4834-9fea-6294fc690f67', 'timestamp': '2026-01-22T18:05:55.509457', 'resource_metadata': {'display_name': 'tempest-server-test-1479257049', 'name': 'instance-00000053', 'instance_id': 'dd2cb045-1122-4834-9fea-6294fc690f67', 'instance_type': 'm1.nano', 'host': '9609ec74ca2f8dac30350a40c32d6040073b98501f5870039b1471af', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'ff287a48-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7519.283967871, 'message_signature': '7ba544be03dc6db5721684ad5fdb8159e35efadd369a030a6205c85dcbacb0e0'}]}, 'timestamp': '2026-01-22 18:05:55.524288', '_unique_id': '0521f6d96991478f98072d0abe2972b7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.525 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.526 12 DEBUG ceilometer.compute.pollsters [-] dd2cb045-1122-4834-9fea-6294fc690f67/network.incoming.packets volume: 61 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0930e9ec-3261-4990-b573-c6f319dbdaff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 61, 'user_id': '42329d6b6bc04d9daacda0eb41f36019', 'user_name': None, 'project_id': 'bfeeb28f5b97478fac3f61cc12827bb3', 'project_name': None, 'resource_id': 'instance-00000053-dd2cb045-1122-4834-9fea-6294fc690f67-tapcba5e4c6-9f', 'timestamp': '2026-01-22T18:05:55.526068', 'resource_metadata': {'display_name': 'tempest-server-test-1479257049', 'name': 'tapcba5e4c6-9f', 'instance_id': 'dd2cb045-1122-4834-9fea-6294fc690f67', 'instance_type': 'm1.nano', 'host': '9609ec74ca2f8dac30350a40c32d6040073b98501f5870039b1471af', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:29:3d:20', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcba5e4c6-9f'}, 'message_id': 'ff28cf52-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7519.24724577, 'message_signature': '7bfdfffebf45929029606cc15c5697856f075e5f6822b4647b5ed2d938db29fd'}]}, 'timestamp': '2026-01-22 18:05:55.526375', '_unique_id': '999171fa540740e086715bcb4c0cbad6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.527 12 DEBUG ceilometer.compute.pollsters [-] dd2cb045-1122-4834-9fea-6294fc690f67/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '25572631-31af-42fd-a161-85ae30ea7f2a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '42329d6b6bc04d9daacda0eb41f36019', 'user_name': None, 'project_id': 'bfeeb28f5b97478fac3f61cc12827bb3', 'project_name': None, 'resource_id': 'instance-00000053-dd2cb045-1122-4834-9fea-6294fc690f67-tapcba5e4c6-9f', 'timestamp': '2026-01-22T18:05:55.527831', 'resource_metadata': {'display_name': 'tempest-server-test-1479257049', 'name': 'tapcba5e4c6-9f', 'instance_id': 'dd2cb045-1122-4834-9fea-6294fc690f67', 'instance_type': 'm1.nano', 'host': '9609ec74ca2f8dac30350a40c32d6040073b98501f5870039b1471af', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:29:3d:20', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcba5e4c6-9f'}, 'message_id': 'ff29149e-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7519.24724577, 'message_signature': '6d4cb2a325d070190d954383bdb0ce57526f66709189fc24f79253bdd267f5a1'}]}, 'timestamp': '2026-01-22 18:05:55.528160', '_unique_id': '7e938dc202cc42dc9e8cc7735dc66212'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.528 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.529 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.529 12 DEBUG ceilometer.compute.pollsters [-] dd2cb045-1122-4834-9fea-6294fc690f67/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '25aed280-6bdf-4a56-a919-3192a333e07f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '42329d6b6bc04d9daacda0eb41f36019', 'user_name': None, 'project_id': 'bfeeb28f5b97478fac3f61cc12827bb3', 'project_name': None, 'resource_id': 'dd2cb045-1122-4834-9fea-6294fc690f67-vda', 'timestamp': '2026-01-22T18:05:55.529449', 'resource_metadata': {'display_name': 'tempest-server-test-1479257049', 'name': 'instance-00000053', 'instance_id': 'dd2cb045-1122-4834-9fea-6294fc690f67', 'instance_type': 'm1.nano', 'host': '9609ec74ca2f8dac30350a40c32d6040073b98501f5870039b1471af', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ff2951ac-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7519.260853287, 'message_signature': '2549385303a7fd3130020c02e55f1aac3d85f046fc5140e701ea01a00ca360fc'}]}, 'timestamp': '2026-01-22 18:05:55.529702', '_unique_id': '31cbfeea14e146f8ba96a9d0cdaa212d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.530 12 DEBUG ceilometer.compute.pollsters [-] dd2cb045-1122-4834-9fea-6294fc690f67/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5331d37-22dd-4c99-a515-43e04f7c6ba4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '42329d6b6bc04d9daacda0eb41f36019', 'user_name': None, 'project_id': 'bfeeb28f5b97478fac3f61cc12827bb3', 'project_name': None, 'resource_id': 'instance-00000053-dd2cb045-1122-4834-9fea-6294fc690f67-tapcba5e4c6-9f', 'timestamp': '2026-01-22T18:05:55.530854', 'resource_metadata': {'display_name': 'tempest-server-test-1479257049', 'name': 'tapcba5e4c6-9f', 'instance_id': 'dd2cb045-1122-4834-9fea-6294fc690f67', 'instance_type': 'm1.nano', 'host': '9609ec74ca2f8dac30350a40c32d6040073b98501f5870039b1471af', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:29:3d:20', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcba5e4c6-9f'}, 'message_id': 'ff2989e2-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7519.24724577, 'message_signature': 'e0c53dea36db240ced5126d2fc4487830eccb2904a484389b4615e88c22b7e93'}]}, 'timestamp': '2026-01-22 18:05:55.531120', '_unique_id': '367c0e80fa4147cb99315c8cb858671e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.531 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.532 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.532 12 DEBUG ceilometer.compute.pollsters [-] dd2cb045-1122-4834-9fea-6294fc690f67/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1154aef1-c8c9-4df8-afa2-dce1c549eac8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '42329d6b6bc04d9daacda0eb41f36019', 'user_name': None, 'project_id': 'bfeeb28f5b97478fac3f61cc12827bb3', 'project_name': None, 'resource_id': 'instance-00000053-dd2cb045-1122-4834-9fea-6294fc690f67-tapcba5e4c6-9f', 'timestamp': '2026-01-22T18:05:55.532393', 'resource_metadata': {'display_name': 'tempest-server-test-1479257049', 'name': 'tapcba5e4c6-9f', 'instance_id': 'dd2cb045-1122-4834-9fea-6294fc690f67', 'instance_type': 'm1.nano', 'host': '9609ec74ca2f8dac30350a40c32d6040073b98501f5870039b1471af', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:29:3d:20', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcba5e4c6-9f'}, 'message_id': 'ff29c59c-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7519.24724577, 'message_signature': '32f6cae0e1ea5675c0ee670d12d05e7e12c58c7974bb664c11a842b2f21fd31f'}]}, 'timestamp': '2026-01-22 18:05:55.532692', '_unique_id': '0c263a0d1d8b4431ace7b3da159e4229'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.533 12 DEBUG ceilometer.compute.pollsters [-] dd2cb045-1122-4834-9fea-6294fc690f67/network.outgoing.bytes volume: 11910 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '75e8f4e9-8b5b-4ba8-b55e-21251c2ec89b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11910, 'user_id': '42329d6b6bc04d9daacda0eb41f36019', 'user_name': None, 'project_id': 'bfeeb28f5b97478fac3f61cc12827bb3', 'project_name': None, 'resource_id': 'instance-00000053-dd2cb045-1122-4834-9fea-6294fc690f67-tapcba5e4c6-9f', 'timestamp': '2026-01-22T18:05:55.533817', 'resource_metadata': {'display_name': 'tempest-server-test-1479257049', 'name': 'tapcba5e4c6-9f', 'instance_id': 'dd2cb045-1122-4834-9fea-6294fc690f67', 'instance_type': 'm1.nano', 'host': '9609ec74ca2f8dac30350a40c32d6040073b98501f5870039b1471af', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:29:3d:20', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcba5e4c6-9f'}, 'message_id': 'ff29fd64-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7519.24724577, 'message_signature': 'aadf17555b25e289913c478f87caebf1dbb0abcbb733b8c6e8f349079593fc14'}]}, 'timestamp': '2026-01-22 18:05:55.534071', '_unique_id': 'f24532530e33444fa4572b898b9342b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.535 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.535 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.535 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-1479257049>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1479257049>]
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.535 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.535 12 DEBUG ceilometer.compute.pollsters [-] dd2cb045-1122-4834-9fea-6294fc690f67/disk.device.write.requests volume: 324 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14fa8f0d-3774-4094-a588-9c85d6bd05f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 324, 'user_id': '42329d6b6bc04d9daacda0eb41f36019', 'user_name': None, 'project_id': 'bfeeb28f5b97478fac3f61cc12827bb3', 'project_name': None, 'resource_id': 'dd2cb045-1122-4834-9fea-6294fc690f67-vda', 'timestamp': '2026-01-22T18:05:55.535488', 'resource_metadata': {'display_name': 'tempest-server-test-1479257049', 'name': 'instance-00000053', 'instance_id': 'dd2cb045-1122-4834-9fea-6294fc690f67', 'instance_type': 'm1.nano', 'host': '9609ec74ca2f8dac30350a40c32d6040073b98501f5870039b1471af', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ff2a3fa4-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7519.230496148, 'message_signature': '5707569a044b85282640038c43c010a180023d45187533c49d93f1b7306846f2'}]}, 'timestamp': '2026-01-22 18:05:55.535791', '_unique_id': 'c456e65ec41c44009fa14247adb34cbc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.536 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.537 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.537 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.537 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-1479257049>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1479257049>]
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.537 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.537 12 DEBUG ceilometer.compute.pollsters [-] dd2cb045-1122-4834-9fea-6294fc690f67/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac73adf7-497b-4e02-aba5-d764db2d9b51', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '42329d6b6bc04d9daacda0eb41f36019', 'user_name': None, 'project_id': 'bfeeb28f5b97478fac3f61cc12827bb3', 'project_name': None, 'resource_id': 'dd2cb045-1122-4834-9fea-6294fc690f67-vda', 'timestamp': '2026-01-22T18:05:55.537417', 'resource_metadata': {'display_name': 'tempest-server-test-1479257049', 'name': 'instance-00000053', 'instance_id': 'dd2cb045-1122-4834-9fea-6294fc690f67', 'instance_type': 'm1.nano', 'host': '9609ec74ca2f8dac30350a40c32d6040073b98501f5870039b1471af', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ff2a89a0-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7519.260853287, 'message_signature': 'd5841070463f5e4a7c75456887bfffb1d7b12088061787393546a5a70cf6d344'}]}, 'timestamp': '2026-01-22 18:05:55.537697', '_unique_id': 'd95df4cc233a4f30833aa0bb1bc69a86'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.538 12 DEBUG ceilometer.compute.pollsters [-] dd2cb045-1122-4834-9fea-6294fc690f67/disk.device.write.latency volume: 3018073991 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d6acc1a-e3d8-4370-8387-a9ac9d8a24e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3018073991, 'user_id': '42329d6b6bc04d9daacda0eb41f36019', 'user_name': None, 'project_id': 'bfeeb28f5b97478fac3f61cc12827bb3', 'project_name': None, 'resource_id': 'dd2cb045-1122-4834-9fea-6294fc690f67-vda', 'timestamp': '2026-01-22T18:05:55.538946', 'resource_metadata': {'display_name': 'tempest-server-test-1479257049', 'name': 'instance-00000053', 'instance_id': 'dd2cb045-1122-4834-9fea-6294fc690f67', 'instance_type': 'm1.nano', 'host': '9609ec74ca2f8dac30350a40c32d6040073b98501f5870039b1471af', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ff2ac4a6-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7519.230496148, 'message_signature': '8a91fde77087ba8d3918bbe3db537381381bb0b1a7e00d02fe417f86d785c4ba'}]}, 'timestamp': '2026-01-22 18:05:55.539163', '_unique_id': '20dbcf1bfb0449b8b01c8211f6f754d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.539 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.540 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.540 12 DEBUG ceilometer.compute.pollsters [-] dd2cb045-1122-4834-9fea-6294fc690f67/disk.device.read.latency volume: 174111464 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15ed8b7c-646c-457b-b635-cbf414e4ac3e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 174111464, 'user_id': '42329d6b6bc04d9daacda0eb41f36019', 'user_name': None, 'project_id': 'bfeeb28f5b97478fac3f61cc12827bb3', 'project_name': None, 'resource_id': 'dd2cb045-1122-4834-9fea-6294fc690f67-vda', 'timestamp': '2026-01-22T18:05:55.540387', 'resource_metadata': {'display_name': 'tempest-server-test-1479257049', 'name': 'instance-00000053', 'instance_id': 'dd2cb045-1122-4834-9fea-6294fc690f67', 'instance_type': 'm1.nano', 'host': '9609ec74ca2f8dac30350a40c32d6040073b98501f5870039b1471af', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ff2afd0e-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7519.230496148, 'message_signature': 'bcd50331f4d219db207e4303397d5d7c32d19ad5898bdc38c0160c17c838b5ec'}]}, 'timestamp': '2026-01-22 18:05:55.540606', '_unique_id': '2823d42d5eab441dadb94eb9512a68fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.541 12 DEBUG ceilometer.compute.pollsters [-] dd2cb045-1122-4834-9fea-6294fc690f67/disk.device.read.requests volume: 1116 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b2b49cd-9182-46a8-b4d3-20a747b5d88b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1116, 'user_id': '42329d6b6bc04d9daacda0eb41f36019', 'user_name': None, 'project_id': 'bfeeb28f5b97478fac3f61cc12827bb3', 'project_name': None, 'resource_id': 'dd2cb045-1122-4834-9fea-6294fc690f67-vda', 'timestamp': '2026-01-22T18:05:55.541807', 'resource_metadata': {'display_name': 'tempest-server-test-1479257049', 'name': 'instance-00000053', 'instance_id': 'dd2cb045-1122-4834-9fea-6294fc690f67', 'instance_type': 'm1.nano', 'host': '9609ec74ca2f8dac30350a40c32d6040073b98501f5870039b1471af', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ff2b358a-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7519.230496148, 'message_signature': '9f1384598d3adfef9fdb92108dcf42c99692eebf27cdc9334a30483b98510e71'}]}, 'timestamp': '2026-01-22 18:05:55.542087', '_unique_id': '60baebbf6b08494c868b09cac6c0c6cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.542 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 DEBUG ceilometer.compute.pollsters [-] dd2cb045-1122-4834-9fea-6294fc690f67/memory.usage volume: 43.3046875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '30009f61-63c7-43a3-a039-d64333259df1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.3046875, 'user_id': '42329d6b6bc04d9daacda0eb41f36019', 'user_name': None, 'project_id': 'bfeeb28f5b97478fac3f61cc12827bb3', 'project_name': None, 'resource_id': 'dd2cb045-1122-4834-9fea-6294fc690f67', 'timestamp': '2026-01-22T18:05:55.543165', 'resource_metadata': {'display_name': 'tempest-server-test-1479257049', 'name': 'instance-00000053', 'instance_id': 'dd2cb045-1122-4834-9fea-6294fc690f67', 'instance_type': 'm1.nano', 'host': '9609ec74ca2f8dac30350a40c32d6040073b98501f5870039b1471af', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8d1ce660-7497-440b-8666-00c695d0b4d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}, 'image_ref': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'ff2b6992-f7bc-11f0-9e69-fa163eaea1db', 'monotonic_time': 7519.283967871, 'message_signature': '9ef2492406c6b31e601c2cc7d238743d332981830434979b249eab082fc635c4'}]}, 'timestamp': '2026-01-22 18:05:55.543384', '_unique_id': 'c9074bde722c41deaed0667e6cb1c77a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 18:05:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:05:55.543 12 ERROR oslo_messaging.notify.messaging 
Jan 22 18:05:57 compute-0 nova_compute[183075]: 2026-01-22 18:05:57.702 183079 INFO nova.compute.manager [None req-d0800a3e-2dd3-4192-a574-7045e764e18c 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Get console output
Jan 22 18:05:57 compute-0 nova_compute[183075]: 2026-01-22 18:05:57.707 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:05:58 compute-0 nova_compute[183075]: 2026-01-22 18:05:58.226 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:05:59 compute-0 nova_compute[183075]: 2026-01-22 18:05:59.902 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:06:02 compute-0 nova_compute[183075]: 2026-01-22 18:06:02.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:06:02 compute-0 nova_compute[183075]: 2026-01-22 18:06:02.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:06:02 compute-0 nova_compute[183075]: 2026-01-22 18:06:02.787 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 18:06:02 compute-0 nova_compute[183075]: 2026-01-22 18:06:02.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 18:06:02 compute-0 nova_compute[183075]: 2026-01-22 18:06:02.836 183079 INFO nova.compute.manager [None req-2c64b49f-f6df-4e3d-8ea9-274f11eba9ba 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Get console output
Jan 22 18:06:02 compute-0 nova_compute[183075]: 2026-01-22 18:06:02.842 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:06:03 compute-0 nova_compute[183075]: 2026-01-22 18:06:03.280 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:06:03 compute-0 nova_compute[183075]: 2026-01-22 18:06:03.898 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-dd2cb045-1122-4834-9fea-6294fc690f67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 18:06:03 compute-0 nova_compute[183075]: 2026-01-22 18:06:03.899 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-dd2cb045-1122-4834-9fea-6294fc690f67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 18:06:03 compute-0 nova_compute[183075]: 2026-01-22 18:06:03.899 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 18:06:03 compute-0 nova_compute[183075]: 2026-01-22 18:06:03.899 183079 DEBUG nova.objects.instance [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lazy-loading 'info_cache' on Instance uuid dd2cb045-1122-4834-9fea-6294fc690f67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 18:06:04 compute-0 nova_compute[183075]: 2026-01-22 18:06:04.905 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:06:05 compute-0 podman[248148]: 2026-01-22 18:06:05.368169044 +0000 UTC m=+0.068851510 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, architecture=x86_64, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Jan 22 18:06:05 compute-0 podman[248147]: 2026-01-22 18:06:05.380853026 +0000 UTC m=+0.084687887 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 22 18:06:05 compute-0 podman[248146]: 2026-01-22 18:06:05.392585583 +0000 UTC m=+0.102487718 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 18:06:06 compute-0 nova_compute[183075]: 2026-01-22 18:06:06.724 183079 DEBUG oslo_concurrency.lockutils [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Acquiring lock "e0fdd3fd-57d6-4237-8d21-0716e1687405" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:06:06 compute-0 nova_compute[183075]: 2026-01-22 18:06:06.725 183079 DEBUG oslo_concurrency.lockutils [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "e0fdd3fd-57d6-4237-8d21-0716e1687405" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:06:06 compute-0 nova_compute[183075]: 2026-01-22 18:06:06.745 183079 DEBUG nova.compute.manager [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 18:06:06 compute-0 nova_compute[183075]: 2026-01-22 18:06:06.788 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Updating instance_info_cache with network_info: [{"id": "cba5e4c6-9f20-4080-a0b3-f1ce8a74528e", "address": "fa:16:3e:29:3d:20", "network": {"id": "cd33fee9-f283-4792-8796-9cc0f4021aaf", "bridge": "br-int", "label": "tempest-test-network--1302050819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe29:3d20", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "bfeeb28f5b97478fac3f61cc12827bb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcba5e4c6-9f", "ovs_interfaceid": "cba5e4c6-9f20-4080-a0b3-f1ce8a74528e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 18:06:06 compute-0 nova_compute[183075]: 2026-01-22 18:06:06.808 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-dd2cb045-1122-4834-9fea-6294fc690f67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 18:06:06 compute-0 nova_compute[183075]: 2026-01-22 18:06:06.808 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 18:06:06 compute-0 nova_compute[183075]: 2026-01-22 18:06:06.809 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:06:06 compute-0 nova_compute[183075]: 2026-01-22 18:06:06.809 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 18:06:06 compute-0 nova_compute[183075]: 2026-01-22 18:06:06.809 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:06:06 compute-0 nova_compute[183075]: 2026-01-22 18:06:06.820 183079 DEBUG oslo_concurrency.lockutils [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:06:06 compute-0 nova_compute[183075]: 2026-01-22 18:06:06.820 183079 DEBUG oslo_concurrency.lockutils [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:06:06 compute-0 nova_compute[183075]: 2026-01-22 18:06:06.825 183079 DEBUG nova.virt.hardware [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 18:06:06 compute-0 nova_compute[183075]: 2026-01-22 18:06:06.825 183079 INFO nova.compute.claims [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Claim successful on node compute-0.ctlplane.example.com
Jan 22 18:06:06 compute-0 nova_compute[183075]: 2026-01-22 18:06:06.834 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.050 183079 DEBUG nova.compute.provider_tree [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.066 183079 DEBUG nova.scheduler.client.report [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.086 183079 DEBUG oslo_concurrency.lockutils [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.266s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.087 183079 DEBUG nova.compute.manager [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.091 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.256s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.091 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.091 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.159 183079 DEBUG nova.compute.manager [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.160 183079 DEBUG nova.network.neutron [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.191 183079 INFO nova.virt.libvirt.driver [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.200 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd2cb045-1122-4834-9fea-6294fc690f67/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.220 183079 DEBUG nova.compute.manager [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.258 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd2cb045-1122-4834-9fea-6294fc690f67/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.259 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd2cb045-1122-4834-9fea-6294fc690f67/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.320 183079 DEBUG nova.compute.manager [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.322 183079 DEBUG nova.virt.libvirt.driver [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.323 183079 INFO nova.virt.libvirt.driver [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Creating image(s)
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.323 183079 DEBUG oslo_concurrency.lockutils [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Acquiring lock "/var/lib/nova/instances/e0fdd3fd-57d6-4237-8d21-0716e1687405/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.324 183079 DEBUG oslo_concurrency.lockutils [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "/var/lib/nova/instances/e0fdd3fd-57d6-4237-8d21-0716e1687405/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.324 183079 DEBUG oslo_concurrency.lockutils [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "/var/lib/nova/instances/e0fdd3fd-57d6-4237-8d21-0716e1687405/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.337 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd2cb045-1122-4834-9fea-6294fc690f67/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.337 183079 DEBUG oslo_concurrency.processutils [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.400 183079 DEBUG oslo_concurrency.processutils [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.401 183079 DEBUG oslo_concurrency.lockutils [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Acquiring lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.402 183079 DEBUG oslo_concurrency.lockutils [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.412 183079 DEBUG oslo_concurrency.processutils [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.466 183079 DEBUG oslo_concurrency.processutils [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.467 183079 DEBUG oslo_concurrency.processutils [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/e0fdd3fd-57d6-4237-8d21-0716e1687405/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.501 183079 DEBUG oslo_concurrency.processutils [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218,backing_fmt=raw /var/lib/nova/instances/e0fdd3fd-57d6-4237-8d21-0716e1687405/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.502 183079 DEBUG oslo_concurrency.lockutils [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "dc114733697ffcf2ceab5e1bcdf92e07c516f218" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.502 183079 DEBUG oslo_concurrency.processutils [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.559 183079 DEBUG oslo_concurrency.processutils [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dc114733697ffcf2ceab5e1bcdf92e07c516f218 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.561 183079 DEBUG nova.virt.disk.api [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Checking if we can resize image /var/lib/nova/instances/e0fdd3fd-57d6-4237-8d21-0716e1687405/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.561 183079 DEBUG oslo_concurrency.processutils [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e0fdd3fd-57d6-4237-8d21-0716e1687405/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.578 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.580 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5553MB free_disk=73.32141876220703GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.581 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.581 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.623 183079 DEBUG oslo_concurrency.processutils [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e0fdd3fd-57d6-4237-8d21-0716e1687405/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.626 183079 DEBUG nova.virt.disk.api [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Cannot resize image /var/lib/nova/instances/e0fdd3fd-57d6-4237-8d21-0716e1687405/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.626 183079 DEBUG nova.objects.instance [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lazy-loading 'migration_context' on Instance uuid e0fdd3fd-57d6-4237-8d21-0716e1687405 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.639 183079 DEBUG nova.virt.libvirt.driver [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.640 183079 DEBUG nova.virt.libvirt.driver [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Ensure instance console log exists: /var/lib/nova/instances/e0fdd3fd-57d6-4237-8d21-0716e1687405/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.641 183079 DEBUG oslo_concurrency.lockutils [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.641 183079 DEBUG oslo_concurrency.lockutils [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.641 183079 DEBUG oslo_concurrency.lockutils [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.646 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance dd2cb045-1122-4834-9fea-6294fc690f67 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.647 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance e0fdd3fd-57d6-4237-8d21-0716e1687405 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.647 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.647 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.712 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.728 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.752 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.752 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:06:07 compute-0 nova_compute[183075]: 2026-01-22 18:06:07.940 183079 DEBUG nova.policy [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '42329d6b6bc04d9daacda0eb41f36019', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bfeeb28f5b97478fac3f61cc12827bb3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 18:06:08 compute-0 nova_compute[183075]: 2026-01-22 18:06:08.326 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:06:08 compute-0 podman[248235]: 2026-01-22 18:06:08.352287533 +0000 UTC m=+0.059926719 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 18:06:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:09.129 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=65, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=64) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 18:06:09 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:09.129 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 18:06:09 compute-0 nova_compute[183075]: 2026-01-22 18:06:09.130 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:06:09 compute-0 nova_compute[183075]: 2026-01-22 18:06:09.228 183079 DEBUG nova.network.neutron [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Successfully created port: 8cf692cc-450a-4d35-99f9-bbf0b17493ab _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 18:06:09 compute-0 nova_compute[183075]: 2026-01-22 18:06:09.731 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:06:09 compute-0 nova_compute[183075]: 2026-01-22 18:06:09.907 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:06:11 compute-0 nova_compute[183075]: 2026-01-22 18:06:11.044 183079 DEBUG nova.network.neutron [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Successfully updated port: 8cf692cc-450a-4d35-99f9-bbf0b17493ab _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 18:06:11 compute-0 nova_compute[183075]: 2026-01-22 18:06:11.061 183079 DEBUG oslo_concurrency.lockutils [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Acquiring lock "refresh_cache-e0fdd3fd-57d6-4237-8d21-0716e1687405" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 18:06:11 compute-0 nova_compute[183075]: 2026-01-22 18:06:11.061 183079 DEBUG oslo_concurrency.lockutils [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Acquired lock "refresh_cache-e0fdd3fd-57d6-4237-8d21-0716e1687405" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 18:06:11 compute-0 nova_compute[183075]: 2026-01-22 18:06:11.061 183079 DEBUG nova.network.neutron [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 18:06:11 compute-0 nova_compute[183075]: 2026-01-22 18:06:11.146 183079 DEBUG nova.compute.manager [req-9614e4b1-1f61-4958-998e-1c33a4b83855 req-eb9e2b06-4b4b-4718-9268-05cea2218966 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Received event network-changed-8cf692cc-450a-4d35-99f9-bbf0b17493ab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:06:11 compute-0 nova_compute[183075]: 2026-01-22 18:06:11.146 183079 DEBUG nova.compute.manager [req-9614e4b1-1f61-4958-998e-1c33a4b83855 req-eb9e2b06-4b4b-4718-9268-05cea2218966 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Refreshing instance network info cache due to event network-changed-8cf692cc-450a-4d35-99f9-bbf0b17493ab. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 18:06:11 compute-0 nova_compute[183075]: 2026-01-22 18:06:11.147 183079 DEBUG oslo_concurrency.lockutils [req-9614e4b1-1f61-4958-998e-1c33a4b83855 req-eb9e2b06-4b4b-4718-9268-05cea2218966 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-e0fdd3fd-57d6-4237-8d21-0716e1687405" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 18:06:11 compute-0 nova_compute[183075]: 2026-01-22 18:06:11.218 183079 DEBUG nova.network.neutron [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.846 183079 DEBUG nova.network.neutron [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Updating instance_info_cache with network_info: [{"id": "8cf692cc-450a-4d35-99f9-bbf0b17493ab", "address": "fa:16:3e:c1:1b:b1", "network": {"id": "cd33fee9-f283-4792-8796-9cc0f4021aaf", "bridge": "br-int", "label": "tempest-test-network--1302050819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fec1:1bb1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "bfeeb28f5b97478fac3f61cc12827bb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cf692cc-45", "ovs_interfaceid": "8cf692cc-450a-4d35-99f9-bbf0b17493ab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.930 183079 DEBUG oslo_concurrency.lockutils [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Releasing lock "refresh_cache-e0fdd3fd-57d6-4237-8d21-0716e1687405" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.931 183079 DEBUG nova.compute.manager [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Instance network_info: |[{"id": "8cf692cc-450a-4d35-99f9-bbf0b17493ab", "address": "fa:16:3e:c1:1b:b1", "network": {"id": "cd33fee9-f283-4792-8796-9cc0f4021aaf", "bridge": "br-int", "label": "tempest-test-network--1302050819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fec1:1bb1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "bfeeb28f5b97478fac3f61cc12827bb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cf692cc-45", "ovs_interfaceid": "8cf692cc-450a-4d35-99f9-bbf0b17493ab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.931 183079 DEBUG oslo_concurrency.lockutils [req-9614e4b1-1f61-4958-998e-1c33a4b83855 req-eb9e2b06-4b4b-4718-9268-05cea2218966 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-e0fdd3fd-57d6-4237-8d21-0716e1687405" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.932 183079 DEBUG nova.network.neutron [req-9614e4b1-1f61-4958-998e-1c33a4b83855 req-eb9e2b06-4b4b-4718-9268-05cea2218966 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Refreshing network info cache for port 8cf692cc-450a-4d35-99f9-bbf0b17493ab _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.935 183079 DEBUG nova.virt.libvirt.driver [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Start _get_guest_xml network_info=[{"id": "8cf692cc-450a-4d35-99f9-bbf0b17493ab", "address": "fa:16:3e:c1:1b:b1", "network": {"id": "cd33fee9-f283-4792-8796-9cc0f4021aaf", "bridge": "br-int", "label": "tempest-test-network--1302050819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fec1:1bb1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "bfeeb28f5b97478fac3f61cc12827bb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cf692cc-45", "ovs_interfaceid": "8cf692cc-450a-4d35-99f9-bbf0b17493ab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e1b65bbe-5c14-4552-a5d9-d275c9dd42d3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.939 183079 WARNING nova.virt.libvirt.driver [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.944 183079 DEBUG nova.virt.libvirt.host [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.944 183079 DEBUG nova.virt.libvirt.host [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.947 183079 DEBUG nova.virt.libvirt.host [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.948 183079 DEBUG nova.virt.libvirt.host [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.948 183079 DEBUG nova.virt.libvirt.driver [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.948 183079 DEBUG nova.virt.hardware [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T16:53:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8d1ce660-7497-440b-8666-00c695d0b4d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T16:53:28Z,direct_url=<?>,disk_format='qcow2',id=e1b65bbe-5c14-4552-a5d9-d275c9dd42d3,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8d88b00a23ef40338653b967006abf05',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T16:53:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.949 183079 DEBUG nova.virt.hardware [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.949 183079 DEBUG nova.virt.hardware [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.949 183079 DEBUG nova.virt.hardware [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.949 183079 DEBUG nova.virt.hardware [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.950 183079 DEBUG nova.virt.hardware [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.950 183079 DEBUG nova.virt.hardware [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.950 183079 DEBUG nova.virt.hardware [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.950 183079 DEBUG nova.virt.hardware [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.951 183079 DEBUG nova.virt.hardware [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.951 183079 DEBUG nova.virt.hardware [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.955 183079 DEBUG nova.virt.libvirt.vif [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T18:06:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1265808566',display_name='tempest-server-test-1265808566',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1265808566',id=84,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN4nUgc+oV7ByTq9qVVAwVt7Z5aS2u6R2o/Luqd0qC5fBtJq9ShHu4J7y3Jz2xuF9QZoBf9pscb0QblSJ2VeHfwaE0UkbJGn/O5SM0HSKnGhFZW4WRYEXRQdnsqUpTi/Qw==',key_name='tempest-keypair-test-2068520959',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bfeeb28f5b97478fac3f61cc12827bb3',ramdisk_id='',reservation_id='r-w0z2sm4i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessSecGroupDualStackSlaacTest-795693713',owner_user_name='tempest-StatelessSecGroupDualStackSlaacTest-795693713-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T18:06:07Z,user_data=None,user_id='42329d6b6bc04d9daacda0eb41f36019',uuid=e0fdd3fd-57d6-4237-8d21-0716e1687405,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8cf692cc-450a-4d35-99f9-bbf0b17493ab", "address": "fa:16:3e:c1:1b:b1", "network": {"id": "cd33fee9-f283-4792-8796-9cc0f4021aaf", "bridge": "br-int", "label": "tempest-test-network--1302050819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fec1:1bb1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "bfeeb28f5b97478fac3f61cc12827bb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cf692cc-45", "ovs_interfaceid": "8cf692cc-450a-4d35-99f9-bbf0b17493ab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.956 183079 DEBUG nova.network.os_vif_util [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Converting VIF {"id": "8cf692cc-450a-4d35-99f9-bbf0b17493ab", "address": "fa:16:3e:c1:1b:b1", "network": {"id": "cd33fee9-f283-4792-8796-9cc0f4021aaf", "bridge": "br-int", "label": "tempest-test-network--1302050819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fec1:1bb1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "bfeeb28f5b97478fac3f61cc12827bb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cf692cc-45", "ovs_interfaceid": "8cf692cc-450a-4d35-99f9-bbf0b17493ab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.956 183079 DEBUG nova.network.os_vif_util [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:1b:b1,bridge_name='br-int',has_traffic_filtering=True,id=8cf692cc-450a-4d35-99f9-bbf0b17493ab,network=Network(cd33fee9-f283-4792-8796-9cc0f4021aaf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8cf692cc-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.957 183079 DEBUG nova.objects.instance [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lazy-loading 'pci_devices' on Instance uuid e0fdd3fd-57d6-4237-8d21-0716e1687405 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.970 183079 DEBUG nova.virt.libvirt.driver [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] End _get_guest_xml xml=<domain type="kvm">
Jan 22 18:06:12 compute-0 nova_compute[183075]:   <uuid>e0fdd3fd-57d6-4237-8d21-0716e1687405</uuid>
Jan 22 18:06:12 compute-0 nova_compute[183075]:   <name>instance-00000054</name>
Jan 22 18:06:12 compute-0 nova_compute[183075]:   <memory>131072</memory>
Jan 22 18:06:12 compute-0 nova_compute[183075]:   <vcpu>1</vcpu>
Jan 22 18:06:12 compute-0 nova_compute[183075]:   <metadata>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 18:06:12 compute-0 nova_compute[183075]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:       <nova:name>tempest-server-test-1265808566</nova:name>
Jan 22 18:06:12 compute-0 nova_compute[183075]:       <nova:creationTime>2026-01-22 18:06:12</nova:creationTime>
Jan 22 18:06:12 compute-0 nova_compute[183075]:       <nova:flavor name="m1.nano">
Jan 22 18:06:12 compute-0 nova_compute[183075]:         <nova:memory>128</nova:memory>
Jan 22 18:06:12 compute-0 nova_compute[183075]:         <nova:disk>1</nova:disk>
Jan 22 18:06:12 compute-0 nova_compute[183075]:         <nova:swap>0</nova:swap>
Jan 22 18:06:12 compute-0 nova_compute[183075]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 18:06:12 compute-0 nova_compute[183075]:         <nova:vcpus>1</nova:vcpus>
Jan 22 18:06:12 compute-0 nova_compute[183075]:       </nova:flavor>
Jan 22 18:06:12 compute-0 nova_compute[183075]:       <nova:owner>
Jan 22 18:06:12 compute-0 nova_compute[183075]:         <nova:user uuid="42329d6b6bc04d9daacda0eb41f36019">tempest-StatelessSecGroupDualStackSlaacTest-795693713-project-member</nova:user>
Jan 22 18:06:12 compute-0 nova_compute[183075]:         <nova:project uuid="bfeeb28f5b97478fac3f61cc12827bb3">tempest-StatelessSecGroupDualStackSlaacTest-795693713</nova:project>
Jan 22 18:06:12 compute-0 nova_compute[183075]:       </nova:owner>
Jan 22 18:06:12 compute-0 nova_compute[183075]:       <nova:root type="image" uuid="e1b65bbe-5c14-4552-a5d9-d275c9dd42d3"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:       <nova:ports>
Jan 22 18:06:12 compute-0 nova_compute[183075]:         <nova:port uuid="8cf692cc-450a-4d35-99f9-bbf0b17493ab">
Jan 22 18:06:12 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fec1:1bb1" ipVersion="6"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:         </nova:port>
Jan 22 18:06:12 compute-0 nova_compute[183075]:       </nova:ports>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     </nova:instance>
Jan 22 18:06:12 compute-0 nova_compute[183075]:   </metadata>
Jan 22 18:06:12 compute-0 nova_compute[183075]:   <sysinfo type="smbios">
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <system>
Jan 22 18:06:12 compute-0 nova_compute[183075]:       <entry name="manufacturer">RDO</entry>
Jan 22 18:06:12 compute-0 nova_compute[183075]:       <entry name="product">OpenStack Compute</entry>
Jan 22 18:06:12 compute-0 nova_compute[183075]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 18:06:12 compute-0 nova_compute[183075]:       <entry name="serial">e0fdd3fd-57d6-4237-8d21-0716e1687405</entry>
Jan 22 18:06:12 compute-0 nova_compute[183075]:       <entry name="uuid">e0fdd3fd-57d6-4237-8d21-0716e1687405</entry>
Jan 22 18:06:12 compute-0 nova_compute[183075]:       <entry name="family">Virtual Machine</entry>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     </system>
Jan 22 18:06:12 compute-0 nova_compute[183075]:   </sysinfo>
Jan 22 18:06:12 compute-0 nova_compute[183075]:   <os>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <boot dev="hd"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <smbios mode="sysinfo"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:   </os>
Jan 22 18:06:12 compute-0 nova_compute[183075]:   <features>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <acpi/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <apic/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <vmcoreinfo/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:   </features>
Jan 22 18:06:12 compute-0 nova_compute[183075]:   <clock offset="utc">
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <timer name="hpet" present="no"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:   </clock>
Jan 22 18:06:12 compute-0 nova_compute[183075]:   <cpu mode="host-model" match="exact">
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:   </cpu>
Jan 22 18:06:12 compute-0 nova_compute[183075]:   <devices>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <disk type="file" device="disk">
Jan 22 18:06:12 compute-0 nova_compute[183075]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:       <source file="/var/lib/nova/instances/e0fdd3fd-57d6-4237-8d21-0716e1687405/disk"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:       <target dev="vda" bus="virtio"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     </disk>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <interface type="ethernet">
Jan 22 18:06:12 compute-0 nova_compute[183075]:       <mac address="fa:16:3e:c1:1b:b1"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:       <mtu size="1442"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:       <target dev="tap8cf692cc-45"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     </interface>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <serial type="pty">
Jan 22 18:06:12 compute-0 nova_compute[183075]:       <log file="/var/lib/nova/instances/e0fdd3fd-57d6-4237-8d21-0716e1687405/console.log" append="off"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     </serial>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <video>
Jan 22 18:06:12 compute-0 nova_compute[183075]:       <model type="virtio"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     </video>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <input type="tablet" bus="usb"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <rng model="virtio">
Jan 22 18:06:12 compute-0 nova_compute[183075]:       <backend model="random">/dev/urandom</backend>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     </rng>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <controller type="usb" index="0"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     <memballoon model="virtio">
Jan 22 18:06:12 compute-0 nova_compute[183075]:       <stats period="10"/>
Jan 22 18:06:12 compute-0 nova_compute[183075]:     </memballoon>
Jan 22 18:06:12 compute-0 nova_compute[183075]:   </devices>
Jan 22 18:06:12 compute-0 nova_compute[183075]: </domain>
Jan 22 18:06:12 compute-0 nova_compute[183075]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.971 183079 DEBUG nova.compute.manager [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Preparing to wait for external event network-vif-plugged-8cf692cc-450a-4d35-99f9-bbf0b17493ab prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.972 183079 DEBUG oslo_concurrency.lockutils [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Acquiring lock "e0fdd3fd-57d6-4237-8d21-0716e1687405-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.972 183079 DEBUG oslo_concurrency.lockutils [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "e0fdd3fd-57d6-4237-8d21-0716e1687405-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.972 183079 DEBUG oslo_concurrency.lockutils [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "e0fdd3fd-57d6-4237-8d21-0716e1687405-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.973 183079 DEBUG nova.virt.libvirt.vif [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T18:06:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1265808566',display_name='tempest-server-test-1265808566',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1265808566',id=84,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN4nUgc+oV7ByTq9qVVAwVt7Z5aS2u6R2o/Luqd0qC5fBtJq9ShHu4J7y3Jz2xuF9QZoBf9pscb0QblSJ2VeHfwaE0UkbJGn/O5SM0HSKnGhFZW4WRYEXRQdnsqUpTi/Qw==',key_name='tempest-keypair-test-2068520959',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bfeeb28f5b97478fac3f61cc12827bb3',ramdisk_id='',reservation_id='r-w0z2sm4i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_
rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-StatelessSecGroupDualStackSlaacTest-795693713',owner_user_name='tempest-StatelessSecGroupDualStackSlaacTest-795693713-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T18:06:07Z,user_data=None,user_id='42329d6b6bc04d9daacda0eb41f36019',uuid=e0fdd3fd-57d6-4237-8d21-0716e1687405,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8cf692cc-450a-4d35-99f9-bbf0b17493ab", "address": "fa:16:3e:c1:1b:b1", "network": {"id": "cd33fee9-f283-4792-8796-9cc0f4021aaf", "bridge": "br-int", "label": "tempest-test-network--1302050819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fec1:1bb1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "bfeeb28f5b97478fac3f61cc12827bb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cf692cc-45", "ovs_interfaceid": "8cf692cc-450a-4d35-99f9-bbf0b17493ab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.973 183079 DEBUG nova.network.os_vif_util [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Converting VIF {"id": "8cf692cc-450a-4d35-99f9-bbf0b17493ab", "address": "fa:16:3e:c1:1b:b1", "network": {"id": "cd33fee9-f283-4792-8796-9cc0f4021aaf", "bridge": "br-int", "label": "tempest-test-network--1302050819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fec1:1bb1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "bfeeb28f5b97478fac3f61cc12827bb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cf692cc-45", "ovs_interfaceid": "8cf692cc-450a-4d35-99f9-bbf0b17493ab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.974 183079 DEBUG nova.network.os_vif_util [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:1b:b1,bridge_name='br-int',has_traffic_filtering=True,id=8cf692cc-450a-4d35-99f9-bbf0b17493ab,network=Network(cd33fee9-f283-4792-8796-9cc0f4021aaf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8cf692cc-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.974 183079 DEBUG os_vif [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:1b:b1,bridge_name='br-int',has_traffic_filtering=True,id=8cf692cc-450a-4d35-99f9-bbf0b17493ab,network=Network(cd33fee9-f283-4792-8796-9cc0f4021aaf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8cf692cc-45') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.975 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.975 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.975 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.978 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.978 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8cf692cc-45, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.978 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8cf692cc-45, col_values=(('external_ids', {'iface-id': '8cf692cc-450a-4d35-99f9-bbf0b17493ab', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c1:1b:b1', 'vm-uuid': 'e0fdd3fd-57d6-4237-8d21-0716e1687405'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.979 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:06:12 compute-0 NetworkManager[55454]: <info>  [1769105172.9808] manager: (tap8cf692cc-45): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/362)
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.983 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.989 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:06:12 compute-0 nova_compute[183075]: 2026-01-22 18:06:12.990 183079 INFO os_vif [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:1b:b1,bridge_name='br-int',has_traffic_filtering=True,id=8cf692cc-450a-4d35-99f9-bbf0b17493ab,network=Network(cd33fee9-f283-4792-8796-9cc0f4021aaf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8cf692cc-45')
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.032 183079 DEBUG nova.virt.libvirt.driver [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.033 183079 DEBUG nova.virt.libvirt.driver [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] No VIF found with MAC fa:16:3e:c1:1b:b1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 18:06:13 compute-0 kernel: tap8cf692cc-45: entered promiscuous mode
Jan 22 18:06:13 compute-0 NetworkManager[55454]: <info>  [1769105173.1079] manager: (tap8cf692cc-45): new Tun device (/org/freedesktop/NetworkManager/Devices/363)
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.111 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:06:13 compute-0 ovn_controller[95372]: 2026-01-22T18:06:13Z|00903|binding|INFO|Claiming lport 8cf692cc-450a-4d35-99f9-bbf0b17493ab for this chassis.
Jan 22 18:06:13 compute-0 ovn_controller[95372]: 2026-01-22T18:06:13Z|00904|binding|INFO|8cf692cc-450a-4d35-99f9-bbf0b17493ab: Claiming fa:16:3e:c1:1b:b1 10.100.0.13 2001:db8::f816:3eff:fec1:1bb1
Jan 22 18:06:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:13.118 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:1b:b1 10.100.0.13 2001:db8::f816:3eff:fec1:1bb1'], port_security=['fa:16:3e:c1:1b:b1 10.100.0.13 2001:db8::f816:3eff:fec1:1bb1'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28 2001:db8::f816:3eff:fec1:1bb1/64', 'neutron:device_id': 'e0fdd3fd-57d6-4237-8d21-0716e1687405', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cd33fee9-f283-4792-8796-9cc0f4021aaf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bfeeb28f5b97478fac3f61cc12827bb3', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ad003613-25d7-4407-87d3-e15c431e7689', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7c020e6d-68ca-4390-a036-a6097b05932f, chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=8cf692cc-450a-4d35-99f9-bbf0b17493ab) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 18:06:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:13.119 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 8cf692cc-450a-4d35-99f9-bbf0b17493ab in datapath cd33fee9-f283-4792-8796-9cc0f4021aaf bound to our chassis
Jan 22 18:06:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:13.120 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cd33fee9-f283-4792-8796-9cc0f4021aaf
Jan 22 18:06:13 compute-0 ovn_controller[95372]: 2026-01-22T18:06:13Z|00905|binding|INFO|Setting lport 8cf692cc-450a-4d35-99f9-bbf0b17493ab ovn-installed in OVS
Jan 22 18:06:13 compute-0 ovn_controller[95372]: 2026-01-22T18:06:13Z|00906|binding|INFO|Setting lport 8cf692cc-450a-4d35-99f9-bbf0b17493ab up in Southbound
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.128 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:06:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:13.137 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[b3e584ba-b777-4b9f-86b8-8994976360f6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:06:13 compute-0 systemd-udevd[248271]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 18:06:13 compute-0 NetworkManager[55454]: <info>  [1769105173.1576] device (tap8cf692cc-45): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 18:06:13 compute-0 NetworkManager[55454]: <info>  [1769105173.1582] device (tap8cf692cc-45): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 18:06:13 compute-0 systemd-machined[154382]: New machine qemu-84-instance-00000054.
Jan 22 18:06:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:13.170 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[8853ac1e-6e17-46d4-a08d-a7d7b1e9246e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:06:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:13.175 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[ba18a365-48e0-4697-8059-1a62addebca9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:06:13 compute-0 systemd[1]: Started Virtual Machine qemu-84-instance-00000054.
Jan 22 18:06:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:13.208 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[0f2afd77-d337-44f8-bfb3-de7435c5ded9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:06:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:13.229 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[bdd98dda-77f8-45a1-9003-51fb4b0577f7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcd33fee9-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:34:ef:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 115, 'tx_packets': 54, 'rx_bytes': 9890, 'tx_bytes': 6131, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 115, 'tx_packets': 54, 'rx_bytes': 9890, 'tx_bytes': 6131, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 246], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 746717, 'reachable_time': 40874, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 17, 'inoctets': 1264, 'indelivers': 4, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 17, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 1264, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 17, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 4, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248282, 'error': None, 'target': 'ovnmeta-cd33fee9-f283-4792-8796-9cc0f4021aaf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:06:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:13.247 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f293e044-7b94-4a0d-b1a7-abca57bab0a8]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapcd33fee9-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 746731, 'tstamp': 746731}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 248285, 'error': None, 'target': 'ovnmeta-cd33fee9-f283-4792-8796-9cc0f4021aaf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapcd33fee9-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 746734, 'tstamp': 746734}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 248285, 'error': None, 'target': 'ovnmeta-cd33fee9-f283-4792-8796-9cc0f4021aaf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:06:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:13.250 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd33fee9-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.252 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.253 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:06:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:13.254 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcd33fee9-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:06:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:13.254 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 18:06:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:13.255 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcd33fee9-f0, col_values=(('external_ids', {'iface-id': '4c6b8b92-89d6-4150-a6bb-444d2d5ca88c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:06:13 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:13.255 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.364 183079 DEBUG nova.compute.manager [req-d87156da-6e92-4cd9-81d9-4ba41f2fc54f req-a69cb0c4-1639-4ec8-bac1-1f3881bd7130 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Received event network-vif-plugged-8cf692cc-450a-4d35-99f9-bbf0b17493ab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.365 183079 DEBUG oslo_concurrency.lockutils [req-d87156da-6e92-4cd9-81d9-4ba41f2fc54f req-a69cb0c4-1639-4ec8-bac1-1f3881bd7130 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "e0fdd3fd-57d6-4237-8d21-0716e1687405-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.365 183079 DEBUG oslo_concurrency.lockutils [req-d87156da-6e92-4cd9-81d9-4ba41f2fc54f req-a69cb0c4-1639-4ec8-bac1-1f3881bd7130 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e0fdd3fd-57d6-4237-8d21-0716e1687405-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.365 183079 DEBUG oslo_concurrency.lockutils [req-d87156da-6e92-4cd9-81d9-4ba41f2fc54f req-a69cb0c4-1639-4ec8-bac1-1f3881bd7130 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e0fdd3fd-57d6-4237-8d21-0716e1687405-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.366 183079 DEBUG nova.compute.manager [req-d87156da-6e92-4cd9-81d9-4ba41f2fc54f req-a69cb0c4-1639-4ec8-bac1-1f3881bd7130 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Processing event network-vif-plugged-8cf692cc-450a-4d35-99f9-bbf0b17493ab _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.812 183079 DEBUG nova.compute.manager [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.813 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769105173.8116055, e0fdd3fd-57d6-4237-8d21-0716e1687405 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.813 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] VM Started (Lifecycle Event)
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.816 183079 DEBUG nova.virt.libvirt.driver [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.821 183079 INFO nova.virt.libvirt.driver [-] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Instance spawned successfully.
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.821 183079 DEBUG nova.virt.libvirt.driver [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.834 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.840 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.843 183079 DEBUG nova.virt.libvirt.driver [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.844 183079 DEBUG nova.virt.libvirt.driver [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.844 183079 DEBUG nova.virt.libvirt.driver [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.845 183079 DEBUG nova.virt.libvirt.driver [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.845 183079 DEBUG nova.virt.libvirt.driver [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.846 183079 DEBUG nova.virt.libvirt.driver [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.874 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.875 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769105173.812731, e0fdd3fd-57d6-4237-8d21-0716e1687405 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.875 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] VM Paused (Lifecycle Event)
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.909 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.913 183079 DEBUG nova.virt.driver [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] Emitting event <LifecycleEvent: 1769105173.816006, e0fdd3fd-57d6-4237-8d21-0716e1687405 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.913 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] VM Resumed (Lifecycle Event)
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.918 183079 INFO nova.compute.manager [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Took 6.60 seconds to spawn the instance on the hypervisor.
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.919 183079 DEBUG nova.compute.manager [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.942 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.945 183079 DEBUG nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.977 183079 INFO nova.compute.manager [None req-ce60abdd-042b-401d-aab1-c729a34d9ada - - - - - -] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 18:06:13 compute-0 nova_compute[183075]: 2026-01-22 18:06:13.987 183079 INFO nova.compute.manager [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Took 7.19 seconds to build instance.
Jan 22 18:06:14 compute-0 nova_compute[183075]: 2026-01-22 18:06:14.010 183079 DEBUG oslo_concurrency.lockutils [None req-8f242a8b-c2ff-42cf-a236-c0e477f0f463 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "e0fdd3fd-57d6-4237-8d21-0716e1687405" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.285s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:06:14 compute-0 nova_compute[183075]: 2026-01-22 18:06:14.910 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:06:14 compute-0 nova_compute[183075]: 2026-01-22 18:06:14.923 183079 DEBUG nova.network.neutron [req-9614e4b1-1f61-4958-998e-1c33a4b83855 req-eb9e2b06-4b4b-4718-9268-05cea2218966 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Updated VIF entry in instance network info cache for port 8cf692cc-450a-4d35-99f9-bbf0b17493ab. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 18:06:14 compute-0 nova_compute[183075]: 2026-01-22 18:06:14.924 183079 DEBUG nova.network.neutron [req-9614e4b1-1f61-4958-998e-1c33a4b83855 req-eb9e2b06-4b4b-4718-9268-05cea2218966 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Updating instance_info_cache with network_info: [{"id": "8cf692cc-450a-4d35-99f9-bbf0b17493ab", "address": "fa:16:3e:c1:1b:b1", "network": {"id": "cd33fee9-f283-4792-8796-9cc0f4021aaf", "bridge": "br-int", "label": "tempest-test-network--1302050819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fec1:1bb1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "bfeeb28f5b97478fac3f61cc12827bb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cf692cc-45", "ovs_interfaceid": "8cf692cc-450a-4d35-99f9-bbf0b17493ab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 18:06:14 compute-0 nova_compute[183075]: 2026-01-22 18:06:14.948 183079 DEBUG oslo_concurrency.lockutils [req-9614e4b1-1f61-4958-998e-1c33a4b83855 req-eb9e2b06-4b4b-4718-9268-05cea2218966 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-e0fdd3fd-57d6-4237-8d21-0716e1687405" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 18:06:15 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:15.132 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '65'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:06:15 compute-0 nova_compute[183075]: 2026-01-22 18:06:15.151 183079 INFO nova.compute.manager [None req-34ff20f8-6f4a-46a6-b138-e8d12c636d38 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Get console output
Jan 22 18:06:15 compute-0 nova_compute[183075]: 2026-01-22 18:06:15.157 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:06:15 compute-0 nova_compute[183075]: 2026-01-22 18:06:15.431 183079 DEBUG nova.compute.manager [req-83ac172d-080f-4fe9-aad6-60eb51c6522a req-fb996780-467d-42c2-904e-ad6260c9f299 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Received event network-vif-plugged-8cf692cc-450a-4d35-99f9-bbf0b17493ab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:06:15 compute-0 nova_compute[183075]: 2026-01-22 18:06:15.431 183079 DEBUG oslo_concurrency.lockutils [req-83ac172d-080f-4fe9-aad6-60eb51c6522a req-fb996780-467d-42c2-904e-ad6260c9f299 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "e0fdd3fd-57d6-4237-8d21-0716e1687405-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:06:15 compute-0 nova_compute[183075]: 2026-01-22 18:06:15.432 183079 DEBUG oslo_concurrency.lockutils [req-83ac172d-080f-4fe9-aad6-60eb51c6522a req-fb996780-467d-42c2-904e-ad6260c9f299 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e0fdd3fd-57d6-4237-8d21-0716e1687405-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:06:15 compute-0 nova_compute[183075]: 2026-01-22 18:06:15.432 183079 DEBUG oslo_concurrency.lockutils [req-83ac172d-080f-4fe9-aad6-60eb51c6522a req-fb996780-467d-42c2-904e-ad6260c9f299 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e0fdd3fd-57d6-4237-8d21-0716e1687405-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:06:15 compute-0 nova_compute[183075]: 2026-01-22 18:06:15.432 183079 DEBUG nova.compute.manager [req-83ac172d-080f-4fe9-aad6-60eb51c6522a req-fb996780-467d-42c2-904e-ad6260c9f299 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] No waiting events found dispatching network-vif-plugged-8cf692cc-450a-4d35-99f9-bbf0b17493ab pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 18:06:15 compute-0 nova_compute[183075]: 2026-01-22 18:06:15.432 183079 WARNING nova.compute.manager [req-83ac172d-080f-4fe9-aad6-60eb51c6522a req-fb996780-467d-42c2-904e-ad6260c9f299 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Received unexpected event network-vif-plugged-8cf692cc-450a-4d35-99f9-bbf0b17493ab for instance with vm_state active and task_state None.
Jan 22 18:06:17 compute-0 nova_compute[183075]: 2026-01-22 18:06:17.980 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:06:19 compute-0 nova_compute[183075]: 2026-01-22 18:06:19.912 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:06:20 compute-0 nova_compute[183075]: 2026-01-22 18:06:20.910 183079 INFO nova.compute.manager [None req-211244a8-801d-4e2e-b462-47e4d0a0e851 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Get console output
Jan 22 18:06:20 compute-0 nova_compute[183075]: 2026-01-22 18:06:20.915 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:06:21 compute-0 podman[248295]: 2026-01-22 18:06:21.350811019 +0000 UTC m=+0.058926342 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 18:06:22 compute-0 nova_compute[183075]: 2026-01-22 18:06:22.982 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:06:24 compute-0 nova_compute[183075]: 2026-01-22 18:06:24.914 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:06:25 compute-0 podman[248340]: 2026-01-22 18:06:25.340844051 +0000 UTC m=+0.050851044 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 18:06:26 compute-0 ovn_controller[95372]: 2026-01-22T18:06:26Z|00170|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c1:1b:b1 10.100.0.13
Jan 22 18:06:26 compute-0 ovn_controller[95372]: 2026-01-22T18:06:26Z|00171|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c1:1b:b1 10.100.0.13
Jan 22 18:06:26 compute-0 nova_compute[183075]: 2026-01-22 18:06:26.759 183079 INFO nova.compute.manager [None req-a98c3f0e-0831-40cb-9893-41ede60c931d 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Get console output
Jan 22 18:06:26 compute-0 nova_compute[183075]: 2026-01-22 18:06:26.764 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:06:27 compute-0 nova_compute[183075]: 2026-01-22 18:06:27.984 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:06:28 compute-0 nova_compute[183075]: 2026-01-22 18:06:28.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:06:29 compute-0 nova_compute[183075]: 2026-01-22 18:06:29.916 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:06:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:31.278 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:06:31 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:31.279 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 18:06:31 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:06:31 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:06:31 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:06:31 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:06:31 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:06:31 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 18:06:31 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cd33fee9-f283-4792-8796-9cc0f4021aaf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.013 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:06:32 compute-0 haproxy-metadata-proxy-cd33fee9-f283-4792-8796-9cc0f4021aaf[247901]: 10.100.0.13:36512 [22/Jan/2026:18:06:31.277] listener listener/metadata 0/0/0/737/737 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.014 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.7358706
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.022 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.023 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cd33fee9-f283-4792-8796-9cc0f4021aaf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.047 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.047 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 200  len: 169 time: 0.0247161
Jan 22 18:06:32 compute-0 haproxy-metadata-proxy-cd33fee9-f283-4792-8796-9cc0f4021aaf[247901]: 10.100.0.13:36520 [22/Jan/2026:18:06:32.021] listener listener/metadata 0/0/0/26/26 200 153 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.052 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.052 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.0
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cd33fee9-f283-4792-8796-9cc0f4021aaf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.067 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.068 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1" status: 200  len: 341 time: 0.0154772
Jan 22 18:06:32 compute-0 haproxy-metadata-proxy-cd33fee9-f283-4792-8796-9cc0f4021aaf[247901]: 10.100.0.13:36534 [22/Jan/2026:18:06:32.051] listener listener/metadata 0/0/0/16/16 200 325 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys/0/openssh-key HTTP/1.1"
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.073 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.074 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cd33fee9-f283-4792-8796-9cc0f4021aaf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.087 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.087 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0136337
Jan 22 18:06:32 compute-0 haproxy-metadata-proxy-cd33fee9-f283-4792-8796-9cc0f4021aaf[247901]: 10.100.0.13:36548 [22/Jan/2026:18:06:32.072] listener listener/metadata 0/0/0/14/14 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.092 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.093 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cd33fee9-f283-4792-8796-9cc0f4021aaf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:06:32 compute-0 nova_compute[183075]: 2026-01-22 18:06:32.110 183079 INFO nova.compute.manager [None req-1a3eaf32-cb04-4545-abf9-324c6320d812 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Get console output
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.111 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.112 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0191410
Jan 22 18:06:32 compute-0 haproxy-metadata-proxy-cd33fee9-f283-4792-8796-9cc0f4021aaf[247901]: 10.100.0.13:36550 [22/Jan/2026:18:06:32.091] listener listener/metadata 0/0/0/20/20 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Jan 22 18:06:32 compute-0 nova_compute[183075]: 2026-01-22 18:06:32.115 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.118 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.119 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cd33fee9-f283-4792-8796-9cc0f4021aaf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.139 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.139 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 142 time: 0.0202904
Jan 22 18:06:32 compute-0 haproxy-metadata-proxy-cd33fee9-f283-4792-8796-9cc0f4021aaf[247901]: 10.100.0.13:36564 [22/Jan/2026:18:06:32.117] listener listener/metadata 0/0/0/22/22 200 126 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.146 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.148 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cd33fee9-f283-4792-8796-9cc0f4021aaf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.205 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.206 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 147 time: 0.0581260
Jan 22 18:06:32 compute-0 haproxy-metadata-proxy-cd33fee9-f283-4792-8796-9cc0f4021aaf[247901]: 10.100.0.13:36570 [22/Jan/2026:18:06:32.145] listener listener/metadata 0/0/0/60/60 200 131 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.211 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.213 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cd33fee9-f283-4792-8796-9cc0f4021aaf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.237 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:06:32 compute-0 haproxy-metadata-proxy-cd33fee9-f283-4792-8796-9cc0f4021aaf[247901]: 10.100.0.13:36586 [22/Jan/2026:18:06:32.211] listener listener/metadata 0/0/0/26/26 200 119 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.238 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 135 time: 0.0251329
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.243 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.244 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cd33fee9-f283-4792-8796-9cc0f4021aaf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.260 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:06:32 compute-0 haproxy-metadata-proxy-cd33fee9-f283-4792-8796-9cc0f4021aaf[247901]: 10.100.0.13:36596 [22/Jan/2026:18:06:32.242] listener listener/metadata 0/0/0/18/18 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.260 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 166 time: 0.0168352
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.265 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.266 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cd33fee9-f283-4792-8796-9cc0f4021aaf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.281 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.281 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 166 time: 0.0158367
Jan 22 18:06:32 compute-0 haproxy-metadata-proxy-cd33fee9-f283-4792-8796-9cc0f4021aaf[247901]: 10.100.0.13:36610 [22/Jan/2026:18:06:32.264] listener listener/metadata 0/0/0/16/16 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.288 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.289 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cd33fee9-f283-4792-8796-9cc0f4021aaf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:06:32 compute-0 haproxy-metadata-proxy-cd33fee9-f283-4792-8796-9cc0f4021aaf[247901]: 10.100.0.13:36626 [22/Jan/2026:18:06:32.287] listener listener/metadata 0/0/0/19/19 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.307 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0180488
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.319 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.320 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cd33fee9-f283-4792-8796-9cc0f4021aaf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.335 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.335 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 143 time: 0.0154567
Jan 22 18:06:32 compute-0 haproxy-metadata-proxy-cd33fee9-f283-4792-8796-9cc0f4021aaf[247901]: 10.100.0.13:36632 [22/Jan/2026:18:06:32.318] listener listener/metadata 0/0/0/17/17 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.340 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.340 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cd33fee9-f283-4792-8796-9cc0f4021aaf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.370 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.370 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0300999
Jan 22 18:06:32 compute-0 haproxy-metadata-proxy-cd33fee9-f283-4792-8796-9cc0f4021aaf[247901]: 10.100.0.13:36636 [22/Jan/2026:18:06:32.339] listener listener/metadata 0/0/0/31/31 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.375 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.375 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cd33fee9-f283-4792-8796-9cc0f4021aaf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.386 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.387 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0118992
Jan 22 18:06:32 compute-0 haproxy-metadata-proxy-cd33fee9-f283-4792-8796-9cc0f4021aaf[247901]: 10.100.0.13:36640 [22/Jan/2026:18:06:32.374] listener listener/metadata 0/0/0/12/12 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.391 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.392 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cd33fee9-f283-4792-8796-9cc0f4021aaf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.406 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.407 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 166 time: 0.0147021
Jan 22 18:06:32 compute-0 haproxy-metadata-proxy-cd33fee9-f283-4792-8796-9cc0f4021aaf[247901]: 10.100.0.13:36644 [22/Jan/2026:18:06:32.391] listener listener/metadata 0/0/0/15/15 200 150 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.416 104990 DEBUG eventlet.wsgi.server [-] (104990) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.417 104990 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Accept: */*
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Connection: close
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Content-Type: text/plain
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: Host: 169.254.169.254
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: User-Agent: curl/7.84.0
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: X-Forwarded-For: 10.100.0.13
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: X-Ovn-Network-Id: cd33fee9-f283-4792-8796-9cc0f4021aaf __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.430 104990 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 18:06:32 compute-0 haproxy-metadata-proxy-cd33fee9-f283-4792-8796-9cc0f4021aaf[247901]: 10.100.0.13:36654 [22/Jan/2026:18:06:32.415] listener listener/metadata 0/0/0/15/15 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Jan 22 18:06:32 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:32.431 104990 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0139945
Jan 22 18:06:32 compute-0 nova_compute[183075]: 2026-01-22 18:06:32.985 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:06:34 compute-0 nova_compute[183075]: 2026-01-22 18:06:34.974 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:06:36 compute-0 podman[248366]: 2026-01-22 18:06:36.348492896 +0000 UTC m=+0.052943934 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 22 18:06:36 compute-0 podman[248367]: 2026-01-22 18:06:36.376119094 +0000 UTC m=+0.067526079 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_id=openstack_network_exporter, version=9.6, maintainer=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git)
Jan 22 18:06:36 compute-0 podman[248365]: 2026-01-22 18:06:36.389362502 +0000 UTC m=+0.091926649 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 18:06:37 compute-0 nova_compute[183075]: 2026-01-22 18:06:37.339 183079 INFO nova.compute.manager [None req-f8751d75-00de-426d-9126-6ae1bc7b7670 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Get console output
Jan 22 18:06:37 compute-0 nova_compute[183075]: 2026-01-22 18:06:37.343 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:06:37 compute-0 nova_compute[183075]: 2026-01-22 18:06:37.988 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:06:39 compute-0 podman[248431]: 2026-01-22 18:06:39.366607833 +0000 UTC m=+0.067755485 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 22 18:06:39 compute-0 nova_compute[183075]: 2026-01-22 18:06:39.975 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:06:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:41.990 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:06:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:41.990 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:06:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:06:41.991 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:06:42 compute-0 nova_compute[183075]: 2026-01-22 18:06:42.990 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:06:43 compute-0 ovn_controller[95372]: 2026-01-22T18:06:43Z|00907|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 22 18:06:43 compute-0 nova_compute[183075]: 2026-01-22 18:06:43.368 183079 INFO nova.compute.manager [None req-516d5a8a-4526-44c0-a688-1529c20205ba 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Get console output
Jan 22 18:06:43 compute-0 nova_compute[183075]: 2026-01-22 18:06:43.373 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:06:44 compute-0 nova_compute[183075]: 2026-01-22 18:06:44.976 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:06:47 compute-0 nova_compute[183075]: 2026-01-22 18:06:47.992 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:06:48 compute-0 nova_compute[183075]: 2026-01-22 18:06:48.656 183079 INFO nova.compute.manager [None req-e335880a-ff3f-4217-be7e-76c562e3d2d6 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Get console output
Jan 22 18:06:48 compute-0 nova_compute[183075]: 2026-01-22 18:06:48.660 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:06:49 compute-0 nova_compute[183075]: 2026-01-22 18:06:49.978 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:06:52 compute-0 podman[248452]: 2026-01-22 18:06:52.357546403 +0000 UTC m=+0.071760573 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 18:06:52 compute-0 nova_compute[183075]: 2026-01-22 18:06:52.994 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:06:53 compute-0 nova_compute[183075]: 2026-01-22 18:06:53.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:06:53 compute-0 nova_compute[183075]: 2026-01-22 18:06:53.848 183079 INFO nova.compute.manager [None req-65cec092-94ba-4911-b341-d0ca32f64286 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Get console output
Jan 22 18:06:53 compute-0 nova_compute[183075]: 2026-01-22 18:06:53.852 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:06:54 compute-0 nova_compute[183075]: 2026-01-22 18:06:54.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:06:54 compute-0 nova_compute[183075]: 2026-01-22 18:06:54.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:06:54 compute-0 nova_compute[183075]: 2026-01-22 18:06:54.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:06:54 compute-0 nova_compute[183075]: 2026-01-22 18:06:54.980 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:06:56 compute-0 podman[248476]: 2026-01-22 18:06:56.341474456 +0000 UTC m=+0.049133611 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 18:06:57 compute-0 nova_compute[183075]: 2026-01-22 18:06:57.995 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:06:59 compute-0 nova_compute[183075]: 2026-01-22 18:06:59.012 183079 INFO nova.compute.manager [None req-f0c23842-95ec-44ee-bc7f-00436160bf37 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Get console output
Jan 22 18:06:59 compute-0 nova_compute[183075]: 2026-01-22 18:06:59.017 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:06:59 compute-0 nova_compute[183075]: 2026-01-22 18:06:59.980 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:02 compute-0 nova_compute[183075]: 2026-01-22 18:07:02.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:07:02 compute-0 nova_compute[183075]: 2026-01-22 18:07:02.998 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:03 compute-0 nova_compute[183075]: 2026-01-22 18:07:03.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:07:03 compute-0 nova_compute[183075]: 2026-01-22 18:07:03.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 18:07:03 compute-0 nova_compute[183075]: 2026-01-22 18:07:03.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 18:07:04 compute-0 nova_compute[183075]: 2026-01-22 18:07:04.105 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "refresh_cache-dd2cb045-1122-4834-9fea-6294fc690f67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 18:07:04 compute-0 nova_compute[183075]: 2026-01-22 18:07:04.106 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquired lock "refresh_cache-dd2cb045-1122-4834-9fea-6294fc690f67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 18:07:04 compute-0 nova_compute[183075]: 2026-01-22 18:07:04.106 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 18:07:04 compute-0 nova_compute[183075]: 2026-01-22 18:07:04.106 183079 DEBUG nova.objects.instance [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lazy-loading 'info_cache' on Instance uuid dd2cb045-1122-4834-9fea-6294fc690f67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 18:07:04 compute-0 nova_compute[183075]: 2026-01-22 18:07:04.186 183079 INFO nova.compute.manager [None req-f7ff435c-f34c-492f-84bc-90d1c1ff2147 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Get console output
Jan 22 18:07:04 compute-0 nova_compute[183075]: 2026-01-22 18:07:04.192 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:07:04 compute-0 nova_compute[183075]: 2026-01-22 18:07:04.982 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:05 compute-0 nova_compute[183075]: 2026-01-22 18:07:05.644 183079 DEBUG nova.network.neutron [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Updating instance_info_cache with network_info: [{"id": "cba5e4c6-9f20-4080-a0b3-f1ce8a74528e", "address": "fa:16:3e:29:3d:20", "network": {"id": "cd33fee9-f283-4792-8796-9cc0f4021aaf", "bridge": "br-int", "label": "tempest-test-network--1302050819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe29:3d20", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "bfeeb28f5b97478fac3f61cc12827bb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcba5e4c6-9f", "ovs_interfaceid": "cba5e4c6-9f20-4080-a0b3-f1ce8a74528e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 18:07:05 compute-0 nova_compute[183075]: 2026-01-22 18:07:05.983 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Releasing lock "refresh_cache-dd2cb045-1122-4834-9fea-6294fc690f67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 18:07:05 compute-0 nova_compute[183075]: 2026-01-22 18:07:05.983 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 18:07:05 compute-0 nova_compute[183075]: 2026-01-22 18:07:05.984 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:07:05 compute-0 nova_compute[183075]: 2026-01-22 18:07:05.984 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 18:07:07 compute-0 podman[248501]: 2026-01-22 18:07:07.342736082 +0000 UTC m=+0.052716868 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 18:07:07 compute-0 podman[248502]: 2026-01-22 18:07:07.34267943 +0000 UTC m=+0.051410142 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, release=1755695350, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, version=9.6)
Jan 22 18:07:07 compute-0 podman[248500]: 2026-01-22 18:07:07.429400007 +0000 UTC m=+0.143414972 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 18:07:07 compute-0 nova_compute[183075]: 2026-01-22 18:07:07.787 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:07:07 compute-0 nova_compute[183075]: 2026-01-22 18:07:07.812 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:07:07 compute-0 nova_compute[183075]: 2026-01-22 18:07:07.812 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:07:07 compute-0 nova_compute[183075]: 2026-01-22 18:07:07.812 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:07:07 compute-0 nova_compute[183075]: 2026-01-22 18:07:07.812 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 18:07:07 compute-0 nova_compute[183075]: 2026-01-22 18:07:07.873 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e0fdd3fd-57d6-4237-8d21-0716e1687405/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:07:07 compute-0 nova_compute[183075]: 2026-01-22 18:07:07.934 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e0fdd3fd-57d6-4237-8d21-0716e1687405/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:07:07 compute-0 nova_compute[183075]: 2026-01-22 18:07:07.935 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e0fdd3fd-57d6-4237-8d21-0716e1687405/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:07:07 compute-0 nova_compute[183075]: 2026-01-22 18:07:07.997 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e0fdd3fd-57d6-4237-8d21-0716e1687405/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:07:08 compute-0 nova_compute[183075]: 2026-01-22 18:07:08.000 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:08 compute-0 nova_compute[183075]: 2026-01-22 18:07:08.004 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd2cb045-1122-4834-9fea-6294fc690f67/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:07:08 compute-0 nova_compute[183075]: 2026-01-22 18:07:08.057 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd2cb045-1122-4834-9fea-6294fc690f67/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:07:08 compute-0 nova_compute[183075]: 2026-01-22 18:07:08.058 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd2cb045-1122-4834-9fea-6294fc690f67/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 18:07:08 compute-0 nova_compute[183075]: 2026-01-22 18:07:08.111 183079 DEBUG oslo_concurrency.processutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd2cb045-1122-4834-9fea-6294fc690f67/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 18:07:08 compute-0 nova_compute[183075]: 2026-01-22 18:07:08.262 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 18:07:08 compute-0 nova_compute[183075]: 2026-01-22 18:07:08.263 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5364MB free_disk=73.29324340820312GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 18:07:08 compute-0 nova_compute[183075]: 2026-01-22 18:07:08.263 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:07:08 compute-0 nova_compute[183075]: 2026-01-22 18:07:08.264 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:07:08 compute-0 nova_compute[183075]: 2026-01-22 18:07:08.337 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance dd2cb045-1122-4834-9fea-6294fc690f67 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 18:07:08 compute-0 nova_compute[183075]: 2026-01-22 18:07:08.338 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Instance e0fdd3fd-57d6-4237-8d21-0716e1687405 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 18:07:08 compute-0 nova_compute[183075]: 2026-01-22 18:07:08.338 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 18:07:08 compute-0 nova_compute[183075]: 2026-01-22 18:07:08.338 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 18:07:08 compute-0 nova_compute[183075]: 2026-01-22 18:07:08.383 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 18:07:08 compute-0 nova_compute[183075]: 2026-01-22 18:07:08.394 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 18:07:08 compute-0 nova_compute[183075]: 2026-01-22 18:07:08.413 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 18:07:08 compute-0 nova_compute[183075]: 2026-01-22 18:07:08.413 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:07:09 compute-0 nova_compute[183075]: 2026-01-22 18:07:09.312 183079 INFO nova.compute.manager [None req-c285883a-7574-4c0a-b3da-b1d070c715bf 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Get console output
Jan 22 18:07:09 compute-0 nova_compute[183075]: 2026-01-22 18:07:09.317 211515 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 18:07:09 compute-0 nova_compute[183075]: 2026-01-22 18:07:09.984 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:10 compute-0 podman[248571]: 2026-01-22 18:07:10.361501395 +0000 UTC m=+0.064806705 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 18:07:11 compute-0 nova_compute[183075]: 2026-01-22 18:07:11.413 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:07:12 compute-0 ovn_controller[95372]: 2026-01-22T18:07:12Z|00908|binding|INFO|Releasing lport 4c6b8b92-89d6-4150-a6bb-444d2d5ca88c from this chassis (sb_readonly=0)
Jan 22 18:07:12 compute-0 NetworkManager[55454]: <info>  [1769105232.0198] manager: (patch-provnet-c63dea44-3fe3-44c3-ba47-76ee1ea0ad94-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/364)
Jan 22 18:07:12 compute-0 nova_compute[183075]: 2026-01-22 18:07:12.019 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:12 compute-0 NetworkManager[55454]: <info>  [1769105232.0208] manager: (patch-br-int-to-provnet-c63dea44-3fe3-44c3-ba47-76ee1ea0ad94): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/365)
Jan 22 18:07:12 compute-0 ovn_controller[95372]: 2026-01-22T18:07:12Z|00909|binding|INFO|Releasing lport 4c6b8b92-89d6-4150-a6bb-444d2d5ca88c from this chassis (sb_readonly=0)
Jan 22 18:07:12 compute-0 nova_compute[183075]: 2026-01-22 18:07:12.048 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:12 compute-0 nova_compute[183075]: 2026-01-22 18:07:12.052 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:12 compute-0 nova_compute[183075]: 2026-01-22 18:07:12.271 183079 DEBUG nova.compute.manager [req-656089d8-c3e8-46ff-b59e-637432ba921a req-2ec15552-0d40-46a3-a8ab-42152ed3cb9a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Received event network-changed-cba5e4c6-9f20-4080-a0b3-f1ce8a74528e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:07:12 compute-0 nova_compute[183075]: 2026-01-22 18:07:12.272 183079 DEBUG nova.compute.manager [req-656089d8-c3e8-46ff-b59e-637432ba921a req-2ec15552-0d40-46a3-a8ab-42152ed3cb9a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Refreshing instance network info cache due to event network-changed-cba5e4c6-9f20-4080-a0b3-f1ce8a74528e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 18:07:12 compute-0 nova_compute[183075]: 2026-01-22 18:07:12.272 183079 DEBUG oslo_concurrency.lockutils [req-656089d8-c3e8-46ff-b59e-637432ba921a req-2ec15552-0d40-46a3-a8ab-42152ed3cb9a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-dd2cb045-1122-4834-9fea-6294fc690f67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 18:07:12 compute-0 nova_compute[183075]: 2026-01-22 18:07:12.272 183079 DEBUG oslo_concurrency.lockutils [req-656089d8-c3e8-46ff-b59e-637432ba921a req-2ec15552-0d40-46a3-a8ab-42152ed3cb9a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-dd2cb045-1122-4834-9fea-6294fc690f67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 18:07:12 compute-0 nova_compute[183075]: 2026-01-22 18:07:12.272 183079 DEBUG nova.network.neutron [req-656089d8-c3e8-46ff-b59e-637432ba921a req-2ec15552-0d40-46a3-a8ab-42152ed3cb9a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Refreshing network info cache for port cba5e4c6-9f20-4080-a0b3-f1ce8a74528e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 18:07:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:07:12.295 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=66, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:71:dc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:f1:b0:2e:df:ce'}, ipsec=False) old=SB_Global(nb_cfg=65) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 18:07:12 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:07:12.295 104629 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 18:07:12 compute-0 nova_compute[183075]: 2026-01-22 18:07:12.296 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:13 compute-0 nova_compute[183075]: 2026-01-22 18:07:13.002 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:13 compute-0 nova_compute[183075]: 2026-01-22 18:07:13.601 183079 DEBUG nova.network.neutron [req-656089d8-c3e8-46ff-b59e-637432ba921a req-2ec15552-0d40-46a3-a8ab-42152ed3cb9a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Updated VIF entry in instance network info cache for port cba5e4c6-9f20-4080-a0b3-f1ce8a74528e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 18:07:13 compute-0 nova_compute[183075]: 2026-01-22 18:07:13.601 183079 DEBUG nova.network.neutron [req-656089d8-c3e8-46ff-b59e-637432ba921a req-2ec15552-0d40-46a3-a8ab-42152ed3cb9a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Updating instance_info_cache with network_info: [{"id": "cba5e4c6-9f20-4080-a0b3-f1ce8a74528e", "address": "fa:16:3e:29:3d:20", "network": {"id": "cd33fee9-f283-4792-8796-9cc0f4021aaf", "bridge": "br-int", "label": "tempest-test-network--1302050819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe29:3d20", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "bfeeb28f5b97478fac3f61cc12827bb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcba5e4c6-9f", "ovs_interfaceid": "cba5e4c6-9f20-4080-a0b3-f1ce8a74528e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 18:07:13 compute-0 nova_compute[183075]: 2026-01-22 18:07:13.621 183079 DEBUG oslo_concurrency.lockutils [req-656089d8-c3e8-46ff-b59e-637432ba921a req-2ec15552-0d40-46a3-a8ab-42152ed3cb9a a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-dd2cb045-1122-4834-9fea-6294fc690f67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 18:07:14 compute-0 nova_compute[183075]: 2026-01-22 18:07:14.266 183079 DEBUG nova.compute.manager [req-f122f530-5313-48a5-8210-f7e87db68245 req-6833cccd-fbbf-4266-928c-827501170a26 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Received event network-changed-8cf692cc-450a-4d35-99f9-bbf0b17493ab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:07:14 compute-0 nova_compute[183075]: 2026-01-22 18:07:14.266 183079 DEBUG nova.compute.manager [req-f122f530-5313-48a5-8210-f7e87db68245 req-6833cccd-fbbf-4266-928c-827501170a26 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Refreshing instance network info cache due to event network-changed-8cf692cc-450a-4d35-99f9-bbf0b17493ab. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 18:07:14 compute-0 nova_compute[183075]: 2026-01-22 18:07:14.266 183079 DEBUG oslo_concurrency.lockutils [req-f122f530-5313-48a5-8210-f7e87db68245 req-6833cccd-fbbf-4266-928c-827501170a26 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "refresh_cache-e0fdd3fd-57d6-4237-8d21-0716e1687405" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 18:07:14 compute-0 nova_compute[183075]: 2026-01-22 18:07:14.266 183079 DEBUG oslo_concurrency.lockutils [req-f122f530-5313-48a5-8210-f7e87db68245 req-6833cccd-fbbf-4266-928c-827501170a26 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquired lock "refresh_cache-e0fdd3fd-57d6-4237-8d21-0716e1687405" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 18:07:14 compute-0 nova_compute[183075]: 2026-01-22 18:07:14.267 183079 DEBUG nova.network.neutron [req-f122f530-5313-48a5-8210-f7e87db68245 req-6833cccd-fbbf-4266-928c-827501170a26 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Refreshing network info cache for port 8cf692cc-450a-4d35-99f9-bbf0b17493ab _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 18:07:14 compute-0 nova_compute[183075]: 2026-01-22 18:07:14.986 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:15 compute-0 nova_compute[183075]: 2026-01-22 18:07:15.465 183079 DEBUG nova.network.neutron [req-f122f530-5313-48a5-8210-f7e87db68245 req-6833cccd-fbbf-4266-928c-827501170a26 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Updated VIF entry in instance network info cache for port 8cf692cc-450a-4d35-99f9-bbf0b17493ab. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 18:07:15 compute-0 nova_compute[183075]: 2026-01-22 18:07:15.466 183079 DEBUG nova.network.neutron [req-f122f530-5313-48a5-8210-f7e87db68245 req-6833cccd-fbbf-4266-928c-827501170a26 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Updating instance_info_cache with network_info: [{"id": "8cf692cc-450a-4d35-99f9-bbf0b17493ab", "address": "fa:16:3e:c1:1b:b1", "network": {"id": "cd33fee9-f283-4792-8796-9cc0f4021aaf", "bridge": "br-int", "label": "tempest-test-network--1302050819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fec1:1bb1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "bfeeb28f5b97478fac3f61cc12827bb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cf692cc-45", "ovs_interfaceid": "8cf692cc-450a-4d35-99f9-bbf0b17493ab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 18:07:15 compute-0 nova_compute[183075]: 2026-01-22 18:07:15.484 183079 DEBUG oslo_concurrency.lockutils [req-f122f530-5313-48a5-8210-f7e87db68245 req-6833cccd-fbbf-4266-928c-827501170a26 a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Releasing lock "refresh_cache-e0fdd3fd-57d6-4237-8d21-0716e1687405" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 18:07:16 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:07:16.298 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c288e768-a990-4b51-bd88-fd8dddb8c85d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '66'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.176 183079 DEBUG oslo_concurrency.lockutils [None req-ca6e118e-6159-48e1-980f-23f686ef8f04 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Acquiring lock "e0fdd3fd-57d6-4237-8d21-0716e1687405" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.178 183079 DEBUG oslo_concurrency.lockutils [None req-ca6e118e-6159-48e1-980f-23f686ef8f04 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "e0fdd3fd-57d6-4237-8d21-0716e1687405" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.178 183079 DEBUG oslo_concurrency.lockutils [None req-ca6e118e-6159-48e1-980f-23f686ef8f04 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Acquiring lock "e0fdd3fd-57d6-4237-8d21-0716e1687405-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.179 183079 DEBUG oslo_concurrency.lockutils [None req-ca6e118e-6159-48e1-980f-23f686ef8f04 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "e0fdd3fd-57d6-4237-8d21-0716e1687405-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.179 183079 DEBUG oslo_concurrency.lockutils [None req-ca6e118e-6159-48e1-980f-23f686ef8f04 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "e0fdd3fd-57d6-4237-8d21-0716e1687405-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.180 183079 INFO nova.compute.manager [None req-ca6e118e-6159-48e1-980f-23f686ef8f04 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Terminating instance
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.181 183079 DEBUG nova.compute.manager [None req-ca6e118e-6159-48e1-980f-23f686ef8f04 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 18:07:17 compute-0 kernel: tap8cf692cc-45 (unregistering): left promiscuous mode
Jan 22 18:07:17 compute-0 NetworkManager[55454]: <info>  [1769105237.2108] device (tap8cf692cc-45): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.252 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:17 compute-0 ovn_controller[95372]: 2026-01-22T18:07:17Z|00910|binding|INFO|Releasing lport 8cf692cc-450a-4d35-99f9-bbf0b17493ab from this chassis (sb_readonly=0)
Jan 22 18:07:17 compute-0 ovn_controller[95372]: 2026-01-22T18:07:17Z|00911|binding|INFO|Setting lport 8cf692cc-450a-4d35-99f9-bbf0b17493ab down in Southbound
Jan 22 18:07:17 compute-0 ovn_controller[95372]: 2026-01-22T18:07:17Z|00912|binding|INFO|Removing iface tap8cf692cc-45 ovn-installed in OVS
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.254 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:07:17.263 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:1b:b1 10.100.0.13 2001:db8::f816:3eff:fec1:1bb1'], port_security=['fa:16:3e:c1:1b:b1 10.100.0.13 2001:db8::f816:3eff:fec1:1bb1'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28 2001:db8::f816:3eff:fec1:1bb1/64', 'neutron:device_id': 'e0fdd3fd-57d6-4237-8d21-0716e1687405', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cd33fee9-f283-4792-8796-9cc0f4021aaf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bfeeb28f5b97478fac3f61cc12827bb3', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ad003613-25d7-4407-87d3-e15c431e7689', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.186'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7c020e6d-68ca-4390-a036-a6097b05932f, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=8cf692cc-450a-4d35-99f9-bbf0b17493ab) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.265 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:07:17.265 104629 INFO neutron.agent.ovn.metadata.agent [-] Port 8cf692cc-450a-4d35-99f9-bbf0b17493ab in datapath cd33fee9-f283-4792-8796-9cc0f4021aaf unbound from our chassis
Jan 22 18:07:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:07:17.266 104629 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cd33fee9-f283-4792-8796-9cc0f4021aaf
Jan 22 18:07:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:07:17.284 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[e3961e5b-b5d6-4d60-a51b-605910c30330]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:07:17 compute-0 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d00000054.scope: Deactivated successfully.
Jan 22 18:07:17 compute-0 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d00000054.scope: Consumed 15.055s CPU time.
Jan 22 18:07:17 compute-0 systemd-machined[154382]: Machine qemu-84-instance-00000054 terminated.
Jan 22 18:07:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:07:17.313 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[67fc7ce4-ee68-4238-bce2-9973f4035e59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:07:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:07:17.317 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[d24eec70-ae3a-44c9-947e-884a20bd36e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:07:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:07:17.346 211665 DEBUG oslo.privsep.daemon [-] privsep: reply[d60097f1-adfa-4cf6-8722-3d1eb783ddf9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:07:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:07:17.361 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[f78826b8-250a-4bcf-83cc-1c14a61ed441]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcd33fee9-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:34:ef:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 224, 'tx_packets': 106, 'rx_bytes': 19248, 'tx_bytes': 12059, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 224, 'tx_packets': 106, 'rx_bytes': 19248, 'tx_bytes': 12059, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 246], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 746717, 'reachable_time': 40874, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 28, 'inoctets': 2080, 'indelivers': 7, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 28, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 2080, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 28, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 7, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248604, 'error': None, 'target': 'ovnmeta-cd33fee9-f283-4792-8796-9cc0f4021aaf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:07:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:07:17.380 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[4d209736-98cb-4c79-9be3-cfbe9d6f8d38]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapcd33fee9-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 746731, 'tstamp': 746731}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 248605, 'error': None, 'target': 'ovnmeta-cd33fee9-f283-4792-8796-9cc0f4021aaf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapcd33fee9-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 746734, 'tstamp': 746734}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 248605, 'error': None, 'target': 'ovnmeta-cd33fee9-f283-4792-8796-9cc0f4021aaf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:07:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:07:17.382 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd33fee9-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.383 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.388 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:07:17.388 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcd33fee9-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:07:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:07:17.390 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 18:07:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:07:17.390 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcd33fee9-f0, col_values=(('external_ids', {'iface-id': '4c6b8b92-89d6-4150-a6bb-444d2d5ca88c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:07:17 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:07:17.391 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.403 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.408 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.422 183079 DEBUG nova.compute.manager [req-e286fd30-f488-45cd-800e-ca43991cced5 req-d411d5de-2281-4d4e-8c99-cba2a4fd07bf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Received event network-vif-unplugged-8cf692cc-450a-4d35-99f9-bbf0b17493ab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.422 183079 DEBUG oslo_concurrency.lockutils [req-e286fd30-f488-45cd-800e-ca43991cced5 req-d411d5de-2281-4d4e-8c99-cba2a4fd07bf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "e0fdd3fd-57d6-4237-8d21-0716e1687405-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.422 183079 DEBUG oslo_concurrency.lockutils [req-e286fd30-f488-45cd-800e-ca43991cced5 req-d411d5de-2281-4d4e-8c99-cba2a4fd07bf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e0fdd3fd-57d6-4237-8d21-0716e1687405-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.423 183079 DEBUG oslo_concurrency.lockutils [req-e286fd30-f488-45cd-800e-ca43991cced5 req-d411d5de-2281-4d4e-8c99-cba2a4fd07bf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e0fdd3fd-57d6-4237-8d21-0716e1687405-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.423 183079 DEBUG nova.compute.manager [req-e286fd30-f488-45cd-800e-ca43991cced5 req-d411d5de-2281-4d4e-8c99-cba2a4fd07bf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] No waiting events found dispatching network-vif-unplugged-8cf692cc-450a-4d35-99f9-bbf0b17493ab pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.423 183079 DEBUG nova.compute.manager [req-e286fd30-f488-45cd-800e-ca43991cced5 req-d411d5de-2281-4d4e-8c99-cba2a4fd07bf a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Received event network-vif-unplugged-8cf692cc-450a-4d35-99f9-bbf0b17493ab for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.442 183079 INFO nova.virt.libvirt.driver [-] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Instance destroyed successfully.
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.443 183079 DEBUG nova.objects.instance [None req-ca6e118e-6159-48e1-980f-23f686ef8f04 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lazy-loading 'resources' on Instance uuid e0fdd3fd-57d6-4237-8d21-0716e1687405 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.482 183079 DEBUG nova.virt.libvirt.vif [None req-ca6e118e-6159-48e1-980f-23f686ef8f04 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T18:06:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1265808566',display_name='tempest-server-test-1265808566',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1265808566',id=84,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN4nUgc+oV7ByTq9qVVAwVt7Z5aS2u6R2o/Luqd0qC5fBtJq9ShHu4J7y3Jz2xuF9QZoBf9pscb0QblSJ2VeHfwaE0UkbJGn/O5SM0HSKnGhFZW4WRYEXRQdnsqUpTi/Qw==',key_name='tempest-keypair-test-2068520959',keypairs=<?>,launch_index=0,launched_at=2026-01-22T18:06:13Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bfeeb28f5b97478fac3f61cc12827bb3',ramdisk_id='',reservation_id='r-w0z2sm4i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-StatelessSecGroupDualStackSlaacTest-795693713',owner_user_name='tempest-StatelessSecGroupDualStackSlaacTest-795693713-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T18:06:13Z,user_data=None,user_id='42329d6b6bc04d9daacda0eb41f36019',uuid=e0fdd3fd-57d6-4237-8d21-0716e1687405,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8cf692cc-450a-4d35-99f9-bbf0b17493ab", "address": "fa:16:3e:c1:1b:b1", "network": {"id": "cd33fee9-f283-4792-8796-9cc0f4021aaf", "bridge": "br-int", "label": "tempest-test-network--1302050819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fec1:1bb1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "bfeeb28f5b97478fac3f61cc12827bb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cf692cc-45", "ovs_interfaceid": "8cf692cc-450a-4d35-99f9-bbf0b17493ab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.483 183079 DEBUG nova.network.os_vif_util [None req-ca6e118e-6159-48e1-980f-23f686ef8f04 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Converting VIF {"id": "8cf692cc-450a-4d35-99f9-bbf0b17493ab", "address": "fa:16:3e:c1:1b:b1", "network": {"id": "cd33fee9-f283-4792-8796-9cc0f4021aaf", "bridge": "br-int", "label": "tempest-test-network--1302050819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fec1:1bb1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "bfeeb28f5b97478fac3f61cc12827bb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cf692cc-45", "ovs_interfaceid": "8cf692cc-450a-4d35-99f9-bbf0b17493ab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.483 183079 DEBUG nova.network.os_vif_util [None req-ca6e118e-6159-48e1-980f-23f686ef8f04 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c1:1b:b1,bridge_name='br-int',has_traffic_filtering=True,id=8cf692cc-450a-4d35-99f9-bbf0b17493ab,network=Network(cd33fee9-f283-4792-8796-9cc0f4021aaf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8cf692cc-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.484 183079 DEBUG os_vif [None req-ca6e118e-6159-48e1-980f-23f686ef8f04 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c1:1b:b1,bridge_name='br-int',has_traffic_filtering=True,id=8cf692cc-450a-4d35-99f9-bbf0b17493ab,network=Network(cd33fee9-f283-4792-8796-9cc0f4021aaf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8cf692cc-45') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.485 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.485 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8cf692cc-45, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.487 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.488 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.491 183079 INFO os_vif [None req-ca6e118e-6159-48e1-980f-23f686ef8f04 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c1:1b:b1,bridge_name='br-int',has_traffic_filtering=True,id=8cf692cc-450a-4d35-99f9-bbf0b17493ab,network=Network(cd33fee9-f283-4792-8796-9cc0f4021aaf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8cf692cc-45')
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.491 183079 INFO nova.virt.libvirt.driver [None req-ca6e118e-6159-48e1-980f-23f686ef8f04 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Deleting instance files /var/lib/nova/instances/e0fdd3fd-57d6-4237-8d21-0716e1687405_del
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.492 183079 INFO nova.virt.libvirt.driver [None req-ca6e118e-6159-48e1-980f-23f686ef8f04 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Deletion of /var/lib/nova/instances/e0fdd3fd-57d6-4237-8d21-0716e1687405_del complete
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.536 183079 INFO nova.compute.manager [None req-ca6e118e-6159-48e1-980f-23f686ef8f04 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Took 0.36 seconds to destroy the instance on the hypervisor.
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.537 183079 DEBUG oslo.service.loopingcall [None req-ca6e118e-6159-48e1-980f-23f686ef8f04 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.537 183079 DEBUG nova.compute.manager [-] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 18:07:17 compute-0 nova_compute[183075]: 2026-01-22 18:07:17.538 183079 DEBUG nova.network.neutron [-] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 18:07:19 compute-0 nova_compute[183075]: 2026-01-22 18:07:19.625 183079 DEBUG nova.compute.manager [req-439ec603-5e16-4749-8ac6-5a32ec4006ad req-d1e33f86-ddf4-4359-a2ea-bc58ff28aaff a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Received event network-vif-plugged-8cf692cc-450a-4d35-99f9-bbf0b17493ab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:07:19 compute-0 nova_compute[183075]: 2026-01-22 18:07:19.625 183079 DEBUG oslo_concurrency.lockutils [req-439ec603-5e16-4749-8ac6-5a32ec4006ad req-d1e33f86-ddf4-4359-a2ea-bc58ff28aaff a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "e0fdd3fd-57d6-4237-8d21-0716e1687405-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:07:19 compute-0 nova_compute[183075]: 2026-01-22 18:07:19.626 183079 DEBUG oslo_concurrency.lockutils [req-439ec603-5e16-4749-8ac6-5a32ec4006ad req-d1e33f86-ddf4-4359-a2ea-bc58ff28aaff a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e0fdd3fd-57d6-4237-8d21-0716e1687405-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:07:19 compute-0 nova_compute[183075]: 2026-01-22 18:07:19.626 183079 DEBUG oslo_concurrency.lockutils [req-439ec603-5e16-4749-8ac6-5a32ec4006ad req-d1e33f86-ddf4-4359-a2ea-bc58ff28aaff a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "e0fdd3fd-57d6-4237-8d21-0716e1687405-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:07:19 compute-0 nova_compute[183075]: 2026-01-22 18:07:19.626 183079 DEBUG nova.compute.manager [req-439ec603-5e16-4749-8ac6-5a32ec4006ad req-d1e33f86-ddf4-4359-a2ea-bc58ff28aaff a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] No waiting events found dispatching network-vif-plugged-8cf692cc-450a-4d35-99f9-bbf0b17493ab pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 18:07:19 compute-0 nova_compute[183075]: 2026-01-22 18:07:19.626 183079 WARNING nova.compute.manager [req-439ec603-5e16-4749-8ac6-5a32ec4006ad req-d1e33f86-ddf4-4359-a2ea-bc58ff28aaff a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Received unexpected event network-vif-plugged-8cf692cc-450a-4d35-99f9-bbf0b17493ab for instance with vm_state active and task_state deleting.
Jan 22 18:07:19 compute-0 nova_compute[183075]: 2026-01-22 18:07:19.924 183079 DEBUG nova.network.neutron [-] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 18:07:19 compute-0 nova_compute[183075]: 2026-01-22 18:07:19.942 183079 INFO nova.compute.manager [-] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Took 2.40 seconds to deallocate network for instance.
Jan 22 18:07:19 compute-0 nova_compute[183075]: 2026-01-22 18:07:19.988 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:19 compute-0 nova_compute[183075]: 2026-01-22 18:07:19.994 183079 DEBUG oslo_concurrency.lockutils [None req-ca6e118e-6159-48e1-980f-23f686ef8f04 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:07:19 compute-0 nova_compute[183075]: 2026-01-22 18:07:19.995 183079 DEBUG oslo_concurrency.lockutils [None req-ca6e118e-6159-48e1-980f-23f686ef8f04 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:07:20 compute-0 nova_compute[183075]: 2026-01-22 18:07:20.082 183079 DEBUG nova.compute.provider_tree [None req-ca6e118e-6159-48e1-980f-23f686ef8f04 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 18:07:20 compute-0 nova_compute[183075]: 2026-01-22 18:07:20.096 183079 DEBUG nova.scheduler.client.report [None req-ca6e118e-6159-48e1-980f-23f686ef8f04 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 18:07:20 compute-0 nova_compute[183075]: 2026-01-22 18:07:20.116 183079 DEBUG oslo_concurrency.lockutils [None req-ca6e118e-6159-48e1-980f-23f686ef8f04 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:07:20 compute-0 nova_compute[183075]: 2026-01-22 18:07:20.141 183079 INFO nova.scheduler.client.report [None req-ca6e118e-6159-48e1-980f-23f686ef8f04 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Deleted allocations for instance e0fdd3fd-57d6-4237-8d21-0716e1687405
Jan 22 18:07:20 compute-0 nova_compute[183075]: 2026-01-22 18:07:20.201 183079 DEBUG oslo_concurrency.lockutils [None req-ca6e118e-6159-48e1-980f-23f686ef8f04 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "e0fdd3fd-57d6-4237-8d21-0716e1687405" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.023s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:07:22 compute-0 nova_compute[183075]: 2026-01-22 18:07:22.487 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:22 compute-0 nova_compute[183075]: 2026-01-22 18:07:22.707 183079 DEBUG oslo_concurrency.lockutils [None req-f39ac089-ab2c-473e-bf39-202ed4e3323c 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Acquiring lock "dd2cb045-1122-4834-9fea-6294fc690f67" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:07:22 compute-0 nova_compute[183075]: 2026-01-22 18:07:22.707 183079 DEBUG oslo_concurrency.lockutils [None req-f39ac089-ab2c-473e-bf39-202ed4e3323c 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "dd2cb045-1122-4834-9fea-6294fc690f67" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:07:22 compute-0 nova_compute[183075]: 2026-01-22 18:07:22.708 183079 DEBUG oslo_concurrency.lockutils [None req-f39ac089-ab2c-473e-bf39-202ed4e3323c 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Acquiring lock "dd2cb045-1122-4834-9fea-6294fc690f67-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:07:22 compute-0 nova_compute[183075]: 2026-01-22 18:07:22.708 183079 DEBUG oslo_concurrency.lockutils [None req-f39ac089-ab2c-473e-bf39-202ed4e3323c 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "dd2cb045-1122-4834-9fea-6294fc690f67-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:07:22 compute-0 nova_compute[183075]: 2026-01-22 18:07:22.708 183079 DEBUG oslo_concurrency.lockutils [None req-f39ac089-ab2c-473e-bf39-202ed4e3323c 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "dd2cb045-1122-4834-9fea-6294fc690f67-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:07:22 compute-0 nova_compute[183075]: 2026-01-22 18:07:22.709 183079 INFO nova.compute.manager [None req-f39ac089-ab2c-473e-bf39-202ed4e3323c 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Terminating instance
Jan 22 18:07:22 compute-0 nova_compute[183075]: 2026-01-22 18:07:22.710 183079 DEBUG nova.compute.manager [None req-f39ac089-ab2c-473e-bf39-202ed4e3323c 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 18:07:22 compute-0 kernel: tapcba5e4c6-9f (unregistering): left promiscuous mode
Jan 22 18:07:22 compute-0 NetworkManager[55454]: <info>  [1769105242.7364] device (tapcba5e4c6-9f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 18:07:22 compute-0 nova_compute[183075]: 2026-01-22 18:07:22.741 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:22 compute-0 ovn_controller[95372]: 2026-01-22T18:07:22Z|00913|binding|INFO|Releasing lport cba5e4c6-9f20-4080-a0b3-f1ce8a74528e from this chassis (sb_readonly=0)
Jan 22 18:07:22 compute-0 ovn_controller[95372]: 2026-01-22T18:07:22Z|00914|binding|INFO|Setting lport cba5e4c6-9f20-4080-a0b3-f1ce8a74528e down in Southbound
Jan 22 18:07:22 compute-0 ovn_controller[95372]: 2026-01-22T18:07:22Z|00915|binding|INFO|Removing iface tapcba5e4c6-9f ovn-installed in OVS
Jan 22 18:07:22 compute-0 nova_compute[183075]: 2026-01-22 18:07:22.744 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:07:22.751 104629 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:29:3d:20 10.100.0.4 2001:db8::f816:3eff:fe29:3d20'], port_security=['fa:16:3e:29:3d:20 10.100.0.4 2001:db8::f816:3eff:fe29:3d20'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28 2001:db8::f816:3eff:fe29:3d20/64', 'neutron:device_id': 'dd2cb045-1122-4834-9fea-6294fc690f67', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cd33fee9-f283-4792-8796-9cc0f4021aaf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bfeeb28f5b97478fac3f61cc12827bb3', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ad003613-25d7-4407-87d3-e15c431e7689', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.239'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7c020e6d-68ca-4390-a036-a6097b05932f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>], logical_port=cba5e4c6-9f20-4080-a0b3-f1ce8a74528e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f87fb988f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 18:07:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:07:22.753 104629 INFO neutron.agent.ovn.metadata.agent [-] Port cba5e4c6-9f20-4080-a0b3-f1ce8a74528e in datapath cd33fee9-f283-4792-8796-9cc0f4021aaf unbound from our chassis
Jan 22 18:07:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:07:22.755 104629 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cd33fee9-f283-4792-8796-9cc0f4021aaf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 18:07:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:07:22.756 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[2f0a1ec1-b100-4bb0-95e6-d04740384acc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:07:22 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:07:22.757 104629 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cd33fee9-f283-4792-8796-9cc0f4021aaf namespace which is not needed anymore
Jan 22 18:07:22 compute-0 nova_compute[183075]: 2026-01-22 18:07:22.756 183079 DEBUG nova.compute.manager [req-346d25a2-49a1-4326-b38b-faba78c7b3e5 req-4c4a4c31-5906-4ce9-85e8-91108c60c6ad a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Received event network-vif-deleted-8cf692cc-450a-4d35-99f9-bbf0b17493ab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:07:22 compute-0 nova_compute[183075]: 2026-01-22 18:07:22.760 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:22 compute-0 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d00000053.scope: Deactivated successfully.
Jan 22 18:07:22 compute-0 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d00000053.scope: Consumed 18.165s CPU time.
Jan 22 18:07:22 compute-0 systemd-machined[154382]: Machine qemu-83-instance-00000053 terminated.
Jan 22 18:07:22 compute-0 podman[248623]: 2026-01-22 18:07:22.834091416 +0000 UTC m=+0.061835533 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 18:07:22 compute-0 neutron-haproxy-ovnmeta-cd33fee9-f283-4792-8796-9cc0f4021aaf[247895]: [NOTICE]   (247899) : haproxy version is 2.8.14-c23fe91
Jan 22 18:07:22 compute-0 neutron-haproxy-ovnmeta-cd33fee9-f283-4792-8796-9cc0f4021aaf[247895]: [NOTICE]   (247899) : path to executable is /usr/sbin/haproxy
Jan 22 18:07:22 compute-0 neutron-haproxy-ovnmeta-cd33fee9-f283-4792-8796-9cc0f4021aaf[247895]: [WARNING]  (247899) : Exiting Master process...
Jan 22 18:07:22 compute-0 neutron-haproxy-ovnmeta-cd33fee9-f283-4792-8796-9cc0f4021aaf[247895]: [ALERT]    (247899) : Current worker (247901) exited with code 143 (Terminated)
Jan 22 18:07:22 compute-0 neutron-haproxy-ovnmeta-cd33fee9-f283-4792-8796-9cc0f4021aaf[247895]: [WARNING]  (247899) : All workers exited. Exiting... (0)
Jan 22 18:07:22 compute-0 systemd[1]: libpod-612aeafed49e1c0708b67dd5cbf4552d27a9cbfe9bdea022d7dafca64af2dc9b.scope: Deactivated successfully.
Jan 22 18:07:22 compute-0 podman[248670]: 2026-01-22 18:07:22.891340006 +0000 UTC m=+0.042884812 container died 612aeafed49e1c0708b67dd5cbf4552d27a9cbfe9bdea022d7dafca64af2dc9b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd33fee9-f283-4792-8796-9cc0f4021aaf, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 22 18:07:22 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-612aeafed49e1c0708b67dd5cbf4552d27a9cbfe9bdea022d7dafca64af2dc9b-userdata-shm.mount: Deactivated successfully.
Jan 22 18:07:22 compute-0 systemd[1]: var-lib-containers-storage-overlay-f264e97b6c1ea68ae92434cdc69afa24e06bcdbc67e718e964538da7258a37a1-merged.mount: Deactivated successfully.
Jan 22 18:07:22 compute-0 nova_compute[183075]: 2026-01-22 18:07:22.934 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:22 compute-0 nova_compute[183075]: 2026-01-22 18:07:22.938 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:22 compute-0 podman[248670]: 2026-01-22 18:07:22.940394113 +0000 UTC m=+0.091938919 container cleanup 612aeafed49e1c0708b67dd5cbf4552d27a9cbfe9bdea022d7dafca64af2dc9b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd33fee9-f283-4792-8796-9cc0f4021aaf, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 22 18:07:22 compute-0 systemd[1]: libpod-conmon-612aeafed49e1c0708b67dd5cbf4552d27a9cbfe9bdea022d7dafca64af2dc9b.scope: Deactivated successfully.
Jan 22 18:07:22 compute-0 nova_compute[183075]: 2026-01-22 18:07:22.964 183079 INFO nova.virt.libvirt.driver [-] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Instance destroyed successfully.
Jan 22 18:07:22 compute-0 nova_compute[183075]: 2026-01-22 18:07:22.964 183079 DEBUG nova.objects.instance [None req-f39ac089-ab2c-473e-bf39-202ed4e3323c 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lazy-loading 'resources' on Instance uuid dd2cb045-1122-4834-9fea-6294fc690f67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 18:07:22 compute-0 nova_compute[183075]: 2026-01-22 18:07:22.977 183079 DEBUG nova.virt.libvirt.vif [None req-f39ac089-ab2c-473e-bf39-202ed4e3323c 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T18:04:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1479257049',display_name='tempest-server-test-1479257049',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1479257049',id=83,image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN4nUgc+oV7ByTq9qVVAwVt7Z5aS2u6R2o/Luqd0qC5fBtJq9ShHu4J7y3Jz2xuF9QZoBf9pscb0QblSJ2VeHfwaE0UkbJGn/O5SM0HSKnGhFZW4WRYEXRQdnsqUpTi/Qw==',key_name='tempest-keypair-test-2068520959',keypairs=<?>,launch_index=0,launched_at=2026-01-22T18:05:04Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bfeeb28f5b97478fac3f61cc12827bb3',ramdisk_id='',reservation_id='r-iommjg3l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e1b65bbe-5c14-4552-a5d9-d275c9dd42d3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-StatelessSecGroupDualStackSlaacTest-795693713',owner_user_name='tempest-StatelessSecGroupDualStackSlaacTest-795693713-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T18:05:04Z,user_data=None,user_id='42329d6b6bc04d9daacda0eb41f36019',uuid=dd2cb045-1122-4834-9fea-6294fc690f67,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cba5e4c6-9f20-4080-a0b3-f1ce8a74528e", "address": "fa:16:3e:29:3d:20", "network": {"id": "cd33fee9-f283-4792-8796-9cc0f4021aaf", "bridge": "br-int", "label": "tempest-test-network--1302050819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe29:3d20", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "bfeeb28f5b97478fac3f61cc12827bb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcba5e4c6-9f", "ovs_interfaceid": "cba5e4c6-9f20-4080-a0b3-f1ce8a74528e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 18:07:22 compute-0 nova_compute[183075]: 2026-01-22 18:07:22.978 183079 DEBUG nova.network.os_vif_util [None req-f39ac089-ab2c-473e-bf39-202ed4e3323c 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Converting VIF {"id": "cba5e4c6-9f20-4080-a0b3-f1ce8a74528e", "address": "fa:16:3e:29:3d:20", "network": {"id": "cd33fee9-f283-4792-8796-9cc0f4021aaf", "bridge": "br-int", "label": "tempest-test-network--1302050819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe29:3d20", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "bfeeb28f5b97478fac3f61cc12827bb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcba5e4c6-9f", "ovs_interfaceid": "cba5e4c6-9f20-4080-a0b3-f1ce8a74528e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 18:07:22 compute-0 nova_compute[183075]: 2026-01-22 18:07:22.979 183079 DEBUG nova.network.os_vif_util [None req-f39ac089-ab2c-473e-bf39-202ed4e3323c 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:29:3d:20,bridge_name='br-int',has_traffic_filtering=True,id=cba5e4c6-9f20-4080-a0b3-f1ce8a74528e,network=Network(cd33fee9-f283-4792-8796-9cc0f4021aaf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcba5e4c6-9f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 18:07:22 compute-0 nova_compute[183075]: 2026-01-22 18:07:22.979 183079 DEBUG os_vif [None req-f39ac089-ab2c-473e-bf39-202ed4e3323c 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:29:3d:20,bridge_name='br-int',has_traffic_filtering=True,id=cba5e4c6-9f20-4080-a0b3-f1ce8a74528e,network=Network(cd33fee9-f283-4792-8796-9cc0f4021aaf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcba5e4c6-9f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 18:07:22 compute-0 nova_compute[183075]: 2026-01-22 18:07:22.980 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:22 compute-0 nova_compute[183075]: 2026-01-22 18:07:22.980 183079 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcba5e4c6-9f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:07:23 compute-0 nova_compute[183075]: 2026-01-22 18:07:23.017 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:23 compute-0 nova_compute[183075]: 2026-01-22 18:07:23.019 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:23 compute-0 nova_compute[183075]: 2026-01-22 18:07:23.021 183079 INFO os_vif [None req-f39ac089-ab2c-473e-bf39-202ed4e3323c 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:29:3d:20,bridge_name='br-int',has_traffic_filtering=True,id=cba5e4c6-9f20-4080-a0b3-f1ce8a74528e,network=Network(cd33fee9-f283-4792-8796-9cc0f4021aaf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcba5e4c6-9f')
Jan 22 18:07:23 compute-0 nova_compute[183075]: 2026-01-22 18:07:23.021 183079 INFO nova.virt.libvirt.driver [None req-f39ac089-ab2c-473e-bf39-202ed4e3323c 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Deleting instance files /var/lib/nova/instances/dd2cb045-1122-4834-9fea-6294fc690f67_del
Jan 22 18:07:23 compute-0 nova_compute[183075]: 2026-01-22 18:07:23.022 183079 INFO nova.virt.libvirt.driver [None req-f39ac089-ab2c-473e-bf39-202ed4e3323c 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Deletion of /var/lib/nova/instances/dd2cb045-1122-4834-9fea-6294fc690f67_del complete
Jan 22 18:07:23 compute-0 podman[248713]: 2026-01-22 18:07:23.038447647 +0000 UTC m=+0.077181000 container remove 612aeafed49e1c0708b67dd5cbf4552d27a9cbfe9bdea022d7dafca64af2dc9b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd33fee9-f283-4792-8796-9cc0f4021aaf, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 18:07:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:07:23.042 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[6c597dea-09ff-4729-824a-2a02dcc67399]: (4, ('Thu Jan 22 06:07:22 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-cd33fee9-f283-4792-8796-9cc0f4021aaf (612aeafed49e1c0708b67dd5cbf4552d27a9cbfe9bdea022d7dafca64af2dc9b)\n612aeafed49e1c0708b67dd5cbf4552d27a9cbfe9bdea022d7dafca64af2dc9b\nThu Jan 22 06:07:22 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-cd33fee9-f283-4792-8796-9cc0f4021aaf (612aeafed49e1c0708b67dd5cbf4552d27a9cbfe9bdea022d7dafca64af2dc9b)\n612aeafed49e1c0708b67dd5cbf4552d27a9cbfe9bdea022d7dafca64af2dc9b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:07:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:07:23.044 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[19a730cb-2dfd-4f18-a0b0-a76c9b22dc51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:07:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:07:23.045 104629 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd33fee9-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 18:07:23 compute-0 nova_compute[183075]: 2026-01-22 18:07:23.046 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:23 compute-0 kernel: tapcd33fee9-f0: left promiscuous mode
Jan 22 18:07:23 compute-0 nova_compute[183075]: 2026-01-22 18:07:23.060 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:07:23.064 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[0f82ebd2-001f-4b99-9d77-e4fd9fee1ca1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:07:23 compute-0 nova_compute[183075]: 2026-01-22 18:07:23.073 183079 INFO nova.compute.manager [None req-f39ac089-ab2c-473e-bf39-202ed4e3323c 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Took 0.36 seconds to destroy the instance on the hypervisor.
Jan 22 18:07:23 compute-0 nova_compute[183075]: 2026-01-22 18:07:23.074 183079 DEBUG oslo.service.loopingcall [None req-f39ac089-ab2c-473e-bf39-202ed4e3323c 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 18:07:23 compute-0 nova_compute[183075]: 2026-01-22 18:07:23.074 183079 DEBUG nova.compute.manager [-] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 18:07:23 compute-0 nova_compute[183075]: 2026-01-22 18:07:23.074 183079 DEBUG nova.network.neutron [-] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 18:07:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:07:23.077 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[7f6a7c84-b0f4-4443-a071-d92fc23c90b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:07:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:07:23.078 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[44d94539-2662-4e7d-9da2-0b147087368f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:07:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:07:23.095 211630 DEBUG oslo.privsep.daemon [-] privsep: reply[863f476a-dc95-47e5-94af-40e92149bb70]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 746708, 'reachable_time': 16908, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248730, 'error': None, 'target': 'ovnmeta-cd33fee9-f283-4792-8796-9cc0f4021aaf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:07:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:07:23.098 105117 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cd33fee9-f283-4792-8796-9cc0f4021aaf deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 18:07:23 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:07:23.098 105117 DEBUG oslo.privsep.daemon [-] privsep: reply[ef8728af-1a6a-4a1a-9aee-cf8c8c745614]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 18:07:23 compute-0 systemd[1]: run-netns-ovnmeta\x2dcd33fee9\x2df283\x2d4792\x2d8796\x2d9cc0f4021aaf.mount: Deactivated successfully.
Jan 22 18:07:23 compute-0 nova_compute[183075]: 2026-01-22 18:07:23.951 183079 DEBUG nova.network.neutron [-] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 18:07:23 compute-0 nova_compute[183075]: 2026-01-22 18:07:23.971 183079 INFO nova.compute.manager [-] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Took 0.90 seconds to deallocate network for instance.
Jan 22 18:07:24 compute-0 nova_compute[183075]: 2026-01-22 18:07:24.034 183079 DEBUG oslo_concurrency.lockutils [None req-f39ac089-ab2c-473e-bf39-202ed4e3323c 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:07:24 compute-0 nova_compute[183075]: 2026-01-22 18:07:24.034 183079 DEBUG oslo_concurrency.lockutils [None req-f39ac089-ab2c-473e-bf39-202ed4e3323c 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:07:24 compute-0 nova_compute[183075]: 2026-01-22 18:07:24.128 183079 DEBUG nova.compute.provider_tree [None req-f39ac089-ab2c-473e-bf39-202ed4e3323c 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 18:07:24 compute-0 nova_compute[183075]: 2026-01-22 18:07:24.152 183079 DEBUG nova.scheduler.client.report [None req-f39ac089-ab2c-473e-bf39-202ed4e3323c 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 18:07:24 compute-0 nova_compute[183075]: 2026-01-22 18:07:24.193 183079 DEBUG oslo_concurrency.lockutils [None req-f39ac089-ab2c-473e-bf39-202ed4e3323c 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:07:24 compute-0 nova_compute[183075]: 2026-01-22 18:07:24.223 183079 INFO nova.scheduler.client.report [None req-f39ac089-ab2c-473e-bf39-202ed4e3323c 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Deleted allocations for instance dd2cb045-1122-4834-9fea-6294fc690f67
Jan 22 18:07:24 compute-0 nova_compute[183075]: 2026-01-22 18:07:24.287 183079 DEBUG oslo_concurrency.lockutils [None req-f39ac089-ab2c-473e-bf39-202ed4e3323c 42329d6b6bc04d9daacda0eb41f36019 bfeeb28f5b97478fac3f61cc12827bb3 - - default default] Lock "dd2cb045-1122-4834-9fea-6294fc690f67" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.579s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:07:24 compute-0 nova_compute[183075]: 2026-01-22 18:07:24.834 183079 DEBUG nova.compute.manager [req-fd8cbb45-1dd2-43fe-9872-6c25e14d44af req-c7c2352c-7401-4831-8759-2467ef1ac83b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Received event network-vif-unplugged-cba5e4c6-9f20-4080-a0b3-f1ce8a74528e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:07:24 compute-0 nova_compute[183075]: 2026-01-22 18:07:24.834 183079 DEBUG oslo_concurrency.lockutils [req-fd8cbb45-1dd2-43fe-9872-6c25e14d44af req-c7c2352c-7401-4831-8759-2467ef1ac83b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "dd2cb045-1122-4834-9fea-6294fc690f67-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:07:24 compute-0 nova_compute[183075]: 2026-01-22 18:07:24.834 183079 DEBUG oslo_concurrency.lockutils [req-fd8cbb45-1dd2-43fe-9872-6c25e14d44af req-c7c2352c-7401-4831-8759-2467ef1ac83b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "dd2cb045-1122-4834-9fea-6294fc690f67-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:07:24 compute-0 nova_compute[183075]: 2026-01-22 18:07:24.835 183079 DEBUG oslo_concurrency.lockutils [req-fd8cbb45-1dd2-43fe-9872-6c25e14d44af req-c7c2352c-7401-4831-8759-2467ef1ac83b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "dd2cb045-1122-4834-9fea-6294fc690f67-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:07:24 compute-0 nova_compute[183075]: 2026-01-22 18:07:24.835 183079 DEBUG nova.compute.manager [req-fd8cbb45-1dd2-43fe-9872-6c25e14d44af req-c7c2352c-7401-4831-8759-2467ef1ac83b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] No waiting events found dispatching network-vif-unplugged-cba5e4c6-9f20-4080-a0b3-f1ce8a74528e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 18:07:24 compute-0 nova_compute[183075]: 2026-01-22 18:07:24.835 183079 WARNING nova.compute.manager [req-fd8cbb45-1dd2-43fe-9872-6c25e14d44af req-c7c2352c-7401-4831-8759-2467ef1ac83b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Received unexpected event network-vif-unplugged-cba5e4c6-9f20-4080-a0b3-f1ce8a74528e for instance with vm_state deleted and task_state None.
Jan 22 18:07:24 compute-0 nova_compute[183075]: 2026-01-22 18:07:24.835 183079 DEBUG nova.compute.manager [req-fd8cbb45-1dd2-43fe-9872-6c25e14d44af req-c7c2352c-7401-4831-8759-2467ef1ac83b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Received event network-vif-plugged-cba5e4c6-9f20-4080-a0b3-f1ce8a74528e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:07:24 compute-0 nova_compute[183075]: 2026-01-22 18:07:24.836 183079 DEBUG oslo_concurrency.lockutils [req-fd8cbb45-1dd2-43fe-9872-6c25e14d44af req-c7c2352c-7401-4831-8759-2467ef1ac83b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Acquiring lock "dd2cb045-1122-4834-9fea-6294fc690f67-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:07:24 compute-0 nova_compute[183075]: 2026-01-22 18:07:24.836 183079 DEBUG oslo_concurrency.lockutils [req-fd8cbb45-1dd2-43fe-9872-6c25e14d44af req-c7c2352c-7401-4831-8759-2467ef1ac83b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "dd2cb045-1122-4834-9fea-6294fc690f67-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:07:24 compute-0 nova_compute[183075]: 2026-01-22 18:07:24.836 183079 DEBUG oslo_concurrency.lockutils [req-fd8cbb45-1dd2-43fe-9872-6c25e14d44af req-c7c2352c-7401-4831-8759-2467ef1ac83b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] Lock "dd2cb045-1122-4834-9fea-6294fc690f67-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:07:24 compute-0 nova_compute[183075]: 2026-01-22 18:07:24.836 183079 DEBUG nova.compute.manager [req-fd8cbb45-1dd2-43fe-9872-6c25e14d44af req-c7c2352c-7401-4831-8759-2467ef1ac83b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] No waiting events found dispatching network-vif-plugged-cba5e4c6-9f20-4080-a0b3-f1ce8a74528e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 18:07:24 compute-0 nova_compute[183075]: 2026-01-22 18:07:24.836 183079 WARNING nova.compute.manager [req-fd8cbb45-1dd2-43fe-9872-6c25e14d44af req-c7c2352c-7401-4831-8759-2467ef1ac83b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Received unexpected event network-vif-plugged-cba5e4c6-9f20-4080-a0b3-f1ce8a74528e for instance with vm_state deleted and task_state None.
Jan 22 18:07:24 compute-0 nova_compute[183075]: 2026-01-22 18:07:24.837 183079 DEBUG nova.compute.manager [req-fd8cbb45-1dd2-43fe-9872-6c25e14d44af req-c7c2352c-7401-4831-8759-2467ef1ac83b a94120332d8a40f480276baca8d41dec 6016498b20884b46a136bbc4d1cca897 - - default default] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Received event network-vif-deleted-cba5e4c6-9f20-4080-a0b3-f1ce8a74528e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 18:07:24 compute-0 nova_compute[183075]: 2026-01-22 18:07:24.990 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:27 compute-0 podman[248731]: 2026-01-22 18:07:27.350360186 +0000 UTC m=+0.056546721 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 18:07:28 compute-0 nova_compute[183075]: 2026-01-22 18:07:28.018 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:29 compute-0 nova_compute[183075]: 2026-01-22 18:07:29.991 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:32 compute-0 nova_compute[183075]: 2026-01-22 18:07:32.441 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769105237.4401884, e0fdd3fd-57d6-4237-8d21-0716e1687405 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 18:07:32 compute-0 nova_compute[183075]: 2026-01-22 18:07:32.442 183079 INFO nova.compute.manager [-] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] VM Stopped (Lifecycle Event)
Jan 22 18:07:32 compute-0 nova_compute[183075]: 2026-01-22 18:07:32.466 183079 DEBUG nova.compute.manager [None req-210f1441-88ab-4c38-a8f5-31ac21ae2e90 - - - - - -] [instance: e0fdd3fd-57d6-4237-8d21-0716e1687405] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 18:07:33 compute-0 nova_compute[183075]: 2026-01-22 18:07:33.021 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:34 compute-0 nova_compute[183075]: 2026-01-22 18:07:34.992 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:35 compute-0 nova_compute[183075]: 2026-01-22 18:07:35.467 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:35 compute-0 nova_compute[183075]: 2026-01-22 18:07:35.536 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:37 compute-0 nova_compute[183075]: 2026-01-22 18:07:37.964 183079 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769105242.9624386, dd2cb045-1122-4834-9fea-6294fc690f67 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 18:07:37 compute-0 nova_compute[183075]: 2026-01-22 18:07:37.965 183079 INFO nova.compute.manager [-] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] VM Stopped (Lifecycle Event)
Jan 22 18:07:37 compute-0 nova_compute[183075]: 2026-01-22 18:07:37.988 183079 DEBUG nova.compute.manager [None req-89b337f6-2815-4315-92b9-70cd668af6c2 - - - - - -] [instance: dd2cb045-1122-4834-9fea-6294fc690f67] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 18:07:38 compute-0 nova_compute[183075]: 2026-01-22 18:07:38.023 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:38 compute-0 podman[248758]: 2026-01-22 18:07:38.366461703 +0000 UTC m=+0.062307937 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 22 18:07:38 compute-0 podman[248757]: 2026-01-22 18:07:38.372593169 +0000 UTC m=+0.063236892 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_managed=true)
Jan 22 18:07:38 compute-0 podman[248756]: 2026-01-22 18:07:38.397857472 +0000 UTC m=+0.094142708 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 18:07:39 compute-0 nova_compute[183075]: 2026-01-22 18:07:39.993 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:41 compute-0 podman[248815]: 2026-01-22 18:07:41.353622701 +0000 UTC m=+0.064279191 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ceilometer_agent_compute)
Jan 22 18:07:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:07:41.992 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:07:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:07:41.993 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:07:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:07:41.993 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:07:43 compute-0 nova_compute[183075]: 2026-01-22 18:07:43.025 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:44 compute-0 nova_compute[183075]: 2026-01-22 18:07:44.995 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:48 compute-0 nova_compute[183075]: 2026-01-22 18:07:48.027 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:49 compute-0 nova_compute[183075]: 2026-01-22 18:07:49.996 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:53 compute-0 nova_compute[183075]: 2026-01-22 18:07:53.029 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:53 compute-0 podman[248835]: 2026-01-22 18:07:53.329975626 +0000 UTC m=+0.040783685 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 18:07:54 compute-0 nova_compute[183075]: 2026-01-22 18:07:54.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:07:54 compute-0 nova_compute[183075]: 2026-01-22 18:07:54.789 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:07:55 compute-0 nova_compute[183075]: 2026-01-22 18:07:55.053 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:07:55.466 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:07:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:07:55.467 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:07:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:07:55.467 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:07:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:07:55.467 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:07:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:07:55.467 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:07:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:07:55.467 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:07:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:07:55.467 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:07:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:07:55.467 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:07:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:07:55.467 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:07:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:07:55.467 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:07:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:07:55.468 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:07:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:07:55.468 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:07:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:07:55.468 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:07:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:07:55.468 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:07:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:07:55.468 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:07:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:07:55.468 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:07:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:07:55.468 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:07:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:07:55.468 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:07:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:07:55.468 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:07:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:07:55.468 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:07:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:07:55.468 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:07:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:07:55.468 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:07:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:07:55.469 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:07:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:07:55.469 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:07:55 compute-0 ceilometer_agent_compute[192753]: 2026-01-22 18:07:55.469 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:07:55 compute-0 nova_compute[183075]: 2026-01-22 18:07:55.789 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:07:56 compute-0 nova_compute[183075]: 2026-01-22 18:07:56.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:07:58 compute-0 nova_compute[183075]: 2026-01-22 18:07:58.031 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:07:58 compute-0 podman[248859]: 2026-01-22 18:07:58.362125635 +0000 UTC m=+0.073090338 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 18:08:00 compute-0 nova_compute[183075]: 2026-01-22 18:08:00.055 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:08:03 compute-0 nova_compute[183075]: 2026-01-22 18:08:03.032 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:08:03 compute-0 nova_compute[183075]: 2026-01-22 18:08:03.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:08:05 compute-0 nova_compute[183075]: 2026-01-22 18:08:05.057 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:08:05 compute-0 nova_compute[183075]: 2026-01-22 18:08:05.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:08:05 compute-0 nova_compute[183075]: 2026-01-22 18:08:05.788 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 18:08:06 compute-0 nova_compute[183075]: 2026-01-22 18:08:06.102 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 18:08:06 compute-0 nova_compute[183075]: 2026-01-22 18:08:06.103 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:08:06 compute-0 nova_compute[183075]: 2026-01-22 18:08:06.103 183079 DEBUG nova.compute.manager [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 18:08:07 compute-0 ovn_controller[95372]: 2026-01-22T18:08:07Z|00916|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Jan 22 18:08:07 compute-0 nova_compute[183075]: 2026-01-22 18:08:07.788 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:08:07 compute-0 nova_compute[183075]: 2026-01-22 18:08:07.812 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:08:07 compute-0 nova_compute[183075]: 2026-01-22 18:08:07.813 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:08:07 compute-0 nova_compute[183075]: 2026-01-22 18:08:07.813 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:08:07 compute-0 nova_compute[183075]: 2026-01-22 18:08:07.813 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 18:08:07 compute-0 nova_compute[183075]: 2026-01-22 18:08:07.978 183079 WARNING nova.virt.libvirt.driver [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 18:08:07 compute-0 nova_compute[183075]: 2026-01-22 18:08:07.980 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5722MB free_disk=73.34965515136719GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 18:08:07 compute-0 nova_compute[183075]: 2026-01-22 18:08:07.980 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:08:07 compute-0 nova_compute[183075]: 2026-01-22 18:08:07.981 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:08:08 compute-0 nova_compute[183075]: 2026-01-22 18:08:08.033 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:08:08 compute-0 nova_compute[183075]: 2026-01-22 18:08:08.175 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 18:08:08 compute-0 nova_compute[183075]: 2026-01-22 18:08:08.176 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 18:08:08 compute-0 nova_compute[183075]: 2026-01-22 18:08:08.253 183079 DEBUG nova.compute.provider_tree [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed in ProviderTree for provider: 2513134c-f67c-4237-84bf-4ebe2450d610 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 18:08:08 compute-0 nova_compute[183075]: 2026-01-22 18:08:08.269 183079 DEBUG nova.scheduler.client.report [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Inventory has not changed for provider 2513134c-f67c-4237-84bf-4ebe2450d610 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 18:08:08 compute-0 nova_compute[183075]: 2026-01-22 18:08:08.291 183079 DEBUG nova.compute.resource_tracker [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 18:08:08 compute-0 nova_compute[183075]: 2026-01-22 18:08:08.292 183079 DEBUG oslo_concurrency.lockutils [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.311s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:08:09 compute-0 podman[248885]: 2026-01-22 18:08:09.343793389 +0000 UTC m=+0.053743495 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 22 18:08:09 compute-0 podman[248886]: 2026-01-22 18:08:09.357705085 +0000 UTC m=+0.061497114 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_id=openstack_network_exporter, vcs-type=git, version=9.6, io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, name=ubi9-minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 22 18:08:09 compute-0 podman[248884]: 2026-01-22 18:08:09.382503806 +0000 UTC m=+0.094756924 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 22 18:08:10 compute-0 nova_compute[183075]: 2026-01-22 18:08:10.102 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:08:11 compute-0 nova_compute[183075]: 2026-01-22 18:08:11.292 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:08:11 compute-0 podman[248951]: 2026-01-22 18:08:11.865490021 +0000 UTC m=+0.071066234 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 18:08:13 compute-0 nova_compute[183075]: 2026-01-22 18:08:13.036 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:08:15 compute-0 nova_compute[183075]: 2026-01-22 18:08:15.106 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:08:18 compute-0 nova_compute[183075]: 2026-01-22 18:08:18.038 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:08:20 compute-0 nova_compute[183075]: 2026-01-22 18:08:20.110 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:08:23 compute-0 nova_compute[183075]: 2026-01-22 18:08:23.040 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:08:24 compute-0 podman[248973]: 2026-01-22 18:08:24.346711026 +0000 UTC m=+0.054272730 container health_status 266f1e4dcbee30932b5012b766f0db1ec292ee2d195430b96db31cd218057e69 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 18:08:25 compute-0 nova_compute[183075]: 2026-01-22 18:08:25.112 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:08:28 compute-0 nova_compute[183075]: 2026-01-22 18:08:28.042 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:08:29 compute-0 podman[248999]: 2026-01-22 18:08:29.357887829 +0000 UTC m=+0.064502186 container health_status 04a89507163f2f3d440ace32e85de5f96f6faa9f6f8fb43b37b4cf4075752d75 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 18:08:29 compute-0 sshd-session[248972]: Connection reset by authenticating user root 176.120.22.47 port 23462 [preauth]
Jan 22 18:08:30 compute-0 nova_compute[183075]: 2026-01-22 18:08:30.114 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:08:32 compute-0 nova_compute[183075]: 2026-01-22 18:08:32.783 183079 DEBUG oslo_service.periodic_task [None req-753c14e1-6596-4f4b-8bb2-def316186f46 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 18:08:33 compute-0 nova_compute[183075]: 2026-01-22 18:08:33.045 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:08:34 compute-0 sshd-session[249027]: Accepted publickey for zuul from 192.168.122.10 port 56676 ssh2: ECDSA SHA256:XN6iZwzcCiCbU0l3vCSWxZPd4ElPyx+ZEvhzj+S5SUw
Jan 22 18:08:34 compute-0 systemd-logind[796]: New session 27 of user zuul.
Jan 22 18:08:34 compute-0 systemd[1]: Started Session 27 of User zuul.
Jan 22 18:08:34 compute-0 sshd-session[249027]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 18:08:34 compute-0 sudo[249031]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Jan 22 18:08:34 compute-0 sudo[249031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 18:08:35 compute-0 nova_compute[183075]: 2026-01-22 18:08:35.115 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:08:37 compute-0 sshd-session[249023]: Connection reset by authenticating user root 176.120.22.47 port 59256 [preauth]
Jan 22 18:08:38 compute-0 nova_compute[183075]: 2026-01-22 18:08:38.047 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:08:38 compute-0 sshd-session[249024]: Connection reset by authenticating user root 176.120.22.47 port 59268 [preauth]
Jan 22 18:08:38 compute-0 ovs-vsctl[249206]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 22 18:08:39 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 249055 (sos)
Jan 22 18:08:39 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Jan 22 18:08:39 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Jan 22 18:08:39 compute-0 podman[249256]: 2026-01-22 18:08:39.672332327 +0000 UTC m=+0.073260504 container health_status 642f580748dfef4201a3cb8efe9285c783b4c190f84f4cf6ef4b274cf82810dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 18:08:39 compute-0 podman[249257]: 2026-01-22 18:08:39.68575085 +0000 UTC m=+0.084592970 container health_status c45452bc8a7aad7fb63d6eb2fe824eb4c9feef8cee642a0540a29970f0b3cf30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, distribution-scope=public, name=ubi9-minimal, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., version=9.6, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Jan 22 18:08:39 compute-0 podman[249254]: 2026-01-22 18:08:39.710474029 +0000 UTC m=+0.111347854 container health_status 3b675a52ec1ec9b7a19122d7f750f9234e7586b104b881cf22dc3042ee47cbee (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 22 18:08:39 compute-0 virtqemud[182696]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 22 18:08:39 compute-0 virtqemud[182696]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 22 18:08:39 compute-0 virtqemud[182696]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 22 18:08:40 compute-0 nova_compute[183075]: 2026-01-22 18:08:40.117 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:08:40 compute-0 crontab[249677]: (root) LIST (root)
Jan 22 18:08:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:08:41.993 104629 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 18:08:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:08:41.995 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 18:08:41 compute-0 ovn_metadata_agent[104624]: 2026-01-22 18:08:41.995 104629 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 18:08:42 compute-0 podman[249756]: 2026-01-22 18:08:42.391611416 +0000 UTC m=+0.078545567 container health_status 7b1c66d8e31ed1934107b30806565439b2d73c465ef4ac7240033838002bd55a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'c244bcc6e7350fc99c1f362b79039930130b270d3aec63c74fe882870d576860-4231f523a6946d20457f4f2ef86b6ce005ff5ba41729376709128776eb5ce850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 22 18:08:43 compute-0 nova_compute[183075]: 2026-01-22 18:08:43.050 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:08:43 compute-0 systemd[1]: Starting Hostname Service...
Jan 22 18:08:43 compute-0 systemd[1]: Started Hostname Service.
Jan 22 18:08:45 compute-0 nova_compute[183075]: 2026-01-22 18:08:45.156 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:08:45 compute-0 sshd-session[249172]: Connection reset by authenticating user root 176.120.22.47 port 59298 [preauth]
Jan 22 18:08:46 compute-0 sshd-session[249178]: Connection reset by authenticating user root 176.120.22.47 port 59316 [preauth]
Jan 22 18:08:48 compute-0 nova_compute[183075]: 2026-01-22 18:08:48.052 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:08:50 compute-0 nova_compute[183075]: 2026-01-22 18:08:50.157 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:08:51 compute-0 ovs-appctl[250986]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 22 18:08:51 compute-0 ovs-appctl[250990]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 22 18:08:51 compute-0 ovs-appctl[250993]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 22 18:08:52 compute-0 sshd-session[249904]: Connection reset by authenticating user root 176.120.22.47 port 24368 [preauth]
Jan 22 18:08:53 compute-0 nova_compute[183075]: 2026-01-22 18:08:53.053 183079 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 18:08:53 compute-0 systemd[1]: var-lib-containers-storage-overlay-volatile\x2dcheck2491102825-merged.mount: Deactivated successfully.
Jan 22 18:08:53 compute-0 sshd-session[250160]: Invalid user ubuntu from 176.120.22.47 port 24378
